The Modern Physics Laboratory course teaches many of the practical skills required of professional physicists. It is an upper-division laboratory course in which students are exposed to a variety of modern physics concepts, primarily through guided laboratory experiments. The learning outcomes of this course, such as effective presentation in written and oral formats and the ability to design apparatus to test hypotheses, form a substantial part of the major-level outcomes for the physics major at the University of California, Merced. A cornerstone learning outcome for this course is the ability to write about laboratory activities in a technical report.
The technical reports are formatted to reflect the writing style of academic journal papers. Students cycle through six different experiments over the semester and are required to write a technical report in the same style for each experiment. Assessment of student performance on the technical reports is aligned with the program learning objectives of the physics department. The Program Learning Outcomes of the UC Merced Physics Undergraduate Major state that "students will be able to clearly explain their mathematical and physical reasoning, both orally and in writing." The rubric and feedback provided to students on their technical reports reflect this objective.
The work presented here details our approach to providing feedback on the technical reports and our assessment of the efficacy of that feedback. Feedback is provided in a variety of ways, mostly formative and delivered continuously throughout the semester. Students are expected to refine their technical writing by using the feedback from each report to write the next one better. In the first study we monitored student utilization of feedback and observed the correlation of that utilization with improvement in technical report scores. The sample size in this study was limited to eight students; nevertheless, some correlation was observed. In a follow-up study we compared two different forms of feedback, one very dense in commenting and the other brief in comparison. The aim of this study was to demonstrate a method for providing commented feedback on student writing that is scalable to larger class sizes. Our results suggest that dense and detailed written feedback is an effective method for refining scientific writing skills, and that it can be effective after a single implementation.
In the first implementation, student reports were evaluated using a 15-point rubric: 10 points addressed providing critical information about the experiment and 5 points addressed the clarity of the ideas and messages in the report. We applied a thorough feedback method wherein each paragraph was scrutinized and detailed comments were provided to students. Feedback was aligned with the expectations in the rubric but went beyond an explanation of why students were losing points. To further support the learning outcome of professionalism in writing, formative feedback was provided in the form of questions posed about vague areas of the report. These questions were intended to call attention to issues of clarity and logical flow and to encourage students to think about how they could improve sections of the report. This questioning was combined with comments containing examples of how to rephrase sentences to achieve a balance of specificity and generality. Summative feedback on the overall impression of a report was written at the end of the report or addressed to the entire class.
The reports were delivered to instructors through an online platform. The online submission process allowed instructors to grade electronically and to monitor whether students accessed the feedback. This made it possible to determine which students were utilizing feedback by reading the comments posted to their submissions. A correlation was drawn between the utilization of feedback and the gain in technical report grades. The number of times each student sought or utilized feedback was compiled: the number of times they viewed comments on their reports, the number of times they sent rough drafts for comments before the assignment was due, and the number of times they received one-on-one feedback outside of regular class time. The gain in technical report grades was calculated for each student following Hake (2002).1 The individual student normalized gain, g, is defined as

g = %gain / %gain_max,

where %gain is the percentage gain of a student's final technical report grade over their initial technical report grade, and %gain_max is the maximum possible percentage improvement in technical report grade for that student. This was used as a metric for quantifying student learning of technical writing skills.
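The normalized gain defined above can be sketched in a few lines; this is a minimal Python example with illustrative grades, not the study's data:

```python
# Sketch of the individual normalized gain calculation (Hake, 2002),
# assuming report grades are recorded as percentages out of 100.
# The function name and the example grades are illustrative.

def normalized_gain(initial_pct: float, final_pct: float) -> float:
    """Return g = %gain / %gain_max for one student.

    %gain     = final_pct - initial_pct
    %gain_max = 100 - initial_pct (maximum possible improvement)
    """
    if initial_pct >= 100:
        raise ValueError("No room for improvement; g is undefined.")
    return (final_pct - initial_pct) / (100 - initial_pct)

# Example: a student moving from 70% on the first report to 85% on the last
print(normalized_gain(70, 85))  # 0.5
```

Note that g normalizes by the room a student had to improve, so students entering with different initial grades can be compared on the same scale.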
The second implementation was motivated by the goal of providing dense and detailed feedback to larger classes. To address the scalability of our method, we implemented a cross-correlated feedback scheme combining dense and detailed feedback (DDF) in the form of comments, as in the previous study, with a more passive form of feedback: highlighted bullet points in a detailed rubric. The rubric used in the previous study was too brief to provide substantive feedback in the form of highlights, so we adopted a more meticulous rubric based on one published online through MIT OpenCourseWare.2 We provided the highlighted feedback on every section of every technical report throughout the semester; however, we staggered the implementation of the commented feedback, building it up section by section through the course of the semester. We assumed that the number of rubric highlights in a section would be indicative of student performance on that section of the report. This allowed us to observe the correlation of student performance with the provision of feedback in the form of commenting.
In the feedback utilization monitoring study, by mid-semester students' ability to hone their writing skills was found to be correlated with the amount of feedback they were actually utilizing: students who utilized the TA feedback tended to be the students whose technical report grades were improving. We presented this correlation to the students, and many responded positively to this new information; there was a surge in students seeking one-on-one explanations of feedback on their technical reports. The average grade on the following technical report improved by fourteen percent compared with the previous report, and by thirty-four percent compared with the first technical reports. To characterize this improvement over the semester, a plot of the individual student normalized gain vs. the number of times a student utilized feedback is shown below. The plot shows that, in general, students who utilized the provided feedback also displayed improvement in their technical report scores. Students who utilized feedback more than six times over the course showed the greatest gains, while students who utilized feedback minimally achieved a variety of gain values. This suggests either that there is some threshold of feedback utilization before benefits appear in the technical report scores, or that some students received high-quality feedback that was not observable to the teaching assistant (TA). An example of feedback that would not be recorded as utilization of TA feedback is peer review of a student's technical report, which likely occurs between lab partners in this course.
Students also enter the course with a range of writing skill levels, and by this measure the same raw improvement yields a larger normalized gain for a student who scores high on their first report.
In the scaled feedback study we examined the total number of highlights per section for each iteration of the technical report. The number of highlights decreased overall during the semester, indicating improvement in writing as assessed by the rubric, and was therefore used as a metric for the efficacy of the commented feedback. We chose the Methods section for the first implementation of commented feedback because it is often the section students are most prepared to write at the beginning of the course. The Results and Discussion sections were then commented on in the second and third technical reports. There was a fifty percent decrease in the number of highlights post-commenting for the Results section and a forty percent decrease for the Discussion section. The Abstract and Introduction sections were commented on later in the semester, in the fourth and fifth technical reports. The number of highlights in these sections also decreased over time, but less so in response to the dense and detailed commented feedback. This is likely because students had already received substantial feedback from the highlights and from summative feedback discussed during regular class time, so their writing had improved before the commented feedback was provided.
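The per-section percent decrease reported above is a simple before/after comparison of highlight totals; a minimal sketch with placeholder counts (not the study's data):

```python
# Sketch of the percent-decrease metric for rubric highlights,
# comparing class-wide highlight totals before and after commented
# feedback is introduced for a section. Counts are hypothetical.

def percent_decrease(before: int, after: int) -> float:
    """Percent decrease in highlight count from before to after commenting."""
    if before == 0:
        raise ValueError("No highlights before commenting; decrease undefined.")
    return 100 * (before - after) / before

# Hypothetical highlight totals summed over the class for one section
print(percent_decrease(40, 20))  # 50.0, i.e. a fifty percent decrease
```

Tracking this quantity per section, report by report, is what allows the staggered commenting schedule to act as its own control.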
To further illustrate the effect of commented feedback, Figs. 2(a)-(c) plot the total number of highlights in a given section of the lab report, summed over all students in the class, versus technical report number, from the first to the sixth. For a given section there is almost no change in the number of highlights per report before the commented feedback is given, followed by a dramatic decrease after the commented feedback is provided. This implies that students improve their written work more readily when given dense and detailed feedback. Fig. 2(d) shows results for all sections of the report, demonstrating the overall improvement of student writing over time.
The Modern Physics Lab course enables students to learn and practice scientific writing skills. Doing so requires the careful attention of the course instructors to the needs of each student so that these professional skills can develop to their full potential. Here, we have provided students with dense, detailed, and specific feedback from which they have advanced their skills. Student work clearly improved more for students who utilized our dense and detailed commented feedback than for those who did not. Additionally, student work could be improved with a more gradual application of feedback, which presents a scalable method for applying dense and detailed feedback. Another approach is to apply commented feedback to a full report early in the semester, then follow up with the highlighted rubric feedback and adjust the commenting as needed. This time-dependent application of feedback may also be useful for measuring the efficacy of other pedagogical activities.
The authors would like to acknowledge support from the Council of Graduate Schools (CGS) through the Undergraduate Learning Outcomes Assessment Certificate Program and from the University of California, Merced.
R. R. Hake, "Relationship of individual student normalized learning gains in mechanics with gender, high-school physics, and pretest scores on mathematics and spatial visualization," Physics Education Research Conference (PERC), August 2002.