2 July 2019 Harnessing peer instruction in and out of class with myDALITE
Proceedings Volume 11143, Fifteenth Conference on Education and Training in Optics and Photonics: ETOP 2019; 111430Z (2019) https://doi.org/10.1117/12.2523778
Event: Fifteenth Conference on Education and Training in Optics and Photonics: ETOP 2019, 2019, Quebec City, Quebec, Canada
Abstract
We present the design and use of myDALITE, a web-based learning platform that provides students with asynchronous peer instruction that can be used in- and out-of-class. Supporting active learning pedagogy, myDALITE engages students in 1) written explanation (rationales for their answers); 2) comparison of these rationales to those of peers; and 3) reflection on the quality of the rationales. Furthermore, myDALITE assignments provide teachers with the opportunity to see what their students are thinking, before and/or after classroom instruction.

1. INTRODUCTION

Student-centered active learning (AL) instruction privileges the role of collaboration, reflection and discourse in and out of the classroom. There is accumulating evidence that active learning techniques, when implemented correctly, positively impact the learning of subject matter (including student success as well as understanding fundamental concepts), motivation towards the subject matter, and retention in STEM (science, technology, engineering and math)-based programs [1-7]. Peer Instruction (PI) is one widely adopted, in-class AL approach. Championed by physicist Eric Mazur at Harvard University, PI allows students with opposing viewpoints to discuss and debate in class [8]. In the context of postsecondary physics, it is becoming more common for students to engage with concepts of classical mechanics, electromagnetism and wave theory via PI in-class. As in-class participation becomes more centered on students and less on teacher-driven lectures, instructors are increasingly looking for ways to engage their students in PI-like interactions outside of class. The ability for students to use a PI approach outside of class, either as homework or preparatory work, requires innovative solutions to produce the types of rich conversations and reflection generated in the voting, discussion and re-voting processes. It is precisely this need that drove our development of myDALITE (mydalite.org), a free web-based learning platform that provides students with asynchronous PI that can be used in-class, but more importantly, out-of-class. This paper presents the design and use of myDALITE along with quantitative and qualitative findings supporting the continued use and development of this platform to support active learning.

Following the PI approach, myDALITE engages students in 1) written explanation – students must provide rationales for their answers; 2) comparison of these rationales to those of peers – after submission of their rationale, students are presented with rationales from peers for their answer and, similarly, rationales for the most popular other answer; and 3) reflection on the quality of the rationales – students can answer the question again, choosing a different rationale, keeping their original rationale or even submitting a new rationale. These three features are grounded in theories of learning that emphasize the importance of intentional reflection [9], explanation [10], and categorizations such as compare and contrast tasks [11]. The database of student-generated rationales, generated from the activity described above, becomes an integral part of the PI approach that makes myDALITE unique. Through this unique database of student-generated answers and rationales for conceptual multiple-choice questions, students share their understandings with peers asynchronously. Furthermore, myDALITE assignments provide teachers with insight into what their students are thinking, before and/or after classroom instruction, allowing for identification of misconceptions that often go undiagnosed in traditional multiple-choice assessments.

2. CONTENT FLOW IN MYDALITE

Activity in myDALITE starts with the class being provided with a myDALITE assignment, typically a collection of 3-5 conceptual multiple-choice questions intended to activate thinking on a particular topic, either as pre-class preparation or post-class review. Individual students can then log into the system using a computer or mobile device. Each time a student answers a question, they are asked to provide a written rationale and then to reconsider their answer and explanation by looking at a sampling of what other students have chosen and said. A maximum of four peer-generated rationales are shown for each of two possible answers (the student's chosen answer and one other); these samples are taken from the myDALITE database. The rationales to show are chosen by an algorithm that either selects two of the top-voted rationales plus two at random, or selects all four at random. The student-generated rationales become part of the myDALITE database and are eventually shown in the myDALITE script to future students. We elaborate on the steps of this asynchronous PI script below.
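The selection logic just described can be sketched as follows. This is a minimal illustration in Python: the function name, the `(text, votes)` representation and the `show_top_voted` flag are our assumptions, not myDALITE's actual code.

```python
import random

def sample_rationales(rationales, show_top_voted=True, k=4):
    """Pick up to k peer rationales to display for one answer choice.

    `rationales` is a list of (text, votes) pairs. Either take two of
    the top-voted rationales plus two chosen at random from the rest,
    or take all k at random.
    """
    if len(rationales) <= k:
        return [text for text, _ in rationales]
    if show_top_voted:
        ranked = sorted(rationales, key=lambda r: r[1], reverse=True)
        # two top-voted, plus two sampled from the remainder
        chosen = ranked[:2] + random.sample(ranked[2:], k - 2)
    else:
        chosen = random.sample(rationales, k)
    return [text for text, _ in chosen]
```

Either branch returns at most four rationale texts, matching the maximum displayed per answer choice.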

2.1. Step 1: Multiple choice and short answer questions

In the myDALITE platform, students log into the system and are directed to a prepared assignment that consists of a set of multiple-choice questions (what we call a collection), similar to those used in in-class PI. Students are then asked to follow a sequence of six steps. In the first step, presented below, students select an answer to a multiple-choice question, just as in the first step of PI. We have selected a thin-lens example to illustrate these steps.

Figure 1: myDALITE screen capture for step 1.

2.2. Step 2: Providing a rationale to peers

Each student answering a question must provide an explanation of the reasoning behind the answer given (see Figure 2 below). This rationale is saved into a database that stores the answers of all students having previously responded to this question. This database serves as the basis from which the system can provide students with differing viewpoints.

Figure 2: myDALITE screen capture for step 2.

2.3. Steps 3 and 4: Exposing students to rationales that are both aligned and counter to theirs

In Steps 3 and 4, students are asked to reconsider their original answer in the context of other rationales for their own answer, along with a similar selection of rationales for an alternative answer. This purposeful comparison is designed to provide the variety that is sometimes missing in face-to-face PI. In Step 4, students re-select their answer, choosing to stay with their original selection or to change it, based on the rationales (see Figure 3).

Figure 3: myDALITE screen capture for steps 3 and 4.

In questions that have an identifiable correct answer, such as the one displayed, students are presented with two sets of student rationales. One set is always for the same answer choice as the student's. If that answer choice is not correct, the student is also presented with a group of rationales for the correct answer. If, however, the student originally chose the correct answer, that student is presented with a set of rationales for the most popular wrong answer. Hence, students who answer correctly are not simply moved on to the next question but are instead exposed to the thinking of their peers. This process enables students to contextualize their own thinking within the broader range of what other students are thinking, just as they would in real-time, in-class PI.
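The rule for which two answer choices get their rationales displayed can be summarized in a few lines. This is an illustrative sketch only; the function and variable names are ours, and the platform's implementation may differ.

```python
def answer_groups_to_show(chosen, correct, answer_counts):
    """Return the two answer choices whose peer rationales are shown.

    The student's own choice is always shown. If it was wrong, it is
    paired with the correct answer; if it was right, it is paired with
    the most popular wrong answer. `answer_counts` maps each answer
    choice to its number of responses.
    """
    if chosen != correct:
        other = correct
    else:
        wrong = {a: n for a, n in answer_counts.items() if a != correct}
        other = max(wrong, key=wrong.get)
    return (chosen, other)
```

Either way, every student sees a genuine contrast: their own reasoning set against reasoning for a competing answer.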

2.4. Steps 5 and 6: Students are shown their first and second answers as well as the correct answer

In the previous steps of the asynchronous PI process in myDALITE, students selected the most convincing student rationale. Having completed the process of selecting answers and reflecting on explanations of their peers, in the last two steps, students are shown what their initial response was, followed by their second response to display the change (or lack thereof) in their thinking. Finally, the correct answer, or alternately an expert rationale, is provided to the student to close this assessment cycle.

Figure 4: myDALITE screen capture for steps 5 and 6.

Note that myDALITE allows for alternative scripts and question types that are not PI in nature. In such scripts, questions do not need to be multiple choice; instead they can have text boxes for students to respond to a prompt (e.g., open-ended questions, reflections, journaling). Figure 5 presents an example of a specific question requiring a short explanation.

Figure 5: myDALITE screen capture for a short explanation question.

2.5. Enabling “Flipped Classrooms”

As the community of users grows, teachers are finding novel ways to use myDALITE to “flip their classrooms.” The flipped classroom is an approach to instruction that allows teachers to replace traditional lectures with strategies that engage students in learning activities, instead of relying on a transmission-of-knowledge model [12-13]. With this approach, learning starts outside the classroom: students are provided with readings, videos and/or practice exercises to introduce new content and processes. The myDALITE system offers teachers a set of features that allow them to verify whether students are completing this assigned pre-class work and, better still, whether or not they are learning. One of these features is the assignment gradebook (see Figure 6); another is Blink, a feature for in-class polling that draws on the same database of questions (see Figure 7). Below we present an example of a possible flipped classroom script using a sequence of myDALITE questions designed to confirm understanding as well as to prepare for in-class activity.

  • Before class, the teacher distributes a pre-instruction reading assignment. After completing the assigned reading students are asked to respond to a myDALITE question designed to assess students’ level of comprehension by asking them to describe their perceived understanding, based on a three-point scale (very clear, somewhat clear, very fuzzy). In addition, they are asked to accompany this with a written reflection (their rationale).

  • Students are then asked several other myDALITE conceptual questions to prime them for the upcoming lecture.

  • Once the assignment is completed (before class), the teacher accesses their myDALITE gradebook, and gets an idea of which concepts students have understood and which are difficult and need greater focus in the upcoming lecture.

  • Instructors can also see which questions have the greatest number of transitions towards the incorrect answer, and thereby which concepts need further practice before there is evidence of understanding – i.e., signs of learning. This can be set up as a series of in-class peer-instruction questions, also delivered through myDALITE via Blink, its web-based, mobile-ready clicker functionality.
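The four transition categories reported in the gradebook (Figure 6) can be tallied directly from each student's first and second answers. A minimal sketch, where the function name and data layout are our assumptions:

```python
from collections import Counter

def transition_counts(responses, correct):
    """Tally first-to-second answer transitions for one question.

    `responses` is a list of (first_answer, second_answer) pairs and
    `correct` is the keyed answer. Returns a Counter over the four
    gradebook categories: RR, RW, WR, WW.
    """
    counts = Counter()
    for first, second in responses:
        key = ("R" if first == correct else "W") + \
              ("R" if second == correct else "W")
        counts[key] += 1
    return counts
```

A high RW count on a question is the kind of signal an instructor would use to flag a concept for further in-class work.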

Figure 6: myDALITE screen capture for teacher gradebook (N = number of student responses; RR = right to right transition; RW = right to wrong transition; WR = wrong to right transition; WW = wrong to wrong transition).

Figure 7: Blink screen capture as seen by the teacher.

3. DOES MYDALITE WORK?

A total of 137 students, representing five sections of an introductory college-level Mechanics course across three similar institutions, participated in a semester-long study using myDALITE (called simply DALITE at the time of the study; we therefore refer to the tool as DALITE in sections 3 and 4). Learning gains were assessed using the Force Concept Inventory (FCI) [14] as a pre-instructional and post-instructional measure. The FCI is a 30-item multiple-choice questionnaire and the most widely used and researched assessment of Newtonian concepts [15]. The FCI results of these DALITE students were compared to those of two comparison cohorts. Cohort 1 came from a large database consisting of FCI results from similar populations of post-secondary students (N=2913). We argue that these data represented students taught with a variety of pedagogical approaches and thereby provided an unbiased comparison. Cohort 2, the “peer instruction no-DALITE” group, used a purposeful sampling method: we knowingly selected students from classes where the teachers used AL instruction, PI specifically, but did not use DALITE. These students were part of two sections, one taught by a teacher in one of the three post-secondary institutions involved in the study and another taught by a teacher at a larger institution of higher education (N=188). Comparison to such a sample is critical to ensure that the tool, and not the pedagogy, is responsible for any changes to the learning outcome – i.e., it ensures our results are authentic and meaningful. Full details and results of this study are reported elsewhere [16]; we therefore provide only the highlights.

Results of this study show that the DALITE students outperformed the regular control group (0.47±0.02 vs 0.35±0.006; p<0.001). The results show no difference between the conceptual gain of the DALITE students and those who had in-class PI (0.47±0.02 vs 0.48±0.02; p=0.84). In other words, students using DALITE (N=137) in their college courses do not differ significantly in conceptual gains (p=0.38) from students who used real-time in-class PI (N=188). The results show a surprising similarity between four of the five sections, with a small difference for one section (g5). The differences between the DALITE sections are not statistically significant (g1=0.50; g2=0.50; g3=0.47; g4=0.48; g5=0.38; p=0.06). The small but non-significant difference for section g5 is likely due to the somewhat lower number of DALITE questions assigned by that teacher; a plausible explanation is that those students did not adopt a culture of using the system and were therefore less motivated to learn from its use.
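The gains quoted above are normalized gains in the sense of Hake [1]: the fraction of the available pre-to-post improvement actually realized. A sketch of the computation (function names are ours; the FCI has 30 items):

```python
def normalized_gain(pre, post, max_score=30):
    """Normalized gain for one student: the achieved improvement as a
    fraction of the maximum possible improvement."""
    if pre >= max_score:
        return 0.0  # no room left to improve; conventions vary here
    return (post - pre) / (max_score - pre)

def class_gain(pre_scores, post_scores, max_score=30):
    """Class-level normalized gain from mean pre- and post-test
    scores, as in Hake (1998)."""
    pre_avg = sum(pre_scores) / len(pre_scores)
    post_avg = sum(post_scores) / len(post_scores)
    return (post_avg - pre_avg) / (max_score - pre_avg)
```

For example, a class moving from a mean of 12/30 to 24/30 has a gain of 12/18 ≈ 0.67.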

4. QUALITATIVE RESULTS FROM MYDALITE

Twenty-six post-instruction interviews were conducted with students from the five course sections that used DALITE. Using a purposeful sampling method [17] aimed at identifying the different ways students generated their rationales, we selected a range of students, some who wrote long rationales and others who wrote short ones. The interview data provide us with information about the specific features designed into DALITE and how students used them. These data suggest that students were motivated by a variety of reasons, the most prominent of which related to the following features of the DALITE script: (1) prompting for explanation; and (2) providing an opportunity to compare with peers. For instance, students referred to how they were “learning how to learn” when doing their DALITE assignments. In response to the question, “Why did you write longer-than-average rationales?”, a student answered:

So I was trying to explain it to myself. I wanted to get all the points out and didn’t want to leave anything out because I would print out the notes after and study that. So if I only had 2 sentences, I’d go back to it [the notes] and say I don’t understand so when I wrote out my thought process when I would go back to it [DALITE]. Next time it would be a lot easier to understand the material. I believe that’s why I write a lot.

Other students stated that DALITE provided a low-risk environment. Students were encouraged to express their understandings without the punitive specter of grades and judgement. Although grades can be assigned in DALITE for correct answers, in this study, students were graded on participation only. One student in particular, who wrote progressively longer rationales throughout the term talked about how it encouraged her to express herself:

I think it was my confidence in my physics knowledge towards the end of the course… I was kind of tentative at first. And, I was kind of like figuring out DALITE, theory, all that’s happening. So I would write a little sentence but by the end I was so used to the process and had knowledge so I put everything down, everything I could possibly think of. It’s almost [as if] I had lost the fear of being wrong, which is cool… It’s like thinking out loud.

Another finding from the interviews was a growing sense of self-regulation and awareness of one's own explanations. By reading their peers' rationales, some students began wanting to model good explanations, both for themselves and for others. One student stated that reading “choppy” rationales changed the way she wrote her own.

I used to write short rationales just thinking why I thought this was the answer, but now I explain the concept behind it and everything, so I give more detailed rationales… at first I found [rationales] like all over the place and choppy, but then I got used to [rationales] being somebody’s thinking, so it’s easier to read now…. Since you have to present [your rationale], you have to say “ok this is what we think and why.” It organizes your thoughts.

5. UNLOCKING NEW TERRAINS FOR RESEARCH

The data generated by myDALITE is fueling its own research program in learning analytics. A preliminary study of the language used by students in their rationales showed that strong students wrote rationales that were more convincing than those of weaker students, even when they were justifying wrong answers [18]. Follow-up work showed that these strong students were using the same vocabulary as the weaker ones [19], which is driving current work toward a better understanding of how the syntactic patterns students use evolve as they move from novices to experts in different disciplines. As the user base grows, myDALITE.org has become a service with its own need for machine-learning-based solutions to operational concerns, such as quality control of the database. Very recent work has examined the effectiveness of text-clustering-based methods for automatically filtering student rationales that are inappropriate or irrelevant [20].
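As a toy illustration of this kind of filtering (not the clustering method of [20] itself; the function name and thresholds are ours), one can flag rationales that are very short or share almost no vocabulary with the other rationales for the same question:

```python
import re

def flag_suspect_rationales(rationales, min_words=3, min_overlap=0.2):
    """Flag rationales that are very short or whose vocabulary barely
    overlaps with the other rationales for the same question."""
    def tokenize(text):
        return set(re.findall(r"[a-z']+", text.lower()))

    token_sets = [tokenize(r) for r in rationales]
    flagged = []
    for i, (text, tokens) in enumerate(zip(rationales, token_sets)):
        if len(tokens) < min_words:
            flagged.append(text)
            continue
        # vocabulary pooled from all *other* rationales
        pool = set().union(*(t for j, t in enumerate(token_sets) if j != i))
        if len(tokens & pool) / len(tokens) < min_overlap:
            flagged.append(text)
    return flagged
```

Production filtering would need more than word overlap, but the sketch captures the core idea: off-topic rationales stand out statistically from the pool.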

6. CONCLUSIONS

We have provided a description of a free and open web-based platform that allows instructors to use asynchronous peer instruction. Questions in our database can produce different levels of engagement (as measured by length of rationales, and time spent reviewing peer rationales). The strength of the myDALITE platform is how it brings together and can promote synergies between three of education’s major stakeholders: 1) students, who are prompted to engage in cognitive processes at the higher levels of Bloom’s taxonomy (analysis and evaluation); 2) teachers, who have access to a growing free database of conceptually rich content, from a variety of disciplines, contributed to by an international community of teachers, which can be used to provide regular formative feedback to their students, with little to no additional infrastructure overhead; and 3) educational researchers, who can work with practitioners to analyze data streams and event logs to ensure that high quality content is consistently offered in a pedagogically sound form.

In the last two years, the database of questions in myDALITE has grown rapidly. The bank now holds over 2400 questions, with over 700 in Physics (covering Classical Mechanics, Electricity and Magnetism, Waves and Modern Physics). There are similar numbers in Chemistry and Biology, as well as in many other subjects ranging from language arts to photography. The database contains over 145,000 rationales, contributed through the classes of over 200 teachers from a wide range of institutions. We are entering the next phase of development of this platform, which will allow teachers to share collections within and between institutions, thereby generating even better question types and better student-generated rationales. The continued goal of this project is to improve student learning through an evidence-based, free and open web-based platform that supports the use of active learning pedagogy.

ACKNOWLEDGMENTS

The authors thank the Programme d’aide à la recherche sur l’enseignement et l’apprentissage (PAREA; grant PA2011-06) of the Ministry of Education (MELS & MERST) in the Province of Quebec; our software programmers, Edu.8 Development (2011-14) and Scivero (2017-19); and our research assistant, Chao Zhang.

REFERENCES

[1] Hake, R. R., “Interactive-engagement vs traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses,” American Journal of Physics, 66, 64–74 (1998). https://doi.org/10.1119/1.18809

[2] Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., and Wenderoth, M. P., “Active learning increases student performance in science, engineering, and mathematics,” Proc. Natl. Acad. Sci. USA, 111(23), 8410–8415 (2014). https://doi.org/10.1073/pnas.1319030111

[3] Wieman, C. E., “Large-scale comparison of science teaching methods sends clear message,” Proc. Natl. Acad. Sci. USA, 111(23), 8319–8320 (2014). https://doi.org/10.1073/pnas.1407304111

[4] Watkins, J., and Mazur, E., “Retaining Students in Science, Technology, Engineering, and Mathematics (STEM) Majors,” Journal of College Science Teaching, 42(5), 36–41 (2013).

[5] Crouch, C. H., and Mazur, E., “Peer Instruction: Ten Years of Experience and Results,” American Journal of Physics, 69, 970–977 (2001). https://doi.org/10.1119/1.1374249

[6] Lasry, N., Mazur, E., and Watkins, J., “Peer instruction: From Harvard to the two-year college,” American Journal of Physics, 76, 1066–1069 (2008). https://doi.org/10.1119/1.2978182

[7] Zhang, P., Ding, L., and Mazur, E., “Peer Instruction in introductory physics: A method to bring about positive changes in students’ attitudes and beliefs,” Physical Review Physics Education Research, 13, 010104 (2017). https://doi.org/10.1103/PhysRevPhysEducRes.13.010104

[8] Mazur, E., “Peer Instruction,” Prentice Hall, Upper Saddle River, NJ (1997).

[9] Sinatra, G. M., and Pintrich, P. R., “The role of intentions in conceptual change learning,” in Intentional Conceptual Change, 1–18, Lawrence Erlbaum Associates, Mahwah, NJ (2003).

[10] Chi, M. T. H., de Leeuw, N., Chiu, M. H., and LaVancher, C., “Eliciting self-explanations improves understanding,” Cognitive Science, 18, 439–477 (1994).

[11] Bransford, J. D., Franks, J. J., Vye, N. J., and Sherwood, R. D., “New approaches to instruction: Because wisdom can’t be told,” in Similarity and Analogical Reasoning, 470–497, Cambridge University Press, New York, NY (1989). https://doi.org/10.1017/CBO9780511529863

[12] Bishop, J. L., and Verleger, M. A., “The flipped classroom: A survey of the research,” in Proc. American Society for Engineering Education (ASEE) National Conference (2013).

[13] Lasry, N., Dugdale, M., and Charles, E. S., “Just in time to flip your classroom,” The Physics Teacher, 52, 34 (2014). https://doi.org/10.1119/1.4849151

[14] Hestenes, D., Wells, M., and Swackhamer, G., “Force Concept Inventory,” The Physics Teacher, 30, 141–158 (1992). https://doi.org/10.1119/1.2343497

[15] McDermott, L. C., and Redish, E. F., “Resource Letter: PER-1: Physics Education Research,” American Journal of Physics, 67, 755–767 (1999). https://doi.org/10.1119/1.19122

[16] Charles, E. S., Lasry, N., Whittaker, C., Dugdale, M., Lenton, K., Bhatnagar, S., and Guillemette, J., “Beyond and Within Classroom Walls: Designing Principled Pedagogical Tools for Student and Faculty Uptake,” in Proc. International Society of the Learning Sciences - Computer Supported Collaborative Learning (CSCL) Conference (2015).

[17] Suri, H., “Purposeful sampling in qualitative research synthesis,” Qualitative Research Journal, 11, 63–75 (2011). https://doi.org/10.3316/QRJ1102063

[18] Bhatnagar, S., Desmarais, M. C., Whittaker, C., Lasry, N., Dugdale, M., Lenton, K., and Charles, E. S., “An analysis of peer-submitted and peer-reviewed answer rationales in a web-based Peer Instruction based learning environment,” in Proc. Conference on Educational Data Mining (EDM), 456–459 (2015).

[19] Bhatnagar, S., Lasry, N., Desmarais, M. C., and Charles, E. S., “DALITE: Asynchronous Peer Instruction for MOOCs,” in Proc. European Conference on Technology Enhanced Learning (EC-TEL) (2016). https://doi.org/10.1007/978-3-319-45153-4

[20] Gagnon, V., Labrie, A., Desmarais, M. C., and Bhatnagar, S., “Filtering non-relevant short answers in peer learning applications,” in Proc. Conference on Educational Data Mining (EDM) (2019).
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Elizabeth S. Charles, Nathaniel Lasry, Sameer Bhatnagar, Rhys Adams, Kevin Lenton, Yann Brouillette, Michael Dugdale, Chris Whittaker, and Phoebe Jackson "Harnessing peer instruction in and out of class with myDALITE", Proc. SPIE 11143, Fifteenth Conference on Education and Training in Optics and Photonics: ETOP 2019, 111430Z (2 July 2019); https://doi.org/10.1117/12.2523778