Student-centered active learning (AL) instruction privileges the role of collaboration, reflection and discourse in and out of the classroom. There is accumulating evidence that active learning techniques, when implemented correctly, positively impact the learning of subject matter (including student success as well as understanding of fundamental concepts), motivation towards the subject matter, and retention in STEM (science, technology, engineering and math)-based programs [1-7]. Peer Instruction (PI) is one widely adopted in-class AL approach. Championed by physicist Eric Mazur at Harvard University, PI allows students with opposing viewpoints to discuss and debate in class. In the context of postsecondary physics, it is becoming more common for students to engage with concepts of classical mechanics, electromagnetism and wave theory via PI in class. As in-class participation becomes more centered on students and less on teacher-driven lectures, instructors are increasingly looking for ways to engage their students in PI-like interactions outside of class. Enabling students to use a PI approach outside of class, either as homework or preparatory work, requires innovative solutions to reproduce the rich conversations and reflection generated by the voting, discussion and re-voting process. It is precisely this need that drove our development of myDALITE (mydalite.org), a free web-based learning platform that provides students with asynchronous PI that can be used in class and, more importantly, out of class. This paper presents the design and use of myDALITE, along with quantitative and qualitative findings supporting the continued use and development of this platform to support active learning.
Following the PI approach, myDALITE engages students in 1) written explanation – students must provide rationales for their answers; 2) comparison of these rationales with those of peers – after submitting a rationale, students are presented with rationales from peers for their own answer and, similarly, rationales for the most popular other answer; and 3) reflection on the quality of the rationales – students can answer the question again, choosing a different rationale, keeping their original rationale or even submitting a new rationale. These three features are grounded in theories of learning that emphasize the importance of intentional reflection, explanation, and categorization tasks such as compare and contrast. The database of student-generated rationales, built up through the activity described above, becomes an integral part of the PI approach and is what makes myDALITE unique. Through this unique database of student-generated answers and written explanations for conceptual multiple-choice questions, students share their understandings with peers asynchronously. Furthermore, myDALITE assignments provide teachers with insight into what their students are thinking, before and/or after classroom instruction, allowing for the identification of misconceptions that often go undiagnosed in traditional multiple-choice assessments.
CONTENT FLOW IN MYDALITE
Activity in myDALITE starts with the class being given a myDALITE assignment, typically a collection of 3-5 conceptual multiple-choice questions intended to activate thinking on a particular topic, either as pre-class preparation or post-class review. Individual students then log into the system using a computer or mobile device. Each time students answer a question, they are asked to provide a written rationale and then to reconsider their answer and explanation in light of a sampling of what other students have chosen and said. A maximum of four peer-generated rationales are shown for two of the possible answers (the student's chosen answer and one other); these samples are drawn from the myDALITE database. Which rationales to show is decided by an algorithm that either selects two of the top-voted rationales plus two at random, or all four at random. The student-generated rationales become part of the myDALITE database and are eventually used in the myDALITE script for future students. We elaborate on the steps within this asynchronous PI script below.
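The selection policy just described can be sketched in a few lines. The following is an illustrative sketch only: the function name, the data shape (a list of rationale records with vote counts), and the coin flip between the two policies are our assumptions, not the platform's actual implementation.

```python
import random

def sample_rationales(pool, top_voted_count=2, total=4):
    """Pick up to `total` rationales to display for one answer choice.

    Mirrors the selection policy described above: either two of the
    top-voted rationales plus two chosen at random, or all four chosen
    at random. (Illustrative sketch only; not the platform's code.)
    """
    pool = list(pool)
    if len(pool) <= total:
        return pool  # too few rationales to sample from; show them all
    if random.random() < 0.5:  # assumed mixing of the two policies
        ranked = sorted(pool, key=lambda r: r["votes"], reverse=True)
        top = ranked[:top_voted_count]
        rest = [r for r in pool if r not in top]
        return top + random.sample(rest, total - top_voted_count)
    return random.sample(pool, total)
```

Either branch returns exactly four distinct rationales whenever the pool is large enough, matching the "maximum of four peer-generated rationales" behaviour described above.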
Step 1: Multiple choice and short answer questions
In the myDALITE platform, students log into the system and are directed to a prepared assignment consisting of sets of multiple-choice questions – what we call a collection – similar to those used in in-class PI. Students are then asked to follow a sequence of six steps. In the first step, presented below, students select an answer to a multiple-choice question, just as in the first step of PI. We have selected a thin-lens example to illustrate these steps.
Step 2: Providing a rationale to peers
Each student answering a question must provide an explanation of the reasoning behind the answer given (see Figure 2 below). This rationale is saved into a database that stores the answers of all students having previously responded to this question. This database serves as the basis from which the system can provide students with differing viewpoints.
Steps 3 and 4: Exposing students to rationales that are both aligned and counter to theirs
In Steps 3 and 4, students are asked to reconsider their original answer in the context of other rationales for their own answer, along with a similar selection of rationales for an alternative answer. This purposeful comparison is designed to provide the variety that is sometimes missing in face-to-face PI. In Step 4, students re-select their answer, choosing to stay with their original selection or to change it, based on the rationales (see Figure 3).
In questions that have an identifiable correct answer, such as the one displayed, students are presented with two sets of student rationales. One set is always in the same category of response as the student's. If that answer choice is not correct, the student is also presented with a group of rationales for the correct answer. However, if the student originally chose the correct answer, that student is instead presented with a set of rationales for the most popular wrong answer. Hence, students who answer correctly are not simply moved on to the next question but are exposed to the thinking of their peers, as they would be in class-based PI. This process enables students to situate their thinking in the broader context of what other students are thinking, just as they would in real-time in-class PI.
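The display logic described in this section can be summarized as a small decision function. This is our own illustrative sketch; the function and parameter names are hypothetical, not the platform's API.

```python
def answers_to_show(chosen, correct, popularity):
    """Return the two answer choices whose rationales are displayed.

    Follows the logic described above (our illustrative sketch):
    students always see rationales for their own choice; if that
    choice is wrong they also see the correct answer's rationales,
    otherwise the most popular wrong answer's.

    `popularity` maps each answer choice to the number of students
    who selected it.
    """
    if chosen != correct:
        return [chosen, correct]
    # Student was right: pair their choice with the most popular
    # wrong answer so they still see their peers' thinking.
    wrong = {a: n for a, n in popularity.items() if a != correct}
    most_popular_wrong = max(wrong, key=wrong.get)
    return [chosen, most_popular_wrong]
```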
Steps 5 and 6: Students are shown their first and second answers as well as correct answer
In the previous steps of the asynchronous PI process in myDALITE, students selected the most convincing student rationale. Having completed the process of selecting answers and reflecting on explanations of their peers, in the last two steps, students are shown what their initial response was, followed by their second response to display the change (or lack thereof) in their thinking. Finally, the correct answer, or alternately an expert rationale, is provided to the student to close this assessment cycle.
Note that myDALITE allows for alternative scripts and question types that are not PI in nature. In such scripts, questions do not need to be multiple choice; instead they can have text boxes for students to respond to a prompt (e.g., open-ended questions, reflections, journaling). Figure 5 presents an example of a specific question requiring a short explanation.
Enabling “Flipped Classrooms”
As the community of users grows, teachers are finding novel ways to use myDALITE to “flip their classrooms.” The Flipped Classroom is an approach to instruction that allows teachers to replace traditional lectures with strategies that engage students in learning activities, rather than relying on a transmission-of-knowledge model [12-13]. With this approach, learning starts outside the classroom: students are provided with readings, videos and/or practice exercises that introduce new content and processes. The myDALITE system offers teachers a set of features that allow them to verify whether students are completing this assigned pre-class work and, better still, whether or not they are learning. One of these features is the assignment gradebook (see Figure 6); the other is a feature for in-class polling, called Blink, which draws on the same database of questions (see Figure 7). Below we present an example of a possible flipped classroom script using a sequence of myDALITE questions designed to confirm understanding as well as to prepare for in-class activity.
• Before class, the teacher distributes a pre-instruction reading assignment. After completing the assigned reading students are asked to respond to a myDALITE question designed to assess students’ level of comprehension by asking them to describe their perceived understanding, based on a three-point scale (very clear, somewhat clear, very fuzzy). In addition, they are asked to accompany this with a written reflection (their rationale).
• Students are then asked several other myDALITE conceptual questions to prime them for the upcoming lecture.
• Once the assignment is completed (before class), the teacher accesses the myDALITE gradebook and gets an idea of which concepts students have understood and which are difficult and need greater focus in the upcoming lecture.
• Instructors can also see which questions produced the greatest number of transitions towards the incorrect answer, and therefore which concepts need further practice before there is greater evidence of understanding – i.e., signs of learning. These can be set up as a series of in-class peer instruction questions, also using myDALITE, through Blink, its web-based, mobile-ready clicker functionality.
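The transition analysis mentioned above can be sketched from gradebook-style records of first and second answers. The record field names here are our own illustration, not myDALITE's actual export format.

```python
from collections import Counter

def transition_counts(records):
    """Count answer transitions per question from gradebook-style rows.

    Each record is assumed to look like
    {"question": ..., "first": ..., "second": ..., "correct": ...};
    these field names are hypothetical. Returns, per question, counts
    of right->wrong, wrong->right, and unchanged outcomes.
    """
    counts = {}
    for r in records:
        c = counts.setdefault(r["question"], Counter())
        first_ok = r["first"] == r["correct"]
        second_ok = r["second"] == r["correct"]
        if first_ok and not second_ok:
            c["right_to_wrong"] += 1
        elif not first_ok and second_ok:
            c["wrong_to_right"] += 1
        else:
            c["unchanged"] += 1
    return counts

def needs_followup(counts):
    """Questions where more students moved toward a wrong answer."""
    return [q for q, c in counts.items()
            if c["right_to_wrong"] > c["wrong_to_right"]]
```

Questions flagged by `needs_followup` would be candidates for in-class re-polling.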
DOES MYDALITE WORK?
Representing five sections of an introductory college-level Mechanics course, 137 students across three similar institutions participated in a semester-long study using myDALITE (called simply DALITE at the time of the study; we therefore refer to the tool as DALITE in the upcoming sections 3 and 4). Learning gains were assessed using the Force Concept Inventory (FCI) as a pre-instructional and post-instructional measure. The FCI is a 30-item multiple-choice questionnaire and the most widely used and researched assessment of Newtonian concepts. The FCI results of these DALITE students were compared to two comparison cohorts. Cohort 1 came from a large database of FCI results from similar populations of post-secondary students (N=2913). We argue that these data represent students taught with a variety of pedagogical approaches and thereby provide an unbiased comparison. Cohort 2, the “peer instruction no-DALITE” group, was formed using a purposeful sampling method; that is, we knowingly selected students from classes where the teachers used AL instruction, PI specifically, but did not use DALITE. These students came from two sections, one taught by a teacher at one of the three post-secondary institutions involved in the study and another taught by a teacher at a larger institution of higher education (N=188). Comparison to such a sample is critical to ensure that the tool, and not the pedagogy, is responsible for any changes in learning outcomes – i.e., that our results are authentic and meaningful. Full details and results of this study are reported elsewhere; here we provide only the highlights.
Results of this study show that the DALITE students outperformed the regular control group (0.47±0.02 vs 0.35±0.006; p<0.001). The results show no difference between the conceptual gain of the DALITE students and that of the students who had in-class PI (0.47±0.02 vs 0.48±0.02; p=0.84). In other words, students using DALITE (N=137) in their college courses did not differ significantly in conceptual gains from students who used real-time in-class PI (N=188). The results show a surprising similarity between four of the five sections, with a small difference for one section (g5). The differences between the DALITE sections are not statistically significant (g1=0.50; g2=0.50; g3=0.47; g4=0.48; g5=0.38; p=0.06). The small but non-significant difference for section g5 is likely due to the somewhat lower number of DALITE questions assigned by that teacher. A plausible explanation is that those students may not have adopted a culture of using the system and therefore were less motivated to learn from its use.
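The gains reported above are consistent with the standard average normalized (Hake) gain commonly used with the FCI; the sketch below assumes that formula, g = (post − pre) / (max − pre), with the FCI's 30-item maximum. Whether the study used the per-student or class-average variant is our assumption.

```python
def normalized_gain(pre, post, max_score=30):
    """Hake's normalized gain: g = (post - pre) / (max - pre).

    The FCI has 30 items, hence the default max_score of 30.
    (Assumed formula; the study's exact computation may differ.)
    """
    if pre >= max_score:
        raise ValueError("pre-test score already at ceiling")
    return (post - pre) / (max_score - pre)
```

For example, a class moving from an average of 10/30 to 20/30 has g = 0.5, i.e., it realized half of its possible improvement.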
QUALITATIVE RESULTS FROM MYDALITE
Twenty-six post-instruction interviews were conducted with students from the five course sections that used DALITE. Using a purposeful sampling method aimed at identifying the different ways students had generated rationales, we selected a range of students, some who wrote long rationales and others who wrote short ones. The interview data provide us with information about the specific features designed into DALITE and how students used them. These data suggest that students were motivated by a variety of reasons, the most prominent of which related to the following features of the DALITE script: (1) prompting for explanation; and (2) providing an opportunity to compare with peers. For instance, students referred to how they were “learning how to learn” when doing their DALITE assignments. In response to the question, “Why did you write longer than average rationales?”, a student answered:
So I was trying to explain it to myself. I wanted to get all the points out and didn’t want to leave anything out because I would print out the notes after and study that. So if I only had 2 sentences, I’d go back to it [the notes] and say I don’t understand so when I wrote out my thought process when I would go back to it [DALITE]. Next time it would be a lot easier to understand the material. I believe that’s why I write a lot.
Other students stated that DALITE provided a low-risk environment: students were encouraged to express their understandings without the punitive specter of grades and judgement. Although grades can be assigned in DALITE for correct answers, in this study students were graded on participation only. One student in particular, who wrote progressively longer rationales throughout the term, talked about how it encouraged her to express herself:
I think it was my confidence in my physics knowledge towards the end of the course… I was kind of tentative at first. And, I was kind of like figuring out DALITE, theory, all that’s happening. So I would write a little sentence but by the end I was so used to the process and had knowledge so I put everything down, everything I could possibly think of. It’s almost [as if] I had lost the fear of being wrong, which is cool… It’s like thinking out loud.
Another finding from the interviews was a growing sense of self-regulation and awareness of one’s own explanation. By reading their peers’ rationales, some students began wanting to model good explanations, both for themselves and for others. One student stated that reading “choppy” rationales changed the way she wrote her own.
I used to write short rationales just thinking why I thought this was the answer, but now I explain the concept behind it and everything, so I give more detailed rationales… at first I found [rationales] like all over the place and choppy, but then I got used to [rationales] being somebody’s thinking, so it’s easier to read now…. Since you have to present [your rationale], you have to say “ok this is what we think and why.” It organizes your thoughts.
UNLOCKING NEW TERRAINS FOR RESEARCH
The data generated by myDALITE is fueling its own research program in learning analytics. A preliminary study of the language used by students in their rationales showed that strong students wrote rationales that were more convincing than those of weaker students, even when they were justifying wrong answers. Follow-up work showed that these strong students were using the same vocabulary as the weaker ones, which is driving current work on better understanding the evolution of the syntactic patterns students use as they move from novice to expert in different disciplines. As the user base grows, myDALITE.org has become a service with its own need for machine-learning-based solutions to operational concerns, such as quality control of the database. Very recent work has examined the effectiveness of text-clustering methods for automatically filtering student rationales that are inappropriate or irrelevant.
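As a rough illustration of this kind of filtering – a deliberately simplified stand-in for the clustering methods the cited work evaluates, not their actual approach – one can flag rationales whose bag-of-words similarity to every other rationale in the set is low:

```python
import math
from collections import Counter

def _vector(text):
    """Bag-of-words term-count vector for a short text."""
    return Counter(text.lower().split())

def _cosine(u, v):
    """Cosine similarity between two term-count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def flag_outliers(rationales, threshold=0.1):
    """Flag rationales whose mean similarity to the rest is low.

    Texts that resemble nothing else in the collection (e.g. gibberish
    or off-topic entries) are returned for human review. Simplified
    sketch; the threshold is an arbitrary illustrative value.
    """
    vecs = [_vector(t) for t in rationales]
    flagged = []
    for i, v in enumerate(vecs):
        sims = [_cosine(v, u) for j, u in enumerate(vecs) if j != i]
        if sims and sum(sims) / len(sims) < threshold:
            flagged.append(rationales[i])
    return flagged
```

A production system would use richer representations and proper clustering, but the principle – isolating texts far from every dense region of the corpus – is the same.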
We have provided a description of a free and open web-based platform that allows instructors to use asynchronous peer instruction. Questions in our database can produce different levels of engagement (as measured by length of rationales, and time spent reviewing peer rationales). The strength of the myDALITE platform is how it brings together and can promote synergies between three of education’s major stakeholders: 1) students, who are prompted to engage in cognitive processes at the higher levels of Bloom’s taxonomy (analysis and evaluation); 2) teachers, who have access to a growing free database of conceptually rich content, from a variety of disciplines, contributed to by an international community of teachers, which can be used to provide regular formative feedback to their students, with little to no additional infrastructure overhead; and 3) educational researchers, who can work with practitioners to analyze data streams and event logs to ensure that high quality content is consistently offered in a pedagogically sound form.
In the last two years, the database of questions in myDALITE has grown rapidly. The bank now holds over 2400 questions, with over 700 in Physics (representing Classical Mechanics, Electricity and Magnetism, Waves and Modern Physics). There are similar numbers in Chemistry and Biology, as well as in many other subjects ranging from language arts to photography. The database contains over 145,000 rationales, contributed through the classes of over 200 teachers from a wide range of institutions. We are entering the next phase of development of this platform, which will allow teachers to share within and between their collections, thereby generating even better question types and better student-generated rationales. The continued goal of this project is to improve student learning through an evidence-based, free and open web-based platform that supports the use of active learning pedagogy.
The authors thank the Programme d’aide à la recherche sur l’enseignement et l’apprentissage (PAREA; grant PA2011-06), the Ministry of Education (MELS & MERST) program, in the Province of Quebec; the software programmer (2011-14) Edu.8 Development and (2017-19) Scivero; our research assistant Chao Zhang.