Predicting scientific oral presentation scores in a high school photonics science, technology, engineering and mathematics (STEM) program
17 July 2014
Proceedings Volume 9289, 12th Education and Training in Optics and Photonics Conference; 92890O (2014) https://doi.org/10.1117/12.2070741
Event: 12th Education and Training in Optics and Photonics Conference, 2013, Porto, Portugal
Abstract
For the past three years, the reporting research group has operated a hybrid program combining teacher professional development with a student science, technology, engineering and mathematics (STEM) pipeline enrichment component. Overall, the program has reached 69 students from 13 counties in North Carolina and 57 teachers from 30 counties spread over a total of five states. A quantitative analysis of oral presentations given by participants at a program event is provided. Scores from multiple raters were averaged and used as a criterion in several regression analyses. Student grade point averages, most advanced science course taken, extra quality points earned in the most advanced science course taken, and posttest scores on a pilot research design survey were significant predictors of student oral presentation scores. Rationale for the findings, opportunities for future research, and implications for the iterative development of the program are discussed.
Gilchrist, Carpenter, and Gray-Battle: Predicting Scientific Oral Presentation Scores in a High School Photonics Science, Technology, Engineering and Mathematics (STEM) Program

1.

RELEVANT LITERATURE

Research indicates that the physical science workforce requires oral presentation skills (OPS) [1]. Eighty percent of survey respondents ranked OPS as highly important to their daily work, and OPS were found to be important in both academia and industry. OPS were rated as more important than writing skills and second only to critical thinking. When respondents rated the training they received in the same skills during graduate school, however, only 62% reported in-depth OPS training. Oral presentation training was the third highest rated skill, indicating a gap between workplace requirements and graduate training.

Previous research collected surveys from 87 engineers [2]. While 62% of those surveyed cited collaborative writing as a job requirement, 78% of respondents reported being required to give oral presentations, and multiple participants reported giving them regularly. Of thirteen areas suggested for improving engineering preparedness, OPS were mentioned most often; other areas for improvement included keyboarding skills and training in report preparation.

Previous research examined OPS in K-12 environments [3, 4], focusing on eliminating problem behaviors or speaking anxiety. Research using college samples has involved grading of scientific and non-scientific presentations, courses for improving OPS, physiological responses while speaking, and public speaking anxiety [5-12]. Recent research has acknowledged the scarcity of studies on OPS [13].

The intervention in question is a year-round science and information technology program for 69 high school students, their parents and 59 teachers. Goals are to prepare underrepresented minority high school students for careers in science, technology, engineering and mathematics (STEM) directly and to aid parents and teachers in preparing students indirectly. The five program components are recruitment and retention activities; physics content; teacher professional development; parental engagement; and dissemination and evaluation. Each program component is operationalized through the principles of hands-on investigations; engagement in a supportive yet challenging environment; participation in leadership and professional development training; and interacting with outside professionals.

It was predicted that student demographic information and post-test scores for photonics and research methods would reveal a significant predictive model of student OPS scores. It was hypothesized that both post-tests would be uniquely predictive and positively correlated to student scores. Another hypothesis was that previous exposure to the same research group would add significant predictive ability to this model. Previous exposure was hypothesized to increase OPS scores due to increased formal experience discussing science. As novel skills, discussing scientific information or speaking in public would be considered controlled processes. Performing multiple controlled processes can be difficult and taxing. Automatizing a task aids an individual who is attempting to perform multiple tasks at once [14]. Public speaking is a common fear, and extra exposure to science might reduce a participant’s speaking anxiety. It was also hypothesized that a student’s total number of extracurricular activities would improve the predictive ability of a model including demographic information.

2.

METHOD

This paper is part of a larger study following students for three years.

2.1

Participants

A total of 50 students participated in this research. The mean age for students was 14.5 years old (SD=0.61).

2.2

Procedure

Students delivered scientific presentations discussing self-directed projects at a program event. Presentations were on topics in the science, technology, engineering, and mathematics (STEM) areas. Presentations were graded by instructors using an Oral Presentation in Science Performance List Rubric [15], consisting of 18 criteria graded on a 1-4 scale. Performance criteria were considered related to content and organization, presentation, or audience aspects of delivery. Instructors graded multiple presentations during the event, resulting in 2-5 grades per presentation. Analyses conducted used a student’s averaged presentation score.
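The score-averaging step described above can be sketched in a few lines. This is a minimal illustration with invented rater totals (the student labels and values are hypothetical, not data from the study); each presentation's 2-5 rubric totals are simply averaged into the single criterion score used in the analyses:

```python
# Illustrative sketch of averaging 2-5 rater grades per presentation.
# The 18-criterion rubric scored 1-4 per criterion yields totals of 18-72;
# all names and values below are invented for demonstration.
from statistics import mean

grades = {
    "student_a": [61, 58, 64],  # three raters
    "student_b": [52, 55],      # two raters
}

avg_scores = {student: mean(g) for student, g in grades.items()}
print(avg_scores)  # student_a averages 61, student_b averages 53.5
```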

2.3

Measurement

Ten predictor variables and one criterion were used in this research. The criterion was a student’s average oral presentation grade. Student cohort (0 for first or 1 for second) was included. Application age was a student’s age at the time they applied for the program. Information regarding weighted grade point average (G.P.A.), base science course, extra science course quality points, and total number of extracurricular activities were all taken from student applications. Extra science quality points were defined as the number of additional quality points earned by a class towards weighted G.P.A. (0-2). Base science courses were given numeric values based on their place in the North Carolina Standard Course of Study (NCSCOS). Previous participation in programs by the research group (previous exposure) was determined from records within the organization and scored in a binary manner.

3.

RESULTS

Multiple regression analysis was used to test hypothesis 1, that participant demographic and post-test information would produce a significantly predictive model of student OPS scores. The resulting model explained 56% of the variance in the dataset (F(8, 45)=5.98, p<.05). While a student’s cohort, age, weighted grade point average (G.P.A.), science course, extra quality points, and photonics knowledge post-test scores correlated positively with OPS grades, only initial G.P.A. (b=0.49, t=2.87, p<.05), science course (b=0.49, t=2.84, p<.05), and extra quality points (b=0.38, t=2.25, p<.05) were significantly predictive. While gender and research post-test scores correlated negatively with OPS grades, only research post-test scores (b=-0.42, t=-2.23, p<.05) were significantly predictive (see Table 1).
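The multiple-regression procedure above can be sketched with ordinary least squares. The following is a minimal numpy sketch on synthetic data; the predictor names echo the paper, but every value is invented, so the fitted coefficients, R², and F will not match the reported results:

```python
# Minimal OLS sketch of a multiple regression predicting an OPS-style
# criterion; all data are synthetic (illustration only).
import numpy as np

rng = np.random.default_rng(0)
n = 54                                    # sample size chosen so n - 8 - 1 = 45, as in F(8, 45)
gpa = rng.normal(3.5, 0.4, n)             # hypothetical weighted G.P.A.
course = rng.integers(1, 5, n).astype(float)  # hypothetical course-level code
y = 6.6 * gpa + 3.9 * course + rng.normal(0, 4, n)  # toy criterion

X = np.column_stack([np.ones(n), gpa, course])      # intercept + predictors
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # least-squares fit
resid = y - X @ beta
ss_tot = (y - y.mean()) @ (y - y.mean())
r2 = 1 - (resid @ resid) / ss_tot                   # variance explained

k = X.shape[1] - 1                                  # number of predictors
f_stat = (r2 / k) / ((1 - r2) / (n - k - 1))        # overall model F(k, n-k-1)
print(f"R^2 = {r2:.2f}, F({k}, {n - k - 1}) = {f_stat:.1f}")
```

The same R²-and-F reporting convention (F with k and n-k-1 degrees of freedom) is the form used throughout the Results section.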

Multiple regression analysis was also used to test hypothesis 2, that both photonics and research knowledge post-test scores would be significant predictors of student oral presentation grades. While the model was significant, only research post-test scores (b=-0.42, t=-2.23, p<.05) were significantly predictive. Photonics post-test scores (b=0.13, t=0.86, p>.05) were not uniquely predictive of OPS scores.

Hierarchical regression testing investigated the hypothesis that previous exposure would be a unique predictor of OPS scores and would add a significant amount of explained variance to the model from hypothesis 1. Block 1 contained the variables of cohort, age, weighted G.P.A., gender, science course, extra quality points, photonics post-test score, and research post-test score. Block 1 was significantly predictive, explaining 56% of the variance in the data (F(8, 45)=5.98, p<.05). While cohort, age, G.P.A., science course, extra quality points, previous exposure, and photonics knowledge post-test scores all correlated positively with OPS grades, only G.P.A. (b=0.49, t=2.87, p<.05), science course (b=0.49, t=2.84, p<.05), and extra quality points (b=0.38, t=2.25, p<.05) were uniquely predictive. Gender and research knowledge post-test scores correlated negatively with OPS grades, but only research post-test scores (b=-0.42, t=-2.23, p<.05) were uniquely predictive (see Table 2). Block 2 added previous exposure, which correlated positively with the criterion, resulting in a non-significant change in variance explained (F(1, 36)=5.18, p<.05). All unique predictors from Block 1 remained predictive in Block 2.

Table 1

Results of Multiple Regression for Student Oral Presentation Scores using Demographic and Posttest Data

Predictor               B       SE      β
Cohort                  3.39    3.45    0.18
Application Age         2.95    1.76    0.20
Initial Weighted GPA    6.60*   2.30    0.49
Gender                 -4.13    2.39   -0.22
Base Science Course     3.85*   1.36    0.49
Extra Science QPs       7.18*   3.19    0.38
Photonics Posttest      0.13    0.16    0.13
Research Posttest      -1.57*   0.70   -0.42
R²                      0.56

Note. *p<.05

Table 2

Results of Hierarchical Regression for Student Oral Presentation Scores using Demographic, Posttest Data, and Previous Intervention Exposure

                        Block 1                 Block 2
Predictor               B       SE      β       B       SE      β
Cohort                  3.29    3.45    0.18    3.39    3.62    0.18
Application Age         2.95    1.76    0.20    2.93    1.80    0.19
Initial Weighted GPA    6.60*   2.30    0.49    6.65*   2.39    0.49
Gender                 -4.13    2.39   -0.22   -4.19    2.50   -0.22
Base Science Course     3.85*   1.36    0.49    3.81*   1.43    0.48
Extra Science QPs       7.18*   3.19    0.38    7.15*   3.25    0.38
Photonics Posttest      0.13    0.16    0.13    0.13    0.16    0.13
Research Posttest      -1.57*   0.70   -0.42   -1.55*   0.72   -0.41
Previous Contact                                0.29    2.80    0.02
R²                      0.56                    0.56
ΔR²                                             0.00

Note. *p<.05
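The Block 2 change test used in the hierarchical regressions (the ΔR² rows in Table 2 and Table 3) can be sketched as follows. This is a numpy-only illustration on synthetic data, so the printed ΔR² and F-change will not match the paper's values; the variable names and block composition are assumptions for demonstration:

```python
# Sketch of a hierarchical-regression F-change test: fit Block 1, add one
# predictor in Block 2, and test whether the gain in R^2 is significant.
# All data below are synthetic (illustration only).
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit of y on the columns of X (intercept included in X)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(1)
n = 54
block1 = rng.normal(size=(n, 8))                 # 8 demographic/post-test predictors
exposure = rng.integers(0, 2, n).astype(float)   # binary previous-exposure flag
y = block1 @ rng.normal(size=8) + rng.normal(size=n)

X1 = np.column_stack([np.ones(n), block1])       # Block 1 design matrix
X2 = np.column_stack([X1, exposure])             # Block 2 adds one predictor
r2_1, r2_2 = r_squared(X1, y), r_squared(X2, y)

# F-change for adding m predictors on top of k total predictors
m, k = 1, X2.shape[1] - 1
f_change = ((r2_2 - r2_1) / m) / ((1 - r2_2) / (n - k - 1))
print(f"dR^2 = {r2_2 - r2_1:.3f}, F_change({m}, {n - k - 1}) = {f_change:.2f}")
```

A non-significant F-change, as found for both previous exposure and extracurricular activities, means the added predictor does not explain meaningful variance beyond Block 1 even though R² can only stay the same or rise when a predictor is added.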

Hierarchical regression also investigated whether the total number of student extracurricular activities was a unique predictor of OPS scores. Block 1 consisted of cohort, age, G.P.A., gender, science course, extra quality points, photonics post-test score, and research post-test score. Block 1 was significantly predictive, explaining 56% of the variance in the dataset (F(8, 45)=5.98, p<.05). Cohort, age, G.P.A., science course, extra quality points, and photonics post-test scores correlated positively with OPS grades, but only G.P.A. (b=0.49, t=2.87, p<.05), science course (b=0.49, t=2.84, p<.05), and extra quality points (b=0.38, t=2.25, p<.05) were uniquely predictive. Gender and research post-test scores correlated negatively with OPS grades, but only research scores (b=-0.42, t=-2.23, p<.05) were uniquely predictive (see Table 3). Block 2 added the number of student extracurricular activities, resulting in a non-significant change in the amount of variance explained (F(1, 36)=5.38, p<.05). Extracurricular activities correlated positively with OPS grades without being uniquely predictive. Unique predictors in Block 1 remained significantly predictive in Block 2.

Table 3.

Results of Hierarchical Regression for Student Oral Presentation Scores using Demographic, Posttest Data, and Total Number of Extracurricular Activities

                        Block 1                 Block 2
Predictor               B       SE      β       B       SE      β
Cohort                  3.29    3.45    0.18    2.86    3.50    0.15
Application Age         2.95    1.76    0.20    3.16    1.78    0.21
Initial Weighted GPA    6.60*   2.30    0.49    6.82*   2.32    0.50
Gender                 -4.13    2.39   -0.22   -4.13    2.40   -0.22
Base Science Course     3.85*   1.36    0.49    3.74*   1.37    0.47
Extra Science QPs       7.18*   3.19    0.38    7.23*   3.20    0.38
Photonics Posttest      0.13    0.16    0.13    0.13    0.16    0.13
Research Posttest      -1.57*   0.70   -0.42   -1.60*   0.71   -0.43
Extracurriculars                                0.75    0.85    0.10
R²                      0.56                    0.57
ΔR²                                             0.01

Note. *p<.05

4.

DISCUSSION

The hypothesis that demographic and post-test data would produce a predictive model of OPS scores was supported. The hypothesis that both post-tests would be significantly predictive of OPS scores was partially supported. Previous exposure and the number of extracurricular activities failed to explain significant amounts of additional variance; thus, hypotheses 3 and 4 were not supported.

The lack of significant results for student cohort suggests that iterations of the multi-year program may not have significantly improved individuals’ OPS scores; future testing should investigate this further. Age correlated non-significantly with OPS grades, possibly due to the limited age range of participants; future investigations should consider a wider age range of students. The non-significance of the photonics post-test suggests that while increased general knowledge of photonics contributed to grades, it was not essential for discussing specific projects. Future test versions could include open-ended questions to mimic students’ expressing original thoughts. Gender correlated non-significantly and negatively with OPS scores; this should be further investigated using between-groups testing. Previous exposure correlated positively with OPS scores but was not significant, suggesting that although multiple programs by the same research group may contribute to developing OPS, any benefit was too small to be detected. The number of extracurricular activities correlated positively but non-significantly with OPS scores; future investigation should examine the differing utility of the total number of extracurricular activities versus science-based extracurricular activities only. The finding that the research post-test was uniquely predictive suggests research-based extracurricular activities may be beneficial for developing OPS.

Results suggest students taking science courses further along the NCSCOS were better equipped to present information. An explanation for this finding is in students’ improved mental models for science due to increased exposure to scientific topics. Results suggest taking a more rigorous course is related to effectiveness at delivering oral scientific information. This may be due to increased breadth or depth of topics covered, or it may be due to the nature of the assignments in such courses. Future research should examine how advanced courses contribute to improving OPS. Results suggest higher performance in overall coursework reflects an increased ability to present information. Explanations include that students with higher G.P.A.s have more developed OPS independent of coursework, or that students with higher G.P.A.s may be more likely to take more advanced courses which have an increased probability of using assignments that require public dissemination of information.

While the research test predicted OPS scores, it was the only significant predictor with a negative relationship to the criterion. It is possible the research test focuses on declarative knowledge concerning scientific presentations instead of procedural knowledge. It is also possible that the research knowledge test focuses on procedural knowledge that students are unable to express due to automaticity. One of the difficulties in working with subject matter experts is their inability to verbalize the reasons behind their actions or opinions, and one of the issues in designing training meant to alter behavior is the difficulty of helping participants translate declarative information into appropriate procedural modifications [16]. It is possible that students who knew more about proper research may have focused on elements of their study that were uninteresting or beyond the scientific understanding of those grading them. Finally, it may be that those with an increased knowledge of proper research are more likely to have deficits in their public speaking or other social skills.

Findings here may be of interest to those involved with interventions for high school students, but this research was not without limitations. Longitudinal efforts should attempt to determine if the predictive value of variables identified in this research remains over time with the same students. The current research paired demographic application information with student test scores after treatment. Future research should attempt to replicate findings with more current information as course load rigor may change over time.

While the photonics knowledge test was positively correlated with OPS grade and the research knowledge test was a unique predictor of presentation score, both tests were developed in-house and are in need of validation. Future research should investigate the reliability and validity of these tools. The 10 judges averaged only 2-3 grades per student. Future research into student presentations should attempt to obtain a larger number of scores for each student, minimizing the risk of bias or noise due to individual judges.

The Photonics Leaders II Program was fully funded by the National Science Foundation Innovative Technology Experiences for Students and Teachers program, under the Division of Research on Learning in Formal and Informal Settings (Award #0833615).

REFERENCES

[1] Smith, S. J., Pedersen-Gallegos, L., & Riegle-Crumb, C. (2002). The training, careers, and work of Ph.D. physical scientists: Not simply academic. American Journal of Physics, 70(11), 1081-1092. doi: 10.1119/1.1510884.

[2] Keane, A., & Gibson, I. (1999). Communication trends in engineering firms: Implications for undergraduate engineering courses. International Journal of Engineering Education, 15(2), 115-121.

[3] Scheeler, M. C., Macluckie, M., & Albright, K. (2010). Effects of immediate feedback delivered by peer tutors on the oral presentation skills of adolescents with learning disabilities. Remedial and Special Education, 31(2), 77-86. doi: 10.1177/0741932508327458.

[4] Rickards-Schlichting, K. A., Kehle, T. J., & Bray, M. A. (2004). A self-modeling intervention for high school students with public speaking anxiety. Journal of Applied School Psychology, 20(2), 47-60.

[5] Gray, L. E., McCrorie, P., & Cushing, A. (1997). Presentation skills: A course for students on voice production and confidence-building. Advances in Medical Education, 753-755.

[6] Langan, A. M., Wheater, C. P., Shaw, E. M., Haines, B. J., Cullen, W. R., Boyle, J. C., Penney, D., Oldekop, J. A., Ashcroft, C., Lockey, L., & Preziosi, R. F. (2005). Peer assessment of oral presentations: Effects of student gender, university affiliation and participation in the development of assessment criteria. Assessment & Evaluation in Higher Education, 30(1), 21-34. doi: 10.1080/0260293042000243878.

[7] Hay, I. (1994). Justifying and applying oral presentations in geographical education. Journal of Geography in Higher Education, 18(1), 43-55. doi: 10.1080/03098269408709236.

[8] Magin, D. J., Helmore, P. J., & Baker, J. E. (2001). Assessing students' oral communication skills - which skills? 4th UICEE Annual Conference on Engineering Education, Bangkok, Thailand, 237-242.

[9] Magin, D., Helmore, P., & Barber, T. (2003). Differences in the criteria used by staff and students in assessing oral presentation skills. 6th UICEE Annual Conference on Engineering Education, Cairns, Australia, 263-266.

[10] Avery, S. (1999). Teaching advanced skills in English studies: The work of the Speak-Write Project. Innovations in Education and Teaching International, 36(3), 192-197. doi: 10.1080/1355800990360304.

[11] Fichera, L. V., & Andreassi, J. L. (2000). Cardiovascular reactivity during public speaking as a function of personality variables. International Journal of Psychophysiology, 37(3), 267-273.

[12] Wigal, C. M. (2007). The use of peer evaluations to measure student performance and critical thinking ability. 2007 37th Annual Frontiers in Education Conference, S3B-7-S3B-12. doi: 10.1109/FIE.2007.4417964.

[13] De Grez, L., Valcke, M., & Roozen, I. (2009). The impact of an innovative instructional intervention on the acquisition of oral presentation skills in higher education. Computers and Education, 53(1), 112-120. doi: 10.1016/j.compedu.2009.01.005.

[14] Rosenshine, B., & Stevens, R. (1986). Teaching functions. In M. C. Wittrock (Ed.), Handbook of Research on Teaching (3rd ed., pp. 376-391). New York: Macmillan.

[15] Lantz, H. B. (2004). Rubrics for Assessing Student Achievement in Science Grades K-12. Thousand Oaks, CA: Corwin Press.

[16] Lippa, K. D., Klein, H. A., & Shalin, V. L. (2008). Everyday expertise: Cognitive demands in diabetes self-management. Human Factors, 50(1), 112-120. doi: 10.1518/001872008X250601.

© (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Pamela Olivia Gilchrist, Eric D. Carpenter, and Asia Gray-Battle, "Predicting scientific oral presentation scores in a high school photonics science, technology, engineering and mathematics (STEM) program", Proc. SPIE 9289, 12th Education and Training in Optics and Photonics Conference, 92890O (17 July 2014); https://doi.org/10.1117/12.2070741