17 March 2015 Parameterized framework for the analysis of visual quality assessments using crowdsourcing
Proceedings Volume 9394, Human Vision and Electronic Imaging XX; 93940C (2015)
Event: SPIE/IS&T Electronic Imaging, 2015, San Francisco, California, United States
The ability to assess the quality of new multimedia tools and applications relies heavily on the perception of the end user. To quantify this perception, subjective tests are required to evaluate the effectiveness of new technologies. However, standard subjective user studies require a highly controlled test environment and are costly in terms of both money and time. To circumvent these issues, we utilize crowdsourcing platforms such as CrowdFlower and Amazon's Mechanical Turk. The reliability of the results depends on factors that are not controlled and can be considered "hidden". We use a pre-test survey to collect responses from subjects that reveal some of these hidden factors. Using statistical analysis, we build a parameterized model that allows proper adjustment of the collected test scores.
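As a minimal illustration of the idea (not the authors' actual model, whose parameters and survey factors are described in the full paper), one could adjust each crowdsourced score by an estimated bias associated with a hidden factor revealed in the pre-test survey; the factor name and offset values below are hypothetical:

```python
def adjust_scores(raw_scores, factors, bias):
    """Subtract an estimated per-factor bias from each raw score.

    raw_scores: list of floats (e.g., 1-5 ACR ratings)
    factors:    list of factor labels, one per score, taken from
                the pre-test survey responses
    bias:       dict mapping a factor label to its estimated score offset
    """
    return [s - bias.get(f, 0.0) for s, f in zip(raw_scores, factors)]

# Hypothetical example: suppose the survey shows that workers on
# low-quality displays over-rate stimuli by about 0.5 on average.
raw = [4.0, 3.5, 5.0]
fac = ["low_display", "high_display", "low_display"]
adjusted = adjust_scores(raw, fac, {"low_display": 0.5})
print(adjusted)  # [3.5, 3.5, 4.5]
```

In practice the offsets would be estimated from the statistical analysis of the pre-test responses rather than assumed, but the sketch shows where such parameters enter the adjustment.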
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Anthony Fremuth, Velibor Adzic, and Hari Kalva "Parameterized framework for the analysis of visual quality assessments using crowdsourcing", Proc. SPIE 9394, Human Vision and Electronic Imaging XX, 93940C (17 March 2015);
