Article

Assessment of Video Accessibility by Students of a MOOC on Digital Materials for All

Journal

IEEE Access
Volume 9, Pages 72357-72367

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/ACCESS.2021.3079199

Keywords

Visualization; Media; Streaming media; Electronic learning; Computer aided instruction; Task analysis; Synchronization; Accessibility; evaluation; human-computer interaction; social; video

Funding

  1. Fundacion ONCE (ONCE Foundation)
  2. Real Patronato sobre Discapacidad (Royal Board on Disability) of the Spanish Ministry of Social Rights and 2030 Agenda

Abstract

The assessment of multimedia accessibility is a relevant, complex, and time-consuming task that involves more than simply checking whether a video has audio description and captions. In our study, we address this challenge through: 1) the involvement of a cohort of novice evaluators who had previously taken part in a MOOC on the accessibility of digital content, and 2) the division of the accessibility assessment into the application of a set of criteria. Two groups of novice accessibility testers were asked to evaluate the accessibility of two similar videos, one video per group. While both videos were equivalent in terms of their pedagogical content, only one of them had non-severe accessibility barriers for people with low vision and for blind people. Each participant was asked to qualitatively rate a set of statements extracted from the WCAG 2.1 success criteria, one generic statement about the accessibility of the video, and a set of statements on perceived quality and personal preference. The largest differences in ratings occurred for the statements whose success criteria had been improved in one of the videos. This was also the case for one success criterion that, according to the literature, is understandable by novice evaluators but hard for them to apply. However, the difference was statistically significant only for the success criterion with the most salient differences between the two videos. As a main conclusion, a group of novice evaluators can identify accessibility problems in videos when using specific accessibility statements.
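
The abstract reports a significance test on the per-statement ratings but does not name the test used. As an illustration only, the following minimal Python sketch assumes a non-parametric Mann-Whitney U test on hypothetical Likert-style ratings (1-5) of one statement, collected from the two groups of novice evaluators; the data and the choice of test are assumptions, not the authors' method.

from scipy.stats import mannwhitneyu

# Hypothetical ratings (1 = strongly disagree ... 5 = strongly agree) of one
# WCAG-derived statement, one list per evaluator group.
ratings_accessible = [5, 4, 5, 4, 4, 5, 3, 4]      # group shown the video without barriers
ratings_with_barriers = [3, 2, 4, 3, 2, 3, 3, 2]   # group shown the video with non-severe barriers

# Compare the two independent samples of ordinal ratings.
stat, p_value = mannwhitneyu(ratings_accessible, ratings_with_barriers, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")        # p < 0.05 would flag the difference as significant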
