Group vs Individual Web Accessibility Evaluations: Effects with Novice Evaluators
Date
2016-11-01
Authors
Brajnik, Giorgio; Vigo, Markel; Yesilada, Yeliz; Harper, Simon
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Abstract
We present an experiment comparing the performance of 20 novice accessibility evaluators carrying out Web Content Accessibility Guidelines 2.0 conformance reviews individually with their performance when working in teams of two. Participants were first asked to carry out an individual assessment of a web page. They were then matched randomly into groups of two and asked to revise their initial assessments and produce a group assessment of the same page. Results indicate significant differences for sensitivity (inversely related to false negatives: +8%) and agreement (when measured in terms of the majority view: +10%). Members of groups exhibited strong agreement on the evaluation results, both with each other and with the group outcome. Other measures of validity and reliability are not significantly affected by group work. A practical implication of these findings is that, when it is important to reduce the false-negative rate, employing a group of two people is more useful than having individuals carry out the assessment. Openings for future research include exploring whether similar results hold for groups larger than two, and the effect of mixing people with different accessibility backgrounds.
Subject Keywords
Teamwork, User interfaces, Web accessibility, Accessibility evaluation
URI
https://hdl.handle.net/11511/67600
Journal
INTERACTING WITH COMPUTERS
DOI
https://doi.org/10.1093/iwc/iww006
Collections
Engineering, Article
Suggestions
An interdisciplinary heuristic evaluation method for universal building design
Afacan, Yasemin; Erbuğ, Çiğdem (Elsevier BV, 2009-07-01)
This study highlights how heuristic evaluation as a usability evaluation method can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three...
Barriers common to mobile and disabled web users
Yesilada, Yeliz; Brajnik, Giorgio; Harper, Simon (2011-09-01)
World Wide Web accessibility and best practice audits and evaluations are becoming increasingly complicated, time consuming, and costly because of the increasing number of conformance criteria which need to be tested. In the case of web access by disabled users and mobile users, a number of commonalities have been identified in usage, which have been termed situationally-induced impairments; in effect the barriers experienced by mobile web users have been likened to those of visually disabled and motor impa...
Visual concept detection by stacked generalization
Tekin, Mashar; Yarman Vural, Fatoş Tunay; Department of Computer Engineering (2014)
In this thesis, we propose a new Stacked Generalization method, called Fuzzy Stacked Generalized Ranking Optimizer, to optimize the ranking performances of visual concept detection systems. In the proposed method, fuzzy k-NN classifiers are employed in the base-layer. Then, a classifier selection algorithm is employed to select the classifiers which will be combined in meta-layer. Finally, the results of the selected classifiers are combined and classified by a fuzzy k-NN meta classifier. In the experiments...
A usability evaluation framework and a case study on a supplier portal system
Babayiğit, Elif Fatma; Şen, Tayyar; Department of Industrial Engineering (2003)
The goal of this thesis is to provide a usability evaluation framework in the area of e-procurement technologies and a case study on this basis. A survey of the concepts of human-computer interaction, usability, and usability evaluation techniques is carried out. Additionally, current e-procurement technologies are explored, and specifically a company's Supplier Portal System, which was employed in year 2003 as an e-procurement technology for the procurement of direct goods, is taken into consideration. Pointin...
Security Qualitative Metrics for Open Web Application Security Project Compliance
Sönmez, Ferda Özdemir (Elsevier BV; 2019)
The focus of this study is to find repeatable features for the large-scale enterprise web application production process, based on the OWASP security requirement list. As a result of rigorous work, including domain analysis for the Java language and development frameworks and the examination of a large set of technical documents, 230 security qualitative metrics are discovered, under six categories. These security qualitative metrics are beneficial for security analysts as well as other parties such as d...
Citation Formats
IEEE
G. Brajnik, M. Vigo, Y. Yesilada, and S. Harper, “Group vs Individual Web Accessibility Evaluations: Effects with Novice Evaluators,” INTERACTING WITH COMPUTERS, pp. 843–861, 2016, Accessed: 00, 2020. [Online]. Available: https://hdl.handle.net/11511/67600.