Trust attribution in collaborative robots: An experimental investigation of non-verbal cues in a virtual human-robot interaction setting
Date
2021-06
Author
Ahmet Meriç, Özcan
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Item Usage Stats: 456 views, 295 downloads
Abstract
This thesis reports the development of non-verbal HRI (Human-Robot Interaction) behaviors on a robotic manipulator and evaluates the role of trust in collaborative assembly tasks. To this end, we developed four non-verbal HRI behaviors, namely gazing, head nodding, head tilting, and head shaking, on a UR5 robotic manipulator, and used them under different degrees of user trust in the robot's actions. Specifically, we formed a head-on-neck posture for the cobot using its last three links together with the gripper. The gaze behavior directed the gripper towards a desired point in space and was used along with the head nodding and shaking behaviors. We designed a remote setup in which subjects interacted with the cobot via Zoom teleconferencing. In a simple collaborative scenario, the efficacy of these behaviors was assessed in terms of their impact on the formation of trust between the robot and the user and on task performance. Nineteen people of varying ages and genders participated in the experiment.
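The gaze behavior described in the abstract, orienting the gripper toward a point in space, can be sketched in its simplest geometric form: computing pan (yaw) and tilt (pitch) angles from a head position to a target point. This is a minimal illustrative sketch, not the thesis's actual UR5 implementation; the function name, frame conventions, and angle-to-joint mapping are assumptions.

```python
import math

def gaze_angles(head_pos, target_pos):
    """Return (pan, tilt) in radians that orient a head frame at
    head_pos toward target_pos. Hypothetical frame convention:
    x forward, y left, z up; pan rotates about z, tilt elevates."""
    dx = target_pos[0] - head_pos[0]
    dy = target_pos[1] - head_pos[1]
    dz = target_pos[2] - head_pos[2]
    pan = math.atan2(dy, dx)                   # rotation about the vertical axis
    tilt = math.atan2(dz, math.hypot(dx, dy))  # elevation toward the target
    return pan, tilt

# Target one meter forward and one meter to the left, at head height:
pan, tilt = gaze_angles((0.0, 0.0, 0.5), (1.0, 1.0, 0.5))
print(round(math.degrees(pan), 1), round(math.degrees(tilt), 1))  # 45.0 0.0
```

On a real manipulator these two angles would still have to be converted to joint commands through the arm's inverse kinematics, which is specific to the robot and posture used.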
Subject Keywords
Robotics, Human-Robot Interaction, Non-verbal Gestures, Telepresence Robot
URI
https://hdl.handle.net/11511/91641
Collections
Graduate School of Informatics, Thesis
Suggestions
Designing Social Cues for Collaborative Robots: The Role of Gaze and Breathing in Human-Robot Collaboration
Terzioglu, Yunus; Mutlu, Bilge; Şahin, Erol (2020-01-01)
In this paper, we investigate how collaborative robots, or cobots, typically composed of a robotic arm and a gripper carrying out manipulation tasks alongside human coworkers, can be enhanced with HRI capabilities by applying ideas and principles from character animation. To this end, we modified the appearance and behaviors of a cobot, with minimal impact on its functionality and performance, and studied the extent to which these modifications improved its communication with and perceptions by human collab...
Fractal Set-Theoretic Analysis of Performance Losses for Tuning Training Data in Learning-Systems
Erkmen, Aydan Müşerref (1992-08-28)
This paper focuses on the evaluation of learning performance in intelligent dynamic processes with supervised learning. Learning dynamics are characterized by basins of attraction generated by state transitions in control space (state space + parameter space). State uncertainty is modelled as a cellular control space, namely the cell space. Learning performance losses are related to nonseparable basins of attraction with fuzzy boundaries and to their erosions under parameter changes. Basin erosions are ana...
Simple and complex behavior learning using behavior hidden Markov model and CobART
Seyhan, Seyit Sabri; Alpaslan, Ferda Nur; Yavaş, Mustafa (2013-03-01)
This paper proposes behavior learning and generation models for simple and complex behaviors of robots using unsupervised learning methods. While the simple behaviors are modeled by simple-behavior learning model (SBLM), complex behaviors are modeled by complex-behavior learning model (CBLM) which uses previously learned simple or complex behaviors. Both models include behavior categorization, behavior modeling, and behavior generation phases. In the behavior categorization phase, sensory data are categoriz...
Communication behaviors and trust in collaborative online teams
Tokel, Saniye Tuğba; Yıldırım, Zahide (2008-01-01)
This study investigates preservice teachers' trust levels and collaborative communication behaviors, namely leadership, feedback, social interaction, enthusiasm, task and technical uncertainties, and task-oriented interactions, in an online learning environment. A case study design involving qualitative and quantitative data collection and analysis was employed. The sample consisted of 32 (24 female, 8 male) 3rd-year foreign language education students who enrolled in the "Instructional Technology and Material D...
Design of a variable five-axes adjustable configuration robot manipulator
YUCEL, AS; Ersak, Aydın (1994-04-14)
This paper presents a robot manipulator design that provides several kinematic configurations within a single structure and is, at the same time, reconfigurable for given tasks, making it considerably more flexible and adaptable to changing working environments.
Citation Formats
Ö. Ahmet Meriç, “Trust attribution in collaborative robots: An experimental investigation of non-verbal cues in a virtual human-robot interaction setting,” M.S. - Master of Science, Middle East Technical University, 2021.