Stockholms universitet

Razan Jaberibraheem, Research Assistant

Research projects

Publications

A selection from Stockholm University's publication database

  • Designing with Gaze

    2019. Donald McMillan (et al.). Proceedings of the ACM on Human-Computer Interaction 3

    Article

    Recent developments in gaze tracking present new opportunities for social computing. This paper presents a study of Tama, a gaze-actuated smart speaker. Tama was designed to take advantage of research on gaze in conversation. Rather than being activated with a wake word (such as "Ok Google"), Tama detects the gaze of a user, moving an articulated 'head' to achieve mutual gaze. We tested Tama's use in a multi-party conversation task, with users successfully activating and receiving a response to over 371 queries (across 10 trials). When Tama worked well, there was no significant difference in length of interaction. However, interactions with Tama had a higher rate of repeated queries, making interactions longer overall. Video analysis lets us explain the problems users had when interacting with gaze. In the discussion, we describe implications for designing new gaze systems, using gaze both as input and output. We also discuss the relationship to anthropomorphic design and how learned skills of interaction can be taken advantage of. Finally, two paths for future work are proposed, one in the field of speech agents, and the second in using human gaze as an interaction modality more widely.

    Read more about Designing with Gaze
  • Patterns of gaze in speech agent interaction

    2019. Razan Jaberibraheem (et al.). Proceedings of the 1st International Conference on Conversational User Interfaces

    Conference

    While gaze is an important part of human-to-human interaction, it has been neglected in the design of conversational agents. In this paper, we report on our experiments with adding gaze to a conventional speech agent system. Tama is a speech agent that makes use of users' gaze to initiate a query, rather than a wake word or phrase. In this paper, we analyse the patterns of detected gaze when interacting with the device. We use k-means clustering of the log data from ten users tested in a dual-participant discussion task. These patterns are verified and explained through close analysis of the video data from the trials. We present similarities of patterns between conditions both when querying the agent and when listening to the answers. We also present the analysis of patterns detected only in the gaze condition. Users can take advantage of their understanding of gaze in conversation to interact with a gaze-enabled agent, but they are also able to fluently adjust their use of gaze to interact with the technology successfully. Our results point to some patterns of interaction which can be used as a starting point to build gaze-awareness into voice user interfaces.

    Read more about Patterns of gaze in speech agent interaction

Show all publications by Razan Jaberibraheem at Stockholm University