1 - 3 of 3
  • 1.
    Avramova, Vanya (KTH)
    Yang, Fangkai (KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST))
    Li, Chengjie (KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST))
    Peters, Christopher (KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST))
    Skantze, Gabriel (KTH, School of Computer Science and Communication (CSC), Speech, Music and Hearing, TMH)
    A virtual poster presenter using mixed reality (2017). In: 17th International Conference on Intelligent Virtual Agents, IVA 2017, Springer, 2017, Vol. 10498, p. 25-28. Conference paper (Refereed)
    Abstract [en]

    In this demo, we will showcase a platform we are currently developing for experimenting with situated interaction using mixed reality. The user will wear a Microsoft HoloLens and be able to interact with a virtual character presenting a poster. We argue that a poster presentation scenario is a good test bed for studying phenomena such as multi-party interaction, speaker role, engagement and disengagement, information delivery, and user attention monitoring.

  • 2.
    Li, Chengjie (KTH)
    Androulakaki, Theofronia (KTH)
    Gao, Alex Yuan
    Yang, Fangkai (KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST))
    Saikia, Himangshu (KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST))
    Peters, Christopher (KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST))
    Skantze, Gabriel (KTH, School of Electrical Engineering and Computer Science (EECS), Speech, Music and Hearing, TMH)
    Effects of Posture and Embodiment on Social Distance in Human-Agent Interaction in Mixed Reality (2018). In: Proceedings of the 18th International Conference on Intelligent Virtual Agents, ACM Digital Library, 2018, p. 191-196. Conference paper (Refereed)
    Abstract [en]

    Mixed reality offers new potential for social interaction experiences with virtual agents. In addition, it can be used to experiment with the design of physical robots. However, while previous studies have investigated comfortable social distances between humans and artificial agents in real and virtual environments, there is little data with regard to mixed reality environments. In this paper, we conducted an experiment in which participants were asked to walk up to an agent to ask a question, in order to investigate both the social distances maintained and the subjects' experience of the interaction. We manipulated the embodiment of the agent (robot vs. human and virtual vs. physical) as well as the closed vs. open posture of the agent. The virtual agent was displayed using a mixed reality headset. Our experiment involved 35 participants in a within-subject design. We show that, in the context of social interactions, mixed reality fares well against physical environments, and robots fare well against humans, barring a few technical challenges.

  • 3.
    Yang, Fangkai (KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST))
    Li, Chengjie (KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST))
    Palmberg, Robin (KTH, School of Computer Science and Communication (CSC))
    Van der Heide, Ewoud (KTH, School of Computer Science and Communication (CSC))
    Peters, Christopher (KTH, School of Computer Science and Communication (CSC), Computational Science and Technology (CST))
    Expressive Virtual Characters for Social Demonstration Games (2017). In: 2017 9th International Conference on Virtual Worlds and Games for Serious Applications, VS-Games 2017 - Proceedings, IEEE, 2017, p. 217-224. Conference paper (Refereed)
    Abstract [en]

    Virtual characters are an integral part of many game and learning environments and have practical applications as tutors, demonstrators, or even representations of the user. However, creating virtual character behaviors can be a time-consuming and complex task requiring substantial technical expertise. To accelerate and better enable the use of virtual characters in social games, we present a virtual character behavior toolkit for the development of expressive virtual characters. It is a middleware toolkit that sits on top of the game engine, with a focus on providing high-level character behaviors for quickly creating social games. The toolkit can be adapted to a wide range of scenarios involving social interactions with individuals and groups at multiple distances in the virtual environment, and it supports customization and control of facial expressions, body animations, and group formations. We describe the design of the toolkit, provide an exemplar of a small game being created with it, and outline our intended future work on the system.
