1 - 16 of 16
  • 1.
    Green, Anders
    et al.
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Hüttenrauch, Helge
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Topp, Elin Anna
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Severinsson Eklundh, Kerstin
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Developing a Contextualized Multimodal Corpus for Human-Robot Interaction (2006). In: Proceedings of the Fifth International Conference on Language Resources and Evaluation, 2006, p. 401-406. Conference paper (Refereed)
    Abstract [en]

    This paper describes the development process of a contextualized corpus for research on human-robot communication. The data have been collected in two Wizard-of-Oz user studies, performed with 22 and 5 users respectively, in a scenario that is called the HomeTour. In this scenario the users show the environment (a single room, or a whole floor) to the robot using a combination of speech and gestures. The corpus has been transcribed and annotated with respect to gestures and conversational acts, thus forming a core annotation. We have also annotated or linked other types of data, e.g., laser range finder readings, positioning analysis, questionnaire data and task descriptions, which form the annotated context of the scenario. By providing a rich set of different annotated data, the corpus is thus an important resource both for research on natural language speech interfaces for robots and for research on human-robot communication in general.

  • 2.
    Huettenrauch, Helge
    et al.
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Severinson Eklundh, Kerstin
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Green, Anders
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Topp, Elin A.
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Investigating spatial relationships in human-robot interaction (2006). In: IEEE/RSJ International Conference on Intelligent Robots and Systems, New York, NY: IEEE, 2006, p. 5052-5059. Conference paper (Refereed)
    Abstract [en]

    Co-presence and embodied interaction are two fundamental characteristics of the command and control situation for service robots. This paper presents a study of spatial distances and orientation of a robot with respect to a human user in an experimental setting. Relevant concepts of spatiality from social interaction studies are introduced and related to Human-Robot Interaction (HRI). A Wizard-of-Oz study quantifies the observed spatial distances and spatial formations encountered. However, it is claimed that a simplistic parameterization and measurement of spatial interaction misses the dynamic character and might be counterproductive in the design of socially appropriate robots.

  • 3.
    Hüttenrauch, Helge
    et al.
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Severinson Eklundh, Kerstin
    KTH, School of Computer Science and Communication (CSC).
    Green, Anders
    Topp, Elin A.
    KTH, School of Computer Science and Communication (CSC).
    Christensen, Henrik
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    What's in the gap?: Interaction transitions that make the HRI work (2006). In: Proceedings of the 15th IEEE International Symposium on Robot and Human Interactive Communication, 2006. Conference paper (Other academic)
    Abstract [en]

    This paper presents an in-depth analysis from a Human-Robot Interaction (HRI) study on spatial positioning and interaction episode transitions. Subjects showed a living room to a robot to teach it new places and objects. This joint task was analyzed with respect to organizing strategies for interaction episodes. Noting the importance of transitions between interaction episodes, small adaptive movements in posture were observed. This finding needs to be incorporated into HRI modules that plan and execute robots' spatial behavior in interaction, e.g., through dynamic adaptation of spatial formations and distances depending on the interaction episode.

  • 4.
    Hüttenrauch, Helge
    et al.
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Topp, Elin Anna
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    Severinson Eklundh, Kerstin
    KTH, School of Computer Science and Communication (CSC), Human - Computer Interaction, MDI.
    The Art of Gate-Crashing: Bringing HRI into users' homes (2009). Article in journal (Refereed)
    Abstract [en]

    Special purpose service robots have already entered the market and their users' homes. The idea of the general purpose service robot or personal robot companion is also increasingly discussed and investigated. To probe human-robot interaction with a mobile robot in arbitrary domestic settings, we conducted a study in eight different homes. Based on previous results from laboratory studies, we identified particular interaction situations that should be studied thoroughly in real home settings. Based upon the sensory data collected from the robot, we found that the different environments influenced the spatial management observable during our subjects' interaction with the robot. We also validated empirically that the concept of spatial prompting can aid spatial management and communication, and we assume this concept to be helpful for Human-Robot Interaction (HRI) design. In this article we report on our exploratory field study and our findings regarding, in particular, the spatial management observed during show episodes and movement through narrow passages.

  • 5. Peltason, J.
    et al.
    Siepmann, F. H. K.
    Spexard, T. P.
    Wrede, B.
    Hanheide, M.
    Topp, Elin A.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Mixed-initiative in human augmented mapping (2009). In: ICRA 2009: IEEE International Conference on Robotics and Automation, IEEE, 2009, p. 2146-2153. Conference paper (Refereed)
    Abstract [en]

    In scenarios that require close collaboration and knowledge transfer between inexperienced users and robots, the "learning by interacting" paradigm goes hand in hand with appropriate representations and learning methods. In this paper we discuss a mixed-initiative strategy for robotic learning by interacting with a user in a joint map acquisition process. We propose the integration of an environment representation approach into our interactive learning framework. The environment representation and mapping system supports both user-driven and data-driven strategies for the acquisition of spatial information, so that a mixed-initiative strategy for the learning process is realised. We evaluate our system with test runs according to the scenario of a guided tour, extending the area of operation from a structured laboratory environment to less predictable domestic settings.

  • 6. Spexard, T.
    et al.
    Li, S.
    Wrede, B.
    Hanheide, M.
    Topp, Elin Anna
    KTH, School of Computer Science and Communication (CSC).
    Hüttenrauch, Helge
    KTH, School of Computer Science and Communication (CSC).
    Interaction awareness for joint environment exploration (2007). In: RO-MAN 2007: The 16th IEEE International Symposium on Robot and Human Interactive Communication, IEEE, 2007, p. 546-551. Conference paper (Refereed)
    Abstract [en]

    An important goal for research on service robots is the cooperation of a human and a robot as a team. A service robot in a domestic environment needs to build a representation of its future workspace that corresponds to the human user's understanding of these surroundings. But it also needs to apply this model of the "where" and "what" in its current interaction to allow communication about objects and places in a human-adequate way. In this paper we present the integration of a hierarchical robotic mapping system into an interactive framework controlled by a dialog system. The goal is to use interactively acquired environment models to implement a robot with interaction-aware behaviors. A major contribution of this work is a three-level hierarchy of spatial representation affecting three different communication dimensions. This hierarchy is consequently applied in the design of the grounding-based dialog, the laser-based topological mapping, and an object attention system. We demonstrate the benefits of this integration for learning and tour guiding in a human-comprehensible interaction between a robot and its user in a home-tour scenario. The enhanced interaction capabilities are crucial for developing a new generation of robots that will be accepted not only as service robots but also as robot companions.

  • 7.
    Topp, Elin Anna
    et al.
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Christensen, Henrik I.
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Tracking for following and passing persons (2005). In: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vols 1-4, 2005, p. 70-76. Conference paper (Refereed)
    Abstract [en]

    This paper presents a multiple target tracking approach for following and passing persons in the context of human-robot interaction. The general purpose of the approach is its use in Human Augmented Mapping. This concept is presented, and it is described how navigation and person following are subsumed under it. Results from experiments under test conditions and from data collected during a user study are also provided.

  • 8.
    Topp, Elin A.
    et al.
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Christensen, Henrik I.
    Detecting Region Transitions for Human-Augmented Mapping (2010). In: IEEE Transactions on Robotics, ISSN 1552-3098, Vol. 26, no. 4, p. 715-720. Article in journal (Refereed)
    Abstract [en]

    In this paper, we describe a concise method for the feature-based representation of regions in an indoor environment and show how it can also be applied for door-passage-independent detection of transitions between regions to improve communication with a human user.

  • 9.
    Topp, Elin A.
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Christensen, Henrik I.
    Detecting structural ambiguities and transitions during a guided tour (2008). In: 2008 IEEE International Conference on Robotics and Automation, Vols 1-9, 2008, p. 2564-2570. Conference paper (Refereed)
    Abstract [en]

    Service robots designed for domestic settings need to navigate in an environment that they have to share with their users. Thus, they have to be able to report their current state and whereabouts in a way that is comprehensible for the user. Pure metric maps do not usually correspond to the understanding of the environment a user would provide. Thus, the robotic map needs to be integrated with the human representation. With our framework for Human Augmented Mapping we aim to deal with this issue and assume a guided tour as the basis for an initial mapping process. During such a tour the robotic system needs to be able to detect significant changes in its environment representation, i.e., structural ambiguities, to be able to invoke a clarification discourse with the user. In this paper we present our approach to the detection of such ambiguities, which is independent of prior specification and training of particular spatial categories. We evaluate our method on data sets obtained during several runs in indoor environments in the context of a guided tour scenario.

  • 10.
    Topp, Elin A.
    et al.
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Christensen, Henrik I.
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Topological modelling for human augmented mapping (2006). In: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vols 1-12, New York: IEEE, 2006, p. 2257-2263. Conference paper (Refereed)
    Abstract [en]

    Service robots designed for domestic settings need to navigate in an environment that they have to share with their users. Thus, they have to be able to report their current state and whereabouts in a way that is comprehensible for the user. Pure metric maps do not usually correspond to the understanding of the environment a user would provide. Thus, the robotic map needs to be integrated with the human representation. This paper describes our framework of Human Augmented Mapping that allows us to achieve this integration. We further propose a method to specify and represent regions that relate to a user's view of the environment. We assume an interactive setup for the specification of regions and show the applicability of our method both in terms of distinctiveness for space segmentation and for localisation purposes.

  • 11.
    Topp, Elin A.
    et al.
    KTH, School of Computer Science and Communication (CSC).
    Huettenrauch, Helge
    KTH, School of Computer Science and Communication (CSC).
    Christensen, Henrik I.
    KTH, School of Computer Science and Communication (CSC).
    Severinson Eklundh, Kerstin
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Bringing together human and robotic environment representations: A pilot study (2006). In: 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vols 1-12, New York: IEEE, 2006, p. 4946-4952. Conference paper (Refereed)
    Abstract [en]

    Human interaction with a service robot requires a shared representation of the environment for spoken dialogue and task specification, where the names used for particular locations depend on personal preferences. A question is how such human-oriented models can be tied to the geometric robotic models needed for precise localisation and navigation. We assume that this integration can be based on the information potential users give to a service robot about its working environment. We further believe that this information is best given in an interactive setting (a "guided tour") in this particular environment. This paper presents a pilot study that investigates how humans present a familiar environment to a mobile robot. The study is set up within our concept of Human Augmented Mapping, for which we assume an initial "guided tour" scenario to teach a robot its environment. Results from this pilot study are used to validate a proposed generic environment model for a service robot.

  • 12.
    Topp, Elin Anna
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Human-Robot Interaction and Mapping with a Service Robot: Human Augmented Mapping (2008). Doctoral thesis, monograph (Other scientific)
    Abstract [en]

    An issue widely discussed in robotics research is the ageing society, with its consequences for care-giving institutions and its opportunities for developments in the area of service robots and robot companions. The general idea of using robotic systems in a personal or private context to support an independent way of living, not only for the elderly but also for the physically impaired, is pursued in different ways, ranging from socially oriented robotic pets to mobile assistants. Thus, the idea of the personalised general service robot is not too far-fetched. Crucial for such a service robot is the ability to navigate in its working environment, which has to be assumed to be an arbitrary domestic or office-like environment that is shared with human users and bystanders. With methods developed and investigated in the field of simultaneous localisation and mapping, it has become possible for mobile robots to explore and map an unknown environment, while they can stay localised with respect to their starting point and the surroundings. These approaches, though, do not consider the representation of the environment that is used by humans to refer to particular places. Robotic maps are often metric representations of features that can be obtained from sensory data. Humans have a more topological, in fact partially hierarchical, way of representing environments. Especially for the communication between a user and her personal robot, it is thus necessary to provide a link between the robotic map and the human understanding of the robot's workspace.

    The term Human Augmented Mapping is used for a framework that allows a robotic map to be integrated with human concepts. Communication about the environment can thus be facilitated. By assuming an interactive setting for the map acquisition process, it is possible for the user to influence the process significantly. Personal preferences can be made part of the environment representation that is acquired by the robot. Advantages also become obvious for the mapping process itself, since in an interactive setting the robot can ask for information and resolve ambiguities with the help of the user. Thus, a scenario of a "guided tour", in which a user can ask a robot to follow while presenting the surroundings, is assumed as the starting point for a system for the integration of robotic mapping, interaction and human environment representations.

    A central point is the development of a generic, partially hierarchical environment model that is applied in a topological graph structure as part of an overall experimental Human Augmented Mapping system implementation. Different aspects regarding the representation of entities of the spatial concepts used in this hierarchical model, particularly regions, are investigated. The proposed representation is evaluated both as a description of delimited regions and for the detection of transitions between them. In three user studies, different aspects of the human-robot interaction issues of Human Augmented Mapping are investigated and discussed. Results from the studies support the proposed model and representation approaches and can serve as a basis for further studies in this area.

  • 13.
    Topp, Elin Anna
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Initial steps toward human augmented mapping (2006). Licentiate thesis, monograph (Other scientific)
    Abstract [en]

    With the progress in research and product development, humans and robots come closer and closer to each other, and the idea of a personalised general service robot is not too far-fetched. Crucial for such a service robot is the ability to navigate in its working environment. The environment has to be assumed to be an arbitrary domestic or office-like environment that has to be shared with human users and bystanders. With methods developed and investigated in the field of simultaneous localisation and mapping, it has become possible for mobile robots to explore and map an unknown environment, while they can stay localised with respect to their starting point and the surroundings. These approaches, though, do not consider the representation of the environment that is used by humans to refer to particular places. Robotic maps are often metric representations of features that could be obtained from sensory data. Humans have a more topological, in fact partially hierarchical, way of representing environments. Especially for the communication between a user and her personal robot, it is thus necessary to provide a link between the robotic map and the human understanding of the robot's workspace.

    The term Human Augmented Mapping is used for a framework that allows a robotic map to be integrated with human concepts. Communication about the environment can thus be facilitated. By assuming an interactive setting for the map acquisition process, it is possible for the user to influence the process significantly. Personal preferences can be made part of the environment representation that the robot acquires. Advantages also become obvious for the mapping process itself, since in an interactive setting the robot could ask for information and resolve ambiguities with the help of the user. Thus, a scenario of a "guided tour", in which a user can ask a robot to follow while presenting the surroundings, is assumed as the starting point for a system for the integration of robotic mapping, interaction and human environment representations.

    Based on results from robotics research, psychology, human-robot interaction and cognitive science a general architecture for a system for Human Augmented Mapping is presented. This architecture combines a hierarchically organised robotic mapping approach with interaction abilities with the help of a high-level environment model. An initial system design and implementation that combines a tracking and following approach with a mapping system is described. Observations from a pilot study in which this initial system was used successfully are reported and support the assumptions about the usefulness of the environment model that is used as the link between robotic and human representation.

  • 14.
    Topp, Elin Anna
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Hüttenrauch, Helge
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Christensen, Henrik Iskov
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Eklundh, Kerstin Severinson
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Acquiring a shared environment representation (2006). In: HRI '06: Proceedings of the ACM Conference on Human-Robot Interaction, 2006, p. 361-362. Conference paper (Refereed)
    Abstract [en]

    Interacting with a domestic service robot implies the existence of a joint environment model for a user and a robot. We present a pilot study that investigates how humans present a familiar environment to a mobile robot. Results from this study are used to evaluate a generic environment model for a service robot that can be personalised through interaction.

  • 15.
    Topp, Elin Anna
    et al.
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Kragic, Danica
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Jensfelt, Patric
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    Christensen, Henrik
    KTH, Superseded Departments, Numerical Analysis and Computer Science, NADA.
    An interactive interface for service robots (2004). In: 2004 IEEE International Conference on Robotics and Automation, Vols 1-5, Proceedings, 2004, p. 3469-3474. Conference paper (Refereed)
    Abstract [en]

    In this paper, we present an initial design of an interactive interface for a service robot based on multi-sensor fusion. We show how the integration of speech, vision and laser range data can be performed using a high level of abstraction. Guided by a number of scenarios commonly used in a service robot framework, the experimental evaluation shows the benefit of sensory integration, which allows the design of a robust and natural interaction system using a set of simple perceptual algorithms.

  • 16. Zivkovic, Zoran
    et al.
    Booij, Olaf
    Kröse, Ben
    Topp, Elin Anna
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Christensen, Henrik I.
    KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    From sensors to human spatial concepts: An annotated data set (2008). In: IEEE Transactions on Robotics, ISSN 1552-3098, Vol. 24, no. 2, p. 501-505. Article in journal (Refereed)
    Abstract [en]

    An annotated data set is presented that is meant to help researchers in developing, evaluating, and comparing various approaches in robotics for building space representations appropriate for communicating with humans. The data consist of omnidirectional images, laser range scans, sonar readings, and robot odometry. A set of base-level human spatial concepts is used to annotate the data.
