KTH Publications (kth.se)
Publications (10 of 62)
Gokan Khan, M., Guarese, R., Johnson, F., Wang, X. V., Bergman, A., Edvinsson, B., . . . Kronqvist, J. (2025). PerfCam: Digital Twinning for Production Lines Using 3D Gaussian Splatting and Vision Models. IEEE Access, 1-1
2025 (English). In: IEEE Access, E-ISSN 2169-3536, p. 1-1. Article in journal (Refereed). Epub ahead of print.
Abstract [en]

We introduce PerfCam, an open-source Proof-of-Concept (PoC) digital twinning framework that combines camera and sensor data with 3D Gaussian Splatting and computer vision models for digital twinning, object tracking, and Key Performance Indicator (KPI) extraction in industrial production lines. By utilizing 3D reconstruction and Convolutional Neural Networks (CNNs), PerfCam offers a semi-automated approach to object tracking and spatial mapping, enabling highly accurate digital twins that capture real-time KPIs such as availability, performance, Overall Equipment Effectiveness (OEE), and rate of conveyor belts in the production line. We validate the effectiveness of PerfCam through a practical deployment within realistic test production lines in the pharmaceutical industry and contribute an openly published dataset to support further research and development in the field. The results demonstrate PerfCam’s ability to deliver actionable insights through its precise digital twin capabilities, underscoring its value as an effective tool for developing usable digital twins in smart manufacturing environments and extracting operational analytics.
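The abstract refers to standard manufacturing KPIs (availability, performance, OEE). As background only — this is the conventional textbook definition, not the paper's implementation, and the function name and validation are our own — OEE is the product of availability, performance, and quality, each expressed as a fraction:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness: the product of its three
    standard factors, each given as a fraction in [0, 1]."""
    for factor in (availability, performance, quality):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each factor must be a fraction in [0, 1]")
    return availability * performance * quality

# Example: 90% availability, 95% performance, 99% quality
print(oee(0.90, 0.95, 0.99))  # ≈ 0.846, i.e. about 85% OEE
```

A line that is up 90% of the time, runs at 95% of ideal speed, and produces 99% good units is therefore only about 85% effective overall, which is why OEE is a more demanding metric than any single factor.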

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2025
Keywords
Production Line, Visual Model, Digital Twin, Convolutional Neural Network, Computer Vision, Sensor Data, 3D Reconstruction
National Category
Computer Sciences
Research subject
Computer Science; Industrial Engineering and Management
Identifiers
urn:nbn:se:kth:diva-363250 (URN); 10.1109/access.2025.3567702 (DOI); 001492129400039; 2-s2.0-105004694919 (Scopus ID)
Projects
SMART Pharmaceutical Manufacturing
Funder
AstraZeneca, KTH-RPROJ-0146472
Note

QC 20250509

Available from: 2025-05-09. Created: 2025-05-09. Last updated: 2025-09-22. Bibliographically approved.
Vasiliu, M. M., Guarese, R., Jaatinen, J., Johnson, F., Edvinsson, B. & Romero, M. (2025). Towards Enhancing Industrial Training Through Conversational AI. In: CUI '25: Proceedings of the 7th ACM Conference on Conversational User Interfaces. Paper presented at the 7th ACM Conference on Conversational User Interfaces, CUI ’25, July 08–10, 2025, Waterloo, ON, Canada. Association for Computing Machinery (ACM)
2025 (English). In: CUI '25: Proceedings of the 7th ACM Conference on Conversational User Interfaces, Association for Computing Machinery (ACM), 2025. Conference paper, Poster (with or without abstract) (Refereed).
Abstract [en]

Conversational AI (CAI) has proven effective in educational settings; however, its potential in industrial training, where higher precision and reliability are required, remains under-explored. This work-in-progress paper proposes a study to examine how AI persona design (Machine vs. Expert Operator) and voice embodiment (Diegetic vs. Disembodied) influence cognitive load, task efficiency, and usability in industrial training. By training a large language model (LLM) on Standard Operating Procedure (SOP) data, this project aims to develop a CAI assistant that provides real-time, easy-to-access information during task execution, in an attempt to enhance training efficiency and reduce reliance on text-heavy manuals through a user-centered approach.

Place, publisher, year, edition, pages
Association for Computing Machinery (ACM), 2025
Keywords
Natural language interfaces
National Category
Natural Language Processing
Research subject
Computer Science
Identifiers
urn:nbn:se:kth:diva-367516 (URN); 10.1145/3719160.3737643 (DOI); 001539402100008; 2-s2.0-105011598225 (Scopus ID)
Conference
7th ACM Conference on Conversational User Interfaces, CUI ’25, July 08–10, 2025, Waterloo, ON, Canada
Note

QC 20250729

Available from: 2025-07-18. Created: 2025-07-18. Last updated: 2025-12-08. Bibliographically approved.
Westin, T., Rahmani, R., Palosaari-Eladhari, M. & Romero, M. (2024). An extended reality platform for inclusion of adults on the autism spectrum: A position paper. In: 15th International Conference on Ambient Systems, Networks and Technologies, ANT 2024 / The 7th International Conference on Emerging Data and Industry 4.0, EDI40 2024. Paper presented at the 15th International Conference on Ambient Systems, Networks and Technologies, ANT 2024 / The 7th International Conference on Emerging Data and Industry 4.0, EDI40 2024, Hasselt, Belgium, Apr 23-25, 2024 (pp. 476-483). Elsevier BV
2024 (English). In: 15th International Conference on Ambient Systems, Networks and Technologies, ANT 2024 / The 7th International Conference on Emerging Data and Industry 4.0, EDI40 2024, Elsevier BV, 2024, p. 476-483. Conference paper, Published paper (Refereed).
Abstract [en]

Extended reality (XR) creates new opportunities but also introduces new barriers to inclusion in society. Furthermore, XR is less researched than web, desktop, and mobile applications. This position paper presents the concept of an XR platform for inclusion, whose purpose is to make people on the autism spectrum, and people with other disabilities, more independent of help from others in everyday life situations. Based on previous research, our position is that current and future XR technologies, combined with civic and artificial intelligence, make it possible to create individually personalised support for this purpose, grounded in practice to ensure validation.

Place, publisher, year, edition, pages
Elsevier BV, 2024
Keywords
augmented reality, autism, disability, inclusive, intellectual, metaverse, universal design, XR
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-351767 (URN); 10.1016/j.procs.2024.06.050 (DOI); 2-s2.0-85199493428 (Scopus ID)
Conference
15th International Conference on Ambient Systems, Networks and Technologies, ANT 2024 / The 7th International Conference on Emerging Data and Industry 4.0, EDI40 2024, Hasselt, Belgium, Apr 23-25, 2024
Note

QC 20240820

Available from: 2024-08-13. Created: 2024-08-13. Last updated: 2024-08-20. Bibliographically approved.
Westin, T., Romero, M., Palosaari-Eladhari, M., Bejnö, H. & Rahmani, R. (2024). Assistive Augmented Reality for Adults on the Autism Spectrum with Intellectual Disability. In: Computers Helping People with Special Needs - 19th International Conference, ICCHP 2024, Proceedings. Paper presented at the 19th International Conference on Computers Helping People with Special Needs, ICCHP 2024, Linz, Austria, Jul 8-12, 2024 (pp. 257-266). Springer Nature
2024 (English). In: Computers Helping People with Special Needs - 19th International Conference, ICCHP 2024, Proceedings, Springer Nature, 2024, p. 257-266. Conference paper, Published paper (Refereed).
Abstract [en]

A common challenge for people on the autism spectrum with intellectual disability is indoor navigation and related daily activities, as found in previous research. In this paper, we report on the co-design of assistive augmented reality applications whose goal is to help people on the autism spectrum gain more independence in their daily lives. The study is based on two initial full-day workshops with staff only, followed by ten individual workshops with the end-users and their staff at day centers, using a mix of methods and prototypes. The results show a clear potential for augmented reality as assistive technology for indoor navigation, depending on individual capability and/or the complexity of environments, as well as for other activities. We also found that new barriers may arise, which we discuss for future research.

Place, publisher, year, edition, pages
Springer Nature, 2024
Keywords
Ambient and Assisted Living (AAL), Assistive Technology (AT), Labour Market Inclusion, User Centered Design and User Participation
National Category
Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-351970 (URN); 10.1007/978-3-031-62849-8_32 (DOI); 001313663100031; 2-s2.0-85200421122 (Scopus ID)
Conference
19th International Conference on Computers Helping People with Special Needs, ICCHP 2024, Linz, Austria, Jul 8 2024 - Jul 12 2024
Note

Part of ISBN 9783031628481

QC 20240830

Available from: 2024-08-19. Created: 2024-08-19. Last updated: 2025-12-05. Bibliographically approved.
Buwaider, A., El-Hajj, V. G., Iop, A., Romero, M., Jean, W., Edstrom, E. & Elmi-Terander, A. (2024). Augmented reality navigation in external ventricular drain insertion – a systematic review and meta-analysis. Virtual Reality, 28(3), Article ID 141.
2024 (English). In: Virtual Reality, ISSN 1359-4338, E-ISSN 1434-9957, Vol. 28, no. 3, article id 141. Article in journal (Refereed). Published.
Abstract [en]

External ventricular drain (EVD) insertion using the freehand technique is often associated with misplacements resulting in unfavorable outcomes. Augmented Reality (AR) has been increasingly used to complement conventional neuronavigation. The accuracy of AR-guided EVD insertion has been investigated in several studies on anthropomorphic phantoms, cadavers, and patients. This review aimed to assess the current knowledge and discuss potential benefits and challenges associated with AR guidance in EVD insertion. MEDLINE, EMBASE, and Web of Science were searched from inception to August 2023 for studies evaluating the accuracy of AR guidance for EVD insertion. Studies were screened for eligibility and accuracy data were extracted. The risk of bias was assessed using the Cochrane Risk of Bias Tool and the quality of evidence was assessed using the Newcastle-Ottawa Scale. Accuracy was reported either as the average deviation from target or according to the Kakarla grading system. Of the 497 studies retrieved, 14 were included for analysis. All included studies were prospectively designed. Insertions were performed on anthropomorphic phantoms, cadavers, or patients, using several different AR devices and interfaces. Deviation from target ranged between 0.7 and 11.9 mm. Accuracy according to the Kakarla grading scale ranged between 82 and 96%. Accuracy was higher for AR than for the freehand technique in all studies that had control groups. Current evidence demonstrates that AR is more accurate than the freehand technique for EVD insertion. However, studies are few, the technology is still developing, and there is a need for further studies on patients in relevant clinical settings.

Place, publisher, year, edition, pages
Springer Nature, 2024
Keywords
External ventricular drain, Augmented reality, Ventriculostomy, Systematic review, Accuracy, Neuronavigation
National Category
Surgery
Identifiers
urn:nbn:se:kth:diva-351442 (URN); 10.1007/s10055-024-01033-9 (DOI); 001275553900001; 2-s2.0-85199449541 (Scopus ID)
Note

QC 20240819

Available from: 2024-08-19. Created: 2024-08-19. Last updated: 2024-08-19. Bibliographically approved.
Wrife, A., Guarese, R., Iop, A. & Romero, M. (2024). Comparative analysis of spatiotemporal playback manipulation on virtual reality training for External Ventricular Drainage. Computers & Graphics, 124, Article ID 104106.
2024 (English). In: Computers & Graphics, ISSN 0097-8493, E-ISSN 1873-7684, Vol. 124, article id 104106. Article in journal (Refereed). Published.
Abstract [en]

Extensive research has been conducted in multiple surgical specialities where Virtual Reality (VR) has been utilised, such as spinal neurosurgery. However, cranial neurosurgery remains relatively unexplored in this regard. This work explores the impact of adopting VR to study External Ventricular Drainage (EVD). In this study, pre-recorded motion-captured data of an EVD procedure is visualised on a VR headset and compared with a desktop-monitor condition. Participants (N = 20) were tasked with identifying and marking a key moment in the recordings. Objective and subjective metrics were recorded, such as completion time, temporal and spatial error distances, workload, and usability. The results show that the task was completed on average twice as fast in VR as on the desktop; however, the desktop condition was less error-prone. Subjective feedback showed a slightly higher preference for the VR environment in terms of usability, with a comparable workload. Overall, VR displays are promising as an alternative tool for educational and training purposes in cranial surgery.

Place, publisher, year, edition, pages
Elsevier BV, 2024
Keywords
Virtual reality, Surgical simulations, External ventricular drainage, Motion capture, Interaction controls
National Category
Surgery
Identifiers
urn:nbn:se:kth:diva-355300 (URN); 10.1016/j.cag.2024.104106 (DOI); 001334942500001; 2-s2.0-85206016553 (Scopus ID)
Note

QC 20241030

Available from: 2024-10-30. Created: 2024-10-30. Last updated: 2024-10-30. Bibliographically approved.
Iop, A., Viberg, O., Francis, K., Norström, V., Mattias Persson, D., Wallin, L., . . . Matviienko, A. (2024). Exploring the Influence of Object Shapes and Colors on Depth Perception in Virtual Reality for Minimally Invasive Neurosurgical Training. In: CHI 2024 - Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems. Paper presented at the 2024 CHI Conference on Human Factors in Computing Systems, CHI EA 2024, Hybrid, Honolulu, United States of America, May 11-16, 2024. Association for Computing Machinery, Article ID 154.
2024 (English). In: CHI 2024 - Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, 2024, article id 154. Conference paper, Published paper (Refereed).
Abstract [en]

Minimally invasive neurosurgery (MIS) involves inserting a medical instrument, e.g., a catheter, through a small incision to target an area inside the patient's body. Training surgeons to perform MIS is challenging since the surgical site is not directly visible from their perspective. In this paper, we conducted two pilot studies focused on object shapes and colors to collect preliminary results on their influence on depth perception for MIS in Virtual Reality. In the first study (N = 8), participants inserted a virtual catheter into objects of different shapes. In the second study (N = 5), they observed the insertion of a virtual catheter into objects of different colors and backgrounds under different lighting conditions. We found that participants' precision decreased with distance and was lower with the skull shape than with a cube. Moreover, depth perception was higher with blue backgrounds under better lighting conditions.

Place, publisher, year, edition, pages
Association for Computing Machinery, 2024
Keywords
depth perception, minimally invasive neurosurgery, virtual reality
National Category
Neurology
Identifiers
urn:nbn:se:kth:diva-347323 (URN); 10.1145/3613905.3650813 (DOI); 001227587702041; 2-s2.0-85194135109 (Scopus ID)
Conference
2024 CHI Conference on Human Factors in Computing Systems, CHI EA 2024, Hybrid, Honolulu, United States of America, May 11-16, 2024
Note

QC 20240613

Part of ISBN 979-840070331-7

Available from: 2024-06-10. Created: 2024-06-10. Last updated: 2024-10-30. Bibliographically approved.
Buwaider, A., El-Hajj, V. G., Mahdi, O. A., Iop, A., Gharios, M., de Giorgio, A., . . . Elmi-Terander, A. (2024). Extended reality in cranial and spinal neurosurgery – a bibliometric analysis. Acta Neurochirurgica, 166(1), Article ID 194.
2024 (English). In: Acta Neurochirurgica, ISSN 0001-6268, E-ISSN 0942-0940, Vol. 166, no. 1, article id 194. Article, review/survey (Refereed). Published.
Abstract [en]

Purpose: This bibliometric analysis of the top 100 cited articles on extended reality (XR) in neurosurgery aimed to reveal trends in this research field. Gender differences in authorship and the global distribution of the most-cited articles were also addressed.

Methods: A Web of Science electronic database search was conducted. The top 100 most-cited articles related to the scope of this review were retrieved and analyzed for trends in publications, journal characteristics, authorship, global distribution, study design, and focus areas. After a brief description of the top 100 publications, a comparative analysis between spinal and cranial publications was performed.

Results: From 2005, there was a significant increase in spinal neurosurgery publications with a focus on pedicle screw placement. Most articles were original research studies, with an emphasis on augmented reality (AR). In cranial neurosurgery, there was no notable increase in publications. There was an increase in studies assessing both AR and virtual reality (VR) research, with a notable emphasis on VR compared to AR. Education, surgical skills assessment, and surgical planning were more common themes in cranial studies compared to spinal studies. Female authorship was notably low in both groups, with no significant increase over time. The USA and Canada contributed most of the publications in the research field.

Conclusions: Research regarding the use of XR in neurosurgery increased significantly from 2005. Cranial research focused on VR and resident education, while spinal research focused on AR and neuronavigation. Female authorship was underrepresented. North America provides most of the high-impact research in this area.

Place, publisher, year, edition, pages
Springer Nature, 2024
Keywords
Augmented reality, Bibliometrics, Extended reality, Mixed reality, Neurosurgery, Virtual reality
National Category
Surgery
Identifiers
urn:nbn:se:kth:diva-346427 (URN); 10.1007/s00701-024-06072-4 (DOI); 001211101700001; 38662229 (PubMedID); 2-s2.0-85191395591 (Scopus ID)
Note

QC 20240514

Available from: 2024-05-14. Created: 2024-05-14. Last updated: 2025-12-05. Bibliographically approved.
Buvari, S., Viberg, O., Iop, A. & Romero, M. (2023). A student-centered learning analytics dashboard towards course goal achievement in STEM education. In: Responsive and Sustainable Educational Futures: 18th European Conference on Technology Enhanced Learning, EC-TEL 2023, Proceedings. Paper presented at the 18th European Conference on Technology Enhanced Learning, EC-TEL 2023, Aveiro, Portugal, Sep 4-8, 2023 (pp. 698-704). Springer Nature
2023 (English). In: Responsive and Sustainable Educational Futures: 18th European Conference on Technology Enhanced Learning, EC-TEL 2023, Proceedings, Springer Nature, 2023, p. 698-704. Conference paper, Published paper (Refereed).
Abstract [en]

Online learning has become an everyday form of learning for many students across different disciplines, including STEM subjects in higher education. Studying in these settings requires students to self-regulate their learning to a higher degree than in campus-based education. A vital aspect of self-regulated learning is the application of goal-setting strategies. Universities support students’ goal-setting through course learning outcomes, which serve both as a promise and as a metric of academic achievement. However, a lack of clear integration between course activities and course learning outcomes leaves a dissonance between students’ study efforts and their course progress. This demo study presents a student-centered learning analytics dashboard aimed at assisting students in achieving course learning goals in STEM higher education. The dashboard was designed using a design science methodological approach. Thirty-seven students contributed to its development and evaluation during different stages of the design process, including the conceptual iterative design and prototyping. The preliminary results show that students found the tool easy to use and useful for achieving the course goals.

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Learning Analytics Dashboard, Learning Outcomes, Participatory Design, STEM Higher Education
National Category
Didactics; Educational Sciences; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-337821 (URN); 10.1007/978-3-031-42682-7_64 (DOI); 001351067800074; 2-s2.0-85171977964 (Scopus ID)
Conference
18th European Conference on Technology Enhanced Learning, EC-TEL 2023, Aveiro, Portugal, Sep 4-8, 2023
Note

Part of ISBN 9783031426810

QC 20231009

Available from: 2023-10-09. Created: 2023-10-09. Last updated: 2025-02-18. Bibliographically approved.
de Giorgio, A., Monetti, F. M., Maffei, A., Romero, M. & Wang, L. (2023). Adopting extended reality? A systematic review of manufacturing training and teaching applications. Journal of Manufacturing Systems, 71, 645-663
2023 (English). In: Journal of Manufacturing Systems, ISSN 0278-6125, E-ISSN 1878-6642, Vol. 71, p. 645-663. Article in journal (Refereed). Published.
Abstract [en]

The training of future experts and operators in manufacturing engineering relies on understanding procedural processes that require applied practice. Yet, current manufacturing education and training overwhelmingly continues to depend on traditional pedagogical methods that segregate theoretical studies and practical training. While educational institutes have generally improved theoretical studies, they often lack facilities and labs to properly reproduce the working environments necessary for practice. Even in industrial settings, it is difficult, if not impossible, to halt the actual production lines to train new operators. Recently, applications with extended reality (XR) technologies, such as virtual, augmented, or mixed reality, reached a mature technology readiness level. With this technological advancement, we can envision a transition to a new teaching paradigm that exploits simulated learning environments. Thus, it becomes possible to bridge the gap between theory and practice for both students and industrial trainees. This article presents a systematic literature review of the main applications of XR technologies in manufacturing education, their goals and technology readiness levels, and a comprehensive overview of the development tools and experimental strategies deployed. This review contributes: (1) a state-of-the-art description of current research in XR education for manufacturing systems, and (2) a comprehensive analysis of the technological platforms, the experimental procedures and the analytical methodologies deployed in the body of literature examined. It serves as a guide for setting up and executing experimental designs for evaluating interventions of XR in manufacturing education and training.

Place, publisher, year, edition, pages
Elsevier BV, 2023
Keywords
Extended reality, Augmented reality, Virtual reality, Manufacturing, Education, Technology readiness level (TRL)
National Category
Production Engineering, Human Work Science and Ergonomics; Human Computer Interaction
Identifiers
urn:nbn:se:kth:diva-340328 (URN); 10.1016/j.jmsy.2023.10.016 (DOI); 001107069600001; 2-s2.0-85175525171 (Scopus ID)
Funder
KTH Royal Institute of Technology
Note

QC 20231215

Available from: 2023-12-02. Created: 2023-12-02. Last updated: 2023-12-15. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-4616-189X
