LifeSnaps, a 4-month multi-modal dataset capturing unobtrusive snapshots of our lives in the wild
2022 (English). In: Scientific Data, E-ISSN 2052-4463, Vol. 9, no. 1, article id 663. Article in journal (Refereed). Published.
Abstract [en]
Ubiquitous self-tracking technologies have penetrated various aspects of our lives, from physical and mental health monitoring to fitness and entertainment. Yet, limited data exist on the association between in-the-wild, large-scale physical activity patterns, sleep, stress, and overall health, and behavioral and psychological patterns, due to challenges in collecting and releasing such datasets, including waning user engagement and privacy considerations. In this paper, we present the LifeSnaps dataset: a multi-modal, longitudinal, and geographically distributed dataset containing a plethora of anthropological data, collected unobtrusively over a total period of more than 4 months from n = 71 participants. LifeSnaps contains more than 35 different data types, from second-level to daily granularity, totaling more than 71M rows of data. The participants contributed their data through validated surveys, ecological momentary assessments, and a Fitbit Sense smartwatch, and consented to make these data available to empower future research. We envision that releasing this large-scale dataset of multi-modal, real-world data will open novel research opportunities and potential applications in multiple disciplines.
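For readers planning to work with data of this kind, the sketch below shows one way multi-granularity records (second-level sensor streams alongside daily self-reports) could be aligned to a common daily level with pandas. The file names, column names, and CSV layout are illustrative assumptions, not the dataset's actual distribution format or schema.

```python
import pandas as pd

# Hypothetical example: align high-frequency sensor rows (e.g., heart rate
# sampled at second granularity) with daily survey responses. File and
# column names below are assumptions for illustration only.
hr = pd.read_csv("heart_rate.csv", parse_dates=["timestamp"])      # id, timestamp, bpm
surveys = pd.read_csv("daily_surveys.csv", parse_dates=["date"])   # id, date, stress_score

# Resample per-participant heart rate to daily means.
daily_hr = (
    hr.set_index("timestamp")
      .groupby("id")["bpm"]
      .resample("1D")
      .mean()
      .reset_index()
      .rename(columns={"timestamp": "date", "bpm": "mean_bpm"})
)

# Join the daily physiological summary with the self-reported measures.
merged = surveys.merge(daily_hr, on=["id", "date"], how="left")
print(merged.head())
```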
Place, publisher, year, edition, pages
Springer Nature, 2022. Vol. 9, no. 1, article id 663
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-321615
DOI: 10.1038/s41597-022-01764-x
ISI: 000876999500003
PubMedID: 36316345
Scopus ID: 2-s2.0-85140940218
OAI: oai:DiVA.org:kth-321615
DiVA, id: diva2:1712639
Note
QC 20221122
2022-11-22. Bibliographically approved