kth.se Publications
LifeSnaps, a 4-month multi-modal dataset capturing unobtrusive snapshots of our lives in the wild
Aristotle University of Thessaloniki, School of Informatics, Thessaloniki 54124, Greece (ORCID iD: 0000-0001-5772-3187)
2022 (English). In: Scientific Data, E-ISSN 2052-4463, Vol. 9, no. 1, article id 663. Article in journal (Refereed). Published.
Abstract [en]

Ubiquitous self-tracking technologies have penetrated various aspects of our lives, from physical and mental health monitoring to fitness and entertainment. Yet, limited data exist on the association between in-the-wild, large-scale physical activity patterns, sleep, stress, and overall health on the one hand, and behavioral and psychological patterns on the other, due to challenges in collecting and releasing such datasets, including waning user engagement and privacy considerations. In this paper, we present the LifeSnaps dataset: a multi-modal, longitudinal, and geographically distributed dataset containing a plethora of anthropological data, collected unobtrusively over the course of more than 4 months from n = 71 participants. LifeSnaps contains more than 35 different data types, from second-level to daily granularity, totaling more than 71M rows of data. The participants contributed their data through validated surveys, ecological momentary assessments, and a Fitbit Sense smartwatch, and consented to make these data available to empower future research. We envision that releasing this large-scale dataset of multi-modal, real-world data will open novel research opportunities and potential applications in multiple disciplines.

Place, publisher, year, edition, pages
Springer Nature, 2022. Vol. 9, no. 1, article id 663
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-321615
DOI: 10.1038/s41597-022-01764-x
ISI: 000876999500003
PubMedID: 36316345
Scopus ID: 2-s2.0-85140940218
OAI: oai:DiVA.org:kth-321615
DiVA, id: diva2:1712639
Note

QC 20221122

Available from: 2022-11-22. Created: 2022-11-22. Last updated: 2022-11-22. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text | PubMed | Scopus

Authority records

Girdzijauskas, Sarunas

Search in DiVA

By author/editor
Karagianni, Christina; Palotti, Joao; Girdzijauskas, Sarunas
By organisation
Software and Computer systems, SCS
In the same journal
Scientific Data
Computer Sciences
