Diagnostic Prediction Using Discomfort Drawing with IBTM
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. ORCID iD: 0000-0002-8640-9370
KTH, School of Computer Science and Communication (CSC), Robotics, Perception and Learning, RPL (Computer Vision and Active Perception, CVAP). ORCID iD: 0000-0002-5750-9655
(KI)
2016 (English). Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we explore the possibility of applying machine learning to make diagnostic predictions from discomfort drawings. A discomfort drawing is an intuitive way for patients to express discomfort and pain-related symptoms. These drawings have proven to be an effective method for collecting patient data and making diagnostic decisions in real-life practice. A dataset of real-world patient cases is collected, for which medical experts provide diagnostic labels. We then use a factorized multimodal topic model, the Inter-Battery Topic Model (IBTM), to train a system that makes diagnostic predictions given an unseen discomfort drawing. The number of output diagnostic labels is determined by mean-shift clustering on the discomfort drawing. Experimental results show reasonable predictions of diagnostic labels for unseen discomfort drawings. Additionally, we use IBTM to generate synthetic discomfort drawings given a diagnostic label, which yields typical cases of symptoms. These positive results indicate significant potential for machine learning to be used in parts of the pain diagnostic process and to serve as a decision support system for physicians and other health care personnel.
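
The clustering step mentioned in the abstract (determining how many diagnostic labels to output) can be illustrated with a small sketch. The Python snippet below, assuming a binarized drawing mask and scikit-learn's MeanShift, counts the marked regions of a drawing; the mask, the bandwidth choice, and the function name count_symptom_regions are hypothetical placeholders that only illustrate the idea, not the paper's actual pipeline.

# Hedged sketch: estimate how many diagnostic labels to output by
# mean-shift clustering the marked regions of a discomfort drawing.
# Assumes the drawing is available as a binary mask (marked = 1); the
# mask and bandwidth below are illustrative, not the paper's preprocessing.
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def count_symptom_regions(drawing_mask: np.ndarray) -> int:
    """Return the number of discomfort regions found in a binary drawing mask."""
    # Coordinates of all marked pixels, shape (n_points, 2).
    points = np.argwhere(drawing_mask > 0)
    if len(points) == 0:
        return 0
    # Bandwidth controls how far apart marks can be and still merge
    # into one region; here it is estimated from the data.
    bandwidth = estimate_bandwidth(points, quantile=0.2)
    if bandwidth <= 0:
        return 1  # all marks are in one tight region
    ms = MeanShift(bandwidth=bandwidth, bin_seeding=True)
    ms.fit(points)
    return len(np.unique(ms.labels_))

# Toy example: two separated blobs of marks -> two regions, hence two labels.
mask = np.zeros((100, 100), dtype=int)
mask[10:20, 10:20] = 1   # e.g. discomfort marked near the neck
mask[70:80, 60:70] = 1   # e.g. discomfort marked near the lower back
print(count_symptom_regions(mask))  # expected: 2

In the paper, the resulting region count plays the role of the number of diagnostic labels requested from the trained IBTM; the clustering itself is independent of the topic model.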

Place, publisher, year, edition, pages
2016. Vol. 56
National Category
Other Engineering and Technologies
Identifiers
URN: urn:nbn:se:kth:diva-197296
OAI: oai:DiVA.org:kth-197296
DiVA, id: diva2:1051363
Conference
Machine Learning in Health Care
Note

QC 20161205

Available from: 2016-12-01 Created: 2016-12-01 Last updated: 2024-03-15. Bibliographically approved

Open Access in DiVA

fulltext (5192 kB), 155 downloads
File information
File name: FULLTEXT01.pdf
File size: 5192 kB
Checksum (SHA-512): 148dfb7912ed0f87d7b69d15bc3901d48a86f2f47c1adc275de589611b1ab24edc3c3631bc1cc01c1abcf671664f9da400b141b6aaac10e74d0ab4cf416d7c74
Type: fulltext
Mimetype: application/pdf

Authority records

Zhang, Cheng; Kjellström, Hedvig

