KTH Publications (DiVA)
Publications (2 of 2)
Sounart, H., Lázár, E., Masarapu, Y., Wu, J., Várkonyi, T., Glasz, T., . . . Giacomello, S. (2023). Dual spatially resolved transcriptomics for human host–pathogen colocalization studies in FFPE tissue sections. Genome Biology, 24(1), Article ID 237.
Dual spatially resolved transcriptomics for human host–pathogen colocalization studies in FFPE tissue sections
2023 (English). In: Genome Biology, ISSN 1465-6906, E-ISSN 1474-760X, Vol. 24, no. 1, article id 237. Article in journal (Refereed). Published.
Abstract [en]

Technologies to study localized host–pathogen interactions are urgently needed. Here, we present a spatial transcriptomics approach to simultaneously capture host and pathogen transcriptome-wide spatial gene expression information from human formalin-fixed paraffin-embedded (FFPE) tissue sections at a near single-cell resolution. We demonstrate this methodology in lung samples from COVID-19 patients and validate our spatial detection of SARS-CoV-2 against RNAScope and in situ sequencing. Host–pathogen colocalization analysis identified putative modulators of SARS-CoV-2 infection in human lung cells. Our approach provides new insights into host response to pathogen infection through the simultaneous, unbiased detection of two transcriptomes in FFPE samples.
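The host–pathogen colocalization analysis described in the abstract can be illustrated, at its simplest, as correlating each host gene's spatial expression profile with the pathogen's transcript counts across tissue spots. The following is a minimal numpy sketch under that assumption, with hypothetical count matrices — not the paper's actual pipeline:

```python
import numpy as np

def colocalization_scores(host_counts, pathogen_counts):
    """Pearson correlation of each host gene's spatial profile with the
    pathogen transcript profile across tissue spots.

    host_counts: (n_spots, n_genes) array of host gene counts per spot
    pathogen_counts: (n_spots,) array of pathogen transcript counts per spot
    Returns an (n_genes,) array of correlation coefficients; genes with
    zero spatial variance get a score of 0.
    """
    h = host_counts - host_counts.mean(axis=0)
    p = pathogen_counts - pathogen_counts.mean()
    num = h.T @ p
    denom = np.sqrt((h ** 2).sum(axis=0) * (p ** 2).sum())
    return num / np.where(denom == 0, 1.0, denom)

# Hypothetical toy data: gene 0 tracks the pathogen, gene 1 is
# anticorrelated, gene 2 is spatially flat.
pathogen = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
host = np.column_stack([pathogen * 2 + 1, 4 - pathogen, np.ones(5)])
scores = colocalization_scores(host, pathogen)
```

Genes with high positive scores would be candidate infection-associated modulators; a real analysis would add normalisation and multiple-testing correction on top of this.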

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Colocalization analysis, Formalin-fixed paraffin-embedded (FFPE) tissues, Host–pathogen interactions, Spatial transcriptomics
National Category
Cancer and Oncology; Cell and Molecular Biology
Identifiers
urn:nbn:se:kth:diva-339050 (URN)
10.1186/s13059-023-03080-y (DOI)
001097440100002 ()
37858234 (PubMedID)
2-s2.0-85174494064 (Scopus ID)
Note

QC 20231128

Available from: 2023-11-28. Created: 2023-11-28. Last updated: 2023-12-05. Bibliographically approved.
Sun, L., Wen, J., Wang, J., Zhao, Y., Zhang, B., Wu, J. & Xu, Y. (2023). Two-view attention-guided convolutional neural network for mammographic image classification. CAAI Transactions on Intelligence Technology, 8(2), 453-467
Two-view attention-guided convolutional neural network for mammographic image classification
2023 (English). In: CAAI Transactions on Intelligence Technology, ISSN 2468-6557, Vol. 8, no. 2, p. 453-467. Article in journal (Refereed). Published.
Abstract [en]

Deep learning has been widely used in the field of mammographic image classification owing to its superiority in automatic feature extraction. However, general deep learning models cannot achieve satisfactory classification results on mammographic images because they are not specifically designed for such images and do not take their specific traits into account. To exploit the essential discriminant information of mammographic images, we propose a novel classification method based on a convolutional neural network. Specifically, the proposed method designs two branches to extract discriminative features from the mediolateral oblique (MLO) and craniocaudal (CC) mammographic views. The features extracted from the two views contain complementary information that enables breast cancer to be more easily distinguished. Moreover, an attention block is introduced to capture channel-wise information by adjusting the weight of each feature map, which helps emphasise the important features of mammographic images. Furthermore, we add a penalty term based on the fuzzy cluster algorithm to the cross-entropy function, which improves the generalisation ability of the classification model by maximising the interclass distance and minimising the intraclass distance of the samples. The experimental results on the Digital Database for Screening Mammography, INbreast, and MIAS mammography databases show that the proposed method achieves the best classification performance and is more robust than the compared state-of-the-art methods.
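The modified objective described in the abstract — cross-entropy plus a cluster-based penalty that minimises intraclass distance and maximises interclass distance — can be sketched roughly as below. This is a simplified stand-in using hard class means rather than the authors' fuzzy-cluster formulation; the function names and toy data are hypothetical:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cluster_penalised_loss(features, logits, labels, lam=0.1):
    """Cross-entropy plus a cluster-style penalty: pull each feature
    toward its class mean (intraclass term) and push class means apart
    (interclass term). Assumes labels are 0..K-1. Note the interclass
    term is unbounded here; a practical version would normalise it.
    """
    n = len(labels)
    probs = softmax(logits)
    ce = -np.log(probs[np.arange(n), labels] + 1e-12).mean()

    classes = np.unique(labels)
    means = np.stack([features[labels == c].mean(axis=0) for c in classes])
    intra = np.mean([np.linalg.norm(features[i] - means[labels[i]]) ** 2
                     for i in range(n)])
    inter, pairs = 0.0, 0
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            inter += np.linalg.norm(means[i] - means[j]) ** 2
            pairs += 1
    inter /= max(pairs, 1)
    return ce + lam * (intra - inter)

# Hypothetical toy batch: two tight, well-separated classes versus the
# same labels with all features collapsed to a single point.
feats = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
labels = np.array([0, 0, 1, 1])
logits = np.array([[4.0, 0.0], [4.0, 0.0], [0.0, 4.0], [0.0, 4.0]])
loss_sep = cluster_penalised_loss(feats, logits, labels)
loss_collapsed = cluster_penalised_loss(np.zeros_like(feats), logits, labels)
```

Well-separated class clusters yield a lower loss than collapsed features, which is the behaviour the penalty is meant to reward; in the paper this term is computed with fuzzy memberships rather than hard assignments.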

Place, publisher, year, edition, pages
Institution of Engineering and Technology (IET), 2023
Keywords
convolutional neural network, deep learning, mammographic image, medical image processing, Classification (of information), Computer aided diagnosis, Convolution, Convolutional neural networks, Image classification, X ray screens, Automatic feature extraction, Classification methods, Classification results, Images classification, Learning models, Mammographic images, Medical images processing, Two views, Mammography
National Category
Medical Imaging; Computer Graphics and Computer Vision
Identifiers
urn:nbn:se:kth:diva-323278 (URN)
10.1049/cit2.12096 (DOI)
000784624100001 ()
2-s2.0-85128516179 (Scopus ID)
Note

QC 20250512

Available from: 2023-01-24. Created: 2023-01-24. Last updated: 2025-05-12. Bibliographically approved.
Identifiers
ORCID iD: orcid.org/0000-0003-0421-0112
