Publications (2 of 2)
Sounart, H., Lázár, E., Masarapu, Y., Wu, J., Várkonyi, T., Glasz, T., . . . Giacomello, S. (2023). Dual spatially resolved transcriptomics for human host–pathogen colocalization studies in FFPE tissue sections. Genome Biology, 24(1), Article ID 237.
2023 (English) In: Genome Biology, ISSN 1465-6906, E-ISSN 1474-760X, Vol. 24, no. 1, article id 237. Journal article (Refereed) Published
Abstract [en]

Technologies to study localized host–pathogen interactions are urgently needed. Here, we present a spatial transcriptomics approach to simultaneously capture host and pathogen transcriptome-wide spatial gene expression information from human formalin-fixed paraffin-embedded (FFPE) tissue sections at a near single-cell resolution. We demonstrate this methodology in lung samples from COVID-19 patients and validate our spatial detection of SARS-CoV-2 against RNAScope and in situ sequencing. Host–pathogen colocalization analysis identified putative modulators of SARS-CoV-2 infection in human lung cells. Our approach provides new insights into host response to pathogen infection through the simultaneous, unbiased detection of two transcriptomes in FFPE samples.
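The colocalization analysis mentioned above is not spelled out in the abstract, but its core idea, relating pathogen transcript abundance to host gene expression across spatial spots, can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the function name, the per-spot count matrices, and the choice of Pearson correlation are all assumptions.

```python
import numpy as np

def colocalization_scores(host_counts, pathogen_counts):
    """Pearson correlation between each host gene's spatial expression
    profile and pathogen transcript counts across tissue spots.

    host_counts     : (n_spots, n_genes) array of host gene counts
    pathogen_counts : (n_spots,) array of pathogen (e.g. SARS-CoV-2) counts
    Returns an (n_genes,) array of correlation scores in [-1, 1].
    """
    h = host_counts - host_counts.mean(axis=0)       # center each gene
    p = pathogen_counts - pathogen_counts.mean()      # center pathogen signal
    num = h.T @ p
    denom = np.sqrt((h ** 2).sum(axis=0) * (p ** 2).sum())
    # Guard against constant genes (zero variance -> zero score).
    return num / np.where(denom == 0, 1.0, denom)

# Toy example: 5 spots, 3 host genes; gene 0 tracks the pathogen signal,
# gene 1 anti-correlates, gene 2 is constant.
host = np.array([[1, 5, 2],
                 [2, 4, 2],
                 [3, 3, 2],
                 [4, 2, 2],
                 [5, 1, 2]], dtype=float)
pathogen = np.array([1, 2, 3, 4, 5], dtype=float)
scores = colocalization_scores(host, pathogen)  # ~[1.0, -1.0, 0.0]
```

Genes with high positive scores would be candidate colocalization hits; in practice one would add normalisation and multiple-testing correction.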

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Colocalization analysis, Formalin-fixed paraffin-embedded (FFPE) tissues, Host–pathogen interactions, Spatial transcriptomics
National subject category
Cancer and Oncology; Cell and Molecular Biology
Identifiers
urn:nbn:se:kth:diva-339050 (URN) · 10.1186/s13059-023-03080-y (DOI) · 001097440100002 () · 37858234 (PubMedID) · 2-s2.0-85174494064 (Scopus ID)
Note

QC 20231128

Available from: 2023-11-28 Created: 2023-11-28 Last updated: 2023-12-05. Bibliographically reviewed
Sun, L., Wen, J., Wang, J., Zhao, Y., Zhang, B., Wu, J. & Xu, Y. (2023). Two-view attention-guided convolutional neural network for mammographic image classification. CAAI Transactions on Intelligence Technology, 8(2), 453-467
2023 (English) In: CAAI Transactions on Intelligence Technology, ISSN 2468-6557, Vol. 8, no. 2, pp. 453-467. Journal article (Refereed) Published
Abstract [en]

Deep learning has been widely used in the field of mammographic image classification owing to its superiority in automatic feature extraction. However, general deep learning models cannot achieve very satisfactory classification results on mammographic images because these models are not specifically designed for mammographic images and do not take the specific traits of these images into account. To exploit the essential discriminant information of mammographic images, we propose a novel classification method based on a convolutional neural network. Specifically, the proposed method designs two branches to extract the discriminative features from the mediolateral oblique (MLO) and craniocaudal (CC) mammographic views. The features extracted from the two-view mammographic images contain complementary information that enables breast cancer to be more easily distinguished. Moreover, an attention block is introduced to capture the channel-wise information by adjusting the weight of each feature map, which is beneficial to emphasising the important features of mammographic images. Furthermore, we add a penalty term based on the fuzzy cluster algorithm to the cross-entropy function, which improves the generalisation ability of the classification model by maximising the interclass distance and minimising the intraclass distance of the samples. The experimental results on the Digital Database for Screening Mammography (DDSM), INbreast, and MIAS mammography databases illustrate that the proposed method achieves the best classification performance and is more robust than the compared state-of-the-art classification methods.
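The loss described above, cross-entropy plus a cluster-based penalty that shrinks intraclass distances and grows interclass distances, can be sketched as follows. This is a minimal NumPy illustration under assumed definitions: the function names, the centroid-based penalty, and the weighting factor `lam` are assumptions, and the paper's actual fuzzy-cluster formulation and attention blocks are not reproduced.

```python
import numpy as np

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true class.
    probs, labels = np.asarray(probs), np.asarray(labels)
    n = len(labels)
    return -np.mean(np.log(probs[np.arange(n), labels] + 1e-12))

def cluster_penalty(features, labels):
    """Penalty encouraging compact, well-separated classes:
    mean intraclass distance minus mean distance between class centroids."""
    features, labels = np.asarray(features), np.asarray(labels)
    classes = np.unique(labels)
    centroids = np.stack([features[labels == c].mean(axis=0) for c in classes])
    # Average distance of each sample to its own class centroid.
    intra = np.mean([
        np.linalg.norm(features[labels == c] - centroids[i], axis=1).mean()
        for i, c in enumerate(classes)
    ])
    # Average pairwise distance between class centroids.
    diffs = centroids[:, None, :] - centroids[None, :, :]
    inter = np.linalg.norm(diffs, axis=-1)
    k = len(classes)
    inter_mean = inter.sum() / (k * (k - 1)) if k > 1 else 0.0
    return intra - inter_mean

def total_loss(probs, features, labels, lam=0.1):
    # Cross-entropy plus the weighted clustering penalty.
    return cross_entropy(probs, labels) + lam * cluster_penalty(features, labels)
```

Minimising this objective pushes the classifier toward feature embeddings where samples cluster tightly around their class centroid while centroids move apart, which is the stated effect of the paper's penalty term.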

Place, publisher, year, edition, pages
Institution of Engineering and Technology (IET), 2023
Keywords
convolutional neural network, deep learning, mammographic image, medical image processing, Classification (of information), Computer aided diagnosis, Convolution, Convolutional neural networks, Image classification, X ray screens, Automatic feature extraction, Classification methods, Classification results, Images classification, Learning models, Mammographic images, Medical images processing, Two views, Mammography
National subject category
Medical Imaging; Computer Graphics and Computer Vision
Identifiers
urn:nbn:se:kth:diva-323278 (URN) · 10.1049/cit2.12096 (DOI) · 000784624100001 () · 2-s2.0-85128516179 (Scopus ID)
Note

QC 20250512

Available from: 2023-01-24 Created: 2023-01-24 Last updated: 2025-05-12. Bibliographically reviewed
Organisations
Identifiers
ORCID iD: orcid.org/0000-0003-0421-0112
