Publications (2 of 2)
Sounart, H., Lázár, E., Masarapu, Y., Wu, J., Várkonyi, T., Glasz, T., . . . Giacomello, S. (2023). Dual spatially resolved transcriptomics for human host–pathogen colocalization studies in FFPE tissue sections. Genome Biology, 24(1), Article ID 237.
2023 (English). In: Genome Biology, ISSN 1465-6906, E-ISSN 1474-760X, Vol. 24, no. 1, article id 237. Journal article (Refereed). Published.
Abstract [en]

Technologies to study localized host–pathogen interactions are urgently needed. Here, we present a spatial transcriptomics approach to simultaneously capture host and pathogen transcriptome-wide spatial gene expression information from human formalin-fixed paraffin-embedded (FFPE) tissue sections at a near single-cell resolution. We demonstrate this methodology in lung samples from COVID-19 patients and validate our spatial detection of SARS-CoV-2 against RNAScope and in situ sequencing. Host–pathogen colocalization analysis identified putative modulators of SARS-CoV-2 infection in human lung cells. Our approach provides new insights into host response to pathogen infection through the simultaneous, unbiased detection of two transcriptomes in FFPE samples.
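The colocalization analysis mentioned above can be illustrated as a simple spot-level association between pathogen load and host gene expression. The sketch below is an assumption for illustration only, not the paper's actual pipeline: the function name, the log1p normalisation, and the toy data are all hypothetical.

```python
import numpy as np

def colocalization_scores(host, pathogen):
    """Pearson correlation of each host gene with pathogen load across spots.

    host:     (n_spots, n_host_genes) spatial count matrix
    pathogen: (n_spots,) pathogen (e.g. viral) counts per spot
    """
    host = np.log1p(host.astype(float))
    path = np.log1p(pathogen.astype(float))
    hc = host - host.mean(axis=0)          # centre each gene
    pc = path - path.mean()                # centre pathogen load
    num = hc.T @ pc                        # one covariance term per gene
    denom = np.sqrt((hc ** 2).sum(axis=0) * (pc ** 2).sum())
    with np.errstate(invalid="ignore", divide="ignore"):
        r = np.where(denom > 0, num / denom, 0.0)
    return r                               # one correlation per host gene

# Toy data: gene 0 is constructed to track pathogen load, the rest are noise.
rng = np.random.default_rng(0)
pathogen = rng.poisson(2.0, size=200)
host = rng.poisson(1.0, size=(200, 5)).astype(float)
host[:, 0] += pathogen
scores = colocalization_scores(host, pathogen)
print(scores.round(2))
```

Genes with high scores would be candidate colocalization hits; a real analysis would add multiple-testing correction and spatial-autocorrelation-aware statistics.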

Place, publisher, year, edition, pages
Springer Nature, 2023
Keywords
Colocalization analysis, Formalin-fixed paraffin-embedded (FFPE) tissues, Host–pathogen interactions, Spatial transcriptomics
HSV category
Identifiers
urn:nbn:se:kth:diva-339050 (URN), 10.1186/s13059-023-03080-y (DOI), 001097440100002 (), 37858234 (PubMedID), 2-s2.0-85174494064 (Scopus ID)
Note

QC 20231128

Available from: 2023-11-28. Created: 2023-11-28. Last updated: 2023-12-05. Bibliographically approved.
Sun, L., Wen, J., Wang, J., Zhao, Y., Zhang, B., Wu, J. & Xu, Y. (2023). Two-view attention-guided convolutional neural network for mammographic image classification. CAAI Transactions on Intelligence Technology, 8(2), 453-467
2023 (English). In: CAAI Transactions on Intelligence Technology, ISSN 2468-6557, Vol. 8, no. 2, pp. 453-467. Journal article (Refereed). Published.
Abstract [en]

Deep learning has been widely used in the field of mammographic image classification owing to its superiority in automatic feature extraction. However, general deep learning models cannot achieve very satisfactory classification results on mammographic images because these models are not specifically designed for mammographic images and do not take the specific traits of these images into account. To exploit the essential discriminant information of mammographic images, we propose a novel classification method based on a convolutional neural network. Specifically, the proposed method designs two branches to extract the discriminative features from the mediolateral oblique (MLO) and craniocaudal (CC) mammographic views. The features extracted from the two-view mammographic images contain complementary information that enables breast cancer to be more easily distinguished. Moreover, an attention block is introduced to capture channel-wise information by adjusting the weight of each feature map, which helps emphasise the important features of mammographic images. Furthermore, we add a penalty term based on the fuzzy cluster algorithm to the cross-entropy function, which improves the generalisation ability of the classification model by maximising the interclass distance and minimising the intraclass distance of the samples. The experimental results on the Digital Database for Screening Mammography (DDSM), INbreast, and MIAS mammography databases illustrate that the proposed method achieves the best classification performance and is more robust than the compared state-of-the-art classification methods.
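The penalty term described in the abstract, which augments cross-entropy to maximise interclass distance while minimising intraclass distance, can be sketched minimally in NumPy. This is an illustrative reconstruction under stated assumptions, not the paper's exact fuzzy-cluster formulation; the function names and the weight `lam` are hypothetical.

```python
import numpy as np

def cross_entropy(probs, labels):
    """Mean negative log-likelihood of the true class."""
    n = len(labels)
    return -np.log(probs[np.arange(n), labels] + 1e-12).mean()

def cluster_penalty(features, labels):
    """Mean intraclass distance minus mean distance between class centres.

    Smaller values mean compact classes that are far apart, which is the
    behaviour the penalty is meant to encourage.
    """
    classes = np.unique(labels)
    centers = np.stack([features[labels == c].mean(axis=0) for c in classes])
    intra = np.mean([
        np.linalg.norm(features[labels == c] - centers[i], axis=1).mean()
        for i, c in enumerate(classes)
    ])
    diffs = centers[:, None, :] - centers[None, :, :]
    inter = np.linalg.norm(diffs, axis=-1)
    inter_mean = inter[np.triu_indices(len(classes), k=1)].mean()
    return intra - inter_mean

def total_loss(probs, features, labels, lam=0.1):
    """Cross-entropy plus the weighted cluster penalty (lam is illustrative)."""
    return cross_entropy(probs, labels) + lam * cluster_penalty(features, labels)

# Toy usage: two tight, well-separated classes in a 2-D feature space.
labels = np.array([0, 0, 1, 1])
feats = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 0.0], [10.1, 0.0]])
probs = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
loss = total_loss(probs, feats, labels)
```

In training, the penalty would be computed on the network's embedding layer per mini-batch, so gradients push same-class features together and different-class features apart.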

Place, publisher, year, edition, pages
Institution of Engineering and Technology (IET), 2023
Keywords
convolutional neural network, deep learning, mammographic image, medical image processing, Classification (of information), Computer aided diagnosis, Convolution, Convolutional neural networks, Image classification, X ray screens, Automatic feature extraction, Classification methods, Classification results, Images classification, Learning models, Mammographic images, Medical images processing, Two views, Mammography
HSV category
Identifiers
urn:nbn:se:kth:diva-323278 (URN), 10.1049/cit2.12096 (DOI), 000784624100001 (), 2-s2.0-85128516179 (Scopus ID)
Note

QC 20250512

Available from: 2023-01-24. Created: 2023-01-24. Last updated: 2025-05-12. Bibliographically approved.
Organisations
Identifiers
ORCID iD: orcid.org/0000-0003-0421-0112