Implementing a Network Optimized Federated Learning Method From the Ground up
KTH, School of Electrical Engineering and Computer Science (EECS).
2023 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

This bachelor thesis presents the implementation of a simple fully connected neural network (FCNN) and a federated neural network with stochastic quantization from scratch, and compares their performance. Federated learning enables multiple parties to contribute to a machine learning model without sharing their sensitive data. The federated learning approach is becoming increasingly popular due to its ability to train models on decentralized data sources while maintaining privacy and security. Both the FCNN and the federated network are trained and tested on the Modified National Institute of Standards and Technology (MNIST) database, the former achieving around 90% accuracy after 50 epochs while the federated architecture is only able to reach around 45% accuracy. This remains the same when the data is quantized.

Abstract [sv]

This bachelor thesis presents the implementation of a simple fully connected neural network (FCNN) and a federated neural network with stochastic quantization from scratch, and compares their performance. Federated learning allows multiple parties to contribute to a machine learning model without sharing their sensitive data. Federated learning has become increasingly popular due to its ability to train models on decentralized data sources while preserving privacy and security. Both the FCNN and the federated network are trained and tested on the Modified National Institute of Standards and Technology (MNIST) database, where the former achieves 90% accuracy after 50 epochs while the federated network only reaches around 45% accuracy. This remains the same when the data is quantized.
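The scheme the abstracts describe — clients hold local weights, stochastically quantize them, and a server averages the quantized updates — can be sketched as below. This is a minimal illustration, not the thesis's actual implementation: the quantizer design (an unbiased uniform grid between each tensor's min and max), the number of levels, and all function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_quantize(w, levels=16):
    """Stochastically round each weight onto a uniform grid of `levels`
    points between the tensor's min and max (assumed quantizer design)."""
    lo, hi = w.min(), w.max()
    if hi == lo:
        return w.copy()
    step = (hi - lo) / (levels - 1)
    scaled = (w - lo) / step           # position in units of one grid step
    floor = np.floor(scaled)
    # round up with probability equal to the fractional part,
    # so the quantizer is unbiased in expectation
    up = rng.random(w.shape) < (scaled - floor)
    return lo + (floor + up) * step

def fed_avg(client_updates):
    """Server side of federated averaging: mean of the client weights."""
    return np.mean(client_updates, axis=0)

# three hypothetical clients, each holding a local weight vector
clients = [rng.normal(size=8) for _ in range(3)]
quantized = [stochastic_quantize(w, levels=16) for w in clients]
global_w = fed_avg(quantized)          # aggregated global model weights
```

Stochastic (rather than deterministic) rounding matters here because averaging many unbiased quantized updates recovers the true mean in expectation, which is what makes low-bandwidth federated aggregation viable.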

Place, publisher, year, edition, pages
2023, p. 477-485
Series
TRITA-EECS-EX ; 2023:177
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-341726
OAI: oai:DiVA.org:kth-341726
DiVA id: diva2:1823402
Projects
Kandidatexjobb i elektroteknik 2023, KTH, Stockholm
Available from: 2024-01-02 Created: 2024-01-02

Open Access in DiVA

fulltext (211487 kB), 437 downloads
File information
File name: FULLTEXT01.pdf
File size: 211487 kB
Checksum (SHA-512): 69786101c351a58f7bd524c3aeee40c661028b577366c4a725033372b88c624c87c2183b6acca2d3d43bbd2bb2f3942326c69263e70c99cf1db027ce9c4e9ae2
Type: fulltext
Mimetype: application/pdf
