Optimizing Consensus Protocols with Machine Learning Models: A cache-based approach
KTH, School of Electrical Engineering and Computer Science (EECS).
2023 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Distributed systems offer a reliable and scalable solution for tackling massive and complex tasks that cannot be handled by a single computer. However, standard consensus protocols used in such systems often replicate data without considering the workload, leading to unnecessary retransmissions. This thesis proposes using machine learning (ML) to optimize consensus protocols and make them adaptable to recurring workloads. It introduces a cache that encodes frequently transmitted data between nodes to reduce network traffic. To implement this, the thesis builds a caching layer at all nodes using the decided logs, which represent a consistent view of the application history. The cache can encode and decode incoming log entries to reduce the average message size and improve throughput under limited network bandwidth. The thesis selects an ML-based model that combines various caching policies and adapts to changing access patterns in the workload. Experimental results show that this approach can improve throughput by up to 250%, assuming negligible preprocessing overhead.
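The core idea described above can be illustrated with a minimal sketch. Here, the thesis's ML-based policy (which combines multiple caching policies) is replaced by a simple frequency threshold for illustration, and all names (`EncodingCache`, `encode`, `decode`, `threshold`) are hypothetical, not taken from the thesis. The key property the sketch preserves is that sender and receiver apply the same deterministic promotion rule to the stream of decided log entries, so their caches stay consistent without extra coordination:

```python
from collections import Counter


class EncodingCache:
    """Sketch of a caching layer that replaces frequently seen log
    payloads with short integer codes, shrinking replicated messages.
    A frequency threshold stands in for the thesis's ML-based policy."""

    def __init__(self, threshold=2):
        self.threshold = threshold  # sightings required before a payload is cached
        self.counts = Counter()     # payload -> times seen in raw form
        self.encode_map = {}        # payload -> short code
        self.decode_map = {}        # short code -> payload
        self.next_code = 0

    def _maybe_promote(self, payload):
        # Deterministic rule applied identically on both ends:
        # after `threshold` raw sightings, assign the next free code.
        self.counts[payload] += 1
        if self.counts[payload] >= self.threshold and payload not in self.encode_map:
            code = self.next_code
            self.next_code += 1
            self.encode_map[payload] = code
            self.decode_map[code] = payload

    def encode(self, payload):
        """Return ("ref", code) if the payload is cached, else ("raw", payload)."""
        if payload in self.encode_map:
            return ("ref", self.encode_map[payload])
        self._maybe_promote(payload)
        return ("raw", payload)

    def decode(self, message):
        """Recover the original payload, mirroring the sender's bookkeeping."""
        kind, value = message
        if kind == "ref":
            return self.decode_map[value]
        self._maybe_promote(value)
        return value
```

In this sketch, the first two transmissions of a payload go out in full; from the third on, only a small code crosses the network, which is where the average-message-size reduction comes from. An adaptive policy, as selected in the thesis, would instead decide which entries to cache based on the observed access pattern.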

Abstract [sv]

Distributed systems offer a reliable and scalable solution for handling massive and complex tasks that cannot be managed by a single computer. However, conventional consensus protocols used in these systems often replicate data without regard to the workload, leading to redundant data transmission. This thesis proposes using machine learning (ML) to optimize consensus protocols and adapt them to recurring patterns in the workload. It introduces a cache that encodes and compresses frequently transmitted data between nodes to reduce network traffic. To implement this, a cache based on the decided log, which represents a consistent view of the application history, is built at all nodes. The cache can encode incoming data to reduce the average message size and improve throughput under limited network bandwidth. An ML-based model that combines various caching policies and adapts to changing access patterns in the workload is used. Experimental results show that this method can improve throughput by 250%, assuming that the preprocessing overhead is negligible.

Place, publisher, year, edition, pages
2023, p. 37
Series
TRITA-EECS-EX ; 2023:74
Keywords [en]
Distributed Systems, Consensus Algorithms, Machine Learning, Caching
Keywords [sv]
Distributed systems, Consensus algorithms, Machine learning, Caching
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-326119
OAI: oai:DiVA.org:kth-326119
DiVA, id: diva2:1752828
Available from: 2023-06-26. Created: 2023-04-24. Last updated: 2023-06-26. Bibliographically approved.

Open Access in DiVA

fulltext (1879 kB), 225 downloads
File information
File name: FULLTEXT01.pdf
File size: 1879 kB
Checksum (SHA-512): 053588e247d25367cdf5c8dc62eff2893aec9ecf73e838f855948b51a813355429eceece88686c5ce0042a5d477edca9d5a675941b688aa07f28717bf4746229
Type: fulltext. Mimetype: application/pdf

By organisation
School of Electrical Engineering and Computer Science (EECS)
Computer and Information Sciences

Total: 225 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 301 hits