kth.se | Publications | KTH
Storage Capacity of Associative Neural Networks: Simulations Across Four Learning Rules
KTH, School of Electrical Engineering and Computer Science (EECS).
2025 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

This project investigates the storage capacity and recall performance of associative memory networks using different biologically inspired learning rules. The aim is to examine how different local learning rules influence the storage capacity and recall performance of attractor neural networks under varying conditions. These conditions include pattern sparsity, pattern correlation, and input noise. A discrete Hopfield network with 100 fully connected neurons was implemented in Python, and pattern recall was tested using both sparse (10%) and dense (50%) binary and bipolar patterns, with and without correlation and noise. Results show that the Storkey rule is most effective for dense, correlated patterns, while Willshaw and BCPNN perform best with sparse, uncorrelated inputs. The Hebbian learning rule demonstrated limited storage capacity in all scenarios. The findings highlight the trade-offs between learning rules and offer practical guidance for selecting memory models based on the statistical properties of the data.
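As an illustration of the kind of setup the abstract describes, a discrete Hopfield network with the classical Hebbian (outer-product) rule can be sketched in a few lines of Python. This is an independent sketch, not the thesis's actual code: the pattern count, noise level, and synchronous update scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100   # fully connected neurons, as in the thesis setup
P = 5     # number of stored patterns (illustrative, well below capacity)

# Random dense bipolar patterns (+1/-1, ~50% active)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian outer-product rule: W = (1/N) * sum_mu x_mu x_mu^T
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)  # no self-connections

def recall(probe, steps=20):
    """Synchronous recall: iterate s <- sign(W s) until a fixed point."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Flip 10% of the bits of a stored pattern, then try to recover it
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recovered = recall(probe)
overlap = (recovered @ patterns[0]) / N  # 1.0 would mean perfect recall
```

With only 5 patterns in 100 neurons the network is loaded far below the classical Hebbian capacity limit (about 0.14 N), so the corrupted pattern is typically recovered; the thesis's comparisons concern what happens as this load grows and as sparsity, correlation, and noise vary across the four rules.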

Abstract [sv]

This project investigates the storage capacity and recall ability of associative memory networks with different biologically inspired learning rules. The aim is to examine how different local learning rules affect the storage capacity and recall ability of attractor networks under varying conditions. These conditions include pattern sparsity, pattern correlation, and input noise. A discrete Hopfield network with 100 fully connected neurons was implemented in Python, and recall tests were carried out with both sparse (10%) and dense (50%) binary and bipolar patterns, with and without correlation and noise. The results show that the Storkey rule is most effective for dense, correlated patterns, while Willshaw and BCPNN work best for sparse, uncorrelated patterns. The Hebbian learning rule showed limited storage capacity in all scenarios. The results highlight the trade-offs between the learning rules and offer practical guidance for selecting memory models based on the statistical properties of the data.

Place, publisher, year, edition, pages
2025, p. 577-582
Series
TRITA-EECS-EX ; 2025:158
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-376180
OAI: oai:DiVA.org:kth-376180
DiVA, id: diva2:2034633
Projects
Kandidatexamensarbete i Elektroteknik 2025, EECS, KTH (Bachelor's degree project in Electrical Engineering 2025)
Available from: 2026-02-02 Created: 2026-02-02

Open Access in DiVA

fulltext (80627 kB), 21 downloads
File information
File name: FULLTEXT01.pdf; File size: 80627 kB; Checksum: SHA-512
35ce0a386dafe4649eb99cbe0efdfed651a3c9044e3339612422234d17a7e8ec21d4fd4aa201500c3c7a8f57194994b78b3e0cfbd5319ecd49f18a5d8a7ff775
Type: fulltext; Mimetype: application/pdf

By organisation
School of Electrical Engineering and Computer Science (EECS)
Electrical Engineering, Electronic Engineering, Information Engineering

Search outside of DiVA

Google / Google Scholar
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 3686 hits