Synaptic Plasticity Modulation and Sequence Storage Capacity in Recurrent BCPNN Networks
KTH, School of Electrical Engineering and Computer Science (EECS).
2024 (English). Independent thesis, basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

The simulation of biological memory can be implemented using a computational neural network called the Bayesian Confidence Propagating Neural Network (BCPNN). This network architecture consists of units called minicolumns, organized into hypercolumns, and operates with a competitive winner-takes-all mechanism. The project studies the network's capability to store sequential memories as the size of the network varies.

By varying the number of minicolumns and hypercolumns, as well as the time constant known as the learning rate, we gain better insight into how the network's learning and recall success varies. The findings reveal that increasing the number of minicolumns enhances learning success more effectively than varying the number of hypercolumns. The impact of the learning rate on network performance is less conclusive, indicating the need for further study to optimize this parameter.

Further research is needed to explore larger network architectures and longer sequences with variations in patterns to enhance the applicability of these findings.
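The winner-takes-all mechanism described above can be sketched as follows. This is a minimal illustration, not code from the thesis: the function name, the flat support-vector layout, and the hard (rather than soft) competition are all assumptions made for clarity. Within each hypercolumn, only the minicolumn with the highest support becomes active.

```python
import numpy as np

def winner_takes_all(support, n_hypercolumns, n_minicolumns):
    """Hard winner-takes-all (illustrative sketch): within each
    hypercolumn, only the minicolumn with the highest support
    activates. `support` is a flat vector of length
    n_hypercolumns * n_minicolumns."""
    s = support.reshape(n_hypercolumns, n_minicolumns)
    out = np.zeros_like(s)
    # Set the winning minicolumn in each hypercolumn (row) to 1.
    out[np.arange(n_hypercolumns), s.argmax(axis=1)] = 1.0
    return out.ravel()

# Example: 2 hypercolumns with 3 minicolumns each.
support = np.array([0.1, 0.7, 0.2, 0.5, 0.3, 0.9])
print(winner_takes_all(support, 2, 3))  # [0. 1. 0. 0. 0. 1.]
```

In the actual BCPNN, the competition is often implemented as a softmax over the support values rather than a hard argmax; the hard variant above is simply the limiting case.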

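The "time constant known as the learning rate" mentioned in the abstract governs how quickly the network's internal traces track new input. A minimal sketch of such a trace update, assuming a simple first-order low-pass filter (the thesis's actual BCPNN trace equations are not reproduced here):

```python
def update_trace(z, x, tau, dt=1.0):
    """One Euler step of the first-order low-pass filter
    dz/dt = (x - z) / tau. A larger tau means slower learning,
    i.e. a longer memory of past input."""
    return z + (dt / tau) * (x - z)

# A constant input x = 1.0 drives the trace from 0 toward 1;
# after n steps, z = 1 - (1 - dt/tau)**n.
z = 0.0
for _ in range(100):
    z = update_trace(z, 1.0, tau=20.0)
```

Varying `tau` in a sweep like this is one way to probe how the learning-rate time constant affects how fast stored sequence elements are overwritten by new ones.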

Place, publisher, year, edition, pages
2024, p. 707-713
Series
TRITA-EECS-EX ; 2024:199
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-359434
OAI: oai:DiVA.org:kth-359434
DiVA, id: diva2:1933549
Projects
Kandidatexamensarbete Elektroteknik EECS 2024
Available from: 2025-01-31 Created: 2025-01-31

Open Access in DiVA

fulltext (121150 kB)
File information
File name: FULLTEXT01.pdf
File size: 121150 kB
Checksum (SHA-512): 82342f1408fe2aae929f55e76f2a176a8521cf94c0fe100464225724a9b74ddff6f61a0fb1cf957b5d6400be30877b9d0aec28080253ae7c1e09793536e10217
Type: fulltext
Mimetype: application/pdf


