1 - 3 of 3
  • 1.
    Hamid Muhammed, Hamed
    Centre for Image Analysis, Uppsala University.
    Unsupervised Fuzzy Clustering Using Weighted Incremental Neural Networks. 2004. In: International Journal of Neural Systems, ISSN 0129-0657, E-ISSN 1793-6462, Vol. 14, no. 6, p. 355-371. Article in journal (Refereed)
    Abstract [en]

    A new, more efficient variant of a recently developed algorithm for unsupervised fuzzy clustering is introduced. A Weighted Incremental Neural Network (WINN) is introduced and used for this purpose. The new approach is called FC-WINN (Fuzzy Clustering using WINN). The WINN algorithm produces a net of nodes connected by edges, which reflects and preserves the topology of the input data set. Additional weights, proportional to the local densities in input space, are associated with the resulting nodes and edges to store useful information about the topological relations in the given input data set. A fuzziness factor, proportional to the connectedness of the net, is introduced in the system. A watershed-like procedure is used to cluster the resulting net, and the number of resulting clusters is determined by this procedure. Only two parameters must be chosen by the user for the FC-WINN algorithm, determining the resolution and the connectedness of the net. The only other parameters that must be specified are those required by the underlying incremental neural network, a modified version of the Growing Neural Gas (GNG) algorithm. The FC-WINN algorithm is computationally efficient compared to other approaches for clustering large high-dimensional data sets.
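
    The abstract's core mechanism, an incremental net that adapts nodes to samples while accumulating density weights, can be sketched as a single GNG-style adaptation step. This is a minimal illustration, not the authors' exact FC-WINN algorithm; the function name and the learning rates `eps_w` and `eps_n` are illustrative assumptions.

    ```python
    import math

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def winn_step(nodes, edges, weights, x, eps_w=0.2, eps_n=0.02):
        """Adapt the net to one sample x (illustrative GNG-style step).

        nodes: list of node positions; edges: set of frozenset index pairs;
        weights: per-node density counters (the WINN density weights).
        """
        # Rank nodes by distance to the sample; s1 is the winner, s2 the runner-up.
        order = sorted(range(len(nodes)), key=lambda i: dist(nodes[i], x))
        s1, s2 = order[0], order[1]
        # Connect winner and runner-up: edges preserve input-space topology.
        edges.add(frozenset((s1, s2)))
        # Move the winner strongly, and its topological neighbours weakly,
        # toward the sample.
        nodes[s1] = [p + eps_w * (xi - p) for p, xi in zip(nodes[s1], x)]
        for e in edges:
            if s1 in e:
                n = next(i for i in e if i != s1)
                nodes[n] = [p + eps_n * (xi - p) for p, xi in zip(nodes[n], x)]
        # Density weight: nodes winning in dense regions accumulate larger counts,
        # which the watershed-like procedure could later use to cut the net.
        weights[s1] += 1
        return s1

    nodes = [[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]]
    edges, weights = set(), [0, 0, 0]
    winner = winn_step(nodes, edges, weights, [0.5, 0.4])
    ```

    Repeating this step over a data stream grows a topology-preserving net whose node and edge weights approximate local density, which is the information the clustering stage operates on.
    
    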

  • 2.
    Johansson, Christopher
    et al.
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Ekeberg, Örjan
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Lansner, Anders
    KTH, School of Computer Science and Communication (CSC), Computational Biology, CB.
    Clustering of stored memories in an attractor network with local competition. 2006. In: International Journal of Neural Systems, ISSN 0129-0657, E-ISSN 1793-6462, Vol. 16, no. 6, p. 393-403. Article in journal (Refereed)
    Abstract [en]

    In this paper we study an attractor network with units that compete locally for activation, and we prove that a reduced version of it has fixpoint dynamics. We analyze the local characteristics of the network's attractors with respect to a parameter controlling the intensity of the local competition, complementing the analysis with simulation experiments. We find that the attractors are hierarchically clustered as this parameter is varied.
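
    The fixpoint dynamics described above can be illustrated with a toy iteration: units are grouped into local pools, each pool's activities are renormalized by a softmax whose gain plays the role of the competition-intensity parameter, and the update is repeated until the activity vector stops changing. The weight matrix, grouping, and gain `G` here are illustrative assumptions, not the paper's model.

    ```python
    import math

    def softmax(h, G):
        # Local competition: gain G sets how sharply the strongest unit wins.
        e = [math.exp(G * x) for x in h]
        s = sum(e)
        return [x / s for x in e]

    def iterate(W, groups, a, G=5.0, steps=50):
        """Synchronous update until a fixpoint (illustrative sketch)."""
        for _ in range(steps):
            # Recurrent input to every unit from the current activity pattern.
            h = [sum(W[i][j] * a[j] for j in range(len(a))) for i in range(len(a))]
            # Activities within each competing pool are renormalized to sum to one.
            new = list(a)
            for g in groups:
                sm = softmax([h[i] for i in g], G)
                for k, i in enumerate(g):
                    new[i] = sm[k]
            if max(abs(x - y) for x, y in zip(new, a)) < 1e-9:
                return new  # fixpoint reached
            a = new
        return a

    # Two pools of two units; the weights store one pattern pairing units 0 and 2.
    W = [[0, 0, 1, 0],
         [0, 0, 0, 1],
         [1, 0, 0, 0],
         [0, 1, 0, 0]]
    groups = [[0, 1], [2, 3]]
    a = iterate(W, groups, [0.6, 0.4, 0.6, 0.4])
    ```

    Starting from a slightly biased state, the iteration settles on the stored pattern (units 0 and 2 active); raising or lowering `G` sharpens or softens the attractor, which is the kind of dependence the paper analyzes.
    
    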

  • 3. Orre, R.
    et al.
    Bate, A.
    Noren, G. N.
    Swahn, E.
    Arnborg, Stefan
    KTH, School of Computer Science and Communication (CSC), Numerical Analysis and Computer Science, NADA.
    Edwards, I. R.
    A Bayesian recurrent neural network for unsupervised pattern recognition in large incomplete data sets. 2005. In: International Journal of Neural Systems, ISSN 0129-0657, E-ISSN 1793-6462, Vol. 15, no. 3, p. 207-222. Article in journal (Refereed)
    Abstract [en]

    A recurrent neural network, modified to handle highly incomplete training data, is described. Unsupervised pattern recognition is demonstrated on the WHO database of adverse drug reactions. A comparison is made to a well-established method, AutoClass, and the performance of both methods is investigated on simulated data. The neural network method performs comparably to AutoClass on simulated data, and better than AutoClass on real-world data. With its better scaling properties, the neural network is a promising tool for unsupervised pattern recognition in huge databases of incomplete observations.
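
    The essential difficulty the abstract names, learning from records where most fields are unrecorded, can be shown with a much simpler stand-in: assign each partially observed record to the nearest cluster prototype using only its observed fields. This is not the paper's Bayesian recurrent network; it is a minimal illustration of missing-data handling, with `None` marking unrecorded fields.

    ```python
    def assign(record, prototypes):
        """Index of the prototype closest to `record`, averaging squared
        differences over observed (non-None) fields only."""
        best, best_d = None, float("inf")
        for k, proto in enumerate(prototypes):
            obs = [(r, p) for r, p in zip(record, proto) if r is not None]
            if not obs:
                continue  # fully missing record: no evidence for any cluster
            d = sum((r - p) ** 2 for r, p in obs) / len(obs)
            if d < best_d:
                best, best_d = k, d
        return best

    prototypes = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]
    # The second field is unrecorded; the observed fields still pick cluster 1.
    print(assign([0.9, None, 0.8], prototypes))  # → 1
    ```

    Averaging over the observed dimensions keeps records with different missingness patterns comparable, so no record has to be discarded or imputed before clustering.
    
    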
