kth.se | KTH Publications
Online feature selection for rapid, low-overhead learning in networked systems
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science.
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering. RISE Research Institutes of Sweden, Gothenburg, Sweden. ORCID iD: 0000-0002-6343-7416
KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering. RISE Research Institutes of Sweden, Gothenburg, Sweden. ORCID iD: 0000-0001-6039-8493
2020 (English). In: 2020 16th International Conference on Network and Service Management (CNSM) / [ed] Zincir-Heywood, N.; Ulema, M.; Sayit, M.; Clayman, S.; Kim, M. S.; Cetinkaya, C., IEEE, 2020. Conference paper, Published paper (Refereed)
Abstract [en]

Data-driven functions for operation and management often require measurements collected through monitoring for model training and prediction. The number of data sources can be very large, which incurs significant communication and computing overhead to continuously extract and collect this data, as well as to train and update the machine-learning models. We present an online algorithm, called OSFS, that selects a small feature set from a large number of available data sources, which allows for rapid, low-overhead, and effective learning and prediction. OSFS is instantiated with a feature ranking algorithm and applies the concept of a stable feature set, which we introduce in the paper. We perform an extensive experimental evaluation of our method on data from an in-house testbed. We find that OSFS requires several hundred measurements to reduce the number of data sources by two orders of magnitude, from which models are trained with acceptable prediction accuracy. While our method is heuristic and can be improved in many ways, the results clearly suggest that many learning tasks do not require a lengthy monitoring phase and expensive offline training.
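The abstract's core idea — repeatedly rank features as measurements arrive and stop once the top-ranked set stabilizes — can be sketched in a few lines. This is an illustrative sketch, not the paper's actual OSFS implementation: the ranking function (absolute Pearson correlation), the Jaccard-based stability test, and all parameter names (`k`, `window`, `threshold`) are assumptions standing in for the ranking algorithm and stable-feature-set criterion the paper defines.

```python
# Illustrative sketch of online feature selection with a stability
# stopping rule. Not the paper's code; ranking and stability criteria
# are stand-ins for those defined in the OSFS paper.
import numpy as np

def rank_features(X, y):
    # Rank features by absolute Pearson correlation with the target
    # (a simple stand-in for the ranking algorithm OSFS is instantiated with).
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return np.argsort(corr)[::-1]

def online_select(stream, k=10, window=3, threshold=0.9):
    """Consume (X, y) batches; return the top-k feature set once stable.

    Stability (assumed criterion): Jaccard similarity of consecutive
    top-k sets stays above `threshold` over `window` successive batches.
    """
    X_all, y_all, history = None, None, []
    for X, y in stream:
        X_all = X if X_all is None else np.vstack([X_all, X])
        y_all = y if y_all is None else np.concatenate([y_all, y])
        top_k = set(rank_features(X_all, y_all)[:k].tolist())
        history.append(top_k)
        if len(history) > window:
            history.pop(0)
        if len(history) == window and all(
            len(a & b) / len(a | b) >= threshold
            for a, b in zip(history, history[1:])
        ):
            return top_k  # stable set found: stop monitoring early
    return history[-1] if history else set()
```

The early stop is what yields the "rapid, low-overhead" property claimed in the abstract: once consecutive rankings agree, further measurement collection from the excluded sources can cease.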

Place, publisher, year, edition, pages
IEEE, 2020.
Series
International Conference on Network and Service Management, ISSN 2165-9605
Keywords [en]
Data-driven engineering, Machine learning (ML), Dimensionality reduction
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-291038
DOI: 10.23919/CNSM50824.2020.9269066
ISI: 000612229200029
Scopus ID: 2-s2.0-85098668191
OAI: oai:DiVA.org:kth-291038
DiVA, id: diva2:1532890
Conference
16th International Conference on Network and Service Management (CNSM) / 2nd International Workshop on Analytics for Service and Application Management (AnServApp) / 1st International Workshop on the Future Evolution of Internet Protocols (IPFuture), November 2-6, 2020, held online.
Note

QC 20210303

Available from: 2021-03-03. Created: 2021-03-03. Last updated: 2024-06-10. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Wang, Xiaoxuan; Samani, Forough Shahab; Stadler, Rolf
