Optimal Radio Frequency Energy Harvesting with Limited Energy Arrival Knowledge
KTH, School of Information and Communication Technology (ICT), Centres, VinnExcellence Center for Intelligence in Paper and Packaging, iPACK.
2016 (English). In: IEEE Journal on Selected Areas in Communications, ISSN 0733-8716, E-ISSN 1558-0008, Vol. 34, no. 12, pp. 3528-3539, Article ID 7543484. Article in journal (Refereed). Published.
Abstract [en]

We develop optimal sleeping and harvesting policies for radio frequency (RF) energy harvesting devices, formalizing the following intuition: when the ambient RF energy is low, devices consume more energy being awake than they can harvest and should enter sleep mode; when the ambient RF energy is high, on the other hand, it is essential to wake up and harvest. Toward this end, we consider a scenario with intermittent energy arrivals described by a two-state Gilbert-Elliott Markov chain model. The challenge is that the state of the Markov chain can only be observed during the harvesting action, and not while in sleep mode. Two scenarios are studied under this model. In the first scenario, we assume that the transition probabilities of the Markov chain are known and formulate the problem as a partially observable Markov decision process (POMDP). We prove that the optimal policy has a threshold structure and derive the optimal decision parameters. In the practical regime where the ratio between the reward and the penalty is neither too large nor too small, the POMDP framework and the threshold-based optimal policies are very useful for finding non-trivial optimal sleeping times. In the second scenario, we assume that the Markov chain parameters are unknown, formulate the problem as a Bayesian adaptive POMDP, and propose a heuristic posterior sampling algorithm to reduce the computational complexity. The performance of our approaches is demonstrated via numerical examples.
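
The abstract describes the known-parameter scenario as a POMDP over a two-state Gilbert-Elliott chain whose state is revealed only while harvesting. As a rough illustration of that setup, the following Python sketch simulates a fixed threshold policy: the device harvests when its belief that the ambient RF state is good exceeds a threshold, and otherwise sleeps while the belief evolves deterministically under the chain. All parameter values, and the myopic threshold itself, are assumptions for illustration; the paper derives the POMDP-optimal threshold, which this sketch does not reproduce.

    import random

    # Assumed Gilbert-Elliott parameters (not taken from the paper).
    P_GB = 0.1     # transition probability good -> bad
    P_BG = 0.2     # transition probability bad  -> good
    REWARD = 1.0   # energy harvested per slot in the good state
    COST = 0.3     # energy spent per slot while awake
    # Myopic threshold: harvest iff the expected immediate gain is non-negative.
    # An illustrative stand-in for the optimal threshold derived in the paper.
    THRESHOLD = COST / REWARD

    def predict(belief_good):
        """One-step belief propagation; used while sleeping (no observation)."""
        return belief_good * (1 - P_GB) + (1 - belief_good) * P_BG

    def step(state):
        """Advance the true, hidden two-state Markov chain by one slot."""
        if state == "G":
            return "B" if random.random() < P_GB else "G"
        return "G" if random.random() < P_BG else "B"

    def simulate(slots=10_000, seed=0):
        random.seed(seed)
        state, belief, energy = "G", 0.5, 0.0
        for _ in range(slots):
            if belief >= THRESHOLD:
                # Wake up and harvest: pay the awake cost, observe the state.
                energy += (REWARD if state == "G" else 0.0) - COST
                belief = 1.0 if state == "G" else 0.0
            # Whether harvesting or sleeping, propagate the belief one slot.
            belief = predict(belief)
            state = step(state)
        return energy / slots

    print(f"average net energy per slot: {simulate():.3f}")

For the unknown-parameter scenario, one could extend the sketch in the spirit of posterior sampling: maintain Beta posteriors over P_GB and P_BG from the transitions observed while harvesting, periodically draw parameter samples, and run the threshold rule under the sampled values. This mirrors the paper's heuristic only in spirit; the actual Bayesian adaptive POMDP formulation is in the article.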

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2016. Vol. 34, no. 12, pp. 3528-3539, Article ID 7543484.
Keywords [en]
ambient radio frequency energy, Bayesian inference, Energy harvesting, learning, partially observable Markov decision process, Bayesian networks, Chains, Inference engines, Optimization, Radio waves, Sleep research, Markov chain models, Radio-frequency energy, Radio-frequency energy harvesting, Sampling algorithm, Transition probabilities, Markov processes
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:kth:diva-201892
DOI: 10.1109/JSAC.2016.2600364
ISI: 000392473600032
Scopus ID: 2-s2.0-85009732782
OAI: oai:DiVA.org:kth-201892
DiVA: diva2:1079309
Note

QC 20170308

Available from: 2017-03-08. Created: 2017-03-08. Last updated: 2017-03-08. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Zou, Zhuo
By organisation
VinnExcellence Center for Intelligence in Paper and Packaging, iPACK
In the same journal
IEEE Journal on Selected Areas in Communications
