Deep Reinforcement Learning Based Energy Management for Smart Buildings with Heat Pump and Electric Vehicles
KTH, School of Electrical Engineering and Computer Science (EECS), Electrical Engineering, Electric Power and Energy Systems. ORCID iD: 0009-0005-7707-3405
KTH, School of Electrical Engineering and Computer Science (EECS), Electrical Engineering, Electric Power and Energy Systems. ORCID iD: 0000-0002-2793-9048
2024 (English). In: IECON 2024 - 50th Annual Conference of the IEEE Industrial Electronics Society, Proceedings, Institute of Electrical and Electronics Engineers (IEEE), 2024. Conference paper, Published paper (Refereed)
Abstract [en]

The rapid advancement of intelligent devices and smart meters has positioned smart buildings as key components in modern living. To ensure users’ comfort, regulate the indoor temperature, and minimize the overall cost of a building, an effective energy management system (EMS) is necessary. This paper proposes a deep deterministic policy gradient (DDPG)-based EMS for smart buildings, incorporating photovoltaic (PV) panels, energy storage systems (ESS), a heat pump (HP), and electric vehicle (EV) chargers. Unlike traditional methods that linearize temperature variations, our approach leverages DDPG to handle nonlinear models. The proposed EMS automatically addresses uncertainties in temperature and solar irradiation and provides real-time scheduling commands without requiring forecast data. Simulation results demonstrate the effectiveness of the DDPG-based EMS in minimizing costs while ensuring user comfort, outperforming traditional methods in both cost savings and temperature control precision.
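To make the setup described in the abstract concrete, the sketch below shows a toy building environment of the kind a DDPG agent would interact with: the state holds indoor temperature, ESS state of charge, price, and irradiation; the action sets HP and ESS power; the reward combines electricity cost and a comfort penalty. All class and parameter names (BuildingEnv, the dynamics coefficients, the comfort band) are illustrative assumptions, not taken from the paper.

```python
# Hypothetical minimal environment sketch for a DDPG-based building EMS.
# Assumes first-order nonlinear thermal dynamics, a temperature-dependent
# heat-pump COP, and simplified PV/ESS models; all values are illustrative.
class BuildingEnv:
    def __init__(self, temp=20.0, soc=0.5, price=0.2, irradiation=0.3):
        self.temp = temp              # indoor temperature [deg C]
        self.soc = soc                # ESS state of charge in [0, 1]
        self.price = price            # electricity price [EUR/kWh]
        self.irr = irradiation        # normalized solar irradiation in [0, 1]
        self.comfort = (19.0, 23.0)   # user comfort band [deg C]

    def step(self, hp_power, ess_power):
        """One control interval: hp_power heats (kW); ess_power charges (+) or discharges (-)."""
        pv = 4.0 * self.irr                        # PV generation [kW]
        cop = 3.0 - 0.05 * (22.0 - self.temp)      # nonlinear heat-pump COP
        # Heating gain minus heat loss toward a 10 deg C ambient temperature
        self.temp += 0.5 * hp_power * cop - 0.03 * (self.temp - 10.0)
        self.soc = min(1.0, max(0.0, self.soc + 0.1 * ess_power))
        grid = max(0.0, hp_power + ess_power - pv)  # net grid import [kW]
        cost = grid * self.price
        lo, hi = self.comfort
        discomfort = max(0.0, lo - self.temp) + max(0.0, self.temp - hi)
        reward = -(cost + 2.0 * discomfort)         # the DDPG agent maximizes this
        return (self.temp, self.soc), reward
```

A continuous-action agent such as DDPG suits this interface because hp_power and ess_power are real-valued setpoints rather than discrete choices.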

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024.
Keywords [en]
Deep reinforcement learning, electric vehicle, energy management system, heat pump
National Category
Energy Engineering; Reliability and Maintenance; Energy Systems
Identifiers
URN: urn:nbn:se:kth:diva-361978
DOI: 10.1109/IECON55916.2024.10905415
Scopus ID: 2-s2.0-105000866789
OAI: oai:DiVA.org:kth-361978
DiVA, id: diva2:1949651
Conference
50th Annual Conference of the IEEE Industrial Electronics Society, IECON 2024, Chicago, United States of America, November 3-6, 2024
Note

Part of ISBN 9781665464543

QC 20250404

Available from: 2025-04-03. Created: 2025-04-03. Last updated: 2025-04-04. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Jiang, Xuan; Xu, Qianwen
