Show simple item record

dc.contributor.author: Alfaverh, Fayiz
dc.contributor.author: Denai, Mouloud
dc.contributor.author: Sun, Yichuang
dc.date.accessioned: 2021-07-21T09:43:13Z
dc.date.available: 2021-07-21T09:43:13Z
dc.date.issued: 2021-06-11
dc.identifier.citation: Alfaverh, F., Denai, M. & Sun, Y. 2021, 'Electrical Vehicle Grid Integration for Demand Response in Distribution Networks Using Reinforcement Learning', IET Electrical Systems in Transportation. https://doi.org/10.1049/els2.12030
dc.identifier.issn: 2042-9738
dc.identifier.uri: http://hdl.handle.net/2299/24876
dc.description: © 2021 The Authors. IET Electrical Systems in Transportation published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology. This is an open access article under the terms of the Creative Commons Attribution License, https://creativecommons.org/licenses/by/4.0/
dc.description.abstract: Most utilities across the world already have demand response (DR) programs in place to incentivise consumers to reduce or shift their electricity consumption from peak periods to off-peak hours, usually in response to financial incentives. With the increasing electrification of vehicles, emerging technologies such as vehicle-to-grid (V2G) and vehicle-to-home (V2H) have the potential to offer a broad range of benefits and services to achieve more effective management of electricity demand. In this way, electric vehicles (EVs) become distributed energy storage resources and can conceivably, in conjunction with other electricity storage solutions, contribute to DR and provide additional capacity to the grid when needed. This paper proposes an effective DR approach for V2G and V2H energy management using Reinforcement Learning (RL). Q-learning, an RL strategy based on a reward mechanism, is used to make optimal decisions to charge or delay the charging of the EV battery pack and/or dispatch the stored electricity back to the grid without compromising the driving needs. Simulations are presented to demonstrate how the proposed DR strategy can effectively manage the charging/discharging schedule of the EV battery and how V2H and V2G can help smooth the household load profile, minimise electricity bills and maximise revenue.
dc.format.extent: 14
dc.format.extent: 2360692
dc.language.iso: eng
dc.relation.ispartof: IET Electrical Systems in Transportation
dc.title: Electrical Vehicle Grid Integration for Demand Response in Distribution Networks Using Reinforcement Learning
dc.contributor.institution: Centre for Engineering Research
dc.contributor.institution: Communications and Intelligent Systems
dc.contributor.institution: School of Physics, Engineering & Computer Science
dc.contributor.institution: Department of Engineering and Technology
dc.description.status: Peer reviewed
rioxxterms.versionofrecord: 10.1049/els2.12030
rioxxterms.type: Journal Article/Review
herts.preservation.rarelyaccessed: true
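The abstract describes tabular Q-learning used to decide, period by period, whether to charge the EV battery, stay idle, or discharge back to the grid (V2G), while respecting driving needs. The sketch below is purely illustrative and is not the paper's implementation: the tariff, battery model, reward shaping and end-of-day state-of-charge constraint are all invented assumptions for the example.

```python
import random

# Illustrative toy version of a Q-learning DR scheme for an EV battery.
# All numbers below (tariff, capacity, penalty) are assumed for the example.
PRICES = [0.10, 0.10, 0.30, 0.30, 0.10, 0.10]  # assumed price per period
ACTIONS = ["charge", "idle", "discharge"]       # discharge = V2G export
CAPACITY = 4                                    # battery levels 0..CAPACITY
MIN_SOC_AT_END = 3                              # assumed driving-need constraint

def step(t, soc, action):
    """Return (next_soc, reward) for one period under the toy model."""
    reward = 0.0
    if action == "charge" and soc < CAPACITY:
        soc, reward = soc + 1, -PRICES[t]       # pay the tariff to charge
    elif action == "discharge" and soc > 0:
        soc, reward = soc - 1, PRICES[t]        # earn revenue from V2G export
    # idle, or an infeasible action, changes nothing
    if t == len(PRICES) - 1 and soc < MIN_SOC_AT_END:
        reward -= 1.0 * (MIN_SOC_AT_END - soc)  # penalise unmet driving needs
    return soc, reward

def train(episodes=5000, alpha=0.2, gamma=0.95, eps=0.1, seed=0):
    """Standard tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = {}  # maps (t, soc, action) -> estimated value
    for _ in range(episodes):
        soc = 2  # assumed initial state of charge
        for t in range(len(PRICES)):
            if rng.random() < eps:
                a = rng.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q.get((t, soc, x), 0.0))
            nxt, r = step(t, soc, a)
            best_next = max(Q.get((t + 1, nxt, x), 0.0) for x in ACTIONS)
            old = Q.get((t, soc, a), 0.0)
            Q[(t, soc, a)] = old + alpha * (r + gamma * best_next - old)
            soc = nxt
    return Q

def greedy_schedule(Q, soc=2):
    """Roll out the greedy policy to get a charging/discharging schedule."""
    sched = []
    for t in range(len(PRICES)):
        a = max(ACTIONS, key=lambda x: Q.get((t, soc, x), 0.0))
        soc, _ = step(t, soc, a)
        sched.append(a)
    return sched, soc

Q = train()
schedule, final_soc = greedy_schedule(Q)
print(schedule, final_soc)
```

With this toy tariff, the learned policy tends to charge during the cheap periods and discharge during the expensive ones, while ending the day with enough charge for driving, which mirrors the load-smoothing and revenue behaviour the abstract claims.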

