Authors: Wang, Ziyue; Xing, Yang
Date issued: 2024-03
Date available: 2024-06-20
Citation: Wang Z, Xing Y. (2024) Energy consumption optimisation for unmanned aerial vehicle based on reinforcement learning framework. International Journal of Powertrains, Volume 13, Issue 1, March 2024, pp. 75-94
ISSN: 1742-4267 (print); 1742-4275 (online)
DOI: https://doi.org/10.1504/IJPT.2024.138001
Handle: https://dspace.lib.cranfield.ac.uk/handle/1826/22536
Abstract: The average battery life of drones in use today is around 30 minutes, which significantly limits long-range operations such as seamless delivery and security monitoring. Meanwhile, the transportation sector is responsible for 93% of all carbon emissions, making it crucial to control UAV energy usage if future large-scale air traffic is to reach net zero. In this study, a reinforcement learning (RL)-based model was implemented for the energy consumption optimisation of drones. The RL-based energy optimisation framework dynamically tunes the vehicle control system to maximise energy economy while accounting for mission objectives, ambient conditions, and system performance. RL was used to create a dynamically optimised vehicle control system that selects the most energy-efficient route. Depending on training time, the trained UAV achieved energy savings of between 50.1% and 91.6% relative to an untrained UAV on the same map.
Pages: 75-94
Language: en-UK
Rights: Attribution-NonCommercial 4.0 International (http://creativecommons.org/licenses/by-nc/4.0/)
Keywords: power consumption; machine learning; reinforcement learning; trajectory optimisation; Q-learning; energy efficiency; path planning
Title: Energy consumption optimisation for unmanned aerial vehicle based on reinforcement learning framework
Type: Article
Volume: 13
Issue: 1
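
The keywords indicate Q-learning is the RL method used for energy-aware path planning. As a rough illustration only, the sketch below shows generic tabular Q-learning on a small grid map where each cell carries an assumed energy cost; the grid size, cost values, reward shaping, and hyperparameters are placeholders and are not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: tabular Q-learning for energy-aware grid path planning.
# All environment details (grid, costs, hyperparameters) are assumptions.

GRID = 5                                          # 5x5 map of discrete waypoints
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]      # up, down, left, right
START, GOAL = (0, 0), (GRID - 1, GRID - 1)

rng = np.random.default_rng(0)
energy_cost = rng.uniform(1.0, 5.0, size=(GRID, GRID))  # assumed per-cell energy cost

Q = np.zeros((GRID, GRID, len(ACTIONS)))
alpha, gamma, eps, episodes = 0.1, 0.95, 0.2, 2000

def step(state, a):
    """Apply an action; reward is the negative energy cost of the cell entered."""
    r, c = state
    dr, dc = ACTIONS[a]
    nr = min(max(r + dr, 0), GRID - 1)
    nc = min(max(c + dc, 0), GRID - 1)
    reward = -energy_cost[nr, nc] + (100.0 if (nr, nc) == GOAL else 0.0)
    return (nr, nc), reward, (nr, nc) == GOAL

for _ in range(episodes):
    s, done = START, False
    while not done:
        # Epsilon-greedy action selection
        a = int(rng.integers(len(ACTIONS))) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        # Standard Q-learning temporal-difference update
        Q[s][a] += alpha * (r + gamma * np.max(Q[s2]) * (not done) - Q[s][a])
        s = s2

# Greedy rollout: follow the learned policy to read off an energy-efficient route
s, path = START, [START]
while s != GOAL and len(path) < GRID * GRID:
    s, _, _ = step(s, int(np.argmax(Q[s])))
    path.append(s)
print("Energy-efficient route:", path)
```

In this formulation, minimising cumulative negative reward is equivalent to minimising total energy spent along the route, which is the general idea behind RL-based energy-efficient path planning; the paper's actual state space, reward function, and training setup may differ.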