E3S Web Conf.
Volume 236, 2021: 3rd International Conference on Energy Resources and Sustainable Development (ICERSD 2020)
Number of page(s): 5
Section: Development, Utilization and Protection of Traditional Energy Resources
Published online: 09 February 2021
Research on the application of target detection based on deep learning technology in power grid operation inspection
1 Information and Communication Transportation Inspection Center, State Grid Sichuan Information & Telecommunication Company, Sichuan, China
2 Information and Communication Transportation Inspection Center, State Grid Sichuan Information & Telecommunication Company, Sichuan, China
3 Power Dispatching Control Center, State Grid Sichuan Electric Power Company, Sichuan, China
4 Research and Development, Whayer Intelligent Technology Group Co., Ltd., Sichuan, China
5 Research and Development, Sichuan GRID Power Communication Technology Co., Ltd., Sichuan, China
External damage to power facilities caused by cranes, excavators and other construction operations increases year by year and seriously threatens the safe operation of the power system. Intelligent monitoring and early warning of such external damage through video and other non-contact observation is an important measure for ensuring the safe and reliable operation of the power system. Power video data mainly comes from helicopters, UAVs and fixed cameras on transmission poles and towers, and is characterized by large data volume, complex scenes and severe environmental interference. Traditional target detection methods usually select candidate regions first and then make judgments based on hand-crafted features; their detection speed is slow and their accuracy is low, so the video data cannot be monitored in real time and timely, accurate early warning and intervention for external damage are impossible. Target detection methods based on deep learning optimize or even eliminate candidate-region selection, which greatly increases detection speed. By learning from a large number of target samples, the deep neural network gradually fits highly robust features, making target judgments more accurate.
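The key speed-up described above is that a single-stage deep detector predicts boxes densely over a grid and simply filters them by confidence, rather than scoring hand-selected candidate regions one by one. A minimal decoding sketch in plain Python (illustrative only; the grid size, box encoding and threshold here are assumptions, not the paper's method):

```python
def decode_grid(preds, grid_size, img_size, conf_thresh=0.5):
    """Decode grid-cell predictions into image-space boxes.

    Each prediction is (row, col, cx, cy, w, h, conf): (cx, cy) is the box
    center as an offset within its grid cell in [0, 1], and (w, h) are the
    box width and height as fractions of the whole image.
    Returns (x1, y1, x2, y2, conf) tuples for confident cells only.
    """
    cell = img_size / grid_size
    boxes = []
    for (row, col, cx, cy, w, h, conf) in preds:
        if conf < conf_thresh:
            continue  # discard low-confidence cells instead of scoring every region
        x_center = (col + cx) * cell
        y_center = (row + cy) * cell
        bw, bh = w * img_size, h * img_size
        boxes.append((x_center - bw / 2, y_center - bh / 2,
                      x_center + bw / 2, y_center + bh / 2, conf))
    return boxes

# One confident detection in cell (1, 1) of a 4x4 grid over a 416-pixel image;
# the second prediction falls below the confidence threshold and is dropped.
preds = [(1, 1, 0.5, 0.5, 0.25, 0.25, 0.9),
         (0, 3, 0.2, 0.8, 0.10, 0.10, 0.1)]
print(decode_grid(preds, grid_size=4, img_size=416))
```

The whole image is processed in one forward pass and one linear scan over the grid, which is what makes real-time monitoring of continuous video streams feasible.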
Introducing deep-learning-based target detection into power video detection raises three key problems. First, deep-learning-based target detection involves heavy computation and many parameters. To run on terminals with limited computing and storage capacity, a practical method is needed to simplify the network and reduce the amount of data processed during detection; this is the key to deploying deep neural networks in place on terminals. Second, the performance of different target detection algorithms varies greatly across application scenarios, and power video is highly specialized; finding an effective target detection method is the key to improving detection speed and accuracy. Finally, as deep learning continues to develop, deep neural network architectures change rapidly and each has its own characteristics; which network structure to use as the feature extraction layer of the target detection algorithm is a central research question.
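One common technique for the first problem, shrinking a network so it fits on compute- and storage-limited terminals, is post-training weight quantization: storing each weight as an 8-bit integer plus a shared scale factor instead of a 32-bit float. A minimal symmetric-quantization sketch in plain Python (illustrative; the paper does not specify which compression method it uses):

```python
def quantize_int8(weights):
    """Map float weights to int8 codes in [-127, 127] with one shared scale."""
    scale = max(abs(w) for w in weights) / 127.0
    codes = [round(w / scale) for w in weights]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from the int8 codes."""
    return [c * scale for c in codes]

# Each code fits in one byte, a 4x storage reduction versus float32,
# at the cost of a small reconstruction error per weight.
weights = [0.62, -1.27, 0.05, 0.9]
codes, scale = quantize_int8(weights)
approx = dequantize(codes, scale)
err = max(abs(a - b) for a, b in zip(weights, approx))
print(codes, err)
```

In practice activations are quantized as well and the integer arithmetic itself is accelerated, but even weight-only quantization already cuts the model's memory footprint substantially, which matters most on embedded inspection terminals.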
© The Authors, published by EDP Sciences 2021
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.