Article

PMU Measurements-Based Short-Term Voltage Stability Assessment of Power Systems via Deep Transfer Learning

Journal

IEEE Transactions on Instrumentation and Measurement

Publisher

IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TIM.2023.3311065

Keywords

Deep transfer learning; least squares generative adversarial networks (LSGANs); phasor measurement unit (PMU) measurements; power system stability; short-term voltage stability assessment (STVSA); temporal ensembling; transformer


Summary

This article proposes a new method for short-term voltage stability assessment based on phasor measurement unit measurements. By employing deep transfer learning, it overcomes the limitations of existing methods in adapting to topological changes, labeling samples, and handling small datasets. Experimental results demonstrate strong adaptability to topological changes and a significant improvement in model evaluation accuracy on small-scale datasets.

Abstract

Deep learning (DL) has emerged as an effective solution for addressing the challenges of short-term voltage stability assessment (STVSA) in power systems; however, existing DL-based STVSA approaches face limitations in adapting to topological changes, sample labeling, and handling small datasets. To overcome these challenges, this article proposes a novel phasor measurement unit (PMU) measurements-based STVSA method using deep transfer learning. The method leverages the real-time dynamic information captured by PMUs to create an initial dataset. It employs temporal ensembling for sample labeling and uses least squares generative adversarial networks (LSGANs) for data augmentation (DA), enabling effective DL on small-scale datasets. Additionally, the method enhances adaptability to topological changes by exploring connections between different faults. Experimental results on the IEEE 39-bus test system demonstrate that the proposed method improves model evaluation accuracy by approximately 20% through transfer learning (TL) and exhibits strong adaptability to topological changes. By leveraging the self-attention mechanism of the transformer model, this approach offers significant advantages over shallow learning methods and other DL-based approaches.
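The abstract names two building blocks worth making concrete: the least-squares GAN objective used for data augmentation and the temporal-ensembling update used for sample labeling. The following is a minimal NumPy sketch of the standard formulations of these two techniques (Mao et al.'s LSGAN losses and Laine and Aila's temporal ensembling), not the paper's actual implementation; all function and variable names are illustrative.

```python
import numpy as np

def lsgan_losses(d_real, d_fake):
    """Standard least-squares GAN objectives: the discriminator regresses
    real samples toward 1 and generated samples toward 0, while the
    generator pushes generated samples toward 1. Inputs are raw
    discriminator outputs on batches of real and fake samples."""
    loss_d = 0.5 * np.mean((d_real - 1.0) ** 2) + 0.5 * np.mean(d_fake ** 2)
    loss_g = 0.5 * np.mean((d_fake - 1.0) ** 2)
    return loss_d, loss_g

def temporal_ensembling_targets(Z, z_epoch, epoch, alpha=0.6):
    """One temporal-ensembling step: accumulate an exponential moving
    average Z of per-sample network predictions across epochs and
    bias-correct it, yielding soft pseudo-labels for unlabeled samples."""
    Z = alpha * Z + (1.0 - alpha) * z_epoch   # EMA of predictions
    targets = Z / (1.0 - alpha ** epoch)      # startup bias correction
    return Z, targets
```

In an STVSA setting of this kind, the pseudo-labels from temporal ensembling would mark simulated post-fault trajectories as stable or unstable, and the LSGAN would synthesize additional PMU-like trajectories to enlarge the small labeled dataset before transfer learning.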

