Authors: Al Sultan, Muhamed; Avci, Isa; Talab, Odia
Date accessioned: 2024-09-29
Date available: 2024-09-29
Date issued: 2023
ISSN: 1823-4690
URI: https://hdl.handle.net/20.500.14619/8740

Abstract: Fault detection and classification in transmission lines is an important problem in power system protection. This paper proposes a novel fault detection and classification approach based on a fine-tuned LSTM model and the dbN wavelet transform. Specifically, a method for selecting the optimal decomposition scale is proposed, and an improved Arithmetic Optimization Algorithm (IAOA) is implemented to enhance the accuracy of the LSTM model by optimizing its hyperparameters and reducing the model's root-mean-square error (RMSE). The proposed method marks a significant advance in fault detection and classification. The model is simulated in MATLAB on a three-phase series-compensated network (735 kV, 60 Hz, 300 km fault distance) to classify faults. Features are extracted to a decomposition depth of three using dbN as the mother wavelet. The IAOA-LSTM model achieves 99.99% accuracy and 0.0010 loss when tested on 2545 simulated samples covering five fault types. Maintaining the stability and reliability of power systems relies heavily on fault detection and classification, to which the proposed method contributes substantially. Implementing the IAOA for hyperparameter optimization and model-error reduction has also been shown to further enhance the accuracy of the LSTM model. The proposed approach can therefore contribute significantly to developing more advanced and efficient protection systems for power transmission lines.

Language: en
Access rights: info:eu-repo/semantics/closedAccess
Keywords: dbN wavelet transform; Ground faults; Improved AOA; LSTM model; Modulus maximum matrix; Short-circuit faults
Title: ENHANCED FAULT DETECTION AND CLASSIFICATION IN TRANSMISSION LINES USING FINE-TUNED LSTM MODEL AND DBN TRANSFORM-BASED FEATURE SELECTION
Type: Article
9476
WOS ID: WOS:001148672800006
Quartile: Q3
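The abstract's feature-extraction step (wavelet decomposition to a depth of three) can be sketched as below. This is a minimal illustration only: a Haar (db1) filter pair stands in for the paper's dbN wavelet because its two coefficients fit inline, and the function names, test signal, and detail-energy feature are assumptions for illustration, not the authors' code.

```python
import math

# Haar (db1) analysis filters; the paper uses a longer dbN filter pair.
LOW = [1 / math.sqrt(2), 1 / math.sqrt(2)]    # low-pass (approximation)
HIGH = [1 / math.sqrt(2), -1 / math.sqrt(2)]  # high-pass (detail)

def _filter_downsample(signal, taps):
    """Convolve with a 2-tap filter and keep every other output sample."""
    return [signal[i] * taps[0] + signal[i + 1] * taps[1]
            for i in range(0, len(signal) - 1, 2)]

def wavelet_features(signal, depth=3):
    """Decompose to `depth` levels; return the detail energy per level
    as a simple per-scale feature vector (an illustrative choice)."""
    approx, features = list(signal), []
    for _ in range(depth):
        detail = _filter_downsample(approx, HIGH)
        approx = _filter_downsample(approx, LOW)
        features.append(sum(d * d for d in detail))
    return features

# A step discontinuity (a crude stand-in for a fault transient) leaves
# energy in the detail coefficients at every decomposition level.
sig = [0.0] * 31 + [1.0] * 33
print(wavelet_features(sig))  # three per-level detail energies
```

A classifier such as the paper's LSTM would then consume these per-scale features; the real pipeline extracts them per phase from the simulated three-phase signals.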