The research proposes an approach that uses machine learning and LSTM-based neural networks in various configurations to optimize time-series models for accurate electric load forecasting in smart grids. The best features are selected using wrapper and embedded feature-selection methods, and a genetic algorithm determines the optimal time lag and number of layers. The results show that the LSTM model with optimally selected time-lag features outperforms machine learning models with hyperparameter tuning, yielding lower mean absolute error (MAE) and root mean square error (RMSE) for medium- to long-range forecasting in a wider metropolitan area. This article was authored by Sula Bhaktif, Ali Faiz, Ali Auni and others.
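The genetic-algorithm search over candidate time lags can be sketched in miniature. The code below is an illustrative toy, not the authors' implementation: it uses a simple seasonal-persistence predictor as a stand-in for the LSTM, a synthetic load series with an assumed 24-step daily cycle, and RMSE as the fitness the GA minimizes; the population size, mutation rate, and lag range are arbitrary assumptions.

```python
import random
import math

random.seed(0)

# Synthetic "load" series: a 24-step daily cycle plus Gaussian noise
# (assumption: stand-in for real metropolitan load data).
series = [100 + 20 * math.sin(2 * math.pi * t / 24) + random.gauss(0, 2)
          for t in range(500)]

def rmse_for_lag(lag):
    """Fitness: forecast y[t] as y[t - lag] (seasonal persistence,
    a stand-in for the LSTM) and return the RMSE over the series."""
    errors = [(series[t] - series[t - lag]) ** 2
              for t in range(lag, len(series))]
    return math.sqrt(sum(errors) / len(errors))

def genetic_search(pop_size=8, generations=15, lag_range=(1, 48)):
    # Each individual is a candidate time lag (integer chromosome).
    pop = [random.randint(*lag_range) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=rmse_for_lag)            # rank by fitness (lower RMSE)
        parents = pop[:pop_size // 2]         # selection: keep the best half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2              # crossover: average of parents
            if random.random() < 0.3:         # mutation: small perturbation
                child += random.randint(-3, 3)
            children.append(min(max(child, lag_range[0]), lag_range[1]))
        pop = parents + children
    return min(pop, key=rmse_for_lag)

best_lag = genetic_search()
print(best_lag, round(rmse_for_lag(best_lag), 3))
```

On this synthetic series the GA tends toward a lag near the 24-step period, where the seasonal-persistence forecast leaves only the noise; in the paper's setting the fitness evaluation would instead train and score an LSTM on the candidate lag features.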