This paper explores the use of neural networks to train multiple local models on dispersed data. The authors propose a method that generates artificial objects for training each local model, with the number of generated objects depending on the amount of data available. They evaluate this approach across several configurations: fewer or more artificial objects, different numbers of neurons in the hidden layer, and varying degrees of data dispersion. The results show that for larger data sets, a greater number of neurons in the hidden layer improves accuracy, while for smaller data sets, three or four artificial objects are sufficient to achieve good results. Finally, for larger data sets, data balance has little effect on model accuracy. This article was authored by Quibina Frimpong-Marfo and Maugwarshada Prusabhila Kasparak.
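The paper does not specify here how the artificial objects are generated, so the following is only a minimal sketch of one plausible scheme: augmenting a local node's small data set by interpolating between random pairs of its existing objects (a SMOTE-like approach). The function name `generate_artificial_objects` and the interpolation rule are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

def generate_artificial_objects(data, n_artificial, rng=None):
    """Augment a local data set with artificial objects.

    Each artificial object is a random convex combination of two
    distinct existing objects (an assumed, SMOTE-like scheme; the
    paper's exact generation method may differ).
    """
    rng = np.random.default_rng(rng)
    n = len(data)
    artificial = []
    for _ in range(n_artificial):
        i, j = rng.choice(n, size=2, replace=False)  # pick a random pair
        t = rng.random()                             # interpolation weight in [0, 1)
        artificial.append(data[i] + t * (data[j] - data[i]))
    return np.vstack([data, np.array(artificial)])

# A local node holding only three objects could add three artificial ones,
# matching the paper's observation that three or four suffice for small sets.
local_data = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
augmented = generate_artificial_objects(local_data, 3, rng=0)
```

Each local model would then be trained on its own augmented set, with the number of artificial objects tuned to the size of the local partition.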