Machine learning (ML) models have traditionally ignored the idea of innateness: the encoding of complex behaviors in the early stages of brain development. This paper proposes a new approach that accounts for this concept by treating the weight matrix of a neural network as emergent from well-studied rules of neuronal compatibility. The authors then use these rules to update the network's weights, yielding a more efficient representation of the data with fewer parameters. They also found that their model could act as a regularizer, selecting simple circuits that provided stable, adaptable performance on meta-learning tasks. By incorporating neurodevelopmental considerations into ML frameworks, the authors were able to model the emergence of innate behaviors and discover structures that promote complex computations. This article was authored by Daniel L. Barabasi, Taliesin Beynon, Adam Katona, and others.
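As a rough illustration of how a compatibility-rule parameterization can use fewer parameters than a directly learned weight matrix, the sketch below builds connection weights from pairwise compatibility of per-neuron expression profiles. The names `E` (a neuron-by-gene expression matrix), `O` (a gene-gene interaction rule), and the sizes chosen are illustrative assumptions, not the paper's actual code or notation:

```python
import numpy as np

# Hypothetical sizes: 256 neurons, each described by 8 "gene" expression levels.
n_neurons, n_genes = 256, 8

rng = np.random.default_rng(0)
E = rng.normal(size=(n_neurons, n_genes))  # expression profile of each neuron
O = rng.normal(size=(n_genes, n_genes))    # gene-gene compatibility rule

# Weights emerge from compatibility between expression profiles:
# W[i, j] = E[i] @ O @ E[j]
W = E @ O @ E.T                            # shape: (n_neurons, n_neurons)

direct_params = n_neurons * n_neurons      # free parameters in a dense W: 65536
emergent_params = E.size + O.size          # parameters in the rule: 2048 + 64 = 2112
print(W.shape, direct_params, emergent_params)
```

Learning then happens in the small matrices `E` and `O` rather than in `W` itself, which is one way a low-dimensional developmental rule can act as a built-in regularizer on the circuits that can be expressed.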