 What is the GPT part of ChatGPT? GPT stands for Generative Pre-trained Transformer, and GPT models combine two ideas: language models and transformer neural networks. A language model has some mathematical understanding of language: given some text, it assigns probabilities to what comes next. A transformer is a sequence-to-sequence neural network with an encoder part and a decoder part, and each part can independently be turned into a language model. If you stack a bunch of the encoders together, you get BERT. If you stack a bunch of the decoders together, you get the GPT models.
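
To make "a mathematical understanding of language" concrete, here is a minimal sketch of the simplest possible language model: a bigram model that estimates next-word probabilities by counting word pairs. The tiny corpus and the function names are made up for illustration; GPT models learn these probabilities with a neural network instead of raw counts, but the job is the same, predicting what comes next.

```python
from collections import Counter, defaultdict

# Toy corpus, purely illustrative.
corpus = "the cat sat on the mat the cat ran".split()

# Count how often each word follows each other word.
counts = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    counts[w1][w2] += 1

def next_word_probs(word):
    """Estimate P(next word | current word) from the counts."""
    total = sum(counts[word].values())
    return {w: c / total for w, c in counts[word].items()}

# After "the", the model has seen "cat" twice and "mat" once.
print(next_word_probs("the"))  # {'cat': 0.666..., 'mat': 0.333...}
```

A GPT model does this same prediction task, but it conditions on the whole preceding sequence rather than one word, using stacked decoder blocks instead of a count table.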