The model, codenamed “Olympus”, has 2 trillion parameters, the people said, which could make it one of the largest models in training. OpenAI's GPT-4, one of the best models available, is reported to have one trillion parameters.

Amazon has already trained smaller models such as Titan. It has also partnered with AI model startups such as Anthropic and AI21 Labs, offering their models to Amazon Web Services (AWS) users.

Amazon believes having homegrown models could make its offerings more attractive on AWS, where enterprise clients want access to top-performing models, the people familiar with the matter said, adding there is no specific timeline for releasing the new model.