OpenAI has pushed out a flurry of updates at its first developer conference, including the release of its new GPT-4 Turbo model, which can fit the "equivalent of more than 300 pages of text in a single prompt", and the ability to train and run LLMs powered by proprietary datasets.
Calling the latter simply "GPTs", OpenAI says the feature will help users create customised versions of ChatGPT for "specific use cases, departments, or proprietary datasets", as well as deploy "internal-only GPTs", as it aims to tap an enterprise market keen to develop its own LLMs.
GPT-4 Turbo, which was trained on data as recent as April 2023, meaning it has more "knowledge" of recent events, is available as a beta for paying developers to try by passing gpt-4-1106-preview in the API. A "stable production-ready model" will follow in the coming weeks, OpenAI said.
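For illustration, selecting the preview model is a matter of putting that name in the `model` field of a Chat Completions request. The sketch below builds the request body only; no call is made, and you would need your own API key and HTTP client (or OpenAI's SDK) to actually send it.

```python
import json

# Documented REST endpoint for Chat Completions (shown for reference;
# this sketch does not make a network request).
API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str) -> str:
    """Serialise a minimal request body targeting the GPT-4 Turbo preview."""
    body = {
        # The preview model name from the announcement.
        "model": "gpt-4-1106-preview",
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

print(build_request("Summarise this 300-page document."))
```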
For enterprise users, the company also touted its Assistants API, dubbed "our first step towards helping developers build agent-like experiences within their own applications", and the ability to define custom actions by making one or more APIs available to its models: "Like plugins, actions allow GPTs to integrate external data or interact with the real-world. Connect GPTs to databases, plug them into emails, or make them your shopping assistant. For example, you could integrate a travel listings database, connect a user's email inbox, or facilitate e-commerce orders."
New function calling capabilities will let developers "describe functions of your app or external APIs to models, and have the model intelligently choose to output a JSON object containing arguments to call those functions."
OpenAI explained: "Users can send one message requesting multiple actions, such as 'open the car window and turn off the A/C', which would previously require multiple roundtrips with the model."
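That flow can be sketched locally. In the snippet below, the two tool schemas follow the JSON Schema format the API documents, but the function names (set_window, set_ac) are hypothetical stand-ins for a real integration, and the sample tool calls are hand-written to mirror the shape of a model response, not actual model output.

```python
import json

# Two hypothetical tool definitions an app might expose to the model,
# in the JSON Schema format used by the API's tools parameter.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "set_window",  # hypothetical in-car API
            "parameters": {
                "type": "object",
                "properties": {
                    "position": {"type": "string", "enum": ["open", "closed"]}
                },
                "required": ["position"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "set_ac",  # hypothetical in-car API
            "parameters": {
                "type": "object",
                "properties": {"on": {"type": "boolean"}},
                "required": ["on"],
            },
        },
    },
]

# Local implementations the model's JSON arguments get dispatched to.
def set_window(position):
    return f"window {position}"

def set_ac(on):
    return "A/C on" if on else "A/C off"

DISPATCH = {"set_window": set_window, "set_ac": set_ac}

# Hand-written example of the tool calls a single model turn might emit
# for "open the car window and turn off the A/C" -- two calls, one roundtrip.
sample_tool_calls = [
    {"function": {"name": "set_window", "arguments": '{"position": "open"}'}},
    {"function": {"name": "set_ac", "arguments": '{"on": false}'}},
]

def run_tool_calls(tool_calls):
    """Parse each tool call's JSON arguments and invoke the matching function."""
    return [
        DISPATCH[call["function"]["name"]](**json.loads(call["function"]["arguments"]))
        for call in tool_calls
    ]

print(run_tool_calls(sample_tool_calls))  # → ['window open', 'A/C off']
```

The point of the single-message example above is that both calls arrive in one model response, so the application can execute them together instead of making a roundtrip per action.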
The release comes amid heightened competition for enterprise LLM users. This week Germany's Aleph Alpha raised $500 million to continue building out its "sovereign" AI capabilities in Europe. Enterprise users are also exploring build rather than buy options using packages of open source tools like those recently released by Docker, Neo4j and Ollama, while Amazon is ramping up publicity around its Bedrock offering, a fully managed service to help enterprises build generative AI applications on a range of foundation models.