Today, OpenAI announced its first commercial offering beyond ChatGPT Plus: ChatGPT Enterprise, a business-oriented version of ChatGPT. The service comes with the following features:
* Usage insights (an analytics dashboard for adoption and usage)
* Single sign-on (SSO) and bulk member management
* Unlimited, higher-speed GPT-4 access (no usage caps)
* Customer data is not used or stored to train the models
* Workspace admins (enterprise super-users) can define how long conversations are retained
* 32K token context window by default
* Access to Code Interpreter
* Shared chat templates
* Free API credits if you need to extend OpenAI into a custom solution
OpenAI also mentioned some services that will be released at a later stage; no ETA has been given yet:
* Data analysis services
* Connecting company data to the service using grounding
* An offering for smaller teams
There are also some things you should consider:
* The service is still hosted only in the US (consider the added latency for API calls). If you need to provide the service from within the EU using the same models, you would still need to use Azure OpenAI as an alternative.
* Self-hosting is not supported: you cannot run this within your own environment; it is hosted by OpenAI.
* If you want to integrate this with your own data, you will still need to build your own ecosystem around it. (This is also where we see the most use cases for LLMs.)
* Pricing is unclear, which means the cost can be quite significant. OpenAI should also provide a SaaS-based pricing model for those offering a direct service.
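Since the service is US-hosted, it can help to measure round-trip latency from your own region before committing to direct API calls. A minimal sketch; the request callable here is a stand-in that simulates network delay, and you would swap in a real call to your OpenAI or Azure OpenAI endpoint:

```python
import time

def time_request(send_request):
    """Measure the round-trip latency of a single request callable, in ms."""
    start = time.perf_counter()
    result = send_request()
    elapsed_ms = (time.perf_counter() - start) * 1000
    return result, elapsed_ms

# Stand-in request simulating ~50 ms of delay; replace the lambda with an
# actual API call (e.g. a chat completion request) to get real numbers.
_, latency_ms = time_request(lambda: time.sleep(0.05))
print(f"round-trip: {latency_ms:.0f} ms")
```

Running this against both a US endpoint and an EU-hosted Azure OpenAI deployment gives you a concrete basis for the hosting decision.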
I should also mention that you have an alternative in microsoft/chat-copilot (github.com), which can be used against Azure OpenAI and uses Azure AD as the authentication backend. The downside is that it does not provide the same insights, but it means we can use the EU as a hosting location.
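To make the earlier point about building your own ecosystem concrete: grounding typically means retrieving relevant company documents and injecting them into the prompt before calling the model. A minimal sketch, with a naive word-overlap retriever standing in for a real vector search; the documents and scoring function are illustrative assumptions, not part of any product:

```python
def score(query: str, doc: str) -> int:
    """Naive relevance score: number of lowercase words shared with the query."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def build_grounded_prompt(query: str, docs: list[str]) -> str:
    """Pick the most relevant document and prepend it to the prompt as context."""
    best = max(docs, key=lambda d: score(query, d))
    return f"Context:\n{best}\n\nQuestion: {query}"

# Illustrative stand-ins for internal company documents.
docs = [
    "Expense reports are filed through the finance portal.",
    "Vacation requests go through the HR self-service system.",
]
print(build_grounded_prompt("How do I file an expense report?", docs))
```

In production you would replace the overlap score with embedding similarity and send the resulting prompt to the model API; the principle stays the same.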