Selecting an AI Model
In this short guide, we'll walk you through picking an AI model on the Capacity platform. This decision is essential because it determines the capabilities and limitations of your bot.
Getting Started
- Accessing the Capacity Console: To begin, log in to the Capacity Console.
- Exploring the AI Settings: Once logged in, navigate to the AI settings option, which lists the AI models available to you.
Understanding the AI Models
Before you make your decision, it's important to understand the capabilities of the models available to you. The model you choose is used whenever generative AI runs inside your console, such as drafting with Autopilot or powering your chatbot through Guided Conversations.
Capacity's Internal AI Model
Our in-house foundational AI model, built on a 70-billion-parameter Large Language Model (LLM), offers a robust and private framework. Its credentials include SOC 2 and HIPAA certification, and it supports a context window of up to 128k tokens.
Capacity's External AI Model – Powered by GPT-4o from OpenAI
This model is powerful, with a 128k-token context window. We have signed a BAA with OpenAI pursuant to which OpenAI may create, receive, maintain, or transmit PHI from Capacity customers. Our OpenAI access runs through its “Zero Data Retention” transmission and processing solution, which is designed to avoid retention of any PHI. It's a strong choice for workloads that rely heavily on LLM functionality.
Each of these models has unique offerings that cater to different user needs. Be sure to consult your security team to make the right choice.
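If you want a rough sense of how much of that 128k-token context window a given document or prompt will consume, a quick estimate can help before you build it into a workflow. Below is a minimal Python sketch using OpenAI's tiktoken library; the `o200k_base` encoding and the example file name are assumptions (the encoding approximates GPT-4o-class tokenization, and Capacity's internal model may tokenize differently), so treat the count as an estimate rather than an exact figure.

```python
# Rough token-budget check for a document before using it in an LLM workflow.
# Assumes the tiktoken package is installed (pip install tiktoken).
import tiktoken

CONTEXT_WINDOW = 128_000  # token limit cited for both models above


def estimate_tokens(text: str, encoding_name: str = "o200k_base") -> int:
    """Estimate the number of tokens in `text` using a GPT-4o-style encoding."""
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))


if __name__ == "__main__":
    # Example file name is a placeholder; point this at your own document.
    with open("knowledge_base_article.txt", "r", encoding="utf-8") as f:
        document = f.read()

    tokens = estimate_tokens(document)
    pct = tokens / CONTEXT_WINDOW * 100
    print(f"Estimated tokens: {tokens} ({pct:.1f}% of the 128k context window)")
```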
Selecting the AI Model
- In the AI settings, select the model you have chosen.
- Click "Save" to apply your selection.
- You can verify how the model performs in the Help Desk AI toolkit, AI Inquiry Generator, Guided Conversations, and Autopilot.
While the Capacity platform offers you the flexibility to switch AI models after you have initially selected one, keep in mind that this switch can lead to significant challenges.
The type of AI model you choose has profound impacts on how your workflows, guided conversations, and overall interface operate. Changing models may necessitate reconfiguring existing workflows and guided conversations to adapt to the capabilities and limitations of the new model.
For instance, if you initially selected Capacity's Internal AI Model and later decided to change to Capacity's External AI Model, you may need to adjust your workflows to account for differences in how each model interprets prompts and generates responses. These modifications can be time-consuming, disrupt the smooth functioning of your system, and throw previously configured preferences out of alignment.
Manual adjustments also introduce a risk of errors and inconsistencies, which can degrade platform performance. Consistency and stability in model selection are key to smooth platform performance and an optimal user experience.
In short, changing the model later can mean reconfiguring your existing workflows and guided conversations, so it's prudent to stick with the model you choose from the beginning.
You've Selected Your AI Model—What Comes Next?
Congratulations! Having successfully selected your AI model, you've taken a significant step towards customizing and enhancing your Capacity platform's capabilities. However, the journey doesn't end here.
Next, you'll need to prepare your documents for integration with your selected model. This process involves organizing your documents into themes or specific folders for effective indexing and appropriate audience delivery. For instance, you might want to create folders for different departments such as HR, Sales, Operations, and Customer Support. This robust organization facilitates intelligent document retrieval and improves your Capacity platform’s performance.
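As a purely illustrative sketch of that kind of organization, the Python snippet below copies local files into department folders based on filename keywords before you upload them. The folder names, keyword lists, and paths are assumptions for the example, not part of the Capacity platform itself; adapt them to however your team actually labels its content.

```python
# Illustrative sketch: group documents into department folders by filename keywords
# before uploading them to the platform. Folder names and keywords are examples only.
from pathlib import Path
import shutil

DEPARTMENT_KEYWORDS = {
    "HR": ["benefits", "onboarding", "pto", "handbook"],
    "Sales": ["pricing", "proposal", "quota", "pipeline"],
    "Operations": ["logistics", "vendor", "inventory"],
    "Customer Support": ["faq", "troubleshooting", "escalation"],
}


def sort_documents(source_dir: str, target_dir: str) -> None:
    """Copy each file into the first department folder whose keywords match its name."""
    source, target = Path(source_dir), Path(target_dir)
    for doc in source.iterdir():
        if not doc.is_file():
            continue
        name = doc.name.lower()
        # Default bucket for anything that doesn't match a department keyword.
        department = "Unsorted"
        for dept, keywords in DEPARTMENT_KEYWORDS.items():
            if any(keyword in name for keyword in keywords):
                department = dept
                break
        destination = target / department
        destination.mkdir(parents=True, exist_ok=True)
        shutil.copy2(doc, destination / doc.name)


if __name__ == "__main__":
    # Example directory names are placeholders.
    sort_documents("raw_documents", "organized_documents")
```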