Capacity's Approach to AI
Revised and effective: January 21, 2025
At Capacity, we are committed to helping teams do their best work. Since our founding in 2017, and through our acquisitions, we have focused on using artificial intelligence (AI) to empower people to work more efficiently and better serve customers. As the AI field rapidly evolves, Capacity will remain grounded in our company values and stay true to our established guiding principles for the use of AI.
Capacity is proud to be AI-native. From day one, we built our platform around AI, rather than retrofitting AI onto dated technology.
We think it is important to define some terms before we explain how we use AI in the Capacity platform.
Machine learning (ML): techniques for teaching software to improve its performance on specific tasks based on experience, without explicit programming. LLMs and GPT are considered part of ML.
Artificial Intelligence (AI) is a broad term that encompasses various techniques, algorithms, and approaches used to enable software to perform tasks that typically require human intelligence. AI used by Capacity includes:
Natural language processing (NLP): the ability to understand and generate human language, allowing AI systems to interact with users, analyze text data, or translate between languages. NLP is concerned with processing and manipulating text or speech data so that computers can understand and generate human language.
Natural language understanding (NLU): a subset of NLP that matches against the entire context of words, phrases, paragraphs, and documents to determine what data should be used in a response. NLU delves deeper into the meaning and context behind language, enabling more sophisticated interactions between AI systems and users.
Large language models (LLMs): large models that make language predictions based on training on very large data sets of collected text. An LLM can predict each word in a translation from English to French, or the best words to summarize a meeting recording.
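To give a rough feel for the "prediction from training text" idea, the sketch below trains a toy bigram model, a deliberately simplified stand-in for an LLM. The corpus and function names are invented for illustration and bear no relation to any production model:

```python
from collections import Counter, defaultdict

def train_bigram(corpus: str) -> dict:
    """Count which word follows which in the training text
    (a drastically simplified stand-in for LLM training)."""
    words = corpus.lower().split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model: dict, word: str) -> str:
    """Return the word most often seen following `word` during training."""
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else ""

model = train_bigram("the cat sat on the mat and the cat ran")
print(predict_next(model, "the"))  # prints "cat" ("cat" follows "the" most often)
```

A real LLM replaces these word-pair counts with billions of learned parameters, but the core task, predicting the next token from what came before, is the same.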
Generative AI or generative pre-trained transformers (GPT): a subset of AI referring to algorithms that create new, human-like outputs based on patterns and structures learned from input data, including LLMs as well as models fine-tuned for specific tasks.
Retrieval-augmented generation (RAG): a process applied to LLMs to make their outputs more relevant in specific contexts. RAG allows an LLM to access and reference information outside its own training data, such as an organization’s specific knowledge base (in a database, through an API call to an app, or on a webpage), before generating a response, and its outputs can include citations.
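The RAG flow described above can be sketched in a few lines. This is a toy illustration: the knowledge base, the word-overlap scoring, and the prompt format are all invented for the example, and a production system would use learned embeddings for retrieval and send the prompt to an actual LLM:

```python
def retrieve(query: str, knowledge_base: list[str], k: int = 1) -> list[str]:
    """Rank knowledge-base passages by word overlap with the query
    (real systems use learned embeddings for this step)."""
    q_words = set(query.lower().split())
    ranked = sorted(knowledge_base,
                    key=lambda doc: len(q_words & set(doc.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_prompt(query: str, knowledge_base: list[str]) -> str:
    """Prepend the retrieved passage so the LLM answers from it and can cite it."""
    context = retrieve(query, knowledge_base)[0]
    return (f"Context: {context}\n"
            f"Question: {query}\n"
            f"Answer using only the context above, citing it.")

kb = [
    "Our office is open from 9am to 5pm on weekdays.",
    "Employees accrue two weeks of vacation per year.",
]
print(build_prompt("how much vacation do employees get", kb))
```

The key point is that the model's answer is grounded in the retrieved passage rather than in whatever its training data happened to contain.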
Capacity uses various AI tools and models to power our platform. Since inception, we've used a mixture of open-source and in-house NLP models that allow our platform (chat, SMS, email) to understand the intent of incoming requests and messages and provide more relevant answers to users. Capacity continuously improves and grows our customers’ knowledge bases with state-of-the-art, built-in ML feedback systems. Capacity has incorporated NLU techniques to match against the entire context of words, phrases, paragraphs, and documents to determine what data should be used in a response, and continues to use parts of our NLP technology to enhance the success of our more recent NLU-based approach. Capacity has reviewed these tools and models to confirm that they perform as expected, and all of them are hosted by us (in other words, Customer Data is not processed outside of Capacity’s or the customer’s private cloud tenant environment when utilizing these ML, NLP, and NLU models).
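As a simple illustration of what "understanding the intent of incoming requests" means, the sketch below matches a message to an intent by keyword overlap. The intents and keyword lists are invented for the example; production intent models are learned from data rather than hand-written rules:

```python
# Hypothetical intent labels and keywords, for illustration only.
INTENT_KEYWORDS = {
    "reset_password": {"password", "reset", "locked"},
    "billing": {"invoice", "charge", "billing", "refund"},
}

def match_intent(message: str) -> str:
    """Return the intent whose keywords best overlap the message,
    or 'unknown' if nothing matches."""
    words = set(message.lower().replace("?", "").split())
    best, best_score = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

print(match_intent("How do I reset my password?"))  # prints "reset_password"
```

Once an intent is identified, the platform can route the request to the most relevant answer in the knowledge base.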
Beginning in 2023, we integrated generative AI into the Capacity platform. LLMs are used in the platform to better understand the language of queries and to generate human-like text in the desired tone (casual or formal), such as a draft response to a customer’s end user or draft KnowledgeBase Q&A pairs.
Capacity utilizes several refined, self-hosted large language models (LLMs), each based on pre-trained open-source models. We host our LLMs on our own dedicated cloud instances in AWS and Microsoft Azure (or in the customer’s private cloud tenant deployment, if contractually agreed). Capacity also licenses certain enterprise models from OpenAI that are particularly well suited for scenarios demanding robust language processing capabilities. We license these in two ways. First, we have a direct ChatGPT Enterprise license with OpenAI, as well as a Business Associate Agreement ensuring zero data retention, which means that any data, inputs, or outputs generated by Capacity and its customers are not used to train OpenAI's models, nor do the OpenAI models learn from usage. ChatGPT Enterprise is SOC 2 compliant, and all conversations are encrypted in transit and at rest. Second, we utilize Azure OpenAI, which also does not use customer data to retrain models. When Azure OpenAI runs within our dedicated Microsoft Azure cloud instance, customers get the security capabilities of Microsoft Azure while running the same robust models that OpenAI offers. Accordingly, any fine-tuning and tailoring of LLMs to meet specific Capacity requirements is not distributed to other OpenAI clients or incorporated into the training of additional models.
The foundations of Capacity’s approach to Generative AI are as follows:
We do not sell customer data to train LLMs
We do not train our internal LLMs on customer data unless the customer requests that we do so in order to better tailor the model to its usage
We are compliant with HIPAA, GDPR, GLBA, and other applicable data privacy regulations
Our platform undergoes annual SOC 2 Type 2 and ISO 27001 audits by outside auditors
We conduct security awareness training for our team members on a regular basis
We perform an annual IT risk assessment evaluating technology and security risks
We welcome any questions or interest you have in AI, including GPT and LLMs.