Microsoft Copilot is a security and orchestration layer that enables the company to integrate OpenAI’s technology into its suite of apps.
An essential thing to understand is that you can't simply take OpenAI's models and push them into a product; you need to create an intermediate layer (a middle layer) between the application and the user, which sets the context to make the AI relevant.
Instead, Microsoft built an additional layer that works as an intermediary between ChatGPT and the Bing AI search experience.
This layer was built for safety and constraints but also to ground the model by leveraging the index Bing has for search.
Like Bing AI’s Prometheus, this works as a layer that intermediates between OpenAI’s models and search.
The Copilot System is built as a middle layer able to integrate ChatGPT into Microsoft’s apps.
This copilot is made of three elements:
1. Microsoft 365 apps (Word, Excel, PowerPoint, Teams and more).
2. Microsoft Graph (all content and context, like email, files, meetings, chat, and calendars). This is a critical component to ground the model, thus translating the user’s prompt into something more relevant, which makes the app execute commands without hallucinating.
3. An LLM: a large language model that works as a creative engine, able to parse text and data and integrate them into the apps.
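The three elements above can be sketched as a simple pipeline. This is an illustrative sketch only: the function names (`fetch_graph_context`, `ground_prompt`, `call_llm`) and the static data are my assumptions, not Microsoft's actual APIs.

```python
# Illustrative sketch of a Copilot-style middle layer.
# All names and data here are hypothetical, not Microsoft's real APIs.

def fetch_graph_context(user: str, topic: str) -> dict:
    """Stand-in for a Microsoft Graph lookup: returns the user's
    content and context (files, meetings) related to the topic."""
    # Hypothetical static data in place of a real Graph call.
    return {"user": user, "topic": topic,
            "files": ["Q3_sales.xlsx"], "meetings": ["Sales review"]}

def ground_prompt(prompt: str, context: dict) -> str:
    """Pre-processing ('grounding'): enrich the raw prompt with
    context so the model answers from facts, not guesses."""
    return (f"{prompt}\n\nUse ONLY this context and state only facts:\n"
            f"files={context['files']}, meetings={context['meetings']}")

def call_llm(grounded_prompt: str) -> str:
    """Stand-in for the LLM call (e.g. an OpenAI model)."""
    return f"[model answer based on: {grounded_prompt[:40]}...]"

def copilot_pipeline(user: str, prompt: str, topic: str) -> str:
    context = fetch_graph_context(user, topic)   # 1. Graph lookup
    grounded = ground_prompt(prompt, context)    # 2. grounding
    return call_llm(grounded)                    # 3. LLM generation

answer = copilot_pipeline("alice", "Summarize our Q3 sales", "sales")
```

The key design point is that the app never sends the raw user prompt to the model; the middle layer always grounds it first.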
Let me explain this in detail.
If you’re trying to build an app that potentially reaches millions of users or a tool intended for enterprise use cases, this layer is critical to making the app valuable!
It starts with a prompt from you in an app.
This prompt can be about a user asking PowerPoint to generate a slide about any topic.
Yet, that prompt doesn’t directly translate into a command for ChatGPT/OpenAI.
Instead, Microsoft’s copilot takes over, preprocessing the prompt through an approach called grounding.
Put in simple words, (as Microsoft explains) grounding improves the quality of the prompt, so you get relevant and actionable answers.
Take the case of you prompting Microsoft’s Excel to tell you how your company’s sales are doing relative to competitors; if ChatGPT were unleashed without the Copilot System, it would probably make things up.
Instead, Microsoft’s Copilot System intermediates the prompt, telling ChatGPT to access the context (your sales data) so it provides a relevant, accurate answer rather than making things up!
How would this work in practice?
Let me show you how we used a similar approach to build a web app.
Take the case of a user who can talk to an Excel Spreadsheet to ask it questions about the data.
The user might use a prompt like “give me a comparison of sales data between us and our competition.”
That seems a good prompt, doesn’t it?
Well, yes, but processed as it is, it can still generate an inaccurate answer.
Microsoft’s Copilot System translates the prompt into something like
“Access my sales data to provide a comparison respective to competitors, but make sure you use the context in a way that is relevant and accurate, stating only facts!”
This is the prompt that is passed to ChatGPT to provide a secure and accurate answer!
Below is what the process looks like. In short, it builds on the experience you already have with ChatGPT or Bing chat.
Copilot takes the response from the LLM and post-processes it. This post-processing includes additional grounding calls to the graph.
Thanks to the graph, Microsoft can perform checks, security, compliance and privacy reviews, and command generation.
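The post-processing step can be sketched as a set of checks on the model's draft answer before it reaches the app. The check functions below are illustrative assumptions, crude stand-ins for the grounding, compliance, and privacy reviews described above.

```python
# Illustrative post-processing: validate the LLM's draft response
# before it reaches the app. Check names are hypothetical.

def check_grounding(response: str, allowed_facts: list) -> bool:
    """Crude grounding check: every required figure must appear in
    the response (a stand-in for a second Graph call)."""
    return all(fact in response for fact in allowed_facts)

def redact_private_data(response: str, private_terms: list) -> str:
    """Toy privacy review: mask terms the user may not share."""
    for term in private_terms:
        response = response.replace(term, "[REDACTED]")
    return response

def post_process(response: str, allowed_facts: list,
                 private_terms: list) -> str:
    if not check_grounding(response, allowed_facts):
        return "Sorry, I could not verify that answer against your data."
    return redact_private_data(response, private_terms)

draft = "Q2 revenue was 135000, booked under project Falcon."
out = post_process(draft, ["135000"], ["project Falcon"])
```

Only a response that passes these checks is turned into app commands and shown to the user.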
A good chunk of the intermediation Microsoft does to make ChatGPT suitable for use in its apps is to “reframe” the user’s prompt, grounding it with context (through the Graph) so that the prompt is changed in a way that makes it more relevant.
In short, the user doesn’t prompt the apps directly; Microsoft’s Copilot intermediates, translating the user’s prompt into a more relevant one. That is most of the grounding Microsoft does through its Copilot.
Sorry if I got too technical, but that is how you build a middle-layer application!
This process is critical to building both consumer-level and enterprise-level applications.
- Microsoft Copilot is a security and orchestration layer that integrates OpenAI’s technology into Microsoft’s suite of apps.
- An intermediate layer is created between the user’s application and ChatGPT to set context and make the AI relevant.
- Microsoft built an additional layer to integrate ChatGPT into Bing search and Microsoft 365 apps for safety, constraints, and grounding.
- The Copilot System consists of three elements: Microsoft 365 apps, Microsoft Graph, and LLM (a creative engine).
- Grounding is used to improve prompt quality by pulling relevant and accurate context from the Microsoft Graph.
- The Copilot System translates user prompts into more relevant ones, ensuring secure and accurate responses.
- Microsoft’s intermediation makes ChatGPT suitable for use in its apps, ensuring context-based and accurate results.
- The process is critical for building consumer-level and enterprise-level applications.