Use Mistral Small 3 (24B) in Microsoft Word Locally

Seeking an alternative to Microsoft Copilot? Mistral has just released Mistral Small 3, and its instruction-tuned model performs exceptionally well, competing with open-weight models up to three times its size and even with the proprietary GPT-4o-mini. This holds across benchmarks for code, math, general knowledge, and instruction following.

One exciting prospect is integrating the 24B model (mistral-small-24b-instruct-2501) with Microsoft Word locally, eliminating any monthly fees. To see this in action, you can watch this brief demo video. For more examples of using local LLMs in Microsoft Word without incurring inference fees, check out our YouTube channel at @GPTLocalhost!
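To give a sense of what "running locally" looks like under the hood, here is a minimal sketch of querying a locally served Mistral Small 3 model through an OpenAI-compatible endpoint. The server URL, port, and model tag are assumptions (an Ollama-style setup is shown); the post itself does not prescribe a specific serving stack, so adjust these to match whatever local runtime you use.

```python
# Minimal sketch: ask a locally hosted Mistral Small 3 model to rewrite text,
# the same kind of request a Word add-in would send on your behalf.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # assumed local OpenAI-compatible server
    api_key="not-needed",                  # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="mistral-small",  # assumed local tag for mistral-small-24b-instruct-2501
    messages=[
        {"role": "user", "content": "Rewrite this sentence more concisely: ..."},
    ],
)

print(response.choices[0].message.content)
```

Because no request leaves your machine, there are no per-token inference fees, which is the point of pairing a local model with Word.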
