A local alternative to Microsoft Copilot in Word
Large language models (LLMs) have been rapidly advancing over the past few years. While bigger and more powerful versions are being developed in the cloud, there’s a growing […]
Demo 1: Using AnythingLLM in Microsoft Word
Demo 2: Using LM Studio in Microsoft Word (local model: Llama 3.2)
Demo 3: Using Transformer Lab in Microsoft Word (local model: Phi-4)
Demo 4: Using Ollama in Microsoft Word (local model: Llama 3.2)
Demo 5: Using llama.cpp in Microsoft Word (local model: gemma-2b)
Demo 6: Using LocalAI in Microsoft Word (local model: Llama 3.2)
Demo 7: Using KoboldCpp in Microsoft Word (local model: mistral 2.2)
Demo 8: Using Xinference in Microsoft Word (local model: Llama 2)
Demo 9: Using OpenLLM in Microsoft Word (local model: llama3.2:1b)
Demo 10: Using LiteLLM in Microsoft Word (gemini-1.5-flash, if a cloud-based model is preferred)
Demo 11: Empowering Your Team with Phi-4 in Microsoft Word within Your Intranet
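All of these demos rely on the same basic pattern: the Word add-in sends text to a locally hosted server that exposes an OpenAI-compatible API and then writes the model's reply back into the document. As a rough illustration only, here is a minimal Office.js sketch of that round trip. It is not the add-in's actual code: the base URL assumes LM Studio's default local server on port 1234 (Ollama, llama.cpp, LocalAI, and the others listen on different ports), and the model identifier and the completeSelection function are placeholders.

// Runs inside a Word add-in where office.js is already loaded (it provides the Word global).
// BASE_URL and MODEL are assumptions for this sketch, not values taken from the article.
const BASE_URL = "http://localhost:1234/v1";   // e.g. LM Studio; Ollama defaults to 11434
const MODEL = "llama-3.2-1b-instruct";         // placeholder model identifier

// Send the current selection to the local model and insert the reply after it.
async function completeSelection(): Promise<void> {
  await Word.run(async (context) => {
    const selection = context.document.getSelection();
    selection.load("text");
    await context.sync();

    // Any OpenAI-compatible backend accepts this /chat/completions request shape.
    const response = await fetch(`${BASE_URL}/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: MODEL,
        messages: [{ role: "user", content: selection.text }],
      }),
    });
    const data = await response.json();
    const reply: string = data.choices[0].message.content;

    // Write the model's answer back into the document, just after the selection.
    selection.insertText(reply, Word.InsertLocation.after);
    await context.sync();
  });
}

In practice only the base URL and model name change from one backend to the next, which is why the same add-in can drive everything from Ollama to KoboldCpp to a cloud model routed through LiteLLM. Depending on the backend, you may also need to enable CORS on the local server before a browser-hosted add-in can reach it.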