A local alternative to Microsoft Copilot in Word
Large language models (LLMs) have been advancing rapidly over the past few years. While bigger and more powerful versions are being developed in the cloud, there’s a growing interest in running capable models locally, on your own hardware and without recurring inference costs.
Demo 1: Using AnythingLLM in Microsoft Word
Demo 2: Using LM Studio in Microsoft Word (local model: Llama 3.2)
Demo 3: Using Transformer Lab in Microsoft Word (local model: Phi-4)
Demo 4: Using Ollama in Microsoft Word (local model: Llama 3.2)
Demo 5: Using llama.cpp in Microsoft Word (local model: gemma-2b)
Demo 6: Using LocalAI in Microsoft Word (local model: Llama 3.2)
Demo 7: Using KoboldCpp in Microsoft Word (local model: mistral 2.2)
Demo 8: Using Xinference in Microsoft Word (local model: Llama 2)
Demo 9: Using OpenLLM in Microsoft Word (local model: llama3.2:1b)
Demo 10: Using LiteLLM in Microsoft Word (gemini-1.5-flash, if a cloud-based model is preferred)
Demo 11: Empowering Your Team with Phi-4 in Microsoft Word within Your Intranet
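What ties these demos together is that most of these servers (and, in LiteLLM’s case, the gateway) expose an OpenAI-compatible chat-completions endpoint, so a single Word add-in can target any of them simply by changing the base URL and model name. The sketch below illustrates that pattern with Office.js: read the current selection, post it to a local endpoint, and insert the reply back into the document. The base URL, port, and model name are assumptions for illustration (LM Studio’s default port and a Llama 3.2 model), not the exact configuration used in the demos above.

```typescript
// Minimal sketch: send the current Word selection to a local, OpenAI-compatible
// chat-completions endpoint and append the model's reply after the selection.
// Assumptions (not taken from the demos): the server listens on
// http://localhost:1234/v1 (LM Studio's default) and serves a model named
// "llama-3.2-1b-instruct". Adjust BASE_URL and MODEL for Ollama, llama.cpp,
// LocalAI, KoboldCpp, Xinference, OpenLLM, or a LiteLLM gateway.

const BASE_URL = "http://localhost:1234/v1";   // hypothetical local endpoint
const MODEL = "llama-3.2-1b-instruct";         // hypothetical model name

async function rewriteSelectionWithLocalLLM(): Promise<void> {
  await Word.run(async (context) => {
    // Read the text the user has selected in the document.
    const selection = context.document.getSelection();
    selection.load("text");
    await context.sync();

    // Call the local server through its OpenAI-compatible chat API.
    const response = await fetch(`${BASE_URL}/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: MODEL,
        messages: [
          { role: "system", content: "You are a writing assistant inside Microsoft Word." },
          { role: "user", content: `Improve the following text:\n\n${selection.text}` }
        ]
      })
    });
    const data = await response.json();
    const reply: string = data.choices?.[0]?.message?.content ?? "";

    // Insert the reply directly after the selected text.
    selection.insertText(`\n${reply}`, Word.InsertLocation.after);
    await context.sync();
  });
}
```

For the intranet scenario in Demo 11, the only change would be pointing BASE_URL at the shared Phi-4 server’s address on your network instead of localhost, so the whole team can use the same add-in against one machine.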