Running an LLM on Windows or Mac using Ollama
One of the easiest ways to get started with running LLMs locally on your own machine is Ollama. Ollama is an open-source tool that exposes a local LLM inference API you can interact with, and it ships with a CLI you can use to chat with models in real time. You can also …
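As a minimal sketch of both interfaces, the commands below pull a model, chat with it from the CLI, and then query the local HTTP API with `curl`. The model name `llama3.2` is just an example; any model from the Ollama library works, and the API listens on port 11434 by default.

```shell
# Download a model from the Ollama library (model name is illustrative)
ollama pull llama3.2

# Start an interactive chat session in the terminal
ollama run llama3.2

# Or call the local inference API directly (default port 11434)
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The `"stream": false` flag asks the API to return one complete JSON response instead of streaming tokens as they are generated, which is often easier to work with in scripts.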