Tagged "Ollama"

  1. in Articles
    Running Ollama without a GPU
    You can run Ollama on an older device without a GPU, but responses will be slow and/or low quality.
  2. in Articles
    Use Ollama with the official Python library
    Get started working with AI, Ollama, and large language models in four steps
  3. in Articles
    Get started prompt engineering with local LLMs
    Ollama makes it easy to run LLMs locally and provides experimental compatibility with OpenAI's APIs