
Tagged "LLMs"

  1. in Articles
    Running Ollama without a GPU
    You can run Ollama on an older device, but responses will be slow, low quality, or both.
  2. in Articles
    Get started prompt engineering with local LLMs
    Ollama makes it easy to run LLMs locally and provides experimental compatibility with OpenAI's APIs.