Bob Head · Posted on Jan 24

Deepseek-r1 model of Ollama

#ai #llm #aiagent #webdev

Running the deepseek-r1 reasoning model locally with Ollama: the results are pretty slow.
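For reference, here is a minimal sketch of querying a local Ollama server for deepseek-r1 from Python, assuming the model has already been pulled (e.g. with `ollama pull deepseek-r1`) and the server is on its default port 11434; the prompt text is just a placeholder:

```python
# Minimal sketch: ask a locally running Ollama server to generate with deepseek-r1.
# Assumes the default endpoint http://localhost:11434 and a pulled deepseek-r1 tag.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1",
        "prompt": "Why is the sky blue? Think step by step.",  # placeholder prompt
        "stream": False,  # wait for the full response instead of streaming tokens
    },
    timeout=600,  # reasoning models can take several minutes on local hardware
)
resp.raise_for_status()
print(resp.json()["response"])
```

With `stream` set to `False` the call blocks until generation finishes, which makes the slowness of a reasoning model on local hardware very noticeable; switching to streaming at least shows tokens as they arrive.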