Managing local models with Ollama
C:\Users\zc459\AppData\Local\Programs\Ollama\ollama app
Persistently preload a model

qwen2:
curl http://192.168.50.3:18000/api/generate -d '{"model": "qwen2:7b", "keep_alive": -1}'

llama3 / llama3.1:
curl http://192.168.50.3:18000/api/generate -d '{"model": "wangshenzhi/llama3-8b-chinese-chat-ollama-q8:latest", "keep_alive": -1}'
curl http://192.168.50.3:18000/api/generate -d '{"model": "wangshenzhi/llama3.1_8b_chinese_chat:latest", "keep_alive": -1}'

deepseek:
curl http://192.168.50.3:18000/api/generate -d '{"model": "deepseek-r1:14b", "keep_alive": -1}'

Reranker model:
curl http://192.168.50.3:18000/api/generate -d '{"model": "linux6200/bge-reranker-v2-m3", "keep_alive": -1}'

Unload a model (keep_alive 0 evicts it immediately):
curl http://192.168.50.3:18000/api/generate -d '{"model": "qwen2:7b", "keep_alive": 0}'

Check GPU status:
nvidia-smi

Ollama default model storage paths:
Windows: C:\Users\%username%\.ollama\models
Linux: /usr/share/ollama/.ollama/models
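The preload/unload requests above differ only in the keep_alive value of the JSON body (-1 keeps the model resident indefinitely, 0 unloads it). A minimal sketch for sanity-checking those payloads locally before POSTing them; the host and port are the ones used in the commands above, and python3 is assumed to be available:

```shell
# Preload payload: keep_alive -1 keeps the model loaded indefinitely
preload='{"model": "qwen2:7b", "keep_alive": -1}'
# Unload payload: keep_alive 0 evicts the model immediately
unload='{"model": "qwen2:7b", "keep_alive": 0}'

# Verify both are well-formed JSON before sending
echo "$preload" | python3 -m json.tool
echo "$unload"  | python3 -m json.tool

# POST to the generate endpoint (uncomment to actually send):
# curl http://192.168.50.3:18000/api/generate -d "$preload"
# curl http://192.168.50.3:18000/api/generate -d "$unload"
```

Validating the body first avoids the silent failures that a stray quote or comma in a hand-edited curl command can cause.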