1. Installing Ollama on the server
$ curl -fsSL https://ollama.com/install.sh | sh
>>> Cleaning up old version at /usr/local/lib/ollama
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> The Ollama API is now available at 127.0.0.1:11434.
>>> Install complete. Run "ollama" from the command line.
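As an optional sanity check, the CLI version and the service status can be inspected; exact output will vary with the installed version:
$ ollama --version
$ systemctl status ollama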
2. Running Ollama
$ ollama run deepseek-r1:1.5b
>>> /bye
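Besides the interactive session, ollama run also accepts a one-off prompt as a command-line argument; the prompt below is only an illustration:
$ ollama run deepseek-r1:1.5b "Why is the sky blue?"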
Multi-line input
For multi-line input, wrap the text in """:
>>> """Hello,
... world!
... """
Listing downloaded models
$ ollama list
NAME                ID              SIZE      MODIFIED
deepseek-r1:1.5b    e0979632db5a    1.1 GB    2 hours ago
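To inspect a downloaded model's details (architecture, parameters, prompt template), ollama show can be used, e.g.:
$ ollama show deepseek-r1:1.5b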
ollama create builds a model from a Modelfile
$ ollama create mymodel -f ./Modelfile
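A minimal Modelfile might look like the sketch below; FROM is required, while the PARAMETER and SYSTEM values here are purely illustrative:
# Modelfile
FROM deepseek-r1:1.5b
PARAMETER temperature 0.7
SYSTEM """
You are a concise assistant.
"""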
Deleting a model
$ ollama rm deepseek-r1:1.5b
Copying a model
$ ollama cp deepseek-r1:1.5b my-model
Multimodal models
$ ollama run llava "What's in this image? ~/people.png"
3. Connecting to the server from a client
On the server:
$ curl http://localhost:11434/api/generate -d '{"model":"deepseek-r1:1.5b"}'
{"model":"deepseek-r1:1.5b","created_at":"2025-08-09T11:20:03.676711152Z","response":"","done":true,"done_reason":"load"}
On the client:
$ curl http://49.51.197.197:11434/api/generate -d '{"model":"deepseek-r1:1.5b"}'
curl: (7) Failed to connect to 49.51.197.197 port 11434 after 277 ms: Couldn't connect to server
The request fails.
3.1 Fixing the error
- First, check the network connection
$ ping -c 4 49.51.197.197
PING 49.51.197.197 (49.51.197.197) 56(84) bytes of data.
64 bytes from 49.51.197.197: icmp_seq=1 ttl=44 time=304 ms
64 bytes from 49.51.197.197: icmp_seq=2 ttl=44 time=301 ms
64 bytes from 49.51.197.197: icmp_seq=3 ttl=44 time=289 ms
64 bytes from 49.51.197.197: icmp_seq=4 ttl=44 time=280 ms

--- 49.51.197.197 ping statistics ---
4 packets transmitted, 4 received, 0% packet loss, time 3004ms
rtt min/avg/max/mdev = 280.205/293.788/304.213/9.659 ms
- Open port 11434 in the firewall (see the example below)
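The exact commands depend on which firewall the server uses; assuming ufw (Debian/Ubuntu) or firewalld (RHEL/CentOS):
$ sudo ufw allow 11434/tcp
$ sudo firewall-cmd --permanent --add-port=11434/tcp && sudo firewall-cmd --reload
On a cloud server, the provider's security-group rules must also allow inbound traffic on port 11434.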
- Check which address the port is listening on
$ netstat -tuln | grep 11434
tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN
The output shows 127.0.0.1:11434: the service only accepts connections from the server itself, so the configuration needs to be changed.
- Inspect the current configuration:
$ sudo systemctl cat ollama.service
# /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
[Install]
WantedBy=default.target
- Edit the configuration
$ sudo vi /etc/systemd/system/ollama.service
Add the following line under the [Service] section so Ollama listens on all interfaces:
Environment="OLLAMA_HOST=0.0.0.0"
- Reload and restart the service
$ sudo systemctl daemon-reload
$ sudo systemctl restart ollama
- Verify the configuration
$ sudo systemctl cat ollama.service
# /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_HOST=0.0.0.0"
[Install]
WantedBy=default.target
$ netstat -tuln | grep 11434
tcp6 0 0 :::11434 :::* LISTEN
- Verify from the client
$ curl http://49.51.197.197:11434/api/generate -d '{"model":"deepseek-r1:1.5b"}'
{"model":"deepseek-r1:1.5b","created_at":"2025-08-09T13:54:49.560955998Z","response":"","done":true,"done_reason":"load"}
The request now succeeds: the server is reachable from the client.
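With the port reachable, a full chat-style request can also be sent from the client; the /api/chat endpoint takes a messages array, and "stream": false returns one complete JSON object (the prompt text is only an example):
$ curl http://49.51.197.197:11434/api/chat -d '{
  "model": "deepseek-r1:1.5b",
  "messages": [{"role": "user", "content": "Hello, introduce yourself briefly."}],
  "stream": false
}'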