1. Set up a virtual environment
mkdir open_webui_pipelines
cd open_webui_pipelines
python -m venv py3119_env
call py3119_env\Scripts\activate
2. Download the service and install its dependencies
git clone https://github.com/open-webui/pipelines.git
cd pipelines
pip install -r requirements.txt
3. Run a simple example
copy .\examples\pipelines\providers\ollama_pipeline.py .\pipelines\
start.bat
Output like the following indicates that ollama_pipeline has been loaded:
INFO:root:Created subfolder: ./pipelines\ollama_pipeline
INFO:root:Created valves.json in: ./pipelines\ollama_pipeline
Loaded module: ollama_pipeline
INFO:root:Loaded module: ollama_pipeline
on_startup:ollama_pipeline
4. Edit and use the pipeline file
- Edit the file to point to your own Ollama model: in the copy you just made, \pipelines\ollama_pipeline.py, lines 32 and 33 hold the Ollama server address and model name. The original code is:
OLLAMA_BASE_URL = "http://localhost:11434"
MODEL = "llama3"
- Restart the service after editing.
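For instance, if your Ollama instance runs on a different host and serves a different model, the two lines might become (the host and model name below are purely illustrative):

```python
# Illustrative values only -- substitute your own server and model
OLLAMA_BASE_URL = "http://192.168.1.10:11434"
MODEL = "qwen2"
```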
- Configure: add an API connection (the URL is the pipelines service's port). Go to the Open WebUI Admin Panel > Settings > Connections and enable the [OpenAI API] toggle. Click + to add a new connection. Set the API URL to http://localhost:9099 and the API key to 0p3n-w3bu! (the default; required).
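Before entering the connection in Open WebUI, you can sanity-check the URL/key pair you are about to use. A minimal sketch that only builds the request without sending it (the /v1/models path is an assumption based on the service's OpenAI-compatible API):

```python
from urllib import request

# Defaults from this guide: pipelines on port 9099, key 0p3n-w3bu!
req = request.Request(
    "http://localhost:9099/v1/models",
    headers={"Authorization": "Bearer 0p3n-w3bu!"},
    method="GET",
)
print(req.full_url)
print(req.get_header("Authorization"))
# To actually probe the running service, uncomment:
# print(request.urlopen(req).read())
```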
- Use it: click [New Chat], select [Ollama Pipeline] from the model dropdown, and start chatting. The pipelines service prints logs in its console, confirming that this pipeline is in use.
5. Improvements
1. Edit the \pipelines\ollama_pipeline.py file to accomplish the following:
- make the Ollama server and model configurable from the UI
- fix some non-idiomatic Python
- replace print with logger output
import logging
import os
from typing import List, Union, Generator, Iterator

from pydantic import BaseModel, Field
import requests


class Pipeline:
    class Valves(BaseModel):
        OLLAMA_BASE_URL: str = Field(default="http://localhost:11434", description="ollama base url")
        OLLAMA_DEFAULT_MODEL: str = Field(default="llama3", description="ollama default model name")

    def __init__(self):
        # self.id = "ollama_pipeline"
        self.name = "Ollama Pipeline"
        self.valves = self.Valves(
            **{k: os.getenv(k, v.default) for k, v in self.Valves.model_fields.items()}
        )
        self.log = logging.getLogger(__name__)

    async def on_startup(self):
        self.log.info(f"on_startup:{__name__}")

    async def on_shutdown(self):
        self.log.info(f"on_shutdown:{__name__}")

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # This is where you can add your custom pipelines like RAG.
        self.log.info(f"pipe:{__name__}, model_id:{model_id}, messages:{messages}")

        ollama_base_url = self.valves.OLLAMA_BASE_URL
        model = self.valves.OLLAMA_DEFAULT_MODEL

        if "user" in body:
            self.log.info("######################################")
            self.log.info(f'# User: {body["user"]["name"]} ({body["user"]["id"]})')
            self.log.info(f"# Message: {user_message}")
            self.log.info("######################################")

        try:
            r = requests.post(
                url=f"{ollama_base_url}/v1/chat/completions",
                json={**body, "model": model},
                stream=True,
            )
            r.raise_for_status()

            if body["stream"]:
                return r.iter_lines()
            else:
                return r.json()
        except Exception as e:
            return f"Error: {e}"
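One consequence of switching from print to logging: the messages only appear if a handler is configured. Inside the pipelines service that is already the case (its log output shows INFO:root lines), but when exercising the class elsewhere you must attach one yourself. A minimal, self-contained sketch:

```python
import io
import logging

# Attach an in-memory handler so the log output can be inspected
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(logging.Formatter("%(levelname)s:%(name)s:%(message)s"))

log = logging.getLogger("ollama_pipeline")
log.setLevel(logging.INFO)
log.addHandler(handler)

log.info("on_startup:%s", "ollama_pipeline")
print(buf.getvalue().strip())  # INFO:ollama_pipeline:on_startup:ollama_pipeline
```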
- Delete the pipelines\ollama_pipeline directory
- Restart the service
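Deleting the generated subfolder matters because the loader creates a valves.json there (see the startup log in section 3); removing it should force the file to be regenerated with the new valves on the next start. A cross-platform sketch (the path is relative to the pipelines checkout):

```python
import shutil
from pathlib import Path

target = Path("pipelines") / "ollama_pipeline"
shutil.rmtree(target, ignore_errors=True)  # no-op if the folder is absent
print(target.exists())  # False
```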
6. Pitfall: when a pipeline file contains Chinese characters, the file encoding can make the pipeline fail to load.
- When the file is saved as UTF-8, the error is:
Error loading module: ollama_pipeline
'gbk' codec can't decode byte 0xaf in position 781: illegal multibyte sequence
WARNING:root:No Pipeline class found in ollama_pipeline
- When the file is saved as GBK, the error is:
Error loading module: ollama_pipeline
(unicode error) 'utf-8' codec can't decode byte 0xc6 in position 0: invalid continuation byte (ollama_pipeline.py, line 28)
WARNING:root:No Pipeline class found in ollama_pipeline
- Fix:
Edit the pipelines\main.py file and add encoding="utf-8" to every open() call that reads or writes in text mode:
137: with open(module_path, "r", encoding="utf-8") as file:
193: with open(valves_json_path, "w", encoding="utf-8") as f:
201: with open(valves_json_path, "r", encoding="utf-8") as f:
580: with open(valves_json_path, "w", encoding="utf-8") as f:
- Then save the pipeline file as UTF-8
- Delete the pipelines\ollama_pipeline directory
- Restart the service
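The fix can be verified in isolation: write a file containing Chinese text as UTF-8, then read it back with an explicit encoding, exactly as the patched open() calls in main.py do (the file name below is illustrative):

```python
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "ollama_pipeline.py")

# Write a pipeline-like file with a Chinese comment as UTF-8
with open(path, "w", encoding="utf-8") as f:
    f.write('# 中文注釋\nNAME = "Ollama Pipeline"\n')

# Without encoding="utf-8", open() falls back to the platform
# default (GBK on Chinese-locale Windows), which is what triggered
# the 'gbk' codec errors above. With it, the read is deterministic:
with open(path, "r", encoding="utf-8") as f:
    content = f.read()
print(content)
```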