# Work Log

## kortix-ai/suna: Suna - Open-Source Generalist AI Agent
### Project Overview
Suna is a fully open-source AI assistant that helps users handle everyday tasks such as research and data analysis through natural conversation. It combines powerful capabilities with an intuitive interface, understanding what you need and delivering results. Its toolkit includes browser automation, file management, web crawling, command-line execution, website deployment, and integration with various APIs and services; working together, these let Suna solve complex problems and automate workflows through simple conversation.
### Project Architecture
Suna consists of four main components:
- Backend API: a Python/FastAPI service that handles REST endpoints, thread management, and integration with large language models (LLMs) such as Anthropic via LiteLLM (see the sketch after this list).
- Frontend: a Next.js/React application that provides a responsive user interface, including the chat interface and dashboard.
- Agent Docker: an isolated execution environment for each agent, with browser automation, a code interpreter, file-system access, tool integration, and security features.
- Supabase database: handles data persistence, including authentication, user management, conversation history, file storage, agent state, analytics, and real-time subscriptions.
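Because the backend routes all model calls through LiteLLM, provider differences sit behind a single `completion()` call. Below is a minimal sketch of that pattern, not the actual Suna backend code; it assumes an `ANTHROPIC_API_KEY` is exported in the environment.

```python
from litellm import completion

# LiteLLM routes to the provider named in the model string and reads the
# matching API key (e.g. ANTHROPIC_API_KEY) from the environment.
response = completion(
    model="anthropic/claude-3-5-sonnet-20241022",
    messages=[
        {"role": "system", "content": "You are a helpful research assistant."},
        {"role": "user", "content": "Summarize what an AI agent sandbox is in one sentence."},
    ],
)
print(response.choices[0].message.content)
```

Swapping providers (OpenAI, OpenRouter, etc.) only changes the model string and the environment variable, which is what makes the wizard's "pick any one LLM provider" step possible.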
### Use Cases
The repository documentation lists a number of use cases showing Suna at work in different scenarios, for example:
- Competitor analysis: analyze the market situation in a given industry and generate a PDF report.
- Venture capital fund list: gather information on the major US venture capital funds.
- Candidate search: find candidates on LinkedIn who match specific criteria.
- Company trip planning: generate an itinerary and activity schedule for a company trip.
- Excel data processing: set up an Excel spreadsheet and populate it with the relevant information.
- Event speaker prospecting: find AI-ethics speakers who meet given criteria and output their contact details and talk summaries.
- Scientific paper summarization and cross-referencing: research and compare scientific papers and produce a report.
- Lead research and first outreach: research potential customers and generate personalized first-contact emails.
- SEO analysis: generate an SEO analysis report for a website.
- Personal trip planning: generate a detailed itinerary for a personal trip.
- Recently funded startups: screen startups in a specific domain across multiple platforms and compile a report.
- Forum discussion scraping: find information on a specific topic across forums and compile a list.
### Full Log
After more than ten deployment attempts, I finally got the project running; the road there was full of thorns. The process requires configuring a large number of external services and API keys, and every step, from environment setup to dependency installation, from obtaining keys to wiring up services, can hide a trap that needs repeated troubleshooting and debugging. A few unresolved details remain, but the project is now basically up and running.
The complete deployment log is recorded below, both for my own later review and as a reference so others can avoid the common errors in advance. I will keep digging into the remaining issues and eventually organize everything into a full tutorial.
[All API keys in the log have since expired (no credit was added); they are shown only for record-keeping. Expired API keys will block the deployment.]
Microsoft Windows [Version 10.0.27868.1000]
(c) Microsoft Corporation. All rights reserved.

(.venv) F:\PythonProjects\suna>python setup.py '--admin'

   ███████╗██╗   ██╗███╗   ██╗ █████╗
   ██╔════╝██║   ██║████╗  ██║██╔══██╗
   ███████╗██║   ██║██╔██╗ ██║███████║
   ╚════██║██║   ██║██║╚██╗██║██╔══██║
   ███████║╚██████╔╝██║ ╚████║██║  ██║
   ╚══════╝ ╚═════╝ ╚═╝  ╚═══╝╚═╝  ╚═╝

   Setup Wizard

This wizard will guide you through setting up Suna, an open-source generalist AI agent.

Step 1/8: Checking requirements
==================================================
✓ git is installed
✓ docker is installed
✓ python3 is installed
✓ poetry is installed
✓ pip3 is installed
✓ node is installed
✓ npm is installed
✓ Docker is running
✓ Suna repository detected
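The wizard's first step just verifies that the required CLI tools are on the PATH and that the Docker daemon responds. The sketch below shows how such a check can be done with `shutil.which` and `docker info`; it is an illustration of the idea, not the actual implementation in `setup.py`.

```python
import shutil
import subprocess

REQUIRED_TOOLS = ["git", "docker", "python3", "poetry", "pip3", "node", "npm"]

def check_requirements() -> bool:
    """Return True if every required CLI tool is available and Docker is running."""
    ok = True
    for tool in REQUIRED_TOOLS:
        if shutil.which(tool) is None:
            print(f"✗ {tool} is NOT installed")
            ok = False
        else:
            print(f"✓ {tool} is installed")
    # `docker info` fails if the daemon is not running, so it doubles as a daemon check.
    try:
        subprocess.run(["docker", "info"], capture_output=True, check=True)
        print("✓ Docker is running")
    except (FileNotFoundError, subprocess.CalledProcessError):
        print("✗ Docker is not running")
        ok = False
    return ok

if __name__ == "__main__":
    check_requirements()
```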
Step 2/8: Collecting Supabase information
==================================================
ℹ You'll need to create a Supabase project before continuing
ℹ Visit https://supabase.com/dashboard/projects to create one
ℹ After creating your project, visit the project settings -> Data API and you'll need to get the following information:
ℹ 1. Supabase Project URL (e.g., https://abcdefg.supabase.co)
ℹ 2. Supabase anon key
ℹ 3. Supabase service role key
Press Enter to continue once you've created your Supabase project...
Enter your Supabase Project URL (e.g., https://abcdefg.supabase.co): https://gcnijvljsutcxwsdfsgcedjz.supabase.co
Enter your Supabase anon key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9safasfsaf.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6Imdjbmlqdmxqc3V0Y3h3Z2NlZGp6Iiwicm9sZSI6IsdfmFub24iLCJpYXQiOjE3NDg1MjAwNjksImV4cCI6MjA2NDA5NjA2OX0.WkHwZgqXVwVVR6gnjy1BbfPqqTStdx0Tob0iqMQu5TQ
Enter your Supabase service role key: eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpcsdfsafsa3MiOiJzdXBhYmFzZSIsInJlZiI6Imdjbmlqdmxqc3V0Y3h3Z2NlZGp6Iiwicm9sZSI6IsdfnNlcnZpY2Vfcm9sZSIsImlhdCI6MTc0ODUyMDA2OSwiZXhwIjoyMDY0MDk2MDY5fQ.SUGg5LWt41NA_E-fKSt1vBLt4jBFw6sEeMAa1xvYbyw
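Before continuing, it is worth sanity-checking that the URL and keys at least parse. A minimal sketch using the `supabase-py` client, assuming the values are exported as `SUPABASE_URL` and `SUPABASE_ANON_KEY` (env var names chosen here for illustration):

```python
import os
from supabase import create_client

# Illustrative env var names; adjust to wherever you stored the values.
url = os.environ["SUPABASE_URL"]
anon_key = os.environ["SUPABASE_ANON_KEY"]

# create_client performs only lightweight validation (missing or empty
# values raise immediately); a real query against an exposed table would
# be needed to fully verify the keys.
client = create_client(url, anon_key)
print("Supabase client initialized for", url)
```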
Step 3/8: Collecting Daytona information
==================================================
ℹ You'll need to create a Daytona account before continuing
ℹ Visit https://app.daytona.io/ to create one
ℹ Then, generate an API key from 'Keys' menu
ℹ After that, go to Images (https://app.daytona.io/dashboard/images)
ℹ Click '+ Create Image'
ℹ Enter 'kortix/suna:0.1.2.8' as the image name
ℹ Set '/usr/bin/supervisord -n -c /etc/supervisor/conf.d/supervisord.conf' as the Entrypoint
Press Enter to continue once you've completed these steps...
Enter your Daytona API key: dtn_8856676c89b5575977dc9afe69dbe67sdfsfba1d76361c7e5ff537862c98c3827cd2b
Step 4/8: Collecting LLM API keys
==================================================
ℹ You need at least one LLM provider API key to use Suna
ℹ Available LLM providers: OpenAI, Anthropic, OpenRouter

Select LLM providers to configure:
[1] OpenAI
[2] Anthropic
[3] OpenRouter (access to multiple models)
Enter numbers separated by commas (e.g., 1,2,3)
Select providers (required, at least one): 1,3

Configuring OPENAI
Enter your OpenAI API key: sk-proj-dUUSgK9ysdfsdfsdfsaf1cFHa-f9ImeDrJkiPbE4Ei0Bs87-YT4idKotRaYkMlU61EuT2RxW1yGlm6-6lcRhMmT3BlbkFJp7ZEISV8HsdhWTxORCEvlwZ7Rrsdfsafv568HKuYpU_9dm0WnCelDytNKPkqWrchoFNhUUh-iCIAGfX-oA
Recommended OpenAI models:
[1] openai/gpt-4o
[2] openai/gpt-4o-mini
Select default model (1-4) or press Enter for gpt-4o: 1

Configuring OPENROUTER
Enter your OpenRouter API key: sk-or-v1-5405c9fd3c1f99d9122446sdf6ef81f618sdffad90sdfadf192d77ff17cb65a0d312e621286ee6a
Recommended OpenRouter models:
[1] openrouter/google/gemini-2.5-pro-preview
[2] openrouter/deepseek/deepseek-chat-v3-0324:free
[3] openrouter/openai/gpt-4o-2024-11-20
Select default model (1-3) or press Enter for gemini-2.5-flash: 2
✓ Using openrouter/deepseek/deepseek-chat-v3-0324:free as the default model
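Once a default model is chosen, a quick standalone call confirms the key actually works before the rest of the setup depends on it. A minimal sketch via LiteLLM, assuming `OPENROUTER_API_KEY` is exported; the model name mirrors the default selected above.

```python
from litellm import completion

# LiteLLM reads OPENROUTER_API_KEY from the environment for models
# prefixed with "openrouter/", so no explicit api_key argument is needed.
response = completion(
    model="openrouter/deepseek/deepseek-chat-v3-0324:free",
    messages=[{"role": "user", "content": "Reply with the single word: pong"}],
)
print(response.choices[0].message.content)
```

An expired or unfunded key (like the ones in this log) fails here with an authentication or quota error, which is much easier to diagnose than a silent failure inside the agent later on.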
Step 5/8: Collecting search and web scraping API keys
==================================================
ℹ You'll need to obtain API keys for search and web scraping
ℹ Visit https://tavily.com/ to get a Tavily API key
ℹ Visit https://firecrawl.dev/ to get a Firecrawl API key
Enter your Tavily API key: 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
Enter your Firecrawl API key: fc-1801bsdfsfedf8e2942d4bdf536032f798e03
Are you self-hosting Firecrawl? (y/n): N
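These two keys can be exercised independently of Suna with the official Python clients. A rough sketch, assuming the `tavily-python` and `firecrawl-py` packages are installed and the keys are exported as `TAVILY_API_KEY` and `FIRECRAWL_API_KEY` (names chosen here for illustration):

```python
import os
from tavily import TavilyClient
from firecrawl import FirecrawlApp

# A small Tavily search: an invalid key raises an error right here.
tavily = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])
results = tavily.search("open-source AI agents", max_results=3)
print("Tavily returned", len(results.get("results", [])), "results")

# Scrape one page through Firecrawl to confirm the key is accepted.
firecrawl = FirecrawlApp(api_key=os.environ["FIRECRAWL_API_KEY"])
page = firecrawl.scrape_url("https://example.com")
print("Firecrawl scrape succeeded:", bool(page))
```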
Step 6/8: Collecting RapidAPI key
==================================================
ℹ To enable API services like LinkedIn, and others, you'll need a RapidAPI key
ℹ Each service requires individual activation in your RapidAPI account:
ℹ 1. Locate the service's `base_url` in its corresponding file (e.g., https://linkedin-data-scraper.p.rapidapi.com in backend/agent/tools/data_providers/LinkedinProvider.py)
ℹ 2. Visit that specific API on the RapidAPI marketplace
ℹ 3. Subscribe to the service (many offer free tiers with limited requests)
ℹ 4. Once subscribed, the service will be available to your agent through the API Services tool
ℹ A RapidAPI key is optional for API services like LinkedIn
ℹ Visit https://rapidapi.com/ to get your API key if needed
ℹ You can leave this blank and add it later if desired
Enter your RapidAPI key (optional, press Enter to skip): 936154e36fmshe98d7e77835be33p1c63e0jsnd737f78eca0b
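RapidAPI authenticates every marketplace API with the same key via request headers, which is why a single key covers LinkedIn and the other data providers. The sketch below illustrates that header usage with `requests` against the LinkedIn scraper host mentioned above; the endpoint path and payload are illustrative assumptions, not taken from the Suna provider code.

```python
import os
import requests

RAPIDAPI_KEY = os.environ["RAPIDAPI_KEY"]  # illustrative env var name

# Every RapidAPI service expects these two headers; the host must match
# the service's base_url that you subscribed to on the marketplace.
headers = {
    "x-rapidapi-key": RAPIDAPI_KEY,
    "x-rapidapi-host": "linkedin-data-scraper.p.rapidapi.com",
}

# Hypothetical endpoint and body, shown only to demonstrate the header scheme.
resp = requests.post(
    "https://linkedin-data-scraper.p.rapidapi.com/profile",
    headers=headers,
    json={"link": "https://www.linkedin.com/in/some-profile/"},
    timeout=30,
)
print(resp.status_code)
```

A 403 response here usually means the key is valid but you have not subscribed to that particular API yet.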
ℹ Setting up Supabase database...
✓ Extracted project reference 'gcnijvljsutcxwgcedjz' from your Supabase URL
ℹ Changing to backend directory: F:\PythonProjects\suna\backend
ℹ Logging into Supabase CLI...
Hello from Supabase! Press Enter to open browser and login automatically.
Here is your login link in case browser did not open https://supabase.com/dashboard/cli/login?session_id=99b6b3c2-650b-4554-9c86-971ddf5459f1&token_name=cli_AI\love@AI_1748618285&public_key=0423f5ef16356a29c45508ab16157da5afffbe7ced2f713f1258eeb78313524ae557aab83dsdfafedeb19895a1a6f8bd34b1d9d0d38753e5798c5fff7ffad5d8edf4255
Enter your verification code: fd5a5ca0
Token cli_AI\love@AI_17486sdfa18285 created successfully.
You are now logged in. Happy coding!
ℹ Linking to Supabase project gcnijvljsutcxwgcedjz...
Enter your database password (or leave blank to skip):
Connecting to remote database...
NOTICE (42P06): schema "supabase_migrations" already exists, skipping
NOTICE (42P07): relation "schema_migrations" already exists, skipping
NOTICE (42701): column "statements" of relation "schema_migrations" already exists, skipping
NOTICE (42701): column "name" of relation "schema_migrations" already exists, skipping
NOTICE (42P06): schema "supabase_migrations" already exists, skipping
NOTICE (42P07): relation "seed_files" already exists, skipping
Finished supabase link.
ℹ Pushing database migrations...
Connecting to remote database...
Remote database is up to date.
✓ Supabase database setup completed
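Judging from the log, this step drives the Supabase CLI (login, link, push). If the wizard's automation gets stuck, roughly the same sequence can be run by hand; the sketch below wraps the equivalent CLI calls in Python, with the project ref taken from the URL above.

```python
import subprocess

PROJECT_REF = "gcnijvljsutcxwgcedjz"  # extracted from the Supabase project URL

def run(cmd: list[str]) -> None:
    """Run a Supabase CLI command from the backend directory and fail loudly."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, cwd="backend", check=True)

# `supabase login` opens a browser for the verification-code flow shown in the log.
run(["supabase", "login"])
# Link the local migrations to the remote project, then push them.
run(["supabase", "link", "--project-ref", PROJECT_REF])
run(["supabase", "db", "push"])
```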
⚠ IMPORTANT: You need to manually expose the 'basejump' schema in Supabase
ℹ Go to the Supabase web platform -> choose your project -> Project Settings -> Data API
ℹ In the 'Exposed Schema' section, add 'basejump' if not already there
Press Enter once you've completed this step...
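Whether `basejump` is really exposed can be checked from outside the dashboard: the Supabase Data API is backed by PostgREST, which selects a schema via the `Accept-Profile` header. A rough sketch, reusing the URL and anon key from Step 2 (env var names are assumptions); an error body listing the allowed schemas indicates `basejump` is still not exposed.

```python
import os
import requests

url = os.environ["SUPABASE_URL"].rstrip("/")  # e.g. https://<ref>.supabase.co
anon = os.environ["SUPABASE_ANON_KEY"]

# Ask the Data API (PostgREST) for the basejump schema explicitly.
resp = requests.get(
    f"{url}/rest/v1/",
    headers={
        "apikey": anon,
        "Authorization": f"Bearer {anon}",
        "Accept-Profile": "basejump",
    },
    timeout=15,
)
# 200 means the schema is reachable; an error naming the exposed schemas means it is not.
print(resp.status_code, resp.text[:200])
```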
Step 8/8: Starting Suna
==================================================
ℹ You can start Suna using either Docker Compose or by manually starting the frontend, backend and worker.
How would you like to start Suna?
[1] Docker Compose (recommended, starts all services)
[2] Manual startup (requires Redis, RabbitMQ & separate terminals)
Enter your choice (1 or 2): 1
ℹ Starting Suna with Docker Compose...
ℹ Building images locally...
Compose can now delegate builds to bake for better performance.
To do so, set COMPOSE_BAKE=true.
[+] Building 426.5s (34/34) FINISHED                                                          docker:desktop-linux
 => [worker internal] load build definition from Dockerfile                                                   0.0s
 => => transferring dockerfile: 1.63kB                                                                        0.0s
 => [backend internal] load metadata for docker.io/library/python:3.11-slim                                   6.3s
 => [worker internal] load .dockerignore                                                                      0.0s
 => => transferring context: 2B                                                                               0.0s
 => [backend 1/7] FROM docker.io/library/python:3.11-slim@sha256:dbf1de478a55d6763afaa39c2f3d7b54b25230614980276de5cacdde79529d0c  0.1s
 => => resolve docker.io/library/python:3.11-slim@sha256:dbf1de478a55d6763afaa39c2f3d7b54b25230614980276de5cacdde79529d0c  0.0s
 => [worker internal] load build context                                                                      0.0s
 => => transferring context: 7.75kB                                                                           0.0s
 => CACHED [backend 2/7] WORKDIR /app                                                                         0.0s
 => CACHED [backend 3/7] RUN apt-get update && apt-get install -y --no-install-recommends build-essential curl && rm -rf /var/lib/apt/lists/*  0.0s
 => CACHED [backend 4/7] RUN useradd -m -u 1000 appuser && mkdir -p /app/logs && chown -R appuser:appuser /app  0.0s
 => CACHED [worker 5/7] COPY --chown=appuser:appuser requirements.txt .                                       0.0s
 => [worker 6/7] RUN pip install --no-cache-dir -r requirements.txt gunicorn                                110.4s
 => [worker 7/7] COPY --chown=appuser:appuser . .                                                             0.1s
 => [worker] exporting to image                                                                              12.7s
 => => exporting layers                                                                                       9.7s
 => => exporting manifest sha256:a6e63d8f4567dc7ce2dd73de276ab5f62b50ae4991dbfa03f890eea7cc0c9d78             0.0s
 => => exporting config sha256:236895aed0cf64c4db115b31dbfae75bbe84ec6c4d94d3f7f1648a1961435ef8               0.0s
 => => exporting attestation manifest sha256:846935b1db61c8759fc8603810ba0abe08e537d4f5a86f2f678a26d7f96fc6e8 0.0s
 => => exporting manifest list sha256:f9938f968b86a5dfdbbdfd7b4eb8b76a848f2937c4c45eaa13e8f5f924d4fad6        0.0s
 => => naming to docker.io/library/suna-worker:latest                                                         0.0s
 => => unpacking to docker.io/library/suna-worker:latest                                                      2.8s
 => [worker] resolving provenance for metadata file                                                           0.0s
 => [backend internal] load build definition from Dockerfile                                                  0.0s
 => => transferring dockerfile: 1.63kB                                                                        0.0s
 => [backend internal] load .dockerignore                                                                     0.0s
 => => transferring context: 2B                                                                               0.0s
 => [backend internal] load build context                                                                     0.0s
 => => transferring context: 5.75kB                                                                           0.0s
 => CACHED [backend 5/7] COPY --chown=appuser:appuser requirements.txt .                                      0.0s
 => CACHED [backend 6/7] RUN pip install --no-cache-dir -r requirements.txt gunicorn                          0.0s
 => CACHED [backend 7/7] COPY --chown=appuser:appuser . .                                                     0.0s
 => [backend] exporting to image                                                                              0.1s
 => => exporting layers                                                                                       0.0s
 => => exporting manifest sha256:14fa145bd6eb38ce984f807e8744d0937a4fc107f068d40433d7c14bea4d1476             0.0s
 => => exporting config sha256:d6f08a5c47d5a9ef5e550f4ef620be566ce98db2b10141b4f123874939dcdef8               0.0s
 => => exporting attestation manifest sha256:9aa719d69af0e8c88936163351a6fa4cf448145ec7c25f06833782299e46ed28 0.0s
 => => exporting manifest list sha256:fb06e27847e8b9b247ae01196489d0f75305e6c736b823793bc50850cc55edeb        0.0s
 => => naming to docker.io/library/suna-backend:latest                                                        0.0s
 => => unpacking to docker.io/library/suna-backend:latest                                                     0.0s
 => [backend] resolving provenance for metadata file                                                          0.0s
 => [frontend internal] load build definition from Dockerfile                                                 0.0s
 => => transferring dockerfile: 704B                                                                          0.0s
 => [frontend internal] load metadata for docker.io/library/node:20-slim                                      2.8s
 => [frontend internal] load .dockerignore                                                                    0.0s
 => => transferring context: 2B                                                                               0.0s
 => [frontend 1/7] FROM docker.io/library/node:20-slim@sha256:cb4abfbba7dfaa78e21ddf2a72a592e5f9ed36ccf98bdc8ad3ff945673d288c2  21.0s
 => => resolve docker.io/library/node:20-slim@sha256:cb4abfbba7dfaa78e21ddf2a72a592e5f9ed36ccf98bdc8ad3ff945673d288c2  0.0s
 => => sha256:d9d139bf2ac215a0d57ef09e790699a8fd5587c00200db6a91446278356b32aa 447B / 447B                   12.3s
 => => sha256:b12d1e6fd3ba6067543928fa3e4c9a9307711cf5a4593699d157dba3af3e7d21 1.71MB / 1.71MB               15.3s
 => => sha256:d34dc2c1b56bf7f58faea3b73986ac0a274f2b369cc5f24a5ea26015fdd57e95 41.17MB / 41.17MB             19.2s
 => => sha256:057bf83be68af82a505c30eb852a4b542c264fe429954c8e0c0e204a9c9dd86e 3.31kB / 3.31kB               20.4s
 => => extracting sha256:057bf83be68af82a505c30eb852a4b542c264fe429954c8e0c0e204a9c9dd86e                     0.0s
 => => extracting sha256:d34dc2c1b56bf7f58faea3b73986ac0a274f2b369cc5f24a5ea26015fdd57e95                     0.4s
 => => extracting sha256:b12d1e6fd3ba6067543928fa3e4c9a9307711cf5a4593699d157dba3af3e7d21                     0.0s
 => => extracting sha256:d9d139bf2ac215a0d57ef09e790699a8fd5587c00200db6a91446278356b32aa                     0.0s
 => [frontend internal] load build context                                                                    0.4s
 => => transferring context: 15.10MB                                                                          0.3s
 => [frontend 2/7] WORKDIR /app                                                                               0.4s
 => [frontend 3/7] COPY package*.json ./                                                                      0.0s
 => [frontend 4/7] RUN apt-get update && apt-get install -y --no-install-recommends python3 make g++ build-essential pkg-config libcairo2-dev libpango1.0-dev libjpeg-dev libgif-dev librsvg2-dev && rm -rf /var/lib/apt/lists/*  95.4s
 => [frontend 5/7] RUN npm install                                                                           31.0s
 => [frontend 6/7] COPY . .                                                                                   0.4s
 => [frontend 7/7] RUN npm run build                                                                         96.1s
 => [frontend] exporting to image                                                                            42.2s
 => => exporting layers                                                                                      32.0s
 => => exporting manifest sha256:5aa3bf772b57c08f01051d99a26dd00ca11bd0f6c9964672d854b5a9237ca2cc             0.0s
 => => exporting config sha256:c29ae31e61f62fbc9cc353572cc75685d91c48c4f930fc0e8aba4f785f0a0a33               0.0s
 => => exporting attestation manifest sha256:a228930985cc14ebd9460baadf26c81cc3e51c65f11868d9b576ce2c917604a2 0.0s
 => => exporting manifest list sha256:c905a71017595001e983964ecb5076c266eee581bd54b08a8face117267b8f0e        0.0s
 => => naming to docker.io/library/suna-frontend:latest                                                       0.0s
 => => unpacking to docker.io/library/suna-frontend:latest                                                   10.0s
 => [frontend] resolving provenance for metadata file                                                         0.0s
[+] Running 11/11
 ✔ backend Built                              0.0s
 ✔ frontend Built                             0.0s
 ✔ worker Built                               0.0s
 ✔ Network suna_default Created               0.6s
 ✔ Volume "suna_rabbitmq_data" Created        0.0s
 ✔ Volume "suna_redis_data" Created           0.0s
 ✔ Container suna-rabbitmq-1 Healthy         15.6s
 ✔ Container suna-redis-1 Healthy            15.6s
 ✔ Container suna-worker-1 Started           13.8s
 ✔ Container suna-backend-1 Started          16.1s
 ✔ Container suna-frontend-1 Started         19.0s
ℹ Waiting for services to start...
⚠ Some services might not be running correctly. Check 'docker compose ps' for details.
✓ Suna Setup Complete!
ℹ Suna is configured to use openrouter/deepseek/deepseek-chat-v3-0324:free as the default LLM model
✓ Your Suna instance is now running!
ℹ Access it at: http://localhost:3000
ℹ Create an account using Supabase authentication to start using Suna

Useful Docker commands:
docker compose ps - Check the status of Suna services
docker compose logs - View logs from all services
docker compose logs -f - Follow logs from all services
docker compose down - Stop Suna services
docker compose up -d - Start Suna services (after they've been stopped)

(.venv) F:\PythonProjects\suna>
- Visit the http://localhost:3000 page shown in the log output and sign up / log in with a Supabase account.
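Before registering, a quick scripted check can confirm the frontend and backend containers actually respond. A minimal sketch; the backend port 8000 and the `/api/health` path are assumptions that may differ in your Suna version (check docker-compose.yaml or the backend code).

```python
import requests

CHECKS = {
    "frontend": "http://localhost:3000",
    # Assumed health endpoint; adjust to whatever your backend actually exposes.
    "backend": "http://localhost:8000/api/health",
}

for name, url in CHECKS.items():
    try:
        resp = requests.get(url, timeout=5)
        print(f"{name}: HTTP {resp.status_code}")
    except requests.RequestException as exc:
        print(f"{name}: not reachable ({exc})")
```

If either check fails, `docker compose ps` and `docker compose logs -f` (listed above) are the first places to look.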