一,Dify Basics
1,Overview
Dify.AI is an open-source Large Language Model (LLM) application development platform designed to help developers quickly build production-grade generative AI applications while lowering the technical barrier. It combines the ideas of Backend-as-a-Service (BaaS) and LLMOps, supports the GPT series as well as other large language models, and enhances LLM capabilities through tools and interfaces so they can be integrated seamlessly into existing business systems.
(1) LLM tool support
- Dify provides two tool types: first-party tools (built into the platform) and custom tools (API tools imported by the user).
- These tools let an LLM perform web search, scientific computation, image generation, and more, extending its ability to interact with the outside world.
(2) Knowledge base integration
- Long-form text, structured data, and other content can be uploaded and vectorized into a knowledge base, letting the LLM hold context-aware conversations grounded in up-to-date or private data.
- Retrieval-Augmented Generation (RAG) combines relevant knowledge-base content with the model's generation capability.
- Multiple data formats are supported, such as plain text, CSV, and PDF, with automatic parsing of the structured information they contain.
(3) Workflow integration
- Small tools and nodes can be assembled into a workflow, improving an LLM application's performance on complex tasks. Workflows come in two flavors:
- Chatflow: for conversational scenarios such as customer service, semantic search, and other dialogue applications that need multi-step logic when constructing a response.
- Workflow: for automation and batch-processing scenarios such as high-quality translation, data analysis, content generation, and email automation.
2,BaaS
Dify's "Backend as a Service" (BaaS) approach encapsulates complex backend functionality to give developers a simplified, efficient AI application development experience.
(1) API-driven backend services: Dify exposes standardized APIs for all of its features. Developers can call these endpoints directly to integrate LLM capabilities into frontend applications or existing business systems, without building a complex backend themselves.
(2) Hidden complexity: developers do not need to handle backend deployment, model management, or provider switching. Dify wraps the base capabilities of LLMs and centrally manages API keys and model configuration.
(3) Instant effect and visual management: applications are designed and operated through Dify's visual interface, for example tuning prompts, analyzing logs, or annotating data. Changes take effect in the running application immediately, with no extra deployment.
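To make the API-driven idea concrete, the sketch below builds the request body that Dify's chat-messages endpoint expects. This is an illustration only: the base URL constant and helper name are my own, and the commented-out fetch call assumes you substitute a real app API key.

```javascript
// Illustrative sketch of calling Dify's chat-messages endpoint from a frontend.
const DIFY_BASE_URL = "https://api.dify.ai/v1"; // or your self-hosted API URL

// Build the JSON body the chat-messages endpoint expects.
function buildChatRequest(query, user, conversationId = "") {
  return {
    inputs: {},                // app-level input variables, if any
    query: query,              // the end user's message
    response_mode: "blocking", // use "streaming" for SSE output
    conversation_id: conversationId,
    user: user,                // stable identifier for the end user
  };
}

// Usage (requires a valid app API key; shown for illustration only):
// const res = await fetch(`${DIFY_BASE_URL}/chat-messages`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json", Authorization: "Bearer app-..." },
//   body: JSON.stringify(buildChatRequest("Hello!", "user-123")),
// });
// const { answer } = await res.json();
```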
3,LLMOps
LLMOps (Large Language Model Operations) covers the full set of practices and processes for developing, deploying, maintaining, and optimizing large language models such as the GPT series. Its goal is the efficient, scalable, and secure use of these powerful AI models in real applications, spanning model training, deployment, monitoring, updates, security, and compliance.
- Data preparation: the platform provides data collection and preprocessing tools that simplify cleaning and annotation, minimizing or even eliminating coding work.
- Prompt engineering: WYSIWYG prompt editing and debugging, with real-time tuning based on user input.
- Embedding and context management: automatic embedding, storage, and management of long context, improving efficiency and scalability without writing large amounts of code.
- Application monitoring and maintenance: real-time performance monitoring to find and fix problems quickly, keeping applications stable, with complete logging.
- Fine-tuning data preparation: batch export of manually annotated knowledge bases, plus collection of online feedback during operation to continuously improve model quality.
- System and operations: an easy-to-use interface that non-technical staff can also use, with multi-user collaboration, lowering development and maintenance costs. Compared with traditional development, Dify makes application management more transparent and easier to monitor, so team members better understand how the application is running.
二,Docker Deployment
1,Full-service deployment
(1) Download the docker-compose.yaml file
Available in the langgenius/dify repository.
(2) Download the .env.example file and rename it to .env
Also available in the langgenius/dify repository.
(3) Start the Docker containers
Choose the command that matches the Docker Compose version on your system.
- If you have Docker Compose V2, run:

```shell
docker compose up -d
```

- If you have Docker Compose V1, run:

```shell
docker-compose up -d
```
2,Essential-service deployment
(1) Prerequisites for using this docker-compose.yml:
- Postgres is already deployed
- The dify and dify_plugin databases have been created in advance
- The related parameters are as follows; adjust them to your environment:

```plain
# IP form, e.g. 172.18.0.10
DB_HOST={postgres_ip}
DB_PORT=5432
DB_USERNAME=postgres
DB_PASSWORD={postgres_passwd}
DB_DATABASE=dify
DB_PLUGIN_DATABASE=dify_plugin
```

- Redis is already deployed
- The related parameters are as follows; adjust them to your environment:

```plain
# IP form, e.g. 172.18.0.11
REDIS_HOST={redis_ip}
REDIS_PORT=6379
REDIS_USERNAME=default
REDIS_PASSWORD={redis_passwd}
REDIS_DB=0
CELERY_BROKER_URL=redis://default:{redis_passwd}@{redis_ip}:6379/1
```
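Since CELERY_BROKER_URL must stay consistent with the individual REDIS_* values, a small helper (hypothetical, not part of Dify) makes the relationship explicit. Note that Celery uses Redis db 1 here while REDIS_DB is 0, and a password containing special characters (e.g. "@") must be percent-encoded inside the URL:

```javascript
// Hypothetical helper: compose a Celery broker URL from Redis settings.
// Passwords are percent-encoded so characters like "@" don't break the URL.
function buildCeleryBrokerUrl({ username, password, host, port, db }) {
  return `redis://${username}:${encodeURIComponent(password)}@${host}:${port}/${db}`;
}

// Example with placeholder values:
// buildCeleryBrokerUrl({ username: "default", password: "p@ss", host: "172.18.0.11", port: 6379, db: 1 })
```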
- Docker has a dify-net network created
- type: bridge
- subnet: 172.20.0.0/16
- gateway: 172.20.0.1
(2) Changes relative to the original file
- Only the necessary services are kept: api, worker, web, and plugin_daemon, with weaviate as the vector database
- Every service is given a container_name
- Some unnecessary features and environment variables are removed; the file is intended for self-hosted personal use only
- Each Docker service is pinned to a fixed network IP, which simplifies communication between the web and api containers
(3) Essential-service configuration files
docker-compose.yml

```yaml
version: '3'

x-shared-env: &shared-api-worker-env
  CONSOLE_API_URL: ${CONSOLE_API_URL:-}
  CONSOLE_WEB_URL: ${CONSOLE_WEB_URL:-}
  SERVICE_API_URL: ${SERVICE_API_URL:-}
  APP_API_URL: ${APP_API_URL:-}
  APP_WEB_URL: ${APP_WEB_URL:-}
  FILES_URL: ${FILES_URL:-}
  LOG_LEVEL: ${LOG_LEVEL:-INFO}
  LOG_FILE: ${LOG_FILE:-/app/logs/server.log}
  DEBUG: ${DEBUG:-false}
  SECRET_KEY: ${SECRET_KEY:-sk-9f73s3ljTXVcMT3Blb3ljTqtsKiGHXVcMT3BlbkFJLK7U}
  # Postgres configuration
  DB_USERNAME: ${DB_USERNAME:-}
  DB_PASSWORD: ${DB_PASSWORD:-}
  DB_HOST: ${DB_HOST:-}
  DB_PORT: ${DB_PORT:-5432}
  DB_DATABASE: ${DB_DATABASE:-dify}
  # Redis configuration
  REDIS_HOST: ${REDIS_HOST:-}
  REDIS_PORT: ${REDIS_PORT:-6379}
  REDIS_USERNAME: ${REDIS_USERNAME:-default}
  REDIS_PASSWORD: ${REDIS_PASSWORD:-}
  REDIS_DB: ${REDIS_DB:-0}
  CELERY_BROKER_URL: ${CELERY_BROKER_URL:-redis://default:3W.apple.account%40goodafterconnect@172.18.0.52:6379/1}
  # Database migration
  MIGRATION_ENABLED: ${MIGRATION_ENABLED:-true}
  WEB_API_CORS_ALLOW_ORIGINS: ${WEB_API_CORS_ALLOW_ORIGINS:-*}
  CONSOLE_CORS_ALLOW_ORIGINS: ${CONSOLE_CORS_ALLOW_ORIGINS:-*}
  STORAGE_TYPE: ${STORAGE_TYPE:-opendal}
  OPENDAL_SCHEME: ${OPENDAL_SCHEME:-fs}
  OPENDAL_FS_ROOT: ${OPENDAL_FS_ROOT:-storage}
  VECTOR_STORE: ${VECTOR_STORE:-weaviate}
  WEAVIATE_ENDPOINT: ${WEAVIATE_ENDPOINT:-http://weaviate:8080}
  WEAVIATE_API_KEY: ${WEAVIATE_API_KEY:-WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih}
  SERVER_WORKER_AMOUNT: ${SERVER_WORKER_AMOUNT:-1}
  PLUGIN_DAEMON_URL: ${PLUGIN_DAEMON_URL:-http://dify-plugin_daemon:5002}
  PLUGIN_DAEMON_KEY: ${PLUGIN_DAEMON_KEY:-lYkiYYT6owG+71oLerGzA7GXCgOT++6ovaezWAjpCjf+Sjc3ZtU+qUEi}
  PLUGIN_DIFY_INNER_API_KEY: ${PLUGIN_DIFY_INNER_API_KEY:-QaHbTe77CtuXmsfyhR7+vRjI/+XbV1AaFy691iy+kGDv2Jvy0/eAh8Y1}
  PLUGIN_DIFY_INNER_API_URL: ${PLUGIN_DIFY_INNER_API_URL:-http://dify-api:5001}

services:
  # API service
  api:
    image: langgenius/dify-api:1.0.0
    container_name: dify-api
    restart: always
    environment:
      <<: *shared-api-worker-env
      MODE: api
      PLUGIN_REMOTE_INSTALL_HOST: ${EXPOSE_PLUGIN_DEBUGGING_HOST:-localhost}
      PLUGIN_REMOTE_INSTALL_PORT: ${EXPOSE_PLUGIN_DEBUGGING_PORT:-5003}
      PLUGIN_MAX_PACKAGE_SIZE: ${PLUGIN_MAX_PACKAGE_SIZE:-52428800}
      INNER_API_KEY_FOR_PLUGIN: ${PLUGIN_DIFY_INNER_API_KEY:-QaHbTe77CtuXmsfyhR7+vRjI/+XbV1AaFy691iy+kGDv2Jvy0/eAh8Y1}
    volumes:
      - ./volumes/app/storage:/app/api/storage
    networks:
      dify-net:
        ipv4_address: 172.20.0.2

  # Worker service
  worker:
    image: langgenius/dify-api:1.0.0
    container_name: dify-worker
    restart: always
    environment:
      <<: *shared-api-worker-env
      MODE: worker
      PLUGIN_MAX_PACKAGE_SIZE: ${PLUGIN_MAX_PACKAGE_SIZE:-52428800}
      INNER_API_KEY_FOR_PLUGIN: ${PLUGIN_DIFY_INNER_API_KEY:-QaHbTe77CtuXmsfyhR7+vRjI/+XbV1AaFy691iy+kGDv2Jvy0/eAh8Y1}
    volumes:
      - ./volumes/app/storage:/app/api/storage
    networks:
      dify-net:
        ipv4_address: 172.20.0.4

  # Frontend web application
  web:
    image: langgenius/dify-web:1.0.0
    container_name: dify-web
    restart: always
    environment:
      # CONSOLE_API_URL: ${CONSOLE_API_URL:-}
      CONSOLE_API_URL: http://172.20.0.2:5001
      # APP_API_URL: ${APP_API_URL:-}
      APP_API_URL: http://172.20.0.2:5001
      SERVICE_API_URL: http://172.20.0.2:5001
      TEXT_GENERATION_TIMEOUT_MS: ${TEXT_GENERATION_TIMEOUT_MS:-60000}
      MARKETPLACE_API_URL: ${MARKETPLACE_API_URL:-https://marketplace.dify.ai}
      MARKETPLACE_URL: ${MARKETPLACE_URL:-https://marketplace.dify.ai}
      TOP_K_MAX_VALUE: ${TOP_K_MAX_VALUE:-}
      INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH: ${INDEXING_MAX_SEGMENTATION_TOKENS_LENGTH:-}
      PM2_INSTANCES: ${PM2_INSTANCES:-2}
    networks:
      dify-net:
        ipv4_address: 172.20.0.3
    depends_on:
      - api

  # Plugin daemon
  plugin_daemon:
    image: langgenius/dify-plugin-daemon:0.0.3-local
    container_name: dify-plugin_daemon
    restart: always
    environment:
      <<: *shared-api-worker-env
      DB_DATABASE: ${DB_PLUGIN_DATABASE:-dify_plugin}
      SERVER_PORT: ${PLUGIN_DAEMON_PORT:-5002}
      SERVER_KEY: ${PLUGIN_DAEMON_KEY:-lYkiYYT6owG+71oLerGzA7GXCgOT++6ovaezWAjpCjf+Sjc3ZtU+qUEi}
      DIFY_INNER_API_URL: ${PLUGIN_DIFY_INNER_API_URL:-http://dify-api:5001}
      DIFY_INNER_API_KEY: ${PLUGIN_DIFY_INNER_API_KEY:-QaHbTe77CtuXmsfyhR7+vRjI/+XbV1AaFy691iy+kGDv2Jvy0/eAh8Y1}
      PLUGIN_WORKING_PATH: ${PLUGIN_WORKING_PATH:-/app/storage/cwd}
      PLUGIN_REMOTE_INSTALLING_HOST: ${EXPOSE_PLUGIN_DEBUGGING_HOST:-localhost}
      PLUGIN_REMOTE_INSTALLING_PORT: ${EXPOSE_PLUGIN_DEBUGGING_PORT:-5003}
      FORCE_VERIFYING_SIGNATURE: ${FORCE_VERIFYING_SIGNATURE:-true}
    ports:
      - "${EXPOSE_PLUGIN_DEBUGGING_PORT:-5003}:${PLUGIN_DEBUGGING_PORT:-5003}"
    volumes:
      - ./volumes/plugin_daemon:/app/storage
    networks:
      dify-net:
        ipv4_address: 172.20.0.5

  # The Weaviate vector store
  weaviate:
    image: semitechnologies/weaviate:1.19.0
    container_name: dify-weaviate
    restart: always
    volumes:
      - ./volumes/weaviate:/var/lib/weaviate
    environment:
      PERSISTENCE_DATA_PATH: ${WEAVIATE_PERSISTENCE_DATA_PATH:-/var/lib/weaviate}
      QUERY_DEFAULTS_LIMIT: ${WEAVIATE_QUERY_DEFAULTS_LIMIT:-25}
      AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED: ${WEAVIATE_AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED:-false}
      DEFAULT_VECTORIZER_MODULE: ${WEAVIATE_DEFAULT_VECTORIZER_MODULE:-none}
      CLUSTER_HOSTNAME: ${WEAVIATE_CLUSTER_HOSTNAME:-node1}
      AUTHENTICATION_APIKEY_ENABLED: ${WEAVIATE_AUTHENTICATION_APIKEY_ENABLED:-true}
      AUTHENTICATION_APIKEY_ALLOWED_KEYS: ${WEAVIATE_AUTHENTICATION_APIKEY_ALLOWED_KEYS:-WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih}
      AUTHENTICATION_APIKEY_USERS: ${WEAVIATE_AUTHENTICATION_APIKEY_USERS:-hello@dify.ai}
      AUTHORIZATION_ADMINLIST_ENABLED: ${WEAVIATE_AUTHORIZATION_ADMINLIST_ENABLED:-true}
      AUTHORIZATION_ADMINLIST_USERS: ${WEAVIATE_AUTHORIZATION_ADMINLIST_USERS:-hello@dify.ai}
    networks:
      dify-net:
        ipv4_address: 172.20.0.6

networks:
  dify-net:
    external: true
```
.env

```plain
# ------------------------------
# Service URL configuration
# ------------------------------
CONSOLE_API_URL=
CONSOLE_WEB_URL=
SERVICE_API_URL=
APP_API_URL=
APP_WEB_URL=
FILES_URL=

# ------------------------------
# Database migration
# ------------------------------
MIGRATION_ENABLED=true

# ------------------------------
# Server configuration
# ------------------------------
LOG_LEVEL=INFO
LOG_FILE=/app/logs/server.log
DEBUG=false
SECRET_KEY=sk-9f73s3ljTXVcMT3Blb3ljTqtsKiGHXVcMT3BlbkFJLK7U

# ------------------------------
# Database configuration
# ------------------------------
DB_USERNAME=postgres
DB_PASSWORD={postgres_passwd}
DB_HOST={postgres_ip}
DB_PORT=5432
DB_DATABASE=dify
DB_PLUGIN_DATABASE=dify_plugin

# ------------------------------
# Redis configuration
# ------------------------------
REDIS_HOST={redis_ip}
REDIS_PORT=6379
REDIS_USERNAME=default
REDIS_PASSWORD={redis_passwd}
REDIS_DB=0
CELERY_BROKER_URL=redis://default:{redis_passwd}@{redis_ip}:6379/1

# ------------------------------
# CORS configuration
# ------------------------------
WEB_API_CORS_ALLOW_ORIGINS=*
CONSOLE_CORS_ALLOW_ORIGINS=*

# ------------------------------
# File storage configuration
# ------------------------------
STORAGE_TYPE=opendal
OPENDAL_SCHEME=fs
OPENDAL_FS_ROOT=storage

# ------------------------------
# Vector store configuration
# ------------------------------
VECTOR_STORE=weaviate
WEAVIATE_ENDPOINT=http://weaviate:8080
WEAVIATE_API_KEY=WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
WEAVIATE_PERSISTENCE_DATA_PATH=/var/lib/weaviate
WEAVIATE_QUERY_DEFAULTS_LIMIT=25
WEAVIATE_AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=false
WEAVIATE_DEFAULT_VECTORIZER_MODULE=none
WEAVIATE_CLUSTER_HOSTNAME=node1
WEAVIATE_AUTHENTICATION_APIKEY_ENABLED=true
WEAVIATE_AUTHENTICATION_APIKEY_ALLOWED_KEYS=WVF5YThaHlkYwhGUSmCRgsX3tD5ngdN8pkih
WEAVIATE_AUTHENTICATION_APIKEY_USERS=hello@dify.ai
WEAVIATE_AUTHORIZATION_ADMINLIST_ENABLED=true
WEAVIATE_AUTHORIZATION_ADMINLIST_USERS=hello@dify.ai

# ------------------------------
# Server worker configuration
# ------------------------------
SERVER_WORKER_AMOUNT=1
TEXT_GENERATION_TIMEOUT_MS=60000

# ------------------------------
# Plugin daemon configuration
# ------------------------------
PLUGIN_DAEMON_PORT=5002
PLUGIN_DAEMON_KEY=lYkiYYT6owG+71oLerGzA7GXCgOT++6ovaezWAjpCjf+Sjc3ZtU+qUEi
PLUGIN_DAEMON_URL=http://dify-plugin_daemon:5002
PLUGIN_DIFY_INNER_API_KEY=QaHbTe77CtuXmsfyhR7+vRjI/+XbV1AaFy691iy+kGDv2Jvy0/eAh8Y1
PLUGIN_DIFY_INNER_API_URL=http://dify-api:5001
PLUGIN_WORKING_PATH=/app/storage/cwd
```
三,Dify API Formats
1,Official OpenAI API format
(1) Request format:

```shell
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
```
(2) Response format:

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1677858242,
  "model": "gpt-4o-mini",
  "usage": {
    "prompt_tokens": 13,
    "completion_tokens": 7,
    "total_tokens": 20
  },
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "\n\nThis is a test!"
      },
      "finish_reason": "stop",
      "index": 0
    }
  ]
}
```
2,Official Dify API format
(1) Request format (note the endpoint is /v1/chat-messages):

```shell
curl https://api.dify.ai/v1/chat-messages \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DIFY_API_KEY" \
  -d '{
    "inputs": {},
    "query": "Say this is a test!",
    "response_mode": "blocking",
    "conversation_id": "",
    "user": "api-user"
  }'
```
(2) Response format:

```json
{
  "event": "message",
  "task_id": "253fd9ba-d3e5-XXXX-a89e-8efde52367e7",
  "id": "fb8ffade-d400-XXXX-8cd1-677fc423a60c",
  "message_id": "fb8ffade-d400-XXXX-8cd1-677fc423a60c",
  "conversation_id": "e62cdadf-1684-XXXX-9ee8-c00e7007bb08",
  "mode": "chat",
  "answer": "\n\nThis is a test!",
  "metadata": {
    "usage": {
      "prompt_tokens": 11,
      "prompt_unit_price": "0.03",
      "prompt_price_unit": "0.001",
      "prompt_price": "0.00033",
      "completion_tokens": 3,
      "completion_unit_price": "0.06",
      "completion_price_unit": "0.001",
      "completion_price": "0.00018",
      "total_tokens": 14,
      "total_price": "0.00051",
      "currency": "USD",
      "latency": 0.5931752799078822
    }
  },
  "created_at": 1747322830
}
```
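To make the correspondence between the two formats concrete, here is a minimal sketch (an illustration of the mapping, not Dify code) that converts a blocking Dify response into the OpenAI chat.completion shape:

```javascript
// Map a blocking Dify chat response onto the OpenAI chat.completion shape.
// The "model" argument is supplied by the caller, since Dify responses
// do not carry an OpenAI-style model name.
function difyToOpenAI(difyResp, model) {
  const usage = difyResp.metadata?.usage ?? {};
  return {
    id: `chatcmpl-${difyResp.message_id}`, // reuse Dify's message id
    object: "chat.completion",
    created: difyResp.created_at,
    model: model,
    usage: {
      prompt_tokens: usage.prompt_tokens,
      completion_tokens: usage.completion_tokens,
      total_tokens: usage.total_tokens,
    },
    choices: [
      {
        index: 0,
        message: { role: "assistant", content: difyResp.answer },
        finish_reason: "stop",
      },
    ],
  };
}
```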
3,Calling the Dify API in OpenAI API format
(1) GitHub project
dify2api, by JayLiu7319 (updated Apr 4, 2025)
(2) Cloudflare Worker code
The code below is adapted from the GitHub project above and is deployed via Cloudflare Workers.
V1: chat only, single model; OpenAI-format requests are converted and passed straight through to the Dify API.
```javascript
// Dify API URL configuration - add to Cloudflare Worker secrets or environment variables
const DIFY_API_URL = "https://api.dify.ai/v1"; // Replace with your Dify API URL

// Bot type configuration (Chat, Completion, or Workflow)
const BOT_TYPE = "Chat"; // You can change this or make it configurable

// Optional input/output variable names for specific bot types
const INPUT_VARIABLE = "";  // Optional: Set if needed for specific bot configurations
const OUTPUT_VARIABLE = ""; // Optional: Set if needed for specific bot configurations
const USER = "default-user"; // Set your preferred default user identifier

// Determine API path based on bot type
let API_PATH;
switch (BOT_TYPE) {
  case "Chat":
    API_PATH = "/chat-messages";
    break;
  case "Completion":
    API_PATH = "/completion-messages";
    break;
  case "Workflow":
    API_PATH = "/workflows/run";
    break;
  default:
    API_PATH = "/chat-messages"; // Default to chat
}

/**
 * Generate a random ID similar to OpenAI's format
 */
function generateId() {
  let result = "";
  const characters = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
  for (let i = 0; i < 29; i++) {
    result += characters.charAt(Math.floor(Math.random() * characters.length));
  }
  return result;
}

/**
 * Handle CORS preflight requests
 */
function handleCorsHeaders(request) {
  const corsHeaders = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
    "Access-Control-Allow-Headers": "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization",
    "Access-Control-Max-Age": "86400",
  };
  // Handle OPTIONS request for CORS preflight
  if (request && request.method === "OPTIONS") {
    return new Response(null, { status: 204, headers: corsHeaders });
  }
  return corsHeaders;
}

/**
 * Format response with appropriate headers
 */
function getResponse(body, status = 200, additionalHeaders = {}) {
  const corsHeaders = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
    "Access-Control-Allow-Headers": "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization",
    "Access-Control-Max-Age": "86400",
  };
  return new Response(body, {
    status: status,
    headers: { "Content-Type": "application/json", ...corsHeaders, ...additionalHeaders }
  });
}

/**
 * Transform messages into Dify query format based on bot type
 */
function formatDifyQuery(messages, botType) {
  let queryString;
  if (botType === 'Chat') {
    const lastMessage = messages[messages.length - 1];
    queryString = `here is our talk history:\n'''\n${messages
      .slice(0, -1)
      .map((message) => `${message.role}: ${message.content}`)
      .join('\n')}\n'''\n\nhere is my question:\n${lastMessage.content}`;
  } else if (botType === 'Completion' || botType === 'Workflow') {
    queryString = messages[messages.length - 1].content;
  }
  return queryString;
}

/**
 * Build Dify API request body
 */
function buildDifyRequestBody(queryString, inputVariable) {
  if (inputVariable) {
    return {
      inputs: { [inputVariable]: queryString },
      response_mode: "streaming",
      conversation_id: "",
      user: USER,
      auto_generate_name: false
    };
  } else {
    return {
      inputs: {},
      query: queryString,
      response_mode: "streaming",
      conversation_id: "",
      user: USER,
      auto_generate_name: false
    };
  }
}

/**
 * Process Dify API streaming response
 */
async function handleStreamingResponse(responseBody, model, transformController) {
  const reader = responseBody.getReader();
  const encoder = new TextEncoder();
  let buffer = "";
  let isFirstChunk = true;
  let isResponseEnded = false;

  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      // Send final [DONE] marker if not already sent
      if (!isResponseEnded) {
        const doneData = encoder.encode("data: [DONE]\n\n");
        transformController.enqueue(doneData);
      }
      break;
    }
    buffer += new TextDecoder().decode(value);
    const lines = buffer.split("\n");
    // Process all complete lines
    for (let i = 0; i < lines.length - 1; i++) {
      let line = lines[i].trim();
      if (!line.startsWith("data:")) continue;
      line = line.slice(5).trim();
      let chunkObj;
      try {
        if (line.startsWith("{")) {
          chunkObj = JSON.parse(line);
        } else {
          continue;
        }
      } catch (error) {
        console.error("Error parsing chunk:", error);
        continue;
      }
      if (chunkObj.event === "message" || chunkObj.event === "agent_message" || chunkObj.event === "text_chunk") {
        let chunkContent;
        if (chunkObj.event === "text_chunk") {
          chunkContent = chunkObj.data.text;
        } else {
          chunkContent = chunkObj.answer;
        }
        if (isFirstChunk) {
          chunkContent = chunkContent.trimStart();
          isFirstChunk = false;
        }
        if (chunkContent !== "" && !isResponseEnded) {
          const chunkId = `chatcmpl-${Date.now()}`;
          const chunkCreated = chunkObj.created_at;
          const chunk = {
            id: chunkId,
            object: "chat.completion.chunk",
            created: chunkCreated,
            model: model,
            choices: [
              {
                index: 0,
                delta: { content: chunkContent },
                finish_reason: null,
              },
            ],
          };
          const stringifiedChunk = JSON.stringify(chunk);
          const formattedChunk = `data: ${stringifiedChunk}\n\n`;
          transformController.enqueue(encoder.encode(formattedChunk));
        }
      } else if (chunkObj.event === "workflow_finished" || chunkObj.event === "message_end") {
        if (!isResponseEnded) {
          const chunkId = `chatcmpl-${Date.now()}`;
          const chunkCreated = chunkObj.created_at;
          const chunk = {
            id: chunkId,
            object: "chat.completion.chunk",
            created: chunkCreated,
            model: model,
            choices: [
              {
                index: 0,
                delta: {},
                finish_reason: "stop",
              },
            ],
          };
          const stringifiedChunk = JSON.stringify(chunk);
          const formattedChunk = `data: ${stringifiedChunk}\n\n`;
          transformController.enqueue(encoder.encode(formattedChunk));
          // Send [DONE] to signal the end of the stream
          const doneData = encoder.encode("data: [DONE]\n\n");
          transformController.enqueue(doneData);
          isResponseEnded = true;
        }
      } else if (chunkObj.event === "error") {
        console.error(`Error: ${chunkObj.code}, ${chunkObj.message}`);
        if (!isResponseEnded) {
          const errorChunk = `data: ${JSON.stringify({ error: chunkObj.message })}\n\n`;
          transformController.enqueue(encoder.encode(errorChunk));
          const doneData = encoder.encode("data: [DONE]\n\n");
          transformController.enqueue(doneData);
          isResponseEnded = true;
        }
      }
      // Silently skip agent_thought and ping events
    }
    buffer = lines[lines.length - 1];
  }
}

/**
 * Process Dify API non-streaming response
 */
async function handleNonStreamingResponse(responseBody, model, outputVariable) {
  let result = "";
  let usageData = null;
  let hasError = false;
  let messageEnded = false;
  let buffer = "";
  let skipWorkflowFinished = false;
  const reader = responseBody.getReader();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += new TextDecoder().decode(value);
    const lines = buffer.split("\n");
    for (let i = 0; i < lines.length - 1; i++) {
      const line = lines[i].trim();
      if (line === "") continue;
      let chunkObj;
      try {
        const cleanedLine = line.replace(/^data: /, "").trim();
        if (cleanedLine.startsWith("{") && cleanedLine.endsWith("}")) {
          chunkObj = JSON.parse(cleanedLine);
        } else {
          continue;
        }
      } catch (error) {
        console.error("Error parsing JSON:", error);
        continue;
      }
      if (chunkObj.event === "message" || chunkObj.event === "agent_message") {
        result += chunkObj.answer;
        skipWorkflowFinished = true;
      } else if (chunkObj.event === "message_end") {
        messageEnded = true;
        usageData = {
          prompt_tokens: chunkObj.metadata?.usage?.prompt_tokens || 100,
          completion_tokens: chunkObj.metadata?.usage?.completion_tokens || 10,
          total_tokens: chunkObj.metadata?.usage?.total_tokens || 110,
        };
      } else if (chunkObj.event === "workflow_finished" && !skipWorkflowFinished) {
        messageEnded = true;
        const outputs = chunkObj.data.outputs;
        if (outputVariable) {
          result = outputs[outputVariable];
        } else {
          result = outputs;
        }
        result = String(result);
        usageData = {
          prompt_tokens: chunkObj.metadata?.usage?.prompt_tokens || 100,
          completion_tokens: chunkObj.metadata?.usage?.completion_tokens || 10,
          total_tokens: chunkObj.data?.total_tokens || 110,
        };
      } else if (chunkObj.event === "error") {
        console.error(`Error: ${chunkObj.code}, ${chunkObj.message}`);
        hasError = true;
        break;
      }
      // Silently skip agent_thought and ping events
    }
    buffer = lines[lines.length - 1];
  }

  if (hasError) {
    return { error: "An error occurred while processing the request.", status: 500 };
  }
  if (messageEnded) {
    return {
      response: {
        id: `chatcmpl-${generateId()}`,
        object: "chat.completion",
        created: Math.floor(Date.now() / 1000),
        model: model,
        choices: [
          {
            index: 0,
            message: { role: "assistant", content: result.trim() },
            logprobs: null,
            finish_reason: "stop",
          },
        ],
        usage: usageData,
        system_fingerprint: "fp_2f57f81c11",
      },
      status: 200
    };
  }
  return { error: "Unexpected end of stream.", status: 500 };
}

/**
 * Main request handler
 */
async function handleRequest(request) {
  try {
    // Handle CORS preflight requests
    if (request.method === "OPTIONS") {
      return new Response(null, {
        status: 204,
        headers: {
          "Access-Control-Allow-Origin": "*",
          "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
          "Access-Control-Allow-Headers": "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization",
          "Access-Control-Max-Age": "86400",
        }
      });
    }

    // Home page for health check
    const url = new URL(request.url);
    if (request.method === "GET" && url.pathname === "/") {
      return new Response(`
        <html>
          <head><title>DIFY2OPENAI</title></head>
          <body>
            <h1>Dify2OpenAI</h1>
            <p>Congratulations! Your project has been successfully deployed.</p>
          </body>
        </html>
      `, {
        headers: {
          "Content-Type": "text/html",
          "Access-Control-Allow-Origin": "*",
          "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
          "Access-Control-Allow-Headers": "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization"
        }
      });
    }

    // Validate API endpoint
    if (url.pathname !== "/v1/chat/completions") {
      return getResponse(JSON.stringify({ error: "Not Found" }), 404);
    }

    // Check for Authorization header
    const authHeader = request.headers.get("authorization") || request.headers.get("Authorization");
    if (!authHeader || !authHeader.startsWith("Bearer ")) {
      return getResponse(JSON.stringify({
        code: 401,
        errmsg: "Unauthorized. Missing or invalid Authorization header.",
      }), 401);
    }

    // Parse request body
    const data = await request.json();
    const messages = data.messages || [];
    const stream = data.stream !== undefined ? data.stream : false;
    const model = data.model || "gpt-3.5-turbo"; // Default model identifier

    // Format Dify query from messages
    const queryString = formatDifyQuery(messages, BOT_TYPE);

    // Build Dify API request body
    const requestBody = buildDifyRequestBody(queryString, INPUT_VARIABLE);
    console.log("Sending request to Dify API:", DIFY_API_URL + API_PATH);

    // Make request to Dify API - passing through the same Authorization header
    const difyResponse = await fetch(DIFY_API_URL + API_PATH, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: authHeader, // Use the same Authorization header received from the client
      },
      body: JSON.stringify(requestBody),
    });

    if (!difyResponse.ok) {
      const errorText = await difyResponse.text();
      return getResponse(JSON.stringify({
        error: `Error from Dify API: ${difyResponse.status} ${difyResponse.statusText}`,
        details: errorText
      }), difyResponse.status);
    }

    // Handle streaming response
    if (stream) {
      const { readable, writable } = new TransformStream();
      const writer = writable.getWriter();
      // Create a transformer controller for handling the stream
      const transformController = {
        enqueue: (chunk) => writer.write(chunk),
        close: () => writer.close(),
        error: (err) => writer.abort(err)
      };
      // Process the streaming response in background
      handleStreamingResponse(difyResponse.body, model, transformController)
        .catch((error) => {
          console.error("Error processing stream:", error);
          writer.abort(error);
        })
        .finally(() => {
          writer.close();
        });
      // Return the readable stream with appropriate headers
      return new Response(readable, {
        headers: {
          "Content-Type": "text/event-stream",
          "Cache-Control": "no-cache",
          "Connection": "keep-alive",
          "Access-Control-Allow-Origin": "*",
          "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
          "Access-Control-Allow-Headers": "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization",
        }
      });
    }
    // Handle non-streaming response
    else {
      const result = await handleNonStreamingResponse(difyResponse.body, model, OUTPUT_VARIABLE);
      if (result.error) {
        return getResponse(JSON.stringify({ error: result.error }), result.status);
      }
      return getResponse(JSON.stringify(result.response), 200);
    }
  } catch (error) {
    console.error("Error:", error.message, error.stack);
    return getResponse(JSON.stringify({ error: `Error processing request: ${error.message}` }), 500);
  }
}

// Register the event listener
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request));
});
```
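One detail worth calling out in V1 is how it packs multi-turn history into a single Dify query string, since a Dify chat app accepts one query per call (conversation memory otherwise lives on the Dify side). A standalone copy of that transform, for illustration:

```javascript
// Flatten an OpenAI-style messages array into the single query string V1 sends to Dify.
function formatDifyQuery(messages, botType) {
  if (botType === "Chat") {
    const lastMessage = messages[messages.length - 1];
    // Earlier turns become a quoted "talk history" block; the newest message is the question.
    return `here is our talk history:\n'''\n${messages
      .slice(0, -1)
      .map((m) => `${m.role}: ${m.content}`)
      .join("\n")}\n'''\n\nhere is my question:\n${lastMessage.content}`;
  }
  // Completion and Workflow bots only receive the last message
  return messages[messages.length - 1].content;
}

const query = formatDifyQuery(
  [
    { role: "user", content: "Hi" },
    { role: "assistant", content: "Hello" },
    { role: "user", content: "What is 2+2?" },
  ],
  "Chat"
);
// query quotes the first two turns as history and ends with the newest message
```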
V2: chat only, multiple models; by creating one Dify application per model and mapping each requested model name to a different Dify API key, requests are routed to different Dify backends.
```javascript
// 1. Your OpenAI-style API key, used to validate incoming requests
const API_KEY = "your-api-key-here"; // Must match the request header, otherwise the request is rejected

// 2. Mapping from model names to Dify API keys
const CUSTOMER_MODEL_MAP = {
  "gpt-4o-mini": "Dify_API_KEY_01", // Your actual Dify API key
  "chatgpt-4o-latest": "Dify_API_KEY_02"
};

// 3. Dify API base URL
const DIFY_API_URL = "https://api.dify.ai/v1";
const BOT_TYPE = "Chat";
const INPUT_VARIABLE = "";  // Fill in if your app is configured with one
const OUTPUT_VARIABLE = ""; // Fill in if your app is configured with one
const USER = "default-user"; // The user field sent in Dify API requests

let API_PATH;
switch (BOT_TYPE) {
  case "Chat":
    API_PATH = "/chat-messages";
    break;
  case "Completion":
    API_PATH = "/completion-messages";
    break;
  case "Workflow":
    API_PATH = "/workflows/run";
    break;
  default:
    API_PATH = "/chat-messages";
}

// Random ID generator
function generateId() {
  let result = "";
  const chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
  for (let i = 0; i < 29; i++) result += chars.charAt(Math.floor(Math.random() * chars.length));
  return result;
}

function getCorsHeaders() {
  return {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
    "Access-Control-Allow-Headers": "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization",
    "Access-Control-Max-Age": "86400",
  };
}

function getResponse(body, status = 200, headers = {}) {
  return new Response(body, {
    status,
    headers: { "Content-Type": "application/json", ...getCorsHeaders(), ...headers }
  });
}

function formatDifyQuery(messages, botType) {
  if (botType === "Chat") {
    const lastMessage = messages[messages.length - 1];
    return `here is our talk history:\n'''\n${messages
      .slice(0, -1)
      .map((m) => `${m.role}: ${m.content}`)
      .join('\n')}\n'''\n\nhere is my question:\n${lastMessage.content}`;
  } else if (botType === "Completion" || botType === "Workflow") {
    return messages[messages.length - 1].content;
  }
  return "";
}

function buildDifyRequestBody(queryString, inputVariable) {
  if (inputVariable) {
    return {
      inputs: { [inputVariable]: queryString },
      response_mode: "streaming",
      conversation_id: "",
      user: USER,
      auto_generate_name: false
    };
  } else {
    return {
      inputs: {},
      query: queryString,
      response_mode: "streaming",
      conversation_id: "",
      user: USER,
      auto_generate_name: false
    };
  }
}

/**
 * Main handler
 */
async function handleRequest(request) {
  // CORS preflight
  if (request.method === "OPTIONS") {
    return new Response(null, { status: 204, headers: getCorsHeaders() });
  }

  // Health-check page
  const url = new URL(request.url);
  if (request.method === "GET" && url.pathname === "/") {
    return new Response(
      `<html><head><title>DIFY2OPENAI</title></head><body><h1>Dify2OpenAI</h1><p>Congratulations! Your project has been successfully deployed.</p></body></html>`,
      { headers: { "Content-Type": "text/html", ...getCorsHeaders() } }
    );
  }

  // Only allow POST to /v1/chat/completions
  if (url.pathname !== "/v1/chat/completions" || request.method !== "POST") {
    return getResponse(JSON.stringify({ error: "Not Found" }), 404);
  }

  // 1. Validate the OpenAI-style API key
  const authHeader = request.headers.get("authorization") || request.headers.get("Authorization");
  if (!authHeader || !authHeader.startsWith("Bearer ")) {
    return getResponse(JSON.stringify({ error: "Unauthorized. Missing Authorization header." }), 401);
  }
  const openaiKey = authHeader.split(" ")[1];
  if (openaiKey !== API_KEY) {
    return getResponse(JSON.stringify({ error: "Unauthorized. Invalid API key." }), 401);
  }

  // 2. Parse the request body and extract the model
  let data;
  try {
    data = await request.json();
  } catch (e) {
    return getResponse(JSON.stringify({ error: "Invalid JSON" }), 400);
  }
  const model = data.model;
  if (!model || !(model in CUSTOMER_MODEL_MAP)) {
    return getResponse(JSON.stringify({ error: "Unsupported or missing model." }), 400);
  }
  const difyApiKey = CUSTOMER_MODEL_MAP[model];
  const messages = data.messages || [];
  const stream = data.stream !== undefined ? data.stream : false;

  // 3. Convert to Dify format
  const queryString = formatDifyQuery(messages, BOT_TYPE);
  const requestBody = buildDifyRequestBody(queryString, INPUT_VARIABLE);

  // 4. Proxy to Dify using the Dify API key mapped to this model
  const difyResp = await fetch(DIFY_API_URL + API_PATH, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${difyApiKey}`,
    },
    body: JSON.stringify(requestBody),
  });
  if (!difyResp.ok) {
    const errorText = await difyResp.text();
    return getResponse(JSON.stringify({
      error: `Error from Dify API: ${difyResp.status} ${difyResp.statusText}`,
      details: errorText
    }), difyResp.status);
  }

  // 5. Streaming passthrough
  if (stream) {
    const { readable, writable } = new TransformStream();
    const writer = writable.getWriter();
    const reader = difyResp.body.getReader();
    const encoder = new TextEncoder();
    let buffer = "";
    let isFirstChunk = true;
    let isResponseEnded = false;

    async function processStream() {
      while (true) {
        const { done, value } = await reader.read();
        if (done) {
          if (!isResponseEnded) writer.write(encoder.encode("data: [DONE]\n\n"));
          break;
        }
        buffer += new TextDecoder().decode(value);
        const lines = buffer.split("\n");
        for (let i = 0; i < lines.length - 1; i++) {
          let line = lines[i].trim();
          if (!line.startsWith("data:")) continue;
          line = line.slice(5).trim();
          let chunkObj;
          try {
            if (line.startsWith("{")) chunkObj = JSON.parse(line);
            else continue;
          } catch {
            continue;
          }
          if (chunkObj.event === "message" || chunkObj.event === "agent_message" || chunkObj.event === "text_chunk") {
            let chunkContent = (chunkObj.event === "text_chunk") ? chunkObj.data.text : chunkObj.answer;
            if (isFirstChunk) {
              chunkContent = chunkContent.trimStart();
              isFirstChunk = false;
            }
            if (chunkContent && !isResponseEnded) {
              const chunk = {
                id: `chatcmpl-${Date.now()}`,
                object: "chat.completion.chunk",
                created: chunkObj.created_at,
                model: model,
                choices: [{ index: 0, delta: { content: chunkContent }, finish_reason: null }]
              };
              writer.write(encoder.encode("data: " + JSON.stringify(chunk) + "\n\n"));
            }
          } else if (chunkObj.event === "workflow_finished" || chunkObj.event === "message_end") {
            if (!isResponseEnded) {
              const chunk = {
                id: `chatcmpl-${Date.now()}`,
                object: "chat.completion.chunk",
                created: chunkObj.created_at,
                model: model,
                choices: [{ index: 0, delta: {}, finish_reason: "stop" }]
              };
              writer.write(encoder.encode("data: " + JSON.stringify(chunk) + "\n\n"));
              writer.write(encoder.encode("data: [DONE]\n\n"));
              isResponseEnded = true;
            }
          } else if (chunkObj.event === "error") {
            writer.write(encoder.encode(`data: ${JSON.stringify({ error: chunkObj.message })}\n\n`));
            writer.write(encoder.encode("data: [DONE]\n\n"));
            isResponseEnded = true;
          }
        }
        buffer = lines[lines.length - 1];
      }
      writer.close();
    }
    processStream();
    return new Response(readable, {
      headers: { "Content-Type": "text/event-stream", ...getCorsHeaders() }
    });
  }

  // 6. Non-streaming
  let result = "";
  let usageData = "";
  let hasError = false;
  let messageEnded = false;
  let buffer = "";
  let skipWorkflowFinished = false;
  const reader = difyResp.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += new TextDecoder().decode(value);
    const lines = buffer.split("\n");
    for (let i = 0; i < lines.length - 1; i++) {
      const line = lines[i].trim();
      if (line === "") continue;
      let chunkObj;
      try {
        const cleanedLine = line.replace(/^data: /, "").trim();
        if (cleanedLine.startsWith("{") && cleanedLine.endsWith("}")) chunkObj = JSON.parse(cleanedLine);
        else continue;
      } catch {
        continue;
      }
      if (chunkObj.event === "message" || chunkObj.event === "agent_message") {
        result += chunkObj.answer;
        skipWorkflowFinished = true;
      } else if (chunkObj.event === "message_end") {
        messageEnded = true;
        usageData = {
          prompt_tokens: chunkObj.metadata?.usage?.prompt_tokens || 100,
          completion_tokens: chunkObj.metadata?.usage?.completion_tokens || 10,
          total_tokens: chunkObj.metadata?.usage?.total_tokens || 110,
        };
      } else if (chunkObj.event === "workflow_finished" && !skipWorkflowFinished) {
        messageEnded = true;
        const outputs = chunkObj.data.outputs;
        result = OUTPUT_VARIABLE ? outputs[OUTPUT_VARIABLE] : outputs;
        result = String(result);
        usageData = {
          prompt_tokens: chunkObj.metadata?.usage?.prompt_tokens || 100,
          completion_tokens: chunkObj.metadata?.usage?.completion_tokens || 10,
          total_tokens: chunkObj.data?.total_tokens || 110,
        };
      } else if (chunkObj.event === "error") {
        hasError = true;
        break;
      }
    }
    buffer = lines[lines.length - 1];
  }
  if (hasError) {
    return getResponse(JSON.stringify({ error: "An error occurred while processing the request." }), 500);
  }
  if (messageEnded) {
    return getResponse(JSON.stringify({
      id: `chatcmpl-${generateId()}`,
      object: "chat.completion",
      created: Math.floor(Date.now() / 1000),
      model: model,
      choices: [{ index: 0, message: { role: "assistant", content: result.trim() }, logprobs: null, finish_reason: "stop" }],
      usage: usageData,
      system_fingerprint: "fp_2f57f81c11"
    }), 200);
  }
  return getResponse(JSON.stringify({ error: "Unexpected end of stream." }), 500);
}

addEventListener('fetch', event => event.respondWith(handleRequest(event.request)));
```
JavaScript
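The heart of the proxy above is the mapping from Dify SSE events to OpenAI-style `chat.completion.chunk` objects. Isolated as a standalone sketch (the function name and return shape are illustrative, not part of the worker itself), the same logic looks like this:

```javascript
// Sketch: convert one parsed Dify SSE event into an OpenAI-style
// chat.completion.chunk, or return null for events that carry no
// forwardable content (e.g. "ping"). Event names follow the Dify API.
function difyEventToOpenAIChunk(chunkObj, model) {
  const isContent = ["message", "agent_message", "text_chunk"].includes(chunkObj.event);
  const isStop = ["workflow_finished", "message_end"].includes(chunkObj.event);
  if (!isContent && !isStop) return null;
  // text_chunk events carry text under data.text; message events under answer
  const content = chunkObj.event === "text_chunk" ? chunkObj.data.text : chunkObj.answer;
  return {
    id: `chatcmpl-${Date.now()}`,
    object: "chat.completion.chunk",
    created: chunkObj.created_at,
    model,
    choices: [{
      index: 0,
      delta: isStop ? {} : { content },
      finish_reason: isStop ? "stop" : null
    }]
  };
}
```

Keeping this mapping pure makes it easy to unit-test without standing up a worker or a Dify instance.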
四, Dify and FastGPT: Comparison and Characteristics
1, Similarities and Differences
(1) Similarities:
- Open source: both are open-source projects, so developers can freely use them and contribute code
- AI application development: both platforms aim to simplify building and deploying AI applications
- LLM support: both support building applications on top of large language models (LLMs)
- API support: both Dify and FastGPT expose API endpoints for integration with other systems
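On the API point: Dify's native chat endpoint is `POST /v1/chat-messages`, so an OpenAI-to-Dify bridge (like the worker above) has to translate between the two request shapes. A minimal, hypothetical sketch of that translation (the helper name and the history-flattening strategy are illustrative, not taken from the worker):

```javascript
// Hypothetical helper: flatten an OpenAI-style messages array into the
// { inputs, query, response_mode, user } body that Dify's
// /v1/chat-messages endpoint expects. Earlier turns are folded into the
// query text; real bridges may instead track a Dify conversation_id.
function buildDifyRequestBody(messages, stream) {
  const history = messages
    .slice(0, -1)
    .map(m => `${m.role}: ${m.content}`)
    .join("\n");
  const last = messages[messages.length - 1];
  return {
    inputs: {},
    query: history ? `${history}\nuser: ${last.content}` : last.content,
    response_mode: stream ? "streaming" : "blocking",
    user: "openai-proxy-user",
  };
}
```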
(2) Differences:
- Focus:
  - Dify emphasizes comprehensive LLMOps capabilities, including data management and model fine-tuning
  - FastGPT focuses on fast response times and efficient information retrieval
- User interface:
  - Dify offers a more intuitive, visual prompt-orchestration interface
  - FastGPT's interface is comparatively simple, prioritizing speed and efficiency
- Data processing:
  - Dify ships more powerful data collection and preprocessing tools
  - FastGPT is less comprehensive in data processing, but performs better at RAG retrieval
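"RAG retrieval" here means ranking knowledge-base chunks by vector similarity to the query and feeding the top hits to the LLM as context. A toy sketch of that ranking step (the 3-dimensional vectors are purely illustrative; real systems use learned embeddings and a vector database rather than a linear scan):

```javascript
// Cosine similarity between two equal-length vectors.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank stored chunks by similarity to the query embedding, keep the top k.
function topK(queryVec, chunks, k) {
  return chunks
    .map(c => ({ ...c, score: cosine(queryVec, c.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

How well a platform chunks documents, embeds them, and prunes low-scoring hits is what the "RAG retrieval quality" comparison above is really about.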
2, Advantages of Dify
- Comprehensive LLMOps: a complete toolchain from data preparation to model deployment.
- User-friendly: an intuitive interface lets non-technical users take part in building AI applications.
- Strong data management: robust tools for data collection and preprocessing.
- Enterprise-grade support: features suited to enterprise use, such as access control and monitoring.
- Flexibility: supports multiple models and custom plugins.
3, Advantages of FastGPT
- Speed: very fast response times, well suited to applications that need real-time interaction.
- Efficient RAG retrieval: excels at information retrieval.
- Resource efficiency: designed for high performance with low computational overhead.
- Adaptability: easy to adapt to a wide range of domains and tasks.