# D2O

Use Dify on your favorite OpenAI client.

This project converts the Dify API to the OpenAI API format, giving you access to Dify's LLMs, knowledge base, tools, and workflows within your preferred OpenAI clients.

## Features

- Converts the Dify API into an OpenAI-compatible API
- Supports streaming and blocking responses
- Supports the Chat, Completion, Agent, and Workflow bot APIs on Dify

## Deployment

### Zeabur

Deploy on Zeabur

### Vercel

Deploy with Vercel

Note: Vercel's serverless functions have a 10-second timeout limit.

### Local Deployment

1. Set the environment variable in the `.env` file:

   ```
   DIFY_API_URL=https://api.dify.ai/v1
   ```

2. Install dependencies:

   ```bash
   pnpm install
   ```

3. Run the project:

   ```bash
   pnpm start
   ```
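Since `BOT_TYPE` is also required (see the Environment Variable section below), a minimal `.env` will usually contain at least these two entries; the values here are illustrative:

```
DIFY_API_URL=https://api.dify.ai/v1
BOT_TYPE=Chat
```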

## Usage

1. OpenAI Clients

   botgem

2. Code
```js
const response = await fetch('http://localhost:3000/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer YOUR_DIFY_API_KEY',
  },
  body: JSON.stringify({
    model: 'dify',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Hello, how are you?' },
    ],
  }),
});

const data = await response.json();
console.log(data);
```
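Because the proxy speaks the OpenAI API format, you can also point an OpenAI SDK at it. Below is a minimal sketch using the official `openai` npm package; the base URL and model name are assumptions based on the local deployment defaults above, and streaming is included because the proxy supports both streaming and blocking modes:

```js
import OpenAI from 'openai';

// Point the official OpenAI SDK at the local D2O proxy instead of api.openai.com.
// The base URL assumes the default local deployment on port 3000.
const client = new OpenAI({
  apiKey: 'YOUR_DIFY_API_KEY',          // your Dify app API key
  baseURL: 'http://localhost:3000/v1',  // the D2O endpoint
});

// Blocking request
const completion = await client.chat.completions.create({
  model: 'dify', // or whatever you set in MODELS_NAME
  messages: [{ role: 'user', content: 'Hello, how are you?' }],
});
console.log(completion.choices[0].message.content);

// Streaming request
const stream = await client.chat.completions.create({
  model: 'dify',
  messages: [{ role: 'user', content: 'Tell me a joke.' }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```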

## Docker Deployment

- Build the image:

  ```bash
  docker build -t dify2openai:latest .
  ```

- Run the container:

  ```bash
  docker run -d --name dify2openai \
      --network bridge \
      -p 3000:3000 \
      -e DIFY_API_URL=https://api.dify.ai/v1 \
      -e BOT_TYPE=Chat \
      --restart always \
      dify2openai:latest
  ```
- You can also use Docker Compose to build the image and run the container:

  ```yaml
  version: '3.5'
  services:
    dify2openai:
      container_name: dify2openai
      build:
        context: .
        dockerfile: Dockerfile
      network_mode: bridge
      ports:
        - "3000:3000"
      restart: always
      environment:
        - DIFY_API_URL=https://api.dify.ai/v1
        - BOT_TYPE=Chat
  ```
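With that compose file in the project root, a standard Docker Compose invocation builds the image and starts the container:

```bash
docker compose up -d --build
```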

Please change the environment variables according to your needs. See the Environment Variable section below for more information.

## Environment Variable

This project provides some additional configuration options, set via environment variables:

| Environment Variable | Required | Description | Example |
| --- | --- | --- | --- |
| `DIFY_API_URL` | Yes | Your Dify API URL; change it if you self-host Dify | `https://api.dify.ai/v1` |
| `BOT_TYPE` | Yes | The type of your Dify bot | `Chat`, `Completion`, `Workflow` |
| `INPUT_VARIABLE` | No | The name of the input variable in your own Dify workflow bot | `query`, `text` |
| `OUTPUT_VARIABLE` | No | The name of the output variable in your own Dify workflow bot | `text` |
| `MODELS_NAME` | No | The model name reported by the `/v1/models` endpoint; defaults to `dify` | `dify` |
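For a Workflow bot, a hypothetical `.env` built from the example values in the table above might look like this:

```
DIFY_API_URL=https://api.dify.ai/v1
BOT_TYPE=Workflow
INPUT_VARIABLE=query
OUTPUT_VARIABLE=text
MODELS_NAME=dify
```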

## Roadmap

### Coming Soon

- Image support
- Audio-to-text
- Text-to-audio
- Docker support

### Available Now

- Workflow Bot
- Variables support
- Continuous dialogue
- Zeabur & Vercel deployment
- Streaming & Blocking
- Agent & Chat bots

## Contact

Feel free to reach out with any questions or feedback.

- X
- Telegram
- Buy Me A Coffee

## License

This project is licensed under the MIT License.