
x-dr/chatgptProxyAPI

🔥 Build a free OpenAI API proxy with Cloudflare to work around network access restrictions. Streaming output is supported.

2,969 stars · 673 forks

Top Related Projects

  • agentic (17,411 stars): AI agent stdlib that works with any LLM and TypeScript AI SDK.
  • node-chatgpt-api: A client implementation for ChatGPT and Bing AI, available as a Node.js module, REST API server, and CLI app.
  • ChatGPT (5,718 stars): OpenAI API Free Reverse Proxy.
  • wechat-chatgpt: Use ChatGPT on WeChat via wechaty.

Quick Overview

x-dr/chatgptProxyAPI is a GitHub repository that provides a proxy for the OpenAI API, deployed on Cloudflare Workers or Pages. It lets users reach ChatGPT through their own domain from networks where api.openai.com is blocked, and it simplifies integrating ChatGPT into applications and services.

Pros

  • Enables easier access to ChatGPT functionality
  • Helps work around regional network restrictions on api.openai.com
  • Simplifies integration of ChatGPT into custom applications
  • Provides a layer of abstraction between the user's application and OpenAI's API

Cons

  • May violate OpenAI's terms of service
  • Potential for misuse or abuse of the ChatGPT API
  • Depends on the stability and availability of OpenAI's services
  • May require frequent updates to maintain compatibility with OpenAI's API changes

Code Examples

The proxy exposes the same interface as the official OpenAI API, so the only change client code needs is the domain. In the examples below, https://your-proxy-domain stands in for the domain you deployed, and sk-xxxxxxxx for your OpenAI API key.

# Example 1: Making a simple request through the proxy
import requests

response = requests.post('https://your-proxy-domain/v1/chat/completions',
    headers={'Authorization': 'Bearer sk-xxxxxxxx'},
    json={
        'model': 'gpt-3.5-turbo',
        'messages': [{'role': 'user', 'content': 'Hello, ChatGPT!'}]
    })
print(response.json())

# Example 2: Streaming responses through the proxy
import requests

with requests.post('https://your-proxy-domain/v1/chat/completions',
        headers={'Authorization': 'Bearer sk-xxxxxxxx'},
        json={
            'model': 'gpt-3.5-turbo',
            'messages': [{'role': 'user', 'content': 'Tell me a story'}],
            'stream': True
        }, stream=True) as response:
    for line in response.iter_lines():
        if line:
            print(line.decode('utf-8'))

# Example 3: Passing custom sampling parameters
import requests

response = requests.post('https://your-proxy-domain/v1/chat/completions',
    headers={'Authorization': 'Bearer sk-xxxxxxxx'},
    json={
        'model': 'gpt-3.5-turbo',
        'messages': [{'role': 'user', 'content': 'Translate this to French: Hello, world!'}],
        'temperature': 0.7,
        'max_tokens': 50
    })
print(response.json())

Getting Started

  1. Create your own copy of the repository:

    git clone https://github.com/x-dr/chatgptProxyAPI.git

  2. Deploy it on Cloudflare: either create a Worker and paste in the code from cf_worker.js, or connect the repository to Cloudflare Pages (the README below walks through both options).

  3. Bind the deployment to a domain that is reachable from your network.

  4. Point your application at your own domain instead of https://api.openai.com; requests are forwarded to the official API unchanged.

Competitor Comparisons

agentic (17,411 stars): AI agent stdlib that works with any LLM and TypeScript AI SDK.

Pros of agentic

  • More comprehensive framework for building AI agents
  • Supports multiple AI models and providers beyond just ChatGPT
  • Offers advanced features like memory, planning, and tool use

Cons of agentic

  • More complex setup and usage compared to simpler proxy API
  • Potentially higher resource requirements
  • May be overkill for basic ChatGPT API access needs

Code Comparison

chatgptProxyAPI (illustrative sketch):

from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

@app.route('/v1/chat/completions', methods=['POST'])
def chat_completions():
    # Forward the request body to the upstream OpenAI API
    upstream = requests.post(
        'https://api.openai.com/v1/chat/completions',
        headers={'Authorization': request.headers.get('Authorization', '')},
        json=request.get_json())
    return jsonify(upstream.json())

agentic (illustrative sketch):

from agentic import Agent, Task

agent = Agent()
task = Task("Summarize this article")
result = agent.run(task)
print(result)

The chatgptProxyAPI focuses on providing a simple proxy for ChatGPT API requests, while agentic offers a more robust framework for creating AI agents with advanced capabilities. chatgptProxyAPI is likely easier to set up and use for basic ChatGPT API access, but agentic provides more flexibility and features for complex AI agent development.

node-chatgpt-api: A client implementation for ChatGPT and Bing AI, available as a Node.js module, REST API server, and CLI app.

Pros of node-chatgpt-api

  • More comprehensive API support, including official ChatGPT API and Bing Conversation
  • Better documentation and examples for usage
  • Active development with frequent updates and bug fixes

Cons of node-chatgpt-api

  • More complex setup and configuration
  • Larger codebase, potentially harder to maintain
  • Requires more dependencies

Code Comparison

chatgptProxyAPI:

app.post('/chat', async (req, res) => {
  const { message } = req.body;
  const response = await chatgpt.sendMessage(message);
  res.json({ reply: response });
});

node-chatgpt-api:

const chatGPTClient = new ChatGPTClient(apiKey, options);
const response = await chatGPTClient.sendMessage('Hello, how are you?', {
  conversationId: '...',
  parentMessageId: '...',
});
console.log(response);

The code comparison shows that node-chatgpt-api offers more flexibility and options for message handling, while chatgptProxyAPI provides a simpler, more straightforward implementation. node-chatgpt-api allows for conversation tracking and advanced options, which may be beneficial for more complex applications.

ChatGPT (5,718 stars): OpenAI API Free Reverse Proxy.

Pros of ChatGPT

  • More comprehensive documentation and setup instructions
  • Supports multiple AI models beyond just ChatGPT
  • Includes additional features like conversation management and streaming responses

Cons of ChatGPT

  • More complex setup and configuration required
  • Potentially higher resource usage due to additional features
  • May be overkill for simple proxy use cases

Code Comparison

chatgptProxyAPI:

app.post('/v1/chat/completions', async (req, res) => {
  const { messages } = req.body;
  const response = await fetch(API_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Authorization': `Bearer ${API_KEY}` },
    body: JSON.stringify({ messages, model: 'gpt-3.5-turbo' })
  });
  const data = await response.json();
  res.json(data);
});

ChatGPT:

app.post('/chat', async (req, res) => {
  const { message, conversationId, parentMessageId } = req.body;
  const response = await api.sendMessage(message, {
    conversationId,
    parentMessageId,
    onProgress: (partialResponse) => res.write(JSON.stringify(partialResponse))
  });
  res.end(JSON.stringify(response));
});

The code comparison shows that ChatGPT offers more advanced features like conversation management and streaming responses, while chatgptProxyAPI provides a simpler, more direct proxy to the OpenAI API.

wechat-chatgpt: Use ChatGPT on WeChat via wechaty.

Pros of wechat-chatgpt

  • Integrates ChatGPT directly with WeChat, allowing for seamless interaction
  • Supports multiple ChatGPT accounts for load balancing and redundancy
  • Includes features like message history and conversation context management

Cons of wechat-chatgpt

  • More complex setup due to WeChat integration requirements
  • Limited to WeChat platform, less flexible for other use cases
  • May require frequent updates to maintain compatibility with WeChat API changes

Code Comparison

wechat-chatgpt:

const { WechatyBuilder } = require('wechaty');
const qrcodeTerminal = require('qrcode-terminal');
const bot = WechatyBuilder.build();
bot.on('scan', (qrcode, status) => {
  qrcodeTerminal.generate(qrcode, { small: true });
});

chatgptProxyAPI:

const express = require('express');
const { Configuration, OpenAIApi } = require('openai');
const app = express();
app.use(express.json());
const configuration = new Configuration({ apiKey: process.env.OPENAI_API_KEY });

The code snippets highlight the different focus of each project. wechat-chatgpt is centered around WeChat integration, while chatgptProxyAPI provides a more general-purpose API proxy for OpenAI services. chatgptProxyAPI offers greater flexibility for various applications but lacks the specific WeChat integration features of wechat-chatgpt.


README

Testing shows that some IPs see the following page when visiting (screenshot omitted from this copy).


Demo

https://chatai.451024.xyz

API

https://openai.451024.xyz
https://openai-proxy-api.pages.dev/api

New project: a WeChat bot based on OpenAI


The demo site is a public service; if you need it at scale, please deploy your own instance, as the demo site is somewhat overloaded.

Worker

Relay api.openai.com through a Cloudflare Worker

  1. Create a new Cloudflare Worker
  2. Copy the code from cf_worker.js into the Worker and deploy it
  3. Bind a domain that is not blocked to the Worker
  4. Use your own domain in place of api.openai.com

Detailed tutorial
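The steps above deploy the repository's cf_worker.js. As a rough sketch of the idea (an assumption for illustration, not the project's actual code), a relay Worker only needs to rewrite the incoming request's hostname to api.openai.com and forward everything else unchanged:

```javascript
// Sketch of a relay Worker (illustrative, not the actual cf_worker.js):
// swap the incoming hostname for api.openai.com and forward the request.
const UPSTREAM_HOST = 'api.openai.com';

function toUpstreamUrl(requestUrl) {
  const url = new URL(requestUrl);
  url.hostname = UPSTREAM_HOST;
  return url.toString();
}

const worker = {
  async fetch(request) {
    // Method, headers, and body pass through untouched; only the host changes.
    return fetch(toUpstreamUrl(request.url), request);
  }
};

console.log(toUpstreamUrl('https://your-domain.example/v1/chat/completions'));
// → https://api.openai.com/v1/chat/completions
```

In a real Worker, `worker` would be the module's default export; the point is that the proxy is transparent, so clients keep the official paths and payloads.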


Relay through Cloudflare Pages

1. Deploy the relay API plus an OpenAI API balance query (queried with a sess-xxxx Authorization token; how long such tokens stay valid is unknown)

Official documentation

  1. Fork this project: click the Use this template button to create a new repository.

  2. Log in to the Cloudflare dashboard.

  3. From the account home page, select Pages > Create a project > Connect to Git.

  4. Select your forked repository; in the Set up builds and deployments section, the defaults are fine.

  5. Click Save and Deploy, then Continue to project to see the site's domain.

Replace the official endpoint https://api.openai.com with https://xxx.pages.dev.
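The substitution is purely mechanical: keep the path and query string, change only the origin. A small hypothetical helper (toProxyUrl is not part of this project) makes that concrete:

```javascript
// Hypothetical helper (not part of this project): rewrite an official
// OpenAI endpoint URL so it points at your Pages deployment instead.
function toProxyUrl(officialUrl, proxyBase) {
  const url = new URL(officialUrl);
  // Keep path and query string; replace the origin with the proxy's.
  return proxyBase.replace(/\/+$/, '') + url.pathname + url.search;
}

console.log(toProxyUrl('https://api.openai.com/v1/chat/completions', 'https://xxx.pages.dev'));
// → https://xxx.pages.dev/v1/chat/completions
```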

Demo

https://chatai.451024.xyz

Detailed tutorial

2. Deploy only the relay API

Detailed tutorial

Docker deployment (requires a VPS outside the restricted network)

SSE does not appear to be supported, so this method is not recommended.

e.g.

docker run -itd --name openaiproxy \
    -p 3000:3000 \
    --restart=always \
    gindex/openaiproxy:latest

Usage

API endpoint: http://vpsip:3000/proxy/v1/chat/completions

curl --location 'http://vpsip:3000/proxy/v1/chat/completions' \
--header 'Authorization: Bearer sk-xxxxxxxxxxxxxxx' \
--header 'Content-Type: application/json' \
--data '{
  "model": "gpt-3.5-turbo",
  "messages": [{"role": "user", "content": "Hello!"}]
}'

Usage examples

JavaScript with fetch
const requestOptions = {
    method: 'POST',
    headers: {
        "Authorization": "Bearer sk-xxxxxxxxxxxx",
        "Content-Type": "application/json"
    },
    body: JSON.stringify({
        "model": "gpt-3.5-turbo",
        "messages": [
            {
                "role": "user",
                "content": "hello world"
            }
        ]
    })
};

fetch("https://openai.1rmb.tk/v1/chat/completions", requestOptions)
    .then(response => response.text())
    .then(result => console.log(result))
    .catch(error => console.log('error', error));
  
Python
import json
import requests

url = "https://openai.1rmb.tk/v1/chat/completions"
api_key = 'sk-xxxxxxxxxxxxxxxxxxxx'

headers = {
  'Authorization': f'Bearer {api_key}',
  'Content-Type': 'application/json'
}

payload = {
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "hello world"
    }
  ]
}

try:
    response = requests.post(url, headers=headers, json=payload)
    response.raise_for_status()  # raise an exception if the status code is not 2xx
    data = response.json()
    print(data)
except requests.exceptions.RequestException as e:
    print(f"Request error: {e}")
except json.JSONDecodeError as e:
    print(f"Invalid JSON response: {e}")
Node.js with the chatgpt library

transitive-bullshit/chatgpt-api

import { ChatGPTAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTAPI({
    apiKey: 'sk-xxxxxxxxxxxxxx',
    // your proxy domain + /v1
    apiBaseUrl: 'https://openai.1rmb.tk/v1'
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

example()

Check balance
    // (excerpt from the body of an async function)
    const headers = {
      'content-type': 'application/json',
      'Authorization': `Bearer sk-xxxxxxxxxxxxxxxxx`
    }
    // Check whether the account has a subscription
    const subscription = await fetch("https://openai.1rmb.tk/v1/dashboard/billing/subscription", {
      method: 'get',
      headers: headers
    })
    if (!subscription.ok) {
      const data = await subscription.json()
      return data
    } else {
      const subscriptionData = await subscription.json()
      // access_until is a Unix timestamp in seconds
      const endDate = new Date(subscriptionData.access_until * 1000)
      const startDate = new Date(endDate.getTime() - 90 * 24 * 60 * 60 * 1000)
      console.log(formatDate(endDate, "YYYY-MM-DD"))
      console.log(formatDate(startDate, "YYYY-MM-DD"))
      const response = await fetch(`https://openai.1rmb.tk/v1/dashboard/billing/usage?start_date=${formatDate(startDate, "YYYY-MM-DD")}&end_date=${formatDate(endDate, "YYYY-MM-DD")}`, {
        method: 'get',
        headers: headers
      })

      const usageData = await response.json()
      // Account plan type
      const plan = subscriptionData.plan.id
      console.log(plan)
      console.log(usageData)
    }
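The snippet above calls a formatDate helper that is not shown. A minimal sketch of what such a helper might look like for the "YYYY-MM-DD" case used here (an assumption, not the repository's actual implementation):

```javascript
// Hypothetical helper: format a Date (or timestamp in ms) as YYYY-MM-DD.
// The balance-check snippet assumes something like this exists.
function formatDate(date, fmt) {
  const d = new Date(date);
  const pad = (n) => String(n).padStart(2, '0');
  if (fmt === 'YYYY-MM-DD') {
    return `${d.getUTCFullYear()}-${pad(d.getUTCMonth() + 1)}-${pad(d.getUTCDate())}`;
  }
  throw new Error(`unsupported format: ${fmt}`);
}

console.log(formatDate(Date.UTC(2023, 0, 15), 'YYYY-MM-DD')); // → 2023-01-15
```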

Star History

Star History Chart