Top Related Projects
- agentic: AI agent stdlib that works with any LLM and TypeScript AI SDK.
- node-chatgpt-api: A client implementation for ChatGPT and Bing AI. Available as a Node.js module, REST API server, and CLI app.
- ChatGPT: OpenAI API Free Reverse Proxy
- wechat-chatgpt: Use ChatGPT On Wechat via wechaty
Quick Overview
x-dr/chatgptProxyAPI is a GitHub repository that provides a proxy API for ChatGPT. It lets users reach ChatGPT through a self-hosted endpoint, potentially working around rate limits or regional access restrictions, and aims to simplify integrating ChatGPT into other applications and services.
Pros
- Enables easier access to ChatGPT functionality
- Potentially bypasses rate limits or access restrictions
- Simplifies integration of ChatGPT into custom applications
- Provides a layer of abstraction between the user's application and OpenAI's API
Cons
- May violate OpenAI's terms of service
- Potential for misuse or abuse of the ChatGPT API
- Depends on the stability and availability of OpenAI's services
- May require frequent updates to maintain compatibility with OpenAI's API changes
Code Examples
# Example 1: Making a simple request to the proxy API
import requests

response = requests.post('http://your-proxy-url/api/chat', json={
    'message': 'Hello, ChatGPT!',
    'model': 'gpt-3.5-turbo'
})
print(response.json())
# Example 2: Streaming responses from the proxy API
import requests

with requests.post('http://your-proxy-url/api/chat/stream', json={
    'message': 'Tell me a story',
    'model': 'gpt-3.5-turbo'
}, stream=True) as response:
    for chunk in response.iter_content(chunk_size=None):
        print(chunk.decode('utf-8'), end='', flush=True)
# Example 3: Using the proxy API with custom parameters
import requests

response = requests.post('http://your-proxy-url/api/chat', json={
    'message': 'Translate this to French: Hello, world!',
    'model': 'gpt-3.5-turbo',
    'temperature': 0.7,
    'max_tokens': 50
})
print(response.json())
Getting Started
1. Clone the repository:
git clone https://github.com/x-dr/chatgptProxyAPI.git
2. Install dependencies:
cd chatgptProxyAPI
pip install -r requirements.txt
3. Set up your OpenAI API key as an environment variable:
export OPENAI_API_KEY=your_api_key_here
4. Run the proxy server:
python app.py
5. Use the proxy API in your application by sending requests to http://localhost:5000/api/chat, or to http://localhost:5000/api/chat/stream for streaming responses.
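Once the server is running, a quick request confirms the proxy is reachable. A minimal check with JavaScript's fetch, assuming the endpoint path, port, and JSON shape described in the steps above:

// Quick sanity check against the locally running proxy.
// The /api/chat path, port 5000, and payload fields follow the steps
// above; adjust them if your deployment differs.
fetch('http://localhost:5000/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ message: 'Hello, ChatGPT!', model: 'gpt-3.5-turbo' })
})
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => console.error('Request failed:', err));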
Competitor Comparisons
AI agent stdlib that works with any LLM and TypeScript AI SDK.
Pros of agentic
- More comprehensive framework for building AI agents
- Supports multiple AI models and providers beyond just ChatGPT
- Offers advanced features like memory, planning, and tool use
Cons of agentic
- More complex setup and usage compared to simpler proxy API
- Potentially higher resource requirements
- May be overkill for basic ChatGPT API access needs
Code Comparison
chatgptProxyAPI:
from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

@app.route('/v1/chat/completions', methods=['POST'])
def chat_completions():
    # Proxy logic here
agentic:
from agentic import Agent, Task
agent = Agent()
task = Task("Summarize this article")
result = agent.run(task)
print(result)
The chatgptProxyAPI focuses on providing a simple proxy for ChatGPT API requests, while agentic offers a more robust framework for creating AI agents with advanced capabilities. chatgptProxyAPI is likely easier to set up and use for basic ChatGPT API access, but agentic provides more flexibility and features for complex AI agent development.
A client implementation for ChatGPT and Bing AI. Available as a Node.js module, REST API server, and CLI app.
Pros of node-chatgpt-api
- More comprehensive API support, including official ChatGPT API and Bing Conversation
- Better documentation and examples for usage
- Active development with frequent updates and bug fixes
Cons of node-chatgpt-api
- More complex setup and configuration
- Larger codebase, potentially harder to maintain
- Requires more dependencies
Code Comparison
chatgptProxyAPI:
app.post('/chat', async (req, res) => {
  const { message } = req.body;
  const response = await chatgpt.sendMessage(message);
  res.json({ reply: response });
});
node-chatgpt-api:
const chatGPTClient = new ChatGPTClient(apiKey, options);
const response = await chatGPTClient.sendMessage('Hello, how are you?', {
  conversationId: '...',
  parentMessageId: '...',
});
console.log(response);
The code comparison shows that node-chatgpt-api offers more flexibility and options for message handling, while chatgptProxyAPI provides a simpler, more straightforward implementation. node-chatgpt-api allows for conversation tracking and advanced options, which may be beneficial for more complex applications.
OpenAI API Free Reverse Proxy
Pros of ChatGPT
- More comprehensive documentation and setup instructions
- Supports multiple AI models beyond just ChatGPT
- Includes additional features like conversation management and streaming responses
Cons of ChatGPT
- More complex setup and configuration required
- Potentially higher resource usage due to additional features
- May be overkill for simple proxy use cases
Code Comparison
chatgptProxyAPI:
app.post('/v1/chat/completions', async (req, res) => {
  const { messages } = req.body;
  const response = await fetch(API_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', 'Authorization': `Bearer ${API_KEY}` },
    body: JSON.stringify({ messages, model: 'gpt-3.5-turbo' })
  });
  const data = await response.json();
  res.json(data);
});
ChatGPT:
app.post('/chat', async (req, res) => {
  const { message, conversationId, parentMessageId } = req.body;
  const response = await api.sendMessage(message, {
    conversationId,
    parentMessageId,
    onProgress: (partialResponse) => res.write(JSON.stringify(partialResponse))
  });
  res.end(JSON.stringify(response));
});
The code comparison shows that ChatGPT offers more advanced features like conversation management and streaming responses, while chatgptProxyAPI provides a simpler, more direct proxy to the OpenAI API.
Use ChatGPT On Wechat via wechaty
Pros of wechat-chatgpt
- Integrates ChatGPT directly with WeChat, allowing for seamless interaction
- Supports multiple ChatGPT accounts for load balancing and redundancy
- Includes features like message history and conversation context management
Cons of wechat-chatgpt
- More complex setup due to WeChat integration requirements
- Limited to WeChat platform, less flexible for other use cases
- May require frequent updates to maintain compatibility with WeChat API changes
Code Comparison
wechat-chatgpt:
const { WechatyBuilder } = require('wechaty');
const qrcodeTerminal = require('qrcode-terminal');
const bot = WechatyBuilder.build();
bot.on('scan', (qrcode, status) => {
  qrcodeTerminal.generate(qrcode, { small: true });
});
chatgptProxyAPI:
const express = require('express');
const { Configuration, OpenAIApi } = require('openai');
const app = express();
app.use(express.json());
const configuration = new Configuration({ apiKey: process.env.OPENAI_API_KEY });
The code snippets highlight the different focus of each project. wechat-chatgpt is centered around WeChat integration, while chatgptProxyAPI provides a more general-purpose API proxy for OpenAI services. chatgptProxyAPI offers greater flexibility for various applications but lacks the specific WeChat integration features of wechat-chatgpt.
README
Note: in testing, access from some IPs displays the page shown in the README's screenshot (omitted here).
Demo
API:
https://openai.451024.xyz
https://openai-proxy-api.pages.dev/api
New project: a WeChat bot based on OpenAI.
The demo site is a public service; if you plan large-scale use, please deploy your own instance, as the demo site is somewhat overloaded.
Relaying api.openai.com with a Cloudflare Worker
- Create a new Cloudflare Worker.
- Copy the code from cf_worker.js into the Worker and deploy it (a sketch of what this code does follows below).
- Bind a domain that is not blocked to the Worker.
- Use your own domain in place of api.openai.com.
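The actual relay code lives in cf_worker.js in this repository. As a rough sketch of what such a pass-through Worker does (illustrative only, not the repository's exact code):

// Minimal pass-through Worker sketch (illustrative; see cf_worker.js for
// the real implementation). It rewrites the request host to
// api.openai.com and forwards everything else unchanged.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const url = new URL(request.url);
  url.host = 'api.openai.com';
  // new Request(url, request) keeps the original method, headers
  // (including Authorization), and body.
  return fetch(new Request(url.toString(), request));
}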
Relaying with Cloudflare Pages
1. Deploy the relay API plus an OpenAI API balance query (queried with a sess-xxxx Authorization token; how long such tokens remain valid is unknown)
- Fork this project: click the Use this template button to create a new repository.
- Log in to the Cloudflare dashboard.
- On the account home page, select Pages > Create a project > Connect to Git.
- Select the repository you forked; in the Set up builds and deployments section, the defaults are fine.
- Click Save and Deploy, then click Continue to project to see your access domain.
Replace the official endpoint https://api.openai.com with https://xxx.pages.dev.
Demo
2. Deploy only the relay API
Docker deployment (requires a VPS outside the firewall)
Note: SSE does not appear to be supported, so this option is not recommended.
e.g.
docker run -itd --name openaiproxy \
-p 3000:3000 \
--restart=always \
gindex/openaiproxy:latest
Usage
API endpoint: http://vpsip:3000/proxy/v1/chat/completions
curl --location 'http://vpsip:3000/proxy/v1/chat/completions' \
--header 'Authorization: Bearer sk-xxxxxxxxxxxxxxx' \
--header 'Content-Type: application/json' \
--data '{
"model": "gpt-3.5-turbo",
"messages": [{"role": "user", "content": "Hello!"}]
}'
Usage examples
JavaScript (fetch)
const requestOptions = {
  method: 'POST',
  headers: {
    "Authorization": "Bearer sk-xxxxxxxxxxxx",
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "hello world"
      }
    ]
  })
};

fetch("https://openai.1rmb.tk/v1/chat/completions", requestOptions)
  .then(response => response.text())
  .then(result => console.log(result))
  .catch(error => console.log('error', error));
Python
import requests
import json  # needed for json.JSONDecodeError below

url = "https://openai.1rmb.tk/v1/chat/completions"
api_key = 'sk-xxxxxxxxxxxxxxxxxxxx'

headers = {
    'Authorization': f'Bearer {api_key}',
    'Content-Type': 'application/json'
}

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "user",
            "content": "hello world"
        }
    ]
}

try:
    response = requests.post(url, headers=headers, json=payload)
    response.raise_for_status()  # raise an exception if the response status is not 2xx
    data = response.json()
    print(data)
except requests.exceptions.RequestException as e:
    print(f"Request error: {e}")
except json.JSONDecodeError as e:
    print(f"Invalid JSON response: {e}")
Node.js (chatgpt library)
transitive-bullshit/chatgpt-api
import { ChatGPTAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTAPI({
    apiKey: "sk-xxxxxxxxxxxxxx",
    // point the client at the proxy: proxy domain + /v1
    apiBaseUrl: "https://openai.1rmb.tk/v1"
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

example()
Query balance
// Note: this is a fragment; it assumes it runs inside an async function
// and that a formatDate helper is defined (a sketch follows below).
const headers = {
  'content-type': 'application/json',
  'Authorization': `Bearer sk-xxxxxxxxxxxxxxxxx`
}

// Check whether the account has a subscription
const subscription = await fetch("https://openai.1rmb.tk/v1/dashboard/billing/subscription", {
  method: 'get',
  headers: headers
})

if (!subscription.ok) {
  const data = await subscription.json()
  // console.log(data);
  return data
  // throw new Error('API request failed')
} else {
  const subscriptionData = await subscription.json()
  const endDate = subscriptionData.access_until
  const startDate = new Date(endDate - 90 * 24 * 60 * 60);
  console.log(formatDate(endDate, "YYYY-MM-DD"));
  console.log(formatDate(startDate, "YYYY-MM-DD"));
  const response = await fetch(`https://openai.1rmb.tk/v1/dashboard/billing/usage?start_date=${formatDate(startDate, "YYYY-MM-DD")}&end_date=${formatDate(endDate, "YYYY-MM-DD")}`, {
    method: 'get',
    headers: headers
  })
  const usageData = await response.json();
  // Account type
  const plan = subscriptionData.plan.id
  console.log(usageData);
}
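The snippet above calls a formatDate helper that the README never defines. A minimal sketch of such a helper, assuming it accepts a Date (or a value the Date constructor accepts) and that "YYYY-MM-DD" is the only pattern needed:

// Hypothetical helper; not part of the repository. Formats a Date (or a
// timestamp) as "YYYY-MM-DD".
function formatDate(date, pattern) {
  const d = date instanceof Date ? date : new Date(date);
  const pad = (n) => String(n).padStart(2, '0');
  if (pattern === "YYYY-MM-DD") {
    return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;
  }
  return d.toISOString(); // fallback for unrecognized patterns
}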
Star History