Top Related Projects
- ChatGPT (lencx): 🔮 ChatGPT Desktop Application (Mac, Windows and Linux)
- agentic: AI agent stdlib that works with any LLM and TypeScript AI SDK.
- chatgpt-web: A ChatGPT demo web page built with Express and Vue 3
- ChuanhuChatGPT: GUI for ChatGPT API and many LLMs. Supports agents, file-based QA, GPT fine-tuning, and query with web search. All with a neat UI.
- NextChat: ✨ Light and Fast AI Assistant. Support: Web | iOS | MacOS | Android | Linux | Windows
Quick Overview
PawanOsman/ChatGPT is an unofficial ChatGPT API client for Node.js. It provides a simple way to interact with OpenAI's ChatGPT model, allowing developers to integrate conversational AI capabilities into their applications without the need for an official API key.
Pros
- Easy to use and integrate into existing Node.js projects
- Doesn't require an official OpenAI API key
- Supports both free and paid (Plus) ChatGPT accounts
- Includes features like conversation management and proxy support
Cons
- Unofficial API, which may be less stable or reliable than an official one
- Potential for breaking changes if OpenAI modifies their ChatGPT interface
- May violate OpenAI's terms of service
- Limited documentation and examples
Code Examples
- Creating a ChatGPT instance and sending a message:
const ChatGPT = require('chatgpt-official');

const chatGPT = new ChatGPT({
  email: 'your-email@example.com',
  password: 'your-password'
});

// inside an async function (or an ES module with top-level await):
const response = await chatGPT.sendMessage('Hello, how are you?');
console.log(response);
- Using a proxy server:
const chatGPT = new ChatGPT({
  email: 'your-email@example.com',
  password: 'your-password',
  proxy: 'http://proxy-server:8080'
});
- Continuing a conversation:
const initialResponse = await chatGPT.sendMessage('What is the capital of France?');
console.log(initialResponse);
const followUpResponse = await chatGPT.sendMessage('What is its population?', {
  conversationId: initialResponse.conversationId,
  parentMessageId: initialResponse.messageId
});
console.log(followUpResponse);
Getting Started
To use PawanOsman/ChatGPT in your Node.js project:
- Install the package:
npm install chatgpt-official
- Import and initialize the ChatGPT client:
const ChatGPT = require('chatgpt-official');

const chatGPT = new ChatGPT({
  email: 'your-email@example.com',
  password: 'your-password'
});
- Send a message and handle the response:
async function askChatGPT(question) {
  try {
    const response = await chatGPT.sendMessage(question);
    console.log(response);
  } catch (error) {
    console.error('Error:', error);
  }
}

askChatGPT('What is the meaning of life?');
Remember to replace 'your-email@example.com' and 'your-password' with your actual ChatGPT account credentials.
Competitor Comparisons
🔮 ChatGPT Desktop Application (Mac, Windows and Linux)
Pros of ChatGPT (lencx)
- Cross-platform desktop application (Windows, macOS, Linux)
- Feature-rich UI with customizable themes and prompts
- Supports multiple languages and text-to-speech functionality
Cons of ChatGPT (lencx)
- Larger file size due to being a full desktop application
- May require more system resources compared to web-based alternatives
- Less frequent updates compared to ChatGPT (PawanOsman)
Code Comparison
ChatGPT (lencx):
export const chatRoot = () => {
  return getPath('chat', 'index.html');
};
ChatGPT (PawanOsman):
app.post('/api/chat', async (req, res) => {
  const { message } = req.body;
  const response = await chatgpt.sendMessage(message);
  res.json({ response });
});
The code snippets show different approaches:
- ChatGPT (lencx) focuses on file path management for the desktop app
- ChatGPT (PawanOsman) implements a server-side API endpoint for chat functionality
Both repositories aim to provide ChatGPT functionality, but with different implementations:
- ChatGPT (lencx) offers a standalone desktop application with a rich UI
- ChatGPT (PawanOsman) provides a web-based solution with a focus on API integration
Users should choose based on their specific needs: desktop app vs. web-based solution, UI preferences, and development requirements.
AI agent stdlib that works with any LLM and TypeScript AI SDK.
Pros of agentic
- Focuses on autonomous AI agents, offering a more specialized and advanced approach
- Provides a flexible framework for creating and managing multiple AI agents
- Includes built-in tools for task planning and execution
Cons of agentic
- Less user-friendly for beginners compared to ChatGPT's simpler interface
- Requires more setup and configuration to get started
- May have a steeper learning curve due to its more complex architecture
Code Comparison
ChatGPT:
const chatgpt = new ChatGPT({
  apiKey: 'your-api-key',
  model: 'gpt-3.5-turbo',
});
const response = await chatgpt.sendMessage('Hello, how are you?');
console.log(response);
agentic:
from agentic import Agent, Task
agent = Agent()
task = Task("Greet the user")
result = agent.run(task)
print(result)
Both repositories offer ways to interact with AI models, but agentic provides a more complex framework for creating autonomous agents, while ChatGPT focuses on simpler conversational interactions.
A ChatGPT demo web page built with Express and Vue 3
Pros of chatgpt-web
- User-friendly web interface with a modern design
- Multi-language support for a wider audience
- Easy deployment options, including Docker support
Cons of chatgpt-web
- Limited customization options compared to ChatGPT
- Fewer advanced features for power users
- Smaller community and less frequent updates
Code Comparison
chatgpt-web:
const message = ref('')
const loading = ref(false)
const controller = ref<AbortController>(null)
const onSubmit = async () => {
  if (loading.value)
    return
ChatGPT:
async function sendMessage(message) {
  const response = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message })
  });
  return response.json();
}
The code snippets show different approaches to handling message submission. chatgpt-web uses Vue.js reactive variables and an async function, while ChatGPT demonstrates a more straightforward fetch API call for sending messages.
Both projects aim to provide ChatGPT-like functionality, but chatgpt-web focuses on a polished web interface with multi-language support, while ChatGPT offers more customization options and advanced features for developers. The choice between the two depends on the user's specific needs and technical expertise.
GUI for ChatGPT API and many LLMs. Supports agents, file-based QA, GPT finetuning and query with web search. All with a neat UI.
Pros of ChuanhuChatGPT
- User-friendly GUI with a web interface for easier interaction
- Supports multiple language models, including GPT-3.5-turbo and GPT-4
- Offers advanced features like conversation history and model switching
Cons of ChuanhuChatGPT
- More complex setup process compared to ChatGPT
- Requires additional dependencies and configurations
- May have a steeper learning curve for non-technical users
Code Comparison
ChatGPT:
const chatgpt = new ChatGPTClient(apiKey, options);
const response = await chatgpt.sendMessage('Hello, how are you?');
console.log(response);
ChuanhuChatGPT:
import gradio as gr
from modules import config
from modules.utils import *
from modules.presets import *
iface = gr.Blocks()
with iface:
# GUI components and logic
The ChatGPT repository provides a simpler, JavaScript-based implementation for interacting with the ChatGPT API. In contrast, ChuanhuChatGPT uses Python and the Gradio library to create a more comprehensive web interface with additional features and customization options.
While ChatGPT focuses on providing a straightforward API wrapper, ChuanhuChatGPT offers a full-fledged application with a graphical user interface, making it more accessible to users who prefer visual interactions over command-line interfaces.
✨ Light and Fast AI Assistant. Support: Web | iOS | MacOS | Android | Linux | Windows
Pros of NextChat
- More modern UI with a sleek, user-friendly interface
- Built with Next.js, offering better performance and SEO capabilities
- Supports multiple languages and themes out of the box
Cons of NextChat
- Less extensive documentation compared to ChatGPT
- Fewer customization options for API integration
- May require more setup time due to its more complex architecture
Code Comparison
ChatGPT:
const chatgpt = new ChatGPT(apiKey);
const response = await chatgpt.sendMessage('Hello, how are you?');
console.log(response);
NextChat:
import { ChatGPTAPI } from 'chatgpt'
const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })
const res = await api.sendMessage('Hello, how are you?')
console.log(res.text)
Both repositories provide easy-to-use interfaces for interacting with ChatGPT, but NextChat offers a more comprehensive web application experience. ChatGPT focuses on simplicity and ease of integration, while NextChat provides a full-featured chat interface with additional bells and whistles. The code comparison shows that both projects have similar API usage patterns, but NextChat's implementation is more aligned with modern JavaScript practices and environment variable usage.
README
This project is a bit outdated and isn't working right now. We'll update it, but we're tied up with another project at the moment. In the meantime, you're welcome to use our hosted models for free.
ChatGPT (gpt-3.5-turbo) API for Free (as a Reverse Proxy)
Welcome to the ChatGPT API Free Reverse Proxy, offering free self-hosted API access to ChatGPT (gpt-3.5-turbo) with OpenAI's familiar structure, so no code changes are needed.
Quick Links
- Join our Discord Community for support and questions.
- ⚡ Note: Your Discord account must be at least 7 days old to be able to join our Discord community.
Table of Contents
- Features
- Option 1: Installing/Self-Hosting Guide (Without using any API key)
- Method 1: Using Docker or Run it with a Chat Web UI using docker-compose
- Method 2: Your PC/Server (manually)
- Method 3: Termux on Android Phones
- Option 2: Accessing Our Hosted API (Free)
- Usage Examples
- License
Features
- Streaming Response: The API supports streaming responses, so you can start reading the reply while it is still being generated (see the sketch after this list).
- API Endpoint Compatibility: Full alignment with official OpenAI API endpoints, ensuring hassle-free integration with existing OpenAI libraries.
- Complimentary Access: No charges for API usage, making advanced AI accessible to everyone even without an API key.
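As a rough sketch of both features together, the snippet below streams a completion through the official openai Node.js package pointed at a self-hosted instance; the http://localhost:3040/v1 address comes from the install steps below, and the placeholder API key mirrors the usage examples later in this README.
import OpenAI from 'openai';

// Point the official OpenAI SDK at the reverse proxy (self-hosted address assumed).
const openai = new OpenAI({
  apiKey: 'anything',                    // the proxy does not require a real OpenAI key
  baseURL: 'http://localhost:3040/v1',
});

// Request a streamed completion and print tokens as they arrive.
const stream = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Write a haiku about proxies.' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}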
Installing/Self-Hosting Guide
Using Docker
- Ensure Docker is installed by referring to the Docker Installation Docs.
- Run the following command:
docker run -dp 3040:3040 pawanosman/chatgpt:latest
- Done! You can now connect to your local server's API at:
http://localhost:3040/v1/chat/completions
Note that the base URL is http://localhost:3040/v1.
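To confirm the container is responding, you can also hit the endpoint without any SDK. This is a minimal sketch using Node's built-in fetch (Node 18+); the request body follows the standard OpenAI chat completions format. Run it as an ES module (for example a .mjs file) so top-level await works.
// Minimal smoke test against the local container.
const res = await fetch('http://localhost:3040/v1/chat/completions', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Say hello' }],
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);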
Install with chat web interfaces
You can run third-party chat web interfaces, such as BetterChatGPT and LobeChat, with this API using Docker Compose. Click here for the installation guide.
Your PC/Server
To install and run the ChatGPT API Reverse Proxy on your PC/Server, follow these steps:
Note: This option is not available in all countries yet. If you are from a country that is not supported, you can use a U.S. VPN or use our hosted API.
- Ensure NodeJs (v19+) is installed: Download NodeJs
- Clone this repository:
git clone https://github.com/PawanOsman/ChatGPT.git
- Open start.bat (Windows) or start.sh (Linux, with the bash start.sh command) to install dependencies and launch the server.
- Done! You can now connect to your local server's API at:
http://localhost:3040/v1/chat/completions
Note that the base URL will be http://localhost:3040/v1.
Termux on Android Phones
To install and run the ChatGPT API Reverse Proxy on Android using Termux, follow these steps:
- Install Termux from the Play Store.
- Update Termux packages:
apt update
- Upgrade Termux packages:
apt upgrade
- Install git, Node.js, and npm:
apt install -y git nodejs
- Clone the repository:
git clone https://github.com/PawanOsman/ChatGPT.git
- Navigate to the cloned directory:
cd ChatGPT
- Start the server with:
bash start.sh
- Your local server will now be running and accessible at:
http://localhost:3040/v1/chat/completions
Note that the base URL will be http://localhost:3040/v1.
You can now use this address to connect to your self-hosted ChatGPT API Reverse Proxy from Android applications/websites that support reverse proxy configurations, on the same device.
Accessing Our Hosted API
Utilize our pre-hosted ChatGPT-like API for free by:
- Joining our Discord server.
- Obtaining an API key from the #Bot channel with the /key command.
- Incorporating the API key into your requests to:
https://api.pawan.krd/v1/chat/completions
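For example, here is a minimal sketch with the official openai Node.js package; the PAWAN_API_KEY environment variable is just an assumed place to keep the key you got from Discord.
import OpenAI from 'openai';

// Use the key obtained with the /key command in the #Bot channel (stored in an env var here).
const openai = new OpenAI({
  apiKey: process.env.PAWAN_API_KEY,
  baseURL: 'https://api.pawan.krd/v1',
});

const completion = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Hello from the hosted API!' }],
});
console.log(completion.choices[0].message.content);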
Usage Examples
Leverage the same integration code as OpenAI's official libraries by simply adjusting the API key and base URL in your requests. For self-hosted setups, be sure to switch the base URL to your local server's address as mentioned above.
Example Usage with OpenAI Libraries
Python Example
import openai
openai.api_key = 'anything'
openai.base_url = "http://localhost:3040/v1/"
completion = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "How do I list all files in a directory using Python?"},
    ],
)
print(completion.choices[0].message.content)
Node.js Example
import OpenAI from 'openai';
const openai = new OpenAI({
  apiKey: "anything",
  baseURL: "http://localhost:3040/v1",
});

const chatCompletion = await openai.chat.completions.create({
  messages: [{ role: 'user', content: 'Say this is a test' }],
  model: 'gpt-3.5-turbo',
});
console.log(chatCompletion.choices[0].message.content);
License
This project is under the AGPL-3.0 License. Refer to the LICENSE file for detailed information.