
fuergaosi233/wechat-chatgpt

Use ChatGPT On Wechat via wechaty


Top Related Projects

ChatGPT for wechat https://github.com/AutumnWhj/ChatGPT-wechat-bot

A chatbot built on large language models, supporting access via WeChat Official Accounts, WeCom (enterprise WeChat) applications, Feishu, DingTalk, and more. It can use GPT-3.5/GPT-4o/GPT-o1/Claude/Wenxin Yiyan (ERNIE Bot)/iFlytek Spark/Tongyi Qianwen/Gemini/GLM-4/Kimi/LinkAI, handles text, voice, and images, can access the operating system and the internet, and supports custom enterprise AI customer service built on your own knowledge base.

🤖 A WeChat bot built with Wechaty and AI services such as OpenAI ChatGPT, Kimi, and iFlytek. It can automatically reply to WeChat messages, manage WeChat groups and friends, detect inactive ("zombie") contacts, and more...

Quick Overview

The fuergaosi233/wechat-chatgpt repository is an open-source project that integrates ChatGPT with WeChat, allowing users to interact with the AI model through WeChat messages. It provides a bridge between the popular Chinese messaging platform and OpenAI's powerful language model, enabling users to have AI-powered conversations within their familiar WeChat environment.

Pros

  • Seamless integration of ChatGPT with WeChat, making AI conversations accessible to a wide user base
  • Supports multiple ChatGPT accounts for load balancing and improved reliability
  • Customizable conversation settings, including temperature and max tokens
  • Active development and community support

Cons

  • Requires technical knowledge to set up and configure
  • Potential for misuse or violation of WeChat's terms of service
  • Dependency on third-party services (OpenAI API) may lead to stability issues
  • Limited documentation in English, which may be challenging for non-Chinese speakers

Getting Started

To set up the wechat-chatgpt project:

  1. Clone the repository:

    git clone https://github.com/fuergaosi233/wechat-chatgpt.git
    
  2. Install dependencies:

    cd wechat-chatgpt
    npm install
    
  3. Configure the project by copying the example configuration file and editing it:

    cp config.yaml.example config.yaml
    nano config.yaml
    
  4. Start the application:

    npm run dev
    
  5. Scan the QR code with your WeChat account to log in and start using ChatGPT within WeChat.
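
Under the hood, steps 4 and 5 rely on Wechaty's event-driven login and message flow. The sketch below is not the project's actual source, just a minimal illustration of that flow; qrcode-terminal is an assumed helper for rendering the QR code, and askChatGPT is a placeholder standing in for the real OpenAI call.

import { WechatyBuilder } from "wechaty";
import qrcodeTerminal from "qrcode-terminal";

// Placeholder for the project's real OpenAI call; the actual logic lives in the repository
async function askChatGPT(prompt: string): Promise<string> {
  return `echo: ${prompt}`;
}

const bot = WechatyBuilder.build({ name: "wechat-assistant" });

bot
  .on("scan", (qrcode, status) => {
    // Step 5 above: render the login QR code so it can be scanned from the WeChat app
    console.log(`Scan status: ${status}`);
    qrcodeTerminal.generate(qrcode, { small: true });
  })
  .on("login", (user) => console.log(`${user} logged in`))
  .on("message", async (message) => {
    // Ignore our own messages and empty text, then reply with the model's answer
    if (message.self() || !message.text()) {
      return;
    }
    const answer = await askChatGPT(message.text());
    await message.say(answer);
  });

await bot.start();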

Competitor Comparisons

ChatGPT for wechat https://github.com/AutumnWhj/ChatGPT-wechat-bot

Pros of ChatGPT-wechat-bot

  • Simpler setup process with fewer dependencies
  • More lightweight and focused on core functionality
  • Better documentation for quick start and configuration

Cons of ChatGPT-wechat-bot

  • Less active development and fewer contributors
  • Fewer features and customization options
  • Limited support for different chat platforms

Code Comparison

wechat-chatgpt:

async def handle_message(self, msg):
    if msg.type() != MessageType.TEXT:
        return
    content = msg.content()
    conversation = self.get_conversation(msg)
    response = await self.chatgpt.get_chat_response(content, conversation)
    await msg.reply(response)

ChatGPT-wechat-bot:

def handle_text(msg):
    if msg.text.startswith('/'):
        handle_command(msg)
    else:
        reply = chatgpt.get_chat_response(msg.text)
        msg.reply(reply)

The code comparison shows that wechat-chatgpt uses asynchronous programming and handles conversations, while ChatGPT-wechat-bot has a simpler approach with synchronous code and no conversation management. wechat-chatgpt appears to be more flexible and scalable, but ChatGPT-wechat-bot is more straightforward for basic use cases.
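
To make the conversation-management point concrete, here is a minimal per-user session store sketched in TypeScript; the type and function names are illustrative and not taken from either project.

// Illustrative session tracking: one message history per user
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const sessions = new Map<string, ChatMessage[]>();

function getConversation(userId: string): ChatMessage[] {
  // Create a fresh history (seeded with a system prompt) on first contact
  if (!sessions.has(userId)) {
    sessions.set(userId, [{ role: "system", content: "You are a helpful assistant." }]);
  }
  return sessions.get(userId)!;
}

function appendTurn(userId: string, question: string, answer: string): void {
  // Record both sides of the exchange so later requests carry the context
  const history = getConversation(userId);
  history.push({ role: "user", content: question });
  history.push({ role: "assistant", content: answer });
}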

A chatbot built on large language models, supporting access via WeChat Official Accounts, WeCom (enterprise WeChat) applications, Feishu, DingTalk, and more. It can use GPT-3.5/GPT-4o/GPT-o1/Claude/Wenxin Yiyan (ERNIE Bot)/iFlytek Spark/Tongyi Qianwen/Gemini/GLM-4/Kimi/LinkAI, handles text, voice, and images, can access the operating system and the internet, and supports custom enterprise AI customer service built on your own knowledge base.

Pros of chatgpt-on-wechat

  • More extensive documentation and setup instructions
  • Supports multiple AI platforms (OpenAI, Azure, Claude) and language models
  • Includes image generation capabilities

Cons of chatgpt-on-wechat

  • More complex setup process due to additional features
  • Potentially higher resource usage and slower response times

Code Comparison

wechat-chatgpt:

async def handle_text_message(msg):
    response = await bot.get_chat_response(msg.content)
    await msg.reply(response)

chatgpt-on-wechat:

def handle_text(self, msg):
    context = Context(ContextType.TEXT, content=msg.content)
    reply = self.chat_channel.build_reply_content(context)
    return reply

The code snippets show that chatgpt-on-wechat uses a more modular approach with context handling, potentially allowing for greater flexibility and extensibility. However, this comes at the cost of increased complexity compared to the simpler implementation in wechat-chatgpt.

Both projects aim to integrate ChatGPT with WeChat, but chatgpt-on-wechat offers a more feature-rich experience at the expense of simplicity. The choice between the two depends on the user's specific needs and technical expertise.
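
As a rough illustration of the context-based dispatch described above, a modular handler might look like the following TypeScript sketch; the names here are invented for illustration and not taken from chatgpt-on-wechat.

// Illustrative only: a tiny context/dispatch layer in the spirit of the modular design above
enum ContextType { Text, Image, Voice }

interface Context {
  type: ContextType;
  content: string;
}

function buildReply(ctx: Context): string {
  switch (ctx.type) {
    case ContextType.Text:
      return `LLM reply to: ${ctx.content}`; // a real channel would call the model here
    case ContextType.Image:
      return "Image requests would be routed to a separate image channel";
    default:
      return "Unsupported message type";
  }
}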

🤖 A WeChat bot built with Wechaty and AI services such as OpenAI ChatGPT, Kimi, and iFlytek. It can automatically reply to WeChat messages, manage WeChat groups and friends, detect inactive ("zombie") contacts, and more...

Pros of wechat-bot

  • More extensive feature set, including image generation and voice message support
  • Better documentation and setup instructions
  • Active development with frequent updates and bug fixes

Cons of wechat-bot

  • More complex setup process due to additional features
  • Potentially higher resource usage due to expanded functionality
  • May require more maintenance and configuration

Code Comparison

wechat-bot:

const { WechatyBuilder } = require('wechaty');
const bot = WechatyBuilder.build({
  name: 'wechat-bot',
  puppet: 'wechaty-puppet-wechat',
});

wechat-chatgpt:

import { WechatyBuilder } from 'wechaty';
const bot = WechatyBuilder.build({
  name: 'wechat-assistant',
});

The code snippets show that wechat-bot uses CommonJS module syntax and specifies a puppet, while wechat-chatgpt uses ES6 import syntax and relies on default configurations.

Both projects aim to integrate ChatGPT with WeChat, but wechat-bot offers a more feature-rich experience at the cost of increased complexity. wechat-chatgpt provides a simpler implementation focused primarily on text-based interactions. Users should choose based on their specific needs and technical expertise.


README

!!!! Project Archived 📦 !!!!

This project has been archived. Thank you to everyone who contributed! 🙌😔

Welcome to wechat-chatgpt 👋

Version · License: ISC · Twitter: @fuergaosi · Join the Discord community

Use ChatGPT On Wechat via wechaty
English | 中文文档

Deploy on Railway

🌟 Features

  • Interact with WeChat and ChatGPT:

    • Use ChatGPT on WeChat with wechaty and Official API
    • Add conversation support
    • Support command setting
  • Deployment and configuration options:

  • Other features:

    • Support DALL·E
    • Support Whisper
    • Support setting a custom prompt
    • Support proxy (in development)

🚀 Usage

Use with Railway

Railway offers $5 or 500 hours of runtime per month

  1. Click the Railway button to go to the Railway deployment page
  2. Click the Deploy Now button to enter the Railway deployment page
  3. Fill in the repository name and OPENAI_API_KEY (need to link GitHub account)
  4. Click the Deploy button
  5. Click the View Logs button and wait for the deployment to complete

Use with Fly.io

Please allocate 512MB of memory for the application to meet its requirements

fly.io waives bills of up to $5 (the free allowance of three 256MB machines is not counted in the bill)

  1. Install flyctl
     # macOS
     brew install flyctl
     # Windows
     scoop install flyctl
     # Linux
     curl https://fly.io/install.sh | sh
    
  2. Clone the project and enter the project directory
    git clone https://github.com/fuergaosi233/wechat-chatgpt.git && cd wechat-chatgpt
    
  3. Create a new app
    ➜ flyctl launch 
     ? Would you like to copy its configuration to the new app? No
     ? App Name (leave blank to use an auto-generated name): <YOUR APP NAME>
     ? Select region: <YOUR CHOSEN REGION>
     ? Would you like to setup a Postgresql database now? No
     ? Would you like to deploy now? No
    
  4. Configure the environment variables
    flyctl secrets set OPENAI_API_KEY="<YOUR OPENAI API KEY>" MODEL="<CHATGPT-MODEL>"
    
  5. Deploy the app
    flyctl deploy
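
If the default VM is smaller than the 512MB noted above, flyctl's standard scale subcommand can raise the memory after the app is created; 512 is simply the figure quoted in that note.

flyctl scale memory 512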
    

Use with docker

# pull image
docker pull holegots/wechat-chatgpt
# run container
docker run -d --name wechat-chatgpt \
    -e OPENAI_API_KEY=<YOUR OPENAI API KEY> \
    -e MODEL="gpt-3.5-turbo" \
    -e CHAT_PRIVATE_TRIGGER_KEYWORD="" \
    -v $(pwd)/data:/app/data/wechat-assistant.memory-card.json \
    holegots/wechat-chatgpt:latest
# View the QR code to log in to wechat
docker logs -f wechat-chatgpt

How to get an OpenAI API key? Click here

Use with docker compose

# Copy the configuration file according to the template
cp .env.example .env
# Edit the configuration file
vim .env
# Start the container
docker-compose up -d
# View the QR code to log in to wechat
docker logs -f wechat-chatgpt

Use with nodejs

You need Node.js version 18.0.0 or above

# Clone the project
git clone https://github.com/fuergaosi233/wechat-chatgpt.git && cd wechat-chatgpt
# Install dependencies
npm install
# Copy the configuration file according to the template
cp .env.example .env
# Edit the configuration file
vim .env
# Start project
npm run dev

Please make sure your WeChat account can log in to WeChat on the web

📝 Environment Variables

| name | description |
| --- | --- |
| API | API endpoint of ChatGPT |
| OPENAI_API_KEY | OpenAI API key (create a new secret key) |
| MODEL | ID of the model to use. Currently, only gpt-3.5-turbo and gpt-3.5-turbo-0301 are supported. |
| TEMPERATURE | Sampling temperature, between 0 and 2. Higher values like 0.8 make the output more random; lower values like 0.2 make it more focused and deterministic. |
| CHAT_TRIGGER_RULE | Private chat triggering rule. |
| DISABLE_GROUP_MESSAGE | Disable ChatGPT replies in group chats. |
| CHAT_PRIVATE_TRIGGER_KEYWORD | Keyword that triggers a ChatGPT reply in WeChat private chats. |
| BLOCK_WORDS | Blocked words in incoming messages (applies to both private and group chats; separate multiple words with ","). |
| CHATGPT_BLOCK_WORDS | Blocked words in ChatGPT replies (applies to both private and group chats; separate multiple words with ","). |
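
Both the docker-compose and Node.js instructions above ask you to edit .env. A hedged example of a filled-in file, using only the variable names documented in this table; every value below is a placeholder to replace with your own settings:

# Example .env (placeholder values)
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx
MODEL=gpt-3.5-turbo
TEMPERATURE=0.6
CHAT_PRIVATE_TRIGGER_KEYWORD=
BLOCK_WORDS=politics,gambling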

📝 Using a Custom ChatGPT API

https://github.com/fuergaosi233/openai-proxy

# Clone the project
git clone https://github.com/fuergaosi233/openai-proxy
# Install dependencies
npm install && npm install -g wrangler && npm run build
# Deploy to Cloudflare Workers
npm run deploy
# Custom domain (optional): add a route to `wrangler.toml`
routes = [
    { pattern = "Your Custom Domain", custom_domain = true },
]
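
Once the proxy is deployed, the bot can be pointed at it through the API variable from the Environment Variables table above; the domain below is a placeholder for whatever workers.dev URL or custom domain you configured:

API=https://your-openai-proxy.example.workers.dev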

⌨️ Commands

Enter the following in the WeChat chat box:

/cmd help # Show help
/cmd prompt <PROMPT> # Set prompt
/cmd clear # Clear all sessions since last boot
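
For example, to steer replies for the rest of the session you might send something like the line below; the prompt text itself is only an illustration:

/cmd prompt You are a translation assistant. Translate every message I send into English.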

✨ Contributor

🤝 Contributing

Contributions, issues, and feature requests are welcome!
Feel free to check the issues page.

Show your support

Give a ⭐️ if this project helped you!