
scottlawsonbc/audio-reactive-led-strip

:musical_note: :rainbow: Real-time LED strip music visualization using Python and the ESP8266 or Raspberry Pi


Top Related Projects

Arduino library for controlling single-wire LED pixels (NeoPixel, WS2812, etc.)

6,405

The FastLED library for colored LED animation on Arduino.

14,769

Control WS2812B and many more types of digital RGB LEDs with an ESP8266 or ESP32 over WiFi!

An Arduino NeoPixel support library for a large variety of individually addressable LEDs.

Quick Overview

The audio-reactive-led-strip project is an open-source solution for creating real-time LED animations that react to music. It uses Python to process audio input and drive LED strips, either from a computer paired with an ESP8266 over WiFi or from a standalone Raspberry Pi, creating dynamic light shows synchronized with music. The project is designed for hobbyists and enthusiasts interested in audio visualization and home automation.

Pros

  • Easy to set up with detailed instructions and a list of required components
  • Customizable animations and color patterns
  • Supports various LED strip types (e.g., WS2812B, APA102)
  • Can process audio from multiple sources (line-in, USB microphone)

Cons

  • Requires some technical knowledge to set up and configure
  • Limited to the ESP8266 and Raspberry Pi platforms
  • May have latency issues depending on the audio source and processing power
  • Potential for high power consumption with longer LED strips
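To put the last point in numbers: WS2812B-class pixels draw roughly 60 mA each at full white, so current adds up quickly with strip length. A quick sketch (the 60 mA figure is a common rule of thumb, not a measured value; check your strip's datasheet):

```python
def max_current_amps(n_leds, ma_per_led=60):
    """Worst-case supply current with every LED at full white.

    WS2812B-class pixels draw roughly 60 mA each at full brightness
    (rule-of-thumb figure; check your strip's datasheet).
    """
    return n_leds * ma_per_led / 1000.0

# A 5 m strip at 60 LEDs/m can draw max_current_amps(300) = 18 A,
# i.e. about 90 W from a 5 V supply.
```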

Code Examples

The snippets below are illustrative sketches rather than verbatim project code; the LEDStrip class and wheel helper stand in for the repository's led and visualization modules.

# Example 1: Initializing the LED strip
from led_strip import LEDStrip

# Initialize a strip of 60 LEDs
strip = LEDStrip(60)

# Example 2: Setting a solid color
import numpy as np

# Set the entire strip to red
strip.pixels = np.full((60, 3), (255, 0, 0))
strip.update()

# Example 3: Creating a simple animation
import time

def wheel(pos):
    # Map a position 0-255 to an RGB color on the color wheel
    pos = pos % 256
    if pos < 85:
        return (255 - pos * 3, pos * 3, 0)
    if pos < 170:
        pos -= 85
        return (0, 255 - pos * 3, pos * 3)
    pos -= 170
    return (pos * 3, 0, 255 - pos * 3)

# Create a moving rainbow effect
for i in range(60):
    strip.pixels[i] = wheel(i * 4)
    strip.update()
    time.sleep(0.05)

Getting Started

  1. Clone the repository:

    git clone https://github.com/scottlawsonbc/audio-reactive-led-strip.git
    cd audio-reactive-led-strip
    
  2. Install dependencies:

    sudo apt-get update
    sudo apt-get install python3-numpy python3-scipy python3-pyaudio
    
  3. Configure your LED strip settings in config.py

  4. Run the visualization:

    python3 visualization.py
    

Competitor Comparisons

Arduino library for controlling single-wire LED pixels (NeoPixel, WS2812, etc.)

Pros of Adafruit_NeoPixel

  • Simpler setup and usage for basic LED control
  • Extensive documentation and community support
  • Compatible with a wide range of Arduino-based boards

Cons of Adafruit_NeoPixel

  • Lacks built-in audio reactivity features
  • Limited to basic LED control without additional programming
  • May require more custom code for complex animations

Code Comparison

Adafruit_NeoPixel:

#include <Adafruit_NeoPixel.h>
Adafruit_NeoPixel strip = Adafruit_NeoPixel(60, PIN, NEO_GRB + NEO_KHZ800);
strip.begin();
strip.setPixelColor(i, red, green, blue);
strip.show();

audio-reactive-led-strip:

import numpy as np
from microphone import Microphone

mic = Microphone()                # illustrative wrapper around the audio input
y = np.zeros(60)                  # rolling window of recent audio levels
y = np.roll(y, -1)                # shift the window by one sample
y[-1] = np.mean(mic.get_audio_samples())
output = visualization_effect(y)  # map the audio window to per-LED colors
led.update(output)                # push the colors to the strip

The Adafruit_NeoPixel library focuses on basic LED control, making it easier to set up and use for simple projects. It offers extensive documentation and wide compatibility with Arduino boards. However, it lacks built-in audio reactivity and may require more custom programming for complex animations.

In contrast, the audio-reactive-led-strip project is specifically designed for audio-reactive LED effects, incorporating audio processing and visualization techniques. It uses Python and includes more advanced features out of the box, but may have a steeper learning curve for beginners.


The FastLED library for colored LED animation on Arduino.

Pros of FastLED

  • Extensive library with support for a wide range of LED types and microcontrollers
  • Highly optimized for performance and memory efficiency
  • Large community and extensive documentation

Cons of FastLED

  • Focused on LED control, lacks built-in audio reactivity features
  • Steeper learning curve for beginners
  • Requires additional hardware and code for audio processing

Code Comparison

audio-reactive-led-strip:

# Condensed sketch: `strip` is an rpi_ws281x strip object and `y` the spectrum
def visualize_spectrum(y):
    y = np.clip(y, 0, 1)
    for i in range(config.N_PIXELS):
        r = int(255 * y[i])
        strip.setPixelColor(i, Color(r, 0, 0))
    strip.show()

FastLED:

void visualize_spectrum(float* y) {
  for (int i = 0; i < NUM_LEDS; i++) {
    int r = (int)(255 * constrain(y[i], 0, 1));
    leds[i] = CRGB(r, 0, 0);
  }
  FastLED.show();
}

The audio-reactive-led-strip project provides a more specialized solution for audio-reactive LED control, while FastLED offers a versatile library for general LED programming. The code comparison shows similar approaches to setting LED colors, with FastLED using its own data structures and functions for LED manipulation.


Control WS2812B and many more types of digital RGB LEDs with an ESP8266 or ESP32 over WiFi!

Pros of WLED

  • More comprehensive LED control features, including multiple effects and patterns
  • User-friendly web interface for easy configuration and control
  • Supports a wider range of LED types and controllers

Cons of WLED

  • Less focused on audio reactivity compared to audio-reactive-led-strip
  • May require more setup time for audio-reactive features

Code Comparison

WLED (Arduino-based):

void loop() {
  WLED_loop();
  yield();
}

audio-reactive-led-strip (Python-based):

def visualize_spectrum(y):
    """Effect that maps the audio spectrum to LED colors"""
    y = np.copy(y)
    y = y**2.0
    gain.update(y)  # `gain` is a smoothing filter used for automatic scaling
    y /= gain.value
    y *= 255.0
    r = int(np.max(y[:len(y) // 3]))
    g = int(np.max(y[len(y) // 3: 2 * len(y) // 3]))
    b = int(np.max(y[2 * len(y) // 3:]))
    return np.array([r, g, b])

WLED offers a more versatile LED control system with a user-friendly interface, while audio-reactive-led-strip provides a more specialized audio-reactive solution. WLED's code is simpler due to its Arduino-based architecture, while audio-reactive-led-strip uses Python for more complex audio processing and visualization algorithms.

An Arduino NeoPixel support library for a large variety of individually addressable LEDs.

Pros of NeoPixelBus

  • More comprehensive library for controlling various types of LED strips
  • Better performance and efficiency, especially for larger LED installations
  • Supports a wider range of microcontrollers and platforms

Cons of NeoPixelBus

  • Lacks built-in audio reactivity features
  • May require more setup and configuration for specific use cases
  • Less focused on creating audio-reactive LED effects

Code Comparison

NeoPixelBus:

#include <NeoPixelBus.h>

NeoPixelBus<NeoGrbFeature, Neo800KbpsMethod> strip(60, 2);

void setup() {
  strip.Begin();
}

audio-reactive-led-strip:

import numpy as np
import config
from led import LEDStrip  # illustrative import; see led.py in the repository

# Initialize LED strip
led_strip = LEDStrip(config.N_PIXELS)

def visualize_spectrum(y):
    # Audio processing and visualization logic
    ...
The NeoPixelBus example shows basic setup for controlling an LED strip, while the audio-reactive-led-strip code focuses on audio processing and visualization. NeoPixelBus provides lower-level control, whereas audio-reactive-led-strip offers higher-level audio reactivity features out of the box.


README

Audio Reactive LED Strip

Real-time LED strip music visualization using Python and the ESP8266 or Raspberry Pi.

block diagram

overview

Demo (click gif for video)

visualizer demo

Overview

The repository includes everything needed to build an LED strip music visualizer (excluding hardware).

What do I need to make one?

Computer + ESP8266

To build a visualizer using a computer and ESP8266, you will need:

  • Computer with Python 2.7 or 3.5 (Anaconda is recommended on Windows)
  • ESP8266 module with RX1 pin exposed. These modules can be purchased for as little as $5 USD. These modules are known to be compatible, but many others will work too:
    • NodeMCU v3
    • Adafruit HUZZAH
    • Adafruit Feather HUZZAH
  • WS2812B LED strip (such as Adafruit Neopixels). These can be purchased for as little as $5-15 USD per meter.
  • 5V power supply
  • 3.3V-5V level shifter (optional, must be non-inverting)

Limitations when using a computer + ESP8266:

  • The communication protocol between the computer and ESP8266 currently supports a maximum of 256 LEDs.
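The 256-LED cap is a consequence of the packet layout: assuming the common one-byte-index format, each LED is sent as four bytes (index, red, green, blue) in a UDP datagram, and a single index byte can only address 256 LEDs. A minimal sketch of the sender side (the IP address and port below are placeholders, not the project's defaults):

```python
import socket

def build_packet(pixels):
    """Pack (r, g, b) tuples into a 4-bytes-per-LED UDP payload.

    Each LED is encoded as |index|r|g|b|; the one-byte index is
    what caps the strip at 256 LEDs.
    """
    data = bytearray()
    for i, (r, g, b) in enumerate(pixels):
        data += bytes([i, r, g, b])  # index must fit in a single byte
    return bytes(data)

def send_pixels(pixels, ip="192.168.0.150", port=7777):
    # Placeholder address/port -- match them to your own ESP8266 settings
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(build_packet(pixels), (ip, port))
```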

Standalone Raspberry Pi

You can also build a standalone visualizer using a Raspberry Pi. For this you will need:

  • Raspberry Pi (1, 2, or 3)
  • USB audio input device. This could be a USB microphone or a sound card. You just need to find some way of giving the Raspberry Pi audio input.
  • WS2812B LED strip (such as Adafruit Neopixels)
  • 5V power supply
  • 3.3V-5V level shifter (optional)

Limitations when using the Raspberry Pi:

  • Raspberry Pi is just fast enough to run the visualization, but it is too slow to run the GUI window as well. It is recommended that you disable the GUI when running the code on the Raspberry Pi.
  • The ESP8266 uses a technique called temporal dithering to improve the color depth of the LED strip. Unfortunately the Raspberry Pi lacks this capability.
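Temporal dithering works by flickering each LED between two adjacent 8-bit levels fast enough that the eye averages them, effectively gaining fractional brightness steps. The ESP8266 firmware does this per frame in hardware-timed fashion; the toy sketch below only illustrates the idea (the function name and frame count are made up):

```python
def dither_sequence(target, n_frames=8):
    """Return n_frames of 8-bit values whose average approximates `target`.

    `target` may be fractional, e.g. 100.25; the LED is flickered between
    100 and 101 so the eye perceives the in-between brightness.
    """
    out = []
    error = 0.0
    for _ in range(n_frames):
        value = int(round(target + error))
        error += target - value  # carry the rounding error to the next frame
        out.append(max(0, min(255, value)))
    return out
```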

Installation for Computer + ESP8266

Python Dependencies

Visualization code is compatible with Python 2.7 or 3.5. A few Python dependencies must also be installed:

  • Numpy
  • Scipy (for digital signal processing)
  • PyQtGraph (for GUI visualization)
  • PyAudio (for recording audio with microphone)

On Windows machines, the use of Anaconda is highly recommended. Anaconda simplifies the installation of Python dependencies, which is sometimes difficult on Windows.

Installing dependencies with Anaconda

Create a conda virtual environment (this step is optional but recommended)

conda create --name visualization-env python=3.5
activate visualization-env

Install dependencies using pip and the conda package manager

conda install numpy scipy pyqtgraph
pip install pyaudio

Installing dependencies without Anaconda

The pip package manager can also be used to install the python dependencies.

pip install numpy
pip install scipy
pip install pyqtgraph
pip install pyaudio

If pip is not found try using python -m pip install instead.

Installing macOS dependencies

On macOS, python3 is required, and the portaudio library must be installed before pyaudio (pyaudio is a Python wrapper around portaudio). If you don't have brew installed you can get it here: https://brew.sh

brew install portaudio
brew install pyqt5
pip3 install numpy
pip3 install scipy
pip3 install pyqtgraph
pip3 install pyaudio

Running the visualization can be done using the command below.

python3 visualization.py

Arduino dependencies

ESP8266 firmware is uploaded using the Arduino IDE. See this tutorial to set up the Arduino IDE for ESP8266.

Install NeoPixelBus library

Download it here, or open the Arduino Library Manager and search for "NeoPixelBus".

Hardware Connections

ESP8266

The ESP8266 has hardware support for I²S and this peripheral is used to control the ws2812b LED strip. This significantly improves performance compared to bit-banging the IO pin. Unfortunately, this means that the LED strip must be connected to the RX1 pin, which is not accessible in some ESP8266 modules (such as the ESP-01).

The RX1 pin on the ESP8266 module should be connected to the data input pin of the ws2812b LED strip (often labelled DIN or D0).

For the NodeMCU v3 and Adafruit Feather HUZZAH, the location of the RX1 pin is shown in the images below. Many other modules also expose the RX1 pin.

nodemcu-pinout feather-huzzah-pinout

Raspberry Pi

Since the Raspberry Pi is a 3.3V device, the best practice is to use a logic level converter to shift the 3.3V logic to 5V logic (WS2812 LEDs use 5V logic). There is a good overview on the best practices here.

Although a logic level converter is the best practice, sometimes it will still work if you simply connect the LED strip directly to the Raspberry Pi.

You cannot power the LED strip from the Raspberry Pi GPIO pins; you need an external 5V power supply.

The connections are:

  • Connect GND on the power supply to GND on the LED strip and GND on the Raspberry Pi (they MUST share a common GND connection)
  • Connect +5V on the power supply to +5V on the LED strip
  • Connect a PWM GPIO pin on the Raspberry Pi to the data pin on the LED strip. If using the Raspberry Pi 2 or 3, try GPIO 18 (physical pin 12), which supports hardware PWM.

Setup and Configuration

  1. Install Python and Python dependencies
  2. Install Arduino IDE and ESP8266 addon
  3. Download and extract all of the files in this repository onto your computer
  4. Connect the RX1 pin of your ESP8266 module to the data input pin of the ws2812b LED strip. Ensure that your LED strip is properly connected to a 5V power supply and that the ESP8266 and LED strip share a common electrical ground connection.
  5. In ws2812_controller.ino:
  • Set const char* ssid to your router's SSID
  • Set const char* password to your router's password
  • Set IPAddress gateway to match your router's gateway
  • Set IPAddress ip to the IP address that you would like your ESP8266 to use (your choice)
  • Set #define NUM_LEDS to the number of LEDs in your LED strip
  6. Upload the ws2812_controller.ino firmware to the ESP8266. Ensure that you have selected the correct ESP8266 board from the boards menu. In the dropdown menu, set CPU Frequency to 160 MHz for optimal performance.
  7. In config.py:
  • Set N_PIXELS to the number of LEDs in your LED strip (must match NUM_LEDS in ws2812_controller.ino)
  • Set UDP_IP to the IP address of your ESP8266 (must match ip in ws2812_controller.ino)
  • If needed, set MIC_RATE to your microphone sampling rate in Hz. Most of the time you will not need to change this.
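For reference, the matching config.py values might look like the fragment below. The values are examples only and must mirror your own firmware settings; the DEVICE setting name is an assumption, so check config.py for the exact names:

```python
# Illustrative config.py fragment -- values must match ws2812_controller.ino
DEVICE = 'esp8266'        # select the ESP8266 output path (assumed name)
N_PIXELS = 144            # must equal NUM_LEDS in ws2812_controller.ino
UDP_IP = '192.168.0.150'  # must equal the static ip set in the firmware
MIC_RATE = 44100          # microphone sample rate in Hz (44.1 kHz is typical)
```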

Installation for Raspberry Pi

If you encounter any problems running the visualization on a Raspberry Pi, please open a new issue. Also, please consider opening an issue if you have any questions or suggestions for improving the installation process.

Download and extract all of the files in this repository onto your pi to begin.

Installing the Python dependencies

Install python dependencies using apt-get

sudo apt-get update
sudo apt-get install python-numpy python-scipy python-pyaudio

Audio device configuration

For the Raspberry Pi, a USB audio device needs to be configured as the default audio device.

Create/edit /etc/asound.conf

sudo nano /etc/asound.conf

Set the file to the following text

pcm.!default {
    type hw
    card 1
}
ctl.!default {
    type hw
    card 1
}

Next, set the USB device as the default device by editing /usr/share/alsa/alsa.conf

sudo nano /usr/share/alsa/alsa.conf

Change

defaults.ctl.card 0
defaults.pcm.card 0

To

defaults.ctl.card 1
defaults.pcm.card 1

Test the LED strip

  1. cd rpi_ws281x/python/examples
  2. sudo nano strandtest.py
  3. Configure the options at the top of the file. Enable logic inverting if you are using an inverting logic-level converter. Set the correct GPIO pin and number of pixels for the LED strip. You will likely need a logic-level converter to convert the Raspberry Pi's 3.3V logic to the 5V logic used by the ws2812b LED strip.
  4. Run the example with 'sudo python strandtest.py'

Configure the visualization code

In config.py, set the device to 'pi' and configure the GPIO, LED and other hardware settings. If you are using an inverting logic level converter, set LED_INVERT = True in config.py. Set LED_INVERT = False if you are not using an inverting logic level converter (i.e. connecting LED strip directly to GPIO pin).
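Put together, a standalone Raspberry Pi configuration might look like the fragment below. Setting names other than DEVICE and LED_INVERT are assumptions here, so verify them against config.py:

```python
# Illustrative config.py fragment for the standalone Raspberry Pi setup
DEVICE = 'pi'        # run the LED output directly on the Pi
LED_PIN = 18         # GPIO pin driving the strip (GPIO 18 supports PWM)
N_PIXELS = 144       # number of LEDs on the strip
LED_INVERT = False   # True only if using an inverting logic-level converter
```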

Audio Input

The visualization program streams audio from the default audio input device (set by the operating system). Windows users can change the audio input device by following these instructions.

Examples of typical audio sources:

  • Audio cable connected to the audio input jack (requires USB sound card on Raspberry Pi)
  • Webcam microphone, headset, studio recording microphone, etc

Virtual Audio Source

You can use a "virtual audio device" to transfer audio playback from one application to another. This means that you can play music on your computer and connect the playback directly into the visualization program.

audio-input-sources

Windows

On Windows, you can use "Stereo Mix" to copy the audio output stream into the audio input. Stereo Mix is only supported on certain audio chipsets. If your chipset does not support Stereo Mix, you can use a third-party application such as Voicemeeter.

show-stereomix

Go to recording devices under Windows Sound settings (Control Panel -> Sound). In the right-click menu, select "Show Disabled Devices".

enable-stereomix

Enable Stereo Mix and set it as the default device. Your audio playback should now be used as the audio input source for the visualization program. If your audio chipset does not support Stereo Mix then it will not appear in the list.

Linux

Linux users can use Jack Audio to create a virtual audio device.

macOS

On macOS, Loopback can be used to create a virtual audio device.

Running the Visualization

Once everything has been configured, run visualization.py to start the visualization. The visualization will automatically use your default recording device (microphone) as the audio input.

A PyQtGraph GUI will open to display the output of the visualization on the computer. There is a setting in config.py to enable or disable the GUI display.

visualization-gui

If you encounter any issues or have questions about this project, feel free to open a new issue.

Limitations

  • ESP8266 supports a maximum of 256 LEDs. This limitation will be removed in a future update. The Raspberry Pi can use more than 256 LEDs.
  • Even numbers of pixels must be used. For example, if you have 71 pixels then use the next lowest even number, 70. Odd pixel quantities will be supported in a future update.
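The second limitation is easy to handle at configuration time: round an odd LED count down to the nearest even number before setting N_PIXELS. A hypothetical helper (not part of the repository):

```python
def usable_pixel_count(n):
    """Round an odd LED count down to the nearest even number.

    The visualizer currently requires an even N_PIXELS, so a 71-LED
    strip should be configured with N_PIXELS = 70.
    """
    return n - (n % 2)
```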

License

This project was developed by Scott Lawson and is released under the MIT License.