Azure-Kinect-Sensor-SDK
A cross-platform (Linux and Windows) user-mode SDK for reading data from your Azure Kinect device.
Top Related Projects
- librealsense: Intel® RealSense™ SDK
- libfreenect: Drivers and libraries for the Xbox Kinect device on Windows, Linux, and OS X
Quick Overview
The Azure Kinect Sensor SDK is a cross-platform library that provides developers with access to the Azure Kinect DK sensor's features. It enables capturing depth images, color images, and IMU data, and pairs with the separate Azure Kinect Body Tracking SDK for skeletal tracking, making it useful for computer vision and AI applications.
Pros
- Cross-platform support (Windows and Linux)
- Comprehensive access to sensor data (depth, color, IMU, microphones)
- Skeletal tracking through the companion Azure Kinect Body Tracking SDK
- Active development and community support
Cons
- Limited documentation for advanced features
- Steep learning curve for beginners
- Requires specific hardware (Azure Kinect DK)
- Some reported issues with firmware updates
Code Examples
- Initializing and starting the cameras (error handling omitted for brevity):
// Open the first (default) device.
k4a_device_t device = NULL;
k4a_device_open(K4A_DEVICE_DEFAULT, &device);
// Start from a configuration with every stream disabled, then enable only what you need.
k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
config.color_format = K4A_IMAGE_FORMAT_COLOR_BGRA32;
config.color_resolution = K4A_COLOR_RESOLUTION_1080P;
config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
k4a_device_start_cameras(device, &config);
- Capturing and accessing a depth frame:
// Block until the next capture arrives from the device.
k4a_capture_t capture = NULL;
k4a_device_get_capture(device, &capture, K4A_WAIT_INFINITE);
// Depth pixels are uint16_t values in millimeters.
k4a_image_t depth_image = k4a_capture_get_depth_image(capture);
uint16_t *depth_buffer = (uint16_t *)k4a_image_get_buffer(depth_image);
int width = k4a_image_get_width_pixels(depth_image);
int height = k4a_image_get_height_pixels(depth_image);
// Release the image and capture once you are done with them.
k4a_image_release(depth_image);
k4a_capture_release(capture);
- Performing body tracking (requires the separate Azure Kinect Body Tracking SDK):
// The tracker needs the sensor calibration that matches the running camera configuration.
k4a_calibration_t calibration;
k4a_device_get_calibration(device, config.depth_mode, config.color_resolution, &calibration);
k4abt_tracker_t tracker = NULL;
k4abt_tracker_configuration_t tracker_config = K4ABT_TRACKER_CONFIG_DEFAULT;
k4abt_tracker_create(&calibration, tracker_config, &tracker);
// Feed a capture to the tracker, then pop the processed body frame.
k4abt_frame_t body_frame = NULL;
k4abt_tracker_enqueue_capture(tracker, capture, K4A_WAIT_INFINITE);
k4abt_tracker_pop_result(tracker, &body_frame, K4A_WAIT_INFINITE);
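Once a body frame has been popped, the detected skeletons can be read from it. The following is a minimal sketch that reuses the body_frame handle from the example above (printf assumes <stdio.h> is included):

// Iterate over the bodies detected in this frame.
uint32_t num_bodies = k4abt_frame_get_num_bodies(body_frame);
for (uint32_t i = 0; i < num_bodies; i++)
{
    k4abt_skeleton_t skeleton;
    k4abt_frame_get_body_skeleton(body_frame, i, &skeleton);
    // Joint positions are reported in millimeters in the depth camera coordinate system.
    k4a_float3_t pelvis = skeleton.joints[K4ABT_JOINT_PELVIS].position;
    printf("Body %u pelvis at (%.0f, %.0f, %.0f) mm\n", i, pelvis.xyz.x, pelvis.xyz.y, pelvis.xyz.z);
}
// Release the body frame (and eventually destroy the tracker) when finished.
k4abt_frame_release(body_frame);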
Getting Started
- Install the Azure Kinect Sensor SDK from the official Microsoft repository.
- Connect the Azure Kinect DK to your computer.
- Include the necessary headers in your C/C++ project (k4a.h comes from the Sensor SDK, k4abt.h from the separate Body Tracking SDK):
#include <k4a/k4a.h>
#include <k4abt.h>
- Link against the required libraries (k4a.lib and k4abt.lib on Windows).
- Initialize the device and start the cameras as shown in the first code example.
- Capture frames and process data according to your application's needs (a minimal end-to-end sketch follows this list).
- Remember to release resources and stop the cameras when done:
k4a_device_stop_cameras(device);
k4a_device_close(device);
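Put together, the steps above look roughly like the following minimal program. This is a sketch only: error handling is reduced to early returns, and it assumes the SDK is installed as described above (link against k4a.lib on Windows or -lk4a on Linux).

#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    // Open the default device and start the depth and color cameras.
    k4a_device_t device = NULL;
    if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED)
    {
        printf("Failed to open device\n");
        return 1;
    }

    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.color_format = K4A_IMAGE_FORMAT_COLOR_BGRA32;
    config.color_resolution = K4A_COLOR_RESOLUTION_1080P;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    if (k4a_device_start_cameras(device, &config) != K4A_RESULT_SUCCEEDED)
    {
        printf("Failed to start cameras\n");
        k4a_device_close(device);
        return 1;
    }

    // Grab a single capture and report the depth image size.
    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, K4A_WAIT_INFINITE) == K4A_WAIT_RESULT_SUCCEEDED)
    {
        k4a_image_t depth_image = k4a_capture_get_depth_image(capture);
        if (depth_image != NULL)
        {
            printf("Depth image: %d x %d\n",
                   k4a_image_get_width_pixels(depth_image),
                   k4a_image_get_height_pixels(depth_image));
            k4a_image_release(depth_image);
        }
        k4a_capture_release(capture);
    }

    // Stop the cameras and close the device.
    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}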
Competitor Comparisons
Intel® RealSense™ SDK
Pros of librealsense
- Wider hardware support: Compatible with various Intel RealSense devices
- More active community and frequent updates
- Cross-platform support (Windows, Linux, macOS, Android)
Cons of librealsense
- Limited to Intel RealSense devices
- Less integration with cloud services compared to Azure ecosystem
Code Comparison
Azure-Kinect-Sensor-SDK:
k4a_device_t device = NULL;
k4a_device_open(K4A_DEVICE_DEFAULT, &device);
k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
config.color_format = K4A_IMAGE_FORMAT_COLOR_BGRA32;
config.color_resolution = K4A_COLOR_RESOLUTION_720P;
k4a_device_start_cameras(device, &config);
librealsense:
rs2::pipeline pipe;
rs2::config cfg;
cfg.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_BGR8, 30);
pipe.start(cfg);
auto frames = pipe.wait_for_frames();
auto color_frame = frames.get_color_frame();
Both SDKs provide similar functionality for device initialization and configuration, but with different syntax and naming conventions. The Azure Kinect SDK uses a more explicit device opening and configuration approach, while librealsense uses a pipeline-based system for managing device streams.
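To complete the comparison, the Azure Kinect counterpart to pipe.wait_for_frames() and get_color_frame() is a blocking capture call followed by an image accessor. A short sketch, continuing from the device handle above:

// Wait for the next capture, then pull out the color image.
k4a_capture_t capture = NULL;
k4a_device_get_capture(device, &capture, K4A_WAIT_INFINITE);
k4a_image_t color_image = k4a_capture_get_color_image(capture);
// ... use the image, then release the handles ...
k4a_image_release(color_image);
k4a_capture_release(capture);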
libfreenect: Drivers and libraries for the Xbox Kinect device on Windows, Linux, and OS X
Pros of libfreenect
- Open-source and community-driven, allowing for greater flexibility and customization
- Supports a wider range of Kinect devices, including older models
- Lighter weight and easier to integrate into existing projects
Cons of libfreenect
- Less comprehensive documentation and official support
- May lack some advanced features present in the Azure Kinect SDK
- Potentially less optimized for newer Kinect hardware
Code Comparison
Azure-Kinect-Sensor-SDK:
k4a_device_t device = NULL;
k4a_device_open(K4A_DEVICE_DEFAULT, &device);
k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
config.color_format = K4A_IMAGE_FORMAT_COLOR_BGRA32;
config.color_resolution = K4A_COLOR_RESOLUTION_720P;
k4a_device_start_cameras(device, &config);
libfreenect:
freenect_context *f_ctx;
freenect_device *f_dev;
freenect_init(&f_ctx, NULL);
freenect_open_device(f_ctx, &f_dev, 0);
freenect_set_video_mode(f_dev, freenect_find_video_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_VIDEO_RGB));
Both examples show device initialization and configuration, but Azure-Kinect-Sensor-SDK offers a more streamlined API with built-in configuration options, while libfreenect requires more manual setup.
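To illustrate the extra manual setup, libfreenect delivers frames through user-registered callbacks driven by an event loop. A rough sketch continuing from the f_ctx and f_dev handles above (treat this as an outline and check the libfreenect headers for exact signatures):

// Called by libfreenect whenever a new RGB frame is ready.
void rgb_callback(freenect_device *dev, void *rgb, uint32_t timestamp)
{
    // Process the RGB buffer here.
}

// Register the callback, start streaming, and pump the event loop.
freenect_set_video_callback(f_dev, rgb_callback);
freenect_start_video(f_dev);
while (freenect_process_events(f_ctx) >= 0)
{
    // Application work between events goes here.
}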
README
Azure Kinect SDK (K4A)
Welcome to the Azure Kinect Sensor SDK! We hope you can use it to build many great applications and participate in the project. Don't be shy to ask questions, and provide feedback. See Azure.com/Kinect for device info and available documentation.
Introduction
Azure Kinect SDK is a cross-platform (Linux and Windows) user-mode SDK for reading data from your Azure Kinect device.
Why use the Azure Kinect SDK
The Azure Kinect SDK enables you to get the most out of your Azure Kinect camera. Features include:
- Depth camera access
- RGB camera access and control (e.g. exposure and white balance; see the sketch after this list)
- Motion sensor (gyroscope and accelerometer) access
- Synchronized Depth-RGB camera streaming with configurable delay between cameras
- External device synchronization control with configurable delay offset between devices
- Camera frame meta-data access for image resolution, timestamp and temperature
- Device calibration data access
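A few of these features in code, as a minimal sketch. It assumes a device handle that has already been opened as in the earlier examples but whose cameras have not yet been started; the exposure value is an arbitrary illustration.

// Manually set the color camera exposure (value in microseconds).
k4a_device_set_color_control(device,
                             K4A_COLOR_CONTROL_EXPOSURE_TIME_ABSOLUTE,
                             K4A_COLOR_CONTROL_MODE_MANUAL,
                             8330);

// Request synchronized depth and color images with no extra delay between the cameras.
k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
config.color_format = K4A_IMAGE_FORMAT_COLOR_BGRA32;
config.color_resolution = K4A_COLOR_RESOLUTION_720P;
config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
config.synchronized_images_only = true;
config.depth_delay_off_color_usec = 0;
k4a_device_start_cameras(device, &config);

// Read gyroscope and accelerometer samples (the IMU is started after the cameras).
k4a_device_start_imu(device);
k4a_imu_sample_t imu_sample;
k4a_device_get_imu_sample(device, &imu_sample, K4A_WAIT_INFINITE);

// Access the factory calibration for the current camera configuration.
k4a_calibration_t calibration;
k4a_device_get_calibration(device, config.depth_mode, config.color_resolution, &calibration);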
Installation
To use the SDK, please refer to the installation instructions in the usage documentation.
Documentation
API documentation is available here.
Building
Azure Kinect SDK uses CMake to build. For instructions on how to build this SDK, please see building.
Versioning
The Azure Kinect SDK uses semantic versioning; please see versioning.md for more information.
Testing
For information on writing or running tests, please see testing.md.
Contribute
We welcome your contributions! Please see the contribution guidelines.
Feedback
For SDK feedback or to report a bug, please file a GitHub Issue. For general suggestions or ideas, visit our feedback forum.
Sample Code
Sample code can be found in several places:
- In this repository, under Azure-Kinect-Sensor-SDK\examples: each example has a readme page that describes it and the steps to set it up.
- In the Azure-Kinect-Samples repository, which contains multiple examples of how to use both the Sensor and Body Tracking SDKs.
Q&A
Welcome to the Q&A corner!
Join Our Developer Program
Complete your developer profile here to get connected with our Mixed Reality Developer Program. You will receive the latest on our developer tools, events, and early access offers.
Code of Conduct
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Reporting Security Issues
Security issues and bugs should be reported privately, via email, to the Microsoft Security Response Center (MSRC) at <secure@microsoft.com>. You should receive a response within 24 hours. If for some reason you do not, please follow up via email to ensure we received your original message. Further information, including the MSRC PGP key, can be found in the Security TechCenter.
License and Microsoft Support for Azure Kinect Sensor SDK