
MaskRay/ccls

C/C++/ObjC language server supporting cross references, hierarchies, completion and semantic highlighting


Top Related Projects

  • clangd: clangd language server
  • vscode-cpptools: official repository for the Microsoft C/C++ extension for VS Code
  • cquery: C/C++ language server supporting multi-million line code bases, powered by libclang; works with Emacs, Vim, VS Code, and other editors with Language Server Protocol support, offering cross references, completion, diagnostics, semantic highlighting and more
  • Bear: a tool that generates a compilation database for clang tooling
  • rtags: a client/server indexer for C/C++/Objective-C[++] with Emacs integration, based on clang
  • YouCompleteMe: a code-completion engine for Vim

Quick Overview

ccls is a C/C++/Objective-C language server that implements the Language Server Protocol (LSP). It aims to provide intelligent code completion, navigation, and other language features for C-family languages in the many text editors and IDEs that support LSP.

Pros

  • Fast indexing and low memory usage, thanks to its use of clang and efficient data structures
  • Supports a wide range of C/C++ language features, including code completion, goto definition, and find references
  • Cross-platform compatibility (Linux, macOS, Windows)
  • Integrates well with many popular editors and IDEs through LSP

Cons

  • Setup can be complex, especially for large projects with custom build systems
  • May require manual configuration of compile commands for accurate results
  • Limited support for some advanced C++ features compared to more mature IDEs
  • Documentation could be more comprehensive for advanced use cases

Getting Started

  1. Install ccls:

    # On Ubuntu/Debian
    sudo apt install ccls
    
    # On macOS with Homebrew
    brew install ccls
    
    # On Windows with MSYS2
    pacman -S mingw-w64-x86_64-ccls
    
  2. Configure your editor to use ccls as the language server for C/C++ files. For example, in Vim/Neovim with coc.nvim, add the following to your coc-settings.json (VS Code users can install the ccls extension from the marketplace instead):

    {
      "languageserver": {
        "ccls": {
          "command": "ccls",
          "filetypes": ["c", "cpp", "objc", "objcpp"],
          "rootPatterns": [".ccls", "compile_commands.json", ".git/", ".hg/"],
          "initializationOptions": {
            "cache": {
              "directory": "/tmp/ccls"
            }
          }
        }
      }
    }
    
  3. In your project root, create a compile_commands.json file or a .ccls file to specify compilation flags and include paths (a CMake-based example follows these steps).

  4. Open a C/C++ file in your editor and start using ccls features like code completion and navigation.
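
If your project uses CMake, the compile_commands.json from step 3 can be generated rather than written by hand. A minimal sketch, assuming an out-of-source build directory named build:

    # Ask CMake to emit compile_commands.json into the build directory
    cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
    
    # ccls looks for compile_commands.json in the project root,
    # so symlink it there
    ln -s build/compile_commands.json compile_commands.json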

Competitor Comparisons


clangd language server

Pros of clangd

  • Officially supported by LLVM, ensuring long-term maintenance and updates
  • Better integration with other LLVM tools and libraries
  • More extensive documentation and community support

Cons of clangd

  • Can be slower for initial indexing compared to ccls
  • May consume more system resources, especially for large projects

Code Comparison

Both clangd and ccls consume a compile_commands.json compilation database; ccls additionally understands a simpler .ccls file. Here's a basic example of each:

clangd (compile_commands.json):

[
  {
    "directory": "/path/to/project",
    "command": "clang++ -std=c++17 -c main.cpp",
    "file": "main.cpp"
  }
]

ccls (.ccls):

clang++
%cpp -std=c++17
%c -std=c11
-Iinclude

Both language servers provide similar functionality for C/C++ development, including code completion, diagnostics, and navigation. The choice between them often depends on specific project requirements and personal preferences. clangd is generally considered more feature-rich and well-supported, while ccls may offer better performance in certain scenarios.

Official repository for the Microsoft C/C++ extension for VS Code.

Pros of vscode-cpptools

  • More comprehensive C/C++ development environment with debugging, IntelliSense, and code browsing
  • Seamless integration with Visual Studio Code, providing a familiar interface for many developers
  • Regular updates and active maintenance by Microsoft

Cons of vscode-cpptools

  • Heavier resource usage, potentially slower on large codebases
  • Configuration can be more complex, especially for non-standard project structures

Code comparison

vscode-cpptools (launch.json configuration):

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Debug",
            "type": "cppdbg",
            "request": "launch",
            "program": "${workspaceFolder}/a.out"
        }
    ]
}

ccls (.ccls configuration):

%cpp -std=c++17
%c -std=c11
%h %hpp --include=Global.h
-I/usr/include
-I/usr/local/include
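
On the vscode-cpptools side, the closer counterpart to a .ccls file is c_cpp_properties.json, which configures IntelliSense rather than debugging. A minimal sketch, with values to adapt to your toolchain:

{
    "version": 4,
    "configurations": [
        {
            "name": "Linux",
            "compilerPath": "/usr/bin/g++",
            "cppStandard": "c++17",
            "includePath": ["${workspaceFolder}/**"]
        }
    ]
}

It can also reference a compilation database through its compileCommands field, which keeps vscode-cpptools and ccls reading the same flags.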

Summary

vscode-cpptools offers a more feature-rich environment with tight Visual Studio Code integration, making it suitable for developers who prefer an all-in-one solution. ccls, on the other hand, is a lightweight language server focusing on performance and simplicity, ideal for those who prioritize speed and customization in their development workflow.


C/C++ language server supporting multi-million line code base, powered by libclang. Emacs, Vim, VSCode, and others with language server protocol support. Cross references, completion, diagnostics, semantic highlighting and more

Pros of cquery

  • Faster indexing speed for large codebases
  • Lower memory usage, especially for large projects
  • More extensive documentation and setup guides

Cons of cquery

  • No longer actively maintained (last commit in 2018)
  • Fewer features and less compatibility with modern C++ standards
  • Less integration with popular editors and IDEs

Code Comparison

cquery:

void IndexFile::serialize(std::string& output) {
  flatbuffers::FlatBufferBuilder builder;
  auto root = Serialize(builder);
  builder.Finish(root);
  output.assign(reinterpret_cast<char*>(builder.GetBufferPointer()),
                builder.GetSize());
}

ccls:

void IndexFile::serialize(std::string* output) {
  flatbuffers::FlatBufferBuilder builder;
  auto root = Serialize(builder);
  builder.Finish(root);
  *output = std::string(
      reinterpret_cast<const char*>(builder.GetBufferPointer()),
      builder.GetSize());
}

Both projects use FlatBuffers for serialization, but ccls uses a pointer parameter for the output string, while cquery uses a reference. ccls also employs more modern C++ features in its codebase overall.


Bear is a tool that generates a compilation database for clang tooling.

Pros of Bear

  • Language-agnostic: Works with various build systems and programming languages
  • Generates compilation database for any project, not limited to C/C++
  • Simpler setup process for general use cases

Cons of Bear

  • Requires running the build process to generate compilation database
  • May not provide as deep language-specific features as ccls
  • Less integrated with specific language servers or IDEs

Code Comparison

Bear (command-line usage):

bear -- make

ccls (configuration in .ccls file):

clang
%cpp -std=c++17
-I/path/to/include

Key Differences

  • Bear focuses on generating compilation databases, while ccls is a full-fledged language server
  • ccls provides more advanced C/C++ specific features and optimizations
  • Bear can be used as a complementary tool to generate input for language servers like ccls

Use Cases

  • Use Bear for quickly generating compilation databases for various projects (see the example below)
  • Choose ccls for dedicated C/C++ development with advanced language features and IDE integration
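
Combining the two is straightforward; a rough sketch, assuming a Make-based build:

# Run a clean build under Bear so every compile command is recorded
make clean
bear -- make

# Bear writes compile_commands.json to the current directory,
# which ccls picks up when started from the project root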

A client/server indexer for c/c++/objc[++] with integration for Emacs based on clang.

Pros of rtags

  • More mature project with longer development history
  • Supports distributed compilation and indexing
  • Offers integration with popular text editors like Emacs and Vim

Cons of rtags

  • Requires a running daemon, which can be resource-intensive
  • Setup and configuration can be more complex
  • Less active development in recent years

Code Comparison

rtags:

class RTags {
public:
    static void init();
    static void shutdown();
    static void indexFile(const std::string& file);
};

ccls:

class CCLSServer {
public:
    void initialize(const lsInitializeParams& params);
    void shutdown();
    void index(const std::string& file);
};

Both projects aim to provide C/C++ language services, but their implementations differ. rtags uses a client-server architecture with a persistent daemon, while ccls is designed as a language server following the Language Server Protocol (LSP).
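
For context, a typical rtags session starts the daemon and then feeds it a compilation database. A rough sketch (command names as documented by rtags; exact flags may differ between versions):

# Start the rtags daemon
rdm &

# Load compile_commands.json from the project directory
rc -J /path/to/project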

ccls generally offers easier setup, better performance, and broader language support (including Objective-C). It's also more actively maintained and has gained popularity in recent years. However, rtags still has a dedicated user base, especially among Emacs users, and its distributed compilation feature can be beneficial for large projects.

The choice between the two depends on specific needs, development environment, and personal preferences. ccls is often recommended for new projects due to its ease of use and active development.

A code-completion engine for Vim

Pros of YouCompleteMe

  • Multi-language support: Offers completion for various languages beyond C/C++
  • Integration with multiple editors: Works with Vim, Neovim, and some other editors
  • Fuzzy-search completion: Provides more flexible matching for completions

Cons of YouCompleteMe

  • Complex setup: Requires compilation and can be challenging to install
  • Resource-intensive: May be slower and use more system resources than ccls
  • Less specialized for C/C++: ccls offers more focused and potentially better C/C++ support

Code Comparison

YouCompleteMe configuration in .vimrc:

let g:ycm_global_ycm_extra_conf = '~/.vim/.ycm_extra_conf.py'
let g:ycm_confirm_extra_conf = 0
let g:ycm_autoclose_preview_window_after_completion = 1

ccls configuration in .vimrc (using coc.nvim):

let g:coc_global_extensions = ['coc-ccls']
let g:coc_user_config = {
  \ 'languageserver': {
    \ 'ccls': {
      \ 'command': 'ccls',
      \ 'filetypes': ['c', 'cpp', 'objc', 'objcpp'],
      \ 'rootPatterns': ['.ccls', 'compile_commands.json'],
      \ 'initializationOptions': {'cache': {'directory': '/tmp/ccls'}}
    \ }
  \ }
\ }

Both projects aim to provide code completion, but YouCompleteMe offers broader language support while ccls specializes in C/C++ with potentially better performance in that domain.


README

ccls


ccls, which originates from cquery, is a C/C++/Objective-C language server.

It has a global view of the code base and supports many cross-reference features; see wiki/FAQ. When you open the first file, it starts indexing the whole project (including subprojects, if any) in parallel, while the main thread can serve requests before indexing completes. Saving files incrementally updates the index.
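
Indexing and caching behavior can be tuned through the client's initializationOptions. A minimal sketch; cache.directory is shown with its default value, while index.threads is taken from the wiki and should be treated as an assumption to verify there:

{
  "cache": { "directory": ".ccls-cache" },
  "index": { "threads": 4 }
}

Here .ccls-cache is the default cache location relative to the project root, and 4 caps the number of indexer threads.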

>>> Getting started (CLICK HERE) <<<

ccls can index itself (~180MiB RSS when idle, measured 2018-09-01), FreeBSD, glibc, Linux, LLVM (~1800MiB RSS), musl (~60MiB RSS), ... with a decent memory footprint. See wiki/Project-Setup for examples.