
sadmann7/tablecn

Shadcn table with server-side sorting, filtering, and pagination.

Top Related Projects

  • Marvin (5,960 stars): an ambient intelligence library
  • langchain (112,752 stars): 🦜🔗 Build context-aware reasoning applications
  • FLAML (4,179 stars): A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
  • semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
  • openai-cookbook: Examples and guides for using the OpenAI API
  • Dify (115,704 stars): Production-ready platform for agentic workflow development.

Quick Overview

TableCN is a Next.js project that demonstrates the creation of a customizable table component using Tanstack Table v8. It showcases advanced features like sorting, filtering, and pagination, all implemented with a clean and modern UI using Tailwind CSS and Radix UI primitives.

Pros

  • Implements a highly customizable and feature-rich table component
  • Uses modern technologies like Next.js, Tanstack Table v8, and Tailwind CSS
  • Provides a clean and accessible UI with Radix UI primitives
  • Demonstrates best practices for state management and component architecture

Cons

  • Limited documentation for customization and advanced usage
  • May have a steeper learning curve for developers unfamiliar with Tanstack Table
  • Requires understanding of multiple libraries and frameworks
  • Not a standalone library, but rather a project showcase

Code Examples

  1. Basic table setup:
import { useReactTable, getCoreRowModel } from "@tanstack/react-table";

const table = useReactTable({
  data,
  columns,
  getCoreRowModel: getCoreRowModel(),
});
  2. Adding sorting functionality:
import { getSortedRowModel } from "@tanstack/react-table";

const table = useReactTable({
  // ... other options
  getSortedRowModel: getSortedRowModel(),
});
  3. Implementing pagination:
import { getPaginationRowModel } from "@tanstack/react-table";

const table = useReactTable({
  // ... other options
  getPaginationRowModel: getPaginationRowModel(),
});
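  4. Adding filtering (a minimal client-side sketch using Tanstack Table's filtered row model; tablecn itself performs filtering on the server, which this example does not show):
import { getFilteredRowModel } from "@tanstack/react-table";

const table = useReactTable({
  // ... other options
  getFilteredRowModel: getFilteredRowModel(),
});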

Getting Started

To use this project as a reference or starting point:

  1. Clone the repository:

    git clone https://github.com/sadmann7/tablecn.git
    
  2. Install dependencies:

    cd tablecn
    npm install
    
  3. Run the development server:

    npm run dev
    
  4. Open http://localhost:3000 in your browser to see the table component in action.

Competitor Comparisons

Marvin (5,960 stars): an ambient intelligence library

Pros of Marvin

  • More comprehensive AI development framework with broader capabilities
  • Larger community and more active development
  • Better documentation and examples for getting started

Cons of Marvin

  • More complex setup and learning curve for beginners
  • Potentially overkill if you only need a data-table UI
  • Requires more dependencies and system resources

Code Comparison

tablecn (basic data-table setup with Tanstack Table):

import { useReactTable, getCoreRowModel } from "@tanstack/react-table";

const table = useReactTable({
  data,
  columns,
  getCoreRowModel: getCoreRowModel(),
});

Marvin (AI assistant creation):

from marvin import ai_fn

@ai_fn
def generate_table(prompt: str) -> str:
    """Generate a markdown table based on the given prompt."""

print(generate_table("Create a table of top 5 programming languages"))

Summary

Marvin is a feature-rich AI development framework suitable for various AI-powered applications, while tablecn is a focused UI project: a shadcn data table with server-side sorting, filtering, and pagination. Marvin offers greater flexibility for AI use cases but is more complex; tablecn provides a straightforward solution for presenting tabular data and has no functionality beyond that scope.

langchain (112,752 stars): 🦜🔗 Build context-aware reasoning applications

Pros of langchain

  • Comprehensive framework for building LLM-powered applications
  • Large, active community with frequent updates and contributions
  • Extensive documentation and examples for various use cases

Cons of langchain

  • Steeper learning curve due to its extensive feature set
  • Can be overkill for simple projects or specific table-related tasks
  • Requires more setup and configuration compared to tablecn

Code Comparison

tablecn (adding sorting with Tanstack Table):

import { useReactTable, getCoreRowModel, getSortedRowModel } from "@tanstack/react-table";

const table = useReactTable({
  data,
  columns,
  getCoreRowModel: getCoreRowModel(),
  getSortedRowModel: getSortedRowModel(),
});

langchain:

from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.document_loaders import CSVLoader
from langchain.prompts import PromptTemplate

loader = CSVLoader("path/to/your/table.csv")
llm = OpenAI()
prompt = PromptTemplate(...)
chain = LLMChain(llm=llm, prompt=prompt)
result = chain.run(loader.load())
print(result)

Summary

While langchain offers a comprehensive toolkit for LLM-powered applications, tablecn addresses a narrower problem: rendering tabular data in a Next.js app with server-side sorting, filtering, and pagination. langchain's extensive features and community support come at the cost of increased complexity, while tablecn is a straightforward starting point for data-table UIs.

FLAML (4,179 stars): A fast library for AutoML and tuning

Pros of FLAML

  • More comprehensive AutoML framework with support for various tasks (classification, regression, forecasting, etc.)
  • Highly efficient and scalable, designed for large datasets and resource-constrained environments
  • Extensive documentation and active community support

Cons of FLAML

  • Steeper learning curve due to its broader scope and more advanced features
  • May be overkill if you only need a data-table UI rather than AutoML
  • Requires more setup and configuration compared to TableCN

Code Comparison

FLAML example:

from flaml import AutoML
automl = AutoML()
automl.fit(X_train, y_train, task="classification")
predictions = automl.predict(X_test)

tablecn example (adding pagination with Tanstack Table):

import { useReactTable, getCoreRowModel, getPaginationRowModel } from "@tanstack/react-table";

const table = useReactTable({
  data,
  columns,
  getCoreRowModel: getCoreRowModel(),
  getPaginationRowModel: getPaginationRowModel(),
});

Summary

FLAML is a comprehensive AutoML framework suitable for various machine learning tasks, while tablecn is a UI project for building data tables with server-side sorting, filtering, and pagination. FLAML offers greater flexibility and scalability for modeling work but is more complex to use; tablecn provides a simple starting point for its specific use case and has none of FLAML's machine-learning capabilities.

semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps

Pros of semantic-kernel

  • More comprehensive and feature-rich, offering a complete framework for AI integration
  • Backed by Microsoft, ensuring long-term support and regular updates
  • Supports multiple programming languages and AI models

Cons of semantic-kernel

  • Steeper learning curve due to its extensive features and capabilities
  • Potentially overkill for simple projects or specific table-related tasks
  • Requires more setup and configuration compared to tablecn

Code Comparison

semantic-kernel:

using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
var result = await kernel.RunAsync("What is the capital of France?");
Console.WriteLine(result);

tablecn (a React data table rather than a standalone table library):

const columns = [
  { accessorKey: "city", header: "City" },
  { accessorKey: "country", header: "Country" },
];
const data = [{ city: "Paris", country: "France" }];

<Table columns={columns} data={data} />

Summary

semantic-kernel is a comprehensive AI integration framework suitable for complex projects across multiple languages, while tablecn is a focused Next.js/React data-table project built on shadcn/ui and Tanstack Table. The choice between them depends on whether you are integrating AI capabilities or building a tabular UI.

openai-cookbook: Examples and guides for using the OpenAI API

Pros of openai-cookbook

  • Comprehensive collection of examples and guides for using OpenAI's APIs
  • Regularly updated with new features and best practices
  • Covers a wide range of use cases and applications

Cons of openai-cookbook

  • Focuses solely on OpenAI's products, limiting its scope
  • May be overwhelming for beginners due to its extensive content
  • Lacks specific focus on table-related tasks

Code Comparison

openai-cookbook:

import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"}
    ]
)

tablecn (configuring the data table with Tanstack Table):

const table = useReactTable({
  data,
  columns,
  getCoreRowModel: getCoreRowModel(),
  getSortedRowModel: getSortedRowModel(),
  getPaginationRowModel: getPaginationRowModel(),
});

Summary

openai-cookbook is a comprehensive resource for working with OpenAI's APIs, offering a wide range of examples and best practices. It's regularly updated but may be overwhelming for beginners. tablecn, on the other hand, is focused on data-table UIs and is easier to navigate for that specific purpose. The code examples reflect the difference in scope: openai-cookbook demonstrates an OpenAI API call in Python, while tablecn's TypeScript configures a Tanstack Table instance.

Dify (115,704 stars): Production-ready platform for agentic workflow development

Pros of Dify

  • More comprehensive AI application development platform with a wider range of features
  • Supports multiple LLM providers and offers a visual interface for building AI applications
  • Active development with frequent updates and a larger community

Cons of Dify

  • More complex setup and learning curve compared to TableCN
  • Requires more resources to run and maintain
  • May be overkill for projects that only need a table UI

Code Comparison

TableCN (React component):

<Table columns={columns} data={data} />

Dify (API request):

import requests

response = requests.post(
    "https://api.dify.ai/v1/completion",
    headers={"Authorization": "Bearer YOUR_API_KEY"},
    json={"inputs": {"question": "Your question here"}}
)

Summary

Dify is a more feature-rich platform for building AI applications, while TableCN focuses specifically on table components. Dify offers greater flexibility and integration options but comes with increased complexity. TableCN provides a simpler solution for projects primarily dealing with tabular data presentation.

README

tablecn

This is a shadcn table component with server-side sorting, filtering, and pagination. It is bootstrapped with create-t3-app.
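
Server-side sorting, filtering, and pagination are typically wired by running Tanstack Table in manual mode and forwarding the current table state to the server query. The snippet below is a minimal sketch of that pattern using standard Tanstack Table options; it is not this repository's exact code, and the state values (sorting, columnFilters, pagination) are assumed to be managed elsewhere, for example in React state or URL search params:

    import { getCoreRowModel, useReactTable } from "@tanstack/react-table";

    const table = useReactTable({
      data,                      // current page of rows returned by the server
      columns,
      pageCount,                 // total page count reported by the server
      state: { sorting, columnFilters, pagination },
      onSortingChange: setSorting,
      onColumnFiltersChange: setColumnFilters,
      onPaginationChange: setPagination,
      manualSorting: true,       // sorting happens in the database query
      manualFiltering: true,     // filtering happens in the database query
      manualPagination: true,    // pagination happens in the database query
      getCoreRowModel: getCoreRowModel(),
    });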

Documentation

See the documentation to get started.

Tech Stack

  • Next.js (bootstrapped with create-t3-app)
  • Tanstack Table v8
  • shadcn/ui (Radix UI primitives)
  • Tailwind CSS
  • PostgreSQL

Features

  • Server-side pagination, sorting, and filtering
  • Customizable columns
  • Auto-generated filters from column definitions (see the column-definition sketch after this list)
  • Dynamic data-table toolbar with search, filters, and actions
  • Notion/Airtable-like advanced filtering
  • Linear-like filter menu for command palette filtering
  • Action bar on row selection
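
Filters are generated from the column definitions. The sketch below shows what such a definition can look like using Tanstack Table's standard ColumnDef type; the row type, fields, and options are illustrative assumptions rather than this project's actual schema or API:

    import type { ColumnDef } from "@tanstack/react-table";

    // Illustrative row type; the real project defines its own schema.
    interface Task {
      title: string;
      status: "todo" | "in-progress" | "done";
    }

    export const columns: ColumnDef<Task>[] = [
      { accessorKey: "title", header: "Title" },
      {
        accessorKey: "status",
        header: "Status",
        // Per-column filtering is a standard Tanstack Table option;
        // the auto-generated filter UI is derived from definitions like this.
        enableColumnFilter: true,
      },
    ];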

Running Locally

Quick Setup (with docker)

  1. Clone the repository

    git clone https://github.com/sadmann7/tablecn
    cd tablecn
    
  2. Copy the environment variables

    cp .env.example .env
    
  3. Run the setup

    pnpm ollie
    

    This will install dependencies, start the Docker PostgreSQL instance, set up the database schema, and seed it with sample data.

Manual Setup

  1. Clone the repository

    git clone https://github.com/sadmann7/tablecn
    cd tablecn
    
  2. Install dependencies

    pnpm install
    
  3. Set up environment variables

    cp .env.example .env
    

    Update the .env file with your database credentials.

  4. Choose your database approach:

    Option A: Use Docker PostgreSQL

    # Start PostgreSQL container
    pnpm db:start
    
    # Set up database schema and seed data
    pnpm db:setup
    
    # Start development server
    pnpm dev
    

    Option B: Use existing PostgreSQL database

    # Update .env with your database URL
    # Then set up database schema and seed data
    pnpm db:setup
    
    # Start development server
    pnpm dev
    

How do I deploy this?

Follow the deployment guides for Vercel, Netlify, and Docker for more information.

Credits

  • shadcn/ui - For the initial implementation of the data table.