
lunary-ai/lunary

The production toolkit for LLMs. Observability, prompt management and evaluations.


Top Related Projects

The official Python library for the OpenAI API


🦜🔗 Build context-aware reasoning applications

Integrate cutting-edge LLM technology quickly and easily into your apps

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Quick Overview

The lunary-ai/lunary repository is an open-source production toolkit for LLM applications. It provides observability (logging, tracing, and analytics for model calls), prompt management with versioning, and evaluation tooling. SDKs are available for Python and JavaScript, and the platform can be self-hosted.

Pros

  • Focused Observability: Conversation and feedback tracking, cost/token/latency analytics, and debugging via logs and traces.
  • Easy Integration: Wrapping an existing LLM client takes only a couple of lines of code.
  • Model-Agnostic: Works with any model, not just OpenAI, via native modules or manually sent events.
  • Open-Source and Self-Hostable: The full platform can run on your own infrastructure, and developers can contribute to and extend the project.

Cons

  • Smaller Ecosystem: The community and documentation are smaller than those of major frameworks such as LangChain or Transformers.
  • Self-Hosting Overhead: Running it yourself requires setting up a PostgreSQL instance alongside the backend and frontend packages.
  • Hosted Free Tier Limits: The hosted version's free plan is capped at 10k requests per month.
  • Narrow Scope: It is an observability and prompt-management layer, not an orchestration or model-serving framework, so it is used alongside other tooling rather than instead of it.

Code Examples

Here are a few examples of how to use the lunary SDK. These are sketches based on the project's documented Python integration; exact APIs may vary between SDK versions.

# Wrap an OpenAI client so that every call is logged to Lunary
from openai import OpenAI
import lunary

client = OpenAI()
lunary.monitor(client)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)

# Trace LangChain runs with the Lunary callback handler
from lunary import LunaryCallbackHandler

handler = LunaryCallbackHandler()
# Pass callbacks=[handler] to your LangChain model or chain

Getting Started

To get started with the lunary SDK, follow these steps:

  1. Install the Python SDK using pip:
pip install lunary
  2. Import the SDK and wrap your LLM client:
from openai import OpenAI
import lunary

client = OpenAI()
lunary.monitor(client)
  3. Use the wrapped client as usual; calls are logged to Lunary automatically:
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)
  4. Explore the library's documentation to learn more about tracing, prompt templates, and evaluations.

Competitor Comparisons

The official Python library for the OpenAI API

Pros of openai-python

  • Official OpenAI SDK, ensuring direct compatibility and up-to-date features
  • Comprehensive documentation and extensive community support
  • Broader scope, covering all OpenAI API functionalities

Cons of openai-python

  • Lacks specific features for logging and monitoring AI interactions
  • May require additional setup for tracking and analyzing API usage
  • Not tailored for seamless integration with observability tools

Code Comparison

openai-python:

from openai import OpenAI

client = OpenAI(api_key="your-api-key")
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)

Lunary:

from openai import OpenAI
import lunary

client = OpenAI(api_key="your-api-key")
lunary.monitor(client)  # wraps the client so calls are logged

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello, world!"}],
)
print(response.choices[0].message.content)

The Lunary code wraps the OpenAI client with monitor(), so the API call is automatically logged and traced. This demonstrates Lunary's focus on observability of AI operations, while openai-python provides the raw API interaction without built-in monitoring features.


🦜🔗 Build context-aware reasoning applications

Pros of LangChain

  • More comprehensive and feature-rich, offering a wide range of tools and integrations for building AI applications
  • Larger and more active community, resulting in frequent updates and extensive documentation
  • Supports multiple programming languages, including Python and JavaScript

Cons of LangChain

  • Steeper learning curve due to its extensive feature set and complexity
  • Can be overkill for simpler projects, potentially leading to unnecessary overhead
  • Requires more setup and configuration compared to Lunary's streamlined approach

Code Comparison

LangChain example:

from langchain_openai import ChatOpenAI
from langchain_core.prompts import PromptTemplate

llm = ChatOpenAI(temperature=0.9)
prompt = PromptTemplate.from_template(
    "What is a good name for a company that makes {product}?"
)
chain = prompt | llm

Lunary example (instrumenting a LangChain model):

from lunary import LunaryCallbackHandler

handler = LunaryCallbackHandler()
llm = ChatOpenAI(temperature=0.9, callbacks=[handler])

The LangChain example demonstrates its structured approach with separate components for the language model, prompt template, and chain. Lunary does not replace those components; it plugs into them as a callback handler so that the chain's runs are logged and traced.

Integrate cutting-edge LLM technology quickly and easily into your apps

Pros of Semantic Kernel

  • More comprehensive and feature-rich, offering a wide range of AI integration capabilities
  • Backed by Microsoft, potentially providing better long-term support and resources
  • Extensive documentation and examples available for developers

Cons of Semantic Kernel

  • Steeper learning curve due to its complexity and broader scope
  • May be overkill for simpler AI integration projects
  • Requires more setup and configuration compared to Lunary

Code Comparison

Semantic Kernel (C#):

var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion("gpt-4o-mini", apiKey);
var kernel = builder.Build();
var result = await kernel.InvokePromptAsync("Generate a story about a brave knight");

Lunary (Python), instrumenting the same prompt through an OpenAI call:

from openai import OpenAI
import lunary

client = OpenAI()
lunary.monitor(client)  # the call below is logged to Lunary
result = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Generate a story about a brave knight"}],
)

Summary

Semantic Kernel offers a comprehensive AI orchestration framework with extensive features and Microsoft backing, but comes with a steeper learning curve. Lunary is not a competing orchestration framework: it is an observability layer that instruments whichever SDK you use, so it can be adopted quickly alongside Semantic Kernel or on its own. The choice depends on whether a project needs orchestration, monitoring, or both.

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Pros of Transformers

  • Extensive library of pre-trained models for various NLP tasks
  • Well-documented with comprehensive examples and tutorials
  • Large and active community support

Cons of Transformers

  • Steeper learning curve for beginners
  • Larger library size and potential overhead for simple projects
  • May require more computational resources for training and inference

Code Comparison

Transformers:

from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("I love this product!")[0]
print(f"Label: {result['label']}, Score: {result['score']:.4f}")

Lunary:

from openai import OpenAI
import lunary

client = OpenAI()
lunary.monitor(client)  # subsequent calls through the client are logged

Transformers focuses on running state-of-the-art models locally through ready-to-use pipelines, while Lunary is oriented towards logging and monitoring LLM applications. Transformers offers a wide range of pre-trained models and tools for various NLP tasks, making it suitable for complex language processing projects. Lunary, by contrast, prioritizes tracking model usage, cost, and performance, which is most useful for monitoring and debugging in production environments.


README


lunary

Developer toolkit for LLM chatbots

website - docs


Features

Lunary helps developers of LLM chatbots build, debug, and improve their applications.

  • 🖲️ Conversation & feedback tracking
  • 💵 Analytics (costs, tokens, latency, ...)
  • 🔍 Debugging (logs, traces, user tracking, ...)
  • ⛩️ Prompt Directory (versioning, team collaboration, ...)
  • 🏷️ Create fine-tuning datasets
  • 🧪 Automatic topic classification

It is also designed to be:

  • 🤖 Usable with any model, not just OpenAI
  • 📦 Easy to integrate (2 minutes)
  • 🧑‍💻 Self-hostable
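The quick integration works by wrapping an existing client rather than replacing it. As a rough illustration of that pattern (a conceptual sketch, not Lunary's actual implementation), a wrapper can record latency and status around each model call:

```python
import time
from functools import wraps

def monitor(fn, events):
    """Record latency and status for each call to fn.
    Conceptual sketch of client instrumentation, not Lunary's real code."""
    @wraps(fn)
    def wrapped(*args, **kwargs):
        start = time.perf_counter()
        status = "error"
        try:
            result = fn(*args, **kwargs)
            status = "success"
            return result
        finally:
            events.append({
                "name": fn.__name__,
                "status": status,
                "latency_ms": (time.perf_counter() - start) * 1000,
            })
    return wrapped

events = []

def fake_llm(prompt):
    # Stand-in for a real model call
    return f"echo: {prompt}"

fake_llm = monitor(fake_llm, events)
print(fake_llm("hello"))    # → echo: hello
print(events[0]["status"])  # → success
```

The real SDKs follow the same idea: existing code keeps calling the client as before, while the wrapper ships metadata about each call to the backend.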

1-min Demo

https://github.com/lunary-ai/lunary/assets/5092466/a2b4ba9b-4afb-46e3-9b6b-faf7ddb4a931

⚙️ Integration

SDK modules are available for JavaScript and Python.

Lunary natively supports popular providers and frameworks such as OpenAI and LangChain.

Additionally, you can use it with any other LLM by manually sending events.
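A manually sent event is just a small JSON payload posted to the Lunary backend. The sketch below is purely illustrative: the endpoint path and field names are hypothetical, not Lunary's documented schema.

```python
import json
import uuid
import urllib.request
from datetime import datetime, timezone

def build_event(run_type, event_name, model, input_text, output_text=None):
    """Build an illustrative event payload (field names are hypothetical)."""
    return {
        "type": run_type,          # e.g. "llm"
        "event": event_name,       # e.g. "start" or "end"
        "runId": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "name": model,
        "input": input_text,
        "output": output_text,
    }

def send_event(event, api_url="http://localhost:3333"):
    """POST the event to a self-hosted backend (illustrative endpoint path)."""
    req = urllib.request.Request(
        f"{api_url}/v1/events",  # hypothetical path
        data=json.dumps(event).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

event = build_event("llm", "end", "my-model", "Hello", "Hi there!")
print(event["type"], event["name"])  # → llm my-model
```

See the documentation for the actual event schema expected by the backend.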

📚 Documentation

Full documentation is available on the website.

☁️ Hosted version

We offer a hosted version with a free plan of up to 10k requests / month.

With the hosted version:

  • 👷 don't worry about devops or managing updates
  • 🙋 get priority 1:1 support with our team
  • 🇪🇺 your data is stored safely in Europe

Running locally

  1. Clone the repository
  2. Setup a PostgreSQL instance (version 15 minimum)
  3. Copy the content of packages/backend/.env.example to packages/backend/.env and fill the missing values
  4. Copy the content of packages/frontend/.env.example to packages/frontend/.env
  5. Run npm install
  6. Run npm run migrate:db
  7. Run npm run dev

You can now open the dashboard at http://localhost:8080. When using our JS or Python SDK, set the environment variable LUNARY_API_URL to http://localhost:3333. You can set LUNARY_VERBOSE=True to see all the events sent by the SDK.
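In Python, these variables can also be set in code before the SDK is imported; the values below match the local setup described above:

```python
import os

# Point the Lunary SDK at the locally running backend
os.environ["LUNARY_API_URL"] = "http://localhost:3333"
# Log every event the SDK sends (useful when verifying the setup)
os.environ["LUNARY_VERBOSE"] = "True"

print(os.environ["LUNARY_API_URL"])  # → http://localhost:3333
```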

🙋 Support

Need help or have questions? Chat with us on the website or email us: hello [at] lunary.ai. We're here to help every step of the way.

License

This project is licensed under the Apache 2.0 License.
