
FransBouma / Massive

A small, happy, dynamic MicroORM for .NET that will love you forever.


Top Related Projects

3,263

Npgsql is the .NET data provider for PostgreSQL.

17,439

Dapper - a simple object mapper for .Net

Fast, Simple, Typed ORM for .NET

13,632

EF Core is a modern object-database mapper for .NET. It supports LINQ queries, change tracking, updates, and schema migrations.

2,790

.NET Transactional Document DB and Event Store on PostgreSQL

Quick Overview

Massive is a lightweight, high-performance data access layer for .NET that simplifies database operations. It provides a flexible and intuitive API for working with various database systems, including SQL Server, PostgreSQL, and SQLite, while maintaining a clean and minimal codebase.

Pros

  • Lightweight and fast, with minimal overhead
  • Supports multiple database systems
  • Flexible query building with a fluent API
  • Easy to use and learn, with a simple and intuitive interface

Cons

  • Limited ORM features compared to full-fledged ORMs like Entity Framework
  • May require more manual SQL writing for complex queries
  • Documentation could be more comprehensive
  • Smaller community compared to more popular data access libraries

Code Examples

  1. Basic query execution:
var db = new DynamicModel("connectionStringName", tableName: "Customers", primaryKeyField: "CustomerID");
var result = db.Query("SELECT * FROM Customers WHERE City = @0", "London");
  2. Inserting data:
db.Insert(new { Name = "John Doe", City = "New York" });
  3. Updating data:
//updates the Customer with CustomerID = 1
db.Update(new { City = "Paris" }, 1);

Getting Started

  1. Install the Massive NuGet package:
Install-Package Massive
  2. Create a database connection:
using Massive;

var db = new DynamicModel("connectionString");
  3. Perform database operations:
// Query
var results = db.Query("SELECT * FROM Users WHERE Age > @0", 18);

// Insert
db.Insert(new { Name = "Alice", Age = 30 });

// Update
db.Update("Users", new { Id = 1, Name = "Bob" }, "Id");

// Delete
db.Delete("Users", new { Id = 1 });

Competitor Comparisons

3,263

Npgsql is the .NET data provider for PostgreSQL.

Pros of Npgsql

  • Specialized PostgreSQL driver with extensive support for PostgreSQL-specific features
  • Active development with frequent updates and improvements
  • Comprehensive documentation and strong community support

Cons of Npgsql

  • Limited to PostgreSQL databases only
  • Steeper learning curve for developers new to PostgreSQL

Code Comparison

Npgsql:

using var conn = new NpgsqlConnection(connectionString);
await conn.OpenAsync();
using var cmd = new NpgsqlCommand("SELECT * FROM users WHERE id = @id", conn);
cmd.Parameters.AddWithValue("id", 1);
using var reader = await cmd.ExecuteReaderAsync();

Massive:

dynamic table = new DynamicModel(connectionString, tableName: "Users", primaryKeyField: "Id");
var user = table.First(Id: 1);

Key Differences

  • Massive is a micro-ORM that supports multiple database types, while Npgsql is a dedicated PostgreSQL driver
  • Npgsql provides lower-level database access, whereas Massive offers a more abstracted, dynamic approach
  • Massive focuses on simplicity and ease of use, while Npgsql offers more fine-grained control over database operations

Use Cases

  • Choose Npgsql for PostgreSQL-specific projects requiring advanced features and optimizations
  • Opt for Massive when working with multiple database types or prioritizing rapid development with minimal setup
17,439

Dapper - a simple object mapper for .Net

Pros of Dapper

  • Higher performance and efficiency due to its lightweight nature
  • More extensive documentation and larger community support
  • Better support for complex queries and stored procedures

Cons of Dapper

  • Requires more manual SQL writing compared to Massive's dynamic query generation
  • Less abstraction from the database, which may lead to more boilerplate code
  • Steeper learning curve for developers new to micro-ORMs

Code Comparison

Dapper:

var user = connection.Query<User>("SELECT * FROM Users WHERE Id = @Id", new { Id = 1 }).FirstOrDefault();

Massive:

dynamic table = new DynamicModel(connectionString, tableName: "Users", primaryKeyField: "Id");
var user = table.First(Id: 1);

Key Differences

  • Massive provides a more dynamic and flexible API, allowing for easier query construction
  • Dapper offers better performance and is more suitable for complex scenarios
  • Massive has a simpler learning curve, while Dapper provides more control over SQL queries
  • Dapper has broader adoption and community support compared to Massive
  • Massive focuses on convention-based development, while Dapper emphasizes explicit query definition

Both libraries have their strengths and are suitable for different use cases. Dapper is often preferred for its performance and flexibility, while Massive shines in scenarios where rapid development and simplicity are prioritized.

Fast, Simple, Typed ORM for .NET

Pros of ServiceStack.OrmLite

  • More comprehensive ORM functionality, including support for complex queries and relationships
  • Better performance for large-scale applications due to its optimized query generation
  • Extensive documentation and community support

Cons of ServiceStack.OrmLite

  • Steeper learning curve due to its more complex API
  • Requires more setup and configuration compared to Massive's simplicity
  • May be overkill for small projects or simple database operations

Code Comparison

Massive:

var db = new DynamicModel("connectionStringName");
var result = db.Query("SELECT * FROM Users WHERE Age > @0", 18);

ServiceStack.OrmLite:

using (var db = dbFactory.Open())
{
    var result = db.Select<User>(u => u.Age > 18);
}

Key Differences

  • Massive focuses on simplicity and ease of use, making it ideal for rapid development and smaller projects
  • ServiceStack.OrmLite offers more advanced features and better performance for complex scenarios
  • Massive uses dynamic objects, while ServiceStack.OrmLite works with strongly-typed entities
  • ServiceStack.OrmLite provides better integration with other ServiceStack components

Conclusion

Choose Massive for quick prototyping and simple database operations. Opt for ServiceStack.OrmLite when building larger, more complex applications that require advanced ORM features and better performance.

13,632

EF Core is a modern object-database mapper for .NET. It supports LINQ queries, change tracking, updates, and schema migrations.

Pros of Entity Framework Core

  • Robust ORM with full-featured querying capabilities
  • Supports multiple database providers
  • Extensive documentation and community support

Cons of Entity Framework Core

  • Steeper learning curve for beginners
  • Can be overkill for simple database operations
  • Performance overhead for complex queries

Code Comparison

Entity Framework Core

using (var context = new BloggingContext())
{
    var blogs = context.Blogs
        .Where(b => b.Rating > 3)
        .OrderBy(b => b.Url)
        .ToList();
}

Massive

var db = new DynamicModel("northwind");
var products = db.Query("SELECT * FROM Products WHERE UnitPrice > @0", 20);

Entity Framework Core provides a more abstracted, object-oriented approach to database operations, while Massive offers a simpler, more direct SQL-like syntax. EF Core excels in complex scenarios with relationships and advanced querying, whereas Massive shines in its simplicity and ease of use for basic CRUD operations.

EF Core is better suited for large-scale applications with complex data models, while Massive is ideal for smaller projects or when rapid development is prioritized. The choice between the two depends on the specific requirements of the project, the developer's familiarity with ORMs, and the desired level of abstraction from the underlying database.

2,790

.NET Transactional Document DB and Event Store on PostgreSQL

Pros of Marten

  • Specialized for PostgreSQL, offering advanced features like document storage and event sourcing
  • Supports LINQ queries, providing a more familiar and type-safe querying experience
  • Offers built-in support for JSON document storage and querying

Cons of Marten

  • Limited to PostgreSQL, reducing database flexibility compared to Massive
  • Steeper learning curve due to more complex features and concepts
  • May have higher overhead for simple CRUD operations

Code Comparison

Marten:

using (var session = store.OpenSession())
{
    session.Store(new User { Name = "John" });
    session.SaveChanges();
}

Massive:

dynamic users = new DynamicModel("connectionStringName", tableName: "Users", primaryKeyField: "Id");
users.Insert(new { Name = "John" });

Key Differences

  • Marten is tailored for PostgreSQL and document storage, while Massive is database-agnostic and focuses on simplicity
  • Marten provides a more structured approach with sessions and LINQ support, whereas Massive offers a lightweight, dynamic API
  • Marten excels in complex scenarios like event sourcing, while Massive shines in rapid development and simple CRUD operations

Both libraries have their strengths, with Marten being more suitable for PostgreSQL-centric projects with advanced requirements, and Massive offering flexibility and simplicity across various databases.


README

Massive, a small, happy, dynamic MicroORM for .NET that will love you forever

Massive was started by Rob Conery and was transferred to Frans Bouma on March 4th, 2015. It's a small MicroORM based on the Expando / dynamic type and allows you to work with your database with almost no effort. The design is based on the idea that the code provided to you in this repository is a start: you get up and running in no time and from there edit and alter it as you see fit.

Current Status

Massive is currently on v2.0. To obtain the old v1.0 code, select the v1.0 branch in the GitHub branch menu.

Installation

To use Massive in your project simply download the following files from the repository's src folder:

  • Massive.Shared.cs
  • Massive.YourDatabase.cs, e.g. Massive.SqlServer.cs for SQL Server
  • Massive.Shared.Async.cs, if you want to use the Massive API asynchronously. Requires .NET 4.5 or higher.

Due to its design, all files share the same namespace. If you need to use more than one database in your project, you have to give each database's files their own namespace, including a separate copy of the Massive.Shared.* files per database.
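For example, a project hitting both SQL Server and PostgreSQL might arrange the copies like this (a sketch; the namespace and class names are illustrative, not part of Massive):

```csharp
// Hypothetical layout - one renamed copy of the Massive source files per database.
// MyApp.Data.SqlServer contains its own Massive.Shared.cs + Massive.SqlServer.cs:
namespace MyApp.Data.SqlServer
{
    public class Products : DynamicModel
    {
        public Products() : base("SqlServerConnection", "Products", "ProductID") { }
    }
}

// MyApp.Data.Postgres contains a second copy: Massive.Shared.cs + Massive.PostgreSql.cs:
namespace MyApp.Data.Postgres
{
    public class AuditLog : DynamicModel
    {
        public AuditLog() : base("PostgresConnection", "AuditLog", "Id") { }
    }
}
```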

Requirements

Massive has no direct external dependencies: just get the code, compile it with your project and enjoy. It does have an indirect dependency, namely on the ADO.NET provider of the database used. The ADO.NET provider is obtained through .NET's DbProviderFactory system, which requires that the provider has been set up properly so .NET can find the factory needed. The following ADO.NET providers are supported out of the box.
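As a sketch of what that setup can look like in app.config (the connection name, provider details, and assembly name here are examples only; a machine-wide registered provider doesn't need the DbProviderFactories entry):

```xml
<configuration>
  <connectionStrings>
    <add name="northwind"
         connectionString="Data Source=.;Initial Catalog=Northwind;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
  <!-- Only needed when the provider isn't registered machine-wide: -->
  <system.data>
    <DbProviderFactories>
      <add name="Npgsql Data Provider" invariant="Npgsql"
           description=".NET Data Provider for PostgreSQL"
           type="Npgsql.NpgsqlFactory, Npgsql" />
    </DbProviderFactories>
  </system.data>
</configuration>
```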

Migrating from v1.0 to v2.0

If you're currently using v1.0 and want to migrate to v2.0, please take a look at the What's new in v2.0 page to see whether the changes affect you. In general the breaking changes, if any, are minor.

What's new in v2.0

Besides the changes documented on the What's new in v2.0 page, the following features / additions are new:

  • Async / Await support. Not all supported databases support asynchronous data access under the hood, but the Massive API at least allows you to work with the code asynchronously. Full async is supported by the ADO.NET providers of SQL Server and Npgsql (3.x). ODP.NET (Oracle) doesn't support async under the hood, so using the async API with Oracle will still use synchronous data access (through the default DbCommand fallback code). SQLite's ADO.NET provider does support async using specific types, but Massive doesn't support these.
  • Shared code. In v1.0, code that was effectively the same among all supported databases was copy/pasted; in v2.0 Massive uses partial classes and shares as much code as possible among all supported databases.
  • Unit tests. v1.0 had no tests; in v2.0 we implemented a proper series of tests to verify that things indeed work the way they should. They can also be used as an example of how to get started.
  • Culling of dead code.
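The shared-code split can be sketched as follows (heavily simplified; the member names and bodies are illustrative, not Massive's actual code):

```csharp
// Massive.Shared.cs - the database-agnostic half of the class
public partial class DynamicModel
{
    public virtual IEnumerable<dynamic> Query(string sql, params object[] args)
    {
        // shared plumbing: obtains a DbCommand via DbProviderFactory and streams results
        yield break; // body elided in this sketch
    }
}

// Massive.SqlServer.cs - the per-database half of the same class
public partial class DynamicModel
{
    // per-database: tells the shared code which ADO.NET factory to load
    protected virtual string DbProviderFactoryName
    {
        get { return "System.Data.SqlClient"; }
    }
}
```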

Contributing

If you want to add new features to Massive or have a fix for a bug, that's great! There are some rules, however, and if you don't meet them I won't merge your code, no matter how long you've worked on it. This is nothing personal, and to many of you these rules might sound rigid, but this repository isn't a playground for newbies: the code is used by many people who depend on it. It therefore has to be of high quality, and changes made to the repository live on forever, so they aren't accepted without review.

  • PRs which are solely about whitespace changes are ignored.
  • Before submitting a PR, first open an issue and discuss your proposed change/feature there. This is mandatory; any PR without a linked issue is closed without comment. The main reasoning is that it prevents people from wasting time on things that will never make it into the code base, or on a PR that needs refactoring before it can be accepted because it doesn't fit the codebase. ORMs, even small ones like Massive, aren't simple: there are a lot of pitfalls, and non-ORM devs generally overlook a lot of them. Discussing a change before a PR is a good thing in this case. Also, don't be afraid to ask if you don't know how to proceed: that's why the issue is opened.
  • If your PR contains submissions from others, I can't accept your PR: a committer owns the code of a change. If you commit code into Massive that is owned by others, it's unclear whether those others were willing to share that code with the main repository.
  • Don't change the API nor its current behavior. Massive doesn't have a fixed version number, and is distributed through text files, but I still want the API to be dependable: no method is removed nor has its signature changed. For instance if you want to add functionality to a method and it requires extra arguments for that, you have to add an overload of the method, you can't simply append arguments to a method's signature. Be very careful here. Adding a new overload to a method which has a params argument at the end can easily break existing code (by causing it to unintentionally compile against your new method instead of the params version). Even simply adding optional parameters to the end of an existing method will break the API, since code which is linked against a pre-compiled version of Massive will fail. I cannot accept changes like this.
  • Tests are preferred. If your change can be tested with the current tests, no new tests are needed. If your change requires additional tests because the current tests don't cover it, add them with your PR.
  • If possible, support all databases supported by Massive. I've designed Massive v2.0 to share as much code as possible across all supported databases. New submissions are required to follow that pattern: for instance, large pieces of code specific to SQL Server which are also usable, with some tweaks, on Oracle are not accepted. Instead the code has to be made generic and added to Massive.Shared, using methods implemented in the database-specific partial classes to configure the code for that particular database. This can be a great pain, e.g. because you don't have access to Oracle or PostgreSQL. In that case, ask what you should add for these databases, or ask me to add it and test the changes locally using the tests you wrote.
  • No new files please. There's currently a Massive.Shared.Async, and the sooner I can merge that into Massive.Shared, the better (MS still supporting .NET 3.5 is currently the limitation on that, but it's likely it will be merged in the near future).
  • Code defensively. If your code accepts input from the user, be absolutely sure this input is passed on as parameters and user crap like null values and name mismatches are covered. You don't need to throw a myriad of exceptions, but at least make a bit of an effort.
  • If it takes less time for me to write the code myself than to look at your PR, tell you how to change things and go into an endless bikeshedding debate with you, chances are I'll write it myself instead of debating things with you.
  • If you add to the API, it's recommended you add a small example to the documentation in this readme below. Some people think tests are documentation, but tests are tests: they test things. Documentation documents things, so it's preferable to have documentation as well.
  • For the databases which are currently supported, free editions are available for download. You can safely assume code which works on SQL Server Express and Oracle Express / developer edition will also work on the paid commercial versions, unless you use a feature only available in those paid versions.
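The params pitfall called out in the API-stability rule above is easy to trip over; a minimal sketch (the type and methods are invented for illustration):

```csharp
public class Table
{
    // Existing API: a call like All("WHERE Age > @0", 18) binds here.
    public string All(params object[] args) { return "params overload"; }

    // Added later: a call site like All("WHERE Age > @0") that previously
    // bound to the params version now binds to this more specific overload
    // on recompile - behavior changes with no compile error at the call site.
    public string All(string where) { return "new overload"; }
}
```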

Usage

Note, the following is a work in progress and doesn't contain all the new API methods. It is primarily the original text written by Conery, and I'll update it when I have time. If you're unsure how to use a given method, please look at the tests.

Massive is a "wrapper" for your DB tables and uses System.Dynamic extensively. If you try to use this with .NET 3.5 or below, it will explode and you will be sad. Me too honestly - I like how this doesn't require any DLLs other than what's in the GAC. Yippee.

  • Get a Database. Northwind will work nicely. Add a connection to your database in your web.config (or app.config). Don't forget the providerName! If you don't know what that is - just add providerName="System.Data.SqlClient" right after the connectionString attribute.
  • Create a class that wraps a table. You can call it whatever you like, but if you want to be cool just name it the same as your table.
  • Query away and have fun

Code Please

Let's say we have a table named "Products". You create a class like this:

public class Products:DynamicModel {
	//you don't have to specify the connection - Massive will use the first one it finds in your config
	public Products():base("northwind", "products","productid") {}
}

You could also just instantiate it inline, as needed:

var tbl = new DynamicModel("northwind", tableName:"Products", primaryKeyField:"ProductID");

Or ignore the object hierarchy altogether:

Massive.DB.Current.Query(...);

Now you can query thus:

var table = new Products();
//grab all the products
var products = table.All();
//just grab from category 4. This uses named parameters
var productsFour = table.All(columns: "ProductName as Name", where: "WHERE categoryID=@0",args: 4);

That works, but Massive is "dynamic" - which means that it can figure a lot of things out on the fly. That query above can be rewritten like this:

dynamic table = new Products(); //"dynamic" is important here - don't use "var"!
var productsFour = table.Find(CategoryID:4,columns:"ProductName");

The "Find" method doesn't exist, but since Massive is dynamic it will try to infer what you mean by using DynamicObject's TryInvokeMember. See the source for more details. There's more on the dynamic query stuff down below.
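To illustrate the mechanism (a toy sketch of DynamicObject interception, not Massive's actual implementation):

```csharp
using System.Dynamic;

public class FakeTable : DynamicObject
{
    // Called for any method that doesn't exist on the class, e.g. Find or FindBy.
    public override bool TryInvokeMember(InvokeMemberBinder binder, object[] args, out object result)
    {
        // binder.Name is "Find"; named arguments ("CategoryID") arrive via CallInfo.
        var column = binder.CallInfo.ArgumentNames.Count > 0
            ? binder.CallInfo.ArgumentNames[0]
            : "?";
        result = "SELECT * FROM MyTable WHERE " + column + " = @0";
        return true;
    }
}

// dynamic t = new FakeTable();
// var sql = t.Find(CategoryID: 4);  // intercepted - no Find method exists
```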

You can also run ad-hoc queries as needed:

var result = tbl.Query("SELECT * FROM Categories");

This will pull categories and enumerate the results - streaming them as opposed to bulk-fetching them (thanks to Jeroen Haegebaert for the code).

If you want to have a paged result set - you can:

var result = tbl.Paged(where: "UnitPrice > 20", currentPage:2, pageSize: 20);

In this example, ALL of the arguments are optional and default to reasonable values. CurrentPage defaults to 1, pageSize defaults to 20, where defaults to nothing.

What you get back is a Dynamic with three properties: Items, TotalPages and TotalRecords. Items is a Query which is lazily evaluated and you can enumerate it after casting it to IEnumerable<dynamic>. TotalPages is the total number of pages in the complete result set and TotalRecords is the total number of records in the result set. What's in the Items collection is totally up to you, it's dynamic: meaning that it's malleable and exciting. It will take the shape of whatever you return in your query, and it will have properties and so on. You can assign events to it, you can create delegates on the fly. You can give it chocolate, and it will kiss you.
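Putting those three properties together, a paging loop might look like this (a sketch, assuming the Products class from earlier):

```csharp
dynamic table = new Products();
var page = table.Paged(where: "UnitPrice > 20", currentPage: 2, pageSize: 20);

Console.WriteLine("{0} records across {1} pages", page.TotalRecords, page.TotalPages);
foreach (var item in (IEnumerable<dynamic>)page.Items)
{
    Console.WriteLine(item.ProductName);
}
```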

That's pretty much it. One thing I really like is the groovy DSL that Massive uses - it looks just like SQL. If you want, you can use this DSL to query the database:

var table = new Products();
var productsThatILike = table.Query("SELECT ProductName, CategoryName FROM Products INNER JOIN Categories ON Categories.CategoryID = Products.CategoryID WHERE CategoryID = @0",5);
//get down!

Some of you might look at that and think it looks suspiciously like inline SQL. It does look sort of like it doesn't it! But I think it reads a bit better than Linq to SQL - it's a bit closer to the mark if you will.

Inserts and Updates

Massive is built on top of dynamics - so if you send an object to a table, it will get parsed into a query. If that object has a property on it that matches the primary key, Massive will think you want to update something - unless you tell it specifically to insert it.

You can send just about anything into the MassiveTransmoQueryfier and it will magically get turned into SQL:

var table = new Products();
var poopy = new {ProductName = "Chicken Fingers"};
//update Product with ProductID = 12 to have a ProductName of "Chicken Fingers"
table.Update(poopy, 12);

This also works if you have a form on your web page with an input named "ProductName" - just pass the form collection straight in:

var table = new Products();
//update Product with ProductID = 12 to have a ProductName of whatever was submitted via the form
table.Update(Request.Form, 12);

Insert works the same way:

//pretend we have a class like Products but it's called Categories
var table = new Categories();
//do it up - the inserted object will be returned from the query as expando 
var inserted = table.Insert(new {CategoryName = "Buck Fify Stuff", Description = "Things I like"});
// the new PK value is in the field specified as PrimaryKeyField in the constructor of Categories. 
var newID = inserted.CategoryID;

Yippee Skippy! Now we get to the fun part - and one of the reasons I had to spend 150 more lines of code on something you probably won't care about. What happens when we send a whole bunch of goodies to the database at once!

var table = new Products();
//OH NO YOU DIDN'T just pass in an integer inline without a parameter!
//I think I might have... yes
var drinks = table.All("WHERE CategoryID = 8");
//what we get back here is an IEnumerable<ExpandoObject> - we can go to town
foreach(var item in drinks.ToArray()){
	//turn them into Haack Snacks
	item.CategoryID = 12;
}
//Let's update these in bulk, in a transaction shall we?
table.Save(drinks.ToArray());

Named Argument Query Syntax

I recently added the ability to run more friendly queries using Named Arguments and C#4's DynamicObject.TryInvokeMember method-on-the-fly syntax. In an earlier version this was trying to be like Rails ActiveRecord (so, calls were like var drinks = table.FindBy_CategoryID(8);), but I figured "C# is NOT Ruby, and Named Arguments can be a lot more clear". So now calls look like var drinks = table.FindBy(CategoryID:8); (examples below). In addition, Mark Rendle's Simple.Data is already supporting ActiveRecord style syntax, so ... why duplicate things?

If your needs are more complicated - I would suggest just passing in your own SQL with Query().

//important - must be dynamic
dynamic table = new Products();

var drinks = table.FindBy(CategoryID:8);
//what we get back here is an IEnumerable<ExpandoObject> - we can go to town
foreach(var item in drinks){
	Console.WriteLine(item.ProductName);
}
//returns the first item in the DB for category 8
var first = table.First(CategoryID:8);

//you dig it - the last as sorted by PK
var last = table.Last(CategoryID:8);

//you can order by whatever you like
var firstButReallyLast = table.First(CategoryID:8,OrderBy:"PK DESC");

//only want one column?
var price = table.First(CategoryID:8,Columns:"UnitPrice").UnitPrice;

//Multiple Criteria?
var items = table.Find(CategoryID:5, UnitPrice:100, OrderBy:"UnitPrice DESC");

Aggregates with Named Arguments

You can do the same thing as above for aggregates:

var sum = table.Sum(columns:"Price", CategoryID:5);
var avg = table.Avg(columns:"Price", CategoryID:3);
var min = table.Min(columns:"ID");
var max = table.Max(columns:"CreatedOn");
var count = table.Count();

Metadata

If you find that you need to know information about your table - to generate some lovely things like ... whatever - just ask for the Schema property. This will query INFORMATION_SCHEMA for you, and you can take a look at DATA_TYPE, DEFAULT_VALUE, etc for whatever system you're running on.

In addition, if you want to generate an empty instance of a record for your table - you can now ask for a "Prototype()" - which will return all the columns in your table with the defaults set for you (getdate(), raw values, newid(), etc).

Factory Constructor

One thing that can be useful is to use Massive to just run a quick query. You can do that now by using "Open()" which is a static builder on DynamicModel:

var db = Massive.DynamicModel.Open("myConnectionStringName");

You can execute whatever you like at that point.

Validations

One thing that's always needed when working with data is the ability to stop execution if something isn't right. Massive now has Validations, which are built with the Rails approach in mind:

public class Productions:DynamicModel {
	public Productions():base("MyConnectionString","Productions","ID") {}
	public override void Validate(dynamic item) {
		ValidatesPresenceOf("Title");
		ValidatesNumericalityOf(item.Price);
		ValidateIsCurrency(item.Price);
		if (item.Price <= 0)
			Errors.Add("Price can't be negative");
	}
}

The idea here is that Validate() is called prior to Insert/Update. If it fails, the Errors collection is populated and an InvalidOperationException is thrown. That simple. With each of the validations above, a custom message can be passed in.

CallBacks

Need something to happen after Update/Insert/Delete? Need to halt before save? Massive has callbacks to let you do just that:

public class Customers:DynamicModel {
	public Customers():base("MyConnectionString","Customers","ID") {}
	
	//Add the person to Highrise CRM when they're added to the system...
	public override void Inserted(dynamic item) {
		//send them to Highrise
		var svc = new HighRiseApi();
		svc.AddPerson(...);
	}
}

The callbacks you can use are:

  • Inserted
  • Updated
  • Deleted
  • BeforeDelete
  • BeforeSave