go-cache
An in-memory key:value store/cache (similar to Memcached) library for Go, suitable for single-machine applications.
Top Related Projects
- bigcache: Efficient cache for gigabytes of data written in Go.
- freecache: A cache library for Go with zero GC overhead.
- gocache: ☔️ A complete Go cache library that brings you multiple ways of managing your caches.
- ristretto: A high performance memory-bound Go cache.
- cache2go: Concurrency-safe Go caching library with expiration capabilities and access counters.
- groupcache: A caching and cache-filling library, intended as a replacement for memcached in many cases.
Quick Overview
patrickmn/go-cache is a lightweight in-memory key-value store/cache library for Go. It supports expiration of items, thread-safe operations, and provides a simple API for caching operations. The library is designed to be easy to use and efficient for small to medium-sized caching needs.
Pros
- Simple and easy-to-use API
- Thread-safe operations
- Support for item expiration
- No external dependencies
Cons
- Limited to in-memory storage (not suitable for distributed caching)
- No built-in persistence mechanism
- Limited advanced features compared to more complex caching solutions
Code Examples
- Creating a cache and setting an item:
import (
	"fmt"
	"time"

	"github.com/patrickmn/go-cache"
)
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("foo", "bar", cache.DefaultExpiration)
- Getting an item from the cache:
foo, found := c.Get("foo")
if found {
fmt.Println(foo)
}
- Deleting an item from the cache:
c.Delete("foo")
- Setting an item with a custom expiration time:
c.Set("temp", "temporary value", 30*time.Second)
Getting Started
To use go-cache in your Go project, follow these steps:
- Install the library:
go get github.com/patrickmn/go-cache
- Import the library in your Go code:
import "github.com/patrickmn/go-cache"
- Create a new cache instance:
c := cache.New(5*time.Minute, 10*time.Minute)
- Use the cache in your application:
c.Set("key", "value", cache.DefaultExpiration)
value, found := c.Get("key")
if found {
	// Use the value
}
That's it! You can now use go-cache for in-memory caching in your Go applications.
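Putting the steps together, here is a minimal complete program (the key and value are placeholders). Note that Get returns an interface{}, so a type assertion is needed before the value can be used as a string:
package main

import (
	"fmt"
	"time"

	"github.com/patrickmn/go-cache"
)

func main() {
	c := cache.New(5*time.Minute, 10*time.Minute)
	c.Set("key", "value", cache.DefaultExpiration)

	if x, found := c.Get("key"); found {
		// Values come back as interface{}, so assert the concrete type before use.
		value := x.(string)
		fmt.Println(value)
	}
}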
Competitor Comparisons
bigcache: Efficient cache for gigabytes of data written in Go.
Pros of bigcache
- Designed for high concurrency and better performance with large data sets
- Uses sharding to reduce lock contention
- Optimized for storing serialized data (byte slices)
Cons of bigcache
- Limited flexibility in data types (primarily for byte slices)
- No built-in expiration mechanism for individual items
- More complex setup and configuration compared to go-cache
Code comparison
go-cache:
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("key", "value", cache.DefaultExpiration)
value, found := c.Get("key")
bigcache:
config := bigcache.DefaultConfig(10 * time.Minute)
cache, _ := bigcache.NewBigCache(config)
cache.Set("key", []byte("value"))
value, _ := cache.Get("key")
Key differences
- Data types: go-cache supports various data types, while bigcache is optimized for byte slices.
- Expiration: go-cache allows per-item expiration, while bigcache uses a single, cache-wide expiration setting.
- Concurrency: bigcache is designed for high concurrency scenarios, while go-cache is simpler but may have lower performance under high load.
- Memory usage: bigcache is more memory-efficient for large datasets due to its internal structure.
- Ease of use: go-cache has a simpler API and is easier to set up for basic use cases.
Choose bigcache for high-concurrency applications with large datasets, and go-cache for simpler use cases with diverse data types and fine-grained expiration control.
freecache: A cache library for Go with zero GC overhead.
Pros of freecache
- Better memory management with zero GC overhead
- Higher concurrency with sharding
- Supports limiting cache size to avoid memory issues
Cons of freecache
- More complex API and usage
- Lacks some features like expiration callbacks
Code Comparison
freecache:
cacheSize := 100 * 1024 * 1024
cache := freecache.NewCache(cacheSize)
key := []byte("key")
val := []byte("value")
cache.Set(key, val, 60)
got, err := cache.Get(key)
go-cache:
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("key", "value", cache.DefaultExpiration)
value, found := c.Get("key")
Key Differences
- Memory Management: freecache uses a pre-allocated fixed-size memory pool, while go-cache relies on Go's garbage collector.
- Concurrency: freecache employs sharding for better performance in high-concurrency scenarios.
- API: go-cache has a simpler, more intuitive API, while freecache requires working with byte slices.
- Features: go-cache offers extras such as an eviction callback (OnEvicted) and the ability to snapshot and restore the cache contents, sketched below.
- Use Cases: freecache is better suited for high-performance, memory-sensitive applications, while go-cache is more versatile for general-purpose caching needs.
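To make the callback difference concrete, here is a minimal sketch using go-cache's OnEvicted hook (the key, value, and print statement are illustrative); freecache offers no direct per-item equivalent:
package main

import (
	"fmt"
	"time"

	"github.com/patrickmn/go-cache"
)

func main() {
	c := cache.New(5*time.Minute, 10*time.Minute)
	// The callback runs whenever an item is removed, either by the expiration
	// janitor or by an explicit Delete.
	c.OnEvicted(func(key string, value interface{}) {
		fmt.Printf("evicted %s: %v\n", key, value)
	})
	c.Set("session", "data", 30*time.Second)
	c.Delete("session") // triggers the callback immediately
}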
gocache: ☔️ A complete Go cache library that brings you multiple ways of managing your caches.
Pros of gocache
- Supports multiple cache stores (in-memory, Redis, Memcache, etc.)
- Offers advanced features like marshaling, chaining, and metrics
- Provides a more flexible and extensible architecture
Cons of gocache
- More complex setup and configuration compared to go-cache
- Potentially higher memory footprint due to additional features
- Steeper learning curve for developers new to caching systems
Code Comparison
go-cache:
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("key", "value", cache.DefaultExpiration)
value, found := c.Get("key")
gocache:
gocacheStore := store.NewGoCache(gocache.New(5*time.Minute, 10*time.Minute))
cacheManager := cache.New(gocacheStore)
err := cacheManager.Set("key", "value", &store.Options{Expiration: 5*time.Minute})
value, err := cacheManager.Get("key")
Summary
gocache offers more features and flexibility, supporting multiple cache stores and advanced functionalities. However, it comes with increased complexity and potentially higher resource usage. go-cache provides a simpler, lightweight solution for basic in-memory caching needs. The choice between the two depends on the specific requirements of your project, such as the need for multiple cache stores or advanced features versus simplicity and ease of use.
Ristretto: A high performance memory-bound Go cache.
Pros of Ristretto
- Higher performance and better memory efficiency due to advanced algorithms
- Built-in metrics and statistics for monitoring cache behavior
- Support for concurrent access without additional synchronization
Cons of Ristretto
- More complex API and setup compared to go-cache
- Requires more configuration to optimize for specific use cases
- Newer project with potentially less community support
Code Comparison
go-cache:
import "github.com/patrickmn/go-cache"
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("foo", "bar", cache.DefaultExpiration)
foo, found := c.Get("foo")
Ristretto:
import "github.com/dgraph-io/ristretto"
cache, _ := ristretto.NewCache(&ristretto.Config{
NumCounters: 1e7, // number of keys to track frequency of (10M).
MaxCost: 1 << 30, // maximum cost of cache (1GB).
BufferItems: 64, // number of keys per Get buffer.
})
cache.Set("foo", "bar", 1)
value, found := cache.Get("foo")
Both libraries provide in-memory caching for Go applications, but Ristretto offers more advanced features and potentially better performance at the cost of increased complexity. go-cache is simpler to use and may be sufficient for less demanding applications. The choice between them depends on specific project requirements and performance needs.
cache2go: Concurrency-safe Go caching library with expiration capabilities and access counters.
Pros of cache2go
- Supports automatic cache item expiration with custom callback functions
- Provides statistical information about cache usage
- Offers data loader functionality for lazy loading of cache items
Cons of cache2go
- Less actively maintained compared to go-cache (fewer recent updates)
- Lacks some advanced features like saving/loading cache to disk
Code Comparison
go-cache:
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("foo", "bar", cache.DefaultExpiration)
foo, found := c.Get("foo")
cache2go:
cache := cache2go.Cache("myCache")
cache.Add("foo", 5*time.Minute, "bar")
res, err := cache.Value("foo")
Both libraries offer simple in-memory caching solutions for Go applications. go-cache provides a more straightforward API and is more actively maintained, while cache2go offers some additional features like statistics and data loading. The choice between the two depends on specific project requirements and desired functionality.
groupcache: A caching and cache-filling library, intended as a replacement for memcached in many cases.
Pros of groupcache
- Designed for distributed caching across multiple machines
- Supports automatic replication and load balancing
- Implements a more advanced consistent hashing algorithm
Cons of groupcache
- More complex to set up and use compared to go-cache
- Less flexible for single-machine use cases
- Not actively maintained (last commit in 2020)
Code Comparison
go-cache:
c := cache.New(5*time.Minute, 10*time.Minute)
c.Set("key", "value", cache.DefaultExpiration)
value, found := c.Get("key")
groupcache:
group := groupcache.NewGroup("myCache", 64<<20, groupcache.GetterFunc(
func(ctx context.Context, key string, dest groupcache.Sink) error {
dest.SetString("value")
return nil
},
))
ctx := context.Background()
var value string
group.Get(ctx, "key", groupcache.StringSink(&value))
Key Differences
- Scope: go-cache is an in-memory cache for single machines, while groupcache is designed for distributed systems.
- API: go-cache has a simpler API, making it easier to use for basic caching needs.
- Features: groupcache offers advanced features like automatic replication and load balancing, which go-cache lacks.
- Maintenance: go-cache is actively maintained, while groupcache hasn't seen updates since 2020.
- Use cases: go-cache is better suited for simpler, single-machine applications, while groupcache shines in distributed environments; a minimal peer setup is sketched below.
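To illustrate the distributed setup, here is a minimal sketch of wiring groupcache peers over HTTP. The addresses, group name, and 64 MB cache size are illustrative assumptions, and a second peer process is assumed to be running at the other address:
package main

import (
	"context"
	"log"
	"net/http"

	"github.com/golang/groupcache"
)

func main() {
	// Each process registers itself and its peers; HTTPPool implements http.Handler,
	// so peers fetch cached values from each other over HTTP.
	pool := groupcache.NewHTTPPool("http://localhost:8080")
	pool.Set("http://localhost:8080", "http://localhost:8081")

	group := groupcache.NewGroup("myCache", 64<<20, groupcache.GetterFunc(
		func(ctx context.Context, key string, dest groupcache.Sink) error {
			// Called on a cache miss; a real application would load from a database here.
			return dest.SetString("value for " + key)
		},
	))

	var value string
	if err := group.Get(context.Background(), "key", groupcache.StringSink(&value)); err != nil {
		log.Println(err)
	}

	log.Fatal(http.ListenAndServe(":8080", pool))
}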
README
go-cache
go-cache is an in-memory key:value store/cache similar to memcached that is suitable for applications running on a single machine. Its major advantage is that, being essentially a thread-safe map[string]interface{} with expiration times, it doesn't need to serialize or transmit its contents over the network. Any object can be stored, for a given duration or forever, and the cache can be safely used by multiple goroutines.
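For illustration, a minimal sketch of concurrent use from several goroutines; the key names and worker count are arbitrary:
package main

import (
	"fmt"
	"sync"

	"github.com/patrickmn/go-cache"
)

func main() {
	c := cache.New(cache.NoExpiration, 0)
	var wg sync.WaitGroup
	for i := 0; i < 4; i++ {
		wg.Add(1)
		go func(n int) {
			defer wg.Done()
			// Set and Get are safe to call from multiple goroutines without extra locking.
			key := fmt.Sprintf("worker-%d", n)
			c.Set(key, n, cache.DefaultExpiration)
			if v, found := c.Get(key); found {
				fmt.Println(key, v)
			}
		}(i)
	}
	wg.Wait()
}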
Although go-cache isn't meant to be used as a persistent datastore, the entire cache can be saved to and loaded from a file (using c.Items() to retrieve the items map to serialize, and NewFrom() to create a cache from a deserialized one) to recover from downtime quickly. (See the docs for NewFrom() for caveats.)
Installation
go get github.com/patrickmn/go-cache
Usage
import (
"fmt"
"github.com/patrickmn/go-cache"
"time"
)
func main() {
// Create a cache with a default expiration time of 5 minutes, and which
// purges expired items every 10 minutes
c := cache.New(5*time.Minute, 10*time.Minute)
// Set the value of the key "foo" to "bar", with the default expiration time
c.Set("foo", "bar", cache.DefaultExpiration)
// Set the value of the key "baz" to 42, with no expiration time
// (the item won't be removed until it is re-set, or removed using
// c.Delete("baz")
c.Set("baz", 42, cache.NoExpiration)
// Get the string associated with the key "foo" from the cache
foo, found := c.Get("foo")
if found {
fmt.Println(foo)
}
// Since Go is statically typed, and cache values can be anything, type
// assertion is needed when values are being passed to functions that don't
// take arbitrary types, (i.e. interface{}). The simplest way to do this for
// values which will only be used once--e.g. for passing to another
// function--is:
foo, found := c.Get("foo")
if found {
MyFunction(foo.(string))
}
// This gets tedious if the value is used several times in the same function.
// You might do either of the following instead:
if x, found := c.Get("foo"); found {
foo := x.(string)
// ...
}
// or
var foo string
if x, found := c.Get("foo"); found {
foo = x.(string)
}
// ...
// foo can then be passed around freely as a string
// Want performance? Store pointers!
c.Set("foo", &MyStruct, cache.DefaultExpiration)
if x, found := c.Get("foo"); found {
foo := x.(*MyStruct)
// ...
}
}
Reference
Full API documentation: https://pkg.go.dev/github.com/patrickmn/go-cache