Recently, I discovered a surprisingly reliable in-memory caching solution that I plan to use in my future applications to increase performance. In this blog post, I will share some code examples of how you can integrate the Ristretto caching library into your application.
Ristretto is a fast, concurrent cache library built with a focus on performance and correctness.
This library was created by the Dgraph team as a contention-free cache for the Dgraph database.
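Before wiring it into a repository, here is a minimal, self-contained sketch of the basic Ristretto API: creating a cache, setting a value, and reading it back. The configuration values are the illustrative ones from the library's README, and Wait is available in recent versions of the library:

package main

import (
    "fmt"

    "github.com/dgraph-io/ristretto"
)

func main() {
    // The config values below are the illustrative ones from the
    // Ristretto README, not tuned for any particular workload.
    cache, err := ristretto.NewCache(&ristretto.Config{
        NumCounters: 1e7,     // Number of keys to track frequency of (10M).
        MaxCost:     1 << 30, // Maximum cost of cache (1GB).
        BufferItems: 64,      // Number of keys per Get buffer.
    })
    if err != nil {
        panic(err)
    }

    // Set is applied asynchronously through internal buffers, so wait
    // for it to land before reading (Wait exists in recent versions).
    cache.Set("greeting", "hello", 1)
    cache.Wait()

    if value, found := cache.Get("greeting"); found {
        fmt.Println(value) // Prints: hello
    }
}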
Let’s dive into the practical example. We are going to build a simple application that gets a list of users from a repository. In the first iteration, there will be no caching layer at all. In the second iteration, we will add a Ristretto caching layer and compare execution times.
Below, you can see that I defined a repository package with the Repository interface and an InMemoryRepository implementation:
package repository

import "fmt"

// Repository interface to handle users data.
type Repository interface {
    GetUsers() map[int]string
}

// InMemoryRepository is an in-memory implementation of the Repository interface.
type InMemoryRepository struct{}

// NewInMemoryRepository constructs and returns an InMemoryRepository.
func NewInMemoryRepository() *InMemoryRepository {
    return &InMemoryRepository{}
}

// GetUsers returns 50000 dummy users from the in-memory repository.
func (r *InMemoryRepository) GetUsers() map[int]string {
    users := make(map[int]string)
    for i := 1; i <= 50000; i++ {
        users[i] = fmt.Sprintf("User %d", i)
    }
    return users
}
Next, we are going to call the GetUsers() method 100 times to simulate calling the same function from several places, as in a real-world application:
package main

import (
    "github.com/alexsergivan/blog-examples/ristretto/repository"
)

func main() {
    for i := 0; i < 100; i++ {
        UsersGetter(repository.NewInMemoryRepository())
    }
}

// UsersGetter fetches the users through the given repository.
func UsersGetter(repository repository.Repository) {
    repository.GetUsers()
}
Let’s measure how much time it takes to execute it with time go run main.go:
1.46s user 0.34s system 106% cpu 1.686 total
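As a side note, time go run main.go also measures compiling and starting the program. To time GetUsers in isolation, a standard Go benchmark could be used; the following is a hypothetical benchmark file (not part of the original example) that would live inside the repository package:

package repository

import "testing"

// BenchmarkGetUsers measures a single GetUsers call per iteration.
func BenchmarkGetUsers(b *testing.B) {
    repo := NewInMemoryRepository()
    b.ResetTimer()
    for i := 0; i < b.N; i++ {
        repo.GetUsers()
    }
}

It can be run with go test -bench=. ./repository.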
Next, we are going to add a caching layer to our application.
Don’t forget to get the Ristretto library:
go get github.com/dgraph-io/ristretto
Inside the repository package, we inject the Ristretto cache:
package repository

import (
    "fmt"
    "time"

    "github.com/dgraph-io/ristretto"
)

// InMemoryRepository is an in-memory implementation of the Repository interface,
// backed by a Ristretto cache.
type InMemoryRepository struct {
    cache *ristretto.Cache
}

// NewInMemoryRepository constructs and returns an InMemoryRepository.
func NewInMemoryRepository(ristrettoCache *ristretto.Cache) *InMemoryRepository {
    return &InMemoryRepository{
        cache: ristrettoCache,
    }
}

// GetUsers returns 50000 dummy users from the in-memory repository.
func (r *InMemoryRepository) GetUsers() map[int]string {
    key := "users"
    value, found := r.cache.Get(key)
    // If the users data is not cached yet, build it and cache it.
    if !found {
        users := make(map[int]string)
        for i := 1; i <= 50000; i++ {
            users[i] = fmt.Sprintf("User %d", i)
        }
        // Add the data to the cache for 1 hour with a cost of 1.
        r.cache.SetWithTTL(key, users, 1, 1*time.Hour)
        // Ristretto applies Sets asynchronously, so give the buffered
        // write a moment to land before the next call reads the cache.
        time.Sleep(10 * time.Millisecond)
        return users
    }
    return value.(map[int]string)
}
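One detail worth calling out: the time.Sleep after SetWithTTL exists because Ristretto applies writes asynchronously through internal buffers, so a Get issued right after a Set can miss the value. Recent versions of the library expose a Wait method for exactly this; assuming such a version, a hypothetical helper could replace the arbitrary sleep:

package repository

import (
    "time"

    "github.com/dgraph-io/ristretto"
)

// cacheUsers is a hypothetical helper: it stores the users map with a
// 1-hour TTL and blocks until the buffered write has been applied, so a
// subsequent Get will see it (requires a Ristretto version that
// provides Cache.Wait).
func cacheUsers(cache *ristretto.Cache, key string, users map[int]string) {
    cache.SetWithTTL(key, users, 1, 1*time.Hour)
    cache.Wait()
}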
Next, inside the main() function, we initialize a new Ristretto cache and pass it to the InMemoryRepository:
package main

import (
    "github.com/alexsergivan/blog-examples/ristretto/repository"
    "github.com/dgraph-io/ristretto"
)

func main() {
    ristrettoCache, _ := ristretto.NewCache(&ristretto.Config{
        NumCounters: 1e7,     // Number of keys to track frequency of (10M).
        MaxCost:     1 << 30, // Maximum cost of cache (1GB).
        BufferItems: 64,      // Number of keys per Get buffer.
    })
    for i := 0; i < 100; i++ {
        UsersGetter(repository.NewInMemoryRepository(ristrettoCache))
    }
}

// UsersGetter fetches the users through the given repository.
func UsersGetter(repository repository.Repository) {
    repository.GetUsers()
}
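Note that the snippet above discards the error returned by ristretto.NewCache for brevity. In real code it is worth checking it and closing the cache when it is no longer needed; here is a sketch with the same configuration and hypothetical error handling added:

package main

import (
    "log"

    "github.com/dgraph-io/ristretto"
)

func main() {
    ristrettoCache, err := ristretto.NewCache(&ristretto.Config{
        NumCounters: 1e7,     // Number of keys to track frequency of (10M).
        MaxCost:     1 << 30, // Maximum cost of cache (1GB).
        BufferItems: 64,      // Number of keys per Get buffer.
    })
    if err != nil {
        log.Fatalf("failed to create ristretto cache: %v", err)
    }
    // Close stops the cache's internal goroutines; call it once the
    // cache is no longer needed.
    defer ristrettoCache.Close()

    // ... pass ristrettoCache to repository.NewInMemoryRepository as above.
}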
Let’s check how much time it takes to perform the same action:
0.29s user 0.26s system 147% cpu 0.377 total
As you can see, the total time is about four times lower than in the example without the caching layer.
Although this is a contrived example, I hope it gives you an idea of how to integrate Ristretto caching into your application and how it can improve overall performance.
You can find the complete source code here.