Ah, caching - the developer’s equivalent of hiding snacks in your desk drawer. But instead of emergency chocolate, we’re stashing frequently accessed data to save those precious database roundtrips. Let’s roll up our sleeves and implement some database-level caching in Go, complete with code samples and battle-tested patterns.
The Cache Conundrum: To Store or Not to Store?
Database caching works like your brain’s muscle memory for frequent tasks. As Prisma’s guide notes, it’s all about keeping hot data ready-to-serve. But beware the siren song of over-caching - nothing ruins a party like stale data!
Cache-Aside: The Go-To Pattern (Literally)
Let’s implement the cache-aside pattern using go-cache. First, our cache setup:
    import (
        "time"

        "github.com/patrickmn/go-cache"
    )

    func NewCache() *cache.Cache {
        return cache.New(
            5*time.Minute,  // Default expiration for entries
            10*time.Minute, // Interval at which expired entries are purged
        )
    }
Now let’s create a middleware that makes caching automatic:
    // Requires net/http, net/http/httptest, and encoding/json.
    func WithCaching(next http.HandlerFunc, c *cache.Cache) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            key := r.URL.Path

            // Cache hit: serve the stored response directly
            if data, found := c.Get(key); found {
                w.Header().Set("X-Cache", "HIT")
                json.NewEncoder(w).Encode(data)
                return
            }

            // Cache miss: capture the downstream response
            rec := httptest.NewRecorder()
            next(rec, r)

            // Store only successful responses that are well-formed JSON
            if rec.Code == http.StatusOK {
                var result interface{}
                if err := json.Unmarshal(rec.Body.Bytes(), &result); err == nil {
                    c.Set(key, result, cache.DefaultExpiration)
                }
            }

            // Copy the captured response to the actual writer
            for k, v := range rec.Header() {
                w.Header()[k] = v
            }
            w.Header().Set("X-Cache", "MISS")
            w.WriteHeader(rec.Code)
            w.Write(rec.Body.Bytes())
        }
    }
This middleware acts like a bouncer - checking IDs (cache keys) before letting anyone bother the database. The `X-Cache` header helps debug whether we’re hitting the cache or the database.
Read-Through: The Invisible Layer
For read-through caching, we need something a bit more sophisticated. Here’s a database wrapper that automatically caches:
    type CachedUserRepository struct {
        db    *sql.DB
        cache *cache.Cache
    }

    func (r *CachedUserRepository) GetUser(id string) (*User, error) {
        key := fmt.Sprintf("user:%s", id)

        // Cache lookup
        if data, found := r.cache.Get(key); found {
            return data.(*User), nil
        }

        // Database query - QueryRow + Scan, since db.Query returns
        // *sql.Rows, not a User (columns illustrative; match your struct)
        user := &User{}
        err := r.db.QueryRow("SELECT id, name, email FROM users WHERE id = ?", id).
            Scan(&user.ID, &user.Name, &user.Email)
        if err != nil {
            return nil, err
        }

        // Cache storage with expiration
        r.cache.Set(key, user, cache.DefaultExpiration)
        return user, nil
    }
As shown in the YouTube tutorial, this pattern reduces database load significantly once the cache warms up. Just remember - with great caching comes great responsibility to invalidate properly!
Cache Invalidation: The Final Boss
Ah yes, the second-hardest problem in computer science (after naming things). Here’s how we handle it:
    func (r *CachedUserRepository) UpdateUser(user User) error {
        // Update database first
        _, err := r.db.Exec("UPDATE users SET ... WHERE id = ?", user.ID)
        if err != nil {
            return err
        }

        // Invalidate cache so the next read repopulates it
        key := fmt.Sprintf("user:%s", user.ID)
        r.cache.Delete(key)
        return nil
    }
Benchmarks: Because Numbers Don’t Lie
Here’s what you might see with proper caching implemented:
| Scenario | Requests/sec | Latency (ms) | Database Load |
|---|---|---|---|
| No Cache | 1,200 | 85 | 100% |
| Cache-Aside | 8,700 | 12 | 15% |
| Read-Through | 9,400 | 9 | 5% |
These numbers aren’t just pretty - they’re your ticket to better scaling and happier users.
Pro Tips from the Cache-weary
- TTL Tango: Set timeouts like you’re cooking pasta - al dente. Too short and you lose benefits, too long and you get stale data. 5-15 minutes is a good starting point.
- Key Strategy: Use consistent key formats like `user:{id}`. It’s like labeling your leftovers - you want to find them later!
- Circuit Breakers: Add cache health checks. If Redis goes down, your app should fail gracefully, not faceplant.
- Monitoring: Track cache hit ratio like your app’s batting average. Below 80%? Time to rethink your strategy.

Remember folks, caching is like garlic - essential in proper amounts, disastrous in excess. Implement wisely, invalidate diligently, and may your p99 latencies stay low! 🚀
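That hit-ratio tip is easy to instrument. Here’s a minimal stdlib-only sketch with a hypothetical cacheStats helper (atomic counters, so it’s safe to call from concurrent handlers); bump Hit/Miss right next to the cache.Get calls shown earlier:

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// cacheStats tracks hits and misses; a hypothetical helper, not part of go-cache.
type cacheStats struct {
	hits, misses atomic.Int64
}

func (s *cacheStats) Hit()  { s.hits.Add(1) }
func (s *cacheStats) Miss() { s.misses.Add(1) }

// Ratio returns the hit ratio in [0, 1], or 0 if nothing has been recorded.
func (s *cacheStats) Ratio() float64 {
	h, m := s.hits.Load(), s.misses.Load()
	if h+m == 0 {
		return 0
	}
	return float64(h) / float64(h+m)
}

func main() {
	var s cacheStats
	for i := 0; i < 90; i++ {
		s.Hit()
	}
	for i := 0; i < 10; i++ {
		s.Miss()
	}
	fmt.Printf("hit ratio: %.0f%%\n", s.Ratio()*100) // hit ratio: 90%
}
```

Export Ratio() to your metrics system of choice and alert when it dips below your threshold.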