Littlepot: Autofilled Cache

Cook, little pot, cook!

-- The Magic Porridge Pot, Brothers Grimm

I've just released littlepot, a tiny library devoted to transforming batched requests into single-element requests.

The Problem

Imagine you develop a service which returns one random image and some witty text below it. As an image provider, you have chosen http://image.provider.com, because they have a free limited API. Their endpoint http://image.provider.com/api/random is exactly what you need.

You wrote a function to get batched data.

(defn get-batch []
  (-> api-endpoint   ; e.g. "http://image.provider.com/api/random"
      http/get
      json/parse))

But your service needs to return one image at a time (get-image).

Wasting Requests

You are lazy enough, so you just return a random image from a fresh batch.

(defn get-image []
  (rand-nth (get-batch)))

Simple enough, until you encounter a RateLimitException: the folks at image.provider.com are greedy and allow only 100 requests per hour, yet you utilize just 2% of the images you fetch.
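A quick back-of-the-envelope makes the waste concrete. The batch size below is an assumption (chosen to be consistent with the 2% figure in the text), not something the API guarantees:

```clojure
(def batch-size 50)                  ; assumed images per batch (2% = 1/50)
(def utilization (/ 1 batch-size))   ; one image used per fetched batch => 1/50
(def potential (* 100 batch-size))   ; 100 requests/hour at full utilization
;; potential => 5000 images/hour, versus the 100 you actually serve
```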

Storage

Obvious solution: save the images somewhere and return them from there.

Congratulations, you discovered state, your life will never be the same.

Assume you introduced an atom over a list to save the data from the batch. Now every time you retrieve an image, you need to remove it from the cached batch.

(def batch (atom (get-batch))) ; cache the whole batch in an atom

(defn get-image []
  (let [element (first @batch)]
    (swap! batch rest)
    element))
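Note that reading `first` and then `swap!`ing `rest` as two separate steps can race under concurrent consumers: two threads may read the same head. A self-contained sketch of an atomic pop with sample data, assuming Clojure 1.9+ for `swap-vals!`:

```clojure
(def batch (atom (list "img1" "img2" "img3"))) ; hypothetical cached batch

(defn get-image []
  ;; swap-vals! atomically replaces the value and returns [old new],
  ;; so (first old) is exactly the element this caller removed
  (let [[old _new] (swap-vals! batch rest)]
    (first old)))

(get-image) ;=> "img1"
(get-image) ;=> "img2"
```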

Refill

Good enough, but the cache is not infinite; you need to refill it when the data runs out.

(defn get-image []
  (when (empty? @batch)
    (fill batch))          ; fill blocks until a new batch arrives
  (let [element (first @batch)]
    (swap! batch rest)
    element))

All clients are blocked until the cache is filled, and they see a latency spike.

What if we could send the request in the background, allowing clients to still retrieve data?
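One way to sketch that idea with plain Clojure primitives (the names `fill-async!` and `filling?` are made up for illustration, and `get-batch` is stubbed out; this is not littlepot's implementation):

```clojure
(defn get-batch []                    ; stub standing in for the real HTTP call
  ["imgA" "imgB" "imgC"])

(def batch (atom []))                 ; cached elements
(def filling? (atom false))           ; is a refill already in flight?

(defn fill-async! []
  ;; compare-and-set! lets exactly one caller claim the refill slot,
  ;; so at most one background request runs at a time
  (when (compare-and-set! filling? false true)
    (future
      (try
        (swap! batch concat (get-batch))
        (finally (reset! filling? false))))))

(defn get-image []
  (when (empty? @batch)
    (fill-async!))                    ; kick off a refill, but don't wait for it
  (let [[old _new] (swap-vals! batch rest)]
    (or (first old) :no-data)))       ; non-blocking: nothing cached yet => :no-data
```

A client hitting an empty cache gets `:no-data` immediately instead of blocking, and the cache fills behind the scenes.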

Littlepot

Meet littlepot, a solution to some of these problems.

Storage. It is backed by clojure.lang.PersistentQueue, Clojure's persistent queue implementation, so you don't need to care about efficient storage.

Autofill. It sends a request for the next batch in the background when your cached data is close to exhaustion, so the cache fills automatically and silently.

Non-blocking. You do not need to wait until data appears in the cache: if something is there, it is returned; if not, :no-data is returned.

Composable. Having a function that retrieves a single element (get-one), you can easily get fifty elements by calling (take 50 (repeatedly get-one)).

Concurrency. It encapsulates the whole state in a ref and uses STM, so multiple consumers are allowed. It is also guaranteed that at most one batch request will be in progress.
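To illustrate how STM can provide the at-most-one-batch guarantee, here is a minimal sketch with a ref holding a queue and an in-progress flag (the names are illustrative, not littlepot's actual internals):

```clojure
(def pot (ref {:queue clojure.lang.PersistentQueue/EMPTY
               :in-progress? false}))

(defn claim-fill!
  "Returns true for exactly one caller when no batch is in flight."
  []
  (dosync
    (if (:in-progress? @pot)
      false
      (do (alter pot assoc :in-progress? true)
          true))))

(defn finish-fill!
  "Appends the fetched items to the queue and releases the flag."
  [items]
  (dosync
    (alter pot #(-> %
                    (update :queue into items)
                    (assoc :in-progress? false)))))
```

Because both the flag check and the flag set happen inside one `dosync` transaction, two concurrent callers cannot both see `:in-progress? false`, so only one batch request is ever launched.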

The Getting Started Guide and some Advanced Usages can be found on GitHub.

mishadoff 04 October 2015