ShiftOne Java Object Cache
What is this?
What does it mean that the caches are strict?
What caching policies are supported?
What features are supported?
How is a cache obtained?
How does this cache perform?
Why was this implemented?
What is this?
ShiftOne Java Object Cache is a Java library that implements several strict
object caching policies, decorators that add behavior, and a light framework
for configuring them for an application.
What does it mean that the caches are strict?
The caches are strict in that each one enforces two limits in a very predictable way.
Max Size - each cache has a hard limit on the number of elements it will contain. When this
limit is exceeded, the least valuable element is evicted. This happens immediately, on the same thread.
This prevents the cache from growing uncontrollably.
Element Timeout - each cache has a maximum time that its elements are considered valid.
No element will ever be returned that exceeds this time limit. This ensures predictable data freshness.
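The sketch below illustrates what these two rules imply; it is a plain-Java illustration with made-up class and method names, not the library's source code. put() evicts the oldest element immediately, on the calling thread, when the size limit is exceeded, and get() never returns an element older than the timeout.

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of the two strict limits; not JOCache source code.
class StrictFifoCacheSketch {
    private final int maxSize;
    private final long timeoutMs;
    private final Map<String, Entry> entries = new LinkedHashMap<String, Entry>(); // insertion order = FIFO

    StrictFifoCacheSketch(int maxSize, long timeoutMs) {
        this.maxSize = maxSize;
        this.timeoutMs = timeoutMs;
    }

    synchronized void put(String key, Object value) {
        entries.put(key, new Entry(value, System.currentTimeMillis()));
        if (entries.size() > maxSize) {
            // hard size limit: evict immediately, on the same thread
            String oldestKey = entries.keySet().iterator().next();
            entries.remove(oldestKey);
        }
    }

    synchronized Object get(String key) {
        Entry entry = entries.get(key);
        if (entry == null) {
            return null;                                             // miss
        }
        if (System.currentTimeMillis() - entry.created > timeoutMs) {
            entries.remove(key);                                     // expired: never returned
            return null;
        }
        return entry.value;
    }

    private static final class Entry {
        final Object value;
        final long created;
        Entry(Object value, long created) { this.value = value; this.created = created; }
    }
}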
What caching policies are supported?
Currently, First In First Out (fifo), Least Recently Used (lru), and
Least Frequently Used (lfu) caching policies are implemented. These
are referred to as policy caches. They are
responsible for holding onto the data in the cache.
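As a generic illustration of what a replacement policy decides (this is not the library's lru implementation), a minimal LRU map can be built on java.util.LinkedHashMap in access order:

import java.util.LinkedHashMap;
import java.util.Map;

// Generic LRU sketch: the entry touched least recently is evicted first.
class LruMapSketch<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    LruMapSketch(int maxSize) {
        super(16, 0.75f, true);   // accessOrder = true: get() moves an entry to the end
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize;  // evict the least recently used entry when over the limit
    }
}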
What features are supported?
Features are added by wrapping policy caches with "decorators". When
a decorator is configured for a cache,
your application operates on the decorator cache, which delegates to a
policy cache.
Behavior is configured by linking several decorators in front of a policy cache.
Support includes:
Monitoring and instrumentation with JMX
Clustering using JGroups or JMS
Hit/Miss statistics reporting
Memory sensitivity using soft references
Hibernate ORM integration (adaptor)
There are also a number of adaptors that allow a cache
to take on another interface.
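The decorator idea can be sketched as follows. The interface and method names below are simplified illustrations, not the library's actual Cache signatures; a statistics decorator wraps any policy cache, counts hits and misses, and delegates the real work:

// Simplified sketch of the decorator idea; not the library's actual Cache interface.
interface SimpleCache {
    void put(String key, Object value);
    Object get(String key);
}

class StatisticsDecorator implements SimpleCache {
    private final SimpleCache delegate;   // the wrapped policy cache (fifo/lru/lfu)
    private long hits;
    private long misses;

    StatisticsDecorator(SimpleCache delegate) {
        this.delegate = delegate;
    }

    public void put(String key, Object value) {
        delegate.put(key, value);
    }

    public Object get(String key) {
        Object value = delegate.get(key);
        if (value == null) {
            misses++;               // record a miss
        } else {
            hits++;                 // record a hit
        }
        return value;
    }

    public long getHits()   { return hits; }
    public long getMisses() { return misses; }
}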
How is a cache obtained?
This cache implementation strives to abstract the concrete implementation and
configuration of a cache away from the application. For this reason,
caches should be accessed exclusively through the "Cache" interface
(the cache policy classes are not public, so this is required).
Both caches and cache factories may be obtained from a cache configuration.
A cache configuration allows mnemonic names to be associated with
various cache and factory configurations.
If an application absolutely needs to programmatically set the max size and
timeout of a cache, then a cache factory should be obtained from the config
first, and used to create a cache.
CacheConfiguration config = new CacheConfiguration();
CacheFactory factory = config.getCacheFactory("default");
long minutes = 5;  // example: a five minute element timeout
Cache cache = factory.newInstance(
    "news.sports.football",
    1000 * 60 * minutes, 500);  // timeout in milliseconds, then max size
If an application's requirements allow, all properties of a cache may be
externalized to the cache configuration. This approach is far more
powerful (and recommended).
CacheConfiguration config = new CacheConfiguration();
Cache cache = config.createConfiguredCache(
    "news.sports.football");
How does this cache perform?
Performance depends a great deal on configuration, as well as how
the cache is being used (size, hit frequency, etc.).
Benchmarking is a complex task, and it
is extremely difficult (if not impossible) to make a claim that
one cache is faster than another.
That said, the author of this cache (Jeff Drost) is also the author
of JRat, a Java performance profiling
tool, which he has used to tune the performance of the policy cache
implementations (case study anyone?). He believes the implementation
is a good balance of efficiency and features.
The full source distribution of this cache contains a benchmark tool,
as well as several other popular quality cache implementations.
Give it a run, or take a look at some
benchmark results.
Why was this implemented?
JOCache was originally implemented as part of the ExQ project
to support ResultSet caching. It was later split out for use
by other projects. It was designed to cache large expensive
database query results.