Do you want to optimize the performance of a busy server? Then a properly configured cache makes a big difference. Why? Because good caching saves a great deal of computing power: when the data don’t change, there is no need to repeat complex (read: time-consuming) calculations or queries.
Caching happens on various levels and layers, which makes it one of the most complex parts of an application; it is often a combination of different solutions.
With the right caching strategy, your application can get a powerful boost. But you still need to find and implement that strategy. Nucleus knows its way around the pitfalls.
Do you want to get started with caching? At Nucleus, we’d like to help you answer the following questions:
- Is the cache the same for everyone?
- When should a cache be disabled?
- How long can certain content be cached?
- What happens to users who are logged in?
While we look for the right caching strategy for your company, we weigh the trade-off between complexity and performance, because different solutions emphasize different things.
Almost all databases have an internal query cache. With a Relational Database Management System (RDBMS), the result of a SELECT query can be cached without a multitude of additional settings, so an identical query later returns a faster result.
As a developer, you do not need to take cache invalidation into account: the database engine handles it for you. When the data change, the query is simply executed again. This is the easiest cache to activate; it is merely a matter of settings in the database engine, and nothing has to change in the code.
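As an illustration, this is roughly what activating the query cache looks like in older MySQL versions (5.7 and earlier; the built-in query cache was removed in MySQL 8.0). The values are illustrative, not a recommendation:

```sql
-- Enable the built-in query cache (MySQL 5.7 and earlier;
-- the feature was removed in MySQL 8.0). Values are illustrative.
SET GLOBAL query_cache_type = 1;          -- cache eligible SELECT statements
SET GLOBAL query_cache_size = 67108864;   -- 64 MB of cache memory

-- An identical SELECT issued twice is now served from the cache
-- until one of the underlying tables changes.
SELECT name, price FROM products WHERE category = 'books';
```

Note that no application code changes: the engine caches and invalidates results by itself.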
A key/value cache is often the first supplement to a busy database server. A query cache puts additional load on the database server itself, while a separate key/value cache gives you more freedom: you decide what may be cached and what may not.
For example, the results of several queries can be combined into a single key/value entry, so those queries no longer need to be executed at all.
In the past, Memcached was often used for this task, but today it has largely been replaced by its modern successor, Redis. Redis lets you maintain a persistent cache by periodically writing it to disk. If a server needs to restart, a cached version of the data is still available, which helps you avoid cache stampedes: a flood of requests that suddenly swamps the cache and overloads the database layer.
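The pattern described above (combine several expensive queries into one cached entry with a limited lifetime) is usually called cache-aside. A minimal sketch, with a plain dict standing in for a Redis connection and `expensive_query` as a hypothetical placeholder for the slow database work:

```python
import json
import time

# Stand-in for Redis: a dict mapping key -> (expiry timestamp, JSON value).
# With real Redis you would use the client's get()/setex() calls instead.
cache = {}
TTL_SECONDS = 300  # illustrative time-to-live

def expensive_query(user_id):
    # Hypothetical placeholder for several slow queries whose combined
    # result we want to store under a single cache key.
    return {"user_id": user_id, "orders": 12, "total_spent": 199.95}

def get_user_stats(user_id):
    key = f"user:{user_id}:stats"
    entry = cache.get(key)
    if entry is not None and entry[0] > time.time():
        return json.loads(entry[1])          # cache hit: no queries executed
    result = expensive_query(user_id)        # cache miss: run the queries once
    cache[key] = (time.time() + TTL_SECONDS, json.dumps(result))
    return result
```

Every request within the TTL is served from the cache; only the first request after expiry touches the database.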
Full Page Cache
Full page caching places an entire page in the cache and serves it to the next visitor. This is ideally suited for websites visited mainly by anonymous users, such as news sites or popular blogs. As long as users do not log in, they all see identical pages, so there is no need to hit multiple queries or key/value caches.
Varnish is the standard in this regard. We have tons of experience with this tool and also make our Varnish templates available. We always sit down with the developers and system administrators to run through all scenarios and, based on that, implement the right Varnish strategy.
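To give an idea of what such a strategy looks like, here is a minimal, illustrative VCL sketch (not our actual template): logged-in users, identified here by an assumed session cookie name, bypass the cache, while anonymous traffic is cached:

```vcl
vcl 4.0;

backend default {
    .host = "127.0.0.1";   # illustrative application backend
    .port = "8080";
}

sub vcl_recv {
    # Logged-in users (assumed "sessionid" cookie) bypass the cache.
    if (req.http.Cookie ~ "sessionid=") {
        return (pass);
    }
    # Strip remaining cookies so anonymous requests are cacheable.
    unset req.http.Cookie;
}
```

The real decisions (which cookies matter, which paths are cacheable, how long) come out of the scenario walkthrough with your team.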
Edge Side Includes
Edge Side Includes (ESI) is a good addition to full page caching. It is the appropriate solution when almost all pages are identical except for a few crucial blocks or elements, such as a personal greeting or a poll.
ESI lets you keep an entire page in cache while loading predefined blocks separately. ESI fragments can also be assembled within Varnish, but this requires deeper integration with the application. It is the most flexible way to run your servers as smoothly as possible.
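In practice that looks like the sketch below: the page template is cached as a whole, while the greeting block is fetched separately for each request (the URL path is illustrative). Varnish assembles the fragments when ESI processing is enabled in the VCL:

```html
<!-- The surrounding page is cached for a long time; the include below
     is resolved separately on each request (path is illustrative). -->
<body>
  <esi:include src="/blocks/greeting" />
  <div id="article">...cached article content...</div>
</body>
```

Each fragment can then have its own cache lifetime, so a personal greeting stays fresh while the rest of the page remains cached.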
- High-quality solutions
- Valuable, tailor-made advice
- Strong and transparent guarantees
- ISO 27001-certified data security
- Years of experience in the field of hosting
- All of our servers are hosted in Belgium
- An independent, financially healthy and growing company
- Transparent, honest and proactive communication