I want to know the conceptual difference between a warm cache and a cold cache, and how to use both techniques effectively with Varnish and a Redis cache.
Any help, knowledge and experience sharing would be appreciated.
TL;DR - On MageStack we use Varnish, Redis (cache), Redis (sessions) and eAccelerator/Zend OPcache (depending on PHP version).
You've already got most of it understood.
The cache backend, session store, opcode cache, full page cache (FPC) and reverse proxy cache are all completely different.
You can use different technologies for each, and you can use them ALL simultaneously (including Varnish and an FPC).
You can only use one cache backend.
Contrary to popular belief, using a memory-based cache will not improve performance, but it will overcome some fatal flaws in Magento's default file-based caching.
As of writing this message, Redis is my recommendation.
You can only use one session store.
Contrary to popular belief, using a memory-based session store will not improve performance.
As of writing this message, Redis is my recommendation.
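To make the two recommendations above concrete, this is roughly how you would point both the cache backend and the session store at Redis on Magento 2 (the `setup:config:set` options shown exist in Magento 2.3+; the host and database numbers are assumptions for a single local Redis instance, so adjust them for your environment):

```shell
# Sketch only: configure Redis as Magento 2's cache backend and session
# store from the CLI (run from the Magento root). Separate DB numbers
# mean a cache flush cannot wipe live sessions.
redis_host="127.0.0.1"   # assumed local Redis instance
cache_db=0               # assumed DB for the cache backend
session_db=2             # assumed DB for sessions

bin/magento setup:config:set \
    --cache-backend=redis \
    --cache-backend-redis-server="$redis_host" \
    --cache-backend-redis-db="$cache_db" || true

bin/magento setup:config:set \
    --session-save=redis \
    --session-save-redis-host="$redis_host" \
    --session-save-redis-db="$session_db" || true
```

Both commands write to `app/etc/env.php`, so the same result can be achieved by editing that file directly.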
You can actually install multiple opcode caches, but it's not recommended, nor would I expect to see any gains.
My recommendations are in the brackets above.
No module needs to be installed to leverage this.
You can use multiple reverse proxies, and whilst doing so is complex and prone to cache elongation, it can have merits (i.e. to prevent stampeding during a cache flush). Use one when necessary (i.e. not to speed up a slow site, but to reduce resource usage on a fast site).
To leverage a reverse proxy, it needs both enabling server-side and a module for Magento. The reason for the module is to help control caching logic (i.e. to tell the cache what it should and shouldn't cache) and also to manage cache contents (i.e. to trigger purges of the cache).
I don't recommend one unless you have a total understanding of what you are doing. Badly set-up reverse proxies can break header information, cause session loss or session sharing, serve stale content, apply additional limits to load time/buffers, consume additional resources, etc.
Use one when necessary (i.e. not to speed up a slow site, but to reduce resource usage on a fast site).
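For example, the purge side of such a module ultimately boils down to a request like the following to Varnish. This is a sketch: the listen address and path are assumed values, and your VCL must explicitly allow the PURGE method (typically from trusted IPs in `vcl_recv`) or Varnish will refuse it:

```shell
# Assumed Varnish listen address and an example path -- substitute your own.
varnish="127.0.0.1:6081"
path="/some-category/some-product.html"

# Evict the cached object; "|| true" keeps the sketch from aborting when
# no Varnish instance is actually listening.
curl -s -o /dev/null -X PURGE "http://${varnish}${path}" || true
```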
Contrary to popular belief, you can (and should) use an FPC in conjunction with a reverse proxy cache. The two solve different problems and have different capabilities.
FPCs can leverage more intelligence, because they have direct access to the user's session and Magento's core, whereas a reverse proxy is not application-aware (it's fairly dumb in the way it works) - so the two complement each other rather than compete with each other.
I.e. don't think Varnish or FPC; think Varnish and FPC.
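One way to observe the warm/cold distinction from the outside is the Age response header Varnish adds: 0 usually means the object was just fetched cold from the backend, while anything greater means it was served warm from cache. A small sketch (the live URL in the comment is a placeholder):

```shell
# Classify a response as warm or cold from its Age header value.
classify_age() {
    if [ "${1:-0}" -gt 0 ]; then
        echo "warm"
    else
        echo "cold"
    fi
}

classify_age 0      # prints "cold" -- just fetched from the backend
classify_age 120    # prints "warm" -- cached for the last two minutes

# Live usage (needs a reachable site behind Varnish):
#   curl -sI http://example.com/ | awk 'tolower($1)=="age:" {print $2}'
```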
I came across this recently and thought it would be helpful to you.
A crawler is really the only way to go about it, as I mentioned in my comment on your question; here's a pre-baked one from the Magento Turpentine plugin group.
https://github.com/nexcess/magento-turpentine/blob/master/util/warm-cache.sh
It gets the URLs from the sitemap and crawls them.
And, in case the GitHub page ever goes AWOL, here is the script in full:
#!/bin/bash
# Nexcess.net Turpentine Extension for Magento
# Copyright (C) 2012 Nexcess.net L.L.C.
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
SITEMAP_URL="$1"
TMP_URL_FILE="/tmp/urls_$(cat /proc/sys/kernel/random/uuid).txt"
PROCS="${PROCS-$(grep processor /proc/cpuinfo | wc -l)}"

# Some xpath builds take the expression directly, others need the -e flag;
# probe once and remember which form works
echo '<root/>' | xpath -e '*' &>/dev/null
if [ $? -eq 2 ]; then
    XPATH_BIN='xpath'
else
    XPATH_BIN='xpath -e'
fi

if [ -z "$SITEMAP_URL" ]; then
    cat <<EOF
Usage: $0 <sitemap URL>
Warm Magento's cache by visiting the URLs in Magento's sitemap
Example:
    $0 http://example.com/magento/sitemap.xml
EOF
    exit 1
fi

echo "Getting URLs from sitemap..."
curl -ks "$SITEMAP_URL" | \
    $XPATH_BIN '/urlset/url/loc/text()' 2>/dev/null | \
    sed -r 's~http(s)?:~\nhttp\1:~g' | \
    grep -vE '^\s*$' > "$TMP_URL_FILE"

echo "Warming $(cat $TMP_URL_FILE | wc -l) URLs using $PROCS processes..."
# First pass: plain responses
cat "$TMP_URL_FILE" | \
    xargs -P "$PROCS" -r -n 1 -- \
    siege -b -v -c 1 -r once 2>/dev/null | \
    sed -r 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g' | \
    grep -E '^HTTP'
# Second pass: gzip-encoded responses, which may be cached separately
cat "$TMP_URL_FILE" | \
    xargs -P "$PROCS" -r -n 1 -- \
    siege -H 'Accept-Encoding: gzip' -b -v -c 1 -r once 2>/dev/null | \
    sed -r 's/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[m|K]//g' | \
    grep -E '^HTTP'

rm -f "$TMP_URL_FILE"
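If siege or the perl xpath utility isn't available, the same idea can be sketched with nothing but curl and standard text tools. This variant assumes the sitemap uses the standard `<urlset><url><loc>` layout; the sitemap URL is a placeholder:

```shell
# Minimal curl-only cache warmer: extract <loc> URLs from a sitemap and
# request each one, four at a time.
sitemap_url="http://example.com/sitemap.xml"   # placeholder URL

# Pull every <loc>...</loc> match onto its own line, then strip the tags.
extract_locs() {
    grep -oE '<loc>[^<]+</loc>' | sed 's~<loc>~~;s~</loc>~~'
}

curl -ks "$sitemap_url" \
    | extract_locs \
    | xargs -P 4 -r -n 1 curl -s -o /dev/null -w '%{http_code} %{url_effective}\n' \
    || true
```

Unlike the Turpentine script, this makes only one pass, so it does not separately warm gzip-encoded variants.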
Best Answer
After waiting on this question, I gathered together some of the concepts behind warm and cold caches.
By caching copies of image files, CSS, and HTML documents, the origin server does not have to generate these files each time a new visitor comes to the website. This both improves page load time and decreases stress on the origin server, meaning a website can serve more visitors at once.
Because modern websites are constantly being updated - whether it’s a media site updating the articles on their homepage or an ecommerce site updating inventory of a certain product - files are set to expire after a set period of time, which may be a minute or an hour. Each time a file in the cache expires, it needs to be re-collected from the origin server.
The first visitor to a website after a cache is initially set up, or after it expires, will hit an empty or cold cache and experience a cache miss. The cache will visit the origin server to retrieve the file, deliver it to the visitor, and keep the file in the cache, so that it is then a full or warm cache. Each subsequent user that visits before the cache expires again will be served from cache - a "cache hit" - for all files that have been stored.

Warming a Varnish cache is a technique designed to shield users from this inconvenience by making those necessary but slow cache-refreshing requests yourself. You make a series of requests to your server for cacheable assets, and you get the slow responses needed to refresh the cache instead of your users.

Redis offers optional and tunable data persistence, designed to bootstrap the cache after a planned shutdown or an unplanned failure. While we tend to regard the data in caches as volatile and transient, persisting data to disk can be quite valuable in caching scenarios. Having the cache's data available for loading immediately after a restart allows for a much shorter cache warm-up and removes the load involved in repopulating and recalculating cache contents from the primary data store.

I hope this will help others.
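The persistence being described is controlled by a couple of standard redis.conf directives; a fragment like the following (the values are illustrative, not prescriptive) keeps the cache's data on disk so Redis can reload it warm after a restart:

```
# redis.conf fragment -- illustrative persistence settings for a cache
appendonly yes          # enable the append-only file (AOF)
appendfsync everysec    # fsync the AOF roughly once per second
save 900 1              # also snapshot (RDB) if >=1 key changed in 15 min
```

On restart Redis reloads from these files, so the application-level warming described above starts from a populated cache rather than an empty one.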