Simple question. What is your recommended PHP memory limit for Magento?
Recommended PHP Memory Limit for Magento
magento-1.7, magento-1.8, magento-enterprise
Related Solutions
Memory errors can be tricky to track down, and are almost always system specific (that's a fancy way of saying "I don't have a specific solution for you, but here are some debugging tips").
Memory problems usually happen for one of two reasons:
- Repeated loading of something in a loop without clearing memory from the previous iteration (see the sketch after this list)
- Loading something too large
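For the first case, a minimal illustrative sketch (the ID list and the processProduct() helper are hypothetical): loading a full Magento model on every pass without releasing it accumulates objects until the limit is exhausted, while an unset() between iterations lets PHP reclaim each one.

// Hypothetical loop over product IDs; without the unset(), every loaded
// product object stays in memory for the life of the loop.
foreach ($productIds as $productId) {
    $product = Mage::getModel('catalog/product')->load($productId);
    processProduct($product); // hypothetical per-product work
    unset($product);          // release the object before the next load()
}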
Based on your error message:
Allowed memory size of 536870912 bytes exhausted (tried to allocate 501481472 bytes).
It sounds like you have PHP's memory limit set to 536870912 bytes (512MB), and PHP ran into a problem when it attempted to allocate 501481472 bytes. That is, in a single operation, PHP attempted to set aside around 478MB of memory to hold the image helper object you attempted to put in a variable (resize() returns $this, i.e. the image helper object itself):

$variable = $this->helper('catalog/image')->init($product, 'small_image')->resize(135);
My guess is your base images are huge files, and that one of the objects used by the image helper had that image loaded in memory. If you cast the helper to a string instead (which, for an image helper, returns the image's URL; see its __toString method):

$variable = (string)$this->helper('catalog/image')->init($product, 'small_image')->resize(135);

then PHP only needs to keep the URL string around, not the helper object and its image data.
That, however, is just a guess. I'd up the memory limit temporarily to 1024MB to see if that clears up the problem. If it doesn't, that means you have a runaway loop loading too many things. If it does clear the problem up, take steps to raise PHP's memory limit only when this operation needs to run.
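One way to scope the higher limit, as a minimal sketch (assuming the image-heavy work runs in its own shell or cron script rather than a normal page request): raise the limit at the top of just that script instead of globally in php.ini.

// Hypothetical standalone script: only this process gets the higher limit.
ini_set('memory_limit', '1024M');

// ... run the image-heavy work here ...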
We run on multiple c1.medium instances that are auto-scaled, with a db.m1.small RDS instance, a cache.t1.micro ElastiCache server for full page caching, and a CloudFront CDN served from two S3 sources (one for media, one for skin/js, for more concurrent downloads per session). We use S3FS for the EC2 instances to connect to the S3 buckets. In addition, we use Lightspeed (now called Warp) full page caching, some custom block caching, and Fooman Speedster Advanced.
We recently made some changes and found that with the S3 setup you need to set up some symbolic links to speed up PHP's file_exists checks when a file is not present in the skin directory. With that set up, we are getting page load times under 1 second (cached) and 2-3 seconds not cached.
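One plausible arrangement, as a sketch only (the paths are made up, and this is not necessarily the exact layout described above): keep the hot skin tree on local disk and symlink it into place, so file_exists() checks resolve against the local filesystem instead of making a network round-trip through the S3FS mount.

# Keep skin on local disk; symlink it where the docroot expects it.
mv /var/www/html/skin /var/www/local-skin
ln -s /var/www/local-skin /var/www/html/skin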
I like the c1.medium because of the cost (we usually run spot instances at <$0.07/hr...unless it's Christmas time). Also, being compute optimized helps with the speed of processing all of the requests. With memory levels set correctly in php.ini, we haven't run into memory problems on that instance.
You do more traffic than we do, but caching is still the key. If you want to spend a little more, the current gen c3.large instances look enticing. They offer good computing power, decent memory, and SSD storage!
Be sure to set up any file system caching (like the cache for S3FS) on a local ephemeral drive to speed disk reads (and save money). With the setup I described, the disk read shouldn't be the bottleneck because most data is coming from ElastiCache for the full page and CloudFront/S3 for media.
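As an illustration of that (the bucket name and paths are made up), s3fs-fuse's use_cache option is one way to point its local file cache at the ephemeral drive:

# Mount the media bucket with its local cache on ephemeral storage.
s3fs my-media-bucket /var/www/media -o use_cache=/mnt/ephemeral/s3fs-cache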
I also highly recommend using a profiler to find and reduce bottlenecks. That is how we found the file_exists bottleneck and, with the symlinks, lowered non-cached page load times from 6 to 2 seconds.
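If you don't have an external profiler handy, Magento 1 ships with Varien_Profiler; here is a minimal sketch of using it (the timer name is made up, and the profiler output also has to be enabled under System > Configuration > Advanced > Developer):

// In index.php, before Mage::run(), enable the profiler...
Varien_Profiler::enable();

// ...then wrap any suspect section in a named timer:
Varien_Profiler::start('suspect_block');
// ... code under investigation ...
Varien_Profiler::stop('suspect_block');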
To answer the other part of your question...I'm not sure the VPC would help much, except to block bad bots/DNS attempts before they hit your servers. Unless you have a ton of bad traffic, that shouldn't make a huge difference.
Update 3/4/2017:
This reply gets a decent amount of traffic and I've been getting some messages asking for help. Since the original post was over three years ago, I don't remember all the details from back then. Here's my current setup (pretty much scroll through the AWS menu and add one or two of everything):
- c3.large front-end servers running Apache (these work great for Magento and are relatively inexpensive)
- smaller back-end servers for API and admin
- Amazon EFS for shared media storage (just implemented, but good so far)
- CloudFront pointing to web servers for source
- No more full page caching (it wasn't giving enough speed enhancement to make up for the pain of hole-punching and dealing with added dynamic content on pages)
- ElastiCache for caches and sessions
- AWS Elasticsearch Service for searches
- RDS server for database
My current project is implementing scripts to ban IPs that are requesting too many pages or bad URLs (almost all of which originate from Russia and Ukraine). These can be >1000 requests/minute and cause major problems on several levels. I'm looking at using something similar to this fail2ban to Network ACL project. Speaking of malicious requests...ALWAYS move your admin folder to a location other than /admin/ and set up /downloader/.htaccess to deny all but your IP (a sketch follows below). Those are the two main URLs these guys hit, and the server load from them hammering those pages can cripple your setup fast.
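For the /downloader/ lockdown, a minimal sketch of the .htaccess (Apache 2.2-style directives; the IP is a placeholder for your own, and Apache 2.4 would use Require ip instead):

# /downloader/.htaccess
Order deny,allow
Deny from all
Allow from 203.0.113.10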
I feel that this site is not really small, but nowhere near large. Some components should differ depending on catalog size, traffic levels, and the probability of traffic spikes. The nice part is that you can set something like this up for around $100/mo, then scale quickly and easily by watching the loads on the different services and adding capacity where it is needed.
Best Answer
The recommended PHP memory limit for Magento is 512M.
From the System Requirements prior to EE 1.14.1 and CE 1.9.1:
For versions above these, no memory limit is specified in the system requirements, only the PHP versions (PHP 5.4 and 5.5), but there are known issues with the default PHP memory limit of 128MB on these PHP versions.
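As a minimal sketch of setting it (Magento 1's stock .htaccess carries a php_value block along these lines for mod_php setups; for PHP-FPM or CGI you would set memory_limit in php.ini instead):

<IfModule mod_php5.c>
    php_value memory_limit 512M
</IfModule>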