Apache using 1.7GB of Memory for single request
Posted by toynz, 04-26-2011, 03:37 PM |
Hello
I have a family tree site that uses the GEDCOM file format for its data.
The site runs via suPHP, and PHP has a memory limit of 196MB, which it copes with. However, as soon as PHP passes the page to Apache, the Apache child starts grabbing huge amounts of memory before returning the page to the client.
This can be around 1.7GB for a single request, which means it doesn't take many requests in a short period to exhaust memory.
My question is: what could be causing Apache to use so much memory? If PHP can process the request in under 196MB, then surely Apache just needs to take the output from PHP and pass it to the client. The page returned is under 1MB.
I've run out of ideas so I hope someone else out there has some suggestions.
Thanks
Brent
|
Posted by TransNOC, 04-26-2011, 03:42 PM |
You can enable memory limits in cPanel, and if you search Google you can learn how to set up a cron job that runs every 1-2 minutes and restarts Apache if it's using more than the set memory limit.
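A minimal watchdog sketch along those lines (the 1.5GB threshold, the `httpd` service name, and the script path are assumptions; adjust for your setup):

```shell
#!/bin/sh
# Restart Apache if its total resident memory exceeds a threshold.
# LIMIT_KB and the process/service name "httpd" are assumptions.
LIMIT_KB=1572864  # 1.5 GB, in kB

# Sum the RSS (in kB) of all httpd processes; prints 0 if none are running.
TOTAL_KB=$(ps -C httpd -o rss= | awk '{sum += $1} END {print sum + 0}')

if [ "$TOTAL_KB" -gt "$LIMIT_KB" ]; then
    /sbin/service httpd restart
fi
```

Run it from cron, e.g. `*/2 * * * * /usr/local/bin/httpd-watchdog.sh`.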
|
Posted by toynz, 04-26-2011, 03:47 PM |
This is on a Plesk server. And unfortunately RLimitMEM in Apache only limits memory used by processes forked from the children, and doesn't affect the amount of memory a child itself can use.
Yes, I could set up a cron job. That introduces downtime, so it could only be a temporary workaround; I was hoping to find the cause so that we can implement a proper resolution.
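For reference, RLimitMEM takes a soft and an optional hard limit in bytes, and as noted it governs only processes forked by the child (CGI, suPHP interpreters), not the httpd child itself (values below are illustrative only):

```apache
# Soft limit 200 MB, hard limit 256 MB, for forked processes only.
# The httpd child that buffers and serves the response is NOT covered.
RLimitMEM 209715200 268435456
```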
|
Posted by superiorhost, 04-26-2011, 03:50 PM |
Hi,
Another thing to look at is the code on the page itself. Look for full absolute links back to the site, rather than relative links to items on the page.
We had a customer with an issue like that. It would just take off on memory and eventually bog down or kill Apache. We noticed there were tons of instances of that page trying to load, which was really the page looping back on itself, and it kept growing and growing.
That error was fixed by changing the links to images and other objects to relative links.
It may not be your issue, but I thought it might be of some use if you haven't looked into it.
Tim L
|
Posted by toynz, 04-26-2011, 04:02 PM |
Thanks for the suggestion. I just checked and all the links are relative, but it was worth checking. Why would that cause Apache to bog down, though?
There is one very large line in the output, though: the HTML has not been split onto multiple lines, so there is one line of potentially 600,000 characters. I will do some tests splitting it onto multiple lines and see if that helps.
|
Posted by Jedito, 04-26-2011, 04:06 PM |
Are you using Apache 1, 2, or 2.2.x?
|
Posted by toynz, 04-26-2011, 04:08 PM |
Sorry I should have mentioned that. Apache 2.2.3, PHP 5.2.14
|
Posted by Jedito, 04-26-2011, 04:13 PM |
Are you using the worker MPM or the prefork MPM?
Is it a VPS or a dedicated server?
|
Posted by toynz, 04-26-2011, 05:02 PM |
I have split the long lines in the HTML and it has helped (with browser load speed too), but not enough.
A 3.8MB HTML response still causes Apache to use 1.3GB of memory.
|
Posted by toynz, 04-26-2011, 05:13 PM |
Using Prefork, and it is a dedicated server running CentOS.
|
Posted by speckl, 04-26-2011, 05:19 PM |
I don't see this as an Apache issue; it sounds like the code to me.
Holding a variable that size is not even close to being smart. Just because the returned page is small doesn't mean anything.
What is the file doing with that variable?
|
Posted by toynz, 04-26-2011, 05:25 PM |
It is not a variable; it is data that is echoed.
I would understand if it were PHP that was using the memory, but since it runs under suPHP, PHP uses its memory outside the Apache child, so I can't see it being a code issue. If it were a code issue, PHP would be using huge amounts of memory, not Apache.
Correct me if I'm wrong, but wouldn't PHP be passing Apache the raw output, which Apache then just sends to the client? What would Apache be doing with it?
|
Posted by Techbrace, 04-27-2011, 02:53 AM |
What's your setting for MaxRequestsPerChild? Try lowering the value and see if it helps. You can also look at the memory usage of the specific process and see how the memory is distributed, to get a fair idea of the consumption.
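For context, MaxRequestsPerChild lives in the prefork section of httpd.conf; a lower value recycles children sooner so a bloated child gives its memory back to the OS (the figures below are illustrative, not recommendations):

```apache
<IfModule prefork.c>
    StartServers          8
    MinSpareServers       5
    MaxSpareServers      20
    MaxClients          150
    # Recycle each child after this many requests so a leaking or
    # bloated child releases its memory sooner (this thread's server
    # is set to 4000).
    MaxRequestsPerChild 500
</IfModule>
```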
|
Posted by toynz, 04-27-2011, 03:34 AM |
MaxRequestsPerChild is 4000. Lowering it will likely free the memory sooner, but it won't stop the memory being used in the first place.
I'm not sure I understand what you mean by checking the memory usage of the specific process?
I check the usage of the process/PID that is handling my request, and that is where I see the huge usage.
|
Posted by Techbrace, 04-27-2011, 04:13 AM |
It might not stop the memory being used, but you can keep the memory leak in check by doing so. If you still don't agree, maybe you should learn more about memory mapping and Apache's handling of memory.
Unfortunately, what you have found so far is not enough. And don't rely on the memory usage shown by ps, because ps doesn't exclude the shared libraries. Finding out that one process is taking more RAM is good (but it's a rather easy task); knowing why it's taking so much memory is what matters, and that is where you will spend your time.
So how would you go about finding the memory consumption of a specific process?
Run
cat /proc/<pid>/smaps
and that will give you some idea of the memory consumption of each library and file.
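For example, to total the private (non-shared) memory of one child, you can sum the Private_Dirty fields across all of its mappings (PID=self below is a placeholder; substitute the busy httpd child's PID):

```shell
#!/bin/sh
# Sum Private_Dirty across all mappings of a process to estimate how
# much memory is truly its own, with shared libraries excluded.
# PID=self is a placeholder; point it at the httpd child in question.
PID=self
awk '/^Private_Dirty:/ {sum += $2} END {print sum " kB private dirty"}' \
    /proc/$PID/smaps
```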
|
Posted by toynz, 04-27-2011, 04:44 AM |
Thanks for that information. I ran that whilst a process was in progress, and here is the only large item in the output:
0920c000-29198000 rwxp 0920c000 00:00 0 [heap]
Size: 523824 kB
Rss: 281696 kB
Shared_Clean: 0 kB
Shared_Dirty: 23516 kB
Private_Clean: 0 kB
Private_Dirty: 258180 kB
Swap: 116 kB
Pss: 259331 kB
Sorry, I don't understand how to investigate further from here.
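One rough next step (a sketch only; the PID argument and sample count are placeholders) is to sample the child's resident set while the request runs, to see at what point during the request the memory is actually allocated:

```shell
#!/bin/sh
# Sample a process's resident set once per second, to see when during
# the request the memory actually gets allocated.
# PID and SAMPLES are placeholders; point PID at the busy httpd child.
PID=${1:-self}
SAMPLES=${2:-3}

i=0
while [ $i -lt $SAMPLES ]; do
    grep VmRSS /proc/$PID/status
    sleep 1
    i=$((i + 1))
done
```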
|