
XML-RPC and setGzipRequesting

 
Roelof de Vries
Greenhorn
Posts: 2
Today I found out that there is apparently a limit on the block size that can be transported by XML-RPC for zipped data. An application that had been running fine for one and a half years suddenly crashed. This Java client application talks to a PHP layer, linked to a MySQL database, on a server by means of XML-RPC. A while ago the amount of data in the MySQL database was growing and the time to fetch it was increasing. Since it is possible to request server-side zipping of the response, that is what I added at the time (see the piece of code below).

// Apache XML-RPC 3.x client setup
import org.apache.xmlrpc.client.XmlRpcClient;
import org.apache.xmlrpc.client.XmlRpcClientConfigImpl;

XmlRpcClient server = new XmlRpcClient();
XmlRpcClientConfigImpl server_config = new XmlRpcClientConfigImpl();
java.net.URL s_url = new java.net.URL("http://www.someserver.com/somefile.php");
server_config.setServerURL(s_url);
// Ask the server to gzip its response
server_config.setGzipRequesting(true);
server.setConfig(server_config);

When the amount of data becomes too large, the request to the server ends in an exception. However, changing the argument to setGzipRequesting from true to false made the application work again!

I have two questions regarding this:
1) Setting the parameter of setGzipRequesting to false means the amount of data that needs to be transported increases significantly. Why doesn't it crash then, as it does when the data is zipped?
2) Is there a way to increase the amount of zipped data that can be transported by means of XML-RPC? (I am using XML-RPC for PHP, written by Edd Dumbill et al., on the server side.)

Roelof

 
Roelof de Vries
Greenhorn
Posts: 2
OK, I found the solution to the problem and just want to share it: it is not the amount of data that needs to be transported, it is the memory_limit on the PHP server. When the response needs to be zipped, this apparently takes up extra memory, and beyond a certain amount of data it exceeds the maximum allowed memory. Setting the zip parameter to false makes the response fit into memory again, and the system works correctly... for a short time. As the amount of data grows over time, it eventually fails again because the response no longer fits into memory (I tested this and it is indeed the case).
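The effect is easy to see in plain Java (a rough analogy for what the PHP layer does, assuming it builds the whole gzipped response in memory before sending it): while compressing, the original payload and the compressed copy exist side by side, so peak memory use goes up even though the transported data shrinks.

```java
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.zip.GZIPOutputStream;

public class GzipMemoryDemo {
    public static void main(String[] args) throws Exception {
        // A large, repetitive payload, standing in for the XML-RPC response.
        byte[] payload = new byte[10 * 1024 * 1024];
        Arrays.fill(payload, (byte) 'x');

        // Gzipping into a buffer: the compressed copy is allocated
        // *in addition to* the original payload, raising peak memory use.
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
            gzip.write(payload);
        }
        byte[] compressed = buffer.toByteArray();

        System.out.println("original: " + payload.length + " bytes");
        System.out.println("compressed: " + compressed.length + " bytes");
    }
}
```

The transported size drops dramatically, but only after both buffers were live at the same time, which is exactly the situation a fixed memory_limit punishes.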

I have now increased the memory_limit parameter in the .htaccess file and everything works smoothly again.
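For reference, a minimal sketch of such an .htaccess override (this assumes the host runs PHP as an Apache module where php_value directives are honored; under CGI/FastCGI the setting belongs in php.ini instead):

```apache
# Raise PHP's per-request memory ceiling (mod_php only)
php_value memory_limit 60M
```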

Just something I noticed: my hosting provider is getting rid of PHP 4 and moving to PHP 5. While the application worked with memory_limit set to 40M under PHP 4, that was not enough under PHP 5: I had to set memory_limit to 60M to make the same code usable. Apparently PHP 5 is a bit more memory hungry.

OK, it is clear I have to rewrite the application to avoid these memory problems in the first place.
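One way to do that rewrite (a sketch only, not from the real application: the getRecords method name and its offset/count parameters are hypothetical, and the remote call is stubbed with a local list so the example runs) is to page through the data in fixed-size chunks, so neither the PHP side nor the client ever holds the full result set at once:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedFetchDemo {
    // Stand-in for the XML-RPC round trip, e.g. something like
    //   server.execute("getRecords", new Object[] {offset, count});
    // Here it just slices a local list so the sketch is runnable.
    static List<String> allRecords = new ArrayList<>();

    static List<String> fetchChunk(int offset, int count) {
        int end = Math.min(offset + count, allRecords.size());
        if (offset >= end) return new ArrayList<>();
        return new ArrayList<>(allRecords.subList(offset, end));
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) allRecords.add("row-" + i);

        int chunkSize = 3; // small enough to stay well under memory_limit
        int offset = 0;
        List<String> result = new ArrayList<>();
        while (true) {
            List<String> chunk = fetchChunk(offset, chunkSize);
            if (chunk.isEmpty()) break; // server has no more rows
            result.addAll(chunk);       // or process/stream instead of accumulating
            offset += chunk.size();
        }
        System.out.println("fetched " + result.size() + " records in chunks of " + chunkSize);
    }
}
```

On the PHP side this maps naturally onto a LIMIT/OFFSET query against MySQL, and each individual response stays small enough that gzipping it no longer threatens memory_limit.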

Roelof
 