In one of our projects, we had a lot of performance issues with the portal. All the data was already loaded in memory, so we were not concerned about database hits but about the huge amount of data being transferred. After careful analysis and observation, we found the following:
1. Too many requests were being made to the server (approximately 110).
2. Opening and closing a connection for every request consumed a lot of time.
3. The files being served (JS, CSS, images) were quite heavy.
We used Firebug, a Firefox add-on, to identify these problems. To improve performance, we implemented three things (a combined filter sketch follows the list):
1. Not open & close connection on every request : For this to happen, we created filters and in before filter we added the following code :
response.setHeader('Connection','keep-alive')2. Caching the response : To cache the response, we added the following code to the before filter :
response.setHeader('CacheControl', "public") response.setDateHeader('Expires', System.currentTimeMillis() +60*1 000)3. Reduce the size of files to be loaded : To do this, we zipped all the files and set the response header again in before filters as given in the following code :
response.addHeader('Accept-Encoding','gzip, deflate')
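Putting these pieces together, here is a minimal sketch of what such a Grails filters class could look like. The class name PerformanceFilters and the one-minute cache window are illustrative assumptions, not the exact code from our project:

// grails-app/conf/PerformanceFilters.groovy (name is an assumption)
class PerformanceFilters {

    def filters = {
        all(controller: '*', action: '*') {
            before = {
                // 1. Reuse the TCP connection across requests
                response.setHeader('Connection', 'keep-alive')

                // 2. Allow the browser to cache the response for one minute
                response.setHeader('Cache-Control', 'public')
                response.setDateHeader('Expires', System.currentTimeMillis() + 60 * 1000)

                // 3. Tell the browser the body being served is gzip-compressed
                response.setHeader('Content-Encoding', 'gzip')
            }
        }
    }
}

Because the filter matches every controller and action, the headers are applied uniformly without touching individual controllers.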
After applying these techniques, the page download time dropped to about one fifth of the original.
Another option worth exploring is the ETag header, which lets the browser skip downloading a resource again if it has not changed since the last response.
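For illustration, here is a minimal sketch of how an ETag check could be added to a before filter. The class name and the fixed version token are assumptions; in practice the token would be derived from the resource's content or version:

// grails-app/conf/EtagFilters.groovy (name is an assumption)
class EtagFilters {

    // Fixed version token; a real application would compute this
    // from the resource content or build version (assumption)
    static final String APP_ETAG = '"v1.0.42"'

    def filters = {
        all(controller: '*', action: '*') {
            before = {
                response.setHeader('ETag', APP_ETAG)
                // If the client already holds this version, send 304 and no body
                if (request.getHeader('If-None-Match') == APP_ETAG) {
                    response.status = 304  // Not Modified
                    return false           // stop further processing
                }
                return true
            }
        }
    }
}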
I hope this helps. Please share your own ideas for improving performance.
Cheers!
Regards,
Amit Jain
amitjain1982@gmail.com