How to Optimize Your Web Application?

Frontend, Backend and Server-Side Web Application Optimization


Every developer has their own way of optimizing projects, whether their own or legacy ones, and I was no exception. Recently I started thinking about how I could optimize projects in a more elegant and proper way than I currently do, so I decided to optimize an old but still active project, doing plenty of research into industry-standard tools, libraries and methods along the way.

Unfortunately, for confidentiality reasons I can’t tell you much about the project itself, but in general: it is written in PHP on the Laravel 5.2 framework and has 44 database tables with a lot of relations/joins. The database is MySQL, Elasticsearch is used for search, and Redis for caching. The application is hosted on an AWS EC2 instance running local Docker containers.

In this article I will cover optimization under three headings: frontend optimization, backend optimization and server-side optimization.


1 - Preparation

First of all, I installed the project on my local computer using Docker. Then I went through the whole frontend and backend code to get an idea of the project. The codebase was quite clean, readable and seemed maintainable, although some parts had nested if/else blocks for special situations I knew nothing about; they appeared rarely as I skimmed through, so the code didn’t scare me too much. Once my Docker containers were running, I imported the most recent production database into my local environment. I was set up to work on the project in just 10 minutes!


2 - Frontend

Fixing the frontend issues was quite easy, to be honest. I checked whether the JavaScript and CSS files were minified. I found some online tools that detect unused CSS and JavaScript code, but I didn’t really use them. Progressive loading wasn’t an issue on this project, but as a reminder: putting CSS files in the header ensures faster perceived loading, since the stylesheets are loaded progressively.

Another thing: having many separate JS and CSS files slows down your website, since each one requires a separate request. So I installed webpack and merged all files into a single app.js and app.css. Also, instead of loading external JavaScript libraries from third-party servers, I downloaded those libraries to our server and imported them through webpack.

Besides performance, loading external JavaScript files can cause massive security issues too. Imagine you externally imported an innocent slider library; then the slider’s website gets hacked and the attacker puts malicious code into those JavaScript files, which are already included directly in your project, or the owner of the library does it himself…

After merging and downloading the external CSS and JS files, the next step was to put all images, JS and CSS files on a CDN and override Laravel’s asset() helper to point to my CDN URL instead of the project’s assets folder. That way, asset requests no longer use any resources from my web server.
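In code, such an override can be as simple as a small helper. A minimal sketch of the idea; the cdn_asset() helper name and the CDN_URL environment variable are my own examples, not the project’s actual code:

// app/helpers.php, autoloaded via the "files" section of composer.json.
// Points asset URLs at the CDN host instead of the local assets folder.
if (! function_exists('cdn_asset')) {
    function cdn_asset($path)
    {
        // e.g. CDN_URL=https://cdn.example.com set in .env
        $cdn = rtrim(env('CDN_URL', config('app.url')), '/');

        return $cdn . '/' . ltrim($path, '/');
    }
}

Blade templates then reference {{ cdn_asset('js/app.js') }} instead of {{ asset('js/app.js') }}.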

You can also set up nginx to serve static assets directly, without the request ever reaching the application, but I don’t really like this approach, as in my experience a CDN is always the better solution.

Lastly, I went through the images and checked whether any assets were bigger than 1 MB. I found a couple of images and simply compressed them. For compression my choice was Adobe Lightroom, but there are quite a lot of tools to automate this task as well.

So that’s pretty much what I did on the frontend. Those changes already increased speed quite a lot; however, the speed problem was mostly on the backend, as requests were taking way too long to get a response.


3 - Backend

After some very basic tests in the browser, it was pretty obvious that requests were taking longer than expected.

The server was running PHP 5.6, but my Docker setup was running PHP 7. So the first thing I did was migrate to PHP 7, which increased speed drastically! I was lucky enough not to run into any problems during the migration, so it was a very smooth and easy step, as easy as installing PHP 7 over PHP 5.6. But you won’t always be that lucky, so it’s better to follow a migration guide before upgrading your PHP version.

As I mentioned earlier, the project was massive and most of the data comes from the database. Since there were around 44 tables with lots of joins, I wasn’t exactly sure how heavy the database queries were.

Database performance issues mostly come from how particular parts of your application eager or lazy load data, which can lead to slow queries full of joins, and so on.

What are eager and lazy loading?

Let’s say you have a categories table with 100 rows and a products table with 1 million rows.

On the categories page, loading your categories along with their products is eager loading: you get all the data in one big chunk. However, you don’t really need the products on your categories page.

Lazy loading, on the other hand, means loading all categories without their products on the categories page. If someone clicks a category, you send another database query to fetch the products related to that category.

Technically, lazy loading is the better approach in most cases, apart from special situations such as certain AJAX requests. As an example, in Laravel:

Product::with('orders')->get(); // eager loading: orders are fetched together with the products
Product::find(1)->orders; // lazy loading: the orders query only runs when the relation is accessed

protected $with = ['orders'];

This will make your Product model eager load orders by default, so every time you query Product it will automatically load its orders too.
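To tie this back to the categories/products example, here is a minimal sketch; the Category model and its products relation are illustrative names of my own, not the project’s real code:

// Categories page: lazy, only the categories are queried
$categories = Category::all();

// Category detail page: a second query fetches just that category's products
$category = Category::findOrFail(1);
$products = $category->products; // the relation is queried on access

// If a page really needs categories AND their products, eager load them
// together to avoid an extra query per category (the N+1 problem)
$categoriesWithProducts = Category::with('products')->get();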

How did I find those slow queries?

MySQL has an amazing feature called the slow query log, which lets you find slow queries.

You can read more about it here.

In my case, I filtered for queries that take more than 15 seconds.

After spotting the queries, I simply used `mysqldumpslow` to get summarized information about them. You can also read more about mysqldumpslow here.
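As a complement on the application side, Laravel can also report slow queries at runtime. A minimal sketch, assuming Laravel 5.2’s DB::listen signature and an arbitrary 1000 ms threshold of my own choosing:

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Log;

// In AppServiceProvider::boot()
DB::listen(function ($query) {
    // $query is an Illuminate\Database\Events\QueryExecuted instance
    if ($query->time > 1000) { // time is reported in milliseconds
        Log::warning('Slow query', [
            'sql'      => $query->sql,
            'bindings' => $query->bindings,
            'time_ms'  => $query->time,
        ]);
    }
});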

In my case, the slow queries were caused by a lack of indexing.
Indexing plays a very important role in database optimization, and the logic behind it is quite simple.

Let me give you an example:
You have a users table with id, firstname, lastname, role, city_id and email, where id is the primary key and email is unique, and a cities table with id and city_name.

MySQL indexes primary key and unique columns by default, so id and email are already indexed.

Assume that your application searches people by their role, or shows users together with their city names. In that case the role and city_id columns need to be indexed for faster results. Long story short: all join columns and frequently used WHERE columns should be indexed.
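In Laravel this translates into a simple migration. A sketch for the hypothetical users table above:

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

class AddIndexesToUsersTable extends Migration
{
    public function up()
    {
        Schema::table('users', function (Blueprint $table) {
            $table->index('role');    // frequently filtered in WHERE clauses
            $table->index('city_id'); // used when joining against cities
        });
    }

    public function down()
    {
        Schema::table('users', function (Blueprint $table) {
            $table->dropIndex(['role']);
            $table->dropIndex(['city_id']);
        });
    }
}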

Spot slow functions

Frontend optimized, PHP version upgraded, queries sped up. At this point I was already quite happy with the results, but I wanted to go further, so I started hunting for slow code too. Xdebug is still the number one debugger and profiler for PHP, and even though I don’t use its debugging features anymore, I wanted to install the Xdebug profiler to spot slow functions. I also found a PHP package called Clockwork, which helped me tremendously to check code timings, request sizes, response sizes and many more things right from the browser. Clockwork even works together with the Xdebug profiler!

Here is a screenshot from Clockwork:

You can read more about Clockwork and download it here.

With the Chrome extension I got myself a pretty good debugger! I liked Clockwork so much that I added it to my Laravel boilerplate, which you can find here.

For your information, Clockwork is only available when your application is in debug mode, so be aware that it’s not a tool you want to use on your production server. That is a screenshot of a response; for confidentiality reasons I had to black out the controller names, sorry!

Eager Loaded Resources

One of the first things I noticed in the Clockwork panel was eagerly loaded resources such as videos and images. I found a great article about this by Jeremy Wagner, where he explains how to lazy load your resources. Even though I didn’t take any action, since I had already set up a CDN, it’s still a good read to get an idea. Here is the link.

Clockwork gave me all the slow functions, and with the Xdebug profiler on I could even see which part of a function was slow. Clockwork actually shows database query durations too, but I still prefer using the MySQL slow query log, since it gives the raw query timing rather than including the application overhead of the Laravel framework.
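Clockwork also lets you push your own data into its panel, which is handy when narrowing down a slow spot. A rough sketch, assuming a Clockwork version that ships the global clock() helper (not code from the project):

// Dump any value into the Clockwork panel instead of echoing it in the response
clock($users);
clock('Finished building the category listing');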

There were not many deadly speed issues in the functions, but some for loops were a little slow and might get even slower in the future. For example:

for ($i = 0; $i < count($users['countries']); $i++) {
    // do something with $users['countries'][$i]
}

calls count() on the user’s countries on every single iteration. However, if this loop is switched to

foreach ($users['countries'] as $country) {
    // do something with $country
}

the array is iterated directly and count() is never called in the loop condition. That’s why I personally prefer foreach loops over for loops; it helps me avoid this kind of hidden cost by default.
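If a for loop is still needed somewhere, for example because the numeric index matters, hoisting the count out of the condition achieves the same saving. A small sketch of my own, not code from the project:

$countries = $users['countries'];
$total = count($countries); // computed once, before the loop

for ($i = 0; $i < $total; $i++) {
    // do something with $countries[$i]
}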

Switching for loops to foreach loops in a few spots of the project was pretty much all I refactored in the code itself.


4 - Server Side

With the frontend and backend done, the next step was to look at what I could do on the server side.

The project was using the official nginx Docker container, so the settings were quite good already. The only thing I changed was increasing worker_processes. Gzip compression was already on, so I didn’t touch that either, but gzip compression is one of the most important server optimization options and should always be enabled.


5 - Results

For the results, I put the project under a stress test in my staging environment.

I ran tests with 500 concurrent users against various project routes using Siege. Siege is another great tool for benchmarking a website, with options such as the number of concurrent users, the test duration, and a file of URLs to hit. It’s a tool I use frequently for benchmarking; you can read more about Siege here. And with that I was done. Here are my before and after results.


Before

Transactions:               17300 hits
Availability:               100.00 %
Elapsed time:               179.84 secs
Data transferred:           66.6 MB
Response time:              4.04 secs
Transaction rate:           96 trans/sec
Throughput:                 0.00 MB/sec
Concurrency:                4.00
Successful transactions:    17300
Failed transactions:        0
Longest transaction:        59.00
Shortest transaction:       2.00

After


Transactions:               16800 hits
Availability:               100.00 %
Elapsed time:               179.23 secs
Data transferred:           48.00 MB
Response time:              0.96 secs
Transaction rate:           22.56 trans/sec
Throughput:                 0.00 MB/sec
Concurrency:                3.00
Successful transactions:    16800
Failed transactions:        0
Longest transaction:        10.01
Shortest transaction:       1.2

As you may have noticed, the website got roughly 4 times faster! And all of these changes and tests took me only one Sunday to finish.

I appreciate all your feedback and questions. Feel free to drop me an email at anr.bayramov@gmail.com.