Web optimization šŸ”„. Itā€™s worth it! (Part-1)

BeastImran
23 min read · Jun 28, 2021


My avatar everywhere ā¤ļø. Itā€™s cute right šŸ˜! Thanks to the creator.

Welcome ā¤ļø!

You are about to embark on a very interesting journey about optimizing the web in the best ways possible.

After reading this article, you should be able to improve a website's performance and user experience by as much as 120% šŸ”„, save 50 to 70% of its bandwidth, plan for efficient performance, and more.

Letā€™s start this outstanding journey on web optimizationā€¦.

Letā€™s start our journey ā¤ļø

A formal intro šŸ˜Ž: my name is Shaik Imran (BeastImran), a student interested in every aspect of the software industry and keen to dive deep into the software technologies, frameworks and tools that interest me. I build things that address, solve and simplify problems in many people's lives.

Web optimization Intro

Things to keep in mind

Common things to keep in mind about web optimization:

Web optimization is a great way to improve a siteā€™s performance and experience, but it also needs some care and pre-planning while implementing it.

Everything can be optimized, but not everything is worth optimizing.

Know and understand what and why you are doing it before doing it.

Overdoing something can cause the opposite of what you expect. KISS (keep it simple, stupid).

Most importantly, watch this. I donā€™t know why, but, yeah, it is what it is.

Things we are going to talk about

  • Resources that will help you (websites, videos, tools, etc).
  • Getting started with some great tools.
  • Optimizations (modularization, minification, image optimization, asset-loading techniques, compression, HTTP headers, etc.).

Web optimization resources

Websites

Google maintains the web.dev website, which will help you with a lot of things. It has tons of amazing content related to web development, ranging from courses to guides and articles. It is a great place for newcomers to start.

MDN (Mozilla Developer Network) web docs (the Firefox folks) is the go-to place when you feel lost or want to find better ways of doing things. It makes it really easy to learn, read and explore unfamiliar territory, with tons of in-depth documentation on most web technologies, security topics, and more.

Not a particularly useful website in itself, but this is my website, which I will use throughout this guide for examples. The techniques and methods the site started with were fine, but I will discuss how I made them far better.

It was built with a free Bootstrap template that was good enough for me in design and visual terms, but it came bundled with enormous assets: hi-res images, huge CSS and jQuery files, and so on.

The initial score with the raw template was around 40 to 60; now, check it yourself.

For Mobile:

Mobile score

For Desktop:

Desktop score

Videos

The Google Chrome Developers YouTube channel has really useful videos, ranging from best practices to guides, tips and tricks, and everything else you would want in order to learn and grow.

Web Dev Simplified is a fine place to learn about new web technologies. They often upload videos about new technologies, tips and tricks in web development, better ways of doing it, etc.

YouTube itself is a great place to learn, get entertained and pass the time.

Google is everyone's best friend šŸ˜‰.

W3Schools, I donā€™t have any words to say about this site šŸ™‚.

Web optimization tools

Performance tools

Photo by Pixabay from Pexels

The performance of a page is one of the most important factors in user experience. The page's load time, responsiveness, size, structure and so on all become factors in assessing performance. The better the performance of the page, the better the user experience.

PageSpeed Insights is a great tool to measure a page's performance, approximate the user experience and check other very useful things which we will discuss later. It will also suggest better ways of doing things. Lighthouse is a Chrome-integrated tool that offers similar functionality to the PageSpeed Insights site.

Other similar tools are:

All of these tools will help you measure page performance in different metrics. Most of them provide great insights into how the page is being loaded, the timings, etc, and also, they will suggest better ways of doing it.

SEO (Search Engine Optimization) tools

Source Pixabay from Pexels

Good SEO makes a page rank higher in search results. Even if your site is not competing in a crowded niche or on brand value (like mine), a minimum level of SEO should still be maintained. If the opposite is true, and your site's area of interest is in serious competition and you need impressions and clicks from search results, then SEO is very important.

SEO tools like Ahrefs help us find the right words to use on a page. If you use them correctly, they can be a powerful tool in your arsenal. Other similar tools are:

SEO is not only about stuffing keywords into the page. It is also about the metadata your page holds, the data you allow crawlers to reach, read and understand, the structure of your page, the page headers, its performance, and so on. These are serious business.

Here are few things to get you started.

META tags are HTML tags that tell browsers and crawlers about your site: its information, its data, how it is structured, how to handle it, etc. Here's an example.

page description metadata
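
For reference, a description META tag looks something like this (the content text here is just an illustration):

<meta name="description" content="BeastImran: student, developer and open-source lover." />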

META tags follow the above format. They have name and content attributes, and besides those there are other attributes like charset (and rel on <link> tags).

name=ā€descriptionā€ property tells crawlers and search engines the content to show as the description of this page in search results.

BeastImran search result

We also need to tell browsers and crawlers what format or encoding the page uses for its content. We can declare it as below.

charset metadata
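
The standard declaration is:

<meta charset="UTF-8" />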

This will tell browsers and crawlers that the page has UTF-8 content. Almost all websites have their content in UTF-8.

Canonical links are one of the most important things to specify on your page. Believe me, they will save you from a lot of situations like the one I faced. Anyone can link your website to their domain by pointing a type A record in their DNS at your server's IP address. We call this hotlinking.

Search engines don't show duplicate sites in search results; they show the latest domain's results and exclude duplicate pages. My website was not showing up in search results because someone had hot-linked their domain to my server's IP address.

canonical link
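
For my site, the canonical link described below looks like this:

<link rel="canonical" href="http://beastimran.com/" />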

This tells crawlers and search engines that this is the master URL to show in search results and that the rest are duplicates. In the above code, I used the rel="canonical" attribute with the href="http://beastimran.com/" value inside a <link> tag. You need to specify this on every real page, with that page's own URL as the href value.

There are a lot of other things to include and improve. Learn more about these topics to see how you can improve SEO, security, etc.

Recommended places to start

A great tool to generate JSON-LD data

A great tool from Google to test rich results

(Note: the Structured Data Testing Tool is being deprecated soon.)

Advanced SEO techniques and strategies

Web optimization techniques

Photo by Lum3n from Pexels

I will talk about a few optimization techniques which make a significant difference.

Recommended video. Before you continue, please watch this video.

Minify every asset possible

Every site has HTML and CSS as common assets. They get shipped over the wire, which on average means 200KB to 300KB of consumed bandwidth and therefore increased latency. And don't be shocked if they reach 1 to 2MB by today's standards.

Minifying is a way to remove unnecessary spaces, newline characters, comments and so on, which can save you 20 to 30% of the bandwidth and lower latency significantly.
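
As a tiny illustration, a CSS rule like this:

.hero {
    margin: 0 auto;
    padding: 16px 24px;
    color: #ffffff;
}

minifies down to a single line with every non-essential character stripped out:

.hero{margin:0 auto;padding:16px 24px;color:#ffffff}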

In my case, the main CSS file bundled with the template was around 240KB. It would take at least 5 seconds on a slow 4G connection and 15 seconds on a slow 3G connection to download.

Minifying that file brought its size down to 168KB, which is 30% lower than the original. Now it should take around 3 to 10 seconds to download.

I did the same to all of my assets (HTML, CSS, JS, jQuery), and it brought the total download time on slow 3G down from 15 to 17 seconds to 10 to 12 seconds, which is roughly a 35% improvement.

Still, 10 to 12 seconds load time is huge. Here comes our second technique.

Compress all images

The template came with a lot of hi-res images, and obviously their sizes are much larger than what web pages need. My site doesn't need any hi-res images, so I removed them and compressed all of my images into lower-quality JPG files (choose WebP if possible) in different sizes/dimensions for different placements.
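
If you prefer to script this rather than use an online tool, a minimal sketch with Python and Pillow (my choice of library and the file names here are just an assumption; any image tool works) could look like this:

from PIL import Image

img = Image.open("hero-original.jpg")           # illustrative file names
img = img.resize((1200, 675))                   # a size that fits the placement
img.save("hero-1200.webp", "WEBP", quality=75)  # WebP where supported
img.save("hero-1200.jpg", "JPEG", quality=75, optimize=True)  # JPG fallback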

There are plenty of online tools for image compression, and the best part is that most of them are free to use with no limitations. These are some good ones.

I use this tool most of the time for video or GIF optimization and editing work; it is a great tool to have in your arsenal.

All of this brought the site's average image download from 3MB down to 300KB, which is approximately 90% lower than the original size.

Improve the resource loading method

loading animation. Source

Whenever a CSS/JS file is linked in HTML, the browser pauses parsing the HTML, requests the file, downloads it, parses and executes it, and only then continues with the HTML; the cycle repeats for every such resource. We call this kind of resource usage render-blocking. More about it here.

To improve this situation, we can use modern built-in features like asynchronous requests, deferred requests, preloading, lazy loading, etc.

Assets are loaded in order and their size matters šŸ˜‰. Letā€™s talk about loading images.

Loading Images

Images are a great way to express and convey a message; they motivate, and people recognize them naturally. Most of the time, images are spread all over the page and do not need to be loaded all at once. They can be loaded on demand: when they come into the user's view, when there is some actual use for them, and so on.

Browsers have a prominent image-loading feature called lazy loading. The lazy-loading technique loads images just before they come into the user's viewport, in other words, on demand. They may also get loaded before the user reaches them if all of the other content has already been loaded and rendered.

Learn more about it here. You can enable it just by adding the loading="lazy" attribute to the <img> tag.

lazy loading attribute
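
In its simplest form (the file name and alt text are illustrative):

<img src="gallery-photo.jpg" alt="A photo from the gallery" loading="lazy" />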

You can decide on the server side which images are going to be lazy-loaded, or just load all of them lazily. I recommend lazy-loading the images that appear after the immediate viewport.

Be aware that this kind of dynamic loading can cause layout shifts all over the place, which will annoy users. You can deal with it by reserving space for those images, specifying their width and height in the HTML itself.

specifying width and height
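
Continuing the same example, reserving the space looks like this:

<img src="gallery-photo.jpg" alt="A photo from the gallery" loading="lazy" width="640" height="360" />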

By doing this simple thing, you can reduce the TTI (Time To Interactive) by a lot, which is great for the many frustrated/impatient users who are in a hurry and can't wait for all the images to load šŸ˜‰ (users are gods, just saying, to be on the safe side šŸ˜).

In my case, doing that reduced the TTI from 5 to 10 seconds down to 4 to 8 seconds.

You can also configure images of different sizes to be loaded depending on screen sizes automatically. Learn more here. Here is an example.

srcset attribute
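
A sketch of what this can look like (the file names and widths are illustrative):

<img src="photo-800.jpg"
     srcset="photo-480.jpg 480w, photo-800.jpg 800w, photo-1200.jpg 1200w"
     sizes="(max-width: 600px) 480px, 800px"
     alt="A photo from the gallery" />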

You can use the srcset attribute on the img tag to specify images of different sizes, which tells the browser to load the best image from the specified set depending on the user's screen size. If none of them apply, it falls back to the default src value.

Placeholders are a great way to keep users engaged. This website may help you find the perfect placeholder you are looking for.

Loading Scripts

JavaScript can be crucial to any site's functionality, and scripts are often extensive files. Thanks to built-in HTML features, loading scripts efficiently has become easier than ever before. Learn more.

We can load scripts in 4 different ways (common usage). They are as follows.

  1. Normal loading.
  2. Asynchronous loading.
  3. Defer loading.
  4. Appending script to DOM after the page has loaded.

Here is a video (recommended) to better understand Normal loading, Async and Defer loading.

JavaScript Loading techniques

Another great video to watch:

Async vs Defer loading techniques

The normal method of loading scripts

Scripts are linked in HTML as follows.

Normal method
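
That is, a plain script tag (the file name is illustrative):

<script src="javascript.js"></script>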

When the browser parses this line, it stops HTML parsing and requests the javascript.js resource. Once javascript.js has been downloaded, it is parsed and executed. Only after that does HTML parsing continue, and the cycle repeats. Loading, parsing and execution happen in the order specified in the HTML.

JavaScript loading normally
Source

If you notice what's wrong here: the javascript.js file is acting as a major render-blocking resource. The first thing we can do is modularize the file. If we can separate the initially needed code from the code needed later, we can load the initial part first and load the rest efficiently in the background without blocking the render.

By doing this, we can decrease the render-blocking time significantly, and the rest can be loaded on demand, after the page has loaded completely, etc.

Async loading

async attribute
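
The same script tag with the attribute added:

<script src="javascript.js" async></script>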

Just adding the async attribute to the script tag tells the browser to load the script asynchronously. This means that while the HTML is being parsed, the JavaScript file is downloaded in the background. Once it is downloaded, HTML parsing is paused, the JavaScript is executed, then HTML parsing continues and the cycle repeats.

JavaScript asynchronous loading
Source

The thing to keep in mind while using the async method is that the order of execution is not maintained. All of the scripts are executed independently, and no order is guaranteed.

Defer loading

defer attribute
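
And with defer instead:

<script src="javascript.js" defer></script>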

Defer acts similarly to async: it downloads the JavaScript file in the background while the HTML is being parsed. Here is the difference: deferred scripts are executed only after the document has been fully parsed, and in the order specified in the HTML.

I recommend using defer wherever possible as it is a better way of improving the performance and user experience than async and normal loading.

A combination of the above three can also be required or might come in handy.

Append after page load

Another little trick is to load a small JavaScript file initially (or inline it in the HTML with a <script> tag) that contains the following code.

loadJS
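
A minimal sketch of such a helper (the name loadJS mirrors the caption; the rest is illustrative):

function loadJS(url) {
    // Create a script element pointing at the file we want and append it to the body.
    const script = document.createElement("script");
    script.src = url;
    document.body.appendChild(script);
}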

As you can see, in this function we create a script element and assign its src to the URL of the JavaScript file we want to append. Finally, we append the script element to document.body as a child.

You then call the function as below when needed, for example when the window has loaded.

calling loadJS
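
For example (the path is illustrative):

window.addEventListener("load", () => loadJS("/js/non-critical.js"));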

Or just write it out directly, as below.

Load JavaScript when the window loads.
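
That is, append the script inline once the window has loaded (again, the path is illustrative):

window.addEventListener("load", () => {
    const script = document.createElement("script");
    script.src = "/js/non-critical.js";
    document.body.appendChild(script);
});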

We are adding an event listener for the load event to the user's window. This tells the browser to run the specified function when the page has loaded, that is, after the HTML and CSS rendering is done.

Learn more about the load event here.

Just apply these techniques in a meaningful way and see the magic.

Loading Fonts

Photo by Brett Jordan from Pexels

Fonts are one of the major things to specify, whether for consistency across devices, for branding, for the theme, and so on. Here are some pitfalls!

  • Using a lot of fonts.
  • Loading unnecessary font assets.
  • Loading after content is displayed.
  • And this, I donā€™t know why, but yeah. It is what it is šŸ¤¦šŸ»ā€ā™‚ļø.

In short, using a lot of fonts is a waste of resources and should be done only if needed. Usually, a page should have at most 3 fonts. More than that is not recommended for many reasons: no consistency on the page, it can get annoying, over-styling, etc.

Let's say you are using only the bold version of a font; then there is no need to load the whole font family. Load just the bold version. DO NOT LOAD THE WHOLE FAMILY. You can use font CDNs like Google Fonts for better performance, but be aware of the extra requests that takes.
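
For instance, with Google Fonts you can request a single weight of a single family (the family here is just an example):

<link rel="preconnect" href="https://fonts.googleapis.com" />
<link rel="stylesheet" href="https://fonts.googleapis.com/css2?family=Roboto:wght@700&display=swap" />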

I see this often on Medium itself. As of June 26, 2021 (Indian time), while I am writing this article, Medium's content font changes a second or two after I open an article for the first time, and it is noticeable. This is not recommended. Load the font initially, give it a higher priority, or just use JavaScript to handle it.

To learn more about fonts, I recommend reading these articles.

Compressions techniques

When it comes to page load times, transferring files over the network has always been the most time-consuming part of all the behind-the-scenes magic āœØ. Let's take my site as an example again.

After minifying all the site's assets (HTML, CSS, JavaScript files), the total initial download size is still around 316KB, which includes approximately 8 very small video files. Images and videos account for 100KB, leaving 216KB worth of other assets. I believe this size is small and acceptable in 2021, but it will still take around 10 to 15 seconds to load on a slow 3G network, and that is a significant amount of time. Here is where compression comes to help.

Compression is a way to reduce file size to save space and bandwidth. There are two kinds of compression techniques.

  1. Lossless compression.
  2. Lossy compression.

In lossy compression, reducing the file size as much as possible is the main aim. We commonly see these techniques in image, video and audio compression. Image formats like JPG are compressed formats that aim to provide the same visible quality while reducing the file size significantly.

The MP4 video format and MP3 audio format play a similar role for video and audio as JPG does for images.

In lossless compression, reducing the file size without losing any data is the main aim. There are a lot of lossless compression techniques available; among these, Gzip is still the king of the web. Gzip has been around for decades and is supported by all major browsers. Brotli is a comparatively new, similar technique that promises even smaller file sizes with comparable compression speeds.

In lossless compression, particularly for web technologies, compression speed and file size matter a lot. I will talk about how to implement Gzip.

Configure the Nginx web server like this.

Configure the Apache web server like this.

Configure the IIS web server like this.

If you use some other web server, as I do (the Sanic framework, similar to Python's Flask), here is what you can do. I am going to use Python 3 here.

gzip response
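
The idea, as a minimal sketch rather than my exact code (the helper name send_response mirrors the caption; the Sanic-style request/response objects are an assumption):

import gzip

def send_response(request, response):
    # Read the encodings the browser says it supports.
    accept_encoding = request.headers.get("Accept-Encoding", "")

    # No gzip support: send the response without compression.
    if "gzip" not in accept_encoding.lower():
        return response

    # Compress the body and write it back to the response.
    compressed_body = gzip.compress(response.body)
    response.body = compressed_body

    # Tell the browser that the content is gzip-compressed.
    response.headers["Content-Encoding"] = "gzip"
    response.headers["Content-Length"] = str(len(compressed_body))
    response.headers["Vary"] = "Accept-Encoding"

    return response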

The send_response function/method (call it whatever) accepts the incoming request and the response you are about to send.

First, we read the Accept-Encoding header to see what the browser supports. If the Accept-Encoding header is specified, it looks something like this.

Accept-Encoding: gzip, deflate

In the above case, the browser is telling us that it supports the gzip and deflate compression techniques. If there is no such header (or no gzip support), the response is sent back without compression.

When the browser supports gzip compression, we compress the content (body) of the response, write the compressed content back into the response body, and set the required headers to tell the browser that the content is gzip-compressed. Finally, we return the response, which is now far smaller. Its headers might look like this.

Compression headers
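
For example (the length value is just an illustration):

Content-Encoding: gzip
Content-Length: 48210
Vary: Accept-Encoding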

Similar functionality can be achieved in any language you are using, and you can use any compression technique, but everything comes down to browser support. As previously said, Gzip compression is supported by most browsers. Check out browser compatibility and learn more about compression here.

Guess the total size of my site now! The total download size is down from 316KB to an average of 140KB šŸŽ‰šŸ„³. That is roughly a 55% decrease in total size, and the load time on slow 3G is now 4 to 7 seconds.

Reduce the number of requests

Photo by Pixabay from Pexels

DNS lookups take a long time the first time they are made. Keep the number of different domains you request small, at least for the initial content load, and do the rest in the background so you don't block content rendering. Alternatively, host all the assets on your own domain so that the browser doesn't have to query DNS for multiple domains.

Letā€™s take an example.

Bootstrap CDN
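
A typical jsDelivr link for Bootstrap 5.0.2's minified CSS looks like this (the exact link on any given site may differ slightly):

<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.0.2/dist/css/bootstrap.min.css" />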

When a browser sees this in the HTML, it first needs to find out who cdn.jsdelivr.net is. Then it sends a request for the /npm/bootstrap@5.0.2/ā€¦ā€¦ file, which is the minified Bootstrap CSS.

Then this,

jQuery CDN

then this,

Bootstrap JS CDN

then this,

PopperJS CDN

These links are very common on a modern website built using Bootstrap, jQuery, Popper JS and Bootstrap JS. The problem is that with every new domain, the number of requests grows, and so do the load time and the render time.

Another problem is that even though these are great utilities/frameworks to use, they carry a lot of other stuff we don't need.

Let's take my website again as an example. It came bundled with the above Bootstrap CSS and JS, jQuery and Popper JS. The only features used were the carousel and the menu slider, which could easily have been done using vanilla JS.

My entire website was structured using Bootstrap CSS, which made sense. Bootstrap JS was used for the carousel, which depended on Popper JS. jQuery was being used to expand or close the menu, either automatically depending on the user's screen size or when the user clicks the menu button. On a large (desktop- or iPad-sized) display it would expand with a sliding animation; otherwise it would stay closed.

I removed all of that unnecessary stuff and wrote vanilla JS code that replicates the same functionality. It goes like this.

Auto open menu

Toggle show class JS code
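
Roughly like this (class and element names are illustrative; 992px approximates Bootstrap's large breakpoint):

const menu = document.querySelector("#navbarMenu");

function autoOpenMenu() {
    // Show the menu automatically on large screens, hide it on small ones.
    menu.classList.toggle("show", window.innerWidth >= 992);
}

window.addEventListener("resize", autoOpenMenu);
autoOpenMenu();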

On click open or close menu

Open/Close menu on click
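
A sketch (again, the IDs are illustrative):

document.querySelector("#menuButton").addEventListener("click", () => {
    document.querySelector("#navbarMenu").classList.toggle("show");
});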

Share button (copy to clipboard on click)

Copy message to clipboard on click
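
A sketch using the Clipboard API (the ID and the copied text are illustrative):

document.querySelector("#shareButton").addEventListener("click", async () => {
    await navigator.clipboard.writeText("https://beastimran.com/");
    alert("Link copied to clipboard!");
});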

Adding these few lines of code to my JS file made all the other dependencies unnecessary. The JS file lives on my site itself, so it also reduces the number of external requests. Evaluate the pros and cons of this for yourself.

This problem of unnecessary extra requests is not limited to CSS or JS files. Every asset on the page can lead to this issue.

Keep as many assets as possible on your server and minimize the extra requests to other domains. Specify caching policies.

HTTP Headers

Use HTTP headers to their full extent. There are a lot of headers that tell the browser many things: how to handle data, whether it has to cache the assets and for how long, what security policies to enable or disable on a page, etc. Learn more about them here.

We will talk about the following few of them:

I hear people say that "we can achieve the functionality of a specific header by adding the respective <meta> tags". I would love to disagree with that argument. Of course, you can use <meta> tags to do that, but if you add them to the page you are just increasing the file size and decreasing performance by making the browser parse more markup. Reading values from response headers is far more efficient than parsing HTML and then picking the values out of it.

CSP (Content Security Policy) headers tell browsers where to expect content from, what to do if anything unexpected is found, where to report the unexpected behaviour, and so on. This is mainly useful for eliminating cross-site attacks, code injection, etc. You should specify at least a few important directives like script-src, img-src, font-src, etc. Learn more here.

img-src CSP attribute
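
As a header, the directive described below looks like this:

Content-Security-Policy: img-src 'self' http://beastimran.com http://www.beastimran.com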

By setting the img-src directive to "'self' http://beastimran.com http://www.beastimran.com", I am telling the browser to expect image assets from these two origins only; 'self' means the same origin the request is being made from. 'self' should be used consciously because it can be abused to inject unwanted resources. I am going to remove the 'self' value after this article is complete. Learn more here.

script-src CSP attribute
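
For example (the CDN origin here is purely illustrative):

Content-Security-Policy: script-src 'self' https://cdn.example.com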

Similar to img-src, the script-src directive tells the browser to expect scripts only from the listed locations/URLs. As previously said, 'self' should be used consciously. Learn more here.

x-xss-protection header
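
A value that matches the behaviour described below is:

X-XSS-Protection: 1; mode=block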

Modern browsers are smart enough to detect some XSS attacks, but they are not completely resistant. When they detect one, the x-xss-protection header tells the browser what to do. With the above setting, when an XSS attack is detected, the page is blocked from rendering rather than being shown.

x-content-type-options header
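
That is:

X-Content-Type-Options: nosniff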

The x-content-type-options header tells the browser whether or not to sniff the MIME types of content on the page. MIME types can be sniffed from all kinds of content: stylesheets, HTML files, JSON files, images, videos, etc. I have disabled sniffing by specifying the nosniff value, as you can see above. Learn more here.

You don't need to load all of the assets every time a user visits your page. You can cache your assets (store them locally in the browser for faster access) for a specific amount of time and control whether and when they get updated. Learn more about it here.
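
A common caching header for static assets looks like this (the one-year max-age is just a typical choice):

Cache-Control: public, max-age=31536000, immutable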

There are tons of other useful headers to specify like security headers, asset caching headers, etc. Learn more about them here.

Here is a great tool to test your page:

This is an all-in-one test kind of website from Mozilla and is very handy. Mozilla Observatory will test all kinds of stuff: SSL, HTTP headers, page speed and performance, and a lot more. I use this website almost every time.

Conclusion

Conclusion smile. Source

Doing all of the things described here might give up to a 120% increase in performance and, if applied meaningfully, will improve the user experience significantly.

It's been an awesome journey šŸ¤©, from planet Earth all the way up to Hogwarts School of Witchcraft and Wizardry šŸ”®. I very much enjoyed learning about these topics and writing this article, and I believe it was useful for you, at least a little bit. As a learner like you, I should mention that there might be some inaccuracies in terminology or usage here and there, even though I went over it multiple times. Links for all of the topics mentioned have been provided, and I recommend you go through them, understand how things work on your own, and figure out what you can do with them. This article just gives you a glimpse into those topics.

Have a Great day, afternoon, evening or Good night, wherever you are in the world. Here is a tribute for you ā¤ļø.


BeastImran

Creator of telegram @allutilitybot ā¤ļø, open-source lover šŸ’—, Naruto fan ā¤ļøā¤ļøšŸ”„šŸ”„. My website: beastimran.com