Web optimization 🔥. It’s worth it! (Part 1)
You are about to embark on a very interesting journey about optimizing the web in the best ways possible.
After reading this article, you should be able to improve a website’s performance and user experience by almost 120% 🔥, save 50 to 70% of bandwidth, plan for efficient performance, and more.
Let’s start this outstanding journey on web optimization….
A formal intro 😎: my name is Shaik Imran (BeastImran), a student interested in every aspect of the software industry and keen to dive deep into the software technologies, frameworks and tools that interest me. I build things that address, solve and simplify problems in many people’s lives.
Web optimization Intro
Things to keep in mind
Common things to keep in mind about web optimization:
Web optimization is a great way to improve a site’s performance and experience, but it also needs some care and pre-planning while implementing it.
Everything can be optimized, but not everything is worth optimizing.
Know and understand what you are doing and why before doing it.
Overdoing something can cause the opposite of what you expect. KISS (keep it simple, stupid).
Things we are going to talk about
- Resources that will help you (websites, videos, tools, etc).
- Getting started with some great tools.
- Optimizations. (modularize, minify, images optimizations, asset loading techniques, compressions, HTTP headers, etc).
Web optimization resources
Google maintains a web.dev website that will help you with a lot of things. It has tons of the most amazing content related to web development ranging from courses to guides and articles. This is a great place to start for newcomers.
MDN Web Docs
MDN(Mozilla Developer Network) web docs (Firefox guys) is the go-to place when you feel lost. Find out better ways of doing things. It is really easy to learn, read and explore the unknown areas of the darkness. It has tons of in-depth documentation about most of the web technologies, security-related things, etc.
Not really a useful website for you, but this is my site, which I will use for examples throughout this guide. The template’s default techniques and methods were fine, but I will discuss how I made them far better.
Built with a free Bootstrap template which was good enough for me in design and visual aspect, it came bundled with its enormous assets like hi-res images, huge CSS, jQuery files, etc.
The initial PageSpeed score with the raw template was around 40 to 60; now, check it yourself.
Google Chrome Developer YouTube channel has really useful videos ranging from best practices to guides, tips and tricks, etc. Everything you would expect to learn and grow.
Web Dev Simplified
Web Dev Simplified is a fine place to learn about new web technologies. They often upload videos about new technologies, tips and tricks in web development, better ways of doing it, etc.
YouTube itself is a great place to learn and get entertained.
Google is everyone's best friend 😉.
W3Schools, I don’t have any words to say about this site 🙂.
Web optimization tools
A page’s performance is one of the most important factors in user experience. Load time, responsiveness, size, structure, etc. all factor into how performance is assessed. The better the page performs, the better the user experience will be.
PageSpeed Insights is a great tool to measure a page’s performance and approximate user experience, along with other very useful features we will discuss later. It will also suggest better ways of doing things. Lighthouse is a Chrome-integrated tool that offers similar functionality to the PageSpeed Insights site.
Other similar tools exist as well. All of them help you measure page performance across different metrics; most provide great insight into how the page loads, the timings, etc., and will suggest better ways of doing things.
SEO (Search Engine Optimization) tools
Good SEO makes a page rank higher in search results. Even if your site faces no competition in its niche and has no brand value to defend (like mine), a minimum level of SEO should still be maintained. If the opposite is true and your site’s niche is seriously competitive, you need the impressions and clicks that come from search results, and SEO becomes very important.
SEO tools like Ahrefs help us find the right words to use on a page. Used correctly, they can be powerful tools in your arsenal. Other similar tools are:
- Free: Google Search Console
- Free: Google Ads Keyword Planner
- Free: Screaming Frog SEO Spider
SEO is not only about stuffing keywords into the page. It’s also about the metadata your page holds: the data you allow crawlers to reach, read and understand, the structure of your page, the page headers, performance, etc. These are serious business.
Here are a few things to get you started.
META tags are the HTML tags that tell browsers and crawlers about your site's info, the data, how it is structured, how to handle it, etc. Here’s an example.
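For example, a description META tag might look like this (the content value here is illustrative):

```html
<!-- Shown by search engines as the snippet under the page title -->
<meta name="description" content="BeastImran's personal site with projects, achievements and resume.">
```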
META tags follow the above format. They have name and content attributes; besides these, there are many other attributes like rel, charset, etc.
The name="description" tag tells crawlers and search engines what content to show as the description of this page in search results.
We also need to tell browsers and crawlers what format or encoding the page’s content uses. We can do that as below.
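The declaration goes inside the page’s head:

```html
<meta charset="utf-8">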
This will tell browsers and crawlers that the page has UTF-8 content. Almost all websites have their content in UTF-8.
Canonical links are among the most important things to specify on your page. Believe me, they will save you from situations like the one I faced. Anyone can point their own domain at your server by creating an A record in their DNS with your server’s IP address. We call this hotlinking.
Search engines don’t show duplicate sites in search results; they show the latest domain’s results and exclude duplicate pages. My website was not showing up in search results because someone had pointed their domain at my server’s IP address.
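A canonical link is declared inside the page’s head like this:

```html
<link rel="canonical" href="http://beastimran.com/">
```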
This tells crawlers and search engines that this is the master URL to show in search results and that the rest are duplicates. In the above code, I used the rel="canonical" attribute with the href="http://beastimran.com/" value inside a <link> tag. You need to specify this on every real page, with that page’s own URL as the href value.
There are a lot of other things to include and improve. Learn more about these topics to see how you can improve SEO, security, etc.
Recommended places to start
Beginners Guide to SEO | Search Central | Google Developers
A Guide to JSON-LD for Beginners [Json Ld Code]
JSON for Linking Data
A great tool to generate JSON-LD data
Schema Markup Generator (JSON-LD) | TechnicalSEO.com
A great tool from Google to test rich-text
Rich Results Test - Google Search Console
Note that some older structured data formats are being deprecated soon, so prefer JSON-LD.
Advanced SEO techniques and strategies
Web optimization techniques
I will talk about a few optimization techniques which make a significant difference.
Minify every asset possible
Every site ships HTML and CSS as common assets. They get sent over the wire, consuming on average 200KB to 300KB of bandwidth and adding latency. Don’t be shocked if they reach 1 to 2MB by today’s standards.
Minifying removes unnecessary spaces, newline characters, etc., which can save you 20 to 30% of that bandwidth and lower latency significantly.
In my case, the main CSS file bundled with the template was around 240KB. It would take at least 5 seconds to download on a slow 4G network and 15 seconds on slow 3G.
Minifying that file brought its size down to 168KB, which is 30% smaller than the original. It should now take around 3 to 10 seconds to download.
I did the same to all of my assets (HTML, CSS, JS, jQuery), which brought the total download time on slow 3G from 15-17 seconds down to 10-12 seconds: roughly a 30-35% performance improvement.
Still, a 10 to 12 second load time is huge. Here comes our second technique.
Compress all images
The template came with a lot of hi-res images whose sizes are, obviously, much larger than what web pages need. My site doesn’t need any hi-res images, so I removed them and compressed all of my images into lower-quality JPG files (choose WebP if possible) at different sizes/dimensions for different placements.
There are plenty of online tools to use for image compressions and the best part is, most of them are free to use with no limitations. These are some good ones.
Compress JPEG Images Online
Compress JPEG images and photos for displaying on web pages, sharing on social networks or sending by email.
TinyPNG - Compress PNG images while preserving transparency
Online Image Compressor
Easily compress images at optimal quality in seconds.
I use this tool most of the time for video or GIF optimization and editing; it’s a great tool to have in your arsenal.
Animated GIF editor and GIF maker
All of this brought the site’s average image download size from 3MB down to 300KB, approximately 90% smaller than the original.
Improve the resource loading method
Whenever a CSS/JS file is linked in HTML, the browser pauses parsing the HTML, requests the file, downloads it, parses and executes it, and only then continues with the HTML; the cycle repeats. Resources used this way are called render-blocking. More about it here.
To improve this situation, we can use modern built-in features like asynchronous requests, deferred requests, preloading, lazy loading, etc.
Assets are loaded in order and their size matters 😉. Let’s talk about loading images.
Images are a great way to express and convey a message; they motivate, and people recognize them naturally. Most of the time, images are spread all over the page and do not need to be loaded all at once. They can be loaded on demand, such as when they enter the user’s view.
Browsers have a prominent built-in feature for this called lazy loading. Lazily-loaded images are fetched just before they enter the user’s viewport, in other words, on demand. (Browsers may also fetch them earlier once all other content has been loaded and rendered.)
Learn more about it here. You can do it just by adding the loading=”lazy” attribute in the <img> tag.
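For example (the file name and dimensions are illustrative; the width and height attributes reserve space so the late-loading image doesn’t shift the layout):

```html
<img src="gallery-photo.jpg" alt="A photo from the gallery"
     loading="lazy" width="640" height="360">
```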
You can decide on the server-side which images are going to be lazy-loaded or just load all of them lazily. I recommend images after immediate viewport to be lazy-loaded.
Be aware that this kind of dynamic loading causes layout shifts all over the place, which annoys users. You can deal with it by reserving space for those images: specify their width and height in the HTML itself.
By doing this simple thing, you can reduce the TTI (Time To Interactive) by a lot, which is great for frustrated/impatient users in a hurry who can’t wait for all the images to load 😉 (users are gods, just saying, to be on the safe side 😁).
In my case, doing that reduced the TTI from 5-10 seconds to 4-8 seconds.
You can also configure images of different sizes to be loaded depending on screen sizes automatically. Learn more here. Here is an example.
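A sketch with hypothetical file names:

```html
<img src="photo-640.jpg"
     srcset="photo-320.jpg 320w, photo-640.jpg 640w, photo-1280.jpg 1280w"
     sizes="(max-width: 600px) 320px, 640px"
     alt="A responsive photo">
```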
You can use the srcset attribute in the img tag to specify images of different sizes which will tell the browser to load the best image from the specified set of images depending on the screen size of the user. If none are available, it will fall back to the default src value.
Placeholders are a great way to keep users engaged. This website may help you find the perfect placeholder you are looking for.
We can load scripts in 4 different ways (common usage). They are as follows.
- Normal loading.
- Asynchronous loading.
- Defer loading.
- Appending script to DOM after the page has loaded.
Here is a video (recommended) to better understand Normal loading, Async and Defer loading.
Another great video to watch:
The normal method of loading scripts
Scripts are linked in HTML as follows.
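The three common in-HTML variants look like this (file names are illustrative):

```html
<!-- Normal: parsing pauses while the script downloads and executes -->
<script src="main.js"></script>

<!-- Async: downloads in parallel, runs as soon as it arrives (order not guaranteed) -->
<script async src="analytics.js"></script>

<!-- Defer: downloads in parallel, runs in document order after HTML parsing finishes -->
<script defer src="ui.js"></script>
```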
By loading scripts asynchronously or deferred, we can decrease render-blocking time significantly; the rest can be loaded on demand or after the page has finished loading.
One thing to keep in mind: with the async method, execution order is not maintained. Scripts execute independently, as soon as each finishes downloading, and no order is guaranteed.
I recommend using defer wherever possible, as it improves performance and user experience more than async or normal loading.
A combination of the above three can also be required or might come in handy.
Append after page load
You then call the function when needed. For example, you can run it once the window has loaded, or just call it directly right away.
To run it on window load, we add an event listener for the load event on the user’s window. This tells the browser to run the specified function once the page has finished loading, that is, after the HTML, CSS and other resources are done.
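A minimal sketch of both patterns (the file path and function name are illustrative):

```html
<script>
  // Hypothetical helper: create a <script> tag and append it to the DOM.
  function loadScript(src) {
    const s = document.createElement("script");
    s.src = src;
    document.body.appendChild(s);
  }

  // Run it only after the page has fully loaded...
  window.addEventListener("load", () => loadScript("/js/non-critical.min.js"));

  // ...or call it directly whenever the script is actually needed:
  // loadScript("/js/non-critical.min.js");
</script>
```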
Learn more about the load event here.
Just apply these techniques in a meaningful way and see the magic.
Fonts are one of the major things to get right, whether for consistency across devices, branding or theming. Here are some pitfalls!
- Using a lot of fonts.
- Loading unnecessary font assets.
- Loading after content is displayed.
In short, using a lot of fonts is a waste of resources and should be done only if needed. A page should usually have at most 3 fonts; more than that is not recommended, for reasons like an inconsistent look, annoyance and over-styling.
Say you are using only the bold version of a font: then there is no need to load the whole font family. Load just the bold version. DO NOT LOAD THE WHOLE FAMILY. You can use font CDNs like Google Fonts for better performance, but be aware of the extra requests they add.
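For example, Google Fonts’ CSS API can request a single weight; this link loads only the bold (700) style of Roboto (the family here is just an example):

```html
<link rel="stylesheet"
      href="https://fonts.googleapis.com/css2?family=Roboto:wght@700&display=swap">
```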
To learn more about fonts, I recommend reading these articles.
Optimize WebFont loading and rendering
How to load web fonts to avoid performance issues and speed up page loading
When it comes to page load times, transferring files over the network has always been the most time-consuming part of all the behind-the-scenes magic ✨. Let’s take my site as an example again.
Compression is a way to reduce file size to save space and bandwidth. There are two kinds of compression techniques.
- Lossless compression.
- Lossy compression.
In lossy compression, the main aim is to reduce file size as much as possible. We commonly see these techniques in image, video and audio compression. Image formats like JPG are compressed formats that aim to keep the apparent visual quality of the image while reducing its file size significantly.
MP4 plays a similar role for video, as MP3 does for audio.
In lossless compression, the main aim is to reduce file size without losing any data. There are a lot of lossless compression techniques available, and Gzip has been the king of the web so far. Gzip has been around for decades and is supported by all major browsers. Brotli is a comparatively new technique that promises even smaller file sizes at comparable compression speeds.
In Lossless compressions, particularly in web technologies, compression speeds and file sizes matter a lot. I will talk about how to implement Gzip.
Configure the Nginx web server like this.
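A minimal sketch using Nginx’s ngx_http_gzip_module, placed in the http or server block (the values are reasonable starting points, not tuned for any particular site):

```nginx
gzip on;
gzip_comp_level 5;      # balance CPU cost vs. size
gzip_min_length 256;    # skip tiny files where gzip overhead isn't worth it
gzip_types text/css application/javascript application/json image/svg+xml;
gzip_vary on;           # send "Vary: Accept-Encoding" for caches
```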
Configure the Apache web server like this.
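A minimal sketch using Apache’s mod_deflate module:

```apache
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```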
Configure the IIS web server like this.
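A minimal web.config sketch for IIS:

```xml
<configuration>
  <system.webServer>
    <!-- Enable compression for both static and dynamic responses -->
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>
```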
If you use some other web server, as I do (the Sanic framework, similar to Python’s Flask), here’s what you can do. I am going to use Python 3 here.
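Below is a minimal, framework-agnostic Python 3 sketch of the idea. The function name send_response and its signature are assumptions for illustration, not Sanic’s actual API:

```python
import gzip


def send_response(request_headers: dict, body: bytes):
    """Return (body, headers), gzip-compressing the body when the browser allows it."""
    headers = {}

    # Read which encodings the browser supports, e.g. "gzip, deflate".
    accepted = request_headers.get("Accept-Encoding", "")

    # No gzip support advertised: send the response uncompressed.
    if "gzip" not in accepted:
        return body, headers

    # The browser supports gzip: compress the content (body) of the response...
    compressed = gzip.compress(body)

    # ...and set the headers that tell the browser the content is gzip-compressed.
    headers["Content-Encoding"] = "gzip"
    headers["Vary"] = "Accept-Encoding"
    headers["Content-Length"] = str(len(compressed))

    return compressed, headers
```

A real handler would build the framework’s response object instead of returning a tuple, but the flow (inspect Accept-Encoding, compress, set the headers) is the same.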
The send_response function/method accepts the incoming request and the response you are sending.
First, we read the Accept-Encoding header to find out which compression techniques the browser supports. If the header is specified, it looks something like this:
Accept-Encoding: gzip, deflate
In the above case, the browser is telling us it supports the gzip and deflate compression techniques. If there is no such header, the response is sent without compression.
When the browser does support gzip, we compress the content (body) of the response, write the compressed content into the response body, and set the headers that tell the browser the content is gzip-compressed, most importantly Content-Encoding: gzip. Finally, we return the response, which is far smaller.
Similar functionality can be achieved in any language you are using and you can use any compression technique. But everything comes down to browser support. As previously said, Gzip compression is supported by most browsers. Check out browser compatibility and learn more about compression here.
Guess the total size of my site now! The total download size is down from 316KB to an average of 140KB 🎉🥳. That’s approximately a 55% decrease in total size, and the load time on slow 3G is now 4 to 7 seconds.
Reduce the number of requests
DNS lookups take a long time the first time a domain is resolved. Keep the number of distinct domains small, at least for the initial content load, and do the rest in the background so you don’t block content rendering. Alternatively, host all the assets on your own domain so the browser doesn’t have to query DNS for multiple domains.
Let’s take an example.
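For instance, a Bootstrap stylesheet pulled from the jsDelivr CDN looks something like this (the version shown is illustrative):

```html
<link rel="stylesheet"
      href="https://cdn.jsdelivr.net/npm/bootstrap@5.0.2/dist/css/bootstrap.min.css">
```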
When a browser sees such a link in the HTML, it first needs to resolve who cdn.jsdelivr.net is. Only then can it send the request for the minified Bootstrap CSS file.
Links like these are very common in a modern website built with Bootstrap, jQuery, PopperJS and Bootstrap JS. The problem is that with every new domain, the number of requests grows, and with it the time to load and render everything.
Another problem is, even though they are great utilities/frameworks to use, they hold a lot of other stuff we don’t need.
Let’s take my website again as an example. It came bundled with the above Bootstrap CSS and JS, jQuery and Popper JS. The only features actually used were the carousel and the menu slider, both of which could easily have been done in vanilla JS.
My entire website was structured using Bootstrap CSS, which made sense. Bootstrap JS was used for the carousel, which depended on Popper JS. jQuery was used to expand or close the menu automatically depending on the user’s screen size, or when the user clicked the menu button. On a large (desktop- or iPad-sized) display the menu would expand with a sliding animation; otherwise it stayed closed.
I removed all of that unnecessary stuff and wrote vanilla JS code that would replicate the same functionalities and they go like this.
Auto open menu
On click open or close menu
Share button (copy to clipboard on click)
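A sketch of what those replacements might look like (all element ids, class names and the 992px breakpoint are assumptions for illustration):

```html
<script>
  const menu = document.getElementById("menu");

  // Auto open the menu on large (desktop/iPad-sized) screens, close it otherwise.
  function autoToggleMenu() {
    menu.classList.toggle("open", window.innerWidth >= 992);
  }
  window.addEventListener("resize", autoToggleMenu);
  autoToggleMenu();

  // Open or close the menu when the menu button is clicked.
  document.getElementById("menu-button")
          .addEventListener("click", () => menu.classList.toggle("open"));

  // Share button: copy the page URL to the clipboard on click.
  document.getElementById("share-button")
          .addEventListener("click", () => navigator.clipboard.writeText(location.href));
</script>
```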
Adding these few lines of code to my JS file made all the other dependencies unnecessary. Since this JS file lives on my own site, it also reduces the number of requests. Evaluate the pros and cons for your own case.
This problem of unnecessary extra requests is not limited to CSS or JS files. Every asset on the page can cause it.
Keep as many assets as possible on your server and minimize the extra requests to other domains. Specify caching policies.
Use HTTP headers to their full extent. There are a lot of headers that tell the browser many things: how to handle data, whether to cache the assets, how long to keep them, what security policies to enable or disable on a page, etc. Learn more about them here.
HTTP headers - HTTP | MDN
We will talk about the following few of them:
I hear people say that "we can achieve the functionality of a specific header by adding the respective <meta> tags". I would love to disagree with that argument. Of course you can use <meta> tags for that, but adding them to the page increases the file size and decreases performance by making the browser parse more markup. Reading values from response headers is far more efficient than parsing HTML and then picking the values out of it.
CSP (Content Security Policy) headers tell browsers where to expect content from, what to do if anything unexpected is found, where to report the unexpected behaviour, etc. This is mainly useful for mitigating cross-site attacks, code injection, etc. You should specify at least a few important directives like script-src, img-src, font-src, etc. Learn more here.
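As a sketch, such a header restricting scripts, images and fonts to known origins might look like this (tune the directives for your own site):

```http
Content-Security-Policy: script-src 'self'; font-src 'self'; img-src 'self' http://beastimran.com http://www.beastimran.com
```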
By specifying the img-src values 'self' http://beastimran.com http://www.beastimran.com, I am telling the browser to expect image assets from those two origins only; 'self' means the same origin the request is being made from. 'self' should be used consciously because it can be abused to inject unwanted resources. I am going to remove the 'self' value after this article is completed. Learn more here.
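The next header is typically specified like this (note that modern browsers have largely removed support for it, so treat it as a legacy hardening measure):

```http
X-XSS-Protection: 1; mode=block
```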
Modern browsers are smart enough to detect many XSS attacks, but they are not completely resistant. When one is detected, the x-xss-protection header tells the browser what to do. With the value 1; mode=block, the browser blocks the page from rendering when an XSS attack is detected.
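A sketch of the header discussed next:

```http
X-Content-Type-Options: nosniff
```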
The x-content-type-options header tells the browser whether to sniff the MIME types of content on the page. MIME types can be sniffed from all kinds of content: stylesheets, HTML files, JSON files, images, videos, etc. I have disabled sniffing by specifying the nosniff value. Learn more here.
You don’t need to load all of the assets every time a user visits your page. You can cache (store locally on the browser for faster access) your assets for a specific amount of time and can control whether to update or not whenever you want. Learn more about it here.
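For example, a long-lived caching policy for static assets that never change (fingerprinted file names) might be set like this; max-age is in seconds, one year here:

```http
Cache-Control: public, max-age=31536000, immutable
```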
HTTP caching - HTTP | MDN
Cache-Control - HTTP | MDN
Expires - HTTP | MDN
There are tons of other useful headers to specify like security headers, asset caching headers, etc. Learn more about them here.
Here is a great tool to test your page:
This is an all-in-one testing site from Mozilla and is very handy. The Mozilla Observatory tests all kinds of stuff: SSL, HTTP headers, page speed and performance, and a lot more. I use this website almost every time.
Doing all of the specified things might give you that 120% increase in performance and, if applied meaningfully, will improve the user experience significantly.
It’s been an awesome journey 🤩 from planet earth, all the way up to Hogwarts School of Witchcraft and Wizardry 🔮. I very much enjoyed learning about these topics and writing this article. I believe it was useful for you at least a little bit. As a learner like you, I need to inform you that there might be some inaccuracies in terminology or usage here and there, even though I iterated over it multiple times. All the links of specified topics have been provided and I recommend you to go through those links, understand how things work on your own and figure out what you can do with them. This article just gives you a glimpse into those topics.
Have a Great day, afternoon, evening or Good night, wherever you are in the world. Here is a tribute for you ❤️.