FAQ - NitroPack is not Magic

Last updated on Dec 23rd, 2022 | 7 min

As a co-founder and CTO of NitroPack, I have heard lots of questions and misconceptions about our service over the years. Lately, some of these questions have been causing confusion in different communities.

That’s why I wanted to take the time to write this article. In it, I will answer the most common questions about NitroPack that we haven’t discussed yet. I will also shed some light on a few misconceptions.

Computing on the Web

Computing is about timing. The web is mainly single-threaded in terms of user-space code (the JS that you write), and it suffers from problems similar to those the computing industry faced when CPUs were single-threaded. In essence, there are multiple programs that need to be executed, and each one of them has full control over the main thread while it runs.

One of the first solutions for multitasking in computers was so-called "cooperative multitasking," which relies on programs voluntarily freeing the CPU so that other programs can run as well. This is very similar to how the web currently works.

A block of JS code can execute indefinitely, run to completion in one go, or split its work into subtasks and give back some CPU time to other programs on the main thread in between them. You want the latter: by splitting everything into subtasks, you give the system time to either execute other scripts (if there is nothing more important to do) or work on things like actually drawing the site on the screen and responding to user interaction events.
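To make that concrete, here is a minimal sketch of how a long-running piece of JS can split its work and voluntarily give the main thread back between chunks. This is an illustration of the general technique, not NitroPack's code; the chunk size and the setTimeout-based yield are my own choices.

```typescript
// Instead of processing all items in one long, blocking task, we yield
// back to the browser between chunks so it can render and handle input.

function yieldToMain(): Promise<void> {
  // A zero-delay timeout queues a new macrotask, letting rendering and
  // input handling run before we continue with the next chunk.
  return new Promise<void>((resolve) => setTimeout(resolve, 0));
}

async function processInChunks<T>(
  items: T[],
  handle: (item: T) => void,
  chunkSize = 50
): Promise<void> {
  for (let i = 0; i < items.length; i += chunkSize) {
    // Run one subtask: a small slice of the overall work.
    for (const item of items.slice(i, i + chunkSize)) {
      handle(item);
    }
    // Voluntarily give the main thread back before the next slice.
    await yieldToMain();
  }
}

// Usage: process 10,000 records without freezing the page.
processInChunks(Array.from({ length: 10_000 }, (_, i) => i), (n) => {
  // ...do some per-item work here...
});
```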

Our goal at NitroPack is to free up CPU time so that important tasks like rendering and responding to user interaction can run along with the JS on the page. It is going to be a long road until we are fully there, but we believe we are on the right track and what we have so far is a very good start.

To achieve this, NitroPack uses a one-of-a-kind algorithm for loading a site's resources. At the end of the day, all of the existing scripts are still executed. This is needed in order to preserve the site's original behavior, but the way these scripts are executed is what makes the difference in user experience.

Many people seem to be afraid that NitroPack removes scripts or functionality to achieve better performance, but this couldn't be further from the truth. One of the script loading options that NitroPack provides is to delay the execution of some scripts until user interaction is detected.

I often see people calling this a "hack" or "cheating," while at the same time being perfectly fine with image lazy loading, which is essentially the same thing: delaying image loading until the image is needed (comes into view).

Delaying script loading until user interaction is essentially lazy loading for scripts, i.e., loading only the scripts necessary to present the site and then loading the rest as they are needed (when the user starts interacting with the site).
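In its simplest form, the technique looks something like the sketch below. This illustrates the general idea, not NitroPack's implementation; the script URLs and the list of events are purely hypothetical.

```typescript
// "Lazy loading for scripts": a list of deferred scripts is injected
// only after the first sign of user interaction.

const deferredScripts = ['/js/chat-widget.js', '/js/reviews.js']; // example URLs

function loadDeferredScripts(): void {
  for (const src of deferredScripts) {
    const script = document.createElement('script');
    script.src = src;
    document.head.appendChild(script);
  }
}

// Any of these events counts as "the user started interacting".
const interactionEvents = ['pointerdown', 'keydown', 'touchstart', 'scroll'];

function onFirstInteraction(): void {
  // Remove all listeners first so the scripts are only injected once.
  interactionEvents.forEach((e) => removeEventListener(e, onFirstInteraction));
  loadDeferredScripts();
}

interactionEvents.forEach((e) =>
  addEventListener(e, onFirstInteraction, { passive: true })
);
```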

NitroPack pioneered this option on the market, but other solutions have recently started to adopt similar technology in their offerings. Timing in computing is key, and we are dedicated to creating a system with the right timing for every situation. We are proud to be leading the charge here.

“Okay, but how can your results be so much better compared to other solutions?”

Script lazy loading alone is great, but it is not the only thing that gives the outstanding results that NitroPack provides. It is the entire package of all our features working in unison that makes these results possible.

Take image optimization, for example. NitroPack provides a complete suite for image optimization, including image compression, adaptive sizing and image lazy loading.

Many solutions have provided lazy loading for years, but NitroPack took it to the next level by automating the lazy loading of all background images since day one. Other solutions typically offer lazy loading only for inline background images, whereas NitroPack handles all of them, even those defined in CSS files, and it happens automatically for any image.
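For illustration, a common do-it-yourself approach for backgrounds you control in the markup looks like the sketch below. The data-bg attribute and the 200px margin are my own choices, and this simple version does not cover backgrounds defined in external CSS files the way NitroPack does.

```typescript
// Lazy loading background images: the real image URL lives in a data
// attribute and is only applied once the element approaches the viewport.

const lazyBackgrounds = document.querySelectorAll<HTMLElement>('[data-bg]');

const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const el = entry.target as HTMLElement;
      // Apply the real background only when the element is about to be seen.
      el.style.backgroundImage = `url("${el.dataset.bg}")`;
      obs.unobserve(el);
    }
  },
  { rootMargin: '200px' } // start fetching a bit before the element scrolls into view
);

lazyBackgrounds.forEach((el) => observer.observe(el));
```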

We are currently aware of only one other solution on the market that has recently started to offer such an option, and it is not even a caching solution.

As for image compression, it is the same story. Almost no other optimization solution provides it as part of their toolkit, despite it being critical for a good image loading experience. Oftentimes users must pair their site optimization solution with a separate image optimization service and try to make the two work together.

Putting it All Together

Pair these optimizations with a world-class CDN (currently based on Cloudflare) that is automatically applied to all resources, and you have a very good foundation to build upon. Being a truly cloud-based solution, we can provide a framework for optimizing websites that is impossible to achieve otherwise.

Let's look at a simple example: most hosting providers only allow you to execute PHP code in the scope of a web request. Some allow executing code in a separate process, and even fewer allow running non-PHP processes. This means the options for running site optimizations become very limited because of the restrictions on what can be executed on the server.

Having our own cloud infrastructure allows us to use any system, software, or language to perform the optimizations, with fast interconnects between those systems and the added benefit that optimizing your site adds zero overhead to your servers.

“Okay, what about cheating in tests?”

Many people get confused when looking at the output of tools like GTmetrix. Take this for example:

[GTmetrix screenshot]

There are two questions worth answering here:

  • Why are there only 13 resources when I know the site has more resources?

  • Why is the size of some files 0?

The answer to both questions is “because of asset preloading”.

GTmetrix doesn’t tell you the full story. Let’s take a look at the same site inspected with Google Chrome’s inspector:

[Chrome Inspector screenshot]

You can now clearly see how the CSS files in group (1) are being preloaded. The same CSS files are then loaded into the page, group (2), but their size is now 0 because they are served from cache (i.e., no data has been transferred over the network).

The same goes for the JS files in group (3): they are preloaded with their actual sizes and scheduled to be executed later.

In conclusion, GTmetrix shows resources with size 0 simply because they have been preloaded, not because these files are actually 0 bytes.
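For context, here is what asset preloading looks like in its simplest, generic form (not the exact mechanism NitroPack uses; the URL is an example). The preload request downloads the file with its real size, and the later request for the same file is then served from cache with 0 bytes transferred.

```typescript
// Programmatically adding a preload hint for a stylesheet.
function preload(href: string, kind: 'style' | 'script'): void {
  const link = document.createElement('link');
  link.rel = 'preload';
  link.href = href;
  link.as = kind; // tells the browser what kind of resource to expect
  document.head.appendChild(link);
}

// Equivalent to <link rel="preload" href="/css/main.css" as="style"> in the HTML.
preload('/css/main.css', 'style');
```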

Because of the unique way NitroPack preloads these files, they do not block the main thread, which is why GTmetrix doesn't show the rest of the requests. This, however, is a decision made by the creators of GTmetrix, and there's nothing we can do about it.

At NitroPack, we work on solving page loading issues with a focus on improving the real-world loading experience, regardless of what speed measurement tools report. As it happens, our solutions also score pretty well, which is awesome :)

But because of the unconventional solutions we provide, the results of tools like GTmetrix need to be interpreted differently than they would be for a conventional solution.

“Don’t use a speed optimization plugin, fix your code/pay a developer instead.”

I hear this a lot. However, it is not as simple as it sounds. If you already have full-time developers who can do this for you and get you good results, then of course, go that route.

Very often, developers themselves have a hard time fixing a site's code to achieve good results. It usually takes a lot of work and effort, which in turn costs a lot. And the fix is only temporary, because the next time you install a new plugin or build new functionality into your site, you might end up having to go through the same optimization process again.

Instead of building new features, your developers end up spending their time optimizing your site. The larger the site gets, the harder it gets to do this. This strategy only works for small sites or very large companies with teams dedicated to speed optimization. However, more and more large companies are starting to adopt automated solutions like NitroPack because it makes sense and is actually scalable.

On the other hand, spending money on speed optimization for small sites rarely makes sense. We always recommend carefully examining the state of your business and the value that you will get from optimizing your site.

There are lots of free solutions that will give you just enough speed to grow your business initially. As your business grows, the costs related to running it will also grow and you will need to start thinking about speed more seriously. This is when you need to decide which solution you are going to invest in.

“NitroPack is too expensive.”

This is another comment that we get very often.

It seems that a lot of clients compare NitroPack's subscription price to one-time payment solutions without realizing that those solutions are missing critical features like a built-in CDN and image optimization/compression. Some of them offer these features as paid add-ons billed as monthly subscriptions, and adding the price of all the add-ons to the initial one-time investment makes a big difference.

The advice here is to factor in the price of these add-ons before comparing any two solutions out there.

“Get a faster server instead.”

The last topic I would like to cover is the idea that paying for a higher tier server instead of NitroPack is the better choice.

Undeniably, getting a faster server is a good thing if your site is running slow, but it addresses an almost completely different problem than the one NitroPack solves. A faster server can give you a faster TTFB (time to first byte), which is great, but that is only a small part of getting the site actually rendered on the client's device.

What you gain with a faster server is simply a shorter time to generate and send the dynamic HTML to the client's device. Once the client gets the HTML, their device renders the site at the same speed regardless of how powerful your server is (assuming static files are served equally fast, which is true in most cases). For cacheable pages, the same effect can be achieved by employing a caching mechanism on a less powerful server.
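To make that caching point concrete, here is a toy sketch of full-page caching, written in TypeScript/Node purely for illustration (a typical PHP site would use a plugin or server-level cache instead). The render function, cache lifetime, and port are all made-up placeholders.

```typescript
// Full-page caching: the expensive dynamic rendering runs once, and
// subsequent requests for the same cacheable page are answered from
// memory, which shortens TTFB even on a modest server.
import { createServer } from 'node:http';

const TTL_MS = 60_000; // hypothetical cache lifetime of one minute
const cache = new Map<string, { html: string; expires: number }>();

function renderPage(url: string): string {
  // Stand-in for slow, dynamic page generation (DB queries, templates, ...).
  return `<html><body>Rendered ${url} at ${new Date().toISOString()}</body></html>`;
}

createServer((req, res) => {
  const key = req.url ?? '/';
  const hit = cache.get(key);

  if (hit && hit.expires > Date.now()) {
    // Cache hit: skip the expensive work entirely.
    res.writeHead(200, { 'Content-Type': 'text/html', 'X-Cache': 'HIT' });
    res.end(hit.html);
    return;
  }

  // Cache miss: render once, store the result, then serve it.
  const html = renderPage(key);
  cache.set(key, { html, expires: Date.now() + TTL_MS });
  res.writeHead(200, { 'Content-Type': 'text/html', 'X-Cache': 'MISS' });
  res.end(html);
}).listen(8080);
```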

NitroPack’s main focus, however, is optimizing the process of visualizing the site and loading all of the resources to the client’s device. This makes our service a great addition to low-powered servers as well as powerful dedicated multi-server environments.

Ivailo Hristov

Passionate about all things performance optimizations, Ivailo is the driving force behind our product and core NitroPack features.