
What Are Alternatives to Tracking Tools That Offer Similar Perks?

You don’t need classic tracking tools like Google Analytics to build and maintain your business or online presence.
There are alternatives that help you, as a webmaster, preserve your independence while granting your visitors and clients privacy at the same time.

Businesses heavily rely on digitalisation, and the classic way to reach the maximum number of customers is through Google. The reason businesses choose to add Google Analytics to their websites is the data analysis tools it offers. These tools help businesses keep track of their website’s performance and determine keywords, behaviours and bottlenecks in their online presence.

But some of Google’s practices have been questionable. And ever since Google Chrome announced the Federated Learning of Cohorts (FLoC) model, serious ethical questions have been raised about its business practices. So how can businesses survive this change while keeping their vow of decency and mutual respect towards customers?

In this article, you’ll find tools that provide the perks businesses enjoyed through Google Analytics, but without going behind the backs of your users and visitors.

Going full-fledged manual on web data evaluation like a pro

If you do not want to use paid analytical tools, here are some awesome hacks that can help you ace your website evaluation and run your business as smoothly as always. 

“Site: domain.tld”

Note this down, because this hack will help you determine how your website is doing in terms of ranking. Type site:domain-name.tld (TLD means “top-level domain”, classically .com or .org) into Google Search and hit “search”.

The results shown are all items Google found on your domain and deems important enough to index. Note that the results may not include every item Google found on your domain; however, they show the items Google deems important, as well as how your website’s individual posts, pages and archives are performing. The first result is the “top indexed” item, thus the one Google thinks is the most important page on your site.

With this method you can quickly find the top-performing items on your website and figure out what Google actually considers important. You can also find possible 404s with this technique, or even rogue subdomains you may have forgotten about.
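You can also hunt for 404s directly against your own list of URLs, without involving Google at all. A minimal Python sketch (the URLs are placeholders, and the checker function is injectable so it can be tested offline):

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def check_status(url):
    """Return the HTTP status code for a URL (200, 404, ...)."""
    try:
        with urlopen(url) as resp:
            return resp.status
    except HTTPError as exc:
        return exc.code

def find_broken(urls, checker=check_status):
    """Return the subset of URLs that answer with 404."""
    return [u for u in urls if checker(u) == 404]

# Hypothetical usage against your own domain:
# print(find_broken(["https://example.com/", "https://example.com/old-post"]))
```

Feeding this the URLs from your sitemap gives you a quick dead-link report whenever you want one.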

How to figure out “search phrases”

Google Analytics was very helpful in identifying search phrases for search engine optimisation. It was easy to figure out what customers, users and netizens were looking for. If we do not want to use Google Analytics, there is another, almost simpler and less intrusive technique.

A manual way of doing it is by typing (any) keywords in the Google Search bar and waiting for Google’s suggestions. Google will suggest the most searched keywords related to your desired topic.

For example, you can just type “Books”, and a list of suggestions will appear.
You can use those suggestions as your keywords or even as long-tail search phrases.

Just beware that when we search for something while logged in to a Google Account, our device stores the query in its cache for next time. Those cached search phrases are logically not what we are after; we want only the genuine results (Google’s suggestions). Thus, either log out of Google, or make sure not to interpret suggestions that carry a “Remove” option as actual search suggestions by Google.

Any suggestion by Google is basically based on other people’s searches. Thus, if Google suggests “Bookshelf Minecraft” upon typing “Books”, that means many other people search for “Bookshelf Minecraft”, and you know this is a “hot topic” related to books.
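If you want to collect these suggestions programmatically, the browsers themselves use an unofficial suggest endpoint that returns JSON. This is not a documented, stable Google API, so treat the sketch below as an assumption that may break at any time:

```python
import json
from urllib.parse import quote

# Unofficial autocomplete endpoint used by browsers; not a documented,
# stable Google API -- it may change or disappear without notice.
SUGGEST_URL = "https://suggestqueries.google.com/complete/search?client=firefox&q={}"

def suggest_url(term):
    """Build the request URL for a search term."""
    return SUGGEST_URL.format(quote(term))

def parse_suggestions(body):
    """The endpoint answers with JSON shaped like ["books", ["bookshelf minecraft", ...]]."""
    _term, suggestions = json.loads(body)[:2]
    return suggestions
```

Fetching `suggest_url("books")` and passing the response body to `parse_suggestions` yields the same list you would see under the search bar, without being logged in to any Google account.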


The Internet is all about fads, hypes, and trends. When you want to know the most searched phrases, just go to Google Trends. You can insert any keyword in the top bar to find out how it performs worldwide in Google Search. There’s also a “Current Trends” section, showing the latest search trends of netizens worldwide.

There’s a “similar search phrases” section, which will help you find keywords associated with your search term, as well as long-tail search phrases. Long-tail keyword phrases are more specific and laser-focused than short-tail keywords: “Books to read” is a short-tail phrase, whereas “Books to read about Central European history” is a long-tail keyword phrase with a very well-defined destination in web search.

Note that this page is location-based, so you may want to change the location in the search tools of the Google Trends page to get results for worldwide searches or localised to a specific area.
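As a small convenience, the Trends explore page takes its keyword and location as plain URL parameters, so you can build links to it directly. This sketch assumes the current URL layout of trends.google.com:

```python
from urllib.parse import urlencode

def trends_url(term, geo=""):
    """Build a Google Trends explore URL; an empty geo means worldwide."""
    params = {"q": term}
    if geo:
        params["geo"] = geo   # e.g. "DE" for Germany, "US-CA" for California
    return "https://trends.google.com/trends/explore?" + urlencode(params)
```

For example, `trends_url("books", geo="DE")` links straight to the German trend curve for “books”.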

How to submit a Sitemap without Google?

A sitemap is a file that carries all your domain’s information (pages, pictures, videos, or any other files on your website). Google and other search engines can use this sitemap as a directory to find and crawl pages on your website. It is not required; if your site is well linked internally, modern crawlers will find the content either way. But it is a great tool to point out more hidden content, or to give weight and other crawl instructions to the bots.
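A minimal sitemap is easy to generate yourself. This sketch uses only the Python standard library and follows the sitemaps.org 0.9 schema; the URLs are placeholders:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org 0.9 protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal sitemap.xml document for the given page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Hypothetical usage:
# open("sitemap.xml", "w").write(build_sitemap(["https://example.com/",
#                                               "https://example.com/about"]))
```

Optional elements such as `lastmod` or `priority` can be added as further sub-elements per URL.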

One of the most asked questions when we suggest not using Google is: “But then, how do we submit our sitemaps to Google?”

First of all, there are several validation tools on the Internet that do not need support from Google to validate your sitemaps. Just remember that it wasn’t Google who invented sitemaps; thus, look out for the “real guys” and let them validate your sitemap.

You can get validation by submitting your sitemap to XML-Sitemap. The tool will tell you whether your sitemap is valid or not, and indicate any problems.

Second, you do not need to submit your sitemap or register it anywhere; as long as it is valid, the bots will find it. You can, and should, also add the sitemap location to your site’s robots.txt file. This file is read first by any serious bot visiting your site, so it will immediately find your sitemap.
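For example, a robots.txt at your web root could advertise the sitemap like this (the domain is a placeholder):

```
# robots.txt at https://example.com/robots.txt
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
```

The `Sitemap:` line is part of the de facto robots.txt conventions and is understood by all major crawlers.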

That’s It!

How to avoid failing CTAs and “measure” their functionality?

To ensure that all of your site’s call-to-action buttons are fully operational, you can ask friends or peers to visit your site and review it.

You can even hire an online assistant to evaluate your site from time to time and keep you updated. Since your users are human beings, why bother relying on bots? Test your prototype with actual humans.

The classic “Ask your Grandma (or anyone, within your target audience) to visit your site and tell you what she thinks” is one of the most effective evaluations you can get.

I really need analytics. I can’t do without.

The self-hosted solution, not the cloud solution!

If you still think you need analytics with more insight, you have better choices than Google, while retaining fully valid and professional insight into your site’s performance and visitor behaviour.

It is common knowledge that each website’s data is stored on its server.
So instead of using third-party cookies or “spying” tools, why not capitalise on your own data by storing the analytics on your own server?

In doing so, you have full control over the tracked and analysed data, and you can process your users’ data-deletion requests (so no other spybots will keep following them).

You can be as discreet or transparent with your customers as you want, depending on the nature of your business (for example, data from online therapy or health services is the user’s very personal data and must be protected). Plausible is the most favoured self-hosting solution with analysis tools that you can customise, and it does not manipulate or retain your users’ data in any external instances (like Google does). Plausible uses no cookies and works under the General Data Protection Regulation (GDPR), the e-privacy directives (Privacy and Electronic Communications Regulations, PECR), and the California Consumer Privacy Act (CCPA).

Plausible is a European Union-made analytics service that drew its rules from European laws and ensures users’ privacy without damaging businesses’ growth. It is a very convenient and reliable solution.

They have a cloud-based solution (paid, and external) but also a self-hosted solution (free, and you own your data!).
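For the self-hosted variant, embedding is a single script tag. The instance and site domains below are placeholders you would replace with your own, and the exact script filename may differ between Plausible versions:

```html
<!-- Self-hosted Plausible snippet (hypothetical domains):
     analytics.example.com is your own Plausible instance,
     example.com is the site being measured. -->
<script defer data-domain="example.com"
        src="https://analytics.example.com/js/script.js"></script>
```

Because the script is served from your own instance, no third party is involved in the measurement.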

How to validate Schema structured data, also known as JSON-LD?

In 2021, almost every website uses structured data schemas. Google Analytics and other tracking tools automatically tell us when there are problems with our structured data snippets. Convenient! But you need to sign up for it, and thus also let Google track your site.

If you do not register your site with Google Analytics, you won’t receive these warnings, of course.

However, you can simply test your schema data using (oh, irony) a schema structure testing tool made by Google. Since it does not need any subscription, is free, and can be used without a tracking code, it allows you to test your data structure without actually letting Google see other parts of your site and, especially, while retaining your visitors’ privacy.
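As a reminder of what is being tested, a minimal JSON-LD snippet in a page’s head looks like this (all values are placeholders):

```html
<!-- Minimal JSON-LD structured data for an article; values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Alternatives to Tracking Tools",
  "datePublished": "2021-01-01"
}
</script>
```

Pasting the page URL or this markup into the testing tool is all that is needed; no tracking code ever touches your site.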

A word on CDN

Google is not the only instance tracking your data.

As far as CDNs (Content Delivery Networks) such as Cloudflare are concerned, they likewise have tracking and statistics set up which “help” you view your site’s performance. However, again at the expense of your and your visitors’ privacy.

Ditch CDNs and optimise your website professionally.
It will cost a little extra, but it will earn you the trust of your customers.

Removing the CDN will not affect your site’s efficiency; with larger traffic, all you’ll need is better optimisation.
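What “better optimisation” can look like in practice: compression and long-lived caching for static assets, sketched here as a hypothetical nginx fragment (adapt the file types and lifetimes to your stack):

```nginx
# Hypothetical nginx tuning that replaces much of what a CDN provides:
gzip on;                          # compress text responses on the fly
gzip_types text/css application/javascript image/svg+xml;

location ~* \.(css|js|png|jpg|svg|woff2)$ {
    expires 30d;                  # let browsers cache static assets
    add_header Cache-Control "public, max-age=2592000";
}
```

Together with properly sized images and minified assets, this covers the bulk of the speed-up a CDN would otherwise give you, while keeping TLS end to end.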

Remember that any CDN which transfers data from your visitor to your server terminates TLS (Transport Layer Security) and then re-encrypts the communication on their node. This means the famous “HTTPS” security is completely lost on the CDN server, because they terminate TLS. You are thus relying entirely on the CDN provider’s word that they will not share the decrypted data with third parties.

Remember that CDN operators (amongst others, Cloudflare) are registered companies in countries that may at any time force them to disclose that traffic. Classically, if anyone other than a CDN terminated TLS between your server and the user, we would be speaking of a “man-in-the-middle attack”. So why trust a company just because it states, on a virtual piece of paper, that it won’t disclose this data or use it to its own advantage?

It borders on irrationality to trust a company that knowingly terminates TLS, but then to freak out when a website does not show the green HTTPS lock…

By not using CDN, you can guarantee end-to-end encryption, and if you optimise your website you will actually get better results than with a CDN.

The removal of CDN did not affect our website’s performance, rather, it improved it!

Page sizes and requests improved after removing the CDN.

The site handles traffic proficiently and loads as quickly as it did before removing the CDN, if not faster.


Letting go of Google Analytics might sound a bit challenging, but it is a long-term decision that’ll serve your business well.

Your website will be compliant with all the privacy-protection regulations, reliable and reputable.

Trust is the currency of business, and being under fire for unethical practices will damage your business severely. 
