utm_source=crap: How Google Analytics Breaks Links + User Experience
Over the years, Google – the dominant analytics provider on the market – and marketers have increasingly inserted so-called utm_source (and similar) parameters into third-party website addresses (URLs).
I have warned against this type of URL or Internet address pollution repeatedly to almost no avail.
Clean URLs have been an SEO and UX best practice for more than a decade now.
Yet it seems we’re in the middle of a widespread regression to the era before WordPress and human-readable website addresses.
More than that: it’s an era where third parties actively get away with harming your website’s usability and findability by inserting unwanted code into your site to make your URLs dirty again.
Google doesn’t care about people
Google takes no prisoners when it comes to tracking people around the Web anyway and many marketers apparently play dirty whenever their bottom line justifies it.
I’m neither a marketer nor a Google shareholder, and thus I view this trend of adding lots of crap to website addresses with growing annoyance.
As an optimizer of websites for people – a popularizer – I’m mostly interested in making things work as smoothly as possible so that my visitors get
- what they want
- when they want it
- how they want it.
Eventually I had to find a more sophisticated solution. In case you just want that, scroll to the bottom of the post.
Tracking visitors across websites
In contrast, Google and marketers are predominantly focused on making money and proving it by tracking people.
You probably know this tracking phenomenon from the ads that follow you around the Web
for weeks after you searched for something on Google or checked out some online stores. I minimize such annoyances by using
- anti-spyware tools
- ad blockers
- the privacy-oriented search engine DuckDuckGo
and even uninstalling Adobe Flash when browsing the Web, since many persistent tracking cookies rely on Flash storage.
While utm_source URL pollution may seem harmless compared to tracking cookies that follow you across most websites, you can’t simply block it.
Every webmaster needs to take care of this parameter pollution him- or herself.
As a user I often clean up the addresses of web pages I visit manually, simply removing the redundant parameters. Why? When copying and pasting addresses into a mail message:
- I don’t want the address to be huge and broken into several parts by the line-length limit of the mail software.
- I don’t want to scare people with strange additions looking like some virus no human can decipher.
- I don’t want people to think that some of the parameters are the actual URL, especially when they include sources like twitter.com.
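The manual clean-up described above is easy to script. Here is a minimal sketch (not part of the original post; the example address is hypothetical) that strips every utm_* parameter from a URL while leaving the meaningful ones intact:

```javascript
// Minimal sketch: remove Google Analytics campaign (utm_*) parameters
// from a URL before pasting it into a mail message.
function stripUtmParams(address) {
  const url = new URL(address);
  // Collect the keys first – deleting while iterating would skip entries.
  const utmKeys = [...url.searchParams.keys()]
    .filter((key) => key.toLowerCase().startsWith('utm_'));
  utmKeys.forEach((key) => url.searchParams.delete(key));
  return url.toString();
}

// Hypothetical polluted address:
console.log(stripUtmParams(
  'https://example.com/post?utm_source=newsletter&utm_medium=email&id=7'
));
// → https://example.com/post?id=7
```

The same few lines work in the browser console and in Node.js, since both provide the standard URL API.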
How Google Analytics breaks links
Long story short: utm_source parameters are a big turn-off for people and can potentially alienate them. Nothing new here. I’ve been preaching that for several years now.
Yet it gets worse. It’s not just about the ugly and misleading looks or the general user experience.
I started to encounter utm_source parameters that actually break links!
Yes, perfectly working links without parameters end up broken after utm_source crap gets added to them. Look at this example I have taken from a real-life newsletter I received:
[update: the link has been “fixed” during a relaunch. Now it ends up on a 404 page with or without the crappy parameters.]
When clicking it in any browser – I tried – you will see an empty page.
It’s not the first time it happens to me that utm_source parameters literally block content.
This is a partial screenshot taken in my Firefox: Yes. It’s blank! The page hasn’t loaded correctly! Without the parameters it works perfectly fine though.
Enlarge your website address now!
The address is pretty long by itself – 117 characters:
yet the added utm_source parameter crap amounts to another 191 characters. Thus the useless tracking part of the URL far exceeds the original URL in length.
Why would you allow someone to sabotage your website address like that?
You may argue that this is an extreme case of website address pollution that rarely happens, so let me show a more common example.
Here’s a link polluted by the Buffer tool many marketers use to automate their content sharing on Twitter and beyond.
The original URL is shorter than the polluted addition, which is also redundant, as you can see.
It says Buffer twice (great advertising for them!) and adds Twitter.com to your URL, so some people and tools might even mistake that for the actual address.
This example shows how three third parties literally conspire to pollute your website address:
- Google with its market dominating Analytics
- the Buffer social media automation tool vendor
- the careless or downright dirty marketers.
All three either don’t care about proper UX and SEO best practices or prefer to sacrifice them in favor of short-term ROI (Return On Investment) measurement to prove their worth.
This way, Buffer will appear as a major source of traffic in your Google Analytics website statistics.
I made the Buffer team aware of these issues the last time I wrote about them, but no change has been made since. They followed me on Twitter but didn’t act on it.
The only revolution is a better solution
Try to add utm_source=crap to my URLs though and see what happens! Just click this link:
The parameters start after the question mark (“?”). They get counted automatically and then discarded.
You only need to add it to your existing Google Analytics code in case you use GA like I still do. There’s another similar script on Stack Overflow.
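I won’t reproduce the exact script here, but the idea can be sketched as follows – a small snippet, run after the Google Analytics code has already recorded the campaign parameters, that rewrites the visible address without reloading the page. Function and variable names below are my own, not from the original script:

```javascript
// Sketch: after the analytics snippet has recorded the utm_* values,
// rewrite the address bar so visitors see (and copy) a clean URL.
function withoutUtmParams(href) {
  const url = new URL(href);
  [...url.searchParams.keys()]
    .filter((key) => key.toLowerCase().startsWith('utm_'))
    .forEach((key) => url.searchParams.delete(key));
  return url.toString();
}

// Browser-only part: guarded so the pure function above stays testable.
if (typeof window !== 'undefined' && window.history && window.history.replaceState) {
  const cleaned = withoutUtmParams(window.location.href);
  if (cleaned !== window.location.href) {
    // replaceState swaps the URL in place – no reload, no new history entry.
    window.history.replaceState(null, '', cleaned);
  }
}
```

Note that this only cleans what the visitor sees: by the time the script runs, the tracking values have already been counted, which matches the counted-and-discarded behavior described above.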
To remove utm_source crap from other sites you visit automatically there is a simple tool as well. You can find and download it on GitHub.
In Firefox you can go to Add-ons -> User Scripts and add it manually as a “New User Script”.
Last updated: March 19th, 2017.