Whilst governments attempt a massive crackdown on email spammers via laws like GDPR, some of the world’s largest tech behemoths are working hard to eliminate email and link building tactics that sit on the negative end of the bell curve. But instead of regulating with punitive legal measures, they regulate by training their algorithms to learn, adapt and improve how they read, interpret and act against some of the internet’s most prolific backlink spammers. In short, the bots are getting closer to reasoning as humans would when it comes to identifying poor online content or tactics that add little value for actual users.
As the bots get smarter, even the best manual backlink strategies will eventually be detected by smarter, always-learning webcrawlers. Abandoning manual link building altogether is ill-advised, but extreme care should be taken when doing it, to ensure the path of least resistance is not a shady shortcut.
The Early Days of Link Building Are Dead
In the early days of the internet, a wild west land grab was truly afoot. In those days, simple exact match domains (EMDs) could be purchased, a few low-quality links thrown their way and voila: you’re ranking for high volume search queries.
In fact, in the earliest days, there were no algorithmic checks for site gestation (often referred to as the “Google Sandbox”) that would keep sites’ rankings muted for months or longer, to ensure they were not wielding some fly-by-night, flash-in-the-pan business model for selling shady products/services.
No, in those days, you could rank easily with a basic knowledge of SEO.
Contrast that to today. A more mature internet means billions of pages of indexed content, and exponentially more legitimate backlinks among them. In fact, it is estimated that some 4.4 million blog articles are posted daily.
A more established internet also means that some sites have backlink profiles that are decades old, including millions of relevant inbound links. These types of signals are extremely strong to search engine spiders, allowing the big to get bigger and the small to stagnate without a whole lot of hustle.
This is one reason why, when it comes to online marketing in competitive industries, startups are in a perpetual state of playing catch-up with larger, more seasoned rivals. It’s at least one reason why startups should seek to differentiate, not revolutionize the world.
As recently as a decade ago, startups had a better chance at competing by creating low-quality content from link wheels, link farms, private blog networks, sitewide links, web directories and other tools for automatic link generation.
Fortunately for the users, today’s algorithms are getting better at outsmarting spam.
How Search Engines Interpret Link Spam
A year ago we experienced algorithmic blocks on emails that were incorrectly attributed to spam. The experience is a great case study not only in how legit senders can get caught in the crossfire, but also in how algorithms are now created with their own checks and balances to weed out bad actors.
In this case, we had established new email accounts under a brand-new domain name, but these emails were attached to an established business, so several of the accounts were tied to existing email sequences for marketing. These sequences were accustomed to sending more than 200 emails a day from automated marketing chains.
However, due to the newness of the accounts, anything over 50 emails started causing an automated throttle-down to occur on the server. This happened daily until the account was considered properly seasoned, which took over a month. In the interim, even the folks in support could not override what was baked into the algorithm.
The machines had interpreted the account as 100% new, assuming that a new business account would not normally send large numbers of emails immediately. We were told it was one of many failsafe layers included in the email algorithm, designed to prevent large-scale spam emails.
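The warm-up behavior we ran into can be sketched as a simple rate-limit rule. The numbers below (a 50-email starting cap, a roughly month-long seasoning window and the 200-a-day target) come from our experience above; the function itself is purely illustrative and not any provider’s actual algorithm.

```python
# Illustrative sketch of a new-account email warm-up throttle.
# Caps and window lengths are taken from the anecdote above;
# real providers do not publish their limits.

def daily_send_limit(account_age_days: int,
                     base_limit: int = 50,
                     seasoned_limit: int = 200,
                     seasoning_days: int = 30) -> int:
    """Return the maximum emails a new account may send on a given day."""
    if account_age_days >= seasoning_days:
        return seasoned_limit  # account is considered "seasoned"
    # Ramp linearly from the starting cap toward the full limit.
    ramp = (seasoned_limit - base_limit) * account_age_days // seasoning_days
    return base_limit + ramp

def should_throttle(sent_today: int, account_age_days: int) -> bool:
    """True when the server would begin throttling further sends."""
    return sent_today > daily_send_limit(account_age_days)
```

On day one this sketch throttles anything past 50 emails, exactly the wall we hit, while the same 60-email batch sails through once the account passes the seasoning window.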
Search engine algorithms interpret link spam in much the same way as those automated email spam blocks: they look for patterns inherent and common among those building links in a spammy fashion. Here are just a few.
Link Velocity – How quickly are links being acquired? Are massive numbers of similar links being pointed to specific pages all at once, or has the growth in inbound links occurred more naturally over an extended period?
Anchor Text & IP Variability – Is your website using the same term over and over to rank for “buy used cars online”? Are your links coming from different IP addresses? How varied are those IPs?
Link Variability – Are there more incoming site links to specific pages targeting highly-competitive keywords? When it comes to outbound links, is there a good mix of both internal and external links from the site in question? Does each article being written include outbound links to other sites using commercial terms, or are the links natural and pointing to pages relevant to the user experience that answer actual user queries? This also applies to sitewide links, including those in the footer or on a blog sidebar.
Co-Citation & Applicability – Are references in specific articles applicable to other links referenced in the same article? Are sites whose theme or business typically revolve around travel suddenly linking to sites talking about CBD or insurance in a way that looks outlandishly like link spam?
Domain & IP History – Has the domain been owned by the same person or entity since inception? If not, when the shift occurred was there a simultaneous shift in the theme, content and industry sector of the site itself? Was there a major shift in how frequently the site was posting? Has the way the outbound links looked varied heavily?
Redirects – Unless a company is doing a complete rebrand, links that were once ranking on one site should not regularly be funneled to another site to pass link equity. If redirects are occurring, it is more natural to have them occur sitewide.
TLD Extension – Certain TLD extensions are more frequently used for creating spam. Top Level Domain extensions like .info, .biz and even .co have been notorious for this.
Site Legitimacy – Illegitimate websites typically lack things like social accounts with legitimate followers in tow. In contrast, links from real websites tend to come with those signals of legitimacy attached.
In each of the areas listed above, robot algorithms are getting better at telling the real from the fake. They are also getting better at spotting the patterns that constitute a spam backlink. In cases where a telling number of violations occur in any of these areas, bots may trigger a manual review that could jeopardize all the time you have spent building out your SEO.
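To make the pattern-matching concrete, here is a toy scorer covering three of the signals above: link velocity, anchor-text repetition and IP diversity. The thresholds are invented for illustration only; production ranking systems weigh far more signals, far more subtly.

```python
from collections import Counter
from datetime import date

# Toy heuristic scorer for a backlink profile. Each link is a dict like
# {"date": date, "anchor": str, "ip": str}. The 50% and one-in-ten
# thresholds are made up for this example.

def spam_signal_score(links: list[dict]) -> int:
    """Count how many spam heuristics a link profile trips (0-3)."""
    score = 0
    # 1. Link velocity: a large share of links landing on a single day.
    per_day = Counter(link["date"] for link in links)
    if per_day and max(per_day.values()) > 0.5 * len(links):
        score += 1
    # 2. Anchor text variability: one phrase dominating the profile.
    anchors = Counter(link["anchor"] for link in links)
    if anchors and anchors.most_common(1)[0][1] > 0.5 * len(links):
        score += 1
    # 3. IP variability: links concentrated on very few source IPs.
    ips = {link["ip"] for link in links}
    if links and len(ips) < max(2, len(links) // 10):
        score += 1
    return score
```

A profile of ten identical links acquired the same day from one IP trips all three checks, while ten links spread across dates, anchors and IPs trips none, which is the basic shape of the distinction the real algorithms are learning to draw.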
The Misnomer of Negative SEO
Ten years ago, a strategy known as “Google Bowling” was effectively implemented by competitors looking to tank sites ranking higher than them in the SERPs. To pull it off, competitors could hire cheap contractors from sites like Fiverr to quickly pollute the link profile of specific pages on a website, using spam sites and manipulated links with phrases that only spammers would use.
Luckily for most legitimate businesses, the algorithms can almost bat a thousand when it comes to completely ignoring links from heinous attacks like this.
While such attacks are obvious, knowing how they occur and knowing how manipulative link building occurs are one and the same. If you find someone manipulating their link profile, it becomes that much easier to use them as a case study and find the pattern that helps identify others doing the same thing.
Find the pattern and you can find the offenders.
How the Future Looks for Spam Link Builders
The human, manual elements of SEO are quickly disappearing. Yes, humans currently control the bots that are making the decisions, and humans are creating the content that is being consumed, but what happens when the algorithms get smart enough to know that the links they are seeing look manipulative, unnatural or downright shady?
More importantly, as search algorithms improve, they continue to place less emphasis on hyperlink-based signals in general, instead favoring user dwell time, bounce rates and direct user feedback. Consequently, even those who play by the rules when it comes to manual link building are likely to be demoted relative to competitors if their user experience is not at least on par.