October 7, 2016 12:23 PM / by Joshua Ballard

It seems the moment June or July hits, a metaphorical arms race begins to be the first person to
publish an article entitled ‘Optimising Your SEO for 2017’ (or whatever the next year is).

There seems to be a little bit of creep every year in how early some content writers decide it is time to stop thinking about this year and start planning for the next. It is almost as though, if they can show they are the one planning ahead the soonest, you will instantly be in awe of how prepared they are.

I suppose in that vein, a title such as ‘plan for 2027’ could very well be much the same, only worse…

For the Foreseeable Future Our Industry Will Be Dominated by Google

The real question here becomes ‘just how much of the future can we even see?’

One of my favourite bloggers ever is Tim Urban, and his article on the AI Revolution literally struck a chord with me. OK… figuratively.

The thing is, if you were to look back at the state of the Internet and SEO in 2006, you would barely recognise it next to the advances that have since been made in this space.

The technological and computing power of an update as advanced as Penguin 4.0 would have been a mere pipe dream to Google engineers 10 years ago, yet here we are. That’s not to even speak about RankBrain and the implications that come with that.

Even if the future will eventually play out to some form of ‘balance within the search engine force’ and another player rising that can rival Google, we have to assume one thing…


Any Google Competitor Will Be Finding Ways to Optimise the Same things: Only Better

Google built their fortune and prominence on simply being the best at delivering the content that people wanted or needed. It’s a pretty simple premise.

In exchange for being the best method for people to get the answers they seek, people have been more than willing to hand huge stockpiles of their personal data to Google.

The trade-off seems fair to many people, insignificant to others, and there are plenty who don’t actually realise that this is what is happening.

It would seem the average person doesn’t care that much about their information being harvested and used for profit, just as long as they are getting the most relevant and useful responses to their search queries.

The emphasis here is on relevant and useful.

If another company can find a better way to deliver the most useful and relevant content to users, and they can get the message out to the wider population, then they may just topple Google. Who can honestly know what the next 10 years hold?

One thing however can be safely assumed…


Most of the Ways Google Filters and Ranks Results Make Sense

If we look at something as rudimentary as refusing to award rankings to a site that has thousands of tier-one forum comment links built with an automated tool, then we can see that it makes perfect sense.

Why would a user find content that has been promoted with simple automation more useful than content that people have actually read, reflected on and then decided to share?

Algorithm updates such as Panda and Penguin can be dissected and broken down for all of eternity. People can run tests, experiments and all sorts of assessments to try to find the exact mechanics and breakdown of what has changed.

If you take a step back, though, and look at the larger picture, it becomes pretty clear that the main point is this: rankings should not be awarded to someone simply because they know how to game the system better than someone else.

The quality of someone’s content has very little to do with the methodology the content writer then used to promote it. So the algorithms are designed to try to provide an equal footing for content that may be superior, but not as aggressively optimised.

Don’t get me wrong: it’s easy to judge Google and their constant algorithm updates as simply moving the cup frequently enough that SEO can never remain as stable as just using AdWords, but I for one don’t think this is their primary motivation.

Furthermore, if we understand the essence of what Google is trying to do, then we understand that…


There is a Whole World of Possibilities with how Search Engines May Rank Content in the Future

I was recently discussing dwell time with a new partner, as we are looking to provide a makeover to one of their underperforming sites.

Dwell time is one of those metrics that many people talk about, but it has never been completely confirmed whether it is actually a ranking signal or not.

As Joshua Hardwick points out in his recent Ahrefs post on Dwell Time, there is no official statement from Google on whether this is in fact a ranking signal or not. However, there is a great reason to believe that it either is, or will be in the future.

A forward thinking SEO will often find themselves in this situation.

At the moment, we don’t actually know if this is or isn’t a ranking signal, but we can see quite easily how it could be a good indicator of the quality of a piece of content.
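To make the idea concrete, here is a minimal sketch of how dwell time could be measured from your own session data. The function name and the hypothetical session timestamps are my own illustration, not an official Google metric or API; the signal is simply the gap between a visitor landing on a page and heading back to the search results.

```python
from datetime import datetime

def average_dwell_time(sessions):
    """Average seconds between a visitor landing on a page and
    returning to the search results, across recorded sessions."""
    durations = [
        (returned - landed).total_seconds()
        for landed, returned in sessions
    ]
    return sum(durations) / len(durations) if durations else 0.0

# Hypothetical sessions: (time landed on page, time returned to results)
sessions = [
    (datetime(2016, 10, 7, 12, 0, 0), datetime(2016, 10, 7, 12, 2, 30)),
    (datetime(2016, 10, 7, 13, 0, 0), datetime(2016, 10, 7, 13, 0, 10)),
]
print(average_dwell_time(sessions))  # 80.0
```

A long average suggests visitors found what they came for; a string of ten-second visits suggests the content failed them, which is exactly why the metric is plausible as a quality indicator.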

The question becomes, should we optimise for dwell time now even though we don’t actually know if it is a signal, or should we stick to what can be confirmed?

My thoughts have always been…


Regardless of Confirmation, If We Know It Could Be Used as a Quality Signal, We Should Optimise for it

If you had told an SEO ten years ago that Google would eventually introduce an update that could cross-check whether the keywords on a page are semantically related to the queried keyword, they may have laughed at you there and then.

The update in question is, of course, Hummingbird, and the way SEOs adapted to it was by using LSI keywords.

Hummingbird wasn’t designed to become just another game of cat and mouse between an SEO and the algorithm, it was designed to be able to reward rich and relevant content with the rankings they deserve.

If you have written an article on something, and there are certain words that would generally be needed in the exploration of that topic, then it makes sense that Google would want to use the existence of these peripheral words as a way to spot check for quality.
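The spirit of that spot check can be sketched in a few lines. This is purely an illustration of the principle, not Google's actual algorithm; the example article and the list of peripheral terms are hypothetical.

```python
def topic_coverage(text, peripheral_terms):
    """Fraction of expected peripheral terms that appear in the text:
    a crude stand-in for the semantic spot check described above."""
    words = set(text.lower().split())
    present = [t for t in peripheral_terms if t.lower() in words]
    return len(present) / len(peripheral_terms)

# Hypothetical article and the peripheral vocabulary its topic implies
article = "espresso is brewed by forcing hot water through finely ground coffee"
terms = ["espresso", "water", "coffee", "crema"]
print(topic_coverage(article, terms))  # 0.75
```

A genuinely thorough piece naturally scores high on a check like this without the author ever thinking about it, while thin, keyword-stuffed content does not.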

People who were already providing rich and useful content on a topic were rewarded for having been useful, whilst webmasters who had previously been keyword stuffing to their hearts’ content were taken down a peg or two.

In one scenario the writer was optimising for quality and human consumption, and in the other scenario the writer was optimising for how they knew the algorithm was programmed.


No Matter How the Algorithm Changes, The Signals that Indicate Authority and Trustworthiness Will Not Change

If you just remember one simple rule, you will always be fine.

That simple rule is that the algorithm is changing and adapting in an ongoing attempt to understand human beings better.

The more you think about how you can show a human being that you are trustworthy or authoritative, the more you will end up showing a search engine in 2027 that you know what you are talking about and can be trusted.

Joshua Ballard

Joshua founded Paradox Marketing with the goal of providing high-quality Inbound Marketing services. Above all he values efficiency, transparency and communication.
