When you’re a content strategist you spend a lot of time explaining what you’re not. No, I’m not an editor, nor a copywriter. Not a marketer. Not a project manager either. And I’m definitely not an SEO guy.
I take a long view of search engine optimisation, and am more than happy to leave the details to people who know more than me. Sometimes though, I get the feeling that those details are taken way more seriously than they should be. This post is my attempt to explain myself (and not make too many enemies in the process).
Search engines want to provide easily-understood information to people who want it. They do that by executing an algorithm that runs rules over the messiness of human desire (as expressed in short queries) and the difficult world of web content (as published by millions of people all over the world).
When you think about it like this, it’s amazing how often Google gets things right.
But it’s more understandable when you remind yourself that Google is made up of thousands of incredibly smart people. And when you remember the world wide web is built on a set of (evolving) standards designed to make content machine readable.
I like to separate the human and machine elements in all this:
A collection of humans, called Google, maintains an algorithm that tells machines to dig through content stored on other machines. That algorithm then scores the usefulness of that content to humans who want to know a particular thing.
At both ends of this simple story you find humans. Machines are just the intermediaries. When you’re too concerned with SEO, it’s easy to focus only on those machines, and not on the people.
Three sides to SEO
The practice of search engine optimization (SEO) splits roughly into:
- Machine semantics: Making your web content intelligible to machines.
- Human usefulness: Making your web content unique, informative and interesting.
- Algorithmic athletics: The constant competition to make your web content appealing to today’s version of Google’s algorithm.
Machine semantics

Machine semantics confer the longest-lasting benefits, because the rules change slowly. Getting this stuff right is a prescriptive exercise – it’s necessary but not sufficient for great-ranking content.
At its most basic this is HTML that marks out headings, or metadata (like titles and descriptions) that properly describes a page. It also encompasses website structure or information architecture. To get machine semantics right means following a set of well-established standards.
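In markup terms, that can be as simple as using real heading elements and descriptive metadata instead of styled, anonymous containers. A minimal sketch (the page, titles and descriptions here are invented for illustration):

```html
<head>
  <!-- Title and description: the metadata machines read first -->
  <title>How to Repot a Cactus | Example Nursery</title>
  <meta name="description"
        content="A step-by-step guide to repotting a cactus without damaging the plant.">
</head>
<body>
  <!-- One h1 for the page topic, h2s for its sections -->
  <h1>How to repot a cactus</h1>
  <p>…</p>
  <h2>Choosing the right pot</h2>
  <p>…</p>
</body>
```

A search bot, a screen reader and a reader-mode browser can all infer the same outline from this, because the structure is in the elements themselves rather than in visual styling.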
Search bots and algorithms are far from the only machines that understand these semantics. When new web technology comes around, it is always built on top of existing standards. New browsers and screen readers are two good examples. If your site and content were semantically structured when Firefox or JAWS was released, then these new applications would have been able to serve your content correctly from day one.
Paradigm-changing hardware like the smartphone was the same. Standards-compliant (i.e. semantically rich) sites had a much easier time going mobile. This will be true in the future as smart speakers, VR-based browsing and other things attempt to re-contextualize the web. They’ll still be reading content and code and looking for standard markers.
So this stuff is crucial, and it will confer long-lasting benefits, but it’s stuff that literally everyone can get right. It can’t give you a big advantage over the pack.
Human usefulness

This is the quality that Google is most concerned with. It evolves over time as people’s preoccupations change, and as sites compete to be more useful than each other.
Being useful to people is much, much harder than creating semantically correct content. It marks you out as a true leader in your field – a genuine source of value for others. It requires an understanding of your target audience and a certain creativity and flair. Most of all, you need to have something interesting to say.
This is where the biggest advantage over the competition is possible. But realising that possibility takes a lot of knowledge and commitment.
The irony of SEO is that human usefulness – the ultimate gold that Google wants to find and deliver to people – can’t be directly detected by machines. Over time, useful content ought to gain traffic and thus float up search rankings, but there’s not going to be an immediate jump when you hit ‘publish’. This is sometimes dispiriting. Other things seem to be more immediately effective. And that brings us to…
Algorithmic athletics

Remember that Google’s algorithm exists only to mediate between people’s search requests and existing web content. It’s never going to be perfect, and it’s going to keep changing.
The smart people at Google detect all sorts of proxies for human usefulness, and code those proxies into the algorithm. Examples of those proxies include:
- Things that are popular.
- Things that are linked to from a lot of different sites.
- Things that use very similar language to the original search query.
- …and so on.
What I call algorithmic athletics is the practice of identifying and replicating these content traits WITHOUT a similar care for human usefulness. At its worst this is purely black hat stuff.
Practices like these can become normal, and rewarding, without anyone ever deciding to be evil. It’s not crazy to create content because people are searching for it, even if you’re not exactly the right people to talk about the topic at hand. It’s easy to put so much emphasis on keywords that your content becomes difficult to read. If your rankings go up as a result, why wouldn’t you follow those incentives?
Because you’re sacrificing human usefulness.
There are countless sites and suppliers dedicated to algorithmic athletics. It’s easier to bump yourself up search results by following today’s “seven hot tips to clock Google!!” than it is to create a truly authoritative, engaging website. But only one of these things is going to end up giving you an unassailable, sustainable advantage.
Google’s own advice is to get the hard, human stuff right
Google’s own guidelines for webmasters are clear. The first “Basic Principle” is to “make pages primarily for users, not for search engines.”
“A useful test is to ask, ‘Does this help my users? Would I do this if search engines didn’t exist?’” – Google’s Webmaster Guidelines
Every now and then some algorithmic athletics work well enough for low-grade content to overwhelm search results. The trick changes each time (link farming, hidden text and keyword density mathematics are historical examples) but the reaction doesn’t. Google’s human side rises up, plugs the algorithmic weakness, and punishes sites that have exploited it.
Sometimes that punishment falls on companies that were deliberately exploiting Google and wasting everyone’s time. Sometimes it falls on competitors who were just doing what they thought was necessary to keep up. Sometimes it falls on companies who didn’t realise they were paying SEO providers for a sugar rush rather than an exercise regime.
After every algorithmic reset a new set of content rises up the rankings. And that content is closer to what Google wants to serve up – stuff that’s useful to humans.
Meanwhile the algorithm athletes restart the race to decode and exploit the new Google update…
So yeah, I’m an SEO cynic
When it comes to a lot of SEO, I’m a bit of a cynic. I believe that as content producers, we ought to put our energy into human usefulness and into machine semantics. Make stuff that helps your users and conforms to web standards. In a way, we’re better off being ignorant of the algorithm’s specifics.
If we get machine semantics and human usefulness right, there are thousands of clever Googlers who are ready to help us out.
But yes, it’s good to know which keywords will play best in the content we’re writing. We know that linking ought to draw connections across our site to help machines, as well as people, understand things. We know that it makes sense to do research into what people are searching for when we decide which topics to address.
But we need to remember that we are people who are serving people. The machines in the middle aren’t the important bit.