When someone asks me what SEO is, I usually say that SEO is not a discipline for the poor in spirit.
The evolution of Google (and of the other search engines) is constant and, as we have seen over the last 18 months, sometimes even frantic.
Therefore, if you have problems questioning your own certainties, if you don’t have an inquisitive mind, and if you don’t recognize yourself among those who like to break things in order to understand how they work, then you should choose another digital marketing profession, not SEO.
Focusing only on Google, in the last 18 months (essentially since John Giannandrea replaced Amit Singhal as Head of the Search Team), we saw big and radical changes in Google search, which we can summarize under two main concepts:
- Machine Learning;
- User Experience.
Moreover, we can consider these two concepts as the most modern answers Google gives to the same old question it has asked itself since 1998: how to offer users the best results for their queries?
Google and Machine Learning
Let’s be totally clear: Google is (still) not using Machine Learning to determine the ranking of a web document.
Google is using Machine Learning – as far as we know – to better interpret and understand the ever-growing amount of content published every day.
It is no coincidence that the most interesting papers about its use of Machine Learning relate to:
- Natural Language comprehension;
- Image comprehension (and parsing in general);
- Classification, by which I mean information retrieval and a better comprehension of the ontologies, taxonomies, and categorizations that we humans perform almost automatically when we refer to this or that, but which bots and their algorithms could never grasp exhaustively, because we have the “bizarre” habit of “leaving out logical steps” without losing the core of our reasoning when we think.
Natural Language Comprehension
Mobile has become the main place where we surf the web, and the growth of voice search has changed how we search for things.
However, this is nothing but the old difference between queries and keywords: it is now more useful to think of queries first, and no longer of keywords as such.
This implies creating content that can answer several logically related queries and, consequently, that targets more keywords consistent with the intentions behind those queries.
But what is Natural Language?
Simple: language as it is normally used in a plain conversation between humans. Nothing more and nothing less.
Obviously, we could dig into the different tones Natural Language can assume depending on the context of the conversation, but the rule of thumb about it will not change.
Moreover, as SEOs, we will still need to focus on the keywords we consider most important, but following the old best practice of not over-optimizing our texts, enriching them – instead – with the right use of synonyms, antonyms, semantically consistent rhetorical figures, and related concepts and entities.
Using a very simple example, if we are writing about Apple the technology company, we would naturally use words, concepts, and entities like iPhone, MacBook, hardware, software, operating system, iOS, Steve Jobs, et al.
On the other hand, if we are writing about apple the fruit, then our dictionary will include organic, fruit trees, mountains, juice, et al.
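To make the idea concrete, here is a toy sketch in Python – not in any way how Google actually works, and the entity sets are invented for illustration – of how surrounding entities can disambiguate an ambiguous term like "apple":

```python
# Toy sketch: disambiguating "apple" by overlap with related-entity sets.
# The entity sets are invented for illustration; this is NOT Google's method.

CONTEXTS = {
    "Apple (company)": {"iphone", "macbook", "hardware", "software", "ios", "steve jobs"},
    "apple (fruit)": {"organic", "fruit", "tree", "juice", "harvest"},
}

def disambiguate(text: str) -> str:
    """Pick the context whose entity set overlaps most with the text."""
    words = set(text.lower().split())
    return max(CONTEXTS, key=lambda c: len(CONTEXTS[c] & words))

print(disambiguate("the new iphone runs ios on apple hardware"))
# → Apple (company)
print(disambiguate("fresh apple juice from organic tree varieties"))
# → apple (fruit)
```

The point of the toy is simply that the co-occurring entities, not the ambiguous keyword itself, carry the meaning.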
Natural Language, then, also means writing in a structured way, as when we write something in Word, with the only difference that what in Word is a Title (Title 1, 2, 3, et al.) is in HTML a Heading (H1, H2, H3, et al.).
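As a small illustration of that Word/HTML parallel, the following Python sketch (the page fragment is invented) reads an HTML fragment's headings back as the outline they encode, using only the standard library:

```python
from html.parser import HTMLParser

# An invented page fragment; the headings encode the document outline.
PAGE = """
<h1>Summer Bags</h1>
<h2>Canvas Bags</h2>
<h2>Leather Bags</h2>
<h3>Care Tips</h3>
"""

class HeadingOutline(HTMLParser):
    """Collect (level, text) pairs for every <h1>..<h6> heading."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None

    def handle_starttag(self, tag, attrs):
        # Match h1..h6 tags and remember their level.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])

    def handle_data(self, data):
        # The text right after a heading tag is the heading itself.
        if self._level and data.strip():
            self.outline.append((self._level, data.strip()))
            self._level = None

parser = HeadingOutline()
parser.feed(PAGE)
for level, text in parser.outline:
    print("  " * (level - 1) + text)
```

A well-structured page yields a clean, indented outline; a page with no heading hierarchy yields nothing for a machine to hold on to.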
Image Comprehension (and parsing in general)
The use of mobile is strictly linked to the success of visual search.
This partly explains the success of totally visual apps like, for instance, Instagram, or of those that have images (and videos) as one of their main content assets.
Thanks to Machine Learning, Google has advanced spectacularly in recognizing what is represented in images and, consequently, in classifying them.
The evolution is such that we can already see cases where Google is not only able to read text within a photo, but even indexes that content!
However, we can – and should – help Google better understand the meaning of our visual and non-visual content and the relationships between the elements that compose it.
This means prioritizing structured data in our SEO Strategy.
There are many reasons for that prioritization, these being the most interesting ones:
- Google rewards websites implementing structured data with rich results like, for instance:
- Domain category listings (or carousels), which allow a user to see all the items of a category page directly from Search;
- Related Items in Google Images, so that a user who is searching for – for instance – summer bags can see other bags suggested as related items, enriched with price and a link to their corresponding product pages.
- Because the better we explain to Google what the things presented in our content mean, the better it will classify the content itself, relating it to the most appropriate set of queries (and keywords). This does not mean that our content will rank better, but it does mean it will have more opportunities – provided it also answers the ranking factors positively – to be used by Google for the larger set of queries made about the topics our content covers.
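In practice, structured data is most commonly added as JSON-LD markup. Keeping the summer-bags example, here is a minimal sketch of schema.org Product markup built with Python's standard json module – the product name, price, and URLs are invented for illustration:

```python
import json

# A minimal sketch of schema.org Product markup in JSON-LD.
# Product name, price, and URLs are invented for illustration.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Canvas Summer Bag",
    "image": "https://example.com/img/canvas-summer-bag.jpg",
    "offers": {
        "@type": "Offer",
        "price": "39.99",
        "priceCurrency": "EUR",
        "url": "https://example.com/bags/canvas-summer-bag",
    },
}

# The resulting snippet goes inside a <script type="application/ld+json">
# tag in the page's HTML.
snippet = json.dumps(product, indent=2)
print(snippet)
```

Markup like this is what makes a product eligible for price-enriched related items and similar rich results.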
Classification: ontologies, taxonomies, and categorizations
The human brain naturally thinks in categories.
Categories, then, are founded on the recognition of a common ontological domain between objects and concepts, which we classify using a different kind of taxonomies.
For instance, if we think of the broad ontology “Animal Kingdom”, we know we are talking about entities like human beings, cats, fish, and snakes, and not about plants and rocks.
Moreover, we know that there are two main kinds of animals: invertebrates and vertebrates.
And that these latter are classified into fish, amphibians, reptiles, birds, and mammals.
Then, digging down the classification, we reach the distinction between the different kinds of primates, among them:
- Homo sapiens (us)
The same logic is used to create taxonomies based on physical characteristics, non-material ones, distance, time, and so on.
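The Animal Kingdom example above can be sketched as a nested structure – the same kind of explicit hierarchy a site's category tree and navigation should expose. A minimal Python illustration:

```python
# The taxonomy from the text, as a nested dict: each key is a category,
# each value its subcategories.
ANIMAL_KINGDOM = {
    "invertebrates": {},
    "vertebrates": {
        "fish": {}, "amphibians": {}, "reptiles": {}, "birds": {},
        "mammals": {
            "primates": {"Homo sapiens": {}},
        },
    },
}

def path_to(node, tree, trail=()):
    """Return the category path leading to a node, or None if absent."""
    for name, children in tree.items():
        if name == node:
            return trail + (name,)
        found = path_to(node, children, trail + (name,))
        if found:
            return found
    return None

print(" > ".join(path_to("Homo sapiens", ANIMAL_KINGDOM)))
# → vertebrates > mammals > primates > Homo sapiens
```

On a website, that path is exactly what breadcrumbs, URLs, and internal linking should make visible to both users and bots.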
Ontologies, taxonomies, and categorizations are logic, and on logic is based the human thinking that search engine algorithms try to replicate in order to offer better answers to users.
Therefore, although all of this has been basic SEO since its very beginnings, right now it is even more important, because a correct information architecture implies creating solid consistency in our site, made explicit by a correct navigation architecture and solid internal linking, which can create logical connections between different but related documents on our website.
Because of this, one of the biggest mistakes we see affecting the SEO visibility of websites is incorrect categorization, along with the sort of frenzy many website owners suffer from, which leads them to endlessly create and eliminate taxonomies and category elements.
Google and User Experience, or what Quality means for Google
User Experience – or, to use a catch-all word, “quality” – is the other pillar on which Google is reinventing itself as a search engine.
A lot of confusion exists about the meaning of quality because – let’s be clear – its meaning can change quite a bit from one person to another. In other words, we tend to be slaves of subjectivity when it comes to defining Quality.
Generally, we marketers tend to judge Quality in two mistaken ways:
- Content is quality content because we like it (the classic syndrome of looking at yourself in the mirror);
- Quality is usually measured on content and never on its containers.
These are two big mistakes, because they amount to thinking of Quality as something merely aesthetic and not as something that must always refer to the search experience inside Google.
A mistake that leads, for instance, to the creation of beautiful but totally deficient and ineffective websites.
When we think about Quality, we must ask ourselves this question:
What is the idea of Quality that Google thinks its users have when doing a search?
The answer is not that hard to discover, because we can find pieces of it in practically everything Google publishes, in how it is “guiding” website owners toward implementing one technical solution and abandoning another, and so on.
For Google, a web document is of good quality when:
- It positively answers the search intent of a user;
- It offers a clear answer to the questions the user is implicitly and explicitly asking when searching for something;
- It is easy to consult, especially from mobile;
- It offers a secure environment to the user who consults it, especially if the context requires the user to provide private information in order to complete any kind of conversion.
It positively answers the search intent of a user
Since the beginning of this year, Google has been rolling out almost regular unofficial updates (Fred apart), which we SEOs have labeled “Quality Updates”.
Image from Search Engine Land
Much speculation has been made about the factors behind these updates, but everybody agrees on one thing:
- Pages that positively answer the search intent implied by the keywords they rank for see their rankings unchanged or improved;
- Pages that negatively answer the search intent implied by the keywords they rank for see their rankings tank.
For example, if a search implies an intention to buy and the page of our site that was ranking for that search does not present a clear answer to that intention (for example, it is a review of a product, but we cannot buy the product from the same page), then that page will most likely begin to lose visibility in favor of others which, on the contrary, do offer that opportunity to users while also offering better information about the product or service itself.
It offers a clear answer to the questions that the user is implicitly and explicitly asking when searching for something
This point is an obvious consequence of the previous one, so much so that it could appear redundant.
However, I want to include it because it implies something that many SEOs simply do not understand: it is not the length of the content that matters, but its consistency with the intent it must respond to.
In other words, not all pages of our site should be long-form, and not every “thin content” page is bad by default. Moreover… the HTML-to-text ratio is generally one of the silliest metrics SEOs follow.
It is true that long-form content tends to rank for a greater number of keywords, but the question we must ask ourselves is: are all those keywords really the ones our page should rank for?
In many cases, the answer is a resounding no!
Moreover, this explains why Google often ranks pages that we might consider “thin content” ahead of our very long and often useless ones.
What does this point imply?
It implies that we should pay more attention to understanding what content users really want from a website when they search for a certain thing.
For example, if users look for something like “summer handbag models”, chances are they will find more satisfaction on a page where the visual element predominates over the textual one.
Conversely, if users look for something like “how to prepare a paella the easy way”, most likely the content they will prefer is a step-by-step video enriched with a bulleted list of instructions that can be downloaded as a PDF and printed, so they have it handy while cooking.
That is why looking carefully at the content formats of the pages ranking for our targeted keywords, and at the kinds of SERP features Google presents for those keywords, is an essential exercise that every SEO must perform.
Only by thinking this way can we create content that obtains real, positive engagement metrics (Dwell Time, View Time) and that is not affected by those behaviors beginning to emerge as negative factors, such as pogosticking.
It is easy to consult, especially from mobile
We know that the volume of searches from mobile devices has surpassed the volume of searches from desktop in the clear majority of industries and niches.
In addition, we know that what matters for a user browsing from mobile is that the page they are visiting opens quickly and is easy to consult and “use” with a single finger.
This is why Google has been insisting for years on everything related to website speed and mobile usability.
It is also why Google is putting so much effort into promoting development platforms like AMP and PWA.
Watch out! The performance of a website is not (yet) a ranking factor, but its importance in terms of conversions is so great that not making the Web Performance Optimization of our sites a priority is – today – totally stupid.
Equally important is usability on mobile devices, and do not assume that because your website is responsive it is usable from smartphones. Experience has taught me that it is often quite the opposite!
It very likely is not, so using tools like the Mobile-Friendly Test, Lighthouse, and Chrome Developer Tools should be a daily practice for every SEO (also check this great checklist by Builtvisible).
It offers a secure environment to the user
Recently a study was presented showing that around 50% of the search results positioned in the top 10 are served over the HTTPS protocol.
In other words, Google’s entire communication strategy for migrating websites from HTTP to HTTPS has worked flawlessly.
As I said before when talking about speed, HTTPS is not (yet) a ranking factor, but still resisting the migration from HTTP to HTTPS is – again – something very stupid.
Not only is HTTPS a must if we want to use development platforms like AMP or PWA, but Chrome (and Firefox) increasingly present non-secure site warnings when we are about to enter a transactional website (e.g., an e-commerce site) or one that simply asks visitors to leave their email addresses in order to get an ebook or sign up for a newsletter.
SEO as Search Experience Optimization
Another answer I have given for many years when asked what SEO is, is that SEO is not really Search Engine Optimization – because, well, we do not optimize search engines – but Search Experience Optimization.
Search Experience Optimization means working for our websites and our customers to respond positively to all those factors that Google considers as indicators of a quality web.
These factors are of two types:
- Technical factors, which relate directly to Google and to how the search engine discovers, understands, indexes, and ranks a website;
- Marketing/communication factors, which relate directly to the search experience of Google’s users, to how they look for answers to questions and pain points of different natures, and to what they demand from a website that aims to offer those answers.
Looking at these two types of factors, the dual nature of SEO as technological marketing is evident, where technical knowledge and public understanding (marketing and communication) are symbiotic: one cannot succeed without the other.
Post from Gianluca Fiorelli