SEO Over-Optimization: Why Google Might Penalize Your Site

Google may be penalizing good SEO practices by considering them over-optimization: Learn about the SEO factors that can affect your positioning.

Imagine that you are an SEO specialist who sits down every day to optimize your clients’ sites, creating quality, useful, Helpful Content, and following what we all think (or should I say thought) are good SEO practices.

What would you think if I told you that, because you have done things right from an SEO point of view, you may be in Google’s sights?

According to Cyrus Shepard’s research, Google may be penalizing SEO over-optimization. Google appears to be linking classic SEO optimization factors to content that is targeted at search engines rather than users.

Helpful Content vs Search Engine First

To begin with, we should remember what Google says about Helpful Content:

“Google’s automated ranking systems are designed to present useful and trustworthy information created primarily to benefit people, not to rank high in search results.”

The question that arises is: how does Google determine that content was created with search engines in mind rather than to help the user? Yes, we all know that what we SEOs do is aimed at improving results. But that doesn’t mean our content can’t be high quality and genuinely help users.

What is SEO over-optimization?

SEO over-optimization occurs when SEO techniques are used excessively; in some cases there is so much emphasis on improving a website’s positioning that it becomes counterproductive. One of the most common examples is keyword stuffing: overusing a keyword and its variants by forcibly including them in the content, titles, and headers.

What is new is that many practices that were previously accepted now seem to have been added to the over-optimization list.

Do Google updates penalize good SEO? 

In the research, Cyrus took the database of 50 websites analyzed in his study on the winners and losers of Google updates and cross-referenced it with Ahrefs metrics. The goal was to examine “classic” SEO factors and the relationship between their presence or absence and the winning and losing sites after the latest algorithm updates.

Although several factors showed no correlation, others showed a striking negative correlation. And the surprising thing is that the factors with the greatest impact were precisely those generally accepted by the entire sector as “good SEO practices”.

But now, let’s see what they are:

Variety of anchor text

Internal Anchors/Page | Correlation: -0.337 | Evidence: Strong

External Anchors/Page | Correlation: -0.352 | Evidence: Strong

The research shows that the greater the variety of anchor texts used (both internal and external), the greater the negative impact a site received. This becomes credible considering that, according to the recent Google Leak, Google frequently uses anchors as a ranking factor.

Historically, it was considered that the more varied the anchor text, the better in Google’s eyes. Should we rethink link building?
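To make the factor concrete, here is a minimal sketch of one way anchor-text variety could be quantified: the ratio of distinct anchor texts to total anchors on a page. The function name and the metric itself are my own illustration, not the study’s actual formula.

```python
from collections import Counter

def anchor_variety(anchor_texts):
    """Ratio of distinct anchor texts to total anchors (0..1).

    1.0 means every anchor is unique; lower values mean more repetition.
    """
    if not anchor_texts:
        return 0.0
    counts = Counter(text.strip().lower() for text in anchor_texts)
    return len(counts) / len(anchor_texts)

# Hypothetical anchors pointing at the same URL from one page.
varied = ["seo guide", "read our guide", "learn more", "this article"]
uniform = ["seo guide", "seo guide", "seo guide", "seo guide"]

print(anchor_variety(varied))   # 1.0 -- maximum variety
print(anchor_variety(uniform))  # 0.25 -- one anchor repeated
```

Under the study’s findings, a page scoring near 1.0 on a metric like this would, counterintuitively, be the riskier profile.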

Page refresh rate

Days since last content update | Correlation: 0.455 | Evidence: Strong

There’s an inside joke among SEOs that goes “It’s January already, time to update the year on articles.” And, beyond the joke, it’s possible that some publishers have been updating the date of their articles without making any real content changes. 

It seems that Google noticed this, because the research showed that the websites that were negatively impacted were generally the most recently updated ones.

The average age of URLs on the winning sites was 774 days, just over 2 years. In contrast, the average on the losing sites was only 273 days since publication or last update, less than half the age of the winning sites.

Another interesting fact: 35% of the winning URLs did not show any date on the page, while among the losing URLs only 4% lacked a date.

Yes, this section also makes us think. 

Titles that “invite” clicks, or clickbait?

Number of numbers in titles | Correlation: -0.297 | Evidence: Moderate

Number of adjectives in titles | Correlation: -0.420 | Evidence: Strong

Once again, the impact is negative: the highest presence of adjectives and numbers in titles was seen on the losing sites. In this case it is a little more logical; it is very likely that Google associates this type of title with clickbait, or at least with a sign of overly aggressive SEO optimization.

In this case, Cyrus links it to the leaks from the antitrust lawsuit against Google, which indicate that clicks are used as a ranking factor. In that sense, it is likely that Google is trying to curb the CTR manipulation that a “striking” title can generate.

Intensive use of Schema Code

Multiple Schema Code | Correlation: -0.314 | Evidence: Moderate

Schema ItemType Only | Correlation: -0.381 | Evidence: Strong

Again an unexpected result: it turns out that, on average, losing sites implement more structured data than winning sites. Many more.

Structured data used to be considered almost an SEO requirement, and Google kindly gave us an endless list of options. But now we see that many of the winning sites barely used any Schema code at all.

One sample site even saw over a 5,000% increase in traffic using a single Schema ItemType alone.

We will have to ask ourselves whether we have overused structured data. Let those who are free from intensive use of Schema code cast the first stone.
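A quick way to audit your own pages against this factor is to count the distinct `@type` values declared across a page’s JSON-LD blocks. A minimal sketch, assuming the `<script type="application/ld+json">` blocks have already been extracted from the HTML:

```python
import json

def schema_types(jsonld_blocks):
    """Collect every @type declared across a page's JSON-LD blocks.

    The study suggests pages stacking many types fared worse than
    pages with little or no structured data.
    """
    types = set()
    for block in jsonld_blocks:
        data = json.loads(block)
        items = data if isinstance(data, list) else [data]
        for item in items:
            declared = item.get("@type")
            if isinstance(declared, list):
                types.update(declared)
            elif declared:
                types.add(declared)
    return types

# Hypothetical JSON-LD blocks from one page.
page = [
    '{"@context": "https://schema.org", "@type": "Article", "headline": "..."}',
    '{"@context": "https://schema.org", "@type": ["FAQPage", "WebPage"]}',
]
print(sorted(schema_types(page)))  # ['Article', 'FAQPage', 'WebPage']
```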

Conclusion

In my opinion, Google is looking for clues or signals from sites created solely for ranking, and is using whatever it has at hand to identify them. Unfortunately, in the process it can hurt sites that try to do things right at the SEO level without losing sight of the goal of helping the user.

And once again, Google updates, some logical and others less so, require us to stay up to date with the news that arises every day in the world of SEO. That, and constantly testing and measuring new strategies to improve our positioning and to recover lost traffic if we are hit by one of the latest updates.
