Google is translating more and more English-language websites and linking the translations on its search results pages. The result: fewer clicks for the originals, more clicks for Google.
If there aren’t enough high-quality web pages in a language to fill Google’s search results, Google translates English-language sources and displays the translations instead. These translated pages are not hosted on your site but served from Google’s own proxy domain, translate.goog. An example URL for a translated page is www-your-site-com.translate.goog/path?_x_tr_sl=en&_x_tr_tl=es.
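The proxy hostnames follow an observable pattern: dots in the original hostname become hyphens, existing hyphens are doubled, and the suffix .translate.goog is appended. This is an observed convention rather than a documented API, so treat the following sketch as an approximation:

```python
def proxied_hostname(host: str) -> str:
    """Approximate the hostname Google's translation proxy would use.

    Observed pattern (not officially documented): existing hyphens are
    doubled, dots become single hyphens, then ".translate.goog" is appended.
    """
    return host.replace("-", "--").replace(".", "-") + ".translate.goog"

# e.g. proxied_hostname("www.your-site.com")
#      → "www-your--site-com.translate.goog"
```

This can be handy when grepping server logs or analytics exports for traffic arriving via the proxy.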
The traffic this domain receives has recently increased significantly. As Ahrefs reports, this coincides with the global rollout of Google AI Overviews in many languages. It’s possible that there isn’t yet enough native content in every language, which would explain why Google is falling back on this technique.

Here is an example of a translated Wikipedia page:

According to Ahrefs, the countries with the most translated pages are India, Indonesia and Brazil.
Check whether Google is translating your website
To see if your website is affected, you can use the “Search appearance” filter in Search Console’s performance report and select “Translated results”:

You can also search for your website from the relevant countries in incognito mode, or look in Google Analytics for referrals from translate.google.com or from hostnames ending in .translate.goog.
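If you have raw server logs, you can spot this traffic without Google Analytics. Here is a rough sketch that counts referrers pointing at Google’s translator in a combined-format access log; the marker strings and the log format are assumptions you may need to adapt to your setup:

```python
from collections import Counter

# Referrer substrings that suggest Google's translation proxy (assumed markers)
TRANSLATE_MARKERS = ("translate.google.com", ".translate.goog")

def count_translate_referrers(lines):
    """Count log lines whose referrer field points at Google's translator.

    Assumes the common "combined" log format, where the referrer is the
    second quoted field:  ... "GET / HTTP/1.1" 200 123 "referrer" "user-agent"
    """
    hits = Counter()
    for line in lines:
        parts = line.split('"')
        if len(parts) < 4:
            continue  # malformed or non-combined log line
        referrer = parts[3]
        for marker in TRANSLATE_MARKERS:
            if marker in referrer:
                hits[marker] += 1
    return hits
```

Running this over a few days of logs gives a quick sense of how much traffic arrives via translated results.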
Avoid automatic translations
There are several ways to counteract this. The best option is to translate your content yourself and interlink the language versions with hreflang annotations. However, this is a lot of work and hardly feasible for every language.
There is also an opt-out for translations: set the appropriate meta robots tag. For example, to exclude a page from Google’s translations, add this tag: <meta name="googlebot" content="notranslate">. Alternatively, you can send the equivalent HTTP response header: X-Robots-Tag: notranslate.
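The header variant works with any web server or framework that can set response headers (Apache, nginx, Express, and so on). As a minimal, self-contained sketch, here is the idea using only Python’s standard library:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class NoTranslateHandler(BaseHTTPRequestHandler):
    """Serves every page with an X-Robots-Tag: notranslate header."""

    def do_GET(self):
        body = b"<p>Hello</p>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        # Asks Google not to offer translated versions of this page
        self.send_header("X-Robots-Tag", "notranslate")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run locally (hypothetical address/port):
#   HTTPServer(("127.0.0.1", 8000), NoTranslateHandler).serve_forever()
```

In practice you would set this header once in your server or CDN configuration rather than in application code.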