On Feb 3, 2017, Google Japan announced an improvement to its search quality algorithm. It targets low-quality websites whose content lacks originality, usefulness, and reliability.
It is a local algorithm applied only to Japanese Google searches.
As a result, many media websites built on recast content – low-quality content that curates or rewrites original content from other webpages and adds little or no original value – have been demoted. This does not mean, however, that Google bans or discriminates against content curation and rewriting as such.
This article contains a translation of the announcement, along with explanations of what it means, what intentions it reveals, and why it was done.
1: Translation of the announcement
To Improve Search Quality of Searches in Japanese
To make Google search better and more helpful for users, Google keeps improving its search ranking algorithm day by day. Google search in Japanese is no exception.
As a part of such efforts, we improved our methods of evaluating website quality this week. Through this update, websites will be demoted if they neglect to give users useful and reliable information and focus more on showing their pages higher on Google search results. As a result, high-quality websites with original and useful content will rank higher.
This improvement is a countermeasure to low-quality websites showing up in Japanese Google search. We hope improvements like this will help make the web ecosystem better – an ecosystem in which providers of useful and reliable content are valued as they should be.
This improvement is just one part of our effort, and it will not solve all of the problems with Google's Japanese web search. To improve search quality, we will keep improving the search ranking algorithm.
Source: Google Webmaster Blog (Japan)
2: Q&A and Analysis
What’s the name of this update?
It has not been named officially.
What kind of websites are affected?
Websites with lots of recast content tend to be affected heavily.
This is natural, since Google says it demotes "websites that neglect to give users useful and reliable information and focus more on showing their pages higher on Google search results," and wants to rank "high-quality websites with original and useful content" higher.
Producers of recast content do not create original, useful, and reliable content from scratch. Instead, they concentrate on recasting content from other websites into forms that search engines prefer. Their focus is on improving their ranking on Google SERPs as efficiently as possible.
What is the actual effect?
As far as IREP has observed, the update really does work on websites consisting largely of recast content.
Fig 1. shows the effect – many websites with recast content suddenly lost their findability just after the update was announced:
(Fig 1: Effect of Google Japan Update in Feb 2017, Researched by IREP)
What does “Improving Web Ecosystem” mean?
The food-chain analogy makes it easy to understand.
“Content” is “food”. Players in this ecosystem produce, transfer or consume it.
“Original content providers” are “Producer organisms”. They produce content from scratch, and enrich web ecosystem.
“Producers of recast content” are “Consumer organisms”. They produce no content themselves; they merely consume content from “Original content providers”. They make the flow of “food” smoother, but do not increase the total resources in the ecosystem.
What if “Consumer organisms” overrun the ecosystem and “Producer organisms” are not fed well? The answer is simple: the ecosystem will shrink.
Of course that would be a tragedy for content providers and users – and also for Google, which depends on this ecosystem. This update takes that broad view and aims for a long-term effect.
Is it rolled out globally?
No. For now it only affects Japan.
Will it be rolled out to other countries?
As of now, there has been no official statement.
Given the reason for this update, though, it could plausibly be rolled out in other countries where recast content is overrunning search results.
Why have low-quality curated and/or rewritten articles been favored by Google?
As you may know, “Quality/reliability of content (= Page Quality)” and “Relevance to search intent (= Needs Met)” are the main aspects of Google's Search Quality evaluation. In the Needs Met evaluation, Google takes the breadth of the content into account: if content deals with a topic that is too broad or too specific relative to the search intent, it is not a good “fit” for the query.
-“Highly Meets results are highly satisfying and a good “fit” for the query.”
-“Slightly Meets results may … be too specific, too broad, etc. to receive a higher rating.”
(Source: Google General Guidelines)
Recasting is a cheap trick that exploits these aspects. As you know, making high-quality content from scratch is expensive. Recast content is neither original, reliable, nor useful, but it can be produced at low cost by curating and/or rewriting, and it often fits Google's preferences better than the originals do.
Let's assume there is an original blog post, ‘Review of “Ramen King”’ (an imaginary ramen restaurant in NY). The reviewer has visited several times and wrote a review that clearly describes the good and bad points and includes lots of photos.
However, the post has some weak points. First, its topic is too specific: many users search for “Ramen NY”, “Good Ramen in NY”, etc., but far fewer search for “Ramen King”.
Then content curators come in. They copy or quote many reviews of ramen restaurants in NY and stitch them into a patchwork page that covers many restaurants. Though it is neither original nor reliable, it is a good fit for users searching for “Ramen NY” or “Good Ramen in NY”. Google shows the curators' page to those searchers, and the original is not rewarded.
Another problem is that the post includes some irrelevant parts – diary-like passages such as, “It was a cold day. I planned to go out with my friend, but he caught a cold. That's why I headed to Ramen King.” That may be meaningful to the author and their friends and fans, but it is unnecessary for search users looking for reviews.
Here the rewriters come in. They edit, rewrite, and summarize the post, producing rewritten content with the length and topical breadth that Google would prefer. From Google's point of view, both deal with a similar topic, but the rewritten version can look more relevant to search users because it contains less noise.
That is why recast content is often favored over original content.
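The intuition above can be sketched in code. The following is a toy model – emphatically NOT Google's algorithm – using Jaccard similarity over hypothetical topic sets to show how a patchwork page covering many restaurants can score a better "breadth fit" for a broad query than a single deep, original review. All names and topic sets are invented for illustration.

```python
# Toy "breadth fit" model (hypothetical, not Google's actual ranking logic).
# Idea: a page whose topical coverage matches the breadth of the query's
# intent scores higher than one that is too specific or too broad.

def breadth_fit(query_topics: set, page_topics: set) -> float:
    """Jaccard similarity as a crude proxy for topical fit."""
    if not query_topics or not page_topics:
        return 0.0
    return len(query_topics & page_topics) / len(query_topics | page_topics)

# "Ramen NY" intent: the searcher wants coverage of many NY ramen shops.
query = {"ramen_king", "ramen_ichiro", "ramen_taro"}   # invented shop names
original_review = {"ramen_king"}                        # one deep, original review
curated_patchwork = {"ramen_king", "ramen_ichiro", "ramen_taro"}

print(breadth_fit(query, original_review))    # low: too specific for the query
print(breadth_fit(query, curated_patchwork))  # high: "fits" despite no originality
```

The toy score captures only the breadth/fit aspect; it deliberately ignores quality and originality, which is exactly the gap the 2017 update was meant to close.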
Is it better to avoid curating or rewriting?
Of course NOT. You can use them whenever appropriate.
In short, Google demotes recast content because of its poor quality, not because of the methods used to produce it.
As the announcement says, Google demotes websites that "neglect to give users useful and reliable information and focus more on showing their pages higher on Google search results." Though such websites often use content curation and rewriting as methods, that does not mean all curating and rewriting are bad. Lots of car accidents happen every day, but does that mean cars are evil and should be banned?
When content fits the search intent and is easy to understand, it is good for users, and Google will keep rewarding it. If you curate or rewrite to revive content – restructure it to the right scope for the intent, rewrite it to be more intuitive and up to date, or add insights you gained through the curation/rewriting process – it is helpful for everyone in the web ecosystem. There is no need to avoid curating and rewriting, as long as you do it with skill and a proper purpose.
For any inquiries, please contact:
TEL: +81-3-3596-8050, FAX: +81-3-3596-8145