Aaron Wall’s 2020 article discusses the Panda and Penguin updates, but it remains relevant to Google’s recent March core and spam updates.

Reverse engineering what happened is very difficult because of the added noise from running two or more algorithm updates at the same time or back to back.
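To make that attribution problem concrete, here is a minimal sketch in Python. The effect sizes and noise level are hypothetical, not Google data; the point is only that a single observed ranking change is the sum of both updates plus noise, so it cannot be split into per-update contributions.

```python
# Minimal sketch (hypothetical numbers): when two updates roll out in the
# same window, a site only observes one combined ranking delta, so the
# individual contributions cannot be separated from a single data point.

import random

random.seed(42)

CORE_EFFECT = -4.0   # assumed positions lost to the core update
SPAM_EFFECT = -2.5   # assumed positions lost to the spam update

def observed_delta(noise_scale=1.5):
    """One site's measured ranking change across the overlapping window."""
    noise = random.gauss(0, noise_scale)
    return CORE_EFFECT + SPAM_EFFECT + noise

delta = observed_delta()
print(f"Observed change: {delta:+.1f} positions")
# One equation, two unknowns: any split (core, spam) with
# core + spam ≈ delta fits the data equally well, so the site owner
# cannot tell which update, or how much of each, caused the drop.
```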

Here is Aaron Wall’s comment from below the article:

Google has a lot of reasons for not allowing immediate recalculation.

— Some of the more complex calculations require offline processing in a batch run.

— Some of them also require accumulating performance data over a period of time.

— They also like to roll out multiple changes at the same time in order to make reverse engineering of changes harder.

Early on with the Penguin algorithm, they ran a Panda update on either side of it, and they also introduced an on-page classifier to add more noise to the link-based Penguin filters.

— If constant recalculation let you instantly see exactly where the limits were, people could run controlled experiments to find exactly where the bright red line is and sit just under it.

So constant recalculation would make them easier to reverse engineer, while rewarding people intent on exploiting the system over those intent on complying with the spirit of the guidelines.

— By making penalties last an extended period of time, they make almost everyone more risk averse and lower the amount people invest in SEO generally.

Source
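To illustrate the “bright red line” point from the comment above, here is a minimal sketch. The hidden threshold, the pass/fail signal, and the probe count are all hypothetical; it only shows that with instant recalculation a hidden limit could be located cheaply by binary search, which is exactly what batch runs, accumulated performance data, and long-lived penalties make impractical.

```python
# Minimal sketch of the "bright red line" argument: if rankings were
# recalculated instantly, a hidden spam threshold could be located with a
# handful of probes via binary search. The threshold and the probe are
# hypothetical stand-ins.

HIDDEN_THRESHOLD = 0.37  # e.g. an unknown tolerated fraction of manipulative links

def penalized(link_ratio: float) -> bool:
    """Stand-in for 'publish a test site and see if it gets filtered'."""
    return link_ratio > HIDDEN_THRESHOLD

def find_red_line(lo=0.0, hi=1.0, probes=20):
    """Binary-search the hidden limit using instant pass/fail feedback."""
    for _ in range(probes):
        mid = (lo + hi) / 2
        if penalized(mid):
            hi = mid          # over the line: back off
        else:
            lo = mid          # still safe: push harder
    return lo                 # highest value observed to be safe

print(f"Estimated safe limit: {find_red_line():.4f}")
# With ~20 instant recalculations the limit is pinned down to several
# decimal places. If each probe instead costs months (batch runs,
# accumulated performance data, penalties that persist), the same
# experiment becomes impractical.
```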
