What is Google Panda Algorithm? How to recover from Panda

Google Panda Algorithm Update

Google released the first Panda update on February 23, 2011. The main purpose of the Google Panda algorithm update was to reward high-quality websites and diminish the presence of low-quality websites in Google’s search engine results pages. Panda was initially known as “Farmer.” According to Google, Panda’s initial rollout over the course of several months affected up to 12 percent of English-language search results.

Triggers for Panda

Thin content – Weak pages with very little relevant or substantive text and resources, such as a set of pages describing a variety of health conditions with only a few sentences present on each page.

Duplicate content – Copied content that appears on the Internet in more than one place, whether across different sites or within your own site (a simple internal duplicate check is sketched at the end of this section).

Low-quality content – Pages that provide little value to human readers because they lack in-depth information.

Lack of authority/trustworthiness – Content produced by sources that are not considered definitive or verified. A Google representative stated that sites aiming to avoid Panda’s impact should work to become recognized as authorities on their topic, the kind of entities a human user would feel comfortable giving their credit card information to.

Content farming – Large numbers of low-quality pages, often aggregated from other websites.

Low-quality user-generated content (UGC) – An example of low-value UGC would be a blog that publishes guest posts that are short, riddled with spelling and grammatical errors, and lacking in authoritative information.

High ad-to-content ratio – Pages made up mostly of paid advertising rather than original content.

Low-quality content surrounding affiliate links – Poor content around links pointing to paid affiliate programs.
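
Several of these triggers, duplicate content in particular, can be checked with simple tooling before Google flags them. Below is a minimal sketch in Python, assuming you already have a list of your site’s URLs (the example.com URLs and the exact-match hashing approach are illustrative assumptions, not something prescribed by Panda itself), that fetches each page and hashes its text to surface exact internal duplicates.

```python
# Minimal duplicate-content check: hash each page's markup and report
# URLs whose content is identical after whitespace normalization.
# The URL list below is a hypothetical placeholder.
import hashlib
from collections import defaultdict

import requests  # assumed available; any HTTP client would do

URLS = [
    "https://example.com/page-a",
    "https://example.com/page-b",
]

def page_fingerprint(html: str) -> str:
    """Crude fingerprint: collapse whitespace, lowercase, then hash."""
    normalized = " ".join(html.split()).lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(urls):
    groups = defaultdict(list)
    for url in urls:
        response = requests.get(url, timeout=10)
        groups[page_fingerprint(response.text)].append(url)
    return [group for group in groups.values() if len(group) > 1]

if __name__ == "__main__":
    for group in find_duplicates(URLS):
        print("Possible duplicates:", group)
```

This only catches exact duplicates; near-duplicate detection would require shingling or similar techniques, which is beyond the scope of this article.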

How to recover from Panda

By following the steps below, you can recover from Google Panda:

  • Abandoning content farming practices
  • Overhauling website content for quality, usefulness, relevance, trustworthiness and authority
  • Revising the ad/content or affiliate/content ratio so that pages are not dominated by ads or affiliate links
  • Ensuring that the content of a given page is a relevant match to a user’s query
  • Removing or overhauling duplicate content
  • Carefully vetting and editing user-generated content to ensure that it is original, error-free and useful to readers, where applicable
  • Using the robots noindex,nofollow directive to block the indexing of duplicate or near-duplicate internal website content or other problematic pages, as shown in the sketch below this list
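
To make the last point concrete: the directive can live in each page’s <head> as <meta name="robots" content="noindex,nofollow">, or be sent as an X-Robots-Tag HTTP header. The sketch below, which assumes a Flask application and a hypothetical list of problematic URL paths (neither is mentioned in the original article), shows the header variant.

```python
# Minimal sketch: serve "noindex, nofollow" as an X-Robots-Tag header
# for duplicate or thin pages, so crawlers drop them from the index.
# The Flask app and NOINDEX_PATHS list are illustrative assumptions.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical paths identified as duplicate or low-value content.
NOINDEX_PATHS = ("/print/", "/tag/", "/search")

@app.after_request
def add_robots_header(response):
    # Ask crawlers not to index these pages or follow their links.
    if request.path.startswith(NOINDEX_PATHS):
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response
```

The same effect can be achieved with the meta tag alone; the header approach is simply convenient when the affected pages are not easy to edit individually (PDFs, generated listings, and so on).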

Source: moz.com
