How to Heal Your Site after Google’s Panda Update

Did you get mauled by Google’s Panda? Are you sure? And if you did, do you know how to nurse your Web site back to health on the world’s foremost search engine?

To recap, Google is in an almost constant state of revision of its search algorithm, the mathematical formulas it uses to decide which pages are the most authoritative for a given keyword and therefore should be displayed first in a search on that term. But the system saw two big spikes in changes last January and again in early March. The first, as Google explained after the fact, was an attempt to cut back drastically on the search prominence of “content farms”: sites that produce reams of low-quality content, draw traffic to it through organic search, and then earn revenue by selling pay-per-click ads on those pages.

Good riddance, you may think.

But that effort then broadened into a much larger revision of the Google algorithm in February, nicknamed “Panda” (apparently after one of the update’s engineers), with the express intent of downgrading the rankings of many pages on sites that are not content farms.

On its official blog, Google announced that the aim was to cut the rankings—and thus the position within search results—of “sites which are low-value-add for users, copy content from other websites or sites that are just not very useful.”

It’s been estimated that the first revision, the one targeting content farms, affected about 2% of all Google queries. Not huge as a proportion, but since Google logs upwards of 2 billion searches a day, it’s pretty substantial in real numbers: 40 million queries a day affected by the Farmer update.

The second, more extensive Panda update is ballparked to have affected an additional 15% or so of search queries. Together with the first change, that means about 340 million Google searches a day now produce results pages with links different from those they turned up before the update process began. And tweaks to Panda are still rolling out.
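The arithmetic behind those ballpark figures is easy to check. A quick sketch in Python, using the round numbers cited above:

```python
# Back-of-the-envelope check on the query-volume figures cited above.
DAILY_SEARCHES = 2_000_000_000  # "upwards of 2 billion searches a day"

farmer_share = 0.02  # ~2% of queries hit by the Farmer update
panda_share = 0.15   # ~15% more hit by the broader Panda update

print(f"Farmer update: {DAILY_SEARCHES * farmer_share:,.0f} queries/day")
print(f"Both combined: {DAILY_SEARCHES * (farmer_share + panda_share):,.0f} queries/day")
# Farmer update: 40,000,000 queries/day
# Both combined: 340,000,000 queries/day
```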

That’s queries. But how many actual sites have seen a drop in their rankings for their main keywords? The Web is so vast that it’s hard to know, but in a poll conducted by Barry Schwartz on the Search Engine Roundtable site about five days after Panda, roughly 40% of Webmasters reported seeing less Google traffic than before the update.

Almost certainly, the large majority of sites downgraded in these Google moves are informational or publishing sites along the lines of AssociatedContent.com (down 93% in U.S. Google traffic right after the update, according to calculations from search research firm Sistrix) and Ezinearticles.com (a 90% dropoff).

But ecommerce, B-to-B and general product or service sites can’t afford to be complacent, for the simple reason that many of them also show some of the telltale signs of poor-quality content that Google is looking for. An article in the Wall Street Journal pointed to ecommerce sites such as OnTimeSupplies.com, an office-supplies Web retailer that experienced a 50% decline in Web traffic.

So what can a Web merchant who has felt the Panda’s claws do to recover? There’s plenty of advice floating around—some of it contradictory.

First, to evaluate whether they are in fact being hurt by the algorithm change, marketers should go to their analytics toolkit and check the number of key search terms that are referring visitors to their site from Google. “That’s an important metric for search engine optimization,” says Chip Rice, national accounts manager with digital marketing firm OneUpWeb. “It goes beyond the vanity of search position [on a Google results page] and gets to the core of how big a presence you have in the natural search space. If you’re seeing a smaller variety of search terms driving traffic, then there’s a good chance that you’re experiencing a negative impact from Panda.”
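As a rough illustration of the check Rice describes, here is a sketch that assumes your analytics package can export referral data as a CSV with date, source and keyword columns (the file name and layout are hypothetical, not any particular vendor’s format). It counts how many distinct Google search terms drove traffic before and after an assumed Panda cutoff date:

```python
import csv
from datetime import date

PANDA_CUTOFF = date(2011, 2, 24)  # assumed rollout date for the comparison

# Hypothetical analytics export: one row per visit, with columns
# "date" (YYYY-MM-DD), "source" (referring engine) and "keyword".
before, after = set(), set()
with open("search_referrals.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["source"].lower() != "google":
            continue
        visit_date = date.fromisoformat(row["date"])
        (before if visit_date < PANDA_CUTOFF else after).add(row["keyword"])

print(f"Distinct Google keywords before: {len(before)}")
print(f"Distinct Google keywords after:  {len(after)}")
print(f"Keywords lost: {len(before - after)}")
```

A sharply lower “after” count, or a long list of lost keywords, is the shrinking variety of search terms Rice warns about.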

But one evident key is for marketers to check their Web site’s content and evaluate it honestly for originality, value, and usefulness to visitors.

“A lot of ecommerce companies face a real challenge in producing good quality unique content,” says Rice. “Their product descriptions often come straight from manufacturers and are duplicated across many, many Web sites—perhaps hundreds or thousands with the exact same copy. So as this update goes through and looks to reduce the amount of scraped, duplicated or poor-quality content and to reward unique value-added content, there’s definitely a potential for ecommerce companies to suffer an impact.”

The answer might be to rewrite that content substantially to make it unique. The situation may be complicated by the fact that in some cases, manufacturers require resellers to post their product content exactly as written. If enough resellers feel the pinch of Panda in their rankings because of content that Google sees as duplicative, those agreements may have to change. In the meantime, e-tailers may need to consider adding elements to those product pages that counterbalance the shared content with what Google will detect as valuable original data—such as video, consumer reviews or ratings.
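One crude way to gauge how far a rewritten description has drifted from the manufacturer’s boilerplate is a shingle-overlap score. The sketch below uses a standard text-similarity technique—nothing Google has published—comparing two descriptions by the overlap of their five-word sequences:

```python
def shingles(text, n=5):
    """Return the set of n-word sequences ("shingles") in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of two texts' shingle sets."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical product copy for illustration.
manufacturer_copy = "Stapler with a 20-sheet capacity and a die-cast metal base."
rewritten_copy = "A die-cast stapler that binds up to 20 sheets at a time."
print(round(similarity(manufacturer_copy, rewritten_copy), 2))
```

A score near 1.0 means the “rewrite” is still essentially the manufacturer’s copy; a score near 0.0 means it is substantially original.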

“Reviews and ratings can be a wonderful way to add unique, relevant and fresh content that’s being updated continually,” says Chris Keating, director of SEO with Performics. “That way you don’t run into the problem of having only scraped content that never changes. Ideally you want to show Google a page that’s alive. That sends the message that this page is relevant.”

Web marketers should also be aware that bad pages can drag down the good ones around them in Google’s eyes. If you’ve got a page with very little on it except a short product description and a photo, you’re inviting Google’s bots to see it as shallow content of little worth, and that can affect the overall ranking of your site.
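A first-pass audit for those thin pages can be as simple as scanning rendered pages for visible word count. In the sketch below, the directory, the threshold and the regex-based tag stripping are all rough assumptions, suitable only for flagging candidates for a human look:

```python
import re
from pathlib import Path

MIN_WORDS = 150  # assumed threshold for calling a page "thin"

def visible_words(html):
    # Naive tag stripping; good enough for a rough audit, not real parsing.
    text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html,
                  flags=re.DOTALL | re.IGNORECASE)
    text = re.sub(r"<[^>]+>", " ", text)
    return len(text.split())

for page in Path("site_snapshot").rglob("*.html"):  # hypothetical local copy
    count = visible_words(page.read_text(errors="ignore"))
    if count < MIN_WORDS:
        print(f"thin page ({count} words): {page}")
```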

Other things to be considered in a post-Panda site makeover:

Look at the user experience. Google appears to be penalizing sites that hit visitors with too many ads above the fold, as well as sites with poor navigation. In addition, when a high proportion of users click through to a page and then quickly click back to Google—a form of “bounce rate”—that appears to tell the engine that visitors don’t find the page useful, and that’s a black mark.
Remember, too, that the latest version of Google’s Chrome browser includes the ability to block Web sites. Google seems to be looking at that data now as an outside check on the accuracy of its algorithm revision (reportedly, about 80% of the blocked sites are among the ones downgraded); in the future it may well integrate that blocking data directly into its rankings. So users’ views of your site’s worth will become even more important.
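That quick-click-back signal can be approximated from your own data. The sketch below assumes a hypothetical sessionized export of your traffic (referrer, pages viewed, duration in seconds) and treats any Google-referred visit that views a single page for only a few seconds as a likely bounce back to the results page:

```python
import csv

QUICK_BOUNCE_SECONDS = 10  # assumed cutoff for "clicked straight back"

google_sessions = quick_bounces = 0
# Hypothetical sessionized export: referrer, pages_viewed, duration_seconds.
with open("sessions.csv", newline="") as f:
    for row in csv.DictReader(f):
        if "google." not in row["referrer"]:
            continue
        google_sessions += 1
        if (int(row["pages_viewed"]) == 1
                and int(row["duration_seconds"]) < QUICK_BOUNCE_SECONDS):
            quick_bounces += 1

if google_sessions:
    rate = 100.0 * quick_bounces / google_sessions
    print(f"Quick bounce-backs: {quick_bounces}/{google_sessions} ({rate:.1f}%)")
```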

Don’t forget the off-site factors—most importantly, the kind of inbound links you have coming into your content pages. If sites Google considers reputable, authoritative and valuable are linking to you, that will make your pages look better to the search bots and rank higher. But if you’re engaged in buying links or swapping links with low-ranking sites simply to increase your volume of inbound links in an attempt to look good to Google, that behavior is more likely than ever before to cost you search rankings and traffic.

“Eliminate low-quality links that might be dragging your site down,” says Rice. “If you have lots of links to sites with low-quality content, or sites with loads of content but very little traffic, or sites lacking in content moderation or editorial review, there’s a good chance those can become a real hindrance to your site’s performance.”
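If you keep an inventory of your inbound links, a triage pass along the lines Rice suggests is straightforward. The CSV layout and quality signals below are hypothetical—stand-ins for whatever link data you have gathered yourself:

```python
import csv

# Hypothetical link inventory: linking_url, est_monthly_traffic,
# has_editorial_review ("yes"/"no"), was_paid ("yes"/"no").
suspect = []
with open("inbound_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        low_traffic = int(row["est_monthly_traffic"]) < 100
        unmoderated = row["has_editorial_review"].lower() == "no"
        paid = row["was_paid"].lower() == "yes"
        if paid or (low_traffic and unmoderated):
            suspect.append(row["linking_url"])

print(f"{len(suspect)} links worth reviewing or asking to have removed:")
for url in suspect:
    print(" ", url)
```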

Google is notoriously reluctant to discuss specifics about how its algorithms work because that would encourage gaming the system. But the company has set up a thread on its Webmaster Central Forum where operators can make the case that they’ve been unfairly impacted by the algorithm change. Don’t look for any direct help from Google here—“As this is an algorithmic change, we are unable to make manual exceptions,” the Google moderator writes. But your arguments are monitored by Google and could find their way into future algo tweaks or updates. Meanwhile, you might get some useful suggestions from your fellow sufferers.

If you’re faced with a big falloff in Google traffic, this may be a good time to redouble efforts to push for more referrals from other channels such as email or social media. Get more active on Facebook or LinkedIn. Think about starting a brand or corporate blog that can drive traffic to your Web site with engaging or valuable editorial good enough to be spread through sharing or retweeting. That requires the will to spend the time and effort to create good quality content worth sharing, of course. But at least you’re asking real people to evaluate that quality, not some software housed in a sub-basement in Mountain View, CA.

Keep in mind that if poor-quality sites get bumped down in the search ranks, good ones get promoted. If you’ve been practicing good SEO for any length of time, you may find that you’re actually ranking higher for some of your key search terms than you were before the Panda update.

Finally, resist the temptation to do just the bare minimum amount of optimization required to get back to your former Google performance. Moving as little as possible to stay on the right side of the line Google has drawn with Panda may seem like an efficient use of money and personnel, but it’s really just setting your site up to fail at the inexorable next update of Google’s algorithm.

In a March interview with Wired magazine, Google engineer Matt Cutts described how the Web community’s reaction to a 2009 Google update codenamed Caffeine led the company to examine content quality more minutely in Panda: “It was like, ‘What’s the bare minimum that I can do that’s not spam?’”

“After a big update, the initial place everybody goes to is, ‘How do we recover from this?’” says Rice. “The answers are just the same as they’ve always been: provide high-quality content and a good user experience and practice good SEO. Google is continually furthering their core pursuit of providing the highest quality search results to their users. If you’re trying to stick as close to that line as possible, then you’re also making yourself more vulnerable to the next update.”