Sunday, July 17, 2011

Produce of Nations United - Reversing Panda: One Blog's Response to the Google Panda Update

When Google rolled out the first Panda update on February 23, 2011, we watched our site traffic drop by 40%. I found out about it four hours after quitting my day job to become a full-time blogger. Not for a second did I regret that decision, but it did present some unique challenges in the days that followed.


Since then, we have tried several different strategies to regain our former glory. Search and site analysis led us to remove potentially low-quality content. We have experimented with editing and removing ads, all while trying to improve the user experience. It is important to note that we have not seen a recovery... yet. Nothing I am about to share has produced a significant improvement, but I hope this post gives other publishers some insight.



"This update is designed to reduce the rankings for low quality sites" - Amit Singhal, Google Fellow


Google has said over and over that the new Panda document classifier affects an entire site. Before, you could have a handful of really good posts and it was on Google to find them. Now the onus is on webmasters to carefully curate every single piece of content.


Since the term "low quality" is open to interpretation, we started our site analysis by identifying our high-quality content. The goal was to improve our link profile and remove everything except our best material. Using data from Google Analytics, Webmaster Tools, and backlink analysis tools, we evaluated every single post. Specifically, we looked at top landing pages, content by number of inbound links, by number of linking domains, and by domain authority. Many of these factors correlate with AdSense earnings, so we took those into account as well.
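
For illustration, here is a minimal sketch of the kind of scoring we ran, assuming hypothetical CSV exports (a pages.csv of pageviews from Analytics and a links.csv of inbound links and linking domains from a backlink tool). The file names, columns, and weights are made up for the example, not our exact formula.

    import csv
    from collections import defaultdict

    # Hypothetical exports: pages.csv has url,pageviews (Analytics);
    # links.csv has url,inbound_links,linking_domains (backlink tool).
    scores = defaultdict(float)

    with open("pages.csv", newline="") as f:
        for row in csv.DictReader(f):
            scores[row["url"]] += float(row["pageviews"])             # traffic signal

    with open("links.csv", newline="") as f:
        for row in csv.DictReader(f):
            scores[row["url"]] += 50 * int(row["inbound_links"])      # links count more than raw views
            scores[row["url"]] += 200 * int(row["linking_domains"])   # unique linking domains count most

    # Rank every post; keep the top slice, review or remove the rest.
    for url, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{score:10.0f}  {url}")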



"wanted blocked from crawling and indexing, so that search engines can focus on what makes your site unique and valuable…. "- John Mu, Google employee


We decided which articles had to go and which would remain. As hard as it was to contemplate deleting roughly 75% of our archives, it was a relief to find alternatives to outright "removing" the content. By blocking crawling, we could keep informational posts that didn't make the cut while still preserving link juice.


In another forum post, John Mu said to use a 404 or 410 error code for pages not worth saving, a 301 redirect for pages that can be merged, and a "noindex" meta tag for content you plan to rewrite. Matt Cutts did a live webcast on May 25 in which he confirmed that noindexing is a good way to remove low-quality content. Blocking content in robots.txt prevents Googlebot from crawling it at all, while noindexing still allows crawling and link following.
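
To make those options concrete, here is roughly what each one looks like; the paths are hypothetical and the redirect syntax assumes an Apache .htaccess file. Pages not worth saving were simply deleted so the server returns a 404 (or a 410 if you can configure one).

    # robots.txt - Googlebot never crawls this section, so links inside it are not followed
    User-agent: *
    Disallow: /low-quality-section/

    <!-- meta tag in the <head> of a page you plan to rewrite:
         still crawled and links still followed, but kept out of the index -->
    <meta name="robots" content="noindex, follow">

    # .htaccess (Apache) - 301 redirect for a post that was merged into another
    Redirect 301 /old-post.html /merged-post.html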



"While it is exciting to maximize the performance of your ad with AdSense, it is also important to take account of the user when" - guidelines for best practices, Google AdSense


It seemed very telling that the AdSense team published new ad placement guidelines about two months after Panda hit. Many publishers felt betrayed, because AdSense optimization specialists had always pushed for more ad units and more aggressive placement. Now it is clear there is a threshold beyond which ads push the content below the fold. This is not a stretch, since Google already renders each page for the previews it shows alongside search results. They know exactly where the ad units sit.


I will admit that we had been aggressive with our ad placement. We took the plunge and removed AdSense entirely for more than a month, right through the Panda 2.2 update, but saw no improvement. Since then, we have only put AdSense back on a handful of articles.


We suspect that Google views affiliate links as ads, especially when they can bias the publisher toward a specific product. Dropping the majority of our affiliate links was easy, since only a few of them ever converted anyway. Needless to say, all of these changes really hit us where it hurts.



"Panda technology seems to have helped some scraper sites"-Michael Martinez, SEO Theory


Michael shared that it was hard to find examples of scrapers outranking the original authors, but he hits the nail on the head in the last line of that section. If Panda has not demoted your site, you will still rank ahead of the scrapers. Our site does not.


I have filed quite a few takedown requests since Panda hit, but that is not the only duplicate content we reviewed. Many of our own articles overlap because they cover similar (but separate) subjects. We set to work making sure each article could stand on its own merits with unique ideas and a fresh perspective. This was no easy task and is still a work in progress.
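
Here is a rough sketch of how one might flag overlapping article pairs for review, assuming the posts sit as plain-text files in a posts/ folder; the shingle size and similarity threshold are arbitrary starting points, not anything Google has published.

    import itertools
    from pathlib import Path

    def shingles(text, n=5):
        """Set of n-word shingles, used to approximate content overlap."""
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    posts = {p.name: shingles(p.read_text(encoding="utf-8")) for p in Path("posts").glob("*.txt")}

    # Flag article pairs whose Jaccard similarity suggests they cover the same ground.
    for (a, sa), (b, sb) in itertools.combinations(posts.items(), 2):
        if not sa or not sb:
            continue
        jaccard = len(sa & sb) / len(sa | sb)
        if jaccard > 0.2:
            print(f"{a} <-> {b}: {jaccard:.0%} overlap - consider merging or rewriting")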



"The + 1 button is a shortcut for 'it's not poorly cool' or"you must verify this."" Click on + 1 to something publicly give your seal of approval. -"Google + 1"


Bloggers have long known that social marketing (a good proxy for user experience) is an important part of your online identity and a great way to build readership. With moves like the +1 button, Google is shifting some of the power from site owners to the everyday web surfer. Before, we built relationships and lobbied webmasters for links, but that system was easily gamed. Now the end user's experience, and how they interact with your site, matters more than ever.


We have made many improvements, and in a way I am glad Panda had such a dramatic impact. Nothing else would have spurred the many changes we have made. Our site is being refined by fire, and the end result will be far better than before. Sometimes webmasters are just too close to their own product.


If you have ideas for overcoming a Panda demotion, or suggestions for how we can improve, I would love to hear them.
