As part of Google's goal to make the web faster, we uploaded several video tips about optimizing the speed of your website. Check out the tutorials page to view the full set of tutorials and their accompanying videos.

Matt Cutts answered a new question each day from the Grab Bag:
And during Adam Lasnik's visit to India, he was interviewed by Webmaster Help Forum guide Jayan Tharayil about issues related to webmasters in India. We have the full three-part interview right here.

We'll get you started on this batch of videos with Matt's tips for targeting your site to a specific region:


Feel free to leave comments letting us know how you liked the videos, and if you have any specific questions, ask the experts in the Webmaster Help Forum.




Site design and architecture issues
Now that we've seen how malicious changes might affect your site and its traffic, let's examine some design and architecture issues. Specifically, you want to ensure that your site can be effectively crawled and indexed, which is a prerequisite for being shown in our search results. What should you consider?

  • First off, check that your robots.txt file returns the correct status code and is not returning a server error (a quick way to verify this is sketched after this list).
  • Keep in mind some best practices when moving to a new site, and take a look at the new "Change of address" feature recently added to Webmaster Tools.
  • Review the settings of the robots.txt file to make sure no pages -- particularly those rewritten and/or dynamic -- are blocked inappropriately.
  • Finally, make good use of the rel="canonical" attribute to reduce the indexing of duplicate content on your domain. The example in the presentation shows how using this attribute helps Google understand that a duplicate can be clustered with the canonical and that the original, or canonical, page should be indexed.
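If you'd like to sanity-check the robots.txt points above, here's a minimal sketch in Python; the site URL and sample path are hypothetical, and it's a rough spot-check rather than an official tool:

    import urllib.error
    import urllib.request
    import urllib.robotparser

    SITE = "https://www.example.com"  # hypothetical site

    # robots.txt should be reachable: 200 (or 404 if you deliberately have none),
    # never a server error -- "robots.txt unreachable" errors can stall crawling.
    try:
        with urllib.request.urlopen(SITE + "/robots.txt") as resp:
            print("robots.txt status:", resp.status)
    except urllib.error.HTTPError as err:
        print("robots.txt status:", err.code)

    # Spot-check that important pages -- particularly rewritten or dynamic URLs --
    # are not accidentally disallowed for Googlebot.
    parser = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
    parser.read()
    print("can fetch:", parser.can_fetch("Googlebot", SITE + "/products?id=123"))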


In conclusion, remember that fluctuations in search results are normal, but there are steps you can take to avoid malicious attacks or design and architecture issues that might cause your site to disappear or fluctuate unpredictably in search results. Start by learning more about attacks by hackers and spammers, make sure everything is running properly at the crawling and indexing level by double-checking the HTML suggestions in Webmaster Tools, and finally, test your robots.txt file in case you are accidentally blocking Googlebot. And don't forget about those "robots.txt unreachable" errors!



Not interested? Then let me introduce you to my dear friend PrettyGirlsWebCam1234; she says she's an old college friend of yours and has exciting photos and videos you might want to see.


You probably don't want your visitors' first impression of your site to include inappropriate images or bogus business offers. You definitely don't want your users hounded by fake invites to the point where they stop visiting altogether. If your site becomes filled with spammy content and links to bad parts of the web, search engines may lose trust in your otherwise fine site.

Why would anyone create spam profiles?

Spammers create fake profiles for a number of nefarious purposes. Sometimes they're just a way to reach users internally on a social networking site. This is somewhat similar to the way email spam works: the point is to send your users messages or friend invites with a fake or low-quality proposition, tricking them into following a link, making a purchase, or downloading malware.

Spammers also use spam profiles as yet another avenue to generate webspam on otherwise good domains. They scour the web for opportunities to get their links, redirects, and malware in front of users. They use your site because it costs them nothing, and they hope to piggyback on your good reputation.

The latter case is becoming more and more common. Some fake profiles are obvious, using popular pharmaceuticals as the profile name, for example; but we've noticed an increase in savvier spammers who try to use real names and realistic data to sneak in their bad links. To make sure their newly minted gibberish profile shows up in searches, they will also generate links on hacked sites, comment spam, and, yes, other spam profiles. This results in a lot of bad content on your domain, unwanted incoming links from spam sites, and annoyed users.

Which sites are being abused?

You may be thinking to yourself, "But my site isn't a huge social networking juggernaut; surely I don't need to worry." Unfortunately, we see spam profiles on everything from the largest social networking sites to the smallest forums and bulletin boards. Many popular bulletin boards and content management systems (CMSs) such as vBulletin, phpBB, Moodle, and Joomla generate member pages for every user who creates an account. In general, CMSs are great because they make it easy for you to deploy content and interactive features on your site, but auto-generated pages can be abused if you're not vigilant.

For all of you out there who do work for huge social networking juggernauts, your site is a target as well. Spammers want access to your large userbase, hoping that users on social sites will be more trusting of incoming friend requests, leading to larger success rates.

What can you do?

This isn't an easy problem to solve - the bad guys are attacking a wide range of sites and seem to be able to adapt their scripts to get around countermeasures. Google is constantly under attack by spammers trying to create fake accounts and generate spam profiles on our sites, and despite all of our efforts some have managed to slip through. Here are some things you can do to make their lives more difficult and keep your site clean and useful:

  • Make sure you have standard security features in place, including CAPTCHAs, to make it harder for spammers to create accounts en masse. Watch out for unlikely behavior - thousands of new user accounts created from the same IP address, new users sending out thousands of friend requests, etc. There is no simple solution to this problem, but often some simple checks will catch most of the worst spam.
  • Use a blacklist to prevent repetitive spamming attempts. We often see large numbers of fake profiles on one innocent site all linking to the same domain, so once you find one, you should make it simple to remove all of them.
  • Watch out for cross-site scripting (XSS) vulnerabilities and other security holes that allow spammers to inject questionable code onto their profile pages. We've seen techniques such as JavaScript used to redirect users to other sites, iframes that attempt to give users malware, and custom CSS code used to cover over your page with spammy content.
  • Consider nofollowing the links on untrusted user profile pages (a rough sketch of this approach appears after this list). This makes your site less attractive to anyone trying to pass PageRank from your site to their spammy site. Spammers seem to go after the low-hanging fruit, so even just nofollowing new profiles with few signals of trustworthiness will go a long way toward mitigating the problem. On the flip side, you could also consider manually or automatically lifting the nofollow attribute on links created by community members who are likely more trustworthy, such as those who have contributed substantive content over time.
  • Consider noindexing profile pages for new, not yet trustworthy users. You may even want to make initial profile pages completely private, especially if the bulk of the content on your site is in blogs, forums, or other types of pages.
  • Add a "report spam" feature to user profiles and friend invitations. Let your users help you solve the problem - they care about your community and are annoyed by spam too.
  • Monitor your site for spammy pages. One of the best tools for this is Google Alerts - set up a site: query along with commercial or adult keywords that you wouldn't expect to see on your site. This is also a great tool to help detect hacked pages. You can also check 'Keywords' data in Webmaster Tools for strange, volatile vocabulary.
  • Watch for spikes in traffic from suspicious queries. It's always great to see the line on your pageviews chart head upward, but pay attention to commercial or adult queries that don't fit your site's content. In cases like this where a spammer has abused your site, that traffic will provide little if any benefit while introducing users to your site as "the place that redirected me to that virus."
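To make a few of these suggestions concrete, here's a rough sketch in Python of treating new profiles conservatively. The helper names, trust thresholds, and blocked domains are hypothetical placeholders for whatever your own platform uses, not a drop-in solution:

    import re
    from collections import Counter

    # Hypothetical blacklist of domains you never want linked from profile pages.
    BLOCKED_DOMAINS = {"spammy-pills.example", "cheap-watches.example"}

    def is_trusted(profile):
        """Rough trust heuristic: older accounts with real, approved contributions."""
        return profile["age_days"] > 30 and profile["approved_posts"] >= 5

    def links_to_blocked_domain(profile_html):
        """True if the profile body links to a blacklisted domain."""
        text = profile_html.lower()
        return any(domain in text for domain in BLOCKED_DOMAINS)

    def profile_head_tags(profile):
        """Keep not-yet-trusted profiles out of the index until they earn trust."""
        return "" if is_trusted(profile) else '<meta name="robots" content="noindex">'

    def nofollow_profile_links(profile, profile_html):
        """Add rel="nofollow" to links on untrusted profiles.
        (Naive regex for illustration; use a real HTML parser in production.)"""
        if is_trusted(profile):
            return profile_html
        return re.sub(r'<a\b(?![^>]*\brel=)', '<a rel="nofollow"', profile_html)

    def suspicious_signup_ips(recent_signups, threshold=50):
        """Flag IP addresses that recently created an unusually large number of accounts."""
        counts = Counter(signup["ip"] for signup in recent_signups)
        return [ip for ip, count in counts.items() if count >= threshold]

Checks like these would plug into your signup and profile-rendering paths; the point is simply that a little friction for new, unproven accounts goes a long way.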


Have any other tips to share? Please feel free to comment below. If you have any questions, you can always ask in our Webmaster Help Forum.



The Webmaster Central team does our best to support the webmaster community via Webmaster Tools, the Webmaster Central Blog, the Webmaster YouTube Channel, Help Center, our forum, and a fellow named Matt Cutts.

If you've got ideas and suggestions for Webmaster Central - features you want, things we can do better - tell us. From now until Friday, July 24, 2009, Product Ideas for Webmaster Central will be open for feedback. Every suggestion you add will be seen not only by the Webmaster Central team, but by other users and webmasters. We'll review every submission, and we'll update you regularly with our progress and feedback.

The more feedback the better, so get started now.


While you're moving your site, you can test how Google crawls and indexes your new site at its new location by submitting a Sitemap via Google Webmaster Tools. Although we may not crawl or index all the pages listed in each Sitemap, we recommend that you submit one because doing so helps Google understand your site better. You can read more on this topic in our answers to the most frequently asked questions on Sitemaps. And remember that for any questions or concerns, we're waiting for you in the Google Webmaster Help Forum!
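If you'd rather generate that Sitemap with a small script than write the XML by hand, here's a minimal sketch in Python; the URLs are hypothetical, and the output follows the standard sitemaps.org format:

    from xml.sax.saxutils import escape

    # Hypothetical URLs at the new location of your moved site.
    NEW_SITE_URLS = [
        "https://www.new-example.com/",
        "https://www.new-example.com/about",
        "https://www.new-example.com/products/widgets",
    ]

    entries = "\n".join(
        f"  <url><loc>{escape(url)}</loc></url>" for url in NEW_SITE_URLS
    )
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)

Upload the resulting sitemap.xml to the new site and submit it in Webmaster Tools so we can start discovering the new URLs.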
Update: as mentioned here, we have introduced a new feature: Change of Address. Check it out if you are moving from one domain to another! By using this feature you will help us update our index faster and hopefully make the transition for your users smoother.



We just added external resource loading to our Flash indexing capabilities. This means that when a SWF file loads content from some other file—whether it's text, HTML, XML, another SWF, etc.—we can index this external content too, and associate it with the parent SWF file and any documents that embed it.
This new capability improves search quality by allowing relevant content contained in external resources to appear in response to users' queries. For example, this result currently comes up in response to the query [2002 VW Transporter 888]:


Prior to this launch, this result did not appear, because all of the relevant content is contained in an XML file loaded by a SWF file.

To date, when Google encounters SWF files on the web, we can:
If you don't want your SWF file or any of its external resources crawled by search engines, please use an appropriate robots.txt directive.



As of last week, after your request has been processed, we'll confirm this by sending a message to your Message Center in Webmaster Tools. (Prefer to be notified by email? You can do that too.) Sometime after you receive a reconsideration request confirmation message, check your site's performance in search results. If it's doing well, it means that Google has reviewed your site and believes that it adheres to our Webmaster Guidelines. If your site still isn't performing well in search, we recommend reviewing our Webmaster Guidelines and also checking out these possible reasons why your site might not be doing as well as you expect.




More information can be found in the Product Search Help Center.



We launched a preview of our new Webmaster Tools interface three weeks ago, and received a lot of valuable feedback. Most of you liked the update, appreciating features such as the one-stop dashboard, more top search query data, and the improved menu and navigation.

You offered some constructive feedback as well:
  • You missed the option to list 25, 50, or 100 rows in features such as links to your site. We did not add back the option to select how many rows you would like to see, but we increased the default to 100!
  • Top search query information differed between the old and new versions. We expected this since we went through a lot of re-engineering to improve the new top search queries backend. We reviewed many of the issues posted on our forums, and verified that the new backend is far more accurate and reliable.
  • Initially, the Sitemaps downloaded and indexed URL counts differed between the two versions. We resolved this issue quickly.
  • Backlink counts may differ between the old and the new user interface (UI), since the new UI shows the original link target (not following redirects) as it's linked on the web. Let's say example.com links to http://google.com, and http://google.com then 301s to http://www.google.com/ (a quick way to check where a URL redirects is sketched after this list):
    • In the new UI -- only verified site owners of google.com will see example.com's backlink (because we show the original link prior to any redirects)
    • In the old UI -- verified site owners of www.google.com could see example.com's backlink
  • The new site switcher lists only five sites, and some of you who manage a large number of sites found this limiting. We appreciate the feedback and will work on addressing this limitation in a future release.
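If you're unsure where a linked URL actually ends up after redirects, a quick hypothetical check in Python (not a Webmaster Tools feature) looks like this:

    import urllib.request

    ORIGINAL_LINK = "http://google.com/"  # the URL exactly as it appears in the link

    # urlopen follows redirects by default, so resp.url is the final destination.
    with urllib.request.urlopen(ORIGINAL_LINK) as resp:
        print("linked URL: ", ORIGINAL_LINK)
        print("resolves to:", resp.url)  # e.g. http://www.google.com/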
As of today, only the new user interface will be available (http://google.com/webmasters/tools)! You'll see that in addition to fixing many of the issues users raised, we took some time to launch a new feature: Change of Address. The Change of Address feature lets you notify Google when you are moving from one domain to another, enabling us to update our index faster and hopefully creating a smoother transition for your users.

Thanks to all the users who took the time to give us feedback on the new user interface. To those using it for the first time today, we hope you enjoy it. As always, your feedback is appreciated.


Page Speed is a tool we've been using internally to improve the performance of our web pages -- it's a Firefox Add-on integrated with Firebug. When you run Page Speed, you get immediate suggestions on how you can change your web pages to improve their speed. For example, Page Speed automatically optimizes images for you, giving you a compressed image that you can use immediately on your web site. It also identifies issues such as JavaScript and CSS that your page loads but doesn't actually use to display the page, which can help reduce the time your users spend waiting for the page to download and display.
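As a rough illustration of the kind of image optimization Page Speed suggests, here's a hypothetical snippet using the third-party Pillow library (not part of Page Speed itself) to re-save a photo with stronger JPEG compression; the file names are placeholders:

    from PIL import Image  # third-party Pillow library

    # Re-encode a large photo with stronger compression; for most photos the
    # visual difference at quality 80-85 is negligible, but the download is smaller.
    image = Image.open("hero-photo.jpg")
    image.save("hero-photo.optimized.jpg", "JPEG", quality=80, optimize=True)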

Page Speed's suggestions are based on a set of commonly accepted best practices that we and other websites implement. To help you understand the suggestions and rules, we have created detailed documentation to describe the rationale behind each of the rules. We look forward to your feedback on the Webmaster Help Forum.
