Joe Levi:
a cross-discipline, multi-dimensional problem solver who thinks outside the box – but within reality™

Webmaster Tools Vulnerability


If you aren’t a web master or web developer, you’re probably not familiar with Google’s, Yahoo’s, and MSN’s webmaster tools. These are the “big 3” in search engines today, and each offers a fairly similar set of tools to help web masters with the indexing of the sites under their control.

For the purpose of this article I’ll use Google as an example, but the same basic concepts apply to MSN and Yahoo as well. Also, I use the terms “web master” and “web developer” interchangeably in this article, as the tasks exposed by these tools can be categorized as the responsibility of either.

For example, as the primary web developer for a site, I can tell Google where to find the sitemap file for the site (which helps Google, because they only have to spider the one file, rather than all the files on the site). This also helps me get my new pages indexed by Google that much faster, which is a very good thing.
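To make this concrete, here is a minimal sketch of the kind of sitemap file a web master might submit. The URLs are hypothetical placeholders; a real sitemap follows the sitemaps.org `<urlset>` schema, which is what this sketch emits.

```python
# Build a minimal sitemaps.org-style sitemap.xml document.
# The URLs below are hypothetical examples, not from the article.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a minimal <urlset> sitemap document listing the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about",
])
```

Because the search engine only has to fetch this one file, new pages listed in it tend to get picked up on the next crawl rather than whenever the spider happens to find them.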

Also, using these tools, if a site has gotten indexed inadvertently (say a test or development site that you don’t want the public to see), you can submit a removal request to have the site and its pages removed from the index, the crawl schedule, and the cache/archive.

Once my sitemap file has been submitted and crawled (usually within 48 hours), I have access to a whole slew of information about my site, including top search queries, incoming links to my site, and crawl errors. Using this information helps the web master fix any problems with the site (missing pages, etc.) and better craft relationships with external sites to increase cross-site relevancy.


The way all three search engines check whether the logged-in user is allowed to make changes usually involves one of two things:

  • adding a uniquely named .html file to the root of your website, or
  • adding a uniquely named meta tag to the head of your website’s default page.

Doing this effectively tells the search engine that you really do have control over the site and therefore are authorized to make changes to it – including all the bells and whistles that their tools give you.
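The meta-tag method boils down to the search engine fetching your default page and looking for the token it issued you. A rough sketch of that check, using Python’s standard HTML parser (the tag name and token below are hypothetical, not any engine’s actual format):

```python
# Sketch: does a page's <head> carry the verification token the engine issued?
# "site-verification" and "abc123" are made-up placeholders.
from html.parser import HTMLParser

class MetaTokenFinder(HTMLParser):
    """Collect name -> content for every <meta> tag in a page."""
    def __init__(self):
        super().__init__()
        self.tokens = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.tokens[attrs["name"]] = attrs["content"]

def page_has_token(html, meta_name, expected_token):
    finder = MetaTokenFinder()
    finder.feed(html)
    return finder.tokens.get(meta_name) == expected_token

page = '<html><head><meta name="site-verification" content="abc123" /></head></html>'
```

The uniquely named .html file method works the same way, except the engine checks that a file with the issued name exists at the site root instead of parsing the page.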

So far so good, right?

3rd Party Search Engine Optimization Companies

Most of us know that web masters and web developers really don’t like the minutia involved in search engine placement. It’s tedious, and not typically taught in any training or education curriculum. Not only that, most web masters want to administer servers and most web developers want to write code. Neither really wants to babysit search engines; after all, that’s more of a marketing job than an IT job.

To that end, many companies opt to hire 3rd party search engine optimization consultants or companies. These companies will typically evaluate the server configuration (and make improvement recommendations to the web master), page layout and coding (and make improvement recommendations to the web developer), the addition of specific files (such as the sitemap.xml file), etc.

So far so good, right?

Where all this breaks down

Once you have placed the authorization files on your server and the search engines have authenticated a certain user, that user has access to that search engine’s web master tools (including the ability to delete your site from its index).

Of course, any decent SEO/SEP/SEM company would never do this, but the fact remains that they can. For example, if you felt that your SEO company didn’t accomplish what you wanted and refused to pay them their fee, they might feel justified in removing the work they’d just done – by deleting your pages from the search engine’s index.

Google has a “workaround” for this: delete the verification files, then tell Google to “update verification.” Any Google account that no longer has an authorization file present will be shut out of the data immediately.

I haven’t found similar solutions for Yahoo or MSN.

Possible Solutions

Two potential solutions to this problem come to mind.

  1. Similar to how Google handles user access to their Analytics data for a given site, the web master should be authenticated as an administrator. That administrator can then give “user” access to another user. This “user” account would not be able to change user permissions and would not be able to do bulk deletions without the manual authorization of the administrator. Of course, the administrator would be able to remove (or “de-authorize”) any given user account at any time.
  2. Google (et al.) should re-authenticate the authorization token before allowing a user to access the corresponding web property. This way an administrator need only remove the token from the server to revoke access, and needn’t log in to Google and “update verification” after deleting said token.
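The second idea can be sketched in a few lines. The site is modeled here as a plain dict of path → contents (a stand-in for an HTTP fetch), and the file name and token are made-up placeholders; the point is only that access is re-checked on every request, so deleting the file revokes it immediately.

```python
# Sketch of solution 2: re-verify the token on every tools request,
# so removing the file from the server is, by itself, a revocation.
# Paths and tokens below are hypothetical placeholders.

def is_authorized(site_files, token_path, expected_token):
    """Access holds only while the token file still exists with the right contents."""
    return site_files.get(token_path) == expected_token

site = {"/google-abc123.html": "abc123"}
assert is_authorized(site, "/google-abc123.html", "abc123")      # SEO company has access

# The web master deletes the file; no "update verification" step is needed...
del site["/google-abc123.html"]
assert not is_authorized(site, "/google-abc123.html", "abc123")  # ...access is gone
```

Contrast this with the current behavior described above, where deleting the file does nothing until the web master also logs in and asks the search engine to re-check.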

Obviously, Yahoo and MSN need to do something to “de-authorize” users (if they haven’t already done so).

