Oops! Google did it again. It seems that Googlebot has forgotten the rules about which characters are allowed in a valid domain name and is happily indexing sub-domains containing invalid characters. These are sub-domains that are not accessible in all browsers, but Google seems comfortable with them. In this case the issue is spaces in sub-domains, for instance http://%20www.cyberinet.com/.
A cross-check between Firefox, IE and Opera yielded the following: Firefox displayed a "couldn't find server" error, IE didn't mind and displayed the page, and Opera stripped the space and displayed the correct page.
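For reference, the hostname rules that browsers are supposed to enforce come from RFC 1123: each dot-separated label may contain only letters, digits and hyphens, and may not begin or end with a hyphen. Below is a minimal sketch of such a check in Python (the function name and regex are my own illustration, not anything Google or the browsers actually use), showing why `%20www` and a literal leading space are both invalid:

```python
import re

# RFC 1123 label: letters, digits, hyphens; no leading/trailing
# hyphen; at most 63 characters per label.
LABEL_RE = re.compile(r"^[A-Za-z0-9](?:[A-Za-z0-9-]{0,61}[A-Za-z0-9])?$")

def is_valid_hostname(hostname: str) -> bool:
    """Return True if every dot-separated label is RFC 1123 valid."""
    if not hostname or len(hostname) > 253:
        return False
    return all(LABEL_RE.match(label) for label in hostname.split("."))

print(is_valid_hostname("www.cyberinet.com"))     # True
print(is_valid_hostname(" www.cyberinet.com"))    # False: leading space
print(is_valid_hostname("%20www.cyberinet.com"))  # False: '%' not allowed
```

By this rule neither the percent-encoded `%20` nor a raw space belongs in a hostname, which is why Firefox refused the URL outright while Opera silently repaired it.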
Good news for those who make a living abusing search results and stealing search traffic.