RETURN $ecure;

Security, Technology and Life

Posts Tagged ‘XSS’

90% Exploitable – Is this progress?


It’s been nearly three years since many of us estimated that 9 out of 10 sites had at least one flaw, while most had more. I have not been too active in the security world as of late (though this will change soon!), but I would have hoped we had made some sort of progress. It seems XSS is still amazingly pervasive, and CSRF, the now-waking giant, is not far behind.

As Darkreading reports, WhiteHat has issued a press release stating that around 9 of 10 sites have at least one vulnerability, while the average site has around six or seven. I have rarely seen WAFs as the solution, but even after a few years, nearly an eternity on the internet, little to no progress has obviously been made. So perhaps it is finally time. In the whitehats’ defense, though, the odds are amazingly against them. Over a hundred million sites operate now, and the 1 in 10 sites that is safe is often brochure-ware: a site with little or no interactivity, just static HTML on secure servers.

Perhaps we ARE making developers more security-minded and making progress. I do remember saying this a while back:

Many sites are vulnerable to XSS, and since all Websites change, eventually another XSS hole will probably open up on sites previously thought [of as] safe.

This seems to remain fairly true today. Interactive websites, by their very nature, are revamped fairly often; everything about them is dynamic and, apparently, very insecure.

Oh well. At least with my inactivity as of late, I won’t be heading to an early grave.


Written by Rodney G

04/10/2008 at 1:19 am

Posted in Security, Technology


UserJS URL Sanitizing


I was reading a post by RSnake over at Darkreading and got to thinking about client-side security. There seems to be very little the average user can do against most attacks. NoScript is fine for a tech-minded individual, but the average user will probably forget about it and wonder why a site is now missing functionality.

So what do you think of some JavaScript that could check the URL for typically bad characters (since JS can easily find HTML entities, URL encoding, etc.) and then sanitize them somehow? This could mean removing them or properly entifying them. Sure, it’s fine in principle, but Greasemonkey scripts only run after a page has loaded. How could we do this earlier? Let’s take a look at UserJS in Opera.

User JavaScript is loaded and executed as if it were a part of the page that you visit. It is run immediately before the first script on the page. If the page does not contain any scripts of its own, User JavaScript will be executed immediately before the page is about to complete loading. It is usually run before the DOM for the page has been completed. (Note that this does not apply to Greasemonkey scripts.) […] User JavaScript will not be loaded on pages accessed using the opera: protocol. By default, it is also not loaded on pages accessed using the https: protocol.

Oh! So it runs before any other script on the page. This is good: we can check whether a script was injected, then proceed to remove it. But what if the injection is inside existing JavaScript? It will be hard to tell whether it’s valid or not. Well, since we are using UserJS already, let’s look at the UserJSEvent object and event listeners.

if( location.hostname.indexOf('example.com') != -1 ) {
  // Fires for every SCRIPT element on example.com pages, before it executes.
  window.opera.addEventListener('BeforeScript',
    function (e) {
      // e.element is the SCRIPT element; its text property is writable,
      // so the source can be rewritten before the browser runs it.
      e.element.text = e.element.text.replace(/!=\s*null/, '');
    },
    false
  );
}

BeforeScript
Fired before a SCRIPT element is executed. The script element is available as the element attribute of the UserJSEvent. The content of the script is available as the text property of the script element, and is also writable:

UserJSEvent.element.text = UserJSEvent.element.text.replace(/!=\s*null/, '')

So with this, we can check the text of a script element before it runs and sanitize it, which we could do only when it contains echoed content. Just a note that this isn’t restricted for off-site JS like it can be in some browsers: UserJS has full access to remote files loaded via script src, even before they execute. The hardest part is obviously the sanitizing itself, but with some work I don’t see it being a huge issue for some basic client-side XSS protection. I’m sure you could even expand it to search all scripts for commonly malicious patterns, like sending document.cookie somewhere, to help protect against persistent XSS.
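To make the idea concrete, here is a rough, untested sketch of what such a UserJS file might look like. It assumes the file (say, sanitize.js) sits in Opera’s User JavaScript directory; the looksInjected() heuristic and every other name in it are placeholders of my own, not a finished design.

// sanitize.js - a rough, untested sketch of the idea above.
(function () {
  var query = location.search + location.hash;
  try {
    // Decode the way the server most likely did before echoing it back.
    query = decodeURIComponent(query);
  } catch (err) {
    // Malformed percent-encoding; keep the raw string, it is suspicious anyway.
  }

  // Crude heuristic: characters that rarely belong in a legitimate URL.
  function looksInjected(s) {
    return /[<>"']|javascript:/i.test(s);
  }

  if (!looksInjected(query)) { return; }

  // The URL looks tampered with, so split it into individual parameter
  // values and neuter any script that echoes one of them verbatim.
  var parts = query.replace(/^[?#]/, '').split(/[&=#]/);

  window.opera.addEventListener('BeforeScript', function (e) {
    var source = e.element.text || '';
    for (var i = 0; i < parts.length; i++) {
      if (parts[i] && looksInjected(parts[i]) && source.indexOf(parts[i]) != -1) {
        // Blank the whole script rather than trying to repair it.
        e.element.text = '';
        return;
      }
    }
  }, false);
})();

This would only catch payloads echoed verbatim into script elements; injected markup or inline event handlers would slip right past it, and, as noted above, the real work is in smarter sanitizing.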

Anyways, I’d love to hear feedback on this idea before I go run off and make it.

Written by Rodney G

11/21/2007 at 6:03 pm

Posted in Security


Mobile Zombies, XSSWW, hack the planet?


Warning, this post may be long, rant-like and totally off-target. 😛

Using bi-directional persistent communication channels to control browsers isn’t anything new, nor is the concept of a Cross-Site Scripting Warhol Worm, but recently I have been thinking about them again. First off, I was discussing a concept regarding mobile zombies in the #slackers IRC channel earlier. I recently got a new phone and found out it has a fairly fast connection to the internet. Some phones can even reach 4.9 Mbit/s! This opens a whole new area, especially if malicious users can harness it. It seems at least 2.7 billion people own a mobile phone. If even a small percentage of those users have high-speed internet access, that’s still much more attack surface and data throughput. Plus, phones are often on longer than a home PC. “Follow the sun” no longer applies.

So, enough information and theory: is this possible? Can we conscript mobile phones into a giant botnet? Well, to be honest, I really have no idea. I have no statistics on which phones can run JavaScript in their browser or which browsers people use for mobile browsing, nor the resources to test any of this. But for the sake of this post, let’s assume at least 5% of those 2.7 billion people have high-speed internet on their mobile phone. That’s 135 million people. Since they would be using newer models of phone, let’s assume at least 80% of them have some sort of vulnerable web technology enabled (JavaScript, Flash, Java (probably this…)). That’s still a little over 100 million phones. Now don’t get too excited; I doubt anyone could infect all of them. So how could we infect them? It’s pretty simple: persistent XSS, tricking users into downloading Java viruses, etc.

So I went a little too in-depth on the mobile zombienet. Sue me. It seems possible and something to consider.

Anyways, back to the XSSWW. While RSnake claimed it wasn’t fiction in his post, at the time it seemed like the technologies and attacks that could be used for something like that didn’t really exist yet. Now they do. It doesn’t seem very far-fetched, or hard for that matter. Here’s the little process my mind went through imagining how a worm like this would work. First, one would need a few 0day XSS holes: preferably at least one in a major forum software like phpBB or vBulletin, and another in a web-based instant messaging service such as MSN Web Messenger or Meebo.com. Obviously the initial attack would be over the forum software. It could use search engines to find other vulnerable installs of the forum to propagate. I imagine some sort of algorithm would be needed to choose a random result so the same forum wouldn’t be infected over and over. Infected users would have their browser window hijacked with a full-screen iframe so we could keep control longer, then be zombified using AttackAPI or similar tools. Then we could use the CSS history hack to find which social networking sites, web-based instant messengers, and so on the user has visited that we have a vulnerability in. For an IM site, we could hijack the user’s contact list and find ways to infect those people as well, perhaps using a JavaScript XSS scanner or the PDF XSS to find a reflective XSS hole and repeating the process on the stolen contact list. Then, of course, we could do anything we wanted, from DDoSes to using stolen MSN login credentials to send spam, or any of the other usual bad deeds.

Now the key problems with this scenario are obviously losing control of zombies and overloading the control channel with traffic. Since the scale would theoretically be huge, we could easily increase the interval of requests to the channel immensely and keep only one message in the queue for all zombies at a time; you then change that message when you want to change objectives. Now, assuming XSS vulnerabilities get fixed and we couldn’t renew our supply of lost zombies, we would have a problem, unless we created a JavaScript function that changed something in the worm: the propagation methods and the XSS vectors used. ;D Since we would more than likely have one or more central control locations, another thing a client could request is a series of XSS vectors to try on specific sites, probably an XML document containing these things as well as the next place to request details from. (Then you could compromise different servers all the time in an attempt to hide your own identity.)
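For illustration only, the document each zombie polls for might look something like this; the format, element names, and URL here are purely hypothetical:

<command interval="3600" next="http://another-compromised-host.example/next.xml">
  <objective>idle</objective>
  <!-- per-site injection strings for the client to try -->
  <vectors>
    <vector site="forum.example">...</vector>
  </vectors>
</command>

One small document fetched on a long interval keeps the channel traffic low while still letting the controller rotate objectives, vectors, and even its own location.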

So, combining the new power of mobile zombies with some theory about how a Warhol worm would work, we have a very scary scenario. I really have no idea how to stop something like this. I think I’ll go unplug my Ethernet cord now.

P.S. Sorry if you read all of that.

Written by Rodney G

11/14/2007 at 8:02 pm

Posted in Security


Opera to support HttpOnly


Heya. I haven’t blogged in a while, but I do want to start getting back into it.

So, we’ll start with something small.

I read this article the other day about updates coming in Opera 9.5 and was pleasantly surprised to read that it will support HttpOnly cookies. If you don’t know what that is, here’s a quick run-down. Normally, cookies can be accessed through scripts, for example via document.cookie in JavaScript. Along with the normal attributes in the Set-Cookie header, you can mark a cookie as HttpOnly. This means the cookie cannot be read by any means other than being sent in an HTTP request. This mitigates using XSS to steal credentials, since you can no longer read the cookie with JavaScript and send it out, though it obviously doesn’t stop phishing. Many sites already use HttpOnly cookies, but currently only Internet Explorer supports the flag. In a browser that doesn’t support it, the cookie is simply treated as a normal cookie. To be fair, Firefox 3 is planned to support it as well, but it seems Opera 9.5 will be out before FF3. At any rate, while this won’t stop XSS, it basically eliminates the risk of cookie theft via script.
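For example, a server sets such a cookie by simply appending the flag to the header it already sends (the cookie name and value here are made up):

Set-Cookie: SESSIONID=a3fWa9x2; path=/; HttpOnly

In a browser that supports the flag, document.cookie simply won’t include SESSIONID, so an injected script has nothing to read and send out.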

Just remember though, cookies aren’t the only credentials that can be stolen with XSS or other methods. ‘Dynamic’ phishing attacks that don’t rely on a third-party site are still somewhat hard to detect and should be watched out for.

Written by Rodney G

05/10/2007 at 1:37 pm

Posted in Uncategorized


[Paper] Anatomy of a Worm


Here it is.

I just wrote a paper exclusively for SudoLabs.com. It’s about the worm I wrote targeting GaiaOnline.com, aptly named “gaiaworm”. This is the third version of the worm and the first time I’ve ever really written a paper.

Edit 1 – If you link to the paper, link to this blog post instead. I will be updating it with links to wherever it has been published, as well as other updates. Thanks.

Edit 2 – XSSed.com now has a copy of it. View it here.

Edit 3 – Apparently the SudoLabs forums are dead for now. View the XSSed.com copy above!

Written by Rodney G

02/10/2007 at 1:26 am

Posted in Uncategorized


Keep an eye out.


Soon I’ll be releasing a small paper to SudoLabs.com and XSSed.com.

Keep an eye out.

Written by Rodney G

02/9/2007 at 12:59 am

Posted in Uncategorized


Cross-Site Scripting: The book!


I had figured this would happen for a while, then he told us he was working on it, and here it finally is!

RSnake (a.k.a. Robert Hansen) has literally written the book on XSS. Well, he is a contributor, along with Seth Fogie, Jeremiah Grossman, and Anton Rager. I’m really excited about this. According to the Amazon.com description, it covers the basics as well as some of the more bleeding-edge stuff. It’s set for release on March 1, 2007.

You should go pre-order it now. I know I’ll be picking this up.

Written by Rodney G

02/6/2007 at 10:21 am

Posted in Uncategorized
