Idea: rate-limiting in browsers

Post by Thrawn »

Hi, folks.

I had an idea that I want to put to the open-source browser vendors, and I thought I'd bounce it off the great minds here first.

Many of the vulnerabilities in SSL/TLS require the attacker to make the victim's browser send a large number of requests to the target site: BEAST, Lucky 13, the attacks on RC4, and so on. Even much more compact attacks, like CRIME and POODLE, still require hundreds of requests per byte, and if simple mitigations are applied, CRIME could easily take thousands.

So, as a proactive measure, why not apply some rate-limiting when a page is sending an unreasonable number of requests to a site? Even busy AJAX shouldn't need more than, e.g., 1 request per second. So, if traffic from a page exceeds a threshold (say, 100 requests to the same site within one minute), the browser could start applying a 10-second delay between requests (exact parameters subject to discussion, testing, etc., of course).
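
To sketch the kind of bookkeeping I have in mind (rough and untested; the numbers are just the placeholders from above, and delayForRequest is a made-up helper, not a real browser API):

Code:

// Keep a per-site list of recent request timestamps. Once a page has made
// more than THRESHOLD requests to one site within a minute, every further
// request to that site gets delayed by DELAY_MS.
let THRESHOLD = 100;      // requests to the same site within one minute
let DELAY_MS = 10000;     // delay applied to each request beyond that
let recentRequests = {};  // site -> array of request timestamps (ms)

function delayForRequest(site) {
  let now = Date.now();
  let recent = (recentRequests[site] || []).filter(function (t) {
    return now - t < 60000;  // keep only the last minute
  });
  recent.push(now);
  recentRequests[site] = recent;
  return recent.length <= THRESHOLD ? 0 : DELAY_MS;
}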

Any thoughts on this?

Re: Idea: rate-limiting in browsers

Post by barbaz »

Speaking from the standpoint of a power user:

That's a good idea, not only as a security measure but also for those of us with limited bandwidth. Personally I'd also want a notification shown if a site is doing something like that.

I imagine it would be possible to implement something like that in a Gecko extension. Also, couldn't a NoScript surrogate based on your idea make it significantly harder for JS to do that sort of thing? Something like this, maybe (untested code):

Code:

let _busy = false;
let _xmlhttprequest = XMLHttpRequest;
Object.defineProperty(window, 'XMLHttpRequest', {
  value: function () {
    // Busy-wait until the flag clears, then claim the next 1-second slot
    // and call through to the original XMLHttpRequest.
    while (_busy) { continue; }
    _busy = true;
    window.setTimeout(function () { _busy = false; }, 1000);
    return _xmlhttprequest.apply(this, arguments);
  }
});

Re: Idea: rate-limiting in browsers

Post by barbaz »

Oops, nvm, you need to rate-limit XMLHttpRequest.prototype.send(), not "new XMLHttpRequest()".
(again, I didn't test this code, it's just a concept)

Code:

let _busy = false;
let _send = XMLHttpRequest.prototype.send;
Object.defineProperty(window.XMLHttpRequest.prototype, 'send', {
  value: function () {
    if (_busy) {
      // Can't busy-wait here (that would block the only JS thread, so the
      // timeout below could never clear the flag); instead, retry this send
      // once the current 1-second window has passed.
      let xhr = this, args = arguments;
      window.setTimeout(function () { xhr.send.apply(xhr, args); }, 1000);
      return;
    }
    _busy = true;
    window.setTimeout(function () { _busy = false; }, 1000);
    return _send.apply(this, arguments);
  }
});

Re: Idea: rate-limiting in browsers

Post by Thrawn »

That might work for simple cases, yes. I'd imagine that if it became standard practice, then there would be ways for sites to fight back (e.g. inserting img tags into the page, launching 10,000 asynchronous requests, and so on). Also, power users aren't the ones who need protection so much, because NoScript can already kill most of that. It's the masses who would benefit from it.
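
For example (illustration only, with a made-up URL), a page doesn't even need XMLHttpRequest to hammer a site:

Code:

// A flood of requests that never touches XMLHttpRequest, so a wrapper
// around XMLHttpRequest.prototype.send() would not see any of them.
for (let i = 0; i < 10000; i++) {
  let img = new Image();
  img.src = 'https://example.com/probe?i=' + i;  // unique URL defeats the cache
}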

Re: Idea: rate-limiting in browsers

Post by barbaz »

Well, "the masses" want things to "just work" and they don't know enough about this kind of thing to realize they should care. What if a page legitimately loads a huge number of images and scripts because it's badly designed and has a lot of ads? If that's rate limited, average users are going to grumble something like "Why is this taking so long to load, what's wrong with this internet".

So if this isn't for power users, maybe a better approach would be to pause the loading of resources and throw an alert in the users' faces, something like "This webpage is downloading an abnormally large number of files. This could indicate a security attack. Allow it to continue? [Read more] [No] [Yes]", and let them decide whether to apply rate limiting based on their expectations about the nature of the site...
(Also, that way they have a chance to know that if things slow down in such an instance, it could be the browser doing it deliberately to protect them; they won't be left thinking it's some random cyber-ghost that could be anywhere between them and the server.)
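
Something like this, maybe (again untested; window.confirm is just a stand-in for real browser UI, and the threshold/delay numbers are made up):

Code:

// Reworks the earlier send() wrapper: count requests, and once the page
// crosses a threshold, ask the user once whether to slow things down.
let _count = 0;
let _limit = null;  // null = not asked yet, true/false = the user's answer
let _send = XMLHttpRequest.prototype.send;
window.setInterval(function () { _count = 0; }, 60000);  // reset each minute

Object.defineProperty(window.XMLHttpRequest.prototype, 'send', {
  value: function () {
    _count++;
    if (_count > 100 && _limit === null) {
      // confirm() only has OK/Cancel, so "OK" here means full speed.
      _limit = !window.confirm(
        'This webpage is downloading an abnormally large number of files. ' +
        'This could indicate a security attack. Allow it to continue at full speed?');
    }
    if (_limit) {
      // The user chose to slow it down: defer this send by 10 seconds.
      let xhr = this, args = arguments;
      window.setTimeout(function () { _send.apply(xhr, args); }, 10000);
      return;
    }
    return _send.apply(this, arguments);
  }
});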

Re: Idea: rate-limiting in browsers

Post by Thrawn »

barbaz wrote:Well, "the masses" want things to "just work" and they don't know enough about this kind of thing to realize they should care.
Which is why they need protection from themselves. Although I would certainly want the parameters to be tunable for those who know enough to make their own decisions.
barbaz wrote:What if a page legitimately loads a huge number of images and scripts because it's badly designed and has a lot of ads?
Yeah, pages like that would be the major challenge, which is why it would have to be rate-limiting, not blocking. But there might be ways to make it manageable. For example, same-origin requests could be unfiltered, or have a very high threshold; they're not dangerous without XSS (which would make most TLS vulnerabilities superfluous). And if there are a lot of pages that really are smashing one site that fast (remember, we're talking hundreds in one minute; even a Facebook news feed probably isn't as aggressive as that, since it spreads the load and only loads more when you scroll down), then perhaps the delay would need to be something smaller, like 2-3 seconds; it'll still make attacks on RC4 infeasible, along with BEAST and co.
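
To illustrate the same-origin part (untested, and shouldRateLimit is just a made-up helper the limiter would call before counting a request):

Code:

// Only count cross-origin requests against the threshold; same-origin
// traffic passes through unfiltered, as described above.
function shouldRateLimit(requestUrl) {
  try {
    let target = new URL(requestUrl, window.location.href);
    return target.origin !== window.location.origin;
  } catch (e) {
    return true;  // unparsable URL: count it, to be safe
  }
}
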
barbaz wrote:So if this isn't for power users, maybe a better approach would be to pause the loading of resources and throw an alert in the users' faces, something like "This webpage is downloading an abnormally large number of files. This could indicate a security attack. Allow it to continue? [Read more] [No] [Yes]", and let them decide whether to apply rate limiting based on their expectations about the nature of the site...
I don't think that would work as stated. The users don't have any idea whether there's an attack going on, or any simple way to find out. They only know that they wanted to see the site. And the only way to see the site is to dismiss the dialog. So they will.

Maybe if, instead of saying anything about a possible attack, the browser simply warned that the page was loading a lot of resources and might slow things down? That presents the user with a meaningful decision. They know whether they have time to sit around. They know if they're on a metered connection. They know if there are other search results that might load faster. They can realistically choose yes or no without further input. It could be like the warning about hung scripts.

Re: Idea: rate-limiting in browsers

Post by barbaz »

Thrawn wrote:
What if a page legitimately loads a huge number of images and scripts because it's badly designed and has a lot of ads?
Yeah, pages like that would be the major challenge, which is why it would have to be rate-limiting, not blocking. But there might be ways to make it manageable. For example, same-origin requests could be unfiltered, or have a very high threshold; they're not dangerous without XSS (which would make most TLS vulnerabilities superfluous). And if there are a lot of pages that really are smashing one site that fast (remember, we're talking hundreds in one minute; even a Facebook news feed probably isn't as aggressive as that, since it spreads the load and only loads more when you scroll down), then perhaps the delay would need to be something smaller, like 2-3 seconds; it'll still make attacks on RC4 infeasible, along with BEAST and co.
This part is beyond my current level of technical expertise to comment on, sorry.
Thrawn wrote:Maybe if, instead of saying anything about a possible attack, the browser simply warned that the page was loading a lot of resources and might slow things down? That presents the user with a meaningful decision. They know whether they have time to sit around. They know if they're on a metered connection. They know if there are other search results that might load faster. They can realistically choose yes or no without further input. It could be like the warning about hung scripts.
Agreed, +1.
But it should also indicate that the *browser* is slowing things down. Otherwise users will say "why does this site load so slowly in [current browser version], when it loads significantly faster in [old browser version]?"

Re: Idea: rate-limiting in browsers

Post by yes_noscript »

*joining the thread as a normal NoScript user*
Is there a chance to implement this?

Re: Idea: rate-limiting in browsers

Post by barbaz »

yes_noscript wrote:*joining the thread as a normal NoScript user*
Is there a chance to implement this?
You mean in NoScript or in Firefox?

Re: Idea: rate-limiting in browsers

Post by yes_noscript »

In NoScript. But if there is a chance to build this into the browser, then fine. (I'm using Pale Moon, btw.)

Re: Idea: rate-limiting in browsers

Post by Thrawn »

Well, as previously mentioned, NoScript users don't need this so much. NoScript will already kill most attacks.

Theoretically, it is possible for an extension like NoScript to do just about anything the browser can. But I would think that this would not be core to NoScript, and would belong in a separate extension.