
Pages fail to load at all or very slowly with "allow sites"

Posted: Tue Jun 14, 2011 3:36 am
by gracefool
When I run ABE with "Allow sites to push their own rulesets" enabled, there is a massive slowdown, especially when starting the browser with multiple tabs open; many tabs fail to load at all (including about:addons). Firefox is practically unusable.

It took me a while to track this down. I'm using NoScript 2.1.1.1, Firefox 4.0.1, and a bunch of other extensions, with only the default ABE ruleset.

Re: Pages fail to load at all or very slowly with "allow sites"

Posted: Wed Jun 15, 2011 3:15 pm
by therube
(Was/is "Allow sites to push their own rulesets" experimental?)

Re: Pages fail to load at all or very slowly with "allow sites"

Posted: Wed Jun 15, 2011 6:41 pm
by GµårÐïåñ
Generally it shouldn't take much overhead to check whether the server is pushing a ruleset. But if you are loading a large enough list of different domains simultaneously and it has to check each one, then yes, it's reasonable to expect a toll on loading speed while it validates each site. I personally like to control my rulesets and don't have that option checked; why do you want to run with it on anyway? Although the feature is great and has potential, it is still a ways away from being standardized and actually deployed and validated by domain hosts/webmasters, so run without it.
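
For reference, a site-pushed ruleset is just a plain-text rules.abe file served from the site's root. A hypothetical example (example.com is a placeholder; the directives follow ABE's documented rule syntax, but this is only an illustration, not a recommended policy):

    # Hypothetical https://www.example.com/rules.abe
    Site .example.com
    Accept from SELF      # same-site requests are fine
    Accept GET            # plain GETs from anywhere are fine
    Deny                  # block everything else (e.g. cross-site POSTs)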

Re: Pages fail to load at all or very slowly with "allow sites"

Posted: Thu Jun 16, 2011 4:45 am
by gracefool
I'm not loading a massive number of domains. There's a bug; the slowdown happens even with a few tabs.

Re: Pages fail to load at all or very slowly with "allow sites"

Posted: Thu Jun 16, 2011 10:03 am
by Giorgio Maone
gracefool wrote: I'm not loading a massive number of domains. There's a bug; the slowdown happens even with a few tabs.
Unfortunately it's not a bug: for this feature to work, the first time a certain domain is hit with an HTTPS request, its "rules.abe" file must be searched for, and no other request to the same domain can be initiated until the file is retrieved or established as not present.

This means that the number of requests is doubled, and they cannot even be parallelized (CSRF, the threat handled by ABE, happens on server hit, not on content load).
If the site is not very fast (and HTTPS sites usually are not), and you're downloading multiple sites at once (as happens on session restore), the result can be dramatic.
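
Schematically, the gating looks something like this (just an illustrative TypeScript sketch, not the actual implementation; names are made up):

    // Illustrative sketch only. One lookup promise per host: the first HTTPS
    // request to a host kicks off the rules.abe fetch, and every other request
    // to that host waits on it before it can proceed.
    const rulesetLookup = new Map<string, Promise<string | null>>();

    async function lookupSiteRules(host: string): Promise<string | null> {
      try {
        const res = await fetch(`https://${host}/rules.abe`); // extra round-trip
        return res.ok ? await res.text() : null;              // 404 => no site rules
      } catch {
        return null; // network failure: treat as "no ruleset"
      }
    }

    async function gatedFetch(url: string): Promise<Response> {
      const host = new URL(url).host;
      if (!rulesetLookup.has(host)) {
        rulesetLookup.set(host, lookupSiteRules(host));
      }
      // Requests to this host cannot run in parallel with the lookup: they
      // wait until rules.abe is retrieved or established as absent.
      await rulesetLookup.get(host);
      return fetch(url);
    }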

This feature is turned off by default for two reasons: most sites don't deploy any rules.abe file (so the first request is always a 404, and you end up spamming the server's logs), and the performance hit is such that only motivated users (e.g. corporate users who need to access protected resources) can bear it.

Nonetheless, a low-priority item on my TODO list is finding ways to mitigate this issue (even though it's very hard by design, because of the way CSRF works).

Re: Pages fail to load at all or very slowly with "allow sites"

Posted: Thu Jun 16, 2011 10:32 am
by gracefool
Ah thanks Giorgio. It would be nice if there were a warning about the severe performance impact in the GUI.