ABE: A few questions

Discussions about the Application Boundaries Enforcer (ABE) module

Post by TestingABE »

Hi!

I've got a few questions that I couldn't answer by myself or by searching the net.

1) Let's assume that I put the following rule in ABE:

Code:

Site *
Accept from SELF
Deny

That sure is one restrictive rule. Assuming there are no other user rules, is it 100% guaranteed that external sites will know NOTHING? Links will still appear in Adblock Plus as unfiltered. That's only due to the fact that ABP checked the page before NoScript, right? But still, no request has been performed anywhere except to the exact site we're browsing (according to the rule)? I assume so, but wanted to make sure before getting to the other questions.

2) Now, many sites should be able to communicate with their subdomains, like informaction.com, www.informaction.com and forums.informaction.com for instance, with the exception of sites like Akamai, which would require separate, higher-priority rules. How can I set up a global rule that would look like this:

Code:

Site (ANY_SITE)
Accept from (all-subdomains.ANY_SITE)
Deny
3) Does ABE protect efficiently against web bugs when configured properly? It seems so. Doesn't it make RequestPolicy useless when configured properly? How about ABP with EasyPrivacy? How about ABP with EasyList?

4) When I try to watch a youtube video using these (restriction-happy) rules:

Code:

Site *
Accept from SELF

Site ytimg.com *.ytimg.com googlevideo.com *.googlevideo.com
Accept GET from youtube.com *.youtube.com

Site http://208.117.250.16 http://208.117.250.18 http://208.117.250.165
Accept GET from youtube.com *.youtube.com googlevideo.com *.googlevideo.com

Site *
Deny
It worked in my tests, but since the IP list is incomplete, it will only work sometimes. It seems that these IPs belong to DNS servers named DNS1.SJL.YOUTUBE.COM and DNS2.SJL.YOUTUBE.COM, which have registered the net range 208.117.224.0 - 208.117.255.255 (according to the whois). If I strip the http:// off these IP addresses in the rule, the ABE engine doesn't process them. If I put DNS1.SJL.YOUTUBE.COM, same thing. When I put http://dns1.sjl.youtube.com, the engine processes it properly, but ABE blocks the requests as it doesn't link the address to the IP.
So my question is, how can I make a rule for dns1 and dns2.sjl.youtube.com? Is it possible to allow the IP range from 208.117.224.0 to 208.117.255.255? Is it certain that these IPs will only be assigned to those two servers for at least the next couple of years?
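As an aside, that whois range 208.117.224.0 - 208.117.255.255 corresponds exactly to the CIDR block 208.117.224.0/19, which a few lines of Python can confirm (the IPs below are the ones from the rule above):

```python
import ipaddress

# The whois range 208.117.224.0 - 208.117.255.255 is one /19 block:
# 32 - 19 = 13 host bits, i.e. 2**13 = 8192 addresses.
block = ipaddress.ip_network("208.117.224.0/19")
print(block[0], block[-1])  # -> 208.117.224.0 208.117.255.255

# All three IPs from the youtube rule fall inside that block:
for ip in ("208.117.250.16", "208.117.250.18", "208.117.250.165"):
    print(ip, ipaddress.ip_address(ip) in block)  # -> True for each
```

So a hypothetical CIDR-aware rule syntax would only need a single entry to cover the whole range.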

5) There may be a little bug with ABE logging in the error console. When trying to watch a video on youtube without enabling Javascript (for test purposes), I grabbed the swf link and its required parameters from the page source. It comes down to something like this: hxxp://s.ytimg.com/yt/swf/watch-vfl105721.swf?video_id=ne7P4csDLWY&t=vjVQa1PpcFN30F4DLOAoPykIFTGVHGzG5Gm2DKWSiCg=
With the rules above (and the right IPs), it did not work. Of course, it needed a new rule for *.youtube.com to allow GET requests from *.ytimg.com... But for that one, ABE didn't log any blocked request even though it was indeed blocking. Note that when I had to update the IP list, ABE in those same circumstances did tell me it was blocking requests to the new IP. For some reason it just didn't log the fact that *.youtube.com wasn't allowed to get requests from *.ytimg.com.

6) Lastly, what does this parallelization of DNS requests do? Does it resolve all the domain names linked in a page even if we don't click them? If so, does it obey the network.dns.disablePrefetch about:config pref and get disabled when that pref is set to true?


Thanks a lot!
Gotta love NoScript.

Re: ABE: A few questions

Post by TestingABE »

Oh, forgot one.

7) Any time we click a link on siteA.com that goes to siteB.com, siteB.com receives a GET request from siteA.com, right?
So, in the context of a ruleset that denies everything except what is explicitly allowed, the global rule for all sites would be

Code:

Site *
Accept from SELF
Accept GET from *
Deny
In the case where we'd like the strictest possible rules, assuming the referrer is stripped from those GET requests, is there a better rule than this one? (Of course other site specific rules would go somewhere between Site * Accept from SELF and Site * Deny)
What do siteB.com and siteA.com know, if I allow such GET requests (with no referrer)?
Giorgio Maone
Site Admin

Re: ABE: A few questions

Post by Giorgio Maone »

TestingABE wrote: Let's assume that I put the following rule in ABE:

Code:

Site *
Accept from SELF
Deny
That sure is one restrictive rule. Assuming there are no other user rules, is it 100% guaranteed that external sites will know NOTHING?
First, ABE's aim is not controlling how sites "see" things (i.e. privacy), but how sites can interact with each other (i.e. web application isolation).
That said, the two things may overlap, even though the conceptual "direction" of ABE rules is usually the opposite of what many people seem to assume.
In a CSRF-protection perspective, the "Site" clause defines the site(s) we want to protect, and the "from" clause identifies friends (if action is "Accept") or foes (if action is "Deny").

However, provided that you don't allow 3rd-party cookies or referrers to be sent, that rule does make it "100% guaranteed that external sites will know nothing", in your words.
TestingABE wrote:Links will still appear in Adblock Plus as unfiltered. That's only due to the fact that ABP checked the page before NoScript, right? But still no request has been performed anywhere except on the exact site we're browsing (according to the rule) ? I assume so but wished to make sure before getting to other questions.
Yes, that's all correct. The processing order is ABP > NoScript content policy blocking > XSS filters > ABE rules.
TestingABE wrote: Now, many sites should be able to communicate with their subdomains, like informaction.com, www.informaction.com and forums.informaction.com for instance. With the exception of sites like Akamai, which would require separate, higher-priority rules. How can I set up a global rule that would look like this:

Code:

Site (ANY_SITE)
Accept from (all-subdomains.ANY_SITE)
Deny
You currently cannot (there's no "subdomain of Site" meta-identifier), but I can evaluate something like that as an enhancement, e.g. something like
Site * Accept from *.SITE, where "SITE" would be replaced with the current Site match.
TestingABE wrote: 3) Does ABE protect efficiently against web bugs when configured properly? It seems so. Doesn't it make RequestPolicy useless when configured properly? How about ABP with EasyPrivacy? How about ABP with EasyList?
Yes, ABE can surrogate all those tools.
However, they all have their specializations, which may make them valuable even when used together.
RequestPolicy, for instance, makes it really easy for newbies to set up exceptions to generally restrictive rules, even though its domain-level resolution is less flexible than ABE's. This will be mitigated when ABE gets its own rule wizard, but that's the future.
Most Easy* items will be technically replaceable by ABE subscriptions, once those are activated, but keeping them updated with the same devotion is another story.
Actually, adding a bridge which automatically translates ABP filtersets into ABE rulesets should not be exceedingly difficult, at least for the "general" URL filters (i.e. those which don't have content-type modifiers such as $script, $object and the like).
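For the simplest filter shape, such a bridge is easy to imagine. The sketch below is purely hypothetical (neither ABP nor NoScript ships anything like it): it turns ABP domain-anchor filters of the form "||domain^" into ABE Deny rules, skipping any filter that carries a $-modifier, as suggested above.

```python
def abp_to_abe(filters):
    """Translate simple ABP "||domain^" blocking filters into ABE Deny
    rules. Hypothetical sketch: filters with content-type modifiers
    ($script, $object, ...) are skipped, as are all other filter shapes."""
    rules = []
    for f in filters:
        f = f.strip()
        if "$" in f:  # content-type modifier: no ABE equivalent, skip
            continue
        if f.startswith("||") and f.endswith("^"):
            domain = f[2:-1]
            # Deny the domain and all of its subdomains, from anywhere.
            rules.append(f"Site {domain} *.{domain}\nDeny")
    return "\n\n".join(rules)

print(abp_to_abe(["||tracker.example^", "||ads.example^$script"]))
# -> Site tracker.example *.tracker.example
#    Deny
```

A real bridge would also have to handle ABP's path patterns, wildcards and exception rules, which don't all map cleanly onto ABE's site-level model.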
TestingABE wrote: So my question is, how can I make a rule for dns1 and dns2.sjl.youtube.com ? Is it possible to allow the IP range from 208.117.224.0 to 208.117.255.255 ?
Working with IPs will be much easier in the next release. At the moment it's quite hairy, and the only DNS resolution working really well is the one fueling the "LocalRodeo" SYSTEM rule.
TestingABE wrote: Is it certain that these IPs are only given to those two servers during at least the next couple years?
Alas, you can only ask the YouTube staff, and they can always change their minds, too ;)
TestingABE wrote: ABE didn't log any blocked request
What do you mean by "logging" here?
ABE displays a notification for DENYed document loads only, and logs everything else in Tools|Error Console.
Do you refer to the former or to the latter?
TestingABE wrote: Lastly, what does this parallelization of DNS requests do? Does it resolve all domain names linked in a page even if we don't click them?
Nope. It just lets the request initiator keep doing its business (i.e. processing other requests) while ABE waits for a DNS response, when one is needed to decide a certain rule's outcome.
The DNS query, if needed, is performed only after a request has already been decided (it passed ABP, NoScript and the XSS filters) and only if in-browser DNS is allowed by your proxy settings.
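The pattern being described is ordinary asynchronous resolution. A minimal sketch in Python (not NoScript's actual Mozilla code, which uses the platform's async DNS service): start the lookup in the background, keep working, and only block on the answer when the rule outcome actually depends on it.

```python
import socket
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=4)

def start_lookup(host):
    # Fire off the DNS query in a background thread; returns immediately
    # with a Future, so the caller is not blocked.
    return executor.submit(socket.gethostbyname, host)

pending = start_lookup("localhost")
# ... the initiator is free to process other requests here ...
# Only now, when the rule outcome needs the IP, do we wait for the answer:
print(pending.result())
```

The key point matches the answer above: the lookup's cost is paid only by the one request whose rule needs it, not by the whole request pipeline.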
TestingABE wrote: Any time we click on a link on siteA.com that goes to siteB.com, siteB.com receives a GET request from siteA.com, right?
So, in the context of a ruleset that denies everything except what is explicitely allowed, the global rule for all sites would be

Code:

Site *
Accept from SELF
Accept GET from *
Deny
In the case where we'd like the strictest possible rules, assuming the referrer is stripped from those GET requests, is there a better rule than this one? (Of course other site specific rules would go somewhere between Site * Accept from SELF and Site * Deny)
What do siteB.com and siteA.com know, if I allow such GET requests (with no referrer)?
Looks like you're still seeking privacy from ABE, and as I said, that's not its primary aim.
siteB.com can still learn a lot about your usage of siteA.com if you let siteA.com include an image or another auto-loading resource like
http://siteB.com/?ref=http://siteA.com
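To spell out why stripping the Referer header doesn't help in this case: the referring page is carried inside the beacon URL itself, so siteB recovers it from the query string alone. A short sketch (the URLs are just the illustrative ones from above):

```python
from urllib.parse import urlencode, urlparse, parse_qs

# siteA embeds a "pixel" whose URL carries the current page as a parameter:
beacon = "http://siteB.com/?" + urlencode({"ref": "http://siteA.com/article"})

# siteB's server recovers the originating page from the query string,
# no Referer header needed:
origin = parse_qs(urlparse(beacon).query)["ref"][0]
print(origin)  # -> http://siteA.com/article
```

This is why only blocking the cross-site request itself (as ABE does) stops the leak, while referrer stripping alone does not.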

Re: ABE: A few questions

Post by TestingABE »

Thank you for the fast reply :)

I have read many of your posts where you say that ABE's primary aim is CSRF protection and not privacy, and I actually wanted to acknowledge that in my first post but forgot to. I understand why you want to make that clear, and if I still ask privacy-related questions, it's because I feel ABE has potential for wider use, provided it is configured properly by the user. The main reason I'm trying to use ABE for that purpose is that the fewer extensions in my Fx, the better. And because I trust your thoroughness and your working capacity.
I have a handful of disabled extensions that I enable in specific (uncommon) situations. Occasionally I use the -no-remote flag to run several Fx profiles at once. In everyday use, I only enable a carefully configured NoScript and Cookie Monster (CookieSafe was a lot slower when I tested it about a year ago, due to the fact that Fx had tons of stuff blacklisted by Spybot S&D. Don't know if CS Lite works better.). I use ABP too, but I don't want to block advertising; what annoys me is tracking (unfortunately, attempting to defeat the latter tends to defeat the former...). I'm also looking at RequestPolicy, but it apparently hasn't been tested enough by the Mozilla team, so I'm probably going to wait.

What I'm getting at is: if NoScript, with a careful configuration, can replace two excellent addons (ABP and RequestPolicy), I'm willing to spend some time informing myself. And if it turns out that ABE isn't better in the privacy area than these addons, I won't be questioning your skills: I know that I'm trying to use the tool in a "secondary" way... I'm sure that RequestPolicy allows just what I'm looking for when I play around with ABE, trying to see if it can isolate sites from one another unless I say otherwise. But I'm still reluctant to use an unreviewed addon, to add yet another one, and to do so when ABE should be able to do the same within the next few months. I want to give ABE a try at filling that privacy purpose.


Now that that's cleared up, a couple of replies:
Giorgio Maone wrote:You currently cannot (there's no "subdomain of Site" meta-identifier), but I can evaluate something like that as an enhancement, e.g. something like
Site * Accept from *.SITE, where "SITE" would be replaced with the current Site match.
A SITE variable would probably be useful in several circumstances; that would be a nice addition. :)

Giorgio Maone wrote:What do you mean by "logging" here?
ABE displays a notification for DENYed document loads only, and logs everything else in Tools|Error Console.
Do you refer to the former or to the latter?
The latter. For me, the error console ONLY logs denies (I was about to suggest it could be useful to have a toggle that logs "allows" too).
As for DENY notifications, I get some when I try to open a link under very restrictive rules, but many denies go unnotified: no notification bar at all, even though many things are denied... It could be that images and the like aren't considered "documents", though, so it wouldn't be a bug.
In any case, DENYs are still logged in the error console. But here, in my first post, I was talking about a denied GET request that was not logged anywhere for some reason.
Giorgio Maone wrote:Working with IPs will be much easier in next release. At this moment it's quite hairy, and the only DNS resolution working really well is the one fueling the "LocalRodeo" SYSTEM rule.
In the case of youtube, once ABE is able to recognize dns1.sjl.youtube.com as the ever-changing IP for which I set up a separate rule in my first post, that rule will become useless because of the *.youtube.com rule. I'll wait until you've developed ABE further, then :) (Note that even my firewall can't automatically recognize fixed domain names that keep switching IPs, so good luck with that ;) )
Giorgio Maone wrote:
TestingABE wrote:Is it certain that these IPs are only given to those two servers during at least the next couple years?
Alas, you can only ask Youtube staff and they can always change their mind, too ;)
At least they won't change domain names, which I guess makes it safer to use rules with domain names than rules with IPs... when it's possible, that is. :)


As for clicking on links in websites, looks like...

Code:

Site *
Accept from SELF
Accept GET from *
Deny
...isn't the rule I'm looking for... I wonder what would be good... I'll play a bit more, and if I don't find anything satisfactory I may give RequestPolicy a try.

Giorgio Maone wrote:Yes, ABE can surrogate all those tools.
However, they all have their specializations, which may make them valuable even when used together.
But having the same things checked over and over, and possibly modified, by each addon... Is it really worth having several addons that do similar things, once the downsides are taken into account? (i.e. slowdowns, potential conflicts, ???, ...)


And uh...LAST questions, I promise :P
8/ How does NoScript|Options|Advanced|Untrusted|Forbid "Web Bugs" work? Is it disabled on trusted sites even if "Apply these restrictions to trusted sites" is checked? And finally, won't it be rendered useless/redundant by a future ABE subscription that mimics EasyPrivacy?

9/ Is there a way for a webmaster to earn revenue from advertising while ensuring his visitors are not tracked AT ALL? (And if so, would the revenue be much lower, or would the difference be acceptable?)


That was one messy post. Thank you for your time and answers; your "customer support" is just too good to be true. It's like you're omnipresent, or there are a dozen instances of you replying everywhere, working, and doing research all at the same time. I don't know how you do it. Is it a secret recipe in your cup of coffee? I could use that.