Varying browser fingerprint to make tracking more difficult

General discussion about web technology.
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Varying browser fingerprint to make tracking more difficult

Post by Tom T. »

(Split as O/T from "DNT Redundancy" -- Tom T.)
JackBlack wrote:I have many settings tweaked to make my fingerprint less unique. You seem to have yours as well, some overlap with mine, but in the end it kind of looks like voodoo tricks that, although partially effective, probably end in a false sense of anonymity.
Absolutely. No one should use the Internet in any manner that relies on them being completely anonymous and untraceable, because if your adversary has enough resources and motivation, you can be found. We can, however, make it non-trivial for basic tracking and dossier-building, which is a reasonable goal.
JackBlack wrote: I think we're still doomed to be almost unique because of our IP address even if it's dynamic.
If a single IP address is used, or reused, by many different users, then the info collected over time will vary widely, eventually approaching randomness. Today, a 40-yr-old man; next month, a 14-yr-old girl. Etc. Consider a university in which 5000 students share one WAN-facing IP. It's easy to know that the request came from the university, but not possible to know by which student unless the admin is corrupt or incompetent. (Both are possible, of course.)
JackBlack wrote:The only way would be to proxify our butt using things such as Tor, but it's just too uncomfortable to do this all the time. So we end up having our daily traffic reveal our IP and thus, our approximate location
Agree on Tor, and for other issues too -- it's not as private as one is led to believe. And, of course, Govs and others can place Tor nodes and do traffic analysis. However, you can use foreign proxies, should you wish. I know one other Mod here who has several proxies, on both sides of the Atlantic. And Giorgio surely has many proxies, because, e.g., one issue involved a site that was accessible only to UK users -- meaning, those with a UK IP. Can't diagnose the issue without the proxy. So however much time, effort, and money you're willing to spend...

IP geolocation per se can be a pretty rough estimate. I've had sites ID me (same IP) as being in half a dozen different cities, with the distance between the two most distant being a good 60 miles (100 km) or so. It depends on your ISP, and how narrowly they restrict a given octet, and how much they reveal to others.
Using my neighbor's connection (different ISP) places it in a city 150 miles away. But agree that population density affects how many possible browsers could have the same fingerprint. The radius described above, on my own, would be in the seven-figure population range. In Wyoming, probably not. ;)

Critical is to disable geo.enabled in about:config, which I'm sure you've already done, and *never to allow scripting from google.com*, which uses nearby known points to place you, sometimes to within 100 m.
JackBlack wrote: and thus, place us within quite a small group of users sharing our fingerprint, as non-unique as it is country or world wide.
True. So, we vary fingerprints.
JackBlack wrote:Hint: I'd be glad to be wrong and that you have some miraculous solution, but aside from living in a 10 millions people city... :p
No miracles here. But one can spread disinformation when sites require information. I have several Yahoo mail accounts, not counting disposable, and one thinks I'm a young woman living 15 miles away, and the others... you get the picture.

Doesn't work when you buy stuff online, of course, because as soon as someone knows your name and address, game over. So either don't shop online, or don't be too paranoid about it. Or use one specific e-mail and ISP for online shopping and banking, and another for all else. Balance how much this is worth to you in inconvenience vs. how valuable the personal info is. I lead a dull, boring life, :cry: , and never see the "targeted ads", but resent the tracking on principle.
JackBlack wrote:(And then your ISP still knows everything that's not proxified, but that's another story...)
If they ever revealed anything without being first served a subpoena, warrant, or other Court order, my lawyer would jump on that, and we could both retire -- I *hope*.

They can't read your SSL/TLS-encrypted traffic, unless they're extremely unethical (MITM with false certs, etc.). They know what IP you are visiting, of course, but not the contents. IMHO, the whole Net should use secure connections, as improvements in tech render the added overhead more and more trivial.
JackBlack wrote:I have one solution though: Have a fingerprint switcher. Useragent is not enough...is there such a thing as a fingerprint switcher?
Yes. It's called a profile. Create as many as you like. Use your must-have add-ons (NoScript grows in popularity; hence, not as much of a GUID as when the user pop was small), but add some non-essential ones, even ones you don't care about, and make the mix different for each. Vary which profile you use. Combine that with changing the useragent. You'd have to update each one for each Fx or add-on update, or use a Sync feature, but make sure the Sync didn't make the add-ons the same in every profile. Probably have to end up writing your own batch script or something. But personally, I don't care to go to that much trouble.
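
(For illustration only, a rough sketch of such a launcher script -- Python, with made-up profile names and a Linux-style placeholder path. It assumes the profiles were already created via Firefox's Profile Manager, and it sets a complete, real-world UA string per profile via general.useragent.override rather than editing version numbers piecemeal.)

Code:
import random
import subprocess
from pathlib import Path

# Hypothetical profile names mapped to complete UA strings copied from real
# browsers of this era (None = leave that profile's genuine default UA alone).
PROFILES = {
    "alpha": "Mozilla/5.0 (Windows NT 5.1; rv:9.0.1) Gecko/20100101 Firefox/9.0.1",
    "bravo": "Mozilla/5.0 (Windows NT 6.0; rv:9.0.1) Gecko/20100101 Firefox/9.0.1",
    "charlie": None,
}

def launch(profile_name: str, profile_dir: Path) -> None:
    ua = PROFILES[profile_name]
    if ua is not None:
        # user.js is re-read at startup; general.useragent.override changes
        # only the advertised UA string, nothing else about the fingerprint.
        (profile_dir / "user.js").write_text(
            'user_pref("general.useragent.override", "%s");\n' % ua
        )
    # -P selects the named profile; -no-remote lets it run alongside another one.
    subprocess.Popen(["firefox", "-P", profile_name, "-no-remote"])

if __name__ == "__main__":
    name = random.choice(list(PROFILES))
    # The "xxxxxxxx" salt and the path are placeholders; adjust to your setup.
    launch(name, Path.home() / ".mozilla" / "firefox" / ("xxxxxxxx." + name))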
JackBlack wrote: The ideal one would be able to allow Javascript but provide fake system and browser variables.
Faking the OS may break some pages, such as those coded differently for Mac vs. Windows, and so may faking the browser, such as coding for IE vs. (everyone else, LOL).

There's a limit. Also, I could change my language pref to en-GB, but the IP is US no matter what. So you'd need a UK proxy. Hmm... Australia? :D

But then, one would have to remember to write in that dialect. Not easy... Just exactly what is it you're hiding? WAIT! *Don't* tell me! :mrgreen:
JackBlack wrote:
Tom T. wrote:I never see ads anyway (ask me how, should you like)
I'm guessing you use some kind of other way to filter ads, such as Privoxy or hosts file or even a white list based approach such as RequestPolicy?
The latter two, exactly.
JackBlack wrote:On a side note I have nothing against most ads, my issue is that ads mean tracking. I wouldn't have blocked them if they hadn't been attempting to mark me like a cow :p
I find most of them annoying and distracting, which is their goal. In a world with a superfluity of stimuli bombarding us constantly, they must try ever harder to get our attention. I've even used "No Style" on some hard-to-read web sites.
JackBlack wrote:
Tom T. wrote:Surprised that ABP would limit its use
Yeah... I double checked and it appears that I've been misled and ABP most likely sends DNT for every request. The rule contains specifics about images but it's a workaround for backward compatibility. :)
Glad to hear that. I had a higher opinion of Wladimir's astuteness than that would have implied, so thanks for confirming.
Last edited by Tom T. on Sun Jan 22, 2012 3:33 am, edited 1 time in total.
Reason: add notice of origin of split
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
JackBlack

Re: DNT redundancy?

Post by JackBlack »

Just exactly what is it you're hiding? WAIT! *Don't* tell me!
Heh that's not the point. The question should be reversed. What is it they want from us, to stalk us all individually like this? And why should we accept?

No miracles here. But one can spread disinformation when sites require information. I have several Yahoo mail accounts, not counting disposable, and one thinks I'm a young woman living 15 miles away, and the others... you get the picture.
You'd make such a geek girl :D
But you need Javascript for Yahoo. I never bothered to check their scripts, but potentially they could identify all your accounts as being a single person through Javascript fingerprinting alone. With Javascript turned off it could indeed be safer, seeing how crowded your area is. Still, this is something that's hard to be sure of; it could be good ol' false sense of anonymity striking again. :p
Yes. It's called a profile. Create as many as you like. Use your must-have add-ons (NoScript grows in popularity; hence, not as much of a GUID as when the user pop was small), but add some non-essential ones, even ones you don't care about, and make the mix different for each. Vary which profile you use. Combine that with changing the useragent. You'd have to update each one for each Fx or add-on update, or use a Sync feature, but make sure the Sync didn't make the add-ons the same in every profile. Probably have to end up writing your own batch script or something. But personally, I don't care to go to that much trouble.
Yeah, it can work, but it's really, really tedious, and you're still limited to about 5 profiles; after that it would be too hard to handle. You'd also have to make sure to log into your different website accounts always from the same profile, and if you have a dynamic IP, it might be best to check it before picking a profile so that in the long run, each profile would have its own IP range (making you, quite effectively, 5 distinct people).

But it's just horrible browsing experience :D
Also you HAVE to disable Javascript, WebGL and all those revealing technologies on at least 4 out of 5 profiles, otherwise identities are compromised because JS reveals so much about your computer setup. Even CSS 3 reveals a bunch of info, but disabling CSS often comes down to making your browsing experience a living hell.


Torbutton tries to mitigate the concerns over how unique everyone's Javascript fingerprint is world-wide, but it does so in the opposite way from what we're looking for right now. It tries to make all fingerprints the same, for all users, while we would like to be able to switch at will between many different widespread fingerprints. A concerned developer could maybe use Torbutton or Tor browser's source code and tweak it to achieve that goal more easily... *hint at developers reading this :roll:*
It would be great if there was an efficient and convenient fingerprint switcher around. Even if it was a whole fork of Firefox instead of an addon I'd be interested.


Agree on Tor, and for other issues too -- it's not as private as one is led to believe. And, of course, Govs and others can place Tor nodes and do traffic analysis. However, you can use foreign proxies, should you wish. I know one other Mod here who has several proxies, on both sides of the Atlantic. And Giorgio surely has many proxies, because, e.g., one issue involved a site that was accessible only to UK users -- meaning, those with a UK IP. Can't diagnose the issue without the proxy. So however much time, effort, and money you're willing to spend...
You mean buying/renting servers around the world and setting them up to route your traffic? But you'd still have to pay bills, that's not too anonymous :D
Unless you mean something else.

As for tor, do you know in which way it is not as private as it seems? (maybe you have some links if it's too long a topic to discuss? )
Unless I'm mistaken, malicious Tor nodes are only a concern when there are at least 2 out of 3 in your path. Probably quite safe to say the odds are low, no? Aside from the malicious-node issue, someone able to sniff both Facebook's servers and your area (or maybe Tor entry nodes) could try to correlate that it's you who made that request to Facebook using Tor. But I'm not aware that anyone sniffs traffic on such a large scale. It could eventually be a concern for US citizens, seeing how Facebook is in the US as well and how there's the Patriot Act around, but... Dunno.

What is an actual reality, though, is that most ISPs and most large sites store extensive data and mine it with surprising efficiency. While not all-knowing or error-proof, the algorithms get better each year as research moves forward, and the profiles built with them are good enough that the market around this crap is giganormous. What professionals in this field call "aggregate" data that is "not personally identifiable" has a meaningful chance of actually being personally identifiable. The "you're just paranoid" argument from its supporters no longer stands :p

Also, each year the % of population concerned over online privacy increases. I don't remember how high it is exactly; I've seen 50-80% figures, but it depends on the poll question, so it's something to look up. Either way, bottom line is: There's demand for efficient privacy tools, and major browser vendors are unlikely to ever meet this demand efficiently enough.
Hence developing a fingerprint switcher addon (or Fx fork) would make sense... :roll:
Mozilla/5.0 (Windows NT 6.0; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Re: DNT redundancy?

Post by Tom T. »

JackBlack wrote:
Just exactly what is it you're hiding? WAIT! *Don't* tell me!
Heh that's not the point. The question should be reversed. What is it they want from us, to stalk us all individually like this? And why should we accept?
Mostly, "they" want demographic and other info (browsing and purchase habits, etc.) so that advertisers get the most "bang for the buck" -- so that their ads reach those most likely to be interested. They pay for each view of the ad. A product whose demographic is, say, teen girls, whose ad gets viewed by many adult males, is a wasted cost to the advertiser. They're paying for views that will not result in sales.

Even better, to them, is specific interests. If you're ID'd as being interested in ocean cruises, the stock market, or rap music, then the advertisers of those products will pay extra for displays to their respective markets.

One woman in the US sued DoubleClick, then the largest ad agency on the Net, and when forced to disclose their file on her, it was the equivalent of 968 single-spaced typewritten pages. What color underwear she bought, or browsed for, laxatives used, ---- etc. :evil:

But you can relax -- DoubleClick is now owned by Google, and we all know how private they keep your data, search queries, browsing habits, etc., right? :mrgreen:

Question: Why would anyone use a web browser *produced* by a company that gets 99+% of its revenue from advertising and/or selling data about you? :?:

This is why cookies and timers record how long you spend on a given page, which links you click, which items you browse and for how long you look at them, etc.
JackBlack wrote:
Tom T. wrote:No miracles here. But one can spread disinformation when sites require information. I have several Yahoo mail accounts, not counting disposable, and one thinks I'm a young woman living 15 miles away, and the others... you get the picture.
You'd make such a geek girl :D
But you need Javascript for Yahoo. I never bothered to check their scripts, but potentially they could identify all your accounts as being a single person through Javascript fingerprinting alone. With Javascript turned off it could indeed be safer, seeing how crowded your area is. Still, this is something that's hard to be sure of; it could be good ol' false sense of anonymity striking again. :p
Actually, is it not possible that I have a daughter (and wife, and two sons, and mother-in law) living here? And that we all share the same IP? Or even the same computer? (Can't get those bleep kids off the machine! :lol: ) Probably not the same e-mail address, but suppose Junior takes his laptop to his buddy's house, and logs in from there? Etc.

I fine-tune Yahoo Mail script permissions for the sake of security (which trumps privacy IMHO, if it comes down to that), but that in itself may be an identifier.
I don't really care. I don't see the ads, and in this economy, who can afford to buy anything, anyway? :cry:

(multiple profiles)
But it's just horrible browsing experience :D
Also you HAVE to disable Javascript, WebGL and all those revealing technologies on at least 4 out of 5 profiles, otherwise identities are compromised because JS reveals so much about your computer setup. Even CSS 3 reveals a bunch of info, but disabling CSS often comes down to making your browsing experience a living hell.
As said, just how much effort do you wish to expend?

In the late 1990s, a high executive at a large IT company said, "You have zero privacy now. Get over it."
Not quite true back then, but it's getting there rapidly. So, adjust the effort to the value of what's protected.
I use encrypted e-mail for stuff that's worth protecting. What goes through Yahoo, I really don't care about all that much (or it shouldn't go through them, or Gmail, or Hotmail...)
JackBlack wrote:
... However, you can use foreign proxies, should you wish. ...
You mean buying/renting servers around the world and setting them up to route your traffic?
I mean publicly-available proxy services that will route your traffic through them before it goes out on the public Internet, so that their address, or an address from one of their IP pools, is attached to your traffic instead of your own. Your own ISP sees you first, but then it goes to your proxy in Germany, so the web site you're visiting sees a German IP.

Of course you must pay a monthly fee for most of these services. (Some allow a very limited amount of bandwidth per month for free, as sort of a trial offer, or knowing you'll go over it, and they'll get paid.) So you need to do research and decide if the proxy service is trustworthy. (Perhaps I should have said "proxy service", but "proxy server" is often used, because they're acting as your server, then relaying to the real server you want to visit.)
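
(A tiny sketch of the idea, nothing more -- the proxy host, port, and credentials below are placeholders, not a recommendation of any particular service. It just shows a request being routed through a hypothetical German proxy so that the destination sees that proxy's IP.)

Code:
import requests

# All proxy details below are placeholders -- a paid service would give you
# a real host, port, and credentials.
proxies = {
    "http":  "http://user:password@proxy.example.de:8080",
    "https": "http://user:password@proxy.example.de:8080",
}

# The destination sees the proxy's (German) IP; your own ISP still sees the
# connection to the proxy, and the proxy operator sees whatever you send.
resp = requests.get("https://www.example.com/", proxies=proxies, timeout=30)
print(resp.status_code)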
JackBlack wrote: As for tor, do you know in which way it is not as private as it seems? (maybe you have some links if it's too long a topic to discuss? )
Yes, I'd recommend Wikipedia for starters, and their various references and links, then Scroogle for articles about Tor.
JackBlack wrote: and how there's the Patriot Act around, but... Dunno.
Fortunately, the Unpatriotic Act gives only Government agencies the right to read your e-mail and tap your phone. I don't think they're selling the data to advertisers, because that would help reduce the budget deficit, and they're adamantly against that, judging by all measures. :P
JackBlack wrote:Also, each year the % of population concerned over online privacy increases.
Then how is it that the % of the population that puts its entire life history and details of their every move on TwitFace continues to increase?
-- one of the greatest coups ever by the industry. You and I fight for privacy; everyone else has been convinced that it's cool to give it all away for nothing. Go figure.
JackBlack wrote:Either way, bottom line is: There's demand for efficient privacy tools, and major browser vendors are unlikely to ever meet this demand efficiently enough.
Agreed.
JackBlack wrote:Torbutton tries to mitigate the concerns over how unique everyone's Javascript fingerprint is world-wide, but it does so in the opposite way from what we're looking for right now. It tries to make all fingerprints the same, for all users, while we would like to be able to switch at will between many different widespread fingerprints. A concerned developer could maybe use Torbutton or Tor browser's source code and tweak it to achieve that goal more easily... *hint at developers reading this :roll:*
It would be great if there was an efficient and convenient fingerprint switcher around. Even if it was a whole fork of Firefox instead of an addon I'd be interested. <snip> Hence developing a fingerprint switcher addon (or Fx fork) would make sense... :roll:
An interesting experiment in that regard -- excuse me, my lawyer is nudging me.

WARNING: THE FOLLOWING EXPERIMENT WAS CONDUCTED UNDER CONTROLLED LABORATORY CONDITIONS. DO NOT TRY THIS YOURSELF.
OK, he's cool now.

When support was ending for Firefox 2.x, an online financial institution warned me on login that F2 would be allowed to login for only a certain grace period after end of support, and to upgrade ASAP, blah blah. Fair enough.

I got curious. After the "grace period" expired, I went there with the final version, Fx 2.0.0.20.
Yes, they wouldn't allow login, get one of these currently-supported browsers and come back, etc.
So I changed the useragent in F2's about:config to Firefox 3.(whatever). *Nothing else* changed.

And got logged in without trouble.

There was a mismatch between the Fx version # and the Gecko engine build date (and many other differences between the 2.x and 3.x versions), but they clearly didn't look at anything except the UA version #. (I've always said that banks are the worst at security.)

To this day, I still go there once in a while with F2, updating the UA to, say 3.6.25.
And it still works. :roll:

*If* one wanted to fiddle more with UA for *fingerprint* purposes, it would be necessary to make sure the browser version # matched the Gecko engine version date, else you'd have a globally-unique fingerprint. :o
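
(As a sketch of that point: rather than editing version numbers inside an existing UA string, pick a complete string captured from a real browser and sanity-check its internal consistency. The two example strings below mirror the formats seen in the signatures in this thread; verify against what your target Firefox versions actually send before relying on them.)

Code:
# Pick whole, internally consistent UA strings; don't mix-and-match tokens.
REAL_UA_STRINGS = [
    # Firefox 3.6.x carried a real, dated Gecko build token...
    "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) "
    "Gecko/20111212 Firefox/3.6.25",
    # ...while Firefox 4+ froze the token at Gecko/20100101.
    "Mozilla/5.0 (Windows NT 6.0; rv:9.0.1) Gecko/20100101 Firefox/9.0.1",
]

def is_consistent(ua: str) -> bool:
    """Cheap sanity check: Firefox 4+ should carry the frozen Gecko/20100101 token."""
    fx_version = ua.split("Firefox/")[-1]
    major = int(fx_version.split(".")[0])
    if major >= 4:
        return "Gecko/20100101" in ua
    # Pre-4 builds carried a genuine build date; nothing simple to assert here.
    return "Gecko/" in ua

assert all(is_consistent(ua) for ua in REAL_UA_STRINGS)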

Last thought: One could keep several versions of the prefs.js file, stored in folders identifying the differences (to the extent that they show up in fingerprinting), and perhaps write batch scripts that would overwrite the existing one with one of the varied ones with just two clicks, and other batch files to swap among them.
What a PITA. Not for me, thanks.
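
(For whoever does care to go to that trouble, a minimal sketch of the batch-file idea in Python -- all paths and variant names are placeholders, and Firefox must be closed while the swap happens.)

Code:
import shutil
import sys
from pathlib import Path

PROFILE_DIR = Path.home() / ".mozilla" / "firefox" / "xxxxxxxx.default"  # adjust
VARIANTS_DIR = Path.home() / "prefs-variants"  # holds prefs.quiet.js, prefs.noisy.js, ...

def swap(variant: str) -> None:
    src = VARIANTS_DIR / f"prefs.{variant}.js"
    dst = PROFILE_DIR / "prefs.js"
    # Keep a backup of whatever is currently active, then drop in the variant.
    shutil.copy2(dst, VARIANTS_DIR / "prefs.backup.js")
    shutil.copy2(src, dst)
    print(f"prefs.js replaced with the '{variant}' variant")

if __name__ == "__main__":
    swap(sys.argv[1] if len(sys.argv) > 1 else "quiet")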

We've gotten quite O/T. If this were to continue, it should probably be split off into Forum Extras > Web Technology. And I really should devote the time to actual technical support issues, as interesting and enjoyable as this discussion has been. ;)

Cheers.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
JackBlack

Re: DNT redundancy?

Post by JackBlack »

One woman in the US sued DoubleClick, then the largest ad agency on the Net, and when forced to disclose their file on her, it was the equivalent of 968 single-spaced typewritten pages. What color underwear she bought, or browsed for, laxatives used, ---- etc.
Correlating the data available my guess would be brown underwear. :mrgreen:

But you can relax -- DoubleClick is now owned by Google, and we all know how private they keep your data, search queries, browsing habits, etc., right?
Google's both a great company and one to avoid as much as possible. :p

Also, each year the % of population concerned over online privacy increases.
Then how is it that the % of the population that puts its entire life history and details of their every move on TwitFace continues to increase?
-- one of the greatest coups ever by the industry. You and I fight for privacy; everyone else has been convinced that it's cool to give it all away for nothing. Go figure.
People have been caring more and more about privacy over the past five years or so; they just have no idea what to do. Remedies are far too complex and cumbersome, and on top of that their effectiveness is uncertain. Things are slowly improving as awareness rises and pressure comes from both authorities and the people... The sheer demand for privacy and the almost complete absence of products meeting this demand is leading browser vendors and groups such as the Tor Project or the EFF to do research and find solutions. They're doing this today, and I'm confident that tomorrow we'll have proper tools that work well enough not to be called voodoo tricks. :)


Just wanted to end the thread on a positive note :)
No more messing up your moderating duty with my off topic-ness now. Thanks for sharing!
Mozilla/5.0 (Windows NT 6.0; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
tlu
Senior Member
Posts: 129
Joined: Fri Jun 05, 2009 8:01 pm

Re: Varying browser fingerprint to make tracking more diffic

Post by tlu »

Interesting thread, indeed! Just some additional comments:
Tom T. wrote:
Critical is to disable geo.enabled in about:config,
I think that the privacy threats of having geolocation enabled are limited considering what is said here. In other words, geolocation is completely opt-in.

Regarding cookie management: I think that controlling cookies via Cookie Monster is superior to the built-in FF cookie management as it is easier to fine-tune your cookie permissions, particularly if you un-check the "Use Second Level Domain Names" option.

Cookie permissions also determine how DOM storage (sometimes called "super cookies") is handled. Some users disable it completely by setting "dom.storage.enabled" to "false" in about:config. However, I found that this breaks some sites, e.g., one of my banking sites. Considering that the same permissions are applied as to cookies, I think that you can leave it enabled once a strict cookie management is adopted.
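
(A minimal sketch of putting such prefs into a profile's user.js so they survive updates -- the pref names are the usual about:config names from Firefox of this era, but double-check them against your own about:config, and the profile path is a placeholder.)

Code:
from pathlib import Path

PROFILE_DIR = Path.home() / ".mozilla" / "firefox" / "xxxxxxxx.default"  # adjust

PREFS = {
    "network.cookie.cookieBehavior": 1,   # block third-party cookies
    "network.cookie.lifetimePolicy": 2,   # keep accepted cookies for the session only
    "dom.storage.enabled": False,         # disable DOM storage ("super cookies")
    "geo.enabled": False,                 # geolocation, mentioned earlier in the thread
}

lines = []
for name, value in PREFS.items():
    js_value = str(value).lower() if isinstance(value, bool) else repr(value)
    lines.append(f'user_pref("{name}", {js_value});')

# Append rather than overwrite, in case user.js already has other entries.
with (PROFILE_DIR / "user.js").open("a") as f:
    f.write("\n".join(lines) + "\n")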

Regarding RequestPolicy: Tom, I know you're a fan of it, and I, too, had used it for a while. But I found that it breaks too many sites and requires a lot of manual intervention. Quite disturbing is also that the same blocked domains (like googleanalytics, doubleclick and the likes) appear for nearly every site since there is no blacklist available (like the one in Noscript). I know, though, that this feature is planned for RP.

Another extension you might want to check out is CsFire. This publication explains its logic very well.
Mozilla/5.0 (Ubuntu; X11; Linux x86_64; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Re: Varying browser fingerprint to make tracking more diffic

Post by Tom T. »

tlu wrote:
Tom T. wrote:
Critical is to disable geo.enabled in about:config,
I think that the privacy threats of having geolocation enabled are limited considering what is said here. In other words, geolocation is completely opt-in.
Considering that they've flip-flopped on that, with extensive discussion and, IIUC, only after massive user protests, who knows if they'll silently re-enable again?
In general, I tend not to trust "We'll always ask you before we screw you". Better to deny the ability.
tlu wrote:Regarding cookie management: I think that controlling cookies via Cookie Monster is superior to the built-in FF cookie management as it is easier to fine-tune your cookie permissions, particularly if you un-check the "Use Second Level Domain Names" option.
Thanks for the suggestion.

IMHO only, I don't much care what cookies the site itself sets, since they're all session-only, and deleted before moving on, if the "moving on" is at all privacy-sensitive. Third-party cookies are always blocked by default. For those who allow permanent cookies (not to my taste, thank you), or keep 20 tabs open, or allow 3rd-party, yes, I can see that fine-tuning permissions would indeed be useful -- necessary, in fact.
tlu wrote:Cookie permissions also determine how DOM storage (sometimes called "super cookies") is handled. Some users disable it completely by setting "dom.storage.enabled" to "false" in about:config. However, I found that this breaks some sites, e.g., one of my banking sites. Considering that the same permissions are applied as to cookies, I think that you can leave it enabled once a strict cookie management is adopted.
Yikes! I forgot to disable it on some update - thanks for the reminder. But always have had it disabled, and don't remember any sites breaking. If any do, I'll come back and confirm.

I'm a firm believer in redundancy ("defense in depth"), and so would prefer to keep strict cookie permissions *and* disable dom storage.

I allow only session cookies, don't allow third party, yet there's only the single Boolean choice in dom.storage.enabled, and it's "true". It's not readily visible what restrictions are placed, but to me, "true" means "true". Especially since the linked bug was from 2006. So, prefer to disable it, and see if anything breaks. IMHO. YMMV.
tlu wrote:Regarding RequestPolicy: Tom, I know you're a fan of it, and I, too, had used it for a while. But I found that it breaks too many sites and requires a lot of manual intervention.
I haven't, except for doing support, in which one visits many, many more sites (per each user's issue) than a single user normally would. Like NS, the faves are configured, set and forget.
tlu wrote:Quite disturbing is also that the same blocked domains (like googleanalytics, doubleclick and the likes) appear for nearly every site since there is no blacklist available (like the one in Noscript). I know, though, that this feature is planned for RP.
If you have these blocked in NS, they can't run script. Block images from them in Firefox Tools > Options > Content > Load Images automatically > Exceptions, and then even the typical single-pixel clear gif "Web bug" can't load.

Also, I use a blocking Hosts file that won't let the browser access DoubleClick (and 16,000 more ad/spyware/malware sites) even if I type it in the address bar.
Not everyone agrees with this usage of Hosts, but it's never caused me a problem, for probably six or seven years now.

ETA: Just realized that of course, they're default-blocked in RP, and you meant that it's annoying (hardly "disturbing", lol) that they show up at all.
They're easy to ignore, and I find that if the script is blocked in NS, sometimes the domain doesn't show in RP, depending on whether it's the script that makes the call for non-NS-blocked content, or the site itself, regardless of script permissions. Anyway, they don't show often, and I haven't found it an issue. But I'd certainly support something equivalent to NS's Untrusted list, just to get them off the main display. Sorry for misunderstanding the first time.
tlu wrote:Another extension you might want to check out is CsFire. This publication explains its logic very well.
CsFire provides a secure-by-default policy, which can be extended with fine-grained remote policies as well as fine-grained local policies. The remote policies are obtained from a policy server,
Scary. I don't like visiting remote servers for permissions, or for anything else for that matter, behind my back.

Not well-vetted yet. Fewer than 4000 users; nine reviews: four 5-star, two 4-star, three 3-star. Quotes:
by megabob6666 on December 20, 2010

A few days after installing this extension, I ordered some items from Newegg. When I went to pay for the items, I kept getting an error message telling me that they could not complete my request for payment. I ended up submitting the order another three or four times before I figured out that the initial order had gone through by checking my Newegg account page. I finally realized that CSFire was blocking a browser request that would have informed me that the initial order had gone through. Needless to say, I uninstalled CSFire so that I would have no further problems in this regard and have not had similar problems on subsequent orders.

Yes, you can configure CSFire to allow those requests that you want, but the problem is that you don't know in advance when and where this kind of thing will occur, and without knowing that, you can't configure it to allow them
A user who shares my concern about remote policy management:
Good concept, I was looking for an addon to do exactly this.
But I have some questions left: Is the default setting allow or deny? What if there are multiple matching rules? And how can I disable remote policies?
good security concept, however if CsFire cant know all legitimate site where this occurs, then why not add a on-demand allow/disallow button.
Exactly. How can *any* add-on know all good or evil sites, especially since new sites start thousands of times a day, old ones change policies, addresses, scripts, script calls, etc.

This is why the frequent user requests for an NS resource that would make the decisions for the user can't be met -- a complete database of "trustworthy" and "untrustworthy" sites is impractical. Aside from which, there is considerable difference among individuals in what is "trustworthy". I don't trust Google, Facebook, etc., yet some people put their entire life history in the hands of such sites.

Thanks for adding some very interesting and thought-provoking ideas to this thread.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Re: Varying browser fingerprint to make tracking more diffic

Post by Tom T. »

@ tlu:

FWIW, I just logged in to four online financial institutions, with the above cookie settings and dom.storage.enabled = false.
No problems whatsoever. Doesn't mean no one will have them, but as said, FWIW.

Do you care to mention which bank, or PM it to me (in strictest confidence)?
Not required, of course, just curious to look at the site and see if dom storage could be circumvented.

Cheers.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
tlu
Senior Member
Posts: 129
Joined: Fri Jun 05, 2009 8:01 pm

Re: Varying browser fingerprint to make tracking more diffic

Post by tlu »

Tom T. wrote: IMHO only, I don't much care what cookies the site itself sets, since they're all session-only, and deleted before moving on, if the "moving on" is at all privacy-sensitive. Third-party cookies are always blocked by default. For those who allow permanent cookies (not to my taste, thank you), or keep 20 tabs open, or allow 3rd-party, yes, I can see that fine-tuning permissions would indeed be useful -- necessary, in fact.
I also block 3rd party cookies and allow only session cookies by default. There are some exceptions, though, where I allow permanent cookies. Cookie Monster is more convenient and flexible here.
tlu wrote: Some users disable it completely by setting "dom.storage.enabled" to "false" in about:config. However, I found that this breaks some sites, e.g., one of my banking sites.
I checked that banking site again (thanks for your offer, Tom!) with DOM storage disabled and this time it worked. Either they changed something in the meantime or something went wrong the last time I tried. Whatever - I still think that the privacy implications are limited if a strict cookie management is applied.
tlu wrote:Regarding RequestPolicy: Tom, I know you're a fan of it, and I, too, had used it for a while. But I found that it breaks too many sites and requires a lot of manual intervention.
I haven't, except for doing support, in which one visits many, many more sites (per each user's issue) than a single user normally would. Like NS, the faves are configured, set and forget.
Noscript is also said to break many sites but it blocks "only" active content so that most sites are at least readable. But RP breaks a lot more - take, for example, this site - without allowing twimgs.com (which is one of 11 entries in the RP menu!) the graphical elements and the menu are not shown and the site is, well, cluttered and confusing. And there are many, many sites with similar problems.
tlu wrote:Quite disturbing is also that the same blocked domains (like googleanalytics, doubleclick and the likes) appear for nearly every site since there is no blacklist available (like the one in Noscript). I know, though, that this feature is planned for RP.
If you have these blocked in NS, they can't run script. Block images from them in Firefox Tools > Options > Content > Load Images automatically > Exceptions, and then even the typical single-pixel clear gif "Web bug" can't load.

Also, I use a blocking Hosts file that won't let the browser access DoubleClick (and 16,000 more ad/spyware/malware sites) even if I type it in the address bar.
Not everyone agrees with this usage of Hosts, but it's never caused me a problem, for probably six or seven years now.

ETA: Just realized that of course, they're default-blocked in RP, and you meant that it's annoying (hardly "disturbing", lol) that they show up at all.
Yes, I meant "annoying" - sorry for being unclear but English is not my native language ;)
CsFire provides a secure-by-default policy, which can be extended with fine-grained remote policies as well as fine-grained local policies. The remote policies are obtained from a policy server,
Scary. I don't like visiting remote servers for permissions, or for anything else for that matter, behind my back.

Not well-vetted yet. Fewer than 4000 users; nine reviews: four 5-star, two 4-star, three 3-star.
Yes, but CsFire has changed in the meantime. The new approach is explained in that PDF - worth reading. Regarding the remote policies: there are only a handful or two of them, for well-known sites that cause problems. I don't regard that as problematic. RP also has predefined lists for its initial configuration, the ABP lists are updated from remote servers, and I'm pretty sure that you don't compile and update your hosts file by manually editing it yourself ;)
Yes, you can configure CSFire to allow those requests that you want, but the problem is that you don't know in advance when and where this kind of thing will occur, and without knowing that, you can't configure it to allow them
Well, the same is definitely true for RP, isn't it?
This is why the frequent user requests for an NS resource that would make the decisions for the user can't be met -- a complete database of "trustworthy" and "untrustworthy" sites is impractical.
CsFire doesn't rely on a vast database, it works differently as outlined in that PDF.
Thanks for adding some very interesting and thought-provoking ideas to this thread.
You're welcome :)
Mozilla/5.0 (Ubuntu; X11; Linux x86_64; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Re: Varying browser fingerprint to make tracking more diffic

Post by Tom T. »

tlu wrote:
Tom T. wrote:
tlu wrote:Regarding RequestPolicy: Tom, I know you're a fan of it, and I, too, had used it for a while. But I found that it breaks too many sites and requires a lot of manual intervention.
I haven't, except for doing support, in which one visits many, many more sites (per each user's issue) than a single user normally would. Like NS, the faves are configured, set and forget.
Noscript is also said to break many sites but it blocks "only" active content so that most sites are at least readable. But RP breaks a lot more - take, for example, this site - without allowing twimgs.com (which is one of 11 entries in the RP menu!) the graphical elements and the menu are not shown and the site is, well, cluttered and confusing. And there are many, many sites with similar problems.
Did you love the irony in that headline? "Study sponsored by Google says Chrome is most secure" :P
I'm sure if the study said that some other browser were more secure, Google would have paid them twice the fee to hush it up. :mrgreen:

And while I don't care to read it -- we have an entire thread on that topic, with almost 1500 views at the moment -- I wonder if they compared Chrome to Fx + NS?

I agree that it's unfortunate that sites are becoming more and more complex, and that simplicity is no longer regarded as a virtue.

My usual procedure: TA the site itself in NS. Look for something in RP that matches the site name, or its initials, or whatever, especially with "img" or "static" in the name. Try that. Usually works.

Didn't see anything like that there, so just held the mouse pointer over the RP image-block placeholder in the article title. Shows destination as twimgs, so r-click, RP > TA to twimgs.com. Agree it's a PITA (does that translate? ;) )

Bottom line: There is always a trade-off between security and convenience. If everyone were 100% honest, we wouldn't need to carry around these jangling key rings in our pockets, and sometimes lock ourselves out of the car. :o But not everyone is honest, so we accept the inconvenience of locking our homes, cars, etc.

No reason the virtual world should be different from the real world. Not everyone or every website is honest (now, *that's* an understatement!), so we have to try to "lock" our OS with anti-virus, firewall, etc., and "lock" our browsers with NS, RP, whatever. Maybe sandbox or virtualize the browser, or the entire OS. Perhaps full-disk-encryption. The point is, everyone makes their own choice on how much convenience they will trade for security, and vice versa.

Sadly, random surveys show that 80-90% of home computers have at least one form of malware on them. I diagnosed this remotely just by e-mails received from two non-tech, but trusted, friends. Also, a career programmer with advanced degree in Computer Science picked up a spyware toolbar somewhere. I helped this local friend remove it.

So, while not everyone has the knowledge level, or the desire, to go to great lengths for security (and privacy), it's hard to fault anyone who does.
(I don't do FDE or VM, btw.)
ETA: Just realized that of course, they're default-blocked in RP, and you meant that it's annoying (hardly "disturbing", lol) that they show up at all.
Yes, I meant "annoying" - sorry for being unclear but English is not my native language ;)
Afterward, I realized that at some point in time, I was aware of that. Sorry, I did not mean to lecture you. I'm sure that I couldn't do nearly as well in your native language as you do in mine. :)
tlu wrote:Yes, but CsFire has changed in the meantime. The new approach is explained in that PDF - worth reading. Regarding the remote policies: there are only a handful or two of them, for well-known sites that cause problems. I don't regard that as problematic. RP also has predefined lists for its initial configuration, the ABP lists are updated from remote servers, and I'm pretty sure that you don't compile and update your hosts file by manually editing it yourself ;)
I don't use ABP. I formerly used AdBlock (Original) with Fx 2.x, but it was not compatible with F3+. Totally local control: you can add any image to the block-list from a context menu, and use wildcards, regexps, etc. to build your own list over time. Once you get the main sources, plus matches to "ad", "banner", etc., you don't see ads very often. And never more than once. ;)

RP has user-choice sets of default rules. You can uncheck any default rule. It's not the same as a live connection to a live RP server.

The nice people at the Hosts service compile and update it, about once a month. All I do is change their redirect from 127.0.0.1 to 0.0.0.0, and add two entries of my own: www .yahoo.com, so I don't get the annoying redirect after logging out of Yahoo Mail, and an entry for my router's admin IP, merely to avoid the "certificate mismatch" error that otherwise occurs. (Even the local router is accessible only by HTTPS, so I can do it from the laptop with confidence.)

But the point is, before installing any Hosts file from them, I can open it up and read it, search it, etc. And once installed, there is *no* connection to a remote server to vet web sites. (I don't use the Firefox "Safe Browsing" features - block reported evil or reported forgeries, etc. -- for that very reason. IMHO. YMMV.)
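
(A small sketch of automating that manual step -- rewrite the downloaded list's redirect from 127.0.0.1 to 0.0.0.0 and append a couple of entries of your own. File names are placeholders, and on Windows the real hosts file needs an elevated prompt to overwrite.)

Code:
from pathlib import Path

downloaded = Path("hosts-downloaded.txt")   # the blocking list as published
output     = Path("hosts-ready.txt")        # what you'd then copy into place

custom_entries = [
    "0.0.0.0 www.yahoo.com",        # skip the post-logout redirect (example from above)
    # "192.168.1.1 router.local",   # a LAN alias of your own, if wanted (hypothetical)
]

lines = []
for line in downloaded.read_text().splitlines():
    # Change the redirect target the publisher uses to the 0.0.0.0 form.
    if line.startswith("127.0.0.1"):
        line = "0.0.0.0" + line[len("127.0.0.1"):]
    lines.append(line)

lines.extend(custom_entries)
output.write_text("\n".join(lines) + "\n")
print(f"{len(lines)} hosts entries written to {output}")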
Yes, you can configure CSFire to allow those requests that you want, but the problem is that you don't know in advance when and where this kind of thing will occur, and without knowing that, you can't configure it to allow them
Well,the same is definitely true for RP, isn't it?
I haven't tested the newegg example cited by that review, or CsFire at all, but I've bought things from newegg. With adequate script permissions, including newegg.com, which includes the content. and images10. subdomains, I don't remember any problems with RP, or at least, any that weren't obvious to allow. In which case, add to the permanent whitelist. But it's never been an issue.
This is why the frequent user requests for an NS resource that would make the decisions for the user can't be met -- a complete database of "trustworthy" and "untrustworthy" sites is impractical.
CsFire doesn't rely on a vast database, it works differently as outlined in that PDF.
I will have to look at that .pdf some time, thanks.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25