noscript.untrusted format

Ask for help about NoScript, no registration needed to post
usofad
Posts: 1
Joined: Tue Jan 17, 2012 3:27 pm

noscript.untrusted format

Post by usofad »

Hello.
I have some questions about the format of the noscript.untrusted string setting.
I have the following noscript.untrusted string:
It seems we must specify separate entries for each protocol. Am I right? And if there is no https://google-analytics.com entry in the noscript.untrusted string, then JavaScript will be allowed on all SSL requests to google-analytics.com. Yes?

And if I specify http://, will this block JavaScript on all websites which I use with the HTTP:// scheme? Am I right?
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Re: noscript.untrusted format

Post by Tom T. »

usofad wrote:It seems we must specify separate entries for each protocol.
Simply adding google-analytics.com to the Untrusted list manually, as it seems you are doing, will block all scripts that have that as their 2nd-level domain, including somesite.google-analytics.com, google-analytics.com/somesite, https:// google-analytics.com (link deliberately broken), etc.

The reason it shows up in about:config under both protocols, in my list as well as yours, is that most users, myself included, don't edit this manually.
Rather, we figure that a script isn't an issue until it shows up.

So, the first time the http:// version of G-A.com showed in the NS menu, as a full address, I marked it Untrusted.

Some time later, the https:// version of G-A.com showed up, so I marked it Untrusted too. Hence, two entries in about:config.

Simply adding google-analytics.com should block http, https, and, AFAIK, ftp://, gopher://, socks://, SSH://, telnet, etc. -- if the browser supported them.
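For what it's worth, the matching behavior described above can be sketched roughly like this. This is illustrative Python only, not NoScript's actual code; the `matches` helper and its exact logic are my assumptions about the described behavior (bare 2nd-level domain matches any scheme and any subdomain, a scheme-prefixed entry matches only that scheme):

```python
# Illustrative sketch only -- NOT NoScript's real matching code.
# A bare-domain entry ("google-analytics.com") matches any scheme and
# any subdomain; a scheme-prefixed entry matches that scheme only.
from urllib.parse import urlparse

def matches(entry, url):
    """Return True if 'url' falls under the untrusted 'entry'."""
    parsed = urlparse(url)
    if "://" in entry:
        # Scheme-specific entry: both scheme and host must match.
        e = urlparse(entry)
        return parsed.scheme == e.scheme and (
            parsed.hostname == e.hostname
            or parsed.hostname.endswith("." + e.hostname)
        )
    # Bare-domain entry: any scheme, exact host or subdomain.
    return parsed.hostname == entry or parsed.hostname.endswith("." + entry)

print(matches("google-analytics.com", "https://www.google-analytics.com/ga.js"))  # True
print(matches("http://google-analytics.com", "https://google-analytics.com/"))    # False
```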
usofad wrote:And if I specify http://, will this block JavaScript on all websites which I use with the HTTP:// scheme? Am I right?
I don't think so (you could test it yourself), because NoScript still looks for the critical element of at least a base 2nd-level domain name.
But why would you want to do that? *All* scripting is blocked by default, from every source, except for those in the Default Whitelist FAQ. And you can remove those easily, should you like.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
GµårÐïåñ
Lieutenant Colonel
Posts: 3377
Joined: Fri Mar 20, 2009 5:19 am
Location: PST - USA

Re: noscript.untrusted format

Post by GµårÐïåñ »

It's best to mark the item as Untrusted using the GUI as you encounter it, and let NS add the per-protocol entries for you. But if you must do it manually, then by all means go ahead and include the http:// and https:// versions of it, separated by whitespace; simple as that.
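As a hypothetical illustration of that format (the domain names below are placeholders, not a real list), the pref value is just a whitespace-separated string, so it splits into independent entries:

```python
# Hypothetical noscript.untrusted value: entries separated by whitespace,
# with the per-protocol duplicates NoScript itself would record.
pref_value = "http://google-analytics.com https://google-analytics.com evil.example.com"

entries = pref_value.split()  # any run of whitespace separates entries
print(entries)
# ['http://google-analytics.com', 'https://google-analytics.com', 'evil.example.com']
```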
~.:[ Lï£ê ï§ å Lêmðñ åñÐ Ì Wåñ† M¥ Mðñê¥ ßå¢k ]:.~
________________ .: [ Major Mike's ] :. ________________
Mozilla/5.0 (Windows NT 6.1; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
emm7
Posts: 3
Joined: Sat Sep 15, 2012 8:30 am

Re: noscript.untrusted format

Post by emm7 »

Tom T. wrote:Simply adding google-analytics.com should block http, https, and, AFAIK, ftp://, gopher://, socks://, SSH://, telnet, etc. -- if the browser supported them.

Yeah... it doesn't work that way.
I manually added tpb without any protocol and accessed it via https, and JS works fine, even though in the list it's marked as Untrusted without a protocol but as Trusted with the https:// protocol. Running with Globally allowed.

For me the problem is keeping browsing fast when using a low-power plan (dual-core 1.8 GHz). It appears that adding each domain three times (without a protocol, with http, and with https) slows down page access a lot: when first accessing a page, while it's still loading, it takes some time until it can scroll fluidly, and if I don't wait I have to put up with jerky scrolling. I removed the excess entries and page access got a lot faster. My list has some 600 entries, so 1800 with protocols.

Is there an option that will force NoScript to take an entry without a protocol and apply it to all URLs, http and https alike?
Mozilla/5.0 (Windows NT 6.1; rv:15.0) Gecko/20100101 Firefox/15.0.1
Thrawn
Master Bug Buster
Posts: 3106
Joined: Mon Jan 16, 2012 3:46 am
Location: Australia

Re: noscript.untrusted format

Post by Thrawn »

emm7 wrote:Running with Globally allowed.

For me the problem is keeping browsing fast when using a low-power plan (dual-core 1.8 GHz).
If you want fast browsing, aren't you better off using default-deny? That will block the download and execution of all manner of unnecessary active content...
======
Thrawn
------------
Religion is not the opium of the masses. Daily life is the opium of the masses.

True religion, which dares to acknowledge death and challenge the way we live, is an attempt to wake up.
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.89 Safari/537.1
emm7
Posts: 3
Joined: Sat Sep 15, 2012 8:30 am

Re: noscript.untrusted format

Post by emm7 »

It's true that global deny results in faster browsing, but these days most sites use some sort of JS even for simple stuff, which means I have to manually allow around half of the sites I visit just to activate Disqus, Twitter, even basic comment implementations, captchas, menus, etc. What I block are mostly excess ad-driven nags, which mostly don't look like ads but behave like them.
Mozilla/5.0 (Windows NT 6.1; rv:15.0) Gecko/20100101 Firefox/15.0.1
Tom T.
Field Marshal
Posts: 3620
Joined: Fri Mar 20, 2009 6:58 am

Re: noscript.untrusted format

Post by Tom T. »

emm7 wrote:It's true that global deny results in faster browsing, but these days most sites use some sort of JS even for simple stuff, which means I have to manually allow around half of the sites I visit just to activate Disqus, Twitter, even basic comment implementations, captchas, menus, etc.
Once you've whitelisted them the first time, they won't bother you again.

If you wish to allow them only at certain sites, see Site-Specific Permissions for ways to, again, do once and forget.
What I block are mostly excess ad-driven nags, which mostly don't look like ads but behave like them.
Have you mass-added the sites in SOME SITES YOU MIGHT NOT WANT TO ALLOW? -- although default-deny is still *much* safer.
Without default-deny, the new script from evil.com has already run before you get a chance to blacklist it, which is too late. Damage done.
This is why Globally Allow is so dangerous, which is why NS was created in the first place.

"Security is the opposite of convenience". If everyone were honest, we wouldn't need to lock our homes, cars, etc., nor carry around keys that we might lose or forget. But everyone isn't honest. So we do the inconvenient thing, to improve our security.

btw, "low-power"? My machine for daily use has a single-core 1.6 GHz processor. With default-deny JS and a blocking HOSTS file, browsing is quite respectably fast.

Come to think of it, using one of the free HOSTS services speeds things up by blocking requests to sites that don't necessarily use scripting (tracking cookies, web bugs, etc.), and since the system checks HOSTS before DNS, there's no DNS lookup for any of those sites, saving more time. Search the forum for discussions of HOSTS files as annoyance-blockers (including some caveats).
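To illustrate, a blocking HOSTS file simply maps unwanted hostnames to a non-routable address, so the request is answered locally and never triggers a DNS lookup. The hostnames below are placeholders, not a recommended blocklist:

```
# Sample HOSTS entries (placeholder hostnames):
0.0.0.0 tracker.example.com
0.0.0.0 webbug.example.net
```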
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:15.0.1) Gecko/20100101 Firefox/15.0.1