[RESOLVED] Allowance for subfolder but not for root
Hi there,
Today I found the page fireget.com, which uses the new Google URL for reCAPTCHA.
Now I'm looking for a method to allow scripts for www.google.com/recaptcha/* only, but deny them for www.google.*
Here is an example link, if someone wants to test it: http://fireget.com/cggqnsawxhh6/testfile.txt
Thanks for your help!
greetz
TheCrap
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
Site-Specific-Permission Questions? PLEASE READ THIS FIRST! and
Creating Site-Specific Permissions via ABE
Let us know if you need help after reading those and trying it out.
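For instance, the general shape of such a rule might look like this -- a minimal sketch only, untested, using the Site syntax those pages describe:
Code: Select all
# Sketch: allow scripts for the recaptcha subfolder, deny the rest of www.google.com
Site www.google.com/recaptcha/*
Accept from ALL
Site www.google.com
Deny
Whether you want Deny or one of the other actions is covered in the linked pages.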
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.27) Gecko/20120216 Firefox/3.6.27
Re: Allowance for subfolder but not for root
Thanks! I had read those articles before.
After testing the syntax for about an hour with no result, not even for allowing scripts for the whole of Google, I found out that the page you are writing an ABE rule for has to be on the "trusted pages" list, else ABE does nothing.
Maybe this information should be somewhere in the FAQ. That would have saved me 58 minutes of time.
But now I still can't get the behavior I want.
The problem I have is re-blocking google.com with ABE, so that the behavior is the same as when it's not on the trusted list.
After another hour I have the following script, which is close to what I want:
# google.com rule
Site *.google.com/recaptcha/*
Accept from ALL
Deny
Site .google.*
Deny INC(SCRIPT)
SANDBOX
There are still two important differences:
1. With Google not trusted, "Cached" and "Similar" are shown next to the results; with the ABE rule they are not.
2. On the image search, "not trusted" shows the old style (which is much better, I think), and my ABE rule shows the new one.
(3. The +1 button is not that important, but it would be nice to kick it too.)
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
I've nearly made it.
Removing just one line did what I wanted for search and images.
The script is now:
#recaptcha rule google decent behavior
Site .google.com/recaptcha/*
Accept from ALL
Site .google.*
Sandbox
Too bad that Google Maps is no longer working with this script, even if I allow maps.google.*
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
Now I really need help with the Google Maps problem.
The script is:
#recaptcha rule google decent behavior
Site .google.com/recaptcha/* m*.google.*
Accept from ALL
Site .gstatic.com
Accept ALL from m*.google.*
Site www.google.*
Accept ALL from m*.google.*
Sandbox
But the behavior is EXTREMELY strange.
When ABE is active and I go to maps.google.com, only the box for typing the address is visible. When I refresh (without changing anything), the whole page becomes visible, but no button works and the map cannot be manipulated in any way.
When ABE is deactivated, everything works fine, BUT when I reactivate ABE and refresh the page, everything still works o_O Opening a new window, the same happens as described above.
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
TheCrap wrote: After testing the syntax for about an hour with no result, not even for allowing scripts for the whole of Google, I found out that the page you are writing an ABE rule for has to be on the "trusted pages" list, else ABE does nothing.
Maybe this information should be somewhere in the FAQ. That would have saved me 58 minutes of time.
Per FAQ: "What Is A Trusted Site?", NS by default blocks all third-party scripts at your trusted sites, so you must explicitly whitelist them or Temp-Allow them.
And per "Can I use ABE to fine-tune NoScript's permissions?": "Notice that since ABE's rules work independently from NoScript's permissions, you need to 'Allow google-analytics.com' in NoScript's menu for the above to work. Notice also that, independently from ABE, even if a certain script source is whitelisted in NoScript, it won't run as a 3rd-party script on pages whose origin is not whitelisted itself."
So, it's in the FAQ, although not in one lump. Perhaps we'd like to encourage users to read, or at least search, the entire FAQ?
(My wish: read. But it's like the instruction manual for your new whatever -- we all jump into using it. Who reads? -- so, no offense intended.)

Note that many sites that require Google also require gstatic, which is in the Default Whitelist, for reasons described in one of the updates to NoScript Quick Start Guide.
Did you remove it from the whitelist? If so, you should return it there, then do the ABE rule.
Incidentally, for shorthand, the absence of a restrictive list automatically implies ALL.
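For example, a quick sketch (with a made-up example.com): these two rules should behave identically, since omitting the origin list implies ALL:
Code: Select all
Site .example.com
Accept from ALL

# Shorthand: no origin list given, so ALL is implied
Site .example.com
Accept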
I usually see api.recaptcha.net. Do you have an example of a site that uses google.com/recaptcha?
Code: Select all
Site .gstatic.com .maps.google.com
Accept from .maps.google.com
Deny
This assumes that you don't want gstatic running anywhere else. If you do, we can adjust that.
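For instance, if you did want gstatic available to google.com pages as well, the adjustment might look like this (a sketch, untested):
Code: Select all
# Sketch: also let google.com pages pull gstatic and maps scripts
Site .gstatic.com .maps.google.com
Accept from .maps.google.com .google.com
Deny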
Any reason for the wildcard in m*.google? I see no real change with just maps.google vs. mt0.google (in this visit), but I might be missing something.
Remember that you must whitelist gstatic.com and maps.google.com for this to work, if you want the "automatic" permissions of ABE.
Else, we're back to just Temp-allow on each visit (and revoke before leaving the site.)
Regarding the google Sandbox rule, per the comprehensive guide ABE Rules .pdf, "Sandbox" sends the request, but disables script and plugins/embedding at the landing page. But we need those scripts at maps.g. Did you mean "Anonymize" (short: Anon)?
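To illustrate the difference as I read the .pdf (a sketch only; per my understanding, "Anon" strips cookies and other authentication data from the request rather than disabling scripts):
Code: Select all
# Sandbox: the request goes through, but scripts and plugins
# are disabled on the landing page
Site www.google.com
Sandbox

# Anonymize: the request goes through without cookies/auth data,
# and scripts still run -- useful where we need the page to work
Site maps.google.com
Anon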
Sorry to suggest further reading, but not sure exactly what we're trying to accomplish here.
Let us know if the above rule works for you at Maps, and where the google/recaptcha can be tested (vs. api.recaptcha.net.)
Cheers.

Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.27) Gecko/20120216 Firefox/3.6.27
Re: Allowance for subfolder but not for root
Thanks for your answer.
I missed the part about whitelisting.
An example link for a page using google.com/recaptcha is in my first post. (http://fireget.com/cggqnsawxhh6/testfile.txt)
Let's start from the beginning.
I don't like Google's behavior with scripts, so google.* is blocked by NoScript. Google Maps needs gstatic.com and maps.google.*, so both of these are on the whitelist; recaptcha.net is whitelisted too.
These settings worked to my satisfaction for a long time, until I found the page fireget.com, which uses the code "<script type="text/javascript" src="http://www.google.com/recaptcha/api/cha ... "></script>" for the captcha. There were several options for me:
1. Do as I did for maps.google.de and whitelist google.com/recaptcha, but NoScript does not accept a subfolder.
2. Temporarily allow "google.com" every time I use the page, and disallow it directly after the captcha. But this refreshes all other open pages with a google.com script on them two times.
3. Do option two with auto-refresh deactivated. I don't think that's a real option.
4. Whitelist "google.com", with the result of unacceptable behavior of the web and image search.
5. Fine-tune with ABE.
The only real option is to use ABE. The goal is to set it up so there is no difference in the behavior of any Google-related page, except that the captchas are shown.
So first I allowed the recaptcha in the rule (google.de and .com whitelisted):
Code: Select all
Site .google.com/recaptcha/*
Accept from ALL
Now it was time to block the rest of Google from using scripts, like it was before whitelisting. I think Sandbox does exactly what I want:
Code: Select all
Site www.google.* # my intention was to run ONLY WWW.google.* in Sandbox but not MAPS.google.*
Sandbox
After a short look, everything seemed fine: the captcha is shown, and Google web and image search run without scripts.
The second look showed me the problem with Google Maps, so I defined an exception for the rule above:
Code: Select all
Site maps.google.*
Accept ALL
The part with m*.google.* was only trial and error. I tested a lot but couldn't find a rule that blocks all scripts from Google except maps.google.* and google.com/recaptcha, but I did find that you have to close and reopen the Maps window for ABE rule changes to take effect.
I hope it's now clearer what I'm trying to realise.
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
Guest wrote: An example link for a page using google.com/recaptcha is in my first post. (http://fireget.com/cggqnsawxhh6/testfile.txt)
I had to unblock a few things to get the recaptcha to show, but yes, it's there, thanks.
Guest wrote: <snip> 5. Fine-tune with ABE.
The only real option is to use ABE.
Agree.
Guest wrote: I tested a lot but couldn't find a rule that blocks all scripts from Google except maps.google.* and google.com/recaptcha, but I did find that you have to close and reopen the Maps window for ABE rule changes to take effect.
Yes, it's best to set ABE rules when you are not on the page.
Note the new feature that you can right-click an empty area of an open NoScript menu, and it will copy the present menu items to the clipboard, so you can paste them to your favorite text program while working on your rule. See changelog, 2.2.9.
Otherwise, yes, you must close and re-open. And ensure that you click "Refresh" and "OK" after your ABE changes.
("Refresh" activates the check of syntax, identifying syntax errors.)
Guest wrote: I hope it's now clearer what I'm trying to realise.
I think so: "...rule to block all scripts from Google except maps.google.* and google.com/recaptcha".
First, add maps.google.com and maps.gstatic.com to your whitelist.
You don't mind those running anywhere, because they're specific to the Maps function, correct?
Note that we're using fine-tuning here, with permissions at the third-level domain instead of base second-level domain (google.com).
Then we have to whitelist www. google.com, so that we can use ABE to allow the recaptcha subfolder while sandboxing all else.
And I think we'll have to use regular expressions to do this.
Confession: I've had little need for them, so I may be a bit rusty. Using the template in ABE Rules .pdf 1.3, how about this?
Code: Select all
# Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^http?://www\.google\.com/recaptcha/*
Accept
Site ^http?://www\.google\.com/*
Sandbox
Those that don't match recaptcha proceed down the list, where the Sandbox rule is applied.
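In other words, order seems to matter: the first matching rule wins, so reversing the two would sandbox everything, recaptcha included. A sketch of the wrong order, for contrast:
Code: Select all
# Wrong order (sketch): this Sandbox rule matches /recaptcha/ URLs too,
# so the Accept rule below it would never be reached
Site ^https?://www\.google\.com/*
Sandbox
Site ^https?://www\.google\.com/recaptcha/*
Accept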
I just tested this by going to your Fireget site. I allowed Google (temporarily, as I prefer not to whitelist -- I might forget to remove it).
The recaptcha loads properly. I closed that page.
Then I went to Google's home page, www dot google.com. I tried to temp-allow www dot google.com, and the page reloaded, but both the red NoScript window and the JSView add-on confirm that the script is not running.
So, I *think* you have your wish -- Google-scriptless browsing, except that the specific recaptcha scripts are allowed.
Does this now work for you?
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.27) Gecko/20120216 Firefox/3.6.27
Re: Allowance for subfolder but not for root
Thanks again.
Sadly, the rule is still not working for Google Maps. Your rule is the same as mine but more universal, so I optimised it a bit to make it even more universal:
Code: Select all
# Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^(https?://)?www\.google\.com/recaptcha/* #now with and without http
Accept
#Site ^(https?://)?m\w+.google\.(?:[a-z]{1,3}\.)?[a-z]+/* <-- just testing
#Accept <-- just testing
Site ^(https?://)?www\.google\.(?:[a-z]{1,3}\.)?[a-z]+/* #working for all top-level
Sandbox
The rule seems to be right, because there is no restriction for maps and gstatic, but it ends in an incomprehensible behavior on Google Maps.
With ABE not activated, google.* blocked and maps.google.* + gstatic.com whitelisted, JSView shows the following .js files loaded:
- http://ssl.gstatic.com/gb/js/sem_24f279 ... ed5fee2.js
- http://maps.gstatic.com/cat_js/intl/de_ ... _rst%7D.js
- http://maps.gstatic.com/cat_js/intl/de_ ... d_mg%7D.js
- http://maps.gstatic.com/intl/de_de/mapf ... mod_mva.js
- http://maps.gstatic.com/intl/de_de/mapf ... mod_cbs.js
- http://maps.gstatic.com/cat_js/intl/de_ ... _adf%7D.js
- http://maps.gstatic.com/intl/de_de/mapf ... d_stats.js
With ABE activated, google.*, maps.google.* + gstatic.com whitelisted:
Absolutely nothing!
I must refresh the page; then only these .js are loaded:
- http://ssl.gstatic.com/gb/js/sem_24f279 ... ed5fee2.js
- http://maps.gstatic.com/cat_js/intl/de_ ... _rst%7D.js
- http://maps.gstatic.com/intl/de_de/mapf ... d_stats.js
For now I couldn't find a way to trace why the remaining scripts are not loaded. Even HTTPFox didn't help.
Definitely it has something to do with how the rule is handled.
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
TheCrap wrote: Thanks again.
Sadly, the rule is still not working for Google Maps.
I don't understand. I just now went to maps.google.com, without changing my own rule, above.
Keeping in mind that I prefer to TA rather than to whitelist google scripts, I TA maps.google and maps.gstatic.
All fields work. I plan a trip from Dallas, TX to Salt Lake City, UT. The map route shows on the right. The directions and distances show in the left frame, with a vertical scroll bar. All map functions work, including pan and zoom. Why must we put maps. into the ABE rule?
Do you have RequestPolicy add-on? If so, you must allow (or temp-allow) requests from google to gstatic.
You can tell RP to use full domains or full addresses instead of base domains, to fine-tune to "maps.google to maps.gstatic".
But then, you must also include mt0 and mt1, and it really complicates the menu elsewhere. Unlike NoScript, the RP menu allows only *one* choice of granularity for domain names, whereas NS can show you two or all three simultaneously (Appearance tab).
Interesting: It seems that Google Maps uses the numbered subservers, mt0, mt1, etc. to store various parts of the globe.
More interesting: Allowing only mt0 in RP showed the east and west parts of the US, but apparently the central part is stored on mt1.
None of this mattered in NS, where allowing maps.google was enough, without wildcarding m*.
TheCrap wrote: Your rule is the same as mine but more universal, so I optimised it a bit to make it even more universal:
Sorry, if I had known that you were versed in regexp, I'd have suggested that up front.
TheCrap wrote: # Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^(https?://)?www\.google\.com/recaptcha/* #now with and without http
No offense intended, but leaving nothing to chance: you do know that comments may be placed *only* at the beginning of a line, and nothing else can be placed on a comment line? See ABE Rules .pdf, Chapter 1, Pg. 2. If those comments are in your actual rule, that *might* be what's breaking them.
I just tested this by adding a #comment after a rule and clicking Refresh, and no syntax error was noted. Surprisingly, it did not seem to break the rules either. So either the syntax checker misses this, or the Rules documentation is not correct. If you can confirm the same thing, I'll let Giorgio know of the apparent discrepancy.
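For reference, a sketch of the distinction the .pdf draws:
Code: Select all
# This comment is fine: it occupies its own line
Site www.google.com/recaptcha/*
Accept from ALL   # per the .pdf, a trailing comment like this is NOT allowed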
I too meant to use https?:// ... that was just a typo on my part, sorry.
TheCrap wrote:
Code: Select all
#Site ^(https?://)?m\w+.google\.(?:[a-z]{1,3}\.)?[a-z]+/* <-- just testing
#Accept <-- just testing
Site ^(https?://)?www\.google\.(?:[a-z]{1,3}\.)?[a-z]+/* #working for all top-level
Sandbox
Did you remember to navigate away from the page, preferably *clearing the cache*, or even better (for a *pure* test), close and restart the browser?
TheCrap wrote: The rule seems to be right, because there is no restriction for maps and gstatic, but it ends in an incomprehensible behavior on Google Maps.
With ABE not activated, google.* blocked and maps.google.* + gstatic.com whitelisted, JSView shows the following .js files loaded:
- http://ssl.gstatic.com/gb/js/sem_24f279 ... ed5fee2.js
- http://maps.gstatic.com/cat_js/intl/de_ ... _rst%7D.js
- http://maps.gstatic.com/cat_js/intl/de_ ... d_mg%7D.js
- http://maps.gstatic.com/intl/de_de/mapf ... mod_mva.js
- http://maps.gstatic.com/intl/de_de/mapf ... mod_cbs.js
- http://maps.gstatic.com/cat_js/intl/de_ ... _adf%7D.js
- http://maps.gstatic.com/intl/de_de/mapf ... d_stats.js
With ABE activated, google.*, maps.google.* + gstatic.com whitelisted:
Absolutely nothing!
Why do we have to have maps in a regexp at all? Using the third-level domains as I did, maps.google is allowed to pass right through, but www dot google is subject to ABE rules. My map showed about a dozen scripts in JSView, *all* from maps.google, plus the trivial embedded ones (functions, click, etc.)
Third-party sites are unlikely to call maps.google and maps.gstatic unless that site offers map functions, in which case, we want them, right?
Doesn't seem to be much possibility of harm from an evil site calling to a script that we trust?
In my test case, ssl.gstatic showed in the context-menu version of JSView, which is often more comprehensive (right-click the page, View Page Info > JSView), but it did not run. Size showed as ?? in JSView, and double-clicking it to open it gave a blank page. It seems not to be needed for the map function.
TheCrap wrote: I must refresh the page; then only these .js are loaded:
- http://ssl.gstatic.com/gb/js/sem_24f279 ... ed5fee2.js
- http://maps.gstatic.com/cat_js/intl/de_ ... _rst%7D.js
- http://maps.gstatic.com/intl/de_de/mapf ... d_stats.js
TheCrap wrote: For now I couldn't find a way to trace why the remaining scripts are not loaded. Even HTTPFox didn't help.
Definitely it has something to do with how the rule is handled.
Again, do we need to subject maps.google to ABE?
Do the recaptcha and www google rules work OK for you, as they did for me?
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.27) Gecko/20120216 Firefox/3.6.27
Re: Allowance for subfolder but not for root
I tested a lot yesterday, and I found a way to reproduce the bug I have. First of all, the rule for Maps was just a test. It's not needed.
The rule is working, BUT only if you do not visit the Google web search. I have tested it with my Firefox settings, with a vanilla Firefox profile with just NoScript, and on another PC.
I made a small capture video on the vanilla profile to show the situation:
http://youtu.be/pKqN3TeIRqU
The rule does what it was supposed to, but I must not visit Maps AFTER a search.
Is this behavior reproducible on your side?
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:10.0.2) Gecko/20100101 Firefox/10.0.2
Re: Allowance for subfolder but not for root
TheCrap wrote: I tested a lot yesterday, and I found a way to reproduce the bug I have. First of all, the rule for Maps was just a test. It's not needed.
The rule is working, BUT only if you do not visit the Google web search. I have tested it with my Firefox settings, with a vanilla Firefox profile with just NoScript, and on another PC.
I made a small capture video on the vanilla profile to show the situation:
http://youtu.be/pKqN3TeIRqU
The rule does what it was supposed to, but I must not visit Maps AFTER a search.
Is this behavior reproducible on your side?
Yes.
@ Giorgio:
Steps to reproduce:
1) ABE rule:
Code: Select all
# Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^https?://www\.google\.com/recaptcha/*
Accept
Site ^https?://www\.google\.com/*
Sandbox
2) Go to maps.google.com. No cookies needed (blocked).
Allow or TA maps.google.com; maps.gstatic.com.
Plan a trip of your choice.
Result: Map and route display on the right; left pane shows directions and distances. Pan and zoom functions work.
3) Visit www .google.com - no cookies.
ABE rule successfully prevents attempt to manually TA www .google.com
NOT necessary even to do a search!
4) Go to maps.google.com (not using Back arrow; type in address bar).
Same permissions.
Result: "Get Directions" button broken. Link shows as "javascript:void(0)".
What could the visit to Google do, that would break the Maps site that worked a moment ago?
Did we not successfully exempt 3rd-level domain of maps.google from our www .google regexp?
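For what it's worth, a quick sanity check of the pattern (a sketch, just reasoning about the regexp):
Code: Select all
# ^https?://www\.google\.com/*  matches          https://www.google.com/search?q=...
# ^https?://www\.google\.com/*  does NOT match   https://maps.google.com/maps
# (the "www\." required right after "://" excludes the maps. subdomain,
#  so maps.google.com should never be subject to the Sandbox action)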
I'm using RefControl with default option of "Forge the root of the site". RequestPolicy allows requests from google.com to gstatic.com.
@ TheCrap:
Have you considered a better-behaved (privacy-conscious) search engine, such as https://duckduckgo.com (/html for the no-JS version)?
And since you're not registered, would you consider using a name that wouldn't possibly be found offensive by any segment of our user base?

Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.28) Gecko/20120306 Firefox/3.6.28
- Giorgio Maone
- Site Admin
Re: Allowance for subfolder but not for root
Tom T. wrote: Steps to reproduce: ...
I've tried your steps, and even tweaked the rule to match my local setup, which redirects many Google things to google.it:
Code: Select all
# Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^https?://www\.google\.(?:com|it)/recaptcha/*
Accept
Site ^https?://www\.google\.(?:com|it)/*
Sandbox

Unfortunately I cannot reproduce, on a clean profile with just NoScript default options plus the rule above.
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:11.0) Gecko/20100101 Firefox/11.0
Re: Allowance for subfolder but not for root
Giorgio Maone wrote: Unfortunately I cannot reproduce, on a clean profile with just NoScript default options plus the rule above.
I'll try that, thanks.
In the meantime, if you get a chance, could you please see if you can reproduce using your US proxy, simulating my situation, or a proxy from OP's locale as implied in his script reports (or from IP lookup)?
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.28) Gecko/20120306 Firefox/3.6.28
Re: Allowance for subfolder but not for root
Giorgio Maone wrote: Unfortunately I cannot reproduce, on a clean profile with just NoScript default options plus the rule above.
Unfortunately, I can.

Create new profile, Fx 3.6.28
Install only NS 2.3.4 (vs. .5rc5 in previous config)
Leave all defaults in Firefox's new profile, *including* Google cookies, which were blocked in the previous test (and permanently, in my default browser).
Leave all defaults in NS, except for the ABE rule (I don't think I need the .it tweak)
Code: Select all
# Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^https?://www\.google\.com/recaptcha/*
Accept
Site ^https?://www\.google\.com/*
Sandbox
Went to maps.google.com. Of course, *google.com and *gstatic.com are in the Default Whitelist.
Result: Same as before. "Get Directions" is broken; link destination again shows as "javascript:void(0)".
Delete Google cookie folder (all)
Open new tab.
Visit maps.google. (New google cookies set)
Result: "Get directions" succeeds. Planned trip displays as expected.
Returned to www .google.com, then *immediately* returned to maps.google by auto-complete (disabled in my default browser used previously).
Maps/ Get Directions is broken again; same symptoms.
Delete Google cookie folder; open new tab while Google Maps is still broken in existing tab.
Sets new folder of Google cookies.
*The existing tab remains broken, while the new tab works.*
Reloading the broken tab doesn't fix it.
"Curiouser and curiouser", said Alice. (-In Wonderland)
Now what, Boss?

(Need to log off for a long time. Will be back much later. Thanks for any additional assistance in the meantime.)
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.28) Gecko/20120306 Firefox/3.6.28