TheCrap wrote: Thanks again.
Sadly, the rule is still not working for google maps.
I don't understand. I just now went to maps.google.com, without changing my own rule above.
Keeping in mind that I prefer to TA (temporarily allow) rather than whitelist google scripts, I TA maps.google and maps.gstatic.
All fields work. I plan a trip from Dallas, TX to Salt Lake City, UT. The map route shows on the right. The directions and distances show in the left frame, with a vertical scroll bar. All map functions work, including pan and zoom. Why must we put maps. into the ABE rule?
Do you have the RequestPolicy add-on? If so, you must allow (or temp-allow) requests from google to gstatic.
You can tell RP to use full domains or full addresses instead of base domains, to fine-tune to "maps.google to maps.gstatic".
But then you must also include mt0 and mt1, and it really complicates the menu elsewhere. Unlike NoScript, the RP menu allows only *one* choice of granularity for domain names, whereas NS can show you two or all three levels simultaneously (see NoScript's Appearance tab).
Interesting: It seems that Google Maps uses the numbered subservers, mt0, mt1, etc. to store various parts of the globe.
More interesting: Allowing only mt0 in RP showed the east and west parts of the US, but apparently the central part is stored on mt1.
None of this mattered in NS, where allowing maps.google was enough, without wildcarding m*.
Your rule is the same as mine but more universal, so I optimised it a bit to make it even more universal:
Sorry, if I had known that you were versed in regexp, I'd have suggested that up front.
# Allow all Google recaptcha, but sandbox all www.google.com.*
Site ^(https?://)?www\.google\.com/recaptcha/* #now with and without http
No offense intended, but leaving nothing to chance: you do know that comments may be placed *only* at the beginning of a line, and nothing else can be placed on a comment line? See the ABE Rules PDF, Chapter 1, page 2. If those comments are in your actual rule, that *might* be what's breaking them.
I just tested this by adding a #comment after a rule and clicking Refresh: no syntax error was noted, and surprisingly it did not seem to break the rule either. So either the syntax checker misses this and silently lets such a rule break, or the ABE Rules document is not correct about comment placement. If you can confirm the same thing, I'll let Giorgio know of the apparent discrepancy.
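Purely as an illustration (not something I've tested beyond the quick Refresh check above), here is how I would lay out those rules with every comment on its own line; the Accept action on the recaptcha rule is my assumption based on your comment text:
Code: Select all
# Allow all Google recaptcha, but sandbox all www.google.com.*
# The Site pattern now matches with and without http(s)
Site ^(https?://)?www\.google\.com/recaptcha/*
Accept

# Sandbox everything else on www.google, for any top-level domain
Site ^(https?://)?www\.google\.(?:[a-z]{1,3}\.)?[a-z]+/*
Sandbox
[/code]
If the behavior changes once the trailing comments are gone, that would point to the comment handling rather than the regexp itself.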
I too meant to use https?:// ... that was just a typo on my part, sorry.
Code: Select all
#Site ^(https?://)?m\w+.google\.(?:[a-z]{1,3}\.)?[a-z]+/* <-- just testing
#Accept <-- just testing
Site ^(https?://)?www\.google\.(?:[a-z]{1,3}\.)?[a-z]+/* #working for all top-level
Sandbox
[/code]
The rule seems to be right, because there is no restriction for maps and gstatic, but it ends in incomprehensible behavior on google maps.
With ABE not activated, google.* blocked, and maps.google.* + gstatic.com whitelisted, JSView shows the following .js files loaded:
With ABE activated, and google.*, maps.google.* + gstatic.com whitelisted:
Absolutely nothing.
Did you remember to navigate away from the page, preferably *clearing the cache*, or even better (for a *pure* test), close and restart the browser?
Why do we have to have maps in a regexp at all? Using the third-level domains as I did, maps.google is allowed to pass right through, but www dot google is subject to ABE rules. My map showed about a dozen scripts in JSView, *all* from maps.google, plus the trivial embedded ones (functions, click, etc.)
Third-party sites are unlikely to call maps.google and maps.gstatic unless that site offers map functions, in which case, we want them, right?
There doesn't seem to be much possibility of harm from an evil site calling a script that we trust, does there?
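To make the third-level-domain point concrete, this is roughly the shape of ruleset I have in mind (a sketch only, and it assumes ABE's default of accepting any request that no Site rule matches):
Code: Select all
# Nothing here mentions maps.google or maps.gstatic, so ABE's default
# (accept) applies to them and the map scripts load untouched.
Site ^(https?://)?www\.google\.(?:[a-z]{1,3}\.)?[a-z]+/*
Sandbox
[/code]
The maps traffic is then governed only by the NoScript whitelist/TA choices, not by ABE.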
I had to refresh the page, and then only these .js files loaded:
In my test case, ssl.gstatic showed in the context-menu version of JSView, which is often more comprehensive (right-click the page, then View Page Info > JSView), but it did not run. Its size showed as ?? in JSView, and double-clicking it to open it gave a blank page. It seems not to be needed for the map function.
For now, I couldn't find a way to trace why the remaining scripts are not loaded; even HTTPFox didn't help.
It definitely has something to do with how the rule is handled.
Again, do we need to subject maps.google to ABE?
Do the recaptcha and www.google rules work OK for you, as they did for me?