Hello -
I have Facebook.com in my whitelist. The Facebook pages work fine.
However, many other sites seem to run Facebook scripts, such as the Internet Movie Database
(imdb.com). How can I allow the scripts associated with Facebook to run only
when I'm at Facebook.com, and prohibit their use anywhere else?
Thanks for any info.
[RESOLVED] limiting scripts to their home sites.
Last edited by Tom T. on Thu Jan 05, 2012 4:50 am, edited 1 time in total.
Reason: mark as resolved
Mozilla/5.0 (Windows NT 5.0; rv:8.0.1) Gecko/20100101 Firefox/8.0.1
Re: limiting scripts to their home sites.
ABE FAQ, esp. Chapter 8.10.
If you need further help after reading that, please feel free to ask. Or to tell us that it works for you.
Thank you.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.24) Gecko/20111103 Firefox/3.6.24
Re: limiting scripts to their home sites.
Thanks Tom,
I'll look at those materials and report back if need be.
Dave
Mozilla/5.0 (Windows NT 5.1; rv:8.0) Gecko/20100101 Firefox/8.0
Re: limiting scripts to their home sites.
Thanks, Dave. I should also have pointed you to the long-running thread on this very topic, which has more than 150 posts and more than 23,000 views.
Sorry, it slipped my mind.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
Re: limiting scripts to their home sites.
I've had some problems with the ABE script, for example twitter.com and this website:
http://www.tottenhamhotspurs.tv/forum/players-lounge/
(I am using Firefox 9.0.1)
I have the following in ABE/user:
# User-defined rules. Feel free to experiment here.
# facebook.com rule
Site facebook.com *.facebook.com
Accept from facebook.com *.facebook.com *.mail.yahoo.com *.mail.yahoo.net
Deny
# twitter.com rule
Site twitter.com *.twitter.com
Accept from twitter.com *.twitter.com
Deny
yet the twitter.com script is still active in the above page. Did I code it wrong?
Thanks again -
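[Editor's note: the pattern in the two rules above generalizes to any site you want to fence in. A sketch of the same three-line template with a hypothetical example.com substituted in; adjust the domains for the real site:]

```
# Hypothetical example.com rule, following the same pattern as above:
# scripts from example.com may load only on example.com's own pages.
Site example.com *.example.com
Accept from example.com *.example.com
Deny
```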
Mozilla/5.0 (Windows NT 5.1; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Re: limiting scripts to their home sites.
davexnet wrote: yet the twitter.com script is still active in the above page

The coding looks fine, Dave, and a test run (with Fx 3.6.25, but I'll confirm on 9.x) shows well, too. I think it's a matter of (mis)interpreting the result:

I went to your link without any ABE USER rules, allowed the site's own script, allowed the script from twitter.com, and also allowed it in RequestPolicy. See the footnote on that add-on. [1]

When the page refreshed, a NS block-logo showed where the Twitter widget would be, because I have all Embeddings page checked, including on trusted (whitelisted) sites. But it showed that yes, the Twitter script was trying to load the widget. Had I clicked OK on that placeholder, or unchecked "Apply these restrictions to whitelisted sites too" in NS Embeddings page, the Twitter widget would load.

Next: I leave the page, add your Twitter rule (copy/paste exactly as is), "refresh", "OK". I go back to the page. Now, there is no NS block-logo. Instead, there is only the word "Tweet", which becomes a link on mouseover, with destination

Code: Select all
http://twitter.com/share

IIUC, *that* is what you are seeing as "the script is still active". But if one tries to click the link, the beautiful result is a top bar, and in the Error Console (Tools > Web Developer on 9.x), blue Info messages:

Code: Select all
[ABE] <twitter.com *.twitter.com> Deny on {GET http://platform.twitter.com/widgets.js <<< http://www.tottenhamhotspurs.tv/forum/players-lounge/ - 2}
USER rule:
Site twitter.com *.twitter.com
Accept from twitter.com *.twitter.com
Deny

[ABE] <twitter.com *.twitter.com> Deny on {GET http://twitter.com/share <<< http://www.tottenhamhotspurs.tv/forum/players-lounge/ - 6}
USER rule:
Site twitter.com *.twitter.com
Accept from twitter.com *.twitter.com
Deny

So both the widget and the share link are indeed blocked.

Natural question: Why is Twitter showing at all? Because that link is embedded in the source code of tottenhamhotspurs itself:

Code: Select all
<!-- Tweet Button -->
<span class="tweet"><a href="http://twitter.com/share" class="twitter-share-button" data-count="horizontal" data-via="THFCforum">Tweet</a><script type="text/javascript" src="http://platform.twitter.com/widgets.js"></script></span>

NoScript can't remove that code from tottenham's page, but it can prevent it from executing the third-party scripts, whether for sharing or for implanting the widget. Just as I could put

Code: Select all
http://EvilTomT_Virus.com

in this reply, but can't force it to run against the might of NS and ABE.

The same effect could be achieved by leaving FB and Twitter in the default-deny zone, then temp-allowing them at FB and at Twit, respectively. ABE lets you do that automatically.

All of this will become much easier in the long-awaited NoScript 3.x for the desktop, which will have per-site permissions built in.

[1] Whereas NS blocks all executable content, whether from the site you are on or elsewhere, unless whitelisted or TA'd, RequestPolicy blocks all requests to any *other* site from the site you are on, executable or not. But not from the site you are on. E.g.: a single-pixel clear .gif image placed by an ad agency, commonly referred to as a "Web beacon" by the ad industry, and as a "Web bug" by privacy-conscious users. Calling the source of the invisible image lets the ad people track you around any site that lets them place it. NS would not block this, being non-executable. RP would not block any scripts, Flash objects, whatever, *on the site you are on*. Put the two together, it's awesome control.

*If and when you've digested most or all of NoScript*, see more at the RequestPolicy home page:

More information on the privacy reasons for using RequestPolicy is available at:
https://www.requestpolicy.com/privacy
More information on the security reasons for using RequestPolicy is available at:
https://www.requestpolicy.com/security
RequestPolicy is not a replacement for NoScript! Each focuses on different, important issues. For the best security, we recommend using both RequestPolicy and NoScript. More information on the difference between the two is available here:
https://www.requestpolicy.com/faq#faq-noscript

Cheers,
- Tom
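[Editor's note: to make the "Web beacon" in the footnote above concrete, such a tracker is nothing more than an ordinary image tag fetched from a third-party host. A sketch only; the ad-network hostname is hypothetical:]

```
<!-- 1x1 transparent "web bug": merely fetching this image reports the visit
     (with any cookies) to the hypothetical ad network's server -->
<img src="http://tracker.example-ads.net/pixel.gif" width="1" height="1" alt="">
```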

Last edited by Tom T. on Tue Jan 03, 2012 6:52 am, edited 1 time in total.
Reason: fix link
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25
Re: limiting scripts to their home sites.
Dave, the above results were reproduced *exactly* on Firefox 9.0.1. I also tested the Facebook "Share" link with your rule; same (good) results.
I even allowed the Twitter widget before implementing your rule. Yes, the like/dislike button and counter show up. But with your rule, no.
One clarification: Since the http *LINK* to twitter.com/share is embedded in the page, just like this one is, NoScript and ABE won't block the link.
Memo: Don't click untrusted or unknown links.
(That one was safe.)
But for embedded links that call Javascript from third-party sources, as in the above reply about the source code, NoScript and/or ABE will block those for you, unless you allow them. Hope that's all clear now.
Mozilla/5.0 (Windows NT 5.1; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Re: limiting scripts to their home sites.
Tom, thanks for a fantastic answer. Frankly, more detail than I currently understand.
I've never even heard of RequestPolicy. I've been using NS for a while,
but never really learned it, nor the kinds of issues that are involved.
Looks like I need to start at the NS FAQ, then come back and re-read your
post, then check out RP.
Thanks again.
PS - I still have an old (slow) Windows 2000 PC, and the main reason I use NS
besides safer surfing is faster surfing. Some of the scripts take the CPU
to 100%. However, by limiting the scripts with NS I can dramatically
improve the whole surfing situation on this slow box.
Thanks again -
Dave
Mozilla/5.0 (X11; Linux i686; rv:9.0.1) Gecko/20100101 Firefox/9.0.1
Re: limiting scripts to their home sites.
davexnet wrote: Tom, thanks for a fantastic answer. Frankly, more detail than I currently understand.

Sorry about that. It's always difficult to know the tech level of any given user, unless they either say so ("I'm a noob" or "I run the IT dept. at my company"), or until some conversation reveals it. High-tech users sometimes feel condescension if answers are geared toward average users, and vice-versa.
Remedy: Feel free to ask about anything, or not, knowing that your rule is working regardless. Continue to write ABE rules as you have -- you've nailed it.

davexnet wrote: I've never even heard of RequestPolicy. I've been using NS for a while, but never really learned it, nor the kinds of issues that are involved.
Looks like I need to start at the NS FAQ, then come back and re-read your post, then check out RP.

Excellent plan, Sir!

Hate to hit you while you're still overloaded, but I owe one further clarification. I realized it some time after replying, but decided to let you try to digest all of that first.

Tom T. wrote: One clarification: Since the http *LINK* to twitter.com/share is embedded in the page, just like this one is, NoScript and ABE won't block the link.

Correction: NoScript won't block the linked page from loading if you click the link, although it will default-deny the other page's scripting.
ABE, however, *can* block the linked page from loading, if you create a rule based on the destination. Just as the twitter rule prevented the twitter login page from loading even when I clicked the link.
However, not clicking the link works just as well.

davexnet wrote: PS - I still have an old (slow) Windows 2000 PC and the main reason I use NS besides safer surfing is faster surfing.

Absolutely. Skip this part if you like.

One of the big improvements claimed in newer browser versions is speed. Yet my Fx 9.x, 3.x, and even 2.0.0.20 (support ended at end of 2008) tend to measure about the same speed on my ISP's connection speedometer (a/k/a "benchmarking tool"). One reason: Much of the claimed improvement is from faster scripting engines. But as a user who denies 95%+ of the scripting out there, plus the other stuff blocked per the NS Options > Embeddings tab, that speed advantage is diminished a good bit.
davexnet wrote: Some of the scripts take the cpu to 100%.

Actually, the cpu will spike sharply, often to 100%, almost any time you open a new page, a new program, etc. That's so the new thing can load ASAP. Which makes sense.
I just opened a new, blank Open Office doc, and cpu went to 100 until the blank doc showed and was ready. Then it drops back to the current single digit reading. But *the time spent* at 100 can shorten dramatically, as you said.
davexnet wrote: However, limiting the scripts with NS I can dramatically improve the whole surfing situation on this slow box.

I'm on a six-year-old machine (that's like dog years - it's 42 years old in computer years), a laptop with what today would be regarded as a slow CPU, but yes, it can be quite fast when you take the garbage out of web pages.
Hint: If you haven't maxed the machine's capacity for physical memory (RAM), and can still find RAM modules for it, adding more RAM to older machines can often make a nice speed boost. I won't bore you with the technical reasons why.
davexnet wrote: Thanks again -
Dave

You're quite welcome. And sorry for the info overload, but others read these threads, too. At the moment, more than 200 views. So the info may be helpful to those who are following, even though not posting. (Lame excuse for verbosity, but it's the best that comes to mind, LOL.)
I'll mark this as Resolved, since your original question has been answered (along with a lot of unasked questions).

Feel free to post back as needed.
Again, cheers.

- Tom.
Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.25) Gecko/20111212 Firefox/3.6.25