They're using an incredibly stupid method of deterring robots, and you're getting caught up in it.
Seriously, either don't visit that site or complain to them. Really. What they're doing is not only a bad way to achieve their goal, it's also guaranteed to catch legitimate users who are security-conscious but not hardcore power users (meaning, people like you).
Code:
<!-- Note the "nospam" added. It can be anything. The idea is to throw an HTTP 404 (invalid page) when JavaScript is disabled. Robots have that setting to improve performance, therefore they will not reach the form and try another website. -->
<noscript>
Please enable JavaScript!
<META HTTP-EQUIV="Refresh" CONTENT="0;URL=http://www.tekscan.com.nospam/">
</noscript>
This is also bad because a smart, malicious bot could simply ignore the META refresh (or go through the whole page first, find the form they don't want it to find, and only then follow the refresh).
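To make that concrete, here's a rough sketch of what such a scraper does (Node.js 18+ style JavaScript, with a made-up URL - not anyone's actual bot). It never executes JavaScript and never honors refreshes; it just downloads the raw HTML and greps it for forms, so the <noscript> META refresh above is completely invisible to it:

Code:
// Sketch of a scraper that the trick above does nothing to stop.
(async function () {
  const response = await fetch("http://www.example.com/contact.html");
  const html = await response.text();

  // The <noscript> block and its META refresh are just inert text here.
  const forms = html.match(/<form[\s\S]*?<\/form>/gi) || [];
  console.log("Found " + forms.length + " form(s) to abuse.");
})();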
Instead, they should be generating/injecting whatever form they're talking about using an external script file hosted on their server - that would be just as effective (if not more effective) at blocking the class of robots they're worried about, and people like you could still view the site.
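Something along these lines would do it. This is just a sketch, assuming a placeholder <div id="contact-form"> in the page and made-up /form-loader.js and /contact names (not their actual markup):

Code:
// Hypothetical /form-loader.js, pulled in by the page with
//   <script src="/form-loader.js"></script>
// Visitors running JavaScript get the form built for them; script-less
// scrapers reading the raw HTML never see a form at all, and nobody
// gets redirected to a bogus ".nospam" address.
document.addEventListener("DOMContentLoaded", function () {
  var container = document.getElementById("contact-form"); // placeholder element in the page
  if (!container) return;

  var form = document.createElement("form");
  form.method = "post";
  form.action = "/contact"; // assumed endpoint, for illustration only

  var email = document.createElement("input");
  email.type = "email";
  email.name = "email";
  email.required = true;
  form.appendChild(email);

  var submit = document.createElement("button");
  submit.type = "submit";
  submit.textContent = "Send";
  form.appendChild(submit);

  container.appendChild(form);
});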
Anyway, work-arounds (these are 3 different work-arounds, *not* 3 steps you need to do):
1) about:config > set accessibility.blockautorefresh to true (if you'd rather set it from a file, see the user.js snippet after this list)
2) NoScript Options > Whitelist > manually add tekscan.com
3) NoScript Options > Advanced > Untrusted > Forbid META redirections inside <NOSCRIPT> elements (haven't tried this one myself)
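For work-around 1, the same pref can also go in a user.js file in your Firefox profile folder so it's applied on every startup. The pref name is the one above; the rest is standard user.js syntax:

Code:
// user.js in your Firefox profile folder.
// Blocks automatic META refreshes site-wide; Firefox shows a notification
// bar instead, so you can still allow the redirect per page if you want.
user_pref("accessibility.blockautorefresh", true);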