Anchor Markers

We’ve had a bug open for quite a long time about implementing the W3C Common User Agent Problems recommendation from 2001 to highlight the target of intra-page named anchors.

Someone recently posted their ideas in that bug; I don’t think much of their UI, but I think it would be great if Firefox had a solution for this. (It would be nice also to style internal links slightly differently, perhaps with a dotted underline, so people don’t do “Open In New Tab”, and end up with five copies of the same large page.)

Is there an extension which does this? If not, does anyone feel like knocking one up?

One implementation method would be to highlight and then fade out the elements contained in the <a name=""> tag. However, this might not work everywhere because of the following pattern:

<a name="heading"></a>

People do this a) to make sure the heading is visible on the page, and b) because people used to write bad CSS which caused hover effects to apply to anything within an <a> tag, whether it had an href or not. But maybe this problem isn’t a showstopper.
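As a sketch of the highlight-and-fade idea, something like the following user-script-style JavaScript would do it; the lookup handles both the <a name> pattern above and id targets. (The colour, timings and event wiring are all my guesses, and the CSS-transition fade assumes a reasonably current browser.)

```javascript
// Sketch: highlight, then fade out, the element a named anchor points to.
// Assumes it runs in a page context, e.g. from a user script or extension.

// Pure helper: extract the anchor name from a location hash like "#heading".
function anchorNameFromHash(hash) {
  return hash && hash.charAt(0) === "#" ? hash.slice(1) : "";
}

function highlightAnchorTarget() {
  const name = anchorNameFromHash(window.location.hash);
  if (!name) return;
  // Match both id="..." and <a name="..."> targets.
  const target =
    document.getElementById(name) ||
    document.querySelector('a[name="' + name + '"]');
  if (!target) return;
  target.style.transition = "background-color 2s ease-out";
  target.style.backgroundColor = "#ffff99";  // highlight...
  setTimeout(function () {
    target.style.backgroundColor = "";       // ...then fade back out
  }, 100);
}

if (typeof window !== "undefined") {
  window.addEventListener("DOMContentLoaded", highlightAnchorTarget);
  window.addEventListener("hashchange", highlightAnchorTarget);
}
```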

Only Browser Geeks Need Apply…

The Alliance and Leicester Building Society’s “3D Secure” secure internet shopping service has the following requirements:

What are the system requirements for 3D Secure?

3D Secure requires the use of Windows Microsoft® Internet Explorer 5.5 and 6.0, Windows Netscape® 7.1 and 7.2, Windows AOL ® 9, Windows Firefox® 1.0 and Macintosh Safari®.

Wow. It’s a pretty small and exclusive clientele who can run all of those at once. I guess you’d need a Mac (for Safari) running at least two copies of Parallels (one for each version of IE)…

Styling Internal Anchors

Ever seen an interesting link on a large page, middle-clicked it to open it in a new tab and then realised that it was an internal anchor, and you are just loading another copy of the same large page?

To have every internal anchor marked with a small “#” symbol after it, add the following to your userContent.css:

a[href^="#"] {
  padding-right: 6px;
  background: url("data:image/png,%89PNG%0D%0A%1A%0A%00%00%00%0DIHDR%00%00%00%06\
%97%89%00%00%00%00IEND%AEB%60%82") center right no-repeat;
}

IE Plays Catchup

Is it just me, or could the “IE Add-ons Contest” have been renamed the “IE Add-the-features-Firefox-has-that-we-don’t Contest”? Of the four top addons, three implement Firefox features for IE. And the last is an extension we had first.

Even the description page for the grand prize winner admits as much:

Inline Search is an add-on for Internet Explorer that mimics Firefox’s search behavior…

Imitation is the sincerest form of flattery :-)

IE 7 Cripples IDN

Microsoft’s policy about how IE 7 will handle IDNs has changed slightly in beta 3, but unfortunately as it stands will still have a serious detrimental effect on IDN take-up. Here’s why.

IE 7 displays all IDN domain names as punycode unless the copy of IE has the “language” of that domain name configured as one of its Accept Languages.[0] If it displays the ugly and indecipherable punycode, it also presents a yellow security bar, saying “We can’t display this domain name; click for options”, where presumably the user has the option of adding to the whitelist whatever language IE thinks the domain name is in.

This will cripple IDNs in almost any international market, simply because domain owners are not going to want an unknown percentage of users visiting their domain to have that horrible user experience. You are a German company – will you choose an IDN domain name containing a ß as your primary domain name if you know you might one day want to expand into the European market and sell goods outside Germany? And that almost all your European customers will have to go through this?

IDNs might perhaps be used when the site owner can guarantee that all their visitors will have a particular language configured – but how common is that? Even aside from the situation above, this is the “World Wide” Web, and people use the browser of a friend, or an Internet café. The browser doesn’t really know what languages its user speaks, and it’s unlikely that the user will take time to tell it. When was the last time you configured the Accept Languages in a browser you were using? And if you did, when you stopped using that browser, did you remove them and reset the setting?

The sad thing is that this measure by itself doesn’t improve security. A particular domain name is either dangerous or it isn’t – that is, it’s either a homograph of another domain registered to a different person, or it isn’t. If the domain name is a homograph then all those people who, by default or by configuration, have that language configured are at risk. And if it’s not a homograph, why not display it to everyone from the start?

The other measure IE 7 is taking, which is to forbid most script mixing, will improve security. But here they have gone the other way – this measure is too draconian. Script-mixing by itself is not dangerous, as long as your registry is on the ball.
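As an illustration of what a script-mixing check looks like – this is a sketch using modern JavaScript Unicode property escapes, not IE’s actual implementation, and the script list is illustrative rather than exhaustive:

```javascript
// Sketch: flag a domain label that mixes Unicode scripts, the check IE 7
// applies wholesale. A real check would also handle Common/Inherited
// characters and the full script repertoire.
const SCRIPTS = ["Latin", "Cyrillic", "Greek", "Han", "Arabic", "Hebrew"];

function scriptsIn(label) {
  const found = new Set();
  for (const script of SCRIPTS) {
    // \p{Script=...} needs the "u" flag.
    if (new RegExp("\\p{Script=" + script + "}", "u").test(label)) {
      found.add(script);
    }
  }
  return found;
}

function mixesScripts(label) {
  return scriptsIn(label).size > 1;
}
```

The classic homograph example: “paypal” spelled with a Cyrillic а (U+0430) in place of the Latin a mixes two scripts, while an all-Cyrillic or all-Latin name does not – which is exactly why a registry with sensible policies, rather than the browser, is best placed to judge the remaining cases.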

Firefox has a system based on a whitelist of TLDs whose registries have sensible anti-homograph policies. Only they can tell if a domain name is dangerous or not; browsers just don’t have enough information. Our policy allows many more safe domain names.

Unfortunately, as domain owners will only pick names which work everywhere, IE 7 is further restricting the set of names that can be used in practice. Having worked for a long time on making IDN safe and usable in browsers, it’s very sad to see its uptake stunted in this way. :-( I hope they change their minds and remove that first check, but I fear it’s too late.

[0] There will also be a host of problems caused by the fact that domain names use characters from particular scripts, or perhaps multiple scripts, and IE has a list of languages. Languages and scripts have a really complex relationship – in which language is the letter é? What Accept Language do I have to have configured to correctly view www.café.com? I haven’t covered this further because it’s secondary to the even bigger problem mentioned above.

IE 7’s Effect On Firefox Market Share

rebron’s IE 7 Competitive Analysis is worth a read. I’m glad that printing has been picked out as an area where Firefox needs future work.

It’s nice to think, as rebron suggests, that IE 7 won’t take away share from Firefox. Perhaps it won’t by converting users back directly. But it could have indirect effects; where today, if people get a new computer with IE 6, they think “Ick! Install Firefox now!”, if it comes with IE 7, they may just live with it. Also, people often install Firefox for their relatives or friends – will they be as eager to do so when the feature/usability gap is smaller?

I guess the answer to that is, let’s keep that gap wide :-)

HTTPOnly For Firefox – Sort Of

IE implements a non-standard feature called HTTPOnly, which allows cookies to be set such that they are only sent back to the webserver, and are not available via JS. This mitigates cookie stealing using XSS.

Firefox hasn’t got it yet; we’re a bit held up by our legacy cookie “database” format, cookies.txt. However, Stefano Di Paola has come up with a way to implement the same feature, using the hackability of the Firefox JS engine.

Put the following line of code at the top of the first bit of JavaScript your page runs:

HTMLDocument.prototype.__defineGetter__("cookie", function () { return null; });

and, as long as the XSS injection hole is further down the page, cookie access from script will be impossible.
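For illustration, here is the same trick on a plain object, so the sketch runs outside a browser (in a page you would target HTMLDocument.prototype as above) – plus one hardening step the original snippet lacks: __defineGetter__ leaves the property configurable, so injected script could simply redefine it, whereas Object.defineProperty (a later addition to JavaScript) can lock the getter down:

```javascript
// Stand-in for the real document, so this runs anywhere:
const fakeDocument = {};

Object.defineProperty(fakeDocument, "cookie", {
  get() { return null; },  // any later script sees null, not the cookies
  configurable: false,     // and cannot redefine the getter afterwards
});
```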

“</scr” + “ipt>”

Everyone knows that you can’t use the literal text “</script>” inside a <script> block, because the web browser will interpret it as the end of the block. But do you understand why it was designed that way, or do you think it is just a bug that no-one has got around to fixing yet?

Raymond Chen has the lowdown.
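For anyone who hasn’t met the workaround in the wild: the HTML parser ends the block at the first literal “</script>” it sees, with no regard for JavaScript string syntax, so when script has to emit script the closing tag is split in two. A quick illustration (the file name is made up):

```javascript
// The surrounding <script> block would end at the first literal
// "</script>", even inside a JS string – so it is split in two:
const closingTag = "</scr" + "ipt>";

// The document.write-era idiom for emitting a script element from script:
const markup = "<scr" + "ipt src=\"other.js\"></scr" + "ipt>";
```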

Carefully Tuned User Irritation

Yngve has an interesting post about how to deal with the problem of banks etc. doing login by submitting from an insecure to a secure page.

The aim is not to protect each user’s form submission when using the broken page; the aim must be to get the bank to fix the site. So we need to change the browser to inconvenience the bank’s customers enough that they complain to the bank, but not enough that they try and change browsers to one which does not have this “feature”. In other words, we need to carefully tune the level of user irritation ;-)

So how can you inconvenience the users? One option is Yngve’s popup on submission; make the users press a big button marked “Submit Insecure Data”. That should cause a few panicky calls to the bank’s tech support line. Another option would be to delay the rendering of the next page by five seconds or so, while displaying some sort of warning in the blank space; banks like their sites to be snappy, and they don’t like worried customers.

If we are going to make browser changes, we’d need to do it in a synced up fashion, so people didn’t simply reduce their security by switching browser provider.

One last option would be to sponsor a 3rd party “major banks security assessment”, which took in details like this, the format of emails they sent out, whether they used third parties for email delivery, and so on. Publicise the results, and try and shame the lagging banks into compliance.

SSL Changes In Opera 9

Yngve Pettersen of Opera has posted about the changes they are making to SSL in Opera 9, including disabling SSL 2, introducing TLS 1.1 with some useful extensions, and warning on weak ciphers.

Galeon To Become Epiphany + Extensions

The Galeon project has announced that they and Epiphany are now close enough that the extra features that Galeon provides can be implemented as Epiphany extensions, and so plan to move forward in that direction. They feel that this way would mean much less duplication of effort.

It’s not an easy thing to make such a decision – pride often keeps “my pet project” going for much longer than it should – and so I applaud the Galeon developers, and hope that the resulting browser(s) become better than the sum of the parts.

IE’s Phishing Filter

The IE Blog has a post about the new Phishing Filter which will be built into IE 7. Basically, there’s a client-side whitelist and a server-side blacklist; if you turn the filter on, every URL you visit which is not on the whitelist gets sent off to Microsoft’s servers to be checked. And if you suspect a site is a phishing site, you can click “Report Phishing Site” on the Tools menu to send that URL off into a queue to be verified.

However, for privacy reasons, IE strips off the URL parameters before sending off URLs. And this is where the problems with such an approach start to become apparent. What guarantees that the web page the human URL checker views (requested without URL parameters) is the same one that the original reporter saw?

The URLs phishers distribute by email can be mangled and made unique in many ways; DNS wildcards, mod_rewrite and query parameters are just three. Really smart phishing site implementations would continue to serve the phishing content for a given unique URL to the same IP address or class C range, but send innocent content back to any different IP address. Or they could use cookies to achieve the same effect. Microsoft engineer Peter Torr lists quite a few methods of URL mangling while explaining why the phishing filter doesn’t use hashing. What he doesn’t say is that they are all quite effective at making the filter’s life difficult even without hashing.
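To make the mismatch concrete, here is a sketch of the parameter stripping using the modern WHATWG URL API (the example URL is invented). Any phish that keys its content on the query string serves the checker a different page from the one the victim saw:

```javascript
// Sketch: drop the query string (and fragment) from a reported URL, as the
// filter does for privacy, keeping only scheme, host and path.
function stripForReport(rawUrl) {
  const url = new URL(rawUrl);
  url.search = "";  // "?victim=abc123"-style parameters are discarded
  url.hash = "";
  return url.toString();
}
```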

Server-blacklist-based anti-phishing implementations put you in an arms race, and one in which the phishers hold all the cards. They have 20,000-strong botnets with automatic deployment tools; you have to check every submitted URL by hand. They can invent new ways of obfuscating and redirecting URLs; you are limited by the tools built into your deployed client. They have a large financial incentive; you are giving away a free product.

There’s no magic bullet, but I believe the correct route to take is a combination of greater SSL use (which means we need SSL vhosting), stronger certificate field verification and OCSP, combined with in-browser standalone heuristics and a sprinkling of user education. A minimal amount of the latter is IMO, sadly, unavoidable – it’s very hard to protect people who will put their credit card number into just any web form which asks for it.