Jorge Villalobos
Ideas, technology and the occasional rant

Wednesday, January 18, 2006

The new "ping" attribute might ruin Firefox

When I click on a link on a page, I want the browser to go to that page. Nothing more, nothing less, nothing else. A simple anchor tag with a simple href. Sometimes even javascript is acceptable, but I'm not very fond of that type of link. I hate clicking on something without knowing where I'm going or what's happening in the background.
Some people may wonder what's the point of having such obscure links and not simple straightforward anchors. The answer: user activity tracking. See, most website administrators will gather and store everything they can about your browsing habits, whether you like it or not. Most of it is pretty harmless and helps them improve their service. They identify the most visited pages and give them more importance. The less visited ones are analyzed in order to identify problems or broken links. Nothing to panic about; they just want to know if you are finding everything without problems.
All of this can be done with relative ease using server logs. A user can be tracked by IP address, and the pages requested can be ordered through time to get a very reliable browsing history. What's missing here? Well, you don't know how the user exited the site.
You can easily identify the last page the user visited within your site, but you don't know what the user did next. Maybe he closed the window or tab. Maybe he clicked on an external link. But which one? How do you know where your users are going? Well, now is when things begin to get ugly.
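To make the log analysis concrete, here's a rough sketch of how a site could rebuild each visitor's path from its server logs. The log format here is an assumed, simplified one ("ip timestamp path"); real log formats carry much more.

```javascript
// Sketch: rebuild per-visitor browsing paths from raw server log lines.
// Assumed, simplified log format: "ip timestamp path".
function browsingHistories(logLines) {
  const byIp = new Map();
  for (const line of logLines) {
    const [ip, ts, path] = line.trim().split(/\s+/);
    if (!byIp.has(ip)) byIp.set(ip, []);
    byIp.get(ip).push({ ts: Number(ts), path });
  }
  // Order each visitor's requests by time to get their path through the site.
  for (const visits of byIp.values()) visits.sort((a, b) => a.ts - b.ts);
  return byIp;
}
```

The last entry for each IP is the "exit page" — and that's exactly where the trail goes cold.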
Some websites currently use a wide variety of techniques to find out exactly where their visitors are going. One that is very easy to identify is when links point to something like http://www.somesite.com/link?id=200 or http://www.somesite.com/link?url=http://www.someothersite.com. You can tell you're not going directly to the place specified; instead you're being redirected through a special page that stores your click and then forwards you to where you wanted to go. Now they know where you went.
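The nice thing about this pattern is that it can be undone. Here's a small sketch of how the real destination can be recovered from such a link; the "url" query parameter name is an assumption, since every site picks its own.

```javascript
// Sketch: recover the real destination from a redirect-style tracking link.
// The "url" parameter name is an assumption; sites vary.
function unwrapRedirect(href) {
  const target = new URL(href).searchParams.get("url");
  // Only trust absolute http(s) targets; otherwise keep the link as-is.
  return target && /^https?:\/\//.test(target) ? target : href;
}
```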
Other, more devious techniques use javascript events to change the URL the moment you click, so visitors see the correct URL when hovering over the link but are actually sent somewhere else. That's just plain wrong in my book.
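For the curious, this is roughly what the trick looks like from the tracker's side. The tracker URL is hypothetical; the point is that the link shows its real destination on hover, and the mousedown handler rewrites it an instant before the browser follows it.

```javascript
// Sketch of the href-swap trick (hypothetical trackerBase URL).
// The link looks honest on hover; the mousedown handler swaps the
// href right before the browser follows it.
function addClickTracking(link, trackerBase) {
  const realHref = link.href;
  link.addEventListener("mousedown", () => {
    link.href = trackerBase + encodeURIComponent(realHref);
  });
}
```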
These click tracking tricks (try saying that three times in a row) slow down browsing, annoy the tech-savvy and completely fool everybody else into doing things they never agreed to. I wouldn't quite call it spyware, but it's close, and very wrong to say the least. Now is when things get really, really ugly.
Darin Fisher, Mozilla developer and Google employee, recently blogged about a new, experimental attribute that has been added to anchor tags in the Firefox developer builds. In his words:
It is now possible to define a ping attribute on anchor and area tags. When a user follows a link via one of these tags, the browser will send notification pings to the specified URLs after following the link.
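In markup, the attribute Darin describes would look something like this (the URLs are made up for illustration):

```html
<!-- The href is followed as usual; the browser also sends a
     notification ping to each URL listed in the ping attribute.
     Both URLs here are hypothetical. -->
<a href="http://www.someothersite.com/"
   ping="http://www.somesite.com/track">
  An ordinary-looking link
</a>
```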

Now, let's be very clear: this is currently present only on Firefox developer builds and might never make it to the public, which is part of the reason for today's rant. No sites implement this attribute yet and it's very unlikely that they ever will. You should also read Darin's response to user feedback to form your own opinion about this issue. Now follows my own.
This feature will be labeled as spyware unless it can be easily disabled. The current version can be disabled through a preference (reportedly browser.send_pings in about:config), and there's no reason to believe that will change. In the worst case it will be easy to disable through an extension or a Greasemonkey script, both of which must be in the making as we speak.
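A Greasemonkey-style script for this would be almost trivial. Here's a sketch, using the modern DOM API as a stand-in, that just strips the attribute from every link and area on the page as a belt-and-braces complement to the preference:

```javascript
// Sketch: strip the experimental ping attribute from every anchor and
// area element in the document. Returns how many were cleaned.
function stripPings(doc) {
  let stripped = 0;
  for (const el of doc.querySelectorAll("a[ping], area[ping]")) {
    el.removeAttribute("ping");
    stripped++;
  }
  return stripped;
}
```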
Now, if website administrators really want this information to the point of using such ugly techniques as they currently do, why would they rely on such a weak and, may I add with utmost disgust, BROWSER-SPECIFIC feature? They won't. They will continue doing what they do, possibly adding an extra "ping", you know, "just in case". So now we have more unnecessary traffic and the problem hasn't changed at all. Calling Microsoft PR...
The way I see it, this issue is no different from popups. It's a battle between browser makers and webmasters. One side wants to give users the best experience possible; the other wants to sell whatever it sells in the best way possible, usually stepping over the rights and comfort of its visitors. Popups must be blocked, and link redirections should be blocked as well. I'm thinking some work could be done with Greasemonkey to prevent this on a per-site basis. I know the problem is very complicated, and just as with popups, new ways to bypass the protections will be found. Still, I think it's a fight worth fighting, and hopefully it will lead more site administrators to abandon the arms race and focus on doing their jobs.
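The per-site Greasemonkey idea could start as something like this sketch: walk the page and point every wrapped outbound link straight at its real destination. The "url" parameter name is an assumption that would need tuning for each site the script targets.

```javascript
// Sketch: de-reference redirect-wrapped links in a page.
// Assumes the site stores the real destination in a "url" parameter.
function derefLinks(doc) {
  let fixed = 0;
  for (const a of doc.querySelectorAll("a[href]")) {
    const target = new URL(a.href).searchParams.get("url");
    if (target && /^https?:\/\//.test(target)) {
      a.href = target;
      fixed++;
    }
  }
  return fixed;
}
```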
I won't even get started on the WHATWG. Make up your own minds. Apparently the word "standard" means nothing to some people. Don't get me wrong, I know standards are driven by innovation, but there's an order to things. If the W3C wants to go with this abomination, fine, let it be a standard and then you can implement it. At least you would have somebody else to blame, instead of giving Mozilla such a bad name.
Because, in the end, all of this is about reputation. "The browser you can trust," remember? A lot of people are seriously angry about this. They may be making rushed decisions or just trolling, but in reality a lot of us are disturbed by the direction Mozilla development could be taking. I know (now) that there's a Bugzilla entry about this change, but they could have been a little more open about it, considering it is obviously a very controversial feature. A lot of people probably feel they weren't asked, or they would have complained before. I know I do.
This better not make it out of the experimental phase. It will become Firefox's ActiveX if you continue with it. Enough damage has been done already, bury the damned thing. Leave the unilateral "my standard" making to the unprofessionals.

Get Firefox 1.5!

Monday, January 09, 2006

Tricking the robot guards

Unpredictability is something most software developers don't want to see in their creations. Gotta hate when that program crashes right after it was working "just fine". I'm sure events give GUI developers headaches and hardware interrupts give kernel developers horrible, horrible nightmares. Most of us work striving for quite the opposite: total predictability and reliability. And I am happy with that.
Yet, when we deal with the unknown, we should always expect the worst. We never know what's on the other side of the interface. Even if we do, we can't assume it will remain the same over time. We have to deal with unpredictable behavior as best as we can, even if it has to come down to showing a miserable "I give up" message to the user.
Still, it's ridiculous to expect programs to be prepared for all kinds of random behavior, and this is something that probably hasn't been exploited enough. I'm no expert in malicious code, but I have a brain, so here goes: I think malware could take advantage of random behavior to further infiltrate a system and conceal its existence.
As I said, I'm no malware writer. Quite frankly I despise people who waste their time on such things; software writers shouldn't have to spend half their time building walls and electric fences around their code. But I digress. My point is that I'm not really sure how random behavior could be exploited to obtain easier access to a system, so I'll focus on the second part: concealing presence.
There are tools designed to detect malware. Since malware comes in different flavors, these tools tend to use a wide variety of detection techniques to cover (ideally) all types. Some look for specific files, some look for certain running processes, some look for differences in files.
Now, certain malware programs will replace DLLs or executables in order to introduce their harmful behavior into the system. Malware authors, in all their brightness, don't look two steps ahead: they simply make their programs discard the original files and introduce their own. That makes them easy to catch, because the abnormal behavior shows up simply by executing the code in a controlled environment.
So here's what I think: malware that changes some interface or binary could randomly switch between wrong behavior and normal behavior, making detection a little harder. There's some probability that the malware will not be detected by that technique, because the normal behavior could be observed instead. I'm pretty sure the robot guards are not programmed for unpredictability. Malware detectors follow predefined detection lists and techniques, and are unlikely to repeatedly perform the same test on the same component within a short span of time.
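The arithmetic behind this is simple. If the malware misbehaves on any given run with probability p, a scanner that executes the component once catches it with probability p, but k independent runs catch it with probability 1 - (1 - p)^k:

```javascript
// Back-of-envelope: probability that at least one of k independent
// execution tests observes the misbehavior, if each run misbehaves
// with probability p.
function detectionProbability(p, k) {
  return 1 - Math.pow(1 - p, k);
}
```

With p = 0.5, a single execution test is a coin flip, but four runs already catch the malware more than 93% of the time — which is exactly why testing more than once matters.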
So this is a heads up for the people that work fighting malware: think unpredictable. If you only test by execution, do so more than once. Perhaps you already considered this, but I think it was worth mentioning.
Also, unpredictability works both ways. If I have to break into a place, I'll take a Hollywood-style laser grid over a guard dog any day. Improvisation and randomness are very powerful tools, and changing the rules of the game is something trespassers are never prepared for. I don't have any specific ideas except maybe file relocation or security hole simulation, but some clever ideas could come out of this line of thinking.
Security seems to be a topic I keep coming back to. I don't know why I think so much about it. It's not an area I'm very interested in, but past jobs and future posts seem to revolve around the topic. I'm not a specialist in... well, anything, but I hope this can help in some way. This is to the people who spend their day looking for ways of making our computers safer. Keep up the good work.


Friday, January 06, 2006

Philosophy on the road

I arrived in the UK yesterday after the longest trip I've ever taken, and I'm already experiencing the difference. I saw a road sign that said this:
"Tiredness can kill. Take a break."
Traffic accident prevention or life philosophy? I for one am adding this to my list of life mottos. Maybe it's just a way to justify the endless hours I've wasted playing Ultima 7 recently. Oh, well, those religious freaks aren't going to kill themselves.
I'm loving it so far. Let's see how everything else goes.
