GCHQoogle: so much for "Don't be evil"

Given what we now know about the mass surveillance of, and attacks on, the infrastructure of the internet conducted by Britain’s GCHQ and America’s NSA (as well as their Chinese, Russian, German, etc. counterparts), and given that we now know, for a fact, that almost every byte of non-encrypted traffic is recorded and analysed, shouldn’t we now make a concerted effort to finally deprecate vanilla HTTP in favour of HTTP over TLS (HTTPS)?

When you use HTTP, it is a trivial matter for an attacker to see the content of the pages you visit, as well as when and how often you visit them. When using HTTP, there is also no guarantee that the content of a page hasn’t been modified without your knowledge, exposing you to all kinds of attacks.

Encryption, by and large, removes these problems, as well as massively increasing the cost of mass surveillance. Is it not time for all of us, as well as standards organisations like the IETF, to push to make HTTPS the default? Even during my time I’ve seen insecure protocols like telnet and FTP go from widespread use to being almost completely replaced by secure alternatives (ssh and scp), so could we not do the same with HTTP?

Certificate authorities

Ok, there is one big difference between HTTPS and ssh (ok, many, many, but one I care about here), and that is that HTTPS relies on certificate authorities. These are necessary in order to distribute trust, so that browsers know when to automatically accept a certificate and can verify that the server they are connecting to is who it says it is.

This is much nicer for the average user than, say, manually verifying the server’s fingerprint (as you have to do with SSH), but comes with some pretty serious problems if we were to make it default:

  • Every site owner would have to get a certificate, and these can only be obtained from a certificate authority if you don’t want browsers to pop up a big red warning, meaning we bake these guys even further into the Internet’s DNA.
  • Certificate authorities can be directly pressured by governments, so a government attacker could MITM you on a secure connection and present you with a certificate that your browser accepts as valid, giving you no warning (of course, this is much more costly than the blanket mass surveillance that is currently going on).
  • Getting a certificate costs money and/or has restrictions placed on its use (for example, no commercial use, in the case of StartCom). This is really bad, since it essentially requires permission from a third party to launch a site.

It is this last point that causes me most concern, since it essentially provides an easy way of suppressing minority views.

Imagine that we lived in a world where HTTP had been deprecated and browsers no longer supported unencrypted HTTP (or could, but you had to request it specifically – essentially the reverse of what we currently have). Say you wanted to launch a site that expressed a minority view – perhaps you were critical of your government, or you wanted to leak some information about crimes being committed. Is it not conceivable that you could have trouble obtaining a certificate, given that certificate authorities are companies who worry about their bottom line, and are a convenient point for the bad guys to apply pressure?

If you couldn’t get a certificate in this environment, it could dramatically reduce the audience that would see your site.

So, perhaps before we move to deprecate HTTP, we must first find a better way than certificate authorities to distribute trust? How could we accomplish this? Perhaps we could take advantage of the fact that most people’s browsers automatically update, and so we could distribute browsers with expected certificates for sites hard coded into them (giving an added advantage that we could pin certificates)?
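As a sketch of what hard-coded pinning amounts to, the check itself is simple: hash the certificate the server actually presents and compare it against a fingerprint shipped with the browser. This is a minimal illustration, not how any real browser does it – the function name and the idea of a single hex pin are my own assumptions:

```python
import hashlib
import hmac

def cert_matches_pin(der_cert: bytes, pinned_sha256_hex: str) -> bool:
    """Return True if the SHA-256 fingerprint of a DER-encoded
    certificate matches the fingerprint shipped ("pinned") with
    the client."""
    fingerprint = hashlib.sha256(der_cert).hexdigest()
    # hmac.compare_digest gives a timing-safe comparison.
    return hmac.compare_digest(fingerprint, pinned_sha256_hex.lower())
```

In practice you would obtain `der_cert` after the TLS handshake (e.g. via `ssl.SSLSocket.getpeercert(binary_form=True)`) and abort the connection if the check fails; a real scheme would also need to cope with certificate rotation.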

Anyway, it’s complicated, and I’m thinking aloud here… what are your thoughts?

WebMention is a modern re-implementation of PingBack which uses only HTTP and x-www-form-urlencoded content, rather than the infinitely more complicated, not to mention bloated, XML-RPC requests. It was developed by the #indieweb community at IndieWebCamp, and is rapidly seeing adoption.
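The whole protocol really is just a form-encoded POST: the sender submits its own page URL as `source` and the page it mentions as `target`. A minimal sketch in Python (the URLs in the usage example are placeholders):

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_webmention(endpoint: str, source: str, target: str) -> Request:
    """Build the POST request a webmention sender makes: the body is
    just two x-www-form-urlencoded parameters, source and target."""
    body = urlencode({"source": source, "target": target}).encode("ascii")
    return Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
```

Passing the result to `urllib.request.urlopen` would deliver the ping; the receiving endpoint is then expected to verify that `source` really does link to `target` before accepting the mention.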

Since the best way to understand a protocol is to write an implementation of it, I bashed together a basic implementation for Elgg.

The plugin will automatically send webmention pings for content with URLs in the $object->description field (you can easily expand on this). It also exposes a webmention endpoint, and sets the appropriate discovery headers and meta tags. Plugin authors can hook into the plugin hooks that elgg-webmention triggers and handle incoming mentions appropriately.
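Discovery, as advertised by those headers, works by the receiver exposing its endpoint in an HTTP Link header (or an equivalent link tag) with the rel value "webmention", which senders then scan for. A rough sketch of parsing the header form – real-world Link headers need fuller parsing than this:

```python
import re
from typing import Optional

def discover_endpoint(link_header: str) -> Optional[str]:
    """Pull the webmention endpoint out of an HTTP Link header such as
    '<https://example.com/webmention>; rel="webmention"'."""
    for part in link_header.split(","):
        match = re.search(r'<([^>]+)>\s*;\s*rel="?([^";]+)"?', part.strip())
        # rel can hold several space-separated values, so split and check.
        if match and "webmention" in match.group(2).split():
            return match.group(1)
    return None
```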

There is still a little more to do; the next step, I think, is to hook into a microformats parser in order to get some richer semantic information about the type of mention being generated. My friend Ben has a very neat video of this kind of thing in action, and his idno project already implements it in its core code.

Have a play!

» Visit the project on Github…