
Oblomovka

Archive for November, 2007

2007-11-24

thanks for the future

It’s in some degree a little tragic that holidays give me a chance to REALLY GEEK OUT – as though I don’t have endless opportunities at other times. But, honestly, and perhaps equally tragically, geeking out is what I do when I need to become philosophical and retrospective and moody and irritated at everyone. It also encourages me to drink heavily, all of which I think are the true meanings of festivals in the human condition anyway.

My geeking out this Thanksgiving (apart from a hard-to-fight urge to buy consumer electronics, which I swear to God must be down to morphic resonance, because I’m really not that interested in retail usually, have successfully ad-blocked most of my life, and don’t have any money right now anyway) was a day-zero purge of the home directories on my servers and laptop.

Usually this kind of thing ends up in complete disaster, like deciding one day to take your car apart and put it back together again.

This time it’s been working out rather well. Instead of ending up naked and crying on the floor, I’m clear-headed, though a bit chilly.

There wasn’t really any reason for it, except that recently I’ve been very forward-looking: as though I’ve been trembling on the edge of a precipice in a wing-suit. I’ve not been looking down, and I’ve not been jumping, but I’ve certainly not been looking behind me either. So it was about time to push myself over.

rm -rf! goes the urge, and I gave in to it. I’m slowly re-introducing (or rewriting) all the scripts that I use, and taking into account the lessons I’ve learnt in the last few years.

It’s really easy to be frozen in the headlights when you delete your existing directory structure and start again, because you end up thinking so much about the future. But you can’t tell anything about the future, so there’s no point in being frozen. You just have to express confidence that it’s not going to be awful, and jump.

I have a fairly concrete aim, which is to see how close I can get to a setup that is a) replicated everywhere, b) able to fall back to a different machine if one breaks, c) as easy to throw at an EC2 instance as at a Nokia N810 or my Mac laptop, and d) shared as much as possible with the rest of the world.

God knows, I won’t get there, but as this has been my aim for some years, I have some lessons learnt, and some new technology has come along to help. For instance, for years I’ve been following Joey Hess’s Living Life In Subversion credo: version-controlling my home directory, and trying to keep as much of it public as I could.

It’s a great way to think of your digital life, not so much because it keeps all your documents as revertible backups (like MacOS’s Time Machine), or because it lets you sync your home directory across many systems, as because of the discipline of thinking “how much on my computer should be private, and how much should be public?”. Joey keeps a huge chunk of his home directory in a public repository, and it’s incredibly educational – both for readers and for him, I suspect.

Being more public is terrifying, and yet freeing at the same time: apart from anything else, you quickly learn to discriminate between public-because-i-created-it-and-want-to-share and public-because-it’s-not-actually-mine. That’s to say, I have a bunch of free ebooks, say, that I can make public because they’re public documents. But at the same time, I don’t worry too much about backing them up, because I have a world of backups out there already. I’m sure those of you who torrent films or used to file-share music will recognise that feeling. Why should I keep this, when so many others have a copy I could obtain easily enough? My copy of Brian Eno’s albums takes up a few megabytes on my hard drive. Should I back it up? Or should I just keep the receipt, knowing I could get a new copy so quickly if it was lost locally, here?

Thinking about what’s really private is also very clarifying. Passwords are private. Are bookmarks? Which bookmarks? What’s the minimum set of bookmarks I can make private? More people are asking this after their heavy del.icio.us use, I bet.

Here’s my current new directory structure. Like the old one, the public bits will appear sooner or later as a browseable repository on this machine. The private stuff is ghettoised into a single folder – directories that hold things like ssh settings are symlinked into it.
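
To make the symlink trick concrete, here’s a minimal sketch of the idea – the folder names are invented for illustration, not my actual layout:

    # everything private lives in one folder...
    mkdir -p ~/private
    mv ~/.ssh ~/private/ssh
    # ...and the usual locations become symlinks pointing into it
    ln -s ~/private/ssh ~/.ssh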

Most of this is (now) kept under Mercurial, a distributed version control system. The scripts are in Python, where I can help it. The structure is replicated across all my machines, with the same contents. Not everything is amenable to version control, but I have some ideas about how to keep the other stuff mirrored across all my machines too.
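
For those who haven’t met Mercurial, the day-to-day replication looks roughly like the sketch below. The hostname and paths are made up, and my real setup has a few more wrinkles, but the shape is right:

    # one-time: turn the home directory into a repository
    cd ~
    hg init
    hg add bin docs .bashrc
    hg commit -m 'initial import'

    # on each other machine: make an empty repo and pull from the first
    cd ~
    hg init
    hg pull ssh://homeserver//home/me
    hg update

    # day to day: commit locally, then push and pull between machines
    hg commit -m 'post-purge scripts'
    hg push ssh://homeserver//home/me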

I’ve also got all my machines talking to each other over IPv6. I want them to become more chatty, and less like they’re hiding on the edge. And I’m also pushing the edges of designing this system so I can share it with my closest friends: have it flexible enough to know when it is being used by someone else.
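
(I won’t go into how the IPv6 plumbing works here; but checking that two machines can actually see each other over it is as dull as the following – the hostname is, again, invented:)

    # can this machine reach the other one over IPv6?
    ping6 homeserver.example.org

    # which IPv6 addresses has this machine picked up?
    ifconfig | grep inet6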

I realise some of this must not make much sense, but hopefully, as I explore more, what I’m trying to do will become clearer. Until then, I’ll get all seasonal and aphoristic: doing this kind of purge isn’t my way of apologising for the past, but of thanking the future in advance.

2007-11-18

celebrating org’s second birthday

Update! Becky says that the sainted Joseph Rowntree Reform Trust have offered up to 10,000 UKP of matching funds for every supporter ORG gets — so if you join for a fiver a month, you’ll be giving ORG ten pounds to play with!

So Join Now! Join Twice!

It’s hard to believe that it’s two years since the Open Rights Group got its first public support from a thousand British Net pioneers. Flicking through their annual report today, it feels like they’ve been around for decades.

What ORG’s staff has done is astounding. When we first sketched out what could be done with a thousand people’s fivers, we thought we could just about pay to have someone clueful on the phone 9-5. That would be enough: redirecting, uh, clue-misdirected journalists to the unheard-from Net users and the real tech experts (not just blowhards) who could explain things without the usual fearmongering or special-interest hype.

Honestly, I felt that if we just managed to have someone sitting next to the music industry spokesman the next time a TV show swallowed the “Internet is full of pirates and criminals, and must be smacked into obedience” line, ORG would have paid its way.

Instead, here’s what ORG has done with its scant resources. After two years, it’s not just a media clearing house, although it does that too. It battled a celebrity-studded publicity campaign that sought to extend copyright terms. It fought the hype with clear facts and economics, and won – the first time that has happened anywhere.

Its advice to the Gowers report on Intellectual Property helped give the British Government the most progressive outline of future IP policies in the world.

It organized Britain’s first ever volunteer analysis of electronic voting, and showed that bad e-voting counts could have changed who won in Scotland: a revelation that still shocks me.

Right now, it is planning to advise British businesses on how to work with the new norms of copyright. And to guide it, it’s assembled an amazing group of British-based advisory board members, including the coder of Apache’s SSL support, one of Linux’s key figures, the co-founder of the UK’s first commercial Internet provider, and the drummer off of “Blur” – match the names to the reputations. They really are involved in the strategic and technical decisions that ORG makes every day – and it shows.

If you want just a recent example of the sort of in-depth knowledge ORG already shows, check out this GrokLaw interview with Becky Hogge, ORG’s Executive Director. Detailed, smart comments on the BBC iPlayer, a messy but vital part of the UK online debate right now. Now imagine that kind of knowledge being inserted, behind the scenes, in press rooms, Whitehall offices, and TV studios, day in, day out.

Part of the reason it’s been so successful is because of the incredible input of ORG supporters. It’s hard to point to the offline work this incredible team manage, but just to give you a taste: if you want the most vibrant, wide-ranging, sensible discussions of IP and privacy online, subscribe to ORG-discuss, a list that has representatives of the music industry, Her Majesty’s sceptical press, security mavens, and free software advocates. It’s knowledgeable *and* very civil, a minor miracle in itself. You might also want to check out ORG’s equally smart wiki.

Here is where I ask you for money

So, here’s the most amazing thing. ORG doesn’t do that on a thousand people’s fivers at all. ORG does it on less.

To get our ballpark income, ORG would have had to convert every single one of the pledge-signers. I think we got around 50%.

So to celebrate two years, I encourage everyone to try and push the membership up to the promised one thousand. No, two thousand.

If you’re an ORG supporter, pressgang two of your friends to join. Find that online pal who is even more fanatical than you in pursuit of digital rights. Tell the blowhards on Digg or Slashdot it’s time to put their pounds where their posts are. Heck, buy one in your mum or niece’s name for Christmas: it’s their Internet too. And check whether your own membership has lapsed (It happens – *blush* mine expired earlier this year, and I missed the memo – I’m back in the black now). Just click here.

Think what ORG can do in the next two years. Think what we can do with 2000 members. Think what we can do with 20,000.

Most of all, think what will happen if we don’t do something.

2007-11-14

how many nines does one person need?

In case you think this piece is more incoherent than usual, I should explain that you’re reading it as I write it. More on that when I finally write the conclusion to the piece. If you’re looking at this in an RSS reader and there are about five million other, earlier versions, I apologise. Your aggregator is doing something that I wasn’t expecting, and I think may be a little silly. As the old spammers used to say, “just hit delete!”

So the edge (which is to say, where you live: your home server, your cellphone even, whatever is closest to you on the Internet) seems to be getting more reachable than it was. But what about reliability? If you’ve ever run an important service on your home machine, you’ll know about the Vacation Effect. This is a mysterious force which causes the home computer that handles your email to crash within hours of you leaving home for a three-week vacation, causing you to have to advertise for burglars on Craigslist to break into your home and reboot it.

Even if you can imagine the hardware at your house to be somehow more reliable, there’s always your flakey Net connection, which is up and down like a sine wave’s drawers. Dynamic IPs dynamically change, cable modems reset every few minutes, DSLs are flippantly unplugged by backhoes and disgruntled CO engineers. How could you possibly imagine you could run a reliable service on that?

So, as some of you might have guessed, I moved Oblomovka off its co-loc a few weeks ago when I started this series, and transferred the whole website to my Mac Mini (perversely running Ubuntu) that I have in my cupboard here at home. I haven’t heard anyone complain about its unavailability, but then again I haven’t invited comments. I’m pretty sure it’s been down a few times, and as its heaviest user, I know I haven’t been able to ping it on occasion.

But it hasn’t been a huge problem. Partly because I’m not really an essential service to anyone else. Oblomovka down? Oh well, I guess I only have a few hundred other blogs to search. Mostly, though, I think it’s down to my first major point:

Now, smart readers may have spotted the problem here. I began this discussion because I was confused and worried that we hand over our most private data to companies like Google, and SixApart, and Amazon, when really the safest place to keep private data is on your own machine. Am I now suggesting that it’s somehow okay for your edge server to be flakey, because, hey, you can always use Google, and SixApart, and Amazon? Aren’t I contradicting myself?

Yes. NO. Hold on. There’s a real difference between holding your data yourself and passing it to others in the event of emergency or changed circumstance. It would be better to think of these cloud services not as where you keep your data, but as temporary caches for the edge. I have an encrypted backup stored somewhere out there. I’m confident that only I have access to that. I’m not sure what I’d do if my home system did go down, but in an ideal world, I’d just feed that backup my key, have it float into operation on an EC2 machine, and then point Oblomovka’s DNS at it. When I had time to get things back to normal at home, I’d evaporate that EC2 machine. Oh, sure, evil gnomes from DoubleClick or the NSA might have pickpocketed that image while it was running, but I would at least have some intimation that it was happening (just as the cops might break into my house and feel my pants, but I’d have a hair on the window if I was truly that paranoid). And if what we were talking about was truly just a cache of my current state (like a memcache of the last time my server was around), then it would not expose much deep knowledge of my precious life.
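
The encrypted-backup half of that is the easy bit, and needs nothing fancier than something like the sketch below – the filenames and destination host are stand-ins, not my real arrangement:

    # bundle up the home directory and encrypt it before it leaves the house
    tar czf - ~ | gpg --symmetric --output home-backup.tar.gz.gpg

    # park the ciphertext somewhere out in the cloud
    scp home-backup.tar.gz.gpg cloud.example.com:backups/

    # in an emergency, feed it the key and unpack it wherever you land
    gpg --decrypt home-backup.tar.gz.gpg | tar xzf -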

Especially as the benefits of being on the edge grow. For instance, for those of you peering at this from your RSS feed or on the Website before 12:20AM PDT, you’ll have noticed that I’ve been typing this directly into my home server. The words appear almost one by one on the site, as I tweak and update it. I could do that with a remote server, but it’d be a pain: the proximity of the edge to where I am gives me the low latency and the reactivity to do these things. I feel nearer to the Net when it works like this. It feels more like I thought the Net would be, and more like how I think it will be soon.

Reached this point at: 2007-11-15T00:11-0800. Keeping the message at the top to give an idea of how this might have felt to watch being typed at the time. Still under construction for typos and grammar fixes.

2007-11-13

reachability on the edge

I realise that I left everybody on tenterhooks in that last post about NAT and the unreliability of most consumer-quality endlinks. I’m sorry to have kept you waiting for my poorly-thought-out flailings so long. In my defense, my dreams have become far too weird to document here. Suffice to say they have included a dream that was half about falling in love in your mid-twenties, and half about being a prominent insect collector, and one stormer of a reverie in which I dreamt that I was writing one of these blog entries about dreams. How self-referential can you get? As self-referential as this paragraph, is the answer.

Anyway, one of the problems with the edge is that these days it is largely behind NATs and unreachable for incoming connections. Usually we get around this by using ICE, STUN, and other horrendous hacks that you don’t want to stare at too long. But is this a problem these days? I think not, for a couple of reasons.

Yeah, but what about availability? Aren’t consumer last-mile connections prone to zzzzzzzt NO CARRIER?

Tomorrow! I promise!

(If you enjoy these sorts of discussions, but wish they were a little less one-sided, let me here point you to Zooko’s vintage-yet-riproaring P2P-hackers list, where you can learn from people who actually understand these things.)