Email inboxes and the GTD 2-minute rule

March 20, 2014 12:46 am

Today’s dose of structured procrastination resulted in something I’ve been meaning to build for quite a while: a timer to help apply the two-minute rule from David Allen’s famous GTD (Getting Things Done) system to the processing of a maildir-format email inbox.

Briefly, the idea is that when processing your inbox, for each email you have a maximum of two minutes to either:

  • perform any actions required by that email, or
  • add any such actions to your TODO list, and move the email out of the inbox. (IMHO, best practice is to move it to an archive folder and have a system for rapid retrieval of the email via the TODO list item, e.g. a hyperlink which retrieves the email by its Message-Id: header using an indexing mail search engine; see the example below.)
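
For instance, with notmuch (just one example of an indexing search engine which can do this; the Message-Id below is a made-up placeholder), retrieval is a one-liner:

# Find and display an archived mail by its Message-Id: header
# (angle brackets stripped):
notmuch search id:1234.abcd@example.com
notmuch show id:1234.abcd@example.com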

However, I find that I frequently exhibit the bad habit of fidgeting with my inbox – in other words, checking it over and over to satisfy my curiosity about what new mail is there, without actually taking any useful action as required by the processing (a.k.a. clarification) step. This isn’t just a waste of time; it also increases my stress levels by making me aware of things I need to do whilst miserably failing to help get them done.

Another bad habit I have is mixing up the processing/clarification phase with the organizing and doing phases – in other words, I look at an email in my inbox, realise that it requires me to perform some actions, and then immediately launch right into the actions without any thought as to how urgent they are or how long they will take. This is another great way of increasing stress levels when the actions are not urgent and could take a long time, because at least subconsciously I’m usually aware that this is just another form of procrastination.

So today I wrote this simple Ruby timer program which constantly monitors the number of emails in the given maildir-formatted folder, and shows you how much of the two minutes you have left to process the item you are currently looking at. Here’s a snippet of the output:

1:23    24 mails
1:22    24 mails
1:21    24 mails
1:20    24 mails
Processed 1 mail in 41s!
Average velocity now 57s per mail
At this rate you will hit your target of 0 mails in 21m 55s, at 2014-03-19 23:18:59 +0000
2:00    23 mails
1:59    23 mails
1:58    23 mails
1:57    23 mails

You can see that each time you process a mail and remove it from the email folder, it resets the counter back to two minutes. If you exceed the two-minute budget, it will start beeping annoyingly, to prod you back into adherence to the rule.
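
In case you’re curious, here’s a stripped-down Ruby sketch of the core loop. It’s not the real program’s source (the terminal-bell beep, the constants, and the maildir handling are all simplified for illustration):

BUDGET = 120  # the two-minute rule, in seconds

maildir = ARGV.fetch(0)  # e.g. ~/mail/inbox

# A maildir keeps one file per message under its new/ and cur/ subdirectories.
def count_mails(maildir)
  %w[new cur].sum { |dir| Dir.glob(File.join(maildir, dir, '*')).size }
end

started   = Time.now
processed = 0
count     = count_mails(maildir)
deadline  = Time.now + BUDGET

loop do
  current = count_mails(maildir)
  if current < count
    processed += count - current
    velocity   = (Time.now - started) / processed  # seconds per mail
    eta        = current * velocity                # seconds until empty
    puts format('Average velocity now %ds per mail', velocity)
    puts format('At this rate you will hit your target of 0 mails in %dm %ds',
                eta / 60, eta % 60)
    deadline = Time.now + BUDGET  # fresh two minutes for the next mail
  end
  count = current

  remaining = (deadline - Time.now).round
  if remaining >= 0
    puts format('%d:%02d    %d mails', remaining / 60, remaining % 60, count)
  else
    print "\a"  # over budget: beep annoyingly until the mail is dealt with
  end
  sleep 1
end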

So for example if you have 30 mails in your inbox, using this timer it should take you an absolute maximum of one hour to process them all (“process” in the sense defined within David Allen’s GTD system, not to complete all associated tasks).

Since gamification seems to be the new hip buzzword on the block, I should mention I’m already enjoying the fact that this turns the mundane chore of churning through an inbox into something of a fun game – seeing how quickly I can get through everything. And I already have an item on the TODO list for collecting statistics about each “run”, so that I can see stuff like:

  • on average how many emails I process daily
  • how often I process email
  • on average how many emails I process during each “sitting”
  • how much time I spend processing email
  • whether I’m getting faster over time

I also really like being able to see an estimate of the remaining time – I expect this will really help me decide whether I should be processing or doing. E.g. if I have deadlines looming and I know it’s going to take two hours to process my inbox, I’m more likely to consciously decide to ignore it until the work for my deadline is complete.

Other TODO items include improving the interface to give a nice big timer and/or progress bar, and the option of a GTK interface or similar. Pull requests are of course very welcome ;-)

For mutt users, this approach can work nicely in conjunction with a trick which helps focus on a single mail thread at a time.

Hope that was useful or at least interesting. If you end up using this hack, I’d love to hear about it!


Easier upstreaming / back-porting of patch series with git

September 19, 2013 9:22 pm

Have you ever needed to port a selection of commits from one git branch to another, but without doing a full merge? This is a common challenge, e.g.

  • forward-porting / upstreaming bugfixes from a stable release branch to a development branch, or
  • back-porting features from a development branch to a stable release branch.

Of course, git already goes quite some way to making this possible:

  • git cherry-pick can port individual commits, or even a range of commits (since git 1.7.2) from anywhere, into the current branch.
  • git cherry can compare a branch with its upstream branch and find which commits have been upstreamed and which haven’t. This command is particularly clever because, thanks to git patch-id, it can correctly spot when a commit has been upstreamed, even when the upstreaming process resulted in changes to the commit message, line numbers, or whitespace.
  • git rebase --onto can transplant a contiguous series of commits onto another branch. (Typical invocations of all three are shown below.)
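
To make those concrete, typical invocations look like this (all branch and commit names are placeholders):

$ git cherry-pick abc123                  # port a single commit
$ git cherry-pick abc123..def456          # port a range (git 1.7.2 or newer)

$ git cherry -v upstream topic            # '+' marks commits not yet upstreamed;
                                          # '-' marks ones upstream already has

$ git rebase --onto newbase oldbase topic # replay oldbase..topic onto newbase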

It’s not always that easy …

However, on the occasions when you need to sift through a larger number of commits on one branch, and port them to another branch, complications can arise:

  • If cherry-picking a commit results in changes to its patch context, git patch-id will return a different SHA-1, and subsequent invocations of git cherry will incorrectly tell you that you haven’t yet ported that commit. (This is demonstrated just after this list.)
  • If you mess something up in the middle of a git rebase, recovery can be awkward, and git rebase --abort will land you back at square one, undoing a lot of your hard work.
  • If the porting process is big enough, it could take days or even weeks, so you need some way of reliably tracking which commits have already been ported and which still need porting. In this case you may well want to adopt a divide-and-conquer approach by sharing out the porting workload between team-mates.
  • The more the two branches have diverged, the more likely it is that conflicts will be encountered during cherry-picking.
  • There may be commits within the range you are looking at which, after review, you decide should be excluded from the port, or at least postponed until later.
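
The first of those pitfalls is easy to demonstrate by hand: git patch-id reads a diff on stdin and prints an ID computed from its normalized contents, where line numbers and whitespace are normalized away but context lines are not (the commit names below are placeholders):

$ git show original-commit | git patch-id
$ git show ported-commit | git patch-id
# If porting changed the patch's context lines, these print different
# patch IDs, so git cherry keeps reporting the commit as unported.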

It could be argued that all of these problems can be avoided with the right branch and release management workflows, and I don’t want to debate that in this post. However, this is the real world, and sometimes it just happens that you have to deal with a porting task which is less than trivial. Well, that happened to me and my team not so long ago, so I’m here to tell you that I have written and published some tools to solve these problems. If that’s of interest, then read on!



announcing the Scale Matcher!

August 23, 2013 7:00 pm

I’ve been a bit of a hermit the last few weeks, burning the candle at both ends and spending the majority of my spare time building a new toy … well, actually it started out as a toy, but now I think it’s good enough for musicians to use as a serious tool for improving their improvisation / compositional skills, and harmonic understanding.

So I’m very pleased (and relieved) to be able to announce … <drum roll> … the Scale Matcher!  It should work equally well on your computer, phone, and tablet.  Please try it out and let me know what you think!  You can also click the About and FAQ buttons to find out more.

Thanks to Barak Schmool for providing the original inspiration to do this, and for the time he spent testing it out and suggesting improvements.

Scale Matcher home page

Scale Matcher sample results page


cello lessons from a dead genius

January 28, 2013 8:05 pm

Well, it seemed like a good idea at the time …

In the summer of 2011, I quit my job to resume full-time music studies. During the summer semester at the Berkeley Jazzschool in California, I started learning John Coltrane’s solo on the title track of his famous album Blue Train. It was really tough going, but addictive – I was getting my arse handed to me on a plate on a daily basis by a dead person, but I felt like I was way off the well-trodden path and that was really satisfying!

After 3 months studying in various places in the USA, I got back home and resumed work on this transcription in earnest. It became part of my daily routine, and I craved the day that I could play the whole thing note perfect at the same speed as the original. There were so many notes to fit in that I had to come up with totally new ways to use my left thumb, on which the normal cellist’s callus grew to epic proportions. Trane became the best cello teacher I never had. Unfortunately, just around the time I was getting close to being able to nail it, real life intervened, and I had to refocus on earning money. Inspired by Benoît Sauvé’s incredible rendition of the same solo on recorder (recorder?! what a mofo – check out his other videos), I did a couple of very rough recordings with my compact camera for posterity, and moved on.

Sometime later, I discovered a John McLaughlin video on YouTube (sadly no longer available) which had an awesome animated transcription at the bottom – a really cool glimpse inside the craft of a master musician. Then it occurred to me that I could do the same kind of thing with my video, and publish it in case there are any other jazz cellists out there who would be interested in it. I put a lot of effort into notating and fingering it, so it seemed a waste to just let it rot and never see the light of day. After all, I already had the source files and a video, so it was just a simple matter of combining the two, right? How hard could it be?

Very, very hard, it turns out. I had to write two new pieces of software, completely overhaul a third, and fix some obscure bugs hidden deep inside a fourth. But I didn’t discover that until I’d reached the point of no return …

I’ll probably blog more at some point about the software engineering hoops I had to jump through in order to make this all work. Email me if you’re interested.

In the meantime, hope you enjoy the video! (You can also view it on YouTube.)


Tool-building hacks #1: audible pings

February 11, 2012 8:07 pm

I think I’m genetically a tool-builder. My dad and uncle both take great pleasure in carefully selecting and buying or building tools for their workshops, my mum’s an expert woodcarver with a fair array of sharp pointy things, and these are not the only examples in the family. For me, the habit has manifested in electronic form, and I really enjoy programming new scripts etc. to help me work more efficiently. In fact I’ve collected such a vast array of them over the years that I had to start tracking them under version control back in 1999. (CVS did me proud for many years, but it did not age well, and I finally migrated them to git a few months ago.)

I can’t claim to be remotely unusual in this respect though – there’s a whole subculture of programmers (“hackers”) who can relate to this mindset. Sometimes when I build a new tool, I get the impulse to share it with the world in case it turns out to be of use to anyone else. In the past, these hacks have ended up on my software web page, but I think this blog might be a better medium.

So, without further ado, here’s a cute hack I just built: bping, a wrapper around ping(8) which makes it do a lot of beeping ;-) – one beep per packet received, with pitch going up an octave for every doubling of the response latency. (See below for installation instructions.) Why did I want this?

I use Ethernet over powerline to connect the machines in my bedroom to the equipment in the lounge. In a high-rise apartment block with about 100 wireless networks fighting over the same spectrum band, this is (normally) a much more reliable option. However, yesterday the connections in my home network started behaving very weirdly. I tried pinging various machines on the network from each other to narrow down the problem, but it was annoying to have to keep going between rooms to visually monitor the output from the various pings, when it would have been quicker to be able to hear the quality of the connections being tested. So I wrote bping, which also turns a laptop into a sort of Geiger counter for wifi.
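
The pitch mapping is simpler than it sounds: an octave per doubling of latency just means the beep frequency is directly proportional to the latency. Here’s a minimal Ruby sketch of the idea (bping itself is built on bip and sox instead; the base frequency, reference latency and use of sox’s play command below are illustrative choices):

# Pipe ping output through this for one beep per reply, rising an
# octave for every doubling of the round-trip time.
# Usage: ping example.com | ruby beep-ping.rb   (requires sox's `play`)

BASE_FREQ = 220.0  # Hz emitted at the reference latency
BASE_MS   = 10.0   # reference latency in milliseconds

STDIN.each_line do |line|
  next unless line =~ /time=([\d.]+) ?ms/
  latency = Regexp.last_match(1).to_f
  # Frequency proportional to latency <=> one octave per doubling.
  freq = BASE_FREQ * (latency / BASE_MS)
  system('play', '-q', '-n', 'synth', '0.1', 'sine', freq.round.to_s)
end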

Currently bping uses bip (here’s a suggested approach to installation), which in turn uses sox to make the beeps. This works fine on Fedora 15, but unfortunately for some reason takes longer than one second to run on Ubuntu 11.10 and openSUSE 12.1, regardless of how short the beep is. I think it’s something to do with pulseaudio, but I haven’t bothered to figure it out yet. Answers on an e-postcard please …

By the way, many of my hacks are already publicly available on my github account, but most aren’t documented yet. The current exceptions are:

I’ll try to document some of the others soon. Watch this space!

