Currently showing posts tagged: Free Software

Cloud rearrangement for fun and profit

May 17, 2015 4:42 am

In a populated compute cloud, there are several scenarios in which it’s beneficial to be able to rearrange VM guest instances into a different placement across the hypervisor hosts via migration (live or otherwise). These use cases typically fall into three categories:

  1. Rebalancing – spread the VMs evenly across as many physical VM host machines as possible (conceptually similar to vSphere DRS).
  2. Consolidation – condense VMs onto fewer physical VM host machines (conceptually similar to vSphere DPM). Typically involves some degree of defragmentation.
  3. Evacuation – free up physical servers.

Whilst one-shot manual or semi-automatic rearrangement can bring immediate benefits, the biggest wins often come when continual rearrangement is automated. The approaches can also be combined, e.g. first evacuate and/or consolidate, then rebalance on the remaining physical servers.

Other custom rearrangements may be required according to other IT- or business-driven policies, e.g. only rearrange VM instances relating to a specific workload, in order to increase locality of reference, reduce latency, respect availability zones, or facilitate other out-of-band workflows or policies (such as data privacy or other legalities).

In the rest of this post I will expand this topic in the context of OpenStack, talk about the computer science behind it, propose a possible way forward, and offer a working prototype in Python.
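To give a flavour of the computer science involved, a consolidation pass is essentially bin packing, which is NP-hard, so practical rearrangement relies on heuristics. Below is a minimal sketch using first-fit-decreasing, with hypothetical VM names and a single RAM-only capacity model; it is an illustration of the problem shape, not the actual prototype from the post.

```python
def consolidate(vms, host_capacity):
    """Pack VMs (name -> RAM in GB) onto as few hosts as possible
    using the first-fit-decreasing bin-packing heuristic."""
    hosts = []  # each host is a dict: {"free": GB, "vms": [names]}
    # Place the largest VMs first; this tends to pack tighter.
    for name, ram in sorted(vms.items(), key=lambda kv: -kv[1]):
        for host in hosts:
            if host["free"] >= ram:        # first host with room wins
                host["free"] -= ram
                host["vms"].append(name)
                break
        else:                              # no existing host fits: open a new one
            hosts.append({"free": host_capacity - ram, "vms": [name]})
    return hosts

# Four VMs packed onto hosts with 24 GB of RAM each:
placement = consolidate({"db": 16, "web1": 4, "web2": 4, "batch": 8}, 24)
```

A real placement engine would also weigh CPU, disk, network affinity, availability zones, and the cost of each live migration, which is where the interesting trade-offs come in.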

If you’re in Vancouver for the OpenStack summit which starts this Monday and you find this post interesting, ping me for a face-to-face chat!

Continue reading 'Cloud rearrangement for fun and profit'»


Announcing git-deps: commit dependency analysis / visualization tool

January 19, 2015 12:15 am

I’m happy to announce a new tool called git-deps which performs automatic analysis and visualization of dependencies between commits in a git repository. Here’s a screencast demonstration!

Back in 2013 I blogged about some tools I wrote which harness the notes feature of git to help with the process of porting commits from one branch to another. These are mostly useful in the cases where porting is more complex than just cherry-picking a small number of commits.

However, even in the case where there are a small number of desired commits, sometimes those commits have hidden dependencies on other commits which you didn’t particularly want, but need to pull in anyway, e.g. in order to avoid conflicts during cherry-picking. Of course those secondary commits may in turn require other commits, and before you know it, you’re in dependency hell, which is only supposed to happen if you’re trying to install Linux packages and it’s still 1998 … but in fact that’s exactly what happened to me at SUSEcon 2013, when I attempted to help a colleague backport a bugfix in OpenStack Nova from the master branch to a stable release branch.

At first sight it looked like it would only require a trivial git cherry-pick, but that immediately revealed conflicts due to related code having changed in master since the release was made. I manually found the underlying commit which the bugfix required by using git blame, and tried another cherry-pick. The same thing happened again. Very soon I found myself in a quagmire of dependencies between commits, with no idea whether the end was in sight.
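That manual blame-based hunt is exactly the kind of step a tool can automate: for each line a commit modifies, `git blame --porcelain` run against the commit's parent names the commit that line last came from, i.e. a likely dependency. The sketch below (my own approximation of the idea, not git-deps internals; function names are mine) parses that porcelain output:

```python
import re
import subprocess

# Porcelain output starts each line group with "<sha> <origline> <finalline> [<n>]".
SHA_LINE = re.compile(r"^([0-9a-f]{40}) \d+ (\d+)(?: \d+)?$", re.M)

def deps_from_blame(porcelain, wanted_lines):
    """Given `git blame --porcelain` output and a set of (1-based) line
    numbers, return the set of commit SHAs that last touched those lines."""
    deps = set()
    for sha, final_line in SHA_LINE.findall(porcelain):
        if int(final_line) in wanted_lines:
            deps.add(sha)
    return deps

def blame(path, parent):
    """Blame the file as of the parent commit, so the commit being
    analysed is never reported as its own dependency."""
    return subprocess.run(
        ["git", "blame", "--porcelain", parent, "--", path],
        capture_output=True, text=True, check=True,
    ).stdout
```

Applying this recursively to each dependency found is what turns a single cherry-pick into the full dependency tree.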

So wouldn’t it be nice if you could see the dependency tree ahead of time, rather than spending a whole bunch of time resolving unexpected conflicts due to missing dependencies, only to realise that the tree’s way deeper than you expected, and that actually a totally different approach is needed? Well, I thought it would, and so git-deps was born!

In coffee breaks during the ensuing openSUSE conference at the same venue, I feverishly hacked together a prototype and it seemed to work. Then normal life intervened, and no progress was made for another year.

However, thanks to SUSE’s generous Hack Week policy, I have had the luxury of being able to spend some of early January 2015 working to bring this tool to the next level. I submitted a Hack Week project page, announced my intentions on the git mailing list, started hacking, missed quite a bit of sleep, and finally recorded the above screencast.

The tool is available here:

Please give it a go and let me know what you think! I’m particularly interested in hearing ideas for use cases I didn’t think of yet, and proposals for integration with other git web front-ends.


How to build an OpenStack cloud from SUSEcon’s free USB stick handouts

December 11, 2014 3:28 pm

Once again, SUSEcon was a blast! Thanks to everyone who helped make it such a great success, especially all our customers and partners who attended.

If you attended the final Thursday keynote, you should have been given a free USB stick preloaded with a bootable SUSE Cloud appliance. And if you missed out or couldn’t attend, download a copy here! This makes it possible for anyone to build an OpenStack cloud from scratch extremely quickly and easily. (In fact, it’s almost identical to the appliance we used a few weeks ago to win the “Ruler of the Stack” competition at the OpenStack summit in Paris.)

Erin explained on stage at a high-level what this appliance does, but below are some more specific technical details which may help in case you haven’t yet tried it out.

The appliance can be booted on any physical or virtual 64-bit x86 machine … but before we start! – if you would like to try running the appliance in a VM using either KVM or VirtualBox, then there is an even easier alternative which uses Vagrant to reduce the whole setup to a one-line command. If you like the sound of that, stop reading and go here instead. However if you want to try it on bare metal or with a different hypervisor such as VMware or HyperV, read on!

Continue reading 'How to build an OpenStack cloud from SUSEcon’s free USB stick handouts'»


Email inboxes and the GTD 2-minute rule

March 20, 2014 12:46 am

Today’s dose of structured procrastination resulted in something I’ve been meaning to build for quite a while: a timer to help apply the two minute rule from David Allen’s famous GTD (Getting Things Done) system to the processing of a maildir-format email inbox.

Briefly, the idea is that when processing your inbox, for each email you have a maximum of two minutes to either:

  • perform any actions required by that email, or
  • add any such actions to your TODO list, and move the email out of the inbox. (IMHO, best practice is to move it to an archive folder and have a system for rapid retrieval of the email via the TODO list item, e.g. via a hyperlink which will retrieve the email based on its Message-Id: header, using an indexing mail search engine.)

However, I find that I frequently exhibit the bad habit of fidgeting with my inbox – in other words, checking it frequently to satisfy my curiosity about what new mail is there, without actually taking any useful action according to the processing (a.k.a. clarification) step. This isn’t just a waste of time; it also increases my stress levels by making me aware of things I need to do whilst miserably failing to help get them done.

Another bad habit I have is mixing up the processing/clarification phase with the organizing and doing phases – in other words, I look at an email in my inbox, realise that it requires me to perform some actions, and then I immediately launch right into the actions without any thought as to how urgent they are or how long they will take. This is another great way of increasing stress levels when they are not urgent and could take a long time, because at least subconsciously I’m usually aware that this is just another form of procrastination.

So today I wrote this simple Ruby timer program which constantly monitors the number of emails in the given maildir-formatted folder, and shows you how much of the two minutes you have left to process the item you are currently looking at. Here’s a snippet of the output:

1:23    24 mails
1:22    24 mails
1:21    24 mails
1:20    24 mails
Processed 1 mail in 41s!
Average velocity now 57s per mail
At this rate you will hit your target of 0 mails in 21m 55s, at 2014-03-19 23:18:59 +0000
2:00    23 mails
1:59    23 mails
1:58    23 mails
1:57    23 mails

You can see that each time you process mail and remove it from the email folder, it resets the counter back to two minutes. If you exceed the two minute budget, it will start beeping annoyingly, to prod you back into adherence to the rule.

So for example if you have 30 mails in your inbox, using this timer it should take you an absolute maximum of one hour to process them all (“process” in the sense defined within David Allen’s GTD system, not to complete all associated tasks).
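The actual timer is written in Ruby; the core loop boils down to something like the following Python sketch (my own reconstruction of the behaviour described above, not the real program). A maildir stores one file per message under its `new/` and `cur/` subdirectories, so counting mail is just counting files:

```python
import os
import time

def count_mail(maildir):
    """Number of messages in a maildir-format folder."""
    return sum(len(os.listdir(os.path.join(maildir, sub)))
               for sub in ("new", "cur"))

def run_timer(maildir, budget=120, target=0):
    """Count down `budget` seconds per mail; reset whenever the
    mail count drops, and beep once the budget is exceeded."""
    count = count_mail(maildir)
    deadline = time.time() + budget
    while count > target:
        now = time.time()
        current = count_mail(maildir)
        if current < count:            # a mail was processed:
            count = current            # reset the two-minute budget
            deadline = now + budget
        remaining = max(0, int(deadline - now))
        print(f"{remaining // 60}:{remaining % 60:02d}\t{count} mails")
        if remaining == 0:
            print("\a", end="", flush=True)   # annoying beep
        time.sleep(1)
```

The real tool additionally tracks average velocity per mail and projects when you will hit your target count, as shown in the output snippet above.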

Since gamification seems to be the new hip buzzword on the block, I should mention I’m already enjoying the fact that this turns the mundane chore of churning through an inbox into something of a fun game – seeing how quickly I can get through everything. And I already have an item on the TODO list for collecting statistics about each “run”, so that I can see stuff like:

  • on average how many emails I process daily
  • how often I process email
  • on average how many emails I process during each “sitting”
  • how much time I spend processing email
  • whether I’m getting faster over time

I also really like being able to see an estimate of the remaining time – I expect this will really help me decide whether I should be processing or doing. E.g. if I have deadlines looming and I know it’s going to take two hours to process my inbox, I’m more likely to consciously decide to ignore it until the work for my deadline is complete.

Other TODO items include improving the interface to give a nice big timer and/or progress bar, and the option of a GTK interface or similar. Pull requests are of course very welcome 😉

For mutt users, this approach can work nicely in conjunction with a trick which helps focus on a single mail thread at a time.

Hope that was useful or at least interesting. If you end up using this hack, I’d love to hear about it!


more uses for git notes, and hidden treasures in Gerrit

October 2, 2013 2:05 pm

I recently blogged about some tools I wrote which harness the notes feature of git to help with the process of porting commits from one branch to another. Since then I’ve discovered a couple more consumers of this functionality which are pretty interesting: palaver, and Gerrit.

Continue reading 'more uses for git notes, and hidden treasures in Gerrit'»

