quest-blog

Metaphors for computers: the slave

In the previous article, I proposed the toolshop as a metaphor for a computer. Thinking of computers as toolshops assigns them a passive role. This was true twenty years ago, but with the advent of global networking and, in particular, mobile computing, computers can no longer be called passive; they are becoming symbiotic. Indeed, we increasingly depend on them to remind us of what we should do, to recommend movies, and to do a thousand boring chores we can no longer be bothered to do ourselves.

The twelve-factor app

I do not normally do "look-it-I-found" posts, but sometimes you encounter an article, or even a whole community, that makes you go "This is exactly what I feel!"

The Twelve-Factor App is such an article. It collects, in one compact article, several relevant points about writing software-as-a-service that I have been trying to make myself. In particular, it articulates the relationship between such things as dependencies, releases and deploys, which I have long felt to be marginalized in the greater software development discussion.

options 0.9.4 released

There is a new minor release of options. New in this version is support for enums. Consider:

enum Optimization { size, speed };
@Option(shortName="o")
public Optimization optimization = Optimization.size;

This will automatically limit the values the user can choose for this option to the names of the enum constants.

Metaphors for computers: the toolshop

In order for there to be meaningful learning about computing, we need metaphors for computers. When learning, humans build whole networks of interconnected pieces of information at once, and so need simplifications to bootstrap those networks. The most powerful such simplifications are metaphors.

Thus, in order to learn about computers, we need some way of thinking about them. In this article, I propose the first such metaphor.

Six months with puppet

When I retired the server in my closet in favour of Amazon EC2 instances, it quickly became clear that I needed to operate as if the EC2 instances could be lost at any moment: when all was said and done, these systems were not under my control. Putting effort into setting up individual services by hand no longer made sense. Thus, I needed a better configuration management system and I needed true automation that could quickly rebuild machines if they were lost.

Shell scripting patterns: configuration and option parsing

Adhering to software patterns lowers the threshold for doing the right thing, even when the quick hack beckons. This is particularly true of how shell scripts handle options and configuration. Essentially, the top of every script should look like this:
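A minimal sketch of such a preamble, assuming POSIX sh and getopts; the variable names and the configuration file path are my own illustrations, not taken from the post:

```shell
#!/bin/sh
# Defaults first, so every setting has a sane value.
verbose=0
output=/tmp/out

# Then an optional configuration file; the name is illustrative.
conf="${MYAPP_CONF:-$HOME/.myapprc}"
[ -r "$conf" ] && . "$conf"

# Finally, command-line options override everything else.
while getopts "vo:" opt; do
  case "$opt" in
    v) verbose=1 ;;
    o) output=$OPTARG ;;
    *) echo "usage: $0 [-v] [-o file]" >&2; exit 2 ;;
  esac
done
shift $((OPTIND - 1))
```

The ordering is the point: defaults, then configuration file, then options, so that each layer can override the one before it.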

Shell scripting patterns: paths, file globs and current dir

The basis of this pattern is simple: never ever change the current working directory in your scripts. In fact, never ever assume that you have a working directory at all, unless your command explicitly works from the local directory (e.g. find). Consequently, you should never use relative paths in your script unless you got them from the caller; since your script never changes its working directory, it's fine to just use whatever path the caller handed you.
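One way to honour this rule is to normalize any caller-supplied path to an absolute one up front, instead of cd-ing around later. A sketch, assuming POSIX sh; the helper name is my own:

```shell
#!/bin/sh
# Turn a possibly-relative path into an absolute one, anchored at
# the caller's working directory, without ever changing directory.
to_abs() {
  case "$1" in
    /*) printf '%s\n' "$1" ;;              # already absolute
    *)  printf '%s/%s\n' "$(pwd)" "$1" ;;  # anchor at caller's cwd
  esac
}
```

Invoked once at the top of the script, e.g. `target=$(to_abs "$1")`, after which only absolute paths circulate.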

Shell scripting patterns: errors and failure

In my opinion, error handling is the area where shell script most clearly shows its age. The idea that the shell will, by default, ignore commands signalling an error state (i.e. a non-zero exit code) seems very strange when viewed through the lens of modern programming theory. Thus, all scripts should start with this instruction, which tells the shell to care about exit codes:

set -e

Now, the following code will no longer result in tears when /destination does not exist:
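Something along these lines; the excerpt cuts off here, so the concrete commands around /destination are my assumption:

```shell
#!/bin/sh
# Without set -e, a failed cd is silently ignored and the next
# command runs in whatever directory we happened to be in:
( cd /destination; echo "would now run in $(pwd)" ) 2>/dev/null

# With set -e, the subshell aborts at the failed cd, so the echo
# is never reached:
( set -e; cd /destination; echo "never reached" ) 2>/dev/null
echo "subshell aborted with status $?"
```

The difference is exactly the tears in question: in the first form, a destructive command after the cd would happily run in the wrong directory.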

Shell scripting patterns: returning from functions

One shortcoming of shell scripting is the inability to return anything of significance from a shell script function. Consider a function that returns the youngest file in a directory. The basic moving part is ls -tr /da/dir | tail -1. Abstracting this into a function seems problematic, given that we cannot return a value. However, functions are very similar to external commands in bash, so you can do this:

latest_file() {
  ls -tr "$1" | tail -1
}

... and then simply invoke it thus:
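That is, by capturing the function's stdout with ordinary command substitution, just as for an external command. The demonstration directory and file names below are my own:

```shell
#!/bin/sh
latest_file() {
  ls -tr "$1" | tail -1     # oldest first, so tail yields the newest
}

# Demonstration in a scratch directory:
dir=$(mktemp -d)
touch "$dir/older"
sleep 1                      # ensure distinct modification times
touch "$dir/newer"

newest=$(latest_file "$dir")
echo "newest file: $newest"  # prints "newest file: newer"
rm -rf "$dir"
```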

Shell scripting patterns - a personal perspective

It is often said that no serious development project should ever use shell script.

While it is certainly true that there are some serious shortcomings in the bash strain of shells, this is not to say that those shortcomings cannot be overcome. Like so much else in programming, it is mostly a matter of finding good patterns that contain them.
