Scaling down

The programming world is obsessed with scaling up. How many million lines of code can we maintain? How many petabytes of data can we process? How deeply can I customise this editor? More code, more data, more people, more machines.

Nobody talks about scaling down.

The vast majority of programs are never written. Ideas die stillborn because the startup cost is too high to bear. When we focus entirely on the asymptotic cost of developing large systems, we neglect the constant costs that turn simple tasks into tedious grinds.

There is a great deal to be gained from switching the focus from what we can do to what we can get done, from creating the most expressive tools to creating the most efficient tools. To do this we need to become conscious of the friction imposed by our tools. When we scale up, the concerns are performance, modularity, maintainability, expressiveness. A toolset optimised for small-scale programming must be judged by different metrics.

The instinctive reaction is that the problems are overblown and everything would be fine if everybody would just use language / tool / methodology X.

So let's try it. Pick one of these programs and solve it however you think best. Record a video of yourself working, and afterwards break down your activity minute by minute.

So much of what we actually do goes unnoticed after years of practice and routine. The reality may be quite different from what you imagine.