I’ve just read a long, passionate post on Marc Scott’s Coding2Learn blog lamenting the fact that kids can’t use computers. The hypothesis, simply put, is that far from being “digital natives” most kids simply don’t know their way round a computer or, indeed, a smart phone. This is not to say they don’t use them – but that they don’t know what to do when anything – even something quite basic – goes wrong.
Instinctively I had quite a bit of sympathy with the argument; knowing how to troubleshoot a wireless connection or an external monitor, for example, seems to me pretty useful basic stuff.
However, as I thought about it some more I became convinced that this kind of lament is really a symptom of a technology in transition. I can imagine a similar post being written (if the medium had existed) in the 60s or 70s bemoaning the fact that car drivers simply don’t understand the mechanics of what they are driving anymore. Motoring was a do-it-yourself activity for a long time – I remember as late as the 80s doing quite a bit of tinkering with spark plugs and the like to keep my cheap, old and unreliable cars on the road. It is now decades since I’ve known what to do when looking under the bonnet of a modern car.
I suspect computer technology is going through just such a transition. Marc Scott’s suggestion for fixing the dearth of computer knowledge is, among other things, to get kids to use Linux computers which need a lot of configuration (which means learning a fair bit about the operating system). But I think he hints at the change that’s coming when he talks about mobile:
This one’s tricky. iOS is a lost cause, unless you jail-break, and Android isn’t much better. I use Ubuntu-Touch, and it has possibilities. At least you feel like the mobile phone is yours. Okay, so I can’t use 3G, it crashes when I try to make phone calls and the device runs so hot that when in my jacket pocket it doubles as an excellent nipple-warmer, but I can see the potential.
That, surely, is the point. Computers should fade into the background and “just work”. As he says:
Technology affects our lives more than ever before. Our computers give us access to the food we eat and the clothes we wear. Our computers enable us to work, socialise and entertain ourselves. Our computers give us access to our utilities, our banks and our politics. Our computers allow criminals to interact with us, stealing our data, our money, our identities. Our computers are now used by our governments, monitoring our communications, our behaviours, our secrets.
That being so, we need technology that works when you switch it on, that monitors its own health and fixes itself when anything is awry, that protects us from crime and from being spied upon. We shouldn’t be expected to be able to dismantle computers or smart phones in order to make sure they are working properly.
It is faintly ridiculous that computers can develop glitches and then expect us to search the company’s knowledge bases for the solutions, which we then need to implement manually. Why aren’t they self-diagnosing and self-healing, using all that superfluous computing power? Partly, I guess, because there is still a lot of tinkerer’s pride and self-satisfaction in finally solving these techno-riddles, and hence not much consumer outrage at this situation. But this won’t wash for very much longer.
In the end, though it is fun for some to tinker with their technology, much like old-car enthusiasts tinkering in their garages, those days are drawing to a close. Ubiquitous computing that “just works”, monitors itself and corrects problems as they occur will become the standard, for better or worse.