Stephen's mental dustbin
Yesterday's piece by George Monbiot got me thinking. It's worth a read, whether or not you're interested in what it got me thinking.
Much of macroeconomics is concerned with the dynamics of lending, investment and growth. Many on the left tend to lambast “speculation” as the demonic cause of financial crisis. But speculation is an inherent and necessary part of absolutely any economy, because it is part of human activity. All investment, including lending, is speculative. Our problem seems to be something to do with the valuation of speculative wealth, not its existence.
How much is it okay to speculate? How much should we lend? How much value should we attribute to speculative wealth? These are the real questions.
What seems more suspect to me are the notions of money, “base money” and fractional reserve banking. What are reserves anyway? Why do we need them? The problem seems to be that when you lend money to someone, they have “money”, i.e. what you lent them, and you have a promise of its return, which is “valued” at the same amount. So, by magic, we have twice as much “money”---but half of it is speculative. In fact, it might all be speculative, if the money you lent in the first place was not real, but itself came into your account on some speculative basis. So what is “real money” anyway?
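The doubling compounds, too. A minimal sketch of the textbook version of this, assuming a fixed reserve ratio and that every loan is redeposited in full (an idealisation, not a claim about how real banks behave):

```python
# Sketch: how repeated lending multiplies "money" under fractional reserve.
# Assumes a fixed reserve ratio and full redeposit of every loan -- the
# textbook idealisation, not a description of any real banking system.

def total_money(initial_deposit, reserve_ratio, rounds=100):
    """Sum the deposits created as one deposit is lent and redeposited."""
    total = 0.0
    deposit = initial_deposit
    for _ in range(rounds):
        total += deposit
        # the bank keeps the reserve fraction and lends out the rest,
        # which becomes the next deposit
        deposit *= (1 - reserve_ratio)
    return total

# With a 10% reserve ratio, 100 units of "base money" supports roughly
# 1000 units of deposits -- the geometric series 100 / 0.1.
print(total_money(100, 0.1))  # ≈ 1000
```

The original deposit is the only non-speculative term in that sum; everything above it is promises of repayment.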
It seems to me that somehow we want to restrict the structure or shape of allowed speculation, to rule out pathological cases. This could be a solution similar in spirit to how Russell's notion of type ruled out a pathology which created paradoxes in naive set theory.
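For reference, the pathology in question: naive set theory lets us form the set of all sets that are not members of themselves, and asking whether that set contains itself yields a contradiction:

```latex
R = \{\, x \mid x \notin x \,\} \qquad\text{whence}\qquad R \in R \iff R \notin R .
```

Russell's types block the construction itself: $x \notin x$ is ill-formed, because a set and its members live at different levels. The analogous move for finance would be to make certain self-referential valuations unconstructible, rather than merely discouraged.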
Our concept of money is so slippery---given some sum of money, we have no idea how much speculation is in it---that money supply, like the valuation of anything, is regulated by a “mood” rather than any rigorous process. We are doomed to a cycle of apparent boom and bust, because lending orgies followed by lending paranoia seem to be How Banks Work. Like a logical paradox, we oscillate wildly between two extremes that are both wrong, yet both inexorable consequences of each other.
Staying on the mathematical theme: why should money be a scalar? It seems that by collapsing several distinct dimensions of value---its magnitude is one, but its uncertainty or “speculative-ness” might be another---we are creating confusion that can't be helping matters.
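To make the idea concrete, here is a toy encoding of non-scalar money; the representation and the mixing rule are entirely my own invention, purely to illustrate keeping the dimensions apart:

```python
# Toy sketch of non-scalar money: a value carries both a magnitude and a
# "speculative" fraction. Both the representation and the addition rule
# are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Value:
    amount: float       # face value
    speculative: float  # fraction of the amount that is speculative, in [0, 1]

    def __add__(self, other):
        total = self.amount + other.amount
        # amount-weighted average of speculative content
        spec = (self.amount * self.speculative +
                other.amount * other.speculative) / total
        return Value(total, spec)

savings = Value(100, 0.0)   # "real" money
loan_iou = Value(100, 1.0)  # a promise of repayment: wholly speculative
print(savings + loan_iou)   # Value(amount=200, speculative=0.5)
```

Under a scalar notion of money both sides of that sum look identical; here the “twice as much money” from the previous post's example at least carries a label saying half of it is a promise.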
A final thought, to extend the metaphor even further, is as follows. Types are based on a notion of construction: we cannot quantify over sets we cannot first construct. So it seems that with wealth, we need a similar restriction: the wealth we speculate on should somehow, by some fairly direct route, be anchored in something of real worth. Many articles, including Wikipedia's brief history of fractional reserve banking, start by assuming gold (or other precious metals) as their most primitive valuable quantity. Currency issuers used to do the same, and now don't. It's fairly clear that real worth cannot be defined in terms of gold (a substance of mostly fake worth if ever there was one). I have no idea what it can be defined in terms of: goods and/or labour that contribute towards a human goal (referencing some conception of human needs) seem to be the only things of “real worth”, but I wouldn't yet claim those can be built into a theory of value.
Nevertheless, I believe there is a design space of economies. At the moment, all western economies follow the same design, just with different parameters. Many people (non-CS, non-engineers) have a hard time understanding what I mean by “design space”, so here goes. Economies are like machines with different-sized parts but the same overall form. Different economies may have different levels of government intervention, different tax--spend balances, varying interest rates and inflation levels and so on. But they are nevertheless the same design.
The design is the same because some structural properties are shared across all of them. Here are some of those properties. The only notion of value is market value. Prices are controlled by market value. By default, anything can be traded. There is no direct link between labour and price. There is no distinction in the means of valuing supply-constrained (e.g. collectables) versus resource-constrained (e.g. iPhones) goods. These are all properties I think worth reconsidering. Few people think about these things: in after-dinner conversation on economics, ‘What other system is there?’ is a popular question. Coming up with a plausible new way for economies to work---a.k.a. a design---is a hard problem... at the moment I'm only arguing that more than one design is possible. Until we can get people thinking about the possibilities, we can't hope to find a viable alternative.
This post is about the Newsnight report on computer science teaching, Monday 10th October. Here is the iPlayer link, and the report starts at 30:00ish. The programme will disappear on Monday 17th, but if you'd like a copy, I can possibly magic one up, so let me know.
Kirsty Wark could not have got the piece off to a worse start than by telling a “joke” about “ten kinds of people...” NO! The whole point is that there are two, not ten. That joke only works written down. Kirsty Wark is clearly spouting her lines without understanding them, which is completely and utterly the problem. But perhaps she's old enough to be excused from even the hypothetical opportunity for a basic education in computer science, so I should move on.
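(For anyone who hasn't met it: the joke goes “there are 10 kinds of people: those who understand binary and those who don't”, and it hinges on the numeral “10” being binary for two. Read it aloud as “ten” and there's nothing left.)

```python
# The whole joke: "10" parsed as a binary numeral is two, not ten.
print(int("10", 2))  # 2
```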
The phrase “using computers” cropped up a few times as what's lacking in teaching. But that's completely the wrong phrase. That is what is being taught. I will expound on this in a moment. First, a more trivial complaint: even in its “flagship” and most “high-brow” programme, the BBC clearly can't resist spending its money on appallingly cheesy production, right down to the Crystal Maze-style captions introducing interviewees. Two Rory Cellan-Joneses is two more than I can handle.
I don't personally think a fall from third to sixth in the games industry stakes is actually a calamity. I think it was a bit of an accident of circumstance (or history) that we had such a strong games industry in the first place, and some fluctuation is natural. But if we take it as a premise that things are probably going down the tubes---and that doesn't seem implausible, regardless of the games industry numbers---then we might still care about the rest of the report.
The report cites teaching ICT as a “cause of decline”. This is clearly nonsense. Teaching how to use Word and Excel does no harm, and in fact is useful---I learnt to use Word at secondary school, and was glad I did. Rather, it's not teaching any of the other stuff that is the problem. And this other stuff, despite having something to do with computers, is not an alternative to ICT---it's just something else that needs teaching. “Proper computing skills”, another phrase that cropped up, is not it at all.
What we really want to encourage is an engineering mentality, applied to primarily non-physical systems---for which computers are the host substrate. “Digital Meccano” is an interesting idea which, unlike much else that was discussed in the report, captures the spirit that I think really is lacking. Seeing a computer as a host and enabler for a world of engineering possibilities---rather than a utility device for writing letters and playing games and media and social networking and reading news---is the key distinction.
Another high point of the report was how the Eric Schmidt quotation (which was spot on) was nicely framed. And the considerable talk about the “joined-up curriculum”, the games industry, and the potential links between “conventional” creative subjects and computing was not badly positioned. That said, it did rather confuse the topic---since any proper engineering is a creative enterprise, whether or not it involves shiny graphics or music or other traditional “art” forms---but sadly this seemed to be too subtle for Rory Cellan-Jones to put across.
In the studio interview after the report, Ed Vaizey was superbly slippery. His “many programmers are self-taught” argument was a wonderful bit of deception---people can teach themselves all manner of things, so why bother with an education system at all?---and he continued to trot out wrong-yet-appealing Big Society-esque arguments quite coherently for the rest of the interview. “We need businesses to get behind this” was the typical conservative message: the private sector can do it, silly. He claimed that “ICT is taught badly”---is it? For all I know, it might be taught very well. You don't need a degree in CS to teach ICT, for sure. So again, the point that we need computer science teaching as distinct from ICT has not really got through.
A thought about Raspberry Pi: why a physical computer? It can only mean faff plugging and unplugging cables. In general, being physical is limiting. It's good for kids in zero-computer households I suppose, and resummons a nostalgic picture of that charming conflict for the family TV. But then again, if you don't have a computer, you'll have to go out and buy a USB keyboard and mouse. Another doubt I have is that the hardware will be modern enough, and hence complicated enough, that it's probably not great for learning low-level programming (which would otherwise be one good reason for having a physical device). Surely what we need is a cloud-hosted IDE, with a nicely-crafted HTML5 interface and the option to run code on emulated hardware targets if you want to do the low-level stuff. It is more fun to tinker with a real device, but a system-on-chip is not much less abstract than a cloud service. You certainly don't get the sense that you could build it out of ball bearings and bits of string. I once read a book chapter (but didn't subsequently actually carry out the project---how very me) about how to build a DAC, so basically a rudimentary sound card, out of a parallel port and a resistive ladder. Now that was enlightening.
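Part of what made that chapter enlightening is that the arithmetic is something you can do by hand: an ideal n-bit resistive-ladder DAC just outputs the digital code as a fraction of full scale. A sketch of that arithmetic, assuming an ideal ladder driven from a 5 V logic supply (real parallel-port builds add buffering and resistor-tolerance errors):

```python
# Ideal n-bit resistive-ladder DAC: output voltage is the digital code
# as a fraction of full scale. This is just the arithmetic -- a real
# build from a parallel port has buffering and tolerance errors too.

def dac_output(code, n_bits=8, v_ref=5.0):
    """Output voltage for an n-bit code with a v_ref logic supply."""
    assert 0 <= code < 2 ** n_bits
    return v_ref * code / (2 ** n_bits)

print(dac_output(0))    # 0.0 volts
print(dac_output(128))  # 2.5 volts -- half scale
print(dac_output(255))  # just under full scale
```

Wiggle the code fast enough and you have a rudimentary sound card: exactly the ball-bearings-and-string sense of construction that a system-on-chip hides.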
Vaizey did eventually score points by grasping the “office services” nature of ICT (nice phrase). But he slightly spoiled things in the next breath by equating “computer science” with “how to program”. I might be asking too much by going into this distinction, but even how to program, although important, is not what computer science is about! It's a necessary step, but it's barely the beginning. A computer scientist is not just a skilled technician who can make a complex machine (i.e. a programming language implementation) do impressive things. Rather, he (or, sometimes, she) is a polymath: a philosopher who understands the essence of deep concepts---what are number, structure, meaning, computation, communication?---and an engineer who can apply this understanding to real-world practical tasks, through complex physical devices. This means understanding how computers work (an understanding much deeper than just knowing how to program), but also knowing how they can work, how they might work, and consequently, having insight into what as-yet unrealised benefits computers might bring to humanity. An education system that can start enough bright young people off on this path still seems a long way off; understanding the difference between ICT and programming is a small step towards it, but we can and should set our sights higher.