The Luddite fallacy

October 28, 2011

2011 marks the 200th anniversary of the Luddite movement. It is a fitting year to commemorate them, as it has also been a year of popular protest, with the “indignados” in Madrid, the recent attempts to occupy Wall Street and the St Paul’s protest camp in London. Today’s protesters could learn a thing or two from the Luddites; they certainly knew how to create a lasting brand. Will we be talking about the “Indignants” in 2211? Something about the Luddites has captured the public imagination for the last two centuries.

Ned Ludd, the man who gave his name to the movement, was a sort of Robin Hood-like figure among the protesters. There had been a young apprentice called Ludd or Ludham from Anstey, near Leicester. Admonished by a superior for shoddy workmanship, he took his anger out on two frames for knitting hosiery, wrecking them completely. Word got around, and after that, whenever a machine was destroyed, someone would say, “Ludd must have been here.” This is how a mythical leader was born, and he became a source of inspiration for the protesters.

The Luddites emerged at the time of the Napoleonic Wars, when the industrial revolution was in full swing. It should be pointed out that they were not opposed to technology per se; many were highly skilled machine operators in the textile industry. What they objected to was the automated looms that could be operated by cheap, unskilled labour, with the loss of jobs for many skilled textile workers. On March 11, 1811, in Nottingham, a demonstration of textile workers demanding more work and better wages was broken up by the army. That night the disgruntled workers went to a nearby village and smashed up textile machinery. The movement spread rapidly throughout England in 1811 and 1812, with Yorkshire and Lancashire as two hotbeds of revolt. Mills and factory machinery were the typical targets of these handloom weavers, who publicised their actions in circulars mysteriously signed “King Ludd”.

For a short time the Luddites created panic in the British establishment, even clashing in battles with the British Army. They would meet at night on the moors surrounding the industrial towns, practising drills and manoeuvres. They were also known to cross-dress, though this seems to have been a way of disguising themselves. They often enjoyed local support, but once the government decided that they posed a serious risk and moved to repress them, their days were numbered. Machine breaking became a capital crime in 1812, legislation opposed by Lord Byron, one of the few prominent defenders of the Luddites. Mass trials were held to suppress the movement, resulting in many executions and penal transportations. By around 1816 the Luddites were a spent force.

The Luddites had clearly tapped into a common feeling. The nightmare vision of a world in which technology has eliminated human productive labour has been around since the early years of the Industrial Revolution. Automated production lines, computers and industrial robots have only reinforced this feeling. Each generation believes that the latest technology will be the one that eradicates employment. I tend to be very sceptical of economic populism, and contrary to these populist beliefs there is scant evidence to support the claim that technological development is responsible for rising levels of unemployment in the medium to long term. But these bad ideas keep coming back to haunt us: whenever there is significant long-term unemployment, machines get the blame.

These fallacies are regularly trotted out in the media. Earlier this year Jesse Jackson Jr. lamented the dangers of the iPad, wondering what would happen to all the jobs associated with paper:

A few short weeks ago I came to the House floor after having purchased an iPad and said that I happened to believe, Mr. Speaker, that at some point in time this new device, which is now probably responsible for eliminating thousands of American jobs. Now Borders is closing stores because, why do you need to go to Borders anymore? Why do you need to go to Barnes & Noble? Buy an iPad and download your newspaper, download your book, download your magazine,

And this summer President Obama was on NBC News trying to explain why companies weren’t hiring:

There are some structural issues with our economy where a lot of businesses have learned to become much more efficient with a lot fewer workers. You see it when you go to a bank and you use an ATM, you don’t go to a bank teller, or you go to the airport and you’re using a kiosk instead of checking in at the gate. So all these things have created changes. . . .

George Bush was rightly criticised for many of the things he said, but I find it shocking that a President who is constantly praised for his stunning intellect could display such wilful economic illiteracy. Actually I’m not shocked at all by politicians spouting this kind of nonsense. Indeed, Obama may well be aware that it is bullshit, but is trying to make an appeal to populism. Either way it’s depressing that this is the man responsible for economic policy in the world’s most powerful economy.

A dynamic economy will see radical changes. This is what Joseph Schumpeter called creative destruction: the transformation that accompanies radical innovation. I did a piece about creative destruction in the financial sector in 2008; it has to be said that there has been precious little of this since the onset of the current economic crisis. Creative destruction has been around since we started inventing tools. The printing press was bad news for those who produced manuscripts. The process has been on steroids since the industrial revolution. Agriculture is a prime example: in 1900, nearly forty of every hundred Americans worked in farming to feed a country of ninety million people. A century later, it takes just two out of every hundred workers. There are many more examples:

  • Modern office technology has cut the number of secretaries.
  • Undergoing LASIK surgery allows consumers to throw their glasses away.
  • Digital cameras have forced photo labs to close.

There has been massive destruction of employment. By this logic, there should be no more than five people working in the whole world! But as the economist John Kay has pointed out, in the two hundred years since the Luddites first went around wrecking machinery, productivity has increased more than fifty-fold. So we don’t have 98% unemployment; we produce fifty times as much.

We need to see the interconnectedness of all this; these ongoing processes cannot be understood in isolation. Resources no longer needed to feed the nation have been freed to meet new consumer demands. Over the decades, workers no longer required in agriculture moved to the cities, where they became available to produce other goods and services. Economist Walter Williams has a better grasp of economics than Obama and Jackson Jr.:

Certain jobs are destroyed by technology. You’re right, but many more are created. Think about it. If 90 percent of Americans still had been farmers in 1900, where in the world would we have gotten workers to produce all those goods that were not even heard of in 1790, such as telephones, steamships and oil wells? We need not go back that far. If there hadn’t been the kind of labour-saving technical innovation we’ve had since the 1950s – in the auto, construction, telephone industries and many others – where in the world would we have gotten workers to produce things that weren’t heard of in the ’50s, such as desktop computers, cell phones, HDTVs, digital cameras, MRI machines, pharmaceuticals and myriad other goods and services?

Creative destruction actually makes societies wealthier, by putting scarce resources to more productive uses. The savings from higher productivity don’t just go to the evil capitalist owners. They lower costs of doing business. In the short term this may mean higher profits. However, new competition tends to lead to lower prices as firms compete with each other to attract consumers. These consumers benefit from a higher standard of living as they have to work fewer and fewer hours to earn enough money to buy food, shoes or a car. It is technology that brings us a higher standard of living. It isn’t just the rich who get cheaper stuff. And I don’t know about Obama, but I love the convenience of using an ATM whenever I feel like it.

The problem is the time lag. While the disruption of the labour market and the destruction of businesses are immediate and very visible, the benefits of creative destruction come in the long term. As a result, there will always be a temptation for societies to try to block the process of creative destruction, implementing policies that resist economic change. These attempts almost always have a deleterious impact on the economy, as inefficient producers who should have gone out of business hang around at a high cost to consumers or taxpayers, preventing the shift of resources to emerging sectors. The tragedy is that by trying to hold back the tide you do not avoid pain. The ultimate cost of these misguided policies is stagnation, job losses, bankruptcies and a lower standard of living.

I don’t know what will happen in the future. The Luddites and their intellectual heirs may simply have been premature in their dire predictions. Will job creation match job destruction in perpetuity? With robotics and artificial intelligence set to advance rapidly this century, even many skilled jobs could come under threat. But I believe that jobs are created by what the economist Julian Simon called the “ultimate resource” – our natural human resourcefulness and ingenuity. Human wants are insatiable – people always want more of something. This is what will create jobs in the future – jobs that we cannot even conceive of now.

Progress and its discontents

October 28, 2011

To complement this week’s post about The Luddite fallacy I have a couple of extra bits:

The first is a letter from Don Boudreaux of Café Hayek to the U.S. Representative for California’s 9th congressional district, the Democrat, Barbara Lee:

Dear Ms. Lee:

Fred Barnes reports in the Weekly Standard that you refuse to use computerized checkout lanes at supermarkets (“Boneheaded Economics,” Oct. 24).  As you – who are described on your website as “progressive” – explain, “I refuse to do that.  I know that’s a job or two or three that’s gone.”

Overlooking the fact that you overlook the lower prices on groceries made possible by this labor-saving technology, I’ve some questions for you:

Do you also avoid using computerized (“automatic”) elevators, riding only in those few that still use manual elevator operators?

Do you steer clear of newer automobiles equipped with technologies that enable them to go for 100,000 miles before needing a tune-up?  I’m sure I can find for you, say, a 1972 Chevy Vega that will oblige you to employ countless mechanics.

Do you shun tubeless steel-belted radial tires on your car – you know, the kind that go flat far less often than do old-fashioned tires?  No telling how many tire-repairing jobs have been destroyed by modern technology-infused tires.

Do you and your family refuse flu shots in order to increase your chances of requiring the services of nurses and M.D.s – and, if the economy gets lucky and you and yours get seriously ill, also of hospital orderlies and administrators?  Someone as aware as you are of the full ramifications of your consumption choices surely takes account of the ill effects that flu shots have on the jobs of health-care providers.

You must, indeed, be distressed as you observe the appalling amount of labor-saving technologies in use throughout our economy.  It is, alas, a disturbing trend that has been around for quite some time – since, really, the invention of the spear which destroyed the jobs of some hunters.

The second is an anecdote from Russell Roberts in an article about technology and jobs:

 The story goes that Milton Friedman was once taken to see a massive government project somewhere in Asia. Thousands of workers using shovels were building a canal. Friedman was puzzled. Why weren’t there any excavators or any mechanized earth-moving equipment? A government official explained that using shovels created more jobs. Friedman’s response: “Then why not use spoons instead of shovels?”


Ice cream cones, frozen chickens and the meaning of disgust

October 22, 2011

Disgust, one of our most basic emotions, is fundamentally a biological adaptation that helps keep us from ingesting substances that could make us sick or even kill us – faeces, vomit, phlegm, blood, urine and rotten meat are universally seen as disgusting because they contain harmful toxins. One gram of human faeces can contain 100 million viruses and over a million bacteria. Steven Pinker has called this “intuitive microbiology”. Disgust is apparently unique to humans. I do perhaps have an unhealthy interest in this subject – this summer I took my family to an exhibition, Dirt: the filthy reality of everyday life, at the Wellcome Collection in London. If you feel so inclined, you can see the exhibition here.

Some of our disgust is hard-wired – when disgust first emerges in young children, at the age of around three, it is a consequence of brain maturation, not early experience or cultural teaching. Disgust can also be learned, because while some things are universally dangerous, others vary according to the environment. A fascinating area is that of food taboos; one man’s meat is another man’s poison. Indeed animal flesh is especially susceptible to environmental pressures. Pork is the classic example. In the climate of the Middle East eating it was dangerous. Thus a religious taboo prohibiting it emerged both among Jews and Arabs. At least they can agree on that! Of course this taboo then takes on a life of its own. With modern refrigeration it is perfectly okay to eat pork anywhere in the world. However the taboo remains.

If disgust were limited to gastronomy it would be more of a curiosity and would have less social relevance. But there has been what is known as an exaptation: a trait that evolved because it served one particular function comes to serve another. Exaptations occur in anatomy; bird feathers are a classic example: initially they may have evolved for temperature regulation, but they were later co-opted for flight. And exaptations occur in behaviour. In this case disgust has entered the realm of morality. MRI studies have shown that lying, cheating, and stealing – behaviours that may threaten group cohesion or co-operation – activate areas in the brain associated with disgust. And these days there seems to be a lot of indignation going around.

Whenever I read about moral disgust two scenarios, involving incest and frozen chickens, often seem to crop up. There is an online survey called Taboo, which asks you to judge a number of controversial moral scenarios including these two:

Sarah and Peter were brother and sister. They were on vacation together away from home. One night they were staying alone in a tent on a beach. They decided it would be fun to have sex. They were both over 21. They had sex and enjoyed it. They knew that for medical reasons Sarah could not get pregnant. They decided not to have sex with each other again, but they never regretted having had sex once. In fact, it remained a positive experience for them throughout their lives. It also remained entirely their secret (until now!).

A man goes to his local grocery store once a week and buys a frozen chicken. But before cooking and eating the chicken, he has sexual intercourse with it. Then he cooks it and eats it. He never tells anyone about what he does, never regrets it and never shows any ill effects from behaving this way. He remains an upstanding member of his community.

Neither scenario involves harm to its practitioners, and no third parties are hurt. I won’t go into the chicken for now, but the first scenario is a particularly thorny question. The incest taboo is a human universal, so powerful that it goes beyond blood relations. I am referring to the Westermarck effect, or reverse sexual imprinting: when two people live in close domestic proximity during the first few years of either one’s life, they become incapable of feeling sexual attraction to each other. This phenomenon, first described by the Finnish anthropologist Edvard Westermarck, has been observed in many places and cultures, such as the Israeli kibbutz system. In kibbutzim, children are raised communally in peer groups based on age, not biological relation. One study showed that of the nearly 3,000 marriages that occurred across the kibbutz system, only fourteen were between children from the same peer group. And of those fourteen, none had been reared together during the first six years of life.

How do we explain these taboos to ourselves? Jonathan Haidt, a professor of psychology at the University of Virginia, coined the term moral dumbfounding to describe our reactions. He found that when presented with these scenarios, people give a reason. When that reason is stripped from them, they find another one. When the new reason is stripped from them, they bring up another. Only when they run out of reasons will they admit defeat – “I don’t know; I can’t explain it; it’s just wrong.” This is moral dumbfounding.

There is a school of thought that believes that deep-seated revulsion should be seen as a sign that an activity is intrinsically harmful or bad. One proponent of this view is Leon Kass, who was chairman of President George W. Bush’s commission on bioethics. He argues that while disgust is not an argument, “In some crucial cases, however, it is the emotional expression of deep wisdom, beyond wisdom’s power completely to articulate it.” This is the wisdom of repugnance. This way of thinking has important practical implications: Kass argues that the idea of human cloning is disgusting, and therefore should be banned. Having said that, he also thinks that eating ice cream cones undermines our dignity:

Worst of all from this point of view are those more uncivilized forms of eating, like licking an ice cream cone… This doglike feeding, if one must engage in it, ought to be kept from public view, where, even if WE feel no shame, others are compelled to witness our shameful behaviour.

What Freud would have made of this quote, I shudder to think. I don’t like psychobabble, but this man definitely has some “issues”.

While I am in favour of spontaneous order and organic change, I find Kass’s arguments unconvincing. I am especially worried about the danger of false positives. History is littered with examples of groups and individuals being considered disgusting. Philosopher Martha Nussbaum has critiqued disgust-based morality because it can become a justification for persecution of out-groups:

Throughout history, certain disgust properties – sliminess, bad smell, stickiness, decay, foulness – have repeatedly and monotonously been associated with Jews, women, homosexuals, untouchables, lower-class people – all of those are imagined as tainted by the dirt of the body.

Male homosexuals have been a traditional target, and not just in the past. In From Disgust to Humanity, Nussbaum, a prominent professor of law and philosophy at the University of Chicago, explains that much of the political rhetoric around gay rights is bound up in the language of disgust, with words like vile, revolting, contaminate and defile being the currency. In crude terms, much of the anti-gay argument is bound up in faeces and saliva, germs, contagion and blood. You may think that Nussbaum is exaggerating, but in the United States gay rights can inspire a very visceral response. At a recent state Judiciary Committee meeting, the New Hampshire state Representative Nancy Elliott, a Republican, decided to enlighten us with her views on homosexuality. During a debate on a proposal to repeal the state’s same-sex marriage law, she described anal sex as “taking the penis of one man and putting it in the rectum of another man and wriggling it around in excrement.” You can see the video here.

The bottom line is that it is impossible to find a correlation between what disgusts us and any moral norms. If only it were that simple! As we have seen reactions of disgust often have their origin in our most atavistic prejudices. The more I look into the origins of morality, the more confused I get. Well that’s enough pontificating for today. I fancy a bite to eat. Kentucky Fried Chicken followed by a Cornetto would seem to fit the bill.

By coincidence Jonathan Haidt has a piece about the Wall Street protests at The Moral Foundations of Occupy Wall Street.

QI: A selection #9

October 22, 2011

Here is another selection of trivia that I have picked from the QI column in the Telegraph:

Anosmia (Greek for “no smell”) can be congenital, or can be caused by a severe blow to the head, a virus or vitamin A deficiency. Viral anosmia (such as that caused by a bad cold) is usually temporary. Smell and memory are intimately linked. Damage to the temporal cortical region of the brain – the site of memory – does not affect the ability to detect smell, but prevents the ability to identify it. Patients suffering from Alzheimer’s disease often lose their sense of smell as well as their memory.

Brown sugar has fewer calories because it contains more water. Refiners of white sugar from the United States wrecked the success of brown sugar sales by creating a smear campaign against the stuff in the late 19th century. They produced photographs of horrible-looking microbes living in brown sugar to put people off. In 1900, a bestselling cookbook picked up on this and said that brown sugar was often infested with “a minute insect”.

No one knows why people stopped wearing hats after the Second World War. New hairstyles, the rise of the car, demobilisation – even the new fashion for sunglasses – all took the blame for the sudden abandonment of the hat. At first the hat industry thought hatlessness was a passing fad and newspaper reports of 1948 bemoaned the new “barehead” fashion. People who dared to walk hatless through the hat-making towns of Denton and Stockport risked being abused by local factory workers who saw their livelihoods disappearing.

If you or your children have just started a depressing summer job, fear not. Multi-billionaire Warren Buffett’s first job was at his grandfather’s grocery shop; Bill Murray sold chestnuts outside a grocer’s; Orlando Bloom worked at a clay-pigeon shooting range; Beyoncé Knowles swept up in her mother’s hairdressing salon and Mick Jagger sold ice cream. Brad Pitt dressed up as a giant chicken to promote a restaurant.

The Aztecs called gold “the excrement of the gods”. It was valued less than feathers, their most valuable currency. Decoratively they much preferred brass, introduced by the Spanish invaders.

The French writer Guy de Maupassant (1850-1893) liked to eat lunch in the restaurant of the Eiffel Tower because he hated the structure, and it was the only place he could not see it. He really hated it: “A high and skinny pyramid of iron ladders, this giant ungainly skeleton upon a base that looks built to carry a colossal monument of Cyclops, but just peters out into a ridiculous thin shape like a factory chimney.”

In 1991, to celebrate the bicentenary of Mozart’s death in 1791, Triumph International, Japan’s second-largest lingerie company, made a musical bra with blinking lights which played 20 seconds of Twinkle Twinkle Little Star. Although their intentions were commendable, the company had made a common error in attributing the piece to Mozart. While he had composed variations on the tune, the lyrics were written by the London-based sisters Jane and Ann Taylor, and the melody was originally a French folk tune.

The word encyclopaedia literally means a “circle of learning” and was originally used to indicate a well-rounded education. It was not used as a title for books of general knowledge until the 17th century.

Wombats (Vombatus ursinus) have evolved with special anal sphincters that produce cubic faeces or scat. They use them to mark out their territory, leaving them perched on rocks, leaves and logs. Their shape helps stop them from rolling off.

The shortest nation in Europe is Malta. The Maltese have an average height of 5ft 4ins (164.9cm) compared with the EU average of 5ft 5½ins (169.6cm). Notable short people include Horace, Joan of Arc, Alexander Pope (4ft 6in), Goya, Lord Byron, Franz Schubert (5ft 1in), Leo Tolstoy, JM Barrie (4ft 11in), Judy Garland (4ft 11in) and Yuri Gagarin (5ft 1in). Someone who wasn’t short was Napoleon, who at 5ft 6½in was taller than the average Englishman at the time.

The shortest war in history was the Anglo-Zanzibar War, which took place on August 27, 1897 and lasted 38 minutes. When the Sultan of British-administered Zanzibar died, his nephew, Khalid bin Barghash, succeeded him, in direct contravention of the wishes of the British consul, who had suggested another candidate. Undeterred, Khalid climbed into the royal palace through a broken window with 2,000 supporters in tow, raised the Zanzibar flag and proclaimed himself Sultan. The British then issued him with an ultimatum: abdicate or face war. When the deadline expired at 9am the next morning, British gunships opened fire, bombarding the palace and setting it alight. Khalid escaped to Mombasa, leaving 500 casualties behind him. Only one British sailor was slightly injured.

The longest place name in Europe is on Anglesey: Llanfairpwllgwyngyllgogerychwyrndrobwllllantysiliogogogoch. It was cooked up as a publicity stunt in 1860 when the village of Llanfair (which means St Mary’s church) opened the island’s first railway station. Local businessmen came up with the idea of creating the longest station sign in Britain, made up of the existing names of the village, a nearby hamlet and a local whirlpool. The world title, however, goes to Bangkok in Thailand, which in 1782 was given a ceremonial name: Krung Thep Mahanakhon Amon Rattanakosin Mahinthara Yuthaya Mahadilok Phop Noppharat Ratchathani Burirom Udomratchaniwet Mahasathan Amon Phiman Awatan Sathit Sakkathattiya Witsanukam Prasit.

No one is sure where the name España comes from. It might derive from the Greek Hesperia, meaning “western land”, or from a Phoenician word meaning “land of rabbits”. Spain is certainly rich in rabbits: the first written reference to the art of ferreting rabbits occurs in Pliny the Elder’s Natural History, which tells how, in 6BC, the Emperor Augustus sent ferrets to the Balearic Islands to control a plague of rabbits.

The future of the BBC

October 16, 2011

The BBC, the world’s first national broadcasting organisation, began life as a private company, the British Broadcasting Company, on 18 October 1922. In those days it was a syndicate of six telecommunications companies, including Marconi and General Electric. Five years later the company was wound up and a royal charter created the British Broadcasting Corporation. Nation shall speak peace unto nation was the celebrated motto. The way radio initially developed was typically British: fears about its negative impact on information and taste led to a paternalistic, top-down approach. There was no vulgar advertising. And because the press perceived the BBC as a threat, there were no news bulletins before seven in the evening. In 1932, using a system developed by John Logie Baird, the BBC began a limited television service. Since then the BBC has grown massively; by the start of this century it had become the largest broadcaster in the world, with about 23,000 staff.

The director-general has been a central figure in the corporation; there have been 14 of them in the 80 years of the BBC’s existence. The Corporation’s first director was a thirty-three-year-old Scottish Presbyterian, John Reith. A most high-minded man, he had no truck with the idea that radio should be for mere entertainment. Educating and informing the masses should be its goal: give the public what they needed, not what they wanted. Another famous director-general was Hugh Greene, brother of the writer Graham. He was in charge of the BBC in the swinging sixties and was responsible for modernising an organisation that had fallen behind in the face of the challenge from the newly created commercial channel, ITV. He also had to defend the BBC from attacks by the ‘clean-up TV’ campaigner Mary Whitehouse, whose National Viewers and Listeners Association used to rail against the permissive society. The current DG, Mark Thompson, will be leading the BBC in a new cash-strapped era.

On 20 October 2010 Chancellor George Osborne announced that the television licence fee would be frozen at its current level – £145.50 per year per household – until the end of the current charter in 2016. The BBC will also have to take on the full cost of running the BBC World Service, which had been the responsibility of the Foreign and Commonwealth Office. And just this month further cuts have been announced, with the BBC expected to reduce its budget by up to 20% over the next few years. That will still leave it with £3.5 billion a year – hardly chicken feed. It remains the second largest budget of any UK broadcaster, trailing British Sky Broadcasting’s £5.9 billion.

The BBC has been attacked and criticised since its creation. Its first test came with the general strike of 1926. And of course recently there was the bust-up between the BBC and Blair and Campbell over those dodgy dossiers about Iraq. The Beeb is often accused of political bias. Objectivity is impossible, but I think the BBC compares very favourably with public broadcasters around the world.

Then there are those more light-hearted moments. One of my favourites has to be Guy Goma, a business studies graduate from Brazzaville in the Republic of the Congo, who was the victim of mistaken identity. Goma, who was wrongly identified in the press as a taxi driver, was waiting for a job interview at Television Centre. He was confused with an IT expert who also happened to be called Guy; the producer had wanted the technology journalist Guy Kewney to be interviewed about the Apple vs. Apple court case. Instead it was Goma who went on air, and he was interviewed for nearly two minutes. “I just thought keep going,” said Goma after his ordeal was over. I thought he did rather well.

Having grown up with it, I have a soft spot for the Beeb. I, Claudius, Yes Minister and Life on Earth were all surely examples of public broadcasting at its very best. Of course it also produces dross, but as they say, 90% of everything is probably crap. However, we can’t go back to the past. The organisation has become rather bloated. Does it really need to do everything that it currently does? The problem is how to focus. I’m not arguing for something on the limited scale of PBS in the United States, but £3.5 billion should be sufficient. The problem is that everyone will have their own idea of what is essential. When the BBC tried to close 6 Music last year there were howls of protest and the station was reprieved. My criterion would be that the BBC should provide what the market won’t. Maybe the cash constraints will force a bit more creativity. I don’t really see the point of the BBC doing Strictly Come Dancing; no one could argue that this couldn’t be done by the private sector.

Of course the most difficult area in the future will be sport. The problem here is not one of market failure. Murdoch’s Sky would love a crack at the BBC’s sporting jewels; the problem is that access would only be available to paying subscribers. In America a market solution has emerged: the big four sports appear on free-to-air television, but if you want to see every match of your team, you have to pay. This seems to me an intelligent solution. You get the national exposure and the corresponding advertising revenues that a sport requires, and you can also charge those who are willing to pay for a premium service.

The BBC faces some challenging decisions over the next few years. It will have to downsize. I wouldn’t want to go back to the old days when there were just two or three channels. I love the variety we can enjoy now. American television is really so much better than it used to be. With satellite, cable and YouTube we are spoilt for choice. We can be our own directors-general. But I am sure that the BBC has enough talent to remain relevant for another eighty years.

Twimmolation and other new words

October 16, 2011

Here is another selection of new words I found on the Wordspy website:

butler lie

A lie used to politely avoid or end an email, instant messaging, or telephone conversation. The term alludes to the social buffering function that butlers once provided for their employers.

filter bubble

Search results, recommendations, and other online data that have been filtered to match your interests, thus preventing you from seeing data outside of those interests.


An activist who supports or lobbies for laws that ban infant circumcisions.

IKEA effect

Increased feelings of pride and appreciation for an object because it has been self-made or self-assembled.


juvenoia

The baseless and exaggerated fear that the Internet and current social trends are having negative effects on children. [Juvenile + paranoia.]

last name effect 

The closer a person’s childhood surname is to the end of the alphabet, the faster that person tends to make purchase decisions.

omega male

The man who is least likely to take on a dominant role in a social or professional situation.


A deep appreciation for the aesthetic qualities of paper; a preference for reading items printed on paper rather than displayed on a screen.

pity friend

On a social networking site, a person whose friend request you accept out of pity.


twimmolation

The destruction of a person’s career or reputation caused by lewd or insensitive Twitter posts. [Twitter + immolation.]

Five famous psychological experiments #2

October 9, 2011

Last year I did a post about five famous psychology experiments. Here is another selection:

Blind to the unexpected

In 1999 cognitive psychologists Daniel Simons and Christopher Chabris came up with the so-called “invisible gorilla” test. Their volunteers had to watch a one-minute video in which two groups of people — half dressed in white, the other half in black — pass basketballs around. The volunteers were told to count the passes among players dressed in white while ignoring the passes of those in black. During the video, a woman in a gorilla suit walks into the centre of the frame, pounds her chest and then walks off. It would seem to be the most obvious thing in the world, yet about half the people missed it. This effect is known as inattentional blindness: when you are focussing on one activity, you can become blind to the unexpected. Last year they repeated the study; they wanted to see whether people who had heard about the experiment would notice other unexpected events in a new video. As in the original, those who hadn’t heard of it spotted the gorilla about half the time. As you would expect, all 23 of the participants who knew about the original experiment saw the gorilla, but only 17% saw one or both of the new unexpected events – the curtain changing colour and one player on the black team leaving the game. You may find this experiment trivial, but a similar one done by NASA, using commercial pilots with thousands of hours of flying experience in a state-of-the-art flight simulator, is more worrying. During a simulated landing in foggy conditions some of the pilots failed to notice a jet parked on the runway!

Clairvoyant rats and pigeons

Last week I wrote about the unreliability of expert predictions. There are experiments showing that animals can sometimes do better than humans. I’m not referring to Paul the Octopus, who correctly predicted the winner of each of Germany’s seven matches in the 2010 World Cup, as well as the result of the final. In this case it was rats and pigeons. The experiment involved researchers flashing two lights, one green and one red, onto a screen. The green light flashed 80% of the time, but the exact sequence was random. The rats and pigeons quickly discovered that the optimum strategy was to always go for green, guaranteeing an 80% hit rate. Humans, on the other hand, tried to see a pattern where there was none, matching their guesses to the frequencies, and achieved only 68% success. What’s more, they persisted in this erroneous strategy even after they had been told that the flashing lights were random. Another study with Yale students produced similar results; rather than accept a guaranteed 40% error rate, they hunted for patterns and ended up being wrong almost 50% of the time.
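The arithmetic behind those hit rates is worth spelling out: always choosing green wins 80% of the time, whereas “probability matching” (choosing green only 80% of the time) wins 0.8² + 0.2² = 68% of the time. A short simulation, which is just a sketch of the setup described above rather than the original experiment, makes the point:

```python
import random

random.seed(42)          # fixed seed so the run is reproducible
P_GREEN = 0.8            # the green light flashes 80% of the time
TRIALS = 10_000

# A random sequence of lights with the 80/20 split described above.
lights = ["green" if random.random() < P_GREEN else "red" for _ in range(TRIALS)]

# The rats' and pigeons' strategy ("maximizing"): always guess green.
always_green_hits = sum(light == "green" for light in lights)

# The human tendency ("probability matching"): guess green 80% of the time.
matching_hits = sum(
    light == ("green" if random.random() < P_GREEN else "red") for light in lights
)

print(f"always green:         {always_green_hits / TRIALS:.0%}")  # close to 80%
print(f"probability matching: {matching_hits / TRIALS:.0%}")      # close to 68%
```

The matching strategy only scores when its guess and the light happen to agree, which occurs with probability 0.8 × 0.8 + 0.2 × 0.2 = 0.68. By the same logic, a 60/40 version, which may be what the Yale students faced, gives 0.6² + 0.4² = 0.52, an error rate of almost 50%.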

Frightening Little Albert

Behaviourist John Watson believed, following the principles of classical conditioning, that he could condition a child to fear a stimulus which would not normally be frightening. The subject of the study was a nine-month-old baby called Albert. At the beginning of the experiment Little Albert was exposed to a white rat, a rabbit, a dog, a monkey, masks with and without hair, and burning newspapers, among other things. During this phase Little Albert showed no fear toward any of these items. In later trials, Watson and his assistant Rosalie Rayner made loud sounds behind Albert’s back by striking a long steel pipe with a hammer whenever the baby touched one of the chosen items. Not surprisingly, on these occasions Little Albert cried and showed fear when he heard the noise. The final stage of the experiment was to present Albert with the stimuli alone. He became very upset as the rat appeared in the room. He cried, turned away from the rodent, and tried to move away. Watson had shown that emotional responses could be conditioned, or learned. Indeed, Little Albert seemed to generalize his fear to other furry objects, so that when Watson sent a non-white rabbit into the room seventeen days after the original experiment, Albert also became distressed. He showed similar reactions when presented with a furry dog, a seal-skin coat, and even when Watson appeared in front of him wearing a Santa Claus mask with a white cotton beard. The story has a sad ending. Albert, whose real name appears to have been Douglas Merritte, was the son of Arvilla Merritte, then an unmarried woman who worked as a wet nurse at the Harriet Lane Home. Nothing is known about the long-term effects of Watson’s experiment on the child. Tragically, he died at the age of six on May 10, 1925 and is buried in a cemetery in Maryland.

Make me straight

Dr. Robert Galbraith Heath, founder and chairman of the Department of Psychiatry and Neurology at Tulane University in New Orleans, Louisiana, did research, partially financed by the CIA and the US military, which involved stimulating the brain using surgically implanted electrodes. His subjects were institutionalized psychiatric patients, often African Americans. He wanted to use this brain stimulation to relieve the symptoms of major psychiatric disorders such as severe depression and schizophrenia. However, despite this laudable aim, his methods left a lot to be desired. One of his collaborators was the Australian psychiatrist Harry Bailey, who later recalled that they had used African Americans as subjects because “they were everywhere and cheap experimental animals”. His most infamous experiment was on Patient B-19, a 24-year-old gay man who wanted Heath to make him straight. Heath implanted electrodes in his head, showed him straight porn films, and then activated the pleasure centres of his brain via the electrodes. A prostitute was hired to see if the treatment had worked. Did Patient B-19 actually become heterosexual? Following discharge from the hospital, he had a sexual relationship with a married woman for almost ten months. His homosexual activity was reduced during this period, but did not stop completely. I couldn’t find any long-term follow-up information. Heath seemed excited about the prospects for this therapy, but fortunately homosexual conversion therapy with brain surgery and pleasure-centre stimulation did not catch on.

The monster study

Most of us are familiar with the film The King’s Speech. At the beginning of the film a therapist has the future king put seven pebbles in his mouth to take his mind off stuttering, a treatment that goes back to ancient Greece, where the famous orator Demosthenes is said to have used it. Anyway, it didn’t work out; George spat them out and the hapless therapist was promptly sent packing. On the other side of the Atlantic, at more or less the same time, an infamous experiment was taking place. The year was 1939 and the place, Davenport, Iowa. The aim of the experiment was to make kids stutter. The intentions were noble; Dr. Wendell Johnson, a speech pathologist, believed that stutterers were made, not born: the stigma of being labelled a stutterer would actually make them worse, and in some cases caused ‘normal’ children to start stuttering. To prove his point, he ran an experiment which has since become known as the ‘Monster Study’. The 22 youngsters from a veterans’ orphanage who were recruited to participate were divided into two groups. The first were labelled ‘normal speakers’ and the second ‘stutterers’. In reality only half of the group labelled stutterers had actually shown signs of stuttering. During the course of the experiment, the normal speakers were given positive encouragement. But what made the study so notorious was what happened to the stutterers’ group. They received negative reinforcement: they were lectured about stuttering and constantly reminded not to repeat words. And the rest of the teachers and staff at the orphanage were told that the whole group were stutterers. Although none of the test subjects actually became stutterers, the quality of their schoolwork fell off and many suffered psychological and emotional scars later in their lives. They were understandably embittered when they discovered in 2001 what had been done to them.
The university issued an apology after the study was made public in news reports. On 17 August 2007, six of the orphan children were awarded $925,000 by the State of Iowa.


So there you are. These are some of the things psychologists got up to. One would hope that they no longer do some of the more ethically questionable things that I have described above. They were different times. I could have mentioned Pavlov, who experimented on humans as well as dogs; watching the videos can be quite painful. The uncomfortable fact is that we did learn a lot from these experiments and others carried out in those years. And there were some pretty terrible experiments going on in other fields too. One of the most shocking must surely be the Tuskegee syphilis study, conducted between 1932 and 1972 in Tuskegee, Alabama by the U.S. Public Health Service on poor, rural black men. The men received free health care, but they were never told they had syphilis, nor were they ever treated for it; the aim was to see what would happen if the disease went untreated. I hope that in 2050 a future blogger will not have to write about what we were doing in the 21st century.

Poem #1 : Totally like whatever, you know? Taylor Mali

October 9, 2011

I do enjoy reading and listening to poetry. This Taylor Mali poem, Totally like whatever, you know?, is a nice one to begin with.

In case you hadn’t noticed,

it has somehow become uncool

to sound like you know what you’re talking about?

Or believe strongly in what you’re saying?

Invisible question marks and parenthetical (you know?)’s

have been attaching themselves to the ends of our sentences?

Even when those sentences aren’t, like, questions? You know?


Declarative sentences – so-called

because they used to, like, DECLARE things to be true

as opposed to other things which were, like, not –

have been infected by a totally hip

and tragically cool interrogative tone? You know?

Like, don’t think I’m uncool just because I’ve noticed this;

this is just like the word on the street, you know?

It’s like what I’ve heard?

I have nothing personally invested in my own opinions, okay?

I’m just inviting you to join me in my uncertainty?

What has happened to our conviction?

Where are the limbs out on which we once walked?

Have they been, like, chopped down

with the rest of the rain forest?

Or do we have, like, nothing to say?

Has society become so, like, totally . . .

I mean absolutely . . . You know?

That we’ve just gotten to the point where it’s just, like . . .



And so actually our disarticulation . . . ness

is just a clever sort of . . . thing

to disguise the fact that we’ve become

the most aggressively inarticulate generation

to come along since . . .

you know, a long, long time ago!


I entreat you, I implore you, I exhort you,

I challenge you: To speak with conviction.

To say what you believe in a manner that bespeaks

the determination with which you believe it.

Because contrary to the wisdom of the bumper sticker,

it is not enough these days to simply QUESTION AUTHORITY.

You have to speak with it, too.

You can download and listen to it here. If you prefer there is a YouTube video.

My media week 09/10/11

October 9, 2011

On his blog, Stephen Fry paid tribute to Steve Jobs.

John Gray is not convinced by Steven Pinker’s book The Better Angels of Our Nature: The Decline of Violence in History and Its Causes. In Prospect Magazine, in a piece called “Delusions of Peace”, he argues against Pinker’s thesis. I intend to read the book soon and will give you my opinion as to whether the world is becoming a less violent place.

On BBC Radio 4’s In Our Time Melvyn Bragg and guests discussed the work of the Scottish Enlightenment philosopher David Hume. He was sceptical of religion and was suspected of being an atheist. His analysis of miracles is something that influenced my thinking.

Distrust me, I’m an expert

October 1, 2011

Economists give their predictions to a digit after the decimal point to show that they have a sense of humour.  Anonymous

Legends of prediction are common throughout the whole Household of Man. Gods speak, spirits speak, computers speak. Oracular ambiguity or statistical probability provides loopholes, and discrepancies are expunged by Faith.  Ursula K. Le Guin

There are two classes of forecasters: those who don’t know and those who don’t know they don’t know. J K Galbraith


A few years back I did a piece called Really terrible predictions, which listed some of the most infamously bad predictions of the last century or so. Gems included IBM chairman Thomas Watson’s 1943 prediction that there would be a world market for five computers, and Yale economics professor Irving Fisher’s claim that stocks had reached “a permanently high plateau”. He came up with this on October 16, 1929!

The post came back to me this summer while I was reading Dan Gardner’s Future Babble, a book which examines why experts are so bad at predicting the future. Of course, the fact that experts sometimes get it spectacularly wrong doesn’t prove anything. As Ben Goldacre likes to say, the plural of anecdote is not data. You need a rigorous study in which hundreds of experts – academics, intelligence analysts, economists, political scientists and even journalists – make predictions, and then you have to see how they have done. Gardner found exactly this: the research of Philip Tetlock, a professor of psychology at the University of Pennsylvania. Beginning in the 1980s and over a period of twenty years, Tetlock examined 27,451 forecasts by 284 experts about inflation, elections, wars and so on. It was an exhaustive study if ever there was one, and the results did not speak highly of the experts’ abilities. Indeed, they did little better than those proverbial dart-throwing chimps. That’s right: no better than random chance.

We need to dig more deeply into the results. The ones who were worse than average actually did worse than if they had been tossing a coin. Now that is quite an achievement! And even the ones who did better were not much better than random chance. Another paradoxical conclusion is that there was an inverse correlation between confidence and accuracy; the greater the expert’s confidence the less accurate the predictions were. According to Gardner what made the difference here was the style of thinking.
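As an aside on how such accuracy is measured: probabilistic forecasts are typically graded with something like the Brier score, the mean squared difference between the stated probability and the 0/1 outcome, where 0 is perfect and an unconditional coin-flipper scores 0.25 on yes/no questions. A minimal sketch with made-up numbers shows how a confident hedgehog can end up doing worse than chance while a hedged fox does better:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.

    0.0 is perfect; always answering 0.5 scores 0.25; lower is better.
    """
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

outcomes = [1, 0, 1, 1]  # what actually happened (invented for illustration)

coin = brier_score([0.5, 0.5, 0.5, 0.5], outcomes)      # 0.25 by definition
hedgehog = brier_score([1.0, 1.0, 0.0, 1.0], outcomes)  # certain, twice wrong: 0.5
fox = brier_score([0.8, 0.3, 0.7, 0.9], outcomes)       # hedged, right direction
```

Lower is better: the hedgehog’s certainty is punished heavily on the two questions it gets wrong, pushing it below the coin-flip benchmark, while the fox’s moderate probabilities in the right direction keep its score well above it.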

When examining the experts’ records it is helpful to use the philosopher Isaiah Berlin’s distinction, borrowed from the Greek poet Archilochus: “The fox knows many things, but the hedgehog knows one big thing.”

Tetlock analysed the difference in prediction styles in a 2005 book:

Low scorers look like hedgehogs: thinkers who “know one big thing,” aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who “do not get it,” and express considerable confidence that they are already pretty proficient forecasters, at least in the long term. High scorers look like foxes: thinkers who know many small things (tricks of their trade), are sceptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible “ad hocery” that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess.

The experts were asked many kinds of questions, not all within their area of expertise. The foxes did better when asked about their field. This is exactly what you would expect. But the bizarre thing was that the hedgehogs actually did worse when making predictions in their specialised area.

The worrying aspect about this is that everyone loves a hedgehog. They are the ones who get invited as pundits on to television shows, write in newspapers and appear on the bestseller lists. We don’t want nuance. And Tetlock found that the more famous the experts, the less accurate they were.

Psychologists were probably not surprised by Tetlock’s results. What we are dealing with here, then, is not merely the inherent complexity of predicting the future. We have our flawed human cognitive abilities to take into account. Gardner looks at the cognitive biases that can have a negative effect on our ability to predict the future. Here are some of the biases he mentions:

Optimism bias is the tendency to believe that we are better than we really are – we are all above average in intelligence, looks and so on. Getting married? Other people will end up in the divorce courts. Starting a new business? Most fail, but mine will be different. This may seem like delusional thinking, but the evolutionary advantage is that it encourages people to take action and makes them better able to deal with setbacks. To paraphrase Jack Nicholson, we can’t handle the truth.

Another danger is confirmation bias. Once we form a belief we tend to seek out and accept information that supports it and not bother to look for information that does not. And even if we are actually presented with information that doesn’t fit, we will be hypercritical, looking for any excuse to dismiss it as worthless.

Status quo bias is the tendency to see tomorrow as being like today. We lack the imagination to see beyond today’s trends. Of course, this doesn’t mean we expect nothing to change. But most attempts at prediction seem to begin with current trends, which are then projected into the future. Current trends do often continue, but the further we look into the future, the more likely it is these trends will be reversed.

Negativity bias is a predilection for doom and gloom. We are drawn to bad news or images, and we are more likely to remember them than positive information.

What is so interesting about these types of biases is that none of us are immune to them.  If someone had told me just thirty months ago that Spain would go on to win the European football championship and the World Cup in the space of two years I would have thought that they should be locked up. I assumed that all my experience in the past was valid. This is where experts can be especially dangerous. I have no problem admitting to my flawed thinking, but the experts with their experience, intelligence and expertise can actually be more prone to these psychological foibles.

At the end of the book Gardner looks at the work of controversial political scientist Bruce Bueno de Mesquita, who specialises in international relations and foreign policy.

Bueno de Mesquita doesn’t really care about the local culture, history, economy, or any of the other considerations that more traditional political scientists analyse. For him the key is self-interest and he uses game theory to make models of the future. He claims a number of impressive hits. His model predicted:

  • Brezhnev being succeeded by the dark horse Andropov, whom nobody at the time even considered a possibility.
  • China’s crackdown on dissidents, four months before Tiananmen Square.
  • The second Intifada and the end of the Middle East peace process, two years before they happened.

Bueno de Mesquita claims a hit rate of 90%. It sounds very impressive, but it also makes me a bit suspicious. We need to know the difficulty of these predictions. What did he get wrong? I am especially sceptical about black swans, those rare and unpredictable events that can have catastrophic results. We seem to be incapable of predicting these.

I’m certainly not suggesting we leave the field to astrologers and clairvoyants. But we have to recognise the difficulty of the enterprise and be aware that uncertainty will always be there. We sometimes just have to admit that we don’t know.