Tuesday, November 11, 2014

Television rots your brain

When I was younger we were warned that television rots your brain.

Now Baroness Susan Greenfield tells us that digital technologies and social media do the same, as summarised in her book Mind Change: How digital technologies are changing our minds. As Susan (that's what she asked us to call her) was the guest on this week's QandA, I rocked along - and even asked a question.

The question I wanted to ask Susan but didn't get chosen for was in her original field of study - degenerative brain disease. The question was, simply, whether, in the light of the Government's proposed health research fund, we spend too much researching cancer and coronary disease and avoiding death, and not enough on the degenerative diseases that people live with for a long time.

But that isn't the kind of hip topic they like on QandA.

As I have the luxury of this blog I thought I might explore the issue I raised at greater leisure. (At the time of writing the transcript hasn't been posted, so I'm relying on my recall from last night and one re-viewing.)

Firstly, I had never really paid much attention to Susan before last night. I only bought the book on Kindle on my way in on the train and read bits of it (actually using the Kindle app on my iPad). And it was because I'd read her comments on how multi-tasking and continuous input/output processing were putting pressure on the ability to think that I cared about the topic.

Unfortunately things all got a bit blurred on TV last night between two disparate strands of discussion about digital technologies. The first was at the level of sociology and psychology and the concern about how the technology changes social interaction - the chain that leads to the whole discussion about narcissism.

The second and more interesting is that digital technologies and social media actually change the way we think - in essence, that we become more immediately reactive rather than reflective and, dare I say, cognitive.

As readers of my columns in iTnews and the AFR would know, I am very concerned about the decline in STEM skills. That concern suggests a question: "Are students finding maths and science harder than they used to be?"

That's why, as I read, I'd highlighted Susan's comment in the book that the "ability to make connections where they didn't exist before, to connect the dots, could account for talents in a number of academic areas, including philosophy, mathematics, science and music."

So to me it was a bit of a recursive exercise to get a non-scientific answer to that question when Susan had just been talking about the impact of the technology on thinking.

Now, let me remind readers that though I am a great fan of science, I am not a great fan of the phrase "the science says", as I observed in the context of climate science. Science is "privileged knowledge" built by a repeated process of applying theory to observable events and adapting, or even abandoning, the theory if the results are anomalous.

Real-world science is done by real people, and so all the characteristics that Thomas Kuhn identified as normal science abound; there is groupthink. The fact that a paper appears in a peer-reviewed journal does not mean the paper's method, data and conclusions are correct - just that they are not totally wrong.

And quite frankly, despite the way I framed my question, one person's personal experience can be enough to destroy a well-verified theory. If I see the first black swan, it is my experience that confounds the previous "law" that all swans are white.

But that wasn't what was happening last night. As I said, part of the difficulty was that there were two simultaneous strands of discussion going on - one about the overtly social impacts and the other the neuroscientific.

And I wasn't expecting the lawyers and economists to argue science with the scientist. I was, however, expecting that they might acknowledge that it is science. It is the inability of the non-scientific to recognise science when they see it that is of concern.

I wasn't really trying to pick on Laura John, but I was critical of the lawyerly debating tricks. To respond to science with "Susan makes some good points..." and then disregard anything Susan actually said indicates Laura has a great career ahead of her.

James Patterson was equally dodgy, but then again he is a disciple of the totally falsified (to use the Popperian term) theories of neo-classical economics. (And in fairness to my former boss Albo - he said he supports markets over a command economy, but he doesn't believe in the infallible self-creating market of neo-classical theory.)

Finally, I wasn't necessarily making a call myself on the science that Susan is referencing. But I do want to comment on the criticism made by Tony Jones, on air and in the Facebook discussion, that Greenfield has not done any peer-reviewed research here and that, if she believed the theory, she should.

That is an incorrect understanding of the process and progress of science. It is the discussion of the outcomes of many pieces of research that creates new patterns, new theories. It is the kind of discussion Susan is leading that establishes research programs. (And to be fair to Susan - that is exactly what the blurb on the book says: "What could this mean, and how can we harness, rather than be harnessed by, our new technological milieu to create better alternatives and more meaningful lives? Using the very latest research (up to the end of 2013), Mind Change is intended to incite debate as well as yield the way forward.")

My very simple conclusion on limited evidence is that, yes, the greater use of digital technologies by young people means their brains - at least their minds - are very different from those of my generation. To say they are different does not alone say they are better or worse. There are good reasons to think they are worse (the behavioural characteristics and thinking being two examples), but there are also reasons to think they are better (multitasking does get through voluminous quantities of stuff).

And if they are worse, the only solution isn't to say that social media needs to be used less. There are other options, including adapting the processes by which ethics are learnt (e.g. because you can't see the other person's reaction, you need to think harder about the golden rule of treating others as you would like to be treated), or utilising the increased I/O approach to support thinking by algorithm rather than by proof.

What isn't appropriate is to decide to do nothing simply because you don't like the conclusion - that's what climate change deniers do.

(Did TV rot my brain? Yes and no. My experience of the world is far more visual than my parents' was - and that includes, from Vietnam on, seeing the horror of war directly. But the trade-off has been the decline in descriptive language. And you can see that on the nightly news - what is defined to be "newsworthy" is something they have footage of. Completely inconsequential car accidents in the US are featured more highly than an earthquake in a developing country, because that is where they have cameras.

But TV also created a world of drama not previously matched, especially comedy. I find the whole world a lot more amusing than I would have without these opportunities. Satirical writing and black ink cartoons are no match for Mad As Hell.

And finally, when Albo used his analogy of the person at the concert watching their device record the event, I was reminded of the Leunig cartoon of the sunset.)


1 comment:

Unknown said...

There may be a conversation to be had about this, but Susan Greenfield hasn't staked any claim to be part of it - http://www.theguardian.com/commentisfree/2009/may/15/bad-science-susan-greenfield-computers
Maybe when she can produce some evidence we can start a conversation. Until then, as Ben Goldacre implies, it's probably all just 'flim flam'.