I’ve been trying, with mixed success, to reduce the amount of plastic waste I generate. Spending months in Freiburg will do that to you.
So imagine my joy at seeing this headline in the Guardian this week:
Scientists accidentally create mutant enzyme that eats plastic bottles (The Guardian, 16 Apr 2018)
Woohoo! I no longer need to feel guilty about all the single-use packaging my groceries are in!
Any images of plastic melting into water and carbon dioxide were swiftly dispelled, however, when I read the article itself:
So the enzyme that eats plastic bottles already existed, and was discovered in 2016. What the scientists did, accidentally, was make it better at eating plastic bottles.
How much better is the new mutant enzyme at breaking down PET (polyethylene terephthalate)?
See, here’s the problem I have. The developments in the article are genuinely exciting, not for their immediate uses but because of what they suggest is possible — that we may eventually find and/or develop enzymes that can break down all plastics (not just PET) in a matter of, say, hours or days — as opposed to the centuries plastics currently take to degrade.
But none of that possibility is conveyed in a sensationalist headline that focuses on the idea of eating plastic bottles.
Confusogenic Cancer Communications
Let’s visit another bad science headline, from October 2015.
Processed meats rank alongside smoking as cancer causes — WHO (The Guardian, 26 Oct 2015)
You might remember this one. When the World Health Organisation's (WHO) International Agency for Research on Cancer (IARC) released its report on the link between processed meats and colon cancer in Lancet Oncology, the news made headlines all over the world. The Guardian's headline was especially egregious, for reasons I'll point out in a second, but many major news outlets responded with similar headlines:
Meat Is Linked To Higher Cancer Risk, W.H.O. Report Finds (New York Times)
Processed meats do cause cancer — WHO (BBC News)
All of these headlines say that eating meat causes (or "is linked to") cancer, but the Guardian's headline says one thing that the others do not: that processed meat is somehow as potent a cause of cancer as smoking.
Let’s take a look at the openings of each of these articles, too. The Guardian:
The New York Times:
The Washington Post:
If you look carefully, you'll notice that the New York Times, the Washington Post and the BBC talk specifically about the findings in the Lancet Oncology paper itself, which is a meta-analysis of existing studies and pretty readable even for someone without any college-level medical or biology knowledge:
The Guardian, instead, opted to talk about the WHO’s categorisation of processed meat as a class 1 carcinogen:
Here’s the problem: the IARC’s classification of carcinogens does not classify them by degree of risk of carcinogenicity, but rather by the strength of the evidence that they are carcinogenic. Atlantic writer Ed Yong put it most elegantly in his article, Why is the World Health Organization So Bad At Communicating Cancer Risk?:
Scientific Literacy, Or Lack Thereof
While it’s true that the WHO’s communications on the carcinogenicity of processed meat were pretty dang bad, I also think the overall level of scientific literacy among non-scientists is pretty poor.
This comment isn't about how much science non-scientists know; it's about whether non-scientists know how to read science at all. And to be fair, this isn't something I knew to seek out for myself either. It was something I stumbled into by accident.
My final semester at NYU, I took a class called Learning To Speak: First and Second Language Acquisition Of Sound. I asked the professor beforehand what the class was like, and she said it was “lots of reading”.
Easy peasy, I thought. I’m a Spanish major, I can do lots of reading. In my imagination there was some kind of textbook of language acquisition, and we’d read a chapter or two every week.
Of course that's not what happened. Every week, we read two to three papers on how people learn to speak and understand spoken language, and then we critiqued them. We discussed how well or how poorly the experiments were designed, how the subject pools may have affected the outcome, alternative interpretations of the results, and so on. (While discussing one paper with a particularly baffling choice of subjects, our professor said, "You're all young and like, 'for the science!' but maybe he had a publishing deadline and decided the data was good enough.")
Honestly, I don’t remember half of the conclusions from the papers we read, but what I took away from the class was much more valuable. I learnt how to read a scientific paper, how to look for and poke at chinks in the armour, the linguistic and statistical sleights of hand that researchers might use to shore up data that is in reality not very conclusive. It was the first time I’d been forced to pull back the curtain and actually look at how scientific knowledge is created — and therefore how solid or shaky that knowledge might be.
This is very different from how science tends to be taught up to high school. At that level, education focuses not on the experiment but on the result. High school science is about showing a grasp of principles that are already well-established, without necessarily exploring how those principles got established in the first place.
It's no surprise, then, that when we watch science being done in real time, we have no idea what to make of it. It's not that we refuse to see the sausage being made, necessarily; we're never even taught what that looks like. Right through high school we're only shown complete sausages, and given the vague impression that they come straight from the animal that way. (Okay, that metaphor died fast.)
Except maybe don’t eat sausages, because, you know, they raise your risk of colorectal cancer from 4.3% to just over 5%.
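Those two numbers are worth pausing on, because they show the gap between relative and absolute risk that headlines exploit. A minimal sketch of the arithmetic, assuming the baseline lifetime risk cited above (4.3%) and IARC's widely reported 18% relative increase for eating 50g of processed meat per day:

```python
# Relative vs absolute risk: why "18% higher risk" sounds scarier than
# it is. The 4.3% baseline is the lifetime colorectal cancer risk cited
# in the text; the 18% relative increase is IARC's reported estimate
# (assumed here) for 50g of processed meat per day.
baseline_risk = 0.043        # lifetime colorectal cancer risk (~4.3%)
relative_increase = 0.18     # 18% higher *relative* risk

new_risk = baseline_risk * (1 + relative_increase)
absolute_increase = new_risk - baseline_risk

print(f"New lifetime risk:  {new_risk:.1%}")          # → 5.1%
print(f"Absolute increase:  {absolute_increase:.2%}") # → 0.77%
```

An 18% relative bump turns into less than one percentage point of absolute risk, which is exactly the "4.3% to just over 5%" figure above.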
Alternatives to the Current Model of Science Education
There's no need to throw the baby out with the bathwater here. We don't have to overhaul pre-university science education wholesale, but we probably should change the experimental part of it.
Imagine a virtual reality sandbox with its own rules of physics. Imagine the challenge of using whatever you find in this virtual reality to try and determine acceleration due to gravity, or the refraction index of a particular gemstone, or the chemical composition of an unknown liquid. You could generate a complex system with some element of randomness for teaching students about designing and conducting experiments on a population, and the different ways of massaging data to fit a desired result. There’s so much more we could do in this arena.
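To make the sandbox idea concrete, here's a minimal sketch of what the gravity exercise might look like under the hood. Everything here is hypothetical: the simulator hides a "true" value of g, the student can only take noisy measurements (timing a dropped ball), and has to design an experiment to recover it.

```python
import random

TRUE_G = 9.81  # hidden from the "student" in a real sandbox

def drop_ball(height_m, timer_noise_s=0.05):
    """Simulate timing a ball dropped from height_m, with random timing error."""
    true_time = (2 * height_m / TRUE_G) ** 0.5
    return true_time + random.gauss(0, timer_noise_s)

def estimate_g(height_m, trials):
    """Average many noisy drops to estimate g, as a student might."""
    times = [drop_ball(height_m) for _ in range(trials)]
    mean_t = sum(times) / len(times)
    return 2 * height_m / mean_t ** 2

random.seed(1)
print(estimate_g(height_m=10.0, trials=5))    # few trials: noisy estimate
print(estimate_g(height_m=10.0, trials=500))  # many trials: close to 9.81
```

The teaching point falls straight out of the design: a handful of trials gives a wobbly answer, and only a deliberately designed experiment with enough repetitions pins the value down, which is exactly the kind of reasoning high-school labs rarely force on students.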
Right now, science journalists have the job of communicating to us what these science papers say. They don’t always do a very good job of it, and neither do the scientists.
Let’s learn to meet them halfway.