Food and cancer: why media reports are often misleading

by Sarah Williams | Analysis

14 December 2012

Plate of food (image from Wikimedia Commons)

This research shows the danger of overstating the results of single studies

The media’s appetite for things that cause or prevent cancer can be as notable for its sheer volume as for – in some cases – its hype. And food is a key area of interest, because everyone can relate to the latest headlines on bacon or broccoli.

Rarely a week goes by without headlines on the latest “cancer-busting” food or, at the other end of the spectrum, the unexpected perils of pop. So a research article with the intriguing title ‘Is everything we eat associated with cancer? A systematic cookbook review’, published in the American Journal of Clinical Nutrition recently, couldn’t fail to catch our attention.

This review looked at a selection of cookbook ingredients to see if they had been investigated for links to cancer. The researchers found that most of the 50 foods they looked at had been associated with an increased or decreased cancer risk in the scientific literature.

But is this a testament to the power of different foods over our chance of developing cancer? Or is it proof that scientists just can’t make their minds up?

In fact it’s neither. The most important finding of the review was that single studies often found links that had only weak evidence behind them.

The research provokes thoughts about why we do research, how much one study on its own can tell us, and the value of considering the balance of evidence.

Perhaps most importantly, it also raises questions about how much detail scientists and the media provide as context when they present results to the public.

And rather ironically, and worryingly, one paper decided this study warranted the conclusion that there is ‘no proven link between foods and cancer’. This is clearly not the case, as we explain below.

What’s a ‘systematic cookbook review’ anyway?

The authors of this review – the first of its kind – picked page numbers at random from a cookbook, and used the ingredients they found to create a list of 50 different foods to investigate. The book (The Boston Cooking-School Cook Book) is over a hundred years old, but most of the foods on the list, such as olives, lamb, flour and mushrooms, are standard fare today.

The authors then used PubMed, a comprehensive database of published scientific research, to look for studies linking each of the foods to an altered risk of cancer in people. Of the 50 foods on the list, they found that 40 had been the subject of at least one study. The 10 ingredients that hadn’t been investigated were generally more unusual (terrapin, for instance).

Building conclusions on shaky ground

For each type of food, the authors then looked in more detail at the studies they’d found, picking up to 10 results, or choosing the most recent where there were more than 10 studies. They compared what the studies had concluded with how strong their evidence was – and particularly how statistically certain the researchers could be that their finding wasn’t just down to chance.

Almost three-quarters of the studies claimed that consuming the food in question affected a person’s risk of cancer, rather than making no or little difference. But in most cases the evidence behind these claims was either weak or simply not there.

Results that had at least weak evidence tended to be highlighted in study abstracts – the short summary presented at the start of the paper, which is often the basis for press releases and subsequent media stories – whereas the findings that could have been down to chance (i.e. weren’t statistically significant) were reported only in the full manuscript.

And a finding with only weak evidence behind it starts to look shakier if you discover it was the strongest result out of many – after all, a one in 20 chance of being a fluke is not very impressive if someone’s had another 19 goes. This practice of highlighting findings from subgroup analysis, as it’s known, can be particularly misleading, as often only the abstract of an academic paper is available free of charge.
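To put a rough number on that intuition – this is our own back-of-the-envelope sketch, not a calculation from the review – here’s how quickly chance findings stack up when lots of comparisons are each tested at the conventional 5 per cent significance level:

```python
# Back-of-the-envelope illustration (not from the paper): how likely is at
# least one false-positive "link" if 20 independent comparisons are each
# tested at the conventional 5 per cent significance level?
# Real dietary analyses are rarely fully independent, so treat this as a rough guide.

p_threshold = 0.05    # the usual cut-off for calling a result "statistically significant"
n_comparisons = 20    # e.g. 20 food subgroups examined in the same dataset

p_at_least_one_fluke = 1 - (1 - p_threshold) ** n_comparisons

print(f"Chance of at least one purely-by-chance 'significant' finding: {p_at_least_one_fluke:.0%}")
# Prints roughly 64% -- so a lone 'significant' subgroup result is far less
# persuasive than it first appears.
```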

The way studies compared the amounts of a food people ate also varied widely. Less than 1 in 7 of the studies compared people who didn’t eat a food with those who did. Instead they compared people who on average had different amounts, which makes it more difficult to work out if it is the specific food that’s having the effect. And similar studies, comparing the same food and same type of cancer, often used different ways of classifying people’s consumption, making it hard to work out if findings supported one another or not.

The balance of evidence

Measuring scales (image from Wikimedia Commons)

The overall balance of evidence is more important than single studies

Next, the authors looked at studies called ‘meta-analyses’, which pool the results of other studies. These aim to calculate the overall size of a possible link between a food and cancer risk, and are considered by many to be ‘gold standard’ evidence – though of course they are only as good as the data that goes into them.

The authors found that links calculated in meta-analyses tended to be much weaker than the findings from the single studies. Tellingly, the meta-analyses were much more likely to conclude that a food didn’t make much difference to a person’s risk of cancer. This suggests that the single studies were likely to be overestimating any link they found.
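To see roughly how that pooling works, here’s a minimal, hypothetical sketch of the standard fixed-effect, inverse-variance way of combining relative risks. The numbers are invented purely for illustration – they aren’t taken from the review or from any of the underlying studies:

```python
import math

# A minimal sketch of fixed-effect, inverse-variance pooling -- the textbook way a
# meta-analysis combines relative risks. All figures below are made up for illustration.

# (relative risk, lower 95% CI, upper 95% CI) from three hypothetical studies
studies = [(1.60, 1.10, 2.30),
           (1.20, 0.90, 1.60),
           (1.05, 0.85, 1.30)]

weights, log_rrs = [], []
for rr, lower, upper in studies:
    log_rr = math.log(rr)
    se = (math.log(upper) - math.log(lower)) / (2 * 1.96)  # standard error on the log scale
    weights.append(1 / se ** 2)   # more precise studies carry more weight
    log_rrs.append(log_rr)

pooled_log_rr = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
print(f"Pooled relative risk: {math.exp(pooled_log_rr):.2f}")
# Prints about 1.18 -- noticeably closer to 1 (no effect) than the most
# striking individual study's 1.60.
```

Because precise studies count for more, one eye-catching but imprecise result gets pulled back towards the overall picture – which helps explain why pooled estimates often look more modest than the headline-grabbing single study.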

But this doesn’t necessarily mean that the studies were pointless or even badly conducted. The foods we eat are a sensible factor to consider when thinking about cancer risk, particularly when it comes to cancers of the digestive tract, which most of the studies focused on.

But what this study highlights is the importance of the balance of evidence when considering the impact of a study. And alongside that, we need to consider whether there’s a reasonable way for a given food to influence cancer risk, or whether there’s another explanation for a potential link.

For example, people who eat more healthily tend to take more interest in their health in general, which could help to explain any observation that eating apples keeps the doctor away.

As a whole, science is like an enormous jigsaw puzzle with every study contributing one small piece to the picture. Corners and easily identifiable parts of the image may seem like the most important – but you still need all those bits of sky. As the person looking at the puzzle, the key is to take in the whole thing and be aware of any gaps. This is how science works: each new academic publication doesn’t tend to cause wild swings in opinion – rather the slow accumulation of evidence allows the movement towards established fact.

But the popular media often highlights disagreement and controversy, so we need to remind ourselves not to read too much into a one-off claim.

So is there a link between diet and cancer or not?

Some of the foods thrown up by the cookbook search have been found to have clear links to cancer – but through decades of accumulated research, not isolated studies. Alcohol (wine, rum and sherry were on the list) is a well-established cause of cancer, and there’s a known link with eating too much red meat (lamb, pork, veal and beef) and with salty foods. On the reduced risk side, many of the foods considered were fruits and vegetables – getting your five a day is another good way to reduce the risk of cancer.

Despite a headline saying otherwise, this review definitely doesn’t demonstrate that there’s no link between diet and cancer. It just underscores the need for caution when interpreting the often limited evidence. So while it’s fair to say that broadly what we eat can affect our risk, in most cases we simply don’t have enough evidence when it comes to specific foods.

The message that the best way to reduce the risk of cancer through what you eat is to have a healthy balanced diet may not be new and exciting, but it remains the one with the strongest evidence.

Reference

Schoenfeld, J. & Ioannidis, J. (2012). Is everything we eat associated with cancer? A systematic cookbook review. American Journal of Clinical Nutrition. DOI: 10.3945/ajcn.112.047142

    Comments

  • Katie
    17 December 2012

    Another useful mythbusting article from this blog!
