Red Meat & Human Health

In Sigma Statements by Alan Flanagan


Estimated Reading Time: ~30-35 minutes

Context: Prevailing Guidelines on Red Meat and Health

Red meats are distinguished as ‘unprocessed’ or ‘processed’, a distinction that is an important aspect of evaluating the evidence related to health outcomes. Unprocessed meats include mammalian muscle meats from beef, lamb, mutton, veal, pork, goat, or game (venison, etc.). Processed meats are defined by preservation or flavour-enhancing methods, including salting, curing, smoking, and industrial processing.

Prevailing guidelines, including national health regulatory bodies and international organisations, recommend reducing red meat consumption, and limiting processed meat intake. The exact amounts may differ:

  • the World Cancer Research Fund (WCRF) recommends no more than 350-500g of red meat per week
  • the UK guidelines recommend 70g per day
  • Irish guidelines recommend lean red meat be consumed 2-3 days per week.

Both US and Irish dietary guidelines state that lean meats may contribute to nutritional status within the context of a healthful dietary pattern, with a particular emphasis on iron intake.

Historically, reductions in red meat intake were recommended as a strategy to reduce overall dietary saturated fat intake. It should be stated that the dietary recommendations for saturated fat are not based on red meat intake per se, but relate to overall levels of intake in the diet, which extend beyond meat alone. In addition, with the recent emphasis on overall dietary patterns in nutrition, reference health-promoting dietary patterns - in particular the Mediterranean and traditional Japanese patterns - are not necessarily absent red meat, but the contribution of this food group is generally lower than the historic levels of intake in certain Northern European and Western industrialised countries. However, the primary basis underpinning guidelines has been the relationship between meat consumption and bowel/colorectal cancer.

Is Red Meat Carcinogenic?

The International Agency for Research on Cancer (IARC) 2015 Working Group classified processed meat as carcinogenic, based on ‘sufficient evidence’ that high consumption of processed meats causes colorectal cancer. To be classified as “carcinogenic to humans”, the IARC prioritises epidemiological studies, supported by population-based case-control studies and mechanistic research, which have to accumulate to a point where there are consistent findings across different populations, making chance, bias, and confounding unlikely as explanations for the results other than that the agent - in this case processed meat - causally increases risk for cancer. This classification has been upheld in subsequent analyses, with a more recent updated examination of the results from prospective cohort studies published since the 2015 IARC decision finding that high consumption of processed meat (dose-response per 50g/d) was strongly associated with colorectal cancer incidence.

The classification of unprocessed meat as ‘probably carcinogenic to humans’ was, however, slightly more controversial, as this was largely based on mechanistic studies in the absence of clear and consistent associations in epidemiology. The IARC conclusion on the observational data was that clear associations between unprocessed red meat and colorectal cancer were lacking. A limitation of the IARC analysis was a lack of consideration of associations within dietary patterns as a whole, and - unlike processed meat intake, where the relationship was clear and consistent - the reliance on mechanistic studies would inevitably bias the analysis toward the ‘probably carcinogenic’ designation.

The more recent 2017 WCRF report similarly designated red meat as ‘probably a cause of colorectal cancer’, relying primarily on the direction of effect, which was positive but overall not statistically significant in the included cohort studies or the three included pooled analyses. The WCRF dose-response meta-analysis found no significant association per 100g/d and, similar to the IARC classification for unprocessed red meat, it appears that the conclusion was largely based on the trend of association coupled with mechanistic plausibility.

The mechanisms identified related to: 

  1. the formation of heterocyclic amines (HCAs) and polycyclic aromatic hydrocarbons (PAHs) from cooking at high temperatures.
  2. the induction of carcinogenesis by N-nitroso compounds (NOCs) stimulated by heme iron. 

Experimental models are designed to examine the effect of the exposure of interest (i.e., meat, heme iron, HCAs, or NOCs) on mechanistic processes. However, potentially protective or mediating compounds are excluded. For example, mechanistic studies examining carcinogenesis from heme iron often use feeds that are low in calcium, ascorbate, α-tocopherol, or fibre, all of which inhibit heme-mediated formation of endogenous NOCs. While this is reasonable from an experimental perspective, it does not necessarily extrapolate to the effects of a whole diet in humans. The emphasis on mechanistic studies may therefore have over-inflated the relationship between the exposure and outcome by excluding the context of a whole dietary pattern, in the absence of clear data from epidemiology.

However, the reality is that interventions in humans to date have largely focused on traditional risk factors associated with red meat, in particular saturated fat and related impacts on blood lipids. Therefore, the extent to which these moderating factors may attenuate the strength of the associations observed in certain cohort studies is relatively unknown at this juncture. This is due to the fact that the processes influencing disease outcomes, in particular CVD, cancer, and type-2 diabetes, may not necessarily relate to the intermediate risk factors typically studied, and may relate to other processes. It also reflects the fact that while it is possible to control for foods/food groups, if there is a specific nutrient-nutrient interaction that we don’t understand in detail yet, it is difficult to add that as a factor to control. Finally, many intervention studies to date do not specifically look at these potential mediating factors.

Nonetheless, the IARC publication remains arguably the most cogent overall review of the literature in relation to processed meat, as the analytical process emphasises the Bradford-Hill criteria for causality.

These criteria are more dynamic than ranking systems that emphasise the value of respective study designs over the scientific process of evaluation (e.g. the GRADE system discussed below). That process of evaluation incorporates multiple lines of evidence, interpretation, synthesis, coherence, and extrapolation. In this case, the body of evidence from epidemiology and mechanistic research demonstrated a consistent increase in risk from processed meat, together with biological plausibility. Factors like the high concentrations of nitrates, exogenous NOCs, sodium, and the fat content of processed meats would appear to define the difference in risk between processed and unprocessed meats, resulting in a more consistent relationship with increased risk from processed meat, and a dose-response per 50g/d. The standard of proof to be met to designate processed meat as carcinogenic to humans was a consensus of experts on what the body of evidence demonstrated for human health, with ‘proof’ of causation meaning causally increases risk (as distinct from direct, demonstrable causation, which would be a higher threshold and require evidence from trial designs that may demonstrate causality). Thus, while relationships may be observed with other health outcomes, the primary factor underpinning guidelines to reduce meat intake for health purposes is the relationship with cancers, in particular bowel cancers.

NutriRECS Consortium 2019 Analysis & “Guidelines”

Controversy surrounding meat consumption and health outcomes was recently generated by the publication of a series of meta-analyses by a research group, known as the 'Nutritional Recommendations (NutriRECS) Consortium’, which in addition to publishing the analyses also published “new dietary guidelines” recommending that individuals continue with current levels of meat consumption, on the basis of “low certainty” evidence for reducing meat intake benefiting health outcomes.

The reality of the results from the analyses is that the findings were largely consistent with the previous literature, demonstrating reduced risk for various mortality outcomes when comparing lower meat intakes to higher. The study did not find a null association; the entire premise of the conclusions and recommendations is that the “certainty” of the evidence is low. This has been misinterpreted to suggest that the findings were null.

In effect, there are three components to the discourse surrounding this publication:

  1. The results;
  2. The certainty of the findings according to the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) criteria;
  3. The purported ‘guidelines’ which were formulated based on the certainty rating, not the results.

Point 1 - The Results

In an overall analysis, the findings were consistent with the wider literature, encompassing major prospective cohort studies across different populations. The results for the mortality outcomes commonly assessed in relation to meat intake compared high tertiles of intake to low.

It should be stated that the presentation of the results considered both processed and unprocessed meat intake, and the relative risk reduction for reducing either by three servings per week was similar. The majority of cohort studies assessing the relationship between red meat intake and health outcomes generally have demonstrated differences in effect between processed and unprocessed meat consumption. Overall, processed meat consumption is significantly more strongly associated with CVD, cancer, and all-cause mortality. Unprocessed meat consumption is more of a statistical grey area, as highlighted above in relation to the IARC designation (and which will be discussed further below). 

However, overall, the actual results of the NutriRECS analysis were nothing groundbreaking having regard to the totality of the literature on red meat and health outcomes. The entire premise for the ultimate “guidelines” (in quotations, as NutriRECS is an ad hoc body which had no specific mandate to issue any guidelines) was therefore the certainty rating of the results.

Point 2 - The Certainty Rating

The certainty rating is the core feature of these publications underpinning the subsequent recommendations. The studies included were rated using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) criteria. However, a critical feature of the GRADE system is that it is based on biomedical interventions, in particular pharmacological interventions. It emphasises randomised controlled trials with a high degree of internal validity, and has a default rating of ‘low-certainty’ for observational research. A key point in relation to assessing epidemiological data using the GRADE system was highlighted by the authors in their introduction:

"According to the GRADE approach, observational studies may provide moderate- or high-certainty evidence if they show a large magnitude of effect or a dose-response gradient and when suspected biases work against the observed direction of effect. Observational studies without these characteristics provide low-certainty evidence..."

Therefore, the ability to deem an observational study ‘moderate-high certainty’ in the GRADE system is conditional upon ‘rating up’ the study from the default grade of ‘low certainty’ which the system applies to prospective cohort studies, based on two criteria:

  1. large magnitude of effect.
  2. dose-response gradient. 

First, the magnitude of effect required to rate up an observational study using GRADE is huge: a relative risk of 2.0-5.0 (or, for protective associations, 0.2-0.5), with no plausible confounders. However, in nutritional epidemiology, many prospective cohort studies result in a relative risk in a range of 0.8-1.2, or even 0.9-1.2. This often reflects the fact that the exposure of interest captured by dietary assessments is average intake over time, in relation to diseases with long latency periods. It is rare that significant numbers in a cohort would have such high levels of protective or detrimental foods/nutrients of interest that the magnitude of effect would be as high as warranted for an observational study to be ‘rated up’ according to GRADE. This is also critical for the second criterion, the dose-response.

In nutrition, the dose-response is dependent on the unit of exposure: e.g., 10 servings of vegetables and fruit per day will yield a more pronounced RR, and reduction in risk, compared to 5, but if the average intake in the cohort is 1-5 servings, this will yield a flatter dose-response. Therefore, while it is technically possible that a prospective cohort study, as the mainstay of nutritional epidemiology, could be ‘rated up’ using GRADE, the reality is that the criteria required to rate up such a study still result in a bias against observational nutrition research. Rating each result in the study as “low-certainty” was, in effect, an inevitability of using the GRADE criteria. This is critical for the final issue: the guidelines formulated as a result of the publications.

Point 3 - The Purported Guidelines

In substance, the purported guidelines are not based on the results; they are based on the certainty rating. The authors speak for themselves on this matter in the publication reviewing randomised controlled trials:

"Our results from the evaluation of randomized trials do not support the recommendations in the United Kingdom, United States, or World Cancer Research Fund guidelines on red meat intake (8–10). One could argue, however, that neither do they seriously challenge those recommendations... Our results highlight the uncertainty regarding causal relationships between red meat consumption and major cardiometabolic and cancer outcomes.” [Emphasis added]

One could argue, however, that if the results do not seriously challenge public health recommendations, the justification for the guidelines produced from this analysis is difficult to either comprehend or substantiate. Nearly 25 years ago, Rose outlined the difference between a) precision, individualised nutrition, and b) public health nutrition. The fundamental goal of public health nutrition is to shift the distribution of the bell-curve of risk in the whole population from higher risk categories to lower risk categories (see image below). Individual risk reduction, by contrast, involves moving high-risk individuals into a normal risk range. Given the consistent associations between red and processed meat intake and other unhealthy lifestyle habits - in particular smoking rates, BMI, alcohol intake, and lower vegetable, fruit, and fibre intake - the fact that the NutriRECS guidelines focused on individual-level recommendations to not reduce meat consumption arguably violates the precautionary principle, if we consider high meat consumers to be high-risk individuals due to the composite of their lifestyle behaviours.

This is because at the whole-population level the guiding principle is the precautionary principle, which is in effect a risk-management tool. Fundamentally, the precautionary principle is applicable in situations characterised by a margin of uncertainty. This is distinct from prevention, which is specific to known causes. Errors in epidemiological research typically skew risk estimates toward a null association. Thus, the fact that the actual findings were consistent with the wider literature was sufficient alone for the application of the precautionary principle. The uncertainty in a complex area, having regard to the precautionary principle, would generally translate into a general, population-wide recommendation to reduce and/or replace red meat, not into using research uncertainty to recommend continuing a factor which may carry a margin of risk. The NutriRECS guidelines were, it seems, focused on the wrong framework of ‘certainty’ for this context.

In addition, much has been made of the difference between absolute risk and relative risk, as the study authors presented the absolute difference per 1000 person-years. For example, with colorectal cancer, the absolute difference for reducing meat intake by two-thirds was stated to be 1 fewer diagnosis per 1000 persons. Results like these are offered as evidence of minimal benefit to reducing meat intake; however, it is possible to be selective in downplaying the outcomes. For example, the difference for overall cancer incidence was stated to be 18 fewer diagnoses per 1000 persons; 10 fewer for CVD; and 15 fewer for all-cause mortality. These numbers are not trivial when scaled up to the whole-population level.
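The relationship between a relative risk and the absolute difference it implies is simple arithmetic, and depends entirely on how common the outcome is. A minimal sketch in Python, using hypothetical baseline rates rather than the study's actual figures:

```python
def absolute_difference_per_1000(baseline_per_1000, relative_risk):
    """Absolute risk difference per 1000 people implied by a relative risk.

    baseline_per_1000: events per 1000 people in the lower-intake group.
    relative_risk: risk in the higher-intake group relative to that baseline.
    """
    return baseline_per_1000 * (relative_risk - 1.0)

# Hypothetical illustration: the same 1.20 relative risk translates into very
# different absolute differences depending on how common the outcome is.
rare_outcome = absolute_difference_per_1000(5.0, 1.20)     # 1 extra case per 1000
common_outcome = absolute_difference_per_1000(90.0, 1.20)  # 18 extra cases per 1000
```

The same relative risk therefore looks trivial for a rare outcome and substantial for a common one, which is why selective emphasis on either metric can mislead.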

This fundamental difference between individual nutrition and public health nutrition is where there can be a disconnect in interpreting differences between absolute risk and relative risk. While a percentage increase in relative risk may seem to inflate a relationship, an absolute risk difference may seem to diminish it when scaled up to the entire population. The critical point here from a whole-population perspective is that risk is not homogenous across populations. Even where the cause-effect relationship - i.e., meat consumption and a particular health outcome - is nominally the same, the risk between the exposure and outcome is relative to the characteristics of the population studied: background diet, wider lifestyle factors, environmental exposures, etc. Relativity is therefore a defining feature of risk for the same variable. Thus, the effect of the same exposure is not constant across populations.

This is evident in the fact that, for example, US cohorts tend to exhibit much greater increases in risk for adverse health outcomes from meat consumption, in particular processed meat, compared to European or Asian cohorts. Keeping with the colorectal cancer example, stating that the absolute difference in risk is 1 fewer diagnosis from reducing meat consumption assumes that the effect of that dietary change - the change in exposure - is the same across populations, when this is not the case.

While we should therefore be guarded against the inflated exaggerations in media headlines based on relative risk, we should be equally guarded in misconstruing absolute risk, and overemphasising an individual-level risk reduction equation with whole-population, public health nutrition approaches to shift the entire distribution of risk to a lower overall risk category. The fact that the ultimate results were still consistent with the totality of literature, which suggests a benefit to reducing overall meat consumption for a range of health outcomes - albeit with wide variance in the magnitude of that effect relative to different outcomes - should be sufficient for the whole-population approach to be congruent with that overall body of evidence. 

Strength of Associations in Prospective Cohort Studies

Limitations of meta-analysis in nutrition research can render true comparisons between an exposure and an outcome difficult to elucidate, as most - if not all - meta-analyses are focused on obtaining a single summary estimate of effect when, as stated above, the effect of the same exposure is not homogenous across populations. Additional potential methodological hindrances to effective comparisons in nutritional epidemiology are a lack of sufficient contrast in exposures (i.e., low variability in intake of an exposure of interest) and a lack of sufficient numbers in the cohort to account for the degree of measurement error inherent in dietary assessment methods.
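The "single summary estimate" a meta-analysis produces is typically an inverse-variance weighted average of the study-level effects. A minimal fixed-effect sketch, with hypothetical relative risks and standard errors, shows how two quite different cohort effects collapse into one intermediate number, masking the heterogeneity discussed above:

```python
import math

def fixed_effect_summary(rrs, ses):
    """Inverse-variance weighted (fixed-effect) summary of relative risks.

    rrs: per-study relative risks; ses: standard errors of the log(RR)s.
    """
    weights = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * math.log(rr) for w, rr in zip(weights, rrs)) / sum(weights)
    return math.exp(pooled_log)

# Hypothetical example: a stronger association in one cohort (RR 1.30) and a
# weaker one in another (RR 1.05), with equal precision, pool to roughly 1.17.
summary = fixed_effect_summary([1.30, 1.05], [0.05, 0.05])
```

The summary sits between the two inputs, which is precisely the problem when the true effect genuinely differs between populations.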

Two prospective cohort studies in particular have been specifically designed to account for these two potential limitations: 

  1. the National Institutes of Health-American Association of Retired Persons Diet and Health Study (NIH-AARP)
  2. the European Prospective Investigation into Cancer and Nutrition (EPIC) study.

The NIH-AARP cohort included up to 545,653 participants in analysis, with a range of red meat intake from 9.3g/1000kcal to 62.5g/1000kcal. For a 2,500kcal diet, this would equate to an absolute intake ranging from about 23g per day at the low end to 156g per day at the upper end. Processed meat ranged from 5.1g/1000kcal to 19.4g/1000kcal. At 10 years of follow-up, there were 47,976 deaths in men and 23,276 in women.
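The conversion from energy-adjusted intake (g per 1000kcal) to absolute intake is straightforward arithmetic; a small sketch assuming the 2,500kcal diet used above:

```python
def density_to_absolute(g_per_1000kcal, daily_kcal=2500):
    """Convert an energy-adjusted intake (g per 1000 kcal) into absolute g/day,
    assuming a fixed daily energy intake (2,500 kcal, as in the text above)."""
    return g_per_1000kcal * daily_kcal / 1000.0

low = density_to_absolute(9.3)    # 23.25 g/day, i.e. ~23g
high = density_to_absolute(62.5)  # 156.25 g/day, i.e. ~156g
```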

Comparing the highest vs. lowest intake of red meat, the analysis found:

In men:

  • a 31% relative risk increase for all-cause mortality (95% CI 1.27-1.35)
  • a 22% relative risk increase for cancer mortality (95% CI 1.16-1.29)
  • a 27% relative risk increase for cardiovascular mortality (95% CI 1.20-1.35)

In women:

  • a 36% relative risk increase for all-cause mortality (95% CI 1.30-1.43)
  • a 20% relative risk increase for cancer mortality (95% CI 1.12-1.30)
  • a 50% relative risk increase for cardiovascular mortality (95% CI 1.37-1.65).

Analysis of diet according to high-risk (past/current smoker) vs. low-risk (never-smoker) status yielded overall similar results, with never-smokers showing similar increases in mortality risk with higher red and processed meat intake as past/current smokers. It is important to note that these are relative risks; however, the confidence intervals for each result suggest the probability of at least a minimum increase in risk for all outcomes with higher intake of red and processed meat in this cohort.

Clear overall dietary pattern relationships were identified, with the highest quintile* of red meat intake corresponding to:

  • higher total energy intake
  • higher total fat intake and saturated fat intake
  • less fibre
  • lower vegetable and fruit intake
  • lower education status
  • higher body mass index (BMI)
  • higher likelihood of current smoking status.

[*A quintile being any of five equal groups into which a population can be divided according to the distribution of values of a particular variable. So in this case the highest quintile refers to the top 20% of the population for red meat intake.]
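The quintile split described above can be illustrated with Python's standard library, using hypothetical intake values:

```python
import statistics

def quintile_cutpoints(values):
    """The four cut points dividing a distribution into five equal-sized groups."""
    return statistics.quantiles(values, n=5)

# Hypothetical red meat intakes (g/day) for ten people:
intakes = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
cuts = quintile_cutpoints(intakes)

# The highest quintile (top 20% for intake) is everyone above the fourth cut point:
highest_quintile = [x for x in intakes if x > cuts[-1]]  # the top two of ten people
```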

A common misconception on reading such a list of covariates is to assume that all are confounders. However, this is incorrect; there are distinct differences between confounders (e.g., smoking) and moderating or mediating factors (e.g., fibre, fruit). A general lack of understanding of the differences between such variables is widespread in discourse surrounding nutritional epidemiology. A fundamental difference is that a confounder may have a direct relationship with the outcome, while a moderating factor may influence the size of an effect within a cause-effect relationship. A moderating factor does not, however, invalidate that a relationship exists between the exposure and the outcome.

A 2017 paper from the same cohort (with a median follow-up of 15.6 years and 7.5 million person-years of follow-up, in which 128,524 participants died) found that comparing the highest vs. lowest intake of red meat resulted in a 26% relative risk increase (95% CI 1.23-1.29). This updated analysis examined the associations between mediators - in particular heme iron, processed meat nitrates, and processed meat nitrites - finding that:

  • Heme iron mediated 22.8% of the effect between processed meat and cancer
  • Heme iron mediated 24.1% of the effect between processed meat and CVD
  • Nitrates mediated 37.0% of the effect between processed meat and cancer
  • Nitrates mediated 72.0% of the effect between processed meat and CVD

These results indicate a stronger mediating effect of nitrates in processed meat than of heme iron.

For unprocessed red meat, heme iron exhibited a stronger mediating effect, statistically accounting for 32.7% of the relationship with cancer and 14.3% of the relationship with CVD. Substitution analyses indicated that, across the board, the replacement of red meat with unprocessed white meat yielded reductions in risk for all mortality outcomes analysed. Notably, given the above-mentioned tendency to conflate mediating factors with confounders, in sensitivity analyses accounting for variables like BMI, alcohol, smoking status, fruit and vegetable intake, baseline health status, and vitamin supplements, the main effects described above remained statistically significant.

We will stay focused on US cohorts to develop this picture. While the Nurses' Health Study (NHS) and Health Professionals Follow-up Study (HPFS) lack the sample size of the NIH-AARP, their dietary assessment methods are among the most robustly validated in nutritional epidemiology. Pooled analysis of both cohorts included 22 years of follow-up in the HPFS and 28 years of follow-up in the NHS, for an overall 2.96 million person-years of follow-up in which 23,926 deaths occurred. Rather than a meta-analysis, a pooled analysis combines all of the primary data from the included studies, providing increased statistical power to detect associations. The pooled analysis of these studies resulted in an 18% relative risk increase for CVD mortality for unprocessed red meat (95% CI 1.13-1.23), and 21% for processed red meat (95% CI 1.13-1.31). For cancer mortality, there was a 10% relative risk increase for unprocessed red meat (95% CI 1.06-1.14), and 16% for processed red meat (95% CI 1.09-1.23). The fact that adjustment for heme iron attenuated the associations is consistent with the mediating effect of this dietary constituent observed in the NIH-AARP study.

In this pooled analysis, the highest intake of red meat in the HPFS was 176g/d vs. the lowest of 21.2g/d, while in the NHS the highest quintile was 184.5g/d vs. the lowest of 43.4g/d. Thus, the magnitude of exposure contrast was broadly similar between the NIH-AARP, NHS, and HPFS studies. Further, the net effect of measurement error in epidemiology is to bias results toward the null; that is, measurement error results in underestimations of effect, pushing results toward ‘no association’. In both the NIH-AARP and the NHS & HPFS cohorts, analyses accounting for measurement error in fact increased the strength of the association. The same covariates were observed for higher red meat intake: lower fruit, vegetable, and wholegrain intake, higher total energy, lower physical activity, higher BMI, and greater likelihood of current smoking and alcohol intake. However, consistent with the NIH-AARP, the strength of the association in the NHS and HPFS studies survived adjustment for these factors.
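This attenuation toward the null - often called regression dilution - can be illustrated with a small simulation. This is a deliberately simplified sketch (a continuous exposure and a linear model, not the cohorts' actual analyses): random error in the measured exposure roughly halves the estimated association here, even though the underlying relationship is unchanged.

```python
import random

def ols_slope(xs, ys):
    """Ordinary least squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

random.seed(42)
n = 20_000
true_exposure = [random.gauss(0, 1) for _ in range(n)]
outcome = [0.5 * x + random.gauss(0, 1) for x in true_exposure]  # true slope = 0.5

# Dietary assessment records the exposure with random error:
measured = [x + random.gauss(0, 1) for x in true_exposure]

slope_true = ols_slope(true_exposure, outcome)   # close to 0.5
slope_measured = ols_slope(measured, outcome)    # attenuated toward ~0.25
```

With equal variance in the true exposure and the measurement error, the expected attenuation factor is 0.5, which is why correcting for measurement error strengthens, rather than weakens, the observed associations.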

A salient feature of diet-disease associations is that risk is generally cumulative, and characterised by the relationship between:

  1. dose (level of intake)
  2. duration of exposure

These factors indicate that in US cohorts there is a consistent relationship with the exposure: higher average intakes of red meat over time are consistently associated with increased risk for cardiovascular and cancer mortality outcomes in particular. While these cohorts are generally characterised by wider unhealthful dietary and lifestyle characteristics, these moderating factors do not result in a null relationship between the exposure and outcome. This indicates that, based on adjusted models, the dietary association is independent of non-dietary lifestyle factors. Secondly, while the potential mediating effects of other dietary constituents are relevant, as we will see in the European and Asian cohorts, there appears to be an effect of dose evident in the US cohorts, as potential mediating factors - e.g., dietary fibre, vegetable, and wholegrain intake - did not display large variance across quintiles of red meat intake.

The EPIC cohort was designed to account for the narrow variance in dietary exposures across populations, and to have a large sample size to account for potential measurement error. The cohorts were spread across 23 centres in 10 European countries: France, Italy, Spain, the Netherlands, Germany, Greece, Sweden, Norway, Denmark, and the United Kingdom. Dietary assessment was conducted at baseline and was country-specific, with each dietary assessment having been validated in that specific population as part of the EPIC methodology. Food-frequency questionnaires (FFQs) were primarily utilised, although certain countries employed direct interviews with participants. To calibrate the accuracy of the FFQs, a random sample of 8% of the overall cohort across all centres completed a computerised 24-hour diet recall, which was compared to the FFQ completed by the same people in the sample.

In total, 448,568 participants were included for analysis, with a median follow-up of 12.7 years in which 26,344 deaths occurred. While no significant associations were found for unprocessed red meat intake, there were significant associations for processed meat consumption: a 30% relative risk increase for CVD per 50g/d (95% CI 1.17-1.45), and an 11% relative risk increase for cancer per 50g/d (95% CI 1.03-1.21).

In relation to unprocessed meat, a feature of the EPIC cohort overall is a relatively low median intake of 51g/d, reflecting a range of country medians from 20g/d in Sweden to 70g/d in Denmark. In the overall cohort, processed meat consumption is sometimes greater than, or equivalent to, unprocessed meat consumption, which may reflect the fact that dietary pattern factor analysis indicates a dietary pattern characterised by ‘pork, processed meat, and potatoes’ is prevalent across certain European populations. The median intake in the highest category of unprocessed meat consumption - 110.8g/d in men and 70.9g/d in women, based on the validation studies - indicates a significantly lower overall level of intake for the highest category compared to counterpart US cohorts.

The overall EPIC cohort has also been examined specifically in relation to ischemic heart disease (IHD), including 409,885 participants, 7,198 cases of non-fatal MI or fatal IHD, and a mean follow-up period of 12.6 years. Comparing the top quintile of combined red and processed meat intake (median 138g/d) to the lowest, there was a 19% relative risk increase for IHD (95% CI 1.06-1.33), but only borderline significant associations for either subcategory alone.

However, an aspect of the statistical analysis in this paper warrants comment: the association was strengthened upon exclusion of the first 4 years of follow-up, with a 25% relative risk increase per 100g/d increment in red and processed meat intake (95% CI 1.09-1.42). The practical significance of this finding is that, in nutritional epidemiology, strengthening of a relationship upon exclusion of early follow-up suggests that short-term intervention studies may not be informative for elucidating the true underpinnings of the relationship. It implicates a particular temporal relationship between the exposure and outcome, and thus caution is warranted in the often naive and over-simplistic extrapolation of short-term intervention studies as evidence for a lack of effect of red meat on health outcomes that may be mediated by more latent, long-term factors.

For example, the oft-cited Beef in an Optimally Lean Diet (BOLD) trial examined the effects of high meat intake (142g/d) on cardiovascular risk factors in the context of a diet containing 6% saturated fat, finding a reduction in atherogenic blood lipids over 12 weeks. However, such a reduction could be expected from that level of saturated fat content. More particularly, the long-term cohort studies implicate factors like heme iron, preservative compounds like nitrates and nitrites, and byproducts of cooking methods such as heterocyclic amines (HCAs), polycyclic aromatic hydrocarbons (PAHs), and N-nitroso compounds (NOCs). Short-term interventions with cardiovascular risk factors as endpoints provide no extrapolation to the potential cumulative effect of exposure to such dietary compounds over time.

To illustrate this point, a controlled intervention in humans found that 300g/d red meat consumption was associated with increased formation of NOC, an effect that was inhibited by increasing concentrations of the short-chain fatty acid butyrate, achieved by supplementing the red meat diet with 40g/d high-amylose maize starch. Such a level of supplemental fibre intake is not reflective of habitual dietary consumption. While this suggests dietary fibre may be a moderating factor, it does not invalidate the potential causal relationship between the exposure and outcome.

Red Meat & Type-2 Diabetes

While CVD and cancer are both common outcomes of interest, type-2 diabetes (T2DM) also exhibits a relationship with red meat intake. The EPIC-InterAct study, a case-cohort nested within the EPIC study investigating the relationship between lifestyle and T2DM, included a sub-group of 15,258 participants selected from the overall EPIC cohort. Mean total meat intake ranged from 50.4g/d in the lowest quintile to 186.0g/d in the highest quintile. In this sub-cohort, there were modest but significant associations with T2DM:

  • an 8% relative risk increase for total meat (95% CI 1.04-1.11)
  • a 7% relative risk increase for unprocessed red meat (95% CI 1.02-1.13)
  • a 12% relative risk increase for processed meat (95% CI 1.05- 1.18)

Country-specific effects were also observed, with greater effect sizes in Sweden, the UK, Spain, and Italy, compared to other countries.

In the US cohorts, pooled analysis of the NHS, NHS-2, and HPFS indicated:

  • a 14% relative risk increase for total meat (95% CI 1.11-1.18)
  • a 12% relative risk increase for unprocessed red meat (95% CI 1.08-1.16)
  • a 29% relative risk increase for processed meat (95% CI 1.22-1.37)

The greater magnitude of effect size in US cohorts compared to European cohorts is therefore also evident for T2DM. Similar to the relationships between CVD and cancers, this relationship may reflect the cumulative effect of dose and duration of exposure, with up to 20-years follow-up in the HPFS, and 28-years follow-up in the NHS.

However, it should be stated that the associations in the EPIC cohorts overall are more moderate, and the confidence intervals wider, than those observed in US cohorts. This may reflect differences in cooking methods, with grilling/barbecuing more prevalent in the US, as well as the generally higher levels of intake (dose) and longer follow-up periods (duration) in the US cohorts.

Comparisons with Asian Cohorts

Asian cohorts also exhibit a difference in risk for the same exposure, and are generally characterised by low levels of intake. In women, intakes range from as low as 9.9g/d to 50.9g/d (in the Ohsaki National Health Insurance Cohort Study and the Shanghai Women's Health Study, respectively). In men, intakes range from 14.2g/d to 92.3g/d (in the Japan Public Health Centre-Based Prospective Cohort Study and the Seoul Male Cohort Study, respectively).

In comparing high versus low levels of intake, no significant associations were noted in pooled analysis of 8 Asian cohorts between red meat intake and risk for all-cause mortality, CVD mortality, or cancer, in men or women. To reiterate a point made above, the effect size in nutritional epidemiology is influenced by having an appropriate contrast in exposure. What the Asian cohorts indicate is that comparing narrow contrasts in exposure results in no association, or even inverse associations. These narrow contrasts occurred in the Asian cohorts because the 'high' levels of intake are still, in absolute terms, low. This again suggests that dose is an important factor in assessing the overall evidence base, and that moving between lower levels of intake - in this case less than 100g/d - may not increase risk. 

Further illustration of this point may be seen in the Japan Collaborative Cohort Study (JACC), an analysis investigating the associations between red meat intake and cardiovascular disease (51,683 people with 820,076 person-years of follow-up). That analysis compared the highest to the lowest quintile of red meat intake, giving comparisons of:

  • Men = 77.6g/d vs. 10.4g/d
  • Women = 59.9g/d vs. 7.5g/d

They found that high red meat intake (compared to low) was not associated with ischemic heart disease mortality, stroke mortality, or total CVD. This lack of association may reflect either: a) inadequate exposure contrast, or b) a true lack of effect of lower doses of intake.
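The exposure-contrast point can be made numerically. As a back-of-envelope sketch (not a formal power calculation), if the 1.25-per-100g/d estimate from the EPIC IHD analysis above held universally under a log-linear dose-response, the JACC quintile contrast would only be expected to produce a small relative risk, of the kind that easily fails to reach significance:

```python
import math

def expected_rr(rr_per_100g, high_g, low_g):
    """Expected relative risk across a given exposure contrast, assuming a
    log-linear dose-response with a hypothesised RR per 100 g/d increment."""
    return math.exp(((high_g - low_g) / 100.0) * math.log(rr_per_100g))

# JACC men: top vs bottom quintile of red meat intake (77.6 vs 10.4 g/d).
# Even if the 1.25-per-100g/d estimate held, the expected contrast is modest:
print(round(expected_rr(1.25, 77.6, 10.4), 2))  # ~1.16
# versus an illustrative wide Western-style contrast of 150 vs 20 g/d:
print(round(expected_rr(1.25, 150, 20), 2))     # ~1.34
```

This does not adjudicate between a true null at low doses and insufficient contrast; it only shows why the two explanations are hard to separate in cohorts with uniformly low intakes.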

The foregoing section illustrates a number of points, with particular relevance for nutritional epidemiological methods:

  1. The effect of the same exposure - the risk - is not homogeneous across populations.
  2. Narrow variation in dietary exposures can render true associations difficult to detect.
  3. Comparing ‘high vs. low’ requires a sufficiently wide contrast in levels of intake of the exposure of interest.
  4. Large cohorts have more power to account for measurement error.
  5. If an association strengthens after exclusion of the early follow-up period, it indicates that short-term interventions may not be informative as to the true nature of the temporal relationship, and the factors influencing the underlying disease aetiology.

In accounting for these factors, and by examining cohorts with different levels of exposure contrast, the following points emerge:

  1. The relative risk of red meat consumption appears most pronounced in US cohorts (characterised by higher relative risks and narrow confidence intervals).
  2. Increased risk is observed in European cohorts, but the overall magnitude of effect is weaker, and confidence intervals wider.
  3. This may reflect a difference in both dose and duration of exposure; median intakes in Europe are substantially lower than US cohorts.
  4. The potential for differences in dose to be a factor is corroborated by data from Asian cohorts, where exposure comparisons are made between intakes of all less than 100g/d. This may reflect a genuine lack of effect of the dose, or a lack of sufficient exposure contrast to detect an effect.
  5. Large cohorts with considerable person-years follow-up and wide exposure contrasts consistently find an association.
  6. Short-term interventions focused on traditional risk factors, in particular blood lipids and blood pressure, are not informative of the potential relationship between meat preservatives, cooking byproducts, and heme iron, on related health outcomes, and are over-extrapolated.

There may inevitably be arguments put forth that these associations are observed in participants who tend to have higher BMI, smoking rates, and alcohol intake, and lower vegetable and fruit intake. In an attempt to pre-emptively address such an argument, two points warrant making. First, such factors are well-established variables to account for in statistical analysis. Lifestyle factors are controlled for using multivariate statistical analysis, and these associations remain after adjustment for those factors. Secondly, it is important to reiterate the distinction between a related dietary variable acting as a moderating factor rather than a confounder. Moderating factors generally influence the size of an effect. While wholegrain, vegetable, and fruit intakes may all moderate the relationship between red meat and outcomes, these factors do not in and of themselves invalidate that a relationship exists between the exposure and outcome. 

Comparisons to the Main Vegetarian Cohort Studies

It may be instructive to characterise the relationship between red meat and associated health outcomes relative to the main vegetarian cohort studies. In the EPIC-Oxford cohort, there was no significant difference between vegetarians and non-vegetarians in:

  • overall mortality
  • ischemic heart disease mortality
  • cancer mortality
  • cerebrovascular disease mortality

However, an issue with many of the vegetarian prospective cohort studies has been the rather crude, dichotomous definitions of diet. Defining a diet as simply vegetarian or non-vegetarian may not account for the array of dietary patterns encompassed in these broad definitions.

The overall lack of strong differences in overall risk perhaps reflects that even the ‘meat eaters’ reference group (i.e., the group the vegetarian diets were compared to) did not differ in many characteristics to the extent that one might think. Fruit and vegetable intake was slightly higher in vegetarians, but overall relatively similar between groups (<20% difference). In addition, the dose component may be relevant; median intake of meat was 78.1g/d and 69.7g/d in men and women, respectively. It may mean that the EPIC-Oxford cohort is unlikely to represent a true test of the health effects of high vs. low meat diets. Similar to the effects noted in the Asian cohorts, it may be that there is an inadequate exposure contrast, or a true lack of effect of low levels of red meat consumption. The data from EPIC-Oxford taken together with the Asian cohorts indicates the latter may be a more accurate representation.

The other major Western cohort comparing non-vegetarian (i.e., 'meat eater') to vegetarian diets is the Adventist Health Study-2 (AHS-2), which has one of the largest comparisons between groups, with 42,500 non-vegetarian and 53,500 vegetarian participants. While the EPIC-Oxford cohort is generally healthier than the UK population from which it is selected, it is not comparable in overall health characteristics to the AHS-2. A characteristic of the AHS-2 non-vegetarians is an overall better diet quality than the comparative EPIC-Oxford non-vegetarians, with 30.4g/d fibre on average, and overall greater wholegrain, vegetable, and fruit intake. Thus, for the contention that low levels of these factors in higher-meat diets in other cohorts constitute a 'confounder', the effects of meat-inclusive vs. non-meat diets may be more testable in this cohort.

Meat intake in the AHS-2 averages 3 times per week. Compared to the non-vegetarian group, vegetarians have lower CVD mortality, but not cancer mortality. An interesting feature of the AHS-2 is that the significant findings are observed primarily in men, while in women many of the associations with reduced mortality are not significant. This discrepancy between men and women was also noted in the EPIC-InterAct cohort, which assessed outcomes relative to iron intake, suggesting that differences in iron status between men and women may in part explain the divergent associations between sexes. This potential factor warrants further investigation. 

Overall, the AHS-2 lends greater support to a reduction in incidence of breast, colorectal, and prostate cancers, and all-cause mortality, for non-meat diets compared to meat-inclusive diets, effects which are not observed in the EPIC-Oxford cohort. A feature of both EPIC-Oxford and AHS-2 is that dietary patterns defined by fish and/or dairy intake are associated with more positive health outcomes compared to meat-inclusive diets. This is consistent with the prospective cohort studies cited above, and with the effects noted either in comparative analyses or where the replacement of red meat with white meat has been modelled.

A further comparison with traditionally vegetarian populations may be seen in certain of the Asian prospective cohort studies, where many of the associations observed in Western cohorts are either not observed, or the inverse has been observed. For example, diets characterised as vegetarian in India are not necessarily representative of a health-promoting dietary pattern, with high amounts of fried foods and desserts consumed. In addition, the distinctly low levels of meat intake render comparisons between high vs. low levels essentially uninformative.

Taken together, the main vegetarian cohort studies are more heterogeneous than often portrayed, and certainly do not support a binary construct of the healthfulness of a diet being defined by the presence or absence of animal produce, given the associations between, in particular, fish and dairy consumption. However, our present analysis is focused on the effects of red meat, and ultimately, the vegetarian cohorts in Western populations are generally characterised by a ‘healthy user bias’ in the cohort demographics, such that meat intake is either low in dose (EPIC-Oxford) or infrequent (AHS-2), rendering true comparisons and associations more difficult to detect.


The goal of this Sigma Statement is not to resolve what any individual chooses to do with their diet. The goal is merely to present an appraisal of the evidence, objectively. Certainly, the general caveat that residual confounding will never be ruled out applies. However, this is not the limitation many think it to be, because risk in the population is always a composite sum of factors. The relevant question, echoing Rose, is how to shift the bell curve of risk across the population to a lower overall risk.

The reality of the data is that cohorts with sufficiently large sample sizes to account for measurement error, and with sufficient exposure contrasts between high vs. low levels of intake, consistently find an association between red meat and worse outcomes, in particular, cardiovascular disease and cancers. This is not to suggest a causal relationship. However, we submit that the short-term intervention studies on traditional intermediate risk factors are insufficient for providing insight into the potential mediating factors (preservatives, cooking byproducts, heme iron) that may underpin a more long-term temporal relationship between the exposure and outcome. It appears from the strength of the association, particularly in US cohorts, that ‘high’ may be defined as >130g/d, and lower levels - certainly <100g/d - do not exhibit such associations, in either European or Asian cohorts.

Underneath the summary of key points section below is a table summarising the conclusions of this statement, based on the Bradford-Hill criteria. As our assessment of the evidence in relation to processed meat accords with the IARC and the wider body of evidence supporting causal inference for high (>50g/d) processed meat consumption and adverse outcomes, in particular cardiovascular disease, cancer, and type-2 diabetes, the table focuses specifically on red (unprocessed) meat.

Summary of Key Points

    1. The distinction between unprocessed and processed meats is an important aspect of evaluating the evidence related to health outcomes.
    2. In relation to red meat and health outcomes, emphasis on mechanistic studies may have over-inflated the relationship between the exposure and outcome by excluding the context of a whole diet pattern.
    3. Risk is not homogeneous across populations. Even where the relationship between meat consumption and a particular health outcome is the same, the risk between the exposure and outcome is relative to the characteristics of the population studied.
    4. The fundamental goal of public health nutrition is to shift the distribution of the bell-curve of risk in the whole population from higher risk categories, to lower risk categories. [Compared to individual risk reduction, which involves moving high-risk individuals into a normal risk range]
    5. Short-term interventions focused on traditional risk factors, in particular blood lipids and blood pressure, are not informative of the potential relationship between meat preservatives, cooking byproducts, and heme iron, on related health outcomes, and are over-extrapolated.
    6. In diet-disease associations, risk is cumulative, and characterised by the relationship between dose (level of intake) and duration of exposure.
    7. Comparing “high vs. low” requires a sufficiently wide contrast in levels of intake of the exposure of interest. It appears from the strength of the association, particularly in US cohorts, that ‘high’ may be defined as >130g/d. Comparing various intakes at lower levels (e.g. <100g/d) does not exhibit strong associations.
    8. The relative risk of red meat consumption appears most pronounced in US cohorts, characterised by higher relative risks and narrow confidence intervals. Increased risk is observed in European cohorts, but the overall magnitude of effect is weaker, and confidence intervals wider.
    9. The reality of the data is that cohorts with sufficiently large sample sizes to account for measurement error, and with sufficient exposure contrasts between high vs. low levels of intake, consistently find an association between red meat and health outcomes, in particular, cardiovascular disease and cancers.

    Red (Unprocessed) Meat & Health: Current Conclusions
    Bradford-Hill Criteria


    Strength of Association

    How strong is the association? Strength is not merely the size of effect, but relative to differences in the exposure and how that relates to the outcome.

    The association is strongest with wide exposure contrasts [>130g/d vs. <20g/d], indicating, at minimum, an increase in risk comparing high vs. low levels of intake. No evidence of reduced risk with higher levels vs. lower.


    Consistency

    Is the association observed in different populations or circumstances? And, whether multiple lines of evidence are consistent in supporting the association between exposure and outcome.

    Associations with CVD and T2DM are observed in both North American and European cohorts, while associations with cancer are not evident in European cohorts. When sufficient exposure contrast exists, in particular for CVD and T2DM, comparing high vs. low intake, the results are consistent across populations.


    Temporality

    Does the exposure precede the outcome? An important factor is whether the association changes over time.

    Participants in the major prospective cohort studies are disease-free at baseline, and diet is assessed prior to disease onset [minimising potential for recall bias, reverse causality, and selection bias]. The association is temporal, and the strengthening of associations following exclusion of short-term follow-up supports a long-term temporal relationship.


    Specificity

    Is the association limited to a specific population/circumstance? For nutrition sciences, this criterion is a consideration of whether the exposure is 'sufficient cause' to bring about the outcome.

    The association is not specific to any population. Population characteristics related to high levels of consumption include a range of similar lifestyle and dietary factors across populations; the associations survive adjustment for these factors, indicating a degree of sufficiency/specificity in the exposure-outcome relationship.

    Biological Gradient

    Does increasing the exposure [dose] increase the effect? A dose-response curve supports a causal inference.

    Cohorts with sufficiently wide exposure contrasts indicate an increasing relative risk, with tight confidence intervals, with increasing dose of exposure. Lower doses [<100g/d] do not appear to be associated with the same outcomes.


    Plausibility

    Does existing knowledge indicate a plausible biological mechanism? Assess the potential association against the biological plausibility of the exposure contributing to the outcome.

    The various effects of heme iron, nitrates, nitrites, and cooking method byproducts, indicate biologically plausible potential mechanisms. These effects may be mediated by other dietary constituents, but nonetheless the biological plausibility itself is present.


    Coherence

    Is the association consistent with wider existing knowledge? Interpretation of the data should be coherent with current knowledge.

    Incorporating the consistency for specific outcomes across populations, and mechanistic studies which support mediation analysis regarding particular dietary constituents [heme iron/nitrates/nitrites], indicate a coherence in the data.

    Experimental Evidence

    What does data from experimental research demonstrate? In the absence of ethically feasible RCTs, epidemiological associations may only be evaluated through illustrating mechanisms and pathways.

    Data from experimental models indicate effects of heme iron, nitrates, nitrites, and cooking method byproducts on factors including carcinogenesis, oxidative stress, lipid peroxidation, and genotoxicity.


    Reversibility

    Does changing the exposure reduce the frequency/incidence of disease? Assessing the impacts of increasing or decreasing the exposure on the outcome.

    The direction of effect for lower vs. higher doses is consistently associated with reduced risk. Substitution analyses modelling the effects of replacement indicate a reduction in risk when red meat is substituted with other dietary protein sources [i.e., fish, white meat, low-fat dairy, plant-protein sources].

    Statement Author: Alan Flanagan, PhD (c)

    Alan is the Research Communication Officer here at Sigma Nutrition. Alan is currently pursuing his PhD in nutrition at the University of Surrey, UK, with a research focus in chrononutrition. Alan previously completed a Masters in Nutritional Medicine at the same institution.

    Originally a lawyer by background in Dublin, Ireland, Alan combines an investigative and logical approach to nutrition together with advocacy skills to communicate the often complicated world of nutrition science, and is dedicated to guiding healthcare professionals and the lay public in science-based nutrition.
