
Understanding Nutrient Targets

In Part 2 of our five-part micronutrient article series, we discuss nutrient targets: where they come from, what they mean, and how to think about them.

With MacroFactor radically expanding its micronutrient analytics, we thought this would be an opportune time to discuss micronutrients: what they are, what micronutrient targets represent, and considerations for tracking micronutrient intake.

This is part two of a five-part series:
1) Understanding Micronutrient and Essential Nutrient Categories
2) Understanding Nutrient Targets
3) Considerations for Micronutrient Tracking: Precision and Difficulty
4) Which Micronutrients Are Worth Monitoring?
5) Micronutrients are Important, But They Aren’t Everything

Our Knowledge Base also has an archive of additional information about each nutrient you can track in MacroFactor, including what the nutrient actually does, the likelihood of insufficient or excessive intake, and good food sources for each nutrient.

With that out of the way, let’s dive in!

What do all of the Micronutrient Targets Actually Mean?

When you start reading about micronutrients, you’ll encounter a number of acronyms associated with nutrient targets: DRI, DRV, RDA, EAR, AR, PRI, UL, and AI. So, let’s briefly discuss what these acronyms mean, and where micronutrient targets come from.

DRI: Dietary Reference Intake

DRIs refer to all scientifically derived reference values for various nutrients. In other words, “DRI” is the general term to describe all of the other nutrient targets discussed below.

DRV: Dietary Reference Value

DRVs are identical to DRIs. In a trend that will continue below, American health agencies (primarily the United States Department of Agriculture – USDA) and European health agencies (primarily the European Food Safety Authority – EFSA) follow the same basic processes when creating nutrient targets, but they use slightly different terminology.

RDA: Recommended Dietary Allowance

Almost everyone (in the US, at least) has at least a passing familiarity with RDAs. Standard food labels present micronutrients in reference to their RDAs.

You can think of an RDA as a “better safe than sorry” micronutrient target. RDAs are designed to meet the needs of 97-98% of the population. In other words, most people can get away with consuming a bit less than the RDA for each nutrient, and very few people will need to consume more than the RDA.

PRI: Population Reference Intake

PRIs are the European version of RDAs. Much like DRIs and DRVs, RDAs and PRIs are identical concepts, communicating the same information.

EAR: Estimated Average Requirement

EARs are pretty self-explanatory. The EAR is the … estimated average requirement for a particular nutrient. For each nutrient, about half of individuals need to consume a bit more than the EAR for optimal health, and about half of individuals can get away with consuming a bit less than the EAR (assuming that intake needs for each nutrient are normally distributed).

For what it’s worth, I suspect a lot of people think about RDAs or PRIs in the manner they should think about EARs. In other words, I think a lot of people believe that RDAs are the average requirement for each nutrient, but you should really aim to exceed the RDAs. In actuality, EARs are the average requirement. So, if you want to make sure you’re covering your bases, you might want to ensure you exceed the EAR for micronutrients (in case your micronutrient requirements are a bit higher than average), but reaching the RDA or PRI means you’re probably comfortably in the clear.
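If you want to see how those probabilities shake out, here’s a rough sketch in Python. It assumes requirements are normally distributed around the EAR with a 10% coefficient of variation (a common default assumption when variability data are sparse), and the 100mg EAR is made up purely for illustration:

    # Rough sketch (not from any health agency): if individual requirements are
    # normally distributed with mean = EAR, the share of the population whose
    # requirement is covered by a given intake is the normal CDF at that intake.
    from statistics import NormalDist

    def coverage_probability(intake_mg, ear_mg, cv=0.10):
        """Fraction of the population whose requirement is <= intake."""
        sd = cv * ear_mg
        return NormalDist(mu=ear_mg, sigma=sd).cdf(intake_mg)

    ear = 100.0                            # hypothetical EAR, in mg
    rda = ear + 2 * (0.10 * ear)           # RDA is roughly EAR + 2 standard deviations
    print(coverage_probability(ear, ear))  # ~0.50: half the population is covered at the EAR
    print(coverage_probability(rda, ear))  # ~0.98: "better safe than sorry" at the RDA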

AR: Average Requirement

ARs communicate the same information as EARs. Once again, this is just a difference in terminology between the US and Europe. EAR is the USDA’s term, and AR is the EFSA’s term.

AI: Adequate Intake

Americans and Europeans finally agree on an acronym!

AIs are similar to RDAs, PRIs, EARs, and ARs, but they’re a bit less scientifically rigorous. Basically, if you see an RDA, PRI, EAR, or AR, that means that there’s enough research to confidently determine population-level intake requirements for a particular nutrient. If you see an AI, that means that health agencies are confident that the nutrient is necessary for optimal health, but there’s not sufficient research to confidently determine population-level intake requirements for that particular nutrient.

So, to determine an AI, researchers will study the dietary patterns of people who are apparently healthy, and who aren’t experiencing the negative effects that should result from inadequate intake of a particular nutrient.

From there, an AI may be determined in a variety of different ways. Generally, the AI is the average intake of the nutrient in an apparently healthy population. However, AIs are sometimes based on the lowest observed intake at which no signs of insufficiency are observed in a population. Other times, they’re defined against some external standard; for instance, the AI for calcium intake in infants is based on the typical calcium intake infants would achieve if they were exclusively breastfed.

In other words, the exact information being communicated by an AI varies from nutrient to nutrient, but the functional takeaway is the same: if you reach the AI for a particular nutrient, there’s a very good chance that you’re consuming enough of that particular nutrient.

UL: Tolerable Upper Intake Level

Often, there are no known drawbacks of consuming “too much” of a particular nutrient. For some nutrients, however, excessive intake can cause problems that range from mild to severe.

For example, excess vitamin C intake generally just causes a bit of nausea and diarrhea, whereas grossly excessive vitamin A intake can be lethal.

ULs communicate “the maximum daily intake levels at which no risk of adverse health effects is expected for almost all individuals in the general population – including sensitive individuals – when the nutrient is consumed over long periods of time.” (source)

In other words, ULs are quite conservative. They’re determined based on:

  1. The highest continuous intake at which no adverse effects are observed (no observed adverse effect level – NOAEL) or the lowest continuous intake at which rare adverse effects start to be observed (lowest observed adverse effect level – LOAEL)
  2. The severity of the consequences of an excessive intake. ULs are more conservative when the result of excessive intake is more severe (death or long-term damage), and less conservative when the result of excessive intake is less severe (a bit of diarrhea, for instance).
  3. The amount, type, and strength of the evidence used to determine the NOAEL or LOAEL. ULs will be more conservative when the NOAEL or LOAEL is extrapolated from animal research or based on a small handful of case studies, and less conservative when there’s a lot of high-quality human data to confidently establish a firm NOAEL or LOAEL.
(Figure: hypothetical example of risk of adverse effects compared to population intake. The fraction of the population having usual nutrient intakes above the Tolerable Upper Intake Level (UL) is potentially at risk; the probability of adverse effects increases as nutrient intakes increase above the UL, although the true risk function is not known for most nutrients. NOAEL = no-observed-adverse-effect level, LOAEL = lowest-observed-adverse-effect level. Source: Using the Tolerable Upper Intake Level for Nutrient Assessment of Groups)
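To make those three considerations a bit more concrete, here’s a toy sketch of the general approach: divide the NOAEL (or LOAEL) by an “uncertainty factor” that grows as the potential harm gets more severe and the evidence gets weaker. The multipliers and the example numbers below are invented for illustration; actual expert panels choose uncertainty factors case by case.

    # Toy illustration of UL derivation; the uncertainty-factor values are made up.
    def tolerable_upper_intake(noael_or_loael_mg, is_loael=False,
                               severe_outcome=False, animal_data_only=False):
        uncertainty_factor = 1.0
        if is_loael:              # extrapolating downward from a LOAEL toward a NOAEL
            uncertainty_factor *= 3.0
        if severe_outcome:        # irreversible or severe harm warrants more caution
            uncertainty_factor *= 3.0
        if animal_data_only:      # extrapolating from animal studies to humans
            uncertainty_factor *= 10.0
        return noael_or_loael_mg / uncertainty_factor

    # Hypothetical nutrient: LOAEL of 9000mg/day, severe outcome, human case reports
    print(tolerable_upper_intake(9000, is_loael=True, severe_outcome=True))  # 1000.0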

So, you’ll probably be fine if you exceed the UL for a particular nutrient from time to time, but you probably shouldn’t make a habit of it.

Just to illustrate, 100g of raw beef liver contains about 5000mcg of vitamin A, and the UL is just 3000mcg/day. However, acute toxicity generally requires a single dose in excess of 100 times the RDA. The RDA is 900mcg for men and 700mcg for women, so an acute toxic dose is typically at least 70,000-90,000mcg of vitamin A. Furthermore, chronic toxicity generally doesn’t occur unless people consume at least 10 times the RDA (7000-9000mcg/day) consistently for a period of months to years (source).

So, a single serving of liver will probably put you above the UL for vitamin A, but that doesn’t mean that eating a single serving of liver will cause problems for most people. Furthermore, if you’re not at elevated risk for hypervitaminosis A, a daily serving of liver will probably never cause any problems. However, if you are at elevated risk for hypervitaminosis A, and you ate 100g of liver every day, that might cause problems over a period of months-to-years.

Basically, you probably don’t need to call poison control if you slightly exceed the UL for certain micronutrients from time to time. But, if you’re regularly exceeding the UL for a particular micronutrient, it would probably be a good idea to modify your diet so that your normal consumption of the particular nutrient falls below the UL.
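If it helps to see the vitamin A arithmetic laid out explicitly, here’s a quick sanity check using the numbers from the preceding paragraphs (the RDA and UL for men, and the 100g liver serving described above):

    # Restating the vitamin A numbers from the text as a quick comparison
    rda_mcg = 900             # RDA for men, mcg/day
    ul_mcg = 3000             # tolerable upper intake level, mcg/day
    liver_serving_mcg = 5000  # roughly 100g of raw beef liver

    chronic_toxicity_mcg = 10 * rda_mcg   # ~9000mcg/day, sustained for months to years
    acute_toxicity_mcg = 100 * rda_mcg    # ~90,000mcg in a single dose

    print(liver_serving_mcg > ul_mcg)                # True: one serving exceeds the UL
    print(liver_serving_mcg > chronic_toxicity_mcg)  # False: below the chronic-toxicity range
    print(liver_serving_mcg > acute_toxicity_mcg)    # False: nowhere near an acute toxic dose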

LTI: Lower Threshold Intake

LTIs are essentially the inverse of RDAs. Assuming the intake needs for most nutrients are normally distributed, the EAR sits right in the middle of the distribution, the RDA is two standard deviations above the mean, and the LTI is two standard deviations below it. So, about 95% of individuals will have intake requirements between the LTI and the RDA. If you’re meeting or exceeding the RDA of a particular nutrient, there’s a very small chance that your intake is insufficient. Conversely, if you’re barely meeting or falling short of the LTI, there’s a very small chance that your intake is sufficient – only 2-3% of individuals will have nutrient needs below the LTI.
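Here’s what that looks like numerically, using round, calcium-like numbers (an EAR of 750mg and a standard deviation of 100mg, chosen purely for illustration):

    # Sketch of the LTI / EAR / RDA relationship under the normality assumption
    from statistics import NormalDist

    def reference_values(ear, sd):
        return {"LTI": ear - 2 * sd, "EAR": ear, "RDA": ear + 2 * sd}

    refs = reference_values(ear=750.0, sd=100.0)
    dist = NormalDist(mu=750.0, sigma=100.0)
    share_between = dist.cdf(refs["RDA"]) - dist.cdf(refs["LTI"])
    print(refs)                     # {'LTI': 550.0, 'EAR': 750.0, 'RDA': 950.0}
    print(round(share_between, 3))  # ~0.954, i.e. roughly 95% of individuals fall between the LTI and RDA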

Just as a general note before moving on, I’ll mostly be sticking with the American acronyms (RDA, EAR, and DRI) for the rest of this article and this series – using both sets just got unnecessarily wordy and confusing. So, if you’re a European reading this article, just interpret RDA as PRI, EAR as AR, and DRI as DRV.

How RDAs, EARs, and LTIs are determined

If you really want to get into the weeds, a 2010 publication from the EFSA discusses how DRIs are determined in detail. It’s dense, but surprisingly readable. However, most readers probably don’t need to know all of the ins-and-outs of determining DRIs, so I’ll just walk you through the basic steps of the process, using calcium as a model example.

Step 1: Determine the analytical endpoint

To start with, authors decide what endpoint(s) to assess, and the type(s) of evidence that can be used to evaluate the impact of nutrient intake on the relevant endpoint(s).

With calcium, the primary concern is with maintaining bone health. So, you might think that determining intake requirements for calcium would be as simple as looking for research reporting both calcium intake and measures of bone health (bone mineral density, bone mineral content, rates of osteoporosis, etc.), and identifying the level of intake associated with generally positive bone-related outcomes.

However, there are a few problems with that approach. That type of research would only give you a snapshot of a single point in time – bone-related measures typically change very slowly over a period of years, so you’d need to know people’s calcium intake over the entire period in which their bone health was changing. Furthermore, there are numerous confounders. For example, whether or not dietary calcium can be incorporated into your bones depends heavily on vitamin D levels – if you eat a ton of calcium, but you’re deficient in vitamin D, you may still have poor bone health. Similarly, exercise is the main stimulus for bone remodeling – if you consume enough calcium and have sufficient vitamin D levels, but you have a very sedentary lifestyle, you may still have poor bone health.

So, the authors opted to focus on calcium balance studies, where researchers rigorously monitor calcium intake and calcium excretion under controlled conditions. Since the goal of (adult) calcium targets is the maintenance of bone calcium levels, and this type of research can precisely determine the level of calcium intake required to achieve calcium balance (i.e. the point at which the total calcium in the body neither increases nor decreases), it was selected as the primary basis for determining calcium targets.

Step 2: Analyze the data

As mentioned previously, calcium balance studies work by rigorously monitoring calcium intake and excretion. Monitoring calcium intake is simple enough – you monitor the food and beverages subjects consume, and as long as you know the calcium concentrations in all of those foods and beverages, you know how much calcium individuals are consuming.

Monitoring calcium excretion is somewhat gross, but it’s also pretty straightforward. When you consume calcium, your body won’t absorb all of it as it passes through your digestive tract. Some will wind up in your feces. Once absorbed, your body will also excrete excess calcium in your urine. So, to monitor calcium excretion, you need to collect subjects’ urine and feces, and analyze the calcium content in the excrement.

When calcium intake is low, excretion will generally exceed intake. As calcium intake increases, calcium excretion increases, but not quite as fast as calcium intake. So, at some point, excretion will match intake, resulting in neutral calcium balance. However, the point at which excretion matches intake won’t necessarily occur at the exact same level of calcium intake for all individuals – some people may achieve neutral calcium balance at an intake of 600mg/day, while other people may achieve neutral calcium balance at an intake of 900mg/day.

So, to initially determine the EAR, researchers just need to identify the point at which intake matches excretion on average. From there, they calculate the variability in calcium excretion (i.e. they calculate the standard deviation for calcium excretion), and add a margin for safety to accommodate people who excrete more calcium than normal, relative to their intake. That margin for safety is equal to two standard deviations, so that the resulting range covers about 95% of individuals in the population. Adding that margin to the EAR gives an initial determination of the RDA, and subtracting it gives an initial determination of the LTI.
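Here’s a deliberately oversimplified sketch of that logic. The real analysis fits a regression model to pooled intake and excretion data, but if you imagine that we somehow knew the neutral-balance intake for each subject directly, the arithmetic would look like this (the data are synthetic):

    # Toy version of Step 2: EAR = average neutral-balance intake, RDA = EAR + 2 SD
    from statistics import mean, stdev

    neutral_balance_intakes_mg = [600, 640, 680, 700, 720, 730, 760, 790, 830, 900]  # synthetic

    ear = mean(neutral_balance_intakes_mg)
    sd = stdev(neutral_balance_intakes_mg)
    rda = ear + 2 * sd
    lti = ear - 2 * sd
    print(round(ear), round(rda), round(lti))  # 735 913 557 with this synthetic data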

Here’s how the calcium balance data looks when graphed out:

(Figure: data guiding the determination of calcium DRIs)

Points above the red line are individuals in negative calcium balance, and points below the red line are individuals in positive calcium balance. As you can see, at intakes around 715mg/day (the EAR), about half of the individuals are in positive calcium balance, and about half are in negative calcium balance. Conversely, at intakes above 904mg/day (the RDA), the vast majority of individuals are in neutral-to-positive calcium balance, and only a handful are in negative calcium balance.

Step 3: Final adjustments and rounding

Calcium balance studies have one major blind spot that’s relatively easy to account for. Your body primarily excretes calcium via urine and feces, but you also lose a bit of calcium in your sweat. It’s relatively easy to collect subjects’ urine and feces, but it’s rarely worth worrying about collecting every drop of sweat a subject generates, so calcium balance studies don’t directly monitor calcium losses via sweat. However, the typical calcium loss via sweat is only about 40mg/day. So, the authors of the calcium guidelines compensated for this blind spot in the research by adding 40mg to the intake targets: an EAR of 755mg/day, and an RDA of 944mg/day.

However, humans generally like nice, round numbers. So, the EFSA rounded those figures to an EAR of 750mg/day, and an RDA of 950mg/day. The LTI is the same distance from the mean in the opposite direction, meaning the LTI is 550mg/day.
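Written out as arithmetic (rounding to the nearest 50mg here is just one way to reproduce the published figures, not necessarily how the rounding was actually performed):

    # Step 3: add estimated sweat losses, round, and mirror the margin to get the LTI
    ear_from_balance = 715   # mg/day, intake at which excretion matches intake on average
    rda_from_balance = 904   # mg/day, EAR plus the two-standard-deviation margin
    sweat_losses = 40        # mg/day, not captured by balance studies

    ear = round((ear_from_balance + sweat_losses) / 50) * 50  # 755 -> 750 mg/day
    rda = round((rda_from_balance + sweat_losses) / 50) * 50  # 944 -> 950 mg/day
    lti = ear - (rda - ear)                                   # same distance below the mean -> 550
    print(ear, rda, lti)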

Of note, those DRIs are for adults over 25 years old. Since younger adults have a greater capacity to gain (not just maintain) bone mass, the authors performed further adjustments to reflect that fact and generate targets consistent with achieving positive calcium balance. For other nutrients, there are additional adjustments for sex, pregnancy status, and lactation status.


I realize that the last section may have been a bit overkill, but I think it’s useful for illustrating a few things about micronutrient targets:

1) Nutrient needs differ

DRIs are based on population data, but you’re an individual, not a population. As you can see in the graph above, some folks were in slight positive calcium balance while consuming 600mg/day, while others were in slight negative calcium balance while consuming 1200mg/day. By the very nature of how DRIs work, about half of individuals could consume a bit less than the EAR and be just fine, whereas 2-3% of people would need to consume more than the RDA to ensure their needs are met.

2) The data used to generate DRIs is often relatively imprecise.

Look at the graph above, and try to ignore the blue and red lines. Would an EAR or RDA jump off the screen at you? 

Probably not.

And, for what it’s worth, nutrition researchers will readily admit that the extant data could be used to credibly argue for higher or lower nutrient targets, that “expert opinion” can play a major role in determining the intake targets for some nutrients that lack robust data, and that major judgment calls need to be made when determining the endpoint that guidelines optimize for (i.e. net balance of a particular nutrient vs. blood levels of a nutrient or biomarker vs. clinical endpoints resulting from nutrient deficiency or inadequacy).

I selected calcium for the example above because it’s a nutrient that does have a robust body of research to lean on when generating nutrient recommendations. For nutrients that just have AIs (instead of RDAs), the data is much less robust. In fact, the USDA and EFSA have some disagreements about which nutrients do actually have enough data to warrant RDAs. For instance, the USDA has defined EARs and RDAs for vitamin E and magnesium, but the EFSA has only issued AIs, suggesting that American scientists believe that there is sufficient data to estimate population-level requirements for these nutrients, whereas European scientists believe the data is not yet sufficient to quantify average vitamin E and magnesium intake requirements.

Beyond that, DRIs provide clean numbers that conceal a lot of messiness regarding nutrient sources, bioavailability, and the context of an individual’s diet. For instance, your body absorbs heme iron (the iron present in animal products) much more efficiently than non-heme iron (the iron present in plants), so you could credibly argue DRIs for iron should be higher for vegetarians and vegans than for omnivores. But, for DRIs to be useful public health tools, they need to be simple enough to serve as an easy reference for most individuals in a population. If people needed to go through a multi-step decision tree to figure out their DRIs for most nutrients, very few people would actually bother in the first place. So, having a smaller number of less exact DRIs that do a pretty good job of describing nutrient needs for most individuals is ultimately more useful than having a larger number of more exact DRIs that do a marginally better job of describing nutrient needs in smaller and smaller subpopulations. But, by the same token, there are any number of reasons why your particular micronutrient needs might be higher or lower than the EAR or RDA.

Basically, estimating micronutrient needs and quantifying micronutrient intake targets is a science, but it’s a relatively inexact science.

For a bit of fun reading, you might enjoy this in-depth summary of a conference that served as a retrospective look at the process of developing the US’s DRIs. It covers the decisions researchers made when determining DRIs (and when deciding how to determine DRIs in the first place), debates and disagreements regarding those decisions, and ideas for how the process of determining DRIs could improve in the future.

3) Nutrient guidelines are predicated on cost/benefit analyses, so we should think in terms of ranges rather than exact targets.

No single DRI should be interpreted as the single number to aim for. In fact, the very concept of EARs, RDAs, and LTIs is predicated on the assumption that nutrient needs vary between individuals:

(Figure source: Scientific Opinion on principles for deriving and applying Dietary Reference Values)

Furthermore, DRIs are based on fundamental concepts of population-level risk assessment and cost-benefit analysis. RDAs are both a) higher than the nutrient needs for most individuals, and b) the lowest targets that ensure that the vast, vast majority of individuals won’t develop nutrient deficiencies (if they meet the RDAs). So, they maximize the benefits for most individuals (avoiding nutrient deficiencies), while minimizing the cost (the literal cost of purchasing and consuming foods that provide way more of a particular nutrient than you actually need). Similarly, as discussed previously, ULs are the highest intake of a particular nutrient that still minimizes risks of an adverse event, but most people can occasionally exceed the UL for a particular nutrient without experiencing any negative effects.

(Figure: nutrient DRIs and rates of adverse outcomes. Source: Scientific Opinion on principles for deriving and applying Dietary Reference Values)

So, instead of looking at the RDA for a particular nutrient and treating it as the single number to aim for, it’s more useful to think of DRIs as sets of numbers that define a range of intake targets.

(Figure: how to think about DRIs. LTI = lower threshold intake, EAR = estimated average requirement, RDA = recommended dietary allowance, UL = tolerable upper intake level, NOAEL = no observed adverse effect level, LOAEL = lowest observed adverse effect level)

It’s very likely that you do need to consume more than the LTI, but there’s a pretty decent chance that intakes between the LTI and RDA will be sufficient (with the probability of positive outcomes increasing as you get closer to the RDA or PRI). Similarly, any intake between the RDA and the UL is very likely to have positive effects, and very unlikely to have negative effects, without the probability of positive and negative outcomes significantly varying within that range. In other words, if the RDA for a particular nutrient is 100mg, and the UL is 2000mg, the net effect of consuming 150mg and the net effect of consuming 1500mg should be pretty similar.
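If it helps to see the “think in ranges” idea operationalized, here’s a small sketch that buckets a day’s intake relative to the LTI, RDA, and UL. The RDA (100mg) and UL (2000mg) match the hypothetical nutrient above; the 60mg LTI is an extra assumption for illustration:

    # Illustrative bucketing of an intake within the LTI / RDA / UL range
    def classify_intake(amount, lti, rda, ul=None):
        if amount < lti:
            return "likely insufficient"
        if amount < rda:
            return "possibly sufficient (more likely the closer you are to the RDA)"
        if ul is None or amount <= ul:
            return "very likely sufficient, very unlikely to cause harm"
        return "above the UL: fine occasionally, but not as a habit"

    for amount in (40, 80, 150, 1500, 2500):
        print(amount, "->", classify_intake(amount, lti=60, rda=100, ul=2000))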

Understanding Nutrient Targets and Ranges in MacroFactor

Micronutrient goals in MacroFactor are defined by a range consisting of a floor, a target, and a ceiling.

The floor is typically the LTI, the target is typically the RDA or AI, and the ceiling (if there is a ceiling) is typically the UL.

To generate these ranges, we consulted the DRIs published by the US Food and Nutrition Board, and the DRVs published by the European Food Safety Authority. We wanted the ranges to be as forgiving as they could justifiably be, so when US and European food authorities had slightly different ranges, we typically selected the lower of the two LTIs as the floor, and the higher of the two ULs as the ceiling. But, in the spirit of RDAs and PRIs functioning as “better safe than sorry” intake targets, we typically selected the higher of the two values to serve as the actual target.

To illustrate, here are the American and European DRIs and DRVs for vitamin C:

Summary of American and European Vitamin C DRIs/DRVs
             Men                Women
             US       Europe    US       Europe
LTI          60mg     70mg      45mg     65mg
EAR/AR       75mg     90mg      60mg     80mg
RDA/PRI      90mg     110mg     75mg     95mg
UL           2000mg   n/a       2000mg   n/a

The American LTIs are lower, so they define the floor of the vitamin C target range in MacroFactor (60mg for men, and 45mg for women). The European PRI is higher than the American RDA, so it serves as the target in MacroFactor (110mg for men, and 95mg for women). Finally, the EFSA hasn’t defined a UL for vitamin C, so the American UL serves as the ceiling in MacroFactor (2000mg for everyone).
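Here’s that range-construction rule as a short sketch, using the vitamin C numbers for men from the table above. To be clear, this is just an illustration of the logic described in this section, not MacroFactor’s actual code:

    # Floor = lower of the two LTIs, target = higher of the RDA/PRI, ceiling = higher defined UL
    def build_range(us, eu):
        floor = min(us["lti"], eu["lti"])
        target = max(us["rda"], eu["rda"])
        uls = [ul for ul in (us.get("ul"), eu.get("ul")) if ul is not None]
        ceiling = max(uls) if uls else None
        return floor, target, ceiling

    us = {"lti": 60, "rda": 90, "ul": 2000}   # vitamin C, men, mg/day
    eu = {"lti": 70, "rda": 110, "ul": None}  # EFSA has not set a vitamin C UL
    print(build_range(us, eu))                # (60, 110, 2000)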

When a nutrient has an AI instead of an RDA, that means the LTI isn’t known. In the absence of sufficient data to calculate an EAR and RDA, health authorities typically assume that individual needs vary within a ±20% range. So, we use that assumption for calculating nutrient floors for nutrients that only have AIs.

To illustrate, here are the American and European AIs and implied pseudo-LTIs for potassium:

Summary of American and European Potassium DRIs/DRVs
               Men                 Women
               US        Europe    US        Europe
AI             3400mg    3500mg    2600mg    3500mg
Implied “LTI”  2720mg    2800mg    2080mg    2800mg

The European AIs are higher, so they serve as the target in MacroFactor (3500mg for both men and women). The “LTIs” implied by the American AIs are lower, so they serve as the floor (2720mg for men, and 2080mg for women). Neither health agency has put forth a UL for potassium, so there’s no potassium ceiling in MacroFactor.
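Here’s the same idea for AI-only nutrients, applying the ±20% assumption to the potassium numbers above (again, a sketch of the logic rather than app code):

    # Target = higher AI; floor = lower AI minus 20% (the implied "LTI")
    def ai_based_range(us_ai, eu_ai):
        target = max(us_ai, eu_ai)
        floor = min(us_ai, eu_ai) * 0.8
        return floor, target

    print(ai_based_range(us_ai=3400, eu_ai=3500))  # (2720.0, 3500) for men
    print(ai_based_range(us_ai=2600, eu_ai=3500))  # (2080.0, 3500) for women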

As one final consideration, both American and European health authorities have set ULs exclusively for synthetic/supplemental forms of some nutrients. For instance, health agencies on both sides of the Atlantic haven’t set a UL for total folate consumption from natural food sources, but they have set a UL of 1000μg/d of synthetic folate coming from supplements or fortified foods. In these instances, the ceiling in MacroFactor is equal to the RDA + the UL for supplemental intake. So, sticking with the folate example, the American RDA (400μg/d) is higher than the European PRI (330μg/d), so the RDA serves as the target. Since the UL for supplemental folate is 1000μg/d, the folate ceiling in MacroFactor is 400 + 1000 = 1400μg/d.
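And the folate ceiling, written out as arithmetic:

    # Ceiling for folate = RDA for total folate + UL for supplemental (synthetic) folate
    rda_total_folate_mcg = 400         # US RDA, higher than the European PRI of 330
    ul_supplemental_folate_mcg = 1000  # applies only to synthetic folate
    print(rda_total_folate_mcg + ul_supplemental_folate_mcg)  # 1400 mcg/day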

If you’d like to define different nutrient floors, targets, and ceilings for yourself (perhaps you want to stick with exclusively American DRIs or European DRVs, or you have a medical condition that necessitates higher or lower intakes of a particular nutrient), you can adjust those values for yourself in MacroFactor. This knowledge base entry explains how it’s done.
