Nutrition Comes Of Age In The 20th Century

Nutrition was barely a science at the turn of the 20th century. Even today, most researchers probably would agree that the best is yet to come. That said, the 20th century brought great advances in the discovery of essential nutrients, their roles in disease prevention and the translation of this information into nutritional policy.

Food restrictions and shortages during World War I created the need to ensure we were providing for the nutritional requirements of our soldiers and workers. In 1917, the U.S. Department of Agriculture issued the first dietary recommendations based on five food groups. In 1924, iodine was added to salt to prevent goiter. With that change, the rate of goiter in Michigan, for example, quickly fell from 39 to 9 percent.

The Great Depression of the 1930s led to the development of food relief and food commodity distribution programs, school meal and nutrition education programs, and national food consumption surveys.

The 1930s also saw continued growth in the incidence of pellagra. Commonly described as the "4D" disease for its symptoms of diarrhea, dermatitis, dementia, and death, pellagra was originally considered an infectious disease and was reported as such in public records. Between 1906 and 1940, some 3 million cases and 100,000 deaths in the United States were attributed to pellagra. In 1937, researchers confirmed that pellagra was not an infectious disease caught from someone else but the result of insufficient niacin in the diet. This finding led to the adoption of public policy to enrich all flour with niacin, along with iron, thiamin, and riboflavin. By the end of the 1940s, pellagra was a disease of the past.

The 1940s were a time of great interest in nutrition. Our country was again engaged in war, food was rationed, and as many as 25 percent of those drafted into the military showed evidence of past or present malnutrition. A program to fortify milk with vitamin D was instituted, and the incidence of rickets subsequently declined.

In the second half of this century, the focus has moved to controlling chronic diseases such as heart disease, cancer, diabetes, and obesity. This movement began with the initiation of the Framingham Heart Study in 1949. The study, which continues today, has been instrumental in showing how diet and sedentary lifestyles contribute to the development of heart disease. Fat, particularly saturated fat, was identified as a major culprit.

The 1970s brought nutrition labeling, a growing interest in nutrition education, and a plethora of low-fat products. These trends have continued, stimulated in part by the introduction of mandatory nutrition labeling for all processed foods in 1994.

The good news is that the percentage of total calories consumed as fat has declined, from 40 to 33 percent between 1977 and 1996. Mortality from heart disease and stroke has also declined, in part because of changes in diet and lifestyle, but also because of advances in early detection and treatment.

The 1990s have brought an increased focus on nutrition and cancer, with public health messages promoting the daily consumption of more fruits and vegetables.

What lies ahead? Our most urgent challenge in the coming century will be controlling obesity. Though we're consuming a lower percentage of our calories from fat, total calorie intake is not declining. Our wealth has brought us too much good-tasting food and too little need to move our bodies. Since 1970, the rate of obesity in the United States has increased dramatically. An estimated 55 percent of adults over age 20 are currently overweight or obese, with all the attendant health problems.

Addressing this challenge will require major effort in the 21st century.
