Dry Mix Science Term Paper
Do you love to use bright and vibrant colored art supplies such as markers or paints? Do you ever wonder how these colors are made?
The variety of colors comes from colored molecules, which are mixed into the material—whether ink or paint—to make the product. Some colored molecules are synthetic (or man-made), such as "Yellow No. 5," found in some food dyes. Others are extracted from natural sources, such as carotenoid (pronounced kuh-RAH-tuh-noid) molecules, which are what make carrots orange. They can be extracted from concentrated natural products, such as saffron.
But there is more to making a color look the way it does in your homemade artwork. You might have learned that many colors, such as orange and green, are made by blending other, "primary" colors. So even though our eyes see a single color, the color of a marker, for instance, might come from one type of color molecule or from a mix of several. This science activity will help you discover the hidden colors in water-soluble markers.
We see objects because they reflect light into our eyes. Some molecules only reflect specific colors; it is this reflected, colored light that reaches our eyes and tells our brains that we are seeing a certain color.
Often the colors that we see are a combination of the light reflected by a mixture of different-color molecules. Even though our brains perceive the result as one color, each of the separate types of color molecules stays true to its own color in the mixture. One way to see this is to find a way to separate out the individual types of color molecules from the mixture—to reveal their unique colors.
Paper chromatography is a method used by chemists to separate the constituents (or parts) of a solution. The components of the solution start out in one place on a strip of special paper. A solvent (such as water, oil or isopropyl alcohol) is allowed to wick up the paper strip. As it does so, it carries part of the mixture with it. Different molecules run up the paper at different rates. As a result, components of the solution separate and, in this case, become visible as bands of color on the chromatography paper. Will your marker ink show different colors as you put it to the test?
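Chemists often summarize how far each component travels with a retention factor (Rf): the distance a color band moved divided by the distance the solvent front moved, both measured from the starting line. This small sketch is not part of the activity, and the measurements in it are made-up examples, but it shows how the calculation works:

```python
def retention_factor(band_distance_cm, solvent_front_distance_cm):
    """Rf = distance traveled by the color component / distance traveled by the solvent.

    Both distances are measured from the starting (pencil) line.
    """
    return band_distance_cm / solvent_front_distance_cm

# Hypothetical measurements from one strip: the water rose 8.0 cm,
# one dye band stopped at 6.0 cm and another at 2.0 cm.
fast_band = retention_factor(6.0, 8.0)  # 0.75 -- this dye ran quickly
slow_band = retention_factor(2.0, 8.0)  # 0.25 -- this dye ran slowly
print(fast_band, slow_band)
```

A component that moves with the solvent has an Rf near 1; one that barely leaves the starting line has an Rf near 0, which is one way to compare how the different dyes in a single marker behave.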
- Two white coffee filters
- Drawing markers (not permanent): brown, yellow and any other colors you would like to test
- At least two pencils (one for each color you will be testing)
- At least two tall water glasses (one for each color you will be testing), four inches or taller
- Two binder clips or clothespins
- Drying rack or at least two additional tall water glasses (one for each color you will be testing)
- Pencil or pen and paper for taking notes
- Carefully cut the coffee filters into strips that are each about one inch wide and at least four inches long. Cut at least two strips, one to test brown and one to test yellow. Cut an extra strip for each additional color you would like to test. How do you expect each of the different colors to behave when you test it with the paper strip?
- Draw a pencil line across the width of each paper strip, about one centimeter from the bottom end.
- Take the brown marker and a paper strip and draw a short line (about one centimeter) on the middle section of the pencil line. Your marker line should not touch the sides of your strip.
- Use a pencil to write the color of the marker you just used on the top end of the strip. Note: Do not use the colored marker or pen to write on the strips, as the color or ink will run during the test.
- Repeat the previous three steps with a yellow marker and then all the additional colors you would like to test.
- Hold a paper strip next to one of the tall glasses (on the outside of it), aligning the top of the strip with the rim of the glass, then slowly add water to the glass until the level just reaches the bottom end of the paper strip. Repeat with the other glass(es), keeping the strips still on the outside and away from the water. What role do you think the water will play?
- Fasten the top of a strip (the side farthest from the marker line) to the pencil with a binder clip or clothespin. Pause for a moment. Do you expect this color to be the result of a mixture of colors or the result of one color molecule? If you like, you can make a note of your prediction now.
- Hang the strip in one of the glasses that is partially filled with water by letting the pencil rest on the glass rim. The bottom end of the strip should just touch the water level. If needed, add water to the glass until it is just touching the paper. Note: It is important that the water level stays below the marker line on the strip.
- Leave the first strip in its glass as you repeat the previous two steps with the second strip and the second glass. Repeat with any additional colors you are testing.
- Watch as the water rises up the strips. What happens to the colored lines on the strips? Does the color run up as well? Do you see any color separation?
- When the water level reaches about one centimeter from the top (this may take up to 10 minutes), remove the pencils with the strips attached from the glasses. If you let the strips run too long, the water can reach the top of the strips and distort your results.
- Write down your observations. Did the colors run? Did they separate in different colors? Which colors can you detect? Which colors are on the top (meaning they ran quickly) and which are on the bottom (meaning they ran more slowly)?
- Hang your strips to dry in the empty glasses or on a drying rack. Note that some colors might keep running after you remove the strips from the water. You might need longer strips to see the full spectrum of these colors. The notes you took in the previous step will help you remember what you could see in case the colors run off the paper strip. Look at your strips. How many color components does each marker color have? Can you identify which colors are the result of a mixture of color components and which ones are the result of one hue of color molecule? Are individual color components brightly colored or dull in color? How many different colors can you detect in total?
- Extra: Most watercolor marker inks are colored with synthetic color molecules. Artists often like to work with natural dyes. It is fairly easy to make your own dye from colorful plants such as blueberries, red beets or turmeric. To make your own dye, have an adult help you finely chop the plant material and place it in a saucepan. Add just enough water to cover the plant material. Let the mixture simmer covered on the stove for approximately 10 to 15 minutes. If, at this point, the color of your liquid is too faint, you might want to remove the lid of the saucepan and continue boiling until some liquid has evaporated and a more concentrated color is obtained. Let it cool and strain if needed. Now you have natural dye. (Handle with caution, as it can stain surfaces and materials.) To investigate the color components of this dye, repeat the previous procedure but replace the marker line with a drop of natural dye. A dropper will help create a nice drop. Let the drop of dye dry before running the paper strip. Would the color of your natural dye be the result of a mixture of color molecules or one specific color molecule? Does the marker of the same color as your natural dye run in a similar way as your natural dye does?
- Extra: In this activity you used water-soluble markers in combination with water as a solvent. You can test permanent markers using isopropyl rubbing alcohol as a solvent. Do you think similar combinations of color molecules are used to color similar-colored permanent markers?
- Extra: You can investigate other art supplies, including paints, pastels or inks in a similar way. Be sure to always choose a solvent that dissolves the material that is being tested to run the chromatography test. Isopropyl rubbing alcohol, vegetable oil and salt water are some examples of solvents used to perform paper chromatography tests for different substances.
Observations and results
Did you find that brown is made up of several types of color molecules, whereas yellow only showed a single yellow color band?
Marker companies combine a small subset of color molecules to make a wide range of colors, much like you can mix paints to make different colors. But nature provides an even wider range of color molecules and also mixes them in interesting ways. As an example, natural yellow color in turmeric is the result of several curcuminoid molecules. The brown pigment umber (obtained from a dark brown clay) is caused by the combination of two color molecules: iron oxides (which have a rusty red-brown color) and manganese oxides (which add a darker black-brown color).
In this activity you investigated the color components using coffee filters as chromatography paper. Your color bands might be quite wide and artistic, whereas scientific chromatography paper would yield narrow bands and more-exact results.
Throw away the paper strips and wash the glasses.
More to explore
Paper Chromatography, from ChemGuide
Paper Chromatography: Is Black Ink Really Black?, from Science Buddies
Make Your Own Markers, from Science Buddies
Candy Chromatography: What Makes Those Colors?, from Science Buddies
Find the Hidden Colors of Autumn Leaves, from Scientific American
This activity brought to you in partnership with Science Buddies
The world has progressed through hunter–gatherer, agricultural, and industrial stages to become a provider of goods and services. This progression has been catalyzed by the cultural and social evolution of mankind and the need to solve specific societal issues, such as the need for preservation to free people from foraging for food, and the need for adequate nutrition via a consistent food supply year round. These forces led to the development of the food industry, which has contributed immensely to the basis for a healthy human civilization and helped society prosper and flourish (Lund 1989).
Development of food science and technology
According to Harvard Univ. biological anthropologist Richard Wrangham, food processing was launched about 2 million years ago by a distant ancestor who discovered cooking, the original form of food processing (Wrangham 2009). Later, but still during prehistoric times, cooking was augmented by fermenting, drying, preserving with salt, and other primitive forms of food processing, which allowed groups and communities to form and survive. Humans thus first learned how to cook food, then how to transform, preserve, and store it safely. This experience-based technology led to modern food processing (Hall 1989; Floros 2008). Much later, the domestication of plants and land cultivation became widespread, and at the end of the last Ice Age, humans revolutionized eating meat by domesticating animals for food. Thus, plant and animal agriculture also contributed to improving the human condition.
Study of every ancient civilization clearly shows that throughout history humans overcame hunger and disease, not only by harvesting food from a cultivated land but also by processing it with sophisticated methods. For example, the 3 most important foods in Ancient Greece—bread, olive oil, and wine—were all products of complicated processing that transformed perishable, unpalatable, or hardly edible raw materials into safe, flavorful, nutritious, stable, and enjoyable foods (Floros 2004).
Today, our production-to-consumption food system is complex, and our food is largely safe, tasty, nutritious, abundant, diverse, convenient, and less costly and more readily accessible than ever before. This vast food system includes agricultural production and harvesting, holding and storing of raw materials, food manufacturing (formulation, food processing, and packaging), transportation and distribution, retailing, foodservice, and food preparation in the home. Contemporary food science and technology contributed greatly to the success of this modern food system by integrating biology, chemistry, physics, engineering, materials science, microbiology, nutrition, toxicology, biotechnology, genomics, computer science, and many other disciplines to solve difficult problems, such as resolving nutritional deficiencies and enhancing food safety.
The impact of modern food manufacturing methods is evident in today's food supply. Food quality can be maintained or even improved, and food safety can be enhanced. Sensitive nutrients can be preserved, important vitamins and minerals can be added, toxins and antinutrients (substances such as phytate that limit bioavailability of nutrients) can be removed, and foods can be designed to optimize health and reduce the risk of disease. Waste and product loss can be reduced, and distribution around the world can be facilitated to allow seasonal availability of many foods. Modern food manufacturing also often improves the quality of life for individuals with specific health conditions, offering modified foods to meet their needs (for example, sugar-free foods sweetened with an alternative sweetener for people with diabetes).
| Discipline | Examples of contributions to the food system |
| --- | --- |
| Biology, Cell Biology | Understanding of postharvest plant physiology, food quality, plant disease control, and microbial physiology; food safety |
| Biotechnology | Rice with increased content of beta-carotene (vitamin A precursor); enzymes for cheesemaking, breadmaking, and fruit juice manufacture |
| Chemistry | Food analysis, essential for implementing many of the applications listed here; improved food quality; extended shelf life; development of functional foods (foods and food components providing health benefits beyond basic nutrition) |
| Computer Science | Food manufacturing process control, data analysis |
| Genomics | Understanding of plant and animal characteristics; improved control of desirable attributes; rapid detection and identification of pathogens |
| Materials Science | Effective packaging; understanding of how materials properties of foods provide structure for texture, flavor, and nutrient release |
| Microbiology | Understanding of the nature of bacteria (beneficial, spoilage, and disease-causing microorganisms), parasites, fungi, and viruses, and developments and advances in their detection, identification, quantification, and control (for example, safe thermal processes for commercial sterilization); hygiene; food safety |
| Nutrition | Foods fortified with vitamins and minerals for health maintenance; functional foods for addressing specific health needs of certain subpopulations; development of diets that match human nutrient requirements; enhanced health and wellness |
| Physics, Engineering | Efficient food manufacturing processes to preserve food attributes and ensure food safety; pollution control; environmental protection; waste reduction efforts |
| Sensory Science | Understanding of chemosenses (for example, taste and odor) to meet different flavor needs and preferences |
| Toxicology | Assessment of the safety of chemical and microbiological food components, food additives |
Controversies about processed foods
Although today the public generally embraces and enjoys key benefits of the food supply—value, consistency, and convenience—some suggest that the cost to society of obtaining these benefits is too high. Negative perceptions about “processed foods” also exist, especially among consumers in the United States. A range of factors contributes to these perceptions, including uneasiness with technology; a low level of science literacy; labeling and advertising that have at times taken advantage of food additive or ingredient controversies; the perceived voluntary versus involuntary nature of risk; and a high level of food availability (Slovic 1987; Clydesdale 1989; Hall 1989). Other factors contributing to negative public perceptions about processed foods include the increasing prevalence of obesity in many industrialized or developed countries, the use of chemicals in production or additives in foods, little personal contact between consumers and the agricultural and food manufacturing sectors, food safety issues, and concern that specific ingredients (particularly salt) may contribute to illnesses or impact childhood development (Schmidt 2009).
Some books on food in the popular press have implied that the food industry has incorrectly applied the knowledge of food science and technology to develop processed foods that result in poor dietary habits. The premise of some critics of processed foods is that knowledge of chemistry and the physical properties of food constituents allow the food industry to make processed foods that result in overeating and cause the general population to abandon whole foods. The argument is stretched further to suggest that the development of processed foods is responsible for promoting bad eating habits and is the cause of chronic disease. Such an argument is specious, because personal preferences, choice, will power, and lifestyle factor into the decision of what and how much to eat. The challenge surrounding the connection between lifestyles and health (that is, diet and chronic disease) is discussed in the next section of this review.
The population challenge
During the 2009 World Summit on Food Security, it was recognized that by 2050 food production must increase by about 70% to feed the anticipated 9 billion people—a population about 34% larger than today's (FAO 2009a). This projected population increase is expected to involve an additional annual consumption of nearly 1 billion metric tons of cereals for food and feed and 200 million metric tons of meat.
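A 70% increase over roughly four decades corresponds to a fairly modest compound annual growth rate. The following back-of-the-envelope check is illustrative only; it assumes growth runs from the 2009 summit to 2050:

```python
# Required compound annual growth rate for food production to rise 70%
# between 2009 (the summit year) and 2050 -- an illustrative calculation,
# not a figure from the FAO report itself.
target_multiple = 1.70   # a 70% increase
years = 2050 - 2009      # 41 years

annual_rate = target_multiple ** (1 / years) - 1
print(f"{annual_rate:.2%}")  # roughly 1.3% per year
```

The point of the arithmetic is that the challenge lies less in the annual rate itself than in sustaining it while resources, energy, and arable land are increasingly constrained, as the following paragraphs discuss.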
Another challenge is the large, growing food security gap in certain places around the world. As much as half of the food grown and harvested in underdeveloped and developing countries never gets consumed, partly because proper handling, processing, packaging, and distribution methods are lacking. Starvation and nutritional deficiencies in vitamins, minerals, protein, and calories are still prevalent in all regions of the world, including the United States. As a consequence, science-based improvements in agricultural production, food science and technology, and food distribution systems are critically important to decreasing this gap.
In addition, energy and resource conservation is becoming increasingly critical. To provide sufficient food for everyone in a sustainable and environmentally responsible manner, without compromising our precious natural resources, agricultural production must increase significantly from today's levels and food manufacturing systems must become more efficient, use less energy, generate less waste, and produce food with extended shelf life.
Although scientific and technological achievements in the 20th century made it possible to solve nutritional deficiencies, address food safety and quality, and feed nearly 7 billion people, further advancements are needed to resolve the challenges of sustainably feeding the growing future population in industrialized and developing nations alike. In fact, to meet the food needs of the future, it is critically important that scientific and technological advancements be accelerated and applied in both the agricultural and the food manufacturing sectors.
Achievements and promises
The next section of this review, “Evolution of the Production-to-Consumption Food System,” summarizes the parallel developments of agriculture and food manufacturing from the beginnings of modern society (the Neolithic revolution) to the present; it also addresses the current diet and chronic disease challenge. The subsequent section, “Food Processing: A Critical Element,” explains why food is processed and details the various types of food processing operations that are important for different food manufacturing purposes. Then the following section, “Looking to the Future,” outlines suggestions to improve our food supply for a healthier population, and briefly discusses the various roles that researchers, consumers, the food industry, and policy makers play in improving the food supply for better health; it also addresses the promises that further advancements and application of technologies in the food system hold for the future.
Evolution of the Production-to-Consumption Food System
The life of the hunter–gatherer was generally uncertain, dangerous, and hardscrabble. Thomas Hobbes, in his Leviathan (1651), described life in those times as “the life of man in a state of nature, that is, solitary, poor, nasty, brutish, and short.” Agriculture transformed that existence by making available a far larger and generally more reliable source of food, in large part through domestication and improvement of plants and animals.
Domestication leads to civilization
Domestication is the process of bringing a species under the control of humans and gradually changing it through careful selection, mating, and handling so that it is more useful to people. Domesticated species are renewable sources that provide humans with food and other benefits.
At the end of the last Ice Age, humans domesticated plants and animals, permitting the development of agriculture, producing food more efficiently than in hunter-gatherer societies, and improving the human condition. Domestication did not appear all at once, but rather over a substantial period of time, perhaps hundreds of years. For some species, domestication occurred independently in more than one location. For animals, the process may have begun almost accidentally, as by raising a captured young animal after its mother had been killed and observing its behavior and response to various treatments. Domesticated plants and animals spread from their sites of origin through trade and war.
The domestication of plants and animals occurred primarily on the Eurasian continent (Smith 1998). A prominent early site was in the Middle East, the so-called Fertile Crescent, stretching from Palestine to southern Turkey, and down the valleys of the Tigris and Euphrates Rivers, where barley, wheat, and lentils were domesticated as early as 10000 y ago and sheep, goats, cattle, and pigs were domesticated around 8000 y ago. Rice, millet, and soy were domesticated in East Asia; millet, sorghum, and African rice in sub-Saharan Africa; potato, sweet potato, corn (maize), squash, and beans in the Americas; Asiatic (water) buffaloes, chickens, ducks, cattle, and pigs in the Indian subcontinent and East Asia; pigs, rabbits, and geese in Europe; and llamas, alpacas, guinea pigs, and turkeys in the Americas.
The introduction of herding and farming was followed by attempts to improve the wild varieties of plants and animals that had just been domesticated. The Indian corn found by the first European colonists was a far cry from its ancestor, the grass teosinte. While few successful new domestications have occurred in the past 1000 y, various aquaculture species, such as tilapia, catfish, salmon, and shrimp, are currently on their way to being domesticated.
Although the primary goal of domestication (ensuring a more stable, reliable source of animal and plant foods) has not fundamentally changed, the specific goals have become highly specialized over time. For example, we now breed cattle for either beef or dairy production, and cattle and hogs for leaner meat. We breed chickens as either egg layers or broilers. In addition, selection for increased efficiency of producing meat, milk, and eggs is prominent in today's agriculture, as discussed later in this section.
Agriculture, built on the domestication of plants and animals, freed people from the all-consuming task of finding food and led to the establishment of permanent settlements. What we know as civilization—cities, governments, written languages, an expanding base of knowledge, improved health and life span, the arts—was only possible because of agriculture. Along with domestication of plants and animals, people began the journey of discovery of methods to extend the useful life of plant and animal food items so that nourishment could be sustained throughout the year. With a fixed (nonnomadic) population also came primitive food storage and, with that, improvements in food safety and quality.
An important discovery bearing on how early food security became paramount was reported in July 2009: Kuijt and Finlayson (2009) described what they believe to be several granaries in Jordan dating to about 11000 y ago. This would suggest that populations knew the importance of having a dependable food supply before the domestication of plants. The authors further suggested that “Evidence for PPNA (Pre-Pottery Neolithic Age) food storage illustrates a major transition in the economic and social organization of human communities. The transition from economic systems based on collecting and foraging of wild food resources before this point to cultivation and foraging of mixed wild and managed resources in the PPNA illustrates a major intensification of human-plant relationships.” Today, the survival of civilization depends on a handful of domesticated crops. Of the roughly 400000 plant species existing today (Pitman and Jorgensen 2002), fewer than 500 are considered to be domesticated.
Selecting for desirable crop traits
The primary force in crop domestication and subsequent breeding is selection, both artificial and natural, described below. Charles Darwin, in developing the theory of natural selection, relied heavily on the knowledge and experiences of plant and animal breeders (Darwin 1859). Crops were domesticated from wild ancestors’ gene pools that had been altered by selection imposed by early agriculturalists and by natural selection imposed by biotic and abiotic environmental factors (Harlan and others 1973; Purugganan and Fuller 2010). Selection changes gene pools by increasing the frequency of alleles (genes encoded by a place in the genome and that may vary between individuals and mutant/parent strains) that cause desirable traits and decreasing the frequency of alleles that cause undesirable traits. Modern crop varieties are still shaped by the same forces.
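The claim that selection raises the frequency of alleles for desirable traits can be made concrete with the standard one-locus haploid selection model from population genetics. This is a textbook illustration, not a model taken from this review, and the fitness values are arbitrary assumptions:

```python
def next_frequency(p, w_favored=1.05, w_other=1.0):
    """One generation of selection in a haploid model.

    The favored allele's frequency is reweighted by its relative fitness
    (here an assumed 5% advantage) against the population's mean fitness.
    """
    mean_fitness = p * w_favored + (1 - p) * w_other
    return p * w_favored / mean_fitness

# Start with the favored allele rare (1%) and apply selection repeatedly:
# even a small, consistent advantage drives the allele toward fixation.
p = 0.01
for generation in range(200):
    p = next_frequency(p)
print(round(p, 2))  # close to 1.0: the allele is nearly fixed
```

The same logic, run in reverse, explains how alleles for undesirable traits decline in frequency, which is what early agriculturalists were doing, knowingly or not, each time they saved seed from the best plants.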
The causes of the bursts of domestication activity have been the subject of much speculation (Smith 1998), but the changes symptomatic of domestication are well established for many species (Harlan and others 1973; Doebley and others 2006). Legumes and the large-seeded grasses collectively known as cereals (for example, maize, wheat, rice, and sorghum) contribute most of the calories and plant protein in the human diet. For these and other annual crops such as sunflower and squash, the initial changes during domestication involved ease of harvesting and the ability to compete with weeds. Initially, selection for these traits was most likely not planned but serendipitous, arising from random mutations.
The most significant problem confronting most agriculturalists, both early and modern, is weed competition. Early agriculturalists scattered seeds on ground that had been prepared, most likely by burning or some other disruption of the soil surface. Those seeds that passed their genes onto the next generation (natural selection) were those that best competed with weeds. Selection pressure due to weed competition results in a number of changes, including the reduction or elimination of seed dormancy and larger seeds (Harlan and others 1973; Smith 1998). Dormancy is very undesirable in annual crops, and most domesticated species germinate rapidly upon planting. Selection against dormancy has been so extreme, however, that under certain weather conditions, seeds of modern wheat varieties (Triticum aestivum) and barley (Hordeum vulgare) sprout while still in the seed head, destroying the value of the grain crop. Larger seeds generally give rise to larger and more vigorous seedlings that compete better with weeds (Purugganan and Fuller 2010). In the grasses, selection for larger seed size is associated with increased starch and decreased protein in the endosperm. For example, the protein content of teosinte (Zea mays parviglumis)—the wild ancestor of maize (Zea mays mays), which is referred to as corn in North America—is approximately 30%, while the protein content of modern maize is 11% (Flint-Garcia and others 2009).
While the goal of selection is to alter a targeted trait (appearance and/or performance), and the genetic variation underlying the selected trait is reduced over time, unselected traits also often change, and these changes may be negative (for example, reduced endosperm protein in grasses that have been selected for larger seeds).
For example, in the United States, the major selection criterion for maize is increased grain yield (Tracy and others 2004), and strong selection pressure for increased grain yield leads to increased starch content and decreased protein content (Dudley and others 2007). Critics focus on such changes as evidence that the quality of our food supply has been “damaged” by modern plant breeding and agricultural practices. But has it? In United States agriculture, maize is grown for its prodigious ability to convert the sun's energy into chemical energy (carbohydrates), while we have abundant sources of plant and animal protein. In other parts of the world, maize is a staple crop, and diets of many people are deficient in protein. To improve the nutrition of the poor whose staple is maize, plant breeders at the Intl. Center for Maize and Wheat Improvement (Centro Internacional de Mejoramiento de Maíz y Trigo, CIMMYT) developed quality protein maize (QPM) that has an improved protein content and amino acid profile (Prasanna and others 2001). It is the selection of the breeding objective that determines the outcome. Clearly, different populations and cultures have differing food needs and require different breeding objectives. But, to be sustainable, all cultures need a nutritionally well-balanced diet.
Changes in food animal agriculture and fisheries
Animal food products are good sources of high-quality protein, minerals (for example, iron), and vitamins, particularly vitamin B12, which is not available in plant materials. Livestock production is a dynamic and integral part of the food system today, contributing 40% of the global value of agricultural output, 15% of total food energy, and 25% of dietary protein and supporting the livelihoods and food security of almost a billion people (FAO 2009b). Seafood, including products from a growing aquaculture segment, provides at least 15% of the average animal protein consumption to 2.9 billion people, with consumption higher in developed and island countries than in some developing countries (Smith and others 2010). Except for most of sub-Saharan Africa and parts of South Asia, production and consumption of meat, milk, and eggs is increasing around the world, driven by population and income growth and urbanization (FAO 2009b; Steinfeld and others 2010). The rapidly increasing demand for meat and dairy products has led during the past 50 y to an approximately 1.5-fold increase in the global numbers of cattle, sheep, and goats; 2.5-fold increase in pigs; and 4.5-fold increase in chickens (Godfray and others 2010). The nutritional impact of animal products varies tremendously around the world (FAO 2009b; Steinfeld and others 2010).
The structure of the livestock sector is complex, differs by location and species, and is being transformed by globalization of supply chains for feed, genetic stock, and other technologies (FAO 2009b). The current livestock sector has shifted from pasture-based ruminant species (cattle, sheep, goats, and others having a multichamber stomach, one of which is the rumen) to feed-dependent monogastric species (for example, poultry) and is marked by intensification and increasing globalization (Steinfeld and others 2010). A substantial proportion of livestock, however, is grass-fed (Godfray and others 2010) and small-holder farmers and herders feed 1 billion people living on less than $1 a day (Herrero and others 2010).
The rates of conversion of grains to meat, milk, and eggs by food animals have improved significantly in developed and developing countries (CAST 1999). Technological improvements have taken place most rapidly and effectively in poultry production, with broiler growth rates nearly doubled and feed conversion ratios halved since the early 1960s. In addition to these productivity gains, bird health and product quality and safety have improved through applications of breeding, feeding, disease control, housing, and processing technologies (FAO 2009b). Transgenic technology has also been used to produce fish with faster, more efficient growth.
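The feed conversion ratio (FCR) mentioned above is simply the mass of feed consumed per unit of live-weight gain, so halving it doubles the output obtainable from a fixed feed supply. A minimal sketch of that arithmetic, using purely illustrative figures (not taken from the cited sources):

```python
def feed_conversion_ratio(feed_kg, gain_kg):
    """Kilograms of feed consumed per kilogram of live-weight gain (lower is better)."""
    return feed_kg / gain_kg

# Hypothetical broiler figures, for illustration only: suppose an
# early-1960s bird needed 4.0 kg of feed per 1.0 kg of gain, and a
# modern bird needs 2.0 kg, i.e. the FCR has been halved.
fcr_1960s = feed_conversion_ratio(4.0, 1.0)
fcr_modern = feed_conversion_ratio(2.0, 1.0)

# Halving the FCR doubles the meat produced from the same feed supply.
feed_supply_kg = 1000.0
meat_1960s_kg = feed_supply_kg / fcr_1960s   # 250 kg
meat_modern_kg = feed_supply_kg / fcr_modern  # 500 kg
```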
Meeting the needs of a growing population
As a result of improved public health measures and modern medicine, the population has mushroomed from an estimated 1 to 10 million in 10,000 BC to an estimated 600 to 900 million in AD 1750 and an estimated 6.8 billion today. Thomas Malthus (1803) predicted that population growth would inevitably outpace resource production, and therefore that misery (hunger and starvation) would persist. The application of science and technology in agriculture and in food and beverage manufacturing has so far negated these predictions and sustained population growth (Figure 1).
The application of science to agriculture has dramatically increased productivity, but until the Green Revolution of the 1960s and 1970s, productivity was not keeping pace with population growth. Large areas of the world, including the 2 most populous nations, China and India, were experiencing severe food shortages and anticipating worse. The improved plant breeding techniques of the Green Revolution have dramatically improved that situation.
However, the Green Revolution's remarkable advances have come at substantial cost. The high-yielding varieties it produced require much larger inputs of fertilizer and water. Poor farmers often cannot afford the fertilizer, and adequate water supplies are becoming an increasing problem in many areas. Thus, the Green Revolution, for all its enormous benefits, has helped larger farmers much more than smaller, poorer ones. In addition, pesticide applications in the developing world are too often inappropriate or excessive (in some cases because the farmer is unable to read the label), and there is no structure (for example, a regulatory agency such as the Environmental Protection Agency) to regulate their use.
Problems are not, however, confined to the developing world. Nutrient runoff in the United States and other countries leads to algal blooms in lakes and estuaries and to “dead zones” completely lacking in oxygen in lakes and oceans. Soil erosion by wind and water continues to be a problem in many producing areas, and soil quality suffers as a result. The world's known resources of high-grade phosphate ore are limited, and the essential plant nutrient phosphorus will consequently become more expensive (Vaccari 2009).
These problems can certainly be solved through a number of practices. Beneficial options include “no-till” agriculture (which leaves the root systems of previous crops undisturbed, thereby retaining organic matter and greatly discouraging erosion), integrated pest management (IPM, which focuses pesticide use where it is needed, substantially decreasing the amount used), precision agriculture (which targets production inputs such as seed, fertilizer, and pesticides where and when they are needed), drip irrigation (controlled trickling of water), and use of new technology for recovering nitrogen and phosphorus from processing wastewater for use as fertilizer (Bongiovanni and Lowenberg-Deboer 2004; Frog Capital 2009; Gebbers and Adamchuk 2010).
Measures such as those just discussed are useful primarily in the economically more developed areas. Developing countries require other steps adapted to their local areas and focused particularly on improvements for the many millions of small, poor farmers. Improved plant varieties, produced both by conventional breeding and through biotechnology, are necessary, as are improved varieties of fish and livestock. There is little doubt that improvements in plant breeding, both conventional and transgenic, can significantly increase productivity. Technological improvements, such as automated plant monitoring via robotics, are “helping plant breeders trim years off the process of developing crop varieties tailored to local conditions” (Pennisi 2010).
The list of such needs is far too long to explore here, but it also must include public health measures. A major problem yet to be addressed is the subsidization of agricultural products in developed nations. Products from small, unsubsidized farmers in developing nations cannot compete in the world market with subsidized products from advanced nations. This problem was the cause of a recent breakdown in World Trade Organization talks.
Some see organic agriculture as an answer to these problems. Organic farming has some clear merits, particularly practices such as crop rotation and the use of green manure, natural biocontrol agents, and animal manure, which farmers have used for millennia (King 1949). The use of degraded plant and animal residues increases the friability (tendency to crumble, as opposed to caking) and water-holding capacity of soil, and nutrients from decaying plants and animal manure become available more slowly than those from most commercial fertilizers. Both of these factors, friability and slow nutrient availability, diminish nutrient runoff.
While organic agriculture continues to grow in response to consumer preferences in the developed world, there are limitations to widespread use of organic practices. Organic agriculture requires substantially more land and labor than conventional practices to produce food, and the resulting yields are too low, and the products too expensive, to address the needs of the growing population. The supply of composted animal manure is limited and relatively expensive compared with commercial fertilizers. Organic agriculture excludes the use of synthetic pesticides, and the few “natural” ones that are permitted are seldom used (Lotter 2003). Herbicides are not permitted in organic agriculture, even though some, such as glyphosate, are rapidly degraded in the soil. These exclusions require more manual labor for weed and pest control. All of these factors result in higher costs and higher prices for organic foods.
Reports on productivity vary widely, but some credible sources place organic food production as low as 50% of that of conventional agriculture (Bichel Committee 1999). Yield differences may be attributable to a number of factors such as agro-ecological zone (for example, temperate and irrigated compared with humid and perhumid), crop type, high-input compared with low-input level of comparable conventional crop, and management experience (Zundel and Kilcher 2007). In addition, current organic methods exclude the use of the products of modern biotechnology—recombinant DNA technology—essential to future increases in agricultural productivity. Nevertheless, the more useful practices of organic agriculture must be part of the agriculture of the future.
Although poverty and malnutrition exist in all countries, by far the most severe problems in achieving availability, safety, and nutritive value of food and beverages occur in the developing world (IFPRI 2009). Water shortages and contaminated water, poor soil, destruction of forest for fuel, use of animal manure for fuel, the spread of plant and animal diseases, and the complete lack of a sound food safety infrastructure are among the most vexing problems. Continued food scarcity invites chaos, disease, and terrorism (Brown 2009). The gap between developing and developed nations is not only in economics but also in science, governance, and public information. Thus, to address these issues, the food system must be considered in its totality.
Eighty percent of agricultural land is used to grow grain fed to meat animals, yet it yields only 15% of our calorie intake. Many have suggested that world food shortages could be greatly alleviated by consuming less meat and using the grain supplies now consumed by animals more directly. Reduction in meat intake, particularly red meats, would confer some health benefits, but the potential effects on world food supplies are less clear and quite possibly much smaller than many presume. If developed nations consume much less meat, the price of meat will fall and poorer nations will consume more. If more grain is consumed directly by people, grain prices will rise, to the detriment of populations that already rely heavily on grain. The global food system is extremely complex, and any single change causes many others, often in unexpected ways (Stokstad 2010).
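The mismatch in the paragraph above can be made concrete with simple arithmetic. Taking the stated shares at face value (they are highly aggregated figures, and distinctions such as pasture versus cropland are ignored here), the calorie yield per unit of land works out roughly as follows:

```python
# Shares stated in the text: land devoted to feeding meat animals
# occupies 80% of agricultural land but supplies only 15% of calories.
land_share_animal = 0.80
calorie_share_animal = 0.15

# Calories delivered per unit of land, relative to the all-agriculture
# average (normalized to 1.0).
yield_animal = calorie_share_animal / land_share_animal                 # 0.1875
yield_nonanimal = (1 - calorie_share_animal) / (1 - land_share_animal)  # 4.25

# Under these stated shares, non-animal agriculture delivers roughly
# 23 times more calories per unit of land.
ratio = yield_nonanimal / yield_animal
```

This back-of-the-envelope ratio illustrates why the suggestion to use grain more directly is appealing, even though, as the text notes, price feedbacks make the real-world outcome far less predictable.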
Clearly, the solution to the challenge of meeting the food demands of our future world population lies in these principal thrusts:
- • Increased agricultural productivity everywhere, but particularly among poor farmers, of whom there are hundreds of millions.
- • Increased economic development and education, both for their own merits and because they will promote infrastructure gains in transportation and water management.
- • Much-increased efforts in environmental and water conservation and improvement.
- • Continued improvements in food and beverage processing and packaging to deliver safe, nutritious, and affordable food.
- • Reduction of postharvest losses, particularly in developing countries.
We must achieve all of these goals. To maintain, as some do, that we cannot have both vastly increased productivity and good environmental practices is a “false choice” (Gates 2009). Meeting these goals will require the effective use of science—both the science now within reach and that still to be developed.
Preserving the food supply
Postharvest losses occur between harvest and consumption as a result of spoilage of raw agricultural commodities, primarily during storage and transportation, before they can be stabilized for longer-term storage. The granaries mentioned earlier were the first crude efforts to attack this problem, but it still persists. Postharvest losses due to rodents, insects, and microbial spoilage in some areas amount to 30% or more of the harvested crop. This results in wasted seed, water, fertilizer, and labor. Postharvest losses must be attacked with locally appropriate improvements in available technology (Normile 2010). It is not enough merely to increase and conserve the supply of raw food; it must be conserved against further loss by processing and be packaged, distributed to where it is needed, and guaranteed in its safety, nutritional value, and cultural relevance. That is the role of science and technology and engineering applied to the processing of foods and beverages.
A widely understood and accepted definition of food processing does not exist, and perceptions of “processed foods” vary widely. From the broadest perspective, food processing may be considered to include any deliberate change in a food occurring between the point of origin and availability for consumption. The change could be as simple as rinsing and packaging by a food manufacturer to ensure that the food is not damaged before consumer accessibility, or as complex as formulating the product with specific additives for controlling microorganisms, maintaining desired quality attributes, or providing a specific health benefit, followed by packaging that may itself play a role in microbial control or quality preservation. Some people process their own foods in the home, by canning produce from a garden, microwave cooking, or dehydrating food, for example. Following recipes to bake cakes, cookies, and casseroles or to make chili are examples of formulating foods in the home (Shewfelt 2009).
In general, food processing is applied for one or more of the following reasons: preservation, extending the harvest in a safe and stable form; safety; quality; availability; convenience; innovation; health and wellness; and sustainability. Although the private sector carries out these processes and delivers the final product to the consumer, public investment in generating the science and engineering base necessary to continue the creativity and ultimate application of new technologies is clearly warranted.
Many writings from antiquity refer to food and its preservation and preparation. Major advances in food preservation accelerated with the development of canning, which proceeded from the investigations of Nicolas Appert in France and the subsequent activities of Peter Durand in England in the early 19th century. Appert used corked glass bottles to preserve food, and Durand introduced the concept of metal cans. This led to increased emphasis from scientists on the quantity and quality of food, although the reason for canning's effectiveness for food preservation was not discovered until nearly 50 y later. Louis Pasteur reported to the French Academy of Sciences in 1864 on the lethal effect of heat on microorganisms. W. Russel of the Univ. of Wisconsin and Samuel Cate Prescott and William Lyman Underwood of the Massachusetts Inst. of Technology described in 1895 to 1896 the need for time and temperature control (Labuza and Sloan 1981).
“Mr. Appert found the art of fixing seasons; he makes spring, summer and fall live in bottles similarly to the gardener protecting his tender plants in greenhouses against the perils of the seasons.” (From the Courrier de l’Europe of February 10, 1809; Szczesniak 1992)
No period has seen such rapid advances in food and beverage processing as the 20th century (Welch and Mitchell 2000). Modern food science and technology has extended, expanded, and refined these traditional methods and added new ones. Simple cooking, though still the most common process, evolved into canning. Dehydration, once restricted to less sanitary sun drying, is now usually a highly mechanized and sanitary process. Refrigeration has evolved from cool storage to sophisticated refrigerators and freezers, and the industrial techniques of blast freezing and individual quick freezing (IQF) are less detrimental to nutritional and sensory quality (for example, taste and texture). All of these developments have contributed to increased nutritional quality, safety, variety, acceptability, and availability of foods and beverages. Many of these techniques are now combined into more effective preservation systems through “hurdle technology,” which combines treatments to create conditions that bacteria cannot overcome, such as drying plus chemical preservatives and packaging, or mild heat treatment followed by packaging and refrigerated storage (Leistner and Gould 2002).
Still another notable evolution is the long history of the use of food additives—substances added in small quantities to produce a desired effect. Of the 32 “technical effects” (functional purposes) listed by the Food and Drug Administration in the Code of Federal Regulations, 24 can be recognized in the few cookbooks and recipe compilations that have survived from more than 150 y ago.
Among the additives that were once used to produce these technical effects (Hall 1978) are
- • Pearl ash (from wood ashes) and vinegar as leavening agents.
- • Sodium silicate (water glass) for dipping eggs to preserve them.
- • Lye for hulling corn.
- • Sulfur dioxide from burning sulfur as a fumigant and preservative.
- • Unlined copper utensils for making pickles greener.
- • Saltpeter and roach alum as curing and pickling agents.
- • Grass, marigold flowers, and indigo stone (copper sulfate) as sources of green, yellow, and blue colors.
Before the days of widespread industrial production of food and before the advent of modern chemistry and toxicology, these and many other crude additives were used confidently within the family without any knowledge of the risks they presented.
In the 20th century, the development of the science of toxicology permitted the careful evaluation of the safety of substances added to food. The advent of modern chemistry permitted the detection of intentional adulteration of foods by purveyors using deceitful practices, and led to the passage and enforcement of modern food laws. Frederick Accum's “Treatise on the Adulteration of Food,” published in 1820, marked the beginning of this effort. In the United States, the Pure Food and Drugs Act of 1906 prohibited adulteration and misbranding of food, issues that continued to be addressed in the United States via federal statutes. Prior to 1958, the burden of proving that a substance posed an unacceptable risk rested with the government. In that year, the Food Additives Amendment to the 1938 Federal Food, Drug, and Cosmetic Act changed that by advancing the concept of “adulteration” and imposing on food manufacturers the task of proving prior to marketing that an additive is safe under the conditions of its intended use.
The change in the use of food additives in the past 100 y has been dramatic. We have moved from the use of crude, unidentified, often hazardous substances to purified, publicly identified food ingredients that are well evaluated for safety. Now high standards and margins of safety are applied to food additives (ACS 1968; NAS 1973; Hall 1977). Today, because of modern means of detection, intentional food adulteration in industrialized countries is considered uncommon, occurring more often in foods imported from countries without effective food safety infrastructure. Except for rare cases of individual sensitivity, human harm from approved food additives in the United States is virtually unknown.
Advances in food science and technology
Drying, canning, chemical preservation, refrigeration (including chilling and freezing), and nutrient conservation and fortification were the significant advances of the 19th and 20th centuries and permitted population growth in more developed countries. Such population growth could only occur if there was sufficient food. The industrial revolution could not have occurred without a food delivery system that allowed people to leave the farms, migrate to the cities, and engage in useful production of goods and services for society.
Among the important developments during the early part of the 20th century were the discovery of vitamins and the realization of the importance of other micronutrients such as iodine, iron, and calcium. Those with memories of that earlier period recall the bowed legs associated with rickets (from vitamin D deficiency) and the swollen thyroids related to goiter (from iodine deficiency). With the introduction of the draft just before World War II, the army discovered widespread malnutrition among young American males. This led to the foundation of the Food and Nutrition Board of the Inst. of Medicine of the Natl. Academies and also the development in 1941 of the Recommended Dietary Allowances (RDAs) for essential nutrients. The difficulty of achieving these RDAs from available foods, especially among the poor, led manufacturers to fortify common foods with vitamins and other micronutrients, beginning with iodized salt in 1924. Today, fortified foods, defined by federal Standards of Identity, include such staples as pasta, milk, butter, salt, and flour.
Technological innovations in food preservation were dependent on advances in the sciences, especially chemistry and microbiology. How these sciences and technologies are applied within each society depends on the economic, biological, cultural, and political contexts for each society. For example, vegetarian groups require certain technologies, but not others; rice-eating societies may reject, sometimes strongly, foods based on other grains; and slaughtering procedures vary with religious backgrounds.