Cereal products provide 50 % of iron and 30 % of zinc in the UK diet. However, despite this high mineral content, the bioavailability of minerals from cereals is low. This review discusses strategies to increase mineral bioavailability from cereal-based foods. Iron and zinc are localised to specific tissue structures within cereals; however, the cell walls of these structures are resistant to digestion in the human gastrointestinal tract, and therefore the bioaccessibility of these essential minerals from foods for absorption in the intestine is limited. In addition, minerals are stored in cereals bound to phytate, which is the main dietary inhibitor of mineral absorption. Recent research has focused on ways to enhance mineral bioavailability from cereals. Current strategies include disruption of plant cell walls to increase mineral release (bioaccessibility) during digestion; increasing the mineral:phytate ratio, either by increasing the mineral content through conventional breeding and/or agronomic biofortification or by reducing phytate levels; and genetic biofortification to increase the mineral content in the starchy endosperm, which is used to produce white wheat flour. While much of this work is at an early stage, these strategies have the potential to lead to cereal-based foods with enhanced nutritional qualities that could help address low mineral status in the UK and globally.
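A simple way to make the mineral:phytate relationship discussed above concrete is the phytate:mineral molar ratio widely used in the nutrition literature (a phytate:zinc molar ratio above roughly 15 is commonly taken to indicate poor zinc bioavailability). The following Python sketch is not taken from the review; the molar masses are standard, but the flour composition values are illustrative assumptions.

```python
# Minimal sketch (not from the review): phytate:mineral molar ratios for a
# flour sample. Composition values below are illustrative assumptions.

MOLAR_MASS = {          # g/mol
    "phytate": 660.04,  # phytic acid, C6H18O24P6
    "zinc": 65.38,
    "iron": 55.85,
}

def molar_ratio(phytate_mg_per_100g: float, mineral_mg_per_100g: float,
                mineral: str) -> float:
    """Phytate:mineral molar ratio; higher values imply lower bioavailability."""
    phytate_mol = phytate_mg_per_100g / MOLAR_MASS["phytate"]
    mineral_mol = mineral_mg_per_100g / MOLAR_MASS[mineral]
    return phytate_mol / mineral_mol

# Hypothetical wholemeal flour: 800 mg phytate, 2.9 mg zinc, 3.9 mg iron per 100 g
print(round(molar_ratio(800, 2.9, "zinc"), 1))   # ~27, above the ~15 threshold for zinc
print(round(molar_ratio(800, 3.9, "iron"), 1))   # ~17, above the ~1 threshold often cited for iron
```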
In this article, I take up the case of runic writing to reflect upon James Scott’s view of the nexus between writing and various forms of domination in early states, especially the use of literacy for taxation in cereal-growing societies. Scott’s theses provide interesting matter “to think with,” even when his grasp of historical detail has been found wanting. It is not controversial to grant Scott that cuneiform writing was a remarkable tool for statecraft, and exploitation, in the first states of Mesopotamia, around 3500 BC. The same is true of writing in other early states. But in the first states of Scandinavia, particularly Denmark ca. AD 500–800, writing had a more troubled relationship with the state. No evidence survives that runic writing was used to administer taxation or much else, as it was in other agrarian civilisations. It is true that the runic script was used to commemorate kings, most famously by Haraldr Blátǫnn (r. ca. 958–ca. 986). But, statistically speaking, it was more often used to aggrandize the sort of local big men who usually resisted centralized power. In this article, I survey the relationship between runic writing and administration. I consider what the Danish situation suggests about the relationship between states and writing and offer a tentative hypothesis of a short-lived attempt at runic bureaucracy around 800, which created—and quickly lost control of—a shortened variety of the runic script (the Younger Futhark).
This paper assesses the effects which the building of Hadrian's Wall had on the patterns of supply and communication from the continent. Existing systems were strengthened rather than altered, and Hadrian's reign saw the full development of ports and military installations on the North Sea and Channel coasts. Navigation to Britain and sailing conditions on various routes are discussed, comparing their importance in the transport of wine, oil, exotic plants and samian ware and the movement of military personnel. Use of the Rhône–Rhine axis is emphasised for the movement of goods from Central Gaul and the Mediterranean, but other rivers in western and north-western Gaul were of some importance, as the details of samian distribution demonstrate. Finally, non-state organisation of the acquisition and distribution of commodities supplied to the army on Hadrian's Wall is strongly favoured.
The paper explores the emergence and development of arable farming in southeastern Norway by compiling and analyzing directly dated cereals from archaeological contexts. By using summed probability distributions of radiocarbon dates and Bayesian modeling, the paper presents the first comprehensive analysis of the directly dated evidence for farming in the region. The models provide a more precise temporal resolution to the development than hitherto presented. The results demonstrate that the introduction of arable farming to southeastern Norway was a long-term development including several steps. Three different stages are pointed out as important in the process of establishing arable farming: the Early and Middle Neolithic, the Late Neolithic, and the Early Iron Age.
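As an orientation to the summed probability distribution (SPD) method referred to above, the sketch below sums per-date probability distributions on a common calendar axis. It is a simplified illustration only: real analyses calibrate each radiocarbon determination against a curve such as IntCal20 (for example with OxCal or the R package rcarbon), whereas here a normal density stands in for each calibrated date and the dates themselves are invented.

```python
# Minimal sketch of a summed probability distribution (SPD), assuming each
# 14C date has already been calibrated to a probability distribution over
# calendar years. A normal density stands in for each calibrated date purely
# for illustration; the dated cereal grains below are hypothetical.
import numpy as np

cal_bc = np.arange(-4000, -500)           # calendar years, 4000-500 BC (negative = BC)

# hypothetical directly dated cereal grains: (point estimate, uncertainty) in cal BC
dates = [(-3700, 60), (-2300, 45), (-2250, 50), (-800, 40), (-750, 35)]

def density(mu, sigma, axis):
    d = np.exp(-0.5 * ((axis - mu) / sigma) ** 2)
    return d / d.sum()                    # each date contributes unit mass

spd = sum(density(mu, sigma, cal_bc) for mu, sigma in dates)
peak_year = cal_bc[np.argmax(spd)]
print(f"SPD peaks around {abs(peak_year)} cal BC")
```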
Endosperm tissue that nourishes the embryo during seed development, upon maturity, nourishes the global population with special reference to cereal crops like maize, wheat and rice. In about 70% of the angiosperms, endosperm genome content is ‘3n’ with 2:1 (maternal:paternal) contribution, as a result of the second fertilization event. However, angiosperms evolution also documents diversity in endosperm genome content from ‘2n’ to ‘15n’, in scale with the corresponding maternal genome dosage variability (‘1n’ to ‘14n’), whereas paternal contribution is invariable. In apomicts, due to lack of fertilization, or pseudogamy (fertilization of the central cell for endosperm formation), endosperm genome dosage (m:p) has been reported to range between 1:1 and 8:3. Exceptionally, the central cell with one unreduced nucleus and fused with a reduced sperm cell, with 2:1 normal genome dosage, has been reported in Panicum. Altered genome dosage levels are reportedly correlative with eccentricities among maternal and paternal contribution to seed resource allocation. Besides endosperm ploidy variability between species of angiosperms, the present review gives an overview of the ploidy variability in endosperm cells within a seed, up to ‘690n’. In addition to genome-scale variability in the endosperm, some taxa of angiosperms exhibit chlorophyllous endosperms and some chlorophyllous embryos. Also, endosperm cell number during seed development is reported to have a strong association with grain weight at maturity. Genes underlying these traits of variability are unknown, and the present review underscores the variability and highlights the potential of the single-cell sequencing techniques towards understanding the genetic mechanisms associated with these variable traits.
The introduction of agriculture is a key defining element of the Neolithic, yet considerable debate persists concerning the nature and significance of early farming practices in north-west Europe. This paper reviews archaeobotanical evidence from 95 Neolithic sites (c. 4000–2200 cal bc) in Wales, focusing on wild plant exploitation, the range of crops present, and the significance of cereals in subsistence practices. Cereal cultivation practices in Early Neolithic Wales are also examined using cereal grain stable carbon (δ13C) and nitrogen (δ15N) isotope analysis. The Early Neolithic period witnessed the widespread uptake of cereals alongside considerable evidence for continued wild plant exploitation, notably hazelnuts and wild fruits. The possibility that wild plants and woodlands were deliberately managed or altered to promote the growth of certain plants is outlined. Small cereal grain assemblages, with little evidence for chaff and weed seeds, are common in the Early Neolithic, whereas cereal-rich sites are rare. Emmer wheat was the dominant crop in the Early Neolithic, while other cereal types were recorded in small quantities. Cereal nitrogen isotope (δ15N) values from Early Neolithic sites provided little evidence for intensive manuring. We suggest that cultivation conditions may have been less intensive when compared to other areas of Britain and Europe. In the later Neolithic period, there is evidence for a decline in the importance of cereals. Finally, the archaeobotanical and crop isotope data from this study are considered within a wider European context.
This study aimed to examine in vivo starch digestion kinetics and to unravel the mechanisms of starch-hydrolysing enzymes. Ninety pigs (23 (sd 2·1) kg body weight) were assigned to one of nine treatments in a 3×3 factorial arrangement, with starch source (barley, maize, high-amylose (HA) maize) and form (isolated, within cereal matrix, extruded) as factors. We determined starch digestion coefficients (DC), starch breakdown products and digesta retention times in four small-intestinal segments (SI1–4). Starch digestion in SI2 of pigs fed barley and maize exceeded starch digestion of pigs fed HA maize by 0·20–0·33 DC units (P<0·01). In SI3–4, barley starch was completely digested, whereas the cereal matrix of maize hampered digestion and generated 16 % resistant starch in the small intestine (P<0·001). Extrusion increased the DC of maize and HA maize starch throughout the small intestine but not that of barley (P<0·05). Up to 25 % of the residual starch in the proximal small intestine of pigs was present as glucose and soluble α(1–4) maltodextrins. The high abundance of glucose, maltose and maltotriose in the proximal small intestine indicates activity of brush-border enzymes in the intestinal lumen, which is exceeded by α-amylase activity. Furthermore, we found that in vivo starch digestion exceeded our in vitro predictions for rapidly digested starch, which indicates that the role of the stomach in starch digestion is currently underestimated. Consequently, in vivo glucose release of slowly digestible starch is less gradual than expected, which challenges the prediction quality of the in vitro assay.
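Site-specific digestion coefficients of the kind reported above are conventionally derived from the ratio of starch to an indigestible marker in feed and digesta. The abstract does not state which marker was used, so the sketch below is a generic illustration with assumed values (titanium dioxide as the marker).

```python
# Minimal sketch of how a site-specific starch digestion coefficient (DC) is
# typically derived with an indigestible marker (the marker and all values
# below are assumptions for illustration, not the study's data).
def digestion_coefficient(starch_feed: float, marker_feed: float,
                          starch_digesta: float, marker_digesta: float) -> float:
    """DC = 1 - (starch:marker in digesta) / (starch:marker in feed).

    All inputs are concentrations (e.g. g/kg dry matter)."""
    return 1 - (starch_digesta / marker_digesta) / (starch_feed / marker_feed)

# hypothetical diet: 450 g starch and 2.5 g TiO2 per kg DM;
# hypothetical SI2 digesta: 150 g starch and 5.0 g TiO2 per kg DM
dc_si2 = digestion_coefficient(450, 2.5, 150, 5.0)
print(round(dc_si2, 2))   # 0.83 -> 83 % of the starch had disappeared by SI2
```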
Diuraphis noxia (Kurdjumov), Russian wheat aphid, is one of the world's most invasive and economically important agricultural pests of wheat and barley. In May 2016, it was found for the first time in Australia, with further sampling confirming it was widespread throughout south-eastern regions. Russian wheat aphid is not yet present in New Zealand. If this pest establishes in New Zealand, it could cause serious control problems in wheat- and barley-growing regions. To evaluate whether D. noxia could establish populations in New Zealand, we used the climate modelling software CLIMEX to locate where potentially viable populations might occur. We re-parameterised the existing CLIMEX model of Hughes and Maywald (1990), improving the model fit using currently known distribution records of D. noxia, and we also considered the role of irrigation in the potential spread of this invasive insect. The updated model now fits the current known distribution better than the previous Hughes and Maywald CLIMEX model, particularly in temperate and Mediterranean areas in Australia and Europe, and in more semi-arid areas in north-western China and Middle Eastern countries. Our model also highlights new climatically suitable areas for the establishment of D. noxia not previously reported, including parts of France, the UK and New Zealand. Our results suggest that, where suitable host plants are present, Russian wheat aphid could establish in these regions. The new CLIMEX projections in the present study are useful tools to inform risk assessments and to target surveillance and monitoring efforts for identifying areas susceptible to invasion by Russian wheat aphid.
Competitive cereal cultivars are less susceptible than others to weed interference. Their characterization may provide selection criteria that can be used as guidelines to develop new, even more competitive cultivars. Root exudates are a potential means by which competitive cultivars reduce weed growth. The objectives of this study were to evaluate the effect of cereal root exudates on Brassica kaber (DC.) L. C. Wheeler growth, to isolate and characterize the allelochemical compounds released by spring cereal cultivars, and to determine if a relation exists between these allelochemicals and cultivar competitiveness. Highly competitive (HC) and less competitive (LC) cultivars of four crop kinds (Triticum aestivum L. [wheat], Avena sativa L. [oat], two- and six-rowed Hordeum vulgare L. [barley]) were selected based on previous work. Exudates from undisturbed root systems of B. kaber and cereals were collected and used in a bioassay test with B. kaber. Root exudates were analyzed for 16 common phenolic compounds using high-performance liquid chromatography (HPLC). Bioassays indicated that cereal exudates had no negative effect on B. kaber germination, but all concentrations of cereal root exudates inhibited B. kaber root and hypocotyl growth. As cereal root exudate concentration increased, B. kaber growth decreased. For each crop kind, B. kaber growth inhibition was greater with HC cultivars than with LC cultivars. The root exudates of all crop kinds and cultivars contained benzoic, caffeic, ferulic, o-coumaric, and vanillic acids as well as scopoletin. Para-hydroxybenzoic acid was found in exudates from T. aestivum, A. sativa, and two-rowed H. vulgare cultivars. Para-coumaric acid was not identified in root exudates from LC H. vulgare cultivars. Gentisic acid was produced by A. sativa and H. vulgare. Vanillic and o-coumaric acids along with scopoletin may be responsible for the allelopathic effects of H. vulgare, T. aestivum, and A. sativa cultivars. These three compounds may be useful as possible indicators of allelopathic potential of genotypes under development and thus considered for use in breeding programs.
Field surveys were conducted during 1978 and 1979 to determine the abundance and distribution of weeds in fields seeded to barley, oats, spring wheat, and mixtures of barley and oats in various proportions in the province of Prince Edward Island. Using a stratified random sampling procedure, weeds were counted in 536 fields during the 2-yr survey period. The weed flora had a large number of species that occurred at high densities, probably due to the limited herbicide use on Prince Edward Island. The average total number of species (64), number of species per field (20), and weed density (253 plants/m²) were similar among the four crop types and the five Extension Districts of the province. Only 49 of the 77 species encountered during the survey were found in 5% or more of the fields. Low cudweed, corn spurry, and common lambsquarters were the most abundant species, occurring in more than 80% of the fields at a mean density higher than 33.4 plants/m². Red sorrel, smartweed, common hempnettle, broadleaf plantain, and quackgrass were also found in 80% or more of the fields, but at a mean density from 14.4 to 16.5 plants/m². The perennials accounted for 51% of the commonly occurring species, annual and biennial broadleaf species accounted for 43%, and annual grasses were a minor group with only 6% of the species.
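The two summary statistics reported throughout this survey, frequency of occurrence and mean density, can be illustrated with a minimal sketch. The per-field counts below are invented, and it is assumed here that mean density is averaged over all surveyed fields, including those where the species was absent.

```python
# Minimal sketch (not the survey's actual code) of the two summary statistics
# reported: frequency of occurrence (% of fields containing a species) and
# mean density (plants per square metre) across fields. Counts are invented.
from statistics import mean

# plants per square metre for one species, one value per surveyed field
corn_spurry = [0, 48, 120, 35, 0, 60, 15, 210, 0, 75]

frequency = 100 * sum(1 for d in corn_spurry if d > 0) / len(corn_spurry)
density = mean(corn_spurry)          # averaged over all fields, including zeros

print(f"frequency: {frequency:.0f}% of fields, mean density: {density:.1f} plants/m^2")
```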
Field trials were conducted in 1985, 1986, and 1987 to identify effective herbicide treatments to control mayweed chamomile in winter wheat. Dicamba at 280 g ae/ha, 2,4-D at 560 g ae/ha, and bromoxynil at 560 g ai/ha did not control 11-cm-tall or larger mayweed chamomile and inconsistently controlled smaller plants. Mayweed chamomile control generally declined with increasing weed size. DPX-M6316 at 8.8 g ai/ha and DPX-L5300 at 4.4 g ai/ha controlled 89% or more mayweed chamomile. CGA-131036 at 4.9 g ai/ha controlled 95% or more mayweed chamomile. DPX-M6316, DPX-L5300, and CGA-131036 controlled 3- to 18-cm-tall mayweed chamomile similarly.
Better management of synthetic nitrogen (N) fertilizers in conventional agricultural systems laid the foundation for feeding the world's increasing population since the Green Revolution. However, excessive reliance on inorganic fertilizer has resulted in environmental degradation issues. Difficulties in soil nutrition management in organic cropping systems often result in lower and variable yields, also raising questions of sustainability. Improving nitrogen use efficiency (NUE) is thus of key importance to overcome environmental concerns in conventional systems and production limitations in organic systems. The differences in the two farming systems have impacts on crop traits and N cycles, making it difficult to enhance NUE with a single strategy. Different approaches need to be adopted to improve NUE in each system. Extensive efforts have been made to better understand mechanisms to potentially improve NUE in cereal crops under both systems. This review suggests that NUE may be improved through a combination of management practices and breeding strategies specific to the management system. Diversified crop rotations with legumes are effective practices to optimize the N cycle in both conventional and organic systems. Best Management Practices coupled with nitrification inhibitors, controlled-release products and split-application practices can reduce N loss in conventional systems. In organic systems, we need to take advantage of available N sources and adopt practices such as no-tillage, cover crops, and catch crops. Utilization of beneficial soil microorganisms is fundamental to optimizing availability of soil N. Estimation of soil organic matter mineralization using prediction models may be useful to enhance NUE if models are calibrated for target environments. Cereal crops are often bred under optimum N conditions and may not perform well under low N conditions. Thus, breeders can integrate genetic and phenotypic information to develop cultivars adapted to specific environments and cultivation practices. The proper choice and integration of strategies can synchronize N demand and supply within a system, resulting in reduced risk of N loss while improving NUE in both conventional and organic systems.
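The abstract does not define NUE formally; for orientation, the sketch below shows three agronomic indices that are commonly used in this literature (partial factor productivity, agronomic efficiency and apparent N recovery). The input values are illustrative only.

```python
# Minimal sketch of three commonly used agronomic NUE indices (not defined in
# the abstract itself; given here for orientation). Yields in kg grain/ha,
# N rates and uptake in kg N/ha; all numbers are illustrative.
def partial_factor_productivity(yield_fert, n_applied):
    return yield_fert / n_applied                       # kg grain per kg N applied

def agronomic_efficiency(yield_fert, yield_unfert, n_applied):
    return (yield_fert - yield_unfert) / n_applied      # extra grain per kg N

def apparent_recovery(uptake_fert, uptake_unfert, n_applied):
    return (uptake_fert - uptake_unfert) / n_applied    # fraction of applied N taken up

print(partial_factor_productivity(6500, 120))           # ~54 kg grain / kg N
print(agronomic_efficiency(6500, 4100, 120))            # 20 kg extra grain / kg N
print(apparent_recovery(145, 80, 120))                  # ~0.54 (54 % recovery)
```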
The environment in which a plant grows (maternal environment) can affect seed viability, germination, and dormancy. We assessed the effects of maternal environment on wild oat seed viability, germination, dormancy, and pathogen infection by collecting and analyzing wild oat seed from above and below a barley canopy at three field sites in Montana. The viability of wild oat seed collected below a crop canopy was consistently less than it was for seed from the overstory but varied among sites and years. Reductions in viability because of relative canopy position ranged from 10% to 30%. Effects of position relative to crop canopy on weed seed germination/dormancy rates varied by site and suggest that the direction and magnitude of the effects of maternal environment on dormancy depend on environmental conditions. These effects may be driven by crop competition or by changes in seed pathogen pressure or both. Seven species each of fungi and bacteria were isolated from wild oat seeds. The only fungus causing a reduction in seed viability (15%) was isolated from understory seeds, and several bacteria from both overstory and understory sources reduced seed germination. Results suggest that, in addition to the known weed-suppressive effects of using taller or earlier emerging varieties of crops, such crops can reduce weed spread through effects on weed seed demography because weeds growing beneath the crop canopy produce a reduced amount of viable seed that is less likely to germinate in the following year.
Field studies were conducted to compare the effectiveness of PRE and POST applications of a prepackaged mixture of flufenacet plus metribuzin with that of diclofop for winter wheat tolerance and control of Italian ryegrass. Additional studies investigated the effectiveness of reduced rates of flufenacet plus metribuzin applied POST to Italian ryegrass when wheat was in the spike stage. All PRE and POST applications of flufenacet plus metribuzin produced similar or greater injury to wheat and more consistent control of Italian ryegrass than PRE or POST applications of diclofop. PRE applications of flufenacet plus metribuzin controlled Italian ryegrass 73 to 77%, whereas POST applications controlled Italian ryegrass 77 to 99%. PRE applications of diclofop controlled Italian ryegrass 57%; POST application controlled Italian ryegrass 78%. Wheat injury from flufenacet plus metribuzin applications varied with application rate, cultivar, and year of application.
Determinants of the use of cereal and pulse residue for livestock feeding and soil mulching among smallholder farmers in a mixed farming system were analyzed. Crop residues (CR) are dual-purpose resources in the mixed crop–livestock systems of the Ethiopian highlands. They serve as livestock feed and inputs for soil and water conservation. They are generated predominantly from cereals and pulses. However, in the allocation of CR, soil conservation and livestock feeding are two competing uses. Identifying the determinants of the intensity of use of cereal and pulse residue may help in designing strategies for more efficient CR utilization. Data on CR generation and utilization were collected from 160 households in two highland regions of Ethiopia using a structured questionnaire. The data were analyzed using the multivariate Tobit model. Results of the study showed that farmers prefer using CR from pulses over CR from cereals for livestock feeding purposes. The proportion of CR from pulses that was used as feed was positively affected by education level of the farmer, livestock extension service, number of small ruminants and CR production from the previous season. Distance of farm plots from residences of the farm households negatively affected the proportions of cereal and pulse residue used for feed. The use of pulse residue increased significantly when the women participated in decision making on CR utilization. The proportion of cereal and pulse residue used for soil mulch was positively affected by the education level of the farmer, the distance between the homestead and the cultivated land, extension service, awareness about soil mulch, the slope of cultivated land, participation in farmer-to-farmer extension and CR generated in the preceding season. Given that pulse CR has better nutritive value than cereal CR, better utilization of CR could be achieved by maximizing the use of pulse residue as livestock feed and optimizing the use of cereal residue as soil mulch. More livestock extension on the nutritive value of pulse residue should be provided to the farmers who cultivate sloping plots. Encouraging the culture of labor exchange among the farmers could result in increased labor availability on farms, which would facilitate the transport and storage of pulse residue and increase its use as livestock feed. Increasing the awareness among farmers about the superiority of the pulse residue over cereal residue as feed and encouraging use of cereal residue as soil mulch could optimize the utilization of CR in the household.
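The multivariate Tobit model mentioned above handles the fact that residue-use proportions are censored (many households allocate none of a residue to a given use). The sketch below fits a simplified univariate, left-censored Tobit by maximum likelihood on simulated data; the covariates and coefficients are hypothetical and stand in for the household variables analysed in the study.

```python
# Minimal sketch of a univariate Tobit model (left-censored at 0) fitted by
# maximum likelihood, as a simplified stand-in for the multivariate Tobit used
# in the study; variable names, coefficients and data are hypothetical.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n),
                     rng.integers(0, 12, n),     # education (years), hypothetical
                     rng.integers(0, 2, n)])     # livestock extension contact (0/1)
beta_true = np.array([-0.2, 0.04, 0.25])
y_star = X @ beta_true + rng.normal(0, 0.3, n)   # latent share of residue fed
y = np.clip(y_star, 0, None)                     # observed share, censored at 0

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    ll_pos = stats.norm.logpdf(y, loc=xb, scale=sigma)   # uncensored observations
    ll_zero = stats.norm.logcdf(-xb / sigma)             # censored observations
    return -np.sum(np.where(y > 0, ll_pos, ll_zero))

res = optimize.minimize(neg_loglik, x0=np.zeros(X.shape[1] + 1), method="BFGS")
print("estimated coefficients:", np.round(res.x[:-1], 3))
```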
Plant macrofossils from the sites of Khao Sam Kaeo and Phu Khao Thong on the Thai-Malay Peninsula show evidence of cross-cultural interactions, particularly between India to the west and Southeast Asia to the east. Archaeobotanical analysis of various cereals, beans and other crops from these assemblages sheds light on the spread and adoption of these species for local agriculture. There is also early evidence for the trade of key commodities such as cotton. The plant remains illustrate a variety of influences and networks of contact across South and Southeast Asia during the late first millennium BC.
Plant lignans are diphenolic compounds ingested with whole grains and seeds and converted to enterolignans by the colonic microbiota. In the present study, we investigated absorption and metabolism of plant lignans and enterolignans in vivo after consumption of cereal-based diets. Six pigs fitted with catheters in the mesenteric artery and portal vein and with a flow probe attached to the portal vein, along with twenty pigs for quantitative collection of urine, were used for this study. The animals were fed bread based on wheat flour low in plant lignans and three lignan-rich breads based on whole-wheat grain, wheat aleurone flour or rye aleurone flour. Plant lignans and enterolignans in plasma were monitored daily in the fasting state after 0–3 d of lignan-rich intake, and on the 4th day of lignan-rich intake a 10-h profile was completed. Urine samples were collected after 11 d of lignan-rich diet consumption. The concentrations of plant lignans were low in the fasting state and were 1·2–2·6 nmol/l after switching from the low-lignan diet to the lignan-rich diets. However, on the profile day, the plasma concentration and quantitative absorption of plant lignans increased significantly, the latter rising from 33 nmol/h in the fasting state to 310 nmol/h 0–2·5 h after ingestion, with a gradual increase in the following periods. Quantitatively, the absorption of plant lignans across diets amounted to 7 % of ingested plant lignans, whereas the urinary excretion of plant lignans was 3 % across diets. In conclusion, there is a substantial postprandial uptake of plant lignans from cereals, suggesting that plant lignans are absorbed from the small intestine.
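Quantitative absorption in catheterised pigs of this kind is typically computed as the net portal appearance: the porto-arterial concentration difference multiplied by portal plasma flow. The sketch below illustrates that calculation with invented values, not the study's measurements.

```python
# Minimal sketch of how net portal appearance (quantitative absorption) is
# commonly derived from the porto-arterial concentration difference and portal
# plasma flow; the numbers below are illustrative, not the study's data.
def net_portal_flux(portal_nmol_l: float, arterial_nmol_l: float,
                    plasma_flow_l_h: float) -> float:
    """Absorption rate in nmol/h = (C_portal - C_arterial) * plasma flow."""
    return (portal_nmol_l - arterial_nmol_l) * plasma_flow_l_h

# hypothetical postprandial sample: 4.6 vs 1.5 nmol/l across the gut,
# with an assumed portal plasma flow of 100 l/h
print(round(net_portal_flux(4.6, 1.5, 100)))   # 310 nmol/h
```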
As in other cultivated species, dormancy can be seen as a problem in cereal production, either due to its short duration or to its long persistence. Indeed, cereal crops lacking enough dormancy at harvest can be exposed to pre-harvest sprouting damage, while a long-lasting dormancy can interfere with processes that rely on rapid germination, such as malting or the emergence of a uniform crop. Because the ancestors of cereal species evolved under very diverse environments worldwide, different mechanisms have arisen as a way of sensing an appropriate germination environment (a crucial factor for winter or summer annuals such as cereals). In addition, different species (and even different varieties within the same species) display diverse grain morphology, allowing some structures to impose dormancy in some cereals but not in others. As in seeds from many other species, the antagonism between the plant hormones abscisic acid and gibberellins is instrumental in cereal grains for the inception, expression, release and re-induction of dormancy. However, the way in which this antagonism operates is different for the various species and involves different molecular steps as regulatory sites. Environmental signals (i.e. temperature, light quality and quantity, oxygen levels) can modulate this hormonal control of dormancy differently, depending on the species. The practical implications of knowledge accumulated in this field are discussed.
In sheep production systems based on extensive grazing, neonatal mortality often reaches 15% to 20% of lambs born, and the mortality rate can be doubled in the case of multiple births. An important contributing factor is the nutrition of the mother because it affects the amount of colostrum available at birth. Ewes carrying multiple lambs have higher energy requirements than ewes carrying a single lamb and this problem is compounded by limitations to voluntary feed intake as the gravid uterus compresses the rumen. This combination of factors means that the nutritional requirements of the ewe carrying multiple lambs can rarely be met by the supply of pasture alone. This problem can be overcome by supplementation with energy during the last week of pregnancy, a treatment that increases colostrum production and also reduces colostrum viscosity, making it easier for the neonatal lamb to suck. In addition, litter size and nutrition both accelerate the decline in concentration of circulating progesterone that, in turn, triggers the onset of both birth and lactogenesis, and thus ensures the synchrony of these two events. Furthermore, the presence of colostrum in the gut of the lamb increases its ability to recognize its mother, and thus improves mother–young bonding. Most cereal grains are rich in energy in the form of starch and, when used as supplements in late pregnancy, will increase colostrum production by 90% to 185% above control (unsupplemented) values. Variation among types of cereal grain in the response they induce may be due to differences in the amount of starch digested post-ruminally. As a percentage of grain dry matter intake, the amount of starch entering the lower digestive tract is 14% for maize, 8.5% for barley and 2% for oats. Supplements of high quality protein from legumes and oleiferous seeds can also increase colostrum production but they are less effective than cereal grains. In conclusion, short-term supplementation before parturition, particularly with energy-rich concentrates, can improve colostrum production, help meet the energy and immunological requirements for new-born lambs, and improve lamb survival.
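The rumen-bypass percentages quoted above translate directly into grams of starch reaching the lower digestive tract per unit of grain fed, as the short worked example below shows; the daily intake figure is an illustrative assumption.

```python
# Worked example of the figures quoted above: grams of starch reaching the
# lower digestive tract per day, using the percentages given in the abstract
# (the daily intake level is an illustrative assumption).
bypass_pct = {"maize": 14.0, "barley": 8.5, "oats": 2.0}   # % of grain DM intake
intake_kg_dm = 0.5                                         # hypothetical 0.5 kg grain DM/day

for grain, pct in bypass_pct.items():
    grams = intake_kg_dm * 1000 * pct / 100
    print(f"{grain}: {grams:.1f} g starch/day entering the lower digestive tract")
# maize: 70.0 g, barley: 42.5 g, oats: 10.0 g per 0.5 kg grain DM fed
```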
Oats are undervalued in comparison with wheat, rice and barley, despite their unique composition, which includes many of the nutrients required for health and for a reduced risk of degenerative disease. Furthermore, oats as whole grain and some of their associated products also contain β-glucan, a complex polysaccharide that has an approved health claim for reducing blood cholesterol levels and the risk of CHD if consumed at ≥ 3 g/d. At the agronomic level, oats exhibit optimal growth in regions of moderate temperature and long day length. In addition, they can tolerate wet weather and acidic soils more effectively than other cereals, such as wheat. Studies have shown that there is diversity in the content and composition of nutrients and health-beneficial components within the available wild and cultivated germplasm, and that these are amenable to enhancement through different agronomic practices as well as being susceptible to climatic variation. Advances in modern plant genetics, developed in sister cereals such as wheat, rice and barley, mean that oat development and exploitation should accelerate in the coming decade as these approaches are adopted and applied. These advances include approaches such as genome sequencing, genotyping by sequencing and the allied next-level analytical approaches of RNA sequencing, transcriptome profiling and metabolomics. The collation and coordination of these approaches should lead to the generation of new, tailored oat varieties that are nutritionally enhanced and contain a greater proportion of health-beneficial components, which can be translated into a wider range of consumer products with the ultimate hope of associated benefits to human health and nutrition.