The production of protein from animal sources is often criticized because of the low efficiency with which plant protein from feeds is converted into protein in animal products. However, this critique does not consider that large portions of the plant-based proteins fed to animals may be human-inedible and that the quality of animal proteins is usually superior to that of plant proteins. The aim of the present study was therefore to assess changes in protein quality in the course of the transformation of potentially human-edible plant proteins into animal products via livestock production; data from 30 Austrian dairy farms were used as a case study. A second aim was to develop an approach for combining these quality changes with quantitative aspects, e.g. with the human-edible feed conversion efficiency (heFCE), defined as kilogram protein in the animal product divided by kilogram potentially human-edible protein in the feeds. Protein quality of potentially human-edible inputs and outputs was assessed using the protein digestibility-corrected amino acid score (PDCAAS) and the digestible indispensable amino acid score (DIAAS), two methods proposed by the Food and Agriculture Organization of the United Nations to describe the nutritional value of proteins for humans. Depending on the method used, protein scores were between 1.40 and 1.87 times higher for the animal products than for the potentially human-edible plant protein input at the barn-gate level (protein quality ratio, PQR). Combining the PQR of 1.87 with the heFCE for the same farms resulted in heFCE × PQR of 2.15. Thus, considering both quantity and quality, the value of the proteins in the animal products (in this case milk and beef) for human consumption is 2.15 times higher than that of the proteins in the potentially human-edible plant inputs. The results of this study emphasize the necessity of including protein quality changes resulting from the transformation of plant proteins into animal proteins when evaluating the net contribution of livestock to the human food supply. Furthermore, these differences in protein quality may also need to be considered when choosing a functional unit for assessing the environmental impacts of the production of different proteins.
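A minimal sketch of the quantity-quality combination described above, using the abstract's reported ratios (PQR = 1.87, heFCE × PQR = 2.15); the protein masses and protein scores below are hypothetical placeholders chosen only so the ratios reproduce those values.

```python
# Hedged sketch: heFCE, PQR, and their product as defined in the abstract.
# All input numbers are hypothetical, not the study's farm data.

def heFCE(product_protein_kg: float, edible_feed_protein_kg: float) -> float:
    """Human-edible feed conversion efficiency: kg protein in the animal
    product per kg potentially human-edible protein in the feeds."""
    return product_protein_kg / edible_feed_protein_kg

def PQR(product_protein_score: float, feed_protein_score: float) -> float:
    """Protein quality ratio: protein score (e.g. DIAAS) of the animal
    product divided by that of the human-edible plant protein input."""
    return product_protein_score / feed_protein_score

# Hypothetical farm-level numbers chosen so the ratios match the abstract:
efficiency = heFCE(product_protein_kg=1.15, edible_feed_protein_kg=1.0)   # = 1.15
quality_ratio = PQR(product_protein_score=1.16, feed_protein_score=0.62)  # ~1.87
print(round(efficiency * quality_ratio, 2))  # ~2.15: combined quantity-quality value
```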
When fed human-edible feeds, such as grains and pulses, dairy cows are very inefficient in transforming them into animal products. Therefore, strategies to reduce human-edible inputs in dairy cow feeding are needed to improve food efficiency. The aim of this feeding trial was to analyze the effect of the full substitution of a common concentrate mixture with a by-product concentrate mixture on milk production, feed intake, blood values, and the edible feed conversion ratio (eFCR), defined as human-edible output per human-edible input. The experiment was conducted as a change-over design, with each experimental period lasting 7 wk. Thirteen multiparous and 5 primiparous Holstein cows were randomly assigned to 1 of 2 treatments. Treatments consisted of a grass silage-based forage diet supplemented with either conventional ingredients or solely by-products from the food processing industry (BP). The BP mixture had higher contents of fiber and ether extract, whereas starch content was reduced compared with the conventional mixture. Milk yield and milk solids were not affected by treatment. The eFCR for energy and protein were about 4 and 2.7 times higher, respectively, in the BP group than in the conventional group. Blood values did not indicate negative effects on the cows' metabolic health status. Results of this feeding trial suggest that by-products could replace common concentrate supplements in dairy cow feeding, resulting in an increased eFCR for energy and protein, which emphasizes the unique role of dairy cows as net food producers.
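To make the mechanism behind the eFCR gain concrete, here is a hedged illustration of why swapping grain for by-products raises the ratio: by-products contribute almost no human-edible input to the denominator. All ingredient amounts and human-edible fractions are invented for illustration, not the trial's data.

```python
# Hedged sketch of the eFCR (human-edible output / human-edible input).
# Amounts and edible fractions below are hypothetical placeholders.

def eFCR(edible_output: float, ingredient_amounts: dict[str, float],
         edible_fraction: dict[str, float]) -> float:
    edible_input = sum(amount * edible_fraction[name]
                       for name, amount in ingredient_amounts.items())
    return edible_output / edible_input

# Hypothetical human-edible fractions: forage 0, grain high, by-products low.
edible_fraction = {"grass_silage": 0.0, "cereal_grain": 0.8, "by_products": 0.1}

milk_energy_MJ = 100.0  # hypothetical human-edible energy in milk per day
conventional = {"grass_silage": 12.0, "cereal_grain": 6.0}  # kg DM/day
by_product   = {"grass_silage": 12.0, "by_products": 6.0}   # kg DM/day

print(eFCR(milk_energy_MJ, conventional, edible_fraction))  # lower eFCR
print(eFCR(milk_energy_MJ, by_product, edible_fraction))    # several-fold higher
```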
Besides the widely discussed negative environmental effects of dairy production, such as greenhouse gas emissions, the feeding of large amounts of potentially human-edible feedstuffs to dairy cows is another important sustainability concern. The aim of this study was therefore to investigate the effects of a complete substitution of common cereal grains and pulses with a mixture of wheat bran and sugar beet pulp in a high-forage diet on cow performance, production efficiency, feed intake, and ruminating behavior, as well as on net food production potential. Thirteen multiparous and 7 primiparous mid-lactation Holstein dairy cows were randomly assigned to 1 of 2 treatments in a change-over design with 7-wk periods. Cows were fed a high-forage diet (grass silage and hay accounted for 75% of the dry matter intake), supplemented with either a cereal grain-based concentrate mixture (CON) or a mixture of wheat bran and dried sugar beet pulp (WBBP). Human-edible inputs were calculated for 2 different scenarios based on minimum and maximum potential recovery rates of human-edible energy and protein from the respective feedstuffs. Dietary starch and neutral detergent fiber contents were 3.0 and 44.1% for WBBP, compared with 10.8 and 38.2% for CON, respectively. Dietary treatment did not affect milk production, milk composition, feed intake, or total chewing activity. However, the chewing index, expressed in minutes per kilogram of neutral detergent fiber ingested, was 12% lower in WBBP than in CON. Compared with CON, the human-edible feed conversion efficiencies for energy and protein, defined as human-edible output per human-edible input, were 6.8 and 5.3 times higher, respectively, in WBBP under the maximum scenario. For the maximum scenario, the daily net food production (human-edible output minus human-edible input) increased from 5.4 MJ and 250 g of crude protein per cow in CON to 61.5 MJ and 630 g of crude protein in the WBBP diet. In conclusion, our data suggest that in forage-based dairy production systems, wheat bran and sugar beet pulp could replace common cereal grains in diets of mid-lactation dairy cows without impairing performance, while strongly increasing human-edible feed conversion efficiency and net food production.
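A minimal sketch of the net food production calculation described above (human-edible output minus human-edible input), evaluated under minimum and maximum recovery scenarios. Feed amounts and recovery rates are hypothetical placeholders, not the study's values.

```python
# Hedged sketch: net food production under two recovery-rate scenarios.

def net_food_production(edible_output: float,
                        feed_kg: dict[str, float],
                        recovery_rate: dict[str, float]) -> float:
    edible_input = sum(kg * recovery_rate[f] for f, kg in feed_kg.items())
    return edible_output - edible_input

feeds = {"wheat_bran": 3.0, "beet_pulp": 3.0}          # kg DM/cow per day (hypothetical)
min_recovery = {"wheat_bran": 0.0, "beet_pulp": 0.0}   # nothing recoverable for humans
max_recovery = {"wheat_bran": 0.2, "beet_pulp": 0.1}   # some flour/sugar recoverable

milk_energy = 100.0  # MJ human-edible energy in milk per cow per day (hypothetical)
print(net_food_production(milk_energy, feeds, min_recovery))  # = 100.0 MJ
print(net_food_production(milk_energy, feeds, max_recovery))  # = 99.1 MJ
```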
The feed value of whole crop maize silage (WCMS) depends on nutrient composition, ruminal degradability, and whole tract digestibility. However, as the ruminal degradation rate is involved in the physical regulation of feed intake, ruminal degradability of WCMS may also affect feed intake and milk production of dairy cows. Thus, the aim of this study was to examine relationships between nutrient composition, ruminal degradability, and whole tract digestibility of WCMS and feed intake and milk production of dairy cows. Nine varieties were tested in 3 consecutive years. Nutrient composition analyses included proximate analysis and determination of cell wall constituents. Whole tract digestibility was determined in vivo using wethers, and ruminal degradability was examined in situ using four rumen-fistulated steers. Feed intake and milk production were measured using nine cows per variety. Cows were fed a ration consisting of 75.0% WCMS, 8.5% hay and 16.5% soya bean meal (dry matter basis) ad libitum. Variety did not influence nutrient composition, except for the concentrations of ADF (ADFom), ADL and utilisable CP (uCP). In contrast, variety had a significant effect (P < 0.05) on ruminal degradability of NDF (aNDFom) and on whole tract digestibility of organic matter (OM) and non-fibre carbohydrates. Dry matter intake (DMI) of WCMS tended to be affected by variety (0.05 < P < 0.10), whereas no effect on energy-corrected milk production (ECM) was observed. The year of harvest had an influence on net energy for lactation and uCP concentration, ruminal degradability and whole tract digestibility of nutrients, as well as on DMI and ECM of dairy cows. Whole tract aNDFom digestibility of WCMS was positively correlated with aNDFom concentration (R² = 0.48) and whole tract OM digestibility (R² = 0.58). Furthermore, ruminal OM degradability was positively correlated with DMI (R² = 0.57) and ECM (R² = 0.49) of dairy cows. The results indicate that ruminal degradability and whole tract digestibility have a greater impact on the feed value of WCMS than nutrient composition and should therefore be the focus when optimising feed intake and milk production of dairy cows. Thus, maize breeders should include ruminal degradability and whole tract digestibility parameters in variety testing programmes to increase the informative value of variety descriptions for farmers.
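A minimal sketch of the kind of correlation analysis reported above: the coefficient of determination (R²) between ruminal OM degradability and dry matter intake across varieties. The nine data points are invented for illustration only.

```python
# Hedged sketch: R² between two variety-level traits; data are hypothetical.
import numpy as np

om_degradability = np.array([62.0, 64.5, 63.1, 66.0, 61.2,
                             65.4, 63.8, 64.9, 62.7])       # % of OM degraded
dmi = np.array([11.2, 11.9, 11.5, 12.4, 11.0,
                12.1, 11.7, 12.0, 11.4])                    # kg WCMS DM/day

r = np.corrcoef(om_degradability, dmi)[0, 1]
print(f"R² = {r**2:.2f}")  # analogous to the reported R² = 0.57 for OM degradability vs. DMI
```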
During the winter feeding period in organic dairy production systems in the alpine and pre-alpine regions of Austria and its neighboring countries, maize silage is an energy-rich forage that is regularly included in grass-silage-based diets to improve the energy supply of the cows. Italian ryegrass (Lolium multiflorum Lam.) is also a high-energy fodder grass popular as forage for dairy cows, but it is rarely cultivated in Austrian organic agriculture. The two crops differ in their cultivation demands and characteristics. Italian ryegrass establishes rapidly, may reduce the risk of soil erosion, and would be a beneficial addition to crop rotation, which is an essential tool in successful organic farming. In a 15-week feeding trial, Italian ryegrass silage and maize silage were fed to 22 lactating Holstein dairy cows. Organically produced Italian ryegrass silage and maize silage were included at a rate of 40% of dry matter (DM) in grass-silage-based mixed basal diets. The mixed basal diets were supplemented with modest amounts of additional concentrates (2.7–3.0 kg DM day⁻¹). Owing to the higher energy content of maize silage compared to Italian ryegrass silage, the maize diet provided more energy (6.3 MJ net energy for lactation (NEL) kg⁻¹ DM) than the ryegrass diet (6.15 MJ NEL kg⁻¹ DM). The protein supply of the two diets was intended to be equal, but in fact the protein content of the maize diet was significantly lower (122 g crude protein kg⁻¹ DM) than that of the ryegrass diet (141 g kg⁻¹ DM). When the maize diet was fed, feed intake, milk yield and milk protein content were significantly higher than with the ryegrass diet. In addition, intake of crude protein was significantly lower when the maize diet was fed, and in combination with the higher milk protein yield, this enabled an efficiency of gross nitrogen (N) utilization as high as 0.304. This level of N efficiency can be considered above average, and it was significantly and considerably higher than the level of 0.259 observed when the ryegrass diet was fed. Therefore, maize silage upholds its reputation as an ideal energy-rich component in grass-silage-based dairy cow diets.
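A minimal sketch of the gross N utilization efficiency referred to above: milk N output divided by N intake. The intakes and yields are hypothetical, chosen only to land near the reported efficiencies; the conversion factors (feed crude protein = 6.25 × N, milk protein = 6.38 × N) are the standard ones.

```python
# Hedged sketch: efficiency of gross nitrogen utilization.
# Feed intakes and milk yields below are hypothetical placeholders.

def gross_n_efficiency(cp_intake_g: float, milk_yield_kg: float,
                       milk_protein_g_per_kg: float) -> float:
    n_intake = cp_intake_g / 6.25                          # g N ingested per day
    milk_n = milk_yield_kg * milk_protein_g_per_kg / 6.38  # g N secreted in milk
    return milk_n / n_intake

# Hypothetical cows on the two diet types:
print(round(gross_n_efficiency(3000, 26.0, 36.0), 3))  # maize-type diet, ~0.31
print(round(gross_n_efficiency(3100, 24.0, 34.0), 3))  # ryegrass-type diet, ~0.26
```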
A rumen simulation technique was used to evaluate the effects of the complete substitution of a common concentrate mixture (CON) with a mixture consisting solely of by-products from the food industry (BP) at 2 different forage-to-concentrate ratios on ruminal fermentation profile, nutrient degradation, and abundance of rumen microbiota. The experiment was a 2×2 factorial arrangement with 2 concentrate types (CON and BP) and 2 concentrate levels (25 and 50% of diet dry matter). The experiment consisted of 2 experimental runs with 12 fermentation vessels each (n = 6 per treatment). Each run lasted for 10 d, with data collection on the last 5 d. The BP diets had lower starch but higher neutral detergent fiber (NDF) and fat contents compared with CON. Degradation of crude protein was decreased, but NDF and nonfiber carbohydrate degradation were higher for the BP diets. At the 50% concentrate level, organic matter degradation tended to be lower for BP, and CH4 formation per unit of NDF degraded was also lower for BP. The BP mixture led to a higher concentration of propionate and a lower acetate-to-propionate ratio, whereas concentrations of butyrate and caproate decreased. Concentrate type did not affect microbial community composition, except that the abundance of bacteria of the genus Prevotella was higher for BP. Increasing the concentrate level resulted in higher degradation of organic matter and crude protein. At the higher concentrate level, total short-chain fatty acid formation increased and concentrations of isobutyrate and valerate decreased. In addition, at the 50% concentrate level, numbers of protozoa increased, whereas numbers of methanogens, anaerobic fungi, and fibrolytic bacteria decreased. No interaction was noted between the 2 dietary factors for most variables, except that at the higher concentrate level the effects of BP on CH4 and CO2 formation per unit of NDF degraded, crude protein degradation, and the abundance of Prevotella were more pronounced. In conclusion, the results of this study suggest that BP in the diet can adequately substitute for CON with regard to ruminal fermentation profile and microbiota, even showing favorable fermentation patterns when fed at a 50% inclusion rate.
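A minimal sketch of how a 2×2 factorial arrangement like this one (concentrate type × concentrate level, n = 6 vessels per treatment) can be analyzed for main effects and their interaction. The response values are invented; any of the measured ruminal variables (here, the acetate-to-propionate ratio) could take their place.

```python
# Hedged sketch: two-way ANOVA for a 2x2 factorial in vitro experiment.
# The acetate-to-propionate values below are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

data = pd.DataFrame({
    "ctype":  ["CON"] * 12 + ["BP"] * 12,
    "clevel": ([25] * 6 + [50] * 6) * 2,
    "ac_pr":  [3.1, 3.0, 3.2, 3.1, 3.0, 3.1,   # CON, 25% concentrate
               2.8, 2.9, 2.7, 2.8, 2.9, 2.8,   # CON, 50% concentrate
               2.6, 2.7, 2.5, 2.6, 2.7, 2.6,   # BP, 25% concentrate
               2.1, 2.2, 2.0, 2.1, 2.2, 2.1],  # BP, 50% concentrate
})

model = ols("ac_pr ~ C(ctype) * C(clevel)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and interaction term
```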
The great variability of outcome seen in stroke patients has led to an interest in identifying predictors of outcome. The combination of clinical and imaging variables as predictors of stroke outcome in a multivariable risk adjustment model may be more powerful than either alone. The purpose of this study was to determine the multivariable relationship between infarct volume, 6 clinical variables, and 3-month outcomes in ischemic stroke patients.
Included in the study were 256 eligible patients from the Randomized Trial of Tirilazad Mesylate in Acute Stroke (RANTTAS). Six clinical variables and 1-week infarct volume were the prespecified predictor variables. The National Institutes of Health Stroke Scale, Barthel Index, and Glasgow Outcome Scale were the outcomes. Multivariable logistic regression techniques were used to develop the model equations, and bootstrap techniques were used for internal validation. Predictive performance of the models was assessed for discrimination with receiver operating characteristic (ROC) curves and for calibration with calibration curves.
The predictive models had areas under the ROC curve of 0.79 to 0.88 and demonstrated nearly ideal calibration curves. The areas under the ROC curves were statistically greater (P<0.001) with both clinical and imaging information combined than with either alone for predicting excellent recovery and death or severe disability.
Combined clinical and imaging variables are predictive of 3-month outcome in ischemic stroke patients. Demonstration of this relationship with acute clinical variables and 1-week infarct information supports future attempts to predict 3-month outcome with all acute variables.
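A minimal sketch of the modeling workflow this abstract describes: fit a multivariable logistic regression, measure discrimination with the ROC area, and internally validate with the bootstrap. The data are simulated, not RANTTAS data, and the optimism-correction shown is one common bootstrap variant (Harrell-style), which the abstract does not specify.

```python
# Hedged sketch: logistic regression + ROC AUC + bootstrap internal validation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 256
X = rng.normal(size=(n, 7))  # stand-ins for 6 clinical variables + infarct volume
y = (X @ rng.normal(size=7) + rng.normal(size=n) > 0).astype(int)  # simulated outcome

model = LogisticRegression().fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap estimate of optimism in the apparent AUC:
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)                # resample patients with replacement
    m = LogisticRegression().fit(X[idx], y[idx])
    auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(auc_boot - auc_orig)

print(f"apparent AUC {apparent_auc:.2f}, "
      f"optimism-corrected {apparent_auc - np.mean(optimism):.2f}")
```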
OBJECTIVE: To examine the degree to which variation in place of death is explained by differences in the characteristics of patients, including preferences for dying at home, and by differences in the characteristics of local health systems.
DESIGN: We drew on a clinically rich database to carry out a prospective study using data from the observational phase of the Study to Understand Prognoses and Preferences for Outcomes and Risks of Treatments (SUPPORT component). We used administrative databases for the Medicare program to carry out a national cross-sectional analysis of Medicare enrollees' place of death (Medicare component).
SETTING: Five teaching hospitals (SUPPORT); all U.S. Hospital Referral Regions (Medicare).
STUDY POPULATIONS: Patients dying after the enrollment hospitalization in the observational phase of SUPPORT for whom place of death and preferences were known. Medicare beneficiaries who died in 1992 or 1993.
MAIN OUTCOME MEASURES: Place of death (hospital vs. non-hospital).
RESULTS: In SUPPORT, most patients expressed a preference for dying at home, yet most died in the hospital. The percent of SUPPORT patients dying in-hospital varied more than 2-fold across the five SUPPORT sites (29 to 66%). For Medicare beneficiaries, the percent dying in-hospital varied from 23 to 54% across U.S. Hospital Referral Regions (HRRs). In SUPPORT, variations in place of death across sites were not explained by sociodemographic or clinical characteristics or by patient preferences. Patient-level (SUPPORT) and national cross-sectional (Medicare) multivariate models gave consistent results. The risk of in-hospital death was increased for residents of regions with greater hospital bed availability and use; the risk of in-hospital death was decreased in regions with greater nursing home and hospice availability and use. Measures of hospital bed availability and use were the most powerful predictors of place of death across HRRs.
CONCLUSIONS: Whether people die in the hospital or not is powerfully influenced by characteristics of the local health system but not by patient preferences or other patient characteristics. These findings may explain the failure of the SUPPORT intervention to alter care patterns for seriously ill and dying patients. Reforming the care of dying patients may require modification of local resource availability and provider routines.
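A minimal sketch of the kind of cross-sectional, region-level association examined above: the percent of deaths occurring in-hospital regressed on hospital bed availability across referral regions. The five invented regions below stand in for the U.S. HRRs; the numbers are hypothetical.

```python
# Hedged sketch: ecological regression of place of death on bed supply.
from scipy.stats import linregress

beds_per_1000 = [2.3, 2.8, 3.4, 4.1, 4.6]         # hospital beds per 1,000 residents
pct_in_hospital = [27.0, 33.0, 38.0, 46.0, 51.0]  # % of deaths occurring in-hospital

fit = linregress(beds_per_1000, pct_in_hospital)
print(f"each additional bed per 1,000 residents ~ {fit.slope:+.1f} percentage "
      f"points of in-hospital deaths (r = {fit.rvalue:.2f})")
```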
OBJECTIVE: To assess the accuracy and validity of Acute Physiology and Chronic Health Evaluation (APACHE) III hospital mortality predictions in an independent sample of U.S. intensive care unit (ICU) admissions.
DESIGN: Nonrandomized, observational cohort study.
SETTING: Two hundred eighty-five ICUs in 161 U.S. hospitals, including 65 members of the Council of Teaching Hospitals and 64 nonteaching hospitals.
PATIENTS: A consecutive sample of 37,668 ICU admissions during 1993 to 1996, including 25,448 admissions at hospitals with ≥400 beds and 1,074 admissions at hospitals with <200 beds.
INTERVENTIONS: None.
MEASUREMENTS AND MAIN RESULTS: We used demographic, clinical, and physiologic information recorded during ICU day 1 and the APACHE III equation to predict the probability of hospital mortality for each patient. We compared observed and predicted mortality for all admissions and across patient subgroups and assessed predictive accuracy using tests of discrimination and calibration. Aggregate hospital death rate was 12.35% and predicted hospital death rate was 12.27% (p = .541). The model discriminated between survivors and nonsurvivors well (area under the receiver operating characteristic curve = 0.89). A calibration curve showed that the observed number of hospital deaths was close to the number of deaths predicted by the model, but when tested across deciles of risk, goodness-of-fit (Hosmer-Lemeshow statistic, chi-square = 48.71, 8 degrees of freedom, p < .0001) was not perfect. Observed and predicted hospital mortality rates were not significantly (p < .01) different for 55 (84.6%) of APACHE III's 65 specific ICU admission diagnoses and for 11 (84.6%) of the 13 residual organ system-related categories. The most frequent diagnoses with significant (p < .01) differences between observed and predicted hospital mortality rates included acute myocardial infarction, drug overdose, nonoperative head trauma, and nonoperative multiple trauma.
CONCLUSIONS: APACHE III accurately predicted aggregate hospital mortality in an independent sample of U.S. ICU admissions. Further improvements in calibration can be achieved by more precise disease labeling, improved acquisition and weighting of neurologic abnormalities, adjustments that reflect changes in treatment outcomes over time, and a larger national database. (Crit Care Med 1998; 26:1317-1326)
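A minimal sketch of the decile-of-risk calibration test referenced above (the Hosmer-Lemeshow statistic): group patients by predicted mortality, then compare observed and expected deaths within each group. Predicted risks and outcomes below are simulated, not APACHE III data.

```python
# Hedged sketch: Hosmer-Lemeshow goodness-of-fit across deciles of risk.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
p_pred = rng.uniform(0.01, 0.6, 5000)  # simulated predicted hospital mortality
died = rng.binomial(1, p_pred)         # simulated observed hospital outcomes

deciles = np.quantile(p_pred, np.linspace(0, 1, 11))
groups = np.digitize(p_pred, deciles[1:-1])  # decile index 0..9 per patient

hl = 0.0
for g in range(10):
    mask = groups == g
    obs, exp, n_g = died[mask].sum(), p_pred[mask].sum(), mask.sum()
    hl += (obs - exp) ** 2 / (exp * (1 - exp / n_g))  # (O-E)^2 / (n*pbar*(1-pbar))

print(f"H-L chi-square = {hl:.1f}, p = {chi2.sf(hl, df=8):.3f}")  # df = groups - 2
```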