
    Palatability of New Zealand Grass-Finished and American Grain-Finished Beef Strip Steaks of Varying USDA Quality Grades and Wet-Aging Treatments

    The objective of this study was to evaluate palatability of strip loin steaks from grass- and grain-fed beef across 5 United States Department of Agriculture (USDA) quality grades and 3 wet-aging periods. Beef strip loins (N=200; 20/USDA quality grade×fed cattle type) representing 5 USDA quality grades (USDA Prime, Top Choice [Average and High Choice], Low Choice, Select, and Standard) and 2 fed cattle types (New Zealand grass-finished and U.S. grain-finished) were used in the study. Each strip loin was portioned into equal thirds and randomly assigned to one of 3 wet-aging periods (7 d, 21 d, or 42 d). Consumer panelists (N=600; 120/location: Texas, California, Florida, Kansas, and Pennsylvania) evaluated 8 grilled beef steak samples for palatability traits, acceptability, and eating quality. All palatability traits were impacted by the interaction of diet×quality grade (P<0.05). Although similar (P>0.05) to grass-fed Prime steaks for juiciness, tenderness, and overall liking, grain-fed Prime steaks rated higher (P<0.05) than all other grass- and grain-finished treatments for all palatability attributes. Grass-finished Top Choice, Low Choice, and Standard steaks rated higher (P<0.05) than the respective grain-finished quality grades for juiciness and tenderness. Grain-finished Standard steaks rated lower (P<0.05) than all other grass- and grain-finished treatments for juiciness, tenderness, and overall liking but were similar (P>0.05) to grass-finished Standard steaks for flavor liking. Our results indicate that beef strip loin steaks of similar quality grades from New Zealand grass-finished cattle produce eating experiences similar to those from U.S. grain-finished beef, even following extended postmortem aging. This suggests that consumers can expect palatability to track marbling, regardless of grass- or grain-finishing diet.
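
    To make the reported diet × quality grade interaction concrete, below is a minimal sketch of a two-way ANOVA of the kind used for such comparisons, written against simulated consumer ratings; the rating scale, cell means, and sample sizes are illustrative assumptions, not the study's data or analysis code.

```python
# Illustrative only: simulated consumer ratings, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
grades = ["Standard", "Select", "Low Choice", "Top Choice", "Prime"]

rows = []
for diet in ["grass", "grain"]:
    for g, grade in enumerate(grades):
        base = 60 + 4 * g                        # higher grade -> higher rating
        bump = 4 if (diet == "grain" and grade == "Prime") else 0
        for r in rng.normal(base + bump, 8, size=20):
            rows.append({"diet": diet, "grade": grade, "overall_liking": r})

df = pd.DataFrame(rows)
model = ols("overall_liking ~ C(diet) * C(grade)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))           # F-test for the diet:grade term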

    Flavor Development of Ground Beef from 3 Muscles, 3 USDA Quality Grades, and 2 Wet-Aging Durations

    The objective of this study was to understand the influence of USDA quality grade, muscle, and aging duration on ground beef flavor development. Prime (PR), Low Choice, and Standard quality grade beef subprimals were collected and aged for either 21 or 42 d. Following aging, subprimals were fabricated into gluteus medius (GM), biceps femoris (BF), and serratus ventralis (SV), then ground and formed into patties. Raw patties were designated for proximate composition, fractionated fatty acids, and thiobarbituric acid reactive substances (TBARS). Cooked patties were designated for consumer sensory analysis, volatile compound analysis, and TBARS. Patties were cooked on a preheated griddle to 72°C. All data were analyzed as a split-split plot where quality grade served as the whole-plot factor, muscle as the subplot factor, and aging duration as the sub-subplot factor. Significance was determined at P<0.05. A quality grade×muscle interaction was observed for moisture, where regardless of muscle, PR subprimals had the lowest moisture percentage (P<0.05). Raw TBARS was not influenced by any interactions or main effects (P>0.05). Individually, the BF and 42 d aged subprimals had the greatest cooked malondialdehyde concentration (P<0.05). Patties from GM aged for 21 d were rated higher for flavor liking compared with GM aged for 42 d and SV aged for 21 and 42 d (P<0.05). GM patties aged for 21 d were rated higher for overall liking compared with GM patties aged for 42 d (P<0.05). Quality grade did not influence any lipid-derived volatile compounds (P>0.05). The SV produced fewer Maillard reaction products (P<0.05). Aging for 42 d increased lipid-derived volatiles (P<0.05). Consumer liking of aged product is dependent on muscle. Aging recommendations should therefore be muscle-specific to maximize the beef eating experience.
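
    The split-split-plot analysis described above can be approximated as a linear mixed model with nested random intercepts standing in for the whole-plot and subplot error terms. The sketch below simulates that structure; the unit labels, effect sizes, and the choice of statsmodels MixedLM are assumptions for illustration, not the authors' code.

```python
# Illustrative split-split-plot structure fit as a mixed model; all data
# below are simulated for demonstration, not the study's measurements.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for grade in ["Prime", "Low Choice", "Standard"]:       # whole-plot factor
    for carcass in range(6):                            # whole-plot replicates
        wp = rng.normal(0, 1.0)                         # whole-plot error
        for muscle in ["GM", "BF", "SV"]:               # subplot factor
            sp = rng.normal(0, 1.0)                     # subplot error
            for aging in ["21d", "42d"]:                # sub-subplot factor
                mean = 6 + (0.5 if muscle == "GM" and aging == "21d" else 0)
                rows.append({
                    "grade": grade,
                    "unit": f"{grade}-{carcass}",              # whole-plot unit
                    "muscle": muscle,
                    "sp_unit": f"{grade}-{carcass}-{muscle}",  # subplot unit
                    "aging": aging,
                    "flavor_liking": mean + wp + sp + rng.normal(0, 1.0),
                })

df = pd.DataFrame(rows)
# Whole-plot units as groups; subplot units as a nested random intercept.
m = smf.mixedlm(
    "flavor_liking ~ C(grade) * C(muscle) * C(aging)",
    df,
    groups="unit",
    vc_formula={"sp_unit": "0 + C(sp_unit)"},
).fit()
print(m.summary())
```

    Here the random intercepts for `unit` and `sp_unit` play the role of the whole-plot and subplot error strata in the classical split-split-plot ANOVA.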

    Flavor, Tenderness, and Related Chemical Changes of Aged Beef Strip Loins

    Varying aging times and methods were evaluated for their effects on flavor, tenderness, and related changes in volatile compounds and flavor precursors. Strip loin sections from USDA Choice beef carcasses (n = 38) were randomly assigned to treatments: (1) 3 d wet-aged, (2) 14 d wet-aged, (3) 28 d wet-aged, (4) 35 d wet-aged, (5) 49 d wet-aged, (6) 63 d wet-aged, (7) 21 d dry-aged, and (8) 14 d wet-aged followed by 21 d dry-aged. Samples were analyzed for trained sensory attributes, shear force, volatile compounds, and flavor precursors (fatty acids, free amino acids, and sugars). Discriminant function analysis was used to identify the sensory attributes contributing most to treatment differences. Flavor notes were not differentiated in beef aged up to 35 d, regardless of aging method. A shift in flavor occurred between 35 d and 49 d of wet-aging that was characterized by more intense sour and musty/earthy notes. Both shear force assessment and trained panelists agreed that tenderness was not affected (P > 0.05) by additional aging beyond 28 d. Volatile compound production and liberation of amino acids and sugars increased (P < 0.01) as aging time progressed, with no change (P > 0.05) in fatty acid composition, which may be a result of metabolic processes such as microbial metabolism. Chemical properties shared strong positive relationships (r > 0.50, P < 0.001) with sour, musty/earthy, and overall tenderness. These results substantiate the deteriorative effect of extended aging times of 49 d or greater on the flavor of beef strip loins, without tenderness improvement.
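
    As a sketch of how discriminant function analysis flags the attributes that separate treatments, the following applies scikit-learn's LinearDiscriminantAnalysis to simulated trained-panel scores; the attribute names, treatment shifts, and panel sizes are invented for demonstration.

```python
# Hypothetical sketch: discriminant function analysis on simulated trained
# sensory scores, to see which attributes drive treatment separation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
attributes = ["sour", "musty_earthy", "beefy", "tenderness"]
treatments = ["wet_3d", "wet_35d", "wet_49d", "wet_63d"]

X_parts, y = [], []
for i, trt in enumerate(treatments):
    # Longer aging shifts sour and musty/earthy upward in this simulation.
    shift = np.array([0.4 * i, 0.4 * i, 0.0, 0.1 * i])
    X_parts.append(rng.normal(5 + shift, 0.5, size=(20, len(attributes))))
    y += [trt] * 20

X = np.vstack(X_parts)
lda = LinearDiscriminantAnalysis(n_components=2).fit(X, y)

# Large absolute weights on the first discriminant function mark the
# attributes contributing most to treatment separation.
for name, w in zip(attributes, lda.scalings_[:, 0]):
    print(f"{name:13s} {w:+.2f}")
```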

    Acute presentation of a heterotopic pregnancy following spontaneous conception: a case report

    Spontaneous heterotopic pregnancy is a rare clinical condition in which intrauterine and extrauterine pregnancies occur at the same time. It can be life-threatening, and the diagnosis is easily overlooked. We present the case of a 40-year-old patient who was treated for a heterotopic pregnancy. Because of a previous ectopic pregnancy she underwent transvaginal ultrasound, and the intrauterine gestational sac that was seen provided false reassurance. The patient presented acutely with a ruptured tubal pregnancy, which was managed laparoscopically; the ectopic pregnancy had not been suspected at her initial presentation. A high index of suspicion is needed in women with risk factors for an ectopic pregnancy, and in low-risk women who have free fluid, with or without an adnexal mass, alongside an intrauterine gestation.

    Metaphylactic antimicrobial effects on occurrences of antimicrobial resistance in Salmonella enterica, Escherichia coli and Enterococcus spp. measured longitudinally from feedlot arrival to harvest in high-risk beef cattle

    Aims: Our objective was to determine how injectable antimicrobials affected populations of Salmonella enterica, Escherichia coli and Enterococcus spp. in feedlot cattle. Methods and Results: Two arrival date blocks of high-risk crossbred beef cattle (n = 249; mean BW = 244 kg) were randomly assigned to one of four antimicrobial treatments administered on day 0: sterile saline control (CON), tulathromycin (TUL), ceftiofur (CEF) or florfenicol (FLR). Faecal samples were collected on days 0, 28, 56, 112, 182 and study end (day 252 for block 1 and day 242 for block 2). Hide swabs and subiliac lymph nodes were collected the day before and the day of harvest. Samples were cultured for antimicrobial-resistant Salmonella, Escherichia coli and Enterococcus spp. The effect of treatment varied by day across all targeted bacterial populations (p ≤ 0.01) except total E. coli. Total E. coli counts were greatest on days 112, 182 and study end (p ≤ 0.01). Tulathromycin resulted in greater counts and prevalence of Salmonella from faeces than CON at study end (p ≤ 0.01). Tulathromycin and CEF yielded greater Salmonella hide prevalence and greater counts of 128ERYR E. coli at study end than CON (p ≤ 0.01). No faecal Salmonella resistant to tetracyclines or third-generation cephalosporins were detected. Ceftiofur was associated with greater counts of 8ERYR Enterococcus spp. at study end (p ≤ 0.03). By the day before harvest, antimicrobial use had not increased prevalence or counts for any other bacterial population compared with CON (p ≥ 0.13). Conclusions: Antimicrobial resistance (AMR) in feedlot cattle is not caused solely by use of a metaphylactic antimicrobial on arrival but is more likely driven by a multitude of environmental and management factors.
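
    A minimal sketch of the kind of longitudinal summary behind these comparisons, assuming fabricated counts of positive samples by treatment and sampling day (none of the numbers below come from the study):

```python
# Illustrative only: fabricated sample counts, not the study's data.
import pandas as pd

samples = pd.DataFrame({
    "treatment": ["CON", "CON", "TUL", "TUL", "CEF", "CEF", "FLR", "FLR"],
    "day": [0, 252, 0, 252, 0, 252, 0, 252],
    "n_sampled": [60, 58, 62, 60, 61, 59, 60, 57],
    "n_positive": [4, 5, 3, 14, 5, 9, 4, 6],   # e.g., Salmonella-positive
})

# Prevalence of positive samples within each treatment x day cell.
samples["prevalence"] = samples["n_positive"] / samples["n_sampled"]
print(samples.pivot(index="treatment", columns="day",
                    values="prevalence").round(2))
```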

    Palatability Characterization of Fresh and Dry-Aged Ground Beef Patties

    Descriptive trained sensory attributes, fatty acids, and volatile compounds were determined to characterize the effects of dry-aging on ground beef. Beef shoulder clods were ground to produce 100% fresh beef, 100% dry-aged beef, and a 50% fresh and 50% dry-aged ground beef blend. Samples composed of 100% dry-aged beef were rated greatest (P < 0.001) for browned/grilled, earthy/mushroom, and nutty/roasted-nut flavors; however, panelists also detected greater (P ≤ 0.011) incidences of sour/acidic and bitter flavors. The addition of dry-aged beef increased (P < 0.001) hardness and reduced (P < 0.001) tenderness. Dry-aging also caused a shift in saturated fatty acids, where shorter-chain saturated fatty acids (≤ 16:0) were reduced (P ≤ 0.034) relative to stearic acid (18:0). Meanwhile, increases of trans-octadecenoic acid (18:1 trans) and decreases of cis monounsaturated fatty acids were present in dry-aged beef. Concentrations of 18:2 conjugated linoleic isomers were greatest (P < 0.001) in fresh beef and decreased with the incorporation of dry-aged beef. Several lipid-derived volatile compounds were greater (P < 0.05) in dry-aged beef compared with fresh beef, implying a greater degree of lipid degradation in dry-aged beef. Increases (P ≤ 0.031) were also determined for 3- and 2-methylbutanal with the addition of dry-aged beef. Intermediates of the Maillard reaction, 2,3-butanedione and acetoin, were greatest (P ≤ 0.046) in dry-aged beef. Alterations of fatty acids and volatile compounds with dry-aging were related to the intensity of individual flavor attributes. Overall, it may be concluded that inclusion of dry-aged beef alters the flavor profile through changed fatty acid profiles and flavor-related compounds. These results support the idea that dry-aging may be utilized to impart an altered ground beef flavor experience.
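
    One simple way to relate chemistry to flavor intensity, as this study does, is a correlation matrix between compound concentrations and sensory scores. The sketch below simulates such a relationship; the compound, the attributes, and all values are placeholders, not measurements from the study.

```python
# Hedged sketch: Pearson correlations between a flavor-related compound and
# trained sensory intensities. Simulated, placeholder data throughout.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 30                                           # hypothetical sample count
diacetyl = rng.normal(1.0, 0.3, n)               # 2,3-butanedione, arbitrary units
nutty = 3 + 2.0 * diacetyl + rng.normal(0, 0.5, n)  # built-in positive link
sour = rng.normal(4, 1, n)                       # unrelated in this simulation

df = pd.DataFrame({"diacetyl": diacetyl, "nutty": nutty, "sour": sour})
print(df.corr(method="pearson").round(2))
```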

    Understanding the Impact of Oven Temperature and Relative Humidity on the Beef Cooking Process

    The objective of this study was to evaluate the roles that cooking rate and relative humidity have on the sensory development of beef strip steaks. Thirty USDA Choice beef strip loins were collected from a commercial packing facility. Each strip loin was cut into steaks and randomly assigned to 1 of 6 cooking methods utilizing 2 oven temperatures (80°C and 204°C) and 3 levels of relative humidity [zero (ZH), mid (MH), and high (HH)]. Cooked steaks were used to evaluate internal and external color, Warner-Bratzler and slice shear force, total collagen content, protein denaturation, and trained sensory ratings. Relative humidity greatly reduced cooking rate, especially at 80°C. Cooking treatment also affected (P < 0.05) internal and external color and total collagen content, with steaks cooked at 80°C-ZH differing most in surface color (P < 0.01). Increased (P = 0.02) sarcoplasmic protein denaturation was observed with ZH and MH, while increased (P = 0.02) actin denaturation was observed only with ZH. Oven temperature did not influence (P > 0.05) protein denaturation. Trained panelists rated steaks cooked at 80°C the most tender (P < 0.05), and humidity did not affect (P > 0.05) juiciness at 204°C; however, MH and HH produced a juicier (P < 0.01) steak when cooked at 80°C. Humidity hindered (P < 0.01) the development of beefy/brothy and brown/grilled flavors but increased (P = 0.01) metallic/bloody intensity. Lower oven temperatures and moderate levels of humidity could be utilized to maximize tenderness, while minimally affecting flavor development.
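
    Why an 80°C oven cooks so much more slowly than a 204°C oven can be illustrated with a toy lumped-capacitance model (not the study's method), in which internal temperature approaches oven temperature exponentially; the rate constant k below is an arbitrary assumption that, in practice, would fold in humidity and air movement.

```python
# Toy model only: T(t) = T_oven + (T0 - T_oven) * exp(-k * t), solved for
# the time at which the internal temperature reaches the 72°C endpoint.
import math

def time_to_temp(t_oven, t0=5.0, t_target=72.0, k=0.02):
    """Minutes to reach t_target °C, assuming exponential approach to t_oven."""
    if t_target >= t_oven:
        raise ValueError("target temperature is never reached")
    return math.log((t0 - t_oven) / (t_target - t_oven)) / k

for t_oven in (80.0, 204.0):
    print(f"{t_oven:5.0f} °C oven -> {time_to_temp(t_oven):6.1f} min (toy model)")
```

    With k = 0.02 per minute, the toy model predicts a roughly fivefold longer cook at 80°C than at 204°C, consistent with the large cooking-rate differences reported above.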

    Influence of Maternal Carbohydrate Source (Concentrate-Based vs. Forage-Based) on Growth Performance, Carcass Characteristics, and Meat Quality of Progeny

    The objective of this research was to investigate the influence of maternal prepartum dietary carbohydrate source on growth performance, carcass characteristics, and meat quality of offspring. Angus-based cows were assigned to either a concentrate-based diet or a forage-based diet during mid- and late-gestation. A subset of calves was selected for evaluation of progeny performance. Dry matter intake (DMI), body weight (BW), average daily gain (ADG), gain to feed (G:F), and ultrasound measurements (muscle depth, back fat thickness, and intramuscular fat) were assessed during the feeding period. Carcass measurements were recorded, and striploins were collected for Warner-Bratzler shear force (WBSF), trained sensory panel, crude fat determination, and fatty acid profile. Maternal dietary treatment did not influence (p > 0.05) offspring BW, DMI, ultrasound measurements, percent moisture, crude fat, WBSF, or consumer sensory responses. The forage treatment tended to have decreased (p = 0.06) 12th-rib backfat compared with the concentrate treatment and tended to have lower (p = 0.08) yield grades. The concentrate treatment had increased (p < 0.05) a* and b* values compared with the forage treatment. These data suggest that the variation in maternal diets applied in this study during mid- and late-gestation has limited influence on progeny performance.