146 research outputs found

    Chewing activity, metabolic profile and performance of high-producing dairy cows fed conventional forages, wheat straw or rice straw

    In this study, production and physiological responses of high-producing dairy cows fed wheat straw (WS) or rice straw (RS), as a partial forage replacement for the conventional forages lucerne hay (LH) and maize silage (MS), were investigated. The straws were treated under dry alkaline conditions (adjusted to pH ~12) and then ensiled. Twelve lactating Holstein cows were used in a replicated (n = 4) 3 × 3 Latin square design experiment with three 21-day periods. Cows were offered one of three diets that differed in their forage sources: 1) 20% LH and 20% MS (control); 2) 12.8% LH, 12.8% MS and 12.8% WS; and 3) 12.8% LH, 12.8% MS and 12.8% RS. Diet 1 contained 60% concentrate, and diets 2 and 3 contained 61.6% concentrate. Diets were iso-nitrogenous and iso-energetic. Supplemental buffer (NaHCO3) was omitted from the straw diets; however, the straw diets contained more sodium and a greater dietary cation-anion difference (DCAD) than the control diet. Cows fed WS had significantly greater apparent dry matter (DM) digestibility (69.7 versus 63.9%) and neutral detergent fibre (NDF) digestibility (55.4 versus 42.4%) than cows fed the control. Additionally, feeding either WS or RS significantly increased dry matter intake (DMI) (27.5 versus 25.6 kg/d) and milk production (48.4 versus 45.6 kg/d) compared with the control, but milk components were unaffected by treatment. Plasma mineral and metabolite concentrations and ruminal, urinary and faecal pH were similar across treatments. Feeding WS and RS reduced time spent chewing per kg of DMI compared with the control (P = 0.01). Although there were no significant differences in performance between WS and RS, nutrient digestibility (DM, OM and CP) was significantly higher, and total chewing time lower, for the WS diet than for the RS diet. Partial inclusion of dry treated straw in lactating-cow diets (12.8% of DM) increased sodium and DCAD levels and improved digestibility, DMI and milk yield without negative effects. Keywords: cation-anion difference, cereal straw, dietary sodium, lactating cows
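
    The dietary cation-anion difference (DCAD) referred to above is conventionally computed as (Na + K) − (Cl + S) in mEq per kg of dietary dry matter, with each mineral converted from its percentage of DM using its equivalent weight. The short sketch below illustrates that standard calculation; the example diet composition is hypothetical and not taken from this study.

        # Illustrative sketch only (not from the paper): conventional DCAD calculation,
        # DCAD = (Na + K) - (Cl + S) in mEq per kg of dietary dry matter.
        EQUIV_WEIGHT_G = {"Na": 22.99, "K": 39.10, "Cl": 35.45, "S": 32.06 / 2}  # g per equivalent

        def dcad_meq_per_kg_dm(pct_of_dm):
            """pct_of_dm: dict of mineral contents expressed as % of dietary dry matter."""
            # % of DM -> mEq/kg DM:  (% / 100) * 1e6 mg/kg / (mg per mEq) = % * 10_000 / equivalent weight
            meq = {el: pct_of_dm[el] * 10_000 / EQUIV_WEIGHT_G[el] for el in EQUIV_WEIGHT_G}
            return (meq["Na"] + meq["K"]) - (meq["Cl"] + meq["S"])

        # hypothetical diet composition, for illustration only
        print(round(dcad_meq_per_kg_dm({"Na": 0.45, "K": 1.40, "Cl": 0.35, "S": 0.22}), 1))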

    Differences in maladaptive schemas between patients suffering from chronic and acute posttraumatic stress disorder and healthy controls

    War, as a stressor event, has a variety of acute and chronic negative consequences, such as posttraumatic stress disorder (PTSD). In this context, early maladaptive schema-based problems in PTSD have recently become an important research area. The aim of this study was to assess early maladaptive schemas in patients with acute and chronic PTSD. Using available sampling methods and diagnostic criteria, 30 patients with chronic PTSD, 30 patients with acute PTSD, and 30 normal military personnel matched for age and wartime experience were selected and assessed with the Young Schema Questionnaire-Long Form, the Beck Depression Inventory-II (BDI-II), the Beck Anxiety Inventory (BAI), and the Impact of Events Scale (IES). Both acute and chronic PTSD patients, when compared with normal military personnel, had higher scores on all early maladaptive schemas. Additionally, veterans suffering from chronic PTSD, compared with veterans suffering from acute PTSD and veterans without PTSD, reported more impaired schemas related, for instance, to Self-Control, Social Isolation, and Vulnerability to Harm and Illness. The results of the present study have significant preventative, diagnostic, clinical, research, and educational implications with respect to PTSD.

    Smoothness-Increasing Accuracy-Conserving (SIAC) filtering and quasi-interpolation: A unified view

    Filtering plays a crucial role in postprocessing and analyzing data in scientific and engineering applications. Various application-specific filtering schemes have been proposed based on particular design criteria. In this paper, we focus on establishing the theoretical connection between quasi-interpolation and a class of kernels (based on B-splines) that are specifically designed for the postprocessing of the discontinuous Galerkin (DG) method, called Smoothness-Increasing Accuracy-Conserving (SIAC) filtering. SIAC filtering, as the name suggests, aims to increase the smoothness of the DG approximation while conserving the inherent accuracy of the DG solution (superconvergence). The superconvergence properties of SIAC filtering have been studied in the literature. In this paper, we present theoretical results that establish the connection between SIAC filtering and long-standing concepts in approximation theory such as quasi-interpolation and polynomial reproduction. This connection bridges the gap between the two related disciplines and provides a decisive advancement in designing new filters and in the mathematical analysis of their properties. In particular, we derive a closed-form expression for the convolution of SIAC kernels with polynomials. We also compare and contrast cardinal spline functions, as an example of filters designed for image-processing applications, with SIAC filters of the same order, and study their properties.
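
    As an illustration of the polynomial-reproduction idea mentioned above (a sketch under assumptions, not code from the paper), the following Python snippet constructs a symmetric one-dimensional SIAC-style kernel as a linear combination of 2k+1 shifted central B-splines of degree k. The coefficients are chosen so that the kernel has unit mass and vanishing moments up to order 2k, which is equivalent to reproducing polynomials of degree at most 2k under convolution.

        # Minimal sketch: coefficients of a symmetric SIAC-style kernel
        #   K(x) = sum_{g=-k}^{k} c_g * psi(x - g),
        # where psi is the central B-spline of degree k, and the c_g are fixed by the
        # moment conditions  int K = 1,  int x^m K = 0  for m = 1..2k.
        import numpy as np
        from scipy.interpolate import BSpline
        from scipy.integrate import quad

        def central_bspline(degree):
            # knots of the centred cardinal B-spline of the given degree
            knots = np.arange(degree + 2) - (degree + 1) / 2.0
            return BSpline.basis_element(knots), knots

        def siac_coefficients(k):
            psi, knots = central_bspline(k)       # B-spline of degree k (order k+1)
            shifts = np.arange(-k, k + 1)         # 2k+1 integer shifts
            n = 2 * k + 1
            A = np.zeros((n, n))
            for m in range(n):                    # moment index 0..2k
                for j, g in enumerate(shifts):
                    lo, hi = knots[0] + g, knots[-1] + g   # support of the shifted spline
                    A[m, j] = quad(lambda x: x**m * psi(x - g), lo, hi)[0]
            rhs = np.zeros(n)
            rhs[0] = 1.0                          # unit mass, vanishing higher moments
            return np.linalg.solve(A, rhs), shifts

        if __name__ == "__main__":
            c, shifts = siac_coefficients(k=1)
            print(dict(zip(shifts, np.round(c, 6))))   # expected ~(-1/12, 7/6, -1/12)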

    Hexagonal Smoothness-Increasing Accuracy-Conserving Filtering

    Discontinuous Galerkin (DG) methods are a popular class of numerical techniques for solving partial differential equations due to their high order of accuracy. However, the inter-element discontinuity of a DG solution hinders its utility in various applications, including visualization and feature extraction. This shortcoming can be alleviated by postprocessing DG solutions to increase the inter-element smoothness. One class of postprocessing techniques proposed to increase the inter-element smoothness is SIAC filtering. In addition to increasing the inter-element continuity, SIAC filtering also raises the convergence rate from order k+1 to order 2k+1. Since the introduction of SIAC filtering for univariate hyperbolic equations by Cockburn et al. (Math Comput 72(242):577–606, 2003), many generalizations of SIAC filtering have been proposed. Recently, the idea of dimensionality reduction through rotation has been the focus of studies in which a univariate SIAC kernel has been used to postprocess a two-dimensional DG solution (Docampo-Sánchez et al. in Multi-dimensional filtering: reducing the dimension through rotation, 2016. arXiv preprint arXiv:1610.02317). However, the theoretical development of multidimensional SIAC filters has never gone beyond the use of tensor-product multidimensional B-splines or the reduction of the filter dimension. In this paper, we define a new SIAC filter called hexagonal SIAC (HSIAC) that uses a nonseparable class of two-dimensional spline functions called hex splines. In addition to relaxing the separability assumption, the proposed HSIAC filter provides more symmetry than its tensor-product counterpart. We prove that the superconvergence property holds for a specific class of structured triangular meshes using HSIAC filtering and provide numerical results to demonstrate and validate our theoretical results.

    The minimum-error discrimination via Helstrom family of ensembles and Convex Optimization

    Using the convex optimization method and the Helstrom family of ensembles introduced in Ref. [1], we have discussed optimal ambiguous discrimination in qubit systems. We have analyzed the problem of the optimal discrimination of N known quantum states and have obtained the maximum success probability and optimal measurement for N known quantum states with equiprobable prior probabilities, equidistant from the center of the Bloch ball, not all of which lie on one half of the Bloch ball, and with all of the conjugate states pure. An exact solution has also been given for arbitrary three known quantum states. Examples using our method include: 1. Diagonal N mixed states; 2. N equiprobable states equidistant from the center of the Bloch ball whose Bloch vectors are inclined at equal angles from the z axis; 3. Three mirror-symmetric states; 4. States prepared with equal prior probabilities on the vertices of a Platonic solid. Keywords: minimum-error discrimination, success probability, measurement, POVM elements, Helstrom family of ensembles, convex optimization, conjugate states. PACS Nos: 03.67.Hk, 03.65.Ta. Comment: 15 pages
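
    For context, minimum-error discrimination of N known states with given priors can also be solved numerically as a semidefinite programme: maximise the sum of p_i Tr(Pi_i rho_i) over POVM elements Pi_i >= 0 that sum to the identity. The sketch below is an assumed numerical setup using cvxpy, not the paper's analytical Helstrom-family method; the example trine states are hypothetical inputs.

        # Minimal sketch: minimum-error discrimination of qubit states as an SDP.
        import numpy as np
        import cvxpy as cp

        def bloch_to_rho(r):
            """Density matrix of a qubit with Bloch vector r = (x, y, z)."""
            sx = np.array([[0, 1], [1, 0]], dtype=complex)
            sy = np.array([[0, -1j], [1j, 0]])
            sz = np.array([[1, 0], [0, -1]], dtype=complex)
            return 0.5 * (np.eye(2) + r[0] * sx + r[1] * sy + r[2] * sz)

        def min_error_discrimination(rhos, priors):
            n, d = len(rhos), rhos[0].shape[0]
            povm = [cp.Variable((d, d), hermitian=True) for _ in range(n)]
            constraints = [E >> 0 for E in povm]          # positive semidefinite POVM elements
            constraints.append(sum(povm) == np.eye(d))    # completeness
            objective = cp.Maximize(cp.real(sum(p * cp.trace(E @ r)
                                                for p, E, r in zip(priors, povm, rhos))))
            prob = cp.Problem(objective, constraints)
            prob.solve()
            return prob.value, [E.value for E in povm]

        if __name__ == "__main__":
            # three equiprobable pure states with Bloch vectors at equal angles in the x-z plane
            angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
            states = [bloch_to_rho((np.sin(a), 0.0, np.cos(a))) for a in angles]
            p_opt, _ = min_error_discrimination(states, [1 / 3] * 3)
            print(f"optimal success probability ~ {p_opt:.4f}")   # ~2/3 for the trine ensemble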

    Finite quantum tomography via semidefinite programming

    Using the convex semidefinite programming method and the superoperator formalism, we obtain the finite quantum tomography of some mixed quantum states, such as qudit tomography, N-qubit tomography, phase tomography and coherent spin state tomography, where the obtained results are in agreement with those of Refs. \cite{schack,Pegg,Barnett,Buzek,Weigert}. Comment: 25 pages
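
    A minimal numerical illustration of tomography posed as a semidefinite programme (an assumed setup, not the paper's superoperator formalism): fit a physical single-qubit density matrix, constrained to be positive semidefinite with unit trace, to noisy Pauli expectation values. The measurement data below are synthetic.

        # Minimal sketch: single-qubit state tomography as a small SDP with cvxpy.
        import numpy as np
        import cvxpy as cp

        PAULIS = {
            "X": np.array([[0, 1], [1, 0]], dtype=complex),
            "Y": np.array([[0, -1j], [1j, 0]]),
            "Z": np.array([[1, 0], [0, -1]], dtype=complex),
        }

        def reconstruct(expectations):
            """Least-squares fit of a physical density matrix to measured <X>, <Y>, <Z>."""
            rho = cp.Variable((2, 2), hermitian=True)
            residuals = [cp.real(cp.trace(P @ rho)) - expectations[name]
                         for name, P in PAULIS.items()]
            objective = cp.Minimize(cp.sum_squares(cp.hstack(residuals)))
            constraints = [rho >> 0, cp.real(cp.trace(rho)) == 1]
            cp.Problem(objective, constraints).solve()
            return rho.value

        if __name__ == "__main__":
            # synthetic noisy measurements of a state near |0>
            data = {"X": 0.05, "Y": -0.02, "Z": 0.93}
            print(np.round(reconstruct(data), 3))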

    Developing Ontologies within Decentralized Settings

    This chapter addresses two research questions: “How should a well-engineered methodology facilitate the development of ontologies within communities of practice?” and “What methodology should be used?” If ontologies are to be developed by communities, then the ontology development life cycle should be better understood within this context. This chapter presents the Melting Point (MP), a proposed new methodology for developing ontologies within decentralised settings. It describes how MP was developed by taking best practices from other methodologies, provides details on recommended steps and processes, and compares MP with alternatives. The methodology presented here is the product of direct first-hand experience and observation of the biological communities of practice in which some of the authors have been involved. The Melting Point is a methodology engineered for decentralised communities of practice in which the designers of the technology and its users may be the same group. As such, MP provides a potential foundation for the establishment of standard practices for ontology engineering.

    The study of atmospheric ice-nucleating particles via microfluidically generated droplets

    Ice-nucleating particles (INPs) play a significant role in the climate and hydrological cycle by triggering ice formation in supercooled clouds, thereby causing precipitation and affecting cloud lifetimes and radiative properties. However, despite their importance, INPs often comprise only 1 in 10³–10⁶ ambient particles, making it difficult to ascertain and predict their type, source, and concentration. The typical techniques for quantifying INP concentrations tend to be highly labour-intensive, suffer from poor time resolution, or are limited in sensitivity to low concentrations. Here, we present the application of microfluidic devices to the study of atmospheric INPs via the simple and rapid production of monodisperse droplets and their subsequent freezing on a cold stage. This device offers the potential for testing INP concentrations in aqueous samples with high sensitivity and high counting statistics. Various INPs were tested to validate the platform, including mineral dust and biological species, with results compared to literature values. We also describe a methodology for sampling atmospheric aerosol in a manner that minimises sampling biases and is compatible with the microfluidic device. We present results for INP concentrations in air sampled during two field campaigns: (1) from a rural location in the UK and (2) during the UK’s annual Bonfire Night festival. These initial results will provide a route for deployment of the microfluidic platform for the study and quantification of INPs in upcoming field campaigns around the globe, while providing a benchmark for future lab-on-a-chip-based INP studies.
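
    For reference, droplet-freezing assays of this kind are commonly converted into a cumulative INP spectrum using the Vali (1971) relation n_INP(T) = −ln(1 − f(T)) / V_droplet, where f(T) is the fraction of droplets frozen at or above temperature T and V_droplet is the droplet volume. The sketch below implements that conversion; it is an assumed workflow, not the authors' code, and the droplet volume and synthetic freezing temperatures are hypothetical.

        # Minimal sketch: cumulative INP concentration (per litre of water) from
        # droplet freezing temperatures via n_INP(T) = -ln(1 - f(T)) / V_droplet.
        import numpy as np

        def cumulative_inp_spectrum(freeze_temps_c, droplet_volume_l):
            """freeze_temps_c: freezing temperature (deg C) of each droplet.
            droplet_volume_l: volume of one monodisperse droplet, in litres."""
            temps = np.sort(np.asarray(freeze_temps_c))[::-1]       # warmest first
            n_total = temps.size
            frozen_fraction = np.arange(1, n_total + 1) / n_total   # cumulative fraction frozen
            mask = frozen_fraction < 1.0                            # keep the logarithm finite
            n_inp = -np.log(1.0 - frozen_fraction[mask]) / droplet_volume_l
            return temps[mask], n_inp                               # INP per litre of water vs T

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            fake_freezes = rng.normal(loc=-22.0, scale=2.0, size=100)   # synthetic data
            t, n = cumulative_inp_spectrum(fake_freezes, droplet_volume_l=1e-9)  # hypothetical 1 nL droplets
            print(t[:5], n[:5])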