958 research outputs found

    Comparative investigation into the effect of fertigation and of broadcast fertilization on the yield and nitrate content of lettuce (Lactuca sativa L.).

    Get PDF
    Three [unnamed] cultivars were grown successively during March to May, June to July, and July to August 1983 on a sandy soil. For each of these field experiments, the treatments were: no N application, and fertilization with various amounts of ammonium nitrate, either as split applications via irrigation water or as a single broadcast application. Fertigation increased the availability and uptake of N and increased the nitrate content of the crop compared with broadcast fertilization. A second-degree polynomial model fitted the fertilization:NO3-content data. In the first cropping, yield was significantly higher when N was applied by fertigation compared with broadcast application, but in the following croppings there was no significant difference. However, it is suggested that the yield difference in the first cropping may be related to the very wet spring conditions (a 56 mm rain surplus compared with 105 and 116 mm deficits in the following two experiments), when leaching of NO3 from the upper soil layer would be expected. (Abstract retrieved from CAB Abstracts by CABI’s permission.)
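    The second-degree polynomial model mentioned in the abstract can be sketched in a few lines; the application rates and nitrate values below are purely illustrative stand-ins, not data from the study:

    ```python
    import numpy as np

    # Hypothetical data: N application rates (kg/ha) and lettuce nitrate
    # content (mg/kg fresh weight). Values are illustrative only.
    n_rate = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
    no3 = np.array([800.0, 1500.0, 2100.0, 2600.0, 3000.0])

    # Fit a second-degree polynomial, the model form used in the abstract.
    # np.polyfit returns coefficients from the highest degree downwards.
    coeffs = np.polyfit(n_rate, no3, deg=2)
    model = np.poly1d(coeffs)

    # Predicted nitrate content at an intermediate application rate.
    pred = model(125.0)
    ```

    Since the illustrative data are exactly quadratic, the fitted curve interpolates them; with real field data the fit would of course carry residual error.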

    On-the-fly memory compression for multibody algorithms.

    Get PDF
    Memory and bandwidth demands challenge developers of particle-based codes that have to scale on new architectures, as the growth of concurrency outpaces improvements in memory access facilities, as the memory per core tends to stagnate, and as communication networks cannot increase bandwidth arbitrarily. We propose to analyse each particle of such a code to find out whether a hierarchical data representation storing data with reduced precision caps the memory demands without exceeding given error bounds. For admissible candidates, we perform this compression and thus reduce the pressure on the memory subsystem, lower the total memory footprint and reduce the data to be exchanged via MPI. Notably, our analysis and transformation change the data compression dynamically, i.e. the choice of data format follows the solution characteristics, and they do not require us to alter the core simulation code.
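    The precision-reduction idea can be illustrated with a minimal sketch, assuming a simple two-level float64/float32 hierarchy rather than the paper's actual data representation:

    ```python
    import numpy as np

    # Illustrative sketch, not the paper's implementation: keep a particle
    # attribute in float64 unless a float32 round-trip stays within a
    # user-given error bound, in which case the reduced format is used.
    def compress_if_admissible(values: np.ndarray, max_error: float) -> np.ndarray:
        """Return a float32 copy when the conversion error is admissible,
        otherwise keep the original float64 data."""
        reduced = values.astype(np.float32)
        error = np.max(np.abs(reduced.astype(np.float64) - values))
        return reduced if error <= max_error else values

    positions = np.array([1.0, 2.5, 1.0 + 1e-12])
    compact = compress_if_admissible(positions, max_error=1e-6)   # admissible
    exact = compress_if_admissible(positions, max_error=1e-15)    # not admissible
    ```

    In the admissible case the memory footprint of the attribute halves, which mirrors the abstract's point that compression decisions follow the solution characteristics at runtime.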

    Psychometric Framework for Modeling Parental Involvement and Reading Literacy

    Get PDF
    Assessment, Testing and Evaluation

    Variance Decomposition Using an IRT Measurement Model

    Get PDF
    Large scale research projects in behaviour genetics and genetic epidemiology are often based on questionnaire or interview data. Typically, a number of items are presented to a number of subjects, the subjects’ sum scores on the items are computed, and the variance of the sum scores is decomposed into a number of variance components. This paper discusses several disadvantages of analysing sum scores, such as the attenuation of correlations amongst sum scores due to their unreliability. It is shown that the framework of Item Response Theory (IRT) offers a solution to most of these problems. We argue that an IRT approach in combination with Markov chain Monte Carlo (MCMC) estimation provides a flexible and efficient framework for modelling behavioural phenotypes. Next, we use data simulation to illustrate the potentially huge bias in estimating variance components on the basis of sum scores. We then apply the IRT approach in an analysis of attention problems in young adult twins, where the variance decomposition model is extended with an IRT measurement model. We show that when estimating an IRT measurement model and a variance decomposition model simultaneously, the estimate for the heritability of attention problems increases from 40% (based on sum scores) to 73%.
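    The unreliability of sum scores can be illustrated with a minimal Rasch-type (one-parameter IRT) simulation; the sample sizes, item difficulties, and model form below are assumptions for illustration, not the paper's analysis:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Rasch (1-parameter IRT) model: P(correct) = logistic(theta - b)
    # for latent ability theta and item difficulty b.
    n_subjects, n_items = 1000, 10
    theta = rng.normal(0.0, 1.0, size=n_subjects)   # latent trait
    b = np.linspace(-1.5, 1.5, n_items)             # item difficulties

    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    responses = (rng.random((n_subjects, n_items)) < p).astype(int)

    # Observed sum scores are a fallible proxy for theta: item-level
    # measurement error keeps their correlation with the trait below 1,
    # which attenuates any association estimated from sum scores.
    sum_scores = responses.sum(axis=1)
    r = np.corrcoef(theta, sum_scores)[0, 1]
    ```

    Fitting the measurement model jointly with the structural model, as the abstract describes, avoids conditioning on these error-contaminated sum scores.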

    Alpha particle production by molecular single-particle effect in reactions of ⁹Be just above the Coulomb barrier

    Full text link
    The α-particle production in the dissociation of ⁹Be on ²⁰⁹Bi and ⁶⁴Zn at energies just above the Coulomb barrier is studied within the two-center shell model approach. The dissociation of ⁹Be on ²⁰⁹Bi is caused by a molecular single-particle effect (Landau-Zener mechanism) before the nuclei reach the Coulomb barrier. Molecular single-particle effects do not occur at that stage of the collision for ⁹Be + ⁶⁴Zn, and this explains the absence of fusion suppression observed for this system. The polarisation of the energy level of the last neutron of ⁹Be and, therefore, the existence of avoided crossings with that level, depend on the structure of the target.

    Reduced basis isogeometric mortar approximations for eigenvalue problems in vibroacoustics

    Full text link
    We simulate the vibration of a violin bridge in a multi-query context using reduced basis techniques. The mathematical model is based on an eigenvalue problem for the orthotropic linear elasticity equation. In addition to the nine material parameters, a geometrical thickness parameter is considered. This parameter enters the system as a 10th material parameter by a mapping onto a parameter-independent reference domain. The detailed simulation is carried out by isogeometric mortar methods. Weakly coupled patch-wise tensorial structured isogeometric elements are of special interest for complex geometries with piecewise smooth but curvilinear boundaries. To obtain locality in the detailed system, we use the saddle point approach and do not apply static condensation techniques. However, within the reduced basis context, it is natural to eliminate the Lagrange multiplier and formulate a reduced eigenvalue problem for a symmetric positive definite matrix. The selection of the snapshots is controlled by a multi-query greedy strategy taking into account an error indicator allowing for multiple eigenvalues.
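    The projection onto a reduced basis and the resulting small symmetric positive definite eigenvalue problem can be sketched as follows; the random snapshot basis and stiffness matrix are generic stand-ins, not the isogeometric mortar discretization itself:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Build a large symmetric positive definite "stiffness" operator as a
    # stand-in for the detailed (high-dimensional) system.
    n, m = 200, 8
    A = rng.standard_normal((n, n))
    K = A @ A.T + n * np.eye(n)

    # Orthonormalize m snapshot vectors into a reduced basis V (n-by-m).
    V, _ = np.linalg.qr(rng.standard_normal((n, m)))

    # Galerkin projection: the reduced operator is m-by-m and stays SPD,
    # so a dense symmetric eigensolver applies directly.
    K_red = V.T @ K @ V
    eigvals = np.linalg.eigvalsh(K_red)
    ```

    The point of the reduction is that the m-by-m eigenproblem is cheap enough to solve for every new parameter query, while the expensive n-dimensional system is only touched when snapshots are generated.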

    Change in hematologic indices over time in pediatric inflammatory bowel disease treated with azathioprine

    Get PDF
    Azathioprine leads to changes in mean corpuscular volume (MCV) and white blood cell (WBC) indices reflecting efficacy or toxicity. Understanding the interactions between bone marrow stem cells and azathioprine could highlight abnormal response patterns as forerunners of hematologic malignancies. This study gives a statistical description of factors influencing the relationship between MCV and WBC in children with inflammatory bowel disease treated with azathioprine. We found that leukopenia preceded macrocytosis. Macrocytosis is therefore not a good predictor of leukopenia. Further studies will be necessary to determine the subgroup of patients at increased risk of malignancies based on bone marrow response.

    Nebulized heparin in burn patients with inhalation trauma: safety and feasibility

    Get PDF
    Background: Pulmonary hypercoagulopathy is intrinsic to inhalation trauma. Nebulized heparin could theoretically be beneficial in patients with inhalation injury, but current data are conflicting. We aimed to investigate the safety, feasibility, and effectiveness of nebulized heparin. Methods: International multicenter, double-blind, placebo-controlled randomized clinical trial in specialized burn care centers. Adult patients with inhalation trauma received nebulizations of unfractionated heparin (25,000 international units (IU), 5 mL) or placebo (0.9% NaCl, 5 mL) every four hours for 14 days or until extubation. The primary outcome was the number of ventilator-free days at day 28 post-admission. Here, we report on the secondary outcomes related to safety and feasibility. Results: The study was prematurely stopped after inclusion of 13 patients (heparin N = 7, placebo N = 6) due to low recruitment and high costs associated with the trial medication. Therefore, no analyses of effectiveness were performed. In the heparin group, serious respiratory problems occurred due to saturation of the expiratory filter following nebulizations. In total, 129 out of 427 scheduled nebulizations were withheld in the heparin group (in 3 patients) and 45 out of 299 scheduled nebulizations were withheld in the placebo group (in 2 patients). Blood-stained sputum or expected increased bleeding risks were the most frequent reasons to withhold nebulizations. Conclusion: In this prematurely stopped trial, we encountered important safety and feasibility issues related to frequent heparin nebulizations in burn patients with inhalation trauma. This should be taken into account when heparin nebulizations are considered in these patients.
