
    Sowing time effect on yield and quality of field beans in a changing meteorological situation in the Baltic region

    Received: January 26th, 2021; Accepted: June 26th, 2021; Published: June 30th, 2021; Correspondence: [email protected]
    Since field beans (Vicia faba L.) need a lot of moisture to germinate, growers believe that they should be sown as early as possible in the spring. A field trial was carried out at the LLU RSF “Pēterlauki” from 2018 to 2020. The following factors were studied: A) sowing time (early, medium and late), B) variety (‘Laura’, ‘Boxer’, ‘Isabell’), C) sowing rate (30, 40, 50 germinable seeds m-2), D) fungicide application (without and with application of fungicide at GS 61-65). Meteorological conditions during the study had the greatest impact on the results, as they were contrasting. Adverse meteorological conditions for field bean growing were observed in 2018 and in the spring and early summer of 2019. The best year for bean yield formation was 2020, when temperature and precipitation were moderate. The highest average three-year bean yield was obtained by sowing beans at the medium sowing time; however, an equivalent yield was obtained at the early sowing time. Fungicide application increased the average three-year yield significantly (p = 0.007) and independently of the sowing time. The influence of variety and sowing rate on the average three-year yield was insignificant, and it was not proved that any variety or sowing rate could be more suitable for a specific sowing time. Average three-year values of crude protein content, thousand seed weight and volume weight were affected significantly by sowing time (p < 0.001). Trial year, variety and fungicide application also affected all quality parameters significantly (p < 0.05).

    Winter wheat leaf blotches development depending on fungicide treatment and nitrogen level in two contrasting years

    Received: January 31st, 2021; Accepted: December 1st, 2021; Published: December 4th, 2021; Correspondence: [email protected]
    Tan spot (caused by Pyrenophora tritici-repentis) and Septoria tritici blotch (caused by Zymoseptoria tritici) are the most widespread winter wheat leaf diseases in Latvia. The aim of the present research was to clarify the development of leaf blotches on winter wheat depending on fungicide treatment schemes under four nitrogen rates. A two-factorial trial was conducted at the Research and Study farm “Pēterlauki” (Latvia) of the Latvia University of Life Sciences and Technologies. For this study, data from the 2018/2019 and 2019/2020 growing seasons were used. Four schemes of fungicide application and an untreated variant, as well as four nitrogen rates (N120, N150, N180 and N210 kg ha-1), were used. The total disease impact during the vegetation period was estimated by calculating the area under the disease progress curve (AUDPC). The severity of leaf blotches on winter wheat leaves differed significantly between the two vegetation seasons. Tan spot was the dominant disease in 2019 (18.7% in the untreated variant). The development of tan spot was reduced by fungicide treatment; however, the influence of fungicide was significant only in 2019. Septoria tritici blotch was the dominant disease in 2020 (11.4% in the untreated variant), and its development was decreased by fungicides. Nitrogen fertilizer rate had no significant effect on the development of Septoria tritici blotch. Yields harvested in 2020 were significantly higher than those in 2019 (on average 5.23 t ha-1 in 2019 and 8.40 t ha-1 in 2020). The use of fungicides provided a significant increase in yield, but there were no significant differences among fungicide treatment schemes.
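    The AUDPC used above is the standard trapezoidal summary of repeated severity assessments: for each interval between assessment dates, the mean of the two severity readings is multiplied by the interval length, and the products are summed. A minimal sketch in Python, using hypothetical assessment dates and severities rather than the trial's data:

    ```python
    def audpc(days, severity):
        """Area under the disease progress curve (trapezoidal rule).

        days: assessment dates, e.g. days after emergence (ascending)
        severity: disease severity at each date, e.g. % leaf area affected
        """
        total = 0.0
        for i in range(len(days) - 1):
            # mean severity over the interval times the interval length
            total += (severity[i] + severity[i + 1]) / 2 * (days[i + 1] - days[i])
        return total

    # Hypothetical weekly assessments (not the trial's data):
    print(audpc([0, 7, 14, 21], [0.0, 2.0, 6.0, 11.4]))  # %-days
    ```

    The result is in percent-days, so it depends on the assessment schedule; comparisons are meaningful only across treatments assessed on the same dates, as in the trial described above.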

    aCGHViewer: A Generic Visualization Tool For aCGH data

    Array-Comparative Genomic Hybridization (aCGH) is a powerful high-throughput technology for detecting chromosomal copy number aberrations (CNAs) in cancer, aiming at identifying related critical genes from the affected genomic regions. However, advancing from a dataset with thousands of tabular lines to a few candidate genes can be an onerous and time-consuming process. To expedite the aCGH data analysis process, we have developed a user-friendly aCGH data viewer (aCGHViewer) as a conduit between the aCGH data tables and a genome browser. The data from a given aCGH analysis are displayed in a genomic view comprising individual chromosome panels which can be rapidly scanned for interesting features. A chromosome panel containing a feature of interest can be selected to launch a detail window for that single chromosome. Selecting a data point of interest in the detail window launches a query to the UCSC or NCBI genome browser to allow the user to explore the gene content in the chromosomal region. Additionally, aCGHViewer can display aCGH and expression array data concurrently to visually correlate the two. aCGHViewer is a stand-alone Java visualization application that should be used in conjunction with separate statistical programs. It operates on all major computer platforms and is freely available at http://falcon.roswellpark.org/aCGHview/

    The recent intellectual structure of geography

    An active learning project in an introductory graduate course used multidimensional scaling of the name index in Geography in America at the Dawn of the 21st Century, by Gary Gaile and Cort Willmott, to reveal some features of the discipline's recent intellectual structure relevant to the relationship between human and physical geography. Previous analyses, dating to the 1980s, used citation indices or Association of American Geographers specialty-group rosters to conclude that either the regional or the methods and environmental subdisciplines bridge human and physical geography. The name index has advantages over those databases, and its analysis reveals that the minimal connectivity that occurs between human and physical geography has recently operated more through environmental than through either methods or regional subdisciplines

    Rank-based model selection for multiple ions quantum tomography

    The statistical analysis of measurement data has become a key component of many quantum engineering experiments. As standard full state tomography becomes unfeasible for large dimensional quantum systems, one needs to exploit prior information and the "sparsity" properties of the experimental state in order to reduce the dimensionality of the estimation problem. In this paper we propose model selection as a general principle for finding the simplest, or most parsimonious, explanation of the data, by fitting different models and choosing the estimator with the best trade-off between likelihood fit and model complexity. We apply two well established model selection methods -- the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) -- to models consisting of states of fixed rank and datasets such as are currently produced in multiple ions experiments. We test the performance of AIC and BIC on randomly chosen low-rank states of 4 ions, and study the dependence of the selected rank on the number of measurement repetitions for one-ion states. We then apply the methods to real data from a 4-ion experiment aimed at creating a Smolin state of rank 4. The two methods indicate that the optimal model for describing the data lies between ranks 6 and 9, and the Pearson χ2\chi^{2} test is applied to validate this conclusion. Additionally we find that the mean square error of the maximum likelihood estimator for pure states is close to that of the optimal over all possible measurements. Comment: 24 pages, 6 figures, 3 tables
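    The rank selection described above amounts to minimizing a penalized negative log-likelihood over candidate ranks. A minimal sketch, assuming the standard parameter count k = r(2d − r) − 1 for a rank-r density matrix on a d-dimensional Hilbert space; the log-likelihood values below are illustrative placeholders, not the paper's experimental values:

    ```python
    import math

    def select_rank(log_liks, d, n, criterion="AIC"):
        """Pick the rank minimizing AIC = 2k - 2 ln L or BIC = k ln(n) - 2 ln L.

        log_liks: {rank: maximized log-likelihood for that rank model}
        d: Hilbert-space dimension (e.g. 2**4 = 16 for 4 ions/qubits)
        n: number of measurement repetitions (sample size for BIC)
        """
        scores = {}
        for r, ll in log_liks.items():
            k = r * (2 * d - r) - 1  # free real parameters of a rank-r state
            penalty = 2 * k if criterion == "AIC" else k * math.log(n)
            scores[r] = penalty - 2 * ll
        best = min(scores, key=scores.get)
        return best, scores

    # Hypothetical fitted log-likelihoods for three candidate ranks, d = 16:
    best, scores = select_rank({1: -1200.0, 2: -1150.0, 4: -1148.0}, d=16, n=100)
    print(best, scores)
    ```

    With these placeholder numbers AIC selects rank 2, while BIC's heavier ln(n) penalty selects rank 1, illustrating the fit-versus-complexity trade-off the abstract refers to.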

    A comprehensive re-analysis of the Golden Spike data: Towards a benchmark for differential expression methods

    Background: The Golden Spike data set has been used to validate a number of methods for summarizing Affymetrix data sets, sometimes with seemingly contradictory results. Much less use has been made of this data set to evaluate differential expression methods. It has been suggested that this data set should not be used for method comparison due to a number of inherent flaws.
    Results: We have used this data set in a comparison of methods which is far more extensive than any previous study. We outline six stages in the analysis pipeline where decisions need to be made, and show how the results of these decisions can lead to the apparently contradictory results previously found. We also show that, while flawed, this data set is still a useful tool for method comparison, particularly for identifying combinations of summarization and differential expression methods that are unlikely to perform well on real data sets. We describe a new benchmark, AffyDEComp, that can be used for such a comparison.
    Conclusion: We conclude with recommendations for preferred Affymetrix analysis tools, and for the development of future spike-in data sets.

    Permutationally invariant state reconstruction

    Feasible tomography schemes for large particle numbers must possess, besides an appropriate data acquisition protocol, also an efficient way to reconstruct the density operator from the observed finite data set. Since state reconstruction typically requires the solution of a non-linear large-scale optimization problem, this is a major challenge in the design of scalable tomography schemes. Here we present an efficient state reconstruction scheme for permutationally invariant quantum state tomography. It works for all common state-of-the-art reconstruction principles, including, in particular, maximum likelihood and least squares methods, which are the preferred choices in today's experiments. This high efficiency is achieved by greatly reducing the dimensionality of the problem, employing a particular representation of permutationally invariant states known from spin coupling combined with convex optimization, which has clear advantages regarding speed, control and accuracy in comparison to commonly employed numerical routines. First prototype implementations easily allow reconstruction of a state of 20 qubits in a few minutes on a standard computer. Comment: 25 pages, 4 figures, 2 tables

    Search for new physics in high-mass diphoton events from proton-proton collisions at √s = 13 TeV

    Results are presented from a search for new physics in high-mass diphoton events from proton-proton collisions at √s = 13 TeV. The data set was collected in 2016–2018 with the CMS detector at the LHC and corresponds to an integrated luminosity of 138 fb−1. Events with a diphoton invariant mass greater than 500 GeV are considered. Two different techniques are used to predict the standard model backgrounds: parametric fits to the smoothly falling background and a first-principles calculation of the standard model diphoton spectrum at next-to-next-to-leading order in perturbative quantum chromodynamics. The first technique is sensitive to resonant excesses while the second technique can identify broad differences in the invariant mass shape. The data are used to constrain the production of heavy Higgs bosons, Randall-Sundrum gravitons, the large extra dimensions model of Arkani-Hamed, Dimopoulos, and Dvali (ADD), and the continuum clockwork mechanism. No statistically significant excess is observed. The present results are the strongest limits to date on ADD extra dimensions and RS gravitons with a coupling parameter greater than 0.1

    Search for high-mass exclusive γγ → WW and γγ → ZZ production in proton-proton collisions at √s = 13 TeV


    Measurement of the Higgs boson inclusive and differential fiducial production cross sections in the diphoton decay channel with pp collisions at √s = 13 TeV

    The measurements of the inclusive and differential fiducial cross sections of the Higgs boson decaying to a pair of photons are presented. The analysis is performed using proton-proton collision data recorded with the CMS detector at the LHC at a centre-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 137 fb−1. The inclusive fiducial cross section is measured to be σ_fid = 73.4 +5.4/−5.3 (stat) +2.4/−2.2 (syst) fb, in agreement with the standard model expectation of 75.4 ± 4.1 fb. The measurements are also performed in fiducial regions targeting different production modes and as a function of several observables describing the diphoton system, the number of additional jets present in the event, and other kinematic observables. Two double-differential measurements are performed. No significant deviations from the standard model expectations are observed