
    Flood of September 20-23, 1969 in the Gadsden County area, Florida

    Get PDF
    The center of low pressure of a tropical disturbance that moved northward in the Gulf of Mexico reached land between Panama City and Port St. Joe, Florida, on September 20, 1969. The system was nearly stationary for 48 hours, producing heavy rainfall in the Quincy-Havana area, 70-80 miles northeast of the center. Rainfall associated with the tropical disturbance exceeded 20 inches over part of Gadsden County, Florida, during September 20-23, 1969, and the maximum rainfall of record occurred at Quincy, with 10.87 inches during a 6-hour period on September 21. The 48-hour maximum of 17.71 inches exceeded the 1-in-100-year probability of 16 inches for a 7-day period. The previous maximum rainfall of record at Quincy (more than 12 inches) fell on September 14-15, 1924; that historical storm was similar in path and effect to the September 1969 tropical disturbance. Peak runoff from a 1.4-square-mile area near Midway, Florida, was 1,540 cfs (cubic feet per second) per square mile. A peak discharge of 45,600 cfs on September 22 at the gaging station on the Little River near Quincy exceeded the previous peak of 25,400 cfs, which occurred on December 4, 1964. The peak discharge of 89,400 cfs at the Ochlockonee River near Bloxham exceeded the April 1948 peak of 50,200 cfs, the previous maximum of record, by a factor of 1.8. Many flood-measurement sites had peak discharges in excess of that of a 50-year flood. Nearly $200,000 was spent on emergency repairs to roads. An additional $520,000 in contractual work was required to replace four bridges that were destroyed. Agricultural losses were estimated at $1,000,000. (44-page document)
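The discharge figures quoted above can be checked with simple arithmetic; this sketch uses only the numbers stated in the abstract:

```python
# Minimal sketch checking the discharge figures quoted in the abstract.
# All values are taken directly from the text; nothing here is new data.

def exceedance_ratio(new_peak_cfs, old_peak_cfs):
    """Ratio of a new peak discharge to the previous record."""
    return new_peak_cfs / old_peak_cfs

# Ochlockonee River near Bloxham: 89,400 cfs vs. the April 1948 record of 50,200 cfs
print(round(exceedance_ratio(89_400, 50_200), 1))  # → 1.8

# Unit runoff near Midway: 1,540 cfs per square mile over a 1.4-square-mile basin
print(round(1_540 * 1.4))  # → 2156, the implied total peak in cfs
```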

    Probing the primordial power spectra with inflationary priors

    Full text link
    We investigate constraints on the power spectra of the primordial curvature and tensor perturbations with priors based on single-field slow-roll inflation models. We stochastically draw the Hubble slow-roll parameters and generate the primordial power spectra using the inflationary flow equations. Using data from recent observations of the CMB and several measurements of geometrical distances in the late Universe, Bayesian parameter estimation and model selection are performed for models that have separate priors on the slow-roll parameters. The same analysis is also performed adopting the standard parameterization of the primordial power spectra. We confirm that the scale-invariant Harrison-Zel'dovich spectrum is disfavored, with increased significance relative to previous studies. While current observations appear to be optimally modeled with some simple models of single-field slow-roll inflation, the data are not constraining enough to distinguish among these models. Comment: 23 pages, 3 figures, 7 tables, accepted for publication in JCAP
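To first order in slow roll, the standard relations map the Hubble slow-roll parameters onto power-law primordial spectra; the following sketch uses the textbook relations only, with illustrative parameter values and pivot scale that are not taken from the paper:

```python
# Hedged sketch: first-order slow-roll relations mapping the slow-roll
# parameters epsilon and eta onto power-law primordial spectra.
# A_s, eps, eta and k_pivot below are illustrative assumptions.
import numpy as np

def primordial_spectra(k, A_s=2.1e-9, eps=0.01, eta=0.01, k_pivot=0.05):
    """Scalar and tensor power spectra to first order in slow roll."""
    n_s = 1.0 - 4.0 * eps + 2.0 * eta   # scalar spectral index
    n_t = -2.0 * eps                    # tensor index (consistency relation)
    r = 16.0 * eps                      # tensor-to-scalar ratio at the pivot
    P_s = A_s * (k / k_pivot) ** (n_s - 1.0)
    P_t = r * A_s * (k / k_pivot) ** n_t
    return P_s, P_t

k = np.logspace(-4, 0, 5)   # wavenumbers in Mpc^-1
P_s, P_t = primordial_spectra(k)
```

At the pivot scale the scalar spectrum equals A_s by construction, and the tensor-to-scalar ratio there is 16*eps.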

    Reconstruction of the Primordial Power Spectrum using Temperature and Polarisation Data from Multiple Experiments

    Full text link
    We develop a method to reconstruct the primordial power spectrum, P(k), using both temperature and polarisation data from the joint analysis of a number of Cosmic Microwave Background (CMB) observations. The method is an extension of the Richardson-Lucy algorithm, first applied in this context by Shafieloo & Souradeep. We show how the inclusion of polarisation measurements can decrease the uncertainty in the reconstructed power spectrum. In particular, the polarisation data can constrain oscillations in the spectrum more effectively than total-intensity measurements alone. We apply the estimator to a compilation of current CMB results. The reconstructed spectrum is consistent with the best-fit power spectrum, although we find evidence for a `dip' in the power on scales k ~ 0.002 Mpc^-1. This feature appears to be associated with the WMAP power in the region 18 < l < 26, which is consistently below best-fit models. We also forecast the reconstruction for a simulated, Planck-like survey including sample-variance-limited polarisation data. Comment: 8 pages, 5 figures, comments welcome
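The core of a Richardson-Lucy reconstruction of the kind the paper extends is a multiplicative fixed-point iteration that deconvolves P(k) from the observed angular power spectrum through a linear kernel. This is a generic sketch, not the paper's estimator: the kernel G, its shape, and the iteration count are illustrative assumptions, with a toy kernel standing in for the radiative-transfer kernel that relates P(k) to the C_l's:

```python
# Minimal, generic Richardson-Lucy iteration (illustrative, not the
# paper's extended temperature+polarisation estimator). G is a toy
# non-negative kernel mapping P(k) bins to C_l multipoles.
import numpy as np

def richardson_lucy(C_data, G, n_iter=2000):
    """Iteratively recover P from C_data ~ G @ P."""
    P = np.ones(G.shape[1])            # flat initial guess
    norm = G.sum(axis=0)               # column normalisation of the kernel
    for _ in range(n_iter):
        C_model = G @ P
        ratio = C_data / C_model       # data/model, multipole by multipole
        P *= (G.T @ ratio) / norm      # multiplicative RL update
    return P

# Toy check: with a non-negative kernel and noiseless data the iteration
# drives the model spectrum toward the input data.
rng = np.random.default_rng(0)
G = rng.uniform(0.5, 1.5, size=(20, 10))
P_true = np.linspace(1.0, 2.0, 10)
P_rec = richardson_lucy(G @ P_true, G)
```

The multiplicative update preserves positivity of the reconstructed spectrum at every step, which is one reason the algorithm is attractive for power-spectrum reconstruction.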

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Get PDF
    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research. Peer reviewed
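Okapi BM25, one of the three baselines named above, ranks documents by a saturating, length-normalised term-frequency score weighted by inverse document frequency. A minimal sketch, with a toy corpus and the common k1/b defaults as assumptions (not the benchmark's actual configuration):

```python
# Illustrative Okapi BM25 scorer. The toy corpus and the k1/b settings
# are assumptions for demonstration, not the paper's configuration.
import math
from collections import Counter

def bm25_score(query, doc, corpus, k1=1.2, b=0.75):
    """Score one tokenised document against a tokenised query."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N
    tf = Counter(doc)
    score = 0.0
    for term in query:
        df = sum(1 for d in corpus if term in d)           # document frequency
        idf = math.log((N - df + 0.5) / (df + 0.5) + 1.0)  # smoothed IDF
        f = tf[term]                                        # term frequency
        score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [["gene", "expression", "cancer"],
          ["protein", "folding"],
          ["cancer", "therapy", "gene"]]
query = ["gene", "cancer"]
ranked = sorted(corpus, key=lambda d: bm25_score(query, d, corpus), reverse=True)
```

Documents sharing no query terms score exactly zero, so they always rank last; the saturation in k1 prevents a single repeated term from dominating the score.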

    The Sample Analysis at Mars Investigation and Instrument Suite

    Full text link

    The LHCb upgrade I

    Get PDF
    The LHCb upgrade represents a major change of the experiment. The detectors have been almost completely renewed to allow running at an instantaneous luminosity five times larger than that of the previous running periods. Readout of all detectors into an all-software trigger is central to the new design, facilitating the reconstruction of events at the maximum LHC interaction rate and their selection in real time. The experiment's tracking system has been completely upgraded with a new pixel vertex detector, a silicon tracker upstream of the dipole magnet and three scintillating-fibre tracking stations downstream of the magnet. The whole photon detection system of the RICH detectors has been renewed, and the readout electronics of the calorimeter and muon systems have been fully overhauled. The first stage of the all-software trigger is implemented on a GPU farm. The output of the trigger provides a combination of fully reconstructed physics objects, such as tracks and vertices, ready for final analysis, and of entire events which need further offline reprocessing. This scheme required a complete revision of the computing model and a rewriting of the experiment's software.

    Economies of scale in the production of swine manure

    No full text
    Manure production on grower/finisher swine operations in the United States was examined using data from 184 grower/finisher operations that participated in the United States National Animal Health Monitoring System's 1995 National Swine Study. Two methods were used: one assuming that pigs produce 8.4% of their body weight in manure each day, the other using the difference between feed fed and weight gained as a proxy variable for manure production. Using the latter approach, a production function was developed. The function exhibited diminishing returns to scale when food waste was not fed to pigs, but constant returns to scale when food waste was included in their diets. The difference between feed fed and weight gained was lower on operations that restricted entry to employees only.
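The first estimation method above is a fixed-fraction rule; this sketch applies it, with the pig weight chosen purely for illustration:

```python
# Sketch of the first estimation method described above: manure output
# taken as 8.4% of body weight per day. The 150 lb pig is hypothetical.
def daily_manure_lb(body_weight_lb, fraction=0.084):
    """Daily manure production as a fixed fraction of body weight."""
    return body_weight_lb * fraction

print(round(daily_manure_lb(150), 1))  # → 12.6 (lb of manure per day)
```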