
    Return interval distribution of extreme events and long term memory

    The distribution of recurrence times, or return intervals, between extreme events is important for characterizing and understanding the behavior of physical systems and phenomena in many disciplines. It is well known that many physical processes in nature and society display long-range correlations. Hence, in recent years, considerable research effort has been directed towards studying the distribution of return intervals for long-range correlated time series. Based on numerical simulations, it was shown that the return interval distributions are of stretched exponential type. In this paper, we obtain an analytical expression for the distribution of return intervals in long-range correlated time series which holds when the average return interval is large. We show that the distribution is actually a product of a power law and a stretched exponential. We also discuss the regimes of validity and study in detail how the return interval distribution depends on the threshold used to define extreme events. (Comment: 8 pages, 6 figures)
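    The setup the abstract describes can be reproduced empirically. The sketch below (ours, not the paper's code) generates a long-range correlated Gaussian series by Fourier filtering with an assumed power spectrum S(f) ~ f^(-beta), thresholds it to define extreme events, and collects the return intervals; the spectral exponent `beta` and threshold `q` are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2 ** 16
beta = 0.7  # assumed spectral exponent controlling the long-range correlations

# Fourier-filtering method: shape white noise by f^(-beta/2) in frequency space.
freqs = np.fft.rfftfreq(n)
freqs[0] = freqs[1]  # avoid a zero-division at the zero frequency
noise = rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size)
series = np.fft.irfft(noise * freqs ** (-beta / 2.0), n)
series = (series - series.mean()) / series.std()

q = 1.5  # threshold defining "extreme" events (illustrative)
exceedances = np.flatnonzero(series > q)
intervals = np.diff(exceedances)

print(f"{intervals.size} return intervals, mean = {intervals.mean():.1f}")
```

One can then histogram `intervals` and compare against the stretched-exponential (or power-law-times-stretched-exponential) shapes discussed in the paper; the mean return interval is fixed by the threshold alone, while the correlations reshape the distribution around it.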

    Evaluation of exposure-specific risks from two independent samples: A simulation study

    Background: Previous studies have proposed a simple product-based estimator for calculating exposure-specific risks (ESR), but the methodology has not been rigorously evaluated. The goal of our study was to evaluate the existing methodology for calculating the ESR, propose an improved point estimator, and propose variance estimates that allow the calculation of confidence intervals (CIs).
    Methods: We conducted a simulation study to test the performance of two estimators and their associated confidence intervals: 1) the current simple product-based estimator and 2) our proposed revised product-based estimator. The first method estimates the ESR by multiplying a relative risk (RR) of disease given a certain exposure by the overall risk of disease. The second method, proposed in this paper, is based on an estimate of the risk of disease in the unexposed, which is then multiplied by the RR. A log-based variance was calculated for both estimators, and a binomial-based variance was additionally calculated for the revised product-based estimator; 95% CIs were calculated from these variance estimates. Accuracy of the point estimators was evaluated by observed relative bias (percent deviation from the true value); interval estimators were evaluated by coverage probability and expected length of the 95% CI, given coverage. We evaluated these estimators across a wide range of exposure probabilities, disease probabilities, relative risks, and sample sizes.
    Results: We observed more bias and lower coverage probability with the existing methodology. The revised product-based point estimator exhibited little observed relative bias (maximum 4.0%) compared to the simple product-based estimator (maximum 93.9%). Because the simple product-based estimator was biased, 95% CIs around it had low coverage probabilities. The 95% CI around the revised product-based estimator based on the log-based variance provided better coverage in most situations.
    Conclusion: The currently accepted simple product-based method is a reasonable approach only when the exposure probability is small (< 0.05) and the RR is ≤ 3.0. The revised product-based estimator provides much improved accuracy.
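    A minimal numeric sketch of the two point estimators follows. The algebra for backing out the unexposed risk from the overall risk is our reading of the abstract (a two-group mixture assumption), not code from the paper, and the variance estimators are omitted.

```python
def simple_esr(rr, overall_risk):
    # Simple product-based estimator: multiply the relative risk by the
    # overall disease risk (biased when exposure is common or RR is large).
    return rr * overall_risk

def revised_esr(rr, overall_risk, p_exposed):
    # Revised product-based estimator: first back out the risk in the
    # unexposed from the overall risk, assuming the two-group mixture
    #   P(D) = P(D|unexposed) * (1 - p_e) + RR * P(D|unexposed) * p_e,
    # then multiply that risk by the relative risk.
    risk_unexposed = overall_risk / (1.0 - p_exposed + rr * p_exposed)
    return rr * risk_unexposed

# Worked check: P(D|unexposed) = 0.05, RR = 4, exposure prevalence 0.3
# gives overall risk 0.05*0.7 + 0.20*0.3 = 0.095 and true ESR 0.20.
print(simple_esr(4, 0.095))        # overshoots the true ESR of 0.20
print(revised_esr(4, 0.095, 0.3))  # recovers the true ESR
```

The worked check illustrates the abstract's finding: with common exposure (30%) and RR = 4, the simple estimator is badly biased upward, while the revised estimator recovers the true exposure-specific risk.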

    Parallel Implementation of Interval Matrix Multiplication

    Two main, and not necessarily compatible, objectives when implementing the product of two dense matrices with interval coefficients are accuracy and efficiency. In this work, we focus on an implementation for multicore architectures. One direction successfully explored to gain execution-time performance is the representation of intervals by their midpoints and radii rather than the classical representation by endpoints. Computing with the midpoint-radius representation enables the use of optimized floating-point BLAS, so the implementation benefits from the performance of the BLAS routines. Several variants of interval matrix multiplication have been proposed that correspond to various trade-offs between accuracy and efficiency, including efficient ones proposed by Rump in 2012. However, in order to guarantee that the computed result encloses the exact one, these efficient algorithms rely on an assumption about the order of execution of floating-point operations which is not verified by most implementations of BLAS. In this paper, an algorithm for the interval matrix product is proposed that satisfies this assumption. Furthermore, several optimizations are proposed, and the implementation on a multicore architecture compares reasonably well with a non-guaranteed implementation based on MKL, the optimized BLAS of Intel: the overhead is usually less than a factor of 2 and never exceeds 3. The implementation also exhibits good scalability.
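    The midpoint-radius formulation the abstract refers to can be sketched as follows. This is an illustration of the representation in exact arithmetic, using NumPy matrix products in place of BLAS calls; it deliberately ignores floating-point rounding, which is exactly the issue the paper addresses.

```python
import numpy as np

def interval_matmul(Am, Ar, Bm, Br):
    """Midpoint-radius interval matrix product (exact-arithmetic enclosure).

    Encloses {A @ B : |A - Am| <= Ar, |B - Bm| <= Br} entrywise via
        Cm = Am @ Bm
        Cr = (|Am| + Ar) @ Br + Ar @ |Bm|
    Every operation maps to a plain floating-point matrix product, which is
    why this representation can ride on optimized BLAS routines.
    """
    Cm = Am @ Bm
    Cr = (np.abs(Am) + Ar) @ Br + Ar @ np.abs(Bm)
    return Cm, Cr

rng = np.random.default_rng(1)
Am, Bm = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
Ar, Br = 0.01 * rng.random((4, 4)), 0.01 * rng.random((4, 4))
Cm, Cr = interval_matmul(Am, Ar, Bm, Br)

# Any point product drawn from the input intervals lies in [Cm - Cr, Cm + Cr].
A = Am + Ar * rng.uniform(-1, 1, (4, 4))
B = Bm + Br * rng.uniform(-1, 1, (4, 4))
assert np.all(np.abs(A @ B - Cm) <= Cr + 1e-12)
```

The radius formula follows from expanding (Am + E)(Bm + F) with |E| ≤ Ar, |F| ≤ Br. A rigorous implementation must additionally inflate Cr to absorb the rounding errors of the floating-point products, which is where the paper's assumption on the order of BLAS operations comes in.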

    On a Factorization Formula for the Partition Function of Directed Polymers

    We prove a factorization formula for the point-to-point partition function associated with a model of directed polymers on the space-time lattice Z^{d+1}. The polymers are subject to a random potential induced by independent, identically distributed random variables, and we consider the regime of weak disorder, where polymers behave diffusively. We show that when writing the quotient of the point-to-point partition function and the transition probability for the underlying random walk as the product of two point-to-line partition functions plus an error term, then, for large time intervals [0, t], the error term is small uniformly over starting points x and endpoints y in the sub-ballistic regime ‖x − y‖ ≤ t^σ, where σ < 1 can be arbitrarily close to 1. This extends a result of Sinai, who proved smallness of the error term in the diffusive regime ‖x − y‖ ≤ t^{1/2}. We also derive asymptotics for spatial and temporal correlations of the field of limiting partition functions.
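    In schematic notation of our own (the paper's precise definitions differ), with Z_t(x, y) the point-to-point partition function and p_t(y − x) the transition probability of the underlying walk, the factorization being proved has the shape

```latex
\frac{Z_t(x,y)}{p_t(y-x)} \;=\; Z^{+}_{t}(x)\, Z^{-}_{t}(y) \;+\; \varepsilon_t(x,y),
\qquad
\sup_{\|x-y\| \le t^{\sigma}} \bigl|\varepsilon_t(x,y)\bigr| \longrightarrow 0 \quad (t \to \infty),
```

    for any σ < 1, where Z^+ and Z^- stand for point-to-line partition functions rooted at the starting point and endpoint respectively.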

    What is the Culture at the University that Fosters a Spirit of Innovation and Entrepreneurship?

    The culture in Rochester Institute of Technology's Mechanical Engineering Technology department is one of long-standing innovation and entrepreneurship. Our 'Idea Factory' starts in our freshman seminar, where students are required to generate, investigate, and develop ideas. In later courses they refine and focus their designs, up to and including developing prototypes. We are developing a 'technology shelf' that allows us to produce products that have evolved from engineering models and alternate-process production runs to hard tooling and packaged products that have gone through all the rigors of the production process. It is not just about creating a product, but rather about discovering why a product succeeded or why and where it failed. Our products are given a two-year cycle, during which a product may be 're-engineered' into one with a more robust technology and/or enhanced supporting processes. The emphasis is also on the creation and construction of models, not just analysis. We have taken concurrent engineering and included the manufacturing, assembly, and packaging disciplines. A unique requirement of this process is 'infusing' other courses with a product's design and development problems. For example, the strength-of-materials class analyzed the cantilever beam of one project as a class assignment; similarly, the machine design class analyzed the fatigue characteristics of the design. Students now see the application of what they are learning in class and experience the results of their work.
    With the right set of resources, both human and equipment, fostering this spirit becomes self-generating. Once left to their own imagination, students become a natural breeding ground of idea generators, and the task then becomes guiding and coaching the teams to further their ideas. Students will endlessly add features and enhancements to their ideas, and there truly does come a time when you must 'shoot the engineer and release the product'. The hidden jewel here is the learning curve experienced when the time comes to actually develop an idea into a production-level product. Creating a Product Realization Club has gained a reputation for letting students truly run with their own ideas rather than ones given to them by a faculty member, and it is clear that students will chew on their ideas like a dog on a bone. This intensity can never be achieved with preset assignments; the trick is to get the student teams to champion their own work.
    In today's technology there is seldom a product that involves only one discipline. Consequently, interdisciplinary teams are encouraged. Teams frequently include students from different disciplines, such as mechanical, electrical, packaging, industrial design, and even business, who all contribute and share the experience of following an idea through a product life cycle. Faculty must be on board with this concept and must show unified support. Our Industrial Advisory Board is an active participant, providing both financial and technical support. This culture is not confined within the borders of RIT but is encouraged to include other colleges and even high schools. Competitions and meetings are held at regular intervals. In one example, SUNY Morrisville is responsible for manufacturing the injection molds for all the plastic parts of a unique kitchen scale. Publishing and presenting their work is an important requirement, and teams from Morrisville and RIT were successful at the ASEE St. Lawrence Section student paper and presentation competition.
    There are many hidden benefits from this endeavor. One major benefit is the student's early realization of why they should take a Materials or Electronics course. We have seen an average upward shift of one grade point among our mechanical students when taking an electronics course. Another hidden benefit is that students lose their fear of starting projects and developing them to the end. Faculty advisors also benefit, as they provide much more interactive advising to the students. The Product Realization endeavor has demonstrated impressive results in its early career, and we are continuously exploring new techniques and ideas to further its benefits to the students and the college.

    Investigation of mass and nutrient losses during heat treatment of minced meat products

    People associate existing fast-food establishments not only with hamburgers and french fries but also with nuggets, especially enterprises that focus on the production of chicken products. With the classic processing of deep-fried nuggets there is a rather large loss of product mass, and when the frying oil is of poor quality the taste of the product suffers. It is therefore worth considering other types of heat treatment for this product to reduce production costs. The authors of the article conducted an experiment on alternative processing of nuggets in a combi steamer and in an oven. The purpose of the experiment was to establish the weight loss of the product under these types of heat treatment and to conduct an organoleptic evaluation of the resulting product. Raw materials were obtained from the manufacturer Meat Master LLC. The study was carried out according to two approved technologies on the different equipment. The total weight loss of nuggets during processing averaged 18% in the combi steamer and 13% in the oven. At the same time, processing in the combi steamer, in comparison with the oven, better preserved the organoleptic properties of the nuggets. The experiment also established the time needed to bring the nuggets to culinary readiness: the internal temperature of the product was recorded at intervals of 30 seconds, and the product was cooked not for a fixed time but until the temperature in the depth of the product reached 80 °C. The average cooking time for nuggets was found to be about 6 minutes in the combi steamer and about 10 minutes in the oven. The simulated cooking model thus made it possible to determine the time to culinary readiness, the mass losses, and the organoleptic properties of nuggets cooked in a combi steamer and in an oven.
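    The two quantities the experiment tracks, mass loss and time-to-readiness from 30-second core-temperature logs, reduce to simple calculations. The helper names and the temperature trace below are hypothetical illustrations, not data from the study.

```python
def mass_loss_percent(raw_g, cooked_g):
    # Mass loss as a percentage of the raw weight.
    return 100.0 * (raw_g - cooked_g) / raw_g

def seconds_to_readiness(core_temps_c, target_c=80.0, step_s=30):
    # core_temps_c: internal temperatures logged every step_s seconds.
    # Returns elapsed time when the core first reaches target_c, else None.
    for i, temp in enumerate(core_temps_c):
        if temp >= target_c:
            return i * step_s
    return None

# Illustrative combi-steamer run: ~18% average loss, readiness at 6 minutes.
print(mass_loss_percent(100.0, 82.0))  # 18.0
trace = [20, 30, 38, 45, 52, 58, 63, 68, 72, 75, 77, 79, 81]
print(seconds_to_readiness(trace))     # 360 seconds = 6 minutes
```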

    Methods and software for nonparametric estimation in multistate models.

    Multistate models are a type of multivariate survival data that provide a framework for describing a complex system in which individuals transition through a series of distinct states. This research focuses on nonparametric inference for general multistate models with directed tree topology. In this dissertation, we developed an R package, msSurv, which calculates the marginal state occupation probabilities and state entry and exit time distributions for a general, possibly non-Markov, multistate system under left truncation and right censoring. Dependent censoring is handled by modeling the censoring hazard through observable covariates. Pointwise confidence intervals for the above-mentioned quantities are obtained and returned: for independent censoring from closed-form variance estimators, and for dependent censoring using the bootstrap. We also develop novel nonparametric estimators of state occupation probabilities and state entry and exit time distributions for interval-censored data using a combination of weighted isotonic regression and kernel smoothing with product-limit estimation. Structural assumptions about the multistate system are avoided where possible. We evaluate the performance of our estimators through simulation studies and a real data analysis of a UNOS (United Network for Organ Sharing) data set.
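    The product-limit idea underlying these estimators can be sketched in the simplest setting: a hypothetical two-state (alive/dead) system with right censoring, i.e. the Kaplan-Meier estimator. This is an illustration of the general technique, not code from the msSurv package (which is written in R and handles general multistate systems).

```python
import numpy as np

def product_limit(times, events):
    """Kaplan-Meier product-limit survival estimator.

    times:  observed time for each subject (event or censoring)
    events: 1 if the transition was observed at that time, 0 if censored
    Returns (time, S(t)) pairs at the observed event times.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        d = np.sum((times == t) & (events == 1))  # events at time t
        n = np.sum(times >= t)                    # subjects still at risk
        s *= 1.0 - d / n                          # multiply in this step's factor
        surv.append((t, s))
    return surv

# Four subjects: events at t = 1, 2, 3 and one censoring at t = 2.
print(product_limit([1, 2, 2, 3], [1, 0, 1, 1]))
```

The censored subject contributes to the risk set through t = 2 but never triggers a drop in the curve, which is how the product-limit construction uses partial information; the dissertation's estimators extend this idea to state occupation and entry/exit time distributions in multistate systems.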