
    Estimation of Probability Density Function of a Random Variable for Small Samples

    Get PDF
    A theoretical method of estimating the most general form of the probability density function of a random variable has been described. The estimated probability density function depends on the mean value of the exponential distribution of the initial random variable. By the most general form of the probability density function, it is meant that the survivor functions are of the form exp(-λP) or exp(-γP), or combinations of these functions, for different values of λ and γ.
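
    To make the quoted form concrete, the short sketch below shows how a survivor function of this type determines the density; the reading P = t^p (a power of the variable) and the mixture weight a are assumptions made for illustration, not notation taken from the paper.

        % Minimal LaTeX sketch, assuming P stands for a power t^p with p > 0.
        \[
          S(t) = \Pr(T > t) = e^{-\lambda t^{p}},
          \qquad
          f(t) = -\frac{dS}{dt} = \lambda p\, t^{p-1} e^{-\lambda t^{p}}.
        \]
        % A combination of the two quoted forms is handled the same way:
        \[
          S(t) = a\, e^{-\lambda t^{p}} + (1-a)\, e^{-\gamma t^{p}}, \quad 0 \le a \le 1,
          \qquad
          f(t) = p\, t^{p-1}\!\left( a \lambda\, e^{-\lambda t^{p}} + (1-a)\, \gamma\, e^{-\gamma t^{p}} \right).
        \]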

    Continuation-Passing C: compiling threads to events through continuations

    Get PDF
    In this paper, we introduce Continuation-Passing C (CPC), a programming language for concurrent systems in which native and cooperative threads are unified and presented to the programmer as a single abstraction. The CPC compiler uses a compilation technique, based on the CPS transform, that yields efficient code and an extremely lightweight representation for contexts. We provide a proof of the correctness of our compilation scheme. We show in particular that lambda-lifting, a common compilation technique for functional languages, is also correct in an imperative language like C, under some conditions enforced by the CPC compiler. The current CPC compiler is mature enough to write substantial programs such as Hekate, a highly concurrent BitTorrent seeder. Our benchmark results show that CPC is as efficient as the most efficient thread libraries available, while using significantly less space. Comment: Higher-Order and Symbolic Computation (2012). arXiv admin note: substantial text overlap with arXiv:1202.324
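
    The CPS transform mentioned above is easiest to see on a toy example. The sketch below is an illustrative Python rendering of the idea (direct style versus continuation-passing style), not CPC's actual translation, which targets C; the function names are hypothetical.

        # Illustrative sketch of continuation-passing style (CPS), the idea behind
        # the CPC compilation scheme; not CPC's actual output, which is C code.

        def read_line_direct(lines):
            # Direct style: the caller "blocks" until a result is returned.
            return lines.pop(0)

        def read_line_cps(lines, k):
            # CPS: instead of returning, pass the result to the continuation k.
            # A cooperative scheduler can suspend here and invoke k later.
            k(lines.pop(0))

        def count_chars_cps(lines, k):
            # The rest of the computation is reified as a closure, so a suspended
            # "thread" is just a chain of continuations (a lightweight context).
            def after_read(line):
                k(len(line))
            read_line_cps(lines, after_read)

        if __name__ == "__main__":
            count_chars_cps(["hello world"], lambda n: print("chars:", n))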

    Schwinger Pair Production via Instantons in Strong Electric Fields

    Full text link
    In the space-dependent gauge, each mode of the Klein-Gordon equation in a strong electric field takes the form of a time-independent Schrödinger equation with a potential barrier. We propose that the single- and multi-instantons of quantum tunneling may be related to the single- and multi-pair production of bosons, and that the relative probability for no-pair production is determined by the total tunneling probability via instantons. In the case of a uniform electric field, the instanton interpretation recovers exactly the well-known pair production rate for bosons, and when the Pauli blocking is taken into account, it gives the correct fermion production rate. The instanton is used to calculate the pair production rate even in an inhomogeneous electric field. Furthermore, the instanton interpretation confirms the fact that bosons and fermions cannot be produced by a static magnetic field only. Comment: RevTex, 7 pages, no figure; formulae for the production rate in very strong fields and references added; the final version accepted in Phys. Rev.
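
    For reference, the well-known uniform-field rates that the abstract says the instanton calculation recovers are reproduced below in natural units (ħ = c = 1); normalization conventions vary slightly between references, and the n-th term of each sum is what the abstract relates to the n-instanton (n-pair) contribution.

        % Uniform-field Schwinger pair production rates per unit volume (natural units).
        \[
          w_{\mathrm{boson}} = \frac{(eE)^{2}}{8\pi^{3}}
            \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n^{2}}
            \exp\!\left(-\frac{n\pi m^{2}}{eE}\right),
          \qquad
          w_{\mathrm{fermion}} = \frac{(eE)^{2}}{4\pi^{3}}
            \sum_{n=1}^{\infty} \frac{1}{n^{2}}
            \exp\!\left(-\frac{n\pi m^{2}}{eE}\right).
        \]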

    The type II-plateau supernova 2017eaw in NGC 6946 and its red supergiant progenitor

    Get PDF
    We present extensive optical photometric and spectroscopic observations, from 4 to 482 days after explosion, of the Type II-plateau (II-P) supernova (SN) 2017eaw in NGC 6946. SN 2017eaw is a normal SN II-P, intermediate in properties between, for example, SN 1999em and SN 2012aw and the more luminous SN 2004et, also in NGC 6946. We have determined that the extinction to SN 2017eaw is primarily due to the Galactic foreground and that the SN site metallicity is likely subsolar. We have also independently confirmed a tip-of-the-red-giant-branch (TRGB) distance to NGC 6946 of 7.73 ± 0.78 Mpc. The distances to the SN that we have also estimated via both the standardized candle method and the expanding photosphere method corroborate the TRGB distance. We confirm the SN progenitor identity in pre-explosion archival Hubble Space Telescope (HST) and Spitzer Space Telescope images via imaging of the SN through our HST Target of Opportunity program. Detailed modeling of the progenitor's spectral energy distribution indicates that the star was a dusty, luminous red supergiant consistent with an initial mass of ~15 M⊙.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Get PDF
    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performance. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server, located at https://relishdb.ict.griffith.edu.au, is freely available for downloading the annotation data and for blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new, powerful techniques for title- and title/abstract-based search engines for relevant articles in biomedical research. Peer reviewed
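
    As a point of reference for one of the baselines named above, the snippet below is a minimal sketch of Okapi BM25 scoring; the parameter values k1 = 1.5 and b = 0.75 and the toy corpus are common-default assumptions, not the configuration used in the benchmark.

        import math
        from collections import Counter

        def bm25_score(query_terms, doc_terms, corpus, k1=1.5, b=0.75):
            # Okapi BM25 score of one tokenized document for a query.
            # corpus is a list of tokenized documents; k1 and b are common
            # defaults, not the settings used by the RELISH evaluation.
            N = len(corpus)
            avgdl = sum(len(d) for d in corpus) / N
            tf = Counter(doc_terms)
            score = 0.0
            for q in query_terms:
                n_q = sum(1 for d in corpus if q in d)  # document frequency of q
                idf = math.log((N - n_q + 0.5) / (n_q + 0.5) + 1)
                f = tf[q]
                score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc_terms) / avgdl))
            return score

        if __name__ == "__main__":
            corpus = [
                ["protein", "folding", "dynamics"],
                ["literature", "search", "benchmark"],
                ["document", "similarity", "search"],
            ]
            print(bm25_score(["search", "benchmark"], corpus[1], corpus))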

    Optimal Decision Procedure to Declare Human Fatigue and Prediction of 3-Mean Repair Times with One Repairman (Short Communication)

    No full text
    This paper gives an optimum procedure for deciding whether fatigue is present in a repairman and, if it is, for predicting the 3-mean repair times of the three failed systems.

    Cross-Layer Latency Minimization in Wireless Networks with SINR Constraints

    No full text
    Recently, there has been substantial interest in the design of cross-layer protocols for wireless networks. These protocols optimize certain performance metric(s) of interest (e.g. latency, energy, rate) by jointly optimizing the performance of multiple layers of the protocol stack. Algorithm designers often use geometric-graph-theoretic models for radio interference to design such cross-layer protocols. In this paper we study the problem of designing cross-layer protocols for multi-hop wireless networks using a more realistic Signal to Interference plus Noise Ratio (SINR) model for radio interference. The following cross-layer latency minimization problem is studied: given a set V of transceivers and a set of source-destination pairs, (i) choose power levels for all the transceivers, (ii) choose routes for all connections, and (iii) construct an end-to-end schedule such that the SINR constraints are satisfied at each time step, so as to minimize the make-span of the schedule (the time by which all packets have reached their respective destinations). We present a polynomial-time algorithm with a provable worst-case performance guarantee for this cross-layer latency minimization problem. As corollaries of the algorithmic technique, we show that a number of variants of the cross-layer latency minimization problem can also be approximated efficiently in polynomial time. Our work extends the results of Kumar et al. (Proc. SODA, 2004) and Moscibroda et al. (Proc. MOBIHOC, 2006). Although our algorithm considers multiple layers of the protocol stack, it can naturally be viewed as compositions of tasks specific to each layer; this allows us to improve the overall performance while preserving the modularity of the layered structure.
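
    For concreteness, the per-time-step SINR constraint referred to above typically has the form below under the standard path-loss model; the symbols (threshold β, path-loss exponent α, noise N_0) are assumed notation, not taken from the paper.

        % SINR feasibility of a scheduled link (u, v) at a given time step:
        \[
          \frac{P_{u}\, d(u,v)^{-\alpha}}
               {N_{0} + \sum_{w \in S,\, w \neq u} P_{w}\, d(w,v)^{-\alpha}}
          \;\ge\; \beta,
        \]
        % where S is the set of transmitters scheduled in that time step, P_u is the
        % chosen power level, d(.,.) is distance, \alpha the path-loss exponent,
        % N_0 the ambient noise, and \beta the decoding threshold.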

    Stress corrosion cracking (SCC) of aluminium alloys

    No full text
    The stress corrosion cracking (SCC) behaviour of aluminium alloys has been studied for the past five decades and is still a research area of high interest due to the demand for higher-strength aluminium alloys for fuel saving. This chapter brings out the general understanding of the SCC mechanism(s) and the critical metallurgical issues affecting the SCC behaviour of aluminium alloys. The developments made so far with regard to alloying and heat treatment of aluminium alloys for high SCC resistance are discussed. An overview of the available literature on the SCC of aluminium alloy weldments and aluminium alloy metal matrix composites is also presented.