
    ANALYSIS OF OVERLAY THICKNESS CALCULATION ON THE PANGERAN SURYANATA – PATUNG LEMBUSWANA ROAD, SAMARINDA CITY

    Technical planning of road programs in East Kalimantan province still relies on the Component Analysis Method SNI 1732-1989-F with manual calculation. With the development of technology, the Technical Directorate of the Directorate General of Highways, Ministry of Public Works, introduced a new software-based method, the Flexible Pavement Design Software (SDPJL), in 2011. The purpose of this study was to compare the overlay thickness on flexible pavement obtained with the Component Analysis Method SNI 1732-1989-F and with the Flexible Pavement Design Software (SDPJL). The study was conducted on the Santan-Bontang road in East Kalimantan province. The primary data used were subgrade bearing capacity (CBR value), Road Condition Index (RCI), and road geometry, collected on 14 and 15 February 2013. Secondary data, consisting of Average Daily Traffic (LHR), deflection (Benkelman beam), and climate and temperature data, were obtained from the Planning and Monitoring Unit of National Roads (P2JN) East Kalimantan. From these data, the overlay thickness was calculated using both the Component Analysis Method SNI 1732-1989-F and the Flexible Pavement Design Software (SDPJL). The Component Analysis Method yielded overlay thicknesses of 4 cm for AC-WC and 2.35 cm for AC-BC, while the Flexible Pavement Design Software (SDPJL) yielded 4 cm for AC-WC and 13 cm for AC-BC. The difference in results is caused by the parameters and data used by each method.
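    A minimal sketch of the comparison above, using only the overlay thicknesses quoted in the abstract (layer names AC-WC and AC-BC as given; no other project data are assumed):

        # Overlay thicknesses (cm) as reported for each design method.
        component_analysis = {"AC-WC": 4.0, "AC-BC": 2.35}   # SNI 1732-1989-F
        sdpjl = {"AC-WC": 4.0, "AC-BC": 13.0}                # Flexible Pavement Design Software

        for layer in component_analysis:
            diff = sdpjl[layer] - component_analysis[layer]
            print(f"{layer}: SNI = {component_analysis[layer]} cm, "
                  f"SDPJL = {sdpjl[layer]} cm, difference = {diff:+.2f} cm")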

    A research protocol for developing a Point-Of-Care Key Evidence Tool 'POCKET': a checklist for multidimensional evidence reporting on point-of-care in vitro diagnostics.

    INTRODUCTION: Point-of-care in vitro diagnostics (POC-IVD) are increasingly widespread as an acceptable means of providing rapid diagnostic results to facilitate decision-making in many clinical pathways. Evidence on utility, usability and cost-effectiveness is currently provided in a fragmented and detached manner that is fraught with methodological challenges, given the disruptive nature these tests have on the clinical pathway. The Point-of-care Key Evidence Tool (POCKET) checklist aims to provide an integrated evidence-based framework that incorporates all required evidence to guide the evaluation of POC-IVD to meet the needs of policy- and decision-makers in the National Health Service (NHS). METHODS AND ANALYSIS: A multimethod approach will be applied in order to develop the POCKET. A thorough literature review has formed the basis of a robust Delphi process and validation study. Semistructured interviews are being undertaken with POC-IVD stakeholders, including industry, regulators, commissioners, clinicians and patients, to understand what evidence is required to facilitate decision-making. Emergent themes will be translated into a series of statements to form a survey questionnaire that aims to reach a consensus in each stakeholder group on what needs to be included in the tool. Results will be presented to a workshop to discuss the statements brought forward and the optimal format for the tool. Once assembled, the tool will be field-tested through case studies to ensure validity and usability and to inform refinement, if required. The final version will be published online with a call for comments. Limitations include unpredictable sample representation, development of a compromise position rather than a consensus, and the absence of blinding in the validation exercise. ETHICS AND DISSEMINATION: The Imperial College Joint Research Compliance Office and the Imperial College Hospitals NHS Trust R&D department have approved the protocol. The checklist tool will be disseminated through a PhD thesis, a website, peer-reviewed publication, academic conferences and formal presentations.

    SOME LOGARITHMICALLY COMPLETELY MONOTONIC FUNCTIONS AND INEQUALITIES FOR MULTINOMIAL COEFFICIENTS AND MULTIVARIATE BETA FUNCTIONS: Completely monotonic functions and inequalities

    In the paper, the authors extend a function arising from the Bernoulli trials in probability and involving the gamma function to its largest ranges, find logarithmically complete monotonicity of these extended functions, and, in light of the logarithmically complete monotonicity of these extended functions, derive some inequalities for multinomial coefficients and multivariate beta functions. These results recover, extend, and generalize some known conclusions.
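    For context, the standard definitions behind these results (the paper's exact normalizations may differ): a positive function $f$ on an interval $I$ is logarithmically completely monotonic if $(-1)^k[\ln f(x)]^{(k)} \ge 0$ for all $x \in I$ and $k = 1, 2, 3, \dots$; the multivariate beta function is $B(a_1,\dots,a_n) = \Gamma(a_1)\cdots\Gamma(a_n)/\Gamma(a_1+\cdots+a_n)$; and the multinomial coefficient is $\binom{n}{k_1,\dots,k_m} = n!/(k_1!\cdots k_m!)$ with $k_1+\cdots+k_m = n$.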

    Closed formulas and identities on the Bell polynomials and falling factorials

    The authors establish a pair of closed-form expressions for special values of the Bell polynomials of the second kind for the falling factorials, derive two pairs of identities involving the falling factorials, find an equivalent expression between two special values for the Bell polynomials of the second kind, and present five closed-form expressions for the (modified) spherical Bessel functions.
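    For context, the standard objects involved (the paper's notation may differ): the falling factorial is $(x)_n = x(x-1)\cdots(x-n+1)$ with $(x)_0 = 1$, and the Bell polynomials of the second kind are
        $B_{n,k}(x_1, x_2, \dots, x_{n-k+1}) = \sum \frac{n!}{\ell_1!\,\ell_2!\cdots\ell_{n-k+1}!} \prod_{i=1}^{n-k+1} \left(\frac{x_i}{i!}\right)^{\ell_i}$,
    where the sum runs over all nonnegative integers $\ell_1, \dots, \ell_{n-k+1}$ satisfying $\sum_i \ell_i = k$ and $\sum_i i\,\ell_i = n$.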

    Rupture stress of eutectic composite ceramics with rod-shaped crystals

    Eutectic composite ceramics have a wide range of applications in the aerospace industry due to their excellent mechanical properties, and their rupture stress is a subject of considerable importance. Eutectic composite ceramics primarily consist of rod-shaped crystals, with a small number of particles and preexisting defects dispersed throughout; aligned nano-micron fibers are embedded within the rod-shaped crystals. The rupture stress of a eutectic composite ceramic depends on its fracture surface energy and preexisting defects. In this study, the equivalent fracture surface energy of a eutectic composite ceramic was calculated from its additional fracture work. Next, the effects of the preexisting defects were considered, and a micromechanical model of the eutectic composite ceramic was established based on its microstructural characteristics. The defects were assumed to be lamellar, and the surrounding matrix was assumed to be transversely isotropic. Using this model, the rupture stress of the eutectic composite ceramic was predicted. A comparison of the theoretical and experimental results indicated that the predicted rupture stresses agreed with the measured data.
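    As a generic illustration of the dependence noted above (the classical Griffith relation, not the transversely isotropic micromechanical model developed in the paper): for a lamellar defect of half-length $a$ in a matrix with effective Young's modulus $E$ and equivalent fracture surface energy $\gamma_{\mathrm{eff}}$, the plane-stress rupture stress scales as $\sigma_f = \sqrt{2E\gamma_{\mathrm{eff}}/(\pi a)}$, so larger preexisting defects and lower fracture surface energy both reduce the predicted strength.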

    Literature Survey on Interaction Techniques for Large Displays

    When designing for large screen displays, designers must deal with cursor tracking, interaction over distances, and space management. Because the screen can cover a large portion of the user's visual angle, it may be hard for users to begin and complete search tasks for basic items such as cursors or icons. In addition, maneuvering over long distances and acquiring small targets understandably takes more time than the same interactions on normal-sized screens. To deal with these issues, large display researchers have developed increasingly unconventional devices, methods and widgets for interaction, and systems for space and task management. For tracking cursors, there are techniques that manipulate the size, shape, and "density" of the cursor, as well as techniques that help direct the user's attention to the cursor. For target acquisition on large screens, many researchers have tried to augment existing 2D GUI metaphors, exploiting Fitts' law to do so: some techniques enlarge the targets, others enlarge the cursor itself, and still others find ways of closing the distances on large screen displays. However, many researchers feel that existing 2D metaphors do not and will not work for large screens, and that the community should move to more unconventional devices and metaphors, including eye-tracking, laser-pointing, hand-tracking, two-handed touchscreen techniques, and other high-DOF devices. In the end, many of these techniques do provide effective means of interaction on large displays. However, we need to quantify the benefits of these methods and understand them better: the more we understand the advantages and disadvantages of these techniques, the easier it will be to employ them in working large screen systems. We also need to put in place an interaction standard for large screen systems. This could mean simply supporting desktop events such as pointing and clicking, or it may mean identifying the needs of each domain in which large screens are used and tailoring the interaction techniques to that domain.
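    Since many of the surveyed techniques either enlarge the target or shorten the effective distance, the sketch below uses the Shannon formulation of Fitts' law to show how each manipulation lowers the predicted movement time (the intercept a and slope b are hypothetical placeholders, not values from any cited study):

        import math

        def fitts_mt(distance, width, a=0.2, b=0.1):
            """Predicted movement time (s) via the Shannon formulation of
            Fitts' law: MT = a + b * log2(D / W + 1). The coefficients a and b
            are hypothetical placeholders for this sketch."""
            return a + b * math.log2(distance / width + 1)

        baseline = fitts_mt(distance=2000, width=32)        # far, small target on a large display
        bigger_target = fitts_mt(distance=2000, width=128)  # target-enlarging technique
        shorter_reach = fitts_mt(distance=500, width=32)    # distance-closing technique

        print(f"baseline:      {baseline:.2f} s")
        print(f"bigger target: {bigger_target:.2f} s")
        print(f"shorter reach: {shorter_reach:.2f} s")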

    On the well-posedness of the stochastic Allen-Cahn equation in two dimensions

    Get PDF
    White noise-driven nonlinear stochastic partial differential equations (SPDEs) of parabolic type are frequently used to model physical and biological systems in space dimensions d = 1, 2, 3. Whereas existence and uniqueness of weak solutions to these equations are well established in one dimension, the situation is different for d ≥ 2. Despite their popularity in the applied sciences, higher-dimensional versions of these SPDE models are generally assumed to be ill-posed by the mathematics community. We study this discrepancy on the specific example of the two-dimensional Allen-Cahn equation driven by additive white noise. Since it is unclear how to define the notion of a weak solution to this equation, we regularize the noise and introduce a family of approximations. Based on heuristic arguments and numerical experiments, we conjecture that these approximations exhibit divergent behavior in the continuum limit. The results strongly suggest that a series of published numerical studies are problematic: shrinking the mesh size in these simulations does not lead to the recovery of a physically meaningful limit. Comment: 21 pages, 4 figures; accepted by the Journal of Computational Physics (Dec 2011).
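    For reference, in a common normalization the equation studied reads $\partial_t u = \Delta u + u - u^3 + \sigma\,\dot{W}(t,x)$ for $x$ in a two-dimensional domain, where $\dot{W}$ denotes space-time white noise (the paper's constants, domain and boundary conditions may differ); the approximations described above replace $\dot{W}$ by a spatially regularized noise and examine the behavior as the regularization is removed.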

    Fluctuations of Quantum Radiation Pressure in Dissipative Fluid

    Using the generalized Langevin equation approach involving the stress tensor, we study the dynamics of a perfectly reflecting mirror exposed to the electromagnetic radiation pressure of a laser beam in a fluid at finite temperature. Based on the fluctuation-dissipation theorem, the minimum uncertainty of the mirror's position measurement arising from both quantum and thermal noise effects, including the photon-counting error in the laser interferometer, is obtained in the small-time limit and compared with the "standard quantum limit". The result on the large-time behavior of the fluctuations of the mirror's velocity in a dissipative environment can be applied to the laser interferometers of ground-based gravitational wave detectors. Comment: 8 pages. Version published in Physics Letters.
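    For context, the benchmark referred to as the "standard quantum limit" for monitoring the position of a free mass $m$ over a measurement time $\tau$ is, up to factors of order unity, $\Delta x_{\mathrm{SQL}} \approx \sqrt{\hbar\tau/m}$; the combined quantum, thermal and photon-counting uncertainties obtained in the paper are compared against this scale.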