670 research outputs found

    The use of information and communication technologies by Portuguese teachers

    We present a study carried out in Portugal in 2001/2002 on the use of Information and Communication Technologies (ICT) by teachers at all teaching levels (except higher education), in both public and private schools. It was an initiative of the Ministry of Education (the “Nonio – 21st Century” programme), carried out by the Competence Centre “Softsciences” and the Centre for Computational Physics of the University of Coimbra. Some conclusions of this study, which collected data from 19,337 teachers, are the following: the majority of Portuguese teachers own a PC, and approximately half of them use it in several activities, though their use of computers with students is limited. Primary school teachers often use the PC in their schools, though probably in an incipient way. Self-training of teachers in ICT is quite common. The Internet is used more by 3rd-cycle (the last part of middle school) and high school teachers, most of its users being male and young. These and other conclusions should be taken into account in any strategy for promoting better use of new technologies in schools. The whole study is available at: http://nautilus.fid.uc.pt/cec/estud

    Time-averaged MSD of Brownian motion

    We study the statistical properties of the time-averaged mean-square displacement (TAMSD), a standard non-local quadratic functional for inferring the diffusion coefficient from an individual random trajectory of a diffusing tracer in single-particle tracking experiments. For Brownian motion, we derive an exact formula for the Laplace transform of the probability density of the TAMSD by mapping the original problem onto chains of coupled harmonic oscillators. From this formula, we deduce the first four cumulant moments of the TAMSD, the asymptotic behavior of the probability density, and its accurate approximation by a generalized Gamma distribution.
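    The TAMSD itself is simple to evaluate from a single trajectory; below is a minimal sketch for simulated 1-D Brownian motion. The time step, diffusion coefficient, and trajectory length are illustrative choices, not values from the paper.

```python
import numpy as np

def tamsd(x, lag):
    """Time-averaged mean-square displacement of a 1-D trajectory x
    at a given lag, averaging over all overlapping windows."""
    disp = x[lag:] - x[:-lag]
    return np.mean(disp**2)

# simulate a Brownian trajectory with diffusion coefficient D
rng = np.random.default_rng(0)
dt, D = 0.01, 1.0
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=10000)
x = np.cumsum(steps)

val = tamsd(x, 10)  # ≈ 2*D*lag*dt = 0.2 on average
```

    For a single finite trajectory the TAMSD fluctuates around its mean, which is exactly the spread the abstract characterizes.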

    A fast and accurate numerical method for the left tail of sums of independent random variables

    We present a flexible, deterministic numerical method for computing left-tail rare events of sums of non-negative, independent random variables. The method is based on iterative numerical integration of linear convolutions by means of Newton–Cotes rules. The periodicity properties of convoluted densities combined with the trapezoidal rule are exploited to produce a robust and efficient method, and the method is flexible in the sense that it can be applied to all kinds of non-negative continuous RVs. We present an error analysis and study the benefits of utilizing Newton–Cotes rules versus the fast Fourier transform (FFT) for numerical integration, showing that although there can be efficiency benefits to using FFT, Newton–Cotes rules tend to preserve the relative error better, and indeed do so at an acceptable computational cost. Numerical studies on problems with both known and unknown rare-event probabilities showcase the method’s performance and support our theoretical findings.
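    The core operation described — a linear convolution of densities on a uniform grid via the trapezoidal rule — can be sketched as follows. The grid and the Exp(1) test case are illustrative choices, not the paper's setup.

```python
import numpy as np

def convolve_trapezoid(f, g, h):
    """Linear convolution of two densities sampled on a uniform grid
    with spacing h, using the trapezoidal rule (endpoint weights 1/2)."""
    n = len(f)
    out = np.zeros(n)
    for k in range(n):
        w = np.ones(k + 1)
        w[0] = w[-1] = 0.5
        out[k] = h * np.sum(w * f[:k + 1] * g[k::-1])
    return out

# sanity check against a known case: the sum of two Exp(1) variables
# has density x * exp(-x) (an Erlang-2 distribution)
h = 0.01
x = np.arange(0, 5, h)
f = np.exp(-x)
conv = convolve_trapezoid(f, f, h)
```

    Iterating this convolution n-1 times yields the density of a sum of n variables, from which the left-tail probability follows by one further quadrature over [0, t].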

    Generalized Master Equations for Non-Poisson Dynamics on Networks

    The traditional way of studying temporal networks is to aggregate the dynamics of the edges to create a static weighted network. This implicitly assumes that the edges are governed by Poisson processes, which is not typically the case in empirical temporal networks. Consequently, we examine the effects of non-Poisson inter-event statistics on the dynamics of edges, and we apply the concept of a generalized master equation to the study of continuous-time random walks on networks. We show that the equation reduces to the standard rate equations when the underlying process is Poisson and that the stationary solution is determined by an effective transition matrix whose leading eigenvector is easy to calculate. We discuss the implications of our work for dynamical processes on temporal networks and for the construction of network diagnostics that take into account their nontrivial stochastic nature.
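    The last computational step — the stationary solution as the leading eigenvector of an effective transition matrix — can be sketched with power iteration. The 2×2 matrix below is a toy example, not derived from any particular inter-event-time distribution.

```python
import numpy as np

def stationary(T, iters=1000):
    """Leading eigenvector (eigenvalue 1) of a column-stochastic
    effective transition matrix, computed by power iteration."""
    v = np.ones(T.shape[0]) / T.shape[0]
    for _ in range(iters):
        v = T @ v
        v /= v.sum()       # renormalize to a probability vector
    return v

# toy effective transition matrix (columns sum to 1)
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])
pi = stationary(T)  # stationary occupation probabilities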

    Phonon Universal Transmission Fluctuations and Localization in Semiconductor Superlattices with a Controlled Degree of Order

    We study, both analytically and numerically, phonon transmission fluctuations and localization in partially ordered superlattices with correlations among neighboring layers. To generate a sequence of layers with a varying degree of order, we employ a model proposed by Hendricks and Teller, as well as partially ordered versions of deterministic aperiodic superlattices. By changing a parameter measuring the correlation among adjacent layers, the Hendricks–Teller superlattice exhibits a transition from periodic ordering, with alternating layers, to the opposite, phase-separated limit, including many intermediate arrangements and the completely random case. In the partially ordered versions of deterministic superlattices there is short-range order (among any N consecutive layers) and long-range disorder, as in N-state Markov chains. The average and fluctuations of the transmission, the backscattering rate, and the localization length in these multilayered systems are calculated from the superlattice structure factors we derive analytically. The standard deviation of the transmission versus the average transmission lies on a universal curve irrespective of the specific type of disorder of the superlattice. We illustrate these general results by applying them to several GaAs-AlAs superlattices for the proposed experimental observation of phonon universal transmission fluctuations. Comment: 16 pages, RevTeX
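    A Hendricks–Teller-type layer sequence can be illustrated with a two-state Markov chain over the two materials, where a single correlation parameter interpolates between the periodic, random, and phase-separated limits. This is a sketch under our own parameterization, not the paper's exact model.

```python
import numpy as np

def layer_sequence(n, p_same, seed=0):
    """Two-state Markov chain of layers (0 = GaAs, 1 = AlAs).
    p_same is the probability that the next layer repeats the current
    material: p_same = 0 gives a periodic ..ABAB.. superlattice,
    p_same = 0.5 a completely random one, and p_same -> 1 approaches
    the phase-separated limit."""
    rng = np.random.default_rng(seed)
    seq = np.empty(n, dtype=int)
    seq[0] = rng.integers(2)
    for i in range(1, n):
        stay = rng.random() < p_same
        seq[i] = seq[i - 1] if stay else 1 - seq[i - 1]
    return seq
```

    Sweeping p_same then produces the family of intermediate arrangements whose structure factors, transmission statistics, and localization lengths the abstract analyzes.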

    Economic model to examine the cost-effectiveness of FlowOx home therapy compared to standard care in patients with peripheral artery disease

    Background: Critical limb ischaemia is a severe stage of lower limb peripheral artery disease which can lead to tissue loss, gangrene, amputation and death. FlowOx™ therapy is a novel negative-pressure chamber system intended for home use to increase blood flow, reduce pain and improve wound healing for patients with peripheral artery disease and critical limb ischaemia. Methods: A Markov model was constructed to assess the relative cost-effectiveness of FlowOx™ therapy compared to standard care in lower limb peripheral artery disease patients with intermittent claudication or critical limb ischaemia. The model used data from two European trials of FlowOx™ therapy and published evidence on disease progression. From an NHS perspective, various FlowOx™ therapy scenarios were modelled by adjusting the dose of FlowOx™ therapy and the amount of other care received alongside FlowOx™ therapy, in comparison to standard care. Results: In the base case analysis, consisting of FlowOx™ therapy plus nominal care, the cost estimates were £12,704 for a single dose of FlowOx™ therapy per annum as compared with £15,523 for standard care. FlowOx™ therapy patients gained 0.27 additional quality-adjusted life years (QALYs) compared to standard care patients. This equated to a dominant incremental cost-effectiveness ratio per QALY gained. At NICE willingness-to-pay thresholds of £20,000 and £30,000 per QALY gained, FlowOx™ therapy in addition to standard care had a 0.80 and 1.00 probability of being cost-effective, respectively. Conclusions: FlowOx™ therapy delivered as a single annual dose may be a cost-effective treatment for peripheral artery disease. FlowOx™ therapy improved health outcomes and reduced treatment costs in this modelled cohort. The effectiveness and cost-effectiveness of FlowOx™ therapy are sensitive to disease severity, adherence, dose and treatment cost. Research assessing the impact of FlowOx™ therapy on NHS resource use is needed in order to provide a definitive economic evaluation.
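    The general shape of such a Markov cohort model can be sketched as follows. The states, transition probabilities, costs, utilities, and discount rate below are hypothetical placeholders for illustration, not the paper's inputs.

```python
import numpy as np

# Illustrative 3-state Markov cohort model; all numbers are invented.
states = ["claudication", "critical_ischaemia", "dead"]
P = np.array([[0.90, 0.07, 0.03],   # rows: from-state, cols: to-state
              [0.10, 0.80, 0.10],
              [0.00, 0.00, 1.00]])
cost = np.array([1000.0, 5000.0, 0.0])     # annual cost per state (£)
utility = np.array([0.75, 0.45, 0.0])      # annual QALY weight per state

def run_cohort(P, cost, utility, cycles=10, discount=0.035):
    """Discounted total cost and QALYs for a cohort starting in the
    claudication state, over annual cycles."""
    dist = np.array([1.0, 0.0, 0.0])
    total_cost = total_qaly = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t
        total_cost += d * (dist @ cost)
        total_qaly += d * (dist @ utility)
        dist = dist @ P                    # advance one cycle
    return total_cost, total_qaly

c, q = run_cohort(P, cost, utility)
```

    Running the same model twice, once with intervention-arm transition probabilities and costs and once with standard-care values, gives the incremental cost and QALY differences from which the ICER is formed.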

    The Keck Planet Search: Detectability and the Minimum Mass and Orbital Period Distribution of Extrasolar Planets

    We analyze 8 years of precise radial velocity measurements from the Keck Planet Search, characterizing the detection threshold, selection effects, and completeness of the survey. We carry out a systematic search for planets by assessing the false alarm probability associated with Keplerian orbit fits to the data. This allows us to understand the detection threshold for each star in terms of the number and time baseline of the observations, and the size of measurement errors and stellar jitter. We show that all planets with orbital periods <2000 days, velocity amplitudes >20 m/s, and eccentricities <0.6 have been announced, and summarize the candidates at lower amplitudes and longer orbital periods. For the remaining stars, we calculate upper limits on the velocity amplitude of a companion, typically 10 m/s, and use the non-detections to derive completeness corrections at low amplitudes and long orbital periods. We give the fraction of stars with a planet as a function of planet mass and orbital period, and extrapolate to long period orbits and low planet masses. A power law fit for planet masses >0.3 Jupiter masses and periods <2000 days gives a mass-period distribution dN=C M^\alpha P^\beta dlnM dlnP with \alpha=-0.31 \pm 0.2, \beta=0.26 \pm 0.1, and the normalization constant C such that 10.5% of solar type stars have a planet with mass in the range 0.3-10 Jupiter masses and orbital period 2-2000 days. The orbital period distribution shows an increase in the planet fraction by a factor of 5 for orbital periods beyond 300 days. Extrapolation gives 17-20% of stars having gas giant planets within 20 AU. Finally, taking into account differences in detectability, we find that M dwarfs are 3 to 10 times less likely to harbor a Jupiter mass planet than solar type stars. Comment: 20 pages, 17 figures, accepted for publication in PASP
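    Given the fitted exponents, the normalization constant C follows from requiring the quoted 10.5% planet fraction over the stated mass and period ranges; a sketch of that arithmetic (the closed-form integral is standard, but treat the resulting value as illustrative):

```python
# Normalize dN = C * M^alpha * P^beta dlnM dlnP so that the integral
# over 0.3-10 Jupiter masses and 2-2000 days equals 0.105.
alpha, beta = -0.31, 0.26

def powerlaw_integral(lo, hi, exp):
    """Integral of x^exp dln(x) from lo to hi = (hi^exp - lo^exp)/exp."""
    return (hi**exp - lo**exp) / exp

I = powerlaw_integral(0.3, 10.0, alpha) * powerlaw_integral(2.0, 2000.0, beta)
C = 0.105 / I   # fraction of stars per unit (lnM, lnP)
```

    With C fixed this way, the same double integral over wider ranges gives extrapolated occurrence rates like the 17-20% figure quoted for giant planets within 20 AU.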

    Firm heterogeneity and wages under different bargaining regimes : does a centralised union care for low-productivity firms?

    This paper studies the relationship between wages and the degree of firm heterogeneity in a given industry under different wage setting structures. To derive testable hypotheses, we set up a theoretical model that analyses the sensitivity of wages to the variability in productivity conditions in a unionised oligopoly framework. The model distinguishes centralised and decentralised wage determination. The theoretical results predict wages to be negatively associated with the degree of firm heterogeneity under centralised wage-setting, as unions internalise the negative externalities of a wage increase for low-productivity firms. We test this prediction using a linked employer-employee panel data set from the German mining and manufacturing sector. Consistent with our hypotheses, the empirical results suggest that under industry-level bargaining workers in more heterogeneous sectors receive lower wages than workers in more homogeneous sectors. In contrast, the degree of firm heterogeneity is found to have no negative impact on wages in uncovered firms and under firm-level contracts.

    Concise and Tight Security Analysis of the Bennett-Brassard 1984 Protocol with Finite Key Lengths

    We present a tight security analysis of the Bennett-Brassard 1984 protocol taking into account the finite size effect of key distillation, and achieving unconditional security. We begin by presenting a concise analysis utilizing the normal approximation of the hypergeometric function. We then show that a similarly tight bound can also be obtained by a rigorous argument without relying on any approximation. In particular, for the convenience of experimentalists who wish to evaluate the security of their QKD systems, we also give explicit procedures for our key distillation, and show how to calculate the secret key rate and the security parameter from a given set of experimental parameters. Besides the exact values of key rates and security parameters, we also present how to obtain their rough estimates using the normal approximation. Comment: 40 pages, 4 figures, revised arguments on security, and detailed explanations on how to use the theoretical results
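    The role of the normal approximation in finite-size estimation can be sketched as follows: bound the error rate of the unsampled bits from the rate observed on a test sample, with a normal quantile standing in for the exact hypergeometric tail. This is an illustrative bound of the generic textbook form, not the paper's exact formula.

```python
import math
from statistics import NormalDist

def phase_error_upper_bound(n, k, lam, eps):
    """Upper bound on the error rate of the n-k unsampled bits, given
    observed rate lam on k bits sampled without replacement, using the
    normal approximation of the hypergeometric distribution.
    eps is the allowed failure probability. Illustrative sketch only."""
    z = NormalDist().inv_cdf(1 - eps)          # normal quantile
    # variance of the unsampled-fraction estimate under sampling
    # without replacement
    var = lam * (1 - lam) * n / (k * (n - k))
    return lam + z * math.sqrt(var)

# e.g. 10^6 sifted bits, 10^5 test bits, 2% observed error rate
b = phase_error_upper_bound(10**6, 10**5, 0.02, 1e-10)
```

    The finite-size penalty (the square-root term) shrinks as the sample grows, which is why the key rate approaches its asymptotic value only for long keys.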

    Profiling of human acquired immunity against the salivary proteins of Phlebotomus papatasi reveals clusters of differential immunoreactivity

    Citation: Geraci, Nicholas S., Rami M. Mukbel, Michael T. Kemp, Mariha N. Wadsworth, Emil Lesho, Gwen M. Stayback, Matthew M. Champion, et al. 2014. “Profiling of Human Acquired Immunity Against the Salivary Proteins of Phlebotomus Papatasi Reveals Clusters of Differential Immunoreactivity.” The American Journal of Tropical Medicine and Hygiene 90 (5): 923–38. https://doi.org/10.4269/ajtmh.13-0130.
    Phlebotomus papatasi sand flies are among the primary vectors of Leishmania major parasites from Morocco to the Indian subcontinent and from southern Europe to central and eastern Africa. Antibody-based immunity to sand fly salivary gland proteins in human populations remains a complex contextual problem that is not yet fully understood. We profiled the immunoreactivities of plasma antibodies to sand fly salivary gland sonicates (SGSs) from 229 human blood donors residing in different regions of sand fly endemicity throughout Jordan and Egypt as well as 69 US military personnel, who were differentially exposed to P. papatasi bites and L. major infections in Iraq. Compared with plasma from control region donors, antibodies were significantly immunoreactive to five salivary proteins (12, 26, 30, 38, and 44 kDa) among Jordanian and Egyptian donors, with immunoglobulin G4 being the dominant anti-SGS isotype. US personnel were significantly immunoreactive to only two salivary proteins (38 and 14 kDa). Using k-means clustering, donors were segregated into four clusters distinguished by unique immunoreactivity profiles to varying combinations of the significantly immunogenic salivary proteins. SGS-induced cellular proliferation was diminished among donors residing in sand fly-endemic regions. These data provide a clearer picture of human immune responses to sand fly vector salivary constituents.
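    The k-means step — grouping donors by their antibody reactivity profiles — can be sketched with a minimal implementation; the two-group toy data below are invented for illustration, not the study's measurements.

```python
import numpy as np

def kmeans(X, init_centers, iters=50):
    """Minimal k-means over donor immunoreactivity profiles.
    X: (donors, antigens) matrix; init_centers: (k, antigens) seeds."""
    centers = init_centers.astype(float).copy()
    for _ in range(iters):
        # distance of every donor profile to every cluster center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(len(centers)):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# toy data: two donor groups with low vs high reactivity to 5 proteins
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.2, 0.05, (20, 5)),
               rng.normal(0.9, 0.05, (20, 5))])
labels, _ = kmeans(X, X[[0, -1]])
```

    In the study, each row would be a donor's reactivity to the immunogenic salivary proteins, and the resulting labels define the four clusters the abstract describes.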