
    A guideline for heavy ion radiation testing for Single Event Upset (SEU)

    A guideline for heavy ion radiation testing for single event upset was prepared to assist new experimenters in preparing and directing tests. It describes how to estimate part vulnerability and select an irradiation facility, gives a broad-brush description of JPL equipment, outlines certain necessary pre-test procedures, and indicates the roles and testing guidelines for on-site test personnel. Detailed descriptions of the equipment needed to interface with the JPL test crew and equipment are not provided, nor is the guideline intended to meet the more generalized and broader requirements of a MIL-STD document. A detailed equipment description is available upon request, and a MIL-STD document is in the early stages of preparation.

    Decentralised Learning MACs for Collision-free Access in WLANs

    By combining the features of CSMA and TDMA, fully decentralised WLAN MAC schemes have recently been proposed that converge to collision-free schedules. In this paper we describe a MAC with optimal long-run throughput that is almost decentralised. We then design two schemes that are practically realisable, decentralised approximations of this optimal scheme and operate with different amounts of sensing information. We achieve this by (1) introducing learning algorithms that can substantially speed up convergence to collision-free operation; and (2) developing a decentralised schedule length adaptation scheme that provides long-run fair (uniform) access to the medium while maintaining collision-free access for arbitrary numbers of stations.
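The convergence-to-collision-free idea behind such learning MACs can be illustrated with a toy simulation. This is a hedged sketch, not the paper's actual algorithm: stations that transmit successfully in a slot keep that slot, while colliding stations re-pick a slot uniformly at random; the function name and parameters are invented for illustration.

```python
import random

def simulate(n_stations, n_slots, rounds, seed=0):
    """Toy decentralised slot-learning simulation: keep a slot after a
    success, re-pick uniformly at random after a collision."""
    rng = random.Random(seed)
    slots = [rng.randrange(n_slots) for _ in range(n_stations)]
    for _ in range(rounds):
        counts = {}
        for s in slots:
            counts[s] = counts.get(s, 0) + 1
        if all(c == 1 for c in counts.values()):
            return True, slots  # collision-free schedule reached
        # Colliding stations (count > 1 in their slot) choose a new slot.
        slots = [s if counts[s] == 1 else rng.randrange(n_slots)
                 for s in slots]
    return False, slots
```

With more slots than stations, this simple "stick on success" rule typically converges to a collision-free schedule within a handful of rounds; the paper's learning algorithms are designed to speed up exactly this kind of convergence.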

    Dairy and hog farming in northeastern Iowa

    On Northeastern Iowa dairy and hog farms, highest returns were obtained where the number of milk cows equaled litters of pigs. This meant about 6 pounds of hogs were produced to each pound of butterfat. Where hog production was less, returns were lower. The butterfat-hog price ratio, during the years of the study, favored hogs, with 1 pound butterfat worth only 3.5 pounds of hogs. Generally, the strictly dairy herds were more profitable than the dual-purpose herds, even though butterfat prices were unfavorable in comparison to beef, during the period studied. Income from beef in the dual-purpose herds was not enough to offset the lower sales of butterfat. The dairy herds, with 16.6 cows, averaged 229 pounds butterfat sold or used in the household, and 493 pounds beef per cow, while the dual-purpose herds, with 14.1 cows, averaged 162 pounds butterfat output and 711 pounds beef per cow

    An economic study of the dairy enterprise in northeastern Iowa

    In a study of 51 dairy and dual-purpose herds in 1935 and 1936 it was found that the average value per head of the milk cows was $66 in the high producing dairy herds, compared to $49 in the low producing ones; while investment in buildings and fences was $120 per cow in the high as compared to $90 in the low producing herds. Investment per pound of butterfat produced, however, was lower with the higher producing and more valuable cows. The cows in the higher producing herds were fed more heavily and received better balanced rations; the total amount of concentrates amounted to approximately 2,300 pounds in the high and 1,200 in the low producing herds, while the total value of all feeds plus pasture amounted to $72, compared to $50. When expressed per pound of butterfat, however, the values of feed and pasture were but little different between the high and low producing herds. In fact, the advantage was slightly with the high producing herds. There was a wide variation in the amount of feed fed per cow, which was only partly related to production per cow. The cows receiving the most feed generally produced more butterfat, but not necessarily in proportion to the difference in amount of feed. Consequently, the cows receiving the most feed did not give the highest return per $100 of feed fed

    Thermoelectric response of Fe1+yTe0.6Se0.4: evidence for strong correlation and low carrier density

    We present a study of the Seebeck and Nernst coefficients of Fe1+yTe1-xSex extended up to 28 T. The large magnitude of the Seebeck coefficient in the optimally doped sample tracks a remarkably low normalized Fermi temperature, which, as in other correlated superconductors, is only one order of magnitude larger than Tc. We combine our data with other experimentally measured coefficients of the system to extract a set of self-consistent parameters, which identify Fe1+yTe0.6Se0.4 as a low-density correlated superconductor barely in the clean limit. The system is subject to strong superconducting fluctuations with a sizeable vortex Nernst signal in a wide temperature window. Comment: 4 pages including 4 figures
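The link between the Seebeck coefficient and the Fermi temperature invoked in this abstract is, in the degenerate free-electron picture, commonly written as follows (a standard order-of-magnitude relation, not a formula quoted from the paper):

```latex
% Diffusive Seebeck response of a degenerate Fermi liquid:
\frac{S}{T} \simeq \pm \frac{\pi^{2}}{2}\,\frac{k_{B}}{e}\,\frac{1}{T_{F}}
```

A large S/T thus directly signals a low Fermi temperature T_F; with T_F only an order of magnitude above Tc, the low-carrier-density, strongly correlated picture follows.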

    Accurate exchange-correlation energies for the warm dense electron gas

    Density matrix quantum Monte Carlo (DMQMC) is used to sample exact-on-average N-body density matrices for uniform electron gas systems of up to 10^124 matrix elements via a stochastic solution of the Bloch equation. The results of these calculations resolve a current debate over the accuracy of the data used to parametrize finite-temperature density functionals. Exchange-correlation energies calculated using the real-space restricted path-integral formalism and the k-space configuration path-integral formalism disagree by up to ~10% at certain reduced temperatures T/T_F ≤ 0.5 and densities r_s ≤ 1. Our calculations confirm the accuracy of the configuration path-integral Monte Carlo results available at high density and bridge the gap to lower densities, providing trustworthy data in the regime typical of planetary interiors and solids subject to laser irradiation. We demonstrate that DMQMC can calculate free energies directly and present exact free energies for T/T_F ≥ 1 and r_s ≤ 2. Comment: Accepted version: added free energy data and restructured text. Now includes supplementary material
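The Bloch equation that DMQMC solves stochastically is, in its commonly used symmetrised form (a sketch of the standard equation; the paper's exact propagation scheme may differ):

```latex
% Symmetrised Bloch equation, propagated in inverse temperature
% \beta = 1/(k_B T), starting from the identity (infinite temperature):
\frac{d\hat{\rho}}{d\beta}
  = -\tfrac{1}{2}\left(\hat{H}\hat{\rho} + \hat{\rho}\hat{H}\right),
\qquad \hat{\rho}(0) = \hat{I}
```

Sampling this evolution with a stochastic population of walkers on matrix elements is what makes density matrices with ~10^124 elements tractable on average.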

    A realistic evaluation : the case of protocol-based care

    Background 'Protocol based care' was envisioned by policy makers as a mechanism for delivering on the service improvement agenda in England. Realistic evaluation is an increasingly popular approach, but few published examples exist, particularly in implementation research. To fill this gap, within this paper we describe the application of a realistic evaluation approach to the study of protocol-based care, whilst sharing findings of relevance about standardising care through the use of protocols, guidelines, and pathways. Methods Situated between positivism and relativism, realistic evaluation is concerned with the identification of underlying causal mechanisms, how they work, and under what conditions. Fundamentally it focuses attention on finding out what works, for whom, how, and in what circumstances. Results In this research, we were interested in understanding the relationships between the type and nature of particular approaches to protocol-based care (mechanisms), within different clinical settings (context), and what impacts this resulted in (outcomes). An evidence review using the principles of realist synthesis resulted in a number of propositions, i.e., context, mechanism, and outcome threads (CMOs). These propositions were then 'tested' through multiple case studies, using multiple methods including non-participant observation, interviews, and document analysis through an iterative analysis process. The initial propositions (conjectured CMOs) only partially corresponded to the findings that emerged during analysis. From the iterative analysis process of scrutinising mechanisms, context, and outcomes we were able to draw out some theoretically generalisable features about what works, for whom, how, and in what circumstances in relation to the use of standardised care approaches (refined CMOs). 
Conclusions As one of the first studies to apply realistic evaluation in implementation research, it was a good fit, particularly given the growing emphasis on understanding how context influences evidence-based practice. The strengths and limitations of the approach are considered, including how to operationalise it and some of the challenges. This approach provided a useful interpretive framework with which to make sense of the multiple factors that were simultaneously at play and being observed through various data sources, and for developing explanatory theory about using standardised care approaches in practice

    Diet as prophylaxis and treatment for venous thromboembolism?

    Background: Both prophylaxis and treatment of venous thromboembolism (VTE: deep venous thrombosis (DVT) and pulmonary emboli (PE)) with anticoagulants are associated with significant risks of major and fatal hemorrhage. Anticoagulation treatment of VTE has been the standard of care in the USA since before 1962, when the U.S. Food and Drug Administration began requiring randomized controlled clinical trials (RCTs) showing efficacy, so efficacy trials were never required for FDA approval. In clinical trials of 'high VTE risk' surgical patients before the 1980s, anticoagulant prophylaxis was clearly beneficial (fatal pulmonary emboli (FPE) without anticoagulants = 0.99%, FPE with anticoagulants = 0.31%). However, observational studies and RCTs of 'high VTE risk' surgical patients from the 1980s until 2010 show that FPE deaths without anticoagulants are about one-fourth the rate that occurs during prophylaxis with anticoagulants (FPE without anticoagulants = 0.023%, FPE while receiving anticoagulant prophylaxis = 0.10%). Additionally, an FPE rate of about 0.012% (35/28,400) in patients receiving prophylactic anticoagulants can be attributed to 'rebound hypercoagulation' in the two months after stopping anticoagulants. Alternatives to anticoagulant prophylaxis should be explored. Methods and Findings: The literature concerning dietary influences on VTE incidence was reviewed. Hypotheses concerning the etiology of VTE were critiqued in relationship to the rationale for dietary versus anticoagulant approaches to prophylaxis and treatment. Epidemiological evidence suggests that a diet with ample fruits and vegetables and little meat may substantially reduce the risk of VTE; vegetarian, vegan, or Mediterranean diets favorably affect serum markers of hemostasis and inflammation. The valve cusp hypoxia hypothesis of DVT/VTE etiology is consistent with the development of VTE being affected directly or indirectly by diet. However, it is less consistent with the rationale of using anticoagulants as VTE prophylaxis. For both prophylaxis and treatment of VTE, we propose RCTs comparing standard anticoagulation with low VTE risk diets, and we discuss the statistical considerations for an example of such a trial. Conclusions: Because of (a) the risks of biochemical anticoagulation as anti-VTE prophylaxis or treatment, (b) the lack of placebo-controlled efficacy data supporting anticoagulant treatment of VTE, (c) the dramatically reduced hospital-acquired FPE incidence in surgical patients without anticoagulant prophylaxis from 1980-2010 relative to the 1960s and 1970s, and (d) evidence that VTE incidence and outcomes may be influenced by diet, randomized controlled non-inferiority clinical trials are proposed to compare standard anticoagulant treatment with potentially low VTE risk diets. We call upon the U.S. National Institutes of Health and the U.K. National Institute for Health and Clinical Excellence to design and fund those trials.
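The scale of the statistical considerations for such a non-inferiority trial can be sketched with the standard two-proportion sample-size approximation. This is a minimal sketch with illustrative numbers only; the event rates, margin, and function name are assumptions, not figures from the study.

```python
from statistics import NormalDist

def noninferiority_sample_size(p_control, p_treat, margin,
                               alpha=0.025, power=0.90):
    """Approximate per-arm sample size for a non-inferiority comparison
    of two proportions (normal approximation, risk-difference margin)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # one-sided alpha
    z_beta = NormalDist().inv_cdf(power)
    variance = p_control * (1 - p_control) + p_treat * (1 - p_treat)
    effect = margin - (p_treat - p_control)
    return (z_alpha + z_beta) ** 2 * variance / effect ** 2

# Hypothetical example: 0.1% event rate in both arms with a 0.1% absolute
# non-inferiority margin -> roughly 21,000 patients per arm.
n = noninferiority_sample_size(0.001, 0.001, 0.001)
```

With rare endpoints such as fatal pulmonary emboli, the required enrolment grows rapidly as the event rate and margin shrink, which is why the choice of endpoint and margin dominates the design of such a trial.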

    Endogenous fantasy and learning in digital games.

    Many people believe that educational games are effective because they motivate children to actively engage in a learning activity as part of playing the game. However, seminal work by Malone (1981), exploring the motivational aspects of digital games, concluded that the educational effectiveness of a digital game depends on the way in which learning content is integrated into the fantasy context of the game. In particular, he claimed that content which is intrinsically related to the fantasy will produce better learning than that which is merely extrinsically related. However, this distinction between intrinsic and extrinsic (or endogenous and exogenous) fantasy has acquired a confused standing in the years since. This paper addresses this confusion by providing a review and critique of the empirical and theoretical foundations of endogenous fantasy, and its relevance to creating educational digital games. Substantial concerns are raised about the empirical basis of this work, and a theoretical critique of endogenous fantasy is offered, concluding that endogenous fantasy is a misnomer, insofar as the "integral and continuing relationship" of fantasy cannot be justified as a critical means of improving the effectiveness of educational digital games. An alternative perspective on the intrinsic integration of learning content is described, incorporating game mechanics, flow and representations