
    Limits to parallelism in scientific computing

    The goal of our research is to decrease the execution time of scientific computing applications. We exploit the application's inherent parallelism to achieve this goal. This exploitation is expensive, as we analyze sequential applications and port them to parallel computers. Many scientific computing problems appear to have considerable exploitable parallelism; however, upon implementing a parallel solution on a parallel computer, limits to the parallelism are encountered. Unfortunately, many of these limits are characteristic of a specific parallel computer. This thesis explores these limits.

    We study the feasibility of exploiting the inherent parallelism of four NASA scientific computing applications. We use simple models to predict each application's degree of parallelism at several levels of granularity. From this analysis, we conclude that it is infeasible to exploit the inherent parallelism of two of the four applications. The interprocessor communication of one application is too expensive relative to its computation cost. The input and output costs of the other application are too expensive relative to its computation cost. We exploit the parallelism of the remaining two applications and measure their performance on an Intel iPSC/2 parallel computer. We parallelize an Optimal Control Boundary Value Problem; this guidance control problem determines an optimal trajectory of a boat in a river. We parallelize the Carbon Dioxide Slicing technique, a macrophysical cloud property retrieval algorithm that computes the height of a cloud top from cloud imager measurements. We also consider the feasibility of exploiting its massive parallelism on a MasPar MP-2 parallel computer. We conclude that many limits to parallelism are surmountable, while other limits are inescapable.

    From these limits, we elucidate some fundamental issues that must be considered when porting similar problems to yet-to-be-designed computers. We conclude that technological improvements that reduce the isolation of computational units free a programmer from many current concerns about the granularity of the work. We also conclude that technological improvements that relax the regimented guidance of the computational units allow a programmer to exploit the inherent heterogeneous parallelism of many applications.
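    As a rough illustration of the kind of "simple model" mentioned above, the sketch below compares per-task computation time against per-task communication time to estimate speedup at two levels of granularity. All machine and problem parameters are hypothetical placeholders, not iPSC/2 or MP-2 figures.

```python
# Illustrative sketch only: a crude granularity model in the spirit of the
# "simple models" mentioned in the abstract. All machine and problem
# parameters below are hypothetical placeholders, not iPSC/2 or MP-2 figures.

def estimated_speedup(total_flops, num_tasks, flops_per_sec,
                      bytes_per_task, bandwidth, latency):
    """Speedup estimate when each task computes its share of the work and
    then pays one message latency plus its communication volume."""
    serial_time = total_flops / flops_per_sec
    compute_per_task = (total_flops / num_tasks) / flops_per_sec
    comm_per_task = latency + bytes_per_task / bandwidth
    return serial_time / (compute_per_task + comm_per_task)

if __name__ == "__main__":
    machine = dict(flops_per_sec=1e6, bandwidth=2.8e6, latency=350e-6)
    for num_tasks in (16, 4096):
        s = estimated_speedup(total_flops=1e7, num_tasks=num_tasks,
                              bytes_per_task=1e4, **machine)
        print(f"{num_tasks:5d} tasks: speedup ~ {s:7.1f}, "
              f"efficiency ~ {s / num_tasks:.0%}")
```

    At fine granularity the fixed per-message cost dominates the shrinking per-task computation, which is one example of the machine-specific limits the thesis examines.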

    Development of response models for the Earth Radiation Budget Experiment (ERBE) sensors. Part 3: ERBE scanner measurement accuracy analysis due to reduced housekeeping data

    The accuracy of scanner measurements was evaluated when the sampling frequency of sensor housekeeping (HK) data was reduced from once every scan to once every eight scans. The resulting increase in uncertainty was greatest for sources with rapid or extreme temperature changes. This analysis focused on the mirror attenuator mosaic (MAM) baffle and plate and the scanner radiometer baffle because of their relatively large temperature changes during solar calibrations. Since only solar simulator data were available, the solar temperatures of these components were approximated, as were the radiative and thermal gradients in the MAM baffle due to reflected sunlight. Of the two cases considered for the MAM plate and baffle temperatures, one uses temperatures obtained from the ground calibration; the other uses temperatures computed from the MAM baffle model. This analysis shows that the heat input variations, due largely to the solar radiance and irradiance during a scan cycle, are small. It also demonstrates that reasonable intervals longer than the current HK data acquisition interval should not significantly affect the estimation of the radiation field in the sensor field of view.
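    As a purely illustrative sketch of the question being asked here (not the ERBE thermal model), the snippet below compares a smoothly varying housekeeping temperature sampled every scan against the same trace sampled once every eight scans and linearly interpolated back; the temperature excursion and all numbers are synthetic.

```python
# Synthetic illustration only: compare a housekeeping temperature sampled
# every scan against the same trace sampled once every eight scans and
# linearly interpolated back. Not the ERBE thermal model; all numbers are
# made up.
import numpy as np

scans = np.arange(512)
# Hypothetical temperature excursion during a solar calibration (kelvin).
true_temp = 300.0 + 15.0 * np.exp(-((scans - 256) / 60.0) ** 2)

coarse_scans = scans[::8]                 # HK sampled once every 8 scans
coarse_temp = true_temp[::8]
reconstructed = np.interp(scans, coarse_scans, coarse_temp)

err = reconstructed - true_temp
print(f"worst-case interpolation error: {np.max(np.abs(err)):.4f} K")
print(f"rms interpolation error:        {np.sqrt(np.mean(err**2)):.4f} K")
```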

    Physical Acoustics

    Contains reports on four research projects. U.S. Navy (Office of Naval Research) under Contract Nonr-1841(42).

    A 10 year follow-up study after Roux-Elmslie-Trillat treatment for cases of patellar instability

    Background: A retrospective study of patients presenting with patellar instability, treated using a Roux-Elmslie-Trillat reconstruction operation and followed up for 10 years following surgery, is presented.

    Methods: Pre-operative and follow-up radiographic evaluation included the weight-bearing anteroposterior and Merchant views. Evaluation was carried out using the Insall-Salvati index and the sulcus and congruence angles. The Roux-Elmslie-Trillat reconstruction operation was performed on 18 patients. The clinical evaluation at follow-up was performed using the Knee Society Score (KSS) and the Tegner score.

    Results: Subjective results of the operation were classed as excellent or good in 16 of the 18 patients ten years after surgery; persistent instability of the patella was recorded in only one of the 18 patients. The majority of patients returned to the same level of sporting activity after surgery as they had participated in before injury.

    Conclusions: The Roux-Elmslie-Trillat procedure could be recommended in cases presenting with an increased Q-angle, trochlear dysplasia, or failed soft tissue surgery. In the present study the majority of patients report a return to previous sporting activity ten years after surgery.

    Plasma Dynamics

    Contains research objectives and summary of research on eighteen research projects split into seven sections, and reports on four research projects.

    U.S. Atomic Energy Commission (Contract AT(11-1)-3070)
    National Science Foundation (Grant GK-37979X1)

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z to bbbar events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z to bbbar decays were tagged using displaced secondary vertices, and high momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 +- 0.0011 +- 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z to ccbar events in hadronic Z decays, is not included in the errors. The dependence on Rc is Delta(Rb)/Rb = -0.056*Delta(Rc)/Rc, where Delta(Rc) is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 +- 0.0003 predicted by the Standard Model.

    Comment: 42 pages, LaTeX, 14 eps figures included, submitted to European Physical Journal
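    A minimal sketch of the double-tagging algebra referred to above, under the simplifying assumption that non-b backgrounds and hemisphere efficiency correlations are neglected (the full analysis corrects for both); the input fractions are made up for illustration.

```python
# Simplified double-tagging relations: if f_single is the fraction of tagged
# hemispheres and f_double the fraction of events with both hemispheres
# tagged, then (neglecting backgrounds and hemisphere correlations)
#   f_single = eps_b * Rb      and      f_double = eps_b**2 * Rb,
# so both the tagging efficiency eps_b and Rb can be extracted from data.

def double_tag_solve(f_single, f_double):
    eps_b = f_double / f_single
    rb = f_single ** 2 / f_double
    return eps_b, rb

if __name__ == "__main__":
    # Hypothetical tag fractions chosen only for illustration.
    eps_b, rb = double_tag_solve(f_single=0.0436, f_double=0.00872)
    print(f"eps_b = {eps_b:.3f}, Rb = {rb:.4f}")  # -> eps_b = 0.200, Rb = 0.2180
```

    This is why the double-tag approach reduces systematic uncertainties: the tagging efficiency is taken from the data themselves rather than from simulation alone.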

    Measurement of the B+ and B-0 lifetimes and search for CP(T) violation using reconstructed secondary vertices

    The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 -> b bbar decays were tagged using displaced secondary vertices and high momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are

        tau(B+) = 1.643 +/- 0.037 +/- 0.025 ps
        tau(B0) = 1.523 +/- 0.057 +/- 0.053 ps
        tau(B+)/tau(B0) = 1.079 +/- 0.064 +/- 0.041,

    where in each case the first error is statistical and the second systematic.

    A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP and CPT violating effects by comparison of inclusive b and bbar hadron decays. No evidence for such effects is seen. The CP violation parameter Re(epsilon_B) is measured to be

        Re(epsilon_B) = 0.001 +/- 0.014 +/- 0.003,

    and the fractional difference between b and bbar hadron lifetimes is measured to be

        (Delta tau/tau)_b = [tau(b hadron) - tau(bbar hadron)] / tau(average) = -0.001 +/- 0.012 +/- 0.008.
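    For clarity, a tiny numeric illustration of the fractional lifetime-difference observable defined above; the input lifetimes are hypothetical, not OPAL values.

```python
# Tiny illustration of the observable (Delta tau / tau)_b defined in the
# abstract. The lifetimes passed in below are hypothetical, not OPAL results.

def fractional_lifetime_difference(tau_b, tau_bbar):
    """(tau(b hadron) - tau(bbar hadron)) / tau(average)."""
    tau_average = 0.5 * (tau_b + tau_bbar)
    return (tau_b - tau_bbar) / tau_average

print(f"{fractional_lifetime_difference(1.576, 1.578):+.4f}")  # ~ -0.0013
```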

    Plasma Dynamics

    Contains research objectives and summary of research on twenty-one projects split into three sections, with four sub-sections in the second section, and reports on twelve research projects.

    National Science Foundation (Grant ENG75-06242)
    U.S. Energy Research and Development Administration (Contract E(11-1)-2766)
    U.S. Energy Research and Development Administration (Contract E(11-1)-3070)
    Research Laboratory of Electronics, M.I.T. Industrial Fellowship

    Predicting new venture survival and growth: does the fog lift?

    This paper investigates whether new venture performance becomes easier to predict as the venture ages: does the fog lift? To address this question we primarily draw upon a theoretical framework, initially formulated in a managerial context by Levinthal (Adm Sci Q 36(3):397–420, 1991), that models new venture sales as a random walk, with survival determined by the stock of available resources (proxied by size). We derive theoretical predictions that are tested with a 10-year cohort of 6579 UK new ventures. We observe that our ability to predict firm growth deteriorates in the years after entry; in terms of the selection environment, the ‘fog’ seems to thicken. However, our survival predictions improve with time, implying that the ‘fog’ does lift.
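    As a toy illustration of the random-walk-with-absorbing-barrier mechanism the Levinthal framework rests on (a resource stock drifts randomly and the venture exits once the stock is exhausted), the sketch below simulates a cohort of hypothetical ventures; it is not the paper's empirical specification, and every parameter is made up.

```python
# Toy illustration of the Levinthal-style mechanism: a venture's resource
# stock follows a random walk and the venture exits once the stock hits zero.
# Not the paper's empirical model; every parameter here is made up.
import random

def simulate_venture(rng, initial_stock=5.0, years=10, drift=0.0, sigma=1.0):
    """Return the venture's resource stock at the end of each year (0 = exited)."""
    stock = initial_stock
    path = []
    for _ in range(years):
        if stock > 0:
            stock = max(0.0, stock + rng.gauss(drift, sigma))
        path.append(stock)
    return path

if __name__ == "__main__":
    rng = random.Random(42)
    cohort = [simulate_venture(rng) for _ in range(10_000)]
    for year in (1, 5, 10):
        alive = sum(1 for path in cohort if path[year - 1] > 0)
        print(f"year {year:2d}: {alive / len(cohort):.1%} of the cohort still trading")
```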