724 research outputs found
Fisher Lecture: Dimension Reduction in Regression [arXiv:0708.3774]
Comment: Published at http://dx.doi.org/10.1214/088342307000000041 in
Statistical Science (http://www.imstat.org/sts/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
A Conversation with Seymour Geisser
Seymour Geisser received his bachelor's degree in Mathematics from the City
College of New York in 1950, and his M.A. and Ph.D. degrees in Mathematical
Statistics at the University of North Carolina in 1952 and 1955, respectively.
He then held positions at the National Bureau of Standards and the National
Institute of Mental Health until 1961. From 1961 until 1965, he was Chief of
the Biometry Section at the National Institute of Arthritis and Metabolic
Diseases, and also held the position of Professorial Lecturer at the George
Washington University from 1960 to 1965. From 1965 to 1970, he was the founding
Chair of the Department of Statistics at the State University of New York,
Buffalo, and in 1971, he became the founding Director of the School of
Statistics at the University of Minnesota, remaining in that position until
2001. He held visiting professorships at Iowa State University, 1960;
University of Wisconsin, 1964; University of Tel-Aviv (Israel), 1971;
University of Waterloo (Canada), 1972; Stanford University, 1976, 1977, 1988;
Carnegie Mellon University, 1976; University of the Orange Free State (South
Africa), 1978, 1993; Harvard University, 1981; University of Chicago, 1985;
University of Warwick (England), 1986; University of Modena (Italy), 1996; and
National Chiao Tung University (Taiwan), 1998. He was the Lady Davis Visiting
Professor, Hebrew University of Jerusalem, 1991, 1994, 1999, and the Schor
Scholar, Merck Research Laboratories, 2002-2003. He was a Fellow of the
Institute of Mathematical Statistics and the American Statistical Association.
Comment: Published at http://dx.doi.org/10.1214/088342307000000131 in
Statistical Science (http://www.imstat.org/sts/) by the Institute of
Mathematical Statistics (http://www.imstat.org)
Single-Valued Arrays
I spaciously walk alone the path along which I was once led... and taught...
Paper Session II-A - A New Commercial Space Furnace- Developed on the Fast Track
A new space payload was recently developed which provides the capability for processing advanced metals and alloys. This payload features a high temperature sintering furnace which has successfully flown on the first two missions of the commercial SPACEHAB payload carrier (STS Missions 57 and 60). This paper describes the technical and programmatic approaches used to deliver the rack-mounted equipment in less than ten months from program initiation, and at a cost of less than $100,000 per pound (compared with astronaut-rated payloads). A key to the efficient and cost-effective approach was the use of the Universal Small Experiment Container (USEC) developed by Wyle Laboratories. This commercially developed product was used to incorporate the furnace, vacuum system, computer/controller, power conditioning, cooling system, pressurized gas purge system, gravity sensor, and other elements into a compact 220-pound package. The project has established a new milestone by demonstrating how more cost-effective payloads can be developed and flown on the Space Shuttle.
A Ferguson-Antoniak Approach to the Empirical Bayes Estimation of a Binomial Parameter
1 online resource (PDF, 27 pages)
Medical Malpractice Risk Management Early Warning Systems
The effectiveness of early warning systems based on incident occurrence reports in improving medical malpractice claims processing and outcomes was studied. Results showed that malpractice claims established on the basis of early warning incident reports not only involve the full range of injury severity, but also identify claims warranting substantial indemnity payments.
Some Lake Level Control Alternatives for the Great Salt Lake
Fluctuations of the level of the Great Salt Lake cause large changes in both surface area and shoreline. Developments adjacent to the lake have been damaged by both high and low lake levels, and unless measures are implemented to regulate lake level fluctuations or otherwise to protect these developments, damages will continue. Various possible management alternatives for mitigating potential damages from lake level fluctuations need to be examined and evaluated. In this study, three possible techniques are examined for reducing damages from fluctuating water levels at the lake, namely:
1. Consumptively using an increased proportion of the inflowing fresh waters on irrigated crop lands during periods of high lake inflow.
2. Protecting important properties and facilities around the lake through the construction of a system of dikes.
3. Removing lake water through pumping into the West Desert for evaporation.
The above three alternatives are evaluated only for economic feasibility, with physical, legal, and institutional constraints being neglected. The philosophy behind this approach was that if economic feasibility could be demonstrated, other investigations could follow. With reference to the first alternative, the additional irrigation is assumed to occur within the Bear River Basin. The Bear River, which contributes approximately 56 percent of the total inflow to the Great Salt Lake, drains the only tributary basin which contains significant areas of irrigable but not yet irrigated lands. A reconnaissance level economic analysis of each of the above management alternatives is presented. Capital and annual costs are estimated and compared with estimates of the flood control benefits generated. The overall feasibility, the optimum design, and the optimum time of construction are thus determined for each alternative.
From the results of the study, it is concluded that irrigation in the Bear River Basin, except perhaps as part of a multiple purpose project, and the West Desert pumping alternatives are not economically feasible. Particular configurations of the dike alternative are economically attractive if construction is commenced when lake levels rise to elevations exceeding 4202 feet.
Asteroseismology of the Transiting Exoplanet Host HD 17156 with HST FGS
Observations conducted with the Fine Guidance Sensor on the Hubble Space
Telescope (HST), providing high cadence and high precision time-series
photometry, were obtained over 10 consecutive days in December 2008 on the
host star of the transiting exoplanet HD 17156b. During this time 10^12
photons (corrected for detector deadtime) were collected, yielding a noise
level of 163 parts per million per 30-second sum and thus providing excellent
sensitivity for detecting the analog of the solar 5-minute p-mode
oscillations. For HD 17156, robust detection of p-modes supports
determination of the stellar mean density of 0.5301 +/- 0.0044 g/cm^3 from a
detailed fit to the observed frequencies of modes of degree l = 0, 1, and 2.
This is the first star for which direct determination of the mean stellar
density has been possible using both asteroseismology and detailed analysis
of a transiting planet light curve. Combining the density constraint from
asteroseismology with stellar evolution modeling yields M_star = 1.285 +/-
0.026 solar, R_star = 1.507 +/- 0.012 solar, and a stellar age of 3.2 +/-
0.3 Gyr.
Comment: Accepted by ApJ; 16 pages, 18 figures
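The quoted noise level is close to the pure photon (shot) noise limit, which can be checked with a short back-of-the-envelope calculation. This sketch assumes, purely for illustration, that the 10^12 photons are spread uniformly over the 10-day run with a 100% duty cycle and that the noise is purely Poissonian; the real observations have gaps and additional noise sources, which is why the measured 163 ppm differs slightly from this idealized figure.

```python
import math

# Idealized shot-noise estimate for the HD 17156 HST/FGS photometry.
# Assumptions (not from the abstract): uniform photon rate, 100% duty
# cycle, pure Poisson statistics.
total_photons = 1e12
run_seconds = 10 * 86400                      # 10 consecutive days
photons_per_30s = total_photons * 30 / run_seconds
shot_noise_ppm = 1e6 / math.sqrt(photons_per_30s)
print(f"{shot_noise_ppm:.0f} ppm per 30 s sum")  # ~170 ppm, near the reported 163 ppm
```

The slight difference from 163 ppm is expected: a realistic duty cycle concentrates the same photon count into less clock time, raising the per-sum photon count and lowering the shot noise.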
Adjoint sensitivity analysis in nuclear reactor fuel behavior modeling
A computer program, SCODE, has been developed for calculating sensitivities for EPRI's SPEAR-ALPHA nuclear fuel performance code FCODE-ALPHA. Eleven critical parameters are assessed for the effects of their independent variations on 33 basic variables in the FCODE-ALPHA model. The enormous wealth of sensitivities that results, consisting of 363 quantities per axial node per time step, is calculated following the FCODE-ALPHA computations on each time step. SCODE is based on adjoint sensitivity analysis, an analytic technique that obviates the need for numerical differentiation via repeated code runs at varied parameter values. Evaluation of sensitivities is reduced to a problem in linear algebra and is handled by standard matrix manipulations. Compared with the customary numerical differentiation approach, SCODE offers the advantages of significant runtime reduction, exactitude of results, and on-line computation of sensitivities.
Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/24288/1/0000554.pd
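The efficiency argument in the abstract — one adjoint solve replacing one forward re-solve per parameter — can be illustrated on a toy linear model. This is a generic sketch of adjoint sensitivity analysis, not the SCODE/FCODE-ALPHA implementation: the model `A u = B p` with scalar response `R = c^T u`, and the matrices `A`, `B`, `c` are all invented for the example.

```python
import numpy as np

# Adjoint sensitivity sketch: for A u = B p and response R = c^T u,
# a single adjoint solve A^T lam = c gives all parameter sensitivities
# dR/dp_j = (lam^T B)_j, versus one forward re-solve per parameter
# for numerical differentiation.
rng = np.random.default_rng(0)
n, n_params = 5, 3
A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned system matrix
B = rng.normal(size=(n, n_params))            # db/dp, with b(p) = B @ p
c = rng.normal(size=n)                        # response functional
p = rng.normal(size=n_params)                 # nominal parameter values

u = np.linalg.solve(A, B @ p)                 # one forward solution
lam = np.linalg.solve(A.T, c)                 # one adjoint solve
adjoint_grad = lam @ B                        # dR/dp for all parameters at once

# Reference: numerical differentiation, one extra solve per parameter
eps = 1e-6
fd_grad = np.array([
    (c @ np.linalg.solve(A, B @ (p + eps * e)) - c @ u) / eps
    for e in np.eye(n_params)
])
print(np.allclose(adjoint_grad, fd_grad, atol=1e-4))  # True
```

With 11 parameters and 363 sensitivities per axial node per time step, this is exactly the trade that makes the adjoint approach pay off: the linear-algebra cost is fixed, while the finite-difference cost grows with the number of parameters.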