    The CPI for rents: a case of understated inflation.

    Until the end of 1977, the method used in the U.S. consumer price index (CPI) to measure rent inflation tended to omit rent increases when units had a change of tenants or were vacant. Since such units typically had more rapid rent increases than average units, this nonresponse biased inflation estimates downward. Beginning in 1978, the Bureau of Labor Statistics (BLS) implemented a series of methodological changes that reduced response bias, but substantial bias remained until 1985. We set up a model of response bias, parameterize it, and test it using a BLS microdata set for rents. We conclude that from 1940 to 1985 the CPI inflation rate for rent most likely was understated by 1.4 percentage points annually in U.S. data. We construct an improved rental inflation series for 1940 to 2000; at the starting point in 1940, the revised index is 54 percent as large as the official CPI. Keywords: Consumer price indexes; Rent; Inflation (Finance)

    Measuring American rents: a revisionist history.

    Until the end of 1977, the method used to measure changes in rent of primary residence in the U.S. consumer price index (CPI) tended to omit price changes when units changed tenants or were temporarily vacant. Since such units typically had more rapid rent increases than average units, omitting them biased inflation estimates downward. Beginning in 1978, the Bureau of Labor Statistics (BLS) implemented a series of methodological changes that reduced this bias. The authors use data from the American Housing Survey to check the success of the corrections. They compare estimates of the historical series adjusted for the BLS changes in methodology with a new hedonic estimate of changes in rental rates. The authors conclude that from 1940 to 1977 the CPI for rent would have been about 60 percent higher if current BLS practices had been used, an understatement of between 1.3 and 3.5 percentage points annually. Even after these corrections, the authors' hedonic estimates suggest that the current CPI methodology may still understate the rental inflation rate by one-half to 1 percentage point. Keywords: Rent

    The CPI for rents: a case of understated inflation

    Until the end of 1977, the U.S. consumer price index for rents tended to omit rent increases when units had a change of tenants or were vacant, biasing inflation estimates downward. Beginning in 1978, the Bureau of Labor Statistics (BLS) implemented a series of methodological changes that reduced this nonresponse bias, but substantial bias remained until 1985. The authors set up a model of nonresponse bias, parameterize it, and test it using a BLS microdata set for rents. From 1940 to 1985, the official BLS CPI-W price index for tenant rents rose 3.6 percent annually; the authors argue that it should have risen 5.0 percent annually. The relative price of rent in 1940 should therefore be only about half its official value, which has important consequences for historical measures of rent-to-house-price ratios and for the growth of real consumption. Keywords: Consumer price indexes; Rent; Inflation (Finance)
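
    A short back-of-the-envelope calculation clarifies how the 3.6 versus 5.0 percent annual rates quoted above translate into a roughly halved 1940 relative price. The sketch below uses only those two rates from the abstract and assumes, purely for illustration, that both the official and revised series are anchored to the same level in 1985:

```python
# Back-of-the-envelope check of the compounding claim in the abstract.
# The 3.6% (official) and 5.0% (revised) annual rent inflation rates for
# 1940-1985 come from the abstract; anchoring both indexes at 1.0 in 1985
# is an assumption made only for illustration.

official_rate = 0.036   # official CPI-W annual rent inflation, 1940-1985
revised_rate = 0.050    # authors' revised annual rent inflation
years = 1985 - 1940     # 45 years

# Implied 1940 index levels if both series equal 1.0 in 1985:
official_1940 = (1 + official_rate) ** (-years)
revised_1940 = (1 + revised_rate) ** (-years)

ratio = revised_1940 / official_1940
print(f"Implied 1940 revised level relative to official: {ratio:.2f}")
# ~0.55, i.e. roughly half, consistent with the '54 percent' figure above.
```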

    Quantifying the sensitivity of simulated climate change to model configuration

    Author Posting. © The Author(s), 2009. This is the author's version of the work. It is posted here by permission of Springer for personal use, not for redistribution. The definitive version was published in Climatic Change 3-4 (2009): 275-298, doi:10.1007/s10584-008-9494-x. This study used “factor separation” to quantify the sensitivity of simulated present and future surface temperatures and precipitation to alternative regional climate model physics components. The method enables a quantitative isolation of the effects of using each physical component as well as the combined effect of two or more components. Simulation results are presented from eight versions of the Mesoscale Modeling System Version 5 (MM5), one-way nested within one version of the Goddard Institute for Space Studies Atmosphere-Ocean Global Climate Model (GISS AOGCM). The MM5 simulations were made at 108 km grid spacing over the continental United States for five summers in the 1990s and 2050s. Results show that the choice of cumulus convection parameterization is the most important “factor” in the simulation of contemporary surface summer temperatures and precipitation over both the western and eastern United States. The choices of boundary layer scheme and radiation package also increase the range of model simulation results. Moreover, the alternative configurations give quite different results for surface temperature and precipitation in the 2050s. For example, simulated 2050s surface temperatures by the scheme with the coolest 1990s surface temperatures are comparable to 1990s temperatures produced by other schemes. The study analyzes the spatial distribution of 1990s to 2050s projected changes in the surface temperature for the eight MM5 versions. The predicted surface temperature change at a given grid point, averaged over all eight model configurations, is generally about twice the standard deviation of the eight predicted changes, indicating relative consensus among the different model projections. Factor separation analysis indicates that the choice of cumulus parameterization is the most important of the three modeling factors tested in contributing to the computed 1990s to 2050s surface temperature change, although enhanced warming over many areas is also attributable to synergistic effects of changing all three model components. Simulated ensemble mean precipitation changes, however, are very small and generally smaller than the inter-model standard deviations. The MM5 versions therefore offer little consensus regarding 1990s to 2050s changes in precipitation rates. This research was supported by Grant R828733 from the U.S. Environmental Protection Agency's Science to Achieve Results (STAR) program, NSF Grant ATM-0652518, NASA Grant NNX07AI93G, and the NASA Climate Variability and Climate Change Programs.
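
    The “factor separation” referenced above isolates each component's individual contribution and the synergy among components by differencing simulations run with every combination of components switched on and off. A minimal two-factor sketch of the bookkeeping (generic, in the form commonly attributed to Stein and Alpert; the numbers are placeholders, not results from this study) is:

```python
# Minimal two-factor factor-separation sketch (the study varied three
# components; the same differencing generalizes). Inputs are results from
# four simulations: neither factor (f_00), factor 1 only (f_10),
# factor 2 only (f_01), and both factors (f_11).

def factor_separation(f_00, f_10, f_01, f_11):
    """Return pure contributions of each factor and their synergy."""
    pure_1 = f_10 - f_00                  # effect of factor 1 alone
    pure_2 = f_01 - f_00                  # effect of factor 2 alone
    synergy = f_11 - f_10 - f_01 + f_00   # joint (interaction) effect
    return pure_1, pure_2, synergy

# Illustrative surface-temperature values in kelvin (placeholders only):
print(factor_separation(f_00=288.0, f_10=288.6, f_01=288.3, f_11=289.2))
# -> roughly (0.6, 0.3, 0.3): 0.6 K from factor 1, 0.3 K from factor 2,
#    and 0.3 K of synergy between the two.
```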

    Gravitational shear, flexion, and strong lensing in Abell 1689

    We present a gravitational lensing analysis of the galaxy cluster Abell 1689, incorporating measurements of the weak shear, flexion, and strong lensing induced in background galaxies. This is the first time that a shapelet technique has been used to reconstruct the distribution of mass in this cluster and the first time that a flexion signal has been measured using cluster members as lenses. From weak shear measurements alone, we generate a nonparametric mass reconstruction, which shows significant substructure corresponding to groups of galaxies within the cluster. In addition, our galaxy-galaxy flexion signal demonstrates that the cluster galaxies can be well fit by a singular isothermal sphere model with a characteristic velocity dispersion of σv = (295 ± 40) km s^(-1). We identify a major, distinct dark matter clump, offset by 40 h^(-1) kpc from the central cluster members, which was not apparent from shear measurements alone. This secondary clump is present in a parametric mass reconstruction using flexion data alone, and its existence is suggested in a nonparametric reconstruction of the cluster using a combination of strong and weak lensing. As found in previous studies, the mass profile obtained by combining weak and strong lensing data is much steeper than that obtained from weak lensing data alone.
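
    For a sense of the deflection scale implied by the quoted velocity dispersion, the standard singular isothermal sphere relation gives an Einstein radius of θ_E = 4π(σ_v/c)²(D_ls/D_s). The sketch below plugs in the σ_v from the abstract; the lens-source distance ratio is an assumed illustrative value, not one taken from the paper:

```python
import math

# Einstein radius of a singular isothermal sphere (SIS) lens:
#   theta_E = 4 * pi * (sigma_v / c)**2 * (D_ls / D_s)
# sigma_v comes from the abstract's galaxy-galaxy flexion fit; the distance
# ratio D_ls / D_s is an assumed illustrative value, not from the paper.

C_KM_S = 299_792.458     # speed of light [km/s]
sigma_v = 295.0          # velocity dispersion [km/s]
dls_over_ds = 0.5        # assumed lens-to-source / observer-to-source ratio

theta_e_rad = 4.0 * math.pi * (sigma_v / C_KM_S) ** 2 * dls_over_ds
theta_e_arcsec = math.degrees(theta_e_rad) * 3600.0

print(f"Einstein radius ~ {theta_e_arcsec:.2f} arcsec")  # ≈ 1.25 arcsec here
```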

    Interfacing TuLiP with the JPL Statechart Autocoder: Initial progress toward synthesis of flight software from formal specifications

    This paper describes the implementation of an interface connecting two tools, the JPL SCA (Statechart Autocoder) and TuLiP (Temporal Logic Planning Toolbox), to enable the automatic synthesis of low-level implementation code directly from formal specifications. With system dynamics, bounds on uncertainty, and formal specifications as inputs, TuLiP synthesizes Mealy machines that are correct-by-construction. An interface is built that automatically translates these Mealy machines into UML statecharts. The SCA accepts the UML statecharts (as XML files) to synthesize flight-certified implementation code. The functionality of the interface is demonstrated through three example systems of varying complexity: (a) a simple thermostat, (b) a simple speed controller for an autonomous vehicle, and (c) a more complex speed controller for an autonomous vehicle with a map element. In the thermostat controller, there is a specification regarding the desired temperature range that has to be met despite disturbance from the environment. Similarly, in the speed controllers there are specifications about safe driving speeds depending on sensor health (sensors fail unpredictably) and the map location. The significance of these demonstrations is the potential to circumvent some of the manual design of statecharts for flight software/controllers. As a result, we expect that less testing and validation will be necessary. In applications where the products of synthesis are used alongside manually designed components, extensive testing or new certificates of correctness of the composition may still be required.
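
    To make the translation step concrete, a minimal, self-contained sketch of converting a Mealy machine into a statechart-like XML document is shown below. The Mealy-machine data structure and the XML tags are hypothetical stand-ins chosen for illustration; they are not the actual TuLiP output format or the UML/XML schema consumed by the JPL SCA:

```python
import xml.etree.ElementTree as ET

# Hypothetical Mealy machine for a simple thermostat: states, an initial
# state, and transitions labeled with an input guard and an output action.
# This structure is an illustrative stand-in, not TuLiP's actual output.
mealy = {
    "states": ["Idle", "Heating"],
    "initial": "Idle",
    "transitions": [
        # (source, input guard, output action, target)
        ("Idle", "temp_low", "heater_on", "Heating"),
        ("Heating", "temp_ok", "heater_off", "Idle"),
    ],
}

def mealy_to_statechart_xml(machine):
    """Serialize a Mealy machine as a simple statechart-like XML document."""
    root = ET.Element("statechart", initial=machine["initial"])
    for name in machine["states"]:
        ET.SubElement(root, "state", name=name)
    for src, guard, action, dst in machine["transitions"]:
        ET.SubElement(root, "transition",
                      source=src, target=dst, guard=guard, action=action)
    return ET.tostring(root, encoding="unicode")

print(mealy_to_statechart_xml(mealy))
```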

    Sugarcane in Vitro Culture Technology: Applications for Kenya’s Sugar Industry

    Sugarcane (Saccharum officinarum L.) is one of the most important crops in Kenya and has a wide range of economic importance. The sugar industry contributes up to 15% to the country's agricultural gross domestic product, and an estimated 25% of the population depends on the industry for their livelihood. However, the industry has been facing several challenges, including declining yields due to the use of poor-quality planting materials. There is increasing pressure to enhance the productivity of sugarcane in order to sustain a profitable sugar industry in Kenya, while several diseases attack sugarcane and reduce its quality. Seed multiplication of newly released sugarcane varieties is one of the major constraints in Kenya, as it takes 6-7 years to produce a sufficient quantity of improved seed material. In vitro culture offers a practical and fast method for mass propagation of disease-free clonal materials. Successful protocols for shoot tip culture, callus culture, embryo culture, virus-free plant production, and somatic embryogenesis have already been established. Thus, in vitro technology can be used to enhance the productivity of sugarcane in Kenya. Keywords: Sugarcane, somaclonal variation, in vitro culture, meristems, micro-propagation, callus

    Sugarcane in vitro culture technology: Opportunities for Kenya’s sugar industry

    Sugarcane (Saccharum officinarum L.) is one of the most important crops in Kenya and has a wide range of economic importance. The sugar industry contributes up to 15% to the country's agricultural gross domestic product, and an estimated 25% of the population depends on the industry for their livelihood. However, the industry has been facing several challenges, including declining yields due to the use of poor-quality planting materials. There is increasing pressure to enhance the productivity of sugarcane in order to sustain a profitable sugar industry in Kenya, while several diseases attack sugarcane and reduce its quality. Seed multiplication of newly released sugarcane varieties is one of the major constraints in Kenya, as it takes 6-7 years to produce a sufficient quantity of improved seed material. In vitro culture offers a practical and fast method for mass propagation of disease-free clonal materials. Successful protocols for shoot tip culture, callus culture, embryo culture, virus-free plant production, and somatic embryogenesis have already been established. Thus, in vitro technology can be used to enhance the productivity of sugarcane in Kenya. Despite the several advantages of applying micro-propagation in sugarcane, such as quick multiplication of newly released varieties, rejuvenation of old, deteriorated varieties, production of disease-free seed, easy transportation of seed material, elimination of viruses, and high cane productivity and sugar yield, the technique has not gained popularity to the desired extent. Several constraints remain, including the high cost of production and the appearance of variants in the micropropagated population. The present article describes the status, challenges, and opportunities of in vitro technology for the sugar industry in Kenya. Though some problems have now been resolved to a considerable extent, as described in this review, other constraints still require intensive research before safe and efficient exploitation of this technique can be ensured in sugarcane seed production programmes for enhanced yields and quality. Keywords: Sugarcane, somaclonal variation, in vitro culture, meristems, micro-propagation, callus

    Engineering Resilient Space Systems

    Several distinct trends will influence space exploration missions in the next decade. Destinations are becoming more remote and mysterious, science questions more sophisticated, and, as mission experience accumulates, the most accessible targets are visited, advancing the knowledge frontier to more difficult, harsh, and inaccessible environments. This leads to new challenges including: hazardous conditions that limit mission lifetime, such as high radiation levels surrounding interesting destinations like Europa or toxic atmospheres of planetary bodies like Venus; unconstrained environments with navigation hazards, such as free-floating active small bodies; multielement missions required to answer more sophisticated questions, such as Mars Sample Return (MSR); and long-range missions, such as Kuiper belt exploration, that must survive equipment failures over the span of decades. These missions will need to be successful without a priori knowledge of the most efficient data collection techniques for optimum science return. Science objectives will have to be revised ‘on the fly’, with new data collection and navigation decisions on short timescales. Yet, even as science objectives are becoming more ambitious, several critical resources remain unchanged. Since physics imposes insurmountable light-time delays, anticipated improvements to the Deep Space Network (DSN) will only marginally improve the bandwidth and communications cadence to remote spacecraft. Fiscal resources are increasingly limited, resulting in fewer flagship missions, smaller spacecraft, and less subsystem redundancy. As missions visit more distant and formidable locations, the job of the operations team becomes more challenging, seemingly inconsistent with the trend of shrinking mission budgets for operations support. How can we continue to explore challenging new locations without increasing risk or system complexity? These challenges are present, to some degree, for the entire Decadal Survey mission portfolio, as documented in Vision and Voyages for Planetary Science in the Decade 2013–2022 (National Research Council, 2011), but are especially acute for the following mission examples, identified in our recently completed KISS Engineering Resilient Space Systems (ERSS) study: 1. A Venus lander, designed to sample the atmosphere and surface of Venus, would have to perform science operations as components and subsystems degrade and fail; 2. A Trojan asteroid tour spacecraft would spend significant time cruising to its ultimate destination (essentially hibernating to save on operations costs), then upon arrival, would have to act as its own surveyor, finding new objects and targets of opportunity as it approaches each asteroid, requiring response on short notice; and 3. A MSR campaign would not only be required to perform fast reconnaissance over long distances on the surface of Mars, interact with an unknown physical surface, and handle degradations and faults, but would also contain multiple components (launch vehicle, cruise stage, entry and landing vehicle, surface rover, ascent vehicle, orbiting cache, and Earth return vehicle) that dramatically increase the need for resilience to failure across the complex system. The concept of resilience and its relevance and application in various domains was a focus during the study, with several definitions of resilience proposed and discussed. 
While there was substantial variation in the specifics, there was a common conceptual core that emerged—adaptation in the presence of changing circumstances. These changes were couched in various ways—anomalies, disruptions, discoveries—but they all ultimately had to do with changes in underlying assumptions. Invalid assumptions, whether due to unexpected changes in the environment, or an inadequate understanding of interactions within the system, may cause unexpected or unintended system behavior. A system is resilient if it continues to perform the intended functions in the presence of invalid assumptions. Our study focused on areas of resilience that we felt needed additional exploration and integration, namely system and software architectures and capabilities, and autonomy technologies. (While also an important consideration, resilience in hardware is being addressed in multiple other venues, including 2 other KISS studies.) The study consisted of two workshops, separated by a seven-month focused study period. The first workshop (Workshop #1) explored the ‘problem space’ as an organizing theme, and the second workshop (Workshop #2) explored the ‘solution space’. In each workshop, focused discussions and exercises were interspersed with presentations from participants and invited speakers. The study period between the two workshops was organized as part of the synthesis activity during the first workshop. The study participants, after spending the initial days of the first workshop discussing the nature of resilience and its impact on future science missions, decided to split into three focus groups, each with a particular thrust, to explore specific ideas further and develop material needed for the second workshop. The three focus groups and areas of exploration were: 1. Reference missions: address/refine the resilience needs by exploring a set of reference missions 2. Capability survey: collect, document, and assess current efforts to develop capabilities and technology that could be used to address the documented needs, both inside and outside NASA 3. Architecture: analyze the impact of architecture on system resilience, and provide principles and guidance for architecting greater resilience in our future systems The key product of the second workshop was a set of capability roadmaps pertaining to the three reference missions selected for their representative coverage of the types of space missions envisioned for the future. From these three roadmaps, we have extracted several common capability patterns that would be appropriate targets for near-term technical development: one focused on graceful degradation of system functionality, a second focused on data understanding for science and engineering applications, and a third focused on hazard avoidance and environmental uncertainty. Continuing work is extending these roadmaps to identify candidate enablers of the capabilities from the following three categories: architecture solutions, technology solutions, and process solutions. The KISS study allowed a collection of diverse and engaged engineers, researchers, and scientists to think deeply about the theory, approaches, and technical issues involved in developing and applying resilience capabilities. The conclusions summarize the varied and disparate discussions that occurred during the study, and include new insights about the nature of the challenge and potential solutions: 1. There is a clear and definitive need for more resilient space systems. 
During our study period, the key scientists/engineers we engaged to understand potential future missions confirmed the scientific and risk reduction value of greater resilience in the systems used to perform these missions. 2. Resilience can be quantified in measurable terms—project cost, mission risk, and quality of science return. In order to consider resilience properly in the set of engineering trades performed during the design, integration, and operation of space systems, the benefits and costs of resilience need to be quantified. We believe, based on the work done during the study, that appropriate metrics to measure resilience must relate to risk, cost, and science quality/opportunity. Additional work is required to explicitly tie design decisions to these first-order concerns. 3. There are many existing basic technologies that can be applied to engineering resilient space systems. Through the discussions during the study, we found many varied approaches and research that address the various facets of resilience, some within NASA, and many more beyond. Examples from civil architecture, Department of Defense (DoD) / Defense Advanced Research Projects Agency (DARPA) initiatives, ‘smart’ power grid control, cyber-physical systems, software architecture, and application of formal verification methods for software were identified and discussed. The variety and scope of related efforts is encouraging and presents many opportunities for collaboration and development, and we expect many collaborative proposals and joint research as a result of the study. 4. Use of principled architectural approaches is key to managing complexity and integrating disparate technologies. The main challenge inherent in considering highly resilient space systems is that the increase in capability can result in an increase in complexity, with all of the risks and costs associated with more complex systems. What is needed is a better way of conceiving space systems that enables incorporation of capabilities without increasing complexity. We believe principled architecting approaches provide the needed means to convey a unified understanding of the system to primary stakeholders, thereby controlling complexity in the conception and development of resilient systems, and enabling the integration of disparate approaches and technologies. A representative architectural example is included in Appendix F. 5. Developing trusted resilience capabilities will require a diverse yet strategically directed research program. Despite the interest in, and benefits of, deploying resilient space systems, to date there has been a notable lack of meaningful demonstrated progress in systems capable of working in hazardous, uncertain situations. The roadmaps completed during the study, and documented in this report, provide the basis for a real, funded plan that considers the required fundamental work and evolution of needed capabilities. Exploring space is a challenging and difficult endeavor. Future space missions will require more resilience in order to perform the desired science in new environments under constraints of development and operations cost, acceptable risk, and communications delays. Development of space systems with resilient capabilities has the potential to expand the limits of possibility, revolutionizing space science by enabling as yet unforeseen missions and breakthrough science observations. Our KISS study provided an essential venue for the consideration of these challenges and goals.
Additional work and future steps are needed to realize the potential of resilient systems—this study provided the necessary catalyst to begin this process.

    A Study of Thymidylate Synthase Expression as a Biomarker for Resectable Colon Cancer: Alliance (Cancer and Leukemia Group B) 9581 and 89803.

    Purpose: Tumor levels of thymidylate synthase (TS), a target of 5-fluorouracil (5-FU)-based chemotherapy for colorectal cancer, have been studied as a predictive or prognostic biomarker with mixed results. Patients and Methods: Tumor TS levels were prospectively evaluated in two adjuvant therapy trials for patients with resected stage II or III colon cancer. TS expression was determined by standard immunohistochemistry and by automated quantitative analysis. Tumor mismatch repair deficiency (MMR-D) and BRAF c.1799T > A (p.V600E) mutation status were also examined. Relationships between tumor TS, MMR-D, and BRAF mutation status, overall survival (OS), and disease-free survival (DFS) were investigated in the subset of stage III patients. Results: Patients whose tumors demonstrated high TS expression experienced better treatment outcomes, with a DFS hazard ratio (HR) of 0.67 (95% confidence interval [CI], 0.53-0.84) and an OS HR of 0.68 (95% CI, 0.53-0.88) for high versus low TS expression. No significant interaction between TS expression and stage was observed (DFS: interaction HR = 0.94; OS: interaction HR = 0.94). Tumors with high TS expression were more likely to demonstrate MMR-D (22.2% vs. 12.8%; p = .0003). Patients whose tumors demonstrated both high TS and MMR-D had a 7-year DFS of 77%, compared with 58% for those whose tumors had low TS and were non-MMR-D (log-rank p = .0006). Tumor TS expression did not predict benefit of a particular therapeutic regimen. Conclusion: This large prospective analysis showed that high tumor TS levels were associated with improved DFS and OS following adjuvant therapy for colon cancer, although tumor TS expression did not predict benefit of 5-FU-based chemotherapy. The Oncologist 2017;22:107-114. Implications for Practice: This study finds that measurement of tumor levels of thymidylate synthase is not helpful in assigning specific adjuvant treatment for colorectal cancer. It also highlights the importance of using prospective analyses within treatment clinical trials as the optimal method of determining biomarker utility.