Component-smoothed Inflation: Estimating the Persistent Component of Inflation in Real Time
This paper presents a new measure of underlying inflation: component-smoothed inflation. It approaches the problem of determining underlying inflation from a different direction than previous methods. Rather than excluding or trimming out volatile CPI items, it smoothes components of the CPI based on their volatility – CPI expenditure weights are maintained for all items. Items such as rent are smoothed a little, if at all, while volatile series such as fruit, vegetables and automotive fuel are smoothed a lot. This removes much of the temporary volatility in the CPI while retaining most of the persistent signal. Because our underlying inflation measure includes all CPI items at all times, it is robust to sustained relative price changes and is unbiased in the long run. A potential cost of this approach is that, unlike other measures, it places weight on lagged as well as contemporaneous prices for volatile series. An evaluation of the balance between the costs and benefits of this approach remains an open question.

Keywords: CPI; core inflation; underlying inflation; Australia; United States
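To make the idea concrete, here is a minimal sketch, not the authors' actual filter: each component is smoothed with an exponential moving average whose persistence grows with the component's historical volatility, and the smoothed components are aggregated with fixed expenditure weights. The mapping from volatility to smoothing intensity is an assumption introduced purely for illustration.

```python
import numpy as np

def component_smoothed_inflation(prices, weights, alpha_min=0.1, alpha_max=1.0):
    """Illustrative volatility-based smoothing of CPI components.

    prices  : (T, K) array of period-on-period inflation rates per component
    weights : (K,) CPI expenditure weights summing to 1
    Returns a (T,) series of component-smoothed inflation.

    NOTE: the volatility-to-alpha mapping below is a guess for
    illustration; the paper does not specify it here.
    """
    vol = prices.std(axis=0) + 1e-12
    # Volatile components get a small alpha (heavy smoothing);
    # stable components get an alpha near 1 (little or no smoothing).
    alpha = alpha_min + (alpha_max - alpha_min) * (vol.min() / vol)
    smoothed = np.empty_like(prices)
    smoothed[0] = prices[0]
    for t in range(1, len(prices)):
        smoothed[t] = alpha * prices[t] + (1 - alpha) * smoothed[t - 1]
    # Expenditure weights are kept for all items at all times.
    return smoothed @ weights
```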
Social learning and voluntary cooperation among like-minded people
Many people contribute to public goods but stop doing so once they experience free riding. We test the hypothesis that groups whose members know that they are composed only of ‘like-minded’ cooperators are able to maintain a higher cooperation level than the most cooperative, randomly-composed groups. Our experiments confirm this hypothesis. We also predict that groups of ‘like-minded’ free riders do not cooperate. Yet, we find a high level of strategic cooperation that eventually collapses. Our results underscore the importance of group composition and social learning by heterogeneously motivated agents to understand the dynamics of cooperation and free riding.

Keywords: Public goods, social learning, conditional cooperation, free riding, experiments
Dynamical spectral unmixing of multitemporal hyperspectral images
In this paper, we consider the problem of unmixing a time series of hyperspectral images. We propose a dynamical model based on linear mixing processes at each time instant. The spectral signatures and fractional abundances of the pure materials in the scene are seen as latent variables, and assumed to follow a general dynamical structure. Based on a simplified version of this model, we derive an efficient spectral unmixing algorithm to estimate the latent variables by performing alternating minimizations. The performance of the proposed approach is demonstrated on synthetic and real multitemporal hyperspectral images.

Comment: 13 pages, 10 figures
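As a rough illustration of the alternating-minimization idea, the sketch below runs generic alternating least squares for a single linear mixing Y ≈ MA with nonnegative, sum-to-one abundances. It is a baseline under those assumptions, not the paper's dynamical algorithm across time instants.

```python
import numpy as np

def unmix_als(Y, R, n_iter=50, eps=1e-9):
    """Toy alternating least-squares unmixing: Y (bands x pixels) ~ M @ A.

    M : (bands, R) endmember spectral signatures
    A : (R, pixels) fractional abundances (nonnegative, sum to one)
    Generic baseline only; the paper's dynamical structure is omitted.
    """
    rng = np.random.default_rng(0)
    B, P = Y.shape
    M = np.abs(rng.standard_normal((B, R)))
    A = np.abs(rng.standard_normal((R, P)))
    for _ in range(n_iter):
        # Update abundances with endmembers fixed, then renormalize
        # columns so abundances stay nonnegative and sum to one.
        A = np.clip(np.linalg.lstsq(M, Y, rcond=None)[0], 0, None)
        A /= A.sum(axis=0, keepdims=True) + eps
        # Update endmembers with abundances fixed.
        M = np.clip(np.linalg.lstsq(A.T, Y.T, rcond=None)[0].T, 0, None)
    return M, A
```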
Stereoscopic Sketchpad: 3D Digital Ink
--Context--
This project looked at the development of a stereoscopic 3D environment in which a user is able to draw freely in all three dimensions. The main focus was on the storage and manipulation of the ‘digital ink’ with which the user draws. For a drawing and sketching package to be effective it must not only have an easy-to-use interface, it must also be able to handle all input data quickly and efficiently so that the user can focus fully on their drawing.
--Background--
When it comes to sketching in three dimensions, the majority of applications currently available rely on vector-based drawing methods. This is primarily because the applications are designed to take a user's two-dimensional input and transform it into a three-dimensional model. Having the sketch represented as vectors makes it simpler for the program to act upon its geometry and thus convert it to a model. There are a number of methods to achieve this aim, including Gesture Based Modelling, Reconstruction and Blobby Inflation. Other vector-based applications focus on the creation of curves, allowing the user to draw within or on existing 3D models; they also allow the user to create wire-frame models. These stroke-based applications bring the user closer to traditional sketching than the more structured modelling methods detailed above.
While at present the field is inundated with vector-based applications, mainly focused on sketch-based modelling, there are significantly fewer voxel-based applications. The majority of these focus on the deformation and sculpting of voxmaps, almost the opposite of drawing and sketching, and on the creation of three-dimensional voxmaps from standard two-dimensional pixmaps. How to sketch freely within a scene represented by a voxmap has rarely been explored. This comes as a surprise, given that so many of the standard 2D drawing programs in use today are pixel based.
--Method--
As part of this project a simple three-dimensional drawing program was designed and implemented using C and C++. This tool, known as Sketch3D, was created using a Model View Controller (MVC) architecture. Due to the modular nature of Sketch3D's system architecture it is possible to plug a range of different data structures into the program to represent the ink in a variety of ways. A series of data structures were implemented and tested for efficiency: a simple list, a 3D array, and an octree. They were tested for the time it takes to insert or remove points; how easy it is to manipulate points once they are stored; and how the number of points stored affects the draw and rendering times. A sketch of the octree approach is given below.
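For illustration only, here is a minimal point octree of the kind benchmarked above, written as a Python sketch (Sketch3D itself was written in C/C++). The node capacity and midpoint splitting rule are assumptions, not the project's actual design.

```python
class Octree:
    """Toy point octree for 'digital ink' points (x, y, z)."""

    def __init__(self, center, half_size, capacity=8):
        self.center, self.half = center, half_size
        self.capacity = capacity   # assumed bucket size per leaf
        self.points = []
        self.children = None       # eight sub-octants once split

    def insert(self, p):
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append(p)
                return
            self._split()
        self.children[self._octant(p)].insert(p)

    def _octant(self, p):
        # Index 0..7 from the point's position relative to the center.
        cx, cy, cz = self.center
        return (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)

    def _split(self):
        # Create eight child octants and push existing points down.
        h = self.half / 2
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + (h if i & 1 else -h),
                    cy + (h if i & 2 else -h),
                    cz + (h if i & 4 else -h)), h, self.capacity)
            for i in range(8)
        ]
        for q in self.points:
            self.children[self._octant(q)].insert(q)
        self.points = []
```

The spatial subdivision is what lets the octree locate and erase nearby ink quickly while keeping render traversal cheap, which matches the trade-off reported in the results below.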
One of the key issues raised by this project was devising a means by which a user can draw in three dimensions using only two-dimensional input devices. The method settled upon and implemented involves using the mouse or a digital pen to sketch as one would in a standard 2D drawing package, while linking the up and down keyboard keys to the current depth. This allows the user to move in and out of the scene as they draw. Two user-interface tools were also developed to assist the user: a 3D cursor, and a toggle which, when on, highlights all of the points intersecting the depth plane on which the cursor currently resides. These tools allow the user to see exactly where they are drawing in relation to previously drawn lines. A sketch of this input mapping follows.
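A minimal sketch of the 2D-input-plus-depth-keys mapping described above; all names, the step size, and the event-handler shape are hypothetical stand-ins, not Sketch3D's actual API.

```python
DEPTH_STEP = 0.05   # assumed depth increment per key press
depth = 0.0
ink_points = []     # stand-in for the octree store sketched earlier

def on_key(key):
    """Up/down keys move the active drawing plane in depth."""
    global depth
    if key == "up":
        depth += DEPTH_STEP
    elif key == "down":
        depth -= DEPTH_STEP

def on_pointer_drag(x, y):
    """2D pointer positions are lifted to 3D using the current depth."""
    ink_points.append((x, y, depth))
```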
--Results--
The tests conducted on the data structures clearly revealed that the octree was the most effective. While not the most efficient in every area, it avoids the major pitfalls of the other structures. The list was extremely quick to render and draw to the screen but suffered severely when it came to finding and manipulating points already stored. In contrast, the three-dimensional array was able to erase or manipulate points effectively, while its draw time rendered the structure effectively useless, taking huge amounts of time to draw each frame.
The focus of this research was on how a 3D sketching package would go about storing and accessing the digital ink. This is just a basis for further research in this area, and many issues touched upon in this paper will require a more in-depth analysis. The primary area of this future research would be the creation of an effective user interface and the introduction of regular sketching-package features such as the saving and loading of images.
Regression Models for Count Data in R
The classical Poisson, geometric and negative binomial regression models for count data belong to the family of generalized linear models and are available at the core of the statistics toolbox in the R system for statistical computing. After reviewing the conceptual and computational features of these methods, a new implementation of hurdle and zero-inflated regression models in the functions hurdle() and zeroinfl() from the package pscl is introduced. It re-uses the design and functionality of the basic R functions, just as the underlying conceptual tools extend the classical models. Both hurdle and zero-inflated models are able to incorporate over-dispersion and excess zeros, two problems that typically occur in count data sets in economics and the social sciences, better than their classical counterparts. Using cross-section data on the demand for medical care, it is illustrated how the classical as well as the zero-augmented models can be fitted, inspected and tested in practice.
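For readers outside R, a rough analogue of a zero-inflated fit can be sketched with Python's statsmodels. This is not the pscl interface the paper describes, and the simulated data are purely illustrative.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Simulated zero-inflated counts: a Poisson process switched off with
# probability 0.3, mimicking the excess zeros seen in demand data.
rng = np.random.default_rng(42)
n = 500
x = rng.standard_normal(n)
mu = np.exp(0.5 + 0.8 * x)
y = rng.poisson(mu) * (rng.random(n) > 0.3)

X = sm.add_constant(x)
# Constant-only inflation equation; pscl's zeroinfl() allows a full
# regression formula for the zero part as well.
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=False)
print(zip_fit.summary())
```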
Distribution of discontinuous mudstone beds within wave-dominated shallow-marine deposits : Star Point Sandstone and Blackhawk Formation, Eastern Utah
Acknowledgements: Funding for this study was provided by the Research Council of Norway through the Petromaks project 193059 and the FORCE Safari Project. The lidar data were collected by Julien Vallet and Samuel Pitiot of Helimap Systems SA. Riegl LMS GmbH is acknowledged for software support. The first author would like to thank Oliver Severin Tynes for assistance in the field. Tore Grane Klausen and Gijs Allard Henstra are thanked for invaluable discussions. The authors would also like to thank Janok Bhattacharya, Cornel Olariu and one anonymous reviewer for their insightful comments, which improved this paper, and Frances Witehurst for his editorial comments.
Explaining temporal trends in annualized relapse rates in placebo groups of randomized controlled trials in relapsing multiple sclerosis: systematic review and meta-regression
Background: Recent studies have shown a decrease in annualised relapse rates (ARRs) in placebo groups of randomised controlled trials (RCTs) in relapsing multiple sclerosis (RMS).

Methods: We conducted a systematic literature search of RCTs in RMS. Data on eligibility criteria and baseline characteristics were extracted and tested for significant trends over time. A meta-regression was conducted to estimate their contribution to the decrease of trial ARRs over time.

Results: We identified 56 studies. Patient age at baseline (p < 0.001), mean duration of multiple sclerosis (MS) at baseline (p = 0.048), size of treatment groups (p = 0.003), Oxford Quality Scale scores (p = 0.021), and the number of eligibility criteria (p < 0.001) increased significantly, whereas pre-trial ARR (p = 0.001), the time span over which pre-trial ARR was calculated (p < 0.001), and the duration of placebo-controlled follow-up (p = 0.006) decreased significantly over time. In a meta-regression of trial placebo ARR, the temporal trend was found to be non-significant, with the major factors explaining the variation being pre-trial ARR, the number of years used to calculate pre-trial ARR, and study duration.

Conclusion: The observed decline in trial ARRs may result from decreasing pre-trial ARRs and a shorter time period over which pre-trial ARRs were calculated. Increasing patient age and duration of illness may also contribute.

Comment: 20 pages, 4 figures (main article) + 13 pages (web appendix)
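To illustrate the meta-regression step, the sketch below regresses log placebo ARR on trial year and pre-trial ARR, weighting trials by arm size: if pre-trial ARR absorbs the trend, the year coefficient shrinks toward zero. All numbers are invented for illustration and are not the study's extracted data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: one row per trial placebo arm (values invented).
year    = np.array([1990, 1994, 1998, 2002, 2006, 2010])
arr     = np.array([1.20, 1.05, 0.90, 0.72, 0.60, 0.45])  # placebo ARR
pre_arr = np.array([1.60, 1.50, 1.35, 1.10, 1.00, 0.85])  # pre-trial ARR
n_pat   = np.array([40, 60, 90, 150, 220, 300])           # arm size

# Weighted meta-regression of log ARR on year and pre-trial ARR.
X = sm.add_constant(np.column_stack([year, pre_arr]))
fit = sm.WLS(np.log(arr), X, weights=n_pat).fit()
print(fit.params)  # inspect whether the year effect survives adjustment
```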
Specialty chemicals manufacturing SMEs: Toolbox to support Environmental and Sustainable Systems (TESS)
