Neutrino tomography - Learning about the Earth's interior using the propagation of neutrinos
Because the propagation of neutrinos is affected by the presence of Earth
matter, neutrino measurements open new possibilities to probe the Earth's interior. Different
approaches range from techniques based upon the interaction of high energy
(above TeV) neutrinos with Earth matter, to methods using the MSW effect on the
neutrino oscillations of low energy (MeV to GeV) neutrinos. In principle,
neutrinos from many different sources (sun, atmosphere, supernovae, beams etc.)
can be used. In this talk, we summarize and compare different approaches with
an emphasis on more recent developments. In addition, we point out other
geophysical aspects relevant for neutrino oscillations.
Comment: 22 pages, 9 figures. Proceedings of "Neutrino sciences 2005: Neutrino geophysics", December 14-16, 2005, Honolulu, USA. Minor changes, some references added. Final version to appear in Earth, Moon, and Planets.
A sparse implementation of the Frisch-Newton algorithm for quantile regression: Working paper series--03-03
Recent experience has shown that interior-point methods using a log barrier approach are far superior to classical simplex methods for computing solutions to large parametric quantile regression problems. In many large empirical applications, the design matrix has a very sparse structure. A typical example is the classical fixed-effect model for panel data, where the parametric dimension of the model can be quite large but the number of non-zero elements is quite small. Adopting recent developments in sparse linear algebra, we introduce a modified version of the Frisch-Newton algorithm for quantile regression described in Koenker and Portnoy (1997). The new algorithm substantially reduces the storage (memory) requirements and increases computational speed. The modified algorithm also facilitates the development of nonparametric quantile regression methods. The pseudo design matrices employed in nonparametric quantile regression smoothing are inherently sparse in both the fidelity and roughness penalty components. Exploiting the sparse structure of these problems opens up a whole range of new possibilities for multivariate smoothing on large data sets via ANOVA-type decomposition and partial linear models.
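The linear program underlying such solvers is easy to sketch: the quantile loss is linearized by splitting each residual into positive and negative parts. The following is a minimal illustration of that LP formulation (not the Frisch-Newton algorithm itself), handing a sparse constraint matrix to SciPy's HiGHS backend:

```python
import numpy as np
from scipy import sparse
from scipy.optimize import linprog

def quantile_regression(X, y, tau=0.5):
    """Solve  min_b  sum_i rho_tau(y_i - x_i'b)  as the LP
        min  tau*1'u + (1-tau)*1'v   s.t.  X b + u - v = y,  u, v >= 0,
    where u and v are the positive/negative residual parts.
    A sketch only: real Frisch-Newton codes use a specialized
    interior-point method rather than a generic LP solver."""
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.full(n, tau), np.full(n, 1 - tau)])
    # Sparse equality constraints [X  I  -I] [b; u; v] = y
    A_eq = sparse.hstack(
        [sparse.csr_matrix(X), sparse.eye(n), -sparse.eye(n)], format="csr"
    )
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=200)
beta = quantile_regression(X, y, tau=0.5)  # median regression fit
```

For tau = 0.5 this reduces to median (least absolute deviations) regression; the sparse [X I -I] block is exactly where the fixed-effect sparsity discussed above pays off.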
Fast Polyhedral Adaptive Conjoint Estimation
We propose and test a new adaptive conjoint analysis method that draws on recent polyhedral “interior-point” developments in mathematical programming. The method is designed to offer accurate estimates after relatively few questions in problems involving many parameters. Each respondent’s questions are adapted based upon prior answers by that respondent. The method requires computer support but can operate in both Internet and off-line environments with no noticeable delay between questions. We use Monte Carlo simulations to compare the performance of the method against a broad array of relevant benchmarks. While no method dominates in all situations, polyhedral algorithms appear to hold significant potential when (a) metric profile comparisons are more accurate than the self-explicated importance measures used in benchmark methods, (b) respondent wear-out is a concern, and (c) product development and/or marketing teams wish to screen many features quickly. We also test hybrid methods that combine polyhedral algorithms with existing conjoint analysis methods. We close with suggestions on how polyhedral methods can be used to address other marketing problems.
Sloan School of Management and the Center for Innovation in Product Development at MIT
nprobust: Nonparametric Kernel-Based Estimation and Robust Bias-Corrected Inference
Nonparametric kernel density and local polynomial regression estimators are very popular in statistics, economics, and many other disciplines. They are routinely employed in applied work, either as part of the main empirical analysis or as a preliminary ingredient entering some other estimation or inference procedure. This article describes the main methodological and numerical features of the software package nprobust, which offers an array of estimation and inference procedures for nonparametric kernel-based density and local polynomial regression methods, implemented in both the R and Stata statistical platforms. The package includes not only classical bandwidth selection, estimation, and inference methods (Wand and Jones 1995; Fan and Gijbels 1996), but also other recent developments in the statistics and econometrics literatures such as robust bias-corrected inference and coverage error optimal bandwidth selection (Calonico, Cattaneo, and Farrell 2018, 2019a). Furthermore, this article also proposes a simple way of estimating optimal bandwidths in practice that always delivers the optimal mean square error convergence rate regardless of the specific evaluation point, that is, no matter whether it is implemented at a boundary or interior point. Numerical performance is illustrated using an empirical application and simulated data, where a detailed numerical comparison with other R packages is given.
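As a minimal illustration of the local polynomial estimators such packages implement (without nprobust's bias correction, robust inference, or data-driven bandwidth selection), a local linear fit at a single evaluation point is just a kernel-weighted least squares problem:

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Local linear regression estimate of m(x0) with a Gaussian kernel
    and bandwidth h: regress y on (1, x - x0) with kernel weights; the
    fitted intercept is the estimate. A sketch only, not nprobust's
    implementation."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # kernel weights
    Z = np.column_stack([np.ones_like(x), x - x0])
    WZ = Z * w[:, None]
    beta = np.linalg.solve(Z.T @ WZ, WZ.T @ y)    # weighted normal equations
    return beta[0]                                 # intercept = m_hat(x0)

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 500)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=500)
est = local_linear(x, y, x0=0.25, h=0.05)          # true m(0.25) = 1.0
```

Unlike the Nadaraya-Watson (local constant) estimator, the local linear fit keeps the same bias order at boundary and interior points, which is the design-adaptivity property the bandwidth discussion above turns on.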
Developments in linear and integer programming
In this review we describe recent developments in linear and integer (linear) programming. For over 50 years Operational Research practitioners have made use of linear optimisation models to aid decision making, and over this period the size of problems that can be solved has increased dramatically, the time required to solve problems has decreased substantially, and the flexibility of modelling and solving systems has increased steadily. Large models are no longer confined to large computers, and the flexibility of optimisation systems embedded in other decision support tools has made on-line decision making using linear programming a reality (and using integer programming a possibility). The review focuses on recent developments in algorithms, software and applications, and investigates some connections between linear optimisation and other technologies.
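The decision-making models such systems solve can be illustrated with a toy product-mix linear program (the coefficients are hypothetical, chosen only to show the formulation), solved here with SciPy's HiGHS backend:

```python
from scipy.optimize import linprog

# Toy product-mix LP: maximize profit 3x + 5y subject to three
# resource limits. linprog minimizes, so negate the objective.
res = linprog(
    c=[-3, -5],
    A_ub=[[1, 0],      # x  <= 4   (capacity of line 1)
          [0, 2],      # 2y <= 12  (capacity of line 2)
          [3, 2]],     # 3x + 2y <= 18 (shared resource)
    b_ub=[4, 12, 18],
    bounds=[(0, None), (0, None)],
    method="highs",
)
x, y = res.x
profit = -res.fun      # optimum 36 at (x, y) = (2, 6)
```

Restricting x and y to integers turns the same model into an integer program; the "possibility" of on-line integer programming noted above reflects how much harder that restriction makes the problem.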
EIT Reconstruction Algorithms: Pitfalls, Challenges and Recent Developments
We review developments, issues and challenges in Electrical Impedance
Tomography (EIT), for the 4th Workshop on Biomedical Applications of EIT,
Manchester 2003. We focus on the necessity for three dimensional data
collection and reconstruction, efficient solution of the forward problem and
present and future reconstruction algorithms. We also suggest common pitfalls
or "inverse crimes" to avoid.
Comment: A review paper for the 4th Workshop on Biomedical Applications of EIT, Manchester, UK, 2003.
Projection methods in conic optimization
There exist efficient algorithms to project a point onto the intersection of
a convex cone and an affine subspace. Those conic projections are in turn the
work-horse of a range of algorithms in conic optimization, having a variety of
applications in science, finance and engineering. This chapter reviews some of
these algorithms, emphasizing the so-called regularization algorithms for
linear conic optimization, and applications in polynomial optimization. This is
a presentation of the material of several recent research articles; we aim here
at clarifying the ideas, presenting them in a general framework, and pointing
out important techniques.
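A minimal sketch of such a conic projection, taking the positive semidefinite cone and the affine set {X : trace(X) = 1} as a concrete instance (the affine set stands in for a general affine subspace), via a simplified Dykstra alternating-projection scheme:

```python
import numpy as np

def proj_psd(X):
    """Euclidean projection onto the PSD cone: clip negative eigenvalues."""
    w, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.clip(w, 0, None)) @ V.T

def proj_trace(X, t=1.0):
    """Projection onto the affine subspace {X : trace(X) = t}."""
    n = X.shape[0]
    return X + (t - np.trace(X)) / n * np.eye(n)

def dykstra(X0, iters=1000):
    """Find the nearest point to X0 in (PSD cone) ∩ {trace = 1} by
    Dykstra's method, keeping a correction term only for the cone
    (for an affine set the correction can be dropped). A sketch of the
    projections the chapter builds on, not a production solver."""
    X, P = X0.copy(), np.zeros_like(X0)
    for _ in range(iters):
        Y = proj_psd(X + P)   # cone step on the corrected iterate
        P = X + P - Y         # Dykstra correction for the cone
        X = proj_trace(Y)     # affine step, no correction needed
    return X

A = np.array([[2.0, 0.5], [0.5, -1.0]])   # indefinite starting matrix
X = dykstra(A)
w = np.linalg.eigvalsh(X)                  # feasibility check
```

Plain alternating projections would also reach a feasible point here; the correction term is what makes the limit the *nearest* feasible point, which is the projection the regularization algorithms above require.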
Research Towards High Speed Freeforming
Additive manufacturing (AM) methods are currently utilised for the manufacture of prototypes and low volume, high cost parts. This is because in most cases the high material costs and low volumetric deposition rates of AM parts result in higher per part cost than traditional manufacturing methods. This paper brings together recent research aimed at improving the economics of AM, in particular Extrusion Freeforming (EF).
A new class of machine, termed High Speed Additive Manufacturing (HSAM), is described, in which software, hardware and materials advances are aggregated. HSAM could be cost-competitive with injection moulding for medium-sized, medium-quantity parts. A general outline for a HSAM machine and supply chain is provided, along with the future research required.