Decoupling control technology for medium STOL transports
The advanced control technology needed to cope with the medium STOL transport landing problem is considered, in particular the necessity of decoupling through active control techniques. It is shown that the need to decouple is independent of the powered-lift concept, but that the provision for decoupling depends most strongly on the assumed piloting technique. The implications of decoupling and active control techniques for pilot technique options, handling-quality criteria, flight control mechanization, and the use of piloted simulation as a design tool are also discussed.
Maximum likelihood and pseudo score approaches for parametric time-to-event analysis with informative entry times
We develop a maximum likelihood estimation approach for time-to-event Weibull regression models with outcome-dependent sampling, where sampling of subjects depends on the residual fraction of the time left to developing the event of interest. Additionally, we propose a two-stage approach which proceeds by iteratively estimating, through a pseudo score, the Weibull parameters of interest (i.e., the regression parameters) conditional on the inverse probability of sampling weights; and then re-estimating these weights (given the updated Weibull parameter estimates) through the profiled full likelihood. With these two new methods, both the estimated sampling mechanism parameters and the Weibull parameters are consistently estimated under correct specification of the conditional referral distribution. Standard errors for the regression parameters are obtained directly from inverting the observed information matrix in the full likelihood specification, and by calculating either bootstrap or robust standard errors for the hybrid pseudo score/profiled likelihood approach. Loss of efficiency with the latter approach is considered. Robustness of the proposed methods to misspecification of the referral mechanism and the time-to-event distribution is also briefly examined. Further, we show how to extend our methods to the family of parametric time-to-event distributions characterized by the generalized gamma distribution. The motivation for these two approaches came from data on time to cirrhosis from hepatitis C viral infection in patients referred to the Edinburgh liver clinic. We analyze these data here.
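The two-stage scheme lends itself to a compact implementation. Below is a hedged Python sketch of the iteration on simulated data; the referral model (an expit in the fitted survival fraction), all parameter values, and the uncensored likelihood are illustrative assumptions, not the estimator from the paper.

```python
# Illustrative sketch of the two-stage pseudo-score / IPW iteration on
# simulated data. The referral model below is a placeholder assumption,
# not the paper's mechanism, and censoring is omitted for brevity.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)

# Full cohort: Weibull times with one covariate.
n, k_true, beta_true = 5000, 1.5, 0.7
x = rng.normal(size=n)
t = np.exp(beta_true * x) * rng.weibull(k_true, size=n)

# Outcome-dependent sampling: subjects closer to the event are more
# likely to be referred into the study.
surv_true = np.exp(-(t / np.exp(beta_true * x)) ** k_true)
keep = rng.uniform(size=n) < expit(1.0 + 2.0 * (1.0 - surv_true))
t, x = t[keep], x[keep]

def neg_loglik(theta, t, x, w):
    """Weighted Weibull negative log-likelihood; theta = (log k, beta)."""
    k, beta = np.exp(theta[0]), theta[1]
    z = t / np.exp(beta * x)
    return -(w * (np.log(k) - np.log(t) + k * np.log(z) - z**k)).sum()

theta, w = np.array([0.0, 0.0]), np.ones(t.size)
for _ in range(10):
    # Stage 1: pseudo-score step -- weighted Weibull fit at current weights.
    theta = minimize(neg_loglik, theta, args=(t, x, w)).x
    # Stage 2: refresh inverse-probability-of-sampling weights under the
    # placeholder referral model, given the updated fit.
    surv = np.exp(-(t / np.exp(theta[1] * x)) ** np.exp(theta[0]))
    w = 1.0 / np.clip(expit(1.0 + 2.0 * (1.0 - surv)), 1e-3, 1.0)

# Expected to land near the true (1.5, 0.7) since the weights match
# the sampling model used to thin the cohort.
print("shape:", np.exp(theta[0]), "beta:", theta[1])
```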
Deep shower interpretation of the cosmic ray events observed in excess of the Greisen-Zatsepin-Kuzmin energy
We consider the possibility that the ultra-high-energy cosmic ray flux has a small component of exotic particles which create showers much deeper in the atmosphere than ordinary hadronic primaries. It is shown that applying the conventional AGASA/HiRes/Auger data analysis procedures to such exotic events results in large systematic biases in the energy spectrum measurement. Sub-GZK exotic showers may be mis-reconstructed with much higher energies and mimic super-GZK events. Alternatively, super-GZK exotic showers may elude detection by conventional fluorescence analysis techniques.
Social scale and collective computation: Does information processing limit rate of growth in scale?
Collective computation is the process by which groups store and share information to arrive at decisions for collective behavior. How societies engage in effective collective computation depends partly on their scale. Social arrangements and technologies that work for small- and mid-scale societies are inadequate for dealing effectively with the much larger communication loads that societies face during the growth in scale that is a hallmark of the Holocene. An important bottleneck for growth may be the development of systems for persistent recording of information (writing), and perhaps also the abstraction of money for generalizing exchange mechanisms. Building on Shin et al., we identify a Scale Threshold to be crossed before societies can develop such systems, and an Information Threshold which, once crossed, allows more or less unlimited growth in scale. We introduce several additional articles in this special issue that elaborate or evaluate this Thresholds Model for particular types of societies or times and places in the world.

Contents:
1 Introduction
2 Seshat: The Global History Databank
2.1 Quantitative historical analysis uncovers a single dimension of complexity that structures global variation in human social organization
2.2 Scale and information-processing thresholds in Holocene social evolution
2.3 Evolution of collective computational abilities of (pre)historic societies
3 Empirical Fluctuation, or Stochastic Law?
4 Opening the Discussion on Collective Computation: Historical Survey and Introduction to the Case Studies
4.1 Marcus Hamilton: Collective computation and the emergence of hunter-gatherer small-worlds
4.2 Laura Ellyson: Applying Gregory Johnson’s concepts of scalar stress to scale and Information Thresholds in Holocene social evolution
4.3 Johannes Müller et al.: Tripolye mega-sites: “Collective computational abilities” of prehistoric proto-urban societies?
4.4 Steven Wernke: Explosive expansion, sociotechnical diversity, and fragile sovereignty in the domain of the Inka
4.5 Gary Feinman and David Carballo: Communication, computation, and governance: A multiscalar vantage on the prehispanic Mesoamerican World
4.6 Ian Morris: Scale, information-processing, and complementarities in Old-World Axial-Age societies
5 Conclusion
6 Postscript: The Second Social Media Revolution
A Population-Based Ultra-Widefield Digital Image Grading Study for Age-Related Macular Degeneration-Like Lesions at the Peripheral Retina.
Our understanding of the relevance of peripheral retinal abnormalities to disease in general, and to age-related macular degeneration (AMD) in particular, is limited by the lack of detailed peripheral imaging studies. The purpose of this study was to develop image grading protocols suited to ultra-widefield imaging (UWFI) in an aged population.
Practical lossless compression with latent variables using bits back coding
Deep latent variable models have seen recent success in many data domains. Lossless compression is an application of these models which, despite having the potential to be highly useful, has yet to be implemented in a practical manner. We present 'Bits Back with ANS' (BB-ANS), a scheme to perform lossless compression with latent variable models at a near-optimal rate. We demonstrate this scheme by using it to compress the MNIST dataset with a variational auto-encoder (VAE), achieving compression rates superior to standard methods with only a simple VAE. Given that the scheme is highly amenable to parallelization, we conclude that, with a sufficiently high-quality generative model, this scheme could be used to achieve substantial improvements in compression rate with acceptable running time. We make our implementation available open source at https://github.com/bits-back/bits-back.
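To make the mechanism concrete, here is a hedged, self-contained toy sketch of bits-back coding on top of rANS, using Python big integers and a two-value latent; the model, the frequency precision M, and all function names are illustrative assumptions, not the authors' released implementation.

```python
# Toy BB-ANS: exact rANS on Python big ints plus a tiny discrete
# latent-variable model (illustrative, not the paper's VAE setup).
import math, random

M = 1 << 12  # total integer frequency mass per distribution

def quantize(probs):
    """Integer frequencies summing to M, plus their cumulative offsets."""
    f = [max(1, round(p * M)) for p in probs]
    f[f.index(max(f))] += M - sum(f)  # absorb rounding drift
    c = [0]
    for fi in f[:-1]:
        c.append(c[-1] + fi)
    return f, c

def encode(state, s, f, c):   # push symbol s onto the ANS state
    return (state // f[s]) * M + c[s] + state % f[s]

def decode(state, f, c):      # pop the most recently pushed symbol
    r = state % M
    s = next(i for i in range(len(f)) if c[i] <= r < c[i] + f[i])
    return s, f[s] * (state // M) + r - c[s]

# Tiny latent-variable model: z in {0, 1}, x in {0, ..., 3}.
p_z = [0.5, 0.5]
p_x_z = [[0.7, 0.1, 0.1, 0.1], [0.1, 0.1, 0.1, 0.7]]
def q_z_x(x):  # posterior over z given x (a VAE encoder in practice)
    num = [p_z[z] * p_x_z[z][x] for z in (0, 1)]
    return [v / sum(num) for v in num]

FZ, CZ = quantize(p_z)
FX = [quantize(row) for row in p_x_z]

def bbans_encode(state, x):
    z, state = decode(state, *quantize(q_z_x(x)))  # 'borrow' bits to pick z
    state = encode(state, x, *FX[z])               # code x given z
    return encode(state, z, FZ, CZ)                # code z under the prior

def bbans_decode(state):
    z, state = decode(state, FZ, CZ)
    x, state = decode(state, *FX[z])
    return x, encode(state, z, *quantize(q_z_x(x)))  # give the bits back

random.seed(1)
px = [sum(p_z[z] * p_x_z[z][x] for z in (0, 1)) for x in range(4)]
xs = random.choices(range(4), weights=px, k=1000)
state = 1 << 64  # stands in for previously compressed data
for x in xs:
    state = bbans_encode(state, x)
bits = state.bit_length() - 64
out = []
for _ in xs:
    x, state = bbans_decode(state)
    out.append(x)
assert out[::-1] == xs and state == 1 << 64  # exact round trip
H = -sum(p * math.log2(p) for p in px)
print(f"{bits / len(xs):.3f} bits/symbol vs. marginal entropy {H:.3f}")
```

Because q here is the exact posterior, the bits borrowed to pick z are fully repaid, so the net rate approaches the entropy of the marginal p(x), which is the near-optimality the abstract describes.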
Reducing the Computational Cost of Deep Generative Models with Binary Neural Networks
Deep generative models provide a powerful set of tools to understand real-world data. But as these models improve, they grow in size and complexity, and their computational cost in memory and execution time grows with them. Using binary weights in neural networks is one method which has shown promise in reducing this cost. However, whether binary neural networks can be used in generative models is an open problem. In this work we show, for the first time, that we can successfully train generative models which utilize binary neural networks. This massively reduces the computational cost of the models. We develop a new class of binary weight normalization, and provide insights for the architecture design of these binarized generative models. We demonstrate that two state-of-the-art deep generative models, the ResNet VAE and Flow++ models, can be binarized effectively using these techniques. We train binary models that achieve loss values close to those of the regular models but are 90%-94% smaller in size, and also allow significant speed-ups in execution time.
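As a rough illustration of the kind of layer involved, here is a hedged PyTorch sketch of a binarized linear layer with a straight-through estimator and a learned per-output scale; the class name and the scaling scheme are assumptions for illustration, not the paper's binary weight normalization.

```python
# Hypothetical binarized linear layer: +-1 weights in the forward pass,
# identity gradient in the backward pass (straight-through estimator).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinaryLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.05)
        self.scale = nn.Parameter(torch.ones(out_features, 1))  # learned gain
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        w = self.weight
        w_hard = torch.where(w >= 0, torch.ones_like(w), -torch.ones_like(w))
        # straight-through: binary values forward, gradients flow to w as-is
        w_bin = w_hard.detach() + w - w.detach()
        return F.linear(x, self.scale * w_bin, self.bias)

layer = BinaryLinear(784, 256)
y = layer(torch.randn(32, 784))  # only the sign of each weight is used
```

Storing one sign bit per weight plus a float scale per output row is what drives a roughly 32x reduction in weight storage, consistent with the 90%-94% size figures reported above.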
Consistent particle-based algorithm with a non-ideal equation of state
A thermodynamically consistent particle-based model for fluid dynamics with continuous velocities and a non-ideal equation of state is presented. Excluded volume interactions are modeled by means of biased stochastic multiparticle collisions which depend on the local velocities and densities. Momentum and energy are exactly conserved locally. The equation of state is derived and compared to independent measurements of the pressure. Results for the kinematic shear viscosity and self-diffusion constants are presented. A caging and order/disorder transition is observed at high densities and large collision frequency.
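For orientation, below is a hedged NumPy sketch of the standard multiparticle-collision (SRD-type) step on which such models build: velocities are rotated about a random axis relative to each cell's mean velocity, which conserves momentum and kinetic energy per cell exactly. The paper's *biased* collision rule, which produces the non-ideal equation of state, is more involved; this sketch shows only the base step and its conservation property.

```python
# Hedged sketch of a standard SRD collision step (not the paper's
# biased rule): rotate relative velocities cell by cell.
from collections import defaultdict
import numpy as np

def srd_collision_step(pos, vel, cell_size, alpha, rng):
    """One SRD collision step for equal-mass particles in cubic cells."""
    cells = defaultdict(list)
    for i, p in enumerate(pos):
        cells[tuple(np.floor(p / cell_size).astype(int))].append(i)
    c, s = np.cos(alpha), np.sin(alpha)
    for idx in cells.values():
        if len(idx) < 2:
            continue
        u = vel[idx].mean(axis=0)   # cell mean velocity
        dv = vel[idx] - u           # velocities relative to the mean
        k = rng.normal(size=3)
        k /= np.linalg.norm(k)      # random rotation axis
        # Rodrigues rotation of each dv by angle alpha about k
        dv_rot = c * dv + s * np.cross(k, dv) + (1 - c) * np.outer(dv @ k, k)
        vel[idx] = u + dv_rot       # cell momentum and energy unchanged
    return vel

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10, size=(1000, 3))
vel = rng.normal(size=(1000, 3))
p0, e0 = vel.sum(axis=0), (vel**2).sum()
vel = srd_collision_step(pos, vel, 1.0, np.pi / 2, rng)
assert np.allclose(vel.sum(axis=0), p0) and np.isclose((vel**2).sum(), e0)
```

The rotation preserves the magnitude of each relative velocity and maps their zero sum to zero, which is why momentum and kinetic energy are conserved exactly per cell, the local conservation the abstract emphasizes.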
Velocity Slip and Temperature Jump in Hypersonic Aerothermodynamics
Peer reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/76543/1/AIAA-2007-208-226.pd
Microcanonical entropy inflection points: Key to systematic understanding of transitions in finite systems
We introduce a systematic classification method for the analogs of phase
transitions in finite systems. This completely general analysis, which is
applicable to any physical system and extends towards the thermodynamic limit,
is based on the microcanonical entropy and its energetic derivative, the
inverse caloric temperature. Inflection points of this quantity signal
cooperative activity and thus serve as distinct indicators of transitions. We
demonstrate the power of this method through application to the long-standing
problem of liquid-solid transitions in elastic, flexible homopolymers.Comment: 4 pages, 3 figure
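In practice the analysis reduces to a short numerical routine. The sketch below (illustrative only; the synthetic entropy curve and function name are assumptions, not the authors' code) estimates beta(E) = dS/dE on an energy grid and flags the inflection points of beta, where its second derivative changes sign.

```python
# Hedged sketch: locate inflection points of the inverse caloric
# temperature beta(E) = dS/dE from a gridded microcanonical entropy S(E).
import numpy as np

def caloric_inflection_points(E, S):
    beta = np.gradient(S, E)  # inverse microcanonical temperature
    d2beta = np.gradient(np.gradient(beta, E), E)
    # inflection points: sign changes of beta''(E)
    idx = np.where(np.sign(d2beta[:-1]) != np.sign(d2beta[1:]))[0]
    return E[idx], beta[idx]

# synthetic entropy with a bump in the caloric curve (illustrative only)
E = np.linspace(-2.0, 2.0, 4001)
S = 2.0 * np.sqrt(E + 2.5) + 0.2 * np.tanh(4.0 * E)
E_t, beta_t = caloric_inflection_points(E, S)
print(E_t, beta_t)
```

Roughly speaking, the slope of beta at such an inflection point is what the classification uses: a backbending (positive-slope) inflection signals a first-order-like transition, while a least-sensitive inflection without backbending signals a higher-order analog.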