Bayesian comparison of latent variable models: Conditional vs marginal likelihoods
Typical Bayesian methods for models with latent variables (or random effects)
involve directly sampling the latent variables along with the model parameters.
In high-level software code for model definitions (using, e.g., BUGS, JAGS,
Stan), the likelihood is therefore specified as conditional on the latent
variables. This can lead researchers to perform model comparisons via
conditional likelihoods, where the latent variables are considered model
parameters. In other settings, however, typical model comparisons involve
marginal likelihoods where the latent variables are integrated out. This
distinction is often overlooked despite the fact that it can have a large
impact on the comparisons of interest. In this paper, we clarify and illustrate
these issues, focusing on the comparison of conditional and marginal Deviance
Information Criteria (DICs) and Watanabe-Akaike Information Criteria (WAICs) in
psychometric modeling. The conditional/marginal distinction corresponds to
whether the model should be predictive for the clusters that are in the data or
for new clusters (where "clusters" typically correspond to higher-level units
like people or schools). Correspondingly, we show that marginal WAIC
corresponds to leave-one-cluster out (LOcO) cross-validation, whereas
conditional WAIC corresponds to leave-one-unit out (LOuO). These results lead
to recommendations on the general application of the criteria to models with
latent variables. Comment: Manuscript in press at Psychometrika; 31 pages, 8 figures.
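As an illustration of the quantity being compared (a sketch, not code from the paper), WAIC can be computed from an (S, N) matrix of pointwise log-likelihoods over S posterior draws, where N indexes individual observations for the conditional criterion and whole clusters for the marginal criterion:

```python
import numpy as np

def waic(log_lik):
    """WAIC from an (S, N) matrix of pointwise log-likelihoods:
    S posterior draws by N prediction units (observations for the
    conditional version, clusters for the marginal version)."""
    S = log_lik.shape[0]
    # log pointwise predictive density: log of the posterior-mean likelihood,
    # computed stably with a log-sum-exp over draws
    lppd = np.sum(np.logaddexp.reduce(log_lik, axis=0) - np.log(S))
    # effective number of parameters: per-unit variance of the log-likelihood
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)
```

Supplying conditional versus marginal pointwise log-likelihoods to the same formula yields the two criteria whose divergence the paper studies.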
Tests for Establishing Security Properties
Ensuring strong security properties in some cases requires participants to carry out tests during the execution of a protocol. A classical example is electronic voting: participants are required to verify the presence of their ballots on a bulletin board, and to verify the computation of the election outcome. The notion of certificate transparency is another example, in which participants in the protocol are required to perform tests to verify the integrity of a certificate log.
We present a framework for modelling systems with such "testable properties", using the applied pi calculus. We model the tests that are made by participants in order to obtain the security properties. Underlying our work is an attacker model called "malicious but cautious", which lies in between the Dolev-Yao model and the "honest but curious" model. The malicious-but-cautious model is appropriate for cloud computing providers that are potentially malicious but are assumed to be cautious about launching attacks that might cause user tests to fail.
PDFS: Practical Data Feed Service for Smart Contracts
Smart contracts are a new paradigm that emerged with the rise of blockchain
technology. They allow mutually distrusting parties to enter into agreements.
These agreements are encoded as program code and deployed on a blockchain
platform, where all participants execute them and maintain their state. Smart
contracts are promising because they are automated and decentralized, limiting
the involvement of trusted third parties, and can include monetary transfers.
Due to these features, many people believe that smart contracts will
revolutionize the way we think of distributed applications, information
sharing, financial services, and infrastructures.
To unlock the potential of smart contracts, it is necessary to connect them
with the outside world so that they can understand and use information from
other infrastructures. For instance, smart contracts would benefit greatly
from access to web content. However, realizing such a system poses many
challenges, and despite the existence of many proposals, no existing solution
is simultaneously secure, provides easily-parsable data, introduces small
overheads, and is easy to deploy.
In this paper we propose PDFS, a practical system for data feeds that
combines the advantages of previous schemes and introduces new
functionalities. PDFS extends content providers with new features for data
transparency and consistency validations. This combination provides multiple
benefits, such as easily-parsable content and efficient authenticity
verification, without breaking natural trust chains. PDFS keeps content
providers auditable, mitigates their malicious activities (like data
modification or censorship), and allows them to create a new business model.
We show how PDFS is integrated with existing web services, report on a PDFS
implementation, and present results from case studies and experiments. Comment: Blockchain; Smart Contracts; Data Authentication; Ethereum.
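To make the authenticity-verification idea concrete, here is a toy Python sketch. It is not the PDFS protocol: it uses a shared-key HMAC as a stand-in where PDFS-style systems rely on public-key signatures and on-chain state, but it shows the shape of a provider tagging easily-parsable data and a consumer checking the tag before trusting the content:

```python
import hmac
import hashlib
import json

SHARED_KEY = b"demo-key"  # stand-in secret; real systems use public-key signatures

def publish(payload: dict) -> dict:
    """Content provider: serialize data in an easily-parsable form
    and attach an authenticity tag."""
    body = json.dumps(payload, sort_keys=True)
    tag = hmac.new(SHARED_KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "tag": tag}

def verify(feed: dict) -> bool:
    """Consumer (e.g. contract-side verification logic): recompute the
    tag and compare in constant time before using the data."""
    expected = hmac.new(SHARED_KEY, feed["body"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, feed["tag"])
```

Any modification of the published body (the censorship/tampering case the abstract mentions) makes verification fail.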
Using Sunflower Plots and Classification Trees to Study Typeface Legibility
This article describes the application of sunflower plots and classification trees to the study of onscreen typeface legibility. The two methods are useful for describing high-dimensional data in an intuitive manner, which is crucial for interacting with both the typographers who design the typefaces and the practitioners who must make decisions about which typeface to use for specific applications. Furthermore, classification trees help us make specific recommendations for how much of a character attribute is “enough” to make it legible. We present examples of sunflower plots and classification trees using data from a recent typeface legibility experiment, and we present R code for replicating our analyses. Some familiarity with classification trees and logistic regression will be helpful to the reader.
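The kind of threshold recommendation a classification tree yields ("how much of an attribute is enough") can be illustrated with a one-split tree (a decision stump). This is a pure-Python sketch on hypothetical attribute values, not the article's R analysis:

```python
def best_split(values, labels):
    """Find the attribute threshold that best separates legible (1)
    from illegible (0) characters, i.e. a one-split classification
    tree fit by minimizing weighted Gini impurity."""
    def gini(ys):
        if not ys:
            return 0.0
        p = sum(ys) / len(ys)
        return 2 * p * (1 - p)  # impurity of a binary node

    best_t, best_score = None, float("inf")
    for t in sorted(set(values)):
        left = [y for v, y in zip(values, labels) if v <= t]
        right = [y for v, y in zip(values, labels) if v > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(values)
        if score < best_score:
            best_score, best_t = score, t
    return best_t
```

The returned threshold is exactly the kind of "enough of this attribute" cut point the article reads off its fitted trees.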
Combustion Processes in Hybrid Rocket Engines
In recent years, there has been a resurgence of interest in the development of hybrid rocket engines for advanced launch vehicle applications. Hybrid propulsion systems use a solid fuel such as hydroxyl-terminated polybutadiene (HTPB) along with a gaseous/liquid oxidizer. The performance of hybrid combustors depends on the convective and radiative heat fluxes to the fuel surface, the rate of pyrolysis in the solid phase, and the turbulent combustion processes in the gaseous phase. Together, these processes determine the regression rate of the fuel surface and thereby the utilization efficiency of the fuel. In this paper, we employ computational fluid dynamics (CFD) techniques to gain a quantitative understanding of the physical trends in hybrid rocket combustors. The computational modeling is tailored to ongoing experiments at Penn State that employ a two-dimensional slab burner configuration. The coordinated computational/experimental effort enables model validation while providing an understanding of the experimental observations. Computations to date have included the full-length geometry, with and without the aft nozzle section, as well as shorter domains for extensive parametric characterization. HTPB is used as the fuel, with 1,3-butadiene taken as the gaseous product of pyrolysis. Pure gaseous oxygen is taken as the oxidizer. The fuel regression rate is specified using an Arrhenius rate expression, in which the fuel surface temperature is given by an energy balance involving gas-phase convection and radiation as well as thermal conduction in the solid phase. For the gas-phase combustion, a two-step global reaction is used. The standard kappa-epsilon model is used for turbulence closure. Radiation is presently treated using a simple diffusion approximation, which is valid for large optical path lengths, representative of radiation from soot particles.
Computational results are obtained to determine the trends in the fuel burning or regression rates as a function of the head-end oxidizer mass flux, G = rho(e)U(e), and the chamber pressure. Furthermore, computations of the full slab burner configuration have also been obtained for various stages of the burn. Comparisons with available experimental data from small-scale tests conducted by General Dynamics-Thiokol-Rocketdyne suggest reasonable agreement in the predicted regression rates. Future work will include: (1) a model for soot generation in the flame for more quantitative radiative transfer modelling, (2) a parametric study of combustion efficiency, and (3) transient calculations to help determine the possible mechanisms responsible for combustion instability in hybrid rocket motors.
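The two quantities the parametric study varies can be sketched in a few lines of Python. The Arrhenius form matches the abstract, but the rate constants here are illustrative placeholders, not measured HTPB values:

```python
import math

def regression_rate(T_s, A=3000.0, E_a=2.0e4, R_u=8.314):
    """Arrhenius-type surface regression rate r = A * exp(-E_a / (R_u * T_s)).
    A (pre-exponential factor) and E_a (activation energy, J/mol) are
    illustrative constants; T_s is the fuel surface temperature in K."""
    return A * math.exp(-E_a / (R_u * T_s))

def oxidizer_mass_flux(rho_e, U_e):
    """Head-end oxidizer mass flux G = rho_e * U_e from the abstract
    (density times velocity at the head end)."""
    return rho_e * U_e
```

The exponential dependence makes the regression rate rise steeply with surface temperature, which is why the coupled energy balance at the surface matters.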
Validation of two-equation turbulence models for propulsion flowfields
The objective of the study is to assess the capability of two-equation turbulence models for simulating propulsion-related flowfields. The standard kappa-epsilon model with Chien's low-Reynolds-number formulation for near-wall effects is used as the baseline turbulence model. Several experimental test cases, representative of rocket combustor internal flowfields, are used to catalog the performance of the baseline model. Specific flowfields considered here include recirculating flow behind a backstep, mixing between coaxial jets, and planar shear layers. Since turbulence solutions are notoriously dependent on grid and numerical methodology, the effects of grid refinement and artificial dissipation on numerical accuracy are studied. In the latter instance, computational results obtained with several central-differenced and upwind-based formulations are compared. Based on these results, improved turbulence models, such as enhanced kappa-epsilon models as well as other two-equation formulations (e.g., kappa-omega), are being studied. In addition, validation for swirling and reacting flowfields is also currently underway.
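As context for the baseline model: the standard kappa-epsilon closure computes an eddy viscosity from the two transported quantities, mu_t = rho * C_mu * k^2 / epsilon, with the standard model constant C_mu = 0.09. A minimal Python sketch (illustrative, not the authors' solver):

```python
def eddy_viscosity(rho, k, eps, C_mu=0.09):
    """Standard kappa-epsilon eddy viscosity mu_t = rho * C_mu * k**2 / eps,
    where k is turbulent kinetic energy, eps its dissipation rate, and
    C_mu = 0.09 is the standard model constant."""
    return rho * C_mu * k * k / eps
```

The quadratic dependence on k and inverse dependence on epsilon is what makes the solutions sensitive to how the two transport equations are resolved near walls, motivating Chien's low-Reynolds-number correction.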