Experimental evaluation into novel, low cost, modular PEMFC stack
Attribution-NonCommercial-NoDerivs 3.0 Unported (CC BY-NC-ND 3.0)
The Polymer Electrolyte Membrane Fuel Cell (PEMFC), despite being regarded as an ideal replacement for the internal combustion engine, is still not an economically attractive prime mover due to a number of key challenges that have yet to be fully resolved, including degradation of cell components resulting in inadequate lifetimes, specialised and costly manufacturing processes, and poor gravimetric/volumetric energy densities. This paper presents a novel stack concept which removes the conventional bipolar plate (BPP), a component responsible for up to 80% of total stack weight and over 90% of stack volume in some designs. Removing this component not only improves the volumetric and gravimetric energy density of the PEMFC stack but also drastically reduces its cost by eliminating the costly manufacturing processes associated with BPP machining, while the functionality of the traditional BPP is retained by the unique stack design. The stack architecture is first presented, and the characterisation of the PEMFC is then shown over a wide range of operating scenarios. The experimental studies suggest that the performance of the new design is comparable to that of traditional stacks but at a significantly lower cost.
Final Published version
On the commuting probability and supersolvability of finite groups
For a finite group G, let Pr(G) denote the probability that a randomly
chosen pair of elements of G commute. We prove that if Pr(G) > 1/n for some
positive integer n and G splits over an abelian normal nontrivial subgroup N,
then G has a nontrivial conjugacy class inside N of size at most n-1. We
also extend two results of Barry, MacHale, and Ní Shé on the commuting
probability in connection with supersolvability of finite groups. In
particular, we prove that if Pr(G) > 5/16 then either G is supersolvable, or
G is isoclinic to A_4, or G/Z(G) is isoclinic to A_4.
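The commuting probability in the abstract above can be checked directly for a small group by counting commuting pairs. A minimal sketch (the choice of S_3 and the helper names are illustrative, not from the paper):

```python
from itertools import permutations

def compose(p, q):
    """Compose two permutations given as tuples: (p o q)(i) = p[q[i]]."""
    return tuple(p[q[i]] for i in range(len(q)))

def commuting_probability(group):
    """Fraction of ordered pairs (a, b) in group x group with ab = ba."""
    total = len(group) ** 2
    commuting = sum(
        1
        for a in group
        for b in group
        if compose(a, b) == compose(b, a)
    )
    return commuting / total

# S_3 as all permutations of {0, 1, 2}; it has 3 conjugacy classes,
# so Pr(S_3) = k(G)/|G| = 3/6 = 0.5, matching the brute-force count.
S3 = list(permutations(range(3)))
print(commuting_probability(S3))  # 0.5
```

The identity Pr(G) = k(G)/|G|, where k(G) is the number of conjugacy classes, follows because the commuting pairs (a, b) with a fixed are exactly the centralizer of a, and summing centralizer sizes over the group counts |G| per class.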
Hydrostatic and uniaxial pressure dependence of superconducting transition temperature of KFe2As2 single crystals
We present heat capacity, c-axis thermal expansion, and pressure-dependent,
low-field, temperature-dependent magnetization data (for pressures up to ~12
kbar) for KFe2As2 single crystals. Tc decreases under pressure with dTc/dP ~
-0.10 K/kbar. The inferred uniaxial, c-axis, pressure derivative is positive,
dTc/dpc ~ 0.11 K/kbar. The data are analyzed in comparison with those for
overdoped Fe-based superconductors. Arguments are presented that
superconductivity in KFe2As2 may be different from that of the other overdoped,
Fe-based materials in the 122 family.
From Entropic Dynamics to Quantum Theory
Non-relativistic quantum theory is derived from information codified into an
appropriate statistical model. The basic assumption is that there is an
irreducible uncertainty in the location of particles: positions constitute a
configuration space and the corresponding probability distributions constitute
a statistical manifold. The dynamics follows from a principle of inference, the
method of Maximum Entropy. The concept of time is introduced as a convenient
way to keep track of change. A welcome feature is that the entropic dynamics
notion of time incorporates a natural distinction between past and future. The
statistical manifold is assumed to be a dynamical entity: its curved and
evolving geometry determines the evolution of the particles which, in their
turn, react back and determine the evolution of the geometry. Imposing that the
dynamics conserve energy leads to the Schroedinger equation and to a natural
explanation of its linearity, its unitarity, and of the role of complex
numbers. The phase of the wave function is explained as a feature of purely
statistical origin. There is a quantum analogue to the gravitational
equivalence principle.
Comment: Extended and corrected version of a paper presented at MaxEnt 2009,
the 29th International Workshop on Bayesian Inference and Maximum Entropy
Methods in Science and Engineering (July 5-10, 2009, Oxford, Mississippi,
USA). In version v3 I corrected a mistake and considerably simplified the
argument. The overall conclusions remain unchanged.
The Role of Donated Labour and Not for Profit at the Public/Private Interface
The aim of this paper is to assess the role of donated labour and not-for-profit (NFP) entities at the public/private interface. After discussing what a NFP enterprise is and providing general background, we look at the underlying theory of NFP institutions. The fact that NFP companies are able to precommit themselves not to expropriate donated labour is identified as a primary justification of the NFP model; we emphasise the role that purchasers play in the expropriation problem and suggest that this is a particular concern for institutions at the public/private interface. After summarising the empirical literature, we provide a brief case study of Glas Cymru and show that it is likely to fall foul of the purchaser problems, in that its structure makes it hard to avoid expropriation of donated labour. Although the empirical evidence is limited, investigation of what is available suggests that the shift from FP to NFP has had no significant effect on the company. Finally, we address the issue of Foundation Hospitals and suggest that there is more, albeit limited, reason to expect that NFP status will prove beneficial for donating labour.
not-for-profit, public/private interface
Multi-galileons, solitons and Derrick's theorem
The field theory Galilean symmetry, which was introduced in the context of
modified gravity, gives a neat way to construct Lorentz-covariant theories of a
scalar field, such that the equations of motion contain at most second-order
derivatives. Here we extend the analysis to an arbitrary number of scalars, and
examine the restrictions imposed by an internal symmetry, focussing in
particular on SU(N) and SO(N). This therefore extends the possible gradient
terms that may be used to stabilise topological objects such as sigma model
lumps.
Comment: 7 pages, 1 figure. Minor change to order of references.
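Derrick's theorem, invoked in the title above, is the standard scaling obstruction that the extra gradient terms are meant to evade. A textbook sketch of the argument (not the paper's multi-galileon generalisation): for a static scalar configuration in D spatial dimensions with energy

```latex
E[\phi] \;=\; \int d^{D}x \left[ \tfrac{1}{2}\,(\nabla\phi)^{2} + V(\phi) \right]
\;=\; E_{K} + E_{V}, \qquad E_{K},\, E_{V} \geq 0,
```

the rescaled trial configuration $\phi_{\lambda}(\mathbf{x}) = \phi(\lambda \mathbf{x})$ has energy

```latex
E[\phi_{\lambda}] \;=\; \lambda^{2-D}\,E_{K} + \lambda^{-D}\,E_{V},
\qquad
\left.\frac{dE[\phi_{\lambda}]}{d\lambda}\right|_{\lambda=1}
\;=\; (2-D)\,E_{K} - D\,E_{V} \;=\; 0 .
```

For $D \geq 3$ both terms on the right are non-positive, so stationarity forces $E_{K} = E_{V} = 0$: no nontrivial static solitons. For $D = 2$ it forces $E_{V} = 0$, leaving the scale-invariant sigma-model lumps mentioned in the abstract, which is why additional, differently scaling gradient terms can stabilise them.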
Jaynes' MaxEnt, Steady State Flow Systems and the Maximum Entropy Production Principle
Jaynes' maximum entropy (MaxEnt) principle was recently used to give a
conditional, local derivation of the "maximum entropy production" (MEP)
principle, which states that a flow system with fixed flow(s) or gradient(s)
will converge to a steady state of maximum production of thermodynamic entropy
(R.K. Niven, Phys. Rev. E, in press). The analysis provides a steady state
analog of the MaxEnt formulation of equilibrium thermodynamics, applicable to
many complex flow systems at steady state. The present study examines the
classification of physical systems, with emphasis on the choice of constraints
in MaxEnt. The discussion clarifies the distinction between equilibrium, fluid
flow, source/sink, flow/reactive and other systems, leading into an appraisal
of the application of MaxEnt to steady state flow and reactive systems.
Comment: 6 pages; paper for MaxEnt 2009.
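The MaxEnt prescription discussed above can be illustrated with Jaynes' classic dice example (an illustrative sketch, not taken from the paper): maximising Shannon entropy over probabilities p_1..p_6 subject to a prescribed mean yields p_i proportional to exp(lam * i), with the Lagrange multiplier lam fixed by the constraint. Here lam is found by bisection:

```python
import math

def maxent_die(target_mean, faces=6, tol=1e-10):
    """MaxEnt distribution p_i ~ exp(lam * i) on faces 1..faces,
    with lam chosen by bisection so the mean matches target_mean."""
    def mean_for(lam):
        w = [math.exp(lam * i) for i in range(1, faces + 1)]
        z = sum(w)
        return sum(i * wi for i, wi in zip(range(1, faces + 1), w)) / z

    lo, hi = -10.0, 10.0  # mean_for(lam) is increasing in lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in range(1, faces + 1)]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' Brandeis dice problem: observed mean 4.5 instead of the fair 3.5.
p = maxent_die(4.5)
print([round(pi, 3) for pi in p])  # probabilities rise toward the high faces
```

A mean above the fair value 3.5 gives lam > 0, so the distribution tilts exponentially toward the larger faces; the uniform distribution is recovered when the constraint is the fair mean.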
Computational methods for Bayesian model choice
In this note, we shortly survey some recent approaches on the approximation
of the Bayes factor used in Bayesian hypothesis testing and in Bayesian model
choice. In particular, we reassess importance sampling, harmonic mean sampling,
and nested sampling from a unified perspective.
Comment: 12 pages, 4 figures, submitted to the proceedings of MaxEnt 2009,
July 05-10, 2009, to be published by the American Institute of Physics.
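For a conjugate model the marginal likelihood that these samplers approximate is available in closed form, which makes a convenient sanity check. A minimal sketch (the model and sample size are illustrative, not from the survey): with data y ~ N(theta, 1) and prior theta ~ N(0, 1), the evidence is N(y; 0, 2), and the naive Monte Carlo estimator averages the likelihood over prior draws, i.e. importance sampling with the prior as proposal:

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def evidence_mc(y, n_samples=200_000, seed=0):
    """Naive Monte Carlo evidence: Z ~ (1/N) sum_i p(y | theta_i),
    with theta_i drawn from the prior N(0, 1)."""
    rng = random.Random(seed)
    total = sum(normal_pdf(y, rng.gauss(0.0, 1.0), 1.0)
                for _ in range(n_samples))
    return total / n_samples

y = 1.0
z_exact = normal_pdf(y, 0.0, 2.0)  # marginal: y ~ N(0, 1 + 1)
z_mc = evidence_mc(y)
print(z_exact, z_mc)  # agree to within Monte Carlo error
```

The same check applies to the other estimators surveyed: the harmonic mean estimator instead averages reciprocal likelihoods over posterior draws, which is why its variance can be unbounded.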
Entropic Priors and Bayesian Model Selection
We demonstrate that the principle of maximum relative entropy (ME), used
judiciously, can ease the specification of priors in model selection problems.
The resulting effect is that models that make sharp predictions are
disfavoured, weakening the usual Bayesian "Occam's Razor". This is illustrated
with a simple example involving what Jaynes called a "sure thing" hypothesis.
Jaynes' resolution of the situation involved introducing a large number of
alternative "sure thing" hypotheses that were possible before we observed the
data. However, in more complex situations, it may not be possible to explicitly
enumerate large numbers of alternatives. The entropic priors formalism produces
the desired result without modifying the hypothesis space or requiring explicit
enumeration of alternatives; all that is required is a good model for the prior
predictive distribution for the data. This idea is illustrated with a simple
rigged-lottery example, and we outline how this idea may help to resolve a
recent debate amongst cosmologists: is dark energy a cosmological constant, or
has it evolved with time in some way? And how shall we decide, when the data
are in?
Comment: Presented at MaxEnt 2009, the 29th International Workshop on Bayesian
Inference and Maximum Entropy Methods in Science and Engineering (July 5-10,
2009, Oxford, Mississippi, USA).
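The "usual Bayesian Occam's Razor" that the abstract says entropic priors weaken can be seen in a two-line conjugate calculation (an illustrative model, not the paper's entropic-prior machinery): when the data agree with a sharply predictive prior, the Bayes factor rewards the sharp model over a diffuse one.

```python
import math

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Conjugate model: y ~ N(theta, 1) with prior theta ~ N(0, tau2).
# The evidence is then available in closed form: y ~ N(0, 1 + tau2).
def evidence(y, tau2):
    return normal_pdf(y, 0.0, 1.0 + tau2)

y = 0.5
z_sharp = evidence(y, 0.01)   # sharply predictive prior
z_vague = evidence(y, 100.0)  # diffuse prior spreads probability thinly
print(z_sharp / z_vague)      # Bayes factor well above 1: sharp model favoured
```

The diffuse prior spreads its prior predictive mass over a wide range of y, so any particular observation near zero receives little of it; that automatic penalty is the Occam effect the entropic priors formalism counteracts.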