On polymorphic uncertainty modeling in shell buckling
Buckling is typically the governing failure mode of thin-walled shells. Geometric and material imperfections in particular have a major influence on the buckling behavior: small variations in the imperfections have large effects on the load-bearing behavior. However, the design of shells is still characterized by a deterministic way of thinking, in which uncertainties are not yet sufficiently considered. Even probabilistic approaches often rest on false assumptions because of the small amount of experimental data. The focus of this paper is an appropriate uncertainty quantification based on the available data. To this end, the concept of polymorphic uncertainty modeling is presented for axially loaded shells with different types of imperfections. Finally, an idea for a novel design concept for shells based on a fuzzy-valued safety level is introduced. The paper is intended to initiate a rethinking of the methodology for the numerical design of shells with an appropriate uncertainty quantification.
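A core ingredient of polymorphic uncertainty modeling is propagating fuzzy-valued inputs through a mechanical response function via alpha-cuts. The following is a minimal sketch of that technique only, with a made-up knockdown relation and made-up numbers; it is not the paper's actual shell model.

```python
# Hypothetical sketch: propagate a fuzzy (triangular) imperfection amplitude
# through an illustrative buckling "knockdown factor" using alpha-cuts.
# The relation knockdown(delta) below is invented for illustration.

def alpha_cut(lo, peak, hi, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (lo + alpha * (peak - lo), hi - alpha * (hi - peak))

def knockdown(delta):
    """Illustrative monotone decrease of load capacity with imperfection delta."""
    return 1.0 / (1.0 + 2.0 * delta)

def fuzzy_knockdown(lo, peak, hi, alphas):
    """Propagate each alpha-cut through the (monotone) response function."""
    cuts = {}
    for a in alphas:
        d_lo, d_hi = alpha_cut(lo, peak, hi, a)
        # knockdown is decreasing in delta, so the interval endpoints swap
        cuts[a] = (knockdown(d_hi), knockdown(d_lo))
    return cuts

# Fuzzy imperfection amplitude: most plausible value 0.1, support [0.0, 0.3]
cuts = fuzzy_knockdown(0.0, 0.1, 0.3, [0.0, 0.5, 1.0])
```

The result at membership level 1.0 collapses to a single value, while lower alpha levels yield wider intervals of the knockdown factor, which is the shape of a fuzzy-valued safety measure.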
Multi-cultural visualization: how functional programming can enrich visualization (and vice versa)
The past two decades have seen visualization flourish as a research field in its own right, with advances on the computational challenges of faster algorithms, new techniques for datasets too large for in-core processing, and advances in understanding the perceptual and cognitive processes recruited by visualization systems, and through this, how to improve the representation of data. However, progress within visualization has sometimes proceeded in parallel with that in other branches of computer science, and there is a danger that when novel solutions ossify into 'accepted practice' the field can easily overlook significant advances elsewhere in the community. In this paper we describe recent advances in the design and implementation of pure functional programming languages that, significantly, contain important insights into questions raised by the recent NIH/NSF report on Visualization Challenges. We argue and demonstrate that modern functional languages combine high-level mathematically-based specifications of visualization techniques, concise implementation of algorithms through fine-grained composition, support for writing correct programs through strong type checking, and a different kind of modularity inherent in the abstractive power of these languages. And to cap it off, we have initial evidence that in some cases functional implementations are faster than their imperative counterparts.
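The fine-grained compositional style the abstract describes can be roughly illustrated even outside a pure functional language. A minimal Python sketch (all stage names are hypothetical) builds a toy visualization pipeline as a composition of small pure functions:

```python
# Sketch of a visualization pipeline as a composition of small pure stages,
# illustrating the compositional style the abstract advocates. The stages
# (normalize, threshold, to_bars) are invented for illustration.
from functools import reduce

def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

normalize = lambda xs: [x / max(xs) for x in xs]          # map data to [0, 1]
threshold = lambda xs: [x for x in xs if x > 0.25]        # drop small values
to_bars   = lambda xs: ["#" * round(x * 10) for x in xs]  # crude glyph mapping

render = compose(to_bars, threshold, normalize)
bars = render([1, 2, 4, 8])
```

Each stage is independently testable and reusable, and the pipeline is specified by wiring stages together rather than by mutating shared state; strong static typing, which the abstract also credits, is of course only available in languages like Haskell.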
Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?
© 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund. Assessing the detrimental health effects of chemicals requires the extrapolation of experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that the multiplication of those components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection from mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components are intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often ignored that the conservatism (or otherwise) of uncertainty factors can only be considered in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses that are derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors is dependent on the choice of probability distributions.
Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios which can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data and recommend adopting a pragmatic, but scientifically better founded, approach to mixture risk assessment.
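The abstract's point that probabilistic multiplication of sub-factors depends on the chosen distributions can be made concrete with a small Monte Carlo sketch. All parameters below are illustrative, not regulatory values: the deterministic default multiplies point values (10 x 10 = 100), while the upper percentile of a product of two lognormal sub-factors changes sharply with their assumed spread.

```python
# Sketch: the 95th percentile of the product of two lognormal sub-factors
# (toxicokinetic x toxicodynamic), each with median 10, depends strongly on
# the assumed log-scale spread sigma. Illustrative parameters only.
import math
import random

random.seed(0)

def product_p95(sigma, n=100_000):
    """95th percentile of TK * TD when each sub-factor is lognormal
    with median 10 and log-scale standard deviation sigma."""
    draws = sorted(
        math.exp(math.log(10) + random.gauss(0, sigma))
        * math.exp(math.log(10) + random.gauss(0, sigma))
        for _ in range(n)
    )
    return draws[int(0.95 * n)]

narrow = product_p95(0.2)  # tightly concentrated sub-factor distributions
wide = product_p95(0.8)    # dispersed sub-factor distributions
```

With narrow sub-factor distributions the 95th percentile of the product sits close to the deterministic 100, while dispersed distributions push it several-fold higher, so conclusions about whether 100 is "conservative" hinge on the distributional assumptions.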
A Theory of Tolerance
We develop an economic theory of tolerance in which lifestyles and traits are invested with symbolic value by people. Value systems are endogenous and taught by parents to their children. In conjunction with actual behavior, value systems determine the esteem enjoyed by individuals. Intolerant individuals attach all symbolic value to a small number of attributes and are disrespectful of people with different ones. Tolerant people have diversified values and respect social alterity. We study the formation of values attached to both endogenous and exogenous attributes, and identify circumstances under which tolerance spontaneously arises. Policy may affect the evolution of tolerance in distinctive ways, and there may be efficiency as well as equity reasons to promote tolerance.
Keywords: value systems, tolerance, modernity
Measuring microsatellite conservation in mammalian evolution with a phylogenetic birth-death model.
Microsatellites make up ~3% of the human genome, and there is increasing evidence that some microsatellites can have important functions and can be conserved by selection. To investigate this conservation, we performed a genome-wide analysis of human microsatellites and measured their conservation using a binary character birth-death model on a mammalian phylogeny. Using a maximum likelihood method to estimate birth and death rates for different types of microsatellites, we show that the rates at which microsatellites are gained and lost in mammals depend on their sequence composition, length, and position in the genome. Additionally, we use a mixture model to account for unequal death rates among microsatellites across the human genome. We use this model to assign a probability-based conservation score to each microsatellite. We found that microsatellites near the transcription start sites of genes are often highly conserved, and that distance from a microsatellite to the nearest transcription start site is a good predictor of the microsatellite conservation score. An analysis of gene ontology terms for genes that contain microsatellites near their transcription start site reveals that regulatory genes involved in growth and development are highly enriched with conserved microsatellites.
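The quantity at the heart of a binary-character death model is the probability that a locus present at an ancestor survives along a branch of length t, exp(-mu * t), where mu is the loss ("death") rate. The following is a toy stand-in for the paper's maximum-likelihood comparison, with made-up branch lengths and a simplified independent-branch tree, to show how presence/absence data favors one death rate over another.

```python
# Toy sketch of the survival probability and log-likelihood used when
# fitting a death rate mu to presence/absence data on a phylogeny.
# Branch lengths and observations below are invented for illustration;
# real inference sums over internal states of the full tree.
import math

def survival_prob(mu, t):
    """P(character still present after branch length t) under loss rate mu."""
    return math.exp(-mu * t)

def log_likelihood(mu, branches, present):
    """Independent-branch log-likelihood of presence (1) / absence (0)
    observations at the tips, assuming presence at the root."""
    ll = 0.0
    for t, x in zip(branches, present):
        p = survival_prob(mu, t)
        ll += math.log(p) if x == 1 else math.log(1.0 - p)
    return ll

branches = [0.1, 0.1, 0.4, 0.4]  # branch lengths to the tips
present = [1, 1, 1, 0]           # tip observations: mostly retained

# A mostly-retained (conserved) microsatellite favors a lower death rate:
ll_low = log_likelihood(0.5, branches, present)
ll_high = log_likelihood(5.0, branches, present)
```

Scanning mu for the maximum of this likelihood is the one-parameter analogue of the paper's estimation, and the relative support for low mu is what a probability-based conservation score summarizes.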
Polymorphic Aβ42 fibrils adopt similar secondary structure but differ in cross-strand side chain stacking interactions within the same β-sheet.
Formation of polymorphic amyloid fibrils is a common feature in neurodegenerative diseases involving protein aggregation. In Alzheimer's disease, different fibril structures may be associated with different clinical sub-types. The structural basis of fibril polymorphism is thus important for understanding the role of amyloid fibrils in the pathogenesis and progression of these diseases. Here we studied two types of Aβ42 fibrils prepared under quiescent and agitated conditions. Quiescent Aβ42 fibrils adopt a long and twisted morphology, while agitated fibrils are short and straight, forming large bundles via lateral association. EPR studies of these two types of Aβ42 fibrils show that the secondary structure is similar in both fibril polymorphs. At the same time, agitated Aβ42 fibrils show stronger interactions between spin labels across the full range of the Aβ42 sequence, suggesting a more tightly packed structure. Our data suggest that cross-strand side chain packing interactions within the same β-sheet may play a critical role in the formation of polymorphic fibrils.