
    Mio-Pliocene Faunal Exchanges and African Biogeography: The Record of Fossil Bovids

    The development of the Ethiopian biogeographic realm since the late Miocene is explored here through the presentation and review of fossil evidence from eastern Africa. Prostrepsiceros cf. vinayaki and an unknown species of possible caprin affinity are described from the hominid-bearing Asa Koma and Kuseralee Members (∼5.7 and ∼5.2 Ma) of the Middle Awash, Ethiopia. The Middle Awash Prostrepsiceros cf. vinayaki constitutes the first record of this taxon from Africa; it was previously known from the Siwaliks and Arabia. The possible caprin joins a number of isolated records of caprin or caprin-like taxa recorded, but poorly understood, from the late Neogene of Africa. The identification of these two taxa from the Middle Awash prompts an overdue review of fossil bovids from the sub-Saharan African record that demonstrate Eurasian affinities, including the reduncin Kobus porrecticornis and species of Tragoportax. The fossil bovid record provides evidence for greater biological continuity between Africa and Eurasia in the late Miocene and earliest Pliocene than is found later in time. In contrast, the early Pliocene (after 5 Ma) saw the loss of any significant proportion of Eurasian-related taxa and the rise to continental dominance of African-endemic taxa and lineages, a pattern that continues today.

    Dispelling urban myths about default uncertainty factors in chemical risk assessment - Sufficient protection against mixture effects?

    © 2013 Martin et al.; licensee BioMed Central Ltd. This article has been made available through the Brunel Open Access Publishing Fund.

    Assessing the detrimental health effects of chemicals requires extrapolating experimental data in animals to human populations. This is achieved by applying a default uncertainty factor of 100 to doses not found to be associated with observable effects in laboratory animals. It is commonly assumed that the toxicokinetic and toxicodynamic sub-components of this default uncertainty factor represent worst-case scenarios and that multiplying these components yields conservative estimates of safe levels for humans. It is sometimes claimed that this conservatism also offers adequate protection against mixture effects. By analysing the evolution of uncertainty factors from a historical perspective, we show that the default factor and its sub-components were intended to represent adequate rather than worst-case scenarios. The intention of using assessment factors for mixture effects was abandoned thirty years ago. It is also often overlooked that the conservatism (or otherwise) of uncertainty factors can only be judged in relation to a defined level of protection. A protection equivalent to an effect magnitude of 0.001-0.0001% over background incidence is generally considered acceptable. However, it is impossible to say whether this level of protection is in fact realised with the tolerable doses derived by employing uncertainty factors. Accordingly, it is difficult to assess whether uncertainty factors overestimate or underestimate the sensitivity differences in human populations. It is also often not appreciated that the outcome of probabilistic approaches to the multiplication of sub-factors depends on the choice of probability distributions. Therefore, the idea that default uncertainty factors are overly conservative worst-case scenarios that can both account for the lack of statistical power in animal experiments and protect against potential mixture effects is ill-founded. We contend that precautionary regulation should provide an incentive to generate better data, and we recommend adopting a pragmatic but scientifically better-founded approach to mixture risk assessment.
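The point that probabilistic multiplication of sub-factors depends on the chosen distributions can be illustrated with a small Monte Carlo sketch (a hypothetical illustration, not an analysis from the paper; the distributions and parameters below are assumptions):

```python
import math
import random

def product_percentile(sample_a, sample_b, q=0.95, n=100_000, seed=0):
    """Monte Carlo estimate of the q-th percentile of the product of two
    independently sampled sub-factors."""
    rng = random.Random(seed)
    products = sorted(sample_a(rng) * sample_b(rng) for _ in range(n))
    return products[int(q * n)]

# Two distributional choices for a sub-factor, both centred on the
# conventional value of 10 (illustrative parameters, not regulatory ones):
lognormal = lambda rng: math.exp(rng.gauss(math.log(10), 0.4))
uniform = lambda rng: rng.uniform(5, 15)

p_log = product_percentile(lognormal, lognormal)
p_uni = product_percentile(uniform, uniform)
# The combined 95th percentile differs substantially between the two
# choices, even though each sub-factor has the same central value.
```

Both combined factors have a median near the conventional 100, yet their upper percentiles diverge noticeably, which is the distribution-dependence the authors highlight.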

    Lipids and carotid plaque in the Northern Manhattan Study (NOMAS)

    Background: Lipids, particularly low-density (LDL) and high-density (HDL) lipoproteins, are associated with increased risk of stroke and cardiovascular disease, probably due to atherosclerosis. The objective of this cross-sectional analysis was to investigate the relation between blood lipids and carotid plaque.

    Methods: As part of a prospective population-based study to determine the incidence and risk factors of stroke in a multiethnic population, we evaluated 1804 participants with lipid measurements and B-mode ultrasound of the carotid arteries (mean age 69 ± 10 years; 40% men; 51% Hispanic, 26% black, 23% white). The association between lipid parameters and carotid plaque was analyzed by multiple logistic regression.

    Results: Plaque was present in 61% of participants. Mean total cholesterol was 202 ± 41 mg/dl. After controlling for other lipid parameters, demographics, and risk factors, the only cholesterol subfraction associated with carotid plaque was LDL (OR per standard deviation (SD) = 1.14, 95% CI 1.02-1.27). Neither HDL nor triglycerides independently predicted carotid plaque. Apolipoprotein B (ApoB) was also associated with risk of plaque (OR per SD = 1.29, 95% CI 1.03-1.60). Apolipoprotein A-I (ApoA-I) was associated with a decreased risk of multiple plaques (OR per SD = 0.76, 95% CI 0.60-0.97), while lipoprotein(a) was associated with an increased risk of multiple plaques (OR per SD = 1.31, 95% CI 1.03-1.66). The ApoB:ApoA-I ratio had the strongest relation with carotid plaque (OR per SD = 1.35, 95% CI 1.08-1.69).

    Conclusions: Among the common lipid parameters, LDL has the strongest relation with carotid plaque. Apolipoproteins such as ApoB and ApoA-I may be stronger predictors of subclinical atherosclerosis, however, and better targets for treatment to reduce plaque formation and the risk of cerebrovascular disease.

    Efficient Evaluation of Low Degree Multivariate Polynomials in Ring-LWE Homomorphic Encryption Schemes

    Homomorphic encryption schemes make it possible to perform computations over encrypted data. In schemes based on the RLWE assumption, the plaintext is a ring polynomial. In many use cases of homomorphic encryption, only the degree-0 coefficient of this polynomial is used to encrypt data. In this context, any computation on encrypted data can be performed. Generic computation is trickier when more than one coefficient per ciphertext is used. In this paper we introduce a method to efficiently evaluate low-degree multivariate polynomials over encrypted data. The main idea is to encode several messages in the coefficients of a plaintext-space polynomial. Using ring homomorphism operations and multiplications between ciphertexts, we compute multivariate monomials up to a given degree. Afterwards, using ciphertext additions, we evaluate the input multivariate polynomial. We perform extensive experiments with the proposed evaluation method. As an example, evaluating an arbitrary multivariate degree-3 polynomial with 100 variables over the Boolean space takes under 13 seconds.
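The coefficient-packing idea can be sketched at the plaintext level (a toy sketch: the ring parameters n and t are illustrative choices, not the paper's, and no actual encryption is performed). Multiplying two packed polynomials in Z_t[X]/(X^n + 1) places each pairwise monomial m_i·m_j' in coefficient i + j, as long as total degrees stay below n:

```python
n, t = 16, 257  # illustrative ring dimension and plaintext modulus

def polymul(a, b):
    """Negacyclic convolution: polynomial product in Z_t[X]/(X^n + 1)."""
    res = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < n:
                res[i + j] = (res[i + j] + ai * bj) % t
            else:  # wraparound uses X^n = -1
                res[i + j - n] = (res[i + j - n] - ai * bj) % t
    return res

def pack(msgs):
    """Encode a list of messages into the low-order coefficients."""
    p = [0] * n
    for i, m in enumerate(msgs):
        p[i] = m % t
    return p

a = pack([3, 5])  # encodes m0 = 3, m1 = 5
b = pack([2, 7])  # encodes m0' = 2, m1' = 7
prod = polymul(a, b)
# prod[0] = m0*m0' = 6, prod[1] = m0*m1' + m1*m0' = 31, prod[2] = m1*m1' = 35
```

In the scheme itself the same multiplication happens under encryption; note that coefficient i + j mixes monomials of the same total degree (e.g. m0·m1' + m1·m0'), which is where the ring homomorphism operations mentioned in the abstract come into play.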

    Numerical Method for Comparison on Homomorphically Encrypted Numbers

    We propose a new method to compare numbers encrypted by Homomorphic Encryption (HE). Previously, comparison and min/max functions were evaluated using Boolean functions on numbers encrypted bit-wise. However, bit-wise encryption methods make basic arithmetic operations such as addition and multiplication relatively expensive. In this paper, we introduce iterative algorithms that approximately compute the min/max and comparison operations on several numbers encrypted word-wise. From concrete error analyses, we show that our min/max and comparison algorithms have Θ(α) and Θ(α log α) computational complexity, respectively, to obtain approximate values within an error rate of 2^(−α), while the previous minimax polynomial approximation method requires exponential complexity, Θ(2^(α/2)) and Θ(√α · 2^(α/2)), respectively. We also show the (sub-)optimality of our min/max and comparison algorithms in terms of asymptotic computational complexity among polynomial evaluations for obtaining approximate min/max and comparison results. Our comparison algorithm extends to several applications, such as computing the top-k elements and counting the numbers above a threshold in encrypted form. Our new method enables word-wise HE schemes to achieve practical performance comparable to bit-wise HE schemes for comparison operations, while performing much better on polynomial operations. Computing an approximate maximum of any two ℓ-bit integers encrypted by HEAAN, up to error 2^(ℓ−10), takes only 1.14 milliseconds in amortized running time, which is comparable to results based on bit-wise HE schemes.
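The style of iteration involved can be sketched in plaintext Python (an illustrative instance, not necessarily the paper's exact algorithm; inputs are assumed scaled into [0, 1], and the iteration count d is a tuning parameter). The idea is max(x, y) = (x + y)/2 + |x − y|/2, with |x − y| obtained as the square root of (x − y)², computed by an iteration that uses only additions and multiplications, the operations word-wise HE schemes evaluate natively:

```python
def approx_sqrt(x, d):
    """Iteratively approximate sqrt(x) for x in [0, 1] using only + and *.
    Invariant: a*a == x*(1 + b), with b -> 0 as the iteration proceeds."""
    a, b = x, x - 1.0
    for _ in range(d):
        a = a * (1.0 - b / 2.0)
        b = (b * b) * (b - 3.0) / 4.0
    return a

def approx_max(x, y, d=10):
    """Approximate max(x, y) for x, y in [0, 1]
    via (x + y)/2 + sqrt((x - y)^2)/2."""
    z = (x - y) * (x - y)
    return (x + y) / 2.0 + approx_sqrt(z, d) / 2.0

m = approx_max(0.25, 0.75)  # converges to 0.75 as d grows
```

Because every step is a fixed sequence of additions and multiplications, the whole computation is a low-depth polynomial in the inputs, which is exactly what a word-wise scheme such as HEAAN can evaluate on ciphertexts.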

    Magnetic resonance imaging after most common form of concussion

    Background: Until now, there has been a lack of carefully controlled studies with conventional MR imaging performed exclusively in concussion with short-lasting loss of consciousness (LOC).

    Methods: An MR investigation was performed within 24 hours and after 3 months in 20 patients who had suffered a concussion with a verified loss of consciousness of at most 5 minutes. As a control group, 20 age- and gender-matched patients with minor orthopaedic injuries underwent an MR investigation using the same protocol.

    Results: In a concussion population with an average LOC duration of 1.4 minutes, no case with unequivocal intracranial traumatic pathology was detected.

    Conclusion: An ordinary concussion with short-lasting LOC rarely, if ever, results in a degree of diffuse axonal injury (DAI) that is visualized by conventional MR at a field strength of 1.0 Tesla (T). Analysis of earlier MR studies in concussion using a field strength of 1.5 T, as well as of studies with diffusion tensor MR imaging (MR DTI), reveals methodological shortcomings, in particular the use of inadequate control groups. There is, therefore, a need for carefully controlled studies using MR of higher field strength and/or studies with MR DTI exclusively in common concussion with LOC of at most 5 minutes.

    Memory Lower Bounds of Reductions Revisited

    In Crypto 2017, Auerbach et al. initiated the study of memory-tight reductions and proved two negative results on the memory-tightness of restricted black-box reductions from multi-challenge security to single-challenge security for signatures and an artificial hash function. In this paper, we revisit the results of Auerbach et al. and show that for a large class of reductions treating multi-challenge security, it is impossible to avoid a loss of memory-tightness without sacrificing running-time efficiency. Specifically, we show three lower bound results. First, we show a memory lower bound for natural black-box reductions from the multi-challenge unforgeability of unique signatures to any computational assumption. Second, we show a lower bound for restricted reductions from multi-challenge security to single-challenge security for a wide class of cryptographic primitives with unique keys in the multi-user setting. Finally, we extend the lower bound of Auerbach et al. from one particular hash function to any hash function with a large domain.

    What Constitutes a Natural Fire Regime? Insight from the Ecology and Distribution of Coniferous Forest Birds in North America

    Bird species that specialize in the use of burned forest conditions can provide insight into the prehistoric fire regimes associated with the forest types that they have occupied over evolutionary time. The nature of their adaptations reflects the specific post-fire conditions that occurred prior to the unnatural influence of humans after European settlement. Specifically, the post-fire conditions, nest site locations, and social systems of two species (Bachman's sparrow [Aimophila aestivalis] and red-cockaded woodpecker [Picoides borealis]) suggest that, prehistorically, a frequent, low-severity fire regime characterized the southeastern pine system in which they evolved. In contrast, the patterns of distribution and abundance for several other bird species (black-backed woodpecker [Picoides arcticus], buff-breasted flycatcher [Empidonax fulvifrons], Lewis' woodpecker [Melanerpes lewis], northern hawk owl [Surnia ulula], and Kirtland's warbler [Dendroica kirtlandii]) suggest that severe fire has been an important component of the fire regimes with which they evolved. Patterns of habitat use by the latter species indicate that severe fires are important components not only of higher-elevation and high-latitude conifer forest types, which are known to be dominated by such fires, but also of mid-elevation and even low-elevation conifer forest types that are not normally assumed to have had high-severity fire as an integral part of their natural fire regimes. Because plant and animal adaptations can serve as reliable sources of information about what constitutes a natural fire regime, it might be wise to supplement traditional historical methods with careful consideration of information related to plant and animal adaptations when attempting to restore what are thought to be natural fire regimes.

    Temporal allocation of foraging effort in female Australian fur seals (Arctocephalus pusillus doriferus)

    Across an individual's life, foraging decisions will be affected by multiple intrinsic and extrinsic drivers that act at differing timescales. This study aimed to assess how female Australian fur seals allocated foraging effort, and the behavioural changes used to achieve this, at three temporal scales: within a day, across a foraging trip, and across the final six months of the lactation period. Foraging effort peaked during daylight hours (57% of time diving), with lulls in activity just prior to and after daylight. Dive duration decreased across the day (196 s to 168 s), but this was compensated for by an increase in vertical travel rate (1500–1600 m·h⁻¹) and a reduction in post-dive duration (111–90 s). This suggests that physiological constraints (digestive costs) or prey availability may limit mean dive duration as the day progresses. During short trips (<2.9 d), effort remained steady at 55% of time diving, whereas on long trips (>2.9 d) effort increased up to 2–3 d and then decreased. Dive duration decreased at the same rate in short and long trips before stabilising (on long trips) between 4–5 d, suggesting that the same processes (digestive costs or prey availability) operating at the daily scale may also be present across a trip. Across the lactation period, foraging effort, dive duration, and vertical travel rate increased until August before beginning to decrease. This suggests that as the nutritional demands of the suckling pup and developing foetus increase, female effort increases to accommodate them, providing insight into the potential constraints on maternal investment in this species.