
    Prehistoric psychotropic consumption in Andean Chilean mummies

    Hallucinogenic plants are often regarded as the main source of psychoactive drugs used in antiquity to reach deeply altered states of consciousness^1,2^. Many researchers believe this was particularly true during the Tiwanaku empire expansion, circa 500-1000 A.D., along the Atacama Desert of Chile. Highly decorated snuffing tablets and tubes are often found as grave goods from this period^3,4,5,6,7,8^. Until now, the type of drug consumed with this paraphernalia has been unclear. Naturally mummified human bodies with abundant hair from the modern city of Arica provided a unique opportunity to test for hallucinogenic plants consumed in Andean prehistory. Analysis by gas chromatography and mass spectrometry demonstrated the presence of harmine. The Banisteriopsis vine, commonly called Ayahuasca, was the probable source. This is the first confirmed evidence of psychoactive plant consumption in pre-Hispanic Andean populations along the Atacama coastal region. Of the 32 mummy hair samples analyzed, 3 males tested positive for harmine. This alkaloid aids the catalysis and synergistic effects of powerful hallucinogenic drugs. The consumption of harmine was likely related to medicinal practices and was not exclusively ingested by shamans. Another important aspect of this evidence is that Banisteriopsis is an Amazonian plant; it does not grow in the Atacama coastal region. Thus, our findings reveal extensive plant trade networks in antiquity between the coast, desert, highlands, and Amazon basin. The excellent preservation of human organic specimens and the use of gas chromatography and mass spectrometry allowed us to map and demonstrate the consumption of psychoactive plant compounds in Andean prehistory. In addition, our findings open the door for future studies to debate the consumption and social role of ancient psychoactive and hallucinogenic plants.

    On the inconsistency of the Malmquist-Luenberger index

    Apart from the well-known weaknesses of the standard Malmquist productivity index related to infeasibility and to not accounting for slacks, already addressed in the literature, we identify a new and significant drawback of the Malmquist-Luenberger index decomposition that questions its validity as an empirical tool for measuring environmental productivity in the presence of bad outputs. In particular, we show that the usual interpretation of the technical change component in terms of production frontier shifts can be inconsistent with its numerical value, thereby resulting in an erroneous interpretation of this component that passes on to the index itself. We illustrate this issue with a simple numerical example. Finally, we propose a solution to this inconsistency based on incorporating a new postulate for the technology related to the production of bad outputs.
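
    The index at issue is built from directional distance functions evaluated across two periods. As a minimal sketch (not the paper's numerical example, and with purely illustrative distance values), the following Python snippet composes the Malmquist-Luenberger index and its usual decomposition into efficiency change and technical change from four pre-computed directional distance values:

        # Minimal sketch: Malmquist-Luenberger index and its standard decomposition.
        # The four distance values are illustrative, not taken from the paper.
        from math import sqrt

        def ml_index(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
            """d_a_b = directional distance of the period-b observation
            evaluated against the period-a technology."""
            ml = sqrt(((1 + d_t_t) / (1 + d_t_t1)) *
                      ((1 + d_t1_t) / (1 + d_t1_t1)))
            effch = (1 + d_t_t) / (1 + d_t1_t1)          # efficiency change
            tech = sqrt(((1 + d_t1_t) / (1 + d_t_t)) *   # technical change
                        ((1 + d_t1_t1) / (1 + d_t_t1)))
            return ml, effch, tech

        ml, effch, tech = ml_index(d_t_t=0.10, d_t_t1=0.05, d_t1_t=0.20, d_t1_t1=0.08)
        print(f"ML={ml:.3f}  EFFCH={effch:.3f}  TECH={tech:.3f}")

    The inconsistency discussed in the paper concerns cases where the numerical value of the technical change component disagrees with the direction of the frontier shift it is meant to measure; the snippet only shows how the components are composed.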

    Quantifying entanglement in multipartite conditional states of open quantum systems by measurements of their photonic environment

    A key lesson of the decoherence program is that information flowing out from an open system is stored in the quantum state of the surroundings. Simultaneously, quantum measurement theory shows that the evolution of any open system when its environment is measured is nonlinear and leads to pure states conditioned on the measurement record. Here we report the discovery of a fundamental relation between measurement and entanglement which is characteristic of this scenario. It takes the form of a scaling law between the amount of entanglement in the conditional state of the system and the probabilities of the experimental outcomes obtained from measuring the state of the environment. Using the scaling, we construct the distribution of entanglement over the ensemble of experimental outcomes for standard models with one open channel and provide rigorous results on finite-time disentanglement in systems coupled to non-Markovian baths.
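
    As a toy illustration (not the models analyzed in the paper), the sketch below conditions a two-qubit entangled state on the outcome of a photodetection measurement of a single amplitude-damping channel and reports the concurrence of each conditional state alongside the probability of the corresponding measurement record; the amplitudes and decay probability are arbitrary assumptions:

        # Toy example: conditional states and their entanglement for one monitored channel.
        # Amplitudes alpha, beta and decay probability p are illustrative assumptions.
        import numpy as np

        def concurrence_pure(psi):
            """Concurrence C = 2|ad - bc| of a pure two-qubit state a|00>+b|01>+c|10>+d|11>."""
            a, b, c, d = psi
            return 2 * abs(a * d - b * c)

        alpha, beta, p = 0.6, 0.8, 0.3
        psi0 = np.array([0, alpha, beta, 0], dtype=complex)   # alpha|01> + beta|10>

        # Kraus operators of amplitude damping on qubit B (the "photonic" mode)
        K0 = np.kron(np.eye(2), np.array([[1, 0], [0, np.sqrt(1 - p)]]))  # no photon detected
        K1 = np.kron(np.eye(2), np.array([[0, np.sqrt(p)], [0, 0]]))      # photon detected

        for label, K in [("no click", K0), ("click", K1)]:
            unnorm = K @ psi0
            prob = np.vdot(unnorm, unnorm).real               # probability of this record
            cond = unnorm / np.sqrt(prob) if prob > 0 else unnorm
            print(f"{label}: p = {prob:.3f}, concurrence = {concurrence_pure(cond):.3f}")

    Comparing the concurrence of each conditional state with the probability of its record is the kind of relation the reported scaling law formalizes for a much broader class of open systems.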

    Accounting for Changing Inequality in Costa Rica, 1980-1999

    After declining from the mid-1970s to the mid-1980s, earnings inequality in Costa Rica stabilized from 1987 to 1992 and then increased from 1992 to 1999. In this paper we use recently developed techniques to measure the extent to which these changes in earnings inequality were the result of changes in the distributions (or "quantities") of personal and workplace characteristics of workers, and of changes in the earnings differences (or "prices") associated with those characteristics. We present evidence that the most important cause of the fall in inequality prior to 1987 was a decline in returns to education, which in turn was caused by an increase in the supply of more-educated workers. We find that the most important causes of rising inequality in the 1990s were the end of this decline in returns to education and increases in the variance of hours worked among workers. Inequality in hours worked increased because of an increase in the proportion of workers working a non-standard work week (part-time or overtime).
    Keywords: inequality, Latin America, labor
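
    The quantity/price distinction can be illustrated with a simple counterfactual exercise: hold one year's earnings equation ("prices") fixed while swapping in another year's worker characteristics ("quantities"), and compare the resulting inequality. The sketch below is only a stylized illustration of that idea, not the paper's technique; the data frames and column names are hypothetical:

        # Stylized price/quantity decomposition of a change in earnings inequality.
        # Not the paper's method; data frames and column names are hypothetical.
        import numpy as np
        import pandas as pd

        def fit_prices(df, covariates, y="log_earnings"):
            """OLS log-earnings equation; returns design matrix and coefficients ("prices")."""
            X = np.column_stack([np.ones(len(df))] + [df[c].to_numpy() for c in covariates])
            beta, *_ = np.linalg.lstsq(X, df[y].to_numpy(), rcond=None)
            return X, beta

        def decompose(df0, df1, covariates):
            X0, b0 = fit_prices(df0, covariates)
            X1, b1 = fit_prices(df1, covariates)
            v = lambda X, b: np.var(X @ b)          # inequality of predicted log earnings
            total = v(X1, b1) - v(X0, b0)
            price = v(X1, b1) - v(X1, b0)           # change prices, hold year-1 quantities
            quantity = v(X1, b0) - v(X0, b0)        # change quantities, hold year-0 prices
            # (the residual/unobservables component is omitted for brevity)
            return {"total": total, "price": price, "quantity": quantity}

        # Hypothetical usage:
        # decompose(df_1987, df_1999, ["years_education", "experience", "hours_worked"])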

    Fine-tuning a context-aware system application by using user-centred design methods

    Context-aware systems in the home environment can provide an effective solution for supporting wellbeing and autonomy for the elderly. The definition and implementation of the system architecture for a particular assisted-living healthcare application entail both technological and usability challenges. If issues regarding users' concerns and desires are taken into account in the early stages of system development, users can benefit substantially more from this technology. In this paper, we describe our initial experiences with different user-centred design methods, as they are applied in the process of fine-tuning a context-aware system architecture to improve quality of life for elderly Total Hip Replacement (THR) patients. The insights resulting from this approach lead to a clearer functional specification that better fits the information needs of both the patient and the physiotherapist. Important system requirements, such as the timing and content of feedback, can be addressed much more fruitfully in an earlier phase of the development process. User-centred design methods help to better understand the required functional features of a context-aware system, thereby saving time and helping developers improve adoption of the system by its users.

    Performance tuning of a smartphone-based overtaking assistant

    Intelligent Transportation System (ITS) solutions suffer from a slow pace of adoption by manufacturers despite the interest shown by both consumers and industry. Our goal is to develop ITS applications using already available technologies to make them affordable, quick to deploy, and easy to adopt. In this paper we introduce EYES, an overtaking assistance solution that provides drivers with a real-time video feed from the vehicle located just in front. Our application thus provides a better view of the road ahead, and of any vehicles travelling in the opposite direction, and is especially useful when the driver's forward view is blocked by a large vehicle. We evaluated our application using the MJPEG video encoding format and determined the most effective resolution and JPEG quality choice for our case. Experimental results from tests performed with the application in both indoor and outdoor scenarios allow us to be optimistic about the effectiveness and applicability of smartphones in providing overtaking assistance based on video streaming in vehicular networks.
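
    Tuning an MJPEG stream of this kind comes down to trading encoded frame size (bandwidth) and encoding time against image quality. The sketch below is not the EYES implementation; it simply benchmarks JPEG frame size and encode time over a few assumed resolution and quality settings using OpenCV, with the camera index and tested values as assumptions:

        # Benchmark sketch: MJPEG frame size and encode time per resolution/quality setting.
        # Camera index 0 and the tested settings are assumptions, not the paper's values.
        import time
        import cv2

        RESOLUTIONS = [(320, 240), (640, 480), (1280, 720)]
        QUALITIES = [30, 50, 70, 90]

        cap = cv2.VideoCapture(0)
        ok, frame = cap.read()
        cap.release()
        if not ok:
            raise RuntimeError("could not grab a frame from the camera")

        for w, h in RESOLUTIONS:
            resized = cv2.resize(frame, (w, h))
            for q in QUALITIES:
                t0 = time.perf_counter()
                ok, buf = cv2.imencode(".jpg", resized, [int(cv2.IMWRITE_JPEG_QUALITY), q])
                dt = (time.perf_counter() - t0) * 1000
                # Encoded size drives the per-frame bandwidth of the MJPEG stream.
                print(f"{w}x{h} q={q}: {len(buf) / 1024:.1f} KiB, encode {dt:.2f} ms")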

    Development of a non-linear simulation for generic hypersonic vehicles - ASUHS1

    A nonlinear simulation is developed to model the longitudinal motion of a vehicle in hypersonic flight. The equations of motion pertinent to this study are presented. Analytic expressions for the aerodynamic forces acting on a hypersonic vehicle, obtained from Newtonian impact theory, are further developed. The control surface forces are further examined to incorporate vehicle elastic motion. The purpose is to establish feasible equations of motion which combine rigid-body, elastic, and aeropropulsive dynamics for use in nonlinear simulations. The software package SIMULINK is used to implement the simulation. Also discussed are issues needing additional attention and potential problems associated with the implementation, with proposed solutions.
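
    For orientation, the rigid-body part of such a longitudinal model typically integrates velocity, flight-path angle, altitude, pitch attitude, and pitch rate, with lift, drag, and pitching moment supplied by an aerodynamic model; Newtonian impact theory gives the local pressure coefficient Cp = 2 sin^2(theta) used to build such coefficients. The Python sketch below is a generic rigid-body-only illustration with placeholder vehicle data and aerodynamic coefficients; it is not the ASUHS1 model and omits the elastic and propulsion dynamics:

        # Generic longitudinal rigid-body simulation sketch (flat Earth), NOT ASUHS1.
        # Vehicle constants and aerodynamic coefficients below are placeholders.
        import numpy as np
        from scipy.integrate import solve_ivp

        m, Iyy, S, c, g = 1.36e5, 9.5e6, 335.0, 24.0, 9.81   # placeholder vehicle data (SI)

        def aero(V, h, alpha, de):
            """Placeholder aerodynamic model: returns lift, drag, pitch moment."""
            rho = 1.225 * np.exp(-h / 7200.0)                # crude exponential atmosphere
            qbar = 0.5 * rho * V**2
            CL = 2.0 * np.sin(alpha) ** 2 * np.cos(alpha) + 0.1 * de  # Newtonian-flavoured, illustrative
            CD = 0.02 + 2.0 * np.sin(alpha) ** 3
            Cm = -0.05 * alpha - 0.02 * de
            return qbar * S * CL, qbar * S * CD, qbar * S * c * Cm

        def rhs(t, x, thrust=0.0, de=0.0):
            V, gamma, h, theta, q = x
            alpha = theta - gamma
            L, D, M = aero(V, h, alpha, de)
            Vdot = (thrust * np.cos(alpha) - D) / m - g * np.sin(gamma)
            gdot = (L + thrust * np.sin(alpha)) / (m * V) - g * np.cos(gamma) / V
            return [Vdot, gdot, V * np.sin(gamma), q, M / Iyy]

        x0 = [1800.0, 0.0, 25000.0, 0.02, 0.0]               # V, gamma, h, theta, q
        sol = solve_ivp(rhs, (0.0, 60.0), x0, max_step=0.1)
        print(sol.y[:, -1])                                   # state after 60 s of flight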

    Families of Linear Efficiency Programs based on Debreu's Loss Function

    Gerard Debreu introduced a well known radial efficiency measure which he called a "coefficient of resource utilization." He derived this scalar from a much less well known "dead loss" function that characterizes the monetary value sacrificed to inefficiency, and which is to be minimized subject to a normalization condition. We use Debreu's loss function, together with a variety of normalization conditions, to generate several popular families of linear efficiency programs. Our methodology also can be employed to generate entirely new families of linear efficiency programs.
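
    One familiar member of these families is the input-oriented radial efficiency program in the Debreu-Farrell spirit, which shrinks a unit's inputs proportionally until they reach the frontier. The sketch below solves that standard linear program per decision-making unit under constant returns to scale with SciPy; the small data matrix is illustrative, and this is one textbook formulation rather than the paper's general scheme:

        # Sketch: input-oriented radial efficiency LP, one standard formulation (CRS).
        # The input/output data are illustrative only.
        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[2.0, 4.0, 3.0, 5.0],    # inputs  (rows: input, cols: DMU)
                      [3.0, 1.0, 2.0, 4.0]])
        Y = np.array([[1.0, 1.0, 1.5, 2.0]])   # outputs (rows: output, cols: DMU)

        def radial_efficiency(j0):
            m, n = X.shape
            s = Y.shape[0]
            c = np.r_[1.0, np.zeros(n)]        # minimise theta; variables are (theta, lambda)
            # outputs: -Y @ lam <= -y0 ; inputs: X @ lam - theta*x0 <= 0
            A_ub = np.vstack([np.c_[np.zeros((s, 1)), -Y],
                              np.c_[-X[:, [j0]], X]])
            b_ub = np.r_[-Y[:, j0], np.zeros(m)]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.x[0]

        for j in range(X.shape[1]):
            print(f"DMU {j}: theta = {radial_efficiency(j):.3f}")

    Swapping the objective and the normalization condition in the spirit of Debreu's loss function is what generates the other members of these families.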

    A magnified glance into the dark sector: probing cosmological models with strong lensing in A1689

    In this paper we constrain four alternative models of the late cosmic acceleration of the Universe: Chevallier-Polarski-Linder (CPL), interacting dark energy (IDE), Ricci holographic dark energy (HDE), and modified polytropic Cardassian (MPC). Strong lensing (SL) images of background galaxies produced by the galaxy cluster Abell 1689 are used to test these models. To perform this analysis we modify the LENSTOOL lens modeling code. The value added by this probe is compared with other complementary probes: Type Ia supernovae (SNIa), baryon acoustic oscillations (BAO), and the cosmic microwave background (CMB). We find that the CPL constraints obtained from the SL data are consistent with those estimated using the other probes. The IDE constraints are consistent with the complementary bounds only if large errors in the SL measurements are considered. The Ricci HDE and MPC constraints are weak, but they are similar to the BAO, SNIa, and CMB estimates. We also compute the figure of merit as a tool to quantify the goodness of fit of the data. Our results suggest that the SL method provides statistically significant constraints on the CPL parameters but only weak constraints for the other models. Finally, we show that the use of SL measurements in galaxy clusters is a promising and powerful technique for constraining cosmological models. The advantage of this method is that cosmological parameters are estimated by modelling the SL features for each underlying cosmology. These estimates could be further improved by SL constraints coming from other galaxy clusters.
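
    The cosmology enters a cluster lens model mainly through the ratio of angular diameter distances D_ls/D_s for each multiply imaged source. As a minimal sketch (with illustrative parameter values and redshifts, and omitting the LENSTOOL modelling itself), the snippet below evaluates that ratio for the CPL parametrisation w(z) = w0 + wa*z/(1+z) in a flat universe:

        # Sketch: the distance ratio D_ls/D_s that cluster strong lensing constrains,
        # evaluated for flat CPL dark energy.  Parameter values are illustrative only.
        import numpy as np
        from scipy.integrate import quad

        def E(z, om, w0, wa):
            """Dimensionless Hubble rate for flat CPL dark energy."""
            de = (1 - om) * (1 + z) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * z / (1 + z))
            return np.sqrt(om * (1 + z) ** 3 + de)

        def dls_over_ds(zl, zs, om=0.3, w0=-1.0, wa=0.0):
            """Lens-source over observer-source distance ratio (flat universe)."""
            inv_E = lambda z: 1.0 / E(z, om, w0, wa)
            num, _ = quad(inv_E, zl, zs)
            den, _ = quad(inv_E, 0.0, zs)
            return num / den      # the (1+zs) factors in D_A cancel in the ratio

        # Illustrative lens/source redshifts; the ratio's sensitivity to (w0, wa) is
        # what lets multiply imaged systems constrain the dark-energy parameters.
        print(dls_over_ds(zl=0.183, zs=2.5, w0=-1.0, wa=0.0))
        print(dls_over_ds(zl=0.183, zs=2.5, w0=-0.8, wa=0.3))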