
    The new (liberal) eugenics

    Despite the Nazi horrors, the new eugenics was founded in 1953, when Watson and Crick postulated the double helix of DNA as the basis of chemical heredity. In 1961, scientists deciphered the genetic code of DNA, laying the groundwork for code manipulation and the potential building of new life forms. Thirty years after the discovery of the DNA structure, experimenters began the first clinical studies of human somatic cell therapy. Prenatal genetic testing identifies unwanted genes or genetic markers, and parents can choose to continue the pregnancy or terminate it. With preimplantation genetic diagnosis, prospective parents can use in vitro fertilization and then test early embryonic cells to identify embryos carrying genes they prefer or wish to avoid. Because of concerns about eugenics, genetic counseling follows a "non-directive" policy to ensure respect for reproductive autonomy. The argument for this counseling model is that parental autonomy should be balanced against the child's future autonomy. Specialists have not yet given a clear answer to the question of whether these practices should be considered eugenic, or whether they are morally acceptable. DOI: 10.13140/RG.2.2.28777.9584

    A particle method for the homogeneous Landau equation

    We propose a novel deterministic particle method to numerically approximate the Landau equation for plasmas. Based on a new variational formulation of the Landau equation in terms of gradient flows, we regularize the collision operator to make sense of particle solutions. These particle solutions solve a large coupled ODE system that retains all the important properties of the Landau operator, namely conservation of mass, momentum, and energy, and decay of entropy. We illustrate the new method by showing its performance in several test cases, including the physically relevant case of the Coulomb interaction. The comparison with the exact solution and with the spectral method is strikingly good, maintaining second-order accuracy. Moreover, an efficient implementation of the method via a treecode is explored, giving a proof of concept for the practical use of our method when coupled with the classical PIC method for the Vlasov equation. Comment: 27 pages, 14 figures; debloated some figures, improved explanations in sections 2, 3, and
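    For context, the abstract does not state the equation itself; the spatially homogeneous Landau equation being approximated reads, in its standard form,

    ```latex
    \partial_t f(v,t)
      = \nabla_v \cdot \int_{\mathbb{R}^d} A(v - v_*)
        \bigl( f(v_*)\,\nabla_v f(v) - f(v)\,\nabla_{v_*} f(v_*) \bigr)\,\mathrm{d}v_* ,
    \qquad
    A(z) = C\,|z|^{\gamma+2}\Bigl( I - \frac{z \otimes z}{|z|^{2}} \Bigr),
    ```

    where the case \(\gamma = -3\) (in \(d = 3\)) is the physically relevant Coulomb interaction mentioned in the abstract.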

    A comparative study of three international construction firms : knowledge management infrastructures for optimising organisations' learning

    As communication becomes increasingly crucial in contemporary society, and as the number of social network platforms and their associated tools continues to grow, it becomes relevant to understand how these can be harnessed as strategic communication channels for companies and institutions. The trend is for networks to offer the user ever more resources, as a result of adapting to the user's needs, and their low cost is an incentive to explore them. Accordingly, through two case studies that are relevant to analyse here, this work seeks to understand how the strategic partnership between YouTube and Facebook can serve to strengthen and make effective the transmission of an institution's audiovisual messages, thereby raising its visibility among a target audience that is increasingly present on these networks

    A Scalable, FPGA-Based Implementation of the Unscented Kalman Filter

    Autonomous aerospace systems may well become ubiquitous once autonomous capability increases, and greater autonomous capability creates a need for high-performance state estimation. However, the desire to reduce costs through simplified development processes and compact form factors can limit performance. A hardware-based approach, such as using a field-programmable gate array (FPGA), is common when high performance is required, but hardware approaches tend to have a more complicated development process than traditional software approaches; greater development complexity, in turn, results in higher costs. Leveraging the advantages of both hardware-based and software-based approaches, a hardware/software (HW/SW) codesign of the unscented Kalman filter (UKF), based on an FPGA, is presented. The UKF is split into an application-specific part, implemented in software to simplify the development process, and a non-application-specific part, implemented in hardware as a parameterisable ‘black box’ module (i.e. an IP core) to increase performance. Simulation results demonstrating a possible nanosatellite application of the design are presented, along with implementation (synthesis, timing, power) details
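    The abstract does not show the algorithm itself, but the non-application-specific core of any UKF — sigma-point generation and the unscented transform — can be sketched in software. This is a minimal textbook illustration, not the paper's FPGA implementation; the scaling parameters `alpha`, `beta`, `kappa` use common default choices:

    ```python
    import numpy as np

    def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
        """Generate 2n+1 scaled sigma points and weights for mean x, covariance P."""
        n = x.size
        lam = alpha**2 * (n + kappa) - n
        S = np.linalg.cholesky((n + lam) * P)      # matrix square root, lower triangular
        pts = np.vstack([x, x + S.T, x - S.T])     # shape (2n+1, n)
        wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))   # mean weights
        wc = wm.copy()                                    # covariance weights
        wm[0] = lam / (n + lam)
        wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
        return pts, wm, wc

    def unscented_transform(pts, wm, wc, f):
        """Propagate sigma points through f and recover the transformed mean/covariance."""
        Y = np.array([f(p) for p in pts])
        mean = wm @ Y
        diff = Y - mean
        cov = (wc[:, None] * diff).T @ diff
        return mean, cov
    ```

    For a linear function the transform is exact, which gives a quick sanity check: pushing the sigma points of (x, P) through the identity map returns (x, P).
    
    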

    MOON: MapReduce On Opportunistic eNvironments

    MapReduce offers a flexible programming model for processing and generating large data sets on dedicated resources, where only a small fraction of such resources are ever unavailable at any given time. In contrast, when MapReduce is run on volunteer computing systems, which opportunistically harness idle desktop computers via frameworks like Condor, it delivers poor performance because of resource volatility, in particular the high rate of node unavailability. Specifically, the data and task replication scheme adopted by existing MapReduce implementations is woefully inadequate for resources with high unavailability. To address this, we propose MOON, short for MapReduce On Opportunistic eNvironments. MOON extends Hadoop, an open-source implementation of MapReduce, with adaptive task and data scheduling algorithms that offer reliable MapReduce services on a hybrid resource architecture, where volunteer computing systems are supplemented by a small set of dedicated nodes. These algorithms distinguish between (1) different types of MapReduce data and (2) different types of node outages in order to strategically place tasks and data on both volatile and dedicated nodes. Our tests demonstrate that MOON can deliver a three-fold performance improvement over Hadoop in volatile, volunteer computing environments
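    The abstract does not give MOON's actual scheduling rules, but its core idea — replicate data more aggressively on volatile nodes when it is expensive to recompute, while pinning a copy to the small dedicated tier — can be sketched as a hypothetical policy. All names and thresholds below are illustrative, not taken from the paper:

    ```python
    def replication_factor(data_type, node_unavailability, base=3):
        """Hypothetical replica-count policy in the spirit of MOON's hybrid
        architecture (illustrative only, not MOON's published algorithm).

        data_type: 'input' (re-fetchable from stable storage) or
                   'intermediate' (lost work if every replica vanishes)
        node_unavailability: estimated fraction of time a volunteer node
                             is offline, in [0.0, 1.0]

        Returns (replicas on volatile volunteer nodes,
                 replicas on dedicated nodes).
        """
        if data_type == "input":
            volatile_replicas = base                       # inputs can be re-fetched
        else:
            # scale intermediate-data replication with observed volatility
            volatile_replicas = base + round(base * node_unavailability)
        dedicated_replicas = 1                             # always keep one stable copy
        return volatile_replicas, dedicated_replicas
    ```

    The design point this illustrates is the two-way distinction from the abstract: data type decides *whether* to scale replication, and measured node volatility decides *how much*.
    
    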

    Modules over Monads and their Algebras
