26 research outputs found

    Clinical matrices and ethics in Freud

    Get PDF
    The present study is guided by the hypothesis that the different clinical matrices Freud encountered in his practice, which provided him with theoretical additions, directed his distinct perspectives on culture, leading him to privilege some ethical elements over others. Accordingly, we will indicate: (1) how hysteria raised the question of the conflict between sexuality and morality in civilization; (2) how obsessive neurosis introduced the themes of aggressiveness and hate as obstacles against which culture strives to fight, as well as the marked presence in the psyche of moral conscience and the feeling of guilt; (3) finally, how the so-called narcissistic conditions brought to Freud the role of selfishness and destructiveness as enemies of culture. Along this path we will approach the issues related to the ethical problematization in Freudian "psychology" and, from there, the prominence that the moral dimension will have in the Freudian conception of the subject.

    The trauma in Freud's work: conceptual ramifications and clinical consequences

    No full text
    Our proposal is to carry out a theoretical investigation of the concept of trauma in Freud's work, as well as of the set of problems to which it responds. We aimed to investigate whether the theory of trauma had in fact disappeared in the period between the abandonment of the seduction theory and the constructs of 1920, as some authors in the psychoanalytic literature maintain. Our hypothesis is that the notion of trauma continued to sustain the Freudian theory of psychic suffering, not only through sporadic appearances of the concept in this period, but also, implicitly, through the set of problems related to it, such as the etiology of the neuroses and Freud's major clinical cases: Dora, the Rat Man and the Wolf Man. Thus, we first traced the notion of trauma and its ramifications throughout the work and then studied these three clinical cases, looking for possible traumatic configurations in the histories of their protagonists. We came to the understanding that Freud does not always deal with the same kind of trauma throughout his writings: the term remains, but with different meanings. There would therefore be implicit subtypes of trauma in the Freudian discourse: trauma by rupture, trauma as a lack of mediation between fantasy and reality, and clues toward thinking of trauma as a narcissistic failure. Based on these subtypes, we address the clinical consequences of this rereading of the concept from the viewpoint of differential diagnosis. We then point out how the eminently economic aspect that permeates the category of trauma influenced Freud's view of the devices used by the analyst in the clinic. Following the ramifications, as well as the clinical scope, of the notion of trauma is a task that may offer us theoretical tools for understanding the types of traumatic constellations underlying the intolerable suffering of those who seek the psychoanalytic clinic.

    Highly-parallelized simulation of a pixelated LArTPC on a GPU

    No full text
    The rapid development of general-purpose computing on graphics processing units (GPGPU) is allowing the implementation of highly-parallelized Monte Carlo simulation chains for particle physics experiments. This technique is particularly suitable for the simulation of a pixelated charge readout for time projection chambers, given the large number of channels that this technology employs. Here we present the first implementation of a full microphysical simulator of a liquid argon time projection chamber (LArTPC) equipped with light readout and pixelated charge readout, developed for the DUNE Near Detector. The software is implemented with an end-to-end set of GPU-optimized algorithms. The algorithms have been written in Python and translated into CUDA kernels using Numba, a just-in-time compiler for a subset of Python and NumPy instructions. The GPU implementation achieves a speed-up of four orders of magnitude compared with the equivalent CPU version. The simulation of the current induced on 10^3 pixels takes around 1 ms on the GPU, compared with approximately 10 s on the CPU. The results of the simulation are compared against data from a pixel-readout LArTPC prototype.
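
    The central technique described above can be sketched briefly: the simulation steps are written as ordinary Python functions and compiled into CUDA kernels with Numba's @cuda.jit decorator, with (for example) one GPU thread per readout pixel. The kernel below is a minimal illustration under that assumption, not code from the DUNE simulator; the function name, the toy response "weights" matrix and the array sizes are invented for the example.

        import numpy as np
        from numba import cuda

        @cuda.jit
        def induced_charge_kernel(pixel_charge, segment_charge, weights):
            # One GPU thread per pixel: sum the charge that each drifted track
            # segment induces on that pixel (toy linear response, for illustration).
            i = cuda.grid(1)
            if i < pixel_charge.size:
                total = 0.0
                for j in range(segment_charge.size):
                    total += segment_charge[j] * weights[i, j]
                pixel_charge[i] = total

        # Hypothetical sizes, chosen only to mirror the ~10^3 pixels quoted above.
        n_pixels, n_segments = 1000, 4096
        segment_charge = np.random.rand(n_segments).astype(np.float32)
        weights = np.random.rand(n_pixels, n_segments).astype(np.float32)
        pixel_charge = np.zeros(n_pixels, dtype=np.float32)

        threads_per_block = 128
        blocks_per_grid = (n_pixels + threads_per_block - 1) // threads_per_block
        induced_charge_kernel[blocks_per_grid, threads_per_block](
            pixel_charge, segment_charge, weights)

    Each pixel is handled by an independent thread, which is why the large channel count of a pixelated readout maps well onto a GPU, as noted above.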

    Impact of cross-section uncertainties on supernova neutrino spectral parameter fitting in the Deep Underground Neutrino Experiment

    No full text
    A primary goal of the upcoming Deep Underground Neutrino Experiment (DUNE) is to measure the O(10) MeV neutrinos produced by a Galactic core-collapse supernova if one should occur during the lifetime of the experiment. The liquid-argon-based detectors planned for DUNE are expected to be uniquely sensitive to the νe component of the supernova flux, enabling a wide variety of physics and astrophysics measurements. A key requirement for a correct interpretation of these measurements is a good understanding of the energy-dependent total cross section σ(Eν) for charged-current νe absorption on argon. In the context of a simulated extraction of supernova νe spectral parameters from a toy analysis, we investigate the impact of σ(Eν) modeling uncertainties on DUNE’s supernova neutrino physics sensitivity for the first time. We find that the currently large theoretical uncertainties on σ(Eν) must be substantially reduced before the νe flux parameters can be extracted reliably; in the absence of external constraints, a measurement of the integrated neutrino luminosity with less than 10% bias with DUNE requires σ(Eν) to be known to about 5%. The neutrino spectral shape parameters can be known to better than 10% for a 20% uncertainty on the cross-section scale, although they will be sensitive to uncertainties on the shape of σ(Eν). A direct measurement of low-energy νe-argon scattering would be invaluable for improving the theoretical precision to the needed level.
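
    For concreteness, a toy analysis of this kind typically describes the time-integrated νe spectrum with the standard "pinched thermal" parameterization (total emitted energy, mean energy and a pinching parameter) and folds the cross-section mis-modeling into the predicted event rate. The sketch below shows that parameterization together with a single overall cross-section scale factor; the function names and the simplified event-rate formula (no detector response or exposure) are assumptions for illustration, not the analysis code used in the paper.

        import numpy as np
        from scipy.special import gamma

        def pinched_flux(E, eps_tot, e_mean, alpha):
            # Standard "pinched thermal" time-integrated nu_e number flux per unit
            # energy, normalized so that the total emitted energy equals eps_tot.
            norm = (eps_tot / e_mean) * (alpha + 1.0) ** (alpha + 1.0) \
                   / (gamma(alpha + 1.0) * e_mean)
            return norm * (E / e_mean) ** alpha * np.exp(-(alpha + 1.0) * E / e_mean)

        def predicted_events(E, eps_tot, e_mean, alpha, sigma_cc, xsec_scale=1.0):
            # Toy event spectrum: flux times the charged-current nu_e-Ar cross
            # section sigma_cc(E), with one scale factor standing in for the
            # cross-section normalization uncertainty studied above.
            return xsec_scale * sigma_cc(E) * pinched_flux(E, eps_tot, e_mean, alpha)

    In a fit that floats (eps_tot, e_mean, alpha) while the cross-section scale is held at a wrong value, the error is absorbed largely by the overall normalization, i.e. the integrated luminosity, consistent with that parameter being the most sensitive to the cross-section scale in the study above.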

    DUNE Offline Computing Conceptual Design Report

    No full text
    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.
