
    How autistic adults experience bereavement: an interpretative phenomenological study

    Bereavement is a stressful life event that disrupts a person’s world on relational, practical, and spiritual levels. The aim of this study is to elucidate what it is like for autistic individuals, who characteristically desire predictability and continuity, to experience the death of a loved one. Individual in-depth interviews were conducted with five autistic adults, and the transcripts were analysed using Interpretative Phenomenological Analysis. Four inter-related group experiential themes are presented: ‘Impacts of change, loss and uncertainty’, ‘Marginalisation: the sociocultural context of autistic grief’, ‘Adapting to change and loss: meaning and connection’, and ‘Stories and scripts: making sense of it all’. Consistent with the existing grief literature, participants’ grief reactions included affective, physiological, cognitive, and behavioural changes and were individual and varied with each loss. Participants also reported autism-related grief reactions such as changes in sensory processing, increased masking, and an increase in autistic inertia, shutdown, and meltdown. The findings provide preliminary data on how the demands of bereavement, including the burden of minority stress, may increase the risk of autistic burnout for autistic survivors. Bereavement instigated a narrative reconstruction of the autistic survivors’ life stories and identities. This process was social, including talking about the deceased, reflecting on their biography and legacy, and creating a sense of continued connection and relationship with the deceased in their ongoing lives. The findings are discussed in relation to the extant literature, and implications for psychotherapy and Counselling Psychology are raised.

    A Hybrid Quantum Encoding Algorithm of Vector Quantization for Image Compression

    Many classical encoding algorithms for Vector Quantization (VQ) in image compression that obtain the globally optimal solution have computational complexity O(N). A pure quantum VQ encoding algorithm with a success probability near 100% has been proposed; it performs approximately 45 sqrt(N) operations. In this paper, a hybrid quantum VQ encoding algorithm combining the classical method and the quantum algorithm is presented. Its number of operations is less than sqrt(N) for most images, making it more efficient than the pure quantum algorithm.
    Key words: Vector Quantization, Grover's Algorithm, Image Compression, Quantum Algorithm.
    Comment: Modified on June 21. 10 pages, 3 figures.
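    The classical kernel that a Grover-style search would accelerate is the exhaustive nearest-codeword scan, which costs O(N) per image block. Below is a minimal sketch of that classical step, assuming a Euclidean distortion measure; the names and array shapes are illustrative, not from the paper:

        import numpy as np

        def vq_encode(block: np.ndarray, codebook: np.ndarray) -> int:
            """Classical VQ encoding: return the index of the codeword
            nearest to `block`. Scanning all N codewords costs O(N);
            a Grover-style search targets O(sqrt(N)) oracle queries."""
            distances = np.linalg.norm(codebook - block, axis=1)
            return int(np.argmin(distances))

        # Usage: N = 256 codewords for 4x4 pixel blocks (dimension 16).
        rng = np.random.default_rng(0)
        codebook = rng.standard_normal((256, 16))
        block = rng.standard_normal(16)
        index = vq_encode(block, codebook)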

    Three-particle quantization condition in a finite volume: 2. General formalism and the analysis of data

    We derive the three-body quantization condition in a finite volume using an effective field theory in the particle-dimer picture. Moreover, we consider the extraction of physical observables from the lattice spectrum using the quantization condition. To illustrate the general framework, we calculate the volume-dependent three-particle spectrum in a simple model both below and above the three-particle threshold. The relation to existing approaches is discussed in detail.
    Comment: 36 pages, 9 figures.
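    The particle-dimer equation itself is not reproduced in this abstract; purely as a structural illustration, finite-volume quantization conditions generically take a determinant form, sketched here in LaTeX (the kernel K and finite-volume propagator G_L are placeholders, not the authors' exact objects):

        % Finite-volume energies E_n(L) solve a determinant condition:
        \det\!\left[\,\mathbb{1} - K(E)\, G_L(E)\,\right] = 0 ,
        % where K(E) encodes the infinite-volume particle-dimer dynamics
        % and G_L(E) carries the volume dependence through sums over the
        % finite-volume momenta p = (2\pi/L)\,\mathbf{n}, \ \mathbf{n} \in \mathbb{Z}^{3}.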

    A machine learning study to identify spinodal clumping in high energy nuclear collisions

    The coordinate and momentum space configurations of the net baryon number in heavy ion collisions that undergo spinodal decomposition, due to a first-order phase transition, are investigated using state-of-the-art machine-learning methods. Coordinate-space clumping, which appears in the spinodal decomposition, leaves strong characteristic imprints on the spatial net-density distribution in nearly every event, which can be detected by modern machine-learning techniques. On the other hand, the corresponding features in the momentum distributions cannot be clearly detected by the same machine-learning methods in individual events. Only a small subset of events can be systematically differentiated if only the momentum-space information is available. This is due to the strong similarity of the two event classes, with and without spinodal decomposition. In such scenarios, conventional event-averaged observables like the baryon number cumulants signal a spinodal non-equilibrium phase transition. Indeed, the third-order cumulant, the skewness, exhibits a peak at the beam energy (Elab = 3–4 A GeV) where the transient hot and dense system created in the heavy ion collision reaches the first-order phase transition.
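    The abstract does not specify the network used; purely as a hedged illustration, an event-by-event classifier over binned coordinate-space density distributions could look like the following PyTorch sketch (architecture, layer sizes, and names are assumptions, not the study's model):

        import torch
        import torch.nn as nn

        class EventClassifier(nn.Module):
            """Toy CNN labelling one event's 2D net-baryon density
            histogram as spinodal vs. non-spinodal (two logits)."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(32, 2)

            def forward(self, x: torch.Tensor) -> torch.Tensor:
                return self.head(self.features(x).flatten(1))

        # Usage: a batch of 8 events, each a 32x32 density histogram.
        logits = EventClassifier()(torch.randn(8, 1, 32, 32))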

    Spectrum of Relativistic Fermions in a 2d Doped Lattice

    Motivated by some previous work on fermions on random lattices and by suggestions that impurities could trigger parity breaking in 2d crystals, we have analyzed the spectrum of the Dirac equation on a two-dimensional square lattice from which sites have been removed randomly (a doped lattice). We have found that the system is well described by a sine-Gordon action. The solitons of this model are the lattice fermions, which pick up a quartic interaction due to the doping and become Thirring fermions. They also acquire an effective mass different from the Lagrangian mass. The system seems to exhibit spontaneous symmetry breaking, exactly as happens for a randomly triangulated lattice. The associated "Goldstone boson" is the sine-Gordon scalar. We argue, however, that the peculiar behaviour of the chiral condensate is due to finite-size effects.
    Comment: 11 pages.
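    For orientation, the sine-Gordon action referred to above has the standard 2d form, written here in a textbook normalization (the paper's conventions and couplings may differ):

        % Sine-Gordon action for the scalar \varphi in two dimensions:
        S[\varphi] = \int d^{2}x \left[ \tfrac{1}{2}\,(\partial_{\mu}\varphi)^{2}
            - \tfrac{\alpha}{\beta^{2}} \cos(\beta\varphi) \right]
        % Coleman's equivalence maps its solitons to massive Thirring
        % fermions, with (up to conventions) \beta^{2}/4\pi = 1/(1 + g/\pi).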

    Electrolysis-based diaphragm actuators

    This work presents a new electrolysis-based microelectromechanical systems (MEMS) diaphragm actuator. Electrolysis is a technique for converting electrical energy to pneumatic energy. Theoretically, electrolysis can achieve a strain of 136 000% and is capable of generating pressures above 200 MPa. Electrolysis actuators require modest electrical power and produce minimal heat. Because of the large volume expansion obtained via electrolysis, small actuators can create a large force. Up to 100 ”m of movement was achieved by a 3 mm diaphragm. The actuator operates at room temperature and has latching and reversing capabilities.
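    The quoted 136 000% strain is consistent with a back-of-the-envelope estimate that counts only the hydrogen gas evolved from liquid water at room temperature (an ideal-gas molar volume of roughly 24.5 L/mol at 25 °C is assumed here):

        % One mole of liquid water (about 0.018 L) yields one mole of H2:
        \frac{\Delta V}{V} \approx
            \frac{24.5\,\mathrm{L} - 0.018\,\mathrm{L}}{0.018\,\mathrm{L}}
            \approx 1.36 \times 10^{3}
            \;\Rightarrow\; \sim 136\,000\%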

    Integrated parylene-cabled silicon probes for neural prosthetics

    Recent advances in the field of neural prosthetics have demonstrated thought control of a computer cursor. This capability relies primarily on an electrode array surgically implanted into the brain as an acquisition source of neural activity. Various technologies have been developed for signal extraction; however, most suffer from either fragile electrode shanks and bulky cables or inefficient use of the surgical site area. Here we present the design and initial testing results of a high-electrode-density, silicon-based array system with an integrated parylene cable. The greatly reduced flexural rigidity of the parylene cable is believed to relieve possible mechanical damage due to relative motion between the brain and the skull.
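    The quantity behind "flexural rigidity" is the standard thin-plate bending stiffness; for a cable layer of Young's modulus E, thickness t, and Poisson's ratio Μ it takes the generic plate-theory form below (not a number from the paper):

        % Flexural rigidity of a thin plate; the cubic dependence on
        % thickness t, combined with parylene's low modulus relative to
        % silicon, is what makes the integrated cable so compliant:
        D = \frac{E\,t^{3}}{12\,(1-\nu^{2})}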

    MDCC: Multi-Data Center Consistency

    Replicating data across multiple data centers not only allows moving the data closer to the user, thus reducing latency for applications, but also increases availability in the event of a data center failure. It is therefore not surprising that companies like Google, Yahoo, and Netflix already replicate user data across geographically different regions. However, replication across data centers is expensive: inter-data-center network delays are in the hundreds of milliseconds and vary significantly. Synchronous wide-area replication with strong consistency is therefore considered infeasible, and current solutions either settle for asynchronous replication, which implies the risk of losing data in the event of failures, restrict consistency to small partitions, or give up consistency entirely. With MDCC (Multi-Data Center Consistency), we describe the first optimistic commit protocol that does not require a master or partitioning and that is strongly consistent at a cost similar to eventually consistent protocols. MDCC can commit transactions in a single round-trip across data centers in the normal operational case. We further propose a new programming model which empowers the application developer to handle the longer and unpredictable latencies caused by inter-data-center communication. Our evaluation using the TPC-W benchmark with MDCC deployed across 5 geographically diverse data centers shows that MDCC achieves throughput and latency similar to eventually consistent quorum protocols, and that MDCC can sustain a data center outage without a significant impact on response times while guaranteeing strong consistency.
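    MDCC's fast path builds on Paxos variants; the following Python sketch illustrates only the single-round-trip idea (commit iff a fast quorum of per-data-center acceptors accepts every record's update in one parallel exchange) and is not the MDCC protocol itself; all names and the quorum choice are assumptions:

        from dataclasses import dataclass, field

        @dataclass
        class Acceptor:
            """One data center's acceptor: accepts an update for a
            record unless a conflicting transaction already holds it."""
            accepted: dict = field(default_factory=dict)  # record -> txn

            def try_accept(self, record: str, txn: str) -> bool:
                if self.accepted.get(record) in (None, txn):
                    self.accepted[record] = txn
                    return True
                return False

        def fast_commit(acceptors, records, txn) -> bool:
            """Single round-trip: send all updates to every acceptor in
            parallel; commit iff ceil(3n/4) acceptors (one common fast-
            quorum choice) accept every record. On conflict, a classic
            Paxos round would be the fallback (not shown)."""
            fast_quorum = -(-3 * len(acceptors) // 4)  # e.g. 4 of 5
            return all(
                sum(a.try_accept(r, txn) for a in acceptors) >= fast_quorum
                for r in records
            )

        # Usage: 5 data centers, one transaction touching two records.
        dcs = [Acceptor() for _ in range(5)]
        committed = fast_commit(dcs, ["x", "y"], "t1")  # True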
    • 

    corecore