A Formal, Resource Consumption-Preserving Translation of Actors to Haskell
We present a formal translation of an actor-based language with cooperative
scheduling to the functional language Haskell. The translation is proven
correct with respect to a formal semantics of the source language and a
high-level operational semantics of the target, i.e. a subset of Haskell. The
main correctness theorem is expressed in terms of a simulation relation between
the operational semantics of actor programs and their translation. This allows
us to then prove that the resource consumption is preserved over this
translation, as we establish an equivalence of the cost of the original and
Haskell-translated execution traces.

Comment: Pre-proceedings paper presented at the 26th International Symposium on Logic-Based Program Synthesis and Transformation (LOPSTR 2016), Edinburgh, Scotland, UK, 6-8 September 2016 (arXiv:1608.02534).
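The source-language model, actors that run uninterrupted until they explicitly release control, can be illustrated with a small sketch. The following is not the paper's translation; it is a hypothetical Python rendering of cooperative scheduling using generators, where each resumption is one scheduling step:

```python
from collections import deque

def actor(name, steps, log):
    """One actor: performs `steps` units of observable work,
    yielding after each to cooperatively release control."""
    for i in range(steps):
        log.append((name, i))  # one unit of observable work
        yield                  # cooperative release point

def run(actors):
    """Round-robin scheduler: resume each actor until its next
    yield; nothing is preempted, so costs can be accounted per
    scheduling step."""
    queue = deque(actors)
    while queue:
        a = queue.popleft()
        try:
            next(a)          # resume until the next yield
            queue.append(a)  # still alive: reschedule
        except StopIteration:
            pass             # actor finished

log = []
run([actor("a", 2, log), actor("b", 2, log)])
print(log)  # interleaved: [('a', 0), ('b', 0), ('a', 1), ('b', 1)]
```

Because each actor only gives up control at its own yield points, the interleaving (and hence the per-step cost of a trace) is fully determined by the scheduler, which is what makes cost-preservation arguments tractable.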
Testing Claims of Efficacy and Mechanism of Action for Emotion Focused Couples Therapy: A Dyadic Case Study Using Time-Series Design
The overall purpose of this study was to test claims regarding both the efficacy and mechanism of change for Emotion Focused Couple Therapy (EFT). Although a number of treatment outcome studies have been conducted on EFT, the vast majority of these studies emanate from the research laboratories associated with the two founders of EFT. Additionally, most EFT research has examined treatment outcome rather than mechanisms of change. This study used a time-series single-case experimental design approach to examine both the efficacy and the mechanisms of change in EFT for couple distress. I systematically tracked the symptoms of couple distress across the span of an EFT treatment and explored how symptom severity varied over time within the dyad across several measures. Simulation modeling analysis (SMA) for time-series data was used to evaluate the level change across baseline, treatment, and follow-up phases. Further, cross-lag correlational analyses were used to clarify the mechanism of change in EFT. Experimental results from the time-series design provided moderate support for the EFT efficacy claim. Partial support was also found for the underlying EFT mechanism of action claim linking attachment insecurity and marital distress. Two of the EFT mechanism of action claims and an interpersonal mindfulness exploratory hypothesis, however, were unsupported by the experimental data. Implications for future research are discussed.
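Cross-lag correlation, the analysis used above to probe mechanism of change, asks whether one symptom series at time t predicts another at time t + lag. A minimal sketch on made-up series (the variable names and data are illustrative, not from the study):

```python
import numpy as np

def cross_lag_correlation(x, y, lag):
    """Correlate x at time t with y at time t + lag (lag >= 0).
    A large value at lag > 0 is consistent with changes in x
    preceding changes in y."""
    if lag == 0:
        return float(np.corrcoef(x, y)[0, 1])
    return float(np.corrcoef(x[:-lag], y[lag:])[0, 1])

# Toy example: y tracks x one step later, plus noise.
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=30))
y = np.roll(x, 1) + rng.normal(scale=0.1, size=30)
print(cross_lag_correlation(x, y, 1))  # close to 1: x leads y
```

A lagged correlation alone does not establish causation; in the study design it is one piece of evidence about temporal ordering within the dyad.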
Four decades of mapping and quantifying neuroreceptors at work in vivo by positron emission tomography
Decryption of brain images is the basis for the necessary translation of the findings from imaging to information required to meet the demands of clinical intervention. Tools of brain imaging, therefore, must satisfy the conditions dictated by the needs for interpretation in terms of diagnosis and prognosis. In addition, the applications must serve as fundamental research tools that enable the understanding of new therapeutic drugs, including compounds as diverse as antipsychotics, antidepressants, anxiolytics, and drugs serving the relief of symptoms from neurochemical disorders as unrelated as multiple sclerosis, stroke, and dementia. Here we review and explain the kinetics of methods that enable researchers to describe the brain's work and functions. We focus on methods invented by neurokineticists and expanded upon by practitioners during decades of experimental work and on the methods that are particularly useful to predict possible future approaches to the treatment of neurochemical disorders. We provide an overall description of the basic elements of kinetics and the underlying quantification methods, as well as the mathematics of modeling the recorded brain dynamics embedded in the images we obtain.
Dynamical edge modes and entanglement in Maxwell theory
Previous work on black hole partition functions and entanglement entropy suggests the existence of "edge" degrees of freedom living on the (stretched) horizon. We identify a local and "shrinkable" boundary condition on the stretched horizon that gives rise to such degrees of freedom. They can be interpreted as the Goldstone bosons of gauge transformations supported on the boundary, with the electric field component normal to the boundary as their symplectic conjugate. Applying the covariant phase space formalism for manifolds with boundary, we show that both the symplectic form and Hamiltonian exhibit a bulk-edge split. We then show that the thermal edge partition function is that of a codimension-two ghost compact scalar living on the horizon. In the context of a de Sitter static patch, this agrees with the edge partition functions found by Anninos et al. in arbitrary dimensions. It also yields a 4D entanglement entropy consistent with the conformal anomaly. Generalizing to Proca theory, we find that the prescription of Donnelly and Wall reproduces existing results for its edge partition function, while its classical phase space does not exhibit a bulk-edge split.
Development of a standardized multiplex Filovirus and SARS-CoV-2 antibody immunoassay
With the goal of producing multivalent recombinant subunit filovirus and SARS-CoV-2 vaccines, we develop formulations using surface glycoproteins of Ebola, Marburg and Sudan viruses or the Spike protein of the SARS-CoV-2 virus. In determining the potency of our formulations in generating an immune response in mice and non-human primates (NHP), serum antibody titers are used. Instead of using conventional antigen-binding ELISA assays for each antigen, we conduct testing by a custom multiplex immunoassay. This method uses regionally different magnetic beads coupled to purified recombinant antigens which are incubated with serum dilutions to simultaneously determine the antibody titers to the different immunizing antigens. After application of a secondary, fluorescently labeled antibody, values are normally shown as median fluorescent intensity or MFI.
By converting the MFI to an actual concentration, samples from different studies can more easily be compared. For this, standard curves using purified antigen-specific immunoglobulin G (IgG) to the three filovirus GPs or the SARS-CoV-2 spike protein are established with each assay. Standards were prepared by passing high-titered mouse or NHP sera over a protein G column to isolate IgG, then purified further using affinity-chromatography columns with individual filovirus GPs or SARS-CoV-2 spike protein to select for antigen specificity. The standards are quantified and curves are generated which will be run with each set of serum samples.
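The MFI-to-concentration step can be sketched as interpolation on a standard curve. Real multiplex assays typically fit a four- or five-parameter logistic curve; the following uses simple log-log interpolation and entirely made-up standard points, purely to show the shape of the calculation:

```python
import numpy as np

# Hypothetical standard curve: purified antigen-specific IgG
# concentrations (ng/mL) and the MFI each standard produced.
# These numbers are illustrative, not from the assay described.
std_conc = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
std_mfi = np.array([50.0, 420.0, 3100.0, 14000.0, 22000.0])

def mfi_to_concentration(mfi):
    """Map an unknown sample's median fluorescent intensity back to
    an IgG concentration by interpolating on the log-log standard
    curve (both axes span orders of magnitude)."""
    return float(10 ** np.interp(np.log10(mfi),
                                 np.log10(std_mfi),
                                 np.log10(std_conc)))

print(mfi_to_concentration(3100.0))  # on-curve point: 100.0 ng/mL
```

Running the standards alongside every plate, as the abstract describes, is what lets concentrations (rather than raw MFI) be compared across studies.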
When Machines Are Watching: How Warrantless Use of GPS Surveillance Technology Terminates The Fourth Amendment Right Against Unreasonable Search
The use of GPS surveillance technology for prolonged automated surveillance of American citizens is proliferating, and a direct split between the Ninth and D.C. Circuits on whether warrants are required under the Fourth Amendment for such use of GPS technology is bringing the issue to a head in the Supreme Court. A Petition for Certiorari is pending in the Ninth Circuit case which held that warrants are not required, and a second Petition is likely from the Government in the D.C. Circuit case holding that warrants are required. In this paper, we argue first, that where a technology enables invasion of interests at the heart of the Fourth Amendment's concern -- protection of citizens from arbitrary government intrusions into their private lives -- the Court's precedents require warrants to prevent abuse, and second, that the type and scope of information collected by prolonged automated GPS surveillance enables governments to monitor a person's political associations, their medical conditions and their amorous interests, in a way that invades their privacy and chills expression of other fundamental rights.
Our argument differs significantly from previous scholarship by tracing a continuous emphasis in Fourth Amendment jurisprudence on review of the potential for abuse of surveillance methods. Moreover, we are the first to argue that in protecting against abuse the Court has drawn a firm line between technology that simply enhances the natural senses of law enforcement officials, and technology that creates novel, non-biological "senses."
In Part I of this paper, we trace the origins of the Fourth Amendment's protections against law enforcement abuse, present evidence that GPS surveillance technology is in fact being abused, and discuss the impact unfettered abuse of the technology will have on the individual rights of citizens. In Part II, we explain the Court's historic approach to new surveillance technologies, noting that the Court has carefully examined new technologies to prevent any end-runs around legal doctrine from eroding personal privacy, and showing that the Court has always required warrants where technology goes beyond enhancement of senses to the creation of new non-biological "senses." In Part III, we explain why the Supreme Court's ruling on the use of beeper technology to enhance visual surveillance in United States v. Knotts, 460 U.S. 276 (1983), does not apply to the use of GPS technology as a replacement for visual surveillance. Finally, in Part IV, we explain how prolonged automated GPS surveillance invades a reasonable expectation of privacy and chills the exercise of core constitutional rights.
Study of Vertical Ga2O3 FinFET Short Circuit Ruggedness using Robust TCAD Simulation
In this paper, the short circuit ruggedness of Gallium Oxide (Ga2O3) vertical FinFETs is studied using Technology Computer-Aided Design (TCAD) simulations. Ga2O3 is an emerging ultra-wide bandgap material, and Ga2O3 vertical FinFETs can achieve normally-off operation for high voltage applications. Ga2O3 has a relatively low thermal conductivity and, thus, it is critical to explore the design space of Ga2O3 vertical FinFETs to achieve an acceptable short-circuit capability for power applications. In this study, appropriate TCAD models and parameters calibrated to experimental data are used. For the first time, the breakdown voltage simulation accuracy of Ga2O3 vertical FinFETs is studied systematically. It is found that a background carrier generation rate between 10^5 cm^-3 s^-1 and 10^12 cm^-3 s^-1 is required in simulation to obtain correct results. The calibrated and robust setup is then used to study the short circuit withstand time (SCWT) of an 800 V-rated Ga2O3 vertical FinFET with different inter-fin architectures. It is found that, due to the high thermal resistance in Ga2O3, to achieve an SCWT >1 us, a low gate overdrive is needed, which increases Ron,sp by 66%, and that Ga2O3 might melt before the occurrence of thermal runaway. These results provide important guidance for developing rugged Ga2O3 power transistors.
History Dependence in a Chemical Reaction Network Enables Dynamic Switching
This work describes an enzymatic autocatalytic network capable of dynamic switching under out-of-equilibrium conditions. The network, wherein a molecular fuel (trypsinogen) and an inhibitor (soybean trypsin inhibitor) compete for a catalyst (trypsin), is kept from reaching equilibrium using a continuous flow stirred tank reactor. A so-called "linear inhibition sweep" is developed (i.e., a molecular analogue of linear sweep voltammetry) to intentionally perturb the competition between autocatalysis and inhibition, and used to demonstrate that a simple molecular system, comprising only three components, is already capable of a variety of essential neuromorphic behaviors (hysteresis, synchronization, resonance, and adaptation). This research provides the first steps in the development of a strategy that uses the principles of systems chemistry to transform chemical reaction networks into platforms capable of neural network computing.
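The history dependence described above can be reproduced in a toy model: autocatalytic activation of fuel by catalyst, inhibition of the catalyst, and CSTR in/outflow. The rate constants below are illustrative choices, not the paper's fitted values; with them, the same inflows settle to either an "on" or an "off" steady state depending only on the initial trypsin level, which is the essence of hysteresis:

```python
# Toy CSTR model (illustrative rate constants, not the paper's):
#   Tg : trypsinogen (fuel), activated autocatalytically by T
#   I  : inhibitor, removes T irreversibly
#   T  : trypsin (catalyst)
# Fresh Tg and I flow in; everything washes out with residence
# time tau. Integrated with forward Euler for simplicity.
def simulate(T0, tau=10.0, k_a=1.0, k_i=4.0,
             Tg_in=2.0, I_in=1.0, dt=0.01, t_end=200.0):
    Tg, I, T = 0.0, 0.0, T0
    for _ in range(int(t_end / dt)):
        dTg = (Tg_in - Tg) / tau - k_a * T * Tg
        dI = (I_in - I) / tau - k_i * T * I
        dT = k_a * T * Tg - k_i * T * I - T / tau
        Tg += dTg * dt
        I += dI * dt
        T += dT * dt
    return T

# Identical inflows, different histories: a large initial trypsin
# pulse latches the network "on"; a small one decays to "off".
print(simulate(T0=1.0))   # high trypsin steady state ("on")
print(simulate(T0=0.01))  # trypsin washed out ("off")
```

The bistability arises because at low trypsin the inflowing inhibitor accumulates and suppresses autocatalysis, while at high trypsin the inhibitor is consumed faster than it is replenished.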
Applying Theories of Particle Packing and Rheology to Concrete for Sustainable Development
Concrete is one of the most important construction materials. However, it is not very compatible with the demands of sustainable development, because the manufacturing of cement generates a large amount of carbon dioxide, and cement consumption therefore produces a huge carbon footprint. Currently, cement consumption is generally lowered by adding supplementary cementitious materials to replace part of the cement. Nonetheless, in order to maintain performance, there is a limit to such cement replacement by supplementary cementitious materials. To further reduce the cement consumption, the total cementitious materials content has to be reduced. This requires two things: the packing density of the aggregate particles must be maximized, so that the voids in the bulk volume of aggregate to be filled with cement paste are minimized; and the surface area of the aggregate particles must be minimized, so that the cement paste needed to form paste films coating the aggregate surfaces for rheological performance is minimized. Such optimization is not straightforward, and modern concrete science based on particuology is needed. Herein, a number of new theories regarding the particle packing and rheology of concrete, which are transforming conventional concrete technology into modern concrete science, are presented. These theories would help to develop a more scientific and systematic concrete mix design method for the production of high-performance concrete with minimum cement consumption.
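The paste-demand argument above reduces to simple volume accounting: paste must fill the inter-particle voids and coat the particle surfaces. A sketch with illustrative numbers (the uniform-film model and all values are assumptions for illustration, not figures from the paper):

```python
def paste_demand(packing_density, specific_surface_area, film_thickness):
    """Cement paste needed per unit bulk volume of aggregate, under
    a simplified accounting that assumes idealized uniform paste
    films: paste fills the voids between packed particles plus a
    film of given thickness over every particle surface.

    specific_surface_area: particle surface per bulk volume (m^2/m^3)
    film_thickness: assumed uniform paste film thickness (m)
    """
    voids = 1.0 - packing_density                  # void fraction
    film = specific_surface_area * film_thickness  # film volume fraction
    return voids + film

# Illustrative comparison: raising packing density from 0.60 to 0.75
# and lowering surface area cuts the paste demand substantially.
print(paste_demand(0.60, 5000.0, 40e-6))  # ~0.60 (0.40 voids + 0.20 film)
print(paste_demand(0.75, 3000.0, 40e-6))  # ~0.37 (0.25 voids + 0.12 film)
```

This is why the abstract treats packing density and surface area as the two levers: each independently reduces one of the two terms in the paste budget.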