
    The Investor's Toolkit: Generating Multiple Returns Through a Unified Investment Strategy

    Get PDF
    This paper offers an introduction to the concept of managing financial assets using a strategy that maximises not only economic performance, but also social and environmental returns. It also explores the current state of this evolving investment approach, as well as the logic driving its inevitable expansion. This work is intended to broaden understanding of the diverse investment vehicles presently available, and to analyse how these trends might play out for different types of investors. The future of financial asset management as a field is also considered.

    Reproducing Kernel Hilbert Space Pruning for Sparse Hyperspectral Abundance Prediction

    Full text link
    Hyperspectral measurements from long range sensors can give a detailed picture of the items, materials, and chemicals in a scene, but analysis can be difficult, slow, and expensive due to the high spatial and spectral resolutions of state-of-the-art sensors. As such, sparsity is important for enabling the future of spectral compression and analytics. It has been observed that environmental and atmospheric effects, including scattering, can produce nonlinear effects that pose challenges for existing source separation and compression methods. We present a novel transformation into Hilbert spaces for pruning and constructing sparse representations via non-negative least squares minimization. We then introduce maximum-likelihood compression vectors to decrease information loss. Our approach is benchmarked against standard pruning and least squares as well as deep learning methods, and is evaluated in terms of overall spectral reconstruction error and compression rate using real and synthetic data. We find that pruning least squares methods converge quickly, unlike matching pursuit methods, and that Hilbert space pruning can reduce error by as much as 40% relative to standard pruning while also outperforming neural network autoencoders.
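
    The prune-and-re-solve idea behind sparse abundance estimation can be sketched in a few lines. This is a minimal illustration with synthetic stand-in data, using plain non-negative least squares rather than the paper's Hilbert-space formulation; the endmember library E and pixel spectrum y are invented here.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Hypothetical endmember library: 50 spectral bands, 8 candidate materials.
E = rng.random((50, 8))
a_true = np.zeros(8)
a_true[[1, 4]] = [0.7, 0.3]              # only two materials present
y = E @ a_true + 0.01 * rng.standard_normal(50)

a_full, _ = nnls(E, y)                   # dense non-negative solve
support = np.flatnonzero(a_full > 0.05)  # prune near-zero abundances
a_sparse, _ = nnls(E[:, support], y)     # re-solve on the pruned support

recon = E[:, support] @ a_sparse
rel_err = np.linalg.norm(y - recon) / np.linalg.norm(y)
print(support, round(rel_err, 4))        # recovers the two true materials
```

    The pruning threshold (0.05 here) is the tuning knob: it trades sparsity of the support against reconstruction error.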

    Computing equilibrium states of cholesteric liquid crystals in elliptical channels with deflation algorithms

    Full text link
    We study the problem of a cholesteric liquid crystal confined to an elliptical channel. The system is geometrically frustrated because the cholesteric prefers to adopt a uniform rate of twist deformation, but the elliptical domain precludes this. The frustration is resolved by deformation of the layers or introduction of defects, leading to a particularly rich family of equilibrium configurations. To identify the solution set, we adapt and apply a new family of algorithms, known as deflation methods, that iteratively modify the free energy extremisation problem by removing previously known solutions. A second algorithm, deflated continuation, is used to track solution branches as a function of the aspect ratio of the ellipse and the preferred pitch of the cholesteric. Comment: 9 pages, 6 figures
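
    The deflation mechanism can be shown schematically on a scalar problem. This sketch assumes the simple residual f(x) = x² − 1 (roots at +1 and −1) in place of the paper's free-energy extremisation: each found root r is "deflated" by multiplying the residual by 1/(x − r)² + 1, so Newton's method cannot reconverge to it.

```python
# Deflated Newton iteration on a toy residual with two known roots.

def newton(G, dG, x, tol=1e-10, maxit=100):
    for _ in range(maxit):
        step = G(x) / dG(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

f = lambda x: x**2 - 1.0

roots = []
for _ in range(2):
    def G(x):
        m = 1.0
        for r in roots:                     # deflate previously found roots
            m *= 1.0 / (x - r)**2 + 1.0
        return m * f(x)

    def dG(x, h=1e-7):                      # finite-difference slope of G
        return (G(x + h) - G(x - h)) / (2.0 * h)

    roots.append(newton(G, dG, 2.0))        # same initial guess both times

print(sorted(roots))                        # both roots from one guess
```

    The deflation factor blows up near known solutions, pushing the iteration toward solutions not yet found; the same device generalises to PDE discretisations with norms in place of absolute values.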

    Experimental dynamic characterization of meta-configured structures for aerospace applications

    Get PDF
    Drawing inspiration from acoustic metamaterials, meta-configuration as a means to improve coactive mechanisms between sub-structures to enhance overall structural performance is investigated with a primary focus on aerospace applications. Two prototypical meta-configured structures for dynamic load manipulation were designed, constructed and experimentally characterized. The first of these is a directional wave guide (DWG) that utilizes tailored spatio-spectral band gap regions to selectively attenuate or propagate frequency components within tunable bandwidths along designated paths. Finite element simulations were performed to evaluate designs for various resonator sub-structures and their patterning within a plate-type wave guide. Relative band structures for the resonators were tuned to span a frequency range from 16 to 20 kHz. The DWG is fabricated by chemically etching a patterned array of cantilever beam-type resonators on a 50-micron thick brass sheet. A custom test-rig consisting of an adjustable mount with low-stiffness boundary, piezo-actuator and a mechanically-staged laser vibrometer was used to conduct the experiments. Overall, reasonable correlation is indicated for the designed versus measured extents for the band gap frequency ranges along each path, although further detuning of global modes and minimization of resonator and boundary variability would help improve correlation. The second prototypical structure investigated is a passive-adaptive tuned vibration absorber (TVA). In contrast to conventional TVAs which are tuned to absorb vibrations within a preset bandwidth, the passive-adaptive TVA would be able to self-tune based solely on the input excitation to deliver appreciable absorption spanning larger bandwidths. An approach based on a riding mass retained under spring force on a cantilever beam resonator is explored for a low-frequency (~40-80 Hz) passive-adaptive TVA. 
Test articles with various configurations for the riding mass attachment were constructed and dynamically characterized using non-contact transduction techniques. An increase of about 55% over the mass-equivalent, conventional TVA's absorption bandwidth is obtained for the passive-adaptive TVA without altering its lower bound. These results indicate that meta-configured structures may provide improved solutions in applications such as isolation of payloads from structure-borne vibrations having a meandering dominant frequency content.
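
    The tuning of cantilever-type resonators like those etched into the brass sheet is governed by the standard clamped-free beam formula, f₁ = (β₁L)²/(2πL²)·√(EI/(ρA)) with β₁L ≈ 1.8751. The sketch below uses the 50-micron thickness from the abstract, but the beam width, length, and nominal brass properties are hypothetical choices made here to land inside the 16-20 kHz band, not dimensions reported in the abstract.

```python
import math

E   = 100e9      # Young's modulus of brass, Pa (nominal)
rho = 8500.0     # density of brass, kg/m^3 (nominal)
t   = 50e-6      # sheet thickness from the abstract, m
w   = 1e-3       # beam width, m (hypothetical)
L   = 1.24e-3    # beam length, m (hypothetical)

I = w * t**3 / 12.0          # second moment of area of the cross-section
A = w * t                    # cross-sectional area
f1 = (1.8751**2 / (2 * math.pi * L**2)) * math.sqrt(E * I / (rho * A))
print(f"{f1:.0f} Hz")        # a ~1.2 mm beam resonates near 18 kHz
```

    Since f₁ scales as 1/L², small changes in etched beam length sweep the resonance across the target band, which is how an array of resonators can span a range of gap frequencies.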

    Hydrogen Utilization in the Electricity Sector: Opportunities, Issues, and Challenges

    Get PDF
    This is a modified version of an opinion piece that was featured in Power Engineering. When the sun is shining and the wind is blowing, solar and wind energy are the lowest cost sources of electric power in the country. This energy can be used to directly power electrical devices, such as lighting for buildings or charging electric vehicles. It can also be stored in batteries for short-term storage, or used to make hydrogen, which can be stored or put in a pipeline for later use, including by users that are a long distance away. On the other hand, natural gas-fired turbines are both the lowest cost non-intermittent power source and the largest source of electric power in the US, at around 40%. Can they continue to evolve and be repurposed to utilize stored hydrogen for electric power?

    Quantum Process Tomography of the Quantum Fourier Transform

    Full text link
    The results of quantum process tomography on a three-qubit nuclear magnetic resonance quantum information processor are presented, and shown to be consistent with a detailed model of the system-plus-apparatus used for the experiments. The quantum operation studied was the quantum Fourier transform, which is important in several quantum algorithms and poses a rigorous test for the precision of our recently-developed strongly modulating control fields. The results were analyzed in an attempt to decompose the implementation errors into coherent (overall systematic), incoherent (microscopically deterministic), and decoherent (microscopically random) components. This analysis yielded a superoperator consisting of a unitary part that was strongly correlated with the theoretically expected unitary superoperator of the quantum Fourier transform, an overall attenuation consistent with decoherence, and a residual portion that was not completely positive, although complete positivity is required for any quantum operation. By comparison with the results of computer simulations, the lack of complete positivity was shown to be largely a consequence of the incoherent errors during the quantum process tomography procedure. These simulations further showed that coherent, incoherent, and decoherent errors can often be identified by their distinctive effects on the spectrum of the overall superoperator. The gate fidelity of the experimentally determined superoperator was 0.64, while the correlation coefficient between the experimentally determined and simulated superoperators was 0.79; most of the discrepancies with the simulations could be explained by the cumulative effect of small errors in the single qubit gates. Comment: 26 pages, 17 figures, four tables; in press, Journal of Chemical Physics
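
    The operation under test, and the kind of fidelity comparison described, can be sketched directly. The n-qubit quantum Fourier transform is the unitary F[j,k] = e^(2πijk/N)/√N with N = 2ⁿ; the "implementation" V below is just F with small hypothetical phase errors appended, standing in for an experimentally determined map (it is not the paper's measured superoperator).

```python
import numpy as np

n = 3
N = 2 ** n
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
F = np.exp(2j * np.pi * j * k / N) / np.sqrt(N)   # 3-qubit QFT matrix

assert np.allclose(F.conj().T @ F, np.eye(N))      # F is unitary

eps = 0.05                                         # small phase-error scale
V = F @ np.diag(np.exp(1j * eps * np.arange(N)))   # "implemented" unitary

# Gate fidelity between the ideal and perturbed unitaries: |Tr(F† V)| / N.
fidelity = abs(np.trace(F.conj().T @ V)) / N
print(round(fidelity, 4))                          # close to, but below, 1
```

    Coherent errors of this kind reduce the trace overlap smoothly, whereas decoherent errors show up as an overall attenuation of the superoperator, which is the distinction the analysis above exploits.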

    Robust Control of Quantum Information

    Full text link
    Errors in the control of quantum systems may be classified as unitary, decoherent and incoherent. Unitary errors are systematic, and result in a density matrix that differs from the desired one by a unitary operation. Decoherent errors correspond to general completely positive superoperators, and can only be corrected using methods such as quantum error correction. Incoherent errors can also be described, on average, by completely positive superoperators, but can nevertheless be corrected by the application of a locally unitary operation that "refocuses" them. They are due to reproducible spatial or temporal variations in the system's Hamiltonian, so that information on the variations is encoded in the system's spatiotemporal state and can be used to correct them. In this paper liquid-state nuclear magnetic resonance (NMR) is used to demonstrate that such refocusing effects can be built directly into the control fields, where the incoherence arises from spatial inhomogeneities in the quantizing static magnetic field as well as the radio-frequency control fields themselves. Using perturbation theory, it is further shown that the eigenvalue spectrum of the completely positive superoperator exhibits a characteristic spread that contains information on the Hamiltonians' underlying distribution. Comment: 14 pages, 6 figures
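
    The eigenvalue signature of incoherent errors can be seen in a toy model. Assume (as an illustration, not the paper's system) an incoherent error given by a classical average of z-rotations with angle θ ~ N(θ₀, σ²); in the (X, Y) block of the transfer matrix each member of the ensemble is a 2×2 rotation, and averaging shrinks the complex eigenvalue pair to magnitude e^(−σ²/2), so the spectrum encodes the width of the underlying distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
theta0, sigma = 0.3, 0.4
thetas = theta0 + sigma * rng.standard_normal(100_000)

# Ensemble average of the 2x2 rotation block over the angle distribution.
c, s = np.cos(thetas).mean(), np.sin(thetas).mean()
avg = np.array([[c, -s], [s, c]])

mag = np.abs(np.linalg.eigvals(avg))     # |eigenvalues| of the averaged map
print(mag, np.exp(-sigma**2 / 2))        # magnitudes match exp(-sigma^2/2)
```

    A purely unitary error would leave the eigenvalues on the unit circle; the inward spread is what distinguishes incoherent averaging, mirroring the perturbation-theory result stated above.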

    Ultraviolet-reflective film applied to windows reduces the likelihood of collisions for two species of songbird

    Get PDF
    Perhaps a billion birds die annually from colliding with residential and commercial windows. Therefore, there is a societal need to develop technologies that reduce window collisions by birds. Many current window films that are applied to the external surface of windows have human-visible patterns that are not esthetically preferable. BirdShades have developed a short wavelength (ultraviolet) reflective film that appears as a slight tint to the human eye but should be highly visible to many bird species that see in this spectral range. We performed flight tunnel tests of whether the BirdShades external window film reduced the likelihood that two species of songbird (zebra finch, Taeniopygia guttata, and brown-headed cowbird, Molothrus ater) collide with windows during daylight. We paid particular attention to simulating the lighting conditions that birds experience while flying during the day. Our results indicate a 75–90% reduction in the likelihood of collision with BirdShades-treated windows compared with control windows in forced choice trials. In a more ecologically relevant comparison between trials where all windows were either treated or control windows, the estimated reduction in the probability of collision was 30–50%. Further, both bird species slowed their flight by approximately 25% when approaching windows treated with the BirdShades film, thereby reducing the force of any collisions that do happen. We therefore conclude that the BirdShades external window film will be effective in reducing the risk of bird-window collisions and the damage they cause to populations and property. As this ultraviolet-reflective film has no human-visible patterning, the product might be an esthetically more acceptable, low-cost solution for reducing bird-window collisions. Further, we call for testing of other mitigation technologies in lighting and ecological conditions that are more similar to what birds experience in real human-built environments, and we make suggestions for testing standards to assess collision-reducing technologies.

    In What Ways Are Deep Neural Networks Invariant and How Should We Measure This?

    Full text link
    It is often said that a deep learning model is "invariant" to some specific type of transformation. However, what is meant by this statement strongly depends on the context in which it is made. In this paper we explore the nature of invariance and equivariance of deep learning models with the goal of better understanding the ways in which they actually capture these concepts on a formal level. We introduce a family of invariance and equivariance metrics that allows us to quantify these properties in a way that disentangles them from other metrics such as loss or accuracy. We use our metrics to better understand the two most popular methods used to build invariance into networks: data augmentation and equivariant layers. We draw a range of conclusions about invariance and equivariance in deep learning models, ranging from whether initializing a model with pretrained weights has an effect on a trained model's invariance, to the extent to which invariance learned via training can generalize to out-of-distribution data.Comment: To appear at NeurIPS 202
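
    The core idea of quantifying invariance separately from loss or accuracy can be sketched with a minimal metric: score a model f against a transformation T by the average output discrepancy E[‖f(T(x)) − f(x)‖], so 0 means perfectly invariant. The names f and T and the metric's exact form here are illustrative stand-ins, not the paper's definitions.

```python
import numpy as np

def invariance_gap(f, T, xs):
    # Mean output discrepancy between transformed and original inputs.
    return float(np.mean([np.linalg.norm(f(T(x)) - f(x)) for x in xs]))

rng = np.random.default_rng(0)
xs = [rng.standard_normal(5) for _ in range(100)]

T = lambda x: x[::-1]                       # transformation: reversal
f_inv = lambda x: np.array([np.sum(x**2)])  # reversal-invariant model
f_not = lambda x: x[:1]                     # reads position 0: not invariant

gap_inv = invariance_gap(f_inv, T, xs)
gap_not = invariance_gap(f_not, T, xs)
print(gap_inv, gap_not)                     # ~0 vs clearly nonzero
```

    Because the score depends only on model outputs, it can be evaluated on any dataset, including out-of-distribution inputs, which is what makes the generalization question above measurable.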