Recursive Thick Modeling and the Choice of Monetary Policy in Mexico
Following the spirit of Favero and Milani (2005), we use recursive thick modeling to take model uncertainty into account in the choice of optimal monetary policy. We consider an open economy model and generate multiple models only for aggregate demand and aggregate supply. Models are constructed by matching the rankings of aggregate demand and aggregate supply and adding other specifications for the rest of the variables. The main results show that recursive thick modeling with equal and different weights approximates the recent historical behavior of nominal interest rates in Mexico better than recursive thin modeling. Keywords: model uncertainty, optimal control, out-of-bag, thin modeling, thick modeling.
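The averaging step at the heart of thick modeling is simple to state: each candidate model prescribes its own optimal rate, and the prescriptions are pooled with equal or performance-based weights, instead of keeping only the single best-ranked model as thin modeling does. A minimal sketch in Python, with illustrative (hypothetical) rates and weights, not the paper's actual estimation code:

```python
import numpy as np

def thick_policy_rate(model_rates, weights=None):
    """Pool interest-rate prescriptions from several candidate models.

    model_rates : one optimal rate per candidate model (illustrative inputs).
    weights     : optional model weights, e.g. based on out-of-sample fit;
                  equal weights are used when omitted.
    """
    rates = np.asarray(model_rates, dtype=float)
    if weights is None:
        weights = np.full(rates.shape, 1.0 / rates.size)  # equal weights
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                      # normalise
    return float(rates @ weights)

# Example: three candidate AD/AS specifications prescribing different rates.
print(thick_policy_rate([7.5, 8.0, 8.4]))                            # equal weights
print(thick_policy_rate([7.5, 8.0, 8.4], weights=[0.5, 0.3, 0.2]))   # unequal weights
```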
On the analytical determination of relaxation modulus of viscoelastic materials by Prony's interpolation method
A computer implementation of Prony's curve fitting by exponential functions is presented. The method, although more than one hundred years old, has not been utilized to its fullest capabilities because of the restriction that the time range must be given in equal increments in order to obtain the best curve fit for a given set of data. The procedure used in this paper utilizes the 3-dimensional capabilities of the Interactive Graphics Design System (I.G.D.S.) to obtain the equal time increments. The resulting information is then input into a computer program that solves directly for the exponential constants yielding the best curve fit. Once the exponential constants are known, a simple least-squares solution can be applied to obtain the final form of the equation.
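The classic Prony procedure behind this is: once the data are at equal time increments, a linear recurrence is fitted by least squares, the roots of its characteristic polynomial give the exponential constants, and a final least-squares step gives the amplitudes. A small illustrative sketch (not the paper's I.G.D.S.-based implementation), assuming noise-free, equally spaced samples:

```python
import numpy as np

def prony_fit(y, dt, p):
    """Fit y(t) ~ sum_i c_i * exp(lambda_i * t) to samples at equal spacing dt,
    using p exponential terms (classic Prony procedure)."""
    y = np.asarray(y, dtype=float)
    N = y.size
    # 1) Least-squares fit of the linear recurrence y[k] = sum_j a_j * y[k - j].
    A = np.column_stack([y[p - j - 1:N - j - 1] for j in range(p)])
    b = y[p:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    # 2) Roots of the characteristic polynomial give z_i = exp(lambda_i * dt).
    z = np.roots(np.concatenate(([1.0], -a)))
    lam = np.log(z.astype(complex)) / dt
    # 3) A final linear least-squares step gives the amplitudes c_i.
    t = dt * np.arange(N)
    V = np.exp(np.outer(t, lam))
    c, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return c, lam

# Example: recover a two-term relaxation modulus from equally spaced samples.
t = 0.1 * np.arange(60)
samples = 3.0 * np.exp(-0.5 * t) + 1.5 * np.exp(-2.0 * t)
c, lam = prony_fit(samples, dt=0.1, p=2)
print(np.real_if_close(c), np.real_if_close(lam))  # amplitudes ~{3.0, 1.5}, rates ~{-0.5, -2.0}
```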
Super-Resolution for Overhead Imagery Using DenseNets and Adversarial Learning
Recent advances in Generative Adversarial Learning allow for new modalities of image super-resolution by learning low-to-high-resolution mappings. In this paper we present our work using Generative Adversarial Networks (GANs) with applications to overhead and satellite imagery. We have experimented with several state-of-the-art architectures. We propose a GAN-based architecture using densely connected convolutional neural networks (DenseNets) to super-resolve overhead imagery by a factor of up to 8x. We have also investigated the resolution limits of these networks. We report results on several publicly available datasets, including SpaceNet data and the IARPA Multi-View Stereo Challenge, and compare performance with other state-of-the-art architectures.
Comment: 9 pages, 9 figures, WACV 2018 submission
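As a rough illustration of the kind of generator such a GAN could use (not the authors' exact architecture; layer sizes and names here are assumptions), a DenseNet-style body followed by pixel-shuffle upsampling reaches 8x in three 2x stages; in the full setup this generator would be trained adversarially against a discriminator:

```python
import math
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Densely connected block: each conv sees the concatenation of all earlier features."""
    def __init__(self, channels=64, growth=32, layers=4):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv2d(channels + i * growth, growth, kernel_size=3, padding=1)
            for i in range(layers)
        )
        self.fuse = nn.Conv2d(channels + layers * growth, channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(torch.relu(conv(torch.cat(feats, dim=1))))
        return x + self.fuse(torch.cat(feats, dim=1))   # local residual connection

class SRGenerator(nn.Module):
    """Toy SR generator: DenseNet-style body, then pixel-shuffle upsampling (8x = three 2x stages)."""
    def __init__(self, channels=64, blocks=3, scale=8):
        super().__init__()
        self.head = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.body = nn.Sequential(*[DenseBlock(channels) for _ in range(blocks)])
        up = []
        for _ in range(int(math.log2(scale))):
            up += [nn.Conv2d(channels, channels * 4, 3, padding=1), nn.PixelShuffle(2), nn.ReLU()]
        self.upsample = nn.Sequential(*up)
        self.tail = nn.Conv2d(channels, 3, kernel_size=3, padding=1)

    def forward(self, lr):
        x = self.head(lr)
        x = self.body(x) + x          # global residual over the dense body
        return self.tail(self.upsample(x))

# Example: a 32x32 low-resolution patch is mapped to a 256x256 output (8x).
with torch.no_grad():
    print(SRGenerator()(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 3, 256, 256])
```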
Modern Approaches to Topological Quantum Error Correction
The construction of a large-scale fault-tolerant quantum computer is an outstanding scientific and technological goal. It holds the promise of allowing us to solve a variety of complex problems such as factoring large numbers, quick database search, and the quantum simulation of many-body quantum systems in fields as diverse as condensed matter, quantum chemistry, and even high-energy physics. Sophisticated theoretical protocols for reliable quantum information processing under imperfect conditions have been developed, where errors affect and corrupt the fragile quantum states during storage and computations. Arguably, the most realistic and promising approach towards practical fault-tolerant quantum computation is topological quantum error-correcting codes, where quantum information is stored in interacting, topologically ordered 2D or 3D many-body quantum systems. This approach offers the highest known error thresholds, which are already today within reach of the experimental accuracy in state-of-the-art setups. A combination of theoretical and experimental research is needed to store, protect, and process fragile quantum information in logical qubits effectively so that they can outperform their constituent physical qubits. Whereas small-scale quantum error correction codes have been implemented, one of the main theoretical challenges remains to develop new and improve existing efficient strategies (so-called decoders) to derive (near-)optimal error correction operations in the presence of experimentally accessible measurement information and realistic noise sources. One main focus of this project is the development and numerical implementation of scalable, efficient decoders to operate topological color codes. Additionally, we study the feasibility of implementing quantum error-correcting codes fault-tolerantly in near-term ion traps. To this end, we use realistic modeling of the different noise sources, computer simulations, and modern quantum information approaches to quantum circuitry and noise suppression techniques.
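To make the role of a decoder concrete: it maps measured syndrome bits to a recovery operation, ideally the most likely one consistent with the syndrome. A minimal lookup-table decoder for the three-qubit bit-flip repetition code (a deliberately simple stand-in, not the topological color-code decoders developed in this project) illustrates the idea:

```python
import itertools

# Stabilizer checks of the 3-qubit bit-flip code: parities Z1Z2 and Z2Z3.
CHECKS = [(0, 1), (1, 2)]

def syndrome(error):
    """Syndrome bits: parity of the bit-flip error on each stabilizer check."""
    return tuple((error[i] + error[j]) % 2 for i, j in CHECKS)

# Lookup-table decoder: for each syndrome, keep the lowest-weight error producing it.
DECODER = {}
for err in itertools.product((0, 1), repeat=3):
    s = syndrome(err)
    if s not in DECODER or sum(err) < sum(DECODER[s]):
        DECODER[s] = err

def decode(measured_syndrome):
    """Return the most likely (minimum-weight) correction for a measured syndrome."""
    return DECODER[measured_syndrome]

# Example: a single bit flip on qubit 1 triggers both checks and is corrected exactly.
error = (0, 1, 0)
correction = decode(syndrome(error))
residual = tuple((e + c) % 2 for e, c in zip(error, correction))
print(correction, residual)  # (0, 1, 0) (0, 0, 0)
```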
Porcion 72, Jose Maria Balli
Jose Maria Balli, Hidalgo County, Alamo, North Alamo Heights Subdivision.
https://scholarworks.utrgv.edu/chapseducationalresources/1058/thumbnail.jp
In Singulo Biophysics: Accessing the Dynamics of Intracellular Processes at the Molecular and Cellular Levels
Understanding the mechanisms behind the dynamics of biological processes is crucial to building a more comprehensive picture of the principles governing living systems. At the microscopic scale, local fluctuations and perturbations shape the dynamic behavior of these natural processes, demanding experimental access to the action of their individual players, e.g., biomolecules, supramolecular complexes, and organelles. This thesis presents a variety of biophysical strategies based on advanced microscopy techniques to study biomolecular processes that are highly dynamic in essence. In particular, a combination of optical trapping, fluorescence microscopy, and atomic force microscopy was applied to describe diverse intracellular processes at the level of individual molecules and individual cells. The biological processes covered in this thesis are: the role of histone chaperones in regulating histone-histone and histone-DNA interactions in the context of nucleosome remodeling; the thermodynamic and structural characterization of DNA encapsidation and nucleation intermediates in virus assembly; and the development of a method to investigate the kinetic responses of chemically activated macrophages with single-cell resolution. Overall, we report the detailed molecular and cellular behavior of the targeted biological systems, with special emphasis on the underlying dynamics and energetics that lead to their specific biological functions.
Extraction of wood compounds by use of subcritical fluids
A study of the extraction of oak wood compounds with subcritical water-ethanol mixtures as extractant, with an ethanol content between 0 and 60%, is reported. Identification and characterisation of the extracted compounds were carried out by spectrophotometry and gas chromatography with either flame ionisation or mass detectors. Extraction was performed in a static manner using a single cycle or repeated cycles. All variables affecting the extraction process were studied and optimised; the optimum extraction time and temperature were 60 min and 200 °C, respectively. Comparison of the extract thus obtained with commercial extracts showed the former to be rich in compounds characteristic of the commercial extracts. The method enables manipulation of the extract composition by changing the temperature and water/ethanol ratio used, and it is faster than the traditional procedures for obtaining wood extracts.