
    Parameter Learning and the Carry Trade

    Full text link
    Honors (Bachelor's), Economics, Mathematics, University of Michigan. http://deepblue.lib.umich.edu/bitstream/2027.42/120626/1/yichuanw.pd

    Synthesis and Application of [2.2]Paracyclophane Derivatives in Supramolecular Systems

    Get PDF

    Leveraging Big Data Analytics to Improve Quality of Care In Health Care: A fsQCA Approach

    Get PDF
    Academics across disciplines such as information systems, computer science and healthcare informatics highlight that big data analytics (BDA) have the potential to provide tremendous benefits for the healthcare industry. Nevertheless, healthcare organizations continue to struggle to make progress on their BDA initiatives. Drawing on configuration theory, this paper proposes a conceptual framework to explore the impact of BDA on improving quality of care in health care. Specifically, we investigate how BDA capabilities interact with complementary organizational resources and organizational capabilities in multiple configurations to achieve higher quality of care. Fuzzy-set qualitative comparative analysis (fsQCA), a relatively new approach, was employed to identify five different configurations that lead to higher quality of care. These findings offer evidence that a range of solutions leading to better healthcare performance can indeed be identified through the effective use of BDA and other organizational elements.
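
    As a rough illustration of the fsQCA machinery the abstract refers to, the sketch below shows the two basic steps: calibrating raw measures into fuzzy-set membership scores and computing the consistency of a condition as sufficient for the outcome. The variable names, thresholds, and data are hypothetical and are not taken from the paper.

        import numpy as np

        def calibrate(x, full_non, crossover, full_in):
            """Direct calibration: map raw scores to fuzzy-set membership in [0, 1],
            anchoring full non-membership, the crossover point, and full membership
            at log-odds of -3, 0, and +3 (the usual fsQCA convention)."""
            scale = np.where(x >= crossover,
                             3.0 / (full_in - crossover),
                             3.0 / (crossover - full_non))
            return 1.0 / (1.0 + np.exp(-scale * (x - crossover)))

        def consistency(condition, outcome):
            """Consistency of 'condition is sufficient for outcome':
            sum(min(X, Y)) / sum(X)."""
            return np.minimum(condition, outcome).sum() / condition.sum()

        # Hypothetical organization-level scores, NOT data from the paper
        bda_capability = calibrate(np.array([2.1, 4.5, 3.8, 1.2, 4.9]), 1, 3, 5)
        quality_of_care = calibrate(np.array([55, 80, 74, 40, 92]), 50, 70, 90)

        # How consistently does high BDA capability coincide with high quality of care?
        print(round(consistency(bda_capability, quality_of_care), 3))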

    ADVANCED REFLECTION SEISMIC STUDIES OF PHASE I WEYBURN CO2 SEQUESTRATION MONITORING

    Get PDF
    Three-dimensional, time-lapse (TL) reflection seismic datasets and well logs collected for the Phase I CO2 sequestration project in the Weyburn oilfield (southern Saskatchewan, Canada) are utilized to develop new approaches in three research areas: 1) estimation of seismic source waveforms, 2) evaluation of TL acoustic impedance (AI) variations for monitoring CO2 propagation, and 3) rigorous modeling of seismic waves propagating through finely layered rock. All three study areas are interconnected and important for accurate analysis of seismic data and TL monitoring of this and other oil reservoirs undergoing fluid injection. The first approach focuses on estimating source waveforms from reflection seismic data, which is critical for obtaining accurate well-to-seismic ties as well as for other applications. A simple and effective method is proposed, based on iterative identification of the strongest, sparse reflections in seismic records, which allows estimation of source waveforms through an optimization approach, without well-log control or statistical hypotheses. The method also allows correcting for coherent noise that appears to occur in the stacked Weyburn data, consisting of (de)amplification and time shifts of the low-frequency components of the records. The method is tested on real and self-similar synthetic well-log models and applied to the Weyburn seismic data. For the second topic, a post-stack waveform-calibration processing procedure is developed in order to achieve accurate consistency between the TL datasets. Time shifts between the monitor and baseline records are also measured during this procedure, and an improved method for calculating the TL reflectivity differences is proposed. Further, instead of subtracting the baseline and monitor AIs, TL AI variations are evaluated directly from the reflectivity differences and the baseline AI. AI inversion is performed by an accurate and stable method using the stacked reflection and well-log data, as well as seismic velocities measured during data processing. The inverted time shifts and TL AI variations correlate with CO2 distributions within the reservoir and allow estimation of reservoir parameters. In the third research area, a completely new approach to seismic wave modeling is proposed. Rigorous first-principle continuum mechanics is used instead of the conventional viscoelastic approximation. This modeling considers the existence of internal variables, body-force internal friction, and boundary conditions for internal variables. These factors are disregarded in the viscoelastic model, but they should cause dominant effects on seismic-wave attenuation and velocity dispersion in layered media. Numerical modeling of seismic wave propagation is performed in a model of the Weyburn Field. The resulting wavefield and seismic attenuation parameters are found to depend strongly on the internal boundary conditions between layers. Several types of quality (Q) factors are measured in the modeled synthetic waveforms.
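
    To illustrate how TL AI variations relate to reflectivity differences, the sketch below applies the standard recursive relation between a reflectivity series and acoustic impedance. It is a minimal illustration under a simple layered model, not the calibrated inversion procedure developed in the thesis; the reflectivity values and baseline impedance are hypothetical.

        import numpy as np

        def impedance_from_reflectivity(r, ai0):
            """Recursive acoustic-impedance reconstruction from a reflectivity series,
            using r_i = (AI_{i+1} - AI_i) / (AI_{i+1} + AI_i),
            i.e. AI_{i+1} = AI_i * (1 + r_i) / (1 - r_i)."""
            ai = np.empty(len(r) + 1)
            ai[0] = ai0
            for i, ri in enumerate(r):
                ai[i + 1] = ai[i] * (1 + ri) / (1 - ri)
            return ai

        # Hypothetical baseline and monitor reflectivity series (not Weyburn data)
        baseline_r = np.array([0.02, -0.01, 0.05, 0.00, -0.03])
        monitor_r = baseline_r + np.array([0.0, 0.0, -0.015, 0.0, 0.0])  # injection-induced change

        ai_base = impedance_from_reflectivity(baseline_r, ai0=6.0e6)  # impedance in kg/(m^2*s)
        ai_mon = impedance_from_reflectivity(monitor_r, ai0=6.0e6)
        print(ai_mon - ai_base)  # TL AI variation appears below the changed interface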

    Challenges for Less Developed Countries: Agricultural Policies in the EU and the US

    Get PDF
    Agricultural policies adopted by developed countries are considered distortional and detrimental to less developed countries (LDCs). This paper discusses the adverse impacts on less developed countries of the agricultural support regimes of the European Union (EU) and the United States (US). Despite the fact that the agricultural budgets of these constituencies are of the same order of magnitude, we find that the EU relies much more heavily on agricultural support than does the US. Specifically, the EU provides agricultural producers with an amount of support that is about two-and-a-half times that of the US, and for most commodities a larger share of farmers' income stems from support measures as well. While the composition of producer support differs between the EU and the US, the per-dollar negative impact of the policies on farmers in LDCs is about equal. Finally, we analyse the medium-term impact of the 2003 reform of the Common Agricultural Policy in the EU. We estimate the reform will lead to a 20 percent reduction in EU producer support by 2013 and will reduce the per-dollar negative impact of the policy on LDCs as well.
    Keywords: agricultural support, European Union, US, developing countries, least-developed countries

    Carbon Sequestration in Forest Ecosystems as a Strategy for Mitigating Climate Change

    Get PDF
    Under the Kyoto Protocol, forestry activities that sequester carbon can be used to create CO2 offset credits that could obviate the need for lifestyle-changing reductions in fossil fuel use. Credits are earned by storing carbon in forest ecosystems and wood products, although CO2 emissions are also mitigated by delaying deforestation, which accounts for one-quarter of anthropogenic CO2 emissions. Non-permanent carbon offsets from forest activities are difficult to compare with each other and with other mitigation strategies because they differ in how long they prevent CO2 from entering the atmosphere. In this paper, we investigate issues of carbon sequestration in detail; in particular, we expand in comprehensive fashion on earlier work comparing carbon mitigation activities according to how long they can lower atmospheric CO2 levels. The duration problem is modeled theoretically. Meta-regression analysis with 1047 observations from 68 studies is then used to determine whether the duration problem leads to inconclusive results regarding the relationship between carbon-uptake costs and carbon sequestration. In addition, from the regression analysis, it is possible to estimate potential costs of carbon uptake via forestry activities for various scenarios. It turns out that forestry activities are competitive with emissions reduction in tropical regions and, perhaps, in boreal regions, but certainly not in Europe.
    Keywords: climate change, carbon offset credits from forestry activities, meta-regression analysis, Resource/Energy Economics and Policy, Q54, R15, Q23, Q27
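
    One common way to put offsets of different durations on a comparable footing is to discount the re-emission that occurs when temporary storage ends. The sketch below computes such an equivalence factor for a few storage horizons; it is only meant to illustrate the duration problem and is not necessarily the theoretical model used in the paper.

        def temporary_offset_value(T, r):
            """Fraction of a permanent tonne that a T-year temporary offset is worth
            when the re-emission after T years is discounted at rate r:
            1 - (1 + r)**(-T). One common discounting treatment, not necessarily
            the model developed in the paper."""
            return 1.0 - (1.0 + r) ** (-T)

        for T in (10, 30, 100):
            print(T, "years:", round(temporary_offset_value(T, r=0.05), 2))
        # 10 years: 0.39, 30 years: 0.77, 100 years: 0.99 of a permanent reduction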

    Costs of Creating Carbon Offset Credits via Forestry Activities: A Meta-Regression Analysis

    Get PDF
    The main focus of efforts to mitigate climate change is on the avoidance of fossil fuel emissions. However, the Kyoto rules permit the use of forestry activities that create carbon offset credits. These could obviate the need for lifestyle-changing reductions in fossil fuel use. It is necessary for policy purposes, therefore, to determine the cost-effectiveness of creating forest-sink carbon credits. In this study, meta-regression analyses with 1047 observations from 68 studies are used to determine the factors that affect carbon sequestration costs. Results indicate that soil carbon is not very important, but that forest plantations and the use of biomass for energy make forestry activities more attractive. It also turns out that forestry activities are competitive with emissions reduction in tropical regions and, perhaps, boreal regions, but certainly not in Europe. Finally, the regression estimates are used to project the potential costs of carbon uptake for various forest management scenarios.
    Keywords: climate mitigation, forest carbon offset credits, meta-regression analysis
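
    The sketch below shows the general shape of such a meta-regression: cost per tonne (in logs) regressed on study characteristics such as region and activity type. The data are simulated and the specification is simplified; the actual study's moderators, weighting, and 1047-observation data set are not reproduced here.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated study-level observations (the real data set has 1047 obs from 68 studies)
        n = 200
        tropical = rng.integers(0, 2, n)      # 1 if the observation covers a tropical region
        plantation = rng.integers(0, 2, n)    # 1 if a plantation/afforestation activity
        soil = rng.integers(0, 2, n)          # 1 if soil carbon is counted
        log_cost = 3.0 - 1.2 * tropical - 0.5 * plantation - 0.05 * soil + rng.normal(0, 0.8, n)

        # Meta-regression: log cost per tonne CO2 regressed on study characteristics
        X = np.column_stack([np.ones(n), tropical, plantation, soil])
        beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)
        print(dict(zip(["const", "tropical", "plantation", "soil_carbon"], beta.round(2))))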

    The Effect of Social Factors on User-Generated Content Productivity: Evidence from Flickr.com

    Get PDF
    User-generated content (UGC) is a valuable information resource on social media. However, we currently know little about why people constantly produce digital content. In an effort to gather more content, many UGC service providers are trying to help users improve their social experience. Through the lens of social network theories, our study illustrates that social influence and social interaction between users motivate users to contribute content on UGC platforms. We therefore design an empirical study to discover the social factors influencing users' productivity. By collecting data from Flickr.com, we expect to show that individuals' perceptions of "social comparison" and "social identity" positively influence users' content-generation behavior. We also hypothesize that social interaction, in terms of tie strength, will increase users' content productivity.

    STUDY OF STOCHASTIC POLYAK STEP-SIZES FOR STOCHASTIC OPTIMIZATION

    Get PDF
    Stochastic gradient descent (SGD) is commonly used to solve finite-sum optimization problems. The main parameter one needs to choose for this algorithm is the step-size. Adaptive step-sizes are particularly appealing because they generally do not rely on function parameters, such as the smoothness or strong convexity constants, that must be known in advance to guarantee the convergence of SGD with a constant step-size. Loizou et al. [1] and Horváth et al. [2] proposed novel adaptive step-sizes, SPS and StoPS, which can be seen as stochastic variants of the classical Polyak step-size (PS). In this thesis, we provide a new viewpoint and analyze SPS and StoPS via operator theory: rather than analyzing the step-size on its own, we combine the adaptive step-size with the unbiased estimator of the stochastic gradient. We prove properties of the SPS and StoPS operators under different assumptions on the function we aim to minimize, and we give convergence guarantees for several types of objectives with the help of these operator properties. In addition, we analyze the stochastic reformulation of the SPS operator, show how it differs from minibatch SPS, and describe its connection to distributed optimization.
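
    For concreteness, the sketch below runs SGD with the capped stochastic Polyak step-size (the SPS_max form of [1]) on a toy interpolating least-squares problem, where each f_i attains a minimum of zero. It only illustrates the step-size rule itself; the operator-theoretic analysis of the thesis is not reproduced.

        import numpy as np

        def sgd_sps(grad_i, loss_i, x0, n_funcs, n_iters, c=0.5, gamma_max=1.0,
                    loss_i_star=None, rng=None):
            """SGD with the capped stochastic Polyak step-size (SPS_max form of [1]):
            gamma_k = min( (f_i(x_k) - f_i^*) / (c * ||grad f_i(x_k)||^2), gamma_max ).
            loss_i_star[i] is the minimum of f_i; it is 0 in the interpolating
            least-squares example below."""
            rng = rng or np.random.default_rng(0)
            star = np.zeros(n_funcs) if loss_i_star is None else loss_i_star
            x = x0.copy()
            for _ in range(n_iters):
                i = rng.integers(n_funcs)
                g = grad_i(x, i)
                gamma = min((loss_i(x, i) - star[i]) / (c * np.dot(g, g) + 1e-12), gamma_max)
                x = x - gamma * g
            return x

        # Toy finite-sum least squares: f_i(x) = 0.5 * (a_i @ x - b_i)^2, so f_i^* = 0
        rng = np.random.default_rng(1)
        A, x_true = rng.normal(size=(50, 10)), rng.normal(size=10)
        b = A @ x_true
        loss_i = lambda x, i: 0.5 * (A[i] @ x - b[i]) ** 2
        grad_i = lambda x, i: (A[i] @ x - b[i]) * A[i]
        x_hat = sgd_sps(grad_i, loss_i, np.zeros(10), n_funcs=50, n_iters=2000)
        print(np.linalg.norm(x_hat - x_true))  # should be small under interpolation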