
    Big Data in Finance: Highlights from the Big Data in Finance Conference Hosted at the University of Michigan October 27-28, 2016

    How can financial data be made more accessible and more secure, as well as more useful to regulators, market participants, and the public? As new data sets are created, opportunities emerge. Vast quantities of financial data may help identify emerging risks, enable market participants and regulators to see and better understand financial networks and interconnections, enhance financial stability, bolster consumer protection, and increase access to the underserved. Data can also increase transparency in the financial system for market participants, regulators and the public. These data sets, however, can raise significant questions about security and privacy; ensuring data quality; protecting against discrimination or privacy intrusions; managing, synthesizing, presenting, and analyzing data in usable form; and sharing data among regulators, researchers, and the public. Moreover, any conflicts among regulators and financial firms over such data could create opportunities for regulatory arbitrage and gaps in understanding risk in the financial system. The Big Data in Finance Conference, co-sponsored by the federal Office of Financial Research and the University of Michigan Center on Finance, Law, and Policy, and held at the University of Michigan Law School on October 27-28, 2016, covered a number of important and timely topics in the worlds of Big Data and finance. This paper highlights several key issues and conference takeaways as originally presented by the contributors and panelists who took part

    Ecosystem carbon dioxide fluxes after disturbance in forests of North America

    Disturbances are important for renewal of North American forests. Here we summarize more than 180 site-years of eddy covariance measurements of carbon dioxide flux made at forest chronosequences in North America. The disturbances included stand-replacing fire (Alaska, Arizona, Manitoba, and Saskatchewan) and harvest (British Columbia, Florida, New Brunswick, Oregon, Quebec, Saskatchewan, and Wisconsin) events, insect infestations (gypsy moth, forest tent caterpillar, and mountain pine beetle), Hurricane Wilma, and silvicultural thinning (Arizona, California, and New Brunswick). Net ecosystem production (NEP) showed a carbon loss from all ecosystems following a stand-replacing disturbance, becoming a carbon sink by 20 years for all ecosystems and by 10 years for most. Maximum carbon losses following disturbance (g C m⁻² y⁻¹) ranged from 1270 in Florida to 200 in boreal ecosystems. Similarly, for forests less than 100 years old, maximum uptake (g C m⁻² y⁻¹) was 1180 in Florida mangroves and 210 in boreal ecosystems. More temperate forests had intermediate fluxes. Boreal ecosystems were relatively time-invariant after 20 years, whereas western ecosystems tended to increase in carbon gain over time. This was driven mostly by gross photosynthetic production (GPP) because total ecosystem respiration (ER) and heterotrophic respiration were relatively invariant with age. GPP/ER was as low as 0.2 immediately following stand-replacing disturbance, reaching a constant value of 1.2 after 20 years. NEP following insect defoliations and silvicultural thinning showed lesser changes than stand-replacing events, with decreases in the year of disturbance followed by rapid recovery. NEP decreased in a mangrove ecosystem following Hurricane Wilma because of a decrease in GPP and an increase in ER.
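    These quantities are tied together by the standard carbon-balance identity, which is implicit in the abstract rather than stated there: net ecosystem production is gross photosynthetic production minus total ecosystem respiration,

        NEP = GPP - ER,    so NEP > 0 exactly when GPP/ER > 1.

    On this reading, the recovery of GPP/ER from about 0.2 just after a stand-replacing disturbance to about 1.2 after 20 years is another way of describing the shift from carbon source to carbon sink over the same interval.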

    A Survey of Laboratory and Statistical Issues Related to Farmworker Exposure Studies

    Developing internally valid, and perhaps generalizable, farmworker exposure studies is a complex process that involves many statistical and laboratory considerations. Statistics are an integral component of each study beginning with the design stage and continuing to the final data analysis and interpretation. Similarly, data quality plays a significant role in the overall value of the study. Data quality can be derived from several experimental parameters including statistical design of the study and quality of environmental and biological analytical measurements. We discuss statistical and analytic issues that should be addressed in every farmworker study. These issues include study design and sample size determination, analytical methods and quality control and assurance, treatment of missing data or data below the method’s limits of detection, and post-hoc analyses of data from multiple studies
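    Two of the issues listed above lend themselves to a small worked example: substitution for measurements below the limit of detection (LOD) and a per-group sample-size calculation. The sketch below is illustrative only; LOD/√2 substitution and the normal-approximation formula are common defaults rather than recommendations from this survey, and every number in it is hypothetical.

        import numpy as np
        from scipy.stats import norm

        # Replace non-detects with LOD / sqrt(2), one simple, widely used convention.
        lod = 0.5                                             # hypothetical detection limit (ug/L)
        raw = np.array([0.7, 1.4, np.nan, 2.1, np.nan, 0.9])  # NaN marks a value below the LOD
        imputed = np.where(np.isnan(raw), lod / np.sqrt(2), raw)

        # Per-group sample size for comparing two means, normal approximation:
        # n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta)^2
        alpha, power = 0.05, 0.80
        sigma, delta = 1.0, 0.5      # hypothetical SD and smallest difference worth detecting
        z_a, z_b = norm.ppf(1 - alpha / 2), norm.ppf(power)
        n_per_group = int(np.ceil(2 * ((z_a + z_b) * sigma / delta) ** 2))

        print(imputed)
        print(n_per_group)           # about 63 per group with these inputs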

    Understanding the treatment benefit of hyperimmune anti-influenza intravenous immunoglobulin (Flu-IVIG) for severe human influenza

    Background: Antibody-based therapies for respiratory viruses are of increasing importance. The INSIGHT 006 trial administered anti-influenza hyperimmune intravenous immunoglobulin (Flu-IVIG) to patients hospitalized with influenza. Flu-IVIG treatment improved outcomes in patients with influenza B but showed no benefit for influenza A. Methods: To probe potential mechanisms of Flu-IVIG utility, sera collected from patients hospitalized with influenza A or B viruses (IAV or IBV) were analyzed for antibody isotype/subclass and Fcγ receptor (FcγR) binding by ELISA, bead-based multiplex, and NK cell activation assays. Results: Influenza-specific FcγR-binding antibodies were elevated in Flu-IVIG–infused IBV- and IAV-infected patients. In IBV-infected participants (n = 62), increased IgG3 and FcγR binding were associated with more favorable outcomes. Flu-IVIG therapy also improved the odds of a more favorable outcome in patients with low levels of anti-IBV Fc-functional antibody. Higher FcγR-binding antibody was associated with less favorable outcomes in IAV-infected patients (n = 50), and Flu-IVIG worsened the odds of a favorable outcome in participants with low levels of anti-IAV Fc-functional antibody. Conclusion: These detailed serological analyses provide insights into antibody features and mechanisms required for a successful humoral response against influenza, suggesting that IBV-specific, but not IAV-specific, antibodies with Fc-mediated functions may assist in improving influenza outcome. This work will inform development of improved influenza immunotherapies
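    The treatment effects above are reported as odds of a more favorable outcome. As a heavily simplified illustration only, the sketch below computes an odds ratio with a Woolf confidence interval from a 2x2 table; the counts are invented, and whatever model the trial actually used for its outcome scale would be more involved than this.

        import numpy as np
        from scipy.stats import norm

        # Hypothetical 2x2 table: rows = Flu-IVIG vs placebo, columns = favorable vs unfavorable
        a, b = 24, 7    # Flu-IVIG arm: favorable, unfavorable (invented counts)
        c, d = 18, 13   # placebo arm:  favorable, unfavorable (invented counts)

        odds_ratio = (a * d) / (b * c)
        se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf standard error on log scale
        z = norm.ppf(0.975)
        low, high = np.exp(np.log(odds_ratio) + np.array([-z, z]) * se_log_or)
        print(f"OR = {odds_ratio:.2f}, 95% CI ({low:.2f}, {high:.2f})")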

    The role of the chemokine receptor CXCR4 in infection with feline immunodeficiency virus

    Infection with feline immunodeficiency virus (FIV) leads to the development of a disease state similar to AIDS in man. Recent studies have identified the chemokine receptor CXCR4 as the major receptor for cell culture-adapted strains of FIV, suggesting that FIV and human immunodeficiency virus (HIV) share a common mechanism of infection involving an interaction between the virus and a member of the seven transmembrane domain superfamily of molecules. This article reviews the evidence for the involvement of chemokine receptors in FIV infection and contrasts these findings with similar studies on the primate lentiviruses HIV and SIV (simian immunodeficiency virus)

    Emergent Dark Matter, Baryon, and Lepton Numbers

    We present a new mechanism for transferring a pre-existing lepton or baryon asymmetry to a dark matter asymmetry that relies on mass mixing dynamically induced in the early universe. Such mixing can succeed with only generic scales and operators and can give rise to distinctive relationships between the asymmetries in the two sectors. The mixing eliminates the need for the type of additional higher-dimensional operators that are inherent to many current asymmetric dark matter models. We consider several implementations of this idea. In one model, mass mixing is temporarily induced during a two-stage electroweak phase transition in a two Higgs doublet model. In the other class of models, mass mixing is induced by large field vacuum expectation values at high temperatures, from either moduli fields or even more generic kinetic terms. Mass mixing models of this type can readily accommodate asymmetric dark matter masses ranging from 1 GeV to 100 TeV and expand the scope of possible relationships between the dark and visible sectors in such models. Comment: 36 pages, 5 figures.

    A Solution to the Strong CP Problem with Gauge-Mediated Supersymmetry Breaking

    We demonstrate that a certain class of low-scale supersymmetric "Nelson-Barr" type models can solve the strong and supersymmetric CP problems while at the same time generating sufficient weak CP violation in the $K^{0}$-$\bar{K}^{0}$ system. In order to prevent one-loop corrections to $\bar{\theta}$ which violate bounds coming from the neutron electric dipole moment (EDM), one needs a scheme for the soft supersymmetry-breaking parameters which can naturally give sufficient squark degeneracies and proportionality of trilinear soft supersymmetry-breaking parameters to Yukawa couplings. We show that a gauge-mediated supersymmetry breaking sector can provide the needed degeneracy and proportionality, though that proves to be a problem for generic Nelson-Barr models. The workable model we consider here has the Nelson-Barr mass texture enforced by a gauge symmetry; one also expects a new U(1) gauge superfield with mass in the TeV range. The resulting model is predictive. We predict a measurable neutron EDM and the existence of extra vector-like quark superfields which can be discovered at the LHC. Because the $3\times 3$ Cabibbo-Kobayashi-Maskawa matrix is approximately real, the model also predicts a flat unitarity triangle and the absence of substantial CP violation in the $B$ system at future $B$ factories. We discuss the general issues pertaining to the construction of such a workable model and how they lead to the successful strategy. A detailed renormalization group study is then used to establish the feasibility of the model considered. Comment: Proof-read version to appear in Phys. Rev.
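    A brief aside on the last prediction: that an approximately real CKM matrix gives a flat unitarity triangle is standard textbook reasoning rather than a result specific to this paper. All six unitarity triangles have area $|J|/2$, where $J$ is the Jarlskog invariant, and $J$ vanishes when the CKM elements are real:

        J = \operatorname{Im}\left( V_{us} V_{cb} V_{ub}^{*} V_{cs}^{*} \right), \qquad
        \text{Area}(\triangle) = \tfrac{1}{2}\,|J|, \qquad
        V_{ij} \in \mathbb{R} \;\Rightarrow\; J = 0 .

    A real CKM matrix likewise removes the CKM source of CP violation in $B$-meson decays, which is the sense in which the model predicts no substantial CP violation at $B$ factories.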

    Late Quaternary biotic homogenization of North American mammalian faunas

    Biotic homogenization, the increasing similarity of species composition among ecological communities, has been linked to anthropogenic processes operating over the last century. Fossil evidence, however, suggests that humans have had impacts on ecosystems for millennia. We quantify biotic homogenization of North American mammalian assemblages during the late Pleistocene through Holocene (~30,000 ybp to recent), a timespan encompassing increased evidence of humans on the landscape (~20,000-14,000 ybp). From ~10,000 ybp to recent, assemblages became significantly more homogeneous (>100% increase in Jaccard similarity), a pattern that cannot be explained by changes in fossil record sampling. Homogenization was most pronounced among mammals larger than 1 kg and occurred in two phases. The first followed the megafaunal extinction at ~10,000 ybp. The second, more rapid phase began during human population growth and early agricultural intensification (~2,000-1,000 ybp). We show that North American ecosystems were homogenizing for millennia, extending human impacts back ~10,000 years.
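    Because the homogenization signal is expressed as a change in Jaccard similarity, a minimal sketch of that index may help; the species lists below are hypothetical and are not the paper's data.

        # Jaccard similarity between two assemblages: |A ∩ B| / |A ∪ B|
        def jaccard(a: set[str], b: set[str]) -> float:
            return len(a & b) / len(a | b) if (a or b) else 0.0

        site_a = {"Bison bison", "Canis lupus", "Odocoileus virginianus"}    # hypothetical assemblage
        site_b = {"Canis lupus", "Odocoileus virginianus", "Puma concolor"}  # hypothetical assemblage
        print(jaccard(site_a, site_b))  # 0.5; a doubling of this value would be a ">100% increase"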