
    Inferring neural circuit structure from datasets of heterogeneous tuning curves.

    Tuning curves characterizing the response selectivities of biological neurons can exhibit large degrees of irregularity and diversity across neurons. Theoretical network models that feature heterogeneous cell populations or partially random connectivity also give rise to diverse tuning curves. Empirical tuning curve distributions can thus be utilized to make model-based inferences about the statistics of single-cell parameters and network connectivity. However, a general framework for such an inference or fitting procedure is lacking. We address this problem by proposing to view mechanistic network models as implicit generative models whose parameters can be optimized to fit the distribution of experimentally measured tuning curves. A major obstacle for fitting such models is that their likelihood function is not explicitly available or is highly intractable. Recent advances in machine learning provide ways for fitting implicit generative models without the need to evaluate the likelihood and its gradient. Generative Adversarial Networks (GANs) provide one such framework which has been successful in traditional machine learning tasks. We apply this approach in two separate experiments, showing how GANs can be used to fit commonly used mechanistic circuit models in theoretical neuroscience to datasets of tuning curves. This fitting procedure avoids the computationally expensive step of inferring latent variables, such as the biophysical parameters of, or synaptic connections between, particular recorded cells. Instead, it directly learns generalizable model parameters characterizing the network's statistical structure such as the statistics of strength and spatial range of connections between different cell types. Another strength of this approach is that it fits the joint high-dimensional distribution of tuning curves, instead of matching a few summary statistics picked a priori by the user, resulting in a more accurate inference of circuit properties. 
More generally, this framework opens the door to direct model-based inference of circuit structure from data beyond single-cell tuning curves, such as simultaneous population recordings.
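The adversarial-fitting idea above can be caricatured in a few dozen lines. The sketch below is deliberately minimal and is not the paper's model: the "mechanistic generator" is a single scalar circuit parameter `mu` producing noisy responses, the discriminator is a hand-differentiated logistic regression, and all names and settings are illustrative. The point is only that the generator's gradient flows through the discriminator into the circuit parameter, with no likelihood evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Stand-in "mechanistic model": a single circuit parameter mu generates
# responses mu + noise; the recorded data come from mu_true.
mu_true = 2.0
real = mu_true + rng.normal(size=2000)

mu = 0.0                 # generator (circuit) parameter to be inferred
w, b = 0.0, 0.0          # logistic-regression discriminator
lr_g, lr_d = 0.05, 0.1

for _ in range(2000):
    x_r = rng.choice(real, 32)
    x_f = mu + rng.normal(size=32)
    d_r, d_f = sigmoid(w * x_r + b), sigmoid(w * x_f + b)
    # discriminator: gradient ascent on log D(real) + log(1 - D(fake))
    w += lr_d * np.mean((1 - d_r) * x_r - d_f * x_f)
    b += lr_d * np.mean((1 - d_r) - d_f)
    # generator: gradient ascent on log D(fake); the gradient reaches the
    # circuit parameter mu through the generated samples
    x_f = mu + rng.normal(size=32)
    d_f = sigmoid(w * x_f + b)
    mu += lr_g * np.mean((1 - d_f) * w)
```

After training, `mu` sits near the data-generating value without the likelihood of the generator ever being written down, which is the property the paper exploits for intractable circuit models.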

    A test of speculative arbitrage : is the cross-section of volatility invariant?

    We derive testable implications of Kyle and Obizhaeva's (2016) notion of "bet invariance" for the cross-section of trade-time volatilities. We jointly develop theoretical foundations of "no speculative arbitrage" whose implications incorporate those of bet invariance. Our proposed test circumvents the unobservable nature of "bets." Utilizing a large sample of U.S. stocks post-decimalization, we show that using realized volatilities rather than expected volatilities introduces noise that substantially biases the tests. This leads us to use estimates of normalized volatilities based on running 24-month windows. We find strong support for no speculative arbitrage at a moment in time, but not across time.
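A running-window normalization of the kind described can be sketched as follows; the paper's actual estimator differs, so the window length and the scaling by the trailing mean are assumptions made here for illustration.

```python
import numpy as np

def normalized_volatility(vol, window=24):
    """Scale each month's volatility by its trailing `window`-month mean.

    A stand-in for a running-window normalization; entries with fewer than
    `window` months of history are left as NaN.
    """
    vol = np.asarray(vol, dtype=float)
    out = np.full_like(vol, np.nan)
    for t in range(window, len(vol)):
        out[t] = vol[t] / vol[t - window:t].mean()
    return out
```

Normalizing by a slowly moving local level is one way to compare volatilities cross-sectionally without letting secular volatility trends dominate the test.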

    The night and day of Amihud's (2002) liquidity measure

    Amihud's (2002) stock (il)liquidity measure averages the daily ratio of absolute close-to-close return to dollar volume, including overnight returns, while trading volumes come from regular trading hours. Our modified measure addresses this mismatch by using open-to-close returns. It better explains cross-sections of returns, doubling estimated liquidity premia over 1964–2017. Using non-synchronous trading near close as an instrument reveals that overnight returns are primarily information-driven and orthogonal to price impacts of trading. Thus, including them in liquidity proxies magnifies measurement error, understating liquidity premia. Our modification especially matters when applications in finance and accounting render use of intraday data infeasible or undesirable.
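The two variants of the measure differ only in which return enters the ratio. A minimal numpy sketch with hypothetical daily data (the prices and volumes below are made up for illustration):

```python
import numpy as np

def amihud(returns, dollar_volume):
    """Average of |return| / dollar volume (Amihud 2002 illiquidity ratio)."""
    return np.mean(np.abs(np.asarray(returns, float)) /
                   np.asarray(dollar_volume, float))

# Hypothetical daily data for one stock:
open_px    = np.array([100.0, 101.0, 100.5, 102.0])
close_px   = np.array([101.0, 100.5, 102.0, 101.5])
prev_close = np.array([ 99.5, 101.0, 100.5, 102.0])
volume_usd = np.array([2.0e6, 1.5e6, 3.0e6, 2.5e6])

# Original measure: close-to-close returns include the overnight move,
# mismatched with volume from regular trading hours.
cc = close_px / prev_close - 1
# Modified measure: open-to-close returns align the return with the
# hours in which the volume was actually traded.
oc = close_px / open_px - 1

illiq_cc = amihud(cc, volume_usd)
illiq_oc = amihud(oc, volume_usd)
```

Only daily open, close, and volume are needed, which is why the modification remains usable when intraday data are unavailable.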

    Exploring Parameter Constraints on Quintessential Dark Energy: the Inverse Power Law Model

    We report on the results of a Markov Chain Monte Carlo (MCMC) analysis of an inverse power law (IPL) quintessence model using the Dark Energy Task Force (DETF) simulated data sets as a representation of future dark energy experiments. We generate simulated data sets for a Lambda-CDM background cosmology as well as a case where the dark energy is provided by a specific IPL fiducial model and present our results in the form of likelihood contours generated by these two background cosmologies. We find that the relative constraining power of the various DETF data sets on the IPL model parameters is broadly equivalent to the DETF results for the w_{0}-w_{a} parameterization of dark energy. Finally, we gauge the power of DETF "Stage 4" data by demonstrating a specific IPL model which, if realized in the universe, would allow Stage 4 data to exclude a cosmological constant at better than the 3-sigma level.
    Comment: 15 pages, including 13 figures
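The inverse power law potential itself is one line, V(phi) = M**(4+alpha) * phi**(-alpha); a sketch with illustrative values for the mass scale M and exponent alpha (units are arbitrary here; in the analysis these would be among the parameters the MCMC constrains):

```python
def ipl_potential(phi, M=1.0, alpha=2.0):
    """Inverse power law quintessence potential V(phi) = M**(4+alpha) * phi**(-alpha).

    M and alpha are illustrative placeholder values; V decreases
    monotonically in phi, so the field rolls toward larger phi and the
    potential slowly flattens, mimicking a decaying effective
    cosmological constant.
    """
    return M ** (4 + alpha) * phi ** (-alpha)
```
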

    Constraints on small-scale cosmological perturbations from gamma-ray searches for dark matter

    Events like inflation or phase transitions can produce large density perturbations on very small scales in the early Universe. Probes of small scales are therefore useful for e.g. discriminating between inflationary models. Until recently, the only such constraint came from non-observation of primordial black holes (PBHs), associated with the largest perturbations. Moderate-amplitude perturbations can collapse shortly after matter-radiation equality to form ultracompact minihalos (UCMHs) of dark matter, in far greater abundance than PBHs. If dark matter self-annihilates, UCMHs become excellent targets for indirect detection. Here we discuss the gamma-ray fluxes expected from UCMHs, the prospects of observing them with gamma-ray telescopes, and limits upon the primordial power spectrum derived from their non-observation by the Fermi Large Area Space Telescope.
    Comment: 4 pages, 3 figures. To appear in J Phys Conf Series (Proceedings of TAUP 2011, Munich)

    A Qualitative Study of Anticipated Decision Making around Type 2 Diabetes Genetic Testing: the Role of Scientifically Concordant and Discordant Expectations

    Type 2 diabetes mellitus (T2DM) genetic testing is undergoing clinical trials to measure the efficacy of genetic counseling for behavior-based risk reduction. The expectations patients bring to the testing process may play an important role in individual outcomes. We conducted a qualitative exploration of anticipated decision-making and expectations around T2DM genetic testing. Semi-structured interviews were completed with Mexican Americans (n = 34), non-Hispanic Black Americans (n = 39), and non-Hispanic White Americans (n = 39) at risk for T2DM. Transcripts were analyzed for themes. Most participants would accept T2DM genetic testing in order to motivate risk-reducing behaviors or apprise family members of their risk. Participants who would decline testing wished to avoid emotional distress or believed the test would not reveal new risk information. Non-Hispanic Whites and those with college education declined genetic testing more often than other groups. Those without college education were more likely to have testing expectations that were discordant with current science, such as conflating genetic testing with common 'blood tests.' Understanding expectations and decision-making factors around T2DM genetic testing will better prepare healthcare professionals to counsel their patients. This may lead to a higher efficacy of T2DM genetic testing and counseling.
    Peer Reviewed
    https://deepblue.lib.umich.edu/bitstream/2027.42/147076/1/jgc40469.pd

    Continuous monitoring of an intentionally-manufactured crack using an automated welding and in-process inspection system

    This paper uses automated weld deposition coupled with real-time robotic Non-Destructive Evaluation (NDE). For performance verification of the in-process inspection system, an intentionally embedded defect, a tungsten rod, is introduced into the multi-pass weld. A partially-filled groove (staircase) sample is also manufactured and ultrasonically tested to calibrate the real-time inspection implemented on all seven layers of the weld, which are deposited progressively. The tungsten rod is successfully detected in the real-time NDE of the deposited position. The same robotic inspection system was then used to continuously monitor an intentionally-manufactured crack for 20 h. The crack initiated 22 min after the weld ended and grew quickly within the next 1.5 h. The crack growth stopped after approximately 2 h, and no considerable change in the reflection signal was detected for the next 18 h of monitoring.

    SUSY Breaking and Moduli Stabilization from Fluxes in Gauged 6D Supergravity

    We construct the 4D N=1 supergravity which describes the low-energy limit of 6D supergravity compactified on a sphere with a monopole background a la Salam and Sezgin. This provides a simple setting sharing the main properties of realistic string compactifications such as flat 4D spacetime, chiral fermions and N=1 supersymmetry as well as Fayet-Iliopoulos terms induced by the Green-Schwarz mechanism. The matter content of the resulting theory is a supersymmetric SO(3)xU(1) gauge model with two chiral multiplets, S and T. The expectation value of T is fixed by the classical potential, and S describes a flat direction to all orders in perturbation theory. We consider possible perturbative corrections to the Kahler potential in inverse powers of Re S and Re T, and find that under certain circumstances, and when taken together with low-energy gaugino condensation, these can lift the degeneracy of the flat direction for Re S. The resulting vacuum breaks supersymmetry at moderately low energies in comparison with the compactification scale, with positive cosmological constant. It is argued that the 6D model might itself be obtained from string compactifications, giving rise to realistic string compactifications on non-Ricci-flat manifolds. Possible phenomenological and cosmological applications are briefly discussed.
    Comment: 32 pages, 2 figures. Uses JHEP3.cls. References fixed and updated, some minor typos fixed. Corrected minor error concerning Kaluza-Klein scales. Results remain unchanged

    Statistical coverage for supersymmetric parameter estimation: a case study with direct detection of dark matter

    Models of weak-scale supersymmetry offer viable dark matter (DM) candidates. Their parameter spaces are however rather large and complex, such that pinning down the actual parameter values from experimental data can depend strongly on the employed statistical framework and scanning algorithm. In frequentist parameter estimation, a central requirement for properly constructed confidence intervals is that they cover true parameter values, preferably at exactly the stated confidence level when experiments are repeated infinitely many times. Since most widely-used scanning techniques are optimised for Bayesian statistics, one needs to assess their abilities in providing correct confidence intervals in terms of the statistical coverage. Here we investigate this for the Constrained Minimal Supersymmetric Standard Model (CMSSM) when only constrained by data from direct searches for dark matter. We construct confidence intervals from one-dimensional profile likelihoods and study the coverage by generating several pseudo-experiments for a few benchmark sets of pseudo-true parameters. We use nested sampling to scan the parameter space and evaluate the coverage for the benchmarks when either flat or logarithmic priors are imposed on gaugino and scalar mass parameters. The sampling algorithm has been used in the configuration usually adopted for exploration of the Bayesian posterior. We observe both under- and over-coverage, which in some cases vary quite dramatically when benchmarks or priors are modified. We show how most of the variation can be explained as the impact of explicit priors as well as sampling effects, where the latter are indirectly imposed by physicality conditions. 
    For comparison, we also evaluate the coverage for Bayesian credible intervals, and observe significant under-coverage in those cases.
    Comment: 30 pages, 5 figures; v2 includes major updates in response to referee's comments; extra scans and tables added, discussion expanded, typos corrected; matches published version
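The coverage check itself is simple to state: generate many pseudo-experiments at fixed pseudo-true parameters, construct the interval in each, and count how often it contains the truth. A toy stand-in for the paper's profile-likelihood machinery, using a Gaussian mean with known variance so the nominal 95% interval has exact coverage:

```python
import random
import statistics

def coverage(mu_true=0.0, sigma=1.0, n=50, trials=2000, z=1.96, seed=1):
    """Empirical coverage of a nominal 95% interval for a Gaussian mean.

    Repeats pseudo-experiments at fixed pseudo-true parameters and counts
    how often the interval contains the truth; a toy illustration, not the
    CMSSM analysis.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.gauss(mu_true, sigma) for _ in range(n)]
        mean = statistics.fmean(xs)
        half = z * sigma / n ** 0.5   # known-sigma interval
        hits += (mean - half <= mu_true <= mean + half)
    return hits / trials

emp = coverage()   # empirical coverage should land near the nominal 0.95
```

In this idealized case the empirical rate matches the nominal level up to binomial noise; the under- and over-coverage reported in the paper arise precisely because the CMSSM intervals are built from sampled profile likelihoods under priors and physicality cuts, not from an exact pivotal quantity like this one.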