
    Analysis of Passive Charge Balancing for Safe Current-Mode Neural Stimulation

    Charge balancing has often been considered one of the most critical requirements for neural stimulation circuits. Over the years, several solutions have been proposed to precisely balance the charge transferred to the tissue during the anodic and cathodic phases. Elaborate dynamic current sources/sinks with improved matching, as well as feedback loops, have been proposed, at a penalty in circuit complexity, area, or power consumption. Here we review the dominant assumptions in safe stimulation protocols and derive mathematical models to determine the effectiveness of passive charge balancing in a typical application scenario.
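    As a rough illustration of the quantities such models track, the following sketch (all currents, component values, and timings are assumed for illustration, not taken from the paper) estimates the net charge left by a mismatched biphasic pulse and its passive decay through an assumed electrode-tissue RC path when the electrode is shorted between pulses:

        # Minimal sketch: residual charge from a mismatched biphasic pulse and its
        # passive decay through an assumed electrode-tissue RC path. All values are
        # illustrative assumptions, not taken from the reviewed paper.
        import math

        i_cathodic = 100e-6      # cathodic current, A (assumed)
        i_anodic   = 99e-6       # anodic current, A (1% mismatch, assumed)
        t_phase    = 200e-6      # phase duration, s (assumed)

        q_residual = (i_cathodic - i_anodic) * t_phase   # net charge per pulse, C
        print(f"residual charge per pulse: {q_residual*1e9:.2f} nC")

        # Passive balancing: shorting the electrode lets the residual voltage on the
        # double-layer capacitance decay exponentially with tau = R_s * C_dl.
        C_dl    = 100e-9         # double-layer capacitance, F (assumed)
        R_s     = 10e3           # series/tissue resistance, ohm (assumed)
        t_short = 1e-3           # shorting window before the next pulse, s (assumed)

        v_residual = q_residual / C_dl
        v_after = v_residual * math.exp(-t_short / (R_s * C_dl))
        print(f"residual voltage: {v_residual*1e3:.1f} mV -> {v_after*1e3:.3f} mV after shorting")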

    Fundamental Frequencies in the Schwarzschild Spacetime

    We consider the Keplerian, radial, and vertical fundamental frequencies in the Schwarzschild spacetime to study the so-called kilohertz quasi-periodic oscillations from low-mass X-ray binary systems. We show that, within the Relativistic Precession Model, interpreting the observed kilohertz quasi-periodic oscillations in terms of the fundamental frequencies of test particles in the Schwarzschild spacetime allows one to infer the total mass $M$ of the central object, the internal radius $R_{in}$ and external radius $R_{ex}$ of the accretion disk, and the innermost stable circular orbit $r_{ISCO}$ for test particles in a low-mass X-ray binary system. By constructing the relation between the upper and lower frequencies and exploiting the quasi-periodic oscillation data of the Z and Atoll sources, we perform a non-linear model fit and estimate the mass of the central object. Knowing the mass, we calculate the internal radius $R_{in}$ and external radius $R_{ex}$ of the accretion disk and show that they are larger than $r_{ISCO}$, as expected. Comment: 7 pages, 6 figures, 1 table
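    The fundamental frequencies referred to above have standard closed forms in the Schwarzschild spacetime, and the Relativistic Precession Model identifies the upper QPO frequency with the orbital frequency and the lower one with the periastron-precession frequency. A minimal sketch evaluating them (the central mass and orbital radii are assumed example values, not the paper's fitted results):

        # Standard Schwarzschild fundamental frequencies used by the Relativistic
        # Precession Model (RPM): nu_phi = (1/2pi) sqrt(GM/r^3),
        # nu_r = nu_phi * sqrt(1 - 6GM/(c^2 r)), nu_theta = nu_phi.
        # The mass and radii below are assumed examples, not fitted results.
        import math

        G = 6.674e-11        # m^3 kg^-1 s^-2
        c = 2.998e8          # m/s
        M_sun = 1.989e30     # kg

        M = 2.0 * M_sun      # assumed central mass
        r_isco = 6 * G * M / c**2

        def rpm_frequencies(r, M):
            nu_phi = math.sqrt(G * M / r**3) / (2 * math.pi)        # Keplerian (upper QPO)
            nu_r = nu_phi * math.sqrt(1 - 6 * G * M / (c**2 * r))   # radial epicyclic
            return nu_phi, nu_phi - nu_r                            # (upper, lower) in Hz

        for r in (1.2 * r_isco, 1.5 * r_isco, 2.0 * r_isco):
            upper, lower = rpm_frequencies(r, M)
            print(f"r = {r/r_isco:.1f} r_ISCO: upper ~ {upper:.0f} Hz, lower ~ {lower:.0f} Hz")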

    Resonant Electro-Optic Frequency Comb

    High-speed optical telecommunication is enabled by wavelength-division multiplexing, whereby hundreds of individually stabilized lasers encode information within a single-mode optical fiber. In the quest for larger bandwidth, the optical power sent into the fiber is limited by optical non-linearities within the fiber, and the energy consumption of the light sources starts to become a significant cost factor. Optical frequency combs have been suggested as a remedy because they generate multiple laser lines within a monolithic device, but their current stability and coherence let them operate only in small parameter ranges. Here we show that a broadband frequency comb realized through the electro-optic effect within a high-quality whispering-gallery-mode resonator can operate at low microwave and optical powers. Contrary to the usual third-order Kerr non-linear optical frequency combs, we rely on the second-order non-linear effect, which is much more efficient. Our approach mixes a fixed microwave signal with an optical pump signal to generate a coherent frequency comb with a precisely determined carrier separation. The resonant enhancement enables us to operate with microwave powers three orders of magnitude smaller than in commercially available devices. We expect implementation into the next generation of long-distance telecommunication, which relies on coherent emission and detection schemes, to allow operation with higher optical powers and at reduced cost.
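    As a small illustration of the carrier separation mentioned above, the sketch below lists the comb-line frequencies produced when a pump is modulated at a fixed microwave frequency; the pump wavelength, drive frequency, and line count are assumed values, not those of the device described:

        # Sketch of the comb-line spacing described in the abstract: electro-optic
        # modulation at a fixed microwave frequency produces sidebands spaced by that
        # frequency around the optical pump. Values are assumed for illustration.
        c = 2.998e8                    # m/s
        pump_wavelength = 1550e-9      # m, assumed telecom C-band pump
        f_pump = c / pump_wavelength   # ~193.4 THz
        f_microwave = 10e9             # Hz, assumed drive; sets the carrier separation

        n_lines = 5                    # sidebands on each side, assumed
        orders = range(-n_lines, n_lines + 1)
        comb = [f_pump + n * f_microwave for n in orders]
        for n, f in zip(orders, comb):
            print(f"line {n:+d}: {f/1e12:.6f} THz")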

    When is the Haar measure a Pietsch measure for nonlinear mappings?

    We show that, as in the linear case, the normalized Haar measure on a compact topological group $G$ is a Pietsch measure for nonlinear summing mappings on closed translation-invariant subspaces of $C(G)$. This answers a question posed to the authors by J. Diestel. We also show that our result applies to several well-studied classes of nonlinear summing mappings. In the final section some problems are proposed.
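    For orientation, the inequality such a measure must satisfy in the classical linear case (the Pietsch domination theorem for absolutely $p$-summing operators) reads: there exist a constant $C$ and a regular probability measure $\mu$ on $G$ such that $\|u(f)\| \le C \left(\int_G |f|^p \, d\mu\right)^{1/p}$ for all $f$ in the domain of $u$. The result above asserts that, for nonlinear summing mappings on closed translation-invariant subspaces of $C(G)$, $\mu$ can be taken to be the normalized Haar measure; the linear form is quoted here only as background.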

    Activation-induced deoxycytidine deaminase (AID) co-transcriptional scanning at single-molecule resolution

    Activation-induced deoxycytidine deaminase (AID) generates antibody diversity in B cells by initiating somatic hypermutation (SHM) and class-switch recombination (CSR) during transcription of immunoglobulin variable (IgV) and switch region (IgS) DNA. Using single-molecule FRET, we show that AID binds to transcribed dsDNA and translocates unidirectionally in concert with RNA polymerase (RNAP) on moving transcription bubbles, while increasing the fraction of stalled bubbles. AID scans randomly when constrained in an 8 nt model bubble. When unconstrained on single-stranded (ss) DNA, AID moves in random bidirectional short slides/hops over the entire molecule while remaining bound for ~5 min. Our analysis distinguishes dynamic scanning from static ssDNA creasing. That AID alone can track along with RNAP during transcription and scan within stalled transcription bubbles suggests a mechanism by which AID can initiate SHM and CSR when properly regulated, yet when unregulated can access non-Ig genes and cause cancer.

    A formal analysis of why heuristic functions work

    Many optimization problems in computer science have been proven to be NP-hard, and it is unlikely that polynomial-time algorithms that solve these problems exist unless P=NP. Instead, they are solved using heuristic algorithms, which provide a sub-optimal solution that is, hopefully, arbitrarily close to the optimal one. Such problems are found in a wide range of applications, including artificial intelligence, game theory, graph partitioning, database query optimization, etc. Consider a heuristic algorithm, A. Suppose that A could invoke one of two possible heuristic functions. The question of determining which heuristic function is superior has typically demanded a yes/no answer, one that is often substantiated by empirical evidence. In this paper, using Pattern Classification Techniques (PCT), we propose a formal, rigorous theoretical model that provides a stochastic answer to this problem. We prove that, given a heuristic algorithm A that could utilize either of two heuristic functions H1 or H2 to find the solution to a particular problem, if the accuracy of evaluating the cost of the optimal solution using H1 is greater than the accuracy of evaluating it using H2, then H1 has a higher probability than H2 of leading to the optimal solution. This previously unproven conjecture has been the basis for designing numerous algorithms, such as the A* algorithm and its variants. Apart from formally proving the result, we also address the corresponding database query optimization problem, which has been open for at least two decades. To validate our proofs, we report empirical results on database query optimization techniques involving a few well-known histogram estimation methods.
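    The conjecture formalized here is the usual informal justification for informed search such as A*. A minimal runnable sketch (the toy graph and the two heuristics H1/H2 are assumed examples, not taken from the paper) showing that a more accurate admissible heuristic reaches the same optimal solution while expanding fewer nodes:

        # Sketch: A* parameterized by a heuristic, illustrating the claim that a more
        # accurate estimate of the remaining cost guides search to the optimal
        # solution more directly. Graph and heuristics are small assumed examples.
        import heapq

        def a_star(graph, start, goal, h):
            """graph: dict node -> list of (neighbor, edge_cost). Returns (cost, expansions)."""
            frontier = [(h(start), 0, start)]
            best_g = {start: 0}
            expansions = 0
            while frontier:
                f, g, node = heapq.heappop(frontier)
                if node == goal:
                    return g, expansions
                if g > best_g.get(node, float("inf")):
                    continue  # stale queue entry
                expansions += 1
                for nbr, cost in graph.get(node, []):
                    ng = g + cost
                    if ng < best_g.get(nbr, float("inf")):
                        best_g[nbr] = ng
                        heapq.heappush(frontier, (ng + h(nbr), ng, nbr))
            return float("inf"), expansions

        # Assumed toy graph: a short path A-B-C-D plus a costly detour A-E-D.
        graph = {"A": [("B", 1), ("E", 1)], "B": [("C", 1)], "C": [("D", 1)], "E": [("D", 10)]}
        true_dist = {"A": 3, "B": 2, "C": 1, "D": 0, "E": 10}

        h1_accurate = lambda n: true_dist[n]   # H1: exact remaining cost
        h2_blind = lambda n: 0                 # H2: uninformative (Dijkstra-like)

        print(a_star(graph, "A", "D", h1_accurate))  # optimal cost 3, fewer expansions
        print(a_star(graph, "A", "D", h2_blind))     # same optimal cost, more expansions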