
    A Novel Low Noise High Gain CMOS Instrumentation Amplifier for Biomedical Applications

    This work describes a novel technique for designing a low-noise, high-gain CMOS instrumentation amplifier for biomedical applications. A three op-amp instrumentation amplifier has been designed by employing two simple op-amps at the two inputs and a folded-cascode op-amp at the output; both the input and output stage op-amps are two-stage designs. Most instrumentation amplifiers reported in the literature use the same type of op-amp at the input and output stages. By using a two-stage folded-cascode op-amp at the output, we achieve a significant improvement in gain and CMRR. Transistor sizing plays a vital role in achieving high gain and CMRR: to meet the target gain, common-mode rejection ratio, and other performance metrics, the main design criteria were the selection of the most appropriate op-amp circuit topologies and optimum transistor sizing. The instrumentation amplifier is simulated using the Cadence Spectre tool, and the layout is designed in the Cadence layout editor in a 0.18 µm CMOS technology. Each input op-amp provides a gain of 45 dB and a CMRR of 72 dB; the output-stage folded-cascode amplifier provides a gain of 82 dB and a CMRR of 92 dB. The design achieves an overall gain of 67 dB and a CMRR of 92 dB, and consumes only 263 µW of power, making it suitable for biomedical signal processing applications.
    DOI: http://dx.doi.org/10.11591/ijece.v3i4.317
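
    For context, the differential gain of the classic three op-amp instrumentation amplifier topology is set by its resistor network; the textbook expression is sketched below, using the standard (assumed, not paper-specific) resistor naming: gain-setting resistor R_g, first-stage feedback resistors R_1, and difference-stage resistors R_2, R_3.

        A_d = \left(1 + \frac{2R_1}{R_g}\right)\frac{R_3}{R_2}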

    Stiffness after Primary Total Knee Arthroplasty

    Total knee arthroplasty remains the definitive treatment for end-stage osteoarthritis of the knee. Despite being a very successful intervention in terms of relieving pain and restoring a patient's function, it is not without complications. Post-operative stiffness after total knee arthroplasty is one such complication, and it can be puzzling for physicians and debilitating for patients. While the etiology of stiffness is multifactorial, the treatment options are essentially limited to manipulation under anesthesia, removal of adhesions, and revision total knee arthroplasty. With patient outcomes directly related to relief of pain and post-operative range of motion, it is paramount that surgeons do all that is necessary to minimize the risk of post-operative stiffness.

    Securing CNN Model and Biometric Template using Blockchain

    Blockchain has emerged as a leading technology for ensuring security in a distributed framework. Recently, it has been shown that blockchain can be used to convert the traditional building blocks of any deep learning model into a secure system. In this research, we model a trained biometric recognition system in an architecture that leverages blockchain technology to provide fault-tolerant access in a distributed environment. The advantage of the proposed approach is that tampering with any one component alerts the whole system and makes it easy to identify any possible alteration. Experimentally, with different biometric modalities, we show that the proposed approach provides security for both the deep learning model and the biometric template.
    Comment: Published in IEEE BTAS 2019
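
    A minimal hash-chain sketch in Python of the tamper-detection idea described above: each system component (e.g., serialized CNN weights, a biometric template) is hashed into a block that also commits to the previous block's hash, so altering any one component invalidates the chain. This is an illustrative simplification with assumed names, not the paper's implementation.

        import hashlib

        def block_hash(payload: bytes, prev_hash: str) -> str:
            # Each block commits to its payload and to the previous block.
            return hashlib.sha256(prev_hash.encode() + payload).hexdigest()

        def build_chain(components):
            # components: list of (name, bytes) pairs, e.g. serialized weights.
            chain, prev = [], "genesis"
            for name, payload in components:
                prev = block_hash(payload, prev)
                chain.append((name, prev))
            return chain

        def verify(components, chain):
            # Recomputing the chain detects tampering in any component.
            return chain == build_chain(components)

        components = [("cnn_weights", b"...weights..."), ("template", b"...features...")]
        chain = build_chain(components)
        assert verify(components, chain)
        # Altering any component breaks verification:
        assert not verify([("cnn_weights", b"...weights..."), ("template", b"...altered...")], chain)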

    Transformers Learn Shortcuts to Automata

    Algorithmic reasoning requires capabilities which are most naturally understood through recurrent models of computation, like the Turing machine. However, Transformer models, while lacking recurrence, are able to perform such reasoning using far fewer layers than the number of reasoning steps. This raises the question: what solutions are learned by these shallow and non-recurrent models? We find that a low-depth Transformer can represent the computations of any finite-state automaton (thus, any bounded-memory algorithm), by hierarchically reparameterizing its recurrent dynamics. Our theoretical results characterize shortcut solutions, whereby a Transformer with o(T) layers can exactly replicate the computation of an automaton on an input sequence of length T. We find that polynomial-sized O(log T)-depth solutions always exist; furthermore, O(1)-depth simulators are surprisingly common, and can be understood using tools from Krohn-Rhodes theory and circuit complexity. Empirically, we perform synthetic experiments by training Transformers to simulate a wide variety of automata, and show that shortcut solutions can be learned via standard training. We further investigate the brittleness of these solutions and propose potential mitigations.
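
    A minimal Python sketch (plain code, not a Transformer) of the shortcut idea above, resting on the standard observation that per-symbol transition maps compose associatively: an automaton's run over a length-T input can therefore be collapsed into a balanced, O(log T)-depth tree of compositions rather than T sequential steps. The parity automaton here is an illustrative choice, not one from the paper.

        from functools import reduce

        # Example automaton: parity of 1s; states {0, 1}, alphabet {0, 1}.
        def step(state, symbol):
            return state ^ symbol

        def transition_map(symbol):
            # Tabulate the effect of one input symbol as a map on the state set.
            return tuple(step(s, symbol) for s in (0, 1))

        def compose(f, g):
            # Apply f, then g. Composition is associative, which is what
            # lets the reduction below be balanced into a log-depth tree.
            return tuple(g[f[s]] for s in (0, 1))

        def run_shortcut(symbols):
            maps = [transition_map(x) for x in symbols]
            while len(maps) > 1:
                # One "layer": combine adjacent pairs (parallelizable).
                maps = [compose(maps[i], maps[i + 1]) if i + 1 < len(maps) else maps[i]
                        for i in range(0, len(maps), 2)]
            return maps[0][0]  # apply the composed map to start state 0

        seq = [1, 0, 1, 1, 0, 1]
        assert run_shortcut(seq) == reduce(step, seq, 0)  # matches the sequential run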

    Exposing Attention Glitches with Flip-Flop Language Modeling

    Why do large language models sometimes output factual inaccuracies and exhibit erroneous reasoning? The brittleness of these models, particularly when executing long chains of reasoning, currently seems to be an inevitable price to pay for their advanced capabilities of coherently synthesizing knowledge, pragmatics, and abstract thought. Towards making sense of this fundamentally unsolved problem, this work identifies and analyzes the phenomenon of attention glitches, in which the Transformer architecture's inductive biases intermittently fail to capture robust reasoning. To isolate the issue, we introduce flip-flop language modeling (FFLM), a parametric family of synthetic benchmarks designed to probe the extrapolative behavior of neural language models. This simple generative task requires a model to copy binary symbols over long-range dependencies, ignoring the tokens in between. We find that Transformer FFLMs suffer from a long tail of sporadic reasoning errors, some of which we can eliminate using various regularization techniques. Our preliminary mechanistic analyses show why the remaining errors may be very difficult to diagnose and resolve. We hypothesize that attention glitches account for (some of) the closed-domain hallucinations in natural LLMs.
    Comment: v2: NeurIPS 2023 camera-ready + data release
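
    A minimal Python sketch of a flip-flop language modeling (FFLM) data generator, following the description above: at each "read", the model must reproduce the binary symbol of the most recent "write", while ignoring the tokens in between. The instruction vocabulary and sampling probabilities are assumptions for illustration, not the paper's parameters.

        import random

        def make_fflm_sequence(length, p_ignore=0.8, p_read=0.1, rng=random):
            # Each position is an (instruction, bit) pair; "ignore" tokens
            # are distractors that stretch the write-to-read dependency.
            seq = [("write", rng.randint(0, 1))]  # always start with a write
            last_written = seq[0][1]
            for _ in range(length - 1):
                r = rng.random()
                if r < p_ignore:
                    seq.append(("ignore", rng.randint(0, 1)))
                elif r < p_ignore + p_read:
                    seq.append(("read", last_written))  # ground-truth target bit
                else:
                    bit = rng.randint(0, 1)
                    seq.append(("write", bit))
                    last_written = bit
            return seq

        print(make_fflm_sequence(16))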