155 research outputs found
Small business consulting: making it work
Role of signal averaging of the surface QRS complex in selecting patients with nonsustained ventricular tachycardia and high grade ventricular arrhythmias for programmed ventricular stimulation
Signal averaging of the surface QRS complex was performed before programmed ventricular stimulation in 53 individuals with high grade ventricular arrhythmias or nonsustained ventricular tachycardia, or both. An abnormal signal-averaged electrocardiogram (ECG) was recorded in 22 patients and was associated with inducible ventricular tachycardia in 12 (55%) of the 22. In contrast, a normal signal-averaged ECG was associated with inducible tachycardia in only 1 (3%) of 31 individuals (p < 0.005). The group with inducible tachycardia had a longer duration of the signal-averaged QRS complex (124 ± 19 versus 96 ± 26 ms) and of low amplitude signals (44 ± 13 versus 29 ± 11 ms) (p < 0.005). In addition, the root mean square voltage of the terminal 40 ms was lower in this group (20 ± 14 versus 48 ± 34 µV, p < 0.005).

Twenty-seven of the 53 subjects had a prior myocardial infarction; 17 (63%) of the 27 had an abnormal signal-averaged ECG, and ventricular tachycardia was inducible in 10 (59%) of the 17. A normal signal-averaged ECG was recorded in 10 of the 27 patients, and only 1 (10%) of these 10 had inducible tachycardia. An abnormal signal-averaged ECG had a 91% sensitivity and a 56% specificity with respect to subsequent induction of tachycardia.

During long-term follow-up, 2 (15%) of the 13 patients with inducible ventricular tachycardia who were treated with electrophysiologically guided antiarrhythmic therapy died suddenly; the remaining 11 patients (85%) are alive 15 ± 10 months after electrophysiologic testing. Both patients who died had an abnormal signal-averaged ECG. In contrast, only 2 (5%) of the 40 patients with no inducible tachycardia, both with a normal signal-averaged ECG, have had an arrhythmic event; the other 38 patients have remained free of sustained ventricular arrhythmia for a follow-up period of 17 ± 9 months.

In conclusion: 1) Signal averaging of the surface QRS complex is useful in identifying patients with nonsustained ventricular tachycardia or high grade ventricular arrhythmias, or both, who will have inducible ventricular tachycardia on programmed ventricular stimulation. 2) Inducibility of arrhythmia is unlikely in individuals who have a normal signal-averaged ECG despite the presence of complex ventricular arrhythmia. 3) The occurrence of spontaneous sustained ventricular tachyarrhythmias is low in patients with a prior myocardial infarction and without inducible ventricular tachycardia who have nonsustained ventricular tachycardia or complex ventricular arrhythmias and a normal signal-averaged ECG. 4) Signal-averaged electrocardiography may be useful in detecting low risk groups of patients with complex ventricular arrhythmias who do not require electrophysiologic testing or antiarrhythmic therapy.
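The reported 91% sensitivity and 56% specificity are consistent with the counts given for the prior-infarction subgroup. As a hedged illustration (assuming that is the subgroup the figures refer to), the following Python sketch reproduces them from the implied 2×2 table:

```python
# Sensitivity/specificity of an abnormal signal-averaged ECG (SAECG) for
# predicting inducible ventricular tachycardia (VT), using the counts given
# for the prior-myocardial-infarction subgroup (an assumption about which
# subgroup the reported 91%/56% figures refer to).

tp = 10       # abnormal SAECG, VT inducible
fn = 1        # normal SAECG, VT inducible
fp = 17 - 10  # abnormal SAECG, VT not inducible
tn = 10 - 1   # normal SAECG, VT not inducible

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

print(f"sensitivity = {sensitivity:.0%}")  # ~91%
print(f"specificity = {specificity:.0%}")  # ~56%
```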
Will Brexit raise the cost of living?
This paper considers two aspects of this question. First, Brexit has already induced a devaluation of sterling of around 14 per cent since June 2016, which has started to work through to consumer prices: between June 2016 and July 2017 consumer prices increased by around 2.5 per cent. Second, while it is not government policy, nor the desire of the UK public, that the outcome of negotiations is an ‘MFN Brexit’, this remains a distinct possibility. Thus we ask how the imposition of tariffs on imports from the EU will work through into consumer prices. Making very conservative assumptions, we conclude that an ‘MFN Brexit’ will increase the average cost of living by around 1 per cent and increase it for 8 per cent of households by 2 per cent or more. We present results for different groups of households according to their employment and structural characteristics and show that the impact will generally be largest on unemployed, single-parent and pensioner households.
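As a rough illustration of the mechanism described above (not the paper's model or data), the tariff-induced rise in the cost of living can be approximated as an expenditure-weighted sum of EU import shares times MFN tariff rates times pass-through. All figures in the sketch below are invented placeholders:

```python
# Illustrative sketch only: cost-of-living impact of 'MFN Brexit' tariffs,
# approximated as sum over product groups of
#   expenditure share x EU import share x MFN tariff x pass-through.
# The product groups and numbers are placeholders, not the paper's data.

goods = {
    # name: (expenditure share, EU import share of spending, MFN tariff, pass-through)
    "food":     (0.10, 0.30, 0.15, 1.0),
    "clothing": (0.05, 0.20, 0.10, 1.0),
    "vehicles": (0.07, 0.40, 0.10, 1.0),
    "other":    (0.78, 0.05, 0.02, 1.0),
}

impact = sum(w * m * t * rho for w, m, t, rho in goods.values())
print(f"approximate rise in the cost of living: {impact:.1%}")  # on the order of 1%
```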
The role of silent ischemia, the arrhythmic substrate and the short-long sequence in the genesis of sudden cardiac death
To study the role of silent ischemia and the arrhythmic substrate in the genesis of sudden cardiac death, 67 patients were studied (mean age 62 ± 12 years). Of these, 14 patients (Group I) had an in-hospital episode of ventricular tachycardia or fibrillation while wearing a 24 h Holter ambulatory electrocardiographic (ECG) monitor, 33 (Group II) had a documented episode of sustained ventricular tachycardia or fibrillation, or both, and 20 (Group III) had angina pectoris but no ventricular tachycardia or fibrillation. Eight Group I survivors underwent programmed electrical stimulation or ECG signal averaging, or both. All Group II patients underwent 24 h Holter monitoring and ECG signal averaging to detect late potentials before programmed electrical stimulation. Group III patients underwent both 24 h Holter recording and coronary angiography. The 24 h ECG tapes were analyzed for ST segment changes, prematurity index and characteristics of ventricular premature depolarizations. Any ST depression ≥1 mm for >30 s was considered to be a reflection of silent ischemia, and the induction of ventricular tachycardia or fibrillation by programmed electrical stimulation or the presence of late potentials, or both, was considered to be a reflection of the arrhythmic substrate.

Silent ischemia preceded ventricular tachycardia in only 2 (14%) of the 14 Group I patients. The prematurity index was <1 in only 18% of ventricular tachycardia episodes. However, 14 (64%) of 22 episodes of ventricular tachycardia in 9 (64%) of the 14 patients were initiated by a ventricular premature depolarization preceded by a short-long sequence (sinus beat-ventricular premature depolarization-sinus beat) with a ratio of 0.5 ± 0.1. Six (75%) of eight in-hospital survivors of ventricular tachycardia or fibrillation (Group I) had an arrhythmic substrate. A significantly (p < 0.0001) higher percentage of the 33 Group II patients had an arrhythmic substrate (93%) than had silent ischemic episodes (45%). Silent ischemia resulted in ventricular tachycardia in only 1 (7%) of 15 Group II patients. There was no significant difference between Groups II and III in the incidence of silent ischemia (45% versus 35%) or in the extent of coronary artery disease.

It is concluded that: 1) Silent ischemia was not a major determinant of ventricular tachycardia. 2) Although silent ischemia was common in survivors of ventricular tachycardia or fibrillation, its incidence was not significantly different from that in patients with angina pectoris and no sustained ventricular arrhythmias. 3) A high percentage of patients (75% to 93%) with ventricular tachycardia and fibrillation have an arrhythmic substrate. 4) In the absence of acute myocardial infarction, sudden cardiac death is frequently triggered by a ventricular premature depolarization with a preceding short-long cycle that likely produces dispersion of refractoriness in the arrhythmic substrate.
Duration learning for analysis of nanopore ionic current blockades
Background: Ionic current blockade signal processing, for use in nanopore detection, offers a promising new way to analyze single molecule properties, with potential implications for DNA sequencing. The alpha-Hemolysin transmembrane channel interacts with a translocating molecule in a nontrivial way, frequently evidenced by a complex ionic flow blockade pattern. Typically, recorded current blockade signals have several levels of blockade, with various durations, all obeying a fixed statistical profile for a given molecule. Hidden Markov Model (HMM) based duration learning experiments on artificial two-level Gaussian blockade signals helped us to identify a proper modeling framework. We then apply this framework to real multi-level DNA hairpin blockade signals.

Results: The identified upper-level blockade state is observed with durations that are geometrically distributed (consistent with a physical decay process for remaining in any given state). We show that a mixture of convolution chains of geometrically distributed states is better for representing multimodal, long-tailed duration phenomena. Based on learned HMM profiles we are able to classify 9 base-pair DNA hairpins with accuracy up to 99.5% on signals from same-day experiments.

Conclusion: We have demonstrated several implementations for de novo estimation of the duration distribution probability density function within an HMM framework and applied our model topology to the real data. The proposed design could be useful in molecular analysis based on nanopore current blockade signals.
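The duration-modelling point can be illustrated with a minimal sketch (parameters are illustrative, not fitted to nanopore data): a single HMM state with a self-transition yields geometric dwell times, whereas a chain of such states, i.e. a convolution of geometrics, yields a negative-binomial duration with a peak away from one sample, which is better suited to long-tailed, peaked blockade-level durations:

```python
# Minimal sketch: dwell-time distributions implied by HMM state topology.
# One state with self-transition probability `stay_prob` gives geometric
# durations; a chain of k such states gives a negative-binomial duration
# (the convolution of k geometrics). Parameters are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def chain_duration(k, stay_prob, n=100_000):
    """Total dwell time of a chain of k states, each exited with prob (1 - stay_prob)."""
    # Each state's dwell time is geometric; their sum is negative binomial.
    return rng.geometric(1.0 - stay_prob, size=(n, k)).sum(axis=1)

single = chain_duration(1, 0.95)   # geometric: mode at 1 sample, monotone decay
chained = chain_duration(5, 0.95)  # negative binomial: peaked, heavier body

for name, d in [("single geometric state", single), ("chain of 5 states", chained)]:
    print(f"{name}: mean = {d.mean():.1f} samples, mode ~ {np.bincount(d).argmax()}")
```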
The NTD Nanoscope: potential applications and implementations
Background: Nanopore transduction detection (NTD) offers prospects for a number of highly sensitive and discriminative applications, including: (i) single nucleotide polymorphism (SNP) detection; (ii) targeted DNA re-sequencing; (iii) protein isoform assaying; and (iv) biosensing via antibody- or aptamer-coupled molecules. Nanopore event transduction involves single-molecule biophysics, engineered information flows, and nanopore cheminformatics. The NTD Nanoscope has seen limited use in the scientific community, however, due to lack of information about potential applications and lack of availability of the device itself. Meta Logos Inc. is developing both pre-packaged device platforms and component-level (unassembled) kit platforms (the latter described here). In both cases a lipid bilayer workstation is first established; augmentations and operational protocols are then provided to produce a nanopore transduction detector. In this paper we provide an overview of NTD Nanoscope applications and implementations. The NTD Nanoscope Kit, in particular, is a component-level reproduction of the standard NTD device used in previous research papers.

Results: The NTD Nanoscope method is shown to functionalize a single nanopore with a channel current modulator that is designed to transduce events, such as binding to a specific target. To expedite set-up, calibration, and troubleshooting in new lab settings, the NTD Nanoscope Kit includes a set of test buffers and control molecules based on experiments described in previous NTD papers (the model systems briefly described in what follows). Server interfacing for advanced signal processing support is also briefly described.

Conclusions: SNP assaying, SNP discovery, DNA sequencing and RNA-seq methods are typically limited by the error rate of the enzymes involved, such as methods relying on the polymerase chain reaction (PCR). The NTD Nanoscope offers a means to obtain higher accuracy because it is a single-molecule method that does not inherently involve enzymes, using a functionalized nanopore instead.
Hidden Markov Model Variants and their Application
Markov statistical methods may make it possible to develop an unsupervised learning process that can automatically identify genomic structure in prokaryotes in a comprehensive way. This approach is based on mutual information, probabilistic measures, hidden Markov models, and other purely statistical inputs. It also provides a uniquely common ground for comparative prokaryotic genomics. The approach is an ongoing effort by its nature, as a multi-pass learning process in which each round is more informed than the last, thereby allowing a shift at each iteration to the more powerful methods available for supervised learning. It is envisaged that this "bootstrap" learning process will also be useful as a knowledge discovery tool. For such an ab initio prokaryotic gene-finder to work, however, it needs a mechanism to identify critical motif structures, such as those around the start of coding or the start of transcription (and then, hopefully, more). For eukaryotes, even with better start-of-coding identification, parsing of eukaryotic coding regions by the HMM is still limited by the HMM's single-gene assumption, as evidenced by its poor performance in alternatively spliced regions. To address these complications, an approach is described to expand the states in a eukaryotic gene-predictor HMM so that it operates with two layers of DNA parsing. This extension from the single-layer gene-prediction parse is indicated after preliminary analysis of the C. elegans alt-splice statistics. State profiles have made use of a novel hash-interpolating MM (hIMM) method. A new implementation for an HMM-with-Duration is also described, with far-reaching application to gene-structure identification and analysis of channel current blockade data.
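The hash-interpolating Markov model is only named above; as a hedged sketch of the general interpolated-MM idea (not the author's hIMM implementation), the next-base probability can mix order-k and order-(k-1) context estimates, trusting the higher order only where its context is well supported by counts:

```python
# Generic interpolated Markov model (IMM) sketch: mix an order-k estimate
# with an order-(k-1) back-off, weighting the higher order by how well its
# context is supported. This is the spirit of interpolated MMs in general,
# not the hIMM described in the thesis; thresholds and data are illustrative.

from collections import defaultdict

def train_counts(seq, k):
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(k, len(seq)):
        counts[seq[i - k:i]][seq[i]] += 1
    return counts

def imm_prob(base, context, counts_hi, counts_lo, k, min_support=10):
    hi_ctx = counts_hi.get(context[-k:], {})
    lo_ctx = counts_lo.get(context[-(k - 1):], {})
    lam = min(1.0, sum(hi_ctx.values()) / min_support)  # trust order k only if well supported
    def mle(ctx):                                        # add-one smoothed MLE over A, C, G, T
        total = sum(ctx.values()) + 4
        return (ctx.get(base, 0) + 1) / total
    return lam * mle(hi_ctx) + (1 - lam) * mle(lo_ctx)

seq = "ACGTACGTTACGGACGTACGTACGAACGT" * 50   # toy training sequence
k = 3
counts_hi, counts_lo = train_counts(seq, k), train_counts(seq, k - 1)
print(imm_prob("T", "ACG", counts_hi, counts_lo, k))
```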
Coronary angiographic morphology in myocardial infarction: A link between the pathogenesis of unstable angina and myocardial infarction
It has previously been shown that analysis of coronary morphology can separate unstable from stable angina. An eccentric stenosis with a narrow neck or irregular borders, or both, is very common in patients who present with acute unstable angina, whereas it is rare in patients with stable angina. To extend these observations to myocardial infarction, the coronary morphology of 41 patients with acute or recent infarction and nontotally occluded infarct vessels was studied. Overall, 27 (66%) of the 41 infarct vessels contained this eccentric narrowing, whereas only 2 (11%) of 18 noninfarct vessels with narrowing of 50 to less than 100% had this lesion (p < 0.001). In addition, a separate group of patients with acute myocardial infarction who underwent intracoronary streptokinase infusion was analyzed in a similar fashion. Fourteen (61%) of 23 infarct vessels contained this lesion after streptokinase infusion compared with 1 (9%) of 11 noninfarct vessels with narrowing of 50 to less than 100% (p < 0.01).

Therefore, an eccentric coronary stenosis with a narrow neck or irregular borders, or both, is the most common morphologic feature on angiography in both acute and recent infarction as well as unstable angina. This lesion probably represents either a disrupted atherosclerotic plaque or a partially occlusive or lysed thrombus, or both. The predominance of this morphology in both unstable angina and acute infarction suggests a possible link between the two conditions. Unstable angina and myocardial infarction may form a continuous spectrum, with the clinical outcome dependent on the subsequent change in coronary supply relative to myocardial demand.
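As an illustrative check of the first reported contrast (the original paper's statistical method may differ), a standard 2×2 exact test on 27/41 infarct versus 2/18 noninfarct vessels gives a p-value well below 0.001, assuming SciPy is available:

```python
# Illustrative check of the reported contrast (27 of 41 infarct vessels versus
# 2 of 18 noninfarct vessels with the eccentric lesion) using a standard 2x2
# exact test; the original analysis may have used a different statistic.

from scipy.stats import fisher_exact

table = [[27, 41 - 27],   # infarct vessels: lesion present / absent
         [2, 18 - 2]]     # noninfarct vessels: lesion present / absent

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.2g}")  # p well below 0.001
```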
DNA Molecule Classification Using Feature Primitives
BACKGROUND: We present a novel strategy for classification of DNA molecules using measurements from an alpha-Hemolysin channel detector. The proposed approach provides excellent classification performance for five different DNA hairpins that differ in only one base-pair. For multi-class DNA classification problems, practitioners usually adopt approaches that use decision trees consisting of binary classifiers. Finding the best tree topology requires exploring all possible tree topologies and is computationally prohibitive. We propose a computational framework based on feature primitives that eliminates the need for a decision tree of binary classifiers. In the first phase, we generate a pool of weak features from nanopore blockade current measurements using HMM analysis, principal component analysis and various wavelet filters. In the next phase, feature selection is performed using AdaBoost. AdaBoost provides an ensemble of weak learners of various types learned from the feature primitives.

RESULTS AND CONCLUSION: We show that our technique, despite its inherent simplicity, provides performance comparable to recent multi-class DNA molecule classification results. Unlike the approach presented by Winters-Hilt et al., where weaker data is dropped to obtain better classification, the proposed approach provides comparable classification accuracy without any need for rejection of weak data. A weakness of this approach, on the other hand, is the very "hands-on" tuning and feature selection that is required to obtain good generalization. Simply put, this method obtains a more informed set of features and provides better results for that reason. The strength of this approach appears to be in its ability to identify strong features, an area where further results are actively being sought.
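A minimal sketch of the feature-primitive idea on synthetic data follows; scikit-learn's AdaBoostClassifier over decision stumps and the random synthetic features are stand-ins for the paper's ensemble of weak learners and its HMM, PCA, and wavelet primitives:

```python
# Sketch of the feature-primitive approach on synthetic data (not the paper's
# nanopore measurements): a pool of weak features stands in for the HMM, PCA
# and wavelet primitives, and AdaBoost over shallow learners selects and
# combines them into a single multi-class classifier, with no need for a
# decision tree of binary classifiers.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for 5 DNA hairpin classes: each class shifts the mean of
# a handful of otherwise noisy "feature primitives".
n_per_class, n_features, n_classes = 200, 50, 5
X = rng.normal(size=(n_per_class * n_classes, n_features))
y = np.repeat(np.arange(n_classes), n_per_class)
for c in range(n_classes):
    X[y == c, c * 3:(c * 3) + 3] += 1.5   # weakly informative features per class

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = AdaBoostClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```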
- …