3,092 research outputs found

    Hybrid fuzzy-proportional-integral-derivative controller (F-PID-C) for speed control of a brushless direct current motor (BLDCM)

    Get PDF
    A hybrid fuzzy proportional-integral-derivative (PID) controller (F-PID-C) is designed and analyzed for controlling the speed of a brushless DC (BLDC) motor. A simulation study of the controller is performed to overcome the nonlinearities and uncertainties present in the system. The fuzzy logic controller (FLC) is designed according to fuzzy rules so that the system is fundamentally robust; there are 49 fuzzy rules for each parameter of the fuzzy-PID controller. Fuzzy logic is used to tune the proportional, integral and derivative gains (kp, ki, kd) of the PID controller. The FLC has two inputs: i) the speed error between the reference and actual motor speed, and ii) the rate of change of that error. Its three outputs are the gains kp, ki and kd, which serve as the parameters of the PID controller for speed control of the BLDC motor. Several types of membership function (Gaussian, trapezoidal and triangular) are assessed in the fuzzy controller and compared within the fuzzy-PID scheme. The membership functions and rules are defined using the fuzzy system editor provided in MATLAB. Two situations are simulated: the start response, and the step response with and without load. The fuzzy-PID controller is tuned by trial and error, with rise time, settling time and overshoot as the performance measures. The findings show that the trapezoidal membership function gives the best results, with the shortest rise time, fastest settling time and minimum overshoot for speed control of the BLDC motor.
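The gain-scheduling idea above can be sketched in a few lines. This is a deliberately reduced illustration, not the thesis's controller: it uses 3 triangular sets per input instead of the 7 per input (49 rules) described, a Sugeno-style weighted average in place of MATLAB's fuzzy system editor, and hypothetical rule consequents that scale only the kp gain.

```python
# Minimal sketch of fuzzy gain scheduling for a PID controller.
# Illustrative only: 3 fuzzy sets per input and a tiny hypothetical
# rule table, versus the 49-rule (7x7) base used in the paper.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(x):
    """Memberships in Negative / Zero / Positive over the normalised range [-1, 1]."""
    return {
        "N": tri(x, -2.0, -1.0, 0.0),
        "Z": tri(x, -1.0, 0.0, 1.0),
        "P": tri(x, 0.0, 1.0, 2.0),
    }

# Rule table: (error set, error-rate set) -> kp multiplier (hypothetical values).
RULES = {
    ("N", "N"): 1.5, ("N", "Z"): 1.2, ("N", "P"): 1.0,
    ("Z", "N"): 1.2, ("Z", "Z"): 1.0, ("Z", "P"): 1.2,
    ("P", "N"): 1.0, ("P", "Z"): 1.2, ("P", "P"): 1.5,
}

def fuzzy_kp(error, d_error, kp_base=1.0):
    """Weighted-average (Sugeno-style) defuzzification of the kp gain."""
    mu_e, mu_de = fuzzify(error), fuzzify(d_error)
    num = den = 0.0
    for (e_set, de_set), mult in RULES.items():
        w = min(mu_e[e_set], mu_de[de_set])  # rule firing strength
        num += w * mult * kp_base
        den += w
    return num / den if den else kp_base
```

With zero error and zero error rate only the (Z, Z) rule fires, so the nominal gain is returned; large errors push the multiplier toward its maximum.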

    Modified Newton's Law of Gravitation Due to Minimal Length in Quantum Gravity

    Full text link
    A recent theory about the origin of gravity suggests that gravity is fundamentally an entropic force. In this work, we discuss the effects of the generalized uncertainty principle (GUP), which is proposed by several approaches to quantum gravity such as string theory, black hole physics and doubly special relativity (DSR), on the area law of entropy. This leads to a $\sqrt{\mathrm{Area}}$-type correction to the area law of entropy, which implies that the number of bits $N$ is modified. Therefore, we obtain a modified Newton's law of gravitation. Surprisingly, this modification agrees, up to a sign, with the prediction of the Randall-Sundrum II model, which contains one uncompactified extra dimension. Furthermore, such a modification may have observable consequences at length scales much larger than the Planck scale. Comment: 12 pages, no figures, references added
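Schematically, the entropic-force argument the abstract builds on runs as follows (a Verlinde-style sketch; the GUP correction enters through the modified bit count, and the coefficient $\alpha$ is left symbolic since the abstract does not fix it):

```latex
% Entropic derivation of Newton's law (schematic)
\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x, \qquad
E = Mc^2 = \tfrac12 N k_B T, \qquad
N = \frac{A c^3}{G\hbar}, \quad A = 4\pi R^2
\;\Longrightarrow\;
F = \frac{T\,\Delta S}{\Delta x} = \frac{G M m}{R^2}.
% A sqrt(Area) correction to the entropy modifies the bit count N,
% so the force law picks up a 1/R correction (alpha symbolic):
F = \frac{G M m}{R^2}\left(1 + \frac{\alpha}{R} + \dots\right).
```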

    Nitrates for the Management of Acute Heart Failure Syndromes: A Systematic Review

    Get PDF
    © The Author(s) 2016. Intravenous nitrates are widely used in the management of acute heart failure syndrome (AHFS), yet robust evidence to support their use is lacking. We therefore sought to analyze all randomized studies that evaluated the effects of nitrates on clinical outcomes in patients with AHFS. In total, 15 relevant trials comparing nitrates with alternative interventions in 1824 patients were identified. All but 3 were conducted before 1998. No trial demonstrated a beneficial effect on mortality, apart from 1 trial reporting a reduction in mortality that was related to the timing of treatment. This retrospective review suggests that there is a lack of data to draw any firm conclusions concerning the use of nitrates in patients with AHFS. More studies are needed to evaluate the safety and efficacy of these agents in the modern era of guideline-directed heart failure therapy. Peer reviewed. Final Accepted Version

    Adjunctive therapies to reduce thrombotic events in patients with a history of myocardial infarction: role of vorapaxar

    Get PDF
    © 2015 Farag et al. This work is published by Dove Medical Press Limited and licensed under the Creative Commons Attribution – Non Commercial (unported, v3.0) License. Acute myocardial infarction (AMI) is generally attributed to coronary atherothrombotic disease. Platelet activation is essential for thrombus formation and is thus an important target for pharmacological intervention to prevent and treat AMI. Despite contemporary treatment with dual antiplatelet therapy, including acetylsalicylic acid and adenosine diphosphate receptor antagonists, patients with prior AMI remain at increased risk of future thrombotic events. This has stimulated the search for more potent antithrombotic agents. Among these is the oral protease-activated receptor-1 antagonist vorapaxar, which represents a new oral antiplatelet agent to reduce thrombotic risk in patients with atherothrombotic disease. The TRACER and the TRA 2°P-TIMI 50 trials concluded that vorapaxar in addition to standard therapy reduced ischemic adverse cardiac events. A remarkable benefit was observed in patients with stable atherosclerotic disease, particularly those with a previous history of AMI. Although favorable effects were seen in the reduction of adverse cardiac events, this was associated with excess major and intracranial bleeding, particularly in patients at high risk of bleeding and those with a history of stroke or transient ischemic attack. Currently, the lack of a reliable individualized risk stratification tool to assess patients for thrombotic and bleeding tendencies, in order to identify those who might gain most net clinical benefit, has led to limited use of vorapaxar in clinical practice. Vorapaxar may find a niche as an adjunct to standard care in patients at high risk of thrombotic events who are at low risk of bleeding. Peer reviewed

    Bivalirudin versus unfractionated heparin: a meta-analysis of patients receiving percutaneous coronary intervention for acute coronary syndromes

    Get PDF
    OBJECTIVE: Acute coronary syndrome (ACS) encompasses ST segment elevation myocardial infarction (STEMI), with generally high thrombus burden, and non-ST segment elevation ACS (NSTE-ACS), with lower thrombus burden. In the setting of percutaneous coronary intervention (PCI) for ACS, bivalirudin appears superior to unfractionated heparin (UFH), driven by reduced major bleeding. Recent trials suggest that the benefit of bivalirudin may be reduced with the use of transradial access and the evolution of antiplatelet therapy. Moreover, any differential role of bivalirudin across ACS cohorts is unknown. METHODS: A meta-analysis of randomised trials comparing bivalirudin and UFH in patients with ACS receiving PCI was performed, with separate analyses for the STEMI and NSTE-ACS groups. Overall estimates of treatment effect were calculated with a random-effects model. RESULTS: In 5 trials of STEMI (10 358 patients), bivalirudin increased the risk of acute stent thrombosis (ST) (OR 3.62; CI 1.95 to 6.74; p<0.0001) compared with UFH. Bivalirudin reduced the risk of major bleeding only when compared with UFH plus planned glycoprotein IIb/IIIa inhibitors (GPI) (OR 0.49; CI 0.36 to 0.67; p<0.00001). In 14 NSTE-ACS trials (25 238 patients), there was no difference between bivalirudin and UFH in death, myocardial infarction or ST. However, bivalirudin reduced the risk of major bleeding compared with UFH plus planned GPI (OR 0.52; CI 0.43 to 0.62; p<0.00001), or UFH plus provisional GPI (OR 0.68; CI 0.46 to 1.01; p=0.05). The reduction in major bleeding with bivalirudin was not related to the vascular access site. CONCLUSIONS: Bivalirudin increases the risk of acute ST in STEMI, but may confer an advantage over UFH in patients with NSTE-ACS undergoing PCI, reducing major bleeding without an increase in ST.
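The pooling step ("random-effects model") can be sketched as below. The DerSimonian-Laird estimator is assumed (the abstract does not name one), and the (OR, 95% CI) triples passed in are hypothetical placeholders, not the trial data.

```python
# Sketch of DerSimonian-Laird random-effects pooling of odds ratios.
# Inputs are (odds_ratio, ci_low, ci_high) triples with 95% CIs;
# the values used below in any example are hypothetical, not trial data.
import math

def pool_random_effects(studies):
    """Return (pooled OR, 95% CI low, 95% CI high) for a list of studies."""
    # Back out each study's log-OR and standard error from its 95% CI.
    y = [math.log(or_) for or_, lo, hi in studies]
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
    w = [1 / s ** 2 for s in se]                         # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    df = len(studies) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in se]             # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_mu = math.sqrt(1 / sum(w_re))
    return (math.exp(mu),
            math.exp(mu - 1.96 * se_mu),
            math.exp(mu + 1.96 * se_mu))
```

When all studies report the same effect, Q is zero, the between-study variance collapses to zero, and the pooled OR equals the common study OR.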

    Spontaneous Coronary Artery Dissection: The Phantom Menace

    Get PDF
    Articles © The authors. We present a case of a 66-year-old woman with chest pain, without dynamic 12-lead electrocardiographic (ECG) changes and with normal serial troponin. Coronary angiography revealed a linear filling defect in the first obtuse marginal branch of the circumflex artery, indicating coronary artery dissection with superadded thrombus. She was managed medically with dual antiplatelet therapy and has responded well. Spontaneous coronary artery dissection (SCAD) is a rare cause of cardiac chest pain, which can be missed without coronary angiography. Unlike most other lesions in patients with unstable symptoms, where coronary intervention with stenting is recommended, patients with SCAD generally fare better with conservative measures than with intervention, unless there is hemodynamic instability. Peer reviewed

    A comparison of advanced time series models for environment-dependent stock recruitment of the western rock lobster

    Get PDF
    Time series models have been applied in many areas, including economics, stock recruitment and the environment. Most environmental time series involve highly correlated dependent variables, which makes it difficult to apply conventional regression analysis. Traditionally, regression analysis has been applied to the environment-dependent stock and recruitment relationships for crustacean species in Western Australian fisheries. Alternative models, such as transfer function models and state space models, have the potential to provide improved forecasts for these types of data sets. This dissertation explores the application of regression models, transfer function models and state space models to modelling the puerulus stage of the western rock lobster (Panulirus cygnus) in the fisheries of Western Australia. The transfer function models are well suited to examining the influence of the environment on crustacean species and can be used where correlated variables are involved. These models aim at producing short-term forecasts that may help in the management of the fisheries. In comparison with regression models, the transfer function models gave better forecast values, with the state space models giving the best forecasts in the first two years. Overall, it was shown that environmental effects, namely westerly winds and the Leeuwin Current, have a significant effect on the puerulus settlement at Dongara and Alkimos. It was also shown that westerly winds and spawning stock have a significant effect on the puerulus settlement at the Abrolhos Islands.
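As a minimal illustration of the regression baseline the alternative models are compared against, the sketch below regresses settlement on two standardised environmental predictors. The data, coefficients and variable names are synthetic, invented for illustration, not the dissertation's series.

```python
# Sketch of the regression-model baseline: puerulus settlement regressed on
# environmental predictors (westerly wind index, Leeuwin Current strength).
# All data here are synthetic; the dissertation compares this kind of
# baseline against transfer function and state space models.
import numpy as np

rng = np.random.default_rng(0)
n = 30                                    # 30 hypothetical settlement seasons
wind = rng.normal(size=n)                 # westerly wind index (standardised)
current = rng.normal(size=n)              # Leeuwin Current strength (standardised)
settlement = 2.0 + 1.5 * wind + 0.8 * current + rng.normal(scale=0.3, size=n)

# Ordinary least squares with an intercept column in the design matrix.
X = np.column_stack([np.ones(n), wind, current])
beta, *_ = np.linalg.lstsq(X, settlement, rcond=None)
forecast = X @ beta                       # in-sample fitted settlement
```

With autocorrelated residuals, as most environmental series have, these OLS standard errors are unreliable, which is precisely the motivation for the transfer function and state space alternatives.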

    Lung nodule modeling and detection for computerized image analysis of low dose CT imaging of the chest.

    Get PDF
    From a computerized image analysis perspective, early diagnosis of lung cancer involves detection of suspicious nodules and classification into different pathologies. The detection stage involves a detection approach, usually by template matching, and an authentication step to reduce false positives, usually conducted by a classifier of one form or another; statistical, fuzzy logic and support vector machine approaches have been tried. The classification stage matches, according to a particular approach, the characteristics (e.g., shape, texture and spatial distribution) of the detected nodules to the common characteristics (again, shape, texture and spatial distribution) of nodules with known pathologies (confirmed by biopsies). This thesis focuses on the first step, i.e., nodule detection. Specifically, the thesis addresses three issues: a) understanding the CT data of typical low dose CT (LDCT) scanning of the chest, and devising an image processing approach to reduce the inherent artifacts in the scans; b) devising an image segmentation approach to isolate the lung tissues from the rest of the chest and thoracic regions in the CT scans; and c) devising a nodule modeling methodology to enhance the detection rate and benefit the ultimate step in computerized image analysis of LDCT of the lungs, namely associating a pathology with the detected nodule. The methodology for reducing the noise artifacts is based on noise analysis and examination of typical LDCT scans, which may be gathered repetitively, since a reduction in resolution is inevitable to avoid excessive radiation. Two optimal filtering methods are tested on samples of the ELCAP screening data: the Wiener and the anisotropic diffusion filters. Preference is given to the anisotropic diffusion filter, which can be implemented on 7x7 blocks/windows of the CT data.
The methodology for lung segmentation is based on an inherent characteristic of the LDCT scans: a distinct bi-modal gray scale histogram. A linear model describes the histogram (the joint probability density function of the lung and non-lung tissues) by a linear combination of weighted kernels. Gaussian kernels were chosen, and the classic Expectation-Maximization (EM) algorithm was employed to estimate the marginal probability densities of the lung and non-lung tissues and to select an optimal segmentation threshold. The segmentation is further enhanced using standard shape analysis based on mathematical morphology, which improves the continuity of the outer and inner borders of the lung tissues. This approach (a preliminary version of which appeared in [14]) is found to be adequate for lung segmentation as compared to more sophisticated approaches developed at the CVIP Lab (e.g., [15][16]) and elsewhere. The methodology developed for nodule modeling is based on understanding the physical characteristics of the nodules in LDCT scans, as identified by human experts. An empirical model is introduced for the probability density of the image intensity (or Hounsfield units) versus the radial distance measured from the centroid (center of mass) of typical nodules. This probability density showed that the nodule spatial support lies within a circle/square of size 10 pixels, i.e., limited to 5 mm in length, which is within the range that radiologists specify to be of concern. This probability density is used to fill in the intensity (or Hounsfield units) of parametric nodule models. For these models (e.g., circles or semi-circles), given a certain radius, we calculate the intensity (or Hounsfield units) using an exponential expression in the radial distance, with parameters specified from the histogram of an ensemble of typical nodules.
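The EM-based threshold selection can be sketched in pure Python. This is an illustration on synthetic 1-D intensities, not the thesis code: two Gaussian components stand in for the lung and non-lung modes of the histogram, and the threshold is taken where the weighted component densities cross.

```python
# Sketch of EM threshold selection for a bi-modal intensity histogram:
# fit a two-component Gaussian mixture, then threshold at the crossing
# of the weighted component densities. Synthetic data, illustrative only.
import math
import random

def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_two_gaussians(data, iters=50):
    data = sorted(data)
    n = len(data)
    lo, hi = data[: n // 2], data[n // 2:]          # crude initialisation
    mu = [sum(lo) / len(lo), sum(hi) / len(hi)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each sample.
        r0 = []
        for x in data:
            p0 = pi[0] * gauss(x, mu[0], var[0])
            p1 = pi[1] * gauss(x, mu[1], var[1])
            r0.append(p0 / (p0 + p1))
        # M-step: re-estimate mixture weights, means and variances.
        n0 = sum(r0)
        pi = [n0 / n, 1 - n0 / n]
        mu = [sum(r * x for r, x in zip(r0, data)) / n0,
              sum((1 - r) * x for r, x in zip(r0, data)) / (n - n0)]
        var = [max(1e-6, sum(r * (x - mu[0]) ** 2 for r, x in zip(r0, data)) / n0),
               max(1e-6, sum((1 - r) * (x - mu[1]) ** 2 for r, x in zip(r0, data)) / (n - n0))]
    return mu, var, pi

def crossing_threshold(mu, var, pi, steps=200):
    """Scan between the two means for where the weighted densities cross."""
    lo, hi = min(mu), max(mu)
    best, best_gap = lo, float("inf")
    for k in range(steps + 1):
        t = lo + k * (hi - lo) / steps
        gap = abs(pi[0] * gauss(t, mu[0], var[0]) - pi[1] * gauss(t, mu[1], var[1]))
        if gap < best_gap:
            best, best_gap = t, gap
    return best

# Synthetic bi-modal intensities: a "lung" mode and a "non-lung" mode.
random.seed(1)
data = ([random.gauss(-3, 1) for _ in range(200)] +
        [random.gauss(3, 1) for _ in range(200)])
mu, var, pi = em_two_gaussians(data)
threshold = crossing_threshold(mu, var, pi)
```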
This work is similar in spirit to the earlier work of Farag et al., 2004 and 2005 [18][19], except that the empirical density of the radial distance and the histogram of typical nodules provide a data-driven guide for estimating the intensity (or Hounsfield units) of the nodule models. We examined the sensitivity and specificity of parametric nodules in a template-matching framework for nodule detection. We show that false positives are an inevitable problem with typical machine learning methods for automatic lung nodule detection, which invites further effort and perhaps fresh thinking on automatic nodule detection. A new approach for nodule modeling is introduced in Chapter 5 of this thesis, which holds high promise for both the detection and the classification of nodules. Using the ELCAP study, we created an ensemble of four types of nodules and generated a nodule model for each type based on optimal data reduction methods. The resulting nodule model, for each type, has led to drastic improvements in the sensitivity and specificity of nodule detection. This approach may be used for classification as well. In conclusion, the methodologies in this thesis are based on understanding the LDCT scans and what is to be expected in terms of image quality. Noise reduction and image segmentation are standard. The thesis illustrates that proper nodule models are possible, and that a computerized image analysis approach to detect and classify lung nodules is indeed feasible. Extensions to the results in this thesis are immediate, and the CVIP Lab has devised plans to pursue subsequent steps using clinical data.
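The template-matching detection step can be sketched as below. Normalised cross-correlation is assumed as the matching score (the abstract does not fix one), and the exponentially decaying circular template is an illustrative stand-in for the intensity-filled parametric nodule models described above.

```python
# Sketch of template-matching nodule detection: slide a parametric
# circular template over an image and score each offset by normalised
# cross-correlation (NCC). Illustrative stand-in, not the thesis code.
import numpy as np

def make_circular_template(radius, decay=0.5):
    """Circular template with exponentially decaying radial intensity."""
    size = 2 * radius + 1
    y, x = np.mgrid[:size, :size]
    r = np.hypot(x - radius, y - radius)
    t = np.exp(-decay * r)
    t[r > radius] = 0.0
    return t

def ncc_map(image, template):
    """Normalised cross-correlation of the template at every valid offset."""
    th, tw = template.shape
    t = template - template.mean()
    tn = np.sqrt((t ** 2).sum())
    h, w = image.shape
    out = np.zeros((h - th + 1, w - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + th, j:j + tw]
            p = patch - patch.mean()
            pn = np.sqrt((p ** 2).sum())
            out[i, j] = (p * t).sum() / (pn * tn) if pn > 0 else 0.0
    return out

# Plant the template in an empty image; the NCC peak recovers its location.
tpl = make_circular_template(3)
img = np.zeros((20, 20))
img[5:12, 5:12] = tpl
scores = ncc_map(img, tpl)
```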

    Face recognition in the wild.

    Get PDF
    Research in face recognition deals with problems related to Age, Pose, Illumination and Expression (A-PIE), and seeks approaches that are invariant to these factors. Video images add a temporal aspect to the image acquisition process. Another degree of complexity, above and beyond A-PIE recognition, occurs when multiple pieces of information are known about people, which may be distorted, partially occluded, or disguised, and when the imaging conditions are totally unorthodox! A-PIE recognition in these circumstances becomes really “wild”, and therefore Face Recognition in the Wild has emerged as a field of research in the past few years. Its main purpose is to challenge constrained approaches to automatic face recognition, emulating some of the virtues of the Human Visual System (HVS), which is very tolerant to age, occlusion and distortions in the imaging process. The HVS also integrates information about individuals and adds context to recognize people within an activity or behavior. Machine vision has a very long road ahead in emulating the HVS, and face recognition in the wild is one step along that path. In this thesis, Face Recognition in the Wild is defined as unconstrained face recognition under A-PIE+; the (+) connotes any alterations to the design scenario of the face recognition system. This thesis evaluates the Biometric Optical Surveillance System (BOSS) developed at the CVIP Lab, using low resolution imaging sensors. Specifically, the thesis tests the BOSS using cell phone cameras, and examines the potential of facial biometrics on smart portable devices like iPhones, iPads and tablets. For quantitative evaluation, the thesis focused on a specific testing scenario of the BOSS software using iPhone 4 cell phones and a laptop. Testing was carried out indoors, at the CVIP Lab, using 21 subjects at distances of 5, 10 and 15 feet, with three poses, two expressions and two illumination levels.
The three steps (detection, representation and matching) of the BOSS system were tested in this imaging scenario. False positives in facial detection increased with distance and with pose angles above ±15°. The overall identification rate (face detection at confidence levels above 80%) also degraded with distance, pose and expression. The indoor lighting added further challenges by inducing shadows, which affected the image quality and the overall performance of the system. While this limited number of subjects and somewhat constrained imaging environment does not fully support a “wild” imaging scenario, it did provide deep insight into the issues with automatic face recognition. The recognition rate curves demonstrate the limits of low-resolution cameras for face recognition at a distance (FRAD), yet they also make a plausible case for A-PIE face recognition on portable devices.