
    The utility of Aspirin in Dukes C and High Risk Dukes B Colorectal cancer--the ASCOLT study: study protocol for a randomized controlled trial


    Role of intestinal microflora (Lactobacillus Acidophilus) in phagocytic function of leukocytes in type 2 diabetic patients

    The prevalence of obesity, insulin resistance and type 2 diabetes has steadily increased in recent decades. In addition to genetic and environmental factors, the gut microbiota may play an important role in modulating the intermediary phenotypes that lead to metabolic disease. Infection is an important cause of morbidity and mortality in diabetic patients, and chronic hyperglycemia impairs host defense mechanisms such as cell-mediated immunity, polymorphonuclear leukocyte function, and antibody formation. We therefore aimed to study the association between intestinal microflora (Lactobacillus acidophilus) counts and the phagocytic activity of polymorphonuclear leukocytes in humans with type 2 diabetes. The study included 20 type 2 diabetic patients with good glycemic control, 20 type 2 diabetic patients with poor glycemic control, and 20 healthy subjects as normal controls. Fecal L. acidophilus was detected on de Man, Rogosa and Sharpe (MRS) agar and confirmed by the polymerase chain reaction technique. Phagocytic function of polymorphonuclear leukocytes was assessed using the phagocytosis index (%). The fecal L. acidophilus count was significantly increased among uncontrolled diabetic patients, while the phagocytosis index was significantly reduced in the same patients. In uncontrolled diabetics, a significant positive correlation was observed between the fecal L. acidophilus count and HbA1c, and a significant negative correlation between phagocytic activity and the L. acidophilus count. In conclusion, type 2 diabetes is associated with compositional changes in fecal L. acidophilus, especially in uncontrolled diabetes. The level of glucose tolerance or severity of diabetes should be considered when linking intestinal microbiota levels with the leukocyte phagocytosis index.

    Euclidean space data projection classifier with cartesian genetic programming (CGP)

    Most evolutionary classifiers are built on generated rule sets that categorize data into their respective classes. This preliminary work proposes an evolutionary classifier based on a simplified Cartesian Genetic Programming (CGP) algorithm. Instead of using evolved rule sets, the CGP generates (i) a reference coordinate and (ii) projection functions that project the data into a new three-dimensional Euclidean space. A distance boundary function on the distance of the projected data from the reference coordinate then classifies the data into their respective classes. The evolutionary algorithm is a simplified CGP using a 1+4 evolution strategy, with the projection functions evolved for 1000 generations before the best functions are extracted. The classifier was tested on three PROBEN1 benchmark datasets, the PIMA Indians Diabetes, Heart Disease and Wisconsin Breast Cancer (WBC) datasets, using 10-fold cross-validation. The evolved projection functions produced competitive classification rates: Cancer (97.71%), PIMA Indians (77.92%) and Heart Disease (85.86%).
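    The projection-and-distance scheme described above can be sketched as follows. This is a simplified illustration, not the paper's implementation: linear maps stand in for evolved CGP function graphs, a median distance threshold stands in for the evolved distance boundary, and all names are hypothetical; only the 1+4 evolution strategy loop mirrors the paper directly.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(genome, X, y):
    """Project data into 3-D space, classify by distance to the reference."""
    W, ref = genome                          # W: linear stand-in for CGP graphs
    d = np.linalg.norm(X @ W - ref, axis=1)
    pred = (d > np.median(d)).astype(int)    # simple distance boundary
    acc = (pred == y).mean()
    return max(acc, 1.0 - acc)               # class labelling is arbitrary

def evolve(X, y, gens=100, sigma=0.3):
    """1+4 evolution strategy: one parent plus four mutants per generation."""
    n = X.shape[1]
    parent = (rng.normal(size=(n, 3)), rng.normal(size=3))
    best = fitness(parent, X, y)
    for _ in range(gens):
        for _ in range(4):
            child = (parent[0] + sigma * rng.normal(size=(n, 3)),
                     parent[1] + sigma * rng.normal(size=3))
            f = fitness(child, X, y)
            if f >= best:                    # accept ties (neutral drift, as in CGP)
                parent, best = child, f
    return parent, best
```

    Accepting children with equal fitness is the neutral-drift behaviour CGP is known for; the real system evolves function graphs rather than a single linear map.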


    Optimisation of neural network with simultaneous feature selection and network pruning using evolutionary algorithm

    Most advances in the evolutionary-algorithm optimisation of neural networks concern recurrent networks using the NEAT method. For feed-forward networks, most optimisation covers only weight and bias selection, generally known as conventional neuroevolution. This work presents simultaneous feature reduction, network pruning and weight/bias selection using a fitness function designed to penalise the selection of large feature sets; the fitness function also accounts for feature and hidden-neuron reduction. Results are demonstrated on two datasets, the Cancer and Thyroid datasets. Backpropagation gradient-descent weight/bias optimisation performed slightly better at classifying the two datasets, with lower misclassification rates and error. However, simultaneous feature/neuron switching with the Genetic Algorithm reduced the number of features from 21 to 4 (Thyroid) and from 9 to 3 (Cancer), with only one hidden neuron in the processing layer for both network structures. This work presents the chromosome representation and the fitness function design.
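    A minimal sketch of the penalised fitness idea described above, assuming a chromosome laid out as [feature switches | hidden-neuron switches | real-valued weights/biases]; the penalty coefficients alpha and beta and the tiny network below are illustrative, not taken from the paper.

```python
import numpy as np

def fitness(chrom, X, y, n_hidden, alpha=0.01, beta=0.01):
    """Accuracy of a tiny feed-forward net minus penalties for network size."""
    n_feat = X.shape[1]
    f_mask = chrom[:n_feat] > 0.5                    # feature on/off switches
    h_mask = chrom[n_feat:n_feat + n_hidden] > 0.5   # hidden-neuron switches
    w = chrom[n_feat + n_hidden:]                    # real-valued weights/biases
    if not f_mask.any() or not h_mask.any():
        return 0.0                                   # degenerate network
    W1 = w[:n_feat * n_hidden].reshape(n_feat, n_hidden)
    W2 = w[n_feat * n_hidden:n_feat * n_hidden + n_hidden]
    # forward pass through only the switched-on features and hidden units
    h = np.tanh(X[:, f_mask] @ W1[f_mask][:, h_mask])
    pred = (h @ W2[h_mask] > 0).astype(int)
    acc = (pred == y).mean()
    # penalise large feature sets and large hidden layers
    return acc - alpha * f_mask.sum() - beta * h_mask.sum()
```

    Because the penalty grows with every active switch, the Genetic Algorithm is pushed toward solutions that keep accuracy while turning features and hidden neurons off.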


    What is the real impact of acute kidney injury?

    Background: Acute kidney injury (AKI) is a common clinical problem. Studies have documented the incidence of AKI in a variety of populations, but to date the real incidence of AKI has not been accurately documented in a district general hospital setting. The aim here was to describe the detected incidence of AKI in a typical general hospital setting in an unselected population, and to describe associated short- and long-term outcomes. Methods: A retrospective observational database study from secondary care in East Kent (adult catchment population of 582,300). All adult patients (18 years or over) admitted between 1st February 2009 and 31st July 2009 were included. Patients receiving chronic renal replacement therapy (RRT), maternity admissions and day-case admissions were excluded. AKI was defined by the Acute Kidney Injury Network (AKIN) criteria. A time-dependent risk analysis with logistic regression and Cox regression was used for the analysis of in-hospital mortality and survival. Results: The incidence of AKI over the 6-month period was 15,325 pmp/yr in adults (69% AKIN1, 18% AKIN2 and 13% AKIN3). In-hospital mortality, length of stay and ITU utilisation all increased with the severity of AKI. Patients with AKI required more care on discharge and had more hospital readmissions within 30 days. Conclusions: These data come closer to the real incidence and outcomes of AKI managed in-hospital than any study published to date. Fifteen percent of all admissions sustained an episode of AKI, with increased subsequent short- and long-term morbidity and mortality, even in those with AKIN1. This confers an increased burden and cost on the healthcare economy, which can now be quantified. These results furnish a baseline for quality-improvement projects aimed at early identification, improved management and, where possible, prevention of AKI.
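    The reported rate can be sanity-checked with simple arithmetic: a rate in "per million population per year" (pmp/yr) is the case count divided by person-years at risk, scaled by one million. The case count used below (about 4,462 episodes) is back-derived from the figures in the abstract, not stated there.

```python
def incidence_pmp_per_year(cases, population, years):
    """Events per million population per year."""
    return cases / (population * years) * 1_000_000

# ~4,462 AKI episodes over 6 months in an adult catchment of 582,300
rate = incidence_pmp_per_year(cases=4_462, population=582_300, years=0.5)
# rate is approximately 15,325 pmp/yr, matching the abstract
```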

    Overview on the phenomenon of two-qubit entanglement revivals in classical environments

    The occurrence of revivals of quantum entanglement between separated open quantum systems has been shown not only for dissipative non-Markovian quantum environments but also for classical environments in the absence of back-action. While the phenomenon is well understood in the first case, the possibility of retrieving entanglement when the composite quantum system is subject to local classical noise has generated a debate about its interpretation. This dynamical property of open quantum systems plays an important role in quantum information theory from both fundamental and practical perspectives. Hybrid quantum-classical systems are in fact promising candidates for investigating the interplay between quantum and classical features and for exploring possible strategies to control a quantum system by means of a classical device. Here we present an overview of this topic, reporting the most recent theoretical and experimental results on the revivals of entanglement between two qubits locally interacting with classical environments. We also review and discuss the interpretations provided so far to explain this phenomenon, suggesting that they can be cast under a unified viewpoint. Comment: 16 pages, 9 figures. Chapter written for the upcoming book "Lectures on general quantum correlations and their applications".

    Cosmic Flows on 100 Mpc/h Scales: Standardized Minimum Variance Bulk Flow, Shear and Octupole Moments

    The low-order moments of the large-scale peculiar velocity field, such as the bulk flow and shear, are sensitive probes of matter density fluctuations on very large scales. In practice, however, peculiar velocity surveys are usually sparse and noisy, which can lead to the aliasing of small-scale power into what is meant to be a probe of the largest scales. Previously, we developed an optimal "minimum variance" (MV) weighting scheme designed to overcome this problem by minimizing the difference between the measured bulk flow (BF) and that which would be measured by an ideal survey. Here we extend the MV analysis to include the shear and octupole moments, which are designed to have almost no correlations between them, so that they are virtually orthogonal. We apply this MV analysis to a compilation of all major peculiar velocity surveys, consisting of 4536 measurements. Our estimate of the BF on scales of ~100 Mpc/h has a magnitude of |v| = 416 +/- 78 km/s towards Galactic l = 282 +/- 11 degrees and b = 6 +/- 6 degrees. This result disagrees with LCDM with WMAP5 cosmological parameters at a high confidence level, but agrees well with our previous MV result without an orthogonality constraint, showing that the shear and octupole moments did not contaminate the previous BF measurement. The shear and octupole moments are consistent with the WMAP5 power spectrum, although the measurement noise is larger for these moments than for the BF. The relatively low shear moments suggest that the sources responsible for the BF are at large distances. Comment: 13 pages, 7 figures, 4 tables. Some changes to reflect the published version.
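    For intuition, the simpler maximum-likelihood bulk flow estimator that the MV weights generalise can be sketched as below; the full MV machinery (matching the window function of an ideal survey) is omitted, and all names are illustrative. Given N radial peculiar velocities S_n along unit vectors rhat_n with errors sigma_n, the chi-square-minimising bulk flow U solves A U = b with A = sum_n w_n rhat_n rhat_n^T, b = sum_n w_n S_n rhat_n, w_n = 1/sigma_n^2.

```python
import numpy as np

def bulk_flow(rhat, S, sigma):
    """Maximum-likelihood bulk flow from line-of-sight peculiar velocities.

    rhat:  (N, 3) unit vectors to the velocity tracers
    S:     (N,)   measured radial peculiar velocities [km/s]
    sigma: (N,)   measurement errors [km/s]
    Returns the 3-vector U minimising sum_n ((S_n - rhat_n . U) / sigma_n)^2.
    """
    w = 1.0 / sigma**2
    A = np.einsum('n,ni,nj->ij', w, rhat, rhat)   # 3x3 normal matrix
    b = np.einsum('n,n,ni->i', w, S, rhat)        # weighted data vector
    return np.linalg.solve(A, b)
```

    Each survey galaxy constrains only one component of the velocity at its position, which is why many sight lines spread over the sky are needed before A becomes well conditioned.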
