
    A unified wavelet-based modelling framework for non-linear system identification: the WANARX model structure

    A new unified modelling framework based on the superposition of additive submodels, functional components, and wavelet decompositions is proposed for non-linear system identification. A non-linear model, which is often represented using a multivariate non-linear function, is initially decomposed into a number of functional components via the well-known analysis of variance (ANOVA) expansion, which can be viewed as a special form of the NARX (non-linear autoregressive with exogenous inputs) model for representing dynamic input–output systems. By expanding each functional component using wavelet decompositions, including the regular lattice frame decomposition, wavelet series and multiresolution wavelet decompositions, the multivariate non-linear model can then be converted into a linear-in-the-parameters problem, which can be solved using least-squares-type methods. An efficient model structure determination approach based upon a forward orthogonal least squares (OLS) algorithm, which involves a stepwise orthogonalization of the regressors and a forward selection of the relevant model terms based on the error reduction ratio (ERR), is employed to solve the linear-in-the-parameters problem in the present study. The new modelling structure is referred to as a wavelet-based ANOVA decomposition of the NARX model, or simply the WANARX model, and can be applied to represent high-order and high-dimensional non-linear systems.
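    The forward OLS term-selection procedure described above can be sketched in a few lines: orthogonalize each remaining candidate regressor against the terms already chosen, and greedily pick the one with the largest error reduction ratio. This is a minimal illustration of the general idea, not the authors' implementation; function names and the numerical tolerance are assumptions.

```python
import numpy as np

def forward_ols_err(X, y, n_terms):
    """Greedy forward selection of regressors by error reduction ratio (ERR).

    At each step, every unselected column of X is orthogonalized (modified
    Gram-Schmidt) against the already-selected orthogonal regressors, and the
    column whose ERR (its share of the output energy) is largest is added.
    """
    n, m = X.shape
    selected, Q = [], []          # chosen column indices; orthogonalized regressors
    yy = y @ y                    # total output energy
    for _ in range(n_terms):
        best_err, best_j, best_q = -1.0, None, None
        for j in range(m):
            if j in selected:
                continue
            q = X[:, j].astype(float).copy()
            for qk in Q:          # orthogonalize against selected terms
                q -= (qk @ q) / (qk @ qk) * qk
            if q @ q < 1e-12:     # numerically dependent on chosen terms
                continue
            err = (q @ y) ** 2 / ((q @ q) * yy)   # error reduction ratio
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        if best_j is None:        # no independent candidates remain
            break
        selected.append(best_j)
        Q.append(best_q)
    return selected
```

    With mutually orthogonal candidates, each term's ERR is simply its share of the output energy, so contributing terms are recovered in order of magnitude.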

    The wavelet-NARMAX representation: a hybrid model structure combining polynomial models with multiresolution wavelet decompositions

    A new hybrid model structure combining polynomial models with multiresolution wavelet decompositions is introduced for nonlinear system identification. Polynomial models play an important role in approximation theory, and have been extensively used in linear and nonlinear system identification. Wavelet decompositions, in which the basis functions have the property of localization in both time and frequency, outperform many other approximation schemes and offer a flexible solution for approximating arbitrary functions. Although wavelet representations can approximate even severe nonlinearities in a given signal very well, the advantage of these representations can be lost when wavelets are used to capture linear or low-order nonlinear behaviour in a signal. In order to utilise the global property of polynomials and the local property of wavelet representations simultaneously, in this study polynomial models and wavelet decompositions are combined in a parallel structure to represent nonlinear input-output systems. As a special form of the NARMAX model, this hybrid model structure will be referred to as the WAvelet-NARMAX model, or simply WANARMAX. Generally, such a WANARMAX representation for an input-output system might involve a large number of basis functions and therefore a great number of model terms. Experience reveals that only a small number of these model terms are significant to the system output. A new fast orthogonal least squares algorithm, called the matching pursuit orthogonal least squares (MPOLS) algorithm, is also introduced in this study to determine which terms should be included in the final model.
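    The hybrid-dictionary idea can be illustrated with a plain matching pursuit over a dictionary of global polynomial terms plus localized wavelet terms. This is a simplified sketch, not the MPOLS algorithm itself (MPOLS additionally orthogonalizes the selected terms); the Mexican hat wavelet and the particular scales and shifts are illustrative assumptions.

```python
import numpy as np

def mexican_hat(x, scale, shift):
    """A wavelet basis function localized in space: the Mexican hat."""
    u = (x - shift) / scale
    return (1.0 - u ** 2) * np.exp(-0.5 * u ** 2)

def hybrid_dictionary(x, poly_degree=3, scales=(0.5, 0.25),
                      shifts=np.linspace(0.0, 1.0, 5)):
    """Columns: global polynomial terms, then localized wavelet terms."""
    cols = [x ** d for d in range(poly_degree + 1)]
    cols += [mexican_hat(x, s, c) for s in scales for c in shifts]
    return np.column_stack(cols)

def matching_pursuit(D, y, n_terms):
    """Greedy matching pursuit: repeatedly pick the column (normalized)
    most correlated with the current residual, then deflate the residual."""
    r = y.astype(float).copy()
    selected = []
    norms = np.linalg.norm(D, axis=0)
    for _ in range(n_terms):
        corr = np.abs(D.T @ r) / norms
        for s in selected:        # never re-pick a chosen term
            corr[s] = 0.0
        j = int(np.argmax(corr))
        selected.append(j)
        d = D[:, j]
        r = r - (d @ r) / (d @ d) * d
    return selected
```

    Because the dictionary mixes smooth polynomials with localized wavelets, a smooth global trend is captured by the polynomial columns while sharp local features fall to the wavelet columns, which is the motivation for the parallel structure.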

    Energy expenditure during common sitting and standing tasks: examining the 1.5 MET definition of sedentary behaviour

    Background: Sedentary behavior is defined as any waking behavior characterized by an energy expenditure of 1.5 METs or less while in a sitting or reclining posture. This study examines this definition by assessing the energy cost (METs) of common sitting, standing and walking tasks. Methods: Fifty-one adults spent 10 min in each of a variety of sitting tasks (watching TV, playing on the Wii, playing on the PlayStation Portable (PSP) and typing) and non-sedentary tasks (standing still, and walking at 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, and 1.6 mph). Activities were completed on the same day in a random order following an assessment of resting metabolic rate (RMR). A portable gas analyzer was used to measure oxygen uptake, and data were converted to units of energy expenditure (METs). Results: Average standardized MET values for the screen-based sitting tasks were 1.33 (SD: 0.24) METs (TV), 1.41 (SD: 0.28) (PSP), and 1.45 (SD: 0.32) (typing). The more active, yet still seated, games on the Wii yielded an average of 2.06 (SD: 0.5) METs. Standing still yielded an average of 1.59 (SD: 0.37) METs. Walking MET values increased incrementally with speed, from 2.17 to 2.99 (SD: 0.5 - 0.69) METs. Conclusions: The suggested 1.5 MET threshold for sedentary behaviors seems reasonable; however, some sitting-based activities may be classified as non-sedentary. The effect of this on the definition of sedentary behavior and its associations with metabolic health needs further investigation.
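    The conversion from measured oxygen uptake to METs is a one-line calculation. The default divisor of 3.5 ml O2 per kg per min below is the conventional 1-MET standard, an assumption for illustration; the study itself standardized against each participant's measured RMR, so the function accepts that measured value instead.

```python
def mets_from_vo2(vo2_ml_kg_min, rmr_vo2_ml_kg_min=3.5):
    """Convert oxygen uptake (ml O2 per kg per min) to METs.

    By convention 1 MET corresponds to 3.5 ml O2/kg/min; passing a
    participant's measured resting VO2 as the divisor yields an
    individually standardized MET value instead.
    """
    return vo2_ml_kg_min / rmr_vo2_ml_kg_min
```

    For example, a measured uptake of 7.0 ml O2/kg/min against the conventional standard corresponds to 2.0 METs, comfortably above the 1.5 MET sedentary threshold.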

    Improved model identification for non-linear systems using a random subsampling and multifold modelling (RSMM) approach

    In non-linear system identification, the available observed data are conventionally partitioned into two parts: the training data that are used for model identification and the test data that are used for model performance testing. This sort of 'hold-out' or 'split-sample' data partitioning method is convenient, and the associated model identification procedure is in general easy to implement. The resultant model obtained from such a once-partitioned single training dataset, however, may occasionally lack robustness and generalisation to represent future unseen data, because the performance of the identified model may be highly dependent on how the data partition is made. To overcome the drawback of the hold-out data partitioning method, this study presents a new random subsampling and multifold modelling (RSMM) approach to produce less biased or preferably unbiased models. The basic idea and the associated procedure are as follows. First, generate K training datasets (and also K validation datasets) using a K-fold random subsampling method. Secondly, detect significant model terms and identify a common model structure that fits all the K datasets, using a newly proposed common model selection approach called the multiple orthogonal search algorithm. Finally, estimate and refine the model parameters for the identified common-structured model using a multifold parameter estimation method. The proposed method can produce robust models with better generalisation performance.
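    The first step, generating the K random training/validation splits, can be sketched as follows. This is an assumed illustrative implementation; the train fraction and seed are placeholders, not values taken from the paper.

```python
import numpy as np

def random_subsample_folds(n_samples, k, train_frac=0.7, seed=0):
    """Generate K random training/validation index splits.

    Unlike classic K-fold cross-validation, each of the K splits here is an
    independent random partition of the data, so validation sets may overlap
    across folds; every split still covers all n_samples observations.
    """
    rng = np.random.default_rng(seed)
    n_train = int(train_frac * n_samples)
    folds = []
    for _ in range(k):
        perm = rng.permutation(n_samples)          # fresh random ordering
        folds.append((perm[:n_train], perm[n_train:]))
    return folds
```

    A common model structure would then be sought across all K training sets, so that term selection does not hinge on any single arbitrary partition.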

    High risk prescribing in older adults: Prevalence, clinical and economic implications and potential for intervention at the population level

    Background: High risk prescribing can compromise independent wellbeing and quality of life in older adults. The aims of this project are to determine the prevalence, risk factors, clinical consequences, and costs of high risk prescribing, and to assess the impact of interventions on high risk prescribing in older people. Methods: The proposed project will utilise data from the 45 and Up Study, a large-scale cohort of 267,153 men and women aged 45 and over, recruited during 2006-2009 from the state of New South Wales, Australia, linked to a range of administrative health datasets. High risk prescribing will be assessed using three indicators: polypharmacy (use of five or more medicines); the Beers Criteria (an explicit measure of potentially inappropriate medication use); and the Drug Burden Index (a pharmacologic dose-dependent measure of cumulative exposure to anticholinergic and sedative medicines). Individual risk factors from the 45 and Up Study questionnaire, and health system characteristics from the health datasets, that are associated with the likelihood of high risk prescribing will be identified. The main outcome measures will include hospitalisation (first admission to hospital, total days in hospital, cause-specific hospitalisation); admission to institutionalised care; all-cause mortality; and, where possible, cause-specific mortality. Economic costs to the health care system and the implications of high risk prescribing will also be investigated. In addition, changes in high risk prescribing will be evaluated in relation to certain routine medicines-related interventions. The statistical analysis will be conducted using standard pharmaco-epidemiological methods, including descriptive analysis and univariate and multivariate regression analysis, controlling for relevant confounding factors using a number of different approaches. Discussion: The availability of large-scale data is useful to identify opportunities for improving prescribing and health in older adults. The size of the 45 and Up Study, along with its linkage to health databases, provides an important opportunity to investigate the relationship between high risk prescribing and adverse outcomes in a real-world population of older adults. © 2013 Gnjidic et al.; licensee BioMed Central Ltd.

    Identification of furfural resistant strains of Saccharomyces cerevisiae and Saccharomyces paradoxus from a collection of environmental and industrial isolates

    Background: Fermentation of bioethanol using lignocellulosic biomass as a raw material provides a sustainable alternative to current biofuel production methods by utilising waste food streams as raw material. Before lignocellulose can be fermented it requires physical, chemical and enzymatic treatment in order to release monosaccharides, a process that causes the chemical transformation of glucose and xylose into the cyclic aldehydes furfural and hydroxyfurfural. These furan compounds are potent inhibitors of Saccharomyces fermentation, and consequently furfural-tolerant strains of Saccharomyces are required for lignocellulosic fermentation. Results: This study investigated yeast tolerance to furfural and hydroxyfurfural using a collection of 71 environmental and industrial isolates of the baker’s yeast Saccharomyces cerevisiae and its closest relative, Saccharomyces paradoxus. The Saccharomyces strains were initially screened for growth on media containing 100 mM glucose and 1.5 mg ml-1 furfural. Five strains were identified that showed significant tolerance to growth in the presence of furfural, and these were then screened for growth and ethanol production in the presence of increasing amounts (0.1-4 mg ml-1) of furfural. Conclusions: Of the five furfural-tolerant strains, S. cerevisiae NCYC 3451 displayed the greatest furfural resistance and was able to grow in the presence of up to 3.0 mg ml-1 furfural. Furthermore, ethanol production in this strain did not appear to be inhibited by furfural, with the highest ethanol yield observed at 3.0 mg ml-1 furfural. Although furfural resistance was not found to be a trait specific to any one particular lineage or population, three of the strains were isolated from environments where they might be continually exposed to low levels of furfural through the on-going natural degradation of lignocelluloses, and may therefore have developed elevated levels of resistance to these furan compounds. Thus, these strains represent good candidates for future studies of genetic variation relevant to understanding and manipulating furfural resistance, and for the development of tolerant ethanologenic yeast strains for use in bioethanol production from lignocellulose processing.

    Legitimacy of medicines funding in the era of accelerated access

    Objectives: In recent years, numerous frameworks have been developed to enhance the legitimacy of health technology assessment processes. Despite efforts to implement these “legitimacy frameworks”, medicines funding decisions can still be perceived as lacking in legitimacy. We therefore sought to examine stakeholder views on the factors that they think should be considered when making decisions about the funding of high-cost breast cancer therapies, focusing on those that are not included in current frameworks and processes. Methods: We analyzed published discourse on the funding of high-cost breast cancer therapies. Relevant materials were identified by searching the databases Google, Google Scholar and Factiva in August 2014 and July 2016, and these were analyzed thematically. Results: We analyzed 50 published materials and found that stakeholders, for the most part, want to be able to access medicines more quickly and at the same time as other patients, and want decision-makers to be more flexible with regard to evidence requirements and to use a wider range of criteria when evaluating therapies. Many also advocated for existing processes to be accelerated or bypassed in order to improve access to therapies. Conclusions: Our results illustrate that a stakeholder-derived conceptualization of legitimacy emphasizes principles of accelerated access and is not fully accounted for by existing frameworks and processes aimed at promoting legitimacy. However, further research examining the ethical, political and clinical implications of the stakeholder claims raised here is needed before firm policy recommendations can be made. Keywords: pharmaceutical funding decisions; resource allocation; stakeholder engagement; breast cancer; accelerated access.

    Very Cold Gas and Dark Matter

    We have recently proposed a new candidate for baryonic dark matter: very cold molecular gas, in near-isothermal equilibrium with the cosmic background radiation at 2.73 K. The cold gas, of quasi-primordial abundances, is condensed in a fractal structure, resembling the hierarchical structure of the detected interstellar medium. We present some perspectives on detecting this very cold gas, either directly or indirectly. The H2 molecule has an "ultrafine" structure, due to the interaction between the rotation-induced magnetic moment and the nuclear spins, but the lines fall in the km domain and are very weak. The best opportunity might be the UV absorption of H2 in front of quasars. The unexpected cold dust component, revealed by the COBE/FIRAS submillimetric results, could also be due to this very cold H2 gas, through collision-induced radiation, or solid H2 grains or snowflakes. The gamma-ray distribution, much more radially extended than the supernovae at the origin of cosmic-ray acceleration, also points towards an extended gas distribution. Comment: 16 pages, LaTeX, crckapb macro, 3 postscript figures, uuencoded compressed tar file. To be published in the proceedings of the "Dust-Morphology" conference, Johannesburg, 22-26 January 1996, D. Block (ed.), Kluwer, Dordrecht.

    The CACCC-binding protein KLF3/BKLF represses a subset of KLF1/EKLF target genes and is required for proper erythroid maturation in vivo

    The CACCC-box binding protein erythroid Kruppel-like factor (EKLF/KLF1) is a master regulator that directs the expression of many important erythroid genes. We have previously shown that EKLF drives transcription of the gene for a second KLF, basic Kruppel-like factor, or KLF3. We have now tested the in vivo role of KLF3 in erythroid cells by examining Klf3 knockout mice. KLF3-deficient adults exhibit a mild compensated anemia, including enlarged spleens, increased red pulp, and a higher percentage of erythroid progenitors, together with elevated reticulocytes and abnormal erythrocytes in the peripheral blood. Impaired erythroid maturation is also observed in the fetal liver. We have found that KLF3 levels rise as erythroid cells mature to become TER119(+). Consistent with this, microarray analysis of both TER119(-) and TER119(+) erythroid populations revealed that KLF3 is most critical at the later stages of erythroid maturation and is indeed primarily a transcriptional repressor. Notably, many of the genes repressed by KLF3 are also known to be activated by EKLF. However, the majority of these are not currently recognized as erythroid-cell-specific genes. These results reveal the molecular and physiological function of KLF3, defining it as a feedback repressor that counters the activity of EKLF at selected target genes to achieve normal erythropoiesis.

    The politics of the teaching of reading

    Historically, political debates have broken out over how to teach reading in primary schools and infant classrooms. These debates and “reading wars” have often resulted from public concerns and media reportage of a fall in reading standards. They also reflect the importance placed on learning to read by parents, teachers, employers, and politicians. Public and media-driven controversies over the teaching of reading have resulted in intense public and professional debates over which specific methods and materials to use with beginning readers and with children who have reading difficulties. Recently, such debates have led to a renewed emphasis on reading proficiency and “standardized” approaches to teaching reading and engaging with literacy. The universal acceptance of the importance of learning to read has also led to vested interests in specific methods, reading programmes, and early literacy assessments amongst professional, business, commercial, and parental lobbying groups. This article traces these debates and the resulting growing support for a quantitative reductionist approach to early-reading programmes.