Tools for reformulating logical forms into zero-one mixed integer programs (MIPS)
A systematic procedure is presented for transforming a set of logical statements or logical conditions imposed on a model into an Integer Linear Programming (ILP) or Mixed Integer Programming (MIP) formulation. A reformulation procedure that uses the extended reverse polish representation of a compound logical form is then described. A prototype user interface, by which logical forms can be reformulated and the corresponding MIP constructed and analysed within an existing Mathematical Programming modelling system, is illustrated. Finally, the steps to formulate a discrete optimisation model in this way are demonstrated by means of an example.
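The paper's reformulation tool is not reproduced here, but the kind of transformation it automates can be illustrated with a minimal sketch: standard 0-1 encodings of two common logical forms, verified by enumeration. The function names are illustrative, not from the paper.

```python
# Illustrative sketch (not the paper's tool): encoding two common logical
# forms over 0-1 decision variables d1, d2 as linear integer constraints.
#   d1 OR d2       ->  d1 + d2 >= 1
#   d1 IMPLIES d2  ->  d1 - d2 <= 0
from itertools import product

def or_constraint(d1, d2):
    return d1 + d2 >= 1

def implies_constraint(d1, d2):
    return d1 - d2 <= 0

# Verify by enumeration that each linear constraint admits exactly the
# same 0/1 assignments as the logical form it encodes.
for d1, d2 in product((0, 1), repeat=2):
    assert or_constraint(d1, d2) == bool(d1 or d2)
    assert implies_constraint(d1, d2) == ((not d1) or bool(d2))
```

The paper's procedure generalises this idea to arbitrary compound logical forms via their reverse polish representation.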
Introducing new constructs for data modelling and column generation in LP modelling languages
Through the popular implementation of structured query language (SQL) and query-by-example (QBE), relational databases have become the de facto industry standard for data modelling. We consider the indices, sets, and the declarative form of Linear Programming (LP) modelling languages and introduce new constructs which provide a direct link to database systems. The models constructed in this way are data driven and display a dynamic structure. We then show how this approach can be naturally extended to include column generation features stated in procedural form within an otherwise declarative modelling paradigm.
Sets and indices in linear programming modelling and their integration with relational data models
LP models are usually constructed using index sets and data tables which are closely related to the attributes and relations of relational database (RDB) systems. We extend the syntax of MPL, an existing LP modelling language, in order to connect it to a given RDB system. This approach reuses existing modelling and database software, provides a rich modelling environment and achieves model and data independence. This integrated software enables Mathematical Programming to be widely used as a decision support tool by unlocking the data residing in corporate databases.
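The MPL extension itself is not shown here; the correspondence between RDB attributes and LP index sets can be sketched with an assumed schema, where both the index set and the data table of a model are populated directly from a relational table, so the model structure follows the data.

```python
# A minimal sketch (assumed schema, not MPL syntax): driving an LP model's
# index set and cost data directly from a relational table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT PRIMARY KEY, cost REAL)")
conn.executemany("INSERT INTO products VALUES (?, ?)",
                 [("widget", 2.5), ("gadget", 4.0)])

# The index set PRODUCTS and the data table cost[] are derived from the
# database contents: add a row and the model gains a variable.
PRODUCTS = [row[0] for row in conn.execute("SELECT name FROM products")]
cost = {name: c for name, c in conn.execute("SELECT name, cost FROM products")}
```

This is the sense in which such models are "data driven": the database, not the model text, determines the dimensions of the LP.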
Point excess solute: A new metric for quantifying solute segregation in atom probe tomography datasets including application to naturally aged solute clusters in Al-Mg-Si-(Cu) alloys
Data availability: The raw and processed data required to reproduce these findings cannot be shared at this time due to legal reasons. Supplementary data are available online at: https://www.sciencedirect.com/science/article/pii/S1044580323007611#s0060 . Copyright © 2023 The Authors.
Accurate, repeatable and quantitative analysis of nanoscale solute clustering in atom probe tomography (APT) datasets is a complex challenge which is made more difficult by the positional uncertainty and lack of absolute resolution inherent to the technique. In this work a new method, the point excess solute, is introduced for quantifying solute segregation in datasets with limited spatial resolution. This new method is based on measuring the matrix concentration using a dataset sampling method. We show that the new method can accurately reproduce the values expected from synthetic datasets a priori, and that when the dataset spatial resolution and/or phase contrast is too low for accurate quantification, this is observable. The method is then applied to naturally aged solute clusters in the Al-Mg-Si-Cu system. Datasets were collected with a range of natural ageing times from 8 min to 76 weeks. The formation of the solute clusters is shown to be unaffected by the Cu content of the alloy.
The authors would like to thank Constellium for providing the materials and financially supporting this research, alongside the Engineering and Physical Science Research Council (EPSRC) through studentship 1922133. The authors are grateful to EPSRC for funding of the LEAP 5000XR for the UK National Atom Probe Facility through grant EP/M022803/1.
Discrete and fuzzy dynamical genetic programming in the XCSF learning classifier system
A number of representation schemes have been presented for use within learning classifier systems, ranging from binary encodings to neural networks. This paper presents results from an investigation into using discrete and fuzzy dynamical system representations within the XCSF learning classifier system. In particular, asynchronous random Boolean networks are used to represent the traditional condition-action production system rules in the discrete case, and asynchronous fuzzy logic networks in the continuous-valued case. It is shown to be possible to use self-adaptive, open-ended evolution to design an ensemble of such dynamical systems within XCSF to solve a number of well-known test problems.
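The asynchronous random Boolean network representation can be sketched as follows. This is a toy illustration, not the paper's implementation: the network size, connectivity K=2, and update scheme are assumed for the example.

```python
# A toy sketch of an asynchronous random Boolean network: each of N nodes
# has K randomly chosen input nodes and a random Boolean lookup table;
# asynchronous dynamics update one randomly selected node per step.
import random

random.seed(0)
N, K = 5, 2
inputs = [random.sample(range(N), K) for _ in range(N)]            # wiring
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    """Asynchronous update: recompute a single randomly chosen node."""
    i = random.randrange(N)
    idx = sum(state[j] << b for b, j in enumerate(inputs[i]))
    new = list(state)
    new[i] = tables[i][idx]
    return new

state = [random.randint(0, 1) for _ in range(N)]
for _ in range(20):
    state = step(state)
```

In XCSF such networks stand in for condition-action rules, with the wiring and tables subject to the self-adaptive evolutionary search described in the abstract.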
Metagenomic ene-reductases for the bioreduction of sterically challenging enones
Ene-reductases (ERs) of the Old Yellow Enzyme family catalyse asymmetric reduction of activated alkenes, providing chiral products. They have become an important method in the synthetic chemist's toolbox, offering a sustainable alternative to metal-catalysed asymmetric reduction. Development of new biocatalytic alkene reduction routes, however, needs easy access to novel biocatalysts. A sequence-based functional metagenomic approach was used to identify novel ERs from a drain metagenome. Of the ten putative ER enzymes initially identified, eight exhibited activities towards widely accepted mono-cyclic substrates, with several of the ERs giving high reaction yields and stereoselectivities. Two highly performing enzymes that displayed excellent co-solvent tolerance were used for the stereoselective reduction of sterically challenging bicyclic enones, where the reactions proceeded in high yields, which is unprecedented to date with wild-type ERs. On a preparative enzymatic scale, reductions of Hajos–Parrish and Wieland–Miescher derivatives and a tricyclic ketone proceeded with good to excellent yields.
Ozone depletion events observed in the high latitude surface layer during the TOPSE aircraft program
During the Tropospheric Ozone Production about the Spring Equinox (TOPSE) aircraft program, ozone depletion events (ODEs) in the high latitude surface layer were investigated using lidar and in situ instruments. Flight legs of 100 km or longer distance were flown 32 times at 30 m altitude over a variety of regions north of 58° between early February and late May 2000. ODEs were found on each flight over the Arctic Ocean but their occurrence was rare at more southern latitudes. However, large area events, with depletion to over 2 km altitude in one case, were found as far south as Baffin Bay and Hudson Bay and as late as 22 May. There is good evidence that these more southern events did not form in situ but were the result of export of ozone-depleted air from the surface layer of the Arctic Ocean. Surprisingly, relatively intact transport of ODEs occurred over distances of 900–2000 km and in some cases over rough terrain. Accumulation of constituents in the frozen surface over the dark winter period cannot be a strong prerequisite of ozone depletion, since latitudes south of the Arctic Ocean would also experience a long dark period. Some process unique to the Arctic Ocean surface or its coastal regions remains unidentified for the release of ozone-depleting halogens. There was no correspondence between coarse surface features such as solid ice/snow, open leads, or polynyas and the occurrence or intensity of ozone depletion over the Arctic or subarctic regions. Depletion events also occurred in the absence of long-range transport of relatively fresh "pollution" within the high latitude surface layer, at least in spring 2000. Direct measurements of halogen radicals were not made. However, the flights do provide detailed information on the vertical structure of the surface layer and, during the constant 30 m altitude legs, measurements of a variety of constituents including hydroxyl and peroxy radicals. A summary of the behavior of these constituents is made.
The measurements were consistent with a source of formaldehyde from the snow/ice surface. Median NOx in the surface layer was 15 pptv or less, suggesting that surface emissions were substantially converted to reservoir constituents by 30 m altitude and that ozone production rates were small (0.15–1.5 ppbv/d) at this altitude. Peroxyacetylnitrate (PAN) was by far the major constituent of NOy in the surface layer, independent of the ozone mixing ratio.
Neuroimaging of tissue microstructure as a marker of neurodegeneration in the AT(N) framework: defining abnormal neurodegeneration and improving prediction of clinical status
Background: Alzheimer's disease involves accumulating amyloid (A) and tau (T) pathology, and progressive neurodegeneration (N), leading to the development of the AD clinical syndrome. While several markers of N have been proposed, efforts to define normal vs. abnormal neurodegeneration based on neuroimaging have been limited. Sensitive markers that may account for or predict cognitive dysfunction for individuals in early disease stages are critical. Methods: Participants (n = 296) defined on A and T status and spanning the AD-clinical continuum underwent multi-shell diffusion-weighted magnetic resonance imaging to generate Neurite Orientation Dispersion and Density Imaging (NODDI) metrics, which were tested as markers of N. To better define N, we developed age- and sex-adjusted robust z-score values to quantify normal and AD-associated (abnormal) neurodegeneration in both cortical gray matter and subcortical white matter regions of interest. We used general logistic regression with receiver operating characteristic (ROC) and area under the curve (AUC) analysis to test whether NODDI metrics improved diagnostic accuracy compared to models that only relied on cerebrospinal fluid (CSF) A and T status (alone and in combination). Results: Using internal robust norms, we found that NODDI metrics correlate with worsening cognitive status and that NODDI captures early, AD neurodegenerative pathology in the gray matter of cognitively unimpaired, but A/T biomarker-positive, individuals. NODDI metrics utilized together with A and T status improved diagnostic prediction accuracy of AD clinical status, compared with models using CSF A and T status alone. Conclusion: Using a robust norms approach, we show that abnormal AD-related neurodegeneration can be detected among cognitively unimpaired individuals. Metrics derived from diffusion-weighted imaging are potential sensitive markers of N and could be considered for trial enrichment and as outcomes in clinical trials. However, given the small sample sizes, the exploratory nature of the work must be acknowledged.
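The robust z-score idea can be sketched in simplified form. This assumes a median/MAD-based standardisation against a normative reference sample; the study's additional age- and sex-adjustment is omitted, and the numeric values below are invented for illustration.

```python
# A simplified sketch of a robust z-score (assumed formula: median/MAD
# standardisation against a normative sample; the study's age- and
# sex-adjustment is omitted here).
import statistics

def robust_z(value, reference):
    """Standardise `value` against `reference` using median and MAD."""
    med = statistics.median(reference)
    mad = statistics.median(abs(x - med) for x in reference)
    # 1.4826 scales the MAD to the standard deviation under normality.
    return (value - med) / (1.4826 * mad)

# Invented normative values for a hypothetical NODDI metric.
norms = [0.42, 0.45, 0.44, 0.43, 0.46, 0.44, 0.45]
z = robust_z(0.39, norms)   # a markedly low value relative to the norms
```

Median and MAD are preferred over mean and standard deviation here because they are insensitive to outliers in the normative sample, which matters when "normal" must be defined from imperfect clinical cohorts.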
Process evaluation of appreciative inquiry to translate pain management evidence into pediatric nursing practice
Background
Appreciative inquiry (AI) is an innovative knowledge translation (KT) intervention that is compatible with the Promoting Action on Research in Health Services (PARiHS) framework. This study explored the innovative use of AI as a theoretically based KT intervention applied to a clinical issue in an inpatient pediatric care setting. The implementation of AI was explored in terms of its acceptability, fidelity, and feasibility as a KT intervention in pain management.
Methods
A mixed-methods case study design was used. The case was a surgical unit in a pediatric academic-affiliated hospital. The sample consisted of nurses in leadership positions and staff nurses interested in the study. Data on the AI intervention implementation were collected by digitally recording the AI sessions, maintaining logs, and conducting individual semistructured interviews. Data were analysed using qualitative and quantitative content analyses and descriptive statistics. Findings were triangulated in the discussion.
Results
Three nurse leaders and nine staff members participated in the study. Participants were generally satisfied with the intervention, which consisted of four 3-hour, interactive AI sessions delivered over two weeks to promote change based on positive examples of pain management in the unit and staff implementation of an action plan. The AI sessions were delivered with high fidelity and 11 of 12 participants attended all four sessions, where they developed an action plan to enhance evidence-based pain assessment documentation. Participants labeled AI a 'refreshing approach to change' because it was positive, democratic, and built on existing practices. Several barriers affected their implementation of the action plan, including a context of change overload, logistics, busyness, and a lack of organised follow-up.
Conclusions
Results of this case study supported the acceptability, fidelity, and feasibility of AI as a KT intervention in pain management. The AI intervention requires minor refinements (e.g., incorporating continued follow-up meetings) to enhance its clinical utility and sustainability. The implementation process and effectiveness of the modified AI intervention require evaluation in a larger multisite study.