
    Uniform sampling of steady states in metabolic networks: heterogeneous scales and rounding

    The uniform sampling of convex polytopes is an interesting computational problem with many applications in inference from linear constraints, but the performance of sampling algorithms can be affected by ill-conditioning. This is the case when inferring the feasible steady states in models of metabolic networks, since these can show heterogeneous time scales. In this work we focus on rounding procedures based on building an ellipsoid that closely matches the sampling space, which can be used to define an efficient hit-and-run (HR) Markov chain Monte Carlo sampler. In this way the uniformity of the sampling of the convex space of interest is rigorously guaranteed, in contrast to non-Markovian methods. We analyze and compare three rounding methods for sampling the feasible steady states of metabolic networks in three models of growing size, up to genome scale. The first is based on principal component analysis (PCA), the second on linear programming (LP), and the third on the Lovász ellipsoid method (LEM). Our results show that a rounding procedure is mandatory for applying HR to these inference problems and suggest that a combination of LEM or LP with a subsequent PCA performs best. We finally compare the distributions obtained with HR to those of two heuristics based on artificially centered hit-and-run (ACHR), gpSampler and optGpSampler. They show good agreement with the HR results for the small network, while on genome-scale models they present inconsistencies.
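    As a rough, self-contained illustration of the rounding-plus-HR idea sketched above (not the authors' implementation; the PCA-style rounding from warm-up samples is only a stand-in for the PCA/LP/LEM procedures compared in the paper, and the toy polytope is a unit box rather than a metabolic flux space):

    ```python
    import numpy as np

    def hit_and_run(A, b, x0, n_samples, rng):
        """Uniform hit-and-run sampling of the polytope {x : A x <= b}."""
        x, samples = x0.copy(), []
        for _ in range(n_samples):
            d = rng.normal(size=x.size)
            d /= np.linalg.norm(d)
            Ad, slack = A @ d, b - A @ x      # feasibility of x + t*d: t * (A d) <= slack
            t_hi = np.min(slack[Ad > 0] / Ad[Ad > 0])
            t_lo = np.max(slack[Ad < 0] / Ad[Ad < 0])
            x = x + rng.uniform(t_lo, t_hi) * d
            samples.append(x.copy())
        return np.array(samples)

    def pca_rounding(A, b, warmup):
        """Build a rounding map x = mu + L y from the covariance of warm-up samples."""
        mu = warmup.mean(axis=0)
        L = np.linalg.cholesky(np.cov(warmup.T) + 1e-10 * np.eye(A.shape[1]))
        return A @ L, b - A @ mu, mu, L       # constraints rewritten in rounded coordinates y

    rng = np.random.default_rng(0)
    A = np.vstack([np.eye(3), -np.eye(3)])    # toy polytope: the unit box [0, 1]^3
    b = np.concatenate([np.ones(3), np.zeros(3)])
    warm = hit_and_run(A, b, np.full(3, 0.5), 500, rng)
    Ar, br, mu, L = pca_rounding(A, b, warm)
    ys = hit_and_run(Ar, br, np.zeros(3), 2000, rng)
    xs = mu + ys @ L.T                        # map the rounded samples back
    ```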

    Evidence-based robust design of deflection actions for near Earth objects

    This paper presents a novel approach to the robust design of deflection actions for Near Earth Objects (NEO). In particular, the case of deflection by means of solar-pumped laser ablation is studied here in detail. The basic idea behind laser ablation is to induce sublimation of the NEO surface, which produces a low thrust that slowly deviates the asteroid from its initial Earth-threatening trajectory. This work investigates the integrated design of the space-based laser system and the deflection action generated by laser ablation under uncertainty. The integrated design is formulated as a multi-objective optimisation problem in which the deviation is maximised and the total system mass is minimised. Both the model for the estimation of the thrust produced by surface laser ablation and the spacecraft system model are assumed to be affected by epistemic uncertainties (partial or complete lack of knowledge). Evidence Theory is used to quantify these uncertainties and introduce them into the optimisation process. The propagation of the trajectory of the NEO under the laser-ablation action is performed with a novel approach based on an approximated analytical solution of Gauss' Variational Equations. An example of the design of the deflection of asteroid Apophis with a swarm of spacecraft is presented.
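    For context, the propagation builds on Gauss' Variational Equations; the standard equation for the semi-major axis under a small perturbing acceleration with radial component a_r and transverse component a_t (here produced by the ablation thrust) reads

    \[
    \frac{\mathrm{d}a}{\mathrm{d}t} \;=\; \frac{2}{n\sqrt{1-e^{2}}}\left( e\sin\nu \; a_{r} \;+\; \frac{p}{r}\, a_{t} \right),
    \qquad p = a\,(1-e^{2}),
    \]

    with n the mean motion, e the eccentricity, \nu the true anomaly and r the orbital radius; the approximated analytical solution of this system developed in the paper is not reproduced here.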

    Use of orthogonal arrays, quasi-Monte Carlo sampling and kriging response models for reservoir simulation with many varying factors

    Asset development teams may adjust simulation model parameters using experimental design to reveal which factors have the greatest impact on reservoir performance. Response surfaces and experimental design make sensitivity analysis less expensive and more accurate, helping to optimize recovery under geological and economic uncertainties. In this thesis, experimental designs including orthogonal arrays, factorial designs, Latin hypercubes and Hammersley sequences are compared and analyzed. These methods are demonstrated for a gas well with a water-coning problem to illustrate the efficiency of orthogonal arrays. Eleven geologic factors are varied while optimizing three engineering factors (fourteen factors in total). The objective is to optimize completion length, tubing head pressure, and tubing diameter for a partially penetrating well with uncertain reservoir properties. A nearly orthogonal array was specified with three levels for eight factors and four levels for the remaining six geologic and engineering factors. This design requires only 36 simulations, compared to 26,873,856 runs for a full factorial design. Hyperkriging surfaces are an alternative model form for problems with many factors; hyperkriging uses the maximum-likelihood variogram model parameters to minimize prediction errors. Kriging is compared to conventional polynomial response models, and the robustness of the response surfaces generated by kriging and polynomial regression is compared using jackknifing and bootstrapping. Sensitivity and uncertainty analyses can be performed inexpensively and efficiently using response surfaces. The proposed design approach requires fewer simulations and provides accurate response models, efficient optimization, and flexible sensitivity and uncertainty assessment.
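    A hedged sketch of the kriging-versus-polynomial comparison, assuming scipy and scikit-learn, with a Latin hypercube design and a toy response function standing in for the reservoir simulator (all names and sizes are illustrative, not the thesis' nearly orthogonal 14-factor array):

    ```python
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    def toy_response(X):                      # stand-in for the reservoir simulator
        return X[:, 0] ** 2 + np.sin(3 * X[:, 1]) + X[:, 2] * X[:, 3]

    n_factors, n_runs = 5, 40
    X = qmc.LatinHypercube(d=n_factors, seed=1).random(n_runs)   # space-filling design
    y = toy_response(X)

    kriging = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)
    poly = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

    X_test = qmc.LatinHypercube(d=n_factors, seed=2).random(200)
    y_test = toy_response(X_test)
    for name, model in [("kriging", kriging), ("quadratic polynomial", poly)]:
        rmse = np.sqrt(np.mean((model.predict(X_test).ravel() - y_test) ** 2))
        print(f"{name:>20s} test RMSE: {rmse:.3f}")
    ```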

    Bounding Embeddings of VC Classes into Maximum Classes

    One of the earliest conjectures in computational learning theory, the Sample Compression conjecture, asserts that concept classes (equivalently, set systems) admit compression schemes of size linear in their VC dimension. To date this statement is known to be true for maximum classes, those that possess maximum cardinality for their VC dimension. The most promising approach to positively resolving the conjecture is to embed general VC classes into maximum classes without a super-linear increase to their VC dimensions, as such embeddings would extend the known compression schemes to all VC classes. We show that maximum classes can be characterised by a local-connectivity property of the graph obtained by viewing the class as a cubical complex. This geometric characterisation of maximum VC classes is applied to prove a negative embedding result: there are VC-d classes that cannot be embedded in any maximum class of VC dimension lower than 2d. On the other hand, we show that every VC-d class C embeds in a VC-(d+D) maximum class, where D is the deficiency of C, i.e., the difference between the cardinalities of a maximum VC-d class and of C. For VC-2 classes in binary n-cubes for 4 <= n <= 6, we give best possible results on embedding into maximum classes. For some special classes of Boolean functions, relationships with maximum classes are investigated. Finally, we give a general recursive procedure for embedding VC-d classes into VC-(d+k) maximum classes for the smallest k.
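    As a brief reminder of the standard definitions used above (not a result of the paper): a class C over an n-element domain with VC dimension d obeys the Sauer-Shelah bound, maximum classes are exactly those attaining it with equality, and the deficiency D is the gap to that bound,

    \[
    |C| \;\le\; \Phi_d(n) \;=\; \sum_{i=0}^{d}\binom{n}{i},
    \qquad
    D \;=\; \Phi_d(n) - |C| ,
    \]

    so D = 0 holds precisely for maximum classes, and the embedding result above trades this deficiency for an increase of the VC dimension from d to d + D.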

    A hyperspectral imaging system for mapping haemoglobin and cytochrome-c-oxidase concentration changes in the exposed cerebral cortex

    We present a novel hyperspectral imaging (HSI) system using visible and near-infrared (NIR) light on the exposed cerebral cortex of animals to monitor and quantify in vivo changes in the oxygenation of haemoglobin and in cellular metabolism via measurement of the redox state of cytochrome-c-oxidase (CCO). The system, named hNIR, is based on spectral scanning illumination at 11 bands (600, 630, 665, 784, 800, 818, 835, 851, 868, 881 and 894 nm), using a supercontinuum laser coupled with a rotating Pellin-Broca prism. Image reconstruction is performed with the aid of a Monte Carlo framework for photon pathlength estimation and post-processing correction of partial volume effects. The system is validated on liquid optical phantoms mimicking brain tissue haemodynamics and metabolism, and finally applied in vivo on the exposed cortex of mice undergoing alternating oxygenation challenges. The results of the study demonstrate the capacity of hNIR to map and quantify the haemodynamic and metabolic states of the exposed cortex at the microvascular level. This represents (to the best of our knowledge) the first example of simultaneous mapping and quantification of cerebral haemoglobin and CCO in vivo using visible and NIR HSI, which can potentially become a powerful tool for better understanding brain physiology.
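    For orientation, systems of this kind typically recover concentration changes by inverting the modified Beer-Lambert law over the measured bands, with the wavelength-dependent mean photon pathlength supplied here by the Monte Carlo framework; the generic relation (not necessarily the exact hNIR pipeline) is

    \[
    \Delta A(\lambda_j) \;=\; \sum_{i\in\{\mathrm{HbO_2},\,\mathrm{HHb},\,\mathrm{oxCCO}\}} \varepsilon_i(\lambda_j)\,\langle L(\lambda_j)\rangle\,\Delta c_i ,
    \qquad
    \Delta\mathbf{c} \;=\; \bigl(E^{\mathsf T}E\bigr)^{-1}E^{\mathsf T}\,\Delta\mathbf{A},
    \]

    where E_{ji} = \varepsilon_i(\lambda_j)\,\langle L(\lambda_j)\rangle is assembled over the 11 bands and \Delta\mathbf{c} collects the changes in oxy- and deoxy-haemoglobin and oxidised CCO.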

    ADVANTEX: Research of innovative tools to support the logistics of the use of excavation materials produced by the Lyon-Turin railway line for the best sustainability and circular economy of the process

    The Mont-Cenis Base Tunnel is the key work of the new Lyon-Turin railway line. The project envisages a total volume of 37.2 million tons of excavated material over a period of 10 years: a considerable part of the excavated material will be used for the tunnel lining (concrete or railway embankments) and for the embankments of the open-air sectors, while the remaining part will be transported by rail, conveyor belts, and heavy vehicles to the temporary and permanent storage sites. To maximize the circular economy and the efficiency of the materials logistics, TELT is working with the Politecnico di Torino (Department of Environment, Land, and Infrastructure Engineering; Department of Structural, Geotechnical and Building Engineering; and Department of Applied Science and Technology) and the Interdepartmental Laboratory SISCON - Safety of Infrastructures and Constructions to study innovative solutions for the characterization and reuse of the excavated materials. Given that the materials are substantially undifferentiated during excavation, and that geological classification requires long and complex additional verification activities which can negatively affect the process, a significant sample of materials excavated at the survey tunnel of La Maddalena (the site where the base tunnel will be excavated) was analyzed. The objective of this first phase is to identify new technologies and processes for the early characterization of the excavated material in order to determine its intended use; to design green concretes (defining their sustainability and mechanical characteristics for structural use through synthetic parameters, including durability analysis) and backfilling; and to seek innovative tools for optimal logistics, so as to “industrialize” the identification process and to select optimal technologies for automatic process control and traceability, giving strength and speed to all activities. The subject of this work is the results of the early-characterization experimentation process, with the application of artificial intelligence, and possible innovative circular solutions.

    Visible Near-Infrared Hyperspectral Imaging for the Identification and Discrimination of Brown Blotch Disease on Mushroom (Agaricus bisporus) Caps

    Brown blotch, caused by the pathogen Pseudomonas tolaasii (P. tolaasii), is the most problematic bacterial disease of Agaricus bisporus mushrooms. Although it does not cause any health problems, it reduces the consumer appeal of mushrooms in the marketplace, generating significant economic losses worldwide. Hyperspectral imaging (HSI) is a non-destructive technique that combines imaging and spectroscopy to obtain information from a sample. The objective of this study was to investigate the use of HSI for brown blotch identification and its discrimination from mechanical damage on mushrooms. Hyperspectral images of mushrooms subjected to (i) no treatment, (ii) mechanical damage or (iii) microbiological spoilage were taken during storage, and spectra representing each of the classes were selected. Partial least squares discriminant analysis (PLS-DA) was carried out in two steps: (i) discrimination between undamaged and damaged mushrooms and (ii) discrimination between damage sources (i.e. mechanical or microbiological). The models were applied at a pixel level, and a decision tree was used to classify mushrooms into one of the aforementioned classes. A correct classification of >95% was achieved. Results from this study could be used to develop a sensor for detecting and classifying mushroom damage of mechanical and microbial origin, which would enable the industry to make rapid and automated decisions to discard produce of poor marketability.
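    A minimal sketch of the two-stage pixel-level PLS-DA cascade described above, assuming scikit-learn and treating PLS-DA as PLS regression on a 0/1 dummy response with a 0.5 threshold; the number of latent variables is illustrative and the final mushroom-level decision tree is omitted:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def fit_plsda(spectra, labels, n_components=10):
        """PLS-DA via PLS regression on a 0/1 dummy response (a common formulation)."""
        return PLSRegression(n_components=n_components).fit(spectra, labels.astype(float))

    def predict_plsda(model, spectra, threshold=0.5):
        return (model.predict(spectra).ravel() > threshold).astype(int)

    def classify_pixels(spectra, stage1, stage2):
        """Stage 1: undamaged vs damaged; stage 2: mechanical vs microbiological."""
        damaged = predict_plsda(stage1, spectra)     # 0 = undamaged, 1 = damaged
        labels = np.zeros(len(spectra), dtype=int)   # 0 undamaged, 1 mechanical, 2 microbial
        if damaged.any():
            source = predict_plsda(stage2, spectra[damaged == 1])
            labels[damaged == 1] = 1 + source
        return labels

    # Hypothetical usage with training spectra and labels:
    # stage1 = fit_plsda(train_spectra, is_damaged)
    # stage2 = fit_plsda(train_spectra[is_damaged == 1], is_microbial[is_damaged == 1])
    # pixel_labels = classify_pixels(image_pixels, stage1, stage2)
    ```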