
    Improved comprehensibility and reliability of explanations via restricted halfspace discretization

    A number of two-class classification methods first discretize each attribute of two given training sets and then construct a propositional DNF formula that evaluates to True for one of the two discretized training sets and to False for the other. The formula is not just a classification tool but constitutes a useful explanation of the differences between the two underlying populations, provided it can be comprehended by humans and is reliable. This paper shows that both the comprehensibility and the reliability of the formulas can sometimes be improved using a discretization scheme in which linear combinations of a small number of attributes are discretized.
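The halfspace idea can be illustrated with a toy sketch (the data, weights, and threshold below are hypothetical; the actual method chooses its cuts from the training sets):

```python
# Toy sketch of a "restricted halfspace" cut: instead of thresholding each
# attribute separately, a linear combination of a small number of attributes
# is thresholded, yielding one boolean literal for the DNF formula.

def halfspace_cut(weights, threshold):
    """Return a boolean feature: True iff sum_i w_i * x_i >= threshold."""
    return lambda x: sum(w * xi for w, xi in zip(weights, x)) >= threshold

# Two tiny training sets (class A and class B), two attributes each.
A = [(1.0, 2.0), (2.0, 3.0)]
B = [(3.0, 0.5), (4.0, 1.0)]

# One halfspace literal on the combination x2 - x1 (hypothetical choice);
# neither attribute alone separates the classes this cleanly.
lit = halfspace_cut((-1.0, 1.0), 0.5)   # True iff x2 - x1 >= 0.5

# A one-literal DNF that evaluates to True on A and False on B.
dnf = lambda x: lit(x)

assert all(dnf(x) for x in A)        # True on every point of A
assert not any(dnf(x) for x in B)    # False on every point of B
```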

    Q(sqrt(-3))-Integral Points on a Mordell Curve

    We use an extension of quadratic Chabauty to number fields, recently developed by the author with Balakrishnan, Besser and Müller, combined with a sieving technique, to determine the integral points over Q(√−3) on the Mordell curve y² = x³ − 4.
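Over the ordinary integers, the points on this curve can be recovered by a naive brute-force search (a sketch for orientation only; the paper's contribution is the far harder determination over Q(√−3), which no such search can settle):

```python
from math import isqrt

def integral_points(bound):
    """Brute-force the integral points on y^2 = x^3 - 4 with |x| <= bound."""
    pts = []
    for x in range(-bound, bound + 1):
        rhs = x**3 - 4
        if rhs < 0:
            continue
        y = isqrt(rhs)                 # integer square root
        if y * y == rhs:
            pts.extend([(x, y), (x, -y)] if y else [(x, 0)])
    return pts

pts = integral_points(100)
assert (2, 2) in pts and (5, 11) in pts   # the classical solutions over Z
```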

    Numerical Methods for Parameter Estimation in Dynamical Systems with Noise with Applications in Systems Biology

    This thesis comprises the modelling of, and parameter estimation in, dynamical systems, with a focus on applications in systems biology. In an interdisciplinary research project on the systems biology of cancer, we develop a predictive mathematical model of an intracellular crosstalk in cytokine signalling. Expected and unexpected predictions are confirmed in experiments and lead to new biological insights. For model calibration with measurement data, we apply well-established methods for parameter estimation in ordinary differential equation models. Extending these to stochastic differential equations, we develop, analyse, and implement a new method for parameter estimation in dynamical processes with noise, and demonstrate its performance in several selected examples from systems biology and mathematical finance. Many processes, especially in biology, obey deterministic ground rules (e.g. metabolic processes or signal transduction pathways), but may be heavily influenced by fluctuations and stochasticity inherent to the system that change its behaviour both qualitatively and quantitatively. Frequently, therefore, a deterministic description is not sufficient. A large class of such systems can be adequately described by nonlinear multi-dimensional stochastic differential equations (SDEs). Classical estimation techniques for SDEs, relying on (approximations of) transition densities, are all too often not applicable to these problems, due, inter alia, to their high computational costs and their prerequisites on the measurements. The proposed new method is based on the method of multiple shooting, using piecewise deterministic solutions of ordinary differential equations (ODEs) to approximate the SDE realization that corresponds to the studied process from which measurements have been taken.
    The generally discontinuous concatenation of ODE trajectories mimics the consequences of stochastic effects and, further, allows the parameter estimation problem to be formulated as a deterministic nonlinear optimization problem that can be solved with efficient derivative-based solution methods. In this thesis, a generalized GAUSS-NEWTON method is deployed. The main results and contributions of this thesis are summarized in the following:
    • We propose a new method for parameter estimation in nonlinear multi-dimensional SDEs, based on a piecewise approximation by solutions of ODEs. Discontinuities (jumps) occurring at the interval borders are used for regularization. Unknown parameters and initial states are estimated by a generalized weighted least squares method from data that can originate from direct complete or partial state measurements or from indirectly observed quantities. Measurement data may be afflicted with errors and arbitrarily sampled. Nonlinear parameter and point constraints may be formulated as equality and inequality constraints. The resulting nonlinear constrained optimization problems are highly structured and efficiently solved using a generalized GAUSS-NEWTON method.
    • We give a proof that the discontinuities at the interval borders asymptotically tend to zero as the number of equidistantly distributed shooting nodes goes to infinity.
    • We show in a numerical analysis that the resulting equation systems are sparse and that the number of nonzero elements depends only linearly on the number of shooting nodes, and we give sharp upper bounds. Moreover, we prove that the sparsity is maintained if an appropriate (stable) decomposition is applied.
    • It is demonstrated in comparative simulation studies that the estimates are robust w.r.t. the exact choice of jump regularization weights. Moreover, the effects of jump regularization on estimates and approximated trajectories are investigated and described.
    • A lifting approach with per-interval parameter sets, coupled by additional equality constraints, is developed and its numerical properties are analysed. Moreover, we propose a homotopy method for the treatment of hard problems.
    • We demonstrate the performance of the new estimation technique in examples from systems biology, each shedding light on different aspects. In particular, we show that the method can also be used for hidden state estimation and for trajectory reconstruction in time spans without observations. Further, we derive a criterion for local grid refinement.
    • We show, for an ORNSTEIN-UHLENBECK process driven by a LÉVY jump process, that, in addition to the mean reversion level and mean reversion rate, the diffusion constant may also be estimated by analysing the jump residuals.
    • The software package :sfit is an efficient implementation of the proposed method, offering easy symbolic problem formulation to the user, from which the stochastic parameter estimation problem can be automatically built and solved.
    Parts of this work emerged in the interdisciplinary research project SBCancer of the Helmholtz Alliance on Systems Biology. In close collaboration with expert biologists, we developed a mathematical model for a crosstalk of two cytokines in human skin cells that interfere in a signalling pathway frequently found aberrantly activated in cancer. After an extensive analysis of the deployed measurement data processing, the model proposed by the author of this thesis was calibrated from experimental data. Its counter-intuitive predictions have been verified in wet-lab experiments and lead to new biological insights. The main novelties and contributions in this thesis are:
    • Development of a mathematical model of the crosstalk of two cytokines in human keratinocytes (HaCaT cell line). The predicted and hitherto unknown nonlinear moderating effects of GM-CSF on the IL-6-induced JAK-STAT signalling pathway have been verified in vitro.
    • An extensive mathematical analysis of the frequently utilized quantitative WESTERN blotting measurement procedure shows that established data normalization methods, relying on housekeeping proteins or manually added calibrator proteins, are prone to signal-deteriorating statistical artefacts. Moreover, we show that the frequently declared assumption of normally distributed measurement errors cannot be maintained if these normalization techniques are applied.
    • As a remedy, we propose a normalization technique based on the calculation of amplification factors, and develop criteria for (approximately) normally distributed errors. These criteria can easily be checked using solely the raw measurement data. Moreover, we demonstrate the advantages of the proposed amplification-factors method in a large comparative simulation study.
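The multiple-shooting construction described above can be sketched in a few lines (a toy mean-reverting SDE with made-up parameters and noise levels; the thesis solves the structured problem with a generalized GAUSS-NEWTON method rather than the plain objective evaluation shown here):

```python
import math
import random

random.seed(0)

# Toy example: mean-reverting process dx = theta*(mu - x) dt + noise.
# Multiple shooting splits the time span into intervals; each interval is
# solved as a deterministic ODE from its own start value s_i, and the jumps
# s_{i+1} - x_i(t_{i+1}) between intervals enter the least-squares objective
# as a regularization term absorbing the stochastic effects.

def ode_solution(theta, mu, s, dt):
    """Exact ODE solution x(t0 + dt) of dx = theta*(mu - x) dt starting at s."""
    return mu + (s - mu) * math.exp(-theta * dt)

def objective(theta, mu, nodes, data, dt, jump_weight=1.0):
    """Sum of squared measurement residuals plus weighted squared jumps."""
    cost = 0.0
    for i, s in enumerate(nodes):
        cost += (s - data[i]) ** 2                            # fit at node i
        if i + 1 < len(nodes):
            end = ode_solution(theta, mu, s, dt)
            cost += jump_weight * (nodes[i + 1] - end) ** 2   # jump penalty
    return cost

# Simulate noisy measurements from the true parameters (Euler-Maruyama).
theta_true, mu_true, dt = 2.0, 1.0, 0.1
x, data = 0.0, []
for _ in range(50):
    data.append(x + random.gauss(0, 0.01))                    # measurement noise
    x += theta_true * (mu_true - x) * dt + random.gauss(0, 0.02)  # process noise

# With shooting nodes placed at the data, the objective is smaller at the
# true parameters than at a wrong guess.
good = objective(theta_true, mu_true, data, data, dt)
bad = objective(0.2, -1.0, data, data, dt)
assert good < bad
```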

    Fuelling the zero-emissions road freight of the future: routing of mobile fuellers

    The future of zero-emissions road freight is closely tied to the sufficient availability of new and clean fuel options such as electricity and hydrogen. In goods distribution using Electric Commercial Vehicles (ECVs) and Hydrogen Fuel Cell Vehicles (HFCVs), a major challenge in the transition period is their limited autonomy combined with scarce and unevenly distributed refuelling stations. One viable solution to facilitate and speed up the adoption of ECVs/HFCVs in logistics is to bring the fuel to the point where it is needed (instead of diverting the routes of delivery vehicles to refuelling stations) using "Mobile Fuellers (MFs)". These are mobile battery swapping/recharging vans or mobile hydrogen fuellers that can travel to a running ECV/HFCV at a rendezvous time and place to provide the fuel it requires to complete its delivery route. In this presentation, new vehicle routing models will be presented for a third-party company that provides MF services. In the proposed problem variant, the MF provider receives the routing plans of multiple customer companies and has to design routes for a fleet of capacitated MFs that must synchronise their routes with the running vehicles to deliver the required amount of fuel on the fly. The presentation will discuss and compare several mathematical models based on different business models and collaborative logistics scenarios.
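At the heart of these models is a synchronisation requirement, which can be sketched as a simple feasibility screen (hypothetical coordinates and speeds; the actual models are full vehicle routing formulations with capacities and time windows):

```python
from math import hypot

# Toy screen: a rendezvous is only worth considering if the mobile fueller
# (MF) can reach the meeting point no later than the customer vehicle does.

def rendezvous_feasible(mf_pos, mf_speed, vehicle_pos, vehicle_speed, meet_point):
    """True iff the MF arrives at meet_point no later than the customer vehicle."""
    mf_time = hypot(meet_point[0] - mf_pos[0], meet_point[1] - mf_pos[1]) / mf_speed
    veh_time = hypot(meet_point[0] - vehicle_pos[0],
                     meet_point[1] - vehicle_pos[1]) / vehicle_speed
    return mf_time <= veh_time

# An MF at the depot (0, 0) moving at 80 km/h can intercept a delivery vehicle
# at (10, 0) moving at 50 km/h towards its stop at (30, 0): 0.375 h vs 0.4 h.
assert rendezvous_feasible((0, 0), 80, (10, 0), 50, (30, 0))
# A slow MF starting too far away cannot make the same rendezvous.
assert not rendezvous_feasible((0, 0), 20, (25, 0), 50, (30, 0))
```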

    Evaluation and modelling of perceived audio quality in popular music, towards intelligent music production

    This thesis addresses three fundamental questions: What is mixing? What makes a high-quality mix? How can high-quality mixes be automatically generated? While these may seem essential to the very foundations of intelligent music production, this thesis argues that they have not been sufficiently addressed in previous studies. An important contribution is the questioning of previously held definitions of a 'mix'. Experiments were conducted in which participants used traditional mixing interfaces to create mixes using gain, panning and equalisation. The data was analysed in a novel 'mix-space', 'panning-space' and 'tone-space' in order to determine if there is a consensus in how these tools are used. Methods were developed to create mixes by populating the mix-space according to parametric models. These mixes were characterised by signal features, the distributions of which suggest tolerance bounds for automated mixing systems. This was complemented by a study of real-world music mixes, containing hundreds of mixes each for ten songs, collected from online communities. Mixes were shown to vary along four dimensions: loudness/dynamics, brightness, bass and stereo width. The variations between individual mix engineers were also studied, indicating a small effect of the mix engineer on mix preference ratings (η² = 0.021). Perceptual audio evaluation revealed that listeners appreciate 'quality' in a variety of ways, depending on the circumstances. In commercially released music, 'quality' was related to the loudness/dynamics dimension. In mixes, 'quality' is highly correlated with 'preference'. To create mixes which maximised perceived quality, a novel semi-automatic mixing system was developed using evolutionary computation, wherein a population of mixes, generated in the mix-space, is guided by the subjective evaluations of the listener.
    This system was evaluated by a panel of users, who used it to create their ideal mixes rather than the technically correct mixes which previous systems strove for. It is hoped that this thesis encourages the community to pursue subjectively motivated methods when designing systems for music mixing.
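The evolutionary mix-search can be sketched as follows (the "listener" here is simulated by a hidden ideal mix, an assumption made purely for illustration; in the thesis the ratings come from human evaluations):

```python
import random

random.seed(1)

# Toy sketch: a mix is a vector of per-track gains (a point in "mix-space");
# a population of mixes is evolved towards higher subjective ratings.

IDEAL = [0.8, 0.5, 0.3, 0.9]          # hidden listener preference (4 tracks)

def rating(mix):
    """Simulated subjective evaluation: higher means closer to the ideal."""
    return -sum((g, i) == () or (g - i) ** 2 for g, i in zip(mix, IDEAL))

def rating(mix):
    """Simulated subjective evaluation: higher means closer to the ideal."""
    return -sum((g - i) ** 2 for g, i in zip(mix, IDEAL))

def evolve(pop_size=20, generations=30, sigma=0.05):
    pop = [[random.random() for _ in IDEAL] for _ in range(pop_size)]
    start = max(map(rating, pop))                 # best initial rating
    for _ in range(generations):
        pop.sort(key=rating, reverse=True)
        parents = pop[: pop_size // 2]            # keep the best-rated mixes
        children = [[g + random.gauss(0, sigma) for g in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children                  # mutate to refill population
    return start, max(pop, key=rating)

start, best = evolve()
assert rating(best) >= start   # elitism: the best-rated mix never degrades
```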