67 research outputs found
Hit-and-run algorithms for the identification of nonredundant linear inequalities
Two probabilistic hit-and-run algorithms are presented to detect nonredundant constraints in a full-dimensional system of linear inequalities. The algorithms proceed by generating a random sequence of interior points whose limiting distribution is uniform, and by searching for a nonredundant constraint in the direction of a random vector from each point in the sequence. In the hypersphere directions algorithm the direction vector is drawn from a uniform distribution on a hypersphere. In the computationally superior coordinate directions algorithm a search is carried out along one of the coordinate vectors. The algorithms are terminated through the use of a Bayesian stopping rule. Computational experience with the algorithms and the stopping rule is reported.
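The coordinate directions idea can be illustrated with a short sketch (an assumption for illustration, not the paper's implementation): from an interior point of {x : Ax <= b}, step along a coordinate vector; the first constraint hit at either end of the resulting chord is nonredundant, and the next interior point is drawn uniformly on that chord.

```python
import numpy as np

rng = np.random.default_rng(0)

def coordinate_directions(A, b, x0, n_iter=500):
    """Flag constraints of {x : A @ x <= b} detected as nonredundant.

    Sketch of the coordinate-directions hit-and-run scheme, assuming a
    bounded polytope and a strictly interior starting point x0.
    """
    m, n = A.shape
    nonredundant = np.zeros(m, dtype=bool)
    x = x0.astype(float)
    for _ in range(n_iter):
        d = np.zeros(n)
        d[rng.integers(n)] = 1.0          # random coordinate direction
        slack = b - A @ x                 # > 0 at an interior point
        rate = A @ d
        with np.errstate(divide="ignore", invalid="ignore"):
            t_plus = np.where(rate > 0, slack / rate, np.inf)   # steps in +d
            t_minus = np.where(rate < 0, slack / -rate, np.inf) # steps in -d
        # the constraints hit first at each end of the chord are nonredundant
        nonredundant[t_plus.argmin()] = True
        nonredundant[t_minus.argmin()] = True
        # next interior point: uniform on the chord through x along d
        x = x + rng.uniform(-t_minus.min(), t_plus.min()) * d
    return nonredundant
```

On the unit square with an extra redundant constraint x1 <= 2, only the four box constraints are ever hit, so the fifth is never flagged.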
Monte Carlo algorithms for the detection of necessary linear matrix inequality constraints
We reduce the size of large semidefinite programming problems by identifying necessary linear matrix inequalities (LMIs) using Monte Carlo techniques. We describe three algorithms for detecting necessary LMI constraints that extend algorithms used in linear programming to semidefinite programming. We demonstrate that they are beneficial and could serve as tools for a semidefinite programming preprocessor. A necessary LMI is one whose removal changes the feasible region defined by all the LMI constraints. The general problem of checking whether or not a particular LMI is necessary is NP-complete. However, the methods we describe are polynomial in each iteration, and the number of iterations can be limited by stopping rules. This provides a practical method for reducing the size of some large semidefinite programming problems before one attempts to solve them. We demonstrate the applicability of this approach to solving instances of the Löwner ellipsoid problem. We also consider the problem of classifying all the constraints of a semidefinite programming problem as redundant or necessary.
Stochastic billiards for sampling from the boundary of a convex set
Stochastic billiards can be used for approximate sampling from the boundary of a bounded convex set through the Markov chain Monte Carlo (MCMC) paradigm. This paper studies how many steps of the underlying Markov chain are required to get samples (approximately) from the uniform distribution on the boundary of the set, for sets with an upper bound on the curvature of the boundary. Our main theorem implies a polynomial-time algorithm for sampling from the boundary of such sets.
A novel methodology to estimate metabolic flux distributions in constraint-based models
Quite generally, constraint-based metabolic flux analysis describes the space of viable flux configurations for a metabolic network as a high-dimensional polytope defined by the linear constraints that enforce the balancing of production and consumption fluxes for each chemical species in the system. In some cases, the complexity of the solution space can be reduced by performing an additional optimization, while in other cases, knowing the range of variability of fluxes over the polytope provides a sufficient characterization of the allowed configurations. There are cases, however, in which the full information encoded in the individual distributions of viable fluxes over the polytope is required. Obtaining such distributions is known to be a highly challenging computational task when the dimensionality of the polytope is sufficiently large, and the problem of developing cost-effective ad hoc algorithms has recently seen a major surge of interest. Here, we propose a method that allows us to perform the required computation heuristically in a time scaling linearly with the number of reactions in the network, overcoming some limitations of similar techniques employed in recent years. As a case study, we apply it to the analysis of the human red blood cell metabolic network, whose solution space can be sampled by different exact techniques, like Hit-and-Run Monte Carlo (scaling roughly like the third power of the system size). Remarkably accurate estimates for the true distributions of viable reaction fluxes are obtained, suggesting that, although further improvements are desirable, our method enhances our ability to analyze the space of allowed configurations for large biochemical reaction networks. © 2013 by the authors; licensee MDPI, Basel, Switzerland.
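The Hit-and-Run benchmark mentioned above can be sketched on a toy flux polytope (the network here is a hypothetical two-reaction example, not the red blood cell model): sampling is restricted to the null space of the stoichiometric matrix so every point stays mass-balanced, and each step moves to a uniform point on the chord allowed by the flux bounds.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy network (hypothetical, for illustration): one metabolite X with two
# reactions, A -> X and X -> B. Mass balance S @ v = 0 forces the two
# fluxes to be equal; each flux is bounded in [0, 1].
S = np.array([[1.0, -1.0]])
lb, ub = np.zeros(2), np.ones(2)

# Orthonormal basis of the null space of S: moving only along these
# directions keeps every sampled point mass-balanced.
_, _, Vt = np.linalg.svd(S)
N = Vt[1:].T

def hit_and_run(v0, n_samples=4000):
    """Approximately uniform samples from {v : S v = 0, lb <= v <= ub}."""
    v = v0.astype(float)
    samples = []
    for _ in range(n_samples):
        d = N @ rng.standard_normal(N.shape[1])   # random null-space direction
        d /= np.linalg.norm(d)
        # chord limits [t_min, t_max] imposed by the box bounds
        t_min, t_max = -np.inf, np.inf
        for di, vi, l, u in zip(d, v, lb, ub):
            if di > 1e-12:
                t_max = min(t_max, (u - vi) / di)
                t_min = max(t_min, (l - vi) / di)
            elif di < -1e-12:
                t_max = min(t_max, (l - vi) / di)
                t_min = max(t_min, (u - vi) / di)
        v = v + rng.uniform(t_min, t_max) * d     # uniform point on the chord
        samples.append(v.copy())
    return np.array(samples)
```

Histogramming the samples coordinate-wise yields the marginal flux distributions that the paper's heuristic approximates at lower cost for large networks.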
On Murty's gravitational interior point method for quadratic programming
This thesis presents a modification of the gravitational interior point method for quadratic programming [7]. Murty presented the algorithm as a generalization of his gravitational method for linear programming [8]. Murty claims that this method is matrix-inverse free, unlike other interior point methods; however, convergence of his algorithm is not guaranteed. This thesis introduces modifications in the centering step of the algorithm and, using a Matlab R2009a implementation, demonstrates the centering step.
Experiments in reduction techniques for linear and integer programming
This study evaluated the relative performance of a selection of the most promising size-reduction techniques. Experiments and comparisons were made among these techniques on a series of test problems to determine their relative efficiency, efficiency versus time, etc. Three main new methods were developed by modifying and extending the previous ones. These methods were also tested, and their results are compared with those of the earlier methods.
A Relative View on Tracking Error
When delegating investment decisions to a professional manager, investors often anchor their mandate to a specific benchmark. The manager's exposure to risk is controlled by means of a tracking-error volatility constraint. Whether this constraint is easily met or violated depends on market conditions, as does the performance of the portfolio. In this paper we argue that these mandated portfolios should not be evaluated only relative to their benchmarks in order to appraise their performance. They should also be evaluated relative to the opportunity set of all portfolios that can be formed under the same mandate – the portfolio opportunity set. The distribution of performance values over the portfolio opportunity set depends on contemporary market dynamics. To correct for this, we suggest a normalized version of the information ratio that is invariant to these market conditions.