52,746 research outputs found
Can Computer Algebra be Liberated from its Algebraic Yoke?
So far, the scope of computer algebra has been needlessly restricted to exact
algebraic methods. Its possible extension to approximate analytical methods is
discussed. The entangled roles of functional analysis and symbolic programming,
especially the functional and transformational paradigms, are put forward. In
the future, algebraic algorithms could constitute the core of extended symbolic
manipulation systems including primitives for symbolic approximations.
Comment: 8 pages, 2-column presentation, 2 figures
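The exact-versus-approximate contrast the abstract draws can be sketched in a few symbolic lines. This is an illustration only, using sympy as a stand-in (it is not the system the abstract envisions): an exact closed-form antiderivative next to an approximate analytical result obtained by integrating a truncated series, both produced symbolically.

```python
# Illustrative only: contrast an exact algebraic result with an
# approximate analytical one, both obtained by symbolic manipulation.
import sympy as sp

x = sp.symbols('x')

# Exact algebraic method: closed-form antiderivative (erf-based).
exact = sp.integrate(sp.exp(-x**2), x)

# Approximate analytical method: truncate the Taylor series of the
# integrand, then integrate term by term.
series_part = sp.series(sp.exp(-x**2), x, 0, 6).removeO()
approx = sp.integrate(series_part, x)

print(exact)
print(approx)
```

Here the "approximate" answer is itself a symbolic object, which is the kind of primitive the abstract argues extended systems should support.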
An introduction to the adjoint approach to design
Optimal design methods involving the solution of an adjoint system of equations are an active area of research in computational fluid dynamics, particularly for aeronautical applications. This paper presents an introduction to the subject, emphasising the simplicity of the ideas when viewed in the context of linear algebra. Detailed discussions also include the extension to p.d.e.'s, the construction of the adjoint p.d.e. and its boundary conditions, and the physical significance of the adjoint solution. The paper concludes with examples of the use of adjoint methods for optimising the design of business jets.
This research was supported by funding from Rolls-Royce plc, BAe Systems plc and EPSRC grants GR/K91149 and GR/L95700
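The linear-algebra view the paper advocates can be sketched as follows, with random matrices as stand-ins (the dimensions and data are invented, not taken from the paper). For an objective J = gᵀu whose state u solves A u = f(α), the direct approach needs one linear solve per design variable, while a single adjoint solve Aᵀv = g gives the whole gradient via dJ/dα = vᵀ (df/dα).

```python
# Minimal sketch of the adjoint idea in plain linear algebra.
# All matrices here are random stand-ins for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3                                        # state dim, design vars
A = rng.standard_normal((n, n)) + 5 * np.eye(n)    # well-conditioned system
g = rng.standard_normal(n)                         # objective weights: J = g^T u
dfda = rng.standard_normal((n, m))                 # df/dalpha, one column per variable

# Direct approach: one solve A du = df/dalpha_i per design variable.
grad_direct = g @ np.linalg.solve(A, dfda)

# Adjoint approach: a single solve A^T v = g, then dJ/dalpha = v^T df/dalpha.
v = np.linalg.solve(A.T, g)
grad_adjoint = v @ dfda

assert np.allclose(grad_direct, grad_adjoint)
```

The payoff is that the adjoint cost is independent of the number of design variables, which is what makes the approach attractive for aerodynamic design with many shape parameters.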
Computational complexity of μ calculation
The structured singular value μ measures the robustness of uncertain systems. Numerous researchers over the last decade have worked on developing efficient methods for computing μ. This paper considers the complexity of calculating μ with general mixed real/complex uncertainty in the framework of combinatorial complexity theory. In particular, it is proved that the μ recognition problem with either pure real or mixed real/complex uncertainty is NP-hard. This strongly suggests that it is futile to pursue exact methods for calculating μ of general systems with pure real or mixed uncertainty for other than small problems.
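One practical consequence of the NP-hardness result is that μ is normally bracketed rather than computed exactly. As a hedged illustration (not from the paper): for complex block-diagonal uncertainty structures, the spectral radius ρ(M) lower-bounds μ(M) and the largest singular value σ̄(M) upper-bounds it, and both are cheap to compute. The matrix below is random and purely illustrative.

```python
# Cheap bounds that bracket mu for complex uncertainty structures:
#   rho(M) <= mu(M) <= sigma_max(M).
# M is a random illustrative matrix, not a system from the paper.
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

rho = max(abs(np.linalg.eigvals(M)))   # spectral radius: lower bound on mu
sigma_max = np.linalg.norm(M, 2)       # largest singular value: upper bound on mu

assert rho <= sigma_max + 1e-12
print(rho, sigma_max)
```

For the pure-real and mixed cases the paper addresses, even such bounds can be loose, which is consistent with its conclusion that exact computation is only feasible for small problems.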
Smoothed Complexity Theory
Smoothed analysis is a new way of analyzing algorithms introduced by Spielman
and Teng (J. ACM, 2004). Classical methods like worst-case or average-case
analysis have accompanying complexity classes, like P and AvgP, respectively.
While worst-case or average-case analysis gives us a means to talk about the
running time of a particular algorithm, complexity classes allow us to talk
about the inherent difficulty of problems.
Smoothed analysis is a hybrid of worst-case and average-case analysis and
compensates for some of their drawbacks. Despite its success in the analysis of
single algorithms and problems, there is no embedding of smoothed analysis into
computational complexity theory, which is necessary to classify problems
according to their intrinsic difficulty.
We propose a framework for smoothed complexity theory, define the relevant
classes, and prove some first hardness results (for bounded halting and tiling)
and tractability results (for binary optimization problems, graph coloring, and
satisfiability). Furthermore, we discuss extensions and shortcomings of our
model and relate it to semi-random models.
Comment: to be presented at MFCS 201