Separation Plane Priority in Computer Image Generation
Flight simulators are devices in which air crews can be trained without the use of actual aircraft. Potentially dangerous maneuvers, such as air-to-air refueling, and destructive exercises, such as evasive action from weaponry or aerial dogfights, can be practiced repeatedly with no risk to pilot or crew. Flight simulators are cost effective, since the fuel costs associated with training pilots in actual aircraft can be excessive; they offer an alternate training method with reduced cost. The task of a visual flight simulator is to present the trainee with scenes representative of those that would be seen if the actual mission being trained for were flown. Scenes produced by a Computer Image Generation device must be of sufficient content, fidelity, resolution, brightness and field of view to allow the trainees to improve their skills. If one of these factors falls below the threshold of acceptability, the training value of the device is diminished, if not lost altogether. One of the most challenging problems in Computer Image Generation is the removal of hidden parts from images of solid objects. In real life, the opaque material of these objects obstructs the light rays from hidden parts and prevents us from seeing them. In the computer generation of an image no such automatic elimination takes place; instead, all parts of every object, including parts that should be hidden, are displayed. To remove these parts and create a more realistic image, a hidden-line or hidden-surface algorithm must be applied to the set of objects. When more than a single object is in the scene, another problem arises: which of the objects blocks the view of the others? This is an occultation problem. This paper presents a "separation plane" priority algorithm used in Computer Image Generation to solve this occultation problem.
The algorithm uses a binary search technique to generate a "listable set": a set of planes that yield proper object priority for any viewpoint in the data base.
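The core idea of a separation-plane priority test can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the function names, data shapes, and the centroid-based classification are all assumptions made for the example. The only property used is that if a plane separates two objects, the object on the viewpoint's side of the plane can never be occluded by the other.

```python
# Hypothetical sketch of a separation-plane priority test: if a plane
# separates object A from object B, the object on the same side of the
# plane as the viewpoint has drawing priority, for any viewpoint.
# Names and data shapes are illustrative assumptions, not the paper's API.

def plane_side(plane, point):
    """Signed distance-like value of point w.r.t. plane (a, b, c, d),
    where the plane is ax + by + cz + d = 0."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

def priority(plane, eye, centroid_a, centroid_b):
    """Return which object ('A' or 'B') is nearer the eye, assuming
    `plane` separates the two objects (their centroids lie on opposite
    sides).  The object on the eye's side is drawn with priority."""
    eye_side = plane_side(plane, eye)
    if plane_side(plane, centroid_a) * eye_side > 0:
        return "A"   # A lies on the viewer's side of the plane
    return "B"

# Example: the plane x = 0 separates A (centroid at x = -2) from
# B (centroid at x = +2); viewed from x = 5, B is on the viewer's side.
print(priority((1, 0, 0, 0), (5, 0, 0), (-2, 0, 0), (2, 0, 0)))  # B
```

The binary-search construction in the paper builds a whole set of such planes so the pairwise test above resolves priority for every object pair; the snippet shows only the single-plane primitive.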
Geometric Reasoning with polymake
The mathematical software system polymake provides a wide range of functions
for convex polytopes, simplicial complexes, and other objects. A large part of
this paper is dedicated to a tutorial which exemplifies the usage. Later
sections include a survey of research results obtained with the help of
polymake so far and a short description of the technical background.
Doctor of Philosophy dissertation
Many algorithms have been developed for synthesizing shaded images of three dimensional objects modeled by computer. In spite of widely differing approaches, the current state of the art algorithms are surprisingly similar with respect to the richness of the scenes they can process. One attribute these algorithms have in common is the use of a conventional passive data base to represent the objects being modeled. This paper postulates and explores the use of an alternative modeling technique which uses procedures to represent the objects being modeled. The properties and structure of such "procedure models" are investigated and an algorithm based on them is presented.
Computer representation of graphical information with applications
PhD Thesis
The research work contained in this thesis lies mainly in the field of computer graphics.
The initial chapters are concerned with methods of
representing three dimensional solids in two dimensions.
Chapter 2 describes a method by which points in three
dimensions can be projected onto a two dimensional plane of
projection. This is an essential requirement in the
representation of three dimensional solids.
Chapter 3 describes a method by which convex polyhedra can be represented by computer.
Both the hidden and visible faces of the polyhedron can be represented.
Having tackled this problem, the
more difficult problem of representing the non convex
polyhedron has been attempted and the results of this work
are presented in Chapter 4.
Line drawings of the various polyhedra, produced
on a graph plotter, are given as examples at the end of
Chapters 2, 3 and 4.
The problem of how to connect a given line
drawing such that the distance travelled by the pen of
some computer display is kept to a minimum is discussed in
Chapter 5 and various definitions of the concepts involved
are given.
Theory associated with this 'Pen-Up Problem'
has been developed and is explained in detail in the early
part of Chapter 6. A method of obtaining an optimal
solution to the problem is presented in the latter part of this chapter in addition to various enumerative schemes
which have been developed to obtain good feasible solutions to the pen up problems under various conditions
Extensive C.P.U. timing experiments have been
carried out in Chapter 7 on the various enumerative schemes
of Chapter 6, and it has been possible to reach
conclusions on the applicability of the various methods.
Several topics of interest which have arisen
during the main research work are presented as appendices.
The programs which have been coded during the period of
research are also included as appendices.
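The "Pen-Up Problem" described above (ordering the strokes of a drawing so the pen travels as little as possible while raised) is naturally attacked by greedy heuristics before attempting optimal solutions. The sketch below is one such feasible-solution heuristic, not the optimal method or any of the enumerative schemes developed in the thesis; the stroke representation and names are assumptions made for illustration.

```python
import math

# Greedy nearest-neighbour heuristic for the Pen-Up Problem: reorder a
# set of strokes (each a (start, end) point pair) so that the total
# pen-up travel between consecutive strokes is kept small.  This is an
# illustrative heuristic, NOT the thesis's optimal algorithm.

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def order_strokes(strokes, start=(0.0, 0.0)):
    """Return the strokes greedily reordered; a stroke is reversed when
    its end point is the endpoint closer to the current pen position."""
    remaining = list(strokes)
    pen = start
    tour = []
    while remaining:
        # pick the stroke whose nearest endpoint is closest to the pen
        best = min(remaining,
                   key=lambda s: min(dist(pen, s[0]), dist(pen, s[1])))
        remaining.remove(best)
        a, b = best
        if dist(pen, b) < dist(pen, a):   # enter at the closer endpoint
            a, b = b, a
        tour.append((a, b))
        pen = b                            # pen finishes at stroke's end
    return tour

strokes = [((0, 1), (0, 2)), ((5, 5), (6, 5)), ((0, 0), (1, 0))]
print(order_strokes(strokes))
```

A greedy tour like this gives an upper bound that enumerative schemes of the kind studied in Chapters 6 and 7 can then try to improve toward the optimum.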
A SOLUTION FOR THE INTERSECTION OF TWO CONVEX POLYHEDRA
A computer program is presented for determining the polygon of intersection of two
convex polyhedra. The algorithm is based upon the trivial construction and uses O(N^2)
operations, where N is the sum of the numbers of edges of the two polyhedra. The program is
written in BASIC for the Commodore 64 personal computer.
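The "trivial construction" with O(N^2) operations pairs every edge of one polyhedron against every face of the other. The kernel of such a scheme can be sketched as clipping a single edge against a convex polyhedron given as an intersection of half-spaces; the representation and function names below are assumptions for illustration, not a transcription of the BASIC program.

```python
# Sketch of the edge-vs-polyhedron kernel behind the O(N^2) "trivial
# construction": clip one edge of polyhedron P against the half-spaces
# (faces) of polyhedron Q.  Running this for every edge of each
# polyhedron against the other yields the boundary of the intersection.
# The half-space representation (n, d) meaning n·x <= d is an assumption.

def clip_edge(p, q, halfspaces):
    """Clip segment p->q against a convex polyhedron given as
    half-spaces (n, d).  Returns the surviving sub-segment as a pair
    of points, or None if the edge misses the polyhedron."""
    t0, t1 = 0.0, 1.0
    direction = [q[i] - p[i] for i in range(3)]
    for n, d in halfspaces:
        vp = sum(n[i] * p[i] for i in range(3)) - d   # plane value at p
        vd = sum(n[i] * direction[i] for i in range(3))
        if abs(vd) < 1e-12:
            if vp > 0:                 # parallel to the face and outside
                return None
            continue
        t = -vp / vd
        if vd > 0:                     # segment is leaving the half-space
            t1 = min(t1, t)
        else:                          # segment is entering the half-space
            t0 = max(t0, t)
        if t0 > t1:
            return None
    lerp = lambda t: tuple(p[i] + t * direction[i] for i in range(3))
    return lerp(t0), lerp(t1)

# Unit cube 0 <= x, y, z <= 1 written as six half-spaces:
cube = [((1, 0, 0), 1), ((-1, 0, 0), 0), ((0, 1, 0), 1),
        ((0, -1, 0), 0), ((0, 0, 1), 1), ((0, 0, -1), 0)]
print(clip_edge((-1, 0.5, 0.5), (2, 0.5, 0.5), cube))
```

With E1 and E2 edges and comparable face counts, the double loop over edges and faces gives the quadratic operation count quoted in the abstract.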
Algorithms and Hardness for Robust Subspace Recovery
We consider a fundamental problem in unsupervised learning called
\emph{subspace recovery}: given a collection of points in $\mathbb{R}^n$,
if many but not necessarily all of these points are contained in a
$d$-dimensional subspace $T$, can we find it? The points contained in $T$ are
called {\em inliers} and the remaining points are {\em outliers}. This problem
has received considerable attention in computer science and in statistics. Yet
efficient algorithms from computer science are not robust to {\em adversarial}
outliers, and the estimators from robust statistics are hard to compute in high
dimensions.
Are there algorithms for subspace recovery that are both robust to outliers
and efficient? We give an algorithm that finds $T$ when it contains more than a
$d/n$ fraction of the points. Hence, for say $d = n/2$ this estimator
is both easy to compute and well-behaved when there are a constant fraction of
outliers. We prove that it is Small Set Expansion hard to find $T$ when the
fraction of errors is any larger, thus giving evidence that our estimator is an
{\em optimal} compromise between efficiency and robustness.
As it turns out, this basic problem has a surprising number of connections to
other areas including small set expansion, matroid theory and functional
analysis that we make use of here.
Comment: Appeared in Proceedings of COLT 201
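The inlier/outlier formulation above can be made concrete with a small sampling experiment. The sketch below is a plain RANSAC-style procedure, which is NOT the estimator analyzed in the paper (RANSAC is not robust to adversarial outliers in the paper's sense); it only illustrates the setup, using one-dimensional subspaces (lines through the origin in the plane) to stay dependency-free. All names and data are made up for the example.

```python
import random

# Illustrative RANSAC-style sketch of the subspace-recovery setup:
# repeatedly sample a point, fit the 1-d subspace (line through the
# origin) it spans, and count how many points lie on that line.
# This is NOT the paper's estimator; it only illustrates the problem.

def fit_line(p):
    """Unit direction of the line through the origin and p."""
    norm = (p[0] ** 2 + p[1] ** 2) ** 0.5
    return (p[0] / norm, p[1] / norm)

def residual(direction, q):
    """Distance from q to the line spanned by `direction`."""
    proj = q[0] * direction[0] + q[1] * direction[1]
    foot = (proj * direction[0], proj * direction[1])
    return ((q[0] - foot[0]) ** 2 + (q[1] - foot[1]) ** 2) ** 0.5

def ransac_line(points, trials=200, tol=1e-6, seed=0):
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(trials):
        p = rng.choice(points)
        if p == (0.0, 0.0):
            continue                     # origin spans no direction
        d = fit_line(p)
        inliers = sum(residual(d, q) <= tol for q in points)
        if inliers > best_inliers:
            best, best_inliers = d, inliers
    return best, best_inliers

# Six inliers on the line y = 2x plus three outliers:
pts = [(t, 2.0 * t) for t in (1.0, 2.0, 3.0, -1.0, -2.0, 0.5)]
pts += [(5.0, 1.0), (-3.0, 7.0), (4.0, -4.0)]
direction, count = ransac_line(pts, seed=1)
print(count)  # 6
```

The paper's point is precisely that such sampling heuristics break down against adversarially placed outliers, while its estimator succeeds whenever the inlier fraction exceeds d/n.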
Planar graphs : a historical perspective.
The field of graph theory has been indubitably influenced by the study of planar graphs. This thesis, consisting of five chapters, is a historical account of the origins and development of concepts pertaining to planar graphs and their applications. The first chapter serves as an introduction to the history of graph theory, including early studies of graph theory tools such as paths, circuits, and trees. The second chapter pertains to the relationship between polyhedra and planar graphs, specifically the result of Euler concerning the number of vertices, edges, and faces of a polyhedron. Counterexamples and generalizations of Euler's formula are also discussed. Chapter III describes the background in recreational mathematics of the graphs K5 and K3,3 and their importance to the first characterization of planar graphs by Kuratowski. Further characterizations of planar graphs by Whitney, Wagner, and MacLane are also addressed. The focus of Chapter IV is the history and eventual proof of the four-color theorem, although it also includes a discussion of generalizations involving coloring maps on surfaces of higher genus. The final chapter gives a number of measurements of a graph's closeness to planarity, including the concepts of crossing number, thickness, splitting number, and coarseness. The chapter concludes with a discussion of two other coloring problems: Heawood's empire problem and Ringel's earth-moon problem.
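Euler's result mentioned above is the formula V - E + F = 2 for the vertices, edges, and faces of a convex polyhedron. It can be checked numerically; the counts below are the standard ones for the two solids used.

```python
# Quick numerical check of Euler's polyhedron formula, V - E + F = 2,
# for two convex polyhedra with their standard vertex/edge/face counts.

def euler_characteristic(v, e, f):
    return v - e + f

solids = {"cube": (8, 12, 6), "tetrahedron": (4, 6, 4)}
for name, (v, e, f) in solids.items():
    assert euler_characteristic(v, e, f) == 2
    print(name, "satisfies V - E + F = 2")
```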
Fun with Fonts: Algorithmic Typography
Over the past decade, we have designed six typefaces based on mathematical
theorems and open problems, specifically computational geometry. These
typefaces expose the general public in a unique way to intriguing results and
hard problems in hinged dissections, geometric tours, origami design,
computer-aided glass design, physical simulation, and protein folding. In
particular, most of these typefaces include puzzle fonts, where reading the
intended message requires solving a series of puzzles which illustrate the
challenge of the underlying algorithmic problem.
Comment: 14 pages, 12 figures. Revised paper with new glass cane font.
Original version in Proceedings of the 7th International Conference on Fun
with Algorithm
Modulo scheduling for a fully-distributed clustered VLIW architecture
Clustering is an approach that many microprocessors are adopting in recent times in order to mitigate the increasing penalties of wire delays. We propose a novel clustered VLIW architecture which has all its resources partitioned among clusters, including the cache memory. A modulo scheduling scheme for this architecture is also proposed. This algorithm takes into account both register and memory inter-cluster communications so that the final schedule results in a cluster assignment that favors cluster locality in cache references and register accesses. It has been evaluated for both 2- and 4-cluster configurations and for differing numbers and latencies of inter-cluster buses. The proposed algorithm produces schedules with very low communication requirements and outperforms previous cluster-oriented schedulers.
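A modulo scheduler's starting point is the minimum initiation interval (MII), the fastest rate at which loop iterations can be started: it is bounded below both by the busiest resource (ResMII) and by the tightest dependence recurrence (RecMII). The sketch below computes this standard lower bound; the operation counts, unit counts, and recurrence numbers are made-up illustrations, not figures from the paper.

```python
import math

# Illustrative computation of the minimum initiation interval (MII)
# bounding any modulo schedule: MII = max(ResMII, RecMII).  The
# workload numbers below are invented for the example.

def res_mii(op_counts, num_units):
    """ResMII: for each resource class, ceil(ops needing it / copies);
    the schedule cannot beat the busiest resource."""
    return max(math.ceil(n / num_units[r]) for r, n in op_counts.items())

def rec_mii(cycles):
    """RecMII: for each dependence cycle, ceil(total latency around the
    cycle / its iteration distance)."""
    return max(math.ceil(lat / dist) for lat, dist in cycles)

# Example loop body: 6 memory ops on 2 ports, 8 ALU ops on 4 ALUs, and
# one recurrence with latency 5 carried across 2 iterations.
ops = {"mem": 6, "alu": 8}
units = {"mem": 2, "alu": 4}
cycles = [(5, 2)]
mii = max(res_mii(ops, units), rec_mii(cycles))
print(mii)  # 3
```

In a clustered VLIW like the one proposed, inter-cluster register and memory communications consume bus resources too, so a communication-heavy cluster assignment inflates ResMII; this is why the scheduler's preference for cluster locality translates directly into shorter schedules.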