Recursive structure in computer systems
PhD Thesis
Structure plays an important part in the design of large systems.
Unstructured programs are difficult to design or test,
and good structure has been recognized as essential to all but
the smallest programs. Similarly, concurrently executing computers
must co-operate in a structured way if an uncontrolled
growth in complexity is to be avoided. The thesis presented
here is that recursive structure can be used to organize and
simplify large programs and highly parallel computers.
In programming, naming concerns the way names are used to
identify objects. Various naming schemes are examined including
'block structured' and 'pathname' naming. A new scheme is
presented as a synthesis of these two combining most of their
advantages. Recursively structured naming is shown to be an
advantage when programs are to be decomposed or combined to
an arbitrary degree. Also, a contribution to the UNIX
United/Newcastle Connection distributed operating system
design is described. This shows how recursive naming was used
in a practical system.
Computation concerns the progress of execution in a computer.
A distinction is made between control driven computation where
the programmer has explicit control over sequencing and data
driven or demand driven computation where sequencing is implicit.
It is shown that recursively structured computation has
attractive locality properties.
The definition of a recursive structure may itself be cyclic
(self-referencing). A new resource management ('garbage collection')
algorithm is presented which can manage cyclic
structures without costs proportional to the system size. The
scheme is an extension of 'reference counting'.
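The thesis's cycle-capable algorithm is not reproduced here; as background, the following is a minimal sketch of plain reference counting, the baseline being extended, together with the cycle leak that motivates the extension. All names and structure are illustrative, not taken from the thesis.

```python
# Minimal sketch of plain reference counting (illustrative, not the
# thesis's algorithm). Each object tracks how many references point
# to it; when the count hits zero it is reclaimed and its outgoing
# references are released in turn.
class Obj:
    def __init__(self):
        self.refcount = 0
        self.children = []   # outgoing references held by this object

def add_ref(parent, child):
    parent.children.append(child)
    child.refcount += 1

def release(obj):
    obj.refcount -= 1
    if obj.refcount == 0:        # no remaining references: reclaim
        for c in obj.children:   # recursively release what it pointed to
            release(c)
        obj.children.clear()

# The known weakness: in a cycle (c -> d -> c), each object keeps the
# other's count at >= 1 even when nothing external refers to the pair,
# so plain reference counting never reclaims it. Handling such cycles
# without work proportional to system size is the problem the thesis's
# extension addresses.
```

Note that the cost of plain reference counting is local to the objects actually released, which is the property the abstract emphasizes preserving for cyclic structures.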
Finally the need for structure in program and computer design
and the advantages of recursive structure are discussed.
Funded by the Science and Engineering Research Council of Great Britain.
A Relaxation Scheme for Mesh Locality in Computer Vision.
Parallel processing has long been considered the key to building the computer systems of the future and has become a mainstream subject in Computer Science. Computer Vision applications are computationally intensive and require parallel approaches to exploit their intrinsic parallelism. This research addresses this problem for low-level and intermediate-level vision. The contributions of this dissertation are a unified scheme, based on probabilistic relaxation labeling, that captures localities of image data, and the use of this scheme to develop efficient parallel algorithms for Computer Vision problems. We begin by investigating the problem of skeletonization. A pattern-matching technique that exhausts all possible interaction patterns between a pixel and its neighboring pixels captures the locality of this problem and leads to an efficient One-pass Parallel Asymmetric Thinning Algorithm (OPATA8). The use of 8-distance, or chessboard distance, in this algorithm not only improves the quality of the resulting skeletons but also improves the efficiency of the computation. This new algorithm plays an important role in a hierarchical route planning system that extracts high-level topological information from cross-country mobility maps, which greatly speeds up route searching over large areas. We generalize the neighborhood interaction description method to more complicated applications such as edge detection and image restoration. The proposed probabilistic relaxation labeling scheme exploits parallelism by discovering local interactions in neighboring areas and by describing them effectively. The scheme consists of a transformation function and a dictionary construction method. The non-linear transformation function is derived from Markov Random Field theory and efficiently combines evidence from neighborhood interactions. The dictionary construction method provides an efficient way to encode these localities.
A case study applies the scheme to the problem of edge detection. The relaxation step of this edge-detection algorithm greatly reduces noise effects, achieves better edge localization at features such as line ends and corners, and plays a crucial role in refining the edge output. Experiments on both synthetic and natural images show that our algorithm converges quickly and is robust in noisy environments.
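The OPATA algorithm itself is not reproduced here, but the 8-distance (chessboard distance) it relies on is simple to state: axial and diagonal neighbors are both one step away, like king moves in chess. A minimal sketch (function names illustrative):

```python
# Chessboard (8-)distance between two pixels: the number of king moves
# separating them, so diagonal and axial neighbors are equally close.
def chessboard_distance(p, q):
    (x1, y1), (x2, y2) = p, q
    return max(abs(x1 - x2), abs(y1 - y2))

# Contrast with city-block (4-)distance, where a diagonal step costs 2.
def cityblock_distance(p, q):
    (x1, y1), (x2, y2) = p, q
    return abs(x1 - x2) + abs(y1 - y2)
```

Under the 8-distance a pixel's full 8-neighborhood sits at distance 1, which matches the neighborhood interaction patterns the thinning algorithm enumerates.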
Stellar Intensity Interferometry: Prospects for sub-milliarcsecond optical imaging
Using kilometric arrays of air Cherenkov telescopes, intensity interferometry
may increase the spatial resolution in optical astronomy by an order of
magnitude, enabling images of rapidly rotating stars with structures in their
circumstellar disks and winds, or mapping out patterns of nonradial pulsations
across stellar surfaces. Intensity interferometry (pioneered by Hanbury Brown
and Twiss) connects telescopes only electronically, and is practically
insensitive to atmospheric turbulence and optical imperfections, permitting
observations over long baselines and through large airmasses, also at short
optical wavelengths. The required large telescopes with very fast detectors are
becoming available as arrays of air Cherenkov telescopes, distributed over a
few square km. Digital signal handling enables very many baselines to be
synthesized, while stars are tracked with electronic time delays, thus
synthesizing an optical interferometer in software. Simulated observations
indicate limiting magnitudes around m(v)=8, reaching resolutions ~30
microarcsec in the violet. The signal-to-noise ratio favors high-temperature
sources and emission-line structures, and is independent of the optical
passband, be it a single spectral line or the broad spectral continuum.
Intensity interferometry provides the modulus (but not phase) of any spatial
frequency component of the source image; for this reason image reconstruction
requires phase retrieval techniques, feasible if sufficient coverage of the
interferometric (u,v)-plane is available. Experiments are in progress; test
telescopes have been erected, and trials in connecting large Cherenkov
telescopes have been carried out. This paper reviews this interferometric
method in view of the new possibilities offered by arrays of air Cherenkov
telescopes, and outlines observational programs that should become realistic
already in the rather near future.
Comment: New Astronomy Reviews, in press; 101 pages, 11 figures, 185
references
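As a back-of-envelope check on the resolution quoted above, the diffraction-limited resolution of an interferometer is of order lambda/B for baseline B. The wavelength and baseline below are illustrative values consistent with "kilometric arrays" and "the violet", not figures taken from the paper:

```python
import math

# Illustrative values, not from the paper:
wavelength = 400e-9   # m, violet light
baseline = 2000.0     # m, a kilometric Cherenkov-array baseline

theta_rad = wavelength / baseline                 # resolution ~ lambda / B
theta_uas = math.degrees(theta_rad) * 3600 * 1e6  # radians -> microarcseconds
# ~41 microarcsec: the same order as the ~30 microarcsec quoted above
```

Longer baselines or shorter wavelengths push the estimate toward the quoted figure, which is why the paper stresses short optical wavelengths and kilometric arrays.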