The First Hour of Extra-galactic Data of the Sloan Digital Sky Survey Spectroscopic Commissioning: The Coma Cluster
On 26 May 1999, one of the Sloan Digital Sky Survey (SDSS) fiber-fed
spectrographs saw astronomical first light. This was followed by the first
spectroscopic commissioning run during the dark period of June 1999. We present
here the first hour of extra-galactic spectroscopy taken during these early
commissioning stages: an observation of the Coma cluster of galaxies. Our data
samples the southern part of this cluster, out to a radius of 1.5 degrees, and
thus fully covers the NGC 4839 group. We outline in this paper the main
characteristics of the SDSS spectroscopic systems and provide redshifts and
spectral classifications for 196 Coma galaxies, of which 45 redshifts are new.
For the 151 galaxies in common with the literature, we find excellent agreement
between our redshift determinations and the published values. As part of our
analysis, we have investigated four different spectral classification
algorithms: spectral line strengths, a principal component decomposition, a
wavelet analysis and the fitting of spectral synthesis models to the data. We
find that a significant fraction (25%) of our observed Coma galaxies show signs
of recent star-formation activity and that the velocity dispersion of these
active galaxies (emission-line and post-starburst galaxies) is 30% larger than
that of the absorption-line galaxies. We also find no active galaxies within the
central (projected) 200 h^-1 kpc of the cluster. The spatial distribution of our
Coma active galaxies is consistent with that found at higher redshift for the
CNOC1 cluster survey. Beyond the core region, the fraction of bright active
galaxies appears to rise slowly out to the virial radius, and these galaxies are
randomly distributed within the cluster with no apparent correlation with the
potential merger of the NGC 4839 group. [ABRIDGED]
Comment: Accepted in AJ, 65 pages, 20 figures, 5 tables
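Of the four classification algorithms, the principal component decomposition lends itself to a compact illustration. The sketch below is a generic PCA-of-spectra recipe in Python, not the authors' pipeline; the synthetic spectra, dimensions, and noise level are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for galaxy spectra: each row is one spectrum on a
# common wavelength grid (illustrative data, not the SDSS observations).
n_galaxies, n_pixels = 196, 500
continuum = np.linspace(1.0, 0.5, n_pixels)
spectra = continuum + 0.05 * rng.standard_normal((n_galaxies, n_pixels))

# Principal component decomposition via SVD of the mean-subtracted
# spectra: the rows of vt are the eigenspectra, ordered by variance.
mean_spectrum = spectra.mean(axis=0)
residuals = spectra - mean_spectrum
_, s, vt = np.linalg.svd(residuals, full_matrices=False)

# Projecting each spectrum onto the leading eigenspectra gives a small
# set of coefficients that can serve as a spectral classification space.
coeffs = residuals @ vt[:2].T
print(coeffs.shape)  # (196, 2)
```

The line-strength, wavelet, and synthesis-fitting methods play the same role: reducing each spectrum to a handful of numbers on which a classification can be defined.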
Model-Based Development of Distributed Embedded Systems by the Example of the Scicos/SynDEx Framework
The embedded systems engineering industry faces increasing demands for more
functionality, rapidly evolving components, and shrinking schedules. The
ability to adapt quickly to changes, develop products with safe designs,
minimize project costs, and deliver on time is needed. Model-based development
(MBD) follows a separation of concerns by abstracting systems at an appropriate
level of detail. MBD promises better comprehension through modeling at several
abstraction levels, formal verification, and automated code generation. This
thesis demonstrates MBD with the Scicos/SynDEx framework on a distributed
embedded system. Scicos is a modeling and simulation environment for hybrid
systems. SynDEx is a rapid prototyping integrated development environment for
distributed systems. The examples implement well-known control algorithms on a
target system containing several networked microcontrollers, sensors, and
actuators. The research question addressed is the feasibility of MBD for
medium-sized embedded systems. For single-processor applications, experiments
show that the convenience of tool-provided simulation, verification, and code
generation has to be weighed against increased static and dynamic memory
consumption compared to a hand-written approach. Establishing
a near-seamless modeling-framework with Scicos/SynDEx is expensive. An
increased development effort indicates a high price for developing single
applications, but might pay off for product families. A further drawback was
that the distributed code generated with SynDEx could not be adapted to
microcontrollers without a significant alteration of the scheduling tables. The
Scicos/SynDEx framework forms a valuable tool set that, however, still needs
many improvements. Therefore, its usage is only recommended for experimental
purposes.
Comment: 146 pages, Master's Thesis
Hubble Space Telescope: Optical telescope assembly handbook. Version 1.0
The Hubble Space Telescope is described along with how its design affects the images produced at the Science Instruments. An overview of the hardware is presented, with details of the focal plane, the throughput of the telescope, and the point spread function (the image of an unresolved point source). Detailed simulations of the point spread function are available, which might be useful to observers in planning their observations and in reducing their data.
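The point spread function of a circular aperture can be illustrated with the textbook Airy pattern. The sketch below is a generic diffraction calculation, not the handbook's model: the 2.4 m aperture and 550 nm wavelength are nominal values assumed for illustration, and the real HST PSF also includes the effects of the secondary-mirror obstruction and optical aberrations.

```python
import numpy as np
from scipy.special import j1

# Nominal values assumed for illustration: 2.4 m aperture, 550 nm light.
D = 2.4          # aperture diameter in metres
lam = 550e-9     # wavelength in metres

def airy_intensity(theta):
    """Normalised Airy-pattern intensity at off-axis angle theta (radians)."""
    x = np.pi * D * np.sin(theta) / lam
    # Guard the on-axis limit x -> 0, where the intensity is exactly 1.
    return np.where(x == 0, 1.0, (2 * j1(x) / np.maximum(x, 1e-300)) ** 2)

print(float(airy_intensity(np.array([0.0]))[0]))  # 1.0 on axis

# First dark ring of the Airy pattern at theta = 1.22 * lam / D.
theta_first_min = 1.22 * lam / D
print(f"first minimum ≈ {np.degrees(theta_first_min) * 3600:.3f} arcsec")
```

For an unobstructed 2.4 m aperture this puts the first dark ring at roughly 0.06 arcsec at visible wavelengths, which sets the scale of the diffraction-limited images discussed above.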
Exploring the value of supporting multiple DSM protocols in Hardware DSM Controllers
The performance of a hardware distributed shared memory (DSM) system is largely dependent on its architect's ability to reduce the number of remote memory misses that occur. Previous attempts to solve this problem have included measures such as supporting both the CC-NUMA and S-COMA architectures in the same machine and providing a programmable DSM controller that can emulate any DSM mechanism. In this paper we first present the design of a DSM controller that supports multiple DSM protocols in custom hardware and allows the programmer or compiler to specify, on a per-variable basis, which protocol to use to keep that variable coherent. The simulated performance of this DSM controller compares favorably with that of conventional single-protocol custom hardware designs, often outperforming the conventional systems by a factor of two. To achieve these promising results, the multi-protocol DSM controller needed to support only two DSM architectures (CC-NUMA and S-COMA) and three coherency protocols (both release consistent and sequentially consistent write invalidate, and release consistent write update). This work demonstrates the value of supporting a degree of flexibility in one's DSM controller design and suggests what operations such a flexible DSM controller should support.
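The per-variable protocol choice can be pictured with a toy dispatch model. The sketch below is an abstract software analogy in Python, not the paper's hardware design; all names and the variable-to-protocol assignments are hypothetical.

```python
from enum import Enum, auto

class Protocol(Enum):
    # The three coherency protocols named in the abstract.
    WRITE_INVALIDATE_SC = auto()   # sequentially consistent write invalidate
    WRITE_INVALIDATE_RC = auto()   # release consistent write invalidate
    WRITE_UPDATE_RC = auto()       # release consistent write update

# Per-variable protocol table, as a programmer or compiler might fill it
# (hypothetical variables and assignments, for illustration only).
protocol_of = {
    "lock_flag": Protocol.WRITE_UPDATE_RC,       # hot, widely read: update
    "big_buffer": Protocol.WRITE_INVALIDATE_RC,  # migratory data: invalidate
}

def on_remote_write(var, sharers):
    """Return the coherence messages a write to `var` would trigger."""
    proto = protocol_of[var]
    if proto is Protocol.WRITE_UPDATE_RC:
        # Update protocol: push the new value to every sharing node.
        return [("update", node) for node in sharers]
    # Invalidate protocols: invalidate every sharing node's copy.
    return [("invalidate", node) for node in sharers]

print(on_remote_write("lock_flag", [1, 2]))
# [('update', 1), ('update', 2)]
```

The point of the table is that the dispatch happens per variable rather than per machine, which is what distinguishes the design from single-protocol controllers.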
Severity of Visual Field Loss at First Presentation to Glaucoma Clinics in England and Tanzania
Purpose: To compare severity of visual field (VF) loss at first presentation in glaucoma clinics in England and Tanzania.
Methods: Large archives of VF records from automated perimetry were used to retrospectively examine vision loss at first presentation in glaucoma clinics in Tanzania (N = 1,502) and England (N = 9,264). Mean deviation (MD) of the worse eye at the first hospital visit was used as an estimate of detectable VF loss severity.
Results: In Tanzania, 44.7% {CI95%: 42.2, 47.2} of patients presented with severe VF loss (< −20 dB), versus 4.6% {4.1, 5.0} in England. If we consider late presentation to also include cases of advanced loss (−12.01 dB to −20 dB), then the proportion of patients presenting late was 58.1% {55.6, 60.6} and 14.0% {13.3, 14.7}, respectively. The proportion of late presentations was greater in Tanzania at all ages, but the difference was particularly pronounced among working-age adults, with 50.3% {46.9, 53.7} of 18–65-year-olds presenting with advanced or severe VF loss, versus 10.2% {9.3, 11.3} in England. In both countries, men were more likely to present late than women.
Conclusions: Late presentation of glaucoma is a problem in England, and an even greater challenge in Tanzania. Possible solutions are discussed, including increased community eye-care, and a more proactive approach to case finding through the use of disruptive new technologies, such as low-cost, portable diagnostic aids.
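The bracketed figures are 95% confidence intervals on proportions. As a worked illustration, the Wilson score interval (an assumption; the abstract does not state which interval method was used) applied to counts consistent with the Tanzanian sample reproduces the reported interval for the severe-loss proportion:

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score confidence interval for a proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Counts chosen to match the reported 44.7% of N = 1,502 (671/1502).
k, n = 671, 1502
lo, hi = wilson_ci(k, n)
print(f"{k/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")
# 44.7% (95% CI 42.2%-47.2%)
```

With these assumed counts the computed interval matches the reported {42.2, 47.2}, though other interval methods give very similar bounds at this sample size.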
Spin-scanning Cameras for Planetary Exploration: Imager Analysis and Simulation
In this thesis, a novel approach to spaceborne imaging is investigated, building upon the scan imaging technique in which camera motion is used to construct an image. This thesis investigates its use with wide-angle (≥90° field of view) optics mounted on spin stabilised probes for large-coverage imaging of planetary environments, and focusses on two instruments. Firstly, a descent camera concept for a planetary penetrator. The imaging geometry of the instrument is analysed. Image resolution is highest at the penetrator’s nadir and lowest at the horizon, whilst any point on the surface is imaged with highest possible resolution when the camera’s altitude is equal to that point’s radius from nadir. Image simulation is used to demonstrate the camera’s images and investigate analysis techniques. A study of stereophotogrammetric measurement of surface topography using pairs of descent images is conducted. Measurement accuracies and optimum stereo geometries are presented. Secondly, the thesis investigates the EnVisS (Entire Visible Sky) instrument, under development for the Comet Interceptor mission. The camera’s imaging geometry, coverage and exposure times are calculated, and used to model the expected signal and noise in EnVisS observations. It is found that the camera’s images will suffer from low signal, and four methods for mitigating this – binning, coaddition, time-delay integration and repeat sampling – are investigated and described. Use of these methods will be essential if images of sufficient signal are to be acquired, particularly for conducting polarimetry, the performance of which is modelled using Monte Carlo simulation. Methods of simulating planetary cameras’ images are developed to facilitate the study of both cameras. 
These methods enable the accurate simulation of planetary surfaces and cometary atmospheres, are based on Python libraries commonly used in planetary science, and are intended to be readily modified and expanded for facilitating the study of a variety of planetary cameras.
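The gain from binning and coaddition follows from Poisson statistics: summing N independent frames multiplies the signal by N but the noise by only √N, improving signal-to-noise by √N. A minimal numerical check, with made-up frame parameters rather than EnVisS values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical faint source: 4 photons per pixel per frame on average.
mean_counts = 4.0
n_frames = 16
frames = rng.poisson(mean_counts, size=(n_frames, 10_000))

# Signal-to-noise of a single frame versus the coadd of all frames.
snr_single = frames[0].mean() / frames[0].std()
coadd = frames.sum(axis=0)
snr_coadd = coadd.mean() / coadd.std()

# Coadding 16 frames should improve SNR by about sqrt(16) = 4.
print(snr_coadd / snr_single)
```

Time-delay integration and on-chip binning obey the same square-root scaling; the trade-offs among the four methods lie in readout noise, smear, and spatial resolution rather than in this basic statistic.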
Can biological quantum networks solve NP-hard problems?
There is a widespread view that the human brain is so complex that it cannot
be efficiently simulated by universal Turing machines. During the last decades
the question has therefore been raised whether we need to consider quantum
effects to explain the imagined cognitive power of a conscious mind.
This paper presents a personal view of several fields of philosophy and
computational neurobiology in an attempt to suggest a realistic picture of how
the brain might work as a basis for perception, consciousness and cognition.
The purpose is to be able to identify and evaluate instances where quantum
effects might play a significant role in cognitive processes.
Not surprisingly, the conclusion is that quantum-enhanced cognition and
intelligence are very unlikely to be found in biological brains. Quantum
effects may certainly influence the functionality of various components and
signalling pathways at the molecular level in the brain network, like ion
ports, synapses, sensors, and enzymes. This might evidently influence the
functionality of some nodes and perhaps even the overall intelligence of the
brain network, but hardly give it any dramatically enhanced functionality. So,
the conclusion is that biological quantum networks can only approximately solve
small instances of NP-hard problems.
On the other hand, artificial intelligence and machine learning implemented
in complex dynamical systems based on genuine quantum networks can certainly be
expected to show enhanced performance and quantum advantage compared with
classical networks. Nevertheless, even quantum networks can only be expected to
efficiently solve NP-hard problems approximately. In the end it is a question
of precision: Nature is approximate.
Comment: 38 pages