
    A Bi-Directional Approach for Developing Data Warehouses in Public Sectors

    The data warehouse is proclaimed as the latest decision support technology. Because data warehouses require a significant amount of organizational resources to develop, much research has been devoted to identifying the critical success factors and the formulas for an assured return on data warehouse investments. This study proposes a bi-directional development approach for data warehouses in public sectors. The primary rationale for the proposed approach is that the organizational goals of public sector organizations differ fundamentally from those of private sector organizations. Whereas the ultimate goal of private sector organizations is profit making, public sector organizations have a set of conflicting goals, including different social and political objectives. The star schema, as a dimensional data model for data warehouses, is not entirely suitable for data warehouses that demand the analysis of both quantitative and qualitative measures. Using the data warehouse in the College of Business Administration at California State University, Sacramento as a case study, we illustrate how the QQ (Quantitative and Qualitative) data schema accommodates the need to capture both quantitative and qualitative information. In addition, we describe the bi-directional top-down/bottom-up initiative, the formal/informal information collection, and the enterprise data warehouse/subject data mart architecture for the data warehouse.
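    As a rough, hypothetical illustration of the schema idea (not the paper's actual design; all record and field names below are invented), a conventional star-schema fact record carries only numeric measures keyed to dimension tables, whereas a QQ-style record would also carry coded or free-text qualitative measures alongside them. A minimal Python sketch:

from dataclasses import dataclass, field
from typing import List

# Conventional star-schema fact row: numeric measures keyed to dimension tables.
@dataclass
class EnrollmentFact:
    term_key: int          # foreign key into a Term dimension
    course_key: int        # foreign key into a Course dimension
    student_count: int     # quantitative measure
    avg_gpa: float         # quantitative measure

# Hypothetical QQ-style row: the same dimensional keys, but with qualitative
# measures (free-text or coded observations) stored alongside the quantitative
# ones so both can be analyzed per dimension.
@dataclass
class EnrollmentQQFact:
    term_key: int
    course_key: int
    student_count: int
    avg_gpa: float
    advising_notes: List[str] = field(default_factory=list)          # qualitative measure
    accreditation_findings: List[str] = field(default_factory=list)  # qualitative measure

if __name__ == "__main__":
    row = EnrollmentQQFact(
        term_key=202301, course_key=42, student_count=35, avg_gpa=3.1,
        advising_notes=["High demand from transfer students"],
    )
    print(row)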

    The Effects of Information Load on Decision Making In a Decision Support Environment

    The conflicting results of previous studies examining DSS effectiveness suggest that other factors may be affecting a user's ability to process information. Several research studies in the marketing, accounting, and psychology disciplines have examined the effects information load has on decision quality in manual decision-making tasks. Their results strongly indicate that decision-makers working under information loads beyond an optimal point perform poorly or render poorer decisions. This study examines the relationship between information load and decision quality in a DSS (computer-aided problem solving) environment. The results suggest that, in spite of information technology's support, information load can affect a user's decisions.

    A Comparative Analysis of Manual and Computer-Aided Ranking Tasks for Curriculum Development

    Inconsistencies in judgement during a manual ranking task can prevent the clear identification of an underlying (ranking) policy. The AHP (analytic hierarchy process) provides an alternative that overcomes this problem. This study examines these methods in the context of IS curriculum development for their ability to accurately capture the policies of twenty-eight judges. A cluster analysis based on the rankings identifies their underlying policies and thereby suggests the core courses for the curriculum. The results demonstrate the AHP's ability to capture more consistent ranking policies and thereby produce clusters of higher predictive quality.
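    For readers unfamiliar with the technique, the following minimal Python sketch shows the standard AHP priority-weight computation from a pairwise comparison matrix (principal eigenvector) together with Saaty's consistency ratio; the comparison values are invented for illustration and are not taken from the study.

import numpy as np

# Saaty's random consistency index for matrices of order 1..10.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_weights(A):
    """Return priority weights and consistency ratio for a pairwise comparison matrix."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                        # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    cr = ci / RI[n]                        # consistency ratio (commonly acceptable if < 0.10)
    return w, cr

# Hypothetical pairwise comparisons of three candidate courses.
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))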

    Integrating the IT/IS Professional Community with IT/IS Academic Programs

    Developing a successful IT/IS curriculum requires departments to understand the needs of their constituents, the organizations that hire their graduates. As many recent studies have revealed, the success of an IT/IS graduate rests on the possession of both non-technical and technical skills. Furthermore, a greater understanding of how IT can be applied to solving organizational problems is sought. This study presents the findings of a recent national survey that asked respondents to rank the importance of certain skills and of academic/professional community involvement. The results suggest that IT/IS curricula should emphasize developing professional skills in students, such as work ethic, problem solving, and oral and team communication skills. By the same token, ways should be sought to integrate professional experiences into curricula for developing these skills.

    Polarization Aberration in Astronomical Telescopes

    The point spread function (PSF) for astronomical telescopes and instruments depends not only on geometric aberrations and scalar wave diffraction, but also on the apodization and wavefront errors introduced by coatings on reflecting and transmitting surfaces within the optical system. The functional form of these aberrations, called polarization aberrations, results from the angles of incidence and the variation of the coatings' response with angle. These coatings induce small modifications to the PSF, which then consists of four separate components: two nearly Airy-disk PSF components and two faint components, which we call ghost PSF components, with a spatial extent about twice the size of the diffraction-limited image. As the specifications of optical systems constantly improve, these small effects become increasingly important. It is shown how these ghost PSF components, at a magnitude of ~10^(-5) in the example telescope, can interfere with exoplanet detection with coronagraphs.

    Polarization Aberrations in Astronomical Telescopes: The Point Spread Function

    Detailed knowledge of the point spread function (PSF) image is necessary to optimize astronomical coronagraph masks and to understand potential sources of error in astrometric measurements. The PSF for astronomical telescopes and instruments depends not only on geometric aberrations and scalar wave diffraction but also on the wavefront errors introduced by the physical optics and the polarization properties of reflecting and transmitting surfaces within the optical system. These vector wave aberrations, called polarization aberrations, result from two sources: (1) the mirror coatings necessary to make the highly reflecting mirror surfaces, and (2) the optical prescription with its inevitable non-normal incidence of rays on reflecting surfaces. The purpose of this article is to characterize the importance of polarization aberrations, to describe the analytical tools for calculating the PSF image, and to provide the background needed to understand how astronomical image data may be affected. To show the order of magnitude of the effects of polarization aberrations on astronomical images, a generic astronomical telescope configuration is analyzed here by modeling a fast Cassegrain telescope followed by a single 90° deviation fold mirror. All mirrors in this example use bare aluminum reflective coatings, and the illumination wavelength is 800 nm. Our findings for this example telescope are: (1) The image plane irradiance distribution is the linear superposition of four PSF images: one for each of the two orthogonal polarizations and one for each of two cross-coupled polarization terms. (2) The PSF image is brighter by 9% for one polarization component compared to its orthogonal state. (3) The PSF images for the two orthogonal linear polarization components are shifted with respect to each other, causing the PSF image for unpolarized point sources to become slightly elongated (elliptical), with a centroid separation of about 0.6 mas. This is important for both astrometry and coronagraph applications. (4) Part of the aberration is a polarization-dependent astigmatism, with a magnitude of 22 milliwaves, which enlarges the PSF image. (5) The orthogonally polarized components of unpolarized sources contain wavefront aberrations that differ by approximately 32 milliwaves, which implies that a wavefront correction system cannot optimally correct the aberrations for all polarizations simultaneously. (6) The polarization aberrations couple a small part of each polarization component of the light (~10^(-4)) into the orthogonal polarization, where these components form highly distorted secondary, or “ghost”, PSF images. (7) The radius of the 90% encircled energy of these two ghost PSF images is about twice as large as the radius of the Airy diffraction pattern. Coronagraphs for terrestrial exoplanet science are expected to image objects at contrasts of 10^(-10), about 6 orders of magnitude fainter than the instrument-induced “ghost” PSF image, which will therefore interfere with exoplanet measurements. A polarization aberration expansion that approximates the Jones pupil of the example telescope in six polarization terms is presented in the appendix. Individual terms can be associated with particular polarization defects. The dependence of these terms on the angles of incidence, the numerical aperture, and the Taylor series representation of the Fresnel equations leads to algebraic relations between these parameters and the scaling of the polarization aberrations. These “design rules” applicable to the example telescope are collected in § 5. Currently, exoplanet coronagraph masks are designed and optimized for scalar diffraction in optical systems. Radiation from the “ghost” PSF image leaks around currently designed image plane masks, and we show here why a vector-wave, or polarization, optimization is recommended. These effects follow from a natural description of the optical system in terms of the Jones matrices associated with each ray path of interest. The importance of these effects varies by orders of magnitude between different optical systems, depending on the optical design and the coatings selected. Some of these effects can be calibrated, while others are more problematic. Polarization aberration mitigation methods and technologies to minimize these effects are discussed. These effects have important implications for high-contrast imaging, coronagraphy, and astrometry, with their stringent requirements on PSF image symmetry and scattered light.
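    As a sketch of finding (1), in our own notation rather than the article's: under a Jones-pupil description each pupil element Fourier-transforms to an amplitude response, and for an unpolarized point source the four resulting irradiances add incoherently, giving two bright, nearly Airy components and two faint cross-coupled “ghost” components.

\[
\mathbf{J}(\xi,\eta) =
\begin{pmatrix}
J_{xx}(\xi,\eta) & J_{xy}(\xi,\eta)\\
J_{yx}(\xi,\eta) & J_{yy}(\xi,\eta)
\end{pmatrix},
\qquad
a_{ij}(x,y) = \mathcal{F}\!\left\{ J_{ij}(\xi,\eta) \right\},
\]
\[
I_{\mathrm{unpol}}(x,y) \;\propto\;
\underbrace{\lvert a_{xx}\rvert^{2} + \lvert a_{yy}\rvert^{2}}_{\text{nearly Airy}}
\;+\;
\underbrace{\lvert a_{xy}\rvert^{2} + \lvert a_{yx}\rvert^{2}}_{\text{ghost PSFs},\ \sim 10^{-4}}.
\]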

    Investigating Variation in the Prevalence of Weathering in Faunal Assemblages in the UK: A Multivariate Statistical Approach

    This article presents an exploratory multivariate statistical approach to gaining a more comprehensive understanding of variation in subaerial bone weathering in a British context. Weathering is among the most common taphonomic modifications and provides a crucial line of evidence for reconstructing the taphonomic trajectories of faunal assemblages and archaeological deposits. It provides clear evidence for prolonged subaerial exposure, either before deposition in a context or because of later disturbance. In combination with other taphonomic indices such as gnawing, trampling, abrasion and fracture patterns, weathering can be used to reconstruct depositional histories and to investigate the structured treatment of different body parts or taxa in deposition. However, a broad range of factors affect the prevalence and severity of weathering, and patterns can therefore rarely be interpreted at face value. Many variables, such as the predepositional microenvironment, cannot be traced archaeologically. Other contributory factors pertaining to the structural properties of elements and taxa can be discerned and must be taken into account when interpreting weathering signatures. However, disagreement exists regarding which variables are most important in mediating weathering. In addition, for zooarchaeologists to interpret modification patterns, the elements and taxa most likely to be affected by weathering need to be defined, because deposits dominated by those classes of remains are likely to exhibit greater modification than deposits that are not, even if their depositional histories were similar. Through a combination of classification tree and ordinal regression analysis, this article identifies which archaeologically recoverable variables explain the greatest variance in weathering and which anatomical elements and taxa are most likely to be affected in archaeological deposits in the UK.
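    As a rough illustration of the kind of analysis described (not the article's actual dataset or model specification; all column names and values below are invented), the Python sketch fits a classification tree to predict an ordinal weathering stage from recordable assemblage variables and reports which variables the tree finds most informative.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Hypothetical specimen-level data: recordable variables and an ordinal
# weathering stage (e.g. stages 0-5) as the outcome.
df = pd.DataFrame({
    "taxon":        ["cattle", "sheep", "pig", "cattle", "sheep", "horse"] * 20,
    "element":      ["femur", "rib", "tibia", "mandible", "humerus", "rib"] * 20,
    "context_type": ["pit", "ditch", "layer", "pit", "ditch", "layer"] * 20,
    "stage":        [0, 2, 1, 3, 2, 4] * 20,   # ordinal weathering stage
})

X = pd.get_dummies(df[["taxon", "element", "context_type"]])
y = df["stage"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Classification tree: which variables best split the weathering stages?
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", round(tree.score(X_test, y_test), 2))

# Variable importances as a first pass at "which variables explain the most".
importances = pd.Series(tree.feature_importances_, index=X.columns)
print(importances.sort_values(ascending=False).head())

    The article pairs tree-based classification with ordinal regression, which respects the ordering of the weathering stages; that second step is omitted from this sketch for brevity.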

    Direct maximum parsimony phylogeny reconstruction from genotype data

    Background: Maximum parsimony phylogenetic tree reconstruction from genetic variation data is a fundamental problem in computational genetics with many practical applications in population genetics, whole genome analysis, and the search for genetic predictors of disease. Efficient methods are available for reconstructing maximum parsimony trees from haplotype data, but such data are difficult to determine directly for autosomal DNA. Data are more commonly available in the form of genotypes, which consist of conflated combinations of pairs of haplotypes from homologous chromosomes. Currently, there are no general algorithms for the direct reconstruction of maximum parsimony phylogenies from genotype data, so phylogenetic applications for autosomal data must rely on other methods to first computationally infer haplotypes from genotypes.
    Results: In this work, we develop the first practical method for computing maximum parsimony phylogenies directly from genotype data. We show that the standard practice of first inferring haplotypes from genotypes and then reconstructing a phylogeny on the haplotypes often substantially overestimates phylogeny size. As an immediate application, our method can be used to determine the minimum number of mutations required to explain a given set of observed genotypes.
    Conclusion: Phylogeny reconstruction directly from unphased data is computationally feasible for moderate-sized problem instances and can lead to substantially more accurate tree size inferences than the standard practice of treating phasing and phylogeny construction as two separate analysis stages. The difference between the approaches is particularly important for downstream applications that require a lower bound on the number of mutations that the genetic region has undergone.
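    To illustrate why genotypes conflate haplotypes (the core difficulty the method addresses), here is a small Python sketch that enumerates the haplotype pairs consistent with a genotype under a simple site encoding chosen for this example; the exhaustive enumeration is only a toy view of the phasing search space, not the paper's algorithm.

from itertools import product

# Encoding used here, per site: 0 = homozygous 0/0, 1 = homozygous 1/1,
# 2 = heterozygous (one chromosome carries 0, the other carries 1).
def phasings(genotype):
    """Yield all (haplotype1, haplotype2) pairs consistent with one genotype."""
    het_sites = [i for i, g in enumerate(genotype) if g == 2]
    base = [g if g != 2 else None for g in genotype]
    # Each heterozygous site can be resolved two ways; fixing the first het
    # site breaks the h1/h2 symmetry and halves the enumeration.
    for bits in product((0, 1), repeat=max(len(het_sites) - 1, 0)):
        assignment = (0,) + bits if het_sites else ()
        h1, h2 = list(base), list(base)
        for site, b in zip(het_sites, assignment):
            h1[site], h2[site] = b, 1 - b
        yield tuple(h1), tuple(h2)

# Example: a 4-site genotype with two heterozygous sites has two distinct phasings.
for h1, h2 in phasings([0, 2, 1, 2]):
    print(h1, h2)

    With k heterozygous sites a genotype admits 2^(k-1) distinct phasings, which is why choosing a phasing first and building the tree afterwards can drift away from the most parsimonious joint solution, as the abstract's comparison of the two approaches indicates.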