
    An Analysis of the State of Public Archaeology in Canadian Public School Curricula

    Public support of archaeology is necessary for effective heritage legislation and for preventing site vandalism and looting. One of the best ways for the public to understand the importance of archaeology and heritage conservation is through school-aged education. This paper examines the nature and extent of archaeology coverage in Canadian public school curricula. To determine the extent of archaeological material in school curricula, Social Studies curricula from each province and territory are examined and critically evaluated. This analysis indicates that archaeology is rarely taught, and when it is, it lacks a Canadian focus. These findings are then compared to the guidelines developed by the Canadian Archaeological Association (CAA) to determine whether its expectations for students' achievement in archaeology are appropriate and are being met. This research highlights the gap between CAA guidelines and Canadian curricula and pinpoints what is lacking in school-aged archaeological education in Canada.

    Towards an Evaluation-Based Framework of Collaborative Archaeology

    Collaborative archaeology is a growing field within the discipline, albeit one that is rarely analyzed. Although collaborative approaches are varied and diverse, we argue that they can all share a single methodological framework. Moreover, we suggest that collaborative archaeology projects can be evaluated to determine the variety among projects and to identify the elements of engaged research. We provide two case studies emphasizing project evaluation: (1) inter-project evaluation of community engagement in British Columbia archaeology and (2) intra-project evaluation of co-management archaeology projects in Western Australia. The two case studies show that project evaluation is feasible and that a single framework can be applied to many different types of projects. Collaborative archaeology requires analysis and evaluation to determine what facilitates engagement, to further the discipline, and to create better connections between archaeologists and community members. The case studies discussed illustrate two shared methods for accomplishing this. The paper argues that collaborative approaches are necessary for advancing archaeological practice.

    Photometric Redshift Probability Distributions for Galaxies in the SDSS DR8

    We present redshift probability distributions for galaxies in the SDSS DR8 imaging data. We used the nearest-neighbor weighting algorithm presented in Lima et al. 2008 and Cunha et al. 2009 to derive the ensemble redshift distribution N(z) and individual redshift probability distributions P(z) for galaxies with r < 21.8. As part of this technique, we calculated weights for a set of training galaxies with known redshifts such that their density distribution in five-dimensional color-magnitude space was proportional to that of the photometry-only sample, producing a nearly fair sample in that space. We then estimated the ensemble N(z) of the photometric sample by constructing a weighted histogram of the training-set redshifts. We derived P(z)'s for individual objects using the same technique, but limiting to training-set objects from the local color-magnitude space around each photometric object. Using the P(z) for each galaxy, rather than an ensemble N(z), can reduce the statistical error in measurements that depend on the redshifts of individual galaxies. The spectroscopic training sample is substantially larger than that used for the DR7 release, and the newly added PRIMUS catalog is now the most important training set used in this analysis by a wide margin. We expect the primary source of error in the N(z) reconstruction to be sample variance: the training sets are drawn from relatively small volumes of space. Using simulations, we estimated that the uncertainty in N(z) at a given redshift is 10-15%. The uncertainty on calculations incorporating N(z) or P(z) depends on how they are used; we discuss the case of weak lensing measurements. The P(z) catalog is publicly available from the SDSS website.
    Comment: 29 pages, 9 figures, single column.
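    The nearest-neighbor weighting step described above can be sketched as follows. This is a minimal illustration under stated assumptions (Euclidean distances, a fixed k), not the Lima et al. implementation; the function names and the choice of k are illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_weights(train_xy, photo_xy, k=16):
    """Weight each training galaxy by the local density ratio of the
    photometric sample to the training sample: find the radius enclosing
    the k nearest training neighbours, then count photometric objects
    inside that same radius."""
    tree_train = cKDTree(train_xy)
    tree_photo = cKDTree(photo_xy)
    d_train, _ = tree_train.query(train_xy, k=k + 1)  # k+1: first hit is the point itself
    radius = d_train[:, -1]
    n_photo = np.array([len(tree_photo.query_ball_point(p, r))
                        for p, r in zip(train_xy, radius)])
    w = n_photo / float(k)           # local density ratio, up to a constant
    return w / w.sum()               # normalise the weights

def weighted_nz(train_z, weights, bins):
    """Ensemble redshift distribution N(z): a weighted histogram of the
    training-set redshifts."""
    nz, edges = np.histogram(train_z, bins=bins, weights=weights, density=True)
    return nz, edges
```

    The per-object P(z) variant would apply the same weighted histogram, but restricted to training objects near each photometric object in color-magnitude space.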

    Digital Bridges Across Disciplinary, Practical and Pedagogical Divides: An Online Professional Master’s Program in Heritage Resource Management

    Growth and diversification in heritage resource management (HRM) archaeology since the 1960s have created new demands for training the next generations of HRM leaders and for addressing persistent and counterproductive divisions between academic and applied archaeologies. The Simon Fraser University (SFU) Department of Archaeology has responded to these demands with an all-new, cohort-based, thesis-focused graduate program created by and for HRM professionals. The program's target audience is HRM practitioners who hold Bachelor's credentials, have initiated promising careers in HRM, and desire advanced, research-focused degrees to expand their professional capacity and upward mobility. The SFU program is structured and focused to provide intensive, predominantly online training in the four essential dimensions of HRM: law and policy, ethics and practice, business management, and research design and methods. The program has been successful through its initial cohort cycles and in attracting HRM industry interest in collaboration. Industry-academic partnerships in cognate disciplines have proved effective in comparable circumstances but remain underdeveloped as bases for planning and delivering state-of-the-art training in applied archaeology and the broader field of HRM. Critical next steps in program development entail identifying the attributes of HRM futures desired by all or most HRM stakeholders and collaboratively pursuing those desired futures.

    A Simple Likelihood Method for Quasar Target Selection

    We present a new method for quasar target selection using photometric fluxes and a Bayesian probabilistic approach. For our purposes we target quasars using Sloan Digital Sky Survey (SDSS) photometry to a magnitude limit of g = 22. The efficiency and completeness of this technique are measured using Baryon Oscillation Spectroscopic Survey (BOSS) data taken in 2010. This technique was used for the uniformly selected (CORE) sample of targets in BOSS year-one spectroscopy, to be realized in the 9th SDSS data release. When targeting at a density of 40 objects per sq-deg (the BOSS quasar targeting density), the efficiency of this technique in recovering z > 2.2 quasars is 40%. The completeness compared to all quasars identified in BOSS data is 65%. This paper also describes possible extensions and improvements for this technique.
    Comment: Updated to accepted version for publication in the Astrophysical Journal. 10 pages, 10 figures, 3 tables.
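    At its core, a Bayesian flux-based selection of this kind weighs the likelihood of the observed fluxes under a quasar population against the likelihood under a star population. A minimal sketch, assuming Gaussian photometric errors and toy template fluxes; the function name, the template averaging, and the prior value are illustrative, not the paper's actual likelihood construction:

```python
import numpy as np

def quasar_probability(flux, flux_err, qso_templates, star_templates, prior_qso=0.01):
    """Posterior probability that an object is a quasar: Gaussian
    likelihood of the observed fluxes averaged over each population's
    template fluxes, combined with a prior quasar fraction via Bayes'
    theorem."""
    def likelihood(templates):
        # chi-squared of the observed fluxes against each template, then
        # average the Gaussian likelihoods over templates
        chi2 = np.sum(((flux - templates) / flux_err) ** 2, axis=1)
        return np.mean(np.exp(-0.5 * chi2))
    l_qso = likelihood(qso_templates)
    l_star = likelihood(star_templates)
    num = prior_qso * l_qso
    denom = num + (1.0 - prior_qso) * l_star
    return num / denom if denom > 0 else 0.0
```

    Objects are then ranked by this posterior and targeted down to whatever threshold yields the desired surface density.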

    Precision Measurements of the Cluster Red Sequence using an Error Corrected Gaussian Mixture Model

    The red sequence is an important feature of galaxy clusters and plays a crucial role in optical cluster detection. Measurements of the slope and scatter of the red sequence are affected both by the selection of red sequence galaxies and by measurement errors. In this paper, we describe a new error-corrected Gaussian Mixture Model for red sequence galaxy identification. Using this technique, we can remove the effects of measurement error and extract unbiased information about the intrinsic properties of the red sequence. We use this method to select red sequence galaxies in each of the 13,823 clusters in the maxBCG catalog, and measure the red sequence ridgeline location and scatter of each. These measurements provide precise constraints on the variation of the average red galaxy population in the observed frame with redshift. We find that the scatter of the red sequence ridgeline increases mildly with redshift, and that the slope decreases with redshift. We also observe that the slope does not strongly depend on cluster richness. Using similar methods, we show that this behavior is mirrored in a spectroscopic sample of field galaxies, further emphasizing that ridgeline properties are independent of environment.
    Comment: 33 pages, 14 figures. A typo in Eq. A11 is fixed. The C++/Python codes for ECGMM can be downloaded from: https://sites.google.com/site/jiangangecgmm
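    The key idea, separating intrinsic red-sequence scatter from per-galaxy measurement error, can be illustrated with a one-dimensional error-corrected mixture fit. This is a generic sketch of the underlying EM iteration under stated assumptions (Gaussian errors, fixed iteration count, random initialization), not the authors' ECGMM code:

```python
import numpy as np

def ecgmm_1d(x, xerr, n_comp=2, n_iter=300, seed=0):
    """1-D error-corrected Gaussian mixture: EM in which each point's
    effective variance is the intrinsic component variance plus that
    point's measurement variance, so the fitted sigmas are intrinsic."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, n_comp, replace=False)
    sig2 = np.full(n_comp, np.var(x))
    w = np.full(n_comp, 1.0 / n_comp)
    e2 = xerr ** 2
    for _ in range(n_iter):
        var = sig2[None, :] + e2[:, None]          # error-broadened variances
        pdf = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pdf / pdf.sum(axis=1, keepdims=True)   # responsibilities
        # posterior mean and variance of each point's deconvolved value
        gain = sig2[None, :] / var
        b = mu + gain * (x[:, None] - mu)
        B = gain * e2[:, None]
        nk = r.sum(axis=0)
        mu = (r * b).sum(axis=0) / nk
        sig2 = (r * ((b - mu) ** 2 + B)).sum(axis=0) / nk
        w = nk / len(x)
    return w, mu, np.sqrt(sig2)
```

    Fitting a two-component version to cluster-member colors (red sequence plus blue cloud) yields a ridgeline scatter that is corrected for photometric noise rather than inflated by it.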

    Think Outside the Color Box: Probabilistic Target Selection and the SDSS-XDQSO Quasar Targeting Catalog

    We present the SDSS-XDQSO quasar targeting catalog for efficient flux-based quasar target selection down to the faint limit of the Sloan Digital Sky Survey (SDSS) catalog, even at medium redshifts (2.5 <~ z <~ 3) where the stellar contamination is significant. We build models of the distributions of stars and quasars in flux space down to the flux limit by applying the extreme-deconvolution method to estimate the underlying density. We convolve this density with the flux uncertainties when evaluating the probability that an object is a quasar. This approach results in a targeting algorithm that is more principled, more efficient, and faster than other similar methods. We apply the algorithm to derive low-redshift (z < 2.2) and medium-redshift (2.2 <= z <= 3.5) quasar probabilities for all 160,904,060 point sources with dereddened i-band magnitude between 17.75 and 22.45 mag in the 14,555 deg^2 of imaging from SDSS Data Release 8. The catalog can be used to define a uniformly selected and efficient low- or medium-redshift quasar survey, such as that needed for the SDSS-III's Baryon Oscillation Spectroscopic Survey project. We show that the XDQSO technique performs as well as the current best photometric quasar-selection technique at low redshift, and outperforms all other flux-based methods for selecting the medium-redshift quasars of our primary interest. We make code to reproduce the XDQSO quasar target selection publicly available.

    Photometric redshifts and quasar probabilities from a single, data-driven generative model

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme-deconvolution technique to estimate the underlying density. By integrating this density over redshift one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation, achieved by combining the speed of simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques, which are limited to fixed, broad redshift ranges and high signal-to-noise-ratio data, and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when the included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS, 84 and 97 percent of the objects with GALEX UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.
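    Once a Gaussian-mixture density over flux-redshift space is in hand, the redshift posterior for one object follows by slicing the joint density at the observed flux, broadened by its error, and normalising over redshift. A toy sketch with a single flux band (2-D mixture components), assuming the mixture parameters come from some prior extreme-deconvolution fit; all names are illustrative:

```python
import numpy as np

def photoz_posterior(flux, flux_err, weights, means, covs, z_grid):
    """p(z | flux) from a 2-D Gaussian mixture over (flux, redshift):
    evaluate the joint density on a redshift grid, with the flux
    dimension broadened by the measurement error, then normalise."""
    p = np.zeros_like(z_grid)
    for w, m, C in zip(weights, means, covs):
        S = C.copy()
        S[0, 0] += flux_err ** 2          # convolve flux axis with the error
        inv = np.linalg.inv(S)
        norm = 2.0 * np.pi * np.sqrt(np.linalg.det(S))
        for i, z in enumerate(z_grid):
            d = np.array([flux - m[0], z - m[1]])
            p[i] += w * np.exp(-0.5 * d @ inv @ d) / norm
    dz = z_grid[1] - z_grid[0]
    return p / (p.sum() * dz)             # normalise to a pdf in z
```

    Summing such sliced densities over components restricted to a redshift range gives the classification probabilities described above; multi-band photometry generalizes this to higher-dimensional components.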

    The SDSS-III Baryon Oscillation Spectroscopic Survey: Quasar Target Selection for Data Release Nine

    The SDSS-III Baryon Oscillation Spectroscopic Survey (BOSS), a five-year spectroscopic survey of 10,000 deg^2, achieved first light in late 2009. One of the key goals of BOSS is to measure the signature of baryon acoustic oscillations (BAO) in the distribution of Ly-alpha absorption from the spectra of a sample of ~150,000 z > 2.2 quasars. Along with measuring the angular diameter distance at z ~ 2.5, BOSS will provide the first direct measurement of the expansion rate of the Universe at z > 2. One of the biggest challenges in achieving this goal is an efficient target selection algorithm for quasars over 2.2 < z < 3.5, where their colors overlap those of stars. During the first year of the BOSS survey, quasar target selection methods were developed and tested to meet the requirement of delivering at least 15 quasars deg^-2 in this redshift range, out of 40 targets deg^-2. To achieve these surface densities, the magnitude limit of the quasar targets was set at g <= 22.0 or r <= 21.85. While detection of the BAO signature in the Ly-alpha absorption in quasar spectra does not require a uniform target selection, many other astrophysical studies do. We therefore defined a uniformly selected subsample of 20 targets deg^-2, for which the selection efficiency is just over 50%. This "CORE" subsample will be fixed for Years Two through Five of the survey. In this paper we describe the evolution and implementation of the BOSS quasar target selection algorithms during the first two years of BOSS operations. We analyze the spectra obtained during the first year, in which 11,263 new z > 2.2 quasars were spectroscopically confirmed by BOSS. Our current algorithms select an average of 15 z > 2.2 quasars deg^-2 from 40 targets deg^-2 using single-epoch SDSS imaging. Multi-epoch optical data and data at other wavelengths can further improve the efficiency and completeness of BOSS quasar target selection. [Abridged]
    Comment: 33 pages, 26 figures, 12 tables and a whole bunch of quasars. Submitted to Ap

    GREAT3 results I: systematic errors in shear estimation and the impact of real galaxy morphology

    We present first results from the third GRavitational lEnsing Accuracy Testing (GREAT3) challenge, the third in a sequence of challenges for testing methods of inferring weak gravitational lensing shear distortions from simulated galaxy images. GREAT3 was divided into experiments to test three specific questions, and included simulated space- and ground-based data with constant or cosmologically varying shear fields. The simplest (control) experiment included parametric galaxies with a realistic distribution of signal-to-noise, size, and ellipticity, and a complex point spread function (PSF). The other experiments tested the additional impact of realistic galaxy morphology, multiple-exposure imaging, and uncertainty about a spatially varying PSF; the last two questions will be explored in Paper II. The 24 participating teams competed to estimate lensing shears to within the systematic error tolerances for upcoming Stage-IV dark energy surveys, making 1525 submissions overall. GREAT3 saw considerable variety and innovation in the types of methods applied. Several teams now meet or exceed the targets in many of the tests conducted (to within the statistical errors). We conclude that the presence of realistic galaxy morphology in simulations changes shear calibration biases by ~1 per cent for a wide range of methods. Other effects, such as truncation biases due to finite galaxy postage stamps and the impact of galaxy type as measured by the Sérsic index, are quantified for the first time. Our results generalize previous studies regarding sensitivities to galaxy size and signal-to-noise, and to PSF properties such as seeing and defocus. Almost all methods' results support the simple model in which additive shear biases depend linearly on PSF ellipticity.
    Comment: 32 pages + 15 pages of technical appendices; 28 figures; submitted to MNRAS; latest version has minor updates in the presentation of 4 figures, no changes in content or conclusions
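    The bias model in the closing sentence is the standard linear one, g_obs = (1 + m) g_true + c, where m is the multiplicative and c the additive bias. A minimal least-squares sketch for recovering m and c from matched true and estimated shears; the function name is illustrative, not the challenge's evaluation code:

```python
import numpy as np

def fit_shear_biases(g_true, g_obs):
    """Fit g_obs = (1 + m) * g_true + c by linear least squares and
    return the multiplicative bias m and additive bias c."""
    A = np.column_stack([g_true, np.ones_like(g_true)])
    (slope, c), *_ = np.linalg.lstsq(A, g_obs, rcond=None)
    return slope - 1.0, c
```

    In a GREAT3-style analysis this fit would be repeated per PSF-ellipticity bin to test whether the additive term c scales linearly with PSF ellipticity, as the abstract reports.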