
    A citation analysis of the ACE2005-2007 proceedings, with reference to the June 2007 CORE conference and journal rankings

    This paper compares the CORE rankings of computing education conferences and journals to the frequency of citation of those journals and conferences in the ACE2005, 2006 and 2007 proceedings. The assumption underlying this study is that citation rates are a measure of esteem, and so there should be a positive relationship between citation rates and rankings. The CORE conference rankings appear to broadly reflect the ACE citations, but there are some inconsistencies between citation rates and the journal rankings. The paper also identifies the most commonly cited books in these ACE proceedings. Finally, in the spirit of "Quis custodiet ipsos custodes?" the paper discusses some ways in which the CORE rankings process itself might in future be made more transparent and open to scholarly discourse. © 2008, Australian Computer Society, Inc
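
    The assumption the paper tests, that better-ranked venues attract more citations, can be checked with a simple rank correlation. The sketch below is purely illustrative: the venue names, CORE ranks and citation counts are invented, and the Spearman test is one reasonable choice rather than the paper's own method.

        # Illustrative only: hypothetical venues, CORE ranks and citation counts,
        # used to show the kind of rank-vs-citation comparison the study assumes.
        from scipy.stats import spearmanr

        core_rank = {"A": 1, "B": 2, "C": 3}          # better rank letter -> smaller number
        venues = [
            ("Conference W", "A", 24),                # (venue, CORE rank, citations in ACE 2005-07)
            ("Conference X", "A", 17),
            ("Conference Y", "B", 9),
            ("Journal Z",    "C", 3),
        ]

        ranks = [core_rank[r] for _, r, _ in venues]
        counts = [c for _, _, c in venues]

        rho, p = spearmanr(ranks, counts)
        # A strongly negative rho (better rank, more citations) would support the assumed
        # positive relationship between esteem and citation rates.
        print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")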

    Variation in students' conceptions of object-oriented information system development


    Routine Crime in Exceptional Times: The Impact of the 2002 Winter Olympics on Citizen Demand for Police Services

    Despite their rich theoretical and practical importance, criminologists have paid scant attention to the patterns of crime and the responses to crime during exceptional events. Throughout the world, large-scale political, social, economic, cultural, and sporting events have become commonplace. Disasters such as blackouts, hurricanes, tornadoes, and tsunamis present similar opportunities. Such events often tax the capacities of jurisdictions to provide safety and security in response to the exceptional event, as well as to meet “routine” public safety needs. This article examines “routine” crime as measured by calls for police service, official crime reports, and police arrests in Salt Lake City before, during, and after the 2002 Olympic Games. The analyses suggest that while a rather benign demographic among attendees and the presence of large numbers of social control agents might have been expected to decrease calls for police service for minor crime, such calls actually increased in Salt Lake City during this period. The implications of these findings are considered for theories of routine activities, as well as for systems capacity.

    A citation analysis of the ICER 2005-07 proceedings

    This paper identifies the most commonly cited conferences, journals and books of the 43 papers within the first three ICER proceedings. A large array of conferences, journals, and books were cited. However, only a small set of journals and conferences were cited frequently, and the majority were only cited within a single paper, which is consistent with a power law distribution, as predicted by Zipf's Law. The most commonly cited books are concerned with education in general (29%) or psychology (20%), while 17% of books are concerned with computer science education and 12% with computing content. The citation results for ICER are contrasted with earlier published citation analyses of SIGCSE 2007 and ACE2005-07. © 2009, Australian Computer Society, Inc
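
    The skewed citation pattern described above, with a few venues cited often and most cited only once, can be illustrated with a toy Zipf check. The counts below are invented for demonstration and are not the ICER data.

        # Toy illustration of a Zipf-like citation distribution; counts are synthetic.
        import numpy as np

        citation_counts = np.array(sorted([30, 14, 9, 6, 4, 3, 2, 2, 1, 1, 1, 1, 1, 1], reverse=True))
        ranks = np.arange(1, citation_counts.size + 1)

        # Under Zipf's law, frequency ~ rank^(-s); fit s on the log-log scale.
        slope, intercept = np.polyfit(np.log(ranks), np.log(citation_counts), 1)
        print(f"log-log slope = {slope:.2f} (Zipf's law predicts a slope near -1)")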

    A citation analysis of the ACSC 2006-2008 proceedings, with reference to the CORE conference and journal rankings

    This paper compares the CORE rankings of computing conferences and journals to the frequency of citation of those journals and conferences in the Australasian Computer Science Conference (ACSC) 2006, 2007 and 2008 proceedings. The assumption underlying this study is that there should be a positive relationship between citation rates and the CORE rankings. Our analysis shows that the CORE rankings broadly reflect the ACSC citations, but with some anomalies. While these anomalies might be minor in the larger scheme of things, they need to be addressed, as the careers of individual academics may depend on them. Rankings are probably here to stay, and this paper ends with some suggestions on how the rankings process should now evolve, so that it becomes more transparent. Copyright © 2009, Australian Computer Society, Inc.

    Determination of the Joint Confidence Region of Optimal Operating Conditions in Robust Design by Bootstrap Technique

    Robust design has been widely recognized as a leading method for reducing variability and improving quality. Most of the engineering statistics literature focuses on finding "point estimates" of the optimum operating conditions for robust design, and various procedures for calculating such point estimates are considered. Although point estimation is important for continuous quality improvement, the immediate question is "how accurate are these optimum operating conditions?" The answer is to consider interval estimation for a single variable or joint confidence regions for multiple variables. In this paper, with the help of the bootstrap technique, we develop procedures for obtaining joint "confidence regions" for the optimum operating conditions. Two different procedures, using Bonferroni and multivariate normal approximations, are introduced. The proposed methods are illustrated and substantiated using a numerical example. Comment: two tables, three figures.
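
    As a rough sketch of the Bonferroni variant of this idea (not the authors' code), the example below bootstraps simulated response-surface data, re-estimates the stationary point of a two-factor quadratic model on each resample, and forms a rectangular joint 95% confidence region. The data-generating model and all numbers are assumptions for illustration.

        # Bootstrap Bonferroni joint confidence region for the optimum operating
        # conditions of a two-factor quadratic response surface (synthetic data).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 60
        x1 = rng.uniform(-1, 1, n)
        x2 = rng.uniform(-1, 1, n)
        # True optimum (minimum of the response) placed at (0.3, -0.2).
        y = 5 + (x1 - 0.3) ** 2 + 2 * (x2 + 0.2) ** 2 + rng.normal(0, 0.2, n)

        def optimum(x1, x2, y):
            """Fit y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 and return the stationary point."""
            X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2])
            b = np.linalg.lstsq(X, y, rcond=None)[0]
            return np.array([-b[1] / (2 * b[3]), -b[2] / (2 * b[4])])

        B = 2000
        boot = np.empty((B, 2))
        for i in range(B):
            idx = rng.integers(0, n, n)              # resample observations with replacement
            boot[i] = optimum(x1[idx], x2[idx], y[idx])

        # Bonferroni: for a joint 95% region over k = 2 coordinates, use 97.5% per coordinate.
        alpha, k = 0.05, 2
        lo, hi = 100 * alpha / (2 * k), 100 * (1 - alpha / (2 * k))
        region = [np.percentile(boot[:, j], [lo, hi]) for j in range(k)]
        print("Joint 95% Bonferroni region:", region)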

    Application of Bayesian model averaging to measurements of the primordial power spectrum

    Cosmological parameter uncertainties are often stated assuming a particular model, neglecting the model uncertainty, even when Bayesian model selection is unable to identify a conclusive best model. Bayesian model averaging is a method for assessing parameter uncertainties in situations where there is also uncertainty in the underlying model. We apply model averaging to the estimation of the parameters associated with the primordial power spectra of curvature and tensor perturbations. We use CosmoNest and MultiNest to compute the model evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR, BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find that the model-averaged 95% credible interval for the spectral index using all of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale of 0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper limit, depending on prior assumptions. Comment: 7 pages with 7 figures included.
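
    The averaging step itself is simple once each model's evidence and posterior samples are in hand: posteriors are mixed with weights proportional to the evidences. The sketch below uses synthetic samples and assumed log-evidences in place of the CosmoNest/MultiNest outputs, and assumes equal prior model probabilities.

        # Schematic Bayesian model averaging for a shared parameter (here, n_s).
        # All samples and evidences are synthetic stand-ins for illustration.
        import numpy as np

        rng = np.random.default_rng(1)

        samples = {
            "HZ+r":   rng.normal(0.995, 0.010, 50_000),   # nearly scale-invariant model
            "tilted": rng.normal(0.960, 0.012, 50_000),   # tilted-spectrum model
        }
        log_evidence = {"HZ+r": -1005.3, "tilted": -1004.1}  # assumed log-evidences

        # Posterior model probabilities (equal model priors assumed).
        logZ = np.array([log_evidence[m] for m in samples])
        w = np.exp(logZ - logZ.max())
        w /= w.sum()

        # Model-averaged posterior: mix samples from each model in proportion to w.
        N = 100_000
        averaged = np.concatenate([
            rng.choice(s, size=int(round(wi * N)), replace=True)
            for s, wi in zip(samples.values(), w)
        ])

        lo, hi = np.percentile(averaged, [2.5, 97.5])
        print(f"model-averaged 95% credible interval: {lo:.3f} < n_s < {hi:.3f}")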

    Kepler-539: a young extrasolar system with two giant planets on wide orbits and in gravitational interaction

    We confirm the planetary nature of Kepler-539b (aka Kepler object of interest K00372.01), a giant transiting exoplanet orbiting a solar-analogue G2 V star. The mass of Kepler-539b was accurately derived thanks to a series of precise radial velocity measurements obtained with the CAFE spectrograph mounted on the CAHA 2.2m telescope. A simultaneous fit of the radial-velocity data and Kepler photometry revealed that Kepler-539b is a dense Jupiter-like planet with a mass of Mp = 0.97 Mjup and a radius of Rp = 0.747 Rjup, completing a circular revolution around its parent star every 125.6 days. The semi-major axis of the orbit is roughly 0.5 au, implying that the planet is roughly 0.45 au from the habitable zone. By analysing the mid-transit times of the 12 transit events of Kepler-539b recorded by the Kepler spacecraft, we found a clear modulated transit timing variation (TTV), which is attributable to the presence of a planet c on a wider orbit. The few timings available do not allow us to precisely estimate the properties of Kepler-539c, and our analysis suggests that it has a mass between 1.2 and 3.6 Mjup, revolving on a very eccentric orbit (0.4 < e < 0.6) with a period longer than 1000 days. The high eccentricity of planet c is the probable cause of the TTV modulation of planet b. The analysis of the CAFE spectra revealed a relatively high photospheric lithium content, A(Li) = 2.48 dex, which, together with both a gyrochronological and an isochronal analysis, suggests that the parent star is relatively young. Comment: 11 pages, 14 figures, accepted for publication in Astronomy & Astrophysics.
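
    The TTV signal is read off as the residuals of the observed mid-transit times about a best-fitting linear ephemeris. The sketch below does this for synthetic timings with the paper's 125.6-day period; the epochs, reference time and injected TTV amplitude are made up for illustration only.

        # Illustrative O - C (observed minus calculated) analysis of mid-transit times.
        import numpy as np

        rng = np.random.default_rng(2)
        period_true = 125.6                          # days, from the paper
        epochs = np.arange(12)
        # Synthetic mid-transit times with a ~10-minute sinusoidal TTV plus timing noise.
        ttv = (10 / 1440) * np.sin(2 * np.pi * epochs / 8)
        t_mid = 140.0 + period_true * epochs + ttv + rng.normal(0, 2 / 1440, epochs.size)

        # Linear ephemeris T(E) = T0 + P * E fitted by least squares.
        P, T0 = np.polyfit(epochs, t_mid, 1)
        o_minus_c = t_mid - (T0 + P * epochs)

        print(f"fitted period = {P:.4f} d")
        print("O - C residuals (minutes):", np.round(o_minus_c * 1440, 2))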

    A Bayesian spatio-temporal model of panel design data: airborne particle number concentration in Brisbane, Australia

    This paper outlines a methodology for semi-parametric spatio-temporal modelling of data which are dense in time but sparse in space, obtained from a split panel design, the most feasible approach to covering space and time with limited equipment. The data are hourly averaged particle number concentration (PNC) and were collected as part of the Ultrafine Particles from Transport Emissions and Child Health (UPTECH) project. Two weeks of continuous measurements were taken at each of a number of government primary schools in the Brisbane Metropolitan Area, with the monitoring equipment taken to each school sequentially. The school data are augmented by data from long-term monitoring stations at three locations in Brisbane, Australia. Fitting the model helps describe the spatial and temporal variability at a subset of the UPTECH schools and the long-term monitoring sites. The temporal variation is modelled hierarchically with penalised random walk terms, one common to all sites and a term accounting for the remaining temporal trend at each site. Parameter estimates and their uncertainty are computed in a computationally efficient approximate Bayesian inference environment, R-INLA. The temporal part of the model explains daily and weekly cycles in PNC at the schools, which can be used to estimate the exposure of school children to ultrafine particles (UFPs) emitted by vehicles. At each school and long-term monitoring site, peaks in PNC can be attributed to the morning and afternoon rush-hour traffic and to new particle formation events. The spatial component of the model describes the school-to-school variation in mean PNC and the variation within each school ground. It is shown how the spatial model can be expanded to identify spatial patterns at the city scale with the inclusion of more spatial locations. Comment: Draft of this paper presented as a poster at ISBA 2012; part of the UPTECH project.
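
    The hierarchical temporal structure, a penalised random-walk trend shared by all sites plus a site-specific random-walk departure and observation noise, can be sketched by simulation. The NumPy example below only illustrates the model form for a split-panel layout; it is not the R-INLA fit used in the paper, and all values are synthetic.

        # Simulated sketch of the hierarchical temporal model: common RW1 trend,
        # site-specific RW1 departures, site-level means, and observation noise.
        import numpy as np

        rng = np.random.default_rng(3)
        n_hours, n_sites = 24 * 14, 4                 # two weeks of hourly data at four sites

        common = np.cumsum(rng.normal(0, 0.05, n_hours))                          # shared RW1 trend
        site_trends = np.cumsum(rng.normal(0, 0.02, (n_sites, n_hours)), axis=1)  # per-site RW1
        site_level = rng.normal(8.5, 0.5, n_sites)[:, None]                       # mean log-PNC per site

        log_pnc = site_level + common + site_trends + rng.normal(0, 0.1, (n_sites, n_hours))
        print("simulated log(PNC) panel:", log_pnc.shape)  # (sites, hours)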

    Minimum Requirements for Detecting a Stochastic Gravitational Wave Background Using Pulsars

    We assess the detectability of a nanohertz gravitational wave (GW) background with respect to additive red and white noise in the timing of millisecond pulsars. We develop detection criteria based on the cross-correlation function summed over pulsar pairs in a pulsar timing array. The distribution of correlation amplitudes is found to be non-Gaussian and highly skewed, which significantly influences detection and false-alarm probabilities. When only white noise and GWs contribute, our detection results are consistent with those found by others. Red noise, however, drastically alters the results. We discuss methods to meet the challenge of GW detection ("climbing mount significance") by distinguishing between GW-dominated and red- or white-noise-limited regimes. We characterize detection regimes by evaluating the number of millisecond pulsars that must be monitored in a high-cadence, 5-year timing program for a GW background spectrum $h_c(f) = A f^{-2/3}$ with $A = 10^{-15}\,{\rm yr}^{-2/3}$. Unless a sample of 20 super-stable millisecond pulsars can be found (those with timing residuals from red-noise contributions $\sigma_r \lesssim 20$ ns), a much larger timing program on $\gtrsim 50$-$100$ MSPs will be needed. For other values of $A$, the constraint is $\sigma_r \lesssim 20\,{\rm ns}\,(A/10^{-15}\,{\rm yr}^{-2/3})$. Identification of suitable MSPs itself requires an aggressive survey campaign followed by characterization of the level of spin noise in the timing residuals of each object. The search and timing programs will likely require substantial fractions of time on new array telescopes in the southern hemisphere as well as on existing ones. Comment: Submitted to the Astrophysical Journal.
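
    The quoted constraint scales linearly with the background amplitude, $\sigma_r \lesssim 20\,{\rm ns}\,(A/10^{-15}\,{\rm yr}^{-2/3})$. The short helper below simply evaluates that relation for a few values of $A$; the function name is our own, introduced only for this illustration.

        # Evaluate the quoted red-noise tolerance as a function of the background amplitude A.
        def max_red_noise_ns(A, A_ref=1e-15, sigma_ref_ns=20.0):
            """Maximum tolerable red-noise residual (ns) for background amplitude A (yr^-2/3)."""
            return sigma_ref_ns * (A / A_ref)

        for A in (5e-16, 1e-15, 2e-15):
            print(f"A = {A:.0e} yr^-2/3  ->  sigma_r <~ {max_red_noise_ns(A):.0f} ns")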