
    Emergent requirements for supporting introductory programming

    The problems associated with learning and teaching first-year university Computer Science (CS1) programming classes are summarized, showing that various support tools and techniques have been developed and evaluated. From this review of applicable support, the paper derives ten requirements that a support tool should meet in order to improve the CS1 student success rate with respect to learning and understanding.

    New Zealand's Approach to Parallel Imported Goods and the Copyright Amendment Act 1998

    The prohibition against parallel importing has attracted growing international attention, especially given the global trend towards freeing up international trade. This article examines New Zealand's new statutory provisions relating to parallel importing following the Copyright Amendment Act 1998. This Act legalises parallel importing in New Zealand in given circumstances. This article critically assesses the economic arguments for parallel importing to analyse whether these support the rationale behind the amendment. It then questions whether the potential defects in the pre-amendment law were sufficiently severe to warrant the fundamental policy reversal effected by the Act, and discusses alternative avenues of reform and whether these are viable in New Zealand. Underlying the discussion is a consideration of whether the same law should apply to all types of goods, and whether New Zealand's approach should focus more rigorously on the varying characteristics of individual markets.

    Looking Deeper at the Reform of New Zealand's Roading System: Driven by Economic Rationalism or an Example of Public Choice Theory in Action

    Successive New Zealand governments have investigated reforming New Zealand's road network since 1994. The draft Roads Bill 1998 creates a system of road provision operated according to commercial principles. Roads remain publicly owned, but operational management passes from Transit and 74 territorial authorities to newly incorporated "public road companies". The reforms are examined from a "law and economics" perspective, which finds that while efficiency gains may be generated, the proposed institutional design may prevent these gains from being realised to the proposed extent. The reform proposal is then analysed using public choice theory. This analysis finds that while some elements of the proposal are consistent with legislator and bureaucrat self-interest, it contains too many politically sensitive variables to displace the Government's stated "efficiency gains" premise as the overriding motivation for reform.

    General Design Bayesian Generalized Linear Mixed Models

    Linear mixed models are able to handle an extraordinary range of complications in regression-type analyses. Their most common use is to account for within-subject correlation in longitudinal data analysis. They are also the standard vehicle for smoothing spatial count data. However, when treated in full generality, mixed models can also handle spline-type smoothing and closely approximate kriging. This allows for nonparametric regression models (e.g., additive models and varying coefficient models) to be handled within the mixed model framework. The key is to allow the random effects design matrix to have general structure; hence our label general design. For continuous response data, particularly when Gaussianity of the response is reasonably assumed, computation is now quite mature and supported by the R, SAS and S-PLUS packages. Such is not the case for binary and count responses, where generalized linear mixed models (GLMMs) are required, but are hindered by the presence of intractable multivariate integrals. Software known to us supports special cases of the GLMM (e.g., PROC NLMIXED in SAS or glmmML in R) or relies on the sometimes crude Laplace-type approximation of integrals (e.g., the SAS macro glimmix or glmmPQL in R). This paper describes the fitting of general design generalized linear mixed models. A Bayesian approach is taken and Markov chain Monte Carlo (MCMC) is used for estimation and inference. In this generalized setting, MCMC requires sampling from nonstandard distributions. In this article, we demonstrate that the MCMC package WinBUGS facilitates sound fitting of general design Bayesian generalized linear mixed models in practice. Published at http://dx.doi.org/10.1214/088342306000000015 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
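    The paper itself uses WinBUGS; as a language-agnostic illustration of the MCMC machinery it relies on, the sketch below fits a random-intercept logistic GLMM with a hand-rolled random-walk Metropolis sampler. This is a simplification of the Gibbs/Metropolis schemes real software uses, and the data, priors, and proposal scales are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a random-intercept logistic GLMM:
#   logit P(y_ij = 1) = beta0 + beta1 * x_ij + u_i,  u_i ~ N(0, sigma^2)
n_groups, n_per = 20, 30
beta_true = np.array([-0.5, 1.0])
u_true = rng.normal(0.0, 0.8, n_groups)
g = np.repeat(np.arange(n_groups), n_per)           # group index per observation
x = rng.normal(size=n_groups * n_per)
eta_true = beta_true[0] + beta_true[1] * x + u_true[g]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta_true)))

def log_post(beta, u, log_sigma):
    """Log posterior up to a constant: Bernoulli likelihood plus
    N(0, sigma^2) random effects and vague priors on beta, log(sigma)."""
    sigma = np.exp(log_sigma)
    eta = beta[0] + beta[1] * x + u[g]
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    lp_u = -0.5 * np.sum(u**2) / sigma**2 - n_groups * np.log(sigma)
    lp_beta = -0.5 * np.sum(beta**2) / 100.0        # N(0, 10^2) prior on beta
    lp_sigma = -0.5 * log_sigma**2                  # N(0, 1) prior on log(sigma)
    return loglik + lp_u + lp_beta + lp_sigma

# Joint random-walk Metropolis over (beta, u, log_sigma)
beta, u, log_sigma = np.zeros(2), np.zeros(n_groups), 0.0
cur = log_post(beta, u, log_sigma)
draws = []
for it in range(4000):
    prop_beta = beta + 0.05 * rng.normal(size=2)
    prop_u = u + 0.05 * rng.normal(size=n_groups)
    prop_ls = log_sigma + 0.05 * rng.normal()
    new = log_post(prop_beta, prop_u, prop_ls)
    if np.log(rng.uniform()) < new - cur:           # Metropolis accept/reject
        beta, u, log_sigma, cur = prop_beta, prop_u, prop_ls, new
    if it >= 2000:                                  # discard burn-in
        draws.append(beta.copy())
post_mean = np.mean(draws, axis=0)                  # posterior mean of beta
```

    In practice a naive sampler like this mixes poorly in high dimensions, which is precisely why the paper leans on WinBUGS's specialised samplers for these nonstandard full conditionals.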

    Ordinal Probit Functional Regression Models with Application to Computer-Use Behavior in Rhesus Monkeys

    Research in functional regression has made great strides in expanding to non-Gaussian functional outcomes; however, the exploration of ordinal functional outcomes remains limited. Motivated by a study of computer-use behavior in rhesus macaques (Macaca mulatta), we introduce the Ordinal Probit Functional Regression Model (OPFRM) to perform ordinal function-on-scalar regression. The OPFRM is flexibly formulated to allow for the choice of different basis functions, including penalized B-splines, wavelets, and O'Sullivan splines. We demonstrate the operating characteristics of the model in simulation using a variety of underlying covariance patterns, showing that the model performs reasonably well in estimation under multiple basis functions. We also present and compare two approaches for conducting posterior inference, showing that joint credible intervals tend to outperform point-wise credible intervals. Finally, in application, we determine demographic factors associated with the monkeys' computer use over the course of a year and provide a brief analysis of the findings.
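    The ordinal probit link at the heart of such models maps a latent linear predictor to category probabilities via successive differences of the normal CDF at a set of cutpoints. A minimal stdlib-only sketch (the cutpoints and predictor values below are illustrative, not from the paper):

```python
import math

def ordinal_probit_probs(eta, cutpoints):
    """Category probabilities under a cumulative probit link:
    P(Y <= k | eta) = Phi(c_k - eta); category probabilities are
    successive differences of these cumulative probabilities."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF
    c = [-math.inf] + list(cutpoints) + [math.inf]              # augmented cutpoints
    cdf = [phi(ck - eta) for ck in c]
    return [cdf[k + 1] - cdf[k] for k in range(len(cdf) - 1)]

# With eta = 0 and symmetric cutpoints the three categories split
# roughly 0.309 / 0.383 / 0.309; larger eta shifts mass upward.
probs = ordinal_probit_probs(0.0, [-0.5, 0.5])
```

    In the functional setting, eta itself becomes a function of time built from the basis expansion (B-splines, wavelets, or O'Sullivan splines), but the link computation is unchanged pointwise.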

    Measurement error caused by spatial misalignment in environmental epidemiology

    Copyright © 2009 Gryparis et al., published by Oxford University Press. In many environmental epidemiology studies, the locations and/or times of exposure measurements and health assessments do not match. In such settings, health effects analyses often use the predictions from an exposure model as a covariate in a regression model. Such exposure predictions contain some measurement error as the predicted values do not equal the true exposures. We provide a framework for spatial measurement error modeling, showing that smoothing induces a Berkson-type measurement error with nondiagonal error structure. From this viewpoint, we review the existing approaches to estimation in a linear regression health model, including direct use of the spatial predictions and exposure simulation, and explore some modified approaches, including Bayesian models and out-of-sample regression calibration, motivated by measurement error principles. We then extend this work to the generalized linear model framework for health outcomes. Based on analytical considerations and simulation results, we compare the performance of all these approaches under several spatial models for exposure. Our comparisons underscore several important points. First, exposure simulation can perform very poorly under certain realistic scenarios. Second, the relative performance of the different methods depends on the nature of the underlying exposure surface. Third, traditional measurement error concepts can help to explain the relative practical performance of the different methods. We apply the methods to data on the association between levels of particulate matter and birth weight in the greater Boston area. This research was supported by NIEHS grants ES012044 (AG, BAC), ES009825 (JS, BAC), ES007142 (CJP) and ES000002 (CJP), and EPA grant R-832416 (JS, BAC).
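    The Berkson-versus-classical distinction the authors exploit can be shown with a minimal simulation (all variances and the effect size below are hypothetical): classical error in a regression covariate attenuates the estimated slope toward zero, while Berkson-type error, where the true exposure scatters around the assigned or predicted value, leaves the slope unbiased.

```python
import numpy as np

rng = np.random.default_rng(1)
n, beta = 200_000, 2.0

def ols_slope(x, y):
    """Ordinary least squares slope of y on x."""
    xc = x - x.mean()
    return float(xc @ (y - y.mean()) / (xc @ xc))

# Classical error: we observe W = X + U instead of the true exposure X.
# The slope is attenuated by var(X) / (var(X) + var(U)) = 1/2 here.
x_true = rng.normal(0.0, 1.0, n)
y = beta * x_true + rng.normal(0.0, 1.0, n)
w_classical = x_true + rng.normal(0.0, 1.0, n)
slope_classical = ols_slope(w_classical, y)     # ~ beta / 2 = 1.0

# Berkson error: the truth scatters around the assigned value Z,
# X = Z + U, as with smoothed/predicted exposures. Slope stays unbiased.
z = rng.normal(0.0, 1.0, n)
x_berkson = z + rng.normal(0.0, 1.0, n)
y_b = beta * x_berkson + rng.normal(0.0, 1.0, n)
slope_berkson = ols_slope(z, y_b)               # ~ beta = 2.0
```

    The paper's point is subtler than this toy: spatial smoothing induces Berkson-type error with a nondiagonal covariance, so the clean unbiasedness above does not carry over automatically.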

    Targeted Derepression of the Human Immunodeficiency Virus Type 1 Long Terminal Repeat by Pyrrole-Imidazole Polyamides

    The host factor LSF represses the human immunodeficiency virus type 1 long terminal repeat (LTR) by mediating recruitment of histone deacetylase. We show that pyrrole-imidazole polyamides targeted to the LTR can specifically block LSF binding both in vitro and within cells via direct access to chromatin, resulting in increased LTR expression.

    Five hole probe errors caused by fluctuating incidence

    Steady multi-hole pressure probes are used extensively in turbomachinery research. While various sources of error are known, this paper demonstrates that fluctuations in probe incidence can be particularly damaging for accuracy. A simple, quasi-steady model of five-hole-probe response explains why angle fluctuations can cause large errors in the indicated total and static pressure. The model explains why measurements in a shedding wake over-estimated loss by 40%. Simulated traverses behind rotors show similar behavior: fluctuating incidence causes efficiency to be under-estimated by over 1% in some cases. The model can correct five-hole-probe errors using an estimate of unsteady flow angles. This approach reduces errors by an order of magnitude and can be used to post-correct existing test data.
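    The core quasi-steady argument, that a nonlinear angle calibration means time-averaging does not commute with the calibration map, can be reproduced with a toy model. The quadratic sensitivity k and the fluctuation amplitude below are hypothetical, not the paper's calibration.

```python
import math

# Hypothetical calibration: suppose the probe's indicated total-pressure
# coefficient falls off quadratically with incidence alpha (radians),
#   Cp(alpha) = 1 - k * alpha**2
k = 2.0  # hypothetical calibration sensitivity, rad^-2

def cp(alpha):
    return 1.0 - k * alpha**2

# A steady flow at the mean angle reads the correct value ...
cp_steady = cp(0.0)  # = 1.0

# ... but under sinusoidal fluctuations alpha(t) = A*sin(wt), the
# time-averaged quasi-steady reading picks up a bias of -k * A**2 / 2,
# because the mean of a nonlinear function is not the function of the mean.
A = 0.2        # fluctuation amplitude, ~11.5 degrees
n = 10_000     # samples over one full period
cp_avg = sum(cp(A * math.sin(2 * math.pi * i / n)) for i in range(n)) / n

bias = cp_avg - cp_steady   # ~ -k * A**2 / 2 = -0.04
```

    Running the calibration map in reverse with an estimate of the unsteady angles removes this bias, which is the essence of the post-correction the paper proposes.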