
    Experiences with the Greenstone digital library software for international development

    Greenstone is a versatile open source multilingual digital library environment, which emerged from research on text compression within the New Zealand Digital Library Research Project in the Department of Computer Science at the University of Waikato. In 1997 we began to work with Human Info NGO to help them produce fully-searchable CD-ROM collections of humanitarian information. The software has since evolved to support a variety of application contexts. Rather than treating the software simply as a delivery mechanism, we have emphasised empowering users to create and distribute their own digital collections.

    Effect of prestressed fibers upon the response of composite materials

    Prestressing materials in order to improve structural characteristics is a common engineering practice. Probably the most evident case is the use of prestressed concrete. This class of material is utilized in situations where the structure is loaded in tension. The prestress is obtained by using steel wires which are loaded in tension prior to the curing of the concrete. When the load is released, the brittle concrete is compressed, allowing for the superposition of externally applied tension. An analog to prestressed concrete has been developed by the author for use with advanced composite materials. However, the goals of this new method of composite fabrication are different from those for concrete. The difference in thermal expansion coefficients of the matrix and fibers, as well as a large change in temperature following cure, results in three-dimensional residual stresses. Applying an external load to the fibers during the cure cycle is seen as a means of both mitigating these stresses and prescribing a greater degree of fiber linearity within the composite. The effect of applying stresses to the fibers prior to consolidation is determined through both mathematical and experimental techniques. A boundary value problem is posed utilizing an elasticity method based on a concentric cylinder model. This model allows for the prediction of the stress/strain state at any point away from the ends of the laminate. The results obtained from the boundary value problem are used with classical laminated plate theory in order to determine the ply stresses as a function of fiber prestress levels. The experimental procedure includes both the fabrication and mechanical testing of prestressed laminates as well as comparison to data obtained from conventionally processed composites.
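The final analysis step the abstract describes, using classical laminated plate theory to go from an in-plane load to per-ply stresses, can be sketched as follows. This is an illustrative reconstruction, not the author's code: the carbon/epoxy ply properties, the [0/90]s layup, and the applied load are all assumed values.

```python
import numpy as np

# Illustrative unidirectional carbon/epoxy ply properties (assumed, not from the paper)
E1, E2, G12, nu12 = 140e9, 10e9, 5e9, 0.3  # Pa
nu21 = nu12 * E2 / E1
d = 1 - nu12 * nu21

# Reduced stiffness matrix Q for a ply in its material axes
Q = np.array([[E1 / d, nu12 * E2 / d, 0],
              [nu12 * E2 / d, E2 / d, 0],
              [0, 0, G12]])

def Qbar(theta_deg):
    """Transform Q to laminate axes for a ply at angle theta (Qbar = T^-1 Q R T R^-1)."""
    t = np.radians(theta_deg)
    c, s = np.cos(t), np.sin(t)
    T = np.array([[c * c, s * s, 2 * c * s],
                  [s * s, c * c, -2 * c * s],
                  [-c * s, c * s, c * c - s * s]])
    R = np.diag([1, 1, 2])  # Reuter matrix for engineering shear strain
    return np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)

# Symmetric cross-ply laminate [0/90]s, ply thickness 0.125 mm
layup = [0, 90, 90, 0]
h = 0.125e-3
A = sum(Qbar(th) * h for th in layup)  # in-plane (extensional) stiffness matrix

# Mid-plane strains under a uniaxial line load Nx (N/m), then stresses in each ply
N = np.array([1000.0, 0.0, 0.0])
eps0 = np.linalg.solve(A, N)
ply_stresses = {th: Qbar(th) @ eps0 for th in set(layup)}
```

Under the axial load, the stiff 0-degree plies carry most of the axial stress; a prestress analysis would superpose the residual stress state from the boundary value problem onto these ply stresses.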

    A novel function for the Caenorhabditis elegans torsin OOC-5 in nucleoporin localization and nuclear import.

    Torsin proteins are AAA+ ATPases that localize to the endoplasmic reticular/nuclear envelope (ER/NE) lumen. A mutation that markedly impairs torsinA function causes the CNS disorder DYT1 dystonia. Abnormalities of NE membranes have been linked to torsinA loss of function and the pathogenesis of DYT1 dystonia, leading us to investigate the role of the Caenorhabditis elegans torsinA homologue OOC-5 at the NE. We report a novel role for torsin in nuclear pore biology. In ooc-5-mutant germ cell nuclei, nucleoporins (Nups) were mislocalized in large plaques beginning at meiotic entry, a defect that persisted throughout meiosis. Moreover, the KASH protein ZYG-12 was mislocalized in ooc-5 gonads. Nups were mislocalized in adult intestinal nuclei and in embryos from mutant mothers. EM analysis revealed vesicle-like structures in the perinuclear space of intestinal and germ cell nuclei, similar to defects reported in torsin-mutant flies and mice. Consistent with a functional disruption of Nups, ooc-5-mutant embryos displayed impaired nuclear import kinetics, although the nuclear pore-size exclusion barrier was maintained. Our data are the first to demonstrate a requirement for a torsin for normal Nup localization and function and suggest that these functions are likely conserved.

    Combining RP and SP data: Biases in using the nested logit ‘trick’ – contrasts with flexible mixed logit incorporating panel and scale effects

    It is now standard practice, when jointly estimating choice models from stated preference (SP) and revealed preference (RP) data, to adjust for scale so that parameter estimates across data sets are not confounded by differences in scale. The nested logit ‘trick’ presented in Hensher and Bradley (1993) continues to be widely used, especially by practitioners, to accommodate scale differences. This modelling strategy assumes independent observations, a condition of all GEV models that is not strictly valid within a stated preference experiment with repeated choice sets, nor between each SP observation and the single RP data point. This paper promotes the replacement of the NL ‘trick’ method with an error components model that can accommodate correlated observations as well as reveal the relevant scale parameter for subsets of alternatives. Such a model can also incorporate “state” or reference dependence between data types and preference heterogeneity on observed attributes. An example illustrates the difference in empirical evidence.

    The Implications for Willingness to Pay of Respondents Ignoring Specific Attributes

    Individuals processing the information in a stated choice experiment are typically assumed to evaluate each and every attribute offered within and between alternatives, and to choose their most preferred alternative. However, it has long been thought that some attributes are ignored in this process for many reasons, including as a coping strategy to handle one’s perception of the complexity of the choice task. Nonetheless, analysts typically proceed to estimate discrete choice models as if all attributes have influenced the outcome to some degree. The cognitive processes used to evaluate trade-offs are complex, with boundaries often placed on the task to assist the respondent. These boundaries can include prioritising attributes and ignoring specific attributes. In this paper we investigate the implications of bounding the information processing task by attribute elimination, i.e. ignoring one or more attributes. Using a sample of car commuters in Sydney, we estimate mixed logit models that assume all attributes are candidate contributors, and models that assume certain attributes are ignored, the latter based on supplementary information provided by respondents. We compare the value of travel time savings under the alternative attribute processing regimes. Assuming that no attributes are ignored and that all are duly processed leads to parameter estimates that produce significantly different willingness to pay (WTP) from that obtained when the exclusion rule is invoked.
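The mechanism the abstract describes can be illustrated with a deliberately simple binary logit on simulated data: one model naively assumes every respondent attends to travel time, while the other zeroes the time coefficient for respondents who reported ignoring it. Everything here is hypothetical, including the coefficient values; the paper's actual analysis uses mixed logit models on real Sydney commuter data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N, true_bt, true_bc = 2000, -0.08, -0.04    # assumed time ($/min) and cost coefficients
ignores_time = rng.random(N) < 0.5          # self-reported "ignored time" flag

# One binary choice per respondent: attribute differences (alt2 - alt1)
dt = rng.normal(0, 10, N)                   # travel-time difference (minutes)
dc = rng.normal(0, 20, N)                   # cost difference ($)
bt_n = np.where(ignores_time, 0.0, true_bt) # ignorers have zero time sensitivity
v = bt_n * dt + true_bc * dc
choice2 = rng.random(N) < 1 / (1 + np.exp(-v))  # True if alt2 chosen

def negll(beta, mask_time):
    """Negative log-likelihood; mask_time zeroes the time attribute per respondent."""
    bt, bc = beta
    u = bt * dt * mask_time + bc * dc
    p2 = 1 / (1 + np.exp(-u))
    return -np.sum(np.where(choice2, np.log(p2), np.log(1 - p2)))

naive = minimize(negll, [0.0, 0.0], args=(np.ones(N),)).x    # all attributes attended
ana = minimize(negll, [0.0, 0.0], args=(~ignores_time,)).x   # exclusion rule invoked

vtts_naive = naive[0] / naive[1] * 60   # implied value of travel time savings, $/hour
vtts_ana = ana[0] / ana[1] * 60         # $/hour among respondents who attended to time
```

The naive model's attenuated time coefficient drags the implied value of travel time savings well below the value recovered once the self-reported exclusion rule is applied, mirroring the significant WTP differences the paper reports.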

    Using Classical Inference Methods to reveal individual-specific parameter estimates to avoid the potential complexities of WTP derived from population moments

    Recent work has compared classical inference estimation methods for logit models with Bayesian methods and suggested that the latter are more appealing on grounds of relative simplicity in estimation and in producing individual observation parameter estimates instead of population distributions. It is argued that one particularly appealing feature of the Bayesian approach is the ability to derive individual-specific willingness to pay measures that are claimed to be less problematic than the classical approaches in terms of extreme values and signs. This paper takes a close look at this claim by deriving both population-derived WTP measures and individual-specific values based on the classical ‘mixed logit’ model. We show that the population approach may undervalue the willingness to pay substantially; however, individual parameters derived using conditional distributions can be obtained from classical inference methods, offering the same posterior information associated with the Bayesian view. The technique is no more difficult to apply than the Bayesian approach – indeed the individual-specific estimates are a by-product of the parameter estimation process. Our results suggest that while extreme values and unexpected signs cannot be ruled out (nor can they in the Bayesian framework), the overall superiority of the Bayesian method is overstated.
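The individual-specific estimates discussed above come from conditioning the population distribution on a respondent's observed choice sequence: E[β | choices] is a likelihood-weighted average over draws from the estimated population distribution. A minimal sketch with a single normally distributed cost coefficient and binary choices follows; all distributional values and the two "respondents" are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = -0.5, 0.2          # assumed estimated population distribution of beta
x = rng.normal(0, 2, 30)       # attribute differences across 30 choice tasks

def simulate_choices(beta, task_rng):
    """Binary logit choices for a respondent with a given true coefficient."""
    return task_rng.random(x.size) < 1 / (1 + np.exp(-beta * x))

def conditional_mean(y, ndraws=50000):
    """E[beta | observed choices]: likelihood-weighted mean over population draws."""
    draws = rng.normal(mu, sigma, ndraws)
    p = 1 / (1 + np.exp(-np.outer(draws, x)))          # choice probs per draw, per task
    L = np.prod(np.where(y, p, 1 - p), axis=1)         # likelihood of the sequence
    return np.sum(draws * L) / np.sum(L)

# Two hypothetical respondents with weak and strong true sensitivities
b_low = conditional_mean(simulate_choices(-0.2, np.random.default_rng(2)))
b_high = conditional_mean(simulate_choices(-0.9, np.random.default_rng(3)))
```

Each conditional mean shrinks the respondent's own evidence toward the population mean, which is why these individual-specific values, obtainable entirely within classical estimation, carry the same posterior information as their Bayesian counterparts.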

    Recovering costs through price and service differentiation: Accounting for exogenous information on attribute processing strategies in airline choice

    The entry of low cost airlines has thrown out a challenge to all airlines to find ways of attracting passengers, through a mix of fare discounting, greater frequency, improved flight times and no-frills levels of on-board service. All of these competitive strategies have an impact on cost recovery. As airlines seek business in an increasingly heterogeneous passenger market, a greater understanding of what matters to potential passengers in choosing an airline grows in importance. Which attributes really do matter to specific classes of passengers? Traditional studies of passenger airline choice assume that all attributes matter, but some to a lesser extent. What happens to the empirical evidence on willingness to pay when specific attributes are totally ignored by particular passengers? In this paper, we examine the impact of individual-specific attribute processing strategies (APS) for the inclusion/exclusion of attributes on the parameter estimates and behavioural outputs of models of airline service and fare level choice. Current modelling practice assumes that whilst respondents may exhibit preference heterogeneity, they employ a homogeneous APS with regard to how they process the presence/absence of attributes in stated choice (SC) experiments. We demonstrate how information collected exogenously to the SC experiment, on whether respondents either ignored or considered each attribute of the SC task, may be used in the estimation process, and how such information may be used to provide outputs that are APS segment specific. Accounting for the inclusion/exclusion of attributes has important implications for the willingness to pay for varying levels of service.

    Soluble tau species, not neurofibrillary aggregates, disrupt neural system integration in a tau transgenic model

    Neurofibrillary tangles are a feature of Alzheimer disease and other tauopathies, and while they are generally believed to be markers of neuronal pathology, there is little evidence evaluating whether tangles directly impact neuronal function. To investigate the response of cells in hippocampal circuits to complex behavioral stimuli, we used an environmental enrichment paradigm to induce expression of an immediate-early gene, Arc, in the rTg4510 mouse model of tauopathy. These mice reversibly overexpress P301L tau and exhibit substantial neurofibrillary tangle deposition, neuronal loss, and memory deficits. Employing fluorescent in situ hybridization to detect Arc mRNA, we found that rTg4510 mice have impaired hippocampal Arc expression both without stimulation and in response to environmental enrichment; this likely reflects the combination of functional impairments of existing neurons and loss of neurons. However, tangle-bearing cells were at least as likely as non-tangle-bearing neurons to exhibit Arc expression in response to enrichment. Transgene suppression with doxycycline for 6 weeks resulted in increased percentages of Arc-positive cells in rTg4510 brains compared to untreated transgenics, restoring enrichment-induced Arc mRNA levels to those of wild-type controls despite the continued presence of neurofibrillary pathology. We interpret these data to indicate that soluble tau contributes to impairment of hippocampal function, while tangles do not preclude neurons from responding in a functional circuit.

    ONTOGENY OF B LYMPHOCYTES: III. H-2 LINKAGE OF A GENE CONTROLLING THE RATE OF APPEARANCE OF COMPLEMENT RECEPTOR LYMPHOCYTES

    The frequency of lymphocytes bearing complement receptors in the spleens of 2-wk-old mice appears to be controlled by two independent genes. The presence of a "high" allele at either locus leads to an intermediate or high frequency of CRL at 2 wk of age. One of the genes controlling complement receptor lymphocyte (CRL) frequency (CRL-1) is linked to the H-2 complex. Thus, in progeny of (AKR x DBA/2)F1 x DBA/2, all mice with a low frequency of CRL at 2 wk of age are homozygous for the H-2 type of the low CRL parent (DBA/2). Furthermore, in the B10 series of congenic mice, CRL frequency at 2 wk of age is similar to the frequency in the donor of the H-2 region. Thus, C57BL/10, B10.BR, and B10.D2 mice are all of the low CRL type while B10.A mice are intermediate in CRL frequency at 2 wk. C57BR and DBA/2, the donors of the H-2 complex of the B10.BR and B10.D2, respectively, are of low CRL type while the A/WySn, the donor of the H-2 complex in the B10.A, is an intermediate CRL strain. Similarly, in the A/WySn series of congenic mice, A.CA, A.SW, and A.BY are all low CRL strains while the A/WySn is intermediate. Studies of CRL frequency in mice with recombinant H-2 chromosomes (B10.A(2R), (4R), and (5R); B6/TL+; and A/TL-) indicate that CRL-1 is to the right of the Ss-Slp genes and to the left of Tla.

    Multiscale Modeling of Astrophysical Jets

    We are developing the capability for a multi-scale code to model the energy deposition rate and momentum transfer rate of an astrophysical jet which generates strong plasma turbulence in its interaction with the ambient medium through which it propagates. We start with a highly parallelized version of the VH-1 Hydrodynamics Code (Colella and Woodward 1984, and Saxton et al., 2005). We are also considering the PLUTO code (Mignone et al. 2007) to model the jet in the magnetohydrodynamic (MHD) and relativistic magnetohydrodynamic (RMHD) regimes. Particle-in-Cell approaches are also being used to benchmark wave-population models of the two-stream instability and associated plasma processes in order to determine energy deposition and momentum transfer rates for these modes of jet-ambient medium interactions. We show some elements of the modeling of these jets in this paper, including energy loss and heating via plasma processes, and large scale hydrodynamic and relativistic hydrodynamic simulations. A preliminary simulation of a jet from the galactic center region is used to lend credence to the jet as the source of the so-called Fermi Bubbles (see, e.g., Su, M. & Finkbeiner, D. P., 2012).
    *It is with great sorrow that we acknowledge the loss of our colleague and friend of more than thirty years, Dr. John Ural Guillory, to his battle with cancer.
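The two-stream benchmarking mentioned above starts from the cold beam-plasma dispersion relation, 1 = ωp²/ω² + ωb²/(ω − ku)², whose fastest-growing root's imaginary part gives the linear growth rate. A minimal numerical sketch follows; the parameters are illustrative, not those of the simulations described in the paper.

```python
import numpy as np

# Cold beam-plasma (two-stream) parameters, in units where wp = 1
wp, wb, u = 1.0, 0.1, 1.0   # plasma frequency, beam plasma frequency, beam drift speed

def growth_rate(k):
    """Max growth rate at wavenumber k from the cold two-stream dispersion relation.

    1 = wp^2/w^2 + wb^2/(w - k*u)^2 is multiplied through to the quartic
    w^2 (w-a)^2 - wp^2 (w-a)^2 - wb^2 w^2 = 0 with a = k*u, then solved exactly.
    """
    a = k * u
    coeffs = [1.0,
              -2 * a,
              a**2 - wp**2 - wb**2,
              2 * a * wp**2,
              -wp**2 * a**2]
    return max(np.roots(coeffs).imag)

# Scan wavenumbers around the resonance k*u ~ wp, where the instability lives
ks = np.linspace(0.5, 1.5, 201)
gmax = max(growth_rate(k) for k in ks)
```

For a weak beam the peak growth rate scales as (nb/np)^(1/3) ωp near k·u ≈ ωp, which is the quantity a wave-population model needs in order to estimate energy deposition and momentum transfer to the ambient medium.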