154 research outputs found

    Student Internship in Information Systems: Creating Opportunities and Solutions

    Get PDF
    A common problem in academia is how to use technology effectively to support faculty use of information technology both in and out of the classroom. Administrators have tried four strategies to support technology: rely on a computer user services department, hire a technology specialist in a staff position, require one or more faculty members to be responsible for the technology, or use students in a combined learning/service role. This paper describes our use of Information Systems students as interns. We discuss how our internship program has both solved technology problems for faculty and staff and created opportunities for students. We describe the implementation and management of the student internship program. We conclude the paper with a description of a typical project: the installation of a network throughout the college.

    New Psychological Paradigm for Conditionals and General de Finetti Tables

    Get PDF
    The new Bayesian paradigm in the psychology of reasoning aims to integrate the study of human reasoning, decision making, and rationality. It is supported by two findings. First, most people judge the probability of the indicative conditional, P(if A then B), to be the conditional probability, P(B|A), as implied by the Ramsey test. Second, they judge "if A then B" to be void when A is false. Their three-valued response table used to be called 'defective', but should be termed the de Finetti table. We show how to study general de Finetti truth tables for negations, conjunctions, disjunctions, and conditionals.
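    The three-valued table the abstract describes can be sketched in a few lines (a minimal illustration of the de Finetti table, not code from the paper): the conditional "if A then B" is true or false only when its antecedent A is true, and void when A is false.

```python
# De Finetti truth table for the indicative conditional "if A then B":
# true/false when the antecedent A holds, void when it does not.

TRUE, FALSE, VOID = "true", "false", "void"

def de_finetti_conditional(a: bool, b: bool) -> str:
    """Truth value of 'if A then B' on the de Finetti table."""
    if not a:
        return VOID            # false antecedent: the conditional is void
    return TRUE if b else FALSE

# The full 2x2 table, keyed by the truth values of (A, B):
table = {(a, b): de_finetti_conditional(a, b)
         for a in (True, False) for b in (True, False)}
```

    Both rows with a false antecedent come out void, which is exactly the response pattern that used to be labelled 'defective'.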

    What to Teach in an Information Systems Curriculum: Depends on Who You Ask!

    Get PDF
    Alumni surveys provide a useful way to evaluate the effectiveness of current programs and to provide direction for program modifications. A recent survey of alumni of a Pacific Northwest university's undergraduate IS program reveals that they rate communication skills and data communication, networking, and client/server skills highly. "Group Projects with Real Clients" was the highest-rated topic among the alumni respondents. Further analysis of the data shows that responses are strongly tied to the respondent's current IS position and focus. As schools evaluate their programs via alumni surveys, they should be aware of this influence.

    New normative standards of conditional reasoning and the dual-source model

    Get PDF
    There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, where inference proceeds from uncertain beliefs rather than from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that, for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional; neither the probability of the conjunction nor the probability of the material conditional does. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences, and to a lesser degree for DA inferences, did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) with their subjective probability that a conclusion is warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task.
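    As an illustration of the coherence intervals mentioned above (a sketch assuming the standard mental-probability-logic bound for modus ponens, not the paper's own materials): given the premise probabilities P(A) and P(B|A), every coherent value of the conclusion probability P(B) lies in the interval [P(A)·P(B|A), P(A)·P(B|A) + 1 − P(A)].

```python
# Coherence interval for modus ponens (MP): from P(A) and P(B|A),
# bound the set of probabilistically coherent values of P(B).

def mp_coherence_interval(p_a: float, p_b_given_a: float) -> tuple:
    """Lower and upper coherent bounds on P(B) for an MP inference."""
    lower = p_a * p_b_given_a          # B must hold at least when A and B co-occur
    upper = lower + (1.0 - p_a)        # plus, at most, all the probability of not-A
    return lower, upper

def is_coherent_mp(p_a: float, p_b_given_a: float, p_b: float) -> bool:
    """Check whether a response P(B) falls inside the MP coherence interval."""
    lower, upper = mp_coherence_interval(p_a, p_b_given_a)
    return lower <= p_b <= upper
```

    For example, with P(A) = 0.8 and P(B|A) = 0.9, the coherent range for P(B) is [0.72, 0.92]; a response of 0.5 would fall outside it. Counting responses inside versus outside such intervals is the kind of chance-level comparison the abstract refers to.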

    The TREC2001 video track: information retrieval on digital video information

    Get PDF
    The development of techniques to support content-based access to archives of digital video information has recently started to receive much attention from the research community. During 2001, the annual TREC activity, which has been benchmarking the performance of information retrieval techniques on a range of media for 10 years, included a "track" or activity which allowed investigation into approaches to support searching through a video library. This paper is not intended to provide a comprehensive picture of the different approaches taken by the TREC2001 video track participants; instead we give an overview of the TREC video search task and a thumbnail sketch of the approaches taken by different groups. The reason for writing this paper is to highlight the message from the TREC video track that there are now a variety of approaches available for searching and browsing through digital video archives, that these approaches do work, are scalable to larger archives, and can yield useful retrieval performance for users. This has important implications in making digital libraries of video information attainable.

    TRECVID 2014 -- An Overview of the Goals, Tasks, Data, Evaluation Mechanisms and Metrics

    No full text
    The TREC Video Retrieval Evaluation (TRECVID) 2014 was a TREC-style video analysis and retrieval evaluation, the goal of which remains to promote progress in content-based exploitation of digital video via open, metrics-based evaluation. Over the last dozen years this effort has yielded a better understanding of how systems can effectively accomplish such processing and how one can reliably benchmark their performance. TRECVID is funded by NIST with support from other US government agencies. Many organizations and individuals worldwide contribute significant time and effort.

    The role of sea-level change and marine anoxia in the Frasnian-Famennian (Late Devonian) mass extinction

    Get PDF
    Johnson et al. (Johnson, J.G., Klapper, G., Sandberg, C.A., 1985. Devonian eustatic fluctuations in Euramerica. Geological Society of America Bulletin 96, 567–587) proposed one of the first explicit links between marine anoxia, transgression and mass extinction for the Frasnian–Famennian (F–F, Late Devonian) mass extinction. This cause-and-effect nexus has been accepted by many, but others prefer sea-level fall and cooling as an extinction mechanism. New facies analysis of sections in the USA and Europe (France, Germany, Poland), and comparison with sections known from the literature in Canada, Australia and China, reveal several high-frequency relative sea-level changes in the late Frasnian to earliest Famennian extinction interval. A clear signal of major transgression is seen within the Early rhenana Zone (e.g. drowning of the carbonate platform in the western United States). This is the base of transgressive–regressive Cycle IId of the Johnson et al. (1985) eustatic curve. It was curtailed by regression and sequence boundary generation within the early linguiformis Zone, recorded by hardground and karstification surfaces in sections from Canada to Australia. This major eustatic fall probably terminated platform carbonate deposition over wide areas, especially in western North America. The subsequent transgression in the later linguiformis Zone, recorded by the widespread development of organic-rich shale facies, is also significant because it is associated with the expansion of anoxic deposition, known as the Upper Kellwasser Event. Johnson et al.'s (1985) original transgression–anoxia–extinction link is thus supported, although some extinction losses of platform carbonate biota during the preceding regression cannot be ruled out. Conodont faunas suffered major losses during the Upper Kellwasser Event, with deep-water taxa notably affected. This renders unreliable any eustatic analyses utilising changes in conodont biofacies. Claims for a latest Frasnian regression are not supported, and probably reflect poor biostratigraphic dating of the early linguiformis Zone sequence boundary.

    ScotGrid: Providing an Effective Distributed Tier-2 in the LHC Era

    Get PDF
    ScotGrid is a distributed Tier-2 centre in the UK with sites in Durham, Edinburgh and Glasgow. ScotGrid has undergone a huge expansion in hardware in anticipation of the LHC and now provides more than 4 MSI2K and 500 TB to the LHC VOs. Scaling up to this level of provision has brought many challenges to the Tier-2, and we show in this paper how we have adopted new methods of organising the centres, from fabric management and monitoring to remote management of sites to management and operational procedures, to meet these challenges. We describe how we have coped with different operational models at the sites, where the Glasgow and Durham sites are managed "in house" but resources at Edinburgh are managed as a central university resource. This required the adoption of a different fabric management model at Edinburgh and a special engagement with the cluster managers. Challenges arose from the different job models of local and grid submission that required special attention to resolve. We show how ScotGrid has successfully provided an infrastructure for ATLAS and LHCb Monte Carlo production. Special attention has been paid to ensuring that user analysis functions efficiently, which has required optimisation of local storage and networking to cope with the demands of user analysis. Finally, although these Tier-2 resources are pledged to the whole VO, we have established close links with our local physics user communities as the best way to ensure that the Tier-2 functions effectively as part of the LHC grid computing framework. Comment: Preprint for the 17th International Conference on Computing in High Energy and Nuclear Physics, 7 pages, 1 figure.