
    Bias Correction in the Dynamic Panel Data Model with a Nonscalar Disturbance Covariance Matrix

    By using asymptotic expansion techniques, approximation formulae are developed for the bias of ordinary and generalized Least Squares Dummy Variable (LSDV) estimators in dynamic panel data models. Earlier results on bias approximation in first-order stable dynamic panel data models are extended to higher-order dynamic models with a general disturbance covariance structure. The focus is on estimation of both short- and long-run coefficients. The results show that proper modelling of the disturbance covariance structure is indispensable. The bias approximations are used to construct bias-corrected estimators, which are then applied to quarterly data from 14 European Union countries. Money demand functions for M1, M2 and M3 are estimated for the EU area as a whole for the period 1991:I-1995:IV. The empirical results show that, in general, plausible long-run effects are obtained by the bias-corrected estimators. Moreover, the bias correction can be substantial, underlining the importance of more refined estimation techniques. The efficiency gains from exploiting the heteroscedasticity and cross-correlation patterns between countries are also considerable.
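
    For orientation (our addition for context, not the paper's higher-order formula): the first-order benchmark that such bias expansions generalize is Nickell's (1981) fixed-T result for the stable AR(1) panel model with fixed effects, whose leading-term approximation is

        % Stable first-order dynamic panel data model with fixed effects:
        y_{it} = \gamma\, y_{i,t-1} + \eta_i + \varepsilon_{it}, \qquad |\gamma| < 1
        % Leading-term approximation to the LSDV bias as N -> infinity:
        \operatorname*{plim}_{N \to \infty} \bigl( \hat{\gamma}_{\mathrm{LSDV}} - \gamma \bigr) \;\approx\; -\,\frac{1 + \gamma}{T - 1}

    The bias vanishes only at rate 1/T, which is why corrections of this kind matter for short panels such as the quarterly EU sample used here.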

    Complementary feeding with fortified spread and incidence of severe stunting in 6- to 18-month-old rural Malawians.

    OBJECTIVE: To compare growth and incidence of malnutrition in infants receiving long-term dietary supplementation with ready-to-use fortified spread (FS) or micronutrient-fortified maize-soy flour (likuni phala [LP]). DESIGN: Randomized, controlled, single-blind trial. SETTING: Rural Malawi. PARTICIPANTS: A total of 182 six-month-old infants. INTERVENTION: Participants were randomized to receive 1 year of daily supplementation with 71 g of LP (282 kcal), 50 g of FS (FS50) (256 kcal), or 25 g of FS (FS25) (130 [corrected] kcal). OUTCOME MEASURES: Weight and length gains and the incidences of severe stunting, underweight, and wasting. RESULTS: Mean weight and length gains in the LP, FS50, and FS25 groups were 2.37, 2.47, and 2.37 kg (P = .66) and 12.7, 13.5, and 13.2 cm (P = .23), respectively. In the same groups, the cumulative 12-month incidence of severe stunting was 13.3%, 0.0%, and 3.5% (P = .01), of severe underweight was 15.0%, 22.5%, and 16.9% (P = .71), and of severe wasting was 1.8%, 1.9%, and 1.8% (P > .99). Compared with LP-supplemented infants, those given FS50 gained a mean of 100 g more weight and 0.8 cm more length. There was a significant interaction between baseline length and intervention (P = .04); in children with below-median length at enrollment, those given FS50 gained a mean of 1.9 cm more than individuals receiving LP. CONCLUSION: One-year-long complementary feeding with FS does not have a significantly larger effect than LP on mean weight gain in all infants, but it is likely to boost linear growth in the most disadvantaged individuals and, hence, decrease the incidence of severe stunting.

    Global Diffusion of the Internet XV: Web 2.0 Technologies, Principles, and Applications: A Conceptual Framework from Technology Push and Demand Pull Perspective

    Web 2.0, the current Internet evolution, can be described by several key features of an expanded Web that is more interactive; allows easy social interactions through participation and collaboration from a variety of human sectors; responds more immediately to users' queries and needs; is easier to search; and provides a faster, smoother, more realistic and engaging search capability, often with automatic updates to users. The purpose of this study is three-fold. First, the primary goal is to propose a conceptual Web 2.0 framework that provides a better understanding of the Web 2.0 concept by classifying its current key components in a holistic manner. Second, using several selected key components from the conceptual framework, this study conducts case analyses of Web 2.0 applications to discuss how they have adopted the selected key features (i.e., participation, collaboration, rich user experience, social networking, semantics, and interactivity/responsiveness) of the conceptual Web 2.0 framework. Finally, the study provides an insightful discussion of some challenges and opportunities presented by Web 2.0 to education, business, and social life.

    From Expectation to Actual Perception after Experience: A Longitudinal Study of the Perceptions of Student Response Systems

    Interactive student response systems (SRSs) are becoming popular as many instructors at tertiary-level education institutions adopt the systems to transform traditional passive lectures into interactive classes. Despite the popularity and numerous benefits of SRSs, there is conflicting evidence regarding the levels of perception and actual performance before and after use of the systems. We believe the inconsistent results stem from the differences between the level of expectation and the level of actual perception after students use the systems. Students' beliefs and attitudes are key perceptions toward information technology (i.e., SRS) usage. However, these perceptions may change over time as students gain direct experience with the technology. In this study, therefore, we test students' expectations and perceptions of the technology and provide a comparative result from a longitudinal perspective. More specifically, we examine students' perceptual differences in terms of well-known technology acceptance constructs such as perceived usefulness, ease of use, enjoyment, and intention to use the systems before and after students actually use them. The results confirm that there are differences between the levels of expectation and the actual perceptions of cognitive beliefs across three stages of experience. However, there are no differences in students' perceptions between initial usage and second-time usage. In other words, the very first direct experience becomes a salient determinant of students' cognitive beliefs and behavioral intention, but the second experience does not provide the same level of impact as the first one does. We believe that the results of the study will provide a solid understanding of the gap between the levels of expectation and actual perception of a technology before and after usage, which will explain the inconsistent results related to SRSs. Limitations and future directions are discussed.

    Provider-initiated symptom screening for tuberculosis in Zimbabwe: diagnostic value and the effect of HIV status.

    OBJECTIVE: To assess the diagnostic value of provider-initiated symptom screening for tuberculosis (TB) and how HIV status affects it. METHODS: We performed a secondary analysis of randomly selected participants in a community-based TB-HIV prevalence survey in Harare, Zimbabwe. All completed a five-symptom questionnaire and underwent sputum TB culture and HIV testing. We calculated the sensitivity, specificity, and positive and negative predictive values of various symptoms and used regression analysis to investigate the relationship between symptoms and TB disease. FINDINGS: We found one or more symptoms of TB in 21.2% of 1858 HIV-positive (HIV+) and 9.9% of 7121 HIV-negative (HIV-) participants (P < 0.001). In HIV+ participants, cough of ≥ 2 weeks' duration, any symptom and a positive sputum smear had sensitivities of 48%, 81% and 65%, respectively; in HIV- participants, the sensitivities were 45%, 71% and 74%, respectively. Symptoms had a similar sensitivity and specificity in HIV+ and HIV- participants, but in HIV+ participants they had a higher positive and a lower negative predictive value. CONCLUSION: Even smear-positive TB may be missed by provider-initiated symptom screening, especially in HIV+ individuals. Symptom screening is useful for ruling out TB, but better TB diagnostics are urgently needed for resource-poor settings.
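
    The findings turn on standard screening arithmetic: sensitivity and specificity are properties of the screen itself, while predictive values shift with disease prevalence, which was higher in the HIV+ group. A minimal sketch of these definitions (the counts and the function name are our own illustrative assumptions, not the survey's data):

        def screen_metrics(tp, fp, fn, tn):
            """Standard 2x2 screening metrics from cell counts."""
            sensitivity = tp / (tp + fn)   # symptomatic among culture-positive
            specificity = tn / (tn + fp)   # asymptomatic among culture-negative
            ppv = tp / (tp + fp)           # culture-positive among symptomatic
            npv = tn / (tn + fn)           # culture-negative among asymptomatic
            return sensitivity, specificity, ppv, npv

        # Same sensitivity and specificity, but higher prevalence (as in the
        # HIV+ group): PPV rises and NPV falls, matching the paper's finding.
        print(screen_metrics(tp=16, fp=80, fn=4, tn=900))   # low prevalence
        print(screen_metrics(tp=80, fp=80, fn=20, tn=900))  # higher prevalence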

    Taking advantage of the UNFCCC Kyoto Policy Process: What can we learn about learning?

    Learning is difficult to anticipate when it happens instantaneously, e.g. in the context of innovations [2]. However, even if learning is anticipated to happen continuously, it is difficult to grasp, e.g. when it occurs outside well-defined lab conditions because adequate monitoring has not been put in place. Our study is retrospective. It focuses on the emissions of greenhouse gases (GHGs) that have been reported by countries (Parties) under the Kyoto Protocol (KP) to the United Nations Framework Convention on Climate Change (UNFCCC). Discussions range widely on (i) whether the KP is considered a failure [6] or a success [5]; and (ii) whether international climate policy should transit from a centralized model of governance to a 'hybrid' decentralized approach that combines country-level mitigation pledges with common principles for accounting and monitoring [1]. Emissions of GHGs - in the following we refer to CO2 emissions from burning fossil fuels at country level, particularly in the case of Austria - provide a perfect means to study learning in a globally relevant context. We are not aware of a similar data treasure of global relevance. Our mode of grasping learning is novel, i.e. it may have been referred to in general terms but, to the best of our knowledge, has not been quantified so far. (That is, we consider the KP a potential success story and advocate the hybrid decentralized approach.) Learning requires 'measuring' differences or deviations. Here we follow Marland et al. [3], who discuss this issue in the context of emissions accounting: 'Many of the countries and organizations that make estimates of CO2 emissions provide annual updates in which they add another year of data to the time series and revise the estimates for earlier years. Revisions may reflect revised or more complete energy data and ... more complete and detailed understanding of the emissions processes and emissions coefficients. In short, we expect revisions to reflect learning and a convergence toward more complete and accurate estimates.' The UNFCCC requires exactly this to be done. Each year, UNFCCC signatory countries are obliged to provide an annual inventory of emissions (and removals) of specified GHGs from five sectors (energy; industrial processes and product use; agriculture; land use, land-use change and forestry; and waste) and to revisit the emissions (and removals) for all previous years, back to the country-specified base years (or periods). These data are made available by means of a database [4]. The time series of revised emission estimates reflect learning, but they are 'contaminated' by (i) structural change (e.g., when a coal power plant is substituted by a gas power plant); (ii) changes in consumption; and, rare but possible, (iii) methodological changes in surveying emission-related activities. De-trending the time series of revised emission estimates allows this contamination to be isolated by country, for which we provide three approaches: (I) a parametric approach employing a polynomial trend; (II) a non-parametric approach employing smoothing splines; and (III) an approach in which the most recent estimate is used as the trend. That is, after de-trending, for each year we are left with a set of revisions that reflect 'pure' (uncontaminated) learning, which is expected to be independent of the year under consideration (i.e., identical from year to year).
However, we are confronted with two non-negligible problems (P): (P.1) the problem of small numbers - the remaining differences in emissions are small (before and after de-trending); and (P.2) the problem of non-monotonic learning - our knowledge of emission-generating activities and emission factors may not become more accurate from revision to revision.
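
    A minimal sketch of the three de-trending approaches named above (the revision series, polynomial degree and smoothing factor are our own illustrative choices, not the study's data or settings):

        import numpy as np
        from scipy.interpolate import UnivariateSpline

        # Synthetic series of successive revised estimates for one inventory
        # year's emissions: revisions drift toward a 'true' value (learning).
        revisions = np.array([62.1, 61.7, 61.9, 61.5, 61.4, 61.45, 61.4])
        t = np.arange(len(revisions), dtype=float)

        # (I) Parametric: polynomial trend (degree 2, purely illustrative).
        trend_poly = np.polyval(np.polyfit(t, revisions, deg=2), t)

        # (II) Non-parametric: smoothing spline (s is a tuning choice).
        trend_spline = UnivariateSpline(t, revisions, k=3, s=0.1)(t)

        # (III) The most recent estimate taken as the trend.
        trend_latest = np.full_like(revisions, revisions[-1])

        # Residual revisions after de-trending should reflect 'pure' learning.
        for name, trend in [("poly", trend_poly), ("spline", trend_spline),
                            ("latest", trend_latest)]:
            print(name, np.round(revisions - trend, 3))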

    Assessing Information Systems and Computer Information Systems Programs from a Balanced Scorecard Perspective

    Assessment of educational programs is one of the important means used in academia for accountability, accreditation, and improvement of program quality. Assessment practices, guidelines, and requirements are very broad and vary widely among academic programs and from one institution to another. In this paper, through the theoretical lens of a strategic planning and management methodology, the Balanced Scorecard, we integrate various perspectives into a performance assessment framework for the educational assessment of computing and information systems programs. In particular, based on our actual accreditation experience, we propose two assessment models: a conceptual model and a process model. This modeling approach addresses the critical conceptual elements required for educational assessment and provides practical guidelines to follow for a complete, smooth and successful assessment process. In addition, we present a set of robust tools and techniques, incorporated into the process steps, teamwork, and task-driven management process. We were successful in our accreditation efforts and improved the quality of our computing and information systems programs by using the presented assessment methods. We share our views and thoughts in the form of lessons learned and suggested best practices so as to streamline program assessment and simplify its procedures and steps.

    Preparatory Signal Detection for Annex I Countries under the Kyoto Protocol - A Lesson for the Post-Kyoto Policy Process

    In our study we address the detection of uncertain GHG emission changes (also termed emission signals) under the Kyoto Protocol. The question to be probed is "how well do we need to know net emissions if we want to detect a specified emission signal after a given time?" No restrictions exist, in principle, on the choice of net emitter. However, for reasons of data availability, and because of the excellent possibility of inter-country comparisons, the Protocol's Annex I countries are used as net emitters. Another restriction concerns the exclusion of emissions/removals due to land-use change and forestry (LUCF), as the reporting of their uncertainties is only now becoming standard practice. Our study centers on the preparatory detection of emission signals, which should have been applied prior to, or while, negotiating the Kyoto Protocol. Rigorous preparatory signal detection has not yet been carried out, either prior to the negotiations of the Kyoto Protocol or afterwards. The starting point for preparatory signal detection is that the Annex I countries under the Kyoto Protocol comply with their emission limitation or reduction commitments. Uncertainties are already monitored. However, monitored emissions and uncertainties are still dealt with in isolation; a connection between emission and uncertainty estimates for the purpose of an advanced country evaluation has not yet been established. We apply four preparatory signal detection techniques: the Critical Relative Uncertainty (CRU) concept, the Verification Time (VT) concept, the Undershooting (Und) concept, and the combined Undershooting and Verification Time (Und&VT) concept. All of the techniques identify an emission signal and consider the total uncertainty that underlies a country's emissions, either in the commitment year/period or in both the base year and the commitment year/period. The techniques follow a hierarchical order in terms of complexity, permitting their robustness to be explored. The most complex technique, the Und&VT concept, considers, in addition to uncertainty, (1) the dynamics of the signal itself, permitting us to ask for the verification time, i.e., the time when the signal outstrips total uncertainty; (2) the risk (probability) that a country's true emissions in the commitment year/period are above (below) its true emission limitation or reduction commitment; (3) the undershooting that is needed to reduce this risk to a prescribed level; and (4) a corrected undershooting/risk that accounts for detectability, i.e., that fulfills a given commitment period or, equivalently, its maximal allowable verification time. Our preparatory signal detection exercise exemplifies that the negotiations for the Kyoto Protocol were imprudent because they did not consider the consequences of uncertainty, i.e., (1) the risk that countries' true emissions in the commitment year/period are above their true emission limitation or reduction commitments; and (2) detectable targets. Expecting that Annex I countries exhibit relative uncertainties in the range of 5-10% and above rather than below, and excluding emissions/removals due to LUCF, both the CRU concept and the VT concept show that it is virtually impossible for most of the Annex I countries to meet the condition that their overall relative uncertainties are smaller than their CRUs or, equivalently, that their VTs are smaller than their maximal allowable verification times.
Moreover, the Und and Und&VT concepts show that the countries' committed emission limitation or reduction targets - or their Kyoto-compatible but detectable targets, respectively - require considerable undershooting if one wants to keep the risk low that countries' true emissions in the commitment year/period are above the true equivalents of these targets. The amount by which a country undershoots its Kyoto target, or its Kyoto-compatible but detectable target, can be traded. Towards installing a successful trading regime, countries may want to also price the risk associated with this amount. We anticipate that the evaluation of countries' emission signals in terms of risk and detectability will become reality. The Intergovernmental Panel on Climate Change (IPCC) also suggests assessing total uncertainties. However, a connection between monitored emission and uncertainty estimates for the purpose of an advanced country evaluation, one that considers the aforementioned risk as well as detectable targets, has not yet been established. The IPCC has to take up this challenge.
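
    One stylized way to write down the detection conditions sketched above, in our own notation rather than the report's: let $e_0$ and $e_t$ denote base-year and year-$t$ emissions, $\delta_t = |e_t - e_0|/e_0$ the relative emission signal, and $\rho$ the total relative uncertainty of the estimates. Then:

        % Detectability: the signal must outstrip total relative uncertainty.
        \delta_t > \rho
        % Critical relative uncertainty (CRU): the largest uncertainty at which
        % the committed signal \delta_{CP} is still detectable in the commitment period:
        \rho_{\mathrm{crit}} = \delta_{\mathrm{CP}}
        % Verification time (VT): the first year in which the signal is detectable,
        % which must not exceed the maximal allowable verification time:
        \mathrm{VT} = \min\{\, t : \delta_t > \rho \,\}, \qquad \mathrm{VT} \le \mathrm{VT}_{\max}
        % Undershooting (Und): tighten the target by u > 0 so that the risk
        % \Pr(\text{true emissions} > \text{true target}) stays below a prescribed level.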

    Preparatory Signal Detection for the EU Member States under EU Burden Sharing - Advanced Monitoring Including Uncertainty (1990-2002)

    This study follows up the authors' collaborative IIASA Interim Report IR-04-024 (Jonas et al., 2004), which addresses the preparatory detection of uncertain greenhouse gas (GHG) emission changes (also termed emission signals) under the Kyoto Protocol. The question was "how well do we need to know net emissions if we want to detect a specified emission signal after a given time?" The authors used the Protocol's Annex I countries as net emitters and excluded the emissions/removals due to land-use change and forestry (LUCF). They motivated the application of preparatory signal detection in the context of the Kyoto Protocol as a necessary measure that should have been taken prior to, or while, negotiating the Protocol. The authors argued that uncertainties are already monitored and increasingly made available, but that monitored emissions and uncertainties are still dealt with in isolation; a connection between emission and uncertainty estimates for the purpose of an advanced country evaluation had not yet been established. The authors developed four preparatory signal detection techniques and applied these to the Annex I countries under the Kyoto Protocol. The frame of reference for preparatory signal detection is that Annex I countries comply with their committed emission targets in 2008-2012. In our study we apply one of these techniques, the combined undershooting and verification time (Und&VT) concept, to advance the monitoring of the GHG emissions reported by the Member States of the European Union (EU). In contrast to the earlier study, we focus on the Member States' committed emission targets under the EU burden sharing in compliance with the Kyoto Protocol. We apply the Und&VT concept in the standard mode, i.e., with reference to the Member States' committed emission targets in 2008-2012, and in a new mode, i.e., with reference to linear path emission targets between the base year and the commitment year (here for 2001). To advance the reporting of the EU we take uncertainty and its consequences into consideration, i.e., (i) the risk that a Member State's true emissions in the commitment year/period are above its true emission limitation or reduction commitment; and (ii) the detectability of its target. Undershooting the committed EU target, or the EU-compatible but detectable target, can decrease this risk. We contrast the Member States' linear path undershooting targets for the year 2001 with their actual emission situation in that year, for which we use the distance-to-target indicator (DTI) introduced by the European Environment Agency. In 2001 only four countries exhibited a negative DTI and thus appeared as potential sellers: Germany, Luxembourg, Sweden and the United Kingdom. However, expecting that the EU Member States exhibit relative uncertainties in the range of 5-10% and above rather than below, and excluding emissions/removals due to LUCF, the Member States require considerable undershooting of their EU-compatible, but detectable, targets if one wants to keep the associated risk low. These conditions can only be met by the three Member States Germany, Luxembourg and the United Kingdom - or Luxembourg, Germany and the United Kingdom if ranked in terms of creditability. Within the 5-10% relative uncertainty class, Sweden can only act as a potential high-risk seller.
In contrast, with relative uncertainty increasing from 5 to 10%, the emission signal of the EU as a whole switches from "detectable" to "non-detectable", indicating that the negotiations for the Kyoto Protocol were imprudent because they did not take uncertainty and its consequences into account. We anticipate that the evaluation of emission signals in terms of risk and detectability will become standard practice and that these two qualifiers will be accounted for in pricing GHG emission permits.
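
    A minimal sketch of the linear-path-target and distance-to-target logic described above (the -21% reduction share corresponds to Germany's EU burden-sharing commitment, but the emission figures and function name are our own illustrative assumptions, not the study's numbers):

        def linear_path_target(base_emissions, target_share, base_year,
                               commitment_year, year):
            """Emissions allowed in `year` on a straight line from base-year
            emissions to the committed target in the commitment year."""
            target = base_emissions * (1.0 + target_share)
            frac = (year - base_year) / (commitment_year - base_year)
            return base_emissions + frac * (target - base_emissions)

        # Illustrative Member State: -21% target, base year 1990, commitment
        # year 2010 (midpoint of 2008-2012), evaluated in 2001.
        path_2001 = linear_path_target(base_emissions=100.0, target_share=-0.21,
                                       base_year=1990, commitment_year=2010,
                                       year=2001)

        actual_2001 = 85.0             # assumed reported emissions in 2001
        dti = actual_2001 - path_2001  # negative DTI: below the linear path,
                                       # i.e., a potential seller
        print(round(path_2001, 2), round(dti, 2))

    Undershooting then tightens the path target further, by an amount chosen so that the risk of the true emissions lying above the true target stays below a prescribed level.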