
    Approaching Evaluation in Youth Community Informatics

    In the Youth Community Informatics project, young people from disadvantaged communities use audio and video recording and editing tools, GPS/GIS, presentation software, graphics, and other digital technologies as the means for addressing community needs. They build community asset maps, document community history, develop exhibits in collaboration with libraries and museums, present cultural heritage, organize political action, operate community radio, create and maintain community technology centers, and express themselves through multiple media. These activities typically involve multiple partners and develop in unpredictable ways in response to community life. In order to understand what these activities mean in the lives of the youth and the community, we need richer evaluation approaches.

    Trends in Child Poverty by Race/Ethnicity: New Evidence Using an Anchored Historical Supplemental Poverty Measure

    Official poverty statistics have been criticized for being based on an outdated measure of poverty (Blank, 2008; Citro and Michael, 1995). First put into use in the 1960s, the official poverty measure's (OPM) concept of needs has been updated for inflation but still reflects the living standards, family budgets, and family structures of that time. Moreover, when tallying family resources, the OPM misses key government programs, such as Food Stamps and tax credits, that have expanded since the 1960s. For these reasons, the Census Bureau and the Bureau of Labor Statistics (BLS) implemented an improved "supplemental poverty measure" (SPM) in 2011 (Short, 2011) for calendar years 2009 and 2010. The SPM is now released annually alongside the OPM (see Short, 2015 for the latest data as of this writing), but the Census Bureau has no plans to produce the measure historically. However, historical data on levels and trends in poverty are essential for understanding the progress the country has made since Lyndon B. Johnson's famous declaration of the War on Poverty in the 1960s. Understanding what has been successful in the past is important for assessing what might be successful in the future in ameliorating economic disadvantage. What's more, success and its sources may vary by race/ethnicity. We use a historical version of the SPM to provide the first estimates of racial/ethnic differences in child poverty for the period from 1970 to the present using this improved measure. We begin our analysis in 1970 because that is the first year in which we can reliably distinguish non-Hispanic whites, African-Americans, and Latinos (unfortunately, data limitations prevent us from examining other groups over the long term). We detail below our data and methods, then describe our main results, and conclude briefly.
Data and Methods

We use data from multiple years of the Current Population Survey's Annual Social and Economic Supplement (also known as the March CPS), combined with data from the Consumer Expenditure Survey (CEX), to produce SPM estimates for the period 1970 to 2014. We use a methodology similar to that used by the Census Bureau and the Bureau of Labor Statistics in producing their SPM estimates, but with adjustments for differences in available data over time. Our methodology differs from the SPM in only one respect. Instead of using a poverty threshold that is re-calculated over time, we use today's threshold and carry it back historically by adjusting it for inflation using the CPI-U-RS. Because this alternative measure is anchored with today's SPM threshold, we refer to it as an anchored supplemental poverty measure, or anchored SPM for short. An advantage of an anchored SPM is that poverty trends resulting from such a measure can be explained only by changes in income and net transfer payments (cash or in kind). Trends in poverty based on a relative measure (e.g., SPM poverty), on the other hand, could be due to changes in thresholds over time. Thus, an anchored SPM arguably provides a cleaner measure of how changes in income and net transfer payments have affected poverty historically (Wimer et al., 2013).[1]

[1] All analyses in this paper are also available using quasi-relative poverty thresholds; results are available upon request.
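The anchoring step described above can be sketched in a few lines. The threshold and price-index values below are illustrative placeholders, not the actual SPM threshold or CPI-U-RS figures:

```python
# Carry a present-day poverty threshold back in time by deflating it
# with a price index, as in an "anchored" SPM. All numeric values are
# hypothetical, not actual SPM thresholds or CPI-U-RS index values.

def anchored_thresholds(base_threshold, base_year, cpi):
    """Return {year: threshold}, with the base-year threshold
    adjusted for inflation via the price index `cpi`."""
    base_index = cpi[base_year]
    return {year: base_threshold * index / base_index
            for year, index in cpi.items()}

# Hypothetical price-index series.
cpi = {1970: 38.8, 1990: 130.7, 2014: 236.7}

thresholds = anchored_thresholds(25000.0, 2014, cpi)

# A family counts as poor in a given year if its resources fall below
# that year's anchored threshold.
poor_1970 = 3500.0 < thresholds[1970]
```

Because the threshold moves only with inflation, any change in the resulting poverty rate reflects changes in incomes and net transfers rather than changes in the threshold itself.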

    The uncertainty of changepoints in time series

    Analyses of time series exhibiting changepoints have predominantly focused on detection and estimation. However, changepoint estimates, such as their number and locations, are subject to uncertainty, which is often not captured explicitly or which requires sampling long latent vectors under existing methods. This thesis proposes efficient, flexible methodologies for quantifying the uncertainty of changepoints. The core methodology models time series and changepoints under a Hidden Markov Model framework. It combines existing work on exact changepoint distributions conditional on model parameters with Sequential Monte Carlo samplers that account for parameter uncertainty. The combination of the two provides posterior distributions of changepoint characteristics in light of parameter uncertainty. The thesis also presents a methodology for approximating the posterior distribution of the number of underlying states in a Hidden Markov Model, making model selection for Hidden Markov Models possible. This methodology employs Sequential Monte Carlo samplers in such a way that no additional computational cost is incurred beyond their existing use. The final part of the thesis considers time series in the wavelet domain, as opposed to the time domain. The motivation for this transformation is the occurrence of autocovariance changepoints in time series. Time-domain modelling approaches are somewhat limited for such changes and often resort to approximations. The wavelet domain relaxes these modelling limitations, so that autocovariance changepoints can be considered more readily. The proposed methodology develops a joint density for multiple processes in the wavelet domain, which can then be embedded within a Hidden Markov Model framework, making it possible to quantify the uncertainty of autocovariance changepoints. These methodologies are motivated by datasets from econometrics, neuroimaging, and oceanography.
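The core idea of reading changepoint uncertainty off an HMM can be illustrated with a minimal sketch. This is a two-state Gaussian HMM with fixed parameters, where the posterior probability of a change between t and t+1 is computed via the forward-backward algorithm; the thesis additionally averages over parameter uncertainty with Sequential Monte Carlo samplers, a layer omitted here, and all parameter values below are illustrative:

```python
import numpy as np

# Minimal sketch: changepoint uncertainty from a two-state Gaussian HMM
# with fixed, known parameters. The SMC layer that accounts for
# parameter uncertainty in the thesis is omitted.

def changepoint_posterior(y, means, sd, trans, init):
    """P(state changes between t and t+1 | y), for each t,
    via the forward-backward algorithm."""
    T, K = len(y), len(means)
    emit = np.array([np.exp(-0.5 * ((y - m) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
                     for m in means]).T                 # (T, K) emission densities
    alpha = np.zeros((T, K))
    beta = np.ones((T, K))
    alpha[0] = init * emit[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):                               # normalized forward pass
        alpha[t] = emit[t] * (alpha[t - 1] @ trans)
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):                      # normalized backward pass
        beta[t] = trans @ (emit[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    cp = np.zeros(T - 1)
    for t in range(T - 1):
        # Joint posterior of (S_t, S_{t+1}); off-diagonal mass = change.
        xi = alpha[t][:, None] * trans * (emit[t + 1] * beta[t + 1])[None, :]
        xi /= xi.sum()
        cp[t] = 1.0 - np.trace(xi)                      # P(S_t != S_{t+1} | y)
    return cp

# Simulated series with one mean shift at t = 50.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 50), rng.normal(3, 1, 50)])
trans = np.array([[0.98, 0.02], [0.02, 0.98]])
cp = changepoint_posterior(y, means=[0.0, 3.0], sd=1.0,
                           trans=trans, init=np.array([0.5, 0.5]))
# cp[t] should peak near the true change between t = 49 and t = 50.
```

Conditional on the parameters, these pairwise posteriors give an exact distribution over changepoint locations; averaging them over an SMC parameter sample yields the parameter-uncertainty-aware version described above.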

    The uncertainty of storm season changes: quantifying the uncertainty of autocovariance changepoints

    In oceanography, there is interest in determining storm season changes for logistical reasons, such as equipment maintenance scheduling. In particular, there is interest in capturing the uncertainty associated with these changes in terms of their number and locations. Such changes are associated with autocovariance changes. This paper proposes a framework, motivated by this oceanographic application, for quantifying the uncertainty of autocovariance changepoints in time series. More specifically, the framework considers time series under the Locally Stationary Wavelet framework, deriving a joint density for scale processes in the raw wavelet periodogram. By embedding this density within a Hidden Markov Model framework, we consider changepoint characteristics under this multiscale setting. Such a methodology allows us to model changepoints and their uncertainty for a wide range of models, including piecewise second-order stationary processes, for example piecewise Moving Average processes.

    Physico-electrochemical Characterization of Pluripotent Stem Cells during Self-Renewal or Differentiation by a Multi-modal Monitoring System.

    Monitoring pluripotent stem cell behaviors (self-renewal and differentiation to specific lineages/phenotypes) is critical for a fundamental understanding of stem cell biology and their translational applications. In this study, a multi-modal stem cell monitoring system was developed to quantitatively characterize physico-electrochemical changes of the cells in real time, in relation to cellular activities during self-renewal or lineage-specific differentiation, in a non-destructive, label-free manner. The system was validated by measuring physical (mass) and electrochemical (impedance) changes in human induced pluripotent stem cells undergoing self-renewal, or subjected to mesendodermal or ectodermal differentiation, and correlating them to morphological (size, shape) and biochemical (gene/protein expression) changes. An equivalent circuit model was used to further dissect the electrochemical (resistive and capacitive) contributions of distinctive cellular features. Overall, the combination of physico-electrochemical measurements and electrical circuit modeling offers a means to longitudinally quantify the states of stem cell self-renewal and differentiation.

    Resource-Optimized Fermionic Local-Hamiltonian Simulation on Quantum Computer for Quantum Chemistry

    The ability to simulate a fermionic system on a quantum computer is expected to revolutionize chemical engineering, materials design, and nuclear physics, to name a few fields. Optimizing the simulation circuits is therefore of significance in harnessing the power of quantum computers. Here, we address this problem in two respects. In the fault-tolerant regime, we optimize the Rz- and T-gate counts, along with the number of ancilla qubits required, assuming the use of a product-formula algorithm for implementation. We obtain a savings ratio of two in the gate counts and a savings ratio of eleven in the number of ancilla qubits required over the state of the art. In the pre-fault-tolerant regime, we optimize the two-qubit gate counts, assuming the use of the variational quantum eigensolver (VQE) approach. Specific to the latter, we present a framework that enables bootstrapping the VQE progression towards convergence of the ground-state energy of the fermionic system. This framework, based on perturbation theory, is capable of improving the energy estimate at each cycle of the VQE progression, bringing it about a factor of three closer to the known ground-state energy than the standard VQE approach in the test-bed, classically accessible system of the water molecule. The improved energy estimate in turn yields a commensurate level of savings in quantum resources, such as the number of qubits and quantum gates required to be within a pre-specified tolerance of the known ground-state energy. We also explore a suite of generalized transformations of fermion-to-qubit operators and show that resource-requirement savings of more than 20% are possible in small instances.
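The idea of refining an intermediate energy estimate perturbatively can be illustrated with a toy example. The sketch below is generic second-order Rayleigh-Schrödinger perturbation theory on a small Hermitian matrix, not the paper's specific VQE-bootstrapping framework, and the Hamiltonian and perturbation strength are arbitrary choices for illustration:

```python
import numpy as np

# Toy illustration: a perturbative correction sharpens a zeroth-order
# ground-state energy estimate, in the spirit of refining an
# intermediate variational (e.g. VQE) energy. Generic second-order
# Rayleigh-Schrodinger perturbation theory; all values illustrative.

rng = np.random.default_rng(1)
n = 6
H0 = np.diag(np.arange(n, dtype=float))        # solvable reference Hamiltonian
V = rng.normal(scale=0.02, size=(n, n))
V = (V + V.T) / 2                              # small Hermitian perturbation
H = H0 + V

E0 = H0[0, 0]                                  # unperturbed ground energy
e1 = V[0, 0]                                   # first-order correction
e2 = sum(V[k, 0] ** 2 / (E0 - H0[k, k])        # second-order correction
         for k in range(1, n))

E_pt = E0 + e1 + e2                            # corrected estimate
E_exact = np.linalg.eigvalsh(H)[0]             # exact ground-state energy
# E_pt typically lands far closer to E_exact than the raw E0 does.
```

The second-order term is always non-positive for the ground state, consistent with the variational bound E_exact ≤ E0 + e1.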

    Small Ubiquitin-Like Modifier Modulates Abscisic Acid Signaling in Arabidopsis


    Collaboration among government agencies : a study in the management of roadside skips

    Published or final version. Politics and Public Administration. Master of Public Administration.