A Codebook Generation Algorithm for Document Image Compression
Pattern-matching-based document-compression systems (e.g. for faxing) rely on
finding a small set of patterns that can be used to represent all of the ink in
the document. Finding an optimal set of patterns is NP-hard; previous
compression schemes have resorted to heuristics. This paper describes an
extension of the cross-entropy approach, used previously for measuring pattern
similarity, to this problem. This approach reduces the problem to a k-medians
problem, for which the paper gives a new algorithm with a provably good
performance guarantee. In comparison to previous heuristics (First Fit, with
and without generalized Lloyd's/k-means postprocessing steps), the new
algorithm generates a better codebook, resulting in an overall improvement in
compression performance of almost 17%.
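The reduction to k-medians can be illustrated with a generic greedy sketch. This is not the paper's algorithm, only a minimal illustration under assumed inputs: `dissim` stands in for the cross-entropy-based pattern dissimilarities described in the abstract, and all names are hypothetical.

```python
# Illustrative sketch only, NOT the paper's algorithm: greedy k-medians
# over a pattern dissimilarity matrix. dissim[i][j] is a hypothetical
# stand-in for the cross-entropy distance between patterns i and j.

def k_medians(dissim, k):
    """Greedily pick k medians minimizing total dissimilarity.

    Returns the indices of the chosen median patterns (the codebook).
    """
    n = len(dissim)
    medians = []
    # cost[i] = distance from pattern i to its nearest chosen median
    cost = [float("inf")] * n
    for _ in range(k):
        best, best_total = None, float("inf")
        for c in range(n):
            if c in medians:
                continue
            # Total cost if candidate c were added as a median
            total = sum(min(cost[i], dissim[i][c]) for i in range(n))
            if total < best_total:
                best, best_total = c, total
        medians.append(best)
        cost = [min(cost[i], dissim[i][best]) for i in range(n)]
    return medians
```

Each chosen median would serve as one codebook entry, with every remaining pattern encoded by reference to its nearest median.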
Examining the Read-to-Write Strategy and Its Effects on Second Graders’ Writing of Sequential Text
Writing matters in school and in our careers, and it has been found to be beneficial both physiologically and psychologically. Given its importance, experts ask why writing is not being adequately taught in schools. The answer may be that writing is complex and teaching it is even more so. The Read-to-Write Strategy is a writing model based on the study of exemplary models of text: children are explicitly taught to write the way an author writes through a process in which the teacher models this kind of writing, shares the writing task with the children, and has children collaborate with a partner during the writing task, so that eventually children can independently write text matched to their audience and purpose. In this exploratory study, second-grade children were explicitly taught a writing strategy that followed the Read-to-Write model. The study compared samples of children’s writing before and after they received instruction in the Read-to-Write Strategy. Children made marked improvements in their writing, and the tests run on the writing samples indicate that the learning gains were significant.
Passive broadband full Stokes polarimeter using a Fresnel cone
Light's polarisation contains information about its source and interactions,
from distant stars to biological samples. Polarimeters can recover this
information, but reliance on birefringent or rotating optical elements limits
their wavelength range and stability. Here we present a static, single-shot
polarimeter based on a Fresnel cone - the direct spatial analogue to the
popular rotating quarter-wave plate approach. We measure the average angular
accuracy to be 2.9 (3.6) degrees for elliptical (linear) polarisation states
across the visible spectrum, with the degree of polarisation determined to
within 0.12 (0.08). Our broadband full Stokes polarimeter is robust,
cost-effective, and could find applications in hyper-spectral polarimetry and
scanning microscopy.
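Independently of the cone-based hardware, the quantities the abstract reports (full Stokes vector and degree of polarisation) follow from analyzer intensities in the textbook way. The sketch below shows the standard six-measurement reduction, not the Fresnel-cone reconstruction itself; all names are illustrative.

```python
import math

# Textbook six-measurement Stokes recovery (NOT the Fresnel-cone scheme):
# intensities measured behind horizontal, vertical, diagonal, anti-diagonal,
# right-circular, and left-circular analyzers.

def stokes_from_intensities(IH, IV, ID, IA, IR, IL):
    S0 = IH + IV   # total intensity
    S1 = IH - IV   # horizontal/vertical linear component
    S2 = ID - IA   # diagonal/anti-diagonal linear component
    S3 = IR - IL   # right/left circular component
    return S0, S1, S2, S3

def degree_of_polarisation(S0, S1, S2, S3):
    # 1 for fully polarised light, 0 for unpolarised light
    return math.sqrt(S1**2 + S2**2 + S3**2) / S0
```

For example, purely horizontal linear polarisation (IH = 1, IV = 0, ID = IA = IR = IL = 0.5) yields S = (1, 1, 0, 0) with a degree of polarisation of 1.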
Forecasting dose-time profiles of solar particle events using a dosimetry-based Bayesian forecasting methodology
A dosimetry-based Bayesian methodology for forecasting astronaut radiation doses in deep space due to radiologically significant solar particle event proton fluences is developed. Three non-linear sigmoidal growth curves (Gompertz, Weibull, logistic) are used with hierarchical, non-linear regression models to forecast solar particle event dose-time profiles from doses obtained early in the development of the event. Since there are no detailed measurements of dose versus time for actual events, surrogate dose data are provided by calculational methods. Proton fluence data are used as input to the deterministic, coupled neutron-proton space radiation computer code, BRYNTRN, for transporting protons and their reaction products (protons, neutrons, ²H, ³H, ³He, and ⁴He) through aluminum shielding material and water. Calculated doses and dose rates for ten historical solar particle events are used as the input data by grouping similar historical solar particle events, using asymptotic dose and maximum dose rate as the grouping criteria. These historical data are then used to lend strength to predictions of dose and dose rate-time profiles for new solar particle events. Bayesian inference techniques are used to make parameter estimates and predictive forecasts. Due to the difficulty of performing the numerical integrations necessary to calculate posterior parameter distributions and posterior predictive distributions, Markov chain Monte Carlo (MCMC) methods are used to sample from the posterior distributions.
Hierarchical, non-linear regression models provide useful predictions of asymptotic dose and dose-time profiles for the November 8, 2000 and August 12, 1989 solar particle events. Predicted dose rate-time profiles are adequate for the November 8, 2000 solar particle event. Predictions of dose rate-time profiles for the August 12, 1989 solar particle event suffer due to a more complex dose rate-time profile. Model assessment indicates adequate fits of the data. Model comparison results clearly indicate preference for the Weibull model for both events.
Forecasts provide a valuable tool to space operations planners when making recommendations concerning operations in which radiological exposure might jeopardize personal safety or mission completion. This work demonstrates that Bayesian inference methods can be used to make forecasts of dose and dose rate-time profiles early in the evolution of solar particle events. Bayesian inference methods provide a coherent methodology for quantifying uncertainty. Hierarchical models provide a natural framework for the prediction of new solar particle event dose and dose rate-time profiles.
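The three sigmoidal growth curves named in the abstract can be written down in common parameterisations (the exact forms used in the work may differ; the parameter values below are purely illustrative). In each, A is the asymptotic dose that the profile approaches at late time.

```python
import math

# Common parameterisations of the three growth curves from the abstract.
# A = asymptotic dose; b, k, t0 are illustrative shape/rate/location
# parameters, not values from the study.

def gompertz(t, A, b, k):
    return A * math.exp(-b * math.exp(-k * t))

def weibull_growth(t, A, b, k):
    return A * (1.0 - math.exp(-((t / b) ** k)))

def logistic(t, A, k, t0):
    return A / (1.0 + math.exp(-k * (t - t0)))
```

All three rise monotonically toward A, which is why asymptotic dose is a natural criterion for grouping historical events, as the abstract describes.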
Advances in semantic representation for multiscale biosimulation: a case study in merging models
As a case study of biosimulation model integration, we describe our experiences applying the SemSim methodology to integrate independently developed, multiscale models of cardiac circulation. In particular, we have integrated the CircAdapt model (written by T. Arts for MATLAB) of an adapting vascular segment with a cardiovascular system model (written by M. Neal for JSim). We report on three results from the model integration experience. First, models should be explicit about simulations that occur on different time scales. Second, data structures and naming conventions used to represent model variables may not translate across simulation languages. Finally, identifying the dependencies among model variables is a non-trivial task. We claim that these challenges will appear whenever researchers attempt to integrate models from others, especially when those models are written in a procedural style (using MATLAB, Fortran, etc.) rather than a declarative format (as supported by languages like SBML, CellML, or JSim’s MML).
Subglacial clast/bed contact forces
A laboratory device was built to measure the forces that ice exerts on a 0.05 m diameter rigid plastic sphere in two different configurations: in contact with a flat bed or isolated from the bed. Measurements indicated that bed-normal contact forces were 1.8 times larger than drag forces due to creeping flow past a slippery sphere isolated from the bed. Measurements of forces as a function of the bed-normal ice velocity, estimations of the ice viscosity parameter and observations of markers in the ice indicate ice is Newtonian with a viscosity of ∼1.3 × 10¹¹ Pa s. Newtonian behavior is expected due to small and transient stresses. A model of regelation indicates that it had a negligible (<5%) influence on forces. Water pressure in the cavity beneath the sphere in contact with the bed had a likewise negligible influence on contact forces. When no cavity is present, drag forces can be correctly estimated using Stokes's law (Newtonian viscosity) for a slippery sphere. The same law with a bed-enhancement factor of 1.8 is appropriate for estimating bed-normal contact forces. These results reinforce previous laboratory measurements and theories but provide no support for explanations of high debris/bed friction or rates of abrasion that depend on high contact forces.
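The force estimates above can be sketched as a back-of-envelope calculation. This is only an illustration under stated assumptions: for a perfectly slippery (free-slip) sphere Stokes drag is F = 4πμRv (a no-slip sphere would give 6πμRv); the viscosity, radius, and 1.8 bed-enhancement factor come from the abstract, while the velocity value is hypothetical.

```python
import math

# Back-of-envelope sketch of the force estimates described above.
MU = 1.3e11        # ice viscosity from the experiments, Pa s
R = 0.025          # sphere radius, m (0.05 m diameter)
BED_FACTOR = 1.8   # bed-enhancement factor from the abstract

def slip_drag(v):
    """Stokes drag (N) on a perfectly slippery sphere isolated
    from the bed, moving at ice velocity v (m/s)."""
    return 4.0 * math.pi * MU * R * v

def bed_contact_force(v):
    """Bed-normal contact force (N): slip drag scaled by 1.8."""
    return BED_FACTOR * slip_drag(v)
```

At an illustrative bed-normal ice velocity of 10⁻⁸ m/s this gives a drag force of roughly 400 N on the isolated sphere, and 1.8 times that when the sphere rests on the bed.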