Is my configuration any good: checking usability in an interactive sensor-based activity monitor
We investigate formal analysis of two aspects of usability in a deployed interactive, configurable and context-aware system: an event-driven, sensor-based homecare activity monitor system. The system was not designed from formal requirements or specification: we model the system as it is, in the context of an agile development process. Our aim was to determine whether formal modelling and analysis can contribute to improving usability, and if so, which style of modelling is most suitable. The purpose of the analysis is to inform configurers about how to interact with the system, so that the system is more usable for participants, and to guide future developments. We consider redundancies in configuration rules defined by carers and participants, and the interaction modality of the output messages. Two approaches to modelling are considered: a deep embedding, in which devices, sensors and rules are represented explicitly by data structures in the modelling language and non-determinism is employed to model all possible device and sensor states, and a shallow embedding, in which the rules and device and sensor states are represented directly in propositional logic. The former requires a conventional machine and a model-checker for analysis, whereas the latter is implemented using a SAT solver directly on the activity monitor hardware. We draw conclusions about the role of formal models and reasoning in deployed systems and the need for clear semantics and ontologies for interaction modalities.
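The shallow embedding reduces redundancy checking to propositional satisfiability: a rule r is redundant with respect to another rule (or rule set) R if R already implies r, i.e. R ∧ ¬r is unsatisfiable. Below is a minimal Python sketch of that check using a brute-force truth-table solver; the sensor names and rules are hypothetical illustrations, not the deployed system's configuration, and a real implementation would hand the formula to a SAT solver as the paper does.

```python
from itertools import product

VARS = ["door_open", "night", "alert"]

# Rules are predicates over an assignment (dict: variable name -> bool).
# "if door_open and night then alert", written as an implication:
def rule_a(v):  # hypothetical carer-defined rule
    return (not (v["door_open"] and v["night"])) or v["alert"]

def rule_b(v):  # hypothetical broader rule that subsumes rule_a
    return (not v["door_open"]) or v["alert"]

def satisfiable(formula):
    """Brute-force SAT: try every assignment of the state variables."""
    for bits in product([False, True], repeat=len(VARS)):
        if formula(dict(zip(VARS, bits))):
            return True
    return False

# rule_a is redundant given rule_b iff (rule_b AND NOT rule_a) is unsatisfiable.
redundant = not satisfiable(lambda v: rule_b(v) and not rule_a(v))
print("rule_a redundant given rule_b:", redundant)  # True
```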
A vector quantization approach to universal noiseless coding and quantization
A two-stage code is a block code in which each block of data is coded in two stages: the first stage codes the identity of a block code among a collection of codes, and the second stage codes the data using the identified code. The collection of codes may be noiseless codes, fixed-rate quantizers, or variable-rate quantizers. We take a vector quantization approach to two-stage coding, in which the first stage code can be regarded as a vector quantizer that “quantizes” the input data of length n to one of a fixed collection of block codes. We apply the generalized Lloyd algorithm to the first-stage quantizer, using induced measures of rate and distortion, to design locally optimal two-stage codes. On a source of medical images, two-stage variable-rate vector quantizers designed in this way outperform standard (one-stage) fixed-rate vector quantizers by over 9 dB. The tail of the operational distortion-rate function of the first-stage quantizer determines the optimal rate of convergence of the redundancy of a universal sequence of two-stage codes. We show that there exist two-stage universal noiseless codes, fixed-rate quantizers, and variable-rate quantizers whose per-letter rate and distortion redundancies converge to zero as $(k/2)\,n^{-1}\log n$, when the universe of sources has finite dimension $k$. This extends the achievability part of Rissanen's theorem from universal noiseless codes to universal quantizers. Further, we show that the redundancies converge as $O(n^{-1})$ when the universe of sources is countable, and as $O(n^{-1+\epsilon})$ when the universe of sources is infinite-dimensional, under appropriate conditions.
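The design loop mirrors the classical Lloyd iteration, with whole block codes playing the role of codewords: each training block is assigned to the code that encodes it with least induced cost, then each code is re-optimized for the blocks assigned to it. The following numpy sketch illustrates this for a fixed-rate first stage over scalar codebooks (with equal-size codebooks the rate term is constant, so assignment reduces to distortion); the dimensions and the inner re-design step are simplified assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_cost(block, codebook):
    """Distortion of coding a block with the nearest-codeword quantizer."""
    d = np.abs(block[:, None] - codebook[None, :]) ** 2
    return d.min(axis=1).sum()

def lloyd_update(samples, codebook, iters=5):
    """Re-optimize one codebook: ordinary Lloyd (k-means) on its samples."""
    for _ in range(iters):
        idx = np.abs(samples[:, None] - codebook[None, :]).argmin(axis=1)
        for j in range(len(codebook)):
            if np.any(idx == j):
                codebook[j] = samples[idx == j].mean()
    return codebook

# Toy training set: length-n blocks drawn from two different sources.
n, m = 16, 200
blocks = np.vstack([rng.normal(0, 1, (m, n)), rng.normal(5, 2, (m, n))])

# First-stage "quantizer": a fixed collection of two second-stage codebooks.
codebooks = [rng.normal(0, 3, 4) for _ in range(2)]

for _ in range(10):  # generalized Lloyd iteration over codes
    # Assignment: map each block to the code with least induced distortion.
    costs = np.array([[encode_cost(b, cb) for cb in codebooks] for b in blocks])
    assign = costs.argmin(axis=1)
    # Update: re-design each code from the blocks assigned to it.
    for i in range(len(codebooks)):
        if np.any(assign == i):
            codebooks[i] = lloyd_update(blocks[assign == i].ravel(), codebooks[i])

print("mean per-block distortion:", costs.min(axis=1).mean())
```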
Weighted universal image compression
We describe a general coding strategy leading to a family of universal image compression systems designed to give good performance in applications where the statistics of the source to be compressed are not available at design time or vary over time or space. The basic approach considered uses a two-stage structure in which the single source code of traditional image compression systems is replaced with a family of codes designed to cover a large class of possible sources. To illustrate this approach, we consider the optimal design and use of two-stage codes containing collections of vector quantizers (weighted universal vector quantization), bit allocations for JPEG-style coding (weighted universal bit allocation), and transform codes (weighted universal transform coding). Further, we demonstrate the benefits to be gained from the inclusion of perceptual distortion measures and optimal parsing. The strategy yields two-stage codes that significantly outperform their single-stage predecessors. On a sequence of medical images, weighted universal vector quantization outperforms entropy-coded vector quantization by over 9 dB. On the same data sequence, weighted universal bit allocation outperforms a JPEG-style code by over 2.5 dB. On a collection of mixed text and image data, weighted universal transform coding outperforms a single, data-optimized transform code (which gives performance almost identical to that of JPEG) by over 6 dB.
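On the encoding side, a two-stage code spends a few header bits per block on the identity of the chosen code and then codes the block with it; with variable-rate codes the selection minimizes a Lagrangian distortion-plus-rate cost. A hedged sketch of that selection step follows; the cost model and bit counts are illustrative assumptions, not the paper's exact codes.

```python
import numpy as np

def two_stage_encode(block, codebooks, lam=0.1):
    """Pick the code minimizing distortion + lam * rate.

    Rate here is an illustrative count: log2(#codes) bits for the code
    identity plus log2(codebook size) bits per coded sample.
    """
    best = None
    header_bits = np.log2(len(codebooks))
    for i, cb in enumerate(codebooks):
        idx = np.abs(block[:, None] - cb[None, :]).argmin(axis=1)
        dist = ((block - cb[idx]) ** 2).sum()
        rate = header_bits + len(block) * np.log2(len(cb))
        cost = dist + lam * rate
        if best is None or cost < best[0]:
            best = (cost, i, idx)
    _, i, idx = best
    return i, idx  # first-stage code identity, second-stage indices

rng = np.random.default_rng(1)
codebooks = [np.linspace(-3, 3, 8), np.linspace(0, 10, 16)]  # two sources
code_id, symbols = two_stage_encode(rng.normal(5, 2, 16), codebooks)
print("chose code", code_id)  # likely 1: the codebook covering [0, 10]
```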
User Interface Management Systems: A Survey and a Proposed Design
The growth of interactive computing has resulted in increasingly complex styles of interaction between user and computer. To facilitate the creation of highly interactive systems, the concept of the User Interface Management System (UIMS) has been developed. Following the definition of the term 'UIMS' and a consideration of the putative advantages of the UIMS approach, a number of User Interface Management Systems are examined. This examination focuses in turn on the run-time execution system, the specification notation and the design environment, with a view to establishing the features which an "ideal" UIMS should possess. On the basis of this examination, a proposal for the design of a new UIMS is presented, and progress is reported towards the implementation of a prototype based on this design.
‘A publick benefite to the nation’: the charitable and religious origins of the SSPCK, 1690-1715
The stated purpose of the Society in Scotland for the Propagation of Christian Knowledge was the establishment of charity schools complementary to the statutory parochial schools in the Highland parishes of Scotland. The parochial schools were demonstrably unsuited to these parishes owing to terrain, weather, infrastructure, the nature of settlement, and their vulnerability to the Catholic mission. Historians and commentators have tended to see the society through a cultural and linguistic lens, imputing to it the weak condition in which Gaelic finds itself today. A ban on teaching Gaelic literacy, not lifted until the 1760s, has been considered part of an overall strategy to eliminate Gaelic in the hope of greater civilization in the Highlands. This perspective overlooks a broader significance of the society, which, as a corporation, extended charity beyond the landed classes and nobility to the rising professions and also to common labourers and tenants, through its use of the parishes to collect donations. It was also a sustained effort at establishing a joint-stock company in the wake of the Bank of Scotland and the Company of Scotland, and it instituted transparent business practices to foster a reputation for financial probity. The moral aspect of its mission required good and pious behaviour from its teachers, so that they might serve as an example to the schools’ communities and persuade, rather than coerce, children to attend. The society was also very much of its time, with a role in the completion of the Reformation, a common theme in contemporary religious and social circles. This completion was structural, with the Church of Scotland trying to secure its presbyterian establishment throughout the country, but also moral, with the Societies for Reformation of Manners in England and Scotland, and the Society for Promoting Christian Knowledge in England, building on the legacy of the Reformation and the providential revolution through an encouragement of moral behaviour. These were private groups, however, and while the SPCK developed a channel for charitable activity for the rising professional and middle classes, the SSPCK worked to produce a national corporate effort to support reformation and education in the Highlands.
A mean-removed variation of weighted universal vector quantization for image coding
Weighted universal vector quantization uses traditional codeword design techniques to design locally optimal multi-codebook systems. Application of this technique to a sequence of medical images produces a 10.3 dB improvement over standard full-search vector quantization followed by entropy coding, at the cost of increased complexity. In the proposed variation, each codebook in the system is given a mean or 'prediction' value, which is subtracted from all supervectors that map to the given codebook. The chosen codebook's codewords are then used to encode the resulting residuals. Application of the mean-removed system to the medical data set achieves up to a 0.5 dB improvement at no rate expense.
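A minimal sketch of the mean-removed encoding step described above: each candidate codebook carries a mean value that is subtracted from the supervector before the residual is quantized, and the encoder keeps whichever codebook yields the lowest residual distortion. Scalar codewords and the toy values below are illustrative assumptions, not the paper's design.

```python
import numpy as np

def mean_removed_encode(supervector, systems):
    """systems: list of (mean, codebook) pairs, scalar codewords for brevity.

    Subtract each system's mean ('prediction'), quantize the residual with
    nearest codewords, and keep the system with least total distortion.
    """
    best = None
    for s, (mean, cb) in enumerate(systems):
        resid = supervector - mean                          # remove the mean
        idx = ((resid[:, None] - cb[None, :]) ** 2).argmin(axis=1)
        dist = ((resid - cb[idx]) ** 2).sum()
        if best is None or dist < best[0]:
            best = (dist, s, idx)
    return best[1], best[2]  # chosen codebook id, residual codeword indices

rng = np.random.default_rng(2)
systems = [(0.0, np.linspace(-2, 2, 8)), (5.0, np.linspace(-2, 2, 8))]
sys_id, idx = mean_removed_encode(rng.normal(5, 1, 32), systems)
print("system", sys_id)  # the mean-5 system wins on mean-5 data
```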
Dexterity, Deliberateness, And Disposition: An Investigation of Instructional Strength for Early Literacy
This mixed-methods study examines the relationship between an emergent conceptualization of teachers’ instructional strength and their students’ progress in early literacy. The conceptualization comprises three components: a teacher’s deliberateness, their instructional dexterity, and a set of teacher dispositions that catalyse and maximize these attributes. Based on the results of qualitative inquiry, the authors developed a measure of individual teachers’ instructional strength according to this conceptualization. Regression analysis reveals that all three components are significant predictors of students’ growth in early literacy. The study includes 318 teachers and 1,181 students from 227 schools across the country.
A Progressive Universal Noiseless Coder
The authors combine pruned tree-structured vector quantization (pruned TSVQ) with Itoh's (1987) universal noiseless coder. By combining pruned TSVQ with universal noiseless coding, they benefit from the “successive approximation” capabilities of TSVQ, thereby allowing progressive transmission of images, while retaining the ability to noiselessly encode images of unknown statistics in a provably asymptotically optimal fashion. Noiseless compression results are comparable to Ziv-Lempel and arithmetic coding for both images and finely quantized Gaussian sources.
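The progressive property comes from the tree structure: each input descends a binary codeword tree one bit at a time, and every prefix of the emitted bit path decodes to a coarser approximation of the same vector. The toy sketch below illustrates that successive-approximation descent; the tree and its values are hypothetical, and a pruned tree would simply end refinement early at some interior nodes.

```python
# Node: (value, left_child, right_child); leaves have no children.
# Toy depth-2 TSVQ tree over scalars.
tree = (0.0,
        (-2.0, (-3.0, None, None), (-1.0, None, None)),
        ( 2.0, ( 1.0, None, None), ( 3.0, None, None)))

def tsvq_encode(x, node):
    """Descend the tree toward x, emitting one refinement bit per level."""
    bits = []
    while node[1] is not None:  # until a leaf (or pruned node)
        left, right = node[1], node[2]
        bit = int(abs(x - right[0]) < abs(x - left[0]))
        bits.append(bit)
        node = right if bit else left
    return bits

def tsvq_decode(bits, node):
    """Any prefix of the bitstream decodes to a progressively finer value."""
    for bit in bits:
        node = node[2] if bit else node[1]
    return node[0]

bits = tsvq_encode(1.4, tree)      # [1, 0]
for k in range(len(bits) + 1):     # progressive reconstruction
    print(bits[:k], "->", tsvq_decode(bits[:k], tree))  # 0.0, 2.0, 1.0
```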
Trade and Volatility at the Core and Periphery of the Global Economy
Peer Reviewed. http://deepblue.lib.umich.edu/bitstream/2027.42/95090/1/j.1468-2478.2012.00748.x.pd
The i3 Validation of SunBay Digital Mathematics
Decades of research have emphasized the need for engaging, accessible ways to help students learn fundamental mathematical concepts. Research also shows that teacher beliefs about teaching and learning have an outsized influence on the quality and effectiveness of curricular interventions. This article reports results of an independent evaluation of the i3 implementation of SunBay Digital Mathematics, a middle-school math intervention, and examines the program’s impacts on both student progress and teachers’ beliefs about math instruction. Prior studies have demonstrated the efficacy of SunBay Math for students of varied levels of prior achievement. This independent evaluation included a randomized controlled trial in 60 Florida middle schools during the 2015-16 school year and a mixed-methods implementation study. No impact on student achievement was observed overall; however, the evaluation did reveal positive impacts on teachers’ classroom practices and on their beliefs about the use of technology in math instruction. Inadequate implementation of the instructional units and the lack of impact on the targeted teacher beliefs about math instruction likely contributed to the absence of overall program effects.