Developments in WIS Development
This paper presents findings from a longitudinal field study of web design undertaken in a web-development company. The main contribution is a comparison of early predictions about the implications of increased use of WWW technologies for more complex information systems with core findings from two field studies of a specific web-development work setting. The two studies are snapshots from a longitudinal study, focusing on the organization of the development work, essential characteristics of the development work, and core characteristics of the products developed. Many of the core problems to be handled in web-based information systems (WIS) development are quite analogous to challenges known from traditional information systems development, although the pace increases, the number of involved competencies grows, standards are lacking in many areas, and communication problems between the different competence groups increase.
Fitting the Mixed Rasch Model to a Reading Comprehension Test: Identifying Reader Types
Standard unidimensional Rasch models assume that persons with the same ability parameters are comparable. That is, the same interpretation applies to persons with identical ability estimates as regards the underlying mental processes triggered by the test. However, research in cognitive psychology shows that persons at the same trait level may employ different strategies to arrive at the solutions. This is a major threat to the construct validity of a test, since the construct representation of the test changes for different classes of respondents. In this study, a reading comprehension test composed of 20 multiple-choice items is analysed with the mixed Rasch model. Findings show that a two-class model fits the data best. After investigating class-specific item profiles, the implications of the study for test validation, along with the contribution of the research to our understanding of reading processes, are discussed.
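The reader-type idea described above can be sketched in a few lines: under the Rasch model, the probability of a correct response depends only on the difference between person ability and item difficulty, while a mixed Rasch model lets each latent class carry its own item-difficulty profile. The class labels and difficulty values below are hypothetical illustrations, not estimates from the study.

```python
import math

def rasch_prob(theta, b):
    """Rasch model: P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# In a mixed Rasch model, each latent class c has its own item-difficulty
# profile b_c. Two readers with identical ability theta can therefore face
# different effective difficulties, depending on their class membership.
# Hypothetical two-class profiles for three items (illustrative values only):
difficulties = {
    "class_1": [-0.5, 0.0, 1.2],
    "class_2": [0.3, -0.4, 0.8],
}

theta = 0.0  # the same ability estimate in both classes
for cls, bs in difficulties.items():
    profile = [round(rasch_prob(theta, b), 3) for b in bs]
    print(cls, profile)
```

Comparing the two printed probability profiles is exactly the "class-specific item profile" inspection the abstract refers to: items that are relatively easy for one class but hard for the other hint at different solution strategies.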
The Merger Incipiency Doctrine and the Importance of Redundant Competitors
The enforcers and the courts have not implemented the merger incipiency doctrine in the vigorous manner Congress intended. We believe one important reason for this failure is that, until now, the logic underlying this doctrine has never been explained. The purpose of this article is to demonstrate that markets’ need for “protective redundancy” explains the incipiency policy. We are writing this article in the hope that this will cause the enforcers and courts to implement significantly more stringent merger enforcement. To vastly oversimplify, the current enforcement approach assumes that if N significant competitors are necessary for competition, N-1 competitors could well be anticompetitive, but blocking a merger to the N+1 level would not confer any gains. Because many enforcers and judges erroneously assume that mergers among major competitors usually result in significant gains to efficiency and innovation, they believe that blocking mergers to the N+1 level would impose significant costs on the economy. Why should enforcement preserve apparent “redundancy”? First, the relationship between concentration and competition, and between concentration and innovation, is uncertain. Underestimating the minimum number of firms needed for competition and for innovation is likely to result in harm to consumer welfare. Second, one or more of the N firms frequently can wither or implode as a result of normal competition, or from an unexpected shock to the market, often surprisingly quickly, leaving only N-1 or N-2 remaining significant competitors. Finally, when enforcers challenge a merger that would have resulted in N competitors, they often allow the merger subject to complex remedies. But if the remedy fails, as remedies often do, the market will have too few competitors by the enforcers’ own estimate. Taken together, these scenarios often leave markets with too few firms.
The attenuation of the incipiency doctrine has allowed many mergers that have resulted in higher prices and lower levels of innovation, as recent empirical work evaluating the consequences of major mergers has shown. Moreover, other empirical work shows that significant mergers rarely produce significant efficiency gains and often result in losses to innovation. A revitalized incipiency doctrine would retain the “protective redundancy” that preserves competition, while sacrificing little or nothing in terms of efficiency or innovation. The enforcers and the courts should implement this policy much more aggressively.
Evaluation of Item Response Theory Models for Nonignorable Omissions
When competence tests are administered, subjects frequently omit items. These missing responses pose a threat to correctly estimating the proficiency level. Newer model-based approaches aim to take nonignorable missing data processes into account by incorporating a latent missing propensity into the measurement model. Two assumptions are typically made when using these models: (1) the missing propensity is unidimensional and (2) the missing propensity and the ability are bivariate normally distributed. These assumptions may, however, be violated in real data sets and could thus pose a threat to the validity of this approach. The present study focuses on modeling competencies in various domains, using data from a school sample (N = 15,396) and an adult sample (N = 7,256) from the National Educational Panel Study. Our interest was to investigate whether violations of unidimensionality and the normal distribution assumption severely affect the performance of the model-based approach in terms of differences in ability estimates. We propose a model with a competence dimension, a unidimensional missing propensity, and a distributional assumption more flexible than a multivariate normal. Using this model for ability estimation results in different ability estimates compared with a model ignoring missing responses. Implications for ability estimation in large-scale assessments are discussed.
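The core of the model-based approach described above is a joint measurement model: a latent missing propensity drives whether an item is omitted, and, if it is answered, a latent ability drives whether the answer is correct. The following is a minimal sketch of one item's likelihood contribution under such a model; the parameter names (`xi` for the missing propensity, `gamma` for the item's "omission difficulty") are illustrative assumptions, and a real implementation would additionally integrate over an assumed joint distribution of ability and missing propensity, which is exactly the distributional assumption the study relaxes.

```python
import math

def logistic(x):
    """Standard logistic function 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def joint_item_likelihood(y, d, theta, xi, b, gamma):
    """Likelihood contribution of one item under a sketched joint model.

    d     -- 1 if the item was omitted, 0 if a response was observed
    y     -- the observed response (1 correct, 0 incorrect) when d == 0
    theta -- latent ability; xi -- latent missing propensity
    b     -- item difficulty; gamma -- item 'omission difficulty'
    """
    p_omit = logistic(xi - gamma)       # missingness part of the model
    if d == 1:
        return p_omit
    p_correct = logistic(theta - b)     # Rasch part for the response
    return (1.0 - p_omit) * (p_correct if y == 1 else 1.0 - p_correct)
```

Because `p_omit` depends on the latent propensity `xi` rather than on observed covariates alone, the omission process is nonignorable: simply scoring omissions as wrong, or dropping them, would bias the ability estimate, which is why the two models compared in the abstract yield different estimates.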
Identifying Disengaged Responding in Multiple-Choice Items: Extending a Latent Class Item Response Model With Novel Process Data Indicators
Disengaged responding poses a severe threat to the validity of educational large-scale assessments, because item responses from unmotivated test-takers do not reflect their actual ability. Existing identification approaches rely primarily on item response times, which bears the risk of misclassifying fast engaged or slow disengaged responses. Process data, with its rich pool of additional information on the test-taking process, could thus be used to improve existing identification approaches. In this study, three process data variables—text reread, item revisit, and answer change—were introduced as potential indicators of response engagement for multiple-choice items in a reading comprehension test. An extended latent class item response model for disengaged responding was developed by including the three new indicators as additional predictors of response engagement. In a sample of 1,932 German university students, the extended model indicated a better model fit than the baseline model, which included item response time as the only indicator of response engagement. In the extended model, both item response time and text reread were significant predictors of response engagement. However, graphical analyses revealed no systematic differences in the item and person parameter estimation or item response classification between the models. These results suggest only a marginal improvement of the identification of disengaged responding by the new indicators. Implications of these results for future research on disengaged responding with process data are discussed.
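The extension described above amounts to predicting each response's engagement class from item response time plus the three process indicators rather than from response time alone. The sketch below shows that predictor structure as a simple logistic function; the weights are hypothetical illustrations, not the study's estimates, and the full model would embed this predictor in a latent class item response model rather than use it directly.

```python
import math

def engagement_prob(log_rt, reread, revisit, answer_change, w):
    """Probability that a response is engaged, predicted from the log item
    response time and three binary process-data indicators.
    w = (intercept, w_log_rt, w_reread, w_revisit, w_answer_change)."""
    z = (w[0] + w[1] * log_rt + w[2] * reread
         + w[3] * revisit + w[4] * answer_change)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical weights: longer response times and rereading the text
# point toward engaged responding; revisits and answer changes add little.
w = (-2.0, 0.8, 1.5, 0.3, 0.2)

# A fast response with no process activity vs. a slow response where the
# test-taker reread the text and revisited the item:
fast_no_process = engagement_prob(math.log(2.0), 0, 0, 0, w)
slow_reread = engagement_prob(math.log(30.0), 1, 1, 0, w)
```

This also illustrates the misclassification risk the abstract mentions: with response time as the only predictor, a fast but engaged response is pushed toward the disengaged class, whereas the reread indicator can pull it back.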