WEAK MEASUREMENT THEORY AND MODIFIED COGNITIVE COMPLEXITY MEASURE
Measurement is a persistent problem in software engineering. Since traditional measurement theory has a major difficulty in defining empirical observations on software entities in terms of their measured quantities, Morasca proposed Weak Measurement Theory to address it. In this paper, we evaluate the applicability of weak measurement theory by applying it to a newly proposed Modified Cognitive Complexity Measure (MCCM). We also investigate the applicability of the Weak Extensive Structure for deciding the type of scale for MCCM. We observe that MCCM is on a weak ratio scale
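Since the abstract does not spell out the MCCM formula, the following sketch only illustrates the generic cognitive-weight idea that cognitive complexity measures build on; the weight table and the helper function are our own assumptions, not the paper's definition.

```python
# Illustrative sketch only: MCCM's exact definition is not given in the
# abstract. We assume a Wang-style table of cognitive weights for basic
# control structures and sum them over a routine's structures.
COGNITIVE_WEIGHTS = {"sequence": 1, "branch": 2, "iteration": 3, "call": 2}

def cognitive_weight_sum(structures):
    """Sum the assumed cognitive weights of a flat list of control structures."""
    return sum(COGNITIVE_WEIGHTS[s] for s in structures)

# A routine with two sequential statements, one branch, and one loop:
print(cognitive_weight_sum(["sequence", "branch", "iteration", "sequence"]))  # → 7
```

A measure of this additive form is what the weak extensive structure mentioned in the abstract would be used to classify on a (weak) scale type.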
Software development: A paradigm for the future
A new paradigm that treats software development as an experimental activity is presented. It provides built-in mechanisms for learning how to develop software better and for reusing previous experience in the form of knowledge, processes, and products. It uses models and measures to aid in the tasks of characterization, evaluation, and motivation. An organization scheme is proposed for separating the project-specific focus from the organization's learning and reuse focuses of software development. The implications of this approach for corporations, research, and education are discussed, and some research activities currently underway at the University of Maryland that support the approach are presented
Implicitization of curves and (hyper)surfaces using predicted support
We reduce implicitization of rational planar parametric curves and (hyper)surfaces to linear algebra by interpolating the coefficients of the implicit equation. For predicting the implicit support, we focus on methods that exploit input and output structure in the sense of sparse (or toric) elimination theory, namely by computing the Newton polytope of the implicit polynomial via sparse resultant theory. Our algorithm works even in the presence of base points, but in this case the implicit equation is obtained as a factor of the produced polynomial. We implement our methods in Maple, and some in Matlab as well, and study their numerical stability and efficiency on several classes of curves and surfaces. We apply our approach to approximate implicitization and quantify the accuracy of the approximate output, which turns out to be satisfactory on all tested examples; we also relate our measures to the Hausdorff distance. In building a square or rectangular matrix, an important issue is (over)sampling the given curve or surface: we conclude that unitary complexes offer the best tradeoff between speed and accuracy when numerical methods, namely SVD, are employed, whereas for exact kernel computation random integers are the method of choice. We compare our prototype to existing software and find it rather competitive
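The interpolation approach described above can be sketched on a toy case. Everything below is an illustrative assumption, not the paper's implementation: we implicitize the standard rational parametrization of the unit circle, assume the predicted implicit support is all monomials of total degree at most 2, and recover the implicit coefficients from the numerical kernel of the interpolation matrix via SVD, the numerical method the abstract mentions.

```python
import numpy as np

# Assumed implicit support: all monomials x^i * y^j with i + j <= 2.
monomials = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]

# Rational parametrization of the unit circle (no base points).
ts = np.linspace(-1.0, 1.0, 50)            # (over)sample the curve
xs = (1 - ts**2) / (1 + ts**2)
ys = 2 * ts / (1 + ts**2)

# Each sampled point yields one linear condition on the implicit coefficients.
M = np.column_stack([xs**i * ys**j for (i, j) in monomials])

# The implicit coefficient vector spans the numerical kernel of M,
# extracted here as the right singular vector of the smallest singular value.
_, _, Vt = np.linalg.svd(M)
coeffs = Vt[-1]
coeffs /= coeffs[np.argmax(np.abs(coeffs))]  # normalize largest coefficient to 1

# Keep only numerically nonzero terms of the implicit equation.
eqn = {m: c for m, c in zip(monomials, coeffs) if abs(c) > 1e-8}
print(eqn)  # up to sign and scaling: x^2 + y^2 - 1 = 0
```

With exact arithmetic over random integer samples, the same kernel could instead be computed exactly, matching the speed/accuracy tradeoff the abstract reports.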
Measuring the Pro-Activity of Software Agents
Despite having well-defined characteristics, software agents lack a developed set of measures defining their quality. Attempts at evaluating software agent quality have focused on some agent aspects, such as the development process, while others, focusing on the agent as a software product, have largely adopted measures associated with other software paradigms, such as procedural and object-oriented concepts. Here we propose a set of measures for evaluating software agent pro-activity: the software agent's goal-driven behavioral ability to take the initiative and satisfy its goal
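The abstract proposes pro-activity measures without giving formulas, so the following is purely a hypothetical stand-in of our own: a ratio of an agent's self-initiated (goal-driven) actions to all of its actions.

```python
# Hypothetical illustration, not the paper's measure: pro-activity as the
# fraction of an agent's actions that it initiated toward its goal, as
# opposed to actions merely reacting to the environment.
def proactivity_ratio(initiated: int, reactive: int) -> float:
    """Return the share of self-initiated actions among all actions."""
    total = initiated + reactive
    return initiated / total if total else 0.0

# An agent that initiated 6 of its 10 observed actions:
print(proactivity_ratio(6, 4))  # → 0.6
```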
Technical alignment
This essay discusses the importance of the areas of infrastructure and testing to help digital preservation services demonstrate reliability, transparency, and accountability. It encourages practitioners to build a strong culture in which transparency and collaborations between technical frameworks are valued highly. It also argues for devising and applying agreed-upon metrics that will enable the systematic analysis of preservation infrastructure. The essay begins by defining technical infrastructure and testing in the digital preservation context, provides case studies that exemplify both progress and challenges for technical alignment in both areas, and concludes with suggestions for achieving greater degrees of technical alignment going forward
Uncertainty management in real estate development: studying the potential of the SCRUM design methodology
Real estate development is largely about assessing and controlling risks and uncertainties. Risk management implies making decisions based on quantified risks in order to execute risk-response measures. Uncertainties, on the other hand, cannot be quantified and are therefore unpredictable. In the literature, much attention is paid to risk management, while the management of uncertainties is underexposed. Uncertainties appear in the programming and design phases of projects. The main goal of our research is to develop guidelines that help real estate developers manage uncertainties in those phases
Software Measurement Activities in Small and Medium Enterprises: an Empirical Assessment
An empirical study evaluating the implementation of measurement/metric programs in software companies in one region of Turkey is presented. The research questions are discussed and validated with the help of senior software managers (more than 15 years' experience) and then used for interviewing a variety of small and medium-scale software companies in Ankara. Observations show a common reluctance and lack of interest in utilizing measurements/metrics, despite the fact that they are well known in the industry. A side product of this research is that internationally recognized standards such as ISO and CMMI are pursued only if they are part of project/job requirements; without such requirements, introducing these standards remains a long-term goal for increasing quality