4,209 research outputs found
How to select measures for decision support systems - An optimization approach integrating informational and economic objectives
Determining relevant content, and in particular (performance) measures, is still an open issue in designing and adapting (data-driven) decision support systems and data warehouses. Classic approaches to information requirements determination, such as Rockart's critical success factors method, help with structuring decision makers' information requirements and identifying thematically appropriate measures. In many cases, however, it remains unclear which and how many measures should eventually be used. Therefore, an optimization model is presented that integrates informational and economic objectives. The model incorporates (statistical) interdependencies among measures, i.e. the information they provide about one another, decision makers' and reporting tools' ability to cope with information complexity, as well as negative economic effects due to measure selection and usage. We show that the selection policies of all-or-none or the-more-the-better are in general not reasonable, although they are often followed in business practice. Finally, the model's application is illustrated using the German business-to-business sales organization of a global electronics and electrical engineering company as an example.
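A minimal sketch of the selection logic the abstract describes, using an invented candidate set, invented information values and costs, and a greedy heuristic standing in for the paper's integrated optimization model:

```python
# Hypothetical sketch: greedy measure selection trading off information
# gain against cost under a complexity cap. All names and numbers are
# illustrative; the paper's actual model is an integrated optimization,
# not this greedy heuristic.

# candidate measures: (name, standalone information value, usage cost)
candidates = {
    "revenue":        (0.90, 0.20),
    "order_intake":   (0.85, 0.25),
    "win_rate":       (0.60, 0.15),
    "pipeline_value": (0.55, 0.30),
}
# pairwise redundancy: how much of one measure's information is already
# conveyed by another (a stand-in for the statistical interdependencies)
redundancy = {("revenue", "order_intake"): 0.7,
              ("win_rate", "pipeline_value"): 0.4}

def marginal_value(m, selected):
    info, _ = candidates[m]
    overlap = max((redundancy.get((m, s)) or redundancy.get((s, m)) or 0.0)
                  for s in selected) if selected else 0.0
    return info * (1.0 - overlap)

MAX_MEASURES = 2  # decision makers' capacity to cope with complexity
selected = []
while len(selected) < MAX_MEASURES:
    best = max((m for m in candidates if m not in selected),
               key=lambda m: marginal_value(m, selected) - candidates[m][1])
    if marginal_value(best, selected) - candidates[best][1] <= 0:
        break  # no candidate adds net value: stop before "the more the better"
    selected.append(best)

print(selected)  # e.g. ['revenue', 'win_rate'] -- neither all nor none
```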
In this position paper we discuss the importance of Green IT as a new research field that investigates the environmental and energy issues related to IT and information systems in general. In particular, we focus on the energy consumption of software applications, which is amplified by all the IT layers in a data center and thus deserves greater attention. Adopting a top-down approach, we address the problem from a logical perspective and try to identify the root cause of energy consumption, i.e. the processing of information. We propose a research roadmap to identify a set of software complexity and quality metrics that can be used to estimate energy consumption and to compare specific software applications.
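To make the roadmap concrete, here is a toy sketch of estimating energy consumption from complexity metrics; the metric set, the data, and the linear model form are all assumptions, not the paper's results:

```python
# Illustrative sketch only: fitting a linear model that estimates energy
# consumption from software complexity metrics, in the spirit of the
# proposed roadmap. Metrics, coefficients, and data are invented.
import numpy as np

# rows: applications; columns: [kLOC, cyclomatic complexity,
# DB queries per request] -- a hypothetical metric set
X = np.array([[12.0,  45,  3],
              [30.0, 120,  8],
              [ 7.5,  20,  1],
              [55.0, 310, 15]])
y = np.array([4.1, 11.8, 2.0, 28.5])  # energy per workload (Wh), invented

# least-squares fit with an intercept term
A = np.hstack([X, np.ones((X.shape[0], 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def estimate_energy(kloc, complexity, queries):
    return float(np.dot(coef, [kloc, complexity, queries, 1.0]))

print(estimate_energy(20.0, 80, 5))  # rough estimate for a new application
```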
The economics of community open source software projects: an empirical analysis of maintenance effort
Previous contributions in the empirical software engineering literature have consistently observed a quality degradation of proprietary code as a consequence of maintenance. This degradation, referred to as the entropy effect, has been recognized to be responsible for significant increases in maintenance effort. In the Open Source context, the quality of code is a fundamental design principle. As a consequence, the maintenance effort of Open Source applications may not show a similar increasing trend over time. The goal of this paper is to empirically verify the entropy effect for a sample of 4,289 community Open Source application versions. Analyses are based on the comparison with an estimate of effort obtained with a traditional effort estimation model. Findings indicate that community Open Source applications show a slower growth of maintenance effort over time and, therefore, are less subject to the entropy effect.
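A minimal sketch of the comparison logic, under invented data: the growth rate of observed maintenance effort across versions is set against a baseline estimate that grows the way an entropy effect would predict (the paper's actual estimation model is not reproduced here):

```python
# Invented data; a stand-in for comparing actual effort with a
# traditional (e.g. COCOMO-style) effort estimate across versions.
import numpy as np

versions = np.arange(1, 9)
actual_effort = np.array([100, 104, 110, 113, 119, 122, 130, 133])  # person-hours
baseline_est  = np.array([100, 108, 117, 126, 136, 147, 159, 172])  # entropy-driven

def growth_rate(effort):
    # slope of a least-squares line through effort vs. version index
    slope, _ = np.polyfit(versions, effort, 1)
    return slope

print(growth_rate(actual_effort), growth_rate(baseline_est))
# a lower slope for actual effort is consistent with a weaker entropy effect
```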
The Impact of Social Networking on Software Design Quality and Development Effort in Open Source Projects
This paper focuses on Open Source (OS) social networks. The literature indicates that OS networks have a few nodes, called hubs, with a number of relationships significantly higher than the network's average. It also provides numerous metrics, called centrality metrics, that help verify whether a node is a hub. This paper first posits that higher values of centrality metrics are positively correlated with project success. Second, it posits that higher values of centrality metrics are positively correlated with the ability of a project to attract new contributions. Third, it posits that projects with greater success have a lower software design quality. Hypotheses are tested on a sample of 56 applications written in Java from the SourceForge.net online OS repository. The corresponding social network is built by considering all the contributors, both developers and administrators, of our application sample and all contributors directly or indirectly connected with them within SourceForge.net, for a total of 57,142 nodes. Empirical results support our hypotheses, indicating that centrality metrics are significant drivers of project success that should be monitored from the perspective of a project administrator or team manager. However, they also show that successful projects tend to have a significantly lower software design quality. This has a number of consequences that could be visible to users and cause negative feedback effects over time.
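As a minimal sketch of how such centrality metrics are computed, assuming the networkx library and an invented contributor graph (the paper's 57,142-node SourceForge network is not reproduced here):

```python
# Degree centrality over a small hypothetical contributor graph; hubs are
# nodes whose centrality is far above the network's average.
import networkx as nx

G = nx.Graph()
# edges: contributors who worked on a project together (invented data)
G.add_edges_from([
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("alice", "erin"), ("bob", "carol"), ("frank", "erin"),
])

centrality = nx.degree_centrality(G)  # degree / (n - 1), in [0, 1]
average = sum(centrality.values()) / len(centrality)
hubs = [n for n, c in centrality.items() if c > 1.5 * average]

print(sorted(centrality.items(), key=lambda kv: -kv[1]))
print("hubs:", hubs)  # e.g. ['alice']
```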
Machine tools thermostabilization using passive control strategies
The aim of this study is to investigate passive control strategies that use Phase Change Materials for the thermostabilization of Machine Tools (MTs). Considering the main issues related to thermal stability, the authors present the application of novel multifunctional materials to Machine Tool structures. A set of advanced materials is considered: aluminium foams, corrugated-core sandwich panels and polymeric concrete beds. The adopted solutions have been infiltrated with phase change materials (PCMs) in order to maintain the thermal stability of MTs when the environmental temperature is perturbed. The paper shows the results of simulation and experimental tests.
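A toy lumped-parameter illustration of the buffering principle, not the paper's model: a structure with embedded PCM absorbs part of an ambient temperature step as latent heat, flattening its own temperature response. All parameters below are invented.

```python
def simulate(latent_heat_j, steps=2000, dt=1.0):
    T, T_amb = 20.0, 25.0          # structure and perturbed ambient (degC)
    C, hA = 5e4, 10.0              # heat capacity (J/K), conductance (W/K)
    melt_T, stored = 21.0, 0.0     # PCM melting point, absorbed latent heat
    for _ in range(steps):
        q = hA * (T_amb - T) * dt  # heat flowing into the structure
        if abs(T - melt_T) < 0.05 and stored < latent_heat_j and q > 0:
            stored += q            # PCM melts: heat buffered, T stays flat
        else:
            T += q / C
    return T

print(simulate(0.0), simulate(2e5))  # without vs. with PCM buffering
```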
Efficiency implications of open source commonality and reuse
This paper analyzes the reuse choices made by open source developers and relates them to cost efficiency. We make a distinction between the commonality among applications and the actual reuse of code. The former represents the similarity between the requirements of different applications and, consequently, the functionalities that they provide. The latter represents the code that is actually shared among them. No application can be maintained forever. A fundamental reason for the periodical replacement of code is the exponential growth of costs with the number of maintenance interventions. Intuitively, this is due to the increasing complexity of software, which grows in both size and coupling among different modules. The paper measures commonality, reuse and development costs of 26 open-source projects, for a total of 171 application versions. Results show that reuse choices in open-source contexts are not cost efficient. Developers tend to reuse code from the most recent version of applications, even if their requirements are closer to previous versions. Furthermore, the latest version of an application is always the one that has incurred the highest number of maintenance interventions. Accordingly, the development cost per new line of code is found to grow with reuse.
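A toy illustration of the cost argument, not the paper's estimation model: if cost grows exponentially with accumulated maintenance interventions, reusing the most heavily maintained (latest) version is costlier per new line of code than reusing an older, less-intervened one. The base cost, growth rate, and intervention counts are invented.

```python
import math

def cost_per_new_loc(base_cost, growth_rate, interventions):
    # exponential growth of cost with accumulated maintenance interventions
    return base_cost * math.exp(growth_rate * interventions)

old_version_cost    = cost_per_new_loc(1.0, 0.08, interventions=5)
latest_version_cost = cost_per_new_loc(1.0, 0.08, interventions=25)
print(old_version_cost, latest_version_cost)  # ~1.49 vs ~7.39
```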
Automating the simulation of SME processes through a discrete event parametric model
At the factory level, the manufacturing system can be described as a group of processes governed by complex weaves of engineering strategies and technologies. Decision-making processes involve a great deal of information, driven by managerial strategies, technological implications and layout constraints. Many factors affect decisions, and their combination must be carefully managed to determine the best solutions for optimizing performance. Advanced simulation tools could support the decision processes of many SMEs, but the accessibility of these tools is limited by knowledge, cost, data availability and development time, and they should be used to support strategic decisions rather than specific situations. In this paper, a novel approach is proposed that aims to facilitate the simulation of manufacturing processes through fast modelling and evaluation. The idea is to realize a model that can automatically adapt to the user's specific needs. The model must be characterized by a high degree of flexibility, configurability and adaptability in order to automatically simulate multiple, heterogeneous industrial scenarios. In this way, even an SME can easily access a complex tool, perform thorough analyses and be supported in making strategic decisions. The parametric DES model is part of a larger software platform developed during the EU-funded COPERNICO project.
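As an illustration of the kind of parametric discrete-event model described, a minimal sketch using the SimPy library, with station count, processing times and arrival rate exposed as user parameters (the COPERNICO platform's model itself is not reproduced):

```python
import random
import simpy

def job(env, name, stations, proc_times):
    # a job visits each station in sequence, queuing when it is busy
    for station, t in zip(stations, proc_times):
        with station.request() as req:
            yield req
            yield env.timeout(random.expovariate(1.0 / t))
    print(f"{name} done at t={env.now:.1f}")

def source(env, stations, proc_times, interarrival):
    i = 0
    while True:
        env.process(job(env, f"job{i}", stations, proc_times))
        i += 1
        yield env.timeout(random.expovariate(1.0 / interarrival))

# parameters an SME user might configure: 3 stations, mean process
# times per station, mean job interarrival time (all illustrative)
env = simpy.Environment()
stations = [simpy.Resource(env, capacity=1) for _ in range(3)]
env.process(source(env, stations, proc_times=[4.0, 6.0, 3.0], interarrival=8.0))
env.run(until=100)
```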
PKM mechatronic clamping adaptive device
This study proposes a novel adaptive fixturing device based on active clamping systems for smart micropositioning of thin-walled precision parts. The modular architecture and structural flexibility make the system suitable for various industrial applications. The proposed device is realized as a Parallel Kinematic Machine (PKM), suitably sensorized and controlled, able to perform automatic, error-free workpiece clamping procedures, drastically reducing the overall fixturing set-up time. The paper describes the kinematics and dynamics of this mechatronic system. A first campaign of experimental trials has been carried out on the prototype, with promising results.
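A hedged sketch of the kind of kinematics involved: inverse kinematics of a planar 3-RPR parallel mechanism, computing leg lengths for a desired platform pose. The actual device's geometry and degrees of freedom are not specified in the abstract; all coordinates below are illustrative.

```python
import math

base_anchors     = [(0.0, 0.0), (0.4, 0.0), (0.2, 0.35)]        # fixed joints (m)
platform_anchors = [(-0.05, -0.03), (0.05, -0.03), (0.0, 0.05)]  # platform frame

def leg_lengths(x, y, phi):
    """Leg lengths that place the platform at (x, y) with rotation phi."""
    c, s = math.cos(phi), math.sin(phi)
    lengths = []
    for (bx, by), (px, py) in zip(base_anchors, platform_anchors):
        # platform anchor expressed in the world frame
        wx, wy = x + c * px - s * py, y + s * px + c * py
        lengths.append(math.hypot(wx - bx, wy - by))
    return lengths

print(leg_lengths(0.2, 0.15, math.radians(2.0)))
```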
Securing PIN-based Authentication in Smartwatches With just Two Gestures
Smartwatches are becoming increasingly ubiquitous as they offer new capabilities to
develop sophisticated applications that make daily life easier and more convenient
for consumers. The services provided include applications for mobile payment, ticketing,
identification, access control, etc. While this makes modern smartwatches very
powerful devices, it also makes them very attractive targets for attackers. Indeed,
PINs and Pattern Lock have been widely used in smartwatches for user authentication.
However, such authentication methods are not robust against various forms of
cybersecurity attacks, such as side channel, phishing, smudge, shoulder surfing, and
video recording attacks. Moreover, the recent adoption of hardware-based solutions,
like the Trusted Execution Environment (TEE), can mitigate only partially such problems.
Thus, the userâs security and privacy are at risk without a strong authentication
scheme in place. In this work, we propose 2GesturePIN, a new authentication framework
that allows users to authenticate securely to their smartwatches and related
sensitive services through solely two gestures. 2GesturePIN leverages the rotating
bezel or crown, which are the most intuitive ways to interact with a smartwatch, as a
dedicated hardware. 2GesturePIN improves the resilience of the regular PIN authentication
method against state-of-the-art cybersecurity attacks while maintaining a
high level of usability
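A purely hypothetical sketch, since the abstract does not detail the protocol: verifying two bezel-rotation gestures by hashing the pair of stopping positions with a per-device salt, so the raw secret is never stored on the watch. Function names and the 60-position bezel are assumptions.

```python
import hashlib
import hmac
import os

def enroll(pos1, pos2, salt=None):
    # store only a salted digest of the two gesture stop positions
    salt = salt or os.urandom(16)
    digest = hashlib.sha256(salt + bytes([pos1 % 60, pos2 % 60])).hexdigest()
    return salt, digest

def verify(pos1, pos2, salt, digest):
    candidate = hashlib.sha256(salt + bytes([pos1 % 60, pos2 % 60])).hexdigest()
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = enroll(17, 42)   # two gesture stop positions (illustrative)
print(verify(17, 42, salt, digest), verify(17, 43, salt, digest))  # True False
```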
Differentiation between Fresh and Thawed Cephalopods Using NIR Spectroscopy and Multivariate Data Analysis
The sale of frozen-thawed fish and fish products labeled as fresh is currently one of the most common and insidious commercial food frauds. For this reason, the demand for reliable tools to identify storage conditions is increasing. The present study was performed on two species commonly sold in large-scale distribution: cuttlefish (Sepia officinalis) and musky octopus (Eledone spp.). Fifty fresh cephalopod specimens were analyzed at refrigeration temperature (2 ± 2°C), then frozen at -20°C for 10 days, and finally thawed and analyzed again. The performance of three near-infrared (NIR) instruments in identifying storage conditions was compared: the benchtop NIR Multi Purpose Analyzer (MPA) by Bruker, the portable MicroNIR by VIAVI and the handheld NIR SCiO by Consumer Physics. All collected spectra were processed and analyzed with chemometric methods. The SCiO data were also analyzed using the analytical tools available in the online application provided by the manufacturer to evaluate its performance. NIR spectroscopy, coupled with chemometrics, allowed discrimination between fresh and thawed samples with high accuracy: 82.3-94.1% for cuttlefish, 91.2-97.1% for musky octopus, and 86.8-95.6% for the global model. Results show how food frauds could be detected directly in the marketplace through small, ultra-fast and simplified handheld devices, whereas official control laboratories could use benchtop analytical instruments, coupled with chemometric approaches, to develop accurate and validated methods suitable for regulatory purposes.
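A minimal sketch of the chemometric step under stated assumptions: standard normal variate (SNV) preprocessing followed by PCA and linear discriminant analysis to discriminate fresh from thawed spectra. The paper's exact preprocessing and models are not reproduced; the spectra below are random placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 256))   # 100 spectra x 256 wavelengths (placeholder)
y = np.repeat([0, 1], 50)         # 0 = fresh, 1 = thawed

def snv(spectra):
    # per-spectrum standardization, a common scatter correction in NIR
    return ((spectra - spectra.mean(axis=1, keepdims=True))
            / spectra.std(axis=1, keepdims=True))

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(model, snv(X), y, cv=5)
print(scores.mean())  # ~0.5 on random data; the real study reports 86.8-95.6%
```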
- …