A review of quality frameworks in information systems
Quality is a multidimensional concept that has different meanings in different contexts and perspectives. In the information systems (IS) domain, quality is often understood as the result of an IS development process and as the quality of an IS product. Many models and frameworks have been proposed for evaluating IS quality, but there is as yet no commonly accepted framework or standard. Typically, researchers propose a set of characteristics, so-called quality factors, that contribute to IS quality. Different stakeholders' perspectives result in multiple definitions of IS quality factors. For instance, some approaches base the selection of quality factors on the IS delivery process, while others do not clearly explain the rationale for their selection. Moreover, relations or impacts among the selected quality factors are often not taken into account. Quality aspects of information are frequently considered in isolation from IS quality, and the impact of IS quality on information quality is neglected in most approaches. Our research aims to integrate these levels: we propose an IS quality framework based on IS architecture. Considering both user and developer perspectives, different quality factors are identified at various abstraction levels. In addition, making the impacts among quality factors explicit helps to trace the root cause of IS defects. Thus, our framework provides a systematic view of information quality and IS quality.
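The framework itself is not reproduced here, but the following minimal sketch illustrates one way the idea could be modeled: quality factors carry an abstraction level and a stakeholder perspective, impact relations link them, and tracing those relations backwards recovers candidate root causes of a defect. All factor names, levels, and relations below are illustrative assumptions, not the paper's actual framework.

from dataclasses import dataclass, field

@dataclass
class QualityFactor:
    name: str            # e.g. "data accuracy", "maintainability" (illustrative)
    level: str           # assumed abstraction level, e.g. "information", "application", "infrastructure"
    perspective: str     # assumed stakeholder view, e.g. "user" or "developer"
    impacted_by: list = field(default_factory=list)  # factors whose defects propagate to this one

def trace_root_causes(factor, visited=None):
    """Walk the impact relations backwards to collect possible root causes."""
    visited = visited or set()
    roots = []
    for cause in factor.impacted_by:
        if cause.name in visited:
            continue
        visited.add(cause.name)
        deeper = trace_root_causes(cause, visited)
        roots.extend(deeper if deeper else [cause])
    return roots

# Example: poor information accuracy traced back to an infrastructure-level factor.
storage_reliability = QualityFactor("storage reliability", "infrastructure", "developer")
data_consistency = QualityFactor("data consistency", "application", "developer",
                                 impacted_by=[storage_reliability])
info_accuracy = QualityFactor("information accuracy", "information", "user",
                              impacted_by=[data_consistency])

print([f.name for f in trace_root_causes(info_accuracy)])  # ['storage reliability']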
Discovering Dynamic Integrity Rules with a Rules-Based Tool for Data Quality Analyzing
Rule-based approaches to data quality often use business rules or integrity rules for data monitoring purposes. Integrity rules are constraints on data derived from business rules and expressed in a formal form that allows computerization. One of the challenges of these approaches is rule discovery, which is usually performed manually by business experts or system analysts based on experience. In this paper, we present our rule-based approach to data quality analysis, in which we discuss a comprehensive method for discovering dynamic integrity rules.
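The paper does not give its rule format here, but the sketch below shows, under assumed record and rule definitions, what computerized dynamic integrity rules can look like: constraints on the transition between an old and a new state of the same record, checked automatically on every update. The workflow and the rules themselves are purely illustrative.

# Dynamic integrity rules constrain transitions between an old and a new
# state of a record; the rule set below is illustrative only.

ALLOWED_STATUS_TRANSITIONS = {
    "pending": {"approved", "cancelled"},
    "approved": {"shipped", "cancelled"},
    "shipped": {"delivered"},
}

def status_transition_rule(old, new):
    """Dynamic rule: the order status may only follow the allowed workflow."""
    return new["status"] in ALLOWED_STATUS_TRANSITIONS.get(old["status"], set())

def non_decreasing_total_rule(old, new):
    """Dynamic rule: the invoiced total must not decrease once approved."""
    return old["status"] != "approved" or new["total"] >= old["total"]

DYNAMIC_RULES = [status_transition_rule, non_decreasing_total_rule]

def check_update(old, new):
    """Return the names of all dynamic integrity rules violated by an update."""
    return [rule.__name__ for rule in DYNAMIC_RULES if not rule(old, new)]

old_row = {"id": 1, "status": "approved", "total": 120.0}
new_row = {"id": 1, "status": "pending", "total": 90.0}
print(check_update(old_row, new_row))
# ['status_transition_rule', 'non_decreasing_total_rule']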
Research of multi-response optimization of milling process of hardened S50C steel using minimum quantity lubrication of Vietnamese peanut oil
This study builds a regression model for the milling of S50C steel under Minimum Quantity Lubrication (MQL) with Vietnamese peanut oil, based on Response Surface Methodology. The paper analyses and evaluates the effect of cutting parameters, flow rates, and pressures of the MQL system on cutting force and surface roughness when milling S50C carbon steel after heat treatment (reaching a hardness of 52 HRC). The Taguchi method, a widely used design-of-experiments approach, is applied in this study. The statistical analysis software Minitab 19 is used to build regression models relating the cutting parameters and the flow rate and pressure of the MQL system to the cutting force and surface roughness of the part when machining on a 5-axis CNC milling machine. The models are then used to analyze and predict the effect of the cutting parameters and MQL conditions on surface roughness and cutting force during machining, and to determine their levels of influence. The regression models of Ra and F were obtained using the optimizer tool in Minitab 19, and the multi-response optimization problem was solved. The optimum cutting parameters and lubricating conditions are: cutting velocity Vc = 190.909 m/min, feed rate fz = 0.02 mm/tooth, axial depth of cut ap = 0.1 mm, nozzle pressure P = 5.596 MPa, and flow rate Q = 108.887 ml/h. The corresponding responses are Ra = 0.0586 and F = 162.035 N. This result not only provides a foundation for future research but also contributes reference data for the machining process.
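The fitted coefficients are not given in the abstract, so the sketch below only illustrates the composite-desirability idea commonly used for this kind of multi-response optimization (both Ra and F are "smaller is better"): hypothetical linear response models are combined into a single desirability score and optimized over the process bounds. The model coefficients, bounds, and starting point are placeholders, not the paper's results.

import numpy as np
from scipy.optimize import minimize

def ra_model(x):
    # x = [Vc (m/min), fz (mm/tooth), ap (mm), P (MPa), Q (ml/h)] -- hypothetical model
    vc, fz, ap, p, q = x
    return 0.02 + 0.8 * fz + 0.1 * ap - 0.0005 * p - 0.00005 * q

def f_model(x):
    # hypothetical cutting-force model, N
    vc, fz, ap, p, q = x
    return 100 + 900 * fz + 300 * ap - 0.05 * vc - 0.1 * p

def desirability(y, y_min, y_max):
    """Smaller-is-better desirability, linearly scaled to [0, 1]."""
    return float(np.clip((y_max - y) / (y_max - y_min), 0.0, 1.0))

def negative_composite_desirability(x):
    d_ra = desirability(ra_model(x), 0.05, 0.5)
    d_f = desirability(f_model(x), 150.0, 400.0)
    return -np.sqrt(d_ra * d_f)   # geometric mean, negated for minimization

bounds = [(150, 200), (0.02, 0.08), (0.1, 0.3), (4, 6), (50, 150)]  # assumed process window
x0 = [175, 0.05, 0.2, 5, 100]
result = minimize(negative_composite_desirability, x0, bounds=bounds)
print(result.x, -result.fun)   # optimum settings and composite desirability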
A Knowledge-Based Model For Context-Aware Smart Service Systems
The advancement of the Internet of Things, big data, and mobile computing leads to a need for smart services that are context aware and can adapt to their changing contexts. Today, designing a smart service system is a complex task due to the lack of adequate modeling support for awareness in pervasive environments. In this paper, we present the concept of a context-aware smart service system and propose a knowledge model for such systems. The proposed model organizes domain and context-aware knowledge into knowledge components based on three service levels: service, service system, and network of service systems. The knowledge model integrates all the information and knowledge related to smart services, knowledge components, and context awareness, and can play a key role in any framework, infrastructure, or application deploying smart services. To demonstrate the approach, two case studies of chatbots as context-aware smart services for customer support are presented.
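The paper's model is not detailed here; the following sketch only illustrates, with purely assumed names, how knowledge components could be tagged with the three service levels mentioned above and selected against the current context of a customer-support chatbot. It is not the authors' knowledge model.

from dataclasses import dataclass
from enum import Enum

class ServiceLevel(Enum):
    SERVICE = "service"
    SERVICE_SYSTEM = "service system"
    NETWORK = "network of service systems"

@dataclass
class KnowledgeComponent:
    name: str
    level: ServiceLevel
    applies_to: dict        # context attributes under which the component is relevant

def select_components(components, context):
    """Return the components whose context conditions are all satisfied."""
    return [c for c in components
            if all(context.get(k) == v for k, v in c.applies_to.items())]

# Hypothetical knowledge base for a customer-support chatbot.
kb = [
    KnowledgeComponent("greeting dialogue", ServiceLevel.SERVICE, {"channel": "chat"}),
    KnowledgeComponent("order tracking workflow", ServiceLevel.SERVICE_SYSTEM, {"intent": "order_status"}),
    KnowledgeComponent("partner logistics lookup", ServiceLevel.NETWORK,
                       {"intent": "order_status", "carrier_known": False}),
]

context = {"channel": "chat", "intent": "order_status", "carrier_known": False}
print([c.name for c in select_components(kb, context)])
# ['greeting dialogue', 'order tracking workflow', 'partner logistics lookup']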