
    Modelling and validating an engineering application in kernel P systems

    © 2018, Springer International Publishing AG. This paper illustrates how kernel P systems (kP systems) can be used for modelling and validating an engineering application, in this case the cruise control system of an electric bike. The validity of the system is demonstrated via formal verification, carried out using the kPWorkbench tool. Furthermore, we show how the kernel P system model can be tested using automata- and X-machine-based techniques.
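To make the automata-based testing idea concrete, here is a minimal sketch. The states and inputs below are hypothetical and not taken from the paper's kP model: a tiny finite-state machine for a cruise controller, exercised with an input sequence whose final state can be checked against an expectation.

```python
# Hypothetical cruise-controller automaton (illustrative states/inputs,
# not the paper's kP system model).
CRUISE_FSM = {
    ("off", "engage"): "on",
    ("on",  "brake"):  "off",
    ("on",  "resume"): "on",
}

def run(fsm, state, inputs):
    """Drive the automaton; unknown inputs leave the state unchanged."""
    for sym in inputs:
        state = fsm.get((state, sym), state)
    return state

# Automata-based testing: run a chosen input sequence and compare the
# final state with the one the model predicts.
final = run(CRUISE_FSM, "off", ["engage", "resume", "brake"])
```

A test suite would generate many such sequences (e.g. covering every transition) and flag any run whose final state diverges from the model.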

    Methodology for testing and validating knowledge bases

    A test and validation toolset developed for artificial intelligence programs is described. The basic premises of this method are: (1) knowledge bases have a strongly declarative character and represent mostly structural information about different domains; (2) the conditions for integrity, consistency, and correctness can be transformed into structural properties of knowledge bases; and (3) structural information and structural properties can be uniformly represented by graphs and checked by graph algorithms. The interactive test and validation environment has been implemented on a SUN workstation.
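The graph-checking premise can be illustrated with a small sketch. This is an assumption-laden toy, not the paper's toolset: rule dependencies in a knowledge base are represented as a directed graph, and one structural property (absence of circular definitions) is checked with a standard cycle-detection algorithm.

```python
def has_cycle(graph):
    """Detect a cycle in a directed graph given as {node: [successors]}."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {n: WHITE for n in graph}

    def visit(n):
        color[n] = GREY
        for m in graph.get(n, []):
            if color.get(m, WHITE) == GREY:
                return True               # back edge: circular definition
            if color.get(m, WHITE) == WHITE and visit(m):
                return True
        color[n] = BLACK
        return False

    return any(color[n] == WHITE and visit(n) for n in list(graph))

# Hypothetical rule-dependency graphs: "a" is defined in terms of "b", etc.
consistent_kb = {"a": ["b"], "b": ["c"], "c": []}          # acyclic
circular_kb   = {"a": ["b"], "b": ["c"], "c": ["a"]}       # "c" refers back to "a"
```

Other structural properties mentioned in the abstract (integrity, reachability of every rule from the facts) reduce to equally standard graph checks.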

    Teaching old sensors new tricks: archetypes of intelligence

    In this paper a generic intelligent sensor software architecture is described which builds upon the basic requirements of related industry standards (IEEE 1451 and SEVA BS-7986). It incorporates specific functionalities such as real-time fault detection, drift compensation, adaptation to environmental changes and autonomous reconfiguration. The modular structure of the intelligent sensor architecture provides enhanced flexibility with regard to the choice of specific algorithmic realizations. In this context, the particular aspects of fault detection and drift estimation are discussed. A mixed indicative/corrective fault detection approach is proposed, and it is demonstrated that reversible/irreversible state-dependent drift can be estimated using generic algorithms such as the EKF or on-line density estimators. Finally, a parsimonious density estimator is presented and validated through simulated and real data for use in an operating-regime-dependent fault detection framework.
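The EKF-based drift estimation can be sketched in its simplest scalar form. All numbers below are illustrative assumptions, not parameters from the paper: a one-dimensional Kalman filter models sensor drift as a slow random walk and recovers it from noisy readings.

```python
def kalman_drift(measurements, q=1e-4, r=0.5):
    """Estimate a slowly varying offset from noisy scalar readings.

    q: process-noise variance (how fast the drift may change);
    r: measurement-noise variance.
    """
    x, p = 0.0, 1.0                  # initial drift estimate and variance
    estimates = []
    for z in measurements:
        p += q                       # predict: drift follows a random walk
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # correct with the measurement residual
        p *= (1 - k)
        estimates.append(x)
    return estimates

# Hypothetical sensor whose offset creeps from 0.0 toward 1.0 over
# 200 samples, plus small deterministic pseudo-noise.
readings = [i / 200 + 0.01 * ((i * 7919) % 13 - 6) for i in range(200)]
drift = kalman_drift(readings)
```

With a small `q` the filter smooths the noise heavily but follows the drift with some lag; tuning `q` against `r` trades tracking speed for noise rejection, which is exactly the knob a drift-compensation module exposes.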

    Automated analysis of feature models: Quo vadis?

    Feature models have been used since the 1990s to describe software product lines as a way of reusing common parts in a family of software systems. In 2010, a systematic literature review was published summarizing the advances and settling the basis of the area of Automated Analysis of Feature Models (AAFM). Since then, different studies have applied AAFM techniques in different domains. In this paper, we provide an overview of the evolution of this field since 2010 by performing a systematic mapping study considering 423 primary sources. We found six different variability facets where the AAFM is being applied that define the tendencies: product configuration and derivation; testing and evolution; reverse engineering; multi-model variability analysis; variability modelling; and variability-intensive systems. We also confirmed that there is a lack of industrial evidence in most of the cases. Finally, we present where and when the papers have been published, and which authors and institutions are contributing to the field. We observed that the maturity of the field is shown by the growth in the number of journal publications over the years, as well as by the diversity of conferences and workshops where papers are published. We also suggest synergies with other areas, such as cloud or mobile computing, that can motivate further research in the future.
    Funding: Ministerio de Economía y Competitividad TIN2015-70560-R; Junta de Andalucía TIC-186
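A core AAFM operation is counting the valid products of a feature model. The following toy sketch (a hypothetical three-feature model, not one from the mapping study) does this by brute-force enumeration; real analysers encode the same constraints for SAT, CSP, or BDD solvers.

```python
from itertools import product

FEATURES = ["ebike", "cruise", "gps"]

def valid(cfg):
    """Toy constraints: root 'ebike' is mandatory; 'gps' requires 'cruise'."""
    return cfg["ebike"] and (not cfg["gps"] or cfg["cruise"])

def count_products(features, is_valid):
    """Enumerate all feature combinations and count the valid ones."""
    combos = (dict(zip(features, bits))
              for bits in product([False, True], repeat=len(features)))
    return sum(1 for cfg in combos if is_valid(cfg))
```

Brute force is exponential in the number of features, which is precisely why the field moved to solver-backed analyses for industrial-size models.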

    Computing server power modeling in a data center: survey, taxonomy and performance evaluation

    Data centers are large-scale, energy-hungry infrastructures serving the increasing computational demands of a world that is becoming more connected through smart cities. The emergence of advanced technologies such as cloud-based services, the Internet of Things (IoT) and big data analytics has augmented the growth of global data centers, leading to high energy consumption. This upsurge in the energy consumption of data centers not only incurs surging costs (operational and maintenance) but also has an adverse effect on the environment. Dynamic power management in a data center environment requires cognizance of the correlation between system- and hardware-level performance counters and power consumption. Power consumption modeling captures this correlation and is crucial in designing energy-efficient optimization strategies based on resource utilization. Several power models have been proposed and used in the literature. However, these power models have been evaluated using different benchmarking applications, power measurement techniques and error calculation formulas on different machines. In this work, we present a taxonomy and evaluation of 24 software-based power models using a unified environment, benchmarking applications, power measurement technique and error formula, with the aim of achieving an objective comparison. We use different server architectures to assess the impact of heterogeneity on the models' comparison. The performance analysis of these models is elaborated in the paper.
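The simplest member of the software-based model family surveyed here is the linear utilization model P = P_idle + (P_max - P_idle) · u. The sketch below fits it by least squares and scores it with MAPE as a unified error formula; the sample numbers are illustrative, not measurements from the paper.

```python
def fit_linear(us, ps):
    """Least-squares fit of p = a + b*u; returns (a, b)."""
    n = len(us)
    mu, mp = sum(us) / n, sum(ps) / n
    b = sum((u - mu) * (p - mp) for u, p in zip(us, ps)) / \
        sum((u - mu) ** 2 for u in us)
    return mp - b * mu, b            # a ~ idle power, b ~ dynamic range

def mape(ps, preds):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs(p - q) / p for p, q in zip(ps, preds)) / len(ps)

# Hypothetical (utilization, watts) samples from a perfectly linear server.
util  = [0.0,   0.25,  0.5,   0.75,  1.0]
watts = [100.0, 150.0, 200.0, 250.0, 300.0]

a, b = fit_linear(util, watts)
err = mape(watts, [a + b * u for u in util])
```

Holding the benchmark, the power meter, and the error formula fixed across all 24 models, as the paper does, is what makes numbers like `err` comparable between models in the first place.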

    Potential of support-vector regression for forecasting stream flow

    Get PDF
    Stream flow is an important input for hydrology studies because it determines the water variability and magnitude of a river. Water resources engineering always deals with historical data and tries to estimate forecasting records in order to give a better prediction for any water resources application, such as designing the water potential of hydroelectric dams, estimating low flow, and maintaining the water supply. This paper presents three soft-computing approaches for dealing with these issues, i.e. artificial neural networks (ANNs), adaptive neuro-fuzzy inference systems (ANFISs), and support vector machines (SVMs). The Telom River, located in the Cameron Highlands district of Pahang, Malaysia, was used in making the estimation. The Telom River's daily mean discharge records, together with rainfall and river-level data, were used for the period of March 1984 – January 2013 for training, testing, and validating the selected models. The SVM approach provided better results than ANFIS and ANNs in estimating the daily mean fluctuation of the stream's flow.
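What distinguishes support-vector regression from ordinary least squares is its epsilon-insensitive loss: errors inside an epsilon-wide tube cost nothing, so the fit is shaped only by the points outside it (the support vectors). A minimal sketch with illustrative numbers, not the paper's model or data:

```python
def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """Sum of errors exceeding the epsilon tube (the SVR training loss)."""
    return sum(max(0.0, abs(t - p) - eps)
               for t, p in zip(y_true, y_pred))

observed  = [2.0,  2.5, 3.0]   # e.g. daily mean discharge (hypothetical)
predicted = [2.05, 2.4, 3.5]   # hypothetical model output
loss = eps_insensitive_loss(observed, predicted)
```

Here only the third point lies outside the tube, so it alone contributes to the loss; this tolerance of small errors is one reason SVR generalizes well on noisy hydrological records.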