
    MultiPARTES: Multicore Virtualization for Mixed-Criticality Systems

    Modern embedded applications typically integrate a multitude of functionalities with potentially different criticality levels into a single system. Without appropriate preconditions, the integration of mixed-criticality subsystems can lead to a significant and potentially unacceptable increase in engineering and certification costs. A promising solution is to incorporate mechanisms that establish multiple partitions with strict temporal and spatial separation between the individual partitions. In this approach, subsystems with different levels of criticality can be placed in different partitions and can be verified and validated in isolation. The MultiPARTES FP7 project aims at supporting mixed-criticality integration for embedded systems based on virtualization techniques for heterogeneous multicore processors. A major outcome of the project is the MultiPARTES XtratuM, an open source hypervisor designed as a generic virtualization layer for heterogeneous multicore processors. MultiPARTES evaluates the developed technology through selected use cases from the offshore wind power, space, visual surveillance, and automotive domains. The impact of MultiPARTES on the targeted domains is also discussed. A number of ongoing research initiatives (e.g., RECOMP, ARAMIS, MultiPARTES, CERTAINTY) consider mixed-criticality integration on multicore processors. Key challenges are the combination of software virtualization and hardware segregation, and the extension of partitioning mechanisms to jointly address significant non-functional requirements (e.g., time, energy and power budgets, adaptivity, reliability, safety, security, volume, weight, etc.) along with development and certification methodology.
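
    To make the partitioning idea concrete, the sketch below (Python, purely illustrative and not the XtratuM implementation) shows a static cyclic schedule in which every partition receives a fixed time slot inside a repeating major frame, so a low-criticality partition cannot consume the budget of a critical one; all partition names and slot lengths are hypothetical.

        # Minimal sketch of static cyclic (time-partitioned) scheduling as used by
        # partitioning hypervisors; names and slot lengths are hypothetical.
        from dataclasses import dataclass

        @dataclass
        class Partition:
            name: str          # subsystem hosted in this partition
            criticality: str   # informational; isolation comes from the schedule
            slot_ms: int       # fixed CPU budget inside every major frame

        # A repeating "major frame": order and budgets never change at run time,
        # so timing interference between partitions is bounded by construction.
        MAJOR_FRAME = [
            Partition("control_app", "high", slot_ms=5),
            Partition("logging", "low", slot_ms=2),
            Partition("telemetry", "medium", slot_ms=3),
        ]

        def run_major_frames(frames: int) -> None:
            t = 0
            for _ in range(frames):
                for p in MAJOR_FRAME:
                    # A real hypervisor would switch address spaces here (spatial
                    # separation) and pre-empt the partition when its slot expires
                    # (temporal separation); here we only print the timeline.
                    print(f"t={t:4d} ms: run {p.name} ({p.criticality}) for {p.slot_ms} ms")
                    t += p.slot_ms

        if __name__ == "__main__":
            run_major_frames(2)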

    Cloud service analysis using round-robin algorithm for quality-of-service aware task placement for internet of things services

    Round-robin (RR) is an approach to sharing resources in cloud computing in which each user gets a turn to use them in an agreed order. It is suited for time-sharing systems since it automatically reduces the problem of priority inversion, in which low-priority tasks are delayed. The time quantum is limited, and each process is allowed only one time quantum per turn in round-robin scheduling. The objective of this research is to improve the functionality of the current RR method for scheduling tasks in the cloud by lowering the average waiting, turnaround, and response times. The CloudAnalyst tool was used to enhance the RR technique by tuning parameter values to obtain high accuracy at low cost. The results show overall minimum and maximum response times of 36.69 and 650.30 ms for a 300-minute RR run. The cost of the virtual machines (VMs) ranges from $0.5 to $3. The longer the time used, the higher the cost of the data transfer. This research is significant in improving communication and the quality of relationships within groups.
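
    As an illustration of the baseline policy the paper sets out to improve, the following Python sketch simulates plain round-robin scheduling with a fixed time quantum and reports the average waiting, turnaround, and response times; the burst times and quantum are hypothetical and do not reproduce the CloudAnalyst experiments.

        # Minimal round-robin simulation; burst times and the quantum are hypothetical.
        from collections import deque

        def round_robin(burst_times, quantum):
            n = len(burst_times)
            remaining = list(burst_times)
            first_run = [None] * n          # time of first CPU access per task
            finish = [0] * n
            ready = deque(range(n))         # all tasks assumed to arrive at t = 0
            t = 0
            while ready:
                i = ready.popleft()
                if first_run[i] is None:
                    first_run[i] = t
                run = min(quantum, remaining[i])
                t += run
                remaining[i] -= run
                if remaining[i] > 0:
                    ready.append(i)         # unfinished task rejoins the queue
                else:
                    finish[i] = t
            turnaround = finish                                   # arrival time is 0
            waiting = [finish[i] - burst_times[i] for i in range(n)]
            avg = lambda xs: sum(xs) / len(xs)
            return avg(waiting), avg(turnaround), avg(first_run)

        if __name__ == "__main__":
            w, ta, r = round_robin(burst_times=[24, 3, 3], quantum=4)
            print(f"avg waiting={w:.2f} ms, turnaround={ta:.2f} ms, response={r:.2f} ms")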

    TechNews digests: Jan - Nov 2009

    TechNews is a technology news and analysis service aimed at anyone in the education sector keen to stay informed about technology developments, trends and issues. TechNews focuses on emerging technologies and other technology news. The TechNews service published digests from September 2004 to May 2010; analysis pieces and news were combined and published every two to three months.

    Enhancing Confidentiality and Privacy Preservation in e-Health to Enhanced Security

    Electronic health (e-health) system use is growing, which has improved healthcare services significantly but has raised questions about the privacy and security of sensitive medical data. This research suggests a novel strategy to overcome these difficulties and strengthen the security of e-health systems while maintaining the privacy and confidentiality of patient data by utilising machine learning techniques. The security layers of e-health systems are strengthened by the comprehensive framework we propose in this paper, which incorporates cutting-edge machine learning algorithms. The suggested framework includes data encryption, access control, and anomaly detection as its three main elements. First, to prevent unauthorised access during transmission and storage, patient data is secured using cutting-edge encryption technologies. Second, to ensure that only authorised staff can access sensitive medical records, access control mechanisms are strengthened using machine learning models that examine user behaviour patterns. The inclusion of machine learning-based anomaly detection is this research's most inventive feature. By training models on past e-health data, the system can identify deviations from typical data access and usage patterns, thereby quickly spotting potential security breaches or unauthorised activity. This proactive strategy improves the system's capacity to successfully address new threats. Extensive experiments were carried out employing a broad dataset made up of real-world e-health scenarios to verify the efficacy of the suggested approach. The findings showed a marked improvement in the protection of confidentiality and privacy, along with a considerable decline in security breaches and unauthorised access events.
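
    As a hedged illustration of the anomaly-detection element (not the authors' actual model or dataset), the Python sketch below trains an Isolation Forest on synthetic access-log features and flags an access pattern that deviates from the learned baseline.

        # Illustrative anomaly detection over e-health access logs; the features and
        # data are synthetic placeholders, not the study's dataset or model.
        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(0)

        # Hypothetical features per access event:
        # [hour_of_day, records_viewed, session_minutes]
        normal_accesses = np.column_stack([
            rng.normal(10, 2, 500),   # accesses cluster around working hours
            rng.poisson(5, 500),      # a handful of records per session
            rng.normal(20, 5, 500),   # typical session length
        ])

        model = IsolationForest(contamination=0.01, random_state=0).fit(normal_accesses)

        # A 3 a.m. session touching hundreds of records should stand out.
        suspicious = np.array([[3, 400, 120]])
        print(model.predict(suspicious))  # -1 marks an anomaly in scikit-learn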

    Health 4.0: Applications, Management, Technologies and Review

    The Industry 4.0 Standard (I4S) employs technologies for automation and data exchange through cloud computing, Big Data (BD), the Internet of Things (IoT), forms of wireless Internet, 5G technologies, cryptography, the use of semantic database (DB) design, Augmented Reality (AR) and Content-Based Image Retrieval (CBIR). Its healthcare extension is the so-called Health 4.0. This study informs about Health 4.0 and its potential to extend, virtualize and enable new healthcare-related processes (e.g., home care, finitude medicine, and personalized/remotely triggered pharmaceutical treatments) and transform them into services. In the future, these services will be able to virtualize multiple levels of care, connect devices and move towards Personalized Medicine (PM). The Health 4.0 Cyber-Physical System (HCPS) contains several types of computers, communications, storage, interfaces, biosensors, and bioactuators. The HCPS paradigm permits observing processes from the real world, as well as monitoring patients before, during and after surgical procedures using biosensors. Besides, HCPSs contain bioactuators that accomplish the intended interventions along with other novel strategies to deploy PM. A biosensor detects certain critical outer and inner patient conditions and sends these signals to a Decision-Making Unit (DMU). Mobile devices and wearables are present-day examples of gadgets containing biosensors. Once the DMU receives the signals, they can be compared to the patient's medical history and, depending on the protocols, a set of measures to handle a given situation will follow. The parts responsible for implementing the automated mitigation actions are the bioactuators, which can vary from a buzzer to the remote-controlled release of some elements in a capsule inside the patient's body. Decentralizing health services is a challenge for the creation of health-related applications. In addition, CBIR systems can enable access to information from multimedia and multimodality images, which can aid in patient diagnosis and medical decision-making. Currently, the National Health Service addresses the application of communication tools to patients and medical teams to intensify the transfer of treatments from the hospital to the home, without disruption in outpatient services. HCPS technologies share tools with remote servers, allowing data embedding and BD analysis, and permit easy integration of healthcare professionals' expertise with intelligent devices. However, the need for improvements, multidisciplinary discussions, strong laws/protocols, inventories of the impact of novel techniques on patients/caregivers, and rigorous tests of accuracy is undeniable before any medical care technology initiative can be automated.
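
    To make the biosensor-to-bioactuator loop concrete, the following is a purely illustrative Python sketch of a Decision-Making Unit that compares a reading against thresholds taken from a hypothetical patient profile and selects an action; the values and actions are invented for illustration, not clinical guidance.

        # Illustrative HCPS decision loop: biosensor -> DMU -> bioactuator.
        # Thresholds, readings, and actions are hypothetical, not clinical guidance.
        from dataclasses import dataclass

        @dataclass
        class PatientProfile:
            glucose_low: float = 70.0     # mg/dL, hypothetical patient-specific bounds
            glucose_high: float = 180.0

        def decision_making_unit(reading_mg_dl: float, profile: PatientProfile) -> str:
            """Map a biosensor reading to an action for the bioactuator."""
            if reading_mg_dl < profile.glucose_low:
                return "alert_caregiver"           # e.g. buzzer / notification
            if reading_mg_dl > profile.glucose_high:
                return "trigger_release_capsule"   # remotely triggered bioactuator
            return "log_only"

        if __name__ == "__main__":
            profile = PatientProfile()
            for reading in (65.0, 110.0, 240.0):
                print(reading, "->", decision_making_unit(reading, profile))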

    Modeling 4.0: Conceptual Modeling in a Digital Era

    Digitization provides entirely new affordances for our economies and societies. This leads to previously unseen design opportunities and complexities as systems and their boundaries are re-defined, creating a need for appropriate methods to support design that caters to these new demands. Conceptual modeling is an established means for this, but it needs to be advanced to adequately depict the requirements of digitization. However, unlike the actual deployment of digital technologies in various industries, the domain of conceptual modeling itself has not yet undergone a comprehensive renewal in light of digitization. Therefore, inspired by the notion of Industry 4.0, an overarching concept for digital manufacturing, in this commentary paper we propose Modeling 4.0 as the notion for conceptual modeling mechanisms in a digital environment. In total, 12 mechanisms of conceptual modeling are distinguished, providing ample guidance for academics and professionals interested in ensuring that modeling techniques and methods continue to fit contemporary and emerging requirements.

    Reinforcing Digital Trust for Cloud Manufacturing Through Data Provenance Using Ethereum Smart Contracts

    Cloud Manufacturing (CMfg) is an advanced manufacturing model that caters to fast-paced agile requirements (Putnik, 2012). For manufacturing complex products that require extensive resources, manufacturers explore advanced manufacturing techniques like CMfg, as it becomes infeasible to achieve high standards through complete ownership of manufacturing artifacts (Kuan et al., 2011). CMfg, also known as Manufacturing as a Service (MaaS) and Cyber Manufacturing (NSF, 2020), addresses the shortcomings of traditional manufacturing by building a virtual cyber enterprise of geographically distributed entities that manufacture custom products through collaboration. With manufacturing venturing into cyberspace, digital trust issues concerning product quality, data, and intellectual property security become significant concerns (R. Li et al., 2019). This study establishes a trust mechanism through data provenance for ensuring digital trust between the various stakeholders involved in CMfg. A trust model with smart contracts built on the Ethereum blockchain implements data provenance in CMfg. The study covers three data provenance models using Ethereum smart contracts for establishing digital trust in CMfg: Product Provenance, Order Provenance, and Operational Provenance. Together, these provenance models address the most important questions regarding CMfg: what goes into the product, who manufactures the product, who transports the products, under what conditions the products are manufactured, and whether regulatory constraints/requisites are met.
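
    As a rough illustration of how a provenance record can be anchored on a blockchain (this is not the study's contract code), the Python sketch below canonicalizes a hypothetical order-provenance record and computes the digest that a smart contract would only need to store and attest; all field names are invented for illustration.

        # Illustrative provenance record whose digest could be anchored on-chain;
        # field names are hypothetical and this is not the study's contract code.
        import hashlib
        import json

        def provenance_digest(record: dict) -> str:
            """Canonical JSON -> SHA-256 digest suitable for on-chain anchoring."""
            canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
            return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

        order_provenance = {
            "order_id": "ORD-1001",
            "manufacturer": "PlantA",       # who manufactures the product
            "carrier": "CarrierX",          # who transports it
            "conditions": {"temp_c": 21, "humidity_pct": 40},
            "regulatory_check": "passed",
        }

        print(provenance_digest(order_provenance))
        # A contract that stores only this digest lets any stakeholder later verify
        # that the off-chain record was not altered, without exposing its contents.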

    IDARTS – Towards intelligent data analysis and real-time supervision for industry 4.0

    The manufacturing industry represents a data-rich environment in which larger and larger volumes of data are constantly being generated by its processes. However, only a relatively small portion of it is actually taken advantage of by manufacturers. As such, the proposed Intelligent Data Analysis and Real-Time Supervision (IDARTS) framework presents guidelines for the implementation of scalable, flexible and pluggable data analysis and real-time supervision systems for manufacturing environments. IDARTS is aligned with the current Industry 4.0 trend, being aimed at allowing manufacturers to translate their data into a business advantage through the integration of a Cyber-Physical System at the edge with cloud computing. It combines distributed data acquisition, machine learning and run-time reasoning to assist in fields such as predictive maintenance and quality control, reducing the impact of disruptive events in production.
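
    As a hedged sketch of the kind of edge-side run-time supervision IDARTS targets (not the framework's actual API), the Python example below flags a sensor reading that drifts away from a machine's historical baseline, the sort of signal a predictive-maintenance component might act on; the data and threshold are synthetic.

        # Synthetic sketch of run-time supervision for predictive maintenance;
        # readings, baseline, and threshold are hypothetical.
        import statistics

        def supervise(history, new_reading, k=3.0):
            """Flag a reading more than k standard deviations from the baseline."""
            mean = statistics.fmean(history)
            std = statistics.pstdev(history)
            if std and abs(new_reading - mean) > k * std:
                return "raise_maintenance_alert"
            return "ok"

        if __name__ == "__main__":
            vibration_mm_s = [2.1, 2.0, 2.2, 2.1, 1.9, 2.0, 2.3, 2.1]  # baseline
            for reading in (2.2, 4.8):
                print(reading, "->", supervise(vibration_mm_s, reading))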

    The Impact of MIS Software on IT Energy Consumption

    The energy consumption of IT has a great impact on operational costs, in addition to being important for social responsibility and system scalability. Research on IT energy efficiency has always focused on hardware, whereas within the software domain it has mainly focused on embedded systems. In this paper we present the preliminary results of experiments we conducted to evaluate MIS applications from an energy efficiency point of view. We analyze in detail selected case studies, including two ERPs, two CRMs and four DBMSs. Our evidence suggests i) that not only the infrastructural layers but also the MIS application layer has an impact on energy consumption; ii) that different MIS applications satisfying the same functional requirements consume significantly different amounts of energy; and iii) that in some scenarios energy efficiency cannot be increased simply by improving time performance.
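
    For readers who want to try this kind of measurement on a small scale, the Python sketch below reads the Intel RAPL energy counter exposed through the Linux powercap interface before and after a workload; the workload is a hypothetical placeholder, reading the counter typically requires elevated privileges, and this is not the measurement setup used in the paper.

        # Rough sketch of software energy measurement via the Linux powercap (Intel
        # RAPL) interface; the workload is a placeholder and root is usually required.
        import time

        RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-0 energy counter

        def read_energy_uj() -> int:
            with open(RAPL) as f:
                return int(f.read())

        def measure(workload):
            e0, t0 = read_energy_uj(), time.time()
            workload()
            e1, t1 = read_energy_uj(), time.time()
            joules = (e1 - e0) / 1_000_000   # counter is in microjoules (may wrap)
            seconds = t1 - t0
            return joules, joules / seconds  # energy and average power

        if __name__ == "__main__":
            joules, watts = measure(lambda: sum(i * i for i in range(10_000_000)))
            print(f"{joules:.2f} J, {watts:.2f} W average")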

    Discovering New Vulnerabilities in Computer Systems

    Vulnerability research plays a key role in preventing and defending against malicious computer system exploitations. Driven by a multi-billion dollar underground economy, cyber criminals today tirelessly launch malicious exploitations, threatening every aspect of daily computing. To effectively protect computer systems from devastation, it is imperative to discover and mitigate vulnerabilities before they fall into the offensive parties' hands. This dissertation is dedicated to the research and discovery of new design and deployment vulnerabilities in three very different types of computer systems. The first vulnerability is found in automatic malicious binary (malware) detection systems. Binary analysis, a central piece of technology for malware detection, is divided into two classes: static analysis and dynamic analysis. State-of-the-art detection systems employ both classes of analyses to complement each other's strengths and weaknesses for improved detection results. However, we found that the commonly seen design patterns may suffer from evasion attacks. We demonstrate attacks on the vulnerabilities by designing and implementing a novel binary obfuscation technique. The second vulnerability is located in the design of server system power management. Technological advancements have improved server system power efficiency and facilitated energy-proportional computing. However, the change of power profile makes power consumption subject to unaudited influences from remote parties, leaving server systems vulnerable to energy-targeted malicious exploits. We demonstrate an energy-abusing attack on a standalone open Web server, measure the extent of the damage, and present a preliminary defense strategy. The third vulnerability is discovered in the application of server virtualization technologies. Server virtualization greatly benefits today's data centers and brings pervasive cloud computing a step closer to the general public. However, the practice of physically co-hosting virtual machines with different security privileges risks introducing covert channels that seriously threaten information security in the cloud. We study the construction of high-bandwidth covert channels via the memory sub-system, and show a practical exploit of cross-virtual-machine covert channels on virtualized x86 platforms.