
    Intelligent Embedded Software: New Perspectives and Challenges

    Intelligent embedded systems (IES) represent a novel and promising generation of embedded systems (ES). IES can reason about their external environment and adapt their behavior accordingly. Such systems sit at the intersection of two branches: embedded computing and intelligent computing. At the same time, intelligent embedded software (IESo) is becoming a large part of the engineering cost of intelligent embedded systems. IESo can include artificial intelligence (AI)-based systems such as expert systems, neural networks and other sophisticated AI models, to guarantee important characteristics such as self-learning, self-optimizing and self-repairing. Despite the widespread adoption of such systems, challenging design issues are arising. Designing software that is both resource-constrained and intelligent is not a trivial task, especially in a real-time context. To deal with this dilemma, embedded-systems researchers have profited from progress in semiconductor technology to develop specific hardware that supports AI models well and renders the integration of AI with the embedded world a reality.
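The resource-constraint dilemma described above can be illustrated with a toy sketch: on constrained hardware, AI models are commonly run with integer-only ("fixed-point") arithmetic to save memory and power. Everything below — the 8-bit quantization scheme, the single-neuron model and the values — is an invented illustration, not something taken from the paper.

```python
# Toy sketch of quantized, integer-only inference as a resource-constrained
# embedded system might run it. All values and the scheme are illustrative.

def quantize(x, scale=127):
    """Map a float in roughly [-1, 1] to a clamped signed 8-bit integer."""
    return max(-128, min(127, round(x * scale)))

def fixed_point_neuron(inputs, weights):
    """Integer-only dot product followed by a hard-threshold activation."""
    acc = sum(i * w for i, w in zip(inputs, weights))  # fits a 32-bit accumulator
    return 1 if acc > 0 else 0

# Sensor readings and learned weights, quantized offline to save memory.
readings = [quantize(v) for v in (0.5, -0.2, 0.8)]
weights = [quantize(v) for v in (0.3, 0.9, -0.1)]
print(fixed_point_neuron(readings, weights))
```

The design choice mirrored here is the one the abstract hints at: the model is prepared (quantized) offline, so the device itself only ever performs cheap integer arithmetic.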

    Poor ergonomics costs but can good be made to pay?


    From ARTEMIS Requirements to a Cross-Domain Embedded System Architecture

    This paper gives an overview of the cross-domain component-based architecture GENESYS for embedded systems. The development of this architecture has been driven by key industrial challenges identified within the ARTEMIS Strategic Research Agenda (SRA), such as composability, robustness and integrated resource management. GENESYS is a platform architecture that provides a minimal set of core services and a plurality of optional services that are predominantly implemented as self-contained system components. Choosing a suitable set of these system components that implement optional services, augmented by application-specific components, generates domain-specific instantiations of the architecture (e.g., for automotive, avionic, industrial-control, mobile and consumer-electronics applications). Such a cross-domain approach is needed to support the coming Internet of Things, to take full advantage of the economies of scale of the semiconductor industry and to improve productivity.
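The instantiation scheme the abstract describes — a fixed core plus a chosen subset of optional, self-contained components — can be sketched in a few lines. All service and component names below are hypothetical placeholders, not taken from the GENESYS specification.

```python
# Hypothetical sketch of cross-domain platform instantiation: every instance
# gets the core services; a domain instance adds a chosen subset of optional
# components plus application-specific ones. Names are invented.

CORE_SERVICES = {"time", "communication", "configuration"}

OPTIONAL_COMPONENTS = {
    "automotive": {"gateway", "diagnostics", "security"},
    "avionics":   {"partitioning", "voting", "diagnostics"},
}

def instantiate(domain, app_components):
    """A domain instance = core services + optional subset + app components."""
    return CORE_SERVICES | OPTIONAL_COMPONENTS[domain] | set(app_components)

instance = instantiate("automotive", ["lane_keeping"])
print(sorted(instance))
```

The point of the sketch is composability: domains differ only in which optional components they select, while the minimal core stays identical everywhere.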

    An Appreciative Model of Schumpeterian Evolution

    Schumpeter persistently sought to reconcile innovation with general equilibrium to explain economic evolution. In essence, he was interested in innovatory discontinuities that upset equilibrium and generate a transitional dynamics converging to a different state of technology. There are two central approaches to the analysis of economic evolution which revolve around the Schumpeterian vision: the evolutionary approach, originating in the landmark book by Nelson and Winter, An Evolutionary Theory of Economic Change, and the neoclassical approach emerging from Romer’s seminal paper “Endogenous Technological Change”. Neither of these approaches is able to explain economic evolution in an economy where both general equilibrium and innovatory discontinuities can occur. In this paper I formalize the notion of innovatory discontinuity using the concept of an ‘ideas production function’ and present an appreciative model of economic evolution involving equilibrium, innovation and innovatory discontinuities. This (hybrid) model sheds some light on the question: is economic evolution continuous or discontinuous?
    Keywords: general equilibrium, economic evolution, neoclassical approach, evolutionary economics, mega-invention, innovatory discontinuities, Schumpeterian view
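The ‘ideas production function’ invoked above is conventionally written in the endogenous-growth literature (following Romer and Jones) in a form like the one below; the paper's exact specification is not reproduced here, so treat this as a generic illustration.

```latex
% Stock of ideas A(t) grows with research labour L_A(t):
%   \delta  -- research productivity
%   \lambda -- duplication externality (0 < \lambda \le 1)
%   \phi    -- intertemporal knowledge spillovers
\dot{A}(t) = \delta \, L_A(t)^{\lambda} \, A(t)^{\phi}
```

In such a framework, an innovatory discontinuity can be modelled as a discrete jump in the stock A(t), rather than movement along this smooth accumulation path.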

    Microfluidic technologies for accelerating the clinical translation of nanoparticles

    Using nanoparticles for therapy and imaging holds tremendous promise for the treatment of major diseases such as cancer. However, their translation into the clinic has been slow because it remains difficult to produce nanoparticles that are consistent 'batch-to-batch', and in sufficient quantities for clinical research. Moreover, platforms for rapid screening of nanoparticles are still lacking. Recent microfluidic technologies can tackle some of these issues, and offer a way to accelerate the clinical translation of nanoparticles. In this Progress Article, we highlight the advances in microfluidic systems that can synthesize libraries of nanoparticles in a well-controlled, reproducible and high-throughput manner. We also discuss the use of microfluidics for rapidly evaluating nanoparticles in vitro under microenvironments that mimic the in vivo conditions. Furthermore, we highlight some systems that can manipulate small organisms, which could be used for evaluating the in vivo toxicity of nanoparticles or for drug screening. We conclude with a critical assessment of the near- and long-term impact of microfluidics in the field of nanomedicine.
    Funding: Prostate Cancer Foundation (Award in Nanotherapeutics); MIT-Harvard Center for Cancer Nanotechnology Excellence (U54-CA151884); National Heart, Lung, and Blood Institute (Programs of Excellence in Nanotechnology, HHSN268201000045C); National Science Foundation (U.S.) (Graduate Research Fellowship)

    Patented Personality


    Understanding multidimensional verification: Where functional meets non-functional

    Advancements in electronic systems' design have a notable impact on design-verification technologies. The recent paradigms of the Internet of Things (IoT) and Cyber-Physical Systems (CPS) assume devices that are immersed in physical environments, significantly constrained in resources and expected to provide security, privacy, reliability, performance and low-power features. In recent years, numerous extra-functional aspects of electronic systems have come to the fore, implying verification of hardware design models in a multidimensional space alongside the functional concerns of the target system. However, in contrast to the software domain, such a holistic approach remains underdeveloped. The contributions of this paper are a taxonomy for multidimensional hardware-verification aspects and a state-of-the-art survey of related research works and trends enabling the multidimensional-verification concept. Further, an initial approach to performing multidimensional verification based on machine-learning techniques is evaluated. The importance and challenge of performing multidimensional verification are illustrated by an example case study.
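The multidimensional idea — treating each simulation run as one record carrying both functional coverage and extra-functional metrics, then analysing them jointly — can be caricatured with a toy statistical check. The paper evaluates a machine-learning approach, but its specific model is not described here; the metric names, values and the simple z-score rule below are all invented for illustration.

```python
# Toy multidimensional-verification check: each run records functional
# coverage alongside extra-functional metrics (power, timing slack); a run
# whose metric deviates strongly from the mean is flagged for inspection.
from statistics import mean, stdev

runs = [
    {"coverage": 0.91, "power_mw": 12.1, "slack_ns": 0.40},
    {"coverage": 0.93, "power_mw": 11.8, "slack_ns": 0.38},
    {"coverage": 0.90, "power_mw": 25.7, "slack_ns": 0.05},  # suspicious run
    {"coverage": 0.92, "power_mw": 12.3, "slack_ns": 0.41},
]

def outliers(runs, key, threshold=1.2):
    """Return indices of runs whose metric lies beyond threshold * stdev."""
    values = [r[key] for r in runs]
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) > threshold * sigma]

print(outliers(runs, "power_mw"))
```

A real flow would of course use far richer features and a trained model; the sketch only shows the data shape that makes functional and non-functional dimensions jointly analysable.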

    A review of advances in pixel detectors for experiments with high rate and radiation

    The Large Hadron Collider (LHC) experiments ATLAS and CMS have established hybrid pixel detectors as the instrument of choice for particle tracking and vertexing in high-rate and high-radiation environments, as they operate close to the LHC interaction points. With the High-Luminosity LHC upgrade now in sight, for which the tracking detectors will be completely replaced, new generations of pixel detectors are being devised. They have to address enormous challenges in terms of data throughput and radiation levels, ionizing and non-ionizing, that harm the sensing and readout parts of pixel detectors alike. Advances in microelectronics and microprocessing technologies now enable large-scale detector designs with unprecedented performance in measurement precision (space and time), radiation-hard sensors and readout chips, hybridization techniques, lightweight supports, and fully monolithic approaches to meet these challenges. This paper reviews the world-wide effort on these developments.
    Comment: 84 pages with 46 figures. Review article. For submission to Rep. Prog. Phy

    DNA assembly standards: Setting the low-level programming code for plant biotechnology

    Synthetic biology is defined as the application of engineering principles to biology. It aims to increase the speed, ease and predictability with which desirable changes and novel traits can be conferred to living cells. The initial steps in this process aim to simplify the encoding of new instructions in DNA by establishing low-level programming languages for biology. Together with advances in the laboratory that allow multiple DNA molecules to be efficiently assembled together into a desired order in a single step, this approach has simplified the design and assembly of multigene constructs and has even facilitated the automated construction of synthetic chromosomes. These advances and technologies are now being applied to plants, for which there are a growing number of software and wetware tools for the design, construction and delivery of DNA molecules and for the engineering of endogenous genes. Here we review the efforts of the past decade that have established synthetic biology workflows and tools for plants and discuss the constraints and bottlenecks of this emerging field.
    Funding: Marta Vazquez-Vilar is funded by Wageningen University & Research. Diego Orzaez is funded by the Spanish Ministry of Economy and Competitiveness [grant number BIO2016-78601-R]. Nicola Patron is funded by the UK Biotechnological and Biological Sciences Research Council (BBSRC) and Engineering and Physical Sciences Research Council (EPSRC) Synthetic Biology for Growth programme [OpenPlant Synthetic Biology Research Centre, grant number BB/L0I4130/1], and by the Earlham DNA Foundry [grant number BB/CCG1720/1].
    Citation: VĂĄzquez-Vilar, M.; OrzĂĄez Calatayud, DV.; Patron, N. (2018). DNA assembly standards: Setting the low-level programming code for plant biotechnology. Plant Science. 273:33-41. https://doi.org/10.1016/j.plantsci.2018.02.024
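The "low-level programming code" metaphor — standardized parts carrying defined junction sequences so that many fragments assemble in a fixed order in a single step — can be caricatured in a few lines. Part names and fusion-site sequences below are invented; real standards (such as the common syntax for plant DNA parts) specify particular 4-bp overhangs for each part category.

```python
# Toy model of standardized one-pot DNA assembly (loosely inspired by Golden
# Gate-style cloning). Each part is (name, 5' fusion site, sequence, 3' fusion
# site); parts join in order only where overhangs match. Sequences are invented.

parts = [
    ("promoter",   "GGAG", "aacc" * 5, "AATG"),
    ("cds",        "AATG", "atgg" * 5, "GCTT"),
    ("terminator", "GCTT", "ttag" * 5, "CGCT"),
]

def assemble(parts):
    """Concatenate parts in order, enforcing matching fusion sites at junctions."""
    construct = ""
    for (name, five, seq, three), nxt in zip(parts, parts[1:] + [None]):
        construct += five + seq
        if nxt is not None and three != nxt[1]:
            raise ValueError(f"fusion-site mismatch after part {name!r}")
    return construct + parts[-1][3]

print(len(assemble(parts)))
```

The standardization point is that any promoter part ending in AATG can precede any coding-sequence part starting with AATG — which is exactly what makes parts exchangeable between labs and amenable to automation.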