50,887 research outputs found

    Integrated Design and Implementation of Embedded Control Systems with Scilab

    Embedded systems are playing an increasingly important role in control engineering. Despite their popularity, embedded systems are generally subject to resource constraints and it is therefore difficult to build complex control systems on embedded platforms. Traditionally, the design and implementation of control systems are often separated, which causes the development of embedded control systems to be highly time-consuming and costly. To address these problems, this paper presents a low-cost, reusable, reconfigurable platform that enables integrated design and implementation of embedded control systems. To minimize the cost, free and open source software packages such as Linux and Scilab are used. Scilab is ported to the embedded ARM-Linux system. The drivers for interfacing Scilab with several communication protocols including serial, Ethernet, and Modbus are developed. Experiments are conducted to test the developed embedded platform. The use of Scilab enables implementation of complex control algorithms on embedded platforms. With the developed platform, it is possible to perform all phases of the development cycle of embedded control systems in a unified environment, thus facilitating the reduction of development time and cost.
    Comment: 15 pages, 14 figures; Open Access at http://www.mdpi.org/sensors/papers/s8095501.pd
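
    The control side of such a platform can be illustrated with a short sketch. The code below is not the paper's Scilab implementation; it is a generic Python analogue of a discrete PID loop, with the simulated first-order plant, gains, and sampling period chosen purely for illustration (on a real platform the plant would be reached through the serial, Ethernet, or Modbus drivers mentioned above).

```python
# Minimal discrete PID loop, illustrating the kind of control algorithm such a
# platform would run on an embedded target. The first-order plant below stands
# in for a real process; all constants are illustrative assumptions, not values
# from the paper.

KP, KI, KD = 2.0, 0.5, 0.1    # controller gains (assumed)
DT = 0.05                     # sampling period in seconds (assumed)
SETPOINT = 1.0

def plant_step(y, u, dt, tau=0.8, gain=1.5):
    """First-order plant y' = (gain*u - y)/tau, integrated with Euler."""
    return y + dt * (gain * u - y) / tau

def run_loop(steps=200):
    y, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        err = SETPOINT - y
        integral += err * DT
        derivative = (err - prev_err) / DT
        u = KP * err + KI * integral + KD * derivative   # PID control law
        prev_err = err
        y = plant_step(y, u, DT)
    return y

if __name__ == "__main__":
    print("final output:", round(run_loop(), 3))
```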

    Password Based a Generalize Robust Security System Design Using Neural Network

    Among the various means of resource protection available, including biometrics, password-based systems are the simplest, most user-friendly, most cost-effective, and most commonly used. However, this method is highly sensitive to attacks. Most advanced password-based authentication methods encrypt the contents of the password before storing or transmitting it in the physical domain, but all conventional cryptography-based encryption methods have their own limitations, generally either in terms of complexity or in terms of efficiency. The multi-application use of passwords today forces users to rely on memory aids, which itself degrades the level of security. This paper gives a method that exploits an artificial neural network to develop a more secure means of authentication, one that is more efficient in providing authentication while remaining simple in design. Beyond protection, a step toward stronger security is taken by adding intruder detection to the protection system, made possible by analysing several logical parameters associated with user activity. A new method of designing a security system centrally based on a neural network, with intrusion detection capability, is presented to handle the challenges of present solutions for any kind of resource.
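
    A minimal sketch of this kind of scheme is given below, assuming a particular password encoding, a single linear layer trained by gradient descent so that only weights (not the password) are stored, and a failed-attempt counter as the "logical parameter" driving intruder detection. All names, sizes, and thresholds are illustrative assumptions, not the design presented in the paper.

```python
import numpy as np

# Sketch: store only network weights, not the password. The layer is trained so
# that the registered password's encoding maps to a fixed target vector; a
# candidate password authenticates only if the network output is close enough
# to that target. Encoding, sizes and thresholds are assumptions.

TARGET = np.array([1.0, 0.0, 1.0, 0.0])   # arbitrary target pattern
MATCH_TOL = 1e-3                          # output distance tolerance
LOCKOUT_AFTER = 3                         # failed attempts before flagging

def encode(password: str, dim: int = 16) -> np.ndarray:
    """Map a password to a fixed-length numeric feature vector."""
    vec = np.zeros(dim)
    for i, ch in enumerate(password):
        vec[i % dim] += ord(ch) / 255.0
    return vec

def train_weights(password: str, epochs: int = 2000, lr: float = 0.1):
    """Fit a single linear layer so that encode(password) @ w ~= TARGET."""
    x = encode(password)
    w = np.zeros((x.size, TARGET.size))
    for _ in range(epochs):
        err = x @ w - TARGET
        w -= lr * np.outer(x, err)        # gradient of the squared error
    return w

class Authenticator:
    def __init__(self, registered_password: str):
        self.weights = train_weights(registered_password)
        self.failed = 0                   # simple user-activity parameter

    def login(self, attempt: str) -> bool:
        out = encode(attempt) @ self.weights
        ok = bool(np.linalg.norm(out - TARGET) < MATCH_TOL)
        self.failed = 0 if ok else self.failed + 1
        if self.failed >= LOCKOUT_AFTER:
            print("intrusion suspected: repeated failed attempts")
        return ok

auth = Authenticator("s3cret!")
print(auth.login("s3cret!"))   # True
print(auth.login("guess1"))    # False
```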

    Launching the Grand Challenges for Ocean Conservation

    The ten most pressing Grand Challenges in Oceans Conservation were identified at the Oceans Big Think and described in a detailed working document: (1) A Blue Revolution for Oceans: Reengineering Aquaculture for Sustainability; (2) Ending and Recovering from Marine Debris; (3) Transparency and Traceability from Sea to Shore: Ending Overfishing; (4) Protecting Critical Ocean Habitats: New Tools for Marine Protection; (5) Engineering Ecological Resilience in Near Shore and Coastal Areas; (6) Reducing the Ecological Footprint of Fishing through Smarter Gear; (7) Arresting the Alien Invasion: Combating Invasive Species; (8) Combatting the Effects of Ocean Acidification; (9) Ending Marine Wildlife Trafficking; (10) Reviving Dead Zones: Combating Ocean Deoxygenation and Nutrient Runoff.

    Australian carbon biosequestration and bioenergy policy co-evolution: mechanisms, mitigation and convergence

    The intricacies of international land-use change and forestry policy reflect the temporal, technical and political difficulty of integrating biological systems and climate change mitigation. The plethora of co-existing policies with varied technical rules, accreditation requirements, accounting methods, market registries, etc., disguises the unequal efficacies of each mechanism. This work explores the co-evolution and convergence of Australian voluntary and mandatory climate-related policies at the biosequestration-bioenergy interface. Currently, there are temporal differences between the fast-evolving and precise climate-change mechanisms and the long-term 'permanence' sought from land use changes encouraged by biosequestration instruments. Policy convergence that favours the most efficient, appropriate and scientifically substantiated policy mechanisms is required. These policies must recognise the fundamental biological foundation of biosequestration, bioenergy, biomaterial industrial development and other areas such as food security and environmental concerns. Policy mechanisms that provide administrative simplicity, project longevity and market certainty are necessary for rural and regional Australians to cost-effectively harness the considerable climate change mitigation potential of biological systems.

    On Designing Multicore-aware Simulators for Biological Systems

    The stochastic simulation of biological systems is an increasingly popular technique in bioinformatics. It is often an enlightening technique, which may, however, turn out to be computationally expensive. We discuss the main opportunities to speed it up on multi-core platforms, which pose new challenges for parallelisation techniques. These opportunities are developed into two general families of solutions, involving both a single simulation and a bulk of independent simulations (either replicas or runs derived from a parameter sweep). The proposed solutions are tested on the parallelisation of the CWC (Calculus of Wrapped Compartments) simulator, which is carried out according to the proposed solutions by way of the FastFlow programming framework, making fast development and efficient execution on multi-cores possible.
    Comment: 19 pages + cover page
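
    The "bulk of independent simulations" opportunity parallelises naturally because replicas do not communicate. The sketch below is a plain Python analogue using a toy birth-death process and the standard library process pool; it is not the CWC simulator and does not use FastFlow, and all rates and counts are illustrative assumptions.

```python
import random
from concurrent.futures import ProcessPoolExecutor

# Toy Gillespie-style birth-death simulation: each replica is independent, so a
# bulk of replicas (or a parameter sweep) parallelises trivially across cores.
# Rates, horizon and replica count are illustrative assumptions.

def simulate(seed, birth=1.0, death=0.03, x0=10, t_end=50.0):
    rng = random.Random(seed)
    t, x = 0.0, x0
    while t < t_end and x > 0:
        total = birth + death * x          # total propensity
        t += rng.expovariate(total)        # time to the next reaction
        if rng.random() < birth / total:   # choose which reaction fires
            x += 1
        else:
            x -= 1
    return x

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:    # one worker per core by default
        finals = list(pool.map(simulate, range(32)))  # 32 independent replicas
    print("mean final population:", sum(finals) / len(finals))
```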

    Making More Efficient the Dissemination of the Information in the Field of Anti-Aging through Information Technology

    ICT have become extremely important because they allow everybody to participate in the Information Society, regardless of an under-privileged personal or social situation. Health Education Informatics Systems (HEIS), as a means of facilitating the exchange of information between specialists, physicians and patients, or authorized organisations, have become a necessary modern tool that offers quality solutions, a reliable source of information, and a pertinent instrument for decision making. Members of the aging society must be motivated to gain access, through ICT, to knowledge that can improve and prolong active life. The dramatic demographic transformations of our century have forced a reconsideration of social policies and of the use of HEIS for disseminating anti-aging information, for empowering people with regard to their own state of health, and for genuinely involving the elderly in using the Internet. AgingNice is a complex multidisciplinary system that belongs to the family of health informatics systems, particularized to the anti-aging domain, and that allows the sharing of knowledge concerning the specific research and the promotion of theoretical and practical information, both among stakeholders in the medical area and at the individual level.
    Keywords: anti-aging, elderly, ICT, health informatics systems, web services

    Sustainability in design: now! Challenges and opportunities for design research, education and practice in the XXI century

    Copyright © 2010 Greenleaf Publications. LeNS project funded by the Asia Link Programme, EuropeAid, European Commission.

    Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges: management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large-scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu
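
    As a generic illustration of the workflow idea (a directed graph of processing steps executed in dependency order, with a provenance record kept for each step), here is a small Python sketch. It does not use the LONI Pipeline API; the step names and their trivial "processing" are hypothetical placeholders.

```python
import time
from graphlib import TopologicalSorter

# Generic workflow-as-graph sketch: steps declare their dependencies, are run
# in topological order, and every execution is logged as a provenance record.
# Step names and the trivial "processing" they perform are hypothetical.

def skull_strip(data):   return data + ["skull_strip"]
def register(data):      return data + ["register"]
def segment(data):       return data + ["segment"]
def report(data):        return data + ["report"]

STEPS = {"skull_strip": skull_strip, "register": register,
         "segment": segment, "report": report}
DEPENDS_ON = {"skull_strip": set(), "register": {"skull_strip"},
              "segment": {"register"}, "report": {"segment", "register"}}

def run_workflow(initial):
    data, provenance = initial, []
    for step in TopologicalSorter(DEPENDS_ON).static_order():
        started = time.time()
        data = STEPS[step](data)              # run the step on current data
        provenance.append({"step": step,
                           "inputs": sorted(DEPENDS_ON[step]),
                           "duration_s": round(time.time() - started, 6)})
    return data, provenance

if __name__ == "__main__":
    result, prov = run_workflow(["raw_scan"])
    for record in prov:
        print(record)
    print("pipeline output:", result)
```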