
    Fog computing, applications, security and challenges, review

    The internet of things envisions a world where, on a daily basis, objects can join the internet, exchange information, and in addition process, store, and gather data from the nearby environment and effectively act on it. A remarkable number of services can be imagined by exploiting the internet of things. Fog computing, otherwise called edge computing, was introduced in 2012 and is considered a prioritized choice for internet of things applications. Fog computing extends cloud services toward the edge of the network and makes computation, communication, and storage services available in proximity to the end user. Fog computing can not only provide low latency and location awareness but also enhance real-time applications, quality of service, mobility, security, and privacy in internet of things application scenarios. In this paper, we summarize and overview the fog computing model architecture, its characteristics, similar paradigms, and various applications in real-time scenarios such as the smart grid, traffic control systems, and augmented reality. Finally, security challenges are presented.
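    As an illustration of the latency argument in this abstract, the sketch below (not from the paper; the endpoints and the TCP-connect probe are illustrative assumptions) shows a client choosing between a nearby fog node and a distant cloud endpoint by measured round-trip time.

        import socket
        import time

        def rtt_ms(host, port=80, timeout=1.0):
            """Measure one TCP connect round-trip time to a host, in milliseconds."""
            start = time.perf_counter()
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    pass
            except OSError:
                return float("inf")  # unreachable nodes lose the comparison
            return (time.perf_counter() - start) * 1000.0

        def pick_offload_target(fog_host, cloud_host):
            """Prefer the fog node when it answers faster; fall back to the cloud."""
            fog, cloud = rtt_ms(fog_host), rtt_ms(cloud_host)
            return ("fog", fog) if fog <= cloud else ("cloud", cloud)

        # Hypothetical endpoints for illustration only.
        target, latency = pick_offload_target("fog-gw.local", "cloud.example.com")
        print(f"offloading to {target} node ({latency:.1f} ms RTT)")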

    Extending systems-on-chip to the third dimension: performance, cost and technological tradeoffs.

    Because of today's market demand for high-performance, high-density portable hand-held applications, electronic system design has shifted its focus from 2-D planar single-chip SoC solutions to alternative options such as tiled silicon, single-level embedded modules, and 3-D integration. Among these choices, finding an optimal system implementation usually involves cost, performance, and other technological trade-off analyses at the system conceptual level. It has been identified that the decisions made within the first 20% of the total design cycle ultimately determine up to 80% of the final product cost. In this paper, we discuss appropriate and realistic metrics for performance and cost trade-off analysis both at the system conceptual level (up-front in the design phase) and at the implementation phase, for verification, in three-dimensional integration. To validate the methodology, two ubiquitous electronic systems are analyzed under various implementation schemes, and the pros and cons of each scheme are discussed.
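    The early-phase trade-off analysis described above can be made concrete with a simple weighted scoring sketch. The candidate schemes, metric names, weights, and numbers below are hypothetical placeholders, not figures from the paper.

        from dataclasses import dataclass

        @dataclass
        class Scheme:
            name: str
            perf_gops: float   # delivered performance
            cost_usd: float    # estimated unit cost
            power_w: float     # estimated power budget

        def score(s, w_perf=0.5, w_cost=0.3, w_power=0.2, ref=None):
            """Higher is better: reward performance, penalize cost and power,
            all normalized against a reference scheme."""
            r = ref or s
            return (w_perf * s.perf_gops / r.perf_gops
                    - w_cost * s.cost_usd / r.cost_usd
                    - w_power * s.power_w / r.power_w)

        # Hypothetical candidates for illustration only.
        planar  = Scheme("2-D planar SoC", perf_gops=10, cost_usd=12.0, power_w=2.0)
        stacked = Scheme("3-D stacked",    perf_gops=14, cost_usd=15.0, power_w=1.6)

        for s in (planar, stacked):
            print(f"{s.name}: score = {score(s, ref=planar):.2f}")

    Scoring candidates against a common reference this way keeps the comparison unit-free, so the weights alone express the design priorities.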

    Limits on Fundamental Limits to Computation

    An indispensable part of our lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the last fifty years. Such Moore scaling now requires increasingly heroic efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and enrich our understanding of integrated-circuit scaling, we review fundamental limits to computation: in manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, we recall how some limits were circumvented and compare loose and tight limits. We also point out that engineering difficulties encountered by emerging technologies may indicate yet-unknown limits.
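    As one concrete instance of the energy limits this abstract alludes to, Landauer's bound is the textbook example (the paper covers many more); the short calculation below evaluates the minimum energy to erase one bit at room temperature, E = kT ln 2.

        import math

        k_B = 1.380649e-23   # Boltzmann constant, J/K (exact by SI definition)
        T   = 300.0          # room temperature, K

        # Landauer's bound: minimum energy dissipated to erase one bit.
        E_bit = k_B * T * math.log(2)
        print(f"Landauer limit at {T:.0f} K: {E_bit:.3e} J per bit")
        # ~2.87e-21 J, several orders of magnitude below the switching
        # energy of present-day CMOS logic.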

    Near-Memory Address Translation

    Memory and logic integration on the same chip is becoming increasingly cost effective, creating the opportunity to offload data-intensive functionality to processing units placed inside memory chips. The introduction of memory-side processing units (MPUs) into conventional systems faces virtual memory as the first big showstopper: without efficient hardware support for address translation, MPUs have highly limited applicability. Unfortunately, conventional translation mechanisms fall short of providing fast translations as contemporary memories exceed the reach of TLBs, making expensive page walks common. In this paper, we are the first to show that the historically important flexibility to map any virtual page to any page frame is unnecessary in today's servers. We find that limiting the associativity of the virtual-to-physical mapping incurs no penalty and, when combined with careful data placement in the MPU's memory, can break the translate-then-fetch serialization, allowing translation and data fetch to proceed independently and in parallel. We propose the Distributed Inverted Page Table (DIPTA), a near-memory structure in which the smallest memory partition keeps the translation information for its data share, ensuring that the translation completes together with the data fetch. DIPTA completely eliminates the performance overhead of translation, achieving speedups of up to 3.81x and 2.13x over conventional translation using 4KB and 1GB pages, respectively.
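    The core idea of limited associativity, where a virtual page may reside in only a few candidate frames determined by its virtual page number, can be sketched as below. The set/way arithmetic and sizes are illustrative assumptions, not the paper's actual DIPTA layout.

        PAGE_SIZE = 4096          # bytes, illustrative
        NUM_SETS  = 1 << 16       # physical frames grouped into sets
        WAYS      = 4             # associativity: candidate frames per virtual page

        def candidate_frames(vaddr):
            """With set-restricted (limited-associativity) placement, the virtual
            page number alone determines the handful of frames that may hold the
            page, so a speculative data fetch can start before translation finishes."""
            vpn = vaddr // PAGE_SIZE
            set_index = vpn % NUM_SETS
            return [set_index * WAYS + way for way in range(WAYS)]

        # Fetches for all candidate frames can be issued in parallel with the
        # lookup that resolves which way actually holds the page; once translation
        # completes, the result from the matching way is kept and the rest dropped.
        print(candidate_frames(0x7f3a2000))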

    A virtual environment for the design and simulated construction of prefabricated buildings

    The construction industry has acknowledged that its current working practices need substantial improvements in quality and efficiency, and has identified that computer modelling techniques and the use of prefabricated components can help reduce times and costs and minimise the defects and problems of on-site construction. This paper describes a virtual environment to support the design and construction of buildings from prefabricated components and the simulation of their construction sequence according to a project schedule. The design environment can import a library of 3-D models of prefabricated modules that can be used to interactively design a building. Using Microsoft Project, the construction schedule of the designed building can be altered, with this information feeding back to the construction simulation environment, within which the order of construction can be visualised using virtual machines. Novel aspects of the system are that it provides a single 3-D environment where the user can construct their design with minimal user interaction, through automatic constraint recognition, and view a real-time simulation of the construction process within the same environment. This takes this area of research a step forward from other systems, which only allow the planner to view the construction at certain stages and do not provide an animated view of the construction process.
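    A schedule-driven construction simulation of the kind described above reduces to a very small core: order the prefabricated components by their scheduled start times and place them one by one. The component names, dates, and plain-text "placement" below are illustrative stand-ins for the paper's 3-D environment and its Microsoft Project link.

        from dataclasses import dataclass
        from datetime import date

        @dataclass
        class Component:
            name: str
            start: date          # scheduled start, e.g. imported from a project plan
            position: tuple      # target (x, y, z) placement in the model

        def simulate_sequence(components):
            """Replay the construction in schedule order; a real system would
            animate each placement in the 3-D environment instead of printing."""
            for c in sorted(components, key=lambda c: c.start):
                print(f"{c.start}: place {c.name} at {c.position}")

        # Hypothetical prefabricated modules for illustration only.
        simulate_sequence([
            Component("wall panel A", date(2024, 3, 4), (0, 0, 0)),
            Component("floor cassette 1", date(2024, 3, 1), (0, 0, -1)),
            Component("roof module", date(2024, 3, 8), (0, 0, 3)),
        ])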