
    Challenges in real-time virtualization and predictable cloud computing

    Cloud computing and virtualization technology have revolutionized general-purpose computing applications in the past decade. The cloud paradigm offers advantages through reduced operating costs, server consolidation, flexible system configuration and elastic resource provisioning. However, despite the success of cloud computing for general-purpose computing, existing cloud computing and virtualization technology face tremendous challenges in supporting emerging soft real-time applications such as online video streaming, cloud-based gaming, and telecommunication management. These applications demand real-time performance in open, shared and virtualized computing environments. This paper identifies the technical challenges in supporting real-time applications in the cloud, surveys recent advances in real-time virtualization and cloud computing technology, and offers research directions to enable cloud-based real-time applications in the future.

    Computing Safe Contention Bounds for Multicore Resources with Round-Robin and FIFO Arbitration

    Numerous researchers have studied the contention that arises among tasks running in parallel on a multicore processor. Most of those studies seek to derive a tight and sound upper bound for the worst-case delay with which a processor resource may serve an incoming request when its access is arbitrated using time-predictable policies such as round-robin or FIFO. We call this value the upper-bound delay (ubd). Deriving a trustworthy ubd statically is possible when sufficient public information exists on the timing latency incurred on access to the resource of interest. Unfortunately, that is rarely the case for commercial off-the-shelf (COTS) processors. Users therefore resort to measurement observations on the target processor and compute a "measured" ubdm. However, using ubdm to compute worst-case execution time values for programs running on COTS multicore processors requires qualification of the soundness of the result. In this paper, we present a measurement-based methodology to derive a ubdm under round-robin (RoRo) and first-in-first-out (FIFO) arbitration, which accurately approximates ubd from above, without needing latency information from the hardware provider. Experimental results, obtained on multiple processor configurations, demonstrate the robustness of the proposed methodology. The research leading to this work has received funding from: the European Union's Horizon 2020 research and innovation programme under grant agreement No 644080 (SAFURE); the European Space Agency under Contract 789.2013 and NPI Contract 40001102880; and COST Action IC1202, Timing Analysis On Code-Level (TACLe). This work has also been partially supported by the Spanish Ministry of Science and Innovation under grant TIN2015-65316-P. Jaume Abella has been partially supported by the MINECO under Ramon y Cajal postdoctoral fellowship number RYC-2013-14717. The authors would like to thank Paul Caheny for his help with the proofreading of this document.
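
    As a rough illustration of how a measured bound might be padded in practice, the sketch below applies a simple round-robin worst-case argument to observed latencies. It is a generic sketch, not the methodology evaluated in the paper; the function name, sample values and safety margin are hypothetical.

```python
# Minimal sketch (not the paper's methodology): pad a measured upper-bound
# delay (ubdm) for a shared multicore resource under round-robin arbitration,
# starting from access latencies observed while co-runners saturate the
# resource. The safety margin and example values are illustrative assumptions.

def measured_ubd(latency_samples_ns, num_contenders, margin=1.10):
    """Return a padded measured upper-bound delay (ubdm) in nanoseconds.

    latency_samples_ns : observed per-access latencies under heavy contention
    num_contenders     : cores that can issue competing requests (including ours)
    margin             : multiplicative safety margin on the round-robin bound
    """
    worst_observed = max(latency_samples_ns)
    # Under round-robin, a request waits for at most one in-flight access from
    # each other contender before its own access is served.
    rr_bound = (num_contenders - 1) * worst_observed + worst_observed
    return rr_bound * margin

# Example: latencies sampled on a quad-core sharing one bus (values invented).
print(measured_ubd([42, 47, 51, 44, 49], num_contenders=4))
```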

    A Double-Deep Spatio-Angular Learning Framework for Light Field based Face Recognition

    Face recognition has attracted increasing attention due to its wide range of applications, but it remains challenging in the presence of large variations in the biometric data characteristics. Lenslet light field cameras have recently come into prominence for capturing rich spatio-angular information, thus offering new possibilities for advanced biometric recognition systems. This paper proposes a double-deep spatio-angular learning framework for light field based face recognition, which is able to learn both texture and angular dynamics in sequence using convolutional representations; this is a novel recognition framework that has not previously been proposed for face recognition or any other visual recognition task. The proposed double-deep learning framework includes a long short-term memory (LSTM) recurrent network whose inputs are VGG-Face descriptions computed using a VGG-Very-Deep-16 convolutional neural network (CNN). The VGG-16 network uses different face viewpoints rendered from a full light field image, which are organised as a pseudo-video sequence. A comprehensive set of experiments has been conducted with the IST-EURECOM light field face database, for varied and challenging recognition tasks. Results show that the proposed framework achieves superior face recognition performance when compared to the state-of-the-art. Comment: Submitted to IEEE Transactions on Circuits and Systems for Video Technology.
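
    A hedged sketch of the general CNN-plus-LSTM pattern described above, written in PyTorch. The class name, hidden size, pooling choice and use of a stock VGG-16 backbone are illustrative assumptions, not the authors' exact architecture; in particular, VGG-Face weights would have to be loaded separately.

```python
# Illustrative sketch only (hypothetical layer sizes, not the authors' model):
# extract a per-viewpoint descriptor with a VGG-16 backbone, then feed the
# descriptor sequence (viewpoints as a pseudo-video) to an LSTM classifier.
import torch
import torch.nn as nn
from torchvision.models import vgg16

class SpatioAngularNet(nn.Module):
    def __init__(self, num_identities, hidden=256):
        super().__init__()
        backbone = vgg16(weights=None)          # VGG-Face weights would be loaded here
        self.features = backbone.features       # convolutional "spatial" stage
        self.pool = nn.AdaptiveAvgPool2d((7, 7))
        # Keep the 4096-d fully connected descriptor, drop the final 1000-way layer.
        self.descriptor = nn.Sequential(*list(backbone.classifier.children())[:-1])
        self.lstm = nn.LSTM(input_size=4096, hidden_size=hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, num_identities)

    def forward(self, views):                   # views: (batch, num_views, 3, 224, 224)
        b, v = views.shape[:2]
        x = views.flatten(0, 1)                 # treat each viewpoint as an image
        x = self.pool(self.features(x)).flatten(1)
        x = self.descriptor(x).view(b, v, -1)   # per-view 4096-d descriptors
        out, _ = self.lstm(x)                   # "angular" stage over the view sequence
        return self.classifier(out[:, -1])      # classify from the last time step
```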

    Hydrodynamics-Biology Coupling for Algae Culture and Biofuel Production

    Biofuel production from microalgae represents an acute optimization problem for industry. A wide range of parameters must be taken into account in the development of this technology, and mathematical modelling has a vital role to play here. The potential of microalgae as a source of biofuel and as a technological solution for CO2 fixation is the subject of intense academic and industrial research. Large-scale production of microalgae has potential for biofuel applications owing to the high productivity that can be attained in high-rate raceway ponds. We show, through 3D numerical simulations, that our approach is capable of discriminating between situations where the paddle wheel moves the water rapidly and situations where it agitates the process only slowly. Moreover, the simulated velocity fields can provide Lagrangian trajectories of the algae. The resulting light pattern to which each cell is subjected when travelling from light (surface) to dark (bottom) can then be derived. It will then be reproduced in lab experiments to study photosynthesis under realistic light patterns.
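
    The step from a Lagrangian depth trajectory to the light signal a cell experiences can be pictured with a small sketch. The Beer-Lambert attenuation law, the coefficient values and the toy trajectory below are assumptions for illustration, not the paper's hydrodynamic model.

```python
# Illustrative sketch (assumed Beer-Lambert attenuation, not the paper's solver):
# convert a simulated Lagrangian depth trajectory of an algal cell into the
# light signal it experiences, which could then drive a lab photosynthesis rig.
import numpy as np

def light_history(depths_m, surface_irradiance=1500.0, attenuation=120.0):
    """Irradiance (µmol photons m^-2 s^-1) seen along a depth trajectory.

    depths_m           : array of cell depths over time (m, positive downward)
    surface_irradiance : irradiance at the pond surface (assumed value)
    attenuation        : effective extinction coefficient (m^-1, assumed value)
    """
    return surface_irradiance * np.exp(-attenuation * np.asarray(depths_m))

# Example: a cell cycling between the lit surface and the dark bottom of a
# 0.3 m raceway every 20 s (toy trajectory in place of simulated output).
t = np.linspace(0.0, 60.0, 601)
depths = 0.15 * (1.0 - np.cos(2.0 * np.pi * t / 20.0))   # 0 m .. 0.3 m
irradiance = light_history(depths)
```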

    Game theory framework for MAC parameter optimization in energy-delay constrained sensor networks

    Optimizing energy consumption and end-to-end (e2e) packet delay in energy-constrained, delay-sensitive wireless sensor networks is a conflicting multiobjective optimization problem. We investigate the problem from a game theory perspective, where the two optimization objectives are considered as game players. The cost model of each player is mapped through a generalized optimization framework onto protocol-specific MAC parameters. From the optimization framework, a game is first defined by the Nash bargaining solution (NBS) to balance energy consumption and e2e delay. Secondly, the Kalai-Smorodinsky bargaining solution (KSBS) is used to find an equal proportion of gain between players. Both methods offer a bargaining solution to the duty-cycle MAC protocol under different axioms. As a result, given the two performance requirements (i.e., the maximum latency tolerated by the application and the initial energy budget of nodes), the proposed framework allows tunable system parameters to be set so as to reach a fair equilibrium point that jointly minimizes system latency and energy consumption. For illustration, this formulation is applied to six state-of-the-art wireless sensor network (WSN) MAC protocols: B-MAC, X-MAC, RI-MAC, SMAC, DMAC, and LMAC. The article shows the effectiveness and scalability of such a framework in optimizing protocol parameters to achieve a fair energy-delay performance trade-off under the application requirements.
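
    To make the Nash bargaining step concrete, a minimal sketch follows. The cost models, parameter names and toy numbers are hypothetical stand-ins for the protocol-specific MAC mappings of the framework; the disagreement points play the role of the application's delay bound and the nodes' energy budget.

```python
# Minimal sketch (hypothetical cost models, not a specific MAC protocol):
# pick a duty-cycle parameter by the Nash bargaining solution (NBS), i.e.
# maximize the product of the players' gains over the disagreement point.
import numpy as np

def nbs_duty_cycle(duty_cycles, energy_cost, delay_cost,
                   energy_budget, max_delay):
    """Return the candidate duty cycle maximizing the Nash product.

    duty_cycles   : candidate duty-cycle values (array)
    energy_cost   : f(duty_cycle) -> energy consumption
    delay_cost    : f(duty_cycle) -> end-to-end delay
    energy_budget : disagreement point of the energy player
    max_delay     : disagreement point of the delay player
    """
    energy_gain = energy_budget - energy_cost(duty_cycles)
    delay_gain = max_delay - delay_cost(duty_cycles)
    feasible = (energy_gain > 0) & (delay_gain > 0)
    nash_product = np.where(feasible, energy_gain * delay_gain, -np.inf)
    return duty_cycles[np.argmax(nash_product)]

# Example with toy models: a longer active period inflates energy use but
# shortens delay, and vice versa.
d = np.linspace(0.01, 0.5, 500)
best = nbs_duty_cycle(d, lambda x: 10 * x, lambda x: 0.2 / x,
                      energy_budget=6.0, max_delay=5.0)
```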

    Communication Range Dynamics and Performance Analysis for a Self-Adaptive Transmission Power Controller

    The deployment of the nodes in a Wireless Sensor and Actuator Network (WSAN) is typically restricted by the sensing and acting coverage. This implies that the locations of the nodes may be, and usually are, not optimal from the point of view of radio communication. Additionally, when the transmission power is tuned for those locations, other unpredictable factors can cause connectivity failures, such as interference, signal fading due to passing objects and, of course, radio irregularities. A control-based self-adaptive system is a typical solution for reducing energy consumption while keeping good connectivity. In this paper, we explore how the communication range of each node evolves over the iterations of an energy-saving self-adaptive transmission power controller when using different parameter sets in an outdoor scenario, providing a WSAN that automatically adapts to surrounding changes while keeping good connectivity. The results show that the best-performing parameter sets maintain a k-connected network, where k lies within the desired node degree plus or minus a specified tolerance value.
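
    A minimal sketch of the kind of degree-tracking power control loop described above is given below. The thresholds, step size and power limits are illustrative assumptions, not the controller evaluated in the paper.

```python
# Illustrative sketch (not the evaluated controller): adapt a node's
# transmission power each iteration so that the observed neighbour count
# stays within `tolerance` of the desired node degree.
def adapt_tx_power(tx_power_dbm, observed_degree, desired_degree,
                   tolerance=1, step_dbm=1.0,
                   min_dbm=-25.0, max_dbm=0.0):
    """One control iteration: return the next transmission power (dBm)."""
    if observed_degree < desired_degree - tolerance:
        tx_power_dbm += step_dbm      # too few neighbours: extend the range
    elif observed_degree > desired_degree + tolerance:
        tx_power_dbm -= step_dbm      # too many neighbours: save energy
    return min(max(tx_power_dbm, min_dbm), max_dbm)

# Example iteration: a node hearing 3 neighbours while targeting a degree of 5.
next_power = adapt_tx_power(-10.0, observed_degree=3, desired_degree=5)
```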