
    Exploiting user provided information in dynamic consolidation of virtual machines to minimize energy consumption of cloud data centers

    Dynamic consolidation of Virtual Machines (VMs) can effectively enhance the resource utilization and energy efficiency of Cloud Data Centers (CDCs). Existing research on Cloud resource reservation and scheduling indicates that Cloud Service Users (CSUs) can play a crucial role in improving resource utilization by providing valuable information to Cloud service providers. However, using CSU-provided information to minimize the energy consumption of a CDC is a novel research direction. The challenges herein are twofold: first, identifying the right benign information to obtain from a CSU that can complement the energy efficiency of the CDC; second, applying such information intelligently to significantly reduce the CDC's energy consumption. To address these challenges, we propose a novel heuristic Dynamic VM Consolidation (DVMC) algorithm, RTDVMC, which minimizes the energy consumption of the CDC by exploiting CSU-provided information. Our research demonstrates that if VMs are dynamically consolidated based on the time at which a VM can be removed from the CDC, a useful piece of information obtainable from the respective CSU, then more physical machines can be switched to sleep state, yielding lower energy consumption. We simulated the performance of RTDVMC with real Cloud workload traces originating from more than 800 PlanetLab VMs. The empirical results affirm the superiority of RTDVMC over existing prominent Static and Adaptive Threshold-based DVMC algorithms.
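
    The release-time idea lends itself to a simple greedy heuristic. The Python sketch below illustrates that idea only; the data structures, the single-resource capacity model, and all names (VM, Host, consolidate) are illustrative assumptions, not the paper's actual RTDVMC implementation.

# Illustrative sketch only (not the paper's RTDVMC code): greedily vacate hosts
# whose VMs all release soon, using the CSU-provided release time, so the
# emptied hosts can be switched to sleep state.
from dataclasses import dataclass, field

@dataclass
class VM:
    vm_id: str
    cpu_demand: float     # fraction of one host's CPU capacity (assumed model)
    release_time: float   # CSU-provided time at which the VM leaves the CDC

@dataclass
class Host:
    host_id: str
    capacity: float = 1.0
    vms: list = field(default_factory=list)

    def load(self) -> float:
        return sum(vm.cpu_demand for vm in self.vms)

    def latest_release(self) -> float:
        return max((vm.release_time for vm in self.vms), default=0.0)

def consolidate(hosts):
    """One greedy pass: move VMs off hosts whose workload ends earliest,
    packing them onto hosts that will stay active the longest."""
    migrations = []
    sources = sorted((h for h in hosts if h.vms), key=Host.latest_release)
    for src in sources:
        targets = sorted((h for h in hosts if h is not src and h.vms),
                         key=Host.latest_release, reverse=True)
        for vm in list(src.vms):
            for dst in targets:
                if dst.load() + vm.cpu_demand <= dst.capacity:
                    src.vms.remove(vm)
                    dst.vms.append(vm)
                    migrations.append((vm.vm_id, src.host_id, dst.host_id))
                    break
    sleeping = [h.host_id for h in hosts if not h.vms]
    return migrations, sleeping

    A host left empty by such a pass is a candidate for sleep state, which is where the energy saving comes from.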

    Directional filtering in edge detection


    Attention shift impairments and novelty avoidance: effects of characteristics of autism on the self-organization of an artificial neural network

    We discuss the application of Artificial Neural Networks (ANNs) to the simulation of attention shift impairments and novelty avoidance, common deficits in autism. It has been theorized that cortical feature maps in individuals with autism are inadequate for forming abstract codes and representations, explaining the attention paid to detail rather than to salient features. ANNs known as Self-Organizing Maps (SOMs) offer insights into the development of cortical feature maps. We present results of the formation of SOMs in response to stimuli from two sources in four modes, namely: novelty seeking (normal learning), attention shift impairment, novelty avoidance, and novelty avoidance in conjunction with attention shift impairment. The SOMs resulting from learning with novelty seeking and with attention shift impairment were, perhaps surprisingly, identical. In the case of learning with novelty avoidance, the resulting SOMs were adapted to one of the sources at the expense of the other. The SOMs resulting from learning with novelty avoidance in conjunction with attention shift impairment were strikingly different, ranging from almost normal to poor from one simulation to the next, even with identical initial conditions. Such learning, in many different maps, would result in very uneven capacities, as is common in individuals with autism.
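
    The learning modes can be illustrated with a small Kohonen SOM whose training alternates between two stimulus sources. The sketch below is an assumed reconstruction, not the authors' model: the parameter p_shift stands in for attention shift impairment, and the familiarity test (accept a shift only if the map already represents the new source at least as well) stands in for novelty avoidance.

# Illustrative sketch (assumed details, not the authors' exact model): a 1-D
# Kohonen SOM trained on stimuli from two sources, where a shift probability
# models attention shifting and a familiarity test models novelty avoidance.
import numpy as np

rng = np.random.default_rng(0)

def train_som(sources, n_units=20, dim=4, steps=5000,
              p_shift=1.0, avoid_novelty=False):
    """sources: list of callables returning a stimulus vector of length `dim`."""
    weights = rng.random((n_units, dim))
    current = 0
    for t in range(steps):
        # Attention shift: with probability p_shift, consider another source.
        if rng.random() < p_shift:
            candidate = int(rng.integers(len(sources)))
            if avoid_novelty:
                # Familiarity filter: accept the shift only if the map already
                # represents the candidate source at least as well.
                err_c = np.min(np.linalg.norm(weights - sources[candidate](), axis=1))
                err_f = np.min(np.linalg.norm(weights - sources[current](), axis=1))
                if err_c <= err_f:
                    current = candidate
            else:
                current = candidate
        x = sources[current]()
        # Standard SOM update: move the winner and its neighbours toward x.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        lr = 0.5 * (1 - t / steps)
        sigma = max(1.0, n_units / 2 * (1 - t / steps))
        dist = np.abs(np.arange(n_units) - bmu)
        h = np.exp(-(dist ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

# Two toy sources with different variability.
src_a = lambda: rng.normal(0.2, 0.05, 4)
src_b = lambda: rng.normal(0.8, 0.20, 4)
som = train_som([src_a, src_b], p_shift=0.1, avoid_novelty=True)

    In this sketch, p_shift=1.0 with avoid_novelty=False corresponds to normal novelty seeking, lowering p_shift corresponds to attention shift impairment, and avoid_novelty=True adds the familiarity preference.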

    Preoccupation with a restricted pattern of interest in modelling autistic learning

    Autism is a developmental disorder in which attention shifting is known to be restricted. Self-organization of neural networks, conditioned by different attention shifting characteristics, is investigated for higher-dimensional stimuli presented to the network from different sources. The attention shifting modes are 1) novelty seeking, 2) attention shift impairment (attention is shifted, but with a low probability), and 3) familiarity preference, in which attention is shifted with a preference for a source that has become familiar to the map. The feature maps resulting from self-organization are much the same for modes 1 and 2, but distinctly different for mode 3, where the maps learn the stimuli from the source with the lowest variability in great detail, at the expense of the other source(s). Detailed learning in narrow fields is a known characteristic of autism.

    An attempt in modelling early intervention in autism using neural networks

    We present a solution to the problem of early intervention in autistic learning. This is an addition to our model of autism, which is based on Kohonen self-organizing maps extended with a source familiarity filter and an attention shift mechanism. In particular, we study feature map formation when attention shifting is restricted by familiarity preference. The network learns the stimuli from the source with the lowest variability in great detail, at the expense of the other source. The early intervention neural controller modifies the probability of presenting stimuli from a given source in response to the attention shift acceptance/rejection signals.
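
    The controller idea can be sketched as a small probability-adjustment loop. The class below is an assumption for illustration (the name and the additive update rule are not from the paper): whenever a shift toward a source is rejected, that source's presentation probability is increased so the neglected source is offered more often.

# Assumed sketch of the early-intervention idea (not the authors' controller):
# boost a source's presentation probability whenever an attention shift
# toward it is rejected, then renormalise.
import numpy as np

class InterventionController:
    def __init__(self, n_sources, step=0.05):
        self.p = np.full(n_sources, 1.0 / n_sources)  # presentation probabilities
        self.step = step

    def choose_source(self, rng):
        # Pick which source to present next, according to current probabilities.
        return int(rng.choice(len(self.p), p=self.p))

    def feedback(self, source, accepted):
        # A rejected shift toward `source` means the map is avoiding it.
        if not accepted:
            self.p[source] += self.step
            self.p /= self.p.sum()

# Example: the map keeps rejecting shifts toward source 1.
rng = np.random.default_rng(0)
ctrl = InterventionController(n_sources=2)
for _ in range(10):
    ctrl.feedback(source=1, accepted=False)
print(ctrl.p)  # probability of presenting source 1 has grown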