
    Towards Understanding Uncertainty in Cloud Computing Resource Provisioning

    In spite of extensive research on uncertainty in fields ranging from computational biology to decision making in economics, the study of uncertainty in cloud computing systems remains limited. Most existing work examines uncertainty in users' perceptions of the qualities, intentions, and actions of cloud providers, and in privacy, security, and availability. The role of uncertainty in resource and service provisioning, programming models, and related areas has not yet been adequately addressed in the scientific literature. There are numerous types of uncertainty associated with cloud computing, and one must account for them when assessing the efficiency of service provisioning. In this paper, we tackle the research question: what is the role of uncertainty in cloud computing service and resource provisioning? We review the main sources of uncertainty and fundamental approaches to scheduling under uncertainty, such as reactive, stochastic, fuzzy, and robust approaches. We also discuss the potential of these approaches for scheduling cloud computing activities under uncertainty, and address methods for mitigating job execution time uncertainty in resource provisioning.
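    The paper surveys approaches rather than prescribing a single algorithm, but the mitigation problem it closes on, job execution time uncertainty, can be illustrated with a minimal sketch. The padding heuristic below (provision for the mean observed runtime plus k standard deviations) and all names and numbers in it are illustrative assumptions, not a method taken from the paper.

```python
import statistics

def padded_estimate(samples, k=2.0):
    """Conservative runtime estimate: mean plus k standard deviations.

    A simple hedge against execution-time uncertainty; a larger k trades
    resource efficiency for a lower risk of deadline misses.
    """
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return mu + k * sigma

# Observed runtimes (seconds) of a recurring job on a cloud VM (toy data).
history = [112.0, 98.5, 131.2, 104.7, 119.9]
print(f"provision for ~{padded_estimate(history):.1f} s")
```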

    Large-scale Data Analysis and Deep Learning Using Distributed Cyberinfrastructures and High Performance Computing

    Data in many research fields continues to grow in both size and complexity. For instance, recent technological advances have greatly increased data throughput in various biology-related endeavors, such as DNA sequencing, molecular simulations, and medical imaging. In addition, the variety of data types (textual, signal, image, etc.) adds further complexity to analyzing the data. As such, there is a need for applications developed specifically for each type of data. Several considerations must be made when attempting to create a tool for a particular dataset. First, we must consider the type of algorithm required for analyzing the data. Next, since the size and complexity of the data impose high computation and memory requirements, it is important to select a proper hardware environment on which to build the application. By carefully developing the algorithm and selecting the hardware, we can provide an effective environment in which to analyze huge amounts of highly complex data at scale. In this dissertation, I describe my applications of big data and deep learning techniques to the analysis of complex and large data. I investigate how big data frameworks, such as Hadoop, can be applied to problems such as large-scale molecular dynamics simulations. Following this, many popular deep learning frameworks are evaluated and compared to find those that suit certain hardware setups and deep learning models. Then, we explore an application of deep learning to a biomedical problem, namely ADHD diagnosis from fMRI data. Lastly, I demonstrate a framework for real-time and fine-grained vehicle detection and classification. For each of these works, a unique large-scale analysis algorithm or deep learning model is implemented that caters to the problem and leverages specialized computing resources.
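    The dissertation abstract gives no code, but the Hadoop-on-molecular-dynamics idea can be sketched in the classic Hadoop Streaming style, in which the mapper and reducer are plain Python filters over standard input. The record layout (one "frame_id vx vy vz" line per atom per frame) and the per-frame mean-speed statistic below are illustrative assumptions, not the author's actual pipeline.

```python
#!/usr/bin/env python3
# mapper.py -- emit "frame_id <tab> speed" for each atom record.
import sys
import math

for line in sys.stdin:
    parts = line.split()
    if len(parts) != 4:
        continue  # skip malformed records
    frame, vx, vy, vz = parts[0], *map(float, parts[1:])
    print(f"{frame}\t{math.sqrt(vx * vx + vy * vy + vz * vz)}")
```

    Hadoop Streaming sorts the mapper output by key before it reaches the reducer, so all records for a frame arrive contiguously:

```python
#!/usr/bin/env python3
# reducer.py -- average the speeds for each frame (keys arrive sorted).
import sys

current, total, count = None, 0.0, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current:
        if current is not None:
            print(f"{current}\t{total / count:.3f}")
        current, total, count = key, 0.0, 0
    total += float(value)
    count += 1
if current is not None:
    print(f"{current}\t{total / count:.3f}")
```

    These would be wired together with an invocation along the lines of `hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input traj -output speeds` (paths and jar name illustrative).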

    Design of an E-learning system using semantic information and cloud computing technologies

    Humanity currently faces many difficult problems that threaten the life and survival of the human race, and all of mankind can easily be affected by them, directly or indirectly. Education is a key part of the solution to most of them. In this thesis we tried to make use of current technologies to enhance and ease the learning process. We have designed an e-learning system based on semantic information and cloud computing, in addition to many other technologies that contribute to improving the educational process and raising the level of students. The design was built after extensive research on useful technologies, their types, and examples of actual systems previously discussed by other researchers. In addition to the proposed design, an algorithm was implemented to identify the topics found in large textual educational resources. It was tested and proved to be efficient compared with other methods. The algorithm is able to extract the main topics from textual learning resources, link related resources, and generate interactive dynamic knowledge graphs, and it accomplishes these tasks accurately and efficiently even for larger books. We used Wikipedia Miner, TextRank, and Gensim within our algorithm, and evaluated its accuracy against Gensim, over which it shows a large improvement. Augmenting the system design with the implemented algorithm will provide many useful services for improving the learning process, such as: automatically identifying the main topics of large textual learning resources and connecting them to well-defined concepts from Wikipedia, enriching current learning resources with semantic information from external sources, providing students with browsable dynamic interactive knowledge graphs, and making use of learning groups to encourage students to share their learning experiences and feedback with other learners.
    Doctoral Programme in Telematics Engineering, Universidad Carlos III de Madrid. Committee: President, Luis Sánchez Fernández; Secretary, Luis de la Fuente Valentín; Member, Norberto Fernández Garcí
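    As a flavour of the topic-identification step, here is a minimal sketch using Gensim's LDA implementation, one of the tools the thesis names. The toy corpus, the number of topics, and all parameters are illustrative assumptions; the actual system also combines Wikipedia Miner and TextRank, which are not shown here.

```python
# Minimal topic identification with Gensim (pip install gensim).
from gensim import corpora, models

documents = [
    "cloud computing offers elastic virtual resources",
    "semantic web technologies annotate learning resources",
    "virtual machines run in the cloud data center",
    "students browse knowledge graphs of learning topics",
]
texts = [doc.lower().split() for doc in documents]

dictionary = corpora.Dictionary(texts)                 # token -> integer id
corpus = [dictionary.doc2bow(text) for text in texts]  # bag-of-words vectors

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary,
                      passes=10, random_state=1)
for topic_id, words in lda.print_topics(num_words=4):
    print(topic_id, words)
```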

    The price of a perfect system: learnability and the distribution of errors in the speech of children learning English as a first language

    This study reports on a strictly cognitive and symptomatic approach to the treatment of phonological disorders, working by an effect which can also be reproduced in most normally-developing children. To explain how this works, it is necessary to address certain asymmetries and singularities in the distribution of children's speech errors over the whole range of development. Particular words occasion particular errors. In early phonology there is 'fronting', with Coronal displacing Dorsal, and harmonies where Coronal is lost. In the middle of phonological acquisition the harmonic pattern changes, with coronal harmony coming to prevail over other forms. As well as these asymmetries, there is also the case of harmonic or migratory errors involving the property of affrication, but not the affricate as a whole, i.e. ignoring the property of voicing. Many of these asymmetries and singularities, and the harmony or movement of affrication, are described here for the first time. They are all difficult to explain in current theoretical models, especially in 'bottom-up' models. On the basis of the 'top-down' notion of 'parameters' from recent work in phonology, I shall assume that: A) finite learnability has to be ensured; B) there can be no privileged information about the learnability target; and C) phonological theory and the study of speech development (normal and otherwise) have an object in common. I shall propose: A) a Parameter Setting Function, as part of the human genome, possibly a defining part; B) 'Phonological Parapraxis', as a way of characterising the generalisations here about incompetent phonology by the general mechanisms of 'floating' and 'non-association'; C) a Stage (n-1) as a necessary construct in the theory of acquisition, typically not reached before 8;6; D) a 'Representability Inspection' relating normal competence to Chomsky's 'Articulatory/Perceptual interface', sensitive to a relation between featural properties such as roundness or labiality and prosodic properties such as the foot and syllable; E) a syndrome, Specific Speech and Language Impairment (SSLI), extending the notion of Specific Language Impairment (SLI). I shall hypothesise that: A) segmental and suprasegmental representations interact; B) the phonological learnability space is uniform and consistent; C) it is the very minimality of the learnability system which makes it vulnerable to SSLI. This: A) side-steps the implausible inference that development proceeds by the loss of 'processes'; B) accounts for at least some of the asymmetries noted above; C) lets parameters 'set' a degree of abstract exponence; D) makes it possible to abolish 'processes' such as fronting, lisping, and consonant harmony, in favour of successive degrees of imprecision in the parameterisation; E) provides a conceptual mechanism for the cognitive and symptomatic therapy mentioned above: the therapy effects an increase in the set of phonological structures which are 'representable' by the child.

    Semantic discovery and reuse of business process patterns

    Patterns currently play an important role in modern information systems (IS) development, although their use has mainly been restricted to the design and implementation phases of the development lifecycle. Given the increasing significance of business modelling in IS development, patterns have the potential to provide a viable means of promoting the reusability of recurrent generalized models in the very early stages of development. As a statement of research in progress, this paper focuses on business process patterns and proposes an initial methodological framework for the discovery and reuse of business process patterns within the IS development lifecycle. The framework borrows ideas from the domain engineering literature and proposes the use of semantics to drive both the discovery of patterns and their reuse.

    Event Discovery and Classification in Space-Time Series: A Case Study for Storms

    Recent advances in sensor technology have enabled the deployment of wireless sensors for the surveillance and monitoring of phenomena in diverse domains such as the environment and health. Data generated by these sensors are typically high-dimensional and therefore difficult to analyze and comprehend. Additionally, high-level phenomena that humans commonly recognize, such as storms, fires, and traffic jams, are often complex and multivariate, and individual univariate sensors are incapable of detecting them. This thesis describes the Event Oriented approach, which addresses these challenges by providing a way to reduce the dimensionality of space-time series and a way to integrate multivariate data over space and/or time for the purpose of detecting and exploring high-level events. The proposed Event Oriented approach is implemented using space-time series data from the Gulf of Maine Ocean Observation System (GOMOOS). GOMOOS is a long-standing network of wireless sensors in the Gulf of Maine monitoring the high-energy ocean environment. As a case study, high-level storm events are detected and classified using the Event Oriented approach. A domain-independent ontology for detecting high-level composite events, called a General Composite Event Ontology, is presented and used as the basis of the Storm Event Ontology. Primitive events are detected from univariate sensors and assembled into Composite Storm Events using the Storm Event Ontology. To evaluate the effectiveness of the Event Oriented approach, the resulting candidate storm events were compared with an independent historic Storm Events Database from the National Climatic Data Center (NCDC), indicating that the Event Oriented approach detected about 92% of the storms recorded by the NCDC. The Event Oriented approach also facilitates the classification of high-level composite events: in the case study, candidate storms were classified based on their spatial progression and profile. Since ontological knowledge is used for constructing the high-level event ontology, the detection of candidate high-level events could help refine existing ontological knowledge about them. In summary, this thesis demonstrates the Event Oriented approach as a way to reduce dimensionality in complex space-time series sensor data and to integrate time series data over space for detecting high-level phenomena.
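    The ontology itself is not reproduced in the abstract, but the two-level idea (primitive events detected from univariate sensors, then composed into high-level events) can be sketched compactly. The sensor names, thresholds, and composition rule below are illustrative assumptions, not GOMOOS values or the thesis's actual Storm Event Ontology.

```python
# Toy Event Oriented sketch: threshold univariate streams into primitive
# events, then assemble a composite "storm" where they co-occur in time.
from dataclasses import dataclass

@dataclass
class Primitive:
    sensor: str
    t: int  # time step

def detect_primitives(series, sensor, threshold):
    """Threshold a univariate series into primitive events."""
    return [Primitive(sensor, t) for t, v in enumerate(series) if v > threshold]

def detect_storms(wind, wave, window=1):
    """Composite rule: high wind and high wave within `window` time steps."""
    wind_ev = detect_primitives(wind, "wind_speed", threshold=20.0)  # m/s
    wave_ev = detect_primitives(wave, "wave_height", threshold=4.0)  # m
    wave_times = {e.t for e in wave_ev}
    return sorted({w.t for w in wind_ev
                   if any(abs(w.t - t) <= window for t in wave_times)})

wind = [5, 8, 22, 25, 24, 9, 6]
wave = [1.0, 1.5, 3.8, 4.5, 4.9, 2.0, 1.2]
print("candidate storm time steps:", detect_storms(wind, wave))  # [2, 3, 4]
```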

    Big Data Security (Volume 3)

    After a short description of the key concepts of big data, the book explores the secrecy and security threats posed especially by cloud-based data storage. It delivers conceptual frameworks and models, along with case studies of recent technology.

    Ice Crystal Classification Using Two Dimensional Light Scattering Patterns

    An investigation is presented into methods of characterising cirrus ice crystals from in-situ light scattering data. A database of scattering patterns from modelled crystals was created using the Ray Tracing with Diffraction on Facets (RTDF) model from the University of Hertfordshire, to which experimental and modelled data were fitted. Experimental data were gathered in the form of scattering patterns from ice analogue crystals with optical properties and hexagonal symmetry similar to ice, yet stable at room temperature. A laboratory rig is described which images scattering patterns from single particles while allowing precise control over the orientation of the particle with respect to the incident beam. Images of scattering patterns were captured and compared to patterns from modelled crystals with similar geometry. Methods for introducing particles en masse and individually to the Small Ice Detector (SID) instruments are discussed, with particular emphasis on the calibration of the gain of the SID-2 instrument. The variation in gain between detector elements is found to be significant, variable over the life of the detector, and different for different detectors. Fitting was performed by comparing test scattering patterns (either modelled or experimental) to the reference database. Representations of the two-dimensional scattering patterns by asymmetry factor, moment invariants, azimuthal intensity patterns (AIP), and the Fourier transform of the AIP are compared for fitting accuracy. Direct comparison of the AIP is found to be the most accurate method, and increased resolution of the AIP is shown to improve the fitting substantially. Case studies are presented for the fitting of two ice analogue crystals to the modelled database. Fitting accuracy is found to be negatively influenced by small amounts of surface roughness and detail not currently considered by the RTDF model. Fitting of in-situ data gathered by the SID-3 instrument during the HALO 02 campaign at the AIDA cloud chamber in Germany is presented and discussed. Saturation of detector pixels is shown to affect pattern fitting. In-flight operation of the instrument involves varying the gain of the whole detector (as opposed to individual elements) in order to obtain unsaturated images of both large and small particles.
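    As an illustration of what direct comparison of the AIP involves, the numpy sketch below reduces a two-dimensional scattering image to an azimuthal intensity pattern (mean intensity per azimuth bin about the image centre) and matches it against a reference database by smallest squared distance. The bin count and the random toy database are assumptions; in the thesis the references are RTDF-modelled patterns.

```python
import numpy as np

def azimuthal_intensity(image, n_bins=64):
    """Mean intensity in each azimuth bin about the image centre, normalised."""
    h, w = image.shape
    y, x = np.mgrid[0:h, 0:w]
    theta = np.arctan2(y - h / 2.0, x - w / 2.0)              # -pi .. pi
    bins = ((theta + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    aip = np.array([image[bins == b].mean() for b in range(n_bins)])
    return aip / aip.max()

def best_match(test_aip, database):
    """Key of the reference AIP closest to test_aip in least-squares terms."""
    return min(database, key=lambda k: np.sum((database[k] - test_aip) ** 2))

# Toy usage: a noisy copy of one reference should match itself.
rng = np.random.default_rng(0)
refs = {"column": rng.random((64, 64)), "rosette": rng.random((64, 64))}
database = {name: azimuthal_intensity(img) for name, img in refs.items()}
test = azimuthal_intensity(refs["column"] + 0.05 * rng.random((64, 64)))
print(best_match(test, database))  # expected: "column"
```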

    Virtual Urbanity: A parametric tool for the generation of virtual cities

    What are the underlying rules that govern urban growth and the structure of the street network? What are the distinctive characteristics that define highways and differentiate the various street patterns? How can we combine this information and incorporate it into a computer-aided urban simulation in order to model a virtual city in which people will be able to successfully orientate and navigate? This research investigates these questions and proposes the development of a parametrically adjustable computer program for conducting navigational and way-finding experiments. Virtual Urbanity is a simulation engine capable of procedurally generating a vast and diverse variety of virtual 3D urban configurations. It uses an operational grammar consisting of a local generative process based on a Lindenmayer system and a prescriptive set of global parametric rules. This combination defines the topology, geometry, width, length, density, and spatial significance of the streets, ultimately establishing an effective street hierarchy. The program supports the methodological exploration of existing and theoretical urban configurations and the analysis of human perception of the structure of the built environment, and builds towards a working algorithm (rule set) for the on-the-fly generation of city structures in next-generation video games. Accordingly, a trial experiment on the mental correlation between road width and a road's hierarchical significance and function within the street network was conducted, and its findings are discussed.
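    The generative core described above can be illustrated with a very small Lindenmayer system in which global parameters (here just a branch angle and a segment length) constrain the turtle-style interpretation of the rewritten string into street segments. The grammar below is a toy assumption, not Virtual Urbanity's actual rule set.

```python
import math

RULES = {"F": "F[+F]F[-F]"}  # grow a segment, branch left and right

def rewrite(axiom, rules, iterations):
    """Apply the L-system production rules `iterations` times."""
    for _ in range(iterations):
        axiom = "".join(rules.get(c, c) for c in axiom)
    return axiom

def interpret(commands, angle=90.0, step=10.0):
    """Turtle interpretation: turn the string into street segments."""
    x, y, heading = 0.0, 0.0, 90.0
    stack, segments = [], []
    for c in commands:
        if c == "F":                      # draw a street segment
            nx = x + step * math.cos(math.radians(heading))
            ny = y + step * math.sin(math.radians(heading))
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif c == "+":                    # turn left
            heading += angle
        elif c == "-":                    # turn right
            heading -= angle
        elif c == "[":                    # remember position at a junction
            stack.append((x, y, heading))
        elif c == "]":                    # return to the junction
            x, y, heading = stack.pop()
    return segments

print(len(interpret(rewrite("F", RULES, 3))), "street segments")  # 64
```

    Changing the angle, step, or production rules plays the role of the global parametric rules: the same local rewriting can yield grid-like, radial, or more organic street patterns.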

    12th SC@RUG 2015 proceedings: Student Colloquium 2014-2015
