
    Implementing a dynamic scaling of web applications in a virtualized cloud computing environment

    Cloud computing is becoming more essential day by day. The allure of the cloud lies in the significant value and benefits it provides, such as reduced costs, increased storage, flexibility, and greater mobility. Flexibility is one of the major benefits of cloud computing, in terms of scaling a network's infrastructure up and down. Once traffic increases on one server within the network, a load balancer instance routes incoming requests to a healthy instance that is less busy and less burdened. When the full complement of instances cannot handle any more requests, more capacity is needed: past research by Chieu et al. presented a scaling algorithm for dynamic scalability of web applications in a virtualized cloud computing environment, using relevant indicators to increase or decrease the number of servers as needed. In this project, I implemented the proposed algorithm, but based it on a CPU-utilization threshold. In addition, two tests were run exploring the capabilities of different metrics under ideal and challenging conditions. The results identified a superior metric that performed successfully under both tests.
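
    As a hedged illustration of the CPU-utilization variant implemented in this project, the sketch below shows a minimal threshold-based scaling loop. The helper callables (get_average_cpu, add_instance, remove_instance, count_instances) and the threshold values are assumptions standing in for a real cloud provider API, not details from the project or from Chieu et al.

        import time

        # Assumed thresholds; the abstract does not specify exact values.
        SCALE_UP_CPU = 80.0    # percent: add an instance above this
        SCALE_DOWN_CPU = 20.0  # percent: remove an instance below this
        MIN_INSTANCES = 1

        def autoscale_loop(get_average_cpu, add_instance, remove_instance,
                           count_instances, interval=60):
            """Poll average CPU utilization and scale the instance pool.

            All four callables are placeholders assumed to be provided by
            the surrounding platform, not a real provider API.
            """
            while True:
                cpu = get_average_cpu()  # mean CPU % across healthy instances
                n = count_instances()
                if cpu > SCALE_UP_CPU:
                    add_instance()       # provision one more VM behind the balancer
                elif cpu < SCALE_DOWN_CPU and n > MIN_INSTANCES:
                    remove_instance()    # drain and terminate one idle VM
                time.sleep(interval)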

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on the 25th of October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    Availability in mobile application in IaaS cloud

    Deploying a software system into an IaaS cloud takes the infrastructure out of the user's control, which diminishes visibility and changes system administration. Service outages of infrastructure services and other risks to availability have caused concern among early users of the cloud. In this thesis, an existing web application deployed in an IaaS cloud was evaluated for availability. The whole spectrum of cloud-related incidents that compromise the provided service was examined. A general view of the availability of the case Internet service was formed based on interviews. Large cloud service providers have effective service-level agreements, and long cloud outages are rare events. Cloud service providers build mutually independent domains or zones into their infrastructure. Internet availability largely determines users' perceived performance of a site. Using multiple cloud service providers is a solution to cloud service unavailability. The case company had identified its availability requirements and sufficiently mitigated the threats. The case company was satisfied with cloud services, and there is no need to withdraw from the cloud. The user is a significant threat to the dependability of the system, but there are no definite means to prevent the user from damaging it. Routinely and regularly taking backups of data outside the cloud is the core activity of IT crisis preparedness. The application architecture was evaluated and found satisfactory. The software system uses a managed database service and a load balancer as advanced features from the IaaS provider; both services give crucial support to the availability of the system. The examined system has conceptually simple, stateless recovery.
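
    As a hedged illustration of the routine off-cloud backup practice the thesis identifies as the core of IT crisis preparedness, the sketch below dumps a PostgreSQL database and copies the archive to a host outside the cloud. The database name, paths, and destination host are assumptions; the thesis does not describe a specific backup mechanism.

        import subprocess
        from datetime import date

        # Assumed names; replace with the real database and off-cloud host.
        DB_NAME = "webapp"
        BACKUP_FILE = f"/var/backups/{DB_NAME}-{date.today()}.dump"
        OFFSITE = "backup@offsite.example.com:/srv/backups/"

        def backup_outside_cloud():
            # Dump the database in PostgreSQL's compressed custom format.
            subprocess.run(["pg_dump", "-Fc", DB_NAME, "-f", BACKUP_FILE],
                           check=True)
            # Copy the dump to a host outside the cloud provider.
            subprocess.run(["rsync", "-av", BACKUP_FILE, OFFSITE], check=True)

        if __name__ == "__main__":
            backup_outside_cloud()  # run daily, e.g. from cron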

    The Politics of Exhaustion: Immigration Control in the British-French Border Zone

    Within a climate of growing anti-immigration and populist forces gaining traction across Europe, and in response to the increased number of prospective asylum seekers arriving in Europe, recent years have seen the continued hardening of borders and a disconcerting evolution of new forms of immigration control measures utilised by states. Based on extensive field research carried out amongst displaced people in Europe between 2016 and 2019, this article highlights the way in which individuals in northern France find themselves trapped in a violent border zone, unable to move forward whilst having no obvious alternative way out of their predicament. The article seeks to illustrate the violent dynamics inherent in the immigration control measures in this border zone, characterised by direct physical violence as well as by banalised and structural forms of violence, including state neglect through the denial of services and care. The author suggests that the raft of violent measures and micro-practices that authorities resort to in the French-British border zone can be understood as one of the latest tools of European border control and of obstruction of access to asylum procedures: a Politics of Exhaustion.

    Machine-to-machine emergency system for urban safety

    Nowadays most people live in urban areas. As populations grow, demand on the city ecosystem increases, directly affecting the entities responsible for city management. Challenges like this make leaders adopt ways to engage with their city's surroundings, making them more prepared and aware. The decisions they make not only affect the city directly in the short term, but are also a means to improve the decision-making process itself. This work aimed to develop a system that can act as an emergency and security supervisor for a city, generating alerts that empower the entities responsible for disaster management. The system is capable of monitoring data from sensors and of deriving useful knowledge from it. This work presents an architecture for data collection in the Internet of Things (IoT). It analyses the tools used and the choices made in the implemented system. It also provides the inputs necessary for developers to join the project, since it describes all the techniques, languages, strategies and programming paradigms used. Finally, it describes the prototype that receives data and processes it to generate alerts warning emergency response teams, and the planned prediction module that can act as a useful tool to better manage emergency personnel. The completion of the internship allowed the learning of new concepts and techniques, as well as the deepening of those that were already familiar. With regard to the company, the developed system will integrate with the company's Citibrain platform and will act as a central point to which every application (e.g. water management, waste management) can subscribe to receive alerts.
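
    As a hedged illustration of the alert-generation step such a prototype performs, the sketch below checks a sensor reading against a per-type threshold and emits an alert record. The sensor fields, threshold values, and notify function are assumptions, not details from the report.

        from datetime import datetime, timezone

        # Assumed thresholds per sensor type; illustrative values only.
        THRESHOLDS = {"flood_level_cm": 120.0, "air_quality_pm25": 150.0}

        def check_reading(reading: dict) -> dict | None:
            """Return an alert record if the reading crosses its threshold."""
            limit = THRESHOLDS.get(reading["type"])
            if limit is not None and reading["value"] >= limit:
                return {
                    "sensor_id": reading["sensor_id"],
                    "type": reading["type"],
                    "value": reading["value"],
                    "issued_at": datetime.now(timezone.utc).isoformat(),
                }
            return None

        def notify(alert: dict) -> None:
            # Placeholder: a real system would push to subscribed applications.
            print("ALERT:", alert)

        reading = {"sensor_id": "s-17", "type": "flood_level_cm", "value": 134.2}
        alert = check_reading(reading)
        if alert:
            notify(alert)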

    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but the data are often not easily found when needed, and are sometimes collected or purchased multiple times. In short, the best government data are not always organized and managed efficiently enough to support decision making in a timely and cost-effective manner. National mapping agencies, and the various departments responsible for collecting different types of geospatial data, cannot continue for much longer to operate as they did a few years ago, like people living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowd-sourced data collection, openly available remotely sensed data, multi-source information vital to decision making, and new web-accessible services that are sometimes provided at no cost. Many of these services could previously be obtained only from local GIS experts. These authorities need to consider the available solutions, gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human-resource decisions, and evaluate new products, cloud services, and cloud service providers. To do so, we need to choose the right tools to reach the above-mentioned goals. Data collection is the costliest part of mapping and of establishing a geographic information system: not only because of the cost of the data collection task itself, but also because of the damage caused by delay, that is, the time it takes to deliver the information necessary for decision making from the field to the user's hands. In fact, the time a project consumes for the collection, processing, and presentation of geospatial information strongly affects the cost of larger projects such as disaster management, construction, city planning, environmental monitoring, etc., assuming that all the necessary information from existing sources is delivered to the user's computer. The best description of a good GIS project optimization or improvement is a methodology that reduces time and cost and increases data and service quality (meaning accuracy, currency, completeness, consistency, suitability, information content, integrity, integration capability, and fitness for use, as well as the user's specific needs and conditions, which must be addressed with special attention). Each of these issues must be addressed individually, and at the same time the whole solution must be provided in a global manner considering all the criteria. In this thesis we first discuss the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), its definition and related components. We then look for available open-source software solutions covering the whole process: data collection, database management, data processing, and finally data services and presentation. The first distinction among software packages is whether they are open source and free, or commercial and proprietary; to make this distinction it is necessary to define a clear specification for the categorization.
    From a legal point of view it is sometimes difficult to determine which class a software package belongs to, so the various terms must be clarified. Given this concept there are two global distinctions; inside each group, we distinguish a further classification by the functionalities and applications the packages are made for in GIScience. Based on the outcome of the second chapter, which describes the technical process for selecting suitable and reliable software according to the characteristics of the users' needs and the required components, we move to the next chapter. In Chapter 3, we elaborate on the details of the GeoNode software as our best candidate tool to take on the responsibilities stated before. In Chapter 4, we discuss the globally available open-source data against predefined data quality criteria (such as theme, data content, scale, licensing, and coverage), according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation, and web search engines. In Chapter 5, we discuss further data quality concepts and consequently define a set of protocols for evaluating all datasets according to the tasks for which a mapping organization, in general, is responsible to its probable users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental control, disaster management, etc.
    In Chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset receives a value corresponding to its quality according to the rules defined in the previous chapter. In the last step, a vector of weights is derived from questions answered by the user with reference to the project at hand, in order to finalize the most appropriate selection of free and open-source data. This data quality preference is defined by identifying a weight vector, which is then applied to the quality matrix to obtain the final quality scores and ranking. At the end of that chapter, a section presents the utilization of the datasets in various projects, such as "Early Impact Analysis" and "Extreme Rainfall Detection System (ERDS), version 2", performed by ITHACA. Finally, in the conclusion, the important criteria as well as future trends in GIS software are discussed, and recommendations are presented.
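
    As a hedged illustration of the weight-vector scoring step described above, the sketch below applies a weight vector to a quality matrix to rank candidate datasets. The criteria names, dataset names, scores, and weights are all illustrative assumptions; the thesis defines its own criteria, protocols, and weights.

        import numpy as np

        # Columns: assumed quality criteria; rows: candidate datasets.
        criteria = ["accuracy", "currency", "completeness", "licensing"]
        datasets = ["OSM extract", "SRTM DEM", "Sentinel-2 mosaic"]

        # Quality matrix: score of each dataset on each criterion
        # (0-10 scale, illustrative values only).
        Q = np.array([
            [7, 9, 6, 10],
            [8, 5, 9, 10],
            [9, 8, 8,  9],
        ], dtype=float)

        # Weight vector derived from the user's answers; sums to 1.
        w = np.array([0.4, 0.3, 0.2, 0.1])

        scores = Q @ w  # final quality score per dataset
        for name, s in sorted(zip(datasets, scores), key=lambda t: -t[1]):
            print(f"{name}: {s:.2f}")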

    Toward Universal Broadband in Rural Alaska

    The TERRA-Southwest project is extending broadband service to 65 communities in the Bristol Bay, Bethel and Yukon-Kuskokwim regions. A stimulus project funded by a combination of grants and loans from the Rural Utilities Service (RUS), TERRA-Southwest has installed a middle-mile network using optical fiber and terrestrial microwave. Last-mile service will be through fixed wireless or interconnection with local telephone networks. The State of Alaska, through its designee Connect Alaska, also received federal stimulus funding from the National Telecommunications and Information Administration (NTIA) for tasks that include support for an Alaska Broadband Task Force "to both formalize a strategic broadband plan for the state of Alaska and coordinate broadband activities across relevant agencies and organizations." Thus, a study of the impact of the TERRA project in southwest Alaska is both relevant and timely. This first phase provides baseline data on current access to and use of ICTs and Internet connectivity in rural Alaska, and some insights about perceived benefits and potential barriers to adoption of broadband. It is also intended to provide guidance to the State Broadband Task Force in determining how the extension of broadband throughout the state could contribute to education, social services, and economic activities that would enhance Alaska's future. Results of the research could also be used proactively to develop strategies to encourage broadband adoption, and to identify applications and support needed by users with limited ICT skills.
    Funders: Connect Alaska; the National Telecommunications and Information Administration; General Communications Incorporated.
    Contents: Part 1: An Analysis of Internet Use in Southwest Alaska / Introduction / Previous Studies / Current Connectivity / Analytical Framework and Research Methodology / Demographics / Mobile Phones: Access and Use / Access to the Internet / Internet Usage / Considerations about Internet Service / Interest in Broadband / Sources of News / Comparison with National Data / Internet Use by Businesses and Organizations / What Difference May Broadband Make in the Region? / Conclusions / Part 2: Literature Review / References

    EUROPEAN CONFERENCE ON QUEUEING THEORY 2016

    This booklet contains the proceedings of the second European Conference on Queueing Theory (ECQT), held from the 18th to the 20th of July 2016 at the engineering school ENSEEIHT, Toulouse, France. ECQT is a biennial event where scientists and technicians in queueing theory and related areas get together to promote research, encourage interaction and exchange ideas. The spirit of the conference is to be a queueing event organized from within Europe, but open to participants from all over the world. The technical program of the 2016 edition consisted of 112 presentations organized in 29 sessions covering all trends in queueing theory, including the development of the theory, methodology advances, computational aspects and applications. Another exciting feature of ECQT 2016 was the institution of the TakĂĄcs Award for an outstanding PhD thesis on "Queueing Theory and its Applications".
    • 
