
    RAPID WEBGIS DEVELOPMENT FOR EMERGENCY MANAGEMENT

    The use of spatial data during emergency response and management helps to make faster and better decisions. Moreover, spatial data should be as up to date as possible and easy to access. To meet the challenge of sharing current data rapidly, the internet is widely considered the most efficient solution, and the field of web mapping is constantly evolving. ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) is a non-profit association founded by Politecnico di Torino and SITI (Higher Institute for the Environmental Systems) as a joint project with the WFP (World Food Programme). The collaboration with the WFP drives projects related to Early Warning Systems (e.g. flood and drought monitoring) and Early Impact Systems (e.g. rapid mapping and assessment through remote sensing systems). The Web GIS team has built, and is continuously improving, a complex architecture based entirely on Open Source tools. This architecture is composed of three main areas: the database environment, the server-side logic and the client-side logic. Each of them is implemented following the MVC (Model-View-Controller) pattern, i.e. the separation of the different logic layers (database interaction, business logic and presentation). The MVC architecture makes it easy and fast to build a Web GIS application for data viewing and exploration, so in an emergency data publication can be performed almost immediately, as soon as data production is completed. The server-side system is based on the Python language and the Django web development framework, while the client side relies on OpenLayers, GeoExt and Ext.js, which manage data retrieval and the user interface. The MVC pattern applied to JavaScript keeps the interface generation and data retrieval logic separate from the general application configuration, so the server-side environment can take care of generating the configuration file. The web application building process is therefore data driven and can be considered a view of the current architecture, composed of data and data interaction tools. Once completely automated, the Web GIS application building process can be performed directly by the end user, who can customize the data layers and the controls used to interact with them.
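
    As an illustration of the data-driven building process described above, the sketch below shows how a Django view might serialize a layer catalogue into a JSON configuration for the OpenLayers/GeoExt client to consume. The Layer model and its fields are assumptions made for illustration, not the actual ITHACA schema.

        # Hypothetical sketch of a data-driven Web GIS configuration endpoint.
        # Model and field names are illustrative, not the real ITHACA schema.
        from django.db import models
        from django.http import JsonResponse


        class Layer(models.Model):
            """One published map layer; a row per dataset exposed to the client."""
            title = models.CharField(max_length=100)
            wms_url = models.URLField()
            wms_layer_name = models.CharField(max_length=100)
            visible = models.BooleanField(default=True)


        def client_config(request):
            """Serialize the layer catalogue so the JavaScript client
            (OpenLayers/GeoExt/Ext.js) can build its map and layer tree
            from data alone."""
            layers = [
                {
                    "title": layer.title,
                    "url": layer.wms_url,
                    "name": layer.wms_layer_name,
                    "visible": layer.visible,
                }
                for layer in Layer.objects.all()
            ]
            return JsonResponse({"layers": layers})

    In this spirit, publishing a new emergency dataset reduces to adding a row to the catalogue: the client interface is regenerated from the configuration, without touching the JavaScript code.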

    Image Labeler: Label Earth Science Images for Machine Learning

    The application of machine learning to image-based classification of earth science phenomena, such as hurricanes, is relatively new. While extremely useful, the techniques used for image-based phenomena classification require storing and managing an abundant supply of labeled images in order to produce meaningful results. Existing methods for dataset management and labeling include maintaining categorized folders on a local machine, a process that can be cumbersome and is not scalable. Image Labeler is a fast and scalable web-based tool that facilitates the rapid development of image-based earth science phenomena datasets, in order to aid deep learning applications and automated image classification/detection. Image Labeler is built with modern web technologies to maximize the scalability and availability of the platform. It has a user-friendly interface that allows multiple images to be tagged quickly. Essentially, Image Labeler improves upon existing techniques by providing researchers with a shareable source of tagged earth science images for all their machine learning needs. Here, we demonstrate Image Labeler's current image extraction and labeling capabilities, including supported data sources, spatiotemporal subsetting, individual project management, and team collaboration for large-scale projects.
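
    As a rough illustration of the kind of record such a tool needs to manage, a labeled image could be represented as below; the field names are assumptions made for the sake of the example and do not reflect Image Labeler's actual data model.

        # Hypothetical record for a labeled earth science image; the fields are
        # illustrative only, not Image Labeler's real schema.
        from dataclasses import dataclass, field
        from datetime import datetime
        from typing import List


        @dataclass
        class LabeledImage:
            source: str                 # e.g. a satellite product identifier
            url: str                    # where the image tile can be retrieved
            bbox: List[float]           # spatial subset: [min_lon, min_lat, max_lon, max_lat]
            timestamp: datetime         # temporal subset used for the extraction
            labels: List[str] = field(default_factory=list)  # e.g. ["hurricane"]


        example = LabeledImage(
            source="example-satellite-product",
            url="https://example.org/tiles/12345.png",
            bbox=[-90.0, 20.0, -80.0, 30.0],
            timestamp=datetime(2020, 8, 27, 12, 0),
            labels=["hurricane"],
        )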

    Portfolio Investment in Focus of Using Agent in Stock Trading Environment

    Portfolio Investment in Focus of Using Agent in Stock Trading Environment is a collaboration between a stock trading application and an agent program. It can be regarded as a value-added process for the stock trading system. The rationale for embedding an agent program in the system is to increase the efficiency and effectiveness of the system. The agent program in the application has been programmed to carry out certain tasks, such as buying and selling stocks on behalf of the trader. The agent performs its job based on the data or values defined by the user. The main reason to develop an application that uses an agent is the recognition that people in modern society are always concerned about their standard of living: they want everything around them to be automated. In other words, people want the computer to work for them. Two objectives have been set for this project. The first is to perform a small-scale study of agents, and the second is to develop a simple web site for a stock trading application using an agent. To ensure both objectives can be achieved, the author set the scope of study in the planning phase of the project. Three areas of study have been defined. The first is the agent itself, including its environment, functionality and characteristics. The second is the concept of system remoteness, meaning that the user can access the system from a remote location via the internet or World Wide Web (WWW) technology. The final area is the study of current stock simulators, such as the Investopedia simulator, in order to gather ideas for system design and implementation. For the methodology, Rapid Application Development (RAD) has been employed. This methodology was chosen because it is effective and suitable for short-duration projects; it was designed for the developer and user to join together and work intensively toward their goal. With Rapid Application Development, the system is basically developed by prototyping, which shortens development time and makes the final product visible early. By using the RAD methodology, the stock trading system using an agent was completed within the time allocated.
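
    A minimal sketch of the kind of rule-based behaviour the abstract describes is given below, assuming hypothetical buy/sell thresholds supplied by the trader; it is not the project's actual implementation.

        # Hypothetical threshold-based trading agent; the thresholds and prices
        # stand in for the user-defined values mentioned in the abstract.
        from dataclasses import dataclass


        @dataclass
        class TradingAgent:
            symbol: str
            buy_below: float    # buy when the price falls to or below this value
            sell_above: float   # sell when the price rises to or above this value

            def decide(self, price: float) -> str:
                """Return the action the agent would take at the current price."""
                if price <= self.buy_below:
                    return "BUY"
                if price >= self.sell_above:
                    return "SELL"
                return "HOLD"


        agent = TradingAgent(symbol="XYZ", buy_below=10.0, sell_above=12.5)
        print(agent.decide(9.75))   # prints "BUY"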

    DIVE on the internet

    This dissertation reports research and development of a platform for Collaborative Virtual Environments (CVEs). It has particularly focused on two major challenges: supporting the rapid development of scalable applications and easing their deployment on the Internet. This work employs a research method based on prototyping and refinement and promotes the use of this method for application development. A number of the solutions herein are in line with other CVE systems. One of the strengths of this work is its global approach to the issues raised by CVEs and the recognition that such complex problems are best tackled with a multi-disciplinary approach that understands both user and system requirements. CVE application deployment is aided by an overlay network that is able to complement any IP multicast infrastructure in place. Apart from complementing a weakly deployed worldwide multicast, this infrastructure provides a certain degree of introspection, remote control and visualisation. As such, it forms an important aid in assessing the scalability of running applications. This scalability is further facilitated by specialised object distribution algorithms and an open framework for the implementation of novel partitioning techniques. CVE application development is eased by a scripting language, which enables rapid development and favours experimentation. This scripting language interfaces with many aspects of the system and enables the prototyping of distribution-related components as well as user interfaces. It is the key construct of a distributed environment to which components, written in different languages, connect and on which they operate in a network-abstracted manner. The solutions proposed are exemplified and strengthened by three collaborative applications. The Dive room system is a virtual environment modelled after the room metaphor and supporting asynchronous and synchronous cooperative work. WebPath is a companion application to a Web browser that seeks to make the current history of page visits more visible and usable. Finally, the London travel demonstrator supports travellers by providing an environment where they can explore the city, utilise group collaboration facilities, rehearse particular journeys and access tourist information data.

    The Metadata Education and Research Information Commons (MERIC): A Collaborative Teaching and Research Initiative

    The networked environment forced a sea change in Library and Information Science (LIS) education. Most LIS programs offer a mixed mode of instruction that integrates online learning materials with more traditional classroom pedagogical methods, and faculty are now responsible for developing content and digital learning objects. A teaching commons in a networked environment is one way to share, modify and repurpose learning objects while reducing the costs to educational institutions of developing course materials entirely in-house. It also provides a venue for sharing ideas, practices, and expertise in order to provide the best learning experience for students. Because metadata education has been affected by rapid change and metadata research is interdisciplinary and diffuse, the Metadata Education and Research Information Commons (MERIC) initiative aims to provide a virtual environment for sharing and collaboration within the extensive metadata community. This paper describes the development of MERIC from its origin as a simple clearinghouse proof-of-concept project to a service-oriented teaching and research commons prototype. The enablers of and barriers to participation and collaboration are discussed, and the need for specific community-building research is cited as critical for the success of MERIC within the broad metadata community.

    The Cognitive Atlas: Employing Interaction Design Processes to Facilitate Collaborative Ontology Creation

    The Cognitive Atlas is a collaborative knowledge-building project that aims to develop an ontology characterizing the current conceptual framework among researchers in cognitive science and neuroscience. From the beginning, the project objectives focused on usability, simplicity, and utility for end users. Support for Semantic Web technologies was also a priority, to ensure interoperability with other neuroscience projects and knowledge bases. Current off-the-shelf semantic web or semantic wiki technologies, however, do not often lend themselves to simple user interaction designs for non-technical researchers and practitioners; the abstract nature and complexity of these systems acts as a point of friction for user interaction, inhibiting usability and utility. Instead, we take an alternative interaction design approach driven by user-centered design processes rather than by a base set of semantic technologies. This paper reviews the initial two rounds of design and development of the Cognitive Atlas system, including interaction design decisions and their implementation as guided by current industry practices for the development of complex interactive systems.

    Virtual Collaborative R&D Teams in Malaysia Manufacturing SMEs

    This paper presents the results of empirical research conducted from March to September 2009. The study focused on the influence of virtual research and development (R&D) teams within Malaysian manufacturing small and medium-sized enterprises (SMEs). The specific objective of the study is a better understanding of the application of collaborative technologies in business, and the identification of factors that can help SMEs remain competitive in the future. The paper seeks to answer the question "Is there any relationship between company size, Internet connection facility and virtuality?". The survey data show that SMEs are now technologically capable of operating virtual collaborative teams, but that this infrastructure is under-used. SMEs now have the technology needed to begin implementing collaboration tools that reduce R&D time and costs and increase productivity, so R&D managers should take the potential of virtual teams into account.

    A Methodology for Engineering Collaborative and ad-hoc Mobile Applications using SyD Middleware

    Today’s web applications are more collaborative and utilize standard and ubiquitous Internet protocols. We have earlier developed the System on Mobile Devices (SyD) middleware to rapidly develop and deploy collaborative applications over heterogeneous and possibly mobile devices hosting web objects. In this paper, we present a software engineering methodology for developing SyD-enabled web applications and illustrate it through a case study of two representative applications: (i) a meeting calendar application, which is a collaborative application, and (ii) a travel application, which is an ad-hoc collaborative application. SyD-enabled web objects allow us to create a collaborative application rapidly with limited coding effort. In this case study, the modular software architecture allowed us to hide the inherent heterogeneity among devices, data stores, and networks by presenting a uniform and persistent object view of mobile objects interacting through XML/SOAP requests and responses. The performance results we obtained show that the application scales well as the group size increases and adapts well within the constraints of mobile devices.
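
    The uniform XML/SOAP object view can be pictured with the short sketch below; the endpoint, method name and payload are hypothetical assumptions for illustration, not SyD's actual interface.

        # Hypothetical SOAP-style invocation of a remote calendar web object.
        # The envelope is generic SOAP 1.1; the method name and endpoint URL
        # are illustrative assumptions, not SyD's real API.
        import urllib.request

        ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
        <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
          <soap:Body>
            <scheduleMeeting xmlns="urn:example:calendar">
              <title>Project review</title>
              <start>2005-06-01T10:00:00</start>
              <attendees>alice,bob</attendees>
            </scheduleMeeting>
          </soap:Body>
        </soap:Envelope>"""

        request = urllib.request.Request(
            "http://example.org/syd/calendar",           # hypothetical endpoint
            data=ENVELOPE.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8"},
        )
        # urllib.request.urlopen(request) would return the object's SOAP response.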