Vehicle Booking System for Human Resource Management UTP
In line with advances in technology, manual processes are increasingly being converted into computer-based ones. Rapid technological change brings positive effects: people can be contacted anywhere and at any time, and many human activities can be accelerated, including a booking process that once took several days and can now be completed in about two hours. The purpose of this project is to develop a web-based application system, the Vehicle Booking System (VBS), for Human Resource Management UTP (HRM UTP). VBS has been developed to automate what are currently manual business processes, and the portal acts as a one-stop centre for staff and HRM officers. Logging in to the website gives requesters free registration and online application, and, as a web-based system, it allows data centralisation. To achieve this objective, the author carried out extensive research to gain a deep understanding of online booking systems and of how to design and develop an effective website. The methodology used for designing and developing the website is the Iterative and Incremental Development Method, integrated with the test-driven development model. The website interface designs are also included, based on a comparison of existing websites and on user feedback. The author concludes with a few recommendations for developing this website
The Data Processing Pipeline for the Herschel-HIFI Instrument
The HIFI data processing pipeline was developed to systematically process diagnostic, calibration and astronomical observations taken with the HIFI science instrument as part of the Herschel mission. The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment, as well as within an interactive environment. A common software framework was developed to best support the use cases required by the instrument teams and by general astronomers. The HIFI pipeline was built on top of that framework and was designed with a high degree of modularity. This modular design provided the necessary flexibility and extensibility to deal with the complexity of batch-processing eighteen different observing modes, to support astronomers in interactive analysis, and to cope with the adjustments necessary to improve the pipeline and the quality of the end products. This approach to the software development and data processing effort was arrived at by coalescing the lessons learned from similar research-based projects with the understanding that a degree of foresight was required given the overall length of the project. In this article, both the successes and challenges of the HIFI software development process are presented, and lessons learned are extracted to support future similar projects and to retain the experience gained.
Comment: 18 pages, 5 figures
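The modular, per-mode design described above can be sketched in miniature: observing modes are named sequences of interchangeable processing steps, so the same building blocks serve both automated batch runs and interactive use. This is an illustrative sketch, not the actual HIFI software; every name in it is hypothetical.

```python
# Hypothetical sketch of a modular per-mode pipeline (names are illustrative).
from typing import Callable, Dict, List

Step = Callable[[dict], dict]  # each step transforms an observation context

def calibrate(ctx: dict) -> dict:
    ctx["calibrated"] = True
    return ctx

def subtract_baseline(ctx: dict) -> dict:
    ctx["baseline_removed"] = True
    return ctx

# Observing modes are just named sequences of steps, so batch and
# interactive processing can share the same building blocks.
MODES: Dict[str, List[Step]] = {
    "point": [calibrate, subtract_baseline],
    "spectral_scan": [calibrate],
}

def run_pipeline(mode: str, ctx: dict) -> dict:
    """Run every step registered for an observing mode, in order."""
    for step in MODES[mode]:
        ctx = step(ctx)
    return ctx

ctx = run_pipeline("point", {"obs_id": 1})
```

Because a mode is just data (a list of steps), adding a nineteenth mode or swapping one step for an improved version does not touch the driver code, which is the flexibility the modular design aims for.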
Bridging test and model-driven approaches in web engineering
In recent years there has been growing interest in agile methods and their integration into the so-called "unified" approaches. In the field of Web Engineering, agile approaches such as test-driven development are appealing because of the very nature of Web applications, while model-driven approaches provide less error-prone code derivation; however, integrating the two is not easy. In this paper, we present a method-independent approach for combining the agile, iterative and incremental style of test-driven development with the more formal, transformation-based model-driven Web engineering approaches. We focus not only on the development process but also on the evolution of the application, and show how tests can be transformed together with model refactorings. As a proof of concept we show an illustrative example using WebRatio, the WebML design tool.
Published in the Lecture Notes in Computer Science book series (LNCS, vol. 5648). Laboratorio de Investigación y Formación en Informática Avanzada
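The core idea above, tests co-evolving with the model, can be illustrated with a toy sketch: tests are written against an abstract navigation model, and a model refactoring transforms the dependent tests in the same step. The data structures and the `rename_page` helper are hypothetical illustrations, not WebRatio's API.

```python
# Hypothetical sketch: abstract tests transformed alongside a model refactoring.
model = {"pages": {"Login": {"link_to": "Home"}, "Home": {}}}
tests = [("navigate", "Login", "Home")]  # abstract test: Login leads to Home

def rename_page(model, tests, old, new):
    """Refactor the model and transform dependent tests in one step."""
    model["pages"][new] = model["pages"].pop(old)
    for page in model["pages"].values():
        if page.get("link_to") == old:
            page["link_to"] = new
    # Rewrite every test that referenced the old page name.
    tests = [tuple(new if part == old else part for part in t) for t in tests]
    return model, tests

model, tests = rename_page(model, tests, "Home", "Dashboard")
```

Because the tests reference model elements rather than concrete URLs or markup, the same transformation that refactors the model keeps the test suite valid, which is the point of combining the two styles.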
Plan validation and mixed-initiative planning in space operations
Bringing artificial intelligence planning and scheduling applications into the real world is a hard task that is receiving more attention every day from researchers and practitioners in many fields. In many cases it requires the integration of several underlying techniques such as planning, scheduling, constraint satisfaction, mixed-initiative planning and scheduling, temporal reasoning, knowledge representation, formal models and languages, and technological issues. Most papers included in this book are clear examples of how to integrate several of these techniques. Furthermore, the book covers many interesting approaches in application areas ranging from industrial job-shop scheduling to electronic tourism, environmental problems, virtual teaching and space missions. It also provides powerful techniques for building fully deployable applications that solve real problems, together with an updated review of many of the most interesting areas of application of these technologies, showing how powerful they are in overcoming the expressiveness and efficiency problems of real-world applications
Reverse Engineering and Testing of Rich Internet Applications
The World Wide Web experiences a continuous and constant evolution, where new initiatives, standards, approaches and technologies are continuously proposed for developing more effective and higher quality Web applications.
To satisfy the growing market demand for Web applications, new technologies, frameworks, tools and environments that make it possible to develop Web and mobile applications with minimal effort and in very short time have been introduced in recent years.
These new technologies have made possible the dawn of a new generation of Web applications, known as Rich Internet Applications (RIAs), which offer greater usability and interactivity than traditional ones. This evolution has been accompanied by some drawbacks, mostly due to the lack of application of well-known software engineering practices and approaches. As a consequence, new research questions and challenges have emerged in the field of Web and mobile application maintenance and testing.
The research activity described in this thesis has addressed some of these topics with the specific aim of proposing new and effective solutions to the problems of modelling, reverse engineering, comprehending, re-documenting and testing existing RIAs.
Due to the growing relevance of mobile applications in the renewed Web scenario, the problem of testing mobile applications developed for the Android operating system has been addressed too, in an attempt to explore and propose new test automation techniques for this type of application
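One common family of test-automation techniques for event-driven applications such as RIAs and Android apps is systematic state exploration ("GUI ripping"): fire the events available in each state, record the states reached, and build a navigation graph that can later drive test generation. The sketch below is a minimal, hypothetical illustration of that idea, not the thesis's actual tool.

```python
# Minimal sketch of GUI-ripping-style exploration (all names hypothetical).
from collections import deque

def explore(initial_state, events_of, fire):
    """Breadth-first exploration of an application's GUI state space.

    events_of(state) -> iterable of events available in that state
    fire(state, event) -> the state reached by firing the event
    """
    graph, seen, queue = {}, {initial_state}, deque([initial_state])
    while queue:
        state = queue.popleft()
        for event in events_of(state):
            target = fire(state, event)
            graph.setdefault(state, {})[event] = target
            if target not in seen:       # only enqueue states not yet visited
                seen.add(target)
                queue.append(target)
    return graph

# Toy "application": states 0, 1, 2; a "next" event advances the state.
graph = explore(0, lambda s: ["next"] if s < 2 else [], lambda s, e: s + 1)
```

In a real setting the states would be abstractions of screens or DOM configurations and `fire` would drive the running application; the resulting graph then serves both re-documentation (a recovered navigation model) and test-case generation (paths through the graph).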
Methodology and automated metadata extraction from multiple volume shadow copies
Modern-day digital forensics investigations rely on timelines as a principal method for normalizing and chronologically categorizing artifacts recovered from computer systems. Timelines provide investigators with a chronological representation of digital evidence so they can depict altered and unaltered digital forensics data in-context to drive conclusions about system events and/or user activities. While investigators rely on many system artifacts such as file system time/date stamps, operating system artifacts, program artifacts, logs, and/or registry artifacts as input for deriving chronological representations, using only the available or most recent version of the artifacts may provide a limited picture of historical changes on a system. For instance, if previous versions of artifacts and/or previous artifact metadata changes are overwritten and/or are not retained on a system, analysis of current versions of artifacts and artifact metadata, such as time/date stamps and operating system/program/registry artifacts, may provide only a limited picture of activities for the system. Recently, the Microsoft Windows Operating System implemented a backup mechanism that is capable of retaining multiple versions of data storage units for a system, effectively providing a highly-detailed record of system changes. This backup mechanism, the Windows Volume Shadow Copy Service (VSS), exists as a service of modern Microsoft Windows Operating Systems and allows data backups to be performed while applications on a system continue to write to the system's live volume(s). This allows a running system to preserve the system's state to backup media at any given point while the system continues to change in real-time.
After multiple VSS backups are recorded, digital investigators have the ability to incorporate multiple versions of a system's artifacts into a chronological representation, which provides a more comprehensive picture of the system's historical changes. In order to effectively incorporate VSS backup, or Volume Shadow Copy (VSC), data into a chronological representation, the data must be accessed and extracted in a consistent, repeatable, and, if possible, automated manner. Previous efforts have produced a variety of manual and semi-automated methods for accessing and extracting VSC data in a repeatable manner; these methods are time-consuming and often require significant storage resources when dealing with multiple VSCs. The product of this research effort is an advanced methodology that automates access to and extraction of directory-tree and file-attribute metadata from multiple VSCs of the Windows 7 Operating System. The approach extracts metadata from multiple VSCs and combines it into one conglomerate data set. By capturing the historical changes recorded within VSC metadata, this approach enhances timeline generation. Additionally, it supports other projects which could use the metadata to visualize change-over-time by depicting how the individual metadata and the conglomerate data set changed (or remained unchanged) throughout an arbitrary snapshot of time
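The conglomerate data set described above can be sketched as follows: given mount points for several snapshots (for example, exposed Volume Shadow Copies), walk each directory tree and merge file metadata into one set keyed by (snapshot, path). The `collect_metadata` function and the toy snapshot layout are illustrative assumptions, not the tool produced by this research.

```python
# Sketch of merging per-snapshot file metadata into one conglomerate set.
import os
import tempfile

def collect_metadata(snapshot_roots):
    """Return {(snapshot_id, relative_path): (size, mtime)} for all files."""
    records = {}
    for snap_id, root in snapshot_roots.items():
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, root)
                st = os.stat(full)
                records[(snap_id, rel)] = (st.st_size, st.st_mtime)
    return records

# Toy usage: two fake "snapshots" of the same tree stand in for mounted VSCs.
base = tempfile.mkdtemp()
roots = {}
for snap in ("vsc1", "vsc2"):
    root = os.path.join(base, snap)
    os.makedirs(root)
    with open(os.path.join(root, "hosts.txt"), "w") as f:
        f.write("entry")
    roots[snap] = root

meta = collect_metadata(roots)
```

Keying by (snapshot, path) keeps every historical version of a file's metadata side by side, so diffing entries that share the same path across snapshots reveals what changed, and when, between backups.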
An investigation into the feasibility, problems and benefits of re-engineering a legacy procedural CFD code into an event driven, object oriented system that allows dynamic user interaction
This research started with questions about how the overall efficiency, reliability and ease-of-use of Computational Fluid Dynamics (CFD) codes could be improved using available software engineering and Human Computer Interaction (HCI) techniques. Much of this research has been driven by the difficulties experienced by novice CFD users in the area of Fire Field Modelling, where the introduction of performance-based building regulations has led to a situation in which non-CFD experts increasingly make use of CFD techniques, with varying degrees of effectiveness, for safety-critical research. Formerly, such modelling was not helped by the mode of use, the high degree of expertise required of the user, and the complexity of specifying a simulation case. Many of the early stages of this research were channelled by perceived limitations of the original legacy CFD software chosen as a framework for these investigations. These limitations included poor code clarity, poor overall efficiency due to the use of batch-mode processing, weak assurance that the final results presented by the CFD code were correct, and the requirement for considerable expertise on the part of users.
The innovative incremental re-engineering techniques developed to reverse-engineer, re-engineer and improve the internal structure and usability of the software were arrived at as a by-product of the research into overcoming the problems discovered in the legacy software. The incremental re-engineering methodology was considered to be of sufficient importance to warrant inclusion in this thesis. Various HCI techniques were employed to attempt to overcome the efficiency and solution-correctness problems. These investigations have demonstrated that the quality, reliability and overall run-time efficiency of CFD software can be significantly improved by the introduction of run-time monitoring and interactive solution control. It should be noted that the re-engineered CFD code is observed to run more slowly than the original FORTRAN legacy code, due mostly to the changes in the calling architecture of the software and differences in compiler optimisation; but it is argued that the overall effectiveness, reliability and ease-of-use of the prototype software are all greatly improved. Investigations into dynamic solution control (made possible by the open software architecture and the interactive control interface) have demonstrated considerable savings when using solution control optimisation. Such investigations have also demonstrated the potential for improved assurance of correct simulation when compared with the batch mode of processing found in most legacy CFD software. Investigations have also been conducted into the efficiency implications of using unstructured group solvers.
These group solvers are a derivation of the simple point-by-point Jacobi Over-Relaxation (JOR) and Successive Over-Relaxation (SOR) solvers [CROFT98]; using group solvers allows the computational processing to be more effectively targeted at regions or logical collections of cells that require more intensive computation. Considerable savings have been demonstrated for both static and dynamic group membership when using these group solvers for a complex 3-dimensional fire modelling scenario. Furthermore, the improvements in the system architecture (brought about as a result of software re-engineering) have helped to create an open framework that is both easy to comprehend and to extend. This is in spite of the underlying unstructured nature of the simulation mesh, with all of the associated complexity that this brings to the data structures. The prototype CFD software framework has recently been used as the core processing module in a commercial Fire Field Modelling product (called "SMARTFIRE" [EWER99-1]). This CFD framework is also being used by researchers to investigate many diverse aspects of CFD technology, including Knowledge-Based Solution Control, Gaseous and Solid Phase Combustion, Adaptive Meshing and CAD file interpretation for ease of case specification
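The group-solver idea can be illustrated on a toy 1-D Laplace problem: a Jacobi Over-Relaxation (JOR) update is applied only to the "group" of cells whose residual is still large, focusing work where the solution has not yet converged. This is a minimal sketch of the concept, not the solvers of [CROFT98]; the relaxation factor and threshold values are illustrative.

```python
# Toy illustration of group-targeted JOR on a 1-D Laplace problem.
def jor_group_sweep(u, omega=0.8, threshold=1e-6):
    """One JOR sweep over the interior cells whose residual exceeds threshold."""
    new = list(u)
    for i in range(1, len(u) - 1):
        jacobi = 0.5 * (u[i - 1] + u[i + 1])  # Jacobi update for 1-D Laplace
        if abs(jacobi - u[i]) > threshold:    # cell belongs to the active group
            new[i] = u[i] + omega * (jacobi - u[i])
    return new

# Fixed boundary values 0 and 1; the interior relaxes toward a linear profile.
u = [0.0, 0.0, 0.0, 0.0, 1.0]
for _ in range(200):
    u = jor_group_sweep(u)
```

As cells converge they drop out of the active group and are skipped, which is the source of the savings reported above: in a large fire-modelling mesh most cells are quiescent in any given sweep, so restricting updates to the active group avoids redundant computation.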
Temporal meta-model framework for Enterprise Information Systems (EIS) development
This thesis has developed a Temporal Meta-Model Framework for semi-automated Enterprise System Development, which can drastically reduce the time and cost to develop, deploy and maintain Enterprise Information Systems throughout their lifecycle. It proposes that analysis and requirements gathering can also accomplish the bulk of the design phase, with the results stored in a suitable model that is then capable of automated execution given a set of specific runtime components