
    Developing Time And Attendance System (TAS)

    A Time and Attendance System (TAS) is software that helps an organization register and track employee attendance; it can integrate with existing payroll and human resource systems, as well as with various collection devices. The system consists of three major parts. The first is the employee, who records attendance either manually through a timekeeper or automatically through a special hardware device such as a card reader. The second is the timekeeper, who is responsible for recording employee attendance manually. The last is the administrator, who may add, remove, search, and view employee or designation information, and more.
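
    As a rough illustration of the three parts described above, the following sketch (hypothetical names and structure, not taken from the paper) models employees, attendance records captured either by a card reader or by a timekeeper, and the administrator's add/remove/search operations:

        from dataclasses import dataclass
        from datetime import datetime
        from enum import Enum

        class EntrySource(Enum):
            CARD_READER = "card_reader"   # automatic capture via a hardware device
            TIMEKEEPER = "timekeeper"     # manual entry by the timekeeper role

        @dataclass
        class Employee:
            employee_id: str
            name: str
            designation: str

        @dataclass
        class AttendanceRecord:
            employee_id: str
            timestamp: datetime
            source: EntrySource

        class AttendanceSystem:
            def __init__(self):
                self.employees = {}   # employee_id -> Employee
                self.records = []     # list of AttendanceRecord

            # Administrator operations: add, remove, search/view.
            def add_employee(self, emp):
                self.employees[emp.employee_id] = emp

            def remove_employee(self, employee_id):
                self.employees.pop(employee_id, None)

            def find_by_designation(self, designation):
                return [e for e in self.employees.values()
                        if e.designation == designation]

            # Attendance capture from either source (employee or timekeeper path).
            def record_attendance(self, employee_id, source):
                if employee_id not in self.employees:
                    raise KeyError("unknown employee: " + employee_id)
                self.records.append(
                    AttendanceRecord(employee_id, datetime.now(), source))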

    A Value-Driven Framework for Software Architecture

    Software that is not aligned with the business values of the organization for which it was developed does not entirely fulfill its raison d'ĂȘtre. Business values represent what is important in a company or organization and should influence the overall software system behavior, contributing to the overall success of the organization. However, approaches to derive a software architecture considering the business values exchanged between an organization and its market players are lacking. Our quest is to address this problem and investigate how to derive value-centered architectural models systematically. We used the Technology Research method to address this PhD research question. This methodological approach proposes three steps: problem analysis, innovation, and validation. The problem analysis was performed using systematic studies of the literature to obtain full coverage of the main themes of this work, particularly business value modeling, software architecture methods, and software architecture derivation methods. Next, the innovation step was accomplished by creating a framework for the derivation of a software reference architecture model considering an organization's business values. The resulting framework is composed of three core modules: Business Value Modeling, Agile Reference Architecture Modeling, and Goal-Driven SOA Architecture Modeling. While the Business Value Modeling module focuses on building a stakeholder-centric business specification, the Agile Reference Architecture Modeling and Goal-Driven SOA Architecture Modeling modules concentrate on generating a software reference architecture aligned with the business value specification. Finally, the validation step was achieved through proof-of-concept prototypes for three new domain-specific languages, case studies, and quasi-experiments, including a family of controlled experiments. The findings from our research show that the complexity and lack of rigor in existing approaches to representing business values can be addressed by an early requirements specification method that represents the value exchanges of a business. Also, by using sophisticated model-driven engineering techniques (e.g., metamodels, model transformations, and model transformation languages), it was possible to obtain source generators that derive a software architecture model from early requirements value models while assuring traceability throughout the architectural derivation process. In conclusion, despite relying on sophisticated techniques, the derivation process of a software reference architecture is helped by simple-to-use methods supported by black-box transformations and guidelines that facilitate the activities of less experienced software architects. The experimental validation confirmed that our framework is feasible and perceived as easy to use and useful, and the participants of the experiments indicated that they intend to use it in the future.
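
    The core idea of deriving an architecture from value models via model transformations can be pictured with a toy sketch. The element names and the one-exchange-to-one-service rule below are illustrative assumptions, not the thesis's DSLs or transformation languages; the point is the traceable source-to-target mapping:

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class ValueExchange:            # early-requirements (source) model element
            provider: str
            consumer: str
            value_object: str           # e.g. "payment", "goods"

        @dataclass(frozen=True)
        class ServiceCandidate:         # architecture (target) model element
            name: str
            trace: ValueExchange        # trace link back to the source element

        def derive_services(exchanges):
            """Black-box transformation: one service candidate per value exchange."""
            return [ServiceCandidate(name=e.value_object.title() + "Service", trace=e)
                    for e in exchanges]

        model = [ValueExchange("Customer", "Retailer", "payment"),
                 ValueExchange("Retailer", "Customer", "goods")]
        for svc in derive_services(model):
            print(svc.name, "traces to", svc.trace.provider, "->", svc.trace.consumer)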

    Modelling Security Requirements Through Extending Scrum Agile Development Framework

    Security is today considered a basic foundation in software development; therefore, the modelling and implementation of security requirements is an essential part of the production of secure software systems. Information technology organisations are moving towards agile development methods in order to satisfy customers' changing requirements in light of accelerated evolution and time restrictions relative to their competitors in software production. Security engineering is considered difficult in these incremental and iterative methods due to the frequency of change, integration, and refactoring. The objective of this work is to identify and implement practices that extend and improve agile methods to better address the challenges presented by security requirements consideration and management. A major practice is the use of security requirements capture mechanisms, such as UMLsec, in agile development processes. This thesis proposes an extension to the popular Scrum framework that adopts UMLsec security requirements modelling techniques and introduces a Security Owner role in the Scrum framework to facilitate such modelling and security requirements consideration generally. The methodology involved experimenting with the inclusion of UMLsec and the Security Owner role to determine their impact on security considerations in the software development process. The results showed that overall security requirements consideration improved and that there was a need for an additional role with the skills and knowledge to facilitate and realise the benefits of the addition of UMLsec.
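
    A minimal sketch of how a Security Owner role could gate backlog items that carry UMLsec-style security annotations; the class names and the sign-off rule are hypothetical, since the thesis works at the level of UMLsec models and Scrum process changes rather than code:

        from dataclasses import dataclass, field

        @dataclass
        class SecurityAnnotation:
            stereotype: str        # e.g. "secure links", "fair exchange" (UMLsec-style)
            rationale: str

        @dataclass
        class BacklogItem:
            title: str
            security: list = field(default_factory=list)
            security_owner_approved: bool = False

        def ready_for_sprint(item):
            # Hypothetical rule: items with security annotations need
            # Security Owner sign-off before entering a sprint.
            return not item.security or item.security_owner_approved

        item = BacklogItem("Transfer funds between accounts",
                           [SecurityAnnotation("secure links",
                                               "protect credentials in transit")])
        assert not ready_for_sprint(item)   # blocked until reviewed
        item.security_owner_approved = True
        assert ready_for_sprint(item)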

    A new WebGIS approach to support ground penetrating radar deployment

    Due to the complex agglomeration of underground infrastructures in large urban areas, and the correspondingly increased concerns of municipalities and governments that deploy land information systems and of industries that want to construct or excavate, it is increasingly imperative to accurately locate and map existing underground utility networks (UUN) such as pipelines, hydroelectric power cables, communication networks, and drinking water and sewage conduits. One emerging category of geophysics instrument for collecting and extracting data from the underground is the ground penetrating radar (GPR), which produces cross-sectional images of the subsurface from which useful information about underground infrastructure can be extracted. Previous experiments and a thorough literature review revealed that the GPR software used in and off the field takes little or no advantage of geospatial features and data integration, such as the visualization of GPR data in a georeferenced space together with orthophotos, maps, points of interest, CAD plans, etc. Also missing is the capability to add annotations or to query geospatial objects that may improve or expedite investigations; these functions have long existed in the geospatial domain, for example in geographic information systems (GIS). In this research project, a new approach is proposed to deploy GPR based on four core WebGIS-enabled features used to support field investigations with GPR. To demonstrate the feasibility of this approach, an extension called GVX-GPR was developed for the existing WebGIS platform GVX, designed and sold by Geovoxel as a risk management tool for civil engineering projects. GVX-GPR offers GPR users four features missing from most GPR software: (1) map integration, (2) geo-annotations and points of interest, (3) radargram georeferencing and visualization, and (4) georeferenced slice visualization.
    In order to test the designed WebGIS-based approach, two different professionals, an expert in geophysics and a person without any background in geophysics, used the proposed approach at two study sites. The first experiment was conducted at Université Laval (Québec, Canada), where the subject surveyed an area in order to identify three pre-mapped targets (an electrical cable, an optical fibre, and a tunnel) whose XYZ positions were known. The second experiment, with the geophysics expert, took place at the Universidade Federal do Rio de Janeiro (Rio de Janeiro, Brazil); this site aimed to reproduce a more realistic survey scenario, covering an area containing an unknown number of buried objects. The four features proposed by GVX-GPR were tested and their usefulness discussed with the two GPR practitioners. Both users declared themselves very interested in GVX-GPR and its features and would be willing to integrate the software into their daily work because of the advantages it offers. In particular, the approach helped these professionals discover new buried targets, delimit the survey area, interpret raw GPR data by allowing interaction between (online) geospatial data and GPR profiles, and produce new maps compliant with standards such as CityGML (and thus useful for eventual data sharing). Also, once the system was mastered, GVX-GPR helped optimize survey time. This master's project thus developed a new approach for carrying out GPR surveys and proposed a software tool to test its feasibility. A first step in validating the feasibility and utility of the proposal was taken with the two tests performed. Clearly, these are the first steps of a larger validation process, and they open the door to adjustments and additional features, such as 3D visualization tools and signal processing filters. We nevertheless consider these first tests conclusive for this master's project; above all, they demonstrate the value that geospatial data and functions add to GPR instruments. We also expect this work to help communities of non-specialists in geophysics take an interest in GPR instruments for surveying buried objects, since the approach facilitates the preparation, execution, and post-processing phases of a GPR survey.
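
    Feature (3), radargram georeferencing, can be illustrated with a small sketch. This is not the GVX-GPR API; it merely assumes traces are recorded at constant spacing along a straight survey line between two known endpoints, so each trace index maps to coordinates by linear interpolation:

        def georeference_traces(start, end, n_traces):
            """start, end: (x, y) endpoints of the survey line; returns one
            (x, y) per radargram trace, assuming constant trace spacing."""
            (x0, y0), (x1, y1) = start, end
            coords = []
            for i in range(n_traces):
                t = i / (n_traces - 1) if n_traces > 1 else 0.0
                coords.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            return coords

        # e.g. a 50-trace profile between two points measured in the field
        line = georeference_traces((245.0, 512.0), (265.0, 512.0), 50)
        print(line[0], line[-1])   # first and last trace positions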

    A Proactive Approach to Application Performance Analysis, Forecast and Fine-Tuning

    A major challenge currently faced by the IT industry is the cost, time, and resources associated with repetitive performance testing when existing applications undergo evolution. IT organizations are under pressure to reduce the cost of testing, especially given its high percentage of the overall costs of application portfolio management. Previously, to analyse application performance, researchers have proposed techniques requiring complex performance models, non-standard modelling formalisms, process algebras, or complex mathematical analysis. In Continuous Performance Management (CPM), automated load testing is invoked during the Continuous Integration (CI) process after a build. CPM is reactive and raises alarms when performance metrics are violated, and the CI process is repeated until performance is acceptable. Previous and current work has yet to address the need for an approach that allows software developers to proactively target a specified performance level while modifying existing applications, instead of reacting to performance test results after code modification and build. There is thus a strong need for an approach that does not require repetitive performance testing, resource-intensive application profilers, complex software performance models, or additional quality assurance experts. We propose to fill this gap with an innovative relational model associating an operation's Performance with two novel concepts: the operation's Admittance and Load Potential. To address changes to a single type or to multiple types of processing activities of an application operation, we present two bi-directional methods, both of which use the relational model. From annotations of Delay Points within the code, the methods allow software developers either to fine-tune the operation's algorithm, "targeting" a specified performance level in a bottom-up way, or to predict the operation's performance due to code changes in a top-down way under a given workload. The methods do not need complex performance models or expensive performance testing of the whole application. We validate our model on a realistic experimentation framework. Our results indicate that it is possible to characterize an application's Performance as a function of its Admittance and Load Potential, and that the application's Admittance can be characterized as a function of the latency of its Delay Points. Applying this method to complex large-scale systems has the potential to significantly reduce the cost of performance testing during system maintenance and evolution.
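
    The abstract does not state the relational model itself, so the following sketch is only a loose, assumed analogy to electrical admittance (throughput as Admittance times Load Potential, with Admittance as the inverse of the summed Delay Point latencies); the function names and formulas are illustrative, not the authors' equations:

        def admittance(delay_point_latencies_ms):
            """Assumed: inverse of the total annotated Delay Point latency (1/ms)."""
            total = sum(delay_point_latencies_ms)
            return 1.0 / total if total > 0 else float("inf")

        def performance(admittance_value, load_potential):
            # analogous to I = Y * V in circuit terms (assumption)
            return admittance_value * load_potential

        # Bottom-up fine-tuning: what total latency budget hits a target level?
        def required_total_latency(target_performance, load_potential):
            return load_potential / target_performance

        y = admittance([2.0, 5.0, 3.0])            # three delay points, 10 ms total
        print(performance(y, 100.0))               # predicted throughput at this load
        print(required_total_latency(20.0, 100.0)) # -> 5.0 ms latency budget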

    Colored model based testing for software product lines (CMBT-SWPL)

    Over the last decade, the software product line domain has emerged as one of the most promising software development paradigms. The main benefits of a software product line approach are improvements in productivity, time to market, product quality, and customer satisfaction. Therefore, one topic that needs greater emphasis is the testing of software product lines to achieve the required software quality assurance. Our concern is how to test a software product line as early as possible in order to detect errors, because the cost of an error detected in early phases is much lower than the cost of errors detected later. The method suggested in this thesis is a model-based, reuse-oriented test technique called Colored Model Based Testing for Software Product Lines (CMBT-SWPL). CMBT-SWPL is a requirements-based approach for efficiently generating tests for products in a software product line. This testing approach is used for validation and verification of product lines. It is a novel approach to testing product lines using a Colored State Chart (CSC), which considers variability early in the product line development process. More precisely, the variability is introduced in the main components of the CSC. Accordingly, the variability is preserved in test cases, as they are generated from the colored test models automatically. During domain engineering, the CSC is derived from the feature model. By coloring the State Chart, the behavior of several product line variants can be modeled simultaneously in a single diagram, thus addressing product line variability early. The CSC represents the test model, from which test cases are derived using statistical testing. During application engineering, these colored test models are customized for a specific application of the product line. At the end of this test process, the test cases are generated again using statistical testing, executed, and the test results are ready for evaluation. In addition, the CSC is transformed into a Colored Petri Net (CPN) for verification and simulation purposes. The main gains of applying the CMBT-SWPL method are the early detection of defects in requirements, such as ambiguities, incompleteness, and redundancy, which is reflected in savings in test effort, time, and development and maintenance costs.
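
    The coloring idea can be sketched as transitions tagged with the set of product variants they belong to, so one chart models several variants and a per-product test model falls out by filtering. The notation below is an illustrative assumption, not the thesis's CSC formalism:

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Transition:
            source: str
            event: str
            target: str
            colors: frozenset   # product variants this transition applies to

        CHART = [
            Transition("Idle", "insertCard", "Auth", frozenset({"basic", "premium"})),
            Transition("Auth", "pinOk", "Menu", frozenset({"basic", "premium"})),
            Transition("Menu", "fastCash", "Dispense", frozenset({"premium"})),  # variant-specific
            Transition("Menu", "withdraw", "Dispense", frozenset({"basic", "premium"})),
        ]

        def project(chart, variant):
            """Customize the colored model for one application of the product line."""
            return [t for t in chart if variant in t.colors]

        for t in project(CHART, "basic"):   # per-product test model for "basic"
            print(t.source, "--" + t.event + "-->", t.target)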

    Software Evolution for Industrial Automation Systems. Literature Overview


    Maps of Lessons Learnt in Requirements Engineering

    Both researchers and practitioners have emphasized the importance of learning from past experiences and its consequential impact on project time, cost, and quality. However, in the survey we conducted of requirements engineering (RE) practitioners, over 70% of the respondents stated that they seldom use RE lessons in the RE process, though 85% of these would use such lessons if readily available. Our observation, however, is that RE lessons are scattered, mainly implicitly, throughout the literature and practice, which obviously does not help the situation. We therefore present "maps" of RE lessons which highlight weak (dark) and strong (bright) areas of RE (and hence of RE theories). Such maps would thus be: (a) a driver for research to "light up" the darker areas of RE, and (b) a guide for practice to benefit from the brighter areas. To achieve this goal, we populated the maps with over 200 RE lessons elicited from the literature and practice using a systematic literature review and a survey. The results show that approximately 80% of the elicited lessons are implicit and that approximately 70% of the lessons deal only with the elicitation, analysis, and specification RE phases. The RE Lesson Maps, the elicited lessons, and the results from populating the maps provide novel scientific groundings for lessons learnt in RE, as this topic has not yet been systematically studied in the field.
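
    A lesson map can be pictured as a simple tally of lessons per RE phase, where sparsely covered phases show up as dark areas. The data and the brightness threshold below are invented for illustration; only the tallying idea comes from the abstract:

        from collections import Counter

        lessons = [  # (lesson id, RE phase it concerns) -- illustrative data
            ("L1", "elicitation"), ("L2", "analysis"), ("L3", "specification"),
            ("L4", "elicitation"), ("L5", "validation"), ("L6", "management"),
        ]

        coverage = Counter(phase for _, phase in lessons)
        for phase in ("elicitation", "analysis", "specification",
                      "validation", "management"):
            count = coverage.get(phase, 0)
            shade = "bright" if count >= 2 else "dark"   # arbitrary cut-off
            print(f"{phase:13s} {count:2d} lesson(s) -> {shade}")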
    • 
