
    The 10th Jubilee Conference of PhD Students in Computer Science


    Reengineering real-time software systems

    The problem this thesis solves is how to reengineer existing real-time applications implemented without software engineering (SE) attributes: with poor modularity and robustness, and difficult to read and maintain. The real-time system chosen for this study was the Model-based Mobile robot Language (MML) used on the Yamabico-11 mobile robot, which was implemented without SE attributes. The approach taken was to reengineer MML with a focus on improving modifiability while preserving functionality. First, we developed a systematic plan using manual static analysis; then we incrementally reengineered the application with thorough system-level testing. Code review was used to locate and remove dead code and synonymous and redundant variables and functions (improving modifiability, readability and robustness). Call-hierarchy tracing was used to gain explicit module-restructuring insight for tighter cohesion (improving modifiability, modularity, and readability). Global-variable tracing was used to improve module coupling by localizing and minimizing global variables (improving modularity, readability, and robustness). The results were as follows: a method for applying SE to existing real-time applications after the fact, called 'Reengineering Real-Time Software Systems', was developed, which improves modifiability, modularity, robustness and readability. MML now has improved modularity and robustness, and is easier to read and maintain.
    http://archive.org/details/reengineeringrea1094539996
    Captain, United States Army
    Approved for public release; distribution is unlimited
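The call-hierarchy tracing described above can be illustrated mechanically. The thesis applied manual static analysis to the C implementation of MML; the sketch below is only a hypothetical Python analogue, using the standard-library `ast` module to extract a call hierarchy from source (the function names are invented for illustration, not taken from MML).

```python
# Hypothetical sketch: extract a call hierarchy (who calls whom) from source,
# restricted to functions defined in the same file. The thesis did this
# manually for C; here the same idea is shown for Python via the ast module.
import ast

SOURCE = """
def motor_set(speed):
    clamp(speed)

def clamp(x):
    return max(-100, min(100, x))

def navigate():
    motor_set(50)
    clamp(10)
"""

def call_hierarchy(source):
    """Map each defined function to the set of defined functions it calls."""
    tree = ast.parse(source)
    defined = {n.name for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)}
    calls = {name: set() for name in defined}
    for fn in ast.walk(tree):
        if isinstance(fn, ast.FunctionDef):
            for node in ast.walk(fn):
                if (isinstance(node, ast.Call)
                        and isinstance(node.func, ast.Name)
                        and node.func.id in defined):
                    calls[fn.name].add(node.func.id)
    return calls

h = call_hierarchy(SOURCE)
for name in sorted(h):
    print(name, "->", sorted(h[name]))
# clamp -> []
# motor_set -> ['clamp']
# navigate -> ['clamp', 'motor_set']
```

A hierarchy like this makes cohesion problems visible: a function called from many unrelated modules is a candidate for relocation into a shared module.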

    On Subprocesses, Process Variants and Their Interplay: An Integrated Divide-and-Conquer Method for Modeling Business Processes and Their Variants

    Every organization can be conceived of as a system that creates value by means of business processes. In large organizations it is common to represent business processes with process models, which serve a range of purposes such as internal communication, training, process improvement and information systems development. Given this multifunctional character, process models need to be captured in a way that lets a variety of stakeholders understand and maintain them. This thesis proposes an integrated, decomposition-driven method for modeling business processes together with their variants. The core idea of the method is to incrementally construct a decomposition of a business process and its variants into subprocesses. At each level of the decomposition, and for each subprocess, we determine whether that subprocess should be modeled in a consolidated manner (one subprocess model for all or several variants) or in a fragmented manner (one subprocess model per variant). In this way a top-down approach of slicing and dicing the business process is taken: the process is first sliced according to its variants and then diced (decomposed) into subprocesses. The decision is based on two parameters. The first is the business driver behind each variant: every variant of a business process has a root cause, i.e. a reason stemming from the business itself that makes the process executions differ. These root causes fall into five categories: drivers stemming from the customer, the product, operational reasons, the market, and time. The second parameter is the degree of difference in the way the variants produce their outcome (output values, execution order and so on). The method is validated in two real-life case studies. The first concerns the consolidation of existing process models; the second deals with green-field process discovery. The method is thus applied in two different contexts (consolidation and discovery) on two distinct cases. In both case studies the method produced sets of process models with up to 50% less duplication than conventional methods, while keeping the complexity of the models roughly stable.
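The consolidated-versus-fragmented decision described above can be sketched as a toy rule. This is an illustration under assumptions only: the method itself is analyst-driven, and the `modeling_style` helper, its 0-to-1 `difference` score and the 0.5 threshold are invented here, not part of the thesis.

```python
# Hypothetical encoding of the two decision parameters: the business driver
# behind the variants, and the degree of difference in how the variants
# produce their outcome. The 0.5 threshold is arbitrary, for illustration.
DRIVERS = {"customer", "product", "operational", "market", "time"}

def modeling_style(driver, difference):
    """Return 'fragmented' (one subprocess model per variant) when variants
    differ strongly in how they produce their outcome, else 'consolidated'
    (one model shared by several variants). `difference` is an analyst-assigned
    score in [0, 1]."""
    if driver not in DRIVERS:
        raise ValueError(f"unknown business driver: {driver}")
    return "fragmented" if difference > 0.5 else "consolidated"

print(modeling_style("customer", 0.8))   # fragmented
print(modeling_style("product", 0.2))    # consolidated
```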

    The 9th Conference of PhD Students in Computer Science


    A method for re-modularising legacy code

    This thesis proposes a method for the re-modularisation of legacy COBOL. Legacy code often performs a number of functions that, if split, would improve software maintainability. For instance, program comprehension would benefit from a reduction in the size of the code modules. The method aims to identify potential reuse candidates among the re-modularised functions and to ensure clear interfaces are present between the new modules. Furthermore, functionality is often replicated across applications, so the re-modularisation process can also seek to reduce commonality and hence the overall amount of a company's code requiring maintenance. A 10-step method is devised which assembles a number of new and existing techniques into an approach suitable for use by staff without significant reengineering experience. Three main approaches are used throughout the method: analysis of the PERFORM structure, analysis of the data, and the use of graphical representations. Both top-down and bottom-up strategies for program comprehension are incorporated in the method, as are automatable and user-controlled processes for reuse-candidate selection. Three industrial case studies are used to demonstrate and evaluate the method. The case studies range in size to give an indication of the scalability of the method, and they are used to evaluate the method on a step-by-step basis; both strong points and deficiencies are identified, as well as potential solutions to the deficiencies. A review is also presented to assess the three main approaches of the method: the analysis of the PERFORM and data structures, and the use of graphical representations. The review uses the process of software evolution for its evaluation, applied to successive versions of COBOL software: the method is retrospectively applied to the earliest version, and the known changes identified from the following versions are used to evaluate the re-modularisations. Within the evaluation chapters a new link within the dominance tree is proposed, as is an approach for dealing with multiple dominance trees. The results show that each approach provides an important contribution to the method, as well as giving a useful insight (in the form of graphical representations) into the process of software evolution.
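The dominance trees mentioned above are computed over a call (PERFORM) graph. As a minimal sketch, assuming a toy four-node graph in place of a real COBOL PERFORM structure, the classic iterative data-flow algorithm yields the dominator sets from which a dominance tree is built; a node dominated only by the root (like `C` below, PERFORMed from two places) is a candidate shared module.

```python
# Illustrative only: compute dominator sets over a tiny call graph with the
# classic iterative data-flow algorithm. dom[n] is the set of nodes through
# which every path from the root to n must pass.
def dominators(graph, root):
    nodes = set(graph)
    preds = {n: set() for n in nodes}
    for n, succs in graph.items():
        for s in succs:
            preds[s].add(n)
    dom = {n: set(nodes) for n in nodes}   # start from "everything dominates"
    dom[root] = {root}
    changed = True
    while changed:                          # refine until a fixed point
        changed = False
        for n in nodes - {root}:
            new = {n} | (set.intersection(*(dom[p] for p in preds[n]))
                         if preds[n] else set())
            if new != dom[n]:
                dom[n], changed = new, True
    return dom

# Toy PERFORM graph: MAIN performs A and B; A and B each perform C.
graph = {"MAIN": {"A", "B"}, "A": {"C"}, "B": {"C"}, "C": set()}
print(sorted(dominators(graph, "MAIN")["C"]))
# ['C', 'MAIN']: neither A nor B dominates C, so C is a shared-module candidate
```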

    Air Force Materiel Command: A Survey of Performance Measures

    Performance measurement has long been a matter of debate in logistics. However, in the recent past there has been a renewed emphasis, as AF leaders continue to seek funding for weapon system spares despite marginal improvements in mission capability. The Chief's Logistics Review, Logistics Transformation Program, AFMC Constraints Assessment Program, the Spares Requirement Review Board, the Spares Campaign, and the Depot Maintenance Reengineering and Transformation all represent efforts to find and implement effective answers (RAND, 2003:ix). And while there appears to be a consensus that better performance measures are needed, there is little agreement on exactly what should be measured, and how. Many performance management plans have been developed and recommended. In 1999, the Logistics Management Institute (LMI) published Supply Chain Management: A Recommended Performance Measurement Scorecard to guide senior DoD logistics managers. Then, in 2001, the AF Logistics Management Agency developed a set of aggregate, strategic-level metrics, Measuring The Health of USAF Supply, at the request of AF/ILS. Most recently, in November 2003, the Supply Management Division published the AFMC Supply Chain Metrics Guide. However, each of these performance measurement plans is distinctly different. This research seeks to determine how and why these performance measurement plans differ, and to examine what such differences might reveal about the nature of performance measurement in AF logistics systems.

    Cloud migration of legacy applications


    Acta Cybernetica : Volume 21. Number 4.


    Reduction in the balanced scorecard performance measurement systems in manufacturing organizations by PCA

    In this paper, we compare principal component analysis (PCA) and ordinal logistic regression (OLR) in ranking manufacturing systems. We present an integrated framework for the assessment and ranking of manufacturing systems based on management and organizational performance indicators. To achieve the objectives of this study, a comprehensive study was conducted to locate all economic and technical indicators that influence organizational performance. Sixty-one indicators were identified and classified into five categories, namely (1) financial, (2) customer satisfaction, (3) process innovation, (4) production process and (5) organizational learning and growth. These indicators are related to organizational and managerial productivity and efficiency. One actual test problem and a random sample of 12 indicators were selected to show the applicability of the integrated approach. The results of PCA and OLR show the weak and strong points of each sector with regard to the selected indicators, and identify which indicators have the major impact on the overall performance of industrial sectors. The modeling approach of this paper can readily be used for the managerial and organizational ranking and analysis of other sectors. The results of such studies would help top managers better understand and improve existing systems with respect to managerial and organizational performance.
    Keywords: productivity and competitiveness; multivariate statistics; integrated assessment; BSC
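As a rough sketch of the reduction step described above, assuming standardized indicator scores and random stand-in data (the paper's actual data and software are not reproduced here), PCA can be performed by eigen-decomposition of the correlation matrix, after which units can be ranked by their leading-component scores.

```python
# Illustrative sketch, not the paper's code: PCA via eigen-decomposition of
# the correlation matrix of a (units x indicators) score matrix, keeping the
# k leading components and ranking units by the first component.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 5))        # 12 units x 5 indicators (random stand-in)

def pca_scores(X, k=2):
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each indicator
    corr = np.cov(Z, rowvar=False)             # correlation-like matrix
    vals, vecs = np.linalg.eigh(corr)          # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]  # k leading principal axes
    return Z @ top                             # component scores per unit

scores = pca_scores(X)
ranking = np.argsort(-scores[:, 0])            # units ordered by first PC
print(scores.shape)                            # (12, 2)
```

In the paper's setting the same step would be applied to the 12 sampled indicators, with the retained components then feeding the ranking comparison against OLR.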