418 research outputs found

    Towards a Digital Capability Maturity Framework for Tertiary Institutions

    Background: The Digital Capability (DC) of an institution is the extent to which the institution's culture, policies, and infrastructure enable and support digital practices (Killen et al., 2017), and maturity is the continuous improvement of those capabilities. As technology continues to evolve, it is likely to give rise to constant changes in teaching and learning, potentially disrupting Tertiary Education Institutions (TEIs) and making existing organisational models less effective. An institution's ability to adapt to continuously changing technology depends on changes in culture and on leadership decisions within the individual institution. Change without structure leads to inefficiencies, evident across the Nigerian TEI landscape. These inefficiencies can be attributed mainly to a lack of clarity and agreement on a development structure. Objectives: This research aims to design a structure with a pathway to maturity, to support the continuous improvement of DC in TEIs in Nigeria and consequently improve the success of digital education programmes. Methods: I started by conducting a Systematic Literature Review (SLR) investigating the body of knowledge on DC, its composition, the relationships between its elements, and their respective impact on the maturity of TEIs. Findings from the review led me to further investigate the key roles instrumental in developing Digital Capability Maturity in Tertiary Institutions (DCMiTI). The results of these investigations formed the initial ideas and constructs upon which the proposed structure was built. I then explored a combination of quantitative and qualitative methods to substantiate the initial constructs and gain a deeper understanding of the relationships between elements/sub-elements. Next, I used triangulation as a vehicle to expand the validity of the findings by replicating the methods in a case study of TEIs in Nigeria.
Finally, after using the validated constructs and knowledge base to propose a structure based on CMMI concepts, I conducted an expert panel workshop to test the model's validity. Results: I consolidated the body of knowledge from the SLR into a universal classification of 10 elements, each comprising sub-elements, and went on to propose a classification for DCMiTI. The elements/sub-elements in the classification indicate the success factors for digital maturity, which were also found to positively impact the ability to design, deploy, and sustain digital education. These findings were confirmed in a UK university and triangulated in a case study of Northwest Nigeria. The case study confirmed the literature findings on the status of DCMiTI in Nigeria and provided sufficient evidence to suggest that a maturity structure would be a well-suited solution to supporting DCM in the region. I thus scoped, designed, and populated a domain-specific framework for DCMiTI, configured to support the educational landscape in Northwest Nigeria. Conclusion: The proposed DCMiTI framework enables TEIs to assess their maturity level across the various capability elements and reports on DCM as a whole. It provides guidance on the criteria that must be satisfied to achieve higher levels of digital maturity. The framework received expert validation, as domain experts agreed that it was applicable to developing DCMiTI and would be a valuable tool to support TEIs in delivering successful digital education. Recommendations were made to engage in further iterations of testing by deploying the proposed framework in TEIs to confirm the extent of its generalisability and acceptability.

    Methodological approaches and techniques for designing ontologies in information systems requirements engineering

    Doctoral programme in Information Systems and Technology. The way we interact with the world around us is changing as new challenges arise: embracing innovative business models, rethinking organisations and processes to maximise results, and evolving change management. Currently, and considering the projects executed, the methodologies used do not fully respond to companies' needs. On the one hand, organizations are not familiar with the languages used in Information Systems; on the other hand, they are often unable to validate requirements or business models. These are some of the difficulties encountered that led us to think about formulating a new approach. Thus, the state of the art presented in this document includes a study of the models involved in the software development process, covering both traditional methods and the rival agile methods. In addition, a survey is made of ontologies and the methods that exist to conceive, transform, and represent them. After analyzing some of the various possibilities currently available, we began the process of evolving a method and developing an approach that would allow us to design ontologies. The method we evolved and adapted allows us to derive terminologies from a specific domain, aggregating them in order to facilitate the construction of a catalog of terminologies. Next, the definition of an approach to designing ontologies allows the construction of a domain-specific ontology. This approach makes it possible, in the first instance, to integrate and store data from the different information systems of a given organization. In the second instance, the rules for mapping and building the ontology database are defined. Finally, a technological architecture is also proposed that allows an ontology to be mapped through the construction of complex networks, enabling terminologies to be mapped and related.
This doctoral work encompasses numerous Research & Development (R&D) projects belonging to different domains such as the Software Industry, Textile Industry, Robotic Industry, and Smart Cities. Finally, a critical and descriptive analysis of the work done is performed, and we also point out perspectives for possible future work.
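The terminology-derivation step described above can be illustrated with a minimal, self-contained sketch. This is not the thesis's actual method or tooling: the extraction, catalogue, and co-occurrence network below are deliberately naive stand-ins, using only the Python standard library, to show the general shape of deriving terminologies from domain documents, aggregating them into a catalogue, and relating them as a simple term network.

```python
from collections import Counter
from itertools import combinations

# Toy stopword list; a real pipeline would use a proper linguistic resource.
STOPWORDS = {"the", "a", "of", "and", "to", "in", "for"}

def extract_terms(document):
    """Naive terminology extraction: lowercase tokens minus stopwords."""
    tokens = [t.strip(".,;:").lower() for t in document.split()]
    return [t for t in tokens if t and t not in STOPWORDS]

def build_catalogue(documents):
    """Aggregate term frequencies across all documents into a catalogue."""
    catalogue = Counter()
    for doc in documents:
        catalogue.update(extract_terms(doc))
    return catalogue

def build_term_network(documents):
    """Relate terms that co-occur in the same document (edge weights)."""
    edges = Counter()
    for doc in documents:
        terms = sorted(set(extract_terms(doc)))
        edges.update(combinations(terms, 2))
    return edges

docs = [
    "requirements engineering for information systems",
    "ontology design for information systems",
]
catalogue = build_catalogue(docs)
network = build_term_network(docs)
print(catalogue["information"])              # 2: appears in both documents
print(network[("information", "systems")])   # 2: co-occurs in both documents
```

In a full approach, the co-occurrence edges would feed the complex-network mapping stage, and the catalogue entries would be candidates for ontology classes and relations.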

    DevOps Ontology - An ontology to support the understanding of DevOps in the academy and the software industry

    Currently, the degree of knowledge about what DevOps really means and what it entails is still limited. This can result in informal and even incorrect implementations in many cases. Although several proposals related to DevOps adoption can be found, confusion is not uncommon, and terminology conflicts between the proposals are still evident. This article proposes the DevOps Ontology, a semi-formal ontology that offers a generic, consistent, and clear language to enable the dissemination of information related to implementing DevOps in software development. The ontology presented in this article facilitates the understanding of DevOps by identifying the relationships between software process elements and the agile principles/values that may be related to them. The DevOps Ontology was defined using the REFSENO formalism, which represents concepts in UML, and the OWL language, with Protégé and the HermiT reasoner used to evaluate the consistency of its structure. Likewise, it was satisfactorily evaluated in three application cases: a theoretical validation; an instantiation of the continuous integration and deployment practices proposed by the company GitLab; and a mobile app created to retrieve information from the DevOps Ontology using the SPARQL protocol and the RDF language. The app also evaluated the ontology's proficiency in responding to knowledge-based questions using SPARQL. The results showed that the DevOps Ontology is consistent, complete, and concise: consistency was demonstrated by the ability to infer knowledge from the ontology; completeness was ensured by checking for any incompleteness and verifying that all necessary definitions and inferences are well-established; and conciseness was assessed to ensure that the ontology does not contain redundant or unnecessary definitions.
Furthermore, it has the potential for improvement by incorporating new concepts and relationships as needed. The newly suggested ontology creates a set of terms that provides a systematic and structured approach to organizing the existing knowledge in the field. This helps to minimize the confusion, inconsistency, and heterogeneity of the terminologies and concepts in the area of interest.
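The kind of knowledge-based question the mobile app answers over the ontology can be sketched in miniature. This is a hypothetical stand-in, not the actual DevOps Ontology: the real artifact is an OWL ontology queried via SPARQL, whereas here a tiny fragment is encoded as plain (subject, predicate, object) triples with invented class and relation names, and the "query" is a simple pattern match.

```python
# Invented ontology fragment: triples relating DevOps practices
# to agile values, in the spirit (not the letter) of the article.
TRIPLES = [
    ("ContinuousIntegration", "isA", "DevOpsPractice"),
    ("ContinuousDeployment", "isA", "DevOpsPractice"),
    ("ContinuousIntegration", "supports", "FastFeedback"),
    ("FastFeedback", "isA", "AgileValue"),
]

def query(triples, subject=None, predicate=None, object_=None):
    """Return triples matching the pattern; None acts as a wildcard,
    much like an unbound variable in a SPARQL basic graph pattern."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (object_ is None or t[2] == object_)]

# Competency question: which elements are DevOps practices?
practices = [s for s, _, _ in query(TRIPLES, predicate="isA",
                                    object_="DevOpsPractice")]
print(practices)  # ['ContinuousIntegration', 'ContinuousDeployment']
```

The equivalent SPARQL pattern would bind a variable where the `None` wildcard appears; the point here is only the shape of answering competency questions against a triple-based model.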

    A conceptual framework for SPI evaluation

    Software Process Improvement (SPI) encompasses the analysis and modification of the processes within software development, aimed at improving key areas that contribute to the organization's goals. The task of evaluating whether the selected improvement path meets these goals is challenging. On the basis of the results of a systematic literature review on SPI measurement and evaluation practices, we developed a framework, the SPI Measurement and Evaluation Framework (SPI-MEF), that supports the planning and implementation of SPI evaluations. SPI-MEF guides the practitioner in scoping the evaluation, determining measures, and performing the assessment. SPI-MEF does not assume a specific approach to process improvement and can be integrated into existing measurement programs, refocusing the assessment on evaluating the improvement initiative's outcome. Sixteen industry and academic experts evaluated the framework's usability and capability to support practitioners, providing additional insights that were integrated into the application guidelines of the framework.

    Organisaation ohjelmistotestauspolitiikka ja -strategia (Organisational software testing policy and strategy)

    Software testing is not only an important part of the software development process but also a tool for the end user of a software product during its deployment and lifetime maintenance. The main goal of testing is to improve product quality and to ensure that the product meets the wishes and requirements set for it. The earlier defects in the product can be detected and corrected, the smaller the risks of growing development costs, poor product quality, and end-user dissatisfaction become. As a concept, software testing covers a wide range of areas, methods, and testing models, as well as requirements for the organisation performing the testing. A set of standards is available to support the definition, planning, and implementation of software testing processes, offering guidelines, principles, and best practices for achieving efficient, reliable, and effective testing processes. How well an organisation succeeds in applying testing processes in practice, i.e. the maturity of the testing organisation, can be assessed with various maturity models and maturity assessment models. Organisations can use such models as references, compare their own operations against the practices presented in them, and thereby evaluate their own performance. Formal maturity and capability assessments are based on an audit by an external, independent party applying the assessment model and methods in question. An organisation-level software testing policy and test strategy are documents containing guidelines, processes, and practices that describe the organisation's approach to testing from the management's point of view. The testing policy and strategy define the high-level objectives of testing, the available resources, and the testing processes. The study was conducted as a case study, and the research data was collected through expert interviews.
The interview had two parts: the first collected information on the organisation's software testing maturity and capability using the TMMi assessment framework, while the second addressed the organisation-level testing policy and strategy. The outcome of the study is a proposal for a test strategy template for the target organisation, together with the testing practices and methods it should cover. The study produced information on the target organisation's strengths and areas for improvement in five TMMi process areas. The study found that the target organisation does not have an organisation-level software testing policy or strategy in place. It identified elements of a test strategy, such as the use of standards, consideration of product-related risks early in testing, and attention to stakeholders, as practices that could be created and developed by drafting and deploying a testing policy and strategy.
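The staged rating logic behind maturity assessments of this kind can be sketched as follows. This is a simplified, hypothetical illustration, not the TMMi method itself: the level-to-process-area mapping below is abbreviated and partly invented, and a real appraisal grades achievement on a scale rather than as a boolean.

```python
# Abbreviated, partly invented mapping of maturity levels to process
# areas (the real TMMi model defines more areas per level).
LEVELS = {
    2: ["Test Policy and Strategy", "Test Planning", "Test Monitoring"],
    3: ["Test Organization", "Test Training Program"],
}

def maturity_level(ratings):
    """Return the highest level whose process areas (and all lower
    levels' areas) are all rated as achieved. Level 1 is the default."""
    level = 1  # level 1 ("Initial") has no process areas to satisfy
    for lvl in sorted(LEVELS):
        if all(ratings.get(area, False) for area in LEVELS[lvl]):
            level = lvl
        else:
            break  # staged model: a gap blocks all higher levels
    return level

ratings = {
    "Test Policy and Strategy": False,  # e.g. no organisation-level policy
    "Test Planning": True,
    "Test Monitoring": True,
}
print(maturity_level(ratings))  # 1: the missing policy blocks level 2
```

The staged structure is why a missing organisation-level testing policy, as found in the study's target organisation, caps the assessed maturity regardless of strengths elsewhere.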

    Framework for the Automation of SDLC Phases using Artificial Intelligence and Machine Learning Techniques

    Software Engineering acts as a foundation stone for any software that is being built. It provides a common road map for the construction of software from any domain. Not following a well-defined software development model has led to the failure of many software projects in the past. Agile is the Software Development Life Cycle (SDLC) model most widely used in practice in the IT industry to develop software on various technologies such as Big Data, Machine Learning, Artificial Intelligence, and Deep Learning. The focus on the software engineering side in recent years has been on trying to automate the various phases of the SDLC, namely Requirements Analysis, Design, Coding, Testing, and Operations and Maintenance. Incorporating the latest trending technologies, such as Machine Learning and Artificial Intelligence, into the various phases of the SDLC could facilitate better execution of each of these phases. This in turn helps to cut down costs, save time, improve efficiency, and reduce the manual effort required for each of these phases. The aim of this paper is to present a framework for the application of various Artificial Intelligence and Machine Learning techniques in the different phases of the SDLC.
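One small instance of the idea of automating an SDLC phase can be sketched concretely. This is a hypothetical illustration, not the paper's framework: it triages incoming change requests (an Operations and Maintenance task) as defect fixes or feature work using a trivial keyword model, a deliberately simple stand-in for the ML classifiers such a framework would actually apply.

```python
# Invented keyword sets; a real system would learn these signals
# from labelled historical change requests.
BUG_KEYWORDS = {"fix", "bug", "error", "crash", "fail"}
FEATURE_KEYWORDS = {"add", "new", "support", "implement", "feature"}

def triage(change_request):
    """Score a request against both keyword sets and pick a label.
    Ties go to 'defect' so suspected bugs are never deprioritised."""
    tokens = set(change_request.lower().split())
    bug_score = len(tokens & BUG_KEYWORDS)
    feature_score = len(tokens & FEATURE_KEYWORDS)
    return "defect" if bug_score >= feature_score else "feature"

print(triage("fix crash on login"))          # defect
print(triage("add support for new locale"))  # feature
```

Swapping the keyword scorer for a trained text classifier, while keeping the same triage interface, is the shape of the automation the paper's framework envisions across the SDLC phases.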

    Ohjelmistokehityssyklien kiihdytys osana julkaisutiheyden kasvattamista ohjelmistotuotannossa (Accelerating software development cycles as part of increasing release frequency in software engineering)

    In recent years, companies engaged in software development have adopted practices that allow them to release software changes almost daily to their users. Previously, release frequency for software was counted in months or even years, so the leap to daily releases can be considered big. The underlying change to software development practices is equally large, spanning from individual development teams to organizations as a whole. The phenomenon has been framed as continuous software engineering by the software engineering research community. Researchers are beginning to realize the impact of continuous software engineering on existing disciplines in the field. Continuous software engineering can be seen to touch almost every aspect of software development, from the inception of an idea to its eventual manifestation as a release to the public. Release management, or release engineering, has become an art in itself that must be mastered in order to be effective in releasing changes rapidly. Empirical studies in the area should be helpful in further exploring the industry-driven phenomenon and understanding the effects of continuous software engineering better. The purpose of this thesis is to provide insight into the habit, promoted by continuous software engineering, of releasing software changes often. There are three main themes in the thesis. The first theme seeks an answer to the rationale for frequent releases. The second theme focuses on charting the software processes and practices that need to be in place when releasing changes frequently. Organizational circumstances surrounding the adoption of frequent releases and related practices are highlighted in the third theme. Methodologically, this thesis builds on a set of case studies. Focusing on the software development practices of Finnish industrial companies, the thesis data has been collected from 33 different cases using a multiple-case design.
Semi-structured interviews were used for data collection, along with a single survey. Respondents for the interviews included developers, architects, and other people involved in software development. Thematic analysis was the primary qualitative approach used to analyze the interview responses. Data from the survey was analyzed quantitatively. Results of the thesis indicate that a higher release frequency makes sense in many cases, but there are constraints in selected domains. Daily releases were reported to be rare in the case projects. In most cases, there was a significant difference between the capability to deploy changes and the actual release cycle. A strong positive correlation was found between delivery capability and a high degree of task automation. Respondents perceived that with frequent releases, users get changes faster, the rate of feedback cycles is increased, and product quality can improve. Breaking the software development process down into four quadrants (requirements, development, testing, and operations and infrastructure), the results suggest that continuity is required in all four to support frequent releases. In the case companies, the supporting development practices were usually in place, but specific types of testing and the facilities for deploying changes effortlessly were not. Realigning processes and practices accordingly needs strong organizational support. The responses imply that the organizational culture, division of labor, employee training, and customer relationships all need attention. With the right processes and the right organizational framework, frequent releases are indeed possible in specific domains and environments. In the end, release practices need to be considered individually in each case by weighing the associated risks and benefits.
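The reported association between task automation and delivery capability is the kind of relationship a correlation coefficient captures. The sketch below is purely illustrative: the scores are synthetic values invented for demonstration (the thesis's finding rests on its own survey data, not these numbers), and the Pearson computation is written out in plain Python to show the mechanics.

```python
from math import sqrt

# Synthetic per-case scores, invented for illustration only.
automation = [1, 2, 2, 3, 4, 5, 5]   # 1 = mostly manual, 5 = fully automated
capability = [1, 1, 2, 3, 3, 4, 5]   # 1 = monthly at best, 5 = on demand

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(automation, capability)
print(round(r, 2))  # strongly positive for this synthetic data
```

A coefficient near +1, as these synthetic values produce, is what "strong positive correlation" denotes; real survey data would of course also need a significance test.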
At best, users get to enjoy enhancements quicker and experience an increase in the perceived value of software sooner than would otherwise be possible.

    The determinants of value addition: a critical analysis of the global software engineering industry in Sri Lanka

    It was evident from the literature that the perceived value delivery of the global software engineering industry is low due to various factors. This research therefore concerns global software product companies in Sri Lanka, exploring the software engineering methods and practices that increase value addition. The overall aim of the study is to identify the key determinants of value addition in the global software engineering industry and to critically evaluate their impact on software product companies, to help maximise value addition and ultimately assure the sustainability of the industry. An exploratory research approach was used initially, since findings would emerge as the study unfolded. A mixed method was employed, as the literature itself was inadequate to investigate the problem effectively and formulate the research framework. Twenty-three face-to-face online interviews were conducted with subject matter experts covering all the disciplines from the targeted organisations; these were combined with the literature findings as well as the outcomes of market research conducted by both government and non-government institutes. Data from the interviews were analysed using NVivo 12. The findings of the existing literature were verified through the exploratory study, and the outcomes were used to formulate the questionnaire for the public survey. After cleansing, 371 of the responses received were retained for data analysis in SPSS 21 at an alpha level of 0.05. An internal consistency test was performed before the descriptive analysis. After assuring the reliability of the dataset, correlation tests, multiple regression tests, and analysis of variance (ANOVA) tests were carried out to meet the research objectives. Five determinants of value addition were identified, along with the key themes for each area.
They are staffing, delivery process, use of tools, governance, and technology infrastructure. Cross-functional, self-organised teams built around value streams, employing a properly interconnected software delivery process with the right governance in the delivery pipelines, the right selection of tools, and the right infrastructure, increase value delivery. Conversely, the constraints on value addition are poor interconnection of internal processes, rigid functional hierarchies, inaccurate selection and use of tools, inflexible team arrangements, and inadequate focus on technology infrastructure. The findings add to the existing body of knowledge on increasing value addition by employing effective processes, practices, and tools, and on the impact of misapplying the same in the global software engineering industry.
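The internal-consistency step mentioned above is commonly Cronbach's alpha over related questionnaire items. The sketch below shows the computation itself; the responses are synthetic values invented for illustration (the study ran its analysis in SPSS 21 on its 371 real responses).

```python
def variance(values):
    """Sample variance (n - 1 denominator)."""
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / (len(values) - 1)

def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item response lists,
    one score per respondent per item, respondents in the same order."""
    k = len(items)
    item_var = sum(variance(item) for item in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Five synthetic respondents answering three related items on a 1-5 scale.
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))  # 0.86 for this synthetic data
```

An alpha around 0.7 or higher is the conventional threshold for treating a scale as reliable enough to proceed to descriptive and inferential analysis, which is the role the internal consistency test plays in the study's pipeline.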