Using Analytical Information for Digital Business Transformation through DataOps: A Review and Conceptual Framework
Organisations are increasingly practising business analytics to generate actionable insights that can guide their digital business transformation. Transforming a business digitally using business analytics is an ongoing process that requires an integrated and disciplined approach to leveraging analytics and promoting collaboration. An emerging business analytics practice, Data Operations (DataOps), provides a disciplined approach for organisations to collaborate using analytical information for digital business transformation. We propose a conceptual framework by reviewing the literature on business analytics, DataOps and organisational information processing theory (OIPT). This conceptual framework explains how organisations can employ DataOps as an integrated and disciplined approach for developing the analytical information processing capability and facilitating the boundary-spanning activities required for digital business transformation. This research (a) extends current knowledge on digital transformation by linking it with business analytics from the perspective of OIPT and boundary-spanning activities, and (b) presents DataOps as a novel approach to using analytical information for digital business transformation.
On the real world practice of Behaviour Driven Development
Surveys of industry practice over the last decade suggest that Behaviour Driven Development (BDD) is a popular Agile practice. For example, 19% of respondents to the 14th State of Agile annual survey reported using BDD, placing it among the top 13 practices reported. As well as potential benefits, the adoption of BDD necessarily involves the additional cost of writing and maintaining Gherkin features and scenarios and (if used for acceptance testing) the associated step functions. Yet there is a lack of published literature exploring how BDD is used in practice and the challenges experienced by real-world software development efforts. This gap is significant because, without understanding current real-world practice, it is hard to identify opportunities to address and mitigate challenges. To address this research gap concerning the challenges of using BDD, this thesis reports on a research project which explored: (a) the challenges of applying agile and undertaking requirements engineering in a real-world context; (b) the challenges of applying BDD specifically; and (c) the application of BDD in open-source projects, to understand challenges in this different context.
For this purpose, we progressively conducted two case studies, two series of interviews, four iterations of action research, and an empirical study. The first case study was conducted in an avionics company to discover the challenges of using an agile process in a large scale safety critical project environment. Since requirements management was found to be one of the biggest challenges during the case study, we decided to investigate BDD because of its reputation for requirements management. The second case study was conducted in the company with an aim to discover the challenges of using BDD in real life. The case study was complemented with an empirical study of the practice of BDD in open source projects, taking a study sample from the GitHub open source collaboration site.
As a result of this Ph.D. research, we were able to discover: (i) the challenges of using an agile process in a large-scale safety-critical organisation; (ii) the current state of BDD in practice; (iii) technical limitations of Gherkin (i.e., the language for writing requirements in BDD); (iv) the challenges of using BDD in a real project; and (v) bad smells in the Gherkin specifications of open-source projects on GitHub. We also present a brief comparison between the theoretical description of BDD and BDD in practice. This research therefore presents the lessons learned from BDD in practice, and serves as a guide for software practitioners planning to use BDD in their projects.
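The maintenance cost the thesis highlights comes from keeping every Gherkin step in sync with a matching step function. A minimal sketch of that mechanic, using plain Python with a hand-rolled step registry rather than a real BDD tool such as Cucumber or behave (the feature text and step names are hypothetical):

```python
import re

# A hypothetical Gherkin scenario of the kind the thesis studies: each step
# line must stay in sync with a matching step function below.
FEATURE = """
Scenario: Successful login
  Given a registered user "alice"
  When "alice" logs in with a valid password
  Then the dashboard is shown
"""

STEPS = []  # registry of (compiled pattern, handler) pairs

def step(pattern):
    """Register a step function for scenario lines matching `pattern`."""
    def decorator(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return decorator

@step(r'Given a registered user "(\w+)"')
def given_user(ctx, name):
    ctx["users"] = {name}

@step(r'When "(\w+)" logs in with a valid password')
def when_login(ctx, name):
    ctx["logged_in"] = name in ctx["users"]

@step(r"Then the dashboard is shown")
def then_dashboard(ctx):
    assert ctx["logged_in"]

def run(feature):
    """Match each scenario line to its step function and execute it."""
    ctx = {}
    for line in feature.strip().splitlines()[1:]:  # skip the Scenario: line
        line = line.strip()
        for pattern, fn in STEPS:
            m = pattern.fullmatch(line)
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise RuntimeError(f"no step function for: {line}")
    return ctx

result = run(FEATURE)
```

A renamed step or reworded scenario line breaks the pattern match, which is one concrete form the "writing and maintaining" cost takes.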
Configuration Management of Distributed Systems over Unreliable and Hostile Networks
The economic incentive of large criminal profits and the threat of legal consequences have pushed criminals to continuously improve their malware, especially its command and control channels. This thesis applied concepts from successful malware command and control to explore the survivability and resilience of benign configuration management systems.
This work expands on existing stage models of the malware life cycle to contribute a new model for identifying malware concepts applicable to benign configuration management. The Hidden Master architecture is a contribution to master-agent network communication. In the Hidden Master architecture, communication between master and agent is asynchronous and can operate through intermediate nodes. This protects the master secret key, which gives full control of all computers participating in configuration management. Multiple improvements to idempotent configuration were proposed, including the definition of a minimal base resource dependency model, simplified resource revalidation, and the use of an imperative general-purpose language for defining idempotent configuration.
Following the constructive research approach, the improvements to configuration management were designed into two prototypes. This allowed validation in laboratory testing, in two case studies and in expert interviews. In laboratory testing, the Hidden Master prototype was more resilient than leading configuration management tools in high load and low memory conditions, and against packet loss and corruption. Only the research prototype was adaptable to a network without stable topology due to the asynchronous nature of the Hidden Master architecture.
The main case study used the research prototype in a complex environment to deploy a multi-room, authenticated audiovisual system for a client of the organization deploying the configuration. The case studies indicated that an imperative general-purpose language can be used for idempotent configuration in real life, both for defining new configurations in unexpected situations using the base resources and for abstracting those configurations using standard language features, and that such a system seems easy to learn.
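The core property of idempotent configuration in an imperative general-purpose language can be sketched as follows. This is an illustrative sketch only, not the thesis prototype: a hypothetical `ensure_file` resource converges a file to a desired state, changes nothing when the state already holds, and reports whether it acted:

```python
import pathlib
import tempfile

def ensure_file(path: pathlib.Path, content: str) -> bool:
    """Bring the file to the desired state; return True only if a change was made."""
    if path.exists() and path.read_text() == content:
        return False          # already converged: applying again is a no-op
    path.write_text(content)  # converge to the desired state
    return True

# Applying the same resource twice: the first apply changes the system,
# the second detects convergence and does nothing.
workdir = pathlib.Path(tempfile.mkdtemp())
target = workdir / "motd"
first = ensure_file(target, "welcome\n")   # makes a change
second = ensure_file(target, "welcome\n")  # no change needed
```

Composing such resources with ordinary language features (functions, loops, modules) is what allows new configurations to be defined and abstracted in unexpected situations, as the case studies report.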
Potential business benefits were identified and evaluated using individual semi-structured expert interviews. Respondents agreed that the models and the Hidden Master architecture could reduce costs and risks, improve developer productivity and allow faster time-to-market. Protection of master secret keys and the reduced need for incident response were seen as key drivers of improved security. Low-cost geographic scaling and leveraging the file-serving capabilities of commodity servers were seen to improve scaling and resiliency. Respondents identified jurisdictional legal limitations on encryption and requirements for cloud operator auditing as factors potentially limiting the full use of some concepts.
Multidisciplinary perspectives on Artificial Intelligence and the law
This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data and the advancement of algorithms, AI has now become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics – and although AI was initially allowed to largely develop without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.
Insights into software development approaches: mining Q&A repositories
© 2023 The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/. Context: Software practitioners adopt approaches like DevOps, Scrum, and Waterfall for high-quality software development. However, limited research has been conducted on exploring software development approaches in terms of practitioners' discussions on Q&A forums. Objective: We conducted an empirical study to analyze developers' discussions on Q&A forums to gain insights into software development approaches in practice. Method: We analyzed 13,903 developers' posts across the Stack Overflow (SO), Software Engineering Stack Exchange (SESE), and Project Management Stack Exchange (PMSE) forums. A mixed-methods approach, consisting of a topic modeling technique (Latent Dirichlet Allocation (LDA)) and qualitative analysis, was used to identify frequently discussed topics of software development approaches, trends (popular and difficult topics), and the challenges faced by practitioners in adopting different software development approaches. Findings: We identified 15 frequently mentioned software development approach topics on Q&A sites and observed an increase in trends for the top three most difficult topics, which require more attention. Finally, our study identified 49 challenges faced by practitioners while deploying various software development approaches, and we subsequently created a thematic map to represent these findings. Conclusions: The study findings serve as a useful resource for practitioners to overcome challenges, stay informed about current trends, and ultimately improve the quality of the software products they develop.
Revisiting the capitalization of public transport accessibility into residential land value: an empirical analysis drawing on Open Science
Background: The delivery and effective operation of public transport is fundamental for a transition to low-carbon emission transport systems. However, many cities face budgetary challenges in providing and operating this type of infrastructure. Land value capture (LVC) instruments, aimed at recovering all or part of the land value uplifts triggered by actions other than the landowner's, can alleviate some of this pressure. A key element of LVC lies in the increment in land value associated with a particular public action. Urban economic theory supports this idea and considers accessibility to be a core determinant of residential land value. Although the empirical literature assessing the relationship between land value increments and public transport infrastructure is vast, it often assumes homogeneous benefits and therefore overlooks relevant elements of accessibility. Advances in the accessibility concept in the context of Open Science can ease the relaxation of such assumptions.
Methods: This thesis draws on the case of Greater Mexico City between 2009 and 2019. It focuses on the effects of the main public transport network (MPTN), which is organised in seven temporal stages according to its expansion phases. The analysis incorporates location-based accessibility measures to employment opportunities in order to assess the benefits of public transport infrastructure. It does so by making extensive use of the open-source software OpenTripPlanner for public transport route modelling (≈ 2.1 billion origin-destination routes). Potential capitalizations are assessed according to the hedonic framework. The property value data include individual administrative mortgage records collected by the Federal Mortgage Society (≈ 800,000 records). The hedonic function is estimated using a variety of approaches, i.e. linear models, nonlinear models, multilevel models, and spatial multilevel models, estimated by maximum likelihood and Bayesian methods. The study also examines possible spatial aggregation bias using alternative spatial aggregation schemes, following the modifiable areal unit problem (MAUP) literature.
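The simplest member of the family of models described above is a log-linear hedonic specification, where the coefficient on a standardised accessibility measure is read as the price premium per standard deviation of accessibility. A minimal sketch with synthetic data (the variable names and true coefficients are illustrative, not the thesis estimates):

```python
import numpy as np

# Synthetic data standing in for the mortgage records: log price driven by
# a standardised accessibility measure plus one control variable.
rng = np.random.default_rng(0)
n = 5000
access = rng.standard_normal(n)       # accessibility to jobs (standardised)
floor_area = rng.standard_normal(n)   # a control variable (standardised)
log_price = 12.0 + 0.05 * access + 0.30 * floor_area + rng.normal(0, 0.1, n)

# Ordinary least squares fit of the hedonic function
X = np.column_stack([np.ones(n), access, floor_area])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
# beta[1] estimates the semi-elasticity: the approximate proportional change
# in price for a one standard deviation increase in accessibility
```

With the true premium set to 0.05, the recovered `beta[1]` corresponds to roughly a 5% price increase per standard deviation of accessibility, the same kind of quantity the thesis reports (3.6% to 5.7%); the multilevel and spatial extensions relax the independence assumptions this plain OLS makes.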
Results: The accessibility models across the various temporal stages reveal the spatial heterogeneity shaped by the MPTN in combination with land use and the individual perception of residents. This highlights the need to transition from measures that focus on the characteristics of transport infrastructure to comprehensive accessibility measures which reflect such heterogeneity. The estimated hedonic function suggests a robust, positive, and significant relationship between MPTN accessibility and residential land value in all the modelling frameworks, in the presence of a variety of controls. Residential land value increases by between 3.6% and 5.7% for one additional standard deviation of MPTN accessibility to employment in the final set of models. The total willingness to pay (TWTP) is considerable, ranging from 0.7 to 1.5 times the capital costs of bus rapid transit Line-7 of the Metrobús system. A sensitivity analysis shows that the hedonic model estimation is sensitive to the MAUP. In addition, the use of a postcode zoning scheme produces the closest results to those of the smallest spatial analytical scheme (a 0.5 km hexagonal grid).
Conclusion: The present thesis advances the discussion on the capitalization of public transport into residential land value by adopting recent contributions from the Open Science framework. Empirically, it fills a knowledge gap, given the lack of literature on this topic in this area of study. In terms of policy, the findings support LVC as a mechanism of considerable potential. Regarding fee-based LVC instruments, there are fairness issues in relation to the distribution of charges or exactions to households that could be addressed using location-based measures. Furthermore, the approach developed for this analysis serves as valuable guidance for identifying sites with large potential for the implementation of development-based instruments, for instance land readjustments or the sale/lease of additional development rights.
The impact of enterprise social networking on knowledge sharing between academic staff in higher education
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London. Higher education institutions have always considered knowledge sharing critical for research excellence, and finding proper methods for sharing knowledge across academic staff has therefore been a major issue for universities and knowledge management research. Recent evidence shows that many universities have embraced enterprise social networking tools to improve communication, relationships, partnerships, and knowledge sharing. To date, there is little understanding of the critical factors for online knowledge-sharing behaviour between academic staff, and of how the impact of these factors on work benefits for academic staff differs between consumptive users and contributive users in higher education. This study employed the extended unified theory of acceptance and use of technology (UTAUT) to examine factors affecting knowledge sharing in relation to consumptive and contributive enterprise social network (ESN) use. The study adopts a critical realism philosophical approach and employed a grounded theory mixed-methods design. The conceptual model was validated through structural equation modelling based on an online survey of 254 academic staff using enterprise social networking as part of their work in the United Kingdom. The findings have significant theoretical and practical implications for researchers and policy makers. The research has developed a cohesive ESN use model by extending and modifying the unified theory of acceptance and use of technology. The findings indicate significant differences in the factors affecting consumptive and contributive usage patterns within ESNs. Given advances in communication technologies, this research argues that the earlier model suggested by Venkatesh et al. (2003) is no longer fit for purpose and that the new communication tools can lead to improved knowledge sharing in higher education.
This research also makes valuable contributions to universities from a managerial viewpoint, suggesting that universities could help their scholars find a more comprehensive range of funding sources matching the scholars' ideas.