
    The Metaverse: Survey, Trends, Novel Pipeline Ecosystem & Future Directions

    The Metaverse offers a second world beyond reality, where boundaries are non-existent and possibilities are endless, through engagement and immersive experiences using virtual reality (VR) technology. Many disciplines can benefit from the advancement of the Metaverse when accurately developed, including technology, gaming, education, art, and culture. Nevertheless, developing the Metaverse environment to its full potential is a complex task that needs proper guidance and direction. Existing surveys on the Metaverse focus only on a specific aspect or discipline of the Metaverse and lack a holistic view of the entire process. To this end, a more holistic, multi-disciplinary, in-depth review oriented to both academia and industry is required to provide a thorough study of the Metaverse development pipeline. To address these issues, we present in this survey a novel multi-layered pipeline ecosystem composed of (1) the Metaverse computing, networking, communications and hardware infrastructure, (2) environment digitization, and (3) user interactions. For every layer, we discuss the components that detail the steps of its development. For each of these components, we also examine the impact of a set of enabling technologies and empowering domains (e.g., Artificial Intelligence, Security & Privacy, Blockchain, Business, Ethics, and Social) on its advancement. In addition, we explain the importance of these technologies in supporting decentralization, interoperability, user experiences, interactions, and monetization. Our study highlights the existing challenges for each component, followed by research directions and potential solutions. To the best of our knowledge, this survey is the most comprehensive to date and allows users, scholars, and entrepreneurs to gain an in-depth understanding of the Metaverse ecosystem and identify their opportunities for contribution.

    Architecture Smells vs. Concurrency Bugs: an Exploratory Study and Negative Results

    Technical debt occurs in many different forms across software artifacts. One such form is connected to software architectures, where debt emerges as structural anti-patterns across architecture elements, namely architecture smells. As defined in the literature, ``Architecture smells are recurrent architectural decisions that negatively impact internal system quality", thus increasing technical debt. In this paper, we aim to explore whether manifestations of architectural technical debt exist beyond decreased code or architectural quality, namely, whether there is a relation between architecture smells (which primarily reflect structural characteristics) and the occurrence of concurrency bugs (which primarily manifest at runtime). We study 125 releases of 5 large data-intensive software systems and reveal that (1) several architecture smells may in fact indicate the presence of concurrency problems likely to manifest at runtime, but (2) smells are not correlated with concurrency in general -- rather, for specific concurrency bugs, smells must be considered together with specific project characteristics such as project distribution. For example, a cyclic dependency may be present in the code, yet the corresponding execution flow may never be exercised at runtime.
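The abstract's closing remark can be illustrated with a minimal, hypothetical sketch (the class names and retry logic are invented for illustration, not taken from the studied systems): a static cyclic dependency exists between two components, but the code path that traverses the back-edge is never executed at runtime.

```python
# Hypothetical illustration: a static cyclic dependency (architecture smell)
# whose back-edge is never traversed in the actual execution flow.

class Scheduler:
    """Depends on Worker to dispatch jobs."""
    def __init__(self, worker):
        self.worker = worker

    def dispatch(self, job):
        return self.worker.run(job)

class Worker:
    """Statically depends back on Scheduler, closing the cycle; the
    re-dispatch path is only taken on a retry that never happens here."""
    def __init__(self):
        self.scheduler = None  # back-reference completes the cycle

    def run(self, job, retries=0):
        if retries > 0:  # dead at runtime when retries == 0
            return self.scheduler.dispatch(job)
        return f"done:{job}"

worker = Worker()
scheduler = Scheduler(worker)
worker.scheduler = scheduler  # cycle: Scheduler -> Worker -> Scheduler
print(scheduler.dispatch("job-1"))  # the cyclic edge is never exercised
```

A static analyser would flag the `Scheduler -> Worker -> Scheduler` cycle as a smell, yet no concurrency or runtime problem can arise from it in this execution, which is the paper's point about needing project context beyond the smell itself.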

    Study of myocardial reverse remodeling through proteomic analysis of the myocardium and pericardial fluid

    Valve replacement remains the standard therapeutic option for aortic stenosis patients, aiming at abolishing pressure overload and triggering myocardial reverse remodeling. However, despite the instant hemodynamic benefit, not all patients show complete regression of myocardial hypertrophy, leaving them at higher risk for adverse outcomes, such as heart failure. The current comprehension of the biological mechanisms underlying an incomplete reverse remodeling is far from complete. Furthermore, definitive prognostic tools and ancillary therapies to improve the outcome of the patients undergoing valve replacement are missing. To help bridge these gaps, a combined myocardial (phospho)proteomics and pericardial fluid proteomics approach was followed, taking advantage of human biopsies and pericardial fluid collected during surgery, whose origin promised a wealth of molecular information. From over 1800 and 750 proteins identified, respectively, in the myocardium and in the pericardial fluid of aortic stenosis patients, a total of 90 dysregulated proteins were detected. Gene annotation and pathway enrichment analyses, together with discriminant analysis, are compatible with a scenario of increased pro-hypertrophic gene expression and protein synthesis, defective ubiquitin-proteasome system activity, proclivity to cell death (potentially fed by complement activity and other extrinsic factors, such as death receptor activators), acute-phase response, immune system activation and fibrosis. Specific validation of some targets through immunoblot techniques and correlation with clinical data pointed to complement C3 β chain, Muscle Ring Finger protein 1 (MuRF1) and the dual-specificity Tyr-phosphorylation regulated kinase 1A (DYRK1A) as potential markers of an incomplete response.
In addition, kinase prediction from phosphoproteome data suggests that the modulation of casein kinase 2, the family of IκB kinases, glycogen synthase kinase 3 and DYRK1A may help improve the outcome of patients undergoing valve replacement. In particular, functional studies with DYRK1A+/- cardiomyocytes show that this kinase may be an important target to treat cardiac dysfunction, given that mutant cells presented a different response to stretch and a reduced ability to develop force (active tension). This study opens many avenues in post-aortic valve replacement reverse remodeling research. In the future, gain-of-function and/or loss-of-function studies with isolated cardiomyocytes or with animal models of aortic banding-debanding will help disclose the efficacy of targeting the surrogate therapeutic targets. Besides, clinical studies in larger cohorts will bring definitive proof of the prognostic value of complement C3, MuRF1 and DYRK1A.

    Mathematical models to evaluate the impact of increasing serotype coverage in pneumococcal conjugate vaccines

    Of the over 100 serotypes of Streptococcus pneumoniae, only 7 were included in the first pneumococcal conjugate vaccine (PCV). While PCV reduced disease incidence, in part because of a herd immunity effect, a replacement effect was observed whereby disease was increasingly caused by serotypes not included in the vaccine. Dynamic transmission models can account for these effects to describe post-vaccination scenarios, whereas economic evaluations can enable decision-makers to compare vaccines of increasing valency for implementation. This thesis has four aims. First, to explore the limitations and assumptions of published pneumococcal models and the implications for future vaccine formulation and policy. Second, to conduct a trend analysis assembling all the available evidence for serotype replacement in Europe, North America and Australia to characterise invasive pneumococcal disease (IPD) caused by vaccine-type (VT) and non-vaccine-type (NVT) serotypes. The motivation is to assess the patterns of relative abundance in IPD cases pre- and post-vaccination, to examine country-level differences in relation to the vaccines employed over time since introduction, and to assess the growth of the replacement serotypes in comparison with the serotypes targeted by the vaccine. The third aim is to use a Bayesian framework to estimate serotype-specific invasiveness, i.e. the rate of invasive disease given carriage. This is useful for dynamic transmission modelling, as transmission occurs through carriage but the majority of serotype-specific pneumococcal data lies in active disease surveillance. It also helps address whether serotype replacement reflects serotypes that are more invasive, or whether serotypes in a specific location are more invasive than the same serotypes elsewhere. Finally, the last aim of this thesis is to estimate the epidemiological and economic impact of increasing serotype coverage in PCVs using a dynamic transmission model.
    Together, the results highlight that though there are key parameter uncertainties that merit further exploration, divergence in serotype replacement and inconsistencies in invasiveness at the country level may make a universal PCV suboptimal.
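The invasiveness estimate described in the third aim (the rate of invasive disease given carriage) can be sketched with a simple conjugate Bayesian model. This is only an illustrative stand-in, not the thesis's actual framework: the priors, counts and exposure figures below are assumptions, and case counts are modelled as Poisson with a Gamma prior on the rate.

```python
# Minimal sketch (assumed priors and made-up counts): serotype-specific
# invasiveness as an IPD rate per unit of carriage exposure, with a
# conjugate Gamma prior on a Poisson model for observed case counts.

def invasiveness_posterior(cases, carrier_exposure, a0=1.0, b0=1.0):
    """Posterior Gamma(shape, rate) for the invasive-disease rate
    given carriage.

    cases: observed IPD cases for a serotype
    carrier_exposure: person-time of carriage (e.g. carrier-years)
    a0, b0: Gamma prior shape and rate (weakly informative, by assumption)
    """
    shape = a0 + cases
    rate = b0 + carrier_exposure
    mean = shape / rate  # posterior mean invasiveness
    return shape, rate, mean

# Illustrative comparison of two hypothetical serotypes with equal exposure
for serotype, cases, exposure in [("19A", 40, 500.0), ("6B", 5, 500.0)]:
    _, _, mean = invasiveness_posterior(cases, exposure)
    print(f"serotype {serotype}: posterior mean invasiveness = {mean:.4f}")
```

The point of such a model in the thesis's context is that carriage (the transmission-relevant quantity) and surveillance case counts can be combined into a per-serotype rate with a full posterior, rather than a bare case/carriage ratio.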

    Biocontrol as a key component to manage brown rot on cherry

    Brown rot, caused by Monilinia spp., is one of the most important diseases of stone fruits worldwide. Brown rot can cause blossom wilt and fruit rot in the orchard as well as latent infections of fruit, leading to post-harvest fruit decay. Current control methods rely on scheduled spraying of fungicides. However, the continuing pressure to reduce fungicide use has led to increased research into alternative management methods, such as biological control. NIAB EMR recently identified two microbes that significantly reduced sporulation of Monilinia laxa under laboratory conditions. These two isolates were a bacterial species, Bacillus subtilis (B91), and a yeast-like fungus, Aureobasidium pullulans (Y126), and they are currently being formulated into commercial products. We are investigating how to optimise the use of these two potential biocontrol products in practice, in terms of suppressing Monilinia sporulation on overwintered mummies and preventing infection of blossoms and fruits. When applied to mummified fruits in winter, Y126's population was stable through the winter but at a low concentration. B91 survived a little longer, with its population reaching that of the control group by week 4. Neither biological control agent (BCA) treatment affected the population of M. laxa when compared with the control treatment of sterile distilled water. The longer the interaction time between the BCAs and M. laxa, the lower the spore count of M. laxa. Another study examined the ability of the BCAs to colonise and survive on blossoms. B91 did not survive well on blossoms but could survive on fruits. However, its antagonistic compounds need to be present at relatively high concentrations to be effective against M. laxa. It is therefore best used like a fungicide, ensuring the antagonistic compounds are at a high concentration when applied in the field.
Y126 can persist throughout the season and was marginally, though not statistically significantly, more effective at long-term reduction of M. laxa. This could be because Y126 works through competition; the interaction time with the pathogen could therefore be important for efficacy and is worth investigating further. The difference between the BCAs highlights the need to understand each BCA's ecology to ensure maximum efficacy. In a latent infection experiment, we inoculated trees with M. laxa and then treated them with the two biocontrol isolates two weeks before harvest. Post-harvest disease development was assessed after four days of storage in 2019 and two weeks in 2020. There was a significant reduction in rot incidence (p < 0.001) of 29% (Y126) and 27% (B91) in 2019, and of 62% (Y126) and 80% (B91) in 2020, when the harvested fruit was stored at cold-store temperatures. With new products to be introduced into the environment, it is important to understand the effects they may have on the plant's microbiome. Using next-generation sequencing techniques, we examined the impact of B91 and Y126 on the blossom and cherry microbiomes. There was a treatment effect in both the bacterial and fungal communities on the blossom and the ripe cherry, but the greatest variability was between blocks (a geographical effect) and between the years of the experiment (p < 0.0001). This research will assist in the development of management strategies, especially spray timings, for brown rot on stone fruit, integrating BCAs with other management practices.

    Modelling and Solving the Single-Airport Slot Allocation Problem

    Currently, there are about 200 overly congested airports where airport capacity does not suffice to accommodate airline demand. These airports play a critical role in the global air transport system since they concern 40% of global passenger demand and act as a bottleneck for the entire air transport system. This imbalance between airport capacity and airline demand leads to excessive delays, as well as multi-billion economic, and huge environmental and societal costs. Concurrently, the implementation of airport capacity expansion projects requires time, space and is subject to significant resistance from local communities. As a short to medium-term response, Airport Slot Allocation (ASA) has been used as the main demand management mechanism. The main goal of this thesis is to improve ASA decision-making through the proposition of models and algorithms that provide enhanced ASA decision support. In doing so, this thesis is organised into three distinct chapters that shed light on the following questions (I–V), which remain unaddressed in the existing literature. In parentheses, we identify the chapters of this thesis that relate to each research question. I. How to improve the modelling of airline demand flexibility and the utility that each airline assigns to each available airport slot? (Chapters 2 and 4) II. How can one model the dynamic and endogenous adaptation of the airport’s landside and airside infrastructure to the characteristics of airline demand? (Chapter 2) III. How to consider operational delays in strategic ASA decision-making? (Chapter 3) IV. How to involve the pertinent stakeholders in the ASA decision-making process to select a commonly agreed schedule; and how can one reduce the inherent decision complexity without compromising the quality and diversity of the schedules presented to the decision-makers? (Chapter 3) V.
Given that the ASA process involves airlines (submitting requests for slots) and coordinators (assigning slots to requests based on a set of rules and priorities), how can one jointly consider the interactions between these two sides to improve ASA decision-making? (Chapter 4) With regards to research questions (I) and (II), the thesis proposes a Mixed Integer Programming (MIP) model that considers airlines’ timing flexibility (research question I) and constraints that enable the dynamic and endogenous allocation of the airport’s resources (research question II). The proposed modelling variant addresses several additional problem characteristics and policy rules, and considers multiple efficiency objectives, while integrating all constraints that may affect airport slot scheduling decisions, including the asynchronous use of the different airport resources (runway, aprons, passenger terminal) and the endogenous consideration of the capabilities of the airport’s infrastructure to adapt to the airline demand’s characteristics and the aircraft/flight type associated with each request. The proposed model is integrated into a two-stage solution approach that considers all primary and several secondary policy rules of ASA. New combinatorial results and valid tightening inequalities that facilitate the solution of the problem are proposed and implemented. An extension of the above MIP model that considers the trade-offs among schedule displacement, maximum displacement, and the number of displaced requests, is integrated into a multi-objective solution framework. The proposed framework holistically considers the preferences of all ASA stakeholder groups (research question IV) concerning multiple performance metrics and models the operational delays associated with each airport schedule (research question III). 
The delays of each schedule/solution are macroscopically estimated, and a subtractive clustering algorithm and a parameter tuning routine reduce the inherent decision complexity by pruning non-dominated solutions without compromising the representativeness of the alternatives offered to the decision-makers (research question IV). Following the determination of the representative set, the expected delay estimates of each schedule are further refined by considering the whole airfield’s operations, the landside, and the airside infrastructure. The representative schedules are ranked based on the preferences of all ASA stakeholder groups concerning each schedule’s displacement-related and operational-delay performance. Finally, in considering the interactions between airlines’ timing flexibility and utility, and the policy-based priorities assigned by the coordinator to each request (research question V), the thesis models the ASA problem as a two-sided matching game and provides guarantees on the stability of the proposed schedules. A Stable Airport Slot Allocation Model (SASAM) capitalises on the flexibility considerations introduced for addressing research question (I) through the exploitation of data submitted by the airlines during the ASA process and provides functions that proxy each request’s value considering both the airlines’ timing flexibility for each submitted request and the requests’ prioritisation by the coordinators when considering the policy rules defining the ASA process. The thesis argues on the compliance of the proposed functions with the primary regulatory requirements of the ASA process and demonstrates their applicability for different types of slot requests. SASAM guarantees stability through sets of inequalities that prune allocations blocking the formation of stable schedules. A multi-objective Deferred-Acceptance (DA) algorithm guaranteeing the stability of each generated schedule is developed. 
The algorithm can generate all stable non-dominated points by considering the trade-off between the spilled airline and passenger demand and maximum displacement. The work conducted in this thesis addresses several problem characteristics and sheds light on their implications for ASA decision-making, hence having the potential to improve ASA decision-making. Our findings suggest that the consideration of airlines’ timing flexibility (research question I) results in improved capacity utilisation and scheduling efficiency. The endogenous consideration of the ability of the airport’s infrastructure to adapt to the characteristics of airline demand (research question II) enables a more efficient representation of airport declared capacity that results in the scheduling of additional requests. The concurrent consideration of airlines’ timing flexibility and the endogenous adaptation of airport resources to airline demand achieves an improved alignment between the airport infrastructure and the characteristics of airline demand, ergo proposing schedules of improved efficiency. The modelling and evaluation of the peak operational delays associated with the different airport schedules (research question III) allows the study of the implications of strategic ASA decision-making for operations and quantifies the impact of the airport’s declared capacity on each schedule’s operational performance. In considering the preferences of the relevant ASA stakeholders (airlines, coordinators, airport, and air traffic authorities) concerning multiple operational and strategic ASA efficiency metrics (research question IV) the thesis assesses the impact of alternative preference considerations and indicates a commonly preferred schedule that balances the stakeholders’ preferences.
The proposition of representative subsets of alternative schedules reduces decision-complexity without significantly compromising the quality of the alternatives offered to the decision-making process (research question IV). The modelling of the ASA as a two-sided matching game (research question V), results in stable schedules consisting of request-to-slot assignments that provide no incentive to airlines and coordinators to reject or alter the proposed timings. Furthermore, the proposition of stable schedules results in more intensive use of airport capacity, while simultaneously improving scheduling efficiency. The models and algorithms developed as part of this thesis are tested using airline requests and airport capacity data from coordinated airports. Computational results that are relevant to the context of the considered airport instances provide evidence on the potential improvements for the current ASA process and facilitate data-driven policy and decision-making. In particular, with regards to the alignment of airline demand with the capabilities of the airport’s infrastructure (questions I and II), computational results report improved slot allocation efficiency and airport capacity utilisation, which for the considered airport instance translate to improvements ranging between 5-24% for various schedule performance metrics. In reducing the difficulty associated with the assessment of multiple ASA solutions by the stakeholders (question IV), instance-specific results suggest reductions to the number of alternative schedules by 87%, while maintaining the quality of the solutions presented to the stakeholders above 70% (expressed in relation to the initially considered set of schedules). 
Meanwhile, computational results suggest that the concurrent consideration of ASA stakeholders’ preferences (research question IV) with regards to both operational (research question III) and strategic performance metrics leads to alternative airport slot scheduling solutions that inform on the trade-offs between the schedules’ operational and strategic performance and the stakeholders’ preferences. Concerning research question (V), the application of SASAM and the DA algorithm suggests improvements to the number of unaccommodated flights and passengers (13% and 40% improvements) at the expense of requests concerning fewer passengers and days of operations (increasing the number of rejected requests by 1.2% in relation to the total number of submitted requests). The research conducted in this thesis aids in the identification of limitations that should be addressed by future studies to further improve ASA decision-making. First, the thesis focuses on exact solution approaches that consider the landside and airside infrastructure of the airport and generate multiple schedules. The proposition of pre-processing techniques that identify the bottleneck of the airport’s capacity, i.e., landside and/or airside, can be used to reduce the size of the proposed formulations and improve the required computational times. Meanwhile, the development of multi-objective heuristic algorithms that consider several problem characteristics and generate multiple efficient schedules in reasonable computational times could extend the capabilities of the models proposed in this thesis and provide decision support for some of the world’s most congested airports. Furthermore, the thesis models and evaluates the operational implications of strategic airport slot scheduling decisions.
The explicit consideration of operational delays as an objective in ASA optimisation models and algorithms is an issue that merits investigation since it may further improve the operational performance of the generated schedules. In accordance with current practice, the models proposed in this work have considered deterministic capacity parameters. Future research could propose formulations that consider stochastic representations of airport declared capacity and improve strategic ASA decision-making through the anticipation of operational uncertainty and weather-induced capacity reductions. Finally, in modelling airlines’ utility for each submitted request and available time slot, the thesis proposes time-dependent functions that utilise available data to approximate airlines’ scheduling preferences. Future studies wishing to improve the accuracy of the proposed functions could utilise commercial data sources that provide route-specific information; or, in cases where such data are unavailable, employ data mining and machine learning methodologies to extract airlines’ time-dependent utility and preferences.
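The two-sided matching view behind SASAM can be sketched with a plain deferred-acceptance loop. This is only an illustrative stand-in under simplifying assumptions (unit-capacity slots, strict preference lists, invented request and priority data), not the thesis's model: requests propose to slots in order of airline preference, and each slot tentatively holds the proposer the coordinator ranks highest.

```python
# Hedged sketch of the deferred-acceptance idea behind stable slot
# allocation. Slot and request names are illustrative; each slot holds
# at most one request, and lower priority numbers are better.

def deferred_acceptance(request_prefs, slot_priority):
    """request_prefs: request -> ordered list of acceptable slots
    slot_priority: slot -> dict mapping request -> priority (lower wins)
    Returns a stable request-to-slot assignment."""
    next_choice = {r: 0 for r in request_prefs}
    held = {}                       # slot -> request tentatively holding it
    free = list(request_prefs)
    while free:
        r = free.pop()
        prefs = request_prefs[r]
        if next_choice[r] >= len(prefs):
            continue                # list exhausted: request unaccommodated
        s = prefs[next_choice[r]]
        next_choice[r] += 1
        incumbent = held.get(s)
        if incumbent is None:
            held[s] = r
        elif slot_priority[s][r] < slot_priority[s][incumbent]:
            held[s] = r
            free.append(incumbent)  # displaced request proposes again
        else:
            free.append(r)          # rejected: try the next slot
    return {req: slot for slot, req in held.items()}

requests = {"A1": ["0900", "0910"], "B2": ["0900"], "C3": ["0910", "0900"]}
priority = {"0900": {"A1": 2, "B2": 1, "C3": 3},
            "0910": {"A1": 1, "C3": 2}}
print(deferred_acceptance(requests, priority))
```

The resulting assignment is stable in the matching-theory sense the abstract appeals to: no request and slot would both prefer each other over their assigned partners, so neither airlines nor the coordinator have an incentive to reject the proposed timings.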

    Foundations for programming and implementing effect handlers

    First-class control operators provide programmers with an expressive and efficient means for manipulating control through reification of the current control state as a first-class object, enabling programmers to implement their own computational effects and control idioms as shareable libraries. Effect handlers provide a particularly structured approach to programming with first-class control by naming control-reifying operations and separating them from their handling. This thesis is composed of three strands of work in which I develop operational foundations for programming and implementing effect handlers as well as exploring the expressive power of effect handlers. The first strand develops a fine-grain call-by-value core calculus of a statically typed programming language with a structural notion of effect types, as opposed to the nominal notion of effect types that dominates the literature. With the structural approach, effects need not be declared before use. The usual safety properties of statically typed programming are retained by making crucial use of row polymorphism to build and track effect signatures. The calculus features three forms of handlers: deep, shallow, and parameterised. They each offer a different approach to manipulating the control state of programs. Traditional deep handlers are defined by folds over computation trees, and are the original construct proposed by Plotkin and Pretnar. Shallow handlers are defined by case splits (rather than folds) over computation trees. Parameterised handlers are deep handlers extended with a state value that is threaded through the folds over computation trees. To demonstrate the usefulness of effects and handlers as a practical programming abstraction I implement the essence of a small UNIX-style operating system complete with multi-user environment, time-sharing, and file I/O.
    The second strand studies continuation passing style (CPS) and abstract machine semantics, which are foundational techniques that admit a unified basis for implementing deep, shallow, and parameterised effect handlers in the same environment. The CPS translation is obtained through a series of refinements of a basic first-order CPS translation for a fine-grain call-by-value language into an untyped language. Each refinement moves toward a more intensional representation of continuations, eventually arriving at the notion of generalised continuations, which admit simultaneous support for deep, shallow, and parameterised handlers. The initial refinement adds support for deep handlers by representing stacks of continuations and handlers as a curried sequence of arguments. The image of the resulting translation is not properly tail-recursive, meaning some function application terms do not appear in tail position. To rectify this the CPS translation is refined once more to obtain an uncurried representation of stacks of continuations and handlers. Finally, the translation is made higher-order in order to contract administrative redexes at translation time. The generalised continuation representation is used to construct an abstract machine that provides simultaneous support for deep, shallow, and parameterised effect handlers. The third strand explores the expressiveness of effect handlers. First, I show that deep, shallow, and parameterised notions of handlers are interdefinable by way of typed macro-expressiveness, which provides a syntactic notion of expressiveness that affirms the existence of encodings between handlers, but provides no information about the computational content of the encodings. Second, using the semantic notion of expressiveness, I show that for a class of programs a programming language with first-class control (e.g. effect handlers) admits asymptotically faster implementations than possible in a language without first-class control.
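The distinction the abstract draws between deep handlers (folds over computation trees) and shallow handlers (one-level case splits) can be sketched concretely. The encoding below is an illustrative Python stand-in for the calculus, not the thesis's formal definitions: a computation is either a returned value or an operation node carrying a continuation.

```python
# Illustrative sketch: computation trees with one operation node; a deep
# handler is a fold (the handler is reapplied to the continuation), while
# a shallow handler is a case split (the continuation is left unhandled).

class Return:
    def __init__(self, value):
        self.value = value

class Op:
    def __init__(self, label, arg, cont):
        self.label, self.arg, self.cont = label, arg, cont  # cont: x -> tree

def deep_handle(tree, on_return, on_op):
    """Fold over the tree: resuming re-enters the same handler."""
    if isinstance(tree, Return):
        return on_return(tree.value)
    resume = lambda x: deep_handle(tree.cont(x), on_return, on_op)
    return on_op(tree.label, tree.arg, resume)

def shallow_handle(tree, on_return, on_op):
    """Case split: the raw continuation is exposed; the caller decides
    how (or whether) to handle the rest of the computation."""
    if isinstance(tree, Return):
        return on_return(tree.value)
    return on_op(tree.label, tree.arg, tree.cont)

# A computation that performs two 'choose' operations and sums the results.
comp = Op("choose", (1, 10),
          lambda x: Op("choose", (2, 20), lambda y: Return(x + y)))

# Deep handler interpreting every 'choose' as "take the first option".
first = deep_handle(comp, lambda v: v,
                    lambda lbl, arg, resume: resume(arg[0]))
print(first)  # 1 + 2 = 3
```

With the deep handler, both `choose` nodes are interpreted by the same clause because the fold reapplies it along the continuation; with the shallow handler, only the outermost node is interpreted and the rest of the tree is handed back raw, mirroring the fold-versus-case-split distinction in the calculus.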

    Investigating and mitigating the role of neutralisation techniques on information security policies violation in healthcare organisations

    Healthcare organisations today rely heavily on Electronic Medical Records systems (EMRs), which have become highly crucial IT assets that require significant security efforts to safeguard patients’ information. Individuals who have legitimate access to an organisation’s assets to perform their day-to-day duties but intentionally or unintentionally violate information security policies can jeopardise their organisation’s information security efforts and cause significant legal and financial losses. In the information security (InfoSec) literature, several studies emphasised the necessity to understand why employees behave in ways that contradict information security requirements but have offered widely different solutions. In an effort to respond to this situation, this thesis addressed the gap in the information security academic research by providing a deep understanding of the problem of medical practitioners’ behavioural justifications to violate information security policies and then determining proper solutions to reduce this undesirable behaviour. Neutralisation theory was used as the theoretical basis for the research. This thesis adopted a mixed-method research approach that comprises four consecutive phases, and each phase represents a research study that was conducted in light of the results from the preceding phase. The first phase of the thesis started by investigating the relationship between medical practitioners’ neutralisation techniques and their intention to violate information security policies that protect a patient’s privacy. A quantitative study was conducted to extend the work of Siponen and Vance [1] through a study of the Saudi Arabia healthcare industry. The data was collected via an online questionnaire from 66 Medical Interns (MIs) working in four academic hospitals. 
The study found that six neutralisation techniques—(1) appeal to higher loyalties, (2) defence of necessity, (3) the metaphor of the ledger, (4) denial of responsibility, (5) denial of injury, and (6) condemnation of the condemners—significantly contribute to the justifications of the MIs in hypothetically violating information security policies. The second phase of this research used a series of semi-structured interviews with IT security professionals in one of the largest academic hospitals in Saudi Arabia to explore the environmental factors that motivated the medical practitioners to evoke various neutralisation techniques. The results revealed that social, organisational, and emotional factors all stimulated the behavioural justifications to breach information security policies. During these interviews, it became clear that the IT department needed to ensure that security policies fit the daily tasks of the medical practitioners by providing alternative solutions to ensure the effectiveness of those policies. Based on these interviews, the objective of the following two phases was to improve the effectiveness of InfoSec policies against the use of behavioural justification by engaging the end users in the modification of existing policies via a collaborative writing process. Those two phases were conducted in the UK and Saudi Arabia to determine whether the collaborative writing process could produce a more effective security policy that balanced the security requirements with daily business needs, thus leading to a reduction in the use of neutralisation techniques to violate security policies. The overall result confirmed that the involvement of the end users via a collaborative writing process positively improved the effectiveness of the security policy in mitigating individual behavioural justifications, showing that the process is a promising one to enhance security compliance.