
    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for use in 2D scanning with lateral alignment. The 2D array environment requires full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing; the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall so that the structure radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to lay several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
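    As a rough illustration of the fixed-frequency scanning mechanism, the sketch below computes how the beam angle of a dielectric-filled leaky-wave antenna shifts as the LC permittivity is tuned by the DC bias. The frequency, waveguide width, and permittivity range are assumed values chosen only to make the numbers work, not parameters from the paper.

```python
import numpy as np

# Minimal sketch, assuming illustrative values: fixed-frequency beam scanning
# of a dielectric-filled leaky-wave antenna as the LC permittivity is tuned.
c0 = 3e8                      # speed of light (m/s)
f = 28e9                      # operating frequency (Hz), assumed
a = 3.5e-3                    # waveguide broad-wall width (m), assumed
k0 = 2 * np.pi * f / c0       # free-space wavenumber

for eps_r in np.linspace(2.5, 3.3, 5):              # assumed LC tuning range
    beta = np.sqrt(eps_r * k0**2 - (np.pi / a)**2)  # TE10-like phase constant
    theta = np.degrees(np.arcsin(beta / k0))        # beam angle from broadside
    print(f"eps_r = {eps_r:.2f} -> beam at {theta:.1f} deg")
```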

    KYT2022 Finnish Research Programme on Nuclear Waste Management 2019–2022 : Final Report

    KYT2022 (Finnish Research Programme on Nuclear Waste Management 2019–2022), organised by the Ministry of Economic Affairs and Employment, was a national research programme whose objective was to ensure that the authorities have the level of nuclear expertise and preparedness needed for the safety of nuclear waste management. The starting point for public research programmes on nuclear safety is that they create the conditions for maintaining the knowledge required for the continued safe and economic use of nuclear energy, developing new know-how, and participating in international collaboration. The content of the KYT2022 research programme was composed of nationally important research topics: the safety, feasibility, and acceptability of nuclear waste management. The KYT2022 research programme also functioned as a discussion and information-sharing forum for the authorities, those responsible for nuclear waste management, and the research organisations, which helped to make use of the limited research resources. The programme aimed to develop national research infrastructure, ensure the continuing availability of expertise, produce high-level scientific research, and increase general knowledge of nuclear waste management.

    A Decision Support System for Economic Viability and Environmental Impact Assessment of Vertical Farms

    Vertical farming (VF) is the practice of growing crops or animals along the vertical dimension using multi-tier racks or vertically inclined surfaces. In this thesis, I focus on the emerging industry of plant-specific VF. Vertical plant farming (VPF) is a promising and relatively novel practice that can be conducted in buildings with environmental control and artificial lighting. However, the nascent sector has experienced challenges in economic viability, standardisation, and environmental sustainability. Practitioners and academics call for a comprehensive financial analysis of VPF, but efforts are stifled by a lack of valid and available data. A review of economic estimation and horticultural software identifies a need for a decision support system (DSS) that facilitates risk-empowered business planning for vertical farmers. This thesis proposes an open-source DSS framework to evaluate business sustainability through financial risk and environmental impact assessments. Data from the literature, alongside lessons learned from industry practitioners, would be centralised in the proposed DSS using imprecise data techniques. These techniques have been applied in engineering but are seldom used in financial forecasting, and they could benefit complex sectors that have only scarce data with which to predict business viability. To begin executing the DSS framework, VPF practitioners were interviewed using a mixed-methods approach. Lessons from over 19 shuttered and operational VPF projects provide insights into the barriers inhibiting scalability and identify risks that form a risk taxonomy. Labour was the most commonly reported top challenge, so research was conducted to explore lean principles for improving productivity. A probabilistic model representing a spectrum of variables and their associated uncertainty was built according to the DSS framework to evaluate the financial risk of VF projects. This enabled flexible computation without precise production or financial data, improving the accuracy of economic estimation. The model assessed two VPF cases (one in the UK and another in Japan), providing the first risk and uncertainty quantification of VPF business models in the literature. The results highlighted measures to improve economic viability and assessed the viability of the UK and Japan cases. An environmental impact assessment model was also developed, allowing VPF operators to evaluate their carbon footprint relative to traditional agriculture using life-cycle assessment. I explore strategies for net-zero carbon production through sensitivity analysis. Renewable energies, especially solar, geothermal, and tidal power, show promise for reducing the carbon emissions of indoor VPF. Results show that renewably powered VPF can reduce carbon emissions compared to field-based agriculture when land-use change is considered. The drivers for DSS adoption have been researched, showing a pathway of compliance and design thinking to overcome the ‘problem of implementation’ and enable commercialisation. Further work is suggested to standardise VF equipment, collect benchmarking data, and characterise risks. This work will reduce risk and uncertainty and accelerate the sector’s emergence.
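    As a rough sketch of the kind of probabilistic viability model the DSS framework describes, the following Monte Carlo example samples uncertain yield, price, and cost inputs and reports a simple financial risk measure. All variable names and ranges are illustrative assumptions, not data from the thesis.

```python
import numpy as np

# Minimal sketch, assuming illustrative parameter ranges: sample uncertain
# inputs and summarise the resulting annual profit distribution per m^2.
rng = np.random.default_rng(0)
n = 100_000

yield_kg_m2  = rng.triangular(60, 80, 100, n)    # annual crop yield per m^2
price_per_kg = rng.triangular(4.0, 6.0, 8.0, n)  # farm-gate price
energy_cost  = rng.triangular(150, 200, 280, n)  # energy cost per m^2 per year
labour_cost  = rng.triangular(120, 180, 260, n)  # labour cost per m^2 per year
other_cost   = 90.0                              # fixed other costs per m^2

profit = yield_kg_m2 * price_per_kg - (energy_cost + labour_cost + other_cost)

print(f"median profit per m^2: {np.median(profit):.0f}")
print(f"probability of loss:   {np.mean(profit < 0):.1%}")
```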

    Defining Service Level Agreements in Serverless Computing

    The emergence of serverless computing has brought significant advancements to the delivery of computing resources to cloud users. With the abstraction of infrastructure, ecosystem, and execution environments, users can focus on their code while relying on the cloud provider to manage the abstracted layers. In addition, desirable features such as autoscaling and high availability became a provider’s responsibility and can be adopted by the user’s application with no extra overhead. Despite such advancements, significant challenges must be overcome as applications transition from monolithic stand-alone deployments to the ephemeral and stateless microservice model of serverless computing. These challenges pertain to the uniqueness of the conceptual and implementation models of serverless computing. One of the notable challenges is the complexity of defining Service Level Agreements (SLA) for serverless functions. As the serverless model shifts the administration of resources, ecosystem, and execution layers to the provider, users become mere consumers of the provider’s abstracted platform with no insight into its performance. Suboptimal conditions of the abstracted layers are not visible to the end user, who has no means to assess their performance. Thus, SLA in serverless computing must take into account the unique abstraction of its model. This work investigates SLA modeling of the executions of serverless functions and serverless chains. We highlight how serverless SLA fundamentally differs from earlier cloud delivery models. We then propose an approach to define SLA for serverless functions by utilizing resource utilization fingerprints of functions’ executions, and a method to assess whether executions adhere to that SLA. We evaluate the approach’s accuracy in detecting SLA violations for a broad range of serverless application categories. Our validation results show high accuracy in detecting SLA violations resulting from resource contention and degradations of the provider’s ecosystem. We conclude by presenting the empirical validation of our proposed approach, which detects Execution-SLA violations with accuracy of up to 99%.
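    A minimal sketch of the fingerprint idea follows: baseline executions are summarised per resource metric, and later executions that drift beyond a tolerance band are flagged as potential SLA violations. The metric names and the threshold are assumptions for illustration, not the paper's exact SLA model.

```python
import numpy as np

def build_fingerprint(profile_runs):
    """profile_runs: (n_runs, n_metrics) array, e.g. [cpu_ms, mem_mb, io_kb] per run."""
    runs = np.asarray(profile_runs, dtype=float)
    return runs.mean(axis=0), runs.std(axis=0)   # per-metric baseline summary

def violates_sla(execution, fingerprint, k=3.0):
    """Flag an execution if any metric deviates more than k standard deviations."""
    mean, std = fingerprint
    z = np.abs(np.asarray(execution, dtype=float) - mean) / np.where(std > 0, std, 1.0)
    return bool(np.any(z > k))

# Example: baseline runs of a hypothetical image-resize function.
baseline = [[120, 256, 40], [118, 250, 42], [125, 260, 39], [121, 255, 41]]
fp = build_fingerprint(baseline)
print(violates_sla([122, 254, 40], fp))   # False: within the fingerprint
print(violates_sla([310, 255, 41], fp))   # True: CPU time far above baseline
```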

    Automatic Rural Road Centerline Extraction from Aerial Images for a Forest Fire Support System

    In recent decades, Portugal has been severely affected by forest fires, which have caused massive environmental and social damage. A well-structured and precise mapping of rural roads is critical to help firefighters mitigate these events. The traditional process of extracting rural road centerlines from aerial images is extremely time-consuming and tedious, because the mapping operator has to manually label the road area and extract the road centerline. A frequent challenge in extracting rural road centerlines is the high environmental complexity and the road occlusions caused by vehicles, shadows, wild vegetation, and trees, which produce heterogeneous segments that can be further improved. This dissertation proposes an approach to automatically detect rural road segments and extract road centerlines from aerial images. The proposed method consists of two main steps: in the first step, an architecture based on a deep learning model (DeepLabV3+) is used to extract road feature maps and detect the rural roads. In the second step, an optimization first improves road connections and removes small white objects from the image predicted by the neural network; finally, a morphological approach extracts the rural road centerlines from the previously detected roads using thinning algorithms such as the Zhang-Suen and Guo-Hall methods. With these two stages automated, road centerlines can be detected and extracted from complex rural environments automatically and faster than with traditional methods, and the resulting data can potentially be integrated into a Geographical Information System (GIS), enabling the creation of real-time mapping applications.
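    The post-processing stage can be sketched with standard morphological tools. The example below cleans a predicted binary road mask and thins it to a one-pixel centerline; scikit-image's skeletonize stands in for the Zhang-Suen / Guo-Hall thinning step, and the kernel and object-size parameters are arbitrary assumptions.

```python
import numpy as np
from skimage.morphology import binary_closing, remove_small_objects, skeletonize, disk

def extract_centerline(road_mask: np.ndarray) -> np.ndarray:
    """Minimal sketch: clean a binary road mask, then thin it to a centerline."""
    mask = road_mask.astype(bool)
    mask = binary_closing(mask, disk(3))             # bridge small gaps in road segments
    mask = remove_small_objects(mask, min_size=200)  # drop small spurious blobs
    return skeletonize(mask)                         # one-pixel-wide centerline

# Usage: centerline = extract_centerline(predicted_mask > 0.5)
```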

    Z-Numbers-Based Approach to Hotel Service Quality Assessment

    In this study, we analyse the possibility of using Z-numbers for measuring service quality and for decision-making on quality improvement in the hotel industry. The techniques used for these purposes are based on consumer evaluations: expectations and perceptions. As a rule, these evaluations are expressed as crisp numbers (Likert scale) or fuzzy estimates. However, descriptions of respondent opinions based on the crisp- or fuzzy-number formalism are not relevant in all cases, since the existing methods do not take into account the degree of confidence of respondents in their assessments. A fuzzy approach better describes the uncertainties associated with human perceptions and expectations, and linguistic values are more acceptable than crisp numbers. To capture the subjective nature of both the service quality estimates and the degree of confidence in them, two-component Z-numbers Z = (A, B) were used. Z-numbers express the opinion of consumers more adequately. The proposed, computationally efficient approach (Z-SERVQUAL, Z-IPA) makes it possible to determine the quality of services and identify the factors that require improvement and the areas for further development. The suggested method was applied to evaluate service quality in small and medium-sized hotels in Turkey and Azerbaijan, as illustrated by an example.
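    For intuition, the sketch below shows one common simplification for turning a Z-number Z = (A, B) into a crisp score: weight the rating component by the centroid of the confidence component, then defuzzify. This is an illustrative convention from the Z-number literature, not the paper's exact Z-SERVQUAL procedure, and the example ratings are invented.

```python
def centroid(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def z_to_crisp(rating_tfn, confidence_tfn):
    """Illustrative Z-number defuzzification: scale the rating by sqrt of the confidence centroid."""
    alpha = centroid(confidence_tfn)           # crisp reliability weight in [0, 1]
    a, b, c = rating_tfn
    weighted = (a * alpha**0.5, b * alpha**0.5, c * alpha**0.5)
    return centroid(weighted)

# Example: perception "good" (6, 7, 8) with confidence "very sure" (0.8, 0.9, 1.0)
# versus expectation "very good" (7, 8, 9) with confidence "sure" (0.6, 0.7, 0.8).
perception  = z_to_crisp((6, 7, 8), (0.8, 0.9, 1.0))
expectation = z_to_crisp((7, 8, 9), (0.6, 0.7, 0.8))
print(f"SERVQUAL-style gap (perception - expectation): {perception - expectation:.2f}")
```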

    19th SC@RUG 2022 proceedings 2021-2022


    Investigating and mitigating the role of neutralisation techniques on information security policies violation in healthcare organisations

    Healthcare organisations today rely heavily on Electronic Medical Records systems (EMRs), which have become highly crucial IT assets that require significant security efforts to safeguard patients’ information. Individuals who have legitimate access to an organisation’s assets to perform their day-to-day duties but intentionally or unintentionally violate information security policies can jeopardise their organisation’s information security efforts and cause significant legal and financial losses. In the information security (InfoSec) literature, several studies emphasised the necessity to understand why employees behave in ways that contradict information security requirements but have offered widely different solutions. In an effort to respond to this situation, this thesis addressed the gap in the information security academic research by providing a deep understanding of the problem of medical practitioners’ behavioural justifications to violate information security policies and then determining proper solutions to reduce this undesirable behaviour. Neutralisation theory was used as the theoretical basis for the research. This thesis adopted a mixed-method research approach that comprises four consecutive phases, and each phase represents a research study that was conducted in light of the results from the preceding phase. The first phase of the thesis started by investigating the relationship between medical practitioners’ neutralisation techniques and their intention to violate information security policies that protect a patient’s privacy. A quantitative study was conducted to extend the work of Siponen and Vance [1] through a study of the Saudi Arabia healthcare industry. The data was collected via an online questionnaire from 66 Medical Interns (MIs) working in four academic hospitals. The study found that six neutralisation techniques—(1) appeal to higher loyalties, (2) defence of necessity, (3) the metaphor of ledger, (4) denial of responsibility, (5) denial of injury, and (6) condemnation of condemners—significantly contribute to the justifications of the MIs in hypothetically violating information security policies. The second phase of this research used a series of semi-structured interviews with IT security professionals in one of the largest academic hospitals in Saudi Arabia to explore the environmental factors that motivated the medical practitioners to evoke various neutralisation techniques. The results revealed that social, organisational, and emotional factors all stimulated the behavioural justifications to breach information security policies. During these interviews, it became clear that the IT department needed to ensure that security policies fit the daily tasks of the medical practitioners by providing alternative solutions to ensure the effectiveness of those policies. Based on these interviews, the objective of the following two phases was to improve the effectiveness of InfoSec policies against the use of behavioural justification by engaging the end users in the modification of existing policies via a collaborative writing process. Those two phases were conducted in the UK and Saudi Arabia to determine whether the collaborative writing process could produce a more effective security policy that balanced the security requirements with daily business needs, thus leading to a reduction in the use of neutralisation techniques to violate security policies. 
The overall results confirmed that involving end users through a collaborative writing process improved the effectiveness of the security policy in mitigating individual behavioural justifications, showing that the process is a promising way to enhance security compliance.

    A productive response to legacy system petrification

    Requirements change. The requirements of a legacy information system change, often in unanticipated ways, and at a more rapid pace than the rate at which the information system itself can be evolved to support them. The capabilities of a legacy system progressively fall further and further behind their evolving requirements, in a degrading process termed petrification. As systems petrify, they deliver diminishing business value, hamper business effectiveness, and drain organisational resources. To address legacy systems, the first challenge is to understand how to shed their resistance to tracking requirements change. The second challenge is to ensure that a newly adaptable system never again petrifies into a change-resistant legacy system. This thesis addresses both challenges. The approach outlined herein is underpinned by an agile migration process, termed Productive Migration, that homes in on the specific causes of petrification within each particular legacy system and provides guidance on how to address them. That guidance comes in part from a personalised catalogue of petrifying patterns, which capture recurring themes underlying petrification. These steer us to the problems actually present in a given legacy system and lead us to suitable antidote productive patterns with which we can deal with those problems one by one. To prevent newly adaptable systems from again degrading into legacy systems, we appeal to a follow-on process, termed Productive Evolution, which embraces and keeps pace with change rather than resisting and falling behind it. Productive Evolution teaches us to be vigilant against signs of system petrification and helps us to nip them in the bud. The aim is to nurture systems that remain supportive of the business, that are adaptable in step with ongoing requirements change, and that continue to retain their value as significant business assets.