
    Configuration Management of Distributed Systems over Unreliable and Hostile Networks

    Economic incentives of large criminal profits and the threat of legal consequences have pushed criminals to continuously improve their malware, especially command and control channels. This thesis applied concepts from successful malware command and control to explore the survivability and resilience of benign configuration management systems. This work expands on existing stage models of the malware life cycle to contribute a new model for identifying malware concepts applicable to benign configuration management. The Hidden Master architecture is a contribution to master-agent network communication. In the Hidden Master architecture, communication between master and agent is asynchronous and can operate through intermediate nodes. This protects the master secret key, which gives full control of all computers participating in configuration management. Multiple improvements to idempotent configuration were proposed, including the definition of the minimal base resource dependency model, simplified resource revalidation, and the use of an imperative general-purpose language for defining idempotent configuration. Following the constructive research approach, the improvements to configuration management were designed into two prototypes. This allowed validation in laboratory testing, in two case studies, and in expert interviews. In laboratory testing, the Hidden Master prototype was more resilient than leading configuration management tools under high load and low memory conditions, and against packet loss and corruption. Only the research prototype was adaptable to a network without stable topology, due to the asynchronous nature of the Hidden Master architecture. The main case study used the research prototype in a complex environment to deploy a multi-room, authenticated audiovisual system for a client of an organization deploying the configuration.
The case studies indicated that an imperative general-purpose language can be used for idempotent configuration in real life, for defining new configurations in unexpected situations using the base resources and abstracting those using standard language features; and that such a system seems easy to learn. Potential business benefits were identified and evaluated using individual semi-structured expert interviews. Respondents agreed that the models and the Hidden Master architecture could reduce costs and risks, improve developer productivity, and allow faster time-to-market. Protection of master secret keys and the reduced need for incident response were seen as key drivers for improved security. Low-cost geographic scaling and leveraging the file serving capabilities of commodity servers were seen to improve scaling and resiliency. Respondents identified jurisdictional legal limitations on encryption and requirements for cloud operator auditing as factors potentially limiting the full use of some concepts.
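To illustrate the idea of idempotent configuration expressed in an imperative general-purpose language, a minimal "base resource" could look like the following Python sketch. The function name and API here are hypothetical, invented for illustration; they are not the thesis prototype's actual interface.

```python
def ensure_file(path, content):
    """Hypothetical idempotent base resource: converge the file at
    `path` to hold exactly `content`.

    Idempotence means re-running never changes the outcome: the file
    is only written when its current state differs from the desired
    state, so repeated application converges and then becomes a no-op.
    Returns True if a change was applied, False if already converged.
    """
    try:
        with open(path) as f:
            if f.read() == content:
                return False  # already in the desired state: do nothing
    except FileNotFoundError:
        pass  # file missing: fall through and create it
    with open(path, "w") as f:
        f.write(content)
    return True  # a change was applied on this run
```

Higher-level configurations could then be abstracted from such base resources using ordinary language features (functions, loops, modules), which is the approach the case studies found usable in practice.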

    The Pragmatic Development of a Carbon Management Framework for UK SMEs

    The UK's commitment to net-zero emissions by 2050 is challenged by critics who cite current government strategies as inadequate, marked by a lack of concrete action and merely aspirational guidelines. Notably, businesses, including small and medium-sized enterprises (SMEs), which account for about half of all business emissions, are pivotal to this goal. Yet existing policies and standards often neglect the significant role of SMEs, who face barriers such as limited knowledge and resources in implementing carbon management practices. This thesis explores the development of a novel carbon management framework specifically designed for medium-sized organisations in the UK to address these problems. The research adopts a practical approach through collaboration with an industry partner, facilitating a case study for real-world application. Adopting a mixed-methods research design grounded in pragmatism, the study commenced with a qualitative study in the form of a focus group. This exploratory phase, critical for understanding SME challenges, yielded rich data revealing key management themes in strategy, energy, and data. The framework design was supported by a materiality assessment and input from key stakeholders across three major iterations. The final framework comprises three phases: establishing a baseline carbon footprint, creating a carbon reduction plan, and strategically implementing this plan. The validation process, conducted at Knowsley Safari, successfully tested the initial two phases but faced constraints in fully assessing the third phase due to time limitations. While the research achieved its primary aim of developing a novel carbon management framework for SMEs, it encountered limitations, notably in time and in the generalisability of findings due to reliance on a single case study. Future research could test the framework across diverse SME settings to establish its broader applicability and effectiveness in aiding the UK's net-zero emission goals.

    Towards A Practical High-Assurance Systems Programming Language

    Writing correct and performant low-level systems code is a notoriously demanding job, even for experienced developers. To make matters worse, formally reasoning about its correctness properties introduces yet another level of complexity to the task, requiring considerable expertise in both systems programming and formal verification. Without appropriate tools that provide abstraction and automation, development can be extremely costly due to the sheer complexity of the systems and the nuances in them. Cogent is designed to alleviate the burden on developers when writing and verifying systems code. It is a high-level functional language with a certifying compiler, which automatically proves the correctness of the compiled code and also provides a purely functional abstraction of the low-level program to the developer. Equational reasoning techniques can then be used to prove functional correctness properties of the program on top of this abstract semantics, which is notably less laborious than directly verifying the C code. To make Cogent a more approachable and effective tool for developing real-world systems, we further strengthen the framework by extending the core language and its ecosystem. Specifically, we enrich the language to allow users to control the memory representation of algebraic data types, while retaining the automatic proof with a data layout refinement calculus. We repurpose existing tools in a novel way and develop an intuitive foreign function interface, which provides users a seamless experience when using Cogent in conjunction with native C. We augment the Cogent ecosystem with a property-based testing framework, which helps developers better understand the impact formal verification has on their programs and enables a progressive approach to producing high-assurance systems.
Finally, we explore refinement type systems, which we plan to incorporate into Cogent for more expressiveness and better integration of systems programmers into the verification process.

    Comparative Multiple Case Study into the Teaching of Problem-Solving Competence in Lebanese Middle Schools

    This multiple case study investigates how problem-solving competence is integrated into teaching practices in private schools in Lebanon. Its purpose is to compare instructional approaches to problem-solving across three different programs: the American (Common Core State Standards and Next Generation Science Standards), French (Socle Commun de Connaissances, de Compétences et de Culture), and Lebanese, with a focus on middle school (grades 7, 8, and 9). The project was conducted in nine schools equally distributed among three categories based on the programs they offered: category 1 schools offered the Lebanese program, category 2 the French and Lebanese programs, and category 3 the American and Lebanese programs. Each school was treated as a separate case. Structured observation data were collected using observation logs that focused on lesson objectives and specific cognitive problem-solving processes. The two logs were created based on a document review of the requirements for the three programs. Structured observations were followed by semi-structured interviews that were conducted to explore teachers' beliefs and understandings of problem-solving competence. The comparative analysis of within-category structured observations revealed instruction ranging from teacher-led practices, particularly in category 1 schools, to more student-centered approaches in categories 2 and 3. The cross-category analysis showed a reliance on cognitive processes primarily promoting exploration, understanding, and demonstrating understanding, with less emphasis on planning and executing, and on monitoring and reflecting, thus uncovering a weakness in addressing these processes. The findings of the post-observation semi-structured interviews disclosed a range of definitions of problem-solving competence prevalent amongst teachers, with clear divergences across the three school categories.
This research is unique in that it compares problem-solving teaching approaches across three different programs and explores teachers' underlying beliefs and understandings of problem-solving competence in the Lebanese context. It is hoped that this project will inform curriculum developers about future directions and much-anticipated reforms of the Lebanese program, and practitioners about areas that need to be addressed to further improve the teaching of problem-solving competence.

    Seamless Multimodal Biometrics for Continuous Personalised Wellbeing Monitoring

    Artificially intelligent perception is increasingly present in the lives of every one of us. Vehicles are no exception, (...) In the near future, pattern recognition will have an even stronger role in vehicles, as self-driving cars will require automated ways to understand what is happening around (and within) them and act accordingly. (...) This doctoral work focused on advancing in-vehicle sensing through the research of novel computer vision and pattern recognition methodologies for both biometrics and wellbeing monitoring. The main focus has been on electrocardiogram (ECG) biometrics, a trait well known for its potential for seamless driver monitoring. Major efforts were devoted to achieving improved performance in identification and identity verification in off-the-person scenarios, well known for increased noise and variability. Here, end-to-end deep learning ECG biometric solutions were proposed and important topics were addressed, such as cross-database and long-term performance, waveform relevance through explainability, and interlead conversion. Face biometrics, a natural complement to the ECG in seamless unconstrained scenarios, was also studied in this work. The open challenges of masked face recognition and interpretability in biometrics were tackled in an effort to evolve towards algorithms that are more transparent, trustworthy, and robust to significant occlusions. Within the topic of wellbeing monitoring, improved solutions to multimodal emotion recognition in groups of people and activity/violence recognition in in-vehicle scenarios were proposed. Lastly, we proposed a novel way to learn template security within end-to-end models, dismissing additional separate encryption processes, and a self-supervised learning approach tailored to sequential data, in order to ensure data security and optimal performance. (...)
Comment: Doctoral thesis presented and approved on the 21st of December 2022 to the University of Port

    Designs of Blackness

    Across more than two centuries Afro-America has created a huge and dazzling variety of literary self-expression. Designs of Blackness provides less a narrative literary history than, precisely, a series of mappings, each literary-critical and comparative while at the same time offering cultural and historical context. This carefully re-edited version of the 1998 publication opens with an estimation of the earliest African American voice in the names of Phillis Wheatley and her contemporaries. It then takes up the huge span of autobiography from Frederick Douglass through to Maya Angelou. "Harlem on My Mind," which follows, sets out the literary contours of America’s premier black city. Womanism, Alice Walker’s presiding term, is given full due in an analysis of fiction from Harriet E. Wilson to Toni Morrison. Richard Wright is approached not as some regulation "realist" but as a more inward, at times near-surreal, author. Decadology has its risks, but the 1940s has rarely been approached as a unique era of war and peace, especially in African American texts. Beat Generation work usually adheres to Ginsberg and Kerouac, but black Beat writing invites its own chapter in the names of Amiri Baraka, Ted Joans and Bob Kaufman. The 1960s has long become a mythic change-decade, and in few greater respects than as a black theatre both of the stage and politics. In Leon Forrest African America had a figure of the postmodern turn: his work is explored in its own right and for how it takes its place in the context of other reflexive black fiction. "African American Fictions of Passing" unpacks the whole deceptive trope of "race" in writing from William Wells Brown through to Charles Johnson. The two newly added chapters pursue African American literary achievement into the Obama-Trump century, fiction from Octavia Butler to Darryl Pinckney, poetry from Rita Dove to Kevin Young.

    Reshaping Higher Education for a Post-COVID-19 World: Lessons Learned and Moving Forward

    No abstract available

    Tradition and Innovation in Construction Project Management

    This book is a reprint of the Special Issue 'Tradition and Innovation in Construction Project Management' that was published in the journal Buildings.

    Imagining, Guiding, Playing Intimacy: A Theory of Character Intimacy Games

    Within the landscape of Japanese media production, and video game production in particular, there is a niche comprising video games centered around establishing, developing, and fulfilling imagined intimate relationships with anime-manga characters. This niche, although very significant in production volume and lifespan, is left unexplored or underexplored; when it is examined, it is subsumed within the scope of wider anime-manga media. This obscures the nature of such video games, alternatively identified with descriptors including but not limited to ‘visual novel’, ‘dating simulator’ and ‘adult computer game’. As games centered around developing intimacy with characters, they present specific ensembles of narrative content, aesthetics and software mechanics. These ensembles are aimed at eliciting in users what are, for all intents and purposes, parasocial phenomena towards the game’s characters. In other words, these software products encourage players to develop affective and bodily responses towards characters, drawing on shared, circulating scripts for sexual and intimate interaction to guide players’ imaginative action. This study defines such games as ‘character intimacy games’: video game software where traversal is contingent on players knowingly establishing, developing, and fulfilling intimate bonds with fictional characters. To do so, however, players must recognize themselves as playing that type of game, and as looking to develop that kind of response towards the game’s characters. Character intimacy games are contingent upon players developing affective and bodily responses, and thus presume that players are, at the very least, non-hostile towards their development. This study approaches Japanese character intimacy games as its corpus, and operates at the intersection of communication studies, AMO studies and game studies.
The study articulates a research approach based on the twin needs of examining individual works of significance and compensating for a general scarcity of scholarly background on the subject. It juxtaposes data-driven approaches derived from fan-curated databases – The Visual Novel Database and Erogescape (Erogē Hyōron Kūkan) – with a purpose-created ludo-hermeneutic process. By observing character intimacy games through fan-curated data and building ludo-hermeneutics on the resulting ontology, this study argues that character intimacy games are video games where traversal is contingent on players knowingly establishing, developing, and fulfilling intimate bonds with fictional characters and recognizing themselves as doing so. To produce such conditions, the assemblage of software mechanics and narrative content in such games facilitates intimacy between player and characters, which is ultimately conducive to the emergence of parasocial phenomena. Parasocial phenomena, in turn, are deployed as an integral assumption regarding player activity within the game’s wider assemblage of narrative content and software mechanics.