A formal approach for network security policy validation
Network security is a crucial concern for administrators due to growing network size and the increasing number of functions and controls (e.g., firewall, DPI, parental control).
Errors in configuring security controls may result in serious security breaches and vulnerabilities (e.g., blocking legitimate traffic or permitting unwanted traffic) that must be detected and addressed.
This work proposes a novel approach for validating network policy enforcement by checking the network status and configuration, and by detecting the possible causes in case of misconfiguration or software attacks.
Our contribution exploits formal methods to model and validate the packet processing and forwarding behaviour of security controls, and to validate the trustworthiness of the controls by using remote attestation.
A prototype implementation of this approach is presented and used to validate different scenarios.
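The core idea of checking enforced behaviour against intended policy can be illustrated with a toy model (the rule format, names, and packet space below are illustrative assumptions, not the paper's actual formalism): enumerate a small packet space, compare what the deployed rules do with what the policy intends, and report counterexamples as candidate misconfiguration causes.

```python
from itertools import product

# Hypothetical mini-model: a rule matches on (src, dst, port), where None
# is a wildcard; first matching rule wins, otherwise the default applies.
def first_match(rules, pkt, default="deny"):
    for match, action in rules:
        if all(m is None or m == p for m, p in zip(match, pkt)):
            return action
    return default

# Intended policy: only host "a" may reach port 22 on "srv".
def intended(pkt):
    src, dst, port = pkt
    return "allow" if (src == "a" and dst == "srv" and port == 22) else "deny"

# Misconfigured deployed rule set: the wildcards are too permissive.
rules = [((None, "srv", None), "allow")]

def validate(rules, space):
    """Exhaustively compare enforced vs. intended behaviour, returning
    counterexample packets that point at possible misconfigurations."""
    return [pkt for pkt in product(*space)
            if first_match(rules, pkt) != intended(pkt)]

space = (["a", "b"], ["srv"], [22, 80])
counterexamples = validate(rules, space)
```

A real validator would rely on formal methods (e.g., SAT/SMT solving over symbolic packets) rather than exhaustive enumeration, but the counterexamples it produces play the same diagnostic role.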
High-Fidelity Provenance: Exploring the Intersection of Provenance and Security
In the past 25 years, the World Wide Web has disrupted the way news is disseminated and consumed. However, the euphoria over the democratization of news publishing was soon followed by scepticism, as a new phenomenon emerged: fake news. With no gatekeepers to vouch for it, the veracity of the information served over the World Wide Web became a major public concern. The Reuters Digital News Report 2020 cites that in at least half of the EU member countries, 50% or more of the population is concerned about online fake news. To help address the problem of trust in information communicated over the World Wide Web, it has been proposed to also make available the provenance metadata of the information. Similar to artwork provenance, this would include a detailed track of how the information was created, updated, and propagated to produce the result we read, as well as what agents, human or software, were involved in the process. However, keeping track of provenance information is a non-trivial task. Current approaches are often of limited scope and may require modifying existing applications to also generate provenance information along with their regular output. This thesis explores how provenance can be tracked automatically, in an application-agnostic manner, without having to modify the individual applications. We frame provenance capture as a data flow analysis problem and explore the use of dynamic taint analysis in this context. Our work shows that this approach improves on the quality of provenance captured compared to traditional approaches, yielding what we term high-fidelity provenance. We explore the performance cost of this approach and use deterministic record and replay to bring it down to a more practical level. Furthermore, we create and present the tooling necessary for expanding the use of deterministic record and replay for provenance analysis.
The thesis concludes with an application of high-fidelity provenance as a tool for state-of-the-art offensive security analysis, based on the intuition that software, too, can be misguided by "fake news". This demonstrates that the potential uses of high-fidelity provenance for security extend beyond traditional forensic analysis.
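The data-flow intuition behind taint-based provenance capture can be sketched in a few lines (a deliberately simplified illustration; real dynamic taint analysis operates at the instruction or byte level, application-agnostically, rather than through wrapper objects like the one below):

```python
class Tainted:
    """Toy taint wrapper: a value plus the set of sources (provenance
    labels) that influenced it. Labels merge whenever values combine."""
    def __init__(self, value, sources):
        self.value = value
        self.sources = frozenset(sources)

    def __add__(self, other):
        # Propagation rule: the result of an operation carries the union
        # of the labels of its operands.
        if isinstance(other, Tainted):
            return Tainted(self.value + other.value,
                           self.sources | other.sources)
        return Tainted(self.value + other, self.sources)

a = Tainted("header:", {"config.yaml"})
b = Tainted("payload", {"input.dat"})
out = a + b
# out.sources now records every source that contributed to the output
```

Because labels propagate through every operation that combines values, an output's label set records every source that influenced it, which is the essence of what the thesis terms high-fidelity provenance.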
Renewable Energy
This chapter presents an in-depth examination of major renewable energy technologies, including their installed capacity and energy supply in 2009, the current state of market and technology development, their economic and financial feasibility in 2009 and in the near future, as well as major issues they may face relative to their sustainability or implementation. Renewable energy sources have been important for humankind since the beginning of civilization. For centuries, biomass has been used for heating, cooking, steam generation, and power production; solar energy has been used for heating and drying; geothermal energy has been used for hot water supplies; hydropower, for movement; and wind energy, for pumping and irrigation. For many decades renewable energy sources have also been used to produce electricity or other modern energy carriers.
A Deep Review Of Sustainable Approaches For Hydrogen Production For Energy Generation
Fossil fuel energy sources such as coal, oil, and natural gas produce greenhouse gases upon combustion. Over time, these greenhouse gases trap heat in the atmosphere and cause global temperatures to rise continually. This continued rise in global temperatures has led to severe climate conditions that have resulted in increasing environmental, financial, social, and medical issues. To mitigate the rise in global temperatures, and the negative effects associated with it, the carbon intensity of our energy sources must be substantially reduced or eliminated where possible. In addition, alternative carbon-free energy sources must be pursued and implemented. Hydrogen is a viable energy carrier for electricity generation and transportation without the emission of greenhouse gases. It occurs in fossil fuels such as coal and natural gas, in biomass, and in large bodies of water. This research explores the primary methods used to produce hydrogen from fossil fuels and biomass using technologies such as steam methane reforming and gasification, and from water using electrolysis. With carbon capture and storage, hydrogen can be produced semi-sustainably from natural gas and coal with reduced greenhouse gas emissions. In the electrolysis process, electricity is used to split water into hydrogen and oxygen. The preferred sources of electricity to minimize the emission of greenhouse gases are renewable energy technologies such as solar, wind, hydropower, and geothermal energy. The research demonstrates the sustainable production of hydrogen and the benefits it affords. The main benefits are electricity generation, fuel for transportation, and raw materials for several industrial processes. Hydrogen will also facilitate the transition to non-fossil-fuel energy technologies and strengthen our energy security.
In addition, it will be instrumental in limiting the global temperature increase to 1.5 degrees Celsius above pre-industrial levels, in accordance with the 2015 Paris Agreement and its goal of net-zero emissions by 2050.
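The energy arithmetic of electrolysis can be sketched from Faraday's law (assuming the commonly cited thermoneutral cell voltage of about 1.48 V and an ideal cell; practical electrolysers need roughly 50 kWh/kg or more):

```python
# Back-of-the-envelope minimum energy to produce 1 kg of hydrogen
# by electrolysis, assuming an ideal cell at the thermoneutral voltage.
F = 96485.0      # Faraday constant, C per mol of electrons
M_H2 = 2.016     # molar mass of H2, g/mol
V_TN = 1.48      # thermoneutral cell voltage, V (assumption)

mol_h2_per_kg = 1000.0 / M_H2        # ~496 mol of H2 in 1 kg
charge = mol_h2_per_kg * 2 * F       # 2 electrons transferred per H2, C
energy_kwh = charge * V_TN / 3.6e6   # joules converted to kWh
```

The result, roughly 39 kWh per kilogram of hydrogen, is the theoretical minimum on a higher-heating-value basis; supplying that electricity from solar, wind, hydropower, or geothermal sources is what makes the resulting hydrogen sustainable.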
Data Epistemologies / Surveillance and Uncertainty
Data Epistemologies studies the changing ways in which ‘knowledge’ is defined, promised, problematised, and legitimated vis-à-vis the advent of digital, ‘big’ data surveillance technologies in early twenty-first century America. As part of the period’s fascination with ‘new’ media and ‘big’ data, such technologies intersect ambitious claims to better knowledge with a problematisation of uncertainty. This entanglement, I argue, results in contextual reconfigurations of what ‘counts’ as knowledge and who (or what) is granted authority to produce it – whether by proving that indiscriminate domestic surveillance prevents terrorist attacks, or by arguing that machinic sensors can know us better than we can ever know ourselves.
The present work focuses on two empirical cases. The first is the ‘Snowden Affair’ (2013-Present): the public controversy unleashed through the leakage of vast quantities of secret material on the electronic surveillance practices of the U.S. government. The second is the ‘Quantified Self’ (2007-Present), a name which describes both an international community of experimenters and the wider industry built up around the use of data-driven surveillance technology for self-tracking every possible aspect of the individual ‘self’. By triangulating media coverage, connoisseur communities, advertising discourse and leaked material, I examine how surveillance technologies were presented for public debate and speculation.
This dissertation is thus a critical diagnosis of the contemporary faith in ‘raw’ data, sensing machines and algorithmic decision-making, and of their public promotion as the next great leap towards objective knowledge. Surveillance is not only a means of totalitarian control or a technology for objective knowledge, but a collective fantasy that seeks to mobilise public support for new epistemic systems. Surveillance, as part of a broader enthusiasm for ‘data-driven’ societies, extends the old modern project whereby the human subject – its habits, its affects, its actions – becomes the ingredient, the raw material, the object, the target, for the production of truths and judgments about it by things other than itself.
Computer Science 2019 APR Self-Study & Documents
UNM Computer Science APR self-study report and review team report for Spring 2019, fulfilling requirements of the Higher Learning Commission.
Bespoke Security for Resource Constrained Cyber-Physical Systems
Cyber-Physical Systems (CPSs) are critical to many aspects of our daily lives. Autonomous cars, life-saving medical devices, drones for package delivery, and robots for manufacturing are all prime examples of CPSs. The dual cyber/physical operating nature and highly integrated feedback control loops of CPSs mean that they inherit security problems from traditional computing systems (e.g., software vulnerabilities, hardware side-channels) and physical systems (e.g., theft, tampering), while additionally introducing challenges of their own. The challenges to achieving security for CPSs stem not only from the interaction of the cyber and physical domains, but also from the additional pressures of resource constraints imposed by cost, limited energy budgets, and the real-time nature of workloads. Due to the tight resource constraints of CPSs, there is often little headroom to devote to security. Thus, there is a need for low-overhead, deployable solutions to harden resource constrained CPSs. This dissertation shows that security can be effectively integrated into resource constrained cyber-physical system devices by leveraging fundamental physical properties, and by tailoring and extending age-old abstractions in computing.
To provide context on the state of security for CPSs, this document begins with the development of a unifying framework that can be used to identify threats and opportunities for enforcing security policies while providing a systematic survey of the field. This dissertation characterizes the properties of CPSs and typical components (e.g., sensors, actuators, computing devices) in addition to the software commonly used. We discuss available security primitives and their limitations for both hardware and software. In particular, we focus on software security threats targeting memory safety. The rest of the thesis focuses on the design and implementation of novel, deployable approaches to combat memory safety issues on resource constrained devices used by CPSs (e.g., 32-bit processors and microcontrollers). We first discuss how cyber-physical system properties such as inertia and feedback can be used to harden software efficiently with minimal modification to both hardware and software. We develop the framework You Only Live Once (YOLO), which proactively resets a device and restores it from a secure, verified snapshot. YOLO relies on inertia to tolerate periods of reset, and on feedback to rebuild state when recovering from a snapshot. YOLO is built upon a theoretical model that is used to determine safe operating parameters to aid a system designer in deployment. We evaluate YOLO in simulation and in two real-world CPSs, an engine and a drone.
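The reset-and-recover idea behind YOLO can be sketched as follows (class and parameter names here are illustrative, not from the dissertation): a controller is periodically wiped back to a verified snapshot, discarding any state an attacker may have injected, and then uses plant feedback to re-estimate what the snapshot cannot capture.

```python
class Controller:
    def __init__(self, snapshot):
        self.snapshot = dict(snapshot)   # secure, verified golden state
        self.state = dict(snapshot)

    def reset(self):
        """Wipe run-time state and restore the verified snapshot;
        any attacker-injected state is discarded with it."""
        self.state = dict(self.snapshot)

    def rebuild_from_feedback(self, sensor_reading):
        """Feedback lets the controller re-estimate state the snapshot
        cannot capture (e.g., the plant's current output)."""
        self.state["estimate"] = sensor_reading

def control_loop(ctrl, readings, reset_every=3):
    outputs = []
    for i, reading in enumerate(readings):
        if i > 0 and i % reset_every == 0:
            ctrl.reset()                  # proactive periodic reset
        ctrl.rebuild_from_feedback(reading)
        outputs.append(ctrl.state["estimate"])
    return outputs
```

The system's physical inertia is what makes the brief reset window tolerable; choosing a safe `reset_every` is the kind of parameter the dissertation's theoretical model is meant to determine.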
Second, we explore how rethinking core computing concepts can lead to new fundamental abstractions that efficiently hide the performance overheads usually associated with hardening software against memory safety issues. To this end, we present two techniques. (i) The Phantom Address Space (PAS) is a new architectural concept that can be used to improve N-version systems by (almost) eliminating the overheads associated with handling replicated execution. Specifically, PAS can be used to provide an efficient implementation of a diversification concept known as execution path randomization, aimed at thwarting code-reuse attacks. The goal of execution path randomization is to frequently switch between two distinct program variants, forcing the attacker to gamble on which code to reuse. (ii) Cache Line Formats (Califorms) introduces a novel method to efficiently store memory in caches. Califorms builds on the novel insight that dead spaces in program data, due to its memory layout, can be used to efficiently implement the concept of memory blacklisting, which prohibits a program from accessing certain memory regions based on program semantics. Califorms not only consumes less memory than prior approaches, but can provide byte-granular protection while limiting the scope of its hardware changes to caches. While both PAS and Califorms were originally designed to target resource constrained devices, it is worth noting that they are widely applicable and can efficiently scale up to mobile, desktop, and server class processors.
As CPSs continue to proliferate and become integrated into more critical infrastructure, security is an increasing concern. However, security will undoubtedly always play second fiddle to the financial concerns that affect business bottom lines. Thus, it is important that there be easily deployable, low-overhead solutions that can scale from the most constrained of devices to more featureful systems for future migration. This dissertation is one step towards the goal of providing inexpensive mechanisms to ensure the security of cyber-physical system software.
Technical Communication Inclusionary Interventions Into Academic Spaces
While many efforts have been made to make higher education in the US more equitable, there are still academic spaces in which some knowledges and some knowledge makers are marginalized. In this dissertation, I identify three such spaces: technical editing, graduate instructor training, and online academic research in trans communities. When editors make revisions based solely on American Standard English, as most editing practice and teaching currently are, they risk marginalizing non-heritage speakers of English and speakers of various dialects of English, like African American Vernacular English. I suggest that by shifting our focus of editing from grammar policing to editing for underrepresented audiences, we can make editing a more inclusive space for marginalized voices. I give examples of how to create these kinds of interventions both in the editing classroom and through workshops for faculty. Next, I address how programs can better support graduate student instructors’ sense of wellbeing. I suggest that one of the best ways to develop inclusive interventions in graduate instructor training is by inviting graduate students to help design the ways in which departments communicate about student wellbeing. Finally, to intervene in the anti-trans violence that continues to plague the United States, I propose an intervention into the ways that academics study online trans communities. Through these kinds of interventions, I demonstrate that we can continue the work of creating more inclusive spaces in higher education.