The Viability and Potential Consequences of IoT-Based Ransomware
With the increased threat of ransomware and the substantial growth of the Internet of Things (IoT) market, there is significant motivation for attackers to carry out IoT-based ransomware campaigns. In this thesis, the viability of such malware is tested.
As part of this work, various techniques that ransomware developers could use to attack commercial IoT devices were explored. First, methods that attackers could use to communicate with the victim were examined, so that a ransom note could be reliably delivered. Next, the viability of "bricking" as a method of ransom was evaluated, whereby devices are remotely disabled unless the victim makes a payment to the attacker. Research was then performed to ascertain whether it was possible to remotely gain persistence on IoT devices, which would improve the efficacy of existing ransomware methods and provide opportunities for more advanced ransomware to be created. Finally, after successfully identifying a number of persistence techniques, the viability of privacy-invasion-based ransomware was analysed.
For each assessed technique, proofs of concept were developed. A range of devices -- with various intended purposes, such as routers, cameras and phones -- were used to test the viability of these proofs of concept. To test communication hijacking, devices' "channels of communication" -- such as web services and embedded screens -- were identified, then hijacked to display custom ransom notes. During the analysis of bricking-based ransomware, a working proof of concept was created, which was then able to remotely brick five IoT devices. After analysing the storage design of an assortment of IoT devices, six different persistence techniques were identified, which were then successfully tested on four devices, such that malicious filesystem modifications were retained after the device was rebooted. When researching privacy-invasion-based ransomware, several methods were created to extract information from data sources commonly found on IoT devices, such as nearby WiFi signals, images from cameras, or audio from microphones. These were successfully implemented in a test environment such that ransomable data could be extracted, processed, and stored for later use to blackmail the victim.
Overall, IoT-based ransomware has been shown to be not only viable but also highly damaging to both IoT devices and their users. While the use of IoT-based ransomware is still very uncommon "in the wild", the techniques demonstrated within this work highlight an urgent need to improve the security of IoT devices to avoid the risk of IoT-based ransomware causing havoc in our society. Finally, during the development of these proofs of concept, a number of potential countermeasures were identified, which can be used to limit the effectiveness of the attacking techniques discovered in this PhD research.
Anatomy and kinematic evolution of an ancient passive margin involved into an orogenic wedge (Western Southern Alps, Varese area, Italy and Switzerland)
We make use of our own geological mapping, interpretations of seismic reflection profiles, and deep geophysical data to build a lithospheric-scale cross-section across the European Western Southern Alps (Varese area) and to model a progressive restoration from the end of Mesozoic rifting to the present day. Early phases of the Alpine orogeny were characterized by Europe-directed thrusting, whereas post-Oligocene shortening led to basement-involving crustal accretion accompanied by backfolding, consistent with the kinematics of the adjoining Ivrea Zone. Wedging was favored by a significant component of reactivation of the inherited Adriatic rifted margin. Our results also suggest that, during collisional and post-collisional tectonics, lithosphere dynamics diachronically drove the onset of tectonic phases (i.e., wedging and slab retreat) from east to west across the Western Southern Alps.
Food biodiversity: Quantifying the unquantifiable in human diets
Dietary diversity is an established public health principle, and its measurement is essential for studies of diet quality and food security. However, conventional between-food-group scores fail to capture the nutritional variability and ecosystem services delivered by dietary richness and dissimilarity within food groups, or the relative distribution (i.e., evenness or moderation) of, e.g., species or varieties across whole diets. Summarizing food biodiversity in a single all-encompassing index is problematic. Various diversity indices have been proposed in ecology, yet these require methodological adaptation for integration into dietary assessments. In this narrative review, we summarize the key conceptual issues underlying the measurement of food biodiversity at the edible-species level, assess the ecological diversity indices previously applied to food consumption and food supply data, and discuss their relative suitability and potential amendments for use in (quantitative) dietary intake studies. Ecological diversity indices are often used without justification through the lens of nutrition. To illustrate: (i) dietary species richness fails to account for the distribution of foods across the diet or their functional traits; (ii) evenness indices, such as the Gini-Simpson index, require widely accepted relative abundance units (e.g., kcal, g, cups) and evidence-based moderation weighting factors; and (iii) functional dissimilarity indices are constructed from an arbitrary selection of distance measures, cutoff criteria, and numbers of phylogenetic, nutritional, and morphological traits. Disregarding these limitations can lead to counterintuitive results and ambiguous or incorrect conclusions about the food biodiversity within diets or food systems.
To ensure comparability and robustness of future research, we advocate food biodiversity indices that: (i) satisfy key axioms; (ii) can be extended to account for disparity between edible species; and (iii) are used in combination, rather than in isolation.
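The Gini-Simpson index mentioned in (ii) has a simple closed form, 1 - sum(p_i^2): the probability that two randomly drawn units belong to different species. A minimal sketch, using hypothetical gram-based intake values (the review itself prescribes no particular abundance unit or implementation):

```python
def gini_simpson(counts):
    """Gini-Simpson diversity: 1 - sum(p_i^2), the probability that two
    units drawn at random belong to different species."""
    total = sum(counts)
    if total == 0:
        raise ValueError("at least one non-zero count is required")
    return 1.0 - sum((c / total) ** 2 for c in counts)

# Hypothetical one-day intake in grams for four edible species
diet = {"rice": 300, "lentils": 100, "spinach": 50, "mango": 50}
richness = sum(1 for g in diet.values() if g > 0)  # within-diet species richness
evenness = gini_simpson(list(diet.values()))       # 0.58 for this example
```

Note how richness alone (4) ignores that rice dominates the diet, while the Gini-Simpson value penalises that skew, illustrating the evenness-versus-richness distinction drawn above.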
Fair Assortment Planning
Many online platforms, ranging from online retail stores to social media platforms, employ algorithms to optimize their offered assortment of items (e.g., products and content). These algorithms tend to prioritize the platforms' short-term goals by featuring only the items with the highest popularity or revenue. However, this practice can lead to undesirable outcomes for the remaining items, driving them off the platform and in turn hurting the platform's long-term goals. Motivated by this, we introduce and study a fair assortment planning problem, which requires any two items with similar quality/merits to be offered similar outcomes. We show that the problem can be formulated as a linear program (LP), called (FAIR), that optimizes over the distribution of all feasible assortments. To find a near-optimal solution to (FAIR), we propose a framework based on the ellipsoid method, which requires a polynomial-time separation oracle for the dual of the LP. We show that finding an optimal separation oracle for the dual problem is NP-complete, and hence we propose a series of approximate separation oracles, which yield an approximation algorithm and a PTAS for the original problem (FAIR). The approximate separation oracles are designed by (i) showing that the separation oracle for the dual of the LP is equivalent to solving an infinite series of parameterized knapsack problems, and (ii) taking advantage of the structure of these parameterized knapsack problems. Finally, we conduct a case study using the MovieLens dataset, which demonstrates the efficacy of our algorithms and further sheds light on the price of fairness.
Comment: 86 pages, 7 figures
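The abstract reduces the dual separation oracle to a family of parameterized knapsack problems. The core subroutine in any such reduction is an ordinary 0/1 knapsack solver; the sketch below shows the classic dynamic program on a hypothetical instance (the paper's actual oracle adds parameterization and approximation machinery not reproduced here):

```python
def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack via dynamic programming over capacities.
    Returns the maximum total value packable; O(n * capacity) time."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # Iterate capacities downwards so each item is used at most once.
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Hypothetical item "revenues" and "sizes" with an assortment budget of 50
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220 (second and third items)
```

In the oracle setting, the values and weights would themselves depend on the dual variables, and the oracle would search over that parameter to find a violated constraint.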
Modelling uncertainties for measurements of the H → γγ Channel with the ATLAS Detector at the LHC
The Higgs boson to diphoton (H → γγ) branching ratio is only 0.227 %, but this final state has yielded some of the most precise measurements of the particle. As measurements of the Higgs boson become increasingly precise, greater importance is placed on the factors that constitute the uncertainty. Reducing the effects of these uncertainties requires an understanding of their causes. The research presented in this thesis aims to illuminate how uncertainties on simulation modelling are determined, and proffers novel techniques for deriving them.
The upgrade of the FastCaloSim tool, used for simulating events in the ATLAS calorimeter at a rate far exceeding that of the nominal detector simulation, Geant4, is described. The integration of a method that allows the toolbox to emulate the accordion geometry of the liquid argon calorimeters is detailed. This tool allows for the production of larger samples while using significantly fewer computing resources.
A measurement of the total Higgs boson production cross-section multiplied by the diphoton branching ratio (σ × Bγγ) is presented, where this value was determined to be (σ × Bγγ)obs = 127 ± 7 (stat.) ± 7 (syst.) fb, in agreement with the Standard Model prediction. The signal and background shape modelling is described, and the contribution of the background modelling uncertainty to the total uncertainty ranges from 18–2.4 %, depending on the Higgs boson production mechanism.
A method for estimating the number of events in a Monte Carlo background sample required to model the shape is detailed. It was found that the size of the nominal γγ background event sample required a multiplicative increase by a factor of 3.60 to adequately model the background with a confidence level of 68 %, or a factor of 7.20 for a confidence level of 95 %. Based on this estimate, 0.5 billion additional simulated events were produced, substantially reducing the background modelling uncertainty.
A technique is detailed for emulating the effects of Monte Carlo event generator differences using multivariate reweighting. The technique is used to estimate the event generator uncertainty on the signal modelling of tHqb events, improving the reliability of estimates of the tHqb production cross-section. This multivariate reweighting technique is then used to estimate the generator modelling uncertainties on background Vγγ samples for the first time. The estimated uncertainties were found to be covered by the currently assumed background modelling uncertainty.
The genetic diversity and distribution of two endangered pit vipers (Bothriechis marchi and Cerrophidion wilsoni) within Cusuco National Park, Honduras
The current extinction rate is increasingly recognised as an escalating issue, with observations indicating that humans are the leading cause of a sixth mass extinction. Cloud forests in particular are home to a variety of threatened species; however, many of these forests are highly fragmented, leaving many species and populations isolated. These forests have become centres of endemism for many species, leading to a high representation within global biodiversity hotspots and "irreplaceable" protected areas.
This study primarily investigates the genetic diversity and distribution of two recently re-classified, sympatric pit vipers (Bothriechis marchi and Cerrophidion wilsoni) within Cusuco National Park, Honduras, using samples acquired by Operation Wallacea Ltd. Twenty-nine B. marchi samples and forty-two C. wilsoni samples were successfully sequenced for the cytochrome oxidase subunit 1 (CO1), NADH dehydrogenase 4 (ND4), and two segments of the cytochrome b (CytB) gene, revealing eight haplotypes within the B. marchi population (haplotype diversity = 0.697044) compared to only two haplotypes within the C. wilsoni population (haplotype diversity = 0.493612). This low mtDNA haplotype diversity within the C. wilsoni population may leave it vulnerable to future challenges.
The mapping of these haplotypes suggests a partial restriction of gene flow between the west and east of the park. This appears to be a larger issue for B. marchi; however, ongoing research is required to confirm its significance and potential impact. The results also provide further support regarding the distribution of these species, indicating a higher abundance of B. marchi around the El Corticito camp on the west side of the park and a higher abundance of C. wilsoni around the Cantiles camp in the centre of the park. At the El Danto camp there is a good likelihood of encountering both species; however, the high levels of deforestation on this side of the park pose a significant threat.
The continuation of this deforestation could result in the loss of approximately 33–55% of all recorded mtDNA genetic diversity within the B. marchi population and cause significant change in the genetic structure of this population. In addition, 71% of all B. marchi encounters and 33% of C. wilsoni encounters between 2007 and 2018 were recorded on the west side of the park, highlighting the greater vulnerability of B. marchi to the current deforestation on this side of the park and indicating that a large percentage of both populations may be lost should this deforestation continue.
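Haplotype (gene) diversity as reported above is conventionally computed with Nei's formula, h = n/(n-1) · (1 - Σ p_i²), where p_i is the frequency of haplotype i among n sequences. A small sketch follows; the 25/17 split of the 42 C. wilsoni sequences between the two haplotypes is hypothetical (the abstract does not give the haplotype table), chosen only because it happens to reproduce the reported value:

```python
def haplotype_diversity(haplotype_counts):
    """Nei's haplotype (gene) diversity: h = n/(n-1) * (1 - sum(p_i^2)),
    where p_i is the frequency of haplotype i among n sequences."""
    n = sum(haplotype_counts)
    if n < 2:
        raise ValueError("need at least two sequences")
    freq_sq = sum((c / n) ** 2 for c in haplotype_counts)
    return n / (n - 1) * (1.0 - freq_sq)

# Hypothetical two-haplotype split consistent with the reported 0.493612
print(round(haplotype_diversity([25, 17]), 6))  # 0.493612
```

The same function applied to eight haplotype counts would yield the higher B. marchi figure, reflecting how diversity rises with both the number of haplotypes and the evenness of their frequencies.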
Defining Service Level Agreements in Serverless Computing
The emergence of serverless computing has brought significant advancements to the delivery of computing resources to cloud users. With the abstraction of infrastructure, ecosystem, and execution environments, users can focus on their code while relying on the cloud provider to manage the abstracted layers. In addition, desirable features such as autoscaling and high availability became the provider's responsibility and can be adopted by the user's application at no extra overhead.
Despite such advancements, significant challenges must be overcome as applications transition from monolithic stand-alone deployments to the ephemeral and stateless microservice model of serverless computing. These challenges pertain to the uniqueness of the conceptual and implementation models of serverless computing. One of the notable challenges is the complexity of defining Service Level Agreements (SLA) for serverless functions. As the serverless model shifts the administration of resources, ecosystem, and execution layers to the provider, users become mere consumers of the provider's abstracted platform with no insight into its performance. Suboptimal conditions of the abstracted layers are not visible to the end-user, who has no means to assess their performance. Thus, SLA in serverless computing must take into consideration the unique abstraction of its model.
This work investigates Service Level Agreement (SLA) modeling of serverless functions' and serverless chains' executions. We highlight how serverless SLA fundamentally differs from earlier cloud delivery models. We then propose an approach to define SLA for serverless functions by utilizing resource utilization fingerprints of functions' executions, and a method to assess whether executions adhere to that SLA. We evaluate the approach's accuracy in detecting SLA violations for a broad range of serverless application categories. Our validation results illustrate high accuracy in detecting SLA violations resulting from resource contention and provider ecosystem degradations. We conclude by presenting the empirical validation of our proposed approach, which could detect Execution-SLA violations with accuracy of up to 99%.
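One simple way to realise the resource-utilization-fingerprint idea described above is to summarise reference executions per metric and flag outliers. The sketch below uses a plain z-score rule with entirely hypothetical metric names, values, and thresholds; it illustrates the concept only and is not the authors' actual method:

```python
import statistics

def build_fingerprint(reference_runs):
    """Per-metric (mean, stdev) summary across reference executions.
    reference_runs: list of dicts such as {"duration_ms": ..., "cpu_pct": ...}."""
    metrics = reference_runs[0].keys()
    return {m: (statistics.mean(r[m] for r in reference_runs),
                statistics.stdev(r[m] for r in reference_runs))
            for m in metrics}

def violates_sla(execution, fingerprint, k=3.0):
    """Flag an execution whose metrics deviate more than k standard
    deviations from the fingerprint (a simple z-score rule)."""
    return any(abs(execution[m] - mean) > k * std
               for m, (mean, std) in fingerprint.items())

# Hypothetical reference executions of one serverless function
runs = [{"duration_ms": 100, "cpu_pct": 40},
        {"duration_ms": 110, "cpu_pct": 42},
        {"duration_ms": 105, "cpu_pct": 41}]
fp = build_fingerprint(runs)
print(violates_sla({"duration_ms": 300, "cpu_pct": 90}, fp))  # True
print(violates_sla({"duration_ms": 104, "cpu_pct": 41}, fp))  # False
```

A contention-induced slowdown (the 300 ms run) lands far outside the fingerprint and is flagged, while a typical run passes; a production scheme would need many more reference runs and per-category tolerances.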
Seeing Below The Surface Of Mars: Volatile sublimation in the martian regolith
The discovery of buried carbon dioxide (CO2) ice between water (H2O) ice layers within the martian south polar layered deposits has renewed interest in subsurface CO2 ice. In this thesis, subsurface CO2 ice stability is explored using a 1-D thermal and vapour diffusion numerical model that simulates three phases of H2O, two phases of CO2, and adsorption of both for the first time.
Numerical experiments were run to examine how these two ices influence one another, under a variety of ice-layer configurations that are expected to be valid for Mars. The results demonstrate that an overlying near-surface H2O ice-filled regolith layer increases subsurface CO2 ice stability by an order of magnitude. This stability increases further with the addition of an underlying H2O ice-filled regolith layer. The initial porosity and geological materials used to represent the subsurface also have a large influence on CO2 ice stability. The porosity limits the vapour diffusion rate, while the geological materials influence thermal conductivity and, therefore, subsurface temperatures.
Simulations at different orbital obliquities demonstrate that CO2 ice stability in the polar regions is greatest at low obliquities and smallest at high obliquities. The reverse is true for the equatorial regions. At higher obliquities (>45°) and atmospheric pressures, the results suggest subsurface CO2 ice deposition could occur in the equatorial region.
The model results suggest that a 0.7–27 km CO2 ice layer could sublimate away while 1 m of low-porosity H2O ice forms (in 14–550 kyr depending on method) in the south polar layered deposits. The results also suggest CO2 ice sublimation is dependent on obliquity: ∼0.15 km sublimates at low obliquity and ∼1.9 km sublimates at high obliquity over 100 kyr.
The subsurface model is a useful tool for future investigations into the historical behaviour of ices on Mars, particularly during the Noachian period when the CO2 frost-point temperature was higher.
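The thesis's coupled thermal and vapour diffusion model is far richer than can be shown here, but its thermal core is a 1-D diffusion equation. Below is a minimal explicit finite-difference sketch with purely illustrative parameters and boundary values, not the model's actual scheme, geometry, or martian conditions:

```python
def step_temperature(T, dz, dt, kappa):
    """One explicit finite-difference step of the 1-D heat equation
    dT/dt = kappa * d2T/dz2, with fixed (Dirichlet) boundary temperatures."""
    r = kappa * dt / dz ** 2  # must satisfy r <= 0.5 for explicit stability
    return ([T[0]] +
            [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
             for i in range(1, len(T) - 1)] +
            [T[-1]])

# Hypothetical regolith column: 1 m deep in 1 cm cells, a warm surface over
# a cold interior (illustrative values only)
dz, dt, kappa = 0.01, 10.0, 1.0e-6   # m, s, m^2/s
T = [210.0] + [150.0] * 100          # surface held at 210 K, interior at 150 K
for _ in range(1000):
    T = step_temperature(T, dz, dt, kappa)
# Heat has diffused a short distance below the surface; the deep cells stay cold.
```

The real model additionally couples this temperature field to vapour transport and phase change (sublimation, condensation, adsorption), which is where the CO2 ice stability results above come from.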
Omics measures of ageing and disease susceptibility
While genomics has been a major field of study for decades due to relatively inexpensive genotyping arrays, recent advancements in technology have also allowed the measurement and study of various "omics". There are now numerous methods and platforms available that allow high-throughput, high-dimensional quantification of many types of biological molecules. Traditional genomics and transcriptomics are now joined by proteomics, metabolomics, glycomics, lipidomics and epigenomics.
I was lucky to have access to a unique resource in the Orkney Complex Disease Study (ORCADES), a cohort of individuals from the Orkney Islands who are extremely deeply annotated. Approximately 1000 individuals in ORCADES have genomics, proteomics, lipidomics, glycomics, metabolomics, epigenomics, clinical risk factors and disease phenotypes, as well as body composition measurements from whole-body scans. In addition to these cross-sectional omics and health-related measures, these individuals also have linked electronic health records (EHR) available, allowing the assessment of the effect of these omics measures on incident disease over a ~10-year follow-up period. In this thesis I use this phenotype-rich resource to investigate the relationship between multiple types of omics measures and both ageing and health outcomes.
First, I used the ORCADES data to construct measures of biological age (BA). The idea is that there is an underlying rate at which the body deteriorates with age, which varies between individuals of the same chronological age; this biological age would be more indicative of health status, functional capacity and risk of age-related diseases than chronological age. Previous models estimating BA (ageing clocks) have predominantly been built using a single type of omics assay, and comparison between different omics ageing clocks has been limited. I performed the most exhaustive comparison of different omics ageing clocks yet, with eleven clocks spanning nine different omics assays. I show that different omics clocks overlap in the information they provide about age, that some omics clocks track more generalised ageing while others track specific disease risk factors, and that omics ageing clocks are prognostic of incident disease over and above chronological age.
Second, I assessed whether, individually or in multivariable models, omics measures are associated with health-related risk factors or prognostic of incident disease over the 10 years post-assessment. I show that 2,686 single omics biomarkers are associated with 10 risk factors and 44 subsequent incident diseases. I also show that models built using multiple biomarkers from whole-body scans, metabolomics, proteomics and clinical risk factors are prognostic of subsequent diabetes mellitus, and that clinical risk factors are prognostic of incident hypertensive disorders, obesity, ischaemic heart disease and Framingham risk score.
Third, I investigated the genetic architecture of a subset of the proteomics measures available in ORCADES, specifically 184 cardiovascular-related proteins. Combining genome-wide association study (GWAS) summary statistics from ORCADES and 17 other cohorts from the SCALLOP Consortium, giving a maximum sample size of 26,494 individuals, I performed 184 genome-wide association meta-analyses (GWAMAs) on the levels of these proteins circulating in plasma. I discovered 592 independent significant loci associated with the levels of at least one protein. I found that between 8 and 37% of these significant loci colocalise with known expression quantitative trait loci (eQTL). I also find evidence of causal associations between 11 plasma protein levels and disease susceptibility using Mendelian randomisation, highlighting potential candidate drug targets.
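Real omics ageing clocks of the kind compared above are typically penalised regressions over thousands of markers. As a conceptual sketch only, the following fits a single hypothetical marker to chronological age by ordinary least squares; the residual between predicted and actual age plays the role of "age acceleration":

```python
def fit_univariate_clock(marker, age):
    """Least-squares fit of age ~ a + b * marker; returns (intercept, slope).
    A toy one-marker 'ageing clock' for illustration only."""
    n = len(marker)
    mx = sum(marker) / n
    my = sum(age) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(marker, age))
         / sum((x - mx) ** 2 for x in marker))
    a = my - b * mx
    return a, b

# Hypothetical methylation-like marker that rises linearly with age
marker = [0.10, 0.20, 0.30, 0.40]
age    = [30.0, 40.0, 50.0, 60.0]
a, b = fit_univariate_clock(marker, age)
predicted = [a + b * x for x in marker]
# Residuals (predicted - actual) would be each individual's "age acceleration"
```

An actual clock would replace the single marker with a regularised multivariate model (e.g. over CpG sites or protein levels), but the prediction-and-residual logic is the same.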