Optimal Meshing Degree Performance Analysis in a mmWave FWA 5G Network Deployment
Fifth-generation technologies have reached a stage where it is now feasible to consider deployments that extend beyond traditional public networks. Central to this process is the application of Fixed Wireless Access (FWA) in 5G Non-public Networks (NPNs), which can utilise a novel combination of radio technologies to deploy an infrastructure on top of 5G NR or entirely from scratch. However, the use of an FWA backhaul faces many challenges in relation to the trade-offs for reduced costs and a relatively simple deployment. Specifically, the use of meshed deployments is critical as it provides resilience against a temporary loss of connectivity due to link errors. In this paper, we examine the use of meshing in an FWA backhaul to determine if an optimal trade-off exists between the deployment of more nodes/links to provide multiple paths to the nearest Point of Presence (POP) and the performance of the network. Using a real 5G NPN deployment as a basis, we have conducted a simulated analysis of increasing network densities to determine the optimal configuration. Our results show a clear advantage for meshing in general, but there is also a performance trade-off to consider between overall network throughput and stability.
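The resilience argument for meshing can be illustrated with a toy reachability check (a minimal sketch, not the paper's simulator; the topology and node numbering are assumptions): a chain backhaul is partitioned by a single link failure, while a meshed variant with extra "skip" links keeps every node connected to the POP.

```python
# Toy illustration of why meshed FWA backhaul links add resilience:
# compare a chain topology and a meshed one after one link failure.
from collections import deque

def reachable(n_nodes, links, pop=0):
    """Return the set of nodes that can still reach the POP over `links`."""
    adj = {i: set() for i in range(n_nodes)}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    seen, queue = {pop}, deque([pop])
    while queue:
        u = queue.popleft()
        for v in adj[u] - seen:
            seen.add(v)
            queue.append(v)
    return seen

n = 6
chain = [(i, i + 1) for i in range(n - 1)]
# Meshed variant: skip links give every node a second disjoint path.
mesh = chain + [(i, i + 2) for i in range(n - 2)]

failed = (2, 3)  # one backhaul link goes down
chain_up = reachable(n, [l for l in chain if l != failed])
mesh_up = reachable(n, [l for l in mesh if l != failed])
# The chain is partitioned; the mesh keeps all nodes connected to the POP.
```

The denser the meshing, the more single-link failures can be absorbed, at the cost of deploying more nodes/links, which is the trade-off the paper quantifies.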
Resilience in Information Stewardship
Information security is concerned with protecting the confidentiality, integrity, and availability of information systems. System managers deploy their resources with the aim of maintaining target levels of these attributes in the presence of reactive threats. Information stewardship is the challenge of maintaining the sustainability and resilience of the security attributes of (complex, interconnected, multi-agent) information ecosystems. In this paper, we present, in the tradition of public economics, a model of stewardship which addresses directly the question of resilience. We model attacker-target-steward behaviour in a fully endogenous Nash equilibrium setting. We analyse the occurrence of externalities across targets and assess the steward's ability to internalise these externalities under varying informational assumptions. We apply and simulate this model in the case of a critical national infrastructure example.
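The equilibrium concept the abstract refers to can be sketched generically (illustrative notation, not taken from the paper): each target i chooses defensive effort d_i, the attacker chooses attack effort a, and at a fully endogenous Nash equilibrium every choice is a best response to all the others:

```latex
% Illustrative best-response conditions for an attacker-target equilibrium
% (U_i: target i's payoff, V: attacker's payoff; notation is assumed).
d_i^{*} \in \arg\max_{d_i} \; U_i\bigl(d_i,\, d_{-i}^{*},\, a^{*}\bigr)
\quad \text{for each target } i, \qquad
a^{*} \in \arg\max_{a} \; V\bigl(a,\, d_1^{*}, \dots, d_n^{*}\bigr)
```

Externalities arise when d_j enters U_i for j ≠ i, and the steward's problem is to shift the targets' incentives so that these cross-effects are internalised.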
Contagion in cybersecurity attacks
Systems security is essential for the efficient operation of all organizations. Indeed, most large firms employ a designated 'Chief Information Security Officer' to coordinate the operational aspects of the organization's information security. Part of this role is in planning investment responses to information security threats against the firm's corporate network infrastructure. To this end, we develop and estimate a vector equation system of threats to 10 important IP services, using industry-standard SANS data on threats to various components of a firm's information system over the period January 2003 – February 2011. Our results reveal strong evidence of contagion between such attacks, with attacks on ssh and Secure Web Server ports indicating increased attack activity on other ports. Security managers who ignore such contagious inter-relationships may underestimate the underlying risk to the security attributes of their systems, such as sensitivity and criticality, and thus delay appropriate information security investments.
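The kind of vector equation system used here can be illustrated with a small sketch (assumed design, not the paper's estimator): a two-variable VAR(1) fitted by per-equation OLS, where a significant cross-lag coefficient is the signature of contagion from one attack series to another.

```python
# Minimal VAR(1) sketch: y_t = c + A y_{t-1} + e_t, fitted by per-equation
# OLS, to detect contagion between two simulated attack-count series.
import random

def solve(A, b):
    """Solve a small linear system A x = b by Gauss-Jordan elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * c for x, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_var1(series):
    """OLS fit of y_t = c + A y_{t-1} + e_t; returns per-equation rows
    [c_i, a_i0, a_i1, ...] where a_ij is the effect of series j's lag on i."""
    k, T = len(series), len(series[0])
    p = k + 1
    X = [[1.0] + [series[j][t - 1] for j in range(k)] for t in range(1, T)]
    coeffs = []
    for i in range(k):
        y = series[i][1:]
        XtX = [[sum(r[a] * r[b] for r in X) for b in range(p)] for a in range(p)]
        Xty = [sum(r[a] * yi for r, yi in zip(X, y)) for a in range(p)]
        coeffs.append(solve(XtX, Xty))
    return coeffs

# Simulate two attack series where series 0 "infects" series 1 with a
# one-period lag, but not vice versa.
rng = random.Random(0)
y0, y1 = [0.0], [0.0]
for _ in range(2000):
    prev0, prev1 = y0[-1], y1[-1]
    y0.append(0.5 * prev0 + rng.gauss(0, 1))
    y1.append(0.6 * prev1 + 0.4 * prev0 + rng.gauss(0, 1))

coef = fit_var1([y0, y1])
# coef[1][1] recovers the contagion coefficient (close to 0.4), while
# coef[0][2], the feedback from series 1 to series 0, stays near zero.
```

Ignoring the cross-lag term (coef[1][1]) and modelling each series alone is exactly the underestimation of risk the abstract warns security managers against.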
Feature Selection using Tabu Search with Learning Memory: Learning Tabu Search
Feature selection in classification can be modeled as a combinatorial optimization problem. One of the main particularities of this problem is the large amount of time that may be needed to evaluate the quality of a subset of features. In this paper, we propose to solve this problem with a tabu search algorithm integrating a learning mechanism. To do so, we adapt to the feature selection problem a learning tabu search algorithm originally designed for a railway network problem in which the evaluation of a solution is time-consuming. Experiments are conducted and show the benefit of using a learning mechanism to solve hard instances from the literature.
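A minimal sketch of the idea (an assumed design, not the authors' algorithm): tabu search over feature bitmasks, where a learning memory tracks the average quality of solutions reached after flipping each feature, so that when evaluation is expensive only the most promising flips are evaluated.

```python
# Tabu search for feature selection with a simple learning memory that
# ranks candidate bit-flips; the evaluation function stands in for a
# costly classifier assessment.
import random

def tabu_feature_search(n_features, evaluate, iters=100, tenure=7,
                        pool_size=5, seed=None):
    rng = random.Random(seed)
    current = [rng.random() < 0.5 for _ in range(n_features)]
    best, best_score = current[:], evaluate(current)
    tabu = {}                      # feature -> iteration until which flips are forbidden
    memory = [0.0] * n_features    # learned average quality after flipping feature j
    counts = [0] * n_features
    for it in range(iters):
        # Learning memory: try the historically most promising flips first.
        order = sorted(range(n_features), key=lambda j: -memory[j])
        pool = [j for j in order if tabu.get(j, -1) <= it][:pool_size]
        move, move_score = None, float("-inf")
        for j in pool:             # evaluate only the shortlisted flips
            trial = current[:]
            trial[j] = not trial[j]
            s = evaluate(trial)
            if s > move_score:
                move, move_score = j, s
        if move is None:
            continue
        current[move] = not current[move]   # accept best move, even if worsening
        tabu[move] = it + tenure
        counts[move] += 1
        memory[move] += (move_score - memory[move]) / counts[move]
        if move_score > best_score:
            best, best_score = current[:], move_score
    return best, best_score

# Toy evaluation: reward the (hypothetical) informative features {0, 2, 4},
# penalise irrelevant selected ones.
GOOD = {0, 2, 4}
def toy_eval(mask):
    return (sum(1.0 for j in GOOD if mask[j])
            - sum(0.5 for j, on in enumerate(mask) if on and j not in GOOD))

best, score = tabu_feature_search(8, toy_eval, pool_size=8, seed=1)
```

The memory-guided shortlist (`pool_size`) is the point of the learning mechanism: it spends the expensive evaluations only on flips that have paid off before.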
Commercial chicken breeds exhibit highly divergent patterns of linkage disequilibrium
The analysis of linkage disequilibrium (LD) underpins the development of effective genotyping technologies, trait mapping and understanding of biological mechanisms such as those driving recombination and the impact of selection. We apply the Malécot-Morton model of LD to create additive LD maps that describe the high-resolution LD landscape of commercial chickens. We investigated LD in chickens (Gallus gallus) at the highest resolution to date for broiler, white egg and brown egg layer commercial lines. There is minimal concordance between breeds of fine-scale LD patterns (correlation coefficient <0.21), and even between discrete broiler lines. Regions of LD breakdown, which may align with recombination hot spots, are enriched near CpG islands and transcription start sites (P < 2.2 × 10⁻¹⁶), consistent with recent evidence described in finches, but concordance in hot spot locations between commercial breeds is only marginally greater than random. As in other birds, functional elements in the chicken genome are associated with recombination but, unlike evidence from other bird species, the LD landscape is not stable in the populations studied. The development of optimal genotyping panels for genome-led selection programmes will depend on careful analysis of the LD structure of each line of interest. Further study is required to fully elucidate the mechanisms underlying highly divergent LD patterns found in commercial chickens.
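The Malécot-Morton approach models the decline of allelic association with physical distance; a standard form of the Malécot equation used to build LD maps is (symbols as commonly defined in the LD-map literature, not quoted from this paper):

```latex
% Malécot model for pairwise association \rho at physical distance d:
%   M: association at zero distance, L: residual (asymptotic) association,
%   \varepsilon: exponential decline rate. Summing \varepsilon_i d_i over
%   adjacent intervals gives additive map lengths in LD units (LDU).
\hat{\rho}(d) = (1 - L)\, M e^{-\varepsilon d} + L
```

Because the fitted ε varies along the genome, the resulting LDU maps are additive, which is what lets fine-scale LD landscapes of different breeds be compared region by region.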
Third Report on Chicken Genes and Chromosomes 2015
Following on from the First Report on Chicken Genes and Chromosomes [Schmid et al., 2000] and the Second Report in 2005 [Schmid et al., 2005], we are pleased to publish this long-awaited Third Report on the latest developments in chicken genomics. The First Report highlighted the availability of genetic and physical maps, while the Second Report was published as the chicken genome sequence was released. This report comes at a time of huge technological advances (particularly in sequencing methodologies) which have allowed us to examine the chicken genome in detail not possible until now. This has also heralded an explosion in avian genomics, with the current availability of more than 48 bird genomes [Zhang G et al., 2014b; Eöry et al., 2015], with many more planned.
Towards a consolidation of worldwide journal rankings - A classification using random forests and aggregate rating via data envelopment analysis
The question of how to assess research outputs published in journals is now a global concern for academics. Numerous journal ratings and rankings exist, some featuring perceptual and peer-review-based journal ranks, some focusing on objective information related to citations, some using a combination of the two. This research consolidates existing journal rankings into an up-to-date and comprehensive list. Existing approaches to determining journal rankings are significantly advanced with the application of a new classification approach, ‘random forests’, and data envelopment analysis. As a result, a fresh look at a publication's place in the global research community is offered. While our approach is applicable to all management and business journals, we specifically exemplify the relative position of ‘operations research, management science, production and operations management’ journals within the broader management field, as well as within their own subject domain.