2,229 research outputs found
H.L.A. Hart's secondary rules: what do ‘officials’ really think?
The impact of H.L.A. Hart's The Concept of Law on modern legal thinking is undisputed. But does it reflect the reality of the way British institutions work? In Concept, Hart argued, amongst other things, that one of two ‘minimum conditions necessary and sufficient for the existence of a legal system’ was that ‘its rules of recognition specifying the criteria of legal validity and its rules of change and adjudication must be effectively accepted as common public standards of official behaviour by its officials’. In this paper, we begin the process of testing that statement empirically. Specifically, we ask whether non-judicial UK officials have a uniform view of what the rules of recognition, change and adjudication are, and whether they uniformly take an internal point of view towards them (i.e. whether they accept the rules and do not merely obey them). By way of a pilot study, thirty non-judicial UK officials were interviewed. Those officials comprised currently serving and retired senior civil servants, senior military officials, chief constables and local authority chief executives. The findings of the pilot study are presented in this paper. They allow us to deduce that Hart's statement might well be an inaccurate and incomplete description of the modern British constitution, and to comment on the implications of that conclusion.
Rigorous engineering for hardware security: Formal modelling and proof in the CHERI design and implementation process
The root causes of many security vulnerabilities include a pernicious combination of two problems, often regarded as inescapable aspects of computing. First, the protection mechanisms provided by the mainstream processor architecture and C/C++ language abstractions, dating back to the 1970s and before, provide only coarse-grain virtual-memory-based protection. Second, mainstream system engineering relies almost exclusively on test-and-debug methods, with (at best) prose specifications. These methods have historically sufficed commercially for much of the computer industry, but they fail to prevent large numbers of exploitable bugs, and the security problems that this causes are becoming ever more acute.
In this paper we show how more rigorous engineering methods can be applied to the development of a new security-enhanced processor architecture, with its accompanying hardware implementation and software stack. We use formal models of the complete instruction-set architecture (ISA) at the heart of the design and engineering process, both in lightweight ways that support and improve normal engineering practice -- as documentation, in emulators used as a test oracle for hardware and for running software, and for test generation -- and for formal verification. We formalise key intended security properties of the design, and establish that these hold with mechanised proof. This is for the same complete ISA models (complete enough to boot operating systems), without idealisation.
We do this for CHERI, an architecture with hardware capabilities that supports fine-grained memory protection and scalable secure compartmentalisation, while offering a smooth adoption path for existing software. CHERI is a maturing research architecture, developed since 2010, with work now underway on an Arm industrial prototype to explore its possible adoption in mass-market commercial processors. The rigorous engineering work described here has been an integral part of its development to date, enabling more rapid and confident experimentation, and boosting confidence in the design.

This work was supported by EPSRC programme grant EP/K008528/1 (REMS: Rigorous Engineering for Mainstream Systems) and by a Gates studentship (Nienhuis). This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement 789108, ELVER). This work was also supported by the Defense Advanced Research Projects Agency (DARPA) and the Air Force Research Laboratory (AFRL), under contracts FA8750-10-C-0237 (CTSRD), HR0011-18-C-0016 (ECATS), and FA8650-18-C-7809 (CIFV).
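The capability model at the heart of CHERI replaces raw pointers with unforgeable references that carry bounds and permissions, checked on every access. As a rough intuition for the kind of invariant the formalised security properties constrain, here is a minimal Python sketch of such a check; the Capability fields and the check_load function are illustrative simplifications, not the actual CHERI ISA semantics or encoding.

```python
from dataclasses import dataclass

# Illustrative model of a CHERI-style capability: a pointer carrying
# bounds and permissions, with a validity tag. Field names are
# hypothetical simplifications, not the real CHERI format.
@dataclass
class Capability:
    tag: bool     # hardware-maintained validity bit
    base: int     # lower bound of the authorized region
    length: int   # size of the authorized region
    perms: set    # e.g. {"load", "store"}
    cursor: int   # address the capability currently points at

def check_load(cap: Capability, nbytes: int) -> None:
    """Raise on any violation, mimicking a hardware exception."""
    if not cap.tag:
        raise PermissionError("tag cleared: capability is not valid")
    if "load" not in cap.perms:
        raise PermissionError("capability lacks load permission")
    if cap.cursor < cap.base or cap.cursor + nbytes > cap.base + cap.length:
        raise PermissionError("access outside capability bounds")

# Usage: a 16-byte load-only capability over [0x1000, 0x1010)
cap = Capability(tag=True, base=0x1000, length=0x10,
                 perms={"load"}, cursor=0x1008)
check_load(cap, 8)           # within bounds: fine
try:
    check_load(cap, 16)      # crosses the upper bound: trapped
except PermissionError as e:
    print(e)
```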
Formation of regulatory modules by local sequence duplication
Turnover of regulatory sequence and function is an important part of
molecular evolution. But what are the modes of sequence evolution leading to
rapid formation and loss of regulatory sites? Here, we show that a large
fraction of neighboring transcription factor binding sites in the fly genome
have formed from a common sequence origin by local duplications. This mode of
evolution is found to produce regulatory information: duplications can seed new
sites in the neighborhood of existing sites. Duplicate seeds evolve
subsequently by point mutations, often towards binding a different factor than
their ancestral neighbor sites. These results are based on a statistical
analysis of 346 cis-regulatory modules in the Drosophila melanogaster genome,
and a comparison set of intergenic regulatory sequence in Saccharomyces
cerevisiae. In fly regulatory modules, pairs of binding sites show
significantly enhanced sequence similarity up to distances of about 50 bp. We
analyze these data in terms of an evolutionary model with two distinct modes of
site formation: (i) evolution from independent sequence origin and (ii)
divergent evolution following duplication of a common ancestor sequence. Our
results suggest that pervasive formation of binding sites by local sequence
duplications distinguishes the complex regulatory architecture of higher
eukaryotes from the simpler architecture of unicellular organisms.
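The duplication signal described here is, at heart, a distance-resolved similarity statistic over pairs of binding sites within a module. A toy version of that measurement might look like the sketch below; the site data are hypothetical and the per-base identity stands in for the paper's full two-mode evolutionary model of independent origin versus divergence after duplication.

```python
from itertools import combinations

def identity(a: str, b: str) -> float:
    """Fraction of matching bases between two sites (truncated to equal length)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

def pair_similarity_by_distance(sites, max_dist=200, bin_width=25):
    """sites: list of (start_position, sequence) within one regulatory module.
    Returns {distance_bin: mean pairwise identity} -- the kind of signal that
    rises at short distances if local duplication seeded neighboring sites."""
    bins = {}
    for (p1, s1), (p2, s2) in combinations(sites, 2):
        d = abs(p2 - p1)
        if d > max_dist:
            continue
        b = (d // bin_width) * bin_width
        bins.setdefault(b, []).append(identity(s1, s2))
    return {b: sum(v) / len(v) for b, v in sorted(bins.items())}

# Hypothetical module: the first two sites are near-duplicates 30 bp apart.
module = [(100, "TGATTAGC"), (130, "TGATTACC"), (420, "CCGGAAGT")]
print(pair_similarity_by_distance(module))
```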
Perception of epidemic-related anxiety in the general French population: a cross-sectional study in the Rhône-Alpes region
Background: To efficiently plan appropriate public health interventions during possible epidemics, governments must take into consideration the following factors about the general population: their knowledge of epidemics, their fears of and psychological responses to them, their level of compliance with government measures and their communities' trusted sources of information. However, such surveys among the French general population are rare.

Methods: A cross-sectional study was conducted in 2006 in a representative sample of 600 subjects living in the Rhône-Alpes region (south-east France) to investigate self-reported knowledge about infectious diseases and anxiety generated by epidemic risk, with particular reference to avian influenza. Data on reactions to potentially new epidemics and the confidence level in various sources of information were also collected.

Results: Respondents were most knowledgeable about AIDS, followed by avian influenza. Overall, 75% of respondents had adequate knowledge of avian influenza. The percentage was even higher (88%) among inhabitants of the Ain district, where an avian influenza epidemic had previously been reported. However, 39% expressed anxiety about this disease. In total, 20% of respondents with knowledge about avian influenza stated that they had changed their behaviours during the epizooty. Epidemics were perceived as a real threat by 27% of respondents. In the event of a highly contagious outbreak, the majority of respondents said they would follow the advice given by authorities. The study population expressed a high level of confidence in physicians and scientists, but had strong reservations about politicians, deputies and the media.

Conclusions: Although the survey was conducted only four months after the avian influenza outbreak, epidemics were not perceived as a major threat by the study population. The results showed that in the event of a highly infectious disease, the population would comply with advice given by public authorities.
Cluster Lenses
Clusters of galaxies are the most recently assembled, massive, bound
structures in the Universe. As predicted by General Relativity, given their
masses, clusters strongly deform space-time in their vicinity. Clusters act as
some of the most powerful gravitational lenses in the Universe. Light rays
from distant sources traversing clusters are deflected, and the resulting
images of those objects therefore appear distorted and
magnified. Lensing by clusters occurs in two regimes, each with unique
observational signatures. The strong lensing regime is characterized by effects
readily seen by eye, namely, the production of giant arcs, multiple images, and
arclets. The weak lensing regime is characterized by small deformations in the
shapes of background galaxies only detectable statistically. Cluster lenses
have been exploited successfully to address several important current questions
in cosmology: (i) the study of the lens(es) - understanding cluster mass
distributions and issues pertaining to cluster formation and evolution, as well
as constraining the nature of dark matter; (ii) the study of the lensed objects
- probing the properties of the background lensed galaxy population - which is
statistically at higher redshifts and of lower intrinsic luminosity thus
enabling the probing of galaxy formation at the earliest times right up to the
Dark Ages; and (iii) the study of the geometry of the Universe - as the
strength of lensing depends on the ratios of angular diameter distances between
the lens, source and observer, lens deflections are sensitive to the value of
cosmological parameters and offer a powerful geometric tool to probe Dark
Energy. In this review, we present the basics of cluster lensing and provide a
current status report of the field.

Comment: About 120 pages. Published in Open Access at http://www.springerlink.com/content/j183018170485723/
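The geometric sensitivity in point (iii) is carried by the ratio of angular diameter distances between lens, source and observer, which sets the lensing efficiency for a given source redshift. A minimal sketch of computing that ratio with astropy follows; the fiducial flat ΛCDM parameters and the lensing_efficiency helper are illustrative choices, not values from the review.

```python
# Geometric lensing efficiency D_ls / D_s for a cluster lens: the factor
# through which the deflection of a background source depends on the
# cosmology-dependent angular diameter distances named in the abstract.
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70, Om0=0.3)  # assumed fiducial cosmology

def lensing_efficiency(z_lens: float, z_src: float) -> float:
    d_ls = cosmo.angular_diameter_distance_z1z2(z_lens, z_src)
    d_s = cosmo.angular_diameter_distance(z_src)
    return float(d_ls / d_s)  # dimensionless; larger means stronger lensing

# A cluster at z = 0.3 lensing sources at increasing redshift:
for z_src in (0.6, 1.0, 2.0, 4.0):
    print(z_src, round(lensing_efficiency(0.3, z_src), 3))
```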
Gravitational Waves from Gravitational Collapse
Gravitational wave emission from the gravitational collapse of massive stars
has been studied for more than three decades. Current state-of-the-art
numerical investigations of collapse include those that use progenitors with
realistic angular momentum profiles, properly treat microphysics issues,
account for general relativity, and examine non-axisymmetric effects in three
dimensions. Such simulations predict that gravitational waves from various
phenomena associated with gravitational collapse could be detectable with
advanced ground--based and future space--based interferometric observatories.Comment: 68 pages including 13 figures; revised version accepted for
publication in Living Reviews in Relativity (http://www.livingreviews.org
Measurement of the top quark mass using the matrix element technique in dilepton final states
We present a measurement of the top quark mass in $p\bar{p}$ collisions at a center-of-mass energy of 1.96 TeV at the Fermilab Tevatron collider. The data were collected by the D0 experiment and correspond to an integrated luminosity of 9.7 fb$^{-1}$. The matrix element technique is applied to $t\bar{t}$ events in the final state containing leptons (electrons or muons) with high transverse momenta and at least two jets. The calibration of the jet energy scale determined in the lepton+jets final state of $t\bar{t}$ decays is applied to jet energies. This correction provides a substantial reduction in systematic uncertainties. We obtain a top quark mass of $m_t = 173.93 \pm 1.84$ GeV.
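In outline, the matrix element technique assigns each event a likelihood as a function of the hypothesized top quark mass and maximizes the product over events. The toy sketch below shows only that outer scan: the Gaussian per-event likelihood stands in for the leading-order matrix element integrated over unmeasured parton momenta, and every number is invented for illustration, not taken from the D0 analysis.

```python
import math

def scan_top_mass(event_likelihood, events, masses):
    """For each hypothesized mass, sum per-event log-likelihoods and
    keep the maximum -- a caricature of the matrix element fit."""
    best_m, best_logl = None, -math.inf
    for m in masses:
        logl = sum(math.log(event_likelihood(ev, m)) for ev in events)
        if logl > best_logl:
            best_m, best_logl = m, logl
    return best_m

# Toy stand-in: each "event" is a smeared mass estimate, and the
# per-event likelihood is a Gaussian resolution model.
def toy_likelihood(ev, m, sigma=15.0):
    return math.exp(-0.5 * ((ev - m) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

events = [168.2, 175.9, 171.4, 180.3, 169.8]          # invented, in GeV
masses = [m / 10 for m in range(1600, 1851)]          # 160.0 .. 185.0 GeV grid
print(scan_top_mass(toy_likelihood, events, masses))  # peaks near the sample mean
```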
Zinc Finger Recombinases with Adaptable DNA Sequence Specificity
Site-specific recombinases have become essential tools in genetics and molecular biology for the precise excision or integration of DNA sequences. However, their utility is currently limited to circumstances where the sites recognized by the recombinase enzyme have been introduced into the DNA being manipulated, or natural ‘pseudosites’ are already present. Many new applications would become feasible if recombinase activity could be targeted to chosen sequences in natural genomic DNA. Here we demonstrate efficient site-specific recombination at several sequences taken from a 1.9 kilobasepair locus of biotechnological interest (in the bovine β-casein gene), mediated by zinc finger recombinases (ZFRs), chimaeric enzymes with linked zinc finger (DNA recognition) and recombinase (catalytic) domains. In the "Z-sites" tested here, 22 bp casein gene sequences are flanked by 9 bp motifs recognized by zinc finger domains. Asymmetric Z-sites were recombined by the concomitant action of two ZFRs with different zinc finger DNA-binding specificities, and could be recombined with a heterologous site in the presence of a third recombinase. Our results show that engineered ZFRs may be designed to promote site-specific recombination at many natural DNA sequences.
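The Z-site architecture described here (a 22 bp core flanked by two 9 bp zinc-finger motifs) lends itself to a simple pattern scan when hunting for candidate target sites in a genomic sequence. A sketch follows; the motif sequences are made up for illustration, standing in for whatever an engineered zinc finger array actually recognizes.

```python
import re

# Hypothetical 9 bp zinc-finger recognition motifs (illustrative only);
# the right flank is modeled as the reverse complement of the left.
LEFT_MOTIF  = "GAGGAGGCT"
RIGHT_MOTIF = "AGCCTCCTC"

def find_z_sites(seq: str):
    """Return (start, site) for every left-motif + 22 bp spacer + right-motif hit."""
    pattern = re.compile(LEFT_MOTIF + r"[ACGT]{22}" + RIGHT_MOTIF)
    return [(m.start(), m.group()) for m in pattern.finditer(seq.upper())]

# Usage on a toy sequence containing exactly one candidate Z-site:
demo = "TT" + LEFT_MOTIF + "A" * 22 + RIGHT_MOTIF + "GGC"
print(find_z_sites(demo))   # one 40 bp candidate at position 2
```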
Conformational Dynamics of Single pre-mRNA Molecules During In Vitro Splicing
The spliceosome is a complex small nuclear RNA (snRNA)-protein machine that removes introns from pre-mRNAs via two successive phosphoryl transfer reactions. The chemical steps are isoenergetic, yet splicing requires at least eight RNA-dependent ATPases responsible for substantial conformational rearrangements. To comprehensively monitor pre-mRNA conformational dynamics, we developed a strategy for single-molecule FRET (smFRET) that uses a small, efficiently spliced yeast pre-mRNA, Ubc4, in which donor and acceptor fluorophores are placed in the exons adjacent to the 5′ and 3′ splice sites. During splicing in vitro, we observed a multitude of generally reversible time- and ATP-dependent conformational transitions of individual pre-mRNAs. The conformational dynamics of branchpoint and 3′-splice site mutants differ from one another and from wild type. Because all transitions are reversible, spliceosome assembly appears to be occurring close to thermal equilibrium.
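With fluorophores flanking the splice sites, the raw readout of such an experiment is an apparent FRET efficiency per frame, which rises as the two labeled exons come into proximity. A minimal sketch of that first processing step follows; the fret_efficiency and classify helpers, the threshold, and the example counts are all illustrative, not taken from the paper.

```python
# Apparent FRET efficiency from background-corrected donor and acceptor
# intensities: the standard first step in reading conformational states
# out of an smFRET trace.
def fret_efficiency(i_donor: float, i_acceptor: float) -> float:
    return i_acceptor / (i_donor + i_acceptor)

def classify(e: float, threshold: float = 0.5) -> str:
    """Crude two-state call: high FRET = splice sites held in proximity."""
    return "high-FRET (compact)" if e >= threshold else "low-FRET (extended)"

# A toy time series of (donor, acceptor) photon counts per frame:
trace = [(820, 180), (790, 210), (310, 690), (280, 720), (805, 195)]
for i_d, i_a in trace:
    e = fret_efficiency(i_d, i_a)
    print(f"E = {e:.2f}  ->  {classify(e)}")
```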