Stochastic Weighted Graphs: Flexible Model Specification and Simulation
In most domains of network analysis researchers consider networks that arise
in nature with weighted edges. Such networks are routinely dichotomized in the
interest of using available methods for statistical inference with networks.
The generalized exponential random graph model (GERGM) is a recently proposed
method used to simulate and model the edges of a weighted graph. The GERGM
specifies a joint distribution for an exponential family of graphs with
continuous-valued edge weights. However, current estimation algorithms for the
GERGM only allow inference on a restricted family of model specifications. To
address this issue, we develop a Metropolis--Hastings method that can be used
to estimate any GERGM specification, thereby significantly extending the family
of weighted graphs that can be modeled with the GERGM. We show that new
flexible model specifications are capable of avoiding likelihood degeneracy and
efficiently capturing network structure in applications where such models were
not previously available. We demonstrate the utility of this new class of
GERGMs through application to two real network data sets, and we further assess
the effectiveness of our proposed methodology by simulating non-degenerate
model specifications from the well-studied two-stars model. A working R version
of the GERGM code is available in the supplement and will be incorporated in
the gergm CRAN package. Comment: 33 pages, 6 figures; to appear in Social Networks.
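The Metropolis–Hastings estimation the abstract describes can be illustrated with a minimal sketch. This is a hypothetical toy model, not the paper's GERGM: the log-density below uses a single sum-of-weights statistic with a Gaussian base measure, whereas the actual GERGM works with richer network statistics and transformed, bounded edge weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_density(w, theta):
    """Unnormalised log-density of continuous edge weights under a toy
    exponential-family model (hypothetical; the GERGM uses network
    statistics such as two-stars rather than a plain sum)."""
    return theta * w.sum() - 0.5 * np.sum(w ** 2)  # Gaussian base measure

def metropolis_hastings(n_edges, theta, n_iter=5000, step=0.5):
    """Random-walk Metropolis--Hastings over the vector of edge weights."""
    w = np.zeros(n_edges)
    samples = []
    for _ in range(n_iter):
        proposal = w + step * rng.standard_normal(n_edges)
        log_ratio = log_density(proposal, theta) - log_density(w, theta)
        if np.log(rng.uniform()) < log_ratio:  # accept with prob min(1, ratio)
            w = proposal
        samples.append(w.copy())
    return np.array(samples)

samples = metropolis_hastings(n_edges=10, theta=1.0)
# Under this toy density each weight is stationary N(theta, 1),
# so the post-burn-in sample mean should be close to theta = 1.
print(samples[1000:].mean())
```

Because the acceptance ratio only involves the unnormalised density, the sampler never needs the intractable normalising constant — the property that lets this style of algorithm cover arbitrary GERGM specifications.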
The postmodern auteur: a contradiction in terms?
This thesis proposes a new approach to film authorship that is compatible with the postmodern theory of Linda Hutcheon. By taking up, building on, and combining the work of Peter Wollen, Michel Foucault, and Will Brooker I develop a theory of film authorship that moves away from conceptualisations of the author in terms of self-expression and instead conceives of the author as a text. Additionally, I identify four different genres of author-function: the Romantic, modernist, feminist, and commercial genres. These four genres of author-function provide a framework and critical vocabulary for the accurate description of the ways in which author-texts are constructed. The characteristics of these four genres of author-function are derived from the major trends in theories of film authorship identified in the review of literature. In addition to these genres of author-function, I also develop my own postmodern genre of author-function. The characteristics of this postmodern genre of author-function are derived from the analysis of existing literature on two key directors of postmodern film, David Lynch and Quentin Tarantino. In particular, the postmodern genre of author-function adapts and expands upon Peter Brooker’s and Will Brooker’s affirmative reading of the role played by generic reworking in Tarantino’s Pulp Fiction (1994). The characteristics of the postmodern genre of author-function are further refined through its application as a critical framework in two case studies focusing on Tony Scott and Sally Potter.
Scott and Potter serve as contrasting case studies. In addition to operating in the very different contexts of Hollywood action cinema and art cinema respectively, Scott and Potter occupy very different positions in regards to authorship. The Scott author-text is largely constructed in terms of failed authorship. In contrast, the Potter author-text is apparently more secure in its authorial status. There are, however, a number of overlaps between the Scott and Potter case studies. Firstly, films across both the Scott and Potter oeuvres exhibit stylistic features associated with postmodern film. Despite this, Scott and Potter are not included within the central canon of postmodern cinema, and occupy a more marginal position. The Scott and Potter oeuvres are also characterised as fragmented and fractured rather than in terms of unity. This further limits the possibility of constructing Scott as an auteur and suggests that the Potter author-text is more precarious than at first appears.
The thesis opens with a review of literature tracing the developments of theories of film authorship. The first chapter begins by examining the place of authorship in postmodernism as conceptualised by two key theorists of postmodernism, Fredric Jameson and Linda Hutcheon. This is followed by the development of the new approach to authorship outlined above, and its demonstration through the meta-critical analysis of existing literature on Lynch and Tarantino. This analysis also facilitates the development of the postmodern genre of author-function and provides the initial characteristics of that genre. The postmodern genre of author-function is further refined and tested through the case studies. Each of these case studies follows a similar format, beginning by situating Scott and Potter in their respective contexts. The second stage of the case studies involves determining the genres of author-function in play in the construction of the Scott and Potter author-texts. The final stage of the case study focuses on the analysis of three films by each director from the perspective of the postmodern genre of author-function in order to determine what readings are yielded by this approach, and how they compare to existing approaches.
The development of a postmodern genre of author-function facilitates a revaluation of postmodern cinema. The Scott case study demonstrates one aspect of this reappraisal, the revaluation of texts previously classified as meaningless spectacle in terms of a re-inventive impulse and a critical reworking of genre conventions. The Potter case study demonstrates both the political and critical potential of such a de-constructive engagement with genre, while also showcasing the ways in which adopting the postmodern genre of author-function as a critical perspective allows for texts to be reorganised around a new centre, and for new patterns of meaning and significance to be traced across the oeuvre.
Quantifying the top-down effects of grazers on a rocky shore: selective grazing and the potential for competition
The effect of grazers on the diversity, distribution, and composition of their principal food source has rarely been described for the high intertidal zone of rocky shores, a model system for studying the potential effects of climate change. Along rocky, wave-swept shores in central California, the microphytobenthos (MPB) supports diverse assemblages of limpets and littorine snails, which, at current benign temperatures, could potentially partition food resources in a complementary fashion, thereby enhancing secondary productivity. Two limpet species in particular, Lottia scabra and L. austrodigitalis, may partition components of the MPB, and are likely to affect the composition of the MPB on which they graze. In this study, we describe the composition, nutritional value (C:N ratio), and fluorescence (an index of chlorophyll density) of ungrazed, L. scabra-grazed and L. austrodigitalis-grazed MPB, each as a function of temperature. Fluorescence decreased with increased average daily maximum temperature for ungrazed MPB, but temperature had no discernible effects on either fluorescence or the composition of the MPB of grazed assemblages. L. austrodigitalis and L. scabra did not partition the MPB, and did not exhibit complementarity. Both species exhibited an ordered grazing scheme, in which limpets grazed down certain components of the MPB before others, and grazing increased the C:N ratio of the MPB, decreasing its nutritional value. Taken together, these results suggest that L. austrodigitalis and L. scabra may experience increased competition as warming temperatures reduce the available MPB.
Skills and Knowledge for Data-Intensive Environmental Research.
The scale and magnitude of complex and pressing environmental issues lend urgency to the need for integrative and reproducible analysis and synthesis, facilitated by data-intensive research approaches. However, the recent pace of technological change has been such that appropriate skills to accomplish data-intensive research are lacking among environmental scientists, who more than ever need greater access to training and mentorship in computational skills. Here, we provide a roadmap for raising data competencies of current and next-generation environmental researchers by describing the concepts and skills needed for effectively engaging with the heterogeneous, distributed, and rapidly growing volumes of available data. We articulate five key skills: (1) data management and processing, (2) analysis, (3) software skills for science, (4) visualization, and (5) communication methods for collaboration and dissemination. We provide an overview of the current suite of training initiatives available to environmental scientists and models for closing the skill-transfer gap.
Statistical Modeling of the Default Mode Brain Network Reveals a Segregated Highway Structure
We investigate the functional organization of the Default Mode Network (DMN) - an important subnetwork within the brain associated with a wide range of higher-order cognitive functions. While past work has shown the whole-brain network of functional connectivity follows small-world organizational principles, subnetwork structure is less well understood. Current statistical tools, however, are not suited to quantifying the operating characteristics of functional networks as they often require threshold censoring of information and do not allow for inferential testing of the role that local processes play in determining network structure. Here, we develop the correlation Generalized Exponential Random Graph Model (cGERGM) - a statistical network model that uses local processes to capture the emergent structural properties of correlation networks without loss of information. Examining the DMN with the cGERGM, we show that, rather than demonstrating small-world properties, the DMN appears to be organized according to principles of a segregated highway - suggesting it is optimized for function-specific coordination between brain regions as opposed to information integration across the DMN. We further validate our findings through assessing the power and accuracy of the cGERGM on a testbed of simulated networks representing various commonly observed brain architectures.
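The threshold censoring that the abstract criticises can be sketched briefly. The sketch below is purely illustrative (simulated signals and an arbitrary cutoff, not the paper's data or method): it builds a full correlation network from toy time series and then shows the conventional binarisation step that discards the edge-weight information the cGERGM is designed to retain.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy resting-state-like signals for 5 hypothetical brain regions.
ts = rng.standard_normal((200, 5))
ts[:, 1] += 1.0 * ts[:, 0]  # induce one strong functional coupling

# Full weighted correlation network: every pairwise correlation kept.
corr = np.corrcoef(ts, rowvar=False)

# Conventional approach: censor at an arbitrary threshold, producing a
# binary graph and losing all sub-threshold and magnitude information.
binary = (np.abs(corr) > 0.5).astype(int)
np.fill_diagonal(binary, 0)

print(binary.sum() // 2)  # number of surviving (supra-threshold) edges
```

Only the deliberately coupled pair survives the cutoff here; every other correlation, however close to the threshold, is flattened to zero, which is exactly the information loss a model over the continuous `corr` matrix avoids.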
High-fidelity phase and amplitude control of phase-only computer generated holograms using conjugate gradient minimisation
Funding: Leverhulme Trust (RPG-2013-074); EPSRC (EP/G03673X/1; EP/L015110/1). We demonstrate simultaneous control of both the phase and amplitude of light using a conjugate gradient minimisation-based hologram calculation technique and a single phase-only spatial light modulator (SLM). A cost function, which incorporates the inner product of the light field with a chosen target field within a defined measure region, is efficiently minimised to create high-fidelity patterns in the Fourier plane of the SLM. A fidelity of F = 0.999997 is achieved for a pattern resembling an LG01 mode, with a calculated light-usage efficiency of 41.5%. Possible applications of our method in optical trapping and ultracold atoms are presented, and we show uncorrected experimental realisation of our patterns with F = 0.97 and 7.8% light efficiency.
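The fidelity F quoted above is an inner-product overlap between the realised and target complex fields. A minimal sketch of such a metric follows, under simplifying assumptions: the overlap is taken over the full field rather than the paper's restricted measure region, and the LG01-like target is a rough stand-in, not the authors' mode definition.

```python
import numpy as np

def fidelity(field, target):
    """Normalised overlap F = |<target|field>|^2 between two complex
    fields; F = 1 iff the fields match up to a global phase and scale.
    (Illustrative definition; the paper evaluates the inner product
    only within a defined measure region.)"""
    inner = np.vdot(target, field)  # conjugate-linear in `target`
    return abs(inner) ** 2 / (
        np.vdot(target, target).real * np.vdot(field, field).real
    )

# Rough LG01-like target: azimuthal phase winding times a Gaussian envelope.
x = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(x, x)
target = (X + 1j * Y) * np.exp(-(X ** 2 + Y ** 2))

# A field equal to the target up to a global phase is a perfect match.
field = np.exp(1j * 0.3) * target
print(round(fidelity(field, target), 6))  # → 1.0
```

Insensitivity to the global phase factor is the reason an overlap-based cost function can be driven so close to F = 1 with a phase-only modulator.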
IVOA Recommendation: Sky Event Reporting Metadata Version 2.0
VOEvent defines the content and meaning of a standard information packet for
representing, transmitting, publishing and archiving information about a
transient celestial event, with the implication that timely follow-up is of
interest. The objective is to motivate the observation of
targets-of-opportunity, to drive robotic telescopes, to trigger archive
searches, and to alert the community. VOEvent is focused on the reporting of
photon events, but events mediated by disparate phenomena such as neutrinos,
gravitational waves, and solar or atmospheric particle bursts may also be
reported.
Structured data is used, rather than natural language, so that automated
systems can effectively interpret VOEvent packets. Each packet may contain zero
or more of the "who, what, where, when & how" of a detected event, but in
addition, may contain a hypothesis (a "why") regarding the nature of the
underlying physical cause of the event. Citations to previous VOEvents may be
used to place each event in its correct context. Proper curation is encouraged
throughout each event's life cycle from discovery through successive
follow-ups. VOEvent packets gain persistent identifiers and are typically
stored in databases reached via registries. VOEvent packets may therefore
reference other packets in various ways. Packets are encouraged to be small and
to be processed quickly. This standard does not define a transport layer or the
design of clients, repositories, publishers or brokers; it does not cover
policy issues such as who can publish, who can build a registry of events, who
can subscribe to a particular registry, nor the intellectual property issues.
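The "who, what, where, when & how" packet structure described above can be sketched with standard XML tooling. The example is hypothetical and not schema-valid: the element names follow VOEvent conventions, but the IVORNs, parameter, and values are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Build a minimal VOEvent-style packet (illustrative only; a conforming
# packet must validate against the official VOEvent 2.0 schema).
event = ET.Element("VOEvent", role="observation",
                   ivorn="ivo://example.org/events#demo-001")

who = ET.SubElement(event, "Who")            # who detected/published it
ET.SubElement(who, "AuthorIVORN").text = "ivo://example.org/observatory"

what = ET.SubElement(event, "What")          # what was measured
ET.SubElement(what, "Param", name="magnitude", value="18.2")

wherewhen = ET.SubElement(event, "WhereWhen")  # where/when it happened
ET.SubElement(wherewhen, "ISOTime").text = "2024-01-01T00:00:00"

why = ET.SubElement(event, "Why")            # hypothesis about the cause
ET.SubElement(why, "Inference", probability="0.7")

packet = ET.tostring(event, encoding="unicode")
print(packet.startswith("<VOEvent"))  # → True
```

Every top-level block is optional ("zero or more of the who, what, where, when & how"), and the `ivorn` attribute is what gives the packet the persistent identifier other packets can cite.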
Heterodinuclear ruthenium(II)-cobalt(III) complexes as models for a new approach to selective cancer treatment
Heterodinuclear ruthenium(II)-cobalt(III) complexes have been prepared as part of investigations into a new approach to selective cancer treatment. A cobalt(III) centre bearing amine ligands, which serve as models for cytotoxic nitrogen mustard ligands, is connected by a bridging ligand to a ruthenium(II)-polypyridyl moiety. Upon excitation of the ruthenium centre by visible light, electron transfer to the cobalt(III) centre results in reduction to cobalt(II) and consequent release of its ligands. We have synthesised several such structures and demonstrated their ability to release ligands upon excitation of the ruthenium centre by visible light.
Protein-coding variants implicate novel genes related to lipid homeostasis contributing to body-fat distribution.
Body-fat distribution is a risk factor for adverse cardiovascular health consequences. We analyzed the association of body-fat distribution, assessed by waist-to-hip ratio adjusted for body mass index, with 228,985 predicted coding and splice site variants available on exome arrays in up to 344,369 individuals from five major ancestries (discovery) and 132,177 European-ancestry individuals (validation). We identified 15 common (minor allele frequency, MAF ≥5%) and nine low-frequency or rare (MAF <5%) novel coding variants. Pathway/gene set enrichment analyses identified lipid particle, adiponectin, abnormal white adipose tissue physiology, and bone development and morphology as important contributors to fat distribution, while cross-trait associations highlight cardiometabolic traits. In functional follow-up analyses, specifically in Drosophila RNAi-knockdowns, we observed a significant increase in total body triglyceride levels for two genes (DNAH10 and PLXND1). We implicate novel genes in fat distribution, stressing the importance of interrogating low-frequency and protein-coding variants.