
    Human rights protection for online activist groups: a legal analysis of the issues, frameworks and ways forward

    This thesis, Human Rights Protection for Online Activist Groups: A Legal Analysis of the Issues, Frameworks and Ways Forward, assesses how improved protection can be provided to online activist groups operating within England and Wales, in light of technological and legislative developments that have created various challenges for such groups and their essential rights. The principal aim is to illustrate and assess the extent to which improved protection can be provided for online activist groups, both for individual members and, concurrently, by recognising the group itself as an entity. Focusing on the rights of freedom of association and assembly, the objective is to demonstrate that the protections currently afforded to groups under jurisdictional human rights frameworks are inadequate, owing to factors such as the unprecedented impact that surveillance technologies and contextual developments continue to have on these rights. The thesis finds, in view of (a) challenges presented by the online environment, such as increased surveillance and the use of ‘Hidden Spaces’, (b) the politicised, contentious and difficult nature of the governance of digital rights, and (c) planned future legislative innovation within England and Wales, that any progress in ensuring the rights of online activist groups are upheld is likely to be a problematic endeavour: questions of criminality arise for individual actors, yet responsibility lies with the instruments and related bodies themselves rather than with the groups. This indicates that the best approach forward, within the presented scope, is to recognise groups as separate rights-holding entities, as reducing the impact of the above-mentioned trajectory is unlikely to occur in the near future

    Standards of rider comfort: Noise, vibration and age of rider as factors

    Psychological responses of bus passengers to noise and vibration, in terms of ride quality, are studied in a field test. An attempt is made to correlate passenger comfort ratings with rider age

    Structure-kinetics relationships in micellar solutions of nonionic surfactants

    Micellar surfactant solutions are highly complex systems containing aggregates of different shapes and sizes, all in dynamic equilibrium. I have undertaken an investigation into the kinetic processes that occur in micellar surfactant solutions subjected to both bulk perturbations and close to expanding surfaces. Supporting information regarding the equilibrium properties of surfactant micelles has been acquired using several experimental techniques, including small-angle neutron scattering (SANS) and pulsed field gradient spin echo (PFGSE) NMR. Bulk exchange kinetics between micelles and monomers in solution have been investigated using both numerical modelling and stopped-flow dilution experiments. My results show that conventional theories of monomer-micelle exchange kinetics apply only under very limited conditions. In order to understand how micelle solutions respond to large perturbations from equilibrium, a different approach is required. I have hypothesised an alternative monomer-micelle exchange mechanism. This hypothesis has been tested using numerical modelling and comparison of theoretical predictions with the results of stopped-flow perturbation experiments. These experimental results are consistent with my hypothesis. In addition to bulk exchange kinetics, I have also undertaken a detailed experimental investigation of adsorption kinetics from micellar systems on the millisecond timescale. Again my results indicate that conventional theoretical approaches are incomplete, and I suggest an alternative adsorption pathway that should be included in future theories of adsorption from micellar surfactant solutions

    Building mega-science: A systems engineering tool for the Square Kilometre Array

    The Square Kilometre Array (SKA) will be the largest radio telescope in the world, with an aperture of up to one million square metres, due to be operational by 2022 at a cost estimated at 1.5 billion euros (2007). Designing a flexible instrument such as the SKA is a long-term task and requires a systems approach with inputs from both engineering and science specialists. Cost and performance modelling, and subsequent optimisation, is central to building the radio telescope. Curtin is taking a lead role in this process, and we present here a custom developed systems engineering tool that is being used in the design phase of the SKA. We outline how such a tool is being used to illuminate the performance and cost trade-offs required for this complex mega-science project reliant on emerging technologies to achieve its scientific goals. We also present some simple design decisions resulting from these trade-offs, saving hundreds of millions of euros

    System design for the square kilometre array : new views of the universe

    The Square Kilometre Array (SKA) radio telescope is being designed as a premier scientific instrument of the 21st century, using novel technologies to maximise its scientific capability. The SKA has an aggressive project timeline, dynamic and evolving scientific requirements, and a large design exploration space with many interdependent sub-systems. These complexities increase the difficulty of developing cost-effective design solutions that maximise the scientific capability of the telescope within construction and operations funding constraints.
    To gain insight into specific design challenges, in this thesis I have developed parametric models of the telescope system that relate cost to key performance metrics. I examine, as case studies, three aspects of the SKA design that have to date had little investigation compared to the rest of the telescope, but show considerable potential for discovering new astronomical phenomena.
    First, I present fast transient survey strategies for exploring high time resolution parameter space, and consider the system design implications of these strategies. To maximise the scientific return from limited processing capacity, I develop a new metric, ‘event rate per beam’, to measure the cost-effectiveness of the various search strategies. The most appropriate search strategy depends on the observed sky direction and the source population; for SKA Phase 1, low-frequency aperture arrays tend to be more effective for extragalactic searches, and dishes more effective for directions of increased scatter broadening, such as near the Galactic plane.
    Second, I compare the cost of two design solutions for low-frequency sparse aperture array observations (70–450 MHz) that achieve similar performance: a single-band implementation with a wideband antenna design, and a dual-band implementation with each array observing approximately half the fractional bandwidth. Perhaps somewhat surprisingly, despite the dual-band array having twice the number of antenna elements, neither a representative single- nor dual-band implementation is cheaper a priori, although the uncertainties are currently high. In terms of the broader telescope system design, I show that the central processing, antenna deployment and site preparation costs are potentially significant cost drivers that have so far received insufficient attention.
    Third, the recent site decision raises the question of how to cost-effectively provide data connectivity to widely separated antennas, to enable high angular resolution observations with the SKA dish array in Africa. To facilitate the design of such a data network, I parametrise the performance and cost of an exemplar network using three simple metrics: maximum baseline length; number of remote stations (grouped antennas) on long baselines; and the product of bandwidth and number of station beams. While all three metrics are cost drivers, limiting the beam–bandwidth product reduces cost without significantly impacting scientific performance.
    The complexities of the SKA design environment prevent straightforward analyses of cost-effective design solutions. However, the case studies in this thesis demonstrate the importance of parametric performance and cost modelling of the telescope system in determining cost-effective design solutions that are capable of revealing large regions of unexplored parameter space in the radio Universe. The analytical approach to requirements analysis and performance-cost modelling, combined with pragmatic choices to narrow the exploration space, yields new insights into cost-effective SKA designs. Continuation of this approach will be essential to successfully integrate the forthcoming results from various verification systems into the SKA design over the next few years
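An "event rate per beam" style figure of merit can be illustrated with a toy calculation. The sketch below assumes a Euclidean source population, for which the detectable event rate scales as the field of view times S_min^(-3/2); that scaling is textbook, but the function name, parameter values and array figures are invented for illustration and are not the thesis's actual metric or results.

```python
# Toy "event rate per beam" figure of merit for transient surveys.
# Assumes a Euclidean source population: detectable event rate scales
# as FoV * S_min^(-3/2). All numbers below are illustrative only.

def event_rate_per_beam(fov_deg2, s_min_jy, n_beams):
    """Relative event rate per processed beam (arbitrary units)."""
    return fov_deg2 * s_min_jy ** -1.5 / n_beams

# Hypothetical comparison: a wide-field aperture array (large FoV, modest
# sensitivity, many beams to process) vs a dish array (small FoV, better
# sensitivity, few beams).
aa   = event_rate_per_beam(fov_deg2=200.0, s_min_jy=0.10, n_beams=500)
dish = event_rate_per_beam(fov_deg2=1.0, s_min_jy=0.05, n_beams=16)

print(f"aperture array: {aa:.2f}")
print(f"dishes:         {dish:.2f}")
```

With these invented numbers the wide-field array wins per unit of processing, showing how such a metric lets limited beamforming capacity be compared across very different telescope designs.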

    Cost-effective aperture arrays for SKA Phase 1: single or dual-band?

    An important design decision for the first phase of the Square Kilometre Array is whether the low frequency component (SKA1-low) should be implemented as a single- or dual-band aperture array; that is, using one or two antenna element designs to observe the 70–450 MHz frequency band. This memo uses an elementary parametric analysis to make a quantitative, first-order cost comparison of representative implementations of a single- and dual-band system, chosen for comparable performance characteristics. A direct comparison of the SKA1-low station costs reveals that those costs are similar, although the uncertainties are high. The cost impact on the broader telescope system varies: the deployment and site preparation costs are higher for the dual-band array, but the digital signal processing costs are higher for the single-band array. This parametric analysis also shows that a first stage of analogue tile beamforming, as opposed to only station-level, all-digital beamforming, has the potential to significantly reduce the cost of the SKA1-low stations. However, tile beamforming can limit flexibility and performance, principally by reducing the accessible field of view. We examine the cost impacts in the context of scientific performance, for which the spacing and intra-station layout of the antenna elements are important derived parameters. We discuss the implications of the many possible intra-station signal transport and processing architectures and consider areas where future work could improve the accuracy of SKA1-low costing.
    Comment: 64 pages, 23 figures, submitted to the SKA Memo series
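A first-order parametric cost comparison of the kind the memo describes can be sketched as a simple linear model of station cost. Everything below is a hypothetical illustration: the function, cost figures and element counts are placeholders, not the memo's actual cost data.

```python
# Hypothetical first-order parametric model of aperture-array station
# cost: single-band (one wideband element design) vs dual-band (two
# arrays with narrower-band, cheaper elements). All figures are
# illustrative placeholders, not the memo's real costings.

def station_cost(n_elements, cost_per_element, signal_chain_cost, deployment_cost):
    """Total station cost as a simple linear parametric model."""
    return n_elements * (cost_per_element + signal_chain_cost) + deployment_cost

# Single-band: one wideband element design covering 70-450 MHz.
single = station_cost(n_elements=11_000, cost_per_element=120.0,
                      signal_chain_cost=180.0, deployment_cost=250_000.0)

# Dual-band: twice the element count overall, but cheaper elements and
# (in this toy model) lower per-element processing cost per array.
dual = (station_cost(11_000, 70.0, 95.0, 200_000.0) +
        station_cost(11_000, 55.0, 85.0, 200_000.0))

ratio = dual / single
print(f"single-band station: {single:,.0f}")
print(f"dual-band station:   {dual:,.0f}")
print(f"dual/single ratio:   {ratio:.2f}")
```

Even in this toy model the two totals land within a few percent of each other, mirroring the memo's finding that neither implementation is clearly cheaper a priori: doubling the element count is roughly offset by cheaper narrower-band elements and signal chains.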

    On the predictions and limitations of the Becker–Döring model for reaction kinetics in micellar surfactant solutions

    We investigate the breakdown of a system of micellar aggregates in a surfactant solution following an order-one dilution. We derive a mathematical model based on the Becker–Döring system of equations, using realistic expressions for the reaction constants fitted to Molecular Dynamics simulations. We exploit the largeness of typical aggregation numbers to derive a continuum model, replacing a large system of ordinary differential equations with a partial differential equation in two independent variables: time and aggregate size. Numerical solutions demonstrate that re-equilibration occurs in two distinct stages over well-separated time-scales, in agreement with experiment and with previous theories. We conclude by exposing a limitation of the Becker–Döring theory for re-equilibration and discuss potential resolutions
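For reference, the Becker–Döring system referred to above takes the standard form below, with c_s the concentration of aggregates of size s and J_s the net flux from size s to size s+1; the specific rate coefficients a_s, b_s fitted to Molecular Dynamics in the paper are not reproduced here.

```latex
\frac{\mathrm{d}c_s}{\mathrm{d}t} = J_{s-1} - J_s, \qquad s \ge 2,
\qquad
J_s = a_s c_1 c_s - b_{s+1} c_{s+1},
\qquad
\frac{\mathrm{d}c_1}{\mathrm{d}t} = -J_1 - \sum_{s=1}^{\infty} J_s,
```

where a_s and b_s are the monomer association and dissociation rate coefficients, and the monomer equation enforces conservation of total surfactant. The continuum limit in the paper exploits large aggregation numbers to turn this infinite ODE hierarchy into a PDE in time and aggregate size.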

    The comparative cytotoxicity of riddelliine in primary mouse, rat and chick hepatocytes

    Dehydropyrrolizidine alkaloid (DHPA) producing plants commonly poison livestock, wildlife and humans. Poisoning occurs when DHPAs are ingested as feed or food, or when they contaminate medicinal or herbal products. Direct toxicologic comparison of individual DHPAs is essential to estimate their actual health risks, but has been problematic due to varying models and difficulties in DHPA isolation or synthesis. In contrast, the macrocyclic DHPA riddelliine is readily isolated, and it has been used as a benchmark to characterize different models of toxicity and carcinogenicity. Following earlier work with immortalized cell lines, the objective of this study was to characterize the effect of riddelliine on primary mouse, rat and chick hepatocyte cultures, with the aim of developing a suitable, sensitive model for assessing DHPA-related cytotoxicity. After establishing viable cultures, the hepatocytes were exposed for 24 hours to riddelliine (from 0.1 µM to 1.2 mM) and cytotoxicity (CT50) was estimated using a mitochondrial function assay (MTT). Despite a biphasic response, possibly attributable to a sub-population of resistant chick hepatocytes, chick hepatocyte cultures were highly sensitive (CT50 0.9 µM) to riddelliine cytotoxicity relative to rat (CT50 289 µM) and mouse (CT50 627 µM) hepatocytes. Chick, mouse and rat hepatocyte cytochrome P450 3A4 activities did not correlate with riddelliine-induced cytotoxicity. With further development to utilize the highly sensitive primary chick hepatocytes, this model may be useful for directly comparing panels of DHPAs, including rare or difficult-to-isolate alkaloids
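A CT50 of the kind reported above can be estimated from MTT viability data in several ways; one minimal sketch, assuming a roughly monotone dose-response, interpolates the 50%-viability crossing in log-concentration space. The function name and the dose-response values below are invented for illustration and are not the study's measurements or method.

```python
# Hypothetical CT50 estimation from MTT viability data by log-linear
# interpolation of the 50% crossing. Data below are invented examples.
import math

def estimate_ct50(concs_uM, viability_pct):
    """Interpolate the concentration at 50% viability, working in
    log10-concentration space between the two bracketing doses."""
    pairs = list(zip(concs_uM, viability_pct))
    for (c_lo, v_lo), (c_hi, v_hi) in zip(pairs, pairs[1:]):
        if v_lo >= 50.0 >= v_hi:
            frac = (v_lo - 50.0) / (v_lo - v_hi)
            log_ct50 = (math.log10(c_lo)
                        + frac * (math.log10(c_hi) - math.log10(c_lo)))
            return 10 ** log_ct50
    raise ValueError("viability never crosses 50%")

# Invented dose-response for a sensitive culture (concentrations in uM).
concs = [0.1, 0.3, 1.0, 3.0, 10.0]
viab  = [95.0, 80.0, 48.0, 20.0, 8.0]

ct50 = estimate_ct50(concs, viab)
print(f"estimated CT50 ~ {ct50:.2f} uM")
```

In practice a sigmoidal (e.g. four-parameter logistic) fit over all doses is more robust, especially for biphasic responses like the one the study reports, but the interpolation above conveys the basic idea of reading CT50 off the viability curve.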

    A genomic approach to understanding the cause and effect of annual ryegrass toxicity

    Annual Ryegrass Toxicity (ARGT) is a potentially lethal disease affecting livestock grazing on pastures, or consuming fodder, that include annual ryegrass (Lolium rigidum) contaminated with corynetoxins. The corynetoxins (CTs), among the most lethal toxins produced in nature, are produced by the bacterium Rathayibacter toxicus, which uses a nematode vector to attach to and infect the seedheads of L. rigidum. Little is known of the factors that control toxin production. Several studies have speculated that a bacteriophage specific to R. toxicus may be implicated in CT production. We have developed a PCR-based assay to test for both bacterium and phage in ryegrass material, and results indicate a correlation between phage and bacterial presence in all toxic ryegrass samples tested so far. This PCR-based technique may ultimately allow for a rapid, high-throughput screening assay to identify potentially toxic pastures and feed in the field. Currently, ~80% of the 45 kb phage genome has been sequenced in an investigation to further elucidate its potential role in toxin production.
    Furthermore, specific alterations in gene expression as a result of exposure to CTs, or to the closely related tunicamycins (TMs), which are commercially available and considered biologically indistinguishable from CTs, will be evaluated for use as biomarkers of exposure. The effects of both toxins will be analysed in vitro using a rat hepatocyte cell line and screened on a low-density DNA microarray (“CT-Chip”) containing <100 selected rat hepatic genes. The results are expected to further define the bioequivalence of CTs and TMs and to identify levels of exposure that are related to specific toxic effects or have no adverse effect on livestock

    Lessons from pandemic influenza A(H1N1): The research-based vaccine industry's perspective

    As A(H1N1) influenza enters the post-pandemic phase, health authorities around the world are reviewing the response to the pandemic. To ensure this process enhances future preparations, it is essential that perspectives are included from all relevant stakeholders, including vaccine manufacturers. This paper outlines the contribution of R&D-based influenza vaccine producers to the pandemic response, and explores lessons that can be learned to improve future preparedness.
    The emergence of 2009 A(H1N1) influenza led to unprecedented collaboration between global health authorities, scientists and manufacturers, resulting in the most comprehensive pandemic response ever undertaken, with a number of vaccines approved for use three months after the pandemic declaration. This response was only possible because of the extensive preparations undertaken during the last decade.
    During this period, manufacturers greatly increased influenza vaccine production capacity, and estimates suggest a further doubling of capacity by 2014. Producers also introduced cell-culture technology, while adjuvant and whole virion technologies significantly reduced pandemic vaccine antigen content. This substantially increased pandemic vaccine production capacity, which in July 2009 WHO estimated reached 4.9 billion doses per annum. Manufacturers also worked with health authorities to establish risk management plans for robust vaccine surveillance during the pandemic. Individual producers pledged significant donations of vaccine doses and tiered-pricing approaches for developing country supply.
    Based on the pandemic experience, a number of improvements would strengthen future preparedness. Technical improvements to rapidly select optimal vaccine viruses, and processes to speed up vaccine standardization, could accelerate and extend vaccine availability. Establishing vaccine supply agreements beforehand would avoid the need for complex discussions during a period of intense time pressure. Enhancing international regulatory co-operation and mutual recognition of approvals could accelerate vaccine supply, while maintaining safety standards. Strengthening communications with the public and healthcare workers using new approaches and new channels could help improve vaccine uptake. Finally, increasing seasonal vaccine coverage will be particularly important to extend and sustain pandemic vaccine production capacity