
    Beyond Provenance

    "Human intentionality in chemical patterns in Bronze Age metals For the last 180 years, scientists have been attempting to determine the ‘provenance’ (geological source) of the copper used in Bronze Age artefacts. However, despite advances in analytical technologies, the theoretical approach has remained virtually unchanged over this period, with the interpretative methodology only changing to accommodate the increasing capacity of computers. This book represents a concerted effort to think about the composition of Bronze Age metal as the product of human intentionality as well as of geology. It considers the trace element composition of the metal, the alloying elements, and the lead isotopic composition, showing how a combination of these aspects, along with archaeological context and typology, can reveal much more about the life history of such artefacts, expanding considerably upon the rather limited ambition of knowing where the ore was extracted. Beyond Provenance serves as a ‘how-to handbook’ for those wishing to look for evidence of human intentionality in the chemical patterning observed in bronzes.

    Making Megaliths: Shifting and Unstable Stones in the Neolithic of the Avebury Landscape

    This paper focuses upon the web of practices and transformations bound up in the extraction and movement of megaliths during the Neolithic of southern Britain. The focus is on the Avebury landscape of Wiltshire, where over 700 individual megaliths were employed in the construction of ceremonial and funerary monuments. Because the sarsen stones that make up key monuments such as the Avebury henge and its avenues were locally sourced, little consideration has been given to the process of their acquisition and movement; attention has instead focussed on the middle-distance transportation of sarsen out of this region to Stonehenge. Though stone movements were local, we argue they were far from lacking in significance, as indicated by the subsequent monumentalization of at least two locations from which they were likely acquired. We argue that since such stones embodied place(s), their removal, movement and resetting represented a remarkably dynamic and potentially disruptive reconfiguration of the world as it was known. Megaliths were never inert or stable matter, and we need to embrace this in our interpretative accounts if we are to understand the very different types of monument that emerged in prehistory as a result.

    Authenticity, Artifice and the Druidical Temple of Avebury

    This paper engages with the legacy of a prehistoric monument – the Avebury henge, in southern England – and the influential work of an early antiquarian – William Stukeley. We highlight how the reception of Stukeley’s 1743 work, Abury: a temple of the British druids, has structured images of Avebury and shaped the authenticity claims of later scholars, artists and religious groups. In biographical terms, Stukeley’s carefully crafted Abury has possessed a very active afterlife, its status shifting from that of primary record (of Avebury), to a form of constructional blueprint (for Avebury), to a partial and flawed primary record (of an Avebury), only to end up for some as an unassailable and definitive record (of the Avebury). At the centre of this narrative is the status of Abury as a material agent around which various authenticity claims have been constructed.

    Delivering a toolbox of flexible platforms for clinical and commercial bioprocessing production: ‘Defining the business drivers for development and implementation’

    Despite its growing success, the biopharmaceutical industry continues to face competitive challenges from multiple sources. Cost pressures include evolving reimbursement, global competition and loss of drug exclusivity. As a result, there is a significant drive to boost the overall productivity of biotherapeutic programs by shortening development timelines and lowering both development and production costs, while maintaining product quality. Low-cost production solutions must be aggressively pursued due to the large number of drug candidates in development and their relatively high dosing requirements. The industry is also facing an expanding range of modalities, such as bispecifics and nanobodies, that provide a more heterogeneous product pipeline and a wider range of product demand (kg/yr). In addition, supply chains need to be more responsive to patient needs as treatment becomes more personalized. These industry challenges call for flexible solutions that provide agility and lower cost. The presentation will discuss the business case, development and implementation of such a toolbox of low-cost flexible platform solutions to meet a range of scenarios faced in clinical development, while also providing a line of sight to commercial production. Examples will include a simple single-use fed-batch process for low-demand products versus advanced integrated/continuous automated processing for higher demands. The implementation strategy through the clinical phases to commercial will be discussed for commodity mAb production using a fully automated continuous process with product attribute control and real-time release. This provides a ‘supply on demand’ process that can respond rapidly to changes in demand. Such production synchronization should shorten clinical and commercial timelines and minimize inventory costs. These cost reduction initiatives, in combination with regional manufacturing, should help to expand patient accessibility to biologics and vaccines.

    Between analysis and transformation: technology, methodology and evaluation on the SPLICE project

    This paper concerns the ways in which technological change may entail methodological development in e-learning research. The focus of our argument centres on evaluation in e-learning and how technology can contribute to consensus-building on the value of project outcomes and to the identification of the mechanisms behind those outcomes. We argue that a critical approach to the methodology of evaluation which harnesses technology in this way is vital to agile and effective policy- and strategy-making as institutions grapple with the challenges of transformation in a rapidly changing educational and technological environment. We identify Pawson and Tilley’s ‘Realistic Evaluation’, with its focus on mechanisms and multiple stakeholder perspectives, as an appropriate methodological approach for this purpose, and we report on its use within a JISC-funded project on social software, SPLICE (Social Practices, Learning and Interoperability in Connected Environments). The project created new tools to assist the identification of mechanisms responsible for change to personal and institutional technological practice. These tools included collaborative mind-mapping and focused questioning, and tools for the animated modelling of complex mechanisms. Using these tools, large numbers of project stakeholders could engage in a process in which they were encouraged to articulate and share their theories and ideas as to why project outcomes occurred. Supported by the technology, this process led to the identification of, and agreement on, common mechanisms with explanatory power for all stakeholders. In conclusion, we argue that SPLICE has shown the potential of technologically-mediated Realistic Evaluation. Given the technologies we now have, a methodology based on the mass cumulation of stakeholder theories and ideas about mechanisms is feasible. Furthermore, the summative outcomes of such a process are rich in explanatory and predictive power, and are therefore useful to the immediate and strategic problems of the sector. Finally, we argue that as well as generating better explanations for phenomena, the evaluation process can itself become transformative for stakeholders.

    Protein Refinery Operations Lab (PRO Lab): A sandbox for continuous protein production & advanced process control

    Significant strides towards implementation of continuous bioprocessing are being made at an ever-increasing rate. Advances in technology for traditional unit operations such as cell-retention devices in perfusion cell culture, continuous multi-column chromatography (CMCC) and single-pass tangential flow filtration have led to demonstrations of both semi-continuous and fully-continuous protein production processes operating at periodic steady states at the pilot scale. Previous proof-of-concept work at Merck & Co., Inc. has shown an automated (DeltaV) and single-use monoclonal antibody (mAb) purification scheme through Protein A CMCC and pH viral inactivation with minimal human interaction for 30 days, fed from a perfusion bioreactor [1]. This automation scheme has since been expanded to encompass an integrated mAb upstream and platform downstream process, resulting in an entirely automated ‘protein refinery’ sandbox. In this presentation a vision for a continuous bioprocessing facility of the future will be presented, wherein the integration of Process Analytical Technologies (PAT), Multivariate Data Analysis (MVDA) and feedback control strategies will lead to more streamlined plant operations and high product quality consistency. The presentation will also discuss how the control strategies put in place in PRO Lab lay the groundwork for this vision, and how PRO Lab will be used to pilot PAT, MVDA and feedback control as they become mature enough for integration into the continuous platform. These tools, working together and validated in the sandbox environment, will ultimately enable real-time release of drug substance. PRO Lab will also support better holistic process understanding by enabling analysis of perturbations and their propagation throughout the production line. Process and product quality consistency data over a period of >30 days will be presented from PRO Lab as an initial step toward the ultimate vision of an automated, well-controlled, well-characterized protein refinery.
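
    As a purely hypothetical sketch of the kind of feedback control loop mentioned above, the snippet below shows a proportional-integral (PI) controller adjusting a feed rate to hold a measured product attribute at its setpoint. All names, gains, setpoints and the toy process model are illustrative assumptions and bear no relation to the actual PRO Lab (DeltaV) implementation.

    import numpy as np

    def pi_controller(setpoint, measurement, integral, kp=0.8, ki=0.1, dt=1.0):
        """One PI control step: returns (control action, updated integral term)."""
        error = setpoint - measurement
        integral += error * dt
        return kp * error + ki * integral, integral

    # Toy simulation: a product attribute drifts unless the feed rate compensates.
    rng = np.random.default_rng(0)
    attribute, integral, setpoint = 1.0, 0.0, 1.5
    for hour in range(24):
        measured = attribute + rng.normal(scale=0.02)       # noisy in-line (PAT) measurement
        feed, integral = pi_controller(setpoint, measured, integral)
        attribute += 0.2 * feed - 0.05 * attribute          # toy first-order process response
        print(f"h={hour:02d}  measured={measured:.2f}  feed rate={feed:.2f}")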

    Effective risk governance for environmental policy making: a knowledge management perspective

    Effective risk management within environmental policy making requires knowledge of natural, economic and social systems to be integrated: knowledge characterised by complexity, uncertainty and ambiguity. We describe a case study in a (UK) central government department exploring how risk governance supports and hinders this challenging integration of knowledge. Forty-five semi-structured interviews were completed over a two-year period. We found that lateral knowledge transfer between teams working on different policy areas was widely viewed as a key source of knowledge. However, the process of lateral knowledge transfer was predominantly informal and unsupported by risk governance structures. We argue this made decision quality vulnerable to a loss of knowledge through staff turnover and to time and resource pressures. Our conclusion is that the predominant form of risk governance framework, with its focus on centralised decision-making and vertical knowledge transfer, is insufficient to support risk-based environmental policy making. We discuss how risk governance can better support environmental policy makers through systematic knowledge management practices.

    Supervised Distance Matrices: Theory and Applications to Genomics

    We propose a new approach to studying the relationship between a very high-dimensional random variable and an outcome. Our method is based on a novel concept, the supervised distance matrix, which quantifies pairwise similarity between variables based on their association with the outcome. A supervised distance matrix is derived in two stages. The first stage involves a transformation based on a particular model for association. In particular, one might regress the outcome on each variable and then use the residuals or the influence curve from each regression as a data transformation. In the second stage, a choice of distance measure is used to compute all pairwise distances between variables in this transformed data. When the outcome is right-censored, we show that the supervised distance matrix can be consistently estimated using inverse probability of censoring weighted (IPCW) estimators based on the mean and covariance of the transformed data. The proposed methodology is illustrated with examples of gene expression data analysis with a survival outcome. This approach is widely applicable in genomics and other fields where high-dimensional data are collected on each subject.
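
    A minimal sketch of the two-stage construction described above, for the simpler case of an uncensored continuous outcome: the outcome is regressed on each variable separately, the residuals serve as the transformed data, and pairwise Euclidean distances between variables are then computed. The function and variable names, the choice of simple linear regression, and the Euclidean metric are illustrative assumptions, not the paper's exact estimators (in particular, the IPCW machinery for censored outcomes is omitted).

    import numpy as np

    def supervised_distance_matrix(X, y):
        """Illustrative supervised distance matrix for an uncensored outcome.

        X : (n_samples, p_variables) data matrix, e.g. gene expression.
        y : (n_samples,) continuous outcome.
        Stage 1: regress y on each variable and keep the residuals.
        Stage 2: compute pairwise Euclidean distances between variables
                 (columns) in the residual-transformed data.
        """
        n, p = X.shape
        transformed = np.empty((n, p))
        for j in range(p):
            design = np.column_stack([np.ones(n), X[:, j]])    # intercept + variable j
            beta, *_ = np.linalg.lstsq(design, y, rcond=None)  # least-squares fit
            transformed[:, j] = y - design @ beta              # residuals for variable j

        diff = transformed[:, :, None] - transformed[:, None, :]
        return np.sqrt((diff ** 2).sum(axis=0))                # (p, p) distance matrix

    # Toy usage: 50 samples, 20 variables, outcome driven by the first two variables
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 20))
    y = X[:, 0] - X[:, 1] + rng.normal(size=50)
    D = supervised_distance_matrix(X, y)
    print(D.shape)  # (20, 20)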

    Resampling-based Multiple Testing: Asymptotic Control of Type I Error and Applications to Gene Expression Data

    We define a general statistical framework for multiple hypothesis testing and show that the correct null distribution for the test statistics is obtained by projecting the true distribution of the test statistics onto the space of mean-zero distributions. For common choices of test statistics (based on an asymptotically linear parameter estimator), this distribution is asymptotically multivariate normal with mean zero and the covariance of the vector influence curve for the parameter estimator. This test statistic null distribution can be estimated by applying the non-parametric or parametric bootstrap to correctly centered test statistics. We prove that this bootstrap-estimated null distribution provides asymptotic control of most type I error rates. We show that obtaining a test statistic null distribution from a data null distribution (e.g., projecting the data-generating distribution onto the space of all distributions satisfying the complete null) only provides the correct test statistic null distribution if the covariance of the vector influence curve is the same under the data null distribution as under the true data distribution. This condition is a weak version of the subset pivotality condition. We show that our multiple testing methodology controlling type I error is equivalent to constructing an error-specific confidence region for the true parameter and checking whether it contains the hypothesized value. We also study the two-sample problem and show that the permutation distribution produces an asymptotically correct null distribution if (i) the sample sizes are equal or (ii) the populations have the same covariance structure. We include a discussion of the application of multiple testing to gene expression data, where the dimension typically far exceeds the sample size. An analysis of a cancer gene expression data set illustrates the methodology.
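
    A minimal sketch of the bootstrap idea described above, for a one-sample testing problem: test statistics are recomputed on resampled data and centred at the observed means, and the resulting null distribution of the maximum absolute statistic gives a common cutoff controlling the family-wise error rate. The names, the simple t-type statistic and the maxT-style cutoff are illustrative assumptions rather than the paper's exact procedure.

    import numpy as np

    def bootstrap_maxT(X, B=1000, alpha=0.05, seed=0):
        """Bootstrap-estimated null distribution of centred test statistics.

        X : (n_samples, m_hypotheses) data; H0_j states that column j has mean zero.
        Returns the observed statistics, the maxT cutoff and the rejection vector.
        """
        rng = np.random.default_rng(seed)
        n, m = X.shape
        mean = X.mean(axis=0)
        se = X.std(axis=0, ddof=1) / np.sqrt(n)
        t_obs = mean / se                                   # observed test statistics

        max_null = np.empty(B)
        for b in range(B):
            idx = rng.integers(0, n, size=n)                # non-parametric bootstrap sample
            Xb = X[idx]
            # Centre at the observed means so the resampled statistics have mean zero
            tb = (Xb.mean(axis=0) - mean) / (Xb.std(axis=0, ddof=1) / np.sqrt(n))
            max_null[b] = np.abs(tb).max()

        cutoff = np.quantile(max_null, 1 - alpha)           # common cutoff (FWER control)
        return t_obs, cutoff, np.abs(t_obs) > cutoff

    # Toy usage: 30 samples, 200 hypotheses, the first 5 columns have a shifted mean
    rng = np.random.default_rng(1)
    X = rng.normal(size=(30, 200))
    X[:, :5] += 1.0
    t_obs, cutoff, reject = bootstrap_maxT(X)
    print(round(cutoff, 2), int(reject.sum()))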