
    Is Consciousness Computable? Quantifying Integrated Information Using Algorithmic Information Theory

    In this article we review Tononi's (2008) theory of consciousness as integrated information. We argue that previous formalizations of integrated information (e.g. Griffith, 2014) depend on information loss. Since lossy integration would necessitate continuous damage to existing memories, we propose it is more natural to frame consciousness as a lossless integrative process and provide a formalization of this idea using algorithmic information theory. We prove that complete lossless integration requires noncomputable functions. This result implies that if unitary consciousness exists, it cannot be modelled computationally. Published as: Maguire, P., Moser, P., Maguire, R., & Griffith, V. (2014). Is consciousness computable? Quantifying integrated information using algorithmic information theory. In P. Bello, M. Guarini, M. McShane, & B. Scassellati (Eds.), Proceedings of the 36th Annual Conference of the Cognitive Science Society. Austin, TX: Cognitive Science Society.
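    The abstract's appeal to algorithmic information theory can be illustrated with a standard quantity from that theory; the sketch below is generic and is not the paper's own formalization. Algorithmic mutual information measures how much shorter the joint description of two parts is than their separate descriptions, with K denoting Kolmogorov complexity:

        I_K(x_1 : x_2) = K(x_1) + K(x_2) - K(x_1, x_2)

    Because K itself is noncomputable, a measure defined directly in terms of exact K values is likewise not computable in general, which is the flavour of obstruction the abstract's result turns on.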

    Growing and testing mycelium bricks as building insulation materials

    In order to improve the energy performance of buildings, insulation materials (such as mineral glass and rock wools, or fossil fuel-based plastic foams) are being used in increasing quantities, which may lead to potential problems with material depletion and landfill disposal. One suggested sustainable solution is the use of bio-based, biodegradable materials. A number of attempts have been made to develop such biomaterials, using for example sheep wool, hemcrete or recycled paper. In this paper, a novel type of bio-based insulation material, mycelium, is examined. The aim is to produce mycelium materials that could be used as insulation. The bio-based material was required to have properties that matched existing alternatives, such as expanded polystyrene, in terms of physical and mechanical characteristics but with an enhanced level of biodegradability. The testing data showed that mycelium bricks exhibited good thermal performance. Future work is planned to improve the growing process and thermal performance of the mycelium bricks.
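    As a side note on how insulation comparisons of this kind are usually quantified, the snippet below computes thermal resistance from slab thickness and conductivity. It is a minimal sketch: the conductivity figures are placeholder assumptions, not the paper's measured data.

def r_value(thickness_m, conductivity_w_mk):
    """Thermal resistance R (m2.K/W) of a slab: R = thickness / conductivity."""
    return thickness_m / conductivity_w_mk

eps_lambda = 0.035       # expanded polystyrene, typical lambda in W/(m.K)
mycelium_lambda = 0.05   # hypothetical value for a mycelium brick
print(r_value(0.10, eps_lambda), r_value(0.10, mycelium_lambda))  # R for a 100 mm slab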

    The North Wyke Farm Platform: Methodologies Used in the Remote Sensing of the Quantity and Quality of Drainage Water

    The North Wyke Farm Platform (NWFP) for agri-environmental research in temperate grassland was established in the UK in 2010 (Orr et al. 2011). Here we describe the instrumentation and methodologies used to monitor the quantity and quality of drainage water at a total of 15 H-flumes draining 5 sub-catchments within three farmlets. Each of the 15 flume laboratories is supplied with 3 kW of mains power and connected to both fibre optic and UHF (Ultra High Frequency) radio networks for data exchange. The radio data network also provides telemetry for rain gauges and soil temperature/moisture probes located away from the flumes and within the catchment blocks. Water flow is measured using bubbler flow meters and, when flow is above a defined threshold level, water is pumped into bespoke 13-litre stainless steel bypass cells on a 15-minute cycle using bi-directional peristaltic pumps. A range of sensors located within the bypass cells measure the following water quality parameters: nitrate, ammonium, dissolved organic carbon, temperature, conductivity, turbidity, pH and dissolved oxygen. Total phosphorus and ortho-phosphorus are measured at one flume in each farmlet. Networked auto-samplers are also provided at each flume site for the measurement of other water quality parameters as required. All data are logged and sent to a dedicated server at a 15-minute resolution, while a web front end allows advanced visualization capabilities and remote control of the entire system. The system is configured to allow for flexibility and future expansion to a wider range of parameters.
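    The sampling logic described above (threshold-triggered pumping into the bypass cell, probe reads, 15-minute logging) can be sketched as a simple control loop. This is an illustrative outline only; the function names, threshold and units are assumptions, not the NWFP's actual control software.

import time

FLOW_THRESHOLD = 0.5     # hypothetical trigger level, litres per second
CYCLE_SECONDS = 15 * 60  # 15-minute measurement and logging cycle
SENSORS = ["nitrate", "ammonium", "dissolved_organic_carbon", "temperature",
           "conductivity", "turbidity", "pH", "dissolved_oxygen"]

def read_flow():
    """Stand-in for the bubbler flow meter (litres per second)."""
    return 0.0

def refresh_bypass_cell():
    """Stand-in for driving the peristaltic pump to refill the 13-litre cell."""

def read_sensor(name):
    """Stand-in for reading one probe located in the bypass cell."""
    return 0.0

def send_to_server(record):
    """Stand-in for logging the record to the dedicated server."""
    print(record)

while True:
    record = {"flow": read_flow()}
    if record["flow"] > FLOW_THRESHOLD:
        refresh_bypass_cell()
        record.update({s: read_sensor(s) for s in SENSORS})
    send_to_server(record)
    time.sleep(CYCLE_SECONDS)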

    The North Wyke Farm Platform: A New UK National Capability for Research into Sustainability of Agricultural Temperate Grassland Management

    The North Wyke Farm Platform is a new UK National Capability that will enable studies of different land-use options to be closely monitored and controlled at the farm scale. As a Biotechnology and Biological Sciences Research Council-funded National Capability, the Farm Platform provides centralised scientific facilities including core data (field and water chemistry, water flow rates, greenhouse gas emissions from soils, livestock and agronomic data, and farm management records). Access to the Farm Platform for experimental work or to its data will be available to other research users and collaborators. This shared approach will enhance the depth and breadth of information gained for the benefit of the wider community.

    In Silico Derivation of HLA-Specific Alloreactivity Potential from Whole Exome Sequencing of Stem Cell Transplant Donors and Recipients: Understanding the Quantitative Immuno-biology of Allogeneic Transplantation

    Donor T cell mediated graft vs. host effects may result from the aggregate alloreactivity to minor histocompatibility antigens (mHA) presented by the HLA in each donor-recipient pair (DRP) undergoing stem cell transplantation (SCT). Whole exome sequencing has demonstrated extensive nucleotide sequence variation in HLA-matched DRP. Non-synonymous single nucleotide polymorphisms (nsSNPs) in the GVH direction (polymorphisms present in the recipient and absent in the donor) were identified in 4 HLA-matched related and 5 unrelated DRP. The nucleotide sequence flanking each SNP was obtained using the ANNOVAR software package. All possible nonameric peptides encoded by the non-synonymous SNPs were then interrogated in silico for their likelihood to be presented by the HLA class I molecules in individual DRP, using the Immune Epitope Database (IEDB) SMM algorithm. The IEDB SMM algorithm predicted a median of 18,396 peptides/DRP that bound HLA with an IC50 of <500 nM, and 2,254 peptides/DRP with an IC50 of <50 nM. Unrelated donors generally had higher numbers of peptides presented by the HLA. A similarly large library of presented peptides was identified when the data were interrogated using the NetMHCpan algorithm. These peptides were uniformly distributed across the various organ systems. The bioinformatic algorithm presented here demonstrates that there may be a high level of minor histocompatibility antigen variation in HLA-matched individuals, constituting an HLA-specific alloreactivity potential. These data provide a possible explanation for how relatively minor adjustments in GVHD prophylaxis yield relatively similar outcomes in HLA-matched and mismatched SCT recipients.
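    The enumeration step in this pipeline, testing every nonamer that spans a recipient-specific variant residue against an HLA binding predictor and keeping those below an IC50 cutoff, can be sketched as follows. The predictor here is a placeholder; the study used the IEDB SMM and NetMHCpan tools, whose actual interfaces are not reproduced.

def nonamers_over_variant(protein_seq, variant_pos):
    """All 9-mers of protein_seq that contain the residue at variant_pos (0-based)."""
    peptides = []
    for start in range(max(0, variant_pos - 8), variant_pos + 1):
        pep = protein_seq[start:start + 9]
        if len(pep) == 9:
            peptides.append(pep)
    return peptides

def predicted_ic50(peptide, hla_allele):
    """Placeholder for an MHC class I binding predictor returning IC50 in nM."""
    return 400.0  # dummy value; a real pipeline would call SMM or NetMHCpan here

def candidate_binders(protein_seq, variant_pos, hla_allele, cutoff_nm=500.0):
    """Variant-spanning nonamers predicted to bind below the IC50 cutoff."""
    return [p for p in nonamers_over_variant(protein_seq, variant_pos)
            if predicted_ic50(p, hla_allele) < cutoff_nm]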

    Determination of Total and Bioavailable Soil Lead from a Shooting Range in Central California.

    Lead can pose a significant risk to environmental quality at and around shooting ranges due to its use in bullets and shot. The concentrations of Pb in soils, plants and surficial waters from a shooting range were determined in this study. Soil and plant samples were analyzed for total Pb (US EPA method 3050a) to determine the extent of Pb contamination. The toxicity characteristic leaching procedure (TCLP; US EPA method 1311) was followed to ascertain bioavailable Pb. Soil samples ranged from 14.71 to 6346.15 mg Pb kg-1 soil with an average value of 1157.43 (±2000.57) mg Pb kg-1 soil across the shooting range. Plant samples ranged from 632.76 to 2896.00 mg Pb kg-1 plant with an average value of 1410.31 (±1287.11) mg Pb kg-1 plant, demonstrating significant Pb uptake. Bioavailable Pb was highest in the berm at 2038.00 mg Pb kg-1 soil. Sampling at depth showed Pb concentrations of 72.92 mg Pb kg-1 soil; when compared to surface samples (897.96 mg Pb kg-1), this shows some Pb is leaching through the profile. High Pb concentrations were detected in soil samples collected from the drainage (457.84 mg Pb kg-1), while low Pb levels were detected in the stormwater retention pond and its sediments (0.11 mg Pb L-1 and 39.36 mg Pb kg-1 respectively). This indicates Pb is being transported through erosion of soil colloids. Elevated Pb levels in soil sampled from the drainage indicate most of the Pb present is attached to soil colloids and not free (Pb2+) to leach or run off. Higher concentrations of Pb were detected in plant samples than estimated by the TCLP bioavailable Pb extraction. This could present a problem for any pastoral activities and should come under further scrutiny.
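    The mean (±SD) figures quoted above are straightforward to reproduce from raw sample concentrations; the snippet below shows the calculation with made-up placeholder values rather than the study's measurements.

from statistics import mean, stdev

soil_pb = [12.0, 250.0, 900.0, 2100.0, 6300.0]  # hypothetical mg Pb kg-1 soil values
print(f"{mean(soil_pb):.2f} (±{stdev(soil_pb):.2f}) mg Pb kg-1 soil")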

    ScotGrid: Providing an Effective Distributed Tier-2 in the LHC Era

    ScotGrid is a distributed Tier-2 centre in the UK with sites in Durham, Edinburgh and Glasgow. ScotGrid has undergone a huge expansion in hardware in anticipation of the LHC and now provides more than 4 MSI2K and 500 TB to the LHC VOs. Scaling up to this level of provision has brought many challenges to the Tier-2, and in this paper we show how we have adopted new methods of organising the centres, from fabric management and monitoring to remote site management and operational procedures, to meet these challenges. We describe how we have coped with different operational models at the sites, where the Glasgow and Durham sites are managed "in house" but the resources at Edinburgh are managed as a central university resource. This required the adoption of a different fabric management model at Edinburgh and a special engagement with the cluster managers. Challenges also arose from the different job models of local and grid submission, which required special attention to resolve. We show how ScotGrid has successfully provided an infrastructure for ATLAS and LHCb Monte Carlo production. Special attention has been paid to ensuring that user analysis functions efficiently, which has required optimisation of local storage and networking to cope with the demands of user analysis. Finally, although these Tier-2 resources are pledged to the whole VO, we have established close links with our local physics user communities as the best way to ensure that the Tier-2 functions effectively as part of the LHC grid computing framework. Preprint for the 17th International Conference on Computing in High Energy and Nuclear Physics; 7 pages, 1 figure.