
    The Use of Historical Data in Rule-Based Modelling for Scenarios to Improve Resilience within the Building Stock

    Digital documentation has become integral to the preservation, analysis and communication of historical sites. New platforms are now being developed that involve complex 3D models and allow the analysis of spatial data. These include procedural modelling, a technique that enables the rapid development of ‘dynamic’ 3D environments and the generation of simulations for entire cities, resulting in low-cost, high-resolution 3D city models. Though procedural modelling has been used in the context of archaeology to ‘recreate’ cities at specific historic time points, the use of historical data in the development of rule-based procedural models for current cities has been little explored. Here, we test the extent to which construction age data, historical building regulations and architectural knowledge can be used in the generation of procedural rules, and the level of detail and potential impact that these models may have. Rather than creating an accurate representation of the city, we instead seek to simulate the way in which urban areas are likely to behave under certain conditions, in order to test ‘what if?’ planning scenarios. This allows us to explore more flexible ways of digitally ‘creating’ cities, past and present, and to gain insights into the underlying ‘rules’ that govern their physical form.
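
    As a purely illustrative sketch of what a single rule-based procedural step might look like (not a rule taken from this study), the snippet below maps a building's construction year to hypothetical form attributes; the era boundaries and attribute values are invented assumptions.

    ```python
    # Minimal sketch of a rule-based procedural step: mapping construction age
    # to plausible building attributes. Era thresholds and attribute values are
    # illustrative assumptions, not figures from the study.

    def building_rule(construction_year: int) -> dict:
        """Return hypothetical form attributes for a building of a given age."""
        if construction_year < 1919:
            return {"storeys": 3, "roof": "pitched", "wall": "solid brick"}
        elif construction_year < 1945:
            return {"storeys": 2, "roof": "hipped", "wall": "cavity brick"}
        elif construction_year < 1980:
            return {"storeys": 2, "roof": "flat or pitched", "wall": "cavity block"}
        else:
            return {"storeys": 4, "roof": "flat", "wall": "framed"}

    if __name__ == "__main__":
        for year in (1890, 1935, 1965, 2005):
            print(year, building_rule(year))
    ```

    In a full procedural model, many such rules would be chained and fed with real construction age data and regulation-derived constraints rather than hard-coded guesses.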

    Redesigning Experience Consumption in Social VR Worlds: Decentralised Value Creation, Mobilisation, and Exchanges

    Companies built on virtual reality technology have been developing user-driven open markets for over a decade. The most notable early player was Linden Lab, the service provider of Second Life, which launched its metaverse world in 2003. The main features of the service were collaborative VR creation interfaces, individual asset management systems, and a virtual currency named the Linden dollar. Through this interaction and interface structure, the residents of Second Life could create a pixel world of their own imagination, store the value of digital experiences, and exchange the value of imagination and experience, individually or collectively. A decade of life in the virtual world has taught us how people interact, communicate, and evaluate virtual goods and experiences. Recent head-mounted display (HMD) technology has ushered in a second round of the consumer Social VR platform race, one that is more immersive, realistic, and user-centred. Alongside this technological leap, Social VR platforms have drawn great attention from the public: Linden Lab's Project Sansar, High Fidelity, and AltSpaceVR, to name but a few. These platforms commonly feature co-presence, avatar embodiment, real-time collaboration, and communication across virtual spaces. Project Sansar has conceptually inherited the idea of “the world by residents” from Second Life and its pilot virtual monetisation system. Project Sansar launched a creator preview in late 2016 and recruited 3D builders from among Second Life business owners with the potential to open new businesses on the Project Sansar platform. After more than a year of creator preview operation, it had tackled major technical issues of building worlds on an HMD-enabled VR system and key storefront features of a consumer-based creation system. Beyond day-to-day troubleshooting, this raised questions about critical facets of consumer-based creation systems. First, crowdsourced world building needs a common ground as its planning body. How do residents build that common ground to give form to imagination? Even with an extremely decentralised structure, they ultimately need organisational steering that maintains rules and actions. In short, how can residents share a common architecture of a speculative nature, and how does it eventually transition into a cycle of planning, designing, building, and consuming? To address these research questions, we conducted a design game study that gathered data from participants who were second- and third-year students (N = 7) on an interaction design course. The participatory design methodology used a group simulation for creativity that revealed the influence of immersion and collaboration on collective design quality. The results highlight the key elements of user-driven innovation in virtual worlds and consumer-based platforms. From a practical perspective, the study offers design guidelines for consumer-based virtual reality services targeting decentralised monetary systems. Theoretically, it offers a potential contribution to complexity research that assimilates innovation models for multi-agent systems in technical sectors.

    Living with a Digital Twin: Operational management and engagement using IoT and Mixed Realities at UCL's Here East Campus on the Queen Elizabeth Olympic Park

    The concept of the Digital Twin is becoming increasingly popular with researchers and professionals in the AEC industries as a means of visualising, modelling and working with complex urban systems. This is achieved through the coupling of physical systems with comprehensive digital representations that automatically update to match the state of their physical counterpart. Using real-time data from IoT technologies and advanced 3D visualisation, this research conducts a practical investigation into the process of creating and working with a Digital Twin of the new UCL campus at Here East on the Queen Elizabeth Olympic Park.
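
    A minimal sketch of the coupling pattern described above, assuming a hypothetical DigitalTwin class and hard-coded sensor messages in place of a live IoT feed; it is not the project's actual implementation.

    ```python
    # Sketch: a digital twin that mirrors the latest state reported by physical
    # IoT sensors. The asset name, sensor names and message format are
    # illustrative assumptions.

    import json
    import time
    from dataclasses import dataclass, field


    @dataclass
    class DigitalTwin:
        asset_id: str
        state: dict = field(default_factory=dict)

        def apply_reading(self, message: str) -> None:
            """Update the twin's state from a JSON sensor message."""
            reading = json.loads(message)  # e.g. {"sensor": "room_temp", "value": 21.4}
            self.state[reading["sensor"]] = reading["value"]


    if __name__ == "__main__":
        twin = DigitalTwin(asset_id="here-east-lab")
        # In a real deployment these messages would arrive over MQTT or similar;
        # here they are hard-coded to keep the sketch self-contained.
        for msg in ('{"sensor": "room_temp", "value": 21.4}',
                    '{"sensor": "co2_ppm", "value": 615}'):
            twin.apply_reading(msg)
            time.sleep(0.1)
        print(twin.state)
    ```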

    Efficient Bayesian-based Multi-View Deconvolution

    Light sheet fluorescence microscopy is able to image large specimens with high resolution by imaging the samples from multiple angles. Multi-view deconvolution can significantly improve the resolution and contrast of the images, but its application has been limited due to the large size of the datasets. Here we present a Bayesian-based derivation of multi-view deconvolution that drastically improves the convergence time and provide a fast implementation utilizing graphics hardware. Comment: 48 pages, 20 figures, 1 table, under review at Nature Methods.
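
    For orientation, the sketch below shows a plain multi-view Richardson-Lucy update in one dimension, the classical baseline for this kind of Bayesian-derived deconvolution; the toy data and point spread functions are assumptions, and the paper's accelerated updates and GPU implementation are not reproduced here.

    ```python
    # 1-D multi-view Richardson-Lucy deconvolution: each view's observation is
    # compared against the current estimate blurred by that view's PSF, and the
    # estimate is multiplicatively corrected.

    import numpy as np


    def multiview_richardson_lucy(views, psfs, n_iter=50):
        """Fuse several blurred observations of the same underlying signal.

        views : list of 1-D observed arrays (one per viewing angle)
        psfs  : list of 1-D point spread functions, same order as views
        """
        estimate = np.full_like(views[0], views[0].mean(), dtype=float)
        for _ in range(n_iter):
            for obs, psf in zip(views, psfs):
                blurred = np.convolve(estimate, psf, mode="same")
                ratio = obs / np.maximum(blurred, 1e-12)  # avoid division by zero
                estimate *= np.convolve(ratio, psf[::-1], mode="same")
        return estimate


    if __name__ == "__main__":
        truth = np.zeros(64)
        truth[20] = truth[43] = 1.0
        psf_a = np.array([0.25, 0.5, 0.25])
        psf_b = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
        views = [np.convolve(truth, p, mode="same") for p in (psf_a, psf_b)]
        print(multiview_richardson_lucy(views, [psf_a, psf_b]).round(2))
    ```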

    Imaging the Two Gaps of the High-TC Superconductor Pb-Bi2Sr2CuO6+x

    The nature of the pseudogap state, observed above the superconducting transition temperature TC in many high temperature superconductors, is the center of much debate. Recently, this discussion has focused on the number of energy gaps in these materials. Some experiments indicate a single energy gap, implying that the pseudogap is a precursor state. Others indicate two, suggesting that it is a competing or coexisting phase. Here we report on temperature dependent scanning tunneling spectroscopy of Pb-Bi2Sr2CuO6+x. We have found a new, narrow, homogeneous gap that vanishes near TC, superimposed on the typically observed, inhomogeneous, broad gap, which is only weakly temperature dependent. These results not only support the two gap picture, but also explain previously troubling differences between scanning tunneling microscopy and other experimental measurements. Comment: 6 pages.

    Using zebrafish larval models to study brain injury, locomotor and neuroinflammatory outcomes following intracerebral haemorrhage.

    Intracerebral haemorrhage (ICH) is a devastating condition with limited treatment options, and current understanding of pathophysiology is incomplete. Spontaneous cerebral bleeding is a characteristic of the human condition that has proven difficult to recapitulate in existing pre-clinical rodent models. Zebrafish larvae are frequently used as vertebrate disease models and are associated with several advantages, including high fecundity, optical translucency and non-protected status prior to 5 days post-fertilisation. Furthermore, other groups have shown that zebrafish larvae can exhibit spontaneous ICH. The aim of this study was to investigate whether such models can be utilised to study the pathological consequences of bleeding in the brain, in the context of pre-clinical ICH research. Here, we compared existing genetic (bubblehead) and chemically inducible (atorvastatin) zebrafish larval models of spontaneous ICH and studied the subsequent disease processes. Through live, non-invasive imaging of transgenic fluorescent reporter lines and behavioural assessment, we quantified brain injury, locomotor function and neuroinflammation following ICH. We show that ICH in both zebrafish larval models is comparable in timing, frequency and location. ICH results in increased brain cell death and a persistent locomotor deficit. Additionally, in haemorrhaged larvae we observed a significant increase in macrophage recruitment to the site of injury. Live in vivo imaging allowed us to track active macrophage-based phagocytosis of dying brain cells 24 hours after haemorrhage. Morphological analyses and quantification indicated that an increase in overall macrophage activation occurs in the haemorrhaged brain. Our study shows that in zebrafish larvae, bleeding in the brain induces quantifiable phenotypic outcomes that mimic key features of human ICH. We hope that this methodology will enable the pre-clinical ICH community to adopt the zebrafish larval model as an alternative to rodents, supporting future high throughput drug screening and as a complementary approach to elucidating crucial mechanisms associated with ICH pathophysiology.

    Data mash-ups and the future of mapping

    The term 'mash-up' refers to websites that weave data from different sources into new Web services. The key to a successful Web service is to gather and use large datasets and harness the scale of the Internet through what is known as network effects. This means that data sources are just as important as the software that 'mashes' them, and one of the most profound pieces of data that a user has at any one time is his or her location. In the past this was a somewhat fuzzy concept, perhaps as vague as a verbal reference to being in a particular shop or café or an actual street address. Recent events, however, have changed this. In the 1990s, President Bill Clinton's policy decision to open up military GPS satellite technology for 'dual-use' (military and civilian) resulted in a whole new generation of location-aware devices. Around the same time, cartography and GIScience were also undergoing dramatic, Internet-induced changes. Traditional, resource intensive processes and established organizations, in both the public and private sectors, were being challenged by new, lightweight methods. The upshot has been that map making, geospatial analysis and related activities are undergoing a process of profound change. New players have entered established markets and disrupted routes to knowledge and, as we have already seen with Web 2.0, newly empowered amateurs are part of these processes. Volunteers are quite literally grabbing a GPS unit and hitting the streets of their local town to help create crowdsourced datasets that are uploaded to both open source and proprietary databases. The result is an evolving landscape which Tim O'Reilly, proponent of Web 2.0 and always ready with a handy moniker, has labelled Where 2.0. Others prefer the GeoWeb, Spatial Data Infrastructure, Location Infrastructure, or perhaps just location based services. Whatever one might call it, there are a number of reasons why its development should be of interest to those in higher and further education. Firstly, since a person's location is such a profound unit of information and of such value to, for example, the process of targeting advertising, there has been considerable investment in Web 2.0-style services that make use of it. Understanding these developments may provide useful insights for how other forms of data might be used. Secondly, education, particularly research, is beginning to realize the huge potential of the data mash-up concept. As Government, too, begins to get involved, it is likely that education will be expected to take advantage of, and indeed come to relish, the new opportunities for working with data. This TechWatch report describes the context for the changes that are taking place and explains why the education community needs to understand the issues around how to open up data, how to create mash-ups that do not compromise accuracy and quality and how to deal with issues such as privacy and working with commercial and non-profit third parties. It also shows how data mash-ups in education and research are part of an emerging, richer information environment with greater integration of mobile applications, sensor platforms, e-science, mixed reality, and semantic, machine-computable data and speculates on how this is likely to develop in the future.
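
    As a toy illustration of the mash-up idea described above (joining independently sourced datasets on a shared location key), the sketch below uses invented place names and figures rather than any real data source or API.

    ```python
    # Sketch of a data mash-up: merge two location-keyed datasets into one
    # record per place. The datasets here are made-up illustrations.

    def mashup(by_location_a, by_location_b):
        """Combine two location-keyed datasets into one record per place."""
        combined = {}
        for place in set(by_location_a) | set(by_location_b):
            combined[place] = {**by_location_a.get(place, {}),
                               **by_location_b.get(place, {})}
        return combined


    if __name__ == "__main__":
        air_quality = {"Camden": {"pm25": 11.2}, "Hackney": {"pm25": 9.8}}
        cycle_hires = {"Camden": {"hires": 5400}, "Islington": {"hires": 4100}}
        for place, record in sorted(mashup(air_quality, cycle_hires).items()):
            print(place, record)
    ```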

    Legacy iron and steel wastes in the UK: Extent, resource potential, and management futures

    This is the author accepted manuscript. The final version is available on open access from Elsevier via the DOI in this record. The iron and steel industry has a long tradition of bulk reuse of slags for a range of construction applications. Growing interest in recent years has seen slag resource recovery options extend to critical raw material recovery and atmospheric carbon capture. Full scale deployment of such technologies is currently limited in part by absent or partial inventories of slag deposit locations, data on composition, and volume estimates in many jurisdictions. This paper integrates a range of spatial information to compile a database of iron and steel slag deposits in mainland United Kingdom (UK) for the first time and evaluate the associated resource potential. Over 190 million tonnes of legacy iron and steel slag are present across current and former iron and steel working regions of the UK, with particular concentrations in the north west and north east of England, and central Scotland. While significant potential stockpiles of blast furnace and basic oxygen furnace slag could provide up to 0.9 million tonnes of vanadium and a cumulative carbon dioxide capture potential of 57–138 million tonnes, major management challenges for resource recovery are apparent. Over one third of the deposits are located in close proximity to designated conservation areas, which may limit resource recovery. Furthermore, land use analyses show that many of the sites have already been redeveloped for housing (nearly 30% urban cover). Deposits from recent decades in current or recently closed steel-working areas may have the greatest potential for resource recovery, where such ambitions could be coupled with site restoration and regeneration efforts. Funding: Natural Environment Research Council (NERC); Engineering and Physical Sciences Research Council (EPSRC); Economic and Social Research Council (ESRC); UK Department for Business, Energy and Industrial Strategy (BEIS).
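
    As a reading aid only, the arithmetic below restates the implied per-tonne capture potential using the figures quoted in this abstract; it is not an analysis from the paper itself.

    ```python
    # Back-of-envelope arithmetic using only the figures quoted in the abstract
    # (over 190 Mt of slag, 57-138 Mt CO2 capture potential).

    SLAG_MT = 190.0                          # legacy iron and steel slag, million tonnes
    CO2_LOW_MT, CO2_HIGH_MT = 57.0, 138.0    # quoted capture potential range, Mt

    low = CO2_LOW_MT / SLAG_MT
    high = CO2_HIGH_MT / SLAG_MT
    print(f"Implied capture potential: {low:.2f}-{high:.2f} t CO2 per tonne of slag")
    # Roughly 0.30-0.73 t CO2 per tonne of slag across the quoted range.
    ```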

    Osteoarticular Infection in Three Young Thoroughbred Horses Caused by a Novel Gram Negative Cocco-Bacillus

    © 2020 Bernard J. Hudson et al. We describe three cases of osteoarticular infection (OAI) in young thoroughbred horses in which the causative organism was identified by MALDI-TOF as a Kingella species. The pattern of OAI resembled that reported with Kingella infection in humans. Analysis by 16S rRNA PCR enabled construction of a phylogenetic tree that placed the isolates closer to Simonsiella and Alysiella species than to Kingella species. However, average nucleotide identity (ANI) comparison between the new isolate and both Kingella kingae and Alysiella crassa revealed a low probability that the new isolate belonged to either of these species. This preliminary analysis suggests the organism isolated is a previously unrecognised species.
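
    For readers unfamiliar with the statistic, the sketch below is a heavily simplified stand-in for ANI: real pipelines fragment whole genomes and align each fragment against the comparison genome, whereas this toy function just scores identity over one pre-aligned pair of invented sequences to show what the measure captures.

    ```python
    # Simplified illustration of the idea behind average nucleotide identity:
    # the fraction of matching bases over aligned, non-gap positions.

    def percent_identity(aligned_a: str, aligned_b: str) -> float:
        """Percent identity over aligned, non-gap positions of equal-length strings."""
        pairs = [(a, b) for a, b in zip(aligned_a, aligned_b) if a != "-" and b != "-"]
        matches = sum(a == b for a, b in pairs)
        return 100.0 * matches / len(pairs)


    if __name__ == "__main__":
        seq_a = "ATGCC-GATTACA"   # toy pre-aligned sequences, not real isolate data
        seq_b = "ATGCCTGATTGCA"
        print(f"{percent_identity(seq_a, seq_b):.1f}% identity")
    ```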

    WNT signaling regulates self-renewal and differentiation of prostate cancer cells with stem cell characteristics

    Prostate cancer cells with stem cell characteristics were identified in human prostate cancer cell lines by their ability to form, from single cells, self-renewing prostaspheres in non-adherent cultures. Prostaspheres exhibited heterogeneous expression of proliferation, differentiation and stem cell-associated markers CD44, ABCG2 and CD133. Treatment with WNT inhibitors reduced both prostasphere size and self-renewal. In contrast, addition of Wnt3a increased prostasphere size and self-renewal, which was associated with a significant increase in nuclear β-catenin, keratin 18, CD133 and CD44 expression. As a high proportion of LNCaP and C4-2B cancer cells express androgen receptor, we determined the effect of the androgen receptor antagonist bicalutamide. Androgen receptor inhibition reduced prostasphere size and expression of PSA, but did not inhibit prostasphere formation. These effects are consistent with the androgen-independent self-renewal of cells with stem cell characteristics and the androgen-dependent proliferation of transit amplifying cells. As the canonical WNT signaling effector β-catenin can also associate with the androgen receptor, we propose a model for tumour propagation involving a balance between WNT and androgen receptor activity that affects the self-renewal of cancer cells with stem cell characteristics and drives transit amplifying cell proliferation and differentiation. In conclusion, we provide evidence that WNT activity regulates the self-renewal of prostate cancer cells with stem cell characteristics independently of androgen receptor activity. Inhibition of WNT signaling therefore has the potential to reduce the self-renewal of prostate cancer cells with stem cell characteristics and improve the therapeutic outcome. Peer reviewed.