    Interactive and collaborative blended learning for undergraduates

    This is an ESCalate research project from 2008 led by the University of Exeter. The project investigated ways of using new technologies for collaborative online learning in a blended learning context. A variety of interactive online learning tasks and e-learning tools, such as wikis, discussion forums and concept maps, were used for both independent learning and assessment. The research set out to show whether a more flexible approach to the use of these new technologies could promote engagement and raise the perceived quality of students' learning experience, giving the undergraduate group greater e-learning confidence and better participation in online critical discussion and collaborative work. Additional outcomes were the development of tutors' online tutoring skills and the opportunity to trial a range of blended learning materials and methodologies. The project involved 92 first-year undergraduates from the Education Studies and Childhood and Youth Studies degree programmes, following a newly constructed blended learning module.

    A new sample of X-ray selected narrow emission-line galaxies. I. The nature of optically elusive AGN

    Using the 3XMM catalogue of serendipitous X-ray sources and the SDSS-DR9 spectroscopic catalogue, we have obtained a new sample of X-ray selected narrow emission-line galaxies. The standard optical diagnostic diagram, combined with selection by hard X-ray luminosity, exposes a mismatch between the optically based and X-ray based classifications. The nature of these misclassified, elusive AGN can be understood in terms of their broader X-ray and optical properties, which divide this sub-sample into two groups. A little more than half are likely to be narrow-line Seyfert 1s (NLS1s), misclassified because of the contribution of the Broad Line Region (BLR) to their optical spectra. The remainder have some of the properties of Seyfert 2 (Sy2) AGN; their optical elusiveness can be explained by optical dilution from the host galaxy plus a star-formation contribution, and by underluminous optical emission due to low accretion rates. Because some of the Sy2 sources have very low accretion rates, are unabsorbed, and lack broad optical emission lines, they are good candidates to be True Sy2 AGN. Comment: 13 pages, 13 figures, accepted for publication in A&A
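The "standard optical diagnostic diagram" referred to here is commonly the BPT diagram, which separates star-forming galaxies, composites and AGN by their narrow-line flux ratios. As a hedged sketch of how such a classification is typically applied (the Kewley 2001 and Kauffmann 2003 demarcation lines are standard in the literature, not details taken from this abstract):

```python
import math

def bpt_class(nii_ha: float, oiii_hb: float) -> str:
    """Classify a narrow emission-line galaxy on the [NII] BPT diagram.

    nii_ha  -- flux ratio [NII]6584 / H-alpha
    oiii_hb -- flux ratio [OIII]5007 / H-beta
    """
    x = math.log10(nii_ha)
    y = math.log10(oiii_hb)
    # Kewley et al. (2001) theoretical "maximum starburst" line
    kewley = 0.61 / (x - 0.47) + 1.19 if x < 0.47 else float("-inf")
    # Kauffmann et al. (2003) empirical star-forming boundary
    kauffmann = 0.61 / (x - 0.05) + 1.30 if x < 0.05 else float("-inf")
    if x < 0.05 and y < kauffmann:
        return "star-forming"   # below both demarcation lines
    if x < 0.47 and y < kewley:
        return "composite"      # between Kauffmann and Kewley lines
    return "AGN"                # above the maximum-starburst line
```

An "optically elusive" AGN in the sense of this abstract is one that lands in the star-forming or composite region despite an AGN-like hard X-ray luminosity.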

    The UK's global gas challenge

    A UKERC Research Report exploring the UK's global gas challenge. The report takes an interdisciplinary perspective, marrying energy security insights from politics and international relations with detailed empirical understanding from energy studies, and with perspectives from economic geography that emphasise the spatial distribution of the actors, networks and resource flows that comprise the global gas industry. Natural gas production in the UK peaked in 2000, and in 2004 the UK became a net importer. A decade later, the UK imports about half of the natural gas it consumes. The central thesis of the project on which this report is based is that, as the UK's gas import dependence has grown, it has effectively been 'globalising' its gas security; consequently, UK consumers are increasingly exposed to events in global gas markets.

    A bridge to a low carbon future? Modelling the long-term global potential of natural gas

    This project uses the global TIMES Integrated Assessment Model at UCL ('TIAM-UCL') to provide robust quantitative insights into the future of natural gas in the energy system, and in particular into whether gas has the potential to act as a 'bridge' to a low-carbon future, on both a global and a regional basis, out to 2050. The report first explores the dynamics of a scenario that disregards any need to cut greenhouse gas (GHG) emissions. Such a scenario results in a large uptake in the production and consumption of all fossil fuels, with coal in particular dominating the electricity system. Unconventional sources account for much of the rise in natural gas production, with shale gas exceeding 1 Tcm after 2040. Gas consumption grows in all sectors apart from the electricity sector, and gas eventually becomes cost-effective both as a marine fuel (as liquefied natural gas) and in medium goods vehicles (as compressed natural gas). The report next examines how different gas market structures affect natural gas production, consumption, and trade patterns. Two scenarios were constructed: one continues current regionalised gas markets, which are characterised by very different prices in different regions, often based on oil indexation; the other allows a global gas price to form based on gas supply-demand fundamentals. It finds only a small change in overall global gas production levels between the two, but a major difference in levels of gas trade, and so concludes that if gas exporters choose to defend oil indexation in the short term, they may end up destroying their export markets in the longer term. A move towards pricing gas internationally, based on supply-demand dynamics, is thus shown to be crucial if exporters are to maintain their current levels of exports.
Nevertheless, it is also shown that, regardless of how gas is priced in the future, scenarios leading to a 2 °C temperature rise generally have larger pipeline and LNG exports than scenarios that lead to a higher temperature increase. For pipeline trade, the adoption of an ambitious emissions reduction agreement results in little loss of markets and could, if carbon capture and storage is available, actually lead to a much greater level of exports. For LNG trade, because of the significant role that gas can play in replacing future coal demand in the emerging economies of Asia, markets that are largely supplied by LNG at present, we demonstrate that exporting countries should actively pursue an ambitious global agreement on GHG emissions mitigation if they want to expand their exports. These results thus have important implications for the negotiating positions of gas-exporting countries in the ongoing discussions on reaching an ambitious global agreement on emissions reduction.

    The International Space Station Habitat

    The International Space Station (ISS) is an engineering project unlike any other: the vehicle is inhabited and operational while construction goes on. The habitability resources available to the crew are the crew sleep quarters, the galley, the waste and hygiene compartment, and exercise equipment. These items are mainly in the Russian Service Module, and their placement is awkward for the crew to deal with. ISS assembly will continue with the truss build and the addition of the International Partner laboratories; Nodes 2 and 3 will also be added. The Node 2 module will provide additional stowage volume and room for more crew sleep quarters, while the Node 3 module will provide additional Environmental Control and Life Support capability. The purpose of the ISS is to perform research, and a major area of emphasis is the effects of long-duration space flight on humans; as a result of this research, the habitability requirements for long-duration space flight will be determined.

    Measuring Computer Forensics Skill

    Computer forensic analysts combine their technical skills with their forensic aptitude to recover information from computers and storage devices. Most technology professionals demonstrate expertise through the acquisition of professional certifications. Certifications, however, are not always a valid measure of skill, because they take the form of written and applied tests. It is common for people to forget knowledge and skills that are not routinely practiced; the same applies to technology certifications. One must practice the skills learned for a certification test consistently in order to convert them to long-term memory. "Cognitive processes play a prominent role in the acquisition and retention of new behavior patterns" (Bandura 1977, p. 192). As a skill is practiced, it is better retained. Given the current inability to accurately measure an individual's skills and understanding of computer forensics principles, this research investigates how to measure proficiency among professionals and novices. Recent research applied conceptual expertise within the context of computer security (Giboney et al. 2016), using a technique to quickly measure the difference between novices and experts. Following their guidelines, we propose to do the same for computer forensics expertise, with the following research question: what knowledge, skills and abilities need to be demonstrated in a measure that assesses computer forensics expertise? Conceptual expertise is the understanding of the theoretical concepts in a topic area and of their relationships. The SEAM process (Giboney et al. 2016) gauges the practical application of situations so that experts can show their conceptual expertise.
The conceptual expertise task is based on the idea that those with surface-level knowledge will group scenarios by surface features, while experts will group the same scenarios by deep features (Giboney et al. 2016). The assessment is designed to measure understanding of basic computer forensics processes. It consists of twenty-five situations created to highlight different stages of the digital forensic process. These situations focus on a gender-neutral individual, Jordan, and the tasks they perform given certain parameters. Survey takers group the situations either by stage of forensics or by the crime the task is involved with. We will show that the assessment can accurately determine an individual's understanding of computer forensics. Once this is shown, the assessment could be used in a variety of ways, including initial assessments of job candidates and pre- and post-tests for computer forensics classes.
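The surface-versus-deep grouping idea lends itself to a simple pairwise-agreement score: a participant's grouping is compared, pair by pair, with an expert grouping by forensic stage. The sketch below is illustrative only, not the SEAM scoring procedure itself; the function name and the toy data are hypothetical:

```python
from itertools import combinations

def grouping_agreement(participant: dict, expert: dict) -> float:
    """Fraction of scenario pairs on which a participant's grouping agrees
    with the expert (deep-feature) grouping.

    Both arguments map scenario id -> group label; a pair 'agrees' when
    both groupings place it together, or both place it apart.
    """
    scenarios = sorted(participant)
    agree = total = 0
    for a, b in combinations(scenarios, 2):
        total += 1
        same_p = participant[a] == participant[b]  # grouped together by participant?
        same_e = expert[a] == expert[b]            # grouped together by expert?
        agree += same_p == same_e
    return agree / total

# Toy data: an expert groups situations by forensic stage, a novice by crime type
expert = {"s1": "acquisition", "s2": "acquisition", "s3": "analysis", "s4": "reporting"}
novice = {"s1": "fraud", "s2": "theft", "s3": "fraud", "s4": "theft"}
```

A score near 1.0 would indicate deep-feature (expert-like) grouping; scores near chance would indicate surface-level grouping.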

    Testing of Highly Accurate Blackbodies

    Many organizations, including Space Dynamics Laboratory, have built blackbodies with calculated emissivities of 0.995 to 0.9999 and estimated radiance temperature uncertainties of a few hundred mK or less. However, the calculated performance has generally not been demonstrated through testing or comparison with other high-performance blackbodies. Intercomparison is valuable; historically, when equipment or experimental results have been intercompared, they have often been found to disagree by more than the claimed uncertainties. Blackbody testing has been limited because testing at the required accuracy (0.1% or better in radiance) is a significant expense. Such testing becomes essential when proven, SI-traceable, absolute accuracy is required, as for the CLARREO mission, which has an absolute accuracy requirement of 0.1 K (3 sigma) at 220 K over most of the thermal infrared and needs high-performance blackbodies to support this requirement. Properly testing blackbodies requires direct measurement of emissivity and accurate measurement of radiance, or comparison of the radiance from two blackbodies. This presentation will discuss these testing needs, various types of testing, and test results for a CLARREO prototype blackbody.
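To see why a 0.1% radiance requirement translates into temperature uncertainties of a few tens of mK, one can differentiate the Planck function. The sketch below uses the abstract's 220 K scene temperature; the 10 μm wavelength is an illustrative assumption, not a figure from the abstract:

```python
import math

C2 = 1.438777e-2  # second radiation constant c2, in m*K

def planck_radiance(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance (common constant factor dropped,
    since only the *relative* change with temperature matters here)."""
    return wavelength_m**-5 / (math.exp(C2 / (wavelength_m * temp_k)) - 1.0)

def temp_error_for_radiance_error(wavelength_m, temp_k, rel_radiance_err):
    """Temperature error (K) that produces a given relative radiance
    error, estimated by a small finite difference."""
    dT = 1e-4
    L = planck_radiance(wavelength_m, temp_k)
    dL_dT = (planck_radiance(wavelength_m, temp_k + dT) - L) / dT
    return rel_radiance_err * L / dL_dT

# 0.1% in radiance at 10 um and 220 K corresponds to a few tens of mK
dt = temp_error_for_radiance_error(10e-6, 220.0, 1e-3)
```

The result is on the order of 30 mK, consistent with the abstract's point that sub-0.1% radiance testing is needed to verify uncertainties of a few hundred mK or less.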

    A High-Accuracy Blackbody for CLARREO

    The NASA climate science mission Climate Absolute Radiance and Refractivity Observatory (CLARREO), which is to measure Earth's emitted spectral radiance from orbit for 5 years, has an absolute accuracy requirement of 0.1 K (3σ) at 220 K over most of the thermal infrared. To meet this requirement, CLARREO needs highly accurate on-board blackbodies that remain accurate over the life of the mission. Space Dynamics Laboratory is developing a prototype blackbody that demonstrates the ability to meet the needs of CLARREO. This prototype is based on a blackbody design currently in use, which is relatively simple to build, was developed for use on the ground or on orbit, and is readily scalable in aperture size and required performance. We expect the CLARREO prototype to have an emissivity of ~0.9999 from 1.5 to 50 μm, temperature uncertainties of ~25 mK (3σ), and radiance uncertainties of ~10 mK due to temperature gradients. The high emissivity and low thermal-gradient uncertainties are achieved through cavity design, while the SI-traceable temperature uncertainty is attained through the use of phase-change materials (mercury, gallium, and water) in the blackbody. Blackbody temperature sensor calibration is maintained over time by comparing sensor readings to the known melt temperatures of these materials, which are observed by heating the blackbody through their melt points. Since blackbody emissivity can potentially change over time due to changes in surface emissivity (especially for an on-orbit blackbody), an on-board means of detecting emissivity change is desired. The prototype blackbody will include an emissivity monitor based on a quantum cascade laser to demonstrate the concept.
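The melt-point scheme described above amounts to re-fitting the sensor's calibration against known fixed points. A minimal sketch, assuming a simple gain-plus-offset sensor error model (an assumption, not the prototype's actual procedure); the fixed-point temperatures are standard ITS-90 values, not figures from this abstract:

```python
# ITS-90 fixed points (K): Hg triple point, H2O triple point, Ga melting point
FIXED_POINTS = {"Hg": 234.3156, "H2O": 273.16, "Ga": 302.9146}

def fit_sensor_correction(readings: dict) -> tuple:
    """Least-squares fit of true = gain * reading + offset, from the
    sensor's indicated temperatures at the observed melt plateaus."""
    xs = [readings[m] for m in FIXED_POINTS]      # sensor readings
    ys = [FIXED_POINTS[m] for m in FIXED_POINTS]  # known true temperatures
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    gain = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    offset = my - gain * mx
    return gain, offset

def correct(reading: float, gain: float, offset: float) -> float:
    """Apply the fitted correction to an arbitrary sensor reading."""
    return gain * reading + offset

# Example: a sensor reading uniformly 50 mK high at each plateau
readings = {"Hg": 234.3656, "H2O": 273.21, "Ga": 302.9646}
gain, offset = fit_sensor_correction(readings)
```

Repeating the fit over the mission lifetime is what keeps the temperature scale SI-traceable even as the sensors drift.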

    A Platform for the Analysis of Qualitative and Quantitative Data about the Built Environment and its Users

    There are many scenarios in which it is necessary to collect data from multiple sources in order to evaluate a system, including both quantitative data, from sensors and smart devices, and qualitative data, such as observations and interview results. However, very few systems currently enable both of these data types to be combined in such a way that they can be analysed side by side. This paper describes an end-to-end system for the collection, analysis, storage and visualisation of qualitative and quantitative data, developed using the e-Science Central cloud analytics platform. We describe the experience of developing the system, based on a case study that involved collecting data about the built environment and its users. In this case study, data was collected from older adults living in residential care: sensors were placed throughout the care home and smart devices were issued to the residents. The sensor data was uploaded to the analytics platform, and the processed results were stored in a data warehouse, where they were integrated with qualitative data collected by healthcare and architecture researchers. Visualisations are also presented that allow the data to be explored and potential correlations between the quantitative and qualitative data to be investigated.
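The integration step described above, aligning sensor readings with researchers' qualitative notes, can be sketched as a simple time-window join. This is a minimal standard-library sketch; the field names, window size, and toy data are hypothetical, not the platform's actual schema:

```python
from datetime import datetime, timedelta

def align(sensor_readings, observations, window=timedelta(minutes=30)):
    """Attach to each qualitative observation the quantitative sensor
    readings recorded within `window` of it (a time-window join standing
    in for the warehouse integration step described in the abstract).

    sensor_readings -- list of (datetime, sensor_id, value)
    observations    -- list of (datetime, note)
    """
    joined = []
    for obs_time, note in observations:
        nearby = [(sid, val) for t, sid, val in sensor_readings
                  if abs(t - obs_time) <= window]
        joined.append({"time": obs_time, "note": note, "readings": nearby})
    return joined

# Usage: one observation picks up the reading taken ten minutes earlier
sensor_readings = [
    (datetime(2015, 1, 1, 10, 0), "temp_lounge", 21.5),
    (datetime(2015, 1, 1, 12, 0), "temp_lounge", 22.0),
]
observations = [(datetime(2015, 1, 1, 10, 10), "resident seated near the window")]
joined = align(sensor_readings, observations)
```

Once joined like this, each qualitative note can be inspected alongside the environmental conditions recorded around it, which is the side-by-side analysis the paper aims at.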