
    Research groups: How big should they be?

    Understanding the relationship between scientific productivity and research group size is important for deciding how science should be funded. We have investigated the relationship between these variables in the life sciences in the United Kingdom using data from 398 principal investigators (PIs). We show that three measures of productivity, the number of publications, the impact factor of the journals in which papers are published and the number of citations, are all positively correlated with group size, although they all show a pattern of diminishing returns: doubling group size leads to less than a doubling in productivity. The relationships for the impact factor and the number of citations are extremely weak. Our analyses suggest that an increase in productivity will be achieved by funding more PIs with small research groups, unless the cost of employing post-docs and PhD students is less than 20% of the cost of a PI. We also provide evidence that post-docs are more productive than PhD students, both in terms of the number of papers they produce and where those papers are published.
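    The diminishing-returns pattern described here corresponds to a power-law fit, productivity ≈ a·size^b with b < 1. A minimal sketch of such a fit, on invented data (the exponent 0.6, the group sizes, and the noise level are all assumptions for illustration, not the study's figures):

    ```python
    import numpy as np

    # Synthetic illustration only: generate group sizes and paper counts that
    # follow productivity = a * size^b with an assumed exponent b = 0.6, then
    # recover b with a log-log regression. b < 1 means doubling group size
    # multiplies output by 2^b < 2, i.e. diminishing returns.
    rng = np.random.default_rng(0)
    size = rng.integers(1, 15, 200)                            # group sizes
    papers = 2.0 * size ** 0.6 * rng.lognormal(0.0, 0.3, 200)  # noisy output

    # Fit log(papers) = b * log(size) + log(a); polyfit returns slope first.
    b, log_a = np.polyfit(np.log(size), np.log(papers), 1)
    print(b < 1.0)      # diminishing returns: doubling size gives 2^b < 2
    ```

    The same regression on the study's real PI data would yield the exponents the abstract alludes to; here the recovered slope simply lands near the assumed 0.6.
    
    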

    Synthesis and Characterization of Metal Oxide Nanostructures for Efficient Gas Sensor Applications


    Synthesis, Characterization and Sensing Properties of Metal-Oxide Nanostructures


    Dataflow Programming and Acceleration of Computationally-Intensive Algorithms

    The volume of unstructured textual information continues to grow due to recent technological advancements. This has resulted in an exponential growth of information generated in various formats, including blogs, posts, social networking, and enterprise documents. Numerous Enterprise Architecture (EA) documents are also created daily, such as reports, contracts, agreements, frameworks, architecture requirements, designs, and operational guides. The processing and computation of this massive amount of unstructured information necessitate substantial computing capabilities and the implementation of new techniques. It is critical to manage this unstructured information through a centralized knowledge management platform. Knowledge management is the process of managing information within an organization: creating, collecting, organizing, and storing information in a way that makes it easily accessible and usable. The research involved the development of a textual knowledge management system, and two use cases were considered for extracting textual knowledge from documents. The first case study focused on the safety-critical documents of a railway enterprise. Safety is of paramount importance in the railway industry, and several EA documents, including manuals, operational procedures, and technical guidelines, contain critical information. Digitalization of these documents is essential for analysing the vast amount of textual knowledge they contain, to improve the safety and security of railway operations. A case study was conducted between the University of Huddersfield and the Rail Safety and Standards Board (RSSB) to analyse EA safety documents using Natural Language Processing (NLP). A graphical user interface was developed that includes various document-processing features such as semantic search, document mapping, text summarization, and visualization of key trends.
    For the second case study, open-source data was utilized and textual knowledge was extracted. Several features were also developed, including kernel distribution, analysis of key trends, and sentiment analysis of words (such as unique, positive, and negative words) within the documents. Additionally, a heterogeneous framework was designed using CPUs/GPUs and FPGAs to analyse the computational performance of document mapping.
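    The key-trend and sentiment features described above can be sketched with standard-library tools. This is a toy illustration under stated assumptions: the word lists and documents are invented placeholders, and the thesis's actual system uses full NLP pipelines and a GUI, not this code.

    ```python
    import re
    from collections import Counter

    # Hypothetical sentiment lexicons for illustration only.
    POSITIVE = {"safe", "improved", "reliable"}
    NEGATIVE = {"failure", "hazard", "delay"}

    def key_trends(docs, top_n=3):
        """Most frequent words across documents: a crude key-trend proxy."""
        words = [w for d in docs for w in re.findall(r"[a-z]+", d.lower())]
        return Counter(words).most_common(top_n)

    def sentiment_tally(doc):
        """Count positive vs negative lexicon words in one document."""
        words = re.findall(r"[a-z]+", doc.lower())
        return {"positive": sum(w in POSITIVE for w in words),
                "negative": sum(w in NEGATIVE for w in words)}

    docs = ["Signal failure caused a delay on the line.",
            "Improved procedures keep operations safe and reliable."]
    print(key_trends(docs))
    print(sentiment_tally(docs[0]))   # {'positive': 0, 'negative': 2}
    ```

    Scaling this to the document volumes the thesis targets is what motivates the CPU/GPU/FPGA framework it describes.
    
    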

    Effect of pre-storage salicylic acid, calcium chloride and 2,4-dichlorophenoxyacetic acid dipping on chilling injury and quality of ‘Taify’ cactus pear fruit during cold storage

    The effects of pre-storage salicylic acid (SA), calcium chloride (CaCl2) and 2,4-dichlorophenoxyacetic acid (2,4-D) treatments on chilling injury (CI) and quality of cactus pear fruit during storage were investigated. The results showed that SA application at 2.0, 3.0 or 4.0 mM significantly decreased the CI index compared to all the other treatments. Increasing the SA rate to 3.0 or 4.0 mM did not result in a further reduction in CI index. However, CaCl2 (at 2, 3, and 4%) and 2,4-D (at 100, 150 and 200 ppm) had no effect on CI index. CI increased during storage and was higher at 30 than at 10 and 20 days of storage. Weight loss was not affected by any of the treatments but was higher at 10 than at 20 and 30 days of storage. Decay was not affected by any of the treatments but was higher at 30 than at 10 and 20 days of storage. Firmness was higher at 200 ppm 2,4-D than in all the other treatments. Fruit acidity was not affected by any of the applied treatments but was lower at 20 and 30 days than at 10 days of storage. The pH of fruit juice increased during 30 days of storage and was lower in the SA treatments than in the control. Total soluble solids (TSS) concentration was higher in the control than in all the other treatments, except for the SA at 4.0 mM treatment. TSS concentration was higher at 10 than at 20 and 30 days of storage. Vitamin C concentration was lower at 20 than at 10 and 30 days of storage and was lower in the CaCl2 treatments than in the control. Total phenols concentration increased during 30 days of storage and was lower in the CaCl2 and 2,4-D treatments than in the control. It was concluded that pre-storage SA dipping at 2.0 mM reduced chilling injury and retained quality of cactus fruit.
    Key words: Cactus pear, salicylic acid, CaCl2, 2,4-D, chilling injury, storage

    Renewable Sources of Energy in Pakistan

    This paper is divided into three parts. Part I sets the background of energy for development and some features of the Pakistan situation. Part II shows the need for renewable sources and introduces their likely contribution in the near future (2000 A.D.). In the third part, we examine the various renewable sources of energy to obtain estimates of their economics and give broad recommendations.

    A generic approach, employing information systems, for introducing manufacturing information systems in SMEs

    This thesis presents an approach which small and medium-sized firms can use in-house to introduce manufacturing information systems. The approach developed is generic and employs information system design and analysis techniques to guide small and medium-sized enterprises (SMEs) from the specification of their need right through to the implementation of an appropriate solution. Although various tools and methodologies are available for the needs of large organisations, none are available for SMEs. Therefore, the approach presented in this thesis provides original and significant improvements on current practice. The approach emphasises the importance of taking a company-wide approach to analysing systems throughout the various departments, to establish bad practices and system flaws which may impinge on the performance of manufacturing operations. The research involved three independent stages. The first stage was the identification of the problem, which was realised from two sources: a literature survey and interviews with case study company managers. The second stage was the development of a novel approach. The final stage was the validation of the approach by implementing it in five different SMEs in the Devon and Cornwall region. Through the use of this work, companies are encouraged to improve ownership of and commitment to their manufacturing information systems by fully involving the relevant company personnel in identifying and resolving various problems. The approach proposed also helps managers understand how the various processes work in other areas of the company, and can subsequently lead to improvements in other departments.

    The Prevent strategy and the UK ‘war on terror’: embedding infrastructures of surveillance in Muslim communities

    The Prevent policy was introduced in the UK in 2003 as part of an overall post-9/11 counter-terrorism approach (CONTEST), with the aim of preventing the radicalisation of individuals to terrorism. In 2015, the Prevent policy became a legal duty for public sector institutions, and as such, its reach has extended much deeper into society. This article, based on ongoing ethnographic fieldwork, including interviews, focus groups and participant observations, seeks to uncover and analyse the function of surveillance at the heart of the Prevent strategy. Contrary to official denials, surveillance forms an essential feature of the Prevent strategy. It regards radicalisation as part of an overall conveyor belt to terrorism, and thus attempts to control the future by acting in the present. The article shows how the framing of the terror threat in the 'war on terror' as an 'Islamic threat' has afforded a surveillance infrastructure, embedded in Muslim communities, which has securitised relations with local authorities. Its intelligence products, as well as the affective consequences of surveillance, have served to contain and direct Muslim political agency. Such an analysis uncovers the practice of Islamophobia at the heart of the Prevent strategy, which accounts for its surveillance tendencies.