
    To share or not to share: Publication and quality assurance of research data outputs. A report commissioned by the Research Information Network

    A study of current practices with respect to data creation, use, sharing and publication in eight research disciplines (systems biology, genomics, astronomy, chemical crystallography, rural economy and land use, classics, climate science, and social and public health science). The study looked at data creation and care, motivations for sharing data, discovery, access and usability of datasets, and quality assurance of data in each discipline.

    TLAD 2010 Proceedings: 8th international workshop on teaching, learning and assessment of databases (TLAD)

    This is the eighth in the series of highly successful international workshops on the Teaching, Learning and Assessment of Databases (TLAD 2010), which is once again held as a workshop of BNCOD 2010, the 27th International Information Systems Conference. TLAD 2010 is held on 28th June at the beautiful Dudhope Castle at Abertay University, just before BNCOD, and hopes to be just as successful as its predecessors.

    The teaching of databases is central to all Computing Science, Software Engineering, Information Systems and Information Technology courses, and this year the workshop aims to continue the tradition of bringing together both database teachers and researchers, in order to share good learning, teaching and assessment practice and experience, and to further the growing community amongst database academics. As well as attracting academics from the UK community, the workshop has also been successful in attracting academics from the wider international community, through serving on the programme committee, and attending and presenting papers.

    This year the workshop includes an invited talk given by Richard Cooper (of the University of Glasgow), who will present a discussion and some results from the Database Disciplinary Commons held in the UK over the academic year. Owing to the healthy number of high-quality submissions this year, the workshop will also present seven peer-reviewed papers and six refereed poster papers. Of the seven presented papers, three will be presented as full papers and four as short papers. These papers and posters cover a number of themes, including: approaches to teaching databases, e.g. group-centred and problem-based learning; use of novel case studies, e.g. forensics and XML data; techniques and approaches for improving teaching and student learning processes; assessment techniques, e.g. peer review; methods for improving students' abilities to develop database queries and E-R diagrams; and e-learning platforms for supporting teaching and learning.

    Initiating organizational memories using ontology network analysis

    One of the important problems with organizational memories is their initial set-up. It is difficult to choose the right information to include in an organizational memory, yet the right information is a prerequisite for maximizing the uptake and relevance of the memory's content. To tackle this problem, most developers adopt heavy-weight solutions, relying on faithful, continuous interaction with users to create and improve the memory's content. In this paper, we explore the use of an automatic, light-weight solution drawn from one of the underlying ingredients of an organizational memory: ontologies. We have developed an ontology-based network analysis method, which we applied to the problem of identifying communities of practice in an organization. We use ontology-based network analysis as a means of providing content automatically for the initial set-up of an organizational memory.
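    The abstract does not specify the method's details, but the general idea can be sketched: ontology instance data links people to shared concepts, those links induce a graph over people, and the graph's connected components give a first, crude cut at communities of practice. All names, concepts and the component-based grouping below are invented for illustration, not taken from the paper:

```python
from collections import defaultdict
from itertools import combinations

# Toy ontology instance data: which concepts (projects, topics, ...)
# each person is linked to. Entirely hypothetical.
person_concepts = {
    "alice": {"semantic_web", "ontologies"},
    "bob": {"semantic_web", "databases"},
    "carol": {"ontologies", "knowledge_mgmt"},
    "dave": {"hci"},
}

# Build an undirected weighted graph: an edge between two people for
# every ontology concept they share.
edges = defaultdict(int)
for a, b in combinations(sorted(person_concepts), 2):
    shared = person_concepts[a] & person_concepts[b]
    if shared:
        edges[(a, b)] = len(shared)

adjacency = defaultdict(set)
for (a, b) in edges:
    adjacency[a].add(b)
    adjacency[b].add(a)

def communities():
    """Crude community detection: connected components of the graph."""
    seen, groups = set(), []
    for node in person_concepts:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:
            n = stack.pop()
            if n in group:
                continue
            group.add(n)
            stack.extend(adjacency[n] - group)
        seen |= group
        groups.append(group)
    return groups

print(communities())
```

    Here alice, bob and carol end up in one component through shared concepts, while dave is isolated; a real analysis would weight relations by ontology structure rather than rely on bare co-occurrence.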


    A Survey of Agent-Based Modeling Practices (January 1998 to July 2008)

    In the 1990s, Agent-Based Modeling (ABM) began gaining popularity; it represents a departure from the more classical simulation approaches. This departure, its recent development and its increasing application by non-traditional simulation disciplines indicate the need to continuously assess the current state of ABM and to identify opportunities for improvement. To begin to satisfy this need, we surveyed and collected data from 279 articles, drawn from 92 unique publication outlets, in which the authors had constructed and analyzed an agent-based model. From this large data set we establish the current practice of ABM in terms of year of publication, field of study, simulation software used, purpose of the simulation, acceptable validation criteria, validation techniques and completeness of the description of the simulation. Based on current practice, we discuss six improvements needed to advance ABM as an analysis tool: the development of ABM-specific tools that are independent of software; the development of ABM as an independent discipline with a common language that extends across domains; the establishment of expectations for ABM that match their intended purposes; the requirement of complete descriptions of the simulation so that others can independently replicate the results; the requirement that all models be completely validated; and the development and application of statistical and non-statistical validation techniques specifically for ABM.

    Keywords: Agent-Based Modeling, Survey, Current Practices, Simulation Validation, Simulation Purpose
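    To make the survey's replication point concrete, here is a minimal, hypothetical agent-based model (innovation diffusion on a ring of agents); the model, its parameters and its update rule are invented for illustration. The explicitly seeded random number generator is the kind of "complete description" detail the survey argues is needed for others to replicate results:

```python
import random

class Agent:
    """A minimal agent: adopts an innovation if neighbours have."""
    def __init__(self, idx):
        self.idx = idx
        self.adopted = False

def step(agents, threshold=0.3, rng=random):
    # Each non-adopter looks at its two ring neighbours and adopts with a
    # probability proportional to the fraction of adopting neighbours.
    # (Agents are updated sequentially; synchronous vs. sequential update
    # is itself a modelling choice that a full description must state.)
    n = len(agents)
    for a in agents:
        if a.adopted:
            continue
        neighbours = [agents[(a.idx - 1) % n], agents[(a.idx + 1) % n]]
        frac = sum(nb.adopted for nb in neighbours) / len(neighbours)
        if rng.random() < frac * threshold:
            a.adopted = True

def run(n_agents=50, n_steps=200, seed=1):
    rng = random.Random(seed)          # fixed seed => replicable outcome
    agents = [Agent(i) for i in range(n_agents)]
    agents[0].adopted = True           # seed the diffusion
    for _ in range(n_steps):
        step(agents, rng=rng)
    return sum(a.adopted for a in agents)

print("adopters after 200 steps:", run())
```

    Because the RNG is seeded, two runs with the same parameters give identical adopter counts, which is exactly the property independent replication depends on.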

    Coaching for creativity, imagination, and innovation

    The Chartered Institute of Personnel and Development (CIPD) has acknowledged the rise of coaching and has developed a set of standards to guide the coaching profession. The aim of this discussion paper is to explore the potential of creative coaching: what it could offer professional practitioners, and what professionals understand to be the components of creative coaching, in order to reach conclusions and recommendations on how the professional coach can practically engage with creative coaching within existing coaching frameworks. The research methodology supporting the investigation is evidence-based research, both quantitative and qualitative. It is based on a questionnaire of Middlesex University Business School Master of Business Administration and Human Resource professionals; a focus group with Human Resource professionals studying developing individuals and teams and the innovative practitioner; one-to-one creative coaching sessions with trainee coaches studying at the i-coach academy; and a further in-depth case study with one of the trainee coaches using creative competencies. The objectives of the investigation are to ascertain: creative coaching goals and beliefs; expectations of the creative tools, techniques and processes that could be used both individually and organisationally; the qualities of a good creative coach; and the outcomes and evaluation of the creative coaching relationship. A case is made for the contribution of creative coaching to: achieving goal-setting creatively; applying creative techniques within existing models of coaching; leaders as creative coaches; and creative team coaching for organisational innovation.
    Creative coaching, based on expertise in the theories, processes, tools and techniques of creativity, as well as an understanding of adult education principles, can make a valuable contribution to coaching training and educational programmes, as well as to practising coaches.

    Life of occam-Pi

    This paper considers some questions prompted by a brief review of the history of computing. Why is programming so hard? Why is concurrency considered an “advanced” subject? What’s the matter with Objects? Where did all the Maths go? In searching for answers, the paper looks at some concerns over fundamental ideas within object orientation (as represented by modern programming languages), before focussing on the concurrency model of communicating processes and its particular expression in the occam family of languages. In that focus, it looks at the history of occam, its underlying philosophy (Ockham’s Razor), its semantic foundation on Hoare’s CSP, its principles of process-oriented design and its development over almost three decades into occam-π (which blends in the concurrency dynamics of Milner’s π-calculus). Also presented is an urgent need for rationalisation: occam-π is an experiment that has demonstrated significant results, but now needs time to be spent on careful review and on implementing the conclusions of that review. Finally, the future is considered. In particular, is there a future?
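    The CSP model of communicating processes that occam expresses can be sketched in Python using threads and queues as stand-ins for processes and channels. This is only an approximation: occam channels are synchronous rendezvous points, which a bounded queue merely imitates, and the process names below are invented. The comments show the corresponding occam `!` (output) and `?` (input) operations:

```python
import threading
import queue

def producer(out_chan):
    for i in range(5):
        out_chan.put(i)          # like  out ! i  in occam
    out_chan.put(None)           # poison value to signal termination

def doubler(in_chan, out_chan):
    while True:
        x = in_chan.get()        # like  in ? x  in occam
        if x is None:
            out_chan.put(None)   # forward the poison downstream
            return
        out_chan.put(2 * x)

def consumer(in_chan, results):
    while True:
        x = in_chan.get()
        if x is None:
            return
        results.append(x)

# Wire three "processes" into a pipeline over two "channels".
a, b = queue.Queue(maxsize=1), queue.Queue(maxsize=1)
results = []
procs = [threading.Thread(target=producer, args=(a,)),
         threading.Thread(target=doubler, args=(a, b)),
         threading.Thread(target=consumer, args=(b, results))]
for p in procs:
    p.start()
for p in procs:
    p.join()
print(results)   # [0, 2, 4, 6, 8]
```

    The design point carries over from occam: processes share nothing and interact only through channels, so each stage can be reasoned about, and replaced, in isolation.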

    Evaluating the impact of an enhanced energy performance standard on load-bearing masonry domestic construction: Understanding the gap between designed and real performance: lessons from Stamford Brook.

    This report is aimed at those with interests in the procurement, design and construction of new dwellings, both now and in the coming years as the Government’s increasingly stringent targets for low and zero carbon housing approach. It conveys the results of a research project, carried out between 2001 and 2008, that was designed to evaluate the extent to which low carbon housing standards can be achieved in the context of a large commercial housing development. The research was led by Leeds Metropolitan University in collaboration with University College London and was based on the Stamford Brook development in Altrincham, Cheshire. The project partners were the National Trust, Redrow and Taylor Wimpey, and some 60 percent of the planned 700-dwelling development had been completed by June 2008.

    As the UK house building industry and its suppliers grapple with the challenges of achieving zero carbon housing by 2016, the lessons arising from this project are timely and of considerable value. Stamford Brook has demonstrated that designing masonry dwellings to achieve an enhanced energy standard is feasible and that a number of innovative approaches, particularly in the area of airtightness, can be successful. The dwellings, as built, exceed the Building Regulations requirements in force at the time, but tests on the completed dwellings and longer-term monitoring of performance have shown that, overall, energy consumption and carbon emissions under standard occupancy are around 20 to 25 percent higher than design predictions. In the case of heat loss, the discrepancy can be much higher. The report contains much evidence of considerable potential, but points out that realising the design potential requires a fundamental reappraisal of processes within the industry, from design and construction to the relationship with its supply chain and the development of the workforce.

    The researchers conclude that, even when builders try hard, current mainstream technical and organisational practices, together with industry cultures, present barriers to the consistent delivery of low and zero carbon performance. They suggest that the underlying reasons for this are deeply embedded at all levels of the house building industry. They also point out that, without fundamental change in processes and cultures, technological innovations, whether based on traditional construction or on modern methods, are unlikely to reach their full potential. The report sets out a series of wide-ranging implications for new housing in the UK, which are given in Chapter 14, and concludes by firmly declaring that cooperation between government, developers, supply chains, educators and researchers will be crucial to improvement. The recommendations in this report are already being put into practice by the researchers at Leeds Metropolitan University and University College London, in their teaching and in further research projects. The implications of the work have been discussed across the industry at a series of workshops undertaken in 2008 as part of the LowCarb4Real project (see http://www.leedsmet.ac.uk/as/cebe/projects/lowcarb4real/index.htm). In addition, the learning is having an impact on the work of the developers (Redrow and Taylor Wimpey) who, with remarkable foresight and enthusiasm, hosted the project. This report seeks to make the findings more widely available and is offered for consideration by everyone who has a part to play in making low and zero carbon housing a reality.

    New approaches to using scientific data - statistics, data mining and related technologies in research and research training

    This paper surveys technological changes that affect the collection, organization, analysis and presentation of data. It considers changes or improvements that ought to influence the research process and direct the use of technology. It explores implications for graduate research training. The insights of Evidence-Based Medicine are widely relevant across many different research areas. Its insights provide a helpful context within which to discuss the use of technological change to improve the research process. Systematic data-based overview has to date received inadequate attention, both in research and in research training. Sharing of research data once results are published would both assist systematic overview and allow further scrutiny where published analyses seem deficient. Deficiencies in data collection and published data analysis are surprisingly common. Technologies that offer new perspectives on data collection and analysis include data warehousing, data mining, new approaches to data visualization and a variety of computing technologies that are in the tradition of knowledge engineering and machine learning. There is a large overlap of interest with statistics. Statistics is itself changing dramatically as a result of the interplay between theoretical development and the power of new computational tools. I comment briefly on other developing mathematical science application areas - notably molecular biology. The internet offers new possibilities for cooperation across institutional boundaries, for exchange of information between researchers, and for dissemination of research results. Research training ought to equip students both to use their research skills in areas different from those in which they have been immediately trained, and to respond to the challenge of steadily more demanding standards. There should be an increased emphasis on training to work cooperatively
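    The systematic data-based overview the paper advocates rests, in its simplest form, on inverse-variance pooling of study results. The sketch below shows a fixed-effect pooled estimate; the three study estimates and standard errors are invented for illustration, not drawn from any real overview:

```python
import math

# Hypothetical effect estimates and standard errors from three studies.
studies = [(0.30, 0.10), (0.10, 0.15), (0.25, 0.12)]

# Fixed-effect (inverse-variance) pooling: each study is weighted by
# 1/SE^2, so more precise studies contribute more to the pooled result.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled estimate = {pooled:.3f} +/- {1.96 * pooled_se:.3f}")
```

    The pooled standard error is smaller than any single study's, which is the statistical payoff of sharing data across studies; a serious overview would also test for heterogeneity before trusting a fixed-effect summary.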

    The Difficult Reception of Rigorous Descriptive Social Science in the Law

    Mutual disdain is an effective border patrol at the demarcation lines between disciplines. Social scientists tend to react with disdain when they observe how their findings are routinely stripped of all the caveats, assumptions and careful limitations once they travel into law. Likewise, lawyers tend to react with disdain when they read all the laborious proofs and checks for what looks to them like a minuscule detail in a much larger picture. But mutual disdain comes at a high price: all cross-border intellectual trade is stifled. This paper explores the social science/law border from the legal side. The natural barriers turn out to be significant, but not insurmountable. Specifically, the paper looks at the challenges of integrating rigorous descriptive social science into the application of the law in force by courts and administrative authorities. This is where the gap is most difficult to bridge. The main impediments are: the implicit value judgments inherent in models, conceptual languages and strictly controlled ways of generating empirical evidence; the difference between explanation, hypothesis testing and prediction, on the one hand, and decision-making, on the other; the ensuing difference between theoretical and practical reasoning, and the judicial tradition of engaging in holistic thinking; and, last but not least, the legal system's striving for autonomy in order to maintain its viability. If a legal academic assumes the position of an outside observer, she may entirely ignore all these concerns and simply follow the methodological standards of descriptive social science. This is, for instance, what most of law and economics does. The legal academic may, instead, choose to contribute to the making of new law. She will then find it advisable to partly ignore the strictures of rigorous methodology in order to be open to more aspects of the regulatory issue. But it is not difficult, at least, to follow the standards of the social sciences for analysing the core problem. The integration is most difficult if an academic does doctrinal work. But it is precisely here that the division of intellectual labour between legal practice and legal academia is most important. Academics who are themselves versed in the respective social science can translate the decisive insights into suggestions for a better reading of statutory provisions or case law.

    Keywords: law and economics, law and statistics, explanation vs. decision-making, practical reasoning, psychology of judicial decision-making