The organisational impact of open educational resources
The open educational resource (OER) movement has been growing rapidly since 2001, stimulated by funding from benefactors such as the Hewlett Foundation and UNESCO, and providing educational content freely to institutions and learners across the world. Individuals and organisations are motivated by a variety of drivers to produce OERs, both altruistic and self-interested. There are parallels with the open source movement, where authors and others combine their efforts to provide a product which they and others can use freely and adapt to their own purposes. There are many different ways in which OER initiatives are organised and an almost infinite range of possibilities for how the OERs themselves are constituted. If institutions are to develop sustainable OER initiatives they need to build successful change management initiatives: developing models for the production and quality assurance of OERs, licensing them through appropriate mechanisms such as Creative Commons, and considering how the resources will be discovered and used by learners.
Enhancing Moodle to meet the needs of 200,000 distance learners
In 2005 The Open University UK selected Moodle as the basis of its institutional virtual learning environment. Since then, the system has been integrated with existing e-learning and administrative systems at the University and considerably enhanced during an extensive development programme costing around €8m and taking nearly three years. Many policy issues have emerged which needed to be tackled alongside the software developments in order for the platform to be adopted by the 7,000 tutors and nearly 200,000 students of the University. The Moodle system has proven to be reliable, scalable and customisable and has resulted in a more flexible system for the Open University than the commercial alternatives. This paper examines some of the many enhancements made to Moodle by the Open University, most of which have been fed back into the product for the benefit of other Moodle users. It describes some of the policy and pedagogical issues which have emerged during the roll-out of Moodle across the University.
Conceptualising item banks
Sclater and MacDonald (2004) provide a simple definition of an item bank: a collection of items for a particular assessment, subject or educational sector, classified by metadata which facilitates searching and automated test creation. There is a need to define more closely the various elements and attributes of the item bank itself, and to show how an item bank might fit into the larger picture of a distributed national (or even international) item bank infrastructure.
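A minimal sketch of how the Sclater and MacDonald definition might be modelled in code follows; the field names (subject, level, item_type) and the methods are illustrative assumptions, not elements taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """A single assessment item plus the metadata used for searching."""
    item_id: str
    item_type: str   # e.g. "multiple-choice", "true-false" (assumed labels)
    subject: str     # classification used when filtering the bank
    level: str       # educational sector or difficulty band
    body: str        # the question text itself

@dataclass
class ItemBank:
    """A collection of items, classified by metadata, searchable for
    automated test creation."""
    items: list[Item] = field(default_factory=list)

    def add(self, item: Item) -> None:
        self.items.append(item)

    def search(self, **criteria: str) -> list[Item]:
        """Return items whose metadata matches every given criterion."""
        return [i for i in self.items
                if all(getattr(i, k) == v for k, v in criteria.items())]

    def build_test(self, subject: str, n: int) -> list[Item]:
        """Automated test creation: take the first n items for a subject."""
        return self.search(subject=subject)[:n]
```

A distributed national infrastructure of the kind the paper envisages would additionally need interoperability metadata (for example IMS QTI identifiers) so that items can be exchanged between banks and delivery systems.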
Using evaluation to inform the development of a user-focused assessment engine
This paper reports on the evaluation of a new assessment system, Technologies for Online Interoperability (TOIA). TOIA was built from a user-focused specification of an assessment system. The formative evaluation of the project complemented this initial specification by ensuring that user feedback on the development and use of the system was iteratively fed back into the development process. The paper begins by summarising some of the key barriers and enablers to the use of assessment systems and the uptake of Computer-Assisted Assessment (CAA). It goes on to provide a critique of the impact of technology on assessment and considers whether innovative uses of information and communication technology (ICT) might result in new e-pedagogies and practices in assessment. The paper then reports on the findings of the TOIA evaluation and discusses how these were used to inform the development of the system.
Learning in an age of digital networks
The final years of the twentieth century and early years of the twenty-first century have been marked by the rapid rise of digital and networked technologies. Some have even called it a paradigm shift and suggested that it will lead to a dramatic change in the way young people learn (Tapscott and Williams, 2010). As with all commentary on new technologies, we should beware of being carried away with the excitement of the new. There is a recurrent innovation cycle beginning with over-excitement followed by disappointment; once the reaction against the new has set in, attention moves on to yet another new technology, often before a proper assessment and evaluation of the previous cycle can take place. Equally, we must be careful not to ignore the profound changes that are taking place and how they may affect universities and learning in society more generally. A recent description by a UK-based think tank, Demos, characterized the kind of university that is emerging from the engagement with new digital and networked technologies as the 'edgeless university' (Bradwell, 2009). The term edgeless is borrowed from work on the city which suggests that edgeless cities have the function of cities without being organized in their classic form. In the same way, the Demos pamphlet suggests that the university retains an identifiable function, but the functions of the university are no longer confined to a single institution, nor are they confined to higher education institutions more broadly. Over a decade ago, Brown and Duguid (2000) identified the core functions of universities as the capacity to grant degrees, to accredit students and to provide the warrant that guaranteed the credentials obtained by the students from the university. They also suggested that the introduction of what were then new technologies would lead to an increased focus on these core functions. The core role remains in the edgeless university, but its boundaries may alter. This article tries to provide a way of thinking about new technologies that balances these two conflicting needs. It identifies some current ways of thinking about the changes taking place in universities that are related to digital and networked technologies and assesses their impact. It then goes on to suggest the kinds of choices we may have to make in relation to new technologies at a variety of levels: the personal, the institutional and in terms of society in general. The edgeless university is associated with broad technological change, but whether such change is inevitable is still an issue that needs to be discussed.
Developing a national item bank
The COLA project has been developing a large bank of assessment items for units across the Scottish further education curriculum since May 2003. These will be made available to learners mainly via colleges' virtual learning environments. Many people have been involved in the development of the COLA item bank. Processes have included deciding on appropriate item types and subject areas, training authors, peer-reviewing and quality assuring the items and assessments, and ensuring they are interoperable and tagged with appropriate metadata.
Putting interoperability to the test: building a large reusable assessment item bank
The COLA project has been developing a large bank of assessment items for units across the Scottish further education curriculum since May 2003. These will be made available to learners mainly via colleges' virtual learning environments (VLEs). Many people have been involved in the development of the COLA assessment item bank to ensure a high level of technical and pedagogical quality. Processes have included deciding on appropriate item types and subject areas, training authors, peer-reviewing and quality assuring the items and assessments, and ensuring they are tagged with appropriate metadata. One of the biggest challenges has been to ensure that the assessments are deliverable across the four main virtual learning environments in use in Scottish colleges, and also through a stand-alone assessment system. COLA is significant because no other large project appears to have successfully developed standards-compliant assessment content for delivery across multiple VLEs. This paper discusses how COLA has dealt with the organizational, pedagogical and technical issues which arise when commissioning items from many authors for delivery across an educational sector.
Age, Depth, and Residual Depth Anomalies in the North Pacific: Implications for Thermal Models of the Lithosphere and Upper Mantle
We present an empirical basement depth versus age relation for the North Pacific Ocean, based on the statistical treatment of an ocean-wide gridded data set. The SYNBAPS bathymetry was averaged into half-degree intervals and corrected for the effects of sediment loading. The resulting basement depths were plotted against ages determined from a revised isochron chart based on a recent compilation of magnetic lineations and various published plate reconstructions. On crust older than 80 Ma, the depths are skewed to the shallow side of the depth versus age distribution by large numbers of seamounts. Therefore the mean and standard deviation are not useful representations of the data. A more appropriate representation is the mode (or greatest concentration of points) and contours around the mode. The contours around the mode show that most ocean floor increases in depth with the square root of age out to crust of 80 Ma. Beyond this the majority of the data oscillates about a line that remains essentially constant as the age increases. Approximately 56% of all the data points lie within a ±300 m band about the mode. If the sediment thickness data in the older basins of the western North Pacific are correct, then the flattening of the depths favors a model in which extra heat is supplied to the base of the lithosphere on older ocean floor. Residual depth anomalies were calculated by removing the depths predicted by such a model. These anomalies correlate with bathymetric features and occur predominantly on crust of 120 and 160 Ma. They account for the rises in the mode at these two ages. The overall subsidence of the ocean floor can be accounted for by the cooling of a thermo-mechanical boundary layer. Correlations between geoid height and depth are evidence that many of the residual depth anomalies result from convective plumes which reset the thermal structure of the lithosphere. It is possible that this process, observed at different times after the initial resetting of the isotherms, may account for many of the depth anomalies in the western North Pacific.
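As a rough illustration of the square-root subsidence described above, the depth-age relation for young ocean floor is conventionally written as a half-space cooling law; the constants below are the widely quoted Parsons and Sclater (1977) values and are an assumption here, not figures taken from this abstract:

```latex
d(t) \;\approx\; d_r + c\,\sqrt{t}
     \;\approx\; 2500\,\mathrm{m} + 350\,\mathrm{m\,Myr^{-1/2}}\,\sqrt{t},
     \qquad t \lesssim 80\ \mathrm{Myr}
```

Beyond roughly 80 Myr the observed mode flattens toward a near-constant depth rather than continuing along the square-root curve, which is why the abstract favors a plate-like model in which extra heat is supplied to the base of the lithosphere.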
Interoperability with CAA: does it work in practice?
IMS has been promising question and test interoperability (QTI) for a number of years. Reported advantages of interoperability include the avoidance of 'lock in' to one proprietary system, the ability to integrate systems from different vendors, and the facilitation of an exchange of questions and tests between institutions. The QTI specification, while not yet an international standard for the exchange of questions, tests and results, now appears to be stable enough for vendors to have developed systems which implement such an exchange in a fairly sophisticated way. The costs to software companies of implementing QTI 'compliance' in their existing CAA systems, however, are high. Allowing users to move their data to other systems may not seem to make commercial sense either. As awareness of the advantages of interoperability increases within education, software companies are realising that adding QTI import and export facilities to their products can be a selling point. A handful of vendors have signed up to the concept of interoperability and have taken part in the IMS QTI Working Group. Others state that their virtual learning environments or CAA systems are 'conformant' with IMS QTI, but do these assertions stand up when the packages are tested together? The CETIS Assessment Special Interest Group has been monitoring developments in this area for over a year and has carried out an analysis of tools which exploit the QTI specifications. This paper describes to what extent the tools genuinely interoperate and examines the likely benefits for users and future prospects for CAA interoperability.
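A minimal sketch of the kind of smoke test an interoperability analysis might start from, assuming a QTI 2.x item exported as XML; the file name is hypothetical and the two elements checked are only a crude proxy for genuine conformance, not the CETIS methodology:

```python
import xml.etree.ElementTree as ET

# IMS QTI 2.1 namespace; an item exported by one tool should expose the
# same elements when parsed by another tool honouring the specification.
QTI_NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}

def check_item(path: str) -> bool:
    """Crude check: does an exported item contain the parts an importing
    system needs in order to render and score it?"""
    root = ET.parse(path).getroot()
    has_response = root.find("qti:responseDeclaration", QTI_NS) is not None
    has_body = root.find("qti:itemBody", QTI_NS) is not None
    return has_response and has_body

# Hypothetical usage with a file exported from one CAA tool:
# print(check_item("exported_item.xml"))
```

Real conformance testing also has to exercise scoring rules, feedback and result reporting, which is where the paper finds that vendors' 'conformant' claims tend to break down.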
Introduction
This is the post-print version of the chapter. Copyright © 2003 The editors.
This book is about surrogacy and, more specifically, surrogate motherhood. It is a collection of essays that aims to provide a contemporary and international picture of a practice, traceable to ancient times, devised to solve the problem of childlessness. The collection, which explores surrogacy from a variety of perspectives including law, policy, medicine and psychology, is timely. For although there is nothing new in the notion that a woman might bear a child for someone else, there is some evidence that the incidence of surrogacy is increasing and technology has developed to make ever more complex arrangements possible.