The Information Commons: a public policy report
This report describes the history of the information commons, presents examples of online commons that provide new ways to store and deliver information, and concludes with policy recommendations. Available in PDF and HTML versions.

BRENNAN CENTER FOR JUSTICE at NYU SCHOOL OF LAW
Democracy Program, Free Expression Policy Project
161 Avenue of the Americas, 12th floor New York NY 10013
Phone: (212) 998-6730 Web site: www.brennancenter.org
Free Expression Policy Project: www.fepproject.org
Cyberscience and the Knowledge-Based Economy, Open Access and Trade Publishing: From Contradiction to Compatibility with Nonexclusive Copyright Licensing
Open source, open content and open access are set to fundamentally alter the conditions of knowledge production and distribution. They are also the most tangible result of the shift towards e-Science and digital networking. Yet widespread misperceptions exist about the impact of this shift on knowledge distribution and scientific publishing. It is argued, on the one hand, that for the academy there is in principle no digital dilemma surrounding copyright, and no contradiction between open science and the knowledge-based economy if profits are made from nonexclusive rights. On the other hand, pressure for the "digital doubling" of research articles in Open Access repositories (the "green road") is misguided, and the current model of Open Access publishing (the "gold road") has little future outside biomedicine. Commercial publishers must understand that business models based on the transfer of copyright have little future either. Digital technology and its economics favour the severance of distribution from certification. What is required of universities and governments, scholars and publishers, is to clear the way for digital innovations in knowledge distribution and scholarly publishing by enabling the emergence of a competitive market based on nonexclusive rights. This requires no change in the law but merely an end to the praxis of copyright transfer and exclusive licensing. The best way forward for research organisations, universities and scientists is the adoption of standard copyright licenses that reserve some rights, namely Attribution and No Derivative Works, but otherwise allow for the unlimited reproduction, dissemination and re-use of the research article, commercial uses included.
Application of Big Data to Support Evidence-Based Public Health Policy Decision-Making for Hearing
Ideally, public health policies are formulated from scientific data; however, policy-specific data are often unavailable. Big data can generate ecologically valid, high-quality scientific evidence, and therefore has the potential to change how public health policies are formulated. Here, we discuss the use of big data for developing evidence-based hearing health policies, using data collected and analyzed with a research prototype of a data repository known as EVOTION (EVidence-based management of hearing impairments: public health pOlicy-making based on fusing big data analytics and simulaTION), to illustrate our points. Data in the repository consist of audiometric clinical data, prospective real-world data collected from hearing aids and an app, and responses to questionnaires collected for research purposes. To date, we have used the platform and a synthetic dataset to model the estimated risk of noise-induced hearing loss and have shown novel evidence of ways in which external factors influence hearing aid usage patterns. We contend that this research prototype data repository illustrates the value of using big data for policy-making by providing high-quality evidence that could be used to formulate and evaluate the impact of hearing health care policies.
Next Generation Teaching and Learning: Technologies and Trends
The landscape of teaching and learning has been radically shifted in the last 15 years by the advent of web technologies, which enabled the emergence of Learning Management Systems (LMS). These systems changed the educational paradigm by extending the classroom borders, capturing and persisting course content and giving instructors more flexibility and access to students and other resources. However, they also constrained and limited the evolution of teaching and learning by imposing a traditional, instructional framework. With the advent of Web 2.0 technologies, participation and collaboration have become predominant experiences on the Web. The teaching and learning community, as a whole, has been late to capitalize on these technologies in the classroom. Part of this trend is due to constraints in the technology (LMS), and part is due to the fact that participatory media tools require an additional shift in educational paradigms, from instructional, on-the-pulpit teaching to a student-centered, adaptive environment where students can contribute to the course material and learn from one another. This panel will discuss the next generation of teaching and learning, involving more lightweight, modular systems to empower instructors to be flexible, explore new student-centered paradigms, and plug and play tools as needed. We will also discuss how the iSchools are and should be increasingly involved in studying these new forms, formulating best practices and supporting the needs of teachers as they move toward more collaborative learning environments.
Appraisal and the Future of Archives in the Digital Era
A discussion of the implications of new technologies, changing public policies, and the transformation of culture for how archivists practice and think about appraisal.
Research, relativity and relevance: can universal truths answer local questions?
It is a commonplace that the internet has led to a globalisation of informatics, and that this has had beneficial effects in terms of standards and interoperability. However, this necessary harmonisation has also fostered a growing understanding that the positive trend carries an in-built assumption that "one size fits all". The paper explores the importance of local and national research in addressing global issues and the appropriateness of local solutions and applications. It concludes that federal and collegial solutions are to be preferred to imperial solutions.
Highly Granular Calorimeters: Technologies and Results
The CALICE collaboration is developing highly granular calorimeters for experiments at a future lepton collider, primarily to establish technologies for particle flow event reconstruction. These technologies also find applications elsewhere, such as detector upgrades for the LHC. Meanwhile, the large data sets collected in an extensive series of beam tests have enabled detailed studies of the properties of hadronic showers in calorimeter systems, resulting in improved simulation models and the development of sophisticated reconstruction techniques. In this proceeding, highlights are included from studies of the structure of hadronic showers and results on reconstruction techniques for imaging calorimetry. In addition, current R&D activities within CALICE are summarized, focusing on technological prototypes that address challenges from full detector system integration and on production techniques amenable to mass production for electromagnetic and hadronic calorimeters based on silicon, scintillator, and gas techniques.

Comment: 11 pages, 16 figures; the proceeding for the overview talk presented at the conference Instrumentation for Colliding Beam Physics 2017 (INSTR17), Novosibirsk, Russia, 27 February - 3 March 2017, to be published in JINST.
Seismic Risk Analysis of Revenue Losses, Gross Regional Product and Transportation Systems
Natural threats like earthquakes, hurricanes or tsunamis have shown serious impacts on communities. In the past, major earthquakes in the United States like Loma Prieta 1989 and Northridge 1994, or recent events in Italy like the L'Aquila 2009 or Emilia 2012 earthquakes, emphasized the importance of preparedness and awareness to reduce social impacts. Earthquakes have impacted businesses and dramatically reduced the gross regional product. Seismic hazard is traditionally assessed using Probabilistic Seismic Hazard Analysis (PSHA). PSHA represents the hazard at a specific location well, but it is unsatisfactory for spatially distributed systems. Scenario earthquakes overcome this problem by representing the actual distribution of shaking over a spatially distributed system. The performance of distributed productive systems during the recovery process needs to be explored.
Scenario earthquakes have been used to assess the risk in bridge networks and the social losses in terms of gross regional product reduction. The proposed method for scenario earthquakes has been applied to a real case study: Treviso, a city in the north-east of Italy. The method requires three models: a representation of the sources (Italian Seismogenic Zonation 9), an attenuation relationship (Sabetta and Pugliese 1996) and a model of the occurrence rate of magnitudes (Gutenberg-Richter). A methodology has been proposed to reduce thousands of scenarios to a subset consistent with the hazard at each location. Earthquake scenarios, along with the Monte Carlo method, have been used to simulate business damage. The response of business facilities to earthquakes has been obtained from fragility curves for precast industrial buildings. Furthermore, from the business damage the reduction of productivity has been simulated using economic data from the national statistical service and a proposed piecewise "loss of functionality" model. To simulate the economic process in the time domain, an innovative business recovery function has been proposed.
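The magnitude-occurrence model named above (Gutenberg-Richter), combined with Monte Carlo sampling, can be illustrated with a minimal sketch. This is not the thesis's implementation: the b-value and the magnitude bounds below are illustrative assumptions, and the source model (Italian Seismogenic Zonation 9) and attenuation relationship (Sabetta and Pugliese 1996) are not reproduced here.

```python
import math
import random

def sample_gr_magnitude(rng, b=1.0, m_min=4.0, m_max=7.5):
    """Draw one magnitude from a truncated Gutenberg-Richter distribution
    by inverse-transform sampling. The GR law log10 N(m) = a - b*m implies
    an exponential magnitude density with rate beta = b * ln(10),
    truncated here to [m_min, m_max] (illustrative bounds)."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))  # truncation normalisation
    u = rng.random()
    return m_min - math.log(1.0 - u * c) / beta

def simulate_scenarios(n, seed=0):
    """Monte Carlo draw of n scenario magnitudes (reproducible via seed)."""
    rng = random.Random(seed)
    return [sample_gr_magnitude(rng) for _ in range(n)]

# e.g. a catalogue of 8000 candidate scenarios, as in the abstract's case study
mags = simulate_scenarios(8000)
```

In a full scenario-based analysis, each sampled magnitude would be paired with a source location and passed through an attenuation relationship to obtain shaking at every bridge and business site; the subset-selection step then prunes the catalogue to scenarios consistent with the site hazard.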
The proposed method has been applied to generate scenario earthquakes at the locations of bridges and business areas. The proposed selection methodology has been applied to reduce 8000 scenarios to a subset of 60. Subsequently, these scenario earthquakes have been used to calculate three system performance parameters: the risk in transportation networks, the risk in terms of business damage and the losses of gross regional product. A novel model for the business recovery process has been tested. The proposed model has been used to represent the business recovery process and to simulate the effects of government aid allocated for reconstruction.
The proposed method has efficiently modeled the seismic hazard using scenario earthquakes. The scenario earthquakes presented have been used to assess the possible consequences of earthquakes in seismically prone zones and to increase preparedness. Scenario earthquakes have been used to simulate the effects on the economy of the impacted area; a significant gross regional product reduction has been shown, up to 77% for an earthquake with 0.0003 probability of occurrence. The results showed that the limited funds available after a disaster can be distributed in a more efficient way.