Protocols for Scholarly Communication
CERN, the European Organization for Nuclear Research, has operated an
institutional preprint repository for more than 10 years. The repository
contains over 850,000 records of which more than 450,000 are full-text OA
preprints, mostly in the field of particle physics, and it is integrated with
the library's holdings of books, conference proceedings, journals and other
grey literature. In order to encourage effective propagation and open access to
scholarly material, CERN is implementing a range of innovative library services
into its document repository: automatic keywording, reference extraction,
collaborative management tools and bibliometric tools. Some of these services,
such as user reviewing and automatic metadata extraction, could make up an
interesting testbed for future publishing solutions and certainly provide an
exciting environment for e-science possibilities. The future protocol for
scientific communication should naturally guide authors towards OA publication
and CERN wants to help reach a full open access publishing environment for the
particle physics community and the related sciences in the next few years.

Comment: 8 pages, to appear in Library and Information Systems in Astronomy
2. Taking Action to the European Level: The Role of EBLIDA
EBLIDA, a federation of national associations of professionals and institutions in documentation, libraries, archives and museums from European Union countries, was founded in 1992 by a small number of national public library associations in order to centralise contacts between those libraries and the European Union (EU). The organisation brings together several library associations specialising in a wide variety of sectors and established in all ..
Quantitative Analysis of the Publishing Landscape in High-Energy Physics
World-wide collaboration in high-energy physics (HEP) is a tradition which
dates back several decades, with scientific publications mostly coauthored by
scientists from different countries. This coauthorship phenomenon makes it
difficult to identify precisely the "share" of each country in HEP scientific
production. One year's worth of HEP scientific articles published in
peer-reviewed journals is analysed and their authors are uniquely assigned to
countries. This method allows the first correct estimation on a "pro rata"
basis of the share of HEP scientific publishing among several countries and
institutions. The results provide an interesting insight into the geographical
collaborative patterns of the HEP community. The HEP publishing landscape is
further analysed to provide information on the journals favoured by the HEP
community and on the geographical variation of their author bases. These
results provide quantitative input to the ongoing debate on the possible
transition of HEP publishing to an Open Access model.

Comment: For a better on-screen viewing experience this paper can also be obtained at: http://doc.cern.ch/archive/electronic/cern/preprints/open/open-2006-065.pd
Establishing a consortium for Open Access (OA) publishing in particle physics
A meeting has been called at CERN on November 3rd 2006 to work towards establishing a consortium of major particle physics funding agencies, aimed at guiding a transition from the current subscription model for journals to a more stable, more competitive and more affordable future for the dissemination of quality-assured scientific information, adapted to the era of electronic publishing. The meeting will gather representatives of major European particle physics agencies and library consortia. For the initiative to succeed, it is vital that the stakeholders, representing as they do the funding bodies and academia, see themselves as responsible for the financing and organisation of the dissemination of scientific information and its quality assurance. In particular, the transition to a wider availability of research results cannot afford to be held back by a lack of concerted effort among the agencies financing the research.
Engaging Researchers with Data Management: The Cookbook
Effective Research Data Management (RDM) is a key component of research integrity and reproducible research, and its importance is increasingly emphasised by funding bodies, governments, and research institutions around the world. However, many researchers are unfamiliar with RDM best practices, and research support staff are faced with the difficult task of delivering support to researchers across different disciplines and career stages. What strategies can institutions use to solve these problems?

Engaging Researchers with Data Management is an invaluable collection of 24 case studies, drawn from institutions across the globe, that demonstrate clearly and practically how to engage the research community with RDM. These case studies together illustrate the variety of innovative strategies research institutions have developed to engage with their researchers about managing research data. Each study is presented concisely and clearly, highlighting the essential ingredients that led to its success and challenges encountered along the way. By interviewing key staff about their experiences and the organisational context, the authors of this book have created an essential resource for organisations looking to increase engagement with their research communities.

This handbook is a collaboration by research institutions, for research institutions. It aims not only to inspire and engage, but also to help drive cultural change towards better data management. It has been written for anyone interested in RDM, or simply, good research practice.
Novel flavours paired with glutamate condition increased intake in older adults in the absence of changes in liking
Previous research on repeated exposure to a novel flavour combined with monosodium glutamate (MSG) has shown an increase in liking of, and consumption for, the particular flavour. The aim of the current work was to investigate whether this effect could also be observed in older people, since they are most affected by undernutrition in the developed world, and ways to increase food consumption are of significant importance for this age group. For this study, 40 older adults (age 65-88) repeatedly consumed potato soup with two novel flavours (lemongrass and cumin), either with or without a high level of MSG (5% w/w). A randomized single-blind within-subject design was implemented, in which each participant was exposed to both soup flavours three times over 6 days, with one of the soup flavours containing MSG. After three repeat exposures, consumption increased significantly for the soups whose flavours had contained MSG during the repeated exposure (mean weight consumed increased from 123 to 164 g, p=0.017), implying that glutamate conditioned increased wanting and consumption, even though liking for the soup had not increased.
Communicate!
Communication aimed at elected officials and decision-makers, but also at journalists, has become a vital concern for libraries: the aim is to make their activities visible to their governing bodies, to make their development strategy intelligible, and to build a strong institutional image. What means does a library have to demonstrate that its existence is well founded? How should one communicate with a municipal or regional elected official, or with the politician responsible for the library? What can a good working relationship with journalists bring, or with natural or hierarchical partners within the university or local authority? How can lobbying and marketing methods, or social media, be used judiciously? These are some of the questions addressed here. Some fifteen authors from diverse backgrounds (sociologists, teachers, journalists, librarians, communication officers…) share their own experiences, give methodological advice and highly useful tools, and offer numerous examples and practical scenarios. The author of several works in the field of documentation and information science, Jean-Philippe Accart is currently director of the libraries of the Faculty of Science of the University of Geneva.
D7.4 How to be FAIR with your data. A teaching and training handbook for higher education institutions
This handbook aims to support higher education institutions with the integration of FAIR-related content in their curricula and teaching. It was written and edited by a group of about 40 collaborators in a series of six book sprint events that took place between 1 and 10 June 2021. The document provides practical material, such as competence profiles, learning outcomes and lesson plans, along with supporting information. It incorporates community feedback received during the public consultation, which ran from 27 July to 12 September 2021.
Archiving of particle physics data and results for long-term access and use
Preprints and published material are not the only output from high energy physics research that should be archived for future generations. Data are frequently not stored long-term, yet examples have arisen where such storage has proved necessary. Opportunities for training are also lost, and the scientific record available to future generations is diminished by this absence. Lessons learned from previous attempts, and from other fields in which experimental data are successfully stored, can be used to build a storage paradigm for the future. Data from particle physics experiments are highly complex, but a collaborative effort between IT staff, librarians, and physicists can perhaps succeed. Issues requiring consideration include: who will have the right to access the data; how access rights will be managed; what level of data should be stored; for how long the data should be stored; and what additional information associated with the data must be collected. Technical problems associated with the storage and future use of analysis software must also be tackled.