    Peer-reviewed open research data: results of a pilot

    Peer review of publications is at the core of science and is primarily seen as an instrument for ensuring research quality. However, it is less common to evaluate the quality of the underlying data independently as well. In light of the "data deluge", it makes sense to extend peer review to the data itself and in this way evaluate the degree to which the data are fit for re-use. This paper describes a pilot study at EASY - the electronic archive for (open) research data at our institution. In EASY, researchers can archive their data and add metadata themselves. Devoted to open access and data sharing, we at the archive are interested in further enriching these metadata with peer reviews. As a pilot, we established a workflow in which researchers who had downloaded data sets from the archive were asked to review the downloaded data set. This paper describes the details of the pilot, including both the quantitative and qualitative findings. Finally, we discuss issues that need to be solved if such a pilot is to be turned into structural peer review functionality of the archiving system.

    Managing quality management processes in a naval ship construction company: A qualitative case analysis

    This industrial research study investigates the challenges encountered in implementing quality management in a naval ship construction and maintenance company. The study provides a detailed view of the work completed in the ship construction and maintenance process, especially in the Quality Department, and recommends quality improvements, particularly in the building of a ship. Specifically, it aims to examine in depth the application of quality management knowledge and tools in process-based work inspection planning, work monitoring activities and record-keeping information management. Additionally, the role of the Quality Department in ship construction projects has been the main focus of this research. To achieve these objectives, the case study applied a qualitative approach involving interviews with a focus group, especially in the Quality Department, observation of the work-related activities involved in the quality work inspection process, and review of quality-related documentation based on ship construction work reports and quality work inspection activities. The results of the three approaches were then triangulated and analysed using NVivo software to identify relevant themes, as is common in qualitative research. The study revealed the importance of teamwork and a high level of mutual understanding among the various departments in managing inspection planning and work-related information. It identified the issues that occurred during the research, provided an analysis that can benefit the company, contributed to academic knowledge and also enhanced the company's vision and mission. Furthermore, proper improvement activities aligned with the actual work process will result in higher productivity and quality of work processes, as well as reducing the difficulties and problems encountered in the implementation of quality management in this company.

    Metadata quality issues in learning repositories

    Metadata lies at the heart of every digital repository project in the sense that it defines and drives the description of the digital content stored in the repository. Metadata allows content to be successfully stored, managed and retrieved, but also preserved in the long term. Despite the widely recognized importance of metadata in digital repositories, studies indicate that metadata quality is relatively low in most digital repositories. Metadata quality is loosely defined as "fitness for purpose", meaning that low-quality metadata cannot fulfil its purpose, which is to allow for the successful storage, management and retrieval of resources. In practice, low metadata quality leads to ineffective searches for content: searches that recall the wrong resources or, even worse, no resources at all, which makes them invisible to the intended user, that is, the "client" of each digital repository. The present dissertation approaches this problem by proposing a comprehensive metadata quality assurance method, namely the Metadata Quality Assurance Certification Process (MQACP). The basic idea of this dissertation is to propose a set of methods that can be deployed throughout the lifecycle of a repository to ensure that the metadata generated by content providers are of high quality. These methods have to be straightforward and simple to apply, with measurable results. They also have to be adaptable with minimum effort so that they can easily be used in different contexts. This set of methods is described analytically, taking into account the actors needed to apply them, describing the tools needed and defining the anticipated outcomes. To test our proposal, we applied it to a Learning Federation of repositories from day one of its existence until it reached maturity and regular operation.
    We supported the metadata creation process throughout the different phases of the repositories involved by setting up specific experiments using the methods and tools of the MQACP. Throughout each phase, we measured the resulting metadata quality to certify that the anticipated improvement actually took place. Lastly, across these phases, the cost of applying the MQACP was measured to provide a comparison basis for future applications. Based on the success of this first application, we validated the MQACP approach by applying it to two further cases, a Cultural and a Research Federation of repositories. This allowed us to demonstrate the transferability of the approach to cases that present some similarities with the initial one but mainly significant differences. The results showed that the MQACP was successfully adapted to the new contexts with minimal adaptations, producing similar results at comparable costs. In addition, by looking closely at the common experiments carried out in each phase of each use case, we were able to identify interesting patterns in the behavior of content providers that can be researched further. The dissertation concludes with a set of future research directions that emerged from the cases examined. These directions can be explored to support the next version of the MQACP in terms of the methods deployed, the tools used to assess metadata quality, and the cost analysis of the MQACP methods.
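    One common proxy for the kind of metadata quality discussed above is field completeness. As a purely illustrative sketch (not the MQACP's own metric; the field list and function name are assumptions for the example), a repository could score each record by the share of required descriptive fields that actually carry a value:

```python
# Illustrative completeness metric: the fraction of required
# descriptive fields that hold a non-empty value in a record.
REQUIRED_FIELDS = ["title", "description", "creator", "subject", "rights"]

def completeness(record: dict, required=REQUIRED_FIELDS) -> float:
    """Return the share of required fields with a non-empty value."""
    filled = sum(1 for f in required if str(record.get(f, "")).strip())
    return filled / len(required)

record = {
    "title": "Sample learning object",
    "description": "An introductory exercise.",
    "creator": "J. Doe",
    "subject": "",  # empty string counts as missing
    # "rights" is absent entirely
}
score = completeness(record)  # 3 of 5 fields filled -> 0.6
```

Averaging such scores over a federation's records would give one simple, measurable indicator of the sort the dissertation tracks across phases; real assessments typically combine several metrics (accuracy, consistency, conformance) rather than completeness alone.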

    Lived Experiences of Returning Service Members Reintegrating With Their Children on the Autism Spectrum

    Military deployments of a caregiver can have a powerful and potentially damaging impact on the attachment bond between the service member and their child with autism spectrum disorder (ASD). Research has shown that children with ASD react to their caregiver’s departure, that they direct more social behavior to the caregiver than to a stranger, and that many of them increase their proximity-seeking behavior after separation from the caregiver. Military children with ASD have been underrepresented in previous attachment and reintegration research. This qualitative study explored the lived experience of military caregivers attaching or reattaching to their child with ASD after a prolonged absence. Using Ainsworth and Bowlby’s attachment theory, the study examined four military caregivers who had deployed for a period of 6 months or longer while leaving a child with ASD behind. Data from interviews with the four active-duty Navy service members were analyzed to identify relevant themes among the returning service members, which were then broken into structural and textural descriptions, thus forming the essence of their experiences. The results of this study indicated that children with ASD separated from their military caregiver for more than 6 months had increased behavioral challenges. In addition, military caregivers discussed the difficulty of finding programs to assist them with returning home to their child with ASD. Professionals supporting military families with children diagnosed with ASD may be able to recognize and provide interventions to address the emotional needs of the exceptional family member, along with offering parental support to the military caregiver after a prolonged absence.

    Reintegration of Female Non-Commissioned Officer Veterans into the Private Business Sector

    Experiencing a significant career transition can directly impact military veterans. Literature exists on military transition and reintegration but focuses on topics ranging from combat-related disabilities and mental health issues to higher learning. There is a lack of knowledge regarding female Non-Commissioned Officer (NCO) veterans’ transition and reintegration experiences. The purpose of this qualitative study, using a phenomenological design with purposeful sampling, was to explore the lived experiences of female NCO veterans’ process of transition and private-sector reintegration. The research question evaluated participants’ perceptions, leveraging them to increase awareness and improve programs for the U.S. veteran population. Semi-structured interviews were conducted with a sample of 16 female NCO veterans, using audio recording and verbatim transcription of the interviews. The concepts of transition and reintegration formed the basis for the conceptual framework. Through this conceptual lens, Schlossberg’s 4S and Nicholson’s work-role transition models aided in revealing 17 emergent themes. The findings of this explorative study confirmed that transition and reintegration challenges were linked to (a) ineffective transition and reintegration programs, (b) a consistent inability to translate military management skills and experience to private-sector employment, and (c) a lack of gender-specific resources. Government officials, policymakers, and employers can use the findings to improve programs and policies directly impacting management models. Moreover, the findings may help to advance positive social change by influencing perspectives and improving resources, thus contributing to enhanced career transition and private-sector reintegration for U.S. veterans.

    Open Science in Sociology: An Interdisciplinary Survey of Open Scholarship and an Investigation of Its Prevalence in Sociology

    Open Science aims to provide open access to all items produced within the research process, primarily text publications, research data and research software. Moreover, Open Science should also bring transparency to moderating processes (such as the assessment and review of text publications or data) and to the production and use of para-information (such as impact metrics). Open Science proponents describe it as an efficient, innovation-friendly and transparent science, because open information can be disseminated, reused and re-analyzed more quickly and easily than closed information. The work is based on a multidisciplinary inventory of Open Access to text publications, Open Access to research data, Open Access to research software, Open Review and Open Metrics, all of which are more typically found in the STM subjects (Science, Technology, Medicine) than in the social sciences or humanities. Based on this synopsis, the work is dedicated to the specifics of sociology, which is commonly regarded as a latecomer to Open Science. It empirically investigates the prevalence and relevance of Open Science, Open Access to text publications, Open Access to research data, Open Access to research software, Open Review and Open Metrics in sociology.