
    Effective packaging-related specification management software for a packaging documentation system

    An extreme amount of time and money is lost when a company uses an inadequate packaging documentation system: confusion, decreased productivity, inaccurate information, and slow time to market all result. Having the right tool for the job (in this case, specific software) allows needed information to flow throughout a packaging documentation system. To investigate the importance of this tool, this case study focuses on Duracell. Duracell's current packaging documentation system consists of a series of internal electronic transfers and physical distribution of packaging specifications. Relevant information is distributed among departments inside the company and among affiliated departments and businesses outside it. Duracell, like many other companies, can cut significant costs by managing information effectively with well-organized, efficiently operating specification management software. The purpose of this study is to compare, evaluate, and determine Duracell's current packaging documentation management needs. After examining DCS (Document Control System) 6.0 Professional, it was found to be much more effective than Duracell's current system, DCS 2000. The hypothesis of this study is that all types of Duracell's packaging documentation (artwork, bills of materials, CAD drawings, pallet patterns, planograms, etc.) can be integrated into DCS 6.0 Professional, offering substantial cost and time savings throughout Duracell's packaging documentation system. The study determines that DCS 6.0 Professional, when properly used, can provide Duracell with a business solution that enables successful management of completed packaging documentation across Duracell's internal and external organization on a consistent basis.

    An open software development-based ecosystem of R packages for metabolomics data analysis

    A frequent problem with scientific research software is the lack of support, maintenance, and further development. In particular, development by a single researcher can easily result in orphaned software packages, especially when combined with poor documentation or a lack of adherence to open software development standards. The RforMassSpectrometry initiative aims to develop an efficient and stable infrastructure for mass spectrometry (MS) data analysis. As part of this initiative, a growing ecosystem of R software packages is being developed, covering different aspects of metabolomics and proteomics data analysis. To avoid the aforementioned problems, community contributions are fostered, and open development, documentation, and long-term support are emphasized. At the heart of the package ecosystem is the Spectra package, which provides the core infrastructure to handle and analyze MS data. Its design allows easy expansion to support additional file or data formats, including data representations with minimal memory footprint or remote data access. The xcms package for LC-MS data preprocessing was updated to reuse this infrastructure, now also enabling the analysis of very large or remote data. This integration also simplifies complete analysis workflows, which can include the MsFeatures package for compounding and the MetaboAnnotation package for annotation of untargeted metabolomics experiments. Public annotation resources can be easily accessed through packages such as MsBackendMassbank, MsBackendMgf, MsBackendMsp, or CompoundDb, the latter also allowing users to create and manage lab-specific compound databases. Finally, the MsCoreUtils and MetaboCoreUtils packages provide efficient implementations of commonly used algorithms, designed to be reused in other R packages. Ultimately, and in contrast to a monolithic software design, the package ecosystem makes it possible to build customized, modular, and reproducible analysis workflows. 
Future development will focus on improved data structures and analysis methods for chromatographic data, and on better interoperability with other open-source software, including direct integration with Python MS libraries.

    Connecting RPA Development and Business: A Tool for Process Definition, Agile RPA Development and Maintenance

    The world of automation is changing rapidly, especially in the Robotic Process Automation (RPA) field, as companies implement software robots to alleviate repetitive manual work. This thesis has two purposes: first, to find out whether it is possible to improve RPA projects' kick-off, communication, documentation, and maintenance with a PDD-SDD Tool; second, to demonstrate how the PDD-SDD Tool was designed and developed. The initial hypothesis of the project was that a specialized tool for automating documentation and project kick-off could streamline RPA projects and mitigate communication-related misunderstandings. The methods used in the thesis helped in understanding the needs of RPA professionals who work across the whole RPA project life-cycle. As a starting point, the hypothesis was constructed by analyzing current issues in industry RPA projects. Based on the hypothesis, an initial design of the Tool was created and interview questions were produced. Next, three rounds of structured interviews were held to gather data and demonstrate the developed prototypes. Lastly, a final presentation of the Tool was given to the same study group and improvement proposals were collected. The findings of this thesis align with the initial hypothesis: based on the data gathered from the interviewees, the Tool aids professionals with process definition by providing guiding questions during the definition process. Automated documentation generated from the developers' code alleviated the documentation burden, leading to up-to-date documentation and, in turn, to better communication. The code base automatically generated from the process definition, together with the up-to-date documentation, dramatically improved maintenance work. 
The value of this thesis lies in the actualization of a custom Tool developed specifically for MOST Digital's RPA projects' life-cycle. The methodologies used here for data extraction could also be reused to collect new features for the developed Tool or to develop a new tool altogether. The project was initiated in fall 2019 and finished in summer 2021, resulting in a fully functional stand-alone version of the designed PDD-SDD Tool.
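    The documentation-generation idea at the heart of such a tool can be sketched in a few lines. The sketch below is purely illustrative: the function names, the Markdown layout, and the idea of deriving one SDD section per robot function are assumptions, not MOST Digital's actual implementation. It parses robot source code and renders the developers' docstrings as an always-current design document.

```python
import ast

def extract_step_docs(source: str) -> list[tuple[str, str]]:
    """Collect (function name, docstring) pairs from RPA script source."""
    tree = ast.parse(source)
    docs = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            docs.append((node.name, ast.get_docstring(node) or "(undocumented)"))
    return docs

def render_sdd(title: str, source: str) -> str:
    """Render a minimal Solution Design Document as Markdown text."""
    lines = [f"# SDD: {title}", ""]
    for name, doc in extract_step_docs(source):
        lines.append(f"## Step: {name}")
        lines.append(doc)
        lines.append("")
    return "\n".join(lines)

# Hypothetical robot code; in practice the source would be read from the repo.
robot_code = '''
def open_invoice_portal():
    """Log in to the invoicing portal with the service account."""

def download_invoices():
    """Download all unprocessed invoices as PDF."""
'''

print(render_sdd("Invoice handling robot", robot_code))
```

    Because the document is regenerated from the code itself, it cannot drift out of date the way hand-written documentation does, which is the property the thesis credits for the improved communication and maintenance.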

    [Subject benchmark statement]: computing


    Recent Trends in Software Engineering Research As Seen Through Its Publications

    This study provides insight into the field of software engineering through analysis of its recent research publications. Data for this study are taken from the ACM's Guide to Computing Literature (GUIDE). They include both the professionally assigned Computing Classification System (CCS) descriptors and the title text of each software engineering publication reviewed by the GUIDE from 1998 through 2001. The first part of the study provides a snapshot of software engineering by applying co-word analysis techniques to the data. This snapshot indicates recent themes or areas of interest which, when compared with the results of earlier studies, reveal current trends in software engineering. Software engineering continues to have no central focus. Concepts like software development, process improvement, applications, parallelism, and user interfaces are persistent and thus help define the field, but they provide little guidance for researchers or developers of academic curricula. Of more interest and use are the specific themes illuminated by this study, which provide a clearer indication of the field's current interests. Two prominent themes are the related issues of programming-in-the-large and best practices. Programming-in-the-large is the term often applied to large-scale and long-term software development, where project and people management, code reusability, performance measures, documentation, and software maintenance issues take on special importance. These issues began emerging in earlier periods but seem to have risen to prominence during the current period. Another important discovery is the trend in software development toward using networking and the Internet. Many network- and Internet-related descriptors were added to the CCS in 1998; their prominent appearance and immediate use during this period indicate that this is a real trend and not just an aberration caused by their recent addition. 
The titles of the period reflect the prominent themes and trends. In addition to corroborating the keyword analysis, the title text confirms the relevance of the CCS and its most recent revision. By revealing current themes and trends in software engineering, this study offers guidance to developers of academic curricula and indicates directions for further research and study.
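    The co-word technique the study applies can be illustrated with a minimal sketch: count how often pairs of descriptors are assigned to the same publication, so that strongly co-occurring pairs surface as themes. The descriptor sets below are invented examples, not GUIDE data, and real co-word studies typically normalize these raw counts into an association index before mapping.

```python
from collections import Counter
from itertools import combinations

def coword_counts(publications):
    """Count how often each pair of descriptors co-occurs on one publication."""
    pairs = Counter()
    for descriptors in publications:
        # Sort so each unordered pair has one canonical key.
        for a, b in combinations(sorted(set(descriptors)), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy corpus: each set is the descriptors of one publication.
pubs = [
    {"Software Maintenance", "Reusable Software", "Documentation"},
    {"Software Maintenance", "Documentation"},
    {"User Interfaces", "Documentation"},
]
for pair, n in coword_counts(pubs).most_common(2):
    print(pair, n)
```

    In this toy corpus the pair (Documentation, Software Maintenance) dominates, which is exactly the kind of signal the study reads as a theme such as programming-in-the-large.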

    Proceedings of the Eighth Annual Software Engineering Workshop

    The four major topics of discussion were: the NASA Software Engineering Laboratory, software testing, human factors in software engineering, and software quality assessment. As in past years, 12 position papers were presented (three per topic), followed by questions and very heavy participation by the general audience.

    Identifying and addressing adaptability and information system requirements for tactical management


    Lifecycle information for e-literature: full report from the LIFE project

    This report is a record of the LIFE Project. The Project ran for one year, and its aim was to deliver crucial information about the cost and management of digital material, information that can in turn be applied by any institution with an interest in preserving and providing access to electronic collections. The Project is a joint venture between The British Library and UCL Library Services, funded by JISC under programme area (i) as listed in paragraph 16 of the JISC 4/04 circular (Institutional Management Support and Collaboration); as such it has set requirements and outcomes which must be met, and the Project has done its best to meet them. Where the Project has been unable to answer specific questions, strong recommendations have been made for future project work to do so. The expected outcomes of the Project are a practical set of guidelines and a framework within which costs can be applied to digital collections in order to answer the following questions:
    • What is the long-term cost of preserving digital material?
    • Who is going to do it?
    • What are the long-term costs for a library in HE/FE to partner with another institution to carry out long-term archiving?
    • What are the comparative long-term costs of a paper and a digital copy of the same publication?
    • At what point will there be sufficient confidence in the stability and maturity of digital preservation to switch from paper for publications available in parallel formats?
    • What are the relative risks of digital versus paper archiving?
    The Project has attempted to answer these questions using a developing lifecycle methodology and three diverse collections of digital content. The LIFE Project team chose UCL e-journals, BL Web Archiving, and the BL VDEP digital collections to provide a strong challenge to the methodology as well as to help reach the key Project aim of attributing long-term cost to digital collections. 
    The results from the Case Studies and the Project findings are both surprising and illuminating.
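    The kind of cost framework such a project implies can be sketched as a split between one-off and recurring lifecycle stages. The stage names and figures below are illustrative assumptions only, not the LIFE Project's actual model or data:

```python
def lifecycle_cost(stage_costs: dict[str, float], years: int,
                   annual_stages: set[str]) -> float:
    """Total cost over `years`: one-off stages counted once,
    recurring stages counted every year."""
    total = 0.0
    for stage, cost in stage_costs.items():
        total += cost * (years if stage in annual_stages else 1)
    return total

# Hypothetical per-item figures (GBP), not LIFE Project data.
digital = {"acquisition": 5.0, "ingest": 3.0, "metadata": 4.0,
           "storage": 0.5, "preservation": 1.5, "access": 0.2}
recurring = {"storage", "preservation", "access"}
print(lifecycle_cost(digital, years=10, annual_stages=recurring))
```

    Separating one-off from recurring stages is what makes long-term comparisons possible: the paper-versus-digital question above turns on how the recurring storage and preservation costs of each format accumulate over the chosen time horizon.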