    TANGO: Transparent heterogeneous hardware Architecture deployment for eNergy Gain in Operation

    The paper is concerned with how software systems actually use Heterogeneous Parallel Architectures (HPAs), with the goal of optimising power consumption on these resources. It argues that novel methods and tools are needed to support software developers aiming to optimise the power consumption that results from designing, developing, deploying and running software on HPAs, while maintaining other quality aspects of the software at adequate and agreed levels. To this end, a reference architecture to support energy efficiency at application construction, deployment, and operation is discussed, together with its implementation and evaluation plans. Comment: Part of the Program Transformation for Programmability in Heterogeneous Architectures (PROHA) workshop, Barcelona, Spain, 12th March 2016, 7 pages, LaTeX, 3 PNG figures
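    As an illustration of the kind of developer-facing support such a reference architecture targets, the hypothetical sketch below shows a simple energy-aware selection step for a heterogeneous platform; the Device type, the per-task energy figures and the candidate list are assumptions made here for illustration and are not taken from the paper.

        import java.util.Comparator;
        import java.util.List;

        // Hypothetical sketch: pick the processing element expected to use the least
        // energy for a task while still meeting a deadline (a simple quality constraint).
        record Device(String name, double joulesPerTask, double secondsPerTask) {}

        class EnergyAwareSelector {
            // Choose the lowest-energy device that still meets the deadline.
            static Device select(List<Device> candidates, double deadlineSeconds) {
                return candidates.stream()
                        .filter(d -> d.secondsPerTask() <= deadlineSeconds)
                        .min(Comparator.comparingDouble(Device::joulesPerTask))
                        .orElseThrow(() -> new IllegalStateException("no device meets the deadline"));
            }

            public static void main(String[] args) {
                List<Device> hpa = List.of(
                        new Device("cpu", 12.0, 0.8),   // illustrative, measured or modelled figures
                        new Device("gpu", 7.5, 0.3),
                        new Device("fpga", 4.0, 1.5));
                System.out.println("selected: " + select(hpa, 1.0).name());
            }
        }

    Under these illustrative figures the GPU is selected: the FPGA is cheapest in energy but misses the deadline, so the constraint filter excludes it before the energy comparison.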

    Addressing data management training needs: a practice based approach from the UK

    In this paper, we describe the current challenges to the effective management and preservation of research data in UK universities, and the response provided by the JISC Managing Research Data programme. The paper discusses, inter alia, the findings and conclusions from the data management training projects of the first iteration of the programme and how they informed the design of the second, paying particular attention to initiatives to develop and embed training materials.

    Moving beyond e-journals

    Paul Ayris explains to Elspeth Hyams why scholarly communication has moved beyond the debate on e-journal pricing and open access.

    The Critical Need for Software Architecture Practices in Software Development Process

    Software architecture is the master plan of every reliable software system. It is the building block of any software system and greatly determines the system's success. This paper argues that every system needs a good architecture, and that this requires the use of good architecture engineering practices in the software development process. The paper recognises software architecture practice as a discipline pervading all phases of software development and identifies, on the basis of a framework, some of the pertinent areas where architectural practice can be applied. In addition, a model showing how software architecture fits into the phases of a generic software development process lifecycle is presented. The model is intended to enable software developers and acquirers to use effective software architecture practices during development in order to exert significantly greater control over software product qualities. Keywords: Software architecture, Software Development, Software, Quality, Stakeholders, Software engineering

    Key Performance Indicators for Implementing Sustainability and Environmental Protection in Early Process Design Activities

    The adoption of a sustainability perspective in the chemical industry should start in the early phases of process design (e.g. conceptual design, technology selection, process development), where the key drivers of the environmental, economic, and hazard fingerprint of a process are defined. These phases also offer the opportunity to make design changes at lower cost. Sound support of design activities requires quantitative tools that allow assessment of the sustainability profile of a process, identification of possible improvements, and informed trade-offs. Although several tools for process development have been proposed in recent decades, their application in current practice is still limited because of issues with data requirements, indicator definition, and customization to specific application needs (e.g. PFD definition in the design of polypropylene production plants). This study focuses on the application to early process design of environmental and exergy Key Performance Indicators (KPIs) to support sustainability-oriented design activities. The method was tailored to the specific industrial application of polypropylene production plants. The choice of a specific sector allowed customization of the method, promoting ease of application and allowing the assessment of multiple scenarios (e.g. sensitivity to material and energy supply strategies, comparison of different technologies). The results provide sustainability guidelines to improve design activities within this scope from a life-cycle perspective.
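    For concreteness, exergy-based KPIs of the kind referred to here are commonly built around a rational exergy efficiency; the formulation below is a generic textbook definition, given as an assumed illustration rather than the exact indicator set used in the study.

        \[
          \eta_{ex} \;=\; \frac{\dot{E}x_{\mathrm{products}}}{\dot{E}x_{\mathrm{in}}},
          \qquad
          \dot{E}x_{\mathrm{loss}} \;=\; \dot{E}x_{\mathrm{in}} - \dot{E}x_{\mathrm{products}},
        \]
        where \(\dot{E}x_{\mathrm{in}}\) is the total exergy supplied to the process step and \(\dot{E}x_{\mathrm{products}}\) is the exergy embodied in its useful products.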

    Towards Logic Programming as a Service: Experiments in tuProlog

    In this paper we explore the perspective of Logic Programming as a Service (LPaaS), with a broad notion of "service" going beyond the mere handling of the logic engine lifecycle, knowledge base management, reasoning query execution, etc. In particular, we present tuProlog as-a-service, a Prolog engine based on the tuProlog core made available as an encapsulated service to effectively support the spreading of intelligence in pervasive systems—mainly, Internet-of-Things (IoT) application scenarios. So, after recalling the main features of tuProlog technology, we discuss the design and implementation of tuProlog as-a-service, focussing in particular on the iOS platform because of the many supported smart devices (phones, watches, etc.), the URL-based communication support among apps, and the resulting multi-language scenarios. (Roberta Calegari; Enrico Denti; Stefano Mariani; Andrea Omicini)
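    For readers unfamiliar with tuProlog, the snippet below shows the classic JVM-side embedding of the engine using the standard alice.tuprolog API; it is only a minimal local example of the core that the paper wraps as a service, not the iOS or LPaaS interface described in the article.

        import alice.tuprolog.Prolog;
        import alice.tuprolog.SolveInfo;
        import alice.tuprolog.Theory;

        public class TuPrologDemo {
            public static void main(String[] args) throws Exception {
                Prolog engine = new Prolog();                      // local engine instance
                engine.setTheory(new Theory(                       // load a tiny knowledge base
                        "parent(abraham, isaac).\n" +
                        "parent(isaac, jacob).\n" +
                        "grandparent(X, Z) :- parent(X, Y), parent(Y, Z).\n"));
                SolveInfo result = engine.solve("grandparent(abraham, Who).");
                if (result.isSuccess()) {                          // inspect the first solution
                    System.out.println(result.getVarValue("Who")); // prints: jacob
                }
            }
        }

    The LPaaS idea discussed in the paper essentially moves this engine behind a service boundary, so that the theory loading and the solve requests above become remote operations rather than local method calls.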

    ALMA: ALgorithm Modeling Application

    As of today, the most recent trend in information technology is the employment of large-scale data analytics powered by Artificial Intelligence (AI), influencing the priorities of businesses and research centers all over the world. However, due to both the lack of specialized talent and the need for greater compute capacity, less established businesses struggle to adopt such endeavors, with major technological mega-corporations such as Microsoft, Facebook and Google taking the upper hand in this uneven playing field. Therefore, in an attempt to promote the democratization of AI and increase the efficiency of data scientists, this work proposes a novel no-code/low-code AI platform: the ALgorithm Modeling Application (ALMA). Moreover, as the state of the art of such platforms is still gradually maturing, current solutions often fail to encompass security and safety aspects directly in their process. In that respect, the solution proposed in this thesis aims not only to achieve greater development and deployment efficiency while building machine learning applications, but also to improve on other platforms by addressing the inherent pitfalls of AI through a "secure by design" philosophy.

    Managing Research Data in Big Science

    The project which led to this report was funded by JISC in 2010–2011 as part of its 'Managing Research Data' programme, to examine the way in which Big Science data is managed and to produce any recommendations which may be appropriate. Big Science data is different: it comes in large volumes, and it is shared and exploited in ways which may differ from other disciplines. This project has explored these differences using as a case study gravitational-wave data generated by the LIGO Scientific Collaboration (LSC), and has produced recommendations intended to be useful variously to JISC, the funding council (STFC) and the LSC community. In Sect. 1 we define what we mean by 'big science' and describe the overall data culture there, laying stress on how it necessarily or contingently differs from other disciplines. In Sect. 2 we discuss the benefits of a formal data-preservation strategy, and the cases for open data and for well-preserved data that follow from it. This leads to our recommendation that, in essence, funders should adopt rather light-touch prescriptions regarding data preservation planning: normal data management practice, in the areas under study, corresponds to notably good practice in most other areas, so the only change we suggest is to make this planning more formal, which makes it more easily auditable and more amenable to constructive criticism. In Sect. 3 we briefly discuss the LIGO data management plan and pull together whatever information is available on the estimation of digital preservation costs. The report is informed, throughout, by the OAIS reference model for an open archive.