35 research outputs found

    DIGISER. Digital Innovation in Governance and Public Service Provision

    Digital Innovation Challenges. In view of the increasingly intense pressure on the public sector to address the challenges of our time, governments and other public entities are gradually adopting digital innovation, seeking to deliver high-quality public services. Digital technologies and capabilities create opportunities to re-organise public service innovation and delivery in ways that reduce cost and increase quality, proactiveness and citizen-centricity. Multilevel governance, networks and other collaboration systems (at local, regional, national and international level) are gaining importance as key drivers of this process of digital innovation and transformation. The link to the innovation ecosystem, including all sectors of activity, both private and public (e.g., academia, industry, business, citizens and governments), appears fundamental in all phases of the creation, development, implementation and maintenance of public services and policy making. Information and communication technologies are conceived as essential elements to support the creation and sustainability of these collaboration processes.

    In an era in which information gains relevance in the management of the territory and enables new power relations, the expectations of citizens are increasingly demanding and specific. Considering the developments of recent years, such as the economic, social and health crises, the pressure to resolve global challenges is progressively transferred to the scope of cities. Several elements contribute to the importance of cities in the digital innovation and transformation process, namely buying power, proximity to citizens and the ability to work across different sectors. In fact, urban territories increasingly represent a greater share of citizens (in Europe, for example, they account for 75% of the population), have greater autonomy in management, contribute around 80% of global GDP worldwide and have the potential to make a major contribution to the resolution of global challenges.

    The balance between change (promoted by digital innovation strategies) and stability (driven by organisational inertia) needs to be handled carefully. The transformation process has to be based on a long-term strategy and to occur in a sustainable way, by focusing on learning experiences and knowledge and technology transfer, while being sensitive to the local context to ensure improvement.

    At the European level, the Digital Transition has been considered a main goal for the next decade. The EU launched the European Green Deal and Europe Fit for the Digital Age, a twin initiative which links the green and digital transitions. The vision for the EU's digital decade is reflected in the Digital Compass 2030 and includes four cardinal points: skills, government, infrastructure and business. With the aim of having 100% of key public services online by 2030, the Digital Compass ensures that digital will contribute positively to improving citizens' quality of life while reducing the resources spent. To support this vision, and recognising the importance of community-led, data-driven solutions and the potential of collaborative approaches, several initiatives are being implemented. The Living-in.EU movement, for example, points out the 'European Way', where multi-level governance and co-creation processes support the development of a cohesive digital Europe on the path towards digital transition. Another initiative contributing to this strategy is promoted by Open & Agile Smart Cities, which is connecting cities through Minimal Interoperability Mechanisms (MIMs): "a set of practical capabilities based on open technical specifications that allow cities and communities to replicate and scale solutions globally". The MIMs contribute to the creation of the European Single Market by providing a common technical ground for the procurement and deployment of urban data platforms and end-to-end solutions in cities.
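    As a rough illustration of what building on such interoperability mechanisms can look like in practice, the sketch below queries a city context broker through the NGSI-LD API, the open specification that OASC's MIM1 (context information management) is commonly anchored to. The broker URL, entity type and attribute names are hypothetical placeholders, not artefacts of the DIGISER project.

```python
import requests

# Hypothetical NGSI-LD context broker exposed by a city data platform; the URL,
# entity type and attribute below are illustrative placeholders only.
BROKER = "https://broker.example-city.eu/ngsi-ld/v1/entities"

# Retrieve air-quality entities whose NO2 value exceeds a threshold, using the
# standard NGSI-LD query parameters (type, q, limit).
params = {"type": "AirQualityObserved", "q": "NO2>40", "limit": 20}
headers = {"Accept": "application/ld+json"}

response = requests.get(BROKER, params=params, headers=headers, timeout=10)
response.raise_for_status()

for entity in response.json():
    no2 = entity.get("NO2", {}).get("value")
    print(entity["id"], no2)
```

    Because the same specification is implemented by different vendors, the identical query can in principle be reused across cities, which is the point of the MIMs.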

    Big-Data Science in Porous Materials: Materials Genomics and Machine Learning

    By combining metal nodes with organic linkers we can potentially synthesize millions of possible metal-organic frameworks (MOFs). At present, we have libraries of over ten thousand synthesized materials and millions of in-silico predicted materials. The fact that we have so many materials opens many exciting avenues to tailor-make a material that is optimal for a given application. However, from an experimental and computational point of view we simply have too many materials to screen using brute-force techniques. In this review, we show that having so many materials allows us to use big-data methods as a powerful technique to study these materials and to discover complex correlations. The first part of the review gives an introduction to the principles of big-data science. We emphasize the importance of data collection, methods to augment small data sets, and how to select appropriate training sets. An important part of this review is the different approaches that are used to represent these materials in feature space. The review also includes a general overview of the different ML techniques, but as most applications in porous materials use supervised ML, our review focuses on the different approaches for supervised ML. In particular, we review the different methods to optimize the ML process and how to quantify the performance of the different methods. In the second part, we review how the different approaches of ML have been applied to porous materials. In particular, we discuss applications in the field of gas storage and separation, the stability of these materials, their electronic properties, and their synthesis. The range of applications illustrates the large variety of topics that can be studied with big-data science. Given the increasing interest of the scientific community in ML, we expect this list to expand rapidly in the coming years.
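    To make the supervised-ML workflow discussed above concrete, here is a minimal sketch, assuming scikit-learn is available, that trains a random-forest regressor on a synthetic stand-in for a MOF library described by simple geometric descriptors. The descriptors, the simulated uptake target and all numbers are illustrative and do not come from the review.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for a MOF library: each row is a material described by
# three hypothetical geometric descriptors (void fraction, gravimetric
# surface area in m^2/g, largest pore diameter in angstrom).
X = rng.uniform(low=[0.1, 500.0, 3.0], high=[0.9, 6000.0, 30.0], size=(2000, 3))

# Hypothetical target: a noisy function of the descriptors standing in for a
# simulated gas uptake; a real study would use GCMC or experimental labels.
y = 20.0 * X[:, 0] + 0.002 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0.0, 1.0, size=2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Supervised-ML step: fit on the training set, then quantify performance on held-out data.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("test MAE:", mean_absolute_error(y_test, model.predict(X_test)))
print("feature importances:", model.feature_importances_)
```

    The held-out error and feature importances stand in for the performance-quantification and feature-representation questions the review discusses; real studies replace the synthetic table with computed or measured descriptors for each framework.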

    Artificial intelligence in biological activity prediction

    Artificial intelligence has become an indispensable resource in chemoinformatics. Numerous machine learning algorithms for activity prediction have emerged recently, becoming a key approach to mine chemical information from large compound datasets. These approaches enable the automation of compound discovery to find biologically active molecules with important properties. Here, we present a review of some of the main machine learning studies in biological activity prediction of compounds, in particular for sweetness prediction. We discuss some of the most used compound featurization techniques and the major databases of chemical compounds relevant to these tasks. This study was supported by the European Commission through project SHIKIFACTORY100 - Modular cell factories for the production of 100 compounds from the shikimate pathway (Reference 814408), and by the Portuguese FCT under the scope of the strategic funding of the UID/BIO/04469/2019 unit and the BioTecNorte operation (NORTE-01-0145-FEDER-000004), funded by the European Regional Development Fund under the scope of Norte2020.
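    As a hedged illustration of the featurization-plus-prediction pipeline surveyed here, the sketch below encodes molecules as Morgan fingerprints with RDKit and trains a small random-forest classifier on a toy, hypothetical sweet/not-sweet labelling. It assumes RDKit and scikit-learn are installed and does not reproduce any model or dataset from the review.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

# Tiny, purely illustrative dataset (hypothetical labels: 1 = sweet, 0 = not sweet);
# a real study would draw on curated databases of tastants.
smiles = [
    "C(C1C(C(C(C(O1)O)O)O)O)O",        # glucose
    "C1=CC=C2C(=C1)C(=O)NS2(=O)=O",    # saccharin
    "CC(=O)OC1=CC=CC=C1C(=O)O",        # aspirin
    "CN1C=NC2=C1C(=O)N(C(=O)N2C)C",    # caffeine
]
labels = [1, 1, 0, 0]

def featurize(smi, radius=2, n_bits=2048):
    """Encode a SMILES string as a Morgan (circular) fingerprint bit vector."""
    mol = Chem.MolFromSmiles(smi)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)
    return np.array(list(fp), dtype=np.int8)

X = np.vstack([featurize(s) for s in smiles])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# Hypothetical query: predicted class probabilities for glycerol.
query = featurize("OCC(O)CO").reshape(1, -1)
print(clf.predict_proba(query))
```

    Swapping the fingerprint function for another featurization, or the classifier for a neural network, leaves the rest of the pipeline unchanged, which is why featurization choices receive so much attention in this literature.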

    Effect of missing data on multitask prediction methods

    There has been a growing interest in multitask prediction in chemoinformatics, helped by the increasing use of deep neural networks in this field. This technique is applied to multitarget data sets, where compounds have been tested against different targets, with the aim of developing models to predict a profile of biological activities for a given compound. However, multitarget data sets tend to be sparse; i.e., not all compound-target combinations have experimental values. There has been little research on the effect of missing data on the performance of multitask methods. We have used two complete data sets to simulate sparseness by removing data from the training set, and compared different models for removing the data. These sparse sets were used to train two different multitask methods: deep neural networks and Macau, a Bayesian probabilistic matrix factorization technique. Results from both methods were remarkably similar and showed that the performance decrease caused by missing data is at first small, but accelerates once large amounts of data are removed. This work provides a first approximation to assess how much data is required to produce good performance in multitask prediction exercises.
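    The sparsity-simulation idea can be sketched as follows: generate a complete synthetic compound x target matrix, hide a chosen fraction of entries, and train a small multitask network with a loss computed only over the observed entries. The PyTorch network below is a simplified stand-in for the deep neural networks and Macau models used in the paper, and all data are synthetic.

```python
import numpy as np
import torch
from torch import nn

rng = np.random.default_rng(0)

# Synthetic, complete compound x target activity matrix (hypothetical sizes:
# 500 compounds, 64 descriptors, 10 related targets sharing a linear structure).
n_compounds, n_features, n_targets = 500, 64, 10
X = rng.normal(size=(n_compounds, n_features)).astype(np.float32)
W = rng.normal(size=(n_features, n_targets)).astype(np.float32)
Y = X @ W + rng.normal(scale=0.5, size=(n_compounds, n_targets)).astype(np.float32)

# Simulate sparseness: hide a chosen fraction of compound-target pairs at random,
# mimicking the removal of data from a complete training set.
missing_fraction = 0.6
observed = rng.random(Y.shape) > missing_fraction  # True where the label is kept

X_t, Y_t = torch.from_numpy(X), torch.from_numpy(Y)
mask_t = torch.from_numpy(observed)

# Small multitask network: one shared hidden layer feeding one output per target.
model = nn.Sequential(nn.Linear(n_features, 128), nn.ReLU(), nn.Linear(128, n_targets))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):
    pred = model(X_t)
    loss = ((pred - Y_t)[mask_t] ** 2).mean()  # masked MSE over observed entries only
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

with torch.no_grad():
    mse_all = ((model(X_t) - Y_t) ** 2).mean().item()
print(f"trained on {100 * (1 - missing_fraction):.0f}% of labels, MSE on all entries: {mse_all:.3f}")
```

    Sweeping missing_fraction over a range of values and re-running the loop reproduces, in miniature, the kind of performance-versus-sparseness curve the study examines.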

    Need for recovery amongst emergency physicians in the UK and Ireland: A cross-sectional survey

    OBJECTIVES: To determine the need for recovery (NFR) among emergency physicians and to identify demographic and occupational characteristics associated with higher NFR scores. DESIGN: Cross-sectional electronic survey. SETTING: Emergency departments (EDs) (n=112) in the UK and Ireland. PARTICIPANTS: Emergency physicians, defined as any registered physician working principally within the ED, responding between June and July 2019. MAIN OUTCOME MEASURE: NFR Scale, an 11-item self-administered questionnaire that assesses how work demands affect intershift recovery. RESULTS: The median NFR Score for all 4247 eligible, consented participants with a valid NFR Score was 70.0 (95% CI: 65.5 to 74.5), with an IQR of 45.5-90.0. A linear regression model indicated statistically significant associations between the NFR Score and gender, health conditions, type of ED, clinical grade, access to annual and study leave, and time spent working out-of-hours. Male physicians, consultants, general practitioners (GPs) working within the ED, those working in paediatric EDs and those with no long-term health condition or disability had lower NFR Scores. After adjusting for these characteristics, the NFR Score increased by 3.7 (95% CI: 0.3 to 7.1) and 6.43 (95% CI: 2.0 to 10.8) for those with difficulty accessing annual and study leave, respectively. An increased percentage of out-of-hours work raised the NFR Score almost linearly: 26%-50% out-of-hours work=5.7 (95% CI: 3.1 to 8.4); 51%-75% out-of-hours work=10.3 (95% CI: 7.6 to 13.0); 76%-100% out-of-hours work=14.5 (95% CI: 11.0 to 17.9). CONCLUSION: Higher NFR Scores were observed among emergency physicians than reported for any other profession or population to date. While out-of-hours working is unavoidable, the linear relationship observed suggests that any reduction may improve NFR. Evidence-based strategies to improve well-being, such as proportional out-of-hours working and improved access to annual and study leave, should be carefully considered and implemented where feasible.
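    For readers interested in the modelling step, a minimal sketch of the kind of linear regression described in the abstract is shown below, using statsmodels on simulated survey data; the variable names and values are hypothetical and do not reproduce the published estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Simulated survey responses with the kinds of covariates reported in the study;
# all values are hypothetical placeholders, not the published data.
df = pd.DataFrame({
    "nfr_score": rng.normal(70, 20, n).clip(0, 100),
    "gender": rng.choice(["male", "female"], n),
    "grade": rng.choice(["consultant", "trainee", "GP"], n),
    "leave_difficulty": rng.choice(["no", "yes"], n),
    "out_of_hours": rng.choice(["0-25%", "26-50%", "51-75%", "76-100%"], n),
})

# Linear regression of the NFR Score on demographic and occupational factors,
# mirroring the modelling approach described in the abstract.
model = smf.ols(
    "nfr_score ~ C(gender) + C(grade) + C(leave_difficulty) + C(out_of_hours)",
    data=df,
).fit()
print(model.summary())
```

    The fitted coefficients on the categorical terms play the role of the adjusted score differences reported in the abstract, with confidence intervals read from the model summary.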