380,803 research outputs found

    Understanding DNS Query Composition at B-Root

    The Domain Name System (DNS) is part of critical internet infrastructure: DNS is invoked whenever a remote server is accessed (a URL is visited, an API request is made, etc.) by any application. DNS queries are served in a hierarchical manner, with most queries answered locally from cached data and a small fraction propagating to the top of the hierarchy, the DNS root name servers. Our research aims to provide a comprehensive, longitudinal characterization of DNS queries received at B-Root over ten years. We sampled and analyzed a 28-billion-query dataset from the ten annual Day in the Life of the Internet (DITL) experiments from 2013 through 2022. We sought to identify and quantify unexpected DNS queries, establish longitudinal trends, and compare our findings with published results of others. We found that unexpected query traffic increased from 39.57% in 2013 to 67.91% in 2022, with 36.55% of queries being priming queries. We also observed the growth and decline of Chromium-initiated, random DNS queries. Finally, we analyzed the largest DNS query senders and established that most of their traffic consists of unexpected queries.
    Comment: 20 pages with 18 figures and 1 table. Published and presented at the 2022 IEEE/ACM International Conference on Big Data Computing, Applications and Technologies (BDCAT).
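    As a hedged illustration of the kind of classification the study performs, the sketch below sorts root-server queries into priming, random-probe, and other traffic classes. The heuristics (the small TLD list, the 7-15 character single-label rule attributed to Chromium's intranet-redirect probes) are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: classifying root-server queries along the lines the
# paper describes (priming queries, Chromium-style random probes, other
# unexpected traffic). The heuristics are illustrative assumptions, not
# the authors' classifier.

# A small stand-in for the real root zone's list of delegated TLDs.
KNOWN_TLDS = {"com", "net", "org", "arpa", "io", "uk"}

def classify_query(qname: str, qtype: str) -> str:
    """Return a coarse traffic class for one DNS query."""
    # Priming query: asking the root for the NS records of the root zone.
    if qname in ("", ".") and qtype == "NS":
        return "priming"
    labels = qname.rstrip(".").split(".")
    tld = labels[-1].lower()
    # Chromium's intranet probes were single random labels of 7-15
    # characters (an assumption based on public descriptions).
    if len(labels) == 1 and 7 <= len(tld) <= 15 and tld not in KNOWN_TLDS:
        return "random-probe"
    # Names under a delegated TLD are the "expected" case; a root server
    # normally only sees them when resolver caches miss.
    if tld in KNOWN_TLDS:
        return "expected"
    return "unexpected"

if __name__ == "__main__":
    sample = [(".", "NS"), ("xkqzhtrvw", "A"), ("example.com", "A"), ("local", "A")]
    for qname, qtype in sample:
        print(qname or ".", qtype, "->", classify_query(qname, qtype))
```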

    Responsible Data Governance of Neuroscience Big Data

    Open access article. Current discussions of the ethical aspects of big data are shaped by concerns regarding the social consequences of both the widespread adoption of machine learning and the ways in which biases in data can be replicated and perpetuated. We instead focus here on the ethical issues arising from the use of big data in international neuroscience collaborations. Neuroscience innovation relies upon neuroinformatics, large-scale data collection and analysis enabled by novel and emergent technologies. Each step of this work involves aspects of ethics, ranging from concerns for adherence to informed consent or animal protection principles and issues of data re-use at the stage of data collection, to data protection and privacy during data processing and analysis, and issues of attribution and intellectual property at the data-sharing and publication stages. Significant dilemmas and challenges with far-reaching implications are also inherent, including reconciling the ethical imperative for openness and validation with data protection compliance and considering future innovation trajectories or the potential for misuse of research results. Furthermore, these issues are subject to local interpretations within different ethical cultures applying diverse legal systems emphasising different aspects. Neuroscience big data require a concerted approach to research across boundaries, wherein ethical aspects are integrated within a transparent, dialogical data governance process. We address this by developing the concept of “responsible data governance,” applying the principles of Responsible Research and Innovation (RRI) to the challenges presented by the governance of neuroscience big data in the Human Brain Project (HBP).

    A Framework for Integrating Transportation Into Smart Cities

    In recent years, economic, environmental, and political forces have quickly given rise to “Smart Cities” -- an array of strategies that can transform transportation in cities. Using a multi-method approach, this study develops a framework for smart cities that can be employed to: understand what a smart city is and how to replicate smart city successes; assess the role of pilot projects, metrics, and evaluations in testing, implementing, and replicating strategies; and understand the role of shared micromobility, big data, and other key issues impacting communities. This research provides recommendations for policy and professional practice as they relate to integrating transportation into smart cities.

    "Billion Dollar Bets" to Create Economic Opportunity for Every American

    The American Dream--the notion that if you "work hard and play by the rules," you will improve your lot in life--has become impossible for Americans to achieve. That was the conclusion of nearly six out of ten people who responded to a June 2014 CNNMoney poll. In a December 2015 Harvard Institute of Politics survey of millennials, nearly half pronounced the American Dream "dead." Given that social mobility in the United States has largely remained stagnant for more than 30 years, many people doubt there is a better economic future for themselves and their children. Indeed, it will take a sustained effort to restore economic opportunity for all Americans. But according to research by The Bridgespan Group, reports of the American Dream's demise just might be premature. Drawing from an extensive research base--as well as dozens of interviews with experts and practitioners and the diverse perspectives of an advisory board--a Bridgespan team embarked on an effort to map out "what matters most" to increase upward economic mobility for millions of low-income Americans. (Learn more about our research effort in the Overview of Research.) The team identified an array of on-the-ground interventions that are already building pathways to the middle class, as well as promising innovations that are just beginning to emerge. The results of that investigation can be found in this report, "Billion Dollar Bets" to Create Economic Opportunity for Every American. We framed our research around this question: "How could a philanthropic investment of $1 billion dramatically increase upward social mobility for low-income individuals and families?" With access to capital that is flexible and adaptable, philanthropists are uniquely positioned to put social mobility on an upward trajectory. Roughly 80 percent of the largest donors aspire to impel social change, but just 20 percent of philanthropic investments above $10 million went to social-change organizations between 2000 and 2012. Philanthropists have lacked the sightlines into shovel-ready projects, and they have lacked the confidence that large investments would actually impact the economic lives of many people. Our intent was to create a series of roadmaps that illustrate how investments of $1 billion might improve the lifetime earnings of millions of low-income Americans. We began by identifying four promising areas where large investments of private capital would likely catalyze population-level change. We then evaluated scores of concepts for restoring the meritocratic ideal to many more Americans. Working with our advisory board, we selected 15 of those concepts as illustrative "big bets" that span the four investment areas. To get a better understanding of the promise and pitfalls that come with any attempt to take on the social mobility challenge, we took a deeper dive into six of the proposed bets: improve early childhood development; establish clear and viable pathways to careers; decrease rates of conviction and incarceration; reduce unintended pregnancies; reduce the effect of concentrated poverty on the lives of people living in distressed neighborhoods; and improve the performance of public systems that administer and oversee social services.

    Complex railway systems: capacity and utilisation of interconnected networks

    Introduction: Worldwide, the transport sector faces several issues related to rising traffic demand, such as congestion, energy consumption, noise, pollution, and safety. To mitigate these problems, the European Commission is encouraging a modal shift towards railways, considered one of the key factors for the development of a more sustainable European transport system. The coveted increase in the railway share of transport demand over the next decades, and the attempt to open up the rail market (for freight, international and, recently, also local services), strengthen the attention to capacity usage of the system. This contribution proposes a synthetic methodology for the capacity and utilisation analysis of complex interconnected rail networks. The procedure has a dual scope, since it allows both a theoretically robust examination of suburban rail systems and a solid approach that can be applied, with few additional and consistent assumptions, to feasibility or strategic analyses of wide networks (by efficiently exploiting Big Data and/or available Open Databases).

    Method: The approach proposes a schematisation of the typical elements of a rail network (stations and line segments) to be applied in case of a lack of more detailed data; in the authors' opinion, the strengths of the presented procedure stem from the flexibility of the applied synthetic methods and from the joint analysis of nodes and lines. After building a quasi-automatic model to carry out several analyses by changing the boundary conditions or assumptions, the article also presents some general abacuses showing the variability of the capacity/utilisation of the network's elements as a function of basic parameters.

    Results: This has helped in both of the presented case studies: one focuses on a detailed analysis of the Naples suburban node, while the other broadens the horizon by examining the whole European rail network, with a more specific zoom on the Belgian area. The first application shows how the procedure can be applied when fine-grained data are available and for metropolitan/regional analysis, allowing precise detection of possible bottlenecks in the system and the identification of possible interventions to relieve the high usage rate of these elements. The second application represents an ongoing attempt to provide a broad analysis of capacity and related parameters for the entire European railway system. It explores the potential of the approach and the possible exploitation of different 'Open and Big Data' sources, but the outcomes underline the necessity of relying on proper and adequate information; the accuracy of the results depends significantly on the design and precision of the input database.

    Conclusion: The proposed methodology aims to evaluate the capacity and utilisation rates of rail systems at different geographical scales and according to data availability. The outcomes might provide valuable information to allow efficient exploitation and deployment of railway infrastructure, better supporting policy (e.g. investment prioritization, rail infrastructure access charges) and helping to minimize costs for users. The presented case studies show that the method allows indicative evaluations of the use of the system and comparative analyses between different elementary components, providing a first identification of 'weak' links or nodes for which specific and detailed analyses should then be carried out, taking into account in more depth their actual configuration, technical characteristics, and the real composition of the traffic (i.e. other elements influencing rail capacity, such as the adopted operating systems, the station traffic/route control and safety systems, the elastic release of routes, the overlap of block sections, etc.).
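    As a toy illustration of the kind of synthetic capacity figure such a methodology produces for a network element, the sketch below derives a line segment's theoretical capacity from a minimum headway and computes the resulting utilisation rate; this textbook simplification is an assumption for illustration, not the authors' model.

```python
# Toy illustration of a synthetic capacity/utilisation figure for a rail
# line segment. The formula (trains that fit in a window given a minimum
# headway) is a common textbook simplification, assumed here for
# illustration; it is not the paper's exact model.

def line_capacity(window_min: float, min_headway_min: float) -> float:
    """Maximum trains a segment can carry in a time window."""
    return window_min / min_headway_min

def utilisation(scheduled_trains: int, window_min: float,
                min_headway_min: float) -> float:
    """Share of theoretical capacity consumed by the timetable."""
    return scheduled_trains / line_capacity(window_min, min_headway_min)

# Example: a suburban segment with a 3-minute minimum headway carrying
# 14 trains in the peak hour runs at 70% of its theoretical capacity,
# a level that a screening analysis might flag as a potential bottleneck.
if __name__ == "__main__":
    u = utilisation(scheduled_trains=14, window_min=60, min_headway_min=3)
    print(f"utilisation = {u:.0%}")  # -> utilisation = 70%
```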

    Does ‘bigger’ mean ‘better’? Pitfalls and shortcuts associated with big data for social research

    ‘Big data is here to stay.’ This key statement has a double value: it is an assumption as well as the reason why a theoretical reflection is needed. Furthermore, Big data is gaining visibility and success even in the social sciences, overcoming the division between the humanities and computer science. In this contribution, some considerations on the presence and certain persistence of Big data as a socio-technical assemblage are outlined, followed by the intriguing opportunities for social research linked to this interaction between practices and technological development. However, despite a promissory rhetoric fostered by several scholars since the birth of Big data as a labelled concept, some risks are just around the corner. The claims for the methodological power of bigger and bigger datasets, as well as increasing speed in analysis and data collection, are creating a real hype in social research. Particular attention is needed in order to avoid some pitfalls. These risks are analysed with respect to the validity of research results obtained through Big data. After this pars destruens, the contribution concludes with a pars construens: building on the previous critiques, a mixed methods research design is described as a general proposal, with the objective of stimulating a debate on the integration of Big data into complex research projects.

    Big data and smart cities: a public sector organizational learning perspective

    Public sector organizations (city authorities) have begun to explore ways to exploit big data to provide smarter solutions for cities. The way organizations learn to use new forms of technology has been widely researched. However, many public sector organisations have found themselves in new territory in trying to deploy and integrate this new form of technology (big data) with another fast-moving and relatively new concept (the smart city). This paper is a cross-sectional scoping study, drawing on two UK smart city initiatives, of the learning processes experienced by elite (top management) stakeholders in the advent and adoption of these two novel concepts. The findings are an experiential narrative account of learning to exploit big data to address issues by developing solutions through smart city initiatives. The findings reveal a set of moves in the exploration and exploitation of big data through smart city initiatives: (a) knowledge finding; (b) knowledge reframing; (c) inter-organization collaborations; and (d) ex-post evaluations. Even though this is a time-sensitive scoping study, it gives an account of the current state of play in the use of big data in public sector organizations for creating smarter cities. This study has implications for practitioners in the smart city domain and contributes to academia by operationalizing and adapting Crossan et al.'s (Acad Manag Rev 24(3): 522–537, 1999) 4I model of organizational learning.

    Protecting data privacy with decentralized self-emerging data release systems

    In the age of Big Data, releasing private data at a future point in time is critical for various applications. Such self-emerging data release requires the data to be protected until a prescribed release time and then automatically released to the target recipient. While straightforward centralized approaches such as cloud storage services may provide a simple way to implement self-emerging data release, they unfortunately constitute a single point of trust and a single point of control. This dissertation proposes new decentralized designs of self-emerging data release systems that use large-scale peer-to-peer (P2P) networks as the underlying infrastructure, eliminating any single point of trust or control. The first part of the dissertation presents the design of decentralized self-emerging data release systems using two different P2P network infrastructures, namely Distributed Hash Tables (DHTs) and blockchain. The second part proposes new mechanisms for supporting two key functionalities of self-emerging data release: (i) enabling the release of self-emerging data to blockchain-based smart contracts, facilitating a wide range of decentralized applications, and (ii) supporting a cost-effective gradual release of self-emerging data in the decentralized infrastructure. We believe that the outcome of this dissertation will contribute to the development of decentralized security primitives and protocols for the timed release of private data.
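    As a loose illustration of the decentralized timed-release idea (not the dissertation's actual protocol), the sketch below splits a key into n-of-n XOR shares held by simulated peers, each of which refuses to reveal its share before the release time, so no single party can release the data early; the names `Peer` and `split_key` are invented for the example.

```python
# Minimal sketch of decentralized timed release, under the assumption of an
# n-of-n XOR secret-sharing scheme and honest peers that enforce the release
# time locally. Illustrative only; not the dissertation's protocol.
import os
import time

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, n: int) -> list[bytes]:
    """n-of-n XOR sharing: all shares are needed to rebuild the key."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = xor_bytes(last, s)  # last share = key XOR all random shares
    return shares + [last]

class Peer:
    """Stand-in for a P2P node holding one share until the release time."""
    def __init__(self, share: bytes, release_at: float):
        self._share, self._release_at = share, release_at

    def reveal(self) -> bytes:
        if time.time() < self._release_at:
            raise PermissionError("release time not reached")
        return self._share

if __name__ == "__main__":
    key = os.urandom(16)
    release_at = time.time()  # immediate release, for the demo
    peers = [Peer(s, release_at) for s in split_key(key, 5)]
    rebuilt = peers[0].reveal()
    for p in peers[1:]:
        rebuilt = xor_bytes(rebuilt, p.reveal())
    assert rebuilt == key  # all 5 shares recombine to the original key
```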

    Risk factors affecting the ability for earned value management to accurately assess the performance of infrastructure projects in Australia

    Purpose – The purpose of this paper is to investigate a set of risk-related factors influencing the earned value management (EVM) concept as an assessment technique for evaluating the progress of modern sustainable infrastructure construction projects.

    Design/methodology/approach – A qualitative research approach was adopted: risk-related factors influencing the EVM concept were identified from a literature review and through interviews with industry personnel, followed by an inductive process to form sets of key factors and their measuring items.

    Findings – EVM is a common method for assessing project performance. A weakness of this approach is that EVM assessment in its current form does not measure the impact of a number of project performance factors that result from the complexity of modern infrastructure construction projects, and thus does not accurately assess their impact on that performance. This paper discusses and explains a range of potential risk factors affecting the evaluation of project performance, such as sustainability, stakeholder requirements, communication, procurement strategy, weather, staff experience, site conditions, design issues, financial risk, subcontractors, government requirements, and materials. In addition, their measuring items were identified.

    Practical implications – This research assists project managers in improving the evaluation of infrastructure construction performance by incorporating a range of factors likely to impact that performance which are not included in current EVM calculations.

    Originality/value – This research addresses the need to include in the EVM calculation a range of risk factors affecting the performance of infrastructure projects in Australia, thereby making this calculation a more reliable tool for assessing project performance.
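    For readers unfamiliar with the EVM calculation the paper critiques, the sketch below computes the standard cost and schedule performance indices and, as a purely illustrative assumption, a separate risk score of the kind current EVM omits; the risk weighting is invented for the example, not a method proposed by the paper.

```python
# Standard earned value metrics, plus a hedged illustration of the paper's
# point: CPI/SPI alone ignore risk factors (weather, site conditions,
# design issues), so one might report them alongside a separate, explicitly
# labelled risk score. The risk score below is a made-up illustration.

def cpi(earned_value: float, actual_cost: float) -> float:
    """Cost Performance Index: value earned per unit of cost spent."""
    return earned_value / actual_cost

def spi(earned_value: float, planned_value: float) -> float:
    """Schedule Performance Index: value earned vs. value planned."""
    return earned_value / planned_value

def risk_score(factors: dict[str, float]) -> float:
    """Average of 0-1 risk ratings; purely illustrative, not from the paper."""
    return sum(factors.values()) / len(factors)

if __name__ == "__main__":
    ev, ac, pv = 420_000.0, 500_000.0, 450_000.0
    risks = {"weather": 0.6, "site_conditions": 0.4, "design_issues": 0.3}
    print(f"CPI  = {cpi(ev, ac):.2f}")        # 0.84: over budget
    print(f"SPI  = {spi(ev, pv):.2f}")        # 0.93: behind schedule
    print(f"risk = {risk_score(risks):.2f}")  # context EVM does not capture
```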