
    Deep Technology Tracing for High-tech Companies

    Technological change and innovation are vitally important, especially for high-tech companies. However, the factors influencing their future research and development (R&D) trends are both complicated and varied, which makes technology tracing for high-tech companies a difficult task. To this end, in this paper we develop a novel data-driven solution, the Deep Technology Forecasting (DTF) framework, which automatically finds the most likely technology directions customized to each high-tech company. Specifically, DTF consists of three components: Potential Competitor Recognition (PCR), Collaborative Technology Recognition (CTR), and a Deep Technology Tracing (DTT) neural network. PCR and CTR capture competitive relations among enterprises and collaborative relations among technologies, respectively, while DTT models the dynamic interactions between companies and technologies with these relations involved. Finally, we evaluate the DTF framework on real-world patent data, and the experimental results clearly show that DTF can precisely anticipate the future technology emphasis of companies by exploiting these hybrid factors.
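    The abstract describes the architecture only at a high level. As a rough, hypothetical sketch of how the three components could fit together (all module names, tensor shapes, and the GRU/fusion choices below are assumptions for illustration, not the authors' implementation):

    ```python
    # Toy sketch of a DTF-style model: a recurrent core (standing in for DTT)
    # fuses a company's patent history with competitor-relation (PCR) and
    # collaborative-technology (CTR) features. Purely illustrative.
    import torch
    import torch.nn as nn

    class DeepTechnologyTracing(nn.Module):
        def __init__(self, n_tech: int, hidden: int = 64):
            super().__init__()
            # Encode the company's yearly technology-distribution sequence.
            self.rnn = nn.GRU(input_size=n_tech, hidden_size=hidden, batch_first=True)
            # Project competitor and technology-relation features into the same space.
            self.competitor_proj = nn.Linear(n_tech, hidden)
            self.tech_rel_proj = nn.Linear(n_tech, hidden)
            # Predict next-period emphasis over technology categories.
            self.out = nn.Linear(hidden * 3, n_tech)

        def forward(self, history, competitor_feat, tech_rel_feat):
            # history: (batch, years, n_tech); other features: (batch, n_tech)
            _, h = self.rnn(history)
            fused = torch.cat([h[-1],
                               self.competitor_proj(competitor_feat),
                               self.tech_rel_proj(tech_rel_feat)], dim=-1)
            return torch.sigmoid(self.out(fused))  # per-category emphasis scores

    model = DeepTechnologyTracing(n_tech=10)
    scores = model(torch.rand(2, 5, 10), torch.rand(2, 10), torch.rand(2, 10))
    print(scores.shape)  # torch.Size([2, 10])
    ```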

    Data Science and Ebola

    Data Science: Today, everybody and everything produces data. People produce large amounts of data in social networks and in commercial transactions. Medical, corporate, and government databases continue to grow. Sensors keep getting cheaper and are increasingly connected, creating an Internet of Things and generating even more data. In every discipline, large, diverse, and rich data sets are emerging, from astrophysics, to the life sciences, to the behavioral sciences, to finance and commerce, to the humanities and the arts. In every discipline, people want to organize, analyze, optimize, and understand their data to answer questions and deepen insights. The science that is transforming this ocean of data into a sea of knowledge is called data science. This lecture discusses how data science changed the way one of the most visible recent challenges to public health was handled: the 2014 Ebola outbreak in West Africa. (Inaugural lecture, Leiden University.)

    The Ascent of America's High-Growth Companies: An Analysis of the Geography of Entrepreneurship

    This report offers the first-ever deep dive into the geographic trends of America's fastest growing private companies, the Inc. 500. Inc. magazine's annual ranking, which began in 1982, has become an important point of pride for high-achieving companies and a source of research for economists. Not until now, however, has anyone dissected the past thirty years of comprehensive data from these high-growth companies. Through a partnership with Inc. magazine, the Ewing Marion Kauffman Foundation has done just that. In this study, one of a set examining Inc. 500 data over time, we offer a geographic analysis of how regional characteristics are associated with fast-growing companies and innovations. Tracing hundreds of Inc. firms per year and thousands per decade, we have captured a range of innovations and analyzed the regions that continuously produce fast-growing companies. Knowing that very little is understood about the geography of high-growth companies, we approached this analysis with a range of questions: Where are the fast-growing Inc. firms located at the state and metropolitan levels? How have they shifted over time? Do we find greater geographic concentration of Inc. firms over time? How does the geography of Inc. firms differ from commonly associated growth factors, such as high-tech industries, venture capital firms, and research universities? As you review the findings of this report, keep in mind that the creation of another ranking is not our primary objective. More important is to highlight regions with differing sectors and strengths, in contrast to the areas previously identified as strong producers of high-tech companies. Our objective, then, is to shed light on previously understudied areas of economic development.

    The Making of Cloud Applications: An Empirical Study on Software Development for the Cloud

    Cloud computing is gaining more and more traction as a deployment and provisioning model for software. While a large body of research already covers how to optimally operate a cloud system, we still lack insights into how professional software engineers actually use clouds and how the cloud impacts development practices. This paper reports on the first systematic study of how software developers build applications in the cloud. We conducted a mixed-method study, consisting of qualitative interviews with 25 professional developers and a quantitative survey with 294 responses. Our results show that adopting the cloud has a profound impact throughout the software development process, as well as on how developers utilize tools and data in their daily work. Among other things, we found that (1) developers need better means to anticipate runtime problems and to rigorously define metrics for improved fault localization, and (2) although the cloud offers an abundance of operational data, developers still often rely on their experience and intuition rather than on metrics. From our findings, we extracted a set of guidelines for cloud development and identified challenges for researchers and tool vendors.

    Beyond Bitcoin: Issues in Regulating Blockchain Transactions

    The buzz surrounding Bitcoin has reached a fever pitch. Yet in academic legal discussions, disproportionate emphasis is placed on bitcoins (that is, the virtual currency), and little mention is made of blockchain technology, the true innovation behind the Bitcoin protocol. Simply put, blockchain technology solves an elusive networking problem by enabling "trustless" transactions: value exchanges over computer networks that can be verified, monitored, and enforced without central institutions (for example, banks). This has broad implications for how we transact over electronic networks. This Note integrates current research from leading computer scientists and cryptographers to elevate the legal community's understanding of blockchain technology and, ultimately, to inform policymakers and practitioners as they consider different regulatory schemes. An examination of the economic properties of a blockchain-based currency suggests that the technology's true value lies in its potential to facilitate more efficient digital-asset transfers. For example, applications of special interest to the legal community include more efficient document and authorship verification, title transfers, and contract enforcement. Though a regulatory patchwork around virtual currencies has begun to form, careful analysis reveals much uncertainty with respect to these alternative applications.
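    To make the "trustless" verification idea concrete, here is a minimal hash-chain sketch; it is a simplified illustration, not the Bitcoin protocol, and all function names are hypothetical. Because each block commits to its predecessor's hash, any party can detect tampering with recorded history; a production blockchain adds digital signatures, consensus, and peer-to-peer replication:

    ```python
    # Minimal hash-chain sketch: blocks are tamper-evident because each one
    # embeds the hash of the previous block. Illustrative only.
    import hashlib
    import json

    def block_hash(block: dict) -> str:
        # Hash the block's canonical JSON form, excluding its own hash field.
        payload = {k: v for k, v in block.items() if k != "hash"}
        return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

    def make_block(data: str, prev_hash: str) -> dict:
        block = {"data": data, "prev_hash": prev_hash}
        block["hash"] = block_hash(block)
        return block

    def verify(chain: list) -> bool:
        # Each block must hash correctly and point at its predecessor's hash.
        for i, block in enumerate(chain):
            if block["hash"] != block_hash(block):
                return False
            if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
                return False
        return True

    genesis = make_block("title: parcel 42 -> Alice", prev_hash="0" * 64)
    transfer = make_block("title: parcel 42 -> Bob", genesis["hash"])
    chain = [genesis, transfer]
    print(verify(chain))   # True
    genesis["data"] = "title: parcel 42 -> Mallory"  # tamper with history
    print(verify(chain))   # False: downstream hashes no longer match
    ```

    In the title-transfer example above, rewriting an earlier transfer invalidates every later block's hash, which is what lets a network police history without a central registrar.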

    Grand Challenges of Traceability: The Next Ten Years

    In 2007, the software and systems traceability community met at the first Natural Bridge symposium on the Grand Challenges of Traceability to establish and address research goals for achieving effective, trustworthy, and ubiquitous traceability. Ten years later, in 2017, the community came together to evaluate a decade of progress toward achieving these goals. These proceedings document some of that progress. They include a series of short position papers representing current work in the community, organized across four process axes of traceability practice. The sessions covered Trace Strategizing, Trace Link Creation and Evolution, Trace Link Usage, real-world applications of traceability, and traceability datasets and benchmarks. Two breakout groups focused on the importance of creating and sharing traceability datasets within the research community and discussed challenges related to the adoption of tracing techniques in industrial practice. Members of the research community are engaged in many active, ongoing, and impactful research projects. Our hope is that ten years from now we will be able to look back at a productive decade of research and claim that we have achieved the overarching Grand Challenge of Traceability: for traceability to be always present, built into the engineering process, and to have "effectively disappeared without a trace". We hope that others will see the potential traceability has for empowering software and systems engineers to develop higher-quality products at increasing levels of complexity and scale, and that they will join the active community of software and systems traceability researchers as we move into the next decade of research.

    Digital Architecture as Crime Control

    This paper explains how theories of realspace architecture inform the prevention of computer crime. Despite the prevalence of the metaphor, architects in realspace and cyberspace have not talked to one another. There is a dearth of literature about digital architecture and crime altogether, and the realspace architectural literature on crime prevention is often far too soft for many software engineers. This paper suggests the broad brushstrokes of potential design solutions to cybercrime and, in the course of doing so, poses severe criticisms of the White House's recent proposals on cybersecurity. The paper begins by introducing four concepts of realspace crime prevention through architecture. Design should: (1) create opportunities for natural surveillance, meaning visibility and susceptibility to monitoring by residents, neighbors, and bystanders; (2) instill a sense of territoriality so that residents develop proprietary attitudes and outsiders feel deterred from entering private space; (3) build communities and avoid social isolation; and (4) protect targets of crime. There are digital analogues to each goal. Natural-surveillance principles suggest new virtues of open-source platforms, such as Linux, and territoriality outlines a strong case for moving away from digital anonymity toward pseudonymity. The goal of building communities similarly exposes some new advantages of the original, and now eroding, end-to-end design of the Internet. An understanding of architecture and target prevention illuminates why firewalls at end points will more effectively guarantee security than attempts to bundle security into the architecture of the Net. In total, these architectural lessons help us chart an alternative course to the federal government's tepid approach to computer crime. By leaving the bulk of crime prevention to market forces, the government will encourage private barricades to develop, the equivalent of digital gated communities, with terrible consequences for the Net in general and interconnectivity in particular.