2,345 research outputs found

    U.S. Military Innovation In The 21st Century: The Era Of The “Spin-On”

    The intersection between the U.S. military and technological innovation, a “military-innovation nexus,” has led to the genesis of key technologies, including nuclear energy, general computing, GPS, and satellite technology from World War II to the present. However, an evolving innovation context in the twenty-first century, including the leadership of the commercial sector in technology innovation and the resurgence of great power competition, has led to doubts about the ability of the Department of Defense to discover and promote the technological innovations of the future. The Third Offset Strategy was formulated in 2014 in response to these concerns: it promulgated reforms to bring the Pentagon and the commercial sector closer together while creating alternative contracting mechanisms for streamlined procurement and prototyping. Using defense biometrics and artificial intelligence as case studies of spin-on innovations adopted by the military, this Article seeks to understand the efficacy of the reforms undertaken under the auspices of the Third Offset Strategy to improve the institutional underpinnings of the U.S. innovation system for national security. I argue that the Third Offset Strategy has allowed the Pentagon to more effectively procure, develop, and field commercial technologies in the twenty-first century, and I conclude by proposing modest recommendations for the successful acquisition of spin-on innovations.

    Lessons Learned on Adopting Automated Compliance Checking in AEC Industry: A Global Study

    Over the last decades, numerous Automated Compliance Checking (ACC) systems have been developed. However, ACC is still not broadly used in the real world today, and little is known about how ACC can be better accepted by end users. This paper reports on a multiple-case study to learn valuable lessons from recent attempts to adopt ACC systems worldwide. Firstly, eighteen semi-structured interviews were conducted with twenty experts from eight countries, and supplementary data (e.g. documents, product information, and literature) related to each case were collected. Secondly, the interview and supplementary data were coded to develop prominent themes. Thirdly, through a cross-case analysis, the twelve most influential variables affecting ACC adoption were identified. Three path models that explain the interrelationships between these variables and ten propositions that can guide future ACC adoption were deduced. The results indicate that the government should play an important role in facilitating ACC adoption through funding, policies, and incentives. This study also provides valuable information to software vendors for delivering ACC systems that meet the needs of the industry, and for innovation managers in the industry to develop appropriate adoption plans for the ACC technology.

    Survey on 6G Frontiers: Trends, Applications, Requirements, Technologies and Future Research

    Emerging applications such as the Internet of Everything, holographic telepresence, collaborative robots, and space and deep-sea tourism are already highlighting the limitations of existing fifth-generation (5G) mobile networks. These limitations concern data rate, latency, reliability, availability, processing, connection density, and global coverage spanning ground, underwater, and space. Sixth-generation (6G) mobile networks are expected to burgeon in the coming decade to address these limitations. The development of the 6G vision, applications, technologies, and standards has already become a popular research theme in academia and industry. In this paper, we provide a comprehensive survey of the current developments towards 6G. We highlight the societal and technological trends that initiate the drive towards 6G. Emerging applications to realize the demands raised by 6G driving trends are discussed subsequently. We also elaborate on the requirements that are necessary to realize the 6G applications. Then we present the key enabling technologies in detail. We also outline current research projects and activities, including standardization efforts, towards the development of 6G. Finally, we summarize lessons learned from state-of-the-art research and discuss technical challenges that shed new light on future research directions towards 6G.

    Bridging the Geospatial Education-Workforce Divide: A Case Study on How Higher Education Can Address the Emerging Geospatial Drivers and Trends of the Intelligent Web Mapping Era

    The purpose of this exploratory collective case study is to discover how geospatial education can meet the geospatial workforce needs of the Commonwealth of Virginia in the emerging intelligent web mapping era. Geospatial education uses geographic information systems (GIS) to enable student learning by increasing in-depth spatial analysis and meaning using geotechnology tools (Baker & White, 2003). Bandura’s (1977) self-efficacy theory and the geographic concept of spatial thinking form an integrated theoretical framework of spatial cognition for this study. Data collection included in-depth interviews of twelve geospatial stakeholders, documentation collection, and a supporting Q methodology to determine the viewpoints of a total of 41 geospatial stakeholders. Q methodology is a data-collection technique that, when used as a qualitative method, has participants sort statements to reveal their preferences. Data analysis strategies included cross-case synthesis, direct interpretation, generalizations, and a correlation matrix to show similarities in participants' preferences. The results revealed four collaborative perceptions of the stakeholders, forming four themes: social education, technology early adoption, data collaboration, and urban fundamentals. Four strategies were identified for higher education to prepare students for the emerging geospatial workforce trends: teach fundamentals, develop agile faculty and curriculum, use an interdisciplinary approach, and collaborate. These strategies reflect the perceptions of stakeholders in this study on how higher education can meet the emerging drivers and trends of the geospatial workforce.

    Strategies for the intelligent selection of components

    It is becoming common to build applications as component-intensive systems - a mixture of fresh code and existing components. For application developers, the selection of components to incorporate is key to overall system quality - so they want the 'best'. For each selection task, the application developer will define requirements for the ideal component and use them to select the most suitable one. While many software selection processes exist, there is a lack of repeatable, usable, flexible, automated processes with tool support. This investigation has focussed on finding and implementing strategies to enhance the selection of software components. The study was built around four research elements, targeting characterisation, process, strategies and evaluation. A post-positivist methodology was used, with the Spiral Development Model (SDM) structuring the investigation. Data for the study were generated using a range of qualitative and quantitative methods, including a survey approach, a range of case studies, and quasi-experiments focusing on the specific tuning of tools and techniques. Evaluation and review are integral to the SDM: a Goal-Question-Metric (GQM)-based approach was applied to every Spiral.

    Architectural artificial intelligence: exploring and developing strategies, tools, and pedagogies toward the integration of deep learning in the architectural profession

    The incessant growth of data collection is a trend born from the basic promise of data: “save everything you can, and someday you’ll be able to figure out some use for it all” (Schneier 2016, p. 40). However, this has manifested as a plague of information overload, where “it would simply be impossible for humans to deal with all of this data” (Davenport 2014, p. 151). This is especially true within the field of architecture, where designers are tasked with leveraging all available sources of information to compose an informed solution. Too often, “the average designer scans whatever information [they] happen on, […] and introduces this randomly selected information into forms otherwise dreamt up in the artist’s studio of mind” (Alexander 1964, p. 4). As data accumulates—less so the “oil”, and more the “exhaust of the information age” (Schneier 2016, p. 20)—we are rapidly approaching a point where even the programmers enlisted to automate are inadequate. Yet, as the size of data warehouses increases, so too does the available computational power and the invention of clever algorithms to negotiate it. Deep learning is an exemplar. A subset of artificial intelligence, deep learning is a collection of algorithms inspired by the brain, capable of automated self-improvement, or “learning”, through observations of large quantities of data. In recent years, the rise in computational power and access to these immense databases have fostered the proliferation of deep learning to almost all fields of endeavour. The application of deep learning in architecture not only has the potential to resolve the issue of rising complexity, but also to introduce a plethora of new tools at the architect’s disposal, such as computer vision, natural language processing, and recommendation systems. Already, we are starting to see its impact on the field of architecture. This raises the following questions: what is the current state of deep learning adoption in architecture, how can one better facilitate its integration, and what are the implications of doing so? This research aims to answer those questions through an exploration of strategies, tools, and pedagogies for the integration of deep learning in the architectural profession.