    Contextual Sensitivity in Grounded Theory: The Role of Pilot Studies

    Grounded Theory is an established methodological approach for context-specific inductive theory building. The grounded nature of the methodology refers to the specific contexts from which emergent propositions are drawn. Thus, any grounded theory study requires not only theoretical sensitivity, but also good insight into how to design the research in the human activity systems to be studied. The lack of this insight may result in inefficient theoretical sampling or even erroneous purposeful sampling. These problems would not necessarily be critical, as it could be argued that, through the elliptical process that characterizes grounded theory, remedial loops would always bring the researcher to the core of the theory. However, these elliptical remedial processes can take very long periods of time and result in catastrophic delays in research projects. As a mitigation strategy, this paper discusses, contrasts and compares the use of pilot studies in four different grounded theory projects. Each pilot brought different insights about the context, resulting in changes of focus, improvements to data collection instruments and better-informed theoretical sampling. Additionally, as all four projects were undertaken by researchers with little experience of inductive approaches in general and grounded theory in particular, the pilot studies also served the purpose of training in interviewing, relating to interviewees, memoing, constant comparison and coding. This last outcome of the pilot studies was not planned initially, but proved to be a crucial success factor in the running of the projects. The paper concludes with a theoretical proposition for the concept of contextual sensitivity and for the inclusion of the pilot study in grounded theory research designs.

    A survey of self organisation in future cellular networks

    This article surveys the literature of the last decade on the emerging field of self-organisation as applied to wireless cellular communication networks. Self-organisation has been extensively studied and applied in ad hoc networks, wireless sensor networks and autonomic computer networks; in the context of wireless cellular networks, however, this is the first attempt to put the various efforts in perspective in the form of a tutorial/survey. We provide a comprehensive survey of the existing literature, projects and standards in self-organising cellular networks. Additionally, we aim to present a clear understanding of this active research area, identifying a clear taxonomy and guidelines for the design of self-organising mechanisms. We compare the strengths and weaknesses of existing solutions and highlight the key research areas for further development. This paper serves as a guide and a starting point for anyone willing to delve into research on self-organisation in wireless cellular communication networks.
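    As a concrete illustration of the kind of self-organising mechanism such surveys classify, the sketch below implements a toy mobility-load-balancing loop in Python; the function name, thresholds and step size are assumptions for illustration, not taken from the surveyed literature.

        # Minimal sketch of one self-optimisation loop (mobility load balancing).
        # All names, thresholds and the step size are illustrative assumptions.
        def balance_load(serving_load, neighbour_load, offset_db,
                         target_gap=0.1, step_db=0.5, max_offset_db=6.0):
            """Nudge the handover offset toward an underloaded neighbour cell.

            A positive offset makes handover to the neighbour easier, shedding
            traffic from an overloaded serving cell; the loop backs off once the
            load gap closes, so the network retunes itself without manual
            re-planning.
            """
            gap = serving_load - neighbour_load
            if gap > target_gap:            # serving cell overloaded
                offset_db = min(offset_db + step_db, max_offset_db)
            elif gap < -target_gap:         # neighbour overloaded instead
                offset_db = max(offset_db - step_db, -max_offset_db)
            return offset_db

        # Example: serving cell at 90% load, neighbour at 40% -> offset grows.
        offset = 0.0
        for serving, neighbour in [(0.9, 0.4), (0.8, 0.5), (0.6, 0.55)]:
            offset = balance_load(serving, neighbour, offset)
            print(f"load gap {serving - neighbour:+.2f} -> offset {offset:+.1f} dB")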

    e-Business challenges and directions: important themes from the first ICE-B workshop

    A three-day asynchronous, interactive workshop was held at ICE-B’10 in Piraeus, Greece in July of 2010. This event captured conference themes for e-Business challenges and directions across five subject areas: a) e-Business applications and models, b) enterprise engineering, c) mobility, d) business collaboration and e-Services, and e) technology platforms. Quality Function Deployment (QFD) methods were used to gather, organize and evaluate themes and their ratings. This paper summarizes the most important themes rated by participants: a) Since technology is becoming more economic and social in nature, more agile and context-based application development methods are needed. b) Enterprise engineering approaches are needed to support the design of systems that can evolve with changing stakeholder needs. c) The digital native groundswell requires changes to business models, operations, and systems to support Prosumers. d) Intelligence and interoperability are needed to address Prosumer activity and their highly customized product purchases. e) Technology platforms must rapidly and correctly adapt, provide widespread offerings and scale appropriately as situational contexts change.

    Development of methodologies and procedures for identifying STS users and uses

    A study was conducted to identify new uses and users of the Space Transportation System (STS) within the domestic government sector. The study developed a series of analytical techniques and well-defined functions, structured as an integrated planning process, to assure efficient and meaningful use of the STS. The purpose of the study is to provide NASA with the following functions: (1) to realize efficient and economic use of the STS and other NASA capabilities, (2) to identify new users and uses of the STS, (3) to contribute to organized planning activities for both current and future programs, and (4) to aid in analyzing uses of NASA's overall capabilities.

    Recommender Systems

    The ongoing rapid expansion of the Internet greatly increases the necessity of effective recommender systems for filtering the abundant information. Extensive research on recommender systems is conducted by a broad range of communities including social and computer scientists, physicists, and interdisciplinary researchers. Despite substantial theoretical and practical achievements, unification and comparison of different approaches are lacking, which impedes further advances. In this article, we review recent developments in recommender systems and discuss the major challenges. We compare and evaluate available algorithms and examine their roles in future developments. In addition to algorithms, physical aspects are described to illustrate the macroscopic behavior of recommender systems. Potential impacts and future directions are discussed. We emphasize that recommendation has a great scientific depth and combines diverse research fields, which makes it of interest to physicists as well as interdisciplinary researchers. Comment: 97 pages, 20 figures (to appear in Physics Reports).
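    As a concrete example of the class of algorithms such reviews compare, the sketch below implements basic user-based collaborative filtering with cosine similarity; the toy rating matrix and all names are illustrative assumptions, not material from the article.

        # Minimal user-based collaborative filtering sketch (cosine similarity).
        # The toy rating matrix and all names are illustrative assumptions.
        import numpy as np

        def recommend(ratings, user, top_n=2):
            """Score unrated items for `user` via similarity-weighted averages."""
            sims = np.zeros(len(ratings))
            for other in range(len(ratings)):
                if other == user:
                    continue
                a, b = ratings[user], ratings[other]
                mask = (a > 0) & (b > 0)          # compare co-rated items only
                if mask.any():
                    sims[other] = a[mask] @ b[mask] / (
                        np.linalg.norm(a[mask]) * np.linalg.norm(b[mask]))
            scores = sims @ ratings / (sims.sum() + 1e-9)  # weighted average rating
            scores[ratings[user] > 0] = -np.inf           # hide already-rated items
            ranked = [i for i in np.argsort(scores)[::-1] if np.isfinite(scores[i])]
            return ranked[:top_n]

        # Rows = users, columns = items, 0 = unrated.
        R = np.array([[5, 4, 0, 1],
                      [4, 5, 1, 0],
                      [1, 0, 5, 4]])
        print(recommend(R, user=0))  # item(s) liked by the most similar users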

    Big Data Ethics in Research

    This paper examines the main problems faced by scientists working with Big Data sets, highlighting the main ethical issues and taking into account the legislation of the European Union. After a brief introduction to Big Data, the Technology section presents specific research applications. The Philosophical Aspects section approaches the main philosophical issues, and Legal Aspects covers specific ethical issues arising from the EU Regulation on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, repealing Directive 95/46/EC, the Data Protection Directive (the General Data Protection Regulation, "GDPR"). The Ethics Issues section details the aspects specific to Big Data. After a brief section on Big Data research, I close with conclusions on research ethics in working with Big Data.
    CONTENTS: Abstract; 1. Introduction (1.1 Definitions, 1.2 Big Data dimensions); 2. Technology (2.1 Applications, 2.1.1 In research); 3. Philosophical aspects; 4. Legal aspects (4.1 GDPR: stages of processing of personal data, principles of data processing, privacy policy and transparency, purposes of data processing, design and implicit confidentiality, the (legal) paradox of Big Data); 5. Ethical issues (ethics in research, awareness, consent, control, transparency, trust, ownership, surveillance and security, digital identity, tailored reality, de-identification, digital inequality, privacy); 6. Big Data research; Conclusions; Bibliography.
    DOI: 10.13140/RG.2.2.11054.4640

    Composable M&S web services for net-centric applications

    Service-oriented architectures promise easier integration of functionality, in the form of web services, into operational systems than is the case with interface-driven, system-oriented approaches. Although the Extensible Markup Language (XML) enables a new level of interoperability among heterogeneous systems, XML alone does not solve all the interoperability problems users contend with when integrating services into operational systems. To manage the basic challenges of service interoperation, we developed the Levels of Conceptual Interoperability Model (LCIM) to enable a layered approach and gradual solution improvements. Furthermore, as a first step, we developed methods of model-based data engineering (MBDE) for semantically consistent service integration. These methods have been applied in the U.S. in collaboration with industry, resulting in proofs of concept. The results are directly applicable in a net-centric and net-enabled environment.
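    The gap between syntactic and semantic interoperability that the LCIM layers above plain XML can be illustrated with a toy example (not from the paper): both consumers below parse the same well-formed message, yet interpret the <range> element in different units.

        # Toy illustration (not from the paper): well-formed XML alone does not
        # guarantee interoperability when services disagree on a field's meaning.
        import xml.etree.ElementTree as ET

        message = "<platform><id>P-17</id><range>120</range></platform>"

        def range_assuming_km(xml_text):
            # Consumer A reads <range> as kilometres.
            return float(ET.fromstring(xml_text).findtext("range"))

        def range_assuming_nm(xml_text):
            # Consumer B reads <range> as nautical miles and converts to km.
            return float(ET.fromstring(xml_text).findtext("range")) * 1.852

        print(range_assuming_km(message))   # 120.0 km
        print(range_assuming_nm(message))   # 222.24 km: same XML, different meaning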

    Report from GI-Dagstuhl Seminar 16394: Software Performance Engineering in the DevOps World

    This report documents the program and the outcomes of GI-Dagstuhl Seminar 16394 "Software Performance Engineering in the DevOps World". The seminar addressed the problem of performance-aware DevOps. Both DevOps and performance engineering have been growing trends over the past one to two years, in no small part due to the rise in importance of identifying performance anomalies in the operations (Ops) of cloud and big data systems and feeding these findings back to development (Dev). So far, however, the research community has treated software engineering, performance engineering, and cloud computing mostly as individual research areas. We aimed to identify opportunities for cross-community collaboration and to set the path for long-lasting collaborations towards performance-aware DevOps. The main goal of the seminar was to bring together young researchers (PhD students in a later stage of their PhD, as well as PostDocs or Junior Professors) in the areas of (i) software engineering, (ii) performance engineering, and (iii) cloud computing and big data to present their current research projects, to exchange experience and expertise, to discuss research challenges, and to develop ideas for future collaborations.

    A Web Service Composition Method Based on OpenAPI Semantic Annotations

    Automatic Web service composition is a research direction aimed at improving the process of aggregating multiple Web services to create new, specific functionality. The use of semantics is required, as a proper semantic model with annotation standards enables the automated reasoning needed to solve non-trivial cases. Most previous models are limited to describing service parameters as concepts of a simple hierarchy. Our proposed method increases expressiveness at the parameter level by using concept properties that define attributes expressed by name and type. Concept properties are inherited. The paper also describes how parameters are matched to create valid compositions in an automatic manner. Additionally, the composition algorithm is applied in practice to descriptions of Web services implemented as REST APIs and expressed as OpenAPI specifications. Our proposal uses knowledge models (ontologies) to enhance these OpenAPI constructs with JSON-LD semantic annotations in order to obtain better compositions for the involved services. We also propose an adjusted composition algorithm that extends the semantic knowledge defined by our model. Comment: International Conference on e-Business Engineering (ICEBE), 9 pages.
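    A minimal sketch of the kind of parameter matching the abstract describes is given below; the concept hierarchy, property names and matching rule are illustrative assumptions, not the model actually defined in the paper.

        # Minimal sketch of semantics-based parameter matching between services.
        # The concept hierarchy, properties and matching rule are illustrative
        # assumptions, not the model defined in the paper.

        CONCEPTS = {
            # concept -> (parent concept, own properties as {name: type})
            "Thing":    (None,       {}),
            "Location": ("Thing",    {"latitude": "float", "longitude": "float"}),
            "City":     ("Location", {"name": "string"}),
        }

        def properties(concept):
            """Collect a concept's properties, including inherited ones."""
            props = {}
            while concept is not None:
                parent, own = CONCEPTS[concept]
                props.update(own)
                concept = parent
            return props

        def matches(provided_concept, required_concept):
            """A provided output satisfies a required input if it carries at
            least the required properties with identical types."""
            provided = properties(provided_concept)
            return all(provided.get(name) == typ
                       for name, typ in properties(required_concept).items())

        # A service producing a City can feed one that only needs a Location,
        # because City inherits latitude/longitude; the reverse match fails.
        print(matches("City", "Location"))  # True
        print(matches("Location", "City"))  # False: no 'name' property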