
    Enhancing loco-regional adaptive governance for integrated chronic care through agent based modelling (ABM)

    Introduction: Moving from existing segmented care to integrated care is complex and disruptive. It is complex in the sense that the type and timeframe of the changes are not completely predictable. It is disruptive in the sense that the process of change modifies, but is also influenced by, the nature of interactions at the individual and organisational level. As a consequence, building competences to govern the necessary changes towards integrated care should include the capacity to adapt to unexpected situations. Therefore, the tacit knowledge of the stakeholders (“knowledge-in-practice developed from direct experience; subconsciously understood and applied”1) should be at the centre. However, the usual research and training practices that use such knowledge (i.e. action research or case studies) are highly time-consuming. New approaches are therefore needed to elicit tacit knowledge. One of them is agent-based modelling (ABM)2 through computer simulation. The aim of this paper is to showcase an agent-based model that draws on emerging tacit knowledge to enhance loco-regional adaptive governance for improving integrated chronic care.
    Theory/Methods: We used a complex adaptive systems lens to study the health systems integration process. We applied key components of ABM to assess how health systems adapt through the dynamics of heterogeneous and interconnected agents (agents are characterised by their level of autonomy, heterogeneity, and interactions with other agents). The agent-based model was developed through a process in which concept maps, causal loop diagrams, object-oriented Unified Modelling Language diagrams and computer simulation (using NetLogo©) were used iteratively.
    Results: The agent-based model was presented to health professionals with variable experience in healthcare to elicit their perceptions and tacit knowledge. It consisted of agents with certain characteristics and transition rules. Agents included providers, patients, and network or health system managers. Agents can adopt, or influence the adoption of, integrated care through learning and by being aware, motivated and capable of decision making. The environment includes institutional arrangements (e.g., financing, training, information systems and legislation) and leadership. Different scenarios were created and discussed, and key rules to strengthen adaptive governance were reflected on.
    Discussion and conclusion: This study is an initial step in using ABM as a means to elicit and enhance tacit knowledge to strengthen governance for integrated care. It is expected that the study will foster dialogue between actors of loco-regional projects to integrate health and social care for chronic diseases in Belgium (a new programme initiated by federal authorities).
    Suggestions for future research: Future research is expected to continue developing methods that combine ABM with participative exploration approaches to make better use of tacit knowledge in strengthening loco-regional governance for the development of integrated care.
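    The agent characteristics and transition rules described in this abstract can be illustrated with a minimal adoption model. The sketch below is hypothetical (it is not the authors' NetLogo model, and the readiness threshold, influence weight, and agent count are invented for illustration): each agent adopts integrated care when its individual readiness, plus the influence of current adopters, crosses a threshold.

```python
import random

def simulate_adoption(n_agents=50, n_steps=30, influence=0.3, seed=42):
    """Toy agent-based model: an agent adopts integrated care when its
    individual readiness (standing in for awareness, motivation and
    capability) plus peer influence crosses a fixed threshold."""
    rng = random.Random(seed)
    agents = [{"readiness": rng.random(), "adopted": False}
              for _ in range(n_agents)]
    agents[0]["adopted"] = True  # seed a single early adopter

    for _ in range(n_steps):
        share = sum(a["adopted"] for a in agents) / n_agents
        for a in agents:
            # transition rule: readiness plus influence of current adopters
            if not a["adopted"] and a["readiness"] + influence * share > 0.9:
                a["adopted"] = True
    return sum(a["adopted"] for a in agents)

adopters = simulate_adoption()
```

    Re-running the model with different `influence` values mimics the kind of scenario comparison such a model is built to support.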

    Computationally Linking Chemical Exposure to Molecular Effects with Complex Data: Comparing Methods to Disentangle Chemical Drivers in Environmental Mixtures and Knowledge-based Deep Learning for Predictions in Environmental Toxicology

    Chemical exposures affect the environment and may lead to adverse outcomes in its organisms. Omics-based approaches, like standardised microarray experiments, have expanded the toolbox to monitor the distribution of chemicals and assess the risk to organisms in the environment. The resulting complex data have extended the scope of toxicological knowledge bases and published literature. A plethora of computational approaches have been applied in environmental toxicology considering systems biology and data integration. Still, the complexity of environmental and biological systems reflected in these data challenges investigations of exposure-related effects. This thesis aimed at computationally linking chemical exposure to biological effects on the molecular level considering sources of complex environmental data. The first study employed data from an omics-based exposure study considering mixture effects in a freshwater environment. We compared three data-driven analyses for their suitability to disentangle the effects of chemical mixtures on biological endpoints, and for their reliability in attributing potentially adverse outcomes to chemical drivers using toxicological databases at the gene and pathway levels. Differential gene expression analysis and a network inference approach resulted in toxicologically meaningful outcomes and uncovered individual chemical effects, both stand-alone and in combination. We developed an integrative computational strategy to harvest exposure-related gene associations from environmental samples considering mixtures of lowly concentrated compounds. The applied approaches allowed assessing the hazard of chemicals more systematically with correlation-based compound groups. This dissertation presents another achievement toward data-driven hypothesis generation for molecular exposure effects. The approach combined text-mining and deep learning.
    The study was entirely data-driven and involved state-of-the-art computational methods of artificial intelligence. We employed literature-based relational data and curated toxicological knowledge to predict chemical-biomolecule interactions. A word embedding neural network with a subsequent feed-forward network was implemented. Data augmentation and recurrent neural networks were beneficial for training with curated toxicological knowledge. The trained models reached accuracies of up to 94% for unseen test data of the employed knowledge base. However, we could not reliably confirm known chemical-gene interactions across the selected data sources. Still, the predictive models might derive unknown information from toxicological knowledge sources, such as literature, databases or omics-based exposure studies, and thus might allow predicting hypotheses of exposure-related molecular effects. Both achievements of this dissertation might support the prioritisation of chemicals for testing and an intelligent selection of chemicals for monitoring in future exposure studies.
    Table of Contents: Abstract; Acknowledgements; Prelude.
    1 Introduction: 1.1 An overview of environmental toxicology (1.1.1 Environmental toxicology; 1.1.2 Chemicals in the environment; 1.1.3 Systems biological perspectives in environmental toxicology); 1.2 Computational toxicology (1.2.1 Omics-based approaches; 1.2.2 Linking chemical exposure to transcriptional effects; 1.2.3 Up-scaling from the gene level to higher biological organisation levels; 1.2.4 Biomedical literature-based discovery; 1.2.5 Deep learning with knowledge representation); 1.3 Research question and approaches.
    2 Methods and Data: 2.1 Linking environmentally relevant mixture exposures to transcriptional effects (2.1.1 Exposure and microarray data; 2.1.2 Preprocessing; 2.1.3 Differential gene expression; 2.1.4 Association rule mining; 2.1.5 Weighted gene correlation network analysis; 2.1.6 Method comparison); 2.2 Predicting exposure-related effects on a molecular level (2.2.1 Input; 2.2.2 Input preparation; 2.2.3 Deep learning models; 2.2.4 Toxicogenomic application).
    3 Method comparison to link complex stream water exposures to effects on the transcriptional level: 3.1 Background and motivation (3.1.1 Workflow); 3.2 Results (3.2.1 Data preprocessing; 3.2.2 Differential gene expression analysis; 3.2.3 Association rule mining; 3.2.4 Network inference; 3.2.5 Method comparison; 3.2.6 Application case of method integration); 3.3 Discussion; 3.4 Conclusion.
    4 Deep learning prediction of chemical-biomolecule interactions: 4.1 Motivation (4.1.1 Workflow); 4.2 Results (4.2.1 Input preparation; 4.2.2 Model selection; 4.2.3 Model comparison; 4.2.4 Toxicogenomic application; 4.2.5 Horizontal augmentation without tail-padding; 4.2.6 Four-class problem formulation; 4.2.7 Training with CTD data); 4.3 Discussion (4.3.1 Transferring biomedical knowledge towards toxicology; 4.3.2 Deep learning with biomedical knowledge representation; 4.3.3 Data integration); 4.4 Conclusion.
    5 Conclusion and Future perspectives: 5.1 Conclusion (5.1.1 Investigating complex mixtures in the environment; 5.1.2 Complex knowledge from literature and curated databases predict chemical-biomolecule interactions; 5.1.3 Linking chemical exposure to biological effects by integrating CTD); 5.2 Future perspectives.
    S1 Supplement Chapter 1: S1.1 Example of an estrogen bioassay; S1.2 Types of mode of action; S1.3 The dogma of molecular biology; S1.4 Transcriptomics.
    S2 Supplement Chapter 3.
    S3 Supplement Chapter 4: S3.1 Hyperparameter tuning results; S3.2 Functional enrichment with predicted chemical-gene interactions and CTD reference pathway genesets; S3.3 Reduction of learning rate in a model with large word embedding vectors; S3.4 Horizontal augmentation without tail-padding; S3.5 Four-relationship classification; S3.6 Interpreting loss observations for SemMedDB trained models.
    List of Abbreviations; List of Figures; List of Tables; Bibliography; Curriculum scientiae; Selbständigkeitserklärung.
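    The architecture summarised in this abstract (token embeddings feeding a feed-forward classifier for chemical-biomolecule interaction prediction) can be sketched at toy scale. Everything below (vocabulary, dimensions, untrained random weights) is hypothetical; the thesis trained such models on literature-scale relational data and curated toxicological knowledge.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical token vocabulary; the thesis used literature-scale corpora.
vocab = {"benzene": 0, "cadmium": 1, "CYP1A1": 2, "MT1A": 3}
EMB_DIM, HIDDEN = 8, 4

embeddings = rng.normal(size=(len(vocab), EMB_DIM))  # word embedding table
W1 = rng.normal(size=(2 * EMB_DIM, HIDDEN))          # feed-forward layer
W2 = rng.normal(size=HIDDEN)                         # output layer

def predict_interaction(chemical, gene):
    """Embed both tokens, concatenate, pass through a small feed-forward
    network, and squash to an interaction probability (untrained weights)."""
    x = np.concatenate([embeddings[vocab[chemical]], embeddings[vocab[gene]]])
    h = np.tanh(x @ W1)
    logit = float(h @ W2)
    return 1.0 / (1.0 + np.exp(-logit))  # sigmoid

p = predict_interaction("cadmium", "MT1A")
```

    In a trained model, the embedding table and layer weights would be fitted on labelled chemical-gene pairs rather than drawn at random.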

    Innovation Management in the Globalized Digital Society

    Innovation is by far the trendiest management issue nowadays, and the rhetoric of innovation has reached every sector of the economy and society as well. Increasing competitiveness implies economic change through the introduction of new technologies and new methods of production, as well as the development of new skills. Innovation is the core of this process. Innovation management is focused on the systematic processes that organizations use to develop new and improved products, services and business processes. It involves the development of creative ideas within the organization and the networked environment. Focusing on the management of innovation also implies the management of talent among employees. The knowledge captured in new technologies and processes has led to growth and competitiveness. Developing a knowledge-based society requires adequate levels of investment in research, development and education, as well as creating a favorable environment for innovation. Reengineering in terms of innovation has helped many companies to improve their productivity and consequently to grow in competitiveness. Here we discuss the management of innovation in the circumstances of market globalization, the digital revolution, and the dynamic development of technology, products and services. Management of innovation is a complex task of leadership that aims at a systemic process of change through strategic and operational approaches. In this paper we discuss a model of innovation management based on the analysis of the driving forces of change and a framework in which domain and problem definition play an important role. The paper also presents the National Innovation Systems, with a special view of Romania's position within the European Union in terms of innovation.

    Verso un’ecologia generale. Per una cibernetica delle differenze (Toward a General Ecology: For a Cybernetics of Differences)

    The present paper investigates different ways of employing cybernetics to define a general ecology. In An Introduction to Cybernetics, Ross Ashby distinguishes between a pragmatic, a therapeutic, and an encyclopedic function of cybernetics. Pragmatic and therapeutic functions are generally used as a homeotechnic approach to understand and regulate complex systems. The encyclopedic function refers to the ways in which cybernetics considers the modes-of-relation between different branches of knowledge. In §1 and §2, I focus on some improper uses of the expression “general ecology”. Associated with images such as Spaceship Earth and Gaia, general ecology involves a global, holistic and totalizing point of view, based on the binary codes local/global and part/whole. Concerning this global approach, cybernetics is used only pragmatically and therapeutically. Notwithstanding the reference to general ecology, this approach remains restricted to an economy of nature. In §3 and §4, I sketch a different path toward general ecology by passing through the encyclopedic function of cybernetics. Cybernetics thus becomes a way to compose branches of knowledge without implying a general systems theory, using instead operations of transduction – abstract machines – in order to mediate between different branches. In this perspective, the composition between systems is no longer regarded in the light of the binary codes local/global and part/whole; rather, it is based on the “ecological difference” (recursive and multiscale differentiation between system and environment).

    BIG DATA RANKING SYSTEM AS AN EFFECTIVE METHOD OF VISUALIZING THE QUALITY OF URBAN STRUCTURAL UNITS

    Proceedings of the XXV ISUF International Conference “Urban Form and Social Context: from Traditions to Newest Demands” (Krasnoyarsk, July 5–9, 2018).
    Big data is the basis for new technological changes. Constantly growing data volumes greatly complicate data processing and understanding. Big data analysis extracts knowledge and meaningful information from large and complex data sets; the extracted information reveals regularities hidden in the data. Modern cities use the latest technologies to support sustainable development and a high standard of living. An indicator of a high standard of living of the urban population, and consequently of a quality city, is the quality of the urban environment. To evaluate the structural units of a city, the most common method is ranking, and ranking systems based on big data are the most effective method of visualizing the quality of the structural elements of a city. Innovative ways of collecting and analyzing data are gradually replacing obsolete mechanisms of city management. Unlike statistical data, which are out of date by the time of their analysis, big data can be processed in real time, which increases the quality and speed of decision making. Implementing big data methods in ranking systems is complicated by staff shortages, technical equipment, legal rights, security problems and limited openness of data. Urban environment quality ranking systems can be used by the city administration, designers and civil communities to assess the current state and management of the urban environment. The creation of such ranking systems is the first step towards the formation of smart, open, data-driven cities. The introduction of big data into cities can be divided into three levels of increasing influence of data on urban governance: applied (open data city); semi-autonomous (data-driven city); autonomous (smart city).
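    The core of such a ranking system (combining quality indicators into one score per structural unit) can be sketched as follows. The districts, indicators, and weights below are all invented for illustration; a real system would feed in continuously updated city data.

```python
def composite_scores(units, weights):
    """Min-max normalise each indicator across units, then combine the
    normalised values with weights into one quality score per unit."""
    indicators = list(weights)
    lo = {k: min(u[k] for u in units.values()) for k in indicators}
    hi = {k: max(u[k] for u in units.values()) for k in indicators}
    return {
        name: sum(weights[k] * (u[k] - lo[k]) / (hi[k] - lo[k])
                  for k in indicators)
        for name, u in units.items()
    }

# Hypothetical structural units with three quality indicators each.
districts = {
    "North":  {"green_space": 0.30, "transit_access": 0.80, "air_quality": 0.60},
    "Centre": {"green_space": 0.10, "transit_access": 0.95, "air_quality": 0.40},
    "South":  {"green_space": 0.55, "transit_access": 0.50, "air_quality": 0.75},
}
weights = {"green_space": 0.4, "transit_access": 0.3, "air_quality": 0.3}

# Rank units from highest to lowest composite quality score.
ranking = sorted(composite_scores(districts, weights).items(),
                 key=lambda kv: -kv[1])
```

    The resulting ordered list is what a visualization layer would render, for example as a colour-coded city map.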

    Naval Engineering: A National Naval Obligation

    As part of its national obligations, ONR must ensure US world leadership in those unique technology areas that ensure naval superiority. ONR accomplishes this mission through research, recruitment and education, maintaining an adequate base of talent, and sustaining critical infrastructure for research and experimentation. One critical area requiring support by ONR is the "knowledge infrastructure" in Naval Architecture and Marine Engineering (NA&ME). An innovative knowledge infrastructure in NA&ME consists of two main elements:
    • People who have the knowledge, skills and experience to perform innovative design and engineering in Naval Architecture and Marine Engineering; and
    • An industry that employs these people and allows this innovative knowledge to be applied in the ships it designs and builds for the Navy.
    The universities, along with industry, develop the technology and educate the people who are employed by industry. In turn, research supported primarily by the government provides direct support for the conduct of research and the education of the future faculty who perform their doctoral research in this discipline. This study examined the current situation in Navy-related Naval Architecture and Marine Engineering. The need for ONR support in this area is identified and recommendations are made to establish long-term support that will provide for the introduction of innovative technology in naval ships. The following are documented in this report to establish this need:
    (1) The uniqueness of "Engineering for the Marine Environment" is explained. Naval Architecture and Marine Engineering, among all engineering disciplines, studies the design of complex marine systems and their performance in the marine environment. The latter is stochastic in nature and exerts motion- and vibration-dependent loads.
    (2) The uniqueness of analysis, design, and manufacture of naval ships is presented. A key unique aspect of naval ship design is the need for new capabilities in performance, such as high speed, while remaining affordable.
    (3) A vision of the role and knowledge of the NA&ME professional of the future is presented. In a distributed, simulation-based environment, naval architects will lead the design effort by contributing expertise in marine mechanics, design of complex marine systems, and design for manufacturing. Naval architects are trained in marine mechanics and the design of complex marine systems. This breadth of skills will be even greater in the future while remaining based on experience in designing naval ships.
    (4) The Navy's need for a solid national knowledge infrastructure in NA&ME is established. Accordingly, the need for ONR support of research and education in the few healthy NA&ME departments remaining in top-tier US universities is very strong.
    (5) Navy needs for breakthroughs in such areas as survivability of structures, stealth and hydrodynamic performance, and adaptive structures are identified. From those, fundamental research that naval architects are uniquely qualified to perform for ONR is specified.
    (6) A selective industry survey has established the areas of technical expertise needed. Naval Architecture and Integrated Ship Design and Shipbuilding and Manufacturing Technology top the list.
    (7) Freshmen in engineering, the few universities remaining active in teaching and research in NA&ME, ONR, and the shipbuilding industry are the parties involved in this problem. The challenges each party faces are discussed.
    (8) The urgency for ONR to help preserve the knowledge infrastructure in NA&ME is assessed based on current national trends in funding and student choices.
    (9) An educated estimate of the national need for naval architects is presented and used as a basis for establishing the level of long-term funding in research and education required for a steadily healthy and competitive higher education environment.
    (10) An implementation plan for a vigorous knowledge infrastructure and a healthy university environment is proposed. This plan abides by the ONR mandate of supporting fundamental, high-risk, innovative research needed by the Navy. It calls for:
    • A research program centered on National Challenge Initiatives, with the intent to revolutionize the state of the art in ship analysis and design and to bring the participants in this endeavor (industry, government and academia) closer together in perspective and time for innovation.
    • Acknowledging NA&ME as a specialty area of basic research, as is typically done by federal research funding agencies. As an example, in NSF, mechanical, civil, electrical, chemical, etc. are established specialty areas.
    • Modernization of the contents and methods of delivery of marine curricula.
    • Industrial participation in both research and education activities.
    Office of Naval Research, Code 334. ONR Contract number N0001499WR2016.

    Evaluation of relevance of stochastic parameters on Hidden Markov Models

    Prediction of a particular physical phenomenon is based on knowledge of that phenomenon. This knowledge helps us to conceptualize the phenomenon through different models. Hidden Markov Models (HMMs) can be used for modeling complex processes, and this kind of model is used as a tool in fault diagnosis systems. Nowadays, industrial robots operating in stochastic environments need fault detection to prevent any breakdown. In this paper, we evaluate the relevance of Hidden Markov Model parameters without a priori knowledge. After a brief introduction to Hidden Markov Models, we present the model selection criteria most used in the current literature and some methods to evaluate the relevance of stochastic events resulting from Hidden Markov Models. We support our study with an example of a simulated industrial process, using the synthetic model of Vrignat's study (Vrignat 2010). We evaluate the output parameters of the various models tested on this process, to finally arrive at the most relevant model.
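    One widely used family of model selection criteria in this literature is information criteria such as the BIC, which trades off a model's likelihood against its number of free parameters; whether BIC is among the exact criteria this paper surveys is not stated in the abstract. The sketch below (toy two-state HMM with binary observations, all numbers invented) scores one candidate model with the scaled forward algorithm plus BIC.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Forward algorithm with scaling: returns log P(obs | lambda)
    for a discrete-emission HMM lambda = (pi, A, B)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]  # alpha_1(i)
    loglik = 0.0
    for t in range(1, len(obs) + 1):
        scale = sum(alpha)
        loglik += math.log(scale)
        alpha = [a / scale for a in alpha]
        if t < len(obs):
            # induction step: alpha_{t+1}(j)
            alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                     for j in range(n)]
    return loglik

def bic(loglik, n_params, n_obs):
    """Bayesian Information Criterion; lower values indicate a better
    likelihood/complexity trade-off."""
    return -2.0 * loglik + n_params * math.log(n_obs)

# Toy 2-state HMM with binary observations (all numbers invented).
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.9, 0.1], [0.2, 0.8]]
obs = [0, 0, 1, 0, 1, 1, 0, 0]
ll = forward_loglik(obs, pi, A, B)
# Free parameters: (n-1) initial + n(n-1) transition + n(m-1) emission = 5.
score = bic(ll, n_params=5, n_obs=len(obs))
```

    Comparing this score across candidate models with different numbers of hidden states is one way to pick the most relevant model without a priori knowledge.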

    Semantic derivation of enterprise information architecture from Riva-based business process architecture

    Contemporary Enterprise Information Architecture (EIA) design practice in industry still suffers from issues that hamper investment in EIA design. First and foremost of these issues is the shortcoming of EIA design research in bridging the gap between business and systems (or information) architectures. Secondly, contemporary business process architecture methods, and in particular object-based ones, have not been fully exploited for EIA design, thus widening the gap between business processes and systems. In practice, knowledge-driven approaches have thoroughly influenced EIA design. Thirdly, the lack of knowledge representation methods has adversely affected the automation (or semi-automation) of the EIA design process. Software Engineering (SE) technologies and knowledge representation using ontologies continue to prove instrumental in the design of domain knowledge. Finally, current EIA development methods have often resulted in complex designs that hamper both adopting and exploiting EIA in medium- to large-scale organisations.
    This research investigated the derivation of the EIA from a given semantic representation of an object-based Business Process Architecture (BPA), in particular a Riva-based BPA, using a design science research-based methodology. The key design artefact of this research is the BPAOntoEIA framework, which semantically derives an EIA from a semantic representation of the Riva-based BPA of an enterprise. In this framework, EIA elements were derived from the semantic Riva BPA elements and associated business process models, with forward and backward traceability between the derived EIA and the original BPA. The BPAOntoEIA framework has been evaluated using the semantic Cancer Care and Registration BPA in Jordan, and validated using an authentic concern-based evaluation framework employing both static and dynamic validation approaches.
    The BPAOntoEIA framework contributes to bridging the gap between the business and systems worlds by providing business/IT alignment through the EIA derivation process, and by using the semantic knowledge of business processes within the resultant EIA. A major novel contribution is the introduction of new evaluation metrics for EIA design; these are quantitative and indicate not only the quality of the semantic EIA derivation from the associated BPA but also the extent of utilising business process knowledge and of traceability amongst EIA elements.
    Amongst other novel contributions is the semantic EIA derivation process, which comprises a suite of Semantic Web Rule Language (SWRL) rules applied to the semantic BPA elements. The derivation scheme utilises the generic EIA ontology (gEIAOnt) developed in this research, which represents a semantic meta-model of the EIA elements of a generic enterprise. The resultant EIA provides a highly coherent semantic information model that is in line with the theory of EIA design, semantically enriched, and fully utilises the semantic knowledge of business processes.
    Benefits of this research to industry include the semantic EIA derivation process and a resultant information model that utilises the semantic information of business processes in the enterprise. This enables enterprise strategic management to plan for a single, secure and accessible information resource that is business-process driven and enabled in an agile environment. The semantic enrichment of the EIA is a starting point for a simplified design of a domain-independent semantic enterprise architecture for the development of systems of systems in loosely coupled enterprises.
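    The flavour of such a derivation rule (mapping semantic BPA elements to EIA elements while keeping traceability links) can be illustrated outside SWRL. The rule, element types, and names below are hypothetical stand-ins, not the framework's actual ontology terms.

```python
# Toy Riva-style BPA fragment (all identifiers and types hypothetical).
riva_bpa = [
    {"id": "EBE-01", "type": "EssentialBusinessEntity", "name": "Patient"},
    {"id": "CP-01",  "type": "CaseProcess",             "name": "Register Patient"},
    {"id": "EBE-02", "type": "EssentialBusinessEntity", "name": "Tumour Record"},
]

def derive_information_entities(bpa):
    """SWRL-like rule: EssentialBusinessEntity(x) -> InformationEntity(x),
    keeping a derivedFrom link for backward traceability to the BPA."""
    return [
        {"name": e["name"], "kind": "InformationEntity", "derivedFrom": e["id"]}
        for e in bpa
        if e["type"] == "EssentialBusinessEntity"
    ]

eia = derive_information_entities(riva_bpa)
```

    In the actual framework this mapping is expressed as SWRL rules over OWL ontologies, so the traceability links are first-class semantic assertions rather than dictionary fields.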

    BOARD INVITED REVIEW: Prospects for improving management of animal disease introductions using disease-dynamic models

    Management and policy decisions are continually made to mitigate disease introductions in animal populations despite often limited surveillance data or knowledge of disease transmission processes. Science-based management is broadly recognized as leading to more effective decisions, yet application of models to actively guide disease surveillance and mitigate risks remains limited. Disease-dynamic models are an efficient method of providing information for management decisions because of their ability to integrate and evaluate multiple, complex processes simultaneously while accounting for the uncertainty common in animal diseases. Here we review disease introduction pathways and transmission processes crucial for informing disease management and models at the interface of domestic animals and wildlife. We describe how disease transmission models can improve disease management and present a conceptual framework for integrating disease models into the decision process using adaptive management principles. We apply our framework to a case study of African swine fever virus in wild and domestic swine to demonstrate how disease-dynamic models can improve mitigation of introduction risk. We also identify opportunities to improve the application of disease models to support decision-making to manage disease at the interface of domestic and wild animals. First, scientists must focus on objective-driven models providing practical predictions that are useful to those managing disease. For practical model predictions to be incorporated into disease management, it is important to recognize that modeling is a means to improve management and outcomes. This will be most successful in a cross-disciplinary environment that includes scientists and decision-makers representing wildlife and domestic animal health. Lastly, including the economic principles of value of information and cost-benefit analysis in disease-dynamic models can facilitate more efficient management decisions and improve communication of model forecasts. Integration of disease-dynamic models into management and decision-making processes is expected to improve surveillance systems, risk mitigations, outbreak preparedness, and outbreak response activities.
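    As a flavour of what a disease-dynamic model looks like in code, here is a minimal deterministic SIR sketch. All parameter values are invented; real models for African swine fever would add wildlife-domestic contact structure, stochastic introduction events, and parameter uncertainty.

```python
def sir_step(s, i, r, beta, gamma, dt=1.0):
    """One Euler step of a deterministic SIR model on herd fractions:
    beta is the transmission rate, gamma the recovery/removal rate."""
    n = s + i + r
    new_inf = beta * s * i / n * dt
    new_rec = gamma * i * dt
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(s0=0.99, i0=0.01, beta=0.4, gamma=0.1, steps=200):
    """Run the epidemic and report peak prevalence and final epidemic size,
    two quantities managers typically ask models for."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak, r

peak, final_size = simulate()
```

    Comparing runs under different `beta` values is the simplest form of the what-if analysis the review describes, for example to gauge how much a contact-reducing mitigation would shrink the final epidemic size.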

    Reconceptualizing information systems business value in the non-profit organizational context

    The nonprofit sector is an important part of the U.S. economy: an estimated 2.3 million non-profit organizations contributed $804.8 billion to the gross domestic product (GDP), approximately 5.5% of GDP (Roeger, Blackwood, & Pettijohn, 2012). Significant monetary investments and expenditures are made by these organizations. Non-profit organizations reported $1.51 trillion in revenue, $1.45 trillion in expenses, and $2.71 trillion in total assets (Roeger et al., 2012). Many non-profit organizations use donated funds to address complex social problems such as education inequality, financial instability, and limited access to health care services. To impact change in these social areas, non-profit organizations operate within a complex business environment characterized by a significant reliance on volunteers, collaboration with other non-profit organizations, and the pursuit of community-driven strategic objectives. The contextual factors that characterize non-profit organizations can have an impact on the way information systems (IS) are integrated within organizational practices and on how these organizations can use IS effectively to achieve business goals (Zhang et al., 2010). Yet IS research within the non-profit setting is considerably limited (Zhang et al., 2010), and the extent of the impact of these contextual factors is unknown. Further, understanding how non-profit organizations gain value from IS in the non-profit environment has also been neglected in the academic literature. Typical terms associated with IS business value research, such as impact on productivity, market performance, or economic growth (Schryen, 2013), are not applicable in the non-profit business environment.
Non-profit organizational performance is dualistic in nature, primarily focusing on the attainment of various social goals within a particular community in addition to traditional financial measures (Zmud, Carte, & Te'eni, 2004). Therefore, an alternate conceptualization of IS business value and its relationship to organizational performance is necessary when examining IS in non-profit organizations. This multi-method dissertation addresses these issues by focusing on the role of IS in non-profit organizational practices to examine how IS business value is derived in the non-profit context and how it affects non-profit organizational performance. We take an alternate approach to examining IS business value by using the knowledge-based view of the firm as the theoretical base. This divergence from previous studies, which focus solely on the resource-based view of the firm, provides an entirely new avenue for examining IS business value in the non-profit organizational context. First, in the Introduction, we provide a detailed explanation of the contextual factors in the non-profit context. Second, we provide a thorough literature review on IS business value and discuss the difficulties in applying it directly in the non-profit organizational context. Third, we argue for reconceptualizing IS business value using the knowledge-based view of the firm as the theoretical base. This provides firm ground on which to conduct the three studies of this dissertation. The research was conducted at two organizations: United Way of Greater Greensboro (UWGG) and United Way of Central Carolinas (UWCC). Study 1 employs an action research approach at UWGG where, through collaboration with key employees, practical solutions were developed to address IS-related issues faced by the focal organization. 
More specifically, we focused on the utilization of the Enterprise System in an organizational practice and derived theoretical insights on IS business value by integrating practice theory and process theory within the action research approach. Study 2 employs a case study methodology to examine business intelligence (BI) practices at UWCC. We provide background on BI usage in the for-profit organizational context and highlight the lack of research in the non-profit organizational context. We then examine BI from a process perspective and theorize on the value derived from the organizational utilization of an integrated data system. We draw on intellectual capital research, a core concept grounded in the knowledge-based view of the firm, to examine how BI provides UWCC with new knowledge about the impact of its programs in the community. We theorize on non-profit IS business value by examining the relationship between BI-facilitated intellectual capital and its resulting impact on the non-profit's social goal. Study 3 provides a comparative analysis of the role of IS in the social goal strategies employed at both UWGG and UWCC. Using SWOT (strengths, weaknesses, opportunities, and threats) analysis, we examine the favorable and unfavorable aspects of how information systems are utilized in each organization's social goal strategy and provide prescriptive insight into how non-profit organizations can transition towards better strategic IS utilization. Lastly, we conclude the dissertation with a brief summary of salient points, including its contributions to research and practice and a discussion of future research. Overall, this three-study dissertation provides a holistic view of the role of IS in non-profit organizational social goal strategies and of how non-profits derive value from their information systems. 
This dissertation fills gaps in research on IS business value by reconceptualizing it from the knowledge-based view of the firm, applying it in the non-profit organizational context, and developing theoretical insights on it from multiple perspectives. We make significant contributions to the literature in management, organizational behavior, and information systems through our focus on IS usage and utilization in non-profit organizations. This dissertation is one of the first studies to examine non-profit IS organizational practices in situ, provide practical insight into the role of IS in non-profit social goal strategies, and develop theoretical insights into how non-profits utilize and gain value from information systems.