553 research outputs found

    Application of nitrogen metabolism in autotrophic bacteria to chemosynthetic bioregeneration in space missions, supplement

    The chemolithotroph Hydrogenomonas eutropha was evaluated as the basis of a bioregenerative life-support system. This project focuses on several metabolic functions related to the proposed nitrogen cycle between man and this microbe. Specifically, the organism can utilize as its sole nitrogen source such urine components as urea and fifteen individual amino acids, but not nine other amino acids. Utilization was highly effective for many amino acids, and several specific growth inhibitions were also observed. The enzyme that catalyzes the incorporation of ammonia from the medium into amino acids was identified as an NADP-specific L-glutamate dehydrogenase; this enzyme is constitutive. The organism can synthesize all of its amino acids from carbon dioxide and ammonia. Therefore, against the background literature on the multiple pathways of individual amino acid biosynthesis, our evidence to date is consistent with the Hydrogenomonas group having the same pathway of valine-isoleucine formation as the classical E. coli.

    Adaptive methods for linear dynamic systems in the frequency domain with application to global optimization

    Designers often seek to improve their designs by considering several discrete modifications. These modifications may require changes in materials and geometry, as well as the addition or removal of individual components. In general, if the modifications are applied one at a time, none of them may sufficiently improve performance. The total number of modifications that may be included in the final design is also often limited by cost or other constraints. The designer must therefore determine the optimal combination of modifications in order to complete the design. While this design challenge arises fairly commonly in practice, very little research has studied it in its full generality. This work assumes that the mathematical descriptions of the design and its modifications are frequency-dependent matrices. Such matrices typically arise from finite element analysis and other modeling techniques. Computing performance metrics related to the steady-state forced response, also known as performing a frequency sweep, involves factorizing these matrices many times, and determining the globally optimal design involves an exhaustive search over the combinations of modifications. Together, these factors lead to prohibitively long run times, particularly as the size of the system grows. The research presented here seeks to reduce these costs, making such a search feasible. Several innovative techniques were developed and tested over the course of the research, focused on two primary areas: adaptive frequency sweeps and efficient combinatorial optimization. The frequency sweep methods rely on an adaptive bisection of the frequency range and either a subspace approximation based on implicit interpolatory model order reduction or an elementwise approximation using piecewise multi-point Padé interpolants. Additionally, a strategy for augmenting the adaptive methods with the system's modal information is presented. For combinatorial optimization, an approximation algorithm is developed that capitalizes on any dynamic uncoupling between modifications. The net effect of this work is to allow designers and researchers to develop new dynamic systems and perform analyses faster and more efficiently than before.
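    The adaptive bisection at the heart of the frequency sweep methods can be illustrated with a minimal sketch. The sketch below assumes a generic second-order system (K − ω²M)x = f and replaces the abstract's actual surrogates (implicit interpolatory model order reduction and multi-point Padé interpolants) with a simple endpoint average, so it shows only the refinement logic; the system, names, and tolerances are all hypothetical.

```python
# Minimal sketch of an adaptive-bisection frequency sweep, assuming a
# hypothetical second-order model (K - w^2 M) x(w) = f. The endpoint-average
# surrogate below stands in for the MOR/Pade approximations in the abstract.
import numpy as np

def solve_exact(w, K, M, f):
    """Factor and solve the frequency-dependent system at one frequency."""
    return np.linalg.solve(K - (w ** 2) * M, f)

def adaptive_sweep(w_lo, w_hi, K, M, f, tol=1e-3, max_depth=12):
    """Recursively bisect [w_lo, w_hi], refining only where a cheap
    interpolant between the endpoints fails the midpoint error test."""
    x_lo, x_hi = solve_exact(w_lo, K, M, f), solve_exact(w_hi, K, M, f)
    samples = {w_lo: x_lo, w_hi: x_hi}

    def refine(wl, xl, wh, xh, depth):
        wm = 0.5 * (wl + wh)
        x_interp = 0.5 * (xl + xh)           # stand-in surrogate prediction
        xm = solve_exact(wm, K, M, f)        # exact solve for the error test
        samples[wm] = xm
        err = np.linalg.norm(xm - x_interp) / max(np.linalg.norm(xm), 1e-30)
        if err > tol and depth < max_depth:  # interval unresolved: bisect it
            refine(wl, xl, wm, xm, depth + 1)
            refine(wm, xm, wh, xh, depth + 1)

    refine(w_lo, x_lo, w_hi, x_hi, 0)
    return dict(sorted(samples.items()))

# Toy usage: a 3-DOF system swept from 0.1 to 10 rad/s.
rng = np.random.default_rng(0)
M = np.eye(3)
K = np.diag([1.0, 4.0, 9.0]) + 1e-2 * rng.standard_normal((3, 3))
f = np.ones(3)
response = adaptive_sweep(0.1, 10.0, K, M, f)
print(f"{len(response)} adaptive frequency samples kept")
```

    The refinement concentrates exact factorizations near resonances, where the cheap surrogate fails the midpoint test, and leaves intervals coarse elsewhere; the higher-order interpolants the abstract describes would pass this test with far fewer solves.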

    Business Intelligence Software for the Classroom: MicroStrategy Resources on the Teradata University Network

    Faculty members are challenged to stay abreast of business intelligence and to teach the topic in relevant ways. The latest enhancement to the Teradata University Network (www.TeradataUniversityNetwork.com) is the addition of business intelligence software. The website now offers MicroStrategy 7i, an interactive environment for business reporting and analysis, and several MicroStrategy analytic modules that focus on analysis of specific business processes. The new software is available for hands-on use by faculty and students. This tutorial describes these business intelligence resources and suggests several ways in which they can be used to create effective classroom experiences. The resources are available to all faculty and students at no cost by registering with the Teradata University Network.

    Climate Change, Capitalism, and Citizen Science: Developing a dialectical framework for examining volunteer participation in climate change research

    This dissertation discusses the complex social relations that link citizen science, scientific literacy, and the dissemination of information to the public. Scientific information is not produced in value-neutral settings by people removed from their social context. Instead, science is a social pursuit, and the scientist's social context is embedded in the knowledge produced. Additionally, the dissemination of this information via numerous media outlets is filtered through institutional lenses and subject to journalistic norms. As a result, the general public must be able to recognize the inherent biases in this information. Yet rates of scientific literacy in the U.S. are quite low, which suggests that people may not be capable of fully understanding the biases present. Furthermore, people tend to seek out sources that echo their values and personal perspectives, thus reinforcing their own biases. Improving scientific literacy allows people to see past these biases and translate media narratives in order to comprehend the facts and evidence presented to them. Citizen science is both an epistemological tool used by scientists to collect and interpret scientific data and a means to improve the scientific literacy of participants. Citizen science programs can generate real knowledge and build the critical thinking skills the general public needs to interpret scientific information.

    Overcoming Organizational Obstacles to Artificial Intelligence Value Creation: Propositions for Research

    Artificial intelligence (AI) is the next technology revolution, one that offers huge potential benefits for companies around the world. Companies that learn how to use AI effectively will be positioned to maximize value creation from data in the emerging algorithmic economy. Uptake of AI has been limited, however, and concerns associated with its use are mounting. This paper explores what companies need to understand about AI so they can make the most of this transformational technology. The paper develops a research framework and an associated research agenda intended to motivate practice-based research that will help organizations overcome obstacles to AI value creation.

    Enablers and Mechanisms: Practices for Achieving Synergy with Business Analytics

    Business Analytics (BA) systems use advanced statistical and computational techniques to analyze organizational data and enable informed, insightful decision-making. BA systems interact with other organizational systems, and when the relationship is synergistic, together they create higher-order BA-enabled organizational systems that have the potential to create value and confer competitive advantage. In this paper, we focus on the enablers and mechanisms of synergy between BA and other organizational systems, and we identify a set of organizational practices that underlie the emergence of BA-enabled organizational systems. We use a case study of a large IT firm to identify the organizational practices associated with the synergistic relationships from which higher-order BA-enabled organizational systems emerge.

    Data Liquidity: Conceptualization, Measurement and Determinants

    Despite the rhetoric that “data is the new oil,” organizations continue to face challenges in data monetization, and we lack a reliable way to measure how easily data assets can be reused and recombined in value creation and appropriation efforts. Data asset liquidity is a critical, yet underexamined, prerequisite for data monetization initiatives. We contribute to the theorizing process by advancing a definition, conceptualization, and measurement of data liquidity as an asset-level construct. Based on interviews with 95 Chief Data and Analytics Officers from 67 distinct large global organizations, we identify three determinants of data liquidity: inherent asset characteristics, structural asset characteristics, and asset environment characteristics. We theorize the existence of equifinal configurations that yield liquid data assets, configurations that should prove helpful to academics and practitioners seeking to understand data liquidity and its impact on firms’ data monetization efforts, as well as society at large.

    Editors' Comments


    Sherwin-Williams' Data Mart Strategy: Creating Intelligence Across the Supply Chain

    Companies can build a data warehouse using a top-down or a bottom-up approach, and each has its advantages and disadvantages. With the top-down approach, a project team creates an enterprise data warehouse that combines data from across the organization, and end-user applications are developed after the warehouse is in place. This strategy is likely to result in a scalable data warehouse but, like most large IT projects, it is time consuming, expensive, and may fail to deliver benefits within a reasonable timeframe. With the bottom-up approach, a project team begins by creating a data mart that has a limited set of data sources and meets very specific user requirements. After the first data mart is complete, subsequent marts are developed and conformed to the data structures and processes already in place. The data marts are incrementally architected into an enterprise data warehouse that meets the needs of users across the organization. The appeal of the data mart strategy is that a mart can be built quickly, at relatively little cost and risk, while providing a proof of concept for data warehousing. The risk is that the initial data mart will not scale into an enterprise data warehouse, and what has been built will have to be scrapped and redone. This article provides a case study of Sherwin-Williams' successful use of the bottom-up, data mart strategy. It provides background information on Sherwin-Williams, the data warehousing project, the benefits being realized from the warehouse, and the lessons learned. The case is a textbook example of how to successfully execute a data mart strategy. Video clips of interviews with key individuals at Sherwin-Williams help bring the case alive.
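    The key to the bottom-up strategy described above is that each new mart conforms to dimensions already in place, so the marts can later be combined into an enterprise warehouse. The sketch below illustrates that idea with an in-memory SQLite database; the table and column names are purely illustrative, not Sherwin-Williams' actual schema.

```python
# Minimal sketch of "conformed dimensions": two independently built marts
# (sales and inventory) share one date dimension, so they can later be
# combined. All names here are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Conformed dimension: defined once, reused by every subsequent mart.
    CREATE TABLE dim_date (
        date_key   INTEGER PRIMARY KEY,
        full_date  TEXT,
        fiscal_qtr TEXT
    );

    -- Mart 1: built first, against the shared dimension.
    CREATE TABLE fact_sales (
        date_key  INTEGER REFERENCES dim_date(date_key),
        store_id  INTEGER,
        sales_usd REAL
    );

    -- Mart 2: added later, conformed to the same date_key.
    CREATE TABLE fact_inventory (
        date_key INTEGER REFERENCES dim_date(date_key),
        sku_id   INTEGER,
        on_hand  INTEGER
    );
""")

conn.execute("INSERT INTO dim_date VALUES (20240102, '2024-01-02', 'Q1')")
conn.execute("INSERT INTO fact_sales VALUES (20240102, 7, 1250.0)")
conn.execute("INSERT INTO fact_inventory VALUES (20240102, 42, 310)")

# Because both marts conform to dim_date, a cross-mart ("drill-across")
# query needs no restructuring of either mart.
row = conn.execute("""
    SELECT d.fiscal_qtr, SUM(s.sales_usd), SUM(i.on_hand)
    FROM dim_date d
    JOIN fact_sales s     ON s.date_key = d.date_key
    JOIN fact_inventory i ON i.date_key = d.date_key
    GROUP BY d.fiscal_qtr
""").fetchone()
print(row)  # ('Q1', 1250.0, 310)
```

    Because both fact tables key into the same dimension, each new mart extends rather than fragments the warehouse, which is what allows the incremental architecture the article describes to scale.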