
    A review of some problems in global-local stress analysis

    The various types of local-global finite-element problems point to the need for a new generation of software. First, this new software needs a complete analysis capability, encompassing linear and nonlinear analysis of 1-, 2-, and 3-dimensional finite-element models, as well as mixed-dimensional models. The software must be capable of treating static and dynamic (vibration and transient response) problems, including the stability effects of initial stress, and should handle both elastic and elasto-plastic materials. It should carry a set of optional diagnostics to assist the user during model generation and help avoid obvious structural modeling errors. In addition, the software should be well documented so the user has a complete technical reference for each element type in the program library, including information on topics such as the type of numerical integration, the use of underintegration, and the inclusion of incompatible modes. Some packaged information should also be available to assist the user in building mixed-dimensional models. An important advancement in finite-element software should be the development of program modularity, so that the user can select from a menu various basic operations in matrix structural analysis.
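
    The "menu of basic operations in matrix structural analysis" envisioned above can be illustrated with a minimal sketch: a modular routine that builds element stiffness matrices and scatters them into a global system. All function names and numbers here are illustrative, not taken from the review.

```python
import numpy as np

def bar_element_stiffness(E, A, L):
    """2x2 stiffness matrix of a 1-D bar element (axial DOF only)."""
    k = E * A / L
    return k * np.array([[1.0, -1.0],
                         [-1.0, 1.0]])

def assemble_global_stiffness(n_nodes, elements):
    """Scatter element matrices into the global stiffness matrix.

    elements: list of (node_i, node_j, E, A, L) tuples.
    """
    K = np.zeros((n_nodes, n_nodes))
    for i, j, E, A, L in elements:
        ke = bar_element_stiffness(E, A, L)
        for a, p in enumerate((i, j)):
            for b, q in enumerate((i, j)):
                K[p, q] += ke[a, b]
    return K

# Two steel bars in series: nodes 0-1-2, 1 cm^2 section, 1 m long each.
K = assemble_global_stiffness(3, [(0, 1, 210e9, 1e-4, 1.0),
                                  (1, 2, 210e9, 1e-4, 1.0)])
```

    Each element routine is one "menu item"; assembly, solution, and post-processing would be further independent modules in the same spirit.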

    Methodology for Seamless Supply Chain Planning

    Today, enterprises are typically in a constant process of acquiring and updating their information technologies, yet usually without an overall view of global inter- and intra-enterprise system integration. Researchers have proposed methodologies and platforms to assist such integration of applications and data. However, implementing new technologies in organizations is a difficult task, since the quality requirements for architecture development are more demanding and critical than ever, owing to system complexity and scale and to the interoperability requirements for interacting with third-party applications and infrastructures. This paper proposes a methodology for seamless Supply Chain Planning (SCP) that uses a domain reference ontology, data model representation standards, software component evaluation, and interoperability checking processes. The VALTE methodology is used to assure that enterprises use SCP tools compliant with the semantics represented in a common reference ontology, created by the MENTOR methodology. These two horizontal methodologies are vertically supported by interoperability checking processes, which assure an interoperable supply chain planning system.

    Enterprise Reference Library

    Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms exist to share reference libraries across researchers, projects, or directorates. Likewise, information regarding which references have been provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying shared folder will warehouse the electronic full-text articles, allowing the global user community to access them. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full-text articles. This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and duplication costs. Most importantly, increasing collaboration across research groups provides unprecedented access to information relevant to NASA's mission. Conclusion: This project is an expansion and cost-effective leveraging of the existing JSC centralized library. Adding keyword and author search capabilities and an alert function for notifications about new articles, based on users' profiles, are examples of future enhancements.

    A technology reference model for client/server software development

    In today's highly competitive global economy, information resources representing enterprise-wide information are essential to the survival of an organization. The development and increasing use of personal computers and data communication networks are supporting or, in many cases, replacing the traditional mainframe mainstay of corporations. The client/server model combines mainframe programming with desktop applications on personal computers. The aim of the research is to compile a technology model for the development of client/server software. A comprehensive overview of the individual components of the client/server system is given. The different methodologies, tools and techniques that can be used are reviewed, as well as client/server-specific design issues. The research is intended to create a road map in the form of a Technology Reference Model for Client/Server Software Development. M.Sc. (Information Systems)

    Building and calibrating a country-level detailed global electricity model based on public data

    Deep decarbonization of the global electricity sector is required to meet ambitious climate change targets. This underlines the need for improved models to facilitate an understanding of the global challenges ahead, particularly the concept of large-scale interconnection of power systems. Developments in recent years regarding the availability of open data, as well as improvements in hardware and software, have stimulated the use of more advanced and detailed electricity system models. In this paper we explain the process of developing a first-of-its-kind reference global electricity system model with over 30,000 individual power plants representing 164 countries spread out over 265 nodes. We describe the steps in the model development, assess the limitations and existing data gaps, and showcase the robustness of the model by benchmarking calibrated hourly simulation results against historical emission and generation data on a country level. The model can be used to evaluate the operation of today's power systems or can be applied in scenario studies assessing a range of global decarbonization pathways. Comprehensive global power system datasets are provided as part of the model input data, with all data being openly available under the FAIR Guiding Principles for scientific data management and stewardship, allowing users to modify or recreate the model in other simulation environments. The software used for this study (PLEXOS) is freely available for academic use.
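
    The country-level benchmarking step described above amounts to comparing calibrated simulation output against historical records. A minimal sketch of such a comparison, using a normalized root-mean-square error and purely illustrative numbers (not the paper's data or the PLEXOS API), might look like:

```python
import math

def normalized_rmse(simulated, historical):
    """RMSE of simulated vs. historical values, normalized by the
    mean of the historical series (dimensionless, 0 = perfect fit)."""
    assert len(simulated) == len(historical) and historical
    n = len(historical)
    rmse = math.sqrt(sum((s - h) ** 2 for s, h in zip(simulated, historical)) / n)
    return rmse / (sum(historical) / n)

# Hypothetical annual generation per country in TWh (illustrative only).
sim  = {"DE": 610.0, "FR": 550.0, "PL": 158.0}
hist = {"DE": 600.0, "FR": 540.0, "PL": 165.0}
errors = {c: normalized_rmse([sim[c]], [hist[c]]) for c in sim}
```

    In practice the same metric would be applied per country to hourly or annual series of generation and emissions.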

    The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    The Generation Challenge Programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computing (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making.

    Investigating the problems experienced by virtual team members engaged in requirements elicitation

    The constant acceleration in the rate of technological innovation, and the ever growing emphasis on the importance of information for competition has seen organisations around the world strive for the technologies that give them global customer reach. One of the most pervasive technological innovations developed is the internet, and its unique quality of being able to draw people from across the world together in one virtual space has given birth to the concept of virtual teams. Organisations have seized the advantages of such virtual teams to give them the cost and time reductions they need to stay competitive in the global marketplace. In the software industry, where product and service development is always a race against time, forward thinking software companies in the developed world have taken full advantage of the cost and time saving benefits that virtual teams have to offer. In addition, the rate of expansion of technology and software to support such teams is also growing exponentially, offering increasingly faster ways of virtual working. Despite the immense advantages offered by such teams, South African software development companies do not seem to engage in distributed work to any great degree. The importance of this research rests on the belief that South African software development companies will be unable to avoid engaging in distributed software development if they are to achieve and maintain competitiveness in the global marketplace. This research focuses on a sub-section of the software development process with a specific reference to South African software development. The requirements elicitation phase of software development is one of the initial stages of any software project. It is here that developers work with the users in order to identify requirements for the system to be built. 
    It is acknowledged that other phases of distributed development bring their own problems; however, in the interests of scoping this research, only the requirements elicitation process is considered. The research shows that most requirements elicitation techniques can be adapted for use within the virtual environment, although each technique has its share of advantages and disadvantages. In addition, virtual team members experience problems during their general, day-to-day interactions, many of them arising from the dependence on technology for communication and task performance. The research identifies the problems in both categories, and develops a holistic model of virtual requirements elicitation to prevent or solve the problems experienced by virtual teams engaged in distributed requirements elicitation. The model is made up of three key frameworks, each of which prescribes actions to be taken to ensure the success of the virtual team within the requirements elicitation process. The model is verified through the testing of its critical success factors. Certain aspects of the model were adapted based on the findings of the study, but it was confirmed that the rationale behind the model is sound, indicating that it has the potential to solve the problems of virtual requirements elicitation when implemented.

    Carbon residence time dominates uncertainty in terrestrial vegetation responses to future climate and atmospheric CO2.

    Future climate change and increasing atmospheric CO2 are expected to cause major changes in vegetation structure and function over large fractions of the global land surface. Seven global vegetation models are used to analyze possible responses to future climate simulated by a range of general circulation models run under all four representative concentration pathway scenarios of changing concentrations of greenhouse gases. All 110 simulations predict an increase in global vegetation carbon to 2100, but with substantial variation between vegetation models. For example, at 4 °C of global land surface warming (510-758 ppm of CO2), vegetation carbon increases by 52-477 Pg C (224 Pg C mean), mainly due to CO2 fertilization of photosynthesis. Simulations agree on large regional increases across much of the boreal forest, western Amazonia, central Africa, western China, and southeast Asia, with reductions across southwestern North America, central South America, southern Mediterranean areas, southwestern Africa, and southwestern Australia. Four vegetation models display discontinuities across 4 °C of warming, indicating global thresholds in the balance of positive and negative influences on productivity and biomass. In contrast to previous global vegetation model studies, we emphasize the importance of uncertainties in projected changes in carbon residence times. We find, when all seven models are considered for one representative concentration pathway × general circulation model combination, such uncertainties explain 30% more variation in modeled vegetation carbon change than responses of net primary productivity alone, increasing to 151% for non-HYBRID4 models. A change in research priorities away from production and toward structural dynamics and demographic processes is recommended. The research leading to these results has received funding from the European Community’s Seventh Framework Programme (FP7 2007-2013) under Grant 238366. 
    R.B., R.K., R.D., A.W., and P.D.F. were supported by the Joint Department of Energy and Climate Change/Department for Environment, Food and Rural Affairs Met Office Hadley Centre Climate Programme (GA01101). A.I. and K.N. were supported by the Environment Research and Technology Development Fund (S-10) of the Ministry of the Environment, Japan. We acknowledge the World Climate Research Programme’s Working Group on Coupled Modelling, which is responsible for the Coupled Model Intercomparison Project (CMIP), and we thank the climate modeling groups responsible for the GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M models for producing and making available their model output. For CMIP, the US Department of Energy’s Program for Climate Model Diagnosis and Intercomparison provides coordinating support and led development of software infrastructure in partnership with the Global Organization for Earth System Science Portals. This work has been conducted under the framework of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP). The ISI-MIP Fast Track project was funded by the German Federal Ministry of Education and Research (BMBF) with project funding Reference 01LS1201A. This is the author accepted manuscript. The final version is available from PNAS via http://dx.doi.org/10.1073/pnas.122247711
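
    The emphasis on carbon residence time above can be illustrated with the standard first-order turnover relation, where equilibrium vegetation carbon is the product of net primary productivity (NPP) and residence time (tau). The numbers below are illustrative only, not results from the study:

```python
def equilibrium_vegetation_carbon(npp, tau):
    """First-order turnover model: at steady state, vegetation carbon
    equals net primary productivity times carbon residence time."""
    return npp * tau

# Illustrative global numbers: NPP in Pg C/yr, tau in yr.
npp0, tau0 = 60.0, 8.0      # baseline state
npp1, tau1 = 72.0, 9.0      # perturbed climate/CO2 state

c0 = equilibrium_vegetation_carbon(npp0, tau0)
c1 = equilibrium_vegetation_carbon(npp1, tau1)

# Decompose the carbon change into a productivity term and a
# residence-time term; uncertainty in the latter is the paper's focus.
d_npp_term = (npp1 - npp0) * tau0    # change attributable to NPP alone
d_tau_term = npp1 * (tau1 - tau0)    # change attributable to tau alone
assert abs((c1 - c0) - (d_npp_term + d_tau_term)) < 1e-9
```

    The decomposition shows why spread in tau across models can dominate spread in projected carbon even when NPP responses agree.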

    Independent Verification of Mars-GRAM 2010 with Mars Climate Sounder Data

    The Mars Global Reference Atmospheric Model (Mars-GRAM) is an engineering-level atmospheric model widely used for diverse mission and engineering applications. Applications of Mars-GRAM include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Atmospheric influences on landing site selection and long-term mission conceptualization and development can also be addressed using Mars-GRAM. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte Carlo mode, to perform high-fidelity engineering end-to-end simulations for entry, descent, and landing. Mars-GRAM is an evolving software package, with ongoing development yielding improved accuracy and additional features. Mars-GRAM 2005 has been validated against Radio Science data, and against both nadir and limb data from the Thermal Emission Spectrometer (TES). From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Above 80 km, Mars-GRAM is based on the University of Michigan Mars Thermospheric General Circulation Model (MTGCM). The most recent release, Mars-GRAM 2010, includes an update to Fortran 90/95 and the addition of adjustment factors. These adjustment factors are applied to the input data from the MGCM and the MTGCM for the mapping year 0 user-controlled dust case, and are expressed as a function of height (z), latitude, and areocentric solar longitude (Ls).
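
    The height-dependent adjustment factors described above can be sketched as a table lookup with linear interpolation, applied multiplicatively to the raw model output. The factor values and table structure here are hypothetical, not Mars-GRAM's actual data, and the real factors also vary with latitude and Ls:

```python
import bisect

# Hypothetical multiplicative density corrections tabulated at
# discrete heights (km); illustrative values only.
HEIGHTS_KM = [0.0, 20.0, 40.0, 60.0, 80.0]
FACTORS    = [1.00, 1.05, 0.97, 1.02, 0.95]

def adjustment_factor(z_km):
    """Linearly interpolate the tabulated factor at height z_km,
    clamping to the table ends outside the tabulated range."""
    if z_km <= HEIGHTS_KM[0]:
        return FACTORS[0]
    if z_km >= HEIGHTS_KM[-1]:
        return FACTORS[-1]
    i = bisect.bisect_right(HEIGHTS_KM, z_km)
    z0, z1 = HEIGHTS_KM[i - 1], HEIGHTS_KM[i]
    f0, f1 = FACTORS[i - 1], FACTORS[i]
    return f0 + (f1 - f0) * (z_km - z0) / (z1 - z0)

def adjusted_density(model_density, z_km):
    """Apply the factor multiplicatively to a raw MGCM/MTGCM value."""
    return model_density * adjustment_factor(z_km)
```

    Extending this to the full (z, latitude, Ls) dependence would turn the 1-D interpolation into a 3-D one over a gridded factor table.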