
    Suitability of Unidata Metapps for Incorporation in Platform-Independent User-Customized Aviation Weather Products Generation Software

    The Air Force Combat Climatology Center (AFCCC) is tasked to provide long-range seasonal forecasts for worldwide locations. Currently, the best long-range temperature forecasts available to the weather community are the climatological standard normals. This study takes a step toward long-range forecasting by developing a process that predicts temperatures better than climatological standard normals or simple frequency distributions of occurrences. Northern Hemispheric teleconnection indices and the standardized Southern Oscillation Index are statistically compared to three-month summed Heating Degree Days (HDDs) and Cooling Degree Days (CDDs) at 14 U.S. locations. First, linear regression was performed; it yielded numerous valid models, but the percentage of variance resolved by the models was rarely over 30%. The HDDs and CDDs were then analyzed with data-mining classification trees, but it proved difficult to extract any predictive quantitative information from the results. Finally, a data-mining regression tree analysis was performed. At each conditional outcome, a range of HDDs/CDDs is produced using the predicted standard deviations about the mean. For verification, independent teleconnection indices were used as predictors in the conditional model; 90% of the resulting HDDs/CDDs fell within the calculated range. The overall average reduction in the forecast range was 35.7% relative to climatology.
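
    To make the regression-tree range forecast concrete, here is a minimal Python sketch of the general idea on synthetic data: a tree is fit to hypothetical teleconnection indices, and each leaf reports the mean of its training HDDs plus or minus one standard deviation as the forecast range. The index names, the data, and the one-sigma width are assumptions, not AFCCC's actual model.

```python
# Minimal sketch of a regression-tree range forecast on synthetic data;
# predictors stand in for teleconnection indices (hypothetical).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 3))            # e.g. NAO, PNA, SOI indices (assumed)
y = 1200 + 400 * X[:, 0] + rng.normal(scale=150, size=300)  # 3-month summed HDDs

tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=20).fit(X, y)

# For each leaf, the forecast "range" is the mean +/- one standard deviation
# of the training HDDs that fell into that leaf.
leaf_of_sample = tree.apply(X)
ranges = {leaf: (y[leaf_of_sample == leaf].mean(), y[leaf_of_sample == leaf].std())
          for leaf in np.unique(leaf_of_sample)}

x_new = rng.normal(size=(1, 3))          # a new season's indices
mu, sd = ranges[tree.apply(x_new)[0]]
print(f"forecast HDD range: {mu - sd:.0f} .. {mu + sd:.0f}")
```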

    Advanced Scientific Visualization, a Multidisciplinary Technology Based on Engineering and Computer Science

    Today’s visualization tools are equipped with highly interactive visual aids that allow analysis and inspection of complex numerical data generated by high-bandwidth sources such as simulation software, experimental rigs, satellites, and scanners. Such tools help scientists and engineers extract, visualize, interpret, and analyze data, giving them a high degree of interactivity and effectiveness in solving design problems that grow more complex every day. As today’s visualization tools diversify, there is a growing need to use them simultaneously within different engineering software when solving multidisciplinary engineering problems. Such tools must be available for combined use in order to eliminate many well-known problems of sharing, accessing, and exchanging design models and their related information content. It is shown that Object-Oriented methodology is a well-suited approach for streamlining the development of future engineering applications. The three European projects ALICE, LASCOT, and SERKET are given as examples in which evolving computer software technologies were researched and demonstrated to address the evolution of visualization software in engineering and of information visualization in general.

    Virtualising visualisation: A distributed service based approach to visualisation on the Grid

    Context: Current visualisation systems are not designed to work with the large quantities of data produced by scientists today; they rely on a single resource to perform all processing and visualisation of data, which limits the problem size they can investigate. Objectives: The objectives of this research are to address the issues scientists encounter with current visualisation systems and the deficiencies highlighted in those systems. The research then addresses the question: "How do you design the ideal service-oriented architecture for visualisation that meets the needs of scientists?" Method: A new design for a visualisation system based upon a Service Oriented Architecture is proposed to address the issues identified; the architecture is implemented using Java and web service technology. The implementation of the architecture also realised several case study scenarios as demonstrators. Evaluation: Evaluation was performed using case study scenarios of scientific problems, and performance data was gathered through experimentation. The scenarios were assessed against the requirements for the architecture, and the performance data against a base case simulating a single-resource implementation. Conclusion: The virtualised visualisation architecture shows promise for applications where visualisation can be performed in a highly parallel manner and where the problem can easily be sub-divided into chunks for distributed processing.
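
    As an illustration of the chunked, service-style decomposition the conclusion describes, the sketch below splits a volume across worker "services" and merges their partial renderings. The thesis implementation used Java and web services; this Python process-pool stand-in only mirrors the structure, and the rendering step is a deliberately trivial projection.

```python
# Minimal sketch of sub-dividing a visualisation task into chunks handled by
# independent workers; a stand-in for the thesis's Java/web-service design.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def render_chunk(chunk: np.ndarray) -> np.ndarray:
    """Stand-in for one visualisation service: a max-intensity projection."""
    return chunk.max(axis=0)

def visualise(volume: np.ndarray, n_services: int = 4) -> np.ndarray:
    chunks = np.array_split(volume, n_services, axis=0)   # sub-divide the problem
    with ProcessPoolExecutor(max_workers=n_services) as pool:
        partials = list(pool.map(render_chunk, chunks))   # one "service" per chunk
    return np.maximum.reduce(partials)                    # merge partial results

if __name__ == "__main__":
    image = visualise(np.random.rand(64, 256, 256))
    print(image.shape)  # (256, 256)
```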

    Semotus visum: a flexible remote visualization framework

    By offering more detail and precision, large data sets can provide greater insights to researchers than small data sets. However, these data sets require greater computing resources to view and manage. Remote visualization techniques allow the use of computers that cannot be operated locally. The Semotus Visum framework applies a high-performance client-server paradigm to the problem, utilizing both client and server resources via multiple rendering methods. Experimental results show the framework delivers high frame rates and low latency across a wide range of data sets.
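
    A rough sketch of the client-server trade-off such a framework must make: choose a rendering method from the client's compute power, the link bandwidth, and the model size. The thresholds and method names below are illustrative assumptions, not the Semotus Visum API.

```python
# Illustrative policy for picking where rendering happens; the numbers and
# method names are invented, not taken from the Semotus Visum framework.
def choose_render_method(client_gflops: float, link_mbps: float,
                         triangles: int) -> str:
    geometry_mb = triangles * 36 / 1e6        # ~36 bytes per triangle (assumed)
    if client_gflops > 5 and geometry_mb < link_mbps:   # client can keep up
        return "send_geometry"                # client renders locally
    if link_mbps > 10:
        return "send_full_frames"             # server renders, streams images
    return "send_subsampled_frames"           # low bandwidth: reduced images

print(choose_render_method(client_gflops=2.0, link_mbps=50.0, triangles=5_000_000))
```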

    An Isosurface Continuity Algorithm for Super Adaptive Resolution Data

    We present the chain-gang algorithm for isosurface rendering of super adaptive resolution (SAR) volume data, designed to minimize (1) the space needed to store both the data and the isosurface and (2) the time taken for computation.
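
    For context, the sketch below shows baseline isosurface extraction on a uniform grid with scikit-image's marching cubes; the chain-gang algorithm's actual contribution, keeping the surface continuous across adaptive-resolution blocks, is not reproduced here.

```python
# Baseline isosurface extraction on a uniform grid (not the chain-gang
# algorithm): marching cubes on a synthetic scalar field.
import numpy as np
from skimage.measure import marching_cubes

x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = x**2 + y**2 + z**2                    # scalar field: squared radius
verts, faces, normals, values = marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")  # a sphere's surface
```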

    Methods and Distributed Software for Visualization of Cracks Propagating in Discrete Particle Systems

    Scientific visualization is becoming increasingly important in analyzing and interpreting numerical and experimental data sets. Parallel computations of discrete particle systems produce large data sets that can be stored and visualized on distributed IT infrastructures, but this leads to very complicated environments handling complex simulation and interactive visualization on remote heterogeneous architectures. In the microstructure of a continuum, broken connections between neighbouring particles can form complex cracks of unknown geometrical shape. The complex disjoint crack surfaces with holes, and the lack of a suitable scalar field defining them, limit the application of common surface extraction methods. The main visualization task is therefore to extract the crack surfaces from the connectivity of the broken connections and the geometry of the neighbouring particles. The research aims at enhancing visualization methods for discrete particle systems and increasing the speed of distributed visualization software. The dissertation consists of an introduction, three main chapters and general conclusions. The first chapter reviews the literature on visualization software, distributed environments, discrete element simulation of particle systems and crack visualization methods. The second chapter proposes novel visualization methods for extracting crack surfaces from monodispersed particle systems modelled by the discrete element method: the cell cut-based method, the Voronoi-based method and the cell centre-based method explicitly define the geometry of propagating cracks in fractured regions. The proposed methods were implemented in the grid visualization e-service VizLitG and the distributed visualization software VisPartDEM, and partial data set transfer from the grid storage element was developed to reduce data transfer and visualization time. The third chapter presents the results of the experimental research. The performance of the e-service VizLitG was evaluated in a geographically distributed grid, and different types of software were employed for data transfer to give a quantitative comparison. The performance of the developed visualization methods was investigated, including a quantitative comparison of the execution time of the local Voronoi-based method against that of the global Voronoi diagrams generated by the Voro++ library. The accuracy of the developed methods was evaluated by computing the total depth of the cuts made in particles by the extracted crack surfaces. The research confirmed that the proposed visualization methods and the developed distributed software are capable of visualizing crack propagation modelled by the discrete element method in monodispersed particulate media.
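
    The Voronoi-based method lends itself to a compact illustration: the crack facet between two particles whose bond has broken can be taken as the Voronoi face their cells share. The sketch below demonstrates this with SciPy on fabricated particle and bond data; it is not the VisPartDEM implementation.

```python
# Minimal sketch of the Voronoi-based crack-surface idea: each broken bond
# contributes the Voronoi face shared by the two particles' cells.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)
points = rng.random((60, 3))                  # hypothetical particle centres
vor = Voronoi(points)

# For demonstration, pretend the bonds across the first three ridges broke.
broken = {tuple(sorted(pq)) for pq in vor.ridge_points[:3]}

crack_facets = []
for pq, ridge in zip(vor.ridge_points, vor.ridge_vertices):
    if tuple(sorted(pq)) in broken and -1 not in ridge:   # skip unbounded faces
        crack_facets.append(vor.vertices[ridge])          # facet polygon vertices

print(f"extracted {len(crack_facets)} crack facets")
```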

    MZmine 2: Modular framework for processing, visualizing, and analyzing mass spectrometry-based molecular profile data

    Background: Mass spectrometry (MS) coupled with online separation methods is commonly applied for differential and quantitative profiling of biological samples in metabolomic as well as proteomic research. Such approaches are used for systems biology, functional genomics, and biomarker discovery, among others. An ongoing challenge of these molecular profiling approaches, however, is the development of better data processing methods. Here we introduce a new generation of a popular open-source data processing toolbox, MZmine 2. Results: A key concept of the MZmine 2 software design is the strict separation of core functionality and data processing modules, with emphasis on easy usability and support for high-resolution spectra processing. Data processing modules take advantage of embedded visualization tools, allowing for immediate previews of parameter settings. Newly introduced functionality includes the identification of peaks using online databases, MS^n data support, improved isotope pattern support, scatter plot visualization, and a new method for peak list alignment based on the random sample consensus (RANSAC) algorithm. The performance of the RANSAC alignment was evaluated using synthetic datasets as well as actual experimental data, and the results were compared to those obtained using other alignment algorithms. Conclusions: MZmine 2 is freely available under the GNU GPL and can be obtained from the project website at http://mzmine.sourceforge.net/. The current version of MZmine 2 is suitable for processing large batches of data and has been applied to both targeted and non-targeted metabolomic analyses.
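
    The RANSAC alignment idea can be sketched compactly: fit a retention-time mapping between two runs while rejecting spurious peak matches as outliers. The Python example below uses scikit-learn on synthetic data and is only an analogue of MZmine 2's Java implementation.

```python
# Minimal sketch of RANSAC-based peak-list alignment: a robust linear map
# between retention times of two runs, ignoring spurious matches.
import numpy as np
from sklearn.linear_model import RANSACRegressor

rng = np.random.default_rng(7)
rt_run1 = np.sort(rng.uniform(0, 30, 80))        # retention times, run 1 (min)
rt_run2 = 1.02 * rt_run1 + 0.3                   # true drift between runs
rt_run2[::10] += rng.uniform(2, 5, 8)            # spurious matches (outliers)

model = RANSACRegressor(residual_threshold=0.5)  # default linear estimator
model.fit(rt_run1.reshape(-1, 1), rt_run2)

aligned = model.predict(rt_run1.reshape(-1, 1))  # map run 1 onto run 2's axis
print(f"{model.inlier_mask_.sum()} of {len(rt_run1)} matches kept as inliers")
```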

    Granite: A scientific database model and implementation

    Get PDF
    The principal goal of this research was to develop a formal, comprehensive model for representing highly complex scientific data. An effective model should provide a conceptually uniform way to represent data, and it should serve as a framework for the implementation of an efficient and easy-to-use software environment that realizes the model. The dissertation work presented here describes such a model and its contributions to the field of scientific databases. In particular, the Granite model encompasses a wide variety of datatypes used across many disciplines of science and engineering today. It is unique in that it defines dataset geometry and topology as separate conceptual components of a scientific dataset. We provide a novel classification of geometries and topologies that has important practical implications for a scientific database implementation. The Granite model also offers integrated support for multiresolution and adaptive-resolution data. Many of these ideas have been addressed by others, but no one has tried to bring them all together in a single comprehensive model. The datasource portion of the Granite model offers several further contributions. In addition to providing a convenient conceptual view of rectilinear data, it also supports multisource data: data can be taken from various sources and combined into a unified view. The rod storage model is an abstraction for file storage that has proven an effective platform upon which to develop efficient access to storage. Our spatial prefetching technique is built upon the rod storage model; it demonstrates very significant improvements in access to scientific datasets and allows machines to access data far too large to fit in main memory. These improvements bring the extremely large datasets now being generated in many scientific fields into the realm of tractability for the ordinary researcher. We validated the feasibility and viability of the model by implementing a significant portion of it in the Granite system. Extensive performance evaluations of the implementation indicate that the features of the model can be provided in a user-friendly manner with an efficiency that is competitive with more ad hoc systems and more specialized application-specific solutions.
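
    The blocked-storage and spatial-prefetching idea can be sketched as follows: data live in fixed-size blocks, and reading from one block also schedules its spatial neighbours along the access axis. Granite's rod storage layout is more elaborate; the class and method names below are invented for illustration.

```python
# Minimal sketch of blocked storage with spatial prefetching; names are
# invented, and the "rod" layout of the Granite system is not reproduced.
import numpy as np

class BlockedVolume:
    def __init__(self, volume: np.ndarray, block: int = 16):
        self.volume, self.block = volume, block
        self.cache: dict[tuple, np.ndarray] = {}

    def _load(self, bi: int, bj: int, bk: int) -> None:
        b = self.block
        if (bi, bj, bk) not in self.cache:    # fetch block only once
            self.cache[(bi, bj, bk)] = self.volume[bi*b:(bi+1)*b,
                                                   bj*b:(bj+1)*b,
                                                   bk*b:(bk+1)*b].copy()

    def read(self, i: int, j: int, k: int) -> float:
        b = self.block
        bi, bj, bk = i // b, j // b, k // b
        self._load(bi, bj, bk)
        for d in (-1, 1):                     # prefetch neighbours along the
            if 0 <= bk + d <= (self.volume.shape[2] - 1) // b:  # access axis
                self._load(bi, bj, bk + d)
        return float(self.cache[(bi, bj, bk)][i % b, j % b, k % b])

vol = BlockedVolume(np.random.rand(64, 64, 64))
print(vol.read(10, 20, 30))
```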

    Development of a computational library for spatial estimation using geographic concepts and spatial object-oriented programming

    This master's thesis concerns the construction of a geostatistical estimation library that allows the spatial location of the input data and of the results to be considered explicitly, through georeferencing in standard geographic or planar coordinate systems. Álvarez Villa, Ó.D. (2009). Desarrollo de una librería computacional para la estimación espacial utilizando conceptos geográficos y programación orientada a objetos espaciales. http://hdl.handle.net/10251/13759
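
    As a stand-in for the kind of coordinate-aware estimation such a library performs, here is a minimal inverse-distance-weighting sketch over points georeferenced in a planar coordinate system. The thesis library implements geostatistical (kriging-style) estimators; IDW is used here only for brevity.

```python
# Minimal sketch of spatial estimation from georeferenced samples via
# inverse-distance weighting (a simple stand-in for kriging).
import numpy as np

def idw_estimate(xy_known: np.ndarray, values: np.ndarray,
                 xy_query: np.ndarray, power: float = 2.0) -> np.ndarray:
    # Distances between every query point and every sample (in map units).
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-12) ** power          # inverse-distance weights
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Samples georeferenced in a planar (projected) coordinate system, e.g. UTM.
xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
z = np.array([1.0, 2.0, 2.5, 3.0])
print(idw_estimate(xy, z, np.array([[50.0, 50.0]])))  # ~[2.125]
```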