
    Fault-Tolerant Ring Embeddings in Hypercubes -- A Reconfigurable Approach

    We investigate the problem of designing reconfigurable embedding schemes for a fixed hypercube (one without redundant processors or links). The fundamental idea behind these schemes is to embed a basic network on the hypercube without fully utilizing its nodes; the remaining nodes serve as spares for reconfiguring the embedding when faults occur. This research shows that, by carefully embedding the application graphs, the topological properties of the embedding can be preserved under fault conditions and reconfiguration can be carried out efficiently. In this dissertation, we choose the ring as the basic network of interest and propose several schemes for the design of reconfigurable embeddings that aim to minimize reconfiguration cost and performance degradation. The cost is measured by the number of node-state changes or reconfiguration steps needed to carry out the reconfiguration, and the performance degradation is characterized by the dilation of the new embedding after reconfiguration. Our schemes surpass existing ones in terms of both their applicability and the reconfiguration cost of the resulting embeddings.
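    The core idea above, embedding the ring on only part of the hypercube so that the unused nodes act as spares, can be made concrete with a small sketch. The Python toy below is an illustrative reading of that idea, not one of the dissertation's actual schemes: it embeds a ring of length 2^(d-1) on the subcube whose top bit is 0 via a reflected Gray code, keeps the top-bit-1 nodes as spares, and reconfigures around a faulty node by swapping in the spare that differs from it only in the top bit. That is a single node-state change, and the dilation of the affected ring edges grows from 1 to 2.

```python
# Toy illustration of a reconfigurable ring embedding in a hypercube.
# The ring uses only the subcube whose top bit is 0; the top-bit-1 nodes
# are spares. This is a sketch of the flavour of such schemes, not the
# dissertation's own construction.

def hamming(a: int, b: int) -> int:
    """Hypercube distance between two node labels."""
    return bin(a ^ b).count("1")

def ring_embedding(d: int) -> list[int]:
    """Embed a ring of length 2**(d-1) in a d-cube via a reflected Gray code,
    touching only nodes whose top bit is 0."""
    k = d - 1
    return [i ^ (i >> 1) for i in range(2 ** k)]   # cyclic Gray code

def reconfigure(ring: list[int], faulty: int, d: int) -> list[int]:
    """Replace a faulty node by its spare across the top dimension."""
    spare = faulty | (1 << (d - 1))                # differs only in the top bit
    return [spare if v == faulty else v for v in ring]

def dilation(ring: list[int]) -> int:
    """Longest hypercube path needed by any ring edge."""
    return max(hamming(ring[i], ring[(i + 1) % len(ring)])
               for i in range(len(ring)))

if __name__ == "__main__":
    d = 4
    ring = ring_embedding(d)
    print("embedding:", ring, "dilation:", dilation(ring))        # dilation 1
    ring2 = reconfigure(ring, faulty=ring[3], d=d)
    print("after fault:", ring2, "dilation:", dilation(ring2))    # dilation 2
```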

    Snapshot hyperspectral imaging : near-infrared image replicating imaging spectrometer and achromatisation of Wollaston prisms

    Conventional hyperspectral imaging (HSI) techniques are time-sequential and rely on temporal scanning to capture hyperspectral images. This temporal constraint can limit the application of HSI to static scenes and platforms, where transient and dynamic events are not expected during data capture. The Near-Infrared Image Replicating Imaging Spectrometer (N-IRIS) sensor described in this thesis enables snapshot HSI in the short-wave infrared (SWIR) without the requirement for scanning, and operates without rejection in polarised light. It operates in eight wavebands from 1.1μm to 1.7μm with a 2.0° diagonal field-of-view. N-IRIS produces spectral images directly, without the need for prior tomographic or other image reconstruction. Additional benefits include compactness, robustness, static operation, lower processing overheads, and generally higher signal-to-noise ratio and optical throughput than other snapshot HSI sensors. This thesis covers the IRIS design process from theoretical concepts to quantitative modelling, culminating in the N-IRIS prototype designed for SWIR imaging. This effort formed the logical next step beyond peer efforts, which focussed upon the visible wavelengths. After acceptance testing to verify optical parameters, empirical laboratory trials were carried out. This testing focussed on discriminating between common materials within a controlled environment as a proof of concept. Significance tests were used to provide an initial assessment of the N-IRIS capability to distinguish materials relative to a conventional SWIR broadband sensor. Motivated by the design and assembly of a cost-effective visible IRIS, an innovative solution was developed for the problem of chromatic variation in the splitting angle (CVSA) of Wollaston prisms. CVSA introduces spectral blurring of images. Analytical theory is presented and illustrated with an example N-IRIS application in which a sixfold reduction in dispersion is achieved for wavelengths in the region 400nm to 1.7μm, although the principle is applicable from ultraviolet to thermal-IR wavelengths. Experimental proof of concept is demonstrated, and the spectral smearing of an achromatised N-IRIS is shown to be reduced by an order of magnitude. These achromatised prisms can provide benefits in areas beyond hyperspectral imaging, such as microscopy, laser pulse control and spectrometry.
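    For context on the CVSA problem addressed in the final part of the thesis, the relations below are the standard first-order (thin-prism) description of a Wollaston prism's splitting angle and the usual two-material achromatisation condition. They are a sketch under small-angle assumptions, with the symbols α, Δn and λ₀ introduced here for illustration; they are not formulas quoted from the thesis.

```latex
% First-order (thin-prism) model: chromatic splitting angle of a Wollaston
% prism with wedge angle \alpha and birefringence \Delta n(\lambda).
\[
  \theta(\lambda) \;\approx\; 2\,\Delta n(\lambda)\tan\alpha,
  \qquad \Delta n(\lambda) = n_e(\lambda) - n_o(\lambda).
\]
% A compound prism of two birefringent materials (axes arranged so the
% angular deviations add) gives
\[
  \theta(\lambda) \;\approx\; 2\bigl[\Delta n_1(\lambda)\tan\alpha_1
                                   + \Delta n_2(\lambda)\tan\alpha_2\bigr],
\]
% and achromatisation at a design wavelength \lambda_0 requires
\[
  \left.\frac{d\theta}{d\lambda}\right|_{\lambda_0} = 0
  \;\Longrightarrow\;
  \frac{d\Delta n_1}{d\lambda}\tan\alpha_1
    + \frac{d\Delta n_2}{d\lambda}\tan\alpha_2 = 0,
\]
% with \alpha_1, \alpha_2 chosen so that \theta(\lambda_0) still equals the
% required splay angle.
```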

    Renewable Energy Resource Assessment and Forecasting

    In recent years, several projects and studies have been launched towards the development and use of new methodologies to assess, monitor, and support clean forms of energy. Accurate estimation of the available energy potential is of primary importance, but is not always easy to achieve. The present Special Issue on ‘Renewable Energy Resource Assessment and Forecasting’ aims to provide a holistic approach to the above issues by presenting multidisciplinary methodologies and tools that are able to support research projects and meet today’s technical, socio-economic, and decision-making needs. In particular, research papers, reviews, and case studies on the following subjects are presented: wind, wave and solar energy; biofuels; resource assessment of combined renewable energy forms; numerical models for renewable energy forecasting; integrated forecasting systems; energy for buildings; sustainable development; resource analysis tools and statistical models; and extreme value analysis and forecasting for renewable energy resources.

    Large Model Visualization : Techniques and Applications

    The size of datasets in scientific computing is rapidly increasing. This increase is driven by the boost in processing power over the past years, which in turn has been invested in increasing the accuracy and the size of the models. A similar trend has enabled a significant improvement of medical scanners; modern scanners can generate more than 1000 slices at a resolution of 512x512 in daily practice. Even in computer-aided engineering, typical models easily contain several million polygons. Unfortunately, data complexity is growing faster than the rendering performance of modern computer systems. This is due not only to the slower growth of the graphics subsystems' performance, but in particular to the significantly slower growth of the memory bandwidth available for transferring geometry and image data from main memory to the graphics accelerator. Large model visualization addresses this growing divide between data complexity and rendering performance. Most methods focus on reducing the geometric or pixel complexity, which in turn reduces the memory bandwidth requirements. In this dissertation, we discuss new approaches from three different research areas. All approaches target the reduction of processing complexity to achieve an interactive visualization of large datasets. In the second part, we introduce applications of the presented approaches. Specifically, we introduce the new VIVENDI system for interactive virtual endoscopy, as well as other applications from mechanical engineering, scientific computing, and architecture.
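    The bandwidth argument above is easy to make concrete. The short Python sketch below uses illustrative numbers (assumed for the example, not figures from the dissertation) to show that naively streaming a multi-million-polygon model to the graphics accelerator every frame already exceeds a typical host-to-GPU link, which is why large model visualization reduces the geometric complexity before transfer.

```python
# Back-of-the-envelope check: can we stream the full geometry every frame?
# All numbers below are illustrative assumptions, not values from the thesis.

triangles        = 10_000_000      # polygons in the model
bytes_per_vertex = 32              # position + normal + colour, uncompressed
vertices_per_tri = 3               # no vertex sharing (worst case)
frames_per_sec   = 30              # interactive target
bus_bandwidth    = 16e9            # assumed ~16 GB/s host-to-GPU link

bytes_per_frame = triangles * vertices_per_tri * bytes_per_vertex
required_bw     = bytes_per_frame * frames_per_sec

print(f"per-frame transfer : {bytes_per_frame / 1e9:.2f} GB")
print(f"required bandwidth : {required_bw / 1e9:.1f} GB/s")
print(f"bus utilisation    : {required_bw / bus_bandwidth:.1f}x of the link")
# -> roughly 28.8 GB/s, i.e. ~1.8x a 16 GB/s link: the transfer alone is the
#    bottleneck before any rasterisation work is done. Reducing the geometric
#    complexity (LOD, culling, compression) shrinks exactly this term.
```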

    Workshop on Fuzzy Control Systems and Space Station Applications

    The Workshop on Fuzzy Control Systems and Space Station Applications was held on 14-15 Nov. 1990. The workshop was co-sponsored by McDonnell Douglas Space Systems Company and NASA Ames Research Center. Proceedings of the workshop are presented

    Ant Colony Algorithms for the Resolution of Semantic Searches in P2P Networks

    The long-lasting trend in the field of computation towards stress and resource distribution has found its way into computer networks via the concept of peer-to-peer (P2P) connectivity. P2P is a symmetrical model in which each network node is granted a comparable range of capacities and resources. It stands in stark contrast to the classical, strongly asymmetrical client-server approach. P2P, originally considered only a complementary, server-side structure to the straightforward client-server model, has been shown to have substantial potential of its own, with multiple widely known benefits: good fault tolerance and recovery, satisfactory scalability, and intrinsic load distribution. However, contrary to client-server, P2P networks require sophisticated solutions at all levels, ranging from network organization to resource location and management. In this thesis we address one of the key issues of P2P networks: performing efficient resource searches of a semantic nature under realistic, dynamic conditions. There have been numerous solutions to this problem, with evolutionary, stigmergy-based, and simple computational foci, but few attempt to resolve the full range of challenges it entails. To name a few: real-life P2P networks are rarely static; nodes disconnect, reconnect and change their content. In addition, a trivial incorporation of semantic searches into well-known algorithms causes a significant decrease in search efficiency. In our research we build a solution incrementally, starting with the classic Ant Colony System (ACS) within the Ant Colony Optimization (ACO) metaheuristic. ACO is an algorithmic framework for solving combinatorial optimization problems that fits the problem well, albeit without providing an immediate solution to any of the aforementioned challenges. First, we propose an efficient ACS variant for structured (hypercube-structured) P2P networks by introducing a path post-processing algorithm called Tabu Route Optimization (TRO). Next, we resolve the issue of network dynamism with an ACO-compatible information diffusion approach. We then incorporate the semantic component of the searches. This initial approximation to the problem is achieved by allowing ACS to differentiate between search types via the pheromone-per-concept idea. We call the outcome of this merger Routing Concept ACS (RC-ACS). RC-ACS is a robust, static multi-pheromone implementation of ACS. However, it allowed us to conclude that the pheromone-per-concept approach offers only limited scalability and cannot be considered a global solution. Further progress was made in this respect when we introduced into RC-ACS our novel idea of dynamic pheromone creation, which replaces the static one-to-one assignment. We call the resulting algorithm Angry Ant Framework (AAF). In AAF, new pheromone levels are created as needed during the search, rather than prior to it. The final step was to enable AAF not only to create pheromone levels, but also to reassign them in order to optimize pheromone usage. The resulting algorithm is called EntropicAAF, and it has been evaluated as one of the top-performing algorithms for P2P semantic searches under all conditions.
    Krynicki, KK. (2016). Ant Colony Algorithms for the Resolution of Semantic Searches in P2P Networks [Doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/61293
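    The pheromone-per-concept idea at the heart of RC-ACS can be sketched compactly. The Python toy below is one illustrative reading of that idea under the standard ACS rules (a pseudo-random proportional next-hop choice plus local evaporation); it is not the thesis's RC-ACS, AAF or EntropicAAF code, and the node names, parameter values and reward rule are assumptions made for the example.

```python
import random
from collections import defaultdict

# Toy pheromone-per-concept routing in a P2P overlay, in the spirit of ACS.
# Each (node, neighbour) edge keeps a separate pheromone level per search
# concept, so ants routing a query for "music" do not follow trails laid
# down by queries for another concept. Parameter values are illustrative.

ALPHA, Q0, RHO, TAU0 = 1.0, 0.9, 0.1, 0.1

class Node:
    def __init__(self, name, neighbours, concepts):
        self.name = name
        self.neighbours = neighbours          # list of Node, filled in later
        self.concepts = set(concepts)         # resources this node can answer
        # pheromone[concept][neighbour_name] -> trail strength
        self.pheromone = defaultdict(lambda: defaultdict(lambda: TAU0))

    def next_hop(self, concept):
        """ACS pseudo-random proportional rule over the per-concept trails."""
        trails = {n: self.pheromone[concept][n.name] ** ALPHA
                  for n in self.neighbours}
        if random.random() < Q0:                          # exploit best trail
            return max(trails, key=trails.get)
        total = sum(trails.values())                      # explore proportionally
        r, acc = random.uniform(0, total), 0.0
        for n, t in trails.items():
            acc += t
            if acc >= r:
                return n
        return n

    def local_update(self, concept, neighbour):
        """ACS local evaporation applied to the edge just traversed."""
        tau = self.pheromone[concept][neighbour.name]
        self.pheromone[concept][neighbour.name] = (1 - RHO) * tau + RHO * TAU0

def ant_search(start, concept, ttl=8):
    """Send one ant; reinforce the walked path per concept if it finds a hit."""
    node, path = start, []
    for _ in range(ttl):
        if concept in node.concepts:
            for a, b in path:                             # reward the trail
                a.pheromone[concept][b.name] += 1.0 / max(len(path), 1)
            return node
        nxt = node.next_hop(concept)
        node.local_update(concept, nxt)
        path.append((node, nxt))
        node = nxt
    return None

if __name__ == "__main__":
    random.seed(1)
    a, b, c = Node("a", [], set()), Node("b", [], set()), Node("c", [], {"music"})
    a.neighbours, b.neighbours, c.neighbours = [b], [a, c], [b]
    hits = sum(ant_search(a, "music") is not None for _ in range(200))
    print(f"{hits}/200 ants found 'music'")   # later ants follow reinforced trails
```

    Read this way, AAF's dynamic pheromone creation corresponds to materialising a concept's trail table only when a query for that concept is first seen, which is what the nested defaultdict does lazily here; EntropicAAF's contribution, reassigning pheromone levels to optimise their usage, has no counterpart in this sketch.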

    Technology 2000, volume 1

    The purpose of the conference was to increase awareness of existing NASA-developed technologies that are available for immediate use in the development of new products and processes, and to lay the groundwork for the effective utilization of emerging technologies. There were sessions on the following: computer technology and software engineering; human factors engineering and life sciences; information and data management; material sciences; manufacturing and fabrication technology; power, energy, and control systems; robotics; sensors and measurement technology; artificial intelligence; environmental technology; optics and communications; and superconductivity.

    High Performance Functional Bio-based Polymers for Skin-contact Products

    Beauty masks, diapers, wound dressings, wipes, protective clothes and biomedical products: all these high-value and/or large-volume products must be highly compatible with human skin, and they should have specific functional properties, such as anti-microbial, anti-inflammatory and anti-oxidant activity. They are currently partially or totally produced from fossil-based sources, with evident end-of-life issues, as their waste generates increasing environmental concern. In contrast, biopolymers and active biomolecules from bio-based sources could be used to produce new materials that are highly compatible with the skin and also biodegradable. The final products can be obtained by exploiting safe and smart nanotechnologies, such as the extrusion of bionanocomposites and electrospinning/electrospray, as well as innovative surface modification and control methodologies. For all these reasons, many researchers, such as those involved in the European POLYBIOSKIN project, have recently been working on biomaterials with anti-microbial, anti-inflammatory and anti-oxidant properties, as well as bio-based materials that are renewable and biodegradable. The present book gathers research and review papers dedicated to materials and technologies for high-performance products in which attention to health and environmental impact is efficiently integrated, considering both the skin compatibility of the selected materials and their source and end of life.

    27th Annual European Symposium on Algorithms: ESA 2019, September 9-11, 2019, Munich/Garching, Germany


    Cumulative index to NASA Tech Briefs, 1986-1990, volumes 10-14

    Tech Briefs are short announcements of new technology derived from the R&D activities of the National Aeronautics and Space Administration. These briefs emphasize information considered likely to be transferrable across industrial, regional, or disciplinary lines and are issued to encourage commercial application. This cumulative index of Tech Briefs contains abstracts and four indexes (subject, personal author, originating center, and Tech Brief number) and covers the period 1986 to 1990. The abstract section is organized by the following subject categories: electronic components and circuits, electronic systems, physical sciences, materials, computer programs, life sciences, mechanics, machinery, fabrication technology, and mathematics and information sciences