8 research outputs found
Cost and performance modeling for Earth system data management and beyond
Current and anticipated storage environments confront domain scientists and data center operators with usability, performance, and cost challenges. The amount of data that upcoming systems will be required to handle is expected to grow exponentially, mainly due to increasing resolution and affordable compute power. Unfortunately, the relationship between cost and performance is not always well understood, so educated procurement requires considerable effort. Within the Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE), models to better understand the cost and performance of current and future systems are being explored. This paper presents models and a methodology focusing on, but not limited to, data centers used in the context of climate and numerical weather prediction. The paper concludes with a case study of alternative deployment strategies and outlines the challenges in anticipating their impact on cost and performance. By publishing these early results, we would like to make the case for working collaboratively as a community towards standard models and methodologies, creating sufficient incentives for vendors to provide specifications in formats compatible with these modeling tools. In addition, we see applications for such formalized models and information in I/O-related middleware, which is expected to make automated but reasonable decisions in increasingly heterogeneous data centers.
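The kind of cost/performance trade-off the abstract describes can be illustrated with a toy model. The sketch below is purely hypothetical: the `StorageSystem` class, the price and throughput figures, and the cost-efficiency metric are assumptions for illustration, not the models from the ESiWACE paper.

```python
# Toy cost/performance comparison of two storage deployment options.
# All figures are invented for illustration only.
from dataclasses import dataclass

@dataclass
class StorageSystem:
    name: str
    cost_per_tb_year: float   # acquisition + operation, EUR/TB/year (assumed)
    throughput_gbs: float     # sustained aggregate throughput, GB/s (assumed)

def annual_cost(system: StorageSystem, capacity_tb: float) -> float:
    """Total yearly cost for holding a given capacity."""
    return system.cost_per_tb_year * capacity_tb

def cost_per_gbs(system: StorageSystem, capacity_tb: float) -> float:
    """Cost-efficiency metric: EUR per GB/s of delivered throughput."""
    return annual_cost(system, capacity_tb) / system.throughput_gbs

disk = StorageSystem("disk", cost_per_tb_year=20.0, throughput_gbs=500.0)
tape = StorageSystem("tape", cost_per_tb_year=5.0, throughput_gbs=50.0)

for s in (disk, tape):
    print(s.name, annual_cost(s, 100_000), cost_per_gbs(s, 100_000))
```

Even this toy version shows why procurement is non-trivial: the cheaper-per-terabyte option can be the more expensive one once throughput requirements enter the metric.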
Optimization of the holographic process for imaging and lithography
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2010. Cataloged from PDF version of thesis. Includes bibliographical references (p. 272-297). Since their invention in 1948 by Dennis Gabor, holograms have proven to be important components of a variety of optical systems, and their implementation in new fields and methods is expected to continue growing. Their ability to encode 3D optical fields on a 2D plane opened the possibility of novel applications for imaging and lithography. In the traditional form, holograms are produced by the interference of a reference wave and an object wave, recording the phase and amplitude of the complex field. The holographic process has been extended to include different recording materials and methods. The increasing demand for holographic-based systems is accompanied by a need for efficient optimization tools designed to maximize the performance of the optical system. In this thesis, a variety of multi-domain optimization tools designed to improve the performance of holographic optical systems are proposed. These tools are designed to be robust, computationally efficient, and sufficiently general to be applied when designing various holographic systems. All the major forms of holographic elements are studied: computer generated holograms, thin and thick conventional holograms, numerically simulated holograms, and digital holograms. Novel holographic optical systems for imaging and lithography are proposed. In the case of lithography, a high-resolution system based on Fresnel domain computer generated holograms (CGHs) is presented. The holograms are numerically designed using a reduced-complexity hybrid optimization algorithm (HOA) based on genetic algorithms (GAs) and the modified error reduction (MER) method. The algorithm is efficiently implemented on a graphics processing unit. Simulations as well as experimental results for CGHs fabricated using electron-beam lithography are presented.
A method for extending the system's depth of focus is proposed. The HOA is extended for the design and optimization of multispectral CGHs applied to high-efficiency solar concentration and spectral splitting. A second lithographic system based on optically recorded total internal reflection (TIR) holograms is studied. A comparative analysis between scalar and vector diffraction theories for the modeling and simulation of the system is performed. A complete numerical model of the system is developed, including the photoresist response and first-order models for shrinkage of the holographic emulsion. A novel block-stitching algorithm is introduced for the calculation of large diffraction patterns, overcoming current computational limitations of memory and processing time. The numerical model is used to optimize the system's performance as well as to redesign the mask to account for potential fabrication errors. The simulation results are compared to experimentally measured data. In the case of imaging, a segmented-aperture thin imager based on holographically corrected gradient index (GRIN) lenses is proposed. The compound system is constrained to a maximum thickness of 5 mm and utilizes an optically recorded hologram to correct high-order optical aberrations of the GRIN lens array. The imager is analyzed using systems and information theories. A multi-domain optimization approach based on GAs is implemented to maximize the system's channel capacity and hence improve the information extraction or encoding process. A decoding or reconstruction strategy is implemented using a superresolution algorithm. Experimental results for the optimization of the hologram's recording process and the tomographic measurement of the system's space-variant point spread function are presented. A second imaging system, for the measurement of complex fluid flows by tracking micron-sized particles using digital holography, is studied.
A stochastic theoretical model based on a stability metric, similar to the channel capacity of a Gaussian channel, is presented and used to optimize the system. The theoretical model is first derived for the extreme case of point-source particles using Rayleigh scattering and scalar diffraction theory formulations. The model is then extended to account for particles of variable sizes using Mie theory for the scattering of homogeneous dielectric spherical particles. The influence and statistics of the particle-density-dependent cross-talk noise are studied. Simulation and experimental results for finding the optimum particle density based on the stability metric are presented. For all the studied systems, a sensitivity analysis is performed to predict and assist in the correction of potential fabrication or calibration errors. By José Antonio Domínguez-Caballero. Ph.D.
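The error-reduction step mentioned in this abstract belongs to the family of alternating-projection phase-retrieval methods. The sketch below shows the classic Gerchberg-Saxton-style error-reduction loop for a phase-only CGH; it is only an illustration of that general idea, not the thesis's modified error reduction (MER) or its GA hybrid, and the target pattern and iteration count are arbitrary assumptions.

```python
# Minimal Gerchberg-Saxton-style error reduction for a phase-only hologram:
# alternate between the hologram plane (unit amplitude) and the image plane
# (target amplitude), keeping only the phase at each projection.
import numpy as np

def error_reduction(target_amp: np.ndarray, iters: int = 50) -> np.ndarray:
    """Return a hologram-plane phase whose far field approximates target_amp."""
    rng = np.random.default_rng(0)
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    for _ in range(iters):
        far = np.fft.fft2(field)                       # propagate to image plane
        far = target_amp * np.exp(1j * np.angle(far))  # impose target amplitude
        field = np.fft.ifft2(far)                      # propagate back
        field = np.exp(1j * np.angle(field))           # impose unit amplitude
    return np.angle(field)

# Toy target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
phase = error_reduction(target)
recon = np.abs(np.fft.fft2(np.exp(1j * phase)))        # reconstructed amplitude
```

Each half-iteration projects the field onto one constraint set (known amplitude in one plane), which is why the residual error is non-increasing; the thesis's hybrid wraps such a local refinement inside a genetic-algorithm search to escape the local minima this plain loop can stall in.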
A resource-based view of the firm : integrating the role of IT as a strategic resource - an empirical study of South African personal financial services (Assurance) firms, 1999-2003
Includes bibliographical references (p. 502-531). This research explores an "inside the black box" view of how IT enables sustainable competitive advantage. Most researchers have investigated the IT competitive competencies that make up a firm's strategic framework in order to understand competitive advantage. However, Resource-Based Theory (RBT) probes the inner workings of a firm, suggesting that a firm's IT assets and resources are the basis of the "rare" core competencies that allow it to compete successfully. Using RBT and research in the economics, strategy, and IT literatures, an initial "Framework of Sustainability" was created, against which the case studies were conducted. This framework was used as the foundation for developing semi-structured questionnaires, with which 45 interviews (90 minutes each on average) were conducted with managers in the four firms. Both internal and external documents about the firms and the industry were used as sources of corroborating evidence. In addition, a "bottom-up" view was obtained from evidence gathered through a short questionnaire and focus group discussions held with 178 staff employees in the four firms.
Project management of distributed projects and programs [Управление проектами распределенных проектов и программ]
Управление проектами распределенных проектов и программ : monograph / В. Н. Бурков, С. Д. Бушуев, А. М. Возный, А. Ю. Гайда, Т. Г. Григорян, А. А. Иванова, Н. Р. Кнырик, М. Э. Колесник, И. В. Кононенко, К. В. Кошкин, А. А. Павлов, С. С. Рыжков, А. С. Рыжков, С. О. Слободян, Х. Танака, С. К. Чернов. – Николаев : Изд. Торубара В. В., 2015. – 388 p. The book examines the theoretical foundations of managing distributed projects and programs, proposing a corresponding conceptual apparatus and base methodology, effective management models and methods, and mechanisms for building organizational systems. Optimization models that account for the degree of incomparability among the compared alternatives when determining object weights (of alternatives and criteria) are investigated and improved; the formal apparatus and the results of its application are presented. Criteria for assessing the competence of managers of distributed projects and programs are proposed.
The monograph is intended for researchers, engineers, and graduate students engaged in the study, development, and advancement of distributed projects and programs.
Bowdoin Orient v.136, no.1-25 (2006-2007)
https://digitalcommons.bowdoin.edu/bowdoinorient-2000s/1007/thumbnail.jp
Bowdoin Orient v.126, no.1-23 (1997-1998)
https://digitalcommons.bowdoin.edu/bowdoinorient-1990s/1010/thumbnail.jp