
    Parallel Implementation of the PHOENIX Generalized Stellar Atmosphere Program. II: Wavelength Parallelization

    We describe an important addition to the parallel implementation of our generalized NLTE stellar atmosphere and radiative transfer computer program PHOENIX. In a previous paper in this series we described data and task parallel algorithms we have developed for radiative transfer, spectral line opacity, and NLTE opacity and rate calculations. These algorithms divide the work spatially or by spectral lines, that is, distributing the radial zones, individual spectral lines, or characteristic rays among different processors, and employ, in addition, task parallelism for logically independent functions (such as atomic and molecular line opacities). For finite, monotonic velocity fields, the radiative transfer equation is an initial value problem in wavelength, and hence each wavelength point depends upon the previous one. However, for sophisticated NLTE models of both static and moving atmospheres needed to accurately describe, e.g., novae and supernovae, the number of wavelength points is very large (200,000--300,000), and hence parallelization over wavelength can lead both to a considerable speedup in calculation time and to the ability to make use of the aggregate memory available on massively parallel supercomputers. Here, we describe an implementation of a pipelined design for the wavelength parallelization of PHOENIX, where the necessary data from the processor working on a previous wavelength point is sent to the processor working on the succeeding wavelength point as soon as it is known. Our implementation uses a MIMD design based on a relatively small number of standard MPI library calls and is fully portable between serial and parallel computers. Comment: AAS-TeX, 15 pages, full text with figures available at ftp://calvin.physast.uga.edu/pub/preprints/Wavelength-Parallel.ps.gz ApJ, in press
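
    Schematically, the pipelined wavelength parallelization described in this abstract might be sketched with a handful of MPI calls, as below. This is only an illustration under assumed names and sizes (compute_wavelength_point, NDATA, and NWL are placeholders, not PHOENIX routines), using a round-robin assignment of wavelength points to ranks: each rank receives the state of the preceding wavelength point from its owner, does its own work, and forwards the new state as soon as it is known.

```c
/* Minimal sketch of a wavelength pipeline with MPI (illustrative only;
 * not the actual PHOENIX code). */
#include <mpi.h>

#define NDATA 64    /* size of the state passed between points (assumed)   */
#define NWL 1000    /* wavelength points (kept small here; the real runs
                       use 200,000--300,000)                               */

/* placeholder for the radiative-transfer work at one wavelength point */
static void compute_wavelength_point(int iwl, double *state) {
    state[0] += 1.0;   /* stand-in for updating the downstream state */
}

int main(int argc, char **argv) {
    int rank, nproc;
    double state[NDATA] = {0.0};

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nproc);

    /* rank r owns wavelength points r, r+nproc, r+2*nproc, ... */
    for (int iwl = rank; iwl < NWL; iwl += nproc) {
        if (iwl > 0)   /* receive the state for point iwl-1 from its owner */
            MPI_Recv(state, NDATA, MPI_DOUBLE, (rank + nproc - 1) % nproc,
                     iwl - 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        compute_wavelength_point(iwl, state);
        if (iwl + 1 < NWL)   /* forward the new state as soon as it is known */
            MPI_Send(state, NDATA, MPI_DOUBLE, (rank + 1) % nproc,
                     iwl, MPI_COMM_WORLD);
    }
    MPI_Finalize();
    return 0;
}
```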

    The impact of switching costs on closing of service branches

    The paper deals with the optimal location of service branches. Consumers can receive service from different firms and branches offering substitute services; the consumer chooses both the firm and the branch. Examples include banking services (which firm and which branch?), healthcare providers, insurance companies and their agents, and brokerage firms and their branches. With the increased accessibility of the internet, the service industry is experiencing the impact of this change in technology: more customers prefer acquiring services over the internet, and fewer require face-to-face contact. The industry has to respond by limiting the number of branches, and the paper deals with the question of which branches to close. In the first period, each consumer selects to receive service from the branch that minimizes the overall cost incurred, composed of the price per service plus transportation costs. In the second period, the branch configuration changes: a branch at one of the alternative locations is closed by one of the firms. When a consumer wants to switch firms, he incurs additional "exogenous switching costs": information costs, learning costs, and transaction costs. These costs are incurred by the consumers and differ from "endogenous switching costs", which are incurred by the firm (e.g., the frequent-flier case). The paper investigates how the market area and market share of the branches and firms change with the closure of a branch; the loss varies between no loss and losing the whole market. Given the prices, the switching costs, and the locations, we identify the branches to be closed.
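
    The consumer's decision rule sketched in the abstract (choose the open branch minimizing price plus transportation cost, with an exogenous switching cost added when changing firms) can be illustrated by a small stylized example. The linear transport-cost form and all parameter values below are illustrative assumptions, not the paper's model specification.

```c
/* Stylized sketch of the two-period branch choice with switching costs.
 * Assumptions: consumers and branches on a line, linear transport cost t
 * per unit distance, fixed exogenous switching cost S when changing firms. */
#include <math.h>
#include <stdio.h>

typedef struct { int firm; double location; double price; int open; } Branch;

/* total cost a consumer at x faces for branch b, given the firm chosen
 * in the first period (prev_firm < 0 means no prior affiliation) */
static double consumer_cost(double x, const Branch *b, int prev_firm,
                            double t, double S) {
    double cost = b->price + t * fabs(x - b->location);
    if (prev_firm >= 0 && b->firm != prev_firm)
        cost += S;   /* exogenous switching cost */
    return cost;
}

/* index of the open branch minimizing the consumer's total cost */
static int choose_branch(double x, const Branch *br, int n, int prev_firm,
                         double t, double S) {
    int best = -1;
    double best_cost = INFINITY;
    for (int i = 0; i < n; i++) {
        if (!br[i].open) continue;
        double c = consumer_cost(x, &br[i], prev_firm, t, S);
        if (c < best_cost) { best_cost = c; best = i; }
    }
    return best;
}

int main(void) {
    Branch br[] = { {0, 0.0, 1.0, 1}, {0, 4.0, 1.0, 1}, {1, 2.0, 1.2, 1} };
    double t = 0.3, S = 0.8, x = 3.0;

    int first = choose_branch(x, br, 3, -1, t, S);   /* first period    */
    br[1].open = 0;                                  /* a branch closes */
    int second = choose_branch(x, br, 3, br[first].firm, t, S);
    printf("first-period branch %d, second-period branch %d\n", first, second);
    return 0;
}
```

    In this toy example the switching cost keeps the consumer with the first-period firm even after its nearest branch closes, illustrating how a firm's market share may shrink less than distance alone would suggest.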

    Post-Euphoria in Electronic Music Making

    Engaging in a range of electronic music making techniques, this doctoral, practice-based project examines the creative interpretation of “post-euphoria”: a concept conceived here to describe a retrospective subjectivity that encapsulates reminiscing and an afterglow feeling in relation to lost euphoric experiences. The study aims to establish a musical aesthetic that responds to this concept using DAW tools, to form a body of musical work comprising a collection of recordings. Identifying a lack of available practice-based studies on music production practices that deal with memory and recollection, particularly in connection with electronic dance music and its concomitant cultural events, the research contributes insights to this underexplored area. To examine how post-euphoria can be interpreted through electronic music making, the research first conducts interviews with comparable electronic music practitioners whose work is informed by themes of recollection and reminiscing in the context of electronic dance music. Insights gained in the interviews then inform the main creative investigation, which is pursued by utilising the DAW Ableton Live and its built-in tools. Developed as part of the creative investigation are production and compositional strategies and a body of ten original recordings, accompanied by this critical exegesis that examines and contextualises the creative practices established. Undertaking the study enabled an understanding of ways in which elements of electronic music production can be used figuratively to embody conceptual meanings relating to reminiscing and recollection. It further demonstrates how electronic dance music tropes can be employed, alongside sampling-inspired and self-remixing techniques, to connote a sense of missing euphoria and to suggest transformations that occur in the memory of fleeting experiences. Finally, the study shows how these musical elements can be incorporated within a songwriting framework and as part of an integrated collection of corresponding recordings, to further the musical interpretation of post-euphoria.

    Forecasting the Number of Visitors in a Unique Recreational Site- A Retrospective View

    In this research we examine forecasts we prepared fifteen years ago. We examine the assumptions made as well as the results, comparing the forecasts to reality. We concentrate on the forecasts of the number of visitors, which enables us to examine economic impact and is crucial in analyzing ecological carrying capacity. Our case study was a wetland that was drained in the '50s, resulting in severe environmental damage. In the '90s part of the area was re-flooded and a small lake was created. We forecast the number of visitors and the expected revenues and benefits. The area is currently called Agmon Hula, located in the former Hula marsh (north of Israel). The commodity planned was a site offering a safari, a bird sanctuary, horse riding, swimming in a pool, and picnicking. We asked recreationists in adjacent national parks and nature reserves about their willingness to visit the planned park and their willingness to pay (WTP), using CVM methods. In reality, the site started operation in 2005 as a bird sanctuary, due to its success in attracting birds: 500 million birds pass the area twice per year, migrating south in the fall and returning north in the spring. Our forecast of 380 thousand visitors in the first year of operation did not materialize. We could have predicted a smaller number, closer to the real one (220 thousand), had we considered the percentage of respondents who ranked a bird sanctuary as one of their two favorite activities. The prediction assumed an annual increase in the number of visitors of 2-6%, but the actual increase in the first five years of operation was 8% annually. In the prediction we disregarded tourists, but they were 7-17% of the visitors. Updating the prediction of the number of visitors is easy, and it is a crucial aspect of predicting carrying capacity.
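
    As a toy illustration of the growth-rate assumption discussed above, the sketch below projects first-year visitor numbers forward under compound annual growth. The inputs (380 and 220 thousand visitors, 6% and 8% growth) are taken from the abstract; the projected values are merely computed examples, not the paper's figures.

```c
/* Illustration of compound annual growth applied to first-year visitor
 * numbers; the function is generic, the inputs echo the abstract. */
#include <stdio.h>

static double project(double first_year, double annual_growth, int years) {
    double v = first_year;
    for (int i = 1; i < years; i++)
        v *= 1.0 + annual_growth;   /* compound annual growth */
    return v;
}

int main(void) {
    /* forecast: 380k first-year visitors at 6% growth;
       actual:   220k first-year visitors at 8% growth  */
    printf("forecast, 6%% growth, year 5: %.0f thousand\n", project(380, 0.06, 5));
    printf("actual,   8%% growth, year 5: %.0f thousand\n", project(220, 0.08, 5));
    return 0;
}
```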

    Parallel Implementation of the PHOENIX Generalized Stellar Atmosphere Program

    We describe the parallel implementation of our generalized stellar atmosphere and NLTE radiative transfer computer program PHOENIX. We discuss the parallel algorithms we have developed for radiative transfer, spectral line opacity, and NLTE opacity and rate calculations. Our implementation uses a MIMD design based on a relatively small number of MPI library calls. We report the results of test calculations on a number of different parallel computers and discuss the results of scalability tests. Comment: To appear in ApJ, 1997, vol. 483. LaTeX, 34 pages, 3 figures, uses AASTeX macros and styles, natbib.sty, and psfig.sty
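
    The kind of spatial data parallelism described here (distributing radial zones among processors and combining partial results) can be sketched with a few MPI calls. The sketch below is purely illustrative; NZONES and work_on_zone are placeholder names, not PHOENIX routines.

```c
/* Minimal sketch of a data-parallel decomposition over radial zones:
 * each MPI rank works on its own block of zones and the partial results
 * are combined with a collective call.  Illustrative only. */
#include <mpi.h>
#include <stdio.h>

#define NZONES 128   /* number of radial zones (assumed) */

static double work_on_zone(int iz) {   /* stand-in for per-zone opacity work */
    return (double)iz;
}

int main(int argc, char **argv) {
    int rank, nproc;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nproc);

    /* each rank handles a contiguous block of radial zones */
    int per = (NZONES + nproc - 1) / nproc;
    int lo = rank * per;
    int hi = (lo + per < NZONES) ? lo + per : NZONES;

    double local = 0.0, total = 0.0;
    for (int iz = lo; iz < hi; iz++)
        local += work_on_zone(iz);

    /* combine the partial results from all ranks */
    MPI_Allreduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);
    if (rank == 0)
        printf("combined result over %d zones: %g\n", NZONES, total);

    MPI_Finalize();
    return 0;
}
```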

    Pilot/vehicle model analysis of visual and motion cue requirements in flight simulation

    The optimal control model (OCM) of the human operator is used to predict the effect of simulator characteristics on pilot performance and workload. The piloting task studied is helicopter hover. Among the simulator characteristics considered were (computer-generated) visual display resolution, field of view, and time delay.

    Experimental implementation of a real-time token-based network protocol on a microcontroller

    The real-time token-based RTnet network protocol has been implemented on a standard Ethernet network to investigate the possibility of using cheap components with strict resource limitations while preserving Quality of Service guarantees. It will be shown that the proposed implementation is feasible on a small network. For larger networks a different approach is necessary, using delegation by means of proxies; a delegation proposal will be discussed. For small networks it is possible to use a PIC microcontroller in combination with a standard Ethernet controller to run the RTnet network protocol. As more systems are added to the network, the performance of this combination becomes insufficient. When this happens, it is necessary for the microcontroller to delegate some tasks to a more powerful master and to organize a low-level communication protocol between master and slave.
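
    The token-based idea underlying such a protocol can be illustrated with a small simulation of the token rotation: each node may transmit for at most its per-visit budget while holding the token, which bounds the token rotation time and hence supports the real-time guarantees. This is only a conceptual sketch with assumed budgets and queue lengths, not the RTnet implementation or its frame format.

```c
/* Conceptual simulation of token rotation with bounded per-visit budgets
 * (not the RTnet protocol itself). */
#include <stdio.h>

#define NNODES 4

typedef struct {
    int id;
    int budget_us;    /* transmission budget per token visit (assumed)  */
    int pending_us;   /* transmission time currently queued at the node */
} Node;

/* one token visit: transmit up to the budget, then hand the token on */
static void token_visit(Node *n) {
    int used = (n->pending_us < n->budget_us) ? n->pending_us : n->budget_us;
    n->pending_us -= used;
    printf("node %d: transmitted %d us, %d us still queued\n",
           n->id, used, n->pending_us);
}

int main(void) {
    Node ring[NNODES] = {
        {0, 200, 500}, {1, 200, 100}, {2, 200, 0}, {3, 200, 350}
    };

    /* a few token rotations; because each visit is bounded by the budget,
     * the worst-case rotation time is bounded as well */
    for (int rotation = 0; rotation < 3; rotation++)
        for (int i = 0; i < NNODES; i++)
            token_visit(&ring[i]);
    return 0;
}
```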