E-Scripture: The Impact of Technology on the Reading of Sacred Texts (2013)
The tradition of religious readers in transition is not new: Augustine expressed “amazement” that Ambrose read silently and not aloud, movable type in the fifteenth century made the Bible publishable without scribal work, and today, electronic pages have become interactive in ways scarcely imagined a short time ago. How readers of today imagine a page (now conceptualized as a ‘web-page’) and, consequently, reading in general has profound implications for the 21st century. Acknowledging that “the significance of a religious book lies not only in the message of its content, but also in the form and self-presentation with which it makes itself available to worship and transmission,” this project assumes that a great deal of perspective is gained by viewing the current transition in light of older ones. In virtually all previous reading transitions, a religious ‘pattern of reading technology’ can be seen, whose pieces are all well known but have not been collectively applied to the current situation of e-reading. The pattern has three parts. First, readers initially use a new technology to perform the same functions as the old technology, only more quickly, more efficiently, or in greater quantity; this early use of new reading technology largely imitates the functions and appearance of the old format. Second, the old technology becomes sacralized or ritualized in the face of the new technology’s standardization. Third, as this standardization occurs, the new technology develops its own unique and innovative functions, exclusive to that form, shedding some or most of the imitative appearance and functions of the old technology. Reviewing these transitions of the past and present, it becomes clear that fear of the new technology – however relatable – proves somewhat unfounded.
New reading technology does not prove ultimately inimical to the old formats, or to religion, and despite many initial practical concerns, actually provides a multitude of benefits in the reading of sacred texts.
Understanding evolutionary processes during past Quaternary climatic cycles: Can it be applied to the future?
Climate change affected ecological community make-up during the Quaternary, and it was probably both a cause and a consequence of evolutionary processes such as speciation, adaptation, and the extinction of species and populations.
Analytical drafting curves provide exact equations for plotted data
Analytical drafting curves provide explicit mathematical expressions for any numerical data that appears in the form of graphical plots. The curves each have a reference coordinate axis system indicated on the curve as well as the mathematical equation from which the curve was generated.
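As a generic illustration of the idea of recovering an explicit equation from plotted data (this is not code from the work itself; the data points are hypothetical), a closed-form least-squares line fit to points digitized from a plot:

```python
# Least-squares fit of an explicit straight line y = m*x + b to points
# digitized from a plotted curve (hypothetical data, roughly y = 2x + 1).
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.9, 5.0, 7.1, 8.9]

n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(xi * xi for xi in x)
sxy = sum(xi * yi for xi, yi in zip(x, y))

# Normal equations for the two-parameter line
m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - m * sx) / n
print(f"y = {m:.3f}x + {b:.3f}")
```

The fitted coefficients give the plotted data an explicit expression that can then be evaluated at any abscissa, which is the service the drafting curves provide in graphical form.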
Performance limitations of subband adaptive filters
In this paper, we evaluate the performance limitations of subband adaptive filters in terms of achievable final error terms. The limiting factors are the aliasing level in the subbands, which poses a distortion and thus presents a lower bound for the minimum mean squared error in each subband, and the distortion function of the overall filter bank, which in a system identification setup restricts the accuracy of the equivalent fullband model. Using a generalized DFT modulated filter bank for the subband decomposition, both errors can be stated in terms of the underlying prototype filter. If a source model for coloured input signals is available, it is also possible to calculate the power spectral densities in both subbands and reconstructed fullband. The predicted limits of the error quantities compare favourably with the simulations presented.
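The system-identification setup the abstract refers to can be sketched in its simplest fullband form with an NLMS adaptive filter; this is an illustrative sketch with a hypothetical unknown system, not the paper's subband structure (the filter bank and subband decomposition are omitted):

```python
import random

random.seed(0)

# Unknown system to identify (hypothetical 4-tap FIR response)
h = [0.6, -0.3, 0.2, 0.1]
taps = len(h)

w = [0.0] * taps      # adaptive filter weights
buf = [0.0] * taps    # input delay line
mu = 0.5              # NLMS step size

for n in range(5000):
    x = random.gauss(0.0, 1.0)                       # white excitation
    buf = [x] + buf[:-1]
    d = sum(hi * bi for hi, bi in zip(h, buf))       # desired signal
    yhat = sum(wi * bi for wi, bi in zip(w, buf))    # model output
    e = d - yhat                                     # identification error
    norm = sum(bi * bi for bi in buf) + 1e-8
    w = [wi + mu * e * bi / norm for wi, bi in zip(w, buf)]

print([round(wi, 3) for wi in w])  # approaches h
```

In the paper's subband variant, each subband runs such an adaptive filter on decimated signals; the aliasing and filter-bank distortion terms then bound how closely the equivalent fullband model can match `h`.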
Contributions of GRM to the ocean topography experiment (TOPEX)
The permanent shape of the sea surface, the time-averaged mean sea level, is comprised of: (1) an ellipsoidal component due to the mean mass of a rotating Earth; (2) spatial undulations due to the inhomogeneous distribution of mass in the Earth; and (3) spatial undulations due to permanent ocean currents. The amplitudes of the three components are in the ratio 10,000 : 100 : 1 m, the first two components comprising the marine geoid, and the third being the permanent topography of the sea surface.
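In symbols (the notation here is assumed for illustration, not taken from the paper): writing the ellipsoidal height as $h_{\mathrm{ell}}$, the geoid undulation as $N$, and the permanent dynamic topography as $\eta$, the decomposition and the stated amplitude ratio read

```latex
\bar{h} \;=\; h_{\mathrm{ell}} + N + \eta,
\qquad
h_{\mathrm{ell}} : N : \eta \;\sim\; 10^{4} : 10^{2} : 1 \ \text{(metres)},
```

where the marine geoid is $h_{\mathrm{ell}} + N$ and $\eta$ is the part that TOPEX-class altimetry seeks to isolate.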
Treatment of Primary Pulmonary Aspergillosis: An Assessment of the Evidence.
Aspergillus spp. are a group of filamentous molds that were first described due to a perceived similarity, when viewed under a microscope, to an aspergillum, a liturgical device used to sprinkle holy water. Although commonly inhaled due to their ubiquitous nature within the environment, an invasive fungal infection (IFI) is a rare outcome that is often reserved for those patients who are immunocompromised. Given the potential for significant morbidity and mortality within this patient population from IFI due to Aspergillus spp., along with the rise in the use of therapies that confer immunosuppression, there is an increasing need for appropriate initial clinical suspicion leading to accurate diagnosis and effective treatment. Voriconazole remains the first-line agent for therapy; however, polyenes, novel triazole agents, or voriconazole in combination with an echinocandin may also be utilized. Consideration as to which particular agent and for what duration should be made in the individual context for each patient based upon underlying immunosuppression, comorbidities, and overall tolerance of therapy.
Improved decision support for engine-in-the-loop experimental design optimization
Experimental optimization with hardware in the loop is a common procedure in engineering and has been the subject of intense development, particularly when it is applied to relatively complex combinatorial systems that are not completely understood, or where accurate modelling is not possible owing to the dimensions of the search space. A common source of difficulty arises because of the level of noise associated with experimental measurements, a combination of limited instrument precision and extraneous factors. When a series of experiments is conducted to search for a combination of input parameters that results in a minimum or maximum response, under the imposition of noise, the underlying shape of the function being optimized can become very difficult to discern, or even lost. A common methodology to support experimental search for optimal or suboptimal values is to use one of the many gradient descent methods. However, even sophisticated and proven methodologies, such as simulated annealing, can be significantly challenged in the presence of noise, since approximating the gradient at any point becomes highly unreliable. Often, experiments that should be rejected are accepted as a result of random noise, and vice versa. This is also true for other sampling techniques, including tabu and evolutionary algorithms.
After the general introduction, this paper is divided into two main sections (sections 2 and 3), which are followed by the conclusion. Section 2 introduces a decision support methodology based upon response surfaces, which supplements experimental management based on a variable neighbourhood search and is shown to be highly effective in directing experiments in the presence of significant noise and complex combinatorial functions. The methodology is developed on a three-dimensional surface with multiple local minima, a large basin of attraction, and a low signal-to-noise ratio.
In section 3, the methodology is applied to an automotive combinatorial search in the laboratory, on a real-time engine-in-the-loop application. In this application, it is desired to find the maximum power output of an experimental single-cylinder spark ignition engine operating under a quasi-constant-volume operating regime. Under this regime, the piston is slowed at top dead centre to achieve combustion in close to constant-volume conditions.
As part of the further development of the engine to incorporate a linear generator to investigate free-piston operation, it is necessary to perform a series of experiments with combinatorial parameters. The objective is to identify the maximum power point in the least number of experiments in order to minimize costs. This test programme provides peak power data in order to achieve optimal electrical machine design.
The decision support methodology is combined with standard optimization and search methods – namely gradient descent and simulated annealing – in order to study the reductions possible in experimental iterations. It is shown that the decision support methodology significantly reduces the number of experiments necessary to find the maximum power solution and thus offers a potentially significant cost saving to hardware-in-the-loop experimentation.
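The noise problem the abstract describes can be illustrated with a minimal simulated-annealing sketch on a hypothetical noisy one-dimensional response; here, averaging repeated measurements before each accept/reject decision is a crude stand-in for the paper's response-surface decision support, not the authors' method:

```python
import math
import random

random.seed(1)

def measure(x, noise=0.5, repeats=8):
    """Noisy experimental response, averaged over repeated measurements.
    Hypothetical cost with its true minimum (best operating point) at x = 2."""
    true = (x - 2.0) ** 2
    return sum(true + random.gauss(0.0, noise) for _ in range(repeats)) / repeats

def anneal(x0=10.0, n_iter=2000, t0=5.0, cooling=0.997, step=0.3):
    x, t = x0, t0
    for _ in range(n_iter):
        cand = x + random.gauss(0.0, step)
        # Re-measure both points each iteration: comparing two averaged,
        # noisy readings reduces (but does not remove) the wrong
        # accept/reject decisions the abstract describes.
        fx, fc = measure(x), measure(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x = cand
        t *= cooling
    return x

best = anneal()
print(round(best, 2))  # settles near the true optimum at x = 2
```

Every repeated measurement costs an extra experiment, which is exactly the trade-off that motivates a decision support layer: spend fewer hardware runs while still making reliable accept/reject decisions.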