The comparison of various gas turbine inlet air cooling methods for various ambient conditions through energy and exergy analysis
The strong influence of climate conditions on gas turbine behavior is well known. During the summer season, the output of gas turbines falls below the rated output under high ambient temperature conditions. Cooling the turbine inlet air can increase output power considerably, because cooled air is denser, giving the turbine a higher mass flow rate and resulting in increased turbine output and efficiency. This study uses the energy and exergy analysis method to evaluate air cooling methods used for enhancing gas turbine power plants. In addition, the effect of each inlet air cooling method on the output power, exergy efficiency and exergy destruction has been analyzed. At the end of the paper, the two methods are compared.
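The density argument above can be sketched numerically. This is a minimal illustration assuming ideal-gas air and power roughly proportional to inlet mass flow; the figures are illustrative, not the cycle data from the paper:

```python
# Sketch: effect of inlet-air cooling on gas-turbine mass flow and power.
# Assumes ideal-gas air and power ~ inlet mass flow ~ air density;
# all numbers are illustrative, not taken from the study.
R_AIR = 287.05  # J/(kg*K), specific gas constant of dry air

def air_density(temp_c, pressure_pa=101_325.0):
    """Ideal-gas density of air at the given temperature (Celsius)."""
    return pressure_pa / (R_AIR * (temp_c + 273.15))

def relative_power(inlet_c, rated_c=15.0):
    """Power relative to the ISO rating, assuming power scales with density."""
    return air_density(inlet_c) / air_density(rated_c)

for t in (40.0, 25.0, 15.0):
    print(f"inlet {t:5.1f} C -> {relative_power(t):.3f} x rated power")
```

Cooling a 40 °C summer intake back toward the 15 °C rating point recovers roughly the density ratio of the two absolute temperatures in output, which is the effect the paper quantifies with full energy and exergy accounting.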
Towards the Future of Supernova Cosmology
For future surveys, spectroscopic follow-up for all supernovae will be
extremely difficult. However, one can use light curve fitters to obtain the
probability that an object is a Type Ia. One may consider applying a
probability cut to the data, but we show that the resulting non-Ia
contamination can lead to biases in the estimation of cosmological parameters.
A different method, which allows the use of the full dataset and results in
unbiased cosmological parameter estimation, is Bayesian Estimation Applied to
Multiple Species (BEAMS). BEAMS is a Bayesian approach to the problem which
includes the uncertainty in the types in the evaluation of the posterior. Here
we outline the theory of BEAMS and demonstrate its effectiveness using both
simulated datasets and SDSS-II data. We also show that it is possible to use
BEAMS if the data are correlated, by introducing a numerical marginalisation
over the types of the objects. This is largely a pedagogical introduction to
BEAMS with references to the main BEAMS papers.
Comment: Replaced under married name Lochner (formerly Knights). 3 pages, 2 figures. To appear in the Proceedings of the 13th Marcel Grossmann Meeting (MG13), Stockholm, Sweden, 1-7 July 201
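The core BEAMS idea, folding type uncertainty into the posterior, can be sketched for the uncorrelated case as a per-object mixture likelihood. The Gaussian population forms and the non-Ia offset and scatter parameters below are illustrative assumptions, not the model from the paper:

```python
# Minimal sketch of a BEAMS-style likelihood for uncorrelated supernovae:
# each object contributes a mixture of Ia and non-Ia likelihoods weighted
# by its type probability P_i. Gaussian populations are an assumption here.
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def beams_log_likelihood(mu_obs_list, mu_model_list, p_ia_list,
                         sigma_ia, sigma_nonia, offset_nonia):
    """Sum over objects of log[ P_i * L_Ia + (1 - P_i) * L_nonIa ].

    mu_obs_list:   observed distance moduli
    mu_model_list: distance moduli predicted by the cosmology being tested
    p_ia_list:     per-object probability of being a Type Ia
    """
    log_l = 0.0
    for mu_obs, mu_th, p in zip(mu_obs_list, mu_model_list, p_ia_list):
        l_ia = gauss(mu_obs, mu_th, sigma_ia)
        # Illustrative non-Ia population: offset mean and broader scatter.
        l_non = gauss(mu_obs, mu_th + offset_nonia, sigma_nonia)
        log_l += math.log(p * l_ia + (1.0 - p) * l_non)
    return log_l
```

Setting every P_i to 0 or 1 recovers an ordinary cut-based likelihood; intermediate probabilities are what let BEAMS use the full photometric sample without the contamination bias described above.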
Extending BEAMS to incorporate correlated systematic uncertainties
New supernova surveys such as the Dark Energy Survey, Pan-STARRS and the LSST
will produce an unprecedented number of photometric supernova candidates, most
with no spectroscopic data. Avoiding biases in cosmological parameters due to
the resulting inevitable contamination from non-Ia supernovae can be achieved
with the BEAMS formalism, allowing for fully photometric supernova cosmology
studies. Here we extend BEAMS to deal with the case in which the supernovae are
correlated by systematic uncertainties. The analytical form of the full BEAMS
posterior requires evaluating 2^N terms, where N is the number of supernova
candidates. This `exponential catastrophe' is computationally unfeasible even
for N of order 100. We circumvent the exponential catastrophe by marginalising
numerically instead of analytically over the possible supernova types: we
augment the cosmological parameters with nuisance parameters describing the
covariance matrix and the types of all the supernovae, \tau_i, that we include
in our MCMC analysis. We show that this method deals well even with large,
unknown systematic uncertainties without a major increase in computational
time, whereas ignoring the correlations can lead to significant biases and
incorrect credible contours. We then compare the numerical marginalisation
technique with a perturbative expansion of the posterior based on the insight
that future surveys will have exquisite light curves and hence the probability
that a given candidate is a Type Ia will be close to unity or zero, for most
objects. Although this perturbative approach changes computation of the
posterior from a 2^N problem into an N^2 or N^3 one, we show that it leads to
biases in general through a small number of misclassifications, implying that
numerical marginalisation is superior.
Comment: Resubmitted under married name Lochner (formerly Knights). Version 3: major changes, including a large-scale analysis with thousands of MCMC chains. Matches version published in JCAP. 23 pages, 8 figures
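The 2^N sum described above can be written out explicitly for small N. This toy sketch (Gaussian populations and an invented non-Ia offset parameter, not the authors' code) shows the exact analytic marginalisation over type vectors that the numerical MCMC over the \tau_i avoids:

```python
# Sketch of the 2^N analytic marginalisation: for small N one can sum the
# correlated-Gaussian likelihood over every possible type assignment.
# The population model (mean offset for non-Ia) is an illustrative assumption.
import itertools

import numpy as np

def correlated_likelihood(residuals, cov, taus, offset_nonia):
    """Multivariate Gaussian likelihood for one type assignment (tau_i = 1 means Ia)."""
    shifted = residuals - offset_nonia * (1 - np.asarray(taus))
    inv = np.linalg.inv(cov)
    norm = 1.0 / np.sqrt((2 * np.pi) ** len(residuals) * np.linalg.det(cov))
    return norm * np.exp(-0.5 * shifted @ inv @ shifted)

def marginal_likelihood(residuals, cov, p_ia, offset_nonia):
    """Exact sum over all 2^N type vectors, weighted by prior type probabilities."""
    total = 0.0
    for taus in itertools.product((0, 1), repeat=len(residuals)):
        prior = np.prod([p if t else 1.0 - p for t, p in zip(taus, p_ia)])
        total += prior * correlated_likelihood(residuals, cov, taus, offset_nonia)
    return total
```

The loop body is cheap, but the number of iterations doubles with every candidate, which is why the paper replaces this sum with sampling the \tau_i alongside the cosmological parameters in the MCMC.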
Doubly hybrid density functional for accurate descriptions of nonbond interactions, thermochemistry, and thermochemical kinetics
We develop and validate a density functional, XYG3, based on the adiabatic connection formalism and the Görling–Levy coupling-constant perturbation expansion to the second order (PT2). XYG3 is a doubly hybrid functional, containing 3 mixing parameters. It has a nonlocal orbital-dependent component in the exchange term (exact exchange) plus information about the unoccupied Kohn–Sham orbitals in the correlation part (PT2 double excitation). XYG3 is remarkably accurate for thermochemistry, reaction barrier heights, and nonbond interactions of main group molecules. In addition, the accuracy remains nearly constant with system size
Effect of operating temperature on direct recycling aluminium chips (AA6061) in hot press forging process
A method of solid-state recycling of AA6061 aluminium alloy chips using a hot press forging process was studied, together with the possibility of using the recycled chips as a secondary resource. This paper presents results for recycled AA6061 aluminium alloy chips processed at different operating temperatures in the hot press forging process. The mechanical properties and microstructure of the recycled specimens and an as-received (reference) specimen were investigated. The recycled specimens exhibit good potential in strength properties. The yield strength (YS) and ultimate tensile strength (UTS) at the minimum temperature of 430˚C are 25.8 MPa and 27.13 MPa, respectively; at the maximum operating temperature of 520˚C, YS and UTS are 107.0 MPa and 117.53 MPa. Analysis across operating temperatures shows that higher temperatures give better mechanical properties and a finer microstructure. The strength of the recycled specimens increases mainly through grain refinement strengthening, whereas particle dispersion strengthening has a minor effect. In this study, the recycled AA6061 chips show good strengthening potential: using only 17.5% of the suggested pressure (70.0 of 400.0 MPa), the UTS reaches 35.8% of the reference value (117.58 of 327.69 MPa). This shows the remarkable potential of direct recycling using the hot press forging process.
Characterization of flow rate and Heat Loss in Heating, Ventilation and Air Conditioning (HVAC) Duct System for Office Building
A building is a structure firmly attached to the ground that accommodates human activities, and its daily operation needs to be considered. Improvements in building performance focus on improving the energy efficiency of buildings. One approach is the design of the heating, ventilation and air conditioning (HVAC) duct system, since HVAC is one of the largest consumers of energy in maintaining building performance and the indoor environment. The objectives of this research are to calculate the supply air flow rate (CFM) in an office building, to characterize the velocity and head loss in round and rectangular HVAC ducting systems at various duct thicknesses, and to optimize the duct thickness in the HVAC system according to the ASHRAE Standard. Increasing velocity in the duct system increases the head loss. The round duct design gives the lowest velocity and head loss in the HVAC system, approximately 9.35% lower than the rectangular duct at 0.06 inches thickness. Hence, the trends of head loss with duct thickness inform noise reduction in the HVAC duct system and the selection of the best design concept, which is the round shape design.
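The round-versus-rectangular comparison can be illustrated with the standard Darcy-Weisbach relation for friction head loss; the friction factor and duct dimensions below are invented for illustration, not values from the study:

```python
# Hedged sketch: straight-duct head loss via the Darcy-Weisbach equation,
# h = f * (L/D) * v^2 / (2g). Friction factor and dimensions are illustrative.
G = 9.81  # m/s^2, gravitational acceleration

def head_loss(friction_factor, length_m, hydraulic_diameter_m, velocity_ms):
    """Friction head loss (metres of fluid) in a straight duct run."""
    return friction_factor * (length_m / hydraulic_diameter_m) * velocity_ms**2 / (2 * G)

def hydraulic_diameter_rect(width_m, height_m):
    """Equivalent (hydraulic) diameter of a rectangular duct: 4A / P."""
    return 4 * width_m * height_m / (2 * (width_m + height_m))

# A round duct's hydraulic diameter is simply its diameter, so for equal
# cross-sectional area the round duct has the larger D and the lower head loss.
print(head_loss(0.02, 10.0, 0.5, 5.0))
print(hydraulic_diameter_rect(0.5, 0.5))
```

For equal cross-sectional area, a rectangular duct always has a smaller hydraulic diameter than a round one, so its friction head loss is higher, consistent with the round design performing best in this research.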
Machine Learning Application to Atmospheric Chemistry Modeling
Atmospheric chemistry models are a central tool to study the impact of chemical constituents on the environment, vegetation and human health. These models split the atmosphere into a large number of grid-boxes and consider the emission of compounds into these boxes and their subsequent transport, deposition, and chemical processing. The chemistry is represented through a series of simultaneous ordinary differential equations, one for each compound. Given the difference in lifetimes between the chemical compounds (milliseconds for O(1D) to years for CH4), these equations are numerically stiff, and solving them constitutes a significant fraction of the computational burden of a chemistry model. We have investigated a machine learning approach to emulate the chemistry instead of solving the differential equations numerically. From a one-month simulation of the GEOS-Chem model we have produced a training dataset consisting of the concentrations of compounds before and after the differential equations are solved, together with some key physical parameters for every grid-box and time-step. From this dataset we have trained a machine learning algorithm (a regression forest) to predict the concentrations of the compounds after the integration step based on the concentrations and physical state at the beginning of the time step. We have then included this algorithm back into the GEOS-Chem model, bypassing the need to integrate the chemistry. This machine learning approach reproduces many of the characteristics of the full simulation and has the potential to be substantially faster. There is a wide range of applications for such an approach: generating boundary conditions, air quality forecasts, chemical data assimilation systems, etc. We discuss the speed and accuracy of our approach, and highlight some potential future directions for improving it
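The emulation workflow (train a regression forest to map pre-step state to post-step concentrations, then use it in place of the stiff solver) can be sketched as follows. The toy "chemistry" here is an invented decay rule standing in for GEOS-Chem output, and the hyperparameters are illustrative, not the authors' settings:

```python
# Illustrative sketch (not the authors' code) of emulating a stiff chemistry
# integrator with a regression forest: learn the map from concentrations and
# physical state before the time step to concentrations after it.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Toy "training set": state before the step (concentrations + temperature)
# and state after, generated by a made-up decay rule instead of GEOS-Chem.
n_samples, n_species = 2000, 3
before = rng.uniform(0.0, 1.0, size=(n_samples, n_species))
temperature = rng.uniform(250.0, 310.0, size=(n_samples, 1))
X = np.hstack([before, temperature])
decay = np.exp(-np.array([0.5, 0.1, 0.01]))  # stand-in for the ODE solution
after = before * decay                        # "after integration" targets

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, after)

# Emulate one chemistry step: predict post-integration concentrations directly
# instead of calling the stiff ODE solver.
x_new = np.array([[0.5, 0.5, 0.5, 280.0]])
print(forest.predict(x_new))
```

In the real application the features are the full pre-step chemical state plus physical parameters of each grid-box, and the forest's prediction replaces the expensive numerical integration inside the model's time loop.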
Ventricular arrhythmias classification and onset determination system
Accurately differentiating between ventricular fibrillation (VF) and ventricular tachycardia (VT) episodes is crucial in preventing potentially fatal misinterpretations that could lead to needless shocks to patients, resulting in damage to the heart. Apart from accurately classifying between VT and VF, early determination of the onset of ventricular arrhythmias is also important, as it allows more efficient monitoring of patients and can potentially save lives. Thus, this research focuses on developing a system called the Classification and Onset Determination System (CODS) that is able to classify, track and monitor ventricular arrhythmias using a method called the Second Order Dynamic Binary Decomposition (SOD-BD) technique. Two significant characteristics (the natural frequency and the input parameter) were extracted from electrocardiogram (ECG) signals provided by the PhysioBank database and analyzed to find significant differences for each ventricular arrhythmia type and to classify the ECGs accordingly (N, VT and VF). The outcome of these ECG extractions was also used to locate the onset of ventricular arrhythmia, which is useful for predicting the occurrence of heart abnormalities. All ECG analysis, parameter extraction, classification techniques, and the CODS were developed using LabVIEW software
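The SOD-BD technique itself is not reproduced here. As a generic illustration of the kind of feature the abstract describes, a natural frequency can be extracted from a signal by fitting a second-order autoregressive (AR(2)) model and reading the frequency off the complex pole angle; the signal and sampling rate below are synthetic assumptions, not ECG data from the study:

```python
# Generic illustration (not the paper's SOD-BD method): estimate a dominant
# natural frequency by a least-squares AR(2) fit and the pole angle.
import numpy as np

def ar2_natural_frequency(signal, fs):
    """Fit x[n] = a1*x[n-1] + a2*x[n-2]; frequency from the complex pole angle."""
    x = np.asarray(signal, dtype=float)
    X = np.column_stack([x[1:-1], x[:-2]])          # lagged regressors
    a1, a2 = np.linalg.lstsq(X, x[2:], rcond=None)[0]
    poles = np.roots([1.0, -a1, -a2])               # poles of the AR(2) model
    return abs(np.angle(poles[0])) * fs / (2 * np.pi)

fs = 360.0                          # Hz, a common ECG sampling rate (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 5.0 * t)   # synthetic 5 Hz oscillation
print(ar2_natural_frequency(sig, fs))
```

A feature of this kind, computed over sliding windows, is one plausible way such a system could separate rhythm classes and flag the time at which the dominant frequency changes, i.e. the arrhythmia onset.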
The development of mobile augmented reality for laptop maintenance (MAR4LM)
Currently, laptops have become a necessity for most people, as a laptop provides mobility and helps in doing the tasks that a normal desktop PC can do. However, a malfunction of the laptop hardware will disrupt the user's work and decrease productivity. By utilizing Augmented Reality (AR), which combines the real world and the virtual world, a laptop maintenance application can be made a reality. The objective of this paper is to discuss how to develop the Mobile Augmented Reality for Laptop Maintenance (MAR4LM) application. This AR application is developed specifically for smartphones based on the Android platform. Therefore, this study presents a way to understand and discuss the process of using this new technology on the Android platform. This application has been shown to increase users' understanding of their laptops and to help them perform maintenance on their own. In addition, laptop maintenance tasks become straightforward, easy, interactive, and available anywhere and anytime
