Six Peaks Visible in the Redshift Distribution of 46,400 SDSS Quasars Agree with the Preferred Redshifts Predicted by the Decreasing Intrinsic Redshift Model
The redshift distribution of all 46,400 quasars in the Sloan Digital Sky
Survey (SDSS) Quasar Catalog III, Third Data Release, is examined. Six peaks
that fall within the redshift window below z = 4 are visible. Their positions
agree with the preferred redshift values predicted by the decreasing intrinsic
redshift (DIR) model, even though this model was derived using completely
independent evidence. A power spectrum analysis of the full dataset confirms
the presence of a single, significant power peak at the expected redshift
period. Power peaks with the predicted period are also obtained when the upper
and lower halves of the redshift distribution are examined separately. The
periodicity detected is in linear z, as opposed to log(1+z). Because the peaks
in the SDSS quasar redshift distribution agree well with the preferred
redshifts predicted by the intrinsic redshift relation, we conclude that this
relation and the peaks in the redshift distribution likely both have the same
origin, which may be intrinsic redshifts or a common selection effect.
However, because of the way the intrinsic redshift relation was determined, it
seems unlikely that one selection effect could have been responsible for both.
Comment: 12 pages, 12 figures, accepted for publication in the Astrophysical Journal
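As a rough illustration of the kind of periodicity test described above (binning redshifts in linear z and inspecting the power spectrum of the histogram), here is a minimal sketch. The synthetic sample with a built-in Delta z = 0.3 comb is a stand-in for the 46,400 SDSS redshifts, and the bin width and spacing are invented, not the paper's values.

```python
import numpy as np

# Hypothetical redshift sample with an artificial Delta z = 0.3 comb,
# standing in for the SDSS quasar catalogue used in the paper.
rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(0.3 * k, 0.03, 1500) for k in range(1, 13)])

# Bin in linear z (not log(1+z)), as the abstract specifies.
bin_width = 0.01
bins = np.arange(0.0, 4.0 + bin_width, bin_width)
counts, _ = np.histogram(z, bins=bins)

# Power spectrum of the mean-subtracted counts: a significant peak at
# frequency f corresponds to a redshift periodicity of Delta z = 1/f.
signal = counts - counts.mean()
power = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(signal), d=bin_width)

k = np.argmax(power[1:]) + 1          # skip the zero-frequency term
print(f"strongest periodicity: Delta z ~ {1.0 / freqs[k]:.3f}")   # ~0.3
```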
Towards virtual machine energy-aware cost prediction in clouds
Pricing mechanisms employed by different service providers significantly influence the role of cloud computing within the IT industry. With the increasing cost of electricity, Cloud providers consider power consumption one of the major cost factors to be managed within their infrastructures. Consequently, modelling a new pricing mechanism that allows Cloud providers to determine the potential cost of resource usage and power consumption has attracted the attention of many researchers. Furthermore, predicting the future cost of Cloud services can help service providers offer services that meet their customers' requirements. This paper introduces an Energy-Aware Cost Prediction Framework to estimate the total cost of Virtual Machines (VMs) by considering both resource usage and power consumption. The VMs' workload is first predicted using an Autoregressive Integrated Moving Average (ARIMA) model. The power consumption is then predicted using regression models. The comparison between the predicted and actual results obtained in a real Cloud testbed shows that this framework is capable of predicting the workload, power consumption and total cost for different VMs with good prediction accuracy, e.g. with 0.06 absolute percentage error for the predicted total cost of the VMs.
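A rough sketch of the two-stage idea (ARIMA for workload, then a regression from utilisation to power, then an energy cost). The model orders, calibration data, electricity price, and sampling interval below are invented for illustration and are not the paper's configuration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.linear_model import LinearRegression

# Hypothetical history: CPU utilisation (%) sampled hourly for one VM.
rng = np.random.default_rng(1)
cpu = 40 + 10 * np.sin(np.arange(200) / 10) + rng.normal(0, 2, 200)

# Stage 1: forecast future workload with an ARIMA model.
workload_model = ARIMA(cpu, order=(2, 0, 1)).fit()
cpu_forecast = workload_model.forecast(steps=24)

# Stage 2: map utilisation to power draw with a regression model
# trained on (utilisation, measured watts) pairs from a testbed.
watts = 100 + 1.5 * cpu + rng.normal(0, 3, 200)   # fake calibration data
power_model = LinearRegression().fit(cpu.reshape(-1, 1), watts)
power_forecast = power_model.predict(cpu_forecast.reshape(-1, 1))

# Energy-aware cost: predicted kWh times an assumed electricity price.
price_per_kwh, hours_per_step = 0.20, 1.0
energy_kwh = power_forecast.sum() * hours_per_step / 1000.0
print(f"predicted 24h energy cost: {energy_kwh * price_per_kwh:.2f}")
```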
Design and analysis of fractional factorial experiments from the viewpoint of computational algebraic statistics
We give an expository review of applications of computational algebraic
statistics to the design and analysis of fractional factorial experiments, based
on our recent work. For the purpose of design, the techniques of Gröbner bases
and indicator functions allow us to treat fractional factorial designs without
distinction between regular designs and non-regular designs. For the purpose of
analysis of data from fractional factorial designs, the techniques of Markov
bases allow us to handle discrete observations. Thus the approach of
computational algebraic statistics greatly enlarges the scope of fractional
factorial designs.
Comment: 16 pages
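As a toy illustration of the Gröbner-basis machinery mentioned above (not an example taken from the paper): the points of a two-level fractional factorial design can be encoded as the variety of a polynomial ideal, and a Gröbner basis then exposes the aliasing structure of the design.

```python
from sympy import symbols, groebner

x1, x2, x3 = symbols("x1 x2 x3")

# A 2^(3-1) fractional factorial design with levels +/-1 and defining
# relation x1*x2*x3 = 1, encoded as the variety of a polynomial ideal.
design_ideal = [x1**2 - 1, x2**2 - 1, x3**2 - 1, x1 * x2 * x3 - 1]

# The Groebner basis gives a normal form for polynomials on the design;
# e.g. it rewrites x1*x2 as x3, showing the two interactions are aliased.
G = groebner(design_ideal, x1, x2, x3, order="lex")
print(G)
print(G.reduce(x1 * x2)[1])   # remainder: the alias of the x1*x2 interaction
```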
Damage function for historic paper. Part I: Fitness for use
Background: In heritage science literature and in preventive conservation practice, damage functions are used to model material behaviour, and specifically damage (unacceptable change), as a result of the presence of a stressor over time. For such functions to be of use in the context of collection management, it is important to define a range of parameters, such as who the stakeholders are (e.g. the public, curators, researchers), the mode of use (e.g. display, storage, manual handling), the long-term planning horizon (i.e. when in the future it is deemed acceptable for an item to become damaged or unfit for use), and the threshold of damage, i.e. the extent of physical change assessed as damage.
Results: In this paper, we explore the threshold of fitness for use for archival and library paper documents used for display or reading in the context of access in reading rooms by the general public. Change is considered in the context of discolouration and mechanical deterioration such as tears and missing pieces: forms of physical deterioration that accumulate with time in libraries and archives. We also explore whether the threshold of fitness for use is defined differently for objects perceived to be of different value, and for different modes of use. The data were collected in a series of fitness-for-use workshops carried out with readers/visitors in heritage institutions using principles of Design of Experiments.
Conclusions: The results show that when no particular value is pre-assigned to an archival or library document, missing pieces influenced readers'/visitors' subjective judgements of fitness for use to a greater extent than did discolouration and tears (which had little or no influence). This finding was most apparent in the display context in comparison to the reading room context. The finding also applied best when readers/visitors were not given a value scenario (in comparison to when they were asked to think about the document having personal or historic value). It can be estimated that, in general, items become unfit when text is evidently missing. However, if the visitor/reader is prompted to think of a document in terms of its historic value, then change in a document has little impact on fitness for use.
Quantum trajectories for the realistic measurement of a solid-state charge qubit
We present a new model for the continuous measurement of a coupled quantum
dot charge qubit. We model the effects of a realistic measurement, namely
adding noise to, and filtering, the current through the detector. This is
achieved by embedding the detector in an equivalent circuit for measurement.
Our aim is to describe the evolution of the qubit state conditioned on the
macroscopic output of the external circuit. We achieve this by generalizing a
recently developed quantum trajectory theory for realistic photodetectors [P.
Warszawski, H. M. Wiseman and H. Mabuchi, Phys. Rev. A 65, 023802 (2002)] to
treat solid-state detectors. This yields stochastic equations whose (numerical)
solutions are the "realistic quantum trajectories" of the conditioned qubit
state. We derive our general theory in the context of a low transparency
quantum point contact. Areas of application for our theory and its relation to
previous work are discussed.
Comment: 7 pages, 2 figures. Shorter, significantly modified, updated version
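The following is not the paper's realistic-detector theory (it omits the equivalent-circuit filtering and added noise); it is a generic diffusive quantum-trajectory integrator for a qubit under continuous measurement, with a made-up Hamiltonian and rates, sketched only to show what numerically solving such conditioned stochastic equations looks like.

```python
import numpy as np

# Pauli matrices and a simple qubit Hamiltonian H = (Delta/2) * sigma_x.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
Delta, gamma = 1.0, 0.5            # tunnelling and measurement rates (assumed)
H = 0.5 * Delta * sx
c = np.sqrt(gamma) * sz            # measurement (Lindblad) operator

dt, steps = 1e-3, 5000
rho = np.array([[1, 0], [0, 0]], dtype=complex)   # start in |0><0|
rng = np.random.default_rng(2)

for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt))             # Wiener increment
    comm = -1j * (H @ rho - rho @ H)              # unitary part
    cd = c.conj().T
    lind = c @ rho @ cd - 0.5 * (cd @ c @ rho + rho @ cd @ c)
    ex = np.trace((c + cd) @ rho).real
    meas = c @ rho + rho @ cd - ex * rho          # measurement (innovation) term
    rho = rho + (comm + lind) * dt + meas * dW    # Euler-Maruyama step
    rho = rho / np.trace(rho).real                # guard against trace drift

print("final <sigma_z> =", np.trace(sz @ rho).real)
```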
Ensemble Sales Forecasting Study in Semiconductor Industry
Sales forecasting plays a prominent role in business planning and business
strategy. The value and importance of advance information is a cornerstone of
planning activity, and a well-set forecast goal can guide the sales force more
efficiently. In this paper, CPU sales forecasting at Intel Corporation, a
multinational semiconductor company, was considered. Past sales, future
bookings, exchange rates, gross domestic product (GDP) forecasts, seasonality
and other indicators were innovatively incorporated into the quantitative
modelling. Benefiting from recent advances in computational power and software
development, millions of models built upon multiple regression, time series
analysis, random forests and boosted trees were executed in parallel. The models
with smaller validation errors were selected to form the ensemble model. To
better capture distinct characteristics, forecasting models were implemented at
the lead-time and line-of-business level. The moving-window validation process
automatically selected the models that most closely represent current market
conditions. The weekly forecasting cadence allowed the model to respond
effectively to market fluctuations. A generic variable importance analysis was
also developed to increase model interpretability. Rather than assuming a fixed
distribution, this non-parametric permutation variable importance analysis
provides a general framework for evaluating variable importance across methods.
The framework can be further extended to classification problems by replacing
the mean absolute percentage error (MAPE) with misclassification error. Demo
code is available at https://github.com/qx0731/ensemble_forecast_methods
Comment: 14 pages, Industrial Conference on Data Mining 2017 (ICDM 2017)
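A compact sketch of permutation variable importance scored with MAPE, in the spirit of the framework described above; the model, synthetic data, and feature indices are stand-ins and this is not the repository's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def mape(y_true, y_pred):
    """Mean absolute percentage error."""
    return np.mean(np.abs((y_true - y_pred) / y_true))

# Hypothetical training data: rows are weeks, columns are indicators
# (e.g. bookings, FX rate, GDP forecast, seasonality).
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 4))
y = 100 + 5 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 1, 300)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
baseline = mape(y, model.predict(X))

# Permutation importance: shuffle one column at a time and measure how
# much the MAPE degrades relative to the baseline.
for j in range(X.shape[1]):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    print(f"feature {j}: importance = {mape(y, model.predict(Xp)) - baseline:.4f}")
```

Swapping `mape` for a misclassification rate (and the regressor for a classifier) gives the classification extension mentioned in the abstract.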
Information theoretic approach to interactive learning
The principles of statistical mechanics and information theory play an
important role in learning and have inspired both theory and the design of
numerous machine learning algorithms. The new aspect in this paper is a focus
on integrating feedback from the learner. A quantitative approach to
interactive learning and adaptive behavior is proposed, integrating model- and
decision-making into one theoretical framework. This paper follows simple
principles by requiring that the observer's world model and action policy
should result in maximal predictive power at minimal complexity. Classes of
optimal action policies and of optimal models are derived from an objective
function that reflects this trade-off between prediction and complexity. The
resulting optimal models then summarize, at different levels of abstraction,
the process's causal organization in the presence of the learner's actions. A
fundamental consequence of the proposed principle is that the learner's optimal
action policies balance exploration and control as an emerging property.
Interestingly, the explorative component is present in the absence of policy
randomness, i.e. in the optimal deterministic behavior. This is a direct result
of requiring maximal predictive power in the presence of feedback.
Comment: 6 pages
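Schematically, the trade-off described above can be written in an information-bottleneck style; the exact functional, variables, and multiplier below are illustrative assumptions, not the paper's notation.

```latex
% Illustrative prediction-complexity trade-off (assumed form): choose an
% internal state s (compressing the history h) and a policy pi to maximise
% predictive power about the future at minimal model complexity.
\max_{p(s \mid h),\; \pi(a \mid s)}
  \Big[\, I(S, A;\, X_{\text{future}}) \;-\; \lambda\, I(H;\, S) \,\Big]
```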
Temporal variability and statistics of the Strehl ratio in adaptive-optics images
We have investigated the temporal variability and statistics of the
"instantaneous" Strehl ratio. The observations were carried out with the 3.63-m
AEOS telescope equipped with a high-order adaptive optics system. In this paper,
the Strehl ratio is defined as the peak intensity of a single short exposure. We
have also studied the behaviour of the phase variance computed on the
reconstructed wavefronts. We tested the Maréchal approximation and used it to
explain the observed negative skewness of the Strehl ratio distribution. The
estimate of the phase variance is shown to fit a three-parameter Gamma
distribution model. We show that simple scaling of the reconstructed wavefronts
has a large impact on the shape of the Strehl ratio distribution.
Comment: submitted to PAS
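For context (a standard adaptive-optics relation quoted here for reference, not a result taken from the paper), the extended Maréchal approximation ties the Strehl ratio S to the residual phase variance:

```latex
% Extended Marechal approximation: Strehl ratio vs. residual
% wavefront phase variance \sigma_\phi^2 (in rad^2).
S \;\approx\; \exp\!\left(-\sigma_\phi^2\right)
\;\approx\; 1 - \sigma_\phi^2 \quad (\sigma_\phi^2 \ll 1)
```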
R. A. Fisher, design theory, and the Indian connection
Design Theory, a branch of mathematics, was born out of the experimental
statistics research of the population geneticist R. A. Fisher and of Indian
mathematical statisticians in the 1930s. The field combines elements of
combinatorics, finite projective geometries, Latin squares, and a variety of
further mathematical structures, brought together in surprising ways. This
essay will present these structures and ideas as well as how the field came
together, in itself an interesting story.
Comment: 11 pages, 3 figures
Gas and dust in the Beta Pictoris Moving Group as seen by the Herschel Space Observatory
Context. Debris discs are thought to be formed through the collisional
grinding of planetesimals, and can be considered as the outcome of planet
formation. Understanding the properties of gas and dust in debris discs can
help us to comprehend the architecture of extrasolar planetary systems.
Herschel Space Observatory far-infrared (IR) photometry and spectroscopy have
provided a valuable dataset for the study of the gas and dust composition of
debris discs. This paper is part of a series of papers devoted to the study of
Herschel PACS observations of young stellar associations.
Aims. This work aims at studying the properties of discs in the Beta Pictoris
Moving Group (BPMG) through far-IR PACS observations of dust and gas.
Methods. We obtained Herschel-PACS far-IR photometric observations at 70, 100
and 160 microns of 19 BPMG members, together with spectroscopic observations of
four of them. Spectroscopic observations were centred at 63.18 microns and 157
microns, aiming to detect [OI] and [CII] emission. We incorporated the new
far-IR observations into the SEDs of BPMG members and fitted modified blackbody
models to better characterise the dust content.
Results. We have detected far-IR excess emission toward nine BPMG members,
including the first detection of an IR excess toward HD 29391. The star HD
172555 shows [OI] emission, while HD 181296 shows [CII] emission, expanding
the short list of debris discs with a gas detection. No debris disc in BPMG is
detected in both [OI] and [CII]. The discs show dust temperatures in the range
55 to 264 K, with low dust masses (6.6 x 10^-5 to 0.2 MEarth) and radii
from blackbody models in the range 3 to 82 AU. All the objects with a gas
detection are early spectral type stars with a hot dust component.
Comment: 12 pages, 7 figures, 6 tables
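A minimal sketch of the modified-blackbody dust model commonly used for such SED fits (a Planck function with the emissivity reduced beyond a break wavelength); the temperature, break wavelength, and beta below are placeholders, not the paper's fitted values.

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

def planck_nu(nu, T):
    """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (KB * T))

def modified_blackbody(wav_um, T_dust, beta=1.0, lam0_um=100.0):
    """Modified blackbody: pure blackbody up to lam0, with emissivity
    falling as (lam0/lam)^beta beyond it (placeholder parameters)."""
    nu = C / (wav_um * 1e-6)
    eps = np.where(wav_um <= lam0_um, 1.0, (lam0_um / wav_um) ** beta)
    return eps * planck_nu(nu, T_dust)

# Evaluate a 60 K dust component at the PACS bands used in the survey.
for wav in (70.0, 100.0, 160.0):
    print(f"{wav:5.0f} um: relative flux {modified_blackbody(wav, 60.0):.3e}")
```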