
    An open-source, stochastic, six-degrees-of-freedom rocket flight simulator, with a probabilistic trajectory analysis approach

    Predicting the flight path of an unguided rocket can help avoid unnecessary risks: steering clear of residential areas or a car park significantly improves launch safety, and an accurate landing-site prediction facilitates recovery. This paper introduces a six-degrees-of-freedom flight simulator for large unguided model rockets that fly to altitudes of up to 13 km and then return to Earth by parachute. The open-source software package assists the user with the design of rockets, and its simulation core models both the rocket flight and the parachute descent in stochastic wind conditions. Uncertainty in the input variables is propagated through the model via a Monte Carlo wrapper, simulating a range of possible flight conditions. The resulting trajectories are captured as a Gaussian process, which supports statistical assessment of the flight in the face of uncertainties such as changes in wind conditions, failure to deploy the parachute, and variations in thrust. This approach also enables concise presentation of such uncertainties via visualisation of trajectory ensembles.
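    A minimal sketch of the Monte Carlo wrapper plus Gaussian-process idea the abstract describes, with a stand-in `simulate_flight` function (hypothetical; the paper's actual simulator is a full 6-DoF model) and assumed input uncertainties:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def simulate_flight(wind_speed, thrust_scale):
    """Hypothetical stand-in for the 6-DoF simulator: returns downrange landing distance (m)."""
    return 900.0 * thrust_scale + 120.0 * wind_speed + rng.normal(0.0, 25.0)

# Monte Carlo wrapper: sample the uncertain inputs and run the simulator.
wind = rng.normal(5.0, 2.0, size=200)        # mean 5 m/s, sd 2 m/s (assumed)
thrust = rng.normal(1.0, 0.05, size=200)     # +/-5% thrust variation (assumed)
landing = np.array([simulate_flight(w, t) for w, t in zip(wind, thrust)])

# Capture the resulting ensemble as a Gaussian process over the inputs.
X = np.column_stack([wind, thrust])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=[2.0, 0.1]), normalize_y=True)
gp.fit(X, landing)
mean, std = gp.predict([[6.0, 1.0]], return_std=True)
print(f"predicted landing: {mean[0]:.0f} m +/- {std[0]:.0f} m")
```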

    A Bayesian spatio-temporal model of panel design data: airborne particle number concentration in Brisbane, Australia

    This paper outlines a methodology for semi-parametric spatio-temporal modelling of data that are dense in time but sparse in space, obtained from a split panel design, the most feasible approach to covering space and time with limited equipment. The data are hourly averaged particle number concentration (PNC) and were collected as part of the Ultrafine Particles from Transport Emissions and Child Health (UPTECH) project. Two weeks of continuous measurements were taken at each of a number of government primary schools in the Brisbane Metropolitan Area, with the monitoring equipment moved from school to school sequentially. The school data are augmented by data from long-term monitoring stations at three locations in Brisbane, Australia. Fitting the model helps describe the spatial and temporal variability at a subset of the UPTECH schools and the long-term monitoring sites. The temporal variation is modelled hierarchically with penalised random walk terms: one common to all sites, and a term accounting for the remaining temporal trend at each site. Parameter estimates and their uncertainty are computed in a computationally efficient approximate Bayesian inference environment, R-INLA. The temporal part of the model explains daily and weekly cycles in PNC at the schools, which can be used to estimate the exposure of school children to ultrafine particles (UFPs) emitted by vehicles. At each school and long-term monitoring site, peaks in PNC can be attributed to the morning and afternoon rush-hour traffic and to new particle formation events. The spatial component of the model describes the school-to-school variation in mean PNC and the variation within each school ground. It is shown how the spatial model can be expanded to identify spatial patterns at the city scale with the inclusion of more spatial locations.
    Comment: Draft of this paper presented at ISBA 2012 as a poster; part of the UPTECH project
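    A minimal sketch of the penalised first-order random-walk (RW1) prior that underlies such INLA-style temporal terms, assuming the standard RW1 structure (the paper's full model adds site-specific terms and a spatial component): the prior penalises squared first differences, giving precision matrix Q = tau * D^T D for the first-difference matrix D.

```python
import numpy as np

def rw1_precision(n, tau=1.0):
    """Precision matrix of a first-order random-walk prior:
    the log-density penalises (tau/2) * sum_t (x[t] - x[t-1])^2,
    i.e. Q = tau * D.T @ D with D the (n-1) x n first-difference matrix."""
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    return tau * D.T @ D

Q = rw1_precision(5, tau=2.0)
print(Q)  # tridiagonal, rank n-1 (an improper prior, as in R-INLA's "rw1" model)
```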

    Synthetic LISA: Simulating Time Delay Interferometry in a Model LISA

    We report on three numerical experiments on the implementation of Time-Delay Interferometry (TDI) for LISA, performed with Synthetic LISA, a C++/Python package that we developed to simulate the LISA science process at the level of scientific and technical requirements. Specifically, we study the laser-noise residuals left by first-generation TDI when the LISA armlengths have a realistic time dependence; we characterize the armlength-measurement accuracies needed for effective laser-noise cancellation in both first- and second-generation TDI; and we estimate the quantization and telemetry bit depth needed for the phase measurements. Synthetic LISA generates synthetic time series of the LISA fundamental noises, as filtered through all the TDI observables; it also provides a streamlined module to compute the TDI responses to gravitational waves according to a full model of TDI, including the motion of the LISA array and the temporal and directional dependence of the armlengths. We discuss the theoretical model that underlies the simulation, its implementation, and its use in future investigations on system characterization and data-analysis prototyping for LISA.
    Comment: 18 pages, 14 EPS figures, REVTeX 4. Accepted PRD version. See http://www.vallis.org/syntheticlisa for information on the Synthetic LISA software package
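    A toy numerical illustration of the principle behind first-generation TDI, under strong simplifications (a static unequal-arm Michelson with integer-sample delays; Synthetic LISA itself handles time-dependent armlengths and fractional delays): combining delayed copies of the two phase-measurement streams cancels the common laser noise exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
p = rng.normal(size=n)    # laser phase noise p(t), unit sampling rate (assumed)
L1, L2 = 17, 23           # one-way light times in samples (toy values)

def delay(x, d):
    """Shift a series by d samples (circular integer delay for simplicity)."""
    return np.roll(x, d)

# Each arm's round-trip measurement compares delayed and local laser noise.
s1 = delay(p, 2 * L1) - p
s2 = delay(p, 2 * L2) - p

# Unequal-arm Michelson combination: the laser noise p cancels identically.
X = (s1 - delay(s1, 2 * L2)) - (s2 - delay(s2, 2 * L1))
print(np.max(np.abs(X)))  # ~1e-16: laser noise cancels to machine precision
```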

    The equivalence of fluctuation scale dependence and autocorrelations

    We define optimal per-particle fluctuation and correlation measures, relate fluctuations and correlations through an integral equation, and show how to invert that equation to obtain precise autocorrelations from fluctuation scale dependence. We test the precision of the inversion with Monte Carlo data and compare autocorrelations to the conditional distributions conventionally used to study high-p_t jet structure.
    Comment: 10 pages, 9 figures; proceedings, MIT workshop on correlations and fluctuations in relativistic nuclear collisions
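    A one-dimensional analogue of the fluctuation-to-autocorrelation inversion, under a standard stationarity assumption (not the paper's exact measures): the variance of window sums, V(m), satisfies 2 C(k) = V(k+1) - 2 V(k) + V(k-1), so second differences of the fluctuation scale dependence recover the autocovariance.

```python
import numpy as np

rng = np.random.default_rng(2)

# AR(1) series with known autocovariance C(k) = phi^k / (1 - phi^2).
phi = 0.6
x = np.zeros(100_000)
for t in range(1, x.size):
    x[t] = phi * x[t - 1] + rng.normal()

# Fluctuation scale dependence: V[m] = variance of sums over non-overlapping
# windows of length m, with V[0] = 0.
M = 12
V = np.zeros(M + 1)
for m in range(1, M + 1):
    sums = np.add.reduceat(x, np.arange(0, x.size, m))[:-1]
    V[m] = sums.var()

# Inversion: 2*C(k) = V(k+1) - 2*V(k) + V(k-1) for k >= 1, and C(0) = V(1).
k = np.arange(1, M)
C_rec = np.concatenate([[V[1]], 0.5 * (V[k + 1] - 2 * V[k] + V[k - 1])])

C_true = phi ** np.arange(M) / (1 - phi**2)
print(np.round(C_rec, 3))   # recovered autocovariance
print(np.round(C_true, 3))  # exact autocovariance
```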

    Retrodiction as a tool for micromaser field measurements

    We use retrodictive quantum theory to describe cavity field measurements by successive atomic detections in the micromaser. We calculate the state of the micromaser cavity field prior to the detection of sequences of atoms in either the excited or the ground state, for atoms initially prepared in the excited state. This provides the probability operator measure (POM) elements that describe such sequences of measurements.
    Comment: 20 pages, 4(8) figures
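    A minimal sketch of the retrodictive logic in finite dimensions, assuming generic POM (POVM) elements for a qubit rather than the paper's micromaser model: given the preparation prior and the POM, Bayes' theorem yields the probability of each prepared state conditioned on the observed outcome.

```python
import numpy as np

# Two possible prepared states of a qubit (density matrices).
ket0 = np.array([1.0, 0.0]); ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
rho = [np.outer(ket0, ket0), np.outer(plus, plus)]
prior = np.array([0.5, 0.5])                 # preparation probabilities (assumed)

# A two-element POM (POVM): an unsharp measurement along z.
E0 = np.diag([0.9, 0.2]); E1 = np.eye(2) - E0

# Predictive probabilities P(outcome j | preparation i) = Tr(E_j rho_i).
P = np.array([[np.trace(E @ r).real for E in (E0, E1)] for r in rho])

# Retrodiction: P(preparation i | outcome j) via Bayes' theorem.
joint = prior[:, None] * P
retro = joint / joint.sum(axis=0)
print(np.round(retro, 3))   # column j = retrodictive distribution given outcome j
```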

    Temporal variability and statistics of the Strehl ratio in adaptive-optics images

    We have investigated the temporal variability and statistics of the "instantaneous" Strehl ratio. The observations were carried out with the 3.63-m AEOS telescope equipped with a high-order adaptive optics system. In this paper the Strehl ratio is defined as the peak intensity of a single short exposure. We have also studied the behaviour of the phase variance computed from the reconstructed wavefronts. We tested the Marechal approximation and used it to explain the observed negative skewness of the Strehl ratio distribution. The estimate of the phase variance is shown to fit a three-parameter Gamma distribution model. We show that simple scaling of the reconstructed wavefronts has a large impact on the shape of the Strehl ratio distribution.
    Comment: submitted to PASP
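    A small sketch of the statistics described here, with assumed parameter values rather than the paper's fitted ones: phase-variance samples are drawn from (and refit with) a three-parameter Gamma model via scipy, then mapped to Strehl ratio through the Marechal approximation S ~ exp(-sigma_phi^2), whose concave form produces the negative skew.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Phase variance sigma_phi^2 (rad^2) as a three-parameter Gamma variate
# (shape, location, scale are assumed values, not the paper's fit).
shape, loc, scale = 4.0, 0.05, 0.03
var_phi = stats.gamma.rvs(shape, loc=loc, scale=scale, size=50_000, random_state=rng)

# Refit the three-parameter Gamma model, as done for the AEOS phase variance.
print(stats.gamma.fit(var_phi))   # recovers (shape, loc, scale) approximately

# Marechal approximation: S = exp(-sigma_phi^2); the concave map skews S negatively.
S = np.exp(-var_phi)
print(stats.skew(S))              # negative
```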

    Towards virtual machine energy-aware cost prediction in clouds

    Pricing mechanisms employed by different service providers significantly influence the role of cloud computing within the IT industry. With the increasing cost of electricity, Cloud providers treat power consumption as one of the major cost factors to be managed within their infrastructures. Consequently, modelling a new pricing mechanism that allows Cloud providers to determine the potential cost of resource usage and power consumption has attracted the attention of many researchers. Furthermore, predicting the future cost of Cloud services can help providers offer services that meet customers' requirements. This paper introduces an Energy-Aware Cost Prediction Framework to estimate the total cost of Virtual Machines (VMs) by considering both resource usage and power consumption. The VMs' workload is first predicted with an Autoregressive Integrated Moving Average (ARIMA) model; the power consumption is then predicted using regression models. The comparison between predicted and actual results obtained in a real Cloud testbed shows that the framework is capable of predicting the workload, power consumption and total cost for different VMs with good accuracy, e.g. 0.06 absolute percentage error for the predicted total cost of the VMs.
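    A minimal sketch of the two-stage pipeline the abstract outlines, on synthetic data with assumed model orders and coefficients (not the paper's configuration): an ARIMA forecast of CPU utilisation feeding a linear regression that maps utilisation to power, then to cost.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)

# Synthetic hourly CPU utilisation (%) with a daily cycle.
t = np.arange(24 * 14)
cpu = 40 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)

# Stage 1: ARIMA workload forecast (order chosen for the toy data).
arima = ARIMA(cpu, order=(2, 0, 1)).fit()
cpu_hat = arima.forecast(steps=24)

# Stage 2: regression mapping utilisation to power (idle/peak watts assumed).
power = 100 + 1.5 * cpu + rng.normal(0, 2, cpu.size)   # training measurements
reg = LinearRegression().fit(cpu.reshape(-1, 1), power)
power_hat = reg.predict(cpu_hat.reshape(-1, 1))

# Energy-aware cost: predicted energy times an assumed electricity price.
price_per_kwh = 0.20
cost = power_hat.sum() / 1000 * price_per_kwh          # 24 one-hour intervals
print(f"predicted next-day energy cost: ${cost:.2f}")
```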

    Gas and dust in the Beta Pictoris Moving Group as seen by the Herschel Space Observatory

    Context. Debris discs are thought to be formed through the collisional grinding of planetesimals, and can be considered an outcome of planet formation. Understanding the properties of gas and dust in debris discs can help us comprehend the architecture of extrasolar planetary systems. Herschel Space Observatory far-infrared (IR) photometry and spectroscopy have provided a valuable dataset for studying the gas and dust composition of debris discs. This paper is part of a series devoted to the study of Herschel PACS observations of young stellar associations. Aims. This work aims at studying the properties of discs in the Beta Pictoris Moving Group (BPMG) through far-IR PACS observations of dust and gas. Methods. We obtained Herschel-PACS far-IR photometric observations at 70, 100 and 160 microns of 19 BPMG members, together with spectroscopic observations of four of them. Spectroscopic observations were centred at 63.18 microns and 157 microns, aiming to detect [OI] and [CII] emission. We incorporated the new far-IR observations into the SEDs of BPMG members and fitted modified blackbody models to better characterise the dust content. Results. We have detected far-IR excess emission toward nine BPMG members, including the first detection of an IR excess toward HD 29391. The star HD 172555 shows [OI] emission, while HD 181296 shows [CII] emission, expanding the short list of debris discs with a gas detection. No debris disc in BPMG is detected in both [OI] and [CII]. The discs show dust temperatures in the range 55 to 264 K, with low dust masses (6.6*10^{-5} M_Earth to 0.2 M_Earth) and blackbody-model radii in the range 3 to 82 AU. All the objects with a gas detection are early spectral type stars with a hot dust component.
    Comment: 12 pages, 7 figures, 6 tables
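    A minimal sketch of a modified-blackbody fit to far-IR photometry, with made-up fluxes and an assumed emissivity law F_nu ~ B_nu(T_d) * (nu/nu0)^beta (the paper's exact parameterisation may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # SI constants

def modified_blackbody(wl_um, T, scale, beta=1.0):
    """Flux model: scaled Planck function times a (nu/nu0)^beta emissivity,
    with nu0 fixed at 100 microns (assumed)."""
    nu = c / (wl_um * 1e-6)
    nu0 = c / 100e-6
    B = 2 * h * nu**3 / c**2 / np.expm1(h * nu / (k * T))
    return scale * B * (nu / nu0) ** beta

# PACS wavelengths (microns) and made-up excess fluxes (Jy) with errors.
wl = np.array([70.0, 100.0, 160.0])
flux = np.array([0.55, 0.48, 0.30])
err = np.array([0.03, 0.03, 0.02])

popt, pcov = curve_fit(modified_blackbody, wl, flux, p0=[60.0, 1e13], sigma=err)
T_d, scale = popt
print(f"dust temperature: {T_d:.0f} K")
```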

    Ensemble Sales Forecasting Study in Semiconductor Industry

    Sales forecasting plays a prominent role in business planning and business strategy. The value of advance information is a cornerstone of planning activity, and a well-set forecast goal can guide the sales force more efficiently. In this paper, CPU sales forecasting at Intel Corporation, a multinational semiconductor company, is considered. Past sales, future bookings, exchange rates, gross domestic product (GDP) forecasts, seasonality and other indicators were incorporated into the quantitative modelling. Benefiting from recent advances in computational power and software, millions of models built upon multiple regression, time series analysis, random forests and boosted trees were executed in parallel, and the models with the smallest validation errors were selected to form the ensemble model. To better capture distinct characteristics, forecasting models were implemented at the lead-time and line-of-business level. A moving-window validation process automatically selected the models that most closely represent current market conditions, and the weekly forecasting cadence allowed the model to respond effectively to market fluctuations. A generic variable importance analysis was also developed to increase model interpretability: rather than assuming a fixed distribution, this non-parametric permutation variable importance analysis provides a framework, applicable across methods, for evaluating variable importance. The framework extends to classification problems by replacing the mean absolute percentage error (MAPE) with the misclassification error. Demo code: https://github.com/qx0731/ensemble_forecast_methods
    Comment: 14 pages, Industrial Conference on Data Mining 2017 (ICDM 2017)
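    A minimal sketch of permutation variable importance scored by MAPE, on synthetic data with a generic scikit-learn regressor (the feature names and model are stand-ins, not the paper's): each feature is shuffled in turn, and the increase in validation MAPE measures its importance.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)

def mape(y_true, y_pred):
    return np.mean(np.abs((y_true - y_pred) / y_true))

# Synthetic features: e.g. past sales, bookings, and an irrelevant indicator.
n = 2000
X = rng.normal(size=(n, 3))
y = 50 + 5 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 1, n)   # feature 2 is noise

X_tr, X_val, y_tr, y_val = X[:1500], X[1500:], y[:1500], y[1500:]
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
base = mape(y_val, model.predict(X_val))

# Permutation importance: shuffle one column at a time, measure the MAPE increase.
for j, name in enumerate(["past_sales", "bookings", "noise"]):
    Xp = X_val.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    print(f"{name}: +{mape(y_val, model.predict(Xp)) - base:.4f} MAPE")
```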

    Assessment of Climate Variability of the Greenland Ice Sheet: Integration of In Situ and Satellite Data

    The proposed research involves the application of multispectral satellite data in combination with ground-truth measurements to monitor surface properties of the Greenland Ice Sheet that are essential for describing the energy and mass balance of the ice sheet. Several key components of the energy balance are parameterized using satellite data and in situ measurements. The analysis covers a 6- to 17-year period in order to analyze the seasonal and interannual variations of the surface processes and the climatology. Our goal was to investigate to what accuracy, and over what geographic areas, large-scale snow properties and radiative fluxes can be derived from a combination of available remote sensing and meteorological data sets. To understand the surface processes, a field program was designed to collect information on spectral albedo, specular reflectance, soot content, grain size and the physical properties of different snow types. Furthermore, the radiative and turbulent fluxes at the ice/snow surface were monitored for the parameterization and interpretation of the satellite data. Highlights include AVHRR time series and surface-based radiation measurements, passive microwave time series, and geodetic results from the ETH/CU camp.