
    Modern CACSD using the Robust-Control Toolbox

    Get PDF
    The Robust-Control Toolbox is a collection of 40 M-files which extend the capability of PC/PRO-MATLAB to do modern multivariable robust control system design. Included are robust analysis tools such as singular values and structured singular values, robust synthesis tools such as continuous/discrete H2/H-infinity synthesis and Linear Quadratic Gaussian Loop Transfer Recovery methods, and a variety of robust model reduction tools such as Hankel approximation, balanced truncation and balanced stochastic truncation. The capabilities of the toolbox are described and illustrated with examples to show how easily they can be used in practice. Examples include structured singular value analysis, H-infinity loop-shaping and large space structure model reduction
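    The core of the singular-value analysis the toolbox automates can be sketched outside MATLAB. The snippet below, a minimal NumPy sketch with a hypothetical 2-input, 2-output state-space system (the matrices are illustrative, not from the toolbox's examples), computes the largest and smallest singular values of the frequency response G(jw) = C (jwI - A)^-1 B over a frequency grid:

```python
import numpy as np

# Hypothetical stable 2x2 example system (not from the toolbox documentation).
A = np.array([[-1.0, 0.5], [0.0, -2.0]])
B = np.eye(2)
C = np.eye(2)

def sigma_plot_data(A, B, C, freqs):
    """Largest/smallest singular values of G(jw) = C (jwI - A)^-1 B."""
    sv_max, sv_min = [], []
    n = A.shape[0]
    for w in freqs:
        G = C @ np.linalg.inv(1j * w * np.eye(n) - A) @ B
        s = np.linalg.svd(G, compute_uv=False)  # descending order
        sv_max.append(s[0])
        sv_min.append(s[-1])
    return np.array(sv_max), np.array(sv_min)

freqs = np.logspace(-2, 2, 200)
smax, smin = sigma_plot_data(A, B, C, freqs)
```

Plotting smax and smin against freqs on log axes reproduces the classic sigma plot used for multivariable loop-shaping.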

    Algorithms for computing the multivariable stability margin

    Get PDF
    Stability margin for multiloop flight control systems has become a critical issue, especially in highly maneuverable aircraft designs where there are inherent strong cross-couplings between the various feedback control loops. To cope with this issue, we have developed computer algorithms, based on non-differentiable optimization theory, for computing the Multivariable Stability Margin (MSM). The MSM of a dynamical system is the size of the smallest structured perturbation in component dynamics that will destabilize the system. These algorithms have been coded and appear to be reliable. As illustrated by examples, they provide the basis for evaluating the robustness and performance of flight control systems
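    The structured MSM of the paper requires nonsmooth optimization, but its simpler unstructured analogue, the distance to instability, conveys the idea: the smallest-norm perturbation E such that A + E has an imaginary-axis eigenvalue is min over real w of the smallest singular value of A - jwI. A crude grid-search sketch (example matrix and grid bounds assumed, not from the paper):

```python
import numpy as np

def distance_to_instability(A, wmax=10.0, n=2001):
    """Grid-search estimate of beta(A) = min_w sigma_min(A - j*w*I).

    This is the *unstructured* stability margin; the structured MSM
    restricts the perturbation pattern and needs real optimization.
    """
    ws = np.linspace(-wmax, wmax, n)
    I = np.eye(A.shape[0])
    return min(np.linalg.svd(A - 1j * w * I, compute_uv=False)[-1]
               for w in ws)

A = np.array([[-0.5, 1.0], [0.0, -0.5]])   # stable example system (assumed)
beta = distance_to_instability(A)          # size of smallest destabilizer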

    Implement Innovative Proactive-Service-Center to Enhance Service Performance in Customer Site

    Get PDF
    How to more efficiently and effectively enhance service performance is an ongoing challenge for any service team. A new service model, the Proactive-Service-Center (P-S-C), has been in place since February 2001. As a result, service performance has improved in terms of same-day issue closure rate, issue response time and system uptime. The P-S-C is not only an innovative way to enhance service performance but also a systematic way to build up engineers' trouble-shooting capability. In the old service situation, information flowed randomly and inefficiently among customer engineers, customer section managers, AMT engineers, and AMT site managers; the average time from an issue occurring to a dedicated engineer arriving on site was around two hours. Customer and AMT engineers scrambled without direction, and detailed solutions were not documented but remained in the engineers' heads. In the PSC model, information flows automatically from the down system to the customer server and then to the PSC server, and the dedicated AMT engineer is automatically informed; the information flows efficiently and effectively. The actions taken are then systematically entered and documented in PSC, and a monthly analysis report is generated and provided to customers. The cost savings of PSC can be estimated indirectly. The issue response time is reduced from 2 hours to 58 minutes (a 62-minute reduction), which translates into 62 minutes of additional system uptime per issue and hence cost savings. In addition, the issue closure rate is improved from 50% to 85% (a gain of 35 percentage points), which also increases system uptime and yields cost savings. Three areas needing improvement were identified: Technology, People and Process. Technology: the Internet is used, and many automated functions (for example, reminding and reporting) are built in to enhance capability. People: a Dedicated Engineer Matrix is built into PSC so that the right engineer can be reached at the right time in the most efficient manner. Process: the process is real-time and systematic, with clear roles and responsibilities (R&R) defined to ensure efficiency and experience build-up in the future
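    The indirect savings estimate above is simple arithmetic; a quick sketch makes the units explicit (the monthly issue volume is an assumption for illustration, not a figure from the paper):

```python
# Figures quoted in the abstract.
old_response_min, new_response_min = 120, 58
response_saved = old_response_min - new_response_min        # minutes per issue
old_closure, new_closure = 0.50, 0.85
closure_gain_pts = (new_closure - old_closure) * 100        # percentage points

# Hypothetical monthly issue volume (assumed, not from the paper).
issues_per_month = 100
uptime_gain_hours = issues_per_month * response_saved / 60  # hours per month
```

With 100 issues a month, the 62-minute faster response alone recovers roughly 103 hours of system uptime per month.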

    Tidal Evolution of a Secularly Interacting Planetary System

    Full text link
    In a multi-planet system, a gradual change in one planet's semi-major axis will affect the eccentricities of all the planets, as angular momentum is distributed via secular interactions. If tidal dissipation in the planet is the cause of the change in semi-major axis, it also damps that planet's eccentricity, which in turn contributes to the evolution of all the eccentricities. Formulae quantifying the combined effects on the whole system of semi-major axis changes and eccentricity damping are derived here for a two-planet system. The CoRoT 7 system is considered as an example.

    Comment: Accepted for publication in the Astrophysical Journal; 17 pages, including 1 figure
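    The qualitative effect described above, damping applied to one planet drains eccentricity from both, can be illustrated with a toy Laplace-Lagrange secular system. The coupling matrix, damping timescale and initial conditions below are illustrative placeholders, not the paper's derived formulae:

```python
import numpy as np

# Complex eccentricities z_k = e_k * exp(i * varpi_k); toy secular system
#   dz/dt = i A z - (1/tau) * [z1, 0]
# with tidal damping on the inner planet only. All coefficients assumed.
A = np.array([[2.0, -1.0],
              [-0.5, 1.0]])       # secular coupling matrix [rad / time unit]
tau = 50.0                        # assumed inner-planet tidal damping timescale

M = 1j * A.astype(complex)
M[0, 0] += -1.0 / tau             # damping enters only the inner planet's row

evals, V = np.linalg.eig(M)       # solve the linear system exactly

def z_at(t, z0):
    c = np.linalg.solve(V, np.asarray(z0, dtype=complex))
    return V @ (np.exp(evals * t) * c)

z0 = [0.1, 0.1]                   # initial eccentricities, aligned apsides
z_final = z_at(200.0, z0)
```

Although only z1 is damped directly, both eigenmodes of M acquire negative real parts, so the outer planet's eccentricity |z2| decays as well, which is the coupling effect the paper quantifies.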

    Deep Learning based Automated Forest Health Diagnosis from Aerial Images

    Get PDF
    Global climate change has had a drastic impact on our environment. Previous studies showed that pest disasters arising from global climate change may kill a tremendous number of trees, which inevitably become a factor in forest fires. An important portent of forest fire is the condition of the forest, and aerial image-based forest analysis can give early detection of dead trees among living trees. In this paper, we apply a synthetic method to enlarge the imagery dataset and present a new framework for automated dead tree detection from aerial images using a re-trained Mask R-CNN (Mask Region-based Convolutional Neural Network) approach with a transfer learning scheme. We apply our framework to our aerial imagery datasets and compare eight fine-tuned models. The mean average precision (mAP) score for the best of these models reaches 54%. Following the automated detection, we are able to automatically produce and count dead-tree masks labelling the dead trees in an image, as an indicator of forest health that could be linked to the causal analysis of environmental changes and the predictive likelihood of forest fire
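    The final counting step, turning instance masks into a health indicator, is straightforward post-processing. The sketch below assumes a Mask R-CNN-style output dictionary with per-instance scores and binary masks; the field names and scores are hypothetical, not the paper's interface:

```python
import numpy as np

# Mock detector output: one confidence score and one binary mask per instance.
# Field names mimic common Mask R-CNN conventions (assumed, not the paper's).
detections = {
    "scores": np.array([0.95, 0.88, 0.42, 0.76]),
    "masks":  np.zeros((4, 128, 128), dtype=bool),
}

def count_dead_trees(detections, score_thresh=0.5):
    """Count dead-tree instances whose confidence clears the threshold."""
    keep = detections["scores"] >= score_thresh
    return int(keep.sum())

n_dead = count_dead_trees(detections)   # per-image dead-tree count
```

The per-image count (here, the low-confidence 0.42 detection is dropped) can then be aggregated over a survey area as the forest-health indicator described above.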

    Persistent termini of 2004- and 2005-like ruptures of the Sunda megathrust

    Get PDF
    To gain insight into the longevity of subduction zone segmentation, we use coral microatolls to examine an 1100-year record of large earthquakes across the boundary of the great 2004 and 2005 Sunda megathrust ruptures. Simeulue, a 100-km-long island off the west coast of northern Sumatra, Indonesia, straddles this boundary: northern Simeulue was uplifted in the 2004 earthquake, whereas southern Simeulue rose in 2005. Northern Simeulue corals reveal that predecessors of the 2004 earthquake occurred in the 10th century AD, in AD 1394 ± 2, and in AD 1450 ± 3. Corals from southern Simeulue indicate that none of the major uplifts inferred on northern Simeulue in the past 1100 years extended to southern Simeulue. The two largest uplifts recognized at a south-central Simeulue site—around AD 1422 and in 2005—involved little or no uplift of northern Simeulue. The distribution of uplift and strong shaking during a historical earthquake in 1861 suggests the 1861 rupture area was also restricted to south of central Simeulue, as in 2005. The strikingly different histories of the two adjacent patches demonstrate that this boundary has persisted as an impediment to rupture through at least seven earthquakes in the past 1100 years. This implies that the rupture lengths, and hence sizes, of at least some future great earthquakes and tsunamis can be forecast. These microatolls also provide insight into megathrust behavior between earthquakes, revealing sudden and substantial changes in interseismic strain accumulation rates

    Generating Gravitational Waves After Inflation

    Full text link
    I review two mechanisms by which gravitational waves can be generated at the end of inflation: preheating, and gravitons Hawking radiated during the decay of very small primordial black holes. These mechanisms are contrasted with the gravitational waves produced during inflation, and may provide a window into the physical processes that govern the end of the inflationary phase.

    Comment: Conference proceedings

    Interface Contracts for Workflow+ Models: an Analysis of Uncertainty across Models

    Get PDF
    Workflow models are used to rigorously specify and reason about diverse types of processes. The Workflow+ (WF+) framework has been developed to support unified modelling of the control and data in processes that can be used to derive assurance cases that support certification. However, WF+ is limited in its support for precise contracts on workflow models, which can enable powerful forms of static analysis and reasoning. In this paper, we propose a mechanism for adding interface contracts to WF+ models, which can thereafter be applied to tracing and reasoning about the uncertainty that arises when combining heterogeneous models. We specifically explore this in terms of design models and assurance case models. We argue that some of the key issues in managing some types of uncertainty can be partly addressed by use of interface contracts
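    The general idea of an interface contract, a workflow step that checks assumptions on its inputs and guarantees on its outputs, can be sketched in executable form. WF+ itself is a modelling notation, so the decorator, step name and conditions below are purely illustrative, not the paper's syntax:

```python
def contract(pre, post):
    """Attach pre/postconditions to a workflow step at its interface."""
    def wrap(step):
        def run(data):
            assert pre(data), "precondition violated at workflow interface"
            out = step(data)
            assert post(out), "postcondition violated at workflow interface"
            return out
        return run
    return wrap

@contract(pre=lambda d: "samples" in d and len(d["samples"]) > 0,
          post=lambda out: 0.0 <= out["confidence"] <= 1.0)
def analysis_step(data):
    # Hypothetical step: confidence grows with the number of samples seen.
    return {"confidence": min(1.0, len(data["samples"]) / 10.0)}

result = analysis_step({"samples": [1, 2, 3]})
```

A violated precondition fails at the interface rather than deep inside the step, which is what makes contract boundaries useful for tracing uncertainty across composed heterogeneous models.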

    Tolerance Specification of Robot Kinematic Parameters Using an Experimental Design Technique

    Get PDF
    This paper presents the tolerance specification of robot kinematic parameters using the Taguchi method. The concept of employing inner and outer orthogonal arrays to identify the significant parameters and select the optimal tolerance range for each parameter is proposed. The performance measure based on signal-to-noise ratios (S/N) using the Taguchi method is validated by Monte Carlo simulations. Finally, a step-by-step tolerance specification methodology is developed and illustrated with a planar two-link manipulator and a five-degree-of-freedom Rhino robot
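    The smaller-the-better S/N ratio and its Monte Carlo validation can be sketched for the planar two-link arm mentioned above. The nominal dimensions, joint angles and tolerance half-widths are assumed values for illustration, not the paper's design data:

```python
import numpy as np

rng = np.random.default_rng(0)

def sn_smaller_the_better(y):
    """Taguchi smaller-the-better S/N ratio: -10 * log10(mean(y^2))."""
    y = np.asarray(y)
    return -10.0 * np.log10(np.mean(y ** 2))

# Planar two-link forward kinematics.
def fk(l1, l2, a1, a2):
    return np.array([l1 * np.cos(a1) + l2 * np.cos(a1 + a2),
                     l1 * np.sin(a1) + l2 * np.sin(a1 + a2)])

L1, L2 = 0.5, 0.4               # nominal link lengths [m] (assumed)
q1, q2 = 0.3, 0.7               # nominal joint angles [rad] (assumed)
tol_len, tol_ang = 1e-3, 1e-3   # candidate tolerance half-widths (assumed)

nominal = fk(L1, L2, q1, q2)
errs = []
for _ in range(2000):           # Monte Carlo draw over the tolerance box
    p = fk(L1 + rng.uniform(-tol_len, tol_len),
           L2 + rng.uniform(-tol_len, tol_len),
           q1 + rng.uniform(-tol_ang, tol_ang),
           q2 + rng.uniform(-tol_ang, tol_ang))
    errs.append(np.linalg.norm(p - nominal))  # end-effector position error

sn = sn_smaller_the_better(errs)
```

In a Taguchi study this S/N value would be computed for each row of the inner/outer orthogonal arrays, and the tolerance ranges maximizing S/N would be selected; the Monte Carlo loop plays the role of the validation the paper performs.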

    Bose-Einstein condensation in an optical lattice: A perturbation approach

    Full text link
    We derive closed analytical expressions for the order parameter Φ(x) and for the chemical potential μ of a Bose-Einstein condensate loaded into a harmonically confined, one-dimensional optical lattice, for sufficiently weak, repulsive or attractive interaction, and not too strong laser intensities. Our results are compared with exact numerical calculations in order to map out the range of validity of the perturbative analytical approach. We identify parameter values where the optical lattice compensates the interaction-induced nonlinearity, such that the condensate ground state coincides with a simple, single-particle harmonic oscillator wave function
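    The flavour of such a perturbative treatment can be seen in its simplest limit: the first-order shift of the harmonic-oscillator ground state by a weak lattice V(x) = V0 sin^2(kx), in oscillator units (ħ = m = ω = 1). The values of V0 and k are illustrative, not the paper's parameters, and the interaction term is omitted here:

```python
import numpy as np

V0, k = 0.1, 2.0                       # weak lattice parameters (assumed)

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
psi0 = np.pi ** -0.25 * np.exp(-x ** 2 / 2)   # HO ground-state wave function

# First-order shift <psi0| V |psi0>, evaluated on the grid.
shift_numeric = np.sum(psi0 ** 2 * V0 * np.sin(k * x) ** 2) * dx

# Same average in closed form: for a Gaussian with <x^2> = 1/2,
#   <sin^2(kx)> = (1 - exp(-2 k^2 <x^2>)) / 2 = (1 - exp(-k^2)) / 2.
shift_analytic = V0 * (1.0 - np.exp(-k ** 2)) / 2.0
```

The agreement between the grid integral and the closed-form Gaussian average mirrors, in miniature, the paper's comparison of perturbative expressions against exact numerics.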