
    Time scales of mesoscale variability and their relationship with space scales in the North Atlantic

    A systematic study of characteristic time scales of mesoscale variability over the North Atlantic was carried out using two years of Geosat data. Time scales are first characterized in 10° latitude by 10° longitude bins. A more detailed description was obtained by globally mapping the temporal correlation of the Sea Level Anomaly after one cycle (17.05 days). The scales are shortest in areas of high mesoscale activity (Gulf Stream, North Atlantic Current), while relatively long time scales are observed over the Mid-Atlantic Ridge and in the eastern part of the basin. In general, time scales are not proportional to space scales. Propagation velocities, obtained by dividing space scales by time scales, appear to be minimal east of the Mid-Atlantic Ridge. Frequency-wavenumber spectral analysis complements this statistical description of mesoscale variability. It shows that the dominant wavelengths of around 200 to 500 km (depending on latitude) are associated with long periods (>150 days) in the eastern part of the basin, while near the Gulf Stream significant energy is found at shorter periods. Propagation velocities are generally westward. Pseudo-dispersion relations deduced from Geosat data suggest two distinct dynamic regimes, as in quasigeostrophic turbulence models: a turbulent regime at smaller scales, with proportionality between space and time scales, and an apparently more linear regime in the eastern part of the basin, where an inverse dispersion relation is found. This latter characteristic is in agreement with quasigeostrophic models forced by fluctuating winds.
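
    The abstract's diagnostics lend themselves to a compact illustration. The sketch below, a minimal Python example assuming a gridded Sea Level Anomaly array `sla` of shape (time, lat, lon) sampled once per 17.05-day repeat cycle, computes the one-cycle temporal correlation map and the crude space-scale/time-scale velocity estimate; the array and its sampling are assumptions, not the study's data.

    ```python
    import numpy as np

    def lag1_correlation(sla):
        """Temporal correlation of SLA after one repeat cycle, per grid point."""
        a = sla[:-1] - sla[:-1].mean(axis=0)   # anomalies at cycle t
        b = sla[1:] - sla[1:].mean(axis=0)     # anomalies at cycle t+1
        num = (a * b).sum(axis=0)
        den = np.sqrt((a ** 2).sum(axis=0) * (b ** 2).sum(axis=0))
        return num / den                        # shape (lat, lon)

    def propagation_speed(space_scale_km, time_scale_days):
        """Crude propagation velocity: space scale divided by time scale."""
        return space_scale_km / time_scale_days  # km per day

    # Example: a 300 km space scale with a 30-day time scale gives 10 km/day.
    ```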

    Trivial compiler equivalence: A large scale empirical study of a simple, fast and effective equivalent mutant detection technique

    Identifying equivalent mutants remains the largest impediment to the widespread uptake of mutation testing. Despite being researched for more than three decades, the problem remains. We propose Trivial Compiler Equivalence (TCE), a technique that exploits readily available compiler technology to address this long-standing challenge. TCE is directly applicable to real-world programs and can imbue existing tools with the ability to detect equivalent mutants and a special form of useless mutants called duplicated mutants. We present a thorough empirical study using 6 large open source programs, several orders of magnitude larger than those used in previous work, and 18 benchmark programs with hand-analyzed equivalent mutants. Our results reveal that, on large real-world programs, TCE can discard more than 7% and 21% of all the mutants as being equivalent and duplicated mutants, respectively. A human-based equivalence verification reveals that TCE has the ability to detect approximately 30% of all the existing equivalent mutants.
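
    The abstract gives no implementation detail, but the core TCE idea admits a very small sketch: compile the original program and each mutant with an optimizing compiler and declare a mutant trivially equivalent when the resulting binaries are byte-identical. The compiler, flags and helper names below are illustrative assumptions, not the paper's tooling.

    ```python
    import hashlib
    import os
    import subprocess
    import tempfile

    def compile_to_object(src_path, out_path, cc="gcc", opt="-O3"):
        """Compile a C file to an object file; raises on compiler error."""
        subprocess.run([cc, opt, "-c", src_path, "-o", out_path], check=True)

    def binary_digest(path):
        """Digest of the compiled binary, used for byte-level comparison."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def is_tce_equivalent(original_c, mutant_c):
        """True if the original and the mutant compile to identical code."""
        with tempfile.TemporaryDirectory() as tmp:
            o1 = os.path.join(tmp, "orig.o")
            o2 = os.path.join(tmp, "mut.o")
            compile_to_object(original_c, o1)
            compile_to_object(mutant_c, o2)
            return binary_digest(o1) == binary_digest(o2)
    ```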

    Detecting Trivial Mutant Equivalences via Compiler Optimisations

    Mutation testing realises the idea of fault-based testing, i.e., using artificial defects to guide the testing process. It is used to evaluate the adequacy of test suites and to guide test case generation. It is a potentially powerful form of testing, but it is well known that its effectiveness is inhibited by the presence of equivalent mutants. We recently studied Trivial Compiler Equivalence (TCE) as a simple, fast and readily applicable technique for identifying equivalent mutants for C programs. In the present work, we augment our findings with further results for the Java programming language. TCE can remove a large portion of all mutants because they are determined to be either equivalent to the original program or duplicates of other mutants. In particular, TCE equivalent mutants account for 7.4% of all C mutants and 5.7% of all Java mutants, while duplicated mutants account for a further 21% of all C mutants and 5.4% of all Java mutants, on average. With respect to a benchmark ground-truth suite (of known equivalent mutants), approximately 30% (for C) and 54% (for Java) are TCE equivalent. It is unsurprising that results differ between languages, since mutation characteristics are language-dependent. In the case of Java, our new results suggest that TCE may be particularly effective, finding almost half of all equivalent mutants.
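
    As a companion to the equivalence check sketched above, the hedged snippet below illustrates how duplicated mutants (mutants identical to one another, though not to the original) could be grouped by binary digest so that only one representative per group is ever executed; the input layout is an assumption.

    ```python
    from collections import defaultdict

    def group_duplicates(mutant_digests):
        """Map a binary digest to the mutant ids that share it.

        `mutant_digests` is assumed to map mutant id -> digest of its
        compiled binary (e.g. from the sketch in the previous entry).
        """
        groups = defaultdict(list)
        for mutant_id, digest in mutant_digests.items():
            groups[digest].append(mutant_id)
        # Groups with more than one member contain duplicated mutants.
        return {d: ids for d, ids in groups.items() if len(ids) > 1}
    ```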

    High resolution 3-D temperature and salinity fields derived from in situ and satellite observations

    This paper describes an observation-based approach that efficiently combines the main components of the global ocean observing system using statistical methods. Accurate but sparse in situ temperature and salinity profiles (mainly from Argo for the last 10 years) are merged with lower-accuracy but high-resolution synthetic data derived from satellite altimeter and sea surface temperature observations to provide global 3-D temperature and salinity fields at high temporal and spatial resolution. The first step of the method consists in deriving synthetic temperature fields from altimeter and sea surface temperature observations, and salinity fields from altimeter observations, through multiple/simple linear regression methods. The second step consists in combining the synthetic fields with in situ temperature and salinity profiles using an optimal interpolation method. Results show the revolutionary nature of the Argo observing system: Argo observations now allow a global description of the statistical relationships between surface and subsurface fields needed for step 1 of the method, and can constrain the large-scale temperature and, especially, salinity fields during step 2. Compared to the use of climatological estimates, results indicate that up to 50% of the variance of the temperature fields can be reconstructed from altimeter and sea surface temperature observations with a statistical method. For salinity, only about 20 to 30% of the signal can be reconstructed from altimeter observations, making the in situ observing system essential for salinity estimates. The in situ observations (step 2 of the method) further reduce the differences between the gridded products and the observations, by up to 20% for the temperature field in the mixed layer; the main contribution is for salinity in the near-surface layer, with an improvement of up to 30%. Compared to estimates derived using in situ observations only, the merged fields provide a better reconstruction of the high-resolution temperature and salinity fields. This also holds for the large-scale and low-frequency fields, thanks to a better reduction of the aliasing due to mesoscale variability. The contribution of the merged fields is then illustrated by qualitatively describing temperature variability patterns for the period from 1993 to 2009.
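
    Step 1 of the method can be illustrated with a minimal regression sketch. It assumes co-located training samples of sea level anomaly `sla`, sea surface temperature anomaly `sst` and subsurface temperature anomaly `t_sub` at one depth level; the regression form and names are illustrative, not the paper's exact formulation.

    ```python
    import numpy as np

    def fit_synthetic_temperature(sla, sst, t_sub):
        """Least-squares fit of T'(z) ~ a*SLA + b*SST' + c."""
        X = np.column_stack([sla, sst, np.ones_like(sla)])
        coeffs, *_ = np.linalg.lstsq(X, t_sub, rcond=None)
        return coeffs  # (a, b, c)

    def predict_synthetic_temperature(coeffs, sla, sst):
        """Synthetic temperature anomaly from satellite observables."""
        a, b, c = coeffs
        return a * sla + b * sst + c

    # For salinity, the same machinery with SLA as the sole predictor mirrors
    # the simple linear regression described in the abstract.
    ```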

    Contribution of future wide-swath altimetry missions to ocean analysis and forecasting

    The impact of forthcoming wide-swath altimetry missions on the ocean analysis and forecasting system was investigated by means of OSSEs (observing system simulation experiments). These experiments were performed with a regional data assimilation system, implemented in the Iberian–Biscay–Ireland (IBI) region at 1/12° resolution, using simulated observations derived from a fully eddy-resolving free simulation at 1/36° resolution over the same region. The objective of the experiments was to assess the ability of different satellite constellations to constrain the ocean analyses and forecasts, considering both along-track altimeters and future wide-swath missions; consequently, the capability of the data assimilation techniques used in the Mercator Ocean operational system to effectively combine the different kinds of measurements was also investigated. These assessments were carried out as part of a European Space Agency (ESA) study on the potential role of wide-swath altimetry in future versions of the European Union Copernicus programme. The impact of future wide-swath altimetry data on the reliability of sea level estimates is evident in the OSSEs. The most significant results were obtained when looking at the sensitivity of the system to wide-swath instrumental error: considering a constellation of three nadir and two accurate (small instrumental error) wide-swath altimeters, the error in ocean analysis was reduced by up to 50% compared to conventional altimeters. Investigating the impact of the revisit frequency of the future measurements, the results showed that two wide-swath missions had a major impact on sea-level forecasting, increasing the accuracy over the entire time window of the 5-day forecasts, compared with a single wide-swath instrument. A spectral analysis underlined that the contributions of wide-swath altimetry data observed in ocean analyses and forecast statistics were mainly due to the more accurate resolution, compared with along-track data, of ocean variability at spatial scales smaller than 100 km. Considering the ocean currents, the results confirmed that the information provided by wide-swath measurements at the surface is propagated down the water column and has a considerable impact (30%) on ocean currents (up to a depth of 300 m), compared with the present constellation of altimeters. The ocean analysis and forecasting systems used here are those currently used by the Copernicus Marine Environment Monitoring Service (CMEMS) to provide operational services and ocean reanalysis. The results obtained in the OSSEs considering along-track altimeters were consistent with those derived from real data (observing system experiments, OSEs). OSSEs can also be used to assess the potential of new observing systems, and in this study the results showed that future constellations of altimeters will have a major impact on constraining the CMEMS ocean analysis and forecasting systems and their applications.
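
    As a hedged illustration of how impact figures such as the 50% error reduction could be computed in an OSSE, the sketch below scores each assimilation experiment against the nature run (the 1/36° truth simulation) and reports the relative RMSE reduction of a new constellation versus a reference; all names are assumptions.

    ```python
    import numpy as np

    def rmse(analysis, truth):
        """Root-mean-square error of an analysis against the nature run."""
        return np.sqrt(np.mean((analysis - truth) ** 2))

    def error_reduction(analysis_new, analysis_ref, truth):
        """Percent reduction in RMSE of the new experiment vs the reference."""
        e_new = rmse(analysis_new, truth)
        e_ref = rmse(analysis_ref, truth)
        return 100.0 * (e_ref - e_new) / e_ref
    ```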

    The Importance of Accounting for Real-World Labelling When Predicting Software Vulnerabilities

    Previous work on vulnerability prediction assumes that predictive models are trained with perfect labelling information (i.e., labels that include knowledge of future, as yet undiscovered vulnerabilities). In this paper we present results from a comprehensive empirical study of 1,898 real-world vulnerabilities reported in 74 releases of three security-critical open source systems (Linux Kernel, OpenSSL and Wireshark). Our study investigates the effectiveness of three previously proposed vulnerability prediction approaches in two settings: with and without the unrealistic labelling assumption. The results reveal that the unrealistic labelling assumption can profoundly mislead the scientific conclusions drawn: seemingly highly effective and deployable prediction results vanish when we fully account for realistically available labelling in the experimental methodology. More precisely, mean MCC values of predictive effectiveness drop from 0.77, 0.65 and 0.43 to 0.08, 0.22 and 0.10 for Linux Kernel, OpenSSL and Wireshark, respectively. Similar results are obtained for precision, recall and other assessments of predictive efficacy. The community therefore needs to upgrade the experimental and empirical methodology for vulnerability prediction evaluation and development to ensure robust and actionable scientific findings.
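
    A minimal sketch of the realistic-labelling setting and of the MCC metric quoted above; the labelling rule and names below are illustrative assumptions, not the paper's exact procedure.

    ```python
    from datetime import date

    def realistic_label(report_dates, cutoff):
        """1 if any vulnerability report for the component predates the
        training cutoff, else 0. Under the unrealistic 'perfect' labelling,
        future reports would also count, inflating apparent effectiveness."""
        return int(any(reported <= cutoff for reported in report_dates))

    def mcc(tp, fp, tn, fn):
        """Matthews correlation coefficient from a confusion matrix."""
        num = tp * tn - fp * fn
        den = ((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)) ** 0.5
        return num / den if den else 0.0

    # A component whose only report arrives in 2018 is labelled clean when
    # training with a 2016 cutoff:
    print(realistic_label([date(2018, 6, 9)], date(2016, 1, 1)))  # -> 0
    ```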

    Design by Contract to Improve Software Vigilance


    The Oceanic Variability Spectrum and Transport Trends

    Get PDF
    Oceanic meridional transports evaluated over the width of the Pacific Ocean from altimetric observations become incoherent surprisingly rapidly with meridional separation. Even with 15 years of data, surface slopes show no significant coherence beyond 5° of latitude separation at any frequency. An analysis of the frequency/zonal-wavenumber spectral density shows a broad continuum of motions at all time and space scales, with a significant excess of energy along a “non-dispersive” line extending between the simple barotropic and first baroclinic mode Rossby waves. It is speculated that much of that excess energy lies with coupled barotropic and first-mode Rossby waves. The statistical significance of apparent oceanic transport trends depends upon the existence of a reliable frequency/wavenumber spectrum, for which only a few observational elements now exist. Funding: Jet Propulsion Laboratory (U.S.); United States National Aeronautics and Space Administration (Jason-1 program); National Oceanographic Partnership Program (U.S.).
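
    As a hedged sketch of the frequency/zonal-wavenumber analysis mentioned above, the snippet below assumes a time-longitude (Hovmöller) section `ssh` of sea surface height along one latitude; real analyses would also detrend, window and segment-average, which is omitted here.

    ```python
    import numpy as np

    def freq_wavenumber_spectrum(ssh, dt_days, dx_km):
        """2-D power spectral density in frequency and zonal wavenumber.

        `ssh` has shape (time, lon); `dt_days` and `dx_km` are the sampling
        intervals. Returns frequencies (cycles/day), wavenumbers (cycles/km)
        and the shifted spectral density.
        """
        nt, nx = ssh.shape
        spec = np.fft.fftshift(np.abs(np.fft.fft2(ssh)) ** 2) / (nt * nx)
        freqs = np.fft.fftshift(np.fft.fftfreq(nt, d=dt_days))
        wavenums = np.fft.fftshift(np.fft.fftfreq(nx, d=dx_km))
        return freqs, wavenums, spec

    # Propagation direction shows up as an asymmetry between the positive-
    # and negative-wavenumber halves of the spectrum at a given frequency.
    ```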