
    SPATIAL OUTLIERS DETECTION ALGORITHM (SODA) APPLIED TO MULTIBEAM BATHYMETRIC DATA PROCESSING

    The amount of data produced by echo sounders has grown exponentially with swath systems such as multibeam echo sounders and interferometric sonars, providing a considerable improvement in the representation of the submerged relief, especially for detecting objects hazardous to navigation. However, the available processing algorithms have not kept pace with this evolution; manual processing, or at least constant intervention by an analyst, is usually necessary, making the task arduous and highly subjective. In addition, statistical inconsistencies are common in most of the available algorithms and filters. SODA (Spatial Outliers Detection Algorithm) was recently presented as a methodology aimed primarily at echo sounder data treatment, and its authors evaluated the algorithm's efficiency using simulated data. This article therefore aims to evaluate the efficiency of SODA for the treatment of real data acquired with a multibeam echo sounder. A number of interesting results were obtained, reaffirming the strength of the methodology for the detection of spikes in echo sounder data.
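
    The abstract does not spell out SODA's statistics, so the following is only an illustrative sketch of the kind of neighborhood-based spike search it describes; the function name, search radius, and MAD-based threshold are assumptions, not the published algorithm.

```python
# Hypothetical neighborhood-based spike (spatial outlier) filter for soundings.
# A robust median/MAD test over local neighbors stands in for SODA's own test.
import numpy as np
from scipy.spatial import cKDTree

def flag_spikes(x, y, z, radius=25.0, k_mad=3.0):
    """Flag soundings whose depth deviates from the local median by more
    than k_mad robust standard deviations (MAD-based)."""
    pts = np.column_stack([x, y])
    tree = cKDTree(pts)
    flags = np.zeros(z.size, dtype=bool)
    for i, neighbors in enumerate(tree.query_ball_point(pts, r=radius)):
        zn = z[neighbors]
        if zn.size < 5:                     # too few neighbors to judge
            continue
        med = np.median(zn)
        mad = 1.4826 * np.median(np.abs(zn - med))   # robust sigma estimate
        if mad > 0 and abs(z[i] - med) > k_mad * mad:
            flags[i] = True                 # probable spike
    return flags
```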

    Methods of Filtering Information Obtained During Hydrographic Surveying

    Current trends in navigation are characterized by increasing demands on the precision of hydrographic information, especially of nautical charts. The precision of both the spatial position and the depth of bathymetric data is important for ensuring safe navigation, so the problem of data filtering and outlier elimination arises. In the present work, methods used for postprocessing of depth data measured by an echo sounder are compared. First, commonly used data filtering and outlier elimination methods are reviewed and their advantages and disadvantages analyzed. As the improved outlier elimination algorithm and median filtering have their flaws, Kalman filtering is considered as a means of eliminating outliers and estimating the true depth. It is shown that the Kalman filter can both effectively filter noise and eliminate outliers; however, the quality of the filtered data strongly depends on the measurement noise covariance and process noise covariance estimates, R and Q respectively. The lower Q is, the better the noise is filtered and the smoother the depth profile; the higher R is, the better outliers are eliminated. Care must be taken, however, as the depth profile is distorted at high R values, and noise is almost not filtered at low ones. The measurement noise covariance estimate has more influence on data filtering, so attention should be paid to correct estimation of R. For practical purposes, the values Q = 0.01 and R = 10 are recommended. In recent works, wavelet filtering has been considered a promising method of data filtering in postprocessing, so Kalman filtering and wavelet filtering are then compared using real-world data: white noise is added to filtered and smoothed data, those data are filtered by the methods mentioned above, and the correlation between the source and denoised data is chosen as the criterion of filter effectiveness. It is shown that the Kalman filter is somewhat less effective in data postprocessing than the wavelet filter. However, as the Kalman filter allows one both to filter noise from the measured data and to eliminate outliers, and can be used for on-the-fly filtering, it is advisable to use Kalman filtering for real-time measurements during surveys and wavelets for data postprocessing. Future studies may be devoted to the improvement of the methods used here and the introduction of new data filtering and postprocessing methods.
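
    A minimal sketch of the scalar Kalman filter discussed above, using the recommended Q = 0.01 and R = 10 and assuming a random-walk depth model (the paper's actual state model and implementation details are not given in the abstract):

```python
import numpy as np

def kalman_depth(measurements, q=0.01, r=10.0):
    """Filter a sequence of depth soundings with a scalar Kalman filter.
    q: process noise covariance (lower -> smoother profile)
    r: measurement noise covariance (higher -> stronger outlier rejection,
       but the profile is distorted if r is too large)."""
    x = measurements[0]        # initial depth estimate
    p = r                      # initial error covariance
    filtered = []
    for z in measurements:
        p = p + q              # predict: depth carried over, uncertainty grows
        k = p / (p + r)        # Kalman gain
        x = x + k * (z - x)    # update: blend prediction with new sounding
        p = (1.0 - k) * p
        filtered.append(x)
    return np.asarray(filtered)
```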

    Distribution-free, Variable Resolution Depth Estimation with Composite Uncertainty

    Recent algorithms for processing hydrographic data have treated the problem of achievable resolution by constructing grids of fixed resolution, a composite grid of variable resolution, recursive sub-division in a quad-tree, or by relying on a comprehensive TIN of the original points. All of these algorithms, however, impose some artificial structure on the data to allow for efficient computation, which this paper attempts to address. A scheme is outlined that provides a robust estimate of depth and associated uncertainty while making as few assumptions as possible. Using a non-uniform spectral analysis, it estimates the spatial scales at which the data are consistent so that it can estimate within the Nyquist limit for the underlying surface. Kernel density techniques estimate the most likely depth, and density partitioning estimates the observational and modeling uncertainty. After correcting for potential biases, the uncertainty is augmented using a modified Hare-Godin-Mayer system integration uncertainty and a sound speed profile variability estimate due to Beaudoin et al. The result is a robust, distribution-free, continuously variable-resolution estimate of depth with an associated uncertainty. The algorithm is illustrated by estimating the depth (and uncertainty) of Challenger Deep, and the paper then provides some perspectives on the efficiency, extensibility, and adaptability of this algorithm in the hydrographic context.
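
    As a hedged illustration of the kernel-density step only: given the soundings that contribute to one estimation node, the "most likely" depth can be taken as the mode of a Gaussian kernel density estimate. The spectral scale analysis, bias correction, and uncertainty integration described above are not reproduced here, and the function below is an assumption about that single step, not the paper's implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def most_likely_depth(depths):
    """Return the depth at which the kernel density estimate peaks."""
    kde = gaussian_kde(depths)                      # bandwidth by Scott's rule
    grid = np.linspace(depths.min(), depths.max(), 512)
    density = kde(grid)
    return grid[np.argmax(density)]                 # mode = most likely depth
```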

    MEASUREMENT AND ANALYSIS OF ACOUSTIC BACKSCATTER USING MULTIBEAM ECHOSOUNDER TECHNOLOGY FOR SEDIMENT CLASSIFICATION OF THE GULF OF PALU

    Backscatter values can describe the condition of seabed sediments, including the grain size of the bottom sediments. This study aims to detect, classify, and predict the seabed type based on backscatter values using Angular Response Analysis (ARA) and a Support Vector Machine (SVM), producing a spatial map of sediment distribution in the Gulf of Palu. Bathymetry data and backscatter intensity were acquired on 5-9 October 2018 using a Kongsberg EM 302 multibeam echosounder at a frequency of 30 kHz, together with ten sediment samples collected in 2012 belonging to PUSHIDROSAL. The sediment distribution in the Gulf of Palu obtained with the ARA method is dominated by sand and silt, while the distribution obtained with the SVM method is dominated by silty sand, silt, and sand. The accuracy test for the ARA method yields an overall accuracy of 50%, whereas the SVM method yields an overall accuracy of 60%. The prediction of seabed types in the Gulf of Palu closest to actual conditions is that of the SVM method, namely silty sand, silt, and sand.
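
    An illustrative sketch of the SVM step under stated assumptions: the feature set (mean backscatter and angular-response slope), the file names, and the hyperparameters below are placeholders, not the study's actual configuration; only the general workflow of training on ground-truth samples and reporting overall accuracy follows the abstract.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: per-sample features, e.g. [mean backscatter (dB), angular-response slope]
# y: sediment class labels from grab samples ("silty sand", "silt", "sand")
X = np.load("backscatter_features.npy")   # hypothetical file name
y = np.load("sediment_labels.npy")        # hypothetical file name

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```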

    The Dimensions of Customer Satisfaction in the Jamaican Financial Service Industry

    Bank leaders spend an average of $727 to acquire a new customer and $287 to retain a current customer. Grounded in customer relationship management and adaptation level theories, the purpose of this correlational study was to examine the relationship between service quality and customers' intention to switch banking services. An online survey was administered to 203 Jamaican banking customers. The target population was selected to identify whether Jamaican banks' customer service adhered to the customer satisfaction principles developed by Parasuraman. The independent variables were the 10 dimensions of service quality; competence, courtesy, credibility, and access were removed because of multicollinearity issues. The dependent variable was the customers' intention to switch. The results indicated a statistically significant relationship, F(6, 196) = 15.074, p < .001, between service quality and customer intent to switch banking services. The six predictors were tangibles (r = -.303, p < .005), reliability (r = -.253, p < .008), responsiveness (r = .35, p < .001), safety (r = -.433, p < .001), communication (r = -.184, p < .028), and empathy (r = -.357, p < .001), which together accounted for the largest share of variance (β = -.316) in customers' intention to switch Jamaican banking services. The implications for positive social change include the potential for bank leaders to develop customer-focused banking policy, increase customer satisfaction, and decrease costs related to losing customers, thus increasing profitability.

    11th International Coral Reef Symposium Proceedings

    A defining theme of the 11th International Coral Reef Symposium was that the news for coral reef ecosystems is far from encouraging. Climate change is now happening much faster than during an ice-age transition, and coral reefs continue to suffer fever-high temperatures as well as sour ocean conditions. Corals may be falling behind, and there appears to be no silver-bullet remedy. Nevertheless, there are hopeful signs that we should not despair. Reef ecosystems respond vigorously to protective measures and the alleviation of stress. For concerned scientists, managers, conservationists, stakeholders, students, and citizens, there is a great role to play in continuing to report on the extreme threat that climate change represents to Earth's natural systems. Urgent action is needed to reduce CO2 emissions. In the interim, we can and must buy time for coral reefs through increased protection from sewage, sediment, pollutants, overfishing, development, and other stressors, all of which we know can damage coral health. The time to act is now. The canary in the coral-coal mine is dead, but we still have time to save the miners. We need effective management rooted in solid interdisciplinary science and coupled with stakeholder buy-in, working at local, regional, and international scales alongside global efforts to give reefs a chance.