On High Performance Computing in Geodesy : Applications in Global Gravity Field Determination
Autonomously operating sensor platforms deliver an increasing amount of precise data sets that are usable in geodetic applications. Due to their volume and quality, models determined from these data can be parameterized with greater complexity and in more detail. Deriving model parameters from such observations often requires the solution of a high-dimensional inverse data-fitting problem. To solve such high-dimensional adjustment problems, this thesis proposes a systematic, end-to-end massively parallel implementation of the geodetic data analysis, using standard concepts of massively parallel high performance computing. It is shown how these concepts can be integrated into a typical geodetic problem that requires the solution of a high-dimensional adjustment problem. Through the proposed parallel use of the computing and memory resources of a compute cluster, general Gauss-Markov models become solvable that previously could only be solved by means of computationally motivated simplifications and approximations. A basic, easy-to-use framework is developed that performs all relevant operations needed to solve a typical geodetic least-squares adjustment problem and provides the interface to the standard concepts and libraries used. Examples covering different characteristics of the adjustment problem show how the framework is used and can be adapted for specific applications. Computationally rigorous solutions become possible for hundreds of thousands to millions of unknown parameters, which have to be estimated from a huge number of observations. Three problems with different characteristics, as they arise in global gravity field recovery, are chosen, and massively parallel implementations of their solution processes are derived.
The first application covers global gravity field determination from real data collected by the GOCE satellite mission (comprising 440 million highly correlated observations and 80,000 parameters). In the second application, high-dimensional global gravity field models are estimated from the combination of complementary data sets via the assembly and solution of full normal equations (scenarios with 520,000 parameters and 2 TB normal equations). The third application solves a comparable problem but uses an iterative least-squares solver, allowing for a parameter space of even higher dimension (scenarios with two million parameters). This thesis forms the basis for a flexible massively parallel software package, which is extendable according to current and future research topics studied in the department. Within this thesis, the main focus lies on the computational aspects.
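The adjustment step at the core of these applications is weighted least squares in the Gauss-Markov model: from the design matrix A, weight matrix P, and observation vector l, the normal equations N = AᔀPA and n = AᔀPl are assembled and solved for the estimate x̂ = N⁻¹n. The following is a minimal serial sketch of that step (the thesis distributes assembly and solution across a cluster; the function and variable names here are illustrative, not taken from the thesis):

```python
import numpy as np

def solve_gauss_markov(A, P, l):
    """Solve the Gauss-Markov model l = A x + e via the normal
    equations N x = n with N = A^T P A and n = A^T P l."""
    N = A.T @ P @ A               # normal-equation matrix (n_par x n_par)
    n = A.T @ P @ l               # right-hand side
    x_hat = np.linalg.solve(N, n) # parameter estimate
    return x_hat, N

# tiny example: fit a line y = a + b*t to five weighted observations
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
A = np.column_stack([np.ones_like(t), t])
P = np.diag([1.0, 1.0, 2.0, 1.0, 1.0])  # middle observation weighted higher
l = np.array([0.1, 1.1, 2.0, 2.9, 4.1])
x_hat, N = solve_gauss_markov(A, P, l)
```

At the scales quoted above, A is never formed as one dense in-memory matrix; the products AᔀPA and AᔀPl are accumulated block-wise over distributed observation groups (block-cyclic storage in the style of ScaLAPACK is one common choice, named here as an assumption rather than taken from the thesis), which is what makes full normal equations with 520,000 parameters tractable, while the third application avoids forming N at all by using an iterative solver.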
Coestimating long-term temporal signals to reduce the aliasing effect in parametric geodetic mean dynamic topography estimation
The geodetic estimation of the mean dynamic ocean topography (MDT) as the difference between the mean sea surface and the geoid remains, despite this simple relation, a difficult task. The main problem is the spectral inconsistency between the available altimetric sea surface height (SSH) observations and the geoid information, which complicates separating the spatially and temporally averaged SSH into geoid and MDT. This is aggravated by the accuracy characteristics of the satellite-derived geoid information, which is only sufficiently accurate up to a resolution of about 100 km.
To enable the direct use of the along-track altimetric SSH observations together with a proper stochastic model, we apply a parametric approach in which a C1-smooth finite element space models the MDT and spherical harmonics model the geoid. Combining observation equations for the altimetric SSH observations with gravity field normal equations assembled from dedicated gravity field missions in a least-squares adjustment allows for a joint estimation of both, i.e., the MDT and an improved geoid.
To allow for temporal averaging and to obtain a proper spatial resolution, satellite altimetry missions with an exact repeat cycle are combined with geodetic missions. Whereas the temporal averaging for the exact-repeat altimetry missions is implicitly performed by the adjustment due to their regular temporal sampling, aliasing is introduced for the geodetic missions because of their missing (or at least very long) repeat characteristics. In this contribution, we summarize the parametric approach, with a focus on the co-estimation of long-term temporal sea level variations. Regularization strategies are applied to derive stable and smooth estimates. It is studied how the additional spatio-temporal model components, e.g. linear trends and seasonal signals, reduce the aliasing problem and influence the estimate of the geodetic MDT.
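The combination described above amounts to adding normal equations from the two information sources: the along-track SSH data contribute N_alt = AᔀPA and n_alt = AᔀPl, the gravity field missions contribute pre-assembled normals N_grav and n_grav, and the joint estimate solves (N_alt + N_grav) x = n_alt + n_grav. A minimal sketch of this normal-equation stacking, with toy placeholder matrices rather than the actual MDT/geoid parameterization:

```python
import numpy as np

def stack_normals(A_alt, P_alt, l_alt, N_grav, n_grav):
    """Combine altimetric observation equations with pre-assembled
    gravity-field normal equations in one least-squares adjustment."""
    N_alt = A_alt.T @ P_alt @ A_alt
    n_alt = A_alt.T @ P_alt @ l_alt
    N = N_alt + N_grav            # joint normal-equation matrix
    n = n_alt + n_grav            # joint right-hand side
    return np.linalg.solve(N, n)

# toy setup: two parameters observed by both "missions", noise-free
rng = np.random.default_rng(0)
A_alt = rng.standard_normal((6, 2))
P_alt = np.eye(6)
x_true = np.array([1.0, -2.0])
l_alt = A_alt @ x_true
N_grav = np.eye(2) * 4.0          # stand-in for satellite-only normals
n_grav = N_grav @ x_true          # right-hand side consistent with x_true
x_hat = stack_normals(A_alt, P_alt, l_alt, N_grav, n_grav)
```

The regularization mentioned in the abstract would enter the same way, as an additional term added to N before solving; co-estimating trends and seasonal signals extends the parameter vector rather than the stacking mechanism.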
An improved global gravity field model from the GOCE mission: the time-wise release 6 model
The Gravity field and steady-state Ocean Circulation Explorer (GOCE) completed its science mission to observe the Earth's gravity field in 2013. From the collected observations, gravity gradients and tracking by the GPS constellation, global gravity field models were estimated in terms of spherical harmonic series. From the complete mission data set, the release 5 gravity field models were published in 2014. One of those models is EGM_TIM_RL05, which is computed with the time-wise approach. Within the computation of the time-wise models, only GOCE observations are used; they remain independent of other available ground- or satellite-based gravity data sets. Thus, the model represents the gravity field as seen by the GOCE mission. Within the estimation of the time-wise models, much effort is spent on the stochastic modeling of the input observations to derive an uncertainty description in the form of a covariance matrix. Recent studies have identified an imperfect calibration of the level 1B gravity gradients. Within a reprocessing campaign, the entire mission data set was reprocessed, such that the quality of the gravity gradients could be significantly improved. In addition, the orbits were reprocessed, reducing systematic artifacts. Within this contribution, the sixth release of the time-wise GOCE gravity field models is presented. Using the reprocessed input data, orbits as well as gravity gradients, and a robustified processing, the entire data set is used to estimate the updated gravity field model EGM_TIM_RL06. The processing used to generate the new solution is shown. It is validated and compared to the older releases. Three kinds of improvements are shown: i) a reduction of the mean error in the range of 15 to 25%, ii) a reduction of systematic errors at the centimeter level, and iii) an even more realistic covariance description. The presented model will be made available as the ESA official GOCE time-wise gravity field model.
Design of frequency selective filters for non-equispaced data
Modern sensors and satellite missions deliver huge data sets and long time series of observations.
In most cases, equispaced time series are collected. Filtering of time series is a standard task in data analysis, and a large variety of methods is therefore already established for equispaced time series. Discrete digital filters can be designed individually in the time domain as well as in the frequency domain. Often, a priori information in the frequency domain is used to separate the signal of interest from the remaining part. These well-established strategies presume equispaced samples to get access to the frequency-domain behaviour.
In this article we focus on non-equispaced time series, which often appear in satellite geodesy and remote sensing. Again we assume that a specific behaviour in the frequency domain characterizes the desired part of the signal. To extract exactly this part, we construct frequency-limited basis functions that provide a strict cut-off in the frequency domain. A linear combination of these basis functions can be applied in an approximation procedure, making it possible to extract the desired band-limited signal also from non-equispaced data sets.
Our article focuses on the construction of these frequency-limited basis functions. Starting with piecewise polynomial basis functions with finite support (splines), we construct, by the inversion of an infinite banded Toeplitz system, tailored basis functions that are mutually independent (sampling splines). These sampling splines can be transformed into the spectral domain, and because of the finite support of the original basis functions, a closed-form representation in the spectral domain is possible (finite sums). Detailed studies of the filter characteristic in the frequency domain then allow a simple design of low-pass filters for non-equispaced data with a predefined passband region. It can be seen that the order of the splines controls the transition width as well as the stopband attenuation. Practical examples demonstrate the capability of this approach.
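The underlying principle, fitting a linear combination of frequency-limited basis functions to non-equispaced samples by least squares and reading off the band-limited component, can be illustrated without the article's sampling-spline construction. The sketch below substitutes a truncated trigonometric basis for the sampling splines (an assumption made purely for brevity, not the method of the article); the least-squares approximation step is the same:

```python
import numpy as np

def bandlimited_fit(t, y, f_cut, T):
    """Least-squares fit of a band-limited trigonometric basis
    (frequencies 0, 1/T, ..., f_cut) to non-equispaced samples (t, y);
    returns the low-pass component evaluated at the sample times."""
    k_max = int(f_cut * T)
    cols = [np.ones_like(t)]
    for k in range(1, k_max + 1):
        cols.append(np.cos(2 * np.pi * k * t / T))
        cols.append(np.sin(2 * np.pi * k * t / T))
    A = np.column_stack(cols)                       # design matrix
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # approximation step
    return A @ coef

# non-equispaced samples of a slow signal plus a fast disturbance
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 10.0, 200))
slow = np.sin(2 * np.pi * 0.2 * t)          # 0.2 Hz: inside the passband
fast = 0.3 * np.sin(2 * np.pi * 3.0 * t)    # 3 Hz: outside, to be suppressed
y_filtered = bandlimited_fit(t, slow + fast, f_cut=0.5, T=10.0)
```

Where this toy basis has a hard truncation in frequency, the article's spline-based sampling functions additionally offer compact support in time and a spline-order-controlled transition band, which is what makes them practical as filters.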
Satellite water quality monitoring: Status report 2023
Prosjektleder (project leader): Therese Harvey. This report is the 1-year status report of the project Vannovervåking med satellitt 2023–2024 (2025) / Satellite water quality monitoring 2023–2024 (2025), funded by Miljødirektoratet / the Norwegian Environment Agency (NEA). It describes the work done in Work Packages 1, 3, 5 and partly 6, as well as plans for 2024. In situ data have been gathered for the most monitored Norwegian lakes (for ØKOSTOR and ØKOFERSK) and the coastal stations included in ØKOKYST. Both Sentinel-2 and Sentinel-3 data have been processed for the period 2016-2023. A demo viewer for exploring the data for some dedicated lakes and coastal areas has been set up, with access granted to project members. The infrastructure for the initial validation of 10+10 water bodies (WB) has been set up, and tests with MET and EUMETSAT as the future host of the service have been conducted. The validation and evaluation of different algorithms for chl-a and Secchi depth have started for the 10+10 WBs. The selection of WBs for the initial validation was based on water quality and optical parameters, to cover a wide range of water types and conditions.
The comorbidity and co-medication profile of patients with progressive supranuclear palsy
Background: Progressive supranuclear palsy (PSP) is usually diagnosed in the elderly. Currently, little is known about comorbidities and co-medication in these patients. Objectives: To explore the pattern of comorbidities and co-medication in PSP patients according to the known phenotypes, and in comparison with patients without neurodegenerative disease. Methods: Cross-sectional data of PSP patients and patients without neurodegenerative diseases (non-ND) were collected from three German multicenter observational studies (DescribePSP, ProPSP and DANCER). The prevalence of comorbidities according to the WHO ICD-10 classification and the prevalence of drugs administered according to the WHO ATC system were analyzed. Potential drug-drug interactions were evaluated using AiDKlinik(R). Results: In total, 335 PSP and 275 non-ND patients were included in this analysis. The prevalence of diseases of the circulatory and the nervous system was higher in PSP at the first level of ICD-10. Dorsopathies, diabetes mellitus, other nutritional deficiencies and polyneuropathies were more frequent in PSP at the second level of ICD-10. In particular, the summed prevalence of cardiovascular and cerebrovascular diseases was higher in PSP patients. More drugs were administered in the PSP group, leading to a greater percentage of patients with polypharmacy. Accordingly, the prevalence of potential drug-drug interactions was higher in PSP patients, especially severe and moderate interactions. Conclusions: PSP patients possess a characteristic profile of comorbidities, particularly diabetes and cardiovascular diseases. The considerable burden of comorbidities and the resulting polypharmacy should be carefully considered when treating PSP patients.