Innovative approaches to urban data management using emerging technologies
Many characteristics of smart cities rely on a sufficient quantity and quality of urban data. Local industry and developers can use this data to build applications that improve the lives of all citizens. The handling and usability of this data is therefore a major challenge for smart cities. In this paper we investigate new approaches to urban data management using emerging technologies and give insight into further research conducted within the EC-funded smarticipate project.
Geospatial data cannot be handled well in classical relational database environments. Either it is stored as binary large objects or it has to be broken down into elementary types that the database can handle. In many cases this results in a slow system, because classical relational databases are optimised for online transaction processing rather than for analytic processing and the delivery of mass data.
Document-based databases provide better performance, but still struggle with large binary objects. The heterogeneity of the data also requires a lot of mapping and data cleansing, and in some cases replication cannot be avoided.
Another approach is to use Semantic Web technologies to enrich the data and build up relations and connections between entities. However, data formats such as RDF follow a different approach and are not well suited to geospatial data, which leads to a lack of usability.
Search engines are a good example of web applications with high usability. Users must be able to find the right data and get information on related or close matches, which allows information retrieval in an easy-to-use fashion. The same principles should be applied to geospatial data, which would improve the usability of open data. Combined with data mining and big data technologies, these principles would improve the usability of open geospatial data and could even lead to new ways of using it. By helping with the interpretation of data in a certain context, data is transformed into useful information.
In this paper we analyse key features of open geodata portals, such as linked data and machine learning, in order to show ways of improving the user experience. Based on the smarticipate project, we then show how open data and geodata are handled online and examine their practical application. We also give an outlook on pilot cases in which we want to evaluate how the technologies presented in this paper can be combined into a useful open data portal. In contrast to the previous EC-funded project urbanapi, where participative processes in smart cities were created with urban data, we go one step further with semantic web and open data. Thereby we achieve a more general approach to open data portals for spatial data and to improving their usability.
The envisioned architecture of the smarticipate project relies on file-based storage and a no-copy strategy: data is mostly kept in its original format, and conversion to another format is only performed when necessary (e.g. when the current format has limitations on domain-specific attributes or the user requests a specific format). A strictly functional approach and architecture is envisioned, which allows massively parallel execution and is therefore well suited to deployment in a cloud environment.
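As a minimal sketch of such a no-copy strategy, the following Python fragment hands out the original file untouched and converts only on explicit request. The names DataStore and convert_geodata are hypothetical illustrations, not the smarticipate API.

```python
# Sketch of a no-copy delivery strategy: keep the original file and
# convert only on demand. All names here are illustrative assumptions.
from pathlib import Path

def convert_geodata(source: Path, fmt: str) -> Path:
    # Stand-in for a pure converter (e.g. Shapefile -> GeoJSON); a
    # side-effect-free function like this parallelises well in the cloud.
    raise NotImplementedError("format conversion is domain specific")

class DataStore:
    def __init__(self, root: Path) -> None:
        self.root = root

    def fetch(self, dataset: str, requested_format: str | None = None) -> Path:
        source = self.root / dataset
        # No-copy default: hand out the original file untouched.
        if requested_format is None or source.suffix == f".{requested_format}":
            return source
        # Convert only when the caller needs a different format.
        return convert_geodata(source, requested_format)

store = DataStore(Path("/data/opendata"))
print(store.fetch("trees.geojson"))  # original file, no copy made
```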
The search interface uses a domain-specific vocabulary which can be customised for special purposes or for individual users, taking their context and expertise into account, and which abstracts from technology-specific peculiarities.
Application programmers will also benefit from this architecture, as linked data principles are followed extensively. For example, the JSON and JSON-LD standards are used, so that web developers can use results from the data store directly, without the need for conversion. Links to further information are provided within the data, so that a drill-down to more detail is possible.
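To illustrate these linked data principles, the sketch below builds a JSON-LD style response in Python. The context URL, identifiers, and property names are invented for this example and are not the project's actual schema.

```python
# Hypothetical JSON-LD style response from the data store; vocabulary
# and URLs are invented for illustration only.
import json

response = {
    "@context": "https://example.org/smarticipate/context.jsonld",
    "@id": "https://example.org/datasets/trees/4711",
    "@type": "GeoFeature",
    "name": "Street tree, Lindenallee 12",
    "geometry": {"type": "Point", "coordinates": [8.6821, 50.1109]},
    # Embedded links enable a drill-down to further detail.
    "maintainedBy": {"@id": "https://example.org/departments/parks"},
}

# Web developers can consume this directly, without format conversion.
print(json.dumps(response, indent=2))
```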
The remainder of this paper is structured as follows. After this introduction about open data and data in general, we look at related work and existing open data portals. This leads to the main chapter about the key technology aspects of an easy-to-use open data portal. This is followed by chapter five, an introduction to the EC-funded project smarticipate, in which the key technology aspects of chapter four are incorporated.
Reply to "Challenging the hypothesis of an Arctic Ocean lake during recent glacial episodes" by Hillaire-Marcel et al.
Hillaire-Marcel et al. bring forward several physical and geochemical arguments against our finding of an Arctic glaciolacustrine system in the past. In brief, we find that a physical approach to further test our hypothesis should additionally consider the actual bathymetry of the Greenland–Scotland Ridge (GSR), the density maximum of freshwater at 3–4 °C, the sensible heat flux from rivers, and the actual volumes that are being mixed and advected. Their geochemical considerations acknowledge our original argument, but they also add a number of assumptions that are neither required to explain the observations, nor do they correspond to the lithology of the sediments. Rather than being additive in nature, their arguments of high particle flux, low particle flux, export of 230Th and accumulation of 230Th are mutually exclusive. We first address the arguments above, before commenting on some misunderstandings of our original claim in their contribution, especially regarding our dating approach.
A new perspective on the competent programmer hypothesis through the reproduction of bugs with repeated mutations
The competent programmer hypothesis states that most programmers are competent enough to create correct or almost correct source code. Because this implies that bugs should usually manifest through small variations of the correct code, the competent programmer hypothesis is one of the fundamental assumptions of mutation testing. Unfortunately, it is still unclear if the competent programmer hypothesis holds, and past research presents contradictory claims. Within this article, we provide a new perspective on the competent programmer hypothesis and its relation to mutation testing. We try to re-create real-world bugs through chains of mutations to understand if there is a direct link between mutation testing and bugs. The lengths of these paths help us to understand if the source code is really almost correct, or if large variations are required. Our results indicate that while the competent programmer hypothesis seems to be true, mutation testing is missing important operators to generate representative real-world bugs.
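The idea of re-creating bugs through chains of mutations can be pictured as a shortest-path search over mutants. The following toy sketch is not the authors' tooling: it uses a single assumed mutation operator (comparison-operator replacement) and breadth-first search to measure the mutation distance between a fixed and a buggy version.

```python
# Toy illustration: shortest chain of mutations from fixed to buggy code.
# Small distances would support the competent programmer hypothesis.
from collections import deque

def mutants_of(code: str) -> list[str]:
    # Minimal operator set: flip comparison operators, a classic mutation.
    swaps = [("<=", "<"), ("<", "<="), ("==", "!="), ("!=", "==")]
    return [code.replace(old, new, 1) for old, new in swaps if old in code]

def mutation_distance(fixed: str, buggy: str, max_depth: int = 3) -> int | None:
    # Breadth-first search over chains of mutations.
    queue, seen = deque([(fixed, 0)]), {fixed}
    while queue:
        code, depth = queue.popleft()
        if code == buggy:
            return depth
        if depth < max_depth:
            for m in mutants_of(code):
                if m not in seen:
                    seen.add(m)
                    queue.append((m, depth + 1))
    return None  # no chain found within max_depth

print(mutation_distance("if x < 0: return -x", "if x <= 0: return -x"))  # -> 1
```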
A multi-scale flood monitoring system based on fully automatic MODIS and TerraSAR-X processing chains
A two-component, fully automated flood monitoring system is described and evaluated. It combines two individual flood services that are currently under development at DLR's (German Aerospace Center) Center for Satellite Based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component systematically detects potential flood events on a continental scale using daily-acquired, medium-spatial-resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure supports the identification of flood situations at different spatial resolutions and the time-critical, on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS Flood Service (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps via an interactive web client. The system is operationally demonstrated and evaluated through the monitoring of two recent flood events, in Russia in 2013 and in Albania/Montenegro in 2013.
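A minimal sketch of the threshold-controlled activation described above, assuming invented threshold values and type names; the actual MFS/TFS activation criteria are not specified in the abstract.

```python
# Hedged sketch of the two-phase activation logic: the MODIS-based first
# phase flags potential floods, and a threshold set decides whether to
# task the higher-resolution TerraSAR-X second phase. All values are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ModisObservation:
    flooded_area_km2: float  # water extent beyond a reference water mask
    cloud_cover: float       # fraction 0..1; clouds blind optical sensors

MIN_FLOOD_AREA_KM2 = 50.0    # assumed activation threshold
MAX_CLOUD_COVER = 0.8        # assumed reliability limit for optical data

def should_task_terrasar_x(obs: ModisObservation) -> bool:
    """Return True when the crisis component (TFS) should be activated."""
    significant = obs.flooded_area_km2 >= MIN_FLOOD_AREA_KM2
    reliable = obs.cloud_cover <= MAX_CLOUD_COVER
    return significant and reliable

print(should_task_terrasar_x(ModisObservation(120.0, 0.3)))  # -> True
```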
Geochemical evidence of a floating Arctic ice sheet and underlying freshwater in the Arctic Mediterranean in glacial periods
Numerous studies have addressed the possible existence of large floating ice sheets in the glacial Arctic Ocean from theoretical, modelling, or seafloor-morphology perspectives. Here, we add evidence from the sediment record that supports the existence of such freshwater ice caps in certain intervals, and we discuss their implications for possible non-linear and rapid behaviour of such a system in the high latitudes.
We present sedimentary activities of 230Th together with 234U/238U ratios, the concentrations of manganese, sulphur and calcium in the context of lithological information and records of microfossils and their isotope composition. New analyses (PS51/038, PS72/396) and a re-analysis of existing marine sediment records (PS1533, PS1235, PS2185, PS2200, amongst others) in view of the naturally occurring radionuclide 230Thex and, where available, 10Be from the Arctic Ocean and the Nordic Seas reveal the widespread occurrence of intervals with a specific geochemical signature. The pattern of these parameters in a pan-Arctic view can best be explained when assuming the repeated presence of freshwater in frozen and liquid form across large parts of the Arctic Ocean and the Nordic Seas.
Based on the sedimentary evidence and known environmental constraints at the time, we develop a glacial scenario that explains how these ice sheets, together with eustatic sea-level changes, may have affected the past oceanography of the Arctic Ocean in a fundamental way that must have led to a drastic and non-linear response to external forcing.
This concept offers a possibility to explain and, to some extent, reconcile contrasting age models for the Late Pleistocene in the Arctic Ocean. Our view, if adopted, offers a coherent dating approach across the Arctic Ocean and the Nordic Seas, linked to events outside the Arctic.
Regulation of global CD8+ T-cell positioning by the actomyosin cytoskeleton
CD8+ T cells have evolved as one of the most motile mammalian cell types, designed to continuously scan peptide–major histocompatibility complexes class I on the surfaces of other cells. Chemoattractants and adhesion molecules direct CD8+ T-cell homing to and migration within secondary lymphoid organs, where these cells colocalize with antigen-presenting dendritic cells in confined tissue volumes. CD8+ T-cell activation induces a switch to infiltration of non-lymphoid tissue (NLT), which differs in its topology and biophysical properties from lymphoid tissue. Here, we provide a short overview of the regulation of organism-wide trafficking patterns during naive T-cell recirculation and their switch to non-lymphoid tissue homing during activation. The migratory lifestyle of CD8+ T cells is regulated by their actomyosin cytoskeleton, which translates chemical signals from surface receptors into mechanical work. We explore how properties of the actomyosin cytoskeleton and its regulators affect CD8+ T-cell function in lymphoid and non-lymphoid tissue, combining recent findings in the field of cell migration and actin network regulation with tissue anatomy. Finally, we hypothesize that under certain conditions, intrinsic regulation of actomyosin dynamics may render NLT CD8+ T-cell populations less dependent on input from extrinsic signals during tissue scanning.