A BIM - GIS Integrated Information Model Using Semantic Web and RDF Graph Databases
In recent years, 3D virtual indoor and outdoor urban modelling has become an essential geospatial information framework for civil and engineering applications such as emergency response, evacuation planning, and facility management. Multi-sourced, multi-scale 3D urban models are in high demand among architects, engineers, and construction professionals, who need them to achieve these tasks and provide relevant information to decision support systems. Spatial modelling technologies such as Building Information Modelling (BIM) and Geographical Information Systems (GIS) are frequently used to meet such demands. However, sharing data and information between these two domains is still challenging, and existing semantic or syntactic strategies for inter-communication between BIM and GIS do not fully support rich semantic and geometric information exchange from BIM into GIS or vice versa. This research study proposes a novel approach for integrating BIM and GIS using semantic web technologies and Resource Description Framework (RDF) graph databases. The originality of the suggested solution comes from combining the advantages of BIM and GIS models in a semantically unified data model built with a semantic framework and ontology engineering approaches. The new model, named the Integrated Geospatial Information Model (IGIM), is constructed in three stages. First, BIMRDF and GISRDF graphs are generated from the BIM and GIS datasets. Then the two semantic models are integrated into a unified graph, IGIMRDF. Lastly, information from the unified IGIMRDF graph is filtered using a graph query language and graph data analytics tools. The linkage between BIMRDF and GISRDF is completed through SPARQL endpoints, with queries over elements and entity classes that carry similar or complementary information from the properties, relationships, and geometries identified by an ontology-matching process during model construction.
The resulting model (or sub-model) can be managed in a graph database system and used in the backend as a data tier serving web services that feed a front-tier, domain-oriented application. A case study was designed, developed, and tested with the semantic integrated information model to validate the newly proposed solution, its architecture, and its performance.
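The three-stage construction (RDF graph generation, integration into IGIMRDF, and filtered querying) can be sketched in a few lines. The sketch below uses plain Python sets of (subject, predicate, object) triples in place of a real RDF store, and a dictionary join standing in for a SPARQL query over an endpoint; all identifiers and property names are illustrative assumptions, not the paper's actual schema.

```python
# Stage 1: BIMRDF and GISRDF graphs generated from the two source models.
# Triples are (subject, predicate, object); URIs abbreviated as CURIEs.
bim_rdf = {
    ("asset:door-101", "rdf:type", "bim:Door"),
    ("asset:door-101", "bim:width", 0.9),
}
gis_rdf = {
    ("asset:door-101", "rdf:type", "gis:Feature"),
    ("asset:door-101", "gis:elevation", 12.5),
}

# Stage 2: graph integration - the union forms the unified IGIMRDF graph.
igim_rdf = bim_rdf | gis_rdf

# Stage 3: filter the unified graph. This join stands in for a SPARQL
# query pairing BIM geometry with GIS geospatial properties.
def joined(graph, pred_a, pred_b):
    """Subjects that carry both predicates, with their values."""
    a = {s: o for s, p, o in graph if p == pred_a}
    b = {s: o for s, p, o in graph if p == pred_b}
    return {s: (a[s], b[s]) for s in a.keys() & b.keys()}

matches = joined(igim_rdf, "bim:width", "gis:elevation")
print(matches)  # {'asset:door-101': (0.9, 12.5)}
```

In a real deployment the union and join would be a named graph and a SPARQL SELECT in the graph database, but the data flow is the same.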
PLBD: protein–ligand binding database of thermodynamic and kinetic intrinsic parameters
We introduce a protein–ligand binding database (PLBD) that presents thermodynamic and kinetic data of reversible protein interactions with small-molecule compounds. The manually curated binding data are linked to protein–ligand crystal structures, enabling structure–thermodynamics correlations to be determined. The database contains over 5500 binding datasets of 556 sulfonamide compound interactions with the 12 catalytically active human carbonic anhydrase isozymes, determined by fluorescent thermal shift assay, isothermal titration calorimetry, inhibition of enzymatic activity, and surface plasmon resonance. In the PLBD, the intrinsic thermodynamic parameters of the interactions are provided, which account for the binding-linked protonation reactions. In addition to the protein–ligand binding affinities, the database provides calorimetrically measured binding enthalpies, offering additional mechanistic understanding. The PLBD can be applied to investigations of protein–ligand recognition and could be integrated into small-molecule drug design.
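The intrinsic parameters mentioned above can be illustrated with two standard relations: the Gibbs energy of binding from a dissociation constant, and the removal of a buffer-ionization term from an ITC-observed enthalpy. This is a simplified sketch of binding-linked protonation accounting, not PLBD's full correction, and all numeric values are illustrative.

```python
import math

R = 8.314462618e-3  # gas constant, kJ/(mol*K)

def delta_g_binding(kd_molar: float, temp_k: float = 298.15) -> float:
    """Standard Gibbs energy of binding (kJ/mol) from a dissociation constant."""
    return R * temp_k * math.log(kd_molar)

def intrinsic_enthalpy(dh_observed: float, n_protons: float,
                       dh_buffer_ionization: float) -> float:
    """Subtract the buffer-ionization contribution of binding-linked
    protonation from an ITC-observed enthalpy (all values kJ/mol).
    A simplified sketch: the full intrinsic correction also accounts
    for protein and ligand protonation-state fractions."""
    return dh_observed - n_protons * dh_buffer_ionization

dg = delta_g_binding(1e-9)  # a hypothetical 1 nM binder
print(round(dg, 1))         # -51.4
```

The sign convention follows the usual one: tighter binding (smaller Kd) gives a more negative Gibbs energy.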
Towards a Traceable Data Model Accommodating Bounded Uncertainty for DST Based Computation of BRCA1/2 Mutation Probability With Age
In this paper, we describe the requirements for traceable open-source data retrieval in the context of computing BRCA1/2 mutation probabilities (mutations in two tumor-suppressor genes responsible for hereditary BReast and/or ovarian CAncer). We show how such data can be used to develop a Dempster-Shafer model for computing the probability of BRCA1/2 mutations that takes the actual age of a patient or family member into account in an appropriate way, even if it is not known exactly. The model is compared with PENN II and BOADICEA (based on undisclosed data), two established platforms for this purpose that are accessible online, as well as with our own previous models. A proof-of-concept implementation shows that set-based techniques are able to provide better information about mutation probabilities, while simultaneously highlighting the need for high-quality ground-truth data.
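A minimal sketch of the Dempster-Shafer machinery such a model builds on: Dempster's rule of combination fuses two mass functions over the frame {mutation, no mutation}, with mass assigned to the whole frame expressing ignorance (for instance, an imprecisely known age). The mass values below are illustrative, not taken from the cited model.

```python
from itertools import product

def dempster_combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule of combination for mass functions whose focal
    elements are frozensets over a shared frame of discernment."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible evidence: mass flows to the intersection
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:      # contradictory evidence: accumulate conflict mass
            conflict += w1 * w2
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

# Frame: patient carries a BRCA1/2 mutation (M) or does not (N).
M, N = frozenset("M"), frozenset("N")
MN = M | N  # mass on the full frame models ignorance

family_history = {M: 0.5, MN: 0.5}           # evidence from the pedigree
age_interval = {M: 0.3, N: 0.2, MN: 0.5}     # evidence from a bounded age

fused = dempster_combine(family_history, age_interval)
print(round(fused[M], 3))  # 0.611
```

Keeping mass on the full frame is what lets set-based (interval) age information enter the computation without being forced into a point estimate.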
A robotic platform for precision agriculture and applications
Agricultural techniques have been improved over the centuries to meet the growing demand of an increasing global population. Farming applications face new challenges in satisfying global needs, and recent technological advancements in robotic platforms can be exploited to address them.
As orchard management is one of the most challenging applications, because of the tree structure and the required interaction with the environment, it was targeted by the University of Bologna research group, which set out to provide a customized solution embodying a new concept for agricultural vehicles.
This research has resulted in a new lightweight tracked vehicle capable of autonomous navigation both in the open-field scenario and while travelling inside orchards, in what has been called in-row navigation. The mechanical design concept, together with the customized software implementation, is detailed to highlight the strengths of the platform, along with some further improvements envisioned to improve overall performance.
Static stability testing has proved that the vehicle can withstand steep-slope scenarios. Improvements have also been investigated to refine the estimation of the slippage that occurs during turning maneuvers and is typical of skid-steering tracked vehicles.
The software architecture has been implemented using the Robot Operating System (ROS) framework, so as to exploit community-available packages for common and basic functions, such as sensor interfaces, while allowing a dedicated custom implementation of the navigation algorithms developed.
Real-world testing inside the university's experimental orchards has proven the robustness and stability of the solution over more than 800 hours of fieldwork.
The vehicle has also enabled a wide range of autonomous tasks such as spraying, mowing, and in-field data collection. The collected data can be exploited to automatically estimate relevant orchard properties, such as fruit count and size and canopy characteristics, and to support autonomous fruit harvesting with post-harvest estimations.
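The slippage that skid-steering tracked vehicles exhibit during turns can be illustrated with a simplified forward-kinematics model in which each track's speed is derated by a longitudinal slip ratio. The geometry and slip values below are illustrative assumptions, not the Bologna platform's specification.

```python
def skid_steer_twist(omega_left: float, omega_right: float,
                     sprocket_radius: float, track_width: float,
                     slip_left: float = 0.0, slip_right: float = 0.0):
    """Forward kinematics of a skid-steering tracked vehicle.

    Track speeds are derated by per-track slip ratios (0 = no slip),
    a simplified model of the slippage that occurs while turning.
    Returns (linear speed in m/s, yaw rate in rad/s).
    """
    v_left = sprocket_radius * omega_left * (1.0 - slip_left)
    v_right = sprocket_radius * omega_right * (1.0 - slip_right)
    linear = (v_right + v_left) / 2.0           # forward speed
    angular = (v_right - v_left) / track_width  # yaw rate
    return linear, angular

# Same sprocket commands, but the faster track slips during the turn:
# the vehicle yaws less than the ideal model predicts.
v_ideal, w_ideal = skid_steer_twist(5.0, 7.0, 0.1, 0.6)
v_slip, w_slip = skid_steer_twist(5.0, 7.0, 0.1, 0.6, slip_right=0.1)
print(w_ideal > w_slip)  # True
```

Estimating the slip ratios online (for example, from odometry against GNSS or visual odometry) is what allows the navigation stack to correct the yaw-rate estimate during in-row turns.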
Research and development for the data, trigger and control card in preparation for Hi-Lumi LHC
When the Large Hadron Collider (LHC) increases its luminosity by an order of magnitude in the coming decade, the experiments that sit upon it must also be upgraded to maintain their physics performance in the increasingly demanding environment. To achieve this, the Compact Muon Solenoid (CMS) experiment will make use of tracking information in the Level-1 trigger for the first time, meaning that track reconstruction must be achieved in less than 4 μs in an all-FPGA architecture.
MUonE is an experiment aiming to make an accurate measurement of the hadronic contribution to the anomalous magnetic moment of the muon. It will achieve this by making use of apparatus similar to that designed for CMS, benefiting from the research and development efforts there.
This thesis presents both development and testing work for the readout chain from tracker module to back-end processing card, as well as the results and analysis of a beam test used to validate this chain for both CMS and the MUonE experiment.
IoT Data Processing for Smart City and Semantic Web Applications
The world has been experiencing rapid urbanization over the last few decades,
putting a strain on existing city infrastructure such as waste management,
water supply management, public transport and electricity consumption. We are
also seeing increasing pollution levels in cities, threatening the environment,
natural resources and health conditions. At the same time, we must realize that
real growth lies in urbanization, as it provides individuals with opportunities
for better employment, healthcare and education. It is therefore imperative to
limit the ill effects of rapid urbanization through integrated action plans
that enable the development of growing cities. This gave rise to the
concept of a smart city in which all available information associated with a
city will be utilized systematically for better city management.
The proposed system architecture is divided into subsystems, each discussed in
an individual chapter. The first chapter introduces the complete system
architecture and gives the reader an overview of it. The second chapter
discusses the Data Monitoring System (DMS) and Data Lake System (DLS), based
on the oneM2M standards. The DMS employs oneM2M as a middleware layer to
achieve interoperability, and the DLS uses a multi-tenant architecture with
multiple logical databases, enabling efficient and reliable data management.
The third chapter discusses energy monitoring and electric vehicle charging
systems developed to illustrate the applicability of the oneM2M standards.
The fourth chapter discusses the Data Exchange System (DES), based on the
Indian Urban Data Exchange (IUDX) framework. The DES uses the IUDX standard
data schema and open APIs to avoid data silos and enable secure data sharing.
The fifth chapter discusses the 5D-IoT framework, which provides uniform data
quality assessment of sensor data with meaningful data descriptions.
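A minimal sketch of the multi-tenant, multiple-logical-database idea described for the DLS: a router maps each tenant (for example, a oneM2M application entity) to its own isolated logical database. One in-memory SQLite database per tenant stands in for a per-tenant logical database in a shared cluster; the tenant names and schema are illustrative assumptions, not the actual DLS design.

```python
import sqlite3

class TenantRouter:
    """Map tenant identifiers to isolated logical databases."""

    def __init__(self):
        self._dbs = {}

    def db_for(self, tenant_id: str) -> sqlite3.Connection:
        # Lazily create one logical database per tenant.
        if tenant_id not in self._dbs:
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
            self._dbs[tenant_id] = conn
        return self._dbs[tenant_id]

router = TenantRouter()
router.db_for("city-water").execute(
    "INSERT INTO readings VALUES ('flow-01', 3.2)")
router.db_for("city-energy").execute(
    "INSERT INTO readings VALUES ('meter-07', 41.5)")

# Tenants stay isolated: each logical database sees only its own rows.
water_rows = router.db_for("city-water").execute(
    "SELECT COUNT(*) FROM readings").fetchone()[0]
print(water_rows)  # 1
```

Keeping tenants in separate logical databases (rather than tagging rows in a shared table) is what makes per-tenant backup, quota, and access control straightforward.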
Green Cities Artificial Intelligence
In an era defined by rapid urbanization, the effective planning and
management of cities have become paramount to ensure sustainable
development, efficient resource allocation, and enhanced quality of life
for residents. Traditional methods of urban planning and management
are grappling with the complexities and challenges presented by modern
cities. Enter Artificial Intelligence (AI), a disruptive technology that holds
immense potential to revolutionize the way cities are planned, designed,
and operated.
The primary aim of this report is to provide an in-depth exploration of the
multifaceted role that Artificial Intelligence plays in modern city planning
and management. Through a comprehensive analysis of key AI
applications, case studies, challenges, and ethical considerations, the
report aims to provide resources for urban planners, City staff, and
elected officials responsible for community planning and development.
These include a model City policy, draft informational public meeting
format, AI software and applications, implementation actions, AI
timeline, glossary, and research references. This report represents the
cumulative efforts of many participants and is sponsored by the City of
Salem and Sustainable City Year Program. The Green Cities AI project
website is at: https://blogs.uoregon.edu/artificialintelligence/.
As cities continue to evolve into complex ecosystems, the integration of
Artificial Intelligence stands as a pivotal force in shaping their
trajectories. Through this report, we aim to provide a comprehensive
understanding of how AI is transforming the way cities are planned,
operated, and experienced. By analyzing the tools, applications, and
ethical considerations, we hope to equip policymakers, urban planners,
and stakeholders with the insights needed to navigate the AI-driven
urban landscape effectively and create cities that are not only smart but
also sustainable, resilient, and regenerative.
This year's SCYP partnership is possible in part due to support from U.S. Senators Ron Wyden and Jeff Merkley, as well as former Congressman Peter DeFazio, who secured federal funding for SCYP through Congressionally Directed Spending. With additional funding from the city of Salem, the partnerships will allow UO students and faculty to study and make recommendations on city-identified projects and issues.
Using domain specific language and sequence to sequence models as a hybrid framework for a natural language interface to a database solution
This thesis was submitted for the award of Doctor of Philosophy and was awarded by Brunel University London.
The aim of this project is to provide a new approach to solving the problem of
converting natural language into a language capable of querying a database or data
repository. This problem has been around for a while: in the 1970s the US Navy
developed a solution called LADDER, and since then an array of solutions,
approaches and tweaks has kept the research community busy. The
introduction of electronic assistants into the smart phone in 2010 has given new
impetus to this problem.
With the increasingly pervasive nature of data and its ever-expanding use to
answer questions within business, science and medicine, extracting data is
becoming more important.
The idea behind this project is to make data more democratised by allowing access to it
without the need for specialist languages. Converting natural language into
structured query language can be unreliable in handling the nuances that are
prevalent in natural language, and relational databases are not designed to
understand language nuance.
This project introduces the following components as part of a holistic approach to improving
the conversion of a natural language statement into a language capable of querying a data
repository.
● The idea proposed in this project combines the use of sequence-to-sequence
models with natural language part-of-speech technologies and domain
specific languages to convert natural language queries into SQL. The
proposed approach uses natural language processing to perform an initial
shallow pass of the incoming query and then uses Google's TensorFlow to
refine the query with a sequence-to-sequence model.
● This thesis also proposes to use a Domain Specific Language (DSL) as part
of the conversion process. The use of the DSL has the potential to allow the
natural language query to be translated into more than just an SQL statement,
but into any query language, such as NoSQL or XQuery.
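The hybrid pipeline described in the bullets above can be sketched end to end: a shallow parse of the question, normalisation into a small intermediate DSL, and rendering of the DSL to SQL. A keyword pass stands in for the part-of-speech tagging and TensorFlow sequence-to-sequence stages, and the one-rule DSL grammar is an illustrative assumption, not the thesis's actual language.

```python
def shallow_parse(question: str) -> dict:
    """First pass: extract entity and filter cues from the question."""
    tokens = question.lower().rstrip("?").split()
    parsed = {"entity": None, "field": None, "value": None}
    if "customers" in tokens:
        parsed["entity"] = "customers"
    if "in" in tokens:  # toy rule: "in <place>" filters on a city column
        parsed["field"] = "city"
        parsed["value"] = tokens[tokens.index("in") + 1]
    return parsed

def to_dsl(parsed: dict) -> str:
    """Second pass: normalise into a small domain specific language."""
    return f'FIND {parsed["entity"]} WHERE {parsed["field"]} = "{parsed["value"]}"'

def dsl_to_sql(dsl: str) -> str:
    """Final pass: render the DSL to SQL. Because the DSL is the pivot,
    a different backend could render it to NoSQL or XQuery instead."""
    _, entity, _, field, _, value = dsl.split(maxsplit=5)
    clean = value.strip('"')
    return f"SELECT * FROM {entity} WHERE {field} = '{clean}';"

question = "Show all customers in London"
sql = dsl_to_sql(to_dsl(shallow_parse(question)))
print(sql)  # SELECT * FROM customers WHERE city = 'london';
```

The point of the intermediate DSL is visible in the last function: only the final rendering step knows about SQL, so swapping the target query language does not disturb the parsing stages.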