Towards interactive betweenness centrality estimation for transportation network using capsule network
Includes bibliographical references. 2022 Fall. The node importance of a graph needs to be estimated for many graph-based applications. One of the most popular metrics for measuring node importance is betweenness centrality (BC), which measures the amount of influence a node has over the flow of information in a graph. However, the computational complexity of calculating betweenness centrality is extremely high for large-scale graphs. This is especially true when analyzing the road networks of states with millions of nodes and edges, making it infeasible to calculate their BC in real time using traditional iterative methods. Applying a machine learning model to predict the importance of nodes provides an opportunity to address this issue. Graph Neural Networks (GNNs), which have been gaining popularity in recent years, are particularly well suited for graph analysis. In this study, we propose a deep learning architecture, RoadCaps, to estimate BC by merging Capsule Neural Networks with Graph Convolutional Networks (GCNs), a convolution-based GNN. We target the effective aggregation of features from neighbor nodes to approximate the correct BC of a node. We leverage the pattern-capturing strength of the capsule network to estimate node-level BC from the high-level information generated by the GCN block. We further compare the accuracy and effectiveness of RoadCaps against two other GCN-based models, and analyze the efficiency and effectiveness of RoadCaps along dimensions such as scalability and robustness. We perform an empirical benchmark on the road network of the entire state of California. The overall analysis shows that our proposed network provides more accurate road importance estimates, which is helpful for rapid response planning such as evacuation during wildfires and flooding.
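The abstract itself contains no code; as a hedged point of reference for the metric RoadCaps approximates, exact betweenness centrality on a small unweighted graph can be sketched with Brandes' algorithm in pure Python. The toy graph and names below are illustrative, not from the paper:

```python
from collections import defaultdict, deque

def betweenness(graph):
    """Exact betweenness centrality of an undirected, unweighted graph.

    graph: dict mapping each node to a list of neighbors.
    Exact computation like this is what becomes infeasible on road
    networks with millions of nodes, motivating learned approximations.
    """
    bc = {v: 0.0 for v in graph}
    for s in graph:
        # BFS from s, counting shortest paths (sigma) and predecessors
        dist = {s: 0}
        sigma = defaultdict(float)
        sigma[s] = 1.0
        preds = defaultdict(list)
        order = []
        queue = deque([s])
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in graph[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # accumulate pair dependencies in reverse BFS order
        delta = defaultdict(float)
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += (sigma[v] / sigma[w]) * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    # undirected graph: each pair is counted from both endpoints
    return {v: c / 2.0 for v, c in bc.items()}
```

On the path graph a-b-c this assigns b a centrality of 1 (it lies on the only shortest path between a and c) and 0 to both endpoints.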
Hierarchical Syntactic Models for Human Activity Recognition through Mobility Traces
Recognizing users' daily life activities without disrupting their lifestyle is a key functionality to enable a broad variety of advanced services for a Smart City, from energy-efficient management of urban spaces to mobility optimization. In this paper, we propose a novel method for human activity recognition from a collection of outdoor mobility traces acquired through wearable devices. Our method exploits the regularities naturally present in human mobility patterns to construct syntactic models in the form of finite state automata, thanks to an approach known as grammatical inference. We also introduce a measure of similarity that accounts for the intrinsic hierarchical nature of such models and makes it possible to identify the common traits in the paths induced by different activities at various granularity levels. Our method has been validated on a dataset of real traces representing movements of users in a large metropolitan area. The experimental results show the effectiveness of our similarity measure in correctly identifying a set of common coarse-grained activities, as well as their refinement at a finer level of granularity.
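Grammatical inference typically begins by folding the observed traces into a prefix tree acceptor, which state-merging heuristics then generalize into a more compact automaton. As a hedged, minimal sketch of that first step only (the trace symbols are invented, and this is not the paper's implementation):

```python
def build_pta(traces):
    """Build a prefix tree acceptor (PTA) from symbol sequences.

    Each distinct prefix of a trace becomes one state; following a
    trace's symbols from the root state leads to an accepting state.
    """
    pta = {0: {}}          # state id -> {symbol: next state id}
    accepting = set()
    next_id = 1
    for trace in traces:
        state = 0
        for sym in trace:
            if sym not in pta[state]:
                # new prefix: allocate a fresh state for it
                pta[state][sym] = next_id
                pta[next_id] = {}
                next_id += 1
            state = pta[state][sym]
        accepting.add(state)
    return pta, accepting
```

Two traces sharing the prefix (home, bus) produce a single branch that only splits at the last symbol; that kind of shared structure is what a hierarchical similarity measure can compare at different granularity levels.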
Decentralised location-based reputation management system in IOT using blockchain
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies. The Internet of Things (IoT) allows an object to connect to the internet and observe or interact with a physical phenomenon. Communication technologies allow an IoT device to discover and communicate with other devices to exchange services, much as humans do in their social networks. Knowing the reputation of another device is important for deciding whether to trust it before establishing a new connection, in order to avoid unexpected behaviour. The reputation of a device can also vary depending on its geographical location. Thus, this thesis proposed an architecture to manage the reputation values of end devices in an IoT system based on the area in which they are located. To avoid overloading the cloud layer, the proposed architecture follows the cloud-fog-edge concept by adding an intermediate layer called the fog layer. In this layer, multiple smaller devices are distributed, so the architecture uses Blockchain technology to keep reputation management consistent and fault-tolerant across the different nodes in the layer. Ethereum, a Blockchain implementation, was used in this work to ease the management functionality, because it allows the Blockchain network to run a decentralised application through Smart Contracts. The location-based part of the system was realised by storing geographical areas in the Smart Contracts and making the reputation values subject to different regions depending on each device's geographical location. To reduce the spatial computation complexity in the Smart Contracts, the geographical data are geocoded with one of two spatial indexing techniques, Geohash and S2. This work introduced three experiments: to test the proposed architecture, to deploy the architecture on IoT devices, and to compare the two geocoding techniques in the Smart Contracts. It also proposed a compression algorithm for the geocoded data. The results showed that the proposed architecture is able to manage reputation values based on location in a decentralised way. The test-case scenario also demonstrated that the IoT devices were able to work as Blockchain nodes, and that they could discover the service providers in an area and obtain their reputation values by querying through the fog layer. Lastly, the comparison experiment showed that Geohash performed better inside the developed Smart Contracts, while S2 encoded the data much faster outside the Smart Contracts. The proposed compression algorithm for geocoded data achieved a significant size reduction, but was computationally heavier in the developed Smart Contracts.
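The thesis compares Geohash and S2 inside Smart Contracts; as a hedged illustration of the Geohash side only (a minimal pure-Python encoder, not the code deployed in the thesis), note how the encoding interleaves longitude and latitude bits, so that nearby points share a common prefix, which is also what makes prefix-based compression of geocoded data attractive:

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # Geohash alphabet (no a, i, l, o)

def geohash_encode(lat, lon, precision=7):
    """Encode a latitude/longitude pair as a Geohash string.

    Repeatedly halves the longitude and latitude intervals, emitting
    one bit per halving; every 5 bits become one base-32 character.
    """
    lat_rng = [-90.0, 90.0]
    lon_rng = [-180.0, 180.0]
    code = []
    bit, ch = 0, 0
    even = True  # Geohash interleaves bits, starting with longitude
    while len(code) < precision:
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2.0
        if val >= mid:
            ch = (ch << 1) | 1
            rng[0] = mid
        else:
            ch = ch << 1
            rng[1] = mid
        even = not even
        bit += 1
        if bit == 5:  # flush 5 accumulated bits as one character
            code.append(BASE32[ch])
            bit, ch = 0, 0
    return "".join(code)
```

The point where the equator meets the prime meridian encodes to "s0000" at precision 5, and the longer the prefix two hashes share, the smaller the cell they both fall in.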
Supercritical CO2 extraction pilot plant design - towards IoT integration
Interest in high-pressure technology has increased intensively over the last decades. Supercritical Fluid Extraction (SFE) is a process that is growing in importance as an alternative to conventional separation processes. SFE uses environmentally friendly CO2 as the extracting agent because of its relatively low critical pressure (7.38 MPa), low critical temperature (304 K), non-hazardous character, and low cost. The process requires the use of high pressures. The extractor vessel (pressure vessel) is the most important piece of equipment in the system: it is where the supercritical conditions must be established and where the extraction occurs. Other devices in the process (separator vessel, heat exchangers, valves, etc.) must also be considered because of the high pressures used. Safety is the most important factor when dealing with SFE systems, and designing such equipment with full process safety is a very hard task. Therefore, to achieve the desired high safety level, a reliable control system must be designed as the control and data communication segment of the system. Various process parameters, such as the CO2 mass flow rate and the extraction pressures and temperatures, affect the extraction process and the quality of the extract; hence these parameters need to be precisely controlled and monitored during the extraction. The design of a supercritical CO2 extraction laboratory pilot plant and the development of a remote control and supervision system are presented in this paper. The developed SFE system (mechanical and electrical components) was compared with existing commercial systems, and its main advantages over them are presented. By enabling remote control and supervision, classical process control is joined with the concept of the Internet of Things (IoT), where information becomes omnipresent in the vast realm of the Internet.
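The abstract gives CO2's critical point (7.38 MPa, 304 K); any control system for such a plant must at minimum verify that both thresholds are exceeded before extraction can proceed, and alarm on over-pressure. A hedged, minimal sketch of such a check (the function names and the vessel design limit are illustrative assumptions, not taken from the plant's actual control software):

```python
# Critical point of CO2, as stated in the abstract
CRITICAL_PRESSURE_MPA = 7.38
CRITICAL_TEMPERATURE_K = 304.0

def is_supercritical(pressure_mpa, temperature_k):
    """CO2 is supercritical only above BOTH its critical pressure and temperature."""
    return (pressure_mpa > CRITICAL_PRESSURE_MPA
            and temperature_k > CRITICAL_TEMPERATURE_K)

def check_extractor(pressure_mpa, temperature_k, max_pressure_mpa=30.0):
    """Return a status string that a remote supervision layer could report.

    max_pressure_mpa is a hypothetical vessel design limit.
    """
    if pressure_mpa > max_pressure_mpa:
        return "ALARM: pressure above vessel design limit"
    if not is_supercritical(pressure_mpa, temperature_k):
        return "WAIT: supercritical conditions not yet reached"
    return "OK: extraction conditions established"
```

An IoT-integrated plant would evaluate a check like this on each sensor reading and push the status to the remote supervision interface.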
Anonymization of Event Logs for Network Security Monitoring
A managed security service provider (MSSP) must collect security event logs from its customers' networks for monitoring and cybersecurity protection. These logs need to be processed by the MSSP before being displayed to security operation center (SOC) analysts. Employees generate event logs during their working hours at the customers' sites. One challenge is that the collected event logs contain personally identifiable information (PII), visible in clear text to the SOC analysts or to any user with access to the SIEM platform.

We explore how pseudonymization can be applied to security event logs to help protect individuals' identities from the SOC analysts while preserving data utility when possible. We compare the impact of using different pseudonymization functions on sensitive information or PII. Non-deterministic methods provide a higher level of privacy but reduce the utility of the data.

Our contribution in this thesis is threefold. First, we study available architectures with different threat models, including their strengths and weaknesses. Second, we study pseudonymization functions and their application to PII fields; we benchmark them individually, as well as in our experimental platform. Last, we gather valuable feedback and lessons from SOC analysts based on their experience.

Existing works [39, 43, 44, 48] are generally restricted to the anonymization of IP traces, which is only one part of the SOC analysts' inspection of PCAP files. In one of the closest works [47], the authors provide useful, practical anonymization methods for IP addresses, ports, and raw logs.
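The deterministic-versus-non-deterministic tradeoff the thesis benchmarks can be sketched with Python's standard library. This is a hedged illustration, not the thesis's actual functions: a keyed HMAC stands in for the deterministic case, a random token for the non-deterministic one, and the example PII value is invented.

```python
import hashlib
import hmac
import secrets

def pseudonymize_deterministic(value, key):
    """Keyed hash: the same PII value always maps to the same pseudonym,
    so a SOC analyst can still correlate events from one user."""
    tag = hmac.new(key, value.encode(), hashlib.sha256)
    return tag.hexdigest()[:12]

def pseudonymize_random(value):
    """Random token: each occurrence is unlinkable across log lines,
    which raises privacy but destroys correlation (lower data utility)."""
    return secrets.token_hex(6)
```

With the deterministic variant, two log lines mentioning the same user share a pseudonym; with the random variant they do not, which is precisely the privacy/utility tradeoff measured in the thesis.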
Understanding the Socio-infrastructure Systems During Disaster from Social Media Data
Our socio-infrastructure systems are becoming more and more vulnerable due to the increased severity and frequency of extreme events every year. Effective disaster management can minimize the damaging impacts of a disaster to a large extent. The ubiquitous use of social media platforms on GPS-enabled smartphones offers a unique opportunity to observe, model, and predict human behavior during a disaster. This dissertation explores the opportunity of using social media data and different modeling techniques towards understanding and managing disasters more dynamically. We focus on four objectives. First, we develop a method to infer individual evacuation behaviors (e.g., evacuation decision, timing, destination) from social media data. We develop an input-output hidden Markov model to infer evacuation decisions from user tweets. Our findings show that, using geo-tagged posts and text data, a hidden Markov model can be developed to capture the dynamics of hurricane evacuation decisions. Second, we develop an evacuation demand prediction model using social media and traffic data. We find that, trained on social media and traffic data, a deep learning model can predict evacuation traffic demand well up to 24 hours ahead. Third, we present a multi-label classification approach to identify the co-occurrence of multiple types of infrastructure disruptions, considering the sentiment towards a disruption: whether a post is reporting an actual disruption (negative), a disruption in general (neutral), or not being affected by a disruption (positive). We validate our approach on data collected during multiple hurricanes. Fourth, we develop an agent-based model to understand the influence of multiple information sources on risk perception dynamics and evacuation decisions. In this part of the study, we explore the effects of socio-demographic factors and information sources such as social connectivity, neighborhood observation, and weather information and its credibility in forming risk perception dynamics and evacuation decisions.
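The dissertation's first objective decodes hidden evacuation decisions from observed tweets with an input-output hidden Markov model. As a hedged, much-simplified sketch of the underlying idea, here is a plain HMM decoded with the Viterbi algorithm; the states, observation categories, and all probabilities below are invented for illustration and are not the dissertation's model:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an observation sequence."""
    # V[t][s]: probability of the best path ending in state s at time t
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev, p = max(((r, V[t - 1][r] * trans_p[r][s]) for r in states),
                          key=lambda x: x[1])
            V[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # backtrack from the best final state
    best = max(states, key=lambda s: V[-1][s])
    path = [best]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Toy model: hidden evacuation decision inferred from tweet categories
states = ("stay", "evacuate")
start_p = {"stay": 0.9, "evacuate": 0.1}
trans_p = {"stay": {"stay": 0.8, "evacuate": 0.2},
           "evacuate": {"stay": 0.1, "evacuate": 0.9}}
emit_p = {"stay": {"routine": 0.7, "prep": 0.25, "on_road": 0.05},
          "evacuate": {"routine": 0.1, "prep": 0.3, "on_road": 0.6}}
path = viterbi(["routine", "prep", "on_road", "on_road"],
               states, start_p, trans_p, emit_p)
```

With these made-up probabilities the decoded sequence flips from "stay" to "evacuate" as soon as preparation- and road-trip-like posts appear.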
Preserving Privacy in Mobile Environments
Technology is improving day by day, and so is the usage of mobile devices. Every activity that used to involve manual, paper-based transactions can now be completed in seconds at your fingertips. On one hand, life has become fairly convenient with the help of mobile devices; on the other hand, the privacy of the data and of the transactions occurring in the process is under continuous threat. Mobile devices connect to a number of service providers for various reasons, including downloading data, online purchasing, or simply browsing information that may be irrelevant at a later point. Access to critical and sensitive information may be available in a number of places. In the case of a mobile device, the information may be held by the service provider, which could be any web portal. In all such scenarios, passing information or data from the service provider to the mobile device is a major challenge, as the data cannot be sent in plain-text format. The confidentiality and integrity of the data need to be protected; hence, the service provider must convert the data into an encrypted format before passing it on to the mobile device, to prevent the risks of sniffing and unauthorized disclosure of data. Preserving the location privacy of the individual user of any mobile device has also been a concern for a number of researchers.
Mobile devices have become an important tool in modern communication. Mobile and other handheld devices such as iPads and tablets have overtaken laptops and desktops, and hence there has been increasing research interest in this area in recent years. This includes improving the quality of communication and the overall end-to-end data security of day-to-day transactions. Mobile devices continuously connect to different service providers for day-to-day needs such as online purchases, online banking, and endless surfing for information. In addition, devices may connect to service providers to receive or send sensitive information. At the service provider's end, the data is stored with the provider, and the provider will only hand over the data if it confirms that the person who requested it is authorized to receive the information. The exchange of data from one end of the network to the other is a major challenge due to the risk of a malicious intruder mishandling the data. Hence, the confidentiality and integrity of the data need to be protected, either by transforming the sensitive information into a non-readable format or by converting it into ciphertext.
Privacy has been an open research problem, as more and more information is leaked on a day-to-day basis. Through this thesis, I have tried to address a number of areas within the privacy realm where information and data access and sharing are key concerns, alongside the key aspect of location privacy. I have also tried to address problems in the space of access control, for which I have proposed policy-based languages and extensions for ensuring appropriate access control methodologies. The main goal and focus of this work has been to enforce the importance of location privacy in mobile environments and to propose solutions that resolve the problems of where and when to enforce location security. Another key goal has been to create new access control and trust-based solutions to ensure the right level of access for the right receiver of information. Through my research, I have explored various privacy-related attacks and suggested appropriate countermeasures. In addition to proposing and showcasing solutions using policy languages for access control, I have also introduced geospatial access control solutions to ensure that the right user is accessing or requesting the right information from the right location. This helps ensure the appropriate use of the information by the right resource. In this thesis I have also given equal importance to the trust aspects of sharing information: I have created new trust assessment models that show how fused information can be handled and how trust can be imposed on the information provider and on the information itself.
The main contribution of this thesis is to address the problems around protecting data and individuals' privacy and to propose solutions that mitigate these issues using new and novel techniques. They can be detailed as follows:
In privacy there is always a privacy-versus-utility tradeoff, and in order to make use of utility, trust in the location is essential. Through this research I have developed: i) novel attestation models and access control methodologies, including Platform for Privacy Preferences (P3P) extensions; ii) Extensible Access Control Markup Language (XACML) extensions; iii) geospatial access control through GeoXACML; iv) new methodologies to enforce location privacy, showing where privacy is best enforced; v) a demonstration that global attestation is crucial for privacy and needs accurate methods in place to attest a user's location information for access; vi) a new trust assessment framework, which I have proposed, developed, and implemented. Fusing location information is crucial, since a number of similar or conflicting reports can be produced about a common source, and it is very important to assess and evaluate the trust level in the information: the framework passes incoming information to a rule engine that draws inferences, after which a trust assessment module computes the trust score based on a forward-chaining or backward-chaining scheme; the framework is used to evaluate trust in fused information in a streaming setup. vii) Finally, I have created new solutions that examine similarity profiles and enforce identity through profiling, and I have shown methods of anonymisation for both location privacy and identity privacy.
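Point vi) describes a framework that scores trust in fused, possibly conflicting location reports. As a hedged, minimal sketch of the fusion idea only (a trust-weighted vote with invented report data, not the thesis's rule-engine framework):

```python
def fuse_trust(reports):
    """Fuse possibly conflicting claims about one source.

    reports: list of (claimed_value, source_trust) pairs, trust in [0, 1].
    Returns the winning claim and its share of the total trust mass.
    """
    scores = {}
    for value, trust in reports:
        # trust-weighted vote: each source adds its trust to its claim
        scores[value] = scores.get(value, 0.0) + trust
    total = sum(scores.values())
    best = max(scores, key=scores.get)
    return best, scores[best] / total
```

Two moderately trusted sources agreeing can outweigh one highly trusted dissenter; a rule engine, as in the thesis, would adjust the per-source trust values before this fusion step.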