Leveraging data from a smart card automatic fare collection system for public transit planning
ABSTRACT A public transit system is an artificial and complex creature. The interaction between operators' supply and users' demand is at once spatial and temporal; it is difficult to measure and in constant evolution. There is a continuous quest for information and methodology that can help reveal and facilitate the understanding of this dynamic relationship, so that public transit services can be better organized to suit travellers' needs. Recent paradigm shifts have contributed to the reshaping of this process. On the one hand, public transit service has become more performance-driven and customer-oriented, which requires data not covered by traditional survey methods. On the other hand, advances in passive data collection methods and their adoption by transit operators are progressively transforming the industry from data-poor to data-rich. Traditional analysis and planning tools were adapted to past conditions and are not suited to fully leverage the new sources of data. At the confluence of these evolutions lie an opportunity and a challenge: to embrace the data-rich environment with a view to reconciling it with the increasingly demanding data needs of public transit. The research is based on a set of validation data from a smart card automatic fare collection (AFC) system.
The goal of the research is to develop new methods in data processing, data enrichment and data analysis in order to better quantify transit demand, enhance operations planning, improve system management and understand travel behaviour. The primary dataset comes from the smart card AFC system of the Société de transport de l'Outaouais (STO). The system is equipped with GPS, and the dataset contains all fare validations in disaggregate form for the month of February 2005. Information technologies, including relational databases, geographic information systems (GIS), spatial statistics, data mining and visualization, are the main data processing and analysis tools. Three overall principles guide the research: the information-based (data-driven) approach, the totally disaggregated approach and the object-oriented approach. Combined with multi-day smart card data, these principles lead to the multi-day information approach, a new concept used in the proposed data processing and enrichment procedures. The assumption is that each day of data represents partial information about the universe and may contain errors. By synthesizing the correct information from each day, it is possible to reconstruct complete knowledge, which is in turn used as a reference to analyze and interpret multi-day data.
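The multi-day information idea above — each day is partial and possibly erroneous, and a reference is synthesized across days — can be sketched in a toy form. This is a minimal illustration, not the thesis's actual procedure; the function name, the tuple layout, and the majority-vote rule are all assumptions made for the example.

```python
from collections import Counter, defaultdict

def build_reference_profile(validations):
    """Synthesize a per-card reference boarding stop from multi-day data.

    `validations` is a hypothetical list of (card_id, day, boarding_stop)
    tuples. Each day is treated as partial, possibly erroneous information;
    the most frequent stop across days is kept as the reference, against
    which individual days can later be compared and interpreted.
    """
    by_card = defaultdict(Counter)
    for card_id, day, stop in validations:
        by_card[card_id][stop] += 1
    # Majority vote across days stands in for the "complete knowledge"
    # reconstructed from multiple partial daily observations.
    return {card: stops.most_common(1)[0][0] for card, stops in by_card.items()}

sample = [
    ("c1", 1, "A"), ("c1", 2, "A"), ("c1", 3, "B"),  # day 3 looks anomalous
    ("c2", 1, "C"), ("c2", 2, "C"),
]
print(build_reference_profile(sample))  # → {'c1': 'A', 'c2': 'C'}
```

In the actual research this synthesis would operate on far richer records (GPS-stamped validations, routes, times) inside a relational database, but the principle is the same: agreement across days filters out single-day noise.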
Validation of design artefacts for blockchain-enabled precision healthcare as a service.
Healthcare systems around the globe are currently experiencing a rapid wave of digital disruption.
Current research applies emerging technologies such as Big Data (BD), Artificial Intelligence
(AI), Machine Learning (ML), Deep Learning (DL), Augmented Reality (AR), Virtual Reality (VR),
Digital Twin (DT), Wearable Sensor (WS), Blockchain (BC) and Smart Contract (SC) technologies
to contact tracing, tracking, drug discovery, care support and delivery, and vaccine distribution,
management and delivery. These disruptive innovations have made it feasible for the healthcare
industry to provide personalised digital health solutions and services to people and to ensure
sustainability in healthcare. Precision Healthcare (PHC) is a new addition to digital healthcare
that can support personalised needs, focusing on precise healthcare delivery. Despite such
potential, recent studies show that PHC is ineffective due to low patient adoption, and anecdotal
evidence suggests that people refrain from adopting PHC because of distrust.
This thesis presents a BC-enabled PHC ecosystem that addresses ongoing issues and challenges
regarding low opt-in. The designed ecosystem also incorporates emerging information
technologies that have the potential to address the need for user-centricity, data privacy and
security, accountability, transparency, interoperability, and scalability in a sustainable PHC
ecosystem. The research adopts Soft Systems Methodology (SSM) to construct and validate the
design artefact and sub-artefacts of the proposed PHC ecosystem that address the low opt-in problem.
Following a comprehensive review of the scholarly literature, which resulted in a draft set of
design principles and rules, eighteen design refinement interviews were conducted to develop the
artefact and sub-artefacts into design specifications. The artefact and sub-artefacts were validated
through a design validation workshop, where the designed ecosystem was presented to a Delphi
panel of twenty-two health industry actors. The key research finding was that there is a need for
data-driven, secure, transparent, scalable, individualised healthcare services to achieve
sustainability in healthcare. This includes explainable AI, data standards for biosensor devices,
affordable BC storage solutions, privacy and security policy, interoperability, and user-centricity,
which prompt further research and industry application. The proposed ecosystem is potentially
effective in growing trust, encouraging patients to engage actively with real-world
implementations, and contributing to sustainability in healthcare.
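The abstract's appeal to blockchain for transparency and accountability rests on tamper-evident, append-only records. A minimal hash-chain sketch can illustrate that property; this is not the thesis's artefact, and the record fields (`payload`, `prev`, `hash`) and function names are assumptions made purely for illustration.

```python
import hashlib
import json

def add_record(chain, payload):
    """Append a tamper-evident record to a minimal hash chain.

    Each block commits to its payload and to the previous block's hash,
    so any later modification of an earlier record breaks verification.
    """
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({"payload": payload, "prev": prev_hash, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash; return False if any record was altered."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"payload": block["payload"], "prev": prev}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
    return True

chain = []
add_record(chain, {"patient": "p1", "consent": "granted"})
add_record(chain, {"patient": "p1", "consent": "revoked"})
print(verify(chain))  # → True
```

A real BC deployment adds distributed consensus, access control and off-chain storage on top of this basic integrity guarantee, which is where the thesis's concerns about affordability and scalability arise.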
Accountants' index. Twenty-eighth supplement, January-December 1979, volume 2: M-Z
https://egrove.olemiss.edu/aicpa_accind/1034/thumbnail.jp
Accountants' index. Twenty-eighth supplement, January-December 1979, volume 1: A-L
https://egrove.olemiss.edu/aicpa_accind/1033/thumbnail.jp
Task Allocation in Foraging Robot Swarms: The Role of Information Sharing
Autonomous task allocation is a desirable feature of robot swarms that collect and deliver items in scenarios where congestion, caused by accumulated items or robots, can temporarily interfere with swarm behaviour. In such settings, self-regulation of the workforce can prevent unnecessary energy consumption. We explore two types of self-regulation: non-social, where robots become idle upon experiencing congestion, and social, where robots broadcast information about congestion to their teammates in order to socially inhibit foraging. We show that while both types of self-regulation can lead to improved energy efficiency and increase the amount of resource collected, the speed with which information about congestion flows through a swarm affects the scalability of these algorithms.
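The contrast between non-social and social self-regulation described above can be captured in a deliberately crude simulation: congestion is just "too many robots at the depot at once", and the only difference between the two modes is who gets inhibited when it occurs. This is a toy model under invented parameters, not the paper's actual algorithm or experimental setup.

```python
import random

def forage(num_robots, steps, social, congestion_threshold=3, seed=1):
    """Toy foraging model contrasting two self-regulation modes.

    Robots probabilistically visit a shared depot each step; more than
    `congestion_threshold` simultaneous visitors counts as congestion,
    during which nothing is collected. Non-social robots (social=False)
    idle only if they personally experienced the congestion; social
    robots (social=True) broadcast it, inhibiting the whole swarm.
    All parameters here are illustrative assumptions.
    """
    rng = random.Random(seed)
    idle_until = [0] * num_robots  # step until which each robot stays idle
    collected = 0
    for t in range(steps):
        active = [r for r in range(num_robots) if idle_until[r] <= t]
        at_depot = [r for r in active if rng.random() < 0.5]
        congested = len(at_depot) > congestion_threshold
        if congested:
            # Social inhibition reaches everyone; non-social only those present.
            for r in (range(num_robots) if social else at_depot):
                idle_until[r] = t + 2
        else:
            collected += len(at_depot)
    return collected

for mode in (False, True):
    print("social" if mode else "non-social", forage(num_robots=10, steps=100, social=mode))
```

Even this crude model makes the paper's scalability point visible: in the social case the cost and benefit of inhibition both scale with how far and how fast the congestion signal propagates, which in a real swarm depends on communication range and topology.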