Mobile heritage practices. Implications for scholarly research, user experience design, and evaluation methods using mobile apps.
Mobile heritage apps have become one of the most popular means for audience
engagement and curation of museum collections and heritage contexts. This
raises practical and ethical questions for both researchers and practitioners, such
as: What kind of audience engagement can be built using mobile apps? What are
the current approaches? How can audience engagement with these experiences
be evaluated? How can those experiences be made more resilient, and in turn
sustainable? In this thesis I explore experience design scholarship together with
personal professional insights to analyse digital heritage practices with a view to
accelerating thinking about and critique of mobile apps in particular. As a result,
the chapters that follow here look at the evolution of digital heritage practices,
examining the cultural, societal, and technological contexts in which mobile
heritage apps are developed by the creative media industry and academic
institutions, and how these forces are shaping user experience design
methods. Drawing from studies in digital (critical) heritage, Human-Computer
Interaction (HCI), and design thinking, this thesis provides a critical analysis of
the development and use of mobile practices for heritage. Furthermore,
through an empirical and embedded approach to research, the thesis also
presents auto-ethnographic case studies in order to show that mobile
experiences conceptualised through more organic design approaches can result in
more resilient and sustainable heritage practices. By doing so, this thesis
encourages a renewed understanding of the pivotal role of these practices in the
broader sociocultural, political and environmental changes.
Regulating competition in the digital network industry: A proposal for progressive ecosystem regulation
The digital sector is a cornerstone of the modern economy, and regulating digital enterprises can be considered the new frontier for regulators and competition authorities. To capture and address the competitive dynamics of digital markets we need to rethink our (competition) laws and regulatory strategies. The thesis develops new approaches to regulating digital markets by viewing them as part of a network industry. By combining insights from our experiences with existing regulation in telecommunications with insights from economics literature and management theory, the thesis concludes by proposing a new regulatory framework called "progressive ecosystem regulation". The thesis is divided into three parts and has three key findings or contributions. The first part explains why digital platforms such as Google's search engine, Meta's social media platforms and Amazon's Marketplace are prone to monopolization. Here, the thesis develops a theory of "digital natural monopoly", which explains why competition in digital platform markets is likely to lead to concentration by its very nature. The second part of the thesis puts forward that competition in digital markets persists, even if there is monopoly in a market. Here, the thesis develops a conceptual framework for competition between digital ecosystems, which consist of groups of actors and products. Digital enterprises compete to carve out a part of the digital network industry where they can exert control, and their strong position in a platform market can be used offensively or defensively to steer competition between ecosystems. The thesis then sets out four phases of ecosystem competition, which help to explain when competition in the digital network industry is healthy and when it is likely to become problematic. The third and final part of the thesis brings together these findings and draws lessons from our experiences of regulating the network industry for telecommunications.
Based on the insights developed in the thesis, it puts forward a proposal for "progressive ecosystem regulation". The purpose of this regulation is to protect and empower entrants against large digital ecosystems so that they can develop new products and innovate disruptively. This regulatory framework would create three regulatory pools: a heavily regulated pool, a lightly regulated pool and an entrant pool. The layered regulatory framework allows regulators to adjust relatively quickly who receives protection under the regulation and who faces the burdens, so that the regulatory framework reflects the fast pace of innovation and the changing nature of digital markets. With this proposal, the thesis challenges and enriches our existing notions of regulation and specifically how we should regulate digital markets.
Application of knowledge management principles to support maintenance strategies in healthcare organisations
Healthcare is a vital service that touches people's lives on a daily basis by providing treatment and
resolving patients' health problems through the staff. Human lives are ultimately dependent on the skilled
hands of the staff and those who manage the infrastructure that supports the daily operations of the
service, making it a compelling reason for a dedicated research study. However, the UK healthcare sector
is undergoing rapid changes, driven by rising costs, technological advancements, changing patient
expectations, and increasing pressure to deliver sustainable healthcare. With the global rise in healthcare
challenges, the need for sustainable healthcare delivery has become imperative. Sustainable healthcare
delivery requires the integration of various practices that enhance the efficiency and effectiveness of
healthcare infrastructural assets. One critical area that requires attention is the management of
healthcare facilities.
Healthcare facilities are considered one of the core elements in the delivery of effective healthcare services,
as shortcomings in the provision of facilities management (FM) services in hospitals may have far more
drastic negative effects than in other types of buildings. An essential element in healthcare
FM is linked to the relationship between action and knowledge. With a full sense of understanding of
infrastructural assets, it is possible to improve, manage and make buildings suitable to the needs of users
and to ensure the functionality of the structure and processes.
The premise of FM is that an organisation's effectiveness and efficiency are linked to the physical
environment in which it operates and that improving the environment can result in direct benefits in
operational performance. The goal of healthcare FM is to support the achievement of organisational
mission and goals by designing and managing space and infrastructural assets in the best combination of
suitability, efficiency, and cost. In operational terms, performance refers to how well a building
contributes to fulfilling its intended functions.
Therefore, comprehensive deployment of efficient FM approaches is essential for ensuring quality
healthcare provision while positively impacting overall patient experiences. In this regard, incorporating
knowledge management (KM) principles into hospitals' FM processes contributes significantly to ensuring
sustainable healthcare provision and enhancement of patient experiences. Organisations implementing
KM principles are better positioned to navigate the constantly evolving business ecosystem easily.
Furthermore, KM is vital in processes and service improvement, strategic decision-making, and
organisational adaptation and renewal.
In this regard, KM principles can be applied to improve hospital FM, thereby ensuring sustainable
healthcare delivery. Knowledge management assumes that organisations that manage their
organisational and individual knowledge more effectively will be able to cope more successfully with the challenges of the new business ecosystem. There is also the argument that KM plays a crucial role in
improving processes and services, strategic decision-making, and adapting and renewing an organisation.
The goal of KM is to aid action by providing "a knowledge pull" rather than the information overload most
people experience in healthcare FM. Other motivations for seeking better KM in healthcare FM include
patient safety, evidence-based care, and cost efficiency. The strongest evidence for the success of such
approaches is found at knowledge bottlenecks, such as infection prevention and control, safe working,
compliance, automated systems and reminders, and recall based on best practices. The
ability to cultivate, nurture and maximise knowledge at multiple levels and in multiple contexts is one of
the most significant challenges for those responsible for KM. However, despite the potential benefits,
applying KM principles in hospital facilities is still limited. There is a lack of understanding of how KM can
be effectively applied in this context, and few studies have explored the potential challenges and
opportunities associated with implementing KM principles in hospital facilities for sustainable healthcare
delivery.
This study explores applying KM principles to support maintenance strategies in healthcare organisations.
The study also explores the challenges and opportunities, for healthcare organisations and FM
practitioners, in operationalising a framework which draws out the interconnectedness between healthcare FM and KM.
The study begins by defining healthcare FM and its importance in the healthcare industry. It then discusses
the concept of KM and the different types of knowledge that are relevant in the healthcare FM sector.
The study also examines the challenges that healthcare FM faces in managing knowledge and how the
application of KM principles can help to overcome these challenges. The study then explores the different
KM strategies that can be applied in healthcare FM. The KM benefits include improved patient outcomes,
reduced costs, increased efficiency, and enhanced collaboration among healthcare professionals.
Additionally, issues like creating a culture of innovation, technology, and benchmarking are considered.
In addition, a framework that integrates the essential concepts of KM in healthcare FM will be presented
and discussed.
The field of KM is introduced as a complex adaptive system with numerous possibilities and challenges.
In this context, and in consideration of healthcare FM, five objectives have been formulated to achieve
the research aim. As part of the research, a number of objectives will be evaluated, including appraising
the concept of KM and how knowledge is created, stored, transferred, and utilised in healthcare FM,
evaluating the impact of organisational structure on job satisfaction as well as exploring how cultural
differences impact knowledge sharing and performance in healthcare FM organisations.
This study uses a combination of qualitative methods, such as meetings, observations, document analysis
(internal and external), and semi-structured interviews, to discover the subjective experiences of
healthcare FM employees and to understand the phenomenon, and the attitudes of those working in healthcare FM, within a real-world context. Open questions were used to allow probing where appropriate
and to facilitate KM development in the delivery and practice of healthcare FM.
The study describes the research methodology using the theoretical concept of the "research onion". The
qualitative research was conducted in the NHS acute and non-acute hospitals in Northwest England.
Findings from the research study revealed that while the concept of KM has grown significantly in recent
years, KM in healthcare FM has received little or no attention. The target population was fifty (five FM
directors, five academics, five industry experts, ten managers, ten supervisors, five team leaders and ten
operatives). These seven groups were purposively selected as the target population because they play a
crucial role in KM enhancement in healthcare FM. Face-to-face interviews were conducted with all
participants based on their pre-determined availability. Of the target population of 50, 25 were
successfully interviewed, at which point saturation was reached. Data collected from the interviews were coded and
analysed using NVivo to identify themes and patterns related to KM in healthcare FM.
The study is divided into eight major sections. First, it discusses literature findings regarding healthcare
FM and KM, including underlying trends in FM, KM in general, and KM in healthcare FM. Second, the
research establishes the study's methodology, introducing the five research objectives, questions and
hypothesis. The chapter introduces the literature on methodology elements, including philosophical views
and inquiry strategies. The interview and data analysis chapters examine the feedback from the interviews. Lastly, a
conclusion and recommendation summarise the research objectives and suggest further research.
Overall, this study highlights the importance of KM in healthcare FM and provides insights for healthcare
FM directors, managers, supervisors, academia, researchers and operatives on effectively leveraging
knowledge to improve patient care and organisational effectiveness.
Responsible AI in Africa
This open access book contributes to the discourse of Responsible Artificial Intelligence (AI) from an African perspective. It is a unique collection that brings together prominent AI scholars to discuss AI ethics from theoretical and practical African perspectives and makes a case for African values, interests, expectations and principles to underpin the design, development and deployment (DDD) of AI in Africa. The book is a first in that it pays attention to the socio-cultural contexts of Responsible AI that is sensitive to African cultures and societies. It makes an important contribution to the global AI ethics discourse that often neglects AI narratives from Africa despite growing evidence of DDD in many domains. Nine original contributions provide useful insights to advance the understanding and implementation of Responsible AI in Africa, including discussions on the epistemic injustice of global AI ethics, opportunities and challenges, an examination of AI co-bots and chatbots in an African workspace, gender and AI, a consideration of African philosophies such as Ubuntu in the application of AI, African AI policy, and a look towards a future of Responsible AI in Africa.
An Experimental Evaluation of Machine Learning Training on a Real Processing-in-Memory System
Training machine learning (ML) algorithms is a computationally intensive
process, which is frequently memory-bound due to repeatedly accessing large
training datasets. As a result, processor-centric systems (e.g., CPU, GPU)
suffer from costly data movement between memory units and processing units,
which consumes large amounts of energy and execution cycles. Memory-centric
computing systems, i.e., with processing-in-memory (PIM) capabilities, can
alleviate this data movement bottleneck.
Our goal is to understand the potential of modern general-purpose PIM
architectures to accelerate ML training. To do so, we (1) implement several
representative classic ML algorithms (namely, linear regression, logistic
regression, decision tree, K-Means clustering) on a real-world general-purpose
PIM architecture, (2) rigorously evaluate and characterize them in terms of
accuracy, performance and scaling, and (3) compare to their counterpart
implementations on CPU and GPU. Our evaluation on a real memory-centric
computing system with more than 2500 PIM cores shows that general-purpose PIM
architectures can greatly accelerate memory-bound ML workloads, when the
necessary operations and datatypes are natively supported by PIM hardware. For
example, our PIM implementation of decision tree is faster than a
state-of-the-art CPU version on an 8-core Intel Xeon, and faster
than a state-of-the-art GPU version on an NVIDIA A100. Our K-Means clustering
on PIM is likewise faster than state-of-the-art CPU and GPU
versions.
To our knowledge, our work is the first one to evaluate ML training on a
real-world PIM architecture. We conclude with key observations, takeaways, and
recommendations that can inspire users of ML workloads, programmers of PIM
architectures, and hardware designers & architects of future memory-centric
computing systems.
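To make the memory-bound character of these workloads concrete, consider a minimal K-Means training loop: every iteration streams over the entire dataset, so on processor-centric systems data movement, not computation, tends to dominate at scale. The following is an illustrative plain-Python sketch with hypothetical toy data, not the PIM implementation evaluated in this work.

```python
# Minimal K-Means: each iteration re-reads the whole dataset,
# which is what makes the workload memory-bound at scale.

def kmeans(points, k, iters=10):
    # Deterministic initialisation: first k points as centroids.
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        # Assignment step: one full pass over the dataset.
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Update step: another pass over every assigned point.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = [sum(x) / len(cl) for x in zip(*cl)]
    return centroids

# Two well-separated 1-D clusters around 0.5 and 10.5.
data = [(0.0,), (1.0,), (0.5,), (10.0,), (11.0,), (10.5,)]
cents = sorted(c[0] for c in kmeans(data, k=2))
print(cents)  # [0.5, 10.5]
```

On a PIM system, the intent is that the assignment and update passes run where the points reside, avoiding the round trips to a central processor that this sketch implies.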
Development of traceability solution for furniture components
Double-degree master's programme with UTFPR - Universidade Tecnológica Federal do Paraná. In the contemporary context, characterized by intensified global competition and the constant evolution of the globalization landscape, it becomes imperative for industries, including Small and Medium Enterprises (SMEs), to undertake efforts to enhance their operational processes, often through digital technological adaptation. The present study falls within the scope of the project named "Wood Work 4.0," which aims to infuse innovation into the wood furniture manufacturing industry through process optimization and the adoption of digital technologies. This project received funding from the European Union Development Fund, in collaboration with the North 2020 Regional Program, and was carried out at the Carpintaria Mofreita company, located in Macedo de Cavaleiros, Portugal. In this regard, this study introduces a software architecture that supports the traceability of projects in the wood furniture industry and simultaneously employs a system to identify and manage material leftovers, aiming for more efficient waste management.
For the development of this software architecture, an approach was adopted that integrates the Fiware platform, specialized in systems for the Internet of Things (IoT), with an Application Programming Interface (API) specifically created to manage information about users, projects, and associated media files. The material leftovers identification system employs image processing techniques to extract geometric characteristics of the materials. Additionally, these data are integrated into the company's database. In
this way, it was possible to develop an architecture that allows not only the capturing of project information but also its effective management. In the case of material leftovers identification, the system was able to establish, with a satisfactory degree of accuracy, the dimensions of the materials, enabling the insertion of these data into the company's database for resource management and optimization.
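As a rough illustration of the geometric-characteristics step, the sketch below computes the bounding-box dimensions of a leftover piece from a binary mask and scales them with a camera calibration factor. All names, the mask, and the `mm_per_px` factor are hypothetical simplifications; the actual system derives such features from camera images using fuller image-processing techniques.

```python
def piece_dimensions(mask, mm_per_px=2.5):
    """Bounding-box width/height of the foreground (1) pixels in a
    binary mask, scaled to millimetres by a camera calibration factor."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    if not rows:
        return None  # no piece detected in the frame
    width_px = max(cols) - min(cols) + 1
    height_px = max(rows) - min(rows) + 1
    return (width_px * mm_per_px, height_px * mm_per_px)

# A 4x3-pixel piece inside a 6x8 frame.
mask = [
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
]
print(piece_dimensions(mask))  # (10.0, 7.5)
```

The returned dimensions are what would be written to the company database for leftover reuse and waste management.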
Structured parallelism discovery with hybrid static-dynamic analysis and evaluation technique
Parallel computer architectures have dominated the computing landscape for the
past two decades; a trend that is only expected to continue and intensify, with increasing specialization and heterogeneity. This creates huge pressure across the software
stack to produce programming languages, libraries, frameworks and tools which will
efficiently exploit the capabilities of parallel computers, not only for new software, but
also revitalizing existing sequential code. Automatic parallelization, despite decades of
research, has had limited success in transforming sequential software to take advantage
of efficient parallel execution. This thesis investigates three approaches that use commutativity analysis as the enabler for parallelization. This has the potential to overcome
limitations of traditional techniques.
We introduce the concept of liveness-based commutativity for sequential loops.
We examine the use of a practical analysis utilizing liveness-based commutativity in a
symbolic execution framework. Symbolic execution represents input values as groups
of constraints, consequently deriving the output as a function of the input and enabling
the identification of further program properties. We employ this feature to develop an
analysis and discern commutativity properties between loop iterations. We study the
application of this approach on loops taken from real-world programs in the OLDEN
and NAS Parallel Benchmark (NPB) suites, and identify its limitations and related
overheads.
Informed by these findings, we develop Dynamic Commutativity Analysis (DCA), a
new technique that leverages profiling information from program execution with specific
input sets. Using profiling information, we track liveness information and detect loop
commutativity by examining the code's live-out values. We evaluate DCA against almost
1400 loops of the NPB suite, discovering 86% of them as parallelizable. Comparing
our results against dependence-based methods, we match the detection efficacy of two
dynamic and outperform three static approaches, respectively. Additionally, DCA is
able to automatically detect parallelism in loops which iterate over Pointer-Linked
Data Structures (PLDSs), taken from a wide range of benchmarks used in the literature,
where all other techniques we considered failed. Parallelizing the discovered loops, our
methodology achieves an average speedup of 3.6× across NPB (and up to 55×) and up
to 36.9× for the PLDS-based loops on a 72-core host. We also demonstrate that our
methodology, despite relying on specific input values for profiling each program, is able
to correctly identify parallelism that is valid for all potential input sets.
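The intuition behind a dynamic commutativity check can be sketched as follows: execute a loop's iterations in two different orders on the same input and compare the live-out values; if they match, the iterations commuted for that input. The snippet below is a deliberate plain-Python simplification (the function names and the reverse-order heuristic are illustrative, not the DCA implementation): a reduction passes the check while a scan fails it, because each scan iteration depends on the running prefix.

```python
def commutes_on_input(body, iterations, make_state):
    """Dynamic commutativity check (simplified): run the loop body over
    `iterations` in program order and in reverse order, then compare the
    live-out state of the two runs. Agreement is evidence, not proof,
    of commutativity for this input."""
    forward = make_state()
    for i in iterations:
        body(forward, i)
    backward = make_state()
    for i in reversed(iterations):
        body(backward, i)
    return forward == backward

# A reduction commutes: the live-out sum is order-independent.
def sum_body(state, i):
    state["acc"] += i

# A scan does not: each output element depends on the running prefix.
def scan_body(state, i):
    state["out"].append((state["out"][-1] if state["out"] else 0) + i)

print(commutes_on_input(sum_body, range(8), lambda: {"acc": 0}))   # True
print(commutes_on_input(scan_body, range(8), lambda: {"out": []})) # False
```

A real analysis would track liveness to decide which parts of the state count as live-out, and would profile many input sets rather than one permutation.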
Lastly, we develop a methodology to utilize liveness-based commutativity, as implemented in DCA, to detect latent loop parallelism in the shape of patterns. Our approach
applies a series of transformations which subsequently enable multiple applications
of DCA over the generated multi-loop code section and match its loop commutativity
outcomes against the expected criteria for each pattern. Applying our methodology on
sets of sequential loops, we are able to identify well-known parallel patterns (i.e., maps,
reductions and scans). This extends the scope of parallelism detection to loops, such
as those performing scan operations, which cannot be determined as parallelizable by
simply evaluating liveness-based commutativity conditions on their original form.
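Why pattern matching extends detection beyond direct commutativity can be illustrated with a scan: the original loop carries a dependence from each iteration to the next, but a standard two-phase blocked decomposition turns it into per-block loops whose work can proceed in parallel. The sketch below is a generic textbook transformation in plain Python, not the multi-loop code produced by the methodology described above.

```python
def scan(xs):
    # Sequential inclusive prefix sum: iteration i depends on i-1,
    # so this loop is not commutative in its original form.
    out, acc = [], 0
    for x in xs:
        acc += x
        out.append(acc)
    return out

def blocked_scan(xs, block=4):
    # Phase 1: independent per-block scans (a parallelisable map).
    blocks = [scan(xs[i:i + block]) for i in range(0, len(xs), block)]
    # Phase 2: shift each block by its running offset; once the offsets
    # are known, the per-block additions are again independent.
    out, offset = [], 0
    for b in blocks:
        out.extend(v + offset for v in b)
        offset += b[-1]
    return out

xs = list(range(1, 11))
print(blocked_scan(xs))  # [1, 3, 6, 10, 15, 21, 28, 36, 45, 55]
```

Checking commutativity of the generated phase-1 and phase-2 loops, rather than of the original loop, is what lets a pattern-based approach classify such code as a parallel scan.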
Data Management for Dynamic Multimedia Analytics and Retrieval
Multimedia data in its various manifestations poses a unique challenge from a data storage and data management perspective, especially if search, analysis and analytics in large data corpora is considered. The inherently unstructured nature of the data itself and the curse of dimensionality that afflicts the representations we typically work with in its stead are cause for a broad range of issues that require sophisticated solutions at different levels. This has given rise to a huge corpus of research that puts focus on techniques that allow for effective and efficient multimedia search and exploration. Many of these contributions have led to an array of purpose-built, multimedia search systems.
However, recent progress in multimedia analytics and interactive multimedia retrieval, has demonstrated that several of the assumptions usually made for such multimedia search workloads do not hold once a session has a human user in the loop. Firstly, many of the required query operations cannot be expressed by mere similarity search and since the concrete requirement cannot always be anticipated, one needs a flexible and adaptable data management and query framework. Secondly, the widespread notion of staticity of data collections does not hold if one considers analytics workloads, whose purpose is to produce and store new insights and information. And finally, it is impossible even for an expert user to specify exactly how a data management system should produce and arrive at the desired outcomes of the potentially many different queries.
Guided by these shortcomings and motivated by the fact that similar questions have once been answered for structured data in classical database research, this Thesis presents three contributions that seek to mitigate the aforementioned issues. We present a query model that generalises the notion of proximity-based query operations and formalises the connection between those queries and high-dimensional indexing. We complement this by a cost-model that makes the often implicit trade-off between query execution speed and results quality transparent to the system and the user. And we describe a model for the transactional and durable maintenance of high-dimensional index structures.
All contributions are implemented in the open-source multimedia database system Cottontail DB, on top of which we present an evaluation that demonstrates the effectiveness of the proposed models. We conclude by discussing avenues for future research in the quest for converging the fields of databases on the one hand and (interactive) multimedia retrieval and analytics on the other.
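The generalised proximity-based query operation can be pictured as a k-nearest-neighbour lookup parameterised by an arbitrary distance function. The brute-force sketch below (hypothetical names, plain Python) is exact but linear in collection size; the point of a cost model is to decide when a high-dimensional index should answer such a query, trading result quality for speed.

```python
import heapq
import math

def knn(query, vectors, k=3, dist=None):
    """Proximity query: return the ids of the k vectors closest to
    `query` under a pluggable distance function (Euclidean default).
    A brute-force scan is exact but touches every vector."""
    if dist is None:
        dist = lambda a, b: math.dist(a, b)
    return [vid for _, vid in heapq.nsmallest(
        k, ((dist(query, v), vid) for vid, v in vectors.items()))]

# A tiny toy collection of 2-D feature vectors.
vectors = {
    "a": (0.0, 0.0),
    "b": (1.0, 0.0),
    "c": (0.0, 2.0),
    "d": (5.0, 5.0),
}
print(knn((0.1, 0.1), vectors, k=2))  # ['a', 'b']
```

Making the distance function a parameter is the essence of generalising beyond fixed similarity search: the same operator covers different metrics, and the system, not the user, chooses the execution strategy.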