Learning to use data analytics to manage an outsourced public service: a case study of organizational learning
Public service outsourcing has been a decades-long practice of governments intent
on downsizing and leveraging private enterprise to realize market efficiencies.
The practice has created challenges for public managers, a key one being
detecting and managing opportunism, that is, advantage-seeking behaviour by
service providers. These challenges may be met with the emerging potential of
data analytics, but to realize that potential an organization must learn how to
use data, analysis and scientific methods so that an orientation toward
evidence-based management becomes an organizational norm. In this thesis the
literatures on outsourced service management, the phenomenon of opportunism,
data analytics and organizational learning are examined to synthesize
significant findings and to ground the research question: When data analytics is
introduced to manage opportunism and accountability for an outsourced public
service, in what ways does the organization learn to use data and analytical
methods for performance management? The research was designed using the
foundational tenets of qualitative research, involving participants in an action
research case study. Through prolonged, longitudinal engagement, the case
captured the experience of a learning journey, using the interview as purposeful
conversation. The research proceeded in three phases: implementing technology
and processes for data analytics in a public registry service outsourced to a
network of providers; supporting the organization in developing familiarity with
using analytics; and finally evaluating the learning journey. This study shows that
pressure from authoritative sources in the government and enthusiasm for the
public service value of accountability led to accelerated learning to work in new
ways with data and analytics to manage service provider performance. But the
outsourced environment, and more particularly the business environment, played
critical roles in limiting learning, shifting the focus from new ways of working
to exploiting existing ones, and keeping the organization's perceived problems
top-of-mind for many. This study underscores how complex and fraught with
barriers a learning journey is and supplies lessons on organizational learning for
academics and practitioners. The study adds depth and nuance to the value of the
learning lens, where learning was itself an outcome of a study of learning.
Data quality issues in electronic health records for large-scale databases
Data Quality (DQ) in Electronic Health Records (EHRs) plays a decisive role in improving the quality of healthcare services. DQ issues in EHRs have motivated the introduction of an adaptive framework for interoperability and standards in Large-Scale Database (LSDB) management systems. Traditional approaches struggle to satisfy consumers' needs for large-scale data communication, as data is often not captured directly into Database Management Systems (DBMS) in a timely enough fashion to enable its subsequent uses, even though such data holds considerable value for every field a DBMS serves. EHR technology provides portfolio management systems that allow HealthCare Organisations (HCOs) to deliver a higher quality of care to their patients than is possible with paper-based records. EHRs are in high demand as HCOs run their daily services on ever-growing datasets. Efficient EHR systems reduce data redundancy and system application failures, and increase the ability to draw all necessary reports. However, one of the main challenges in developing efficient EHR systems is the inherent difficulty of coherently managing data from diverse heterogeneous sources: it is practically challenging to integrate diverse data into a global schema that satisfies users' needs, and managing EHR systems with an existing DBMS presents difficulties because of incompatible and sometimes inconsistent data structures. As a result, no common methodological approach currently exists that effectively solves every data integration problem. These DQ challenges raise the need for an efficient way to integrate large EHRs from diverse heterogeneous sources.
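The kind of DQ issue described above can be made concrete with a small, rule-based validation sketch. The field names and rules below are hypothetical illustrations, not taken from the study:

```python
# Minimal sketch of rule-based data-quality checks on EHR records.
# Field names and rules are illustrative assumptions, not from the study.
REQUIRED = {"patient_id", "dob", "encounter_date"}

def dq_issues(record):
    """Return a list of data-quality issues found in one EHR record (a dict)."""
    issues = []
    missing = REQUIRED - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    # ISO-8601 date strings compare correctly as plain strings.
    if "dob" in record and "encounter_date" in record:
        if record["encounter_date"] < record["dob"]:
            issues.append("encounter precedes date of birth")
    return issues

rec = {"patient_id": "P001", "dob": "1990-05-01", "encounter_date": "1989-12-31"}
print(dq_issues(rec))
```

In practice such rules would be one layer of a larger framework; checks spanning heterogeneous sources (duplicate patients, conflicting codings) require the integration machinery the abstract goes on to describe.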
To handle and align a large dataset efficiently, a hybrid method logically combining a Fuzzy-Ontology approach with a large-scale EHR analysis platform has shown improved accuracy. This study investigated the raised DQ issues and the interventions needed to overcome these barriers and challenges, including the provision of EHRs as they pertain to DQ, and combined features to search, extract, filter, clean and integrate data so that users can coherently create new, consistent data sets. The study designed a hybrid method based on Fuzzy-Ontology and performed mathematical simulations based on a Markov Chain probability model. A similarity measurement based on a dynamic Hungarian algorithm was developed following the Design Science Research (DSR) methodology, with the aim of increasing the quality of service across HCOs within an adaptive framework.
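The field-alignment step that a Hungarian-algorithm similarity measurement performs can be sketched as follows. The string-ratio similarity, field names and brute-force search are illustrative assumptions; a real implementation would use the O(n^3) Hungarian algorithm and an ontology-backed similarity measure rather than exhaustive search:

```python
from difflib import SequenceMatcher
from itertools import permutations

def similarity(a, b):
    """String similarity in [0, 1]; a stand-in for an ontology-based measure."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def best_alignment(source_fields, target_fields):
    """Find the one-to-one field mapping maximizing total similarity.
    Brute force over permutations keeps this sketch dependency-free;
    the Hungarian algorithm solves the same assignment problem in O(n^3)."""
    best_perm, best_score = None, -1.0
    for perm in permutations(range(len(target_fields))):
        score = sum(similarity(source_fields[i], target_fields[j])
                    for i, j in enumerate(perm))
        if score > best_score:
            best_score, best_perm = score, perm
    return [(source_fields[i], target_fields[j])
            for i, j in enumerate(best_perm)]

# Hypothetical schemas from two heterogeneous EHR sources.
src = ["patient_name", "dob", "blood_pressure"]
tgt = ["date_of_birth", "bp_reading", "name_of_patient"]
for s, t in best_alignment(src, tgt):
    print(s, "->", t)
```

The "dynamic" aspect in the study presumably re-runs this matching as schemas evolve; the sketch shows only the static assignment core.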