Mobile heritage practices. Implications for scholarly research, user experience design, and evaluation methods using mobile apps.
Mobile heritage apps have become one of the most popular means of audience engagement and curation for museum collections and heritage contexts. This raises practical and ethical questions for both researchers and practitioners: What kinds of audience engagement can be built using mobile apps? What are the current approaches? How can audience engagement with these experiences be evaluated? How can those experiences be made more resilient and, in turn, sustainable? In this thesis I bring experience design scholarship together with personal professional insights to analyse digital heritage practices, with a view to advancing thinking about, and critique of, mobile apps in particular. The chapters that follow trace the evolution of digital heritage practices, examining the cultural, societal, and technological contexts in which mobile heritage apps are developed by the creative media industry and academic institutions, and how these forces are shaping user experience design methods. Drawing on studies in (critical) digital heritage, Human-Computer Interaction (HCI), and design thinking, the thesis provides a critical analysis of the development and use of mobile practices for heritage. Furthermore, through an empirical and embedded approach to research, it presents auto-ethnographic case studies to show that mobile experiences conceptualised through more organic design approaches can result in more resilient and sustainable heritage practices. In doing so, this thesis encourages a renewed understanding of the pivotal role of these practices in broader sociocultural, political, and environmental change.
Essays on Panel Data Prediction Models
Forward-looking analysis is valuable for policymakers, who need effective strategies to mitigate imminent risks and potential challenges. Panel data sets contain time series information over a number of cross-sectional units and are known to have superior predictive ability compared with time-series-only models. This PhD thesis develops novel panel data methods to advance short-term forecasting and nowcasting of macroeconomic and environmental variables. Its two most important contributions are the use of cross-sectional dependence in panel data forecasting and the allowance for timely predictions and ‘nowcasts’.

Although panel data models have been found to provide better predictions in many empirical scenarios, forecasting applications so far have not included cross-sectional dependence. On the other hand, cross-sectional dependence is well recognised in large panels and has been explicitly modelled in previous causal studies. A substantial portion of this thesis is devoted to developing cross-sectional dependence in panel models suited to diverse empirical scenarios. The second important aspect of this work is to integrate the asynchronous release schedules of data, within and across panel units, into the panel models. Most of the thesis emphasises pseudo-real-time predictions, estimating each model only on the data that had been released at the time of prediction and thus replicating the realistic circumstances of delayed data releases.

Linear, quantile and non-linear panel models are developed to predict a range of targets, in terms of both their meaning and their method of measurement. Linear models include panel mixed-frequency vector-autoregression and bridge-equation set-ups, which predict GDP growth, inflation and CO2 emissions. Panel quantile regressions and latent-variable discrete choice models predict growth-at-risk and extreme episodes of cross-border capital flows, respectively.
The datasets include both international cross-country panels and regional subnational panels. Depending on the nature of the model and the prediction targets, different precision criteria evaluate the accuracy of the models in out-of-sample settings. The generated predictions beat the respective standard benchmarks in a more timely fashion.
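The cross-sectional-dependence idea the abstract describes can be illustrated with a minimal sketch (the function name, panel shape, and regression set-up are illustrative assumptions, not the thesis's actual models): each unit's autoregression is augmented with the lagged cross-sectional average, a common proxy for common factors linking the units.

```python
import numpy as np

# Hedged sketch: a per-unit autoregression augmented with the lagged
# cross-sectional average to proxy cross-sectional dependence in a panel.
def panel_csa_forecast(Y):
    """Y is a T x N panel. For each unit i, fit by OLS
        y[i, t] = a + b * y[i, t-1] + c * ybar[t-1]
    where ybar is the cross-sectional mean, then return the
    one-step-ahead forecast for every unit."""
    T, N = Y.shape
    ybar = Y.mean(axis=1)                     # cross-sectional average per period
    preds = np.empty(N)
    for i in range(N):
        X = np.column_stack([np.ones(T - 1), Y[:-1, i], ybar[:-1]])
        beta, *_ = np.linalg.lstsq(X, Y[1:, i], rcond=None)
        preds[i] = beta @ np.array([1.0, Y[-1, i], ybar[-1]])
    return preds

# Toy panel of 3 units over 10 periods, each growing by 3 per period.
Y = np.arange(30.0).reshape(10, 3)
preds = panel_csa_forecast(Y)  # each unit's next value is its last value + 3
```

A pseudo-real-time exercise, as the thesis describes, would additionally restrict `Y` at each forecast origin to the observations actually released by that date.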
Essays on Risk Creation in the Banking Sector
This thesis consists of four essays exploring risk creation in the banking sector. The essays examine how conflicting interests can compromise the objectivity, judgment, and decision making of economic agents, who may consequently prioritize their personal or institutional interests over the best interests of others or of the entire financial system. Chapter 2 delves into the conflict of interest that arises when a bank serves as an investor in the stock market. Chapter 3 revisits the discussion of the potential misalignment between sovereign incentives and the collective interests of the currency union, particularly in the bond market. Chapter 4 draws attention to a situation where regulations in the banking sector may be advantageous for a government in the sovereign bond market. Finally, Chapter 5 looks at the flip side of the coin, examining how banks may be susceptible to moral hazard concerns in their FX lending decisions, given that they do not fully bear the consequences of their actions.
Redefining Disproportionate Arrest Rates: An Exploratory Quasi-Experiment that Reassesses the Role of Skin Tone
The New York Times reported that Black Lives Matter was the third most-read subject of 2020. These articles brought to the forefront the question of disparity in arrest rates for darker-skinned people. Questioning arrest disparity is understandable because virtually everything known about disproportionate arrest rates has been a guess, and virtually all prior research on disproportionate arrest rates is questionable because of improper benchmarking (the denominator effect). Current research has highlighted the need to switch from demographic data to skin tone data and start over on disproportionate arrest rate research; therefore, this study explored the relationship between skin tone and disproportionate arrest rates. This study also sought to determine which of the three theories surrounding disproportionate arrests is most predictive of disproportionate rates. The current theories are that disproportionate arrests increase as skin tone gets darker (stereotype threat theory), that disproportionate rates are different for Black and Brown people (self-categorization theory), or that disproportionate rates apply equally across all darker skin colors (social dominance theory). This study used a quantitative exploratory quasi-experimental design, applying linear spline regression to analyze arrest rates in Alachua County, Florida, before and after the county’s mandate to reduce arrests as much as possible during the COVID-19 pandemic to protect the prison population. The study was exploratory, as no previous study has used skin tone analysis to examine arrest disparity. The findings of this study redefine the understanding of the existence and nature of disparities in arrest rates and offer a solid foundation for additional studies about the relationship between disproportionate arrest rates and skin color.
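The linear spline ("broken-stick") regression named above can be sketched in a few lines (a minimal illustration with made-up numbers, not the study's data or code): the slope of the arrest-rate trend is allowed to change at a knot placed at the policy change, here the county's COVID-19 arrest-reduction mandate.

```python
import numpy as np

# Hedged sketch of a one-knot linear spline regression:
#   y = b0 + b1*t + b2*max(t - knot, 0)
# b1 is the pre-mandate slope; b2 is the change in slope after the knot.
def fit_linear_spline(t, y, knot):
    X = np.column_stack([np.ones_like(t), t, np.maximum(t - knot, 0.0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic monthly series: slope 2.0 before month 12, slope -1.0 after
# (a slope change of -3.0 at the knot).
t = np.arange(24, dtype=float)
y = 50.0 + 2.0 * t - 3.0 * np.maximum(t - 12.0, 0.0)
b0, b1, b2 = fit_linear_spline(t, y, knot=12.0)
```

A significantly negative `b2` would indicate that the trend bent downward after the mandate; the study's design compares such fits across skin-tone groups.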
Measuring the Impact of China’s Digital Heritage: Developing Multidimensional Impact Indicators for Digital Museum Resources
This research investigates how best to assess the impact of China’s digital heritage, focusing on digital museum resources. It is motivated by the need for tools to help governing bodies and heritage organisations assess the impact of digital heritage resources. The research sits at the intersection of Chinese cultural heritage, digital heritage, and impact assessment (IA) studies, which together form the theoretical framework of the thesis. Informed by the Balanced Value Impact (BVI) Model, this thesis addresses the following questions: 1. How do Western heritage discourses and Chinese culture shape ‘cultural heritage’ and the museum digital ecosystem in modern China? 2. Which indicators demonstrate the multidimensional impacts of digital museum resources in China? How should the BVI Model be adapted to fit the Chinese cultural landscape? 3. How do different stakeholders perceive these impact indicators? What are the implications for impact indicator development and application? This research applies a mixed-method approach, combining desk research, surveys, and interviews with both public audiences and museum professionals. The research findings identify 18 impact indicators, covering economic, social, innovation and operational dimensions. Notably, the perceived usefulness and importance of different impact indicators vary among and between public participants and museum professionals. The study finds the BVI Model helpful in guiding the indicator development process, particularly in laying a solid foundation to inform decision-making. The Strategic Perspectives and Value Lenses provide a structure to organise various indicators and keep them focused on the impact objectives. However, the findings also suggest that the Value Lenses are merely signifiers; their signified meanings change with cultural contexts and should be examined when the Model is applied in a different cultural setting.
This research addresses the absence of digital resource IA in China’s heritage sector. It contributes to the field of IA for digital heritage within and beyond the Chinese context by challenging the current target-setting culture in performance evaluation. Moreover, the research ratifies the utility of the BVI Model while modifying it to fit China’s unique cultural setting. This thesis as a whole demonstrates the value of using multidimensional impact indicators for evidence-based decision-making and better museum practices in the digital domain.
Spectrum auctions: designing markets to benefit the public, industry and the economy
Access to the radio spectrum is vital for modern digital communication. It is an essential component for smartphone capabilities, the Cloud, the Internet of Things, autonomous vehicles, and multiple other new technologies. Governments use spectrum auctions to decide which companies should use what parts of the radio spectrum. Successful auctions can fuel rapid innovation in products and services, unlock substantial economic benefits, build comparative advantage across all regions, and create billions of dollars of government revenues. Poor auction strategies can leave bandwidth unsold and delay innovation, sell national assets to firms too cheaply, or create uncompetitive markets with high mobile prices and patchy coverage that stifle economic growth. Corporate bidders regularly complain that auctions raise their costs, while government critics argue that insufficient revenues are raised. The cross-national record shows many examples of both highly successful auctions and miserable failures. Drawing on experience from the UK and other countries, senior regulator Geoffrey Myers explains how to optimise the regulatory design of auctions, from initial planning to final implementation. Spectrum Auctions offers unrivalled expertise for regulators and economists engaged in practical auction design or company executives planning bidding strategies. For applied economists, teachers, and advanced students this book provides unrivalled insights into market design and public management. Providing clear analytical frameworks, case studies of auctions, and stage-by-stage advice, it is essential reading for anyone interested in designing public-interest and successful spectrum auctions.
UNPUBLISHING THE NEWS: AN ASSESSMENT OF U.S. PUBLIC OPINION, NEWSROOM ACCOUNTABILITY, AND JOURNALISTS’ AUTHORITY AS “THE FIRST DRAFT OF HISTORY”
Unpublishing, or the manipulation, deindexing, or removal of published content on a news organization’s website, is a hotly debated issue in the news industry that disrupts fundamental beliefs about the nature of news and the roles of journalists. This dissertation’s premise is that unpublishing as a phenomenon challenges the authority of journalism as “the first draft of history,” questions the assumed relevance of traditional norms, and creates an opportunity to reconsider how news organizations demonstrate their accountability to the public. The study identifies public opinions related to unpublishing practices and approval of related journalistic norms through a public opinion survey of 1,350 U.S. adults. In tandem, a qualitative analysis of 62 editorial policies related to unpublishing offers the first inventory and assessment of emerging journalistic practices and the normative values journalists demonstrate through them. These contributions are valuable to both the academy and the news industry, as they identify a path forward for future research and provide desired guidance to U.S. news organizations. Findings suggest that in response to the unpublishing phenomenon, American journalists defend their professionalism primarily through the traditional professional paradigm of accuracy, invoking it to legitimize new guidelines whether those policies permitted or denounced unpublishing as a newsroom practice. Findings also show newsrooms are pledging increased levels of accountability to their communities and society at large, but how they might demonstrate that accountability more tactically was absent from policy discourse. In addition, both American adults and news organizations place a high value on the accuracy of previously published news content, yet the groups’ temporal conceptions of accuracy must be reconciled. 
Ultimately, the unpublishing phenomenon presents an opportunity for journalists to redefine their notions of accountability to their communities. Based on these findings, the study concludes with a call for American news organizations to abandon claims as the “first draft of history” in the digital era and assume the role of information custodians, proactively establishing and managing the lifecycle of content.
A Comprehensive Survey of Artificial Intelligence Techniques for Talent Analytics
In today's competitive and fast-evolving business environment, it is a critical time for organizations to rethink how to make talent-related decisions in a quantitative manner. Indeed, the recent development of Big Data and Artificial Intelligence (AI) techniques has revolutionized human resource management. The availability of large-scale talent and management-related data provides unparalleled opportunities for business leaders to comprehend organizational behaviors and gain tangible knowledge from a data science perspective, which in turn delivers intelligence for real-time decision-making and effective talent management at work for their organizations. In the last decade, talent analytics has emerged as a promising field in applied data science for human resource management, garnering significant attention from AI communities and inspiring numerous research efforts. To this end, we present an up-to-date and comprehensive survey on AI technologies used for talent analytics in the field of human resource management. Specifically, we first provide the background knowledge of talent analytics and categorize various pertinent data. Subsequently, we offer a comprehensive taxonomy of relevant research efforts, categorized based on three distinct application-driven scenarios: talent management, organization management, and labor market analysis. In conclusion, we summarize the open challenges and potential prospects for future research directions in the domain of AI-driven talent analytics.
Comment: 30 pages, 15 figures
Analytical validation of innovative magneto-inertial outcomes: a controlled environment study.
Peer reviewed
Exploiting Emotions via Composite Pretrained Embedding and Ensemble Language Model
Decisions in the modern era are based on more than just the available data; they also incorporate feedback from online sources. Processing such reviews is known as sentiment analysis (SA) or emotion analysis. Understanding users' perspectives and routines is crucial nowadays for multiple reasons; both businesses and governments use this understanding to make strategic decisions. Various architectural and vector embedding strategies have been developed for SA processing, and accurate representation of text is crucial for automatic SA. Due to the large number of languages spoken and written, polysemy and syntactic or semantic issues are common. To get around these problems, we developed effective composite embedding (ECE), a method that combines the advantages of vector embedding techniques that are either context-independent (such as GloVe and fastText) or context-aware (such as XLNet) to effectively represent the features needed for processing. To improve performance on emotion and sentiment tasks, we propose a stacked ensemble of deep language models. ECE with the ensembled model is evaluated on a balanced dataset to show that it is a reliable embedding technique and a generalised model for SA. To evaluate ECE, state-of-the-art machine learning and deep-learning language models are deployed and compared. The model is evaluated on benchmark datasets such as MR and Kindle, along with a real-time tweet dataset of user complaints. LIME is used to verify the model's predictions and to provide statistical results per sentence. The model with ECE embedding achieves state-of-the-art results on the real-time dataset as well.
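The two core mechanics the abstract names, composing embeddings and ensembling model outputs, can be sketched minimally (an illustration only; the vectors and probabilities below are made up, and the real system would obtain them from trained models such as GloVe, fastText, or XLNet):

```python
import numpy as np

# Hedged sketch of a composite embedding: concatenate a context-independent
# vector (GloVe/fastText style) with a context-aware vector (XLNet style)
# so downstream classifiers see both feature families.
def composite_embedding(static_vec, contextual_vec):
    return np.concatenate([static_vec, contextual_vec])

# Hedged sketch of ensemble combination by soft voting: average the class
# probabilities produced by several models and pick the argmax class.
def soft_vote(prob_rows):
    avg = np.mean(prob_rows, axis=0)
    return int(np.argmax(avg)), avg

# Toy usage: a 3-dim static vector joined with a 4-dim contextual vector,
# then three models' [negative, positive] probabilities combined.
emb = composite_embedding(np.ones(3), np.zeros(4))
label, avg = soft_vote([np.array([0.2, 0.8]),
                        np.array([0.4, 0.6]),
                        np.array([0.6, 0.4])])
```

A stacked ensemble, as proposed in the paper, would go one step further and train a meta-classifier on the base models' outputs rather than simply averaging them.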