59 research outputs found
COVID-19 Outbreak and Beyond
The COVID-19 pandemic drastically changed our lifestyle when, on 30 January 2020, the World Health Organization declared the coronavirus disease outbreak a public health emergency of international concern. Since then, many governments have introduced unprecedented containment measures, hoping to slow the spread of the virus. International research suggests that both the pandemic and the related protective measures, such as lockdowns, curfews, and social distancing, are having a profound impact on the mental health of the population. The most commonly observed psychological effects include high levels of stress, anxiety, depression, and post-traumatic symptoms, along with boredom and frustration. At the same time, the behavioral response of the population is of paramount importance to successfully containing the outbreak, creating a vicious circle in which psychological distress affects willingness to comply with the protective measures, which, in turn, if prolonged, could exacerbate the population's distress. This book includes: i) original studies on the worldwide psychological and behavioral impact of COVID-19 on targeted individuals (e.g., parents, social workers, patients affected by physical and mental disorders); ii) studies exploring the effect of COVID-19 using advanced statistical and methodological techniques (e.g., machine learning technologies); and iii) research on practical applications that could help identify persons at risk, mitigate the negative effects of the situation, and offer insights to policymakers managing the pandemic.
The Convergence of Human and Artificial Intelligence on Clinical Care - Part I
This edited book contains twelve studies, both large-scale and pilot, in five main categories: (i) adaptive imputation to increase the density of clinical data for improving downstream modeling; (ii) machine-learning-empowered diagnosis models; (iii) machine learning models for outcome prediction; (iv) innovative use of AI to improve our understanding of the public view; and (v) understanding of the attitude of providers in trusting insights from AI for complex cases. This collection is an excellent example of how technology can add value in healthcare settings and hints at some of the pressing challenges in the field. Artificial intelligence is gradually becoming a go-to technology in clinical care; therefore, it is important to work collaboratively and to shift from performance-driven outcomes to risk-sensitive model optimization, improved transparency, and better patient representation, to ensure more equitable healthcare for all.
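To make the first category concrete: imputation fills gaps in sparse clinical tables so that downstream models can train on denser data. The sketch below is a deliberately minimal column-mean imputer, a hedged illustration of the general idea only, not the adaptive methods the collection actually surveys; the lab-value data are invented for the example.

```python
def mean_impute(rows):
    """Fill missing values (None) with the column mean.

    A deliberately simple stand-in for the adaptive imputation
    methods surveyed in the collection, not their actual algorithm.
    """
    cols = list(zip(*rows))
    means = []
    for col in cols:
        observed = [v for v in col if v is not None]
        means.append(sum(observed) / len(observed))
    return [
        [means[j] if v is None else v for j, v in enumerate(row)]
        for row in rows
    ]

# Hypothetical lab panel: rows are patients, columns are two tests,
# with None marking values that were never measured
data = [[5.1, None], [4.9, 2.0], [None, 3.0]]
filled = mean_impute(data)
print(filled)
```

Real clinical imputers condition on the rest of the record rather than a single column mean, which is precisely what makes them "adaptive".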
Methods and tools for analysis and management of risks and regulatory compliance in the healthcare sector: the hospital at home – HaH
Changing or creating an organisation means creating a new process. Each process involves many risks that need to be identified and managed. The main risks considered here are procedural and legal risks. The former are related to errors that may occur during processes, while the latter concern the compliance of processes with regulations. Managing these risks implies proposing changes to the processes that produce the desired result: an optimised process.
In order to manage a company and optimise it in the best possible way, the organisational aspect, risk management and legal compliance should not only be taken into account individually; it is important that they are all analysed simultaneously, with the aim of finding the right balance that satisfies them all. This is the aim of this thesis: to provide methods and tools to balance these three characteristics. ICT support is used to enable this type of optimisation.
This work is not a thesis in computer science or law, but rather an interdisciplinary one. Most of the work done so far is vertical, confined to a specific domain. The particularity and aim of this thesis is not to carry out an in-depth analysis of a single aspect, but rather to combine several important aspects, normally analysed separately, which nevertheless affect and influence each other. To carry out this kind of interdisciplinary analysis, the knowledge bases of both areas were involved, and the collaboration of experts from the various fields was necessary.
Although the methodology described is generic and can be applied to all sectors, the case study considered is a new type of healthcare service that allows patients with acute disease to be hospitalised at home. This provides the possibility to perform experiments using a real hospital database.
Análisis bibliométrico de la producción científica sobre Economía Experimental
Experimental Economics (EE) is a working method of behavioral economics that develops theoretical models of human behavior in economic settings. Economic experiments have a long tradition, and have provided spectacular results and widely accepted conclusions about market dynamics and the effect of economic institutions. New technologies facilitate the conduct and analysis of these experiments. The main objective of this study is a systematic review of the scientific production on Experimental Economics, from 1990 to the end of 2021, in the Web of Science Core Collection and Scopus databases. Descriptive data analysis was performed with RStudio, while network analysis was performed with VOSviewer. The study shows, among other things, that the bibliographic production in this field has intensified exponentially, that the country with the most research is the United States, and that the most cited author is Urs Fischbacher.
Universidad de Sevilla. Doble Grado en Matemáticas y Estadística
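The descriptive side of such a bibliometric analysis reduces to counting retrieved records along dimensions such as publication year and author. A minimal sketch of that counting step, on a handful of invented (year, first author) records standing in for a real Web of Science or Scopus export:

```python
from collections import Counter

# Hypothetical records: (year, first_author) pairs standing in for
# entries exported from Web of Science / Scopus
records = [
    (1995, "Fischbacher"), (2003, "Fischbacher"), (2010, "Smith"),
    (2010, "Fischbacher"), (2015, "Smith"), (2015, "List"),
    (2018, "Fischbacher"), (2020, "List"), (2020, "Smith"),
    (2021, "Fischbacher"),
]

# Publications per year reveal growth trends; per-author counts
# reveal the most prolific contributors
per_year = Counter(year for year, _ in records)
per_author = Counter(author for _, author in records)

print("papers per year:", dict(sorted(per_year.items())))
print("top author:", per_author.most_common(1)[0])
```

The network analysis the study performs with VOSviewer (co-authorship, co-citation) is a separate step built on pairwise links between such records, not on simple counts.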
The Routledge Handbook of Philosophy of Economics
The most fundamental questions of economics are often philosophical in nature, and philosophers have, since the very beginning of Western philosophy, asked many questions that current observers would identify as economic. The Routledge Handbook of Philosophy of Economics is an outstanding reference source for the key topics, problems, and debates at the intersection of philosophical and economic inquiry. It captures this field of countless exciting interconnections, affinities, and opportunities for cross-fertilization. Comprising 35 chapters by a diverse team of contributors from all over the globe, the Handbook is divided into eight sections: I. Rationality II. Cooperation and Interaction III. Methodology IV. Values V. Causality and Explanation VI. Experimentation and Simulation VII. Evidence VIII. Policy The volume is essential reading for students and researchers in economics and philosophy who are interested in exploring the interconnections between the two disciplines. It is also a valuable resource for those in related fields like political science, sociology, and the humanities.
Quantitative Methods for Economics and Finance
This book is a collection of papers for the Special Issue “Quantitative Methods for Economics and Finance” of the journal Mathematics. This Special Issue reflects the latest developments in different fields of economics and finance where mathematics plays a significant role. The book gathers 19 papers on topics such as volatility clusters and volatility dynamics; forecasting; stocks, indexes, cryptocurrencies and commodities; trade agreements; the relationship between volume and price; trading strategies; efficiency; regression; utility models; fraud prediction; and intertemporal choice.
Magnetoencephalography
This is a practical book on MEG that covers a wide range of topics. The book begins with a series of reviews on the use of MEG for clinical applications and the study of cognitive functions in various diseases, with one chapter focusing specifically on studies of memory with MEG. There are sections with chapters that describe source localization issues, the use of beamformer and dipole source methods, as well as phase-based analyses, and a step-by-step guide to using dipoles for epilepsy spike analyses. The book ends with a section describing new innovations in MEG systems, namely an on-line real-time MEG data acquisition system, novel applications for MEG research, and a proposal for a helium re-circulation system. With such breadth of topics, there is a chapter of interest to every MEG researcher or clinician.
Inferring implicit relevance from physiological signals
Ongoing growth in data availability and consumption has meant users are increasingly faced with the challenge of distilling relevant information from an abundance of noise. Overcoming this information overload can be particularly difficult in situations such as intelligence analysis, which involves subjectivity, ambiguity, or risky social implications. Highly automated solutions are often inadequate, therefore new methods are needed for augmenting existing analysis techniques to support user decision making. This project investigated the potential for deep learning to infer the occurrence of implicit relevance assessments from users' biometrics. Internal cognitive processes manifest involuntarily within physiological signals, and are often accompanied by 'gut feelings' of intuition. Quantifying unconscious mental processes during relevance appraisal may be a useful tool during decision making by offering an element of objectivity to an inherently subjective situation. Advances in wearable or non-contact sensors have made recording these signals more accessible, whilst advances in artificial intelligence and deep learning have enhanced the discovery of latent patterns within complex data. Together, these techniques might make it possible to transform tacit knowledge into codified knowledge which can be shared. A series of user studies recorded eye gaze movements, pupillary responses, electrodermal activity, heart rate variability, and skin temperature data from participants as they completed a binary relevance assessment task. Participants were asked to explicitly identify which of 40 short-text documents were relevant to an assigned topic. Investigations found this physiological data to contain detectable cues corresponding with relevance judgements. Random forests and artificial neural networks trained on features derived from the signals were able to produce inferences with moderate correlations with the participants' explicit relevance decisions. 
Several deep learning algorithms trained on the entire physiological time series data were generally unable to surpass the performance of feature-based methods, and instead produced inferences with low correlations with participants' explicit personal truths. Overall, pupillary responses, eye gaze movements, and electrodermal activity offered the most discriminative power, with additional physiological data providing diminishing or adverse returns. Finally, a conceptual design for a decision support system is used to discuss the social implications and practicalities of quantifying implicit relevance using deep learning techniques. Potential benefits included assisting with introspection and collaborative assessment; however, quantifying intrinsically unknowable concepts using personal data and abstruse artificial intelligence techniques was argued to pose incommensurate risks and challenges. Deep learning techniques therefore have the potential for inferring implicit relevance in information-rich environments, but are not yet fit for purpose. Several avenues worthy of further research are outlined.
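The core finding, that features derived from physiological signals correlate moderately with explicit relevance judgements, can be illustrated with a point-biserial (Pearson) correlation between one feature and binary labels. The sketch below uses entirely synthetic data, with the effect size and the pupil-diameter feature invented for illustration; it is not the thesis's data or pipeline.

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation; with binary ys this is the
    point-biserial correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
# 40 short-text documents, half judged relevant (1) and half not (0),
# mirroring the binary assessment task described above
labels = [1] * 20 + [0] * 20
# Hypothetical mean pupil-diameter feature (mm): relevant documents
# elicit a slightly larger dilation, plus measurement noise
pupil = [3.6 + 0.3 * y + random.gauss(0, 0.25) for y in labels]

r = pearson(pupil, labels)
print(f"point-biserial correlation: {r:.2f}")
```

A feature-based classifier such as a random forest effectively combines many such weakly informative features, which is why it outperformed end-to-end deep models on the raw time series here.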
On politics and social science – the subject-object problem in social science and Foucault’s engaged epistemology
The epistemological problem of the relationship between the subject of knowledge and the object being known takes its form in social science as the problem of the relationship between the social scientist as a researcher and society and its phenomena as the object of inquiry. As Berger and Kellner note in their book “Sociology Reinterpreted”, a social scientist is necessarily a part of the object he studies, being embedded in a position in society from which he studies it. Hence the social sciences, as scientific endeavors, face the problem of the inseparability of their researchers from the object they study. Two main solutions to this problem have arisen: positivism and interpretivism. Positivism postulates that rigorous research methods will ensure that objective knowledge is produced, while interpretivism sees society only as an aggregate of individuals whose interactions should be interpreted. A third epistemological framework arose in the first half of the twentieth century, usually called “critical theory”. Critical theory states that researchers should aim their research at changing the object they are researching; their scientific practice should therefore have extra-scientific effects, namely political effects. This perspective violates Weber's postulate of value neutrality, which claims that the social sciences can only study the state of affairs but cannot prescribe desirable ways of action.
The main topic of our paper is the epistemological framework of the work of Michel Foucault and his contribution to the resolution of the problematic relation between a researcher and his research object in social science. We will claim that Foucault broadly falls into the critical theory paradigm but manages to resolve its conflict with the value neutrality postulate. Foucault envisions society as an amalgam of discursive and non-discursive practices that interconnect in a way that gives them regularity and coherence through time. As Gayatri Spivak notices, for Foucault discursive practices create meaning and, in doing so, chart a way for non-discursive practices and therefore for action. This can be seen as an explanation of Foucault's well-known postulate of the relationship between power and knowledge: discursive practices create knowledge that makes visible certain paths for action. Both types of practices intertwine to create what Foucault calls “dispositifs”, which can be seen as mechanisms that bind discursive and non-discursive practices in a coherent manner and enable their regular repetition through time.
Foucault calls his methodology “genealogy” and sees it as historical research into the emergence of dispositifs. Genealogy is a historical study of the contingent ways in which practices became interconnected in the past to create the dispositifs we see today. As Foucault claims, genealogy begins with a “question posed in the present” about a certain dispositif and then charts the historical events and processes that led to its current form. The main aim of genealogy is to show that there is no transcendental necessity for a certain dispositif to exist in its current form, by exposing the historical contingency that led to its current state. Foucault claimed that his intent was to show that there is no metaphysical necessity grounding the existence of dispositifs, and hence that their current form is arbitrary. As we can see, Foucault follows his postulate on the relationship between knowledge and power and formulates his scientific practice as an opening of possibilities for different forms of action. This is why he calls his books “experiments” and claims that they are to be used by readers to re-examine their own links to currently existing dispositifs and the possibilities of their alternative arrangements. But, as Foucault claims, the genealogical method does not include normative prescriptions and can be seen only as a form of anti-metaphysical “unmasking” of current dispositifs. This unmasking does not prescribe a desirable form for any dispositif but only shows that it can be arranged in different ways.
Hence we can say that Foucault sees the relationship between a researcher and his object of study as a form of intervention by the subject that aims at showing that the object is an arbitrary construction. In that regard Foucault falls into the critical theory paradigm. Where he differs from critical theory is his anti-normative stance, which refuses to prescribe any desirable form of action, unlike, for example, Horkheimer, who in his essay on critical theory claims that “the task of the theorist is to push society towards justice”. Foucault claims that his research results should be used as “instruments” in political struggles, but he himself never proclaims a desirable political goal. So we can conclude that Foucault solves the problem of the subject-object relation in social science by envisioning the research process as the production of tools for social change. He thereby connects social science to extra-scientific political goals without violating the value neutrality postulate, because his research does not prescribe any concrete political goals but only shows the possibility of social change.