
    Macro- and microscopic analysis of the internet economy from network measurements

    Thesis by compendium of publications. The growth of the Internet impacts multiple areas of the world economy, and it has become a permanent part of the economic landscape at both the macro- and the microeconomic level. Online traffic and information are currently assets with large business value. Even though the commercial Internet has been part of our lives for more than two decades, its impact on the global and everyday economy still holds many unknowns. In this work we analyse important macro- and microeconomic aspects of the Internet. First, we investigate the characteristics of interdomain traffic, an important part of the macroscopic economy of the Internet. Second, we investigate the microeconomic phenomenon of price discrimination on the Internet. At the macroscopic level, we describe quantitatively the interdomain traffic matrix (ITM), as seen from the perspective of a large research network. The ITM describes the traffic flowing between autonomous systems (ASes) in the Internet. It depicts the traffic between the largest Internet business entities and therefore has an important impact on the Internet economy. In particular, we analyse the sparsity and statistical distribution of the traffic, and observe that the shape of the statistical distribution of the traffic sourced from an AS may be related to congestion within the network. We also investigate the correlations between rows of the ITM. Finally, we propose a novel method to model interdomain traffic that stems from first principles and recognizes that the traffic is a mixture of different Internet applications and can have regional artifacts. We present and evaluate a tool to generate such matrices from open and available data. Our results show that our first-principles approach is a promising alternative to the existing solutions in this area, and that it enables the investigation of what-if scenarios and their impact on the Internet economy.
At the microscopic level, we investigate the rising phenomenon of price discrimination (PD). We find empirical evidence that Internet users can be subject to price and search discrimination. In particular, we present examples of PD on several e-commerce websites and uncover the information vectors facilitating PD. We then show that crowd-sourcing is a feasible method to help users infer whether they are subject to PD. We also build and evaluate a system that allows any Internet user to examine whether she is subject to PD. The system has been deployed and used by multiple users worldwide, and has uncovered more examples of PD. The methods presented in the following papers are backed by thorough data analysis and experiments.
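The first-principles ITM model described above treats interdomain traffic as a mixture of Internet applications. A minimal illustrative sketch of that idea follows; the application names, activity shares, and log-normal parameters are assumptions for illustration, not the thesis's fitted values:

```python
import numpy as np

rng = np.random.default_rng(42)

n_as = 50  # illustrative number of autonomous systems
# Hypothetical application mix: each application is active on only a
# fraction of AS pairs (which yields the sparsity observed in empirical
# ITMs) and contributes heavy-tailed (log-normal) traffic volumes.
apps = {
    "web":   {"share": 0.5, "mu": 2.0, "sigma": 1.5},
    "video": {"share": 0.3, "mu": 4.0, "sigma": 1.0},
    "p2p":   {"share": 0.2, "mu": 3.0, "sigma": 2.0},
}

itm = np.zeros((n_as, n_as))
for params in apps.values():
    # The application is active only on a random subset of AS pairs.
    active = rng.random((n_as, n_as)) < params["share"]
    volume = rng.lognormal(params["mu"], params["sigma"], (n_as, n_as))
    itm += active * volume
np.fill_diagonal(itm, 0.0)  # an AS sends no interdomain traffic to itself

sparsity = np.mean(itm == 0)
print(f"fraction of empty ITM cells: {sparsity:.2f}")
```

Summing several sparsely active, heavy-tailed components reproduces two properties the abstract highlights: a sparse matrix and a skewed per-AS traffic distribution.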

    Decision making study: methods and applications of evidential reasoning and judgment analysis

    Decision making research is a multi-disciplinary field involving operations researchers, management scientists, statisticians, mathematical psychologists, and economists, among others. This study investigates the theory and methodology of decision making research and applies them to different contexts in real cases. The study reviews the literature on Multiple Criteria Decision Making (MCDM), the Evidential Reasoning (ER) approach, the Naturalistic Decision Making (NDM) movement, Social Judgment Theory (SJT), and the Adaptive Toolbox (AT) program. On the basis of this literature, two methods, Evidence-based Trade-Off (EBTO) and Judgment Analysis with Heuristic Modelling (JA-HM), are proposed and developed to address decision making problems under different conditions. In the EBTO method, we propose a novel framework to aid people's decision making under uncertainty and imprecise goals. Under this framework, the imprecise goal is objectively modelled through an analytical structure and is independent of the task requirement; the task requirement is specified by the trade-off strategy among criteria of the analytical structure through an importance weighting process, and is subject to the requirement changes of a particular decision making task; and the available evidence that could contribute to evaluating the general performance of the decision alternatives is formulated with belief structures, which can capture the various forms of uncertainty that arise from the absence of data, incomplete information, and subjective judgments. The EBTO method was further applied in a case study of Soldier system decision making. The application demonstrated that EBTO, as a tool, is able to provide a holistic analysis of the requirements of Soldier missions, the physical condition of Soldiers, and the capability of their equipment and weapon systems, which is critical in this domain.
Drawing on the cross-disciplinary literature from NDM and AT, the JA-HM extends the traditional Judgment Analysis (JA) method through a number of novel methodological procedures to account for the unique features of decision making tasks under extreme time pressure and dynamically shifting situations. These procedures include: the notion of a decision point, to deconstruct dynamically shifting situations so that decision problems can be identified and formulated; the classification of routine and non-routine problems, and an associated data alignment process, to enable meaningful analysis of decision data across different decision makers (DMs); the notion of a composite cue, to account for the DMs' iterative process of information perception and comprehension in a dynamic task environment; the application of computational models of heuristics, to account for the time constraints and process dynamics of the DMs' decision making; and the application of a cross-validation process, to enable the methodological principle of competitive testing of decision models. The JA-HM was further applied in a case study of fire emergency decision making. The application is the first behavioural test of the validity of computational models of heuristics in predicting DMs' decision making during fire emergency response, and the first behavioural test of the validity of non-compensatory heuristics in predicting DMs' decisions on a ranking task. The findings extend the AT and NDM literature and have implications for fire emergency decision making.
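The "computational models of heuristics" from the Adaptive Toolbox program include non-compensatory rules such as take-the-best, which decides on the single most valid discriminating cue rather than weighing all cues. A minimal sketch follows; the cue names and values are hypothetical, not taken from the fire emergency case study:

```python
def take_the_best(cues_a, cues_b, validity_order):
    """Non-compensatory take-the-best heuristic: compare two alternatives
    cue by cue, in descending order of cue validity, and decide on the
    first cue that discriminates between them.

    cues_a, cues_b: dicts mapping cue name -> binary cue value (0 or 1)
    validity_order: cue names sorted from most to least valid
    Returns 'a', 'b', or 'guess' if no cue discriminates.
    """
    for cue in validity_order:
        if cues_a[cue] != cues_b[cue]:
            return "a" if cues_a[cue] > cues_b[cue] else "b"
    return "guess"

# Hypothetical example: prioritising one of two incidents.
a = {"people_trapped": 1, "smoke_visible": 1, "alarm_active": 0}
b = {"people_trapped": 1, "smoke_visible": 0, "alarm_active": 1}
order = ["people_trapped", "smoke_visible", "alarm_active"]
print(take_the_best(a, b, order))  # -> "a" (decided by the second cue)
```

The rule is non-compensatory because once a cue discriminates, later cues cannot overturn the decision, no matter how they point.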

    Markov-modulated marked Poisson processes for modelling disease dynamics based on medical claims data

    We explore Markov-modulated marked Poisson processes (MMMPPs) as a natural framework for modelling patients' disease dynamics over time based on medical claims data. In claims data, observations not only occur at random points in time but are also informative, i.e. driven by unobserved disease levels, as poor health conditions usually lead to more frequent interactions with the healthcare system. Therefore, we model the observation process as a Markov-modulated Poisson process, where the rate of healthcare interactions is governed by a continuous-time Markov chain. Its states serve as proxies for the patients' latent disease levels and further determine the distribution of additional data collected at each observation time, the so-called marks. Overall, MMMPPs jointly model observations and their informative time points by comprising two state-dependent processes: the observation process (corresponding to the event times) and the mark process (corresponding to event-specific information), which both depend on the underlying states. The approach is illustrated using claims data from patients diagnosed with chronic obstructive pulmonary disease (COPD) by modelling their drug use and the interval lengths between consecutive physician consultations. The results indicate that MMMPPs are able to detect distinct patterns of healthcare utilisation related to disease processes and reveal inter-individual differences in the state-switching dynamics.
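The two state-dependent processes described above can be sketched with a minimal simulation. The two-state generator, the event rates, and the exponential mark distribution below are illustrative choices, not the fitted COPD model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two latent disease states (illustrative): 0 = stable, 1 = exacerbated.
Q = np.array([[-0.1,  0.1],   # generator of the continuous-time Markov chain
              [ 0.3, -0.3]])
event_rate = np.array([0.5, 3.0])  # state-dependent rate of healthcare contacts
mark_mean = np.array([1.0, 4.0])   # state-dependent mean of the mark (e.g. drug use)

def simulate_mmmpp(t_end=100.0):
    """Simulate event times and marks of a two-state MMMPP on [0, t_end]."""
    t, state = 0.0, 0
    times, marks = [], []
    while t < t_end:
        # Sojourn time in the current latent state.
        sojourn = rng.exponential(1.0 / -Q[state, state])
        dur = min(sojourn, t_end - t)
        # Within the sojourn, events form a Poisson process whose rate and
        # whose mark distribution both depend on the current latent state.
        n_events = rng.poisson(event_rate[state] * dur)
        times.extend(t + rng.uniform(0.0, dur, n_events))
        marks.extend(rng.exponential(mark_mean[state], n_events))
        t += sojourn
        state = 1 - state  # with two states, switch to the other state
    order = np.argsort(times)
    return np.array(times)[order], np.array(marks)[order]

times, marks = simulate_mmmpp()
print(f"{len(times)} events; mean mark {marks.mean():.2f}")
```

Note how the informativeness of the observation times arises naturally: the exacerbated state produces both denser event times and larger marks.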

    Sparse Multivariate Modeling: Priors and Applications


    Functional network analyses and dynamical modeling of proprioceptive updating of the body schema

    Proprioception is the ability to perceive the position and speed of body parts, and it is important for construction of the body schema in the brain. Proper updating of the body schema is necessary for appropriate voluntary movement; however, the mechanisms mediating such updating are not well understood. To study these mechanisms, electroencephalography (EEG) and evoked-potential studies were employed when the body part was at rest, and kinematic studies were performed when the body was in motion. An experimental approach to elicit proprioceptive P300 evoked potentials was developed, providing evidence that processing of novel passive movements is similar to processing of novel visual and auditory stimuli. The latencies of the proprioceptive P300 potentials were found to be greater than those elicited by auditory stimuli, but not different from those elicited by visual stimuli. The features of the functional networks that generated the P300s were analyzed for each modality. Cross-correlation networks showed both common features, e.g. connections between frontal and parietal areas, and stimulus-specific features, e.g. increased connectivity at temporal electrodes in the visual and auditory networks but not in the proprioceptive ones. The magnitude-of-coherency networks showed a reduction in alpha-band connectivity for most electrode groupings across all stimulus modalities, but did not demonstrate modality-specific features. The kinematic study compared the performance of 19 models previously proposed in the literature for movements at the shoulder and elbow joints in terms of their ability to reconstruct the speed profiles of wrist pointing movements. It was found that lognormal and beta-function models are most suitable for wrist speed profile modeling. In addition, an investigation of blinking rates during the P300 recordings revealed significantly lower rates in left-handed participants compared to right-handed ones.
Future work will include extending the experimental and analytical methodologies to different kinds of proprioceptive stimuli (displacements and speeds) and experimental paradigms (error-related negativity potentials), comparing models of the speed profiles produced by the feet to those of the wrists, and replicating the observations on blinking rates in a larger-scale study.
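The lognormal model found most suitable for wrist speed profiles can be sketched as follows: movement speed over time follows the shape of a lognormal density, scaled by movement distance. The parameter values below are illustrative, not fitted to the wrist data:

```python
import numpy as np

def lognormal_speed_profile(t, d=1.0, mu=-1.0, sigma=0.4):
    """Lognormal model of a point-to-point movement speed profile:
    v(t) = d / (sigma * t * sqrt(2*pi)) * exp(-(ln t - mu)^2 / (2 sigma^2)),
    where d scales with movement distance and mu, sigma shape the timing
    and skew. Parameter values here are illustrative, not fitted."""
    t = np.asarray(t, dtype=float)
    v = np.zeros_like(t)
    pos = t > 0
    v[pos] = (d / (sigma * t[pos] * np.sqrt(2 * np.pi))) * np.exp(
        -(np.log(t[pos]) - mu) ** 2 / (2 * sigma ** 2))
    return v

t = np.linspace(0.0, 2.0, 500)  # time in seconds
v = lognormal_speed_profile(t)
# The profile is bell-shaped but slightly skewed: the speed peak occurs
# at t = exp(mu - sigma^2), earlier than a symmetric profile's midpoint.
print(f"peak speed at t = {t[np.argmax(v)]:.2f} s")
```

Because v(t) integrates to d, the model ties the profile's area directly to the movement distance, which is convenient when fitting to observed wrist trajectories.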

    A New Trivariate Model and Generalized Linear Model for Stochastic Episodes' Duration, Magnitude and Maximum

    In this dissertation we work with a trivariate model for stochastic events $(N, X, Y)$, where $N$ is the duration, $X$ is the magnitude, and $Y$ is the maximum of an event. We first consider the case where $N$ has a geometric distribution, $X=\sum_{i=1}^{N}X_i$, and $Y=\bigvee_{i=1}^{N}X_i$, where the $X_i$'s are independent and identically distributed (IID) exponential random variables. Such events arise, for example, when a process is observed above or below a threshold; examples include heat waves, floods, droughts, and market growth or decline periods. In this setting, we extend the IID model to one that incorporates covariates. We prove existence and uniqueness of the maximum likelihood estimators of the parameters and introduce a new method for checking the goodness of fit of the model to the data. Our goodness-of-fit method is based on the distributional fit of appropriately transformed data. We include a data example from finance to illustrate the modeling potential of this new generalized linear model. In the second part, we extend the model for $(N, X, Y)$ to the case where $N$ has a 1-inflated (or deflated) geometric distribution. Data requiring this extension appear in several areas of application, including actuarial science, finance, and weather and climate. We provide basic properties and estimation of the parameters of this new class of multivariate mixed distributions. Our results include marginal and conditional distributions, joint integral transforms, moments, stochastic representations, estimation, and testing. An example from finance illustrates the modeling potential of this new model. Finally, we show an important application of our model to weather data, where the process of interest is total daily precipitation. Here, the random vectors $(N, X, Y)$ describe the duration, the magnitude, and the maximum of precipitation events, defined as consecutive days when precipitation exceeds a certain threshold.
We fit our model to observational and predicted precipitation data from several global climate models for the period 1950-2100, with particular attention to storms generated by atmospheric rivers. We show that the estimated parameters change to reflect the observed topography and type of storms in the region of interest. Our results provide insight into the duration and magnitude of future storms, which in turn provides quantitative information for water resources and emergency planning.
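The IID base model above can be simulated directly. A known consequence of this construction is that a geometric sum of IID exponentials is again exponential (with rate $\lambda p$), which the sketch below checks empirically; the parameters p and lam are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_event(p=0.3, lam=1.0):
    """One stochastic episode under the IID model:
    duration N ~ Geometric(p) (support 1, 2, ...),
    magnitude X = sum of N iid Exp(lam) values,
    maximum Y = max of those same values."""
    n = rng.geometric(p)
    x_i = rng.exponential(1.0 / lam, n)
    return n, x_i.sum(), x_i.max()

samples = [simulate_event() for _ in range(20000)]
n, x, y = map(np.array, zip(*samples))

# Sanity checks against known moments: E[N] = 1/p, and since X is
# exponential with rate lam*p, E[X] = 1/(lam*p).
print(f"mean N = {n.mean():.2f} (theory {1/0.3:.2f})")
print(f"mean X = {x.mean():.2f} (theory {1/0.3:.2f})")
```

Because all excesses are nonnegative, the maximum never exceeds the magnitude, so Y <= X holds for every simulated event, with equality exactly when N = 1.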

    Untangling hotel industry’s inefficiency: An SFA approach applied to a renowned Portuguese hotel chain

    The present paper explores the technical efficiency of four hotels from the Teixeira Duarte Group, a renowned Portuguese hotel chain. An efficiency ranking of these four hotel units located in Portugal is established using Stochastic Frontier Analysis (SFA). This methodology makes it possible to discriminate between measurement error and systematic inefficiency in the estimation process, enabling investigation of the main causes of inefficiency. Several suggestions concerning efficiency improvement are offered for each hotel studied.
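The SFA decomposition described above rests on the composed error being asymmetric: symmetric two-sided noise plus a one-sided inefficiency term produces a left-skewed residual, and that skewness is what identifies inefficiency. A minimal simulation sketch follows; the coefficients and variances are illustrative, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated stochastic-frontier data (illustrative):
# log output = beta0 + beta1 * log input + v - u, where v is symmetric
# measurement noise and u >= 0 is systematic inefficiency.
n = 1000
log_input = rng.uniform(0.0, 2.0, n)
v = rng.normal(0.0, 0.1, n)          # two-sided noise
u = np.abs(rng.normal(0.0, 0.3, n))  # one-sided (half-normal) inefficiency
log_output = 1.0 + 0.8 * log_input + v - u

# The composed error v - u is left-skewed; a purely noisy frontier
# (u = 0) would give a symmetric error instead.
eps = v - u
skew = np.mean((eps - eps.mean()) ** 3) / eps.std() ** 3
print(f"skewness of composed error: {skew:.2f}")  # negative
```

In an actual SFA estimation the variances of v and u are estimated jointly with the frontier coefficients, typically by maximum likelihood under the half-normal (or similar one-sided) assumption on u.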