712 research outputs found

    Mapping quality of experience (QoE) through quality of service (QoS), focused on distributed databases

    Get PDF
    Doctoral thesis - Universidade Federal de Santa Catarina, Centro Tecnológico, Programa de Pós-Graduação em Ciência da Computação, Florianópolis, 2017. Abstract: The lack of a congruent conceptualization of quality of service (QoS) for databases (DBs) was the factor that motivated the study resulting in this thesis. Defining QoS as a simple check of whether a node is at risk of failure because of the number of accesses it receives, as some commercial systems did at the time of this thesis's bibliometric survey, is too great a simplification for such a complex concept. Other works that claim to deal with these concepts are not mathematically exact and lack concrete definitions, or definitions of sufficient quality to be used or replicated, which makes their application, or even their verification, infeasible. The focus of this study is distributed databases (DDBs), and the conceptualization developed here is also compatible, at least partially, with non-distributed DB models. The new QoS definitions are then used to handle the correlated concept of quality of experience (QoE) in a system-level approach focused on QoS completeness. Although QoE is a multidimensional concept that is difficult to measure, the focus is kept on a measurable approach, so that DDB systems can perform self-evaluation. The idea of self-evaluation arises from the need to identify problems that are amenable to self-correction. With QoS statistically well defined, behavior and behavioral tendencies can be analysed in order to predict future states, allowing a correction process to start, through statistical prediction, before the system reaches unexpected states. The general objective of this thesis is thus the definition of QoS and QoE metrics focused on DDBs, under the hypothesis that QoE can be defined statistically on the basis of QoS for system-level purposes; both concepts are new to DDBs when exact, measurable metrics are involved. With these concepts defined, an architectural recovery model is presented and tested to demonstrate the results obtained when the defined metrics are used for behavioral prediction.
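    The behavioral-prediction step described above can be illustrated with a minimal sketch, which is not the thesis's actual model: fit a linear trend to a sampled QoS metric and estimate how soon it would cross a critical threshold. The metric name, threshold, and horizon below are hypothetical placeholders.

```python
import numpy as np

def predict_threshold_crossing(samples, threshold, horizon=10):
    """Fit a linear trend to a QoS metric series and return the earliest
    future step (within `horizon`) at which the trend reaches `threshold`,
    together with the predicted value; return None if no crossing is predicted."""
    t = np.arange(len(samples))
    slope, intercept = np.polyfit(t, samples, 1)  # least-squares linear trend
    for step in range(1, horizon + 1):
        predicted = slope * (len(samples) - 1 + step) + intercept
        if predicted >= threshold:
            return step, predicted
    return None

# Hypothetical example: per-node query latency (ms) sampled once per minute.
latency_ms = [12.0, 12.4, 13.1, 13.8, 14.9, 15.7, 16.8]
print(predict_threshold_crossing(latency_ms, threshold=20.0))
```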

    Applications and case studies in oil refineries

    Get PDF
    The widespread adoption of wireless systems for industrial automation calls for efficient tools for the virtual planning of network deployments, similar to those available for conventional Fieldbus and wired systems. In industrial sites, radio signal propagation is subject to blockage by highly dense metallic structures. Network planning should therefore account for the number and the density of the 3D obstructions surrounding each link. In this paper we address the problem of node deployment in wireless industrial networks, with special focus on the WirelessHART IEC 62591 and ISA SP100 IEC 62734 standards. The goal is to optimize network connectivity and to develop an effective tool that can work in complex industrial sites characterized by severe obstructions. The proposed node deployment approach is validated through a case study in an oil refinery environment. It includes an ad-hoc simulation environment (the RFSim tool) that implements the proposed network planning approach using 2D models of the plant, providing connectivity information based on user-defined deployment configurations. Simulation results obtained with the proposed simulation environment were validated by on-site measurements.
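    As an illustration of the kind of 2D line-of-sight check the abstract describes, the sketch below counts how many obstruction segments cross a link and applies a crude free-space link budget. RFSim itself is not documented here; the frequency, per-obstruction loss, and receiver sensitivity values are hypothetical placeholders.

```python
import math

def ccw(a, b, c):
    # True if the points a, b, c make a counter-clockwise turn.
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2 (collinear cases ignored)."""
    return ccw(p1, q1, q2) != ccw(p2, q1, q2) and ccw(p1, p2, q1) != ccw(p1, p2, q2)

def link_feasible(tx, rx, obstructions, f_mhz=2400.0, tx_power_dbm=10.0,
                  sensitivity_dbm=-85.0, loss_per_obstruction_db=8.0):
    """Crude link-budget check: free-space path loss plus a fixed penalty
    for every 2D obstruction segment crossed by the line of sight."""
    d_m = math.dist(tx, rx)
    fspl_db = 20 * math.log10(d_m) + 20 * math.log10(f_mhz) - 27.55  # d in m, f in MHz
    blocked = sum(segments_intersect(tx, rx, a, b) for a, b in obstructions)
    rx_power = tx_power_dbm - fspl_db - blocked * loss_per_obstruction_db
    return rx_power >= sensitivity_dbm, rx_power, blocked

# Hypothetical plant layout: two nodes and one metallic wall segment between them.
walls = [((10.0, 0.0), (10.0, 20.0))]
print(link_feasible((0.0, 5.0), (30.0, 5.0), walls))
```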

    Canagliflozin and renal outcomes in type 2 diabetes and nephropathy

    Get PDF
    BACKGROUND Type 2 diabetes mellitus is the leading cause of kidney failure worldwide, but few effective long-term treatments are available. In cardiovascular trials of inhibitors of sodium–glucose cotransporter 2 (SGLT2), exploratory results have suggested that such drugs may improve renal outcomes in patients with type 2 diabetes. METHODS In this double-blind, randomized trial, we assigned patients with type 2 diabetes and albuminuric chronic kidney disease to receive canagliflozin, an oral SGLT2 inhibitor, at a dose of 100 mg daily or placebo. All the patients had an estimated glomerular filtration rate (GFR) of 30 to <90 ml per minute per 1.73 m² of body-surface area and albuminuria (ratio of albumin [mg] to creatinine [g], >300 to 5000) and were treated with renin–angiotensin system blockade. The primary outcome was a composite of end-stage kidney disease (dialysis, transplantation, or a sustained estimated GFR of <15 ml per minute per 1.73 m²), a doubling of the serum creatinine level, or death from renal or cardiovascular causes. Prespecified secondary outcomes were tested hierarchically. RESULTS The trial was stopped early after a planned interim analysis on the recommendation of the data and safety monitoring committee. At that time, 4401 patients had undergone randomization, with a median follow-up of 2.62 years. The relative risk of the primary outcome was 30% lower in the canagliflozin group than in the placebo group, with event rates of 43.2 and 61.2 per 1000 patient-years, respectively (hazard ratio, 0.70; 95% confidence interval [CI], 0.59 to 0.82; P=0.00001). The relative risk of the renal-specific composite of end-stage kidney disease, a doubling of the creatinine level, or death from renal causes was lower by 34% (hazard ratio, 0.66; 95% CI, 0.53 to 0.81; P<0.001), and the relative risk of end-stage kidney disease was lower by 32% (hazard ratio, 0.68; 95% CI, 0.54 to 0.86; P=0.002). The canagliflozin group also had a lower risk of cardiovascular death, myocardial infarction, or stroke (hazard ratio, 0.80; 95% CI, 0.67 to 0.95; P=0.01) and hospitalization for heart failure (hazard ratio, 0.61; 95% CI, 0.47 to 0.80; P<0.001). There were no significant differences in rates of amputation or fracture. CONCLUSIONS In patients with type 2 diabetes and kidney disease, the risk of kidney failure and cardiovascular events was lower in the canagliflozin group than in the placebo group at a median follow-up of 2.62 years.

    Multidifferential study of identified charged hadron distributions in $Z$-tagged jets in proton-proton collisions at $\sqrt{s}=13$ TeV

    Full text link
    Jet fragmentation functions are measured for the first time in proton-proton collisions for charged pions, kaons, and protons within jets recoiling against a $Z$ boson. The charged-hadron distributions are studied longitudinally and transversely to the jet direction for jets with transverse momentum $20 < p_{\textrm{T}} < 100$ GeV and in the pseudorapidity range $2.5 < \eta < 4$. The data sample was collected with the LHCb experiment at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 1.64 fb$^{-1}$. Triple differential distributions as a function of the hadron longitudinal momentum fraction, hadron transverse momentum, and jet transverse momentum are also measured for the first time. This helps constrain transverse-momentum-dependent fragmentation functions. Differences in the shapes and magnitudes of the measured distributions for the different hadron species provide insights into the hadronization process for jets predominantly initiated by light quarks. Comment: All figures and tables, along with machine-readable versions and any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-013.html (LHCb public pages)
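    For reference, the hadron longitudinal momentum fraction and the hadron transverse momentum relative to the jet axis used in such measurements are conventionally defined as $z = \vec{p}_{h}\cdot\vec{p}_{\mathrm{jet}}/|\vec{p}_{\mathrm{jet}}|^{2}$ and $j_{T} = |\vec{p}_{h}\times\vec{p}_{\mathrm{jet}}|/|\vec{p}_{\mathrm{jet}}|$; these are the standard definitions assumed here rather than quoted from the paper.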

    Study of the $B^{-} \to \Lambda_{c}^{+} \bar{\Lambda}_{c}^{-} K^{-}$ decay

    Full text link
    The decay $B^{-} \to \Lambda_{c}^{+} \bar{\Lambda}_{c}^{-} K^{-}$ is studied in proton-proton collisions at a center-of-mass energy of $\sqrt{s}=13$ TeV using data corresponding to an integrated luminosity of 5 $\mathrm{fb}^{-1}$ collected by the LHCb experiment. In the $\Lambda_{c}^{+} K^{-}$ system, the $\Xi_{c}(2930)^{0}$ state observed at the BaBar and Belle experiments is resolved into two narrower states, $\Xi_{c}(2923)^{0}$ and $\Xi_{c}(2939)^{0}$, whose masses and widths are measured to be $m(\Xi_{c}(2923)^{0}) = 2924.5 \pm 0.4 \pm 1.1 \,\mathrm{MeV}$, $m(\Xi_{c}(2939)^{0}) = 2938.5 \pm 0.9 \pm 2.3 \,\mathrm{MeV}$, $\Gamma(\Xi_{c}(2923)^{0}) = 4.8 \pm 0.9 \pm 1.5 \,\mathrm{MeV}$, $\Gamma(\Xi_{c}(2939)^{0}) = 11.0 \pm 1.9 \pm 7.5 \,\mathrm{MeV}$, where the first uncertainties are statistical and the second systematic. The results are consistent with a previous LHCb measurement using a prompt $\Lambda_{c}^{+} K^{-}$ sample. Evidence of a new $\Xi_{c}(2880)^{0}$ state is found with a local significance of $3.8\,\sigma$, whose mass and width are measured to be $2881.8 \pm 3.1 \pm 8.5\,\mathrm{MeV}$ and $12.4 \pm 5.3 \pm 5.8 \,\mathrm{MeV}$, respectively. In addition, evidence of a new decay mode $\Xi_{c}(2790)^{0} \to \Lambda_{c}^{+} K^{-}$ is found with a significance of $3.7\,\sigma$. The relative branching fraction of $B^{-} \to \Lambda_{c}^{+} \bar{\Lambda}_{c}^{-} K^{-}$ with respect to the $B^{-} \to D^{+} D^{-} K^{-}$ decay is measured to be $2.36 \pm 0.11 \pm 0.22 \pm 0.25$, where the first uncertainty is statistical, the second systematic, and the third originates from the branching fractions of charm hadron decays. Comment: All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-028.html (LHCb public pages)

    Measurement of the ratios of branching fractions $\mathcal{R}(D^{*})$ and $\mathcal{R}(D^{0})$

    Full text link
    The ratios of branching fractions $\mathcal{R}(D^{*})\equiv\mathcal{B}(\bar{B}\to D^{*}\tau^{-}\bar{\nu}_{\tau})/\mathcal{B}(\bar{B}\to D^{*}\mu^{-}\bar{\nu}_{\mu})$ and $\mathcal{R}(D^{0})\equiv\mathcal{B}(B^{-}\to D^{0}\tau^{-}\bar{\nu}_{\tau})/\mathcal{B}(B^{-}\to D^{0}\mu^{-}\bar{\nu}_{\mu})$ are measured, assuming isospin symmetry, using a sample of proton-proton collision data corresponding to 3.0 fb$^{-1}$ of integrated luminosity recorded by the LHCb experiment during 2011 and 2012. The tau lepton is identified in the decay mode $\tau^{-}\to\mu^{-}\nu_{\tau}\bar{\nu}_{\mu}$. The measured values are $\mathcal{R}(D^{*})=0.281\pm0.018\pm0.024$ and $\mathcal{R}(D^{0})=0.441\pm0.060\pm0.066$, where the first uncertainty is statistical and the second is systematic. The correlation between these measurements is $\rho=-0.43$. The results are consistent with the current average of these quantities and are at a combined 1.9 standard deviations from the predictions based on lepton flavor universality in the Standard Model. Comment: All figures and tables, along with any supplementary material and additional information, are available at https://cern.ch/lhcbproject/Publications/p/LHCb-PAPER-2022-039.html (LHCb public pages)

    Risk Portfolio Optimization Using the Markowitz MVO Model, in Relation to Human Limitations in Predicting the Future from the Perspective of the Al-Qur`an

    Full text link
    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely, as written in Al-Quran surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that achieve at least a certain expected return, or alternatively, to maximize the expected return among all portfolios whose variance does not exceed a given level. This study therefore focuses on optimizing the risk portfolio with the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function $x^{T}Qx$ subject to the constraints $\mu^{T}x \ge r$ and $Ax = b$. The outcome of this research is the optimal risk portfolio over a set of investments, solved straightforwardly with the MATLAB R2007b software together with its graphical analysis.
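    The quadratic program above can be sketched in a few lines. The study used MATLAB R2007b; the sketch below is an equivalent formulation in Python with a hypothetical covariance matrix, expected returns, and target return, plus a full-investment and no-short-selling assumption added for concreteness.

```python
import numpy as np
from scipy.optimize import minimize

def markowitz_min_variance(Q, mu, r_target):
    """Minimize x'Qx subject to mu'x >= r_target, sum(x) == 1, and x >= 0."""
    n = len(mu)
    constraints = [
        {"type": "ineq", "fun": lambda x: mu @ x - r_target},  # required expected return
        {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},    # fully invested budget
    ]
    bounds = [(0.0, 1.0)] * n                                   # no short selling
    x0 = np.full(n, 1.0 / n)                                    # equal-weight starting point
    res = minimize(lambda x: x @ Q @ x, x0, bounds=bounds,
                   constraints=constraints, method="SLSQP")
    return res.x

# Hypothetical data: covariance matrix and expected returns for three assets.
Q = np.array([[0.10, 0.02, 0.01],
              [0.02, 0.08, 0.03],
              [0.01, 0.03, 0.12]])
mu = np.array([0.07, 0.05, 0.09])
print(markowitz_min_variance(Q, mu, r_target=0.06))
```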

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at root s=13 TeV

    Get PDF
    Peer reviewed

    Why Are Outcomes Different for Registry Patients Enrolled Prospectively and Retrospectively? Insights from the Global Anticoagulant Registry in the FIELD-Atrial Fibrillation (GARFIELD-AF).

    Get PDF
    Background: Retrospective and prospective observational studies are designed to reflect real-world evidence on clinical practice, but can yield conflicting results. The GARFIELD-AF Registry includes both methods of enrolment and allows analysis of the differences in patient characteristics and outcomes that may result. Methods and Results: Patients with atrial fibrillation (AF) and ≥1 risk factor for stroke at diagnosis of AF were recruited either retrospectively (n = 5069) or prospectively (n = 5501) from 19 countries and then followed prospectively. The retrospectively enrolled cohort comprised patients with established AF (for at least 6, and up to 24, months before enrolment) who were identified retrospectively (with baseline and partial follow-up data collected from the medical records) and then followed prospectively for 0-18 months, such that the total follow-up was 24 months (data collection between Dec-2009 and Oct-2010). In the prospectively enrolled cohort, patients with newly diagnosed AF (≤6 weeks after diagnosis) were recruited between Mar-2010 and Oct-2011 and were followed for 24 months after enrolment. Differences between the cohorts were observed in clinical characteristics, including type of AF, stroke prevention strategies, and event rates. More patients in the retrospectively identified cohort received vitamin K antagonists (62.1% vs. 53.2%) and fewer received non-vitamin K oral anticoagulants (1.8% vs. 4.2%). All-cause mortality rates per 100 person-years during the prospective follow-up (from the first study visit up to 1 year) were significantly lower in the retrospectively than in the prospectively identified cohort (3.04 [95% CI 2.51 to 3.67] vs. 4.05 [95% CI 3.53 to 4.63]; p = 0.016). Conclusions: Interpretations of data from registries that aim to evaluate the characteristics and outcomes of patients with AF must take account of differences in registry design and of the recall bias and survivorship bias incurred with retrospective enrolment. Clinical Trial Registration: URL: http://www.clinicaltrials.gov. Unique identifier for GARFIELD-AF: NCT01090362.