1,287 research outputs found

    Synthesis and Characterisation of PANi/ZnO Composites as a Methanol Gas Sensor

    In this study, PANi/ZnO composites were synthesised and characterised in order to determine their properties and to compare the sensor's sensitivity to methanol gas at different methanol concentrations. The composites were synthesised by in-situ polymerisation with ZnO fractions of 10%, 20%, and 30%. FTIR characterisation showed the bonds characteristic of PANi/ZnO in all synthesised samples. The XRD patterns showed the highest characteristic ZnO peak at 36.19°, while the characteristic PANi diffraction peak lay at 31.74°, with an orthorhombic crystal structure. Sample morphology was observed by SEM; the results showed that the 30% PANi/ZnO sample had a porous, ovoid-shaped structure with a diameter of 378 nm. Electrical conductivity measurements showed that pure PANi had the highest conductivity, 3.58 × 10⁻⁶ S/cm, but the conductivity decreased as the ZnO fraction increased. Used as a methanol gas sensor material, the 30% PANi/ZnO composite produced relatively high sensitivity, and the sensitivity increased with the methanol concentration. It is therefore concluded that the 30% PANi/ZnO composite is the best of the samples as a sensor material for detecting methanol gas.
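    As a rough illustration of how such a gas sensor's response is typically quantified (the abstract does not state the exact definition used), the sensitivity can be taken as the relative resistance change of the sensing layer between air and methanol exposure; the resistance readings below are hypothetical:

```python
def sensitivity(r_air, r_gas):
    """Relative resistance change between air and gas exposure, in percent
    (assumed definition; the paper's exact formula is not given)."""
    return abs(r_air - r_gas) / r_air * 100.0

# Hypothetical resistance readings (ohms) at increasing methanol concentrations
readings = [(1.00e6, 0.80e6), (1.00e6, 0.65e6), (1.00e6, 0.50e6)]
for r_air, r_gas in readings:
    print(f"S = {sensitivity(r_air, r_gas):.1f}%")
```

    With this definition, the monotone rise in sensitivity with concentration matches the trend reported in the abstract.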

    Examining relationship between service quality, user satisfaction, and performance impact in the context of smart government in UAE

    Governments attempt to use all forms of information technology, including the Internet and mobile computing, to transform their relationships with citizens. However, there is a clear gap between the measured impact of technology-innovation output and the government's vision in the UAE (United Arab Emirates). In this regard, investigating the relationship between service quality, user satisfaction, and performance impact may help the government mark its current progress and milestone achievements. This research proposed a model based on the DeLone & McLean IS success model, adapted to the research context. Structural equation modelling via PLS (partial least squares) regression was applied to evaluate the model within the context of the public sector in the UAE. The data were collected from a sample of 147 employees in public organisations using a questionnaire. Results demonstrated that service quality has a significant effect on user satisfaction. In addition, service quality and user satisfaction positively influence staff performance. The outcome of this research helps to enhance the understanding of the impact of smart government applications.
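    The structural paths described above (service quality → satisfaction → performance) can be sketched with a simplified ordinary-least-squares stand-in for PLS-SEM; the composite scores and path weights below are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 147  # sample size matching the study
# Hypothetical composite scores (1-5 scale) for the three constructs
quality = rng.uniform(1, 5, n)
satisfaction = 0.7 * quality + rng.normal(0, 0.5, n)
performance = 0.4 * quality + 0.5 * satisfaction + rng.normal(0, 0.5, n)

# Path estimates: regress performance on quality and satisfaction
X = np.column_stack([np.ones(n), quality, satisfaction])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(f"quality -> performance: {beta[1]:.2f}, "
      f"satisfaction -> performance: {beta[2]:.2f}")
```

    Both estimated paths come out positive, mirroring the direction (though not the method) of the reported findings.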

    Association between overweight and obesity and chronic gastritis in adults

    Objective: To determine the association between overweight and obesity and the presence of chronic gastritis in adults at the Hospital de Apoyo II-2 Sullana in 2019. Material and methods: An observational study with an analytical cross-sectional design was carried out, including patients attended in the gastroenterology service of the Hospital de Apoyo II-2 Sullana in 2019. Age, sex, and body mass index were collected for patients presenting with chronic gastritis. Results: A total of 305 patients were included, of whom 256 had chronic gastritis, a prevalence of 83.9%. The mean age of patients with chronic gastritis was 50.4 ± 1.1 years. The pathology predominated in patients between 18 and 39 years of age, of male sex, and with a body mass index ≥ 40 kg/m². Class III obesity increased the prevalence of chronic gastritis by 28% (aPR 1.28, 95% CI 1.14 to 1.43, p < 0.001) after adjusting for sex and age. Conclusions: In the overall evaluation of the included patients, class III obesity was a factor associated with the presence of chronic gastritis in the present research.
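    For illustration, a crude prevalence ratio can be computed directly from 2×2 counts; the counts below are hypothetical and do not reproduce the study's adjusted estimate (aPR 1.28), which required regression adjustment for sex and age:

```python
# Hypothetical 2x2 counts: exposure = class III obesity, outcome = chronic gastritis
exposed_cases, exposed_total = 58, 62
unexposed_cases, unexposed_total = 198, 243

prev_exposed = exposed_cases / exposed_total        # prevalence among exposed
prev_unexposed = unexposed_cases / unexposed_total  # prevalence among unexposed
crude_pr = prev_exposed / prev_unexposed            # crude prevalence ratio
print(f"crude PR = {crude_pr:.2f}")
```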

    Statistical analysis of the owl:sameAs network for aligning concepts in the linking open data cloud

    The massively distributed publication of linked data has brought to the attention of the scientific community the limitations of classic methods for achieving data integration, as well as the opportunity to push the boundaries of the field by experimenting with the collective enterprise that is the linking open data cloud. While reusing existing ontologies is the preferred choice, exploiting ontology alignments is still a required step for easing the burden of integrating heterogeneous data sets. Alignments, even between the most used vocabularies, are still poorly supported in today's systems, whereas links between instances are the most widely used means of bridging the gap between different data sets. In this paper we provide a statistical and qualitative analysis of the network of instance-level equivalences in the Linking Open Data Cloud (i.e. the sameAs network) in order to automatically compute alignments at the conceptual level. Moreover, we explore the effect of ontological information when adapting classical Jaccard methods to the ontology alignment task. Automating this task will in fact make it possible to achieve a clearer conceptual description of the data at the cloud level, while improving the level of integration between datasets.
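    A minimal sketch of the Jaccard-based concept alignment idea: treat each concept as the set of instances linked to it through the sameAs network and compare the sets. The concept names and instance identifiers below are made up:

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of instance identifiers."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical instance sets gathered by following owl:sameAs links
dbpedia_person = {"alice", "bob", "carol", "dave"}
foaf_person = {"alice", "bob", "carol", "erin"}
print(jaccard(dbpedia_person, foaf_person))  # 3 shared / 5 total = 0.6
```

    A high score would suggest the two concepts are candidates for alignment; the paper's contribution is to weight such evidence with ontological information rather than use the raw score alone.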

    Evaluating the Validation Process: Embracing Complexity and Transparency in Health Economic Modelling

    Reimbursement decisions and price negotiation of healthcare interventions often rely on health economic model results. Such decisions affect resource allocation, patient outcomes and future healthcare choices. To ensure optimal decisions, assessing the validity of health economic models may be crucial. Validation involves much more than identifying (and hopefully correcting) errors in the model implementation. It also includes assessing the conceptual validity of the model, validating the model input data, and checking whether the model's predictions align sufficiently well with real-world data. In the context of health economics, validation can be defined as "the act of evaluating whether a model is a proper and sufficient representation of the system it is intended to represent in view of an application", meaning that the model complies with what is known about the system and its outcomes provide a robust basis for decision making. [...] Validation of health economic models should be seen as a critical component of evidence-based decision making in healthcare. However, as of today, it still faces several important challenges, including the lack of consensus guidance and standardised procedures, the need for greater rigour, and the question of who should oversee the validation process. To address these challenges, we encourage model developers, agencies requiring models for their decision making, and editors of journals that publish models to recommend the use of state-of-the-art tools for reporting (and conducting) validations of health economic models, such as those mentioned in this editorial.
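    One piece of such a validation, checking model predictions against real-world data, can be sketched with a simple error metric. The survival figures below are invented for illustration, and the metric choice (MAPE) is an assumption, not a recommendation from the editorial:

```python
def mape(observed, predicted):
    """Mean absolute percentage error between real-world and model outcomes."""
    return sum(abs(o - p) / o for o, p in zip(observed, predicted)) / len(observed) * 100

# Hypothetical 1- to 5-year survival proportions: registry data vs. model output
observed = [0.90, 0.81, 0.74, 0.69, 0.65]
predicted = [0.92, 0.80, 0.75, 0.66, 0.64]
print(f"MAPE = {mape(observed, predicted):.1f}%")
```

    Whether a given error level is acceptable depends on the decision context, which is precisely why the editorial calls for standardised validation guidance.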

    Contemporary Art Authentication With Large-Scale Classification

    Art authentication is the process of identifying the artist who created a piece of artwork and is manifested through events of provenance, such as art gallery exhibitions and financial transactions. Art authentication also has a visual dimension: the uniqueness of one artist's style in contrast to the style of another. The significance of this contrast is proportional to the number of artists involved and the degree of uniqueness of an artist's collection. This visual uniqueness of style can be captured in a mathematical model produced by a machine learning (ML) algorithm trained on painting images. Art authentication is not always possible, as provenance can be obscured or lost through anonymity, forgery, gifting, or theft of artwork. This paper presents an image-only art authentication attribute marker of contemporary art paintings for a very large number of artists. The experiments in this paper demonstrate that it is possible to use ML-generated models to authenticate contemporary art across 2368 down to 100 artists with an accuracy of 48.97% to 91.23%, respectively. This is the largest effort for image-only art authentication to date, with respect to the number of artists involved and the accuracy of authentication.
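    The attribution task can be caricatured with a nearest-centroid classifier over synthetic "style embeddings"; this is a toy stand-in under invented data, not the paper's actual ML pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
n_artists, dim = 5, 16  # tiny stand-in for the 100-2368 artists in the study
# Hypothetical style embeddings: each artist's paintings cluster around a centroid
centroids = rng.normal(0, 1, (n_artists, dim))
paintings = centroids + rng.normal(0, 0.3, (n_artists, dim))  # one test painting each

# Attribute each painting to the artist whose centroid is nearest
dists = np.linalg.norm(paintings[:, None, :] - centroids[None, :, :], axis=2)
predicted = dists.argmin(axis=1)
accuracy = (predicted == np.arange(n_artists)).mean()
print(f"accuracy = {accuracy:.2f}")
```

    As in the paper, accuracy degrades as the number of artists grows and their style clusters start to overlap.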

    Comparison between Analytical Equation and Numerical Methods for Determining Shear Stress in a Cantilever Beam

    A three-meter-long cantilever beam loaded with a concentrated load at its free end is studied to determine shear stresses. In the present study, three cross sections are considered: rectangular (R), I, and T. The study compares the maximum shear stresses obtained by two methods: the classical analytical equation derived by Collignon, and finite element method (FEM) software. The software programs ANSYS and SAP2000 were used. The results show a difference between the maximum shear stresses obtained by the analytical equation and by the software, the latter always being higher. The average differences for ANSYS and SAP2000, independently of the cross section, were 12.76% and 11.96%, respectively. Given these differences, correction factors to the classical analytical formula were proposed for each cross-section case to obtain more realistic results. After the correction, the average differences decreased to 1.48% and 4.86%, regardless of the cross-section shape.
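    Collignon's formula, τ = VQ/(Ib), reduces for a rectangular section to τ_max = 1.5·V/A at the neutral axis. A minimal sketch, with a hypothetical load and section and an illustrative ~12% correction factor in the spirit of the paper's proposal (the paper's actual factors are not given in the abstract):

```python
def tau_max_rectangle(V, b, h):
    """Maximum shear stress in a rectangular section via Collignon: tau = V*Q/(I*b).

    For a rectangle, Q = b*h^2/8 at the neutral axis and I = b*h^3/12,
    which reduces to tau_max = 1.5 * V / (b*h).
    """
    return 1.5 * V / (b * h)

V = 10e3           # shear force at the fixed end, N (hypothetical load)
b, h = 0.10, 0.20  # section width and height, m (hypothetical section)
tau = tau_max_rectangle(V, b, h)
print(f"tau_max   = {tau / 1e6:.3f} MPa")           # classical analytical value
print(f"corrected = {1.12 * tau / 1e6:.3f} MPa")    # illustrative ~12% FEM correction
```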

    Decoupled and Descattered Monopole MIMO Antenna Array with Orthogonal Radiation Patterns

    This chapter introduces a novel design concept to reduce mutual coupling among closely spaced antenna elements of a MIMO array. This design concept significantly reduces the complexity of traditional design approaches such as metamaterials, defected ground plane structures, soft electromagnetic surfaces, parasitic elements, and matching and decoupling networks, using a simple yet novel alternative. The approach is based on a planar single decoupling element, consisting of a rectangular metallic ring resonator printed on one face of an ungrounded substrate. The decoupling structure surrounds a two-element vertical monopole antenna array fed by a coplanar waveguide structure. The design is shown both by simulations and measurements to reduce the mutual coupling by at least 20 dB, maintain the impedance bandwidth over which S11 is less than −10 dB, and reduce the envelope correlation coefficient to below 0.001. The boresight of the far-field radiation patterns of the two vertical monopole wire antennas operating at 2.4 GHz and separated by 8 mm (λo/16), where λo is the free-space wavelength at 2.45 GHz, is shown to be orthogonal and inclined by 45° with respect to the horizontal (azimuthal) plane while maintaining the shape of the isolated single antenna element.
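    The envelope correlation coefficient can be estimated from S-parameters using the well-known lossless-antenna formula of Blanch et al.; the S-parameter values below are hypothetical, chosen merely to satisfy the isolation levels reported in the chapter:

```python
def ecc_from_s(s11, s21, s12, s22):
    """Envelope correlation coefficient from S-parameters
    (Blanch et al. formula, valid for lossless antennas)."""
    num = abs(s11.conjugate() * s12 + s21.conjugate() * s22) ** 2
    den = (1 - abs(s11) ** 2 - abs(s21) ** 2) * (1 - abs(s12) ** 2 - abs(s22) ** 2)
    return num / den

# Hypothetical S-parameters at 2.4 GHz for a well-isolated antenna pair
s11 = 0.18 + 0.05j        # |S11| well below the -10 dB level
s21 = s12 = 0.02 - 0.06j  # coupling below -20 dB
s22 = 0.15 - 0.08j
ecc = ecc_from_s(s11, s21, s12, s22)
print(f"ECC = {ecc:.5f}")
```

    With coupling at the reported −20 dB level, the formula yields an ECC comfortably below the 0.001 figure quoted in the chapter.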

    Cost recommendation under uncertainty in IQWiG’s efficiency frontier framework

    Background: The National Institute for Quality and Efficiency in Health Care (IQWiG) employs an efficiency frontier (EF) framework to facilitate setting maximum reimbursable prices for new interventions. Probabilistic sensitivity analysis (PSA) is used when yes/no reimbursement decisions are sought based on a fixed threshold. In the IQWiG framework, an additional layer of complexity arises as the EF itself may vary its shape in each PSA iteration, and thus the willingness-to-pay, indicated by the EF segments, may vary. Objectives: To explore the practical problems arising when, within the EF approach, maximum reimbursable prices for new interventions are sought through PSA. Methods: When the EF is varied in a PSA, cost recommendations for new interventions may be determined by the mean or the median of the distances between each intervention's point estimate and each EF. Implications of using these metrics were explored in a simulation study based on the model used by IQWiG to assess the cost-effectiveness of 4 antidepressants. Results: Depending on the metric used, cost recommendations can be contradictory. Recommendations based on the mean can also be inconsistent. Results (median) suggested that the costs of duloxetine, venlafaxine, mirtazapine, and bupropion should be decreased by €131, €29, €12, and €99, respectively. These recommendations were implemented and the analysis repeated. New results suggested keeping the costs as they were. The percentage of acceptable PSA outcomes increased by 41% on average, and the uncertainty associated with the net health benefit was significantly reduced. Conclusions: The median of the distances between every intervention outcome and every EF is a good proxy for the cost recommendation that would be given should the EF be fixed. Adjusting costs according to the median increased the probability of acceptance and reduced the uncertainty around the net health benefit distribution, resulting in reduced uncertainty for decision makers.
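    The mean-versus-median choice discussed above can be illustrated with a toy set of per-iteration PSA distances; the numbers are invented (chosen so the median lands at −131, echoing the duloxetine recommendation, which the real study derived from a full simulation):

```python
import statistics

# Hypothetical signed distances (in euros) from one intervention's point estimate
# to the efficiency frontier in each PSA iteration; negative means the point lies
# above the frontier's willingness-to-pay segment, i.e. the cost should fall.
distances = [-150, -120, -131, -90, -200, -131, -110, -140, -131, -95]

mean_rec = statistics.mean(distances)
median_rec = statistics.median(distances)
print(f"mean: {mean_rec:.1f}, median: {median_rec:.1f}")
```

    The median is robust to the occasional extreme frontier shape (here, the −200 iteration), which is one intuition behind the paper's finding that the median is the better proxy.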