
    Automated mechanism design for B2B e-commerce models

    Business-to-business electronic marketplaces (B2B e-Marketplaces) have been in the limelight since 1999, following the commercialisation of the Internet and the subsequent “dot.com” boom [1]. The literature points to growth of the B2B sector across all industries, and the B2B e-Marketplace is one of the segments that has witnessed a rapid increase. Developing a B2B e-Commerce model that improves the value chain in B2B exchanges is therefore essential for exposing SMEs to the global marketplace. This study has three research objectives (ROs): first (RO1), to critically review the concepts of the B2B e-Marketplace, including its technologies, operations, business relationships and functionalities; second (RO2), to design an automated mechanism of the B2B e-Marketplace for small to medium-sized enterprises (SMEs); and third (RO3), to propose a conceptual B2B e-Commerce model for SMEs. The proposed model is constructed from the analytical findings obtained from the contemporary B2B e-Marketplace literature.

    Special section on advances in reachability analysis and decision procedures: contributions to abstraction-based system verification

    Reachability analysis asks whether a system can evolve from legitimate initial states to unsafe states. It is thus a fundamental tool in the validation of computational systems - be they software, hardware, or a combination thereof. We recall a standard approach for reachability analysis, which captures the system in a transition system, forms another transition system as an over-approximation, and performs an incremental fixed-point computation on that over-approximation to determine whether unsafe states can be reached. We show this method to be sound for proving the absence of errors, discuss its limitations for proving the presence of errors, and outline some means of addressing those limitations. We then sketch how program annotations for data integrity constraints and interface specifications - as in Bertrand Meyer's paradigm of Design by Contract - can facilitate the validation of modular programs, e.g., by obtaining more precise verification conditions for software verification supported by automated theorem proving. Then we recap how the decision problem of satisfiability for formulae of logics with theories - e.g., bit-vector arithmetic - can be used to construct an over-approximating transition system for a program. Programs with data types comprised of bit-vectors of finite width require bespoke decision procedures for satisfiability, and such finite-width data types complicate the reduction of that decision problem to one that off-the-shelf tools, e.g., SAT solvers for propositional logic, can solve effectively. In that context, we recall the Tseitin encoding, which converts formulae from that logic into conjunctive normal form - the standard input format for most SAT solvers - with only linear blow-up in the size of the formula, at the cost of a linear increase in the number of variables. Finally, we discuss the contributions that the three papers in this special section make in the areas sketched above. © Springer-Verlag 2009
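    A minimal sketch of the incremental fixed-point computation described above, on a finite over-approximating transition system; the states, transitions and unsafe predicate below are illustrative assumptions, not taken from the contributed papers.

```python
# Minimal worklist fixed-point for reachability on a finite,
# over-approximating transition system (illustrative sketch only).

def reachable_unsafe(initial, successors, is_unsafe):
    """Return True if some unsafe state is reachable from the initial states."""
    seen = set(initial)
    worklist = list(seen)
    while worklist:
        state = worklist.pop()
        if is_unsafe(state):
            return True                # unsafe state reachable in the abstraction
        for nxt in successors(state):
            if nxt not in seen:        # incremental step: explore only new states
                seen.add(nxt)
                worklist.append(nxt)
    return False                       # fixed point reached without hitting unsafe states

# Toy over-approximation: a counter abstracted to the states 0, 1, 2, "many".
succ = {0: [1], 1: [2], 2: ["many"], "many": ["many"]}
print(reachable_unsafe([0], lambda s: succ[s], lambda s: s == "error"))  # False
```

    As the abstract notes, a False answer on the over-approximation is sound (the concrete system cannot reach an unsafe state), whereas a True answer may be a spurious counterexample introduced by the abstraction.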

    Electrical and optical characterisation of low temperature grown InGaAs for photodiode applications

    Dilute bismide and nitride alloys are promising semiconductors for bandgap engineering, opening additional design freedom for devices such as infrared photodiodes. Low growth temperatures are required to incorporate bismuth or nitrogen into III-V semiconductors. However, the effects of low growth temperature on dark current and responsivity are not well understood. In this work, a first set of InGaAs p-i-n wafers was grown with all p, i and n layers at a single constant temperature of 250, 300, 400 or 500 °C. A second set of wafers was grown with the p and n layers at 500 °C while the i-layers were grown at 250, 300 or 400 °C. Photodiodes were fabricated from all seven wafers. For the constant-growth-temperature set (all p, i and n layers), photodiodes grown at 500 °C showed a dark current density at 1 V that is 6 orders of magnitude lower, and a responsivity at an illumination wavelength of 1520 nm that is 4.5 times higher, than photodiodes grown at 250 °C. Results from the second set of wafers suggest that this performance degradation can be recovered by growing the p and n layers at high temperature. For instance, comparing photodiodes with i-layers grown at 250 °C, the dark current density at -1 V was 5 orders of magnitude lower when the p and n layers were grown at 500 °C. Post-growth annealing at 595 °C for 15 minutes of the two wafers grown at 250 and 300 °C recovered the diode responsivity but brought no significant improvement in the dark current. Our work suggests that growth of the cap layer at high temperature is necessary to maintain the responsivity and minimise the dark current degradation, offering a pathway to developing novel photodiode materials that necessitate low growth temperatures.
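    As a point of reference for the responsivity figures above, responsivity and external quantum efficiency are related by the standard photodiode formula R = qλη/(hc); the sketch below evaluates it at the 1520 nm illumination wavelength with an assumed (not measured) quantum efficiency.

```python
# Standard relation between responsivity and external quantum efficiency:
# R = q * wavelength * EQE / (h * c).  Illustrative only; the EQE value is
# an assumption, not a result from this work.

Q = 1.602176634e-19   # elementary charge (C)
H = 6.62607015e-34    # Planck constant (J s)
C = 2.99792458e8      # speed of light (m/s)

def responsivity(eqe, wavelength_m):
    """Responsivity in A/W for a given external quantum efficiency."""
    return Q * wavelength_m * eqe / (H * C)

print(f"{responsivity(eqe=0.6, wavelength_m=1520e-9):.2f} A/W")  # ~0.74 A/W
```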

    A microcosting study of the surgical correction of upper extremity deformity in children with spastic cerebral palsy

    _Objective:_ To determine the healthcare costs of surgical correction of upper-extremity deformity in children with spastic cerebral palsy (CP). _Method:_ This cohort study included 39 children with spastic CP who underwent upper-extremity surgery at a Dutch hospital. A retrospective cost analysis was performed, including both hospital and rehabilitation costs. Hospital costs were determined using microcosting methodology; rehabilitation costs were estimated using reference prices. _Results:_ Hospital costs averaged €6813 per child, with labor (50%), overheads (29%), and medical aids (15%) as the main cost drivers. Rehabilitation costs were estimated at €3599 per child. _Conclusions:_ Surgery of the upper extremity is an important contributor to the healthcare costs of children with CP. Our study shows that labor is the most important driver of hospital costs, owing to the multidisciplinary approach and patient-specific treatment plan. A notable finding was the substantial size of the rehabilitation costs.
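    A minimal sketch reproducing the per-child cost breakdown implied by the figures above; the per-category amounts are derived from the rounded percentages and are therefore approximate.

```python
# Approximate per-child cost breakdown from the reported figures.
# Percentages are rounded in the abstract, so derived amounts are approximate.

hospital_total = 6813.0   # mean hospital cost per child (EUR)
rehab_total = 3599.0      # estimated rehabilitation cost per child (EUR)

shares = {"labor": 0.50, "overheads": 0.29, "medical aids": 0.15}
for category, share in shares.items():
    print(f"{category:<13} ~EUR {hospital_total * share:7,.0f}")
print(f"{'other':<13} ~EUR {hospital_total * (1 - sum(shares.values())):7,.0f}")
print(f"{'total care':<13} ~EUR {hospital_total + rehab_total:7,.0f}")
```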

    Remote severity assessment in atopic dermatitis: Validity and reliability of the remote Eczema Area and Severity Index and Self-Administered Eczema Area and Severity Index

    Background: Reliable assessment of atopic dermatitis (AD) severity is necessary for clinical practice and research, and valid, reliable remote assessment is essential to facilitate remote care and research. Objectives: To assess the validity and reliability of the Eczema Area and Severity Index (EASI) based on images and of patient-assessed severity based on the Self-Administered EASI (SA-EASI). Methods: Whole-body clinical images were taken during consultations with children with AD. After the consultations, caregivers completed the SA-EASI and provided images from home. Four raters assessed all images twice using the EASI. Results: A total of 1534 clinical images and 425 patient-provided images were collected from 87 and 32 children, respectively. For the EASI based on clinical images, validity was excellent (0.90), interrater (0.77) and intrarater (0.91) reliability were good, and the standard error of measurement was 4.31. The feasibility of patient-provided images was limited by missing images (43.8%) and quality issues (23.1%); however, good validity (0.86) and good interrater (0.74) and intrarater (0.94) reliability were found when assessment was possible. A moderate correlation (0.60) between the SA-EASI and the EASI was found. Limitations: Low proportion of patient-provided images. Conclusion: AD severity assessment based on images correlates strongly with in-person AD assessment, and the good measurement properties confirm the potential of remote assessment. The moderate correlation between the SA-EASI and the in-person EASI suggests limited value of self-assessment.
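    A minimal sketch of the kind of measurement-property calculation reported above, assuming the conventional formula SEM = SD × √(1 − reliability); whether the authors used this exact formula, and the scores below, are assumptions for illustration only.

```python
# Illustrative standard-error-of-measurement calculation (not the study's data).
# Assumes the conventional formula SEM = SD * sqrt(1 - reliability).
import math
import statistics

def standard_error_of_measurement(scores, reliability):
    """SEM from the sample SD of scores and a reliability coefficient (e.g. an ICC)."""
    return statistics.stdev(scores) * math.sqrt(1 - reliability)

# Hypothetical EASI scores and an assumed interrater reliability of 0.77
easi_scores = [2.0, 5.5, 8.0, 12.5, 20.0, 31.0]
print(round(standard_error_of_measurement(easi_scores, reliability=0.77), 2))
```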

    An Algorithm for Probabilistic Alternating Simulation

    In probabilistic game structures, probabilistic alternating simulation (PA-simulation) relations preserve formulas defined in probabilistic alternating-time temporal logic with respect to the behaviour of a subset of players. We propose a partition-based algorithm for computing the largest PA-simulation, which is, to our knowledge, the first such algorithm that works in polynomial time, by extending the generalised coarsest partition problem (GCPP) to a game-based setting with mixed strategies. The algorithm has higher complexity than those in the literature for non-probabilistic simulation and for probabilistic simulation without mixed actions, but slightly improves the existing result for computing probabilistic simulation with respect to mixed actions. Comment: We have fixed a problem in the SOFSEM'12 conference version.
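    For orientation, the sketch below computes the largest (ordinary, non-probabilistic) simulation on a finite labelled transition system by naive refinement; it illustrates only the greatest-fixed-point idea and deliberately omits the partition-based machinery, probabilities and mixed strategies that the paper's algorithm handles.

```python
# Naive greatest-fixed-point computation of the largest simulation relation
# on a finite labelled transition system (illustrative; the paper's algorithm
# is partition-based and works on probabilistic game structures with mixed
# strategies, which this sketch omits).

def largest_simulation(states, succ):
    """succ(s) yields (action, successor) pairs; returns {(s, t) : t simulates s}."""
    R = {(s, t) for s in states for t in states}   # start from the full relation
    changed = True
    while changed:
        changed = False
        for (s, t) in list(R):
            # t must match every labelled step of s by an equally labelled
            # step to a successor that is still related.
            ok = all(any(a == b and (s2, t2) in R for (b, t2) in succ(t))
                     for (a, s2) in succ(s))
            if not ok:
                R.discard((s, t))
                changed = True
    return R

# Tiny example: p offers actions 'x' and 'y', q only 'x', so q does not simulate p.
trans = {"p": {("x", "p"), ("y", "p")}, "q": {("x", "q")}}
R = largest_simulation({"p", "q"}, lambda s: trans[s])
print(("q", "p") in R, ("p", "q") in R)   # True False
```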

    Polymer nanocomposites functionalised with nanocrystals of zeolitic imidazolate frameworks as ethylene control agents

    Ethylene (C2H4) management currently relies on materials such as KMnO4 or on processes such as ozone oxidation or combined photocatalysis/photochemistry. The ubiquity of C2H4, especially in an industrial context, calls for a simpler and much more effective approach, and herein we propose the use of tuneable polymer nanocomposites for the adsorption of C2H4, obtained by modifying polymer matrices through the incorporation of nanocrystals of zeolitic imidazolate frameworks (nano-ZIFs). We demonstrate that the inclusion of ZIF-8 and ZIF-7 nanocrystals in polymeric matrices (Matrimid and polyurethane [PU]) yields robust nanocomposites that preserve the C2H4 adsorption/desorption capacity of the nanocrystals while shielding them from degrading factors. We report new insights into the adsorption/desorption kinetics of the polymers and their corresponding nanocomposites, which can be tailored by exploiting the underlying polymeric molecular interactions. Importantly, we also elucidate the retention of the intrinsic structural framework dynamics of the nano-ZIFs even when embedded within the polymeric matrix, as evidenced by the breathing and gate-opening phenomena. Our findings pave the way for bespoke designs of novel polymer nanocomposites, which will subsequently impact the deployment of tailored nanomaterials for effective industrial applications.

    E. M. Mahdi would like to thank Yayasan Khazanah (YK) for the DPhil scholarship that made this work possible. The research in the MMC Lab (J.C.T.) was supported by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 771575 - PROMOFS) and EPSRC grant no. EP/N014960/1. The authors acknowledge the provision of the TGA and TEM by the Research Complex at Harwell (RCaH), Rutherford Appleton Laboratory, Oxfordshire. J.S.A. acknowledges financial support by MINECO (Project MAT2016-80285-p), H2020 (MSCA-RISE-2016/NanoMed Project), and GV (PROMETEOII/2014/004).
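    Where the adsorption/desorption kinetics are concerned, a common way to quantify uptake curves is to fit a pseudo-first-order model q(t) = q_e(1 − e^(−kt)); the sketch below does this with SciPy and is purely illustrative - both the model choice and the data are assumptions, not the analysis performed in this study.

```python
# Illustrative pseudo-first-order uptake fit, q(t) = q_e * (1 - exp(-k * t)).
# Model choice and data are assumptions, not taken from this study.
import numpy as np
from scipy.optimize import curve_fit

def pseudo_first_order(t, q_e, k):
    return q_e * (1.0 - np.exp(-k * t))

t_min = np.array([0, 5, 10, 20, 40, 80, 160], dtype=float)   # time (min), synthetic
uptake = np.array([0.0, 0.9, 1.6, 2.5, 3.3, 3.8, 4.0])       # C2H4 uptake (mmol/g), synthetic

(q_e, k), _ = curve_fit(pseudo_first_order, t_min, uptake, p0=(4.0, 0.05))
print(f"q_e = {q_e:.2f} mmol/g, k = {k:.3f} 1/min")
```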

    First order Born-Infeld Hydrodynamics via Gauge/Gravity Duality

    By performing a derivative expansion on a class of boosted Born-Infeld-AdS_5 black branes, we study the hydrodynamics of the dual field theory, in the spirit of the AdS/CFT correspondence. We determine the fluid dynamical stress-energy tensor to first order and find that the ratio of the shear viscosity to the entropy density conforms to the universal value of 1/4π to all orders in the inverse of the Born-Infeld parameter. Comment: 14 pages, JHEP3, minor revision.
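    For reference, the generic first-order (viscous) stress-energy tensor that such a derivative expansion produces for a conformal fluid has the standard form below; the Born-Infeld-specific transport coefficients are what the paper computes.

```latex
% Standard first-order conformal hydrodynamic stress tensor and the universal ratio
T^{\mu\nu} = (\epsilon + p)\, u^{\mu} u^{\nu} + p\, g^{\mu\nu} - 2\eta\, \sigma^{\mu\nu},
\qquad
\frac{\eta}{s} = \frac{1}{4\pi},
```

    where the shear tensor is σ^{μν} = P^{μα}P^{νβ}∂_(α u_β) − (1/3)P^{μν}∂_α u^α, with the projector P^{μν} = g^{μν} + u^{μ}u^{ν} onto directions transverse to the fluid velocity.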

    Sharp Trace Hardy-Sobolev-Maz'ya Inequalities and the Fractional Laplacian

    In this work we establish trace Hardy and trace Hardy-Sobolev-Maz'ya inequalities with best Hardy constants for domains satisfying suitable geometric assumptions such as mean convexity or convexity. We then use them to produce fractional Hardy-Sobolev-Maz'ya inequalities with best Hardy constants for various fractional Laplacians. In the case where the domain is the half space, our results cover the full range of the exponent s ∈ (0,1) of the fractional Laplacian. In particular, we answer an open problem raised by Frank and Seiringer [FS]. Comment: 42 pages.
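    For orientation, the classical local (s = 1) prototype of these results is the Hardy-Sobolev-Maz'ya inequality on the half-space, recalled below with its best Hardy constant 1/4; the paper establishes trace and fractional analogues, with best Hardy constants, on more general mean convex or convex domains.

```latex
% Classical Hardy-Sobolev-Maz'ya inequality on the half-space (the s = 1 prototype)
\int_{\mathbb{R}^{n}_{+}} |\nabla u|^{2}\, dx
  \;-\; \frac{1}{4} \int_{\mathbb{R}^{n}_{+}} \frac{u^{2}}{x_{n}^{2}}\, dx
  \;\ge\; C_{n} \left( \int_{\mathbb{R}^{n}_{+}} |u|^{\frac{2n}{n-2}}\, dx \right)^{\!\frac{n-2}{n}},
\qquad u \in C_{c}^{\infty}(\mathbb{R}^{n}_{+}),\; n \ge 3.
```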