
    Branching Ratio and CP-asymmetry for B -> 1^{1}P_{1} gamma decays

    We calculate the branching ratios for B_{d}^{0} -> (b_{1}, h_{1})gamma at next-to-leading order (NLO) in alpha_{s}, where b_{1} and h_{1} are the corresponding radially excited axial-vector mesons of rho and omega, respectively. Using SU(3) symmetry for the form factor, the branching ratio for B_{d}^{0} -> (b_{1}, h_{1})gamma is expressed in terms of the branching ratio of B_{d}^{0} -> K_{1}gamma, and we find B(B_{d}^{0} -> b_{1}gamma) = 0.71*10^{-6} and B(B_{d}^{0} -> h_{1}gamma) = 0.74*10^{-6}. We also calculate the direct CP asymmetry for these decays and find, in conformity with observations made in the literature, that the hard spectator contributions significantly reduce the asymmetry arising from the vertex corrections alone. The value of the CP asymmetry is 10% and, as for rho and omega, is negative in the Standard Model. Comment: 10 pages, 2 figures
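    The SU(3) relation invoked above can be written schematically. The following form is our own illustrative reconstruction, not the paper's exact expression: with the form factors set equal by SU(3), the b -> d and b -> s penguin amplitudes differ mainly by CKM factors (normalization and phase-space corrections omitted).

```latex
% Schematic SU(3) relation (illustrative sketch, not the paper's formula).
% With F^{B \to b_1} \simeq F^{B \to K_1} imposed by SU(3) symmetry:
\begin{equation}
  \frac{\mathcal{B}(B_d^0 \to b_1 \gamma)}{\mathcal{B}(B_d^0 \to K_1 \gamma)}
  \;\simeq\;
  \left| \frac{V_{td}}{V_{ts}} \right|^2
  \left( \frac{F^{B \to b_1}}{F^{B \to K_1}} \right)^2 .
\end{equation}
```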

    Use of untreated wastewater in peri-urban agriculture in Pakistan: risks and opportunities

    Water reuse / Waste waters / Water quality / Groundwater / Irrigation practices / Soil properties / Environmental effects / Conjunctive use / Pakistan / Haroonabad

    Deterministic and Probabilistic Binary Search in Graphs

    We consider the following natural generalization of Binary Search: in a given undirected, positively weighted graph, one vertex is a target. The algorithm's task is to identify the target by adaptively querying vertices. In response to querying a node $q$, the algorithm learns either that $q$ is the target, or is given an edge out of $q$ that lies on a shortest path from $q$ to the target. We study this problem in a general noisy model in which each query independently receives a correct answer with probability $p > \frac{1}{2}$ (a known constant), and an (adversarial) incorrect one with probability $1-p$. Our main positive result is that when $p = 1$ (i.e., all answers are correct), $\log_2 n$ queries are always sufficient. For general $p$, we give an (almost information-theoretically optimal) algorithm that uses, in expectation, no more than $(1 - \delta)\frac{\log_2 n}{1 - H(p)} + o(\log n) + O(\log^2 (1/\delta))$ queries, and identifies the target correctly with probability at least $1-\delta$. Here, $H(p) = -(p \log p + (1-p) \log(1-p))$ denotes the entropy. The first bound is achieved by an algorithm that iteratively queries a 1-median of the nodes not yet ruled out; the second by careful repeated invocations of a multiplicative weights algorithm. Even for $p = 1$, we show several hardness results for the problem of determining whether a target can be found using $K$ queries. Our upper bound of $\log_2 n$ implies a quasipolynomial-time algorithm for undirected connected graphs; we show that this is best possible under the Strong Exponential Time Hypothesis (SETH). Furthermore, for directed graphs, or for undirected graphs with non-uniform node querying costs, the problem is PSPACE-complete. For a semi-adaptive version, in which one may query $r$ nodes in each of $k$ rounds, we show membership in $\Sigma_{2k-1}$ in the polynomial hierarchy, and hardness for $\Sigma_{2k-5}$.
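    The noiseless ($p = 1$) strategy described in the abstract can be sketched as follows. This is our own minimal illustration for unweighted graphs (the abstract allows positive edge weights; BFS distances here are a simplifying assumption), with names of our own choosing: repeatedly query a 1-median of the surviving candidates, and keep only the candidates consistent with the shortest-path edge returned.

```python
# Sketch of the p = 1 strategy: query a 1-median of the remaining candidate
# set; each answer either reveals the target or an edge on a shortest path
# toward it, which eliminates every candidate inconsistent with that edge.
from collections import deque

def bfs_dist(adj, src):
    """Distances from src in an unweighted graph given as {node: [neighbors]}."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def find_target(adj, answer):
    """answer(q) returns None if q is the target, else a neighbor of q on a
    shortest q -> target path. Returns the located target (assumes p = 1)."""
    dist = {u: bfs_dist(adj, u) for u in adj}   # all-pairs distances
    candidates = set(adj)
    while True:
        # 1-median: candidate minimizing total distance to all candidates
        q = min(candidates, key=lambda u: sum(dist[u][c] for c in candidates))
        nxt = answer(q)
        if nxt is None:
            return q
        # keep only t with d(q, t) = 1 + d(nxt, t): answers consistent with t
        candidates = {t for t in candidates if dist[q][t] == 1 + dist[nxt][t]}
```

    On a path graph this reproduces classical binary search: the 1-median is the midpoint of the surviving interval, and each answer halves it.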

    OSCA: a comprehensive open-access system of analysis of posterior capsular opacification

    BACKGROUND: This paper presents and tests a comprehensive computerised system of analysis of digital images of posterior capsule opacification (PCO). It updates and expands significantly on a previous presentation, adding facilities for selecting user-defined central areas and for registering and subsequently merging images for artefact removal. Also, the program is compiled, which eliminates the need for specialised additional software. The system is referred to in this paper as the open-access systematic capsule assessment (OSCA). The system is designed to be evidence-based, objective and openly available, improving on current systems of analysis. METHODS: Principal features of the OSCA system of analysis are discussed. Flash artefacts are automatically located in two PCO images and the images merged to produce a composite free from these artefacts. For this to be possible, the second image has to be brought into alignment with the first by a registration technique. Further image processing and analysis steps use a location-sensitive, entropy-based texture analysis of PCO. Validity of the whole new system in measuring PCO progression is assessed, along with the visual significance of scores. Reliability of the system is assessed. RESULTS: Analysis of PCO by the system shows the ability to detect early progression of PCO, as well as more visually significant PCO. Images with no clinical PCO produce very low scores in the analysis. Reliability of the system of analysis is demonstrated. CONCLUSION: This system of PCO analysis is evidence-based, objective and clinically useful. It incorporates flash detection and removal as well as location-sensitive texture analysis. It provides features and benefits not previously available to most researchers or clinicians. Substantial evidence is provided for this system's validity and reliability.
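    The entropy-based texture analysis mentioned in the METHODS can be illustrated with a minimal sketch. This is not the published OSCA code; the window size and histogram bin count below are arbitrary illustrative choices: slide a window over a grayscale image and average the Shannon entropy of the local intensity histograms, so that textured (opacified) regions score high and uniform regions score near zero.

```python
# Illustrative entropy-texture score (not the OSCA implementation):
# mean Shannon entropy of local intensity histograms over non-overlapping tiles.
import numpy as np

def local_entropy(img, win=8, bins=16):
    """Mean Shannon entropy (bits) of intensity histograms over win x win tiles.
    img: 2-D grayscale array with values in [0, 256)."""
    h, w = img.shape
    scores = []
    for y in range(0, h - win + 1, win):
        for x in range(0, w - win + 1, win):
            patch = img[y:y + win, x:x + win]
            hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
            p = hist / hist.sum()
            p = p[p > 0]                      # convention: 0 * log 0 = 0
            scores.append(-(p * np.log2(p)).sum())
    return float(np.mean(scores))
```

    A uniform (clinically clear) tile contributes zero entropy, while a noisy or textured tile contributes up to log2(bins) bits, which matches the abstract's observation that images with no clinical PCO score very low.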

    Energy-aware Theft Detection based on IoT Energy Consumption Data

    With the advent of modern smart grid networks, advanced metering infrastructure provides real-time information from smart meters (SM) and sensors to energy companies and consumers. The smart grid is indeed a paradigm enabled by the Internet of Things (IoT), in which the SM acts as an IoT device that collects and transmits data over the Internet to enable intelligent applications. However, IoT data communicated over the smart grid could be maliciously altered, resulting in energy theft due to unbilled energy consumption. Machine learning (ML) techniques for energy theft detection (ETD) based on IoT data are promising but are nonetheless constrained by the poor quality of data, particularly its imbalanced nature (which emerges from the dominant representation of honest users and the poor representation of the rare theft cases). Leading ML-based ETD methods employ synthetic data generation to balance the training dataset. However, these are trained to maximise average correct detection rather than ETD. In this work, we formulate an energy-aware evaluation framework that guides model training to maximise ETD and minimise the revenue loss due to misclassification. We propose a convolutional neural network with positive bias (CNN-B) and another with focal loss (CNN-FL) to mitigate the impact of data imbalance. Both outperform the state of the art, and CNN-B achieves the highest ETD and the minimum revenue loss, with a loss reduction of 30.4% compared to the highest loss incurred by these methods.
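    The focal loss behind CNN-FL can be sketched numerically. This is the standard binary focal loss of Lin et al., not the paper's exact training objective, and the alpha/gamma values below are conventional defaults rather than the paper's tuned settings; the point is how it down-weights easy (well-classified honest-user) examples so the rare theft cases dominate the gradient.

```python
# Minimal numpy sketch of binary focal loss (illustrative defaults, not the
# paper's tuned hyperparameters).
import numpy as np

def focal_loss(y_true, p_pred, alpha=0.25, gamma=2.0, eps=1e-7):
    """Mean binary focal loss. y_true in {0, 1}; p_pred = P(class 1) in (0, 1).
    The (1 - pt)^gamma factor suppresses easy examples; alpha balances classes."""
    p = np.clip(p_pred, eps, 1 - eps)
    pt = np.where(y_true == 1, p, 1 - p)          # probability of the true class
    w = np.where(y_true == 1, alpha, 1 - alpha)   # class-balance weight
    return float(np.mean(-w * (1 - pt) ** gamma * np.log(pt)))
```

    With gamma = 0 and alpha = 0.5 this reduces (up to a constant factor) to ordinary cross-entropy; increasing gamma shifts the objective toward the hard, minority-class examples that matter for ETD.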

    Urban wastewater: A valuable resource for agriculture - A case study from Haroonabad, Pakistan

    Waste waters / Irrigation water / Water reuse / Economic analysis / Soil properties / Households / Water availability / Water use / Water quality / Groundwater / Public health / Risks / Case studies

    Intelligence as a source of cold case homicide investigation: a cross-national perspective

    Cold case homicides are probably the most complex and difficult cases to (re-)investigate. Each is different in terms of means and motives, method of killing, geographic location and weapon used. There is no single method of investigation that fully fits every type of cold case homicide. Therefore, several investigative techniques are in common practice, such as DNA profiling and fingerprint analysis. However, it is not known how an ‘Intelligence-Led Policing Model’ (ILPM) can be applied in cold case homicide investigations, and the literature that would enable us to shed light on this issue is currently lacking. The aim of this research is to develop a robust understanding of how an ILPM can help to solve old and cold case homicides.