240 research outputs found

    Algebra of currents and some applications to elementary particle physics

    We study the implications of the algebra of currents for elementary particle processes. In Chapter I we introduce the concepts of current algebra and discuss how the information contained in the current commutators can be used to set up sum rules and, in particular, to evaluate strong-interaction renormalization effects. In Chapter II we apply some of the techniques developed in Chapter I and obtain consistency conditions for the K-meson scattering amplitude. In Chapter III we illustrate the methods further by calculating the F/D ratio, finding good agreement with experiment. Finally, in Chapter IV we are concerned with calculating the coupling constant of the so far hypothetical a-meson to nucleon states. We find that the techniques of current algebra enable us to do so, and we obtain a value for its coupling constant that should ultimately be testable experimentally, should the a-meson exist as a physical particle.
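For context, the "algebra of currents" referred to here is the set of equal-time commutation relations among the symmetry charges and currents; a standard textbook form (quoted from the general literature, not from the thesis itself) is:

```latex
[Q^{a}(t),\, Q^{b}(t)] = i f^{abc}\, Q^{c}(t), \qquad
[Q^{a}(t),\, J^{b}_{\mu}(x)]\big|_{x^{0}=t} = i f^{abc}\, J^{c}_{\mu}(x),
```

where the charges $Q^{a} = \int d^{3}x\, J^{a}_{0}(x)$ are the space integrals of the time components of the currents and $f^{abc}$ are the structure constants of the symmetry group. Inserting complete sets of intermediate states into such commutators is what produces the sum rules mentioned in the abstract.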

    Analysis of Irrigation Water Requirements in Gezira Scheme Using Geographic Information Systems: Case Study Block Number 26 (Dolga)

    Water scarcity has a direct impact on food security and poses a real threat to food production for millions of people. To avoid unwise and indiscriminate consumption of the available water resources, efficient systems and appropriate policies for managing irrigated farms are needed to control water overuse, starting with an analysis of the existing situation. Block number 26 (Dolga) in the Gezira scheme was chosen as the study area; its total area is about 24,616 feddans. The study aims to apply GIS to data editing, manipulation, storage, processing, and presentation for the analysis of irrigation water requirements. GIS was chosen for its capabilities in data capture, data processing and efficient spatial analysis. Layers of a geo-referenced spatial database, including crop maps, the irrigation canal network and its command area, each crop's water requirement, the irrigation water requirement, rainfall data, the area cultivated for each crop and a contour map of the study area, were created and used for spatial analysis in the ArcGIS software. A five-course rotation is applied to the crops cotton, wheat, sorghum, and groundnut: each crop covers 20% of the total area, the remaining 20% lies fallow, and a layer was created for each crop. Crop water requirements were calculated with the Penman-Monteith method and saved in the attribute table of the crops layer. The seasonal crop water requirements per feddan were found to be 3,871.56 m3 for cotton, 2,983.26 m3 for wheat, 1,847.16 m3 for sorghum and 2,007.60 m3 for groundnut. The total crop water requirement is 50,254,962.66 m3 (absorbed by plants), the water allotment of the study area, pro-rated by area, is 67,195,230.33 m3, and the irrigation water requirement is 64,537,716.5 m3 (including water losses), whereas the actual water supply is 63,817,600 m3; these quantities were calculated with ArcGIS tools and saved in the attribute table of the crops-and-rotation layer. The study concluded that cotton consumes the largest share of the water supply, that rainfall is used only for complementary irrigation of the sorghum crop, and that the implementation of GIS capabilities enables efficient analysis and scheduling of irrigation water.
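The per-feddan arithmetic behind these totals can be sketched as follows. Equal per-crop areas (20% of the block each) are an assumption here for illustration; notably, the resulting total (about 52.7 million m3) differs from the abstract's 50.25 million m3, which suggests the actual cultivated areas per crop were not exactly equal.

```python
# Sketch: total seasonal crop water requirement from per-feddan figures.
# Per-feddan values are taken from the abstract; equal per-crop areas are
# an illustrative assumption, not a figure from the study.

BLOCK_AREA_FEDDAN = 24_616          # total area of Block 26 (Dolga)
SHARE_PER_CROP = 0.20               # five-course rotation: 4 crops + fallow

# seasonal crop water requirement, m3 per feddan (Penman-Monteith)
cwr_per_feddan = {
    "cotton":    3_871.56,
    "wheat":     2_983.26,
    "sorghum":   1_847.16,
    "groundnut": 2_007.60,
}

area_per_crop = BLOCK_AREA_FEDDAN * SHARE_PER_CROP   # feddans per crop

total_cwr = sum(v * area_per_crop for v in cwr_per_feddan.values())
print(f"area per crop: {area_per_crop:,.1f} feddan")
print(f"total crop water requirement: {total_cwr:,.0f} m3")
```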

    Bayesian image restoration and bacteria detection in optical endomicroscopy

    Optical microscopy systems can be used to obtain high-resolution microscopic images of tissue cultures and ex vivo tissue samples. This imaging technique can be translated to in vivo, in situ applications by using optical fibres and miniature optics. Fibred optical endomicroscopy (OEM) can enable optical biopsy in organs inaccessible to any other imaging system, and hence can provide rapid and accurate diagnosis in a short time. The raw data the system produces are difficult to interpret, as they are modulated by the fibre-bundle pattern, producing what is called the “honeycomb effect”. Moreover, the data are further degraded by fibre-core cross-coupling. On the other hand, there is an unmet clinical need for automatic tools that can help clinicians detect fluorescently labelled bacteria in distal lung images. The aim of this thesis is to develop advanced image processing algorithms that address the above-mentioned problems. First, we provide a statistical model for fibre-core cross-coupling and for the sparse sampling by imaging fibre bundles (the honeycomb artefact), which are formulated here as a restoration problem for the first time in the literature. We then introduce a non-linear interpolation method, based on Gaussian process regression, in order to recover an interpretable scene from the deconvolved data. Second, we develop two bacteria detection algorithms, each with different characteristics. The first approach considers a joint formulation of the sparse coding and anomaly detection problems. The anomalies here are treated as candidate bacteria, which are annotated with the help of a trained clinician. Although this approach provides good detection performance and outperforms existing methods in the literature, the user has to carefully tune some crucial model parameters. Hence, we propose a more adaptive approach, for which a Bayesian framework is adopted. This approach not only outperforms the proposed supervised approach and existing methods in the literature, but also provides computation times that compete with those of optimization-based methods.

    Spatiotemporal Tensor Completion for Improved Urban Traffic Imputation

    Effective management of urban traffic is important for any smart city initiative. Therefore, the quality of the sensory traffic data is of paramount importance. However, like any sensory data, urban traffic data are prone to imperfections leading to missing measurements. In this paper, we focus on inter-region traffic data completion. We model the inter-region traffic as a spatiotemporal tensor that suffers from missing measurements. To recover the missing data, we propose an enhanced CANDECOMP/PARAFAC (CP) completion approach that considers the urban and temporal aspects of the traffic. To derive the urban characteristics, we divide the area of study into regions. Then, for each region, we compute urban feature vectors inspired by biodiversity, which are used to compute the urban similarity matrix. To mine the temporal aspect, we first conduct an entropy analysis to determine the most regular time series. Then, we conduct a joint Fourier and correlation analysis to compute its periodicity and construct the temporal matrix. Both the urban and temporal matrices are fed into a modified CP-completion objective function. To solve this objective, we propose an alternating least squares approach that operates on the vectorized version of the inputs. We conduct a comprehensive comparative study with two evaluation scenarios. In the first, we simulate random missing values. In the second, we simulate missing values at a given area and time duration. Our results demonstrate that our approach provides effective recovery performance, reaching a 26% improvement over state-of-the-art CP approaches and 35% over state-of-the-art generative-model-based approaches.
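Stripped of the urban- and temporal-similarity regularizers and the missing-value handling that the paper adds, the CP-ALS backbone can be sketched on a small fully observed tensor:

```python
import numpy as np

# Bare-bones CP-ALS sketch (fully observed tensor). The paper's modified
# objective additionally handles missing entries and adds urban/temporal
# similarity terms; none of that is reproduced here.

def khatri_rao(A, B):
    """Column-wise Kronecker product: (I,R) x (J,R) -> (I*J, R)."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def unfold(T, mode):
    """Mode-n unfolding consistent with the khatri_rao ordering above."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, n_iter=200, seed=0):
    """Alternating least squares for a rank-`rank` CP decomposition."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# demo: recover an exactly rank-3 tensor
rng = np.random.default_rng(1)
At, Bt, Ct = (rng.standard_normal((s, 3)) for s in (6, 5, 4))
T = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = cp_als(T, rank=3)
rel_err = np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T) / np.linalg.norm(T)
print(f"relative reconstruction error: {rel_err:.2e}")
```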

    Deep-Gap: A deep learning framework for forecasting crowdsourcing supply-demand gap based on imaging time series and residual learning

    Mobile crowdsourcing has become easier thanks to the widespread adoption of smartphones capable of seamlessly collecting and pushing the desired data to cloud services. However, the success of mobile crowdsourcing relies on balancing supply and demand by first accurately forecasting the supply-demand gap in space and time, and then providing efficient incentives to encourage participant movements that maintain the desired balance. In this paper, we propose Deep-Gap, a deep learning approach based on residual learning that predicts the gap between mobile crowdsourced service supply and demand at a given time and place. The prediction can drive the incentive model to achieve geographically balanced service coverage and avoid the case where some areas are over-supplied while others are under-supplied. This allows anticipating the supply-demand gap and redirecting crowdsourced service providers towards target areas. Deep-Gap relies on historical supply-demand time series data as well as available external data such as weather conditions and day type (e.g., weekday, weekend, holiday). First, we roll and encode the supply-demand time series as images using the Gramian Angular Summation Field (GASF), the Gramian Angular Difference Field (GADF) and the Recurrence Plot (REC). These images are then used to train deep Convolutional Neural Networks (CNNs) to extract low- and high-level features and forecast the crowdsourced service gap. We conduct a comprehensive comparative study with two supply-demand gap forecasting scenarios: with and without external data. Compared to state-of-the-art approaches, Deep-Gap achieves the lowest forecasting errors in both scenarios. (Accepted at CloudCom 2019.)
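The Gramian angular encodings admit a compact implementation. A minimal sketch of GASF and GADF on a toy series follows (the REC encoding and the CNN itself are omitted, and the toy sine series is purely illustrative):

```python
import numpy as np

def _to_polar(series):
    """Rescale a 1-D series to [-1, 1] and return its polar angles."""
    x = np.asarray(series, dtype=float)
    x = 2 * (x - x.min()) / (x.max() - x.min()) - 1
    return np.arccos(np.clip(x, -1.0, 1.0))   # clip guards rounding drift

def gasf(series):
    """Gramian Angular Summation Field: cos(phi_i + phi_j)."""
    phi = _to_polar(series)
    return np.cos(phi[:, None] + phi[None, :])

def gadf(series):
    """Gramian Angular Difference Field: sin(phi_i - phi_j)."""
    phi = _to_polar(series)
    return np.sin(phi[:, None] - phi[None, :])

x = np.sin(np.linspace(0, 2 * np.pi, 24))   # toy supply-demand series
G = gasf(x)
D = gadf(x)
print(G.shape, D.shape)   # each (24, 24); stack as channels for the CNN
```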

    Cultural issues as an approach to forming and managing the future neighbourhoods, case study : the central region of Saudi Arabia

    The purpose of this study was to formulate a guideline for developing and managing future neighbourhoods in the Central Region of Saudi Arabia based on the cultural norms of their residents. The initial motivation for the study is the conflict between the imported new planning system and the restrictive culture and behaviour of the people involved. In order to achieve the main objective of this thesis, a series of studies was undertaken. The theoretical background relating to the main issues of this study is reviewed, and the notion of these issues is defined according to Saudi Arabians' beliefs and way of life, which is essential to understanding them. The thesis discusses, analyses, and evaluates the three types of neighbourhood planning system found in the study area: the traditional, the contemporary, and the new-trends planning systems. These studies were primarily formulated in light of the literature review and the analysis of information obtained via questionnaires, interviews, observation, and public and community consultations (carried out by the researcher in the summers of 1988 and 1989). From the analysis, the thesis concludes that the future planning of neighbourhoods should be formulated according to the relationship between people and their surroundings and their needs. In order to clarify this, the thesis sets out recommendations for forming and managing future neighbourhoods. These include considering the socio-cultural and individual requirements of the residents, making the plan open-ended, and establishing a local community authority to control the implementation and growth of the neighbourhoods. Finally, the thesis briefly explains how some of the recommended guidelines, which need clarification, could be implemented through a specific case study. It also recommends further studies to reinforce and generalise the findings of the thesis.

    Flood Risk Mapping and Management in Urban Areas: Integrating Geomatics and Hydrodynamic Modeling - A Case Study of Al Bidi City, Saudi Arabia

    In this paper, we focus on developing a comprehensive approach to mapping and managing flood risks in Al Bidi City, located in the Al-Aflaj Governorate of Saudi Arabia. By integrating geomatics (remote sensing and GIS) and hydrodynamic modeling (PCSWMM and HEC-RAS), the study simulates and models flood risks in populated areas under different scenarios, considering the impact of climate change. The study generates three integrated maps: flood intensity, environmental sensitivity, and flood risk. Strategic solutions and mitigation measures are proposed based on the findings. The results indicate that Al Bidi City is exposed to flood risks originating in the west and progressing eastward, primarily due to significant valleys such as Wadi Harm. Approximately 60% of the urban area is affected by torrential water. The study proposes the construction of embankments, channels, and culverts to redirect floodwaters to Wadi Al Jadwal in the east, as well as the implementation of industrial channels to manage floods in the northern valleys.

    Prediction of protein secondary structure using binary classification trees, naive Bayes classifiers and the Logistic Regression Classifier

    The secondary structure of proteins is predicted using various binary classifiers. The data are taken from the RS126 database. The original data consist of protein primary and secondary structure sequences encoded using alphabetic letters. These data are re-encoded into unary vectors comprising only ones and zeros. Different binary classifiers, namely naive Bayes, logistic regression and classification trees, are trained on the encoded data using hold-out and 5-fold cross-validation. For each classifier, three classification tasks are considered: helix against not helix (H/∼H), sheet against not sheet (S/∼S) and coil against not coil (C/∼C). The performance of these binary classifiers is compared using the overall accuracy in predicting the protein secondary structure for various window sizes. Our results indicate that hold-out validation achieved higher accuracy than 5-fold cross-validation. The naive Bayes classifier using 5-fold cross-validation achieved the lowest accuracy for predicting helix against not helix. The classification tree classifiers using 5-fold cross-validation achieved the lowest accuracies for both the coil against not coil and the sheet against not sheet classifications. The logistic regression classifier's accuracy depends on the window size; there is a positive relationship between accuracy and window size. The logistic regression approach achieved the highest accuracy of the three classifiers on every task: 77.74 percent for helix against not helix, 81.22 percent for sheet against not sheet and 73.39 percent for coil against not coil. It is noted that the classifiers would be easier to compare if the entire classification process could be carried out in R. Alternatively, it would be easier to assess the logistic regression classifiers if SPSS had a function to determine the accuracy of a logistic regression classifier.
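The unary (one-hot) window encoding described above can be sketched as follows. The paper works in R and SPSS on the RS126 data; this Python sketch uses a short hypothetical sequence and labels purely for illustration.

```python
import numpy as np

AMINO = "ACDEFGHIKLMNPQRSTVWY"   # the 20 standard amino acids

def encode_windows(seq, labels, w=5):
    """One-hot ('unary') encode sliding windows of width w over a primary
    sequence; label each window by whether its centre residue is helix (H).
    Returns one row per centre residue with w * 20 binary features."""
    idx = {a: i for i, a in enumerate(AMINO)}
    X, y = [], []
    half = w // 2
    for c in range(half, len(seq) - half):
        window = seq[c - half:c + half + 1]
        vec = np.zeros(w * len(AMINO))
        for p, aa in enumerate(window):
            vec[p * len(AMINO) + idx[aa]] = 1.0   # one 1 per window position
        X.append(vec)
        y.append(1 if labels[c] == "H" else 0)    # H / not-H task
    return np.array(X), np.array(y)

# hypothetical toy sequence and secondary-structure labels (not RS126 data)
seq    = "MKVLAAGHHLLEKV"
labels = "CCHHHHCCHHHHCC"
X, y = encode_windows(seq, labels, w=5)
print(X.shape)   # one row per centre residue, 5 * 20 features each
```

These rows are the kind of unary feature vectors the abstract's classifiers (naive Bayes, classification trees, logistic regression) are trained on, with window size w as the tuning parameter.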

    Assessment of the Activity of Lactate Dehydrogenase, Gamma-glutamyl Transpeptidase and Alkaline Phosphatase in Breast Cancer Patients with and without Lymph Node Metastasis

    Background: Breast cancer has the highest cancer incidence in women and is one of the leading causes of mortality globally. Lactate dehydrogenase (LDH) and other enzymes are vital metabolic enzymes associated with cancer development, invasion, and metastasis. Aim: The present study aimed to evaluate circulating LDH, gamma-glutamyl transpeptidase (GGT) and alkaline phosphatase (ALP) levels in breast cancer patients at different stages, and to correlate these levels with the stage and duration of the disease, with and without lymph node involvement. Methods: LDH, GGT, and ALP were analyzed spectrophotometrically using a Biosystem device and reagents; serum samples were used for the measurements. Results: GGT and LDH showed significant differences between stages II and III (P-values 0.022 and 0.001, respectively), while ALP did not. Conclusion: These data suggest that enzyme markers such as serum LDH and GGT could be sensitive, specific and cost-effective biomarkers for diagnosing carcinoma of the breast and for monitoring its progression.
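The stage II vs. stage III comparison behind the reported P-values is a two-sample test; a minimal sketch with synthetic enzyme values (not the study's measurements) and a normal-approximation p-value:

```python
from math import erfc, sqrt
from statistics import mean, stdev

def welch_test(a, b):
    """Welch's two-sample t statistic with a two-sided normal-approximation
    p-value (adequate for moderately large groups)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    t = (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))
    p = erfc(abs(t) / sqrt(2))   # P(|Z| > |t|) under the normal approximation
    return t, p

# synthetic illustrative enzyme activities (e.g. U/L), NOT the study's data
stage2 = [180, 195, 170, 188, 176, 183, 191, 174]
stage3 = [215, 230, 205, 224, 219, 211, 228, 222]
t, p = welch_test(stage2, stage3)
print(f"t = {t:.2f}, p ≈ {p:.4g}")
```

An exact p-value would use the t distribution with Welch-Satterthwaite degrees of freedom, as standard statistics packages do.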

    The Budding Yeast “Saccharomyces cerevisiae” as a Drug Discovery Tool to Identify Plant-Derived Natural Products with Anti-Proliferative Properties

    The budding yeast Saccharomyces cerevisiae is a valuable system for studying cell-cycle regulation, which is defective in cancer cells. Due to the highly conserved nature of the cell-cycle machinery between yeast and humans, yeast studies are directly relevant to anticancer drug discovery. The budding yeast is also an excellent model system for identifying and studying antifungal compounds because of the functional conservation of fungal genes. Moreover, yeast studies have contributed greatly to our understanding of the biological targets and modes of action of bioactive compounds. Understanding the mechanism of action of clinically relevant compounds is essential for the design of improved second-generation molecules. Here we describe our methodology for screening a library of plant-derived natural products in yeast in order to identify and characterize new compounds with anti-proliferative properties.