
    Evolutionary Centrality and Maximal Cliques in Mobile Social Networks

    This paper introduces an evolutionary approach to enhance the process of finding central nodes in mobile networks. Central nodes provide essential information for many applications in mobile and social networks. The evolutionary approach accounts for the dynamics of the network by taking into consideration the central nodes from previous time slots. We also study the applicability of maximal clique algorithms to mobile social networks and how the discovered maximal cliques can be used to find the central nodes. The experimental results are promising and show a significant enhancement in finding the central nodes.
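    As a rough illustration of the clique-based, time-aware centrality idea described above, the sketch below (Python with networkx) scores nodes by the maximal cliques they belong to and carries over a boost for nodes that were central in the previous time slot; the scoring rule, carry-over weight and synthetic graphs are assumptions for illustration, not the paper's algorithm.

```python
import networkx as nx

def clique_centrality(G, prev_central=frozenset(), carry_over=0.5):
    """Score nodes by membership in maximal cliques, with an optional boost
    for nodes that were central in the previous time slot (illustrative)."""
    scores = {v: 0.0 for v in G.nodes}
    for clique in nx.find_cliques(G):        # enumerate maximal cliques
        for v in clique:
            scores[v] += len(clique)         # larger cliques contribute more
    peak = max(scores.values(), default=0.0)
    for v in prev_central:                   # evolutionary carry-over term
        if v in scores:
            scores[v] += carry_over * peak
    return scores

# Two time slots of a synthetic "mobile" network
G1 = nx.erdos_renyi_graph(20, 0.2, seed=1)
scores_t1 = clique_centrality(G1)
central_t1 = sorted(scores_t1, key=scores_t1.get, reverse=True)[:3]

G2 = nx.erdos_renyi_graph(20, 0.2, seed=2)
scores_t2 = clique_centrality(G2, prev_central=set(central_t1))
print(sorted(scores_t2, key=scores_t2.get, reverse=True)[:3])
```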

    Procurement auctions with avoidable fixed costs: an experimental approach

    Bidders in procurement auctions often face avoidable fixed costs. This can make bidding decisions complex and risky, and market outcomes volatile. If bidders deviate from risk-neutral best responses, either due to faulty optimization or risk attitudes, then equilibrium predictions can perform poorly. In this paper, we confront laboratory bidders with three auction formats that make bidding difficult and risky in different ways. We find that measures of 'difficulty' provide a consistent explanation of deviations from best-response bidding across the three formats. In contrast, risk and loss preferences cannot explain behavior across all three formats.

    Keywords: Auctions; Experimental; Procurement; Synergies; Asymmetric Bidders; Learning; Optimization errors

    Educational Journey: From Qena to Cairo

    This documentary tells the story of my grandfather, Abdelhakim Elmaghraby, and his path to education as someone who grew up in Upper Egypt.

    Creating a strong statistical machine translation system by combining different decoders

    Machine translation is an important field in Natural Language Processing. The need for machine translation arises from the increasing amount of data available online. Most of our data is now digital, and this is expected to increase over time. Since human translation takes a lot of time and effort, machine translation is needed to cover all of the available languages. A lot of research has been done to make machine translation faster and more reliable across different language pairs. Machine translation is now being coupled with deep learning and neural networks, and new topics are being studied and tested, such as applying neural machine translation as a replacement for classical statistical machine translation. In this thesis, we also study the effect of data pre-processing and decoder type on translation output. We then demonstrate two ways to enhance translation from English to Arabic. The first approach uses a two-decoder system: the first decoder translates from English to Arabic, and the second is a post-processing decoder that retranslates the first Arabic output back into Arabic to fix some of the translation errors. We then study the results of different kinds of decoders and their contributions to the test set. The results of this study lead to the second approach, which combines different decoders to create a stronger one. The second approach uses a classifier to categorize English sentences based on their structure; the classifier's output is the decoder best suited to translate the given sentence. Both approaches increased the BLEU score, albeit by different margins: the classifier showed an increase of about 0.1 BLEU points, while the post-processing decoder showed increases of roughly 0.3 to 11 BLEU points on two different test sets. Finally, we compare our results to Google Translate to see how well we do against a well-known translator. Our best translation system came within 5 absolute points of Google Translate on the ISI corpus test set and was 9 absolute points lower on the UN corpus test set.
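    The second approach, routing each English sentence to the decoder predicted to translate it best, could look roughly like the sketch below; the features, classifier choice, decoder labels and training data are hypothetical stand-ins, not the thesis implementation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: each English sentence is labelled with the
# decoder that produced the best BLEU for it (labels are placeholders).
sentences = ["the committee adopted the resolution",
             "he did not attend the meeting yesterday",
             "prices rose sharply in the first quarter"]
best_decoder = ["phrase_based", "hierarchical", "phrase_based"]

# Learn to route new sentences to the decoder predicted to do best.
router = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                       LogisticRegression(max_iter=1000))
router.fit(sentences, best_decoder)

def translate(sentence, decoders):
    """Pick a decoder per sentence, then translate with it.
    `decoders` maps decoder name -> callable; both are illustrative stand-ins."""
    choice = router.predict([sentence])[0]
    return decoders[choice](sentence)
```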

    The use of mechanical redundancy for fault detection in non-stationary machinery

    The classical approach to machinery fault detection is one where a machine's condition is constantly compared to an established baseline, with deviations indicating the occurrence of a fault. In the absence of a well-established baseline, fault detection for variable-duty machinery requires complex machine learning and signal processing tools. These tools require extensive data collection and expert knowledge, which limits their use in industrial applications. This thesis investigates the problem of fault detection for a specific class of variable-duty machinery: parallel machines with simultaneously loaded subsystems. As an industrial case study, the parallel drive stations of a novel material haulage system were instrumented to confirm the mechanical response similarity between simultaneously loaded machines. Using a table-top fault simulator, a preliminary statistical algorithm was then developed for fault detection in bearings under non-stationary operation. Unlike other state-of-the-art fault detection techniques used to monitor variable-duty machinery, the proposed algorithm avoids the need for complex machine learning tools and requires no prior training. The limitations of the initial experimental setup necessitated the development of a new machinery fault simulator to expand the investigation to transmission systems. The design, manufacturing and setup of the simulator's various subsystems are covered in this manuscript, including the mechanical, hydraulic and control subsystems. To ensure that the new simulator met its design objectives, extensive data collection and analysis was completed and is presented in this thesis. The results confirm that the developed machine truly represents the operation of a simultaneously loaded machine and as such will serve as a research tool for investigating the application of classical fault detection techniques to parallel machines in non-stationary operation.

    Master's Thesis
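    A minimal sketch of the mechanical-redundancy idea, comparing two simultaneously loaded drives against each other instead of against a baseline; the windowed-RMS feature, the ratio test and the threshold below are illustrative assumptions rather than the thesis's statistical algorithm.

```python
import numpy as np

def windowed_rms(signal, win=1024):
    """RMS per non-overlapping window."""
    n = len(signal) // win
    return np.sqrt((signal[:n * win].reshape(n, win) ** 2).mean(axis=1))

def redundancy_check(drive_a, drive_b, ratio_limit=1.5):
    """Flag windows where one simultaneously loaded drive departs from its
    twin; no baseline or training data is needed (threshold is illustrative)."""
    rms_a, rms_b = windowed_rms(drive_a), windowed_rms(drive_b)
    ratio = np.maximum(rms_a, rms_b) / np.minimum(rms_a, rms_b)
    return ratio > ratio_limit          # True marks suspect windows

# Synthetic example: drive B develops an impulsive defect partway through
t = np.linspace(0, 10, 100_000)
load = 1 + 0.5 * np.sin(0.5 * t)                         # shared non-stationary load
drive_a = load * np.random.randn(t.size)
drive_b = load * np.random.randn(t.size)
drive_b[60_000:] += 3 * (np.random.rand(40_000) > 0.99)  # late-onset impulses
print(redundancy_check(drive_a, drive_b).nonzero()[0])
```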

    Security and the Transnational Information Polity

    Global information and communications technologies create criminal opportunities in which criminal violation and physical proximity are decoupled. As in all our endeavors, the good become the prey of the bad. Murderous and venal exploitation of ICT has followed from the inception of the Internet, threatening all the good it brings and the trust we so badly need as a people. As work continues to expand the implementation of Smart Cities and the Internet of Things, there will be more opportunities for exploitation of these technologies. We examine the social and liberty risks our data- and technology-driven responses may entail.

    Basic approximations to an adaptive resource allocation technique to stochastic multimodal projects

    This paper presents three basic approximations developed to solve the Adaptive Stochastic Multimodal Resource Allocation Problem. Two of them are based on the DP model introduced in earlier papers ([23], [24]); the third uses NLP to solve the problem. The approximations consist of representing the work content of some or all of the project's activities by their mean values. These approximations were applied to a set of examples, and the results are reported and discussed. As expected, running times were reduced compared to the original model, but the total cost was underestimated due to the use of means instead of the complete distributions.
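    The underestimation noted above is what one would expect from Jensen's inequality whenever cost is convex in the stochastic work content; the tiny numeric sketch below illustrates the effect, with the distribution, allocation rate and cost function all assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative work content for one activity (units of work), lognormal.
work = rng.lognormal(mean=3.0, sigma=0.6, size=100_000)
rate = 4.0                                    # resource units allocated

cost = lambda w: (w / rate) ** 2              # convex cost in duration (assumed)

print("cost at mean work   :", cost(work.mean()))   # mean-value approximation
print("expected cost (full):", cost(work).mean())   # full distribution
# The approximation is smaller: evaluating a convex cost at the mean
# underestimates the expected cost (Jensen's inequality), matching the
# underestimation reported above.
```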

    Adaptive resource allocation in multimodal activity networks

    In practice, project managers must cope with uncertainty, and must manipulate the allocation of their resources adaptively in order to achieve their ultimate objectives. Yet, treatments of the well-known ‘resource constrained project scheduling problem’ have been deterministic and static, and have addressed mostly unimodal activities. We present an approach to resource allocation under stochastic conditions for multimodal activity networks. Optimization is via dynamic programming, which proves to be demanding computationally. We suggest approximation schemes that do not detract significantly from optimality, but are modest in their computational requirements.
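    A toy sketch of a dynamic-programming formulation in this spirit, where the mode chosen for each activity adapts to the elapsed time realized so far; the activity data, due date and penalty are invented for illustration, and the recursion is far smaller than the networks treated in the paper.

```python
from functools import lru_cache

# Each activity can run in one of several modes; a mode has a cost and a set
# of possible durations with probabilities (all numbers are assumed).
activities = [
    {"modes": [(3, [(2, 0.5), (4, 0.5)]),     # (cost, [(duration, prob), ...])
               (5, [(1, 0.7), (2, 0.3)])]},
    {"modes": [(2, [(3, 0.6), (5, 0.4)]),
               (4, [(2, 0.5), (3, 0.5)])]},
]
DUE, LATE_PENALTY = 6, 10

@lru_cache(maxsize=None)
def best(stage, elapsed):
    """Minimum expected remaining cost from `stage` given realized elapsed time."""
    if stage == len(activities):
        return LATE_PENALTY * max(0, elapsed - DUE)   # tardiness penalty
    value = float("inf")
    for cost, outcomes in activities[stage]["modes"]:
        exp = cost + sum(p * best(stage + 1, elapsed + d) for d, p in outcomes)
        value = min(value, exp)                       # pick the best mode here
    return value

print("expected optimal cost:", best(0, 0))
```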

    ENHANCING DISSOLUTION RATE OF INDOMETHACIN BY IN SITU CRYSTALIZATION; DEVELOPMENT OF ORALLY DISINTEGRATING TABLETS

    Objective: The main objective of this study was to investigate the potential of in situ crystallization of indomethacin, in the presence or absence of hydrophilic materials, to improve drug dissolution, with the goal of developing fast disintegrating tablets.

    Methods: Indomethacin crystals were prepared by a bottom-up approach. Water containing a hydrophilic additive (polymer and/or surfactant) was added to an ethanolic solution of indomethacin while stirring. The selected polymers were hydroxypropyl methylcellulose E5 (HPMC E5), polyethylene glycol 6000 (PEG 6000) and polyvinylpyrrolidone K40 (PVP K40). The surfactants used were Tween 80 and Gelucire 44/14. The precipitated particles were collected and air dried. Solid-state characterization was performed in addition to in vitro release studies in both acidic (0.1 N HCl) and alkaline (phosphate buffer pH 6.8) media. The optimized formulation was selected to develop fast disintegrating tablets.

    Results: Thermal behavior suggested modulation of the crystalline nature with a reduction in particle size, which was confirmed by X-ray diffraction. Infrared spectroscopy excluded any interaction between the drug and the hydrophilic excipients. Drug dissolution in the acidic medium showed a slight improvement in drug release, while a marked increase was observed in the alkaline medium. The combination of Tween 80 and HPMC (F7) showed the best dissolution parameters, with a 5-fold enhancement in release efficiency (RE) compared to the pure drug. Formula F7 was successfully used to formulate fast disintegrating tablets with prompt release of 58% of the loaded dose and an RE of 83%.

    Conclusion: In situ crystallization of indomethacin is a good approach for enhancing the dissolution rate, with the presence of hydrophilic additives during the precipitation process improving the efficiency.
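    For reference, release (dissolution) efficiency is conventionally defined as the area under the release curve expressed as a percentage of the area of the rectangle describing 100% release over the same interval; the formula below is that standard definition, assumed here rather than quoted from the paper.

```latex
% Release efficiency over the dissolution interval [0, T]:
\[
\mathrm{RE} = \frac{\int_{0}^{T} y(t)\,\mathrm{d}t}{y_{100}\,T} \times 100\%
\]
% y(t): percentage of drug released at time t; y_{100}: 100% release.
```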