Paolo Mengozzi; Mohammed Bedjaoui: investiture ceremonies
Addresses from the ceremonies conferring the degree of Doctor Honoris Causa on Professor Dr. Paolo Mengozzi at the Universidad Carlos III de Madrid on 23 January 1998 and on Professor Dr. Mohammed Bedjaoui on 27 January 2000. Preface / Fernando M. Mariño Menéndez .-- Investiture ceremony of Professor Dr. Paolo Mengozzi as Doctor Honoris Causa (23 January 1998) .-- Laudatio of Don Paolo Mengozzi by Dr. Fernando M. Mariño Menéndez .-- Investiture lecture by the new Doctor Honoris Causa, Don Paolo Mengozzi .-- Address by the Rector Magnífico of the Universidad Carlos III de Madrid .-- Investiture ceremony of Professor Dr. Mohammed Bedjaoui as Doctor Honoris Causa (27 January 2000) .-- Laudatio of Don Mohammed Bedjaoui by Dr. Fernando M. Mariño Menéndez .-- Investiture lecture by the new Doctor Honoris Causa, Don Mohammed Bedjaoui
Review of M-libraries 2: A Virtual Library in Everyone’s Pocket
Review of M-libraries 2: A Virtual Library in Everyone’s Pocket. Eds. Mohammed Ally and Gill Needham. London: Facet Publishing, 2010. 273p. Paperback, $105 (ISBN 9781856046961)
Urban land utilization : case study : Riyadh, Saudi Arabia
Thesis. 1975. M.Arch.A.S.--Massachusetts Institute of Technology, Dept. of Architecture. "The analysis and evaluations were carried out in the Urban Settlement Design Program, School of Architecture and Planning, M.I.T." Bibliography: p. 91. By Mohammed A. Al-Hussayen & Ali M. Shuaibi. M.Arch.A.S.
Sustainable economic growth in Iraq : role of industrialization, deforestation, trade, employment, technology, and agriculture
In the present study, the sustainable economic growth of Iraq is evaluated by analyzing six key explanatory variables: industrialization, deforestation, trade, employment, technology, and agricultural expansion, the last measured by arable land. The study uses a time-series research design spanning 2000 to 2021. A quantitative approach was adopted, and data were collected from the World Bank database. Fully modified ordinary least squares (FMOLS) and dynamic ordinary least squares (DOLS) regressions were used to investigate the relationships, with canonical cointegrating regression (CCR) estimation employed as a robustness check. The results reveal significant negative effects of employment and deforestation on GDP per capita (GDPC). Industrialization also negatively affects GDPC, while trade shows a positive influence. Arable land, on the other hand, does not exhibit a significant impact on GDPC. Based on these findings, the study suggests several policy implications. Majeed M. Abid (Department of Accounting, Al-Hadi University College), Nidhal Raheem Mardood (Department of Accounting, College of Management and Economics, Al-Esraa University), Amenah Muayad Abdullah (College of Education, Al-Farahidi University), Mohammed Salim Madi (Department of Accounting, Mazaya University College), Muqdad Hussein Ali (College of Media, Department of Journalism, The Islamic University in Najaf), Mohammed Yousif Oudah Al-Muttar (Scientific Research Centre, Al-Ayen University), Rajaa Jasim Mohammed (Department of Management, Al-Nisour University College), Ghassan Kasim Al_Lami (Business Management Department, Ashur University College), Raad M. Sayed-Lafi (College of Education, National University of Science and Technology). Corresponding author: Majeed M. Abid. Includes bibliographical references.
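The DOLS estimator mentioned in this abstract augments a cointegrating regression with leads and lags of the differenced regressors. A minimal sketch, using synthetic data, a single regressor, and a hypothetical lead/lag order of 2 (not the study's actual specification or data):

```python
import numpy as np

rng = np.random.default_rng(0)
T, p = 300, 2
x = np.cumsum(rng.normal(size=T))        # I(1) regressor (random walk)
y = 1.0 + 2.0 * x + rng.normal(size=T)   # cointegrated, long-run beta = 2

dx = np.diff(x)                          # dx[t-1] = x[t] - x[t-1]
rows = list(range(p + 1, T - p))         # trim ends so all leads/lags exist

# DOLS design: constant, x_t, and leads/lags of the differenced regressor
X = np.column_stack(
    [np.ones(len(rows)), x[rows]]
    + [dx[[t - 1 + j for t in rows]] for j in range(-p, p + 1)]
)
beta = np.linalg.lstsq(X, y[rows], rcond=None)[0]
print(round(beta[1], 3))                 # long-run coefficient, close to 2
```

The leads and lags soak up the correlation between the regressor's innovations and the error term, which is what makes DOLS suitable for cointegrated series.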
Evaluation of WGS-subtyping methods for epidemiological surveillance of foodborne salmonellosis
Background: Salmonellosis is one of the most common foodborne diseases worldwide. Although human infection by non-typhoidal Salmonella (NTS) enterica subspecies enterica is associated primarily with a self-limiting diarrhoeal illness, invasive bacterial infections (such as septicaemia, bacteraemia and meningitis) have also been reported. Human outbreaks of NTS have been reported in many countries around the world, in the developing world as well as in high-income countries. Conventional laboratory methods such as pulsed-field gel electrophoresis (PFGE) do not provide adequate discrimination and have limitations in epidemiological surveillance. It is therefore crucial to use accurate, reliable and highly discriminative subtyping methods for epidemiological characterisation and outbreak investigation.
Methods: Here, we used different whole-genome sequence (WGS)-based subtyping methods for retrospective investigation of two outbreaks, of Salmonella Typhimurium and Salmonella Dublin, that occurred in 2013 in the UK and Ireland, respectively.
Results: Single nucleotide polymorphism (SNP)-based cluster analysis of the Salmonella Typhimurium genomes revealed well-supported clades that were concordant with the epidemiologically defined outbreak and confirmed that the source of the outbreak was consumption of contaminated mayonnaise. SNP analyses of the Salmonella Dublin genomes confirmed the outbreak; however, the source of infection could not be determined. Core genome multilocus sequence typing (cgMLST) was discriminatory and separated the outbreak strains of Salmonella Dublin from the non-outbreak strains, concordant with the epidemiological data; however, cgMLST could neither discriminate between the outbreak and non-outbreak strains of Salmonella Typhimurium nor confirm that contaminated mayonnaise was the source of infection. Other WGS-based subtyping methods, including multilocus sequence typing (MLST), ribosomal MLST (rMLST), whole genome MLST (wgMLST), clustered regularly interspaced short palindromic repeats (CRISPRs), prophage sequence profiling, antibiotic resistance profiling and plasmid typing, were less discriminatory and could not confirm the source of the outbreak.
Conclusions: Foodborne salmonellosis is an important public health concern; it is therefore crucial to use accurate, reliable and highly discriminative subtyping methods for epidemiological surveillance and outbreak investigation. In this study, we showed that SNP-based analyses can not only confirm the occurrence of an outbreak but also provide definitive evidence of its source in real time.
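The core idea behind SNP-based cluster analysis can be shown in miniature: isolates whose pairwise SNP distance falls below a threshold are linked into the same cluster, so outbreak strains group apart from sporadic ones. A toy sketch with hypothetical aligned sequences and a hypothetical SNP cutoff (not the study's pipeline or data):

```python
from collections import defaultdict

# Hypothetical aligned core-genome snippets (toy data, not from the study)
seqs = {
    "outbreak_1": "ACGTACGTAC",
    "outbreak_2": "ACGTACGTAT",
    "outbreak_3": "ACGTACGGAT",
    "sporadic_1": "TCGAACCTAC",
    "sporadic_2": "TCGAACCTGC",
}

def snp_dist(a, b):
    """Count positions where two aligned sequences differ."""
    return sum(x != y for x, y in zip(a, b))

THRESHOLD = 2                 # hypothetical single-linkage SNP cutoff
names = list(seqs)
parent = {n: n for n in names}

def find(n):                  # union-find root lookup
    while parent[n] != n:
        n = parent[n]
    return n

for i, a in enumerate(names):
    for b in names[i + 1:]:
        if snp_dist(seqs[a], seqs[b]) <= THRESHOLD:
            parent[find(a)] = find(b)   # union: merge the two clusters

groups = defaultdict(list)
for n in names:
    groups[find(n)].append(n)
clusters = sorted(sorted(g) for g in groups.values())
print(clusters)   # outbreak isolates cluster together, apart from sporadic ones
```

Real pipelines work from variant calls against a reference genome and use phylogenetic support rather than a bare threshold, but the clustering intuition is the same.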
A Survey on the Project in title
In this paper we present a survey of the work done in the project "Unsupervised Adaptive P300 BCI in the framework of chaotic theory and stochastic theory". We summarise the following papers: (Mohammed J. Alhaddad, 2011), (Mohammed J. Alhaddad & Kamel M, 2012), (Mohammed J. Alhaddad, Kamel, & Al-Otaibi, 2013), (Mohammed J. Alhaddad, Kamel, & Bakheet, 2013), (Mohammed J. Alhaddad, Kamel, & Al-Otaibi, 2014), (Mohammed J. Alhaddad, Kamel, & Bakheet, 2014), (Mohammed J. Alhaddad, Kamel, & Kadah, 2014), (Mohammed J. Alhaddad, Kamel, Makary, Hargas, & Kadah, 2014), (Mohammed J. Alhaddad, Mohammed, Kamel, & Hagras, 2015). We developed a new pre-processing method for denoising P300-based brain-computer interface data that allows better performance with a lower number of channels and blocks. The new denoising technique is based on a modified version of spectral subtraction denoising and works on each temporal signal channel independently, thus offering seamless integration with existing pre-processing and allowing low channel counts to be used. We also developed a novel approach for brain-computer interface data that requires no prior training. The proposed approach is based on an interval type-2 fuzzy logic classifier, which is able to handle the users' uncertainties to produce better prediction accuracies than competing classifiers such as BLDA or RFLDA. In addition, the generated type-2 fuzzy classifier is learnt from data via genetic algorithms to produce a small number of rules, with a rule length of only one antecedent, to maximise transparency and interpretability for the clinician. We also employ a feature selection system based on ensemble neural network recursive feature selection, which is able to find the effective time instances within the effective sensors in relation to a given P300 event.
The basic principle of this new class of techniques is that the trial with the true activation signal within each block has to be different from the rest of the trials within that block. Hence, a measure that is sensitive to this dissimilarity can be used to make a decision based on a single block, without any prior training. The new methods were verified in various experiments performed on standard data sets and on real data sets obtained from experiments on human subjects in the BCI lab at King Abdulaziz University. The results were compared to the classification results of the same data using previous methods. Enhanced performance across the different experiments was confirmed, as quantitatively assessed by classification block accuracy as well as bit-rate estimates. The produced type-2 fuzzy logic classifier learns simple, easy-to-understand rules explaining the events in question, and it gives better accuracies than BLDA or RFLDA on various human subjects on the standard and real-world data sets.
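The spectral subtraction idea underlying the denoising step can be sketched roughly as follows: estimate the noise magnitude spectrum from a signal-free segment, subtract it from each channel's magnitude spectrum, and reconstruct with the original phase. A minimal single-channel toy with synthetic data and a hypothetical baseline segment (the papers' modified version differs in detail):

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 256, 256                            # hypothetical sampling rate / length
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * 5 * t)           # stand-in for an evoked component
signal = clean + 0.5 * rng.normal(size=n)   # one noisy EEG channel

def spectral_subtract(x, noise_mag, alpha=1.0):
    """Subtract an estimated noise magnitude spectrum, keeping the phase."""
    X = np.fft.rfft(x)
    mag = np.maximum(np.abs(X) - alpha * noise_mag, 0.0)  # rectify at zero
    return np.fft.irfft(mag * np.exp(1j * np.angle(X)), n=len(x))

# Noise spectrum estimated from a signal-free baseline (assumed available);
# each channel would be processed independently, as described above.
baseline = 0.5 * rng.normal(size=n)
denoised = spectral_subtract(signal, np.abs(np.fft.rfft(baseline)))
```

Because every bin's magnitude can only shrink, the denoised channel has strictly less energy than the input; the per-channel design is what lets the method slot in front of any existing pipeline.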
Image compression based on 2D Discrete Fourier Transform and matrix minimization algorithm
In the present era of the internet and multimedia, image compression techniques are essential to improve image and video performance in terms of storage space, network bandwidth usage, and secure transmission. A number of image compression methods are available, with largely differing compression ratios and coding complexity. In this paper we propose a new method for compressing high-resolution images based on the Discrete Fourier Transform (DFT) and the Matrix Minimization (MM) algorithm. The method transforms an image by the DFT, yielding the real and imaginary components. A quantization process is applied to both components independently, aiming to increase the number of high-frequency coefficients. The real component matrix is separated into Low Frequency Coefficients (LFC) and High Frequency Coefficients (HFC). Finally, the MM algorithm followed by arithmetic coding is applied to the LFC and HFC matrices. The decompression algorithm decodes the data in reverse order, using a sequential search algorithm to decode the data from the MM matrix. Thereafter, all decoded LFC and HFC values are combined into one matrix, followed by the inverse DFT. Results demonstrate that the proposed method yields high compression ratios, over 98% for structured light images, with good image reconstruction. Moreover, the proposed method compares favorably with the JPEG technique in terms of compression ratio and image quality.
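The transform-quantize-decode pipeline described above can be illustrated in miniature. The sketch below uses toy 8x8 data and a hypothetical quantization step, and omits the Matrix Minimization and arithmetic coding stages entirely; it only shows how DFT quantization produces the runs of zero coefficients that those later stages exploit:

```python
import numpy as np

rng = np.random.default_rng(2)
img = rng.random((8, 8))                  # stand-in for an image block

F = np.fft.fft2(img)                      # forward 2D DFT
re, im = F.real, F.imag                   # real and imaginary components

q = 4.0                                   # hypothetical quantization step
re_q, im_q = np.round(re / q), np.round(im / q)

# Quantization drives most high-frequency coefficients to zero, which is
# what makes the subsequent entropy-coding stages effective.
zeros = int((re_q == 0).sum() + (im_q == 0).sum())

recon = np.fft.ifft2((re_q + 1j * im_q) * q).real   # dequantize + inverse DFT
mse = float(np.mean((img - recon) ** 2))
print(zeros, round(mse, 4))
```

Raising `q` trades reconstruction error for more zeroed coefficients, which is the same ratio-versus-quality trade-off the paper evaluates against JPEG.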