
    Regular Global Solutions for Some Linear Evolution Equations of Singular Pseudo-Differential Type

    Get PDF
    2000 Mathematics Subject Classification: 35C15, 35D05, 35D10, 35S10, 35S99. We give examples of equations of type (1), $\partial_t^2 y - p(t, D_x)\,y = 0$, where $p$ is a singular pseudo-differential operator, that have regular global solutions when the Cauchy data are regular, with $t \in \mathbb{R}$, $x \in \mathbb{R}^5$.
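    For orientation, a minimal LaTeX restatement of the Cauchy problem of type (1) described above; the names of the Cauchy data, $y_0$ and $y_1$, are illustrative and not taken from the paper.

```latex
% Cauchy problem of type (1); y_0, y_1 are illustrative names for the data.
\[
\begin{cases}
\partial_t^2 y(t,x) - p(t, D_x)\, y(t,x) = 0, & t \in \mathbb{R},\ x \in \mathbb{R}^5,\\[2pt]
y(0,x) = y_0(x), \qquad \partial_t y(0,x) = y_1(x),
\end{cases}
\]
```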

    Novel Approach for IP-PBX Denial of Service Intrusion Detection Using Support Vector Machine Algorithm.

    Get PDF
    Recent trends have revealed that SIP-based IP-PBX denial-of-service (DoS) attacks account for most IP-PBX attacks overall, resulting in loss of revenue and quality of service for telecommunication providers. IP-PBX systems face challenges in detecting and mitigating this malicious traffic. In this research, a Support Vector Machine (SVM) machine learning detection and prevention algorithm was developed to detect this type of attack, and two other techniques, decision tree and Naïve Bayes, were benchmarked against it. The training phase of the machine learning algorithms used a proposed real-time training dataset, benchmarked against two public training datasets, CICIDS and NSL-KDD. On the proposed real-time training dataset, the SVM algorithm achieved the highest detection rate of 99.13%, while decision tree and Naïve Bayes achieved 93.28% and 86.41%, respectively. On the CICIDS dataset, SVM achieved the highest detection rate of 76.47%, while decision tree and Naïve Bayes achieved 63.71% and 41.58%, respectively. On the NSL-KDD training dataset, SVM achieved 65.17%, while decision tree and Naïve Bayes achieved 51.96% and 38.26%, respectively. The time taken by the algorithms to classify an attack is also important: SVM required the least time (2.9 minutes) to detect attacks, while decision tree and Naïve Bayes required 13.6 minutes and 26.2 minutes, respectively. The proposed SVM algorithm also achieved the lowest false negative count (87 messages), while decision tree and Naïve Bayes produced 672 and 1359 false negative messages, respectively.
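    A minimal sketch of the benchmark described above: train SVM, decision tree and Naïve Bayes classifiers on labelled SIP-traffic features and compare their detection rates. The feature columns and the load_sip_features() helper are hypothetical stand-ins; the paper's real-time training dataset is not publicly reproduced here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import recall_score

def load_sip_features():
    """Placeholder: return (X, y) where X holds per-message SIP features
    (e.g. request rate, method counts) and y is 1 for attack traffic."""
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 10))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    return X, y

X, y = load_sip_features()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": SVC(kernel="rbf", C=1.0),
    "Decision tree": DecisionTreeClassifier(),
    "Naive Bayes": GaussianNB(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    # Detection rate = recall on the attack class; false negatives are
    # attack messages classified as benign.
    print(name, "detection rate:", recall_score(y_te, model.predict(X_te)))
```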

    ARTEMETHER LOADED ETHYLCELLULOSE NANOSUSPENSIONS: EFFECTS OF FORMULATION VARIABLES, PHYSICAL STABILITY AND DRUG RELEASE PROFILE

    Get PDF
    Objective: The aim of this study was to explore the individual and joint effects of the drug:ethylcellulose ratio, the Tween 80 content and the chloroform:water volume ratio on the particle size and size distribution of artemether-loaded ethylcellulose nanosuspension formulations, aiming to achieve a nanosuspension with the desired particle properties, stability and drug release profile. Methods: A mixed-levels design was used to generate a series of artemether-loaded ethylcellulose nanosuspensions produced by the emulsification-solvent evaporation technique. Formulations were characterized for particle size and size distribution using dynamic light scattering. The best-ranked formulation was then evaluated for stability and for drug release rate and kinetics. Results: The drug:polymer ratio, the surfactant content and the organic:water volume ratio were found to exert considerable influences (p<0.05) on the particle size of the produced nanosuspensions, both individually and jointly. The peak intensity of the nanosuspensions was influenced by the drug:polymer ratio (p<0.05), whereas the influences of the different variables on the polydispersity index appeared inconsequential (p>0.05). The best-ranked (optimal) artemether nanosuspension proved stable and capable of improving and sustaining the release of the loaded drug over 24 h, at least under the conditions of this study. Conclusion: Considering both the individual and joint influences of formulation variables assists in achieving a nanosuspension with the desired particle characteristics, stability and drug release profile.
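    An illustrative sketch of how the individual and joint (interaction) effects of the three formulation variables on particle size can be assessed with a full-factorial ANOVA. The factor names, levels and synthetic responses below are assumptions for demonstration; the paper's mixed-levels design data are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for the design runs: three factors and a measured size.
rng = np.random.default_rng(1)
rows = []
for dp in (1, 2, 3):                # drug:polymer ratio levels (hypothetical)
    for tw in (0.5, 1.0):           # Tween 80 content levels (hypothetical)
        for sv in (1, 2):           # organic:water volume ratio levels (hypothetical)
            for _ in range(3):      # replicates
                size = 200 + 30 * dp - 20 * tw + 10 * dp * sv + rng.normal(0, 5)
                rows.append({"drug_polymer": dp, "tween80": tw,
                             "solvent_ratio": sv, "size_nm": size})
df = pd.DataFrame(rows)

# Full factorial model: main effects plus all interactions.
model = smf.ols("size_nm ~ C(drug_polymer) * C(tween80) * C(solvent_ratio)",
                data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # p < 0.05 flags significant individual/joint effects
```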

    Principal component analysis for human gait recognition system

    Get PDF
    This paper presents a method for a human gait recognition system using Principal Component Analysis. Human gait recognition identifies people from the way they walk, without their cooperation or permission. The initial step in this kind of system is to generate silhouette frames of the walking human. A number of features can be extracted from these frames, such as centroid ratio, height, width and orientation. Principal Component Analysis (PCA) is applied to the extracted features to condense the information and produce the main components that represent the gait sequence of each walking human. In the testing phase, the generated gait sequences are recognized using a minimum-distance classifier based on Euclidean distance, matching them against the sequences already stored in the database to identify the walking subject.
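    A minimal sketch of the recognition pipeline described above: reduce per-frame silhouette features (centroid ratio, height, width, orientation, ...) with PCA, then match a probe gait sequence to the stored gallery with a minimum Euclidean distance classifier. The array shapes and random data are placeholders; silhouette extraction is assumed to have been done already.

```python
import numpy as np
from sklearn.decomposition import PCA

# Gallery: one averaged feature vector per enrolled subject (hypothetical shapes).
gallery = np.random.rand(20, 40)        # 20 subjects x 40 silhouette features
labels = np.arange(20)
probe = np.random.rand(40)              # features of the walking subject to identify

pca = PCA(n_components=10)
gallery_pc = pca.fit_transform(gallery)           # principal components of the gallery
probe_pc = pca.transform(probe.reshape(1, -1))    # project probe into the same space

# Minimum-distance classifier based on Euclidean distance.
distances = np.linalg.norm(gallery_pc - probe_pc, axis=1)
print("identified subject:", labels[np.argmin(distances)])
```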

    Analysis of multicomponent transient signals using MUSIC superresolution technique

    Get PDF
    The problem of estimating the parameters of transient signals consisting of real decay constants has long been a subject of study by many researchers. Such signals arise in many problems in science and engineering, such as nuclear magnetic resonance for medical diagnosis, deep-level transient spectroscopy and fluorescence decay analysis. Many techniques have been suggested to analyse these signals, but they often produce mixed results. A new method of analysis using a modified MUSIC (multiple signal classification) subspace algorithm is successfully applied to this type of signal. A noisy multiexponential signal is subjected to a preprocessing procedure consisting of Gardner's transformation and inverse filtering. The modified MUSIC algorithm is then applied to the deconvolved data. The parameters of interest in this paper are the number of components and the decay constants. It is shown that with this technique the parameter estimates do not change significantly with signal-to-noise ratio. The superiority of this algorithm over the conventional MUSIC algorithm is also shown.
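    For orientation, a generic MUSIC pseudospectrum sketch. The paper applies a modified MUSIC algorithm to data that has first been deconvolved via Gardner's transformation and inverse filtering; that preprocessing and the modification are not reproduced here, and the window length m and frequency grid are arbitrary choices.

```python
import numpy as np

def music_pseudospectrum(x, p, m=20, grid=np.linspace(0, np.pi, 512)):
    """Generic MUSIC: x is the (preprocessed) data record, p the assumed
    number of signal components, m the covariance window length."""
    N = len(x)
    X = np.array([x[i:i + m] for i in range(N - m + 1)]).T   # m x (N-m+1) data matrix
    R = X @ X.conj().T / X.shape[1]                          # sample covariance
    eigval, eigvec = np.linalg.eigh(R)                       # eigenvalues ascending
    En = eigvec[:, : m - p]                                  # noise subspace
    spectrum = []
    for w in grid:
        a = np.exp(1j * w * np.arange(m))                    # steering vector
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return grid, np.asarray(spectrum)                        # peaks mark component locations

# Toy usage with two noisy sinusoids (4 complex exponentials):
t = np.arange(200)
x = np.cos(0.6 * t) + 0.5 * np.cos(1.4 * t) + 0.05 * np.random.default_rng(0).normal(size=200)
w, P = music_pseudospectrum(x, p=4)
print(w[np.argsort(P)[-4:]])   # grid points near the pseudospectrum peaks
```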

    Production of Iron Powder by Reduction of Mill Scale with Carbon Monoxide

    Get PDF
    This work concerns the study of the conditions for reducing mill scale, a steelmaking by-product formed during the hot rolling of steels, with a reducing gas in order to produce an iron powder having the characteristics required by powder metallurgy. The reduction was carried out at different temperatures (750-1050 °C) for times ranging from 40 to 180 min in an atmosphere of pure CO. The iron powder produced was characterized by chemical analysis, X-ray diffraction, optical microscopy and scanning electron microscopy. These methods of investigation confirm the presence of iron, graphite and iron carbide (Fe3C) as reaction products. The maximum total iron content reached in the iron powders (98.40%) is obtained by reducing the mill scale at 1050 °C for 180 min. A reducing anneal under hydrogen makes it possible to lower the carbon and oxygen contents of the reduced iron powders to acceptable values. Keywords: recycling; mill scale; reduction; carbon monoxide; iron powder
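    For reference, the standard gas-reduction and carburization reactions that are consistent with the phases reported (Fe, graphite, Fe3C); these are given for orientation only and are not quoted from the paper.

```latex
% Standard CO reduction and carburization reactions (orientation only).
\begin{align*}
\mathrm{Fe_2O_3 + 3\,CO} &\rightarrow \mathrm{2\,Fe + 3\,CO_2}\\
\mathrm{Fe_3O_4 + 4\,CO} &\rightarrow \mathrm{3\,Fe + 4\,CO_2}\\
\mathrm{FeO + CO} &\rightarrow \mathrm{Fe + CO_2}\\
\mathrm{2\,CO} &\rightarrow \mathrm{C + CO_2} \quad \text{(Boudouard reaction, source of graphite)}\\
\mathrm{3\,Fe + 2\,CO} &\rightarrow \mathrm{Fe_3C + CO_2}
\end{align*}
```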

    Effect of sampling on the parameter estimates of multicomponent transients

    Get PDF
    The need to estimate the parameters of transient multiexponential signals frequently arises in different areas of applied science. A classical technique that has frequently been used, with various modifications, is the Gardner transform, which converts the original data signal into a convolution model. Converting this model into a discrete form for further analysis depends on the selection of correct sampling conditions. Previously, a relationship between the sampling frequency and the weighting factor in the modified Gardner transform was derived. In this paper, the effect of this relationship on the accuracy of the parameter estimates is investigated.
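    A sketch of the (modified) Gardner transform as it is commonly presented, to show where the weighting factor enters; the exponent α below plays the role of that weighting factor, and the paper's specific sampling-frequency relationship is not reproduced here.

```latex
% Modified Gardner transform (common presentation); \alpha is the weighting factor.
\[
f(t)=\sum_{k} A_k e^{-\lambda_k t}
\;\xrightarrow{\;t=e^{x},\ \lambda=e^{-y}\;}\;
e^{\alpha x} f(e^{x}) \;=\; (p * k)(x),
\]
\[
p(y)=\sum_{k} A_k \lambda_k^{-\alpha}\,\delta\!\left(y+\ln\lambda_k\right),
\qquad
k(u)=e^{\alpha u - e^{u}} .
\]
% Deconvolving (inverse filtering) the sampled e^{\alpha x} f(e^{x}) by the known
% kernel k recovers impulses at x = -\ln\lambda_k, i.e. the decay constants; the
% sampling interval in x determines how well these peaks can be resolved.
```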

    Novel Framework for Hidden Data in the Image Page within Executable File Using Computation between Advanced Encryption Standard and Distortion Techniques

    Full text link
    The rapid development of multimedia and the internet allows wide distribution of digital media data, which has become much easier to edit, modify and duplicate. Digital documents are also easy to copy and distribute and may therefore face many threats. Appropriate protection is necessary because of the significance, accuracy and sensitivity of the information; furthermore, there is no formal method for discovering hidden data. In this paper, a new information hiding framework is presented. The proposed framework combines the Advanced Encryption Standard (AES) with a distortion technique (DT) to embed information in the image page of an executable (EXE) file, providing a secure solution that does not change the size of the cover file. The framework includes two main functions: the first hides the information in the image page of the EXE file through four processes (specify the cover file, specify the information file, encrypt the information, and hide the information); the second extracts the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). Comment: 6 pages, IEEE format, International Journal of Computer Science and Information Security, IJCSIS 2009, ISSN 1947-5500, Impact Factor 0.42
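    A minimal sketch of the two framework stages described above: encrypt the secret message with AES, then place the ciphertext in a designated region of the cover bytes without changing the cover size. The actual distortion technique (DT) and the layout of the EXE image page are paper-specific and not reproduced; embed_offset and the byte-overwrite embedding are illustrative assumptions only.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_ctr(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # AES-CTR: the same operation encrypts and decrypts.
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return enc.update(data) + enc.finalize()

def hide(cover: bytes, secret: bytes, key: bytes, nonce: bytes, embed_offset: int) -> bytes:
    ciphertext = aes_ctr(key, nonce, secret)
    if embed_offset + len(ciphertext) > len(cover):
        raise ValueError("cover region too small for the ciphertext")
    # Overwrite a region of the cover with the ciphertext (cover size unchanged).
    return cover[:embed_offset] + ciphertext + cover[embed_offset + len(ciphertext):]

def extract(stego: bytes, key: bytes, nonce: bytes, embed_offset: int, length: int) -> bytes:
    return aes_ctr(key, nonce, stego[embed_offset:embed_offset + length])

key, nonce = os.urandom(32), os.urandom(16)
cover = os.urandom(4096)                      # stands in for the EXE image page
stego = hide(cover, b"secret message", key, nonce, embed_offset=1024)
print(extract(stego, key, nonce, 1024, len(b"secret message")))
```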

    State-of-the-art application of artificial neural network in digital watermarking and the way forward

    Get PDF
    Several high-performing watermarking schemes using neural networks have been proposed in order to make watermarks more resistant to attacks. The ability of an Artificial Neural Network (ANN) to learn, map, classify and adapt has increased researchers' interest in applying different types of ANN to watermarking. In this paper, ANN-based approaches are categorized by the watermarking component they address: capacity estimation, watermark embedding, watermark recovery and error rate detection. We propose a new watermarking component, the Secure Region (SR), in which an ANN can be used to identify such regions within the estimated capacity. Hence an attack-proof watermarking system can be achieved.
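    A very rough sketch of how an ANN might flag candidate Secure Regions (SR) for embedding, in the spirit of the proposal above: classify image blocks from simple block features. The features, the labelling rule and the network size are all hypothetical assumptions; the paper proposes the SR concept rather than a fixed implementation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def block_features(block):
    # Texture-like features of an 8x8 block: mean, variance, edge energy.
    gy, gx = np.gradient(block.astype(float))
    return [block.mean(), block.var(), np.abs(gx).mean() + np.abs(gy).mean()]

# Hypothetical training data: blocks labelled 1 if embedding there is assumed
# to have survived attacks in earlier experiments, else 0 (placeholder rule).
rng = np.random.default_rng(0)
blocks = rng.integers(0, 256, size=(500, 8, 8))
X = np.array([block_features(b) for b in blocks])
y = (X[:, 1] > np.median(X[:, 1])).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X, y)
secure_mask = clf.predict(X)                  # 1 = candidate secure region
print("secure blocks:", int(secure_mask.sum()), "of", len(blocks))
```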

    Avoiding Bending in Case of Uniaxial Tension with Electromagnetic Forming

    Get PDF
    During electromagnetic forming, excessive bending of the specimen takes place because of the high velocities and inertia involved. We show that this excessive bending can be prevented by optimizing the coil geometry in the case of uniaxial tension. The process is simulated with various coil geometries, and the resulting amount of bending is compared with that of the standard Nakajima test. The comparison shows that bending can be reduced to levels low enough for the method to be considered a sound way of determining forming limits. The results should be verified experimentally.