
    Challenges of hidden data in the unused area two within executable files

    Problem statement: Executable files are among the most important files in operating systems and in most systems designed by developers (programmers/software engineers); hiding information in these files is the basic goal of this study, because most users of a system cannot alter or modify their content. Hiding data in the unused area two within executable files poses several challenges: the dependence of the hidden-information size on the size of the cover file, differences in file size before and after the hiding process, the ability of the cover file to run normally after the hiding process, and detection by antivirus software as a result of changes made to the file. Approach: The system was designed around a release mechanism consisting of two functions. The first hides the information in the unused area 2 of a PE file (.exe file) through four processes (specify the cover file, specify the information file, encrypt the information, and hide the information); the second extracts the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). Results: The programs were coded in Java and run on a Pentium PC. The designed algorithms enable the proposed system to hide and retrieve information (a data file) within the unused area 2 of any executable file (.exe file). Conclusion: The results show that the size of the hidden data depends on the size of the unused area 2 within the cover file, which is about 20% of the size of the .exe file before the hiding process; most antivirus systems do not allow direct writes to executable files, so the proposed system aims to keep the hidden information from being detected by these systems, and the .exe file still functions as usual after the hiding process.
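
    The abstract names the processing steps only at a high level. Purely as an illustration of the general flow (choose a cover file, encrypt the secret file, write it into an unused region of the PE image, and reverse the steps to recover it), here is a minimal Python sketch. It is not the authors' Java implementation: the third-party pefile and cryptography packages, the HIDE marker, the length prefix, and the use of ordinary section slack space (rather than the paper's specific "unused area 2") are all assumptions made for this example.

    ```python
    # Illustrative sketch only; not the authors' Java system.
    # Assumes the third-party 'pefile' and 'cryptography' packages.
    import struct
    import pefile
    from cryptography.fernet import Fernet

    MAGIC = b"HIDE"  # hypothetical marker so the extractor can find the payload


    def find_slack(exe_path):
        """Return (file_offset, size) of unused bytes inside each PE section:
        space that is allocated on disk but not used by the section's data."""
        pe = pefile.PE(exe_path)
        regions = []
        for section in pe.sections:
            used = section.Misc_VirtualSize
            allocated = section.SizeOfRawData
            if allocated > used:
                regions.append((section.PointerToRawData + used, allocated - used))
        pe.close()
        return regions


    def hide(exe_path, secret_path, key):
        """Encrypt the secret file and write it into the first unused region
        large enough to hold it; the carrier file's size does not change."""
        with open(secret_path, "rb") as f:
            token = Fernet(key).encrypt(f.read())
        payload = MAGIC + struct.pack("<I", len(token)) + token
        for offset, capacity in find_slack(exe_path):
            if capacity >= len(payload):
                with open(exe_path, "r+b") as exe:
                    exe.seek(offset)
                    exe.write(payload)
                return offset
        raise ValueError("no unused region is large enough for the payload")


    def extract(exe_path, key):
        """Scan the unused regions for the marker and decrypt the payload."""
        for offset, capacity in find_slack(exe_path):
            with open(exe_path, "rb") as exe:
                exe.seek(offset)
                blob = exe.read(capacity)
            if blob.startswith(MAGIC):
                (length,) = struct.unpack("<I", blob[4:8])
                return Fernet(key).decrypt(blob[8:8 + length])
        raise ValueError("no hidden payload found")
    ```

    A caller would generate a key with Fernet.generate_key(), call hide() once on the carrier, and later recover the data file with extract() and the same key.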

    Optimizing security and flexibility by designing a high security system for e-government servers

    E-government is one of the most popular web-based applications. It helps people carry out their work online, access government sites, apply for jobs online, and retrieve important data from government databases, and it also helps government employees access cameras and sensors across the country. However, there are many challenges in keeping government data safe and secure in an open environment (network). This paper therefore discusses two issues. The first is how to keep the data safe: the paper introduces several applications that guarantee very high security for accessing and editing data, and it presents a new design for e-government servers in which the authors distribute the security services along each line to prevent attacks from outside or inside. The second issue is ensuring the flexibility of the data flow from the servers to the users, which is the second challenge in the design. The experiment shows good results: the new approach provides high security and, at the same time, flexible e-government access. This paper offers a different view and uses a mixture of technologies to achieve a high level of security without affecting different users' access. The e-government environment is subject to multiple security challenges, so this paper proposes a model for securing the servers and ensuring the flexibility of the system, balancing in a simple way the many security tools and appropriate protection against flexible data upload and download for the user.

    New approach of hidden data in the portable executable file without change the size of carrier file using statistical technique

    The rapid development of multimedia and the internet allows for wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information. In addition, digital documents are easy to copy and distribute, and therefore face many threats. It has become necessary to find appropriate protection given the significance, accuracy and sensitivity of the information. The strength of information hiding lies in the absence of standard algorithms for hiding secret messages, in the randomness of hiding methods, such as combining several media (covers) with different methods to pass a secret message, and in the lack of a formal method for discovering hidden data. In this paper, a new information hiding system is presented. The aim of the proposed system is to hide information (a data file) in an executable file (EXE) without changing the size of the executable file. Meanwhile, since the cover file might otherwise be used to identify the hidden information, the proposed system overcomes this dilemma by using the executable file itself as the cover file.
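
    To make the "no size change" constraint concrete, the short check below reports how many already-allocated but unused bytes a carrier offers; writing only into such bytes can never grow the file. This is a sketch under the same assumptions as the earlier example (the third-party pefile package, ordinary section slack as the hiding place) and is not the statistical technique proposed in the paper.

    ```python
    # Sketch only: measures in-place hiding capacity, not the paper's method.
    # Assumes the third-party 'pefile' package.
    import os
    import pefile


    def slack_capacity(exe_path):
        """Total unused bytes already allocated inside the carrier's sections."""
        pe = pefile.PE(exe_path)
        total = sum(max(0, s.SizeOfRawData - s.Misc_VirtualSize) for s in pe.sections)
        pe.close()
        return total


    def fits_without_growth(exe_path, payload_len):
        """True if the payload could, in principle, be hidden in place.
        (A real embedder may additionally need a single contiguous region.)"""
        return payload_len <= slack_capacity(exe_path)


    if __name__ == "__main__":
        carrier = "app.exe"  # hypothetical carrier file
        print("capacity:", slack_capacity(carrier), "bytes;",
              "file size:", os.path.getsize(carrier), "bytes")
    ```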

    Public Sentiment Analysis and Topic Modeling Regarding COVID-19’s Three Waves of Total Lockdown: A Case Study on Movement Control Order in Malaysia

    The COVID-19 pandemic has affected many aspects of human life. The pandemic not only caused millions of fatalities and problems but also changed public sentiment and behavior. Owing to the magnitude of this pandemic, governments worldwide adopted full lockdown measures that attracted much discussion on social media platforms. To investigate the effects of these lockdown measures, this study performed sentiment analysis and latent Dirichlet allocation topic modeling on textual data from Twitter published during the three lockdown waves in Malaysia between 2020 and 2021. Three lockdown measures were identified; the related data for the first two weeks of each lockdown were collected and analysed to understand the public sentiment. The changes between these lockdowns were identified, and the latent topics were highlighted. Most of the public sentiment focused on the first lockdown, as reflected in the large number of latent topics generated during this period. The overall sentiment for each lockdown was mostly positive, followed by neutral and then negative. Topic modelling results identified staying at home, quarantine and lockdown as the main aspects of discussion for the first lockdown, whilst the importance of health measures and government efforts were the main aspects for the second and third lockdowns. Governments may utilise these findings to understand public sentiment and to formulate precautionary measures that can assure the safety of their citizens and tend to their most pressing problems. These results also highlight the importance of positive messaging during difficult times, establishing digital interventions and formulating new policies to improve the reaction of the public to emergency situations. Taiwan. Ministry of Science and Technology; 108-2511-H-224-007-MY
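
    As an illustration of this kind of pipeline (not the study's actual code: the libraries, the toy tweets and the parameter choices below are all assumptions), the sketch scores tweet sentiment with NLTK's VADER analyser and fits a small latent Dirichlet allocation model with scikit-learn.

    ```python
    # Illustrative sketch, not the study's pipeline. Assumes nltk and scikit-learn.
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    nltk.download("vader_lexicon", quiet=True)

    tweets = [  # toy stand-ins for collected lockdown tweets
        "Staying home again, but we will get through this lockdown together",
        "Quarantine rules extended, groceries delivered, feeling okay",
        "Frustrated with the new movement control order announcement",
    ]

    # 1) Sentiment: VADER's compound score maps each tweet to positive/neutral/negative.
    sia = SentimentIntensityAnalyzer()
    for text in tweets:
        c = sia.polarity_scores(text)["compound"]
        label = "positive" if c > 0.05 else "negative" if c < -0.05 else "neutral"
        print(f"{label:8s} {c:+.2f}  {text}")

    # 2) Topics: bag-of-words counts feed an LDA model; top words summarise each topic.
    vec = CountVectorizer(stop_words="english")
    counts = vec.fit_transform(tweets)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
    terms = vec.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[-5:][::-1]]
        print("topic", k, ":", ", ".join(top))
    ```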

    Shrinking a large dataset to identify variables associated with increased risk of Plasmodium falciparum infection in Western Kenya

    Large datasets are often not amenable to analysis using traditional single-step approaches. Here, our general objective was to apply imputation techniques, principal component analysis (PCA), elastic net and generalized linear models to a large dataset in a systematic approach to extract the most meaningful predictors for a health outcome. We extracted predictors of Plasmodium falciparum infection from a large covariate dataset, while facing limited numbers of observations, using data from the People, Animals, and their Zoonoses (PAZ) project to demonstrate these techniques: data collected from 415 homesteads in western Kenya contained over 1500 variables describing the health, environment, and social factors of the humans, the livestock, and the homesteads in which they reside. The wide, sparse dataset was simplified to 42 predictors of P. falciparum malaria infection, and wealth rankings were produced for all homesteads. The 42 predictors make biological sense and are supported by previous studies. The systematic data-mining approach we used would make many large datasets more manageable and informative for decision-making processes and health policy prioritization.
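
    A minimal sketch of this shrinkage workflow appears below. It is not the paper's code: scikit-learn, the synthetic data standing in for the PAZ survey, the choice of median imputation, the elastic-net mixing parameter and the use of the first principal component of asset variables as a wealth score are all assumptions made for illustration.

    ```python
    # Illustrative sketch of the shrinkage workflow, not the paper's analysis.
    # Assumes numpy and scikit-learn; synthetic data stands in for the PAZ survey.
    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegressionCV
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    n_homesteads, n_vars = 415, 200             # a wide covariate matrix
    X = rng.normal(size=(n_homesteads, n_vars))
    X[rng.random(X.shape) < 0.2] = np.nan       # missing values to be imputed
    y = rng.integers(0, 2, size=n_homesteads)   # P. falciparum infection (0/1)

    # Wealth ranking: first principal component of (here, the first 20) asset variables.
    assets = SimpleImputer(strategy="median").fit_transform(X[:, :20])
    score = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(assets))
    wealth_rank = score[:, 0].argsort().argsort()   # 0 = lowest-ranked homestead

    # Variable selection: impute, standardise, then an elastic-net-penalised GLM
    # (logistic regression); cross-validation picks the penalty strength.
    model = make_pipeline(
        SimpleImputer(strategy="median"),
        StandardScaler(),
        LogisticRegressionCV(penalty="elasticnet", solver="saga",
                             l1_ratios=[0.5], Cs=10, max_iter=5000),
    )
    model.fit(X, y)
    coefs = model.named_steps["logisticregressioncv"].coef_.ravel()
    print(len(np.flatnonzero(coefs)), "predictors retained out of", n_vars)
    ```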

    Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector

    A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb−1 of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results

    Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC

    Measurements of inclusive jet suppression in heavy ion collisions at the LHC provide direct sensitivity to the physics of jet quenching. In a sample of lead-lead collisions at sqrt(s_NN) = 2.76 TeV corresponding to an integrated luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the anti-kt algorithm with values of the distance parameter, which determines the nominal jet radius, of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of the jet yield is characterized by the jet "central-to-peripheral ratio", Rcp. Jet production is found to be suppressed by approximately a factor of two in the 10% most central collisions relative to peripheral collisions. Rcp varies smoothly with centrality as characterized by the number of participating nucleons. The observed suppression is only weakly dependent on jet radius and transverse momentum. These results provide the first direct measurement of inclusive jet suppression in heavy ion collisions and complement previous measurements of dijet transverse energy imbalance at the LHC. Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables, submitted to Physics Letters B. All figures including auxiliary figures are available at http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
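
    For reference, Rcp is conventionally defined as the per-event jet yield in a central centrality bin divided by the yield in a peripheral bin, each scaled by the corresponding mean number of binary nucleon-nucleon collisions. The LaTeX below writes out this standard form; the exact binning and normalisation conventions are those of the paper.

    ```latex
    % Standard form of the jet central-to-peripheral ratio (conventions as
    % commonly used in heavy-ion measurements; details follow the paper).
    \[
    R_{\mathrm{CP}}(p_{\mathrm{T}}) =
    \frac{\left.\dfrac{1}{\langle N_{\mathrm{coll}}\rangle}\,
               \dfrac{1}{N_{\mathrm{evt}}}\,
               \dfrac{\mathrm{d}N_{\mathrm{jet}}}{\mathrm{d}p_{\mathrm{T}}}\right|_{\mathrm{central}}}
         {\left.\dfrac{1}{\langle N_{\mathrm{coll}}\rangle}\,
               \dfrac{1}{N_{\mathrm{evt}}}\,
               \dfrac{\mathrm{d}N_{\mathrm{jet}}}{\mathrm{d}p_{\mathrm{T}}}\right|_{\mathrm{peripheral}}}
    \]
    ```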

    Search for displaced vertices arising from decays of new heavy particles in 7 TeV pp collisions at ATLAS

    We present the results of a search for new, heavy particles that decay at a significant distance from their production point into a final state containing charged hadrons in association with a high-momentum muon. The search is conducted in a pp-collision data sample with a center-of-mass energy of 7 TeV and an integrated luminosity of 33 pb^-1 collected in 2010 by the ATLAS detector operating at the Large Hadron Collider. Production of such particles is expected in various scenarios of physics beyond the standard model. We observe no signal and place limits on the production cross-section of supersymmetric particles in an R-parity-violating scenario as a function of the neutralino lifetime. Limits are presented for different squark and neutralino masses, enabling extension of the limits to a variety of other models. Comment: 8 pages plus author list (20 pages total), 8 figures, 1 table, final version to appear in Physics Letters

    Measurement of the polarisation of W bosons produced with large transverse momentum in pp collisions at sqrt(s) = 7 TeV with the ATLAS experiment

    This paper describes an analysis of the angular distribution of W->enu and W->munu decays, using data from pp collisions at sqrt(s) = 7 TeV recorded with the ATLAS detector at the LHC in 2010, corresponding to an integrated luminosity of about 35 pb^-1. Using the decay lepton transverse momentum and the missing transverse energy, the W decay angular distribution projected onto the transverse plane is obtained and analysed in terms of helicity fractions f0, fL and fR over two ranges of W transverse momentum (ptw): 35 < ptw < 50 GeV and ptw > 50 GeV. Good agreement is found with theoretical predictions. For ptw > 50 GeV, the values of f0 and fL-fR, averaged over charge and lepton flavour, are measured to be: f0 = 0.127 +/- 0.030 +/- 0.108 and fL-fR = 0.252 +/- 0.017 +/- 0.030, where the first uncertainties are statistical, and the second include all systematic effects. Comment: 19 pages plus author list (34 pages total), 9 figures, 11 tables, revised author list, matches European Journal of Physics C version
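
    For reference, the helicity fractions enter through the standard decomposition of the charged-lepton angular distribution, written below for the polar angle theta in the W rest frame; the measurement itself uses a transverse-plane projection of this angle, and the sign convention shown (upper signs for W+, lower for W-) is one common choice.

    ```latex
    % Standard helicity decomposition of the lepton decay angle in W -> l nu
    % (upper signs for W+, lower for W-, under a common convention).
    \[
    \frac{1}{\sigma}\frac{\mathrm{d}\sigma}{\mathrm{d}\cos\theta}
      = \frac{3}{8}\, f_{\mathrm{L}}\,(1 \mp \cos\theta)^{2}
      + \frac{3}{8}\, f_{\mathrm{R}}\,(1 \pm \cos\theta)^{2}
      + \frac{3}{4}\, f_{0}\,\sin^{2}\theta ,
    \qquad f_{\mathrm{L}} + f_{\mathrm{R}} + f_{0} = 1 .
    \]
    ```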

    Observation of a new chi_b state in radiative transitions to Upsilon(1S) and Upsilon(2S) at ATLAS

    The chi_b(nP) quarkonium states are produced in proton-proton collisions at the Large Hadron Collider (LHC) at sqrt(s) = 7 TeV and recorded by the ATLAS detector. Using a data sample corresponding to an integrated luminosity of 4.4 fb^-1, these states are reconstructed through their radiative decays to Upsilon(1S,2S) with Upsilon->mu+mu-. In addition to the mass peaks corresponding to the decay modes chi_b(1P,2P)->Upsilon(1S)gamma, a new structure centered at a mass of 10.530+/-0.005 (stat.)+/-0.009 (syst.) GeV is also observed, in both the Upsilon(1S)gamma and Upsilon(2S)gamma decay modes. This is interpreted as the chi_b(3P) system. Comment: 5 pages plus author list (18 pages total), 2 figures, 1 table, corrected author list, matches final version in Physical Review Letters