
    Obfuscation-based malware update: A comparison of manual and automated methods

    Indexed in: Scopus; Web of Science. This research presents a proposal for malware classification and updating based on capacity and obfuscation. This article is an extension of [4], and describes the procedure for malware updating: taking obsolete malware that is already detectable by antivirus software and updating it through obfuscation techniques so that it becomes undetectable again. As the updating of malware is generally performed manually, an automatic solution is presented together with a comparison from the standpoint of cost and processing time. The automated method proved to be more reliable, faster and less resource-intensive, especially in terms of antivirus analysis and malware functionality checking times. http://univagora.ro/jour/index.php/ijccc/article/view/2961/112

    Always in control? Sovereign states in cyberspace

    For well over twenty years, we have witnessed an intriguing debate about the nature of cyberspace. Used for everything from communication to commerce, it has transformed the way individuals and societies live. But how has it impacted the sovereignty of states? An initial wave of scholars argued that it had dramatically diminished centralised control by states, helped by a tidal wave of globalisation and freedom. These libertarian claims were considerable. More recently, a new wave of writing has argued that states have begun to recover control in cyberspace, focusing on either the police work of authoritarian regimes or the revelations of Edward Snowden. Both claims were wide of the mark. By contrast, this article argues that we have often misunderstood the materiality of cyberspace and its consequences for control. It not only challenges the libertarian narrative of freedom; it also suggests that the anarchic imaginary of the Internet as a ‘Wild West’ was deliberately promoted by states in order to distract from the reality. The Internet, like previous forms of electronic connectivity, consists mostly of a physical infrastructure located in specific geographies and jurisdictions. Rather than circumscribing sovereignty, it has offered centralised authority new ways of conducting statecraft. Indeed, the Internet, high-speed computing, and voice recognition were all the result of security research by a single information hegemon, which has therefore always been in control.

    Friends Newsletter, Fall, 2010


    Flow of emotional messages in artificial social networks

    Models of message flows in an artificial group of users communicating via the Internet are introduced and investigated using numerical simulations. We assumed that messages possess an emotional character with a positive valence and that the willingness to send the next affective message to a given person increases with the number of messages received from this person. As a result, the weights of links between group members evolve over time. Memory effects are introduced, taking into account that the preferential selection of message receivers depends on the communication intensity during the recent period only. We also model the phenomenon of secondary social sharing, when the reception of an emotional e-mail triggers the distribution of several emotional e-mails to other people.
    Comment: 10 pages, 7 figures, submitted to International Journal of Modern Physics
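    The abstract above describes a simulation with three ingredients: preferential selection of receivers weighted by recently received messages, a finite memory window, and probabilistic secondary sharing. A minimal toy sketch of such a model follows; the function name, the parameter values, and the linear "+1-smoothed" preference rule are illustrative assumptions, not the paper's exact specification.

    ```python
    import random
    from collections import deque

    def simulate(n_agents=10, steps=200, memory=50, share_prob=0.2, seed=1):
        """Toy model of emotional message flow in an artificial group.

        Each agent preferentially sends to peers it has recently received
        messages from (weight = 1 + count within a sliding memory window),
        and an incoming message may trigger secondary social sharing.
        Returns the matrix of message counts sent[i][j].
        """
        rng = random.Random(seed)
        # recent[i]: senders of the last `memory` messages agent i received
        recent = [deque(maxlen=memory) for _ in range(n_agents)]
        sent = [[0] * n_agents for _ in range(n_agents)]

        def choose_receiver(sender):
            # weight each candidate by 1 + messages recently received from them
            candidates = [j for j in range(n_agents) if j != sender]
            weights = [1 + recent[sender].count(j) for j in candidates]
            return rng.choices(candidates, weights=weights)[0]

        for _ in range(steps):
            s = rng.randrange(n_agents)
            r = choose_receiver(s)
            sent[s][r] += 1
            recent[r].append(s)
            # secondary social sharing: the receiver may pass the emotion on
            if rng.random() < share_prob:
                r2 = choose_receiver(r)
                sent[r][r2] += 1
                recent[r2].append(r)
        return sent
    ```

    Because the memory window is a bounded deque, only the recent communication intensity shapes the link weights, matching the memory effect described above.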

    Building communities for the exchange of learning objects: theoretical foundations and requirements

    In order to reduce the overall costs of developing high-quality digital courses (including both the content and the learning and teaching activities), the exchange of learning objects has been recognized as a promising solution. This article makes an inventory of the issues involved in the exchange of learning objects within a community. It explores some basic theories, models and specifications and provides a theoretical framework containing the functional and non-functional requirements for establishing an exchange system in the educational field. Three levels of requirements are discussed. First, non-functional requirements deal with the technical conditions needed to make learning objects interoperable. Second, some basic use cases (activities) are identified that must be facilitated to enable the technical exchange of learning objects, e.g. searching and adapting the objects. Third, some basic use cases are identified that are required to establish the exchange of learning objects in a community, e.g. policy management, information and training. The implications of this framework are then discussed, including recommendations concerning the identification of reward systems, role changes and evaluation instruments.

    The Enigma of Digitized Property: A Tribute to John Perry Barlow

    Compressive sensing has attracted a lot of attention over the last decade within the areas of applied mathematics, computer science and electrical engineering because it suggests that we can sample a signal below the limit that traditional sampling theory prescribes. By then using different recovery algorithms we are able, theoretically, to recover the complete original signal even though we have taken very few samples to begin with. It has been proven that these recovery algorithms work best on signals that are highly compressible, meaning that the signals have a sparse representation in which the majority of the signal elements are close to zero. In this thesis we implement some of these recovery algorithms and investigate how they perform in practice on a real video signal consisting of 300 sequential image frames. The video signal is undersampled, using compressive sensing, and then recovered using two types of strategies:
    - One where no time correlation between successive frames is assumed, using the classical greedy algorithm Orthogonal Matching Pursuit (OMP) and a more robust, modified OMP called Predictive Orthogonal Matching Pursuit (PrOMP).
    - One newly developed algorithm, Dynamic Iterative Pursuit (DIP), which assumes and utilizes time correlation between successive frames.
    We then evaluate and compare the performance of these two strategies using the Peak Signal to Noise Ratio (PSNR) as a metric. We also provide visual results. Based on investigation of the data in the video signal, using a simple model for the time correlation and transition probabilities between different signal coefficients in time, the DIP algorithm showed good recovery performance. The main results showed that DIP performed better and better over time and outperformed PrOMP by up to 6 dB at half of the original sampling rate, but performed slightly below PrOMP in a smaller part of the video sequence where the correlation in time between successive frames suddenly became weaker.
    Compressive sensing has become more and more noted over the last decade in research areas such as applied mathematics, computer science and electrical engineering. A major reason for this is that its theory makes it possible to sample a signal below the limit that traditional sampling theory imposes. By then using different recovery algorithms, it is still theoretically possible to recover the original signal. It has been shown that these recovery algorithms work best on signals that are highly compressible, meaning that such signals can be represented sparsely in some domain where the majority of the coefficients are close to 0 in value. In this thesis some of these recovery algorithms are implemented, and we then investigate how they perform in practice on a real video signal consisting of 300 sequential frames. The video signal is undersampled with compressive sensing and then recovered using two types of strategies:
    - One where no time correlation between successive frames in the video signal is assumed, using classical algorithms such as Orthogonal Matching Pursuit (OMP) and a more robust, modified OMP: Predictive Orthogonal Matching Pursuit (PrOMP).
    - One newly developed algorithm, Dynamic Iterative Pursuit (DIP), which assumes and exploits time correlation between successive frames in the video signal.
    We evaluate and compare the performance of these two types of strategies using the Peak Signal to Noise Ratio (PSNR) as the comparison metric. We also give visual results from the video sequence. Based on investigation of the data in the video signal, it turned out, using simple models both for the time correlation and for the probability functions of which coefficients are active at each time instant, that the DIP algorithm showed better performance than the two other time-independent algorithms during certain time sequences, above all the sequences where the video signal contained stronger correlation in time. At most, DIP performed up to 6 dB better than OMP and PrOMP.
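    The greedy baseline named in the abstract, Orthogonal Matching Pursuit, can be sketched in a few lines. This is a minimal textbook-style implementation under the usual assumptions (a measurement matrix A, measurements y = Ax, and a known sparsity level k); it is not the thesis code, and the names here are illustrative.

    ```python
    import numpy as np

    def omp(A, y, k):
        """Orthogonal Matching Pursuit: greedily recover a k-sparse x with y ~ A @ x."""
        residual = y.astype(float).copy()
        support = []                     # indices of the selected columns (atoms)
        x = np.zeros(A.shape[1])
        for _ in range(k):
            # pick the column most correlated with the current residual
            idx = int(np.argmax(np.abs(A.T @ residual)))
            if idx not in support:
                support.append(idx)
            # least-squares fit of y on the columns selected so far
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        x[support] = coef
        return x
    ```

    For generic undersampled measurement matrices, exact recovery additionally requires the usual incoherence/RIP-type conditions on A; PrOMP and DIP, as described above, modify the atom-selection step rather than this overall greedy structure.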
