15 research outputs found

    Key Parameters in Identifying Cost of Spam 2.0

    This paper aims to provide an analytical view of estimating the cost of Spam 2.0. For this purpose, we define the web spam lifecycle and its associated impact. We also enlist 5 stakeholders and focus on defining 5 cost calculations using a large collection of references. The cost of web spam can then be calculated with the definition of 13 parameters. Detailed explanations of the web spam cost impacts are given with regard to the four main stakeholders: spammer, application provider, content provider and content consumer. Ongoing research in developing honey spam is also presented in this paper.

    Task mapping and routing optimization for hard real-time Networks-on-Chip

    Interference from high-priority tasks and messages in a hard real-time Network-on-Chip (NoC) creates computation and communication delays. As the delays increase, maintaining the system's schedulability becomes difficult. One way to overcome this problem is to reduce interference in the NoC by changing the task mapping and network routing. Some population-based heuristics evaluate the worst-case response times of tasks and messages based on schedulability analysis, but they require a significant amount of optimization time due to the complexity of the evaluation function. In this paper, we propose an optimization technique that explores both parameters simultaneously with the aim of meeting the schedulability of the system, hence reducing the optimization time. One advantage of our approach is that it avoids repeated calls to the evaluation function, which is unaddressed in heuristics that configure design parameters in stages. The results show that a schedulable configuration can be found in the large design space.
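    Although the paper's exact analysis is not reproduced in the abstract, the classic fixed-priority response-time test that such schedulability evaluations build on can be sketched as follows; the task set and its parameters are purely illustrative, not taken from the paper.

```python
# Sketch: classic worst-case response-time analysis for fixed-priority tasks,
# the kind of evaluation function a mapping/routing search would call.
# Task parameters below are made up for illustration.
import math

def response_time(task, higher_priority):
    """Iterate R = C + sum(ceil(R / T_j) * C_j) over higher-priority tasks
    until it reaches a fixed point or exceeds the deadline."""
    r = task["C"]
    while True:
        interference = sum(math.ceil(r / hp["T"]) * hp["C"]
                           for hp in higher_priority)
        r_next = task["C"] + interference
        if r_next == r:
            return r            # converged: worst-case response time
        if r_next > task["D"]:
            return None         # deadline miss: configuration unschedulable
        r = r_next

# C = worst-case execution time, T = period, D = deadline;
# the list is ordered from highest to lowest priority.
tasks = [
    {"C": 1, "T": 4, "D": 4},
    {"C": 2, "T": 6, "D": 6},
    {"C": 2, "T": 12, "D": 12},
]
for i, t in enumerate(tasks):
    print(response_time(t, tasks[:i]))  # prints 1, 3, 6
```

    A search over mappings and routings would call such a test for each candidate configuration, which is why caching or avoiding repeated evaluations matters for optimization time.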

    Optimized Cover Selection for Audio Steganography Using Multi-Objective Evolutionary Algorithm

    Existing embedding techniques depend on cover audio selected by users. Unknowingly, users may make a poor cover audio selection that is not optimised for capacity or imperceptibility, which could reduce the effectiveness of any embedding technique. As a trade-off exists between capacity and imperceptibility, producing a method that optimises both features is crucial. One of the search methods commonly used to find solutions to such trade-off problems in various fields is the Multi-Objective Evolutionary Algorithm (MOEA). Therefore, this research proposes a new method for optimising cover audio selection for audio steganography using the Non-dominated Sorting Genetic Algorithm-II (NSGA-II), which falls under the MOEA Pareto-dominance paradigm. The proposed method suggests cover audio to users based on imperceptibility and capacity features. A sample-difference calculation was first formulated to determine the maximum capacity of each cover audio in the cover audio database. Next, NSGA-II was implemented to determine the optimised solutions based on the parameters provided by each chromosome. The experimental results demonstrate the effectiveness of the proposed method, as it managed to dominate the solutions from the previous method, which selected cover audio based on one criterion only. In addition, by considering the trade-off, the proposed method ranked as its highest priority a solution that the previous method placed as low as 71st in its priority ranking. In conclusion, the method optimises the cover audio selected, thus improving the effectiveness of the audio steganography used. It can also help users of computers and mobile devices who remain unfamiliar with audio steganography in an age where information security is crucial.
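    As a rough illustration of the Pareto-dominance notion underlying NSGA-II selection (not the paper's actual implementation), the following sketch scores hypothetical cover audio candidates on two objectives, capacity and imperceptibility, and keeps only the non-dominated ones; all numbers are made up.

```python
# Sketch of the Pareto-dominance test behind NSGA-II style selection.
# Each candidate is a (capacity, imperceptibility) pair, both to be maximised.
def dominates(a, b):
    """True if a is at least as good as b in every objective
    and strictly better in at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(candidates):
    """Keep only the candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

covers = [(0.9, 0.2), (0.6, 0.6), (0.3, 0.9), (0.5, 0.5)]
# (0.5, 0.5) is dominated by (0.6, 0.6), so it drops out of the front.
print(pareto_front(covers))
```

    A single-criterion ranking would pick only one end of this front; keeping the whole front is what lets the method surface trade-off solutions that a one-criterion method buries deep in its ranking.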

    Preventive effects of Polygonum minus essential oil on cisplatin-induced hepatotoxicity in Sprague Dawley rats

    Cisplatin is a chemotherapeutic agent widely used in treating various types of cancer. However, its usage is restricted due to adverse hepatotoxicity, seen in approximately 36% of cancer patients receiving cisplatin treatment. Polygonum minus essential oil has a high antioxidant capacity and is enriched with terpenoids and phenolic compounds. The objective of this study was to investigate the effects of P. minus essential oil (PmEO) supplementation on cisplatin-induced hepatotoxicity in rats. Male rats were divided into seven groups, namely: control (C), cisplatin-induced (CP), positive control with β-caryophyllene 150 mg/kg (BCP), PmEO 100 mg/kg (PmEO100CP), PmEO 200 mg/kg (PmEO200CP), PmEO 400 mg/kg (PmEO400CP) and PmEO 400 mg/kg alone (PmEO400). PmEO and BCP were given orally for 14 days prior to a single-dose cisplatin (10 mg/kg) injection on day 15, and rats were sacrificed on day 18. Liver enzymes, histology, ultrastructural morphology and oxidative stress markers such as glutathione, glutathione peroxidase, catalase, superoxide dismutase and malondialdehyde were assayed. Compared to controls, levels of transaminase enzymes, serum bilirubin and oxidative stress were all increased in the CP, PmEO200CP and PmEO400CP groups. However, only the PmEO100CP and BCP groups showed reductions in these increases in transaminase enzymes and oxidative stress compared to the CP group. On both light microscopic and ultrastructural examination, the CP and PmEO400CP groups showed hepatotoxicity, exhibited by cytoplasmic vacuolation, congested blood sinusoids and an increased number of Kupffer cells. However, these changes were minimized in the PmEO100CP group. Therefore, we concluded that PmEO given at 100 mg/kg has a preventive effect against cisplatin-induced hepatotoxicity in rats.

    A study of cost, awareness, knowledge and perception of Spam 2.0

    This research undertakes two important studies of Spam 2.0. Firstly, it studies the cost of Spam 2.0, to identify the cost of storage and the loss of productivity. Secondly, it studies the level of public awareness, knowledge and perception of Spam 2.0 through a web-based survey involving 368 Internet users. Results indicate that Spam 2.0 incurs significant costs and that the level of awareness and knowledge among the public is quite high, but perception is undecided.

    Storage cost of Spam 2.0 in a web discussion forum

    This paper presents empirical research that identifies the cost of Spam 2.0. The experiment is part of ongoing research into identifying the cost of Spam 2.0 and focuses only on storage cost. The data was collected via a honeypot set up using a discussion forum over a period of 13 months. Forums provide a good place for spammers to continue their spamming activities, and spamming imposes both direct and indirect costs on forum owners and forum users. In this paper, we present a method to measure direct cost, focusing only on storage cost. The main observation of the experiment covers 450,772 posts, 141 personal messages and 62,798 profiles, which together use 2.69 GB of storage space. We first define our cost formula. We then set up a web-based discussion forum and collect the information posted on it. This data is pre-processed to discover information that can be used in our formula. In order to identify the storage used for spam, we define related attributes based on maximum storage and impact-factor features, named spam units, and measure the storage taken by all these spam units. We evaluate the cost of storage based on three sources: our real self-hosted server, a commercial web hosting package and a cloud hosting package. The experiment found that the storage costs for our research forum are AUD 23.66 for the self-hosted server, AUD 133.90 for commercial web hosting and AUD 11.53 for cloud hosting. The highest storage costs per 10,000 spam posts, profiles and personal messages are AUD 2.963, AUD 0.068 and AUD 0.056 respectively.
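    The paper's exact cost formula is not reproduced in the abstract, so the following is only a hedged sketch of how a per-unit storage cost of this kind might be computed; the unit size and hosting rate below are placeholders, not the paper's measured values.

```python
# Hypothetical sketch of a spam storage-cost calculation:
# cost = total spam storage (GB) x hosting price per GB.
# The 5 KB unit size and AUD 10/GB rate are assumed placeholders.
def storage_cost(spam_units, bytes_per_unit, aud_per_gb):
    """Return the hosting cost in AUD for the storage consumed by spam."""
    gigabytes = spam_units * bytes_per_unit / (1024 ** 3)
    return gigabytes * aud_per_gb

# e.g. 10,000 spam posts at an assumed 5 KB each, on an assumed AUD 10/GB plan
print(round(storage_cost(10_000, 5 * 1024, 10.0), 4))  # → 0.4768
```

    The same calculation can be repeated with each hosting source's actual per-GB rate (self-hosted, commercial, cloud) to compare costs, as the paper does across its three sources.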

    Factors Involved in Estimating Cost of Email Spam

    This paper analyses existing research to identify all possible factors involved in estimating the cost of spam. The main motivation of this paper is to provide an unbiased spam cost estimation. For that, we first study the email spam lifecycle and identify all possible stakeholders. We then categorise costs and study the impact on each stakeholder. This initial study will form the backbone of the real-time spam cost calculating engine that we are developing for Australia.

    Spam 2.0

    In this paper, we provide a high-level overview of Spam 2.0: how it works, its impacts and its categorizations (annoying, tricky, deceiving and evil). We also describe the existing approaches taken to combat Spam 2.0, including the detection approach, the prevention approach and the early-detection approach. Three techniques based on the detection approach are presented in this paper: content-based, metadata-based and user-flagging-based. We also explore several open issues in this area, including problems regarding tools and technologies, awareness and responsibility, and spam and spammers. Issues discussed regarding awareness and responsibility are users' lack of awareness, governments' inaction in tackling Spam 2.0, companies' apathy in combating it, the lack of collaboration between countries, and unclear accountabilities in this regard. The paper also identifies future trends for both anti-spammers and spammers. Anti-spammers will likely focus their efforts more on behaviour-based techniques and produce more language-independent tools. Implementing dynamic forms and forcing every user to actually go through the registration form will be good ways to control spam. From a monetary perspective, estimating the intangible costs associated with Spam 2.0 will help raise public awareness of spamming. The spammers, on the other hand, will predictably continue to find methods to decrease the filters' efficiency by imitating real users' behaviour and finding other spamming opportunities.

    Awareness, Knowledge and Perception of Online Spam

    Online spam is a new form of spamming that uses Web 2.0 applications as platforms. It can proliferate easily, in spite of first-layer security measures such as detection and prevention software being in place, because of a lack of awareness and knowledge on the part of Internet users. It not only creates a nuisance for Internet users; it may also lead to bigger problems, such as cybercrime involving hacking, phishing, etc. This paper presents a descriptive analysis of a web-based survey of 368 Internet users on their awareness, knowledge and perception of online spam. The purpose of the survey was to gauge Internet users' awareness and knowledge of online spam, and to investigate their perception of different aspects of the problem. To the best of our knowledge, it was the first survey conducted to highlight and investigate the issues involving online spam and, as such, the paper is a unique and pioneering contribution to the field.

    Systematic literature review: Trend analysis on the design of lightweight block cipher

    Lightweight block ciphers have become a standard for security protection on IoT devices. Advanced technology is required to secure data, and encryption is the method that can provide information security. Previous comparisons of lightweight algorithms in the literature focus on their performance and implementation; however, little analysis has been done on the relationship between an algorithm's components and its security strength. This information is crucial for developers designing secure algorithms. In this paper, a comprehensive systematic literature review of 101 existing lightweight algorithms is presented. The review focuses on the security aspect of lightweight algorithms, covering the identification of secure design components based on substitution and permutation. Security analysis and the evolution of lightweight algorithms are also presented. The results and discussion examine the selection of substitution and permutation functions and analyse their impact on security strength. Recommendations from developers' insights on methods and considerations for designing an algorithm are also presented. Findings from the research indicate that various techniques can be used to develop a secure algorithm. Most importantly, an algorithm must provide confusion and diffusion properties in its design to ensure sufficient security.
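    To make the confusion and diffusion properties concrete, here is a toy single-round substitution-permutation sketch. This is not any real lightweight cipher: the S-box values follow the well-known PRESENT S-box, but the 16-bit state size and the bit permutation are illustrative choices made for this example.

```python
# Toy 16-bit substitution-permutation round illustrating confusion
# (S-box substitution) and diffusion (bit permutation). Not a real cipher.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]  # PRESENT's 4-bit S-box

def substitute(state):
    """Confusion: pass each 4-bit nibble of the state through the S-box."""
    out = 0
    for i in range(4):
        nibble = (state >> (4 * i)) & 0xF
        out |= SBOX[nibble] << (4 * i)
    return out

def permute(state):
    """Diffusion: spread bits by sending bit i to position (4*i) mod 15
    (bit 15 stays in place). gcd(4, 15) = 1 makes this a bijection."""
    out = 0
    for i in range(16):
        dst = 15 if i == 15 else (4 * i) % 15
        out |= ((state >> i) & 1) << dst
    return out

def round_fn(state, round_key):
    """One SPN round: key mixing, then substitution, then permutation."""
    return permute(substitute(state ^ round_key))

print(hex(round_fn(0x0000, 0xFFFF)))  # → 0xf0
```

    Repeating such rounds with fresh round keys is what lets small S-boxes and cheap bit permutations build up the confusion and diffusion that the reviewed designs rely on.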