
    Cross-Layer Peer-to-Peer Track Identification and Optimization Based on Active Networking

    P2P applications appear to be emerging as the ultimate killer applications due to their ability to construct highly dynamic overlay topologies with rapidly varying and unpredictable traffic dynamics, which can pose a serious challenge even for significantly over-provisioned IP networks. As a result, ISPs are facing new, severe network management problems that are not guaranteed to be addressed by statically deployed network engineering mechanisms. As a first step towards a more complete solution to these problems, this paper proposes a P2P measurement, identification and optimisation architecture designed to cope with the dynamicity and unpredictability of existing, well-known P2P systems as well as future, unknown ones. The purpose of this architecture is to provide ISPs with an effective and scalable approach to control and optimise the traffic produced by P2P applications in their networks. This can be achieved through a combination of different application- and network-level programmable techniques, leading to a cross-layer identification and optimisation process. These techniques can be applied using Active Networking platforms, which are able to quickly and easily deploy architectural components on demand. This flexibility of the optimisation architecture is essential to address the rapid development of new P2P protocols and the variation of known protocols.
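    As an illustration only (the abstract does not spell out a concrete classifier), the sketch below shows the kind of flow-level heuristic such a cross-layer identification stage might combine with application-level probes; every feature name and threshold is an assumption chosen for readability, not the paper's method.

```python
# Illustrative sketch (not the paper's method): a flow-level heuristic that a
# cross-layer identification stage might combine with application-level probes.
# Feature names and thresholds are assumptions chosen for readability.

from dataclasses import dataclass

@dataclass
class FlowStats:
    distinct_peers: int   # distinct remote hosts contacted by the source
    distinct_ports: int   # distinct remote ports used
    up_bytes: int
    down_bytes: int

def looks_like_p2p(flow: FlowStats) -> bool:
    """Flag flows whose connection pattern resembles a P2P overlay."""
    many_peers = flow.distinct_peers >= 20        # fan-out typical of overlays
    scattered_ports = flow.distinct_ports >= 10   # no single well-known service port
    symmetric = 0.3 <= flow.up_bytes / max(flow.down_bytes, 1) <= 3.0
    return many_peers and scattered_ports and symmetric

if __name__ == "__main__":
    print(looks_like_p2p(FlowStats(distinct_peers=45, distinct_ports=38,
                                   up_bytes=5_000_000, down_bytes=4_200_000)))
```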

    Accelerated focused crawling through online relevance feedback

    The organization of HTML into a tag tree structure, which is rendered by browsers as roughly rectangular regions with embedded text and HREF links, greatly helps surfers locate and click on links that best satisfy their information need. Can an automatic program emulate this human behavior and thereby learn to predict the relevance of an unseen HREF target page w.r.t. an information need, based on information limited to the HREF source page? Such a capability would be of great interest in focused crawling and resource discovery, because it can fine-tune the priority of unvisited URLs in the crawl frontier and reduce the number of irrelevant pages that are fetched and discarded.
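    A minimal sketch of the frontier re-prioritisation idea: unvisited URLs are ranked by a relevance estimate computed from the page that links to them. The estimate_relevance function below is a keyword-overlap stand-in for a learned model, not the paper's classifier.

```python
# Illustrative sketch of a focused-crawl frontier: unvisited URLs are prioritised
# by a relevance estimate computed from their source (referring) page.
# estimate_relevance is a placeholder for a trained model, not the paper's learner.

import heapq

def estimate_relevance(anchor_text: str, surrounding_text: str) -> float:
    """Toy relevance estimate; a real crawler would use a trained classifier."""
    keywords = {"retrieval", "crawling", "index"}
    words = set((anchor_text + " " + surrounding_text).lower().split())
    return len(words & keywords) / len(keywords)

class Frontier:
    def __init__(self):
        self._heap = []     # max-heap via negated scores
        self._seen = set()

    def push(self, url, anchor_text, surrounding_text):
        if url in self._seen:
            return
        self._seen.add(url)
        score = estimate_relevance(anchor_text, surrounding_text)
        heapq.heappush(self._heap, (-score, url))

    def pop(self):
        score, url = heapq.heappop(self._heap)
        return url, -score

frontier = Frontier()
frontier.push("http://example.org/a", "focused crawling tutorial", "resource discovery and crawling")
frontier.push("http://example.org/b", "holiday photos", "family album")
print(frontier.pop())   # the crawl fetches the most promising URL first
```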

    Implementing a Search Engine for Bangladeshi E-Commerce Product

    This project is concerned with the practical implementation of Information Retrieval, where the main focus is on the algorithmic challenges of efficiently representing large data sets while supporting fast searches on the Web. The application was developed using a systems approach in which analysis, design and development were carried out following the incremental model. The main aim of this work is to introduce an efficient Information Retrieval system for a Bangladeshi e-commerce product search engine. This search engine is based on the technology of public search engines and is built specifically for the structure of Bangladeshi e-commerce. The system provides relevant search results from Bangladeshi web domains. The proposed system has been designed and developed using Python programming language tools and methods.
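    A minimal sketch, assuming a simple whitespace tokeniser, of the inverted-index idea that underlies such a product search engine; the class and field names are illustrative, not the system's actual implementation.

```python
# Minimal inverted-index sketch for product search; the tokeniser and the
# conjunctive (all-terms-must-match) retrieval rule are simplifying assumptions,
# not the deployed system.

from collections import defaultdict

def tokenize(text: str):
    return text.lower().split()

class ProductIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of product ids
        self.products = {}

    def add(self, product_id: int, title: str):
        self.products[product_id] = title
        for term in tokenize(title):
            self.postings[term].add(product_id)

    def search(self, query: str):
        terms = tokenize(query)
        if not terms:
            return []
        # intersect posting lists so every query term must match
        hits = set.intersection(*(self.postings.get(t, set()) for t in terms))
        return [self.products[pid] for pid in sorted(hits)]

index = ProductIndex()
index.add(1, "Cotton panjabi for men")
index.add(2, "Leather wallet brown")
print(index.search("cotton panjabi"))
```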

    StaticFixer: From Static Analysis to Static Repair

    Static analysis tools are traditionally used to detect and flag programs that violate properties. We show that static analysis tools can also be used to perturb programs that satisfy a property to construct variants that violate the property. Using this insight we can construct paired data sets of unsafe-safe program pairs, and learn strategies to automatically repair property violations. We present a system called StaticFixer, which automatically repairs information flow vulnerabilities using this approach. Since information flow properties are non-local (both to check and repair), StaticFixer also introduces a novel domain specific language (DSL) and strategy learning algorithms for synthesizing non-local repairs. We use StaticFixer to synthesize strategies for repairing two types of information flow vulnerabilities, unvalidated dynamic calls and cross-site scripting, and show that StaticFixer successfully repairs several hundred vulnerabilities from open source JavaScript repositories, outperforming neural baselines built using CodeT5 and Codex. Our datasets can be downloaded from http://aka.ms/StaticFixer.
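    A toy sketch of the "perturb safe programs into unsafe variants" idea: starting from a snippet that already sanitises untrusted input, dropping the sanitiser yields the paired violating program. The regex-based edit and the escapeHtml name are illustrative assumptions; StaticFixer itself works from analyser-derived information flows, not textual patterns.

```python
# Toy sketch of building unsafe/safe program pairs by perturbing safe code:
# given a snippet that sanitises untrusted input, remove the sanitiser call to
# obtain the property-violating twin. The regex-based edit is an illustration
# only; the real system operates on static-analysis results, not regexes.

import re

SAFE_SNIPPET = 'element.innerHTML = escapeHtml(userInput);'

def make_unsafe(safe_code: str, sanitizer: str = "escapeHtml") -> str:
    """Strip a wrapping sanitiser call to produce the unsafe variant."""
    return re.sub(rf'{sanitizer}\((.*?)\)', r'\1', safe_code)

unsafe = make_unsafe(SAFE_SNIPPET)
print(unsafe)                                     # element.innerHTML = userInput;
pair = {"unsafe": unsafe, "safe": SAFE_SNIPPET}   # one training example
```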

    Programming Not Only by Example

    In recent years, there has been tremendous progress in automated synthesis techniques that are able to automatically generate code based on some intent expressed by the programmer. A major challenge for the adoption of synthesis remains in having the programmer communicate their intent. When the expressed intent is coarse-grained (for example, a restriction on the expected type of an expression), the synthesizer often produces a long list of results for the programmer to choose from, shifting the heavy lifting to the user. An alternative approach, successfully used in end-user synthesis, is programming by example (PBE), where the user leverages examples to interactively and iteratively refine the intent. However, using only examples is not expressive enough for programmers, who can observe the generated program and refine the intent by directly relating to parts of the generated program. We present a novel approach to interacting with a synthesizer using a granular interaction model. Our approach employs a rich interaction model where (i) the synthesizer decorates a candidate program with debug information that assists in understanding the program and identifying good or bad parts, and (ii) the user is allowed to provide feedback not only on the expected output of a program, but also on the underlying program itself. That is, when the user identifies a program as (partially) correct or incorrect, they can also explicitly indicate the good or bad parts, allowing the synthesizer to accept or discard parts of the program instead of discarding the program as a whole. We show the value of our approach in a controlled user study. Our study shows that participants have a strong preference for using granular feedback instead of examples, and are able to provide granular feedback much faster.
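    A small sketch of what granular feedback could look like as data, assuming a simple part-level encoding of the candidate program; the representation below is an illustration, not the paper's actual interaction model or interface.

```python
# Illustrative sketch of granular feedback on a candidate program: instead of a
# whole-program accept/reject, the user marks individual subexpressions, and the
# synthesizer keeps or bans them in later candidates. The expression encoding is
# an assumption made for this example, not the paper's interface.

candidate = {
    "expr": "map(lambda x: x * 2, filter(is_even, xs))",
    "parts": ["x * 2", "is_even", "xs"],
}

feedback = {"x * 2": "keep", "is_even": "reject"}   # user's part-level verdicts

def next_search_constraints(feedback: dict):
    """Turn part-level verdicts into constraints for the next synthesis round."""
    must_use = [p for p, v in feedback.items() if v == "keep"]
    must_avoid = [p for p, v in feedback.items() if v == "reject"]
    return must_use, must_avoid

print(next_search_constraints(feedback))
# (['x * 2'], ['is_even'])  -- the next round only enumerates programs that keep
# the doubling and drop is_even, instead of restarting from scratch.
```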

    Maritime Robotics and Autonomous Systems Operations: Exploring Pathways for Overcoming International Techno-Regulatory Data Barriers

    The current regulatory landscape that applies to maritime service robotics, aptly termed robotics and autonomous systems (RAS), is quite complex. When it comes to patents, there are multifarious considerations in relation to vessel survey, inspection, and maintenance processes under national and international law. Adherence is challenging, given that the traditional delivery methods are viewed as unsafe, strenuous, and laborious. Service robotics, namely micro aerial vehicles (MAVs) or drones, magnetic-wheeled crawlers (crawlers), and remotely operated vehicles (ROVs), function by relying on the architecture of the Internet of Robotic Things. The aforementioned are being introduced as time-saving apparatuses, accompanied by the promise to acquire concrete and sufficient data for the identification of vessel structural weaknesses with the highest level of accuracy, to facilitate decision-making processes upon which temporary and permanent measures are contingent. Nonetheless, a noticeable critical issue associated with the effective deployment of RAS revolves around non-personal data governance, which comprises the main analytical focus of this research effort. The impetus behind this study stems from the need to enquire whether “data” provisions within the international technological regulatory (techno-regulatory) framework are sufficient, well organized, and harmonized so that there are no current or future conflicts with promulgated theoretical dimensions of data that drive all subject matter-oriented actions. As is noted from the relevant expository research, the challenges are many. Engineering RAS to perfection is not the be-all and end-all. Collateral impediments must be avoided. A safety net needs to be devised to protect non-personal data. The results here indicate that established data decision dimensions call for data security and protection, as well as a consideration of ownership and liability details. An analysis of the state of the art and the comparative results assert that the abovementioned remain neglected in the current international setting. The findings reveal specific data barriers within the existing international framework. The ways forward include strategic actions to remove data barriers towards the overall efficacy of maritime RAS operations. The overall findings indicate that an effective transition to RAS operations requires optimizing the international regulatory framework to open the pathways for effective RAS operations. Conclusions were drawn based on the premise that policy reform is inevitable in order to push the RAS agenda forward before the emanation of 6G and the era of the Internet of Everything, with harmonization and further standardization being very high-priority issues.

    Identifying New/Emerging Psychoactive Substances at the Time of COVID-19; A Web-Based Approach

    COVID-19-related disruptions to the circulation of people and goods can affect drug markets, especially for new psychoactive substances (NPS). Drug shortages could cause a change in available NPS, with the introduction of new, unknown substances. The aims of the current research were to use a web crawler, NPSfinder®, to identify and categorize emerging NPS discussed on a range of drug enthusiasts'/psychonauts' websites/fora at the time of the pandemic; social media for these identified NPS were screened as well. NPSfinder® was used here to automatically scan, 24/7, a list of psychonaut websites and NPS online resources. The NPS identified in the time frame between January and August 2020 were searched for in both the European Monitoring Centre for Drugs and Drug Addiction (EMCDDA) and United Nations Office on Drugs and Crime (UNODC) databases and on social media (Facebook, Twitter, Instagram, Pinterest, and YouTube), with a qualitative content analysis having been carried out on reddit.com. Of a total of 229 NPS being discussed at the time of the pandemic, 18 were identified for the first time by NPSfinder®. These included six cathinones, six opioids, two synthetic cannabinoid receptor agonists (SCRAs), two phenylcyclohexylpiperidine (PCP)-like molecules, and two psychedelics. Of these NPS, 10 were found to be previously unreported to either the UNODC or the EMCDDA. Of these 18 NPS, opioids and cathinones were the most discussed on social media/Reddit, with the highest number of associated threads. Current findings may support the use of both automated web crawlers and social listening approaches to identify emerging NPS; the pandemic-related restrictions may somehow influence the demand for specific NPS classes.
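    A minimal sketch of the screening step, assuming substance names have already been harvested from crawled fora; all names and sources below are fictional placeholders, not data from the study or NPSfinder®'s actual pipeline.

```python
# Minimal sketch of the screening step: substance names harvested from forum
# posts are compared against the reference databases, and anything unknown is
# flagged as potentially new. Names and the matching rule are illustrative only.

known_databases = {
    "EMCDDA": {"mephedrone", "etizolam"},
    "UNODC": {"mephedrone", "carfentanil"},
}

# Mentions collected from crawled fora; "examplazone" is a made-up placeholder.
harvested_mentions = {"mephedrone", "examplazone", "etizolam"}

known = set().union(*known_databases.values())
possibly_new = harvested_mentions - known
print(possibly_new)   # {'examplazone'} -> candidate for manual review
```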

    Cross-network Embeddings Transfer for Traffic Analysis

    Artificial Intelligence (AI) approaches have emerged as powerful tools to improve traffic analysis for network monitoring and management. However, the lack of large labeled datasets and the ever-changing networking scenarios make a fundamental difference compared to other domains where AI is thriving. We believe the ability to transfer the specific knowledge acquired in one network (or dataset) to a different network (or dataset) would be fundamental to speed up the adoption of AI-based solutions for traffic analysis and other networking applications (e.g., cybersecurity). We here propose and evaluate different options to transfer the knowledge built from a provider network, owning data and labels, to a customer network that desires to label its traffic but lacks labels. We formulate this problem as a domain adaptation problem that we solve with embedding alignment techniques and canonical transfer learning approaches. We present a thorough experimental analysis to assess the performance considering both supervised (e.g., classification) and unsupervised (e.g., novelty detection) downstream tasks related to darknet and honeypot traffic. Our experiments identify the appropriate transfer techniques for using the models obtained from one network in a different network. We believe our contribution opens new opportunities and business models where network providers can successfully share their knowledge and AI models with customers.
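    A compact sketch of one possible embedding-alignment option (orthogonal Procrustes over anchor points seen by both networks); the paper evaluates several alignment and transfer-learning techniques, and this is only an illustrative instance with assumed shapes and anchors.

```python
# Sketch of one embedding-alignment option: learn an orthogonal map from the
# customer network's embedding space to the provider's using shared anchor
# points (orthogonal Procrustes), so the provider's downstream models can be
# reused. Shapes and the anchor assumption are illustrative, not the paper's setup.

import numpy as np

def procrustes_align(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Return the orthogonal W minimising ||source @ W - target||_F."""
    u, _, vt = np.linalg.svd(source.T @ target)
    return u @ vt

rng = np.random.default_rng(0)
provider_anchors = rng.normal(size=(50, 16))            # embeddings of shared anchor flows
rotation = np.linalg.qr(rng.normal(size=(16, 16)))[0]   # unknown shift between the two spaces
customer_anchors = provider_anchors @ rotation.T

W = procrustes_align(customer_anchors, provider_anchors)
aligned = customer_anchors @ W
print(np.allclose(aligned, provider_anchors, atol=1e-6))   # True: spaces are aligned
```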