
    Determining the effectiveness of deceptive honeynets

    Over the last few years, incidents of network-based intrusions have increased rapidly, driven by the growing availability and popularity of attack tools that can easily be downloaded from the Internet. This rise in intrusions led to the development of a network defence known as the honeypot. Honeypots are designed to ensnare attackers and monitor their activities, and they use principles of deception such as masking, mimicry, decoying, inventing, repackaging and dazzling to deceive attackers. Deception exists in many forms; it is a tactic for surviving and defeating the motives of attackers, and because it occurs widely in nature it has long been used in warfare and, more recently, in information systems. This thesis considers the current state of honeypot technology and describes a framework for improving the effectiveness of honeypots through the effective use of deception. In this research, a seemingly legitimate corporate deceptive network is created using Honeyd (a type of honeypot), then attacked and improved using an empirical learning approach. The data collected during the attack exercises were analysed, using various measures, to determine the effectiveness of the deception in the honeypot network created with Honeyd. The results indicate that attackers were deceived into believing that the deceptive honeynet was a real network.
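    As a sketch of how a Honeyd-based deceptive host is typically described, a minimal configuration might look like the following (the template name, personality string and bound address here are illustrative, not taken from the thesis):

    ```
    # Define a fake Windows host (template and address are hypothetical).
    create windows
    set windows personality "Microsoft Windows XP Professional SP1"
    set windows default tcp action reset
    add windows tcp port 80 open
    add windows tcp port 139 open
    bind 192.168.1.10 windows
    ```

    The `personality` line is what lets Honeyd mimic another operating system's network stack when probed by fingerprinting tools, which is the mimicry principle of deception mentioned above.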

    Power management as a system-level inhibitor of modularity in the mobile computer industry

    Thesis (S.M.)--Massachusetts Institute of Technology, System Design & Management Program, 2004. Includes bibliographical references (p. 88-90).
    Since the mid-1990s, the computer industry has been highly modular with respect to both product architecture and industry structure. The growing market for mobile computers means that the challenges facing this segment are starting to shape the direction of the industry. This paper argues that power management in mobile computers is forcing the industry towards more integral product solutions and, hence, a more integral industry structure; that is, the industry is assuming a structure similar to the early days of mainframe computers, when a single firm delivered an entire proprietary, integral system. Furthermore, this trend towards greater integrality in mobile computer systems stems from fundamental physical attributes of the system: information-transfer systems lend themselves more readily to modular architectures than systems that transfer significant power. Thus, as processors and mobile computers become more powerful, they begin to behave more like power-transfer systems, and side effects of this power, such as heat, demand a more integral approach to managing them. A "free body" diagram framework is presented which provides a way of thinking about how integrality forces act on an industry's trajectory. Evidence is presented showing how the dominant player in the computer supply chain, Intel, exhibits this vertical/integral behaviour in a number of ways.
    by Samuel K. Weinstein. S.M.

    Laboratory information management system study & development of LIMS web platform application for CTCV - Coimbra

    The World Wide Web has not only changed processes and improved the user experience, but also dramatically changed how computer software is built. This profound evolution of software development has caused developers across the software industry to change the way they develop software. In this project, a Laboratory Information Management System (LIMS) for the staff and users of a small business was designed and developed using a throwaway prototyping methodology with a web architecture. Different development platforms are available on the market for building such an application, but per the company's requirements it was developed on the .NET Framework. This web application allows its data to be accessed from different devices, such as a tablet, a desktop or a smartphone, from remote locations anywhere in the world. A key feature of the application is activity monitoring: it records which activity was performed by which user, together with the corresponding date, time and a short description. The software therefore uses an industry-standard relational database management system (RDBMS) combined with a platform-independent web-browser interface for data entry and retrieval (a 3-tier architecture). The laboratory workflow steps facilitate the management and tracking of all tests and test results, ensuring that the right information is available at the right time to the right person. The system produces a more efficient laboratory process, leading to faster work, fewer errors and a smoother workflow for the organisation. Keywords: Industry
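    The activity-monitoring feature described above can be sketched in a few lines; this is a minimal illustration of the audit-trail idea, and the function names and record fields are hypothetical rather than the CTCV application's actual schema:

    ```python
    from datetime import datetime, timezone

    # In-memory stand-in for the audit table an RDBMS-backed LIMS would use.
    audit_log = []

    def record_activity(user, action, description):
        """Store who did what, when, with a short description."""
        audit_log.append({
            "user": user,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "description": description,
        })

    record_activity("analyst1", "CREATE_SAMPLE", "Registered sample S-001")
    ```

    In the real system each record would be an RDBMS row written by the middle tier, so that the browser front end can query the activity history later.
    
    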

    THE USE OF RECOMMENDER SYSTEMS IN WEB APPLICATIONS – THE TROI CASE

    In the digital age, ignoring digital marketing, surveys, reviews and the analysis of online user behaviour is a key reason why otherwise strong businesses fail, and such systems should be supported by artificial intelligence techniques. In this direction, the use of data mining to recommend relevant items, a state-of-the-art technique, increases user satisfaction as well as business revenue, while related information-gathering approaches help our systems think and act more like humans. To that end, this thesis elaborates on recommender systems: how people interact, and how to accurately calculate and identify what people like or dislike based on their previous online behaviour. The thesis also covers the methodologies recommender systems use and how mathematical equations help them calculate user behaviour and similarities. Filters are important in a recommender system: if similar users like the same product or item, what is the probability that a neighbouring user will like it too? This is where collaborative filtering, neighbourhood filtering and hybrid recommender systems come in; using various algorithms, a recommender system can predict whether a particular user would prefer an item, based on the user's profile and activities. Recommender systems are beneficial to both service providers and users. The thesis also covers the strengths and weaknesses of recommender systems and how introducing an ontology can improve them: ontology-based methods can reduce problems that content-based recommender systems are known to suffer from. Given Kosovo's GDP and the fact that young people's job prospects leave room for improvement, with demand greater than supply, I decided to build an intelligent system that makes it easier for Kosovars to find a job that suits their profile, skills, knowledge, character and location.
    That system is called TROI, a search engine that indexes and merges all locally operating job-seeking websites into one platform with intelligent features. The thesis presents the design, implementation, testing and evaluation of the TROI search engine. Testing was done through user experiments in a running environment of the TROI search engine. The results show that the functionality of the recommender system is satisfactory and helpful.
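    The neighbourhood idea above — predicting a user's preference from the ratings of similar users — can be sketched with cosine similarity and a similarity-weighted average. The ratings data, user names and item names below are illustrative, not taken from TROI:

    ```python
    import math

    # Toy user-item ratings (hypothetical data for illustration).
    ratings = {
        "alice": {"job_a": 5, "job_b": 3, "job_c": 4},
        "bob":   {"job_a": 4, "job_b": 3, "job_c": 5},
        "carol": {"job_a": 1, "job_b": 5},
    }

    def cosine_similarity(u, v):
        """Cosine similarity over the items both users rated."""
        common = set(u) & set(v)
        if not common:
            return 0.0
        dot = sum(u[i] * v[i] for i in common)
        norm_u = math.sqrt(sum(u[i] ** 2 for i in common))
        norm_v = math.sqrt(sum(v[i] ** 2 for i in common))
        return dot / (norm_u * norm_v)

    def predict(user, item):
        """Predict a rating as the similarity-weighted average of
        the ratings given by neighbouring users."""
        num = den = 0.0
        for other, theirs in ratings.items():
            if other == user or item not in theirs:
                continue
            s = cosine_similarity(ratings[user], ratings[other])
            num += s * theirs[item]
            den += abs(s)
        return num / den if den else 0.0

    print(round(predict("carol", "job_c"), 2))
    ```

    A hybrid or ontology-based system, as discussed in the thesis, would refine this baseline by bringing in item content and domain knowledge rather than relying on ratings alone.
    
    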

    CSR-Impact on the IT World

    This study on Corporate Social Responsibility compares how CSR is viewed in two countries, England and Spain. It gathers the opinions of the two societies through surveys, which also serve to analyse how important CSR is to users and how they perceive the behaviour of companies in the ICT sector.

    Animating the evolution of software

    The use and development of open source software has increased significantly in the last decade. The high frequency of changes and releases across a distributed environment requires good project management tools in order to control the process adequately. However, even with these tools in place, the nature of the development, and the fact that developers often work on many other projects simultaneously, mean that developers are unlikely to have a clear picture of the current state of the project at any time. Furthermore, the poor documentation associated with many projects has a detrimental effect on efforts to encourage new developers to contribute to the software. A typical version control repository contains a mine of information that is not always obvious and is not easy to comprehend in its raw form. However, presenting this historical data in a suitable format using software visualisation techniques allows the evolution of the software over a number of releases to be shown. This allows the changes that have been made to the software to be identified clearly, ensuring that the effect of those changes is also emphasised. This in turn enables both managers and developers to gain a more detailed view of the current state of the project. The visualisation of evolving software introduces a number of new issues. This thesis investigates some of these issues in detail and recommends a number of solutions to alleviate the problems that may otherwise arise. The solutions are then demonstrated in the definition of two new visualisations. These use historical data contained within version control repositories to show the evolution of the software at a number of levels of granularity. Additionally, animation is used as an integral part of both visualisations - not only to show the evolution by representing the progression of time, but also to highlight the changes that have occurred.
    Previously, the use of animation within software visualisation had been primarily restricted to small-scale, hand-generated visualisations. However, this thesis shows the viability of using animation within software visualisation with automated visualisations on a large scale. In addition, evaluation of the visualisations has shown that they are suitable for showing the changes that have occurred in the software over a period of time, and subsequently how the software has evolved. These visualisations are therefore suitable for use by developers and managers involved with open source software, and they also provide a basis for future research in evolutionary visualisations, software evolution and open source development.
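    The raw data behind such a visualisation is simply which files changed in which release, mined from the version control history. A minimal sketch of that preparation step, using a hypothetical history rather than any repository from the thesis, might look like this:

    ```python
    from collections import Counter

    # Hypothetical commit history mined from a version control
    # repository: (release, files touched) pairs.
    history = [
        ("0.1", ["core.c", "util.c"]),
        ("0.2", ["core.c", "gui.c"]),
        ("0.3", ["core.c", "util.c", "gui.c"]),
    ]

    def churn_by_file(history):
        """Count how often each file changed across releases --
        the kind of data an evolution visualisation would animate."""
        counts = Counter()
        for _release, files in history:
            counts.update(files)
        return counts

    print(churn_by_file(history).most_common())
    ```

    An animated visualisation would render one frame per release, so that frequently changing files (here `core.c`) stand out as the timeline plays.
    
    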

    Doppio User Manual

    The 67/91 Schmidt telescope is the largest instrument of its type in Italy. It was officially commissioned in 1966, when it was located at the Pennar Observation Station, near the Galileo telescope; in 1991 the telescope was moved to Mount Ekar, near the Copernico 1.82m telescope, in order to take advantage of the higher altitude and lower light pollution. In 2017 the telescope was considerably refurbished (new CCD camera, new filters, autoguider) and made remotely controllable. Starting from May 2020, updates to both hardware and software allowed the implementation of a fully robotic operational mode. Observing blocks (OBs) can be submitted at any time by the PIs of the proposals or their collaborators. The robotic system has a rapid-response capability that allows it to interrupt regular observations in order to observe transient phenomena with high priority.