
    Design and implementation of a computational cluster for high performance design and modeling of integrated circuits

    Modern microelectronic fabrication involves hundreds of processing steps, beginning with design, simulation, and modeling. Tremendous amounts of data are acquired, managed, and processed. Bringing Information Technology (IT) together into a functional system for microelectronic engineering is not a trivial task: seamless integration of hardware and software is necessary. For this purpose, knowledge of the design and fabrication of microelectronic devices and circuits is as important as knowledge of current IT systems. This thesis explains a design methodology for building and using a computer cluster running software used in the production of microelectronic circuits. The cluster runs a Linux operating system to support software from Silvaco and Cadence. The thesis discusses the selection, installation, and verification of hardware and software against defined goals. The system was tested by numerous methods to show proper operation, focusing on TCAD software from Silvaco and custom IC design software from Cadence. To date, the system has been tested successfully and performs well. Because the target applications run simulations that are independent of each other, parallelization is straightforward and user friendly: simply adding more computers with more CPUs scales the maximum number of supported users and processes linearly. With a staged approach and the selection of the right software for the job, the integration of IT components to build a computer cluster for microelectronic applications can be completed successfully.

    Computer forensics methodology and praxis.

    This thesis lays the groundwork for the creation of a graduate-level computer forensics course. It begins with an introduction explaining how computing has invaded modern life and what computer forensics is and why it is necessary. The thesis then argues why universities, as opposed to proprietary education courses, need to be at the forefront of educating students in the science of computer forensics, and describes the benefits to law enforcement agencies of having a computer scientist perform forensic analyses. It continues by detailing what computer forensics is and is not. The thesis then addresses legal issues and the motivation for the topic, followed by a review of the current literature. The last half of the thesis lays the groundwork for the design of a graduate-level computer forensics course by detailing a methodology to implement, with associated laboratory praxis for the students to follow.

    Free-libre open source software as a public policy choice

    Free Libre Open Source Software (FLOSS) is characterised by a specific programming and development paradigm. The availability and freedom of use of source code are at the core of this paradigm, and are the prerequisites for FLOSS features. Unfortunately, the fundamental role of code is often ignored among those who decide software purchases for Canadian public agencies. Source code availability and the connected freedoms are often seen as unrelated and accidental aspects, and the only advantage actually acknowledged, the absence of royalty fees, becomes paramount. In this paper we discuss some relevant legal issues and explain why public administrations should choose FLOSS for their technological infrastructure. We also present the results of a survey regarding the penetration and awareness of FLOSS usage within the Government of Canada. The data demonstrate that the Government of Canada has no enforced policy regarding the implementation of a specific technological framework (which has legal, economic, business, and ethical repercussions) in its departments and agencies.

    Technical and legal perspectives on forensics scenario

    The dissertation concerns digital forensics. The expression digital forensics (sometimes digital forensic science) denotes the science that studies the identification, storage, protection, retrieval, documentation, use, and every other form of processing of computer data so that it can be evaluated in a legal trial. Digital forensics is a branch of forensic science: it represents the extension of theories, principles, and procedures that are typical and important elements of forensic science, computer science, and new technologies. From this conceptual viewpoint, forensic science studies the legal value of specific events in order to identify possible sources of evidence. The branches of forensic science are physiological sciences, social sciences, forensic criminalistics, and digital forensics. Digital forensics in turn comprises several categories relating to the investigation of various types of devices, media, or artefacts:
    - computer forensics: explaining the current state of a digital artefact, such as a computer system, storage medium, or electronic document;
    - mobile device forensics: recovering digital evidence or data from a mobile device, such as images, call logs, SMS logs, and so on;
    - network forensics: monitoring and analysing network traffic (local, WAN/Internet, UMTS, etc.) to detect intrusions and, more generally, to find network evidence;
    - forensic data analysis: examining structured data to discover evidence, usually related to financial crime;
    - database forensics: examining databases and their metadata.
    The origin and historical development of digital forensics as a discipline of study and research are closely related to progress in information and communication technology in the modern era.
    In parallel with the changes in society due to new technologies, and in particular the advent of the computer and electronic networks, there has been a change in the way evidence is collected, managed, and analysed. In addition to the more traditional natural and physical elements, procedures have come to include further evidence that, although equally capable of identifying an occurrence, is inextricably tied to a computer, a computer network, or other electronic means. The birth of computer forensics can be traced back to 1984, when the FBI and other American investigative agencies began to use software for the extraction and analysis of data on personal computers. At the beginning of the 80s, the CART (Computer Analysis and Response Team) was created within the FBI, with the express purpose of seeking so-called digital evidence. This term denotes all information stored or transmitted in digital form that may have probative value. While the term evidence expresses the judicial nature of the digital data, the term forensic emphasizes the procedural nature of the matter: literally, "to be presented to the Court". Digital forensics has a huge variety of applications, the most common of which are related to crime or cybercrime. Cybercrime is a growing problem for governments, businesses, and private citizens:
    - governments: national security (terrorism, espionage, etc.) and social problems (child pornography, child trafficking, and so on);
    - businesses: mainly economic problems, for example industrial espionage;
    - private citizens: personal safety and possessions, for example phishing and identity theft.
    Many techniques used in digital forensics are not formally defined, and the relation between technical procedure and the law is often not taken into consideration.
    From this conceptual perspective, the research work intends to define and optimize the procedures and methodologies of digital forensics in relation to Italian regulation, testing, analysing, and defining best practice, where none is defined, for common software. The research questions are:
    1. The problem of cybercrime is becoming increasingly significant for governments, businesses, and citizens.
    - For governments, cybercrime involves problems of national security, such as terrorism and espionage, and social questions, such as trafficking in children and child pornography.
    - For businesses, cybercrime entails mainly economic problems, such as industrial espionage.
    - For citizens, cybercrime involves problems of personal security, such as identity theft and fraud.
    2. Many techniques used within digital forensics are not formally defined.
    3. The relation between procedures and legislation is not always applied and taken into consideration.
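    A recurring concern in the abstract is that evidence must retain probative value. One standard, widely documented practice toward that end, sketched here as an assumption and not taken from the dissertation itself, is verifying evidence integrity with cryptographic hashes:

```python
# Illustrative sketch of evidence-integrity verification via hashing, a
# standard digital-forensics step: matching digests computed before and
# after analysis show the acquired copy was not altered.
import hashlib

def evidence_hash(data: bytes) -> str:
    # SHA-256 digest of the evidence bytes (e.g. a disk image).
    return hashlib.sha256(data).hexdigest()

original = b"disk image contents"       # hypothetical seized data
acquired_copy = b"disk image contents"  # hypothetical forensic copy

# The copy is forensically sound only if the digests match.
assert evidence_hash(original) == evidence_hash(acquired_copy)
print(evidence_hash(original))
```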

    Developing a Computer and Network Engineering Major Curriculum for Vocational High Schools (VHS) in Indonesia

    This study aims at developing a curriculum for the Computer and Network Engineering major which is relevant to industrial needs. The study employed the qualitative method. The data were collected through in-depth interviews, documentation, and focus group discussion. The research population comprised (1) industry practitioners from computer and network engineering industries, and (2) teachers of vocational high schools in the Special Region of Yogyakarta. In this qualitative research, the researcher himself served as the research instrument; with his understanding of qualitative research methods and his knowledge of the field, he had sufficient grounding both academically and technically. The findings of this study consist of four parts, namely (1) the standard competences of the Computer and Network Engineering major for vocational high school; (2) the curriculum of the Computer and Network Engineering major that is currently implemented; (3) the competences in the field of Computer and Network Engineering demanded by industries; and (4) the curriculum of the Computer and Network Engineering major that is appropriate for industrial needs.

    Implementation of Microsoft's Virtual PC in Networking Curriculum

    Using Microsoft's Virtual PC software product as a virtualization technology in the implementation of the Network Specialist curriculum allows increased versatility and considerable hardware cost savings. Rather than purchasing individual computers or removable hard drives, or using boot manager programs or simulation software (including Computer Based Training programs) for students learning to administer an operating system, a single computer with a hard drive, sufficient processor power, and RAM can support the effective hands-on learning cycle of plan, implement and test, and then review. In addition, this software allows a non-dedicated (production) computer lab to be used, in contrast to a lab dedicated to supporting the networking curriculum, as is typically required because of the fundamental testing involved, including operating system rebuilds and service manipulation. The successful result of this project is a curriculum, including deliverables (student assessment guidelines, worksheets, a competency task list, and instructor procedures), that uses Microsoft's Virtual PC software, Windows XP, and Windows Server 2003. These software products are currently being used to demonstrate a subset of the tasks and competencies required for Mid-State Technical College's Network Specialist program course Network Administration-Intermediate.

    Performance Evaluation of Virtualization with Cloud Computing

    Cloud computing has been the subject of much research, which shows that it can reduce hardware costs, reduce energy consumption, and allow a more efficient use of servers. Nowadays many servers are used inefficiently because they are underutilized; the use of cloud computing together with virtualization has been a solution to this underutilization. However, virtualization with cloud computing cannot offer performance equal to native performance. The aim of this project was to study the performance of virtualization with cloud computing. To meet this aim, previous research in the area was first reviewed, outlining the different types of cloud toolkits as well as the different ways to virtualize machines, and examining the open source solutions available for implementing a private cloud. The findings of the literature review informed the design of the experiments and the choice of the tools used to implement a private cloud. Experiments were then set up to evaluate the performance of public and private clouds. The results outline the performance of the public cloud and show that virtualizing Linux gives better performance than virtualizing Windows; this is explained by the fact that Linux uses paravirtualization while Windows uses HVM. The evaluation of performance on the private cloud permitted a comparison of native performance with paravirtualization and HVM: paravirtualization performs very close to native, contrary to HVM. Finally, the cost of the different solutions and their advantages are presented.
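    The comparison the abstract draws, paravirtualization close to native, HVM further away, is usually expressed as overhead relative to native performance. A minimal sketch with hypothetical benchmark scores (the numbers are illustrative, not the thesis's measurements):

```python
# Computing virtualization overhead relative to native performance, as in a
# paravirtualization-vs-HVM comparison. Scores are "higher is better".
def overhead_pct(native_score: float, virt_score: float) -> float:
    # Percentage of native performance lost under virtualization.
    return 100.0 * (native_score - virt_score) / native_score

# Illustrative scores only, not measured data from the thesis.
native, paravirt, hvm = 100.0, 96.0, 78.0
print(f"paravirtualization overhead: {overhead_pct(native, paravirt):.1f}%")
print(f"HVM overhead: {overhead_pct(native, hvm):.1f}%")
```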

    Performance and enhancement for HD videoconference environment

    The work proposed here is framed within the V3 (Video, Videoconference, and Visualization) research project of the i2CAT Foundation, whose final goal is the design and development of a resolution-independent platform for video, videoconferencing, and visualization in high and super-high definition over next-generation IP networks. The i2CAT Foundation uses free software to achieve its goals: UltraGrid for the transmission of HD video, and SAGE for distributed visualization across multiple monitors. The equipment used for managing (capturing, sending, visualizing, etc.) the high definition streams of the work environment has to be optimized so that all available resources can be used, in order to improve the quality and stability of the platform. Since the platform handles raw data flows of more than 1 Gbps, optimizing the use of a system's available resources becomes a necessity. This project evaluates the requirements of uncompressed high definition streams and studies the current platform, in order to extract the functional requirements that an optimal system must meet to work under the best conditions. From this information, a series of system tests was carried out to improve performance, from the network level up to the application level. Different distributions of the Linux operating system, Debian 4 and openSUSE 10.3, were tried in order to evaluate their performance. Building a system from source, to optimize its code at compilation time, was also tried with the help of the Linux From Scratch project, as was the use of Real Time (RT) systems with the distributions above, which offer more stability in the stream frame rate. Once the operating systems had been tested, different compilers were evaluated for efficiency: GCC and the Intel C++ Compiler were tried, the latter with more satisfactory results. Finally, a Live CD was produced in order to package all the possible improvements in an easily distributed system.
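    The claim that raw HD flows exceed 1 Gbps can be checked with simple arithmetic; this back-of-the-envelope sketch is an illustration, not a calculation from the thesis:

```python
# Raw (uncompressed) video bitrate: width x height x bits-per-pixel x fps.
def raw_bitrate_gbps(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    return width * height * bits_per_pixel * fps / 1e9

# 1080p at 24-bit colour already exceeds 1 Gbps at 30 fps:
print(raw_bitrate_gbps(1920, 1080, 24, 30))  # ~1.49 Gbps
print(raw_bitrate_gbps(1920, 1080, 24, 60))  # ~2.99 Gbps
```

This is why the text treats resource optimization of the capture and send hosts as a necessity rather than a refinement: even a single uncompressed 1080p stream saturates a gigabit link.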