
    Vietnam’s trade policy: a developing nation assessment

    Aim/Purpose: This paper reviews the progress of Vietnam's socio-economic development plan and assesses the extent to which Vietnam is putting in place the critical social and economic development structures that will enable it to reach "developed nation" status within the time frame (2020) set by its national strategic plan. The research identifies and reviews trade patterns, trade policy and the effect of foreign aid on Vietnam's plan to transform its economy and society from developing-nation status to developed-nation status. The overriding question is: "Is Vietnam effectively moving towards developed nation status soon?"

    Background: This paper examines the history of Vietnam from its command economy through its transition to a market-driven economy, and the criteria, hurdles and challenges the country faces as it moves towards developed-country status.

    Methodology: Applied research based on the body of research in socio-economic development theory, international trade and market theory. The review is conducted by collecting and analyzing data on foreign trade, foreign aid, business and general economic growth, development and social wellbeing. It identifies and appraises the trade patterns, trade effects, socio-economic policies and the effect of foreign aid on the economic growth and the progress of the country towards becoming a developed nation state

    A user-oriented network forensic analyser: the design of a high-level protocol analyser

    Network forensics is becoming an increasingly important tool in the investigation of cyber and computer-assisted crimes. Unfortunately, whilst much effort has been undertaken in developing computer forensic file system analysers (e.g. EnCase and FTK), such focus has not been given to Network Forensic Analysis Tools (NFATs). The single biggest barrier to effective NFATs is the handling of large volumes of low-level traffic and being able to extract and interpret forensic artefacts and their context – for example, being able to extract and render application-level objects (such as emails, web pages and documents) from the low-level TCP/IP traffic, but also to understand how these applications/artefacts are being used. Whilst some studies and tools are beginning to achieve object extraction, results to date are limited to basic objects. No research has focused upon analysing network traffic to understand the nature of its use – not simply looking at the fact that a person requested a webpage, but how long they spent on the application and what interactions they had whilst using the service (e.g. posting an image, or engaging in an instant message chat). This additional layer of information can provide an investigator with a far richer and more complete understanding of a suspect's activities. To this end, this paper presents an investigation into the ability to derive high-level application usage characteristics from low-level network traffic metadata. The paper presents three application scenarios – web surfing, communications and social networking – and demonstrates that it is possible to derive the user interactions (e.g. page loading, chatting and file sharing) within these systems. The paper continues to present a framework that builds upon this capability to provide a robust, flexible and user-friendly NFAT that provides access to a greater range of forensic information in a far easier format
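The kind of metadata-driven inference the abstract describes can be illustrated with a minimal sketch. This is a hypothetical heuristic classifier, not the paper's actual method: the `Flow` record, the port numbers and the byte/duration thresholds are all illustrative assumptions.

```python
# Hypothetical sketch: inferring high-level user interactions from
# low-level flow metadata (timing and volume only, no payload).
from dataclasses import dataclass

@dataclass
class Flow:
    """Minimal flow record holding only metadata, no packet content."""
    dst_port: int
    duration_s: float   # flow duration in seconds
    bytes_out: int      # bytes sent by the client
    bytes_in: int       # bytes received by the client

def classify_interaction(flow: Flow) -> str:
    """Guess the application-level interaction behind a flow.
    All thresholds are illustrative assumptions, not measured values."""
    if flow.dst_port in (80, 443):
        if flow.bytes_out > 100_000:            # large client upload, e.g. posting an image
            return "upload"
        if flow.duration_s > 60 and flow.bytes_in < 50_000:
            return "idle/long-lived session"    # e.g. an open chat channel
        return "page load"
    if flow.dst_port == 5222:                   # XMPP instant messaging
        return "chat"
    return "unknown"

flows = [
    Flow(dst_port=443, duration_s=2.1, bytes_out=1_200, bytes_in=340_000),
    Flow(dst_port=443, duration_s=4.0, bytes_out=850_000, bytes_in=9_000),
    Flow(dst_port=5222, duration_s=300.0, bytes_out=4_000, bytes_in=6_000),
]
labels = [classify_interaction(f) for f in flows]
print(labels)   # ["page load", "upload", "chat"]
```

In a real NFAT such rules would be learned or validated against ground-truth captures rather than hand-set, but the sketch shows the core idea: interaction semantics recovered from metadata alone.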

    What is an End User Software Engineer?

    The group of people described as end user software engineers is very large and diverse. For example, research scientists building simulations of complex processes are described as end user software engineers, as are school teachers who create spreadsheets to track the progress of their students. Given the differences in background and in the domains in which different end user software engineers work, I argue that it is important to distinguish between different categories of end user software engineers. Such distinctions will enable us to determine which tools and techniques are appropriate for which types of end user software engineers. Indeed, such distinctions will also make clear the differences and similarities between end user software engineers and so-called professional software engineers

    A forensically-enabled IaaS cloud computing architecture

    Current cloud architectures do not support digital forensic investigators, nor comply with today’s digital forensics procedures, largely due to the dynamic nature of the cloud. Whilst much research has focused upon identifying the problems that are introduced with a cloud-based system, to date there is a significant lack of research on adapting current digital forensic tools and techniques to a cloud environment. Data acquisition is the first and most important process within digital forensics – to ensure data integrity and admissibility. However, access to data and the control of resources in the cloud are still very much provider-dependent and complicated by the very nature of the multi-tenanted operating environment. Thus, investigators have no option but to rely on cloud providers to acquire evidence, assuming they would be willing or are required to by law. Furthermore, the evidence collected by the Cloud Service Providers (CSPs) is still questionable, as there is no way to verify the validity of this evidence and whether evidence has already been lost. This paper proposes a forensic acquisition and analysis model that fundamentally shifts responsibility for the data back to the data owner rather than relying upon a third party. In this manner, organisations are free to undertake investigations at will, requiring no intervention or cooperation from the cloud provider. The model aims to provide a richer and more complete set of admissible evidence than what current CSPs are able to provide
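The integrity requirement the abstract stresses can be sketched concretely. The following is a hypothetical illustration, not the paper's model: a data owner records a SHA-256 digest for each object it acquires from the cloud, so that integrity can later be verified without trusting the provider. The object names and log structure are assumptions for the example.

```python
# Hypothetical sketch: owner-side acquisition with digest logging, so the
# integrity of collected cloud evidence can be checked independently of the CSP.
import hashlib

def acquire(object_name: str, data: bytes, log: list) -> None:
    """Record an acquired object together with its SHA-256 digest."""
    digest = hashlib.sha256(data).hexdigest()
    log.append({"object": object_name, "sha256": digest})

def verify(object_name: str, data: bytes, log: list) -> bool:
    """Check that the data still matches the digest captured at acquisition."""
    digest = hashlib.sha256(data).hexdigest()
    return any(e["object"] == object_name and e["sha256"] == digest for e in log)

log = []
acquire("vm-disk-snapshot", b"raw image bytes", log)
print(verify("vm-disk-snapshot", b"raw image bytes", log))   # True: evidence intact
print(verify("vm-disk-snapshot", b"tampered bytes", log))    # False: evidence altered
```

A production design would additionally timestamp and sign the log to preserve the chain of custody; the sketch only shows the verification primitive.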

    Localization of Metal-Induced Gap States at the Metal-Insulator Interface: Origin of Flux Noise in SQUIDs and Superconducting Qubits

    The origin of magnetic flux noise in Superconducting Quantum Interference Devices with a power spectrum scaling as 1/f (f is frequency) has been a puzzle for over 20 years. This noise limits the decoherence time of superconducting qubits. A consensus has emerged that the noise arises from fluctuating spins of localized electrons with an areal density of 5×10^17 m^-2. We show that, in the presence of potential disorder at the metal-insulator interface, some of the metal-induced gap states become localized and produce local moments. A modest level of disorder yields the observed areal density
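The quoted areal density can be put into perspective with a line of arithmetic. The 1 μm² device area below is an illustrative assumption, not a figure from the abstract:

```python
# Worked arithmetic from the quoted figure: an areal spin density of
# 5e17 m^-2 implies this many local moments under a given device area.
sigma = 5e17            # areal density of localized spins, m^-2 (from the abstract)
area_um2 = 1.0          # illustrative device area of 1 square micrometre (assumption)
area_m2 = area_um2 * 1e-12
n_spins = sigma * area_m2
print(n_spins)          # roughly 5e5 spins under a 1 um^2 area
```

The large resulting number is why an ensemble of fluctuating spins plausibly produces the observed 1/f flux noise rather than discrete telegraph signals.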