90 research outputs found

    Cloud-Based Game Server Infrastructure at PT. Games Karya Nusantara (Majamojo)

    In the game publishing industry, MAJAMOJO collaborates with game developers to build game server infrastructure in Indonesia by adopting cloud computing through the Roadmap for Cloud Computing Adoption (ROCCA) method. The implementation follows five stages of a modified ROCCA adoption model, namely analysis, design, adoption, migration, and management, using Amazon Web Services (AWS) as the cloud service provider (CSP). Cloud adoption is carried out by developing game server infrastructure as the connectivity layer between the game client and the game server. In the analysis stage, data are collected through interviews with informants; in the design stage, the cloud computing technology to be used is selected based on the results of the analysis; the adoption stage prepares the cloud infrastructure to be built, selecting software and setting up servers with recommended specifications; migration, the core of the cloud adoption process, transfers the system from physical servers to virtual servers; and management is executed once adoption and migration are complete, with the Game Server and Game Client connectivity indicators running normally. Applying cloud computing in line with the characteristics of the cloud itself yields a system that is flexible, scalable, and safe to access over the internet
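The management-stage success criterion described above, Game Server and Game Client connectivity running normally, amounts to a reachability check between client and server. The sketch below is a minimal illustration of such a health check, not the thesis's actual tooling: a local dummy listener stands in for a migrated AWS instance, and all names are hypothetical.

```python
import socket
import threading

def start_dummy_game_server(host="127.0.0.1", port=0):
    """Start a minimal TCP listener standing in for the migrated game server.
    (Hypothetical stand-in; the actual deployment uses AWS virtual servers.)"""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))  # port 0 -> OS picks a free ephemeral port
    srv.listen(1)

    def accept_loop():
        try:
            conn, _ = srv.accept()
            conn.close()
        except OSError:
            pass  # server socket was closed

    threading.Thread(target=accept_loop, daemon=True).start()
    return srv, srv.getsockname()[1]

def client_can_connect(host, port, timeout=2.0):
    """Management-stage indicator: can the game client reach the game server?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

srv, port = start_dummy_game_server()
print(client_can_connect("127.0.0.1", port))  # → True
srv.close()
```

In practice such a check would target the virtual server's endpoint after migration and be run periodically during the management stage.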

    Toxicity of Kuwait crude oil and dispersed oil on selected marine fish species of Kuwait

    Oil spills are a major source of pollution in the Kuwait marine environment, and oil dispersants are used as a method to combat oil spills, but the adverse effects of either oil or dispersed oil are unknown for fish species local to Kuwait. Therefore, the toxicity of the water-accommodated fraction (WAF) of Kuwait crude oil (KCO) and the chemically enhanced water-accommodated fraction (CE-WAF) of KCO with three dispersants (Corexit® 9500, Corexit® 9527 and Slickgone® NS) was investigated against selected marine fish species local to Kuwait marine waters, namely: sobaity-sea bream (Sparidentex hasta), hamoor-orange-spotted grouper (Epinephelus coioides), meid-mullet (Liza klunzingeri), and shea'am-yellow-fin sea bream (Acanthopagrus latus). Prior to exposure, chemical characterization of KCO WAF and CE-WAFs was conducted for benzene, toluene, ethylbenzene and xylene (BTEX), polycyclic aromatic hydrocarbons (PAH), aliphatic hydrocarbons, and total petroleum hydrocarbons (TPH). Standardization experiments regarding oil loading and mixing duration revealed that 1 g KCO loading and 24 h mixing duration were the most appropriate experimental conditions to obtain reproducible and stable WAF and CE-WAF solutions. In general, CE-WAF contained higher concentrations of TPH, PAHs and aliphatics than KCO WAF. Exposure to KCO WAF and CE-WAF had no adverse effects on the hatching success of embryonated eggs of sea bream and orange-spotted grouper, but larvae hatched during exposure exhibited a toxic response. Considering larval sensitivity, pre-hatched larvae of the four marine fish species were separately exposed to KCO WAF, and their sensitivities ranked, from most to least sensitive: sea bream > orange-spotted grouper > yellow-fin sea bream > mullet pre-hatched larval stages. The sensitivities of pre-hatched larvae of sea bream and orange-spotted grouper to WAF and CE-WAF were of different degrees. 
For sea bream, the LC50 values were around 0.120 g oil/L for both WAF and CE-WAF, indicating that the dispersant did not increase oil toxicity, whereas for orange-spotted grouper CE-WAF (LC50 0.010 g oil/L) was more toxic than WAF alone (LC50 0.93 g/L). The data obtained in this study showed that the resistance of fish developmental stages to the toxicity of WAF and CE-WAFs ranked: egg stage > larvae hatched during exposure > pre-hatched larvae. Exposure of pre-hatched larvae to KCO WAF induced developmental abnormalities in the spinal curvature of larvae; the most prominent deformity types were lordosis, scoliosis and kyphosis, compared with control larvae, where no abnormalities were observed. Relating the toxicity data obtained in this experimental study to actual petroleum hydrocarbon concentrations in the Kuwait marine area, the current contamination level with petroleum hydrocarbons is far below the LC50 values determined in this study, suggesting that there is no acute hazard to either fish egg hatching or larval survival
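On the LC50 figures quoted above: an LC50 can be estimated from dose-mortality data by interpolating observed mortality against the logarithm of dose. The sketch below uses entirely hypothetical numbers, not the thesis's measurements, and simple linear interpolation on the log scale; formal ecotoxicological analyses typically fit probit or log-logistic models instead.

```python
import math

# Hypothetical dose-mortality data (g oil/L -> fraction dead); illustration
# only, not values measured in the study.
doses =  [0.01, 0.03, 0.10, 0.30, 1.00]
deaths = [0.05, 0.20, 0.45, 0.70, 0.95]

def lc50_log_interp(doses, mortality):
    """Estimate LC50 by linear interpolation of mortality vs. log10(dose).
    A rough sketch; real analyses use probit or log-logistic regression."""
    pairs = list(zip(doses, mortality))
    for (d0, m0), (d1, m1) in zip(pairs, pairs[1:]):
        if m0 <= 0.5 <= m1:  # 50% mortality is bracketed by this interval
            x0, x1 = math.log10(d0), math.log10(d1)
            frac = (0.5 - m0) / (m1 - m0)
            return 10 ** (x0 + frac * (x1 - x0))
    raise ValueError("50% mortality not bracketed by the data")

print(round(lc50_log_interp(doses, deaths), 3))  # → 0.125
```

With these made-up data the estimate falls between the 0.10 and 0.30 g oil/L doses, in the same order of magnitude as the sea bream LC50 reported above.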

    Marine Oil Spills

    Major oil spills attract the attention of the public and the media. This was especially the case after the Deepwater Horizon spill. In recent years, this attention has created a global awareness of the risks of oil spills and the damage they do to the environment. Oil is a necessity in our industrial society, however, and a major component of our lifestyle. This means that the risk of major spills continues, as does the interest in spills. The Deepwater Horizon spill began a new series of scientific studies that have greatly increased our understanding of oil spills. This book contains 10 such studies, varying from toxicity studies to social studies of human reactions to spills and risk. Importantly, the book samples new topics that have gained prominence since the Deepwater Horizon spill, including new chemical and tracing techniques, new risk perception techniques, perspectives on human health and spills, and discussion of new fuels. This book makes a significant contribution to the understanding of oil spills, exploring 10 very different facets of the subject

    CIRA annual report FY 2014/2015

    Reporting period July 1, 2014-March 31, 2015

    Forward Myth: Military Public Relations and the Domestic Base Newspaper 1941-1981

    This dissertation explores the evolution of domestic military base newspapers from 1941 to 1981, a timeframe that encapsulates the Second World War, the Korean War, and the Vietnam War, as well as interwar and postwar years. While called “newspapers,” the United States military designed these publications to be a hybrid of traditional news and public relations. This dissertation focuses on three primary aspects of these newspapers: the evolution of their format, style, and function; the messages editors and writers crafted for and about the “common” soldier and American; and the messages for and about members of non-majority groups. Sometimes printed on private presses with ad revenue generated by civilian newspaper ad sellers, these papers marked a unique marriage between the military and traditional media outlets. However, the local presses had no control over content; that privilege went to each base’s commanding officer. Despite the wide swath of people with control over content, these papers looked the same, and in some cases carried shared stories from various government-run “news” agencies. On their pages could be found both news of the day and identity-building stories and editorials, all of which conspired to inform and, to a greater or lesser degree, nudge readers toward what the papers’ creators thought it meant to be an American, a soldier, or a member of a racial or gender minority. Though tasked with functioning as both journalistic and public relations vessels, these publications were inconsistent as purveyors of both news and propaganda. However, each provides a wealth of knowledge about the development of the American identity and dominant concepts of majority and minority, as well as the integration of public relations and journalism into the military in the 20th century

    Evaluating the cyber security skills gap relating to penetration testing

    Information Technology (IT) is growing rapidly and has become an integral part of daily life. It provides a boundless list of services and opportunities, generating boundless sources of information, which can be abused or exploited. Due to this growth, thousands of new users are added to the grid, using computer systems in static and mobile environments; this alone creates endless volumes of data to be exploited and hardware devices to be abused by the wrong people. The growth of the IT environment adds challenges that can affect users in their personal, professional, and business lives. There are constant threats to corporate and private computer networks and computer systems. In the corporate environment, companies try to mitigate the threat by testing networks with penetration tests and by implementing cyber awareness programs to make employees more aware of cyber threats. Penetration tests and vulnerability assessments are undervalued: they are seen as a formality and are not used to increase system security, yet if used regularly they make computer systems more secure and minimize attacks. With the growth in technology, industries all over the globe have become fully dependent on information systems for their day-to-day business. As technology evolves and new technology becomes available, the risk that must be protected against grows with it. For industry to protect itself against this growth, personnel with a certain skill set are needed. This is where cyber security plays a very important role in the protection of information systems, ensuring the confidentiality, integrity and availability of the information system itself and the data on it. Due to this drive to secure information systems, the need for cyber security professionals is on the rise as well. It is estimated that there is a shortage of one million cyber security professionals globally. 
What is the reason for this skills shortage? Will it be possible to close the gap? This study identifies the skills gap and possible ways to close it. Research was conducted on international cyber security standards, cyber security training at universities, and international certification, focusing specifically on penetration testing; the needs of industry when recruiting new penetration testers were evaluated; and the study concludes with suggestions on how to fill possible gaps in the skills market

    Kandiyohi County: Local Food System Assessment


    Educating the effective digital forensics practitioner: academic, professional, graduate and student perspectives

    Over the years, digital forensics has become an important and sought-after profession, and the gateway of training and education into it has developed vastly over the past decade. Many UK higher education (HE) institutions now deliver courses that prepare students for careers in digital forensics and, in the most recent advances, cyber security. Skills shortages and external influences attributed to the field of cyber security, and its relationship as a discipline with digital forensics, have shifted the dynamic of UK higher education provision. As a result, the route to becoming a digital forensic practitioner, be it in law enforcement or business, has transformed from on-the-job training to university-educated, trained analysts. This thesis examined courses within HE and discovered that their delivery often overlooked areas such as mobile forensics, live data forensics, and Linux and Mac knowledge. This research also considered current standards available across HE to understand whether educational programmes are delivering what is documented as relevant curriculum. Cyber security was found to be the central focus of these standards, with digital forensics merely included within it, adding further to the debate over the lack of a distinctive identity for digital forensics as its own discipline. Few standards demonstrated how the topics, knowledge, skills and competences they draw on were identified as relevant and effective for producing digital forensic practitioners. Additionally, this thesis analyses and discusses results from 201 participants across five stakeholder groups: graduates, professionals, academics, students and the public. These groups were selected because they are underdeveloped in the existing literature and because of the crucial role they play in the cycle of producing effective practitioners. 
Analysis of stakeholder views, experiences and thoughts surrounding education and training offers unique insight, theoretical underpinnings and original contributions not seen in the existing literature: for example, the challenges, costs and initial issues of introducing graduates to employment, for employers and/or supervising practitioners; and the lack of awareness and contextualisation on the part of students and graduates of what knowledge and skills they have learned and acquired on a course and of its practical application on the job, which often leads to suggestions of a lack of fundamental knowledge and skills. This is evidenced throughout the thesis; examples include graduates, for their reflections on education based on their new on-the-job experiences and practices; professionals, for their job experiences and requirements; academics, for their educational practices and challenges; students, for their initial expectations and views; and the public, for their general understanding. This research uniquely captures these perspectives, bolstering the development of digital forensics as an academic discipline, along with the importance these diverse views play in the overall approach to delivering skilled practitioners. 
While the main contribution to knowledge within this thesis is its narrative focusing on the education of effective digital forensic practitioners and its major stakeholders, the thesis also makes additional contributions, both academic and professional, including the discussion, analysis and reflection of:
    - improvements to education and digital forensics topics for research and curriculum development;
    - where course offerings can be improved for institutions offering digital forensic degree programmes;
    - the need for further collaboration between industry and academia to give students and graduates a greater understanding of the real-life role of a digital forensic practitioner and the expectations in employment;
    - the continuous and unique challenges digital forensics poses within both academia and industry, and the need for improved facilities and tool development to curate and share problem- and scenario-based learning studies

    Network-based detection of malicious activities - a corporate network perspective


    Web-based Secure Application Control

    The world wide web today serves as a distributed application platform. Its origins, however, go back to a simple delivery network for static hypertexts. The legacy of those days can still be observed in the communication protocol used by increasingly sophisticated clients and applications. This thesis identifies the actual security requirements of modern web applications and shows that HTTP does not meet them: user and application authentication, message integrity and confidentiality, control-flow integrity, and application-to-application authorization. We explore the other protocols in the web stack and work out why they cannot fill the gap. Our analysis shows that the underlying problem is the connectionless property of HTTP. History shows, however, that a fresh start for web communication is far from realistic. As a consequence, we develop approaches that help meet the identified requirements. We first present impersonation attack vectors that begin before the actual user authentication, i.e. when secure web interaction and authentication seem unnecessary. Session fixation attacks exploit a responsibility mismatch between the web developer and the web application framework in use. We describe and compare three countermeasures at different implementation levels: at the source code level, at the framework level, and at the network level as a reverse proxy. Then, we explain how the authentication credentials that are transmitted for the user login, i.e. the password, and for session tracking, i.e. the session cookie, can be complemented by browser-stored and user-held secrets respectively. This way, an attacker cannot hijack a user account merely by phishing the user's password, because an additional browser-based secret is required for login. Also, the class of well-known session hijacking attacks is mitigated, because a secret known only to the user must be provided in order to perform critical actions. 
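The framework-level session fixation countermeasure mentioned above can be sketched as regenerating the session identifier at login, so that an identifier planted by an attacker before authentication never becomes an authenticated session. This is a minimal illustration with hypothetical names (SessionStore, login), not the thesis's actual implementation:

```python
import secrets

class SessionStore:
    """Toy in-memory session store; names are illustrative, not from the thesis."""

    def __init__(self):
        self._sessions = {}  # session_id -> session data

    def create(self):
        sid = secrets.token_urlsafe(32)
        self._sessions[sid] = {"user": None}
        return sid

    def regenerate(self, old_sid):
        """Copy session data under a fresh ID and invalidate the old one."""
        data = self._sessions.pop(old_sid)
        new_sid = secrets.token_urlsafe(32)
        self._sessions[new_sid] = data
        return new_sid

    def login(self, sid, user):
        # Regenerating the ID on privilege change is the core of the defence:
        # a pre-authentication ID fixated by an attacker dies here.
        new_sid = self.regenerate(sid)
        self._sessions[new_sid]["user"] = user
        return new_sid

    def user_of(self, sid):
        session = self._sessions.get(sid)
        return session["user"] if session else None

store = SessionStore()
attacker_planted = store.create()            # fixated ID known to the attacker
victim_sid = store.login(attacker_planted, "alice")
print(store.user_of(attacker_planted))       # → None: the planted ID is dead
print(victim_sid != attacker_planted)        # → True
```

A reverse-proxy variant of the same idea would intercept the login response and swap the cookie before it reaches the client, leaving the application untouched.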
In the next step, we explore alternative approaches to static authentication credentials. Our approach implements a trusted UI and a mutually authenticated session, using signatures as a means to authenticate requests. In this way, it establishes a trusted path between the user and the web application without exchanging reusable authentication credentials. As a downside, this approach requires support on both the client side and the server side to provide maximum protection. Another approach avoids client-side support but cannot implement a trusted UI and is thus susceptible to phishing and clickjacking attacks. The approaches described so far increase the security level of all web communication at all times. We therefore investigate adaptive security policies that fit the actual risk instead of permanently restricting all kinds of communication, including non-critical requests. We develop a smart browser extension that detects when the user is authenticated on a website, meaning that she can be impersonated because all requests carry her identity proof. Non-critical communication, however, is released from restrictions to enable all intended web features. Finally, we focus on attacks targeting a web application's control-flow integrity. We explain them thoroughly, check whether current web application frameworks provide means of protection, and implement two approaches to protect web applications. The first approach is an extension for a web application framework and provides protection based on its configuration by checking all requests for policy conformity. 
The second approach generates its own policies ad hoc based on the observed web traffic, assuming that regular users only click on links and buttons and fill in forms but do not craft requests to protected resources.
    Today's World Wide Web is a distributed platform for applications of all kinds: from simple websites through online banking, e-mail and multimedia entertainment to intelligent networked homes and cities. Its origins, however, lie in a simple network for delivering static content based on hypertexts. These origins can still be identified in the communication protocol HTTP. In this thesis we examine the security requirements of modern web applications and show that HTTP cannot fulfil them. These requirements include the authentication of users and applications, the integrity and confidentiality of messages, control-flow integrity, and the mutual authorization of applications. We examine the web protocols on the lower network layers and show that they, too, cannot fulfil the security requirements. Our analysis shows that the fundamental problem lies in the connectionless nature of HTTP. History has shown, however, that a fresh start with an improved protocol is not an option for a grown system such as the World Wide Web. For this reason, this thesis presents our contributions to secure web communication on the basis of the existing connectionless HTTP. We begin by describing session fixation attacks, which start before the user actually logs in to the web application and, if successful, allow the temporary takeover of the user account. We present three countermeasures that can be applied depending on the possibilities for intervening in the web application.
    Next, we address the problem that credentials on the web are both communicated between the participants for authentication purposes and reusable by anyone who gains knowledge of them. Our approaches bind the user's password to an authentication feature stored in the browser, and the so-called session cookie to a secret known only to the user and the web application. In this way an attacker can use neither a stolen password nor a stolen session cookie alone to access the user account. We then describe an authentication protocol that completely avoids the transmission of secret credentials. Our approach implements a trusted user interface and thus counters its manipulation in conventional browsers. While the preceding approaches raise the security of all web communication, we turn to the question of how far an intelligent browser can protect the user from attacks when necessary and permit unhindered communication when possible. Our approach thereby contributes to the acceptance of security solutions that are otherwise regularly perceived as annoying restrictions. Finally, we focus on the control-flow integrity of web applications. Malicious users can manipulate the state of applications in their favour through specially crafted sequences of requests. Our approaches filter user requests that the application did not expect and let through only those requests that the application can process properly
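The ad hoc policy generation described in this abstract, deriving the set of permissible next requests from the links and forms the application itself served, can be sketched as follows. The class and regular expression are illustrative assumptions, not the thesis's implementation, and a real filtering proxy would track this state per session:

```python
import re

# Matches href/action targets in served HTML; a crude stand-in for real
# HTML parsing, used only to keep the sketch short.
LINK_RE = re.compile(r'(?:href|action)="([^"]+)"')

class AdHocPolicy:
    """Allow only requests whose target the application itself offered,
    on the assumption that regular users click links and submit forms
    but do not craft requests to protected resources."""

    def __init__(self, entry_points):
        self.allowed = set(entry_points)  # always-reachable entry URLs

    def observe_response(self, html):
        """Whitelist every link/form target the application just served."""
        self.allowed.update(LINK_RE.findall(html))

    def is_allowed(self, url):
        return url in self.allowed

policy = AdHocPolicy({"/", "/login"})
policy.observe_response('<a href="/account">Account</a>'
                        '<form action="/transfer"></form>')
print(policy.is_allowed("/transfer"))      # → True: offered by the app
print(policy.is_allowed("/admin/delete"))  # → False: crafted request
```

The configuration-based first approach differs only in where the whitelist comes from: it is declared up front by the developer instead of being learned from traffic.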