CAMEO Stakeholders Report
Computer-Aided Management of Emergency Operations (CAMEO) is a suite of software applications used to plan for and respond to chemical emergencies. CAMEO was first released in 1986 and was jointly developed by the U.S. Environmental Protection Agency (US EPA) and the National Oceanic and Atmospheric Administration (NOAA) to assist front-line chemical emergency planners and responders. It has since undergone numerous modifications and upgrades, and it remains a critical tool for chemical spills, other hazards, and emergency management. The CAMEO system integrates a chemical database with a method to manage the data, an air dispersion model, and a mapping capability. All modules work interactively to share and display critical information in a timely fashion. As a result of fatal chemical accidents in recent years, Executive Order (EO) 13650 (Improving Chemical Facility Safety and Security) was signed on August 1, 2013, directing five areas of action: improving operational coordination with state, local and tribal partners; enhancing federal coordination; enhancing information collection and sharing; modernizing regulations, guidance, policy and standards; and identifying best practices.
The CAMEO team has been working to address these EO requirements and the areas of action in a manner that will best meet the needs of CAMEO users and stakeholders.
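The air dispersion component mentioned above is, at its core, a plume-spread calculation. As an illustration only, here is a minimal Gaussian plume sketch in Python; the function and its dispersion coefficients are rough textbook-style assumptions for a neutral stability class, not CAMEO's actual model.

```python
import math

def gaussian_plume_concentration(q, u, x, y, z, h):
    """Illustrative Gaussian plume estimate (not CAMEO's model).

    q: emission rate (g/s), u: wind speed (m/s), x: downwind distance (m),
    y: crosswind offset (m), z: receptor height (m), h: release height (m).
    The power-law dispersion coefficients below are rough assumed fits.
    """
    sigma_y = 0.08 * x / math.sqrt(1 + 0.0001 * x)
    sigma_z = 0.06 * x / math.sqrt(1 + 0.0015 * x)
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2 * sigma_z**2)))  # ground reflection
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Concentration 500 m directly downwind of a 10 g/s ground release, 2 m/s wind
c = gaussian_plume_concentration(q=10.0, u=2.0, x=500.0, y=0.0, z=1.5, h=0.0)
print(f"{c:.6f} g/m^3")
```

Moving the receptor off the plume centreline (larger y) drives the estimate toward zero, which is the behaviour a dispersion module exploits when drawing threat zones on a map.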
Extension and hardware implementation of the comprehensive integrated security system concept
This is a digitised version of a thesis deposited in the University Library. The current strategy in computer networking is to increase the access that legitimate
users have to their respective systems and to distribute functionality. This creates a more
efficient working environment: users may work from home, and organisations can make better
use of their computing power. Unfortunately, a side effect of opening up computer systems
and placing them on potentially global networks is that they face increased threats from
uncontrolled access points, and from eavesdroppers listening to the data communicated
between systems. Along with these increased threats, traditional ones such as
disgruntled employees, malicious software, and accidental damage must still be countered.
A comprehensive integrated security system (CISS) has been developed to provide
security within the Open Systems Interconnection (OSI) and Open Distributed Processing
(ODP) environments. The research described in this thesis investigates alternative methods
for its implementation, its optimisation through partial implementation in hardware
and software, and mechanisms to improve its security.
A new deployment strategy for CISS is described where functionality is divided amongst
computing platforms of increasing capability within a security domain. Definitions are given
of a local security unit, which provides terminal security; local security servers, which serve the
local security units; and domain management centres, which provide security service coordination
within a domain.
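The three-tier deployment just described can be sketched as a delegation hierarchy. This is a minimal Python sketch of the idea only; the class names, the escalation rule and the service names are assumptions based on the abstract, not the thesis's design.

```python
class DomainManagementCentre:
    """Top tier: coordinates security services within a domain."""
    def coordinate(self, service, server_name):
        return f"DMC coordinated {service} requested via {server_name}"

class LocalSecurityServer:
    """Middle tier: serves a group of local security units."""
    def __init__(self, name, dmc):
        self.name = name
        self.dmc = dmc

    def provide(self, service, terminal_id):
        if service in ("encrypt", "authenticate"):
            return f"{self.name} handled {service} for {terminal_id}"
        # Anything needing domain-wide coordination is escalated to the DMC.
        return self.dmc.coordinate(service, self.name)

class LocalSecurityUnit:
    """Lowest tier: provides terminal-level security."""
    def __init__(self, terminal_id, server):
        self.terminal_id = terminal_id
        self.server = server

    def request_service(self, service):
        # Delegate anything beyond terminal security to the serving LSS.
        return self.server.provide(service, self.terminal_id)

dmc = DomainManagementCentre()
lss = LocalSecurityServer("LSS-1", dmc)
lsu = LocalSecurityUnit("terminal-42", lss)
print(lsu.request_service("authenticate"))
print(lsu.request_service("key-distribution"))
```

The point of the split is that each tier only needs the capability appropriate to its platform: terminals stay thin, while domain-wide coordination lives on the most capable machine.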
New hardware that provides RSA and DES functionality, capable of being connected to Sun
Microsystems workstations, is detailed. The board can be used as a basic building block of CISS,
providing fast cryptographic facilities, or in isolation for discrete cryptographic services.
Software written for UNIX in C/C++ is described, which provides optimised security
mechanisms on computer systems that do not have SBus connectivity.
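The RSA primitive that the board accelerates can be illustrated with a toy textbook round trip: key generation from two primes, then modular exponentiation for encryption and decryption. The primes below are tiny and there is no padding, so this is purely an illustration of the primitive, not the thesis's C/C++ implementation.

```python
def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keypair(p, q, e):
    n = p * q
    phi = (p - 1) * (q - 1)
    g, d, _ = egcd(e, phi)
    assert g == 1, "e must be coprime to phi(n)"
    return (e, n), (d % phi, n)

# Tiny demonstration primes -- far too small for real security.
public, private = make_keypair(61, 53, e=17)
e, n = public
d, _ = private
message = 42
ciphertext = pow(message, e, n)   # encrypt: m^e mod n
plaintext = pow(ciphertext, d, n)  # decrypt: c^d mod n
print(ciphertext, plaintext)
```

The modular exponentiations here are exactly the operations a cryptographic accelerator board offloads; real deployments use 2048-bit or larger moduli and padded messages.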
A new identification/authentication mechanism is investigated that can be added to existing
systems with the potential for extension into a real time supervision scenario. The
mechanism uses keystroke analysis through the application of neural networks and genetic
algorithms and has produced very encouraging results.
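The underlying idea of keystroke analysis, comparing a typist's inter-key timing profile against an enrolled one, can be sketched with a simple distance measure. Note this stand-in uses a mean-absolute-difference threshold rather than the neural networks and genetic algorithms the thesis actually applies; the timings and threshold are invented for illustration.

```python
def digraph_latencies(timestamps):
    """Inter-keystroke latencies from a list of key-press times (seconds)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def profile_distance(profile, sample):
    """Mean absolute difference between two latency vectors of equal length."""
    if len(profile) != len(sample):
        raise ValueError("latency vectors must align")
    return sum(abs(p - s) for p, s in zip(profile, sample)) / len(profile)

def matches_profile(profile, sample, threshold=0.05):
    # Accept the typist if timing deviates less than `threshold` s on average.
    return profile_distance(profile, sample) < threshold

# Enrolled user typing an 8-character password (hypothetical timings).
enrolled = digraph_latencies([0.00, 0.12, 0.25, 0.33, 0.47, 0.58, 0.71, 0.80])
genuine  = digraph_latencies([0.00, 0.13, 0.24, 0.34, 0.46, 0.59, 0.70, 0.81])
imposter = digraph_latencies([0.00, 0.30, 0.45, 0.80, 0.95, 1.30, 1.50, 1.90])
print(matches_profile(enrolled, genuine), matches_profile(enrolled, imposter))
```

A trained classifier replaces the fixed threshold with a learned decision boundary, which is what allows extension toward real-time supervision.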
Finally, a new conceptual model for intrusion detection capable of dealing with real time
and historical evaluation is discussed, which further enhances the CISS concept
Standard requirements for GCP-compliant data management in multinational clinical trials
<p>Abstract</p> <p>Background</p> <p>A recent survey has shown that data management in clinical trials performed by academic trial units still faces many difficulties (e.g. heterogeneity of software products, deficits in quality management, limited human and financial resources and the complexity of running a local computer centre). Unfortunately, no specific, practical and open standard for both GCP-compliant data management and the underlying IT-infrastructure is available to improve the situation. For that reason the "Working Group on Data Centres" of the European Clinical Research Infrastructures Network (ECRIN) has developed a standard specifying the requirements for high quality GCP-compliant data management in multinational clinical trials.</p> <p>Methods</p> <p>International, European and national regulations and guidelines relevant to GCP, data security and IT infrastructures, as well as ECRIN documents produced previously, were evaluated to provide a starting point for the development of standard requirements. The requirements were produced by expert consensus of the ECRIN Working Group on Data Centres, using a structured and standardised process. The requirements were divided into two main parts: an IT part covering standards for the underlying IT infrastructure and computer systems in general, and a Data Management (DM) part covering requirements for data management applications in clinical trials.</p> <p>Results</p> <p>The standard developed includes 115 IT requirements, split into 15 separate sections, 107 DM requirements (in 12 sections) and 13 other requirements (2 sections). Sections IT01 to IT05 deal with the basic IT infrastructure while IT06 and IT07 cover validation and local software development. IT08 to IT15 concern the aspects of IT systems that directly support clinical trial management. Sections DM01 to DM03 cover the implementation of a specific clinical data management application, i.e. 
for a specific trial, whilst DM04 to DM12 address the data management of trials across the unit. Section IN01 is dedicated to international aspects and ST01 to the competence of a trials unit's staff.</p> <p>Conclusions</p> <p>The standard is intended to provide an open and widely used set of requirements for GCP-compliant data management, particularly in academic trial units. It is the intention that ECRIN will use these requirements as the basis for the certification of ECRIN data centres.</p>
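The section layout described in the Results can be represented as a small machine-readable index, which is how a trials unit might track its own conformance. The structure below is a hypothetical reconstruction from the abstract; the per-section themes are paraphrased and individual requirement texts are not included.

```python
# Hypothetical index of the standard's section layout (themes paraphrased).
standard = {
    "IT": {f"IT{i:02d}": None for i in range(1, 16)},   # 115 requirements total
    "DM": {f"DM{i:02d}": None for i in range(1, 13)},   # 107 requirements total
    "IN": {"IN01": "international aspects"},
    "ST": {"ST01": "staff competence"},
}
themes = {
    range(1, 6): "basic IT infrastructure",
    range(6, 8): "validation and local software development",
    range(8, 16): "IT systems directly supporting trial management",
}
for section_range, theme in themes.items():
    for i in section_range:
        standard["IT"][f"IT{i:02d}"] = theme
print(len(standard["IT"]), len(standard["DM"]))
```

Such an index makes gap analysis mechanical: a unit can mark each section as met or unmet and report coverage per part.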
Safety alarms at CERN
In order to operate the CERN accelerator complex safely, the acquisition, transport and management of safety alarms is of crucial importance. The French regulatory authority [Direction de Sûreté des Installations Nucléaires de Base (INB)] defines them as Level 3 alarms; as such they represent a danger to life and require an immediate intervention of the Fire Brigade. Safety alarms are generated by fire and flammable gas detection systems, electrical emergency stops, and other safety-related systems. For reliability reasons, Level 3 alarms are transmitted to their operation centre, the CERN Safety Control Room (SCR), over two different media: a hard-wired network and a computer-based system. The hard-wired networks are connected to local panels that summarize the overall CERN geography in 34 security areas. The computer-based system offers data management facilities such as alarm acquisition, distribution, archiving and information correlation. The Level 3 alarm system is in constant evolution in order to achieve better reliability and to integrate new safety turn-key systems provided by industry.
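The computer-based side of the handling just described, acquiring every alarm, archiving it, and routing Level 3 alarms for immediate intervention, can be sketched as a small data model. This is an illustrative toy, not CERN's system; the class names, fields and severity rule are assumptions drawn from the abstract.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Alarm:
    source: str        # e.g. fire detection, flammable gas, emergency stop
    area: int          # one of the 34 security areas
    level: int         # 3 = safety alarm requiring immediate intervention
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class SafetyControlRoom:
    """Toy model of alarm handling: acquire, archive, distribute."""
    def __init__(self):
        self.archive = []
        self.pending_interventions = []

    def acquire(self, alarm):
        self.archive.append(alarm)                    # every alarm is archived
        if alarm.level == 3:
            self.pending_interventions.append(alarm)  # route to Fire Brigade

scr = SafetyControlRoom()
scr.acquire(Alarm("flammable gas detection", area=12, level=3))
scr.acquire(Alarm("ventilation fault", area=12, level=2))
print(len(scr.archive), len(scr.pending_interventions))
```

The hard-wired panels provide an independent second path for the same Level 3 events, so a failure of the computer-based path does not suppress a life-safety alarm.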
Telecommunication Network Security
Our global age is practically defined by the ubiquity of the Internet: the worldwide interconnection of
cyber networks that facilitates access to virtually all ICT and other elements of critical
infrastructure with a click of a button, regardless of the user's location and whether the user is
static or mobile. However, such interconnectivity is not without security
consequences.
A telecommunication system is indeed a communication system with the distinguishing key
word, the Greek tele-, which means "at a distance," to imply that the source and sink of the system
are at some distance apart. Its purpose is to transfer information from some source to a distant user;
the key concepts being information, transmission and distance. Each of these requires a means
to send, convey and receive the information with safety and a degree of fidelity
acceptable to both the source and the sink.
Chapter K begins with an effort to conceptualise the telecommunication network security
environment, using relevant ITU-T2* recommendations and terminologies for secure telecommunications.
The chapter is primarily concerned with the security aspect of computer-mediated
telecommunications. Telecommunications should not be seen as an isolated phenomenon; it is a critical
resource for the functioning of cross-industrial businesses in connection with IT. Hence, just as
information, data or a computer/local computer-based network must have appropriate level of security,
so also a telecommunication network must have equivalent security measures; these may often be the
same as or similar to those for other ICT resources, e.g., password management.
In view of the foregoing, the chapter provides a brief coverage of the subject matter by first assessing
the context of security and the threat landscape. This is followed by an assessment of telecommunication
network security requirements; identification of threats to the systems; and the conceivable counter or
mitigating measures and their implementation techniques. These bring into focus various
cryptographic and cryptanalytic concepts, vis-à-vis social-engineering and socio-cryptanalytic techniques, and
password management.
The chapter noted that the human factor is the most critical factor in the security system for at least
three possible reasons; it is the weakest link, the only factor that exercises initiatives, as well as the factor
that transcends all the other elements of the entire system. This underscores the significance of social
engineering in every facet of a security arrangement. It is also noted that password security could be
enhanced if a balance were struck between having enough rules to maintain good security and not having
so many rules that users are compelled to take evasive actions which would, in turn, compromise
security. The chapter takes the view that network security is inversely proportional to its complexity. In
addition to the traditional authentication techniques, the chapter gives reasonable attention to location-based
authentication. The chapter concludes that security solutions have a technological component, but
security is fundamentally a people problem: a security system is only as strong as its
weakest link, and the weakest link of any security system is the human infrastructure.
2* International Telecommunication Union - Telecommunication Standardisation Sector
A projection for the future of telecommunication network security postulates that network security
will continue to get worse unless there is a change in the prevailing practice of externality or vicarious
liability in the computer/security industry, where consumers of security products, as opposed to
producers, bear the cost of security ineffectiveness. It is suggested that all transmission devices be made
GPS-compliant, with inherent capabilities for location-based mutual authentication. This could enhance
the future of telecommunication security.
Petroleum Technology Development Fund
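The location-based authentication advocated above reduces, at its simplest, to checking that a device's reported GPS fix lies within an expected radius of an enrolled location. The sketch below illustrates only that distance check; the coordinates, radius and function names are invented, and a real scheme would also need to authenticate the fix itself against spoofing.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    r = 6371.0  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def location_factor_ok(reported, enrolled, radius_km=1.0):
    # Accept only if the device reports a position near the enrolled location.
    return haversine_km(*reported, *enrolled) <= radius_km

enrolled_site = (51.5007, -0.1246)   # hypothetical enrolled location
print(location_factor_ok((51.5010, -0.1240), enrolled_site))  # nearby fix
print(location_factor_ok((48.8584, 2.2945), enrolled_site))   # distant fix
```

Used as one factor among several, such a check supports the mutual authentication the chapter envisages: each endpoint can require the other's position to be plausible before proceeding.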
ANALYSIS AND APPLICATION OF RISK MANAGEMENT FOR A MONITORING APPLICATION AND AN INFORMATION SECURITY MANAGEMENT SYSTEM USING
To support the success of the Electronic-Based Government System and the Information Security Management System (SMKI), the West Java KPID must participate in them. The West Java KPID operates a monitoring application that processes recorded data and produces findings of broadcast-content violations in television broadcasts. The application covers national TV broadcasts with networked main stations as well as local TV broadcasts. It can monitor only television, because a TV tuner enables a computer to process television signals, record them into a database, and capture 24 hours of television viewing. Given the importance of this information and the high risk of interference, the West Java KPID needs to carry out information security governance activities in its environment, especially for the monitoring application, because it stores recordings of television broadcast content. Bugs and crashes occur frequently in the monitoring application owing to loss of voltage and the absence of a temporary power supply, and LAN network security is needed to minimize the threat of attacks on the application. A risk assessment is needed to maintain confidentiality, integrity and availability and to develop controls that minimize threats. This study performs risk management for the monitoring application using SNI ISO/IEC 27005:2013 and carries out risk assessment controls based on SNI ISO/IEC 27001:2013.
The steps taken are the identification of information assets, threats, vulnerabilities, risks and impacts, and clause mapping based on the risk assessment, followed by a maturity level analysis, a gap analysis, and recommendations for control objectives and information security controls. The study thus produced a risk assessment, a proposed mapping of controls and control objectives based on SNI ISO/IEC 27001:2013, the maturity level of information security, and findings and recommendations.
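The asset/threat identification and risk evaluation steps above can be sketched as a simple likelihood-times-impact register. The rating scale, thresholds and example entries below are invented for illustration and are not taken from the study or from SNI ISO/IEC 27005 itself.

```python
# Hypothetical qualitative risk register: risk level = likelihood x impact,
# each rated 1 (low) to 5 (high); thresholds are assumed, not standardized.
def risk_level(likelihood, impact):
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

register = [
    # (asset, threat, likelihood, impact) -- illustrative entries only
    ("monitoring application", "power loss without UPS", 4, 4),
    ("LAN", "network attack", 3, 4),
    ("broadcast recordings DB", "accidental deletion", 2, 3),
]
for asset, threat, likelihood, impact in register:
    print(asset, "/", threat, "->", risk_level(likelihood, impact))
```

Clause mapping then attaches each "medium" or "high" row to the control objectives that would reduce its likelihood or impact.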
Security risk analysis of the data communications network proposed in the NextGen air traffic control system
Scope and Method of Study: aerospace security stakeholder qualitative interviews. Interests internal and external to the United States could wreak havoc by manipulating, impairing, or data mining the digital information exchanged by aircraft and controllers in the proposed NextGen air traffic control system. The purpose of this study was to analyze potential security risks posed by the active network technologies utilized by ADS-B in NextGen, specifically the air-to-air, air-to-ground, and satellite-to-air links, and to provide a risk analysis of the NextGen active network compared with industry standards and best practices for information systems security. Data obtained through interviews were used to determine the effective security categorization and provide a risk analysis of the ADS-B portion of the proposed NextGen active network. Findings and Conclusions: Based on this research, there are both significant similarities and differences between the NextGen active network and industry-standard computer networks. Both ADS-B and computer networks operate in a wireless environment. Both are designed to move information between local devices, though ADS-B devices may cover 200-300 miles at altitude where computer networks are designed to cover a range of 200-300 feet. Both have methods of ensuring a high level of data integrity using FEC, CRC, 24-bit parity, or MAC codes. At this point the similarities diverge, as the missions of the two networks are markedly different. Specific conclusions and recommendations cover the differences between the designed mission requirements of NextGen and existing computer security standards, focusing on the security objectives within the Federal Information Security Management Act of 2002 and other pertinent government standards and industry best practices.
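Of the integrity mechanisms listed (FEC, CRC, 24-bit parity, MAC), the 24-bit parity that Mode S/ADS-B appends to each frame is a CRC-24 computation. The sketch below uses the published Mode S generator polynomial, but it omits framing details such as the interrogator-ID overlay on uplink formats, so it is an illustration of the integrity check, not a conformant decoder.

```python
def crc24(data: bytes, poly=0x1FFF409):
    """Bitwise CRC-24 of the kind used for Mode S / ADS-B parity.

    0x1FFF409 is the 25-bit Mode S generator polynomial. MSB-first,
    zero initial value, no final XOR.
    """
    reg = 0
    for byte in data:
        reg ^= byte << 16
        for _ in range(8):
            reg <<= 1
            if reg & 0x1000000:
                reg ^= poly  # the 25-bit generator clears the overflow bit
    return reg & 0xFFFFFF

# Example 96-bit payload; the sender appends the 24-bit remainder as parity.
payload = bytes.fromhex("8D4840D6202CC371C32CE057")
parity = crc24(payload).to_bytes(3, "big")
# A receiver re-runs the CRC over payload+parity: zero means the frame
# arrived intact (this residual-zero property is a CRC identity).
print(crc24(payload + parity))  # prints 0
```

The security-relevant observation is that a CRC detects transmission errors but, unlike a keyed MAC, offers no protection against deliberate manipulation, which is central to the risk analysis of ADS-B links.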
An Application for Decentralized Access Control Mechanism on Cloud Data using Anonymous Authentication
In the last few years, cloud computing has gained a great deal of popularity, and technology analysts believe it will be the future, but only if its security problems are solved as they arise. For those unfamiliar with cloud computing, it is a practice wherein users access data from servers located in remote places; users do so through the Internet to manage, process and store the relevant data, instead of depending on a personal computer or a local server. Many firms and organizations use cloud computing, which is faster, cheaper and easier to maintain, and regular Internet users also rely on cloud computing services to access their files whenever and wherever they wish. There are, however, numerous challenges associated with cloud computing, such as abuse of cloud services, data security and cyber-attacks. When clients outsource sensitive data to cloud servers, access control is one of the fundamental security requirements: it ensures that unauthorized access to secured data is prevented. Hence, cloud computing has to provide privacy, access control and security for user data. A suitable and reliable encryption technique with enhanced key management should be developed and applied to user data before it is loaded into the cloud, with the goal of achieving secure storage. It must also support file access control and all other file-related functions in a policy-based manner for any file stored in a cloud environment. This research paper proposes a decentralized access control mechanism for data storage security in clouds which also provides anonymous authentication. The mechanism allows decryption of the stored information only by valid users, an additional feature of access control. The access control mechanism is decentralized, which makes it robust compared with centralized access control schemes for clouds.
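The policy-based file access the paper calls for can be illustrated with a minimal attribute-policy evaluator. This plain-Python stand-in only shows the policy logic; the paper's actual scheme enforces such policies cryptographically (attribute-based techniques with anonymous authentication), which this sketch does not attempt, and the policy and attribute names are invented.

```python
# Minimal attribute/policy check: a policy is ('and'|'or', [terms]),
# where each term is an attribute string or a nested policy.
def satisfies(policy, attributes):
    op, terms = policy
    results = (t in attributes if isinstance(t, str)
               else satisfies(t, attributes)
               for t in terms)
    return all(results) if op == "and" else any(results)

# Hypothetical policy: employee AND (auditor OR manager).
file_policy = ("and", ["employee", ("or", ["auditor", "manager"])])
print(satisfies(file_policy, {"employee", "manager"}))   # access granted
print(satisfies(file_policy, {"employee"}))              # access denied
```

In a decentralized scheme, no single authority holds the master keys: each attribute authority issues credentials for its own attributes, and only users whose attribute set satisfies the policy can decrypt.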
Cybersecurity for Manufacturers: Securing the Digitized and Connected Factory
As manufacturing becomes increasingly digitized and data-driven, manufacturers will find themselves at serious risk. Although there has yet to be a major successful cyberattack on a U.S. manufacturing operation, threats continue to rise. The complexities of multi-organizational dependencies and data-management in modern supply chains mean that vulnerabilities are multiplying.
There is widespread agreement among manufacturers, government agencies, cybersecurity firms, and leading academic computer science departments that U.S. industrial firms are doing too little to address these looming challenges. Unfortunately, manufacturers in general do not see themselves to be at particular risk. This lack of recognition of the threat may represent the greatest risk of cybersecurity failure for manufacturers. Public and private stakeholders must act before a significant attack on U.S. manufacturers provides a wake-up call.
Cybersecurity for the manufacturing supply chain is a particularly serious need. Manufacturing supply chains are connected, integrated, and interdependent; security of the entire supply chain depends on security at the local factory level. Increasing digitization in manufacturing, especially with the rise of Digital Manufacturing, Smart Manufacturing, the Smart Factory, and Industry 4.0, combined with broader market trends such as the Internet of Things (IoT), exponentially increases connectedness. At the same time, the diversity of manufacturers, from large, sophisticated corporations to small job shops, creates weakest-link vulnerabilities that can be addressed most effectively by public-private partnerships.
Experts consulted in the development of this report called for more holistic thinking in industrial cybersecurity: improvements to technologies, management practices, workforce training, and learning processes that span units and supply chains. Solving the emerging security challenges will require commitment to continuous improvement, as well as investments in research and development (R&D) and threat-awareness initiatives. This holistic thinking should be applied across interoperating units and supply chains.
National Science Foundation, Grant No. 1552534
https://deepblue.lib.umich.edu/bitstream/2027.42/145442/1/MForesight_CybersecurityReport_Web.pd
Published incidents and their proportions of human error
Purpose
- The information security field experiences a continuous stream of information security incidents and breaches, which are publicised by the media, public bodies and regulators. Although the need for information security practices has been recognised for some time, the general tasks and causes underlying these incidents and breaches are not consistently understood, particularly with regard to human error.
Methodology
- This paper analyses recently published incidents and breaches to establish the proportions attributable to human error and, where possible, applies the HEART human reliability analysis technique, which is established within the safety field.
Findings
- This analysis provides an understanding of the proportions of incidents and breaches that relate to human error, as well as the common types of tasks that result in them, through the adoption of methods applied within the safety field.
Originality
- This research makes an original contribution to knowledge through the analysis of recent public sector information security incidents and breaches in order to understand the proportions that relate to human error.
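The HEART technique applied in this paper quantifies a task's human error probability by scaling a nominal generic-task unreliability by each relevant error-producing condition (EPC), weighted by the analyst's assessed proportion of affect. The sketch below implements that standard HEART calculation; the nominal value, EPC multipliers and proportions in the example are illustrative, not figures from the paper.

```python
def heart_hep(nominal_hep, epcs):
    """HEART assessed human error probability.

    nominal_hep: generic task type unreliability (from HEART tables).
    epcs: iterable of (max_affect, assessed_proportion) pairs, where
    assessed_proportion in [0, 1] is the analyst's judged strength of
    each error-producing condition. Each EPC contributes the factor
    (max_affect - 1) * proportion + 1; the result is capped at 1.
    """
    hep = nominal_hep
    for max_affect, proportion in epcs:
        hep *= (max_affect - 1.0) * proportion + 1.0
    return min(hep, 1.0)

# E.g. a routine, highly practised task (assumed nominal 0.02) under
# shortage of time (x11, judged 0.4) and distraction (x6, judged 0.2).
print(round(heart_hep(0.02, [(11, 0.4), (6, 0.2)]), 4))
```

Applying the same arithmetic across a set of published incidents is what lets the paper compare the error contributions of different task types on a common scale.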