3,220 research outputs found

    Development of Secure Software : Rationale, Standards and Practices

    Society is run by software. Electronic processing of personal and financial data forms the core of nearly all societal and economic activity and concerns every aspect of life. Software systems are used to store, transfer, and process this vital data, and the systems are in turn interfaced with other systems, forming complex networks of data stores and processing entities. This data requires protection from misuse, whether accidental or intentional. Elaborate and extensive security mechanisms are built around the protected information assets. These mechanisms cover every aspect of security, from physical surroundings and personnel to data classification schemes, access control, identity management, and various forms of encryption. Despite this extensive information security effort, repeated security incidents keep compromising our financial assets, intellectual property, and privacy. Beyond their direct and indirect costs, they erode trust in the very foundations of information security: the availability, integrity, and confidentiality of our data. Lawmakers at various national and international levels have reacted by creating a growing body of regulation to establish a baseline for information security. Increased awareness of information security issues has extended this regulation to one of the core issues in secure data processing: the security of the software itself. Information security has many aspects. It is generally divided into organizational security, infrastructure security, and application security. Within application security, the security engineering processes and techniques employed at development time form the discipline of software security engineering. The aim of these security activities is to address the software-induced risk to the organization, reduce security incidents, and thereby lower the lifetime cost of the software. 
Software security engineering manages software risk by implementing security controls directly in the software, and by providing security assurance for the existence of these controls through verification and validation. A software development process typically has several objectives, of which security may form only a part. When security is not expressly prioritized, development organizations tend to direct their resources to the primary requirements. While this produces short-term cost and time savings, the increased software risk induced by the lack of security and assurance engineering must then be mitigated by other means. In addition to increasing the lifetime cost of software, unmitigated or even unidentified risk has an increased chance of being exploited and of causing further software issues. This dissertation concerns security engineering in agile software development. The aim of the research is to find ways to produce secure software by introducing security engineering into agile software development processes. Security engineering processes are derived from the extant literature, industry practices, and several national and international standards. The standardized requirements for software security are traced to their origins in the late 1960s, and the alignment of software engineering and security engineering objectives is followed from their original challenges to the current agile software development methods. The research provides direct solutions to the formation of security objectives in software development and to the methods used to achieve them. It also identifies and addresses several issues and challenges found in the integration of these activities into the development processes, providing directly applicable and clearly stated solutions for practical security engineering problems. 
The research found the practices and principles promoted by agile and lean software development methods to be compatible with many security engineering activities. Automated, tool-based processes and the drive for efficiency and improved software quality were found to directly support security engineering techniques and objectives. Several new ways to integrate security engineering into agile software development processes were identified. Ways to integrate security assurance into the development process were also found, in the form of security documentation, analyses, and reviews. Assurance artifacts can be used to improve software design and enhance quality assurance. In contrast, detached security engineering processes may create security assurance that serves only purposes external to the software processes. The results provide direct benefits to all software stakeholders, from developers and customers to end users. Security awareness is the key to more secure software. Awareness creates a demand for security, and the demand gives software developers concrete objectives and a rationale for security work. This in turn creates demand for new security tools, processes, and controls to improve the efficiency and effectiveness of software security engineering. At first, this demand is created by increased security regulation; the main pressure for change, however, will emanate from the people and organizations using the software: security is a mandatory requirement, and software must provide it. This dissertation addresses these new challenges. Software security continues to gain importance, prompting new solutions and research.
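
The integration of security assurance into automated quality assurance described above can be illustrated with a small, hypothetical sketch (not a tool prescribed by the dissertation): a security check expressed as an ordinary automated test, so it runs in the same pipeline as the rest of the quality checks.

```python
import re

# Hypothetical illustration: a security assurance check written as a plain
# automated test. The patterns below flag lines that look like hard-coded
# credentials; a real tool would use a much richer rule set.
SECRET_PATTERNS = [
    re.compile(r"""(?i)password\s*=\s*['"][^'"]+['"]"""),
    re.compile(r"""(?i)api[_-]?key\s*=\s*['"][^'"]+['"]"""),
]

def find_hardcoded_secrets(source: str) -> list[str]:
    """Return the lines of `source` that look like hard-coded credentials."""
    return [
        line for line in source.splitlines()
        if any(p.search(line) for p in SECRET_PATTERNS)
    ]

# A passing and a failing example:
clean = 'password = os.environ["DB_PASSWORD"]'   # read from the environment
leaky = 'password = "hunter2"'                   # literal secret in code
assert find_hardcoded_secrets(clean) == []
assert find_hardcoded_secrets(leaky) == [leaky]
```

Because the check is just another test, it produces an assurance artifact (a reproducible pass/fail record) as a side effect of normal development work, rather than through a detached security process.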

    Regulatory Markets: The Future of AI Governance

    Appropriately regulating artificial intelligence is an increasingly urgent policy challenge. Legislatures and regulators lack the specialized knowledge required to best translate public demands into legal requirements, and overreliance on industry self-regulation fails to hold producers and users of AI systems accountable to democratic demands. Regulatory markets, in which governments require the targets of regulation to purchase regulatory services from a private regulator, are proposed. This approach to AI regulation could overcome the limitations of both command-and-control regulation and self-regulation. Regulatory markets could enable governments to establish policy priorities for the regulation of AI whilst relying on market forces and industry R&D efforts to pioneer the methods of regulation that best achieve policymakers' stated objectives.

    Rational Cybersecurity for Business

    Use the guidance in this comprehensive field guide to gain the support of your top executives for aligning a rational cybersecurity plan with your business. You will learn how to improve working relationships with stakeholders in complex digital businesses, IT, and development environments. You will know how to prioritize your security program, and how to motivate and retain your team. Misalignment between security and your business can start at the top, in the C-suite, or happen at the line-of-business, IT, development, or user level. It has a corrosive effect on any security project it touches. But it does not have to be like this. Author Dan Blum presents valuable lessons learned from interviews with over 70 security and business leaders. You will discover how to successfully solve issues related to risk management, operational security, privacy protection, hybrid cloud management, security culture and user awareness, and communication challenges. This open access book presents six priority areas to focus on to maximize the effectiveness of your cybersecurity program: risk management, control baseline, security culture, IT rationalization, access control, and cyber-resilience. Common challenges and good practices are provided for businesses of different types and sizes. And more than 50 specific keys to alignment are included. 
What You Will Learn:
- Improve your security culture: clarify security-related roles, communicate effectively with businesspeople, and hire, motivate, or retain outstanding security staff by creating a sense of efficacy
- Develop a consistent accountability model, information risk taxonomy, and risk management framework
- Adopt a security and risk governance model consistent with your business structure or culture, manage policy, and optimize security budgeting within the larger business unit and CIO organization IT spend
- Tailor a control baseline to your organization's maturity level, regulatory requirements, scale, circumstances, and critical assets
- Help CIOs, Chief Digital Officers, and other executives to develop an IT strategy for curating cloud solutions and reducing shadow IT, building up DevSecOps and Disciplined Agile, and more
- Balance access control and accountability approaches, leverage modern digital identity standards to improve digital relationships, and provide data governance and privacy-enhancing capabilities
- Plan for cyber-resilience: work with the SOC, IT, business groups, and external sources to coordinate incident response, recover from outages, and come back stronger
- Integrate your learnings from this book into a quick-hitting rational cybersecurity success plan
Who This Book Is For: Chief Information Security Officers (CISOs) and other heads of security, security directors and managers, security architects and project leads, and other team members providing security leadership to your business

    Kostnadsanalys av en molnbaserad konvergerad IT arkitektur för ett litet företag

    The purpose of this thesis is to study the dispersed IT architecture of a small-sized enterprise versus a converged, cloud-based IT architecture. Cloud computing enables moving to a pay-as-you-go model with low up-front investment, making it attractive to small-sized enterprises. Other traits that appeal to small-sized enterprises are flexibility, modularity, and ease of use. However, an important factor to be aware of when investing in a cloud solution is hidden costs, such as extra fees and premium support costs. The two scenarios (dispersed versus converged) are studied in terms of Total Cost of Ownership (TCO) and Customer-Provider Strategic Alignment Maturity (CPSAM), as the IT services are outsourced in both scenarios. The TCO analysis provides cost information on both scenarios, indicating where savings could be made and exposing excess expenditures, whilst the CPSAM analysis studies the outsourcing strategies and unveils vendor management issues. Based on the analysis, the main differences in TCO related to operational costs, which include maintenance and support costs. These costs can vary; however, even taking a margin of error into consideration, there was still a clear difference between the two scenarios, and the converged architecture showed a decrease in operational costs. The CPSAM analysis revealed issues in communication, articulation of processes, and lacking knowledge of the whole value network. Some of the risks could be minimized by choosing scenario 2, as vendor management would be centralized and less complex. However, many of the recommended actions concern both scenarios, such as formalizing a collaboration blueprint, re-assessing contracts for suitability, defining and communicating roles and responsibilities, and defining and articulating communication practices.
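
The shape of the TCO comparison described above can be sketched as a simple calculation. This is a hypothetical illustration: the cost figures are invented for the sketch and the thesis's actual numbers and cost categories are not reproduced here.

```python
# Hypothetical illustration of a TCO comparison between a dispersed
# on-premises architecture and a converged cloud architecture.
# All figures are invented for the sketch.

def total_cost_of_ownership(upfront: float, annual_operational: float,
                            annual_hidden: float, years: int) -> float:
    """TCO = up-front investment + (operational + hidden costs) per year."""
    return upfront + (annual_operational + annual_hidden) * years

# Scenario 1: dispersed architecture (high up-front hardware cost, high upkeep).
dispersed = total_cost_of_ownership(upfront=20_000, annual_operational=12_000,
                                    annual_hidden=1_000, years=5)
# Scenario 2: converged cloud architecture (pay-as-you-go, low up-front cost,
# but hidden costs such as extra fees and premium support must be included).
converged = total_cost_of_ownership(upfront=2_000, annual_operational=9_000,
                                    annual_hidden=2_500, years=5)

assert converged < dispersed  # with these figures, scenario 2 is cheaper
```

The point of modelling hidden costs as an explicit term is that they can erode, though not necessarily erase, the operational savings of the converged scenario.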

    After the success of DevOps introduce DataOps in enterprise culture

    Dissertation presented as the partial requirement for obtaining a Master's degree in Information Management, specialization in Information Systems and Technologies Management. Many organizations have implemented DevOps processes with success. This has allowed different areas, such as development, operations, security, and quality, to work together, and this cooperation and the processes associated with these areas are producing excellent results. Organizations are developing many applications that support operations and produce a lot of data. This data has significant value for organizations because it must be used in analysis, reporting, and, more recently, data science projects to support decisions. It is time to make decisions supported by data; to do so, organizations must be transformed into data-driven organizations, which requires processes for dealing with this data across all teams. This dissertation follows a design science research approach, applying multiple analytical methods and perspectives to create an artifact. The type of evidence within this methodology is a systematic literature review, with the goal of attaining insights into the current state-of-the-art research on DataOps implementation. Additionally, proven best practices from the industry are examined in depth to further strengthen credibility. The systematic literature review is thereby used to pinpoint, analyze, and comprehend the available empirical studies and research questions. This methodology supports the main goal of this dissertation: to develop and propose evidence-based practice guidelines for DataOps implementation that can be followed by organizations.

    Cloud based collaborative software development: A review, gap analysis and future directions

    Organizations that have transitioned their development environments to the Cloud have started realizing benefits such as cost reductions in hardware; an accelerated development process via reduced time and effort to set up development and testing environments; unified management; service and functionality expansion; and on-demand provisioning of, and access to, resources and development environments. These benefits represent only a fraction of the full potential that could be achieved by leveraging Cloud Computing for the collaborative software development process. Related efforts in this area have mainly addressed asynchronous collaboration; collaboration in isolated aspects of the software development process, such as coding activities; and the use of open-source tools for contributing, improving, and managing code. Although these efforts represent valid contributions and important enablers, they still miss important aspects that would enable a more holistic process with a solid theoretical foundation. This paper reviews this research area in order to better assess the factors and gaps creating the need to enhance the collaborative software development process in the Cloud, to better meet the pressure to collaboratively create better cloud-agnostic applications. © 2017 IEEE

    A modern approach for Threat Modelling in agile environments: redesigning the process in a SaaS company

    Dealing with security has become one of the priorities for companies operating in every sector. In the software industry, building in security requires being proactive and preventive by incorporating security requirements right from the ideation and design of the product. Threat modelling has consistently proven to be one of the most effective and rewarding security activities for doing so, as it can uncover threats and vulnerabilities before they are even introduced into the codebase. Numerous approaches to conducting such an exercise have been proposed over time; however, most of them cannot be adopted in intricate corporate environments with multiple development teams. This is clear from the case of Company Z, which introduced a well-documented process in 2019, but scalability, governance, and knowledge issues blocked widespread adoption. The main goal of the Thesis was to overcome these problems by designing a novel threat modelling approach able to fit the company's Agile environment and capable of closing the current gaps. As a result, a complete description of the redefined workflow and a structured set of suggestions were proposed. The solution is flexible enough to be adopted in multiple different contexts while meeting the requirements of Company Z. Achieving this result was possible only by analysing the industry's best practices and solutions, understanding the current process, identifying the pain points, and gathering feedback from stakeholders. The proposed solution includes, alongside the new threat modelling process, a comprehensive method for evaluating and verifying its effectiveness.
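
To make the threat modelling exercise concrete, here is a minimal, hypothetical sketch of the kind of record a development team might keep next to its backlog. It uses the widely known STRIDE taxonomy as one possible categorization; the thesis's actual redesigned process for Company Z is not reproduced here.

```python
from dataclasses import dataclass

# STRIDE: a common taxonomy of threat categories used in threat modelling.
STRIDE = {"Spoofing", "Tampering", "Repudiation", "Information disclosure",
          "Denial of service", "Elevation of privilege"}

@dataclass
class Threat:
    """One entry in a lightweight, team-maintained threat model."""
    component: str        # the design element the threat applies to
    category: str         # STRIDE category
    description: str
    mitigated: bool = False

    def __post_init__(self):
        # Reject categories outside the agreed taxonomy, to keep the
        # model consistent across multiple development teams.
        if self.category not in STRIDE:
            raise ValueError(f"unknown STRIDE category: {self.category}")

# Hypothetical entries recorded during design, before any code exists:
threats = [
    Threat("login API", "Spoofing", "credential stuffing against /login"),
    Threat("audit log", "Tampering", "log entries modified after the fact"),
]
open_threats = [t for t in threats if not t.mitigated]
assert len(open_threats) == 2
```

Keeping threats as structured data rather than prose is one way to address the scalability and governance issues mentioned above: the open-threat list can be queried, tracked per team, and reviewed in each iteration.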

    Managing Distributed Teams Using Agile Methods: An Implementation Strategy for Regulated Healthcare Software

    Omega is a small medical software company focusing mainly on highly customized software solutions for patient communities, telemedicine, and workflow optimization. The company has been in operation for nearly 10 years, with many successful project implementations, but has seen little growth in this period. A set of recommendations is established for setting up an offshore team for software development, as well as for moving infrastructure to the cloud to decrease costs. An analysis of strategy and process revision is required to ensure that this risky transition is effective. This paper will analyze the risks and objectives of moving a portion of software development offshore. It will identify the corporate structure, roles, and processes necessary to ensure efficient development with virtual teams. It will outline the use of ‘agile’ methodologies for software development with offshore teams, as opposed to traditional project management methodologies. In addition, it will present an analysis of costs and return on investment for moving server hosting and corporate IT infrastructure to the cloud.
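
The return-on-investment analysis mentioned above typically reduces to two quantities: how long the savings take to repay the migration cost, and the net gain over a planning horizon. The sketch below is a hypothetical illustration with invented figures, not the paper's actual analysis.

```python
# Hypothetical illustration of a cloud-migration ROI analysis.
# The figures are invented for the sketch.

def payback_period_years(migration_cost: float, annual_savings: float) -> float:
    """Years of savings needed to recover the one-off migration cost."""
    return migration_cost / annual_savings

def roi(migration_cost: float, annual_savings: float, years: int) -> float:
    """Net gain over the period, as a fraction of the migration cost."""
    return (annual_savings * years - migration_cost) / migration_cost

# Example: moving server hosting to the cloud costs 30,000 up front and
# saves 12,000 per year in hardware and maintenance.
assert payback_period_years(30_000, 12_000) == 2.5
assert roi(30_000, 12_000, years=5) == 1.0  # 100% return after five years
```

For a small company, the payback period is often the more decisive figure, since it bounds how long cash is tied up before the migration starts paying for itself.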