6,223 research outputs found

    The Question of Spectrum: Technology, Management, and Regime Change

    There is general agreement that the traditional command-and-control regulation of radio spectrum by the FCC (and NTIA) has failed. There is no general agreement on which regime should succeed it. Property rights advocates take Ronald Coase's advice that spectrum licenses should be sold off and traded in secondary markets, like any other assets. Commons advocates argue that new technologies cannot be accommodated by a licensing regime (either traditional or property rights) and that a commons regime is the most efficient means of delivering useful spectrum to the American public. This article reviews the scholarly history of this controversy, outlines the revolution in FCC thinking, and parses the question of property rights vs. commons into four distinct parts: new technology, spectrum uses, spectrum management, and the overarching legal regime. Advocates on both sides find much to agree about on the first three factors; the disagreement is focused on the choice of overarching regime to most efficiently and effectively make spectrum and its applications available to the American public. There are two feasible regime choices: a property rights regime and a mixed licensed/commons regime subject to regulation. The regime choice depends upon four factors: dispute resolution, transactions costs, tragedies of the commons and anticommons, and flexibility in the face of changing technologies and demands. Each regime is described and analyzed against these four factors. With regard to pure transactions costs, the commons may hold an advantage, but it appears quite small. For all other factors, the property rights regime holds very substantial advantages relative to the mixed regime. I conclude that the choice comes down to markets vs. regulation as the mechanism for allocating resources.

    The eroticism of artificial flesh in Villiers de L'Isle Adam's L'Eve Future

    Villiers de L'Isle Adam's 'L'Eve Future', published in 1886, features a fictional version of the inventor Thomas Edison who constructs a complex, custom-made android for the Englishman Lord Ewald as a substitute for his unsatisfactory lover. Hadaly, the android, has a number of literary and cultural precursors and successors. Her most commonly accepted ancestor is Olympia in E. T. A. Hoffmann's 'The Sandman' (1816), and among her fascinating descendants are Oskar Kokoschka's 'Silent Woman'; Model Borghild, a sex doll designed by German technicians during World War II; 'Caracas' in Tommaso Landolfi's short story 'Gogol's Wife' (1954); a variety of gynoids and golems from the realms of science fiction, including Ira Levin's 'The Stepford Wives' (1972); and, most recently, that silicon masterpiece - the Real Doll. All, arguably, have their genesis in the classical myth of Pygmalion. This essay considers the tension between animation and stasis in relation to this myth, and explores the necrophiliac aesthetic implicit in Villiers's novel.

    Implementation of Advanced Tools to Increase Operation Efficiency

    The client, XYZ, is a leader in developing new advanced products for the manufacture of pharma ingredients and intermediates, providing complete value to its customers. Its facility is built across 1,500 acres and consists of 29 production blocks that actively produce pharma ingredients. The client's research team specializes in process design for new drug candidates, development up to gram/kilo scale, structural elucidation, impurity profile studies, process validation, process justification, process optimization, analytical methods development and validation, environmental impact analysis, safety studies, and time-cycle studies. There are over 68 large sites worldwide with diverse application needs, including work orders, calibration, inventory, inspections, and asset management. The deployed Workforce and Asset Management solution now has 900 users who perform operations, calibrations, and inventory work with smart mobile solutions on handheld devices. The results cut overall work-order activity, shorten cycle times, and eliminate all paper processing, saving over 175,000 man-hours per year. This drastically reduces the pre-mobile paper backlog from two weeks to two hours and gives technicians the ability to see all history, inventory levels, and safety guidelines. Within a year, workflow administrative costs will be reduced from $8.00 to $4.00.

    Rational Cybersecurity for Business

    Use the guidance in this comprehensive field guide to gain the support of your top executives for aligning a rational cybersecurity plan with your business. You will learn how to improve working relationships with stakeholders in complex digital businesses, IT, and development environments. You will know how to prioritize your security program, and motivate and retain your team. Misalignment between security and your business can start at the top, in the C-suite, or happen at the line-of-business, IT, development, or user level. It has a corrosive effect on any security project it touches. But it does not have to be like this. Author Dan Blum presents valuable lessons learned from interviews with over 70 security and business leaders. You will discover how to successfully solve issues related to: risk management, operational security, privacy protection, hybrid cloud management, security culture and user awareness, and communication challenges. This open access book presents six priority areas to focus on to maximize the effectiveness of your cybersecurity program: risk management, control baseline, security culture, IT rationalization, access control, and cyber-resilience. Common challenges and good practices are provided for businesses of different types and sizes, and more than 50 specific keys to alignment are included. What you will learn: improve your security culture by clarifying security-related roles, communicating effectively to businesspeople, and hiring, motivating, and retaining outstanding security staff by creating a sense of efficacy; develop a consistent accountability model, information risk taxonomy, and risk management framework; adopt a security and risk governance model consistent with your business structure or culture, manage policy, and optimize security budgeting within the larger business unit and CIO organization IT spend; tailor a control baseline to your organization's maturity level, regulatory requirements, scale, circumstances, and critical assets; help CIOs, Chief Digital Officers, and other executives to develop an IT strategy for curating cloud solutions and reducing shadow IT, building up DevSecOps and Disciplined Agile, and more; balance access control and accountability approaches, leverage modern digital identity standards to improve digital relationships, and provide data governance and privacy-enhancing capabilities; plan for cyber-resilience by working with the SOC, IT, business groups, and external sources to coordinate incident response and to recover from outages and come back stronger; and integrate your learnings from this book into a quick-hitting rational cybersecurity success plan. Who this book is for: Chief Information Security Officers (CISOs) and other heads of security, security directors and managers, security architects and project leads, and other team members providing security leadership to your business.

    Development of Secure Software : Rationale, Standards and Practices

    Society is run by software. Electronic processing of personal and financial data forms the core of nearly all societal and economic activities, and concerns every aspect of life. Software systems are used to store, transfer, and process this vital data. The systems are further interfaced by other systems, forming complex networks of data stores and processing entities. This data requires protection from misuse, whether accidental or intentional. Elaborate and extensive security mechanisms are built around the protected information assets. These mechanisms cover every aspect of security, from physical surroundings and people to data classification schemes, access control, identity management, and various forms of encryption. Despite the extensive information security effort, repeated security incidents keep compromising our financial assets, intellectual property, and privacy. In addition to the direct and indirect costs, they erode trust in the very foundation of information security: the availability, integrity, and confidentiality of our data. Lawmakers at various national and international levels have reacted by creating a growing body of regulation to establish a baseline for information security. Increased awareness of information security issues has led to the extension of this regulation to one of the core issues in secure data processing: the security of the software itself. Information security has many aspects. It is generally classified into organizational security, infrastructure security, and application security. Within application security, the various security engineering processes and techniques utilized at development time form the discipline of software security engineering. The aim of these security activities is to address the software-induced risk to the organization, reduce security incidents, and thereby lower the lifetime cost of the software. Software security engineering manages software risk by implementing security controls directly in the software, and by providing security assurance for the existence of these controls through verification and validation. A software development process typically has several objectives, of which security may form only a part. When security is not expressly prioritized, development organizations tend to direct their resources to the primary requirements. While this produces short-term cost and time savings, the increased software risk induced by a lack of security and assurance engineering will have to be mitigated by other means. In addition to increasing the lifetime cost of software, unmitigated or even unidentified risk has an increased chance of being exploited and of causing other software issues. This dissertation concerns security engineering in agile software development. The aim of the research is to find ways to produce secure software through the introduction of security engineering into agile software development processes. Security engineering processes are derived from the extant literature, industry practices, and several national and international standards. The standardized requirements for software security are traced to their origins in the late 1960s, and the alignment of software engineering and security engineering objectives is followed from their original challenges to the current agile software development methods. The research provides direct solutions for the formation of security objectives in software development, and for the methods used to achieve them.
It also identifies and addresses several issues and challenges found in the integration of these activities into the development processes, providing directly applicable and clearly stated solutions for practical security engineering problems. The research found the practices and principles promoted by agile and lean software development methods to be compatible with many security engineering activities. Automated, tool-based processes and the drive for efficiency and improved software quality were found to directly support the security engineering techniques and objectives. Several new ways to integrate security engineering into agile software development processes were identified. Ways to integrate security assurance into the development process were also found, in the form of security documentation, analyses, and reviews. Assurance artifacts can be used to improve software design and enhance quality assurance. In contrast, detached security engineering processes may create security assurance that serves only purposes external to the software processes. The results provide direct benefits to all software stakeholders, from the developers and customers to the end users. Security awareness is the key to more secure software. Awareness creates a demand for security, and the demand gives software developers the concrete objectives and the rationale for the security work. This also creates a demand for new security tools, processes, and controls to improve the efficiency and effectiveness of software security engineering. At first, this demand is created by increased security regulation. The main pressure for change will emanate from the people and organizations utilizing the software: security is a mandatory requirement, and software must provide it. This dissertation addresses these new challenges. Software security continues to gain importance, prompting new solutions and research.
Software is a central part of our society's basic infrastructure. A significant share of our social and economic activity is based on the electronic processing, storage, and transfer of data. A substantial body of software has been developed to perform these tasks, forming complex networks that enable the shared use of data. To protect this data, numerous protection mechanisms have been built around it to prevent its misuse, whether accidental or intentional. These mechanisms concern not only the software but also its operating environments and users, as well as the data being processed itself; examples include data classification, access restriction, user identity management, and encryption techniques. Despite these protective measures, security breaches endanger both the strategic information assets of business and society and our personal data. In addition to financial losses, attacks erode trust in the cornerstones of information security: the confidentiality, integrity, and availability of data. To protect these foundations of information security, a growing body of regulation has been drafted to define a baseline level of security. Thanks to increased security awareness, this regulation has been extended to cover the core of secure data processing: software development. Information security consists of several areas: organizational security practices, the security of the data processing infrastructure, and, central to this research, the security of the software itself. This area comprises the security techniques and processes used during software development. Their purpose is to reduce, or eliminate entirely, the risks that software poses to organizations. Software development security seeks to lower the lifetime cost of software by specifying and implementing security controls directly in the software itself; in addition, the functioning and effectiveness of these controls are demonstrated through separate verification and validation methods. This dissertation focuses on security work as part of iterative and incremental, so-called agile, software development. The goal of the research is to find new ways to produce secure software by making security work an integral part of the software development processes. The security engineering processes are derived from the scientific and technical literature of the field, prevailing software development practices, and national and international security standards. The development of standardized security requirements is traced from their origins in the 1960s, linking them to the evolution of software development objectives and challenges up to the present era of agile methods. The research presents concrete solutions for setting the security objectives of software development work and for achieving them. It also identifies problems and challenges in combining security work with software development methods, and offers guidelines and alternatives for resolving them. The research indicates that aligning the practices and principles of iterative and incremental software development with security engineering activities improves software quality and security, thereby lowering costs over the software's entire maintenance lifecycle. The automation of software development, tool-based processes, and the pursuit of efficiency and high quality align directly with the methods and objectives of security work. The research identified several new ways to combine software development and security work. In addition, ways were found to use the material produced for security assurance, based on documentation, analyses, and reviews, as part of software design and quality assurance. Kept separate, these processes lead to a situation in which security material is exploited only for needs external to software development. The results benefit all stakeholders, from software developers to customers and end users. Software security work is founded on knowledge and education. Awareness, in turn, increases demand, which gives security work concrete objectives and justification already during the software development phase. The focus of security work shifts from defence and damage repair towards the structural prevention of damage. Demand also creates a need for new tools, processes, and techniques that increase the efficiency and effectiveness of security work. At present, this demand is created mainly by increased security regulation, but most of the pressure for change will arise from the requirements of those who commission and use software: the economic significance of software security capability is growing. The importance of information security will be emphasized even further, increasing the need for security work and research in the future.
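    As a small, hypothetical illustration of the automated, tool-based security assurance described above (the render_comment function and its test are invented examples, not material from the dissertation), the sketch below shows a security control, output encoding of user input, verified by a regression test that can run in the same continuous-integration pipeline as ordinary agile unit tests, so the assurance artifact stays inside the development process rather than outside it.

```python
# Illustrative sketch only: an automated security regression test that runs
# alongside ordinary unit tests. The application code (render_comment) is a
# hypothetical example, not taken from the dissertation.
import html


def render_comment(comment: str) -> str:
    """Hypothetical security control: HTML-encode user input before display."""
    return "<p>{}</p>".format(html.escape(comment))


def test_comment_is_html_encoded():
    """Security assurance as a test: script tags must not survive rendering."""
    rendered = render_comment('<script>alert("x")</script>')
    assert "<script>" not in rendered
    assert "&lt;script&gt;" in rendered


if __name__ == "__main__":
    test_comment_is_html_encoded()
    print("security regression test passed")
```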

    Learning From Movie-Sets Coordination

    A comparative study of large, professional movie sets and construction sites revealed that conditions for movie production were far more complex and unpredictable, with sudden surprises, changing facilities, and often hundreds of people from independent professional groups to be coordinated from hour to hour at different locations (Egebjerg, 2012). These are the uncertain production conditions that we usually hear of as an excuse for not being able to control construction sites. Yet the study also shows that the movie industry is far better at controlling production time, budget, and quality exactly as initially planned (Deloitte, 2010). Part of the explanation for this difference in performance is a process management system that large movie studios have developed over the years, including a digital tool for micro-managing the process in a flexible way to suit the dynamic processes. Both industries live with the reality of obstacles occurring on the 'critical path' and of progress being slower than anticipated, so there is a need for a map of alternative 'short cuts' to get back on track. Usually this work is managed inside the head of a professional construction manager, but even the best person in the job can become tired from maintaining a high level of overview, information, and coordination across escalating problem chains. This paper describes the innovative experiment of designing, and testing with construction professionals, a prototype of a digital tool and system similar to that used on movie sets for detailed scheduling, coordinating, and micro-managing construction processes. The current working name for this digital tool is "Short Cut", and it is seen as a future supplement to common scheduling tools, with application to project parts that are particularly sensitive to deadlines or other complexities.
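    As a rough, hypothetical illustration of the dependency-aware rescheduling such a tool must support (the task names, durations, and dependencies below are invented, and this is not the "Short Cut" prototype itself), the sketch computes earliest finish times over a small task-dependency graph and lists which tasks can still proceed when an obstacle blocks an activity on the critical path.

```python
# Minimal sketch (invented tasks and durations): dependency-aware scheduling of
# the kind a coordination tool like the one described would need to support.

# Each task: (duration in hours, list of tasks it depends on).
tasks = {
    "clear site":     (4, []),
    "scaffolding":    (6, ["clear site"]),
    "deliver bricks": (2, []),
    "brickwork":      (8, ["scaffolding", "deliver bricks"]),
    "electrics":      (5, ["clear site"]),
    "inspection":     (1, ["brickwork", "electrics"]),
}


def earliest_finish_times(tasks):
    """Earliest finish time per task, computed in dependency order."""
    finish = {}

    def visit(name):
        if name not in finish:
            duration, deps = tasks[name]
            start = max((visit(d) for d in deps), default=0)
            finish[name] = start + duration
        return finish[name]

    for name in tasks:
        visit(name)
    return finish


def downstream_of(tasks, blocked):
    """All tasks that directly or transitively depend on the blocked task."""
    affected = {blocked}
    changed = True
    while changed:
        changed = False
        for name, (_, deps) in tasks.items():
            if name not in affected and any(d in affected for d in deps):
                affected.add(name)
                changed = True
    return affected


finish = earliest_finish_times(tasks)
print("planned project length:", max(finish.values()), "hours")

blocked = "scaffolding"  # e.g. an obstacle appears on the critical path
affected = downstream_of(tasks, blocked)
alternatives = [t for t in tasks if t not in affected]
print("can still proceed while", blocked, "is delayed:", alternatives)
```

    A real tool would also track which tasks are already complete and re-plan continuously; the point of the sketch is only that the "map of alternative short cuts" is, at bottom, a reachability question over the task-dependency graph.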

    Building a boundaryless manufacturing organisation through HITOP method

    There is little empirical research to support the allegation that 'leagile' manufacturing organisations thrive in hostile environments, nor has it been demonstrated that organisation processes (referred to as enablers) actually support 'leagile' performance. This study tests the statistical significance of five selected HITOP (highly integrated technology, organisation and people) 'leagile' enablers. This was accomplished by using a mail survey instrument to measure the presence of 'leagile' enablers in a sample of companies taken from best factory award winners in the UK, the US, and Japan. [Continues.]
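    The abstract does not state which statistical procedure was used; as a generic, hypothetical illustration of testing whether the presence of one enabler is significantly associated with performance, a chi-square test of association on an invented 2x2 table of survey counts might look like this.

```python
# Generic illustration only (invented counts, not the paper's data or its exact
# test): is the presence of a 'leagile' enabler associated with performance?
from scipy.stats import chi2_contingency

# Rows: enabler present / absent; columns: high performers / lower performers.
observed = [[18, 7],
            [5, 12]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
if p < 0.05:
    print("enabler presence and performance appear associated at the 5% level")
else:
    print("no statistically significant association at the 5% level")
```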

    Characterizing industry-academia collaborations in software engineering: evidence from 101 projects

    Research collaboration between industry and academia supports improvement and innovation in industry and helps ensure the industrial relevance of academic research. However, many researchers and practitioners in the community believe that the level of joint industry-academia collaboration (IAC) projects in Software Engineering (SE) research is relatively low, creating a barrier between research and practice. The goal of the empirical study reported in this paper is to explore and characterize the state of IAC with respect to industrial needs, developed solutions, impacts of the projects, and also a set of challenges, patterns, and anti-patterns identified by a recent Systematic Literature Review (SLR) study. To address the above goal, we conducted an opinion survey among researchers and practitioners with respect to their experience in IAC. Our dataset includes 101 data points from IAC projects conducted in 21 different countries. Our findings include: (1) the most popular topics of the IAC projects in the dataset are software testing, quality, process, and project management; (2) over 90% of IAC projects result in at least one publication; (3) almost 50% of IACs are initiated by industry, busting the myth that industry tends to avoid IACs; and (4) 61% of the IAC projects report having a positive impact on their industrial context, while 31% report no noticeable impacts or were "not sure". To improve this situation, we present evidence-based recommendations to increase the success of IAC projects, such as the importance of testing pilot solutions before using them in industry. This study aims to contribute to the body of evidence in the area of IAC, and to benefit researchers and practitioners. Using the data and evidence presented in this paper, they can conduct more successful IAC projects in SE by being aware of the challenges and how to overcome them, by applying best practices (patterns), and by preventing anti-patterns. The authors would like to thank the researchers and practitioners who participated in this survey. João M. Fernandes was supported by FCT (Fundação para a Ciência e Tecnologia) within the Project Scope UID/CEC/00319/2013. Dietmar Pfahl was supported by the institutional research grant IUT20-55 of the Estonian Research Council. Andrea Arcuri was supported by the Research Council of Norway (grant agreement No 274385). Mika Mäntylä was partially supported by an Academy of Finland grant and an ITEA3/TEKES grant.