5,847 research outputs found

    DAVID D2.2: Analysis of loss modes in preservation systems

    This is a report on the way in which loss and damage to digital AV content occurs for different content types, AV data carriers and preservation systems. Three different loss modes have been identified, and each has been analysed in terms of existing solutions and long-term effects. This report also includes an in-depth treatment of format compatibility (interoperability issues), format resilience to carrier degradation and format resilience to corruption
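    The corruption loss mode analysed in the report is commonly detected in preservation systems through fixity checking. The sketch below is a minimal, hypothetical illustration of that idea, not material from the DAVID report: a file's checksum is recorded at ingest and re-computed later to see whether the file or its carrier has degraded.

```python
# Minimal fixity-check sketch (illustrative only): detect the corruption loss
# mode by comparing a file's current SHA-256 digest against the digest
# recorded at ingest. Function names and workflow are assumptions.
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_ok(path: Path, digest_at_ingest: str) -> bool:
    """True if the file still matches the checksum recorded when it was ingested."""
    return sha256_of(path) == digest_at_ingest
```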

    Waste effectiveness of the construction industry: Understanding the impediments and requisites for improvements

    The construction industry contributes a large portion of waste to landfill, which in turn results in environmental pollution and CO2 emissions. Despite the adoption of several waste management strategies, reducing waste to landfill continues to seem an insurmountable challenge. This paper explores factors impeding the effectiveness of existing waste management strategies, as well as strategies for reducing the waste intensiveness of the construction industry. Drawing on a series of semi-structured focus group discussions with experts from leading UK construction companies, the paper combines a phenomenological approach with a critical review and analysis of the extant literature. Five broad categories of factors and practices are responsible for the ineffectiveness of construction and demolition waste management strategies, which subsequently results in the waste intensiveness of the industry. These include end-of-pipe treatment of waste, externality and incompatibility of waste management tools with design tools, atomism of waste management strategies, perceived or unexpected high cost of waste management, and a culture of waste behaviour within the industry. To reduce the waste intensiveness of the construction industry, the study suggests six requisites: tackling waste at the design stage, whole-life waste consideration, compliance of waste management solutions with BIM, cheaper waste management practice, increased stringency of waste management legislation and fiscal policies, and research and enlightenment. The proposed strategies are important not only for achieving low-waste construction projects but also for reducing the waste intensiveness of the construction industry as a whole. Implementation of the suggested measures would drive improvements in waste management practices within the construction industry

    FAIRness and Usability for Open-access Omics Data Systems

    Omics data sharing is crucial to the biological research community, and the last decade or two has seen a huge rise in collaborative analysis systems, databases, and knowledge bases for omics and other systems biology data. We assessed the FAIRness of NASA's GeneLab Data Systems (GLDS) along with four similar systems in the research omics data domain, using 14 FAIRness metrics. The range of overall FAIRness scores was 6-12 (out of 14), with an average of 10.1 and a standard deviation of 2.4. The range of Pass ratings for the metrics was 29-79%, Partial Pass 0-21%, and Fail 7-50%. The systems we evaluated performed best in the areas of data findability and accessibility, and worst in the area of data interoperability. Reusability of metadata, in particular, was frequently not well supported. We relate our experiences implementing semantic integration of omics data from some of the assessed systems for federated querying and retrieval functions, given their shortcomings in data interoperability. Finally, we propose two new principles that Big Data system developers, in particular, should consider for maximizing data accessibility
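    To make the scoring arithmetic concrete, the sketch below tallies hypothetical Pass / Partial Pass / Fail ratings over 14 metrics into an overall score per system. The weighting (Pass = 1, Partial Pass = 0.5, Fail = 0) and the example ratings are assumptions for illustration, not the paper's published data.

```python
# Hypothetical FAIRness tally: weight each of the 14 metric ratings and sum.
# Weights and ratings below are illustrative assumptions, not the study's data.
from statistics import mean, stdev

WEIGHTS = {"Pass": 1.0, "Partial Pass": 0.5, "Fail": 0.0}

def fairness_score(ratings):
    """Sum the weighted ratings for one system (14 metrics expected)."""
    return sum(WEIGHTS[r] for r in ratings)

systems = {
    "System A": ["Pass"] * 11 + ["Partial Pass"] * 2 + ["Fail"],
    "System B": ["Pass"] * 9 + ["Partial Pass"] + ["Fail"] * 4,
    "System C": ["Pass"] * 6 + ["Fail"] * 8,
}

scores = {name: fairness_score(r) for name, r in systems.items()}
print(scores)                              # per-system totals out of 14
print(round(mean(scores.values()), 1),     # average across systems
      round(stdev(scores.values()), 1))    # sample standard deviation
```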

    Oiling global capital accumulation: analyzing the principles, practices and geographical distribution of Islamic financial services

    This article focuses on the Islamic financial services (IFS) sector, which originated in the Middle East, but is now rapidly becoming a global sector. First, Islamic economic ideology is discussed, which resulted in the foundation of IFS firms after the 1973 oil crisis, and then an overview of the most common IFS is provided. The third part discusses the global distribution of IFS firms and Shari'a compliant assets. The Middle East is at the apex of the IFS sector, with the Islamized economies of Iran and Pakistan and prime hubs such as Manama and Dubai. Outside the Middle East, Malaysia is identified as an important growing market for IFS, while outside the Muslim world, London is increasingly profiling itself as a global IFS hub

    The SAC Proposal for the Monetization of the File Sharing of Music in Canada: Does it Comply with Canada’s International Treaty Obligations Related To Copyright?

    In November 2007, the Songwriters Association of Canada (SAC) released a proposal for the monetization of the file sharing of music in Canada. This article attempts to determine whether or not Canada, given its international and bilateral treaty agreements, could ever adopt the SAC’s proposal. The article approaches this analysis through the “three-step test”, which was adopted under the Berne Convention in 1971 and enshrined in the subsequent TRIPS Agreement and NAFTA; the article also analyzes whether or not the Proposal is compatible with Canada’s obligations under the Rome Convention. The article concludes that, without amendments to the international copyright treaties to which Canada is a party, a proposal like the SAC’s could not be successfully enacted

    Combating e-discrimination in the North West - final report

    The Combating e-Discrimination in the North West project examined over 100 websites advertising job opportunities both regionally and nationally, and found the vast majority to be largely inaccessible. Professional standards, such as using valid W3C code and adhering to the W3C Web Content Accessibility Guidelines, were largely not followed. The project also conducted interviews with both public and private sector web professionals, and focus groups of disabled computer users, to draw a broader picture of the accessibility of jobs websites. Interviews with leading web development companies in the Greater Manchester region showed a view that there should not be any additional cost in making websites accessible, as the expertise to create a site professionally should be in place from the start, and that accessibility will follow from applying professional standards. However, through the process of trying to create a website for the project with such a company, it was found that following professional standards is not sufficient to catch all the potential problems, and that user testing is an essential adjunct to professional practice. The main findings of the project are thus that:
    • Most websites in the job opportunities sector are not following professional standards of web development, and are largely inaccessible
    • Professional standards of web development need to be augmented with user testing to ensure proper accessibility
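    As a concrete, hypothetical illustration of the kind of automated standards check the report contrasts with user testing, the sketch below flags one common WCAG failure: images with no alt attribute. It is not the project's actual tooling, and passing such automated checks is, as the report argues, necessary but not sufficient.

```python
# Minimal automated accessibility check (illustrative only): flag <img>
# elements that have no alt attribute at all. Real audits combine many such
# checks with user testing.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag.
        if tag == "img" and "alt" not in dict(attrs):
            self.issues.append(f"img at line {self.getpos()[0]} has no alt attribute")

checker = MissingAltChecker()
checker.feed('<h1>Vacancies</h1>\n<img src="apply-button.png">')
print(checker.issues)   # ['img at line 2 has no alt attribute']
```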

    Operationalizing Transparency and Explainability in Artificial Intelligence through Standardization

    As artificial intelligence (AI) has developed, it has spread to almost every aspect of our society, from electric toothbrushes and telephone applications to automated transportation and military use. As AI becomes more ubiquitous, its importance and impact on our society grow continuously. With the pursuit and development of more efficient and accurate artificial intelligence applications, AI systems have evolved into so-called “black box” models, whose operation and decision-making have become immensely complex and difficult to understand, even for experts. As AI is increasingly applied in more critical and sensitive areas, such as healthcare, for instance in support of diagnoses, the lack of transparency and explainability of these complex models and their decision-making has become a problem. If there is no understandable argumentation backing up the results produced by the system, its use is questionable or even ethically impossible in such areas. Furthermore, these AI systems may be misused or behave in very unexpected and potentially harmful ways. Issues related to the governance of AI systems are thus more important than ever before. Standards provide one way to implement AI governance and promote the transparency and explainability of AI systems. This study examines how the role of standardization in promoting AI transparency and explainability is perceived from an organizational perspective and what kinds of AI transparency and explainability needs are identified among different organizational actors. In addition, it seeks to identify possible drivers of and barriers to the adoption of AI transparency and explainability standards. The research was carried out by interviewing representatives from a total of 11 different Finnish organizations working in the field of AI. The data gathered from the interviews were analyzed using the Gioia method. Based on this analysis, five different roles for standards were identified in the promotion of explainability and transparency in AI: 1. Facilitator, 2. Validator, 3. Supporter, 4. Business enhancer, and 5. Necessary evil. Furthermore, the identified AI transparency and explainability needs comprise needs for ensuring the general acceptability of AI and risk management needs. Finally, the identified drivers for adopting AI transparency and explainability standards comprise the requirements of the operating environment, business-facilitating drivers, and business improvement drivers, whereas the barriers consist of a lack of resources, a lack of knowledge and know-how, downsides of standardization, and the incompatibility of standardization and AI. In addition, the results showed that the implementation of possible standards for AI transparency and explainability is largely driven by binding legislation and financial incentives rather than ethical drivers. Furthermore, building trust in AI is seen as the ultimate purpose of transparency and explainability and their standardization. This dissertation provides an empirical basis for future research regarding the need for AI standardization, standards adoption, and AI transparency and explainability from an organizational perspective.

    Religion and modern finance: Islamic banking and the use of derivatives

    JEL Classification System: N25, G210. Throughout modern history, the world has faced economic crises ranging from the Great Depression of the 1930s in the United States of America to the dot-com bubble collapse in 2001 and, more recently, the subprime crisis, whose aftershocks are still felt in several countries. Amidst these cyclical events, Islamic banking has been gaining traction and increased importance in countries where the Islamic religion is predominant: in Gulf Cooperation Council countries, in Asian countries and in African countries. The defining characteristics of Islamic banking (which follows Shariah law) seem to counterbalance the pitfalls that led to the above-mentioned crises, though it is not clear whether the road ahead for this industry, and the developments it needs to achieve in order to continue its growth, are feasible and complementary to the core of Shariah law. The development of risk management tools, in particular, becomes increasingly important as the Islamic banking industry grows. In order to prevent a path similar to that of conventional banking, a kind of symbiosis between the tenets of Shariah and the use of derivative instruments needs to be reached. The analysis of this relationship is the goal of this thesis. This is done by reviewing the current situation of Islamic banking, analyzing a sample of the available products in both systems, and analyzing the possibility of implementing and standardizing derivatives under Shariah through the current process of approval of such instruments

    Software Interfaces - Stuck in the Middle: The Relationship Between the Law and Software Interfaces in Regulating and Encouraging Interoperability

    Interoperability of software programs is associated with competition and innovation. This has resulted in exceptions to copyright protection for software interfaces and in intervention by competition authorities. Despite this, the hidden and unreadable nature of machine code and the limitations of reverse engineering have given software interfaces much stronger protection than is normally associated with copyright. This article reviews the theoretical and empirical justifications for interoperability. Then, against the backdrop of the prevailing uncertainty pending the decision of the ECJ in the case of SAS Institute Inc v. World Programming Ltd, the author analyses the development of the law on the status of software interfaces in Europe and the USA. The effectiveness of compulsory disclosure of interoperability information under the “exceptional circumstances” test is considered, but the shortcomings of this approach are identified. The possibility of relaxing the restriction on dissemination of interface information is proposed