
    A Survey on Wireless Sensor Network Security

    Wireless sensor networks (WSNs) have recently attracted a lot of interest in the research community due to their wide range of applications. Because of the distributed nature of these networks and their deployment in remote areas, they are vulnerable to numerous security threats that can adversely affect their proper functioning. The problem is even more critical when the network is deployed for a mission-critical application, such as on a tactical battlefield. Random failure of nodes is also very likely in real-life deployment scenarios. Owing to the resource constraints of the sensor nodes, traditional security mechanisms with large computation and communication overheads are infeasible in WSNs; security in sensor networks is therefore a particularly challenging task. This paper discusses the current state of the art in security mechanisms for WSNs. Various types of attacks are discussed and their countermeasures presented. A brief discussion of future research directions in WSN security is also included. Comment: 24 pages, 4 figures, 2 tables

    Privacy preserving protocols for smart meters and electric vehicles

    Master's thesis, Segurança Informática, Universidade de Lisboa, Faculdade de Ciências, 2015. There is currently a trend to add more intelligence at various points of the electric grid, enabling bidirectional communication between the electricity utility and our homes. Over the coming years, the energy meters in our homes will gradually be replaced by more capable equipment, the smart meter. Smart meters can collect information about energy consumption in real time and forward the data to the utility. They can also receive commands from the utility (or other parties) and act on them, for instance by interacting with local equipment (e.g., an air conditioner or freezer) to adjust its operating mode and temporarily reduce energy consumption. Smart meters can further support local energy production (solar panels or wind generators) and storage (a battery bank or an electric vehicle), which requires coordinating their operation with the utility companies. Properly coordinated, these meters can reduce overall consumption peaks, avoiding grid investments aimed solely at handling extreme conditions that tend to occur during working hours. The growing use of electric vehicles will also create a large energy demand: if every vehicle became electric, the current grid could not handle the resulting peak. These vehicles, however, may also be able to transfer part of their energy back to the grid, meaning they could be used, together with other alternative generation sources, to compensate for fluctuations in energy consumption.
When efficient, this coordination can bring major advantages in extreme situations, for example when the energy supply is reduced: the meters can fully or partially switch off household appliances, allowing energy to be distributed more fairly and, if necessary, prioritising certain locations such as hospitals. As expected, this kind of setting is prone to many forms of attack, from eavesdropping on communications to physical tampering with the smart meters. It is therefore necessary to develop secure protocols to protect the devices and applications that will operate in the future electric grid. This project in particular develops a solution that protects the communications between the smart meter and the energy distribution company against privacy attacks, in which the adversary learns what the user is doing at home by monitoring, in real time, the information transmitted by the smart meter. Recent years have also seen rapid progress in wireless power transfer technologies; some prototypes are already in operation, such as the charging of electric-bus batteries at a university in South Korea. Widespread adoption of this technology would require new forms of payment that allow electric vehicles to charge while in motion. An overly simple protocol for this task could allow the driver to be identified whenever and wherever the vehicle's batteries are charged, something that does not happen with a traditional fuel purchase paid in cash. This project deals with two related strands of energy-consumption metering.
One strand is based on the smart meters in homes, the other on the "meters" in vehicles (more precisely, the payment of energy transferred wirelessly to a moving vehicle). Several previously proposed techniques and algorithms that can contribute to a solution are presented, although none of them meets all the intended requirements and functionality in isolation; their relation to existing work is also established. A specific protocol is studied, Low Overhead PrivAcy (LOPA), which organises several meters into a group. Within each group, a secret key is established between the group's meters; from it, each meter derives a masking key that is added to every measurement it sends to an aggregator, so that no one can see the individual reading. By summing the masked measurements of all meters in a group, the aggregator obtains the group's total consumption; because of the way the keys are generated, however, it cannot recover any individual measurement, which guarantees the privacy of each household. This protocol is explained in detail, implemented and evaluated. Three protocols for paying for wireless energy transfer are also proposed; they preserve the anonymity of a vehicle, preventing anyone from learning when or where it circulates. The protocols also handle transmission inefficiencies and are fast, simple and secure enough to be used by cars moving at normal driving speeds. One protocol supports post-paid energy transfer; the other two use a pre-payment model, one with temporary accounts and the other with digital cash.
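As a concrete illustration of the masked-aggregation idea behind LOPA, the Python sketch below shows how pairwise masks cancel so that the aggregator learns only the group total. The hash-based PRF, the 32-bit modulus, and all names are illustrative simplifications for exposition, not the protocol's actual primitives:

```python
import hashlib

MOD = 2**32  # all arithmetic is modular so masks cancel exactly

def prf(key: int, round_id: int) -> int:
    # Hash-based pseudorandom function (illustrative, not production crypto).
    h = hashlib.sha256(f"{key}:{round_id}".encode()).hexdigest()
    return int(h, 16) % MOD

def masked_reading(meter_id, reading, pairwise_keys, round_id):
    # pairwise_keys: {other_id: shared_key} for every other meter in the group.
    # The lower-numbered meter of each pair adds the PRF value, the other
    # subtracts it, so the masks sum to zero over the whole group.
    mask = 0
    for other_id, key in pairwise_keys.items():
        r = prf(key, round_id)
        mask = (mask + r) % MOD if meter_id < other_id else (mask - r) % MOD
    return (reading + mask) % MOD

# Demo: three meters with pairwise shared keys.
keys = {(1, 2): 1111, (1, 3): 2222, (2, 3): 3333}

def keys_for(m):
    return {o: keys[tuple(sorted((m, o)))] for o in (1, 2, 3) if o != m}

readings = {1: 5, 2: 7, 3: 11}
total = sum(masked_reading(m, readings[m], keys_for(m), round_id=42)
            for m in (1, 2, 3)) % MOD
assert total == sum(readings.values())  # masks cancel; only the sum is visible
```

Each individual masked value looks random to the aggregator; only the group-wide sum reveals anything, and only the aggregate consumption.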
These protocols are built from a set of messages that employ techniques such as digital signatures (to guarantee the integrity and authenticity of the communications), encryption, digital cash and trusted third parties to provide confidentiality. The aim is to secure the payment while still allowing the charging point to identify the party responsible for the vehicle in case of non-payment. The digital cash and pseudo-anonymous profile protocols were implemented and evaluated on two different platforms; the experimental results were very satisfactory, indicating that these protocols could be used in practice. There is currently a trend to add more intelligence to various points of the electric grid, thus enabling a bidirectional communication path between the electrical utility company and our homes by upgrading the existing components along the way. For example, the metering devices in our homes will gradually be replaced with more capable equipment, called smart meters. Smart meters can collect information about energy spending in real time and forward this data to the utility. Moreover, they can receive information from the utility (or other operators) and act on it, for instance by interacting with local equipment (e.g., an air conditioner or refrigerator) to adjust its operation mode (e.g., make it decrease energy use). Smart meters can also support local energy production (e.g., solar panels or windmills) and storage (e.g., batteries) by coordinating their operation with the utility companies. As expected, this sort of setting is prone to many forms of attack, ranging from eavesdropping on the communications to physical tampering with the smart meters. Therefore, it is necessary to develop secure protocols that can be used to protect the devices and applications that will be operating in this future smart grid. 
In particular, this project studies and evaluates a solution that protects the communications between the smart meter and the electrical company against attacks on privacy. For instance, it addresses a form of attack in which the adversary learns what a person is doing at home by monitoring the messages transmitted by the smart meter in real time. Recent years have seen rapid developments in Wireless Power Transfer (WPT) technology, and some prototypes are already in operation, such as charging the batteries of electric buses at a university in South Korea. Widespread use of this technology will require new forms of accounting for and payment of energy. This project proposes a protocol for the payment of energy transfer that ensures the anonymity of the vehicle, precluding attacks that attempt to determine where it circulates. The protocol also handles transmission inefficiencies, ensuring fast, simple and adequate operation for cars moving at normal driving speeds.
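One classic way to make in-motion pre-paid micropayments both fast and verifiable, loosely in the spirit of the pre-payment protocols described above, is a PayWord-style hash chain: the vehicle commits to the tip of a chain and pays for each energy unit by revealing the previous preimage, which the charging point checks with a single hash. This is an illustrative sketch of that general technique under assumed simplifications, not the thesis's actual protocol:

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int):
    # chain[0] = seed, chain[i] = H(chain[i-1]); chain[n] is the commitment
    # the vehicle would sign (or have certified) before driving.
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    return chain

# Vehicle pre-pays for n energy units and commits to the chain tip.
n = 5
chain = make_chain(b"secret-seed", n)
commitment = chain[-1]

# To pay for unit i, the vehicle reveals chain[n - i]; the charging point
# verifies each token with one hash against the last accepted value.
accepted = commitment
for i in range(1, n + 1):
    token = chain[n - i]
    assert h(token) == accepted  # token is the preimage of the previous value
    accepted = token
```

Verification costs one hash per unit, which is why this style of scheme is attractive for payments that must complete while a vehicle passes a charging point at normal driving speed.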

    From access to re-use:

    If data are the building blocks to generate the information needed to acquire knowledge and understanding, then geodata, i.e. data with a geographic component, are the building blocks for information vital for decision-making at all levels of government, for companies and for citizens. Governments collect geodata and create, develop and use geo-information - also referred to as spatial information - to carry out public tasks, as almost all decision-making involves a geographic component, such as a location or demographic information. Geo-information is often considered “special” for technical, economic and legal reasons. It is considered special for technical reasons because geo-information is multi-dimensional, voluminous and often dynamic, and can be represented at multiple scales. Because of this complexity, geodata require specialised hardware, software, analysis tools and skills to collect, to process into information and to use for analyses. Geo-information is considered special for economic reasons because its economics set it apart from other products. The fixed costs of producing geo-information are high, especially for large-scale geo-information such as topographic data, whereas the variable costs of reproduction are low and do not increase with the number of copies produced. In addition, there are substantial sunk costs, which cannot be recovered from the market. As such, geo-information shows characteristics of a public good, i.e. a good that is non-rivalrous and non-excludable. However, to protect the high investment costs, re-use of geo-information may be limited by legal and/or technological means such as intellectual property rights and digital rights management. Thus, by making geo-information excludable, it becomes a club good, i.e. a non-rivalrous but excludable good. 
By claiming intellectual property rights, such as copyright and/or database rights, and restricting (re-)use through licences and licence fees, geo-information can be commercially exploited and used to recover some of the investment costs. Geo-information is considered special for a number of legal reasons. First, as geo-information has a geographic component, e.g. a reference to a location, it may contain personal data, sensitive company data, environmentally sensitive data, or data that may pose a threat to national security. Therefore, a dataset may have to be adapted, aggregated or anonymised before it can be made public. Secondly, geo-information may be subject to intellectual property rights: there may be a copyright on cartographic images or database rights on digital information. Such intellectual property rights may be claimed by third parties involved in the information chain, e.g. a private company supplying aerial photography to the National Mapping Authority. The data holder may also claim intellectual property rights to commercially exploit the dataset and recoup some of the vast investment costs made to produce it. Lastly, there may be other (international) legislation or agreements that may either impede or promote publishing public sector information, and in some cases these policies may contradict each other. It has been recognised that, to deal with national, regional and global challenges, it is essential that geo-information collected by one level of government or government organisation be shared between all levels of government via a so-called Spatial Data Infrastructure (SDI). The main principles governing SDIs are that data are collected once and (re-)used many times; that data should be easy to discover, access and use; and that data are harmonised so that spatial data from different sources can be combined seamlessly. 
In line with the SDI governing principles, this dissertation considers accessibility of information to include all these aspects. Accessibility concerns not only access to data, i.e. being able to view the data without being able to alter the contents, but also re-use of data, i.e. being able to download and/or invoke the data, and sharing of data, including being able to provide feedback and/or input for co-generated information. Accessibility of public sector geo-information is not only essential for effective and efficient government policy-making but is also associated with realising other ambitions, such as a more transparent and accountable government, greater citizen participation in democratic processes, (co-)generation of solutions to societal problems, and increased economic value from companies creating innovative products and services with public sector information as a resource. Especially the latter ambition has been the subject of many international publications stressing the enormous potential economic value of re-use of public sector (geo-)information by companies. Previous research indicated that re-users of public sector information in Europe encountered barriers related to technical, organisational, legal and financial aspects, which was deemed to be the main reason why the number of value-added products and services based on public sector information in Europe lagged behind the United States. The latter two barriers (restrictive licence conditions and high licence fees) were often cited as the main barriers for re-users in Europe. However, in spite of considerable resources invested by governments to establish spatial data infrastructures, to facilitate data portals and to release public sector information as open data, i.e. 
without legal and financial restrictions, the expected surge of value-added products based on public sector information has not quite eventuated to date and the benefits still appear to lag expectations. When this research started a decade ago, the debate around accessibility of public sector information focussed on access policies. These ranged from open access (data available with a minimum of legal restrictions and for no more than marginal dissemination costs) to full cost recovery, whereby all costs incurred in collection, creation, processing, maintenance and dissemination are to be recovered from the re-users. Most public sector bodies in the European Union adhered to a cost recovery policy for allowing re-use of public sector information. In 2003, the European Commission adopted two directives to ensure better accessibility of public sector information. Directive 2003/4/EC of the European Parliament and of the Council of 28 January 2003 on public access to environmental information and repealing Council Directive 90/313/EEC, the so-called Access Directive, provided citizens the right of access to environmental information. Citizens should be able to access documents related to the environment via a register, preferably in electronic form, and if a copy of a document is requested, the charges must not exceed marginal dissemination costs. Directive 2003/98/EC of the European Parliament and of the Council of 17 November 2003 on the re-use of public sector information, the so-called PSI Directive, was intended to create conditions for a level playing field for all re-users of public sector information. However, the PSI Directive of 2003 left room for public sector organisations to maintain a cost recovery regime with restrictive licence conditions. In spite of these directives, access policies for geographic data were slow to change in most European nations. 
At the end of the last decade, accessibility of public sector information received two major impulses. The first was the implementation of Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE), the so-called INSPIRE Directive, which established a framework of standardisation rules for data and for publishing via web services, and significantly contributed to the accessibility of public sector geo-information. The second was the development of open data policies following the USA Open Government Directive of 2009 and the Digital Agenda for Europe of 2010. These two impulses were the main drivers in Europe of a careful move from cost recovery policies to open access or open data policies, and of more public sector information being made available as open data. Thus, of the four barriers to re-use of public sector information cited in Chapter 1 (legal, financial, technical and organisational), two should have been lifted to a large degree by open data. This shift to open data provided an excellent opportunity to test the hypothesis that the main barriers for re-users of public sector information were indeed restrictive licences and high fees, as suggested by earlier research. Chapter 2 showed that by 2008 most European Union Member States had transposed and implemented the 2003/98/EC PSI Directive, although in various ways and with considerable delay. By 2008, the effects of the PSI Directive were only slowly starting to emerge. A number of Member States reviewed their access policies and more public sector information became available for re-use. Some Member States made the information available free of charge or reduced their fees significantly. 
In many cases where re-use fees were reduced, the number of regular re-users increased significantly and total revenue even increased in spite of the lower fees. Although the 2007/2/EC INSPIRE Directive paved the way for technical interoperability by providing guidelines for web services and catalogues, neither the INSPIRE Directive nor the PSI Directive tackled the issue of legal interoperability. Chapter 2 also demonstrated that a major barrier to creating a level playing field for the private sector was the fact that some public sector bodies acted as value-added resellers by developing and selling products and services based on their own data. Thus, the level playing field envisioned by the European Commission had not been realised. Chapter 3 researched harmonised licences as a first step towards legal interoperability. Earlier research had indicated that one of the biggest barriers for re-users was complex, opaque and inconsistent licence conditions, especially for re-users wanting to combine data from multiple sources. A survey of licences used by public sector data providers in the Netherlands demonstrated that, although there were differences in length and language, there were also many similarities. The conclusion was that the introduction of a licence suite inspired by the Creative Commons concept would be a step towards increased transparency and consistency of geo-information licence agreements. This chapter introduced a conceptual model for such a geo-information licence suite, the so-called Geo Shared licences. Both the Creative Commons and Geo Shared licence suites enable harmonisation of licence conditions and promote transparency and legal interoperability, especially when re-users combine data from different sources. The Geo Shared licence suite became a serious option for inclusion as an annex in the draft version of the INSPIRE Directive. 
Unfortunately, in 2006 the concept of one licence suite for the entire European Union came too early. The Geo Shared licences were further developed and implemented in the Dutch National Geo Register. In 2009, the European Commission recognised that PSI was the single largest source of information in Europe and that the potential for its re-use needed to be highlighted in the digital age. As part of a review of the 2003/98/EC PSI Directive, the European Commission carried out a round of consultations with stakeholders in 2010 to seek their views on specific issues to be addressed in the future. In addition, the Commission commissioned a number of studies, including a review of studies on public sector information re-use and related market studies, an assessment of the different models of supply and charging for public sector information, and a study on public sector information re-use in the cultural sector. The first study, carried out by Graham Vickery in 2011, showed that the overall economic gain from opening up public sector information as a resource for new products and services could be in the order of €40 billion per annum in the European Union. Both the Vickery Report and the second study, the so-called POPSIS Study, showed that for most public sector data providers revenues from licence fees were relatively low in comparison to their total budget. After the evaluation, Directive 2013/37/EU of the European Parliament and of the Council of 26 June 2013 amending Directive 2003/98/EC on the re-use of public sector information was adopted and came into force on 17 July 2013. Chapter 4 described the main changes of the 2013/37/EU Amended PSI Directive, including the recommendation to employ open data licences. The chapter continued with a review of the various open data licences in use in Europe and analysed their interoperability. 
Although the adoption of open data licences for public sector information should have addressed legal interoperability barriers for re-users, in practice the different types of open data licences might not be so interoperable after all. Effectively, only a public domain declaration, such as a Creative Commons Zero (CC0) declaration, is suitable for open data re-users working with cross-border data sets, provided that such a declaration is published in a prominent place to remove uncertainty for re-users. Without a public domain declaration, re-use of open data is still impeded, as re-users are loath to invest time in the development of value-added products or services when it is uncertain whether and which restrictions may be applicable and what their impact may be on the product or service. This dissertation also researched the financial and economic aspects of public sector information accessibility. Chapters 1 and 2 indicated that a cost recovery regime for dissemination of public sector information posed a financial barrier for private sector re-users because the fees charged were perceived to be too high. However, in 2008 there were still many advocates for maintaining a cost recovery regime. Especially public sector bodies that are not funded by the national Treasury, the so-called self-funding agencies, needed revenue from data sales to cover a substantial part of their operational costs. A sustainable source of revenue was viewed as essential to maintain the data at an adequate level and to ensure actuality and continuity. Chapter 5 explored the potential business models and pricing mechanisms for public sector INSPIRE web services. 
Although, depending on the type of web service and type of re-user, there might have been an argument for employing a subscription model as a pricing mechanism, business models based on generating revenue from public sector information would not be viable in the long run and were not in the spirit of the INSPIRE Directive. This research concluded that public sector information web services employing different pricing regimes were counterproductive to achieving financial interoperability. In Chapter 6, business models for public sector data providers were revisited, this time from an open data perspective. Government agencies, including self-funding government agencies, are under increasing pressure to implement open data policies. This chapter analysed the business models of self-funding agencies either already providing open data or under pressure to provide (some) open data in the near future. The analysis showed which adaptations might be necessary to ensure the long-term availability of high-quality open data and the long-term financial sustainability of self-funding agencies. The case studies confirmed that providing (raw) open data does not necessarily lead to losses in revenue in the long term, as long as the organisation has enough flexibility to adapt its role in the information value chain, especially when revenue from licence fees represents only a relatively small part of its total budget. The case studies indicated that switching to open data has resulted in internal efficiency gains. In practice, it is difficult to isolate and quantify the internal efficiency gains that are solely attributable to open data, as the researched organisations continuously implement efficiency measures. However, the reported decreases in internal and external transaction costs due to open data are in line with the case study carried out in Chapter 7. 
Open data also provided an excellent opportunity to assess the effects of a policy change ex ante, as baseline measurements could be carried out. Developing both quantitative and qualitative indicators to assess the success of a policy change is a challenge for open data initiatives. In Chapter 7, a model was developed to assess the effects of open data on the organisation of the data provider. Liander, a private energy network administrator mandated with a public task, planned to publish some of its datasets as open data in the autumn of 2013. This offered an excellent opportunity to apply the assessment model and gain insight into the internal, external and relational effects on Liander. A benchmark was carried out prior to the release of the open data and a follow-up measurement one year later. The benchmark provided insight into the work processes at the time and into the preparations required to implement open data. The follow-up monitor indicated that Liander open data are used by a wide range of users and have had a positive effect on the development of apps to aid energy savings, although it remains a challenge to quantify the societal effects of such apps. The follow-up monitor also indicated that regular re-users of Liander data used the open data to improve existing applications and work processes rather than to create new products. The case study demonstrated that private energy companies can successfully release open data, and that Liander served as a best-practice case with a flywheel effect on companies within the same sector: by 2015, nearly all energy network administrators had published similar open data. The assessment model developed and tested in Chapter 7 proved suitable for monitoring the effects of open data at the organisational level. 
However, to provide a more complete picture of the effects of open data and to assess whether there are other barriers for re-users, a more holistic approach was required to assess the maturity of open data. Therefore, a holistic open data assessment framework addressing the supplier side, the governance side and the user side of open data was developed and applied to the Dutch open data infrastructure in Chapter 8. This Holistic Open Data Maturity Assessment Framework was used to evaluate the State of the Open Data Nation in the Netherlands and to provide valuable information on (potential) bottlenecks. The framework showed that geographic data scored significantly better than other types of government data. The standardisation and implementation rules laid down by the INSPIRE Directive appear to have been a catalyst for moving geographic data to a higher level of maturity. The maturity assessment framework provided Dutch policy makers with useful input for further development of the open data ecosystem and for well-founded strategies to ensure that the full potential of open data will be reached. Since the publication of the State of the Open Data Nation in 2014, a number of the recommendations have already been implemented. This dissertation demonstrated that many aspects that should facilitate accessibility, such as standardised metadata, have already been addressed for geodata. This research also showed that for other types of data there is still a long way to go. There is a growing demand for other types of data, such as financial data and healthcare data, and public sector organisations holding such data need hands-on guidelines to enable publication of their datasets, preferably as open data. However, data published as open data are published forever and cannot be recalled. 
Therefore, the decision to publish public sector data as open data is complex: datasets are often of a heterogeneous nature and may contain microdata (data that quantify observations or facts, such as data collected during surveys). Although microdata may not necessarily contain personal data, the datasets will probably have to be processed before publication to address confidentiality and data quality issues. In addition, there is a tension between open data and the protection of personal data. The big question remains to which level the data need to be aggregated and/or anonymised to ensure the protection of personal data now and in the future, while keeping sufficient significance to be re-usable. Another issue that needs further research is ownership of sensor data and co-created data. Increasingly, sensor data generated by e.g. smart phones, smart energy meters and traffic sensors are collected by the public sector and the private sector and become part of a big data ecosystem. In addition, public sector organisations cooperate with other public sector organisations and the private sector to create information from their data, so-called co-created information. Citizens also collect data or complement information on

    From access to re-use

    Get PDF
    If data are the building blocks to generate information needed to acquire knowledge and understanding, then geodata, i.e. data with a geographic component (geodata), are the building blocks for information vital for decision-making at all levels of government, for companies and for citizens. Governments collect geodata and create, develop and use geo-information - also referred to as spatial information - to carry out public tasks as almost all decision-making involves a geographic component, such as a location or demographic information. Geo-information is often considered “special” for technical, economic reasons and legal reasons. Geoinformation is considered special for technical reasons because geo-information is multi-dimensional, voluminous and often dynamic, and can be represented at multiple scales. Because of this complexity, geodata require specialised hardware, software, analysis tools and skills to collect, to process into information and to use geoinformation for analyses. Geo-information is considered special for economic reasons because of the economic aspects, which sets it apart from other products. The fixed production costs to create geo-information are high, especially for large-scale geo-information, such as topographic data, whereas the variable costs of reproduction are low which do not increase with the number of copies produced. In addition, there are substantial sunk costs, which cannot be recovered from the market. As such, geo-information shows characteristics of a public good, i.e. a good that is non-rivalrous and non-excludable. However, to protect the high investments costs, re-use of geo-information may be limited by legal and/or technological means such as intellectual property rights and digital rights management. Thus, by making geo-information excludable, it becomes a club good, i.e. a non-rivalrous but excludable good. 
By claiming intellectual property rights, such as copyright and/or database rights, and restricting (re-)use through licences and licence fees, geo-information can be commercially exploited and used to recover some of the investment costs. Geo-information is considered special for a number of legal reasons. First, as geo-information has a geographic component, e.g. a reference to a location, geo-information may contain personal data, sensitive company data, environmentally sensitive data, or data that may pose a threat to national security. Therefore, the dataset may have to be adapted, aggregated or anonymised before it can be made public. Secondly, geo-information may be subject to intellectual property rights. There may be a copyright on cartographic images or database rights on digital information. Such intellectual property rights may be claimed by third parties involved in the information chain, e.g. a private company supplying aerial photography to the National Mapping Authority. The data holder may also claim intellectual property rights to commercially exploit the dataset and recoup some of the vast investment costs made to produce the dataset. Lastly, there may be other (international) legislation or agreements that may either impede or promote publishing public sector information, whereby in some cases these policies may contradict each other. It has been recognised that to deal with national, regional and global challenges, it is essential that geo-information collected by one level of government or government organisation be shared between all levels of government via a so-called Spatial Data Infrastructure (SDI). The main principles governing SDIs are that data are collected once and (re-)used many times; that data should be easy to discover, access and use; and that data are harmonised so that it is possible to combine spatial data from different sources seamlessly. 
In line with the SDI governing principles, this dissertation considers accessibility of information to include all these aspects. Accessibility concerns not only access to data, i.e. being able to view the data without being able to alter the contents, but also re-use of data, i.e. being able to download and/or invoke the data, and sharing of data, including being able to provide feedback and/or input for co-generated information. Accessibility of public sector geo-information is not only essential for effective and efficient government policy-making but is also associated with realising other ambitions. Examples of these ambitions are a more transparent and accountable government, more citizen participation in democratic processes, (co-)generation of solutions to societal problems, and increased economic value due to companies creating innovative products and services with public sector information as a resource. Especially the latter ambition has been the subject of many international publications stressing the enormous potential economic value of re-use of public sector (geo-)information by companies. Previous research indicated that re-users of public sector information in Europe encountered barriers related to technical, organisational, legal and financial aspects, which was deemed to be the main reason why the number of value added products and services based on public sector information in Europe was lagging compared to the United States. Especially the latter two barriers (restrictive licence conditions and high licence fees) were often cited as the main barriers for re-users in Europe. However, in spite of considerable resources invested by governments to establish spatial data infrastructures, to facilitate data portals and to release public sector information as open data, i.e. 
without legal and financial restrictions, the expected surge of value added products based on public sector information has not quite eventuated to date and the benefits still appear to lag expectations. When this research started a decade ago, the debate around accessibility of public sector information focussed on access policies. Access policies ranged from open access (data available with a minimum of legal restrictions and for no more than marginal dissemination costs) to full cost recovery, whereby all costs incurred in collection, creation, processing, maintenance and dissemination were to be recovered from the re-users. Most of the public sector bodies in the European Union adhered to a cost recovery policy for allowing re-use of public sector information. In 2003, the European Commission adopted two directives to ensure better accessibility of public sector information. Directive 2003/4/EC of the European Parliament and of the Council of 28 January 2003 on public access to environmental information and repealing Council Directive 90/313/EEC, the so-called Access Directive, provided citizens the right of access to environmental information. Citizens should be able to access documents related to the environment via a register, preferably in electronic form, and if a copy of a document was requested, the charges must not exceed marginal dissemination costs. Directive 2003/98/EC of the European Parliament and of the Council of 17 November 2003 on the re-use of public sector information, the so-called PSI Directive, intended to create conditions for a level playing field for all re-users of public sector information. However, the PSI Directive of 2003 left room for public sector organisations to maintain a cost recovery regime with restrictive licence conditions. In spite of these directives, access policies for geographic data were slow to change in most European nations. 
At the end of the last decade, accessibility of public sector information received two major impulses. The first major impulse was the implementation of Directive 2007/2/EC of the European Parliament and of the Council of 14 March 2007 establishing an Infrastructure for Spatial Information in the European Community (INSPIRE), the so-called INSPIRE Directive, which established a framework of standardisation rules for data and for publishing via web services, and significantly contributed to the accessibility of public sector geo-information. The second major impulse was the development of open data policies following the USA Open Government Directive of 2009 and the Digital Agenda for Europe of 2010. These two impulses were the main drivers in Europe of a careful move from cost recovery policies to open access or open data policies and of more public sector information being made available as open data. Thus, of the four barriers to re-use of public sector information cited in Chapter 1 (legal, financial, technical and organisational barriers), two should have been lifted to a large degree due to open data. This shift to open data provided an excellent opportunity to test the hypothesis that the main barriers for re-users of public sector information were indeed restrictive licences and high fees, as suggested by earlier research. Chapter 2 showed that by 2008, most European Union Member States had transposed and implemented the 2003/98/EC PSI Directive, albeit in various ways and with considerable delay. By 2008, the effects of the PSI Directive were only slowly starting to emerge. A number of Member States reviewed their access policies and more public sector information became available for re-use. Some Member States made the information available free of charge or reduced their fees significantly. 
In many cases where re-use fees were reduced, the number of regular re-users increased significantly and total revenue even increased in spite of the lower fees. Although the 2007/2/EC INSPIRE Directive paved the way for technical interoperability by providing guidelines for web services and catalogues, neither the INSPIRE Directive nor the PSI Directive had tackled the issue of legal interoperability. Chapter 2 also demonstrated that a major barrier to creating a level playing field for the private sector was the fact that some public sector bodies acted as value added resellers by developing and selling products and services based on their own data. Thus, the level playing field envisioned by the European Commission had not been realised. Chapter 3 researched harmonised licences as a first step towards legal interoperability. Earlier research had indicated that one of the biggest barriers for re-users was complex, non-transparent and inconsistent licence conditions, especially for re-users wanting to combine data from multiple sources. A survey of licences used by public sector data providers in the Netherlands demonstrated that although there were differences in length and language, there were also many similarities. The conclusion was that the introduction of a licence suite inspired by the Creative Commons concept would be a step towards increased transparency and consistency of geo-information licence agreements. This chapter introduced a conceptual model for such a geo-information licence suite, the so-called Geo Shared licences. Both the Creative Commons and Geo Shared licence suites enable harmonisation of licence conditions and promote transparency and legal interoperability, especially when re-users combine data from different sources. The Geo Shared licence suite became a serious option for inclusion as an annex in the draft version of the INSPIRE Directive. 
Unfortunately, in 2006 the concept of one licence suite for the entire European Union came too early. The Geo Shared licences were further developed and implemented in the Dutch National Geo Register. In 2009, the European Commission recognised that PSI was the single largest source of information in Europe and that the potential for re-use of PSI needed to be highlighted in the digital age. As part of a review of the 2003/98/EC PSI Directive, the European Commission carried out a round of consultations with stakeholders in 2010 to seek their views on specific issues to be addressed in the future. In addition, the Commission commissioned a number of studies. These included a review of studies on public sector information re-use and related market studies, an assessment of the different models of supply and charging for public sector information, and a study on public sector re-use in the cultural sector. The first study, carried out by Graham Vickery in 2011, showed that the overall economic gain from opening up public sector information as a resource for new products and services could be in the order of €40 billion per annum in the European Union. Both the Vickery Report and the second study, the so-called POPSIS Study, showed that for most public sector data providers revenues from licence fees were relatively low in comparison to their total budget. After the evaluation, Directive 2013/37/EU of the European Parliament and of the Council of 26 June 2013 amending Directive 2003/98/EC on the re-use of public sector information was adopted and came into force on 17 July 2013. Chapter 4 described the main changes of the 2013/37/EU Amended PSI Directive, including the recommendation to employ open data licences. This chapter continued with a review of the various open data licences in use in Europe and analysed their interoperability. 
Although the adoption of open data licences for public sector information should have addressed legal interoperability barriers for re-users, in practice, the different types of open data licences might not be so interoperable after all. Effectively, only a public domain declaration, such as a Creative Commons Zero (CC0) declaration, is suitable for open data re-users working with cross-border data sets, provided that such a declaration is published in a prominent place to remove uncertainty for re-users. Without a public domain declaration, re-use of open data is still impeded, as re-users are loath to invest time in the development of value added products or services when it is uncertain if and which restrictions may be applicable and what their impact may be on the product or service. This dissertation also researched the financial and economic aspects of public sector information accessibility. Chapters 1 and 2 indicated that a cost recovery regime for dissemination of public sector information posed a financial barrier for private sector re-users because the fees charged were perceived to be too high. However, in 2008, there were still many advocates for maintaining a cost recovery regime. Especially public sector bodies that are not funded by the national Treasury, the so-called self-funding agencies, needed revenue from data sales to cover a substantial part of their operational costs. A sustainable source of revenue was viewed as essential to maintain the data at an adequate level and to ensure currency and continuity. Chapter 5 explored potential business models and pricing mechanisms for public sector INSPIRE web services. 
Although, depending on the type of web service and the type of re-user, there might have been an argument for employing a subscription model as a pricing mechanism, business models based on generating revenue from public sector information would not be viable in the long run and were not in the spirit of the INSPIRE Directive. This research concluded that public sector information web services employing different pricing regimes were counterproductive to achieving financial interoperability. In Chapter 6, business models for public sector data providers were revisited, this time from an open data perspective. Government agencies, including self-funding government agencies, are under increasing pressure to implement open data policies. This chapter analysed the business models of self-funding agencies either already providing open data or under pressure to provide (some) open data in the near future. The analysis showed which adaptations might be necessary to ensure the long-term availability of high quality open data and the long-term financial sustainability of self-funding agencies. The case studies confirmed that providing (raw) open data does not necessarily lead to losses in revenue in the long term, as long as the organisation has enough flexibility to adapt its role in the information value chain, especially when revenue from licence fees represents only a relatively small part of its total budget. The case studies indicated that switching to open data has resulted in internal efficiency gains. In practice, it is difficult to isolate and quantify the internal efficiency gains that are solely attributable to open data, as the researched organisations continuously implement efficiency measures. However, the reported decreases in internal and external transaction costs due to open data are in line with the case study carried out in Chapter 7. 
Open data also provided an excellent opportunity to assess the effects of open data ex ante, as baseline measurements could be carried out. Developing both quantitative and qualitative indicators to assess the success of a policy change is a challenge for open data initiatives. In Chapter 7, a model to assess the effects on the organisation of an open data provider was developed. Liander, a private energy network administrator mandated with a public task, planned to publish some of its datasets as open data in the autumn of 2013. This offered an excellent opportunity to apply the developed assessment model to provide an insight into the internal, external and relational effects on Liander. A benchmark was carried out prior to the release of open data and a follow-up measurement one year later. The benchmark provided an insight into the work processes at the time and into the preparations required to implement open data. The follow-up monitor indicated that Liander open data are used by a wide range of users and have had a positive effect on the development of apps to aid energy savings. However, it remains a challenge to quantify the societal effects of such apps. The follow-up monitor also indicated that regular re-users of Liander data used the open data to improve existing applications and work processes rather than to create new products. The case study demonstrated that private energy companies can successfully release open data. The case study also showed that Liander served as a best-practice case for open data and had a flywheel effect on companies within the same sector. By 2015, nearly all energy network administrators had published similar open data. The assessment model developed and tested in Chapter 7 proved suitable for monitoring the effects of open data at the level of the data provider's organisation. 
However, to provide a more complete picture of the effects of open data and to assess if there are other barriers for re-users, a more holistic approach was required to assess the maturity of open data. Therefore, a holistic open data assessment framework addressing the supplier side, the governance side and the user side of open data was developed and applied to the Dutch open data infrastructure in Chapter 8. This Holistic Open Data Maturity Assessment Framework was used to evaluate the State of the Open Data Nation in the Netherlands and to provide valuable information on (potential) bottlenecks. The framework showed that geographic data scored significantly better than other types of government data. The standardisation and implementation rules laid down by the INSPIRE Directive appear to have been a catalyst for moving geographic data to a higher level of maturity. The maturity assessment framework provided Dutch policy makers with useful inputs for further development of the open data ecosystem and for well-founded strategies to ensure that the full potential of open data is reached. Since the publication of the State of the Open Data Nation in 2014, a number of the recommendations have already been implemented. This dissertation demonstrated that many aspects that should facilitate accessibility, such as standardised metadata, have already been addressed for geodata. This research also showed that for other types of data, there is still a long way to go. There is a growing demand for other types of data, such as financial data and healthcare data. Public sector organisations holding such data need hands-on guidelines to enable publication of their datasets, preferably as open data. However, once published, open data are available forever and cannot be recalled. 
Therefore, the decision to publish public sector data as open data is complex: datasets are often of a heterogeneous nature and may contain microdata (data that quantify observations or facts, such as data collected during surveys). Although microdata may not necessarily contain personal data, the datasets will probably have to be processed before publication to address confidentiality and data quality issues. In addition, there is a tension between open data and the protection of personal data. The big question remains to which level the data need to be aggregated and/or anonymised to ensure protection of personal data now and in the future, while at the same time keeping sufficient significance to be re-usable. Another issue that needs further research is data ownership of sensor data and co-created data. Increasingly, sensor data generated by e.g. smart phones, smart energy meters and traffic sensors are collected by the public sector and the private sector and become part of a big data ecosystem. In addition, public sector organisations cooperate with other public sector organisations and the private sector to create information from their data, so-called co-created information. Citizens also collect data or complement information on

    Electric Vehicle (EV)-Assisted Demand-Side Management in Smart Grid

    While relieving the dependency on diminishing fossil fuels, Electric Vehicles (EVs) provide a promising opportunity to realise an eco-friendly and cost-effective means of transportation. However, the enormous electricity demand imposed by the wide-scale deployment of EVs can put power infrastructure under critical strain, potentially impacting the efficiency, resilience, and safety of the electric power supply. Interestingly, EVs are deferrable loads with flexible charging requirements, making them an ideal prospect for the optimisation of consumer demand for energy, referred to as demand-side management. Furthermore, with the recent introduction of Vehicle-to-Grid (V2G) technology, EVs are now able to act as residential battery systems, enabling EV customers to store energy and use it as backup power for homes or deliver it back to the grid when required. Hence, this thesis studies Electric Vehicle (EV)-assisted demand-side management strategies to manage peak electricity demand, with the long-term objective of transforming to a fully EV-based transportation system without requiring major upgrades in existing grid infrastructure. Specifically, we look at ways to optimise residential EV charging and discharging for smart grid, while addressing numerous requirements from the EV customer's perspective and the power system's perspective. First, we develop an EV charge scheduling algorithm with the objective of tracking an arbitrary power profile. The design of the algorithm is inspired by water-filling theory in communication systems design, and the algorithm is applied to schedule EV charging following a day-ahead renewable power generation profile. Then we extend that algorithm by incorporating V2G operation to shape the load curve in residential communities via valley-filling and peak-shaving. In the proposed EV charge-discharge algorithm, EVs are coordinated in a distributed manner by implementing a non-cooperative game. 
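The water-filling idea behind such charge scheduling can be illustrated with a minimal sketch. This is not the thesis's actual algorithm; the bisection-based allocator, the baseline load profile and all numbers below are invented for illustration:

```python
# Hypothetical water-filling-style EV charge scheduler: allocate an
# EV's energy need to the time slots where baseline load is lowest,
# raising those "valleys" toward a common water level found by bisection.

def water_fill(baseline, energy_need, max_rate):
    """Per-slot EV charge that flattens baseline + charge, subject to
    a per-slot charging limit and a total energy requirement."""
    lo, hi = min(baseline), max(baseline) + energy_need
    charge = [0.0] * len(baseline)
    for _ in range(100):  # bisection on the water level
        level = (lo + hi) / 2
        charge = [min(max(level - b, 0.0), max_rate) for b in baseline]
        if sum(charge) > energy_need:
            hi = level
        else:
            lo = level
    total = sum(charge)   # rescale so the energy need is met exactly
    return [c * energy_need / total for c in charge]

baseline = [5.0, 3.0, 2.0, 2.5, 4.0, 6.0]  # baseline load per slot (made up)
charge = water_fill(baseline, energy_need=4.0, max_rate=3.0)
print([round(c, 2) for c in charge])       # charge concentrates in the valleys
```

The deepest valley (slot 2) receives the most charge while the peak slots receive none; the thesis's version additionally tracks an arbitrary day-ahead renewable generation profile rather than simply flattening the load.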
Our numerical simulation results demonstrate that the proposed algorithm is effective in flattening the load curve while satisfying all heterogeneous charge requirements across EVs. Next, we propose an algorithm for network-aware EV charging and discharging, with an emphasis on both EV customer economics and distribution network aspects. The core of the algorithm is a Quadratic Program (QP) that is formulated to minimise the operational costs accrued to EV customers while maintaining distribution feeder nodal voltage magnitudes within prescribed thresholds. By means of a receding horizon control approach, the algorithm implements the respective QP-based EV charge-discharge control sequences in near-real-time. Our simulation results demonstrate that the proposed algorithm offers significant reductions in operational costs associated with EV charging and discharging, while also mitigating under-voltage and over-voltage conditions arising from peak power flows and reverse power flows in the distribution network. Moreover, the proposed algorithm is shown to be robust to non-deterministic EV arrivals and departures. While the previous algorithm ensures a stable voltage profile across the entire distribution feeder, it is limited to balanced power distribution networks. Therefore, we next extend that algorithm to facilitate EV charging and discharging in unbalanced distribution networks. The proposed algorithm also supports distributed EV charging and discharging coordination, where EVs determine their charge-discharge profiles in parallel, using an Alternating Direction Method of Multipliers (ADMM)-based approach driven by peer-to-peer EV communication. Our simulation results confirm that the proposed distributed algorithm is computationally efficient when compared to its centralised counterpart. 
Moreover, the proposed algorithm is shown to be successful in terms of correcting any voltage violations stemming from non-EV load, as well as satisfying all EV charge requirements without causing any voltage violations. 
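The receding-horizon principle used in these algorithms (re-plan over the remaining horizon at every step, but apply only the first control action) can be sketched briefly. The QP with voltage constraints is far beyond a few lines, so this hypothetical example replaces it with a simple cheapest-slots charging rule against a made-up day-ahead price vector:

```python
# Hypothetical receding-horizon sketch: at each time step the remaining
# horizon is re-planned and only the first slot's decision is applied.

def plan(prices, energy_needed, rate):
    """Greedy stand-in for the QP: charge in the cheapest remaining
    slots until the energy requirement is covered."""
    action = [0.0] * len(prices)
    for i in sorted(range(len(prices)), key=lambda i: prices[i]):
        if energy_needed <= 0:
            break
        action[i] = min(rate, energy_needed)
        energy_needed -= action[i]
    return action

def receding_horizon(prices, energy_needed, rate):
    applied = []
    for t in range(len(prices)):
        action = plan(prices[t:], energy_needed, rate)  # re-plan remaining horizon
        applied.append(action[0])                       # apply first step only
        energy_needed -= action[0]
    return applied

prices = [30, 20, 50, 10, 40]  # made-up day-ahead prices per slot
schedule = receding_horizon(prices, energy_needed=6.0, rate=3.0)
print(schedule)                # charges only in the two cheapest slots
```

Because the plan is recomputed at every step, new information (such as an EV arriving or departing unexpectedly) simply enters the next re-plan, which is the intuition behind the robustness to non-deterministic arrivals and departures reported above.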

    Computing graph neural networks: A survey from algorithms to accelerators

    Graph Neural Networks (GNNs) have exploded onto the machine learning scene in recent years owing to their capability to model and learn from graph-structured data. Such an ability has strong implications in a wide variety of fields whose data are inherently relational, for which conventional neural networks do not perform well. Indeed, as recent reviews can attest, research in the area of GNNs has grown rapidly and has led to the development of a variety of GNN algorithm variants as well as to the exploration of ground-breaking applications in chemistry, neurology, electronics, or communication networks, among others. At the current stage of research, however, the efficient processing of GNNs is still an open challenge for several reasons. Besides their novelty, GNNs are hard to compute due to their dependence on the input graph, their combination of dense and very sparse operations, and the need to scale to huge graphs in some applications. In this context, this article aims to make two main contributions. On the one hand, a review of the field of GNNs is presented from the perspective of computing. This includes a brief tutorial on the GNN fundamentals, an overview of the evolution of the field in the last decade, and a summary of operations carried out in the multiple phases of different GNN algorithm variants. On the other hand, an in-depth analysis of current software and hardware acceleration schemes is provided, from which a hardware-software, graph-aware, and communication-centric vision for GNN accelerators is distilled. This work is possible thanks to funding from the European Union’s Horizon 2020 research and innovation programme under Grant No. 863337 (WiPLASH project) and the Spanish Ministry of Economy and Competitiveness under contract TEC2017-90034-C2-1-R (ALLIANCE project), which receives funding from FEDER.
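As a concrete illustration of the message-passing fundamentals such a tutorial covers, one GCN-style layer fits in a few lines of NumPy; the toy path graph and random weights below are invented for the example, and real GNN frameworks add batching, sparsity and learnable parameters on top of this core operation:

```python
# Minimal GCN-style message-passing layer (illustrative sketch).
import numpy as np

def gcn_layer(adj, feats, weight):
    """One graph-convolution step: add self-loops, symmetrically
    normalise the adjacency, aggregate neighbour features, then apply
    a linear transform and ReLU."""
    a_hat = adj + np.eye(adj.shape[0])        # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt  # D^-1/2 (A+I) D^-1/2
    return np.maximum(a_norm @ feats @ weight, 0.0)  # aggregate + ReLU

# Toy graph: 3 nodes in a path (0-1-2), 2 input features, 2 outputs.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
rng = np.random.default_rng(0)
h = gcn_layer(adj, rng.normal(size=(3, 2)), rng.normal(size=(2, 2)))
print(h.shape)  # one feature row per node
```

The dense `a_norm @ feats` product here is exactly the sparse-times-dense aggregation that, at scale, dominates GNN compute and motivates the accelerator designs the survey analyses.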

    Impact of Organized Retailing on the Unorganized Sector

    The retail business in India is estimated to grow at 13 per cent per annum, from US$ 322 billion in 2006-07 to US$ 590 billion in 2011-12. The unorganized retail sector is expected to grow at about 10 per cent per annum, from US$ 309 billion in 2006-07 to US$ 496 billion in 2011-12. Organized retail, which constituted a small four per cent of the retail sector in 2006-07, is likely to grow at 45-50 per cent per annum and quadruple its share of total retail trade to 16 per cent by 2011-12. The study, which was based on the largest ever survey of all segments of the economy that could be affected by the entry of large corporates in the retail business, found that unorganized retailers in the vicinity of organized retailers experienced a decline in sales and profit in the initial years of the entry of organized retailers. The adverse impact, however, weakens over time. The study indicated how consumers and farmers benefit from organized retailers. The study also examined the impact on intermediaries and manufacturers. The results are indicative of mega- and mini-metro cities around a limited number of organized retail outlets. Based on the results of the surveys, the study made a number of specific policy recommendations for regulating the interaction of large retailers with small suppliers and for strengthening the competitive response of the unorganized retailers.
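The quoted projections are consistent with straightforward compound growth over the five years from 2006-07 to 2011-12, which can be checked directly:

```python
# Sanity check of the compound-growth figures quoted above.
total_2007 = 322                      # total retail, US$ billion, 2006-07
print(round(total_2007 * 1.13 ** 5))  # 13% p.a. for 5 years -> 593 (quoted: 590)

unorg_2007 = 309                      # unorganized retail, US$ billion, 2006-07
print(round(unorg_2007 * 1.10 ** 5))  # 10% p.a. for 5 years -> 498 (quoted: 496)
```

The small differences from the quoted figures suggest the report's growth rates are rounded.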