256 research outputs found

    Design, Integration, and Evaluation of IoT-Based Electrochromic Building Envelopes for Visual Comfort and Energy Efficiency

    Electrochromic (EC) glazing has been identified as the next-generation high-performance glazing material for building envelopes because its dynamic properties allow buildings to respond to varying climate conditions. IoT technologies have improved the sensing, communication, and interaction of building environmental data. Few studies have synthesized the advancements in EC materials and building IoT technologies for better building performance, and the challenge remains in the lack of compatible design and simulation tools, a limited understanding of integration, and a paucity of evaluation measures to support the convergence of EC building envelopes and IoT technologies. This research first explores the existing challenges of using EC building envelopes through secondary data analysis and case studies. An IoT-based EC prototype system is developed to demonstrate the feasibility of IoT and EC integration; its functionality, reliability, interoperability, and scalability are assessed through comparisons with four alternative building envelope systems. Nationwide evaluations of EC building performance are conducted to show regional differences and trade-offs between visual comfort and energy efficiency. A machine learning approach is proposed to solve the predictive EC control problem under random weather conditions; the best prediction models achieve 91.08% mean accuracy on the 16-climate-zone data set. The importance of the predictive variables is also measured in each climate zone to develop a better understanding of the effectiveness of climatic sensors. Additionally, a simulation study is conducted to investigate the relationships between design factors and EC building performance, and an instantaneous daylight measure is developed to support active daylight control with IoT-based EC building envelopes.
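
    The abstract does not name the learning algorithm or the climatic predictors, so the following is only a minimal illustrative sketch of a per-climate-zone predictive EC control model with a variable-importance analysis; the feature names, the tint-state label, and the choice of a random forest are assumptions, not the thesis' actual method.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical climatic-sensor features; placeholders, not the thesis' variable set.
FEATURES = ["solar_irradiance", "outdoor_temp", "cloud_cover", "sun_altitude", "sun_azimuth"]

def train_zone_model(df: pd.DataFrame, zone: str):
    """Fit a classifier that maps climatic sensor readings to an EC tint state
    for one climate zone, and report cross-validated accuracy plus the relative
    importance of each predictive variable (analogous to the per-zone analysis
    described in the abstract)."""
    sub = df[df["climate_zone"] == zone]
    X, y = sub[FEATURES], sub["ec_tint_state"]
    model = RandomForestClassifier(n_estimators=300, random_state=0)
    acc = cross_val_score(model, X, y, cv=5).mean()  # mean prediction accuracy
    model.fit(X, y)
    importance = dict(zip(FEATURES, model.feature_importances_))
    return model, acc, importance
```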

    Enhancing the predictive performance of ensemble models through novel multi-objective strategies: evidence from credit risk and business model innovation survey data

    This paper proposes novel multi-objective optimization strategies for developing a weighted ensemble model. A comparison of the proposed strategies on simulated data suggests that the multi-objective strategy based on joint entropy is superior to the other proposed strategies. To show the applicability, generalization, and practical implications of the proposed approaches, we implement the model on two real datasets concerning the prediction of credit risk default and the adoption of innovative business models by firms. The scope of this paper can be extended by ordering the solutions of the proposed multi-objective strategies and can be generalized to other, similar predictive tasks.
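
    The abstract does not specify the objectives or the optimiser, so the sketch below only illustrates the general idea of searching ensemble weights on a Pareto basis; the two objectives used here (accuracy and the entropy of the combined prediction, standing in for the paper's joint-entropy criterion) and the simplex grid search are assumptions.

```python
import numpy as np
from itertools import product

def combine(w, probas):
    """Weighted ensemble of per-model class probabilities, probas shape (n_models, n_samples, n_classes)."""
    return np.tensordot(w, probas, axes=1)

def accuracy(w, probas, y):
    return float((combine(w, probas).argmax(axis=1) == y).mean())

def mean_entropy(w, probas):
    """Entropy of the combined prediction, used here as a generic second objective."""
    p = np.clip(combine(w, probas), 1e-12, 1.0)
    return float(-(p * np.log(p)).sum(axis=1).mean())

def pareto_weights(probas, y, step=0.25):
    """Enumerate weight vectors on the simplex and keep the non-dominated ones
    under the two objectives (maximise accuracy, minimise prediction entropy)."""
    grid = np.arange(0.0, 1.0 + 1e-9, step)
    cand = []
    for w in product(grid, repeat=probas.shape[0]):
        if abs(sum(w) - 1.0) < 1e-9:
            w = np.array(w)
            cand.append((w, (accuracy(w, probas, y), -mean_entropy(w, probas))))
    return [(w, f) for w, f in cand
            if not any(g[0] >= f[0] and g[1] >= f[1] and g != f for _, g in cand)]
```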

    Evaluating the share performance of socially responsible investment on the Johannesburg Stock Exchange

    Socially responsible investing (SRI) integrates environmental, social and governance (ESG) issues into the investment decision-making process. Growing ESG concerns and the uncovering of corporate scandals have catalysed the substantial growth in SRI portfolios worldwide. Notwithstanding its increasing popularity, barriers to further SRI growth have been identified. Traditional investing practices suggest that, theoretically, SRI may underperform conventional investment strategies. However, despite the vast amount of literature on SRI, empirical studies have yielded mixed results regarding fund performance. The JSE SRI Index was launched in 2004 to promote transparent business practices. It was discontinued at the end of 2015 and succeeded by a new Responsible Investment Index established by the JSE in association with FTSE Russell. The aim of this research was to evaluate the share performance of the JSE SRI Index from 2004 to 2015. Additionally, the indices were categorised by environmental impact to further analyse disparities among share returns. The study was also divided into two sub-periods, 2004-2009 and 2010-2015, the latter following the endorsement of integrated reporting by the King III Code as a listing requirement in 2010. A single-factor Capital Asset Pricing Model (CAPM) was used to assess differences in risk-adjusted returns, and Engle-Granger and Johansen tests were employed to explore the possibility of a cointegrating relationship between the indices. No significant difference between returns was observed for 2004-2009, while the SRI Index exhibited statistically significant inferior risk-adjusted returns for the latter half of the study. Overall, a significant difference between share returns was found, with CAPM results suggesting that the JSE SRI Index underperformed the All Share Index by 2.33% per annum over the full study period. Engle-Granger and Johansen test results indicated the existence of a cointegrating relationship over the first half of the study; however, there was no cointegration between the two indices for 2004-2015, which may be attributed to the absence of a significant relationship in the latter years. The results support the notion that investors pay a price to invest ethically on the JSE, and the inferior risk-adjusted returns associated with SRI may be a major barrier to its development in South African markets.
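
    As a rough illustration of the study's toolkit rather than its exact specification, the sketch below estimates Jensen's alpha from a single-factor CAPM regression and runs an Engle-Granger cointegration test with statsmodels; the return series and risk-free proxy are assumed to be aligned pandas Series supplied by the reader.

```python
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

def jensens_alpha(index_ret, market_ret, rf):
    """Single-factor CAPM: regress index excess returns on market excess returns.
    A significantly negative intercept (Jensen's alpha) indicates risk-adjusted
    underperformance of the index relative to the market benchmark."""
    y = index_ret - rf
    X = sm.add_constant(market_ret - rf)
    fit = sm.OLS(y, X, missing="drop").fit()
    return fit.params.iloc[0], fit.pvalues.iloc[0]  # alpha per period and its p-value

def engle_granger(index_level, market_level):
    """Engle-Granger test for a cointegrating relationship between two index price levels."""
    t_stat, p_value, _ = coint(index_level, market_level)
    return t_stat, p_value
```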

    Biodiversity in Marine Ecosystems—European Developments toward Robust Assessments

    The sustainability of marine ecosystems and their services depends on marine biodiversity, which is threatened worldwide. Biodiversity protection is a major target of the EU Marine Strategy Framework Directive, requiring assessment of the status of biodiversity at the level of species, habitats, and ecosystems, including genetic diversity and the role of biodiversity in food-web functioning and structure. This paper summarises the development of new indicators and the refinement of existing ones in order to address some of the observed gaps in indicator availability for marine biodiversity assessments at the genetic, species, habitat, and ecosystem levels. Promising new indicators are available for the genetic diversity of microbial and benthic communities. Novel indicators were developed to assess biodiversity and food webs associated with habitats formed by keystone species (such as macroalgae), as well as to map benthic habitats (such as biogenic reefs) using high-resolution habitat characterization. We also discuss the advances made on indicators for detecting the impacts of non-native invasive species and for assessing the structure and functioning of marine food webs. The latter are based on indicators of the effects of fishing on the trophic level and size distribution of fish and elasmobranch communities, as well as on phytoplankton and zooplankton community structure. New and refined indicators are ranked based on quality criteria, and their applicability for various EU and global biodiversity assessments, together with the need for further development of new indicators and refinement of existing ones, is discussed.
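
    The abstract names food-web indicators based on the trophic level and size distribution of fish and elasmobranch communities. As a hedged illustration only (the paper's exact indicator definitions and thresholds are not given here), two standard formulations of such indicators look like this:

```python
import numpy as np

def mean_trophic_level(biomass, trophic_level):
    """Biomass-weighted mean trophic level of a sampled community; sustained fishing
    pressure on large predators typically lowers this value over time."""
    biomass, trophic_level = np.asarray(biomass), np.asarray(trophic_level)
    return float((biomass * trophic_level).sum() / biomass.sum())

def large_fish_indicator(weights, lengths, threshold_cm=40.0):
    """Proportion of community biomass in individuals above a length threshold
    (the 40 cm cut-off is illustrative, not the paper's choice)."""
    weights, lengths = np.asarray(weights), np.asarray(lengths)
    return float(weights[lengths >= threshold_cm].sum() / weights.sum())
```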

    Methods for three-dimensional Registration of Multimodal Abdominal Image Data

    Multimodal image registration benefits diagnosis, treatment planning and the performance of image-guided procedures in the liver, since it enables the fusion of complementary information provided by pre- and intra-interventional data about tumor localization and access. Although various registration methods exist, approaches specifically optimized for the registration of multimodal abdominal scans are scarce. The work presented in this thesis aims to tackle this problem by focusing on the development, optimization and evaluation of registration methods specifically for multimodal liver scans. The contributions to the field of medical image registration include a registration evaluation methodology that enables the comparison and optimization of linear and non-linear registration algorithms using a point-based accuracy measure. This methodology has been used to benchmark standard registration methods as well as novel approaches developed within the frame of this thesis. The results showed that the similarity measure employed during registration has a major impact on registration accuracy. Because of this influence, two alternative similarity metrics with the potential to be used on multimodal image data are proposed and evaluated. The first metric relies on gradient information in the form of Histograms of Oriented Gradients (HOG), whereas the second employs a Siamese neural network to learn a similarity measure directly from the image data. The evaluation showed that both metrics can compete with state-of-the-art similarity measures in terms of registration accuracy. The HOG metric offers the advantage that it does not require ground-truth data to learn a similarity estimation; instead it is applicable to various data sets with the sole requirement of distinct gradients. The Siamese metric, however, is characterized by a higher robustness to large rotations than the HOG metric. To train such a network, registered ground-truth data is required, which may be critical for multimodal image data. Yet the results show that it is possible to apply models trained on registered synthetic data to real patient data. The last part of this thesis focuses on methods to learn an entire registration process using neural networks, thereby offering the advantage of replacing the traditional, time-consuming iterative registration procedure. Within the frame of this thesis, the so-called VoxelMorph network, originally proposed for monomodal, non-linear registration learning, is extended to affine and multimodal registration learning tasks. This extension includes the consideration of an image mask during metric evaluation as well as loss functions for multimodal data, such as the pretrained Siamese metric and a loss relying on the comparison of deformation fields. Based on the developed registration evaluation methodology, the performance of the original network and of the extended variants is evaluated for monomodal and multimodal registration tasks using multiple data sets. With the extended network variants, it is possible to learn an entire multimodal registration process for the correction of large image displacements. As with the Siamese metric, the results imply a general transferability of models trained on synthetic data to registration tasks involving real patient data. Given the lack of multimodal ground-truth data, this transferability represents an important step towards making deep-learning-based registration procedures clinically usable.
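
    The thesis' own metric implementations are not reproduced here; the following is a minimal sketch, assuming 2-D image slices and scikit-image, of what a HOG-based similarity between a fixed and a moving image could look like.

```python
import numpy as np
from skimage.feature import hog

def hog_similarity(fixed, moving):
    """Cosine similarity of HOG descriptors of two 2-D slices. Gradient structure
    rather than raw intensity is compared, which is what makes this kind of measure
    usable across modalities; higher values indicate better alignment."""
    kw = dict(orientations=9, pixels_per_cell=(8, 8),
              cells_per_block=(2, 2), feature_vector=True)
    a, b = hog(fixed, **kw), hog(moving, **kw)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```

    In an iterative registration loop, `moving` would be resampled under the current transform before each evaluation, and the optimiser would maximise this score over the transform parameters.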

    Solutions for the Self-Adaptation of Wireless Systems

    The current demand for ubiquitous connectivity imposes stringent requirements on the fabrication of Radio-Frequency (RF) circuits. Designs are consequently transferred to the most advanced CMOS technologies, which were initially introduced to improve digital performance. In addition, as technology scales down, RF circuits become increasingly susceptible to variations over their lifetime, such as manufacturing process variability, temperature, environmental conditions, and aging. As a result, the usual worst-case circuit design leaves the circuit in sub-optimal conditions most of the time, in terms of power and/or performance. To counteract these variations, increase performance and reduce power consumption, adaptation strategies must be put in place. More importantly, the fabrication process introduces more and more performance variability, which can have a dramatic impact on fabrication yield. That is why RF designs are not easily fabricated in the most advanced CMOS technologies, such as the 32nm or 22nm nodes. In this context, the performances of RF circuits need to be calibrated after fabrication so as to take these variations into account and recover the yield loss. This thesis presents a post-fabrication calibration technique for RF circuits. The technique is performed during production testing at minimum extra cost, which is critical since the cost of test is already comparable to the cost of fabrication for RF circuits and cannot be raised further; power consumption is also taken into account so that the impact of calibration on consumption is minimised. Calibration is enabled by equipping the circuit with tuning knobs and sensors. The optimal tuning-knob setting is identified in one shot, based on a single test step that involves measuring the sensor outputs once. For this purpose, we rely on variation-aware sensors whose measurements remain invariant under tuning-knob changes. As an auxiliary benefit, the variation-aware sensors are non-intrusive and totally transparent to the circuit. The proposed methodology was first demonstrated with simulation data, using an RF power amplifier as a case study. A silicon demonstrator was then fabricated in a 65nm technology to fully demonstrate the methodology. The fabricated circuits were extracted from typical and corner wafers; this is very important since corner circuits are the worst design cases and therefore the most difficult to calibrate. In our case, corner circuits represent more than two thirds of the overall dataset, and the calibration can still be proven. In detail, the fabrication yield based on 3-sigma performance specifications is increased from 21% to 93%. This is a major achievement of the technique, given that worst-case circuits are very rare in industrial fabrication.
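
    The abstract states that the optimal tuning-knob setting is identified in one shot from a single reading of variation-aware sensors, but not how the mapping is built. A plausible sketch, assuming a characterisation set of circuits for which the best knob code was found offline, is a simple supervised model from sensor outputs to knob code; the random-forest classifier used here is an assumption, not the thesis' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fit_one_shot_calibration(sensor_readings, best_knob_codes):
    """Learn a mapping from process-variation sensor outputs to the tuning-knob code
    that recovers the performance specs. `sensor_readings` has shape
    (n_characterised_circuits, n_sensors); `best_knob_codes` has shape (n_circuits,)."""
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(sensor_readings, best_knob_codes)
    return model

def calibrate(model, sensor_reading):
    """One-shot use at production test: a single sensor measurement selects the knob setting."""
    return model.predict(np.asarray(sensor_reading).reshape(1, -1))[0]
```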

    Towards a human-centric data economy

    Spurred by the widespread adoption of artificial intelligence and machine learning, “data” is becoming a key production factor, comparable in importance to capital, land, or labour in an increasingly digital economy. In spite of an ever-growing demand for third-party data in the B2B market, firms are generally reluctant to share their information. This is due to the unique characteristics of “data” as an economic good (a freely replicable, non-depletable asset holding a highly combinatorial and context-specific value), which move digital companies to hoard and protect their “valuable” data assets, and to integrate across the whole value chain seeking to monopolise the provision of innovative services built upon them. As a result, most of those valuable assets still remain unexploited in corporate silos. This situation is shaping the so-called data economy around a number of champions, and it is hampering the benefits of a global data exchange on a large scale. Some analysts have estimated the potential value of the data economy at US$2.5 trillion globally by 2025. Not surprisingly, unlocking the value of data has become a central policy of the European Union, which has estimated the size of the data economy at €827 billion for the EU27 over the same period. Within the scope of the European Data Strategy, the European Commission is also steering initiatives aimed at identifying relevant cross-industry use cases involving different verticals, and at enabling sovereign data exchanges to realise them. Among individuals, the massive collection and exploitation of personal data by digital firms in exchange for services, often with little or no consent, has raised a general concern about privacy and data protection. Apart from spurring recent legislative developments in this direction, this concern has raised some voices warning against the unsustainability of the existing digital economics (few digital champions, potential negative impact on employment, growing inequality), some of which propose paying people for their data in a sort of worldwide data labour market as a potential solution to this dilemma [114, 115, 155]. From a technical perspective, we are far from having the technology and algorithms required to enable such a human-centric data economy. Even its scope is still blurry, and the question of the value of data is, at the least, controversial. Research from different disciplines has studied the data value chain, different approaches to the value of data, how to price data assets, and novel data marketplace designs. At the same time, complex legal and ethical issues with respect to the data economy have arisen around privacy, data protection, and ethical AI practices.

In this dissertation, we start by exploring the data value chain and how entities trade data assets over the Internet. We carry out what is, to the best of our understanding, the most thorough survey of commercial data marketplaces. In this work, we have catalogued and characterised ten different business models, including those of personal information management systems, companies born in the wake of recent data protection regulations that aim to empower end users to take control of their data. We have also identified the challenges faced by different types of entities, and what kinds of solutions and technology they use to provide their services. We then present a first-of-its-kind measurement study that sheds light on the prices of data in the market using a novel methodology.
We study how ten commercial data marketplaces categorise and classify data assets, and which categories of data command higher prices. We also develop classifiers for comparing data products across different marketplaces, and we study the characteristics of the most valuable data assets and the features that specific vendors use to set the price of their data products. Based on this information, and adding data products offered by 33 other data providers, we develop a regression analysis to reveal the features that correlate with the prices of data products. As a result, we also implement the basic building blocks of a novel data pricing tool capable of providing a hint of the market price of a new data product using just its metadata as input. Such a tool would provide more transparency on the prices of data products in the market, which would help in pricing data assets and in avoiding the inherent price fluctuations of nascent markets.

Next we turn to topics related to data marketplace design. In particular, we study how buyers can select and purchase suitable data for their tasks without requiring a priori access to such data in order to make a purchase decision, and how marketplaces can distribute the payoff for a data transaction that combines data from different sources among the corresponding providers, be they individuals or firms. The difficulty of both problems is further exacerbated in a human-centric data economy, where buyers have to choose among the data of thousands of individuals, and where marketplaces have to distribute payoffs to the thousands of people contributing personal data to a specific transaction. Regarding the selection process, we compare different purchase strategies depending on the level of information available to data buyers at the time of making decisions. A first methodological contribution of our work is proposing a data evaluation stage prior to datasets being selected and purchased by buyers in a marketplace. We show that buyers can significantly improve the performance of the purchasing process just by being provided with a measurement of the performance of their models when trained by the marketplace on each individual eligible dataset. We design purchase strategies that exploit this functionality, calling the resulting algorithm Try Before You Buy, and demonstrate on synthetic and real datasets that it can lead to near-optimal data purchasing in only O(N) execution time instead of the exponential O(2^N) time needed to compute the optimal purchase. With regard to the payoff distribution problem, we focus on computing the relative value of spatio-temporal datasets combined in marketplaces for predicting transportation demand and travel time in metropolitan areas. Using large datasets of taxi rides from Chicago, Porto and New York, we show that the value of data is different for each individual and cannot be approximated by its volume. Our results reveal that even more complex approaches based on the “leave-one-out” value are inaccurate. Instead, more sophisticated and acknowledged notions of value from economics and game theory, such as the Shapley value, need to be employed if one wishes to capture the complex effects of mixing different datasets on the accuracy of forecasting algorithms. However, the Shapley value entails serious computational challenges: its exact calculation requires repetitively training and evaluating every combination of data sources, and hence O(N!) or O(2^N) computational time, which is infeasible for complex models or thousands of individuals. Moreover, our work paves the way to new methods of measuring the value of spatio-temporal data. We identify heuristics, such as entropy or similarity to the average, that show a significant correlation with the Shapley value and can therefore be used to overcome the significant computational challenges posed by Shapley approximation algorithms in this specific context.

We conclude with a number of open issues and propose further research directions that leverage the contributions and findings of this dissertation. These include monitoring data transactions to better measure data markets, and complementing market data with actual transaction prices to build a more accurate data pricing tool. A human-centric data economy would also require that the contributions of thousands of individuals to machine learning tasks be calculated daily. For that to be feasible, we need to further optimise the efficiency of the data purchasing and payoff calculation processes in data marketplaces. In that direction, we also point to alternatives to repetitively training and evaluating a model when selecting data with Try Before You Buy and when approximating the Shapley value. Finally, we discuss the challenges and potential technologies that could help build a federation of standardised data marketplaces. The data economy will develop fast in the upcoming years, and researchers from different disciplines will work together to unlock the value of data and make the most out of it. Maybe the proposal of getting paid for our data and our contribution to the data economy will finally fly, or maybe other proposals, such as the robot tax, will instead be used to balance power between individuals and tech firms in the digital economy. Still, we hope our work sheds light on the value of data, and contributes to making the price of data more transparent and, eventually, to moving towards a human-centric data economy.
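
    To make the computational argument concrete, here is a minimal, self-contained sketch of the exact Shapley value over data sources that the dissertation contrasts with its heuristics. The `utility` function (e.g., forecasting accuracy of a model trained on a coalition of datasets) is a placeholder; the enumeration of all coalitions is exactly what makes the exact computation O(2^N) utility evaluations and hence infeasible for thousands of contributors.

```python
from itertools import combinations
from math import factorial

def shapley_values(sources, utility):
    """Exact Shapley value of each data source under a coalition utility function.
    Enumerates every coalition of the remaining sources, i.e. O(2^N) evaluations."""
    n = len(sources)
    values = {s: 0.0 for s in sources}
    for s in sources:
        others = [x for x in sources if x != s]
        for k in range(len(others) + 1):
            for coalition in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                values[s] += weight * (utility(set(coalition) | {s}) - utility(set(coalition)))
    return values

# Toy usage: value of three hypothetical providers whose datasets "cover" records 1-5.
coverage = {"A": {1, 2, 3}, "B": {3, 4}, "C": {5}}
utility = lambda coal: len(set().union(*(coverage[s] for s in coal))) / 5 if coal else 0.0
print(shapley_values(list(coverage), utility))
```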

    Collaborative Networks, Decision Systems, Web Applications and Services for Supporting Engineering and Production Management

    This book focuses on fundamental and applied research on collaborative and intelligent networks, decision systems, and services for supporting engineering and production management, along with other kinds of problems and services. The development and application of innovative collaborative approaches and systems is of prime importance today, in the context of Industry 4.0. Special attention is given to flexible and cyber-physical systems and to advanced design, manufacturing and management based on artificial intelligence approaches and practices, among others, including social systems and services.
