755 research outputs found

    Materials for high-temperature thermoelectric conversion

    High-boron materials offering high efficiency for thermoelectric power generation and capable of prolonged operation at temperatures above 1200 °C are discussed. Background theoretical studies indicated that the low carrier mobility of materials with beta-boron and related structures is probably associated with a high density of traps. Experimental work was mainly concerned with silicon borides in view of promising data from European laboratories. A systematic study using structure determination and lattice-constant measurements failed to confirm the existence of an SiBn phase. Only SiB6 and a solid solution of silicon in beta boron, with a maximum solid solubility of 5.5-6 at.% at 1650 °C, were found.

    Direct observation of interface instability during crystal growth

    The general aim of this investigation was to study interface stability and solute segregation phenomena during crystallization of a model system. Emphasis was to be placed on direct observational studies, partly because this offered the possibility of later performing related experiments under substantially convection-free conditions in the Space Shuttle. The major achievements described in this report are: (1) the development of a new model system for fundamental studies of crystal growth from the melt, and the measurement of a range of material parameters necessary for comparing experiment with theory; (2) the introduction of a new method of measuring the segregation coefficient using absorption of a laser beam by the liquid phase; (3) a comparison of segregation in crystals grown by gradient freezing and by pulling from the melt; (4) the introduction of an interface-field term into the theory of solute segregation, and comparison with experiment; (5) the introduction of the interface-field term into the theories of constitutional supercooling and morphological stability, and an assessment of its importance.

    Dave Janney Stays True to his Calling


    Determining the impact of regulatory policy on UK gas use using Bayesian analysis on publicly available data

    This paper presents a novel method for analysing policy performance, using as an example UK legislation requiring domestic boilers fitted since 1 April 2005 to be condensing. A technological uptake model based on the logistic equation is combined with four physical and economic models, and Bayesian techniques are used for data analysis. Projections of energy savings are presented, and the impact of different policy implementation dates is investigated.
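    The logistic uptake model mentioned in the abstract can be sketched as follows. This is a minimal illustration of a logistic uptake curve, not the authors' calibrated model; the parameter names and values are hypothetical.

```python
import math

def logistic_uptake(t, k, t0, saturation=1.0):
    """Logistic technology-uptake curve: fraction of the boiler stock
    that is condensing at time t (in years). k is the growth rate and
    t0 the inflection year; both are illustrative, not fitted values."""
    return saturation / (1.0 + math.exp(-k * (t - t0)))

# Hypothetical trajectory: uptake accelerating around a 2007 inflection
shares = {year: logistic_uptake(year, k=0.5, t0=2007)
          for year in (2000, 2005, 2010, 2015)}
```

    In the paper's Bayesian setting, parameters such as k and t0 would be inferred from the publicly available installation data rather than fixed by hand.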

    The Importance of Heating System Transient Response in Domestic Energy Labelling

    European National Calculation Methods (NCMs), such as the UK Standard Assessment Procedure (SAP), are used to make standardised, simplified assessments of building energy performance. These NCMs contain simplifications to aid ease of use and the comparability of the resulting Energy Performance Certificates (EPCs). By comparing SAP with a modern dynamic modelling system, this study quantifies internal temperatures and thereby heating energy consumption. Results show that for the test house considered, SAP results correspond closely to those of a dynamic model with an idealised heating system offering perfect control and instant responsiveness. However, introducing a dynamic, physically realistic gas-fired boiler and water-based heating system into the model results in a consistent increase in internal temperature (0.5 °C) and in energy demand (by ca. 1000 kWh/a). Variations of further parameters within the dynamic model (controls and heat-source size) are presented and compared with SAP results and assumptions. Including more realistic dynamics in building energy modelling for NCMs may provide a better basis for effective decision making across a wide range of heating systems.

    Adaptive Normalization in Streaming Data

    In today's digital era, data are everywhere, from the Internet of Things to health care and financial applications. This leads to potentially unbounded, ever-growing Big Data streams that need to be utilized effectively. Data normalization is an important preprocessing technique for data analytics: it helps prevent mismodeling and reduces the complexity inherent in the data, especially for data integrated from multiple sources and contexts. Normalization of Big Data streams is challenging because of evolving inconsistencies, time and memory constraints, and the non-availability of the whole data set beforehand. This paper proposes a distributed approach to adaptive normalization for Big Data streams. Using sliding windows of fixed size, it provides a simple mechanism to adapt the statistics used to normalize the changing data in each window. Implemented on Apache Storm, a distributed real-time stream-processing framework, our approach exploits distributed data processing for efficient normalization. Unlike existing adaptive approaches that normalize data for a specific use (e.g., classification), ours is use-agnostic. Moreover, our adaptive mechanism allows flexible control, via user-specified thresholds, of the trade-off between normalization time and precision. The paper illustrates the proposed approach alongside a few other techniques, with experiments on both synthesized and real-world data. On 160,000 instances of a data stream, the normalized data obtained from our approach improve over the baseline by 89%, with a root-mean-square error of 0.0041 relative to the actual data.
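    The core idea of sliding-window adaptive normalization can be sketched in a few lines. This is a single-node illustration assuming min-max statistics over a fixed-size window; the paper's distributed (Apache Storm) execution and threshold-based time/precision controls are omitted, and the class and parameter names are this sketch's own.

```python
from collections import deque

class AdaptiveNormalizer:
    """Sliding-window min-max normalization for a data stream:
    statistics are recomputed over the most recent `window_size`
    values, so the scaling adapts as the stream drifts."""
    def __init__(self, window_size=100):
        self.window = deque(maxlen=window_size)  # old values fall out automatically

    def normalize(self, x):
        self.window.append(x)
        lo, hi = min(self.window), max(self.window)
        if hi == lo:              # degenerate window: all values equal
            return 0.0
        return (x - lo) / (hi - lo)

stream = [10, 12, 50, 11, 200, 13]
norm = AdaptiveNormalizer(window_size=4)
scaled = [norm.normalize(v) for v in stream]
```

    Because the window is bounded, memory use is constant regardless of stream length, which is the property that makes this feasible for unbounded streams.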

    Book Department


    Concept drift detection based on anomaly analysis

    © Springer International Publishing Switzerland 2014. In online machine learning, the ability to adapt to new concepts quickly is highly desirable. In this paper, we propose a novel concept drift detection method, called Anomaly Analysis Drift Detection (AADD), to improve the performance of machine learning algorithms in non-stationary environments. The proposed AADD method is based on an anomaly analysis of the learner's accuracy, associated with the similarity between the learner's training domain and the test data. The method first identifies whether there are conflicts between the current concept and newly arriving data. The learner then incrementally learns the non-conflicting data, which does not decrease its accuracy on previously trained data, for concept extension; otherwise, a new learner is created from the new data. Experiments illustrate that the AADD method can detect new concepts quickly and learn extensional drift incrementally.
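    The conflict test at the heart of such accuracy-based detectors can be sketched as follows. This is a deliberately simplified toy detector, not the authors' AADD algorithm: it only flags drift when windowed accuracy drops more than a fixed threshold below a running baseline, and all names and values are illustrative.

```python
class AccuracyDriftDetector:
    """Toy accuracy-anomaly drift detector: compare each window's
    accuracy against a smoothed baseline; a large drop signals a
    conflicting (drifted) concept, a small one signals data that
    can safely extend the current concept."""
    def __init__(self, threshold=0.15):
        self.threshold = threshold
        self.baseline = None

    def update(self, window_accuracy):
        if self.baseline is None:      # first window initialises the baseline
            self.baseline = window_accuracy
            return False
        drift = (self.baseline - window_accuracy) > self.threshold
        if not drift:
            # non-conflicting window: fold it into the baseline (concept extension)
            self.baseline = 0.9 * self.baseline + 0.1 * window_accuracy
        return drift

det = AccuracyDriftDetector(threshold=0.15)
flags = [det.update(a) for a in (0.92, 0.90, 0.91, 0.60)]
```

    In the paper, a drift flag would trigger creating a new learner for the new concept, while unflagged windows are learned incrementally by the existing one.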