
    Modifications to ACM Classifier and Fine Powder Coating for Plastic Components

    Powder coating is a dry coating technology with several advantages over conventional liquid coatings. However, its wider application has been limited by an inferior surface finish and a greater film thickness. Making the powder finer (fine powder technology) can provide a much better surface appearance and a smaller film thickness comparable to liquid coating, but producing fine powder products with narrow particle size distributions is much more difficult than producing coarse powder. Another issue restricting its application is the electrostatic spraying method, which largely confines powder coating to conductive substrates such as metals. In this study, to ensure a narrow particle size distribution of the fine powder products, nine modifications were made to the classifier of a widely used air classifying mill (ACM) by changing the air flow through it. For each modification, experiments were conducted under five operating conditions and repeated three times. According to the results of the 150 samples, the particle sizes and particle size distributions of the products were greatly affected by the classifier configuration. All nine modifications narrowed the particle size distributions better than the original classifier, without compromising collection efficiency. In addition, non-conductive plastics were employed as substrates in fine powder experiments using two popular commercial coating powders. The results showed that lowering the particle sizes and narrowing the particle size distributions of the coating powders contributed to better surface finishes on the workpieces. Because of the poor flowability of fine powder, different amounts of flow additives were used with the fine coating powders, and the optimum amount was selected by considering the effects on both flowability and surface quality. Furthermore, applying a voltage was shown to be an effective method of assisting pre-heating to increase transfer efficiency.
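
    The abstract does not state which measure of distribution width was used; as a purely illustrative sketch, one common convention is the span (D90 − D10)/D50 computed from measured particle sizes. The data and function name below are hypothetical, not taken from the study.

```python
import numpy as np

def distribution_span(sizes_um):
    """Width of a particle size distribution via the common span metric
    (D90 - D10) / D50. Only an illustrative convention; the study's own
    width measure is not specified in the abstract."""
    d10, d50, d90 = np.percentile(sizes_um, [10, 50, 90])
    return (d90 - d10) / d50

# Hypothetical measurements (micrometres) for a coarse and a fine batch.
coarse = np.random.default_rng(0).normal(35, 12, 1000).clip(1)
fine = np.random.default_rng(1).normal(20, 4, 1000).clip(1)
print("coarse span:", round(distribution_span(coarse), 2))
print("fine span  :", round(distribution_span(fine), 2))
```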

    Jonathan Edwards’s Judeo-centric and cosmic vision of the Millennial Kingdom

    The present study addresses the less-well-known subject of Jonathan Edwards's millennialism from the perspective of his redemptive-historical vision. By situating him in the Reformation and post-Reformation contexts, and taking into consideration his interaction with the intellectual challenges posed by some Enlightenment thinkers, this study attempts to provide a more nuanced and extensive investigation of Edwards's anticipation of the millennium. In a nutshell, as a typical example of a dramatic paradigm shift in millennialism between the sixteenth and eighteenth centuries, what Edwards expected was neither a political nor an America-centric utopia, as some scholars have presented. Rather, his vision of the millennium is a Christ-reigning, Judeo-centric and cosmic kingdom arriving on earth in the distant future. As indispensable parts of Edwards's theological system, the less-known Christological, Judeo-centric and cosmic dimensions of Edwards's millennialism highlight the greatness of God's divine sovereignty, the magnificence of His glory and the capaciousness of His kingdom. This millennial vision departed significantly from the Reformed tradition in certain ways. In particular, while some of his Protestant predecessors and Puritan contemporaries tended to centralize, or even sacralize, their own time and nations, Edwards decentralized England and New England in terms of time, space and people. This study sheds new light on a number of neglected and controversial issues. Firstly, it provides a fresh and extensive review of Edwards's millennial theology and offers another outlook on Edwards's continuity with, and departures from, his Reformed tradition. Secondly, it explores Edwards's Christocentric conviction and his artful communication of this conviction in his millennialism, offering a groundbreaking perspective on the correlation between Edwards's Christology and his eschatology. Thirdly, the presentation of the Judeo-centric and cosmic nature offers an innovative interpretive key to his millennialism and provides a background to current debates on Israel and the end times. Finally, this study ventures into two less well-known subjects: Israel and China in Edwards's millennial vision. In particular, it provides new insights into his conviction of Israel's restoration to the Promised Land and his eschatological hope for China and the heathen world.

    Profiling Good Leakage Models For Masked Implementations

    The leakage model plays a very important role in side channel attacks: an accurate leakage model greatly improves the efficiency of an attack. However, how to profile a good enough leakage model, or how to measure the accuracy of a leakage model, has seldom been studied. Durvaux et al. proposed leakage certification tests to profile good enough leakage models for unmasked implementations, but they left leakage model profiling for protected implementations as an open problem. To solve this problem, we propose the first practical higher-order leakage model certification tests for masked implementations. First- and second-order attacks are performed on simulations of serial and parallel implementations of a first-order fixed masking, and a third-order attack is performed on a further simulation of a second-order random masked implementation. The experimental results show that our new tests profile the leakage models accurately.
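
    The abstract does not give the attack details. The sketch below only illustrates the kind of experiment described: a second-order CPA, with centered-product combining and a Hamming-weight model, run against a simulated serial first-order Boolean masking. All names, the random stand-in S-box and the noise level are assumptions for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
SBOX = rng.permutation(256)                      # stand-in for a real S-box such as AES's
HW = np.array([bin(x).count("1") for x in range(256)])

def simulate_first_order_masked_traces(n, key_byte, noise=1.0):
    """Simulate a serial first-order Boolean masking: each trace leaks the
    Hamming weight of the mask and of the masked S-box output at two
    separate time samples, plus Gaussian noise."""
    pt = rng.integers(0, 256, n)
    mask = rng.integers(0, 256, n)
    masked_sbox = SBOX[pt ^ key_byte] ^ mask
    l_mask = HW[mask] + rng.normal(0, noise, n)
    l_data = HW[masked_sbox] + rng.normal(0, noise, n)
    return pt, np.column_stack([l_mask, l_data])

def second_order_cpa(pt, traces):
    """Centered-product combining of the two leaking samples, then
    correlation against a Hamming-weight model for every key guess."""
    combined = np.prod(traces - traces.mean(axis=0), axis=1)
    scores = np.empty(256)
    for k in range(256):
        model = HW[SBOX[pt ^ k]].astype(float)
        scores[k] = abs(np.corrcoef(model, combined)[0, 1])
    return scores

pt, traces = simulate_first_order_masked_traces(20000, key_byte=0x3C)
scores = second_order_cpa(pt, traces)
print("best key guess:", hex(int(scores.argmax())))
```

    Centered-product combining is the usual choice for second-order CPA on two leaking samples; a third-order attack on a second-order masking would combine three samples in the same spirit.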

    Towards Easy Key Enumeration

    Key enumeration solutions are post-processing schemes for the output sequences of side channel distinguishers, whose application is hindered by very large key candidate spaces and computational power requirements. An attacker may spend several days or months enumerating a huge key space (e.g. 2^{40}). In this paper, we aim to pre-process and reduce the key candidate space by deleting impossible key candidates before enumeration. A new distinguisher named Group Collision Attack (GCA) is given. Moreover, we introduce key verification into key recovery, and a new divide-and-conquer strategy named Key Grouping Enumeration (KGE) is proposed. KGE divides the huge key space into several groups and uses GCA to delete impossible key combinations and output the possible ones in each group. KGE then recombines the remaining key candidates of the groups using verification. The number of remaining key candidates becomes much smaller through these two impossible-candidate deletion steps, at a small computational cost. The attacker can therefore use KGE as a pre-processing tool for key enumeration and enumerate the key more easily and quickly in a much smaller candidate space.
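
    The abstract does not specify the internals of GCA or KGE. The following is only a sketch of the general divide-filter-recombine pattern it describes, with a simple per-group score ranking standing in for GCA and a toy 4-byte key; all parameters and scores are hypothetical.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-byte distinguisher scores: scores[i][k] rates value k
# as a candidate for key byte i.
NUM_BYTES, GROUP_SIZE = 4, 2          # toy key of 4 bytes, split into groups of 2
scores = rng.random((NUM_BYTES, 256))

def filter_group(byte_indices, keep=64):
    """Stand-in for the paper's Group Collision Attack: rank the joint
    candidates of one group by summed per-byte scores and keep only the
    top `keep` combinations, discarding the rest as 'impossible'."""
    joint = [
        (sum(scores[i][v] for i, v in zip(byte_indices, values)), values)
        for values in itertools.product(range(256), repeat=len(byte_indices))
    ]
    joint.sort(key=lambda t: -t[0])
    return [values for _, values in joint[:keep]]

def grouped_enumeration():
    """Recombine the surviving candidates of each group; a real attack
    would verify each full key, e.g. against a known plaintext/ciphertext."""
    groups = [list(range(g, g + GROUP_SIZE)) for g in range(0, NUM_BYTES, GROUP_SIZE)]
    survivors = [filter_group(g) for g in groups]
    for combo in itertools.product(*survivors):
        yield tuple(v for part in combo for v in part)

remaining = sum(1 for _ in grouped_enumeration())
print("candidates left to enumerate:", remaining)   # 64 * 64 = 4096 instead of 256**4
```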

    Towards Optimal Pre-processing in Leakage Detection

    An attacker or evaluator can detect more information leakage by improving the Signal-to-Noise Ratio (SNR) of the power traces used in his tests. For this purpose, pre-processing such as de-noising and distribution-based trace biasing is used. However, the existing trace biasing schemes cannot accurately capture the characteristics of power traces with high SNR, making them not ideal for leakage detection. Moreover, if the SNR of the power traces is very low, it is very difficult to use the existing de-noising and trace biasing schemes to enhance leakage detection. In this paper, a known-key pre-processing tool named Traces Linear Optimal Biasing (TLOB) is proposed, which performs very well even on power traces with very low SNR. It accurately evaluates the noise of time samples and gives a reliable optimal biasing of the traces. Experimental results show that TLOB significantly reduces the number of traces needed for detection; correlation coefficients in ρ-tests using TLOB approach 1.00, so the confidence of the tests is significantly improved. As far as we know, no pre-processing tool is more efficient than TLOB. TLOB is very simple and incurs only very limited time and memory overhead. We strongly recommend using it to pre-process traces in side channel evaluations.
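
    TLOB's exact construction is not given in the abstract. The sketch below only shows the general shape of a known-key biasing step before a ρ-test: estimate each trace's noise from its residual against the known-key model and keep the least noisy traces. This is an illustration of the idea, not the TLOB algorithm, and all names and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
HW = np.array([bin(x).count("1") for x in range(256)])

# Simulated leakage at one time sample: Hamming weight of a known
# intermediate plus Gaussian noise (known-key setting, low SNR).
n, sigma = 5000, 4.0
inter = rng.integers(0, 256, n)
traces = HW[inter] + rng.normal(0, sigma, n)

def rho_test(model, leakage):
    """Correlation-based leakage detection (rho-test)."""
    return np.corrcoef(model, leakage)[0, 1]

def bias_traces(model, leakage, keep_fraction=0.5):
    """Generic known-key trace biasing (an illustration, not TLOB itself):
    estimate each trace's noise as its residual from the mean of its
    model class and keep the least noisy fraction of the traces."""
    classes = np.unique(model)
    class_means = np.array([leakage[model == c].mean() for c in classes])
    residual = np.abs(leakage - class_means[np.searchsorted(classes, model)])
    keep = residual.argsort()[: int(keep_fraction * len(leakage))]
    return model[keep], leakage[keep]

model = HW[inter].astype(float)
print("rho before biasing:", round(rho_test(model, traces), 3))
m_b, t_b = bias_traces(model, traces)
print("rho after biasing :", round(rho_test(m_b, t_b), 3))
```

    Discarding the noisiest traces raises the correlation coefficient at the leaking sample, which is the effect the abstract attributes to biasing; the actual TLOB weighting is presumably more refined than this selection rule.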