
    FEM analysis on the deformation behavior of flange portion during early stage in deep drawing of duplex embossed sheet metal

    Duplex embossed sheet metals are very useful due to their high rigidity. Furthermore, it is interesting that the periodic configuration given by embossing brings a new macroscopic feature into the sheet metal. In this study, in order to isolate the effect of the sub-macroscopic structure alone, i.e., the configuration given to the sheet metal by the duplex embossing process, the material was set to be isotropic, and FEM simulation was carried out to investigate the deformation behavior of the flange portion during the early stage of deep drawing of a duplex embossed sheet metal.

    FE analysis on the influence of width direction deformation on springback control in v-bending by sheet forging

    There are several problems in press engineering, and springback is one of the most representative phenomena. Traditionally, a series of empirical methods has been used to obtain the target bending angle, but such methods rely on the ability and experience of the engineer. Therefore, systematic control of springback is important. From the viewpoint of plastic working, it is considered that springback can be controlled by a sheet forging step, applied after the V-bending process, using a punch with a single lump. On the other hand, warp occurs in air bending when the ratio of width to thickness is relatively small, so width-direction deformation is expected to affect springback control to some extent when warp occurs. In this study, V-bending and the subsequent continuous forging process were simulated by FE analysis. The analytical results showed the occurrence of warp, and the model of these processes was then modified to take the warp into account. Finally, it was found that springback was controlled to some extent by the width-direction deformation.
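
    As background (a textbook elastic-unloading relation, not the specific model used in this study), springback is the elastic recovery that occurs when the bending moment is released; for a wide strip, the curvature change on unloading is approximately

        Δ(1/ρ) = M / (E' I),   with E' = E / (1 - ν^2) and I = t^3 / 12 per unit width,

    where M is the bending moment per unit width at the end of loading, E is Young's modulus, ν is Poisson's ratio, and t is the sheet thickness. Reducing the elastic moment stored in the bend therefore reduces this recovery and hence the change in bending angle.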

    New Implementations for Tabulating Pseudoprimes and Liars

    Whether applied to primality testing or cryptography, pseudoprimes are one of the most important topics in number theory. Regarding the study of strong pseudoprimes, there are two problems which mathematicians have been working on: 1. Given a and b, find all a-spsp up to b. 2. Given an odd composite n, find all bases a < n such that n is an a-spsp. Here "n is an a-spsp" means that n is a strong pseudoprime to base a, and a is a strong liar of n. The two problems are respectively referred to as the tabulation of strong pseudoprimes and the tabulation of strong liars. The main focus of my work in this research project is on the tabulation of strong liars. This can be achieved by applying the multiplicative group modulo n, denoted (Z/nZ)*. Instead of checking each potential candidate, we can construct the set of Fermat liars, a weaker notion than strong liars, from the bottom up with the help of the primitive roots of (Z/pZ)* for all prime factors p of n. We then sieve out the set of strong liars with the MillerWitness() function in the NTL library. By implementing the algorithms with appropriate data structures, I verified that in most cases the runtimes improved compared to previous algorithms or brute force. This is to be expected, since the new algorithms have lower theoretical computational complexity. All implementations of the algorithms in this research project are in C++. In 2010, Professors Mark Liffiton and Andrew Shallue built a new computer system for research purposes, known as Hyperion. This is essentially a cluster of 8 computers, also referred to as nodes, which can carry out complex distributed computations. The operating system installed on all nodes of the cluster is Linux. By submitting shell scripts with the job instructions from its front end, I can grab a cup of coffee while the programs run on the cluster. This feature is particularly beneficial for this project, since the programs sometimes take tens of minutes to return all the outputs.
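
    As a rough illustration of the pipeline described above (a minimal sketch, not the project's code), the following C++ program tabulates the strong liars of a toy composite using NTL: Fermat liars are collected modulo each prime factor, combined with the Chinese Remainder Theorem, and then sieved with MillerWitness(). The per-prime liar sets are found here by a direct scan purely to keep the sketch short, whereas the project builds them from primitive roots of (Z/pZ)*; the hard-coded example n = 91 = 7 * 13 and its factorization are assumptions of this sketch.

        // Sketch only: tabulate the strong liars of a small odd squarefree composite n.
        // Fermat liars are assembled per prime factor and combined via the CRT; NTL's
        // MillerWitness() then keeps only the strong liars.
        #include <NTL/ZZ.h>
        #include <iostream>
        #include <vector>

        using namespace NTL;

        int main() {
            ZZ n;
            n = 91;                                  // toy composite, n = 7 * 13
            long p1 = 7, p2 = 13;                    // assumed-known prime factors

            // Fermat liars modulo a prime p: residues a with a^(n-1) = 1 (mod p).
            // (Direct scan for brevity; the bottom-up method uses a primitive root of p.)
            auto liarsModP = [&](long p) {
                std::vector<long> L;
                for (long a = 1; a < p; ++a)
                    if (PowerMod(conv<ZZ>(a), n - 1, conv<ZZ>(p)) == 1)
                        L.push_back(a);
                return L;
            };
            std::vector<long> L1 = liarsModP(p1), L2 = liarsModP(p2);

            // Combine one residue per prime factor with the CRT, then keep only the
            // bases that are NOT Miller witnesses, i.e. the strong liars of n.
            ZZ P1 = conv<ZZ>(p1), P2 = conv<ZZ>(p2);
            for (long a1 : L1) {
                for (long a2 : L2) {
                    ZZ a = conv<ZZ>(a1);             // a = a1 (mod p1)
                    ZZ t = MulMod(SubMod(conv<ZZ>(a2), a % P2, P2), InvMod(P1 % P2, P2), P2);
                    a += P1 * t;                     // now also a = a2 (mod p2), 0 <= a < n
                    if (MillerWitness(n, a) == 0)    // not a witness => a is a strong liar
                        std::cout << a << " is a strong liar of " << n << "\n";
                }
            }
            return 0;
        }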

    Health Risk of Heating Fuel Choice: A Simultaneity Causality Analysis

    Combustion-generated pollutants, principally those from solid fuels such as biomass and coal used for cooking and heating, pose a significant public health hazard in both developed and developing countries. Most of the existing studies addressing this issue focus on developing countries, and on exposure during cooking rather than heating. Using rural Kentucky data, this research explores the health risk associated with heating fuel choice. Given the simultaneity between heating fuel choice and the prevalence of asthma and allergy, we obtain the instrumental variable (IV) estimate for Logit models through the Generalized Method of Moments (GMM). After correcting for simultaneity bias, we do not find strong evidence supporting a causal relationship between the use of polluting heating fuels and the prevalence of asthma, allergy, and other respiratory disease. Some demographic and lifestyle factors do have significant effects on the prevalence of these diseases.
    Keywords: combustion-generated pollutants, indoor air pollution, heating fuel choice, health risk, GMM-IV estimation, Environmental Economics and Policy, Health Economics and Policy. JEL codes: Q53, I18.
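
    For reference, a generic form of the moment conditions behind such a GMM-IV Logit estimator (not necessarily the exact specification used in this paper) is

        E[ Z' ( y - Λ(Xβ) ) ] = 0,   where Λ(u) = exp(u) / (1 + exp(u)),

    with y the disease indicator, X containing the potentially endogenous heating fuel choice plus controls, and Z the instruments; β is then chosen to minimize the GMM criterion g(β)' W g(β) built from the sample analogue of these moments.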

    Semantic Proximity Alignment: Towards Human Perception-consistent Audio Tagging by Aligning with Label Text Description

    Most audio tagging models are trained with one-hot labels as the supervised information. However, one-hot labels treat all sound events equally, ignoring the semantic hierarchy and proximity relationships between sound events. In contrast, the event descriptions contain richer information, describing the distance between different sound events in terms of semantic proximity. In this paper, we explore the impact of training audio tagging models with auxiliary text descriptions of sound events. By aligning the audio features with the text features of the corresponding labels, we inject the hierarchy and proximity information of sound events into audio encoders, improving performance while making the predictions more consistent with human perception. We refer to this approach as Semantic Proximity Alignment (SPA). We use Ontology-aware mean Average Precision (OmAP) as the main evaluation metric for the models. OmAP reweights the false positives based on the AudioSet ontology distance and is more consistent with human perception than mAP. Experimental results show that audio tagging models trained with SPA achieve higher OmAP than models trained solely with one-hot labels (+1.8 OmAP). Human evaluations also demonstrate that the predictions of SPA models are more consistent with human perception.
    Comment: 5 pages, 3 figures. Submitted to ICASSP 202
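
    The abstract does not spell out the alignment objective, but a common form for this kind of audio-text alignment (an assumption here, not necessarily the SPA loss) is to add a term that pulls each audio embedding toward the text embedding of its label description, e.g.

        L = L_cls + λ · ( 1 - cos( f_audio(x), f_text(d_y) ) ),

    where f_audio and f_text are the audio and text encoders, d_y is the text description of label y, and λ weights the alignment term against the usual classification loss L_cls, so that events whose descriptions are semantically close end up with close audio representations.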

    Nonpoint Source Abatement Costs in the Kentucky River Watershed

    A growing share of water pollution in the U.S. can be attributed to nonpoint sources (USEPA 2002). Some of this trend can be attributed to declining point source (PS) emissions as a result of regulation under the Clean Water Act (CWA). However, fertilizer-intensive practices used to improve agricultural productivity over recent decades have also increased nitrate loads and resulted in water quality impairments. Nonpoint source (NPS) pollution from agricultural practices is generally exempt from federal regulation. However, some voluntary programs allow point sources subject to the CWA’s effluent limitations to meet their standards by purchasing offset credits reflecting reductions in NPS discharges to the same waters (USEPA 2004). Such water quality trading (WQT) programs have been implemented in a number of states to reduce pollution abatement costs (Breetz et al. 2004). In this setting, NPS supply pollution abatement when they implement best management practices (BMPs) that reduce nutrient loads, and the costs of those BMPs form a supply curve for credits. WQT programs are supported by the EPA as an important means for efficiently pursuing water quality goals (USEPA 2003a). Among the BMPs available for water quality management, riparian buffer strips have proven effective in mitigating the movement of nutrients and other pollutants into surface waters (Qiu et al. 2006). Estimates of riparian buffer costs would be valuable for developing policy related to WQT and other conservation programs. This paper estimates the annual costs of buffer strips in six counties in the Lower Kentucky River Basin, as part of a project evaluating the feasibility of WQT programs in that area.
    Keywords: Environmental Economics and Policy.

    The Supply of Private Acreage for Public Recreational Use in Southern and Central Appalachia

    Public lands in Southern and Central Appalachia (SCA) available for outdoor recreational pursuits are limited relative to the rest of the country. This study identifies factors that encourage private landowners to permit public access to their land for recreational purposes and determines how much acreage would be offered in the Southern and Central Appalachia region. The Tobit and Heckman’s sample selection models suggest that the probability of offering land to the public is correlated with the number of acres offered. Having acreage suited for recreation is a positive determinant of acres leased, but attributes developed by the landowner act as a disincentive. Type of recreational activity has no effect on the landowner’s decision, and the supply of recreational acreage is inelastic with respect to price.
    Keywords: private land lease, recreational activities, Tobit model, Heckman’s sample selection model, public recreational use, Environmental Economics and Policy, Land Economics/Use, Resource/Energy Economics and Policy. JEL codes: Q24, Q26.
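
    For context, the textbook form of Heckman’s sample selection model (not necessarily the exact specification estimated here) pairs a Probit selection equation with an outcome equation observed only for participants:

        selection:  s_i = 1[ Z_i γ + u_i > 0 ]
        outcome:    acres_i = X_i β + ε_i,   observed only when s_i = 1,

    and the two-step estimator corrects the outcome regression for selection by adding the inverse Mills ratio λ(Z_i γ) = φ(Z_i γ) / Φ(Z_i γ), estimated from the Probit step, as an extra regressor.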

    Physical Perception Network and an All-weather Multi-modality Benchmark for Adverse Weather Image Fusion

    Multi-modality image fusion (MMIF) integrates the complementary information from different modal images to provide a comprehensive and objective interpretation of a scene. However, existing MMIF methods lack the ability to resist different weather interferences in real-life scenarios, preventing them from being useful in practical applications such as autonomous driving. To bridge this research gap, we propose an all-weather MMIF model. Moreover, the network designs of existing deep learning architectures are often viewed as black boxes, which limits their multitasking capabilities. For the deweathering module, we propose a physically aware clear-feature prediction module based on an atmospheric scattering model that can deduce variations in light transmittance from both scene illumination and depth. For the fusion module, we utilize a learnable low-rank representation model to decompose images into low-rank and sparse components. This highly interpretable feature separation allows us to better observe and understand images. Furthermore, we have established a benchmark for MMIF research under extreme weather conditions. It encompasses multiple scenes under three types of weather: rain, haze, and snow, with each weather condition further subdivided into various impact levels. Extensive fusion experiments under adverse weather demonstrate that the proposed algorithm has excellent detail recovery and multi-modality feature extraction capabilities.
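
    For reference, the atmospheric scattering model that the deweathering module is said to build on is usually written as

        I(x) = J(x) t(x) + A (1 - t(x)),   t(x) = exp(-β d(x)),

    where I is the observed degraded image, J the clear scene radiance, A the global atmospheric light, β the scattering coefficient, and d(x) the scene depth; the transmittance t(x) is the quantity inferred from illumination and depth. The fusion stage's low-rank representation likewise follows the familiar decomposition X ≈ L + S of image features into a low-rank component L and a sparse component S.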