
    A Two-stage Polynomial Method for Spectrum Emissivity Modeling

    Spectral emissivity is key to temperature measurement by radiation methods, but it is not easy to determine in a combustion environment because of the interrelated influence of the temperature and wavelength of the radiation. In multi-wavelength radiation thermometry, knowing the spectral emissivity of the material is a prerequisite. In many circumstances, however, this property is a complex function of temperature and wavelength, and reliable models have yet to be found. In this study, a two-stage partitioned low-order polynomial fit is proposed for multi-wavelength radiation thermometry. In the first stage, a spectral emissivity model is established as a function of temperature; in the second stage, a mathematical model is established to describe how the fitted coefficients depend on the wavelength of the radiation. The new model is tested against spectral emissivity data for tungsten, and good agreement was found, with a maximum error of 0.64.
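The two-stage idea can be sketched as follows: first fit a low-order polynomial in temperature at each wavelength, then fit each of those polynomial coefficients as a polynomial in wavelength. This is a minimal illustration on a synthetic, smooth emissivity surface, not the authors' implementation; the function names, units, and quadratic orders are assumptions.

```python
import numpy as np

def two_stage_fit(temps, wavelengths, emissivity, deg_T=2, deg_w=2):
    """Stage 1: fit emissivity vs. temperature at each wavelength.
    Stage 2: fit each temperature-polynomial coefficient vs. wavelength."""
    # Stage 1: one polynomial in T per wavelength -> table of coefficients
    coeffs = np.array([np.polyfit(temps, emissivity[:, j], deg_T)
                       for j in range(len(wavelengths))])   # (n_w, deg_T + 1)
    # Stage 2: model each coefficient as a polynomial in wavelength
    coeff_models = [np.polyfit(wavelengths, coeffs[:, k], deg_w)
                    for k in range(deg_T + 1)]

    def predict(T, w):
        c = [np.polyval(m, w) for m in coeff_models]  # coefficients at this w
        return np.polyval(c, T)
    return predict

# Synthetic emissivity surface, smooth in both variables
temps = np.linspace(1.6, 2.4, 9)        # temperature in 10^3 K
wavelengths = np.linspace(0.5, 2.0, 7)  # wavelength in micrometres
emissivity = np.array([[0.45 - 0.01 * T - 0.05 * w + 0.01 * T * w
                        for w in wavelengths] for T in temps])

predict = two_stage_fit(temps, wavelengths, emissivity)
true_val = 0.45 - 0.01 * 2.0 - 0.05 * 1.0 + 0.01 * 2.0 * 1.0
err = abs(predict(2.0, 1.0) - true_val)
```

Since the synthetic surface is bilinear, the two quadratic stages recover it essentially exactly; for real emissivity data the partition and polynomial orders would have to be chosen per material.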

    Supply optimization model in the hierarchical geographically distributed organization

    The strategic importance of the procurement function in large organizations requires that logistics management use effective tools to justify decisions in the supply process. The architecture of hierarchical geographically distributed organizations allows a hybrid supply scheme that rationally combines the advantages of centralized and decentralized purchasing and supply management (PSM). This article proposes a supply optimization model for the hierarchical geographically distributed organization (HGDO) that reflects the features of a complex, multifactorial, multi-stage procurement process. The model finds optimal options for purchasing and supplying products under the criterion of minimizing the total logistics costs of the process over the entire HGDO logistics-support planning period, taking into account the parameters of the participants and the logistics functions of the procurement process in each time period. The model is an effective tool for supporting and coordinating decisions made by logistics managers at different levels of HGDO management, based on numerous options for purchasing, supplying, and budgeting products under dynamic and diverse internal and external factors.
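A toy version of such an optimization can be sketched as a discrete choice, for each regional branch, between a centralized and a local (decentralized) supply channel, minimizing total logistics cost. All branch names and cost figures below are hypothetical, and the paper's model is multi-period and far richer; this only illustrates why a hybrid scheme can beat either pure policy.

```python
from itertools import product

# Hypothetical per-branch logistics costs (purchase + transport + holding)
costs = {
    "branch_A": {"central": 90.0, "local": 135.0},
    "branch_B": {"central": 150.0, "local": 140.0},
    "branch_C": {"central": 110.0, "local": 115.0},
}
CENTRAL_FIXED = 30.0  # fixed cost incurred once if any branch buys centrally

def total_cost(assignment):
    cost = sum(costs[b][ch] for b, ch in assignment.items())
    if any(ch == "central" for ch in assignment.values()):
        cost += CENTRAL_FIXED
    return cost

# Brute-force search over all channel assignments (fine at this toy scale)
branches = sorted(costs)
best = min(
    (dict(zip(branches, combo))
     for combo in product(("central", "local"), repeat=len(branches))),
    key=total_cost,
)
```

With these numbers the optimum is a hybrid: branches A and C buy centrally while B buys locally, which is cheaper than either fully centralized or fully decentralized supply.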

    Technical Efficiency of Maize Production in Fluoride Affected Locales, Tamil Nadu: A Stochastic Frontier Approach

    This study estimates the technical efficiency of maize production in fluoride-affected and non-affected locales of Tamil Nadu. A multi-stage sampling method combining purposive and random sampling procedures was employed to draw the samples for primary data collection; the sample size was 120. A stochastic frontier production function was used to estimate the technical efficiency of maize production. The results indicated that FYM, potassium, machine power, irrigation, and the management index significantly influence maize yield in the less fluoride-affected locale; seed rate, nitrogen, phosphorus, machine power, and irrigation significantly influence yield in the moderately affected locale; seed rate, nitrogen, phosphorus, potassium, and irrigation significantly influence yield in the highly affected locale; and nitrogen, potassium, irrigation, and the management index significantly influence yield in the non-affected locale. The study suggests that awareness of fluoride contamination and averting measures must be disseminated to farmers.
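The frontier idea can be illustrated with a simplified corrected-OLS (COLS) estimator on a log-linear production function, rather than the full stochastic frontier maximum-likelihood estimation used in the paper: fit by OLS, shift the intercept so the frontier envelops the data, and read off technical efficiency from the residuals. The data and input names below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # matches the paper's sample size

# Synthetic log inputs (think seed rate, nitrogen, irrigation) and log yield
X = rng.normal(0.0, 1.0, size=(n, 3))
true_beta = np.array([0.3, 0.2, 0.1])
inefficiency = rng.exponential(0.15, size=n)              # one-sided term
log_yield = 2.0 + X @ true_beta - inefficiency + rng.normal(0.0, 0.01, n)

# OLS fit of the log-linear (Cobb-Douglas style) production function
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, log_yield, rcond=None)
resid = log_yield - A @ beta_hat

# Corrected OLS: shift the intercept to the best-performing farm;
# technical efficiency is exp(residual - max residual), in (0, 1]
te = np.exp(resid - resid.max())
```

The slope estimates remain consistent under this setup; only the intercept absorbs the mean inefficiency, which is why the correction is applied there.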

    Deep Brain Stimulation: Prediction Model for Volume of Tissue Stimulated

    An attempt has been made to understand deep brain stimulation in humans via marketable devices approved by the US Food and Drug Administration. Electrode characteristics were shown to influence the volume of tissue stimulated (VTS). Experimental data were digitized, and VTS was presented as a function of electrode diameter, aspect ratio, applied current, pulse duration, voltage, and frequency. Among other trends, voltage and VTS were found to exhibit a three-stage relation in which the Stage II characteristics were represented by a linear equation. It is in this stage that the VTS was found to be stable and where the most effective stimulation is experienced. In this study, three multivariate mathematical models are presented that were developed to predict the VTS in terms of the above independent parameters. Because limited data were available in the literature, validation of the models against a wide range of data was not possible. However, the equations provide both a theoretical and an experimental basis for predicting the volume of tissue stimulated in deep brain stimulation and will be very useful in future studies.
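The three-stage voltage-VTS relation can be sketched by fitting a line only over the middle (Stage II) window of a synthetic curve with a plateau at onset and saturation at high voltage. The curve shape, window bounds, and numbers are illustrative assumptions, not digitized data from the study.

```python
import numpy as np

# Synthetic three-stage voltage-VTS curve
voltage = np.linspace(0.0, 10.0, 101)
vts = np.piecewise(
    voltage,
    [voltage < 2.0, (voltage >= 2.0) & (voltage <= 7.0), voltage > 7.0],
    [5.0,                               # Stage I: near-threshold plateau
     lambda v: 5.0 + 12.0 * (v - 2.0),  # Stage II: linear growth
     65.0],                             # Stage III: saturation
)

# Fit a line to the Stage II window only (window bounds assumed known here)
mask = (voltage >= 2.0) & (voltage <= 7.0)
slope, intercept = np.polyfit(voltage[mask], vts[mask], 1)
```

In practice the stage boundaries would themselves be estimated (e.g. by segmented regression) before the Stage II line is fit.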

    Stability Comparisons of Estimators

    A property of estimators called stability is investigated in this paper. The stability of an estimator is a measure of the magnitude of the effect of any single observation in the sample on the realized value of the estimator. High stability is often desirable for robustness against misspecification and against highly variable observations. Stabilities are determined and compared for a wide variety of estimators and econometric models. Estimators considered include least squares, maximum likelihood (both LIML and FIML), instrumental variables, M-estimators, and multi-stage estimators such as two- and three-stage least squares, Zellner's feasible Aitken estimator of the multivariate regression model, and Heckman's estimator of censored regression and self-selection models. The general results of the paper apply to numerous additional estimators of various models. The stability of an estimator is found to depend on the number of finite moments of its influence curve (evaluated at a random observation in the sample). An estimator's stability increases strictly and continuously from zero to one as the number of finite moments of its influence curve increases from one to infinity: the more moments, the higher the stability. Since it is often possible to construct estimators with a specified influence function, estimators with different stabilities can be constructed. For example, one can attain the maximum possible stability by formulating a bounded influence estimator, since such estimators have an infinite number of finite moments.
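The effect of a single observation on an estimator can be made concrete with a leave-one-out computation for the OLS slope, a finite-sample analogue of evaluating the influence curve at each observation. The data are synthetic, with one gross error planted at a high-leverage point.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 50)
x[0] = 3.0                        # high-leverage point
y = 2.0 * x + rng.normal(0.0, 0.5, 50)
y[0] = 2.0 * x[0] + 10.0          # gross error at the high-leverage point

def ols_slope(x, y):
    return np.polyfit(x, y, 1)[0]

full_slope = ols_slope(x, y)
# Change in the realized slope when each observation is deleted in turn
influence = np.array([full_slope - ols_slope(np.delete(x, i), np.delete(y, i))
                      for i in range(x.size)])
most_influential = int(np.abs(influence).argmax())
```

Least squares has an unbounded influence curve, so a single aberrant point dominates; a bounded-influence estimator would cap each entry of this array, which is exactly the construction the paper associates with maximum stability.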

    Multi-stage feature decorrelation constraints for improving CNN classification performance

    For a convolutional neural network (CNN) used for pattern classification, the training loss function is usually applied only to the final output of the network, apart from some regularization constraints on the network parameters. However, as the number of network layers increases, the influence of the loss function on the front layers of the network gradually decreases, and the network parameters tend to fall into local optima. At the same time, the trained network is found to have significant information redundancy in the features at all stages, which reduces the effectiveness of the feature mapping at each stage and is not conducive to moving the subsequent network parameters toward the optimum. It is therefore possible to obtain a more optimized network, and further improve its classification accuracy, by designing a loss function that constrains the front-stage features and eliminates their information redundancy. This article proposes a multi-stage feature decorrelation loss (MFD Loss) for CNNs, which refines effective features and eliminates information redundancy by constraining the correlation of features at all stages. Considering that a CNN has many layers, experimental comparison and analysis led to applying MFD Loss to multiple front layers of the CNN, constraining the output features of each layer and each channel, and training jointly with the classification loss function. Compared with supervised learning under Softmax Loss alone, experiments on several commonly used datasets and several typical CNNs show that the classification performance of Softmax Loss + MFD Loss is significantly better. Meanwhile, comparison experiments before and after combining MFD Loss with some other typical loss functions verify its good universality.
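The core penalty can be sketched in a few lines: given a batch of per-stage features, penalize the squared off-diagonal entries of the channel correlation matrix, which is zero exactly when channels carry no redundant (linearly correlated) information. This is a framework-agnostic numpy sketch of a decorrelation term in the spirit of MFD Loss, not the authors' code; in training it would be added to the classification loss at several front stages.

```python
import numpy as np

def feature_decorrelation_loss(features, eps=1e-8):
    """features: (batch, channels). Mean squared off-diagonal channel
    correlation; zero iff the channels are pairwise uncorrelated."""
    f = features - features.mean(axis=0, keepdims=True)
    f = f / (f.std(axis=0, keepdims=True) + eps)
    corr = (f.T @ f) / f.shape[0]          # channel correlation matrix
    off = corr - np.diag(np.diag(corr))    # zero out the diagonal
    c = features.shape[1]
    return float((off ** 2).sum() / (c * (c - 1)))

rng = np.random.default_rng(0)
independent = rng.normal(size=(256, 8))                        # no redundancy
duplicated = np.repeat(rng.normal(size=(256, 1)), 8, axis=1)   # fully redundant
low = feature_decorrelation_loss(independent)
high = feature_decorrelation_loss(duplicated)
```

Fully duplicated channels drive the penalty toward 1, while independent channels keep it near 0, which is the gradient signal that pushes each stage toward non-redundant feature maps.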

    Aging reduces left atrial performance during adrenergic stress in middle aged and older patients

    Background: During adrenergic stress, the influence of age on left atrial (LA) function is unknown. We hypothesized that aging decreases the LA total emptying fraction (LAEF) during maximal adrenergic stress. The aim of the study was to determine the influence of aging on LA function during adrenergic stress in middle-aged and older patients. Methods: We enrolled 167 middle-aged and elderly participants and measured LA and left ventricular (LV) volumes using a multi-slice three-dimensional cine white-blood cardiovascular magnetic resonance (CMR) technique before and during intravenous dobutamine infused to achieve 80% of the maximum heart rate response for age. A paired-sample t-test was used to detect differences in LA and LV volumes between the baseline and peak-dose stages of dobutamine stress CMR, and multivariable linear regression was used to identify predictors of LA function. Results: Participants averaged 68 ± 8 years in age; 53% were men, 25% exhibited coronary artery disease, 35% had diabetes, 9% had a remote history of atrial fibrillation, 90% had hypertension, and 11% had inducible LV wall motion abnormalities indicative of ischemia during dobutamine CMR. Increasing age correlated with LA volumes (maximal and minimal) and inversely correlated with LAEF at rest and after peak adrenergic stress. Age was an independent predictor of LAEF during adrenergic stress, even after accounting for gender, LV volumes, and other co-morbidities including inducible ischemia. Conclusions: Age is associated with a decrease in LA function during adrenergic stress, even after adjusting for co-morbidities associated with cardiovascular disease and LV function. (Cardiol J 2012; 19, 1: 45–52)
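The baseline-versus-peak comparison rests on a paired-sample t-test, which can be sketched directly on synthetic LAEF values (in practice one would call a library routine such as scipy's `ttest_rel`; the effect sizes below are made up for illustration):

```python
import numpy as np

def paired_t(before, after):
    """Paired-sample t statistic for the mean within-subject difference."""
    d = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1  # statistic and degrees of freedom

rng = np.random.default_rng(2)
laef_rest = rng.normal(55.0, 5.0, 167)                    # LAEF (%) at baseline
laef_stress = laef_rest - 4.0 + rng.normal(0.0, 2.0, 167) # lower at peak dose

t, dof = paired_t(laef_rest, laef_stress)
```

Pairing the measurements within each participant removes the large between-subject variation in LAEF, which is why the test differences rest and peak-dose values rather than comparing the two group means.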

    CARBON: A Counterfactual Reasoning based Framework for Neural Code Comprehension Debiasing

    Previous studies have demonstrated that code intelligence models are sensitive to program transformations, among which identifier renaming is particularly easy to apply and effective: by simply renaming one identifier in the source code, the models can be made to output completely different results. Prior research generally mitigates the problem by generating more training samples. Such an approach is less than ideal, since its effectiveness depends on the quantity and quality of the generated samples. Unlike these studies, we aim to adjust the models to explicitly distinguish the influence of identifier names on their results, called naming bias in this paper, and thereby make the models robust to identifier renaming. Specifically, we formulate the naming bias with a structural causal model (SCM) and propose a counterfactual reasoning based framework named CARBON for eliminating naming bias in neural code comprehension. CARBON explicitly captures the naming bias through multi-task learning in the training stage and reduces the bias by counterfactual inference in the inference stage. We evaluate CARBON on three neural code comprehension tasks: function naming, defect detection, and code classification. Experimental results show that CARBON achieves relatively better performance than the baseline models on the original benchmark datasets (e.g., +0.5% F1 score on the function naming task) and significant improvement on the datasets with identifiers renamed (e.g., +37.9% F1 score on the function naming task). The proposed framework provides a causal view for improving the robustness of code intelligence models.
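The inference-stage idea can be sketched abstractly: compare the model's prediction on the full input with its prediction under a counterfactual input in which only identifier names carry signal, and subtract the name-only contribution. This is a generic numpy illustration of counterfactual bias subtraction, not CARBON's actual architecture; the logit values and the subtraction weight `alpha` are made up.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical logits over three candidate labels for one code snippet
logits_factual = np.array([2.0, 1.8, 0.2])     # full input: code plus names
logits_names_only = np.array([1.5, 0.1, 0.0])  # counterfactual: names alone
alpha = 1.0                                     # bias-subtraction weight (assumed)

# Counterfactual debiasing: remove the contribution explainable by names alone
debiased = softmax(logits_factual - alpha * logits_names_only)
biased_pred = int(np.argmax(logits_factual))
debiased_pred = int(np.argmax(debiased))
```

Here the factual prediction is driven by the names, and subtracting the name-only logits flips the decision to the class supported by the name-invariant part of the input.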

    On multi-stage production/inventory systems under stochastic demand

    This paper was presented at the 1992 Conference of the International Society of Inventory Research in Budapest as a tribute to Professor Andrew C. Clark for his inspiring work on multi-echelon inventory models in both theory and practice. It reviews and extends the authors' work on periodic-review serial and convergent multi-echelon systems under stochastic stationary demand. In particular, we highlight the structure of the echelon cost functions, which play a central role in the derivation of the decomposition results and of the optimality of base stock policies. The resulting optimal base stock policy is then compared with an MRP system in terms of cost effectiveness, given a predefined target customer service level. Another extension concerns an at first glance rather different problem: it is shown that setting safety lead times in a multi-stage production-to-order system with stochastic lead times leads to decomposition structures similar to those derived for multi-stage inventory systems. Finally, the paper concludes with a discussion of possible extensions to capacitated models, models with uncertainty in both demand and production lead time, and models with an arborescent structure.
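A base stock policy can be illustrated with a minimal single-location simulation, the building block into which the serial systems above decompose: each period, order up to a fixed base-stock level and serve stochastic demand, backordering any shortfall. All parameters and the demand distribution are illustrative; the paper's analysis is exact rather than simulation-based.

```python
import random

def simulate_base_stock(base_stock, periods=10_000, seed=42):
    """Order-up-to (base stock) policy with a one-period replenishment
    lead time; per-period demand is Binomial(8, 0.5), backorders allowed."""
    rng = random.Random(seed)
    on_hand, pipeline = base_stock, 0
    filled = demanded = 0
    for _ in range(periods):
        on_hand += pipeline                     # last period's order arrives
        demand = sum(rng.random() < 0.5 for _ in range(8))
        filled += min(demand, max(on_hand, 0))  # units served immediately
        demanded += demand
        on_hand -= demand                       # may go negative (backorders)
        pipeline = base_stock - on_hand         # order up to the base-stock level
    return filled / demanded                    # realized fill rate

low_fill = simulate_base_stock(4)
high_fill = simulate_base_stock(12)
```

Raising the base-stock level trades inventory holding for service: at a level of 12 (above the maximum one-period demand here) every unit is served immediately, while a level of 4 leaves a noticeable fraction backordered, which is the cost-versus-service trade-off the optimal policy balances.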