
    Income Distribution Effects of Water Quality Controls: An Econometric Approach

    The imposition of water quality controls may affect the economy chiefly by altering aggregate production and changing factor payments. These two effects could not only reallocate resources among production possibilities but also change the distribution of the benefits of production among members of society. This study attempted to provide a workable theory to establish an empirical test of the impacts of water quality controls on family income distribution. It consists of two separate areas: first, to analyze methodologies for measuring income distribution changes, and, second, to develop a theoretical model useful for empirical tests of the impacts of different water quality controls. A number of alternative probability density functions have been proposed as models of personal income distribution. The lognormal, displaced lognormal, gamma, and beta distribution functions were considered appropriate methodologies, since each offers predictive power for income distribution as suggested in the past literature. Detailed information on income distribution can be extracted from the approximations of the distribution functions. One objective of the research was to evaluate the usefulness of the different methodologies. The Gastwirth bounds for the Gini coefficient were used as the goodness-of-fit test; the beta density was clearly superior to the other densities for the SMSA data. Next, a theoretical model was constructed, emphasizing the production sector and the distribution sector. Water quality controls were introduced into the production process as a negative input. Water quality data were collected for all states, and indices of quality were estimated using analysis-of-variance techniques. The equilibrium conditions in commodity and factor markets generated the first impacts of water quality controls on total output and factor payments in the economy. 
A specific assumption was made as a theoretical bridge connecting family income distribution and factor payments in the distribution sector. It was assumed that a family's income equals the total payments received from owned labor and capital in the production process. Thus, changes in factor payments and total output were included in the distribution equations. Water quality controls would, therefore, affect family income distribution through changes in total output and changes in factor payments. The simultaneous-equation regression results for 72 SMSA's were not conclusive. It appeared that the water quality parameter may affect the wage rate and total output, if the parameter was not, in fact, a surrogate for other variables excluded from the system. The effect of wage changes on income distribution was not significant, but changes in total output appeared to be the most significant variable in the distribution equations. In an attempt to account for the many variables which might be expected to affect income distribution, factor analysis was performed on the SMSA's. Two groups of SMSA's were identified and regressions were performed for these groups. Results from these regressions were similar in sign to the results from the 172-observation regressions, although many of the coefficients were not significant. Interpreting the results of the research was somewhat difficult, even though some results did appear consistent across all regressions. There is some evidence to indicate that water quality controls lead to a less equal family income distribution. Better data are required for a more complete and accurate analysis. The principal thrust of the study was to develop a model to organize the complexity of economic causality with respect to income distribution change and water quality policy. It appears that this type of systematic econometric approach can be fruitful in analyzing income distribution change.
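The density-fitting idea above can be sketched numerically: fit a candidate distribution to an income sample and compare the Gini coefficient it implies against the empirical Gini. This is a minimal illustration with synthetic data, not the study's SMSA data, and it uses closed-form Gini expressions for the lognormal and gamma families rather than the Gastwirth grouped-data bounds.

```python
import math
import numpy as np

def empirical_gini(x):
    """Empirical Gini coefficient from a sample (mean-difference form)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    index = np.arange(1, n + 1)
    return (2.0 * np.sum(index * x) / (n * np.sum(x))) - (n + 1.0) / n

def lognormal_gini(sigma):
    """Closed-form Gini of a lognormal: 2*Phi(sigma/sqrt(2)) - 1 = erf(sigma/2)."""
    return math.erf(sigma / 2.0)

def gamma_gini(shape):
    """Closed-form Gini of a gamma distribution with the given shape parameter."""
    return math.gamma(shape + 0.5) / (math.gamma(shape + 1.0) * math.sqrt(math.pi))

# Synthetic "income" sample; a lognormal with sigma = 1 has Gini = erf(0.5) ~ 0.52.
rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=10.0, sigma=1.0, size=100_000)

g_emp = empirical_gini(incomes)   # sample estimate
g_fit = lognormal_gini(1.0)       # Gini implied by the fitted (here: true) sigma
```

A close match between the implied and empirical Gini is evidence that the candidate density fits well, which is the spirit of using Gini bounds as a goodness-of-fit criterion.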

    Multi-Factor Policy Evaluation and Selection in the One-Sample Situation

    Firms nowadays need to make decisions under rapid information obsolescence. In this paper I deal with one class of decision problems in this situation, called “one-sample” problems: we have finitely many options and one sample of the multiple criteria used to evaluate those options. I develop evaluation procedures based on bootstrapping DEA (Data Envelopment Analysis) and related decision-making methods. This paper improves the bootstrap procedure proposed by Simar and Wilson (1998) and shows how to exploit information from bootstrap outputs for decision-making.
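For readers unfamiliar with DEA, the following sketch computes input-oriented CCR efficiency scores by linear programming and then resamples the reference set. The data are hypothetical, and the naive resampling shown here is only to illustrate the mechanics; Simar and Wilson's procedure (which this paper improves) uses a smoothed bootstrap precisely because naive resampling is inconsistent for DEA.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(x, y, o, ref_idx=None):
    """Input-oriented CCR DEA efficiency of DMU o against a reference set.

    x: (n, m) inputs, y: (n, s) outputs. Solves
    min theta s.t. sum_j lam_j x_j <= theta x_o, sum_j lam_j y_j >= y_o, lam >= 0.
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    if ref_idx is None:
        ref_idx = np.arange(x.shape[0])
    xr, yr = x[ref_idx], y[ref_idx]
    n = xr.shape[0]
    c = np.r_[1.0, np.zeros(n)]                      # variables: [theta, lam_1..lam_n]
    A_in = np.c_[-x[o].reshape(-1, 1), xr.T]         # inputs:  sum lam*x - theta*x_o <= 0
    A_out = np.c_[np.zeros((y.shape[1], 1)), -yr.T]  # outputs: -sum lam*y <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(x.shape[1]), -y[o]],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Hypothetical DMUs: one input, one output.
X = np.array([[1.0], [2.0], [4.0]])
Y = np.array([[1.0], [2.0], [2.0]])

theta_best = dea_efficiency(X, Y, 0)   # on the frontier
theta_worst = dea_efficiency(X, Y, 2)  # half the best output/input ratio

# Naive bootstrap of DMU 2's score against resampled reference sets.
rng = np.random.default_rng(1)
boot = np.array([dea_efficiency(X, Y, 2, ref_idx=rng.integers(0, 3, size=3))
                 for _ in range(50)])
```

The spread of the bootstrap scores is what the "one-sample" decision procedures then exploit, e.g. to rank options by efficiency while accounting for sampling variability.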

    Towards Correlated Sequential Rules

    The goal of high-utility sequential pattern mining (HUSPM) is to efficiently discover profitable or useful sequential patterns in a large number of sequences. However, simply being aware of utility-eligible patterns is insufficient for making predictions. To compensate for this deficiency, high-utility sequential rule mining (HUSRM) is designed to explore the confidence or probability of predicting the occurrence of consequent sequential patterns based on the appearance of premise sequential patterns. It has numerous applications, such as product recommendation and weather prediction. However, the existing algorithm, known as HUSRM, is limited to extracting all eligible rules while neglecting the correlation between the generated sequential rules. To address this issue, we propose a novel algorithm called the correlated high-utility sequential rule miner (CoUSR) to integrate the concept of correlation into HUSRM. The proposed algorithm requires not only that each rule be correlated but also that the patterns in the antecedent and consequent of the high-utility sequential rule be correlated. The algorithm adopts a utility-list structure to avoid multiple database scans. Additionally, several pruning strategies are used to improve the algorithm's efficiency and performance. Subsequent experiments on several real-world datasets demonstrate that CoUSR is effective and efficient in terms of operation time and memory consumption. Comment: Preprint. 7 figures, 6 tables
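The core rule-mining quantities can be illustrated on toy data. The sketch below computes support and confidence of a sequential rule (antecedent items occurring in order before consequent items); it is a simplified illustration only and does not model CoUSR's utility weighting, correlation requirement, or utility-list structure.

```python
def rule_support_confidence(sequences, antecedent, consequent):
    """Support and confidence of the sequential rule antecedent -> consequent.

    A sequence matches the rule when the antecedent items occur in order and
    the consequent items occur in order strictly afterwards.
    """
    def occurs_after(seq, items, start):
        # Index just past an in-order occurrence of items, or None if absent.
        pos = start
        for it in items:
            try:
                pos = seq.index(it, pos) + 1
            except ValueError:
                return None
        return pos

    n_ante = n_rule = 0
    for seq in sequences:
        end = occurs_after(seq, antecedent, 0)
        if end is not None:
            n_ante += 1
            if occurs_after(seq, consequent, end) is not None:
                n_rule += 1
    support = n_rule / len(sequences)
    confidence = n_rule / n_ante if n_ante else 0.0
    return support, confidence

# Hypothetical purchase sequences.
db = [["a", "b", "c"], ["a", "c"], ["b", "a"], ["a", "b"]]
sup, conf = rule_support_confidence(db, ["a"], ["c"])  # "a then c"
```

Confidence is exactly the "probability of the consequent given the premise" the abstract refers to; CoUSR additionally filters such rules by utility and correlation.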

    Blast Load Input Estimation of the Medium Girder Bridge Using Inverse Method

    An innovative adaptive weighted-input estimation inverse methodology for estimating the unknown time-varying blast loads on a truss structure system is presented. This method is based on the Kalman filter and the recursive least-squares estimator (RLSE). The filter models the system dynamics in a linear set of state equations. The state equations of the truss structure are constructed using the finite element method. The input blast loads of the truss structure system are inverse-estimated from the system responses measured at two distinct nodes. This work presents an efficient weighting factor applied in the RLSE, which is capable of providing reasonable estimation results. The results obtained from the simulations show that the method is effective in estimating input blast loads and has great stability and precision. Defence Science Journal, 2008, 58(1), pp. 46-56, DOI: http://dx.doi.org/10.14429/dsj.58.162
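The Kalman filter plus RLSE combination described above can be sketched for a scalar system. This is a minimal illustration of the standard input-estimation recursion (filter propagated without the input; RLSE recovering the input from the innovation, with a forgetting/weighting factor gamma), not the paper's finite-element truss model; all system values are hypothetical.

```python
import numpy as np

def estimate_input(z, a, b, q, r, gamma=0.7):
    """Scalar Kalman filter + recursive least-squares input estimator.

    Model: x[k] = a*x[k-1] + b*u[k-1] + w,  z[k] = x[k] + v.
    The innovation of the input-free filter satisfies innov[k] ~ B[k]*u,
    so an RLSE with forgetting factor gamma recovers u.
    """
    x_hat, P = 0.0, 1.0            # state estimate and covariance
    M, Pb, u_hat = 0.0, 1e4, 0.0   # sensitivity, RLSE covariance, input estimate
    u_est = []
    for zk in z:
        # --- Kalman filter, input ignored ---
        x_bar = a * x_hat
        P_bar = a * P * a + q
        s = P_bar + r              # innovation covariance (H = 1)
        K = P_bar / s
        innov = zk - x_bar
        x_hat = x_bar + K * innov
        P = (1.0 - K) * P_bar
        # --- RLSE on the innovation ---
        B = (a * M + 1.0) * b      # sensitivity of the innovation to the input
        M = (1.0 - K) * (a * M + 1.0)
        Kb = (Pb / gamma) * B / (B * (Pb / gamma) * B + s)
        Pb = (1.0 - Kb * B) * Pb / gamma
        u_hat = u_hat + Kb * (innov - B * u_hat)
        u_est.append(u_hat)
    return np.array(u_est)

# Hypothetical discrete system driven by a constant load u = 10.
rng = np.random.default_rng(2)
a, b, u_true = 0.9, 0.1, 10.0
x, z = 0.0, []
for _ in range(300):
    x = a * x + b * u_true + rng.normal(0.0, 1e-4)
    z.append(x + rng.normal(0.0, 1e-3))
u_est = estimate_input(np.array(z), a, b, q=1e-8, r=1e-6, gamma=0.7)
```

A gamma below 1 discounts old innovations, which is what lets the estimator track time-varying loads at the cost of a noisier estimate.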

    Forgetful Large Language Models: Lessons Learned from Using LLMs in Robot Programming

    Large language models offer new ways of empowering people to program robot applications, namely, code generation via prompting. However, the code generated by LLMs is susceptible to errors. This work reports a preliminary exploration that empirically characterizes common errors produced by LLMs in robot programming. We categorize these errors into two phases: interpretation and execution. In this work, we focus on errors in execution and observe that they are caused by LLMs being "forgetful" of key information provided in user prompts. Based on this observation, we propose prompt engineering tactics designed to reduce errors in execution. We then demonstrate the effectiveness of these tactics with three language models: ChatGPT, Bard, and LLaMA-2. Finally, we discuss lessons learned from using LLMs in robot programming and call for the benchmarking of LLM-powered end-user development of robot applications. Comment: 9 pages, 8 figures, accepted by the AAAI 2023 Fall Symposium Series

    Determination of Moving Tank and Missile Impact Forces on a Bridge Structure

    A method to determine the moving tank and missile impact forces on a bridge is developed. The present method is an online adaptive recursive inverse algorithm, composed of the Kalman filter and the recursive least-squares estimator (RLSE), to estimate the force inputs on the bridge structure. The state equations of the bridge structure were constructed using the modal superposition and orthogonality technique. By adopting this inverse method, the moving tank and missile impact force inputs acting on the bridge structure system can be estimated from the measured dynamic responses. In addition, this work presents an efficient weighting factor applied in the RLSE, which is capable of providing reasonable estimation results. The results obtained from the simulations show that the method is effective in determining the moving tank and missile impact forces and that acceptable results can be obtained. Defence Science Journal, 2008, 58(6), pp. 752-761, DOI: http://dx.doi.org/10.14429/dsj.58.170

    Maximizing Friend-Making Likelihood for Social Activity Organization

    The social presence theory in social psychology suggests that computer-mediated online interactions are inferior to face-to-face, in-person interactions. In this paper, we consider the scenario of organizing in-person friend-making social activities via online social networks (OSNs) and formulate a new research problem, namely, Hop-bounded Maximum Group Friending (HMGF), by modeling both existing friendships and the likelihood of making new friends. To find a set of attendees for socialization activities, HMGF is unique and challenging due to the interplay of the group size, the constraint on existing friendships, and the objective function on the likelihood of friend making. We prove that HMGF is NP-hard and that no approximation algorithm exists unless P = NP. We then propose an error-bounded approximation algorithm to efficiently obtain solutions very close to the optimal ones. We conduct a user study to validate our problem formulation and perform extensive experiments on real datasets to demonstrate the efficiency and effectiveness of our proposed algorithm.

    Probing triple-Higgs productions via 4b2γ decay channel at a 100 TeV hadron collider

    The quartic self-coupling of the Standard Model Higgs boson can only be measured by observing the triple-Higgs production process, but this is challenging for the Large Hadron Collider (LHC) Run 2 or an International Linear Collider (ILC) at a few TeV because of its extremely small production rate. In this paper, we present a detailed Monte Carlo simulation study of triple-Higgs production through gluon fusion at a 100 TeV hadron collider and explore the feasibility of observing this production mode. We focus on the decay channel HHH → bb̄bb̄γγ, investigating detector effects and optimizing the kinematic cuts to discriminate the signal from the backgrounds. Our study shows that, in order to observe the Standard Model triple-Higgs signal, the integrated luminosity of a 100 TeV hadron collider should be greater than 1.8×10⁴ ab⁻¹. We also explore the dependence of the cross section upon the trilinear (λ₃) and quartic (λ₄) self-couplings of the Higgs. We find that, through a search in triple-Higgs production, the parameters λ₃ and λ₄ can be restricted to the ranges [−1, 5] and [−20, 30], respectively. We also examine how new physics can change the production rate of triple-Higgs events. For example, in the singlet extension of the Standard Model, we find that the triple-Higgs production rate can be increased by a factor of O(10). Comment: 33 pages, 11 figures, added references, corrected typos, improved text, affiliation is changed. This is the publication version