Globalization, Worker Insecurity, and Policy Approaches
[Excerpt] Today’s global economy, or what many call globalization, has a growing impact on the economic futures of American companies, workers, and families. Increasing integration with the world economy makes the U.S. and other economies more productive. For most Americans, this has translated into absolute increases in living standards and real disposable incomes. However, while the U.S. economy as a whole benefits from globalization, it is not always a win-win situation for all Americans. Rising trade with low-wage developing countries not only increases concerns of job loss, but it also leads U.S. workers to fear that employers will lower their wages and benefits in order to compete. Globalization facilitated by the information technology revolution expands international trade in a wider range of services, but also subjects an increasing number of U.S. white collar jobs to outsourcing and international competition. Also, globalization may benefit some groups more than others, leading some to wonder whether the global economy is structured to help the few or the many.
The current wave of globalization is supported by three broad trends. The first is technology, which has sharply reduced the communication and transportation costs that previously divided markets. The second is a dramatic increase in the world supply of labor engaged in international trade. The third is government policies that have reduced barriers to trade and investment. Whether these trends are creating new vulnerabilities for workers is the subject of increasing research and debate.
Trade and the Americas
CRS Report: CRSTradeAmericas12Sept03.pdf
Comparative Experiments on Disambiguating Word Senses: An Illustration of the Role of Bias in Machine Learning
This paper describes an experimental comparison of seven different learning algorithms on the problem of learning to disambiguate the meaning of a word from context. The algorithms tested include statistical, neural-network, decision-tree, rule-based, and case-based classification techniques. The specific problem tested involves disambiguating six senses of the word "line" using the words in the current and preceding sentence as context. The statistical and neural-network methods perform the best on this particular problem, and we discuss a potential reason for this observed difference. We also discuss the role of bias in machine learning and its importance in explaining performance differences observed on specific problems.
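The comparison described above can be sketched with modern tooling. This is a minimal illustration only: the paper's "line" corpus, feature set, and exact algorithms are not reproduced here, and the tiny synthetic dataset, the choice of scikit-learn estimators as stand-ins for each algorithm family, and all names below are assumptions for demonstration.

```python
# Hedged sketch: one classifier per algorithm family on a toy
# word-sense-disambiguation task. The data is invented, not the
# paper's "line" corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB          # statistical
from sklearn.tree import DecisionTreeClassifier        # decision-tree
from sklearn.neighbors import KNeighborsClassifier     # case-based
from sklearn.neural_network import MLPClassifier       # neural-network

# Toy contexts for two senses of "line": telephone line vs. product line.
contexts = [
    "the phone line was busy all day",
    "a bad connection on the line",
    "static crackled over the line",
    "the new product line sells well",
    "the company launched a clothing line",
    "their budget line of laptops expanded",
]
senses = ["phone", "phone", "phone", "product", "product", "product"]

# Bag-of-words features over the surrounding context words.
X = CountVectorizer().fit_transform(contexts)

classifiers = {
    "naive_bayes": MultinomialNB(),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "nearest_neighbor": KNeighborsClassifier(n_neighbors=1),
    "neural_net": MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                random_state=0),
}

for name, clf in classifiers.items():
    clf.fit(X, senses)
    acc = clf.score(X, senses)  # training accuracy on the toy data
    print(f"{name}: {acc:.2f}")
```

On real data one would of course report held-out (cross-validated) accuracy rather than training accuracy; the point here is only the shape of a multi-algorithm comparison over shared bag-of-words features.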
Minimalist Solution to Williamson County
Williamson County Regional Planning Commission v. Hamilton Bank of Johnson County relegated Fifth Amendment takings claims to a second class of federal rights. Before a takings plaintiff can sue in federal court, she must first seek compensation through an "adequate state procedure." Many federal courts have held that requirement to mean a takings litigant must first seek compensation through state courts if that state provides an inverse condemnation proceeding. However, if a takings litigant sues in state court, she will be unable to sue in federal court because of issue preclusion. This effectively shuts the federal courthouse door to many property owners. Only two Supreme Court justices have shown any interest in revisiting Williamson County. Thus, land use attorneys who are concerned about federal court access for takings plaintiffs should craft a case that would attract the Supreme Court's attention. This Article argues that land use lawyers should present the Court with a case in which the property owner has used a non-judicial procedure to seek compensation (such as asking for compensation from a county board). The Court could then rule that such a non-judicial procedure is an "adequate state procedure" that satisfies Williamson County's requirements. This ruling would minimize the negative effects that Williamson County has wrought on takings plaintiffs.
Estimating the Distribution of Dietary Consumption Patterns
In the United States the preferred method of obtaining dietary intake data is the 24-hour dietary recall, yet the measure of most interest is usual or long-term average daily intake, which is impossible to measure directly. Thus, usual dietary intake is assessed with considerable measurement error. We were interested in estimating the population distribution of the Healthy Eating Index-2005 (HEI-2005), a multi-component dietary quality index involving ratios of interrelated dietary components to energy, among children aged 2-8 in the United States, using a national survey and incorporating survey weights. We developed a highly nonlinear, multivariate zero-inflated data model with measurement error to address this question. Standard nonlinear mixed model software such as SAS NLMIXED cannot handle this problem. We found that taking a Bayesian approach and using MCMC resolved the computational issues and enabled us to provide a realistic distribution estimate for the HEI-2005 total score. While our computation and thinking in solving this problem were Bayesian, we relied on the well-known close relationship between Bayesian posterior means and maximum likelihood, the latter not being computationally feasible, and thus were able to develop standard errors using balanced repeated replication, a survey-sampling approach.
Published in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org), DOI: 10.1214/12-STS413.
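The balanced repeated replication (BRR) step mentioned in the abstract can be sketched in isolation. This is a minimal illustration under assumptions not taken from the paper: a toy design with two sampled PSUs per stratum, invented data and weights, and a 4x4 Hadamard matrix to define the balanced half-samples; the paper's actual estimator (the HEI-2005 model) is replaced by a simple weighted mean.

```python
# Hedged sketch of balanced repeated replication (BRR) variance
# estimation. Design, data, and weights below are invented for
# illustration; the estimator is a plain weighted mean, not the
# paper's HEI-2005 model.
import numpy as np

rng = np.random.default_rng(0)
H = 4                                # strata, each with 2 sampled PSUs
y = rng.normal(50, 10, size=(H, 2))  # one toy observation per PSU
w = np.ones((H, 2))                  # base survey weights (toy: all 1)

# A 4x4 Hadamard matrix defines the balanced half-samples:
# +1 selects PSU 0 in that stratum, -1 selects PSU 1.
Hmat = np.array([[ 1,  1,  1,  1],
                 [ 1, -1,  1, -1],
                 [ 1,  1, -1, -1],
                 [ 1, -1, -1,  1]])

def weighted_mean(yv, wv):
    return np.sum(yv * wv) / np.sum(wv)

theta_full = weighted_mean(y, w)  # full-sample estimate

# Each replicate doubles the weight of the selected half-sample PSU
# in every stratum, zeroes the other, and recomputes the estimate.
reps = []
for r in range(Hmat.shape[0]):
    wr = w.copy()
    for h in range(H):
        keep = 0 if Hmat[r, h] == 1 else 1
        wr[h, keep] *= 2.0
        wr[h, 1 - keep] = 0.0
    reps.append(weighted_mean(y, wr))

# BRR variance: average squared deviation of the replicate estimates
# from the full-sample estimate.
var_brr = np.mean((np.array(reps) - theta_full) ** 2)
se_brr = float(np.sqrt(var_brr))
print(f"estimate = {theta_full:.2f}, BRR s.e. = {se_brr:.2f}")
```

The appeal of the approach, as the abstract notes, is that the replication machinery only needs the point estimator to be recomputable under perturbed weights, so it pairs naturally with estimates whose likelihood-based standard errors are intractable.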