A simple statistical method for measuring how life events affect happiness
Background Life events—like illness, marriage, or unemployment—have important effects on people. But there is no accepted way to measure the relative impact of such events on human happiness and psychological health. Economists have recently developed a method that does so using happiness regression equations.
Methods We estimate happiness regressions using large random samples of individuals. The relative coefficients of income and life events on happiness allow us to calculate a monetary ‘compensating amount’ for each kind of life event.
Results The paper calculates the impact of different life events upon human well-being. Getting married, for instance, is calculated to bring the same amount of happiness each year, on average, as an extra £70 000 of annual income. The psychological costs of losing a job greatly exceed those from the pure drop in income. Health is hugely important to happiness. Widowhood brings a degree of unhappiness that would take, on average, an extra £170 000 per annum to offset. Well-being regressions also allow us to assess one of the oldest conjectures in social science—that well-being depends not just on absolute circumstances but inherently on comparisons with other people. We find evidence for such comparison effects.
Conclusion We believe that the new statistical method has many applications. In principle, it can be used to value any kind of event in life.
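The method described above can be sketched in code. This is a minimal illustration, not the paper's actual estimation: the data are synthetic, the coefficients and the £30 000 reference income are invented for the example, and the compensating amount is derived from the standard log-income approximation (extra income dY solving b_income · dY/Y = b_event, so dY = Y · b_event / b_income).

```python
# Hedged sketch of a compensating-amount calculation from a happiness
# regression. All data and coefficients here are synthetic/illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

log_income = rng.normal(10.5, 0.5, n)            # log annual income
married = rng.integers(0, 2, n).astype(float)    # life-event dummies
unemployed = rng.integers(0, 2, n).astype(float)

# Synthetic "true" model: happiness rises with log income and marriage,
# falls with unemployment (coefficients chosen purely for illustration).
happiness = (0.6 * log_income + 0.4 * married - 0.8 * unemployed
             + rng.normal(0.0, 1.0, n))

# OLS happiness regression: happiness ~ const + log_income + married + unemployed
X = np.column_stack([np.ones(n), log_income, married, unemployed])
beta, *_ = np.linalg.lstsq(X, happiness, rcond=None)
b_inc, b_mar, b_unemp = beta[1], beta[2], beta[3]

# Compensating amount: extra income dY that raises happiness as much as the
# event does. With log income, dY = Y * b_event / b_income at income Y.
Y = 30_000  # illustrative reference annual income
comp_married = Y * b_mar / b_inc
comp_unemployed = Y * (-b_unemp) / b_inc
print(f"marriage worth roughly £{comp_married:,.0f} per year at £{Y:,} income")
print(f"job loss costs roughly £{comp_unemployed:,.0f} per year at £{Y:,} income")
```

The key point is that only the *ratio* of coefficients matters: the event coefficient is converted into money by dividing by the income coefficient.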
The Charge Transfer Efficiency and Calibration of WFPC2
A new determination of WFPC2 photometric corrections is presented, using HSTphot reduction of the WFPC2 Omega Centauri and NGC 2419 observations from January 1994 through March 2000 and a comparison with ground-based photometry. No evidence is seen for any position-independent photometric offsets (the "long-short anomaly"); all systematic errors appear to be corrected with the CTE and zero point solution. The CTE loss time dependence is determined to be very significant in the Y direction, causing time-independent CTE solutions (Stetson 1998; Saha, Lambert, & Prosser 2000) to be valid only for a small range of times. On average, the present solution produces corrections similar to Whitmore, Heyer, & Casertano (1999), although with an improved functional form that produces less scatter in the residuals and determined with roughly a year of additional data. In addition to the CTE loss characterization, zero point corrections are also determined as functions of chip, gain, filter, and temperature. Of interest, there are chip-to-chip differences of order 0.01-0.02 magnitudes relative to the Holtzman et al. (1995) calibrations, and the present study provides empirical zero point determinations for the non-standard filters such as the frequently used F450W, F606W, and F702W.
Comment: 30 pages, 10 figures. Accepted for publication in October 2000 PAS
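The qualitative behavior the abstract describes—charge-transfer losses that grow with elapsed time and with the number of Y-direction parallel transfers—can be sketched as a toy correction function. The functional form and every coefficient below are illustrative placeholders, not the actual solution derived in this study.

```python
# Hedged sketch of a position- and time-dependent CTE magnitude correction.
# Coefficients a, b and the reference date are invented for illustration.
def cte_correction_mag(y_pixel, mjd, mjd_ref=49400.0,
                       a=0.02, b=1.0e-5, n_rows=800):
    """Toy magnitude correction for Y-direction charge-transfer losses.

    y_pixel -- row of the source on the chip (0..n_rows); more rows to
               traverse means more parallel transfers and more charge lost.
    mjd     -- observation date (Modified Julian Date); losses worsen as
               radiation damage accumulates, so the correction grows with
               elapsed time since mjd_ref.
    """
    elapsed = mjd - mjd_ref
    return (a + b * elapsed) * (y_pixel / n_rows)

# A star high on the chip, observed late in the baseline, needs a larger
# brightening correction than one near the readout register observed early.
late_high = cte_correction_mag(y_pixel=700, mjd=51600.0)
early_low = cte_correction_mag(y_pixel=100, mjd=49500.0)
```

This captures why time-independent solutions are valid only over a small range of dates: the same star at the same chip position needs a different correction at different epochs.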
What is Probable Cause, and Why Should We Care?: The Costs, Benefits, and Meaning of Individualized Suspicion
Taslitz defines probable cause as having four components: one quantitative, one qualitative, one temporal, and one moral. He focuses on the last of these components. Individualized suspicion, the US Supreme Court has suggested, is perhaps the most important of the four components of probable cause. That is a position with which he heartily agrees. The other three components each play only a supporting role. But individualized suspicion is the beating heart that gives probable cause its vitality.
Foreword: The Political Geography of Race Data in the Criminal Justice System
Several months ago, there was a heated discussion on CrimProf, the listserv for criminal law professors, about the disproportionate representation of minorities in the criminal justice system. Few participants in this online discussion contested the reality that racial and ethnic minorities, especially African Americans, make up a far larger percentage of those arrested and incarcerated than should be expected from their percentage of the country's total population.