FISHING BEHAVIOR AND THE LENGTH OF THE FISHING SEASON
The basic hypothesis of this paper is that the amount of fishing that a fish harvester undertakes during a year is not determined entirely by circumstances which are exogenous to the fisher, such as weather conditions and resource availability, but is also partially a matter of individual choice. The paper develops a behavioral model of fishing from the perspective that the decision to modify the period of time over which fishing takes place is governed by a comparison of the marginal benefits and costs of doing so. The model is tested econometrically as an error-components model using a 10% longitudinal sample of recipients of seasonal fishermen's unemployment insurance benefits in Newfoundland over the period 1971-93. The results suggest that the Canadian unemployment insurance program has reduced the length of the fishing season in Newfoundland by about 8-10 weeks.
Employment Prospects in a Commercially Viable Newfoundland Fishery: An Application of 'An Econometric Model of the Newfoundland Groundfishery'
An econometric model is utilized to simulate the effects of a policy change in which government financial assistance to a major Canadian marine fishery is withdrawn and the industry is placed on a commercially viable basis. Under near-ideal conditions of marketing and harvesting, harvesting employment would fall drastically, from approximately thirty thousand fishermen under the current regime to approximately six thousand. There would be a concomitant fall in seasonal fish plant employment, and a severe fall in those federal transfer payments (e.g., unemployment insurance) which are currently generated by extensive seasonal employment in both harvesting and processing sectors of the fishery. The policy analysis consists of simulations with a prototype econometric model which integrates the demand, processing, and harvesting sectors of the fishery. The essential components of the 1,000-equation model are described.
Heavy social drinkers score higher on implicit wanting and liking for alcohol than alcohol-dependent patients and light social drinkers
Background and objectives: Automatic hedonic ("liking") and incentive ("wanting") processes are assumed to play an important role in addiction. Whereas some neurobiological theories suggest that these processes become dissociated when drug use develops into an addiction (i.e. "liking" becomes weaker, whereas "wanting" becomes exaggerated; e.g. Robinson & Berridge, 1993), other theories suggest that there is a linear relationship between the two (i.e. both "liking" and "wanting" increase equally; e.g. Koob & Le Moal, 1997). Our aim was to examine "wanting" and "liking" in three groups of participants: alcohol-dependent patients, heavy social drinkers, and light social drinkers. Methods: Participants performed two different single-target implicit association tests (ST-IATs; e.g. Bluemke & Friese, 2007) and explicit ratings designed to measure "liking" and "wanting" for alcohol. Results: Our results are in sharp contrast with the theories of both Robinson and Berridge and Koob and Le Moal: heavy drinkers had higher scores than light drinkers and alcohol-dependent patients on both the wanting ST-IAT and the liking ST-IAT. There were no differences between alcohol-dependent patients and light drinkers. Explicit ratings mirrored these results. Limitations: These findings suggest that our ST-IATs are not valid measures of "wanting" and "liking". Instead, they might assess more complex knowledge regarding participants' experiences and goals. Conclusions: These findings suggest that the relationship between drug consumption and appetitive drug associations is not linear, highlighting the importance of testing both sub-clinical and clinical samples in future research.
Evaluation of the cancer risk from PAHs by inhalation: Are current methods fit for purpose?
There is ample evidence from occupational studies that exposure to mixtures of polycyclic aromatic hydrocarbons (PAHs) is causally associated with an increased incidence of lung cancer. In both occupational atmospheres and ambient air, PAHs are present as a mixture of many compounds, but the composition of the mixture in ambient air differs from that in the occupational atmosphere and varies in time and space. Estimates of cancer risk for PAH mixtures are based upon unit risks derived by extrapolation from occupational exposure data or animal model data; in the case of the WHO, one compound, benzo[a]pyrene, is used as a marker for the entire mixture, irrespective of composition. The U.S. EPA has used an animal exposure study to derive a unit risk for inhalation exposure to benzo[a]pyrene alone, and there have been a number of rankings of relative carcinogenic potency for other PAHs, which many studies have used to calculate a cancer risk from the PAH mixture, frequently incorrectly, by adding the estimated relative risks of individual compounds and applying the total "B[a]P equivalent" to the WHO unit risk, which already applies to the entire mixture. Such studies are often based upon data solely for the historic U.S. EPA group of 16 compounds, which does not include many of the apparently more potent carcinogens. There are no data on the human cancer risk of individual PAHs, and there is conflicting evidence on the additivity of PAH carcinogenicity in mixtures. This paper finds large divergences between risk estimates derived from the WHO and U.S. EPA methods, as well as considerable sensitivity to the mixture composition and the assumed PAH relative potencies. Of the two methods, the WHO approach appears more likely to provide reliable risk estimates, but recently proposed mixture-based approaches using in vitro toxicity data may offer some advantages.
A Stochastic Production Frontier Model of the Newfoundland Snow Crab Fishery
Since the collapse of the Newfoundland groundfishery in 1992, the snow crab fishery has become Newfoundland's largest fishery, accounting for approximately half the value of total landings. This study uses trip log data to estimate the production frontier and the technical efficiency of this fishery using a Stochastic Frontier Analysis (SFA) methodology. The analysis is based on over 11,000 observations taken over a five-year period. The technical efficiency of the fishery is estimated to be at a level of fifty percent or less.
Keywords: Theoretical and Empirical Bio-Economic Modelling, Newfoundland snow crab, Stochastic Frontier Analysis, production frontier, Fisheries Economics, technical efficiency
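The SFA methodology described above estimates a frontier with a composed error (noise plus one-sided inefficiency) by maximum likelihood. As a rough illustration of the frontier idea only, a deterministic corrected-OLS (COLS) sketch can be run on made-up trip-log numbers; all data below are hypothetical, and COLS is a simplification, not the paper's estimator:

```python
import math

# Hypothetical trip logs: (trap hauls, landings in kg). Illustrative only.
trips = [(100, 900), (150, 1200), (200, 1900), (250, 2000), (300, 2900), (350, 2800)]

x = [math.log(h) for h, _ in trips]  # log input (effort)
y = [math.log(q) for _, q in trips]  # log output (landings)

# OLS fit of a Cobb-Douglas frontier in logs: ln(q) = a + b*ln(hauls)
n = len(x)
mx, my = sum(x) / n, sum(y) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
a = my - b * mx

# COLS: shift the fitted line up until it envelopes every observation,
# then read technical efficiency as exp(residual - max residual) <= 1.
resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
shift = max(resid)
efficiency = [math.exp(r - shift) for r in resid]
```

The best-performing trip defines the frontier (efficiency 1.0); every other trip's efficiency is its distance below that envelope, which is the same "output achieved relative to the frontier" interpretation the study's SFA scores carry.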
The Fishery as an Economic Base Industry after the Newfoundland Cod Moratorium
In a 2009 paper in Land Economics, Roy, Arnason and Schrank used a newly developed methodology based on cointegration analysis to establish and measure the role of the fishing industry as an economic base industry (and the only such base industry) for the Canadian province of Newfoundland over the period 1961-1994. Since that period, the groundfish harvesting sector has collapsed, although it has been replaced by a crustacean fishery that provides similar value added but is considerably less labor-intensive. At the same time, valuable petroleum deposits have been developed offshore, with considerable consequent economic activity, perhaps to the extent of establishing a new economic base industry. This study uses the same methodology as Roy, Arnason and Schrank, but documents the impact of both the major structural shifts within the fishing industry and the development of competing economic base sectors in petroleum extraction and its derivatives.
Factors Influencing the Decisions of Women Small Business Owners on Hiring People with Disabilities
Despite the passage of disability rights legislation in the United States, individuals with disabilities continue to experience higher unemployment and underemployment rates than their counterparts without disabilities. The purpose of the study was to examine the attitudes of women small business owners towards hiring individuals with disabilities and to determine what factors influence their hiring decisions. A total of 80 women small business owners in a southwestern U.S. state took part in the study. The Employer Attitudes Questionnaire (EAQ) and the Marlowe-Crowne Social Desirability Scale Short Form-C (M-C Form C) were used to assess participants' attitudes. The results of a sequential multiple regression analysis indicated that the independent variables as a whole contributed 7.2% to the variance in EAQ scores. EAQ scores were weakly correlated with scores on the M-C Form C (r = .276, p = .013). Working facilitates the development of a sense of self-worth, self-sufficiency, self-efficacy, and social networks. The bearing of unemployment and underemployment on the quality of life of individuals with disabilities cannot be overstated. Women-led businesses offer a number of advantages for employees with disabilities, including resilience to economic downturns, lower employee retrenchment rates, and a better understanding of employment and anti-discrimination legislation.
Comparison of machine learning approaches with a general linear model to predict personal exposure to benzene
Machine learning techniques (MLTs)
offer great power in analyzing
complex data sets and have not previously been applied to non-occupational
pollutant exposure. MLT models that can predict personal exposure
to benzene have been developed and compared with a standard model
using a linear regression approach (GLM). The models were tested against
independent data sets obtained from three personal exposure measurement
campaigns. A correlation-based feature subset (CFS) selection algorithm
identified a reduced attribute set, with common attributes grouped
under the use of paints in homes, upholstery materials, space heating,
and environmental tobacco smoke as the attributes suitable to predict
the personal exposure to benzene. Personal exposure was categorized
as low, medium, and high, and for big data sets, both the GLM and
MLTs show high variability in performance to correctly classify greater
than 90 percentile concentrations, but the MLT models have a higher
score when accounting for divergence of incorrectly classified cases.
Overall, the MLTs perform at least as well as the GLM and avoid the
need to input microenvironment concentrations
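The CFS step mentioned above scores a candidate attribute subset by rewarding correlation with the target while penalising redundancy among the attributes themselves (merit = k·r_cf / sqrt(k + k(k-1)·r_ff)). A minimal forward-selection sketch of that idea, on invented data where the attribute names ("paint_use", "smoking", "noise") and all values are hypothetical and not from the study:

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    if sx == 0 or sy == 0:
        return 0.0
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def cfs_merit(subset, features, target):
    """CFS merit = k*r_cf / sqrt(k + k(k-1)*r_ff): high mean feature-target
    correlation (r_cf) is good, high mean feature-feature correlation (r_ff) is bad."""
    k = len(subset)
    r_cf = sum(abs(pearson(features[f], target)) for f in subset) / k
    if k == 1:
        return r_cf
    pairs = [(f, g) for i, f in enumerate(subset) for g in subset[i + 1:]]
    r_ff = sum(abs(pearson(features[f], features[g])) for f, g in pairs) / len(pairs)
    return k * r_cf / math.sqrt(k + k * (k - 1) * r_ff)

def greedy_cfs(features, target):
    """Forward selection: repeatedly add the attribute that most improves
    the merit; stop when no addition helps."""
    selected, best = [], 0.0
    while len(selected) < len(features):
        merit, f = max((cfs_merit(selected + [f], features, target), f)
                       for f in features if f not in selected)
        if merit <= best:
            break
        selected.append(f)
        best = merit
    return selected

# Toy, made-up data: exposure is driven by paint use and smoking,
# while "noise" is an irrelevant attribute.
features = {
    "paint_use": [1, 2, 3, 4, 5],
    "smoking":   [0, 1, 0, 1, 0],
    "noise":     [5, 3, 4, 2, 6],
}
benzene = [1, 4, 3, 6, 5]
print(greedy_cfs(features, benzene))  # keeps the two informative attributes
```

The irrelevant attribute is dropped because adding it raises r_ff more than it raises r_cf, lowering the merit; that redundancy penalty is what distinguishes CFS from simply ranking attributes by individual correlation.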
Chiasma
Newspaper reporting on events at the Boston University School of Medicine in the 1960s