
    Fractional reserve banking: The genesis of macro-instability

    This thesis is a theoretical criticism of fractional reserve banking. The Austrian school (especially the works of Mises, Hayek and Rothbard) considers credit expansion to be the genesis of macroeconomic instability. This study applies Austrian interest, capital and business cycle theory to challenge the widely held assumption that our present economic system is fundamentally sound. Using a theoretical construct known as the Angel Gabriel Model, the devastating consequences of continuing credit expansion are explored. Fractional reserve free banking is also considered within this model and found to be an unsatisfactory option when compared to 100 percent reserve banking. In conclusion, policy proposals and system transition are considered in the event of a general economic collapse.

    Rethinking the Role of Information in Chemicals Policy: Implications for TSCA and REACH

    This article analyses the role of different kinds of information for minimizing or eliminating the risks due to the production, use, and disposal of chemical substances, and contrasts it with present and planned (informational) regulation in the United States and the European Union, respectively. Some commentators who are disillusioned with regulatory approaches have argued that informational tools should supplant mandatory regulatory measures unflatteringly described as "command and control." Critics of this reformist view are concerned with the lack of technology-innovation forcing that results from informational policies alone. We argue that informational tools can be made more technology inducing, and thus more oriented towards environmental innovations, than they are under current practices, with or without complementary regulatory mechanisms, although a combination of approaches may yield the best results. The conventional approach to chemicals policy envisions a sequential process that includes three steps: (1) producing or collecting risk-relevant information, (2) performing a risk assessment or characterization, followed by (3) risk management practices, often driven by regulation. We argue that such a sequential process is too static, or linear, and spends too many resources searching for or generating information about present hazards, compared with searching for and generating information related to safer alternatives, which include input substitution, final product reformulation, and/or process changes. These pollution prevention or cleaner technology approaches are generally acknowledged to be superior to pollution control. We argue that the production of risk information necessary for risk assessment, on the one hand, and the search for safer alternatives, on the other hand, should be approached simultaneously in two parallel quests. Overcoming deficits in hazard-related information and knowledge about risk reduction alternatives must take place in a more synchronized manner than is currently being practiced. This parallel approach blurs the alleged bright line between risk assessment and risk management, but reflects more closely how regulatory agencies actually approach the regulation of chemicals. These theoretical considerations are interpreted in the context of existing and planned informational tools in the United States and the European Union, respectively. The current political debate in the European Union concerned with reforming chemicals policy and implementing the REACH (Registration, Evaluation and Authorization of Chemicals) system is focused on improving the production and assessment of risk information with regard to existing chemicals, although it also contains some interesting risk management elements. To some extent, REACH mirrors the approach taken in the United States under the Toxic Substances Control Act (TSCA) of 1976. TSCA turned out not to be effectively implemented and provides lessons that should be relevant to REACH. In this context, we discuss the opportunities and limits of existing and planned informational tools for achieving risk reduction.

    Development and verification of an analytical algorithm to predict absorbed dose distributions in ocular proton therapy using Monte Carlo simulations

    Proton beam radiotherapy is an effective and non-invasive treatment for uveal melanoma. Recent research efforts have focused on improving the dosimetric accuracy of treatment planning and overcoming the present limitation of relative analytical dose calculations. Monte Carlo algorithms have been shown to accurately predict dose per monitor unit (D/MU) values, but this has yet to be shown for analytical algorithms dedicated to ocular proton therapy, which are typically less computationally expensive than Monte Carlo algorithms. The objective of this study was to determine if an analytical method could predict absolute dose distributions and D/MU values for a variety of treatment fields like those used in ocular proton therapy. To accomplish this objective, we used a previously validated Monte Carlo model of an ocular nozzle to develop an analytical algorithm to predict three-dimensional distributions of D/MU values from pristine Bragg peaks and therapeutically useful spread-out Bragg peaks (SOBPs). Results demonstrated generally good agreement between the analytical and Monte Carlo absolute dose calculations. While agreement in the proximal region decreased for beams with less penetrating Bragg peaks compared with the open-beam condition, the difference was shown to be largely attributable to edge-scattered protons. A method for including this effect in any future analytical algorithm was proposed. Comparisons of D/MU values showed typical agreement to within 0.5%. We conclude that analytical algorithms can be employed to accurately predict absolute proton dose distributions delivered by an ocular nozzle. © 2010 Institute of Physics and Engineering in Medicine.
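    The abstract above mentions building therapeutically useful spread-out Bragg peaks (SOBPs) from pristine Bragg peaks. The core superposition idea can be sketched as follows; this is a toy illustration only — the depth-dose shape, weights, and parameter names below are invented for demonstration and are not the authors' validated analytical algorithm or a physical beam model.

    ```python
    import numpy as np

    def pristine_peak(depth, range_cm, width=0.3):
        # Toy pristine Bragg peak: a low entrance-dose term plus a
        # Gaussian "peak" near the end of range (illustrative only).
        entrance = 0.3 * np.clip(1.0 - (depth / range_cm) ** 2, 0.0, None)
        peak = np.exp(-0.5 * ((depth - range_cm) / width) ** 2)
        return entrance + peak

    depths = np.linspace(0.0, 4.0, 401)        # depth in water, cm
    ranges = np.linspace(2.0, 3.0, 11)         # pulled-back peak positions, cm
    weights = np.linspace(0.4, 1.0, 11) ** 2   # heuristic weights, deepest peak heaviest

    # SOBP = weighted superposition of range-shifted pristine peaks
    sobp = sum(w * pristine_peak(depths, r) for w, r in zip(weights, ranges))
    sobp /= sobp.max()                         # normalize to unit peak dose
    ```

    A real planning algorithm would instead solve for the weights that flatten the plateau across the target and calibrate the result to absolute D/MU values, as the study does against its Monte Carlo model.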

    The use and effectiveness of the eLib subject gateways: a preliminary investigation

    Internet subject gateways were set up under the Electronic Libraries Programme (eLib) in order to address some of the problems of searching the Internet that have been identified by information professionals, i.e. locating relevant, good-quality information. This preliminary study examines the extent to which academics in two universities use three eLib subject gateways (EEVL, OMNI and SOSIG). The results are generally encouraging for the eLib programme, but it is necessary for the gateways to be more effectively promoted. The study also found that academics do not have the same misgivings about the general search engines as the information professionals and seem to use them more readily than the gateways.

    Design and development of a slurry spinner

    Call number: LD2668 .T4 1978 K62
    Master of Science

    Does Vitamin D Deficiency Cause Hypertension? Current Evidence from Clinical Studies and Potential Mechanisms

    Vitamin D deficiency is widely prevalent across all ages, races, geographical regions, and socioeconomic strata. In addition to its important role in skeletal development and calcium homeostasis, several recent studies suggest its association with diabetes, hypertension, cardiovascular disease, certain types of malignancy, and immunologic dysfunction. Here, we review the current evidence regarding an association between vitamin D deficiency and hypertension in clinical and epidemiological studies. We also look into plausible biological explanations for such an association, with the renin-angiotensin-aldosterone system and insulin resistance playing potential roles. Taken together, it appears that more studies in more homogeneous study populations are needed before a firm conclusion can be reached as to whether vitamin D deficiency causes or aggravates hypertension and whether vitamin D supplementation is safe and exerts cardioprotective effects. The potential problems with bias and confounding factors present in previous epidemiological studies may be overcome or minimized by well-designed randomized controlled trials in the future.