    Phosphorus recovery: a need for an integrated approach

    The increasing cost of phosphate fertilizer, a scarcity of high-quality phosphate rock (PR) and increasing surface water pollution are driving a need to accelerate the recovery and re-use of phosphorus (P) from various waste sectors. Options to recover P occur all along the open P cycle, from mining to households to oceans. However, P recovery as a regional and global strategy towards P sustainability and future food, bioenergy and water security is in its infancy because of a number of technological, socio-economic and institutional constraints. There is no single solution, and resolving these constraints requires concerted collaboration between relevant stakeholders and an integrated approach combining successful business models with socio-economic and institutional change. We suggest that an operational framework be developed for fast-tracking cost-effective recovery options.

    Opportunities and challenges in sustainable treatment and resource reuse of sewage sludge: A review

    Sludge, or waste activated sludge (WAS), generated from wastewater treatment plants may be considered a nuisance. It is a key source of secondary environmental contamination on account of the presence of diverse pollutants (polycyclic aromatic hydrocarbons, dioxins, furans, heavy metals, etc.). Innovative and cost-effective sludge treatment pathways are a prerequisite for the safe and environment-friendly disposal of WAS. This article delivers an assessment of the leading disposal (volume reduction) and energy recovery routes, such as anaerobic digestion, incineration, pyrolysis, gasification and enhanced digestion using microbial fuel cells, along with a comparative evaluation of their suitability for different sludge compositions and resource availability. Furthermore, the authors shed light on bio-refinery and resource recovery approaches to extract value-added products and nutrients from WAS, and on control options for metal elements and micro-pollutants in sewage sludge. Recovery of enzymes, bio-plastics, bio-pesticides, proteins and phosphorus is discussed as a means to visualize sludge as a potential opportunity instead of a nuisance.

    Limits of scientific explanation (II)

    The second part of the text deals with the anti-naturalistic argument of F.A. Hayek. To present it comprehensively, however, his theory of mind has to be outlined first. According to Hayek, the way in which we perceive the world is entirely grounded in the biological construction of our neural order and thus, from this perspective, he seems to be a naturalist. He excludes any non-natural properties of our cognition, such as transcendental free will. However, a closer look at the functioning of our biological apparatus of perception reveals certain inherent and internal restrictions. First of all, we notice that the neural order (the biological construction of neurons) is in fact a very complex apparatus of classification and discrimination of sensory impulses. Impulses may come from reality external to the neural order as well as from the inside. The apparatus of classification and discrimination of sensory impulses is not stable but permanently dynamic. A constant influx of sensations, and the system's responses to them, creates new classification rules (neural connections) and demolishes those which have been inactive for a longer time. The system of those rules existing at a particular moment forms a model of reality which imperfectly corresponds to the existing, transcendent reality. The final argument for anti-naturalism elucidated in the text is Hayek's idea of what explanation is and where its limits lie. This idea can be reduced to the following quotation: "…any apparatus of classification must possess a structure of a higher degree of complexity than is possessed by an object which it classifies." In other words: if our cognitive system is an "apparatus of classification", if explanation means modeling, and if a complete explanation requires the explanation of the apparatus itself, then a complete explanation is not possible at all, as the apparatus, which has a certain level of complexity, cannot upgrade this level in order to explain itself. Hayek's reasoning is generally endorsed; it is emphasized, however, that it rests on very strong assumptions, which are identified and named at the end of the text.

    The case for critique of statistical relevance model of scientific explanation

    The statistical relevance model of scientific explanation was proposed by Wesley Salmon in 1971 as an interesting alternative to the already existing models introduced by Hempel and supported by many other philosophers of science. The most important difference between the nomological models and the statistical relevance model is that the latter tries not to use the highly dubious term 'law of nature'. The first part of the paper consists of an overview of Salmon's model and of the main arguments which various authors have raised against it. In the main part of the text, all of the arguments meant to undermine the model are presented on an example taken from economic practice: the so-called 'statistical analysis of the market', which is very popular among economists and especially among valuation experts. The main objective of such an analysis is to discover all of the factors which influence the market value of a particular product; in other words, to explain the market value of the product. The example was taken from social science (economics) on purpose, as one of the theses of the paper is that the SR model can work quite well in physics or chemistry, but it is dubious whether we can really deploy it in sciences which try to describe and explain the various phenomena of human activity and behavior. The final conclusions are as follows. The practical deployment of the model in the social sciences is problematic, as it is too idealistic and therefore does not work properly. Contrary to its initial presumption, the model does not avoid the problem of laws of nature: although a law of nature is not a required element of the explanans, it comes back at the stage of proposing the initial candidates for the relevant variables. The hypotheses about which variables can and cannot be relevant to the explained phenomenon are constructed mostly according to intuitively understood causal relationships founded on laws of nature. The important postulate of a homogeneous partition is in practice unachievable, which means that the explanation carries an enormous risk of a mistake. The risk is quantifiable and can be estimated, but the estimation depends on the experience and intuition of the researcher.
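    For reference, the statistical relevance condition at the heart of Salmon's model, and the homogeneity postulate criticized above, can be stated compactly (a standard textbook formulation, not notation taken from the paper itself). A factor $C$ is statistically relevant to an outcome $B$ within a reference class $A$ iff

        $$P(B \mid A \wedge C) \neq P(B \mid A),$$

    and a partition $\{C_1, \dots, C_n\}$ of $A$ is homogeneous with respect to $B$ iff no cell admits a further statistically relevant subdivision, i.e.

        $$P(B \mid A \wedge C_i \wedge D) = P(B \mid A \wedge C_i)$$

    for every cell $C_i$ and every admissible property $D$. The practical impossibility of checking this condition over all candidate properties $D$ is what makes the homogeneity postulate, as the paper argues, unachievable in practice.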

    Limits of scientific explanation (I)

    The purpose of the paper is to challenge one of the most important assumptions of the neo-positivists, namely the unity of science. The idea that all of the sciences, both natural and social, should have the same structure and should deploy similar methods is, after Grobler, called naturalism. I try to argue for anti-naturalism. Economics seems an interesting example: it has not demonstrated success comparable to that achieved by the natural sciences. Certain naturalistic explanations for this lack of success are reviewed and criticized in the paper. First, complexity: at the very beginning of this naturalistic argument one encounters the problem of definition; up to nine different notions of complexity have been proposed, and only a few of them are practically quantitative. Second, mathematics: in the natural sciences we explore mathematical theories in order to capture the regularities in the investigated phenomena and to include them in the corresponding equations. However, even if we do not have a perfectly corresponding mathematical model, regularities themselves can be observed. Wherever we do not have a good theory expressed in terms of exact mathematical equations, we should at least be able to judge the existence or non-existence of certain regularities on the basis of linear (statistical) or non-linear methods. Those methods, some of them extremely sophisticated, are extensively applied in economics and econometrics (the so-called quantitative methods); the results are disappointing. The anti-naturalistic argumentation of Grobler is dealt with separately. Grobler names three anti-naturalistic arguments: complexity (as mentioned above), the free will of humans (which the author did not find interesting enough) and, finally, the reasoning called "inherent two-way interdependence". Grobler maintains that we are able to work out a meta-theory which would include both predictions and the possible impact of those predictions on the theory's object. This proposal is rejected in the paper.

    Believable world of economic models

    Book review: Łukasz Hardt, Economics Without Laws: Towards a New Philosophy of Economics, Palgrave Macmillan, Cham, 2017, 220 pp.