
    Automatic parametrisation of beached microplastics

    Four sandy beaches on the island of Malta were regularly sampled for Large MicroPlastic (LMP) particles with a diameter between 1 mm and 5 mm, at stations located at the waterline and 10 m inshore. The extracted LMPs were characterised in terms of dimensions, surface roughness, and colour by microscopic analysis, as well as by a purpose-built image processing algorithm. Two-thirds of the isolated particles were smooth, and the majority of these belonged to the grey-white colour category, suggesting that they were preproduction pellets. Roughly six times as many particles were recorded at the inshore sampling stations as at the waterline stations. The automated image processing algorithm performed well: the dimension and colour parameter values it delivered compared favourably with those obtained by microscopic analysis.
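
    The abstract does not reproduce the algorithm itself, but a minimal sketch of such an image-based parametrisation, assuming OpenCV, a uniform background, and a hypothetical pixel-to-millimetre calibration, could look as follows (all names and thresholds are illustrative assumptions, not the authors' method):

        # Hypothetical sketch: extract particle dimensions and mean colour
        # from a calibrated photograph of an isolated LMP particle.
        import cv2

        PIXELS_PER_MM = 40.0  # assumed calibration factor of the imaging setup

        def parametrise_particle(image_path: str) -> dict:
            img = cv2.imread(image_path)
            grey = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
            # Separate the particle from a uniform background via Otsu thresholding.
            _, mask = cv2.threshold(grey, 0, 255,
                                    cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            particle = max(contours, key=cv2.contourArea)  # largest blob
            # Minimum-area bounding rectangle yields the major/minor dimensions.
            (_, _), (w, h), _ = cv2.minAreaRect(particle)
            # Mean BGR colour inside the mask approximates the colour class.
            mean_bgr = cv2.mean(img, mask=mask)[:3]
            return {"major_mm": max(w, h) / PIXELS_PER_MM,
                    "minor_mm": min(w, h) / PIXELS_PER_MM,
                    "mean_bgr": mean_bgr}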

    Next Generation Access and Digital Divide: Opposite Sides of the Same Coin?

    Geographical averaging of retail and wholesale prices could distort incentives for bypass entry in both metropolitan and high-cost areas. The two-instrument approach to universal service support proposed by Armstrong (2001) could enhance efficiency through competitive and technological neutrality. Alternatively, industry support to high-cost areas could be replaced by redistributive fiscal measures or public subsidies. Using evidence from Italy, we suggest that tackling demographic, educational, and income inequalities is necessary, even in low-cost areas, to support further broadband penetration. We estimate logistic regressions of Internet and broadband use at home, and show that a substantial increase in broadband penetration is possible in Italy only if specific platforms and applications are made available to older and less educated households. Therefore, a critical mass of services could help reach the critical mass of users that makes Next Generation Access Networks viable. Keywords: infrastructural digital divide, cultural digital divide, geographical cross-subsidies, efficient bypass, critical mass of services.
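
    As an illustration of the estimation step, a logistic regression of home broadband use on demographic covariates of the kind described might be specified as below; the data file and column names are hypothetical, not the authors' specification:

        # Illustrative sketch of a logit model of home broadband use.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("household_survey.csv")  # hypothetical survey extract

        # broadband: 1 if the household uses broadband at home, 0 otherwise.
        model = smf.logit("broadband ~ age + years_education + log_income", data=df)
        result = model.fit()
        print(result.summary())                 # coefficients on the determinants
        print(result.get_margeff().summary())   # average marginal effects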

    The diversification potential offered by emerging markets in recent years

    This paper investigates the diversification prospects which may be reaped when investing in a mixture of emerging and developed market assets. Given that emerging markets are somewhat distinct from developed ones, one may expect significant diversification potential and therefore risk reduction. Yet the latter may be counterbalanced by the fact that emerging markets usually present higher risks when considered on their own, for instance higher price volatility and fluctuating liquidity. We use a panel data set spanning a 10-year period and form a number of portfolios. We find that over the sample period, emerging market assets could be combined into efficient portfolios when assessed in terms of risk and return. By contrast, portfolios involving developed market assets tended to be inefficient. We also investigate whether emerging markets have converged towards developed ones over recent years. When analysing co-movements between indices, the correlation values suggest that emerging markets have offered diversification potential. However, we also find evidence of features which make it more challenging to reap the expected risk reduction benefits, namely the tendency for emerging markets to exhibit higher individual variability, and the trend for markets to move more in line with each other, as suggested by the convergence literature.
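
    A minimal sketch of the kind of risk-return and co-movement analysis described, assuming a hypothetical file of monthly index returns and illustrative index labels:

        # Blend an emerging and a developed index at varying weights and
        # report annualised risk/return, then check rolling co-movement.
        import numpy as np
        import pandas as pd

        returns = pd.read_csv("index_returns.csv", index_col=0, parse_dates=True)
        emerging, developed = returns["MSCI_EM"], returns["MSCI_World"]

        for w in np.linspace(0, 1, 5):           # weight on emerging markets
            blend = w * emerging + (1 - w) * developed
            ann_ret = blend.mean() * 12          # monthly data assumed
            ann_vol = blend.std() * np.sqrt(12)
            print(f"w_EM={w:.2f}  return={ann_ret:.2%}  volatility={ann_vol:.2%}")

        # A rising rolling correlation would signal convergence between groups.
        print(emerging.rolling(36).corr(developed).tail())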

    The determinants of securities trading activity: evidence from four European equity markets

    Purpose: The main objective of this study is to obtain new empirical evidence about the connections between equity trading activity and five possible liquidity determinants: market capitalisation, dividend yield, earnings yield, company growth, and the distinction between recently-listed firms and more established ones. Design/methodology/approach: We use a sample of 172 stocks from four European markets and estimate models using the entire sample data and different sub-samples to check the relative importance of the above determinants. We also conduct a factor analysis to re-classify the variables into a more succinct framework. Findings: The evidence suggests that market capitalisation is the most important trading activity determinant, with the number of years listed ranking thereafter. Research limitations/implications: The positive relation between trading activity and market capitalisation is in line with prior literature, while the findings relating to the other determinants offer further empirical evidence which is a worthy addition in view of the contradictory results in prior research. Practical implications: This study is of relevance to practitioners who would like to understand the cross-sectional variation in stock liquidity at a more detailed level. Originality/value: The originality of the paper rests on two grounds: (a) we focus on trading turnover rather than on other liquidity proxies, since the former is accepted as an important determinant of the liquidity generation process, and (b) we adopt a rigorous approach towards checking the robustness of the results by considering various sub-sample configurations.
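
    A hedged sketch of the two steps described, a cross-sectional regression of turnover on the candidate determinants followed by a factor analysis, using illustrative variable names rather than the study's actual specification:

        # Regress turnover on the five determinants, then compress them
        # into latent factors, mirroring the paper's two-step design.
        import pandas as pd
        import statsmodels.formula.api as smf
        from sklearn.decomposition import FactorAnalysis

        stocks = pd.read_csv("stock_panel.csv")  # hypothetical: one row per stock

        ols = smf.ols("log_turnover ~ log_market_cap + dividend_yield"
                      " + earnings_yield + asset_growth + years_listed",
                      data=stocks).fit()
        print(ols.summary())  # market cap expected to dominate, per the findings

        cols = ["log_market_cap", "dividend_yield", "earnings_yield",
                "asset_growth", "years_listed"]
        fa = FactorAnalysis(n_components=2, random_state=0).fit(stocks[cols])
        print(pd.DataFrame(fa.components_, columns=cols))  # factor loadings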

    Development of a biomarker for penconazole: a human oral dosing study and a survey of UK residents’ exposure

    Penconazole is a widely used fungicide in the UK; however, to date, there have been no peer-reviewed publications reporting human metabolism, excretion, or biological monitoring data. The objectives of this study were to (i) develop a robust analytical method, (ii) determine biomarker levels in volunteers exposed to penconazole, and (iii) measure the metabolites in samples collected as part of a large investigation of rural residents’ exposure. An LC-MS/MS method was developed for penconazole and two oxidative metabolites. Three volunteers received a single oral dose of 0.03 mg/kg body weight, and timed urine samples were collected and analysed. The volunteer study demonstrated that both penconazole-OH and penconazole-COOH are excreted in humans following an oral dose and are viable biomarkers. Excretion is rapid, with a half-life of less than four hours. Mean recovery of the administered dose was 47% (range 33%–54%) in urine treated with glucuronidase to hydrolyse any conjugates. The results from the residents’ study showed that levels of penconazole-COOH in this population were low, with over 80% below the limit of detection. Future sampling strategies that include both end-of-exposure and next-day urine samples, as well as contextual data about the route and time of exposure, are recommended.
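
    For intuition only, a back-of-envelope first-order excretion model consistent with the reported figures (half-life below roughly four hours, mean urinary recovery of 47% of dose); the one-compartment model is an assumption made purely for illustration:

        # Cumulative urinary excretion under first-order elimination.
        import numpy as np

        t_half = 4.0                  # h, upper bound reported in the study
        k = np.log(2) / t_half        # elimination rate constant
        f_urine = 0.47                # mean fraction of dose recovered in urine

        for t in (4, 8, 12, 24):      # hours after the oral dose
            excreted = f_urine * (1 - np.exp(-k * t))
            print(f"t={t:2d} h: cumulative excretion ~ {excreted:.1%} of dose")
        # By 24 h essentially the whole recoverable fraction has appeared,
        # which is why next-day urine samples remain informative.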

    The dissociable effects of punishment and reward on motor learning

    A common assumption regarding error-based motor learning (motor adaptation) in humans is that its underlying mechanism is automatic and insensitive to reward- or punishment-based feedback. Contrary to this hypothesis, we show in a double dissociation that the two have independent effects on the learning and retention components of motor adaptation. Negative feedback, whether graded or binary, accelerated learning. While it was not necessary for the negative feedback to be coupled to monetary loss, it had to be clearly related to the actual performance on the preceding movement. Positive feedback did not speed up learning, but it increased retention of the motor memory when performance feedback was withdrawn. These findings reinforce the view that independent mechanisms underpin learning and retention in motor adaptation, reject the assumption that motor adaptation is independent of motivational feedback, and raise new questions regarding the neural basis of negative and positive motivational feedback in motor learning.
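
    One common way to formalise such findings, though not necessarily the authors' own model, is a single-rate state-space model of adaptation in which punishment maps onto a larger error-sensitivity term and reward onto a larger retention factor; the parameter values below are illustrative:

        # Single-rate state-space model: x <- A*x + B*e during learning,
        # x <- A*x during washout when feedback is withdrawn.
        import numpy as np

        def simulate(A, B, n_learn=80, n_washout=40, target=1.0):
            x, trace = 0.0, []
            for n in range(n_learn + n_washout):
                if n < n_learn:
                    e = target - x    # error on the preceding movement
                    x = A * x + B * e
                else:
                    x = A * x         # passive decay of the motor memory
                trace.append(x)
            return np.array(trace)

        punished = simulate(A=0.97, B=0.30)   # higher B: faster learning
        rewarded = simulate(A=0.995, B=0.15)  # higher A: better retention
        print("punished:", punished[:3].round(3), "final", punished[-1].round(3))
        print("rewarded:", rewarded[:3].round(3), "final", rewarded[-1].round(3))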

    Using dynamic binary analysis for tracking pointer data

    The examination and monitoring of binaries during runtime, referred to as dynamic binary analysis, is a widely adopted approach, especially in the field of security and software vulnerabilities. Fundamentally, it provides a means to understand and reason about binary executions. There are various applications of dynamic binary analysis, including vulnerability analysis, malware analysis, and Web security. One technique typically employed to perform dynamic analysis is taint analysis, which revolves around inspecting interesting information flows [3]. In this approach, taint marks are associated with values that are (1) introduced via defined sources and (2) propagated to other values to keep track of information flow. Marks may also be removed (untainted) once a defined sink has been reached. Taint checking is also carried out in order to determine whether or not certain runtime behaviours of the program occur. The properties describing how taint analysis is performed, i.e. taint introduction, propagation, and checking, are specified by a set of rules referred to as a taint policy. One convenient way to define taint rules is in the form of operational semantics rules, as this avoids ambiguity. Rule 1 specifies the general form of a taint rule used in this paper: given the current machine context of the program and a statement, the rule specifies the end result after the computation has been carried out.
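
    A toy illustration of the three policy components, introduction at sources, propagation through computations, and checking at sinks; a real dynamic-binary-analysis tool operates on machine state rather than Python objects, so everything here is purely schematic:

        # Minimal taint policy: introduce marks at sources, propagate them
        # through operations, and check them at sinks.
        class Tainted:
            def __init__(self, value, marks):
                self.value, self.marks = value, set(marks)

        def source(value, mark):
            # Introduction rule: values from defined sources carry a mark.
            return Tainted(value, {mark})

        def add(a, b):
            # Propagation rule: the result unions the operands' marks.
            va, ma = (a.value, a.marks) if isinstance(a, Tainted) else (a, set())
            vb, mb = (b.value, b.marks) if isinstance(b, Tainted) else (b, set())
            return Tainted(va + vb, ma | mb)

        def sink(x):
            # Checking rule: tainted data reaching the sink flags a flow.
            if isinstance(x, Tainted) and x.marks:
                raise RuntimeError(f"tainted {x.value!r} reached sink: {x.marks}")

        user_input = source(41, "network")  # e.g. a value read from a socket
        derived = add(user_input, 1)        # taint propagates through the add
        try:
            sink(derived)
        except RuntimeError as err:
            print(err)                      # flow from source to sink detected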

    Lill-mużika ta’ Chopin

    A collection of poems and prose including: Talba lil Ġesù Bambin by Ivo Muscat Azzopardi – Il-kewkba tal-Milied by Ġużè Galea – Il-Milied it-tajjeb! by G. Borg Pantalleresco – Lill-mużika ta’ Chopin by John Sciberras.

    Determination of optimal conditions for pressure oxidative leaching of Sarcheshmeh molybdenite concentrate using the Taguchi method

    The present research work is based on finding the optimum conditions for pressure oxidative leaching of the molybdenite concentrate to produce technical-grade molybdic oxide (MoO3) with high recovery through further treatment of the filtrate solution. The Taguchi method was used to design and minimise the number of experiments. Using a Taguchi L25 orthogonal array, five parameters (time, temperature, oxygen pressure, pulp density, and acid concentration) were examined at five levels across 25 experiments. The experiments were designed and carried out in a high-pressure reactor in the presence of nitric acid as solvent and oxidizing agent for the molybdenite concentrate and its ReS2 content. The optimum conditions for pressure leaching of molybdenite were obtained through signal-to-noise (S/N) analysis and refined using the Minitab software prediction tool. Furthermore, the optimum conditions for economical pressure leaching of rhenium sulfide (ReS2) were achieved with the same process. Analysis of variance (ANOVA) showed that pulp density is of paramount importance in this process.
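
    A hedged sketch of the signal-to-noise step described, assuming a hypothetical results table with one recovery value per run and illustrative column names:

        # Larger-is-better S/N ratio per run, then mean S/N per factor level.
        import numpy as np
        import pandas as pd

        runs = pd.read_csv("l25_results.csv")  # hypothetical: 25 rows, one per run
        factors = ["time", "temperature", "o2_pressure",
                   "pulp_density", "acid_conc"]

        # Larger-is-better: S/N = -10*log10(mean(1/y^2)); single replicate here.
        runs["sn"] = -10 * np.log10(1.0 / runs["mo_recovery"] ** 2)

        for f in factors:
            level_means = runs.groupby(f)["sn"].mean()
            delta = level_means.max() - level_means.min()
            # A larger delta marks a more influential factor, mirroring the
            # ANOVA finding that pulp density matters most.
            print(f, "best level:", level_means.idxmax(), " delta:", round(delta, 2))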