
    The increased volatility of electricity prices for Belgian households

    Several studies have shown that the Belgian harmonised index of consumer prices for electricity (HICP-EL) has become more volatile since the end of 2007. The increased volatility can be observed not only in the past behaviour of the index itself, but also relative to other countries’ indices. A study by the National Bank of Belgium has shown that changes in the method of registering the base data collected for calculating the index are not the only reason; it concluded that differences in price-setting behaviour in the deregulated electricity markets are a more plausible explanation. The article analyses the Belgian price-setting mechanism in detail. Most households have signed variable-price contracts with their electricity supplier. The variable price in such contracts is aligned to suppliers’ costs using indexing parameters. One parameter reflects changes in wages and material costs (the parameter Ne), while others follow changes in fuel costs (several parameters are used, the most representative being Nc and Iem). These parameters are computed and published by the regulator on a monthly basis. The main findings of the article are that (1) the evolution of the HICP for electricity is strongly correlated with a linear combination of the above-mentioned parameters, (2) this linear combination leads the HICP by one or two months, (3) the monthly indexation implies a fast transmission of parameter changes to consumer prices, (4) the monthly price changes combined with the annual invoicing frequency increase complexity and reduce transparency; moreover, users have only ex post knowledge of the price, which limits price comparability, and (5) the indexing formulae assume a fixed fuel mix. In practice, however, the fuel mix changes because suppliers switch their purchase contracts to other producers, because of fluctuations in relative fuel prices, and because the composition of the production capacity is changing. Suppliers also incur costs that are not taken into account by the formulae (e.g. the cost of greenhouse gas emission rights and green certificates). It is also worth mentioning that no signs of similar indexing mechanisms were found in neighbouring countries.
    Keywords: harmonised index of consumer prices, electricity
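
    The core empirical claim, that the HICP-EL tracks a linear combination of the indexing parameters with a one-to-two-month lead, can be illustrated with a small sketch. Everything below is synthetic: the parameter series, the weights a, b and c, and the one-month lag are assumptions chosen for illustration, not the regulator's actual data or the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly indexing parameters (illustrative, not regulator data):
# Ne tracks wages/material costs, Nc and Iem track fuel costs.
months = 48
Ne = 1.0 + np.cumsum(rng.normal(0.001, 0.002, months))
Nc = 1.0 + np.cumsum(rng.normal(0.002, 0.010, months))
Iem = 1.0 + np.cumsum(rng.normal(0.002, 0.012, months))

# Hypothetical indexing formula: the variable price is a linear
# combination of the parameters (the weights a, b, c are assumptions).
a, b, c = 0.3, 0.5, 0.2
supplier_price = a * Ne + b * Nc + c * Iem

# The consumer price index is assumed to register the same combination
# with a one-month delay plus measurement noise.
lag = 1
hicp_el = np.roll(supplier_price, lag) + rng.normal(0, 0.002, months)
hicp_el[:lag] = supplier_price[:lag]  # patch the wrap-around from np.roll

# Cross-correlate to recover the lead of the parameter combination.
def lagged_corr(x, y, k):
    """Correlation between x and y shifted back by k months."""
    return np.corrcoef(x[:-k] if k else x, y[k:] if k else y)[0, 1]

for k in range(4):
    print(f"lead {k} months: corr = {lagged_corr(supplier_price, hicp_el, k):.3f}")
```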

    Liberalisation of network industries : Is the electricity sector an exception to the rule?

    For quite a long time, network industries used to be regarded as (natural) monopolies. This was due to these industries having some special characteristics: network externalities and economies of scale in particular justified the (natural) monopoly thesis. Recently, however, a trend towards deregulation of such industries has been observed. This trend started with the successful introduction of competition in the telecommunications sector, where economies of scale disappeared as a result of emerging new technologies. The successful deregulation of telecommunications is in line with microeconomic theory, which predicts an increase in efficiency and lower prices when markets are opened up to competition. The success in the telecommunications sector is often used as an argument for opening up other network industries to competition as well. In this paper we analyse whether this reasoning can be transposed to the electricity sector. It is argued that the two sectors, electricity and telecommunications, are similar in that both are network industries which used to be characterised by economies of scale, and that technological progress might have put an end to this scale effect. There are, however, important differences. Firstly, technological progress on the supply side was accompanied by strong growth in demand in the telecommunications sector; this demand-side effect is absent in electricity. Moreover, owing to its physical characteristics, the electricity sector is more complicated: to introduce competition, it has to be split up into subsectors (production, transmission, distribution and supply). Competition is introduced in production and supply, while transmission and distribution remain monopolies. This splitting up creates a new kind of cost, the so-called transaction costs. The paper centres on two issues: (a) are the basic assumptions behind the theoretical model of the perfectly free market met in the deregulated subsectors, and (b) do the transaction costs (partly) offset possible price decreases in the competitive segments? There is no hard evidence that the hypotheses behind the theoretical model are met in the electricity sector, and there are strong indications that the transaction costs might be substantial. Moreover, in addition to the deregulation process, the electricity sector is subject to other changes, such as the internalisation of externalities (see the Kyoto protocol) and the debate on nuclear energy, which could exert upward pressure on prices. Since electricity is ubiquitous, the deregulation process should be closely monitored.
    Keywords: welfare economics, market structure and pricing, organizational behaviour, transaction costs, property rights, electric utilities, telecommunications
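
    Question (b) reduces to simple arithmetic: the consumer gain from competition is the price decrease in the competitive segments minus the new transaction costs. A minimal sketch, with every number assumed purely for illustration:

```python
# Illustrative back-of-the-envelope for question (b): does the new
# transaction cost offset the price decrease in competitive segments?
# All figures below are assumptions, not estimates from the paper.

monopoly_price = 100.0      # EUR/MWh under the integrated monopoly
competitive_drop = 0.08     # assumed 8 % price fall in generation and supply
transaction_cost = 5.0      # assumed EUR/MWh of new coordination/trading costs

deregulated_price = monopoly_price * (1 - competitive_drop) + transaction_cost
net_change = deregulated_price - monopoly_price

print(f"price after deregulation: {deregulated_price:.2f} EUR/MWh")
print(f"net change for consumers: {net_change:+.2f} EUR/MWh")
# With these assumptions the 8 EUR/MWh competitive gain shrinks to 3 EUR/MWh;
# a transaction cost above 8 EUR/MWh would wipe it out entirely.
```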

    The single European electricity market: A long road to convergence

    In a first working paper the authors argued that electricity has a number of characteristics that set it apart from other commodities, and demonstrated that some of these characteristics might complicate the deregulation process. This paper analyses the ongoing deregulation process in the European electricity sector and attempts to establish whether these difficulties can be solved more readily at European level. It would appear that some problems, e.g. economies of scale in electricity generation, have less of an impact at European level than within smaller national markets. However, a number of difficulties have to be overcome before a unified European electricity market can become a reality, including the limited interconnection capacities between Member States. The European Commission has taken steps to improve the situation, for example by offering financial support for investments and by promoting the development of regional markets as an interim step towards a fully integrated market. Apart from the difficulties related to electricity generation and transmission, there are also exogenous factors that influence the ongoing deregulation process, e.g. the implementation of the Kyoto protocol and the dramatic increases in primary fuel prices. This paper argues that a consistent, stable and uniform European regulatory framework must be put in place if the impact of these difficulties is to be minimised.
    Keywords: electricity deregulation

    Analysis of business demography using markov chains : an application to Belgian data

    This paper applies the theory of finite Markov chains to analyse the demographic evolution of Belgian enterprises. While other methodologies concentrate on the entry and exit of firms, the Markov approach also analyses migrations between economic sectors. Besides helping to provide a fuller picture of the evolution of the population, Markov chains also enable forecasts of its future composition to be made, as well as the computation of average lifetimes of companies by branch of activity. The method is applied to Belgian data from the Crossroads Bank for Enterprises (CBE). To ensure compliance with Eurostat-OECD definitions, only 'active' enterprises, i.e. enterprises with a positive turnover and/or at least one employee, are considered. The forecasting method is applied to simulate the demographic evolution of the CBE population between 2000 and 2006. This simulation matches the observed changes well, and taking migrations into account yields better forecasts than ignoring them. Moreover, several off-diagonal percentages in the transition matrix are significantly different from zero. A case study shows that these migrations are genuine changes in main activity and not the consequence of corrections of wrongly classified firms. Next, the average remaining lifetime and the average age of enterprises in a particular branch of activity are computed and analysed. These lifetimes and ages differ considerably across branches. As expected, the lifetimes of public services are longer than average. Shorter lifetimes combined with an increasing number of enterprises indicate renewal inside the branch, while a low average age is a sign of a relatively new branch. Comparing age to total expected lifetime yields an indicator of closeness to extinction, which might serve as a measure of the branch's maturity. The method is more generally applicable in the sense that it can be used to analyse populations other than that of the CBE, and other partitions of the population.
    Keywords: business demography, Markov chains, transition matrix
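
    Both computations, forecasting the population's composition and deriving expected remaining lifetimes from an absorbing Markov chain, can be sketched in a few lines. The branch labels, transition probabilities and population counts below are invented for illustration; the paper estimates its matrices from CBE data.

```python
import numpy as np

# Toy annual transition matrix over three branches plus an absorbing
# "exit" state (probabilities invented for illustration). Rows sum to 1;
# off-diagonal entries between branches are the sector migrations.
states = ["manufacturing", "services", "public", "exit"]
P = np.array([
    [0.85, 0.05, 0.00, 0.10],   # manufacturing
    [0.04, 0.88, 0.01, 0.07],   # services
    [0.00, 0.01, 0.97, 0.02],   # public
    [0.00, 0.00, 0.00, 1.00],   # exit (absorbing)
])

# Forecast the composition of the population six years ahead by
# repeated multiplication with the transition matrix.
pop0 = np.array([30000.0, 60000.0, 10000.0, 0.0])
pop6 = pop0 @ np.linalg.matrix_power(P, 6)
print(dict(zip(states, np.round(pop6))))

# Expected remaining lifetime per branch from the fundamental matrix
# N = (I - Q)^{-1}, where Q is the transient-to-transient block.
Q = P[:3, :3]
N = np.linalg.inv(np.eye(3) - Q)
lifetimes = N.sum(axis=1)          # expected years before absorption
for s, t in zip(states[:3], lifetimes):
    print(f"{s}: {t:.1f} years")
```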

    De autonijverheid in België [The car industry in Belgium]

    This paper provides an extensive overview of the economic importance and evolution of the car manufacturing industry. In addition, it provides evidence that the car industry still plays a vital role in process innovation. The now widespread lean production method, under which companies focus on their core activities and develop a network of subcontractors, in fact originates from the Japanese car industry. The introduction of the lean production concept in Europe had a far-reaching impact on corporate relations: important responsibilities, such as product development, quality control, innovation efforts and timely deliveries, have been passed on to subcontractors. Company clusters have formed, often with consequences for geographical location owing to the necessity of "just-in-time" or even "just-in-sequence" deliveries. The fact that global companies have implemented this production method also adds to the internationalisation of the subcontracting companies. This conclusion fuels the trend towards anchoring industrial core activities through an appropriate policy. Such a policy must be based on reliable statistical observations. A major obstacle, however, is that the importance of these corporate clusters is hard to measure because of their network structure. Since the input-output tables are not available for the most recent years and are not sufficiently detailed, a method was sought that makes it possible to gauge the importance of a specific branch. In this paper the method is applied to car manufacturing in Belgium. The proposed calculation method is based on the supply and use tables drawn up by the Bank within the framework of the National Accounts Institute.
    Keywords: branch survey, car industry, subcontracting, indirect effects
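
    The abstract does not spell out the calculation method, but the standard way to derive direct plus indirect effects from such tables is input-output analysis: the Leontief inverse converts a unit of final demand for car assembly into the total output required across all branches. A minimal sketch with invented coefficients, not the paper's own figures:

```python
import numpy as np

# Minimal input-output sketch of direct plus indirect effects.
# A[i, j] = input from branch i needed per unit of output of branch j
# (coefficients invented; the paper derives its own figures from the
# supply and use tables of the National Accounts Institute).
branches = ["car assembly", "parts suppliers", "other industry"]
A = np.array([
    [0.02, 0.01, 0.00],
    [0.35, 0.10, 0.02],
    [0.15, 0.20, 0.10],
])

# One extra unit of final demand for car assembly:
f = np.array([1.0, 0.0, 0.0])

# Total output needed, x = (I - A)^{-1} f  (Leontief inverse):
x = np.linalg.solve(np.eye(3) - A, f)
for b, v in zip(branches, x):
    print(f"{b}: {v:.3f}")
# Everything above the 1.0 unit of direct demand is the indirect effect
# pulled through the subcontracting network.
```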

    The performance of credit rating systems in the assessment of collateral used in Eurosystem monetary policy operations

    The aims of this paper are twofold: first, we attempt to express the threshold of a single “A” rating as issued by major international rating agencies in terms of annualised probabilities of default. We use data from Standard & Poor’s and Moody’s publicly available rating histories to construct confidence intervals for the level of probability of default to be associated with the single “A” rating. The focus on the single “A” rating level is not accidental, as this is the credit quality level at which the Eurosystem considers financial assets to be eligible collateral for its monetary policy operations. The second aim is to review various existing validation models for the probability of default which enable the analyst to check the ability of credit assessment systems to forecast future default events. Within this context the paper proposes a simple mechanism for the comparison of the performance of major rating agencies and that of other credit assessment systems, such as the internal ratings-based systems of commercial banks under the Basel II regime. This is done to provide a simple validation yardstick to help in the monitoring of the performance of the different credit assessment systems participating in the assessment of eligible collateral underlying Eurosystem monetary policy operations. Contrary to the widely used confidence interval approach, our proposal, based on an interpretation of p-values as frequencies, guarantees a convergence to an ex ante fixed probability of default (PD) value. Given the general characteristics of the problem considered, we consider this simple mechanism to also be applicable in other contexts.
    Keywords: credit risk, rating, probability of default (PD), performance checking, backtesting
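
    Two of the standard tools the paper builds on can be sketched directly: the widely used confidence interval approach (here a Clopper-Pearson interval for the PD of a rating grade) and a simple binomial backtest against an ex ante PD benchmark. The counts and the benchmark below are invented; this is not the paper's own p-value-based mechanism, only the conventional tools it compares against.

```python
from scipy.stats import beta, binom

# Clopper-Pearson confidence interval for the PD of a rating grade,
# from a pooled default history (counts invented for illustration).
n_obligors = 20000        # firm-years rated single "A"
n_defaults = 9            # observed defaults over those firm-years
alpha = 0.05

lo = beta.ppf(alpha / 2, n_defaults, n_obligors - n_defaults + 1)
hi = beta.ppf(1 - alpha / 2, n_defaults + 1, n_obligors - n_defaults)
print(f"95% CI for the annual PD: [{lo:.5f}, {hi:.5f}]")

# Simple binomial backtest of a credit assessment system against an
# ex ante PD benchmark: p-value of observing at least this many
# defaults if the true PD equalled the benchmark.
benchmark_pd = 0.001
observed = 14
portfolio = 10000
p_value = 1 - binom.cdf(observed - 1, portfolio, benchmark_pd)
print(f"one-sided p-value: {p_value:.4f}")
```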

    L'enseignement magistral : révérence ou référence ? [Teacher-led instruction: reverence or reference?]

    Starting from the temptation to oppose cooperative and lecture-based teaching practices, we trace it back to its social background (the injunctions to be autonomous and to share a common world) and its anthropological background (the emergence of the subject). Clarifying what is at stake in the teacher-led relationship, and more precisely in the receptivity at the heart of that relationship, helps to identify paths towards a humanisation that is confused neither with self-foundation nor with submission. The reflection on the expectations of the teacher-led relationship bears at once on the educational relationship, on pedagogical practices, and on the difficult emergence of a subjectivity that lies at the heart of contemporary concerns. A critical examination of these dynamics should favour their proper articulation within a transmission without which there is no human history.