35 research outputs found

    THE RELATIONSHIP BETWEEN MACROECONOMIC VARIABLES AND ROMANIAN CORPORATE DEFAULT RATES BETWEEN 2002-2008

    During its 20-year history as a market economy, Romania experienced its most severe downturn in 2009, which imposed substantial costs, mainly through lost output. These conditions forced several firms to declare bankruptcy and cease their activity. The aim of this research is to assess the relationship between corporate default rates and macroeconomic processes in Romania over the period 2002Q1-2008Q4. Based on the relevant literature, we grouped the potential explanatory variables of the default rates into seven categories: cyclical indicators, household indicators, corporate indicators, external sector indicators, price stability indicators and interest rates, loans to the private sector and, finally, capital market indicators. Some studies base their results only on accounting data, others only on market data. Our study uses both, since this appears to be an adequate approach for capturing most of the relevant processes. Mirroring the structure of banks' loan portfolios, we conducted the analysis for five sectors: industry, construction, agriculture, services and the overall economy. For each sector, the average default probability at time t is modeled as a logistic function of general and sector-specific macroeconomic variables. The use of logistic regression was motivated by its ability to handle fractional data between 0 and 1. We found that at least one variable from each group has significant explanatory power for the evolution of default rates in all five sectors analyzed. In some cases the sign of a variable was the opposite of what economic theory would suggest, but it has to be taken into account that Romania showed the signs of an overheated economy during the analyzed period. Another important conclusion was that many variables were significant through their lagged values, which suggests that closer monitoring of the evolution of these specific variables would pay off.
Of all the variables, the volatility of the BET-C index proved the most important in predicting the evolution of default rates, being significant for every sector except construction. The evolution of FDI also proved very important in determining corporate default rates: FDI was a major financing channel for companies, especially during the analyzed period, and a market risk gauge such as index volatility should never be disregarded when analyzing default rates.

Keywords: corporate default rate, macroeconomic processes, economic imbalances, logistic regression, lagged effects
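The modelling idea described above, a default probability expressed as a logistic function of macro variables and estimated on fractional (0-1) data, can be sketched in a few lines. Everything below is hypothetical: the coefficients, the single macro driver, and the plain gradient-descent quasi-likelihood step stand in for whatever estimator and variable set the authors actually used.

```python
import math
import random

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

# Synthetic quarterly panel (invented numbers, not the paper's data):
# one macro driver x_t and a fractional default rate d_t in (0, 1).
random.seed(0)
true_b0, true_b1 = -3.0, 0.8
xs = [random.uniform(-2.0, 2.0) for _ in range(200)]
ds = [logistic(true_b0 + true_b1 * x) for x in xs]

# Fractional-logit fit: gradient descent on the Bernoulli quasi-likelihood,
# which remains well defined for fractional responses between 0 and 1.
b0 = b1 = 0.0
lr = 1.0
for _ in range(5000):
    g0 = g1 = 0.0
    for x, d in zip(xs, ds):
        err = logistic(b0 + b1 * x) - d  # fitted mean minus observed rate
        g0 += err
        g1 += err * x
    b0 -= lr * g0 / len(xs)
    b1 -= lr * g1 / len(xs)
```

On this noiseless toy data the fit recovers the generating coefficients; lagged macro variables, as in the paper, would simply enter as additional shifted columns.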

    Hunting for Stellar Coronal Mass Ejections

    Coronal mass ejections (CMEs) are explosive events that occur essentially daily on the Sun. These events are thought to play a crucial role in the angular momentum and mass loss of late-type stars, and also to shape the environment in which planets form and live. Stellar CMEs can be detected in optical spectra in the Balmer lines, especially in Halpha, as blue-shifted extra emission/absorption. To increase the detection probability one can monitor young open clusters, in which the stars, owing to their youth, are still rapid rotators and thus magnetically active and likely to exhibit a large number of CMEs. Using ESO facilities and the Nordic Optical Telescope, we have obtained time series of multi-object spectroscopic observations of late-type stars in six open clusters with ages ranging from 15 Myr to 300 Myr. Additionally, we have studied archival data of numerous active stars. These observations will allow us to obtain information on the occurrence rate of CMEs in late-type stars of different ages and spectral types. Here we report on the preliminary outcome of our studies. Comment: 6 pages, submitted to the proceedings of IAU Symposium 328 'Living Around Active Stars'
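The blue-shifted extra emission mentioned above translates directly into a line-of-sight velocity via the non-relativistic Doppler formula, v = c (λ_obs − λ_rest) / λ_rest. A minimal sketch (the 6 Å shift is an invented example, not a measured value):

```python
# Doppler velocity of a blue-shifted Halpha feature (non-relativistic).
C_KM_S = 299792.458    # speed of light [km/s]
HALPHA_REST = 6562.8   # Halpha rest wavelength [Angstrom]

def doppler_velocity(lambda_obs_angstrom):
    """Line-of-sight velocity in km/s; negative = blue-shifted (toward observer)."""
    return C_KM_S * (lambda_obs_angstrom - HALPHA_REST) / HALPHA_REST

# A hypothetical extra-emission centroid 6 Angstrom blueward of rest:
v = doppler_velocity(6556.8)   # about -274 km/s
```

Comparing such velocities with the star's escape velocity is what distinguishes a genuine ejection from material that falls back.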

    Degradation Analysis of DC-Link Aluminium Electrolytic Capacitors Operating in PWM Power Converters

    The most common failure mode of aluminium electrolytic capacitors is the so-called wear-out fault, caused by the high core temperature of the capacitor. Life cycle calculations therefore generally use temperature data to estimate the degradation level. Core-temperature-based life cycle calculations can account for different current loads on capacitors, using scaling factors for different ripple current waveforms. However, temperature is not solely responsible for aging: the current waveform also influences the level of degradation. Therefore, sinusoidal- and PWM-loaded capacitor tests were performed under the same temperature conditions. The results show that the pore distribution of the aluminium anode foil changed during the test. The pore diameter decreases, which leads to an increase in the ESR value and a decrease in the capacitance, electrolyte amount and weight. Comparative results show that the PWM-loaded capacitor degrades more than the capacitor loaded with a sinusoidal test current.
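The temperature-based life cycle calculation the abstract refers to is commonly written as the 10-K doubling rule: expected lifetime doubles for every 10 °C the core runs below the rated temperature. A sketch with hypothetical rated-life figures; note that the abstract's point is precisely that such temperature-only models miss the waveform-dependent degradation the test revealed.

```python
def expected_lifetime_hours(l0_hours, t_rated_c, t_core_c):
    """Classic 10-K doubling rule for electrolytic capacitors:
    lifetime doubles per 10 C of core temperature below the rated value."""
    return l0_hours * 2.0 ** ((t_rated_c - t_core_c) / 10.0)

# Hypothetical part rated 5000 h at 105 C; a core at 85 C gives 4x rated life.
life = expected_lifetime_hours(5000, 105, 85)   # 20000.0 hours
```

A waveform-aware model would multiply this estimate by a ripple-dependent scaling factor, which is exactly the term the paper's comparative test puts in question.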

    Cell lines and clearing approaches : a single-cell level 3D light-sheet fluorescence microscopy dataset of multicellular spheroids

    Nowadays, three-dimensional (3D) cell cultures are widely used in biological laboratories, and several optical clearing approaches have been proposed to visualize individual cells in the deepest layers of cancer multicellular spheroids. However, defining the most appropriate clearing approach for different cell lines is an open issue due to the lack of a gold-standard quantitative metric. In this article, we describe and share a single-cell resolution 3D image dataset of human carcinoma spheroids imaged using a light-sheet fluorescence microscope. The dataset contains 90 multicellular cancer spheroids derived from 3 cell lines (i.e. T-47D, 5-8F, and Huh-7D12) and cleared with 5 different protocols, namely ClearT, ClearT2, CUBIC, ScaleA2, and Sucrose. To evaluate the image quality and light penetration depth of the cleared 3D samples, all the spheroids were imaged under the same experimental conditions, labelling the nuclei with the DRAQ5 stain and using a Leica SP8 Digital LightSheet microscope. The clearing quality of this dataset was annotated by 10 independent experts, which allows microscopy users to qualitatively compare the effects of different optical clearing protocols on different cell lines. It is also an optimal testbed to quantitatively assess different computational metrics evaluating the image quality in the deepest layers of the spheroids. (C) 2021 The Author(s). Published by Elsevier Inc. Peer reviewed.

    A quantitative metric for the comparative evaluation of optical clearing protocols for 3D multicellular spheroids

    3D multicellular spheroids quickly emerged as in vitro models because they represent the in vivo tumor environment better than standard 2D cell cultures. However, with current microscopy technologies it is difficult to visualize individual cells in the deeper layers of 3D samples, mainly because of limited light penetration and scattering. To overcome this problem several optical clearing methods have been proposed, but defining the most appropriate clearing approach is an open issue due to the lack of a gold-standard metric. Here, we propose a guideline for 3D light microscopy imaging to achieve single-cell resolution. The guideline includes a validation experiment focusing on five optical clearing protocols. We review and compare seven quality metrics which quantitatively characterize the imaging quality of spheroids. As a test environment, we have created and shared a large 3D dataset including approximately one hundred fluorescently stained and optically cleared spheroids. Based on the results, we introduce a novel quality metric as a promising gold-standard candidate, applicable for comparing optical clearing protocols and deciding on the most suitable one for a particular experiment. (C) 2021 The Authors. Published by Elsevier B.V. on behalf of Research Network of Computational and Structural Biotechnology. Peer reviewed.
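One simple metric of the kind the paper compares can be sketched as a deep-to-shallow mean-intensity ratio of a z-stack: in a poorly cleared spheroid the signal decays with depth, so the ratio falls well below 1. The exponential toy volume below is invented, and this ratio is only illustrative of the metric family, not the paper's proposed gold standard.

```python
import math

def mean_intensity_profile(stack):
    """Mean voxel intensity for each z-slice of a stack[z][y][x] volume."""
    return [sum(sum(row) for row in sl) / (len(sl) * len(sl[0])) for sl in stack]

def penetration_ratio(profile):
    """Deep-to-shallow mean-intensity ratio; closer to 1.0 = better clearing."""
    return profile[-1] / profile[0]

# Toy 20-slice stack whose intensity decays exponentially with depth,
# mimicking light attenuation in an uncleared sample.
stack = [[[100.0 * math.exp(-0.1 * z)] * 8 for _ in range(8)] for z in range(20)]
ratio = penetration_ratio(mean_intensity_profile(stack))
```

On real data the same profile would be computed per nucleus or per masked region rather than over whole slices, which is where the paper's expert annotations become the benchmark.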

    Severity of rotator cuff disorders and additional load affect fluoroscopy-based shoulder kinematics during arm abduction.

    BACKGROUND Rotator cuff disorders, whether symptomatic or asymptomatic, may result in abnormal shoulder kinematics (scapular rotation and glenohumeral translation). This study aimed to investigate the effect of rotator cuff tears on in vivo shoulder kinematics during a 30° loaded abduction test using single-plane fluoroscopy. MATERIALS AND METHODS In total, 25 younger controls, 25 older controls and 25 patients with unilateral symptomatic rotator cuff tears participated in this study. Both shoulders of each participant were analysed and grouped on the basis of magnetic resonance imaging into healthy, rotator cuff tendinopathy, and asymptomatic and symptomatic rotator cuff tears. All participants performed a bilateral 30° arm abduction and adduction movement in the scapular plane with handheld weights (0, 2 and 4 kg) during fluoroscopy acquisition. The range of upward-downward scapular rotation and superior-inferior glenohumeral translation was measured and analysed during abduction and adduction using a linear mixed model (loads, shoulder types) with random effects (shoulder ID). RESULTS Scapular rotation was greater in shoulders with rotator cuff tendinopathy and asymptomatic rotator cuff tears than in healthy shoulders. Additional load increased upward scapular rotation during abduction and downward scapular rotation during adduction (P < 0.001 in all groups except rotator cuff tendinopathy). In healthy shoulders, upward scapular rotation during 30° abduction increased from 2.3° with a 0-kg load to 4.1° with a 4-kg load, and in shoulders with symptomatic rotator cuff tears from 3.6° with a 0-kg load to 6.5° with a 4-kg load. Glenohumeral translation was influenced by the handheld weights only in shoulders with rotator cuff tendinopathy (P ≤ 0.020). Overall, superior glenohumeral translation during 30° abduction was approximately 1.0 mm with all loads.
CONCLUSIONS Glenohumeral translation comparable to controls but greater scapular rotation during 30° abduction in the scapular plane indicates that, in shoulders with rotator cuff tears, the scapula compensates for rotator cuff deficiency by rotating. Further analysis of load-dependent joint stability is needed to better understand glenohumeral and scapular motion. LEVEL OF EVIDENCE Level 2. TRIAL REGISTRATION Ethical approval was obtained from the regional ethics committee (Ethics Committee Northwest Switzerland EKNZ 2021-00182), and the study was registered at clinicaltrials.gov on 29 March 2021 (trial registration number NCT04819724, https://clinicaltrials.gov/ct2/show/NCT04819724).
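The mixed-model idea used above (a fixed load effect plus a random per-shoulder effect) can be illustrated without a statistics package: a within-shoulder contrast between load conditions cancels the shoulder-specific random intercept, isolating the load slope. The simulation below is loosely inspired by the reported healthy-shoulder values (2.3° at 0 kg rising to about 4.1° at 4 kg, i.e. roughly 0.45°/kg), but the data are simulated, not the study's.

```python
import random

# Random-intercept model: rotation = b0 + b_load*load + u_shoulder + noise
random.seed(1)
b0, b_load = 2.3, 0.45          # hypothetical intercept [deg] and slope [deg/kg]
n_shoulders = 50
loads = [0, 2, 4]               # handheld weights [kg], as in the protocol
data = []
for i in range(n_shoulders):
    u = random.gauss(0.0, 1.0)  # shoulder-specific random intercept
    for w in loads:
        y = b0 + b_load * w + u + random.gauss(0.0, 0.2)
        data.append((i, w, y))

# Within-shoulder contrast removes u entirely:
# slope estimate = (rotation at 4 kg - rotation at 0 kg) / 4, averaged.
diffs = []
for i in range(n_shoulders):
    ys = {w: y for (j, w, y) in data if j == i}
    diffs.append((ys[4] - ys[0]) / 4.0)
est_slope = sum(diffs) / len(diffs)   # recovers roughly 0.45 deg/kg
```

A full mixed model additionally pools information across load levels and shoulder types, but the contrast shows why between-shoulder variability does not bias the load effect.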

    Load-induced glenohumeral translation after rotator cuff tears : protocol for an in vivo study

    Background: Rotator cuff tears are a common shoulder injury, but they sometimes remain undiagnosed, as symptoms can be limited. Altered shoulder biomechanics can lead to secondary damage and degeneration. In biomechanical analyses, the shoulder (ie, the glenohumeral joint) is normally idealized as a ball-and-socket joint, even though a translation is often observed clinically. To date, no conclusive changes in glenohumeral translation have been reported in patients with rotator cuff tears, and it is unknown how an additional handheld weight that is comparable to those used during daily activities will affect glenohumeral translations in patients with rotator cuff tears. Objective: This study aims to assess the load-induced glenohumeral translation (liTr) in patients with rotator cuff tears and its association with the load-induced changes in muscle activation (liMA). Methods: Patients and asymptomatic controls will be recruited. Participants will fill out health questionnaires and perform 30° arm abduction and adduction trials, during which they will hold different handheld weights of a maximum of 4 kg while motion capture and electromyographic data are collected. In addition, fluoroscopic images of the shoulders will be taken for the same movements. Isometric shoulder muscle strength for abduction and rotation will be assessed with a dynamometer. Finally, shoulder magnetic resonance images will be acquired to assess muscle status and injury presence. The dose-response relationship between additional weight, liTr, and liMA will be evaluated. Results: Recruitment and data collection began in May 2021, and they will last until the recruitment target is achieved. Data collection is expected to be completed by the end of 2022. As of November 2022, data processing and analysis are in progress, and the first results are expected to be submitted for publication in 2023. 
Conclusions: This study will aid our understanding of biological variation in liTr, the influence of disease pathology on liTr, the potential compensation for rotator cuff tears by muscle activation and size, and the association between liTr and patient outcomes. The outcomes will be relevant for diagnosis, treatment and rehabilitation planning in patients with rotator cuff tears.

    nucleAIzer : A Parameter-free Deep Learning Framework for Nucleus Segmentation Using Image Style Transfer

    Single-cell segmentation is a crucial task in image-based cellular analysis. We present nucleAIzer, a deep-learning approach aiming toward a truly general method for localizing 2D cell nuclei across a diverse range of assays and light microscopy modalities. We outperform the 739 methods submitted to the 2018 Data Science Bowl on images representing a variety of realistic conditions, some of which were not represented in the training data. The key to our approach is that during training nucleAIzer automatically adapts its nucleus-style model to unseen and unlabeled data, using image style transfer to automatically generate augmented training samples. This allows the model to recognize nuclei in new and different experiments efficiently without requiring expert annotations, making deep learning for nucleus segmentation fairly simple and labor-free for most biological light microscopy experiments. It can also be used online, is integrated into CellProfiler, and can be freely downloaded at www.nucleaizer.org. A record of this paper's transparent peer review process is included in the Supplemental Information. Peer reviewed.

    Dutch, Hungarian And German Dairy Farms Technical Efficiency Comparison

    The abolition of the dairy quota system in the EU is expected to increase competition among dairy farms in Europe. Assuming a common milk price in the EU, only the most efficient farms will survive in the new environment. The main objective of this paper is to compare the technical efficiency of dairy farms in Germany, the Netherlands and Hungary. In the first part of the research, efficiency is measured by partial efficiency indexes, i.e. one-dimensional efficiency measures. In the second part, Data Envelopment Analysis (DEA) is used to measure efficiency in a multidimensional space, with six inputs and two outputs. The results show that the most efficient farms are in the Netherlands, followed by Germany and Hungary. To mitigate the small-sample effect, we can assume a common frontier, which lowers the efficiency scores somewhat and makes the Hungarian results more reliable. With respect to the abolition of the dairy quota system, our results suggest that the Dutch farms are the most efficient and will probably expand production once the quota system ends; however, given the size of the country, no dramatic changes in the European dairy market are to be expected. The German farms' efficiency is lower, so no large increase in their dairy supply is expected either. The Hungarian dairy sector is not as efficient as the Dutch one and is also small by European standards, so if Hungarian farms want to survive the abolition of the quota system, they will have to increase their technical efficiency.
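The first-stage "partial efficiency index" approach can be sketched as a single output-to-input ratio normalised to the best-practice farm. The milk-yield figures below are invented for illustration; the paper's second-stage DEA, which handles six inputs and two outputs simultaneously, is not reproduced here.

```python
# Partial (single-factor) efficiency: one output divided by one input,
# normalised to the best-practice observation (hypothetical figures).
farms = {
    "NL": {"milk_t": 900, "cows": 100},
    "DE": {"milk_t": 800, "cows": 100},
    "HU": {"milk_t": 650, "cows": 100},
}
yield_per_cow = {k: v["milk_t"] / v["cows"] for k, v in farms.items()}
best = max(yield_per_cow.values())
partial_eff = {k: y / best for k, y in yield_per_cow.items()}
# The frontier farm scores 1.0; the others score their fraction of it.
```

The weakness the paper addresses is visible even here: a farm can look efficient on one ratio (milk per cow) while wasting another input (feed, labour), which is why the multidimensional DEA stage is needed.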
