A lidar for detecting atmospheric turbulence based on modified Von Karman turbulence power spectrum
Introduction: Atmospheric turbulence is a kind of random vortex motion. A series of turbulence effects, such as fluctuations of light intensity, occur when a laser propagates through atmospheric turbulence. Methods: To verify the feasibility of detecting atmospheric turbulence with a Mie-scattering lidar, the modified Von Karman turbulent phase screen is first simulated with the power spectrum method, using the Zernike polynomial method for low-frequency compensation. The accuracy of the method is verified by comparing the obtained phase structure function with the theoretical value. The transmission of the Gaussian beam from the Mie-scattering lidar through the phase screen is then simulated, and the transmission characteristics of the beam under modified Von Karman turbulence are obtained by analyzing the fluctuation of light intensity. Guided by the simulation analysis, a Mie-scattering lidar system for detecting the intensity of atmospheric turbulence was developed in the Yinchuan area, and the atmospheric turbulence profile was inverted from the detected scintillation index. Results: The results show that it is feasible to use the Zernike polynomial method to perform the low-frequency compensation, and that low-order compensation performs better than high-order compensation. The simulated scintillation index is consistent with the actual detection result and has high accuracy, indicating that atmospheric turbulence detection using a Mie-scattering lidar is effective. Conclusion: These simulations and experiments provide significant guidance for similar lidars used to detect atmospheric turbulence.
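The phase-screen generation step described above follows a standard recipe. As a minimal, hedged illustration (not the authors' code), the sketch below generates a modified von Karman phase screen with the FFT power-spectrum method in Python; the parameter names (r0, L0, l0), the spectrum normalization (taken from Schmidt's widely used formulation), and the grid settings are assumptions, and the Zernike-polynomial low-frequency compensation used in the paper is omitted.

```python
import numpy as np

def modified_von_karman_phase_screen(N, delta, r0, L0, l0, seed=None):
    """One random phase screen (radians) from the modified von Karman
    spectrum via the FFT power-spectrum method.
    N: grid points per side, delta: grid spacing (m), r0: Fried parameter (m),
    L0: outer scale (m), l0: inner scale (m)."""
    rng = np.random.default_rng(seed)
    df = 1.0 / (N * delta)                              # frequency-grid spacing (1/m)
    fx = np.fft.fftshift(np.fft.fftfreq(N, d=delta))
    fx, fy = np.meshgrid(fx, fx)
    f = np.sqrt(fx**2 + fy**2)
    fm = 5.92 / l0 / (2.0 * np.pi)                      # inner-scale cutoff frequency
    f0 = 1.0 / L0                                       # outer-scale frequency
    # modified von Karman phase power spectral density
    psd_phi = (0.023 * r0 ** (-5.0 / 3.0) * np.exp(-(f / fm) ** 2)
               / (f ** 2 + f0 ** 2) ** (11.0 / 6.0))
    psd_phi[N // 2, N // 2] = 0.0                       # suppress the piston term
    # complex Gaussian random field shaped by the spectrum
    cn = ((rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))
          * np.sqrt(psd_phi) * df)
    return np.real(np.fft.ifftshift(np.fft.ifft2(np.fft.ifftshift(cn)))) * N ** 2

# example: 512 x 512 screen, 2 mm sampling, r0 = 0.1 m, L0 = 50 m, l0 = 5 mm
screen = modified_von_karman_phase_screen(512, 2e-3, 0.1, 50.0, 5e-3, seed=1)
```

Because the FFT grid undersamples the lowest spatial frequencies, a screen generated this way underestimates large-scale phase fluctuations, which is exactly why the paper adds Zernike-based low-frequency compensation before checking the phase structure function against theory.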
Reliable Detection of Myocardial Ischemia Using Machine Learning Based on Temporal-Spatial Characteristics of Electrocardiogram and Vectorcardiogram
Background: Myocardial ischemia is a common early symptom of cardiovascular disease (CVD). Reliable detection of myocardial ischemia using computer-aided analysis of electrocardiograms (ECG) provides an important reference for early diagnosis of CVD. The vectorcardiogram (VCG) could improve the performance of ECG-based myocardial ischemia detection by affording temporal-spatial characteristics related to myocardial ischemia and capturing subtle changes in the ST-T segment over continuous cardiac cycles. We aim to investigate whether the combination of ECG and VCG could improve the performance of machine learning algorithms in automatic myocardial ischemia detection. Methods: The ST-T segments of 20-second, 12-lead ECGs and VCGs were extracted from 377 patients with myocardial ischemia and 52 healthy controls. Then, sample entropy (SampEn, of the 12 ECG leads and of the three VCG leads), the spatial heterogeneity index (SHI, of the VCG), and the temporal heterogeneity index (THI, of the VCG) were calculated. Using a grid search, four SampEn features and two features were selected as inputs for the ECG-only and VCG-only models based on support vector machines (SVM), respectively. Similarly, three features (S(I), THI, and SHI, where S(I) is the SampEn of lead I) were further selected for the ECG + VCG model. Five-fold cross-validation was used to assess the performance of the ECG-only, VCG-only, and ECG + VCG models. To fully evaluate the generalization ability of the algorithm, the model with the best performance was selected and tested on a third independent dataset of 148 patients with myocardial ischemia and 52 healthy controls. Results: The ECG + VCG model with three features (S(I), THI, and SHI) yields better classification results than the ECG-only and VCG-only models, with an average accuracy of 0.903, sensitivity of 0.903, specificity of 0.905, F1 score of 0.942, and AUC of 0.904, showing better performance with fewer features compared with existing works. On the third independent dataset, the testing showed an AUC of 0.814. Conclusion: The SVM algorithm based on the ECG + VCG model could reliably detect myocardial ischemia, providing a potential tool to assist cardiologists in the early diagnosis of CVD in routine screening during primary care services.
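As a rough illustration of the classification pipeline described above (not the authors' implementation), the sketch below computes sample entropy for a single series and evaluates an SVM with five-fold cross-validation using scikit-learn; the placeholder feature matrix stands in for the real [S(I), THI, SHI] features, and the SampEn parameters (m = 2, r = 0.2·std) are conventional assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

def sample_entropy(x, m=2, r=0.2):
    """SampEn of a 1-D series (e.g. an ST-T segment) with embedding
    dimension m and tolerance r * std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    def pair_count(dim):
        t = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return (np.sum(d <= tol) - len(t)) / 2        # exclude self-matches
    b, a = pair_count(m), pair_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# placeholder features: in practice each row would be [SampEn(lead I), THI, SHI]
rng = np.random.default_rng(0)
X = rng.random((100, 3))
y = rng.integers(0, 2, 100)          # 0 = healthy control, 1 = myocardial ischemia

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print("five-fold AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))
```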
Entropy-based reliable non-invasive detection of coronary microvascular dysfunction using machine learning algorithm
Purpose:
Coronary microvascular dysfunction (CMD) is emerging as an important cause of myocardial ischemia, but there is a lack of a non-invasive method for reliable early detection of CMD.
Aim:
To develop an electrocardiogram (ECG)-based machine learning algorithm for CMD detection that will lay the groundwork for patient-specific non-invasive early detection of CMD.
Methods:
Vectorcardiography (VCG) was calculated from each 10-second ECG of CMD patients and healthy controls. Sample entropy (SampEn), approximate entropy (ApEn), and the complexity index (CI) derived from multiscale entropy were extracted from the ST-T segments of each lead in the ECGs and VCGs. The most effective entropy subset was determined using the sequential backward selection algorithm under the intra-patient and inter-patient schemes separately (a minimal sketch of this selection step is given after this abstract). Then, the corresponding optimal model was selected from eight machine learning models for each entropy feature based on five-fold cross-validation. Finally, the classification performance of the SampEn-based, ApEn-based, and CI-based models was comprehensively evaluated and tested on a testing dataset to identify the best one under each scheme.
Results:
The ApEn-based SVM model was validated as the optimal one under the intra-patient scheme, with all testing evaluation metrics over 0.8. Similarly, the ApEn-based SVM model was selected as the best one under the inter-patient scheme, with the major evaluation metrics over 0.8.
Conclusions:
Entropies derived from ECGs and VCGs can effectively detect CMD under both intra-patient and inter-patient schemes. Our proposed models may provide the possibility of an ECG-based tool for non-invasive detection of CMD.
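Purely as an illustration of the feature-selection and model-selection steps referenced above (and not the authors' exact eight-model comparison), the following sketch runs sequential backward selection with scikit-learn and then compares a few candidate classifiers by five-fold cross-validation; the placeholder data, the number of retained features, and the candidate model list are assumptions.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

# placeholder data: rows = recordings, columns = entropy features
# (e.g. ApEn of each ECG/VCG lead); labels: 0 = control, 1 = CMD
rng = np.random.default_rng(0)
X = rng.random((120, 15))
y = rng.integers(0, 2, 120)

# sequential backward selection of the most effective entropy subset
selector = SequentialFeatureSelector(
    SVC(kernel="rbf"), n_features_to_select=5, direction="backward", cv=5)
X_sel = selector.fit_transform(X, y)

# choose the best candidate model by five-fold cross-validation
candidates = {
    "SVM": SVC(kernel="rbf"),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in candidates.items():
    auc = cross_val_score(model, X_sel, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```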
Large expert-curated database for benchmarking document similarity detection in biomedical literature search
Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved, and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180 000 PubMed-listed articles with regard to their respective seed (input) article(s). The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields, or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency, and PubMed Related Articles) had similar overall performance. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for downloading the annotation data and for the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new, powerful techniques for title- and title/abstract-based search engines for relevant articles in biomedical research.
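For readers who want to reproduce the flavor of the two lexical baselines named above (TF-IDF and Okapi BM25), here is a minimal sketch on a toy corpus; it is not the RELISH evaluation pipeline, and the rank-bm25 package and toy documents are assumptions introduced only for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from rank_bm25 import BM25Okapi        # pip install rank-bm25

corpus = [
    "myocardial ischemia detection from ECG and VCG features",
    "atmospheric turbulence measurement with Mie-scattering lidar",
    "aerosol particle size distribution retrieval from optical depth",
]
seed = "machine learning detection of myocardial ischemia from ECG"

# TF-IDF baseline: cosine similarity between the seed article and candidates
vec = TfidfVectorizer(stop_words="english")
mat = vec.fit_transform(corpus + [seed])
tfidf_scores = cosine_similarity(mat[len(corpus)], mat[:len(corpus)]).ravel()

# Okapi BM25 baseline over the same toy corpus
bm25 = BM25Okapi([doc.lower().split() for doc in corpus])
bm25_scores = bm25.get_scores(seed.lower().split())

for doc, s_tfidf, s_bm25 in zip(corpus, tfidf_scores, bm25_scores):
    print(f"TF-IDF {s_tfidf:.3f} | BM25 {s_bm25:.3f} | {doc}")
```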
Observation and Inversion of Aerosol Particle Size Distribution over Yinchuan Area
Particle size distribution is one of the important microphysical parameters for characterizing aerosol properties. The aerosol optical depth is used as a function of wavelength to study the particle size distribution of the whole atmospheric column. However, the inversion equation relating the particle size distribution to the aerosol optical depth is a Fredholm integral equation of the first kind, which is usually ill-conditioned. To overcome this drawback, the integral equation is first discretized directly by using the composite trapezoid formula. Then, the corresponding regularization parameters are selected by the L-curve method. Finally, the truncated singular value decomposition (TSVD) regularization method is employed to regularize the discretized equation and retrieve the particle size distribution. To verify the feasibility of the algorithm, the aerosol optical depths taken by a CE318 sun photometer over the Yinchuan area in four seasons, as well as on hazy, sunny, floating-dust, and blowing-dust days, were used to retrieve the particle size distribution. To assess the effect of the TSVD algorithm, the Tikhonov regularization algorithm was also adopted to retrieve the aerosol particle size distribution. Comparison of the errors of the two regularizations shows that the TSVD regularization algorithm has a better retrieval effect. Moreover, to understand intuitively the sources of the aerosol particles, backward trajectories were used to track their origin. The experimental results show that the TSVD regularization method is an effective method for retrieving the particle size distribution from aerosol optical depth.
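To make the inversion recipe above concrete, here is a minimal sketch of truncated-SVD regularization for a discretized first-kind Fredholm equation A x = b (A would hold the trapezoid-weighted extinction kernel, b the measured optical depths); the simple curvature-based L-curve selection, the toy matrix, and the function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Truncated-SVD solution of A x = b, keeping the k largest singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    k = min(k, int(np.sum(s > 0)))
    # x_k = sum_{i<k} (u_i . b / sigma_i) v_i
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

def pick_k_by_lcurve(A, b):
    """Crude L-curve criterion: pick the truncation level with maximum
    curvature of the (log residual norm, log solution norm) curve."""
    ks = list(range(1, min(A.shape) + 1))
    res, sol = [], []
    for k in ks:
        x = tsvd_solve(A, b, k)
        res.append(np.log(np.linalg.norm(A @ x - b) + 1e-30))
        sol.append(np.log(np.linalg.norm(x) + 1e-30))
    dx, dy = np.gradient(res), np.gradient(sol)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    curvature = (dx * ddy - dy * ddx) / (dx**2 + dy**2 + 1e-30) ** 1.5
    return ks[int(np.argmax(curvature))]

# toy ill-conditioned example standing in for the discretized extinction kernel
rng = np.random.default_rng(0)
A = rng.random((8, 40)) @ np.diag(np.logspace(0, -6, 40))
x_true = np.exp(-np.linspace(0, 3, 40))          # stand-in size distribution
b = A @ x_true + 1e-4 * rng.standard_normal(8)
x_retrieved = tsvd_solve(A, b, pick_k_by_lcurve(A, b))
```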
Dust Particle Size Distribution Inversion Based on the Multi Population Genetic Algorithm
The aerosol number size distribution is the main parameter for characterizing aerosol optical and physical properties, and it has a major influence on radiative forcing. To address some disadvantages of the traditional methods, a method based on the multi-population genetic algorithm (MPGA) is proposed and employed to retrieve the size distribution of dust particles. The MPGA principles and design are presented in detail, and the MPGA shows better performance compared with conventional methods. To verify the feasibility of the inversion method, measured aerosol optical thickness (AOT) data of dust particles taken by a sun photometer are used, and a series of comparisons between the simple genetic algorithm (SGA) and the MPGA are carried out. The results show that the MPGA has better properties than the SGA, with smaller inversion errors, a smaller population size, and fewer generations needed to retrieve the aerosol size distribution. The MPGA inversion method is further analyzed for a background day, a dust storm event, and seasonal size distributions. The method proposed in this study has important applications and reference value for aerosol particle size distribution inversion.
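As an illustrative sketch of the multi-population idea (independent sub-populations with periodic migration of their best members), and not the paper's exact MPGA design, the following Python function minimizes a user-supplied inversion error; the operator choices (arithmetic crossover, uniform mutation, ring migration) and all parameter values are assumptions. In practice, fitness(x) would measure the misfit between the measured AOT spectrum and the AOT reconstructed from a parameterized size distribution x.

```python
import numpy as np

def mpga_minimize(fitness, bounds, n_pops=4, pop_size=30,
                  n_gen=200, migrate_every=10, seed=0):
    """Minimal multi-population GA: sub-populations evolve independently
    and periodically pass their best member to the next sub-population."""
    rng = np.random.default_rng(seed)
    low, high = (np.asarray(v, dtype=float) for v in bounds)
    dim = low.size
    pops = [rng.uniform(low, high, (pop_size, dim)) for _ in range(n_pops)]

    def evolve(pop):
        f = np.array([fitness(x) for x in pop])
        pop = pop[np.argsort(f)]               # rank: lower error is better
        new = [pop[0].copy()]                  # elitism: keep the best member
        while len(new) < pop_size:
            i, j = rng.integers(0, pop_size // 2, 2)      # parents from better half
            alpha = rng.random(dim)
            child = alpha * pop[i] + (1 - alpha) * pop[j]  # arithmetic crossover
            mask = rng.random(dim) < 0.1                   # uniform mutation
            child[mask] = rng.uniform(low[mask], high[mask])
            new.append(np.clip(child, low, high))
        return np.array(new)

    for gen in range(n_gen):
        pops = [evolve(p) for p in pops]
        if gen % migrate_every == 0:           # ring migration of the best members
            bests = [min(p, key=fitness) for p in pops]
            for k in range(n_pops):
                pops[k][-1] = bests[(k - 1) % n_pops]
    return min((min(p, key=fitness) for p in pops), key=fitness)
```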