ResumeNet: A Learning-based Framework for Automatic Resume Quality Assessment
Recruitment of appropriate people for certain positions is critical for any
company or organization. Manually screening large numbers of resumes to select
appropriate candidates can be exhausting and time-consuming. However,
there is no public tool that can be directly used for automatic resume quality
assessment (RQA). This motivates us to develop a method for automatic RQA.
Since there is also no public dataset for model training and evaluation, we
build a dataset for RQA by collecting around 10K resumes, which are provided by
a private resume management company. By investigating the dataset, we identify
some factors or features that could be useful to discriminate good resumes from
bad ones, e.g., the consistency between different parts of a resume. Then a
neural-network model is designed to predict the quality of each resume, where
some text processing techniques are incorporated. To deal with the label
deficiency issue in the dataset, we propose several variants of the model that
either utilize a pair/triplet-based loss or introduce semi-supervised learning
techniques to make use of the abundant unlabeled data.
Both the presented baseline model and its variants are general and easy to
implement. Various popular criteria including the receiver operating
characteristic (ROC) curve, F-measure and ranking-based average precision (AP)
are adopted for model evaluation. We compare the different variants with our
baseline model. Since there is no public algorithm for RQA, we further compare
our results with those obtained from a website that can score a resume.
Experimental results in terms of different criteria demonstrate the
effectiveness of the proposed method. We foresee that our approach would
transform the way human resources are managed in the future. Comment: ICD
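The pair/triplet-based loss mentioned above can be sketched in a few lines. This is a minimal illustration under assumed conditions, not the authors' implementation: the toy embeddings, the margin value, and the Euclidean distance are all assumptions.

```python
import math

def euclidean(u, v):
    # Euclidean distance between two embedding vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Penalize triplets where the positive (e.g. a good resume) is not at
    # least `margin` closer to the anchor than the negative (a bad resume).
    return max(0.0, euclidean(anchor, positive) - euclidean(anchor, negative) + margin)

# Toy embeddings: when the negative is far from the anchor, the loss is zero;
# when the negative sits almost as close as the positive, the loss is positive.
print(triplet_loss([0.0, 0.0], [1.0, 0.0], [3.0, 0.0]))  # 0.0
print(triplet_loss([0.0, 0.0], [1.0, 0.0], [1.5, 0.0]))  # 0.5
```

Training on such triplets lets the model learn a ranking of resumes even when absolute quality labels are scarce, which is the point of the pair/triplet variants.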
The Priority of Exploiting Fiscal Revenue or Lessening Public Expenditure: Evidence from China
Over the past 28 years, fiscal expenditure has exceeded fiscal revenue in every year except 2007, when revenue of 5,132.1 billion yuan exceeded expenditure of 4,978.1 billion yuan, producing a fiscal surplus; all other years show a public sector net cash requirement (PSNCR). In 2011 the deficit (the gap between fiscal expenditure and fiscal revenue) was 537.3 billion yuan, and the gap has widened with each passing year since: the fiscal deficit reached 2,368 billion yuan in 2015 and expanded to 3,754.4 billion yuan in 2018. To avoid continued growth of the deficit, this paper examines the causal relationship between China's fiscal revenue and public expenditure from 1990 to 2018. If increasing fiscal revenue reduces the deficit, the government should raise taxes; if, on the contrary, higher revenue drives public expenditure to expand further and the deficit to deteriorate continuously, expenditure cuts are needed. The direction of causality thus determines whether China's future fiscal policy should reduce the deficit by increasing fiscal revenue or by cutting public expenditure.
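The surplus and deficit figures quoted above follow from a simple difference. A quick check in Python, using the 2007 numbers from the abstract (in billions of yuan):

```python
def fiscal_balance(expenditure, revenue):
    # Positive value = deficit (expenditure exceeds revenue);
    # negative value = surplus.
    return expenditure - revenue

# 2007: revenue 5,132.1 > expenditure 4,978.1, i.e. a 154.0 billion yuan surplus.
balance_2007 = fiscal_balance(4978.1, 5132.1)
print(round(balance_2007, 1))  # -154.0 (surplus)
```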
Inelastic current-voltage characteristics of atomic and molecular junctions
We report first-principles calculations of the inelastic current-voltage
(I-V) characteristics of a gold point contact and a molecular junction in the
nonresonant regime. Discontinuities in the I-V curves appear in correspondence
to the normal modes of the structures. Due to the quasi-one-dimensional nature
of these systems, specific modes with large longitudinal component dominate the
inelastic I-V curves. In the case of the gold point contact, our results are in
good agreement with recent experimental data. For the molecular junction, we
find that the inelastic I-V curves are quite sensitive to the structure of the
contact between the molecule and the electrodes, thus providing a powerful tool
to extract the bonding geometry in molecular wires. Comment: 4 pages, 3 figures
An epidemiologic study of early biologic effects of benzene in Chinese workers.
Benzene is a recognized hematotoxin and leukemogen, but its mechanisms of action in humans are still uncertain. To provide insight into these processes, we carried out a cross-sectional study of 44 healthy workers currently exposed to benzene (median 8-hr time-weighted average, 31 ppm) and unexposed controls in Shanghai, China. Here we provide an overview of the study results on peripheral blood cell levels and somatic cell mutation frequency measured by the glycophorin A (GPA) gene loss assay, and report on peripheral cytokine levels. All peripheral blood cell levels (i.e., total white blood cells, absolute lymphocyte count, platelets, red blood cells, and hemoglobin) were decreased among exposed workers compared to controls, with the exception of the red blood cell mean corpuscular volume, which was higher among exposed subjects. In contrast, peripheral cytokine levels (interleukin-3, interleukin-6, erythropoietin, granulocyte colony-stimulating factor, tumor necrosis factor-alpha) in a subset of the most highly exposed workers (n = 11) were similar to values in controls (n = 11), suggesting that benzene does not affect these growth factor levels in peripheral blood. The GPA assay measures stem cell or precursor erythroid cell mutations expressed in peripheral red blood cells of MN heterozygous subjects, identifying NN variants, which result from loss of the GPA M allele and duplication of the N allele, and N phi variants, which arise from gene inactivation. The NN (but not N phi) GPA variant cell frequency was elevated in the exposed workers compared with controls (mean +/- SD, 13.9 +/- 8.4 versus 7.4 +/- 5.2 mutants per million cells, respectively; p = 0.0002), suggesting that benzene produces gene-duplicating but not gene-inactivating mutations at the GPA locus in bone marrow cells of exposed humans.
These findings, combined with ongoing analyses of benzene macromolecular adducts and chromosomal aberrations, will provide an opportunity to comprehensively evaluate a wide range of early biologic effects associated with benzene exposure in humans
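The NN-variant comparison above (13.9 +/- 8.4 versus 7.4 +/- 5.2 mutants per million cells) can be turned into a test statistic from the summary values alone. A sketch using Welch's t-statistic; the group sizes here are placeholders, since the abstract does not state how many MN-heterozygous subjects were scored:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    # Welch's t-statistic for two groups summarized by mean, SD, and size.
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Summary stats from the abstract; n = 22 per group is hypothetical.
t = welch_t(13.9, 8.4, 22, 7.4, 5.2, 22)
print(round(t, 2))
```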
The Economic Analysis of Carbon Emissions: Evidence from China
Transitioning away from coal supply is a cost-effective path to a low-carbon economy. Although many articles have considered manufacturers' production and emission of pollution, few papers have discussed the interrelations among CO2 emissions, economic growth and coal supply in terms of the cost of environmental degradation. This paper seeks to fill this gap by applying empirical tests, including unit root tests, the ARDL bounds test and impulse response analysis, to check the causality among carbon emissions, economic growth and coal supply. The time series used in the model range from 1990 to 2016, and we specifically take China as a case study. The main results show that there exists only a one-way positive causality from LCO2 (independent variable) to LGDP (dependent variable). In addition, we show that China's GDP growth in recent years has gradually decoupled from CO2 emissions; in other words, the current growth of China's economy does not come at the cost of worsening environmental degradation. Furthermore, we find that the generalized impulse response between LCO2 and LGDP is stronger than that between LCOALSUPPLY and LGDP.
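The decoupling claim can be made concrete with a standard elasticity-style ratio of growth rates, a common indicator in this literature though not necessarily the authors' exact measure; a ratio below 1 means emissions grow more slowly than GDP (relative decoupling):

```python
def decoupling_ratio(co2_prev, co2_now, gdp_prev, gdp_now):
    # Ratio of the CO2 growth rate to the GDP growth rate between two periods.
    co2_growth = (co2_now - co2_prev) / co2_prev
    gdp_growth = (gdp_now - gdp_prev) / gdp_prev
    return co2_growth / gdp_growth

# Illustrative numbers: CO2 up 2% while GDP is up 10% -> ratio 0.2,
# i.e. relative decoupling of emissions from growth.
print(decoupling_ratio(100.0, 102.0, 100.0, 110.0))
```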
The correlates of urinary albumin to creatinine ratio (ACR) in a high risk Australian Aboriginal community
Background: Albuminuria marks renal disease and cardiovascular risk. It was estimated to contribute 75% of the risk of all-cause natural death in one Aboriginal group. The urine albumin/creatinine ratio (ACR) is commonly used as an index of albuminuria. This study aims to examine the associations between demographic factors, anthropometric index, blood pressure, lipid-protein measurements and other biomarkers and albuminuria in a cross-sectional study in a high-risk Australian Aboriginal population. The models will be evaluated for albuminuria at or above the microalbuminuria threshold, and at or above the "overt albuminuria" threshold, with the potential to distinguish associations they have in common and those that differ.
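For reference, ACR is simply urinary albumin divided by urinary creatinine, and the two thresholds mentioned correspond to conventional microalbuminuria and overt-albuminuria cut-offs. A sketch using the widely quoted 30 and 300 mg/g cut-offs; the study's exact thresholds are not stated in the abstract, so these values are assumptions:

```python
def classify_acr(albumin_mg, creatinine_g):
    # ACR expressed as mg of albumin per g of creatinine, then binned
    # at the assumed 30 and 300 mg/g cut-offs.
    acr = albumin_mg / creatinine_g
    if acr < 30:
        return "normal"
    if acr <= 300:
        return "microalbuminuria"
    return "overt albuminuria"

print(classify_acr(20.0, 1.0))   # normal
print(classify_acr(100.0, 1.0))  # microalbuminuria
print(classify_acr(400.0, 1.0))  # overt albuminuria
```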
Universal quantum control of an atomic spin qubit on a surface
Scanning tunneling microscopy (STM) enables the bottom-up fabrication of tailored spin systems on a surface that are engineered with atomic precision. When combining STM with electron spin resonance (ESR), these single atomic and molecular spins can be controlled quantum-coherently and utilized as electron-spin qubits. Here we demonstrate universal quantum control of such a spin qubit on a surface by employing coherent control along two distinct directions, achieved with two consecutive radio-frequency (RF) pulses with a well-defined phase difference. We first show transformations of each Cartesian component of a Bloch vector on the quantization axis, followed by ESR-STM detection. Then we demonstrate the ability to generate an arbitrary superposition state of a single spin qubit by using two-axis control schemes, in which experimental data show excellent agreement with simulations. Finally, we present an implementation of two-axis control in dynamical decoupling. Our work extends the scope of STM-based pulsed ESR, highlighting the potential of this technique for quantum gate operations of electron-spin qubits on a surface
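Two-axis control of a single spin can be illustrated with plain 2x2 rotation matrices: an RF pulse of phase phi rotates the Bloch vector about the in-plane axis cos(phi) X + sin(phi) Y, so two pulses with a well-defined phase difference reach any superposition state. A minimal simulation, not tied to the experimental parameters of the paper:

```python
import cmath
import math

def rf_pulse(theta, phi):
    # Rotation by angle theta about the in-plane axis set by the RF phase phi:
    # R = cos(theta/2) I - i sin(theta/2) (cos(phi) X + sin(phi) Y).
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -1j * s * cmath.exp(-1j * phi)],
            [-1j * s * cmath.exp(1j * phi), c]]

def apply(gate, state):
    # Multiply a 2x2 gate into a 2-component state vector.
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# A pi/2 pulse about Y (phi = pi/2) takes |0> to an equal superposition:
# both basis states end up with probability 0.5.
state = apply(rf_pulse(math.pi / 2, math.pi / 2), [1, 0])
probs = [abs(a) ** 2 for a in state]
print(probs)
```

Composing two such pulses with different `phi` values gives rotations about two distinct axes, which is the universality argument the abstract relies on.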
Error, reproducibility and sensitivity : a pipeline for data processing of Agilent oligonucleotide expression arrays
Background
Expression microarrays are increasingly used to obtain large scale transcriptomic information on a wide range of biological samples. Nevertheless, there is still much debate on the best ways to process data, to design experiments and analyse the output. Furthermore, many of the more sophisticated mathematical approaches to data analysis in the literature remain inaccessible to much of the biological research community. In this study we examine ways of extracting and analysing a large data set obtained using the Agilent long oligonucleotide transcriptomics platform, applied to a set of human macrophage and dendritic cell samples.
Results
We describe and validate a series of data extraction, transformation and normalisation steps which are implemented via a new R function. Analysis of replicate normalised reference data demonstrates that intrarray variability is small (only around 2% of the mean log signal), while interarray variability from replicate array measurements has a standard deviation (SD) of around 0.5 log2 units (~6% of mean). The common practice of working with ratios of Cy5/Cy3 signal offers little further improvement in terms of reducing error. Comparison to expression data obtained using Arabidopsis samples demonstrates that the large number of genes in each sample showing a low level of transcription reflects the real complexity of the cellular transcriptome. Multidimensional scaling is used to show that the processed data identify an underlying structure which reflects some of the key biological variables that define the data set. This structure is robust, allowing reliable comparison of samples collected over a number of years and by a variety of operators.
Conclusions
This study outlines a robust and easily implemented pipeline for extracting, transforming, normalising and visualising transcriptomic array data from the Agilent expression platform. The analysis is used to obtain quantitative estimates of the SD arising from experimental (non-biological) intra- and interarray variability, and a lower threshold for determining whether an individual gene is expressed. The study provides a reliable basis for further, more extensive studies of the systems biology of eukaryotic cells.
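The published pipeline is an R function, but the core transform/normalise steps (log2 transform, per-array median centring, then a per-gene interarray SD) can be sketched language-agnostically. A Python illustration with toy signal values, not the published function:

```python
import math
import statistics

def normalise_arrays(arrays):
    # Log2-transform each array, then centre it on its own median so that
    # arrays remain comparable despite overall intensity differences.
    logged = [[math.log2(x) for x in arr] for arr in arrays]
    return [[v - statistics.median(arr) for v in arr] for arr in logged]

def interarray_sd(norm_arrays):
    # Per-gene standard deviation across replicate arrays.
    return [statistics.stdev(gene) for gene in zip(*norm_arrays)]

# Two replicate arrays differing only by a constant scale factor: after
# normalisation they agree gene-by-gene, so the interarray SD is zero.
norm = normalise_arrays([[4.0, 8.0, 16.0], [8.0, 16.0, 32.0]])
print(interarray_sd(norm))
```

On real replicate data the residual per-gene SD is exactly the experimental interarray variability the paper estimates at around 0.5 log2 units.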
Optimum spectral window for imaging of art with optical coherence tomography
Optical Coherence Tomography (OCT) has been shown to have potential for important applications in the field of art conservation and archaeology due to its ability to image subsurface microstructures non-invasively. However, its depth of penetration in painted objects is limited due to the strong scattering properties of artists’ paints. VIS-NIR (400 nm – 2400 nm) reflectance spectra of a wide variety of paints made with historic artists’ pigments have been measured. The best spectral window with which to use optical coherence tomography (OCT) for the imaging of subsurface structure of paintings was found to be around 2.2 μm. The same spectral window would also be most suitable for direct infrared imaging of preparatory sketches under the paint layers. The reflectance spectra from a large sample of chemically verified pigments provide information on the spectral transparency of historic artists’ pigments/paints as well as a reference set of spectra for pigment identification. The results of the paper suggest that broadband sources at ~2 microns are highly desirable for OCT applications in art and potentially material science in general
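Choosing the optimum window amounts to maximising average paint transparency over the candidate bands. A toy version of that selection; the transmittance values below are made up, while the abstract's actual finding is that the ~2.2 um band wins for the measured pigment spectra:

```python
def best_window(mean_transmittance):
    # Pick the spectral band whose average transmittance across the
    # measured paints is highest.
    return max(mean_transmittance, key=mean_transmittance.get)

# Hypothetical mean transmittance of a paint set per band (centre wavelength, nm).
bands = {800: 0.15, 1300: 0.30, 1700: 0.45, 2200: 0.60}
print(best_window(bands))  # 2200
```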