
    Userland CO-PAGER: boosting data-intensive applications with non-volatile memory, userspace paging

    With the emergence of low-latency non-volatile memory (NVM) storage, the software overhead incurred by the operating system becomes more prominent. The (monolithic) Linux kernel incorporates a complex I/O subsystem design, using redundant memory copies and expensive user/kernel context switches to perform I/O. Memory-mapped I/O, which internally uses demand paging, has recently become popular when paired with low-latency storage. It improves I/O performance by mapping DMA data transfers directly into userspace memory, removing the additional data copy between user and kernel space. However, for data-intensive applications, when there is insufficient physical memory, frequent page faults can still trigger expensive mode switches and I/O operations. To tackle this problem, we propose CO-PAGER, a lightweight userspace memory service. CO-PAGER consists of a minimal kernel module and a userspace component. The userspace component handles (redirected) page faults, performs memory management and I/O operations, and accesses NVM storage directly; the kernel module updates the memory mapping between user and kernel space. In this way CO-PAGER can bypass the deep kernel I/O stacks and provide a flexible, customizable, and efficient memory-paging service in userspace. We provide a general programming interface for the CO-PAGER service. In our experiments, we also demonstrate how the CO-PAGER approach can be applied to a MapReduce framework and improve performance for data-intensive applications.
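    The memory-mapped I/O path that CO-PAGER builds on can be illustrated with a minimal standard-library sketch (the file and its contents are hypothetical stand-ins for data on fast storage): mapping a file lets the application touch its bytes through demand paging rather than copying them through explicit read() calls into a user buffer.

    ```python
    import mmap
    import os
    import tempfile

    # Create a small file to stand in for data on low-latency (e.g. NVM) storage.
    fd, path = tempfile.mkstemp()
    os.write(fd, b"record-0001:payload")
    os.close(fd)

    with open(path, "r+b") as f:
        # Map the file into the process address space; subsequent accesses are
        # served by page faults and the page cache, not read() syscalls.
        with mmap.mmap(f.fileno(), 0) as mm:
            header = bytes(mm[:11])  # touching the page triggers demand paging

    os.unlink(path)
    print(header)
    ```

    CO-PAGER's contribution is what happens when such a mapping faults under memory pressure: the fault is redirected to a userspace handler instead of the kernel's paging path.
    
    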

    Encoder-Decoder-Based Intra-Frame Block Partitioning Decision

    The recursive intra-frame block partitioning decision process, a crucial component of the next-generation video coding standards, exerts significant influence over the encoding time. In this paper, we propose an encoder-decoder neural network (NN) to accelerate this process. Specifically, a CNN is utilized to compress the pixel data of the largest coding unit (LCU) into a fixed-length vector. Subsequently, a Transformer decoder is employed to transcribe the fixed-length vector into a variable-length vector, which represents the block partitioning outcomes of the encoding LCU. The vector transcription process adheres to the constraints imposed by the block partitioning algorithm. By fully parallelizing the NN prediction in the intra-mode decision, substantial time savings can be attained during the decision phase. Experimental results on high-definition (HD) sequence coding demonstrate that this framework achieves a remarkable 87.84% reduction in encoding time, with a relatively small coding-performance loss (8.09%) compared to AVS3 HPM4.0.

    Relationship between Tumor DNA Methylation Status and Patient Characteristics in African-American and European-American Women with Breast Cancer

    Aberrant DNA methylation is critical for the development and progression of breast cancer. We investigated the association between CpG island methylation in candidate genes and clinicopathological features in 65 African-American (AA) and European-American (EA) breast cancer patients. Quantitative methylation analysis was carried out on bisulfite-modified genomic DNA by pyrosequencing for the promoter CpG islands of the p16, ESR1, RASSF1A, RARβ2, CDH13, HIN1, and SFRP1 genes and the LINE1 repetitive element, using matched pairs of non-cancerous and breast tumor specimens (32 AA and 33 EA women). Five of the genes, all known tumor suppressor genes (RASSF1A, RARβ2, CDH13, HIN1 and SFRP1), were found to be frequently hypermethylated in breast tumor tissues but not in the adjacent non-cancerous tissues. Significant differences in CDH13 methylation status were observed between AA and EA patients, with more pronounced CDH13 methylation differences between the two patient groups in ER-negative disease and among young patients (age < 50). In addition, we observed associations between CDH13, SFRP1, and RASSF1A methylation and breast cancer subtypes, and between SFRP1 methylation and patient age. Furthermore, tumors that received neoadjuvant therapy tended to have reduced RASSF1A methylation compared with chemotherapy-naïve tumors. Finally, Kaplan-Meier survival analysis showed a significant association between methylation at three loci (RASSF1A, RARβ2 and CDH13) and reduced overall disease survival. In conclusion, the DNA methylation status of breast tumors was found to be significantly associated with clinicopathological features and the race/ethnicity of the patients.

    Creation and Simulation of Methodologies for the Analysis, Classification and Integration of New Requirements into Proprietary Software

    The prioritization of new requirements to be implemented in proprietary software is fundamental to its maintenance, to preserving its quality, and to observing the business rules and standards of the company. Although prioritization tools based on proven and recognized techniques exist, they require each requirement to be rated beforehand. When the company receives requests from several customers of the same product, the factors affecting the company multiply; the available tools do not cover these aspects and make the rating task much more complex. This research work comprises a survey of the methods for prioritizing and selecting new requirements used by companies in the Rosario area, and the definition of a methodology for selecting a new requirement, which involves analyzing and evaluating all of its implications for the software product and the company while respecting its business rules. The methodology created leads to the definition of processes for building a tool for rating and prioritizing new requirements in proprietary software that receives requests from several customers at the same time, with rating instruments that consider all related aspects; the tool will provide current prioritization techniques and produce reports customized to different perspectives of the company.

    Track: Software Engineering. Red de Universidades con Carreras en Informática (RedUNCI)

    How to Participate in Japanese Academic Conferences <International Student Learning Support Week>

    International Student Learning Support Week. Organizer: Kyoto University Library Network. Date and time: 2021.11.10, 16:45-17:15 and 18:25-18:55. Venue: held online via Zoom.

    The Impact of Economic Policy Uncertainty on Industrial Output: The Regulatory Role of Technological Progress

    Since the 2008 financial crisis, EPU has become an important issue for the stable and healthy development of the economy and society. Existing research has not analyzed the nonlinear impact of economic policy uncertainty (EPU) on output at the industry level, and it has also ignored the regulatory role of technological progress in the impact of EPU on economic growth. Based on panel data on China's industries from 2005 to 2017, this paper carries out an empirical analysis of the nonlinear impact of EPU on industrial output. The results show that: (1) Unlike existing research, this paper finds that EPU has a significant inverted U-shaped nonlinear effect on industrial output, with output growth maximized when the EPU index is close to 221. This paper is also the first to find that technological progress has a positive regulatory effect on the impact of EPU on industrial output: technological progress can promote industrial output when EPU is low, and it can reduce the adverse impact of economic policy fluctuations when the EPU index is high. (2) The regulatory effect of technological progress exists only in industries dominated by state-owned enterprises, and the impact of EPU on the output of industries dominated by non-state-owned enterprises is greater than on those dominated by state-owned enterprises. (3) The impact of EPU on the output of cyclical industries shows a significant inverted U shape, but there is no regulatory effect of technological progress; its impact on the output of non-cyclical industries is not significant, but it works together with technological progress. (4) The influence of EPU on the output of the tertiary industry is characterized by an inverted U shape, in which technological progress can play a positive regulatory role; however, its impact on the output of the primary and secondary industries is not significant.
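    The inverted U-shaped relationship implies a turning point. Under a quadratic specification y = a + b·EPU + c·EPU² with c < 0, output peaks at EPU = −b/(2c). A minimal sketch with hypothetical coefficients (chosen only to reproduce a peak near the reported index of 221; they are not the paper's estimates):

    ```python
    # Hypothetical quadratic fit: output = a + b*epu + c*epu**2, with c < 0
    # so the curve is an inverted U. Values are illustrative only.
    a, b, c = 100.0, 0.884, -0.002

    def output(epu: float) -> float:
        return a + b * epu + c * epu ** 2

    # The derivative b + 2*c*epu vanishes at the turning point.
    peak_epu = -b / (2 * c)
    print(peak_epu)  # 221.0 with these illustrative coefficients
    ```

    Beyond this turning point, further increases in the EPU index reduce output, which is the sense in which moderate uncertainty is "best for output growth".
    
    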

    An Interactive System Based on the IASP91 Earth Model for Earthquake Data Processing

    System software for interactive human–computer data processing based on the IASP91 Earth model was designed. An interactive data processing system for visualizing earthquake data was designed and implemented on the Intel Fortran platform. The system reads and processes broadband seismic data acquired by field stations, covering the reading and import of raw data, pre-processing, seismic phase identification, and cross-correlation travel-time picking. During development, shortcomings in the data processing step were addressed and functions were gradually refined and enhanced, making data processing easier and faster. The system has already processed more than 1000 large seismic events received by the station from 2013 to 2018. Practical application shows that the human–computer interaction system is easy to operate, accurate, fast and flexible, and is an effective tool for processing seismic data.

    Accurate Redetermination of the Focal Depth by Using the Time Intervals between the Inner Core Phases PKIKP and pPKIKP

    The hypocenter parameters of an earthquake may give us insight into the Earth's structure and tectonic processes. Among the hypocenter parameters, the focal depth is normally more difficult to estimate than the earthquake location (latitude and longitude). We propose using the pPKIKP-PKIKP arrival time intervals to estimate the focal depth. We analyze the sensitivity of the pPKIKP-PKIKP arrival time interval to the earthquake depth, measure the interval on seismograms (the vertical component), and invert the time interval data set using a simulated annealing inversion algorithm. We illustrate the inversion approach on two teleseismic earthquakes, one shallow and one deep, and demonstrate that the approach is indeed appropriate for both shallow and deep events. A reliable estimate of the focal depth can be obtained even when seismic stations are sparse or far from the epicenter.
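    The inversion step described above can be sketched with a toy model. The travel-time function below is a hypothetical stand-in for IASP91-based pPKIKP-PKIKP intervals (in reality these come from ray tracing through the Earth model); simulated annealing then searches for the depth that best fits the observed interval.

    ```python
    import math
    import random

    random.seed(0)

    def interval(depth_km: float) -> float:
        # Hypothetical stand-in for the IASP91-based pPKIKP-PKIKP interval:
        # the surface-reflected pP leg lengthens roughly with source depth,
        # assuming a ~8 km/s effective near-source P velocity.
        return 2.0 * depth_km / 8.0  # seconds

    true_depth = 120.0
    observed = interval(true_depth) + random.gauss(0.0, 0.05)  # noisy pick

    def misfit(depth_km: float) -> float:
        return (interval(depth_km) - observed) ** 2

    # Plain simulated annealing over candidate depths (0-700 km).
    depth, temp = 300.0, 50.0
    best = depth
    for _ in range(5000):
        cand = min(max(depth + random.gauss(0.0, 5.0), 0.0), 700.0)
        d = misfit(cand) - misfit(depth)
        if d < 0 or random.random() < math.exp(-d / temp):
            depth = cand
            if misfit(depth) < misfit(best):
                best = depth
        temp *= 0.999  # geometric cooling schedule

    print(round(best, 1))  # converges near the true 120 km depth
    ```

    With real data the misfit would sum over several stations' picked intervals, which is what makes the method usable even with sparse station coverage.
    
    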