Using Two-stage Network to Segment Kidneys and Kidney Tumors
There are many new cases of kidney cancer each year, and surgery is the most common treatment. To assist doctors in surgical planning, an accurate and automatic kidney and tumor segmentation method is helpful in clinical practice. In this paper, we propose a deep learning framework for the segmentation of kidneys and tumors in abdominal CT images. The key idea is a two-stage strategy. First, for each case, we use a 3D U-shaped convolutional network to localize each kidney. Then, a second 3D U-shaped convolutional network produces a precise segmentation of each kidney. Finally, the results are merged to obtain the complete segmentation. We also apply several tricks to improve performance.
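The paper's own networks are not reproduced here; as a rough sketch of the two-stage idea, the following uses simple thresholding stubs in place of the two 3D U-shaped networks. The threshold, the bounding-box localization, and the merge step are our assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def coarse_localize(volume):
    """Stage-1 stand-in: return a bounding box (zmin, zmax, ymin, ymax, xmin, xmax)
    around voxels above a crude intensity threshold (a real pipeline would use
    the first 3D U-shaped network here)."""
    idx = np.argwhere(volume > 0.5)
    lo, hi = idx.min(axis=0), idx.max(axis=0) + 1
    return tuple(np.stack([lo, hi], axis=1).ravel())

def fine_segment(patch):
    """Stage-2 stand-in: per-voxel 'segmentation' by thresholding (a real
    pipeline would run the second 3D U-shaped network on the cropped patch)."""
    return (patch > 0.5).astype(np.uint8)

def two_stage_segment(volume):
    """Localize, crop, segment the cropped region, then merge back into
    a full-size label volume."""
    z0, z1, y0, y1, x0, x1 = coarse_localize(volume)
    seg = np.zeros(volume.shape, dtype=np.uint8)
    seg[z0:z1, y0:y1, x0:x1] = fine_segment(volume[z0:z1, y0:y1, x0:x1])
    return seg
```

Cropping to the stage-1 box lets stage 2 work at higher effective resolution on a much smaller volume, which is the main motivation for the cascade.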
Rethinking GNN-based Entity Alignment on Heterogeneous Knowledge Graphs: New Datasets and A New Method
The development of knowledge graph (KG) applications has led to a rising need
for entity alignment (EA) between heterogeneous KGs that are extracted from
various sources. Recently, graph neural networks (GNNs) have been widely
adopted in EA tasks due to their impressive ability to capture structural
information. However, we have observed that the oversimplified settings of the
existing common EA datasets are distant from real-world scenarios, which
obstructs a full understanding of the advancements achieved by recent methods.
This phenomenon makes us ponder: Do existing GNN-based EA methods really make
great progress?
In this paper, to study the performance of EA methods in realistic settings,
we focus on the alignment of highly heterogeneous KGs (HHKGs) (e.g., event KGs
and general KGs), which differ in scale and structure and share fewer
overlapping entities. First, we strip away the unreasonable
settings, and propose two new HHKG datasets that closely mimic real-world EA
scenarios. Then, based on the proposed datasets, we conduct extensive
experiments to evaluate previous representative EA methods, and reveal
interesting findings about the progress of GNN-based EA methods. We find that
the structural information becomes harder to exploit but remains valuable in
aligning HHKGs. This phenomenon leads to the inferior performance of existing EA
methods, especially GNN-based methods. Our findings shed light on the potential
problems resulting from an impulsive application of GNN-based methods as a
panacea for all EA datasets. Finally, we introduce a simple but effective
method: Simple-HHEA, which comprehensively utilizes entity name, structure, and
temporal information. Experiment results show Simple-HHEA outperforms previous
models on HHKG datasets.
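As an illustrative sketch only: Simple-HHEA's real features come from pretrained name embeddings, graph structure, and temporal information, with a learned combination; the random arrays, plain concatenation, and cosine nearest-neighbor matching below are our stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def entity_embedding(name_feat, struct_feat, time_feat):
    """Combine the three feature views into one vector (simple concatenation
    as a placeholder for a learned fusion)."""
    return np.concatenate([name_feat, struct_feat, time_feat])

def align(src, tgt):
    """Greedy alignment: match each source entity to its nearest target
    entity by cosine similarity."""
    s = src / np.linalg.norm(src, axis=1, keepdims=True)
    t = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    return (s @ t.T).argmax(axis=1)

# Toy example: two KGs sharing 3 entities, where the target-side embeddings
# are noisy copies of the source-side ones.
src = np.stack([
    entity_embedding(rng.normal(size=4), rng.normal(size=4), rng.normal(size=2))
    for _ in range(3)
])
tgt = src + 0.01 * rng.normal(size=src.shape)
```

On highly heterogeneous KGs the structural view alone is weak, which is why combining it with name and temporal signals matters.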
Blow-up of solutions for a nonlinear Petrovsky type equation with initial data at arbitrary high energy level
In this paper, we study the initial boundary value problem for a Petrovsky type equation with a memory term, nonlinear weak damping, and a superlinear source:

u_tt + Δ²u − ∫₀ᵗ g(t−τ) Δ²u(τ) dτ + |u_t|^{m−2} u_t = |u|^{p−2} u,  in Ω × (0, T).

When the source is stronger than the dissipations, we obtain the existence of certain weak solutions that blow up in finite time with initial energy E(0) = R for any given R ≥ 0.
Optimum Design of Structural Parameters for Thin-walled Metal Container
To further increase the volume and reduce the weight and manufacturing cost, the key structural parameters of a thin-walled metal packing container are optimized. The instability conditions under circumferential external pressure and axial load are analyzed, and a mathematical model is constructed with critical instability strength as the constraint and maximum volume and minimum mass as the objectives. A multi-objective optimization method with nonlinear constraints is used to solve for the key structural parameters, such as wall thickness, diameter, and height, and the optimization result is computed with the fgoalattain() function in the MATLAB Optimization Toolbox. An instability pressure test system is constructed, and the instability pressure of the optimized thin-walled metal packing container is measured. The results show that the instability pressure is higher than 120 kPa, which is better than the design index.
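The paper uses MATLAB's fgoalattain(); a minimal Python sketch of the same goal-attainment formulation (minimize γ subject to F_i(x) − w_i·γ ≤ goal_i, plus the nonlinear instability constraint) can be written with scipy.optimize.minimize. The material constants, the simplified buckling formula, and the goal/weight values below are illustrative placeholders, not the paper's model:

```python
import numpy as np
from scipy.optimize import minimize

RHO = 7850.0    # steel density, kg/m^3 (assumed)
E = 200e9       # Young's modulus, Pa (assumed)
P_MIN = 120e3   # required critical instability pressure, Pa (the design index)

def volume(x):  # inner volume of a cylindrical container, m^3
    t, d, h = x
    return np.pi * (d / 2) ** 2 * h

def mass(x):    # thin-shell mass approximation (wall plus two ends), kg
    t, d, h = x
    return RHO * t * (np.pi * d * h + 2 * np.pi * (d / 2) ** 2)

def p_crit(x):  # simplified external-pressure buckling estimate (placeholder)
    t, d, h = x
    return 2.42 * E * (t / d) ** 2.5 / (h / d)

# Objectives cast as "minimize": F1 = -volume, F2 = mass.
goals = np.array([-0.05, 5.0])    # i.e. volume >= 0.05 m^3, mass <= 5 kg (illustrative)
weights = np.array([0.05, 5.0])

def F(x):
    return np.array([-volume(x), mass(x)])

res = minimize(
    lambda z: z[3],                          # minimize the attainment factor gamma
    x0=np.array([1e-3, 0.3, 0.5, 0.0]),      # wall thickness t, diameter d, height h, gamma
    constraints=[
        {"type": "ineq", "fun": lambda z: goals + weights * z[3] - F(z[:3])},
        {"type": "ineq", "fun": lambda z: p_crit(z[:3]) - P_MIN},
    ],
    bounds=[(2e-4, 5e-3), (0.1, 0.6), (0.2, 1.0), (None, None)],
)
t_opt, d_opt, h_opt = res.x[:3]
```

The attainment factor γ lets the two conflicting goals (large volume, small mass) be traded off with the stated weights while the instability constraint stays hard, which is exactly the structure fgoalattain() solves.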
Cloning, expression analysis and recombinant expression of a gene encoding a polygalacturonase-inhibiting protein from tobacco, Nicotiana tabacum
Polygalacturonase-inhibiting proteins (PGIPs) are major defensive proteins produced in plant cell walls that play a crucial role in pathogen resistance by reducing polygalacturonase (PG) activity. In the present study, a novel PGIP gene, hereafter referred to as NtPGIP, was isolated from tobacco (Nicotiana tabacum). A full-length NtPGIP cDNA of 1,412 bp, with a 186 bp 5′-untranslated region (UTR) and a 209 bp 3′-UTR, was cloned from tobacco; NtPGIP is predicted to encode a protein of 338 amino acids. The NtPGIP genomic DNA sequence contains no introns, and alignments of NtPGIP's deduced amino acid sequence showed high homology with known PGIPs from other plant species. Moreover, the putative NtPGIP protein clustered closely with several Solanaceae PGIPs. Further, the expression profile of NtPGIP was examined in tobacco leaves following stimulation with the oomycete Phytophthora nicotianae and other stressors, including salicylic acid (SA), abscisic acid (ABA), salt, and cold treatment. All of the treatments up-regulated the expression of NtPGIP at different time points. To investigate the biochemical activity of the NtPGIP gene, the full-length NtPGIP cDNA sequence was subcloned into a pET28a vector and transformed into E. coli BL21 (DE3). Recombinant protein expression was successfully induced with 1.0 nmol/L IPTG, and the purified proteins effectively inhibited Phytophthora capsici PG activity. These results suggest that NtPGIP may be a new candidate gene with properties that could be exploited in plant breeding.
Simulation study on flow field characteristics of air flotation deoiling process
To address the low clarification efficiency and long downstream processing steps of the dissolved air flotation (DAF) deoiling process in extraction metallurgy, and based on the actual process of a domestic oil-removal unit, the Reynolds-Averaged Navier-Stokes (RANS) transient numerical simulation method with the κ-ω turbulence model was used to analyze the internal flow characteristics. The results are as follows. 1) With a constant air blowing volume, liquid inlet volumes from 45 m³/h to 55 m³/h favor oil removal because of the good flow properties; when the liquid inlet volume is lower than 40 m³/h or higher than 60 m³/h, the gas bubbles concentrate near the surface of the reactor, which lowers the efficiency of oil separation. 2) With a constant liquid inlet volume, the wall velocity and gas volume fraction increase as the gas inlet volume increases, which is unfavorable for oil removal because of the gas distribution on the wall. 3) Lowering the position of the liquid inlet or the gas inlet, or reducing the height of the partition, has obvious adverse effects on the separation of oil from the liquid phase.
TFLEX: Temporal Feature-Logic Embedding Framework for Complex Reasoning over Temporal Knowledge Graph
Multi-hop logical reasoning over knowledge graph (KG) plays a fundamental
role in many artificial intelligence tasks. Recent complex query embedding
(CQE) methods for reasoning focus on static KGs, while temporal knowledge
graphs (TKGs) have not been fully explored. Reasoning over TKGs has two
challenges: 1. queries may be answered with entities or timestamps; 2. the
operators should handle both set logic on entity sets and temporal logic on
timestamp sets. To bridge this gap, we define the multi-hop logical reasoning
problem on TKGs. With three generated datasets, we propose the first temporal
CQE, named
Temporal Feature-Logic Embedding framework (TFLEX) to answer the temporal
complex queries. We utilize vector logic to compute the logic part of Temporal
Feature-Logic embeddings, thus naturally modeling all First-Order Logic (FOL)
operations on entity sets. In addition, our framework extends vector logic to
timestamp sets to cope with three extra temporal operators (After, Before, and
Between). Experiments on numerous query patterns demonstrate the effectiveness
of our method.
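TFLEX's actual embeddings split into learned feature and logic parts; as a toy sketch of the underlying vector-logic algebra only, the following implements elementwise logical connectives and the three temporal operators over timestamp "activation" vectors, where index i holds the activation of timestamp i (this indexing, and the cumulative-max realization of After/Before, are our assumptions):

```python
import numpy as np

def v_and(a, b):   # conjunction: elementwise product t-norm
    return a * b

def v_or(a, b):    # disjunction: probabilistic-sum t-conorm
    return a + b - a * b

def v_not(a):      # negation
    return 1.0 - a

def t_after(a):
    """Timestamps strictly after any active timestamp: cumulative max,
    shifted one step to the right."""
    c = np.maximum.accumulate(a)
    return np.concatenate(([0.0], c[:-1]))

def t_before(a):
    """Timestamps strictly before any active timestamp: reverse cumulative
    max, shifted one step to the left."""
    c = np.maximum.accumulate(a[::-1])[::-1]
    return np.concatenate((c[1:], [0.0]))

def t_between(a, b):
    """Timestamps after those active in a and before those active in b."""
    return v_and(t_after(a), t_before(b))
```

For example, with timestamp 1 active in a and timestamp 3 active in b over five timestamps, t_between(a, b) activates only timestamp 2; the same connectives applied to entity feature vectors give the FOL operations on entity sets.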