
    A phantom-node method with edge-based strain smoothing for linear elastic fracture mechanics

    This paper presents a novel numerical procedure that combines an edge-based smoothed finite element method (ES-FEM) with a phantom-node method for 2D linear elastic fracture mechanics. In the standard phantom-node method, cracks are formulated by adding phantom nodes, and the cracked element is replaced by two new superimposed elements. This approach is simple to implement in existing explicit finite element programs. The shape functions associated with discontinuous elements are similar to those of standard finite elements, which simplifies implementation in existing codes. The phantom-node method allows discontinuities to be modeled at arbitrary locations in the mesh. The ES-FEM model possesses a close-to-exact stiffness that is much softer than that of lower-order finite element methods (FEM). Taking advantage of both the ES-FEM and the phantom-node method, we introduce an edge-based strain smoothing technique for the phantom-node method. Numerical results show that the proposed method achieves high accuracy compared with the extended finite element method (XFEM) and other reference solutions.
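    As background, the edge-based strain smoothing that ES-FEM contributes can be written in its standard textbook form (general ES-FEM notation, not necessarily this paper's): the compatible strain is replaced by its average over a smoothing domain Ω_k attached to each edge k,

```latex
\tilde{\boldsymbol{\varepsilon}}_k
  = \frac{1}{A_k}\int_{\Omega_k}\boldsymbol{\varepsilon}(\mathbf{x})\,\mathrm{d}\Omega
  = \frac{1}{A_k}\oint_{\Gamma_k}\mathbf{L}_n(\mathbf{x})\,\mathbf{u}(\mathbf{x})\,\mathrm{d}\Gamma ,
```

    where A_k is the area of Ω_k, Γ_k its boundary, and L_n the matrix of outward unit normal components. The divergence theorem turns the area integral into a boundary integral over the smoothing domain, which is what softens the stiffness relative to the compatible FEM strain field.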

    Lattice Boltzmann for Binary Fluids with Suspended Colloids

    A new description of the binary fluid problem via the lattice Boltzmann method is presented, which highlights the use of moments in constructing the two equilibrium distribution functions. This offers a number of benefits, including better isotropy and a more natural route to the inclusion of multiple relaxation times for the binary fluid problem. In addition, the implementation of solid colloidal particles suspended in the binary mixture is addressed, which extends the solid-fluid boundary conditions for mass and momentum to include a single conserved compositional order parameter. A number of simple benchmark problems involving a single particle at or near a fluid-fluid interface are undertaken and show good agreement with available theoretical or numerical results.
    Comment: 10 pages, 4 figures, ICMMES 200
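    As a reminder of the machinery involved, the second-order equilibrium distribution at the heart of a lattice Boltzmann scheme can be sketched on the D2Q9 lattice as below. This is the standard single-phase textbook form, not the paper's binary-fluid equilibria; in the moment-based construction described above, the distribution for the compositional order parameter is built analogously from its own moments.

```python
# Standard D2Q9 second-order equilibrium distribution (textbook form; the
# paper's binary-fluid equilibria are constructed from moments analogously).

# Lattice velocities e_i and weights w_i (lattice units, c = 1).
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36]

def equilibrium(rho, ux, uy):
    """f_i^eq = w_i * rho * (1 + 3 e.u + 4.5 (e.u)^2 - 1.5 u.u)."""
    usq = ux * ux + uy * uy
    feq = []
    for (ex, ey), w in zip(E, W):
        eu = ex * ux + ey * uy
        feq.append(w * rho * (1.0 + 3.0 * eu + 4.5 * eu * eu - 1.5 * usq))
    return feq
```

    By construction this conserves mass (the f_i^eq sum to rho) and momentum (the first velocity moment equals rho u) exactly, which is precisely what a moment-based construction makes explicit.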

    Spectral-domain optical coherence tomography with an arrayed waveguide grating spectrometer

    We designed and fabricated an arrayed waveguide grating (AWG) with a 2.1 cm × 2.6 cm footprint. Using the AWG as the spectrometer in a spectral-domain optical coherence tomography (OCT) set-up, we demonstrate OCT imaging up to a maximum depth of 1 mm with 19 µm spatial resolution, both in air and in a multi-layered phantom.
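    For intuition about where a spectrometer's depth limit and resolution come from, the standard SD-OCT design relations can be sketched as below. The formulas are textbook results; the numeric source parameters are illustrative assumptions, not values from this paper.

```python
import math

# Textbook SD-OCT design relations (general results, not this paper's values).

def max_imaging_depth(lambda0, delta_lambda_sample, n=1.0):
    """Nyquist-limited depth z_max = lambda0^2 / (4 n delta_lambda), set by
    the spectrometer's wavelength sampling interval delta_lambda."""
    return lambda0 ** 2 / (4.0 * n * delta_lambda_sample)

def axial_resolution(lambda0, bandwidth_fwhm, n=1.0):
    """Gaussian-spectrum axial resolution: (2 ln 2 / pi) * lambda0^2 / (n FWHM)."""
    return (2.0 * math.log(2.0) / math.pi) * lambda0 ** 2 / (n * bandwidth_fwhm)

# Hypothetical example: 1300 nm centre wavelength, 20 nm bandwidth sampled
# by 100 spectrometer channels (all assumed numbers).
z_max = max_imaging_depth(1300e-9, 20e-9 / 100)  # about 2.1 mm in air
dz = axial_resolution(1300e-9, 20e-9)            # about 37 um in air
```

    The AWG channel count fixes the wavelength sampling interval, so for a given bandwidth it directly trades off against maximum imaging depth.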

    Phase diagram of the one-dimensional extended attractive Hubbard model for large nearest-neighbor repulsion

    We consider the extended Hubbard model with attractive on-site interaction U and nearest-neighbor repulsion V. We construct an effective Hamiltonian H_{eff} for hopping t << V and arbitrary U < 0. Retaining the most important terms, H_{eff} can be mapped onto two XXZ models, solved by the Bethe ansatz. The quantum phase diagram shows two Luttinger liquid phases and a region of phase separation between them. For density n < 0.422 and U < -4, singlet superconducting correlations dominate at large distances. For some parameters, the results are in qualitative agreement with experiments in BaKBiO.
    Comment: 6 pages, 3 figures, submitted to Phys. Rev.
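    For reference, the extended Hubbard model studied here is the standard one-dimensional lattice Hamiltonian (usual textbook notation, not necessarily the paper's exact conventions),

```latex
H = -t \sum_{i,\sigma} \left( c^{\dagger}_{i\sigma} c_{i+1,\sigma} + \mathrm{h.c.} \right)
    + U \sum_{i} n_{i\uparrow} n_{i\downarrow}
    + V \sum_{i} n_{i} n_{i+1},
```

    with U < 0 (on-site attraction) and V > 0 (nearest-neighbor repulsion). The limit t << V treats the hopping term as a perturbation on the classical charge configurations, which is what makes the effective-Hamiltonian construction tractable.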

    Market segmentation and ideal point identification for new product design using fuzzy data compression and fuzzy clustering methods

    In product design, various methodologies have been proposed for market segmentation, which groups consumers with similar customer requirements into clusters. The central points of market segments are commonly used as ideal points of customer requirements for product design, reflecting particular competitive strategies intended to reach all consumers’ interests effectively. However, existing methodologies ignore the fuzziness of consumers’ customer requirements. In this paper, a new methodology is proposed to perform market segmentation based on consumers’ customer requirements, which exhibit fuzziness. The methodology integrates a fuzzy compression technique for multi-dimension reduction with a fuzzy clustering technique. It first compresses the fuzzy data regarding customer requirements from high dimensions into two dimensions. After the fuzzy data are clustered into market segments, the centre points of the market segments are used as ideal points for new product development. The effectiveness of the proposed methodology in market segmentation and in identifying the ideal points for new product design is demonstrated using a case study of a new digital camera design.
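    The clustering half of this pipeline can be illustrated with a minimal fuzzy c-means sketch (the standard Bezdek algorithm, not the paper's specific method). The fuzzy compression step is out of scope here, so the toy data below are assumed to be already reduced to two dimensions; all names and numbers are illustrative.

```python
# Minimal fuzzy c-means sketch (standard algorithm; toy 2-D data assumed to be
# the output of a prior dimension-reduction step).
import math
import random

def fcm(points, c=2, m=2.0, iters=50, seed=0):
    """Return (centers, memberships) for fuzzy c-means with fuzzifier m."""
    rng = random.Random(seed)
    n = len(points)
    # Random initial membership matrix u[k][i]; each row sums to 1.
    u = []
    for _ in range(n):
        row = [rng.random() for _ in range(c)]
        s = sum(row)
        u.append([v / s for v in row])
    centers = [(0.0, 0.0)] * c
    for _ in range(iters):
        # Update centers as membership-weighted means.
        for i in range(c):
            wsum = sum(u[k][i] ** m for k in range(n))
            cx = sum(u[k][i] ** m * points[k][0] for k in range(n)) / wsum
            cy = sum(u[k][i] ** m * points[k][1] for k in range(n)) / wsum
            centers[i] = (cx, cy)
        # Update memberships from inverse-distance ratios.
        for k in range(n):
            d = [math.dist(points[k], centers[i]) or 1e-12 for i in range(c)]
            for i in range(c):
                u[k][i] = 1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                    for j in range(c))
    return centers, u
```

    The resulting `centers` play the role of the ideal points, and each consumer gets a graded membership in every segment rather than a hard assignment, which is what "fuzzy" buys over plain k-means.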

    ‘Cytology-on-a-chip’ based sensors for monitoring of potentially malignant oral lesions

    Despite significant advances in surgical procedures and treatment, the long-term prognosis for patients with oral cancer remains poor, with survival rates among the lowest of the major cancers. Better methods are desperately needed to identify potential malignancies early, when treatments are more effective. Objective: To develop robust classification models from cytology-on-a-chip measurements that mirror the diagnostic performance of the gold-standard approach involving tissue biopsy. Materials and methods: Measurements were recorded from 714 prospectively recruited patients with suspicious lesions across 6 diagnostic categories (each confirmed by tissue biopsy and histopathology) using a powerful new ‘cytology-on-a-chip’ approach capable of executing high-content analysis at the single-cell level. Over 200 cellular features related to biomarker expression, nuclear parameters, and cellular morphology were recorded per cell. By cataloging an average of 2000 cells per patient, these efforts resulted in nearly 13 million indexed objects. Results: Binary “low-risk”/“high-risk” models yielded AUC values of 0.88 and 0.84 for the training and validation models, respectively, with an accompanying difference in sensitivity + specificity of 6.2%. In terms of accuracy, this model predicted the correct diagnosis approximately 70% of the time, compared with the 69% initial agreement rate of the pool of expert pathologists. Key parameters identified in these models included cell circularity, Ki67 and EGFR expression, nuclear-cytoplasmic ratio, nuclear area, and cell area. Conclusions: This chip-based approach yields objective data that can be leveraged for the diagnosis and management of patients with potentially malignant oral lesions (PMOL), as well as for uncovering new molecular-level insights behind cytological differences across the oral epithelial dysplasia (OED) spectrum.
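    For readers less familiar with the headline metric, the reported AUC is the probability that a randomly chosen "high-risk" case scores above a randomly chosen "low-risk" case, and it can be computed directly from pairwise ranks. A minimal sketch with made-up toy scores (none of the numbers come from the study):

```python
# Rank-based AUC (Mann-Whitney formulation); toy data only, not study data.

def auc(labels, scores):
    """Fraction of positive/negative pairs ranked correctly, ties counted 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two "high-risk" (label 1) and two "low-risk" (label 0) cases;
# three of the four positive/negative pairs are ranked correctly.
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```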

    An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics

    For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features that reflect the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing the major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program led to the generation of the TCGA Clinical Data Resource, which provides recommendations for clinical outcome endpoint usage for the 33 cancer types.

    Search for the standard model Higgs boson decaying into two photons in pp collisions at sqrt(s)=7 TeV

    A search for a Higgs boson decaying into two photons is described. The analysis is performed using a dataset recorded by the CMS experiment at the LHC from pp collisions at a centre-of-mass energy of 7 TeV, which corresponds to an integrated luminosity of 4.8 inverse femtobarns. Limits are set on the cross section of the standard model Higgs boson decaying to two photons. The expected exclusion limit at 95% confidence level is between 1.4 and 2.4 times the standard model cross section in the mass range between 110 and 150 GeV. The analysis of the data excludes, at 95% confidence level, the standard model Higgs boson decaying into two photons in the mass range 128 to 132 GeV. The largest excess of events above the expected standard model background is observed for a Higgs boson mass hypothesis of 124 GeV, with a local significance of 3.1 sigma. The global significance of observing an excess with a local significance greater than 3.1 sigma anywhere in the search range 110-150 GeV is estimated to be 1.8 sigma. More data are required to ascertain the origin of this excess.
    Comment: Submitted to Physics Letters