91 research outputs found

    Study of an evaporating droplet in an immiscible liquid

    Get PDF
    Imperial Users only

    Concurrent level set and fiber orientation optimization of composite structures

    Full text link
    By adjusting both the structural shape and the fiber orientation, this research aims to optimize the design of Fiber Reinforced Composite (FRC) structures. The structural geometry is represented by a level set function, which is approximated by quadratic B-spline functions. The fiber orientation field is parameterized with quadratic/cubic B-splines on hierarchically refined meshes. Different levels of B-spline mesh refinement for the level set and fiber orientation fields are studied to obtain a smooth fiber layout. To facilitate FRC manufacturing, the parallel alignment and smoothness of fiber paths are enforced by introducing penalty terms, referred to as the "misalignment penalty" and "curvature penalty", which are incorporated into the optimization process. A geometric interpretation of the penalties is provided. The material behavior of the FRCs is modeled by the Mori-Tanaka homogenization scheme, and the macroscopic structural response is modeled by linear elasticity under static multi-loading conditions. The governing equations are discretized by a Heaviside-enriched eXtended IsoGeometric Analysis (XIGA) to avoid the need to generate conformal meshes. Instabilities in XIGA are mitigated by the facet-oriented ghost stabilization technique. This work considers mass and strain energy in the formulation of the optimization objective, along with the misalignment and curvature penalties and additional regularization terms. Constraints are imposed on the volume of the structure. The resulting optimization problems are solved by a gradient-based algorithm, and the design sensitivities are computed by the adjoint method. Numerical examples with two-dimensional and three-dimensional configurations demonstrate that the proposed method is efficient in simultaneously optimizing the macroscopic shape and the fiber layout while improving manufacturability by promoting parallel and smooth fiber paths.
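    The idea of the misalignment and curvature penalties can be sketched numerically. The abstract does not give the paper's exact penalty definitions, so the finite-difference version below is an illustrative assumption (function name, discretization, and the split into along-fiber vs. cross-fiber angle variation are all hypothetical), meant only to convey what penalizing non-parallel and non-smooth fiber layouts looks like.

```python
import numpy as np

def fiber_penalties(theta, h=1.0):
    """Illustrative finite-difference sketch of misalignment and curvature
    penalties for a fiber-angle field theta (radians) on a uniform grid.
    The definitions here are assumptions, not the paper's formulas."""
    # Unit fiber direction at each grid point
    dx, dy = np.cos(theta), np.sin(theta)
    # Angle-field gradients: np.gradient returns (d/d_row, d/d_col)
    gty, gtx = np.gradient(theta, h)
    # Misalignment: angle variation perpendicular to the fiber direction
    misalign = (-dy * gtx + dx * gty) ** 2
    # Curvature: angle variation along the fiber direction
    curvature = (dx * gtx + dy * gty) ** 2
    return misalign.sum() * h**2, curvature.sum() * h**2
```

A perfectly parallel layout (constant angle field) yields zero for both penalties, which matches the geometric intuition that such a layout needs no correction.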

    A novel hybrid approach for technology selection in the information technology industry

    Get PDF
    Open access article
    High-tech companies are rapidly growing worldwide. The strength of the research and development (hereafter R&D) department is the main asset that allows a firm to achieve a competitive advantage in high-tech businesses. The budget allocated to this sector is finite; thus, integration, human resource, risk and budget limitations should be considered to choose the most valuable projects within the best portion of time. This paper investigates a case study from a high-tech company in Iran to prioritize the most attractive technologies for the R&D department. The case consists of twenty-three technology options, and the goal is to find the most attractive projects and sort them for implementation in the R&D department. In this research, the best-worst method (henceforth BWM) is proposed to find the weights of the criteria for the attractive technologies in the first step, and the newly developed total area based on orthogonal vectors (henceforward TAOV) method is then used to sort the selected technologies based upon the identified criteria. Project integration is one of the least-noticed subjects in scientific papers; therefore, the researchers present a zero-one linear programming (ZOLP) model to optimize and schedule the implementation procedure under project risk, budget and time limitations simultaneously. The results indicate that starting a few attractive projects in the first years and postponing the rest helps a firm to manage funds and gain profit with the least amount of risk.
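    The zero-one selection idea behind a ZOLP model can be illustrated with a minimal sketch. The paper's actual model also handles integration, risk and scheduling constraints, which are omitted here; the brute-force selector below (all names and numbers illustrative) enforces only a single budget cap on a small project set.

```python
from itertools import combinations

def select_projects(values, costs, budget):
    """Brute-force sketch of zero-one project selection: choose the subset
    of projects maximizing total attractiveness score subject to a budget
    cap. Feasible only for small n; a real ZOLP model would use an integer
    programming solver and additional constraints."""
    n = len(values)
    best, best_set = 0.0, ()
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            cost = sum(costs[i] for i in subset)
            if cost <= budget:
                value = sum(values[i] for i in subset)
                if value > best:
                    best, best_set = value, subset
    return best, best_set
```

With hypothetical scores [3, 4, 5], costs [2, 3, 4] and a budget of 5, the first two projects together (value 7) beat the single highest-value project, illustrating why subset selection rather than greedy ranking is needed.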

    Clusterin in the eye: An old dog with new tricks at the ocular surface

    Full text link

    Study of an evaporating droplet in an immiscible liquid

    No full text
    SIGLE. Available from British Library Document Supply Centre - DSC:D34714/81 / BLDSC - British Library Document Supply Centre, GB, United Kingdom

    A large eddy simulation of an airfoil turbulent wake subjected to streamwise curvature

    No full text
    This paper presents large eddy simulations (LES) of the curved wake of an airfoil. The wake was generated by placing a NACA0012 airfoil in a uniform stream of air, which was then subjected to an abrupt 90° curvature created by a duct bend. The trailing edge of the airfoil is one chord length upstream of the bend entry. The duct cross-section measures 457 mm × 457 mm, and the bend has a radius-to-height ratio of 1.17. The flow Reynolds number (1.02 × 10^5) is based on a mainstream velocity of 10 m/s and an airfoil chord length of 0.15 m. The sub-grid-scale models employed are the classical Smagorinsky model, its dynamic variant, and the dynamic kinetic energy transport model. The performance of LES in depicting the experimental flow is assessed and compared with results predicted by the Reynolds stress model (RSM). The results show the advantages of LES over Reynolds-averaged Navier-Stokes methods in predicting convex-wall separation in strongly curved ducts on relatively coarse grids. Results from LES on a considerably finer near-wall-resolved grid show much improved agreement with the experimental data in the near wake, bettering predictions by RSM and by LES on the coarse grid. Copyright © 2008 John Wiley & Sons, Ltd.
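    Of the sub-grid-scale models mentioned, the classical Smagorinsky model is the simplest: the eddy viscosity is ν_t = (C_s Δ)² |S| with |S| = √(2 S_ij S_ij). The 2-D sketch below shows that formula only; the constant C_s = 0.17 and the function interface are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def smagorinsky_nu_t(dudx, dudy, dvdx, dvdy, delta, cs=0.17):
    """2-D sketch of the classical Smagorinsky sub-grid model:
    nu_t = (cs * delta)**2 * |S|, with |S| = sqrt(2 S_ij S_ij).
    Inputs are the resolved velocity gradients and the filter width."""
    s11 = dudx                       # normal strain components
    s22 = dvdy
    s12 = 0.5 * (dudy + dvdx)        # shear strain component
    s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * delta) ** 2 * s_mag
```

For a pure shear du/dy = 1 with unit filter width, |S| = 1 and the eddy viscosity reduces to C_s², which is a convenient sanity check when wiring the model into a solver.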

    AUTOMATIC ROAD GAP DETECTION USING FUZZY INFERENCE SYSTEM

    No full text
    Automatic feature extraction from aerial and satellite images is a high-level data-processing task and remains one of the most important research topics in the field. Most research in this area focuses on the early step of road detection, where road-tracking methods, morphological analysis, dynamic programming and snakes, multi-scale and multi-resolution methods, stereoscopic and multi-temporal analysis, and hyperspectral experiments are among the mature approaches. Although most research focuses on detection algorithms, none of them can extract the road network perfectly. On the other hand, post-processing algorithms, which concentrate on refining road detection results, are not as well developed. In this article, the main aim is to design an intelligent method to detect and compensate road gaps remaining in the early results of road detection algorithms. The proposed algorithm consists of four main steps, as follows: 1) Short gap coverage: a multi-scale morphological operator is designed that covers short gaps in a hierarchical scheme. 2) Long gap detection: the long gaps that could not be covered in the previous stage are detected using a fuzzy inference system. For this purpose, a knowledge base consisting of expert rules is designed, and the rules are fired on gap candidates from the road detection results. 3) Long gap coverage: detected long gaps are compensated by two strategies, linear and polynomial; shorter gaps are filled by line fitting, while longer ones are compensated by polynomials. 4) Accuracy assessment: to evaluate the obtained results, several accuracy assessment criteria are proposed. These criteria are obtained by comparing the results with correctly compensated ones produced by a human expert. The complete evaluation of the obtained results, together with their technical discussion, is the material of the full paper.
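    The "short gap coverage" step amounts to a morphological closing of the road mask. The 1-D sketch below (function name, gap threshold, and the reduction to a single scan line are all illustrative assumptions; the paper's operator is multi-scale and 2-D) shows the core idea: fill zero-runs between road pixels only when they are short enough.

```python
import numpy as np

def close_short_gaps(road_mask, max_gap=2):
    """Fill runs of zeros no longer than max_gap that are bounded by road
    pixels on both sides -- a 1-D stand-in for morphological closing along
    a road. Longer gaps are left for the fuzzy long-gap stage."""
    mask = np.asarray(road_mask).astype(bool).copy()
    n = len(mask)
    i = 0
    while i < n:
        if not mask[i]:
            j = i
            while j < n and not mask[j]:
                j += 1                      # scan to the end of the zero-run
            # Fill only interior gaps that are short enough
            if 0 < i and j < n and (j - i) <= max_gap:
                mask[i:j] = True
            i = j
        else:
            i += 1
    return mask.astype(int).tolist()
```

Gaps longer than the threshold survive the pass untouched, which is exactly the population the fuzzy inference stage is then asked to classify.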

    UNCERTAIN TRAINING DATA EDITION FOR AUTOMATIC OBJECT-BASED CHANGE MAP EXTRACTION

    No full text
    Due to the rapid transformation of societies and the consequent growth of cities, it is necessary to study these changes in order to achieve better control and management of urban areas and to assist decision-makers. Change detection involves the ability to quantify temporal effects using multi-temporal data sets. Available maps of the area under study are one of the most important sources for this purpose. Although old databases and maps are a great resource, it is more than likely that training data extracted from them contain errors, which affect the classification procedure; as a result, editing the training samples is essential. Due to the urban nature of the studied area and the problems that arise in pixel-based methods, object-based classification is applied. To this end, the image is segmented into four scale levels using a multi-resolution segmentation procedure. After obtaining the segments at the required levels, training samples are extracted automatically using the existing old map. Due to the age of the map, these samples are uncertain and contain erroneous data. To handle this issue, an editing process is proposed based on the k-nearest neighbour and k-means algorithms. Next, the image is classified in a multi-resolution object-based manner and the effects of training-sample refinement are evaluated. As a final step, the classified image is compared with the existing map and the changed areas are detected.
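    The k-NN half of the editing process can be sketched as follows. The paper's exact rule is not given in this abstract, so the sketch assumes the classical edited-nearest-neighbour criterion (names, k, and the toy data are illustrative): a training sample is dropped when its label disagrees with the majority label of its k nearest neighbours.

```python
import numpy as np

def knn_edit(features, labels, k=3):
    """Edited-nearest-neighbour sketch: keep only samples whose label
    matches the majority label of their k nearest neighbours, on the
    assumption that disagreeing samples are errors inherited from the
    outdated map."""
    X = np.asarray(features, dtype=float)
    y = np.asarray(labels)
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                          # exclude the sample itself
        nn = np.argsort(d)[:k]                 # k nearest neighbours
        vals, counts = np.unique(y[nn], return_counts=True)
        if vals[np.argmax(counts)] == y[i]:
            keep.append(i)
    return keep
```

On toy data with two well-separated clusters and one mislabeled point sitting inside the wrong cluster, only the mislabeled sample is rejected, which is the behaviour the refinement step relies on.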