
    BeeFlow: Behavior Tree-based Serverless Workflow Modeling and Scheduling for Resource-Constrained Edge Clusters

    Serverless computing has gained popularity in edge computing due to its flexible features, including the pay-per-use pricing model, auto-scaling capabilities, and multi-tenancy support. Complex Serverless-based applications typically rely on Serverless workflows (also known as Serverless function orchestration) to express task execution logic, and numerous application- and system-level optimization techniques have been developed for Serverless workflow scheduling. However, there has been limited exploration of optimizing Serverless workflow scheduling in edge computing systems, particularly in high-density, resource-constrained environments such as system-on-chip clusters and single-board-computer clusters. In this work, we find that existing Serverless workflow scheduling techniques typically assume models with limited expressiveness and cause significant resource contention. To address these issues, we propose modeling Serverless workflows using behavior trees, a novel approach fundamentally different from existing directed-acyclic-graph- and state-machine-based models. Behavior-tree-based modeling allows for easy analysis without compromising workflow expressiveness. We further present observations derived from the inherent tree structure of behavior trees for contention-free function collections and for awareness of exact and empirical concurrent function invocations. Based on these observations, we introduce BeeFlow, a behavior-tree-based Serverless workflow system tailored for resource-constrained edge clusters. Experimental results demonstrate that BeeFlow achieves up to 3.2X speedup in a high-density, resource-constrained edge testbed and 2.5X speedup in a high-profile cloud testbed, compared with the state-of-the-art. Comment: Accepted by the Journal of Systems Architecture.
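The abstract contrasts behavior trees with DAG- and state-machine-based workflow models. As a rough illustration of the general idea only (not BeeFlow's actual implementation, which the paper describes), a behavior tree composes leaf actions under composite Sequence and Fallback nodes with simple tick semantics:

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Action:
    """Leaf node: here it would wrap a single serverless function invocation."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn
    def tick(self):
        return Status.SUCCESS if self.fn() else Status.FAILURE

class Sequence:
    """Composite: runs children in order; stops at the first non-success."""
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS

class Fallback:
    """Composite: tries children in order until one does not fail."""
    def __init__(self, children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE

# A toy workflow: fetch input, then try a fast path before a slow path.
workflow = Sequence([
    Action("fetch", lambda: True),
    Fallback([
        Action("fast_path", lambda: False),  # fails, so the tree falls back
        Action("slow_path", lambda: True),
    ]),
])
print(workflow.tick())  # Status.SUCCESS
```

The tree structure makes sibling subtrees under a Sequence easy to analyze statically, which is the kind of property the paper exploits for contention-free scheduling.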

    A Novel Dual-Band MIMO Antenna with Lower Correlation Coefficient

    This paper demonstrates a novel dual-band MIMO antenna consisting of a planar monopole (main antenna) and a 3D slot element (auxiliary antenna). The main antenna is printed on a 1.6 mm thick FR4 board, while the auxiliary antenna is fabricated from gold-coated copper. A lumped impedance network is applied to improve the matching at port 1. Simulations with commercial software show that the proposed antenna covers GSM800 and GSM900 (lower band) as well as LTE/WiMAX/WLAN (higher band) quite well. Good agreement between simulations and measurements is obtained. The corresponding measured results, including antenna efficiency, peak gain, and radiation patterns, are presented as well. By adding a passive decoupling element, the coupling power on the ground is radiated into free space, and the isolation between antenna elements is greatly enhanced, especially in the lower band.
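The correlation between MIMO antenna elements that the title refers to is commonly summarized by the envelope correlation coefficient (ECC). For reference, the standard S-parameter formula for a lossless two-port antenna (a general textbook expression, not a result from this paper) can be computed as:

```python
def envelope_correlation(s11, s21, s12, s22):
    """Envelope correlation coefficient (ECC) of a two-port MIMO antenna
    from complex S-parameters at a single frequency, assuming a lossless
    antenna. Lower ECC indicates better diversity performance."""
    num = abs(s11.conjugate() * s12 + s21.conjugate() * s22) ** 2
    den = ((1 - abs(s11) ** 2 - abs(s21) ** 2) *
           (1 - abs(s22) ** 2 - abs(s12) ** 2))
    return num / den

# A well-matched, well-isolated antenna pair (illustrative values) yields
# a very small ECC:
print(envelope_correlation(0.1 + 0j, 0.01 + 0j, 0.01 + 0j, 0.1 + 0j))
```

Improving isolation (reducing |S21| and |S12|), as the paper's decoupling element does, directly lowers the numerator of this expression.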

    Searching Transferable Mixed-Precision Quantization Policy through Large Margin Regularization

    Mixed-precision quantization (MPQ) suffers from a time-consuming policy search process (i.e., the bit-width assignment for each layer) on large-scale datasets (e.g., ILSVRC-2012), which heavily limits its practicability in real-world deployment scenarios. In this paper, we propose to search for an effective MPQ policy using a small proxy dataset for a model trained on a large-scale one. This breaks the convention that requires the same dataset for model training and MPQ policy search, and it can improve MPQ search efficiency significantly. However, the discrepant data distributions make it difficult to find such a transferable MPQ policy. Motivated by the observation that quantization narrows the class margin and blurs the decision boundary, we search for a policy that guarantees a general, dataset-independent property: the discriminability of feature representations. Namely, we seek the policy that robustly preserves intra-class compactness and inter-class separation. Our method offers several advantages: high proxy-data utilization, no extra hyper-parameter tuning for approximating the relationship between the full-precision and quantized models, and high search efficiency. We search high-quality MPQ policies with a proxy dataset only 4% the size of the large-scale target dataset, achieving the same accuracy as searching directly on the latter and improving MPQ search efficiency by up to 300 times.
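The discriminability property the authors search for, intra-class compactness versus inter-class separation, can be sketched as a simple feature-space score. The function below is an illustrative metric under that intuition, not the paper's exact objective:

```python
import numpy as np

def discriminability(features, labels):
    """Ratio of inter-class separation to intra-class compactness for a
    batch of feature vectors. A higher score means the (quantized) model's
    features keep classes compact and well separated."""
    classes = np.unique(labels)
    centroids = np.array([features[labels == c].mean(axis=0) for c in classes])
    # Intra-class compactness: mean distance of samples to their centroid.
    intra = np.mean([
        np.linalg.norm(features[labels == c] - centroids[i], axis=1).mean()
        for i, c in enumerate(classes)
    ])
    # Inter-class separation: mean pairwise distance between centroids.
    inter = np.mean([
        np.linalg.norm(centroids[i] - centroids[j])
        for i in range(len(classes)) for j in range(i + 1, len(classes))
    ])
    return inter / (intra + 1e-8)
```

Because this score depends only on feature geometry, not on which dataset produced the features, it is the kind of dataset-independent criterion that makes a policy transferable from a proxy dataset to the target one.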

    Image Synthesis with Disentangled Attributes for Chest X-Ray Nodule Augmentation and Detection

    Lung nodule detection in chest X-ray (CXR) images is commonly used for early screening of lung cancers. Deep-learning-based Computer-Assisted Diagnosis (CAD) systems can support radiologists in nodule screening on CXR. However, training such robust and accurate CAD systems requires large-scale, diverse medical data with high-quality annotations. To alleviate the limited availability of such datasets, lung nodule synthesis methods have been proposed for data augmentation. Nevertheless, previous methods lack the ability to generate realistic nodules with the size attributes desired by the detector. To address this issue, we introduce a novel lung nodule synthesis framework that decomposes nodule attributes into three main aspects: shape, size, and texture. A GAN-based Shape Generator first models nodule shapes by generating diverse shape masks. The following Size Modulation then enables quantitative control over the diameters of the generated nodule shapes at pixel-level granularity. A coarse-to-fine gated convolutional Texture Generator finally synthesizes visually plausible nodule textures conditioned on the modulated shape masks. Moreover, we propose to synthesize nodule CXR images by controlling the disentangled nodule attributes for data augmentation, in order to better compensate for the nodules that are easily missed in the detection task. Our experiments demonstrate the enhanced image quality, diversity, and controllability of the proposed lung nodule synthesis framework. We also validate that our data augmentation greatly improves nodule detection performance.
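The Size Modulation step, quantitative control of nodule diameter at pixel-level granularity, can be sketched as rescaling a binary shape mask to a target equivalent diameter. This is only an illustrative sketch of the idea; the paper's actual module is more involved:

```python
import numpy as np

def modulate_size(mask, target_diameter):
    """Rescale a binary nodule mask so its equivalent circular diameter
    (in pixels) matches target_diameter, via nearest-neighbor resampling."""
    area = mask.sum()
    current = 2.0 * np.sqrt(area / np.pi)  # equivalent circular diameter
    scale = target_diameter / current
    h, w = mask.shape
    nh, nw = max(1, round(h * scale)), max(1, round(w * scale))
    # Nearest-neighbor index maps for the resampled grid.
    rows = (np.arange(nh) / scale).astype(int).clip(0, h - 1)
    cols = (np.arange(nw) / scale).astype(int).clip(0, w - 1)
    return mask[np.ix_(rows, cols)]
```

Conditioning the Texture Generator on such a modulated mask is what lets the framework oversample the small nodule sizes that detectors most often miss.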

    Differential Diagnosis of Solitary Pulmonary Inflammatory Lesions and Peripheral Lung Cancers with Contrast-enhanced Computed Tomography

    OBJECTIVES: To clarify differences between solitary pulmonary inflammatory lesions and peripheral lung cancers with contrast-enhanced computed tomography. METHODS: In total, 64 and 132 patients with solitary pulmonary inflammatory masses/nodules and peripheral lung cancers, respectively, were enrolled in this study. Their computed tomographic findings were summarized and compared retrospectively. RESULTS: Compared with the peripheral lung cancers, the inflammatory lesions were located closer to the pleura (

    A Survey of Collaborative Filtering Based on Nearest Neighbors

    This paper explains the k-NN classification algorithm and its operator in RapidMiner. The use case of this chapter applies the k-NN operator to the Teacher Evaluation dataset. The operators explained in this chapter are: Read URL, Rename, Numerical to Binominal, Numerical to Polynominal, Set Role, Split Validation, Apply Model, and Performance. The k-nearest neighbor algorithm is based on learning by analogy, that is, by comparing a given test example with the training examples that are similar to it. The training examples are described by n attributes, so each example represents a point in an n-dimensional space, and all of the training examples are stored in an n-dimensional pattern space. When given an unknown example, the k-nearest neighbor algorithm searches the pattern space for the k training examples that are closest to the unknown example. These k training examples are the k “nearest neighbors” of the unknown example. “Closeness” is defined in terms of a distance metric, such as the Euclidean distance.
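The neighbor search described above can be sketched in a few lines, assuming numeric attributes, Euclidean distance as the closeness metric, and majority voting among the k neighbors:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote among its k nearest training
    examples, with closeness measured by Euclidean distance."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every example
    nearest = np.argsort(dists)[:k]               # indices of the k closest
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Two small clusters with labels 'a' and 'b':
X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
y = np.array(['a', 'a', 'a', 'b', 'b', 'b'])
print(knn_predict(X, y, np.array([0.2, 0.2])))  # 'a'
```

Because all training examples are stored and scanned at prediction time, k-NN is a lazy learner; the cost of this linear scan is what index structures such as k-d trees are meant to reduce.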

    Quality Testing of Crimped Tension-Clamp Connections on Transmission Lines

    This paper develops a new technology for inspecting the quality of the critical crimped tension-clamp connections on transmission lines. The method is based primarily on ultrasonic pulse-echo thickness measurement: it measures the thickness of the aluminum sleeve after the tension clamp is crimped, which reflects the relative position of the aluminum sleeve and the steel anchor and thereby reveals whether a crimping positioning defect exists. This is supplemented by steel-anchor model comparison, crimp-position length comparison, and crimp-margin detection to determine whether the crimping quality of the transmission line is acceptable.
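The pulse-echo principle the method relies on converts an echo's round-trip time into a wall thickness: the pulse travels to the back wall and back, so thickness is velocity times time over two. A minimal sketch, assuming a typical longitudinal sound velocity for aluminum (about 6320 m/s; the paper does not state its calibration values):

```python
def wall_thickness(echo_time_s, velocity_m_s=6320.0):
    """Pulse-echo thickness measurement: the ultrasonic pulse traverses the
    wall twice (out and back), so thickness = velocity * round-trip time / 2.
    6320 m/s is a typical longitudinal velocity in aluminum (assumed)."""
    return velocity_m_s * echo_time_s / 2.0

# An aluminum sleeve returning an echo after 1 microsecond:
print(wall_thickness(1e-6) * 1000)  # ≈ 3.16 mm
```

Comparing such measured sleeve thicknesses along the crimp against the expected profile is what lets the method infer the hidden position of the steel anchor.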