
    Use of optoelectronic tweezers in manufacturing – accurate solder bead positioning

    In this work, we analyze the use of optoelectronic tweezers (OETs) to manipulate 45 μm diameter Sn62Pb36Ag2 solder beads with the light-induced dielectrophoresis force, and we demonstrate high positioning accuracy. We found that the positional deviation of the solder beads increases with trap size. To clarify the underlying mechanism, simulations based on the integration of the Maxwell stress tensor were used to study the force profiles of OET traps of different sizes. The solder beads experience a 0.1 nN static friction (stiction) force due to electrical forces pulling them towards the surface, and this force is independent of trap size. The stiction limits the positioning accuracy; however, we show that by choosing a trap only slightly larger than the solder bead, sub-micron positional accuracy can be achieved.
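
    To make the stiction argument concrete, the sketch below is a minimal point-dipole estimate of the dielectrophoretic force on a 45 μm bead, compared against the ~0.1 nN stiction threshold reported in the abstract. It is not the paper's Maxwell-stress-tensor simulation; the medium permittivity, Clausius-Mossotti factor, and field-gradient magnitude are illustrative assumptions.

```python
# Minimal DEP-vs-stiction check (point-dipole approximation, not the
# paper's Maxwell-stress-tensor integration). Medium, CM factor, and
# field gradient are assumed values for illustration only.
import math

EPS0 = 8.854e-12           # vacuum permittivity, F/m
eps_medium = 78 * EPS0     # aqueous medium (assumed)
radius = 22.5e-6           # 45 um diameter bead -> 22.5 um radius
re_cm = 1.0                # Re[K] ~ 1 for a conducting bead (assumed)
grad_E2 = 3e12             # gradient of |E|^2 in V^2/m^3 (assumed)

# Point-dipole DEP force: F = 2*pi*eps_m*r^3*Re[K]*grad(|E|^2)
f_dep = 2 * math.pi * eps_medium * radius**3 * re_cm * grad_E2
stiction = 0.1e-9          # 0.1 nN stiction force from the abstract

print(f"DEP force ~ {f_dep * 1e9:.2f} nN; exceeds stiction: {f_dep > stiction}")
```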

    Sparsity-Constrained Optimal Transport

    Regularized optimal transport (OT) is now increasingly used as a loss or as a matching layer in neural networks. Entropy-regularized OT can be computed using the Sinkhorn algorithm, but it leads to fully dense transportation plans, meaning that all sources are (fractionally) matched with all targets. To address this issue, several works have investigated quadratic regularization instead. This regularization preserves sparsity and leads to unconstrained and smooth (semi) dual objectives that can be solved with off-the-shelf gradient methods. Unfortunately, quadratic regularization does not give direct control over the cardinality (number of nonzeros) of the transportation plan. In this paper, we propose a new approach for OT with explicit cardinality constraints on the transportation plan. Our work is motivated by an application to sparse mixtures of experts, where OT can be used to match input tokens such as image patches with expert models such as neural networks. Cardinality constraints ensure that at most k tokens are matched with an expert, which is crucial for computational performance reasons. Despite the nonconvexity of cardinality constraints, we show that the corresponding (semi) dual problems are tractable and can be solved with first-order gradient methods. Our method can be thought of as a middle ground between unregularized OT (recovered in the limit case k = 1) and quadratically regularized OT (recovered when k is large enough). The smoothness of the objectives increases as k increases, giving rise to a trade-off between convergence speed and sparsity of the optimal plan.
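
    For context, the sketch below is a minimal implementation of the entropy-regularized Sinkhorn baseline that the abstract contrasts against; it illustrates why the resulting plan is fully dense (every entry strictly positive). It is not the paper's cardinality-constrained solver, and the point clouds and regularization strength are arbitrary illustrative choices.

```python
# Minimal Sinkhorn iteration for entropy-regularized OT between two
# discrete measures; the returned plan is fully dense, which is the
# behaviour the cardinality-constrained approach is designed to avoid.
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iter=500):
    """Entropy-regularized OT between histograms a, b with cost matrix C."""
    K = np.exp(-C / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                # alternate scaling updates
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]   # plan = diag(u) K diag(v)

rng = np.random.default_rng(0)
x, y = rng.normal(size=(5, 2)), rng.normal(size=(7, 2))
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)   # squared Euclidean cost
C = C / C.max()                                      # normalize for stability
a, b = np.full(5, 1 / 5), np.full(7, 1 / 7)
P = sinkhorn(a, b, C)
print("fraction of nonzero plan entries:", np.mean(P > 1e-9))  # ~1.0: dense
```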

    The Challenges of Business Analytics: Successes and Failures

    The successful use of business analytics is an important element of a company’s success. Business analytics enables analysts and managers to engage in an IT-driven sense-making process in which they use data and analysis as a means to understand the phenomena that the data represent. Not all organizations apply business analytics successfully to decision making. When used correctly, the actionable intelligence gained from a business analytics program can be used to improve strategic decision making. Conversely, an organization that does not use business analytics information appropriately will not achieve optimal decision making, failing to realize the full potential of a data analytics program. This paper examines organizations that implemented data analytics programs, both successfully and unsuccessfully, and discusses the implications for each. Based on the lessons learned, we present ways to implement a successful business analytics program.

    When transcription meets recombination: a lesson from the human RECQ protein complexes

    Since the cloning of the first human RECQ gene, RECQ1, more than 15 years ago, RECQ helicases have been a major focus in cancer research. Recent studies of human RECQ protein complexes are providing insight into their roles in various DNA metabolic pathways that protect the integrity of our genome.

    IonSeq Genome Sequencing

    The emergence of advanced DNA sequencing methods has presented disruptive opportunities in biotechnology, establishing the foundation for the personalized medicine industry. Since the completion of the Human Genome Project, the number of genomes sequenced has grown exponentially and the price of sequencing has dropped precipitously. To make personalized medicine a reality, a large collection of sequenced genomes is needed in order to link specific genes to diseases. IonSeq seeks to be the leading DNA sequencing service, employing the new semiconductor-based sequencing technology offered by Ion Torrent, to help pharmaceutical companies generate these libraries of genomes for their drug-development processes. To support sequencing reliability and throughput, IonSeq will explore technical details such as chip configuration, insertion kinetics, signal generation, base-calling methods, and accuracy metrics. IonSeq will achieve a 40 genome/day output, made possible by the massively parallel procedure employed by the sequencers. IonSeq will sequence each genome at a price of $2,000, while the cost of ‘manufacture’ will be only $645. Series A will consist of a $3,682,886 investment and will yield the investors a MIRR of 102.98% over four years. The Series B investment will total $4,510,491 and result in a 93.43% MIRR over a three-year period. The NPV by the time of a liquidation or acquisition event will be $39,322,347, at a conservative projected growth rate of 5%.
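
    As a quick check of the unit economics stated above, the sketch below combines the quoted price, 'manufacture' cost, and throughput into a per-genome margin and gross profit. The 365 operating days/year annualization is an assumption for illustration; the MIRR and NPV figures depend on cash-flow schedules not given in the abstract and are not reproduced here.

```python
# Per-genome unit economics from the abstract; annualization at
# 365 operating days/year is an assumption, not a stated figure.
price_per_genome = 2_000       # USD, stated service price
cost_per_genome = 645          # USD, stated 'manufacture' cost
genomes_per_day = 40           # stated throughput

margin_per_genome = price_per_genome - cost_per_genome     # 1,355 USD
daily_gross_profit = margin_per_genome * genomes_per_day   # 54,200 USD/day
annual_gross_profit = daily_gross_profit * 365             # assumed 365 days

print(f"margin/genome: ${margin_per_genome:,}")
print(f"daily gross profit: ${daily_gross_profit:,}")
print(f"annual gross profit (365-day assumption): ${annual_gross_profit:,}")
```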

    Evaluating Self-Supervised Learning for Molecular Graph Embeddings

    Graph Self-Supervised Learning (GSSL) provides a robust pathway for acquiring embeddings without expert labelling, a capability that carries profound implications for molecular graphs due to the staggering number of potential molecules and the high cost of obtaining labels. However, GSSL methods are designed not for optimisation within a specific domain but rather for transferability across a variety of downstream tasks. This broad applicability complicates their evaluation. Addressing this challenge, we present "Molecular Graph Representation Evaluation" (MOLGRAPHEVAL), generating detailed profiles of molecular graph embeddings with interpretable and diversified attributes. MOLGRAPHEVAL offers a suite of probing tasks grouped into three categories: (i) generic graph, (ii) molecular substructure, and (iii) embedding space properties. By leveraging MOLGRAPHEVAL to benchmark existing GSSL methods against both current downstream datasets and our suite of tasks, we uncover significant inconsistencies between inferences drawn solely from existing datasets and those derived from more nuanced probing. These findings suggest that current evaluation methodologies fail to capture the entirety of the landscape.
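
    To illustrate what a probing task of this kind typically looks like, the sketch below fits a simple linear probe on frozen embeddings to predict an interpretable attribute. It is a generic illustration, not the MOLGRAPHEVAL API: the embeddings and the probed property (a stand-in for a generic graph attribute such as atom count) are synthetic placeholders.

```python
# Generic linear-probe sketch on frozen graph embeddings; data are
# synthetic placeholders, not actual GSSL outputs or MOLGRAPHEVAL tasks.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(1000, 128))   # frozen embeddings (placeholder)
# Probed attribute, e.g. a generic graph property; synthetic stand-in here.
target = embeddings[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=1000)

X_tr, X_te, y_tr, y_te = train_test_split(embeddings, target, random_state=0)
probe = Ridge(alpha=1.0).fit(X_tr, y_tr)    # simple probe on frozen features
print("probe R^2:", round(r2_score(y_te, probe.predict(X_te)), 3))
```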