871 research outputs found

    IMPLEMENTATION OF GENETIC ALGORITHM TO OPTIMIZE THE ASSEMBLY SEQUENCE PLAN BASED ON PENALTY FUNCTION

    Genetic Algorithms (GA) are conceptually well suited to optimizing the Assembly Sequence Planning (ASP) problem. A GA was implemented in this research to optimize the ASP problem because it can easily handle large search spaces, offers flexibility in defining constraints, and allows those constraints to be folded into a fitness function. A penalty function approach was used to compute the fitness value of assembly sequences. The penalty function approach was chosen because the penalties are easy to define, realistically capture the difficulties associated with the assembly process, and the number of penalties to consider is relatively small. The evaluation of the penalty function is simple and straightforward, a most desirable feature for a population-based search.
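
    The penalty-function fitness described above is straightforward to sketch in code. Below is a minimal, illustrative Python example, not the authors' implementation: the parts list, precedence constraints, penalty weight, and GA operators are assumptions chosen for demonstration. A sequence's fitness is simply the number of violated precedence constraints multiplied by a fixed penalty, so lower is better.

```python
# Illustrative sketch (not the paper's code): a permutation-encoded GA for
# assembly sequence planning where infeasible orderings incur a fixed penalty.
import random

PARTS = ["base", "shaft", "gear", "cover", "bolt"]                      # hypothetical assembly
PRECEDENCE = [("base", "shaft"), ("shaft", "gear"), ("gear", "cover")]  # a must precede b
PENALTY = 10.0                                                          # cost per violated constraint

def fitness(sequence):
    """Lower is better: one fixed penalty per violated precedence constraint."""
    position = {part: i for i, part in enumerate(sequence)}
    violations = sum(1 for a, b in PRECEDENCE if position[a] > position[b])
    return violations * PENALTY

def crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest in p2's order."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    middle = p1[i:j]
    rest = [part for part in p2 if part not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(sequence, rate=0.2):
    """With some probability, swap two parts in the sequence."""
    sequence = sequence[:]
    if random.random() < rate:
        a, b = random.sample(range(len(sequence)), 2)
        sequence[a], sequence[b] = sequence[b], sequence[a]
    return sequence

def evolve(pop_size=30, generations=100):
    population = [random.sample(PARTS, len(PARTS)) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        survivors = population[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return min(population, key=fitness)

# Prints a sequence with no precedence violations, e.g. ['base', 'shaft', 'gear', 'bolt', 'cover']
print(evolve())
```

    A real ASP fitness function would typically add further penalty terms, for example for assembly-direction changes, tool changes, or unstable intermediate subassemblies, on top of this precedence term.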

    4D printing of materials for the future: Opportunities and challenges

    The concept of 4D printing is the formation of complex three-dimensional structures that can adopt different shapes and forms when subjected to different environmental stimuli. A few researchers view 4D printing simply as an extended technique of 3D printing or additive manufacturing with the added constraint of time. However, the unique shape-change mechanism exhibited in this process is a combination of shape programming and the use of smart active materials, mostly polymers. This review article highlights the various smart materials, activation mechanisms, and shape-changing techniques employed in the 4D printing process. The potential of shape-changing structures and their current applications in various biomedical and engineering fields is also explored. The article aims to emphasize the potential and viability of 4D printing and to provide an in-depth insight into the 4D printing process.

    Domi Inter Astra (DIA) Moon Base: an interdisciplinary approach for cooperation to build a near-future Moonbase and how to use it as an educational tool

    Permanent human settlements beyond low-Earth orbit face technical and psycho-social challenges for crew members, as well as programmatic risks around funding and operating these missions without clear public support and international involvement. A concept for the construction and operation of a lunar settlement named "Domi Inter Astra" (DIA), near the Shackleton Crater, was developed to understand the feasibility of a near-term permanent settlement crewed by international researchers and tourists. This project was created by a team under the Space Generation Advisory Council's auspices as a follow-on to our First Place design in the Moon Base Design Contest by The Moon Society. Technologies for infrastructure, life-support, environment control, and robotics were selected using high-level trade studies to balance resource requirements, safety, reliability, operability, and maintainability of the base over a long (20+ year) operating life with 10-30 inhabitants. Technology roadmaps were developed for gaps in existing technologies, considering opportunities with ISRU and methods of closing the environment control and life support system loops. A wider range of human factors pertaining to the social environment onboard the base is discussed to ensure long-term stability. Architectural design choices were made with these factors in mind while also considering technical and economic viability. Large-scale space exploration projects must mitigate both public-interest and funding risks throughout their life cycle. Economic roadmaps are introduced to diversify revenue streams throughout the settlement's design, deployment, and operation. Funding opportunities that evolve with the base design and functionality over time are identified for long-term economic sustainability. A polycentric model for international collaboration is explored to promote interest from current space-leading countries while providing opportunities for emerging space nations. The DIA lunar settlement case study showcases the interrelation between the engineering, economics, architecture, science, social, and management scopes. It highlights the interdisciplinary approach and inclusivity in the field of space sciences. This case study can help international and public-private partnerships to further develop human space exploration capabilities. The current DIA base plan could be used in many ways for educational activities, for students and professionals at any level. Two types of activities could be offered: design-and-analysis exercises and mini analogue missions. Students could devise and perform small experiments that relate to the base's day-to-day activities and the resources required, for example growing microgreens and plants under different conditions, geology surveys, 3D printing different objects, and many similar mini-projects. Graduate students and professionals could work on CAD modelling of structures, improving the architectural plan, and statistical analysis for the economic model.

    4D printing of smart polymer nanocomposites: integrating graphene and acrylate based shape memory polymers

    The ever-increasing demand for materials with superior properties that can fulfil functional requirements in the field of soft robotics and beyond has resulted in the advent of the new field of four-dimensional (4D) printing. The ability of these materials to respond to various stimuli inspires novel applications and opens several research possibilities. In this work, we report on the 4D printing of one such Shape Memory Polymer (SMP), tBA-co-DEGDA (tert-butyl acrylate with diethylene glycol diacrylate). The novelty lies in establishing the relationship between the various characteristic properties (tensile stress, surface roughness, recovery time, strain fixity, and glass transition temperature) and the governing print parameters, laser pulse frequency and print speed, in the micro-stereolithography (Micro SLA) method. It is found that the sample printed at a speed of 90 mm/s and 110 pulses/s possessed the best combination of properties, with a shape fixity of about 86.3% and recovery times as low as 6.95 s. The samples built using the optimal parameters are further modified by the addition of graphene nanoparticles, which further enhances the mechanical and surface properties. It is observed that the addition of 0.3 wt.% graphene nanoparticles provides the best results.

    Risk Portfolio Optimization Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Risk portfolio management in modern finance has become increasingly technical, requiring the use of sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, given the human inability to predict the future precisely as written in Al-Quran surah Luqman verse 34, they have to manage risk to yield an optimal portfolio. The objective here is to minimize the variance among all portfolios that achieve at least a certain expected return, or alternatively, to maximize the expected return among all portfolios whose variance does not exceed a given level. This study focuses on optimizing the risk portfolio using the so-called Markowitz MVO (Mean-Variance Optimization) model. The theoretical frameworks used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function xᵀΣx subject to the constraints μᵀx ≥ r and Ax = b, where Σ is the covariance matrix, μ the vector of expected returns, and r the required minimum return. The outcome of this research is the optimal risk portfolio solution for a set of investments, obtained using MATLAB R2007b software together with its graphical analysis.
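
    To make the quadratic program above concrete, here is a minimal sketch in Python (the study itself used MATLAB R2007b); the expected returns, covariance matrix, required return, and long-only bounds are illustrative assumptions rather than data from the paper.

```python
# Minimal mean-variance optimization sketch: minimize x' Sigma x subject to
# mu' x >= r_min and the budget constraint sum(x) = 1 (an instance of Ax = b).
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.15])          # hypothetical expected returns
Sigma = np.array([[0.10, 0.02, 0.01],      # hypothetical covariance matrix
                  [0.02, 0.12, 0.03],
                  [0.01, 0.03, 0.20]])
r_min = 0.10                               # required minimum expected return

def variance(x):
    return x @ Sigma @ x                   # objective: portfolio variance

constraints = [
    {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},   # weights sum to one
    {"type": "ineq", "fun": lambda x: mu @ x - r_min},     # expected return >= r_min
]
bounds = [(0.0, 1.0)] * len(mu)            # long-only weights

result = minimize(variance, x0=np.ones(len(mu)) / len(mu),
                  bounds=bounds, constraints=constraints, method="SLSQP")
print("weights:", result.x.round(3), "variance:", float(variance(result.x)))
```

    Because the objective is a convex quadratic and the constraints are linear, a general-purpose solver such as SLSQP (or MATLAB's quadprog) finds the global minimum-variance weights.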

    The genetic architecture of the human cerebral cortex

    The cerebral cortex underlies our complex cognitive capabilities, yet little is known about the specific genetic loci that influence human cortical structure. To identify genetic variants that affect cortical structure, we conducted a genome-wide association meta-analysis of brain magnetic resonance imaging data from 51,665 individuals. We analyzed the surface area and average thickness of the whole cortex and 34 regions with known functional specializations. We identified 199 significant loci and found significant enrichment for loci influencing total surface area within regulatory elements that are active during prenatal cortical development, supporting the radial unit hypothesis. Loci that affect regional surface area cluster near genes in Wnt signaling pathways, which influence progenitor expansion and areal identity. Variation in cortical structure is genetically correlated with cognitive function, Parkinson's disease, insomnia, depression, neuroticism, and attention deficit hyperactivity disorder
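
    A genome-wide association meta-analysis of this kind typically combines per-cohort effect estimates by inverse-variance weighting. The short Python sketch below illustrates that standard calculation for a single variant and phenotype; it is not the consortium's actual pipeline, and the effect sizes and standard errors are invented.

```python
# Fixed-effects, inverse-variance-weighted meta-analysis for one variant:
# each cohort contributes (beta, standard error); weights are w_i = 1 / SE_i^2.
import math

cohort_estimates = [(0.021, 0.010), (0.015, 0.008), (0.030, 0.015)]  # made-up (beta, SE)

weights = [1.0 / se ** 2 for _, se in cohort_estimates]
beta_meta = sum(w * b for (b, _), w in zip(cohort_estimates, weights)) / sum(weights)
se_meta = math.sqrt(1.0 / sum(weights))
z_score = beta_meta / se_meta

print(f"combined beta = {beta_meta:.4f}, SE = {se_meta:.4f}, Z = {z_score:.2f}")
```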

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome, including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.

    Large expert-curated database for benchmarking document similarity detection in biomedical literature search

    Document recommendation systems for locating relevant literature have mostly relied on methods developed a decade ago. This is largely due to the lack of a large offline gold-standard benchmark of relevant documents that covers a variety of research fields, such that newly developed literature search techniques can be compared, improved and translated into practice. To overcome this bottleneck, we have established the RElevant LIterature SearcH consortium, consisting of more than 1500 scientists from 84 countries, who have collectively annotated the relevance of over 180,000 PubMed-listed articles with regard to their respective seed (input) articles. The majority of annotations were contributed by highly experienced, original authors of the seed articles. The collected data cover 76% of all unique PubMed Medical Subject Headings descriptors. No systematic biases were observed across different experience levels, research fields or time spent on annotations. More importantly, annotations of the same document pairs contributed by different scientists were highly concordant. We further show that the three representative baseline methods used to generate recommended articles for evaluation (Okapi Best Matching 25, Term Frequency-Inverse Document Frequency and PubMed Related Articles) had similar overall performances. Additionally, we found that these methods each tend to produce distinct collections of recommended articles, suggesting that a hybrid method may be required to completely capture all relevant articles. The established database server located at https://relishdb.ict.griffith.edu.au is freely available for the downloading of annotation data and the blind testing of new methods. We expect that this benchmark will be useful for stimulating the development of new powerful techniques for title and title/abstract-based search engines for relevant articles in biomedical research.
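
    Of the baseline methods named above, TF-IDF similarity is the simplest to illustrate. The Python sketch below ranks a few placeholder abstracts against a seed article by TF-IDF cosine similarity; it is only a toy stand-in for the evaluated systems, and the example texts are invented.

```python
# Rank candidate documents against a seed article by TF-IDF cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

seed = "genome-wide association study of cortical surface area and thickness"
candidates = [
    "meta-analysis of brain MRI phenotypes and genetic loci",
    "deep learning for jet tagging at hadron colliders",
    "shape memory polymers for 4D printing applications",
]

# Fit TF-IDF on the seed plus candidates so all documents share one vocabulary.
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([seed] + candidates)

# Cosine similarity of each candidate (rows 1..n) against the seed (row 0).
scores = cosine_similarity(matrix[0], matrix[1:]).ravel()
for score, text in sorted(zip(scores, candidates), reverse=True):
    print(f"{score:.3f}  {text}")
```

    Okapi BM25 replaces the TF-IDF weighting with a saturating term-frequency function and document-length normalization, which helps explain why such baselines can retrieve partly different article sets.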

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at √s = 13 TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb⁻¹, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb⁻¹, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.

    Combined searches for the production of supersymmetric top quark partners in proton-proton collisions at √s = 13 TeV

    A combination of searches for top squark pair production using proton-proton collision data at a center-of-mass energy of 13 TeV at the CERN LHC, corresponding to an integrated luminosity of 137 fb⁻¹ collected by the CMS experiment, is presented. Signatures with at least 2 jets and large missing transverse momentum are categorized into events with 0, 1, or 2 leptons. New results for regions of parameter space where the kinematical properties of top squark pair production and top quark pair production are very similar are presented. Depending on the model, the combined result excludes a top squark mass up to 1325 GeV for a massless neutralino, and a neutralino mass up to 700 GeV for a top squark mass of 1150 GeV. Top squarks with masses from 145 to 295 GeV, for neutralino masses from 0 to 100 GeV, with a mass difference between the top squark and the neutralino in a window of 30 GeV around the mass of the top quark, are excluded for the first time with CMS data. The results of these searches are also interpreted in an alternative signal model of dark matter production via a spin-0 mediator in association with a top quark pair. Upper limits are set on the cross section for mediator particle masses of up to 420 GeV.