3,750 research outputs found

    Applications of Artificial Intelligence in Healthcare

    Get PDF
    Today, artificial intelligence (AI) plays a major role in healthcare. It has many applications in diagnosis, robotic surgery, and research, powered by the growing availability of healthcare data and the rapid improvement of analytical techniques. AI systems are designed to encode knowledge comparable to a human's while operating more efficiently. A surgical robot can match a surgeon's expertise; even when a procedure takes longer, its sutures, precision, and uniformity surpass those of the surgeon, reducing the chance of failure. Making all this possible requires suitable sets of algorithms. Artificial intelligence comprises two key categories, machine learning (ML) and natural language processing (NLP), both of which are necessary for achieving practically any aim in healthcare. The goal of this study is to track current scientific advancements, understand technological availability, recognize the enormous potential of AI in healthcare, and encourage scientists to apply AI in their fields of research. Discoveries and advancements will continue to push the AI frontier and expand the scope of its applications, with rapid developments expected in the future.

    Intelligent Computing: The Latest Advances, Challenges and Future

    Get PDF
    Computing is a critical driving force in the development of human civilization. In recent years, we have witnessed the emergence of intelligent computing, a new computing paradigm that is reshaping traditional computing and promoting the digital revolution in the era of big data, artificial intelligence, and the Internet of Things, with new computing theories, architectures, methods, systems, and applications. Intelligent computing has greatly broadened the scope of computing, extending it from traditional computing on data to increasingly diverse computing paradigms such as perceptual intelligence, cognitive intelligence, autonomous intelligence, and human-computer fusion intelligence. Intelligence and computing have long followed different paths of evolution and development but have become increasingly intertwined in recent years: intelligent computing is not only intelligence-oriented but also intelligence-driven. Such cross-fertilization has prompted the emergence and rapid advancement of intelligent computing. Intelligent computing is still in its infancy, and an abundance of innovations in its theories, systems, and applications is expected to occur soon. We present the first comprehensive survey of the literature on intelligent computing, covering its theoretical fundamentals, the technological fusion of intelligence and computing, important applications, challenges, and future perspectives. We believe that this survey is highly timely and will provide a comprehensive reference and valuable insights into intelligent computing for academic and industrial researchers and practitioners.

    Artificial intelligence: A powerful paradigm for scientific research

    Get PDF
    Artificial intelligence (AI), coupled with promising machine learning (ML) techniques well known from computer science, is broadly affecting many aspects of various fields including science and technology, industry, and even our day-to-day life. ML techniques have been developed to analyze high-throughput data with a view to obtaining useful insights, categorizing, predicting, and making evidence-based decisions in novel ways, which will promote the growth of novel applications and fuel the sustained growth of AI. This paper undertakes a comprehensive survey of the development and application of AI in different aspects of fundamental sciences, including information science, mathematics, medical science, materials science, geoscience, life science, physics, and chemistry. The challenges that each discipline of science meets, and the potential of AI techniques to handle these challenges, are discussed in detail. Moreover, we shed light on new research trends entailing the integration of AI into each scientific discipline. The aim of this paper is to provide a broad research guideline on fundamental sciences with potential infusion of AI, to help motivate researchers to deeply understand the state-of-the-art applications of AI-based fundamental sciences, and thereby to help promote the continuous development of these fundamental sciences.

    Current Status and Emerging Trends in Colorectal Cancer Screening and Diagnostics

    Get PDF
    Colorectal cancer (CRC) is a prevalent and potentially fatal disease; its high incidence and mortality rates raise the need for effective diagnostic strategies for the early detection and management of CRC. While several conventional cancer diagnostics are available, they have limitations that hinder their effectiveness. Significant research efforts are currently dedicated to elucidating novel methodologies aimed at understanding the intricate molecular mechanisms that underlie CRC. Recently, microfluidic diagnostics have emerged as a pivotal solution, offering non-invasive approaches to real-time monitoring of disease progression and treatment response. Microfluidic devices enable the integration of multiple sample-preparation steps into a single platform, which speeds up processing and improves sensitivity. Such advancements in diagnostic technologies hold immense promise for revolutionizing the field of CRC diagnosis and enabling efficient detection and monitoring strategies. This article elucidates several of the latest developments in microfluidic technology for CRC diagnostics. In addition, the integration of artificial intelligence (AI) holds great promise for further enhancing diagnostic capabilities. Together, advances in microfluidic systems and AI-driven approaches can revolutionize colorectal cancer diagnostics, offering accurate, efficient, and personalized strategies to improve patient outcomes and transform cancer management.

    Deep Artificial Neural Networks and Neuromorphic Chips for Big Data Analysis: Pharmaceutical and Bioinformatics Applications

    Get PDF
    Over the past decade, Deep Artificial Neural Networks (DNNs) have become the state-of-the-art algorithms in Machine Learning (ML), speech recognition, computer vision, natural language processing, and many other tasks. This was made possible by advances in Big Data, Deep Learning (DL), and drastically increased chip processing power, especially general-purpose graphics processing units (GPGPUs). All this has created a growing interest in making the most of the potential offered by DNNs in almost every field. An overview of the main architectures of DNNs and their usefulness in Pharmacology and Bioinformatics is presented in this work. The featured applications are: drug design, virtual screening (VS), Quantitative Structure–Activity Relationship (QSAR) research, protein structure prediction, and genomics (and other omics) data mining. The future need for neuromorphic hardware for DNNs is also discussed, and the two most advanced chips are reviewed: IBM TrueNorth and SpiNNaker. In addition, this review points out that DNNs and neuromorphic chips should consider not only neurons but also glial cells, given the proven importance of astrocytes, a type of glial cell that contributes to information processing in the brain. Deep Artificial Neuron–Astrocyte Networks (DANAN) could overcome the difficulties in architecture design, learning process, and scalability of current ML methods. Funding: Galicia, Consellería de Cultura, Educación e Ordenación Universitaria (GRC2014/049, R2014/039); Instituto de Salud Carlos III (PI13/0028).
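As a minimal illustration of the kind of feed-forward DNN used for QSAR and virtual screening classification, the sketch below trains a small MLP on synthetic descriptor data. The descriptors, labels, layer sizes, and learning rate are all invented for illustration; they are not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: 200 "compounds" x 16 hypothetical molecular descriptors, with a
# synthetic active/inactive label (a stand-in for real QSAR training data).
X = rng.normal(size=(200, 16))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.3).astype(float)

# One-hidden-layer MLP trained by full-batch gradient descent on logistic loss.
W1 = rng.normal(scale=0.3, size=(16, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.3, size=(32, 1));  b2 = np.zeros(1)

def forward(X):
    h = np.maximum(0.0, X @ W1 + b1)           # ReLU hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    return h, p.ravel()

lr = 0.05
for _ in range(300):
    h, p = forward(X)
    g = (p - y)[:, None] / len(X)              # dLoss/dlogit for logistic loss
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (h > 0)                    # backprop through ReLU
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, p = forward(X)
acc = ((p > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2f}")
```

Real virtual-screening models use far larger descriptor sets (or learned molecular representations) and regularized, GPU-trained networks, but the forward/backward structure is the same.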

    A Machine Learning-Based Methodology for Automated eFuse Configuration Generation in NAND Flash Chips

    Get PDF
    ํ•™์œ„๋…ผ๋ฌธ(์„์‚ฌ)--์„œ์šธ๋Œ€ํ•™๊ต ๋Œ€ํ•™์› :๊ณต๊ณผ๋Œ€ํ•™ ์ปดํ“จํ„ฐ๊ณตํ•™๋ถ€,2019. 8. ์œ ์Šน์ฃผ.Post fabrication process is becoming more and more important as memory technology becomes complex, in the bid to satisfy target performance and yield across diverse business domains, such as servers, PCs, automotive, mobiles, and embedded devices, etc. Electronic fuse adjustment (eFuse optimization and trimming) is a traditional method used in the post fabrication processing of memory chips. Engineers adjust eFuse to compensate for wafer inter-chip variations or guarantee the operating characteristics, such as reliability, latency, power consumption, and I/O bandwidth. These require highly skilled expert engineers and yet take significant time. This paper proposes a novel machine learning-based method of automatic eFuse configuration to meet the target NAND flash operating characteristics. The proposed techniques can maximally reduce the expert engineers workload. The techniques consist of two steps: initial eFuse generation and eFuse optimization. In the first step, we apply the variational autoencoder (VAE) method to generate an initial eFuse configuration that will probably satisfy the target characteristics. In the second step, we apply the genetic algorithm (GA), which attempts to improve the initial eFuse configuration and finally achieve the target operating characteristics. We evaluate the proposed techniques with Samsung 64-Stacked vertical NAND (VNAND) in mass production. The automatic eFuse configuration takes only two days to complete the implementation.๋ฉ”๋ชจ๋ฆฌ ๊ณต์ • ๊ธฐ์ˆ ์ด ๋ฐœ์ „ํ•˜๊ณ  ๋น„์ฆˆ๋‹ˆ์Šค ์‹œ์žฅ์ด ๋‹ค์–‘ํ•ด ์ง์— ๋”ฐ๋ผ ์›จ์ดํผ ์ˆ˜์œจ์„ ๋†’์ด๊ณ  ๋น„์ฆˆ๋‹ˆ์Šค ํŠน์„ฑ ๋ชฉํ‘œ๋ฅผ ๋งŒ์กฑํ•˜๊ธฐ ์œ„ํ•œ ํ›„ ๊ณต์ • ๊ณผ์ •์ด ๋งค์šฐ ์ค‘์š”ํ•ด ์ง€๊ณ  ์žˆ๋‹ค. ์ „๊ธฐ์  ํ“จ์ฆˆ ์กฐ์ ˆ ๋ฐฉ์‹(์ด-ํ“จ์ฆˆ ์ตœ์ ํ™” ๋ฐ ํŠธ๋ฆผ)์€ ๋ฉ”๋ชจ๋ฆฌ ์นฉ ํ›„ ๊ณต์ • ๊ณผ์ •์—์„œ ์‚ฌ์šฉ๋˜๋Š” ์ „ํ†ต์ ์ธ ๋ฐฉ์‹์ด๋‹ค. 
์—”์ง€๋‹ˆ์–ด๋Š” ์ด-ํ“จ์ฆˆ ์กฐ์ ˆ์„ ํ†ตํ•ด ์›จ์ดํผ ์ƒ์˜ ์นฉ๋“ค ๊ฐ„์˜ ์ดˆ๊ธฐ ํŠน์„ฑ์˜ ๋ณ€ํ™”๋ฅผ ๋ณด์ƒํ•˜๊ฑฐ๋‚˜, ์‹ ๋ขฐ์„ฑ, ๋ ˆ์ดํ„ด์‹œ, ํŒŒ์›Œ ์†Œ๋ชจ, ๊ทธ๋ฆฌ๊ณ  I/O ๋Œ€์—ญํญ ๋“ฑ์˜ ์นฉ ๋ชฉํ‘œ ํŠน์„ฑ์„ ๋ณด์žฅํ•œ๋‹ค. ์ด-ํ“จ์ฆˆ ์กฐ์ ˆ ์—…๋ฌด๋Š” ๋‹ค์ˆ˜์˜ ์ˆ™๋ จ๋œ ์—”์ง€๋‹ˆ์–ด๊ฐ€ ํ•„์š”ํ•˜๊ณ  ๋˜ํ•œ ์ƒ๋‹นํžˆ ๋งŽ์€ ์‹œ๊ฐ„์„ ์†Œ๋ชจํ•œ๋‹ค. ๋ณธ ๋…ผ๋ฌธ์—์„œ๋Š” ๋‚ธ๋“œ ํ”Œ๋ž˜์‹œ ์นฉ์˜ ๋™์ž‘ ํŠน์„ฑ ๋ชฉํ‘œ๋ฅผ ์–ป๊ธฐ ์œ„ํ•œ ๊ธฐ๊ณ„ ํ•™์Šต ๊ธฐ๋ฐ˜์˜ ์ด-ํ“จ์ฆˆ ์ž๋™ ์ƒ์„ฑ ๊ธฐ์ˆ ์„ ์ œ์•ˆํ•˜๊ณ , ํ•ด๋‹น ๊ธฐ์ˆ ์€ ์—”์ง€๋‹ˆ์–ด์˜ ์ž‘์—…์‹œ๊ฐ„์„ ํš๊ธฐ์ ์œผ๋กœ ๋‹จ์ถ•์‹œํ‚ฌ ์ˆ˜ ์žˆ๋‹ค. ๋…ผ๋ฌธ์˜ ๊ธฐ์ˆ ์€ ๋‘ ๋‹จ๊ณ„๋กœ ๊ตฌ์„ฑ ๋œ๋‹ค. ์ฒซ ๋ฒˆ์งธ ๋‹จ๊ณ„์—์„œ๋Š” variational autoencoder (VAE) ๊ธฐ์ˆ ์„ ์ ์šฉํ•˜์—ฌ ๋ชฉํ‘œํ•˜๋Š” ๋™์ž‘ ํŠน์„ฑ์„ ๋งŒ์กฑ์‹œํ‚ค๋Š” ์ดˆ๊ธฐ ์ด-ํ“จ์ฆˆ ๊ตฌ์„ฑ์„ ์ƒ์„ฑํ•œ๋‹ค. ๋‘ ๋ฒˆ์งธ ๋‹จ๊ณ„์—์„œ๋Š” ์œ ์ „ ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ ์šฉํ•˜์—ฌ ์ดˆ๊ธฐ ์ƒ์„ฑ๋œ ์ด-ํ“จ์ฆˆ ๊ตฌ์„ฑ์— ๋Œ€ํ•˜์—ฌ ๋ชฉํ‘œํ•˜๋Š” ์„ฑ๋Šฅ ํŠน์„ฑ๊ณผ์˜ ์ •ํ•ฉ์„ฑ์„ ์ถ”๊ฐ€๋กœ ๊ฐœ์„ ํ•˜์—ฌ ์ตœ์ข…์ ์œผ๋กœ ๋ชฉํ‘œํ•˜๋Š” ์„ฑ๋Šฅ ํŠน์„ฑ์„ ์–ป๋Š”๋‹ค. ๋…ผ๋ฌธ์˜ ํ‰๊ฐ€๋Š” ์‹ค์ œ ์–‘์‚ฐ์ค‘์ธ ์‚ผ์„ฑ 64๋‹จ ๋ธŒ์ด๋‚ธ๋“œ ์ œํ’ˆ์„ ์ด์šฉํ•˜์—ฌ ์ง„ํ–‰ํ•˜์˜€๋‹ค. ๋…ผ๋ฌธ์˜ ์ด-ํ“จ์ฆˆ ์ž๋™ํ™” ์ƒ์„ฑ ๊ธฐ์ˆ ์€ 2์ผ ์ด๋‚ด์˜ ๊ตฌํ˜„ ์‹œ๊ฐ„๋งŒ์ด ์†Œ์š”๋œ๋‹ค.Contents I. Introduction..........................................................................1 II. Background..........................................................................4 2.1. NAND Flash Block Architecture..................................................4 2.2. NAND Cell Vth Distribution........................................................5 2.3. eFuse Operation of NAND Flash Chip.......................................6 III. Basic Idea and Background...............................................7 3.1. Basic Idea.......................................................................................7 3.2. Background: Variational Autoencoder........................................10 IV. 
Initial eFuse Generation: VAE-Based Dual Network....14 V. eFuse Optimization: Genetic Algorithm..........................17 VI. Experimental Results.........................................................21 6.1. Experimental Setup......................................................................21 6.2. Initial eFuse Generation Results................................................23 6.3. eFuse Optimization Results........................................................26 6.4. Discussion.....................................................................................29 VII. Related Work..................................................................31 VIII. Conclusion.......................................................................33Maste
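The two-step pipeline described in the abstract (VAE-based initial generation followed by GA refinement) can be sketched as follows. Everything here is a hypothetical stand-in: the decoder is a fixed linear map instead of a trained VAE, the chip response is a synthetic function rather than real silicon measurements, and the dimensions, target values, and GA hyperparameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FUSE = 8                           # hypothetical number of eFuse fields
TARGET = np.array([0.5, 0.2, 0.8])   # hypothetical target operating characteristics

def chip_response(efuse):
    """Stand-in for measuring a chip's characteristics under an eFuse config."""
    W = np.linspace(-1.0, 1.0, 3 * N_FUSE).reshape(3, N_FUSE)
    return np.tanh(W @ efuse)

def decode(z):
    """Stand-in VAE decoder: maps a latent sample to an eFuse configuration."""
    D = np.cos(np.arange(2 * N_FUSE)).reshape(N_FUSE, 2)
    return D @ z

def initial_config():
    """Step 1: sample latent vectors, decode, keep the best candidate."""
    zs = rng.normal(size=(64, 2))
    cands = [decode(z) for z in zs]
    return min(cands, key=lambda c: np.sum((chip_response(c) - TARGET) ** 2))

def ga_optimize(seed, pop_size=40, gens=60, sigma=0.1):
    """Step 2: genetic algorithm refines the initial configuration."""
    pop = seed + rng.normal(scale=sigma, size=(pop_size, N_FUSE))
    pop[0] = seed
    for _ in range(gens):
        fit = np.array([np.sum((chip_response(p) - TARGET) ** 2) for p in pop])
        elite = pop[np.argsort(fit)[: pop_size // 4]]        # selection
        parents = elite[rng.integers(0, len(elite), size=(pop_size, 2))]
        mask = rng.random((pop_size, N_FUSE)) < 0.5          # uniform crossover
        pop = np.where(mask, parents[:, 0], parents[:, 1])
        pop += rng.normal(scale=sigma, size=pop.shape)       # mutation
        pop[0] = elite[0]                                    # elitism
    fit = np.array([np.sum((chip_response(p) - TARGET) ** 2) for p in pop])
    return pop[np.argmin(fit)]

seed = initial_config()
best = ga_optimize(seed)
err0 = np.sum((chip_response(seed) - TARGET) ** 2)
err1 = np.sum((chip_response(best) - TARGET) ** 2)
print(f"initial error {err0:.4f} -> optimized error {err1:.4f}")
```

The elitism line guarantees the GA never loses its best candidate, so the optimized error is at most the initial VAE-seeded error, mirroring the thesis's generate-then-refine design.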

    Rethinking drug design in the artificial intelligence era

    Get PDF
    Artificial intelligence (AI) tools are increasingly being applied in drug discovery. While some protagonists point to the vast opportunities potentially offered by such tools, others remain sceptical, waiting for a clear impact to be shown in drug discovery projects. The reality is probably somewhere in between these extremes, yet it is clear that AI is providing new challenges not only for the scientists involved but also for the biopharma industry and its established processes for discovering and developing new medicines. This article presents the views of a diverse group of international experts on the 'grand challenges' in small-molecule drug discovery with AI and the approaches to address them.

    Principal manifolds and graphs in practice: from molecular biology to dynamical systems

    Full text link
    We present several applications of non-linear data modeling using principal manifolds and principal graphs constructed with the metaphor of elasticity (the elastic principal graph approach). These approaches are generalizations of Kohonen's self-organizing maps, a class of artificial neural networks. Using several examples, we show the advantages of non-linear objects for data approximation in comparison with linear ones. We propose four numerical criteria for comparing linear and non-linear mappings of datasets into lower-dimensional spaces. The examples are taken from comparative political science, analysis of high-throughput data in molecular biology, and analysis of dynamical systems. Comment: 12 pages, 9 figures
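The elasticity metaphor can be illustrated with a simplified sketch: fit a chain of nodes to noisy non-linear data by alternating nearest-node assignment with an elasticity-penalized node update, then compare the approximation error against a linear PCA projection. The dataset, the node count, the penalty weight, and the simplified stretching energy are all illustrative assumptions, not the authors' exact formulation or criteria.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy non-linear dataset: noisy points along a parabola in 2-D.
t = rng.uniform(-1, 1, 300)
X = np.c_[t, t ** 2] + rng.normal(scale=0.05, size=(300, 2))

def approx_error(X, nodes):
    """Mean squared distance from each point to its nearest node."""
    d2 = ((X[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
    return d2.min(1).mean()

# Linear baseline: project the data onto its first principal component.
Xc = X - X.mean(0)
u = np.linalg.svd(Xc, full_matrices=False)[2][0]
X_lin = X.mean(0) + np.outer(Xc @ u, u)
lin_err = ((X - X_lin) ** 2).sum(1).mean()

# Elastic principal curve: a chain of nodes minimizing data-fitting energy
# plus a stretching penalty toward neighbouring nodes (simplified sketch).
K, lam = 12, 0.01
nodes = X.mean(0) + np.outer(np.linspace(-1, 1, K), u)  # init along PC1
for _ in range(50):
    d2 = ((X[:, None, :] - nodes[None, :, :]) ** 2).sum(-1)
    assign = d2.argmin(1)                                # nearest-node partition
    for k in range(K):
        pts = X[assign == k]
        neigh = [nodes[j] for j in (k - 1, k + 1) if 0 <= j < K]
        pull = lam * len(X) * np.mean(neigh, axis=0)     # elastic pull
        w = len(pts) + lam * len(X)
        nodes[k] = (pts.sum(0) + pull) / w if len(pts) else np.mean(neigh, axis=0)

curve_err = approx_error(X, nodes)
print(f"PCA line MSE {lin_err:.4f} vs elastic curve MSE {curve_err:.4f}")
```

On curved data like this, the node chain bends to follow the parabola, so its approximation error falls well below the straight PCA line's, which is the paper's central point about non-linear versus linear approximants.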

    Advanced photonic and electronic systems WILGA 2018

    Get PDF
    The WILGA annual symposium on advanced photonic and electronic systems has been organized by young scientists for young scientists for two decades. It traditionally gathers around 400 young researchers and their tutors. Ph.D. students and graduates present their recent achievements during well-attended oral sessions. WILGA is a very good digest of Ph.D. work carried out at technical universities in electronics and photonics, as well as information sciences, throughout Poland and some neighboring countries. Publishing patronage over WILGA is held by the Elektronika technical journal (SEP), IJET, and Proceedings of SPIE; the latter world editorial series publishes more than 200 papers from WILGA annually. WILGA 2018 was the XLII edition of this meeting. The following topical tracks were distinguished: photonics, electronics, information technologies, and system research. This article is a digest of selected works presented during the WILGA 2018 symposium. WILGA 2017 works were published in Proc. SPIE vol. 10445; WILGA 2018 works were published in Proc. SPIE vol. 10808.

    A decade of neural networks: Practical applications and prospects

    Get PDF
    On May 11-13, 1994, JPL's Center for Space Microelectronics Technology (CSMT) hosted a neural network workshop entitled 'A Decade of Neural Networks: Practical Applications and Prospects,' sponsored by DOD and NASA. The past ten years of renewed activity in neural network research have brought the technology to a crossroads regarding the overall scope of its future practical applicability. The purpose of the workshop was to bring together the sponsoring agencies, active researchers, and the user community to formulate a vision for the next decade of neural network research and development, with emphasis on practical applications. Of the 93 participants, roughly 15% were from government agencies, 30% from industry, 20% from universities, and 35% from Federally Funded Research and Development Centers (FFRDCs).