
    Computing and data processing

    The applications of computers and data processing to astronomy are discussed. Among the topics covered are the emerging national information infrastructure, workstations and supercomputers, supertelescopes, digital astronomy, astrophysics in a numerical laboratory, community software, archiving of ground-based observations, dynamical simulations of complex systems, plasma astrophysics, and the remote control of fourth-dimension supercomputers.

    Bionanomedicine: A “Panacea” In Medicine?

    Recent advances in nanotechnology, biotechnology, bioinformatics, and materials science have prompted novel developments in the field of nanomedicine. Improvements in theranostics, computational analysis, and the management of diseases and disorders are urgently needed, and nanomedicine may now make marked progress in these areas achievable. This concise scientific review concentrates on the fundamentals and potential of nanomedicine, particularly nanoparticles and their advantages, nanoparticles for siRNA delivery, nanopores, nanodots, nanotheragnostics, nanodrugs and targeting mechanisms, and aptamer nanomedicine. The convergence of these scientific fields is accelerating such developments, and these interdisciplinary efforts are expected to have far-reaching effects on other fields of research. The potential of nanomedicine is immense, and nanotechnology could give medicine a completely new outlook.

    From Social Simulation to Integrative System Design

    As the recent financial crisis showed, there is today a strong need to gain an "ecological perspective" of all relevant interactions in socio-economic-techno-environmental systems. For this, we suggest setting up a network of Centers for Integrative Systems Design, which shall be able to run all potentially relevant scenarios, identify causality chains, explore feedback and cascading effects for a number of model variants, and determine the reliability of their implications (given the validity of the underlying models). They will be able to detect possible negative side effects of policy decisions before they occur. The Centers belonging to this network of Integrative Systems Design Centers would each be focused on a particular field, but they would be part of an attempt to eventually cover all relevant areas of society and economy and integrate them within a "Living Earth Simulator". The results of all research activities of such Centers would be turned into informative input for political Decision Arenas. For example, Crisis Observatories (for financial instabilities, shortages of resources, environmental change, conflict, spreading of diseases, etc.) would be connected with such Decision Arenas for the purpose of visualization, in order to make complex interdependencies understandable to scientists, decision-makers, and the general public. Comment: 34 pages, Visioneer White Paper, see http://www.visioneer.ethz.c
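
    As a purely illustrative aside, the scenario-ensemble idea in this abstract (running many model variants and tracing feedback and cascading effects) can be made concrete with a toy cascade simulation. Everything below, including the network model, the failure threshold, and names such as simulate_cascade, is a hypothetical sketch and not part of the Visioneer proposal.

    # Minimal sketch: an ensemble of cascading-failure scenarios on random
    # interaction networks, swept over model variants (here: thresholds).
    # All names and parameters are illustrative assumptions.
    import random

    def random_network(n, k):
        """Each node depends on k randomly chosen other nodes."""
        return {i: random.sample([j for j in range(n) if j != i], k) for i in range(n)}

    def simulate_cascade(network, seed_failures, threshold=0.5):
        """A node fails once the failed fraction of its dependencies reaches threshold."""
        failed = set(seed_failures)
        changed = True
        while changed:
            changed = False
            for node, deps in network.items():
                if node in failed:
                    continue
                if deps and sum(d in failed for d in deps) / len(deps) >= threshold:
                    failed.add(node)
                    changed = True
        return failed

    def run_ensemble(n=200, k=4, variants=(0.3, 0.5, 0.7), runs=100):
        """Explore cascade size across model variants."""
        for threshold in variants:
            sizes = []
            for _ in range(runs):
                net = random_network(n, k)
                seeds = random.sample(range(n), 3)
                sizes.append(len(simulate_cascade(net, seeds, threshold)) / n)
            print(f"threshold={threshold}: mean cascade size {sum(sizes) / len(sizes):.2f}")

    if __name__ == "__main__":
        run_ensemble()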

    Many-Task Computing and Blue Waters

    This report discusses many-task computing (MTC) generically and in the context of the proposed Blue Waters system, which is planned to be the largest NSF-funded supercomputer when it begins production use in 2012. The aim of this report is to inform the BW project about MTC, including understanding aspects of MTC applications that can be used to characterize the domain and understanding the implications of these aspects for middleware and policies. Many MTC applications do not neatly fit the stereotypes of high-performance computing (HPC) or high-throughput computing (HTC) applications. Like HTC applications, MTC applications are by definition structured as graphs of discrete tasks, with explicit input and output dependencies forming the graph edges. However, MTC applications have significant features that distinguish them from typical HTC applications. In particular, different engineering constraints for hardware and software must be met in order to support these applications. HTC applications have traditionally run on platforms such as grids and clusters, through either workflow systems or parallel programming systems. MTC applications, in contrast, will often demand a short time to solution, may be communication intensive or data intensive, and may comprise very short tasks. Therefore, hardware and software for MTC must be engineered to support the additional communication and I/O and must minimize task dispatch overheads. The hardware of large-scale HPC systems, with its high degree of parallelism and support for intensive communication, is well suited for MTC applications. However, HPC systems often lack a dynamic resource-provisioning feature, are not ideal for task communication via the file system, and have an I/O system that is not optimized for MTC-style applications. Hence, additional software support is likely to be required to gain full benefit from the HPC hardware.
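
    Because the abstract characterizes MTC applications as graphs of discrete tasks whose explicit input/output dependencies form the edges, a minimal dependency-ordered dispatcher may help fix the idea. The sketch below is a toy DAG scheduler with assumed task names; it is not Blue Waters middleware or any specific MTC system.

    # Minimal sketch of MTC-style dispatch: tasks form a DAG through explicit
    # input/output dependencies, and every task whose inputs are ready is
    # handed to a worker pool. The example graph and task bodies are hypothetical.
    from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

    def run_dag(tasks, deps, max_workers=4):
        """tasks: name -> zero-argument callable; deps: name -> set of prerequisite names."""
        done, running = set(), {}
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            while len(done) < len(tasks):
                # Dispatch every task whose prerequisites have all finished.
                for name, body in tasks.items():
                    if name not in done and name not in running and deps[name] <= done:
                        running[name] = pool.submit(body)
                finished, _ = wait(running.values(), return_when=FIRST_COMPLETED)
                for name, fut in list(running.items()):
                    if fut in finished:
                        fut.result()            # re-raise any task error
                        done.add(name)
                        del running[name]

    if __name__ == "__main__":
        graph = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
        run_dag({name: (lambda name=name: print("ran", name)) for name in graph}, graph)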

    Linking brain structure, activity and cognitive function through computation

    Understanding the human brain is a “Grand Challenge” for 21st-century research. Computational approaches enable large and complex datasets to be addressed efficiently, supported by artificial neural networks, modeling, and simulation. Dynamic generative multiscale models, which enable the investigation of causation across scales and are guided by principles and theories of brain function, are instrumental for linking brain structure and function. An example of a resource enabling such an integrated approach to neuroscientific discovery is the BigBrain, which spatially anchors tissue models and data across different scales and ensures that multiscale models are supported by the data, bridging to both basic neuroscience and medicine. Research at the intersection of neuroscience, computing and robotics has the potential to advance neuro-inspired technologies by taking advantage of a growing body of insights into perception, plasticity and learning. To render data, tools and methods, theories, basic principles and concepts interoperable, the Human Brain Project (HBP) has launched EBRAINS, a digital neuroscience research infrastructure, which brings together a transdisciplinary community of researchers united by the quest to understand the brain, with fascinating insights and perspectives for societal benefits.
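
    As one hedged illustration of the multiscale linkage mentioned above, the toy model below couples a microscale integrate-and-fire population to a slower mesoscale activity variable. The parameters and coupling are illustrative assumptions only and do not represent an HBP, EBRAINS, or BigBrain model.

    # Minimal sketch of a two-scale model: a microscale leaky integrate-and-fire
    # population drives a slower mesoscale activity variable. All parameters are
    # illustrative assumptions.
    import random

    def simulate(n_neurons=100, steps=1000, dt=1e-3):
        v = [0.0] * n_neurons          # membrane potentials (arbitrary units)
        meso = 0.0                     # mesoscale aggregate activity
        tau_v, tau_m, threshold = 0.02, 0.2, 1.0
        for _ in range(steps):
            spikes = 0
            for i in range(n_neurons):
                drive = 1.2 + 0.5 * random.gauss(0, 1)     # noisy external input
                v[i] += dt * (drive - v[i]) / tau_v
                if v[i] >= threshold:                      # spike and reset
                    v[i] = 0.0
                    spikes += 1
            rate = spikes / (n_neurons * dt)               # instantaneous mean firing rate (Hz)
            meso += dt * (rate - meso) / tau_m             # slow aggregate tracks the rate
        return meso

    if __name__ == "__main__":
        print(f"final mesoscale activity: {simulate():.1f}")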

    The Clinical Significance of Computational Brain Models in Neurological Rehabilitation

    Despite the rapid development of new neurorehabilitation methods and techniques, there is a need for experimentally validated models of motor learning, neural control of movement, functional recovery, and therapy control strategies. Computational models are regarded as a further means of optimising and objectifying neurorehabilitation. A full understanding of neural repair is needed to simulate the reorganisation and remodelling of neural networks that occur as an effect of neurorehabilitation. Better understanding of these processes can significantly influence both traditional forms of therapy (neurosurgery, drug therapy, neurorehabilitation, etc.) and the use of advanced Assistive Technology (AT) solutions, e.g. brain-computer interfaces (BCIs) and neuroprostheses [49, 50] or artificial brain stimulation.
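
    A standard computational model in this literature is the trial-by-trial state-space account of motor adaptation, in which an internal estimate is updated from movement error. The sketch below is a minimal, generic version with assumed parameter values; it is not taken from the reviewed paper.

    # Minimal sketch of a state-space model of trial-by-trial motor adaptation:
    # x[t+1] = retention * x[t] + learning_rate * error[t]. The perturbation
    # schedule and parameter values are illustrative assumptions.
    def simulate_adaptation(trials=60, perturbation=1.0, retention=0.95, learning_rate=0.2):
        x = 0.0                          # internal estimate of the perturbation
        errors = []
        for _ in range(trials):
            error = perturbation - x     # movement error experienced on this trial
            x = retention * x + learning_rate * error
            errors.append(error)
        return errors

    if __name__ == "__main__":
        errs = simulate_adaptation()
        print("early errors:", [round(e, 2) for e in errs[:3]])
        print("late errors: ", [round(e, 2) for e in errs[-3:]])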

    Non-equilibrium dynamics in the dual-wavelength operation of vertical external-cavity surface-emitting lasers

    Microscopic many-body theory coupled to Maxwell's equations is used to investigate dual-wavelength operation in vertical external-cavity surface-emitting lasers. The intrinsically dynamic nature of coexisting emission wavelengths in semiconductor lasers is associated with characteristic non-equilibrium carrier dynamics, which cause significant deformations of the quasi-equilibrium gain and carrier inversion. Extended numerical simulations are employed to efficiently explore the parameter space and identify the regime of two-wavelength operation. Using a frequency-selective intracavity etalon, two families of modes are stabilized, with dynamical interchange of the strongest emission peaks. For this operation mode, anti-correlated intensity noise is observed, in agreement with experiment. A method using effective frequency-selective filtering is suggested for stabilizing genuine dual-wavelength output. Comment: 15 pages, 7 figures
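
    The anti-correlated intensity noise mentioned in the abstract can be illustrated, far more crudely than with the paper's microscopic many-body theory, by a two-mode rate-equation toy in which both modes share one carrier reservoir. All parameters below are illustrative assumptions.

    # Toy illustration (NOT the paper's model): two laser modes saturate a common
    # carrier reservoir, with slightly stronger self- than cross-saturation so
    # both modes coexist. Gain sharing through the reservoir tends to make the
    # two intensities fluctuate in an anti-correlated way.
    import math
    import random

    def simulate(steps=200_000, dt=1e-3, noise_amp=0.01):
        gamma_c, gamma_n, pump = 10.0, 1.0, 3.0   # cavity rate, carrier rate, pump level
        self_sat, cross_sat = 0.25, 0.20          # self- and cross-saturation coefficients
        i1, i2, n = 0.66, 0.66, 1.30              # start near the two-mode steady state
        h1, h2 = [], []
        for _ in range(steps):
            gain1 = n - self_sat * i1 - cross_sat * i2
            gain2 = n - self_sat * i2 - cross_sat * i1
            noise1 = noise_amp * math.sqrt(dt) * random.gauss(0, 1)
            noise2 = noise_amp * math.sqrt(dt) * random.gauss(0, 1)
            i1 += dt * gamma_c * (gain1 - 1.0) * i1 + noise1
            i2 += dt * gamma_c * (gain2 - 1.0) * i2 + noise2
            n += dt * gamma_n * (pump - n - n * (i1 + i2))
            h1.append(i1)
            h2.append(i2)
        h1, h2 = h1[steps // 5:], h2[steps // 5:]  # discard the initial transient
        m1, m2 = sum(h1) / len(h1), sum(h2) / len(h2)
        cov = sum((a - m1) * (b - m2) for a, b in zip(h1, h2)) / len(h1)
        v1 = sum((a - m1) ** 2 for a in h1) / len(h1)
        v2 = sum((b - m2) ** 2 for b in h2) / len(h2)
        return cov / math.sqrt(v1 * v2)

    if __name__ == "__main__":
        # The cross-correlation coefficient typically comes out clearly negative.
        print(f"intensity cross-correlation: {simulate():.2f}")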