Decoupling Cache Coherence From XML in Model Checking
Many electrical engineers would agree that, had it not been for link-level acknowledgements, the evaluation of congestion control might never have occurred. After years of significant research into Smalltalk, we validate the visualization of vacuum tubes. Our focus in our research is not on whether the famous encrypted algorithm for the visualization of simulated annealing by Garcia [5] is impossible, but rather on describing a novel application for the investigation of Lamport clocks (LealEst).
Hybrid Deterministic-Stochastic Methods for Data Fitting
Many structured data-fitting applications require the solution of an
optimization problem involving a sum over a potentially large number of
measurements. Incremental gradient algorithms offer inexpensive iterations by
sampling a subset of the terms in the sum. These methods can make great
progress initially, but often slow as they approach a solution. In contrast,
full-gradient methods achieve steady convergence at the expense of evaluating
the full objective and gradient on each iteration. We explore hybrid methods
that exhibit the benefits of both approaches. Rate-of-convergence analysis
shows that by controlling the sample size in an incremental gradient algorithm,
it is possible to maintain the steady convergence rates of full-gradient
methods. We detail a practical quasi-Newton implementation based on this
approach. Numerical experiments illustrate its potential benefits.
Comment: 26 pages. Revised proofs of Theorems 2.6 and 3.1; results unchanged.
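The sample-size control the analysis describes can be sketched in a few lines. This is an illustrative sketch only: the batch schedule, step size and growth factor are assumptions, and a plain gradient step stands in for the paper's quasi-Newton update.

```python
import numpy as np

def hybrid_gradient_descent(A, b, x0, step=0.05, n_iters=300, growth=1.1, seed=0):
    """Minimize f(x) = (1/2m) * ||A x - b||^2 with an incremental gradient
    method whose sample size grows each iteration until the full gradient
    is used. Sketch under assumed parameters, not the paper's method."""
    m = A.shape[0]
    x = np.asarray(x0, dtype=float).copy()
    batch = max(1.0, m / 20)            # small initial sample: cheap early progress
    rng = np.random.default_rng(seed)
    for _ in range(n_iters):
        s = min(int(batch), m)
        idx = rng.choice(m, size=s, replace=False)
        g = A[idx].T @ (A[idx] @ x - b[idx]) / s   # sampled mean gradient
        x -= step * g
        batch *= growth                 # controlled sample-size growth
    return x
```

Early iterations are as cheap as incremental-gradient steps; once `batch` reaches `m`, the method becomes an ordinary full-gradient iteration and inherits its steady convergence.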
Adding Isolated Vertices Makes some Online Algorithms Optimal
An unexpected difference between online and offline algorithms is observed.
The natural greedy algorithms are shown to be worst case online optimal for
Online Independent Set and Online Vertex Cover on graphs with 'enough' isolated
vertices, Freckle Graphs. For Online Dominating Set, the greedy algorithm is
shown to be worst case online optimal on graphs with at least one isolated
vertex. These algorithms are not online optimal in general. The online
optimality results for these greedy algorithms imply optimality according to
various worst case performance measures, such as the competitive ratio. It is
also shown that, despite this worst case optimality, there are Freckle graphs
where the greedy independent set algorithm is objectively less good than
another algorithm. It is shown that it is NP-hard to determine any of the
following for a given graph: the online independence number, the online vertex
cover number, and the online domination number.
Comment: A footnote in the .tex file didn't show up in the last version. This was fixed.
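As a concrete illustration, the natural greedy algorithm for Online Independent Set can be simulated as below, under the usual online conventions (assumed here, not spelled out in the abstract): vertices arrive one at a time, each revealing its edges to earlier vertices, and accept/reject decisions are irrevocable.

```python
def greedy_online_independent_set(arrival_order, edges):
    """Greedy online independent set: accept an arriving vertex iff it has
    no edge to an already-accepted vertex. Isolated vertices are always
    accepted, which is the case the paper's optimality results exploit."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    accepted = set()
    for v in arrival_order:
        if not (adj.get(v, set()) & accepted):
            accepted.add(v)
    return accepted
```

On the path 1-2-3 plus an isolated vertex 4, the arrival order determines the outcome ({2, 4} vs {1, 3, 4}), which is exactly the adversarial behaviour worst-case online analysis must account for.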
DeepWalk: Online Learning of Social Representations
We present DeepWalk, a novel approach for learning latent representations of
vertices in a network. These latent representations encode social relations in
a continuous vector space, which is easily exploited by statistical models.
DeepWalk generalizes recent advancements in language modeling and unsupervised
feature learning (or deep learning) from sequences of words to graphs. DeepWalk
uses local information obtained from truncated random walks to learn latent
representations by treating walks as the equivalent of sentences. We
demonstrate DeepWalk's latent representations on several multi-label network
classification tasks for social networks such as BlogCatalog, Flickr, and
YouTube. Our results show that DeepWalk outperforms challenging baselines which
are allowed a global view of the network, especially in the presence of missing
information. DeepWalk's representations can provide scores up to 10%
higher than competing methods when labeled data is sparse. In some experiments,
DeepWalk's representations are able to outperform all baseline methods while
using 60% less training data. DeepWalk is also scalable. It is an online
learning algorithm which builds useful incremental results, and is trivially
parallelizable. These qualities make it suitable for a broad class of real
world applications such as network classification and anomaly detection.
Comment: 10 pages, 5 figures, 4 tables.
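The walk-generation step DeepWalk builds on can be sketched as follows; the adjacency-list representation and hyperparameter defaults are illustrative assumptions. The resulting walks are then fed, as sentences, to a skip-gram language model (e.g. word2vec) to learn the latent vertex representations.

```python
import random

def truncated_random_walks(adj, num_walks=10, walk_length=40, seed=0):
    """Generate truncated random walks over a graph given as an adjacency
    list (dict: node -> list of neighbours). Each walk plays the role of
    a 'sentence' for a downstream skip-gram model."""
    rng = random.Random(seed)
    walks = []
    nodes = list(adj)
    for _ in range(num_walks):          # several passes over all vertices
        rng.shuffle(nodes)
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length:
                neighbours = adj[walk[-1]]
                if not neighbours:
                    break               # dead end: truncate the walk here
                walk.append(rng.choice(neighbours))
            walks.append(walk)
    return walks
```

Because each walk depends only on local neighbourhoods, walk generation is trivially parallelizable and new walks can be streamed into the model incrementally, matching the online, scalable qualities claimed above.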
Impact of proctoring on success rates for percutaneous revascularisation of coronary chronic total occlusions.
OBJECTIVE: To assess the impact of proctoring for chronic total occlusion (CTO) percutaneous coronary intervention (PCI) in six UK centres. METHODS: We retrospectively analysed 587 CTO procedures from six UK centres and compared success rates of operators who had received proctorship with success rates of the same operators before proctorship (pre-proctored) and of operators in the same institutions who had not been proctored (non-proctored). There were 232 patients in the pre-proctored/non-proctored group and 355 patients in the post-proctored group. Complexity was assessed by calculating the Japanese CTO (JCTO) score for each case. RESULTS: CTO PCI success was greater in the post-proctored than in the pre-proctored/non-proctored group (77.5% vs 62.1%, p<0.0001). In more complex cases, where JCTO≥2, the difference in success was greater (70.7% vs 49.5%, p=0.0003). After proctoring, CTO PCI activity in the centres increased from 2.5% to 3.5% of total PCI (p<0.0001), and the proportion of very difficult cases with a JCTO score ≥3 increased from 15.3% (35/229) to 29.7% (105/354), p<0.0001. CONCLUSIONS: Proctoring resulted in an increase in procedural success for CTO PCI, an increase in complex CTO PCI and an increase in total CTO PCI activity. Proctoring may be a valuable way to improve access to CTO PCI and the likelihood of procedural success.
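For intuition, the headline comparison (77.5% of 355 vs 62.1% of 232) can be checked with a standard two-proportion z-test. The counts below are reconstructed approximately from the reported rates and group sizes, not taken from the paper's own analysis.

```python
from math import erfc, sqrt

def two_proportion_z(successes1, n1, successes2, n2):
    """Two-sided two-proportion z-test with pooled standard error."""
    p1, p2 = successes1 / n1, successes2 / n2
    pooled = (successes1 + successes2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided normal tail probability
    return z, p_value

# Approximate counts from the abstract's rates and group sizes:
# post-proctored 77.5% of 355 ~ 275; pre-/non-proctored 62.1% of 232 ~ 144
z, p = two_proportion_z(275, 355, 144, 232)
```

With these reconstructed counts the test gives z of roughly 4 and a two-sided p-value below 0.0001, consistent with the significance level reported in the abstract.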
Toward the PSTN/Internet Inter-Networking--Pre-PINT Implementations
This document contains the information relevant to the development of the inter-networking interfaces underway in the Public Switched Telephone Network (PSTN)/Internet Inter-Networking (PINT) Working Group. It addresses technologies, architectures, and several (but by no means all) existing pre-PINT implementations of the arrangements through which Internet applications can request and enrich PSTN telecommunications services. The common denominator of the enriched services (a.k.a. PINT services) is that they combine Internet and PSTN services in such a way that the Internet is used for non-voice interactions, while voice (and fax) are carried entirely over the PSTN. One key observation is that the pre-PINT implementations, having been developed independently, do not inter-operate. It is a task of the PINT Working Group to define the inter-networking interfaces that will support inter-operation of future implementations of PINT services.
Visibility graphs of random scalar fields and spatial data
The family of visibility algorithms was recently introduced as mappings
between time series and graphs. Here we extend this method to characterize
spatially extended data structures by mapping scalar fields of arbitrary
dimension into graphs. After introducing several possible extensions, we
provide analytical results on some topological properties of these graphs
associated with certain types of real-valued matrices, which can be understood as
the high- and low-disorder limits of real-valued scalar fields. In particular,
we find a closed expression for the degree distribution of the graphs
associated with uncorrelated random fields of generic dimension, extending a
well-known result for one-dimensional time series. As this result holds independently
of the field's marginal distribution, we show that it directly yields a
statistical randomness test, applicable in any dimension. We showcase its
usefulness by discriminating spatial snapshots of two-dimensional white noise
from snapshots of a two-dimensional lattice of diffusively coupled chaotic
maps, a system that generates high dimensional spatio-temporal chaos. We
finally discuss the range of potential applications of this combinatorial
framework, which include image processing in engineering, the description of
surface growth in material science, soft matter or medicine and the
characterization of potential energy surfaces in chemistry, disordered systems
and high energy physics. An illustration of the applicability of this method
for the classification of the different stages involved in carcinogenesis is
briefly discussed.
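The one-dimensional case the paper generalizes can be sketched concretely. The horizontal visibility variant below (one common member of the visibility family, chosen here for brevity) links two points of a series iff every intermediate value lies strictly below both; for uncorrelated random series its degree distribution is known to follow P(k) = (1/3)(2/3)^(k-2), the kind of closed-form result the abstract extends to higher dimensions.

```python
def horizontal_visibility_graph(series):
    """Horizontal visibility graph of a 1-D series: nodes i < j are
    linked iff every value strictly between them lies below both
    series[i] and series[j]. Consecutive points are always linked."""
    edges = set()
    n = len(series)
    for i in range(n):
        top = float("-inf")             # running max of values between i and j
        for j in range(i + 1, n):
            if top < series[i] and top < series[j]:
                edges.add((i, j))
            top = max(top, series[j])
            if top >= series[i]:
                break                   # i is blocked from everything further right
    return edges
```

Comparing the empirical degree distribution of such a graph against the closed-form prediction is what turns the mapping into a statistical randomness test.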
A Neural Framework for Organization and Flexible Utilization of Episodic Memory in Cumulatively Learning Baby Humanoids
Cumulatively developing robots offer a unique opportunity to reenact the constant interplay between neural mechanisms related to learning, memory, prospection, and abstraction from the perspective of an integrated system that acts, learns, remembers, reasons, and makes mistakes. Situated within such interplay lie some of the computationally elusive and fundamental aspects of cognitive behavior: the ability to recall and flexibly exploit diverse experiences of one’s past in the context of the present to realize goals, simulate the future, and keep learning further. This article is an adventurous exploration in this direction using a simple engaging scenario of how the humanoid iCub learns to construct the tallest possible stack given an arbitrary set of objects to play with. The learning takes place cumulatively, with the robot interacting with different objects (some previously experienced, some novel) in an open-ended fashion. Since the solution itself depends on what objects are available in the “now,” multiple episodes of past experiences have to be remembered and creatively integrated in the context of the present to be successful. Starting from zero, where the robot knows nothing, we explore the computational basis of the organization of episodic memory in a cumulatively learning humanoid and address (1) how relevant past experiences can be reconstructed based on the present context, (2) how multiple stored episodic memories compete to survive in the neural space and not be forgotten, (3) how remembered past experiences can be combined with explorative actions to learn something new, and (4) how multiple remembered experiences can be recombined to generate novel behaviors (without exploration).
Through the resulting behaviors of the robot as it builds, breaks, learns, and remembers, we emphasize that mechanisms of episodic memory are fundamental design features necessary to enable the survival of autonomous robots in a real world where neither everything can be known nor everything experienced.
Passenger transport decarbonization in emerging economies: policy lessons from modelling long-term deep decarbonization pathways
Reaching the goal of the Paris Agreement will not be possible without a deep decarbonization of the passenger transport sector. In emerging economies, which are experiencing rapid economic growth, social transformations, and large-scale development of urban areas and associated infrastructure, both opportunities and challenges arise when considering a broader set of mitigation options. In this paper, we apply the Deep Decarbonization Pathways (DDP) approach to develop and report scenarios for the passenger transport sector in Brazil, India, Indonesia, and South Africa. This approach supports increased sectoral ambition by covering all drivers of change in transport mobility and by facilitating collective comparison and policy discussion of the barriers to, and enablers of, transitions. The scenario analysis illustrates that all four countries can achieve reductions in emissions per passenger kilometre of between 59% and 92% by 2050 while meeting growing mobility needs. Lastly, the analysis identifies short-term policies needed to address barriers and promote enablers.
Encapsulation of olanzapine into beeswax microspheres: preparation, characterization and release kinetics
The objective of the present study was to minimise the unwanted side effects of the drug olanzapine (OZ) by kinetic control of drug release, entrapping it into gastro-resistant, biodegradable beeswax (BW) microspheres using a meltable emulsified dispersion, cooling-induced solidification technique with a wetting agent. Solid, discrete, reproducible, free-flowing microspheres were obtained, with a yield of up to 94.0%. The microspheres had smooth surfaces and good flow and packing properties: the angle of repose, Carr’s index and tapped density values were all well within limits. Scanning electron microscopy (SEM) confirmed that more than 97.0% of the isolated spherical microspheres were in the particle size range of 312-330 μm. The drug loaded in the microspheres was stable and compatible with the carrier, as confirmed by DSC and FTIR studies. Drug release was controlled for more than 8 h. Intestinal drug release from the microspheres was studied and compared with the release behaviour of the commercially available formulation Olanex®. The release kinetics followed different transport mechanisms. Drug release from the beeswax microspheres was found sufficient for oral delivery, and the release profile was significantly affected by the properties of the wax used in the preparation of the microspheres. These results demonstrate the potential use of wax for the fabrication of controlled delivery devices for many water-soluble drugs.
Colegio de Farmacéuticos de la Provincia de Buenos Aires
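For illustration, the transport mechanism behind such release profiles is commonly identified by fitting the Korsmeyer-Peppas model and inspecting the exponent n; the release data below are hypothetical, not from the study.

```python
import numpy as np

def korsmeyer_peppas_fit(t, fraction_released):
    """Fit Mt/Minf = k * t**n by linear regression in log-log space.
    For spheres, n <= 0.43 suggests Fickian diffusion, larger n
    anomalous transport (Ritger-Peppas criteria)."""
    slope, intercept = np.polyfit(np.log(t), np.log(fraction_released), 1)
    return slope, np.exp(intercept)     # (n, k)

# Hypothetical release data (NOT from the study): fraction released over 8 h
t = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
f = 0.2 * t ** 0.45
n, k = korsmeyer_peppas_fit(t, f)
```

The fit is only valid over the early portion of the release curve (roughly the first 60% released), which is why sampling is restricted to the controlled-release window.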