Microservices: Granularity vs. Performance
Microservice Architectures (MA) have the potential to increase the agility of
software development. In an era where businesses require software applications
to evolve to support emerging requirements, particularly for Internet of
Things (IoT) applications, we examine the issue of microservice granularity
and explore its effect upon application latency. Two approaches to microservice
deployment are simulated: the first with microservices in a single container,
and the second with microservices partitioned across separate containers. We
observed a negligible increase in service latency for the multiple-container
deployment over a single container. Comment: 6 pages, conference
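The comparison above can be sketched as a small simulation. This is a hypothetical illustration, not the paper's actual model: the service count, processing times and per-hop network costs below are assumed values chosen only to show the shape of the experiment (co-located services call each other in-process; separate containers pay a network cost per hop).

```python
import random

# Illustrative sketch: a request traverses n_services microservices.
# When services run in separate containers, each hop adds an assumed
# network overhead; co-located services pay no such cost.
# All numeric parameters are made-up assumptions for illustration.

def simulate_request(rng, n_services=5, separate_containers=False):
    latency_ms = 0.0
    for _ in range(n_services):
        latency_ms += rng.uniform(1.0, 3.0)      # assumed service processing time
        if separate_containers:
            latency_ms += rng.uniform(0.1, 0.5)  # assumed per-hop network cost
    return latency_ms

def mean_latency(separate, trials=1000, seed=0):
    rng = random.Random(seed)
    return sum(simulate_request(rng, separate_containers=separate)
               for _ in range(trials)) / trials

single = mean_latency(False)
multi = mean_latency(True)
print(f"single container: {single:.2f} ms, separate containers: {multi:.2f} ms")
```

With these assumed parameters the per-hop overhead is small relative to processing time, so the multi-container mean sits only slightly above the single-container mean, which is the qualitative pattern the abstract reports.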
Granularity of corporate debt: [Version 9 May 2013]
We study to what extent firms spread out their debt maturity dates across time, which we call "granularity of corporate debt." We consider the role of debt granularity using a simple model in which a firm's inability to roll over expiring debt causes inefficiencies, such as costly asset sales or underinvestment. Since multiple small asset sales are less costly than a single large one, firms may diversify debt rollovers across maturity dates. We construct granularity measures using data on corporate bond issuers for the 1991-2011 period and establish a number of novel findings. First, there is substantial variation in granularity: many firms have either very concentrated or highly dispersed maturity structures. Second, our model's predictions are consistent with the observed variation in granularity. Corporate debt maturities are more dispersed for larger and more mature firms, for firms with better investment opportunities, with higher leverage ratios, and with lower levels of current cash flows. We also show that during the recent financial crisis, firms with valuable investment opportunities in particular implemented more dispersed maturity structures. Finally, granularity plays an important role in bond issuance: we document that newly issued corporate bond maturities complement pre-existing bond maturity profiles.
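One plausible way to quantify the dispersion idea above is an inverse-concentration measure over maturity dates. The 1 - Herfindahl form below is an assumption for illustration; the paper's actual granularity measures may be defined differently.

```python
from collections import defaultdict

# Hypothetical dispersion measure: how evenly a firm's outstanding bond
# face amounts are spread across maturity years. Returns 0 when all debt
# matures at one date, approaching 1 when debt is spread evenly over
# many dates. (1 minus the Herfindahl index of maturity-year shares.)

def maturity_dispersion(bonds):
    """bonds: iterable of (maturity_year, face_amount) pairs."""
    by_year = defaultdict(float)
    for year, amount in bonds:
        by_year[year] += amount
    total = sum(by_year.values())
    shares = [amt / total for amt in by_year.values()]
    herfindahl = sum(s * s for s in shares)
    return 1.0 - herfindahl

concentrated = maturity_dispersion([(2025, 100.0)])
dispersed = maturity_dispersion([(2025, 25.0), (2027, 25.0),
                                 (2029, 25.0), (2031, 25.0)])
print(concentrated, dispersed)  # 0.0 and 0.75
```

A firm with a single maturity date scores 0, while four equal-sized maturities score 0.75; more maturity dates with equal shares push the measure closer to 1.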
Model granularity and related concepts
Models are integral to engineering design and the basis for many decisions. Therefore, it is necessary to comprehend how a model's properties might influence its behaviour. Model granularity is an important property but has so far received only limited attention. The terminology used to describe granularity and related phenomena varies, and pertinent concepts are scattered across communities. This article positions granularity in the theoretical background of models, collects formal definitions of relevant terms from a range of communities and discusses the implications for engineering design.
Complex small-scale structure in the infrared extinction towards the Galactic Centre
A high level of complex structure, or "granularity", has been observed in
the distribution of infrared-obscuring material towards the Galactic Centre
(GC), with a characteristic scale of 5-15 arcsec, corresponding to 0.2-0.6 pc
at a GC distance of 8.5 kpc. This structure has been observed in ISAAC
images, which have a resolution of 0.6 arcsec, significantly higher than that
of previous studies of the GC.
We have discovered granularity throughout the GC survey region, which covers
an area of 1.6 deg x 0.8 deg in longitude and latitude respectively (300 pc x
120 pc at 8.5 kpc) centred on Sgr A*. This granularity varies over the whole
region, with some areas exhibiting highly structured extinction in one or more
wavebands and other areas displaying no structure and a uniform stellar
distribution in all wavebands. The granularity does not appear to correlate
with longitude, latitude or radial distance from Sgr A*. We find that regions
exhibiting high granularity are strongly associated with high stellar
reddening. Comment: 5 pages, 3 figures, accepted for publication in ApJ
Improving parallel program performance using critical path analysis
A programming tool that performs analysis of critical paths for parallel programs has been developed. This tool determines the critical path for the program as scheduled onto a parallel computer with P processing elements, the critical path for the program expressed as a data flow graph (when maximal parallelism can be expressed), and the minimum number of processing elements (P_opt) needed to obtain maximum program speedup. Experiments were performed using several versions of a Gaussian elimination program to examine how speedup varied with changes in granularity and critical path length. These experiments showed that when the available number of processing elements P < P_opt, increasing granularity improved program speedup more than reducing (the data flow graph's) critical path length, whereas when P ≥ P_opt, increasing granularity degraded program speedup while reducing critical path length improved program speedup.
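The data-flow critical path described above can be sketched in a few lines. This is not the tool itself: the task graph, task names and unit costs below are hypothetical, chosen to echo the dependency chain of a Gaussian elimination step (each elimination depends on its pivot row).

```python
# Hedged sketch of data-flow critical path analysis: the longest
# cost-weighted dependency chain bounds the program's finish time even
# with unlimited processing elements, so total_work / critical_path
# bounds the achievable speedup.

def critical_path_length(costs, deps):
    """costs: {task: time}; deps: {task: [predecessor tasks]}.
    Returns the longest cost-weighted path through the task DAG."""
    finish = {}
    def finish_time(task):
        if task not in finish:
            preds = deps.get(task, [])
            finish[task] = costs[task] + max(
                (finish_time(p) for p in preds), default=0)
        return finish[task]
    return max(finish_time(t) for t in costs)

# Made-up mini task graph: two eliminations depend on the first pivot,
# the second pivot depends on both eliminations, and so on.
costs = {"pivot1": 2, "elim1a": 1, "elim1b": 1, "pivot2": 2, "elim2": 1}
deps = {"elim1a": ["pivot1"], "elim1b": ["pivot1"],
        "pivot2": ["elim1a", "elim1b"], "elim2": ["pivot2"]}

total_work = sum(costs.values())        # time on a single processing element
cp = critical_path_length(costs, deps)  # lower bound with unlimited PEs
print(f"work={total_work}, critical path={cp}, "
      f"max speedup bound={total_work / cp:.2f}")
```

In this toy graph the critical path (pivot1 -> elim1a -> pivot2 -> elim2) is 6 time units against 7 units of total work, so no number of processing elements can yield more than about 1.17x speedup; coarsening granularity changes total_work per task, while restructuring dependencies changes cp, which mirrors the trade-off the experiments examine.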