
    Bank ownership, lending relationships and capital structure: Evidence from Spain

    This paper analyses the influence of bank ownership and lending on capital structure for a sample of listed and unlisted Spanish firms over the period 2005–2012. The results suggest that bank ownership allows banks to obtain better information and reduce the agency costs of debt: it has a positive relationship with debt maturity and a negative relationship with the cost of debt. These results are consistent with the monitoring effect of bank ownership predominating over the expropriation effect. The dual role of banks as shareholders and lenders also contributes to reducing the agency costs of debt, as it lowers the cost of debt.
    JEL classification: G32. Keywords: Bank ownership, Bank lending, Debt, Debt maturity, Debt cost.

    A Case Study on Artefact-based RE Improvement in Practice

    Most requirements engineering (RE) process improvement approaches are solution-driven and activity-based: they focus on assessing a company's RE against an external norm of best practices. As a consequence, practitioners often have to rely on an improvement approach that skips a profound problem analysis and that results in an RE approach alien to the organisation's needs. In recent years, we have developed an RE improvement approach (called ArtREPI) that guides a holistic RE improvement against a company's individual goals, paying primary attention to the quality of the artefacts. In this paper, we aim at exploring ArtREPI's benefits and limitations. We contribute an industrial evaluation of ArtREPI relying on case study research. Our results suggest that ArtREPI is well suited for establishing an RE that reflects a specific organisational culture, but to some extent at the cost of efficiency, owing to intensive discussions on a terminology that suits all involved stakeholders. Our results reveal first benefits and limitations, but we also conclude that longitudinal and independent investigations are needed, for which we herewith lay the foundation.

    Case Studies in Industry: What We Have Learnt

    Case study research has become an important research methodology for exploring phenomena in their natural contexts. Case studies have earned a distinct role in the empirical analysis of software engineering phenomena that are difficult to capture in isolation. Such phenomena often appear in the context of methods and development processes for which it is difficult to run large, controlled experiments, as these usually have to reduce the scale in several respects and, hence, are detached from the reality of industrial software development. The other side of the coin is that the realistic socio-economic environments where we conduct case studies -- with real-life cases and realistic conditions -- also pose a plethora of practical challenges to planning and conducting case studies. In this experience report, we discuss such practical challenges and the lessons we learnt in conducting case studies in industry. Our goal is to help especially inexperienced researchers facing their first case studies in industry by increasing their awareness of typical obstacles they might face and practical ways to deal with them.
    Comment: Proceedings of the 4th International Workshop on Conducting Empirical Studies in Industry, co-located with ICSE, 201

    Naming the Pain in Requirements Engineering: A Design for a Global Family of Surveys and First Results from Germany

    For many years, we have observed industry struggling to define high-quality requirements engineering (RE) and researchers trying to understand industrial expectations and problems. Although we are investigating the discipline with a plethora of empirical studies, they still do not allow for empirical generalisations. To lay an empirically sound and externally valid foundation about the state of the practice in RE, we aim at a series of open and reproducible surveys that allow us to steer future research in a problem-driven manner. We designed a globally distributed family of surveys in joint collaboration with different researchers and completed the first run in Germany. The instrument is based on a theory in the form of a set of hypotheses inferred from our experiences and available studies. We test each hypothesis in our theory and identify further candidates to extend the theory by correlation and Grounded Theory analysis. In this article, we report on the design of the family of surveys, its underlying theory, and the full results obtained from Germany with participants from 58 companies. The results reveal, for example, a tendency to improve RE via internally defined qualitative methods rather than relying on normative approaches like CMMI. We also discovered various RE problems that are statistically significant in practice; for instance, we could corroborate communication flaws and moving targets as problems in practice. Our results are not yet fully representative but already give first insights into current practices and problems in RE, and they allow us to draw lessons for future replications. Our results from this first run in Germany make us confident that the survey design and instrument are well suited to be replicated and, thereby, to create a generalisable empirical basis of RE in practice.
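    The kind of significance analysis behind such survey results can be illustrated with a minimal sketch (all numbers hypothetical, not taken from the study): a one-sided exact binomial test of whether a problem is reported by significantly more than half of the surveyed companies.

```python
# Hypothetical illustration, not the survey's actual analysis: one-sided
# exact binomial test against the null hypothesis that a problem is
# reported by at most half of the companies.
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the one-sided p-value."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Suppose (hypothetically) 41 of 58 participating companies report
# "moving targets" as a problem.
p_value = binom_tail(58, 41)  # well below 0.01: significant at the 1% level
```

    With counts like these, the test rejects the 50% null comfortably; the real study's statistics are of course its own.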

    On Evidence-based Risk Management in Requirements Engineering

    Background: The sensitivity of Requirements Engineering (RE) to its context makes it difficult to efficiently control problems therein, thus hampering effective risk management that would allow for early corrective or even preventive measures. Problem: There is still little empirical knowledge about context-specific RE phenomena, which would be necessary for effective context-sensitive risk management in RE. Goal: We propose and validate an evidence-based approach to assess risks in RE using cross-company data about problems, causes, and effects. Research Method: We use survey data from 228 companies and build a probabilistic network that supports the forecast of context-specific RE phenomena. We implement this approach using spreadsheets to support a light-weight risk assessment. Results: Our results from an initial validation in 6 companies strengthen our confidence that the approach increases the awareness of individual risk factors in RE, and the feedback further allows for disseminating our approach into practice.
    Comment: 20 pages, submitted to 10th Software Quality Days conference, 201
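    As an illustration only (not the authors' probabilistic network), conditional relative frequencies over hypothetical cross-company records already give the crude, evidence-based risk estimate that such a forecast builds on: given that a cause is observed in a company, how often did companies with that cause report a given problem?

```python
# Illustrative sketch with made-up records; the record contents and
# category names are hypothetical, not the paper's data.

# Each record lists the causes and problems one (fictitious) company reported.
records = [
    {"causes": {"time pressure"}, "problems": {"moving targets"}},
    {"causes": {"time pressure"}, "problems": {"incomplete requirements"}},
    {"causes": {"time pressure", "weak communication"}, "problems": {"moving targets"}},
    {"causes": {"weak communication"}, "problems": {"communication flaws"}},
]

def p_problem_given_cause(problem, cause, records):
    """Conditional relative frequency P(problem | cause) over the records."""
    with_cause = [r for r in records if cause in r["causes"]]
    if not with_cause:
        return 0.0
    return sum(problem in r["problems"] for r in with_cause) / len(with_cause)

risk = p_problem_given_cause("moving targets", "time pressure", records)  # 2/3
```

    A probabilistic network refines this by chaining such conditional probabilities from causes through problems to effects.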

    Quid Pro Quo: A Mechanism for Fair Collaboration in Networked Systems

    Collaboration may be understood as the execution of coordinated tasks (in the most general sense) by groups of users who cooperate to achieve a common goal. Collaboration is a fundamental assumption and requirement for the correct operation of many communication systems. The main challenge when creating collaborative systems in a decentralized manner is dealing with the fact that users may behave in selfish ways, trying to obtain the benefits of the tasks without participating in their execution. In this context, Game Theory has been instrumental in modelling collaborative systems and the task allocation problem, and in designing mechanisms for the optimal allocation of tasks. In this paper, we revisit the classical assumptions and propose a new approach to this problem. First, we establish a system model based on heterogeneous nodes (users, players) and propose a basic distributed mechanism so that, when a new task appears, it is assigned to the most suitable node. The classical technique for compensating a node that executes a task is the use of payments, which in most networks are hard or impossible to implement. Instead, we propose a distributed mechanism for the optimal allocation of tasks without payments. We prove this mechanism to be robust even in the presence of independent selfish or rationally limited players. Additionally, our model is based on very weak assumptions, which makes the proposed mechanisms amenable to implementation in networked systems (e.g., the Internet).
    Comment: 23 pages, 5 figures, 3 algorithms
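    The core assignment step can be sketched minimally, under the assumption that "most suitable" means lowest declared execution cost (node names, costs, and the tie-breaking rule here are hypothetical; the paper's actual mechanism and its robustness guarantees are more involved):

```python
# Toy sketch of assigning a new task to the most suitable node.
# Assumption: suitability is a declared execution cost; ties break by node id.

def assign_task(cost_by_node):
    """Return the id of the cheapest node for the task (deterministic on ties)."""
    return min(cost_by_node, key=lambda node: (cost_by_node[node], node))

# Heterogeneous nodes declare different costs for the same task.
costs = {"n1": 4.0, "n2": 2.5, "n3": 2.5}
winner = assign_task(costs)  # "n2": cheapest cost, tie with "n3" broken by id
```

    The hard part, which the paper addresses, is making such an assignment incentive-compatible without payments, so selfish nodes gain nothing by misreporting their costs.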

    eXtended hybridizable discontinuous Galerkin for incompressible flow problems with unfitted meshes and interfaces

    The eXtended hybridizable discontinuous Galerkin (X-HDG) method is developed for the solution of Stokes problems with void or material interfaces. X-HDG is a novel method that combines the hybridizable discontinuous Galerkin (HDG) method with an eXtended finite element strategy, resulting in a high-order, unfitted, superconvergent method with an explicit definition of the interface geometry by means of a level-set function. For elements not cut by the interface, the standard HDG formulation is applied, whereas a modified weak form for the local problem is proposed for cut elements. Heaviside enrichment is considered on cut faces and in cut elements in the case of bimaterial problems. Two-dimensional numerical examples demonstrate that the applicability, accuracy, and superconvergence properties of HDG are inherited in X-HDG, with the freedom of using computational meshes that do not fit the interfaces.
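    The level-set idea for detecting cut elements can be sketched in a simplified one-dimensional setting (an assumed toy mesh, not the X-HDG implementation): an element is "cut" when the level-set function changes sign between its end nodes, i.e., the interface phi = 0 lies inside it.

```python
# Toy 1D sketch of level-set based element classification; the mesh and the
# level-set function are assumptions for illustration only.

def classify_elements(nodes, phi):
    """Label each element 'cut' if phi changes sign across it, else 'uncut'."""
    labels = []
    for a, b in zip(nodes[:-1], nodes[1:]):
        labels.append("cut" if phi(a) * phi(b) < 0 else "uncut")
    return labels

# Uniform mesh of [0, 1]; level set phi(x) = x - 0.4 places the interface
# at x = 0.4, inside the second element.
nodes = [0.0, 0.25, 0.5, 0.75, 1.0]
labels = classify_elements(nodes, lambda x: x - 0.4)  # only element 2 is cut
```

    In X-HDG proper, the uncut elements keep the standard HDG local problem, while the cut ones receive the modified weak form and, for bimaterial cases, Heaviside enrichment.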

    Elliptic harbor wave model with perfectly matched layer and exterior bathymetry effects

    Standard strategies for dealing with the Sommerfeld condition in elliptic mild-slope models require strong assumptions on the wave field in the region exterior to the computational domain. More precisely, constant bathymetry along (and beyond) the open boundary, together with boundary conditions based on parabolic approximations, is usually imposed. Generally, these restrictions require large computational domains, implying higher costs for the numerical solver. An alternative method for coastal/harbor applications is proposed here. This approach is based on a perfectly matched layer (PML) that incorporates the effects of the exterior bathymetry. The model only requires a constant exterior depth in the alongshore direction, a common approach for idealizing the exterior bathymetry in elliptic models. In contrast to standard open boundary conditions for mild-slope models, the features of the proposed PML approach include (1) completely noncollinear coastlines, (2) a better representation of the real unbounded domain using two different lateral sections to define the exterior bathymetry, and (3) the generation of reliable solutions for any incoming wave direction in a small computational domain. Numerical results of synthetic tests demonstrate that solutions are not significantly perturbed when open boundaries are placed close to the area of interest. In more complex problems, this provides important performance improvements in computational time, as shown for a real application of harbor agitation.
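    The generic PML ingredient can be sketched as follows (a textbook-style damping profile with assumed parameters, not the paper's formulation): an absorption coefficient that vanishes in the physical domain and ramps up smoothly inside the layer, so outgoing waves decay with negligible spurious reflection at the layer interface.

```python
# Generic PML damping-profile sketch; the quadratic ramp and all parameter
# values are illustrative assumptions, not the model described above.

def pml_sigma(x, x_start, width, sigma_max):
    """Absorption coefficient: 0 in the physical domain (x <= x_start),
    quadratic ramp to sigma_max across the layer [x_start, x_start + width]."""
    if x <= x_start:
        return 0.0
    xi = min((x - x_start) / width, 1.0)  # normalized depth into the layer
    return sigma_max * xi ** 2

# Physical domain ends at x = 1.0; the absorbing layer has width 0.5.
assert pml_sigma(0.5, 1.0, 0.5, 10.0) == 0.0    # untouched interior
assert pml_sigma(1.25, 1.0, 0.5, 10.0) == 2.5   # halfway into the layer
assert pml_sigma(1.5, 1.0, 0.5, 10.0) == 10.0   # full damping at the edge
```

    Starting the ramp at zero with zero slope is what keeps the physical-domain/layer interface nearly reflection-free; the contribution of the paper is building the exterior bathymetry into such a layer.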