9 research outputs found
Monitoring credit risk in the social economy sector by means of a binary goal programming model
The final publication is available at Springer via http://dx.doi.org/10.1007/s11628-012-0173-7

Monitoring the credit risk of firms in the social economy sector presents a considerable challenge, since it is difficult to calculate ratings with traditional methods such as logit or discriminant analysis, due to the relatively small number of firms in the sector and the low default rate among cooperatives. This paper introduces a goal programming model to overcome such constraints and to successfully manage credit risk using economic and financial information, as well as expert advice. After introducing the model, its application to a set of Spanish cooperative societies is described.

García García, F.; Guijarro Martínez, F.; Moya Clemente, I. (2013). Monitoring credit risk in the social economy sector by means of a binary goal programming model. Service Business 7(3):483-495. doi:10.1007/s11628-012-0173-7
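As a point of reference only, a minimal binary goal programming formulation of this kind of rating problem can be sketched as follows; the notation and the single-objective form are illustrative assumptions, not the authors' exact model. Firm i is described by financial ratios x_ij, binary weights w_j select which ratios enter the score, and deviation variables absorb the mismatch with an expert-assigned target score g_i:

\min_{w,\,d^{+},\,d^{-}} \; \sum_{i=1}^{n} \bigl( d_i^{+} + d_i^{-} \bigr)
\quad \text{s.t.} \quad \sum_{j=1}^{m} w_j x_{ij} + d_i^{-} - d_i^{+} = g_i \;\; (i = 1,\dots,n),
\qquad w_j \in \{0,1\}, \;\; d_i^{+}, d_i^{-} \ge 0 .

Solving such a program selects the subset of ratios whose combined score best reproduces the expert assessment, which can then be applied to rate firms outside the expert sample.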
Incorporating clinical guidelines through clinician decision-making
Background: It is generally acknowledged that a disparity between knowledge and its implementation is adversely affecting quality of care. An example commonly cited is the failure of clinicians to follow clinical guidelines. A guiding assumption of this view is that adherence should be gauged by a standard of conformance. At least some guideline developers dispute this assumption and claim that their efforts are intended to inform and assist clinical practice, not to function as standards of performance. However, their ability to assist and inform will remain limited until an alternative to the conformance criterion is proposed that gauges how evidence-based guidelines are incorporated into clinical decisions.

Methods: The proposed investigation has two specific aims: to identify the processes that affect decisions about incorporating clinical guidelines, and then to develop and test a strategy that promotes the utilization of evidence-based practices. This paper focuses on the first aim. It presents the rationale, introduces the clinical paradigm of treatment-resistant schizophrenia, and discusses an exemplar of clinician non-conformance to a clinical guideline. A modification of the original study is proposed that targets psychiatric trainees and draws on a cognitively rich theory of decision-making to formulate hypotheses about how the guideline is incorporated into treatment decisions. Twenty volunteer subjects recruited from an accredited psychiatry training program will respond to sixty-four vignettes that represent a fully crossed 2 × 2 × 2 × 4 within-subjects design. The variables consist of criteria contained in the clinical guideline and other relevant factors. Subjects will also respond to a subset of eight vignettes that assesses their overall impression of the guideline. Generalized estimating equation models will be used to test the study's principal hypothesis and perform secondary analyses.

Implications: The original design of phase two of the proposed investigation will be changed in recognition of newly published literature on the relative effectiveness of treatments for schizophrenia. It is suggested that this literature supports the notion that guidelines serve a valuable function as decision tools, and substantiates the importance of decision-making as the means by which general principles are incorporated into clinical practice.
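As a rough sketch of how such an analysis could be carried out (hypothetical variable and file names, not the study's actual code), generalized estimating equation models for binary vignette responses clustered within subjects can be fit with the statsmodels package:

# Illustrative GEE fit for a within-subjects vignette design; all names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Assumed long format: one row per subject-vignette pair, a binary treatment decision,
# three two-level guideline criteria (c1, c2, c3) and one four-level contextual factor (c4).
df = pd.read_csv("vignette_responses.csv")

model = smf.gee(
    "decision ~ c1 * c2 * c3 + C(c4)",        # crossed design factors
    groups="subject_id",                       # repeated measures within each trainee
    data=df,
    family=sm.families.Binomial(),             # binary outcome
    cov_struct=sm.cov_struct.Exchangeable(),   # working correlation across a subject's vignettes
)
print(model.fit().summary())

The exchangeable working correlation is one reasonable default for repeated responses from the same subject; the actual study may specify a different structure.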
Detailed analysis of excited-state systematics in a lattice QCD calculation of gA
Excited state contamination remains one of the most challenging sources of systematic uncertainty to control in lattice QCD calculations of nucleon matrix elements and form factors: early time separations are contaminated by excited states and late times suffer from an exponentially bad signal-to-noise problem. High-statistics calculations at large time separations (≳1 fm) are commonly used to combat these issues. In this work, focusing on gA, we explore the alternative strategy of utilizing a large number of relatively low-statistics calculations at short to medium time separations (0.2-1 fm), combined with a multistate analysis. On an ensemble with a pion mass of approximately 310 MeV and a lattice spacing of approximately 0.09 fm, we find this provides a more robust and economical method of quantifying and controlling the excited state systematic uncertainty. A quantitative separation of various types of excited states enables the identification of the transition matrix elements as the dominant contamination. The excited state contamination of the Feynman-Hellmann correlation function is found to reduce to the 1% level at approximately 1 fm while, for the more standard three-point functions, this does not occur until after 2 fm. Critical to our findings is the use of a global minimization, rather than fixing the spectrum from the two-point functions and using it as input to the three-point analysis. We find that the ground state parameters determined in such a global analysis are stable against variations in the excited state model, the number of excited states, and the truncation of early-time or late-time numerical data.
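For orientation, the multistate analysis referred to here rests on the standard spectral decompositions of the two- and three-point correlation functions; schematically (generic textbook form, not copied from the paper),

C_{2\mathrm{pt}}(t) = \sum_{n} |Z_n|^2 \, e^{-E_n t},
\qquad
C_{3\mathrm{pt}}(t,\tau) = \sum_{n,m} Z_n Z_m^{*} \, g^{A}_{nm} \, e^{-E_n (t-\tau)} \, e^{-E_m \tau},

where g^A_{00} is the ground-state matrix element that yields gA, the off-diagonal g^A_{nm} are the transition matrix elements identified above as the dominant contamination, and the global minimization fits the E_n, Z_n and g^A_{nm} to both correlators simultaneously.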
Nucleon Axial Form Factor from Domain Wall on HISQ
The Deep Underground Neutrino Experiment (DUNE) is an upcoming neutrino oscillation experiment that is poised to answer key questions about the nature of neutrinos. Lattice QCD has the ability to make significant impact upon DUNE, beginning with computations of nucleon-neutrino interactions with weak currents. Nucleon amplitudes involving the axial form factor are part of the primary signal measurement process for DUNE, and precise calculations from LQCD can significantly reduce the uncertainty for inputs into Monte Carlo generators. Recent calculations of the nucleon axial charge have demonstrated that sub-percent precision is possible on this vital quantity. In these proceedings, we discuss preliminary results for the CalLat collaboration's calculation of the axial form factor of the nucleon. These computations are performed with Möbius domain wall valence quarks on HISQ sea quark ensembles generated by the MILC and CalLat collaborations. The results use a variety of ensembles including several at physical pion mass
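For context, the axial form factor discussed here enters the conventional decomposition of the nucleon matrix element of the axial current (standard definition; normalization and isospin factors are convention dependent):

\langle N(p') | A_{\mu} | N(p) \rangle
= \bar{u}(p') \left[ \gamma_{\mu} \gamma_5 \, F_A(Q^2)
+ \frac{q_{\mu}}{2 m_N} \gamma_5 \, \tilde{F}_P(Q^2) \right] u(p),
\qquad q = p' - p, \;\; Q^2 = -q^2, \;\; F_A(0) = g_A .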
Toward a resolution of the NN controversy
Lattice QCD calculations of two-nucleon interactions have been underway for about a decade, but still have not reached the pion mass regime necessary for matching onto effective field theories and extrapolating to the physical point. Furthermore, results from different methods, including the use of the Lüscher formalism with different types of operators, as well as the HALQCD potential method, do not agree even qualitatively at very heavy pion mass. We investigate the role that different operators employed in the literature may play on the extraction of spectra for use within the Lüscher method. We first explore expectations from Effective Field Theory solved within a finite volume, for which the exact spectrum may be computed given different physical scenarios. We then present preliminary lattice QCD results for two-nucleon spectra calculated using different operators on a common lattice ensemble.
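As a reminder of the method whose inputs are being scrutinized (textbook s-wave form, not reproduced from the paper), the Lüscher quantization condition maps each finite-volume two-nucleon energy level to the infinite-volume scattering phase shift:

p \cot \delta_0(p) = \frac{2}{\sqrt{\pi}\, L} \, Z_{00}(1; q^2),
\qquad q = \frac{p L}{2\pi},

where p is the relative momentum reconstructed from the measured energy, L is the spatial box size, and Z_{00} is the Lüscher zeta function. Correct spectra obtained from different interpolating operators must yield the same phase shift, which is what the operator comparison tests.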
Scale setting the Möbius domain wall fermion on gradient-flowed HISQ action using the omega baryon mass and the gradient-flow scales t0 and w0
We report on a subpercent scale determination using the omega baryon mass and gradient-flow methods. The calculations are performed on 22 ensembles of Nf=2+1+1 highly improved, rooted staggered sea-quark configurations generated by the MILC and CalLat Collaborations. The valence quark action used is Möbius domain wall fermions solved on these configurations after a gradient-flow smearing is applied with a flow time of t_gf=1 in lattice units. The ensembles span four lattice spacings in the range 0.06 ≲ a ≲ 0.15 fm, six pion masses in the range 130 ≲ mπ ≲ 400 MeV and multiple lattice volumes. On each ensemble, the gradient-flow scales t0/a² and w0/a and the omega baryon mass amω are computed. The dimensionless product of these quantities is then extrapolated to the continuum and infinite volume limits and interpolated to the physical light, strange and charm quark mass point in the isospin limit, resulting in the determination of √t0=0.1422(14) fm and w0=0.1709(11) fm with all sources of statistical and systematic uncertainty accounted for. The dominant uncertainty in both results is the stochastic uncertainty, though for t0 there are comparable continuum extrapolation uncertainties. For w0, there is a clear path for a few-per-mille uncertainty just through improved stochastic precision, as recently obtained by the Budapest-Marseille-Wuppertal Collaboration.
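For reference, the two gradient-flow scales quoted above are conventionally defined through the flowed action density ⟨E(t)⟩ at flow time t (standard definitions, not specific to this calculation):

t^2 \langle E(t) \rangle \big|_{t = t_0} = 0.3,
\qquad
W(t) \equiv t \, \frac{d}{dt} \Bigl[ t^2 \langle E(t) \rangle \Bigr],
\quad W(t) \big|_{t = w_0^2} = 0.3 .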
FK/Fπ from Möbius domain-wall fermions solved on gradient-flowed HISQ ensembles
We report the results of a lattice quantum chromodynamics calculation of FK/Fπ using Möbius domain-wall fermions computed on gradient-flowed Nf=2+1+1 highly improved staggered quark (HISQ) ensembles. The calculation is performed with five values of the pion mass ranging from 130 to 400 MeV, four lattice spacings of a∼0.15, 0.12, 0.09 and 0.06 fm and multiple values of the lattice volume. The interpolation/extrapolation to the physical pion and kaon mass point, the continuum, and infinite volume limits are performed with a variety of different extrapolation functions utilizing both the relevant mixed-action effective field theory expressions as well as discretization-enhanced continuum chiral perturbation theory formulas. We find that the a∼0.06 fm ensemble is helpful, but not necessary to achieve a subpercent determination of FK/Fπ. We also include an estimate of the strong isospin breaking corrections and arrive at a final result of FK+/Fπ+=1.1942(45) with all sources of statistical and systematic uncertainty included. This is consistent with the Flavour Lattice Averaging Group average value, providing an important benchmark for our lattice action. Combining our result with experimental measurements of the pion and kaon leptonic decays leads to a determination of |Vus|/|Vud|=0.2311(10).
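The last step quoted above uses the standard relation between the kaon and pion leptonic decay widths and the CKM elements, shown here schematically with radiative corrections collected in δ_EM (generic form, not the paper's exact expression):

\frac{\Gamma(K^{+} \to \ell^{+} \nu_{\ell}(\gamma))}{\Gamma(\pi^{+} \to \ell^{+} \nu_{\ell}(\gamma))}
= \frac{|V_{us}|^2}{|V_{ud}|^2}\,
  \frac{F_{K^{+}}^2}{F_{\pi^{+}}^2}\,
  \frac{m_{K^{+}} \bigl(1 - m_{\ell}^2 / m_{K^{+}}^2\bigr)^2}
       {m_{\pi^{+}} \bigl(1 - m_{\ell}^2 / m_{\pi^{+}}^2\bigr)^2}
  \bigl(1 + \delta_{EM}\bigr),

so that the measured width ratio, combined with the lattice value of F_{K+}/F_{π+}, fixes |Vus|/|Vud|.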
Tracing the Arrows of Time
Over the last century there have been a number of proposals to ground both local and cosmic arrows of time: from the Second law to the Growing Block Universe, from Decoherence to Earman's time-direction heresy. The latter proposal rejects the traditional association of the Second law of thermodynamics with arrows of time. But it seems that notions like entropy and related notions – phase space volumes and typicality – are not easily banned from discussions of temporal arrows. A close reading of Eddington's thinking on these questions reveals that his views underwent a considerable development. In particular, Eddington abandoned his identification of the arrows of time with the increase in entropy and began to see the Second law as a criterion for temporal arrows. In the process, Eddington also developed an argument against Loschmidt's reversibility objections, in terms of an expanding universe. This latter argument brings his contribution close to contemporary thinking in terms of Liouville's theorem, the topology of phase space and typicality arguments. Their reliability to deliver arrows of time will therefore be considered.
Are there arrows of time? This question is related to the epistemological views of both Eddington and Wheeler. They insisted on the role of inferences in scientific thinking. Physical reality was to be inferred from data (Eddington) or information (Wheeler) about the physical universe. The paper will conclude that the arrows of time are equally to be regarded as conceptual inferences from various physical criteria – not just entropy – which the universe makes available to us