The complexity dilemma: Three tips for dealing with complexity in organizations
Today's storm-tossed markets call on managers to take a stand on rising external complexity. Organisations constantly face a crossroads (the complexity dilemma): to accept and nurture complexity, or to avoid and reduce it. The first option can be traced back to Ashby's Law of Requisite Variety, while the second comes from Luhmann's Complexity Reduction. Both Ashby's and Luhmann's theories hold because of an inverted U-shaped relation between complexity and firm performance, called the "complexity curve". For a fixed amount of external complexity, performance increases as internal complexity increases, until a tipping point is reached; beyond that point, an overburden of complexity sinks performance. To resolve the Ashby-Luhmann trade-off on complexity, and to move along the complexity curve, we suggest that complex organizing may be facilitated by a simple design through (i) modularity, (ii) simple rules, and (iii) organisational capabilities
Complexity and organisational capabilities: development and application of a methodology for analysing complexity and organisational capabilities
In recent years, the growth of complexity in markets and organizations has drawn the attention of scholars and practitioners, and questions of how to manage complex systems are gaining increasing interest. A stream of literature investigates how organizations respond to increasing external complexity. These responses can be governed by building internal complexity, as per Ashby's Law of Requisite Variety (1956), or by selecting external complexity, as per Luhmann's Complexity Reduction (1984). Moreover, recent studies have focused on organisational capabilities as a way to manage complexity and support long-term performance in organisations (Garengo & Bernardi, 2007).
This research investigates the dimensions of internal complexity, external complexity, and organizational capabilities in organizations, and their impact on performance. To study these relationships, a methodology called the Complexity Assessment Methodology (CAM) has been developed and tested. From the literature, the following gaps emerged: there is still a lack of clarity about the definition of complexity and its main dimensions; a systematic literature review of organizational capabilities is missing; and empirical research assessing the impact of complexity and capabilities on firm performance is still scarce. Thus, the following research questions were formulated: What are the dimensions of external and internal complexity? How can organizations manage complexity through organizational capabilities? How can a methodology be developed that assesses and links internal complexity, external complexity, organizational capabilities, and firm performance?
From the literature reviews on complexity and organizational-capability theories, four dimensions of external and internal complexity were derived - interdependence, diversity, uncertainty, and dynamicity - and four main organizational capabilities for facing complexity were defined, namely redundancy, interconnection, sharing, and reconfiguration. To answer the third research question, a methodology was developed to assess the effects on performance of the relationships among internal complexity, external complexity, and capabilities.
A pilot case study at UniCredit Business Integrated Solutions (UBIS) and, subsequently, three case studies at Coop Italia, Coop Liguria, and Euris were carried out to test the CAM.
Data from the multiple case studies showed that (i) the ratio between internal complexity and external complexity influences performance through an inverted U-shaped function, called the "complexity curve". For a fixed amount of external complexity, performance increases as internal complexity increases, until a tipping point is reached. Beyond that point, an overburden of complexity starts to sink performance; (ii) the ratio between capabilities and internal complexity also influences performance through an inverted U-shaped curve. Undersized capabilities reveal a firm's inability to face and manage internal complexity, while oversized capabilities also sink performance by generating inefficiency (costs).
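To make the shape of these relationships concrete, the sketch below models the described inverted-U behaviour with a simple concave quadratic in the complexity ratio; the functional form, the tipping-point value and the scale constants are illustrative assumptions, not parameters from the thesis. The same shape is described above for the ratio between capabilities and internal complexity.

# Minimal illustrative sketch (assumed functional form, not taken from the thesis):
# performance as a concave quadratic in r = internal_complexity / external_complexity,
# peaking at a hypothetical tipping point r_star and falling beyond it.

def performance(internal_complexity: float,
                external_complexity: float,
                p_max: float = 100.0,   # hypothetical peak performance
                r_star: float = 1.0,    # hypothetical tipping point of the ratio
                k: float = 40.0) -> float:  # hypothetical penalty for complexity mismatch
    r = internal_complexity / external_complexity
    return p_max - k * (r - r_star) ** 2

if __name__ == "__main__":
    external = 2.0
    for internal in (0.5, 1.0, 2.0, 3.0, 4.0):
        print(f"internal={internal:.1f}  performance={performance(internal, external):.1f}")
    # Performance rises toward internal == r_star * external (here 2.0) and falls beyond it,
    # mirroring the tipping-point behaviour of the "complexity curve".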
From a theoretical point of view, the research made it possible to build a methodology for modelling (sizing) external complexity, internal complexity, and organisational capabilities, and for linking them to performance. From a practical point of view, the research presents evidence from different case studies on how complexity and capabilities were managed and developed. Moreover, the CAM can be useful to managers for defining and measuring internal complexity and then optimizing capabilities in order to maximize performance.
Cardiac magnetic resonance-guided cardiac ablation: a case series of an early experience
Radiofrequency (RF) catheter ablation has become a widely used therapeutic approach. However, long-term results in terms of arrhythmia recurrence are still suboptimal. Cardiac magnetic resonance (CMR) could offer a valuable tool to overcome this limitation, with the possibility of targeting the arrhythmic substrate and evaluating the location, depth, and possible gaps of RF lesions. Moreover, real-time CMR-guided procedures offer a radiation-free approach with an evaluation of anatomical structures, substrates, RF lesions, and possible complications during a single procedure. The first steps in the field have been made with cavotricuspid isthmus ablation, showing similar procedural duration and success rate to standard fluoroscopy-guided procedures, while allowing visualization of anatomic structures and RF lesions. These promising results open the path for further studies in the context of more complex arrhythmias, like atrial fibrillation and ventricular tachycardias. Of note, setting up an interventional CMR (iCMR) centre requires safety and technical standards, mostly related to the need for CMR-compatible equipment and medical staff's educational training. For the cardiac imagers, it is fundamental to provide correct CMR sequences for catheter tracking and guide RF delivery. At the same time, the electrophysiologist needs a rapid interpretation of CMR images during the procedures. The aim of this paper is first to review the logistic and technical aspects of setting up an iCMR suite. Then, we will describe the experience in iCMR-guided flutter ablations of two European centres, Policlinico Casilino in Rome, Italy, and Haga Teaching Hospital in The Hague, the Netherlands
VALS: Virtual Alliances for Learning Society
[EN] VALS aims to establish sustainable methods and processes for building knowledge partnerships between Higher Education and companies, so that they can collaborate on resolving authentic business problems through open innovation mediated by the use of Open Source Software. Open Source solutions provide the means whereby educational institutions, students, businesses and foundations can all collaborate to resolve authentic business problems. Not only does Open Source software provide the necessary shared infrastructure and collaborative practice; the foundations that manage the software are also hubs which channel the operational challenges of their users through to the people who can solve them. This has great potential for enabling students and supervisors to collaborate in resolving the problems of businesses, but it is constrained by the lack of support for managing and promoting collaboration across the two sectors. VALS should 1) provide the methods, practice, documentation and infrastructure to unlock this potential through virtual placements in businesses and other public and private bodies; and 2) pilot and promote these as the “Semester of Code”. To achieve its goals, the project develops guidance for educational institutions, businesses and foundations, detailing the opportunities and the benefits to be gained from the Semester of Code, and the changes to organisation and practice required. A Virtual Placement System will be developed by adapting Apache Melange and extending it where necessary. During piloting, the necessary adaptations to practice will be carried out, particularly in universities, and commitments will be established between problem owners and applicants for virtual placements
Physics case for an LHCb Upgrade II - Opportunities in flavour physics, and beyond, in the HL-LHC era
The LHCb Upgrade II will fully exploit the flavour-physics opportunities of the HL-LHC, and study additional physics topics that take advantage of the forward acceptance of the LHCb spectrometer. The LHCb Upgrade I will begin operation in 2020. Consolidation will occur, and modest enhancements of the Upgrade I detector will be installed, in Long Shutdown 3 of the LHC (2025), and these are discussed here. The main Upgrade II detector will be installed in Long Shutdown 4 of the LHC (2030) and will build on the strengths of the current LHCb experiment and the Upgrade I. It will operate at a luminosity up to 2×10³⁴ cm⁻²s⁻¹, ten times that of the Upgrade I detector. New detector components will improve the intrinsic performance of the experiment in certain key areas. An Expression of Interest proposing Upgrade II was submitted in February 2017. The physics case for the Upgrade II is presented here in more depth. CP-violating phases will be measured with precisions unattainable at any other envisaged facility. The experiment will probe b → sℓ⁺ℓ⁻ and b → dℓ⁺ℓ⁻ transitions in both muon and electron decays in modes not accessible at Upgrade I. Minimal flavour violation will be tested with a precision measurement of the ratio B(B⁰ → μ⁺μ⁻)/B(Bs → μ⁺μ⁻). Probing charm CP violation at the 10⁻⁵ level may result in its long-sought discovery. Major advances in hadron spectroscopy will be possible, which will be powerful probes of low-energy QCD. Upgrade II potentially will have the highest sensitivity of all the LHC experiments to the Higgs couplings to charm quarks. Generically, the new-physics mass scale probed, for fixed couplings, will almost double compared with the pre-HL-LHC era; this extended reach for flavour physics is similar to that which would be achieved by the HE-LHC proposal for the energy frontier
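A rough way to see the "almost double" statement about the mass reach (this back-of-the-envelope scaling is our illustration, not taken from the report): for a statistics-limited measurement constraining a dimension-six operator, the sensitivity to coupling/Λ² improves as 1/√N, so for fixed couplings the probed scale Λ grows as the fourth root of the sample size. The luminosity figures below are assumptions chosen only to illustrate the scaling.

# Back-of-the-envelope sketch (illustrative assumptions, not figures from the report):
# if a new-physics effect enters as coupling / Lambda^2 and the experimental precision
# improves as 1 / sqrt(N), then for fixed couplings the probed mass scale Lambda grows
# as the fourth root of the integrated luminosity.
pre_hl_lhc_fb = 23.0    # assumed pre-HL-LHC LHCb sample, in fb^-1 (hypothetical value)
upgrade_ii_fb = 300.0   # assumed Upgrade II target sample, in fb^-1 (hypothetical value)

gain = (upgrade_ii_fb / pre_hl_lhc_fb) ** 0.25
print(f"mass-scale gain for fixed couplings ~ {gain:.2f}x")  # ~1.9, i.e. almost double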
LHCb upgrade software and computing: technical design report
This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis
Multidifferential study of identified charged hadron distributions in Z-tagged jets in proton-proton collisions at √s = 13 TeV
Jet fragmentation functions are measured for the first time in proton-proton collisions for charged pions, kaons, and protons within jets recoiling against a Z boson. The charged-hadron distributions are studied longitudinally and transversely to the jet direction for jets with transverse momentum above 20 GeV and in the pseudorapidity range 2.5 < η < 4. The data sample was collected with the LHCb experiment at a center-of-mass energy of 13 TeV, corresponding to an integrated luminosity of 1.64 fb⁻¹. Triple-differential distributions as a function of the hadron longitudinal momentum fraction, hadron transverse momentum, and jet transverse momentum are also measured for the first time. This helps constrain transverse-momentum-dependent fragmentation functions. Differences in the shapes and magnitudes of the measured distributions for the different hadron species provide insights into the hadronization process for jets predominantly initiated by light quarks.
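For readers unfamiliar with the observables, the sketch below computes the two quantities the abstract refers to for a single hadron: the longitudinal momentum fraction relative to the jet axis and the momentum component transverse to it. It uses the definitions commonly adopted in jet fragmentation studies; the exact conventions, as well as the example momenta, are assumptions here, and the paper fixes its own.

# Illustrative sketch using definitions commonly adopted in jet fragmentation studies
# (the paper defines its own conventions): longitudinal momentum fraction z and
# transverse momentum jT of one hadron relative to the jet axis.
import numpy as np

def frag_variables(p_hadron: np.ndarray, p_jet: np.ndarray) -> tuple[float, float]:
    """Return (z, jT) for a hadron 3-momentum relative to a jet 3-momentum (GeV)."""
    jet_mag2 = float(np.dot(p_jet, p_jet))
    z = float(np.dot(p_hadron, p_jet)) / jet_mag2                               # longitudinal fraction
    jT = float(np.linalg.norm(np.cross(p_hadron, p_jet))) / np.sqrt(jet_mag2)   # transverse to jet axis
    return z, jT

if __name__ == "__main__":
    jet = np.array([5.0, 3.0, 60.0])    # hypothetical jet 3-momentum, GeV
    pion = np.array([1.0, 0.8, 12.0])   # hypothetical charged-pion 3-momentum, GeV
    z, jT = frag_variables(pion, jet)
    print(f"z = {z:.3f}, jT = {jT:.3f} GeV")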
Study of the decay
The decay is studied in proton-proton collisions at a center-of-mass energy of 13 TeV using data corresponding to an integrated luminosity of 5 fb⁻¹ collected by the LHCb experiment. In the system, the
state observed at the BaBar and Belle experiments is
resolved into two narrower states, and ,
whose masses and widths are measured to be where the first uncertainties are statistical and the second
systematic. The results are consistent with a previous LHCb measurement using a
prompt sample. Evidence of a new
state is found with a local significance of , whose mass and width
are measured to be and , respectively. In addition, evidence of a new decay mode
is found with a significance of
. The relative branching fraction of with respect to the
decay is measured to be , where the first
uncertainty is statistical, the second systematic and the third originates from
the branching fractions of charm hadron decays.
Measurement of the ratios of branching fractions R(D*) and R(D0)
The ratios of branching fractions R(D*) and R(D0) are measured, assuming isospin symmetry, using a sample of proton-proton collision data corresponding to 3.0 fb⁻¹ of integrated luminosity recorded by the LHCb experiment during 2011 and 2012. The tau lepton is identified in the decay mode τ⁻ → μ⁻ν̄μντ. The measured values are and , where the first uncertainty is statistical and the second is systematic. The correlation between these measurements is . Results are consistent with the current average of these quantities and are at a combined 1.9 standard deviations from the
predictions based on lepton flavor universality in the Standard Model.
Validation of Deep Learning techniques for quality augmentation in diffusion MRI for clinical studies
The objective of this study is to evaluate the efficacy of deep learning (DL) techniques in improving the quality of diffusion MRI (dMRI) data in clinical applications. The study aims to determine whether the use of artificial intelligence (AI) methods on medical images may result in the loss of critical clinical information and/or the appearance of false information. To assess this, the focus was on the angular resolution of dMRI, and a clinical trial was conducted on migraine, specifically comparing episodic and chronic migraine patients. The number of gradient directions had an impact on white-matter analysis results, with statistically significant differences between groups being drastically reduced when using 21 gradient directions instead of the original 61. Fourteen teams from different institutions were tasked with using DL to enhance three diffusion metrics (FA, AD and MD) calculated from data acquired with 21 gradient directions and a b-value of 1000 s/mm². The goal was to produce results comparable to those calculated from 61 gradient directions. The results were evaluated using both standard image-quality metrics and Tract-Based Spatial Statistics (TBSS) to compare episodic and chronic migraine patients. The study results suggest that while most DL techniques improved the ability to detect statistical differences between groups, they also led to an increase in false positives. The number of false positives grew roughly linearly with the number of new true positives, which highlights the generalization risk of AI-based methods when they are trained on data from a single group and then used to assess diverse clinical cohorts. The methods also showed divergent performance in replicating the original distribution of the data, and some exhibited significant bias. In conclusion, extreme caution should be exercised when using AI methods for harmonization or synthesis on heterogeneous data in clinical studies, as important information may be altered, even when global metrics such as structural similarity or peak signal-to-noise ratio appear to suggest otherwise
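As a concrete illustration of the global metrics the abstract cautions about, here is a minimal sketch that computes SSIM and PSNR between a reference and a DL-enhanced FA map using scikit-image; the array shapes, value range and variable names are assumptions for illustration, not part of the study's pipeline.

# Minimal sketch (not the study's pipeline): global quality metrics of the kind the
# abstract warns can look good even when clinically relevant detail is altered.
# Assumes two co-registered FA maps as NumPy arrays with values in [0, 1].
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def global_quality(fa_reference: np.ndarray, fa_enhanced: np.ndarray) -> dict:
    """Compare a DL-enhanced FA map against a reference FA map."""
    ssim = structural_similarity(fa_reference, fa_enhanced, data_range=1.0)
    psnr = peak_signal_noise_ratio(fa_reference, fa_enhanced, data_range=1.0)
    return {"SSIM": ssim, "PSNR_dB": psnr}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random((64, 64, 64)).astype(np.float32)   # stand-in for a real FA volume
    enhanced = np.clip(reference + 0.01 * rng.standard_normal(reference.shape), 0, 1).astype(np.float32)
    print(global_quality(reference, enhanced))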