16,925 research outputs found

    Corporate Social Responsibility: the institutionalization of ESG

    Get PDF
    Understanding the impact of Corporate Social Responsibility (CSR) on firm performance, as it relates to industries reliant on technological innovation, is a complex and perpetually evolving challenge. To investigate this topic thoroughly, this dissertation adopts an economics-based structure to address three primary hypotheses. This structure allows each hypothesis to stand essentially as a standalone empirical paper, unified by an overall analysis of the nature of the impact that ESG has on firm performance. The first hypothesis holds that the evolution of CSR into its modern, quantified iteration as ESG has led to the institutionalization and standardization of the CSR concept. The second hypothesis fills gaps in the existing literature on the relationship between firm performance and ESG by finding that the relationship is significantly positive for long-term, strategic metrics (ROA and ROIC) and that there is no correlation for short-term metrics (ROE and ROS). Finally, the third hypothesis states that if a firm has a long-term strategic ESG plan, as proxied by the publication of CSR reports, then it is more resilient to damage from controversies. This is supported by the finding that pro-ESG firms consistently fared better than their counterparts in both financial and ESG performance, even in the event of a controversy. However, firms with consistent reporting are also held to a higher standard than their non-reporting peers, suggesting a higher-risk, higher-reward dynamic. These findings support the theory of good management, in that long-term strategic planning is both immediately economically beneficial and serves as a means of risk management and social impact mitigation. Overall, this work contributes to the literature by filling gaps concerning the nature of the impact that ESG has on firm performance, particularly from a management perspective.
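
    As a rough illustration of the kind of test behind the second hypothesis, the sketch below regresses a long-term performance metric (ROA) on an ESG score with year and industry fixed effects and firm-clustered errors. The data file, column names, and specification are assumptions for illustration, not the dissertation's actual model.

        # Illustrative firm-performance regression (hypothetical columns: firm, year, industry, esg, roa).
        import pandas as pd
        import statsmodels.formula.api as smf

        panel = pd.read_csv("firm_panel.csv")  # hypothetical panel of firm-year observations

        # OLS of ROA on the ESG score with year and industry fixed effects,
        # clustering standard errors by firm.
        model = smf.ols("roa ~ esg + C(year) + C(industry)", data=panel).fit(
            cov_type="cluster", cov_kwds={"groups": panel["firm"]}
        )
        print(model.summary())  # a positive, significant ESG coefficient would be consistent with hypothesis two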

    Examples of works to practice staccato technique in clarinet instrument

    Get PDF
    The stages of strengthening the clarinet's staccato technique were applied through repertoire studies. Rhythm and nuance exercises to speed up staccato passages were included. The most important aim of the study is not only staccato practice itself but also attention to the precision of simultaneous finger-tongue coordination. To make staccato practice more productive, etude work was also incorporated into the repertoire study. Meticulous attention to these exercises, together with the inspiring effect of staccato practice, added a new dimension to musical identity. Each stage of the study of eight original works is described, with each stage intended to strengthen the next performance and the technique. The study reports in which areas the staccato technique is used and what results were obtained. How the notes are shaped through finger and tongue coordination, and within what practice discipline this takes place, was planned. It was determined that reed, notation, diaphragm, fingers, tongue, nuance, and discipline form an inseparable whole in the staccato technique. A literature review was conducted to survey studies related to staccato. The survey found that repertoire-based staccato studies in clarinet technique are scarce, while the method survey showed that etude studies are more numerous. Accordingly, exercises for speeding up and strengthening the clarinet staccato technique are presented. It was observed that interspersing repertoire work between staccato etudes relaxes the mind and further increases motivation. Choosing the right reed while practising staccato was also emphasised: a suitable reed was found to increase tongue speed when the staccato technique is practised correctly. Choosing the right reed depends on the reed producing sound easily; if the reed does not support the force of tonguing, the need to select a more suitable reed is emphasised. Interpreting a work from beginning to end in staccato practice can be difficult; in this respect, the study showed that observing the given musical nuances eases tonguing performance. Passing on the acquired knowledge and experience to future generations, and its developmental value, is encouraged. How forthcoming works can be worked out and how the staccato technique can be mastered is explained, with the aim of mastering the staccato technique in a shorter time. It is as important to commit the exercises to memory as it is to teach the fingers their positions. The work that emerges as a result of the determination and patience shown will raise success to even higher levels.

    Comprehensive analysis of the ionospheric response to the largest geomagnetic storms from solar cycle 24 over Europe

    Get PDF
    A multi-instrumental analysis of the meridional ionospheric response over Europe is presented for the two largest ICME-driven geomagnetic storms of the solar cycle 24 maximum. Data from five European digisonde stations, ground-based Global Navigation Satellite System total electron content (GNSS TEC), the ratio of the TEC difference (rTEC), as well as Swarm and Thermosphere, Ionosphere, Mesosphere Energetics and Dynamics (TIMED) satellite observations have been used to investigate the selected intervals (11–17 November 2012 and 16–25 March 2015). The storm evolution is monitored with the digisonde foF2 critical frequency (related to the maximum electron density of the F2 layer) and GNSS TEC data. Moreover, Global Ultraviolet Imager (GUVI) measurements from the TIMED satellite are used to investigate changes in the thermospheric O/N2 ratio. Our main focus was on the main phase of the geomagnetic storms, when extremely depleted plasma was detected during the nighttime hours. The extreme depletion is observed in foF2, TEC and rTEC, and is found to be directly connected to the equatorward motion of the midlatitude ionospheric trough (MIT) on the nightside. We demonstrate a method (besides the existing ones) that allows monitoring the storm-time evolution of disturbances (e.g., MIT, SAPS, SED) in the thermosphere-ionosphere-plasmasphere system through the combined analysis of worldwide digisonde data (drift measurements and ionospheric layer parameters at 5–15 min cadence), rTEC and GNSS TEC data, and satellite data such as Swarm and TIMED/GUVI.
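
    The rTEC index mentioned above compares storm-time TEC against a quiet-time reference. The short sketch below shows one common way to form such a ratio from two TEC time series; the exact reference period and formula used in the paper are not stated here, so treat this definition as an assumption.

        # Illustrative rTEC computation: relative deviation of storm-time TEC from a quiet-day reference.
        import numpy as np

        def relative_tec(tec_storm, tec_quiet):
            """Assumed definition: rTEC = (TEC_storm - TEC_quiet) / TEC_quiet, per epoch."""
            tec_storm = np.asarray(tec_storm, dtype=float)
            tec_quiet = np.asarray(tec_quiet, dtype=float)
            return (tec_storm - tec_quiet) / tec_quiet

        # Example: a nighttime depletion shows up as strongly negative rTEC values.
        print(relative_tec([4.0, 3.0, 2.5], [10.0, 9.5, 9.0]))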

    A direct-laser-written heart-on-a-chip platform for generation and stimulation of engineered heart tissues

    Full text link
    In this dissertation, we first develop a versatile microfluidic heart-on-a-chip model to generate 3D-engineered human cardiac microtissues in highly controlled microenvironments. The platform, which is enabled by direct laser writing (DLW), has tailor-made attachment sites for cardiac microtissues and comes with integrated strain actuators and force sensors. Application of external pressure waves to the platform results in controllable time-dependent forces on the microtissues. Conversely, oscillatory forces generated by the microtissues are transduced into measurable electrical outputs. After characterization of the responsivity of the transducers, we demonstrate the capabilities of this platform by studying the response of cardiac microtissues to prescribed mechanical loading and pacing. Next, we tune the geometry and mechanical properties of the platform to enable parametric studies on engineered heart tissues. We explore two geometries: a rectangular seeding well with two attachment sites, and a stadium-like seeding well with six attachment sites. The attachment sites are placed symmetrically in the longitudinal direction. The former geometry promotes uniaxial contraction of the tissues; the latter additionally induces diagonal fiber alignment. We systematically increase the length for both configurations and observe a positive correlation between fiber alignment at the center of the microtissues and tissue length. However, progressive thinning and “necking” is also observed, leading to the failure of longer tissues over time. We use the DLW technique to improve the platform, softening the mechanical environment and optimizing the attachment sites for generation of stable microtissues at each length and geometry. Furthermore, electrical pacing is incorporated into the platform to evaluate the functional dynamics of stable microtissues over the entire range of physiological heart rates. Here, we typically observe a decrease in active force and contraction duration as a function of frequency. Lastly, we use a more traditional μTUG platform to demonstrate the effects of subthreshold electrical pacing on the rhythm of the spontaneously contracting cardiac microtissues. Here, we observe periodic M:N patterns, in which there are M cycles of stimulation for every N tissue contractions. Using electric field amplitude, pacing frequency, and homeostatic beating frequencies of the tissues, we provide an empirical map for predicting the emergence of these rhythms.
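
    For readers unfamiliar with M:N coupling, the toy sketch below estimates the ratio from a pacing frequency and an observed contraction frequency. It is purely illustrative and is not the empirical map described in the dissertation.

        # Toy estimate of an M:N rhythm: M stimulation cycles per N tissue contractions.
        from fractions import Fraction

        def estimate_mn(pacing_hz, contraction_hz, max_denominator=6):
            """Return (M, N) such that pacing_hz / contraction_hz is approximately M / N."""
            ratio = Fraction(pacing_hz / contraction_hz).limit_denominator(max_denominator)
            return ratio.numerator, ratio.denominator

        # Example: pacing at 3 Hz while the tissue contracts at 1 Hz suggests a 3:1 pattern.
        print(estimate_mn(3.0, 1.0))  # -> (3, 1)
        print(estimate_mn(2.0, 1.5))  # -> (4, 3)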

    Decoding spatial location of attended audio-visual stimulus with EEG and fNIRS

    Get PDF
    When analyzing complex scenes, humans often focus their attention on an object at a particular spatial location in the presence of background noise and irrelevant visual objects. The ability to decode the attended spatial location would facilitate brain-computer interfaces (BCI) for complex scene analysis. Here, we tested two different neuroimaging technologies and investigated their capability to decode audio-visual spatial attention in the presence of competing stimuli from multiple locations. For functional near-infrared spectroscopy (fNIRS), we targeted the dorsal frontoparietal network, including the frontal eye field (FEF) and intraparietal sulcus (IPS), as well as the superior temporal gyrus/planum temporale (STG/PT); all of these were shown in previous functional magnetic resonance imaging (fMRI) studies to be activated by auditory, visual, or audio-visual spatial tasks. We found that fNIRS provides robust decoding of attended spatial locations for most participants and correlates with behavioral performance. Moreover, we found that FEF makes a large contribution to decoding performance. Surprisingly, the performance was significantly above chance level 1 s after cue onset, which is well before the peak of the fNIRS response. For electroencephalography (EEG), while there are several successful EEG-based algorithms, to date all of them have focused exclusively on the auditory modality, where eye-related artifacts are minimized or controlled. Successful integration into more ecologically typical usage requires careful consideration of eye-related artifacts, which are inevitable. We showed that fast and reliable decoding can be done with or without an ocular-artifact removal algorithm. Our results show that EEG and fNIRS are promising platforms for compact, wearable technologies that could be applied to decode attended spatial location and to reveal contributions of specific brain regions during complex scene analysis.
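
    As an illustration of the kind of decoder such work typically involves, the sketch below cross-validates a linear classifier on per-trial feature vectors. The feature extraction, labels, and classifier choice are assumptions for illustration, not the specific algorithms evaluated in the dissertation.

        # Minimal sketch of decoding attended spatial location from per-trial features.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_trials, n_features = 200, 64                 # e.g. channel-by-window averages, flattened
        X = rng.normal(size=(n_trials, n_features))    # placeholder features (EEG or fNIRS)
        y = rng.integers(0, 2, size=n_trials)          # attended location: 0 = left, 1 = right

        clf = LinearDiscriminantAnalysis()
        scores = cross_val_score(clf, X, y, cv=5)      # chance level is ~0.5 for two locations
        print(f"decoding accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")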

    The MeerKAT Galaxy Cluster Legacy Survey: Survey overview and highlights

    Get PDF
    MeerKAT’s large number (64) of 13.5 m diameter antennas, spanning 8 km with a densely packed 1 km core, creates a powerful instrument for wide-area surveys, with high sensitivity over a wide range of angular scales. The MeerKAT Galaxy Cluster Legacy Survey (MGCLS) is a programme of long-track MeerKAT L-band (900−1670 MHz) observations of 115 galaxy clusters, observed for ∼6−10 h each in full polarisation. The first legacy product data release (DR1), made available with this paper, includes the MeerKAT visibilities, basic image cubes at ∼8″ resolution, and enhanced spectral and polarisation image cubes at ∼8″ and 15″ resolutions. Typical sensitivities for the full-resolution MGCLS image products range from ∼3−5 μJy beam⁻¹. The basic cubes are full-field and span 2° × 2°. The enhanced products consist of the inner 1.2° × 1.2° field of view, corrected for the primary beam. The survey is fully sensitive to structures up to ∼10′ scales, and the wide bandwidth allows spectral and Faraday rotation mapping. Relatively narrow frequency channels (209 kHz) are also used to provide H I mapping in redshift windows of 0 < z < 0.09 and 0.19 < z < 0.48. In this paper, we provide an overview of the survey and the DR1 products, including caveats for usage. We present some initial results from the survey, both for their intrinsic scientific value and to highlight the capabilities for further exploration with these data. These include a primary-beam-corrected compact source catalogue of ∼626 000 sources for the full survey and an optical and infrared cross-matched catalogue for compact sources in the primary-beam-corrected areas of Abell 209 and Abell S295. We examine dust-unbiased star-formation rates as a function of cluster-centric radius in Abell 209, extending out to 3.5 R200. We find no dependence of the star-formation rate on distance from the cluster centre, and we observe a small excess of the radio-to-100 μm flux ratio towards the centre of Abell 209 that may reflect a ram-pressure enhancement in the denser environment. We detect diffuse cluster radio emission in 62 of the surveyed systems and present a catalogue of the 99 diffuse cluster emission structures, of which 56 are new. These include mini-halos, halos, relics, and other diffuse structures for which no suitable characterisation currently exists. We highlight some of the radio galaxies that challenge current paradigms, such as trident-shaped structures, jets that remain well collimated far beyond their bending radius, and filamentary features linked to radio galaxies that likely illuminate magnetic flux tubes in the intracluster medium. We also present early results from the H I analysis of four clusters, which show a wide variety of H I mass distributions that reflect both sensitivity and intrinsic cluster effects, and the serendipitous discovery of a group in the foreground of Abell 3365.
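
    To connect the quoted redshift windows to observing frequency, the short sketch below converts the H I rest frequency (1420.406 MHz) to the observed frequencies spanned by 0 < z < 0.09 and 0.19 < z < 0.48. It simply evaluates f_obs = f_rest / (1 + z) and is not part of the survey pipeline.

        # Observed H I frequency for a given redshift: f_obs = f_rest / (1 + z).
        F_REST_MHZ = 1420.405751  # H I 21 cm rest frequency

        def hi_observed_mhz(z):
            return F_REST_MHZ / (1.0 + z)

        for z_lo, z_hi in [(0.0, 0.09), (0.19, 0.48)]:
            print(f"z in ({z_lo}, {z_hi}) -> {hi_observed_mhz(z_hi):.0f}-{hi_observed_mhz(z_lo):.0f} MHz")
        # Both windows fall inside the MeerKAT L-band (900-1670 MHz).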

    Combining shallow-water and analytical wake models for tidal-array micro-siting

    Get PDF
    For tidal-stream energy to become a competitive renewable energy source, clustering multiple turbines into arrays is paramount. Array optimisation is thus critical for achieving maximum power performance and reducing the cost of energy. However, ascertaining an optimal array layout is a complex problem, subject to site-specific hydrodynamics and multiple interdisciplinary constraints. In this work, we present a novel optimisation approach that combines an analytical wake model, FLORIS, with an ocean model, Thetis. The approach is demonstrated through applications of increasing complexity. By utilising the method of analytical wake superposition, adding or altering a turbine position does not require re-calculation of the entire flow field, thus allowing simple heuristic techniques to perform optimisation at a fraction of the computational cost of more sophisticated methods. Using a custom condition-based placement algorithm, this methodology is applied to the Pentland Firth for arrays of turbines with a 3.05 m/s rated speed, demonstrating practical implications whilst considering the temporal variability of the tide. For a 24-turbine array case, micro-siting using this technique delivered an array 15.8% more productive on average than a staggered layout, despite flow speeds regularly exceeding the rated value. Performance was evaluated through assessment of the optimised layout within the ocean model, which treats turbines through a discrete turbine representation. Used iteratively, this methodology could deliver improved array configurations in a manner that accounts for local hydrodynamic effects.
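
    The computational saving from analytical wake superposition comes from each turbine contributing an independent velocity deficit that can be combined (here by sum of squares) without re-solving the flow field. The sketch below uses a simple Jensen-type wake purely as an illustration; it is not the specific wake model or FLORIS configuration used in the study.

        # Illustrative Jensen-type wake with sum-of-squares superposition of velocity deficits.
        import numpy as np

        def jensen_deficit(x, y, turbine_xy, ct=0.8, d=20.0, k=0.05):
            """Fractional velocity deficit at (x, y) caused by one turbine at turbine_xy."""
            dx, dy = x - turbine_xy[0], y - turbine_xy[1]
            if dx <= 0:                      # no upstream influence in this simple model
                return 0.0
            r_wake = d / 2 + k * dx          # linearly expanding wake radius
            if abs(dy) > r_wake:
                return 0.0
            return (1 - np.sqrt(1 - ct)) * (d / (d + 2 * k * dx)) ** 2

        def waked_speed(x, y, turbines, u_inf=3.0):
            """Combine single-turbine deficits by sum-of-squares superposition."""
            deficits = [jensen_deficit(x, y, t) for t in turbines]
            return u_inf * (1 - np.sqrt(sum(delta ** 2 for delta in deficits)))

        layout = [(0.0, 0.0), (200.0, 30.0)]
        print(waked_speed(500.0, 10.0, layout))  # adding a turbine only adds one more deficit term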

    The determinants of value addition: a critical analysis of the global software engineering industry in Sri Lanka

    Get PDF
    It was evident from the literature that the perceived value delivery of the global software engineering industry is low due to various factors. Therefore, this research examines global software product companies in Sri Lanka to explore the software engineering methods and practices that increase value addition. The overall aim of the study is to identify the key determinants of value addition in the global software engineering industry and to critically evaluate their impact on software product companies, helping to maximise value addition and ultimately assure the sustainability of the industry. An exploratory research approach was used initially, since findings would emerge as the study unfolded. A mixed-methods design was employed, as the literature alone was inadequate to investigate the problem effectively and to formulate the research framework. Twenty-three face-to-face online interviews were conducted with subject matter experts covering all the disciplines from the targeted organisations, and these were combined with the literature findings as well as the outcomes of market research conducted by both government and non-government institutes. Data from the interviews were analysed using NVivo 12. The findings of the existing literature were verified through the exploratory study, and the outcomes were used to formulate the questionnaire for the public survey. After cleansing the total responses received, 371 responses were retained for data analysis in SPSS 21 at an alpha level of 0.05. An internal consistency test was performed before the descriptive analysis. After assuring the reliability of the dataset, correlation, multiple regression, and analysis of variance (ANOVA) tests were carried out to meet the research objectives. Five determinants of value addition were identified, along with the key themes for each area: staffing, delivery process, use of tools, governance, and technology infrastructure. Cross-functional and self-organised teams built around value streams, a properly interconnected software delivery process with the right governance in the delivery pipelines, appropriate selection of tools, and provision of the right infrastructure increase value delivery. Moreover, the constraints on value addition are poor interconnection of internal processes, rigid functional hierarchies, inaccurate selection and use of tools, inflexible team arrangements, and inadequate focus on technology infrastructure. The findings add to the existing body of knowledge on increasing value addition through effective processes, practices and tools, and on the impacts of applying the same inaccurately in the global software engineering industry.
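
    The internal-consistency check mentioned above is typically a Cronbach's alpha computation over the survey items. The sketch below shows the standard formula in Python with hypothetical item columns; the study itself performed this analysis in SPSS 21.

        # Cronbach's alpha for a set of survey items (rows = respondents, columns = items).
        import numpy as np

        def cronbach_alpha(items):
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)          # variance of each item
            total_var = items.sum(axis=1).var(ddof=1)      # variance of the summed scale
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Hypothetical 5-point Likert responses for four items; alpha >= 0.7 is a common threshold.
        responses = np.array([[4, 5, 4, 4],
                              [3, 3, 2, 3],
                              [5, 5, 4, 5],
                              [2, 2, 3, 2]])
        print(round(cronbach_alpha(responses), 2))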