288 research outputs found
Active labour market policy in focus I: evaluation of socio-economic enterprises
The evaluation, commissioned by the federal head office of the Arbeitsmarktservice Österreich (Austrian Public Employment Service) and carried out by Lechner, Reiter und Riesenfelder Sozialforschung (L&R) in Vienna, examines the socio-economic enterprises (SÖB) on the basis of survey data and secondary data from their records, in light of their operational structures, orientations of action and particularities (also in comparison with private companies), as well as the institutional framework conditions.
On the nature of conspicuous consumption: linking evolution, American Old Institutionalism and methodological issues
In this paper we propose a dialogue between the work of the main exponents of what is known as American Old Institutionalism (AOI), Walton Hamilton (1919) and Thorstein Veblen (1898, 1899), and the writings of Geoffrey Hodgson (1998; 2004; 2010), Ulrich Witt (2008; 2010; 2011; 2013), Wolfhard Kaus (2013) and Karin Knottenbauer (2010). It is argued that what unites these authors is the search for evolutionary traits of economic behavior. The case is made for the category of conspicuous consumption and its evolutionary roots. To do so, we first present the reader with a brief account of the essence of the American Old Institutionalism and the propositions of Hamilton (1919) for an economic theory that would embrace dynamics instead of statics. Then, an exposition of Veblen's (1899) theory of conspicuous consumption is offered, making possible the subsequent linkage between evolutionary features and the topic of conspicuity. The paper proceeds with an appeal for modern evolutionary economists to pay more attention to matters of ontology and epistemology when approaching the topic of consumption in Economics. Here the works of Campbell (1993) and Buss (2008) come in handy to clarify the evolutionary underpinnings of the act of conspicuous consumption, its emulative behavior and the possible emergence of novel consumption behavior. Within that stylized approach, our conclusions show that evolutionary economics has not yet offered a robust interpretive framework for dealing with key economic categories such as conspicuous consumption. Moreover, the lack of a construct of linked explanatory categories is at the core of the ontological and epistemological issues the theory faces today.
THE WHYS OF CONSPICUOUS CONSUMPTION: VEBLEN AND EVOLUTIONARY PSYCHOLOGY
Undergraduate thesis (TCC) - Universidade Federal de Santa Catarina, Centro Sócio-Econômico, Economia. Beyond the logical need to guarantee subsistence, this investigation suggests that the acquisition of goods carries a supra-economic meaning, going beyond merely securing the minimum required for survival. Treating consumption from an evolutionary perspective, as done by Veblen (1980) and Miller (2010), it concludes that its signalling role is fundamental to understanding its mental significance, which is not so apparent. The various interpretations of consumption in Economics link it, empirically, to income. In doing so, economists take the consumption decision to be tied, to a greater or lesser degree, to concepts such as the marginal propensity to consume, disposable income, permanent income, intertemporal income, and so on. As a result, this reinforces the idea that consumption is a variable explained mostly through objective categories, limiting the analysis to the material realm. Such approaches leave aside why consumption exists as a phenomenon, without giving due attention to its reasons for being. Accordingly, an approach based on Evolutionary Psychology is proposed to address consumption and its determinants.
Deep learning-based quantum algorithms for solving nonlinear partial differential equations
Partial differential equations frequently appear in the natural sciences and related disciplines. Solving them is often challenging, particularly in high dimensions, due to the "curse of dimensionality". In this work, we explore the potential for enhancing a classical deep learning-based method for solving high-dimensional nonlinear partial differential equations with suitable quantum subroutines. First, with near-term noisy intermediate-scale quantum computers in mind, we construct architectures employing variational quantum circuits and classical neural networks in conjunction. While the hybrid architectures show equal or worse performance than their fully classical counterparts in simulations, they may still be of use in very high-dimensional cases or if the problem is of a quantum mechanical nature. Next, we identify the bottlenecks imposed by Monte Carlo sampling and the training of the neural networks. We find that quantum-accelerated Monte Carlo methods offer the potential to speed up the estimation of the loss function. In addition, we identify and analyse the trade-offs when using quantum-accelerated Monte Carlo methods to estimate the gradients with different methods, including a recently developed backpropagation-free forward gradient method. Finally, we discuss the usage of a suitable quantum algorithm for accelerating the training of feed-forward neural networks. Hence, this work provides different avenues with the potential for polynomial speedups for deep learning-based methods for nonlinear partial differential equations. Comment: 48 pages, 17 figures.
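The backpropagation-free forward gradient method mentioned in the abstract can be illustrated compactly: sample a random tangent direction v and scale it by the directional derivative ∇f(θ)·v, obtained in a single forward-mode pass; the result is an unbiased gradient estimate since E[vvᵀ] = I. A minimal sketch in JAX, using a toy quadratic loss as a stand-in for the Monte Carlo-estimated PDE loss (all names are illustrative, not the paper's implementation):

```python
# Forward gradient estimator: directional derivative times direction,
# unbiased because E[v v^T] = I (Baydin et al.-style forward gradients).
import jax
import jax.numpy as jnp

def loss(theta):
    # Placeholder for the deep learning-based PDE loss.
    return jnp.sum((theta - 1.0) ** 2)

def forward_gradient(f, theta, key):
    # Random tangent direction v ~ N(0, I).
    v = jax.random.normal(key, theta.shape)
    # One forward-mode pass yields f(theta) and the directional derivative <grad f, v>.
    _, ddot = jax.jvp(f, (theta,), (v,))
    # Scaling v by the directional derivative gives an unbiased gradient estimate.
    return ddot * v

key = jax.random.PRNGKey(0)
theta = jnp.zeros(4)
for _ in range(200):
    key, sub = jax.random.split(key)
    theta = theta - 0.05 * forward_gradient(loss, theta, sub)
print(theta)  # approaches the minimiser at 1.0 in expectation
```

The appeal in the setting of the abstract is that this estimator needs only forward evaluations, the part that quantum-accelerated Monte Carlo could in principle speed up.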
Data-independent acquisition improves quantitative cross-linking mass spectrometry
Quantitative cross-linking mass spectrometry (QCLMS) reveals structural detail on altered protein states in solution. On its way to becoming a routine technology, QCLMS could benefit from data-independent acquisition (DIA), which generally enables greater reproducibility than data-dependent acquisition (DDA) and increased throughput over targeted methods. Therefore, here we introduce DIA to QCLMS by extending a widely used DIA software, Spectronaut, to also accommodate cross-link data. A mixture of seven proteins cross-linked with bis[sulfosuccinimidyl] suberate (BS3) was used to evaluate this workflow. Out of the 414 identified unique residue pairs, 292 (70%) were quantifiable across triplicates with a coefficient of variation (CV) of 10%, with manual correction of peak selection and boundaries for PSMs in the lower quartile of individual CV values. This compares favorably to DDA, with which we quantified cross-links across triplicates with a CV of 66%, for a single protein. We found DIA-QCLMS to be capable of detecting changing abundances of cross-linked peptides in complex mixtures, despite the ratio compression encountered when increasing sample complexity through the addition of E. coli cell lysate as matrix. In conclusion, the DIA software Spectronaut can now be used in cross-linking, and DIA is indeed able to improve QCLMS.
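As a rough illustration of the quantifiability criterion described here, one plausible reading is a per-link coefficient of variation computed across triplicate quantities. The sketch below assumes a simple array layout (rows = unique residue pairs, columns = replicates) with synthetic data, not Spectronaut output:

```python
# Per-link CV across triplicates; data and the 10% cut-off are illustrative.
import numpy as np

rng = np.random.default_rng(1)
quantities = rng.lognormal(mean=10.0, sigma=0.1, size=(414, 3))  # synthetic intensities

cv = quantities.std(axis=1, ddof=1) / quantities.mean(axis=1)    # CV per residue pair
quantifiable = cv <= 0.10                                        # e.g. a 10% CV criterion
print(f"{quantifiable.sum()} of {len(cv)} links within 10% CV")
```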
Impact of the horizontal resolution on the simulation of extremes
The simulation of extremes using climate models is still a challenging task. Currently, the model grid horizontal resolution of state-of-the-art regional climate models (RCMs) is about 11-25 km, which may still be too coarse to represent local extremes realistically. In this study we take ERA-40 reanalysis data dynamically downscaled with the RCM COSMO-CLM to 18 km resolution and downscale it dynamically further to 4.5 km and finally to 1.3 km to investigate the impact of the horizontal resolution on extremes. Extremes are estimated as return levels for the 2, 5 and 10-year return periods using 'peaks-over-threshold' (POT) models. Daily return levels are calculated for precipitation and maximum 2 m temperature in summer as well as precipitation and minimum 2 m temperature in winter. The results show that CCLM is able to capture the spatial and temporal structure of the observed extremes, except for summer precipitation extremes. Furthermore, the spatial variability of the return levels increases with resolution. This effect is more distinct in the case of temperature extremes due to a higher correlation with the better resolved orography, and this dependency increases with increasing horizontal resolution. In comparison to observations, the spatial variability of temperature extremes is better simulated at a resolution of 1.3 km, but the return levels are cold-biased in summer and warm-biased in winter. Regarding precipitation, the spatial variability improves as well, although the return levels are slightly overestimated in summer by all CCLM simulations. In summary, the results indicate that an increase of the horizontal resolution of CCLM does have a significant effect on the simulation of extremes and that impact models and assessment studies may benefit from such high-resolution model output.
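For readers unfamiliar with 'peaks-over-threshold' models, the return levels referred to here follow from fitting a generalized Pareto distribution (GPD) to threshold exceedances: with exceedance rate ζ_u per observation, shape ξ and scale σ, the level exceeded on average once every m observations is z_m = u + (σ/ξ)[(m ζ_u)^ξ − 1]. A minimal sketch with synthetic daily data (the threshold choice and data are assumptions, not the study's configuration):

```python
# POT return levels: fit a GPD to exceedances over a threshold u and invert
# it for the T-year level. Synthetic daily maxima stand in for model output.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
daily = rng.gumbel(loc=20.0, scale=5.0, size=30 * 365)  # 30 "years" of daily values
u = np.quantile(daily, 0.95)                            # threshold choice is an assumption
exceed = daily[daily > u] - u

xi, _, sigma = genpareto.fit(exceed, floc=0.0)          # shape xi and scale sigma
zeta = exceed.size / daily.size                         # exceedance rate per observation

def return_level(T_years, per_year=365):
    m = T_years * per_year                              # return period in observations
    return u + sigma / xi * ((m * zeta) ** xi - 1.0)

for T in (2, 5, 10):
    print(f"{T:2d}-year return level: {return_level(T):.1f}")
```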
Magnetically Induced Current Densities in Toroidal Carbon Nanotubes
Molecular structures of toroidal carbon nanotubes (TCNTs) have been constructed and optimized at the density functional theory (DFT) level. The TCNT structures have been constrained by using point groups with high symmetry. TCNTs consisting of only hexagons (polyhex) with armchair, chiral, and zigzag structures as well as TCNTs with pentagons and heptagons have been studied. The employed method for constructing general polyhex TCNTs is discussed. Magnetically induced current densities have been calculated using the gauge-including magnetically induced currents (GIMIC) method. The strength of the magnetically induced ring currents has been obtained by integrating the current density passing a plane cutting the ring of the TCNT. The main pathways of the current density have been identified by visualizing the current density. The calculations show that the strength of the diatropic ring current of polyhex TCNTs with an armchair structure generally increases with the size of the TCNT, whereas TCNTs with a zigzag structure sustain very weak diatropic ring currents. Some of the TCNTs with pentagons and heptagons sustain a strong diatropic ring current, whereas other TCNT structures with pentagons and heptagons sustain paratropic ring currents that are, in most cases, relatively weak. We discuss the reasons for the different behaviors of the current density of the seemingly similar TCNTs.
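The integration step described here amounts to evaluating the flux of the current density through the cut plane, I = ∫∫ J·n dA. A minimal numerical sketch, using an analytic Gaussian lobe as a stand-in for GIMIC output (the grid, units and profile are all illustrative):

```python
# Ring-current strength as the flux of the current density through a plane
# cutting the ring: I = ∫∫ J·n dA, integrated with the trapezoidal rule.
import numpy as np

# 2D grid spanning the cut plane; extent and resolution are arbitrary choices.
y = np.linspace(-8.0, 8.0, 201)
z = np.linspace(-8.0, 8.0, 201)
Y, Z = np.meshgrid(y, z, indexing="ij")

# Hypothetical normal component of J on the plane: a single diatropic lobe.
J_n = 0.05 * np.exp(-((Y - 4.0) ** 2 + Z**2) / 2.0)

# Integrate over z (axis 1), then over y, to obtain the flux through the plane.
ring_current = np.trapz(np.trapz(J_n, z, axis=1), y)
print(f"integrated ring-current strength: {ring_current:.3f} (arbitrary units)")
```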
Graduate student's educational fulfilment in Brazil: A study based on the Capability Approach
Cost-effective generation of precise label-free quantitative proteomes in high-throughput by microLC and data-independent acquisition
Quantitative proteomics is key for basic research, but needs improvements to satisfy an increasing demand for large sample series in diagnostics, academia and industry. A switch from nanoflowrate to microflowrate chromatography can improve throughput and reduce costs. However, concerns about undersampling and coverage have so far hampered its broad application. We used a QTOF mass spectrometer of the penultimate generation (TripleTOF5600), converted a nanoLC system into a microflow platform, and adapted a SWATH regime for large sample series by implementing retention-time and batch-correction strategies. From 3 μg to 5 μg of unfractionated tryptic digests that are obtained from proteomics-typical amounts of starting material, microLC-SWATH-MS quantifies up to 4000 human or 1750 yeast proteins in an hour or less. In the acquisition of 750 yeast proteomes, retention times varied by between 2% and 5%, and the typical peptide was quantified with 5-8% signal variation in replicates and below 20% in samples acquired over a five-month period. Providing precise quantities without being dependent on the latest hardware, our study demonstrates that the combination of microflow chromatography and data-independent acquisition strategies has the potential to overcome current bottlenecks in academia and industry, enabling the cost-effective generation of precise quantitative proteomes in large scale.
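Retention-time correction of this kind can take many forms; a minimal sketch of one simple variant, assumed here purely for illustration (not necessarily the strategy implemented in the study), maps each run's retention times onto a reference scale via a linear fit over shared anchor peptides:

```python
# Linear retention-time alignment of a drifted run against a reference scale,
# using shared anchor peptides. All data and the drift model are illustrative.
import numpy as np

rng = np.random.default_rng(7)
ref_rt = np.sort(rng.uniform(5.0, 55.0, size=40))          # reference retention times (min)
run_rt = 1.03 * ref_rt + 0.8 + rng.normal(0.0, 0.05, 40)   # drifted run with noise

coeffs = np.polyfit(run_rt, ref_rt, deg=1)                 # fit the linear drift model
corrected = np.polyval(coeffs, run_rt)                     # map run onto reference scale

print(f"median |error| before: {np.median(np.abs(run_rt - ref_rt)):.2f} min, "
      f"after: {np.median(np.abs(corrected - ref_rt)):.2f} min")
```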