Transforming the European Economy
Europe grew rapidly for many years, but now, faced with greater challenges, several of its large economies have failed to generate enough jobs, to reach the highest levels of productivity, or both. This study explores why Europe's growth slowed, what contribution information technology makes to growth, and what policies could facilitate economic transformation. It emphasizes a system with strong work incentives and a high level of competitive intensity. Europe does not need to eliminate its protections for individuals, the authors conclude, but both social programs and policies toward business must be reoriented so that they encourage economic change.
Predicting Online User Behavior Based on Real-Time Advertising Data
Generating economic value from big data is a challenge for many companies. On the Internet, a major source of big data is the structured and unstructured data generated by users. Companies can use these data to better understand patterns of user behavior and to improve marketing decisions. In this paper, we focus on data generated in real-time advertising, where billions of advertising slots are sold by auction. The auctions are triggered by user activity on websites that use this form of advertising to sell their advertising slots. During an auction, so-called bid requests are sent to advertisers, who bid for the advertising slots. We develop a model that uses bid requests to predict whether a user will visit a certain website during his or her user journey. Advertisers can use these predictions to derive user interests early in the sales funnel and thus to increase profits from branding campaigns. By iteratively applying a Bayesian multinomial logistic model to data from a case study, we show how to continually improve the predictive accuracy of the model. We calculate the economic value of our model and show that it can benefit advertisers in the context of cross-channel advertising.
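The abstract does not spell out the estimation procedure, but the iterative Bayesian updating it describes can be approximated by a recursive MAP scheme in which each batch's posterior mode becomes the next batch's Gaussian prior mean. The sketch below is a minimal illustration of that idea only; the features, batch split, and prior precision are hypothetical stand-ins for real bid-request data, not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical feature matrix built from bid requests (one row per user;
# columns might encode site categories seen, time-of-day buckets, ...).
rng = np.random.default_rng(0)
n, d, k = 500, 8, 3              # users, features, candidate websites
X = rng.normal(size=(n, d))
y = rng.integers(0, k, size=n)   # placeholder labels: site the user visited

def neg_log_posterior(w_flat, X, y, prior_mean, prior_prec):
    """Multinomial logistic negative log posterior with a Gaussian prior.

    Re-using the previous batch's MAP estimate as the prior mean gives a
    simple recursive, approximately Bayesian update across batches.
    """
    W = w_flat.reshape(d, k)
    logits = X @ W
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    nll = -log_probs[np.arange(len(y)), y].sum()
    prior = 0.5 * prior_prec * ((w_flat - prior_mean) ** 2).sum()
    return nll + prior

# Iterate over batches; each MAP solution becomes the next prior mean.
prior_mean = np.zeros(d * k)
for X_batch, y_batch in [(X[:250], y[:250]), (X[250:], y[250:])]:
    res = minimize(neg_log_posterior, prior_mean,
                   args=(X_batch, y_batch, prior_mean, 1.0), method="L-BFGS-B")
    prior_mean = res.x           # posterior mode approximation -> next prior

W = prior_mean.reshape(d, k)
probs = np.exp(X @ W) / np.exp(X @ W).sum(axis=1, keepdims=True)
print("predicted visit probabilities for first user:", probs[0].round(3))
```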
How Much Tracking Is Necessary? - The Learning Curve in Bayesian User Journey Analysis
Extracting value from big data is one of today's business challenges. In online marketing, for instance, advertisers use high-volume clickstream data to increase the efficiency of their campaigns. To avoid collecting, storing, and processing irrelevant data, it is crucial to determine how much data must be analyzed to achieve acceptable model performance. We propose a general procedure that employs the learning curve sampling method to determine the optimal sample size with respect to cost/benefit considerations. In two case studies, we model users' click behavior based on clickstream data and offline channel data. We observe saturation effects in predictive accuracy as the sample size increases and thus demonstrate that advertisers only have to analyze a very small subset of the full dataset to obtain acceptable predictive accuracy and to optimize profits from advertising activities. In both case studies, a random intercept logistic model outperforms a non-hierarchical model in terms of predictive accuracy. Given the high infrastructure costs and users' growing awareness of tracking activities, our results have managerial implications for companies in the online marketing field.
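A minimal sketch of the learning curve sampling idea: fit the model on nested samples of increasing size, evaluate on a held-out set, and stop once the marginal accuracy gain falls below an assumed cost/benefit threshold. The synthetic data, plain logistic model, and threshold value below are illustrative, not the authors' setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder stand-in for clickstream features; real inputs would be
# engineered from click and offline-channel data.
X, y = make_classification(n_samples=20000, n_features=15, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.25,
                                                  random_state=0)

sizes = [250, 500, 1000, 2000, 4000, 8000, len(X_pool)]
gain_threshold = 0.002   # assumed cost/benefit cut-off per extra batch

prev_acc, chosen = 0.0, sizes[-1]
for m in sizes:
    model = LogisticRegression(max_iter=1000).fit(X_pool[:m], y_pool[:m])
    acc = model.score(X_test, y_test)
    print(f"n={m:6d}  accuracy={acc:.4f}")
    if m > sizes[0] and acc - prev_acc < gain_threshold:
        chosen = m
        break               # learning curve has saturated: stop sampling
    prev_acc = acc

print("smallest sample size with acceptable accuracy:", chosen)
```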
Moment tensor inversions of icequakes on Gornergletscher, Switzerland
We have determined seismic source mechanisms for shallow and intermediate-depth icequake clusters recorded on Gornergletscher, Switzerland, during the summers of 2004 and 2006. The selected seismic events are part of a large data set of over 80,000 events acquired with a dense seismic network deployed to study the yearly rapid drainage of Gornersee, a nearby ice-marginal lake. Using simple frequency and distance scaling and Green's functions for a homogeneous half-space, we calculated moment tensor solutions for icequakes with M_w ≈ −1.5 using a full-waveform inversion method usually applied to moderate seismic events (M_w > 4) recorded at local to regional distances (≈50–700 km). Inversions of typical shallow events are shown to represent tensile crack openings, which explains the dominant Rayleigh waves and compressive first motions observed on all recording seismograms. As these characteristics can be observed in most icequake signals, we believe that the vast majority of icequakes recorded in the two years is due to tensile faulting, most likely caused by the opening of surface crevasses. We also identified a shallow cluster with somewhat atypical waveforms, in that they show less dominant Rayleigh waves and quadrantal radiation patterns of first motions. Their moment tensors are dominated by a large double-couple component, which is strong evidence for shear faulting. Although fewer than a dozen such icequakes have been identified, this is a substantial result, as it shows that shear faulting in glacier ice is possible even in the absence of extreme flow changes such as those during glacier surges. A third source of icequakes was located at 100 m depth. These sources can also be represented by tensile crack openings. Because of the high hydrostatic pressure within the ice at these depths, these events are most likely related to the presence of water lenses that reduce the effective stress enough to allow tensile faulting.
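For readers unfamiliar with the technique: once Green's functions are fixed, moment tensor inversion is linear, so a least-squares solve recovers the six independent tensor components, and the isotropic/deviatoric split then distinguishes crack opening from shear faulting. The following sketch uses random placeholders for the Green's-function matrix and waveforms; it illustrates the algebra, not the authors' processing chain.

```python
import numpy as np

# Linear forward model: stacked observed waveform samples d equal the
# Green's-function derivative matrix G (n_samples x 6) times the six
# independent moment tensor components m. G and d are placeholders here;
# in practice G comes from a homogeneous half-space Green's function code.
rng = np.random.default_rng(1)
G = rng.normal(size=(1200, 6))
m_true = np.array([1.0, 1.0, 1.0, 0.1, 0.0, 0.1])  # strong volumetric part,
d = G @ m_true + 0.05 * rng.normal(size=1200)      # as for a tensile crack

# Least-squares moment tensor solution
m_hat, *_ = np.linalg.lstsq(G, d, rcond=None)

# Build the symmetric tensor and split off the isotropic part: a large
# isotropic (volumetric) component indicates crack opening, while a
# dominant deviatoric/double-couple part indicates shear faulting.
M = np.array([[m_hat[0], m_hat[3], m_hat[4]],
              [m_hat[3], m_hat[1], m_hat[5]],
              [m_hat[4], m_hat[5], m_hat[2]]])
iso = np.trace(M) / 3.0
M_dev = M - iso * np.eye(3)
print("isotropic part:", round(float(iso), 3))
print("deviatoric eigenvalues:", np.linalg.eigvalsh(M_dev).round(3))
```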
Time forecast of a break-off event from a hanging glacier
A cold hanging glacier located on the south face of the Grandes Jorasses (Mont Blanc, Italy) broke off on 23 and 29 September 2014, with a total estimated ice volume of 105,000 m³. Thanks to accurate surface displacement measurements taken up to the final break-off, this event was successfully predicted 10 days in advance, enabling local authorities to take the necessary safety measures. The break-off also confirmed that surface displacements undergo a power-law acceleration with superimposed log-periodic oscillations prior to the final rupture. This paper describes the methods used to achieve a satisfactory time forecast in real time and demonstrates, through a retrospective analysis, their potential for the development of real-time early-warning systems.
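The forecasting approach rests on fitting a power-law acceleration with superimposed log-periodic oscillations to the displacement series and reading off the singularity time. Below is a minimal sketch on synthetic data, with an assumed functional form s(t) = a + b(t_f − t)^m [1 + c sin(ω ln(t_f − t) + φ)]; the parameter values are illustrative, not those of the Grandes Jorasses fit.

```python
import numpy as np
from scipy.optimize import curve_fit

def lp_model(t, tf, a, b, m, c, omega, phi):
    """Power-law displacement acceleration toward failure time tf, with
    superimposed log-periodic oscillations."""
    dt = np.clip(tf - t, 1e-6, None)
    return a + b * dt**m * (1.0 + c * np.sin(omega * np.log(dt) + phi))

# Synthetic displacement series standing in for the stake measurements
# (days since monitoring start; true failure placed at t = 30 days).
t = np.linspace(0, 25, 200)
y = lp_model(t, 30.0, 0.0, -2.0, 0.7, 0.1, 2 * np.pi, 0.0)
y += 0.02 * np.random.default_rng(2).normal(size=t.size)

# Fit with the data available so far; tf is the quantity of interest.
p0 = [28.0, 0.0, -1.0, 0.5, 0.05, 6.0, 0.0]
popt, _ = curve_fit(lp_model, t, y, p0=p0, maxfev=20000)
print(f"forecast break-off time: day {popt[0]:.1f} (true: day 30)")
```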
Icequakes coupled with surface displacements for predicting glacier break-off
A hanging glacier at the east face of the Weisshorn (Switzerland) broke off in 2005. We were able to monitor and measure surface motion and icequake activity for 25 days, up to three days prior to the break-off. The analysis of seismic waves generated by the glacier during the rupture maturation process revealed four types of precursory signals of the imminent catastrophic rupture: (i) an increase in seismic activity within the glacier, (ii) a decrease in the waiting time between two successive icequakes, (iii) a change in the size-frequency distribution of icequake energy, and (iv) a modification in the structure of the waiting time distributions between two successive icequakes. Moreover, it was possible to demonstrate a correlation between the seismic activity and the log-periodic oscillations of the surface velocities superimposed on the global acceleration of the glacier during rupture maturation. Analysis of the seismic activity led us to identify two regimes: a stable phase with diffuse damage, and an unstable and dangerous phase characterized by a hierarchical cascade of rupture instabilities in which large icequakes are triggered.
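Precursors (i)–(iii) above can be extracted from an event catalog with elementary statistics: a sliding-window event rate, inter-event waiting times, and a maximum-likelihood tail exponent for the size-frequency distribution. A sketch on a synthetic catalog follows; all numbers are illustrative.

```python
import numpy as np

# Hypothetical icequake catalog: event times (s) and seismic energies.
rng = np.random.default_rng(3)
times = np.sort(rng.uniform(0, 86400, 500))
energies = (rng.pareto(1.5, 500) + 1.0) * 1e3   # power-law-distributed sizes

# (i) seismic activity rate in sliding windows
window = 3600.0
rate = [int(((times >= t0) & (times < t0 + window)).sum())
        for t0 in np.arange(0, 86400, window)]
print("events per hour (first 5 windows):", rate[:5])

# (ii) waiting times between successive icequakes
waits = np.diff(times)
print("mean waiting time [s]:", round(float(waits.mean()), 1))

# (iii) size-frequency exponent via the Hill (maximum-likelihood)
# estimator for a power-law tail P(E > e) ~ e^(-beta); a drop in beta
# (relatively more large events) is one of the precursors described above.
e_min = np.quantile(energies, 0.5)
tail = energies[energies >= e_min]
beta = tail.size / np.log(tail / e_min).sum()
print("tail exponent beta:", round(float(beta), 2))
```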
Co-detection of micro seismic activity as early warning of gravitational slope failure
We developed a new strategy for disaster risk reduction for gravitational slope failures: a simple method for real-time early warning of gravity-driven failures that considers and exploits both the heterogeneity of natural media and the attenuation characteristics of acoustic emissions. The method capitalizes on the co-detection of elastic waves emanating from micro-cracks by a network of multiple, spatially distributed sensors. Co-detection of an event is treated as a surrogate for large event size, with more frequent co-detections marking the imminence of catastrophic failure. In this study we apply the method to a steep rock glacier/debris slope and demonstrate the potential of this simple strategy for real-world cases, i.e., at the slope scale. This low-cost, robust, and autonomous system provides a well-suited alternative or complement to existing early-warning systems.
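A minimal sketch of the co-detection criterion: scan time-sorted detections and flag groups in which at least k_min distinct sensors trigger within a short coincidence window. The window length, sensor threshold, and toy detections below are assumed values, not those of the study.

```python
import numpy as np

def count_codetections(picks, window=0.1, k_min=4):
    """Count coincidence groups in which >= k_min distinct sensors
    detect within `window` seconds.

    `picks` is a list of (sensor_id, arrival_time) detections. Frequent
    co-detections serve as a proxy for large events and hence for
    approaching failure.
    """
    picks = sorted(picks, key=lambda p: p[1])
    times = np.array([t for _, t in picks])
    sensors = np.array([s for s, _ in picks])
    alarms, i = 0, 0
    while i < len(times):
        j = np.searchsorted(times, times[i] + window, side="right")
        if len(set(sensors[i:j])) >= k_min:   # distinct sensors fired together
            alarms += 1
            i = j                # skip the rest of this coincidence group
        else:
            i += 1
    return alarms

# Toy detections from a hypothetical 6-sensor array over one minute,
# plus one planted large event seen by all sensors nearly simultaneously.
rng = np.random.default_rng(4)
picks = [(s, t) for s in range(6) for t in rng.uniform(0, 60, 40)]
picks += [(s, 30.0 + 0.01 * s) for s in range(6)]
print("co-detected events:", count_codetections(picks))
```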
Observational constraints on the sensitivity of two calving glaciers to external forcings
Future mass-loss projections for the Greenland ice sheet require an understanding of the processes at a glacier terminus, especially iceberg calving. We present detailed, high-rate terrestrial radar interferometer observations of Eqip Sermia and Bowdoin Glacier, two outlet glaciers in Greenland with comparable dimensions, and investigate iceberg calving, surface elevation, velocity, and strain rates and their links to air temperature, tides, and topography. The results reveal that the two glaciers exhibit very different flow and calving behaviour on different timescales. Ice flow driven by a steep surface slope with several topographic steps leads to high velocities, areas of extension, and intense crevassing, which triggers frequent but small calving events independent of local velocity gradients. In contrast, ice flow under smooth surface slopes leaves the ice relatively intact, such that sporadic large-scale calving events dominate, initiating in areas of high shearing. Flow acceleration caused by enhanced meltwater input, as well as tidal velocity variations, was observed for terminus sections close to flotation. Firmly grounded terminus sections showed no tidal signal and only a weak short-term reaction to air temperature. These results demonstrate reaction timescales to external forcings ranging from hours to months, which are, however, strongly dependent on local terminus geometry.