
    Factors Contributing to Participation in Web-based Surveys among Italian University Graduates

    An established yearly survey monitoring the employment opportunities of Italian graduates, traditionally carried out with CATI methods, has been integrated in recent years with CAWI. CAWI has become increasingly important because the large number of graduates involved in the survey has forced a reduction in fieldwork duration and unit costs. Although the seven CAWI surveys used here differ in substantive and methodological characteristics, preliminary analysis reveals a common trend: participation peaks in the first few days immediately after fieldwork begins and, to a lesser degree, after follow-up reminders are delivered. Web respondents form a self-selected subgroup of the target population, with better academic performance and greater computer skills. A Cox regression model of response probability (or response time) shows, besides the expected effects of certain personal and survey-design characteristics, that graduates in science or engineering and those reporting good computer skills respond faster, whereas graduates in medicine/health or defence/security and those reporting no computer skills are less likely to respond. Ways to use these findings to fine-tune data collection are discussed.
    Keywords: CAWI surveys, response rate, university graduates, Cox regression
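    To make the modelling step concrete, here is a minimal sketch, in Python with the open-source lifelines library, of how a Cox proportional-hazards model can relate time-to-response to respondent covariates. This is not the paper's code or data; all column names and values are illustrative assumptions.

```python
# A minimal sketch with toy data (not the paper's dataset): a Cox
# proportional-hazards model of survey response time using `lifelines`.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical respondent-level data: days until questionnaire completion,
# an event flag (1 = responded, 0 = censored when fieldwork closed), and
# covariates like those the abstract describes.
df = pd.DataFrame({
    "days_to_response": [2, 5, 1, 30, 12, 3, 25, 7],
    "responded":        [1, 1, 1,  0,  1, 1,  0, 1],
    "science_or_eng":   [1, 0, 1,  0,  0, 1,  1, 0],
    "good_pc_skills":   [1, 1, 1,  0,  1, 0,  1, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_response", event_col="responded")
cph.print_summary()  # hazard ratio > 1 for a covariate means faster response
```

    In this framing, a hazard ratio above 1 corresponds to faster response, matching the abstract's finding for science/engineering graduates with good computer skills; censored records capture graduates who never responded before fieldwork closed.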

    The Business Model of Free Software: Legal Myths and Real Benefits

    Free Software is the term coined by Richard Stallman in 1983 to denote programs whose sources are available to whoever receives a copy of the software, and which come with the freedom to run, copy, distribute, study, change and improve the software. As Richard Stallman’s concept grew in popularity, and with the subsequent advent of GNU/Linux, Free Software has received a great deal of attention and media publicity. With attention and publicity came expectations, as well as a number of legal myths and confusions. The objective of this article is to clarify some of these legal misunderstandings while explaining how the legal fundamentals on which Free Software is based allow for a long-lasting business model built on a special kind of expertise-based support, one that benefits customers and guarantees the creation of a local pool of expertise. This article is based on our experience with GNAT Pro, the Free Software development environment for the Ada 95 programming language. It comprises a compiler that is part of GCC (the GNU Compiler Collection), a toolset and graphical Integrated Development Environment, and a set of supporting libraries. Developing, maintaining, and marketing GNAT Pro for ten years have provided significant experience with both technical and non-technical aspects of Free Software. This article summarizes the principal legal and business lessons learned.

    FLOSS, COTS, and Safety: A Business Perspective

    This paper discusses the relationship between COTS (commercial-off-the-shelf) software and FLOSS (Freely Licensed Open Source Software) from a purely business perspective. The emphasis of this work is on safety-centric industries such as aerospace, automotive, and railways.

    Free Software and Leveraged Service Organizations

    In this work we examine where for-profit pure-FLOSS business organizations in the embedded real-time software space draw their revenue. The business model of these ventures rests on an original concept, the Leveraged Service Organization (LSO), which, thanks to its subscription-based model, can generate a stable cash flow that funds innovation and rewards employees and investors alike. The leverage, in an LSO, comes from the concentration of know-how and expertise around the Free Software package(s) marketed. Thanks to this expertise, an LSO can offer a service of extremely high value to its customers.

    Genealogy(ies), Social Sciences and Digital Humanities: On Three Genealogical Databases of Jewish and Christian Populations of Central Italy in the Modern Era

    This article presents a research approach that combines genealogy, social sciences and digital humanities. Initiated in the early 2000s, the project concerns itself with the systematic collection and analysis of two central Italian population groups: Jews and Christians, based on a methodological and epistemological reflection on sources and quantitative methods in historiography and ethnology. Starting from the experiences of this project, the article discusses fundamental questions: why and how can we reconstruct an entire population? Which problems occur in the process, both in terms of the sources and the tools developed by researchers and genealogists? Finally, what is the future of the databases that we develop?

    Study and Comparison of Heat Treatments of Steel Components Produced by Additive Manufacturing and by Conventional Processes

    A study and comparison of the effects of heat treatments applied to steel components produced by the Selective Laser Melting additive manufacturing process and by conventional processes. The objective is to show how these treatments can yield results, in terms of mechanical properties and microstructure, that differ from those of conventional treatments.

    Forecast Sensitivity to Observations using Data Denial and Ensemble-based Methods over the Dallas-Fort Worth Testbed

    The ‘Nationwide Network of Networks’ (NNoN) concept was introduced by the National Research Council to address the growing need for a national mesoscale observing system, part of which is the continued advancement toward accurate high-resolution numerical weather prediction. The research testbed known as the Dallas-Fort Worth (DFW) Urban Demonstration Network was created to experiment with many kinds of mesoscale observations that could be used in a data assimilation system, in order to identify the observing systems with the greatest impact on high-resolution forecasts. Many observing systems have been implemented for the DFW testbed, including Earth Networks (ERNET) WeatherBug surface stations, Citizen Weather Observer Program (CWOP) amateur surface stations, Global Science and Technology (GST) mobile truck observations, CASA X-band radars, SODARs, and radiometers. These ‘nonconventional’ observations are combined with conventional operational data from METARs, mesonets, aircraft, rawinsondes, profilers, and operational radars to form the testbed network. A principal component of the NNoN effort is the quantification of observation impact from several different sources of information. This dissertation covers two main themes related to quantifying the impact that observations have on forecasts.
    The first theme is the quantification of impact using data denial (observing system) experiments. The GSI-based EnKF data assimilation system was used together with the WRF-ARW model to examine the impact of assimilated observations on forecasts of convection initiation (CI) in the 3 April 2014 hailstorm case. Data denial experiments tested the impact of high-frequency (5-min) assimilation of nonconventional data on the timing and location of CI, as well as on the development of storms as they progressed through the testbed domain. Results using ensemble probability of reflectivity and neighborhood ensemble probability of hail show that nonconventional observations were necessary to capture local details in the dryline structure that caused localized enhanced convergence and led to CI. Diagnosis of denial-minus-control fields showed the cumulative influence each observing network had on the resulting CI forecast; most of this impact came from the assimilation of thermodynamic observations. Accurate metadata proved crucial to the application of nonconventional observations in high-resolution assimilation and forecast systems.
    The second theme is the application of the ensemble-based forecast sensitivity to observations (EFSO). First, tests using a global two-layer model were performed to identify improvements to the localization methods needed to make EFSO estimates accurate. Because of the forecast-time component, localization of the EFSO metric is more complicated than during traditional assimilation: as forecast time increases, the error correlation structures evolve with the flow. Experiments used the local ensemble transform Kalman filter (LETKF) with a simple two-layer primitive-equation model and simulated observations. Application of an adaptive localization method, regression confidence factors (RCF) based on a Monte Carlo "group filter" technique, led to marked improvements, especially for longer forecasts and at midlatitudes, when verified systematically against the actual impact in RMSE and skill scores. Results showed that the shape, location, time dependency, and variable dependency of the RCF localization functions are consistent with the underlying dynamical processes of the model. Impact estimates near the equator were less accurate, owing to large discrepancies between the RCF function and the localization used at assimilation time. These results indicate an inherent relationship between the localization applied at assimilation time and the proper localization choice for observation impact estimates. The use of RCF for automatically tuned localization is introduced and tested in a single-observation experiment. Next, the EFSO method was applied to the high-resolution CI case of 3 April 2014 and evaluated for accuracy against several verification metrics, including energy norms, surface variables, and composite reflectivity. Static and advected localization were applied to EFSO and compared for accuracy against the actual forecast error reduction, and the RCF method was also applied to the convective-scale EFSO estimation. Different verification metrics lead to different forecast length scales over which the estimates are useful. The application of EFSO to reflectivity is hindered by the high nonlinearity of convection, though it offered some qualitative insights. RCF localization, while revealing the underlying flow-dependence of the case study, including the forecast-time component, did not improve upon the advected localization method; consistent with the two-layer model results, this is hypothesized to stem in part from a mismatch between the localization used at assimilation time and that needed for impact estimation, though other adaptive methods may yet yield better results. Nevertheless, EFSO is appropriate for convective-scale systems on forecast time scales of 90 minutes or less.
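    For context, a common formulation of the EFSO impact estimate in the LETKF setting is sketched below (after Kalnay et al., 2012). This is a textbook form, not necessarily the exact variant used in the dissertation, whose localization terms differ by experiment.

```latex
% EFSO estimate of the forecast-error change attributable to observations
% assimilated at t = 0, for a forecast verified at time t; K is the
% ensemble size and C the error-norm matrix.
\Delta e^2
  = e_{t|0}^{\top} C\, e_{t|0} - e_{t|-6}^{\top} C\, e_{t|-6}
  \approx \frac{1}{K-1}\,
    \delta y_0^{\top} R^{-1}\, Y_0^{a}\, X_{t|0}^{f\top}\, C\,
    \left( e_{t|0} + e_{t|-6} \right)
```

    Here e_{t|0} and e_{t|-6} are forecast errors from the analysis and from the prior background forecast, delta y_0 is the observation-minus-background innovation, R the observation-error covariance, Y_0^a the analysis ensemble perturbations mapped into observation space, and X_{t|0}^f the forecast ensemble perturbations. Localization, whether static, advected, or RCF-based as in the dissertation, is applied to the Y_0^a X_{t|0}^{f T} cross-covariance.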

    Implementation of an FMCW Radar with SDR Devices

    The thesis is structured as follows: Chapter 1 gives an overview of radar technology and of the importance it holds in modern cutting-edge applications and in the future ones that will emerge with the arrival of 6G. Chapter 2 provides background on the operating principle of radar technology, with particular attention to CW and FMCW devices and the modulation technique they adopt for signal transmission. Chapter 3 introduces SDR device technology and, in particular, gives a general description of the Adalm-Pluto radio device used for the experimental activity. Chapter 4 presents the steps that led to the Matlab code for generating and processing the transmitted signals and for configuring the Adalm-Pluto as a radar device. Finally, Chapter 5 discusses the numerical results obtained from the experimental activity: the outcomes are analysed and commented on, and conclusions about the work are drawn.
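    As a companion to the signal-processing chapters, here is a minimal numpy sketch of the FMCW principle the thesis implements: transmit a linear chirp, dechirp the delayed echo by mixing, and read the target range off the beat-frequency spectrum. This is an illustration, not the thesis's Matlab code, and all parameter values are assumptions chosen for the example.

```python
# Minimal FMCW simulation: chirp generation, dechirp, and range estimation.
import numpy as np

c  = 3e8        # speed of light (m/s)
B  = 30e6       # chirp bandwidth (Hz), illustrative
T  = 1e-3       # chirp duration (s)
fs = 61.44e6    # sample rate (Hz), a Pluto-like value for the example
R  = 150.0      # simulated target range (m)

t     = np.arange(0, T, 1 / fs)
slope = B / T
tx    = np.exp(1j * np.pi * slope * t**2)         # baseband linear up-chirp
tau   = 2 * R / c                                 # round-trip delay
rx    = np.exp(1j * np.pi * slope * (t - tau)**2) # delayed echo (noise-free)

beat  = tx * np.conj(rx)                          # dechirp: constant-tone beat
spec  = np.abs(np.fft.fft(beat))
freqs = np.fft.fftfreq(len(beat), 1 / fs)
f_b   = abs(freqs[np.argmax(spec[: len(spec) // 2])])
print(f"estimated range: {f_b * c * T / (2 * B):.1f} m")  # R = f_b * c * T / (2B)
```

    The same dechirp-and-FFT pipeline applies to real samples from an SDR front end, with the hardware supplying the transmitted chirp and the received echo in place of the simulated signals.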

    ALIX COHEN: KANT’S LECTURES ON ANTHROPOLOGY. A CRITICAL GUIDE. CAMBRIDGE UNIVERSITY PRESS, 2014, 270 PP. ISBN: 978-1-107-02491-5.

    The volume edited by Alix Cohen aims to bring out the importance of the publication of Kant’s Vorlesungen über die Anthropologie in volume XXV of the Akademie-Ausgabe, edited in 1997 by Reinhard Brandt and Werner Stark, and of the English translation of the Lectures in the Cambridge Edition in 2006. The historical and philosophical relevance of these lectures is thus brought to light and freed from the accessory status, relative to the critical writings and to the Anthropology itself, that characterized them for a long (and perhaps too long) time. The fortunes of these anthropological reflections have already begun to change over the last decade, and a new phase of research has opened on the role they play in the Kantian system as a whole. The volume edited by Cohen certainly has the merit of continuing this debate and enriching it with new contributions, which I briefly summarize here.

    An analysis of globalization strategies for real estate service providers

    Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Urban Studies and Planning, 1998. Includes bibliographical references (leaves 136-139).
    The technological advancements of the past decade have profoundly changed the way national real estate companies do business. Geographical boundaries that once segregated companies into specific regions of dominance have all but disappeared. As other industries expand their operations overseas, real estate companies have had to devise their own international strategies in order to capitalize on the new 'global' market. This thesis begins by stating three traditional hypotheses on how real estate companies should attempt the transition into becoming global real estate providers: establishing a joint venture with an existing international corporation, acquiring a local presence by absorbing foreign real estate firms, and marketing a firm's particular expertise to other foreign markets. The best solution depends on a particular company's strengths and weaknesses, and every international strategy requires a different strategic plan and implementation process. The second half of the thesis presents three case studies of real estate providers that have already attempted the "globalization" transition: Lend Lease, CB Richard Ellis, and LaSalle Partners. The companies on the front line of any new industry venture are the first to test the academic hypotheses. We compare and contrast the success of the aforementioned strategic plans with respect to these three companies. By carefully reviewing the successes of, and lessons learned by, the industry leaders, other real estate companies can better understand their particular opportunities in going global. It is apparent to most of the real estate industry that the local specialized service provider is a dinosaur. Some companies have started the transition into tomorrow, and it seems logical that other service providers will not want to reinvent the wheel.
    by James H. Gasperoni and David R. Ravin. S.M.