1,771 research outputs found

    Recommendation systems and crowdsourcing: a good wedding for enabling innovation? Results from technology affordances and constraints theory

    Get PDF
    Recommendation Systems have come a long way since their first appearance in e-commerce platforms. Since then, evolved Recommendation Systems have been successfully integrated into social networks. Now it is time to test their usability and replicate their success in exciting new areas of web-enabled phenomena. One of these is crowdsourcing. Research in the IS field is investigating the need for, benefits of, and challenges in linking the two phenomena. So far, empirical works have only highlighted the need to implement these techniques for task assignment in crowdsourced distributed work platforms and the derived benefits for contributors and firms. We review the variety of tasks that can be crowdsourced through these platforms and theoretically evaluate the efficiency of using RS to recommend a task in creative crowdsourcing platforms. Adopting the Technology Affordances and Constraints Theory, an emerging perspective in the Information Systems (IS) literature for understanding technology use and its consequences, we anticipate the tensions that this implementation can generate.
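    As a minimal illustration of the kind of recommendation mechanism discussed above, the following sketch matches a contributor profile to task descriptions by text similarity; the task list, profile text and TF-IDF approach are illustrative assumptions, not elements of the study.

```python
# Illustrative sketch of content-based task recommendation for a crowdsourcing
# platform; the data and the TF-IDF/cosine-similarity choice are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

tasks = [
    "Design a logo for a sustainable fashion brand",
    "Translate a product manual from English to Italian",
    "Label 500 street-scene images for object detection",
]
profile = "graphic designer with experience in branding and illustration"

vectorizer = TfidfVectorizer(stop_words="english")
task_matrix = vectorizer.fit_transform(tasks)            # task descriptions -> TF-IDF vectors
profile_vec = vectorizer.transform([profile])            # contributor profile in the same space

scores = cosine_similarity(profile_vec, task_matrix)[0]  # similarity of the profile to each task
for i in scores.argsort()[::-1]:                         # most similar task first
    print(f"{scores[i]:.2f}  {tasks[i]}")
```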

    Bingo pricing: a game simulation and evaluation using the derivatives approach

    Get PDF
    The Bingo game is well known and played all over the world. Its main feature is the sequential drawing, without repetition, of a set of numbers. Each drawn number is compared with the numbers printed in the boxes on the rows (and columns) of the score-cards owned by the participants. The winner is the participant who is first to check all the boxes (numbers) in a row (Line) or in the entire score-card (Bingo). Assuming that the score-card has a predetermined purchase price and that the jackpot is divided into two shares, one for the Bingo winner and one for the Line winner, all score-cards start with the same value (initial price). After each drawing, every score-card takes a different value (current price) according to its probability of winning the Line and/or the Bingo. This probability depends on the number of checked boxes in the rows of the score-card and on the number of checked boxes in the rows of all the other score-cards in play. The first aim of this paper is to provide the basic data structure of the problem and to formalise the algorithms needed to calculate the initial price and the current price. The procedure evaluates a single score-card and/or the whole set of score-cards in play according to the results of the subsequent drawings. During the game, after each drawing, it is thus possible to know the value of each score-card and decide whether to keep it or sell it. The evaluation follows Galileo's traditional method of "the interrupted game jackpot repartition", an approach also discussed by Blaise Pascal and Pierre de Fermat in their correspondence about the "jackpot problem". A more advanced objective of the paper is the application of stock exchange techniques to calculate the future price of a score-card (and/or of a set of score-cards) that will have some checked numbers after a certain number of future drawings. In the same way, we calculate the value of the right to purchase or sell a score-card (and/or a set of score-cards) at a predetermined price (option price). Especially during the prototyping phase, modelling and developing this kind of problem requires computational environments able to manage structured data and with strong calculation capabilities. Software meeting these requirements includes APL, J and Matlab, thanks to their support for nested arrays and the endogenous parallelism of their programming environments. In this paper we illustrate the above issues using Apl2Win/IBM. The game structure is formalised in a general way, so as to cover particular cases that behave differently from Bingo. It is thus possible to simulate the traditional game with 90 numbers in the basket, score-cards of 3 rows by 10 columns, 15 numbers for the Bingo and 5 numbers for the Line, but also, for example, Roulette, with 37 (or 38) numbers, score-cards with 1 (or more) rows and 1 column, and a Line of just 1 number. Keywords: bingo, options, futures, gambling, market, evaluation.
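    The paper formalises its algorithms in APL2; the sketch below is only an illustrative Python approximation of the current-price idea, estimating each score-card's expected share of the Bingo prize by Monte Carlo simulation of the remaining drawings. The card contents, the prize amount and the tie-splitting rule are assumptions made for the example.

```python
# Illustrative Monte Carlo sketch (not the paper's APL2 algorithms): approximate
# the "current price" of each score-card as its expected share of the Bingo
# prize, by simulating the remaining drawings of the 90-number basket.
import random

def bingo_values(cards, drawn, jackpot_bingo, pool=range(1, 91), n_sims=20_000):
    """`cards` is a list of sets of 15 numbers, `drawn` the numbers already extracted."""
    missing = [set(c) - set(drawn) for c in cards]        # numbers each card still needs
    remaining = [n for n in pool if n not in set(drawn)]  # numbers still in the basket
    wins = [0.0] * len(cards)
    for _ in range(n_sims):
        order = random.sample(remaining, len(remaining))  # one simulated future drawing
        position = {n: i for i, n in enumerate(order)}
        # index of the draw that completes each card (already complete cards finish at -1)
        finish = [max((position[n] for n in m), default=-1) for m in missing]
        best = min(finish)
        winners = [i for i, f in enumerate(finish) if f == best]
        for i in winners:                                 # ties split the prize
            wins[i] += 1.0 / len(winners)
    return [jackpot_bingo * w / n_sims for w in wins]

# Example: two random score-cards, ten numbers already drawn.
card_a = set(random.sample(range(1, 91), 15))
card_b = set(random.sample(range(1, 91), 15))
already_drawn = random.sample(range(1, 91), 10)
print(bingo_values([card_a, card_b], already_drawn, jackpot_bingo=100.0))
```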

    BIG DATA ANALYTICS FOR FINANCIAL FRAUD DETECTION

    Get PDF
    Criminals and criminal organizations often make use of companies and other corporate entities to hide their identity, conceal illicit flows of money, launder funds, finance terrorist organizations, evade taxes, create and hide slush funds, commit bribery and corruption, and carry out accounting frauds and other financial crimes. These legal entities are frequently organized into complex ownership schemes set up in different countries, with a "Chinese boxes" structure, in order to make it harder to determine who ultimately controls them and benefits from the illegal conduct. There are currently many competitors in the Financial Fraud Detection market, but the software they propose is mainly oriented to supervising and managing institutions' internal compliance processes, such as the management and transmission of Suspicious Activity Reports (SAR), rather than providing intelligence tools for proactively discovering potential threats and identifying the final beneficiaries of illegal operations. Consequently, there is potential for Financial Fraud Detection focused on the on-line, real-time statistical analysis of transactions, operator behaviour and price movements, and on data mining algorithms that work on heterogeneous sources of big data. After describing the schemes used to execute the three most relevant financial frauds, this research proposes a novel approach for the detection of illicit behaviours and suspect transactions. The approach relies on a multidisciplinary analysis of big data streams coming from heterogeneous sources such as TV streams, social media and public (official and unofficial) databases.
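    As a hedged sketch of the kind of statistical screening described above (not the system proposed in this research), the following Python fragment scores synthetic transactions with an Isolation Forest; the features and values are illustrative assumptions.

```python
# Illustrative anomaly scoring of transactions with an Isolation Forest;
# feature choices and thresholds are assumptions made for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Columns: amount (EUR), hour of day, transactions by the same account in the last 24h.
normal = np.column_stack([
    rng.lognormal(4, 1, 1000),     # typical amounts
    rng.integers(8, 20, 1000),     # business hours
    rng.poisson(3, 1000),          # usual activity level
])
suspicious = np.array([[50_000, 3, 40], [120_000, 2, 55]])  # large, night-time, bursty
X = np.vstack([normal, suspicious])

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
scores = model.decision_function(X)   # lower = more anomalous
flags = model.predict(X)              # -1 marks likely outliers
print("flagged rows:", np.where(flags == -1)[0])
```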

    Self-assembly of bi-functional patchy particles with anisotropic shape into polymer chains: theory and simulations

    Full text link
    Concentrated solutions of short blunt-ended DNA duplexes, down to 6 base pairs, are known to order into the nematic liquid crystal phase. This self-assembly is due to the stacking interactions between the duplex terminals, which promote their aggregation into polydisperse chains with a significant persistence length. Experiments show that liquid crystal phases form above a critical volume fraction that depends on the duplex length. We introduce and investigate, via numerical simulations, a coarse-grained model of DNA double-helical duplexes. Each duplex is represented as a hard quasi-cylinder whose bases are decorated with two identical reactive sites. The stacking interaction between terminal sites is modeled via a short-range square-well potential. We compare the numerical results with predictions based on a free energy functional and find satisfactory quantitative matching of the isotropic-nematic phase boundary and of the system structure. Comparison of the numerical and theoretical results with experimental findings confirms that the self-assembly of DNA duplexes can be properly modeled via equilibrium polymerization of cylindrical particles, and enables us to estimate the stacking energy.
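    For concreteness, a minimal sketch of the site-site square-well attraction used in this class of patchy-particle models is given below; the well width and depth are illustrative values, not the parameters of the study.

```python
# Minimal sketch of a short-range square-well attraction between terminal
# reactive sites; delta (well width) and epsilon (well depth) are illustrative.
import numpy as np

def square_well(r, delta=0.2, epsilon=1.0):
    """Pairwise site-site energy: -epsilon if the two reactive sites are
    closer than delta, zero otherwise (hard-core overlap handled elsewhere)."""
    r = np.asarray(r, dtype=float)
    return np.where(r < delta, -epsilon, 0.0)

print(square_well([0.05, 0.19, 0.25]))   # -> [-1. -1.  0.]
```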

    The socio-economic impact of technological innovation. Models and analysis of the digital technologies for cultural and creative industries.

    Get PDF
    The research activity synthesized in this book starts from the consideration that there is a growing need to verify how public investment in innovation can guarantee the best value for money and maximise the impact on the economy and society. The cultural heritage sector represents a strategic target for R&D investment in Europe, and a set of tools able to assess the socio-economic impact of project activities is strongly needed here as well. With the aim of supporting the maximisation of the effectiveness and efficiency of research outputs, thanks to the MAXICULTURE project (FP7-ICT-2011-9-601070), our research team analysed project outputs both in terms of innovation and improvement over the state of the art of ICTs for the creative and cultural sector, and in terms of transferability of results to the wider society in general and to the supply industry in particular. During the research activities we:
    • performed the analysis of the DigiCult domain through a literature review and the analysis of EC FP7 Call 1, Call 3, Call 6, Call 9 and Europeana projects;
    • developed the assessment methodology for the DigiCult projects;
    • gathered feedback from experts and projects on the methodology through webinars and online questionnaires;
    • developed the Self-Assessment Toolkit (SAT);
    • performed the assessment of 19 projects in the DigiCult domain using the data gathered through the Self-Assessment Toolkit.
    The analysis produced interesting results, such as:
    • the design of a specific Hype Cycle for DigiCult projects;
    • a better understanding of the innovation dynamics in the sector;
    • information on how to improve the diffusion of the knowledge generated by DigiCult projects;
    • information on how to improve the socio-economic impact of DigiCult projects.

    Contextual impacts on industrial processes brought by the digital transformation of manufacturing: a systematic review

    Get PDF
    The digital transformation of manufacturing (a phenomenon also known as "Industry 4.0" or "Smart Manufacturing") is attracting growing interest at both the practitioner and academic levels, but it is still in its infancy and needs deeper investigation. Even though the current and potential advantages of digital manufacturing are remarkable, in terms of improved efficiency, sustainability, customization, and flexibility, only a limited number of companies have so far developed the ad hoc strategies necessary to achieve superior performance. Through a systematic review, this study aims to assess the current state of the art of the academic literature on the paradigm shift occurring in manufacturing settings, in order to provide definitions as well as point out recurring patterns and gaps to be addressed by future research. For the literature search, the most representative keywords, strict criteria, and classification schemes based on authoritative reference studies were used. The final sample of 156 primary publications was analyzed through a systematic coding process to identify theoretical and methodological approaches, together with other significant elements. This analysis allowed a mapping of the literature based on clusters of critical themes, synthesizing the developments of different research streams and providing the most representative picture of the field's current state. The research areas, insights, and gaps resulting from this analysis contributed to a schematic research agenda, which clearly indicates the space for future evolutions of the state of knowledge in this field.

    Self-assembly of short DNA duplexes: from a coarse-grained model to experiments through a theoretical link

    Full text link
    Short blunt-ended DNA duplexes comprising 6 to 20 base pairs self-assemble into polydisperse semi-flexible chains due to hydrophobic stacking interactions between terminal base pairs. Above a critical concentration, which depends on temperature and duplex length, such chains order into liquid crystal phases. Here, we investigate the self-assembly of such double-helical duplexes with a combined numerical and theoretical approach. We simulate the bulk system employing the coarse-grained DNA model recently proposed by Ouldridge et al. [J. Chem. Phys. 134, 085101 (2011)]. We then evaluate the input quantities for the theoretical framework directly from the DNA model. The resulting parameter-free theoretical predictions provide an accurate description of the simulation results in the isotropic phase. In addition, the theoretical isotropic-nematic phase boundaries are in line with experimental findings, providing a route to estimate the stacking free energy.
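    As a rough illustration of how a stacking free energy maps onto chain growth, the sketch below uses the standard isodesmic (linear equilibrium polymerization) estimate of the number-averaged chain length; it is a textbook-style approximation, not the free-energy functional employed in the paper.

```python
# Illustrative isodesmic estimate of the number-averaged chain length as a
# function of duplex concentration and an effective stacking free energy.
import numpy as np

def mean_chain_length(c, delta_g, T=293.15):
    """M = (1 + sqrt(1 + 4*K*c)) / 2, with K = exp(-delta_g / (R*T)) an
    effective stacking equilibrium constant; c in mol/L, delta_g in J/mol."""
    R = 8.314  # J/(mol K)
    K = np.exp(-delta_g / (R * T))
    return 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * K * np.asarray(c)))

# Chains grow as concentration increases or stacking becomes more favourable.
print(mean_chain_length(c=[0.001, 0.01, 0.1], delta_g=-20_000))
```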

    Analysis of the influence of Al and Ti additions on phase generation and damage in a hot dip galvanizing process

    Get PDF
    Cheap iron-based alloys, such as Ductile Cast Irons (DCIs) and low carbon steels, are increasingly used in the mechanical field because they combine good strength with good workability. However, the low electrochemical potential of low carbon steel leads to rapid environmental corrosion that can compromise the operative life of mechanical components. It is therefore important to protect them against corrosion, also for safety and reliability reasons. The use of a traditional protection technique, such as Hot Dip Galvanizing (HDG), also keeps costs low. In this work, the phase formation during the HDG process is presented and discussed. In particular, the influence of Al and Ti additions on the pure Zn bath is shown through metallographic analysis, together with the results for the pure Zn bath.

    Perceived neighbourhood quality and adult health status: new statistical advances useful to answer old questions?

    Get PDF
    Interest in the quantitative effects of neighbourhood characteristics on adult health has recently increased in social epidemiology. In particular, investigations concern the statistical influence on health of several individual demographic and socioeconomic characteristics and of neighbourhood characteristics as perceived by respondents. We analyse these issues within an original conceptual framework, employing statistical models that are unusual in this context. We use data collected in the Los Angeles Family and Neighbourhood Survey (L.A.FANS) to model the number of hospital admissions experienced by each individual as a function of individual and neighbourhood characteristics, the latter being related to the individual's perceptions of the neighbourhood he or she lives in. We employ generalized additive models with different distributional assumptions: Poisson, Negative Binomial and Zero-Inflated Poisson (ZIP). Such models allow us to estimate (through spline functions) potentially nonlinear effects of the covariates on the response. Moreover, non-standard representations are used to overcome difficulties in interpreting the results of ZIP models. It turns out that perceived neighbourhood characteristics, and in particular the perception of social cohesion, have a significant effect after controlling for individual characteristics relevant to hospital admission frequency. From a modelling point of view, the ZIP and Negative Binomial models prove superior to the standard Poisson model. We thus confirm the role of the neighbourhood where an individual lives in determining his or her health status. A strength of this analysis is that, owing to the choice of the neighbourhood characteristics included in the model, the results do not depend on a particular definition of neighbourhood (traditionally based on administrative boundaries), since each individual refers his or her perceptions to a personal definition of it.
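    A minimal sketch of the count-model comparison described above (Poisson, Negative Binomial, zero-inflated Poisson) is given below using synthetic data; the covariates are placeholders rather than the L.A.FANS variables, and the example uses plain regression terms instead of the spline-based generalized additive specification.

```python
# Illustrative comparison of Poisson, Negative Binomial and ZIP count models
# on synthetic data; variable names and effect sizes are assumptions.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(1)
n = 2000
perceived_cohesion = rng.normal(size=n)            # stand-in for a perception score
age = rng.uniform(20, 80, size=n)
X = sm.add_constant(np.column_stack([perceived_cohesion, age]))

# Simulate zero-inflated counts: a Poisson part plus many structural zeros.
mu = np.exp(-0.5 - 0.3 * perceived_cohesion + 0.01 * age)
y = rng.poisson(mu) * (rng.random(n) > 0.4)        # ~40% excess zeros

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(disp=False)

# Lower AIC suggests a better trade-off between fit and complexity.
print("Poisson AIC:", poisson_fit.aic)
print("NegBin  AIC:", negbin_fit.aic)
print("ZIP     AIC:", zip_fit.aic)
```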

    THE SOCIO-ECONOMIC IMPACT OF TECHNOLOGICAL INNOVATION: MODELS AND ANALYSIS OF THE DIGITAL TECHNOLOGIES FOR CULTURAL AND CREATIVE INDUSTRIES

    Get PDF
    The research activity synthesized in this thesis starts from the consideration that there is a growing need to verify how public investment in innovation can guarantee the best value for money and maximise the impact on the European economy and society. The cultural heritage sector represents a strategic target for R&D investment in Europe, and a set of tools able to assess the socio-economic impact of project activities is strongly needed here as well. With the aim of supporting the maximisation of the effectiveness and efficiency of research outputs, we analysed project outputs both in terms of innovation and improvement over the state of the art of ICTs for the creative and cultural sector, and in terms of transferability of results to the wider society in general and to the supply industry in particular. During the research activities we:
    • performed the analysis of the DigiCult domain through a literature review and the analysis of EC FP7 Call 1, Call 3, Call 6, Call 9 and Europeana projects;
    • developed the assessment methodology for the DigiCult projects;
    • gathered feedback from experts and projects on the methodology through webinars and online questionnaires;
    • developed the Self-Assessment Toolkit (SAT);
    • performed the assessment of 19 projects in the DigiCult domain using the data gathered through the Self-Assessment Toolkit.
    The analysis produced interesting results, such as:
    • the design of a specific Hype Cycle for DigiCult projects;
    • a better understanding of the innovation dynamics in the sector;
    • information on how to improve the diffusion of the knowledge generated by DigiCult projects;
    • information on how to improve the socio-economic impact of DigiCult projects.