
    Model-Based Method for Social Network Clustering

    We propose a simple mixed membership model for social network clustering in this note. A flexible function is adopted to measure affinities among a set of entities in a social network. The model not only allows each entity in the network to possess more than one membership, but also provides accurate statistical inference about the network structure. We estimate the membership parameters with an MCMC algorithm. We evaluate the performance of the proposed algorithm by applying our model to two empirical social network data sets, the Zachary karate club data and the bottlenose dolphin network data. We also conduct numerical studies on different types of simulated networks to assess the effectiveness of our algorithm. Finally, concluding remarks and directions for future work are briefly discussed.
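
    As a rough illustration of this kind of model (not the authors' exact affinity function or sampler), the sketch below assumes edge probabilities given by the inner product of per-node membership vectors and estimates the memberships with a simple random-walk Metropolis sampler; the toy network and all settings are hypothetical.

# Minimal sketch (not the paper's exact model): each node i has a membership
# vector theta_i on the K-simplex, and the probability of an edge (i, j) is
# taken to be theta_i . theta_j. Memberships are sampled with random-walk
# Metropolis moves on unconstrained logits.
import numpy as np

def edge_prob(theta):
    # theta: (n, K) membership vectors; returns (n, n) edge probabilities
    return np.clip(theta @ theta.T, 1e-6, 1 - 1e-6)

def log_lik(A, theta):
    p = edge_prob(theta)
    iu = np.triu_indices_from(A, k=1)          # undirected network, no self-loops
    return np.sum(A[iu] * np.log(p[iu]) + (1 - A[iu]) * np.log(1 - p[iu]))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mcmc(A, K=2, n_iter=5000, step=0.3, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.normal(size=(A.shape[0], K))       # unconstrained logits
    ll = log_lik(A, softmax(z))
    samples = []
    for it in range(n_iter):
        z_prop = z + step * rng.normal(size=z.shape)
        ll_prop = log_lik(A, softmax(z_prop))
        if np.log(rng.uniform()) < ll_prop - ll:   # Metropolis accept/reject
            z, ll = z_prop, ll_prop
        if it % 10 == 0:
            samples.append(softmax(z))
    # crude posterior summary: average the second half of the kept samples
    return np.mean(samples[len(samples) // 2:], axis=0)

# Example: a toy two-block network (dense within blocks, empty between them)
A = np.zeros((6, 6), dtype=int)
A[np.ix_([0, 1, 2], [0, 1, 2])] = 1
A[np.ix_([3, 4, 5], [3, 4, 5])] = 1
np.fill_diagonal(A, 0)
print(mcmc(A, K=2).round(2))

    A real sampler would also have to address label switching and mixing diagnostics; the averaging at the end is only meant to show the shape of the output.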

    Deterministic and Probabilistic Risk Management Approaches in Construction Projects: A Systematic Literature Review and Comparative Analysis

    Risks and uncertainties are inevitable in construction projects and can drastically change the expected outcome, negatively impacting the project’s success. However, risk management (RM) is still conducted in a manual, largely ineffective, and experience-based fashion, hindering automation and knowledge transfer in projects. The construction industry is benefiting from the recent Industry 4.0 revolution and from advancements in data science branches, such as artificial intelligence (AI), for the digitalization and optimization of processes. Data-driven methods, e.g., AI and machine learning algorithms, Bayesian inference, and fuzzy logic, are being widely explored as possible solutions to shortcomings in the RM domain. These methods use deterministic or probabilistic risk reasoning approaches: the former produces a single fixed predicted value, while the latter incorporates uncertainty, causal dependencies, and inference between the variables affecting project risk into the predicted value. This research used a systematic literature review method with the objective of investigating and comparatively analyzing the main deterministic and probabilistic methods applied to construction RM with respect to scope, primary applications, advantages, disadvantages, limitations, and proven accuracy. The findings establish recommendations for optimal AI-based frameworks at different management levels (enterprise, project, and operational) and for large or small data sets.
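
    The distinction between the two reasoning approaches can be made concrete with a toy example (all figures below are assumptions, not data from the reviewed studies): a deterministic estimate multiplies point values into one fixed number, while a probabilistic estimate propagates input uncertainty through a Monte Carlo simulation and reports a distribution.

# Illustrative contrast between deterministic and probabilistic risk estimates
# (hypothetical numbers, not taken from the review).
import numpy as np

rng = np.random.default_rng(1)

# Deterministic: expected impact = probability * consequence, as single numbers.
p_delay, cost_if_delay = 0.30, 120_000           # assumed point values
deterministic_impact = p_delay * cost_if_delay   # one fixed predicted value
print(f"deterministic impact: {deterministic_impact:,.0f}")

# Probabilistic: treat both inputs as uncertain and simulate.
n = 100_000
p_samples = rng.beta(3, 7, size=n)                      # uncertainty about the probability
cost_samples = rng.lognormal(np.log(120_000), 0.4, n)   # uncertainty about the consequence
occurs = rng.uniform(size=n) < p_samples                # does the delay occur in this draw?
losses = np.where(occurs, cost_samples, 0.0)
print(f"mean impact: {losses.mean():,.0f}, "
      f"95th percentile: {np.percentile(losses, 95):,.0f}")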

    On the role of pre and post-processing in environmental data mining

    The quality of discovered knowledge depends heavily on data quality. Unfortunately, real data often contain noise, uncertainty, errors, redundancies, or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of obtaining low-quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form so that correct analyses can be performed. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but also on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. In this paper, some details about how this can be achieved are provided. The role of pre- and post-processing in the overall process of knowledge discovery in environmental systems is discussed.
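
    As a small, hypothetical illustration of what such pre- and post-processing can look like in an environmental setting (the table, column names, and thresholds below are assumptions, not the paper's case study), the sketch cleans a raw monitoring table and then renders the result as a plain-language summary for the end user.

# Pre-processing: remove duplicates and irrelevant columns, flag implausible
# readings, impute missing values. Post-processing: turn numbers into a
# readable summary. All data here are made up for illustration.
import numpy as np
import pandas as pd

raw = pd.DataFrame({
    "station": ["A", "A", "B", "B", "B"],
    "temp_c": [14.2, np.nan, 15.1, 15.1, 180.0],      # a missing value and an implausible reading
    "flow_m3s": [2.1, 2.3, 0.9, 0.9, 1.0],
    "operator_note": ["ok", "ok", "ok", "ok", "ok"],  # irrelevant for the analysis
})

# Pre-processing
clean = raw.drop_duplicates().drop(columns=["operator_note"])
clean.loc[~clean["temp_c"].between(-40, 50), "temp_c"] = np.nan   # physically impossible -> missing
clean["temp_c"] = clean.groupby("station")["temp_c"].transform(lambda s: s.fillna(s.mean()))

# Post-processing: communicate the result in an understandable form
summary = clean.groupby("station").agg(mean_temp=("temp_c", "mean"),
                                       mean_flow=("flow_m3s", "mean"))
for station, row in summary.iterrows():
    print(f"Station {station}: average temperature {row.mean_temp:.1f} °C, "
          f"average flow {row.mean_flow:.1f} m³/s")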

    Mapping Topographic Structure in White Matter Pathways with Level Set Trees

    Fiber tractography on diffusion imaging data offers rich potential for describing white matter pathways in the human brain, but characterizing the spatial organization of these large and complex data sets remains a challenge. We show that level set trees, which provide a concise representation of the hierarchical mode structure of probability density functions, offer a statistically principled framework for visualizing and analyzing topography in fiber streamlines. Using diffusion spectrum imaging data collected on neurologically healthy controls (N=30), we mapped white matter pathways from the cortex into the striatum using a deterministic tractography algorithm that estimates fiber bundles as dimensionless streamlines. Level set trees were used for interactive exploration of patterns in the endpoint distributions of the mapped fiber tracks and for an efficient segmentation of the tracks with empirical accuracy comparable to standard nonparametric clustering methods. We show that level set trees can also be generalized to model pseudo-density functions in order to analyze a broader array of data types, including entire fiber streamlines. Finally, resampling methods show the reliability of the level set tree as a descriptive measure of topographic structure, illustrating its potential as a statistical descriptor in brain imaging analysis. These results highlight the broad applicability of level set trees for visualizing and analyzing high-dimensional data such as fiber tractography output.
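
    A minimal one-dimensional sketch of the level set tree idea follows (it is not the paper's tractography pipeline): estimate a density, then scan a grid of levels λ and record how the upper level set {x : f̂(x) ≥ λ} splits into connected components; the nesting of those components across levels is the structure a level set tree summarizes.

# 1-D sketch of hierarchical mode structure via upper level sets of a kernel
# density estimate. The data are simulated from a two-mode mixture.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(2, 0.7, 300)])

grid = np.linspace(data.min() - 1, data.max() + 1, 1000)
f = gaussian_kde(data)(grid)

def upper_level_components(f, level):
    """Return [start, end) index pairs of maximal runs where f >= level."""
    comps, start = [], None
    for i, above in enumerate(f >= level):
        if above and start is None:
            start = i
        elif not above and start is not None:
            comps.append((start, i))
            start = None
    if start is not None:
        comps.append((start, len(f)))
    return comps

# Scan levels from low to high and report how the upper level set breaks apart.
for level in np.linspace(0.02, f.max() * 0.95, 8):
    comps = upper_level_components(f, level)
    spans = [(round(float(grid[a]), 1), round(float(grid[b - 1]), 1)) for a, b in comps]
    print(f"lambda={level:.3f}: {len(comps)} component(s) {spans}")

    Components that persist over a wide range of levels correspond to well-separated modes; a full level set tree records the split points and branch lifetimes rather than just the component counts printed here.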

    Interaction Analysis in Smart Work Environments through Fuzzy Temporal Logic

    Interaction analysis is defined as the generation of situation descriptions from machine perception. World models created through machine perception are used by a reasoning engine based on fuzzy metric temporal logic and situation graph trees, with optional parameter learning and clustering as preprocessing, to deduce knowledge about the observed scene. The system is evaluated in a case study on automatic behavior report generation for staff training purposes in crisis response control rooms.
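
    As a rough sketch of how graded predicates and temporal aggregation fit together (the membership function, predicate names, and data below are assumptions rather than the system described above), a fuzzy predicate is evaluated at each time step and temporal operators combine the resulting truth degrees.

# Fuzzy evaluation of a spatial predicate over time, with simple temporal
# aggregation operators. All values here are hypothetical.
import numpy as np

def mu_close(distance_m, full_at=1.0, zero_at=3.0):
    """Fuzzy membership for 'two agents are close': 1 below full_at, 0 above zero_at."""
    return float(np.clip((zero_at - distance_m) / (zero_at - full_at), 0.0, 1.0))

def always(degrees):
    """'Holds throughout the interval' via the minimum (Gödel t-norm)."""
    return min(degrees)

def eventually(degrees):
    """'Holds at some point in the interval' via the maximum (t-conorm)."""
    return max(degrees)

# Distances between two tracked staff members at times t = 0..5 (made-up data).
distances = [2.8, 2.1, 1.4, 0.9, 1.1, 2.6]
degrees = [mu_close(d) for d in distances]

print("per-step degrees:", [round(d, 2) for d in degrees])
print("'always close' over the interval:", round(always(degrees), 2))
print("'eventually close' over the interval:", round(eventually(degrees), 2))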
