Police culture: does culture prevent proper policing?
Master's Project (M.A.), University of Alaska Fairbanks, 2018. This project identifies the key issues that police officers face in today's society. There is an emphasis on community policing and on adjusting police training to account for the strong pull of the police subculture. The main purpose of this project is to strengthen the bonds between the police and the community and to change how officers approach their interactions within the community. The end goal is to alleviate community concerns that police officers are out to get them, while also alleviating officers' concerns that the community hates them. This project attempts to quell those concerns while proposing a solution that benefits officers, the police department, and the community alike.
NodeTrix: Hybrid Representation for Analyzing Social Networks
The need to visualize large social networks is growing as hardware capabilities make analyzing large networks feasible and many new data sets become available. Unfortunately, the visualizations in existing systems do not satisfactorily answer the basic dilemma of being readable both for the global structure of the network and also for detailed analysis of local communities. To address this problem, we present NodeTrix, a hybrid representation for networks that combines the advantages of two traditional representations: node-link diagrams are used to show the global structure of a network, while arbitrary portions of the network can be shown as adjacency matrices to better support the analysis of communities. A key contribution is a set of interaction techniques. These allow analysts to create a NodeTrix visualization by dragging selections from either a node-link or a matrix, flexibly manipulate the NodeTrix representation to explore the dataset, and create meaningful summary visualizations of their findings. Finally, we present a case study applying NodeTrix to the analysis of the InfoVis 2004 coauthorship dataset to illustrate the capabilities of NodeTrix as both an exploration tool and an effective means of communicating results.
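The core idea, substituting an adjacency-matrix view for a selected community while the rest of the network stays a node-link diagram, can be sketched with a plain dictionary graph (the names and data structure here are illustrative, not the NodeTrix implementation):

```python
def adjacency_matrix(graph, community):
    """Build the adjacency-matrix view for a user-selected community;
    `graph` maps each node to the set of its neighbours."""
    nodes = sorted(community)
    return nodes, [[1 if v in graph[u] else 0 for v in nodes] for u in nodes]

# A toy coauthorship network: edges inside the dense community
# {a, b, c} are easier to read as a matrix than as crossing links.
graph = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"c", "e"}, "e": {"d"},
}
nodes, matrix = adjacency_matrix(graph, {"a", "b", "c"})
print(nodes)          # ['a', 'b', 'c']
for row in matrix:
    print(row)
```

A NodeTrix-style view would render `matrix` as a grid for the selected community and keep `d` and `e` as ordinary node-link elements outside it.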
Acquiring Targets in the Velocity Domain: Toward Predictive Modeling of Virtual Tossing
Tossing, throwing, or flicking objects in a user interface or virtual environment can be used as a faster, lower-precision alternative to traditional pointing; however, there is currently no predictive model of user performance with tossing. We report experimental measurements of performance in a 1D tossing task from which a predictive model is derived. We consider a simplified form of tossing where a virtual object on a horizontal surface is accelerated and released, and then decelerates under friction, coming to rest at some final position. The distance traveled after release is determined by the release velocity as well as by the friction model used. To abstract away the details of the friction model, our experiment measures the ability of users to accelerate and release a virtual object in 1D (using a mouse) with a given target velocity, with target velocities varying from 6.25 cm/s to 1 m/s. Results indicate that there is a linear relationship between the target release velocity and the standard deviation of the release velocity achieved by the user. We also propose an automatic release technique (instead of requiring the user to manually release using a mouse button) that significantly improves precision. The model derived from our experiment predicts that a user should be able to toss at three different target speeds (effectively tossing toward target locations at three different distances) with an error rate under 4%. We also predict that having four or more targets in the same direction would cause the error rate to rise above 10%. Design implications for integrating tossing into graphical user interfaces are discussed.
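The error-rate prediction above follows from the linear noise model: if the achieved release velocity is roughly Gaussian around the target velocity, with a standard deviation proportional to it, then the chance of landing outside the correct target's velocity band comes straight from the Gaussian CDF. A minimal sketch, using a hypothetical noise slope rather than the paper's fitted value:

```python
import math

def gaussian_cdf(x, mu, sigma):
    """CDF of a normal distribution N(mu, sigma) at x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def toss_error_rate(target_v, band_lo, band_hi, noise_slope=0.1):
    """Probability that a toss aimed at target_v lands outside
    [band_lo, band_hi], assuming the achieved release velocity is
    Gaussian with sd = noise_slope * target_v (hypothetical slope)."""
    sigma = noise_slope * target_v
    p_inside = gaussian_cdf(band_hi, target_v, sigma) - gaussian_cdf(band_lo, target_v, sigma)
    return 1.0 - p_inside

# Three equally spaced target speeds: each target's velocity band
# extends halfway to its neighbours.
for v in (25.0, 50.0, 75.0):  # cm/s
    err = toss_error_rate(v, v - 12.5, v + 12.5)
    print(f"target {v:.0f} cm/s -> predicted error {err:.1%}")
```

Because the noise standard deviation grows with speed while the band width stays fixed, the predicted error rate rises for faster targets, which is why packing four or more targets into one direction pushes the error rate up.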
Evolutionary rewiring of bacterial regulatory networks
Bacteria have evolved complex regulatory networks that enable integration of multiple intracellular and extracellular signals to coordinate responses to environmental changes. However, our knowledge of how regulatory systems function and evolve is still relatively limited. There is often extensive homology between components of different networks, due to past cycles of gene duplication, divergence, and horizontal gene transfer, raising the possibility of cross-talk or redundancy. Consequently, evolutionary resilience is built into gene networks – homology between regulators can potentially allow rapid rescue of lost regulatory function across distant regions of the genome. In our recent study [Taylor et al., Science (2015), 347(6225)] we find that mutations that facilitate cross-talk between pathways can contribute to gene network evolution, but that such mutations come with severe pleiotropic costs. Arising from this work are a number of questions surrounding how this phenomenon occurs.
Latency management in scribble-based interactive segmentation of medical images
Objective: During an interactive image segmentation task, the outcome is strongly influenced by human factors. In particular, a reduction in computation time does not guarantee an improvement in the overall segmentation time. This paper characterizes user efficiency during scribble-based interactive segmentation as a function of computation time. Methods: We report a controlled experiment with users who experienced eight different levels of simulated latency (ranging from 100 to 2000 ms) with two techniques for refreshing visual feedback (either automatic, where the segmentation was recomputed and displayed continuously during label drawing, or user initiated, where it was recomputed and displayed only when the user pressed a dedicated button). Results: For short latencies, the user's attention is focused on the automatic visual feedback, slowing down their labeling performance. This effect is attenuated as the latency grows larger, and the two refresh techniques yield similar user performance at the largest latencies. Moreover, during the segmentation task, participants spent on average 72.67% ± 2.42% (automatic refresh) and 96.23% ± 0.06% (user-initiated refresh) of the overall segmentation time interpreting the results. Conclusion: Latency is perceived differently according to the refresh method used during the segmentation task, so it is possible to reduce its impact on user performance. Significance: This is the first study to investigate the effects of latency in an interactive segmentation task. The analysis and recommendations provided in this paper help in understanding the cognitive mechanisms at play in interactive image segmentation.
A Generalized Graph Reduction Framework for Interactive Segmentation of Large Images
The speed of graph-based segmentation approaches, such as random walker (RW) and graph cut (GC), depends strongly on image size. For high-resolution images, the time required to compute a segmentation based on user input renders interaction tedious. We propose a novel method, using an approximate contour sketched by the user, to reduce the graph before passing it on to a segmentation algorithm such as RW or GC. This enables a significantly faster feedback loop. The user first draws a rough contour of the object to segment. Then, the pixels of the image are partitioned into “layers” (corresponding to different scales) based on their distance from the contour. The thickness of these layers increases with distance to the contour according to a Fibonacci sequence. An initial segmentation result is rapidly obtained after automatically generating foreground and background labels according to a specifically selected layer; all vertices beyond this layer are eliminated, restricting the segmentation to regions near the drawn contour. Further foreground/background labels can then be added by the user to refine the segmentation. All iterations of the graph-based segmentation benefit from a reduced input graph, while maintaining full resolution near the object boundary. A user study with 16 participants was carried out for RW segmentation of a multi-modal dataset of 22 medical images, using either a standard mouse or a stylus pen to draw the contour. Results reveal that our approach significantly reduces the overall segmentation time compared with the status quo approach (p < 0.01). The study also shows that our approach works well with both input devices. Compared to super-pixel graph reduction, our approach provides full resolution accuracy at similar speed on a high-resolution benchmark image with both RW and GC segmentation methods. However, graph reduction based on super-pixels does not allow interactive correction of clustering errors. Finally, our approach can be combined with super-pixel clustering methods for further graph reduction, resulting in even faster segmentation.
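The distance-based layer partition described above can be sketched as follows; taking the layer thicknesses to be the Fibonacci sequence and the boundaries to be their cumulative sums is one illustrative reading of the scheme, not the authors' exact parameters:

```python
def fibonacci_layer_bounds(n_layers):
    """Cumulative outer boundaries (in pixels) of n_layers layers
    whose thicknesses follow the Fibonacci sequence 1, 1, 2, 3, 5, ..."""
    a, b = 1, 1
    bounds, total = [], 0
    for _ in range(n_layers):
        total += a
        bounds.append(total)
        a, b = b, a + b
    return bounds

def layer_of(distance, bounds):
    """Index of the layer containing a pixel at the given distance
    from the user-drawn contour; pixels beyond the last boundary
    would be eliminated from the graph."""
    for i, outer in enumerate(bounds):
        if distance < outer:
            return i
    return None  # too far from the contour: dropped before segmentation

bounds = fibonacci_layer_bounds(6)   # [1, 2, 4, 7, 12, 20]
print(layer_of(0, bounds), layer_of(5, bounds), layer_of(30, bounds))
```

Because thicknesses grow geometrically with distance, the reduced graph keeps per-pixel resolution right at the contour while coarsening aggressively far from it, which is where the speedup comes from.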
QMEAN server for protein model quality estimation
Model quality estimation is an essential component of protein structure prediction, since ultimately the accuracy of a model determines its usefulness for specific applications. Usually, in the course of protein structure prediction a set of alternative models is produced, from which the most accurate model subsequently has to be selected. The QMEAN server provides access to two scoring functions successfully tested at the eighth round of the community-wide blind test experiment CASP. The user can choose between the composite scoring function QMEAN, which derives a quality estimate on the basis of the geometrical analysis of single models, and the clustering-based scoring function QMEANclust, which calculates a global and local quality estimate based on a weighted all-against-all comparison of the models from the ensemble provided by the user. The web server performs a ranking of the input models and highlights potentially problematic regions for each model. The QMEAN server is available at http://swissmodel.expasy.org/qmean.
Evolutionary resurrection of flagellar motility via rewiring of the nitrogen regulation system
A central process in evolution is the recruitment of genes to regulatory networks. We engineered immotile strains of the bacterium Pseudomonas fluorescens that lack flagella due to deletion of the regulatory gene fleQ. Under strong selection for motility, these bacteria consistently regained flagella within 96 hours via a two-step evolutionary pathway. Step 1 mutations increase intracellular levels of phosphorylated NtrC, a distant homologue of FleQ, which begins to commandeer control of the fleQ regulon at the cost of disrupting nitrogen uptake and assimilation. Step 2 is a switch-of-function mutation that redirects NtrC away from nitrogen uptake and towards its novel function as a flagellar regulator. Our results demonstrate that natural selection can rapidly rewire regulatory networks in very few, repeatable mutational steps.