Parent's use of strategies to monitor children's activities online
Thesis (M.Com. (Information Systems))--University of the Witwatersrand, Faculty of Commerce, Law and Management, School of Economic and Business Sciences, 2017.
Although studies have been conducted on the effectiveness of different types of filtering software, limited knowledge is available on parents' use of strategies to monitor their children's activities online. Understanding the strategies parents use to monitor children's activities online, and the extent to which parents use content filtering software, will therefore contribute to the body of knowledge. The purpose of this study is to understand parents' use of strategies to monitor children's activities online and the extent to which they use content filtering software in Gauteng Province, South Africa. The study adopted Social Cognitive Theory to develop a conceptual framework and identify existing theoretical concepts; the conceptual framework adapted Bandura's (2001) framework to inform data analysis.
Data were collected through semi-structured interviews, and qualitative thematic content analysis was used for data analysis. The results of the study indicated that parents do use various strategies to monitor children's activities online and draw on knowledge, experience, and social support as a rationale for using those strategies. The study further revealed a gap between parents, the technology industry, and government regarding the use of content filtering software. The study therefore recommends that parents, industry, and government work together to protect children online through various strategies and to address the concerns regarding the use of content filtering software. Parents need to understand the importance of content filtering software and discuss it with their children so that they can protect them online without restricting access to relevant information.
Keywords: Harmful content, blocking, strategies, filtering, online content, software, use, non-use
An examination of Internet filters in school library media centers
Internet filtering has become an issue since the introduction of the Internet in schools and libraries. Concerns about "inappropriate content" on the Internet have arisen, especially when children are involved. This study uses content analysis of messages from LM_NET (a popular mailing list for practicing school library media specialists) to examine common practices and decisions made by school library media specialists with respect to Internet filtering software. Alternatives to Internet filtering are also discussed.
Algorithms and Fundamental Rights: The Case of Automated Online Filters
The information that we see on the internet is increasingly tailored by automated ranking and filtering algorithms used by online platforms, which significantly interfere with the exercise of fundamental rights online, particularly the freedom of expression and information. The EU's regulation of the internet prohibits general monitoring obligations. The paper first analyses the CJEU's case law, which has long resisted attempts to require internet intermediaries to use automated software filters to remove infringing user uploads. This is followed by an analysis of Article 17 of the Directive on Copyright in the Digital Single Market, which effectively requires online platforms to use automated filtering to ensure the unavailability of unauthorized copyrighted content. The Commission's guidance and the AG's opinion in the annulment action are discussed. The conclusion is that regulation of the filtering algorithms themselves will be necessary to prevent private censorship and protect fundamental rights online.
The development of a portable Earth's field NMR system for the study of Antarctic sea ice : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Electronics at Massey University
A portable Nuclear Magnetic Resonance (NMR) spectrometer based on digital signal processor (DSP) technology has been developed and applied to the study of the structure of Antarctic sea ice. The portability of this system means that external sources of noise can be minimised and remote sites can be investigated. A new sea-ice probe has been developed in conjunction with the spectrometer, allowing in-situ measurement of water content, relaxation times and self-diffusion. The new probe minimises disturbances to the sea ice sample, which have been a problem with previous techniques. The core of the spectrometer consists of a Motorola DSP56303 DSP which controls the NMR experiment under the supervision of a host computer, in this case a PC laptop. Communication between host and DSP is via either a PCMCIA card or USB interface. DSP software runs the experiment, controls acquisition and performs digital filtering of the NMR data before sending it to the PC for analysis and display. The flexibility of the DSP-based core means that this system could be adapted to other control applications with relative ease.
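The on-DSP filtering stage described above can be illustrated with a minimal sketch (written in Python rather than DSP56303 assembly; the tap values are arbitrary placeholders, not the spectrometer's actual filter coefficients):

```python
# Minimal FIR filtering sketch: convolve a signal with a short
# moving-average kernel, as a stand-in for the decimating digital filter
# a DSP would apply to raw NMR data before uploading it to the host PC.

def fir_filter(signal, taps):
    """Convolve `signal` with FIR `taps`, keeping only fully valid samples."""
    n = len(taps)
    return [
        sum(taps[j] * signal[i + j] for j in range(n))
        for i in range(len(signal) - n + 1)
    ]

# 5-tap moving average: unity DC gain, attenuates fast fluctuations.
taps = [0.2] * 5
spike = [0.0, 0.0, 0.0, 10.0, 0.0, 0.0, 0.0]
print(fir_filter(spike, taps))  # the isolated spike is smeared across the window
```

A real spectrometer would use many more taps and fixed-point arithmetic, but the convolution structure is the same.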
Encore: Lightweight Measurement of Web Censorship with Cross-Origin Requests
Despite the pervasiveness of Internet censorship, we have scant data on its
extent, mechanisms, and evolution. Measuring censorship is challenging: it
requires continual measurement of reachability to many target sites from
diverse vantage points. Amassing suitable vantage points for longitudinal
measurement is difficult; existing systems have achieved only small,
short-lived deployments. We observe, however, that most Internet users access
content via Web browsers, and the very nature of Web site design allows
browsers to make requests to domains with different origins than the main Web
page. We present Encore, a system that harnesses cross-origin requests to
measure Web filtering from a diverse set of vantage points without requiring
users to install custom software, enabling longitudinal measurements from many
vantage points. We explain how Encore induces Web clients to perform
cross-origin requests that measure Web filtering, design a distributed platform
for scheduling and collecting these measurements, show the feasibility of a
global-scale deployment with a pilot study and an analysis of potentially
censored Web content, identify several cases of filtering in six months of
measurements, and discuss ethical concerns that would arise with widespread
deployment
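The collection-side inference such measurements require can be sketched roughly as follows; the report tuple format, region labels, and thresholds below are illustrative assumptions for the sketch, not Encore's actual schema:

```python
# Sketch of collector-side inference for Encore-style measurements:
# browsers report whether a cross-origin resource loaded, and the
# collector flags targets that consistently fail from one vantage region
# while loading fine from a control region. Field names and thresholds
# are illustrative assumptions, not Encore's actual design.
from collections import defaultdict

def filtering_suspects(reports, control="control", min_reports=3, fail_rate=0.9):
    """reports: iterable of (target_url, region, loaded) tuples."""
    stats = defaultdict(lambda: [0, 0])  # (target, region) -> [failures, total]
    for target, region, loaded in reports:
        s = stats[(target, region)]
        s[1] += 1
        if not loaded:
            s[0] += 1
    suspects = []
    for (target, region), (fails, total) in stats.items():
        if region == control or total < min_reports:
            continue
        cf, ct = stats.get((target, control), [0, 0])
        control_ok = ct > 0 and cf / ct < 0.1  # reachable from control region
        if control_ok and fails / total >= fail_rate:
            suspects.append((target, region))
    return suspects
```

Requiring a minimum report count and a reachable control baseline guards against mistaking transient outages or dead sites for filtering, which is the same confound the abstract's "diverse vantage points" requirement addresses.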
The Specification of Requirements in the MADAE-Pro Software Process
MADAE-Pro is an ontology-driven process for multi-agent domain and application engineering which promotes the construction and reuse of agent-oriented application families. This article introduces MADAE-Pro, emphasizing the description of its domain analysis and application requirements engineering phases and showing how software artifacts produced in the former are reused in the latter. Illustrating examples are extracted from two case studies we have conducted to evaluate MADAE-Pro. The first case study assesses the Multi-Agent Domain Engineering sub-process of MADAE-Pro through the development of ONTOSERS, a multi-agent system family of recommender systems supporting alternative (collaborative, content-based and hybrid) filtering techniques. The second evaluates the Multi-Agent Application Engineering sub-process of MADAE-Pro through the construction of InfoTrib, a Tax Law recommender system which provides recommendations based on new tax law information items using a content-based filtering technique. ONTOSERS and InfoTrib were modeled using ONTORMAS, a knowledge-based tool for supporting and automating the tasks of MADAE-Pro.
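The content-based filtering technique named above can be sketched minimally as follows; the item texts, scoring scheme, and function names are illustrative, not MADAE-Pro or InfoTrib code:

```python
# Minimal content-based filtering sketch: score candidate items by cosine
# similarity between term-frequency vectors of item text and a profile
# built from items the user already liked. Illustrative only; not
# MADAE-Pro/InfoTrib code, which is agent- and ontology-based.
import math
from collections import Counter

def tf_vector(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(items, liked_ids, top_n=1):
    """items: {item_id: text}. Recommend unseen items most like liked ones."""
    profile = Counter()
    for i in liked_ids:
        profile += tf_vector(items[i])
    scored = [(cosine(tf_vector(text), profile), iid)
              for iid, text in items.items() if iid not in liked_ids]
    return [iid for score, iid in sorted(scored, reverse=True)[:top_n]]
```

A collaborative filter would instead score items by the preferences of similar users, and a hybrid approach combines both signals; this is the design space the ONTOSERS family covers.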
Genes2Networks: Connecting Lists of Proteins by Using Background Literature-based Mammalian Networks
In recent years, in-silico literature-based mammalian protein-protein interaction network datasets have been developed. These datasets contain binary interactions extracted manually from legacy experimental biomedical research literature. Placing lists of genes or proteins identified as significantly changing in multivariate experiments in the context of this background knowledge about binary interactions can situate those genes or proteins within pathways and protein complexes.
Genes2Networks is a software system that integrates the content of ten mammalian literature-based interaction network datasets. Filtering to prune low-confidence interactions was implemented. Genes2Networks is delivered as a web-based service using AJAX. The system can be used to extract relevant subnetworks created from "seed" lists of human Entrez gene names. The output includes a dynamic, linkable, three-color web-based network map, with a statistical analysis report that identifies significant intermediate nodes used to connect the seed list. Genes2Networks is available at http://actin.pharm.mssm.edu/genes2networks.
Genes2Networks is a powerful web-based software application that can help experimental biologists interpret high-throughput results from genomics and proteomics studies, where the output of these experiments is a list of significantly changing genes or proteins. The system can be used to find relationships between nodes from the seed list and to predict novel nodes that play a key role in a common function.
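The seed-list subnetwork extraction described above can be sketched in a few lines; the toy interaction network and gene names below are made up for illustration and are not drawn from the ten integrated datasets:

```python
# Sketch of seed-list subnetwork extraction in the Genes2Networks style:
# walk a background interaction network outward from each seed up to a
# fixed hop limit, and keep the non-seed nodes that connect at least two
# seeds. The edges and depth limit here are illustrative assumptions.
from collections import Counter, deque

def intermediates(edges, seeds, max_depth=2):
    """Non-seed nodes within max_depth hops of at least two seeds."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    def within(seed):
        # breadth-first search, capped at max_depth hops from the seed
        dist = {seed: 0}
        queue = deque([seed])
        while queue:
            node = queue.popleft()
            if dist[node] == max_depth:
                continue
            for nbr in graph.get(node, ()):
                if nbr not in dist:
                    dist[nbr] = dist[node] + 1
                    queue.append(nbr)
        return set(dist)

    hits = Counter()
    for s in set(seeds):
        hits.update(within(s))
    return {n for n, c in hits.items() if c >= 2 and n not in seeds}
```

The real system additionally weights intermediate nodes by a statistical significance test against the background network; the hop-limited connectivity search above is only the skeleton of that procedure.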
Protecting Teens Online
Presents findings from a survey conducted between October and November 2004. Looks at the growth in the use of filters to limit access to potentially harmful content online in internet-using households with teenagers aged 12-17.
An Evaluation of Popular Copy-Move Forgery Detection Approaches
A copy-move forgery is created by copying and pasting content within the same image, and potentially post-processing it. In recent years, the detection of copy-move forgeries has become one of the most actively researched topics in blind image forensics. A considerable number of different algorithms have been proposed, focusing on different types of post-processed copies. In this paper, we aim to answer which copy-move forgery detection algorithms and processing steps (e.g., matching, filtering, outlier detection, affine transformation estimation) perform best in various post-processing scenarios. The focus of our analysis is to evaluate the performance of previously proposed feature sets. We achieve this by casting existing algorithms in a common pipeline. In this paper, we examined the 15 most prominent feature sets. We analyzed the detection performance on a per-image basis and on a per-pixel basis. We created a challenging real-world copy-move dataset, and a software framework for systematic image manipulation. Experiments show that the keypoint-based features SIFT and SURF, as well as the block-based DCT, DWT, KPCA, PCA and Zernike features, perform very well. These feature sets exhibit the best robustness against various noise sources and downsampling, while reliably identifying the copied regions.
Comment: Main paper: 14 pages, supplemental material: 12 pages; main paper appeared in IEEE Transactions on Information Forensics and Security.
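The common pipeline the paper evaluates (feature extraction, matching, filtering) can be sketched in its simplest block-based form; exact-match signatures and a shift-vector vote stand in for the DCT/PCA features and robust estimation real detectors use:

```python
# Minimal block-based copy-move detection sketch: extract overlapping
# blocks, match identical block signatures, then keep only shift vectors
# that recur often enough (a crude stand-in for the filtering and
# outlier-detection stages; real pipelines use robust features such as
# DCT, PCA or Zernike moments to survive post-processing).
from collections import defaultdict

def copy_move_shifts(image, block=2, min_votes=3):
    """image: 2D list of grayscale values. Returns shift vectors whose
    identical-block matches occur at least min_votes times."""
    h, w = len(image), len(image[0])
    seen = defaultdict(list)  # block signature -> list of positions
    for y in range(h - block + 1):
        for x in range(w - block + 1):
            sig = tuple(tuple(image[y + dy][x + dx] for dx in range(block))
                        for dy in range(block))
            seen[sig].append((y, x))
    votes = defaultdict(int)
    for positions in seen.values():
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                (y1, x1), (y2, x2) = positions[i], positions[j]
                votes[(y2 - y1, x2 - x1)] += 1
    return {shift for shift, v in votes.items() if v >= min_votes}
```

Because a genuinely copied region produces many block pairs sharing one shift vector while chance matches scatter, the vote threshold is what separates forgeries from noise; this is the same intuition behind the affine-transformation estimation step for rotated or scaled copies.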