
    Online Learning Video Recommendation System Based on Course and Syllabus Using Content-Based Filtering

    Learning using video media, such as watching videos on YouTube, is an alternative learning method that is often used. However, so many learning videos are available that finding videos with the right content is difficult and time-consuming. Therefore, this study builds a recommendation system that can recommend videos based on courses and syllabi. The recommendation system works by computing the similarity between the course and syllabus text and video annotations using the cosine similarity method. The video annotation is the title and description of the video, captured in real time from YouTube using the YouTube API. The recommendation system produces five recommended videos based on the selected course and syllabus. The test results show an average performance of 81.13% in achieving the recommendation system's goals, namely relevance, novelty, serendipity, and increased recommendation diversity.
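The matching step this abstract describes can be sketched as follows. This is a minimal illustration, assuming a simple bag-of-words term-frequency representation and assumed field names (`title`, `description`); the authors' exact vectorization is not specified in the abstract.

```python
import math
from collections import Counter

def cosine_similarity(a, b):
    """Cosine similarity between two texts as bag-of-words vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(syllabus, videos, k=5):
    """Rank video annotations (title + description) against the
    syllabus text and return the top-k videos."""
    scored = [(cosine_similarity(syllabus, v["title"] + " " + v["description"]), v)
              for v in videos]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [v for _, v in scored[:k]]
```

In practice the annotations would be fetched live through the YouTube Data API, as the abstract notes.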

    Video Recommendation System for YouTube Considering Users Feedback

    YouTube is the most popular video-sharing and viewing platform in the world. Because users have many different tastes, hundreds of categories of videos can be found on YouTube, with thousands of videos in each. So, when the site recommends videos for a user, it must account for factors that meet the user's needs. Most of the time a user watches videos related to the previously watched video, but a user's mood changes with time or weather. A user may not listen to a song all year but may search for it on a rainy day; a user may watch one type of video during the day, another type at night, and yet another at midnight. In this paper, we propose a recommendation system that considers attributes such as weather, time, and month to understand the dynamic mood of a user. Each attribute is assigned a weight calculated by surveying YouTube users: the most recently viewed videos are assigned a heavy weight and weather a lower one. This recommendation system will make YouTube more user-friendly, dynamic, and acceptable.
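The weighted-attribute scoring this abstract proposes could look roughly like the sketch below. The numeric weights here are hypothetical placeholders; the paper derives its actual weights from a user survey, with recent watch history weighted heaviest and weather lowest.

```python
# Hypothetical weights (illustrative only; the paper's weights come
# from a survey of YouTube users).
WEIGHTS = {"history": 0.5, "time_of_day": 0.2, "month": 0.15, "weather": 0.15}

def mood_score(attribute_match):
    """Combine per-attribute match scores (each in [0, 1]) into a single
    ranking score via a weighted sum over the attribute weights."""
    return sum(WEIGHTS[attr] * attribute_match.get(attr, 0.0) for attr in WEIGHTS)
```

A candidate video whose attributes all match the user's current context scores 1.0; one matching only the watch history scores 0.5 under these placeholder weights.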

    An Educational Intervention to Foster Interest in Sustainable Design

    Issues compromising the educational role of video social networks include individual, social, and structural factors. We propose an educational intervention using online videos and implementation intentions, coping with these issues to foster interest in Sustainable Design. Based on semantic and sentiment analysis of top YouTube videos related to Sustainable Design, we created a recommendation system in English and Spanish. We compared the system's performance with a YouTube search, with participants divided into three groups. One group used YouTube while the other two used the recommendation system; the third group additionally practiced an implementation-intentions activity, and some participants used an eye-tracking device. Precision and recall in English were slightly lower for the system than for YouTube, but the variety of recommended videos increased. In Spanish, precision and recall were higher for the system, increasing the number of videos from Spanish-speaking countries. After two months, there was a significant difference in the number of Sustainable Design projects between subjects of the full intervention group and the control group, and post-intervention interest was a significant predictor of projects. Testing with a larger number of participants over longer periods of time is recommended.
    Conference: Tsukuba Global Science Week 2018, "Art and Design Research for Sustainable Development"; Date: September 20-22, 2018; Venue: Tsukuba International Congress Center; Sponsor: University of Tsukuba. This research was funded by the Rotary Yoneyama Scholarship Association.

    Understanding the Social Mechanism of Cancer Misinformation Spread on YouTube and Lessons Learned: Infodemiological Study

    Background: A knowledge gap exists between the list of required actions and the action plan for countering cancer misinformation on social media. Little attention has been paid to a social media strategy for disseminating factual information while also disrupting misinformation on social media networks. Objective: The aim of this study was to, first, identify the spread structure of cancer misinformation on YouTube. We asked the question, "How do YouTube videos play an important role in spreading information about the self-administration of anthelmintics for dogs as a cancer medicine for humans?" Second, the study aimed to suggest an action strategy for disrupting misinformation diffusion on YouTube by exploiting the network logic of YouTube information flow and the recommendation system. We asked the question, "What would be a feasible and effective strategy to block cancer misinformation diffusion on YouTube?" Methods: The study used the YouTube case of the self-administration of anthelmintics for dogs as an alternative cancer medicine in South Korea. We gathered Korean YouTube videos about the self-administration of fenbendazole. Using the YouTube application programming interface for the query "fenbendazole," 702 videos from 227 channels were compiled. Then, videos with at least 50,000 views, uploaded between September 2019 and September 2020, were selected from the collection, resulting in 90 videos. Finally, 10 recommended videos for each of the 90 videos were compiled, totaling 573 videos. Social network visualization of the recommended videos was used to identify three intervention strategies for disrupting the YouTube misinformation network. Results: The study found evidence of complex contagion by human and machine recommendation systems. By exposing stakeholders to multiple information sources on fenbendazole self-administration and by linking them through a recommendation algorithm, YouTube has become the perfect infrastructure for reinforcing the belief that fenbendazole can cure cancer, despite government warnings about the risks and dangers of self-administration. Conclusions: Health authorities should upload pertinent information through multiple channels and should exploit the existing YouTube recommendation algorithm to disrupt the misinformation network. Considering the viewing habits of patients and caregivers, the direct use of YouTube hospital channels is more effective than the indirect use of YouTube news media channels or government channels that report public announcements and statements. Reinforcing through multiple channels is the key.
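The video-selection step in the Methods above (at least 50,000 views, uploaded between September 2019 and September 2020) can be sketched as a simple filter over results fetched via the YouTube Data API. The `views` and `uploaded` field names are assumptions for this sketch, not the API's actual response schema.

```python
from datetime import datetime

def select_videos(videos, min_views=50_000,
                  start="2019-09-01", end="2020-09-30"):
    """Apply the study's selection criteria to a collection of videos
    (e.g., results of a YouTube Data API search for "fenbendazole"):
    keep videos with enough views uploaded inside the study window."""
    lo, hi = datetime.fromisoformat(start), datetime.fromisoformat(end)
    return [v for v in videos
            if v["views"] >= min_views
            and lo <= datetime.fromisoformat(v["uploaded"]) <= hi]
```

Applied to the 702 collected videos, a filter of this shape yields the 90-video subset the study analyzes further.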

    A Study of YouTube recommendation graph based on measurements and stochastic tools

    The YouTube recommendation list is one of the most important sources of views for a video. In this paper, we focus on the role of the recommendation system in boosting the popularity of videos. We first construct a graph that captures the recommendation system in YouTube and study empirically the relationship between the number of views of a video and the average number of views of the videos in its recommendation list. We then consider a random walker on the recommendation graph, i.e., a random user who browses through videos such that the video they choose to watch is selected randomly among the videos in the recommendation list of the previous video they watched. We study the stability properties of this random process and show that the trajectory obtained does not contain cycles if the number of videos in the recommendation list is small (which is the case if the computer's screen is small). Index Terms: analysis of recommendation systems, drift stability analysis, YouTube.
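The random walker described above has a direct simulation: represent the recommendation graph as an adjacency map and repeatedly pick the next video uniformly at random from the current video's recommendation list. This is a minimal sketch of the process the paper analyzes, not the authors' measurement code.

```python
import random

def random_walk(graph, start, steps, seed=0):
    """Simulate a user who always picks the next video uniformly at
    random from the recommendation list of the current one.
    `graph` maps a video id to its list of recommended video ids."""
    rng = random.Random(seed)
    trajectory = [start]
    node = start
    for _ in range(steps):
        recs = graph.get(node)
        if not recs:          # dead end: no recommendations to follow
            break
        node = rng.choice(recs)
        trajectory.append(node)
    return trajectory
```

Sampling many such trajectories and checking them for repeated videos is one way to probe empirically the cycle-freeness property the paper proves for short recommendation lists.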

    "It is just a flu": Assessing the Effect of Watch History on YouTube's Pseudoscientific Video Recommendations

    YouTube has revolutionized the way people discover and consume videos, becoming one of the primary news sources for Internet users. Since content on YouTube is generated by its users, the platform is particularly vulnerable to misinformative and conspiratorial videos. Worse, the role played by YouTube's recommendation algorithm in unwittingly promoting questionable content is not well understood and could compound the problem. This can have dire real-world consequences, especially when pseudoscientific content is promoted to users at critical times, e.g., during the COVID-19 pandemic. In this paper, we set out to characterize and detect pseudoscientific misinformation on YouTube. We collect 6.6K videos related to COVID-19, the flat earth theory, and the anti-vaccination and anti-mask movements; using crowdsourcing, we annotate them as pseudoscience, legitimate science, or irrelevant. We then train a deep learning classifier to detect pseudoscientific videos with an accuracy of 76.1%. Next, we quantify user exposure to this content on various parts of the platform (i.e., a user's homepage, recommended videos while watching a specific video, or search results) and how this exposure changes based on the user's watch history. We find that YouTube's recommendation algorithm is more aggressive in suggesting pseudoscientific content when users are searching for specific topics, while these recommendations are less common on a user's homepage or when actively watching pseudoscientific videos. Finally, we shed light on how a user's watch history substantially affects the type of recommended videos.
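The crowdsourced annotation step described above requires aggregating several workers' judgments per video into one gold label. Majority voting is a common choice for this; the abstract does not specify the authors' aggregation rule, so treat this as an assumed sketch.

```python
from collections import Counter

def majority_label(worker_labels):
    """Aggregate crowdworker annotations for one video into a single
    label ('pseudoscience', 'science', or 'irrelevant') by majority
    vote; ties are broken by first-seen order."""
    return Counter(worker_labels).most_common(1)[0][0]
```

The resulting labels would then serve as training targets for the classifier the paper trains.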

    "How over is it?" Understanding the Incel Community on YouTube

    YouTube is by far the largest host of user-generated video content worldwide. Alas, the platform has also come under fire for hosting inappropriate, toxic, and hateful content. One community that has often been linked to sharing and publishing hateful and misogynistic content is the Involuntary Celibates (Incels), a loosely defined movement ostensibly focusing on men's issues. In this paper, we set out to analyze the Incel community on YouTube by focusing on this community's evolution over the last decade and understanding whether YouTube's recommendation algorithm steers users towards Incel-related videos. We collect videos shared on Incel communities within Reddit and perform a data-driven characterization of the content posted on YouTube. Among other things, we find that the Incel community on YouTube is gaining traction and that, during the last decade, the number of Incel-related videos and comments rose substantially. We also find that users have a 6.3% chance of being suggested an Incel-related video by YouTube's recommendation algorithm within five hops when starting from a non-Incel-related video. Overall, our findings paint an alarming picture of online radicalization: not only is Incel activity increasing over time, but platforms may also play an active role in steering users towards such extreme content.
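The "6.3% chance within five hops" figure above can be estimated by simulation: walk the recommendation graph from benign starting videos and count how often a flagged video is reached within five recommendation hops. This is a hedged sketch of that measurement idea, not the authors' pipeline.

```python
import random

def hit_probability(graph, start_nodes, flagged, hops=5, trials=1000, seed=0):
    """Monte Carlo estimate of the chance that a random walk starting
    from a benign video reaches a flagged (e.g., Incel-related) video
    within `hops` recommendation steps."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        node = rng.choice(start_nodes)
        for _ in range(hops):
            recs = graph.get(node, [])
            if not recs:
                break
            node = rng.choice(recs)
            if node in flagged:
                hits += 1
                break
    return hits / trials
```

On the real crawled recommendation graph, an estimator of this shape would converge toward the reported 6.3% as the number of trials grows.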

    Assessing enactment of content regulation policies: A post hoc crowd-sourced audit of election misinformation on YouTube

    Full text link
    With the 2022 US midterm elections approaching, conspiratorial claims about the 2020 presidential elections continue to threaten users' trust in the electoral process. To regulate election misinformation, YouTube introduced policies to remove such content from its searches and recommendations. In this paper, we conduct a 9-day crowd-sourced audit on YouTube to assess the extent of enactment of these policies. We recruited 99 users who installed a browser extension that enabled us to collect up-next recommendation trails and search results for 45 videos and 88 search queries about the 2020 elections. We find that YouTube's search results, irrespective of search query bias, contain more videos that oppose rather than support election misinformation. However, watching misinformative election videos still leads users to a small number of misinformative videos in the up-next trails. Our results imply that while YouTube largely seems successful in regulating election misinformation, there is still room for improvement.

    Between broadcast yourself and broadcast whatever: YouTube’s homepage as a synthesis of its business strategy

    YouTube is a company representative of its original context, Web 2.0, that initially positioned itself as an open and collaborative platform to broadcast videos created by all kinds of users; its slogan was, and remains, Broadcast yourself. The acquisition of YouTube by Google drew it into the pursuit of profit within the framework of OTT (over-the-top) communication. The YouTube homepage reveals the successive business policies followed by the platform. In this work we analyze the evolution of the logged-out homepage of YouTube Spain between 2009 and 2018. We observe the gradual disappearance of videos produced by private users and their replacement by those made by professional users (youtubers) and cultural industries. There is a remarkable parallelism between the implementation of business models and the greater or lesser recommendation of videos from one kind of user or another.

    YouTube as a source of information about unproven drugs for Covid-19: the role of the mainstream media and recommendation algorithms in promoting misinformation

    In this study, we address how YouTube videos promote misinformation about hydroxychloroquine in Brazil. We follow two research questions. RQ1: How is pro-hydroxychloroquine content propagated on YouTube? RQ2: How does YouTube's recommendation system suggest videos about hydroxychloroquine on the platform? We use mixed methods (content analysis and social network analysis) to analyze 751 YouTube videos. We find that most pro-HCQ videos in our dataset are posted by mainstream media channels (RQ1) and that YouTube was more likely to recommend pro-HCQ videos than anti-HCQ videos (RQ2). Consequently, the Brazilian mainstream media and YouTube's algorithms fueled the spread of pro-HCQ content.