Journal of Telematics and Informatics
    147 research outputs found

    Financial LLM For Stock Price Analysis And Investment Recommendation

    No full text
    The integration of Large Language Models (LLMs) into financial analysis represents a transformative shift in how financial data is interpreted and utilized for market predictions and investment decisions. Traditionally, financial analysis has relied on structured quantitative data, such as stock prices, economic indicators, and company fundamentals. However, the sheer volume of unstructured textual data available from sources like financial news, analyst reports, and social media has made it increasingly important to integrate natural language processing (NLP) capabilities. This research explores the design and implementation of a novel Financial LLM framework that bridges the gap between traditional financial theories and advanced deep learning techniques, offering a more comprehensive and accurate approach to financial forecasting and decision-making. The framework developed in this study employs time series forecasting models, such as Long Short-Term Memory networks (LSTMs) and Temporal Fusion Transformers (TFTs), which are particularly adept at modeling temporal dependencies and predicting stock price movements. These models analyze historical stock prices and market trends to forecast short-term price fluctuations and volatility. The use of LSTMs, known for their ability to capture long-term dependencies in sequential data, and TFTs, which incorporate attention mechanisms to focus on the most relevant time periods, ensures that the model can handle the complexity and volatility of the financial market. In addition to the price prediction models, the framework also incorporates sentiment analysis, which plays a crucial role in understanding market dynamics. By using fine-tuned LLMs, such as FinBERT and BloombergGPT, the system processes unstructured textual data from a wide array of sources, including news articles, analyst reports, and even social media platforms. 
These sentiment models are trained to understand financial terminology and detect market sentiment, which can often be a critical driver of stock price movements. The sentiment scores generated by these models provide an additional layer of insight into market conditions, helping to predict future market behaviors that might not be immediately evident from the numerical data alone. Furthermore, the framework incorporates a variety of alternative datasets to enhance the predictive accuracy and depth of the analysis. These datasets include macroeconomic indicators, company fundamentals, search trends, and even satellite imagery. By including such diverse and unconventional data sources, the framework is better equipped to capture a broader range of market signals, such as consumer behavior trends, geopolitical events, or supply chain disruptions, which can significantly impact stock prices and market sentiment. The system also integrates sophisticated portfolio optimization techniques to recommend investment strategies. Reinforcement Learning (RL) is employed to optimize asset allocation, ensuring that the portfolio is balanced in a way that maximizes returns while minimizing risk. Modern Portfolio Theory (MPT) further helps in determining the ideal mix of assets to achieve the highest possible return for a given level of risk. These portfolio optimization models are crucial in helping investors manage risk and maximize their long-term returns, particularly in volatile market conditions. Risk management is an integral part of the proposed framework. The system embeds risk assessment modules that use the predicted volatility and market sentiment to calculate potential risks and assess the impact of sudden economic or geopolitical events. These modules ensure that the investment recommendations are robust, even during periods of market turbulence. 
For example, sudden shifts in market sentiment, as might be triggered by unexpected political events or economic crises, are factored into the risk models to help mitigate potential losses. The practical application of this framework is validated using real-world datasets. The system uses historical stock prices, macroeconomic indicators, and financial text corpora to train and validate the models. A robust backtesting mechanism is implemented to evaluate the performance of the predictive models and portfolio strategies. The backtesting process compares the model’s recommendations against benchmark indices like the S&P 500 Index, employing key performance metrics such as Sharpe Ratio, Annualized Return, and Maximum Drawdown. These metrics help to measure the risk-adjusted returns and overall effectiveness of the strategies, providing insights into their potential for real-world application. One of the key challenges addressed by the framework is data quality. Financial markets are often characterized by noisy, sparse, and incomplete data, which can significantly affect the performance of predictive models. The research discusses techniques for handling such challenges, ensuring that the system can process real-time market data while maintaining high levels of accuracy. Additionally, the system is designed to comply with regulatory standards, ensuring that its recommendations align with legal and ethical guidelines for financial decision-making. Overall, this research demonstrates the potential of Large Language Models in revolutionizing financial analysis. By combining the power of natural language processing with advanced financial modeling, the proposed framework offers a more holistic and accurate view of market conditions. 
The results show significant improvements in prediction accuracy, risk-adjusted returns, and interpretability, paving the way for innovative, scalable, and user-friendly applications that can be employed by both institutional and retail investors. As financial markets continue to evolve, the integration of LLMs offers a promising future for automated financial decision-making, helping investors navigate the complexities of the modern market with confidence.
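The backtesting metrics named in the abstract (Sharpe Ratio, Annualized Return, Maximum Drawdown) can be computed from a daily return series. The sketch below is a minimal illustration, not the study's implementation; the 252-trading-day annualization convention and any sample returns are assumptions:

```python
import math

def annualized_return(daily_returns, periods=252):
    """Geometric annualized return from a series of daily returns."""
    growth = 1.0
    for r in daily_returns:
        growth *= 1.0 + r
    return growth ** (periods / len(daily_returns)) - 1.0

def sharpe_ratio(daily_returns, risk_free=0.0, periods=252):
    """Annualized Sharpe ratio: mean excess return over its volatility."""
    excess = [r - risk_free / periods for r in daily_returns]
    mean = sum(excess) / len(excess)
    var = sum((r - mean) ** 2 for r in excess) / (len(excess) - 1)
    return mean / math.sqrt(var) * math.sqrt(periods)

def max_drawdown(daily_returns):
    """Largest peak-to-trough decline of the cumulative equity curve."""
    equity, peak, worst = 1.0, 1.0, 0.0
    for r in daily_returns:
        equity *= 1.0 + r
        peak = max(peak, equity)
        worst = max(worst, (peak - equity) / peak)
    return worst
```

A strategy's daily returns from the backtest would be passed to all three functions and compared against the same metrics computed on the S&P 500 benchmark series.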

    Vision-Aid for the Blind

    No full text
Vision is a beautiful gift that allows people to perceive and understand the surrounding world, and to this day blind people face great difficulty in their daily lives. In the presented work, a simple, low-cost, user-friendly virtual eye is designed and implemented to improve the mobility of blind and visually impaired people within a specific area. The work includes wearable equipment consisting of a head hat, a mini hand stick, and foot shoes that help the blind person navigate alone safely and avoid any fixed or moving obstacles that may be encountered, preventing possible accidents. The main component of the system is an ultrasonic sensor, which scans a predetermined area around the user by emitting waves and receiving their reflections. The reflected signals from obstacles are used as inputs to an Arduino microcontroller, which carries out the issued commands and communicates the status of the surroundings back to the user through earphones, playing audio files stored on an SD card. A GPS receiver is used for navigation, while a GSM module acts as a mobile phone that sends alerts when the blind person is in danger. The system is cheap, fast, and easy to use: an innovative, affordable solution for blind and visually impaired people in developing countries.
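The ultrasonic ranging principle behind the obstacle scan is simple: distance is half the pulse's round-trip time multiplied by the speed of sound. A minimal sketch of that conversion (the 343 m/s figure assumes air at roughly 20 °C, and the alert threshold is illustrative, not a value from the paper):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C

def echo_to_distance_cm(echo_time_us):
    """Convert an ultrasonic echo round-trip time (microseconds) to distance in cm."""
    seconds = echo_time_us / 1_000_000
    # Halve the round trip, then convert metres to centimetres.
    return (SPEED_OF_SOUND_M_S * seconds / 2) * 100

def obstacle_alert(echo_time_us, threshold_cm=100):
    """True when an obstacle is closer than the alert threshold."""
    return echo_to_distance_cm(echo_time_us) < threshold_cm
```

On the device, the same arithmetic would run on the Arduino with the echo time read from the sensor's echo pin, triggering the audio warning when the function returns true.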

    Development of an Automated Control System for Brown Sugar Chopping and Drying Machine Based on Outseal PLC Nano V.5

    No full text
This study presents the development of an automated control system for a brown sugar chopping and drying machine using Outseal PLC Nano V.5. The system significantly enhances chopping efficiency, reducing processing time by 45% compared to conventional methods. The drying process is optimized using automated temperature control with a PT100 sensor, ensuring a consistent moisture reduction of 80% in the final product. Regression analysis results show a strong correlation (R² = 0.92) between drying time and moisture reduction, confirming the system's effectiveness. The integrated automation successfully maintains a stable drying temperature, preventing overheating and preserving product quality. These advancements contribute to increased productivity, improved product consistency, and a more efficient brown sugar processing system. Future research can explore further refinements in airflow management to enhance drying uniformity. This system modernizes traditional brown sugar processing and supports automation for higher efficiency and sustainability.
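The reported R² = 0.92 is the coefficient of determination from a simple linear regression of moisture reduction on drying time. A minimal sketch of how such a fit is computed (the sample points in any usage are illustrative, not the study's measurements):

```python
def linear_fit(xs, ys):
    """Least-squares slope, intercept, and R-squared for simple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance and variance terms of the least-squares solution.
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    # R-squared: 1 minus residual sum of squares over total sum of squares.
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot
```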

    Comparative Study of Solar Panel Power Performance and Battery Charging Patterns of Lithium-Ion

    No full text
New Renewable Energy (known in Indonesia as EBT) is an environmentally friendly energy source that can be continuously replenished. One widely used EBT technology is the solar panel, which converts solar energy into electricity. This research analyzes battery charging patterns using solar panels with capacities of 5 WP, 10 WP, and 20 WP, together with a TP4056 charging module and an INA219 sensor to monitor voltage and current. The results show that the 20 WP panel charges the battery faster than the 5 WP and 10 WP panels. The average battery charging time in sunny weather is around 3 hours, while in cloudy weather it is around 6 hours. This shows that solar panels with higher capacity can increase the efficiency of the battery charging process. Keywords: New Renewable Energy, Solar Panels, Lithium-Ion Batteries, INA219 Sensor
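The observed charging times are consistent with a simple energy-balance estimate: charge time is roughly battery energy divided by effective panel power. The sketch below illustrates this; the battery capacity and the derating factors (lumping charge-controller losses and reduced irradiance) are assumptions for illustration, not parameters measured in the study:

```python
def charge_time_hours(battery_wh, panel_wp, derating=0.8):
    """Rough charge time: battery energy / (rated panel power x derating).

    derating lumps charging losses and non-ideal irradiance;
    lower it further for cloudy conditions.
    """
    effective_watts = panel_wp * derating
    return battery_wh / effective_watts

# Example with an assumed ~48 Wh battery bank on a 20 WP panel.
sunny = charge_time_hours(48, 20, derating=0.8)   # 3.0 h
cloudy = charge_time_hours(48, 20, derating=0.4)  # 6.0 h
```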

    Ghana Accountability For Learning Outcomes Projects (Galop) and its Impact On Academic Progression In Abuakwa North Municipality

    No full text
This study assesses the role of Ghana's Accountability for Learning Outcomes Project (GALOP) in students' academic progress in Abuakwa North Municipality. The respondents were GALOP teachers from nineteen (19) GALOP schools, totalling 140 teachers, of whom 120 responded. The study employed census sampling and a questionnaire to gather information from respondents, adopting a quantitative research approach with a descriptive cross-sectional design. The main analytical tools were frequency distributions, percentages, and linear regression. The findings revealed that GALOP positively impacted the quality of education in low-performing basic education schools. GALOP activities were also found to significantly predict learners' academic progression, explaining 14.2% of the variance in learners' academic progress. Based on these results, it is recommended that teachers be given adequate professional development programmes to enhance their abilities and teaching competencies, enabling them to implement GALOP activities effectively, since effective implementation of GALOP positively impacts learners' academic progress. The study further recommends that the Ghana Education Service (GES) consistently provide incentives to motivate teachers toward effective teaching.

    Sneak-App Quality Measurement Using FURPS Model and Euclidean Distance

    No full text
Ensuring the quality of Sneak-App is crucial for supporting the operations of Sneak Crafters, a Small and Medium Enterprise (SME) specializing in custom sneakers. This study evaluates the Sneak-App's quality using the FURPS model (Functionality, Usability, Reliability, Performance, Supportability) combined with the Euclidean Distance method. The assessment involved distributing questionnaires to 20 respondents, utilizing a Likert scale (1–4) to measure each FURPS sub-indicator. The results indicate that Sneak-App achieved a final quality score of 65%, categorizing it as a moderately good application. Among the five indicators, Reliability recorded the highest score (93%), indicating strong system stability, while Functionality (55%) and Supportability (56%) had the lowest scores, suggesting the need for improvements in feature completeness and system adaptability. These findings validate the effectiveness of the FURPS model and Euclidean Distance method in identifying software strengths and weaknesses. By addressing key areas for improvement, developers can enhance the application's overall usability and performance. This research provides valuable insights for future software development, emphasizing the need for continuous quality enhancement to optimize Sneak-App's role in automating business processes for SMEs.
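The combination of Likert-scale FURPS scores with Euclidean distance can be sketched as follows: each indicator's mean score is compared against the ideal (maximum) score, and the overall distance from the ideal is converted into a percentage. This is a minimal illustration of the general technique, not a reproduction of the study's exact aggregation formula:

```python
import math

def furps_quality(scores, ideal=4.0):
    """Quality %: how close the mean FURPS indicator scores sit to the ideal.

    scores: dict mapping indicator name -> mean Likert score on a 1-4 scale.
    Returns 100.0 when every indicator hits the ideal, 0.0 at maximum distance.
    """
    distance = math.sqrt(sum((ideal - s) ** 2 for s in scores.values()))
    # Worst case: every indicator scored 1 on the 1-4 Likert scale.
    max_distance = math.sqrt(len(scores)) * (ideal - 1.0)
    return (1 - distance / max_distance) * 100
```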

    Internet of Things Training Set: A Modern Approach to Learning

    No full text
The purpose of this research was to design and construct an Internet of Things (IoT) training set, whose quality and efficiency were then evaluated by experts. The sample consisted of students in the industrial technology programme, Faculty of Industrial Technology, who had studied the Internet of Things systems subject, selected by stratified sampling. The research instruments were the IoT training set, a quality evaluation form, and worksheets. The results revealed that the quality of the training set's content was at an excellent level (x̄ = 4.52) and the quality of the worksheets was at an excellent level (x̄ = 4.58). The efficiency was E1/E2 = 85.27/88.40, statistically higher than the 80/80 criterion at the 0.05 significance level, supporting the hypothesis that the efficiency would be not less than 80/80.
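The E1/E2 criterion compares process scores (worksheet exercises, E1) with outcome scores (post-test, E2), each expressed as a mean percentage of the maximum score. A minimal sketch of this convention; the scores and maximum below are illustrative numbers, not the study's raw data:

```python
def efficiency(scores, max_score):
    """Mean score as a percentage of the maximum (E1 for worksheets, E2 for post-test)."""
    return sum(scores) / len(scores) / max_score * 100

# Hypothetical worksheet and post-test scores, each out of 50.
e1 = efficiency([43, 42, 44], 50)  # process efficiency
e2 = efficiency([45, 44, 43], 50)  # outcome efficiency
meets_criterion = e1 >= 80 and e2 >= 80
```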

    Large Language Model and Retrieval-Augmented Generation Model for Indonesian Publication

    No full text
Garba Rujukan Digital (GARUDA) is a platform for publications and references covering scientific articles, journals, and theses in Indonesia. However, finding specific information across its many articles and journals requires a system that makes such information easier to retrieve. Therefore, a chatbot system based on a Large Language Model (LLM) with Retrieval-Augmented Generation (RAG) was developed to retrieve information from GARUDA's data. To evaluate the results of this study, a metric evaluation was carried out using the ROUGE score, with average values ranging from 42.68% to 68.03%. The evaluation showed that the system performed reasonably well in answering questions about scientific articles in GARUDA's Computer Science & IT indexed journals, especially on web-related subtopics. Keywords: chatbot, RAG, LLM, GARUDA Kemdikbud
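ROUGE measures n-gram overlap between a generated answer and a reference answer. A minimal ROUGE-1 F1 sketch in the spirit of that evaluation (simple whitespace tokenization; the study may use a library implementation with different preprocessing):

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """ROUGE-1 F1: unigram overlap between candidate and reference texts."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # multiset intersection of unigrams
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

Averaging this score over a set of question-answer pairs yields figures comparable to the 42.68%-68.03% range reported above.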

    Flood Early Warning System Based on Firebase Platform and Internet of Things

    No full text
Floods are natural disasters that frequently occur and cause significant damage to human life, infrastructure, and the environment. This study designed and implemented a prototype of an early flood warning system based on the Internet of Things (IoT) integrated with Firebase Realtime Database. The system aimed to address the limitations of previous warning systems by providing more accurate information, faster response times, and easier accessibility. The prototype enabled real-time data access through a web-based application accessible via smartphones and was equipped with a DF Player module to deliver audio warnings. A scaled-down prototype (1:10) was tested using an ultrasonic sensor for water level measurement and a tipping bucket sensor for rainfall measurement. The results demonstrated 99% accuracy for water level measurements up to 200 mm, with a data response time of 1 second and text updates every 5 seconds. This system significantly improved the accuracy, update speed, information accessibility, and effectiveness of audio warnings.
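A warning system like this typically maps each water level reading to a discrete alert state before pushing it to the database and the audio module. A minimal sketch of that decision step (the thresholds and sensor mounting height are illustrative, not the prototype's calibrated values):

```python
def level_from_ultrasonic(sensor_height_mm, distance_to_surface_mm):
    """Water level = known sensor height above the bottom minus distance to the surface."""
    return sensor_height_mm - distance_to_surface_mm

def flood_status(water_level_mm, alert_mm=120, danger_mm=180):
    """Classify a water level reading into a warning state."""
    if water_level_mm >= danger_mm:
        return "DANGER"
    if water_level_mm >= alert_mm:
        return "ALERT"
    return "SAFE"
```

In the prototype's flow, the resulting status string would be written to the Firebase Realtime Database and, on a state change, trigger the corresponding audio file on the DF Player module.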

    Genetic Algorithm To Optimize The Shortest Route for Indomaret Goods Suppliers

    No full text
This research examines the application of genetic algorithms to optimize the distribution of goods from suppliers to Indomaret outlets in the East Tegal District area by modeling the problem as the Traveling Salesman Problem (TSP). A genetic algorithm is applied to determine the most efficient distribution route, aiming to reduce travel distance and operational costs. Distance data between location points is taken from Google Maps, and the optimization process involves forming an initial population, selection based on a fitness function, crossover, and mutation. The research results show that the genetic algorithm can produce an optimal solution with a shortest distance of 9,700 meters and a highest fitness value of 0.0001031. These findings illustrate the effectiveness of genetic algorithms in handling TSP in the context of goods distribution and their potential for further development in distribution and logistics applications.
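The pipeline described (initial population, fitness-based selection, crossover, mutation) can be sketched for a small TSP instance. This is a generic illustration using ordered crossover and swap mutation; the distance matrix and parameters are made up, and fitness is taken as the reciprocal of route length, matching the magnitude of the fitness value reported above:

```python
import random

def route_length(route, dist):
    """Total length of a closed tour over the distance matrix."""
    return sum(dist[route[i]][route[(i + 1) % len(route)]] for i in range(len(route)))

def ordered_crossover(p1, p2):
    """Copy a random slice from parent 1, fill the rest in parent 2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def genetic_tsp(dist, pop_size=50, generations=200, mutation_rate=0.1):
    """Minimise tour length; fitness = 1 / route length."""
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: route_length(r, dist))
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            child = ordered_crossover(*random.sample(survivors, 2))
            if random.random() < mutation_rate:   # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda r: route_length(r, dist))
    return best, route_length(best, dist)
```

With real road distances pulled from Google Maps into `dist`, the returned route's fitness would be `1 / route_length(best, dist)`, directly comparable to the study's reported value.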

117 full texts · 147 metadata records · Updated in last 30 days.