Journal of Artificial Intelligence and Computational Technology
https://ojs.omgfzc.com/index.php/JAICT
<p>The "Journal of Artificial Intelligence and Computational Technology " aims to provide a platform for researchers, academics, engineers, and professionals to disseminate their original research, innovations, and advancements in various aspects of engineering and computer technology. The journal covers a broad spectrum of topics, including but not limited to electrical engineering, mechanical engineering, civil engineering, chemical engineering, computer science, information technology, software engineering, artificial intelligence, data science, and interdisciplinary studies at the intersection of engineering and computer technology.</p> <p>The journal publishes high-quality, peer-reviewed articles that contribute significantly to the existing body of knowledge in these fields. It welcomes original research papers, review articles, case studies, experimental studies, and surveys that address emerging trends, challenges, methodologies, and applications in engineering and computer technology.</p> <p>The topics covered by a journal with a focus on engineering and computer technology:</p> <ul> <li>Artificial Intelligence and Machine Learning</li> <li>Software Engineering</li> <li>Computer Networks and Communications</li> <li>Embedded Systems and IoT</li> <li>Cybersecurity</li> <li>Data Science and Big Data</li> <li>Computer Architecture and Systems</li> <li>Robotics and Control Systems</li> <li>Image and Signal Processing</li> <li>Renewable Energy and Sustainable Technologies</li> <li>Biomedical Engineering</li> <li>Civil and Environmental Engineering</li> </ul>Oloum Al Mostgbal Group en-USJournal of Artificial Intelligence and Computational Technology 3008-1645Combatting Cybersecurity Threats on Social Media: Network Protection and Data Integrity Strategies
https://ojs.omgfzc.com/index.php/JAICT/article/view/32
The rise of social media has transformed global communication, but it has also introduced significant cybersecurity threats, including identity theft, phishing, malware distribution, and data breaches. These challenges not only compromise individual users but also pose risks to businesses and governments. This research explores the prevalent cybersecurity threats on social media and proposes an integrated framework to enhance network protection and data integrity. The framework combines technical solutions, such as encryption, multi-factor authentication, and AI-based threat detection, with non-technical strategies such as user education, platform policies, and collaborative efforts among stakeholders. By synthesizing findings from a comprehensive literature review, this study identifies the most common cyber threats and assesses their impacts on users, businesses, and society at large. The research highlights the importance of proactive measures, including real-time monitoring, secure data practices, and user behavior modification, to mitigate these risks. Additionally, the study emphasizes the need for greater collaboration between platform providers, governments, and users to create a safer digital environment. The proposed framework is flexible and applicable across various social media platforms, providing a holistic approach to combatting evolving cyber threats. This study contributes to the growing body of knowledge on social media cybersecurity, offering practical recommendations for improving security and maintaining the integrity of online networks.

Ashraf Jalal Yousef Zaidieh
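As a rough illustration of one component named in this abstract, AI-based threat detection, the sketch below trains a toy text classifier to flag phishing-like messages. The example messages, labels, and model choice are assumptions made for illustration only and are not part of the study's proposed framework.

```python
# Minimal sketch of AI-based threat detection on social media content.
# The training examples and model are invented for illustration; they do
# not reflect the framework described in the abstract above.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

messages = [
    "Your account will be suspended, verify your password here",
    "Claim your free prize now by confirming your bank details",
    "Great photos from the trip, thanks for sharing",
    "Meeting moved to 3pm, see updated agenda",
]
labels = [1, 1, 0, 0]  # 1 = phishing-like, 0 = benign

# TF-IDF features plus a linear classifier: a simple baseline detector.
detector = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
detector.fit(messages, labels)

# New content could be screened in real time and flagged for review.
incoming = ["Urgent: confirm your password to keep your account"]
print(detector.predict(incoming))
```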
Copyright (c) 2024 Journal of Artificial Intelligence and Computational Technology
Published: 2024-11-08 | Vol. 1 No. 1 | DOI: 10.70274/jaict.2024.1.1.32

Predictive Modelling of Crop Rotation Using Data Mining Approaches
https://ojs.omgfzc.com/index.php/JAICT/article/view/35
Agriculture is crucial for economic growth and food security, particularly in agro-based countries. As the global population grows, the demand for food increases, necessitating improvements in agricultural productivity. Traditional methods have often fallen short, and innovative approaches such as data mining and machine learning are needed. This research aims to develop a predictive model for crop rotation using machine learning techniques. A comprehensive dataset was collected and preprocessed to train various algorithms. The proposed model demonstrated that machine learning could effectively predict suitable crops for cultivation, thereby enhancing crop yield and sustainability. The evaluation results were promising, with the Random Forest model achieving a precision of 0.67 to 1.00, recall of 0.43 to 1.00, and F1-score of 0.60 to 1.00; the Decision Tree model had a precision of 0.50 to 1.00, recall of 0.43 to 1.00, and F1-score of 0.50 to 1.00; and the K-Neighbors Classifier model showed precision of 0.40 to 1.00, recall of 0.43 to 1.00, and F1-score of 0.50 to 1.00.

Ibrahim Abdallah Hageltoum, Esraa Kamal Abdallah, Mahmoud I Alfeel, Salwa Awad Abbas
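The abstract above reports per-class precision, recall, and F1 ranges for Random Forest, Decision Tree, and K-Neighbors classifiers. The sketch below shows how such an evaluation is commonly set up with scikit-learn; the synthetic features merely stand in for the agronomic attributes of the authors' dataset, which is not reproduced here.

```python
# Hypothetical sketch of a per-class evaluation of crop classifiers.
# The synthetic data stands in for soil/weather features (e.g. N, P, K,
# pH, rainfall, temperature); it is not the authors' dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

X, y = make_classification(n_samples=600, n_features=7, n_informative=5,
                           n_classes=4, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

models = {
    "Random Forest": RandomForestClassifier(random_state=42),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "K-Neighbors": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    # classification_report lists per-class precision, recall, and F1,
    # which is how ranges such as 0.67 to 1.00 arise across crop classes.
    print(name)
    print(classification_report(y_test, model.predict(X_test)))
```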
Copyright (c) 2024 Journal of Artificial Intelligence and Computational Technology
Published: 2024-11-08 | Vol. 1 No. 1 | DOI: 10.70274/jaict.2024.1.1.35

Developing Parallel Requirements Prioritization Machine Learning Model Integrating with MoSCoW Method
https://ojs.omgfzc.com/index.php/JAICT/article/view/33
Requirements Prioritization (RP) ranks requirements according to the value they add to the business. It is a preprocessing step in software implementation and a prerequisite for achieving customer satisfaction, decreasing the risk of requirements volatility, developing cost-effective software, and maintaining the quality of the software system. Much research has focused on prioritizing requirements using one or several criteria such as time, dependency, and scalability; however, this work addresses sequential prioritization only. To the best of our knowledge, no prior work has focused on parallel ranking in prioritization, which permits simultaneous implementation of requirements and thereby reduces implementation time. In this study we developed a new requirements prioritization model that determines requirement priority levels in a parallel format using a Random Forest classifier based on the MoSCoW method (RF-MM). When we applied the model to the industrial dataset (Testcase MIS system with priority), the total implementation time was 76.0 seconds with sequential ranking, whereas it was 33 seconds with parallel ranking. Hence, parallel ranking can reduce implementation time by more than half.

Kawthar Ishag Ali Fadlallah, Mahir M. Sharif, Moawia Elfaki Yahia Eldow
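As a hedged sketch of the general RF-MM idea described above, the code below trains a Random Forest to assign MoSCoW levels to requirements and then groups requirements by predicted level, so that each group can be implemented concurrently rather than strictly in sequence. The features, labels, requirement identifiers, and grouping step are illustrative assumptions, not the authors' exact model or dataset.

```python
# Illustrative sketch only: a Random Forest assigns each requirement a
# MoSCoW level, and requirements sharing a level form a batch that can be
# implemented in parallel. Data and feature choices are assumptions.
import numpy as np
from collections import defaultdict
from sklearn.ensemble import RandomForestClassifier

MOSCOW = ["Must", "Should", "Could", "Wont"]

# Stand-in training data: each requirement is described by numeric
# criteria (e.g. business value, risk, dependency count, effort).
rng = np.random.default_rng(0)
X_train = rng.random((120, 4))
y_train = rng.integers(0, 4, size=120)  # labelled MoSCoW levels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# New, unlabelled requirements to prioritize (hypothetical IDs R1..R10).
new_reqs = {f"R{i}": rng.random(4) for i in range(1, 11)}
levels = clf.predict(np.array(list(new_reqs.values())))

# Group requirements by predicted level; each group is a candidate batch
# for simultaneous implementation instead of a strict sequence.
batches = defaultdict(list)
for req_id, level in zip(new_reqs, levels):
    batches[MOSCOW[level]].append(req_id)

for level in MOSCOW:
    print(level, batches.get(level, []))
```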
Copyright (c) 2024 Journal of Artificial Intelligence and Computational Technology
Published: 2024-11-08 | Vol. 1 No. 1 | DOI: 10.70274/jaict.2024.1.1.33

A Case Study: An Android Security
https://ojs.omgfzc.com/index.php/JAICT/article/view/31
Software security has made great progress; code analysis tools perform extensive checks for code defects, and it is useful to have a basic understanding of the different warnings they emit and the bugs they emphasize. However, this is beyond the state of the art for many types of application security flaws. Thus, such tools frequently serve to help an analyst zero in on security-relevant portions of code so that flaws can be found more efficiently, rather than acting as tools that simply find flaws automatically. In cooperation with a security expert, we carried out a case study of the Android mobile phone platform and employed the reverse engineering tool suite Bauhaus for this security assessment. During the investigation we found some inconsistencies in the implementation of the Android security concepts. Based on the lessons learned from the case study, we propose several research topics in the area of reverse engineering that would support a security analyst during security assessments.

Shagufta Akhtar
Copyright (c) 2024 Journal of Artificial Intelligence and Computational Technology
Published: 2024-11-08 | Vol. 1 No. 1 | DOI: 10.70274/jaict.2024.1.1.31

Quantitative Studies on Hacking Methods and Types of Cyber-Attackers of Electronic Banking Networks
https://ojs.omgfzc.com/index.php/JAICT/article/view/36
Cyber-attacks on electronic banking networks are a growing concern in today's digital age. Cybercriminals try to penetrate electronic banking infrastructure by bypassing protection mechanisms, and the media continue to report new bank cyberattacks and thefts. This study attempts to identify the types of attackers and the attack methods that threaten bank networks, together with their impact. To do so, a questionnaire was prepared and distributed to selected banks and their branches, namely Omdurman National Bank, Bank of Khartoum, and Bank of Sudan. In the studied sample, competitors stood out as the most frequently reported type of network attacker, cited at a rate of 29.2%, the largest share among the categories considered, while penetration through the Internet was the most frequently reported attack method, at 52.7%. The study concluded that competitors and the Internet have the highest impact among attacker types and attack methods, respectively. This conclusion supports the problem statement, since the impact was established within the case-study banks.

Mahmoud I Alfeel, Suliman M M Abakar, Mafawez Alharbi, Ibrahim Abdallah Hageltoum
Copyright (c) 2024 Journal of Artificial Intelligence and Computational Technology
Published: 2024-11-08 | Vol. 1 No. 1 | DOI: 10.70274/jaict.2024.1.1.36

Machine Learning: A survey of requirements prioritization: A Review Study
https://ojs.omgfzc.com/index.php/JAICT/article/view/34
In any software system, requirements prioritization is considered a pivotal task. This paper aims to explain and discuss work on machine-learning-based requirements prioritization, along with dependency-aware requirements prioritization. Machine learning has attracted the attention of scientists, researchers, and users because of the availability of vast data and of deep learning algorithms that can analyze massive data sets. The underlying algorithms are used to learn and calculate dependencies, resolve stakeholder conflicts, classify requirements, and improve scalability. The paper presents a brief background and a comprehensive overview of a number of machine learning techniques for requirements prioritization, including those concerning requirement dependency in its simple, complex, and hybrid forms. Many papers and articles deal with requirements prioritization, but few of them handle dependency. The paper then compares and discusses a number of selected techniques in terms of algorithm type, issues addressed, and the level of data used for evaluation, and examines the methods that handle dependency with regard to their strengths and weaknesses.

Kawthar Ishag Ali Fadlallah, Moawia Elfaki Yahia Eldow
Copyright (c) 2024 Journal of Artificial Intelligence and Computational Technology
Published: 2024-11-08 | Vol. 1 No. 1 | DOI: 10.70274/jaict.2024.1.1.34