Department of Computer Science, Faculty of Communication and Information Sciences, University of Ilorin, Nigeria
Computer Science, Artificial Intelligence, Computer Networks and Communications, Computer Science Applications
Abubakar Abdulkarim, Nasir Faruk, Emmanuel Alozie, Hawau Olagunju, Ruqayyah Yusuf Aliyu, Agbotiname Lucky Imoize, Kayode S. Adewole, Yusuf Olayinka Imam-Fulani, Salisu Garba, Bashir Abdullahi Baba,et al.
MDPI AG
Globally, effective and efficient healthcare is critical to the wellbeing and standard of living of any society. Unfortunately, several distant communities far from the national grid do not have access to a reliable power supply, owing to economic, environmental, and technical challenges. Furthermore, the unreliable, unavailable, and uneconomical power supply in these communities contributes significantly to substandard healthcare delivery, or its complete absence, resulting in higher mortality rates and difficulty in attracting qualified healthcare workers to the affected communities. Given these circumstances, this paper aims to conduct a comprehensive review of the status of renewable energy available to rural healthcare clinics around the globe, emphasizing its potential, analysis, procedures, modeling techniques, and case studies. In this light, several renewable energy modeling techniques were reviewed to examine the optimum power supply to the referenced healthcare centers in remote communities. To this end, analytical techniques and standard indices for reliable power supply to isolated healthcare centers are suggested. Specifically, different battery storage systems that are suitable for rural healthcare systems are examined, and the most economical and realistic procedure for the maintenance of microgrid power systems for sustainable healthcare delivery is defined. Finally, this paper will serve as a valuable resource for policymakers, researchers, and experts in rural power supply to remote healthcare centers globally.
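The battery sizing examined in the review can be illustrated with a common first-pass calculation; the derating factors below are typical textbook values, not figures from the paper:

```python
def battery_bank_capacity_kwh(daily_load_kwh, autonomy_days,
                              depth_of_discharge=0.8, system_efficiency=0.85):
    """First-pass sizing for an off-grid clinic battery bank: energy for
    the desired days of autonomy, derated by the allowable depth of
    discharge and the system (round-trip) efficiency. Defaults are
    illustrative assumptions only."""
    return (daily_load_kwh * autonomy_days) / (depth_of_discharge * system_efficiency)

# A small rural clinic drawing 12 kWh/day with 2 days of autonomy
capacity = battery_bank_capacity_kwh(12.0, 2)
```

Detailed sizing in practice would also account for temperature derating, battery aging, and the load profile shape.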
Kayode S. Adewole and Vicenç Torra
Elsevier BV
Kayode S. Adewole and Andreas Jacobsson
IEEE
Internet of Things (IoT) technology has created a new dimension for data collection, transmission, processing, storage, and service delivery. With the advantages offered by IoT technologies, interest in smart home automation has increased over the years. Nevertheless, smart connected homes are characterized by the security and privacy problems associated with aggregating multiple sensors' data and exposing them to the Internet. In this paper, we propose LPM, a lightweight privacy-aware model that leverages information-theoretic correlation analysis and gradient boosting to fuse multiple sensors' data at the edge nodes of smart connected homes. LPM employs federated learning, edge computing, and cloud computing to reduce privacy leakage of sensitive data. To demonstrate its applicability, two services commonly provided by smart homes, i.e., occupancy detection and people count estimation, were experimentally investigated. The results show that LPM can achieve accuracy, F1 score, and AUC-ROC of 99.98%, 99.13%, and 99.98%, respectively, for occupancy detection, as well as Mean Squared Error (MSE), Mean Absolute Error (MAE), and $R^{2}$ of 0.0011, 0.0175, and 98.39%, respectively, for people count estimation. LPM offers each smart connected home the opportunity to participate in collaborative learning, achieved through the federated machine learning component of the proposed model.
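The federated learning component described above can be illustrated with a minimal FedAvg-style aggregation sketch; the weighting scheme and parameter layout here are generic assumptions, not LPM's actual design:

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Weighted average of per-home model parameters (FedAvg-style).

    client_weights: list of 1-D parameter vectors, one per smart home
    client_sizes:   number of local training samples per home
    """
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()          # weight each home by its data volume
    stacked = np.stack(client_weights)    # shape: (homes, params)
    return coeffs @ stacked               # aggregated global parameters

# Three homes with different amounts of local data
homes = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
sizes = [10, 20, 10]
global_model = federated_average(homes, sizes)
```

In a full federated round, the server would broadcast `global_model` back to the homes for further local training, so raw sensor data never leaves the edge.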
Erdal Akin, Héctor Caltenco, Kayode S. Adewole, Reza Malekian, and Jan A. Persson
IEEE
Segmenting images is an intricate and exceptionally demanding field within computer vision. Instance segmentation is one of the subfields of image segmentation that segments objects in a given image or video. It categorizes the class labels according to individual instances, ensuring that distinct instance markers are assigned to each occurrence of the same object class, even if multiple instances exist. With the development of computer systems, segmentation studies have increased very rapidly. One of the state-of-the-art algorithms recently published by Meta AI, which segments everything in a given image, is the Segment Anything Model (SAM). Its impressive zero-shot performance encourages its use for diverse tasks. Therefore, we leverage SAM to build an effective instance segmentation model. Accordingly, in this paper, we propose a hybrid instance segmentation method, called Box Prompted SAM (BP-SAM), in which object detection algorithms extract bounding boxes of detected objects, which are then passed to SAM as prompts to produce segmentations. Experimental evaluation on the COCO2017 validation dataset showed promising performance.
Kayode Adewole and Andreas Jacobsson
SCITEPRESS - Science and Technology Publications
Emmanuel Alozie, Abdulwaheed Musa, Nasir Faruk, Agbotiname Lucky Imoize, Abubakar Abdulkarim, Aliyu D. Usman, Yusuf Olayinka Imam-Fulani, Kayode S. Adewole, Abdulkarim A. Oloyede, Olugbenga A. Sowande,et al.
Elsevier BV
Nehemiah Musa, Abdulsalam Ya’u Gital, Nahla Aljojo, Haruna Chiroma, Kayode S. Adewole, Hammed A. Mojeed, Nasir Faruk, Abubakar Abdulkarim, Ifada Emmanuel, Yusuf Y. Folawiyo,et al.
Springer Science and Business Media LLC
Yusuf Olayinka Imam-Fulani, Nasir Faruk, Olugbenga A. Sowande, Abubakar Abdulkarim, Emmanuel Alozie, Aliyu D. Usman, Kayode S. Adewole, Abdulkarim A. Oloyede, Haruna Chiroma, Salisu Garba,et al.
MDPI AG
The rapid increase in data traffic caused by the proliferation of smart devices has spurred the demand for extremely large-capacity wireless networks. Thus, faster data transmission rates and greater spectral efficiency have become critical requirements in modern-day networks. The ubiquitous 5G is an end-to-end network capable of accommodating billions of linked devices and offering high-performance broadcast services due to its several enabling technologies. However, the existing review works on 5G wireless systems examined only a subset of these enabling technologies, providing limited coverage of the system model, performance analysis, technology advancements, and critical design issues, thus leaving gaps that require further research. In order to fill this gap and fully grasp the potential of 5G, this study comprehensively examines various aspects of 5G technology. Specifically, a systematic and all-encompassing evaluation of the candidate 5G enabling technologies was conducted. The evolution of 5G, the progression of wireless mobile networks, potential use cases, channel models, applications, frequency standardization, key research issues, and prospects are discussed extensively. Key findings from the elaborate review reveal that these enabling technologies are critical to developing robust, flexible, dependable, and scalable 5G and future wireless communication systems. Overall, this review is useful as a resource for wireless communication researchers and specialists.
Haruna Chiroma, Ponman Nickolas, Nasir Faruk, Emmanuel Alozie, Imam-Fulani Yusuf Olayinka, Kayode S. Adewole, Abubakar Abdulkarim, Abdulkarim A. Oloyede, Olugbenga A. Sowande, Salisu Garba,et al.
Elsevier BV
Emmanuel Alozie, Nasir Faruk, Abubakar Abdulkarim, Aliyu D. Usman, Yusuf Olayinka Imam-Fulani, Salisu Garba, Kayode S. Adewole, Abdulkarim A. Oloyede, Olugbenga A. Sowande, Bashir Abdullahi Baba,et al.
IEEE
Signal propagation in a particular region differs from another due to differences in atmospheric, climatic, and environmental properties and distinct terrain and clutter features. Adequate analysis is essential to understand the radio propagation behavior in a particular region. The ITU-R designated four rain regions, M, N, P, and Q, for Nigeria, representing the rain rate distribution, and also provided further classifications based on ground conductivity, among other salient parameters. Based on these classifications, this paper utilized the EDX Signal Pro software® to model and simulate typical Point-to-Point (P2P), Non-Line-of-Sight (NLOS) link scenarios for 5G networks. The objective was to estimate and compare the total loss, excess loss, and flat fade margins for each rain region in Nigeria. Results obtained from the comparison showed that signals propagating in region N experience the highest level of losses and fading, while region Q has the least losses and fading.
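The flat fade margin estimated in the study follows from a standard link budget; a minimal sketch, using illustrative values rather than figures from the paper:

```python
def received_power_dbm(ptx_dbm, gtx_dbi, grx_dbi, total_loss_db):
    """Link budget: transmit power plus antenna gains minus total path loss."""
    return ptx_dbm + gtx_dbi + grx_dbi - total_loss_db

def flat_fade_margin_db(prx_dbm, rx_sensitivity_dbm):
    """Margin between the received signal level and the receiver threshold;
    the larger the margin, the more fading the link can absorb."""
    return prx_dbm - rx_sensitivity_dbm

# Illustrative values (not from the paper): 30 dBm Tx, 35 dBi antennas,
# 140 dB total loss, -85 dBm receiver sensitivity
prx = received_power_dbm(30.0, 35.0, 35.0, 140.0)
ffm = flat_fade_margin_db(prx, -85.0)
```

A region with higher rain-induced total loss (such as region N above) yields a lower received level and hence a smaller flat fade margin for the same equipment.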
Emmanuel Alozie, Nasir Faruk, Abubakar Abdulkarim, Aliyu D. Usman, Yusuf Olayinka Imam-Fulani, Salisu Garba, Kayode S. Adewole, Abdulkarim A. Oloyede, Olugbenga A. Sowande, Bashir Abdullahi Baba,et al.
IEEE
ITU-R categorized Nigeria into four rain regions (i.e., M, N, P, and Q) depending on the atmospheric conditions. Previous works on rain attenuation modeling and simulation have assumed similar signal propagation behavior and parameters within locations categorized under the same region. This paper aims to test these assumptions by conducting extensive radio propagation simulations of point-to-point microwave links with a clear line of sight in order to estimate the total path loss (attenuation), excess path loss, and flat fade margins for each of the locations within the M-region, i.e., Kano, Sokoto, and Adamawa. Results obtained showed that in all the locations the loss increases monotonically with distance, with Sokoto having the highest level of signal losses and fading and Adamawa the lowest. The deviation in total loss, excess loss, and flat fade margin was found to be within 15 dB across the region.
Yusuf Olayinka Imam-Fulani, Nasir Faruk, Aliyu D. Usman, Abubakar Abdulkarim, Abdoulie M.S Tekanyi, Abdulmalik S. Yaro, Emmanuel Alozie, Salisu Garba, Kayode S. Adewole, Abdulkarim A. Oloyede,et al.
IEEE
5G communication systems provide an end-to-end wireless connection to billions of users and devices across the globe. The quality of signals received during radio communication is notably influenced by the behavior of the radio propagation channel. A major parameter used in characterizing this channel is the Path Loss Exponent (PLE). Several works that have estimated and analyzed the PLE mostly considered the effect of distance and carrier frequency. However, the effect of base station antenna height and channel bandwidth on the PLE for 5G networks has not been adequately considered. To address this, the impact of base station antenna height and channel bandwidth on the PLE within the 5G Frequency Range 1 (FR1) frequencies was investigated in this study, specifically at 800, 3500, and 5900 MHz. The licensed EDX Signal Pro software® with the Cirrus high-resolution global terrain and clutter database was utilized to model, simulate, and analyze the PLE for Kano City, Nigeria. Results showed that for the tested frequencies, an increase in either base station height or channel bandwidth leads to a significant reduction in the PLE. This study can be utilized by network planning engineers and the wireless research community to further improve network implementation and optimization toward understanding the behavior of signal propagation in 5G networks and beyond.
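The PLE analyzed above is the exponent n of the standard log-distance model, PL(d) = PL(d0) + 10 n log10(d/d0); a minimal least-squares estimator on synthetic data (not the study's measurement data):

```python
import numpy as np

def estimate_ple(distances_m, path_loss_db, d0=1.0):
    """Least-squares fit of the path loss exponent n in the log-distance
    model PL(d) = PL(d0) + 10 * n * log10(d / d0)."""
    d = np.asarray(distances_m, float)
    pl = np.asarray(path_loss_db, float)
    pl_d0 = pl[np.argmin(np.abs(d - d0))]   # sample closest to d0 as reference
    x = 10.0 * np.log10(d / d0)
    mask = d > d0                           # exclude the reference point itself
    n, *_ = np.linalg.lstsq(x[mask, None], (pl - pl_d0)[mask], rcond=None)
    return float(n[0])

# Synthetic free-space-like data generated with n = 2
d = np.array([1.0, 10.0, 100.0, 1000.0])
pl = 40.0 + 10 * 2.0 * np.log10(d)
n_hat = estimate_ple(d, pl, d0=1.0)
```

The study's finding then corresponds to the fitted n decreasing as base station height or channel bandwidth increases.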
Kayode S. Adewole, Hawau Olagunju, Nasir Faruk, Agbotiname Lucky Imoize, Emmanuel Alozie, Olugbenga A. Sowande, Abubakar Abdulkarim, Aliyu D. Usman, Yusuf Olayinka Imam-Fulani, Abdulkarim A. Oloyede,et al.
IEEE
Terrestrial Radio Propagation (TRP) involves radio wave propagation from one station to another over the surface of the earth. Radio communication systems have been deployed for broadcasting, mobile cellular, and public safety. Radio propagation planning plays a crucial role in designing and deploying terrestrial radio networks. Radio propagation models (path loss models) are utilized for link budget, coverage, and interference estimation. The models guide radio network engineers in choosing appropriate placements of radio network equipment, such as the base stations. However, the high cost of TRP data collection and the lack of open-access, accurate TRP datasets hinder the development of path loss models that can reliably predict outcomes for different use cases. To address this problem, this paper presents a robust TRP repository that provides a platform for hosting and disseminating TRP datasets, which the research community could use for path loss modeling. The repository was implemented using the latest technologies, and its performance was evaluated. The system has been deployed for public access.
Kayode Adewole and Vicenç Torra
SCITEPRESS - Science and Technology Publications
The development in smart meter technology has made grid operations more efficient based on fine-grained electricity usage data generated at different levels of time granularity. Consequently, machine learning algorithms have benefited from these data to produce useful models for important grid operations. Although machine learning algorithms need historical data to improve predictive performance, these data are not readily available for public utilization due to privacy issues. The existing smart grid data simulation frameworks generate grid data with implicit privacy concerns, since the data are simulated from the few real energy consumption datasets that are publicly available. This paper addresses two issues in the smart grid. First, it assesses the level of privacy violation for individual household appliances based on synthetic household aggregate load consumption. Second, based on the findings, it proposes two privacy-preserving mechanisms to reduce this risk. Three inference attacks are simulated, and the results obtained confirm the efficacy of the proposed privacy-preserving mechanisms.
Kayode S. Adewole, Hammed A. Mojeed, James A. Ogunmodede, Lubna A. Gabralla, Nasir Faruk, Abubakar Abdulkarim, Emmanuel Ifada, Yusuf Y. Folawiyo, Abdukareem A. Oloyede, Lukman A. Olawoyin,et al.
MDPI AG
Electrocardiography (ECG) is one of the most widely used recordings in clinical medicine. ECG deals with the recording of the electrical activity generated by the heart, measured through the surface of the body using electrodes attached to the body surface. The use of ECG in the diagnosis and management of cardiovascular disease (CVD) has been in existence for over a decade, and research in this domain has recently attracted large attention. Along this line, expert systems (ES) and decision support systems (DSS) have been developed for ECG interpretation and diagnosis. However, despite the large body of literature, access to recent and more comprehensive review papers on this subject is still a challenge. This paper presents a comprehensive review of the application of ES and DSS for ECG interpretation and diagnosis. Researchers have proposed a number of features and methods for ES and DSS development that can be used to monitor a patient’s health condition through ECG recordings. In this paper, a taxonomy of the features and methods for ECG interpretation and diagnosis is presented. The significance of the features and methods, as well as their limitations, is analyzed. This review further presents interesting theoretical concepts in this domain, and identifies challenges and open research issues in ES and DSS development for ECG interpretation and diagnosis that require substantial research effort. In conclusion, this paper identifies important future research areas with the purpose of advancing the development of ES and DSS for ECG interpretation and diagnosis.
Kayode S. Adewole and Vicenç Torra
Springer Science and Business Media LLC
The introduction of advanced metering infrastructure (AMI) smart meters has given rise to fine-grained electricity usage data at different levels of time granularity. AMI collects high-frequency daily energy consumption data that enables utility companies and data aggregators to perform a rich set of grid operations such as demand response, grid monitoring, load forecasting, and many more. However, privacy concerns associated with daily energy consumption data have been raised. Existing studies on data anonymization for smart grid data focused on the direct application of perturbation algorithms, such as microaggregation, to protect the privacy of consumers. In this paper, we empirically show that reliance on microaggregation alone is not sufficient to protect smart grid data. Therefore, we propose the DFTMicroagg algorithm, which provides a dual level of perturbation to improve privacy. The algorithm leverages the benefits of the discrete Fourier transform (DFT) and microaggregation to provide an additional layer of protection. We evaluated our algorithm on two publicly available smart grid datasets with millions of smart meter readings. Experimental results based on clustering analysis using k-Means, classification via the k-nearest neighbor (kNN) algorithm, and mean hourly energy consumption forecasting using a Seasonal Auto-Regressive Integrated Moving Average with eXogenous factors (SARIMAX) model further proved the applicability of the proposed method. Our approach provides utility companies with more flexibility to control the level of protection for their published energy data.
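The dual-level idea can be sketched as follows; this is a toy illustration in the spirit of DFTMicroagg, not the authors' exact algorithm (the DFT truncation, ordering heuristic, and grouping here are simplifying assumptions):

```python
import numpy as np

def dft_microagg(profiles, k=2, n_coeffs=3):
    """Toy two-level perturbation: compress each load profile with a
    truncated DFT (first perturbation), then k-microaggregate the
    coefficient vectors by replacing each group of k profiles with its
    centroid (second perturbation), and reconstruct."""
    profiles = np.asarray(profiles, float)
    X = np.fft.rfft(profiles, axis=1)[:, :n_coeffs]   # keep low frequencies
    order = np.argsort(np.abs(X).sum(axis=1))         # simple 1-D ordering
    out = np.empty_like(X)
    for start in range(0, len(order), k):
        idx = order[start:start + k]
        out[idx] = X[idx].mean(axis=0)                # group -> centroid
    n = profiles.shape[1]
    full = np.zeros((len(profiles), n // 2 + 1), complex)
    full[:, :n_coeffs] = out
    return np.fft.irfft(full, n=n, axis=1)            # anonymized profiles

rng = np.random.default_rng(0)
profiles = rng.random((4, 8))                         # 4 homes, 8 readings
anon = dft_microagg(profiles, k=2, n_coeffs=3)
```

After microaggregation, every published profile is shared by at least k households, which is what gives the k-anonymity-style guarantee.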
Kayode S. Adewole, Taofeekat T. Salau-Ibrahim, Agbotiname Lucky Imoize, Idowu Dauda Oladipo, Muyideen AbdulRaheem, Joseph Bamidele Awotunde, Abdullateef O. Balogun, Rafiu Mope Isiaka, and Taye Oladele Aro
MDPI AG
Network intrusion, such as denial of service, probing attacks, and phishing, comprises some of the complex threats that have put the online community at risk. The increase in the number of these attacks has given rise to serious interest in the research community to curb the menace. One of the research efforts is to have an intrusion detection mechanism in place. Batch learning and data streaming are approaches used for processing the huge amount of data required for proper intrusion detection. Batch learning, despite its advantages, has been faulted for poor scalability due to the constant re-training on new training instances. Hence, this paper conducts a comparative study using selected batch learning and data streaming algorithms: J48, projective adaptive resonance theory (PART), Hoeffding tree (HT), and OzaBagAdwin (OBA). Furthermore, both binary and multiclass classification problems are considered for the tested algorithms. Experimental results show that data streaming algorithms achieved considerably higher performance in binary classification problems when compared with batch learning algorithms. Specifically, binary classification produced accuracies of J48 (94.73%), PART (92.83%), HT (98.38%), and OBA (99.67%), while multiclass classification produced J48 (87.66%), PART (87.05%), HT (71.98%), and OBA (82.80%). Hence, the use of data streaming algorithms to solve the scalability issue and allow real-time detection of network intrusion is highly recommended.
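The Hoeffding tree's suitability for streams rests on the Hoeffding bound, which tells the learner how many instances are enough to commit to a split; a short illustration:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding bound: with probability 1 - delta, the true mean of a
    random variable with range `value_range` lies within epsilon of the
    mean observed over n samples. A Hoeffding tree splits a node once the
    observed gain gap between the two best attributes exceeds epsilon,
    so it never needs to revisit past instances."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

# The bound shrinks as more stream instances arrive at a node
eps_after_200 = hoeffding_bound(1.0, 1e-7, 200)
eps_after_20000 = hoeffding_bound(1.0, 1e-7, 20000)
```

This single-pass property is exactly why the streaming algorithms avoid the re-training cost that limits batch learners.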
Moshood A. Hambali, Tinuke O. Oladele, Kayode S. Adewole, Arun Kumar Sangaiah, and Wei Gao
Springer Science and Business Media LLC
Emmanuel Alozie, Abubakar Abdulkarim, Ibrahim Abdullahi, Aliyu D. Usman, Nasir Faruk, Imam-Fulani Yusuf Olayinka, Kayode S. Adewole, Abdulkarim A. Oloyede, Haruna Chiroma, Olugbenga A. Sowande,et al.
MDPI AG
Radio waves are attenuated by atmospheric phenomena such as snow, rain, dust, clouds, and ice, which absorb radio signals. Signal attenuation becomes more severe at extremely high frequencies, usually above 10 GHz. In typical equatorial and tropical locations, rain attenuation is more prevalent. Some established research works have attempted to provide state-of-the-art reviews on modeling and analysis of rain attenuation in the context of extremely high frequencies. However, the existing review works conducted over three decades (1990 to 2022) have not adequately provided comprehensive taxonomies for each method of rain attenuation modeling to expose the trends and possible future research directions. Also, taxonomies of the methods of model validation and regional developmental efforts on rain attenuation modeling have not been explicitly highlighted in the literature. To address these gaps, this paper conducted an extensive literature survey on rain attenuation modeling, methods of analysis, and model validation techniques, leveraging the ITU-R regional categorizations. Specifically, taxonomies in different rain attenuation modeling and analysis areas are extensively discussed. Key findings from the detailed survey have shown that many open research questions, challenges, and applications could open up new research frontiers, leading to novel findings in rain attenuation. Finally, this study is expected to serve as reference material for the modeling and analysis of rain attenuation.
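Most of the surveyed models build on the ITU-R power-law relation, where specific attenuation is k·R^α (dB/km) for rain rate R; a minimal sketch with placeholder coefficients (the real k and α are frequency- and polarization-dependent, tabulated in ITU-R P.838):

```python
def specific_rain_attenuation(rain_rate_mm_h, k, alpha):
    """ITU-R power-law model: specific attenuation (dB/km) = k * R^alpha,
    where R is the rain rate in mm/h. The k and alpha values used below
    are illustrative placeholders, not taken from the recommendation."""
    return k * rain_rate_mm_h ** alpha

def path_attenuation_db(rain_rate_mm_h, k, alpha, effective_path_km):
    """Total rain attenuation over the effective path length."""
    return specific_rain_attenuation(rain_rate_mm_h, k, alpha) * effective_path_km

# Example: heavy tropical rain (100 mm/h) over a 5 km effective path
att = path_attenuation_db(100.0, k=0.05, alpha=1.1, effective_path_km=5.0)
```

Regional modeling efforts largely differ in how they estimate R (e.g., 0.01%-exceeded rain rate) and the effective path length for a given climate.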
Nasir Faruk, Quadri Ramon Adebowale, Imam-Fulani Yusuf Olayinka, Kayode S. Adewole, Abubakar Abdulkarim, Abdulkarim A. Oloyede, Haruna Chiroma, Olugbenga A. Sowande, Lukman A. Olawoyin, Salisu Garba,et al.
Elsevier BV
Abubakar Abdulkarim, Nasir Faruk, Emmanuel Alozie, Olugbenga. A. Sowande, Imam-Fulani Yusuf Olayinka, Aliyu D. Usman, Kayode S. Adewole, Abdulkarim A. Oloyede, Haruna Chiroma, Salisu Garba,et al.
IEEE
The demand for high-speed internet services is increasing due to emerging needs such as e-commerce, e-health, education, and other high-technology applications. Wireless communication networks have now become a necessity, especially with the introduction of 5G networks, which have the potential to provide extraordinary data rates with extremely low latency. The deployment and operation of 5G and beyond networks in built-up environments require complex and reliable radio propagation models that guide network engineers toward effective coverage estimation and appropriate base station placement. The inefficiency, and sometimes inconsistency, of deterministic and empirical path loss models has necessitated the integration of machine learning models. Recently, different machine learning-based path loss models have been developed to overcome the drawbacks associated with conventional path loss models, owing to their significant learning and prediction abilities. This paper reviews machine learning-based path loss models, focusing on models developed over the last 21 years (2000 to 2021), to study their network parameters, architectures, designs, and applicability, and to proffer further research directions.
Kayode Sakariyah Adewole and Vicenç Torra
Springer International Publishing
Babajide J. Odejide, Amos O. Bajeh, Abdullateef O. Balogun, Zubair O. Alanamu, Kayode S. Adewole, Abimbola G. Akintola, Shakirat A. Salihu, Fatima E. Usman-Hamza, and Hammed A. Mojeed
Springer International Publishing
Quadri Ramon Adebowale, Nasir Faruk, Kayode S. Adewole, Abubakar Abdulkarim, Abdulkarim A. Oloyede, Haruna Chiroma, Olugbenga. A. Sowande, Aliyu D. Usman, Imam-Fulani Yusuf Olayinka, Abduljalal Yusharu Kassim,et al.
IEEE
The propagation of electromagnetic wave signals in terrestrial frequency bands in a built-up environment is affected by many factors, including diffraction, reflection, and scattering, which lead to signal degradation. Furthermore, the physical layer interface is one of the most critical factors that must be carefully analyzed for the optimum design of wireless systems. In view of this, several channel models have been proposed to predict how radio waves behave in typical built-up and complex environments. In the case of Nigeria, when these propagation models are applied to disparate environments, many of them are susceptible to large prediction errors. Hence, there is a need to develop a model suitable for such environments to minimize errors. This paper used the Scaled Conjugate Gradient and Levenberg-Marquardt algorithms to develop a multi-frequency-band ANN-based path loss prediction model. Furthermore, the paper investigated the effect of ANN system parameters on the model’s performance. Findings revealed that the Standard Deviation Error (SDE) and the Correlation Coefficient (R) depend on the model’s network architecture. In addition, the Levenberg-Marquardt algorithm fits networks with complex structures better than the Scaled Conjugate Gradient algorithm. It was further discovered that increasing the number of hidden neurons does not necessarily increase the performance of the model.
Abimbola Ganiyat Akintola, Abdullateef Balogun, Hammed Adeleke Mojeed, Fatima Usman-Hamza, Shakirat Aderonke Salihu, Kayode Sakariyau Adewole, Ghaniyyat Bolanale Balogun, and Peter Ogirima Sadiku
International Association of Online Engineering (IAOE)
Due to the exponential rise of mobile technology, a slew of new mobile security concerns has surfaced recently. To address the hazards connected with malware, many approaches have been developed. Signature-based detection is the most widely used approach for detecting Android malware. This approach has the disadvantage of being unable to identify unknown malware, a limitation that motivated the use of machine learning (ML) for identifying and categorising malware apps. Conventional ML methods are concerned with increasing classification accuracy. However, standard classification methods perform poorly in recognising malware applications on imbalanced real-world datasets. In this study, an empirical analysis of the detection performance of ML methods in the presence of class imbalance is conducted. Specifically, eleven (11) ML methods with diverse computational complexities were investigated. Also, the synthetic minority oversampling technique (SMOTE) and random undersampling (RUS) are deployed to address the class imbalance in the Android malware datasets. The experimented ML methods are tested using the Malgenome and Drebin Android malware datasets, which contain features gathered from both static and dynamic malware approaches. According to the experimental findings, the performance of each experimented ML method varies across the datasets. Moreover, the presence of class imbalance deteriorated the performance of the ML methods, as their performance improved when the data sampling methods (SMOTE and RUS) were deployed to alleviate the class imbalance problem. Furthermore, ML models combined with the SMOTE technique were superior to the other experimented methods. It is therefore recommended to address the inherent class imbalance problem in Android malware detection.
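The SMOTE technique deployed above can be sketched in a few lines; this is a simplified illustration of the core interpolation idea, not the exact algorithm used in the study:

```python
import numpy as np

def smote_oversample(X_min, n_new, k=3, seed=0):
    """Minimal SMOTE sketch: create synthetic minority-class samples by
    linear interpolation between a minority point and one of its k
    nearest minority neighbors (a simplification of the original
    algorithm, which also handles categorical features and sampling
    ratios)."""
    rng = np.random.default_rng(seed)
    X_min = np.asarray(X_min, float)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbors = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(neighbors)
        gap = rng.random()                   # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)

# Four minority-class malware samples in a 2-D feature space
X_minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_synth = smote_oversample(X_minority, n_new=4)
```

Because each synthetic point lies on a segment between two real minority samples, the oversampled class occupies the same feature region rather than duplicating existing points, which is what RUS-style duplication avoidance and SMOTE have over naive oversampling.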