A comprehensive systematic and bibliometric review of IoT-based healthcare systems
Technological growth has had a major effect on the healthcare sector. When introduced into healthcare, the Internet of Things (IoT) can ease this transition by helping physicians track their patients closely and support faster recovery. Elderly patients in particular require intensive monitoring, and their relatives need periodic updates on their wellbeing. Using IoT in healthcare can therefore simplify the lives of physicians and patients alike. Hence, this study presents a comprehensive review of intelligent IoT-based embedded healthcare systems. Papers on intelligent IoT-based healthcare systems published up to December 2022 are examined, and research directions are suggested for future researchers. The contribution of this study is to map IoT-based healthcare systems onto concrete strategies for the future deployment of new generations of IoT-based health technology. The findings reveal that IoT can help governments strengthen society's health and economic relations. At the same time, owing to its novel operating principles, IoT requires modern security infrastructure. This study is intended to support widely available electronic healthcare services, health experts, and clinicians.
Quantum-PSO based unsupervised clustering of users in social networks using attributes
Unsupervised cluster detection in social network analysis involves grouping social actors into distinct groups. Users within a cluster are semantically very similar to one another and dissimilar to users in other clusters. Social network clustering reveals a wide range of useful information about users and has many applications in daily life. Various approaches have been developed to find clusters of social network users, using either links alone or attributes and links together. This work proposes a method for detecting clusters of social network users based solely on their attributes, which are treated as categorical values. The most popular clustering algorithm for categorical data is the K-modes algorithm; however, it may get stuck in local optima due to its random initialization of centroids. To overcome this issue, this manuscript proposes a Quantum PSO (QPSO) approach based on user-similarity maximization. In the proposed approach, dimensionality reduction is first performed by selecting the relevant attribute set and then removing redundant attributes. Next, the QPSO technique is used to maximize the similarity score between users to obtain clusters. Three different similarity measures are used separately for the dimensionality-reduction and similarity-maximization steps. Experiments are conducted on two popular social network datasets, ego-Twitter and ego-Facebook. The results show that the proposed approach yields better clustering results, in terms of three different performance metrics, than the K-modes and K-means algorithms.
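As a rough illustration of the K-modes baseline the paper compares against, and of the random centroid initialization that causes its local-optimum problem, the following minimal NumPy sketch clusters toy categorical user profiles; the attribute names and values are hypothetical stand-ins for the ego-network attribute data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy categorical user profiles (columns: hypothetical attributes such as
# location, language, and interest); real inputs would come from the
# ego-Facebook / ego-Twitter attribute sets.
X = np.array([
    ["NY", "en", "sports"],
    ["NY", "en", "music"],
    ["LA", "es", "music"],
    ["LA", "es", "sports"],
    ["NY", "en", "sports"],
])

def kmodes(X, k, n_iter=20):
    # Random centroid (mode) initialization -- the source of the local-optimum
    # issue that the proposed QPSO similarity-maximization aims to avoid.
    modes = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        # Simple matching dissimilarity: number of mismatched attributes.
        dist = (X[:, None, :] != modes[None, :, :]).sum(axis=2)
        labels = dist.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                new_mode = []
                for col in members.T:
                    vals, counts = np.unique(col, return_counts=True)
                    new_mode.append(vals[counts.argmax()])  # most frequent category
                modes[j] = new_mode
    return labels, modes

labels, modes = kmodes(X, k=2)
print(labels, modes)
```

Rerunning with a different seed can yield a different partition, which is exactly the sensitivity the QPSO-based similarity maximization is meant to remove.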
An ontology-based approach to designing a NoSQL database for semi-structured and unstructured health data
With the advent of ICT-based healthcare applications, health data in various formats is generated every day in huge volumes. Such data, consisting of unstructured, semi-structured, and structured records, has every characteristic of Big Data. NoSQL databases are generally preferred for storing this type of health data with the goal of improving query performance. However, efficient retrieval and processing of Big Health Data, as well as resource optimization, require suitable data models and careful design of the NoSQL database. Unlike relational databases, no standard methods or tools exist for NoSQL database design. In this work, we adopt an ontology-based schema design approach and propose that an ontology capturing the domain knowledge be used to develop the health data model. An ontology for primary healthcare is described in this paper. We also propose an algorithm for designing the schema of a NoSQL database, taking into account the characteristics of the target NoSQL store, using a related ontology, a sample query set, statistical information about the queries, and the performance requirements of the query set. Our proposed ontology for the primary healthcare domain and the above algorithm, together with a set of queries, are used to generate a schema targeting the MongoDB datastore. The performance of the proposed design is compared with a relational model developed for the same primary healthcare data, and the effectiveness of our approach is demonstrated. The entire experiment was carried out on the MongoDB cloud platform.
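The sketch below illustrates the general idea of a query-driven, ontology-informed document design for MongoDB, not the paper's actual algorithm: it assumes a local MongoDB instance and a hypothetical ontology fragment (Patient, Encounter, Observation), and embeds encounters inside patient documents because the assumed query workload fetches them together.

```python
from pymongo import MongoClient, ASCENDING

# Hypothetical ontology fragment for primary healthcare:
# Patient --hasEncounter--> Encounter --hasObservation--> Observation.
# If the sample query set mostly fetches a patient together with recent
# encounters, the design step embeds encounters inside the patient document
# and indexes the fields those queries filter on.
client = MongoClient("mongodb://localhost:27017")
db = client["primary_healthcare"]

patient_doc = {
    "patient_id": "P-0001",
    "name": "Jane Doe",                      # structured and semi-structured fields coexist
    "encounters": [                          # embedded one-to-many relation
        {
            "date": "2022-11-02",
            "complaint": "fever",
            "observations": [{"code": "temp", "value": 38.4, "unit": "C"}],
        }
    ],
}

db.patients.insert_one(patient_doc)
# Indexes chosen from the (assumed) query statistics.
db.patients.create_index([("patient_id", ASCENDING)], unique=True)
db.patients.create_index([("encounters.date", ASCENDING)])

# Example workload query: a patient's encounters after a given date.
print(db.patients.find_one({"patient_id": "P-0001",
                            "encounters.date": {"$gte": "2022-10-01"}}))
```

A different query set (e.g., population-level reporting over observations) could instead push the design toward a separate, referenced collection; the point is that the schema follows the ontology plus the workload.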
Blockchain and COVID-19 pandemic: applications and challenges
The year 2020 witnessed the emergence of coronavirus (COVID-19), which rapidly spread and adversely affected the global economy, health, and human lives. The COVID-19 pandemic has exposed the limitations of existing healthcare systems in handling public health emergencies in a timely and efficient manner. A large portion of today's healthcare systems are centralized and fall short in providing the information security and privacy, data immutability, transparency, and traceability features needed to detect fraud related to COVID-19 vaccination certification and antibody testing. Blockchain technology can assist in combating the COVID-19 pandemic by ensuring safe and reliable medical supplies, accurate identification of virus hot spots, and establishing data provenance to verify the genuineness of personal protective equipment. This paper discusses potential blockchain applications for the COVID-19 pandemic. It presents the high-level design of three blockchain-based systems that enable governments and medical professionals to efficiently handle health emergencies caused by COVID-19. It discusses important ongoing blockchain-based research projects, use cases, and case studies to demonstrate the adoption of blockchain technology for COVID-19. Finally, it identifies and discusses future research challenges, along with their key causes and guidelines.
Cloud-edge load balancing distributed protocol for IoE services using swarm intelligence
The rapid development of the Internet of Everything (IoE) and cloud services plays a vital role in the growth of smart applications. It provides scalability through the collaboration of cloud servers and copes with the large amounts of data collected by network systems. Edge computing additionally supports efficient utilization of communication bandwidth and meets latency requirements for smart embedded systems. However, it faces significant research issues regarding data aggregation among heterogeneous network services and objects. Moreover, while distributed systems are more precise for data access and storage, machine-to-machine communication needs to be secured against unpredictable events. Accordingly, this research proposes secure data management with a distributed load-balancing protocol using particle swarm optimization, which aims to decrease the response time for cloud users and effectively maintain the integrity of network communication. It combines distributed computing with shifting high-cost computations closer to the requesting node to reduce latency and transmission overhead. The proposed work also protects communicating machines from malicious devices by evaluating trust in a controlled manner. Simulation results reveal a significant performance gain of the proposed protocol over other solutions, improving energy consumption by 20%, success rate by 17%, end-to-end delay by 14%, and network cost by 19% on average across various performance metrics.
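To make the optimization idea concrete, here is a toy NumPy sketch of standard PSO searching for load shares across a few cloud/edge nodes that minimize a simple worst-case response-time cost; the node capacities, latencies, and cost model are hypothetical and do not represent the paper's protocol or trust evaluation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nodes: processing capacity (req/s) and link latency (ms);
# edge nodes have lower latency but lower capacity.
capacity = np.array([50.0, 30.0, 20.0, 10.0])
latency  = np.array([20.0, 10.0,  5.0,  2.0])
total_load = 80.0                               # requests/s to distribute

def response_time(shares):
    """Worst-case node response time under a simple queueing-style cost."""
    shares = np.clip(shares, 0, None)
    shares = shares / (shares.sum() + 1e-9) * total_load
    util = np.minimum(shares / capacity, 0.99)
    return float(np.max(latency + (1.0 / capacity) / (1 - util) * 1000))

n_particles, dim, iters = 30, len(capacity), 100
x = rng.random((n_particles, dim)); v = np.zeros_like(x)
pbest, pbest_cost = x.copy(), np.array([response_time(p) for p in x])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    # Standard PSO velocity/position update with inertia 0.7 and c1 = c2 = 1.5.
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, 0.0, 1.0)
    cost = np.array([response_time(p) for p in x])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best shares:", (gbest / gbest.sum()).round(3), "cost:", round(pbest_cost.min(), 2))
```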
Deep learning-based user experience evaluation in distance learning
The Covid-19 pandemic caused uncertainty in many organizations; institutions gained experience in remote working, and it became clear that high-quality distance education is a crucial component of higher education. The main concern in higher education is the impact of distance education on the quality of learning during such a pandemic. Although this type of education may seem effective and beneficial at first glance, its effectiveness depends heavily on factors such as the availability of online resources and individuals' financial situations. In this study, the effectiveness of e-learning during the Covid-19 pandemic is evaluated using posted tweets, sentiment analysis, and topic modeling techniques. More than 160,000 tweets addressing conditions related to the major change in the education system were gathered from the Twitter social network, and deep learning-based sentiment analysis models and topic models based on the Latent Dirichlet Allocation (LDA) algorithm were developed and analyzed. A Long Short-Term Memory (LSTM)-based sentiment analysis model using word2vec embeddings was used to evaluate the opinions of Twitter users during distance education, and a topic model using the LDA algorithm was built to identify the topics discussed on Twitter. The conducted experiments demonstrate that the proposed model achieved an overall accuracy of 76%. Our findings also reveal negative effects of the Covid-19 pandemic on individuals: 54.5% of tweets were associated with negative emotions, whereas negativity was relatively lower in the YouGov survey emotion reports and in gender-rescaled emotion scores on Twitter. In parallel, we discuss the impact of the pandemic on education and how users' emotions changed due to the drastic changes in the education system, based on the proposed machine learning-based models.
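A compact Keras sketch of the LSTM-with-word-embedding sentiment classifier described above follows; the vocabulary size, sequence length, class count, and the randomly initialized embedding (standing in for pretrained word2vec vectors) are assumptions, and the dummy arrays only show the call signatures.

```python
import numpy as np
from tensorflow.keras import layers, models

vocab_size, seq_len, embed_dim = 20_000, 50, 100   # assumed hyperparameters

model = models.Sequential([
    # In the study a word2vec matrix would initialize this layer;
    # here the embedding is randomly initialized for brevity.
    layers.Embedding(vocab_size, embed_dim),
    layers.LSTM(128),
    layers.Dense(3, activation="softmax"),          # negative / neutral / positive
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy tokenized, padded tweets just to demonstrate training.
X = np.random.randint(0, vocab_size, size=(256, seq_len))
y = np.random.randint(0, 3, size=(256,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:2], verbose=0).round(3))
```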
Blockchain application for central bank digital currencies (CBDC)
Central Bank Digital Currency (CBDC) is a digital version of domestic currency with a unit of account equivalent to the domestic currency. Blockchain or Distributed Ledger Technology (DLT) can be used to implement CBDC to execute and settle peer-to-peer transactions. With the emergence of private money, such as cryptocurrencies and stablecoins, and the growing use of digital payments to lessen the spread of the global pandemic, CBDC is an active research area among central banks worldwide. Many central banks started their CBDC projects by building DLT proofs of concept (PoCs) to replicate wholesale payment systems and then expanded their investigation into other use cases, such as Delivery versus Payment (DvP) and cross-border remittance. Many large economies, including the United States, have projects exploring CBDC. The People's Bank of China (PBoC), China's central bank, has already started pilot testing of its retail digital currency. This paper discusses the application of blockchain to CBDC by presenting CBDC projects carried out by central banks. Moreover, it analyses issues, identifies challenges, and discusses future work in this rapidly evolving field.
COVID-19 CT-images diagnosis and severity assessment using machine learning algorithm
The primary evaluation tool for coronavirus (COVID-19), a pandemic disease, still has serious flaws. To improve this situation, all available facilities and tools in this field should be used to combat the pandemic. Reverse transcription polymerase chain reaction (RT-PCR) is used to evaluate whether or not a person carries the virus, but it cannot establish the severity of the illness. In this paper, we propose a simple, reliable, and automatic system to diagnose the severity of COVID-19 from CT scans into three stages, mild, moderate, and severe, based on a simple segmentation method and three types of features extracted from the CT images: the ratio of infection, statistical texture features (mean, standard deviation, skewness, and kurtosis), and GLCM and GLRLM texture features. Four machine learning techniques, decision trees (DT), K-nearest neighbors (KNN), support vector machines (SVM), and Naïve Bayes, are used in the classification step. A total of 1801 scans are divided into four stages based on the CT findings and the description file provided with the datasets. Our proposed model comprises four steps: preprocessing, feature extraction, classification, and performance evaluation. With the SVM method, the proposed model achieves 99.12%, 98.24%, 98.73%, and 99.9% accuracy for COVID-19 infection at the normal, mild, moderate, and severe stages, respectively. The area under the curve of the model is 0.99. Finally, our proposed model achieves better performance than state-of-the-art models. This will help doctors determine the stage of the infection, shorten the time to diagnosis, and give the appropriate treatment dose for that stage.
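The following simplified sketch shows the flavour of the feature-extraction and classification pipeline using scikit-image GLCM texture features plus basic statistics and an SVM; the random "slices" and labels are placeholders for the segmented CT data, and the infection-ratio and GLRLM features are omitted.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image
from sklearn.svm import SVC

def extract_features(img):
    """Statistical + GLCM texture features from one uint8 CT slice."""
    stats = [img.mean(), img.std(), skew(img.ravel()), kurtosis(img.ravel())]
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = [graycoprops(glcm, p)[0, 0]
               for p in ("contrast", "homogeneity", "energy", "correlation")]
    return np.array(stats + texture)

# Placeholder data: random 64x64 "slices" with random stage labels
# (0 = normal, 1 = mild, 2 = moderate, 3 = severe).
rng = np.random.default_rng(0)
X = np.stack([extract_features(rng.integers(0, 256, (64, 64), dtype=np.uint8))
              for _ in range(40)])
y = rng.integers(0, 4, 40)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:5]))
```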
Security provisions in smart edge computing devices using blockchain and machine learning algorithms: a novel approach
It is difficult to manage massive amounts of data in an overlying environment with a single server. Therefore, it is necessary to understand the security provisions for erratic data in a dynamic environment. The authors are concerned with the security risk of vulnerable data in a Mobile Edge-based distributed environment. Edge computing therefore appears to be an excellent setting in which training can be performed. Combining edge computing and the consensus approach of Blockchain with machine learning techniques can further improve data security, mitigate the possibility of data exposure, and reduce the risk of a data breach. The concept of federated learning accordingly provides a path for training on shared data. A dataset containing vulnerable, exposed, recovered, and secured records was collected, and data security was assessed under two-factor authentication. This paper discusses the evolution of data and security flaws and their corresponding solutions in smart edge computing devices. The proposed model incorporates data security using the consensus approach of Blockchain and machine learning techniques, including several classifiers and optimization techniques. Further, the authors applied the proposed algorithms in an edge computing environment by distributing several batches of data to different clients; client privacy was maintained by using Blockchain servers. Furthermore, the authors segregated the client data into batches that were trained using the federated learning technique. The results obtained in this paper demonstrate the implementation of a Blockchain-based training model in an edge-based computing environment.
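The minimal NumPy sketch below illustrates only the federated-averaging idea mentioned above: clients train locally on their own batch and only the model weights are shared and averaged. The linear model, the synthetic client batches, and the number of rounds are assumptions, and the blockchain-backed aggregation and two-factor authentication are not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features = 5

def local_train(w, X, y, lr=0.1, epochs=20):
    """A few steps of logistic-regression gradient descent on one client's private batch."""
    w = w.copy()
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

# Hypothetical client batches (in the paper these would be the data batches
# distributed to edge clients; labels mark, e.g., vulnerable vs. secured records).
clients = []
for _ in range(4):
    X = rng.normal(size=(100, n_features))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
    clients.append((X, y))

global_w = np.zeros(n_features)
for _ in range(5):                              # communication rounds
    local_ws = [local_train(global_w, X, y) for X, y in clients]
    global_w = np.mean(local_ws, axis=0)        # FedAvg: average the client weights

print("global weights after FedAvg:", global_w.round(3))
```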
Privacy protection framework for face recognition in edge-based Internet of Things
Edge computing (EC) frees Internet of Things (IoT)-based face recognition systems from the limited storage and computing resources of local or mobile terminals. However, data privacy leakage remains a concern. Previous studies focused on only some stages of face data processing, whereas this study focuses on protecting the privacy of face data throughout its entire life cycle. We therefore propose a general privacy protection framework for edge-based face recognition (EFR) systems. To protect the privacy of face images and training models transmitted between edges and the remote cloud, we design a local differential privacy (LDP) algorithm based on the proportion difference of feature information. In addition, we introduce identity authentication and hashing to ensure the legitimacy of the terminal device and the integrity of the face image in the data acquisition phase. Theoretical analysis proves the rationality and feasibility of the scheme. Compared with the non-private baseline and the equal privacy-budget allocation method, our method achieves the best balance between availability and privacy protection in the numerical experiments.
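A small sketch of the LDP idea follows: Laplace noise is added to a face-feature vector under a per-feature privacy budget, with more budget allocated to features that carry a larger share of the information. The proportional allocation rule, sensitivity, and feature vector here are illustrative assumptions, not the paper's exact formula; an equal split of the budget corresponds to the baseline the paper compares against.

```python
import numpy as np

rng = np.random.default_rng(0)

features = rng.normal(size=16)                 # embedding extracted at the edge
epsilon_total = 1.0                            # overall privacy budget
sensitivity = 1.0                              # assumed L1 sensitivity per feature

# Allocate the budget proportionally to each feature's share of the total
# magnitude (illustrative allocation); an equal split would use
# epsilon_total / len(features) for every feature.
weights = np.abs(features) / np.abs(features).sum()
epsilons = np.maximum(epsilon_total * weights, 1e-3)

# Laplace mechanism: scale = sensitivity / epsilon per coordinate.
noisy = features + rng.laplace(scale=sensitivity / epsilons)
print(np.round(noisy - features, 3))           # per-feature perturbation
```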
Enabling secure data sharing with data deduplication and sensitive information hiding in cloud-assisted electronic medical systems
Data sharing is very important for medical researchers studying certain diseases in cloud-assisted electronic medical systems. Nonetheless, shared electronic medical records contain large amounts of duplicate data, which incurs redundant storage. In addition, sharing electronic medical records might expose patients' sensitive information. To address these problems, we propose a secure data sharing scheme with data deduplication and sensitive information hiding for cloud-assisted electronic medical systems. To protect the privacy of sensitive information and enhance deduplication efficiency, we replace the patient's sensitive information in electronic medical records with wildcards before encrypting the whole record. An authorized researcher can decrypt and obtain the electronic medical records while the sensitive information of the shared records remains hidden. Moreover, we classify the diagnostic information of the electronic medical records into different types according to the duplicate ratio, so authorized researchers can selectively download data according to the duplicate ratio of diagnostic information. Our proposed scheme can resist brute-force attacks and single-point-of-failure attacks. The experimental results show that our proposed scheme is more efficient than existing schemes.
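A simplified Python sketch of the record-sanitization step is shown below: sensitive fields are replaced by wildcards, a deduplication tag is derived from the sanitized record, and the whole record is then encrypted. The field names, the SHA-256 tag, and Fernet symmetric encryption are illustrative choices under stated assumptions, not the paper's exact construction.

```python
import hashlib, json
from cryptography.fernet import Fernet

SENSITIVE_FIELDS = {"name", "ssn", "address"}          # assumed sensitive attributes

def sanitize(record):
    """Replace sensitive values with wildcards before encryption."""
    return {k: ("*" if k in SENSITIVE_FIELDS else v) for k, v in record.items()}

def dedup_tag(sanitized):
    """Deterministic tag over the sanitized record so identical diagnoses deduplicate."""
    return hashlib.sha256(json.dumps(sanitized, sort_keys=True).encode()).hexdigest()

record = {"name": "Jane Doe", "ssn": "123-45-6789", "address": "12 Elm St",
          "diagnosis": "type 2 diabetes", "treatment": "metformin"}

sanitized = sanitize(record)
tag = dedup_tag(sanitized)

key = Fernet.generate_key()                             # data key held by the owner
ciphertext = Fernet(key).encrypt(json.dumps(sanitized).encode())

# The cloud stores (tag, ciphertext); an identical sanitized record yields the
# same tag, so duplicates can be detected without decrypting anything.
print(tag[:16], len(ciphertext))
```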
Using FIWARE and blockchain in smart cities solutions
Nowadays, Blockchain is widely used to store decentralized and secure transactions involving cryptocurrencies, e.g., Bitcoin and Ethereum. However, Blockchain can also store other types of information besides monetary transactions. At the same time, innovative solutions for smart cities are concerned with how services and information can be safely stored and shared. For this reason, smart city systems can benefit from using Blockchain to integrate their data and services. These smart solutions also demand consistency and standardization across the industry, yet Blockchain integration varies according to its implementation. FIWARE, an open-source platform framework for smart solutions, adopts the NGSI standards (Context Information Management (CIM); NGSI-LD API: Tech. Rep., CIM and ETSI Industry Specification Group (ISG), 2020) to enable the integration of components and provides the basis for interoperability and portability among smart solutions. Unfortunately, FIWARE does not support any integration with Blockchain technology. Hence, this paper proposes a set of new components that allow FIWARE to be integrated with Blockchain technology. With these components, it is possible to connect Blockchain technology to smart city applications via the FIWARE platform. For instance, we have designed and implemented a FIWARE Blockchain adapter that submits transactions from the FIWARE Context Broker to any Blockchain implementation, and listens for transactions in the opposite direction, without human intervention. In addition, we present a global post-pandemic vaccination case study to evaluate the proposed approach in the smart city context.
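To illustrate one direction of such an adapter, here is a minimal Flask sketch that receives NGSI-v2 notifications from a Context Broker subscription and forwards each entity to a blockchain gateway. The gateway URL, forwarded payload shape, and port are hypothetical, and this is a sketch of the integration pattern rather than the paper's actual component.

```python
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)
BLOCKCHAIN_GATEWAY = "http://localhost:9000/tx"        # hypothetical endpoint

@app.route("/notify", methods=["POST"])
def notify():
    # NGSI-v2 notifications carry the changed entities under "data".
    body = request.get_json(force=True)
    for entity in body.get("data", []):
        # Forward each context update as a transaction to the blockchain side.
        requests.post(BLOCKCHAIN_GATEWAY, json={
            "entityId": entity.get("id"),
            "entityType": entity.get("type"),
            "payload": entity,
        }, timeout=5)
    return jsonify(status="forwarded"), 200

if __name__ == "__main__":
    # A Context Broker subscription would point its notification URL here,
    # e.g. http://<adapter-host>:5000/notify.
    app.run(port=5000)
```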
Artificial intelligence based health indicator extraction and disease symptoms identification using medical hypothesis models
Patient health record analysis models help the medical field understand current conditions and medical needs. Similarly, collecting and analyzing disease features is good practice for helping medical researchers understand the research problems. Various research works have evolved medical data analysis schemes to identify the actual challenges posed by diseases. Computer-based diagnosis models and medical data analysis models are widely applied to gain a better understanding of different diseases. In particular, the field of medical electronics will need appropriate health indicator extraction models in the near future. Existing medical schemes support baseline solutions but lack optimal hypothesis-based solutions. This work describes an optimal hypothesis model and Akin procedures for health record users, to aid health sectors in clinical decision-making on health indications. It proposes Medical Hypothesis and Health Indicators Extraction from Electronic Medical Records (EMR) and the International Classification of Diseases (ICD-10) patient examination database using the Akin method and the Friendship method. In this Health Indicators and Disease Symptoms Extraction (HIDSE) approach, evidence-checking procedures find and collect all possible medical evidence from the existing patient examination reports. The Akin method makes the hypothesis decision from count-based evidence principles. The health indicator extraction scheme extracts all relevant information based on the health indicator query and partial input. Similarly, the Friendship method is used to build information associations between medical data attributes. This Akin-Friendship model helps to build hypothesis structures and trait-based feature extraction principles and is referred to as the Composite Akin Friendship Model (CAFM). The proposed model includes various test cases for developing the medical hypothesis systems; however, it provides limited accuracy in disease classification. In this regard, the proposed HIDSE implements a Deep Learning-based Akin Friendship Method (DLAFM) to improve the accuracy of this medical hypothesis model. In the proposed DLAFM, a Convolutional Neural Network (CNN)-associated Legacy Prediction Model for Health Indicators (LPHI) is developed to tune the CAFM principles. The results show that the proposed health indicator extraction scheme achieves 8-10% better system performance than other existing techniques.
A pre-trained convolutional neural network with optimized capsule networks for chest X-rays COVID-19 diagnosis
Coronavirus disease (COVID-19) is rapidly spreading worldwide. Recent studies show that radiological images contain accurate data for detecting the coronavirus. This paper proposes a pre-trained convolutional neural network (VGG16) combined with Capsule Neural Networks (CapsNet) to detect COVID-19 from unbalanced data sets. CapsNet is chosen for its ability to capture features such as perspective, orientation, and size. The Synthetic Minority Over-sampling Technique (SMOTE) was employed to ensure that new samples were generated close to the sample center, avoiding the production of outliers or changes in the data distribution. As the results may change with the capsule network parameters (capsule dimensionality and routing number), a Gaussian optimization method was used to optimize these parameters. Four experiments were conducted: (1) CapsNet with the unbalanced data sets, (2) CapsNet with data sets balanced by class weighting, (3) CapsNet with data sets balanced using SMOTE, and (4) CapsNet hyperparameter optimization with data sets balanced using SMOTE. The performance improved, achieving an accuracy of 96.58% and an F1-score of 97.08%, a competitive optimized model compared to other related models.
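A short sketch of the SMOTE balancing step with imbalanced-learn is given below; the array shapes and class counts are placeholders standing in for flattened chest X-ray features, not the paper's actual data.

```python
import numpy as np
from collections import Counter
from imblearn.over_sampling import SMOTE

rng = np.random.default_rng(0)

# Placeholder: 200 COVID-negative vs. 40 COVID-positive feature vectors
# (e.g., flattened VGG16 feature maps in the actual pipeline).
X = rng.normal(size=(240, 512))
y = np.array([0] * 200 + [1] * 40)

X_res, y_res = SMOTE(random_state=42).fit_resample(X, y)
print(Counter(y), "->", Counter(y_res))   # minority class synthesized up to 200 samples
```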
Energy efficiency in cloud computing data centers: a survey on software technologies
Cloud computing is a commercial and economic paradigm that has gained traction since 2006 and is presently the most significant technology in the IT sector. From the notion of cloud computing to its energy efficiency, the cloud has been the subject of much discussion. The energy consumption of data centres alone is projected to rise from 200 TWh in 2016 to 2967 TWh in 2030. Data centres require a lot of power to provide services, which increases CO2 emissions. This survey paper discusses software-based technologies that can be used for building green data centres and that include power management at the individual software level. The paper discusses energy efficiency in containers and the problem-solving approaches used for reducing power consumption in data centres. Further, the paper gives details about the impact of data centres on the environment, including e-waste and the various standards adopted by different countries for rating data centres. This article goes beyond demonstrating new green cloud computing possibilities; instead, it focuses the attention and resources of academia and society on a critical issue: long-term technological advancement. The article covers new technologies that can be applied at the individual software level, including techniques applied at the virtualization level, the operating system level, and the application level. It defines measures at each level to reduce energy consumption, which clearly adds value to the current environmental problem of pollution reduction. The article also addresses the difficulties, concerns, and needs that cloud data centres and cloud organisations must grasp, as well as some of the factors and case studies that influence green cloud usage.
Edge computing based secure health monitoring framework for electronic healthcare system
Nowadays, Smart Healthcare Systems (SHS) are frequently used by people for personal healthcare observation via various smart devices. An SHS uses IoT technology and cloud infrastructure for capturing data, transmitting it through smart devices, and providing data storage, processing, and healthcare advice. Processing such a huge amount of data from numerous IoT devices in a short time is quite challenging. Thus, technological frameworks such as edge computing or fog computing can be used as a middle layer between the cloud and the user in an SHS, reducing the response time for data processing at the lower (edge) level. However, the Edge of Things (EoT) also suffers from security and privacy issues. A robust healthcare monitoring framework with secure data storage and access is therefore needed: one that provides a quick response when abnormal data is produced and stores and accesses sensitive data securely. This paper proposes a Secure Framework based on the Edge of Things (SEoT) for smart healthcare systems. The framework is designed for real-time health monitoring while maintaining the security and confidentiality of healthcare data in a controlled manner. It includes clustering approaches for analyzing bio-signal data for abnormality detection and Attribute-Based Encryption (ABE) for bio-signal data security and secure access. The experimental results of the proposed framework show improved performance, maintaining accuracy of up to 98.5% together with data security.
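An illustrative sketch of a clustering-based abnormality check on bio-signal windows follows: k-means is fitted on (assumed) normal heart-rate windows, and new windows far from every learned centroid are flagged for an immediate edge-level response. The simulated signal values, window size, and 3-sigma threshold are assumptions, and the ABE-based access control is not shown.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulated heart-rate readings (bpm): a normal baseline plus a few abnormal spikes.
normal = rng.normal(75, 5, size=(200, 4))       # 4-sample windows
abnormal = rng.normal(130, 8, size=(5, 4))
windows = np.vstack([normal, abnormal])

# Fit clusters on the normal baseline only.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(normal)
baseline = km.transform(normal).min(axis=1)      # distance to nearest centroid
threshold = baseline.mean() + 3 * baseline.std() # assumed abnormality cut-off

dist = km.transform(windows).min(axis=1)
alerts = np.where(dist > threshold)[0]
print("windows flagged for immediate response:", alerts)   # expected: the last 5
```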
Scalable feature subset selection for big data using parallel hybrid evolutionary algorithm based wrapper under apache spark environment
Existing sequential wrapper-based feature subset selection (FSS) algorithms are not scalable and yield poor performance when applied to big datasets. To circumvent these challenges, we propose parallel and distributed hybrid evolutionary algorithm (EA)-based wrappers under Apache Spark. We propose two hybrid EAs based on Binary Differential Evolution (BDE) and Binary Threshold Accepting (BTA): (i) Parallel Binary Differential Evolution and Threshold Accepting (PB-DETA), where BDE and BTA work in tandem in every iteration, and (ii) its ablation variant, Parallel Binary Threshold Accepting and Differential Evolution (PB-TADE). Here, BTA is invoked to enhance the search capability and avoid premature convergence of BDE. For comparison, we also parallelized two state-of-the-art algorithms, adaptive DE (ADE) and permutation-based DE (DE-FS), naming them PB-ADE and P-DE-FS respectively. Throughout, logistic regression (LR) is employed to compute the fitness function, namely the area under the receiver operating characteristic curve (AUC). The effectiveness of the proposed algorithms is tested on five big datasets of varying dimensionality. Notably, PB-TADE turned out to be statistically significantly better than the rest. All the algorithms showed the repeatability property. The proposed parallel model attained a speedup of 2.2-2.9. We also report the feature subsets with the highest AUC and the lowest cardinality.
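A single-machine sketch of the binary-DE wrapper idea is shown below (the Spark parallelization and the BTA phase are left out): each candidate is a 0/1 mask over features, and the fitness is the AUC of a logistic-regression model trained on the selected columns. The synthetic dataset and DE hyperparameters are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=30, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def fitness(mask):
    """AUC of logistic regression on the selected feature columns."""
    if mask.sum() == 0:
        return 0.0
    lr = LogisticRegression(max_iter=200).fit(X_tr[:, mask == 1], y_tr)
    return roc_auc_score(y_te, lr.predict_proba(X_te[:, mask == 1])[:, 1])

pop_size, dim, F, CR, gens = 20, X.shape[1], 0.8, 0.9, 15
pop = (rng.random((pop_size, dim)) < 0.5).astype(int)
fit = np.array([fitness(ind) for ind in pop])

for _ in range(gens):
    for i in range(pop_size):
        a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
        # Binary DE: mutate in continuous space, then threshold back to {0, 1}.
        mutant = (a + F * (b - c)) > 0.5
        cross = rng.random(dim) < CR
        trial = np.where(cross, mutant.astype(int), pop[i])
        f = fitness(trial)
        if f >= fit[i]:                          # greedy selection
            pop[i], fit[i] = trial, f

best = pop[fit.argmax()]
print("AUC:", round(fit.max(), 4), "selected features:", int(best.sum()))
```

In the paper's setting, the fitness evaluations for the population would be distributed across Spark workers; the per-candidate logic stays the same.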
Popular deep learning algorithms for disease prediction: a review
Due to its automatic feature learning ability and high performance, deep learning has gradually become the mainstream of artificial intelligence in recent years, playing a role in many fields. In the medical field especially, the accuracy of deep learning can even exceed that of doctors. This paper introduces several deep learning algorithms, Artificial Neural Networks (ANN), FM-Deep Learning, Convolutional Neural Networks, and Recurrent Neural Networks, and describes their theory, development history, and applications in disease prediction. We analyze the shortcomings of the current disease prediction field and present some current solutions, and we outline two major future trends in disease prediction and the medical field: integrating Digital Twins and promoting precision medicine. This study aims to inspire relevant researchers, so that they can use this article to understand related disease prediction algorithms and then conduct better related research.
EHHR: an efficient evolutionary hyper-heuristic based recommender framework for short-text classifier selection
With the variety of available machine learning heuristics, it becomes difficult to choose an appropriate heuristic to classify short text emerging from various social media sources in the form of tweets and reviews. The No Free Lunch theorem asserts that no heuristic applies to all problems indiscriminately. Moreover, regardless of their success, the available classifier recommendation algorithms only deal with numeric data. To address these limitations, an umbrella classifier recommender is needed that can determine the best heuristic for short-text data. This paper presents an efficient reminisce-enabled classifier recommender framework to recommend a heuristic for classifying new short-text data. The proposed framework, "Efficient Evolutionary Hyper-heuristic based Recommender Framework for Short-text Classifier Selection (EHHR)," reuses previous solutions to predict the performance of various heuristics for an unseen problem. The Hybrid Adaptive Genetic Algorithm (HAGA) in EHHR performs dataset-level feature optimization and performance prediction. HAGA reveals that the most influential features for recommending the best short-text heuristic are average entropy, mean word-string length, adjective variation, verb variation II, and average hard examples. The experimental results show that HAGA is 80% more accurate than the standard Genetic Algorithm (GA). Additionally, EHHR clusters datasets and ranks heuristics cluster-wise, clustering 9 out of 10 problems correctly.
An AI-empowered affect recognition model for healthcare and emotional well-being using physiological signals
Affective computing is central to achieving advanced human-computer interaction and is a popular research direction in artificial intelligence for smart healthcare frameworks. In recent years, the use of electroencephalograms (EEGs) to analyze human emotional states has become a hot spot in the field of emotion recognition. However, the EEG is a non-stationary, non-linear signal that is sensitive to interference from other physiological signals and external factors. Traditional emotion recognition methods suffer from complex algorithm structures and low recognition precision. In this article, based on an in-depth analysis of EEG signals, we study emotion recognition methods in the following respects. First, the DEAP dataset and its excitement (arousal) model were used; the relevant frequency band of the original signal was selected using a Butterworth filter, and the data were then scaled to a common range using min-max normalization. In addition, we performed experiments combining different sliding-window lengths and overlaps to obtain an optimal configuration for feature calculation. We then applied the Discrete Wavelet Transform (DWT) to extract features from the preprocessed EEG data. Finally, a k-Nearest Neighbor (kNN) machine learning model was used for recognition and classification, and different combinations of DWT and kNN parameters were tested and fitted. After 10-fold cross-validation, the precision reached 86.4%. Compared with state-of-the-art research, this method has higher recognition accuracy than conventional recognition methods, while maintaining a simple structure and fast operation.
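A compact sketch of the DWT-feature-plus-kNN pipeline using PyWavelets and scikit-learn is shown below; the simulated EEG windows, the db4 wavelet, the energy/standard-deviation feature set, and the fake labels are assumptions standing in for the DEAP preprocessing described above.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def dwt_features(window, wavelet="db4", level=4):
    """Energy and standard deviation of each DWT sub-band of one EEG window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    feats = []
    for c in coeffs:
        feats.append(np.sum(c ** 2))   # sub-band energy
        feats.append(np.std(c))        # sub-band spread
    return np.array(feats)

# Simulated 1-second EEG windows at 128 Hz with two (fake) emotion classes.
X = np.stack([dwt_features(rng.normal(size=128)) for _ in range(200)])
y = rng.integers(0, 2, 200)

knn = KNeighborsClassifier(n_neighbors=5)
print("10-fold CV accuracy:", round(cross_val_score(knn, X, y, cv=10).mean(), 3))
```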
Video coding deep learning-based modeling for long life video streaming over next network generation
Availability is one of the primary goals of smart networks, especially if the network is under heavy video streaming traffic. In this paper, we propose a deep learning-based methodology to enhance the availability of video streaming systems by developing a prediction model for video streaming quality, required power consumption, and required bandwidth based on video codec parameters. The H.264/AVC codec, one of the most popular codecs used in video streaming and conferencing communications, is chosen as a case study. We model the predicted consumed power, the predicted perceived video quality, and the predicted required bandwidth for the video codec based on video resolution and quantization parameters. We train, validate, and test the developed models through extensive experiments using several video contents. The results show that an accurate model can be built for this purpose: video streaming quality, required power consumption, and required bandwidth can be predicted accurately and used to enhance network availability in a cooperative environment.
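The small scikit-learn sketch below mirrors only the input/output structure of the modeling idea: a neural-network regressor maps codec settings (resolution, quantization parameter) to predicted quality, power, and bandwidth. The training data and the functional forms used to generate it are entirely synthetic assumptions, not measurements from H.264/AVC experiments.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic samples: [height, QP] -> [quality (dB), power (W), bandwidth (Mbps)].
heights = rng.choice([240, 360, 480, 720, 1080], size=300)
qp = rng.integers(20, 45, size=300)
X = np.column_stack([heights, qp]).astype(float)
y = np.column_stack([
    50 - 0.5 * qp + 0.002 * heights,                        # quality drops with QP
    0.5 + heights / 1080 * 3 - 0.02 * qp,                   # power grows with resolution
    (heights / 240) ** 2 * 2 * np.exp(-0.05 * (qp - 20)),   # bandwidth
]) + rng.normal(scale=0.1, size=(300, 3))

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 32),
                                   max_iter=2000, random_state=0))
model.fit(X, y)
print(model.predict([[720, 28]]).round(2))   # predicted [quality, power, bandwidth]
```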