Social network sites are among the most prominent websites, used in almost every facet of life. A social network reflects relationships among social entities, such as acquaintances, co-workers, or co-authors. With the extreme success of social networks, misuse has also escalated, opening the way for various illegal behaviors and security threats. Anomaly detection in social networks has therefore become a critical topic for researchers. The nature of the input data is a significant aspect of any anomaly detection technique. In this field, input networks can be classified as static or dynamic and as attributed or unattributed. In static networks, the number of nodes and the connections between them do not change over time. Attributed networks are pervasive in various domains and constitute a vital element of modern technological architecture, where node attributes complement the topological structure in data exploration. Although social networks build up over time, evaluating them as if they were static is often very beneficial. Numerous studies have addressed anomaly detection, but to the best of our knowledge, research on anomaly detection in static attributed networks has been very limited. This review portrays earlier research on detecting anomalies in static attributed social networks and thoroughly discusses state-of-the-art embedding approaches.
WSNs are low-power systems that are often used in numerous monitoring applications, such as healthcare, environmental, and system health surveillance, in addition to military surveillance. It is important to reduce network resource usage, since many of these applications must be installed in locations that are virtually inaccessible to humans. Many WSN protocols have been established to extend the lifetime of the network and solve this problem. Routing protocols play an important role in the energy efficiency of WSNs, since they help minimize power usage and response time and provide sensor networks with high data density and quality of service. This study also employed a Hopfield neural network, and the findings are presented side by side to enable comparison. This paper also discusses how to easily and accurately capture and handle WSN collisions. Future experiments that employ neural networks and fuzzy structures will be able to prevent such collisions in these respects.
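As a rough illustration of the Hopfield component mentioned above (the abstract does not specify the network size, encoding, or update schedule, so everything below is an assumed minimal sketch rather than the study's model), a discrete Hopfield network stores bipolar patterns with Hebbian weights and recalls them by iterative state updates:

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: build a weight matrix from bipolar (+1/-1) patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)               # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=100):
    """Asynchronous updates until the state settles into a stored attractor."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Hypothetical example: store two 8-bit patterns, recover one from a noisy probe.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])
W = train_hopfield(patterns)
noisy = patterns[0].copy()
noisy[0] = -noisy[0]                      # flip one bit
print(recall(W, noisy))                   # converges back to patterns[0]
```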
Cloud computing is used to achieve sustainability in computing: it reduces energy and resource consumption. Most companies have been moving their applications to the cloud to reduce power and energy consumption as well as carbon emissions. Today's computing landscape is rapidly shifting toward applications that leverage cloud platforms for features such as elasticity, virtualization, low cost, and pay-per-use pricing. Cloud computing's rising demand and versatility are gaining acceptance in the research community as a means of implementing large-scale electronic systems in the form of workflows (sets of tasks). One of the most important objectives of this effort is to trim down the makespan, which is the total time taken by the resources to complete all workflow activities. Another foremost objective of this work is to satisfy all user-defined time constraints while scheduling workflow activities.
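To make the makespan objective concrete, a minimal sketch follows; the schedule representation (resource mapped to a list of task start/finish times) is an assumption for illustration, not the paper's data model:

```python
# Hypothetical schedule: resource -> list of (task, start_time, finish_time).
schedule = {
    "vm1": [("t1", 0, 4), ("t3", 4, 9)],
    "vm2": [("t2", 0, 6), ("t4", 6, 11)],
}

def makespan(schedule):
    """Makespan = latest finish time over all tasks on all resources."""
    return max(finish for tasks in schedule.values()
               for _, _, finish in tasks)

print(makespan(schedule))  # -> 11
```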
The advent of the World Wide Web and the rapid adoption of social media platforms (such as Facebook and Twitter) paved the way for information dissemination never before witnessed in human history. With the current usage of social media platforms, consumers are creating and sharing more information than ever before, some of which is misleading and has no relevance to reality. Automated classification of a text article as misinformation or disinformation is a challenging task; even an expert in a particular domain has to explore multiple aspects before giving a verdict on the truthfulness of an article. In this work, we propose a machine learning ensemble approach for the automated classification of news articles. Our study explores different textual properties that can be used to distinguish fake content from real. Using those properties, we train a combination of different machine learning algorithms with various ensemble methods and evaluate their performance on four real-world datasets. Experimental evaluation confirms the superior performance of our proposed ensemble learner approach in comparison to individual learners. Besides other use cases, news outlets have benefitted from the widespread use of social media platforms by providing updated news in near real time to their subscribers. News media have evolved from newspapers, tabloids, and magazines to digital forms such as online news platforms, blogs, social media feeds, and other digital media formats, making it easier for consumers to acquire the latest news at their fingertips; Facebook referrals alone account for 70% of traffic to news websites. These platforms in their current state are extremely powerful and useful for their ability to allow users to discuss and share ideas and debate issues such as democracy, education, and health. However, such platforms are also used with a negative perspective by certain entities, commonly for monetary gain and in other cases for creating biased opinions, manipulating mindsets, and spreading satire or absurdity; this phenomenon is commonly known as fake news.
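The abstract does not name the individual learners or the textual properties used, so the following sketch only illustrates the general ensemble idea with scikit-learn's TF-IDF features and a hard-voting combination of common classifiers; it is not the authors' exact configuration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical labelled corpus: 1 = fake, 0 = real.
texts = ["shocking miracle cure doctors hate", "parliament passes budget bill"]
labels = [1, 0]

# Hard-voting ensemble over three common text classifiers.
ensemble = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    VotingClassifier(estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=100)),
        ("svm", LinearSVC()),
    ], voting="hard"),
)
ensemble.fit(texts, labels)
print(ensemble.predict(["celebrity secretly replaced by clone"]))
```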
The main theme of this paper is to implement a mobility model in the Cooja simulator and to investigate the impact of mobility on the performance of the Routing Protocol for Low-Power and Lossy Networks (RPL) in the IoT environment. In the real world, mobility occurs frequently; therefore, this paper uses a frequently applied mobility model, Random Waypoint (RWP), for the analysis. RWP can be readily applied to many existing applications. By default, the Cooja simulator does not support mobility models, so BonnMotion is introduced into Cooja as a plugin. As IoT deals with resource-constrained environments, a comparison is made between the static environment and the mobile environment in terms of power consumption. As expected, the results indicate that mobility affects RPL in terms of power consumption.
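For reference, the Random Waypoint model itself is simple: each node repeatedly picks a uniformly random destination and speed, travels there in a straight line, then pauses. A minimal sketch (the area, speed range, and pause time are illustrative assumptions):

```python
import random, math

def random_waypoint(area=(100.0, 100.0), speed=(0.5, 1.5), pause=2.0, steps=5):
    """Yield (x, y) waypoints and travel times for one node under RWP."""
    x, y = random.uniform(0, area[0]), random.uniform(0, area[1])
    for _ in range(steps):
        nx, ny = random.uniform(0, area[0]), random.uniform(0, area[1])
        v = random.uniform(*speed)
        travel_time = math.hypot(nx - x, ny - y) / v
        yield (nx, ny), travel_time + pause   # move, then pause at destination
        x, y = nx, ny

for waypoint, t in random_waypoint():
    print(f"move to {waypoint[0]:.1f},{waypoint[1]:.1f} over {t:.1f}s")
```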
Background: This study investigated the utilization of consumer health informatics in health promotion among the staff of tertiary institutions in Rivers State. Subjects & Methods: A cross-sectional descriptive research design was used, and two research questions guided the paper. The population comprised all 13,046 staff of tertiary institutions in Rivers State, from which a sample of 1,226 staff was drawn using multi-stage sampling techniques. An instrument titled "Utilization of Consumer Health Informatics in Health Promotion Questionnaire" (UCHIHPQ) was adopted for data collection; the instrument was validated, and reliability testing yielded an index of 0.80. Mean and standard deviation statistics were used to answer the research questions, and the statistical analysis was performed with SPSS v23. Results: The results revealed, among other findings, that to a very large extent the respondents accepted that consumer health informatics was used to improve their nutritional and physical health status. Conclusion: It was therefore concluded and recommended, among others, that staff of tertiary institutions in Rivers State should regularly use intelligent informatics applications to attain a healthy balance between self-reliance and seeking professional help concerning nutritional and physical health matters.
In India, the education sector has always been attentive to adopting innovations and techniques in the teaching-learning process due to various challenges. Nowadays, academic institutions are becoming flexible in accepting new teaching and learning techniques to satisfy students, who are cited as the most vital entity in the educational sector. New technologies, tools, and techniques are proving to be a boon for innovative teaching and learning practices. One emerging teaching technique is blended learning, a process that refers to "mixing of the different learning environments for educational transfer". It combines the traditional face-to-face classroom method with online learning supported by advanced technology and tools. Blended learning should be viewed not only as a temporal construct but also as a fundamental redesign model, through which content delivery becomes digital and online. Truly blended learning requires that teachers adopt the role of guides and mentors, and that learning go beyond the classroom walls. Blended learning is also known as hybrid learning. Although the Indian government is taking initiatives to implement a blended learning approach, there is a need to assess the behavioral aspects of students using this approach. Adopting a blended learning approach must start with a re-examination of the intended learning outcomes. The deployment of the cloud in the blended learning process makes its existence stronger. This study reviews literature selected to identify the need for blended learning deployed with the cloud in the teaching-learning process in the higher education sector.
Many people are kept from a normal lifestyle because of hearing loss. Most of them do not use hearing aids due to the various discomforts of wearing them. The foremost problem is that the device introduces unpleasant whistling sounds, caused by the changing environmental noise the user faces daily. This paper describes the development of an algorithm focused on adaptive feedback cancellation, which improves the listening effort of the user. A genetic algorithm is the computational technique used to enhance these features. The performance is also compared with other comprehensive analysis methods to evaluate its standing.
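The abstract does not detail the genetic algorithm's encoding or fitness function; one plausible reading, assumed in the sketch below, is evolving the coefficients of the feedback-cancellation filter to minimize the residual feedback energy in the microphone signal:

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(coeffs, mic, feedback):
    """Residual energy after subtracting the estimated feedback (lower is better)."""
    estimate = np.convolve(feedback, coeffs, mode="same")
    return np.mean((mic - estimate) ** 2)

def genetic_filter(mic, feedback, taps=8, pop=30, gens=50, mut=0.1):
    population = rng.normal(0, 0.5, (pop, taps))
    for _ in range(gens):
        scores = np.array([fitness(c, mic, feedback) for c in population])
        parents = population[np.argsort(scores)[: pop // 2]]    # selection
        cut = taps // 2
        children = np.concatenate(                              # one-point crossover
            [parents[rng.permutation(len(parents))][:, :cut],
             parents[rng.permutation(len(parents))][:, cut:]], axis=1)
        children += rng.normal(0, mut, children.shape)          # mutation
        population = np.concatenate([parents, children])
    return population[0]                                        # best of last generation

# Hypothetical signals: microphone picks up speech plus a delayed feedback copy.
speech = rng.normal(size=1000)
feedback = np.roll(speech, 3) * 0.6
best_coeffs = genetic_filter(speech + feedback, feedback)
```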
The notable developments in renewable energy facilities and resources help reduce the cost of production and increase production capacity. Therefore, developers in renewable energy evaluate the overall performance of the various equipment, methods, and structures and then determine the optimal variables for the design of energy production systems. Variables include equipment characteristics and quality, geographical location, and climatic variables such as solar irradiance, temperature, humidity, and dust. This paper investigates and reviews current big data methods and tools in solar energy production. It discusses a comprehensive two-stage design and evaluation for examining the optimal structure of renewable energy systems. In the design stage, technical and economic aspects are discussed based on a robust analysis of all input/output variables to determine the highest performance. In the evaluation stage, the effectiveness of each method is assessed under different conditions, and each qualitative indicator is converted into a quantitative measure using extensive data analysis methods to determine the overall performance of the various qualitative variables. The paper also provides an in-depth analysis of the mathematical techniques used in measuring the efficiency of renewable energy production systems and discusses future axes of work in the field.
Green computing focuses on reducing energy consumption, resource usage, and carbon dioxide emissions. It was found that, last year, Google used about 12.4 terawatt-hours of electricity. Energy consumption in data centers is reduced by decreasing resource utilization, that is, by switching off computing nodes or shifting them to sleep mode; when servers are in use, energy consumption is minimized through energy-efficient scheduling and optimization techniques. In Wireless Sensor Networks (WSN), sensor nodes are deployed in remote areas and powered by batteries, which limits their lifetime; energy-efficient techniques can therefore increase the uptime of these battery-operated devices. This work studies various energy-efficient techniques that minimize energy consumption in Data Centers (DC) and algorithms that increase the uptime of battery-operated devices in WSN.
The transportation problem is widely applied in the real world. It aims to minimize the total shipment cost from a number of sources to a number of destinations. This paper presents a new method named Dhouib-Matrix-TP1, which generates an initial basic feasible solution based on the standard deviation metric in a very small number of simple iterations. A comparative study is carried out to verify the performance of the proposed Dhouib-Matrix-TP1 heuristic.
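The abstract does not spell out the heuristic's internal rules, so the sketch below is only a hedged illustration of a standard-deviation-guided initial basic feasible solution (pick the row or column whose remaining costs have the smallest spread, allocate to its cheapest cell); it should not be read as a faithful reproduction of Dhouib-Matrix-TP1:

```python
import numpy as np

def std_dev_ibfs(cost, supply, demand):
    """Standard-deviation-guided IBFS sketch: repeatedly pick the row or column
    whose remaining costs have the smallest spread, and allocate as much as
    possible to its cheapest cell (a simplification, not the exact TP1 rules)."""
    cost = cost.astype(float)
    supply, demand = supply.copy(), demand.copy()
    alloc = np.zeros_like(cost)
    active_r, active_c = list(range(len(supply))), list(range(len(demand)))
    while active_r and active_c:
        sub = cost[np.ix_(active_r, active_c)]
        row_std, col_std = sub.std(axis=1), sub.std(axis=0)
        if row_std.min() <= col_std.min():
            r = active_r[row_std.argmin()]
            c = active_c[np.argmin(cost[r, active_c])]
        else:
            c = active_c[col_std.argmin()]
            r = active_r[np.argmin(cost[active_r, c])]
        q = min(supply[r], demand[c])       # ship as much as possible
        alloc[r, c] = q
        supply[r] -= q; demand[c] -= q
        if supply[r] == 0: active_r.remove(r)
        if demand[c] == 0: active_c.remove(c)
    return alloc

# Toy balanced instance (invented numbers).
cost = np.array([[4, 8, 8], [16, 24, 16], [8, 16, 24]])
alloc = std_dev_ibfs(cost, np.array([76, 82, 77]), np.array([72, 102, 61]))
print((alloc * cost).sum())  # total shipment cost of the initial solution
```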
The evolution of information technology has gone a long way toward bringing Igbo, one of the major Nigerian languages, into the digital space. Some online service providers report news, publish articles, and offer search in the language. This advancement will likely result in the generation of huge volumes of textual data in the language that need to be organized, managed, and classified efficiently for easy information access, extraction, and retrieval by end users. This work presents an enhanced model for Igbo text classification based on N-gram and K-Nearest Neighbour techniques. Considering the peculiarities of the Igbo language, the N-gram model was adopted for text representation, with unigram, bigram, and trigram techniques. Classification of the represented text was done using the K-Nearest Neighbour technique. The model is implemented in the Python programming language together with tools from the Natural Language Toolkit (NLTK). The performance of the Igbo text classification system was evaluated by calculating recall, precision, and F1-measure on the N-gram represented text. The results show that classification on bigram-represented Igbo text has the highest degree of exactness (precision), trigrams have the lowest precision, and all three N-gram techniques achieve the same level of completeness (recall). The bigram text representation technique is therefore strongly recommended for any text-based system in Igbo. This model can be adopted in text analysis, text mining, information retrieval, natural language processing, and any intelligent text-based system in the language.
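A minimal sketch of the bigram-plus-KNN pipeline follows; the Igbo snippets and category labels are invented for illustration, and scikit-learn's CountVectorizer stands in for whatever representation the authors built with NLTK:

```python
from nltk.util import ngrams
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neighbors import KNeighborsClassifier

# NLTK view of the representation: word bigrams of an Igbo phrase.
print(list(ngrams("ndewo nwanne m".split(), 2)))

# Hypothetical labelled Igbo snippets (categories are illustrative).
docs = ["ndewo nwanne m", "ahia di n'obodo anyi", "nnoo na ulo akwukwo"]
labels = ["greeting", "commerce", "education"]

# Bigram features (ngram_range=(2, 2)); unigram/trigram runs are analogous.
vectorizer = CountVectorizer(ngram_range=(2, 2), analyzer="word")
X = vectorizer.fit_transform(docs)

knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X, labels)
print(knn.predict(vectorizer.transform(["ndewo nwanne"])))
```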
Improving agricultural productivity is an imminent need to meet the food requirements of a constantly growing population. This need can be gracefully satisfied if the farming process is integrated with technologies such as big data and IoT. The integration of agricultural processes with modern technologies has emerged as smart agriculture technology. This research work focuses on proving the suitability of big data analytics for smart agricultural processes in terms of increasing production and quality of yields with fewer resources and less overhead. This paper expounds an extensive review of related work in smart agricultural farming and the challenges in implementing smart farming technologies at large scale, followed by a conceptual framework model for the effective implementation of big data together with IoT devices in smart farming.
Social bots are computer programs created to automate general human activities such as the generation of messages. The rise of bots on social network platforms has led to malicious activities such as content pollution by spammers, malware distribution, and the dissemination of misinformation. Most researchers have focused on detecting bot accounts on social media platforms to avoid damage to the opinions of users. In this work, an n-gram based approach is proposed for bot versus human detection, using content-based features of character n-grams and word n-grams. Character and word n-grams have proved successful at improving accuracy in various authorship analysis tasks. A huge number of n-grams is identified after applying different pre-processing techniques, and the high dimensionality of the features is reduced using the Relevant Discrimination Criterion feature selection technique. The text is represented as vectors using the reduced feature set, and different term weight measures are used to compute the weights of n-gram features in the document vector representation. Two classification algorithms, Support Vector Machine and Random Forest, are used to train the model on the document vectors. The proposed approach was applied to the dataset provided in the PAN 2019 bot detection task. The Random Forest classifier obtained the best accuracy of 0.9456 for bot/human detection.
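Since the Relevant Discrimination Criterion is not available off the shelf, the sketch below omits that selection step and only illustrates the combined character/word n-gram representation with a Random Forest classifier, on invented toy data:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline, make_union
from sklearn.ensemble import RandomForestClassifier

# Hypothetical tweets: 1 = bot, 0 = human.
tweets = ["WIN NOW!!! click http://spam.example", "had a lovely walk today"]
labels = [1, 0]

# Character 3-grams and word 1-2-grams, concatenated into one feature space.
features = make_union(
    TfidfVectorizer(analyzer="char", ngram_range=(3, 3)),
    TfidfVectorizer(analyzer="word", ngram_range=(1, 2)),
)
model = make_pipeline(features, RandomForestClassifier(n_estimators=200))
model.fit(tweets, labels)
print(model.predict(["FREE followers!!! visit http://spam.example"]))
```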
The high density of H-CRAN deployments combined with frequent UE handover may degrade throughput, and inter- and intra-cell interference diminishes the improved data rate of H-CRAN macrocells. Infrastructure equipment such as RRHs and BBUs consumes more energy to reduce UE energy consumption, so there is a trade-off between operator-side and UE-side energy conservation. In this paper, we propose a utility-based joint power control and resource allocation (UJPCRA) algorithm for the heterogeneous cloud radio access network (H-CRAN). In this framework, the power consumption of baseband units (BBUs), remote radio heads (RRHs), and the macrocell base station (MBS) is estimated by predicting their dynamic loads. The data rate achievable by a UE associated with each RRH and the MBS on resource block RBk is then estimated. A user wishing to connect to an RRH or the MBS then checks the corresponding utility, seeking the minimum expected energy consumption and the maximum expected data rate. If a UE with high-priority traffic connected to the MBS cannot achieve its desired data rate, it can cooperatively seek the assistance of any RRH for assigning the remaining RBs. Experimental results show that the proposed UJPCRA algorithm achieves higher throughput, resource utilization, and energy efficiency with a reduced packet loss ratio compared to existing techniques.
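The exact form of the utility is not given in the abstract; a common choice, assumed here, is the ratio of expected data rate to expected energy consumption, with the UE associating to whichever RRH or MBS maximizes it:

```python
# Hypothetical candidate nodes with predicted rate (Mbps) and energy (J/s).
candidates = {
    "RRH-1": {"rate": 40.0, "energy": 1.2},
    "RRH-2": {"rate": 35.0, "energy": 0.8},
    "MBS":   {"rate": 25.0, "energy": 2.0},
}

def utility(node):
    """Assumed utility: expected data rate per unit of expected energy."""
    return node["rate"] / node["energy"]

best = max(candidates, key=lambda name: utility(candidates[name]))
print(best)  # UE associates with the node offering the best rate/energy trade-off
```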
In the modern world, tuberculosis (TB) is regarded as a serious health issue with a high mortality rate. TB can be cured completely through early diagnosis, and one tool used to achieve this is the chest X-ray (CXR), which is used to screen for active TB. An enhanced deep learning (DL) model is implemented for automatic tuberculosis detection. This work comprises the phases of preprocessing, segmentation, feature extraction, and optimized classification. Initially, the CXR image is preprocessed and segmented using Adaptive Fuzzy C-Means (AFCM) clustering. Then feature extraction is performed and several features are extracted. Finally, these features are given to the DL classifier, a Deep Belief Network (DBN). To improve classification accuracy and optimize the DBN, a metaheuristic Adaptive Monarch Butterfly Optimization (AMBO) algorithm is used. Here, the Deep Belief Network with Adaptive Monarch Butterfly Optimization (DBN-AMBO) is used to enhance accuracy, reduce the error function, and optimize the weighting parameters. The overall implementation is carried out on the Python platform. The performance of the DBN-AMBO was evaluated on the MC and SC datasets and compared with other approaches on the basis of certain metrics.
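The adaptive modifications in AFCM are not described in the abstract, so the sketch below shows only the standard fuzzy C-means iteration on pixel intensities as the baseline that the adaptive variant would extend:

```python
import numpy as np

def fcm(pixels, k=2, m=2.0, iters=50):
    """Standard fuzzy c-means on intensities; the paper's adaptive variant (AFCM)
    modifies this, but its exact adaptation rules are not given in the abstract."""
    rng = np.random.default_rng(0)
    u = rng.dirichlet(np.ones(k), size=len(pixels))        # fuzzy memberships
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ pixels) / um.sum(axis=0)         # weighted centroids
        d = np.abs(pixels[:, None] - centers[None, :]) + 1e-9
        u = 1.0 / (d ** (2 / (m - 1)))                     # inverse-distance update
        u /= u.sum(axis=1, keepdims=True)                  # renormalize rows
    return u, centers

# Hypothetical flattened CXR intensities in [0, 1]; two clusters, e.g. lung/background.
pixels = np.concatenate([np.random.rand(100) * 0.3, 0.7 + np.random.rand(100) * 0.3])
u, centers = fcm(pixels)
labels = u.argmax(axis=1)                                  # hard segmentation labels
```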
Unmanned aerial vehicles (UAVs) are aerial systems controlled remotely by human operators or autonomously by onboard systems. Massive advancements in electronics and information technology have prompted the popularity and growth of UAVs, and as a result civilian tasks can now be accomplished with UAVs in a more effective, efficient, and secure way. Commonly known as drones, UAVs are developed and operated using a variety of technologies such as machine learning, computer vision, artificial intelligence, and collision avoidance. Having become more affordable and accessible, drone technology has grown popular among civilians; it is constantly evolving and can be used across a variety of fields. The application of drones makes a huge difference in the most demanding and complex industrial environments, such as the mining industry, maritime, oil, gas, and seaports. The usage of drones is increasing among industrialists to improve and optimize processes as well as to enhance operational efficiency in industrial processes. This chapter discusses UAVs across a wide range of topics, including the evolution and historical perspectives of UAVs, a taxonomy of UAVs, the significance of UAVs to society and industry, and industrial and academic perspectives on UAVs.
Alzheimer's disease is a chronic neurological brain disease. Early diagnosis of Alzheimer's disease may prevent the occurrence of memory cell injury. Neuropsychological tests are commonly used to diagnose Alzheimer's disease, but this technique has limited specificity and sensitivity. This article addresses the issue with an early diagnosis model for Alzheimer's disease based on a hybrid metaheuristic with a multi-feed-forward neural network. The proposed detection model includes four major phases: pre-processing, feature extraction, feature selection, and classification (disease detection). Initially, the collected raw data is pre-processed using the SPM12 package in MATLAB. Then, from the pre-processed data, statistical features (mean, median, and standard deviation) and DWT features are extracted. From the extracted features, the optimal features are selected using the new hybrid sine cosine firefly algorithm (HSCAFA), a conceptual improvement of the standard sine cosine optimization algorithm and the firefly optimization algorithm. Finally, disease detection is accomplished via the new regression-based multi-feed-forward neural network (MFNN), from which the final detection outcome is acquired. The proposed methodology is implemented on the Python platform, and performance is evaluated with metrics such as precision, recall, and accuracy.
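A hedged sketch of the feature-extraction phase follows; the wavelet family ('db4'), the single-level decomposition, and the toy record are assumptions, since the abstract only names the statistical features and the DWT:

```python
import numpy as np
import pywt

def extract_features(signal):
    """Statistical features named in the abstract plus DWT coefficients;
    the wavelet family ('db4') and level are assumptions, not the paper's."""
    stats = [np.mean(signal), np.median(signal), np.std(signal)]
    approx, detail = pywt.dwt(signal, "db4")       # single-level DWT
    return np.concatenate([stats, approx, detail])

# Hypothetical pre-processed 1-D record.
record = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * np.random.randn(256)
features = extract_features(record)
print(features.shape)                               # combined feature vector
```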