Computer Science articles list

An exhaustive review on state-of-the-art techniques for anomaly detection on attributed networks

Social networking sites are among the most prominent websites, used in almost every facet of life. A social network reflects relationships among social entities, such as acquaintances, co-workers, or co-authors. With the enormous success of social networks, misuse has also escalated, opening the way for various illegal behaviours and security threats. Anomaly detection in social networks has therefore become a critical research topic. The nature of the input data is a significant aspect of an anomaly detection technique. In this field, input networks can be classified as static or dynamic and attributed or unattributed. In static networks, the number of nodes and the connections between them do not change with time. Attributed networks are pervasive in various domains and constitute a vital element of modern technological architecture, where node attributes complement the topological structure in data exploration. Although social networks build up over time, evaluating them as if they were static is often very useful. Numerous studies have addressed anomaly detection, but to the best of our knowledge, research on anomaly detection in static attributed networks has been very limited. This review surveys earlier research on detecting anomalies in static attributed social networks and thoroughly discusses state-of-the-art embedding approaches.
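
As a rough illustration of the problem setting only (not any particular method surveyed here), the sketch below scores each node of an attributed graph by how far its attribute vector deviates from the average of its neighbours' attributes; the graph, the attribute key "x", and the deviation-based score are illustrative assumptions.

# Minimal illustration: neighbourhood attribute-deviation anomaly score on an attributed graph.
import networkx as nx
import numpy as np

def attribute_deviation_scores(G, attr_key="x"):
    """Return {node: anomaly score} for an attributed graph G.
    Each node is expected to carry a NumPy attribute vector under `attr_key`."""
    scores = {}
    for v in G.nodes:
        neighbours = list(G.neighbors(v))
        if not neighbours:
            scores[v] = 0.0
            continue
        own = np.asarray(G.nodes[v][attr_key], dtype=float)
        neigh_mean = np.mean([G.nodes[u][attr_key] for u in neighbours], axis=0)
        scores[v] = float(np.linalg.norm(own - neigh_mean))
    return scores

# Toy usage: a small graph in which node 3 is given an outlying attribute vector.
G = nx.path_graph(5)
rng = np.random.default_rng(0)
for v in G.nodes:
    G.nodes[v]["x"] = rng.normal(0.0, 1.0, size=4)
G.nodes[3]["x"] += 10.0          # inject an attribute anomaly
print(attribute_deviation_scores(G))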

Mohd Haroon

A study of WSN and analysis of packet drop during transmission

WSNs (wireless sensor networks) are low-power systems often used in numerous monitoring applications, such as healthcare, environmental, and system health surveillance, in addition to military surveillance. It is important to reduce network resource usage, since many of these applications must be installed in locations that are virtually inaccessible to humans. Many WSN protocols have been established to extend the lifetime of the network and address this problem. Routing protocols play an important role in the energy efficiency of WSNs, since they help minimize power usage and response time while providing sensor networks with high data density and quality of service. This study also employed a Hopfield neural network, and the findings are presented side by side to enable comparison. The paper also discusses how WSN collisions can be captured and handled easily and accurately. Future work that uses neural networks and fuzzy structures should be able to prevent such collisions.
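
The abstract mentions a Hopfield neural network without spelling out how it is applied to collision analysis; purely as background, the sketch below shows a minimal binary Hopfield network (Hebbian storage plus asynchronous recall) in NumPy. Its connection to packet-drop analysis is left open by the abstract and is not assumed here.

# Minimal binary Hopfield network: Hebbian storage + asynchronous recall (background only).
import numpy as np

class Hopfield:
    def __init__(self, n):
        self.W = np.zeros((n, n))

    def store(self, patterns):
        # patterns: array of shape (num_patterns, n) with entries in {-1, +1}
        for p in np.asarray(patterns):
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0)

    def recall(self, state, steps=100, rng=None):
        rng = rng or np.random.default_rng(0)
        s = np.asarray(state, dtype=float).copy()
        for _ in range(steps):
            i = rng.integers(len(s))                   # asynchronous update of one unit
            s[i] = 1.0 if self.W[i] @ s >= 0 else -1.0
        return s

# Toy usage: store one pattern, then recover it from a corrupted copy.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
net = Hopfield(len(pattern))
net.store([pattern])
noisy = pattern.copy()
noisy[0] *= -1
print(net.recall(noisy))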

Mohd Haroon

Satiating user-delineated time constraints while scheduling workflows in cloud environments

Cloud computing is used to achieve sustainability in computing: it reduces energy and resource consumption. Most companies have been moving their applications to the cloud to reduce power, energy, resource usage, and carbon emissions. Today's computing landscape is rapidly shifting toward applications that leverage cloud platforms for features such as elasticity, virtualization, low cost, and pay-per-use. Cloud computing's rising demand and versatility are gaining acceptance in the research community as a means of implementing large-scale electronic systems in the form of workflows (sets of tasks). One of the most important objectives of this work is to trim down the makespan, which is the total time taken by the resources to complete all workflow activities. Another foremost objective is to satisfy all user-delineated time constraints while scheduling workflow activities.
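
As a minimal sketch of the two objectives named above, makespan and a user-delineated deadline, the code below computes the makespan of a simple schedule built with a longest-task-first heuristic and checks it against a deadline; the task durations, resource count, and heuristic are placeholders, not the paper's scheduler.

# Hypothetical example: greedy assignment of workflow tasks (treated as independent here)
# to resources; makespan = finish time of the busiest resource, checked against a deadline.
durations = [4.0, 2.0, 7.0, 1.0, 3.0, 5.0]   # task run times (illustrative)
num_resources = 2
deadline = 12.0                               # user-delineated time constraint

loads = [0.0] * num_resources
for d in sorted(durations, reverse=True):     # longest-processing-time-first heuristic
    loads[loads.index(min(loads))] += d       # put the task on the least-loaded resource

makespan = max(loads)
print(f"makespan = {makespan}, deadline met: {makespan <= deadline}")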

Mohd Haroon

Fake news detection using machine learning ensemble methods

The advent of the World Wide Web and the rapid adoption of social media platforms (such as Facebook and Twitter) paved the way for information dissemination on a scale never before witnessed in human history. With the current usage of social media platforms, consumers are creating and sharing more information than ever before, some of it misleading and with no relevance to reality. Automated classification of a text article as misinformation or disinformation is a challenging task; even an expert in a particular domain has to explore multiple aspects before giving a verdict on the truthfulness of an article. In this work, we propose a machine learning ensemble approach for the automated classification of news articles. Our study explores different textual properties that can be used to distinguish fake content from real. Using those properties, we train a combination of different machine learning algorithms with various ensemble methods and evaluate their performance on four real-world datasets. Experimental evaluation confirms the superior performance of our proposed ensemble learner approach in comparison to individual learners. Besides other use cases, news outlets have benefitted from the widespread use of social media platforms by providing updated news in near real time to their subscribers. The news media evolved from newspapers, tabloids, and magazines to digital forms such as online news platforms, blogs, social media feeds, and other digital media formats, and it became easier for consumers to acquire the latest news at their fingertips; Facebook referrals account for 70% of traffic to news websites. These platforms, in their current state, are extremely powerful and useful for their ability to allow users to discuss and share ideas and debate issues such as democracy, education, and health. However, such platforms are also used with a negative perspective by certain entities, commonly for monetary gain and in other cases for creating biased opinions, manipulating mindsets, and spreading satire or absurdity. This phenomenon is commonly known as fake news.
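
The abstract does not name the individual learners or the ensemble scheme, so the sketch below is only one plausible instantiation: TF-IDF text features feeding a soft-voting ensemble of logistic regression, random forest, and naive Bayes in scikit-learn; the four toy articles and their labels are invented placeholders.

# Hedged sketch of an ensemble fake-news classifier (assumed components, not the paper's exact setup).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB

texts = ["scientists confirm the study results", "shocking miracle cure they hide from you",
         "parliament passes the annual budget", "celebrity secret revealed doctors hate this"]
labels = [0, 1, 0, 1]   # 0 = real, 1 = fake (toy placeholder data)

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=100)),
                ("nb", MultinomialNB())],
    voting="soft")                             # average predicted probabilities across learners
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), ensemble)
model.fit(texts, labels)
print(model.predict(["miracle cure shocks doctors"]))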

Kamal Singh

Impact of mobility on power consumption in RPL

The main theme of this paper is to implement a mobility model in the Cooja simulator and to investigate the impact of mobility on the performance of the Routing Protocol for Low-Power and Lossy Networks (RPL) in the IoT environment. In the real world, mobility occurs frequently; therefore, a frequently used mobility model, Random Waypoint (RWP), is used for the analysis. RWP can be readily applied to many existing applications. By default, the Cooja simulator does not support mobility models, so BonnMotion is introduced into Cooja as a plugin. As IoT deals with resource-constrained environments, a comparison is made between the static environment and the mobile environment in terms of power consumption. As expected, the results indicate that mobility affects RPL in terms of power consumption.

Chandra Sekhar Sanaboina

Acceptance of cloud-deployed blended learning environment by students in higher education sector - a literature review

In India, the education sector has always been attentive to adopting innovations and techniques in the teaching-learning process due to various challenges. Nowadays, academic institutions are becoming more flexible in accepting new teaching and learning techniques to satisfy students, who are cited as the most vital entity in the educational sector. New technologies, tools, and techniques are proving to be a boon for innovative teaching and learning practices. One emerging teaching technique is blended learning, a process that refers to "mixing of different learning environments for educational transfer". It combines the traditional face-to-face classroom method with online learning supported by advanced technology and tools. Blended learning should be viewed not only as a temporal construct but also as a fundamental redesign model, through which content delivery becomes digital and online. Truly blended learning requires that teachers adopt the role of guides and mentors and that learning goes beyond the classroom walls. Blended learning is also known as hybrid learning. Although the Indian Government is taking initiatives to implement a blended learning approach, there is still a need to assess the behavioural readiness of students to use it. Adopting a blended learning approach must start with a re-examination of the intended learning outcomes. The deployment of the cloud in the blended learning process makes its existence stronger. This study is a review of literature selected to identify the need for blended learning deployed with the cloud in the teaching-learning process in the higher education sector.

Inderbir Kaur

Study of an effective way of detecting unexpected permission authorization to mobile apps

The recent boom in Android mobile device usage has caused a shift in information technology and has affected the way information and data are stored and shared among mobile users. The advent of social networking applications also demands the availability of resources that can be shared among authentic users. This paper reviews and compares the available techniques and solutions for detecting unexpected permission authorization to mobile apps. It is observed that malware for the Android system is also growing significantly, and current solutions for detecting malware on smartphones are still ineffective.

Manisha Patil

Implementation of big data analytics for simulating, predicting and optimizing the solar energy production

Notable developments in renewable energy facilities and resources help reduce the cost of production and increase production capacity. Therefore, developers in renewable energy evaluate the overall performance of the various equipment, methods, and structures and then determine the optimal variables for the design of energy production systems. Variables include equipment characteristics and quality, geographical location, and climatic variables such as solar irradiance, temperature, humidity, dust, etc. This paper investigates and reviews current big data methods and tools in solar energy production. It discusses a comprehensive two-stage design and evaluation for examining the optimal structure for renewable energy systems. In the design stage, technical and economic aspects are discussed based on a robust analysis of all input/output variables to determine the highest performance. Next, the effectiveness of each method is assessed and evaluated under different conditions. Each qualitative indicator is then converted into a quantitative measure using extensive data analysis methods to determine the overall performance of the various qualitative variables. The paper also provides an in-depth analysis of the mathematical techniques used in measuring the efficiency of renewable energy production systems and discusses future directions of work in this specific field of energy.
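
As a small, hedged illustration of the prediction aspect only (not the paper's pipeline, tools, or data), the sketch below fits a regression model mapping the climatic variables named in the abstract to produced energy, using synthetic placeholder values.

# Illustrative only: regression from climatic variables to solar output on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
irradiance = rng.uniform(0, 1000, n)      # W/m^2
temperature = rng.uniform(10, 45, n)      # deg C
humidity = rng.uniform(10, 90, n)         # %
# Placeholder ground truth: output grows with irradiance, drops slightly with heat.
output = 0.18 * irradiance - 0.5 * (temperature - 25) + rng.normal(0, 5, n)

X = np.column_stack([irradiance, temperature, humidity])
X_train, X_test, y_train, y_test = train_test_split(X, output, random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))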

ACAA PUB

Computer fundamentals

The computer, as a revolution, has left no area of life untouched in the present world. It is of tremendous help in all fields of life; hence, knowledge of computers is a necessity for everybody in this global village. The invention of the computer has transformed our simple manual work into a sophisticated life of automated work, meeting the global demand for higher productivity and increased efficiency with high precision. The computer is increasingly becoming compulsory in nearly all fields of study, not because of anything but its accuracy and versatility in processing data. Many tasks at home or in the office are being automated rapidly with computers. Thus it is becoming apparent that in whatever discipline or working sector, the computer is now a very vital tool for efficiency improvement and precision of job or task execution. This material is designed to meet the prerequisite needs of everybody who is interested and wishes to know about computer science and computing in general. A computer is an electronic device operating under the control of instructions stored in its own memory. These instructions tell the machine what to do. The computer is capable of accepting data (input), processing data arithmetically and logically, producing output from the processing, and storing the results for future use. Most computers that sit on a desktop are called Personal Computers (PCs). The "computer" is an ensemble of different machines that you will be using to get your job done. A computer is primarily made up of the Central Processing Unit (usually referred to as the computer), the monitor, the keyboard, and the mouse. Other pieces of hardware are commonly referred to as peripherals. In everyday life, we process data or encounter cases of data processing. A typical example of data processing is the generation of a statement of student results from the marks scored in an examination and continuous assessment. It is essential to know that information is only as good as the data from which it is derived and the transformation process to which that data is subjected. Meaningless data or inappropriate processing produces wrong information; thus the computer gives you results corresponding to what data you supply and how you process it (i.e. "garbage in, garbage out"). In summary, the intelligent performance of a computer depends on the correctness of the input data and the intelligent performance of the human being who drives it.

Kamal Singh

A systematic approach on reducing the energy consumption in green computing

Green computing focuses on reducing energy consumption, resource usage, and carbon dioxide emissions. It was found that, last year, Google used about 12.4 terawatt-hours of electricity. Energy consumption in data centers is reduced by decreasing resource utilization, that is, by switching off computing nodes or shifting them to sleep mode. When the servers are in use, energy consumption is minimized by using energy-efficient scheduling and optimization techniques. In wireless sensor networks (WSNs), the sensor nodes are deployed in remote areas and are powered by batteries, which limits their lifetime; therefore, using energy-efficient techniques can increase the uptime of battery-operated devices. This work studies various energy-efficient techniques that minimize energy consumption in data centers (DCs) and algorithms that increase the uptime of battery-operated devices in WSNs.

Dr H Shaheen

N-gram and k-nearest neighbour based Igbo text classification model

The evolution of information technology has gone a long way in bringing along Igbo, one of the major Nigerian languages. Some online service providers report news, publish articles, and provide search in this language. This advancement will likely result in the generation of huge amounts of textual data in the language, which need to be organized, managed, and classified efficiently for easy information access, extraction, and retrieval by end users. This work presents an enhanced model for Igbo text classification based on N-gram and K-Nearest Neighbour techniques. Considering the peculiarities of the Igbo language, the N-gram model was adopted for text representation, and the text was represented with unigram, bigram, and trigram techniques. Classification of the represented text was done using the K-Nearest Neighbour technique. The model is implemented in the Python programming language together with tools from the Natural Language Toolkit (NLTK). The performance of the Igbo text classification system was evaluated by calculating recall, precision, and F1-measure on the N-gram represented text. The results show that text classification on bigram-represented Igbo text has the highest degree of exactness (precision), trigram has the lowest precision, and the results obtained with the three N-gram techniques have the same level of completeness (recall). The bigram text representation technique is therefore strongly recommended for any text-based system in Igbo. This model can be adopted in text analysis, text mining, information retrieval, natural language processing, and any intelligent text-based system in the language.
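
To make the described pipeline concrete, here is a hedged sketch of a word-level unigram/bigram representation fed to a k-nearest-neighbour classifier; it uses scikit-learn as a stand-in for the paper's NLTK-based implementation, and the short Igbo-like snippets and labels are invented placeholders.

# Sketch only: word n-gram features + KNN classification (scikit-learn stand-in for the
# paper's NLTK pipeline; the tiny corpus and labels are placeholders, not real data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

docs = ["nnukwu akuko banyere egwuregwu boolu", "ozi banyere onye isi ala na ndoro ndoro",
        "egwuregwu boolu na asompi", "akuko ndoro ndoro ochichi nke taa"]
labels = ["sports", "politics", "sports", "politics"]

model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),      # unigrams + bigrams, mirroring the N-gram step
    KNeighborsClassifier(n_neighbors=1))      # K=1 only because the toy corpus is tiny
model.fit(docs, labels)
print(model.predict(["asompi boolu nke taa"]))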

Dr. Nkechi Ifeanyi-Reuben

Design and development of framework for big data based smart farming system

Improving agricultural productivity is an urgent need in order to meet the food requirements of a constantly growing population. It can be satisfied gracefully if the farming process is integrated with technologies such as big data and IoT. The integration of agricultural processes with modern technologies has emerged as smart agriculture technology. This research work focuses on establishing the suitability of big data analytics for smart agricultural processes, in terms of increasing the production and quality of yields with fewer resources and less overhead. The paper presents an extensive review of related work in smart agricultural farming and of the challenges in implementing smart farming technologies at large scale, followed by a conceptual framework model for the effective implementation of big data together with IoT devices in smart farming.

Dr H Shaheen

A memetic algorithm for the inventory routing problem

In this article, we study an Inventory Routing Problem with deterministic customer demand in a two-tier supply chain. The supply chain network consists of a supplier using a single vehicle with a given capacity to deliver a single product type to multiple customers. We are interested in population-based algorithms to solve our problem. A Memetic Algorithm (MA) is developed based on the Genetic Algorithm (GA) and Variable Neighborhood Search methods. The proposed meta-heuristics are tested on small and large reference benchmarks. The results of the MA are compared to those of the classical GA and to the optimal solutions in the literature. The comparison shows the efficiency of using MA and its ability to generate high quality solutions in a reasonable computation time.
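
As a generic illustration of what a memetic algorithm is (a genetic algorithm whose offspring are refined by local search), the sketch below minimises a toy one-dimensional function; it does not reproduce the paper's inventory-routing formulation, crossover operators, or variable neighbourhood search.

# Generic memetic algorithm skeleton: GA (selection + crossover + mutation) with a
# local-search step applied to each offspring. Toy objective, not the IRP model.
import random

def objective(x):                      # toy cost to minimise
    return (x - 3.7) ** 2

def local_search(x, step=0.05, iters=20):
    for _ in range(iters):             # simple hill climbing (the "meme")
        for cand in (x - step, x + step):
            if objective(cand) < objective(x):
                x = cand
    return x

def memetic_algorithm(pop_size=20, generations=50):
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]
        offspring = []
        while len(offspring) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) / 2 + random.gauss(0, 0.1)   # crossover + mutation
            offspring.append(local_search(child))        # memetic refinement
        pop = parents + offspring
    return min(pop, key=objective)

print(round(memetic_algorithm(), 2))   # expected to land close to 3.7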

Mohamed Salim Amri Sakhri

N-gram-based machine learning approach for bot or human detection from text messages

Social bots are computer programs created to automate general human activities such as the generation of messages. The rise of bots on social network platforms has led to malicious activities such as content pollution by spammers and the dissemination of malware and misinformation. Most researchers have focused on detecting bot accounts on social media platforms to avoid the damage done to the opinions of users. In this work, an n-gram-based approach is proposed for bot or human detection. Content-based features of character n-grams and word n-grams are used; character and word n-grams have proved successful in various authorship analysis tasks for improving accuracy. A huge number of n-grams is identified after applying different pre-processing techniques. The high dimensionality of the feature space is reduced using the Relevant Discrimination Criterion feature selection technique, and the text is represented as vectors over the reduced feature set. Different term weight measures are used in the experiments to compute the weight of n-gram features in the document vector representation. Two classification algorithms, Support Vector Machine and Random Forest, are used to train the model on the document vectors. The proposed approach was applied to the dataset provided in the PAN 2019 bot detection task. The Random Forest classifier obtained the best accuracy of 0.9456 for bot/human detection.
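
A minimal sketch of the feature/classifier combination described above, assuming TF-IDF as the term weight and omitting the Relevant Discrimination Criterion selection step; the example messages and labels are invented, whereas the actual experiments used the PAN 2019 data.

# Sketch: character + word n-gram features with a Random Forest classifier
# (TF-IDF assumed as the term weight; the RDC feature-selection step is omitted here).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline, make_union
from sklearn.ensemble import RandomForestClassifier

messages = ["Buy now!!! Click the link to win a free prize", "had a lovely walk with my dog this morning",
            "WIN WIN WIN follow and retweet to claim reward", "finally finished reading that novel, loved it"]
labels = ["bot", "human", "bot", "human"]   # invented toy labels

features = make_union(
    TfidfVectorizer(analyzer="word", ngram_range=(1, 2)),      # word n-grams
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)))   # character n-grams
model = make_pipeline(features, RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(messages, labels)
print(model.predict(["click here to win a free reward"]))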

Chandra Sekhar Sanaboina

Utility-based joint power control and resource allocation algorithm for heterogeneous cloud radio access network (H-CRAN)

The high density of H-CRANs, together with frequent UE handovers, may degrade throughput, and inter- and intra-cell interference reduces the achievable data rate in H-CRAN macrocells. Infrastructure equipment such as RRHs and BBUs consumes more energy in order to reduce UE energy consumption, so there is a trade-off between operator and UE energy conservation. In this paper, we propose a utility-based joint power control and resource allocation (UJPCRA) algorithm for the heterogeneous cloud radio access network (H-CRAN). In this framework, the power consumption of baseband units (BBUs), remote radio heads (RRHs), and the macrocell base station (MBS) is estimated by predicting their dynamic loads. The data rate achievable by a UE associated with each RRH and the MBS on resource block RBk is then estimated. A user wishing to connect to an RRH or the MBS then checks the corresponding utility, seeking the minimum expected energy consumption and the maximum expected data rate. If a UE with high-priority traffic connected to the MBS cannot achieve its desired data rate, it can cooperatively seek the assistance of any RRH for assigning the remaining RBs. Experimental results show that the proposed algorithm achieves higher throughput, resource utilization, and energy efficiency with a reduced packet loss ratio when compared to existing techniques.
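
As a loose illustration of the utility idea (expected data rate traded off against expected energy), the sketch below computes a Shannon-capacity rate and a rate-per-joule utility for a few candidate serving nodes and associates the UE with the best one; the numbers, the utility definition, and the selection rule are assumptions, not the paper's exact formulation.

# Illustrative utility-based association: rate via Shannon capacity, utility = rate / energy.
# All values and the utility definition are placeholders, not the UJPCRA formulation.
import math

candidates = {                    # candidate serving nodes for one UE
    "RRH-1": {"bandwidth_hz": 180e3, "sinr": 25.0, "energy_j": 0.8},
    "RRH-2": {"bandwidth_hz": 180e3, "sinr": 60.0, "energy_j": 1.5},
    "MBS":   {"bandwidth_hz": 180e3, "sinr": 10.0, "energy_j": 2.0},
}

def utility(node):
    rate = node["bandwidth_hz"] * math.log2(1.0 + node["sinr"])   # bits/s on one RB
    return rate / node["energy_j"]                                 # rate per joule

best = max(candidates, key=lambda name: utility(candidates[name]))
print("UE associates with:", best)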

Dr H Shaheen

Early diagnosis of tuberculosis using deep learning approach for IoT-based healthcare applications

In the modern world, tuberculosis (TB) is regarded as a serious health issue with a high rate of mortality, yet TB can be cured completely by early diagnosis. One tool used to achieve this is the chest X-ray (CXR), which is used to screen for active TB. An enhanced deep learning (DL) model is implemented for automatic tuberculosis detection. The work comprises phases of preprocessing, segmentation, feature extraction, and optimized classification. Initially, the CXR image is preprocessed and segmented using Adaptive Fuzzy C-Means (AFCM) clustering. Then feature extraction is carried out and several features are extracted. Finally, these features are given to the DL classifier, a Deep Belief Network (DBN). To improve the classification accuracy and optimize the DBN, a metaheuristic Adaptive Monarch Butterfly Optimization (AMBO) algorithm is used. Here, the Deep Belief Network with Adaptive Monarch Butterfly Optimization (DBN-AMBO) is used to enhance accuracy, reduce the error function, and optimize the weighting parameters. The overall implementation is carried out on the Python platform. The performance of the DBN-AMBO was evaluated on the MC and SC datasets and compared with other approaches on the basis of certain metrics.
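
For the segmentation step only, here is a compact standard fuzzy c-means routine in NumPy applied to pixel intensities; the paper's adaptive variant (AFCM) and the later DBN-AMBO stages are not reproduced, and the synthetic "image" is a placeholder.

# Plain fuzzy c-means on pixel intensities (standard FCM, not the paper's adaptive AFCM).
import numpy as np

def fuzzy_c_means(values, c=2, m=2.0, iters=100, seed=0):
    """values: 1-D array of pixel intensities. Returns (cluster centers, membership matrix U)."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(values)))
    u /= u.sum(axis=0)                                              # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m
        centers = um @ values / um.sum(axis=1)                      # fuzzily weighted means
        dist = np.abs(values[None, :] - centers[:, None]) + 1e-12   # |x_i - c_k|
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=0)                                          # standard FCM membership update
    return centers, u

# Toy "image": dark background pixels around 0.2, bright lung-field pixels around 0.8.
pixels = np.concatenate([np.random.default_rng(1).normal(0.2, 0.05, 200),
                         np.random.default_rng(2).normal(0.8, 0.05, 200)])
centers, u = fuzzy_c_means(pixels)
print("cluster centers:", np.round(np.sort(centers), 2))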

Dr H Shaheen

Evolution and significance of unmanned aerial vehicles

Unmanned aerial vehicles (UAVs) are aerial systems that are controlled remotely or operate autonomously, without a pilot on board. Massive advancements in electronics and information technology have prompted the popularity and growth of UAVs; as a result, civilian tasks can now be accomplished with UAVs in a more effective, efficient, and secure way. Commonly known as drones, UAVs are developed and operated using a variety of technologies such as machine learning, computer vision, artificial intelligence, and collision avoidance. Having become more affordable and accessible, drone technology has become more popular among civilians. The technology is constantly evolving and can be used across a variety of fields. The application of drones makes a huge difference in the most demanding and complex industrial environments, such as mining, maritime, oil, gas, and seaports. The usage of drones is increasing among industrialists to improve and optimize processes, as well as to enhance operational efficiency in industrial processes. This chapter discusses UAVs across a wide range of topics, including the evolution and historical perspectives of UAVs, a taxonomy of UAVs, the significance of UAVs to society and industry, and industrial and academic perspectives on UAVs.

Dr H Shaheen

Security issues in cloud computing and its countermeasures

Cloud computing is a technology for delivering resources such as hardware, software (including virtualized resources), and bandwidth over the network to consumers worldwide. All services are requested and accessed through a web browser or web service. The main advantage the cloud provides worldwide is that consumers need not afford the infrastructure themselves: multi-conglomerate companies invest a lot of money in the cloud and let people access it for a small cost, and even for free at the lowest level of the consumer chain. In this paper we address the problems that cloud technology faces and how they can be overcome.

Pavan M
