Author Archives: Kamran Khan

A Comparative Study and Analysis of EZW and SPIHT Methods for Wavelet Based Image Compression

Chetan R. Dudhagara and Mayur M. Patel


In recent years, the use of digital media has increased widely. This growth creates major problems in the storage, manipulation and transmission of data over the internet, because digital media such as images, audio and video require large amounts of memory. It is therefore necessary to compress digital data so that it requires less storage space and less bandwidth for transmission over a network. Image compression techniques reduce the storage requirement of image data and play an important role in transferring images over a network. In this paper, two compression methods, Set Partitioning In Hierarchical Trees (SPIHT) and Embedded Zerotree Wavelet (EZW), are applied to the Barbara test image and compared. Several parameters are used to compare the two techniques: Mean Square Error (MSE), Peak Signal to Noise Ratio (PSNR) and Compression Ratio (CR), measured at different levels of decomposition.
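The three evaluation metrics used in the comparison are standard and easy to state precisely. A minimal Python sketch (a generic illustration, not the authors' code; images are assumed flattened to equal-length 8-bit pixel sequences):

```python
import math

def mse(original, reconstructed):
    # Mean Square Error over two equal-length pixel sequences
    n = len(original)
    return sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / n

def psnr(original, reconstructed, max_pixel=255.0):
    # Peak Signal to Noise Ratio in dB; higher means a closer reconstruction
    m = mse(original, reconstructed)
    if m == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_pixel ** 2 / m)

def compression_ratio(original_bits, compressed_bits):
    # CR = uncompressed size / compressed size
    return original_bits / compressed_bits
```

At a fixed bit rate, the method with the lower MSE (equivalently, the higher PSNR) reconstructs the image more faithfully.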


Information Technology Use in Agriculture

Pratik Vanjara


Two major trends affect our planet: population growth and urbanization. The projected growth for the first half of this century is daunting. Depending on the estimate, there will be nine to ten billion people by mid-century. The present population is just under seven billion, which means an increase of about fifty percent from the start to the middle of this century. One may debate the relative accuracy of particular models, but all of them agree that there will be many, many more mouths to feed in the coming decades.

IT has transformed many other aspects of human endeavor and has helped create systems for responding to a wide variety of societal needs. Indeed, transportation, communication, national security and health systems are completely dependent on it to perform even basic functions. However, information, and its automated technological embodiment, has not impacted agriculture to the same level.


A Study on Machine Learning in Big Data

L. Dhanapriya and S. Manju


With recent developments in IT, the volume of stored data has surpassed the zettabyte scale, and improving business efficiency by increasing predictive power through efficient analysis of these data has emerged as a key issue in today's society. The market now needs methods capable of extracting valuable information from large data sets. Big data has recently become a focus of attention, and using machine learning techniques to extract valuable information from huge volumes of complexly structured data is an urgent problem to resolve. The aim of this work is to provide a better understanding of machine learning techniques for discovering interesting patterns, and to introduce some machine learning algorithms that illustrate the developing trend.


Analysis of Clone Detection Approaches in Static Wireless Sensor Networks

Sachin Lalar1, Shashi Bhushan2 and Surender3


Wireless Sensor Networks (WSNs) are developing very fast within wireless networking. A wireless sensor node is characterized by limited memory, small size and limited battery power, and these characteristics make WSNs vulnerable to different types of attack. One such attack is the clone node attack, in which the attacker captures a node from the network, steals its information and replicates the node in the network. From the clone nodes, the attacker can easily launch different types of attacks on the network. Different methods have been implemented to detect clone nodes, each with its own advantages and limitations. In this paper, we explain the different methods for detecting clone nodes in static wireless sensor networks and compare their performance in terms of communication cost and memory.


A Systematic Review on the Suspicious Profiles Detection on Online Social Media Data

Asha* and Balkishan


Escalating digital crime compels law enforcement bodies to keep watch on online activities, which involve massive amounts of data. This raises the need to detect suspicious activities in online social media data by optimizing investigations with data mining tools. This paper intends to throw some light on the data mining techniques designed and developed for closely examining social media data for suspicious activities and profiles in different domains. Additionally, this study categorizes the techniques into groups, highlighting their important features, challenges and application realms.


Development of Program in VB to Compute Tractor Parameters on Automatic Steering

Raghuvirsinh Parmar*, Nitin Karwasra, Aseem Verma and Baldev Dogra


A comprehensible Visual Basic (VB) computer program was developed to find the tractor parameters for an automated steering system. Tractor parameters such as rear wheel and front wheel trajectory coordinates, clockwise front and rear wheel angles and corrected front and rear wheel angles can be calculated. The front wheel trajectory data, rear wheel trajectory, distance between front and rear wheels (X13), tractor velocity vector (V) and tractor turning angle with the center line are the input variables for the developed software. The tractor parameters are calculated with the help of a mathematical model already embedded in the program. The developed program successfully calculates the tractor parameters for automated steering and could guide an autonomous agricultural tractor in the field. Furthermore, the software navigates an agricultural tractor along both straight and curved paths at normal field operational speed.


A Framework for Integration of a Patients’ Body Area Network With IoT

Dawood A. Khan


In this paper, we give a framework for integration of patients’ Body Area Network with IoT. We also discuss the enabling technologies that may help with the proliferation of the IoT in healthcare, besides mitigating various interoperability challenges in a healthcare IoT. We use a healthcare use-case of an artificial pancreas for diabetic patients to discuss our framework. We describe the framework as a formal model of a healthcare IoT, which we map onto the components of a proposed end-to-end, closed-loop healthcare IoT architecture. In this paper, we also discuss dependability in a healthcare IoT. As such, we describe why certification, standardisation, and dependability should be central for a healthcare IoT.


A Soft Approach to Estimate Woody Volume of a Live Tree

Nikita Singla* and Derminder Singh


Tree volume is one of the oldest areas of interest in tree management, and estimating it is a crucial task. Estimating the woody volume of a live tree is important for economic and scientific purposes and provides a tool for researchers and growers, giving useful information about the commercial value of wood to a potential buyer or seller. However, woody volume is still largely calculated by manual methods, which are based on different log rules and are cumbersome and laborious. The present work proposes a digital image processing technique to estimate the woody volume of a live tree. The developed program successfully determines the woody volume of a standing tree trunk using MATLAB image processing techniques. In this method, three parameters of an individual tree were extracted from digital images of the tree. A calibration factor was also calculated to make the method independent of the camera's distance from the tree. The method was tested on several samples of trees and compared to experimental results. The soft approach generates information about the height, diameter and volume of the tree. The percentage error between the proposed method and experimental results for the height, diameter at breast height and volume of a standing tree was found to be less than 6.65%.
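For orientation, woody volume from height and diameter is commonly approximated by treating the trunk as a tapered cylinder. The sketch below illustrates the arithmetic only; the form factor of 0.7 is a hypothetical placeholder, not the paper's calibrated model:

```python
import math

def trunk_volume(height, dbh, form_factor=0.7):
    """Approximate trunk volume (m^3) from total height and diameter at
    breast height, DBH (both in metres).  The cylinder volume is scaled
    by a taper `form_factor`; 0.7 is an assumed value for illustration."""
    basal_area = math.pi * (dbh / 2.0) ** 2  # cross-section at breast height
    return basal_area * height * form_factor
```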


An Automated Complex Word Identification from Text: A Survey

Jaspreet Singh*, Gurvinder Singh and Rajinder Singh Virk


Complex Word Identification (CWI) is the process of locating difficult words in a given sentence. The aim of an automated CWI system is to help non-native English users understand the meaning of a target word in a sentence; CWI systems assist second-language learners and dyslexic users through simplification of text. This study introduces the CWI process and investigates the performance of the twenty systems submitted to the SemEval-2016 CWI task. The G-score measure, the harmonic mean of accuracy and recall, is used for the performance evaluation of the systems. This paper explores the twenty CWI systems and examines why the sv000gg system outperformed the others, with the highest G-scores of 0.773 and 0.774 for its two submissions.
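The G-score used for the ranking is simply the harmonic mean of accuracy and recall, e.g.:

```python
def g_score(accuracy, recall):
    """Harmonic mean of accuracy and recall, the ranking measure of the
    SemEval-2016 CWI task; it is high only when both inputs are high."""
    if accuracy + recall == 0:
        return 0.0
    return 2.0 * accuracy * recall / (accuracy + recall)
```

Like the F-score, it penalizes systems that trade one quantity away for the other.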


Tools, Strategies and Models for Incorporating Software Quality Assurance in Risk Oriented Testing

Vinita Malik1* and Sukhdip Singh2


The evolution of software is a cumbersome process and needs many iterations of software testing to satisfy quality criteria. Software quality assurance activities must be used effectively for proper software quality management and to achieve good product quality. Effective quality management is related to value engineering and risk management. In the present paper we study the relevance of quality assurance tools, strategies and models when doing risk-based testing for proper product function orientation. By analysing risks, we can determine how much testing the software needs and thereby assure software quality.


A Survey on Accelerated MapReduce for Hadoop

Jyotindra Tiwari1*, Mahesh Pawar2 and Anjana Pandey1


Big data is defined by the 3Vs: variety, volume and velocity. The volume of data is huge, the data exist in a variety of file types and the data grow very rapidly. Big data storage and processing have always been a big issue, and handling big data has become even more challenging these days. High-performance techniques have been introduced to handle big data, and several frameworks such as Apache Hadoop have been introduced to process it. Apache Hadoop provides map/reduce to process big data, but this map/reduce can be accelerated further. In this paper, a survey is performed of map/reduce acceleration and energy-efficient computation in quick time.
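Hadoop exposes map/reduce as a Java API; the programming model being accelerated can be shown with a tiny word-count sketch (a language-neutral illustration in Python, not Hadoop code):

```python
from collections import defaultdict

def map_phase(lines):
    """map: emit a (word, 1) pair for every word in the input split."""
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    """reduce: sum the counts per key.  In Hadoop, the framework shuffles
    and groups pairs by key between the two phases."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)
```

Because each phase is embarrassingly parallel over its inputs, acceleration work targets the shuffle, scheduling and I/O between the phases.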


Comparative Study of AODV and ODMRP Routing Protocols

Jitendra Soni* and Kokila Uikey


A Mobile Ad-hoc Network (MANET) is a collection of mobile nodes deployed for a short-lived purpose. It is an innovative and useful network variety that provides the facility to establish communication without the prerequisite of any infrastructure. A wireless communication medium is used for communication and connection establishment. MANETs are generally deployed with mobile nodes but can be used for stationary designs as well. Their open-nature communication makes them vulnerable to several security threats. This paper considers the simulation of the AODV and ODMRP routing protocols using the Qualnet 5.2 simulator.


Identifying Various Roadways Obstacles in Infrastructure-less Environment Using Depth Learning Approach

Chandra Kishor Pandey, Neeraj Kumar*, Vinay Kumar Mishra and Abhishek Bajpai


Traffic conditions in an infrastructure-less environment are in many ways not ideal for driving: undefined road curvature, faded and unmaintained lane markings and various obstacle situations cause vital loss of life and damage to vehicles in accidents. This paper provides an efficient approach to identifying various roadway obstacle situations using our depth learning approach, based on data collected through a smartphone. Existing methods are suitable for planned or structured roads, whereas the proposed approach is suitable for planned as well as unplanned roads, i.e., for the infrastructure-less environment. The approach is capable of effectively classifying roadway obstacles into predefined categories and, compared with similar approaches, it is cost effective.


Linear Cryptanalysis of Substitution Ciphers Using Particle Swarm Optimization

Dr. G. Rajkumar


Cryptanalysis is one of the most important and demanding areas of research in security, while cryptography is an approach to data security. Cryptanalysis is the study of breaking cryptography without the encryption key: recovering from a cipher text its equivalent plain text without prior knowledge of the secret key or of the actual way to decrypt the cipher text. Particle Swarm Optimization (PSO) is a population-based, self-adaptive search optimization technique inspired by the group behavior of bird flocking and fish schooling. This paper discusses the use of PSO in the automated cryptanalysis of simple substitution ciphers. By contrast, in a public-key setting, encrypted data can be sent by any individual using the public key, yet the data can be decoded only by the holder of the secret key.


Optical Flow Estimation Using Total Least Squares Variants

Maria A. De Jesus and Vania V. Estrela*


The problem of recursively approximating motion resulting from the Optical Flow (OF) in video through Total Least Squares (TLS) techniques is addressed. The TLS method solves an inconsistent system Gu = z, with G and z in error due to temporal/spatial derivatives and nonlinearity, while the Ordinary Least Squares (OLS) model has noise only in z. Sources of difficulty involve the non-stationarity of the field, the ill-posedness and the existence of noise in the data. Three ways of applying the TLS with different noise conjectures to the problem are observed. First, the classical TLS (cTLS) is introduced, where the entries of the error matrices of each row of the augmented matrix [G;z] have zero mean and the same standard deviation. Next, the Generalized Total Least Squares (GTLS) is defined to provide a more stable solution, but it still has some problems. The Generalized Scaled TLS (GSTLS) has G and z tainted by different sources of additive zero-mean Gaussian noise, and scaling [G;z] by nonsingular D and E, that is, D[G;z]E, makes the errors iid with zero mean and a diagonal covariance matrix. The scaling is computed from some knowledge of the error distribution to improve the GTLS estimate. For moderate levels of additive noise, GSTLS outperforms the OLS and GTLS approaches. Although any TLS variant requires more computations than the OLS, it is still applicable with proper scaling of the data matrix.
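In the simplest scalar case (one unknown, G an n-by-1 matrix), the TLS estimate can be written down directly from the smallest eigenvalue of the 2-by-2 moment matrix of [G z]. The sketch below covers only this special case, not the paper's GTLS/GSTLS variants:

```python
import math

def tls_1d(g, z):
    """Total Least Squares fit of g*u ≈ z with noise allowed in both g
    and z.  The solution comes from the eigenvector of [G z]^T [G z]
    belonging to its smallest eigenvalue, scaled so [G z][u, -1]^T ≈ 0."""
    sgg = sum(x * x for x in g)
    sgz = sum(x * y for x, y in zip(g, z))
    szz = sum(y * y for y in z)
    tr, det = sgg + szz, sgg * szz - sgz * sgz
    lam = (tr - math.sqrt(tr * tr - 4.0 * det)) / 2.0  # smallest eigenvalue
    v1, v2 = -sgz, sgg - lam  # satisfies (sgg - lam)*v1 + sgz*v2 = 0
    return -v1 / v2
```

Unlike the OLS slope, this estimate treats the derivative measurements in G as noisy too, which is exactly the situation the abstract describes.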


Constructive and Clustering Methods to Solve Capacitated Vehicle Routing Problem

M. A. H. Akhand*, Tanzima Sultana, M. I. R. Shuvo and Al-Mahmud


Vehicle Routing Problem (VRP) is a real-life constraint satisfaction problem: finding minimal travel distances for vehicles serving customers. Capacitated VRP (CVRP) is the simplest form of VRP, considering only the vehicle capacity constraint. Constructive and clustering are the two popular approaches to solving CVRP. A constructive approach creates routes and attempts to minimize the cost at the same time; Clarke and Wright’s Savings algorithm is a popular constructive method based on the savings heuristic. A clustering-based method, on the other hand, first assigns nodes to vehicle-wise clusters and then generates a route for each vehicle; the Sweep algorithm and its variants and the Fisher and Jaikumar algorithm are popular among clustering methods. Route generation is a traveling salesman problem (TSP), and any TSP optimization method is useful for this purpose. In this study, popular constructive and clustering methods are studied and implemented, and their outcomes compared in solving a suite of benchmark CVRPs. For route optimization, Genetic Algorithm (GA), Ant Colony Optimization (ACO) and Velocity Tentative Particle Swarm Optimization (VTPSO), popular nature-inspired optimization techniques for solving the TSP, are employed. Experimental results revealed that parallel Savings is better than series Savings among constructive methods, while Sweep Reference Point using every stop (SRE) is the best among the clustering-based techniques.
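The savings heuristic at the heart of Clarke and Wright's method is compact enough to state directly. A sketch of the savings-list computation (the merge step that builds routes from this list is omitted):

```python
def savings_list(dist):
    """Clarke-Wright savings s(i, j) = d(0, i) + d(0, j) - d(i, j) for all
    customer pairs, sorted in decreasing order.  `dist` is a symmetric
    distance matrix with the depot at index 0.  Parallel Savings scans
    this whole list, merging any feasible pair of routes as it goes;
    series Savings instead grows one route at a time."""
    n = len(dist)
    savings = []
    for i in range(1, n):
        for j in range(i + 1, n):
            savings.append((dist[0][i] + dist[0][j] - dist[i][j], i, j))
    savings.sort(reverse=True)
    return savings
```

A large s(i, j) means serving i and j on one route saves a long double trip back to the depot.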


Design of a Gray Encoder and Counter Using D-FFs and MUXs

Sayyid Samir Al-Busaidi, Afaq Ahmad* and Medhat Awadalla


This paper proposes a novel design for binary-to-Gray code encoders and counters using multiplexers and flip-flops. The proposed design is modular, whereby further stages can be added as required by the desired application. Moreover, the external clock signal drives only the first stage, while all remaining stages are driven by the outputs of the preceding stages. Each successive stage transitions at half the rate of the preceding stage, which makes the design power efficient since the dissipated power depends quadratically on frequency. The proposed design can be modified to increase the counter's duration or its resolution according to the application's needs. Increasing the Gray counter's time span by powers of two simply requires augmenting the design with more stages while maintaining a constant clock rate. On the other hand, doubling the time resolution of the Gray counter over a constant time span can be achieved by adding another stage while doubling the clock rate.
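The mapping such hardware realizes is the reflected binary Gray code, which in software reduces to a single XOR; a sketch for reference:

```python
def binary_to_gray(b):
    """Reflected binary Gray code: consecutive values of b differ in
    exactly one output bit, which is what makes Gray counters glitch-safe."""
    return b ^ (b >> 1)

def gray_to_binary(g):
    """Inverse mapping: fold the XOR back down through the bit positions."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b
```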


Website Design of Indian Central Universities: Issues and Challenges

Mudasir M Kirmani1 and Syed Mohsin Saif*2


The world has shrunk into a logically small e-village where everyone can communicate with everyone else with great ease via both audio and visual media. This global village has only been possible through interconnected communication links connecting users in geographically distant areas, and the predominant medium for achieving it is the well-developed website. Educational institutions play a vital role in reshaping a nation by imbibing quality attributes in culture, civilization and modernization. The information available on the websites of educational institutions has become very important for prospective students, and at present most universities complete the admission process online with websites as mediators. The need of the hour is therefore to have updated and informative websites that are easy to access. A key feature of the ongoing growth of the World Wide Web over the past decade has been the proliferation of web portals and mobile applications that focus on supporting education. The main aim of this research work is to study the websites of central universities established after 2004, to explore their quality parameters and to gain insight into the challenges faced by prospective users. The research work recommends a common design framework for all central universities in order to help prospective users understand and use central university websites.


Cardiovascular Disease Prediction Using Data Mining Techniques: A Review

Mudasir M Kirmani


Cardiovascular disease encompasses various diseases associated with the heart, lymphatic system and circulatory system of the human body. The World Health Organisation (WHO) has reported that cardiovascular diseases have a high mortality rate and a high risk of causing various disabilities. The most prevalent causes of cardiovascular disease are behavioural and dietary habits: tobacco intake, unhealthy diet and obesity, physical inactivity, ageing and addiction to drugs and alcohol, to name a few. Factors such as hypertension, diabetes, hyperlipidemia and stress, among other ailments, carry a high risk of cardiovascular disease. Over time, different techniques implementing a variety of algorithms have been used to predict the prevalence of cardiovascular diseases in general and heart disease in particular. Detection and management of cardiovascular disease can be supported by computer-based predictive tools in data mining, and by implementing data mining based techniques there is scope for better and more reliable prediction and diagnosis of heart disease. In this study we surveyed various available techniques, such as decision trees and their variants, Naive Bayes, neural networks, Support Vector Machines, fuzzy rules, genetic algorithms and Ant Colony Optimization, to name a few. The observations illustrate that it is difficult to name a single machine learning algorithm for the diagnosis and prognosis of CVD. The study further contemplates the behaviour, selection and number of factors required for efficient prediction.


A Survey on User Authentication Techniques

K. Sharmila1,*, V. Janaki and A. Nagaraju3


Confidentiality and authentication were once treated as separate concerns, but nowadays improvements in technology demand that they be used together. Although technology is advancing tremendously, smart hackers continually challenge the authentication factors in use, thereby forcing more factors to be included in authentication. As factors increase, the failure rate of authentication may also rise whenever any one of the factors does not work. A qualitative survey of the user authentication systems used in today's environment is presented here, along with a comparative study of the various authentication mechanisms proposed by researchers in the world of information security.


Survey on Packet Marking Algorithms for IP Traceback

Y. Bhavani1*, V. Janaki2 and R. Sridevi3


A Distributed Denial of Service (DDoS) attack is an unavoidable attack. Among the various attacks on a network, DDoS attacks are difficult to detect because of IP spoofing, and IP traceback is the principal technique for identifying their origin. The path affected by a DDoS attack is identified by IP traceback approaches such as the Probabilistic Packet Marking (PPM) algorithm and the Deterministic Packet Marking (DPM) algorithm. The PPM approach reconstructs the complete attack path from the victim to the source, whereas DPM finds only the source of the attacker. With the DPM algorithm, finding the source of the attacker is difficult if a router gets compromised; with the PPM algorithm, the complete attack path is constructed, so a compromised router can be identified. In this paper, we review the PPM and DPM techniques and compare the strengths and weaknesses of each proposal.
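A much-simplified sketch of the PPM idea follows (edge sampling reduced to node sampling; the marking probability 0.04 is a commonly cited choice and an assumption here, and real schemes pack the mark into IP header fields rather than a dictionary):

```python
import random

MARK_PROBABILITY = 0.04  # assumed value, not taken from the surveyed papers

def forward(packet, router_id, p=MARK_PROBABILITY):
    """Each router overwrites the mark with probability p; routers after
    the marker only increment the hop distance.  Over many packets the
    victim collects (router, distance) samples and rebuilds the path."""
    if random.random() < p:
        packet["mark"], packet["distance"] = router_id, 0
    elif packet.get("mark") is not None:
        packet["distance"] += 1
    return packet
```

Sending many packets through a fixed path yields samples such as (router 3, 0 hops) and (router 1, 2 hops), from which the victim orders the routers by distance.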


Iris Biometric Modality: A Review

Iqra Altaf Mattoo and Parul Agarwal*


Biometric recognition is a suitable and informed identification method used in different fields due to the uniqueness of countless behavioural and physiological traits, such as hand geometry, fingerprints, iris, face, handwriting and voice. Iris recognition systems are widely used because the iris has inherently distinctive patterns that provide a robust basis for identification. Different nations have already started to use biometric recognition systems for identification purposes, including patient identification and border security. In this review paper, the different steps involved in an iris recognition system are defined, and the iris recognition methods used by different researchers for each recognition step are evaluated.


Geographical Information Based Expert System to Estimate Submersible Pump System Specifications

Gurpreet Kaur1, Derminder Singh and Rajan Aggarwal2

 


During the past two decades, groundwater utilization has increased tremendously in the state of Punjab (India), particularly for agricultural purposes. The decline in the water table has resulted in a higher energy demand for lifting water. Punjab is presently facing a water crisis which worsens the economic condition of small farmers, degrades the environment and adversely affects agricultural production and the economy of the state. In this research, an expert system was developed using Java Standard Edition 7 which provides appropriate selection of a submersible pump set and the required associated components, such as power cable rating and size, generator capacity, ammeter rating, voltmeter rating, capacitor rating and Polyvinyl Chloride pipe diameter, based on spatial information on the ground water table of the state of Punjab over the last 18 years (1998-2015). The developed system will benefit farmers in estimating the required submersible pumping system and allied electrical components.


Intrusion Detection System Based on Data Mining Techniques

Abhinav Kumra, W. Jeberson, and Klinsega Jeberson


Network security is one of the most important non-functional requirements in a system. Over the years, many software solutions have been developed to enhance network security. In this paper we provide an overview of the different types of Intrusion Detection Systems (IDS) and the advantages and disadvantages of each. The need for an IDS in a system environment and the generic building blocks of an IDS are also discussed. The examples covered are: (1) a misuse intrusion detection system that uses the state transition analysis approach, (2) an anomaly-based system that uses payload modeling and (3) a hybrid model that combines the best practices of misuse-based and anomaly-based intrusion detection systems.


Research Summary of A Study for the Estimation of Legacy Programs for Effective Reengineering

Harmeet Kaur1, Shahanawaj Ahamad2 and  Govinder N. Verma3


The present research estimates the efficacy of a legacy program and the areas of its development, and intends to establish to what extent reengineering of a legacy program should be done on the basis of the estimation approach. The study outlines the current issues and trends in the reengineering of legacy programs from various perspectives. A comprehensive literature review reveals that although much work has accumulated on legacy system estimation and the reengineering domain, the basic dimensions of complexity, quality and effort have not been worked out collectively. Hence the present research underlines this very maxim and studies the reengineering of a legacy program on the paradigms of quality, complexity and effort estimation taken together. The findings put forward an equation and a reengineering scale which are highly compatible with present technology for assessing the feasibility of effective reengineering.


Enhanced K-Means Clustering Algorithm Using Collaborative Filtering Approach

Ankush Saklecha and Jagdish Raikwal


Clustering is a well-known unsupervised learning method in which a set of elements is separated into homogeneous groups. K-means is one of the most popular partition-based clustering algorithms in the research area, but in the original K-means the quality of the resulting clusters depends largely on the selection of the initial centroids; a poor selection increases the number of iterations and the running time, making the algorithm computationally expensive. Many methods have been proposed to improve the accuracy, performance and efficiency of the k-means clustering algorithm. This paper proposes an enhanced K-means clustering approach combined with a collaborative filtering approach to recommend quality content to users. This research would help those users who otherwise have to scroll through pages of results to find important content.
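For reference, the baseline being enhanced looks as follows in one dimension; the spread-out initialization here is only an illustrative stand-in for the paper's centroid-selection enhancement:

```python
def kmeans(points, k, iters=100):
    """Minimal 1-D k-means.  Initial centroids are spaced evenly over the
    sorted data (an illustrative heuristic); each iteration assigns every
    point to its nearest centroid and recomputes the cluster means."""
    pts = sorted(points)
    centroids = [pts[(2 * i + 1) * len(pts) // (2 * k)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        new = [sum(c) / len(c) if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:
            break  # converged: assignments can no longer change
        centroids = new
    return centroids
```

With well-separated starting centroids, the loop above converges in far fewer iterations than random initialization typically allows, which is the cost the paper targets.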


Component Based Software Development Life Cycle Models: A Comparative Review

Preeti Gulia and Palak*


The development of high-quality software is a need of the current technology-driven world. Component Based Software Engineering (CBSE) provides a cost-effective, fast and modular approach to developing complex software, and is mainly based on the concept of reusability. CBSE has several advantages as well as challenges, which are summarized in this paper. Large and complex software development requires management of reusable components, which can be selected from a component repository and assembled to obtain a working application. The development of components and their assembly differs from traditional software development, which leads to the need for new development paradigms for Component Based Systems (CBS). A software development life cycle (SDLC) provides a planned and systematic arrangement of the activities to be carried out to deliver a high-quality product within time and budget. This paper presents a comparative study of component-based software development life cycle models with their strengths and weaknesses.


A TPA-Authentication Scheme for Public Cloud Using Kerberos Protocol

Arpit Agrawal* and Shubhangi Verma


Cloud computing is a new-generation technology that provides a way of sharing resources, memory and software, anything in the form of a service, over the internet. Security is an important and distinctive concern, providing a safe and isolated environment. Security models and principles are defined to implement security features in any application; confidentiality, authentication and integrity are the primary principles for trust establishment. Existing work concentrates only on the integrity concept and does not impose authentication or access control. A Kerberos-based strong authentication scheme has been developed using the third-party auditing concept to improve the strength of authentication as well as trust in the CSP. This work implements a security service architecture to create a Kerberos environment and establish communication between Kerberos and the CSP. The complete work is implemented using Java technology and an OpenStack server for the public cloud environment.


A Review on IoT Security Issues and Countermeasures

J. Yashaswini


The Internet of Things (IoT) is a system of connected physical objects that are accessible through the internet. A thing in the IoT is an object that is assigned an IP address and has the ability to collect and transfer data over a network without manual intervention. As the IoT does not need human-to-machine interaction, ongoing research suggests it is one of the largest waves of technological revolution, and hence security is needed. The rapid development of the IoT has brought challenges in terms of the security of things. This paper focuses on the general security issues in the IoT and the measures used to overcome them.


A Novel Approach to Address Sensor Interoperability Using Gabor Filter

Neha Bhatia1, Himani2 and Chander Kant3


Biometric authentication using fingerprints is one of the most distinctive and reliable verification methods. A biometric system suffers a significant loss of performance when the sensor is changed between enrollment and authentication. In this paper the fingerprint sensor interoperability problem is addressed using Gabor filters and by classifying images into good and poor quality. Gabor filters play an important role in many application areas for the enhancement of various types of fingerprint images: they can remove noise and preserve the true ridge and valley structures, which is why they are used for fingerprint image enhancement. Experimental results on the FVC2004 databases show the improvements achieved by this approach.
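
The abstract does not give the paper's filter-bank settings; the pure-Python sketch below only illustrates the core property Gabor enhancement relies on: a Gabor filter responds strongly where the image's ridge frequency and orientation match the filter's. The kernel size, frequency, sigma and the synthetic ridge pattern are all invented for illustration.

```python
import math

def gabor_kernel(size=9, theta=0.0, freq=0.25, sigma=2.5):
    """Real part of a Gabor kernel tuned to orientation theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates so the carrier wave runs along theta.
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            carrier = math.cos(2 * math.pi * freq * xr)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

def convolve_valid(image, kernel):
    """Valid-mode 2D correlation of an image with a kernel (lists of lists)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = 0.0
            for u in range(kh):
                for v in range(kw):
                    acc += image[i + u][j + v] * kernel[u][v]
            row.append(acc)
        out.append(row)
    return out

def energy(resp):
    return sum(abs(v) for row in resp for v in row)

# Synthetic vertical ridges: intensity varies along x at the kernel's frequency.
ridges = [[math.cos(2 * math.pi * 0.25 * x) for x in range(16)] for _ in range(16)]
resp_match = convolve_valid(ridges, gabor_kernel(theta=0.0))          # aligned
resp_cross = convolve_valid(ridges, gabor_kernel(theta=math.pi / 2))  # 90 deg off
print(energy(resp_match) > energy(resp_cross))  # True: aligned filter responds more
```

Applying a bank of such filters at several orientations, and keeping the strongest response at each pixel, is the usual way this idea scales up to whole-image enhancement.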


Hybrid Intrusion Detection System for Private Cloud & Public Cloud

Riddhi Gaur and Uma Kumari


Cloud computing lets end users easily acquire internet-based applications and data storage services. Providing security for the cloud computing environment has become an important issue with the increased demand for cloud computing. Beyond traditional security methods, additional measures such as access control, confidentiality, firewalls and user authentication are required in order to secure the cloud computing environment. One of the essential components of cloud security is the Intrusion Detection System (IDS), the mechanism most commonly used to detect attacks on the cloud. This paper discusses intrusion detection and two families of intrusion detection techniques, namely anomaly based and signature based techniques.
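
As a minimal illustration of the two families of techniques the paper surveys, the sketch below contrasts a signature-based check (matching against known attack patterns) with an anomaly-based check (flagging deviations from a baseline learned on normal traffic). The signatures, events and the simple length-based anomaly measure are invented for illustration, not taken from any real IDS.

```python
import statistics

# Signature-based detection: match events against known attack patterns.
SIGNATURES = ["' OR 1=1", "../../etc/passwd", "<script>"]

def signature_alert(event: str) -> bool:
    return any(sig in event for sig in SIGNATURES)

# Anomaly-based detection: learn a baseline from normal traffic, then flag
# events that deviate from it by more than k standard deviations.
def train_baseline(normal_events):
    lengths = [len(e) for e in normal_events]
    return statistics.mean(lengths), statistics.pstdev(lengths)

def anomaly_alert(event, mean, stdev, k=3.0):
    return abs(len(event) - mean) > k * stdev

normal = ["GET /index.html", "GET /about.html", "GET /contact.html"]
mean, stdev = train_baseline(normal)
print(signature_alert("GET /login?user=' OR 1=1 --"))   # True: known pattern
print(anomaly_alert("GET /" + "A" * 500, mean, stdev))  # True: abnormal length
```

The trade-off the paper discusses is visible even here: the signature check cannot flag a novel attack it has no pattern for, while the anomaly check can, at the cost of possible false alarms on unusual but legitimate traffic.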


A Review on Electronic Dictionary and Machine Translation System Developed in North-East India

Saiful Islam and Bipul Syam Purkayastha


Electronic dictionaries and Machine Translation systems are two of the most important language learning tools for acquiring knowledge of known and unknown natural languages. Natural languages are the most important means of human communication; these two tools are therefore very important and frequently used in daily life. Electronic Dictionary (E-dictionary) and Machine Translation (MT) systems are especially helpful for students, research scholars, teachers, travellers and business people, and are important applications and research tasks in Natural Language Processing (NLP). The demand for research on E-dictionary and MT systems is growing in the world as well as in India. North-East (NE) India is a very popular and multilingual region, and yet only a small number of E-dictionary and MT systems have been developed for NE languages. Through this paper we elaborate on the importance, approaches and features of E-dictionary and MT systems, and review the existing E-dictionary and MT systems that have been developed for NE languages in NE India.


Agile Practices in Indian Organizations

Uma Kumari


Agile software development is a conceptual framework that promotes development in iterations throughout the project. In software development, Agile means quick moving. To satisfy customers and cope with their frequently changing requirements, heavyweight methodologies are being abandoned. Two major challenges in software development are producing high quality software and meeting stakeholder requirements. An independent online web based survey, an interview survey and a questionnaire survey were conducted. The motive was to find the percentage of users in India who are using Agile, and to find out whether Agile improves the productivity, quality and cost of software. The hypotheses were tested using the statistical one-way ANOVA method. The hypotheses designed are:

Hypothesis I

Productivity increases when Agile methodologies are used instead of heavyweight methods.

Hypothesis II

Quality increases when Agile methodologies are used instead of heavyweight methods.

Hypothesis III

Cost decreases when Agile methodologies are used instead of heavyweight methods.
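
The paper's survey data are not reproduced here, but the one-way ANOVA used to test hypotheses of this kind can be sketched in pure Python. The productivity scores below are invented for illustration; a large F statistic (compared against the critical F value at the chosen significance level) would support rejecting the null hypothesis that the Agile and heavyweight groups have equal means.

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA over two or more sample groups."""
    k = len(groups)                      # number of groups
    n = sum(len(g) for g in groups)      # total observations
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: variation of group means around the grand mean.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: variation inside each group.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical productivity scores (e.g. story points per sprint) per team.
agile = [34, 36, 38, 35, 37]
heavyweight = [28, 27, 30, 29, 26]
f = one_way_anova_f(agile, heavyweight)
print(round(f, 2))  # 64.0: far above typical critical values, so means differ
```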


Major Challenges of Mobile Adhoc Networks

Dharmesh Bhatt* and Bijendra Agrawal


In the unknown environment of a network, security is the major issue for safe communication between nodes. In such networks, many active and passive attacks are carried out by attackers, often in groups, on the data packets and routing messages. Mobile Adhoc Networks (MANETs) do not have any centralized authentication, and the nodes communicate with each other directly; security therefore plays a vital role in safe communication. The focus here is on the security of Mobile Adhoc Networks. A MANET is an open network, available to both malicious and trustworthy users, so it requires a robust solution that deals with malicious attackers and ensures healthy transmission between trustworthy nodes. Mobile Adhoc Networks work in isolation from the wired infrastructure. This elasticity is MANET's biggest strength, but it is also a big vulnerability for security. In this paper we discuss active and passive attacks, including black hole attacks, spoofing, wormhole and flooding attacks, as well as traffic monitoring, traffic analysis and eavesdropping respectively.


Comparative Study on Classification of Digital Images

H. B. Basanth Kumar


Digital images are widespread today, and can be classified into natural images and computer graphic (CG) images. Discriminating natural images from CG images is used in applications which include flower classification, image indexing, video classification and many more. With the rapid growth of image rendering technology, users can produce highly realistic computer graphic images using sophisticated graphics software packages. Due to the high realism of CG images, it is very difficult for a user to distinguish them from natural images with the naked eye. This paper presents a comparative study of the existing schemes used to classify digital images.


Plastic Money Security Issues in India

Reena Hooda and Deepak Dhaka2


The benefits of plastic money are its accessibility and acceptability: it is lighter than cash in the wallet, while its major problem is fraudsters who may exploit the rights of genuine stakeholders. At present, computing resources, communication technology and the availability of cards in a competitive market are placing India at the growing edge of the popularity of plastic money. After demonetization, people are employing the technology intensely; moreover, it is a step towards the digitalization of India. With the increasing population of smartphone users, people are more aware of the precautions to take while using plastic cards. However, compared to developed nations, in a land of villages like India the problem becomes more perilous: limited infrastructure and literacy, a high percentage of less skilled population, old customs and the non-reach of government facilities all undermine the applicability and security of plastic money and the Digital India campaign. Therefore, besides the rewards of plastic money and its contributions to digitalization, the present paper covers a range of security issues, in addition to techniques to make plastic money protected and reliable.


Wavelet Statistical Feature based Malware Class Recognition and Classification using Supervised Learning Classifier

Aziz Makandar and Anita Patrot



Malware is malicious code which may enable unauthorized private access through the internet. The types of malware are increasing day by day, and it is a challenging task for antivirus vendors to predict and catch them at access time. This paper aims to design an automated analysis system for malware classes based on features extracted by the Discrete Wavelet Transform (DWT), applying a four-level decomposition of the malware image. The proposed system works in three stages: pre-processing, feature extraction and classification. In pre-processing, the input image is normalized to 256x256 and denoised by applying a wavelet, which helps to enhance the image. In feature extraction, the DWT is used to decompose the image into four levels. For classification, support vector machine (SVM) classifiers are used to discriminate the malware classes using statistical features extracted from the level-4 DWT decomposition with the Daubechies (db4), Coiflet (coif5) and Bi-orthogonal (bior 2.8) wavelets. Among these, the db4 features classify the malware class type most effectively, with high accuracies of 91.05% and 92.53% on the two datasets respectively. The analysis of the proposed method was conducted on two datasets and the results are promising.
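
The paper uses the db4, coif5 and bior2.8 wavelets; as a simpler stand-in, the sketch below performs one level of the 2-D Haar DWT (the shortest wavelet) in pure Python, showing how an image splits into the LL, LH, HL and HH sub-bands whose statistics would feed the SVM. Repeating the transform on the LL band yields the four-level decomposition described above. The 4x4 "image" is invented for illustration.

```python
def haar_dwt_1d(signal):
    """One level of the 1D Haar DWT: approximation and detail coefficients."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_dwt_2d(image):
    """One level of the 2D Haar DWT: returns the LL, LH, HL, HH sub-bands."""
    # Transform each row into low-pass and high-pass halves.
    rows = [haar_dwt_1d(row) for row in image]
    lo = [r[0] for r in rows]
    hi = [r[1] for r in rows]
    # Then transform the columns of each half.
    def transform_columns(mat):
        out = [haar_dwt_1d(list(c)) for c in zip(*mat)]
        approx = [list(r) for r in zip(*[o[0] for o in out])]
        detail = [list(r) for r in zip(*[o[1] for o in out])]
        return approx, detail
    ll, lh = transform_columns(lo)
    hl, hh = transform_columns(hi)
    return ll, lh, hl, hh

def band_mean(m):
    """Example statistical feature of a sub-band (mean coefficient)."""
    return sum(sum(r) for r in m) / (len(m) * len(m[0]))

img = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
ll, lh, hl, hh = haar_dwt_2d(img)
print(round(band_mean(ll), 6))  # 17.0: the LL band averages 2x2 blocks (x2)
```

Features such as the mean, variance and energy of each sub-band, collected over all decomposition levels, form the vector handed to the SVM classifier.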


Breast Cancer Detection using Image Processing Techniques

Prannoy Giri and K. Saravanakumar 


Breast cancer is one of the significant causes of death among women. Much research has been done on the diagnosis and detection of breast cancer using various image processing and classification techniques. Nonetheless, the disease remains one of the deadliest, affecting one out of six women in her lifetime. Since the cause of breast cancer remains unknown, prevention is impossible; thus, early detection of a tumour in the breast is the only way to cure breast cancer. Using Computer Aided Diagnosis (CAD) on mammographic images is the most efficient and easiest way to diagnose breast cancer, and accurate discovery can effectively reduce the mortality rate. Masses and microcalcification clusters are important early symptoms of possible breast cancers, and can help predict breast cancer in its infant state. The images for this work are taken from the DDSM database (Digital Database for Screening Mammography), which contains approximately 3000 cases and is used worldwide for cancer research. This paper quantitatively describes the analysis methods using texture features for the detection of cancer. These texture features are extracted from the ROI of the mammogram to characterize the microcalcifications as harmless, ordinary or threatening. The features are then reduced using Principal Component Analysis (PCA) for better identification of masses, and finally passed through a back propagation algorithm (neural network) for better understanding of the cancer pattern in the mammography image.
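
As a small illustration of the PCA reduction step described above, the sketch below computes the first principal component of two correlated features in pure Python (closed-form eigendecomposition of the 2x2 covariance matrix) and projects each feature pair onto it. The feature values are invented; real mammogram texture features would be far higher-dimensional, but the mechanics are the same.

```python
import math

def pca_first_component(data):
    """First principal component of 2-D points via the 2x2 covariance matrix."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    # Sample covariance matrix entries.
    sxx = sum((p[0] - mx) ** 2 for p in data) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in data) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in data) / (n - 1)
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] (closed form for 2x2).
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Corresponding eigenvector: the direction of maximum variance, normalized.
    vx, vy = lam - syy, sxy
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm), lam

# Hypothetical pairs of correlated texture features (e.g. contrast, energy).
features = [(2.0, 1.9), (3.1, 3.0), (4.2, 4.1), (5.0, 4.8), (6.1, 6.0)]
direction, variance = pca_first_component(features)
# Projecting each pair onto `direction` reduces two features to one score.
scores = [x * direction[0] + y * direction[1] for x, y in features]
print(len(scores))  # 5 one-dimensional scores replace the 2-D feature pairs
```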


An Automated Testing Model using Test Driven Development Approach

C.P. Patidar1 and Arun Dev Dongre2


Today we live in the era of software and web applications. Software is used in every minor and major field: in defense, medicine, education, research, government, administration and many other domains it has become a necessary part. Software brings transparency to systems, and makes people's lives easy and comfortable. Software testing is a very important part of any software development process, requiring approximately forty percent of the development budget. Just as in the automobile industry every vehicle is tested before it goes to the customer, in software engineering it is essential to test the software before deployment: if software is deployed without testing, users will encounter bugs and will be unhappy with the software. In this paper we compare manual and automated testing and propose an automated testing model with test driven development (TDD).
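
The TDD cycle the paper builds on can be shown in miniature: write a failing test first, then just enough code to make it pass, then refactor. The `slugify` function and its test below are invented purely to illustrate the cycle.

```python
import unittest

# TDD step 1 ("red"): the test is written first, before the implementation.
class TestSlugify(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# TDD step 2 ("green"): write just enough code to make the test pass.
def slugify(title: str) -> str:
    return "-".join(title.lower().split())

# Run the suite programmatically (an automated test harness would do this).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True once the implementation satisfies the test
```

Step 3 ("refactor") then improves the implementation while the test keeps guarding its behavior; in an automated testing model this suite runs on every change.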


An Optimal Resource Provisioning Algorithm for Cloud Computing Environment

Shivangi Nigam and Abhishek Bajpai


Resource provisioning in a cloud computing environment ensures flexible and dynamic access to cloud resources for end users. The Multi-Objective Decision Making (MODM) approach assigns priorities to the decision alternatives in the environment. Each alternative represents a cloud resource defined in terms of various characteristics, termed decision criteria; the provisioning objectives refer to the heterogeneous requirements of the cloud users. This research study proposes a Resource Interest Score Evaluation Optimal Resource Provisioning (RISE-ORP) algorithm which uses the Analytic Hierarchy Process (AHP) and Ant Colony Optimization (ACO) as a unified MODM approach to design an optimal resource provisioning system. AHP is used to rank the cloud resources for provisioning, while ACO is used to examine which resources' traits best satisfy the provisioning objectives. The performance of this approach is analyzed using CloudSim. The experimental results show that our approach improves on the performance of the previously used AHP approach for resource provisioning.
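
The RISE-ORP scoring itself is not reproduced here, but the AHP ranking step it builds on can be sketched briefly: a pairwise-comparison matrix is column-normalized and row-averaged to give a priority weight per alternative. The comparison values and the "cloud resource" framing below are invented for illustration.

```python
def ahp_priorities(pairwise):
    """Priority vector from an AHP pairwise-comparison matrix
    (column normalization followed by row averaging)."""
    n = len(pairwise)
    # Normalize each column so it sums to 1.
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    normalized = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    # Priority of each alternative = average of its normalized row.
    return [sum(row) / n for row in normalized]

# Hypothetical comparison of three cloud resources on one criterion (e.g. CPU):
# entry [i][j] states how strongly resource i is preferred over resource j,
# on the usual 1..9 AHP scale, with [j][i] as the reciprocal.
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(matrix)
ranking = sorted(range(3), key=lambda i: -weights[i])
print(ranking[0])  # 0: the first resource receives the highest priority
```

In the full algorithm one such weight vector per criterion would be combined with criterion weights, and ACO would then search the ranked resources for the best provisioning assignment.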


I-v Fuzzy Shortest Path in a Multigraph

Siddhartha Sankar Biswas


In this research paper the author introduces the notion of an interval-valued (i-v) fuzzy multigraph. The classical Dijkstra algorithm for finding the shortest path in graphs is not applicable to fuzzy graph theory. Consequently, the author proposes a new algorithm, called the IVF-Dijkstra algorithm, which follows the philosophy of the classical Dijkstra algorithm to solve the shortest path problem (SPP) in an i-v fuzzy multigraph.
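
The paper's IVF-Dijkstra is not reproduced here; the sketch below only illustrates the underlying idea with interval-valued edge weights. Intervals are added component-wise along a path and ranked by their midpoint, which is one common (but not the only) way to order interval numbers; the paper's exact ordering may differ. The graph and weights are invented for illustration.

```python
import heapq

def iv_dijkstra(graph, source):
    """Dijkstra over edges weighted by intervals (lo, hi). Path weights add
    component-wise; candidates are ranked by the interval midpoint."""
    dist = {source: (0.0, 0.0)}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > sum(dist[u]) / 2:
            continue  # stale heap entry, a shorter interval was found later
        for v, (lo, hi) in graph.get(u, []):
            cand = (dist[u][0] + lo, dist[u][1] + hi)
            if v not in dist or sum(cand) / 2 < sum(dist[v]) / 2:
                dist[v] = cand
                heapq.heappush(heap, (sum(cand) / 2, v))
    return dist

# A small multigraph: parallel edges between the same pair are simply listed twice.
graph = {
    "A": [("B", (1.0, 2.0)), ("B", (2.0, 2.2)), ("C", (4.0, 5.0))],
    "B": [("C", (1.0, 1.5))],
}
print(iv_dijkstra(graph, "A")["C"])  # (2.0, 3.5): via B, using the cheaper parallel edge
```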


Secure ASP.NET Web Application by Discovering Broken Authentication and Session Management Vulnerabilities

Rupal R Sharma1 and Ravi K Sheth2


Today, web application security is the most significant battlefield between victims, attackers and providers of web services. The owners of web applications often cannot see the security vulnerabilities in an application developed in ASP.NET. This paper explains an algorithm which aims to identify broken authentication and session management vulnerabilities. The method scans the web application files: the scanner relies on studying the source code of the application's ASP.NET files and the code-behind files. A program developed for this purpose produces a report which describes the vulnerability types, mentioning the file name, a description and the location of each finding. The aim of the paper is to discover broken authentication and session management vulnerabilities; the proposed algorithm will help organizations and developers repair the vulnerabilities and improve end-to-end security.
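
The paper's rule set is not given in the abstract; the sketch below only illustrates the scan-and-report shape of such a tool. It matches source lines against a few detection patterns and records (file, rule, line number) findings. The patterns are invented examples of session-management smells, not an exhaustive or authoritative rule set.

```python
import re

# Hypothetical detection rules for broken-auth / session-management smells
# in ASP.NET sources (patterns invented for illustration).
RULES = {
    "cookieless sessions enabled": re.compile(r'cookieless\s*=\s*"true"', re.I),
    "requireSSL disabled": re.compile(r'requireSSL\s*=\s*"false"', re.I),
    "hard-coded credential": re.compile(r'password\s*=\s*"[^"]+"', re.I),
}

def scan_source(name, text):
    """Return (file, rule, line number) findings for one source file."""
    findings = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for rule, pattern in RULES.items():
            if pattern.search(line):
                findings.append((name, rule, lineno))
    return findings

web_config = '<sessionState cookieless="true" />\n<forms requireSSL="false" />'
report = scan_source("Web.config", web_config)
for finding in report:
    print(finding)  # each finding names the file, the rule and the line
```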


Software Upgradation Model Based on Agile Methodology

Manvender Singh Rathore* and Deepa V. Jose


Agile software development works on twelve principles, under which requirements and solutions evolve through the combined teamwork of disciplined, interdisciplinary teams. The objective of this paper is to connect the agile methodology with a Version Control System (VCS) for more efficient and effective utilization of resources. In the proposed model based on the agile methodology, the Version Control System plays a vital role in getting work done faster compared to SCRUM. This paper compares various existing agile methodologies. The efficiency of the proposed model is demonstrated through comparative analysis with existing agile methods and using the ANOVA mathematical model. Bitbucket is used as the web based Version Control System hosting service, and the proposed model is evaluated by maintaining similar sprints in SCRUM and the proposed VSprint model. The VCS and previous SRS documents are the important components of this proposed model; they help increase the speed of work at different phases of software development, which the existing models do not consider.


Enhanced Content Based Double Encryption Algorithm Using Symmetric Key Cryptography

Junestarfield Lyngdoh Kynshi and Deepa V Jose


This paper aims to solve the problems of an existing content based double encryption algorithm using symmetric key cryptography. Simple binary addition, the folding method and the logical XOR operation are used to encrypt the content of a plaintext as well as the secret key. This algorithm helps to achieve secure transfer of data through the network; it solves the problems of the existing algorithm and provides a better solution. The plaintext is encrypted using the above methods to produce a ciphertext. The secret key is encrypted and shared through a secure channel, and without knowing the secret key it is difficult to decipher the text. As expected, the enhanced encryption algorithm gives better results than the existing encryption algorithm.
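
The enhanced algorithm itself is not specified in the abstract; the sketch below only illustrates the XOR-with-folded-key idea it builds on. The key is folded (repeated with XOR) to the message length, and because XOR is its own inverse, applying the same operation twice recovers the plaintext. The message and key are invented, and a real scheme would add the binary-addition step and a secure key exchange on top of this.

```python
def fold_key(key: bytes, length: int) -> bytes:
    """Stretch or shrink the key to `length` bytes, XOR-folding any overlap."""
    folded = bytearray(length)
    for i in range(max(length, len(key))):
        folded[i % length] ^= key[i % len(key)]
    return bytes(folded)

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Encrypt or decrypt: XOR with the folded key is its own inverse."""
    stream = fold_key(key, len(data))
    return bytes(b ^ k for b, k in zip(data, stream))

message = b"attack at dawn"
secret = b"secret"
cipher = xor_crypt(message, secret)
print(cipher != message, xor_crypt(cipher, secret) == message)  # True True
```

Note that plain repeated-key XOR is not secure on its own; in the paper's setting it is one layer of a double-encryption construction, not a standalone cipher.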


Detection and Isolation of Zombie Attack under Cloud Environment

Sunil Kumar and Maninder Singh


Network security, data security and several other types of security, such as computer security, collectively compose the term "cloud security". Cloud computing poses a new challenge because the traditional security mechanisms being followed are insufficient to safeguard cloud resources, and the cloud can easily be targeted by attackers. A group of malicious or illegitimate users can attack the system, which may lead to denial of services for legitimate users. Such attacks are performed by malicious (zombie) attackers, and a zombie attack degrades network performance to a large extent. Traditional techniques cannot easily detect a zombie attacker in the cloud network. So in this paper we propose a technique, an enhancement of the mutual authentication scheme, to detect and isolate zombie attacks and preserve the efficient performance of the network.
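
The proposed enhancement is not detailed in the abstract; the sketch below illustrates the basic mutual-authentication building block such schemes extend: an HMAC challenge-response in both directions, so a zombie node that does not hold the pre-shared key fails to authenticate either way. The key and nonce sizes are illustrative choices, not the paper's parameters.

```python
import hashlib
import hmac
import secrets

SHARED_KEY = b"pre-shared secret between client and cloud"

def respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Prove knowledge of the shared key without ever sending the key."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# Direction 1: the server challenges the client with a fresh nonce.
server_nonce = secrets.token_bytes(16)
client_proof = respond(server_nonce)          # computed on the client side
server_ok = hmac.compare_digest(client_proof, respond(server_nonce))

# Direction 2: the client challenges the server with its own fresh nonce,
# so a spoofed server (or zombie node) fails this direction as well.
client_nonce = secrets.token_bytes(16)
server_proof = respond(client_nonce)
client_ok = hmac.compare_digest(server_proof, respond(client_nonce))

print(server_ok and client_ok)  # True only when both peers hold the key
```

A node that repeatedly fails this exchange can then be flagged and isolated, which is the detect-and-isolate behavior the paper targets.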


Classification of Human Organ Using Image Processing

Sindhu* and V. Vaidhehi


Large collections of digital images have been used for efficient, advanced classification and intelligent retrieval in medical imaging. This research work classifies human organs based on MRI images; various MRI images of organs are considered as the data set. The main objective of this work is to automate the medical imaging system. Digital images are retrieved based on their shape using Canny edge detection and clustered together into one class using the K-Means algorithm. 2564 data sets related to the brain and heart are considered for this research work. The system was trained to classify the images, which results in faster execution in the medical field and also helps obtain noiseless, efficient data.
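
To illustrate the K-Means clustering step described above, the sketch below groups invented 2-D shape descriptors (as might be derived from Canny edge maps) into two clusters in pure Python. Real MRI-derived descriptors would be higher-dimensional, and seeding from the first k points is the simplest choice, not necessarily the paper's.

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points; the first k points seed the centroids."""
    centroids = [list(p) for p in points[:k]]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to the nearest centroid (squared Euclidean).
            i = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                  + (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # recompute each centroid as the mean of its members
                centroids[i] = [sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl)]
    return centroids, clusters

# Hypothetical shape descriptors (e.g. edge-map area, perimeter) forming two
# visibly separated groups, standing in for brain-like vs heart-like contours.
data = [(1.0, 1.1), (1.2, 0.9), (0.9, 1.0), (5.0, 5.2), (5.1, 4.9), (4.8, 5.0)]
centroids, clusters = kmeans(data, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]: one cluster per organ class
```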


A New Distributed Intrusion Detection System in Computer Network: An Approach to Detect Malicious Intrusion Threats at Initial Stage

Parveen Sadotra and Chandrakant Sharma


The internet is a blessing for the human community in modern days, and the use of networks is now indispensable. However, the use of networks and the internet has also brought large numbers of security threats to our databases and information systems, and there are many intrusion attacks on public and private networks. The main objective of this research work is to study the problems associated with intrusion in network systems and to analyze the use of Intrusion Detection Systems (IDS). We scrutinize the use of various IDS and develop a new IDS intended to be effective, easy to use and cost effective for users. We present our newly developed application based IDS as a suitable way to detect threats in a network system: it is cost effective, easy to use, and includes an instantaneous alert system to notify security professionals of intrusions.


Intelligent Information Retrieval Using Hybrid of Fuzzy Set and Trust

Suruchi Chawla


The main challenge for effective web Information Retrieval (IR) is to infer the information need from the user's query and retrieve relevant documents. The precision of search results is low due to vague and imprecise user queries, which fail to retrieve sufficient relevant documents. Fuzzy set based query expansion deals with imprecise and vague queries in order to infer the user's information need, while trust based web page recommendation retrieves search results according to that need. In this paper an algorithm is designed for Intelligent Information Retrieval using a hybrid of fuzzy sets and trust in web query session mining: fuzzy query expansion is performed to infer the user's information need, and trust is used to recommend web pages according to it. Experiments were performed on a data set collected in the Academics, Entertainment and Sports domains, and the search results confirm the improvement in precision.
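
As a minimal illustration of fuzzy set based query expansion, the sketch below applies an alpha-cut to membership degrees between a query term and candidate expansion terms, keeping only terms whose membership is at least alpha. The terms and degrees are hypothetical; in the paper's setting such degrees would come from mined web query sessions.

```python
# Hypothetical fuzzy membership of candidate expansion terms with respect to
# a query term (values invented; in practice mined from past query sessions).
membership = {
    "ml": {"machine learning": 0.9, "neural networks": 0.7,
           "milliliter": 0.3, "football": 0.05},
}

def fuzzy_expand(term, alpha=0.5):
    """Alpha-cut: keep expansion terms whose membership degree >= alpha,
    strongest first, resolving the vagueness of the original query term."""
    related = membership.get(term, {})
    return [t for t, mu in sorted(related.items(), key=lambda kv: -kv[1])
            if mu >= alpha]

print(fuzzy_expand("ml"))  # ['machine learning', 'neural networks']
```

The expanded query then retrieves candidate pages, and the trust scores decide which of them are actually recommended to the user.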


Automatic Feedback Generation in Software Performance Engineering: A Review

Javaid Iqbal and Syed Abrar Ul Haq


The automatic generation of architectural feedback from performance indexes like probability distributions, mean values and variances has been of interest to researchers over the last decade. It is well established that, due to the complexity of interpreting the performance indices obtained from the performance analysis of a software architecture, and the short time to market, an automated approach is vital for the acceptance of architecture based software performance engineering by the software industry. In the last decade some work has been done in this direction. The aim of this paper is to survey the existing research in the field, which will be valuable for researchers looking to contribute to this area.


Assessment of Accuracy Enhancement of Back Propagation Algorithm by Training the Model using Deep Learning

Baby Kahkeshan and Syed Imtiyaz Hassan


Deep learning is a branch of machine learning which has recently gained a lot of attention due to its efficiency in solving a number of AI problems. The aim of this research is to assess the accuracy enhancement obtained by using deep learning with the back propagation algorithm. For this purpose, two techniques have been used. In the first technique, the simple back propagation algorithm is used and the designed model is tested for accuracy. In the second technique, the model is first pre-trained using deep learning via deep belief nets, to make it learn and improve its parameter values, and then back propagation is applied over it. The advantage of the softmax function is exploited in both methods. Both methods have been tested on images of handwritten digits and the accuracy then calculated. It has been observed that there is a significant increase in the accuracy of the model when deep learning is applied for pre-training.
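
The softmax function mentioned above, and the convenient output-layer gradient it yields under cross-entropy loss (predicted probabilities minus the one-hot target), can be sketched in pure Python. The logits are invented; a full back propagation implementation would push this gradient back through the hidden layers.

```python
import math

def softmax(z):
    """Numerically stable softmax: turns logits into class probabilities."""
    m = max(z)  # subtract the max so exp() cannot overflow
    exps = [math.exp(v - m) for v in z]
    s = sum(exps)
    return [e / s for e in exps]

def output_gradient(probs, target):
    """Gradient of cross-entropy loss w.r.t. the logits under softmax:
    simply (p - y), where y is the one-hot target vector."""
    return [p - (1.0 if i == target else 0.0) for i, p in enumerate(probs)]

# Ten logits, as a handwritten-digit classifier would emit (values invented).
logits = [0.1, 2.0, -1.0, 0.5, 0.0, 0.3, -0.5, 1.2, 0.2, -0.2]
probs = softmax(logits)
grad = output_gradient(probs, target=1)  # suppose the true digit is 1
print(probs.index(max(probs)))  # 1: the largest logit gets the highest probability
```

The simplicity of the `p - y` gradient is the "advantage of the softmax function" the abstract refers to: the same output layer serves both the plain back propagation model and the deep-belief-net pre-trained one.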
