Impact of Biometric Attendance System on Secondary and Higher Secondary Educational Institutions Across J&K

Mudasir M Kirmani

Managing and monitoring employee attendance is a very important aspect of the smooth functioning of any public or private organization, and obtaining and maintaining attendance records has become a challenging task. To avoid human bias and direct human intervention, different government institutions have implemented biometric attendance systems in educational institutions to record employee attendance on a daily basis. This research work studies the impact of the Biometric Attendance System (BAS) on the educational system vis-à-vis the punctuality of employees in an educational institute. The study indicates that while biometric modalities are in principle secure and accurate, in practice the deployment of attendance systems in Jammu & Kashmir has highlighted some loopholes that exist in the present biometric attendance system.

Enhancing the Classification Accuracy of Noisy Dataset By Fusing Correlation Based Feature Selection with K-Nearest Neighbour

Samir Kumar Singha and Syed Imtiyaz Hassan*

The performance of data mining and machine learning tasks can be significantly degraded by the presence of noisy, irrelevant and high-dimensional data containing a large number of features. A large amount of real-world data contains noise or missing values, and many irrelevant features may be collected into storage repositories. These redundant and irrelevant feature values distort the classification principle, increase calculation overhead and decrease the prediction ability of the classifier. The high dimensionality of such datasets poses a major bottleneck in data mining, statistics and machine learning. Among several methods of dimensionality reduction, attribute or feature selection is often used. Since the k-NN algorithm is sensitive to irrelevant attributes, its performance degrades significantly when a dataset contains missing values or noisy data; this weakness can, however, be minimized by combining it with feature selection techniques. In this research we combine Correlation-based Feature Selection (CFS) with the k-Nearest Neighbour (k-NN) classification algorithm to obtain better classification results when the dataset contains missing values or noisy data. The reduced attribute set also decreases the time required for classification. The research shows that when dimensionality reduction is done using CFS and classification with k-NN, datasets with little or no noise may show a negative impact on classification accuracy compared with the k-NN algorithm alone. When additional noise is introduced to these datasets, the performance of k-NN degrades significantly, and when the noisy datasets are classified using CFS and k-NN together, classification accuracy improves.
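
Purely as an illustration of this recipe (the dataset and parameters below are stand-ins, not the authors' experimental setup), a minimal sketch of greedy CFS-style feature selection followed by k-NN:

```python
# Sketch: greedy CFS merit search, then k-NN on the selected features.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def cfs_merit(X, y, subset):
    """CFS merit: k*mean(feature-class corr) / sqrt(k + k(k-1)*mean(feature-feature corr))."""
    k = len(subset)
    rcf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    if k == 1:
        return rcf
    rff = np.mean([abs(np.corrcoef(X[:, a], X[:, b])[0, 1])
                   for i, a in enumerate(subset) for b in subset[i + 1:]])
    return (k * rcf) / np.sqrt(k + k * (k - 1) * rff)

def greedy_cfs(X, y, max_features=10):
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < max_features:
        best_merit, best_j = max((cfs_merit(X, y, selected + [j]), j) for j in remaining)
        if selected and best_merit <= cfs_merit(X, y, selected):
            break  # adding any feature no longer improves the merit
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

X, y = load_breast_cancer(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
feats = greedy_cfs(Xtr, ytr)
knn = KNeighborsClassifier(n_neighbors=5).fit(Xtr[:, feats], ytr)
print("selected features:", feats, "accuracy:", knn.score(Xte[:, feats], yte))
```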

Cellular Automata Based Study of Spectral Signatures of Dal_Lake Infrared Imagery

Fasel Qadir

Among all the water bodies in Jammu & Kashmir, Dal Lake has a peculiar significance due to its location in the heart of the capital city, Srinagar. Historical studies spanning the last fifteen hundred years indicate a continuous shrinking of the lake due to different natural and man-made interventions. Over this long period, the governance of the land has passed through various wise and ugly human plans besides some slow natural processes. Mathematical modelling of such dynamics is not an easy task because of the many intervening variables and the difficulty of measuring them. On the other hand, during the last decades, the use of Cellular Automata (CA) techniques to simulate the behaviour of linear or non-linear systems has become of great interest, mainly because this approach depends largely on local relations and a series of rules instead of precise mathematical formulae. Infrared (IR) satellite imagery can be helpful in identifying the different areas of interest using CA as an image processing tool. The study will not only separate the areas of interest but also pave the way towards a comprehensive study of all the identified zones using spectral signatures received from continuous IR imagery of both pre-monsoon and post-monsoon periods in future.
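
The abstract leaves the exact CA rules to the paper; as a flavour of the approach, a hedged sketch of one common rule, a Moore-neighbourhood majority vote that consolidates a thresholded IR band into coherent regions:

```python
# Sketch only: a majority-vote CA pass over a thresholded IR band
# (the threshold and rule here are assumptions, not the paper's rules).
import numpy as np
from scipy.ndimage import convolve

def ca_step(grid):
    kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
    neighbours = convolve(grid.astype(int), kernel, mode="nearest")
    return (neighbours >= 5).astype(int)  # cell turns on if most neighbours are on

ir = np.random.rand(64, 64)          # stand-in for an IR reflectance band
water = (ir < 0.4).astype(int)       # crude spectral threshold
for _ in range(5):
    water = ca_step(water)           # iterate until regions stabilise
```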

Approximation Query Layer (AQLayer): Design and Architecture

Tamanna Siddiqui and Mohammad AlKadri

Data scientists need to manipulate data (retrieve, aggregate, join, etc.) when they do their tasks, so it is very useful to build a layer that prepares the data for the analysis step. With approximate processing and an error tolerance defined by the user, that layer handles both inserting records or collections into the database and retrieving information from it.

In this paper we focus on the structure and the design of this layer, and dig deeper into how it translates the user's query into a SQL statement suitable for approximate processing.
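
As a sketch of the idea only (the function and its tolerance-to-sample heuristic are assumptions, not the AQLayer specification), a user aggregate plus an error tolerance can be lowered to sampled SQL, here using PostgreSQL's TABLESAMPLE:

```python
# Hypothetical translator: tighter tolerance -> larger sample fraction.
def approximate_sql(table, agg, column, error_tolerance):
    sample_pct = max(1, min(100, int(100 * (1 - error_tolerance) ** 4)))
    # Scale SUM/COUNT back up by the sampling ratio; AVG needs no scaling.
    scale = "" if agg.upper() == "AVG" else f" * {100 / sample_pct:.2f}"
    return (f"SELECT {agg}({column}){scale} "
            f"FROM {table} TABLESAMPLE BERNOULLI ({sample_pct})")

print(approximate_sql("sales", "SUM", "amount", error_tolerance=0.1))
```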

Trends of Manufacturing Systems with Distributed Computing

Abedalhakeem T. E. Issa

Industry developed dramatically in the second half of the 20th century, and with it manufacturing systems evolved from manual operation to fully computerized systems employing information and communication technology (ICT). Manufacturing systems have thus become totally dependent on ICT and have to keep pace with its advancement. Distributed computing has totally changed the computing paradigm in recent times, resulting in rapid employment of these technologies in the manufacturing sector. An important variable in the equation determining the trend of manufacturing technologies is purchaser choice and preference, which has become active recently. To address these heterogeneous user demands, the Autonomous Decentralized System (ADS) concept was introduced five decades ago. ADS has been a significant development incorporated in modern manufacturing systems and has been standardised as the de facto standard for factory automation. These systems hold the promise of on-line system maintenance, timeliness and assurance, ensuring greater productivity and cost benefit, emerging as the system of choice in automated manufacturing systems. This paper reviews ADS and its application to a manufacturing system, assesses the state of the art and discusses future trends.

Modeling and Simulation of Fiber Bragg Grating (FBG) as a Strain Sensor

Muhammad Arif Bin Jalil

This study presents the modelling, simulation and characterization of a fiber Bragg grating (FBG): its maximum reflectivity, its bandwidth, the effect of applied strain on the wavelength shift λB, and the sensitivity of the wavelength shift to strain for an optical sensing system. A commercial FBG with a centre wavelength of 1550 nm is used to measure the spectral response of the FBG to strain. The parameters used in these simulations are the fiber grating length L, ranging from 1 to 10 mm; the change in refractive index, Δn, from 0.0002 to 0.0020; the effective refractive index of 1.46; and the grating period Λ of 530 nm. The bandwidth and spectral reflectivity are analyzed as the refractive index and grating length vary. Simulations are carried out using OriginPro 2016 and Microsoft Excel 2010: the Excel sheet generates the data and OriginPro 2016 generates the graphs. The results obtained indicate that variation in grating length and refractive index affects the spectral reflectivity and the bandwidth. In addition, the results show that the change in the Bragg wavelength is due to the increase in length of the grating region caused by the applied strain.
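
For orientation, the standard uniform-grating relations behind such simulations, sketched in Python (the photoelastic coefficient p_e ≈ 0.22 is a typical silica value assumed here, not a figure from the paper):

```python
# Bragg condition, peak reflectivity and strain shift of a uniform FBG.
import numpy as np

n_eff, period = 1.46, 530e-9            # effective index, grating period (m)
lambda_B = 2 * n_eff * period           # 2*1.46*530 nm = 1547.6 nm, i.e. ~1550 nm
for L, dn in [(1e-3, 2e-4), (10e-3, 2e-3)]:
    R = np.tanh(np.pi * dn * L / lambda_B) ** 2   # peak reflectivity, coupled-mode theory
    print(f"L={L*1e3:.0f} mm, dn={dn}: R={R:.4f}")

p_e = 0.22                              # photoelastic coefficient (assumed)
strain = 1000e-6                        # 1000 microstrain
shift = lambda_B * (1 - p_e) * strain   # wavelength shift under applied strain
print(f"Bragg wavelength {lambda_B*1e9:.1f} nm shifts by {shift*1e12:.0f} pm")
```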

Assessing the Effect of Participatory Approach for Delivering Agricultural Information Service (AIS) to Smallholder Farmers

V. V. Sumanthkumar1, Nilesh Mishra2, Kanika Singh3, P. Vijaya Raju4 and Madhu Babu5

"Information has been described as the fifth most important need of mankind, ranking after air, water, food and shelter" (Kemp, 1976), and we believe smallholder farmers are no different. In fact, in addition to other key inputs (such as improved seed varieties and micro- and macronutrients), access to correct, timely and actionable information (in this case scientifically supported farming practices that include, but are not limited to, date of sowing, irrigation, harvesting, better prices and markets) can be considered a critical factor in achieving better yield and thereby higher profits. The source of this information is also a critical factor in its acceptance and adoption. For instance, a report published in 2005 by the National Sample Survey Organization (NSSO, 2005), a Government of India entity, highlights that the most preferred source of agricultural information services for smallholder farmers is their fellow farmers. Reports suggest that a participatory approach in video-based agricultural extension has proved more effective than traditional extension methods, but there is little evidence of the same for audio-based dissemination of agricultural information services (AIS).

Against this backdrop, a team of researchers at ICRISAT, along with its partners, conducted research to study the effect of a participatory approach for delivering agricultural information services to smallholder farmers. Improved and scientifically supported farming practices were recorded as small audio capsules in participating farmers' voices and were periodically delivered to other farmers who were part of the study. During the study, it was found that agricultural information services recorded in the voice of farmers from the same farming community create excitement among fellow farmers and positively affect the listening pattern of those messages. This in turn translates into increased adoption, better yield and higher profit.

Control Algorithm for Semantic Data Security in Computer Networks

Khaled Sadeq Al-shredei, Mohamad A. A. Al-Rababah and Nour E. Oweis

In this paper we discuss semantic information technology, which plays an important role in many areas of human activity; the task of semantic data security is therefore very important for information technology. This research supports the idea of attaching safety labels to RDF triples, so-called safety data. Confidential data carry security labels of different levels, and different departments have different ranges and varieties of user data use. Building on the idea of safety labels for RDF triples, this research identifies the main rules for controlling user access to the data by comparing the user's access level with the security level of the triple. Based on these rules, the proposed algorithms read and modify RDF triple data.
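
A minimal sketch of the flavour of such rules (the level lattice and the read/write conditions below are Bell-LaPadula-style assumptions, not the paper's exact algorithm):

```python
# Label-based access control over RDF triples, sketched.
LEVELS = {"public": 0, "confidential": 1, "secret": 2}

triples = [
    ("alice", "worksFor", "acme", "public"),
    ("acme", "annualRevenue", "12M", "secret"),
]

def can_read(clearance, label):
    return LEVELS[clearance] >= LEVELS[label]   # "no read up"

def can_write(clearance, label):
    return LEVELS[clearance] <= LEVELS[label]   # "no write down"

user = "confidential"
readable = [t[:3] for t in triples if can_read(user, t[3])]
print(readable)   # only the 'public' triple for a 'confidential' user
```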

A Study of Grain and Particle Sizes of 40%Ni-Fe, 50%Ni-Fe and 75%Ni-Fe Nanopowder Prepared by Mechanical Alloying

T. ASHOK KUMAR¹*, A. RAJADURAI² and GOUTHAMA³

Mechanical alloying through high-energy ball milling was used to produce Ni-Fe alloy powders starting from elemental Ni and Fe powders of average particle size 80 and 25 μm respectively. High-energy planetary ball milling at room temperature was performed for durations ranging from 2 to 100 hours. X-ray diffraction (XRD) and a particle size analyzer were used to characterize the powders. Comparing the three alloys, as the percentage of nickel increases, the rate of reduction of grain and particle size decreases. During mechanical alloying, while the grain size changes the particle size remains almost constant, and while the particle size changes the grain size remains almost constant. This may be because the milling energy is consumed in changing either the grain size or the particle size at any one time.

Privacy Preservation and Data Security on the Internet Using Mutual SSL

S. Sebastian and R. S. Chouhan

It is essential to maintain a balance between privacy protection and knowledge discovery. Internet users depend daily on SSL/HTTPS for secure communication on the internet.

Over the years, many attacks on the certificate trust model it uses have evolved. Mutual SSL authentication (shared verification) refers to two parties validating each other by checking digital certificates, so that both sides are assured of the other's identity.

In technical terms, it refers to a client (web browser or client application) authenticating itself to a server (server application), and that server likewise confirming its identity to the client, through verification of public-key certificates issued by trusted Certificate Authorities (CAs). Since confirmation depends on digital certificates, certification authorities such as Verisign or Microsoft Certificate Server are a critical part of the mutual authentication process.

From a high-level perspective, this paper outlines the process of authenticating and establishing an encrypted channel using certificate-based mutual SSL authentication.
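
A minimal sketch of certificate-based mutual SSL/TLS using Python's standard ssl module; the hostnames and certificate file paths are placeholders:

```python
import socket, ssl

# Server side: require a client certificate signed by a trusted CA.
server_ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
server_ctx.load_cert_chain(certfile="server.crt", keyfile="server.key")
server_ctx.load_verify_locations(cafile="ca.crt")
server_ctx.verify_mode = ssl.CERT_REQUIRED   # this is what makes the SSL "mutual"

# Client side: present a certificate and verify the server's.
client_ctx = ssl.create_default_context(cafile="ca.crt")
client_ctx.load_cert_chain(certfile="client.crt", keyfile="client.key")

with socket.create_connection(("example.com", 8443)) as sock:
    with client_ctx.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.getpeercert()["subject"])   # server identity, now verified
```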

Age Estimation Using OLPP Features

M. S. Vaishnavi* and  A. Vijayalakshmi

Aging face recognition poses a key difficulty in facial recognition: identifying a person's face over varied ages. It includes issues like age estimation, progression and verification. The non-availability of facial aging databases makes it harder for any system to achieve good accuracy, as there are no good training sets available. Age estimation, when done correctly, has a variety of real-life applications such as age-restricted vending machines, age-specific access control and finding missing children. This paper implements age estimation using the Park Aging Mind Laboratory face database, which contains metadata and 293 unique images of 293 individuals. Ages range from 19 to 45, with a median age of 32. Race is classified into two categories, African-American and Caucasian, giving an accuracy of 98%. Sobel edge detection and Orthogonal Locality Preserving Projection (OLPP) were used as the dominant features for training and testing of age estimation. Multi-stage binary classification using a support vector machine was used to classify images into an age group, thereafter predicting an individual's age. The effectiveness of this method can be increased by using a large dataset with a wider age range.

Comparative Analysis of Clustering Techniques for Various Models of Node Deployment Strategies

Alan J. George and Deepa V. Jose

Energy efficiency has always been a pressing matter in the world of Wireless Sensor Networks. Of the many routing protocols that exist for Wireless Sensor Networks, only a handful can be called efficient, and above them stands the emblematic LEACH protocol. This research work brings forth a new routing strategy based on the LEACH protocol, which aims at improving energy efficiency in Wireless Sensor Networks, applying the given clustering technique in randomly deployed and fixed sensor network simulation environments using MATLAB. In-depth simulations have shown that the proposed clustering strategy gives better performance than LEACH in terms of node lifetime. A comparative analysis of the rate of energy consumed under various node deployment strategies has also been carried out.
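
For context, a sketch of the cluster-head election threshold at the core of classical LEACH, which strategies like the proposed one modify (a Python sketch, not the paper's MATLAB code):

```python
# LEACH: node n elects itself head in round r if a random draw < T(n).
import random

def leach_threshold(p, r, is_candidate):
    """T(n) = p / (1 - p*(r mod 1/p)) for nodes not yet heads this epoch."""
    if not is_candidate:
        return 0.0
    return p / (1 - p * (r % int(1 / p)))

p = 0.05               # desired fraction of cluster heads
r = 7                  # current round
for node in range(10):
    if random.random() < leach_threshold(p, r, is_candidate=True):
        print(f"node {node} becomes a cluster head in round {r}")
```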

A Comparative Review on Different Methods of Face Recognition

Tenzin Dawa1 and N. Vijayalakshmi2

Face recognition is a biometric approach that can be used to identify or verify a person from a digital image using facial features unique to each individual. There are many techniques that can be used in a face recognition system. In this paper we review some of the algorithms and compare them to see which performs better. The techniques compared are Non-negative Matrix Factorization (NMF) with Support Vector Machine (SVM), Partial Least Squares (PLS) with Hidden Markov Model (HMM), and Local Ternary Pattern (LTP) with Booth's algorithm.

Prediction of Bike Sharing Demand

Purnima Sachdeva and K N Sarvanan

Bike sharing systems have been gaining prominence all over the world, with more than 500 successful systems deployed in major cities like New York, Washington and London. With increasing awareness of the harms of fossil-fuel-based means of transportation, problems of traffic congestion in cities and increasing health consciousness in urban areas, citizens are adopting bike sharing systems with zest. Even developing countries like India are adopting the trend, with a bike sharing system in the pipeline for Karnataka. This paper tackles the problem of predicting the number of bikes which will be rented at any given hour in a given city, henceforth referred to as the problem of 'Bike Sharing Demand'. In this vein, the paper investigates the efficacy of standard machine learning techniques, namely SVM, regression, random forests and boosting, by implementing them and analyzing their performance with respect to each other. The paper also presents two novel methods, Linear Combination and Discriminating Linear Combination, for the 'Bike Sharing Demand' problem, which supersede the aforementioned techniques as good estimates in the real world.
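
A hedged sketch of the kind of baseline such an investigation starts from (synthetic features stand in for the real dataset): a random forest regressor for hourly demand:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(0, 24, n),      # hour of day
    rng.integers(0, 7, n),       # day of week
    rng.normal(20, 8, n),        # temperature
    rng.uniform(0, 1, n),        # humidity
])
# Toy target with a daily cycle plus a temperature effect.
y = 50 + 20 * np.sin(X[:, 0] / 24 * 2 * np.pi) + X[:, 2] + rng.normal(0, 5, n)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(Xtr, ytr)
rmse = mean_squared_error(yte, model.predict(Xte)) ** 0.5
print(f"RMSE: {rmse:.1f} rentals/hour")
```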

Ambulance Tracking System Using RESTful API

C. S. Vikas1 and  Ashok Immanuel2

Saving people's lives is of the utmost importance in today's world, and the best way to save lives is an ambulance system that is effective and reachable by the user/client. This paper gives a solution which focuses on making an ambulance available to a nearby user/client/patient in the least possible time, which will help save many lives. After extensive study and analysis of newly evolved technology, the navigator.geolocation method based on RESTful web services is used: the ambulance location is continually updated in the database so that it can be seen by the user of the application, making it easy to book an ambulance. The client's location and the nearby ambulances are pinpointed on the Google map; once the patient is on board, the ambulance's location is taken and the list of hospitals is plotted on the map, helping the ambulance driver choose a nearby hospital and take the patient there on time.

Churn Analysis in Telecommunication Using Logistic Regression

Helen Treasa Sebastian and Rupali Wagh

Since the beginning of data mining, knowledge discovery from databases has been used to solve various problems and has helped businesses come up with practical solutions. Large companies fall behind on revenue growth because of increasing customer loss.

The process whereby a customer leaves one company and joins another is called churn. This paper discusses how to predict the customers who might churn; the R language is used to make the predictions. R helps represent a large churn dataset in the form of graphs, depicting the outcome through various data visualizations. Churn is a very important area in which the telecom domain can keep or lose customers, and hence the business spends a lot of time on predictions, which in turn inform the necessary business conclusions. Churn can be reduced by studying the past history of customers. Logistic regression is used for the analysis. To proceed with logistic regression we must first eliminate the outliers present; this has been achieved by cleaning the data (for redundancy, false data, etc.), and the result has been populated into a prediction sheet on which the analysis has been performed.
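
The paper works in R; as a minimal Python equivalent of the same flow — outlier cleaning, then a logistic-regression churn model — with the file name and column names as illustrative assumptions:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("telecom_churn.csv")            # placeholder file name
num = df.select_dtypes("number")
z = (num - num.mean()) / num.std()
df = df[(z.abs() < 3).all(axis=1)]               # drop gross outliers (z-score rule)

X = df[["day_minutes", "customer_service_calls", "intl_plan"]]  # assumed columns
y = df["churn"]
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("hold-out accuracy:", clf.score(Xte, yte))
```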

Performance Investigation of Antivirus – A Comparative Analysis

Remya Thomas and M. Nachamai*

Antivirus software, as the name implies, protects devices such as computers, mobiles and pen drives from viruses. All gadgets that interact with an open network are prone to viruses. A virus is a malicious software program which replicates by copying its code multiple times or by infecting a computer program (for example, modifying the existing program), affecting its operation. A virus can perform harmful tasks on the affected host computer, such as consuming hard disk space and CPU time or accessing private information. This paper examines the performance of five antivirus products (McAfee, Avast, Avira, Bitdefender, Norton) and their effectiveness on the computer. The performance is tested based on the time each antivirus takes to act on a computer; the parameters used to analyze performance are quick scan, full scan and custom scan with respect to time. The analysis shows that Bitdefender performs better than the other selected antivirus products.

Improved Fair Scheduling Algorithm for Hadoop Clustering

Sneha and Shoney Sebastian

The traditional way of storing huge amounts of data is not convenient, because processing those data in later stages is a very tedious job; nowadays, Hadoop is therefore used to store and process large amounts of data. Statistics of data generated in recent years show especially steep growth in the last two years. Hadoop is a good framework for storing and processing data efficiently: it works by parallel processing, and fault tolerance prevents failure and data loss. Job scheduling is an important process in Hadoop MapReduce. Hadoop comes with three schedulers, namely FIFO (first in, first out), Fair and Capacity; schedulers are now a pluggable component in the Hadoop MapReduce framework. This paper discusses the native job scheduling algorithms in Hadoop. The fair scheduling algorithm is analysed considering its response time, throughput and performance, and its advantages and drawbacks are discussed. An improved fair scheduling algorithm with a new strategy is proposed. Response time, throughput and performance are measured for both the naive and the improved fair scheduling; the improved algorithm targets workloads that mix jobs with high and low processing times.
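
A toy sketch of the fair-share intuition involved (a deliberate simplification, not Hadoop's actual Fair Scheduler code): each free task slot goes to the pool furthest below its fair share:

```python
# Deficit-based slot allocation across job pools.
def next_slot(pools, total_slots):
    fair = total_slots / len(pools)
    # pick the pool with the largest deficit (fair share minus allocation)
    return max(pools, key=lambda p: fair - pools[p])

pools = {"etl": 3, "adhoc": 0, "reports": 1}   # current slot allocations
for _ in range(4):                             # hand out four freed slots
    pools[next_slot(pools, total_slots=8)] += 1
print(pools)
```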

Predictive Analytics – The Cognitive Analysis

T. Venkat Narayana Rao1, Sohail Ali Shaik1 and S. Manminder Kaur1

Predictive analytics plays an important role in decision-making and intuitive business decisions, overthrowing the traditional instinct-driven process. Predictive analytics utilizes data mining techniques to predict future outcomes with a high level of certainty. This advanced branch of data engineering is composed of various analytical and statistical methods used to develop models that predict future occurrences. This paper examines the concepts of predictive analytics and the various mining methods used to achieve it. In conclusion, the paper discusses the process and issues involved in knowledge discovery.

Z-Dijkstra’s Algorithm to solve Shortest Path Problem in a Z-Graph

Siddhartha Sankar Biswas

In this paper the author introduces the notion of the Z-weighted graph, or Z-graph, in graph theory and considers the Shortest Path Problem (SPP) in a Z-graph. The classical Dijkstra's algorithm for finding shortest paths in graphs is not applicable to Z-graphs. Consequently, the author proposes a new algorithm, called the Z-Dijkstra's algorithm, which retains the philosophy of the classical Dijkstra's algorithm while solving the SPP in a Z-graph.
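
For reference, the classical Dijkstra skeleton being generalized; in a Z-graph the `+` and `<` on edge weights would be replaced by Z-number addition and ranking, which is where the paper's contribution lies:

```python
import heapq

def dijkstra(adj, src):
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                             # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w                           # Z-number addition in a Z-graph
            if nd < dist.get(v, float("inf")):   # Z-number ranking in a Z-graph
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

adj = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)]}
print(dijkstra(adj, "a"))   # {'a': 0, 'b': 2, 'c': 3}
```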

Comparative Study of Routing Protocols in MANET

Sachin Lalar and Arun Kumar Yadav

The routing protocol is an essential and vital performance factor in a Mobile Ad-hoc Network. Routing protocols in MANETs must handle a large number of nodes with restricted resources, and a variety of such protocols exist; which protocol is chosen can affect network performance. In this paper, we perform a comparative study of the DSDV, CGSR, WRP, AODV, OLSR, DSR, TORA, ZRP, ZHLS and DYMO routing protocols with respect to routing approach, routing structure, route selection, routing table, route maintenance, operation, strengths and weaknesses.

Evaluation of Object Utilization Through Statistical Model in Software Development Process

Kumar Rahul1, Brijesh Kumar Sinha1 and Vijay Kumar2

Objects need verification through a statistical model in the software development process, which is important in software industries nowadays. The software development process consists of several steps, from analysis to deployment and maintenance; a statistical model can therefore analyse objects, their various qualities and their relationships throughout development. Earlier, we designed a TMS in which objects are made available for various purposes, such as accessibility and reusability, in the development of a software or embedded product; the statistical model justifies the level of accessibility in terms of profitability and quantity of access. So far, various statistical models have been implemented to identify and establish relationships, but not all of them analyse and calculate the parametric standard and determine the reusability factor in the software development process model. This statistical model is justified at various levels of development and would help determine the cost of accessibility (CoA) and the cost of reusability (CoR).

A Novel Blind Digital Watermarking Based on SVD and Extreme Learning Machine

Neelam Dabas1, Rampal Singh2 and Vikash Chaudhary3

Modification of media and illegal reproduction are big problems nowadays because of the free availability of digital media, and protecting and securing digital data is a challenge. An Integer Wavelet Transform (IWT) domain robust watermarking scheme using Singular Value Decomposition (SVD) and an Extreme Learning Machine (ELM) has been proposed and tested on different images. In the proposed scheme, a watermark or logo is embedded in the IWT domain as ownership information using SVD, and an ELM is trained to learn the relationship between the original coefficients and the watermarked ones. The trained ELM is used in the extraction process to extract the embedded logo from the image. Experimental results show that the proposed watermarking scheme is robust against various image attacks such as blurring, noise, cropping, rotation and sharpening. The performance of the proposed scheme is measured with Peak Signal-to-Noise Ratio (PSNR) and Bit Error Rate (BER).
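
A hedged sketch of the SVD half of such schemes (the IWT stage and the ELM that makes the paper's scheme blind are omitted; this toy version keeps side information instead):

```python
# Embed a watermark by perturbing the singular values of the host image.
import numpy as np

def embed_svd(host, mark, alpha=0.05):
    U, S, Vt = np.linalg.svd(host, full_matrices=False)
    marked_img = U @ np.diag(S + alpha * mark) @ Vt
    return marked_img, (U, S, Vt)            # side info kept for extraction

def extract_svd(marked_img, side, alpha=0.05):
    U, S, Vt = side
    S_marked = np.diag(U.T @ marked_img @ Vt.T)   # recover perturbed singular values
    return (S_marked - S) / alpha

host = np.random.rand(64, 64)
mark = np.random.rand(64)
wm_img, side = embed_svd(host, mark)
print(np.allclose(extract_svd(wm_img, side), mark))   # True
```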

Comparison of Viola-Jones and Kanade-Lucas-Tomasi Face Detection Algorithms

Kamath Aashish and A. Vijayalakshmi

Face detection technologies are used in a large variety of applications like advertising, entertainment, video coding, digital cameras, CCTV surveillance and even military use, and they are especially crucial in face recognition systems: you can't recognise faces that you can't detect. But a single face detection algorithm won't work the same way in every situation; it all comes down to how the algorithm works. For example, the Kanade-Lucas-Tomasi algorithm makes use of spatial common intensity transformation to direct the search for the position that shows the best match, and it is much faster than traditional techniques because it checks far fewer potential matches between pictures. Another common algorithm, the Viola-Jones algorithm, is the most widely used face detection algorithm and is employed in most digital cameras and mobile phones; it uses cascades to detect features such as the nose and ears. However, in a group of people whose faces are close together, the algorithm might not work well, as features tend to overlap in a crowd and individual faces may go undetected. Therefore, in this work we test both the Viola-Jones and the Kanade-Lucas-Tomasi algorithms on each image to find out which algorithm works best in which scenario.
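
A minimal sketch of running the stock Viola-Jones detector shipped with OpenCV, the kind of baseline such comparisons start from (the image path is a placeholder; KLT-style feature tracking lives in cv2.calcOpticalFlowPyrLK):

```python
import cv2
import numpy as np

gray = cv2.imread("group_photo.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder path
if gray is None:
    gray = np.zeros((240, 320), np.uint8)   # fallback so the sketch still runs

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"{len(faces)} face(s) detected")
```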

Eighteenth Order Convergent Method for Solving Non-Linear Equations

V.B. Kumar Vatti1, Ramadevi Sri1 and  M.S. Kumar Mylapalli2

In this paper, we suggest and discuss an iterative method for solving nonlinear equations of the type f(x) = 0 having eighteenth-order convergence. This new technique is based on Newton's method and the extrapolated Newton's method. The method is compared with existing ones through some numerical examples to exhibit its superiority.
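
For reference, the classical Newton iteration on which the scheme builds (the eighteenth-order composition itself is defined in the paper; near a simple root α the basic step converges quadratically):

```latex
x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}, \qquad
|x_{n+1} - \alpha| \le C\,|x_n - \alpha|^{2}.
```

Higher-order schemes of this family compose several Newton-type substeps per iteration, multiplying the orders of the substeps.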

AMS Subject Classification:  41A25, 65K05, 65H05.

Virtual Reality Treatments for Specific Phobias: A Review

Sumit Roy and R. Kavitha

Virtual reality is becoming one of those seamless technologies that can be used to treat several psychological problems, such as anxiety disorders. With the advancement of technology, virtual reality is becoming available to ordinary practitioners for carrying out non-clinical therapies. An effective virtual reality system provides the user with total immersion, so that the user becomes a part of the virtual world. This study provides an insight into how virtual reality could provide the means to overcome anxiety disorders through a controlled environment projected to participants suffering from specific phobias.

RFID Security Issues in IoT: A Comparative Study

Denver Braganza and B. Tulasi

The landscape of the Internet of Things (IoT) has been evolving at an increasing rate over recent years. With the easy availability of mobile devices, there has been a tremendous leap in the technology associated with them, and the need for efficient intercommunication among these devices arises. Ensuring that IoT is seamlessly integrated into people's daily lives through appropriate technology is essential. One of the important technologies associated with IoT is RFID, which proves to be a simple and efficient technology for implementing IoT at various levels. Since IoT greatly impacts people's lives, one of its major concerns is security: IoT will have millions of devices and users connected to each other.

It is important to authenticate both users and devices to prevent any breach of information. With the limitations in RFID technology, various authentication protocols have been developed to provide optimal solutions.

Fuzzy Logic based Stock Value Prediction using Fundamental Analysis

Chittaranjan Mangale*, ShyamSundar Meena and Preetesh Purohit

The stock market is very versatile and fluctuates with time, which makes it difficult to predict the movement of a stock; there are various approaches and tools through which the price of a stock is determined from past patterns. The approaches are mostly of two kinds, fundamental and technical; for long-term valuation, the fundamental approach is used. Every stock has its own value that does not depend on its market price, known as the intrinsic value. The proposed model works through phases of data collection, feature processing, fuzzy logic mapping and stock value calculation. Fuzzy logic is used to map both qualitative and quantitative valuation factors, and IF-THEN rules are applied to the linguistic variables. The fuzzy model outputs the stock value, which is used to establish the stock's worth; the stock value is calculated with the dividend discount model. The accuracy of the system is 0.77. The results provide a backbone for the value, not the price.
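
For concreteness, the Gordon-growth form of the dividend discount model that such a valuation feeds into (the numbers below are illustrative, not from the paper):

```python
# Intrinsic value = next dividend / (required return - growth rate).
def dividend_discount_value(dividend, growth, required_return):
    if required_return <= growth:
        raise ValueError("required return must exceed growth")
    next_dividend = dividend * (1 + growth)
    return next_dividend / (required_return - growth)

# e.g. a 10-unit dividend, 6% growth, 11% required return -> 212.0
print(round(dividend_discount_value(10, 0.06, 0.11), 2))
```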

Phoneme Segmentation of Tamil Speech Signals Using Spectral Transition Measure

K. Geetha1 and R. Vadivel2

 

The process of identifying the end points of the acoustic units of a speech signal is called speech segmentation. Speech recognition systems can be designed using sub-word units such as phonemes. A phoneme is the smallest unit of a language; it is context dependent, and finding its boundary is tedious. Automated phoneme segmentation has been carried out in research using short-term energy, convex hull, formants, the Spectral Transition Measure (STM), group delay functions, the Bayesian information criterion, etc. In this research work, STM is used to find the phoneme boundaries of Tamil speech utterances. A Tamil spoken word dataset was prepared, with 30 words uttered by 4 native speakers recorded with a high-quality microphone. The performance of the segmentation is analysed and the results are presented.
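
A hedged sketch of a spectral transition measure in the spirit of Furui's definition — the mean squared slope of a short linear fit to each cepstral trajectory, whose peaks mark boundary candidates (the window size and features below are assumptions, not the paper's settings):

```python
import numpy as np

def stm(mfcc, half_win=2):
    """mfcc: array of shape (frames, coeffs). Returns an STM value per frame."""
    t = np.arange(-half_win, half_win + 1)       # centred time axis, mean zero
    denom = np.sum(t ** 2)
    out = np.zeros(len(mfcc))
    for i in range(half_win, len(mfcc) - half_win):
        window = mfcc[i - half_win:i + half_win + 1]   # (2w+1, coeffs)
        slopes = t @ window / denom                    # regression slope per coefficient
        out[i] = np.mean(slopes ** 2)                  # high value = spectral transition
    return out

# Peaks of stm(features) are then taken as phoneme-boundary candidates.
```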

Noise Removal and Filtering Techniques Used in Medical Images

Nalin Kumar and M. Nachamai

Noise removal techniques have become an essential practice in medical imaging applications for the study of anatomical structure and the image processing of MRI medical images. To address these issues, many de-noising algorithms have been developed, such as the Wiener filter, Gaussian filter and median filter. In this research, work is done with three such filters that have been successfully used in medical imaging. The noise types that most commonly affect medical MRI images are salt-and-pepper, speckle, Gaussian and Poisson noise. The medical images taken for comparison are MRI images, in gray scale and RGB. The performance of the algorithms is examined for various noise types: salt-and-pepper, Poisson, speckle, blur and Gaussian noise. The algorithms are evaluated by measuring image file size, histogram and the clarity scale of the images. The experimental results suggest that the median filter performs better for removing salt-and-pepper and Poisson noise on gray scale images, the Wiener filter performs better for removing speckle and Gaussian noise, and the Gaussian filter for blur.
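
A minimal sketch of the three filters under comparison, using standard OpenCV/SciPy calls (kernel sizes are illustrative, and a synthetic image stands in for an MRI slice):

```python
import numpy as np
import cv2
from scipy.signal import wiener

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # stand-in MRI slice

median_out   = cv2.medianBlur(img, 3)              # strongest on salt-and-pepper
gaussian_out = cv2.GaussianBlur(img, (5, 5), 1.0)  # smooths blur/Gaussian noise
wiener_out   = wiener(img.astype(float), (5, 5))   # adaptive; suits speckle
```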

Security Enhancement of AODV Protocol using Fuzzy based Trust Computation in Mobile Ad Hoc Networks

Ashish Kumar Jain and Vrinda Tokekar

A mobile ad hoc network (MANET) possesses self-configuration, self-control and self-maintenance capabilities. The nodes of a MANET are autonomous routers; hence, they are vulnerable to security attacks, and collaborative attacks such as black hole and wormhole are difficult to detect and prevent. Trust-based routing decisions are an effective approach for security enhancement in MANETs. In this study, trust computation using a fuzzy max-product composition scheme is applied to compute aggregated trust values, determine malicious nodes and thereby find a safe route. The results show the performance improvement of the proposed protocol over the AODV protocol. Network metrics are analysed under different mobility conditions and different positions of black hole nodes.
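
A small sketch of fuzzy max-product composition as a trust aggregator (the matrices are illustrative; the paper's membership functions and thresholds are not reproduced here):

```python
import numpy as np

direct = np.array([[0.9, 0.4, 0.7]])        # my trust in neighbours A, B, C
recommend = np.array([[0.8, 0.2],           # A's trust in targets X, Y
                      [0.5, 0.9],           # B's
                      [0.6, 0.3]])          # C's

# max-product composition: T[i, j] = max_k direct[i, k] * recommend[k, j]
products = direct[:, :, None] * recommend[None, :, :]
trust = products.max(axis=1)
print(trust)    # aggregated trust in X and Y; below a threshold -> flag as malicious
```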

A New 8T SRAM Circuit with Low Leakage and High Data Stability Idle Mode at 70nm Technology

P. Raikwal1*, V. Neema1 and A.Verma2

Memory has been facing several problems, of which leakage current is the most severe, and many techniques such as power gating and ground gating have been proposed to control leakage. In this paper, a new 8T SRAM cell adopting a single bit-line scheme is proposed to limit the leakage current and to achieve a high hold static noise margin. The proposed cell is implemented with low threshold voltage, high threshold voltage and dual threshold voltage to effectively reduce leakage current and delay. Additionally, a comparison is performed between the conventional 6T SRAM cell and the new 8T SRAM cell. The proposed circuit consumes 671.22 pA of leakage current during the idle state, which is far less than a conventional 6T SRAM cell with sleep and hold transistors and with different β ratios. The proposed 8T SRAM cell shows the highest noise immunity, 0.329 mV, during the hold state. Furthermore, it exhibits minimum read and write access delays of 114.13 ps and 38.56 ps respectively, compared to the conventional 6T SRAM cell with different threshold voltages and β ratios.

Security with Respect to MANET (Mobile Ad-hoc Network) and IoT (Internet of Things)

Vikram Agrawal

A MANET is a self-organizing, decentralized and dynamic network in which participating nodes can move anywhere and can act as host or router at any time [1]. Because a mobile ad hoc network is decentralized, if a node that has been serving as a router for some time leaves the network, it becomes very difficult to transfer data packets. The self-organizing capability of nodes, MANET's main feature, is both an advantage and a disadvantage: it makes the network easy to maintain and the topology easy to change, but data transfer must tolerate these changes. MANETs are also used for large networks and the internet, but they lack smart objects, like those in IoT, which can share information machine to machine. Internet users worldwide are now increasing rapidly, accessing global information and technology [2]. IoT is basically used to converge applications and services, opening global business opportunities that can use the I-GVC (Information-driven Global Value Chain) for efficient productivity.

An Integer solution in Intuitionistic Transportation Problem with Application in Agriculture

M. A. Lone, S. A. Mir and M. S. Wani

In this paper, we investigate a transportation problem, a special kind of linear programming problem in which profits, supplies and demands are considered as intuitionistic triangular fuzzy numbers. The crisp values of these intuitionistic triangular fuzzy numbers are obtained by defuzzifying them, and the problem is formulated as a linear programming problem. The solution of the formulated problem is obtained through the LINGO software. If the obtained solution is non-integer, the Branch and Bound method can be used to obtain an integer solution.

Training Neural Network Elements Created From Long Short-Term Memory

Kostantin P. Nikolic

This paper presents the application of stochastic search algorithms to train artificial neural networks. The methodology was created primarily to train complex recurrent neural networks; it is known that training recurrent networks is more complex than training feedforward networks. Simulation of the recurrent network propagates the signal from input to output, and the training process performs a stochastic search in the parameter space. The performance of this type of algorithm is superior to most training algorithms based on the concept of gradient. The efficiency of these algorithms is demonstrated by training networks created from units characterized by long-term and long short-term memory. The presented methodology is effective and relatively simple.

An Emerging Trend of Big Data for High Volume and Varieties of Data to Search of Agricultural Data

Parag Shukla1, Bankim Radadiya2 and Kishor Atkotiya3

The amount of data in the world is growing in size day by day because of digitization, the internet, smartphones, and social media and networking. A collection of data sets that is very large as well as complex is called Big Data; it is used in current and next-generation data transformation, analysis and storage of agricultural data on crops, seeds, works, labour, tools and the environment. Data sizes that were once measured in megabytes and gigabytes are now measured in petabytes and exabytes.

Traditional database systems are not able to capture, store and analyze data at this scale, and as the internet grows, the amount of big data continues to grow. Big data analytics provides new ways for businesses and government to analyze unstructured data. Today, big data is one of the most important and challenging topics in the information technology world, and it will play a very important role in future.

Big data changes the way the world manages and uses large amounts of data. Applications lie in areas such as medical issues, healthcare, traffic, banking, retail management, education and so on, and organizations are becoming more reliable, flexible and open. Figure 2 displays a big data and analytics road map for large-scale data analysis and storage.

A Flexible Conceptual Framework for Supporting Collaborative Work

Aiman Turani

Many practitioners and researchers have stated that the uptake of Computer-Supported Collaborative Work has not been as successful as hoped: many challenges face managers and team members when conducting collaborative work sessions in a virtual environment. The collaboration script is considered a relatively new approach to designing successful collaboration sessions; a collaboration script formally defines the flow of activities needed during a collaboration session. Yet researchers have been uncertain about the inherent complexity of such scripts.

The key strength of the proposed framework is its ability to derive a collaborative scripting language that can describe complex designs while remaining simple. The resulting scripting language should resemble software scripting languages while being based on profound collaboration techniques. Team leaders and managers with basic programming skills should be able to adopt and use such a language in a short time, without being afraid of designing complex team collaboration sessions.

The proposed framework is composed of four main layers. The first layer is based on well-known collaborative techniques, which contain a set of mini-activities located in the second layer. The third layer formulates the notation and rules of the proposed scripting language; it encloses the required components and commands to be used in such scripts. The fourth layer provides the support needed to implement the scripts through an appropriate set of collaborative and supportive tools.

A Novel Approach to Group Research Proposal and Allocate the Research Reviewer by Using Text-Mining and Clustering

Geeta Dangar*, Vipul Vekariya, Daxa Vekariya

In any organization, much effort is expended on research project selection. Before each research proposal is given to the appropriate experts, proposals need to be grouped according to their research topics and areas. In this paper, clustering methods and text data mining approaches are used to group research proposals and assign them to expert reviewers for review. The paper uses real data from the BNSF (Beijing Natural Science Foundation).
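
A hedged sketch of the general recipe (toy proposals, not the BNSF data, and not necessarily the paper's exact methods): TF-IDF text features clustered with k-means, with clusters then mapped to reviewer pools:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

proposals = [
    "deep learning for medical image segmentation",
    "graph clustering of social networks",
    "convolutional networks for tumour detection",
    "community detection in large graphs",
]
X = TfidfVectorizer(stop_words="english").fit_transform(proposals)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # proposals sharing a label go to the same reviewer pool
```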

Empirical Implementation of AdaBoost to Solve Ambiguity

Boshra F. Zopon AL Bayaty1, Shashank Joshi2

Word sense disambiguation is the process of identifying the correct meaning of a word based on the algorithm used. Much research has been carried out in this domain; the most popular dataset referred to is WordNet. This paper discusses word sense disambiguation using the AdaBoost algorithm. In this work, WordNet data and Senseval standards are used to resolve the meaning of a word with the help of the given context.
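
A toy sketch of AdaBoost over bag-of-words context features for a two-sense word (illustrative data, not the WordNet/Senseval setup):

```python
from sklearn.ensemble import AdaBoostClassifier
from sklearn.feature_extraction.text import CountVectorizer

contexts = ["deposit money in the bank", "sat on the river bank",
            "the bank approved the loan", "fish near the bank of the stream"]
senses = ["finance", "river", "finance", "river"]

vec = CountVectorizer()
X = vec.fit_transform(contexts)
clf = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, senses)
# Classify a new context; with these toy data, 'finance' is the likely output.
print(clf.predict(vec.transform(["loan from the bank"])))
```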

An Incentive-Based Peer-to-Peer Grid Scheduling

K. Srinivasa Rao1* and M.V.S.N. Maheswar2

In a grid computing environment, resources are autonomous, wide-area distributed, and usually not free. These unique characteristics make scheduling in a self-sustainable, market-like grid highly challenging. The goal of our work is to build a global computational grid in which every participant has enough incentive to stay and play. There are two parties in the grid, resource consumers and resource providers, so the performance objective of scheduling is two-fold: for consumers, a high rate of successful job execution; for providers, fair allocation of benefits. We propose incentive-based grid scheduling, composed of a P2P decentralized scheduling framework and incentive-based scheduling algorithms. The scheme utilizes a peer-to-peer decentralized scheduling framework, a set of local heuristic algorithms, and three market instruments: job announcement, price and competition degree. The results show that our approach outperforms other scheduling schemes in optimizing incentives for both consumers and providers, leading to highly successful job execution and fair profit allocation.

Enabling New Generation Security Paradigm With Quantum Cryptography

T. Venkat Narayana Rao1, Maithreyi Simhachalam2, Smitha Bandyala2, B. Vasundara Devi4

Quantum cryptography is a technology that ensures security: it enables secure communication based on fundamental physical laws. Quantum cryptography rests on two elements of quantum mechanics, the Heisenberg uncertainty principle and the principle of photon polarization. This paper focuses on the principles of quantum cryptography, the mechanism by which photons are encrypted, and its contribution to real-time security in all domains of application development.
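
As a flavour of photon-polarization key exchange, a toy simulation of BB84 sifting (a classical simulation of the textbook protocol, offered as background rather than the paper's own description):

```python
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # rectilinear / diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

# Bob measures: correct basis -> correct bit; wrong basis -> random outcome.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Publicly compare bases; keep only matching positions -> shared sifted key.
key = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print("sifted key:", key)
```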

 

Website Injection for Fraudulent Activities and Ways to Combat

T Venkat Narayana Rao1, Jella Shruthi2, Thakkallapally Sneha2

Nowadays, web injection appears in different modes, but it basically occurs when malicious and unwanted actors tamper directly with browser sessions for their business profit; malware is injected through ad networks into websites. This paper discusses how individuals play different roles in this kind of browser tampering and explores the consequences of malware attacks, as these are new trends in website attacks, describing the types of malware to watch out for on a site. Finally, the paper discusses solutions for reducing malware threats, including some best practices for protecting websites and businesses.

Various Techniques of DDoS Attacks Detection & Prevention at Cloud: A Survey

Dalima Parwani1, Amit Dutta2, Piyush Kumar Shukla3, Meenu Tahiliyani1

Cloud computing is one of the fastest growing paradigms for transmitting and storing data so that users can access it from anywhere. But with the advancement of this growing technology, various challenges have emerged, such as security against attacks, computational cost and power consumption. One of the major issues is the Distributed Denial of Service (DDoS) attack, in which a multitude of compromised systems attack a single target, denying service to the users of the targeted system. Various techniques have been implemented for detecting and preventing these attacks; some work well, while others have issues, concerns and problems. In this paper, a complete survey and analysis of various DDoS attack detection and prevention techniques is presented, so that, on the basis of the issues surfaced, a new, reformed and efficient technique can be implemented for detecting and preventing DDoS attacks, especially in cloud computing systems.

New Era of Effective Web Designing Technology Using HTML 5

Ashish P. Joshi1, Chetan  R. Dudhagara2, Hasamukh B. Patel2 

HTML5 is a markup language used primarily to design web pages. Earlier, HTML 4 was used, which is less capable with graphics, animation, audio, video, mobile technology, cross-browser independence, game development and other features used in web pages. HTML5 has now become very popular for its methodology and efficiency in web page design, removing the above limitations. Here we discuss the technologies supported by HTML5 and conclude how it improves on its predecessor.

A Robust DCT Based Digital Image Watermarking Using Fusion of Computational Intelligence Techniques

Monika Patel*, Priti Srinivas Sajja

 

In our digital society, the major sources of communication are digital media such as images, video, audio and text, so impersonation and copyright protection of digital content remain pressing problems across the globe. To deal with this problem, digital watermarking has become an advantageous solution for establishing ownership of digital content. In this paper we develop a robust DCT-based image watermarking scheme using a fusion of fuzzy logic and neural networks. In this technique, we embed and extract a biometric watermark in the middle frequency band using fuzzy-neuro-based watermarking. The paper presents a modified algorithm for DCT-based digital watermarking; to demonstrate its working, an experimental system is developed with different inputs, and the PSNR values are compared.
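
A hedged sketch of the middle-band DCT embedding step such schemes share; the paper's fuzzy-neuro logic would adapt the embedding strength per block, whereas alpha is fixed here:

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):  return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
def idct2(block): return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def embed_bit(block8, bit, alpha=5.0):
    """Embed one watermark bit into a middle-frequency DCT coefficient."""
    C = dct2(block8)
    C[3, 4] += alpha if bit else -alpha     # (3, 4): a middle-band position
    return idct2(C)

block = np.random.rand(8, 8) * 255
marked = embed_bit(block, bit=1)
print(dct2(marked)[3, 4] - dct2(block)[3, 4])   # ~ +5.0, the embedded shift
```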

Computational Approach in Complex Structure Prediction for Drug Design Activity

Chetan  R. Dudhagara1, Ashish P. Joshi2, Mayur M. Patel1

This paper takes a multidisciplinary approach spanning computer science (or information technology), bioinformatics and cheminformatics, and considers its effect on modern drug-related activities. Computational methods for studying the formation of intermolecular structures in drug discovery have been a subject of research for decades. Drug activity is obtained through the molecular binding of one molecule (the ligand) to the pocket of another, usually larger, molecule (the receptor); a protein is most commonly the receptor. Cheminformatics is the application of informatics methods to solve chemical problems; the term "cheminformatics" was introduced only a few years ago. Each compound tends toward a stable form. In the area of molecular modeling, molecular docking is the method that predicts the preferred orientation of one molecule to a second when bound to each other to form a stable complex.

Traceability of Implementation to Design and Requirements Specifications: A Formal Technical Review Method (Reverse Engineering Tool)

Rashmi Yadav*, Ravindra Patel and Abhay Kothari

The quality of a software product is a challenge for the software industry. Because the industry demands products in ever shorter time periods, developers and teams work under stress, miss things, and the software product falls short of the mark. The purpose of this paper is to show the significance of formal technical review of requirements gathering and design for any software, product or tool: reviews catch what is missing and improve software product quality. This paper elaborates how to perform requirements gathering and review it, for a reverse engineering tool.

Root to Fruit (3): A Framework to Manage Knowledge About Sorting Algorithms

Pramod Kadam and Sachin Kadam

This paper continues the initial thought of an evolutionary study of the sorting problem and sorting algorithms (Root to Fruit (1): An Evolutionary Study of Sorting Problem [1] and Root to Fruit (2): Evolutionary Approach for Sorting Algorithms [2]) and concludes with a suggestion to create a framework to manage knowledge related to sorting algorithms. The paper also discusses some possible difficulties and problems in implementing the suggested knowledge-base framework.

Traffic Sign Symbol Recognition Using Single Dimension PCA

Shashidhar T Halakatti1, Shambulinga T Halakatti2

In this paper, image processing methods are used to recognize traffic sign symbols from static images using single-dimension principal component analysis (SDPCA), with a focus on traffic sign symbols of triangular shape. Images are preprocessed with several techniques, such as thresholding, Gaussian filtering and Canny edge detection; then the stages of the single-dimension principal component analysis algorithm are performed to recognize the traffic sign symbol.
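
A hedged sketch of PCA-subspace matching of the general kind SDPCA refines (toy data and a plain PCA, not the paper's single-dimension variant):

```python
import numpy as np

def fit_pca(X, k):
    mean = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]                      # top-k principal axes

def project(x, mean, axes):
    return axes @ (x - mean)

rng = np.random.default_rng(0)
train = rng.random((20, 32 * 32))            # flattened training sign images
labels = rng.integers(0, 4, 20)              # 4 sign classes (toy data)
mean, axes = fit_pca(train, k=8)
coords = np.array([project(x, mean, axes) for x in train])

test = train[3] + 0.01 * rng.random(32 * 32) # slightly perturbed sign
d = np.linalg.norm(coords - project(test, mean, axes), axis=1)
print("predicted class:", labels[d.argmin()])  # nearest neighbour in PCA space
```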

Analysis of Uk’s Retail Industry Via Online Editorial Media

Jolly Masih*, Mohit Sharma, Amita Sharma

The study aimed at understanding the retail industry of the UK, its latest trends and the performance of different retail sections, namely high street retail, supermarkets and online retail. Online editorial media were used to collect data, using relevant keywords through data crawlers and Google search. It was found that Britain's high street has faced several problems in the past few months, such as high vacancy rates, parking problems, high rents and the impact of recession, prompting calls to protect the high street as a symbol of Britain's great culture. Customers' shopping habits have been shifting away from the high street and shopping malls towards online shopping, owing to convenience and time savings; consumers also believe that shopping out of town leads to long driving times, extra purchases and wasted fuel. The same can be said of the recent decrease in the share of supermarket giants like Tesco and Sainsbury's, with a vigorous price war under way between supermarkets, high street shops and online retailers. The town centre vacancy rate for high street shops was 10.3% in October, up from 10.1% in July, and footfall on high streets fell by 1.4% while out-of-town shopping centres saw a rise of 1.9%. Owing to convenient shopping, wide variety and time savings, online retail registered growth of 15.8% in 2014 compared with 2013, and mobile and tablet shopping have created a revolution in the retail industry, increasing the percentage of purchases made several-fold. Keeping this in view, the present study attempts to analyse the various factors causing the decline in the share of the high street and supermarkets and the increased interest in online retail.
