Author Archives: Kamran Khan

An Initiation of Dynamic Programming to Solve the Graphical As Well As Network Problems for the Minimum Path Between Nodes

Manish Jain, Adarsh Srivastava, Anand Kumar Dixit and Somnath Ghosh

Dynamic programming is a useful technique for making a sequence of interrelated decisions. It provides a step-wise procedure for finding the optimal combination of decisions, and it offers an efficient way to find the minimum distance between two nodes in a network. A multistage decision policy with a recursive approach makes dynamic programming particularly efficient for this purpose. In a multistage decision process the problem is divided into several parts, called sub-problems; each sub-problem is solved individually, and the final result is obtained by combining the results of all the sub-problems.
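
A minimal sketch of the multistage idea (illustrative only; the toy network, node names and function are assumptions, and the recursion assumes a stage-structured, acyclic network):

```python
from functools import lru_cache

# Hypothetical directed network: node -> {neighbour: edge length}
GRAPH = {
    "A": {"B": 2, "C": 5},
    "B": {"C": 1, "D": 4},
    "C": {"D": 1},
    "D": {},
}

def min_distance(graph, source, target):
    """Dynamic-programming shortest path: the sub-problem is the minimum
    distance from each node to the target; the final answer combines the
    sub-problem results recursively."""
    @lru_cache(maxsize=None)
    def best(node):
        if node == target:
            return 0
        candidates = [w + best(nxt) for nxt, w in graph[node].items()]
        return min(candidates) if candidates else float("inf")
    return best(source)

print(min_distance(GRAPH, "A", "D"))  # -> 4 (A-B-C-D)
```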

The Influence of Pseudorandom Communication on Complexity Theory

Roohi Ali¹ and Sana Iqbal²

This paper is concerned with the visualization of 802.11 mesh networks, a technical grand challenge. Given the current status of introspective communication, security experts famously desire the simulation of linked lists. In our research, we use stochastic archetypes to confirm that Internet QoS and 802.11 mesh networks are mostly incompatible.

Cyber Crime Effecting E-Commerce Technology

N. Leena

One of the early struggles with the internet was finding a way to safely buy and sell goods or transfer funds using computer and telecommunication networks. The goal was to enable e-commerce by providing a safe, convenient and immediate payment system on the internet. But the internet is notorious for giving its users a feeling of anonymity, and inadequate security results in major damage. Nowadays a large number of critical transactions are carried out by computer systems over networks, and the internet security threat of cyber crime exposes e-commerce transactions to significant financial and information losses.

Feasibility and Importance of Wireless Intelligent Network (WIN)

Tariq Ahamad

In the modern world of technology, wireless users are much more sophisticated telecommunications users than they were a few years ago. They are no longer satisfied with just completing a clear call; today’s subscribers demand innovative ways to use the wireless phone. They want multiple services that allow them to handle or select incoming calls in a variety of ways. The Wireless Intelligent Network (WIN) was developed to bring intelligent network capabilities, such as service independence, separation of basic switching functions from service and application functions, and independence of applications from lower-level communication details, into wireless networks. WIN is the primary means of empowering providers to deliver distinctive services with enhanced flexibility.

Design and Analysis of New Software Conformance Testing: NA Mutation Testing

Naveen Tyagi¹ and Ashish Chaturvedi²

Software conformance testing is the process of evaluating the accuracy of an implementation built to the requirements of a functional specification. Exhaustive conformance testing of software is not practical because variable input values and variable sequencing of inputs result in too many possible combinations of tests. Mutation testing is a technique for unit testing software that, although powerful, is computationally expensive. Recent engineering advances have given us techniques and algorithms for significantly reducing the cost of mutation testing. This paper presents NA (Naveen-Ashish) Mutation, a system designed to approximate mutation testing.

Hybrid Multi Resolution Method of Image Fusion for Medical Image Processing

P. Selvaperumal

Image fusion is a process of combining multiple input images of the same scene into a single fused image which preserves relevant information, retains the important features from each of the original images, and makes the result more suitable for human and machine perception. The motivation for image fusion in medical image processing is that different imaging sources produce complementary information, so the sources must be fused to obtain the detail required for patient diagnosis. In this method the raw data are MR scan images of a patient’s brain observed at different angles or resolutions. The images contain both distinct and common information with respect to each other. Thus, when these images are fused, the redundant information is discarded and the complementary information is combined, producing a single image that supports an accurate diagnosis.

Enhancing Data Security by Using Crypto-Steganography in Image

Ajay Gadicha¹, V.T. Ingole² and Amit Manikrao³

In this paper we present a technique for secure data transmission that hides data in an image file by replacing the least significant bit (LSB) of image samples. The watermarked bit embedded into each image sample increases robustness against noise; hence, by combining cryptography and steganography, we increase the security of the data.
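
A minimal sketch of the combined idea (illustrative only; a simple XOR stream stands in for whichever cipher the authors use, and the byte-array "image" is an assumption):

```python
def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in cipher: XOR with a repeating key (not the paper's scheme).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed_lsb(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide each payload bit in the least significant bit of a pixel byte."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = bytearray(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit
    return stego

def extract_lsb(pixels: bytearray, length: int) -> bytes:
    bits = [pixels[i] & 1 for i in range(length * 8)]
    return bytes(
        sum(bits[j * 8 + i] << i for i in range(8)) for j in range(length)
    )

cover = bytearray(range(256)) * 4                 # toy "image" samples
secret = xor_encrypt(b"meet at noon", b"key")     # encrypt, then hide
stego = embed_lsb(cover, secret)
recovered = xor_encrypt(extract_lsb(stego, len(secret)), b"key")
print(recovered)                                  # b'meet at noon'
```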

Software Evaluation for Developing Software Reliability and Metrics

Sri Pammi Srinivas

This dissertation presents a study and implementation of different software metrics. We find that there are specific metrics for different stages of the software development cycle. The metrics covered as a literature study are metrics for software requirements and metrics for the design level; the metrics covered in the dissertation are object-oriented design metrics, metrics for the coding level, metrics for the testing level, cost estimation models and software reliability models. Reliability metrics were among the earliest early-prediction models to be used, while the late-prediction models mostly consist of software reliability growth models. Jelinski and Moranda developed one of the earliest reliability models. Musa’s Basic Execution Time Model postulated that software reliability theory should be based on execution time, the actual processor time utilized in executing the program, rather than on calendar time.

A Comprehensive Study to Reduce Traffic Accidents Using Fuzzy Logic Approach

Somnath Ghosh

In recent years, many people have died or been injured in traffic accidents all over the world. When the statistics are examined, India is the most dangerous country among Asian countries in terms of the number of traffic accidents, and the true rate is likely higher since many accidents are not recorded, for example single-vehicle accidents or accidents without injury or fatality. In this study, using the fuzzy logic method, which has a growing range of applications in Intelligent Transportation Systems, a model is developed to maintain the vehicle pursuit (following) distance automatically. Using the velocity of the vehicle and the pursuit distance, both of which can be measured with an on-vehicle sensor, a model has been established to apply the brakes.
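
A minimal sketch of a fuzzy braking rule base of this kind (purely illustrative; the membership functions, universes and rule outputs are assumptions, not the paper's calibration):

```python
def tri(x, a, b, c):
    """Triangular membership function on points a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def brake_level(speed_kmh, gap_m):
    # Fuzzify the two inputs measured by the on-vehicle sensor.
    fast = tri(speed_kmh, 60, 120, 180)
    slow = tri(speed_kmh, -1, 0, 60)
    close = tri(gap_m, -1, 0, 30)
    far = tri(gap_m, 20, 60, 200)

    # Rules: IF fast AND close THEN brake hard; IF slow AND far THEN coast; ...
    rules = [
        (min(fast, close), 1.0),   # hard braking
        (min(slow, close), 0.6),   # moderate braking
        (min(fast, far), 0.3),     # light braking
        (min(slow, far), 0.0),     # no braking
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0  # weighted-average defuzzification

print(round(brake_level(speed_kmh=100, gap_m=10), 2))  # close gap at speed -> strong braking
```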

Consumer Awareness to Protect Internet Users – A Scenario

P. Srinivas

This paper focuses on the “encouragement” approach taken in the communications field. For QoS initiatives to be meaningful to consumers, the information must keep pace with changing technological and market developments, and communication providers are best placed to provide the relevant QoS indicators in a timely fashion. However, the ability to provide meaningful data does not necessarily mean that communication providers will be willing to provide it without regulatory intervention. As competition increases, there are clear incentives for providers with a high quality of service to produce and promote timely and accessible QoS information for consumers; conversely, there is no economic incentive for providers offering a low quality of service to do the same. These trends are likely to affect both the type and quality of the services offered in the market. For example, for some routine Internet activities such as web browsing and email, all that is normally required is sufficient bandwidth, and the IP protocol should not as a rule experience any difficulty with delay, jitter and similar impairments. However, as consumers demand more interactive functions such as conversations and video-conferencing, a low level of end-to-end delay and jitter, low packet loss, and a guaranteed bandwidth are all needed to ensure standards are maintained.

Speech Enhancement Using Neural Network

Syed Minhaj Ali¹ and Bhavna Gupta²

This paper describes a neural network speech enhancement system using a Multilayer Perceptron (MLP) network trained with the back-propagation algorithm (BPA). Speech enhancement generally refers to mapping noisy speech into cleaner speech. Noisy speech signals are obtained by adding random noise to the clean signals, and speech enhancement is then performed on the noisy signals using the ADALINE. Here we show that neural nets can be used to significantly boost recognition accuracy without retraining the speech recognizer.
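
A minimal sketch of the ADALINE (LMS) idea on a synthetic signal (illustrative only; the frame length, learning rate and noise model are assumptions, and a pure tone stands in for speech):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(8000) / 8000.0
clean = np.sin(2 * np.pi * 440 * t)                 # stand-in "speech"
noisy = clean + 0.3 * rng.standard_normal(t.size)   # add random noise

taps, mu = 16, 0.01
w = np.zeros(taps)
enhanced = np.zeros_like(noisy)
for n in range(taps, len(noisy)):
    x = noisy[n - taps:n]          # input frame of past noisy samples
    y = w @ x                      # ADALINE output (single linear unit)
    e = clean[n] - y               # error against the clean target
    w += mu * e * x                # LMS weight update
    enhanced[n] = y

def snr(ref, sig):
    noise = ref - sig
    return 10 * np.log10(np.sum(ref ** 2) / np.sum(noise ** 2))

print(f"SNR noisy:    {snr(clean, noisy):5.2f} dB")
print(f"SNR enhanced: {snr(clean, enhanced):5.2f} dB")
```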

A Compression Method for PML Document Based on Internet of Things

Jamal Mohammad Aqib

In this paper we present a compression method for PML (Physical Markup Language) documents based on the Huffman algorithm; the algorithm and Java code fragments are presented. Test results show that the method achieves a good compression ratio for PML documents. The Physical Markup Language (PML) is designed to be a general, standard means for describing the physical world, including humans and machines, and storing PML documents efficiently on internet servers will become more and more important.
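
The Huffman step, sketched in Python rather than the paper's Java (the sample PML snippet is invented for illustration):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Return {symbol: bitstring}; frequent symbols get shorter codewords."""
    heap = [[freq, [sym, ""]] for sym, freq in Counter(text).items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}

pml = "<pml><tag id='1'>reader</tag><tag id='2'>reader</tag></pml>"  # toy PML
codes = huffman_codes(pml)
encoded = "".join(codes[ch] for ch in pml)
ratio = len(encoded) / (8 * len(pml))
print(f"{len(pml) * 8} bits -> {len(encoded)} bits (ratio {ratio:.2f})")
```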

Modeling for Prediction of Characteristic Deflection of Flexible Pavements- Comparison of Models Based on Artificial Neural Network and Multivariate Regression Analysis

Suneet Kaur¹, V.S. Ubboveja² and Alka Agarwal³

Pavement surface deflection of a highway is a primary factor for evaluating the pavement strength of a flexible pavement. The Benkelman Beam Deflection (BBD) technique is widely used in the country for evaluating the structural capacity of an existing flexible pavement as well as for the estimation and design of overlays for strengthening a weak pavement. The field test for measuring the surface deflection is expensive and time consuming, and alternate modeling methods to estimate the surface deflection of a pavement would therefore result in substantial savings in time and money in the preparation of detailed project reports for the large highway rehabilitation and strengthening projects being undertaken in the country. An attempt has been made in this paper to compare the results obtained from models based on multivariate regression analysis and an artificial neural network for predicting reasonably accurate characteristic deflection of flexible pavements. Data used for building the models were collected from field tests conducted by various agencies in the state of Madhya Pradesh engaged in the rehabilitation and strengthening of highways passing through extensive black cotton soil areas.
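
A minimal sketch of the regression half of the comparison (synthetic numbers; the predictor variables, coefficients and noise level are assumptions, not the Madhya Pradesh field data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Hypothetical predictors: pavement thickness (cm), subgrade CBR (%), traffic (msa)
X = np.column_stack([
    rng.uniform(20, 60, n),
    rng.uniform(2, 12, n),
    rng.uniform(1, 30, n),
])
# Synthetic "characteristic deflection" (mm) standing in for BBD measurements
y = 2.5 - 0.02 * X[:, 0] - 0.06 * X[:, 1] + 0.015 * X[:, 2] + rng.normal(0, 0.05, n)

# Multivariate linear regression by least squares: y ~ b0 + b1*x1 + b2*x2 + b3*x3
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 3), " R^2:", round(r2, 3))
# An ANN model (e.g. a small MLP) would be fitted to the same X, y for comparison.
```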

Secure Watermarking Technique for Emerging Clouds

R. Vinoth and S. Sivasankar

Concerns about trust and security have prevented businesses from fully accepting cloud platforms. To protect clouds, providers must first secure virtualized datacenter resources, uphold user privacy, and preserve data integrity. We suggest using a trust-overlay network over multiple data centers to implement a reputation system for establishing trust between service providers and data owners. Data coloring and software watermarking techniques protect shared data objects and massively distributed software modules. These techniques safeguard multi-way authentication, enable single sign-on in the cloud, and tighten access control for sensitive data in both public and private clouds.

A Concept of File Deletion and Restoration as A Threat to Commit Cyber Crime

Anisha Kumar

Information Technology solutions have paved the way to a new world of internet, business networking and e-banking, emerging as a means to reduce costs and make sophisticated economic transactions easier, faster, more efficient and less time consuming. The internet has emerged as a blessing for the present pace of life, but at the same time it has created various threats to consumers and institutions in the form of cyber crime. Despite the increase in government compliance requirements and the proliferation of security tools, companies continue to underestimate the threat from phishing, data loss, and other cyber vulnerabilities. This paper contributes an understanding of the effects of negative use of information technology, showing how a simple technological aspect such as file deletion can result in cyber crime. A few approaches for tracing deleted information using restoration software are also described.

Wireless Sensor Network for Monitoring A Patient Having Hole in the Heart Continuously Using Zigbee

P. Ramanathan¹ and Pradip Manjrekar²

In this research paper we present a wireless sensor network (WSN) for continuously monitoring a patient with a hole in the heart using Zigbee. The output of the biosensor is transmitted via Zigbee to a remote wireless monitor for observing the condition of the patient, so that appropriate medical treatment can be given without surgery, using alternative medicine to cure the hole in the heart. The remote wireless monitor consists of a Zigbee module and a personal computer (PC); the measured signal is sent to the PC, which handles data collection. When a measured signal exceeds the standard value, the PC sends a Global System for Mobile communication (GSM) short message to the caretaker. Although Bluetooth offers a higher transmission rate than Zigbee, Zigbee has lower power consumption and is therefore generally used for a 24-hour monitoring and communication system. In the first stage of the system, biosensors measure heart rate and blood pressure from the human body; the measured signal is sent via Zigbee to the PC through an RS-232 serial port interface, and can be forwarded to a remote PC or PDA over the internet. In particular, when measured signals exceed the standard value, the personal computer sends a GSM short message to the absent caretaker’s mobile phone.

A Comprehensive Study of Technologies and Trends Used in Latest Mobile Computing Environment

Vaibhav Shukla and Som Nath Ghosh

The present study discusses the basic components of mobile devices that are used in the development of mobile phones. The author focuses on client-server architecture and mobile computing, and on the operating systems most frequently used in mobile devices, Windows CE and Symbian OS.

A Hybrid Multi-Agent Routing Algorithm Based on Ant Colony Optimization for MANET

V. K. Saraswat and Amit Singhal

Many routing algorithms have been presented for Mobile Ad hoc Networks. Destination-Sequenced Distance Vector (DSDV) is a proactive, routing-table-based algorithm; because it continuously updates routing tables, it consumes a large portion of network capacity. Ad hoc On-demand Distance Vector (AODV) is a reactive algorithm; when a path breaks, it needs a new route discovery, which causes route discovery latency. Neither is therefore well suited to real-time communication. Inspired by Swarm Intelligence (SI), we borrow ideas from ant colonies and the Ant Colony Optimization framework and propose an enhanced hybrid routing algorithm that improves performance in MANETs. Its hybrid nature makes it suitable for a wide range of environments in comparison with purely reactive and proactive protocols. The introduced routing algorithm is highly adaptive, efficient and scalable. Its main aim is to reduce end-to-end delay with respect to pause time and mobility on packets received. We refer to this algorithm as the “Ant Colony based Multi Agent Routing Algorithm” (ACMRA).

Framework for Threshold Based Centralized Load Balancing Policy for Heterogeneous Systems

Archana B. Saxena and Deepti Sharma

We propose a threshold-based centralized load balancing policy for heterogeneous systems, in which all incoming jobs are acknowledged by a central server and completed with the help of workstations. All activities within the system (load distribution and load sharing) are regulated by the central server, which maintains a capability matrix and a load matrix. We present details of how threshold-based load balancing is implemented in the proposed system. Each node is assigned a relative load threshold on the basis of its operational capability, and overloaded (load > threshold) and underloaded (load < threshold) nodes route load-balancing requests through the central server. We intend to carry out a simulation study comparing the proposed scheme with a conventional load balancing scheme to show that it increases system throughput and reduces execution time. The computer simulation is based on time-, sender- and receiver-initiated load balancing schemes and is tested for a number of decision thresholds, and we expect the results to confirm these improvements.
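
A minimal sketch of the central-server dispatch rule described above (the capability numbers, threshold formula and job cost model are assumptions):

```python
class CentralServer:
    """Keeps a capability matrix and a load matrix; every job is acknowledged
    centrally and routed to a workstation below its relative threshold."""

    def __init__(self, capabilities):
        self.capability = dict(capabilities)            # node -> relative power
        self.threshold = {n: 10 * c for n, c in self.capability.items()}
        self.load = {n: 0 for n in self.capability}     # current assigned work

    def submit(self, job_cost):
        # Prefer under-loaded nodes (load < threshold); among them pick the
        # node with the most spare capacity relative to its capability.
        under = [n for n in self.load
                 if self.load[n] + job_cost <= self.threshold[n]]
        pool = under or list(self.load)                  # fall back if all overloaded
        target = min(pool, key=lambda n: self.load[n] / self.capability[n])
        self.load[target] += job_cost
        return target

    def finish(self, node, job_cost):
        self.load[node] -= job_cost

server = CentralServer({"ws1": 1, "ws2": 2, "ws3": 4})   # heterogeneous workstations
for cost in [3, 5, 2, 7, 4]:
    print(cost, "->", server.submit(cost))
```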

Amalgam Version of Itinerant Augmented Reality

Akhil Khare, Shashank Sharma and Sonali Goyal

Augmented reality is a powerful user interface technology that augments the user’s environment with computer-generated entities. In this project we investigate building indoor location-based applications for a mobile augmented reality system. Augmented reality is a natural interface for visualizing spatial information such as the position or direction of locations and objects for location-based applications that process and present information based on the user’s position in the real world. To enable such applications we construct an indoor tracking system that covers a substantial part of a building; it is based on visual tracking of fiducial markers enhanced with an inertial sensor for fast rotational updates. Tracking is especially problematic for mobile augmented reality systems, which ideally require extremely precise position tracking for the user’s head but which may not always be able to achieve the necessary level of accuracy. While it is possible to ignore variable positional accuracy in an augmented reality user interface, this can make for a confusing system; for example, when accuracy is low, virtual objects that are nominally registered with real ones may be too far off to be of use. Our system uses inferencing and path planning to guide users toward targets that they choose.

Energy Efficient Cluster Based Partial Multihop Algorithm for Heterogeneous (CPMH) WSN

M. Vasim Babu

A Wireless Sensor Network (WSN) consists of spatially distributed autonomous sensors that cooperatively monitor physical or environmental conditions such as temperature, sound, vibration, pressure, motion or pollutants. In recent years there has been growing interest in wireless sensor networks, and one of the major issues is developing an energy-efficient routing protocol. The challenge lies in efficiently providing acceptable accuracy while conforming to the many constraints of WSNs; limited energy is one of the main bottlenecks. Because of poor cluster-head election and the intensive energy consumption of cluster heads in existing clustering algorithms, I propose a cluster-based partial multihop algorithm for heterogeneous WSNs, which aims to optimize cluster-head election and balance the energy consumption of cluster heads. I also propose a hierarchical tree routing method that reduces the distance from cluster head to the base station.

Securing the Information Network

Rashmi Agrawal

In this paper we present a technique for secure data transmission that hides data in an image file by replacing the least significant bit (LSB) of image samples. The watermarked bit embedded into each image sample increases robustness against noise; hence, by combining cryptography and steganography, we increase the security of the data.

Construction of High Performance Stream Ciphers for Text Web Browsers Using Adders

Raj Kumar and V.K. Saraswat

There are several security measures to protect sensitive, confidential and secret data over the internet; one of the basic mechanisms is to apply encryption. The basic problems with encryption have been speed, the reliability of the mechanism and compatibility with different web browsers. Several encryption algorithms exist, such as RSA, IDEA, TEA and DES, developed by different people and written in many forms, i.e. in different programming languages. Here we propose a model of a stream cipher based on encryption techniques combined with digital adder operations. The text is translated into hex code using a secret key (a password) shared only by the sender and receivers, so only authorized personnel can access the data sent. Our model uses an n-bit serial adder to encrypt data with high speed and reliability, and the encrypted data are converted into hexadecimal to enhance the compatibility of the proposed mechanism with different machines and platforms. We have performed an analytical study to determine the best-performing algorithms for text web browsers.
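
A minimal sketch of the adder-plus-hex idea (illustrative only; the SHA-256 counter-mode keystream below is a placeholder for the shared secret expansion, not the authors' construction):

```python
import hashlib

def keystream(password: str, length: int) -> bytes:
    """Derive a byte stream from the shared secret (placeholder derivation)."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(f"{password}:{counter}".encode()).digest()
        counter += 1
    return out[:length]

def encrypt_to_hex(plaintext: str, password: str) -> str:
    data = plaintext.encode()
    ks = keystream(password, len(data))
    # "Serial adder" step: add a keystream byte to each plaintext byte mod 256.
    cipher = bytes((p + k) % 256 for p, k in zip(data, ks))
    return cipher.hex()                      # hex output for portability

def decrypt_from_hex(hex_cipher: str, password: str) -> str:
    cipher = bytes.fromhex(hex_cipher)
    ks = keystream(password, len(cipher))
    return bytes((c - k) % 256 for c, k in zip(cipher, ks)).decode()

token = encrypt_to_hex("transfer 500", "shared-password")
print(token)
print(decrypt_from_hex(token, "shared-password"))   # -> transfer 500
```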

Implementing Power Saving Strategies in Wireless Sensor Networks Using IEEE 802.16e

Pranali D. Tembhurne¹, R.K. Krishna² and Ashish Jaiswal³

IEEE 802.16e is the latest broadband wireless access standard designed to support mobility; its MAC and PHY layers form an emerging standard for mobile wireless broadband access systems. In any mobile network, power saving is one of the most important features for extending device lifetime. To enhance the power efficiency of battery-powered broadband wireless sensor networks, power saving strategies based on cross-layer design are proposed for IEEE 802.16e sleep-mode operation. In this paper we first implement the MAC layer, with its features, together with the PHY layer. Second, we use modulation techniques (BPSK, QPSK and 16-QAM) with an OFDM physical layer to reduce packet loss, and we analyze throughput and delay over time.

Advanced Algorithmic Approach for IP Lookup (IPv6)

Pankaj Gupta, Deepak Jain, Nikhil Anthony, Pranav Gupta, Harsh Bhojwani and Uma Nagaraj

Internet address lookup is a challenging problem because of increasing routing table sizes, increased traffic, higher speed links, and the migration to 128-bit IPv6 addresses. IP routing lookup requires computing the best matching prefix, for which standard solutions like hashing were believed to be inapplicable. The best existing solution we know of, BSD radix tries, scales badly as IP moves to 128-bit addresses. This paper presents a novel algorithm, “distributed memory organization”, for lookup of 128-bit IPv6 addresses, together with “asymmetric linear search” on hash tables organized by prefix lengths. Our scheme scales very well when traffic on routers is unevenly distributed and in general requires only 3-4 lookups, independent of the address bits and table size. Thus it scales very well for IPv4 and IPv6 under such network conditions. Using the proposed techniques a router can achieve a much higher packet forwarding rate and throughput.
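
A minimal sketch of longest-prefix matching over per-length hash tables, searching lengths in descending order (a plain linear search rather than the paper's asymmetric variant; the toy routing table is an assumption):

```python
import ipaddress

# One hash table per prefix length, as in the scheme described above.
ROUTES = {
    ipaddress.ip_network("2001:db8::/32"): "if-0",
    ipaddress.ip_network("2001:db8:aa00::/40"): "if-1",
    ipaddress.ip_network("::/0"): "default",
}
tables = {}
for net, nexthop in ROUTES.items():
    tables.setdefault(net.prefixlen, {})[int(net.network_address)] = nexthop

def lookup(address: str) -> str:
    addr = int(ipaddress.ip_address(address))
    for length in sorted(tables, reverse=True):           # longest prefix first
        key = (addr >> (128 - length)) << (128 - length) if length else 0
        hit = tables[length].get(key)
        if hit:
            return hit
    return "drop"

print(lookup("2001:db8:aaff::1"))   # -> if-1
print(lookup("2001:db8:1::1"))      # -> if-0
print(lookup("2001:dead::1"))       # -> default
```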

Basic Graphs Terminologies, Their Representation and Applications

Vaibhav Shukla and Somnath Ghosh

In the present study, general graph terminologies are explained along with their representation and applications in fields such as operational research, genetics, physics and chemistry. In graph theory we can represent the relationships between various objects in a very simple and convenient way, in the form of a picture.

CIODD: Cluster Identification and Outlier Detection in Distributed Data

Eena Gilhotra¹ and Saroj Hiranwal²

Clustering has become an increasingly important task in modern application domains such as marketing and purchasing assistance, multimedia, molecular biology etc. The goal of clustering is to decompose or partition a data set into groups such that both the intra-group similarity and the intergroup dissimilarity are maximized. In many applications, the size of the data that needs to be clustered is much more than what can be processed at a single site. Further, the data to be clustered could be inherently distributed. The increasing demand to scale up to these massive data sets which are inherently distributed over networks with limited bandwidth and computational resources has led to methods for parallel and distributed data clustering. In this thesis, we present CIODD, a cohesive framework for cluster identification and outlier detection for distributed data. The core idea is to generate independent local models and combine the local models at a central server to obtain global clusters. A feedback loop is then provided from the central site to the local sites to complete and refine the global clusters obtained. Our experimental results show the efficiency and accuracy of the CIODD approach. 

Different Methods for Statistical Face Representation – Review

T. Syed Akheel¹, S.A.K. Jilani² and K. Kanthamma²

Face is an important biometric feature for personal identification. Human beings easily detect and identify faces in a scene, but it is very challenging for an automated system to achieve such objectives; hence there is a need for reliable identification methods for interacting users. In this paper we discuss past research on biometric face feature extraction and recognition for static images, and present an implementation outline of these methods along with comparative measures and result analysis.

Detecting Faces According to their Pose

Mohammad Mojtaba Keikhayfarzaneh¹,²*, Javad Khalatbari¹, B. Somayeh Mousavi³ and Saeedeh Motamed¹,⁴

Human face detection is one of the vital techniques in computer vision. It plays an important role in a wide variety of applications such as surveillance systems, video tracking applications and image database management. In this paper a new method to detect faces with different poses in colour images is proposed. Skin colour, lip position, face shape information and statistical texture properties are the key parameters for developing fuzzy rule-based classifiers to extract face candidates from an image. The algorithm consists of two main parts: a frontal face detection system and a profile face detection system. In the first step, skin regions are identified in HSI colour space using a fuzzy system that takes the distance of each pixel colour to the skin colour cluster as input and produces a skin-likelihood image as output. The regions most likely to belong to skin are labelled and passed to the frontal face detection part. To extract frontal face regions, a fuzzy rule-based system is used, applying face and lip position, lip area data and face shape. The detected faces are removed and the remaining areas are tested by the profile face finding algorithm, which uses statistical texture properties of the ear to verify profile face detection. With this system, detection rates of 98%, 90% and 83.33% are achieved for frontal, near-frontal and profile faces respectively.

User Interface Modeling

Ravinder G. Reddy

This paper focuses on present user interface modeling techniques for capturing the tasks the user needs to perform in order to reach his goals. Newly conceived HCI modeling languages need to foresee their role as members of the family of languages that constitute the UML representations for software design, due to their wide acceptance by both researchers and practitioners. This paper extends one such modeling language (MoLIC) and presents an overview and comparison of trends in user interface modeling in UMLi.

Security Management System for Oilfield Based on Embedded Wireless Communications: A Design Approach

Syed Minhaj Ali¹, Satish V Reve¹, Roohi Ali²* and Sana Iqbal³

 

This paper is concerned with a security management system for oilfields based on embedded wireless communication. Oil-well sites are dispersed over a wide area, oil exhausters work continuously for 24 hours, and the region to be patrolled is very vast; meanwhile, theft of petroleum, transmission lines and transformers is a persistent problem. Constrained by this geographical environment, safe management of an oilfield is very difficult to implement. Laying fibre cable between working stations spread over tens of square kilometres would not only be expensive but would also require considerable human and material resources for line maintenance and protection against theft. Establishing a remote wireless monitoring and control system is therefore an ideal scheme for overcoming these difficulties.

Analysis of Routing Attacks in Peer to Peer Overlay Networks

Anil Saroliya¹ and Vishal Shrivastava²

Peer-to-peer (P2P) systems are distributed systems in which nodes act as peers; such systems are becoming very popular in applications like file sharing. In this kind of architecture, security in each transaction is a fundamental requirement. Distributed hash tables (DHTs) provide the method for locating resources (generally files) within a P2P network. In this paper our target is to analyze routing attacks on the existing protocols of such networks. Chord is chosen as the target DHT protocol for various reasons that are discussed in the paper. The routing attack analysis identifies the vulnerabilities of existing protocols and suggests a defense mechanism, which is discussed herewith.

Application of Cloud Computing for the Better E-Governance in Developing Countries

Ashish Rastogi

Cloud computing is the latest information technology revolution and helps developing countries implement their e-governance services at very low cost and provide better services to their citizens. These changes naturally should be reflected in the way government functions, in terms of the organization of the government, its relationship with its citizens, institutions and businesses, and its cooperation with other governments. The critical problem (Rastogi 2010) for developing countries is the infrastructure necessary to implement e-services. In this paper we investigate how cloud computing architecture helps developing countries overcome these problems, which will ultimately lead to development and overall economic progress.

Establishment of Evaluation Scheme of Model Driven Architecture

R. Vinodani Katiyar¹ and Rohit Chandra²

Model Driven Architecture serves those who are mainly interested in building platform-independent software architecture. At the same time it is interesting to evaluate its performance on an overall basis; however, no scheme or metrics are currently available to do so. This paper is an attempt to propose an initial evaluation scheme that can serve this purpose.

Detecting Terror- Related Activities on the Web Using Neural Network

Deepak Tinguriya and Binod Kumar

Terrorist Detection System (TDS) is aimed at detecting suspicious users on the Internet from the content of the information they access. TDS consists of two main modules: a training module activated in batch mode, and an on-line detection module. The training module is provided with web pages that include terror-related content and learns the typical interests of terrorists by applying data mining algorithms to the training data. The detection module performs real-time monitoring of users’ traffic and analyzes the content of the pages they access; an alarm is issued upon detection of a user whose accessed-page content is too similar to typical terrorist content. TDS feasibility was tested in a network environment, where its detection rate was better than that of a state-of-the-art intrusion detection system based on anomaly detection. In this paper we present a neural self-organizing map algorithm for TDS, in which the detection algorithm is enhanced to improve detection and reduce false alarms.

Boundary Value Analysis for Non-Numerical Variables: Strings

Anupriya Jain, Sachin Sharma, Seema Sharma and Deepti Juneja

The purpose of boundary value analysis is to concentrate testing effort on error-prone areas by accurately pinpointing the boundaries of conditions. Boundary value analysis produces test inputs near each sub-domain’s boundaries to find failures caused by incorrect implementation of the boundaries. A major limitation of boundary value analysis is that it does not directly handle non-numerical variables. This paper focuses on an approach for applying boundary value analysis to string values.
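
A minimal sketch of how string boundaries can be generated (the length limits, fill character and toy system under test are assumed for illustration):

```python
def string_boundary_values(min_len, max_len, fill="a"):
    """Boundary-value test inputs for a string field constrained by length:
    values at, just below and just above each length boundary."""
    lengths = {
        max(min_len - 1, 0), min_len, min_len + 1,
        max_len - 1, max_len, max_len + 1,
        0,                     # the empty string is always worth a case
    }
    return sorted(fill * n for n in lengths if n >= 0)

def accepts_username(name):    # toy system under test: 3..8 characters allowed
    return 3 <= len(name) <= 8

for candidate in string_boundary_values(min_len=3, max_len=8):
    print(f"{candidate!r:12} -> {accepts_username(candidate)}")
```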

Rule Based Approach for English to Sanskrit Machine Translation and Synthesizer System

D.T. Mane¹, P.R. Devale¹ and S.D. Suryawanshi²

The area of Artificial Intelligence is very useful in providing people with machines that understand the diverse languages spoken by the common man, presenting the user with an interface with which he feels more comfortable. Since there are many different languages spoken in this world, we are constantly in need of translators to enable people speaking different languages to share ideas and communicate with one another. English is the global language and most information is available in English, while India is a country with several regional languages. Sanskrit is the mother of all native languages of India, and a great store of knowledge in subjects like medicine, mathematics, geography, geology, astronomy and philosophy has been kept alive and fresh in Sanskrit lore for thousands of years. An English-to-Sanskrit translator and synthesizer is therefore very useful to people in India: a sentence in English is translated into Sanskrit using a rule-based approach, and from Sanskrit it is easier to transform it into the native languages.

Multiple Classifiers System for Medical Diagnosis

M. Solomon Pushparaj¹ and P.J. Kulkarni²

Data mining helps in decision making. Owing to the peculiar features of the medical profession, physicians need a helping tool to take efficient and intelligent decisions. Good performance, the ability to deal appropriately with missing and noisy data (errors in data), the transparency of diagnostic knowledge, the ability to explain decisions, and the ability of the algorithm to reduce the number of tests necessary to obtain a reliable diagnosis are the features desired from a machine learning classifier for the medical diagnostic task. Every machine learning method has its own characteristics and no single method provides all the desired features, so we address this problem by using multiple machine learning methods. In this paper we develop a multiple classifiers system that helps the physician during the decision-making process; the backpropagation algorithm (ANN), the k-NN algorithm (CBR) and a modified twoing splitting rule algorithm (CT) are used in this system. We tested the system with three different disease datasets: diabetes, heart disease and breast cancer. It showed better results in reliability and performance, the two most desired features in the medical diagnostic task.

A Novel Approach to Construct Decision Tree Using Quick C4.5 Algorithm

Deepti Juneja, Sachin Sharma, Anupriya Jain and Seema Sharma

With the rapid growth in the size and number of available databases in commercial, industrial, administrative and other applications, it is necessary and interesting to examine how to extract knowledge from huge amounts of data. Several mining algorithms are available to solve diverse data mining problems, and one of the knowledge discovery in databases operations is the problem of inducing decision trees. C4.5 is one of the most important algorithms in decision tree induction. In this paper the limitations of the existing C4.5 algorithm are discussed and an enhancement technique for improving its efficiency is proposed.
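
The split criterion C4.5 relies on, sketched for a single categorical attribute (the toy dataset is invented; a full tree builder would recurse on the winning attribute):

```python
from collections import Counter
from math import log2

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

def gain_ratio(rows, attr, label="play"):
    """C4.5 criterion: information gain divided by split information."""
    base = entropy([r[label] for r in rows])
    groups = {}
    for r in rows:
        groups.setdefault(r[attr], []).append(r[label])
    gain, split_info = base, 0.0
    for subset in groups.values():
        p = len(subset) / len(rows)
        gain -= p * entropy(subset)
        split_info -= p * log2(p)
    return gain / split_info if split_info else 0.0

data = [                              # invented toy records
    {"outlook": "sunny", "windy": "no", "play": "no"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "overcast", "windy": "no", "play": "yes"},
    {"outlook": "rain", "windy": "no", "play": "yes"},
    {"outlook": "rain", "windy": "yes", "play": "no"},
]
for attr in ("outlook", "windy"):
    print(attr, round(gain_ratio(data, attr), 3))
```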

Key Management With Cryptography

Syed Minhaj Ali¹, Satish V Reve¹, Roohi Ali²* and Sana Iqbal³

In this paper, we present an idea for adopting certificateless public key encryption (CL-PKE) schemes over mobile ad hoc networks (MANETs), which has not been explored before. In the current literature there are essentially two main approaches, namely public key cryptography and identity-based (ID-based) cryptography, and both have inherent drawbacks. In a public key cryptography system, a certificate authority (CA) is required to issue certificates binding users’ public keys to their private keys to ensure their authenticity, while in an ID-based cryptography system users’ private keys are generated by a key generation center (KGC), which means the KGC knows every user’s key (the key escrow problem). To avoid these obstacles, Al-Riyami and Paterson proposed certificateless cryptography, in which public keys do not need to be certified and the KGC does not know users’ keys; certificateless cryptography essentially lies between public key cryptography and ID-based cryptography. In this work, we adopt this system’s advantages over MANET. To implement CL-PKE over MANET and make it practical, we incorporate the idea of Shamir’s secret sharing scheme: the master secret keys are shared among some or all of the MANET nodes, which makes the system self-organized once the network has been initiated. To provide more flexibility, we consider both a full distribution system and a partial distribution system. Furthermore, we carry out two simulations to support our schemes: we first simulate our scheme to measure encryption, decryption and key distribution efficiency, and then simulate it with AODV to test network efficiency. The simulations are performed in OPNET.

Secure and Auditable Agent-Based Communication Protocol for E-Health System Framework

M. Aramudhan¹ and K. Mohan²

Security is essential for an e-health system, as highly sensitive medical data are distributed and exchanged among healthcare professionals, customers and providers over the Internet. The Internet is an open access system that allows anyone to participate and access data; hence it is necessary to protect the data and services from unauthorized visibility and use while maintaining a high degree of accessibility. This is achieved using suitable access control policies and techniques that enforce differentiated levels of service visibility and access for users. This paper introduces a Secure and Auditable Agent-based Communication Protocol (SAACP), which builds on a key exchange mechanism with mobile agents to reduce communication delay. Intelligent mobile agents are proposed for dynamically negotiating user policies. The protocol offers user-friendly, private and safe communication through a robust security mechanism that gives users and healthcare professionals confidence in accessing the e-health system.

IRIS Template Classification Using Selective Sub Bands of Wavelets

Ch.V. Narayana¹, P.S.R. Chandra Murty² and E. Sreenivasa Reddy³

Using multiple biometric templates per user account in biometric authentication systems to achieve a high acceptance rate leads to large storage space and computational overheads. Classifying these templates into significant sub-groups reduces these overheads. Iris templates carry very distinctive texture information such as brightness, shape, size, uniformity, directionality and regularity. Iris texture classification based on wavelet pattern analysis is one of the most effective existing methods; however, using all frequency sub-bands of the decomposition for classification may increase the space and time complexity of classification algorithms. In this paper only sub-bands with high energy and entropy are considered for classification, to reduce the space and time overheads. Fractal dimensions are used to select significant sub-bands of the decomposition at each level, and statistical features of these significant sub-bands are then used for classification. This paper describes iris texture classification using selective wavelet sub-bands based on fractal dimensions, and its results are compared with other classification methods using conventional features.
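
A minimal sketch of selecting high-energy, high-entropy sub-bands from a wavelet decomposition (PyWavelets is assumed; the median thresholds and the random stand-in for a normalized iris strip are illustrative, and the paper's fractal-dimension step is not reproduced):

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
iris = rng.random((64, 256))                 # stand-in for a normalized iris strip

def band_stats(band):
    energy = float(np.sum(band ** 2))
    hist, _ = np.histogram(band, bins=32, density=True)
    p = hist[hist > 0]
    p = p / p.sum()
    entropy = float(-np.sum(p * np.log2(p)))
    return energy, entropy

# Two-level 2D decomposition: [cA2, (cH2, cV2, cD2), (cH1, cV1, cD1)]
coeffs = pywt.wavedec2(iris, "db2", level=2)
bands = {"cA2": coeffs[0]}
for i, (ch, cv, cd) in enumerate(coeffs[1:], start=1):
    level = len(coeffs) - i                  # coeffs[1] holds the coarsest details
    bands.update({f"cH{level}": ch, f"cV{level}": cv, f"cD{level}": cd})

stats = {name: band_stats(b) for name, b in bands.items()}
e_med = np.median([s[0] for s in stats.values()])
h_med = np.median([s[1] for s in stats.values()])
selected = [n for n, (e, h) in stats.items() if e >= e_med and h >= h_med]
print("selected sub-bands:", selected)       # only these feed the classifier
```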

A New Intelligent Predictive Caching Algorithm for Internet Web Servers

B.V. Pawar¹ and J.B. Patil²

Web caching is used to improve the performance of Internet Web servers. Document caching reduces the time it takes a Web server to respond to client requests by keeping and reusing Web objects that are likely to be used in the near future in the main memory of the Web server, and by reducing the volume of data transferred between the Web server and secondary storage. The heart of a caching system is its page replacement policy, which needs to make good replacement decisions when its cache is full and a new document needs to be stored. The latest and most popular replacement policies, such as GDSF and GDSF#, use file size, access frequency, and age in the decision process. The effectiveness of any replacement policy can be evaluated using two metrics: hit ratio (HR) and byte hit ratio (BHR), and there is always a trade-off between HR and BHR [1]. In this paper, using three different Web server logs, we use trace-driven analysis to evaluate the effects of different replacement policies on the performance of a Web server. We propose IPGDSF#, a modification of the GDSF# policy. Our simulation results show that the proposed replacement policy IPGDSF# performs better than several policies proposed in the literature in terms of hit rate as well as byte hit rate.
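
A minimal sketch of the GDSF-style priority key that IPGDSF# builds on (the cost model and aging term follow the standard Greedy-Dual-Size-Frequency formulation; the prediction component of IPGDSF# itself is not reproduced here):

```python
class GDSFCache:
    """Greedy-Dual-Size-Frequency: priority = L + frequency * cost / size."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.L = 0.0                      # inflation (aging) value
        self.entries = {}                 # url -> [priority, freq, size]

    def _priority(self, freq, size, cost=1.0):
        return self.L + freq * cost / size

    def access(self, url, size):
        if url in self.entries:
            e = self.entries[url]
            e[1] += 1
            e[0] = self._priority(e[1], e[2])
            return "hit"
        # Evict lowest-priority documents until the new one fits.
        while self.used + size > self.capacity and self.entries:
            victim = min(self.entries, key=lambda u: self.entries[u][0])
            self.L = self.entries[victim][0]       # age the cache
            self.used -= self.entries[victim][2]
            del self.entries[victim]
        if size <= self.capacity:
            self.entries[url] = [self._priority(1, size), 1, size]
            self.used += size
        return "miss"

cache = GDSFCache(capacity_bytes=100)
for url, size in [("/a", 40), ("/b", 40), ("/a", 40), ("/c", 50), ("/b", 40)]:
    print(url, cache.access(url, size))
```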

Some Notable Reliability Techniques for Disk File Systems

Wasim Ahmad Bhat and S.M.K. Quadri

File system operations include data operations and metadata operations. Data operations act upon actual user data, while metadata operations modify the structure of the file system, such as creating, deleting, or renaming files and directories. During a metadata operation, the system must ensure that data are written to disk in such a way that the file system can be recovered to a consistent state after a system crash. In this paper we look at some of the most notable techniques that ensure the reliability of disk file systems against system crashes and failures.

Identifying Some Problems with Selection of Software Testing Techniques

Sheikh Umar Farooq and S.M.K. Quadri

Testing techniques refer to different methods or ways of testing particular features of a computer program, system or product. There are many different software testing techniques available today; whether we decide to automate or execute tests manually, there is a wide selection of techniques to choose from, and we have to make sure that we select the technique(s) that will ensure the most efficient and effective testing of the system. The fundamental problem in software testing thus poses an open question: which techniques should we adopt for efficient and effective testing? Selecting the right testing techniques at the right time for the right problem makes software testing efficient and effective. In this paper we discuss how testing techniques should be compared with one another and why selecting an appropriate testing technique is a problem.

Comparative Analysis on Speckle Noise Reduction Techniques on Computed Tomographic Images

Hitesh H. Vandra and H.N. Pandya

Reduction of speckle noise is one of the most important processes for increasing the quality of computed tomographic (CT) images. Speckle is a granular noise that inherently exists in and degrades the quality of CT images, so before CT images are used for diagnosis, the very first step is to reduce the effect of speckle noise. Many speckle reduction techniques have been studied by researchers; however, there is no comprehensive method that takes all the constraints into consideration. Filtering is one of the common methods used to reduce speckle noise. This paper compares different speckle reduction filters and presents a performance analysis for reducing speckle noise in computed tomographic images in terms of the assessment parameters PSNR and MSE.
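
The two assessment parameters, as they are typically computed (a short sketch; the 8-bit peak value, the synthetic slice and the two example filters are assumptions, not the paper's test set):

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter

def mse(reference, filtered):
    diff = reference.astype(np.float64) - filtered.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(reference, filtered, peak=255.0):
    err = mse(reference, filtered)
    return float("inf") if err == 0 else 10 * np.log10(peak ** 2 / err)

# Toy comparison on a synthetic slice with multiplicative (speckle-like) noise.
rng = np.random.default_rng(0)
ct = np.clip(rng.normal(120, 10, (64, 64)), 0, 255)                 # stand-in CT slice
speckled = np.clip(ct * (1 + 0.2 * rng.standard_normal(ct.shape)), 0, 255)

for name, denoised in [("median", median_filter(speckled, size=3)),
                       ("mean", uniform_filter(speckled, size=3))]:
    print(f"{name:6s} MSE={mse(ct, denoised):8.2f}  PSNR={psnr(ct, denoised):5.2f} dB")
```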

Logical Study of Passwords

Mushtaq Ahmad Rather and Tariq Ahamad Ahanger

Over the years security experts in the field of Information Technology have had a tough time making passwords secure. This paper takes a careful look at this issue from the angle of logic and cognitive science. We study the password process to rank its strengths and weaknesses in order to establish a quality metric for passwords. Finally we relate the process to the human senses, which enables us to propose a constitutional scheme for the password process. The basic proposition is to exploit the relationship between human senses and passwords to improve authentication while keeping it an enjoyable activity.
