Volume 10 | Number 2
Page numbers: 260-538
Journal allows immediate open access to content in HTML + PDF
Muhammad Arif Bin Jalil
This study presents the modelling, simulation, and characterization of a fiber Bragg grating (FBG): its maximum reflectivity, bandwidth, the effect of applied strain on the wavelength shift λB, and the sensitivity of the wavelength shift to strain for an optical sensing system. A commercial FBG with a centre wavelength of 1550 nm is used to measure the spectral response of the FBG to strain. The parameters used in the simulations are the fiber grating length L, ranging from 1 to 10 mm; the change in refractive index, Δn, from 0.0002 to 0.0020; the effective refractive index, 1.46; and the grating period of the FBG, Λ, of 530 nm. The bandwidth and spectral reflectivity are analyzed as the refractive index and grating length are varied. Simulations of the FBG are carried out using OriginPro 2016 and Microsoft Excel 2010; the Excel sheet is used to generate data and OriginPro 2016 to generate the graphs. The results obtained indicate that variation in grating length and refractive index affects the spectral reflectivity and the bandwidth. In addition, the results show that the changes in the Bragg wavelength are due to an increase in the length of the grating region caused by the applied strain.
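The quantities discussed in the abstract above can be sketched numerically. The following is a minimal illustration, assuming the standard coupled-mode-theory result R = tanh²(κL) with κ = πΔn·η/λB and the mode-overlap factor η taken as 1; the parameter values are those stated in the abstract.

```python
import math

def bragg_wavelength(n_eff, period_nm):
    """Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * period_nm

def peak_reflectivity(delta_n, length_mm, lambda_b_nm, overlap=1.0):
    """Peak reflectivity from coupled-mode theory: R = tanh^2(kappa * L),
    with kappa = pi * delta_n * overlap / lambda_B."""
    kappa_per_nm = math.pi * delta_n * overlap / lambda_b_nm
    length_nm = length_mm * 1e6  # convert grating length to nm
    return math.tanh(kappa_per_nm * length_nm) ** 2

lam_b = bragg_wavelength(1.46, 530.0)  # ~1547.6 nm, close to the 1550 nm FBG
for L in (1, 5, 10):
    print(L, "mm ->", round(peak_reflectivity(0.0002, L, lam_b), 4))
```

As expected from the abstract, reflectivity grows with both grating length and index modulation, saturating towards 1 for long, strong gratings.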
Abedalhakeem T. E. Issa
Industry developed dramatically in the second half of the 20th century, and with it manufacturing systems evolved from manual to fully computerized systems employing information and communication technology (ICT). This has made manufacturing systems totally dependent on ICT, and these systems therefore have to keep pace with advancement in ICT. Distributed computing has totally changed the computing paradigm in recent times, resulting in rapid adoption of these technologies in the manufacturing sector. An important variable in the equation determining the trend of manufacturing technologies is purchaser choice and preference, which has become active recently. To address these heterogeneous user demands, the Autonomous Decentralized System (ADS) concept was introduced five decades ago. The ADS has been a significant development incorporated in modern manufacturing systems and has been standardised as the de-facto standard for factory automation. These systems hold the promise of on-line system maintenance, timeliness and assurance, ensuring greater productivity and cost benefit, emerging as the system of choice in automated manufacturing systems. This paper reviews the ADS and its application to a manufacturing system, assesses the state of the art and discusses future trends.
Tamanna Siddiqui and Mohammad AlKadri
Data scientists need to manipulate data (retrieve, aggregate, join, and so on) when they do their tasks. It is therefore very useful to build a layer which prepares the data to be convenient for the analysis step: with approximate processing and an error tolerance defined by the user, that layer handles both inserting records or collections into the database and retrieving information from it.
In this paper we focus on the structure and the design of this layer, and dig deeper into how it translates the user's manner of inquiry into a SQL statement suitable for approximate processing.
Among all the water bodies in Jammu & Kashmir, Dal Lake has a peculiar significance due to its location in the heart of the capital city Srinagar. Historical studies over the last fifteen hundred years indicate a continuous squeezing of the Lake due to different natural and man-made interventions. Over this long period, the governance of the land has passed through various wise and ugly human plans besides some slow natural processes. The mathematical modelling of such dynamics is not an easy task because of the many intervening variables and the difficulty their measurement implies. On the other hand, during the last decades, the use of Cellular Automata (CA) techniques to simulate the behaviour of linear or non-linear systems has become of great interest, mainly because this approach depends largely on local relations and a series of rules instead of precise mathematical formulae. Infrared (IR) satellite imagery can be helpful in identifying the different areas of interest using CA as a tool of image processing. The study will not only separate the areas of interest but also pave the way towards a comprehensive study of all the identified zones using spectral signatures received from continuous IR imagery of both pre-monsoon and post-monsoon periods in future.
Samir Kumar Singha and Syed Imtiyaz Hassan*
The performance of data mining and machine learning tasks can be significantly degraded by noisy, irrelevant and high-dimensional data containing a large number of features. A large amount of real-world data contains noise or missing values, and while collecting data, many irrelevant features may be gathered into the storage repositories. These redundant and irrelevant feature values distort the classification principle, increase the calculation overhead and decrease the prediction ability of the classifier. The high dimensionality of such datasets poses a major bottleneck in the fields of data mining, statistics and machine learning. Among the several methods of dimensionality reduction, attribute or feature selection is often used. Since the k-NN algorithm is sensitive to irrelevant attributes, its performance degrades significantly when a dataset contains missing values or noisy data. This weakness of the k-NN algorithm can, however, be minimized when it is combined with feature selection techniques. In this research we combine Correlation-based Feature Selection (CFS) with the k-Nearest Neighbour (k-NN) classification algorithm to obtain better classification results when the dataset contains missing values or noisy data. The reduced attribute set also decreases the time required for classification. The research shows that when dimensionality reduction is done using CFS and classification with the k-NN algorithm, a dataset with little or no noise may show a negative impact on classification accuracy compared with the k-NN algorithm alone. When additional noise is introduced to these datasets, the performance of k-NN degrades significantly; when the noisy datasets are classified using CFS and k-NN together, the classification accuracy improves.
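The combination described above can be sketched in a few lines. Note the hedge: full CFS scores feature subsets by balancing feature-class correlation against feature-feature redundancy; this minimal sketch keeps only the relevance half, ranking features by their absolute correlation with the class, then running a plain k-NN on the reduced data. The toy dataset is hypothetical.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_features(X, y, k):
    """Keep the k features most correlated (in absolute value) with the class."""
    scores = [(abs(pearson([row[j] for row in X], y)), j) for j in range(len(X[0]))]
    return sorted(j for _, j in sorted(scores, reverse=True)[:k])

def knn_predict(X, y, query, k=3):
    """Majority vote among the k nearest training points."""
    dists = sorted((math.dist(row, query), label) for row, label in zip(X, y))
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Toy data: feature 0 tracks the class, feature 1 is pure noise.
X = [[0.1, 9.0], [0.2, 1.0], [0.9, 8.0], [1.0, 2.0]]
y = [0, 0, 1, 1]
keep = select_features(X, y, 1)
Xr = [[row[j] for j in keep] for row in X]
print(keep, knn_predict(Xr, y, [0.85], k=3))
```

Dropping the noise feature before the distance computation is exactly why the CFS + k-NN pairing helps on noisy data: irrelevant dimensions no longer dominate the Euclidean distance.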
Mudasir M Kirmani
Managing and monitoring the attendance of employees is a very important aspect of the smooth functioning of any public or private organization, and obtaining and maintaining employee attendance has become a challenging task. In order to avoid human bias and direct human intervention, different government institutions have implemented biometric attendance systems in educational institutions to record employee attendance on a daily basis. This research work aims to study the impact of the Biometric Attendance System (BAS) on the educational system vis-a-vis the punctuality of employees in an educational institute. The study indicates that biometric modalities are universally secure and accurate, but in practice the attendance-system scenarios in Jammu & Kashmir have highlighted some loopholes that exist at present in the biometric attendance system.
Baby Kahkeshan and Syed Imtiyaz Hassan
Deep learning is a branch of machine learning which has recently gained a lot of attention due to its efficiency in solving a number of AI problems. The aim of this research is to assess the accuracy enhancement obtained by using deep learning in the back propagation algorithm. For this purpose, two techniques have been used. In the first, the simple back propagation algorithm is used and the designed model is tested for accuracy. In the second, the model is first trained using deep learning via deep belief nets, to make it learn and improve its parameter values, and back propagation is then applied over it. The advantage of the softmax function is used in both methods. Both methods have been tested over images of handwritten digits and the accuracy then calculated. It has been observed that there is a significant increase in the accuracy of the model when deep learning is applied for training.
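The softmax function mentioned above, used as the output layer in both methods, turns raw class scores into a probability distribution over the digit classes. A minimal, numerically stable sketch:

```python
import math

def softmax(logits):
    """Numerically stable softmax: subtract the max logit before
    exponentiating so that math.exp never overflows."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])
```

Because the outputs sum to one, the largest logit always receives the largest probability, which is what makes softmax a natural fit for the final layer of a digit classifier trained with back propagation.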
Javaid Iqbal and Syed Abrar Ul Haq
Automation in the generation of architectural feedback from performance indices such as probability distributions, mean values and variances has been of interest to researchers over the last decade. It is well established that, due to the complexity of interpreting the performance indices obtained from performance analysis of a software architecture and the short time to market, an automated approach is vital for the acceptance of architecture-based software performance engineering by the software industry. In the last decade some work has been done in this direction. The aim of this paper is to explore the existing research in the field, which will be valuable for researchers looking forward to contributing to it.
The main challenge for effective web Information Retrieval (IR) is to infer the information need from the user's query and retrieve relevant documents. The precision of search results is low due to vague and imprecise user queries, which cannot retrieve sufficient relevant documents. Fuzzy-set-based query expansion deals with imprecise and vague queries to infer the user's information need, while trust-based web page recommendation retrieves search results according to that need. In this paper an algorithm is designed for intelligent information retrieval using a hybrid of fuzzy sets and trust in web query session mining: fuzzy query expansion is performed to infer the user's information need, and trust is used to recommend web pages accordingly. Experiments were performed on a data set collected in the domains of Academics, Entertainment and Sports, and the search results confirm an improvement in precision.
Parveen Sadotra and Chandrakant Sharma
The internet is a blessing for the human community in modern times, and the use of networks is indispensable at present. The use of networks and the internet has, however, also brought a large number of security threats to our databases and information systems, with many intrusion attacks on public and private networks. The main objectives of this research work are to study the problems associated with intrusion into network systems, to analyze the use of intrusion detection systems, to scrutinize various IDSs and to develop a new IDS that is effective, easy to use and cost-effective for users. We present our newly developed application-based IDS, which is a suitable way to detect threats in a network system, is cost-effective and easy to use, and has an instantaneous alert system to notify security professionals of an intrusion.
Sindhu* and V. Vaidhehi
Large databases of digital images have been used as an efficient and advanced way of classifying and intelligently retrieving medical images. This research work classifies human organs based on MRI images, with various MRI images of organs considered as the data set. The main objective is to automate the medical imaging system: digital images are retrieved based on shape by Canny edge detection and clustered together into one class using the K-means algorithm. 2564 data sets related to the brain and heart are considered for this research work. The system was trained to classify the images, which results in faster execution in the medical field and also helps to obtain noiseless and efficient data.
Sunil Kumar and Maninder Singh
Network security, data security and several other security types, such as computer security, collectively compose the term "cloud security". Cloud computing poses a new challenge because the traditional security mechanisms being followed are insufficient to safeguard cloud resources, and the cloud can easily be targeted by attackers. A group of malicious or illegitimate users can attack the system, which may lead to denial of services to legitimate users. Such attacks are performed by malicious (zombie) attackers, and a zombie attack degrades network performance to a large extent. Traditional techniques cannot easily detect a zombie attacker in the cloud network, so in this paper we propose a technique which enhances the mutual authentication scheme in order to detect and isolate zombie attacks for efficient network performance.
Junestarfield Lyngdoh Kynshi and Deepa V Jose
This paper aims to solve the problems of the existing content-based double encryption algorithm using symmetric key cryptography. Simple binary addition, the folding method and the logical XOR operation are used to encrypt the content of a plaintext as well as the secret key. This algorithm helps to achieve the secure transfer of data through the network; it solves the problems of the existing algorithm and provides a better solution. The plaintext is encrypted using the above methods to produce a ciphertext. The secret key is encrypted and shared through a secure network, and without knowing the secret key it is difficult to decipher the text. As expected, the enhanced encryption algorithm gives better results than the existing one.
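The abstract names binary addition, folding and XOR as building blocks but does not spell out how they are combined, so the following is only an illustrative sketch of such a scheme, not the paper's algorithm: the key is folded over itself to the message length, each byte is summed with its key byte modulo 256, and the result is XORed with the same key byte. (Such toy ciphers are useful for study but are not secure for real use.)

```python
def fold_key(key: bytes, length: int) -> bytes:
    """Fold (repeat) the key over itself until it covers the message length."""
    return bytes(key[i % len(key)] for i in range(length))

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    ks = fold_key(key, len(plaintext))
    # binary addition mod 256, then logical XOR with the folded key byte
    return bytes(((p + k) % 256) ^ k for p, k in zip(plaintext, ks))

def decrypt(cipher: bytes, key: bytes) -> bytes:
    # invert the steps in reverse order: XOR first, then subtract mod 256
    ks = fold_key(key, len(cipher))
    return bytes(((c ^ k) - k) % 256 for c, k in zip(cipher, ks))

msg = b"secret message"
key = b"k3y"
ct = encrypt(msg, key)
print(ct != msg, decrypt(ct, key) == msg)
```

The round trip demonstrates the point made in the abstract: without the secret key, inverting both layers (the XOR and the modular addition) at once is what makes the ciphertext hard to decipher.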
Manvender Singh Rathore* and Deepa V. Jose
Agile software development works on twelve principles for software development, which imply that requirements and solutions evolve through the combined teamwork of disciplined and interdisciplinary teams. The objective of this paper is to connect the agile methodology with a Version Control System (VCS) for more efficient and effective utilization of resources. In the proposed agile-based model, the Version Control System plays a vital role in getting work done faster as compared to SCRUM. This paper compares various existing agile methodologies. The efficiency of the proposed model is demonstrated through comparative analysis with existing agile methods using the ANOVA mathematical model. Bitbucket is used as the web-based Version Control System hosting service, and the proposed model is compared by maintaining similar sprints in SCRUM and the VSprint model. The VCS and previous SRS documents are the important components of the proposed model; they help increase the speed of work at different phases of software development, which the existing models do not consider.
Rupal R Sharma1 and Ravi K Sheth2
Today, web application security is the most significant battlefield between victims, attackers and providers of web services, and the owners of web applications often cannot see the security vulnerabilities in web applications developed in ASP.NET. This paper explains an algorithm which aims to identify broken authentication and session management vulnerabilities. The given method scans the web application files: the scanner generator relies on studying the source code of the application's ASP.NET files and the code-behind files. A program developed for this purpose produces a report which describes the vulnerability types by mentioning the file name, a description and the location. The aim of the paper is to discover broken authentication and session management vulnerabilities, and the indicated algorithm will help organizations and developers to repair the vulnerabilities and improve end-to-end security.
Siddhartha Sankar Biswas
In this research paper the author introduces the notion of an interval-valued (i-v) fuzzy multigraph. The classical Dijkstra's algorithm for finding the shortest path in graphs is not applicable to fuzzy graph theory. Consequently, the author proposes a new algorithm, referred to as the IVF-Dijkstra algorithm, built on the philosophy of the classical Dijkstra formula, to solve the shortest path problem (SPP) in an i-v fuzzy multigraph.
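The abstract does not give the algorithm's details, so the sketch below only illustrates the general idea under stated assumptions: edge weights are intervals [a, b], intervals are added component-wise, and candidates are ranked by their midpoint (one common defuzzification heuristic; the paper's ranking may differ). Parallel edges of a multigraph would simply appear as multiple entries in the adjacency list.

```python
import heapq

def midpoint(iv):
    """Rank an interval [a, b] by its midpoint (a simple ranking heuristic)."""
    return (iv[0] + iv[1]) / 2.0

def interval_add(u, v):
    """Component-wise addition of two intervals."""
    return (u[0] + v[0], u[1] + v[1])

def ivf_dijkstra(graph, source):
    """Dijkstra over interval-valued edge weights, using the midpoint
    ranking wherever the classical algorithm compares numbers."""
    dist = {source: (0.0, 0.0)}
    pq = [(0.0, source)]
    seen = set()
    while pq:
        _, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        for v, w in graph.get(u, []):
            cand = interval_add(dist[u], w)
            if v not in dist or midpoint(cand) < midpoint(dist[v]):
                dist[v] = cand
                heapq.heappush(pq, (midpoint(cand), v))
    return dist

# adjacency list; a multigraph may list several (node, interval) pairs per edge
g = {'a': [('b', (1, 3)), ('c', (4, 6))], 'b': [('c', (1, 1))]}
print(ivf_dijkstra(g, 'a'))
```

On this toy graph the path a-b-c, with interval length (2, 4), beats the direct edge (4, 6), mirroring how the classical algorithm would pick 3 over 5 on the midpoints.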
Shivangi Nigam and Abhishek Bajpai
Resource provisioning in a cloud computing environment ensures flexible and dynamic access to cloud resources for end users. The multi-objective decision-making (MODM) approach considers assigning priorities to the decision alternatives in the environment: each alternative represents a cloud resource defined in terms of various characteristics termed decision criteria, and the provisioning objectives refer to the heterogeneous requirements of the cloud users. This research study proposes a Resource Interest Score Evaluation Optimal Resource Provisioning (RISE-ORP) algorithm which uses the Analytic Hierarchy Process (AHP) and Ant Colony Optimization (ACO) as a unified MODM approach to design an optimal resource provisioning system. It uses AHP as a method to rank the cloud resources for provisioning, and ACO to examine the cloud resources whose traits best satisfy the provisioning. The performance of this approach is analyzed using CloudSim, and the experimental results show that it improves on the performance of the previously used AHP approach to resource provisioning.
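The AHP ranking step referred to above derives priority weights from a pairwise-comparison matrix; a standard way to approximate them is the power method on that matrix. The comparison values below are hypothetical, purely for illustration of the mechanics.

```python
def ahp_priorities(M, iters=50):
    """Approximate the principal eigenvector of a pairwise-comparison
    matrix by repeated multiplication and normalization (power method).
    The normalized eigenvector gives the AHP priority weights."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

# Hypothetical comparison of three cloud resources on one criterion:
# M[i][j] states how strongly resource i is preferred over resource j.
M = [[1,     3,     5],
     [1 / 3, 1,     3],
     [1 / 5, 1 / 3, 1]]
print([round(x, 3) for x in ahp_priorities(M)])
```

The resulting weights sum to one and preserve the stated preference order, which is all the downstream ACO search needs as input ranking.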
C.P. Patidar1 and Arun Dev Dongre2
Today we live in the era of software and web applications, and software is used in every minor and major field: in defense, medicine, education, research, government, administration and many other fields, software has become a necessary part. Software brings transparency to systems and makes people's lives easy and comfortable. Software testing is a very important part of any software development process, requiring approximately forty percent of the budget of any software development effort. Just as in the automobile industry every vehicle is tested before it goes to the customer, in software engineering it is essential to test the software before deployment: if software is deployed without testing, users will encounter bugs and be unhappy with the software. In this paper we compare manual and automated testing and propose an automated testing model with test-driven development (TDD).
Prannoy Giri and K. Saravanakumar
Breast cancer is one of the most significant causes of death among women. Much research has been done on the diagnosis and detection of breast cancer using various image processing and classification techniques. Nonetheless, the disease remains one of the deadliest, affecting one in six women in her lifetime. Since the cause of breast cancer remains obscure, prevention is impossible; thus, early detection of a tumour in the breast is the only way to cure it. Using computer-aided diagnosis (CAD) on mammographic images is the most efficient and easiest way to diagnose breast cancer, and accurate discovery can effectively reduce the mortality rate brought about by the disease. Masses and microcalcification clusters are important early symptoms of possible breast cancer and can help predict it at its infant state. The images for this work are taken from the DDSM database (Digital Database for Screening Mammography), which contains approximately 3000 cases and is used worldwide for cancer research. This paper quantitatively describes the analysis methods used for texture features in the detection of cancer. These texture features are extracted from the ROI of the mammogram to characterize the microcalcifications as harmless, ordinary or threatening. The features are then reduced using Principal Component Analysis (PCA) for better identification of masses, and finally passed through a back propagation (neural network) algorithm for better understanding of the cancer pattern in the mammography image.
Malware is malicious code which may gain unauthorized private access through the internet. The types of malware are increasing day by day, and it is a challenging task for antivirus vendors to predict and catch them at access time. This paper aims to design an automated analysis system for malware classes based on features extracted by the Discrete Wavelet Transform (DWT), applying a four-level decomposition of the malware image. The proposed system works in three stages: pre-processing, feature extraction and classification. In pre-processing, the input image is normalized to 256x256 and denoised by applying a wavelet, which helps to enhance it. In feature extraction, the DWT is used to decompose the image into four levels. For classification, support vector machine (SVM) classifiers are used to discriminate the malware classes using statistical features extracted from the level-4 DWT decomposition with the Daubechies (db4), Coiflet (coif5) and bi-orthogonal (bior 2.8) wavelets. Among these wavelet features, the db4 features most effectively classify the malware class type, with high accuracies of 91.05% and 92.53% respectively on the two datasets. The analysis of the proposed method was conducted on two datasets and the results are promising.
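To make the decomposition step concrete: one level of a 2-D DWT splits an image into LL, LH, HL and HH subbands (approximation plus horizontal, vertical and diagonal details), and repeating this on the LL band gives the four-level pyramid used above. The sketch below uses the simpler Haar wavelet rather than the paper's db4/coif5/bior 2.8 (those require a wavelet library such as PyWavelets), and a toy 4x4 "image".

```python
def haar_1d(signal):
    """One level of the 1-D Haar transform: pairwise averages and details."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def haar_2d_level(image):
    """One 2-D decomposition level: transform rows, then columns,
    yielding the LL, LH, HL, HH subbands."""
    lo, hi = [], []
    for row in image:
        a, d = haar_1d(row)
        lo.append(a)
        hi.append(d)
    def cols(mat):
        t = list(map(list, zip(*mat)))          # columns of mat
        a = [haar_1d(c)[0] for c in t]
        d = [haar_1d(c)[1] for c in t]
        return list(map(list, zip(*a))), list(map(list, zip(*d)))
    LL, LH = cols(lo)
    HL, HH = cols(hi)
    return LL, LH, HL, HH

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
LL, LH, HL, HH = haar_2d_level(img)
print(LL)
```

Statistical features (mean, energy, entropy and so on) computed on the level-4 subbands are what feed the SVM in the pipeline above.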
Reena Hooda1 and Deepak Dhaka2
The benefits of plastic money are its accessibility and acceptability: it is lighter than cash in the wallet and ready to use, while the major problem is fraudsters who may exploit the rights of the genuine stakeholders. At present, computing resources, communication technology and the availability of cards in a competitive market are placing India at the growing edge of the popularity of plastic money. After demonetization, people are intensely employing the technology; moreover, it is a step towards the digitalization of India. With the increasing population of smartphone users, people are more aware of the cautions to observe while using P-cards. However, compared to the developed nations, in a land of villages like India the problem becomes more perilous: there is limited infrastructure and literacy, a high percentage of less-skilled population, old customs and non-reach of government facilities, spoiling the applicability and security of p-money and the Digital India campaign. Therefore, besides the rewards of plastic money and its contributions to digitalization, the present paper addresses a range of security issues in addition to techniques to make money protected and reliable.
H. B. Basanth Kumar
Digital images are widespread today, and their use can be divided into natural images and computer graphic (CG) images. Discrimination between natural and CG images is used in applications including flower classification, image indexing, video classification and many more. With the rapid growth of image rendering technology, users can produce highly realistic computer graphic images using sophisticated graphics software packages. Due to the high realism of CG images, it is very difficult for a user to distinguish them from natural images with the naked eye. This paper presents a comparative study of the existing schemes used to classify digital images.
Dharmesh Bhatt* and Bijendra Agrawal
In the unknown environment of a network, security is the major issue for safe communication between nodes. In such networks, many active and passive attacks are carried out by attackers, in groups, on the data packets and routing messages. Mobile ad hoc networks (MANETs) are networks which do not have any centralized authentication, and the nodes communicate with each other directly. In a MANET, security plays a vital role in safe communication, and the focus here is on the security of mobile ad hoc networks. A MANET is an open network available to both malicious and trustworthy users, so a robust solution must be designed to deal with malicious attackers so that healthy network transmission takes place between trustworthy nodes. Mobile ad hoc networks work in isolation from the wired infrastructure; this elasticity is their biggest strength, but it is also a big vulnerability for security. In this paper we discuss the active and passive attacks, including black hole attacks, spoofing, wormhole and flooding, and traffic monitoring, traffic analysis and eavesdropping, respectively.
Agile software development is a conceptual framework that promotes development using iterations throughout the project. In software development, "Agile" means quick-moving: for the satisfaction of the customer, and to cope with customers' frequently changing requirements, heavyweight methodology is being abandoned. Two major challenges in software development are producing high-quality software and meeting stakeholder requirements. An independent online web-based survey, an interview survey and a questionnaire survey were conducted. The motive was to find the total percentage of users in India who are using Agile, and to find out whether it increases the productivity and quality of software and reduces its cost. The hypotheses have been tested using the statistical one-way ANOVA method. The different hypotheses designed are:
Production increases on using different methodologies of Agile instead of heavyweight methods.
Quality increases on using different methodologies of Agile instead of heavyweight methods.
Cost reduces on using different methodologies of Agile instead of heavyweight methods.
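The one-way ANOVA used to test these hypotheses compares between-group variance to within-group variance via the F statistic. A minimal sketch of the computation; the productivity scores and the three methodology groups below are hypothetical, purely to show the mechanics.

```python
def one_way_anova_f(groups):
    """F = (between-group mean square) / (within-group mean square)
    for k independent groups of observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical productivity scores under three methodologies
agile_xp = [8, 9, 7, 9]
scrum = [7, 8, 8, 7]
waterfall = [5, 4, 6, 5]
f = one_way_anova_f([agile_xp, scrum, waterfall])
print(round(f, 2))
```

A computed F larger than the critical F(k-1, n-k) value at the chosen significance level leads to rejecting the null hypothesis that all methodologies perform equally.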
Saiful Islam* and Bipul Syam Purkayastha
Electronic dictionaries and machine translation systems are two of the most important language learning tools for achieving knowledge of known and unknown natural languages. Natural languages are the most important aspect of human communication, and these two tools are therefore very important and frequently used in daily life. The electronic dictionary (E-dictionary) and machine translation (MT) systems are especially helpful for students, research scholars, teachers, travellers and business people. E-dictionaries and MT are important applications and research tasks in Natural Language Processing (NLP), and the demand for research on them is growing in the world as well as in India. The North-East (NE) is a very popular and multilingual region of India; even so, only a small number of E-dictionary and MT systems have been developed for NE languages. Through this paper, we elaborate on the importance, approaches and features of E-dictionary and MT systems, and also review the existing E-dictionary and MT systems which have been developed for the NE languages of NE India.
Riddhi Gaur and Uma Kumari
Cloud computing allows internet-based applications and data storage services to be easily acquired by end users. Providing security to the cloud computing environment has become an important issue with the increased demand for cloud computing: in addition to the traditional security methods, further measures such as access control, confidentiality, firewalls and user authentication are required. One of the essential components of cloud security is the Intrusion Detection System (IDS), the mechanism most commonly used to detect various attacks on the cloud. This paper discusses intrusion detection and the different intrusion detection techniques, namely anomaly-based and signature-based techniques.
Neha Bhatia1, Himani2 and Chander Kant3
Biometric authentication using fingerprints is one of the unique and reliable methods of verification. A biometric system suffers a significant loss of performance when the sensor is changed between enrollment and authentication. In this paper the fingerprint sensor interoperability problem is addressed using Gabor filters and by classifying images into good and poor quality. Gabor filters play an important role in many application areas for the enhancement of various types of fingerprint images: they can remove noise and preserve the real ridge and valley structures, and they are used for fingerprint image enhancement. Experimental results on the FVC2004 databases show the improvements of this approach.
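A Gabor filter is a Gaussian envelope modulated by a sinusoid oriented along a chosen angle, which is why it responds strongly to the oriented ridge-valley pattern of a fingerprint. A minimal sketch of the real part of such a kernel; the size, orientation, frequency and sigma values are illustrative, not those of the paper.

```python
import math

def gabor_kernel(size, theta, freq, sigma):
    """Real part of a Gabor filter: a circular Gaussian modulated by a
    cosine wave oriented along angle theta (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates so the wave runs along theta
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            g = math.exp(-(xr * xr + yr * yr) / (2 * sigma * sigma))
            row.append(g * math.cos(2 * math.pi * freq * xr))
        kernel.append(row)
    return kernel

kern = gabor_kernel(7, 0.0, 0.25, 2.0)
print(len(kern), len(kern[0]))
```

Convolving a fingerprint image with a bank of such kernels at several orientations enhances ridges aligned with each theta while suppressing noise, which is the enhancement role described above.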
The Internet of Things (IoT) is a system of connected physical objects that are accessible through the internet. A thing in the IoT is an object that is assigned an IP address and has the ability to collect and transfer data over a network without manual intervention. As the IoT does not need any human-to-machine interaction, it seems to be one of the largest waves of revolution according to ongoing research, and hence security is needed. The quick development of the IoT has brought challenges in terms of the security of things. This paper focuses on the general security issues in the IoT and the measures used to overcome those security issues.
Arpit Agrawal* and Shubhangi Verma
Cloud computing is the new-generation technology that provides a way of sharing resources, memory and software, anything in the form of a service, using the internet. Security is an important and unique phenomenon that gives a safe and isolated environment. Security models and principles are defined to implement security features with any application; confidentiality, authentication and integrity are the primary principles for trust establishment. Existing work concentrates only on the integrity concept and does not impose authentication or access control. A Kerberos-based strong authentication scheme has been generated using the third-party auditing concept to improve the strength of authentication as well as trust in the CSP. This work implements a security service architecture to create a Kerberos environment and establish communication between Kerberos and the CSP. The complete work is implemented using Java technology and an OpenStack server for the public cloud environment.
Preeti Gulia and Palak*
The development of high-quality software is a need of the current technology-driven world. Component Based Software Engineering (CBSE) has provided a cost-effective, fast and modular approach to developing complex software. CBSE is mainly based on the concept of reusability; beyond this, CBSE has several advantages as well as challenges, which are summarized in this paper. Large and complex software development requires the management of reusable components, which can be selected from a component repository and assembled to obtain a working application. The development of components and their assembly is different from traditional software, which leads to the need for new development paradigms for Component Based Systems (CBS). A software development life cycle (SDLC) provides a planned and systematic arrangement of the activities to be carried out to deliver high-quality products within time and budget. This paper presents a comparative study of component-based software development life cycle models with their strengths and weaknesses.
Ankush Saklecha and Jagdish Raikwal
Clustering is a well-known unsupervised learning method in which a set of elements is separated into uniform groups. K-means is one of the most popular partition-based clustering algorithms in the research area, but in the original K-means the quality of the resulting clusters mostly depends on the selection of the initial centroids; the number of iterations therefore increases and more time is taken, making it computationally expensive. Many methods have been proposed for improving the accuracy, performance and efficiency of the K-means clustering algorithm. This paper proposes an enhanced K-means clustering approach, in addition to a collaborative filtering approach, to recommend quality content to users. This research would help those users who have to scroll through pages of results to find important content.
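The initial-centroid sensitivity noted above can be illustrated with a small sketch. The abstract does not specify the paper's enhancement, so the farthest-point initialization below is just one well-known heuristic for spreading the initial centroids; the toy points are hypothetical.

```python
import math

def farthest_point_init(points, k):
    """Pick the first centroid arbitrarily, then repeatedly pick the point
    farthest from all chosen centroids (reduces sensitivity to init)."""
    centroids = [points[0]]
    while len(centroids) < k:
        nxt = max(points, key=lambda p: min(math.dist(p, c) for c in centroids))
        centroids.append(nxt)
    return centroids

def kmeans(points, k, iters=20):
    """Standard Lloyd iterations: assign each point to its nearest
    centroid, then move each centroid to its cluster mean."""
    centroids = farthest_point_init(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[i].append(p)
        centroids = [
            tuple(sum(xs) / len(xs) for xs in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

pts = [(1, 1), (1.2, 0.8), (8, 8), (8.2, 7.9)]
cents, clus = kmeans(pts, 2)
print(sorted(cents))
```

Because the two starting centroids land in different natural groups, the algorithm converges in very few iterations, which is exactly the cost saving a better initialization buys.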
Harmeet Kaur1, Shahanawaj Ahamad2 and Govinder N. Verma3
The present research estimates the efficacy of a legacy program and identifies the areas for its improvement. The research also intends to determine to what extent reengineering of a legacy program should be done on the basis of the estimation approach. The study outlines the current issues and trends in the reengineering of legacy programs from various perspectives. A comprehensive literature review reveals that much work has already been done on legacy system estimation and the reengineering domain, yet the basic dimensions of complexity, quality, and effort have not been considered collectively. Hence the present research underlines this very point and studies the reengineering of a legacy program on the paradigms of quality, complexity, and effort estimation taken together. The findings put forward an equation and a reengineering scale that would be highly compatible with present technology for the feasibility of effective reengineering.
Abhinav Kumra, W. Jeberson, and Klinsega Jeberson
Network security is one of the most important non-functional requirements of a system. Over the years, many software solutions have been developed to enhance network security. This paper provides an overview of the different types of Intrusion Detection Systems (IDS) and their respective advantages and disadvantages. The need for an IDS in a system environment and the generic building blocks of an IDS are also discussed. The examples covered are as follows: (1) a misuse intrusion detection system that uses the state-transition analysis approach, (2) an anomaly-based system that uses payload modeling, and (3) a hybrid model that combines the best practices of misuse-based and anomaly-based intrusion detection systems.
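The payload-modeling approach mentioned in example (2) can be illustrated with a minimal sketch: learn the byte-frequency distribution of normal traffic, then score new payloads by how unlikely their bytes are under that model. This is a toy 1-gram version of the idea (real systems such as PAYL use richer statistics per port and length); the training payloads and threshold logic are invented for illustration.

```python
import math
from collections import Counter

def byte_model(payloads):
    """Train a 1-gram payload model: relative frequency of each byte value."""
    counts = Counter()
    total = 0
    for p in payloads:
        counts.update(p)  # iterating bytes yields integer byte values
        total += len(p)
    return {b: counts[b] / total for b in range(256)}

def anomaly_score(model, payload):
    """Negative average log-likelihood; higher means more anomalous."""
    eps = 1e-6  # floor so unseen bytes do not produce log(0)
    return -sum(math.log(model.get(b, 0) + eps) for b in payload) / len(payload)

# "normal" traffic for this toy example: plain HTTP request lines
normal = [b"GET /index.html HTTP/1.1", b"GET /style.css HTTP/1.1"]
model = byte_model(normal)
ok = anomaly_score(model, b"GET /about.html HTTP/1.1")
bad = anomaly_score(model, b"\x90\x90\x90\x90\xcc\xcc shellcode-ish")
print(ok < bad)  # binary bytes never seen in training score as anomalous
```

An anomaly-based IDS flags payloads whose score exceeds a threshold calibrated on normal traffic, which is why it can catch novel attacks that a misuse (signature) system has no rule for.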
Gurpreet Kaur1, Derminder Singh1 and Rajan Aggarwal2
During the past two decades, groundwater utilization has increased tremendously in the state of Punjab (India), particularly for agricultural purposes. The decline in the water table has resulted in a higher energy demand for lifting water. Punjab is presently facing a water crisis that exacerbates the monetary conditions of small farmers, degrades the natural environment, and unfavorably influences the agricultural production and economy of the state. In this research, an expert system was developed using Java Standard Edition 7 that provides appropriate selection of a submersible pump set and the required associated components, such as power cable rating and size, generator capacity, ammeter rating, voltmeter rating, capacitor rating, and polyvinyl chloride pipe diameter, based on the spatial information of the last 18 years (1998-2015) of the groundwater table for the state of Punjab. The developed system will be beneficial for farmers in estimating the required submersible pumping system and its allied electrical components.
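The paper's actual rule base and ratings are not given in the abstract; the sketch below only illustrates the general shape of such an expert system, mapping a water-table depth to a set of component selections via an ordered rule table. Every threshold and rating value here is a hypothetical placeholder, not a value from the study.

```python
# hypothetical rule table: (max depth in m, pump kW, cable mm^2, pipe mm)
RULES = [
    (30, 3.7, 4.0, 50),
    (60, 5.5, 6.0, 63),
    (100, 7.5, 10.0, 75),
]

def select_pump(depth_m):
    """Return the first rule whose depth band covers the given water table."""
    for max_depth, kw, cable_mm2, pipe_mm in RULES:
        if depth_m <= max_depth:
            return {"pump_kw": kw, "cable_mm2": cable_mm2, "pipe_mm": pipe_mm}
    raise ValueError("depth beyond supported range")

# a farmer at a 45 m water table falls into the middle band
print(select_pump(45))
```

In the real system the depth would come from the 1998-2015 spatial groundwater data for the farmer's location, and additional rules would cover generator, ammeter, voltmeter, and capacitor ratings.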
Iqra Altaf Mattoo and Parul Agarwal*
Biometric recognition is a highly suitable and reliable identification method used in different fields due to the uniqueness of numerous behavioural and physiological traits, such as hand geometry, fingerprints, iris patterns, facial features, handwriting, and voice. Iris recognition systems are widely used because the iris has inherently distinctive patterns that provide a robust basis for identification. Different nations have already started to use biometric recognition systems for identification purposes, including patient identification and border security. In this review paper, the different steps involved in an iris recognition system are defined, and the different iris recognition methods used by researchers for each recognition step are evaluated.
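The final matching step of a typical iris recognition pipeline compares two binarized iris codes with a masked fractional Hamming distance, ignoring bits occluded by eyelids or eyelashes (the approach popularized by Daugman). The tiny codes, masks, and threshold below are invented for illustration; real iris codes have on the order of 2048 bits.

```python
def masked_hamming(code_a, code_b, mask_a, mask_b):
    """Fraction of disagreeing bits over bits valid in BOTH iris codes."""
    usable = [i for i in range(len(code_a)) if mask_a[i] and mask_b[i]]
    if not usable:
        raise ValueError("no usable bits to compare")
    diff = sum(code_a[i] != code_b[i] for i in usable)
    return diff / len(usable)

enrolled = [1, 0, 1, 1, 0, 0, 1, 0]
probe    = [1, 0, 1, 0, 0, 0, 1, 0]   # one bit differs from the enrolment
mask_a   = [1, 1, 1, 1, 1, 1, 0, 0]   # last two bits occluded at enrolment
mask_b   = [1, 1, 1, 1, 1, 1, 1, 1]
d = masked_hamming(enrolled, probe, mask_a, mask_b)
print(d)  # 1 differing bit over the 6 jointly usable bits
```

Two codes from the same eye yield a small distance, while codes from different eyes cluster near 0.5; a decision threshold between the two distributions separates genuine matches from impostors.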
Y. Bhavani1*, V. Janaki2 and R. Sridevi3
Distributed Denial of Service (DDoS) attacks are hard to prevent outright. Among the various attacks on a network, DDoS attacks are difficult to detect because of IP spoofing, and IP traceback is the principal technique for identifying their true origin. The path affected by a DDoS attack is identified by IP traceback approaches such as the Probabilistic Packet Marking (PPM) and Deterministic Packet Marking (DPM) algorithms. The PPM approach reconstructs the complete attack path from the victim to the source, whereas DPM finds only the source of the attacker. With the DPM algorithm, finding the source of the attacker is difficult if a router is compromised. With the PPM algorithm, the complete attack path can be constructed, so a compromised router can be identified. In this paper, we review the PPM and DPM techniques and compare the strengths and weaknesses of each proposal.
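The PPM idea can be made concrete with a toy simulation: each router along the path overwrites a mark field in a packet with some probability p, and downstream routers increment a distance counter; from many packets, the victim recovers the routers ordered by distance and thus the whole path. This sketch uses simplified node sampling (real PPM proposals use edge sampling and encode marks into the IP identification field); the topology and probability are invented for illustration.

```python
import random

def forward(path, p, rng):
    """Simulate one packet traversing `path` (attacker side first).
    Each router overwrites the mark with probability p; routers after
    the last marker increment the distance-to-victim counter."""
    mark, dist = None, 0
    for router in path:
        if rng.random() < p:
            mark, dist = router, 0
        elif mark is not None:
            dist += 1
    return mark, dist

rng = random.Random(42)
path = ["R1", "R2", "R3", "R4"]   # R1 nearest the attacker, R4 the victim
samples = [forward(path, 0.2, rng) for _ in range(20000)]

# the victim buckets marks by distance, then reads the path farthest-first
by_dist = {}
for mark, dist in samples:
    if mark is not None:
        by_dist.setdefault(dist, set()).add(mark)
reconstructed = [next(iter(by_dist[d])) for d in sorted(by_dist, reverse=True)]
print(reconstructed)  # attack path recovered in attacker-to-victim order
```

This also shows PPM's key trade-off the abstract alludes to: the victim needs many packets before every router has marked at least once, but it obtains the full path, so a router whose reported position is inconsistent can be flagged as compromised.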
K. Sharmila1,*, V. Janaki2 and A. Nagaraju3
Confidentiality and authentication were once treated separately, but nowadays advances in technology demand that they be used together. Although technology is improving tremendously, skilled attackers continually challenge authentication factors, thereby forcing more factors to be included for authentication. As the number of factors increases, the failure rate for authentication may also rise, since any single factor that does not work can block a legitimate user. A qualitative survey of the user authentication systems in use today is presented here, along with a comparative study of the various authentication mechanisms used in the world of information security by various researchers.
Mudasir M Kirmani
Cardiovascular disease encompasses various diseases of the heart, lymphatic system, and circulatory system of the human body. The World Health Organisation (WHO) has reported that cardiovascular diseases have a high mortality rate and a high risk of causing various disabilities. The most prevalent causes of cardiovascular diseases are behavioural and dietary habits, such as tobacco intake, unhealthy diet, obesity, physical inactivity, ageing, and addiction to drugs and alcohol, to name a few. People with conditions such as hypertension, diabetes, hyperlipidemia, and stress are at high risk of cardiovascular diseases. Different techniques have been proposed from time to time to predict the prevalence of cardiovascular diseases in general, and heart disease in particular, by implementing a variety of algorithms. Detection and management of cardiovascular diseases can be supported by computer-based predictive tools in data mining, and data-mining-based techniques offer scope for better and more reliable prediction and diagnosis of heart disease. In this study we surveyed various available techniques, including decision trees and their variants, Naive Bayes, neural networks, Support Vector Machines, fuzzy rules, genetic algorithms, and ant colony optimization, to name a few. The observations illustrate that it is difficult to name a single machine learning algorithm as best for the diagnosis and prognosis of CVD. The study further examines the behaviour, selection, and number of factors required for efficient prediction.
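Of the techniques the survey covers, Naive Bayes is the simplest to sketch end to end: each binary risk factor contributes an independent likelihood ratio to the class decision. The sketch below trains a Bernoulli Naive Bayes with Laplace smoothing on synthetic records; the risk factors, probabilities, and data are entirely invented for illustration and carry no clinical meaning.

```python
import random
from collections import defaultdict

def train_nb(rows):
    """Bernoulli Naive Bayes over binary features; Laplace-smoothed counts."""
    counts = defaultdict(lambda: [1, 1])  # (label, feature) -> [n_zero+1, n_one+1]
    labels = defaultdict(int)
    for *feats, label in rows:
        labels[label] += 1
        for i, v in enumerate(feats):
            counts[(label, i)][v] += 1
    return counts, labels, len(rows)

def predict(model, feats):
    """Pick the label maximizing prior times product of feature likelihoods."""
    counts, labels, n = model
    best, best_p = None, 0.0
    for label, cnt in labels.items():
        p = cnt / n
        for i, v in enumerate(feats):
            a = counts[(label, i)]
            p *= a[v] / (a[0] + a[1])
        if p > best_p:
            best, best_p = label, p
    return best

# synthetic records: (hypertension, diabetes, smoker) -> has_cvd
# the risk formula below is an invented data generator, not medical fact
rng = random.Random(1)
rows = []
for _ in range(500):
    h, d, s = rng.randint(0, 1), rng.randint(0, 1), rng.randint(0, 1)
    risk = 0.15 + 0.25 * h + 0.2 * d + 0.2 * s
    rows.append((h, d, s, 1 if rng.random() < risk else 0))
model = train_nb(rows)
print(predict(model, (1, 1, 1)), predict(model, (0, 0, 0)))
```

The independence assumption is what makes the model cheap to train on many risk factors, and also why the survey's conclusion holds: no single algorithm dominates, since real factors (e.g. hypertension and hyperlipidemia) are rarely independent.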
Mudasir M Kirmani1 and Syed Mohsin Saif*2
The world has shrunk into a logically small e-village where everyone can communicate with everyone else with great ease via both audio and visual media. This global village has only become possible through interconnected communication links connecting users in geographically distant areas, and the predominant medium for achieving this is the well-developed website. Educational institutions play a vital role in reshaping a nation by instilling quality attributes in its culture, civilization, and modernization. The information available on the websites of educational institutions has become very important for prospective students, and at present most universities conduct the admission process online with their websites as mediators. Therefore, the need of the hour is to have up-to-date and informative websites with ease of access. A key feature of the ongoing growth of the World Wide Web over the past decade has been the proliferation of web portals and mobile applications that support education. The main aim of this research work is to study the websites of central universities established after 2004, to explore their quality parameters, and to gain insight into the challenges faced by prospective users. The research work recommends a common design framework for all central universities in order to help prospective users understand and use central university websites.