Nisar A Lala*, Altaf A Balkhi and G M Mir
Cognitive radio (CR) is a promising solution to improve spectrum utilization by enabling unlicensed users to exploit the spectrum in an opportunistic manner. Spectrum handoff is a new type of handoff in CR, necessitated by the reappearance of a primary user (PU) in the licensed band presently occupied by secondary users (SUs). Spectrum handoff procedures help the SUs vacate the occupied licensed spectrum and find a suitable target channel on which to resume the unfinished transmission. The purpose of spectrum mobility management in cognitive radio networks is to ensure that these transitions are made smoothly and rapidly, so that applications running on a cognitive user perceive minimum performance degradation during a spectrum handoff. In this paper, we survey the literature on spectrum handoff in cognitive radio networks.
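A minimal sketch of the handoff decision described above, under simplifying assumptions (perfect PU detection and a known channel list; the function name and inputs are hypothetical, not from any surveyed scheme):

```python
def spectrum_handoff(current_channel, channels, pu_active):
    """Pick a target channel for a secondary user (SU) when the
    primary user (PU) reappears on the SU's current channel.

    channels  : list of licensed channel ids
    pu_active : dict mapping channel id -> True if a PU occupies it
    Returns the channel to use, or None if every channel is busy.
    """
    # No handoff is needed while the PU stays away from this channel.
    if not pu_active.get(current_channel, False):
        return current_channel
    # Vacate and resume on the first idle licensed channel found.
    for ch in channels:
        if ch != current_channel and not pu_active.get(ch, False):
            return ch
    return None  # all channels occupied: transmission must wait
```

Real schemes differ mainly in how the target channel is chosen (proactively from sensing history versus reactively after the PU appears), which this sketch deliberately leaves out.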
Mohammad Zunnun Khan*1, M. A. Khanam2 and M. H Khan2
Measuring testability before actual development starts plays a crucial role for developers, designers and end users alike. Early measurement of testability, especially at the requirement stage, assists the developer in the subsequent development process and also assures the delivery of high-quality requirements, which can reduce the overall cost and improve the quality of the development process. In view of this fact, this paper identifies testability estimation factors, namely understandability and modifiability, and establishes the correlation among testability, understandability and modifiability. Further, a model is developed to quantify software testability in the requirement phase, named the Requirement Testability Model of Object Oriented Software (RTMOOS). Furthermore, the correlation of testability with these factors has been tested and justified with the help of statistical measures.
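As an illustration of the kind of statistical measure involved, a small Python sketch computing Pearson correlation between testability and its two factors on hypothetical requirement-level ratings (the data and the 1-10 scale are invented for illustration, not taken from the paper):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical ratings for six requirement specifications,
# higher is better on a 1-10 scale (illustrative data only).
understandability = [6, 7, 5, 8, 9, 4]
modifiability     = [5, 8, 4, 7, 9, 3]
testability       = [6, 8, 5, 8, 9, 4]

r_u = pearson(testability, understandability)
r_m = pearson(testability, modifiability)
```

A strong positive correlation on such samples is the kind of statistical evidence used to justify including a factor in a testability model.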
Mudasir M Kirmani
Practical effort estimation is a basic and essential aspect of every organisation for sustaining better project management. Accuracy in effort estimates enables an organisation to grow its business and build confidence among its customers. It has been observed that software effort estimation requires continuous improvement at regular intervals. The fundamental aim of this research work is to evaluate the performance of the Re-UCP method for estimating the effort of web application projects. This research work compares the results of existing effort estimation models with those of the Re-UCP method for web application development projects.
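Re-UCP is a revision of the Use Case Points (UCP) method. As background, a minimal sketch of the classic UCP effort formula that Re-UCP builds on (the 20 hours-per-UCP productivity factor is a commonly cited default, not a value taken from this work):

```python
def ucp_effort(uucw, uaw, tcf, ecf, hours_per_ucp=20):
    """Classic Use Case Points effort estimate.

    uucw : unadjusted use case weight (sum of weighted use cases)
    uaw  : unadjusted actor weight (sum of weighted actors)
    tcf  : technical complexity factor
    ecf  : environmental complexity factor
    UCP = (uucw + uaw) * tcf * ecf, then effort = UCP * productivity.
    """
    ucp = (uucw + uaw) * tcf * ecf
    return ucp * hours_per_ucp
```

For example, a project with UUCW 120, UAW 9 and neutral complexity factors (1.0 each) yields 129 UCP, or 2580 person-hours at the default productivity factor. Re-UCP adjusts the counting and the factor weights, which this sketch does not attempt to reproduce.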
Mudasir M Kirmani
Data warehouse design requires a radical restructuring of vast amounts of data, often of doubtful or inconsistent quality, drawn from numerous heterogeneous sources. Data warehouse design assimilates business knowledge and technology know-how, and requires a deep understanding of the business processes in detail. The main aim of this research paper is to study and investigate a conversion model for transforming E-R diagrams into a star schema for building data warehouses. Dimensional modelling is a logical design technique used for data warehouses; it is a popular technique for databases designed around the end-user queries posed to a data warehouse. This paper addresses the potential differences between the two techniques and highlights the advantages, as well as the disadvantages, of dimensional modelling. The focus of this paper is the star schema, which comprises a fact table and dimension tables; each fact table contains foreign keys to the dimension tables, the measures, and degenerate dimensions, if any. We also discuss the deployment and acceptance of the Conversion Model (CM) to derive the fact table and dimension tables according to local needs, and highlight why dimensional modelling is preferred over E-R modelling when creating a data warehouse.
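To make the star-schema structure concrete, a hypothetical Python sketch that emits DDL for one fact table and its dimension tables (the table and column names are illustrative and do not reflect the paper's Conversion Model):

```python
def star_schema_ddl(fact, measures, dimensions):
    """Emit CREATE TABLE statements for a simple star schema:
    one dimension table per entry in `dimensions` (name -> list of
    descriptive attributes), and one fact table holding a foreign
    key per dimension plus the numeric measures."""
    stmts = []
    for dim, attrs in dimensions.items():
        cols = [f"{dim}_key INT PRIMARY KEY"]
        cols += [f"{a} VARCHAR(100)" for a in attrs]
        stmts.append(f"CREATE TABLE dim_{dim} ({', '.join(cols)});")
    fks = [f"{d}_key INT REFERENCES dim_{d}({d}_key)" for d in dimensions]
    ms = [f"{m} DECIMAL(12,2)" for m in measures]
    stmts.append(f"CREATE TABLE fact_{fact} ({', '.join(fks + ms)});")
    return stmts

# A toy sales schema: two dimensions around one fact table.
ddl = star_schema_ddl(
    "sales", ["quantity", "amount"],
    {"date": ["day", "month", "year"], "product": ["name", "category"]},
)
```

The generated fact table references each dimension by its surrogate key, which is the defining shape of a star schema, whereas an E-R design would normalize the same attributes across many related entities.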
Chetan R. Dudhagara and Hasamukh B. Patel
In the recent era of modern technology, there are many problems with the storage, retrieval and transmission of data. Data compression is necessary due to the rapid growth of digital media and the consequent need to reduce storage size and transmit data effectively and efficiently over networks; it also reduces transmission traffic on the Internet. Data compression tries to reduce the number of bits required to store data digitally, and various data and image compression algorithms are widely used to reduce the original data to a smaller number of bits. Lossless data and image compression is a special class of data compression whose algorithms reduce the number of bits by identifying and eliminating statistical redundancy in the input data. Run Length Encoding is a simple and effective method of this kind: it provides good lossless compression of input data and is particularly useful for data containing many consecutive runs of the same value. This paper presents an implementation of Run Length Encoding for data compression.
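As an illustration of the technique the paper implements, a minimal Python sketch of Run Length Encoding and its inverse (not the paper's code):

```python
def rle_encode(data):
    """Run Length Encoding: collapse each run of identical values
    into a (value, run_length) pair."""
    runs = []
    for value in data:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([value, 1])   # start a new run
    return [(v, n) for v, n in runs]

def rle_decode(pairs):
    """Invert rle_encode: expand each (value, run_length) pair."""
    out = []
    for value, count in pairs:
        out.extend([value] * count)
    return out
```

For instance, `rle_encode("AAABBC")` yields `[('A', 3), ('B', 2), ('C', 1)]`; decoding restores the original sequence exactly, which is what makes the scheme lossless. On data without long runs, the encoded form can be larger than the input, which is the method's main limitation.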