An Effective Intrusion Detection System for correction and detection of Gray Hole attack in MANETs
A Mobile Ad hoc Network (MANET) is a collection of wireless mobile nodes that forms a temporary network without relying on any existing infrastructure, centralized administration, or the standard support services regularly available in the wide-area networks to which the hosts may normally be connected. In this paper, simulation results are compared between the previous and the proposed approach for the detection and correction of the Gray Hole attack in MANETs; all results were obtained with the NS-2 simulator.
Dependency-based query answering of simple English text
The improvement of current Question Answering (QA) systems relies on finding ways to support the traditional statistical approach to QA with logic reasoning. In this paper we show one way of supporting an interactive question answering system with logic reasoning: the most likely answer to a question is found by searching a predicate format of topic-arranged question patterns and responses. AnswerFinder is a framework for the development of question-answering systems, and is currently being used to test the applicability of graph representations for the detection and extraction of answers. In this paper we briefly describe AnswerFinder and introduce our method to learn graph patterns that link questions with their corresponding answers in arbitrary sentences. The method is based on the translation of the logical forms of questions and answer sentences into graphs, and on the application of operations based on graph overlaps and the construction of paths within graphs. The method is general and can be applied to any graph-based representation of the contents of questions and answers.
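The graph-overlap idea above can be sketched in a few lines. This is an illustrative toy, not the AnswerFinder code: logical forms are modelled simply as sets of dependency edges, and candidate sentences are ranked by the size of their edge overlap with the question graph.

```python
# Toy sketch of ranking answer candidates by graph overlap. A "graph" here is
# just a set of (head, relation, dependent) dependency edges; real logical-form
# graphs would be richer, but the overlap operation is the same in spirit.

def graph(edges):
    """Represent a logical form as a set of (head, relation, dependent) edges."""
    return set(edges)

def overlap_score(question_g, sentence_g):
    """Score a candidate sentence by the number of edges shared with the question."""
    return len(question_g & sentence_g)

question = graph([("invented", "nsubj", "who"),
                  ("invented", "dobj", "telephone")])
candidates = {
    "Bell invented the telephone.":
        graph([("invented", "nsubj", "Bell"),
               ("invented", "dobj", "telephone")]),
    "The telephone rang.":
        graph([("rang", "nsubj", "telephone")]),
}

best = max(candidates, key=lambda s: overlap_score(question, candidates[s]))
print(best)  # prints "Bell invented the telephone."
```

The first candidate shares the `("invented", "dobj", "telephone")` edge with the question, so it wins; a path-construction step, as the abstract describes, would then extract "Bell" as the answer.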
Enhancing particle swarm optimization using chaotic operators for association rule mining
Association Rule (AR) mining is a data mining task that discovers interesting relations between variables in databases. The main focus of research in association rule mining is on improving the computational efficiency of mining the rules. The conventional methods available for mining association rules depend on the threshold values of minimum support and minimum confidence, and setting these values requires great care and knowledge of the application domain. This paper deals with mining association rules using chaotic Particle Swarm Optimization (cPSO). Particle Swarm Optimization (PSO) is a simple but powerful population-based search technique for solving optimization problems. Chaotic PSO modifies the velocity update function by introducing chaotic operators derived from chaotic maps. Both PSO and chaotic PSO generate ARs with consistency and better computational accuracy when tested on three datasets. The range of attribute values in the dataset is found to affect the performance of the system.
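The modified velocity update can be sketched as follows. This is a hedged illustration, not the paper's implementation: the usual uniform-random coefficients in the PSO velocity update are replaced by values drawn from a chaotic logistic map, and the parameter values (`w`, `c1`, `c2`) are conventional PSO choices, not values taken from the paper.

```python
# Sketch of a chaotic PSO velocity update: the stochastic coefficients r1, r2
# come from iterating the logistic map x_{k+1} = 4 x_k (1 - x_k), a common
# chaotic operator, instead of a uniform random generator.

def logistic_map(x):
    """One step of the logistic map with r = 4 (fully chaotic regime)."""
    return 4.0 * x * (1.0 - x)

def cpso_velocity(v, x, pbest, gbest, chaos, w=0.7, c1=1.5, c2=1.5):
    """One velocity update; `chaos` is the current chaotic state in (0, 1)."""
    r1 = logistic_map(chaos)
    r2 = logistic_map(r1)
    new_v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return new_v, r2  # return the advanced chaotic state for the next step

v, chaos = 0.0, 0.3          # initial velocity and chaotic seed (illustrative)
for _ in range(5):
    v, chaos = cpso_velocity(v, x=1.0, pbest=2.0, gbest=3.0, chaos=chaos)
```

Unlike a pseudo-random generator, the chaotic sequence is deterministic yet non-repeating and covers (0, 1) densely, which is the property cPSO exploits to improve search diversity.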
Proposing new ideas and analysis of parallel processing for data mining and data analysis applications
This paper emphasizes how parallelism can be applied in data analysis. In recent decades, large amounts of data have been produced by machines: software logs, cameras, microphones, RFIDs, etc. The rate at which such data is created increases exponentially, in line with Moore's law. Storing such amounts of data is inexpensive, and with parallel processing methods the data can also be investigated and mined effectively. This article therefore discusses parallel programming techniques used in data analysis and data mining. The key motive for this parallelism is to make analysis faster. This is generally attained by using multiple processors or multiple computers, executing different aspects of the data analysis or mining task side by side and later consolidating the results into a single report.
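The partition-analyse-consolidate pattern described above can be sketched with Python's standard `multiprocessing` module. The word-count task is only a placeholder analysis; the structure (split the data, run workers in parallel, merge partial results into one report) is the point.

```python
from multiprocessing import Pool
from collections import Counter

def analyse_chunk(lines):
    """Per-partition analysis step, run independently on each worker process."""
    counts = Counter()
    for line in lines:
        counts.update(line.split())
    return counts

def parallel_report(lines, workers=2):
    """Partition the data, analyse the parts in parallel, consolidate results."""
    chunks = [lines[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(analyse_chunk, chunks)   # parallel "map" step
    report = Counter()
    for part in partials:                            # consolidation step
        report.update(part)
    return report

if __name__ == "__main__":
    log = ["error disk full", "ok", "error timeout", "ok ok"]
    print(parallel_report(log).most_common(2))
```

Because the chunks are independent, the analysis scales with the number of workers; only the final consolidation is sequential.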
Review of Regression Testing on Object-Oriented Programs
The purpose of regression testing is to ensure that bug fixes and new functionality introduced in a new version of software do not adversely affect the correct functionality inherited from the previous version. Regression testing is an expensive and frequently executed maintenance process used to revalidate modified software; it is a costly but crucial problem in software development. This paper surveys current research on regression testing and current practice in industry, and attempts to find out whether there are gaps between them.
Similarity search in recent-biased time series databases using an adaptive framework
A time series database is a collection of data generated in series as time goes on, and such data constitutes a large portion of the data stored in computers: stock-price movements, weather data, bio-medical measurements, video data, etc. Two time sequences of the same length are said to be similar if their Euclidean distance is less than or equal to a given threshold. The main issue in similarity search over time series databases is search performance, since time sequences are usually of high dimension, so it is important to reduce the search space for efficient processing of similarity queries in large time series databases. We have used an adaptive framework for data reduction, which improves search performance in recent-biased time series databases. We have applied a set of linear transformations on the reduced sequence that can be used as the basis for similarity queries on time series data, and we have also formalized the intuitive notions of exact and approximate similarity in time series data. Our experiments show that the performance of this method is competitive with that of processing ordinary queries using the index, and much faster than sequential scanning.
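The similarity criterion stated above is easy to make concrete. This is a minimal sketch of the definition only (the sequences and the threshold `epsilon` are made-up illustration values, not data from the paper):

```python
import math

def euclidean_distance(s, t):
    """Euclidean distance between two equal-length sequences."""
    assert len(s) == len(t), "sequences must have equal length"
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(s, t)))

def is_similar(s, t, epsilon):
    """Two sequences are similar if their distance is at most epsilon."""
    return euclidean_distance(s, t) <= epsilon

s = [1.0, 2.0, 3.0, 4.0]
t = [1.1, 2.1, 2.9, 4.2]
print(is_similar(s, t, epsilon=0.5))  # True: distance is about 0.26
```

Dimensionality-reduction schemes such as the adaptive framework above work because distances in the reduced space lower-bound the true distance, so this check can be applied cheaply to prune candidates before the exact comparison.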
Basic technologies for integrating data and information from disparate sources
This article discusses the basic technologies that are normally used in the various solutions and approaches to integrating data and information for enterprise information systems. We aim to identify the techniques that might be utilized in any such solution or approach to integrate data and information from disparate sources; that is, the basic and core technologies that work at the heart of data integration solutions. The article contains eight sections, each describing one technology. The eight basic technologies are: Information Extraction, Data Cleansing, Extensible Markup Language (XML), Schema Matching, Schema Mapping, Schema Standards, Web Dynamic Technologies, and Keyword Search.
Data leakage detection in single sign-on systems
In Single Sign-On (SSO) systems, clients are allowed to log in to multiple websites using a single user ID and password. In this setting, the single sign-on data distributor has given sensitive data to a set of supposedly trusted agents (third parties). Some of the data is leaked and found in an unauthorized place (e.g. on the web). The distributor must assess the likelihood that the leaked data came from one or more agents, as opposed to having been independently gathered by other means. This paper presents a novel approach to data allocation strategies (across the agents) that improves the probability of identifying leakages of user profile information. In some cases, we can also inject "realistic but fake" data records to further improve the chances of detecting leakage and identifying the guilty party.
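The guilt-assessment idea can be sketched as follows. This is an illustrative simplification, not the paper's probabilistic model: each agent is scored by the fraction of the leaked set its allocation could explain, and a fake record allocated to exactly one agent makes the ranking decisive, since only that agent could have leaked it.

```python
# Toy sketch of leakage attribution. Record names, agent names, and the
# scoring formula are all hypothetical illustration values.

def guilt_scores(leaked, allocations):
    """Map each agent to the fraction of leaked records found in its allocation."""
    return {
        agent: len(leaked & records) / len(leaked)
        for agent, records in allocations.items()
    }

allocations = {
    "agent_A": {"r1", "r2", "fake_A"},   # fake_A was given only to agent A
    "agent_B": {"r2", "r3", "fake_B"},   # fake_B was given only to agent B
}
leaked = {"r2", "fake_A"}                # records found in an unauthorized place

scores = guilt_scores(leaked, allocations)
print(max(scores, key=scores.get))       # prints "agent_A"
```

Here `r2` was given to both agents and so is inconclusive on its own, but the presence of `fake_A` in the leak points only at agent A; the allocation strategies in the paper aim to maximize exactly this kind of discriminating overlap.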
Location privacy in wireless sensor networks against a global eavesdropper
In sensor networks, many protocols have been developed to provide confidentiality, protect contextual information, and secure the content of messages transferred over the network. This becomes a complex problem for sensor networks (e.g., locating target objects in monitoring applications while also protecting that location information). There have been several recent works on location privacy, but they assume a constrained adversary that can capture network traffic only in a small area. The proposed system addresses location privacy in large sensor networks under a stronger adversary model: the global eavesdropper, which has become realistic and defeats existing techniques. We propose two techniques that protect location information, recurrent location reporting and provenance disguise. One provides a high level of location privacy, while the other provides a trade-off between privacy, communication cost, and latency. A monitoring-view method is used to observe the attacker within a small time window. These techniques are efficient and effective in shielding location information from the attacker.
A survey of current trends in road extraction from satellite images
Road network detection plays an important role in earth observation. Roads are detected from various satellite images, such as multispectral, pan-sharpened, SAR, and aerial images, which differ in resolution. In this survey, most of the surveyed road network detection methods use aerial, WorldView, and QuickBird images, since these offer very high resolution; only a few types of multispectral images have been used for road network detection. Researchers have focused on very high-resolution satellite imagery for road network detection.