Design of a frequent pattern mining architecture based on systolic trees
Frequent pattern mining algorithms are designed to find commonly occurring itemsets in databases. This class of algorithms is typically very memory intensive, leading to prohibitive runtimes on large databases. A class of reconfigurable architectures has recently been developed that shows promise in accelerating some data mining applications. In this paper, we propose a new architecture for frequent pattern mining based on a systolic tree structure. The goal of this architecture is to mimic the internal memory layout of the original pattern mining software algorithm while achieving higher throughput. We provide a detailed analysis of the area and performance requirements of our systolic tree-based architecture, and show that our reconfigurable platform is faster than the original software algorithm for mining long frequent patterns.
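The "internal memory layout" the architecture mimics is presumably an FP-growth-style prefix tree of transactions; the abstract does not specify the software algorithm, so the following is a minimal sketch of that assumed structure, with item names and the support threshold chosen for illustration.

```python
from collections import defaultdict

class Node:
    """One node of a prefix tree of transactions, the in-memory layout
    an FP-growth-style miner maintains (assumed here, not stated in the paper)."""
    def __init__(self, item):
        self.item = item
        self.count = 0
        self.children = {}

def build_tree(transactions, min_support):
    # First pass: count item frequencies and keep only frequent items.
    freq = defaultdict(int)
    for t in transactions:
        for item in t:
            freq[item] += 1
    frequent = {i for i, c in freq.items() if c >= min_support}
    # Second pass: insert each transaction, items ordered by descending frequency,
    # so shared prefixes collapse into shared tree paths.
    root = Node(None)
    for t in transactions:
        items = sorted((i for i in t if i in frequent), key=lambda i: (-freq[i], i))
        node = root
        for item in items:
            child = node.children.setdefault(item, Node(item))
            child.count += 1
            node = child
    return root

txns = [["a", "b"], ["b", "c"], ["a", "b", "c"], ["a", "b"]]
tree = build_tree(txns, min_support=2)
# "b" occurs in all four transactions, so it sits directly under the root.
print(tree.children["b"].count)
```

Because frequent items share prefix paths, support counts for long patterns are read off tree paths rather than rescanning the database, which is the access pattern a systolic tree can pipeline.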
Handling MANET routing attacks using a risk-aware mitigation mechanism with extended D-S theory
Mobile Ad hoc Networks (MANETs) are highly vulnerable to attacks due to the dynamic nature of their network infrastructure. Among these attacks, routing attacks have received considerable attention since they can cause the most devastating damage to a MANET. Even though several intrusion response techniques exist to mitigate such critical attacks, existing solutions typically attempt to isolate malicious nodes based on binary or naïve fuzzy response decisions. However, binary responses may result in unexpected network partition, causing additional damage to the network infrastructure, and naïve fuzzy responses can lead to uncertainty in countering routing attacks in a MANET. In this paper, we propose a risk-aware response mechanism to systematically cope with identified routing attacks. Our risk-aware approach is based on an extended Dempster-Shafer mathematical theory of evidence introducing a notion of importance factors. In addition, our experiments demonstrate the effectiveness of our approach with respect to several performance metrics.
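The paper's extension adds importance factors to Dempster-Shafer theory; that extension is not specified in the abstract, so the sketch below shows only the classical core it builds on: Dempster's rule of combination fusing two pieces of evidence about whether a node is attacking. The mass values and the two-hypothesis frame are illustrative assumptions.

```python
def combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments.
    Keys are frozensets of hypotheses (the full frame = "don't know");
    masses in each assignment sum to 1."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # contradictory evidence
    # Normalize by the total non-conflicting mass.
    k = 1.0 - conflict
    return {h: m / k for h, m in combined.items()}

ATTACK, SAFE = frozenset({"attack"}), frozenset({"safe"})
THETA = ATTACK | SAFE  # the whole frame of discernment: uncertainty
m1 = {ATTACK: 0.6, SAFE: 0.1, THETA: 0.3}   # e.g. an IDS alert
m2 = {ATTACK: 0.5, SAFE: 0.2, THETA: 0.3}   # e.g. a routing-table anomaly
fused = combine(m1, m2)
```

Fusing the two observations concentrates belief on the `attack` hypothesis while keeping an explicit residual of uncertainty, which is what lets a risk-aware responder choose graduated actions instead of a binary isolate/ignore decision.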
Implementation of a new secured data compression technique using Huffman coding and a symmetric key algorithm
Data compression is a common requirement for most computerized applications. There are many unsecured data compression algorithms dedicated to compressing different unsecured data formats, and even for a single data type there are several compression algorithms that use different approaches. In this research, we propose a simple and efficient data compression algorithm, particularly suited to commercial use, that operates in a secured manner. Our intention is to transmit text data in an open environment in both secured and compressed form. The scheme uses a double compression technique based on the Huffman coding algorithm and a simple symmetric key algorithm. Experiments evaluate the performance of the new secured data compression algorithm against other data compression algorithms.
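The abstract names Huffman coding followed by a "simple symmetric key algorithm" but does not define the key step, so the sketch below pairs a standard Huffman encoder with a repeating-key XOR stream as a stand-in for the unspecified cipher; the sample text and key are illustrative.

```python
import heapq
from collections import Counter
from itertools import cycle

def huffman_codes(text):
    """Build a Huffman code table from symbol frequencies.
    Heap entries carry a tiebreaker int so dicts are never compared."""
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, i, merged))
        i += 1
    return heap[0][2]

def compress_then_encrypt(text, key):
    codes = huffman_codes(text)
    bits = "".join(codes[c] for c in text)
    bits += "0" * (-len(bits) % 8)  # pad to a byte boundary
    packed = bytes(int(bits[j:j + 8], 2) for j in range(0, len(bits), 8))
    # Repeating-key XOR stands in for the paper's unspecified symmetric key step.
    return bytes(b ^ k for b, k in zip(packed, cycle(key))), codes

plaintext = "compress and secure this text"
cipher, codes = compress_then_encrypt(plaintext, b"secret")
print(len(cipher), "<", len(plaintext))
```

Compressing before encrypting is the usual ordering: Huffman output is close to incompressible, so encrypting first would destroy the redundancy compression relies on.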
Optimised layered approach to congestion control with efficient energy and QoS improvement for wireless networks
The Internet is widely used in the fast-growing world, yet wireless networks have many limitations. Two main failures occur in such networks: wireless packet loss and congestion. The performance of SCTP in wireless networks is an active research area, and identifying failures in the wireless network is important; this paper analyses them. The route is established by an on-demand routing technique in which features such as node energy, node location, and packet bandwidth are analysed. When packets need to be sent, the protocol analyses these features and establishes a route through an efficient node one hop away, using the SCTP transport protocol. Transmitting packets drains energy from a node, so there is a high chance of congestion as the node's capacity is reduced. This can be avoided with a proactive approach that reduces the traffic sent by the source when a node's energy reaches a threshold. Using SCTP's multi-homing, packets can continue to be sent until an alternate route is re-established. Thus the packet loss rate and end-to-end delay are reduced and QoS is improved.
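The proactive step described above can be sketched as a simple rate-adjustment rule; the multiplicative backoff factor, minimum rate, and energy units are illustrative assumptions, since the abstract only states that the source reduces traffic when a node's energy reaches a threshold.

```python
def next_send_rate(current_rate, node_energy, threshold, min_rate=1.0, backoff=0.5):
    """Proactive congestion avoidance sketch: when a relay node's remaining
    energy falls to the threshold, the source multiplicatively backs off its
    sending rate instead of waiting for packet loss to signal congestion."""
    if node_energy <= threshold:
        return max(min_rate, current_rate * backoff)
    return current_rate

# Healthy relay: rate is unchanged. Depleted relay: rate is halved.
print(next_send_rate(100.0, node_energy=90.0, threshold=25.0))
print(next_send_rate(100.0, node_energy=20.0, threshold=25.0))
```

Reacting to the energy signal before the node saturates is what distinguishes this from SCTP's loss-driven congestion control, which only backs off after packets are already dropped.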
Hybrid approach for detection of events in sleep EEG
The wide variety of waveforms in EEG signals, and the highly non-stationary nature of many of them, is one of the main difficulties in developing automatic detection systems. In sleep stage classification, a relevant transient waveform is the K-complex. This report describes the development of a new fuzzy-neural algorithm for automatic K-complex detection from raw EEG data. The fuzzy c-means algorithm performs rough, rapid recognition of K-complexes, and a neural network classifier then performs exact evaluation of the detected K-complexes. This pattern recognition technique is a hardware-independent solution for the biomedical signal processing field, and it provides a significant criterion for the objective assessment of a patient's sleep quality.
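The rough-recognition stage named above is fuzzy c-means; a minimal sketch of that algorithm on a single scalar feature per EEG window follows. The feature values, fuzzifier m=2, and two-cluster setup are illustrative assumptions — the paper's actual feature extraction is not described in the abstract.

```python
def fcm(data, c=2, m=2.0, iters=30):
    """Fuzzy c-means on 1-D feature values: unlike hard K-means, every sample
    gets a membership degree in every cluster, so borderline EEG windows can
    be passed on to a second-stage classifier instead of being forced into one bin."""
    centers = [min(data), max(data)][:c]  # crude but deterministic initialization
    u = []
    for _ in range(iters):
        # Membership update: u_i(x) = 1 / sum_k (d_i / d_k)^(2/(m-1))
        u = []
        for x in data:
            d = [abs(x - v) + 1e-12 for v in centers]
            u.append([1.0 / sum((d[i] / d[k]) ** (2 / (m - 1)) for k in range(c))
                      for i in range(c)])
        # Center update: mean of the data weighted by u^m
        centers = [
            sum((u[j][i] ** m) * data[j] for j in range(len(data))) /
            sum(u[j][i] ** m for j in range(len(data)))
            for i in range(c)
        ]
    return centers, u

# Two amplitude regimes: background EEG vs K-complex-like high-amplitude windows.
vals = [0.1, 0.2, 0.15, 1.0, 1.1, 0.95]
centers, u = fcm(vals)
print(sorted(centers))
```

Windows whose membership in the high-amplitude cluster exceeds a chosen cutoff would be the candidates handed to the neural network for exact evaluation.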
Satellite Imagery Land Cover Classification using the K-Means Clustering Algorithm: Computer Vision for Environmental Information Extraction
Segmentation and classification of high resolution satellite imagery is a challenging problem because it is no longer meaningful to carry out this task on a pixel-by-pixel basis. The fine spatial resolution implies that each object is an aggregation of a number of pixels in close spatial proximity, and accurate classification requires that this aspect be carefully considered. The K-means clustering algorithm is a better method for classifying high resolution satellite imagery. The extracted regions are classified using a minimum distance decision rule: several regions are selected as training samples, and each region is compared to the training samples and assigned to its closest class. The procedure significantly reduces the mixed-pixel problem suffered by most pixel-based methods. In this paper, we used the K-means clustering algorithm to classify satellite imagery into the specific objects within it for cadastral and environmental planning purposes, thereby mitigating the problems above and achieving better classification accuracy, with an overall accuracy of 88.889% and a Kappa value of 0.835.
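The core of the pipeline above is standard K-means (Lloyd's algorithm) with a minimum-distance assignment rule; a self-contained sketch on toy two-band pixel vectors follows. The "spectral signature" values and two-cluster setup are illustrative, not the paper's data.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's K-means on feature vectors (e.g. per-pixel spectral bands)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Minimum-distance decision rule: assign to the nearest centroid.
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        # Recompute each centroid as the mean of its cluster (keep old one if empty).
        centroids = [
            tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids

# Two well-separated "spectral signatures": water-like vs vegetation-like pixels.
pixels = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8), (0.95, 0.85)]
cents = kmeans(pixels, k=2)
print(sorted(cents))
```

In the paper's setting the same minimum-distance rule is applied a second time at the region level, comparing each extracted region to labeled training regions, which is what mitigates the mixed-pixel problem.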
Wireless pulmonary disease prediction system
Many chronic and acute diseases affect patients' respiration. Monitoring vital body signs such as respiration is becoming increasingly important, especially in view of the general aging of the population and the associated increase in chronic diseases. Quantitative, semi-quantitative or even qualitative monitoring of respiration may help to detect pulmonary disorders such as apnea, Cheyne-Stokes respiration or Biot's respiration. When used in a home setting, respiratory monitoring should be easy to use and not constricting for the patient. Moreover, because long-term monitoring entails a direct connection between the patient and the system, the integration of such systems into clothing is desirable. A novel approach is presented for non-constricting long-term monitoring of respiration, which could be particularly suitable for home care applications. The system is based on textile-integrated force sensors, which detect expansion of the thorax during respiration and allow wireless data transmission for maximum mobility. Possible applications include long-term monitoring of patients with chronic pulmonary diseases, early recognition of disease, and performance measurement of athletes during exercise, where such systems could also be used to optimize training. Results from performance tests under various conditions are presented.
A Novel Approach for Resolving Watermark Disputes through Watermark Authentication Server
Due to the rapid development of the internet, perfect copying and illegal use of digital data have become easy, which calls for new mechanisms to protect all forms of digital data. The purpose of this paper is to propose a novel approach to protecting ownership rights and resolving disputes over the unauthorized addition of a second watermark to already watermarked data, through the use of a watermark authentication server (WAS). The watermark authentication server serves as a trusted third party and solves the problem of deadlock in watermarking.
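The abstract does not specify the WAS protocol, so the following is a hypothetical sketch of the trusted-third-party idea: the server records only the first watermark registration for a work, so a later second watermark cannot create an ownership deadlock. The class name, SHA-256 fingerprinting, and counter-based timestamps are all illustrative assumptions.

```python
import hashlib
import itertools

class WatermarkAuthServer:
    """Hypothetical trusted third party: keeps the earliest registration of
    each work's fingerprint, so disputes resolve to the first registrant."""
    def __init__(self):
        self._registry = {}
        self._clock = itertools.count(1)  # stand-in for trusted timestamps

    def register(self, work_bytes, owner):
        digest = hashlib.sha256(work_bytes).hexdigest()
        # Only the first registration for a given work is accepted.
        if digest not in self._registry:
            self._registry[digest] = (owner, next(self._clock))
            return True
        return False

    def resolve(self, work_bytes):
        """Return the earliest registered owner for the work, if any."""
        entry = self._registry.get(hashlib.sha256(work_bytes).hexdigest())
        return entry[0] if entry else None

was = WatermarkAuthServer()
was.register(b"watermarked-image-v1", "alice")
was.register(b"watermarked-image-v1", "mallory")  # second-watermark attempt: rejected
print(was.resolve(b"watermarked-image-v1"))
```

The deadlock the paper refers to arises when two parties each hold a watermarked copy and neither embedding order can be proven; an authoritative first-registration record breaks that symmetry.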
A review of feature selection models for classification
The success of a machine learning algorithm depends on the quality of its data. The data given for classification should not contain irrelevant or redundant attributes, which invariably increase processing time; the data set selected for classification should contain the right attributes for accurate results. Feature selection is therefore an essential data processing step prior to applying a learning algorithm. Here we discuss some basic feature selection models and evaluation functions. Experimental results are compared for individual datasets with the filter and wrapper models.
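Of the two models compared, the filter model is the simpler to illustrate: it scores each attribute independently of any learning algorithm, whereas a wrapper re-trains a classifier per candidate subset. The crude class-separation statistic and toy data below are illustrative assumptions, not the review's evaluation functions.

```python
def filter_scores(X, y):
    """Filter-model feature ranking: score each feature by the separation of
    its class means relative to within-class spread, with no classifier involved."""
    n_features = len(X[0])
    scores = []
    for f in range(n_features):
        pos = [row[f] for row, label in zip(X, y) if label == 1]
        neg = [row[f] for row, label in zip(X, y) if label == 0]
        mean_p, mean_n = sum(pos) / len(pos), sum(neg) / len(neg)
        var = (sum((v - mean_p) ** 2 for v in pos) +
               sum((v - mean_n) ** 2 for v in neg)) / len(X)
        scores.append(abs(mean_p - mean_n) / (var ** 0.5 + 1e-12))
    return scores

# Toy data: feature 0 separates the classes, feature 1 is noise.
X = [[1.0, 0.3], [1.1, 0.9], [0.9, 0.5], [5.0, 0.4], [5.2, 0.8], [4.9, 0.6]]
y = [0, 0, 0, 1, 1, 1]
scores = filter_scores(X, y)
print(scores[0] > scores[1])  # the informative feature ranks higher
```

A wrapper would instead loop over feature subsets, fit the target classifier on each, and keep the subset with the best validation accuracy — more accurate for that classifier, but far more expensive than the single pass above.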
Design of a parallel vector/scalar floating point co-processor for FPGAs
Current FPGA soft processor systems use dedicated hardware modules or accelerators to speed up data-parallel applications. This work explores an alternative approach: using a soft vector processor as a general-purpose accelerator. Due to the complexity and expense of floating point hardware, such algorithms are usually converted to fixed point operations or implemented using floating point emulation in software. As the technology advances, more homogeneous computational resources and fixed-function embedded blocks are added to FPGAs, and implementing floating point hardware becomes a feasible option. In this research we have implemented a high performance, autonomous floating point vector co-processor (FPVC) that works independently within an embedded processor system. We present a unified approach to vector and scalar computation, using a single register file for both scalar operands and vector elements. The FPVC is completely autonomous from the embedded processor, exploiting parallelism and achieving greater speedup than alternative vector processors. The FPVC also supports scalar computation, so that loops can be executed independently of the main embedded processor.
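The unified register file idea above can be illustrated behaviorally: scalars and vector elements live in one flat bank, and a scalar operation is just a length-1 vector operation on the same datapath. The register count, addressing, and operation set below are illustrative assumptions, not the FPVC's actual microarchitecture.

```python
class UnifiedRegisterFile:
    """Behavioral sketch of a unified scalar/vector register file: one flat
    bank of floating point registers, where a scalar is a length-1 view."""
    def __init__(self, n_regs=64):
        self.regs = [0.0] * n_regs

    def write(self, base, values):
        """Write a scalar (length 1) or a vector starting at register `base`."""
        self.regs[base:base + len(values)] = values

    def vadd(self, dst, src_a, src_b, vlen):
        """Element-wise add of two length-`vlen` operands; vlen=1 is a scalar add,
        so scalar and vector instructions share one datapath and one namespace."""
        for i in range(vlen):
            self.regs[dst + i] = self.regs[src_a + i] + self.regs[src_b + i]

rf = UnifiedRegisterFile()
rf.write(0, [1.0, 2.0, 3.0, 4.0])      # vector operand in r0..r3
rf.write(8, [10.0, 20.0, 30.0, 40.0])  # vector operand in r8..r11
rf.vadd(16, 0, 8, vlen=4)              # vector add into r16..r19
rf.vadd(24, 0, 8, vlen=1)              # same instruction, scalar add into r24
print(rf.regs[16:20], rf.regs[24])
```

Sharing one file avoids the scalar-to-vector transfer instructions a split design needs, which is part of what lets loops (address arithmetic and all) run entirely on the co-processor.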