Excellent agriculture using data mining techniques: a study
This paper presents an in-depth study of the application of data mining techniques in the field of agriculture. Data mining software applications based on a variety of methodologies have been developed by both commercial vendors and research centres, and these applications are also used to manage the agricultural ecosystem, including soil management. Recent technologies can now provide a great deal of information on agriculture-related activities, which can then be analyzed to uncover important patterns. Using data mining, efficient techniques can be developed and tailored to analyzing complex soil datasets.
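As a concrete illustration of the kind of soil-data analysis such studies survey, the sketch below clusters invented soil samples with a plain k-means routine. The feature names, sample values, and naive seeding strategy are all assumptions for illustration, not taken from the paper.

```python
import math

def kmeans(points, k, iters=10):
    """Plain k-means over numeric tuples; naive seeding with the first k points."""
    centroids = list(points[:k])
    clusters = []
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [
            tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Invented soil samples: (pH, moisture %)
samples = [(5.5, 30), (5.7, 32), (5.6, 28),   # acidic, moderately moist
           (7.8, 55), (8.0, 58), (7.9, 60)]   # alkaline, wet
centroids, clusters = kmeans(samples, k=2)
```

With two well-separated soil types, the routine recovers the acidic and alkaline groups as the two clusters.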
Security Maintenance System in Data Mining Using Distance Measure Techniques
Security is a central concern today. In this paper we focus on distance measures applied to protect individual sensitive information. Protecting data security is a key issue in data publishing: security-maintenance systems in data mining that use distance measure techniques typically aim to protect individual privacy with minimal impact on the quality of the released data. A number of models have recently been proposed to preserve security and/or to reduce information loss as much as possible, i.e., to make the anonymization strategy more flexible and closer to reality, and thus to meet the diverse needs of users; various proposals and algorithms have been designed along the way. In this paper we provide a survey of distance measure techniques for security preservation. We discuss the distance measure methods, the major achievements and strategies of distance measure algorithms, and summarize their advantages and disadvantages. We then demonstrate the work completed, and finally conclude with further research directions for distance measure techniques based on an analysis of existing work.
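To make the idea of a distance measure for privacy protection concrete, here is a hypothetical microaggregation-style sketch: records that are close under a normalized distance are grouped, and group centroids are released in their place. The attributes, records, and group size are invented for illustration and are not from the surveyed work.

```python
def record_distance(a, b, ranges):
    """Normalized Euclidean distance between two numeric records."""
    return sum(((x - y) / r) ** 2 for x, y, r in zip(a, b, ranges)) ** 0.5

def microaggregate(recs, k, ranges):
    """Greedy size-k grouping by distance; release each group's centroid."""
    pool = list(recs)
    released = []
    while pool:
        seed = pool.pop(0)
        # Pull the k-1 records nearest to the seed into its group.
        pool.sort(key=lambda r: record_distance(seed, r, ranges))
        group = [seed] + pool[:k - 1]
        pool = pool[k - 1:]
        centroid = tuple(sum(v) / len(group) for v in zip(*group))
        released.extend([centroid] * len(group))  # individuals become indistinguishable
    return released

# Invented records: (age, zip-code prefix), with assumed attribute ranges.
records = [(25, 100), (27, 102), (60, 400), (62, 398)]
ranges = (40, 300)
anon = microaggregate(records, 2, ranges)
```

Each released record is shared by at least k individuals, trading a small amount of accuracy (the distance to the centroid) for privacy.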
Introduction to Query Processing and Optimization
Query processing is the scientific art of obtaining the desired information from a database system in a predictable and reliable fashion. Database systems must be able to respond to requests for information from the user, i.e., to process queries. In large database systems, which may run in unpredictable and volatile environments, it is difficult to produce efficient query plans based on information available solely at compile time. Delivering database results in a timely manner is the province of query optimization. Efficient processing of queries is an important requirement in many interactive environments that involve massive amounts of data, and efficient query processing in domains such as the Web, multimedia search, and distributed systems has a great impact on performance. This paper introduces the basic concepts of query processing and query optimization in relational databases, and describes and compares query processing techniques used in them.
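The cost-based flavour of query optimization can be sketched in a few lines: enumerate candidate join orders and keep the cheapest under an estimated cost model. The table names, cardinalities, flat join selectivity, and sum-of-intermediate-sizes cost model below are all simplifying assumptions, not anything from the paper.

```python
from itertools import permutations

# Estimated row counts for three invented tables.
card = {"orders": 1_000_000, "customers": 10_000, "vip_customers": 100}
SELECTIVITY = 1e-4  # assumed fraction of row pairs that satisfy the join predicate

def plan_cost(order):
    """Cost of a left-deep join pipeline: sum of intermediate result sizes."""
    rows, cost = card[order[0]], 0
    for table in order[1:]:
        rows = rows * card[table] * SELECTIVITY  # estimated join output size
        cost += rows
    return cost

# Exhaustive enumeration is feasible for a handful of tables.
best = min(permutations(card), key=plan_cost)
```

Under this model the optimizer starts from the small tables and joins the million-row table last, which is exactly the intuition behind real cost-based join ordering.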
Computer viruses: their problems & major attacks in real life
Today’s enterprise networks are distributed across different geographical locations while applications are more centrally located, and information represents the most important asset. With the growing number of data communication services, channels, and available software applications, data are processed in large quantities and in an ever more efficient manner. This technological advance offers new and flexible opportunities, but it also poses significant security threats to networks. These threats can be external or internal; external threats include hacking, virus attacks, Trojans, worms, etc. There are thousands upon thousands of different viruses today, and they improve every day. Despite the wide spread of new and potent viruses, a virus still infects and spreads only with the user’s (often unwitting) permission. This paper highlights the phases of a computer virus, the history of the worst computer attacks, the types of computer viruses with their effects on computers and a few examples of each type, the working of a computer virus, and the problems viruses cause in computers.
A Proposed method for designing an intelligent system for optical handwritten character recognition
The accurate recognition of Latin-script, typewritten text is now considered largely a solved problem, with typical accuracy rates exceeding 99%, although certain applications demanding even higher accuracy require human review of errors. Other areas, including recognition of hand printing, cursive handwriting, and printed text in other scripts (especially those with a very large number of characters), are still the subject of active research. Recognition of cursive text in particular remains open, with recognition rates even lower than those for hand-printed text, and higher rates of recognition for general cursive script will likely not be possible without the use of contextual or grammatical information. For example, recognizing entire words from a dictionary is easier than trying to parse individual characters from script, and reading the amount line of a cheque (which is always a written-out number) is a case where a smaller dictionary can increase recognition rates greatly. Knowledge of the grammar of the language being scanned can also help determine whether a word is likely to be, say, a verb or a noun, allowing greater accuracy. The shapes of individual cursive characters simply do not contain enough information to recognize all handwritten cursive script accurately (at greater than 98%). It is important to understand that OCR is a basic technology that also underlies advanced scanning applications; an advanced scanning solution can therefore be unique and patented, and not easily copied, despite being built on this basic OCR technology. In this paper, we propose an intelligent system for optical character recognition using an artificial neural network (ANN) based approach, in which a feature extraction algorithm is applied before the ANN classifies characters, promising increased efficiency in character recognition.
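A minimal sketch of the shape of such a pipeline, using invented 4x4 glyphs: a zoning feature extractor (ink density per zone) feeds a classifier. A nearest-centroid rule stands in for the paper's ANN purely to keep the example self-contained; it is not the proposed method.

```python
def zoning_features(bitmap):
    """Split a 4x4 binary bitmap into four 2x2 zones; return ink density per zone."""
    feats = []
    for zr in (0, 2):
        for zc in (0, 2):
            ink = sum(bitmap[r][c] for r in (zr, zr + 1) for c in (zc, zc + 1))
            feats.append(ink / 4.0)
    return feats

def classify(bitmap, prototypes):
    """Return the label whose prototype feature vector is closest."""
    f = zoning_features(bitmap)
    return min(prototypes,
               key=lambda lbl: sum((a - b) ** 2 for a, b in zip(f, prototypes[lbl])))

# Invented glyphs: 'L' has ink down the left and along the bottom;
# 'T' has ink across the top and down the middle.
L_glyph = [[1,0,0,0],[1,0,0,0],[1,0,0,0],[1,1,1,1]]
T_glyph = [[1,1,1,1],[0,1,0,0],[0,1,0,0],[0,1,0,0]]
prototypes = {"L": zoning_features(L_glyph), "T": zoning_features(T_glyph)}

noisy_L = [[1,0,0,0],[1,0,0,0],[1,0,1,0],[1,1,1,1]]  # 'L' with one stray pixel
predicted = classify(noisy_L, prototypes)
```

The point of the feature extraction step is visible even at this toy scale: the stray pixel barely moves the zone densities, so the noisy glyph still lands nearest the 'L' prototype.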
A review of the prevention of jamming attacks in wireless networks using an internal threat model
When, in a network, multiple clients send multiple messages from multiple computers to one computer at the same time, jamming can occur at an intermediate node, allowing opponents to easily target messages of high importance and retrieve the data they carry. This problem is overcome by using cryptographic primitives to prevent jamming attacks. Under an internal threat model, the RSA algorithm provides security for message transmission over the network. The schemes for preventing jamming attacks include Strong Hiding Commitment Schemes (SHCS), Cryptographic Puzzle Hiding Schemes (CPHS), and All-Or-Nothing Transformation Hiding Schemes (AONTSHS).
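The commit-then-reveal idea behind a strong hiding commitment scheme (SHCS) can be sketched as follows: the sender transmits a keyed, hidden form of the packet together with a commitment, so a jammer cannot classify the packet in real time, and reveals the key afterwards. The XOR-plus-SHA-256 construction here is illustrative only and far simpler than real SHCS constructions.

```python
import hashlib
import os

def commit(message: bytes):
    """Hide a packet under a fresh random key and commit to (key, message)."""
    key = os.urandom(len(message))
    hidden = bytes(m ^ k for m, k in zip(message, key))  # sent immediately
    commitment = hashlib.sha256(key + message).digest()  # sent immediately
    return commitment, hidden, key                       # key revealed later

def open_commit(commitment, hidden, key):
    """Recover the packet once the key arrives; verify the commitment."""
    message = bytes(h ^ k for h, k in zip(hidden, key))
    if hashlib.sha256(key + message).digest() != commitment:
        raise ValueError("commitment check failed")
    return message

commitment, hidden, key = commit(b"route update: node 7 -> node 3")
```

Until the key is revealed, the eavesdropping jammer sees only `hidden` and `commitment`, neither of which identifies the packet; the commitment also prevents the sender from swapping the message after the fact.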
Medical Image Viewing Over Mobile Devices Using 3G Network
The future of healthcare delivery systems and telemedicine applications has undergone a tremendous change due to e-health. E-health arose from the integration of networks and telecommunications, and deals with collecting, sorting, and transferring medical data from distant locations for remote medical collaboration and diagnosis. Medical information comes in multidimensional or multi-resolution form, which creates an enormous amount of data; efficient storage, retrieval, management, and transmission of this voluminous data is extremely complex. The solution to this complexity is to compress the medical data in a way that does not compromise diagnostic capability: in a medical image, only a small portion may be diagnostically useful, but the cost of a wrong interpretation is high. Combining lossless and lossy compression schemes with secure transmission plays a key role in telemedicine applications, supporting accurate diagnosis and research. In this paper, we propose a combined compression method for Digital Imaging and Communications in Medicine (DICOM) images. The method compresses the region of interest with a lossless image compression technique, namely predictive coding, while the remaining area of the image (outside the region of interest) is compressed with a near-lossless technique, namely the DCT. The image is later reconstructed by merging the region of interest with the non-region-of-interest area to obtain the compressed image, which is then sent over a wireless network using a 3G connection for fast and error-free transmission, to be accessed by authorized users on mobile devices.
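A toy sketch of the two-tier idea, assuming one-dimensional pixel rows: the region of interest is coded losslessly with a previous-pixel predictor (exact residuals), while the background uses coarse quantization as a stand-in for the paper's DCT stage. The pixel values are invented.

```python
def encode_row(row, lossless):
    """Previous-pixel predictive coding; lossy mode quantizes before predicting."""
    prev, out = 0, []
    for px in row:
        if lossless:
            out.append(px - prev)       # exact residual: fully recoverable
            prev = px
        else:
            q = round(px / 16) * 16     # coarse quantization: smaller residuals
            out.append(q - prev)
            prev = q
    return out

def decode_row(residuals):
    """Invert the predictor by accumulating residuals."""
    prev, out = 0, []
    for r in residuals:
        prev += r
        out.append(prev)
    return out

roi_row = [120, 122, 121, 125]   # diagnostically important pixels
bg_row = [30, 33, 29, 31]        # background pixels
```

The region of interest round-trips bit-exactly, while the background is reconstructed only approximately, which is the trade-off the combined scheme exploits.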
Performance analysis of algorithms using OpenMP
Parallel programming represents the next turning point in how software engineers write software. Today, low-cost multicore processors are widely available for both desktop and laptop computers. As a result, applications will increasingly need to be parallelized to fully exploit the multicore throughput gains that are becoming available. Unfortunately, writing parallel code is more complex than writing serial code, and this is where the OpenMP programming model enters the parallel computing picture: OpenMP helps developers create multithreaded applications more easily while retaining the look and feel of serial programming. Algorithm performance analysis is a systematic and quantitative approach to constructing software systems that meet performance objectives such as response time, throughput, scalability, and resource utilization. This paper presents the performance (speedup) of parallel algorithms on a multi-core system; the experimental results on a multi-core processor show that the proposed parallel algorithms achieve good performance compared to their sequential counterparts.
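Since OpenMP itself targets C/C++/Fortran, the speedup analysis such papers perform can be illustrated analytically with Amdahl's law instead of an actual OpenMP run. The 90% parallel fraction below is an assumed workload property, not a measurement from the paper.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With 90% of the work parallelizable, 4 cores give roughly 3x,
# and even unboundedly many cores cannot exceed 10x (1 / 0.1).
s4 = amdahl_speedup(0.9, 4)
```

This is why measured multicore speedups, like the ones reported in the paper, flatten out below the core count: the serial fraction dominates as cores are added.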
Architecture for Secure Cloud Computing using Garbled Circuits
The security challenges of outsourcing data and computation to the cloud are increasing. Many hardware-based and fully homomorphic encryption solutions exist for cloud computing security, but the hardware-based solutions do not scale, and fully homomorphic encryption is not yet practical due to its low efficiency, so clients must trust the service provider with the confidentiality and integrity of their data and computation. In this paper, we describe an architecture for securely outsourcing data and computation that provides both confidentiality and integrity. In this architecture, a private cloud performs encryption and decryption of data and computations, using a garbled circuit for DNA matching based on the Levenshtein distance, while a public cloud carries out the processing of operations. A comparison of our architecture with other techniques can be carried out to establish its efficiency for secure outsourcing of data and computation, and the design can be applied to other applications involving large mathematical computations.
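The Levenshtein distance evaluated inside the garbled circuit is, in the clear, a standard dynamic program. The sketch below shows that computation on two invented DNA strings; the garbled-circuit evaluation itself is out of scope here.

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance via the classic row-by-row dynamic program."""
    prev = list(range(len(b) + 1))            # distances from "" to prefixes of b
    for i, ca in enumerate(a, 1):
        cur = [i]                             # distance from a[:i] to ""
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,             # deletion
                           cur[j - 1] + 1,          # insertion
                           prev[j - 1] + (ca != cb)))  # substitution (0 if match)
        prev = cur
    return prev[-1]

dna_distance = levenshtein("GATTACA", "GACTATA")
```

Each cell depends only on three neighbours via additions, comparisons, and a minimum, which is precisely why the computation maps naturally onto a Boolean circuit that can then be garbled.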
Clustering and mining multi-versioned XML documents
Clustering is a process that partitions data so that homogeneous data items are grouped into sets referred to as clusters. Clustering of multi-version XML documents must be revisited whenever their content or structure changes over time. In real-world applications, the number of changes from one version of an XML document to the next cannot be predicted, and it is always possible that an initial clustering solution becomes obsolete after a document is modified. XML clustering algorithms rely on calculating pair-wise distances between documents, so a time-efficient technique must determine these distances in a timely manner. In this paper we propose a time-efficient technique for reassessing pair-wise distances between clustered multi-version XML documents that change over time, without performing redundant calculations: we take into account the previously known distances and the set of changes that may have affected the document versions. Mining is the process of searching the XML documents in the formed clusters and extracting particular data from a matching XML document. For mining, we maintain a metric structure that records which cluster a particular record belongs to; when a report is needed, we consult this structure to locate the relevant records and directly access the files inside that cluster.
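The time-saving idea can be sketched directly: cache pair-wise distances and, when a new version arrives, recompute only pairs involving a changed document. The toy documents (sets of element names) and the symmetric-difference distance are invented stand-ins for real XML structural distances.

```python
from itertools import combinations

def distance(doc_a, doc_b):
    """Toy structural distance: size of the symmetric difference of element names."""
    return len(doc_a ^ doc_b)

def update_distances(docs, cache, changed):
    """Refresh only the cache entries that touch a changed document."""
    recomputed = 0
    for i, j in combinations(sorted(docs), 2):
        if (i, j) not in cache or i in changed or j in changed:
            cache[(i, j)] = distance(docs[i], docs[j])
            recomputed += 1
    return recomputed

docs = {"d1": {"a", "b"}, "d2": {"a", "c"}, "d3": {"b", "c"}}
cache = {}
update_distances(docs, cache, changed=set())       # initial pass: all 3 pairs
docs["d2"] = {"a", "c", "d"}                       # a new version of d2 arrives
n = update_distances(docs, cache, changed={"d2"})  # only the 2 pairs touching d2
```

With N documents of which c changed, a full pass costs O(N^2) distance computations while the incremental pass costs only O(cN), which is the saving the paper targets.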