Effect of Method of Curing on the Compressive Strength of Pavement Concrete Produced with Blended Cement and Hardening Accelerator
This experimental study examines the effect of the method of curing on the compressive strength of pavement concrete produced with blended cement and a non-chloride hardening accelerator. Portland Pozzolana Cement (PPC) was used in the production of the concrete mixtures, which were designed as per the guidelines of IS 10262: 2009. The accelerator dosage was varied from 2 liters to 5 liters per cubic meter of concrete in seven equal intervals. The compressive strength of standard 150 mm cube specimens, at early age and at full maturity, cured with water and alternatively with a wax-based membrane-forming curing compound, was studied. The performance of the accelerator at a given age of concrete was assessed from the maximum percentage increase in compressive strength, measured with reference to the strength of the control mixture (without accelerator) at the corresponding age. The average efficiency of the curing compound at a given age, calculated as the ratio of the average compressive strength of concrete cured with the compound to that of water-cured concrete, was also studied. The test results revealed that the type of curing did not affect the optimum performance of the accelerator, and that the average efficiency of the curing compound was higher at early age than at later ages.
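The two performance measures described in the abstract can be sketched as simple ratios. The strength values below are hypothetical placeholders, not data from the study:

```python
# Sketch of the two measures described above: percentage strength increase
# over the control mix, and curing-compound efficiency as a strength ratio.
# All numeric values here are illustrative assumptions (MPa), not study data.

def percent_strength_increase(accelerated, control):
    """Percentage increase in compressive strength over the control mixture."""
    return 100.0 * (accelerated - control) / control

def curing_efficiency(compound_cured, water_cured):
    """Ratio of average strength under the curing compound to water curing."""
    return compound_cured / water_cured

# Illustrative values only:
print(percent_strength_increase(34.5, 30.0))
print(curing_efficiency(28.5, 30.0))
```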
Security issues in cloud computing
Cloud computing has quickly become one of the most prominent buzzwords in the IT world due to its revolutionary model of computing as a utility. It promises increased flexibility, scalability, and reliability, while promising decreased operational and support costs. It is much more than simple Internet access: it is a construct that allows users to access applications that actually reside at a location other than the user's own computer or other Internet-connected device. In this paper, we discuss security risks and concerns in cloud computing and outline steps that an enterprise can take to reduce security risks and protect its resources. We also explain cloud computing's strengths and benefits, its weaknesses, and its applicable areas in information risk management. Cloud computing is continuously evolving, and several major providers such as Amazon, Google, Microsoft, and Yahoo offer services such as Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), Storage-as-a-Service, and Infrastructure-as-a-Service (IaaS); this paper discusses some of the services being provided.
Dynamic distribution of electricity in multiple areas by a single power generating unit through the simplex method
Dynamic load sharing in the power resources domain provides a dynamic way of handling power shortages. In this paper, we propose the simplex method, together with ratio-and-proportion calculations, current and electricity formulae, and electrical engineering concepts, to share power dynamically. The power distribution network is logically divided into various layers, and the resource need at each layer is calculated from three parameters: power normal (Pn), power excess (Pe), and power deficient (Pd). Two levels of repositories are used: a local repository, which holds the threshold value at each layer, and a global repository, installed at the grid (thermal power plant), which aggregates the values from all levels. We introduce a threshold calculation that applies ratio-and-proportion functionality at each layer to compute the values of Pe, Pn, and Pd. Message transmission between layers is handled by a message control system using clock-triggered pulses, which sends a series of bits identifying whether the power need is normal, deficient, or excess. A load frequency control system maintains the load demand and the machine frequency in the generating power plant.
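The layer-wise classification into Pn, Pe, and Pd states can be sketched as a threshold check on the supply-to-demand ratio. The tolerance band and the sample figures below are assumptions for illustration, not values from the paper:

```python
# Hedged sketch of the layer-wise threshold check described above. The
# 5% tolerance band and the kW figures are hypothetical assumptions.

def classify_power(generated_kw, demand_kw, tolerance=0.05):
    """Label a layer's state as normal (Pn), excess (Pe) or deficient (Pd)
    by comparing supply to demand within a proportional tolerance band."""
    ratio = generated_kw / demand_kw
    if ratio > 1 + tolerance:
        return "Pe"   # excess power: can be offered to other layers
    if ratio < 1 - tolerance:
        return "Pd"   # deficient: request power via the global repository
    return "Pn"       # normal: within the threshold band

print(classify_power(120, 100))  # Pe
print(classify_power(90, 100))   # Pd
print(classify_power(101, 100))  # Pn
```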
Model-to-Model Transformation with a Modeling Approach: From UML to an IoC Application Model
The continuing evolution of business needs and technology makes applications more demanding in terms of development, maintenance, usability, and management. To cope with this complexity, various frameworks and patterns are integrated to produce stable, maintainable, and testable code. Given this diversity of solutions, the generation of code from UML models has become important. This paper presents, after establishing the different meta-models, the application of MDA (Model Driven Architecture) to generate, from a UML model, code following the IoC (Inversion of Control) and DAO (Data Access Object) patterns. The model-to-model transformations are clearly and formally established using the standard MOF 2.0 QVT (Meta-Object Facility 2.0 Query/View/Transformation) as the transformation language. The transformation rules defined in this paper can generate, from the class diagram, an XML file containing the business and data access packages. This file can then be used to generate the code required by an architecture built on the IoC and DAO patterns.
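The final step the abstract describes, producing an XML file with business and data-access packages from class-diagram elements, can be sketched in miniature. The element and attribute names below are assumptions for illustration; they are not the paper's QVT transformation rules:

```python
# A minimal, purely illustrative model-to-code-generation step in the
# spirit of the paper: mapping class names to an XML file holding a
# business package and a data access package. All XML element and
# attribute names here are hypothetical, not the paper's schema.
import xml.etree.ElementTree as ET

def classes_to_xml(class_names):
    """Emit an XML document with one service and one DAO per class."""
    root = ET.Element("application")
    business = ET.SubElement(root, "businessPackage")
    dao_pkg = ET.SubElement(root, "dataAccessPackage")
    for name in class_names:
        ET.SubElement(business, "service", {"name": name + "Service"})
        ET.SubElement(dao_pkg, "dao", {"name": name + "Dao"})
    return ET.tostring(root, encoding="unicode")

print(classes_to_xml(["Customer"]))
```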
Performance analysis of contrast enhancement using various statistical operations and neighborhood processing
Histogram equalization is a simple and effective contrast enhancement technique. Despite its popularity, histogram equalization has some limitations: it produces artifacts and unnatural images, and it does not consider local details. Because of these limitations, many other equalization techniques have been derived from it with various improvements. Statistics plays an important role in image processing, where statistical operations are applied to an image to obtain a desired result, such as the manipulation of brightness and contrast. Accordingly, a novel algorithm using statistical operations and neighborhood processing is proposed in this paper; the algorithm has proven effective in contrast enhancement, both in theory and in experiment.
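One common way to combine neighborhood processing with statistical operations is to amplify each pixel's deviation from its local mean. The sketch below illustrates that general idea; the 3x3 window and the gain factor are assumptions, not the paper's exact algorithm:

```python
# Hedged sketch of contrast enhancement via neighborhood statistics.
# The 3x3 window and gain of 1.5 are illustrative assumptions.
import numpy as np

def local_contrast_enhance(img, gain=1.5):
    """Amplify each pixel's deviation from its 3x3 neighborhood mean."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    # 3x3 local mean computed from nine shifted views of the padded image
    local_mean = sum(padded[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    out = local_mean + gain * (img - local_mean)
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.array([[100, 100, 100],
                [100, 160, 100],
                [100, 100, 100]], dtype=np.uint8)
print(local_contrast_enhance(img))
```

The bright center pixel is pushed further above its neighborhood, increasing local contrast, while smooth regions are left nearly unchanged.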
Data Mining – Business Statistics
Data mining is a new discipline lying at the interface of statistics, database technology, pattern recognition, machine learning, and other areas. The amount of data being generated and stored is growing exponentially, due in large part to the continuing advances in computer technology. From the financial sector to telecommunications operations, companies increasingly rely on analysis of huge amounts of data to compete. A new generation of techniques and tools is emerging to intelligently assist humans in analysing mountains of data. New problems arise, partly as a consequence of the sheer size of the data sets involved, and partly because of issues of pattern matching. However, since statistics provides the intellectual glue underlying the effort, it is important for statisticians to become involved. Our goal here is to provide a brief overview of the key issues in knowledge discovery in an industrial context and outline representative applications.
Energy Consumption of a Tree-Based Hierarchical Approach in WSN
In a WSN there are many sensors that simply send their data to the base station (BS), which controls the whole network. This paper shows how the BS can select the closest node directly as a cluster head (CH), and how a further node can also be elected as S-CH. A CBR protocol is used to study energy consumption when data is collected along a tree structure. Charts and a C-language implementation of node-wise energy consumption, for both sending and receiving, are presented. A node A can also communicate with the BS directly, based on distance measurement.
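Per-node sending and receiving energy in such a tree is commonly accounted for with the first-order radio model. The sketch below uses textbook constants and a hypothetical forwarding scenario; it is not the paper's implementation:

```python
# Sketch of per-node energy accounting for sending and receiving along a
# tree toward the base station, using the widely used first-order radio
# model. The constants and the scenario are textbook assumptions, not
# values from the paper.

E_ELEC = 50e-9    # J/bit, electronics energy for transmit or receive
E_AMP = 100e-12   # J/bit/m^2, free-space amplifier energy

def tx_energy(bits, distance_m):
    """Energy to transmit `bits` over `distance_m` (free-space model)."""
    return E_ELEC * bits + E_AMP * bits * distance_m ** 2

def rx_energy(bits):
    """Energy to receive `bits`."""
    return E_ELEC * bits

# A cluster head receiving 4000 bits from each of two children, then
# forwarding 4000 aggregated bits 80 m toward the base station:
spent = 2 * rx_energy(4000) + tx_energy(4000, 80.0)
print(spent)
```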
Evolution of Speech Recognition – A Brief History of Technology Development
This paper gives a brief overview of the evolution of speech recognition, from 1779 to the present day. It also discusses the past, present, and future of speech recognition.
Impact of a Metaheuristic Algorithm on Machine Learning Data Collection for Data Mining Methods
Intrusion detection systems have been around for quite some time to protect systems from inside and outside threats. Researchers and scientists are concerned with how to enhance intrusion detection performance, so as to deal with real-time attacks and detect them fast enough for a quick response. One way to improve performance is to use a minimal number of features to define a model that can accurately discriminate normal from anomalous behaviour. Many feature selection techniques exist to reduce feature sets or extract new features from them. In this paper, we propose an anomaly-detector generation approach using a metaheuristic algorithm in conjunction with several feature selection techniques, including principal components analysis, sequential floating selection, and correlation-based feature selection. The metaheuristic algorithm was applied with a deterministic crowding niching technique to generate a set of detectors from a single run. In our tests across various algorithms, we conclude that the NWINE data yields low accuracy, rising appreciably only under the clustering algorithm.
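Of the feature-selection techniques named above, correlation-based selection is the simplest to illustrate: rank features by their absolute correlation with the class label and keep the top k. The sketch below uses synthetic data and is not the paper's pipeline:

```python
# Hedged sketch of correlation-based feature selection, one of the
# techniques named above. The data is synthetic and illustrative only.
import numpy as np

def select_top_k_by_correlation(X, y, k):
    """Return indices of the k features most correlated with the label."""
    scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    return sorted(int(i) for i in np.argsort(scores)[-k:])

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200).astype(float)
noise = rng.normal(size=(200, 3))
X = np.column_stack([y + 0.1 * noise[:, 0],   # strongly label-correlated
                     noise[:, 1],             # pure noise
                     noise[:, 2]])            # pure noise
print(select_top_k_by_correlation(X, y, 1))   # [0]
```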
Software metrics – a survey of procedure-oriented, object-oriented and web-based metrics
Quality software cannot be built without the help of measurement. Measurement is essential for achieving the basic management objectives of prediction, progress tracking, and process improvement. Software measurement is the raw data associated with various elements of the software process and product; it acts as a quantitative basis for the development and validation of models of the software development process. The major goal of software metrics is the identification and measurement of the essential parameters affecting software development. An oft-repeated phrase by De Marco holds true: "You can't manage what you can't measure!" [DEM86]. All process improvement must be based on measuring where you have been and where you are now, and on properly using the data to predict where you are heading. Collecting good metrics and properly using them leads to process improvement. This paper gives an exhaustive overview of metrics used in software development across different language paradigms. Metrics are classified as procedure-oriented, object-oriented and web-oriented metrics. The object-oriented metrics are further classified into the Chidamber–Kemerer and MOOD metric suites. Traditional metrics, which in an object-oriented system are generally applied to the methods comprising the operations of a class, are also highlighted. Web-based objects, further classified as multimedia files, web building blocks, scripts and links, are also described.
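Two of the Chidamber–Kemerer metrics mentioned above can be computed mechanically from class definitions. The sketch below does this for Python classes via reflection, with WMC simplified to a plain method count (all weights taken as 1), which is a common simplification rather than the survey's definition:

```python
# Illustration of two Chidamber-Kemerer object-oriented metrics computed
# by reflection: WMC (simplified here to a method count with unit weights)
# and DIT (depth of inheritance tree). The example classes are hypothetical.
import inspect

def wmc(cls):
    """Weighted Methods per Class, with every method weight taken as 1:
    count only methods defined in the class itself, not inherited ones."""
    return len([name for name, _ in inspect.getmembers(cls, inspect.isfunction)
                if name in cls.__dict__])

def dit(cls):
    """Depth of Inheritance Tree: number of ancestors above the class."""
    return len(cls.__mro__) - 1

class Account:
    def deposit(self): ...
    def withdraw(self): ...

class SavingsAccount(Account):
    def add_interest(self): ...

print(wmc(SavingsAccount))  # 1 (only add_interest is defined locally)
print(dit(SavingsAccount))  # 2 (SavingsAccount -> Account -> object)
```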