An Evaluation of Proximate Composition on Cereal Grains for Confectionery and Pasta Production
Alon Davidy
Abstract Full text PDF
Index: 10.183x/A550106
Visual Cryptography Scheme with Authentication Using Shamir and M K Reddy Techniques
Neha K. Lakde, Prof. P. B. Shelke
Abstract Full text PDF
Index: 10.183x/B550714
Application of Taguchi for Optimization of Process Parameters in Improving Thickness Variation in Single Stand Cold Rolling Mill
Vivek Anil Vaidya
Abstract Full text PDF
Index: 10.183x/C551523
Segmentation of Blood Vessels and Optic Disc in Retinal Images
Kota Prajwal Kant
Abstract Full text PDF
Index: 10.183x/D552433
Numerical-Based Radar Cross Section Estimation of a Dielectric Cylinder
Subhalakshmy A.B, Neethu P.S., Hema Singh
Abstract Full text PDF
Index: 10.183x/E553439
Design of an Efficient Coloring Technique for Images Using Color Transfer to Process Corruptive Artifacts
Seema Tidke, Shrikant Zade
Abstract Full text PDF
Index: 10.183x/F554044
Multiresponse Optimization of Surface Grinding Operation of EN19 Alloy Steel Using Grey Relational Analysis (GRA)
B. Madhu Sudan, S. Jayakrishna, B. Harish Raj
Abstract Full text PDF
Optimization of Tractor Trolley Axle Using FEM
Ramachendran, G. Paramesh, Madhusudhan
Abstract Full text PDF
Detecting Spam Zombies by Monitoring Outgoing Messages
Birru Devender, Korra Srinivas, Ch. Tulasi Ratna Mani
Abstract Full text PDF
Fuzzy Keyword Searches for Multiple PHR Owners in Cloud Computing
Birru Devender, Md. Khalid Imam Rahmani
Abstract Full text PDF
Implementation of SHA-1 Algorithm on FPGA
K. Sandeep Kumar, Y. Jagadeesh, P. Rajeshwar
Abstract Full text PDF
Packet Loss Control Using Tokens at the Network Edge
Birru Devender, Korra Srinivas, Ch. Tulasi Ratna Mani, K. Aparna Lakshmi
Abstract Full text PDF
Use of Endocrine Technologies in Monitoring Hormonal Disorders in College Going Girls of Mewar Region – Rajasthan
Dr. Asha Gupta
Abstract Full text PDF
An Evaluation of Proximate Composition on Cereal Grains for Confectionery and Pasta Production
Alon Davidy
Abstract Full text PDF
Index: 10.183x/A550106
Destructive methods of measuring the proximate composition of grains involve crushing the samples and applying chemical reagents to them, leading to losses of food along the food chain. This study evaluates the proximate composition of Quality Protein Maize (QPM) and Sorghum (Sorghum bicolor) using both non-destructive and destructive methods. A 6 × 2 × 2 × 3 randomized block design was used, and the results were analysed with the SPSS 20.0 statistical package and analysis of variance (ANOVA). The results show that the non-destructive measurements of moisture, protein, fat, fiber, ash and carbohydrate contents (11.60±0.06%, 9.80±0.05%, 4.07±0.09%, -, -, and 61.37% for QPM; 12.47±0.03%, 8.90%, 3.13±0.03%, 2.10%, 1.20%, and 66.60±0.12% for sorghum) differed significantly from the destructive measurements (10.67±0.52%, 10.55±0.13%, 4.95±0.22%, 3.26±0.05%, 1.71±0.01%, and 68.87±0.56% for QPM; 9.07±0.03%, 11.13±0.06%, 4.83±0.03%, 2.38±0.01%, 1.08±0.01%, and 70.79±0.02% for sorghum). The non-destructive method was found to be more economical and faster without affecting the actual proximate composition of the grains, and is therefore recommended for use over the conventional destructive method in Nigeria and elsewhere.
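As a minimal illustration of the statistical comparison described above (not the paper's actual data or its full randomized block design), a one-way ANOVA between the two measurement methods can be run as follows; the triplicate moisture readings are hypothetical.

# Minimal sketch: comparing one proximate-composition value measured by the
# destructive and non-destructive methods with a one-way ANOVA, analogous to
# the SPSS/ANOVA analysis described above. Readings are hypothetical.
from scipy import stats

non_destructive = [11.54, 11.60, 11.66]   # hypothetical QPM moisture (%)
destructive     = [10.15, 10.67, 11.19]

f_stat, p_value = stats.f_oneway(non_destructive, destructive)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# p < 0.05 would indicate a significant difference between the two methods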
Author Keywords:- Proximate Composition, Quality Protein Maize, Destructive Method, Protein Content, Sorghum
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 06
© Copyright 2014, All rights reserved.
Visual Cryptography Scheme with Authentication Using Shamir and M K Reddy Techniques
Neha K. Lakde, Prof. P. B. Shelke
Abstract Full text PDF
Index: 10.183x/B550714
Information is increasingly important in our daily life, and it gains more value when shared with others. Advances in networking and communication technologies make it easy to share information such as audio, video and images, but this raises many security issues: hackers may try to access unauthorized data and misuse it. Techniques that provide security while sharing information are termed secret sharing schemes; when the information is visual, such as an image or video, they are called visual secret sharing schemes. Visual cryptography (VC) is a technique used for protecting image-based secrets. This paper presents a detailed survey of different visual cryptography schemes. The basic concept of a visual cryptography scheme is to split a secret image into shares, each of which separately reveals no knowledge about the secret information. The shares are distributed to participants; by stacking these shares, the secret information can be revealed and visually recognized, and all shares are necessary to reveal the secret image. We also introduce a visual cryptography technique in which any type of image can be chosen as a password: the image is divided into shares, and the Shamir and M K Reddy encryption and decryption techniques are applied. After decryption, if the reconstructed image matches the original image, the system authenticates the user; otherwise the user is rejected. The system introduced in this paper satisfies the needs of authentication.
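For readers unfamiliar with the share-splitting idea behind such schemes, the sketch below shows a textbook Shamir-style (k, n) secret sharing of a single byte value; it is a generic construction, not the authors' exact algorithm or the M K Reddy technique.

# Minimal sketch of Shamir-style (k, n) secret sharing for one byte.
import random

P = 257  # prime slightly larger than a byte value

def make_shares(secret, k=2, n=3):
    """Split `secret` (0-255) into n shares; any k of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(173, k=2, n=3)
print(shares, recover(shares[:2]))  # any 2 of the 3 shares rebuild 173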
Author Keywords:- Visual Cryptography , Visual Secret Sharing Scheme
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 08
© Copyright 2014, All rights reserved.
Application of Taguchi for Optimization of Process Parameters in Improving Thickness Variation in Single Stand Cold Rolling Mill
Vivek Anil Vaidya
Abstract Full text PDF
Index: 10.183x/C551523
The Taguchi method is a statistical approach to optimizing process parameters and improving the quality of manufactured products. This paper focuses on the application of the Taguchi method of design of experiments to the optimization of process parameters in cold rolling of steel. The purpose of a cold rolling mill is to successively reduce the thickness of the metal strip and/or impart the desired mechanical and microstructural properties. Rolling parameters for cold rolling mills are continuously being improved to meet today's stringent requirements for high throughput, quality and low scrap loss and to make the process robust. A Taguchi-based design of experiments was applied to the second pass of a single-stand reversing cold rolling mill to optimize rolling parameters for reducing thickness variation of the steel strip. A suitable orthogonal array was selected and the experiment was conducted on the mill. The thickness variation was then measured and the signal-to-noise ratio calculated. Optimum parameter values were obtained from the response graphs and a confirmation test was carried out.
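For a smaller-the-better response such as thickness variation, the signal-to-noise ratio mentioned above is S/N = -10·log10(mean of y²). The sketch below applies this formula to hypothetical trial readings; the run labels and values are illustrative only.

# Minimal sketch: smaller-the-better S/N ratio used in Taguchi analysis.
import math

def sn_smaller_the_better(values):
    """S/N = -10 * log10(mean of y^2); larger S/N means less variation."""
    return -10 * math.log10(sum(y * y for y in values) / len(values))

# Hypothetical thickness-variation readings (micrometres) for three trials
trials = {"run 1": [12.0, 11.5, 12.3],
          "run 2": [9.8, 10.1, 9.6],
          "run 3": [14.2, 13.8, 14.5]}
for run, readings in trials.items():
    print(run, round(sn_smaller_the_better(readings), 2), "dB")
# The run with the highest S/N ratio points to the better parameter setting.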
Author Keywords:- Optimization, Cold rolling, Uncoiler, Taguchi method, Orthogonal array, Signal-to-noise ratio.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 09
© Copyright 2014, All rights reserved.
Segmentation of Blood Vessels and Optic Disc in Retinal Images
Kota Prajwal Kant
Abstract Full text PDF
Index: 10.183x/D552433
Retinal image analysis is increasingly prominent as a non-intrusive diagnosis method in modern ophthalmology. In this paper, we present a novel method to segment the blood vessels and the optic disc in fundus retinal images. The method can support non-intrusive diagnosis in modern ophthalmology, since the morphology of the blood vessels and the optic disc is an important indicator for diseases such as diabetic retinopathy, glaucoma and hypertension. The first step of our method extracts the retinal vascular tree using the graph cut technique. The blood vessel information is then used to estimate the location of the optic disc. The optic disc segmentation is performed using two alternative methods: the Markov Random Field (MRF) image reconstruction method segments the optic disc by removing vessels from the optic disc region, while the Compensation Factor method segments the optic disc using prior local intensity knowledge of the vessels. The proposed method is tested on three public data sets, DIARETDB1, DRIVE and STARE. The results and comparison with alternative methods show that our method achieved exceptional performance in segmenting the blood vessels and the optic disc.
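The graph-cut step can be pictured as follows: each pixel becomes a node linked to a source (vessel) and a sink (background), and the minimum s-t cut gives the segmentation. The toy sketch below uses a 1-D "image" and made-up data/smoothness weights, not the paper's actual energy terms.

# Minimal graph-cut segmentation sketch on a toy 1-D "image".
import networkx as nx

intensities = [0.9, 0.8, 0.3, 0.2]   # hypothetical pixel values (vessel ~ bright)
G = nx.DiGraph()
for i, v in enumerate(intensities):
    G.add_edge("src", i, capacity=v)          # data term: likelihood of vessel
    G.add_edge(i, "sink", capacity=1.0 - v)   # data term: likelihood of background
for i in range(len(intensities) - 1):          # smoothness term between neighbours
    G.add_edge(i, i + 1, capacity=0.5)
    G.add_edge(i + 1, i, capacity=0.5)

cut_value, (fg, bg) = nx.minimum_cut(G, "src", "sink")
print("vessel pixels:", sorted(p for p in fg if p != "src"))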
Author Keywords: - Retinal images, vessel segmentation, optic disc segmentation, graph cut segmentation.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 10
© Copyright 2014, All rights reserved.
Numerical-Based Radar Cross Section Estimation of a Dielectric Cylinder
Subhalakshmy A.B, Neethu P.S., Hema Singh
Abstract Full text PDF
Index: 10.183x/E553439
In this paper, a two-dimensional integral-equation-based numerical approach is used to analyze the field scattered from a homogeneous dielectric circular cylinder. The surface integral equation method replaces the dielectric scatterer with equivalent surface electric and magnetic currents. The cylinder contour is discretized into N sections to obtain 2N linear equations, represented in matrix form. The unknown currents are approximated as a sum of weighted terms using known expansion functions. The resultant matrix equation is solved to obtain the electric and magnetic currents and hence the RCS of the cylinder.
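Schematically, the discretized integral equation becomes a dense matrix equation Z a = v that is solved for the current expansion coefficients. The sketch below shows only this solve step, with a placeholder interaction kernel; it is not the paper's actual integral-equation operator.

# Generic method-of-moments solve sketch (placeholder kernel, illustrative only).
import numpy as np

N = 64                                             # segments on the cylinder contour
phi = 2 * np.pi * (np.arange(N) + 0.5) / N
pts = np.stack([np.cos(phi), np.sin(phi)], axis=1) # unit-radius contour points

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, 1.0)                           # avoid divide-by-zero on the diagonal
Z = np.exp(-1j * 2 * np.pi * d) / (4 * np.pi * d)  # placeholder interaction kernel
np.fill_diagonal(Z, 1.0)                           # placeholder self-term

v = np.exp(-1j * 2 * np.pi * pts[:, 0])            # incident plane-wave excitation
a = np.linalg.solve(Z, v)                          # expansion coefficients (currents)
print("coefficient magnitudes:", np.round(np.abs(a[:4]), 3))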
Author Keywords: - Dielectric homogeneous cylinder, method of moments, polarization, radar cross section.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 06
© Copyright 2014, All rights reserved.
Design of an Efficient Coloring Technique for Images Using Color Transfer to Process Corruptive Artifacts
Seema Tidke, Shrikant Zade
Abstract Full text PDF
Index: 10.183x/F554044
In this paper, the system uses two effective techniques: one to inpaint part of an image and one to colorize an image. For the first part, we use a patch-based inpainting algorithm: the image is divided into patches and the selected missing region is inpainted. For the second part, we use patch-based colorization in the YCbCr color space, in which the luminance of the reference and target images is evaluated and matched. This method colors a grayscale image using colors from a given reference image by evaluating the chrominance difference (Cb, Cr); by matching luminance values, colorization of the image becomes possible. Both techniques are effective and fast. Grayscale image colorization is a very useful application in image processing. This work presents a general technique for "colorizing" grayscale images by transferring color between a source color image and a target grayscale image, matching luminance and texture information between the images. The system uses a very simple algorithm based on a decorrelated color space, in which simple operations are applied; a color space with decorrelated axes is thus a useful tool for manipulating color images in this work. Colorization of the grayscale image is achieved by matching luminance values onto the data points in a simple operation, so believable output images are obtained given suitable input images. Features such as energy, entropy, homogeneity, contrast and correlation, computed from the co-occurrence matrix, are used for texture matching between the two images.
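The core luminance-matching step can be sketched as follows: the reference image is converted to YCbCr, and each grayscale target pixel borrows the chrominance (Cb, Cr) of the reference pixel with the closest luminance. The arrays are toy values, and the paper's additional patch/texture matching is omitted.

# Minimal sketch of luminance-matched color transfer in YCbCr (toy arrays).
import numpy as np

def rgb_to_ycbcr(img):
    """BT.601 conversion; img is float RGB in [0, 255]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

reference = np.array([[[200, 60, 60], [60, 200, 60]],
                      [[60, 60, 200], [220, 220, 220]]], dtype=float)  # 2x2 color reference
target_y = np.array([[90.0, 140.0], [50.0, 210.0]])                    # 2x2 grayscale target

ref_y, ref_cb, ref_cr = rgb_to_ycbcr(reference)
ry, rcb, rcr = ref_y.ravel(), ref_cb.ravel(), ref_cr.ravel()

# Borrow Cb/Cr from the reference pixel whose luminance is closest.
idx = np.abs(target_y.ravel()[:, None] - ry[None, :]).argmin(axis=1)
out_cb, out_cr = rcb[idx].reshape(target_y.shape), rcr[idx].reshape(target_y.shape)
print(out_cb, out_cr, sep="\n")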
Author Keywords:- Inpainting, Colorization, Contrast, Correlation, Energy, Entropy, Homogeneity, Luminance, Mean and Standard Deviation, YCbCr Color Space.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 05
© Copyright 2014, All rights reserved.
Multiresponse Optimization of Surface Grinding Operation of EN19 Alloy Steel Using Grey Relational Analysis (GRA)
B. Madhu Sudan, S. Jayakrishna, B. Harish Raj
Abstract Full text PDF
Conventional grinding fluid is widely used in the grinding process, which results in high consumption and environmental impact. Minimum Quantity Lubrication (MQL) is an alternative to conventional flood cooling in grinding. In this study, a water-based nanofluid is applied to the grinding process with the MQL approach because of its excellent convective heat transfer and thermal conductivity, and the grinding characteristics of hardened steel are investigated. Water-based nanofluid MQL grinding can significantly reduce the grinding temperature, decrease the grinding forces and give a better surface finish than the conventional grinding process. The process parameters considered are nanofluid type, nanofluid concentration, depth of cut and feed rate, and the multiple responses are surface roughness, temperature, grinding wheel wear and material removal rate. CuO at 2% concentration gave the best surface roughness. Analysis of variance (ANOVA) was carried out to identify the significant process parameters at the 95% confidence level; it shows that nanofluid type is a significant factor, since its p-value is less than 0.05. The use of GRA converts the multiple response variables into a single grey relational grade and simplifies the optimization procedure.
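The GRA step itself follows a standard recipe: normalize each response, compute the grey relational coefficient ξ = (Δmin + ζΔmax)/(Δ + ζΔmax), and average the coefficients into a grade per trial. The sketch below uses hypothetical roughness and material-removal-rate values, not the experimental data of this study.

# Minimal grey relational analysis (GRA) sketch for two responses.
import numpy as np

roughness = np.array([0.42, 0.31, 0.55, 0.28])   # um, smaller is better
mrr       = np.array([12.0, 15.5, 10.2, 14.1])   # mm^3/s, larger is better

def normalize(x, larger_is_better):
    return ((x - x.min()) / (x.max() - x.min()) if larger_is_better
            else (x.max() - x) / (x.max() - x.min()))

def grey_coeff(norm, zeta=0.5):
    delta = 1.0 - norm                            # deviation from the ideal (=1)
    return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

coeffs = np.vstack([grey_coeff(normalize(roughness, False)),
                    grey_coeff(normalize(mrr, True))])
grade = coeffs.mean(axis=0)                       # grey relational grade per trial
print("best trial:", int(grade.argmax()) + 1, grade.round(3))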
Author Keywords:- Conventional Grinding, Gray Relational Analysis, MQL, Nanofluid Concentration, Surface Roughness.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 07
© Copyright 2014, All rights reserved.
Optimization of Tractor Trolley Axle Using FEM
Ramachendran, G. Paramesh, Madhusudhan
Abstract Full text PDF
Tractor trailers are a very popular mode of transport, especially in rural areas, and are used to transport materials such as building construction material, agricultural crops, heavy machinery and other miscellaneous goods. Off-road conditions in rural areas include uneven agricultural field surfaces and bumpy village roads on which the tractor has to operate. These ground irregularities lead to unexpected loads on the tractor components. The existing trolley designed by the industry uses a heavy axle without considering static and dynamic loading conditions, which leads to a higher factor of safety and increases the overall cost of the axle. In this study, the existing trolley axle is redesigned considering static and dynamic load conditions. Based on finite element analysis, the axle was redesigned to reduce cost and weight while maintaining mechanical strength and easy manufacturability. Results of static, modal and transient analyses of the proposed axle under the loading due to the modified combine show that the proposed model is suitable for installation on the trolley. The design is optimized based on the manufacturing cost of the axle, and a failure analysis is performed on the axle of a trolley used in agriculture. These results provide a technical basis for preventing future damage to the axle.
Author Keywords:- Trolley axle, optimization, Ansys, weight reduction.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 05
© Copyright 2014, All rights reserved.
Detecting Spam Zombies by Monitoring Outgoing Messages
Birru Devender, Korra Srinivas, Ch. Tulasi Ratna Mani
Abstract Full text PDF
One of the key security threats on the Internet is compromised machines, which are often used to launch various security attacks such as spamming, spreading malware, DDoS and identity theft. Spamming provides a key economic incentive for attackers to recruit large numbers of compromised machines, so we focus on detecting the compromised machines in a network that are involved in spamming activities, commonly known as spam zombies. We develop an effective spam zombie detection system named SPOT that monitors the outgoing messages of a network. SPOT is designed around a statistical tool called the Sequential Probability Ratio Test (SPRT), a powerful technique that reaches a decision from a small number of observations with little time consumption and speeds up the detection process. A Paul Graham-style filter is used to detect spam: the SPOT filter, driven by SPRT, is applied to each outgoing message as it leaves the network. The implementation is based on a Bayesian calculation that produces a spam rating for each message; based on this rating, the technique decides whether the mail is spam and whether the sending machine is compromised.
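The SPRT at the heart of SPOT accumulates a log-likelihood ratio over a host's outgoing messages and stops as soon as it crosses an upper or lower threshold. The sketch below shows this decision rule with hypothetical spam probabilities and an illustrative message stream.

# Minimal SPRT sketch for spam zombie detection (hypothetical parameters).
import math

P_SPAM_IF_COMPROMISED = 0.9   # H1: machine is a spam zombie
P_SPAM_IF_CLEAN       = 0.2   # H0: machine is not compromised
ALPHA, BETA = 0.01, 0.01      # desired false-positive / false-negative rates

UPPER = math.log((1 - BETA) / ALPHA)    # accept H1 (compromised)
LOWER = math.log(BETA / (1 - ALPHA))    # accept H0 (clean)

def sprt(message_is_spam_stream):
    llr = 0.0
    for n, is_spam in enumerate(message_is_spam_stream, start=1):
        p1 = P_SPAM_IF_COMPROMISED if is_spam else 1 - P_SPAM_IF_COMPROMISED
        p0 = P_SPAM_IF_CLEAN if is_spam else 1 - P_SPAM_IF_CLEAN
        llr += math.log(p1 / p0)
        if llr >= UPPER:
            return "compromised", n
        if llr <= LOWER:
            return "clean", n
    return "undecided", n

# Outgoing messages classified by a content filter (True = spam)
print(sprt([True, True, False, True, True, True]))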
Author Keywords:- Spam zombies, Compromised machines, Spamming, DDoS, Identity theft.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 05
© Copyright 2014, All rights reserved.
Fuzzy Keyword Searches for Multiple PHR Owners in Cloud Computing
Birru Devender, Md. Khalid Imam Rahmani
Abstract Full text PDF
With the advent of cloud computing, it has become increasingly popular for PHR owners to outsource their documents to public cloud servers while allowing users to retrieve this data. For privacy reasons, secure search over encrypted cloud data has motivated several research works under the single PHR owner model. However, most cloud servers in practice do not serve just one PHR owner; instead, they support multiple PHR owners so as to share the benefits brought by cloud computing. In this paper, we propose schemes for privacy-preserving fuzzy keyword search in a multi-owner model, solving the problem of effective fuzzy keyword search over encrypted cloud data while maintaining keyword privacy. Fuzzy keyword search greatly enhances system usability by returning the matching documents when users' search inputs exactly match the predefined keywords, or the closest possible similar health records based on keyword similarity semantics when an exact match fails. We systematically construct a novel secure search protocol that generates fuzzy keyword sets and also uses an advanced technique for constructing fuzzy keyword sets.
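A common way to build such fuzzy keyword sets is the wildcard-based construction, in which all keywords within edit distance 1 are covered by a small set of '*' patterns. The sketch below shows this generic construction; it is not necessarily the authors' "advanced technique".

# Minimal wildcard-based fuzzy keyword set for edit distance 1.
def wildcard_fuzzy_set(keyword):
    """All edit-distance-1 variants of `keyword`, with '*' as the wildcard."""
    patterns = {keyword}
    for i in range(len(keyword)):
        patterns.add(keyword[:i] + "*" + keyword[i + 1:])   # substitution at position i
        patterns.add(keyword[:i] + "*" + keyword[i:])        # insertion before position i
    patterns.add(keyword + "*")                              # insertion at the end
    # deletions are caught when the query's insertion patterns meet these substitution patterns
    return patterns

print(sorted(wildcard_fuzzy_set("cold")))
# In the scheme, each pattern would be hashed/encrypted into a secure index;
# a query's fuzzy set is matched against it without revealing the keyword.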
Author Keywords:- Searchable Encryption, Fuzzy keyword searches, Cloud Computing.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 05
© Copyright 2014, All rights reserved.
Implementation of SHA-1 Algorithm on FPGA
K. Sandeep Kumar, Y. Jagadeesh, P. Rajeshwar
Abstract Full text PDF
SHA (Secure Hash Algorithm) is a well-known message digest standard used in computer cryptography; it compresses a long message into a short message digest. The algorithm is used in many security schemes, especially the DSS. In this paper, SHA-1 is implemented and analysed, then improved and implemented in Verilog on an FPGA. Xilinx tools are used to compile the design and generate the function modules, the RTL-level description circuit and the simulated waveforms. The RTL-level description shows the circuit connections inside the FPGA chip and how the modules are connected, while the simulated waveform shows the timing and the function of the SHA-1 module. The SHA-1 module designed in this paper uses fewer memory units and logic elements, and can be used in DSA or in any protocol or security algorithm.
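A typical way to validate such an FPGA core is to compare the simulated waveform's digest against a software golden model; the sketch below assumes this flow and uses Python's hashlib to produce the expected SHA-1 digest for a standard test vector.

# Software golden model for checking the FPGA SHA-1 core's output.
import hashlib

test_vector = b"abc"   # classic SHA-1 test vector
expected = hashlib.sha1(test_vector).hexdigest()
print(expected)        # a9993e364706816aba3e25717850c26c9cd0d89d
# The final hash in the simulated waveform would be compared against `expected`;
# a mismatch points to a bug in the round logic or message schedule.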
Author Keywords:- SHA-1; Secure Hash Algorithm; FPGA; Networks Security.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 05
© Copyright 2014, All rights reserved.
Packet Loss Control Using Tokens at the Network Edge
Birru Devender, Korra Srinivas, Ch. Tulasi Ratna Mani, K. Aparna Lakshmi
Abstract Full text PDF
The Internet now carries simultaneous audio, video and data traffic, which requires it to control packet loss, which in turn depends very much on congestion control. A series of protocols have been introduced to supplement the insufficient TCP mechanism for controlling network congestion. CSFQ was designed as an open-loop controller to provide a fair best-effort service by supervising per-flow bandwidth consumption, but it became helpless when P2P flows started to dominate Internet traffic. Token-Based Congestion Control (TBCC) is based on a closed-loop congestion control principle: it restricts the token resources consumed by an end-user and provides a fair best-effort service with O(1) complexity. Like Self-Verifying CSFQ and Re-feedback, it suffers a heavy load when policing inter-domain traffic for lack of trust. In this paper, Stable Token-Limited Congestion Control (STLCC) is introduced as a new protocol that adds inter-domain congestion control to TBCC and makes the congestion control system stable. STLCC is able to shape output and input traffic at the inter-domain link with O(1) complexity; it produces a congestion index, pushes packet loss to the network edge and improves network performance. Finally, a simple version of STLCC is introduced, which is deployable in the Internet without any modification to the IP protocol and also preserves the packet datagram.
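The per-user token accounting that TBCC/STLCC build on can be pictured with a textbook token bucket: tokens accumulate at a fixed rate and each admitted packet spends tokens. The sketch below is this generic building block only, not the STLCC protocol itself.

# Generic token-bucket rate limiter sketch (illustrative parameters).
import time

class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity   # tokens/s, burst size
        self.tokens, self.last = capacity, time.monotonic()

    def allow(self, packet_size):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size      # admit packet, spend tokens
            return True
        return False                        # out of tokens: drop or mark at the edge

bucket = TokenBucket(rate=1000.0, capacity=1500.0)   # ~1 kB/s, one MTU of burst
print(bucket.allow(1200), bucket.allow(1200))        # second packet is rejected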
Author Keywords:- Congestion control, Packet loss, Token-based congestion control, Network edge.
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 03
© Copyright 2014, All rights reserved.
Use of Endocrine Technologies in Monitoring Hormonal Disorders in College Going Girls of Mewar Region – Rajasthan
Dr. Asha Gupta
Abstract Full text PDF
The Mewar Region serves as a feeder channel for girls who come from remote tribal areas to semi-urban and urban areas, seeking admission to Govt. Meera Girls College, Udaipur, which is affiliated to MLS University, Udaipur, Rajasthan. Young girls entering college are unaware of their body physiology and biomedical milieu, which are critically connected with their psychology and behaviour. Our aim is to provide counselling for better health, growth, development and behavioural adjustment on the basis of results obtained with endocrine technologies. These could be used in each case to establish biometric data for every girl admitted to the college. This "Individual Health Card" will assist in providing baseline information to every girl about her body, with a wide impact on her social, emotional and intellectual quotient.
Author Keywords:- Technology, Endocrine Profile, Hormonal Dynamics, Biometric Data, Health Card
e-ISSN: 2319-183X, p-ISSN: 2319-1821 Source Type: Journal Original Language: English
Document Type: Article Number of pages: 02
© Copyright 2014, All rights reserved.