Pilot Scale Production and Compatibility Criteria of New Self-Cleaning Materials

The paper covers a chain of activities from synthesis and the establishment of a methodology for the characterization and testing of novel protective materials, through pilot production, to application on model supports. It summarizes the results of developing a pilot production protocol for the newly developed self-cleaning materials. The production parameters were optimized in order to improve the most important functional properties (mineralogical characteristics, particle size, self-cleaning properties and photocatalytic activity) of the newly designed nanocomposite material.

Free Fatty Acid Assessment of Crude Palm Oil Using a Non-Destructive Approach

Near infrared (NIR) spectroscopy has long been of great interest in the food and agriculture industries. The development of prediction models has facilitated the estimation process in recent years. In this study, 110 crude palm oil (CPO) samples were used to build a free fatty acid (FFA) prediction model. 60% of the collected data were used for training and the remaining 40% for testing. The visible peaks in the NIR spectrum were at 1725 nm and 1760 nm, indicating the first overtone of the C-H stretching bands. Principal component regression (PCR) was applied to the data in order to build this mathematical prediction model. The optimal number of principal components was 10. The results showed R2 = 0.7147 for the training set and R2 = 0.6404 for the testing set.
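A PCR model of this kind can be sketched as follows. This is a minimal illustration assuming scikit-learn and the 60/40 split described above; the spectra matrix X and FFA vector y are random placeholders, not the paper's CPO data.

```python
# Minimal sketch of principal component regression (PCR) for FFA prediction.
# X (NIR spectra) and y (FFA values) are placeholders for the 110 CPO samples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

X = np.random.rand(110, 700)   # placeholder NIR absorbance spectra
y = np.random.rand(110)        # placeholder FFA content (%)

# 60% training, 40% testing, as described in the abstract.
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.6, random_state=0)

# PCR = PCA (10 components, the reported optimum) followed by linear regression.
pcr = make_pipeline(PCA(n_components=10), LinearRegression())
pcr.fit(X_train, y_train)

print("R2 (train):", r2_score(y_train, pcr.predict(X_train)))
print("R2 (test): ", r2_score(y_test, pcr.predict(X_test)))
```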

The Load Balancing Algorithm for the Star Interconnection Network

The star network is one of the promising interconnection networks for future high-speed parallel computers and is expected to be one of the future-generation networks. The star network is both edge- and vertex-symmetric, has been shown to possess many attractive topological properties, and has a hierarchical structure. Although much research has been done on this promising network, it still lacks sufficient algorithms for the load balancing problem. In this paper we address this issue by investigating and proposing an efficient load balancing algorithm for the star network. The proposed algorithm, called the Star Clustered Dimension Exchange Method (SCDEM), is implemented on the star network and is based on the Clustered Dimension Exchange Method (CDEM). The SCDEM algorithm is shown to be efficient in redistributing the load as evenly as possible among all nodes of the different factor networks.
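The classical dimension-exchange idea that CDEM (and hence SCDEM) builds on can be illustrated on a hypercube-style network, where each node repeatedly averages its load with the neighbor across one dimension. The sketch below shows only this generic idea, not the authors' SCDEM algorithm for the star graph.

```python
# Generic dimension-exchange load balancing on a d-dimensional hypercube.
# Each node averages its load with the neighbor differing in one bit
# ("dimension") per step.  Illustrative only; SCDEM adapts this idea to the
# clustered structure of the star network.
def dimension_exchange(load, dims):
    load = list(load)                       # load[i] = load of node i
    for d in range(dims):
        for node in range(len(load)):
            neighbor = node ^ (1 << d)      # neighbor across dimension d
            if node < neighbor:             # handle each disjoint pair once
                avg = (load[node] + load[neighbor]) / 2
                load[node] = load[neighbor] = avg
    return load

print(dimension_exchange([8, 0, 4, 0, 0, 2, 0, 2], dims=3))
# after one sweep over all dimensions every node holds the global average (2.0)
```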

Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC) within Operational Research (OR) with Sustainability and Phenomenology

The supply chain (SC) is an operational research (OR) approach and technique that acts as a catalyst within the central nervous system of business today. Without an SC, any type of business is in the doldrums, hence entropy. The SC is the lifeblood of business today because it is the pivotal hub which provides an imperative competitive advantage. The paper presents a conceptual framework dubbed the Homomorphic Conceptual Framework for Effective Supply Chain Strategy (HCEFSC). The term homomorphic is derived from the abstract algebraic term homomorphism (same shape), which also embraces the following mathematical notions: monomorphism, isomorphism, automorphism, and endomorphism. The HCEFSC is intertwined and integrated with a wide and broad set of elements.

Modeling and Simulation of Axial Fan Using CFD

Axial flow fans, while incapable of developing high pressures, are well suited for handling large volumes of air at relatively low pressures. In general, they are low in cost, possess good efficiency, and can have blades of airfoil shape. Axial flow fans show good efficiencies and can operate at high static pressures if such operation is necessary. Our objective is to model and analyze the flow through axial fans using CFD software and to draw inferences from the obtained results, so as to achieve maximum efficiency. The performance of an axial fan was simulated using CFD, and the effect of variation of different parameters such as the blade number, noise level, velocity, temperature and pressure distribution on the blade surface was studied. This paper aims to present a final 3D CAD model of the axial flow fan. A first optimization was carried out by adapting this model to the components available on the market. After this step, the CFX flow solver was used to perform the necessary numerical analyses of the aerodynamic performance of this model. This analysis results in a final optimization of the proposed 3D model, which is presented in this article.

An Axiomatic Model for Development of the Allocated Architecture in Systems Engineering Process

The final step in completing the “Analytical Systems Engineering Process” is the “Allocated Architecture”, in which all Functional Requirements (FRs) of an engineering system must be allocated to their corresponding Physical Components (PCs). At this step, any design for developing the system’s allocated architecture in which no clear pattern of assigning the exclusive “responsibility” of each PC for fulfilling the allocated FR(s) can be found is considered a poor design that may cause difficulties in determining the specific PC(s) that have failed to satisfy a given FR. The present study utilizes the principles of the Axiomatic Design method to address this problem mathematically and establishes an “Axiomatic Model” as a solution for reaching good alternatives for developing the allocated architecture. The study also proposes a “Loss Function” as a quantitative criterion to compare non-ideal designs monetarily and choose the one that imposes a relatively lower cost on the system’s stakeholders. As a case study, we use the existing design of the U.S. electricity marketing subsystem, based on data provided by the U.S. Energy Information Administration (EIA). The result for 2012 shows the symptoms of a poor design and ineffectiveness due to coupling among the FRs of this subsystem.
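For orientation, the standard Axiomatic Design relationship that such a model builds on maps functional requirements to design parameters (here, physical components) through a design matrix; an uncoupled (diagonal) or decoupled (triangular) matrix satisfies the Independence Axiom, whereas a full matrix indicates coupling of the kind reported for the 2012 data. The notation below is the generic textbook form, not the paper's specific matrices or loss function.

```latex
% Generic Axiomatic Design mapping (textbook form, not the paper's data):
\{\mathrm{FR}\} = [A]\,\{\mathrm{DP}\}, \qquad
A_{ij} = \frac{\partial \mathrm{FR}_i}{\partial \mathrm{DP}_j}
% Independence Axiom: [A] diagonal    -> uncoupled (ideal) design
%                     [A] triangular  -> decoupled (acceptable) design
%                     otherwise       -> coupled (poor) design
```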

Impact of Machining Parameters on the Surface Roughness of Machined PU Block

Machining parameters are very important in determining the surface quality of any material. In the past decade, new engineering materials were developed for the manufacturing industry, creating a need to investigate the impact of these parameters on their surface roughness. Polyurethane (PU) block is widely used in the automotive industry to manufacture parts such as checking fixtures, which are used to verify the dimensional accuracy of automotive parts. In this paper, design of experiments (DOE) was used to investigate the effect of the milling parameters on the PU block. Furthermore, an analysis of the chemical composition of the machined surface was carried out using a scanning electron microscope (SEM). It was found that the surface roughness of the PU block is severely affected when PU undergoes flood machining instead of dry machining. In addition, the stepover and the silicon content were found to be the most significant parameters influencing the surface quality of the PU block.

Numerical Study of Vortex Formation inside a Stirred Tank

A computational fluid dynamics (CFD) study of a stirred tank with an air-water interface is carried out in the presence of different types of impeller, with and without baffles. A multiple reference frame (MRF) approach with the volume of fluid (VOF) method is used to capture the air-water interface. The RANS (Reynolds-Averaged Navier-Stokes) equations with the k-ε turbulence model are solved to predict the flow behavior of the water and air, which are treated as different phases. The predicted results show that the VOF method is able to capture the interface in the unbaffled tank. However, the VOF method gives unrealistic results in the baffled tank at high impeller rotational speed. For the continuous stirred tank, the air-water interface is disturbed by the inflow, and the water level also increases with time.
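For reference, the standard high-Reynolds-number k-ε transport equations solved alongside the RANS equations take the form below. This is the textbook form with the usual model constants; the paper's exact implementation within the VOF/MRF setting may differ.

```latex
% Standard k-epsilon model (textbook form; C_mu = 0.09, C_1 = 1.44,
% C_2 = 1.92, sigma_k = 1.0, sigma_eps = 1.3):
\frac{\partial(\rho k)}{\partial t} + \nabla\!\cdot\!(\rho k \mathbf{u})
  = \nabla\!\cdot\!\Big[\Big(\mu + \tfrac{\mu_t}{\sigma_k}\Big)\nabla k\Big] + P_k - \rho\varepsilon
\qquad
\frac{\partial(\rho\varepsilon)}{\partial t} + \nabla\!\cdot\!(\rho\varepsilon \mathbf{u})
  = \nabla\!\cdot\!\Big[\Big(\mu + \tfrac{\mu_t}{\sigma_\varepsilon}\Big)\nabla\varepsilon\Big]
  + C_1\frac{\varepsilon}{k}P_k - C_2\rho\frac{\varepsilon^2}{k},
\qquad \mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon}
```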

Using Data Mining in Automotive Safety

Safety is one of the most important considerations when buying a new car. While active safety aims at avoiding accidents, passive safety systems such as airbags and seat belts protect the occupants in case of an accident. In addition to legal regulations, organizations like Euro NCAP provide consumers with an independent assessment of the safety performance of cars and drive the development of safety systems in the automobile industry. Those ratings are mainly based on injury assessment reference values derived from physical parameters measured in dummies during a car crash test. The components and sub-systems of a safety system are designed to achieve the required restraint performance. Sled tests and other types of tests are then carried out by car makers and their suppliers to confirm the protection level of the safety system. A Knowledge Discovery in Databases (KDD) process is proposed in order to minimize the number of tests. The KDD process is based on data emerging from sled tests performed according to Euro NCAP specifications. About 30 parameters of the passive safety systems from different data sources (crash data, dummy protocol) are first analysed together with experts' opinions. A procedure is proposed to manage missing data and is validated on real data sets. Finally, a procedure is developed to estimate a set of rough initial parameters of the passive system before testing, aiming to reduce the number of tests.
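The abstract does not detail the missing-data procedure; as a hedged illustration of the kind of step involved, a simple per-parameter imputation over a table of sled-test parameters could look like the sketch below (pandas/scikit-learn, with hypothetical column names, not the authors' validated procedure).

```python
# Illustrative handling of missing values in a table of sled-test parameters.
# Column names and the imputation strategy are hypothetical placeholders.
import pandas as pd
from sklearn.impute import SimpleImputer

df = pd.DataFrame({
    "belt_load_limit_kN":        [4.0, None, 6.0, 5.5],
    "airbag_vent_diameter_mm":   [30.0, 35.0, None, 32.0],
    "dummy_chest_deflection_mm": [28.0, 31.0, 27.0, None],
})

# Median imputation per parameter; rows that are too incomplete could instead be dropped.
imputer = SimpleImputer(strategy="median")
df_filled = pd.DataFrame(imputer.fit_transform(df), columns=df.columns)
print(df_filled)
```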

Eco-Friendly Preservative Treated Bamboo Culm: Compressive Strength Analysis

Bamboo is extensively used in the construction industry. The low durability of bamboo, due to fungus infestation and termite attack during storage, places certain constraints on its usage as a modern structural material. Since many chemical formulations for bamboo treatment lead to severe harmful environmental effects, research on eco-friendly preservatives for bamboo treatment has been initiated worldwide. In the present study, an eco-friendly preservative for bamboo treatment has been developed. To validate its application for structural purposes, the effect of the treatment on compressive strength was investigated. Neem oil (25%) combined with copper naphthenate (0.3%), diluted with kerosene oil and impregnated into bamboo culms at 2 bar pressure, showed a weight loss of only 3.15% in the soil block analysis method. The results of the compressive strength analysis using a HEICO Automatic Compression Testing Machine reveal that the preservative treatment has not altered the structural properties of the bamboo culms. The compressive strength of the control (11.72 N/mm2) and the treated samples (11.71 N/mm2) was found to be comparable.

Tool Wear of Metal Matrix Composite 10wt% AlN Reinforcement Using TiB2 Cutting Tool

Metal matrix composites (MMCs) attract considerable attention as a result of their ability to provide high strength, high modulus, high toughness, high impact properties, improved wear resistance and good corrosion resistance compared to the unreinforced alloy. Aluminium silicon (Al/Si) alloy MMCs have been widely used in various industrial sectors such as transportation, domestic equipment, aerospace, military and construction. The aluminium silicon alloy studied here is an MMC reinforced with aluminium nitride (AlN) particles and has become a new-generation material used in the automotive and aerospace sectors. AlN is one of the advanced materials with a bright prospect for the future, offering light weight, high strength, high hardness and high stiffness. However, the high degree of ceramic particle reinforcement and the irregular nature of the particles distributed along the matrix material, which contribute to its low density, are the main problems leading to difficulties in the machining process. This paper examines tool wear when milling AlSi/AlN metal matrix composite using a TiB2 (titanium diboride) coated carbide cutting tool. The volume fraction of the AlN reinforcement particles was 10%, and the milling process was carried out under dry cutting conditions. The TiB2 coated carbide insert parameters were cutting speeds of 230, 300 and 370 m/min, a feed rate of 0.8 mm/tooth, and a depth of cut (DoC) of 0.4 mm. A Sometech SV-35 video microscope system was used to quantify the tool wear. The results show that tool life increases with cutting speed: the combination of 370 m/min, a feed rate of 0.8 mm/tooth and a DoC of 0.4 mm constituted the optimum condition, with tool life lasting 123.2 min. Meanwhile, at the medium cutting speed of 300 m/min with a feed rate of 0.8 mm/tooth and a depth of cut of 0.4 mm, tool life lasted 119.86 min, while at the low cutting speed it lasted 119.66 min. A high cutting speed therefore gives the best parameter setting for cutting AlSi/AlN MMC material. These results will help manufacturers in the machining of AlSi/AlN MMC materials.

Use of Hair as an Indicator of Environmental Lead Pollution: Characteristics and Seasonal Variation of Lead Pollution in Egypt

Lead is a toxic heavy metal to which mankind is exposed at the highest levels through environmental pollutants. A total of 180 male scalp hair samples were collected from different environments in Greater Cairo (GC), i.e. industrial, heavy-traffic and rural areas (60 samples from each), covering different activities during the period from 1/5/2010 to 1/11/2012. Hair samples were collected in five stages. The data showed that the concentration of lead in males from the industrial areas of Cairo ranged from 6.2847 to 19.0432 μg/g, with a mean value of 12.3288 μg/g. The lead content of hair samples from residential-traffic areas ranged from 2.8634 to 16.3311 μg/g, with a mean value of 9.7552 μg/g, while the lead concentration in the hair of male residents living in the rural area ranged from 1.0499 to 9.0402 μg/g, with a mean value of 4.7327 μg/g. The Pb concentration in the scalp hair of Cairo residents of the residential-traffic and rural areas was observed to follow the same pattern: a decrease in concentration in summer and an increase in winter. There was then a marked increase in Pb concentration in summer 2012, and this increase was significant; it was clearly seen for the residents of the residential-traffic and rural areas. Pb pollution in residents of the industrial areas showed the same seasonal pattern, but with a marked decrease in Pb concentration in summer 2012, and this decrease was significant. Lead pollution in residents of GC was serious. It is worth noting that the atmosphere is still contaminated by lead despite a decade of using unleaded gasoline. A strong seasonal variation, with higher Pb concentrations in winter than in summer, was found. Major contributions to the Pb pollution could include industrial emissions, motor vehicle emissions and dust transported over long distances from outside Cairo. More attention should be paid to reducing the Pb content of the urban aerosol and to the health effects of Pb pollution.

Image Spam Detection Using Color Features and K-Nearest Neighbor Classification

Image spam is a kind of email spam where the spam text is embedded in an image. It is a new spamming technique used by spammers to send their messages to a large number of internet users. Spam email has become a big problem in the lives of internet users, causing time consumption and economic losses. The main objective of this paper is to detect image spam using the histogram properties of an image. Although there are many techniques to automatically detect and avoid this problem, spammers employ new tricks to bypass them, making those techniques inefficient at detecting spam mails. In this paper we propose a new method to detect image spam. The image features are extracted using the RGB histogram, the HSV histogram, and the combination of both RGB and HSV histograms. Based on the optimized image feature set, classification is performed using the k-Nearest Neighbor (k-NN) algorithm. Experimental results show that our method achieves better accuracy. The results indicate that the combination of RGB and HSV histograms with the k-NN algorithm gives the best accuracy in spam detection.
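A minimal sketch of such a feature-extraction and classification pipeline is given below, assuming OpenCV and scikit-learn. The bin count, the value of k, and the placeholder file lists are illustrative choices, not the authors' exact configuration.

```python
# Sketch: RGB + HSV histogram features with a k-NN classifier for image spam.
# Assumes OpenCV (cv2) and scikit-learn; parameters and file names are illustrative.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def color_histogram_features(path, bins=16):
    img = cv2.imread(path)                          # BGR image from disk
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    feats = []
    for space in (img, hsv):                        # RGB histogram + HSV histogram
        for ch in range(3):
            h = cv2.calcHist([space], [ch], None, [bins], [0, 256])
            feats.append(cv2.normalize(h, h).flatten())
    return np.concatenate(feats)                    # combined RGB+HSV feature vector

# Placeholder file names; replace with labelled spam/ham training images.
spam_paths = ["spam_01.png", "spam_02.png"]
ham_paths = ["ham_01.png", "ham_02.png"]

X = [color_histogram_features(p) for p in spam_paths + ham_paths]
y = [1] * len(spam_paths) + [0] * len(ham_paths)    # 1 = spam, 0 = ham

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(clf.predict([color_histogram_features("incoming_mail_image.png")]))
```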

Unsteady Flow of an Incompressible Viscous Electrically Conducting Fluid in Tube of Elliptical Cross Section under the Influence of Magnetic Field

An exact solution for the unsteady flow of an elastico-viscous, electrically conducting fluid through a porous medium in a tube of elliptical cross section, under the influence of a constant pressure gradient and a magnetic field, is obtained in this paper. Initially, the flow is generated by a constant pressure gradient. After the steady state is attained, the pressure gradient is suddenly withdrawn, and the resulting fluid motion in the tube of elliptical cross section is investigated, taking into account the transverse magnetic field and the porosity factor of the bounding surface. The problem is solved in two stages: the first stage is the steady motion in the tube under the influence of a constant pressure gradient; the second stage concerns the unsteady motion. The problem is solved using the separation of variables technique. The results are expressed in terms of a nondimensional porosity parameter (K), a magnetic parameter (m) and an elastico-viscosity parameter (β), which depends on the non-Newtonian coefficient. The flow parameters are found to be identical with those of the Newtonian case as the elastico-viscosity parameter and the magnetic parameter tend to zero and the porosity tends to infinity. It is seen that the elastico-viscosity parameter, the magnetic parameter and the porosity parameter of the bounding surface have a significant effect on the velocity.
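For orientation only, the Newtonian limit (β → 0) of the nondimensional axial momentum balance of this type, with a constant pressure-gradient term, a transverse magnetic field and Darcy drag from the porous medium, can be written as below; the paper's additional elastico-viscous terms in β are not reproduced here.

```latex
% Newtonian limit (beta -> 0), written as an assumption for orientation only;
% u(x,y,t) is the axial velocity, P the constant pressure-gradient term,
% m the magnetic parameter and K the porosity parameter.
\frac{\partial u}{\partial t} = P + \nabla^{2} u - \Big(m^{2} + \frac{1}{K}\Big) u
% Consistent with the abstract: as m -> 0 and K -> infinity the drag term
% vanishes and the classical Newtonian tube-flow problem is recovered.
```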

Comparison of GSA, SA and PSO Based Intelligent Controllers for Path Planning of Mobile Robot in Unknown Environment

Nowadays, autonomous mobile robots have found applications in diverse fields. An autonomous robot system must be able to behave in an intelligent manner to deal with complex and changing environments. This work evaluates the performance of path planning and navigation of an autonomous mobile robot using Gravitational Search Algorithm (GSA), Simulated Annealing (SA) and Particle Swarm Optimization (PSO) based intelligent controllers in an unstructured environment. The approach finds not only a valid collision-free path but also an optimal one. The main aim of the work is to minimize the length of the path and the duration of travel from a starting point to a target while moving in an unknown environment with obstacles, without collision. Finally, a comparison is made between the three controllers; it is found that the path length and travel time achieved by the robot using GSA are better than those of the SA and PSO based controllers for the same task.
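As an illustration of one of the three compared techniques, the canonical PSO velocity and position update used in such controllers is sketched below for minimising a path-cost function. The cost function, bounds and coefficients are generic placeholders, not the authors' controller.

```python
# Canonical particle swarm optimisation (PSO) update, minimising a generic
# path-cost function.  Coefficients and the toy cost are illustrative only.
import numpy as np

def pso(cost, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-10, 10, (n_particles, dim))     # candidate waypoints / parameters
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = x + v                                                    # position update
        vals = np.array([cost(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()]
    return gbest, pbest_val.min()

# Toy cost: total path length through one intermediate waypoint from (0,0) to (5,5).
cost = lambda p: np.hypot(p[0], p[1]) + np.hypot(5 - p[0], 5 - p[1])
print(pso(cost, dim=2))
```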

A Robust Image Steganography Method Using PMM in Bit Plane Domain

Steganography is the art and science of hiding information in an appropriate cover carrier such as image, text, audio or video media. In this work the authors propose a new image-based steganographic method for hiding information within the complex bit planes of the image. After slicing the cover image into bit planes, it is analyzed to extract the most complex planes, in decreasing order of bit-plane complexity. A complexity function then determines the complex, noisy blocks of the chosen bit plane, and finally the pixel mapping method (PMM) is used to embed secret bits into those regions of the bit plane. The novel approach of using the pixel mapping method (PMM) in the bit-plane domain adaptively embeds data in the most complex regions of the image, providing high embedding capacity, better imperceptibility and resistance to steganalysis attacks.
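A minimal sketch of the bit-plane slicing and block-complexity analysis described here is given below, using the common border-transition complexity measure; the PMM embedding step itself is only indicated by a comment, since the abstract does not give its details.

```python
# Sketch: slice a grayscale cover image into bit planes and rank 8x8 blocks of a
# chosen plane by complexity (fraction of 0/1 transitions between adjacent bits).
# The PMM embedding itself is not reproduced here.
import numpy as np

def bit_planes(img):                       # img: 2-D uint8 array
    return [(img >> b) & 1 for b in range(8)]

def block_complexity(block):
    h = (block[:, 1:] != block[:, :-1]).sum()     # horizontal 0/1 transitions
    v = (block[1:, :] != block[:-1, :]).sum()     # vertical 0/1 transitions
    max_t = block.shape[0] * (block.shape[1] - 1) + (block.shape[0] - 1) * block.shape[1]
    return (h + v) / max_t                        # 0 = flat block, 1 = checkerboard

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # placeholder cover image
plane = bit_planes(cover)[0]                                   # least significant plane
blocks = [(r, c, block_complexity(plane[r:r+8, c:c+8]))
          for r in range(0, 64, 8) for c in range(0, 64, 8)]
blocks.sort(key=lambda t: t[2], reverse=True)    # most complex (noisy) blocks first
# ...PMM would embed secret bits into the top-ranked blocks here...
print(blocks[:5])
```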

An Approach for the Integration of the Existing Wireless Networks

The demand for high-quality services has fueled extensive research and development in wireless communications and networking. As a result, different wireless technologies such as Wireless LAN, CDMA, GSM, UMTS, MANET, Bluetooth and satellite networks have emerged in the last two decades. Future networks capable of carrying multimedia traffic need IP convergence, portability, seamless roaming and scalability among the existing networking technologies without changing the core of the existing communication networks. To fulfill these goals, present networking systems are required to work in cooperation to ensure technological independence, seamless roaming, high security and authentication, and guaranteed Quality of Service (QoS). In this paper, a conceptual framework for a cooperative network (CN) is proposed for the integration of heterogeneous existing networks to meet the requirements of next-generation wireless networks.

Phytochemical Study and Biological Activity of Sage (Salvia officinalis L.)

This study presents an attempt to evaluate the antioxidant potential and antimicrobial activity of the methanolic extract and essential oils prepared from the leaves of sage (Salvia officinalis L.). The polyphenol content of the methanolic extracts from the leaves of Salvia officinalis was determined spectrophotometrically, calculated as gallic acid and catechin equivalents. The essential oils and methanol extract were also screened for antioxidant activity using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) test. While the plant's essential oils showed only weak antioxidant activity, its methanol extract was considerably active in the DPPH test (IC50 = 37.29 μg/ml). An appreciable total polyphenol content (31.25 mg/g) was also detected for the plant methanol extract, as gallic acid equivalent, in the Folin–Ciocalteu test. The plant was also screened for its antimicrobial activity, and good to moderate inhibition was recorded for its essential oils and methanol extracts against most of the tested microorganisms. The present investigation revealed that this plant is a rich source of antioxidants. It is for this reason that sage has found increasing application in food formulations.

EUDIS-An Encryption Scheme for User-Data Security in Public Networks

The method of introducing a proxy to interpret incoming and outgoing requests increases the capability of the server, and our approach, UDIV (User-Data Identity Security), which addresses data and user authentication without extending the size of the data, performs better than a hybrid IDS (Intrusion Detection System). At the same time, requests have to pass through fewer of the security stages we have framed, which minimizes the response time of each request. When an anomaly is detected, the proxy extracts its identity before rejecting it, to prevent it from entering the system. In the case of false anomalies, the request is reshaped and transformed into a legitimate request for further response. Finally, the normal and abnormal requests are held in two different queues with their own priorities.

Information Retrieval: A Comparative Study of Textual Indexing Using an Oriented Object Database (db4o) and the Inverted File

The growth in the volume of text data, such as the books and articles held in libraries for centuries, has made it necessary to establish effective mechanisms to locate them. Early techniques such as abstracting, indexing and the use of classification categories marked the birth of a new field of research called "Information Retrieval". Information Retrieval (IR) can be defined as the task of defining models and systems whose purpose is to facilitate access to a set of documents in electronic form (a corpus) and to allow a user to find those relevant to him, that is to say, the content that matches the user's information needs. Most information retrieval models use a specific data structure to index a corpus, called the "inverted file" or "inverted index". This inverted file collects information on all terms over the corpus documents, specifying the identifiers of the documents that contain the term in question, the frequency of the term in each document of the corpus, the positions of the occurrences of the word, and so on. In this paper we use an object-oriented database (db4o) instead of the inverted file; that is to say, instead of searching for a term in the inverted file, we search for it in the db4o database. The purpose of this work is to carry out a comparative study to see whether object-oriented databases can compete with the inverted index in terms of access speed and resource consumption on a large volume of data.
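To make the comparison concrete, the kind of in-memory inverted file being benchmarked against db4o can be sketched as below. This is a generic illustration (term → document identifiers, frequencies and positions), not the exact structure used in the study.

```python
# Generic inverted file: for each term, record the documents containing it,
# the term frequency per document, and the positions of its occurrences.
# Illustrative only; the study's actual index and the db4o variant differ.
from collections import defaultdict

def build_inverted_index(corpus):           # corpus: {doc_id: text}
    index = defaultdict(dict)               # term -> {doc_id: [positions]}
    for doc_id, text in corpus.items():
        for pos, term in enumerate(text.lower().split()):
            index[term].setdefault(doc_id, []).append(pos)
    return index

corpus = {1: "information retrieval with inverted files",
          2: "object databases versus inverted files for retrieval"}
index = build_inverted_index(corpus)

# Lookup: documents containing "retrieval", with term frequency and positions.
for doc_id, positions in index["retrieval"].items():
    print(doc_id, len(positions), positions)
```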