Abstract: Titanium alloys like the modern alloy Ti-6Al-2Sn-4Zr-6Mo (Ti-6246) combine excellent specific mechanical properties with corrosion resistance. On the other hand, due to their material characteristics, these alloys are difficult to machine. The aim of the current study is the analysis of the wear mechanisms of coated cemented carbide tools applied in orthogonal cutting experiments on the Ti-6246 alloy. Round bars were machined with standard coated tools under dry conditions on a CNC lathe using a wide range of cutting speeds and cutting depths. Tool wear mechanisms were afterwards investigated by means of stereo microscopy, optical microscopy, confocal microscopy and scanning electron microscopy. Wear mechanisms included fracture of the tool tip (total failure) and abrasion. Specific wear features such as crater wear, micro cracks and built-up edge formation appeared depending on the mechanical and thermal conditions generated in the workpiece surface by the cutting action.
Abstract: Biological evolution has generated a rich variety of
successful solutions from which optimized strategies can be drawn.
One interesting example is ant colonies, which exhibit collective
intelligence even though their individual dynamics are simple. The
emergence of different patterns depends on the pheromone trail laid
down by the foragers, which serves as a positive feedback mechanism
for sharing information.
In this paper, we use the dynamics of the TASEP as a model of
interaction at a low level of the collective environment in the ants'
traffic flow. This work consists of modifying the movement rules of
the particles ("ants") belonging to the TASEP model so that they
match the natural movement of ants. To respect the constraint of
having no more than one particle per site, and to avoid collisions in
a bidirectional circulation, we suggest two strategies: a decease
strategy and a waiting strategy. The third stage of the work is
devoted to the study of the stability of these two proposed
strategies. In the final stage, we apply the first strategy to the
whole environment in order to obtain the emergence of traffic flow,
which is a form of learning.
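The exclusion constraint and the "waiting" behaviour described above can be illustrated by a minimal single-lane TASEP update. This is a generic sketch, not the paper's bidirectional model; the function name and parameters are illustrative.

```python
import random

def tasep_step(lattice, p_hop=1.0, rng=random):
    """One random-sequential update of a periodic single-lane TASEP.

    lattice: list of 0/1 (0 = empty site, 1 = particle/'ant').
    A particle hops one site to the right only if the target site is
    empty, enforcing the exclusion constraint of at most one particle
    per site; a blocked particle simply waits in place.
    """
    n = len(lattice)
    for _ in range(n):
        i = rng.randrange(n)          # pick a random site
        j = (i + 1) % n               # its right neighbour on the ring
        if lattice[i] == 1 and lattice[j] == 0 and rng.random() < p_hop:
            lattice[i], lattice[j] = 0, 1
    return lattice
```

Because particles only hop into empty sites, the number of particles on the ring is conserved under this update.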
Abstract: Pentachlorophenol (PCP) is a polychlorinated
aromatic compound that is widespread in industrial effluents and is
considered to be a serious pollutant. Among the variety of industrial
effluents encountered, effluents from tanning industry are very
important and have a serious pollution potential. PCP is also formed
unintentionally in effluents of paper and pulp industries. It is highly
persistent in soils and is lethal to a wide variety of beneficial
microorganisms and insects, human beings and animals. The natural
processes that break down toxic chemicals in the environment have
become the focus of much attention in the development of safe and
environment-friendly deactivation technologies. Microbes and plants are among
the most important biological agents that remove and degrade waste
materials to enable their recycling in the environment. The present
investigation was carried out with the aim of developing a microbial
system for bioremediation of PCP polluted soils. A number of plant
species were evaluated for their ability to tolerate different
concentrations of pentachlorophenol (PCP) in the soil. The
experiment was conducted for 30 days under pot culture conditions.
The toxic effect of PCP on plants was studied by monitoring seed
germination, plant growth and biomass. As the concentration of PCP
was increased to 50 ppm, the inhibition of seed germination, plant
growth and biomass was also increased. Although PCP had a
negative effect on all plant species tested, maize and groundnut
showed the maximum tolerance to PCP. Other tolerating crops
included wheat, safflower, sunflower, and soybean. From the
rhizosphere soil of the tolerant seedlings, twenty-seven PCP-tolerant
bacteria were isolated: eight from soybean, three from sunflower,
eight from safflower, two from maize, and three each from groundnut
and wheat. These were screened for their PCP degradation potential.
HPLC analyses of PCP degradation revealed that the isolate MAZ-2
degraded PCP completely. The isolate MAZ-1 was the next best
isolate, with 90 percent PCP degradation. These strains hold promise
for use in the bioremediation of PCP-polluted soils.
Abstract: Gamma radiation detection assemblies consist of a
scintillation crystal coupled to a photomultiplier tube, with a
preamplifier connected to the detector because the signals from the
photomultiplier tube are of small amplitude. After pre-amplification
the signals are sent to the amplifier and then to the multichannel
analyser. The multichannel analyser sorts all incoming electrical
signals according to their amplitudes and sorts the detected photons
into channels covering small energy intervals. The energy range of
each channel depends on the gain settings of the multichannel
analyser and the high voltage across the photomultiplier tube. The
output spectrum data of the two main isotopes studied were entered
into the biomass program and processed with a Matlab program to
obtain the solid holdup image (solid spherical nuclear fuel).
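The channel-sorting step of the multichannel analyser can be sketched as a simple amplitude histogram. The channel count and full-scale value below are illustrative placeholders for the actual gain and high-voltage settings, which fix the energy range of each channel.

```python
def sort_into_channels(amplitudes, n_channels=1024, full_scale=10.0):
    """Mimic a multichannel analyser: sort pulse amplitudes into
    equal-width channels covering small energy intervals.

    full_scale stands in for the gain/high-voltage setting that
    determines the energy range covered by each channel.
    """
    width = full_scale / n_channels
    counts = [0] * n_channels
    for a in amplitudes:
        ch = min(int(a / width), n_channels - 1)  # clamp overflow pulses
        counts[ch] += 1
    return counts
```

Summing the channel contents recovers the total number of detected pulses, as in a real spectrum.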
Abstract: The objective of global optimization is to find the
globally best solution of a model. Nonlinear models are ubiquitous
in many applications and their solution often requires a global
search approach; i.e. for a function f from a set A ⊂ Rn to
the real numbers, an element x0 ∈ A is sought such that
∀ x ∈ A : f(x0) ≤ f(x). Depending on the field of application,
the question of whether a found solution x0 is not only a local but
also a global minimum is very important.
This article presents a probabilistic approach to determine the
probability of a solution being a global minimum. The approach is
independent of the global search method used and only requires a
bounded, convex parameter domain A as well as a Lipschitz continuous
function f whose Lipschitz constant need not be known.
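The search problem stated above (find x0 ∈ A with f(x0) ≤ f(x) for all x ∈ A) can be sketched with a plain uniform random search over a box-shaped domain. This only illustrates the global search setting; the article's probabilistic certificate of global optimality is not reproduced, and all names are illustrative.

```python
import random

def random_multistart_min(f, bounds, n_samples=10000, rng=random):
    """Uniform random search for an approximate global minimizer of f
    over a box-shaped subset of R^n given as [(lo, hi), ...].

    Returns the best sampled point and its function value; with more
    samples the result approaches the global minimum for continuous f.
    """
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

For instance, minimizing f(x) = (x - 1)² over [-5, 5] returns a point close to x = 1 with high probability.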
Abstract: A software system goes through a number of stages
during its life, and a software process model gives a standard format
for planning, organizing and running a project. The article presents a
new software development process model named the "Divide and
Conquer Process Model", based on the idea of first dividing the work
into simple parts and then integrating them to complete the whole.
The article begins with the background of different software
process models and the problems in these models. This is followed by
the new divide and conquer process model, an explanation of its
different stages and, at the end, its edge over other models.
Abstract: In historical and social science, the influence
of natural disasters upon society is a matter of great interest. In
recent years, archives of natural disasters have been compiled by
hand, which is inefficient and wasteful. We therefore propose a
computer system to create a historical natural disaster archive. As
the target of this analysis, we consider newspaper articles, which
are typical sources describing the temporal relations of events
connected with natural disasters. To perform this analysis, we
identify the occurrences in newspaper articles by means of index
entries, considering the events which are specific to natural
disasters, and show the temporal relations between natural disasters.
We designed and implemented an automatic system for the "extraction
of the occurrences of natural disaster" and the "temporal relation
table for natural disaster."
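The extraction step might be sketched as scanning article text for disaster-specific index entries and dates, so that temporal relations between occurrences can later be tabulated. The term list and date pattern below are illustrative assumptions, not the system's actual index.

```python
import re

# Illustrative index entries; the actual system's entries are not given.
DISASTER_TERMS = ["earthquake", "flood", "typhoon", "eruption"]

def extract_occurrences(article_text):
    """Find disaster-specific index entries and ISO-style dates in a
    news article, the raw material for a temporal relation table."""
    dates = re.findall(r"\b\d{4}-\d{2}-\d{2}\b", article_text)
    terms = [t for t in DISASTER_TERMS if t in article_text.lower()]
    return {"dates": dates, "terms": terms}
```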
Abstract: The hydrodynamic and thermal lattice Boltzmann
methods are applied to investigate the turbulent convective heat
transfer in the wavy channel flows. In this study, the turbulent
phenomena are modeled by large-eddy simulations with the
Smagorinsky model. As a benchmark, the laminar and turbulent
backward-facing step flows are simulated first. The results give good
agreement with other numerical and experimental data. For wavy
channel flows, the distributions of the Nusselt number and the
skin-friction coefficient are calculated to evaluate the heat
transfer effect and the drag force. The results indicate that the
vortices at the trough would affect the
magnitude of drag and weaken the heat convection effects on the wavy
surface. In turbulent cases, if the amplitude of the wavy boundary is
large enough, the secondary vortices would be generated at troughs
and contribute to the heat convection. Finally, the effects of
different Reynolds numbers on the turbulent transport phenomena are
discussed.
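The Smagorinsky closure used in the large-eddy simulations computes a subgrid eddy viscosity ν_t = (C_s Δ)² |S| from the resolved strain rate. The sketch below is the standard formula, with a typical constant value that is not taken from the paper.

```python
import math

def smagorinsky_nu_t(strain_rate, delta, c_s=0.1):
    """Smagorinsky subgrid eddy viscosity nu_t = (C_s * delta)^2 * |S|,
    where |S| = sqrt(2 * S_ij S_ij) is the strain-rate magnitude.

    strain_rate: the tensor S_ij as a nested list; delta: filter
    width; c_s: Smagorinsky constant (a typical value, assumed here).
    """
    s2 = sum(sij * sij for row in strain_rate for sij in row)  # S_ij S_ij
    magnitude = math.sqrt(2.0 * s2)
    return (c_s * delta) ** 2 * magnitude
```

For a pure-shear strain rate S = [[1, 0], [0, -1]] with Δ = 1 and C_s = 0.1, this gives ν_t = 0.01 · 2 = 0.02.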
Abstract: The PRAF family of proteins is a plant-specific family of proteins with a distinct domain architecture and various unique sequence/structure traits. We have carried out an extensive search of the Arabidopsis genome using an automated pipeline and manual methods to verify previously known and identify previously unknown instances of PRAF proteins, characterize their sequences and build 3D structures of their individual domains. Integrating the sequence, the structure and the limited experimental details known for each of these proteins and their domains, we present a comprehensive characterization of the different domains in these proteins and their variant properties.
Abstract: This paper considers the integration of assembly
operations and product structure into Cellular Manufacturing System
(CMS) design in order to correct the drawbacks of previous research
in the literature. For this purpose, a new mathematical model is
developed which assigns machining and assembly operations to
manufacturing cells while the objective function is to minimize the
intercellular movements resulting from both of them. A
linearization method is applied to achieve the optimum solution by
solving the aforementioned nonlinear model with common modeling
software such as Lingo. Then, using different examples and
comparing the results, the importance of integrating assembly
considerations is demonstrated.
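The quantity the model minimizes, intercellular movements caused by machining and assembly operations, can be illustrated by a toy counter. The representation of routes and cell assignments below is an assumption of this sketch, not the paper's mathematical formulation.

```python
def intercellular_moves(operations, cell_of_machine):
    """Count intercellular movements for one part: a move is counted
    whenever two consecutive operations (machining or assembly) are
    performed on machines assigned to different cells.

    operations: ordered list of machine/station names for the part.
    cell_of_machine: mapping from machine name to cell number.
    """
    moves = 0
    for a, b in zip(operations, operations[1:]):
        if cell_of_machine[a] != cell_of_machine[b]:
            moves += 1
    return moves
```

Summing this count over all parts gives a toy version of the objective function; the actual model optimizes the cell assignment itself.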
Abstract: Wireless sensor networks are widely used in electronics and are now found in many applications, including military, environmental and healthcare applications, home automation and traffic control. We study one area of wireless sensor networks: the routing protocol. Routing protocols are needed to send data between sensor nodes and the base station. In this paper, we discuss two routing protocols, a data-centric and a hierarchical routing protocol, and show their output using the NS-2 simulator. The paper compares the simulation output of the two routing protocols using Nam, and uses Xgraph to find the throughput and delay of each protocol.
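The two metrics plotted with Xgraph, throughput and average end-to-end delay, are standard quantities that can be computed from simulation trace data as follows; the formulas are textbook definitions, not taken from the paper.

```python
def throughput_bps(bytes_received, interval_s):
    """Throughput in bits per second over a measurement interval."""
    return 8.0 * bytes_received / interval_s

def average_delay(send_times, recv_times):
    """Mean end-to-end packet delay from matched send/receive times
    (both lists in seconds, aligned per packet)."""
    delays = [r - s for s, r in zip(send_times, recv_times)]
    return sum(delays) / len(delays)
```

In practice these would be computed by parsing the NS-2 trace file for matching send and receive events per packet.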
Abstract: The accomplished study is based on the identification
of ageing effects and the associated absorption of moisture by
aircraft cabin components over the life cycle. In the first step of
the study, ceiling panels of the same age and from the same aircraft
cabin were examined for weight changes depending on their position in
the aircraft cabin. In the second step, ceiling panels of different
ages were examined concerning deflection, weight changes and acoustic
sound transmission loss. To prove the assumption of water absorption
within the study, and with the theoretical background from the
literature and scientific papers, an older test panel was exposed to
extreme climatic conditions (humidity and temperature) within a
climate chamber to show that there is a general ingress of water into
cabin components and that this ingress of water leads to changes in
different mechanical properties.
Abstract: For complete support of Quality of Service, it is preferable that the Grid computing environment itself predicts the resource requirements of a job using special methods. Exact and correct prediction enables exact matching of required resources with available resources. After the execution of each job, the resources used are saved in an active database named "History". First, some attributes are extracted from the submitted job; according to a defined similarity algorithm, the most similar executed job is retrieved from "History", and the resource requirements are predicted using statistical techniques such as linear regression or averaging. The new idea in this research is based on an active database and centralized history maintenance. Implementation and testing of the proposed architecture results in a prediction accuracy of 96.68% for the CPU usage of jobs, 91.29% for memory usage and 89.80% for bandwidth usage.
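The similarity-and-averaging prediction described above can be sketched as a nearest-neighbour lookup over the "History" records. The Euclidean distance and the number of neighbours are illustrative choices, not the paper's exact similarity algorithm.

```python
def predict_usage(history, job, k=3):
    """Predict a resource requirement for a new job from a 'History'
    of executed jobs: pick the k most similar past jobs by attribute
    distance and average their recorded usage.

    history: list of (attribute_vector, usage) pairs.
    job: attribute vector of the submitted job.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    ranked = sorted(history, key=lambda h: dist(h[0], job))
    nearest = ranked[:k]
    return sum(usage for _, usage in nearest) / len(nearest)
```

A linear regression fitted on the nearest records, as the abstract also mentions, would replace the final averaging step.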
Abstract: This study was conducted to evaluate the antifungal
activities of Cinnamomum zeylanicum and Origanum vulgare L.
essential oil against Aspergillus flavus in culture media and tomato
paste. 200 ppm of cinnamon and 500 ppm of oregano completely
inhibited A. flavus growth in culture media, while in tomato paste 300
ppm of cinnamon and 200 ppm of oregano had the same effect. Test
panel evaluations revealed that samples with 100 and 200 ppm
cinnamon were acceptable. The results may suggest the potential use
of Cinnamomum zeylanicum essential oil as a natural preservative in
tomato paste.
Abstract: This study examined the underlying dimensions of
brand equity in the chocolate industry. For this purpose, researchers
developed a model to identify which factors are influential in
building brand equity. The second purpose was to assess the
mediating effect of brand loyalty and brand image between brand
attitude, brand personality and brand association on the one hand,
and brand equity on the other. The study
employed structural equation modeling to investigate the causal
relationships between the dimensions of brand equity and brand
equity itself. It specifically measured the way in which consumers’
perceptions of the dimensions of brand equity affected the overall
brand equity evaluations. Data were collected from a sample of
consumers of the chocolate industry in Iran. The results of this
empirical study indicate that brand loyalty and brand image are
important components of brand equity in this industry. Moreover, the
role of brand loyalty and brand image as mediating factors in
building brand equity is supported. The principal contribution of the
present research is that it provides empirical evidence of the
multidimensionality of consumer-based brand equity, supporting
Aaker's and Keller's conceptualization of brand equity. The present
research also enriches brand equity building by incorporating
brand personality and brand image, as recommended by previous
researchers. Moreover, the creation of a brand equity index for the
chocolate industry of Iran is particularly novel.
Abstract: Competitive relationships among Bradyrhizobium
japonicum USDA serogroups 123, 122 and 138 were screened against
the standard commercial soybean variety Williams and two
introductions, P1 377578 "671", in a field trial. Displacement of
strain 123 by an effective strain should improve N2 fixation. Root
nodules were collected and the strain occupancy percentage was
determined using the strain-specific fluorescent antibody technique.
As anticipated, strain USDA 123 dominated 92% of the nodules due to
the high affinity between the host and the symbiont. This dominance
was consistent and not changed materially either by inoculation
practice or by introducing new strains. The interrelationship between
the genotype Williams and serogroups 122 and 138 was found to be very
weak, although the cell density of the strains in the rhizosphere
area was equal. On the other hand, the nodule occupancy of genotypes
671 and 166 with rhizobia serogroup 123 was almost diminished to
zero. The data further exhibited that the genotypes P1 671 and P1 166
have a high affinity to colonize with strains 122 and 138, whereas
Williams was highly promiscuous to strain 123.
Abstract: Phrases have a long history in information retrieval, particularly in commercial systems. Implicit semantic relationships between words in the form of a BaseNP have shown significant improvement in terms of precision in many IR studies. Our research focuses on linguistic phrases, which are language dependent. Our results show that using BaseNPs can improve performance even though over 62% of word formation in the Malay language is based on derivational affixes and suffixes.
Abstract: In this paper it is shown that the application of probability-statistical methods, especially at the early stage of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited and uncertain, is unfounded. Hence, the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To make a more adequate model of the GTE technical condition, the dynamics of the changes in the skewness and kurtosis coefficients are analysed. Research on the changes in the skewness and kurtosis coefficient values shows that the distributions of the GTE work parameters have a fuzzy character; hence consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of the changes in the basic characteristics of the GTE work parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for the preliminary identification of the engines' technical condition. Research on the changes in the correlation coefficient values also shows their fuzzy character; therefore, the application of the results of Fuzzy Correlation Analysis is proposed for model choice. To check the adequacy of the models, the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression is considered. When the information is sufficient, it is proposed to use a recurrent algorithm for identifying the technical condition of an aviation GTE (Hard Computing technology) based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive Least Squares Method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine's technical condition.
As an application of the given technique, the temperature condition of a new operating aviation engine was estimated.
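The skewness and kurtosis coefficients whose dynamics the abstract analyses are the standard moment-based ones; the excess-kurtosis convention below (a normal distribution gives 0 for both coefficients) is an assumption of this sketch.

```python
def skewness_kurtosis(xs):
    """Sample skewness and excess kurtosis from central moments:
    skew = m3 / m2^(3/2), kurt = m4 / m2^2 - 3, where m_k is the
    k-th central moment. Both are 0 for normally distributed data,
    so non-zero values flag deviation from normality.
    """
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0
    return skew, kurt
```

Tracking how these coefficients drift over successive measurement windows is the kind of dynamics analysis the abstract describes.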
Abstract: This paper presents a critical study about the
application of Neural Networks to the ion-exchange process. Ion
exchange is a complex non-linear process involving many factors
that influence the ion uptake mechanisms from the pregnant solution.
The following step includes the elution. Published data present
empirical isotherm equations with definite shortcomings resulting in
unreliable predictions. Although the Neural Network simulation
technique has a number of disadvantages, including its "black
box" nature and a limited ability to explicitly identify possible
causal relationships, it has the advantage of implicitly handling
complex
nonlinear relationships between dependent and independent
variables. In the present paper, the Neural Network model based on
the back-propagation algorithm Levenberg-Marquardt was developed
using a three-layer approach with a tangent sigmoid transfer function
(tansig) at the hidden layer with 11 neurons and a linear transfer
function (purelin) at the output layer. The above-mentioned approach
has been used to test its effectiveness in simulating ion exchange
processes. The
modeling results showed that there is an excellent agreement between
the experimental data and the predicted values of copper ions
removed from aqueous solutions.
Abstract: We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we devised new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the Avalanche Criterion of the JPEG lossless compression step. This criterion makes possible the design of detectors whose detection rates are independent of the payload. Finally, we designed a Fisher-discriminant-based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.
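A Fisher-discriminant classifier of the kind mentioned projects feature vectors onto a single direction chosen to separate the cover and stego statistics. The diagonal (per-feature) within-class scatter below is a simplifying assumption of this sketch, not the paper's exact classifier.

```python
def fisher_direction(class_a, class_b, eps=1e-9):
    """Fisher linear discriminant direction for two classes of
    feature vectors. Under a diagonal within-class scatter
    assumption, each component is (mean_a - mean_b) / (var_a + var_b);
    eps guards against zero variance.
    """
    def mean(vs):
        return [sum(col) / len(vs) for col in zip(*vs)]

    def var(vs, mu):
        return [sum((v[i] - mu[i]) ** 2 for v in vs) / len(vs)
                for i in range(len(mu))]

    ma, mb = mean(class_a), mean(class_b)
    sw = [va + vb + eps
          for va, vb in zip(var(class_a, ma), var(class_b, mb))]
    return [(a - b) / s for a, b, s in zip(ma, mb, sw)]

def project(w, x):
    """Project a feature vector onto the discriminant direction."""
    return sum(wi * xi for wi, xi in zip(w, x))
```

Classification then reduces to thresholding the projected value between the two projected class means.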