Abstract: Linear stochastic estimation and quadratic stochastic
estimation techniques were applied to estimate the entire velocity
flow-field of an open cavity with a length-to-depth ratio of 2. The
estimations used instantaneous velocity magnitudes as the
estimators. These measurements were obtained by
Particle Image Velocimetry. The predicted flow was compared
against the original flow-field in terms of the Reynolds stresses and
turbulent kinetic energy. Quadratic stochastic estimation proved
superior to linear stochastic estimation in resolving the shear-layer
flow. When the velocity fluctuations were scaled up in the
quadratic estimate, both the time-averaged quantities and the
instantaneous cavity flow could be predicted fairly accurately.
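The abstract does not reproduce the estimation equations; as a minimal sketch, linear stochastic estimation fits coefficients that map the reference (estimator) signals to the field by least squares, and quadratic stochastic estimation augments the regressors with pairwise products of the estimators. The synthetic data below are illustrative stand-ins, not PIV measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data standing in for PIV snapshots: n_t snapshots of an
# n_x-point velocity field, estimated from n_e reference probe signals.
n_t, n_x, n_e = 2000, 50, 3
E = rng.standard_normal((n_t, n_e))          # reference (estimator) signals
true_A = rng.standard_normal((n_e, n_x))
u = E @ true_A + 0.1 * rng.standard_normal((n_t, n_x))  # field snapshots

# Linear stochastic estimation: coefficients minimizing the mean-square
# error <|u - E A|^2>, i.e. the least-squares solution of E A = u.
A_lin, *_ = np.linalg.lstsq(E, u, rcond=None)
u_lse = E @ A_lin

# Quadratic stochastic estimation: augment the regressors with the
# pairwise products E_i * E_j and solve the same least-squares problem.
quad = np.stack([E[:, i] * E[:, j]
                 for i in range(n_e) for j in range(i, n_e)], axis=1)
E_q = np.hstack([E, quad])
A_q, *_ = np.linalg.lstsq(E_q, u, rcond=None)
u_qse = E_q @ A_q

err_lse = np.mean((u - u_lse) ** 2)
err_qse = np.mean((u - u_qse) ** 2)
print(err_lse, err_qse)
```

Because the quadratic estimate contains the linear regressors as a subset, its least-squares residual can never exceed the linear one on the fitting data.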
Abstract: The aim of this study was to compare the
sensitometric properties of commonly used radiographic films
processed with different chemical solutions in hospitals with different
workloads. The effect of the processing conditions on the densities
induced on radiologic films was investigated. Two widely available
double-emulsion films, Fuji and Kodak, were exposed with an 11-step
wedge and processed with Champion and CPAC processing solutions.
The films were supplied to both high- and low-workload centers.
Our findings show that the speed and contrast of the Kodak film-screen
system were higher than those of the Fuji system in both workload
settings and for both processing solutions. However, there were
significant differences in film contrast between the two workloads
when the CPAC solution was used (p = 0.000 and p = 0.028). The
results also showed that the base-plus-fog density of the Kodak film
was lower than that of the Fuji film. In general, the Champion
processing solution produced higher speed and contrast for the
investigated films under the different conditions, and the difference
between the two processing solutions was significant at the 95%
confidence level (p = 0.01). The low base-plus-fog density of the
Kodak films provides greater visibility and accuracy, and their higher
contrast allows lower exposure factors to be used to obtain
better-quality radiographs. This study also indicates an economic
advantage, together with a lower patient dose, when the Champion
solution and Kodak film are used. Thus, in a radiologic facility, any
change in the film processor, processing cycle, or chemistry should be
carefully investigated before patient radiographs are acquired.
Abstract: This paper presents the development of an ignition system using spark electrodes for application in a research explosion vessel.
A single spark is to be discharged with a quantifiable ignition energy. The spark electrode system would enable the study of flame
propagation, the ignitability of fuel-air mixtures, and other fundamental characteristics of flames. Following the principle of the ASTM
capacitive spark circuit, an appropriate capacitor connected across the spark gap is charged through a large resistor by a high-voltage
power supply until a spark is initiated. Different spark energies could be obtained mainly by varying the value of the
capacitance and the supply current. The spark sizes produced are found to be affected by the spark gap, electrode size, input voltage
and capacitance value.
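For a capacitive spark circuit of this kind, the energy stored in the capacitor at breakdown follows the standard relation E = 0.5 C V²; the circuit loses reduce what actually reaches the gap. The numbers below are illustrative, not values from the paper:

```python
def spark_energy(capacitance_f, voltage_v):
    """Energy (J) stored in a capacitor charged to a given voltage.

    For a capacitive spark circuit, the energy available to the spark
    is at most E = 0.5 * C * V^2 (circuit losses reduce the energy
    actually delivered to the gap).
    """
    return 0.5 * capacitance_f * voltage_v ** 2

# Illustrative values (not from the paper): a 100 pF capacitor
# charged to 10 kV stores 5 mJ.
e = spark_energy(100e-12, 10e3)
print(e)  # 0.005 J
```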
Abstract: Two commercial proteases from Bacillus
licheniformis (Alcalase 2.4 L FG and Alcalase 2.5 L, Type DX) were
screened for the production of Z-Ala-Phe-NH2 in batch reaction.
Alcalase 2.4 L FG was the most efficient enzyme for the C-terminal
amidation of Z-Ala-Phe-OMe using ammonium carbamate as
ammonium source. Immobilization of protease has been achieved by
the sol-gel method, using dimethyldimethoxysilane (DMDMOS) and
tetramethoxysilane (TMOS) as precursors (unpublished results). In
batch production, about 95% of Z-Ala-Phe-NH2 was obtained at
30°C after 24 hours of incubation. Reproducibility of different
batches of commercial Alcalase 2.4 L FG preparations was also
investigated by evaluating the amidation activity and the entrapment
yields in the case of immobilization. A packed-bed reactor (0.68 cm
ID, 15.0 cm long) was operated successfully for the continuous
synthesis of peptide amides. The immobilized enzyme retained its
initial activity over 10 cycles of repeated use in the continuous reactor
at ambient temperature. At a 0.75 mL/min flow rate of the substrate
mixture, the total conversion of Z-Ala-Phe-OMe was achieved after 5
hours of substrate recycling. The product contained about 90%
peptide amide and 10% hydrolysis byproduct.
Abstract: A boundary layer wind tunnel facility has been
adopted in order to conduct experimental measurements of the flow field around a model of the Panorama Giustinelli Building, Trieste
(Italy). Information on the main flow structures has been obtained by means of flow visualization techniques and compared to numerical
predictions of the vortical structures spread on top of the roof, in order to investigate the optimal positioning of a vertical-axis
wind energy conversion system. Good agreement between the experimental measurements and the numerical predictions was registered.
Abstract: In this paper, the direct ansatz method is used for constructing multi-wave solutions to the (2+1)-dimensional extension of the Korteweg-de Vries equation (shortly, the EKdV equation). New breather-type three-wave solutions, including a periodic breather-type soliton solution and a breather-type two-soliton solution, are obtained. Some cases with specific values of the involved parameters are plotted for each of the three-wave solutions. Mechanical features of the resonance interaction among the multiple waves are discussed. These results enrich the variety of the dynamics of higher-dimensional nonlinear wave fields.
Abstract: The current study begins with an awareness that
today's media environment is characterized by technological
development and a new way of reading caused by the introduction of
the Internet. The researcher conducted a meta-analysis framed within
Technological Determinism to investigate the process of hypertext
reading, its differences from linear reading, and the effects such
differences can have on people's ways of mentally structuring their
world. The relationship between literacy and the comprehension
achieved by reading hypertexts is also investigated. The results show
hypertexts are not always user friendly. People experience hyperlinks
as interruptions that distract their attention, generating comprehension
problems and disorientation. On the one hand, hypertextual jumping
reading generates interruptions that finally make people lose their
concentration. On the other hand, hypertexts fascinate people, who
would rather read a document in such a format even though the
outcome is often frustrating and affects their ability to elaborate and
retain information.
Abstract: Optimization of rational geometrical and mechanical
parameters of a panel with curved plywood ribs is considered in this
paper. The panel consists of cylindrical plywood ribs manufactured
from Finnish plywood, upper and bottom plywood flanges, and
stiffening diaphragms, and is filled with foam. The minimal ratio of
the structure's self-weight to the load that can be applied to the
structure is taken as the rationality criterion. The optimization is
performed using classical beam theory without nonlinearities, and the
discrete design variables are optimized with a genetic algorithm.
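A genetic algorithm over discrete design variables, as mentioned above, can be sketched generically. The thickness catalogue and the fitness function below are placeholder assumptions, standing in for the paper's beam-theory analysis and its self-weight-to-load criterion:

```python
import random

random.seed(1)

# Hypothetical discrete design variables: each gene is an index into a
# catalogue of available values (e.g. plywood thicknesses); the fitness
# is a stand-in for the self-weight / allowable-load ratio (minimized).
CATALOGUE = [4, 6, 9, 12, 15, 18, 21]   # mm, illustrative only
N_GENES, POP, GENS = 5, 30, 60

def fitness(genes):
    # Placeholder objective; a real study would run the beam-theory
    # analysis of the panel here.
    t = [CATALOGUE[g] for g in genes]
    weight = sum(t)
    load = sum(x ** 1.5 for x in t)       # fictitious load capacity
    return weight / load

def mutate(genes, rate=0.1):
    return [random.randrange(len(CATALOGUE)) if random.random() < rate else g
            for g in genes]

def crossover(a, b):
    cut = random.randrange(1, N_GENES)    # one-point crossover
    return a[:cut] + b[cut:]

pop = [[random.randrange(len(CATALOGUE)) for _ in range(N_GENES)]
       for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness)
    elite = pop[: POP // 2]               # truncation selection
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(POP - len(elite))]

best = min(pop, key=fitness)
print([CATALOGUE[g] for g in best], fitness(best))
```

The discrete encoding (indices into a catalogue) is what makes a GA attractive here: gradient-based optimizers cannot step through a finite list of manufacturable plywood thicknesses, while crossover and mutation operate on it directly.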
Abstract: The plastic forming of sheet metal plays an
important role in metal forming. The traditional techniques of tool
design for sheet forming operations used in industry are experimental
and expensive. Predicting the forming results and determining the
punch force, blank-holder forces, and the thickness distribution of the
sheet metal will decrease the production cost and time of the material
to be formed. In this paper, a multi-stage deep drawing simulation of
an industrial part is presented using the finite element method. The
entire production sequence, including additional operations such as
intermediate annealing and springback, has been simulated with the
ABAQUS software under axisymmetric conditions. Simulation results
such as the sheet thickness distribution, the punch force, and the
residual stresses have been extracted at each stage, and the sheet
thickness distribution was compared with experimental results. The
comparison shows that the FE model is in close agreement with the
experiment.
Abstract: The business scenario is an important technique that may be used at various stages of enterprise architecture development to derive its characteristics from the high-level requirements of the business. In wireless deployments, business scenarios are used to help identify and understand the business needs involving wireless services, and thereby to derive the business requirements that the architecture development has to address, taking into account various wireless challenges. This study assesses the deployment of Wireless Local Area Network (WLAN) and Broadband Wireless Access (BWA) solutions for several business scenarios in the Asia Pacific region. The paper gives an overview of the business and technology environments, and discusses examples of existing or suggested wireless solutions adopted in the Asia Pacific region. The interactions of several players, enabling technologies, and key processes in the wireless environments are studied. The analysis and discussion are divided into two areas, healthcare and education, where the merits of wireless solutions in improving quality of life are highlighted.
Abstract: The backpropagation algorithm generally employs a quadratic error function; in fact, most problems that involve minimization employ the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved. These error functions help suppress the ill effects of outliers and have shown good robustness to noise. In this paper, we evaluate and compare the relative performance of a complex-valued neural network using different error functions. In the first simulation, for the complex XOR gate, it is observed that error functions such as the absolute error and the Cauchy error function can replace the quadratic error function. In the second simulation, it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas for a few other error functions the convergence speed of the network is independent of its architecture.
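The error functions named above can be sketched as follows. These are the standard textbook forms rather than the paper's exact definitions, and the Cauchy scale parameter c = 1 is an assumption; the comparison shows why the robust losses suppress outliers:

```python
import math

def quadratic_error(e):
    # Standard sum-of-squares loss (per residual).
    return 0.5 * e ** 2

def absolute_error(e):
    # L1 loss: grows linearly, so a single outlier has bounded influence
    # on the gradient.
    return abs(e)

def cauchy_error(e, c=1.0):
    # Cauchy (Lorentzian) loss: grows only logarithmically for large
    # residuals, suppressing the influence of outliers even further.
    return (c ** 2 / 2.0) * math.log(1.0 + (e / c) ** 2)

# For a small residual the three losses are comparable; for an
# outlier-sized residual the quadratic loss dominates.
for e in (0.1, 5.0):
    print(e, quadratic_error(e), absolute_error(e), cauchy_error(e))
```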
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), with a direct use of real variables, was employed as the optimization technique. Since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
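The abstract names the Tsai-Hill criterion without stating it. A minimal sketch of the standard plane-stress form is given below; the strength and stress values are illustrative assumptions, not data from the paper:

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill failure index for a unidirectional lamina under plane
    stress. Failure is predicted when the index reaches 1.

    s1, s2 : normal stresses along / transverse to the fibers
    t12    : in-plane shear stress
    X, Y   : longitudinal / transverse strengths (use the tensile or
             compressive strength matching the sign of each stress)
    S      : in-plane shear strength
    """
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# Illustrative stresses and strengths (MPa), not taken from the paper:
idx = tsai_hill_index(s1=600.0, s2=20.0, t12=30.0, X=1500.0, Y=40.0, S=70.0)
print(idx)  # < 1: the lamina is predicted to survive this load
```

In an optimization loop such as the paper's, this index (evaluated ply by ply from CLT stresses) is what the GA would drive toward, without exceeding, the failure value of 1.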
Abstract: A background estimation approach using a small-window
median filter is presented on the basis of an analysis of IR point-target,
noise, and clutter models. After simplifying the two-dimensional
filter, a simple method using a one-dimensional median filter is
illustrated for estimating the background according to the
characteristics of an IR scanning system. An adaptive threshold is
then used to segment the background-subtracted image. Experimental
results show that the algorithm achieves good performance and
satisfies the requirement of real-time processing of large images.
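A minimal sketch of the approach described above, assuming a per-scan-line pipeline; the window length, threshold factor, and synthetic scan line are illustrative choices, not the paper's:

```python
import numpy as np

def background_subtract(scan_line, window=5, k=3.0):
    """Estimate the background of one IR scan line with a 1-D median
    filter, subtract it, and threshold the residual adaptively.

    window : odd length of the sliding median window (kept small so
             point targets are rejected from the background estimate)
    k      : threshold factor in units of the residual's std deviation
    """
    pad = window // 2
    padded = np.pad(scan_line, pad, mode="edge")
    # Sliding-window median: the background estimate.
    background = np.array([np.median(padded[i:i + window])
                           for i in range(len(scan_line))])
    residual = scan_line - background
    # Adaptive threshold: mean + k * std of the residual.
    thresh = residual.mean() + k * residual.std()
    return residual > thresh

# Synthetic scan line: smooth clutter ramp + noise + one point target.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
line = 10 * x + 0.1 * rng.standard_normal(200)   # clutter + noise
line[120] += 5.0                                  # point target
detections = np.flatnonzero(background_subtract(line))
print(detections)
```

The median survives the spike that a mean filter would smear into the background estimate, which is why the small window isolates point targets from slowly varying clutter.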
Abstract: Spatial and mobile computing are evolving. This paper
describes a smart modeling platform called "GeoSEMA". The
approach aims to model multidimensional GeoSpatial Evolutionary
and Mobile Agents. Beyond 3D and location-based issues, other
dimensions may characterize spatial agents, e.g. discrete/continuous
time and agent behaviors. GeoSEMA is conceived as a dedicated
design pattern supporting temporal geographic applications; it is a
firm foundation for multipurpose and multidimensional spatial-based
applications. It deals with multipurpose smart objects (buildings,
shapes, missiles, etc.) by simulating geospatial agents.
Formally, GeoSEMA refers to geospatial, spatio-evolutive, and
mobile space constituents, for which a conceptual geospatial space
model is given in this paper. In addition to modeling and categorizing
geospatial agents, the model incorporates the concept of inter-agent
event-based protocols. Finally, a rapid software-architecture
prototype of the GeoSEMA platform is also given; it will be
implemented and validated in the next phase of our work.
Abstract: In a competitive production environment, critical
decisions are based on data obtained by random sampling of product
units. The effectiveness of these decisions depends on the quality of
the data and on their reliability. This leads to the necessity of a
reliable measurement system; examining the measurement process
and analysing its errors is the subject of a discipline known as
Measurement System Analysis (MSA). The aim of this research is to
establish the necessity of, and to support the further development of,
the analysis of measurement systems, particularly through Gage
Repeatability and Reproducibility (GR&R) studies, to improve
physical measurements. Although repeatability and reproducibility
gage studies are by now well known in manufacturing industries,
they are not applied as widely as other measurement system analysis
methods. To introduce this method and provide feedback for
improving measurement systems, this survey focuses on the
ANOVA method as the most widespread way of calculating
Repeatability and Reproducibility (R&R).
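The ANOVA method for a crossed Gage R&R study can be sketched as follows. The study layout (parts x operators x replicates) and the synthetic data are assumptions for illustration; the variance-component formulas are the standard crossed-design ones:

```python
import numpy as np

def gage_rr_anova(x):
    """Variance components of a crossed Gage R&R study via the ANOVA
    method. x has shape (parts, operators, replicates).

    Returns (repeatability, reproducibility, part_variation); negative
    component estimates are clamped to zero, as is usual practice.
    """
    p, o, r = x.shape
    grand = x.mean()
    part_means = x.mean(axis=(1, 2))
    oper_means = x.mean(axis=(0, 2))
    cell_means = x.mean(axis=2)

    # Two-way ANOVA sums of squares for the crossed design.
    ss_part = o * r * ((part_means - grand) ** 2).sum()
    ss_oper = p * r * ((oper_means - grand) ** 2).sum()
    ss_cell = r * ((cell_means - grand) ** 2).sum()
    ss_inter = ss_cell - ss_part - ss_oper
    ss_error = ((x - cell_means[:, :, None]) ** 2).sum()

    ms_part = ss_part / (p - 1)
    ms_oper = ss_oper / (o - 1)
    ms_inter = ss_inter / ((p - 1) * (o - 1))
    ms_error = ss_error / (p * o * (r - 1))

    repeatability = ms_error                              # equipment variation
    v_oper = max((ms_oper - ms_inter) / (p * r), 0.0)     # appraiser variation
    v_inter = max((ms_inter - ms_error) / r, 0.0)         # part x operator
    reproducibility = v_oper + v_inter
    part_variation = max((ms_part - ms_inter) / (o * r), 0.0)
    return repeatability, reproducibility, part_variation

# Synthetic study: 10 parts, 3 operators, 2 replicates.
rng = np.random.default_rng(0)
parts = rng.normal(0, 2.0, size=(10, 1, 1))      # real part-to-part spread
opers = rng.normal(0, 0.5, size=(1, 3, 1))       # operator bias
noise = rng.normal(0, 0.3, size=(10, 3, 2))      # repeatability noise
ev, av, pv = gage_rr_anova(parts + opers + noise)
print(ev, av, pv)
```

A capable measurement system is one in which the combined GR&R variance (repeatability plus reproducibility) is small relative to the part-to-part variation, as the synthetic example shows.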
Abstract: Many methods exist for either measuring or estimating
evaporation from free water surfaces. Evaporation pans provide one
of the simplest, least expensive, and most widely used means of
estimating evaporative losses. In this study, the rate of evaporation
from a water surface was calculated by modeling, with application
to dams in wet, arid, and semi-arid areas of Algeria.
We first calculate the evaporation rate from the pan using the energy
budget equation, which offers the advantage of ease of use, but the
results do not agree completely with the measurements taken by
the National Agency at dams located in areas of different climates.
We therefore develop a mathematical model to simulate evaporation.
This simulation combines an energy budget at the level of the
measurement pan with Computational Fluid Dynamics (Fluent). The
evaporation rates calculated by the two methods are then compared
with each other and with the in-situ measurements.
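In its simplest form, the energy-budget estimate converts the net energy flux at the water surface into an evaporation rate via the latent heat of vaporization. The sketch below assumes all other budget terms (heat storage, sensible heat, advection) are neglected, which is a simplification of, not a reproduction of, the paper's model:

```python
LATENT_HEAT = 2.45e6    # J/kg, latent heat of vaporization of water (~20 C)
WATER_DENSITY = 1000.0  # kg/m^3

def evaporation_rate_mm_per_day(q_net_w_m2):
    """Evaporation implied by an energy budget, assuming the net energy
    flux q_net (W/m^2) available at the water surface goes entirely into
    the latent heat of vaporization."""
    m_per_s = q_net_w_m2 / (WATER_DENSITY * LATENT_HEAT)
    return m_per_s * 1000.0 * 86400.0   # m/s -> mm/day

# Illustrative: a 150 W/m^2 daily-mean net flux sustains roughly 5 mm/day.
print(evaporation_rate_mm_per_day(150.0))
```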
Abstract: The acidity of different raw Jordanian clays
containing zeolite, bentonite, red and white kaolinite, and diatomite
was characterized by means of temperature-programmed desorption
(TPD) of ammonia, the conversion of 2-methyl-3-butyn-2-ol (MBOH),
FTIR, and BET measurements. FTIR spectra proved the presence of
silanol and bridged hydroxyls on the clay surface. The number of
acidic sites was calculated from the experimental TPD profiles. We
observed that the decrease in surface acidity correlates with the
decrease in the Si/Al ratio, except for diatomite. On the TPD plot for
zeolite, two maxima were registered, owing to the different strengths
of the surface acidic sites. Values of MBOH conversion, product
yields, and selectivity were calculated for catalysis on the Jordanian
clays. We found that all clay samples are able to convert MBOH into
a major product, 3-methyl-3-buten-1-yne (MBYNE), catalyzed by
acidic surface sites, with a selectivity close to 70%. A correlation was
found between MBOH conversion and the acidity of the clays
determined by TPD-NH3: the higher the acidity, the higher the
conversion of MBOH. However, diatomite provided the lowest
conversion of MBOH as a result of the poor polarization of its silanol
groups. Comparison of surface areas and conversions revealed the
highest density of active sites for red kaolinite and the lowest for
zeolite and diatomite.
Abstract: The problem of frequent pattern discovery is defined
as the process of searching for patterns such as sets of features or items that appear in data frequently. Finding such frequent patterns
has become an important data mining task because it reveals associations, correlations, and many other interesting relationships
hidden in a database. Most of the proposed frequent pattern mining
algorithms have been implemented with imperative programming
languages. Such a paradigm is inefficient when the set of patterns is
large and the frequent patterns are long. We suggest applying a
high-level declarative style of programming to the problem of
frequent pattern discovery. We consider two languages: Haskell and
Prolog. Our intuitive idea is that the problem of finding frequent
patterns should be efficiently and concisely implementable in a
declarative paradigm, since pattern matching is a fundamental feature
supported by most functional languages and by Prolog. Our frequent
pattern mining implementations in Haskell and Prolog confirm our
hypothesis about the conciseness of the programs. Comparative
performance studies on lines of code, speed, and memory usage of
declarative versus imperative programming are reported in the
paper.
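The paper's implementations are in Haskell and Prolog; as an illustrative stand-in (not the paper's code), the frequent-pattern discovery task itself can be sketched as a level-wise, Apriori-style search over itemsets:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Level-wise search: count candidate k-itemsets, keep those whose
    support meets the threshold, and join them to form (k+1)-candidates
    (a k-itemset can be frequent only if its subsets are frequent)."""
    items = sorted({i for t in transactions for i in t})
    tsets = [set(t) for t in transactions]

    def support(itemset):
        return sum(itemset <= t for t in tsets)

    frequent = {}
    level = [frozenset([i]) for i in items]
    k = 1
    while level:
        counted = {s: support(s) for s in level}
        current = {s: c for s, c in counted.items() if c >= min_support}
        frequent.update(current)
        # Candidate generation: join frequent k-itemsets pairwise.
        keys = sorted(current, key=sorted)
        level = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent

data = [["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"], ["a", "b", "c"]]
result = frequent_itemsets(data, min_support=3)
print(sorted((sorted(s), c) for s, c in result.items()))
```

The subset test `itemset <= t` is the imperative analogue of the pattern matching the abstract argues makes declarative languages a natural fit for this problem.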
Abstract: Nowadays, computer worms, viruses, and Trojan horses
are widespread, and they are collectively called malware. A decade
ago, such malware merely damaged computers by deleting or
rewriting important files; recent malware, however, seems designed
to earn money. Some malware collects personal information so that
malicious people can obtain secrets such as passwords for online
banking, evidence for a scandal, or contact addresses related to a
target. Moreover, the relation between money and malware has
become more complex: many kinds of malware spawn bots to obtain
springboards for further attacks. Meanwhile, for ordinary Internet
users, countermeasures against malware have come up against a
blank wall. Pattern matching wastes too many computer resources,
since matching tools have to deal with the large number of patterns
derived from malware variants, which virus-making tools can
generate automatically. Moreover, metamorphic and polymorphic
malware are no longer unusual. Recently, malware-checking sites
have appeared that inspect content in place of the user's PC;
however, a new type of malicious site has also appeared that evades
inspection by such checking sites. In this paper, existing protocols
and methods related to the web are reconsidered in terms of
protection from current attacks, and a new protocol and method are
proposed for the security of the web.
Abstract: The feasibility of applying a simple and cost-effective sliding friction testing apparatus to study the friction behaviour of a clutch facing material, as affected by variations in temperature and contact pressure, was investigated. It was found that the method used in this work gave a convenient and cost-effective measurement of the friction coefficients of a clutch facing material and their transitions. The obtained results will be useful for the development of new facing materials.