Abstract: The plastic forming of sheet metal plays an important role in metal forming. The traditional tool-design techniques for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punch force, blank holder forces and the thickness distribution of the sheet metal will decrease the production cost and time of the material to be formed. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. The entire sequence of production steps, including additional operations such as intermediate annealing and springback, has been simulated with the ABAQUS software under axisymmetric conditions. Simulation results such as the sheet thickness distribution, punch force and residual stresses have been extracted at each stage, and the sheet thickness distribution was compared with experimental results. The comparison shows that the FE model is in close agreement with the experiment.
Abstract: The business scenario is an important technique that may be used at various stages of enterprise architecture development to derive its characteristics from the high-level requirements of the business. In wireless deployments, business scenarios are used to help identify and understand business needs involving wireless services, and thereby to derive the business requirements that the architecture development has to address, taking into account various wireless challenges. This study assesses the deployment of Wireless Local Area Network (WLAN) and Broadband Wireless Access (BWA) solutions for several business scenarios in the Asia Pacific region. The paper gives an overview of the business and technology environments, and discusses examples of existing or suggested wireless solutions adopted, or to be adopted, in the region. The interactions of several players, enabling technologies, and key processes in the wireless environments are studied. The analysis and discussion are divided into two areas, healthcare and education, in which the merits of wireless solutions in improving quality of life are highlighted.
Abstract: The backpropagation algorithm generally employs a quadratic error function; in fact, most problems that involve minimization use the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved. These error functions help suppress the ill-effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of a complex-valued neural network using different error functions. In the first simulation, for the complex XOR gate, it is observed that error functions such as the absolute error and the Cauchy error function can replace the quadratic error function. In the second simulation it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas with a few other error functions the convergence speed of the network is independent of the network architecture.
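As an illustration, the three losses named above can be sketched as follows. The exact definitions and the Cauchy scale parameter `c` are assumptions for illustration; the paper applies such losses to complex-valued residuals, while real residuals are used here for simplicity:

```python
import math

# Candidate per-sample error terms (illustrative textbook forms; the exact
# definitions used in the paper may differ).
def quadratic(e):
    """Standard sum-of-squares term."""
    return 0.5 * e * e

def absolute(e):
    """Robust L1 term."""
    return abs(e)

def cauchy(e, c=1.0):
    """Cauchy (Lorentzian) term; scale c is an assumed hyperparameter."""
    return 0.5 * c * c * math.log(1.0 + (e / c) ** 2)

# A large residual (an outlier) is penalized far less by the robust losses,
# which is why they can suppress the ill-effects of outliers.
outlier = 10.0
print(quadratic(outlier), absolute(outlier), cauchy(outlier))
```

The gradient of the Cauchy term flattens for large residuals, so a single outlier cannot dominate the weight updates the way it does under the quadratic loss.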
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the maximum load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), used as an optimization technique with a direct use of real variables, was employed. However, since optimization via GAs is a long process, and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
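The hybrid neuro-GA loop described above can be sketched as follows. The objective function, RBF kernel width, and GA settings below are illustrative assumptions, not the laminate analysis or parameters used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expensive "analysis" standing in for the CLT/Tsai-Hill
# evaluation of a laminate; the real objective in the paper differs.
def expensive_analysis(x):
    return np.sum((x - 0.3) ** 2)

# --- Train an RBF surrogate on a small sample of analysis runs ---
X = rng.uniform(0, 1, size=(40, 2))          # sampled design points
y = np.array([expensive_analysis(x) for x in X])
width = 0.5                                   # assumed kernel width
def kernel(a, b):
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * width ** 2))
w = np.linalg.solve(kernel(X, X) + 1e-8 * np.eye(len(X)), y)
def surrogate(x):
    return kernel(np.atleast_2d(x), X) @ w

# --- Real-coded GA using the cheap surrogate as its fitness function ---
pop = rng.uniform(0, 1, size=(30, 2))
for gen in range(60):
    fit = np.array([surrogate(p)[0] for p in pop])
    parents = pop[np.argsort(fit)[:10]]       # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(10, size=2)]
        c = 0.5 * (a + b)                     # arithmetic crossover
        c += rng.normal(0, 0.05, size=2)      # Gaussian mutation
        children.append(np.clip(c, 0, 1))
    pop = np.array(children)

best = pop[np.argmin([surrogate(p)[0] for p in pop])]
```

The design choice this illustrates: each GA generation queries only the surrogate, so the expensive analysis is run just once per training point rather than once per fitness evaluation.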
Abstract: A background estimation approach using a small-window median filter is presented on the basis of analyzing IR point target, noise and clutter models. After simplifying the two-dimensional filter, a simple method adopting a one-dimensional median filter is illustrated for estimating the background according to the characteristics of the IR scanning system. An adaptive threshold is used to segment the background-cancelled image. Experimental results show that the algorithm achieves good performance and satisfies the requirement of real-time processing of large images.
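A minimal sketch of the one-dimensional scheme, assuming a 5-sample median window and a threshold proportional to the residual standard deviation (both choices are illustrative, not taken from the paper):

```python
import numpy as np

# One-dimensional median filtering along a scan line as a cheap background
# estimate; window size and threshold rule are assumed for illustration.
def detect_point_targets(scan, window=5, k=3.0):
    pad = window // 2
    padded = np.pad(scan, pad, mode='edge')
    background = np.array([np.median(padded[i:i + window])
                           for i in range(len(scan))])
    residual = scan - background          # background-cancelled signal
    thresh = k * residual.std()           # adaptive threshold
    return residual > thresh

# A flat background with one bright point target: the small median window
# removes the isolated spike from the background estimate, so the target
# survives cancellation while the background does not.
line = np.full(100, 10.0)
line[40] = 60.0
hits = detect_point_targets(line)
```

Because a point target occupies fewer samples than half the window, the median ignores it, which is why a small window suffices and keeps the per-pixel cost low enough for real-time use.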
Abstract: Spatial and mobile computing are evolving. This paper describes a smart modeling platform called “GeoSEMA”. The approach aims to model multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, other dimensions may characterize spatial agents, e.g. discrete-continuous time and agent behaviors. GeoSEMA is conceived as a dedicated design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents.
Formally, GeoSEMA refers to geospatial, spatio-evolutive and mobile space constituents; a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.
Abstract: In a competitive production environment, critical decisions are based on data resulting from random sampling of product units. The efficiency of these decisions depends on data quality and reliability, which leads to the necessity of a reliable measurement system. The process of estimating and analysing measurement errors is known as Measurement System Analysis (MSA). The aim of this research is to establish the necessity of, and provide assurance for, extensive development in analysing measurement systems, particularly through the use of Gage Repeatability and Reproducibility (GR&R) studies to improve physical measurements. Nowadays in manufacturing industries, gage repeatability and reproducibility studies are well known, but they are not applied as widely as other measurement system analysis methods. To introduce this method and provide feedback for improving measurement systems, this survey focuses on the “ANOVA” method as the most widespread way of calculating Repeatability and Reproducibility (R&R).
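A compact sketch of the crossed ANOVA calculation, assuming the standard design of p parts measured by o operators with r replicates each; the data values below are made up for illustration:

```python
import numpy as np

# Crossed Gage R&R by ANOVA: data[p, o, r] = measurement of part p by
# operator o, replicate r. Negative variance estimates are clipped to 0,
# as is conventional.
def gage_rr_anova(data):
    p, o, r = data.shape
    grand = data.mean()
    part_m = data.mean(axis=(1, 2))
    op_m = data.mean(axis=(0, 2))
    cell_m = data.mean(axis=2)

    ms_part = o * r * np.sum((part_m - grand) ** 2) / (p - 1)
    ms_op = p * r * np.sum((op_m - grand) ** 2) / (o - 1)
    inter = cell_m - part_m[:, None] - op_m[None, :] + grand
    ms_po = r * np.sum(inter ** 2) / ((p - 1) * (o - 1))
    ms_err = np.sum((data - cell_m[:, :, None]) ** 2) / (p * o * (r - 1))

    repeatability = ms_err                       # equipment variation
    reproducibility = max((ms_op - ms_po) / (p * r), 0) + \
                      max((ms_po - ms_err) / r, 0)  # operator variation
    part_var = max((ms_part - ms_po) / (o * r), 0)
    return repeatability + reproducibility, part_var

# Illustrative data: large true part-to-part differences, tiny operator
# bias, tiny replicate noise -> gage variation (GR&R) should be small
# compared with part variation.
data = (np.arange(5)[:, None, None] * 2.0
        + 0.01 * np.arange(3)[None, :, None]
        + 0.05 * np.arange(2)[None, None, :])
grr, part_var = gage_rr_anova(data)
```

A capable measurement system is one where `grr` is a small fraction of `part_var`; the percentage form %GR&R compares their square roots.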
Abstract: The principal objective of this study is to extract niobium oxide from columbite-tantalite concentrate of the Thayet Kon Area in Nay Phi Taw. It is recovered from a columbite-tantalite concentrate which contains 19.29% Nb2O5. The recovery of niobium oxide from the concentrate can be divided into three main sections, namely digestion of the concentrate, recovery from the leached solution, and precipitation and calcination. The concentrate was digested with hydrofluoric acid and sulfuric acid. Among the various parameters, the effects of acidity and time were studied. In the recovery section, a solvent extraction process using methyl isobutyl ketone was investigated. Ammonium hydroxide was used as the precipitating agent and the precipitate was later calcined. The percentage of niobium oxide obtained is 74%.
Abstract: Many methods exist for measuring or estimating evaporation from free water surfaces. Evaporation pans provide one of the simplest, least expensive, and most widely used methods of estimating evaporative losses. In this study, the rate of evaporation from a water surface was calculated by modeling, with application to dams in wet, arid and semi-arid areas in Algeria. We calculate the evaporation rate from the pan using the energy budget equation, which offers the advantage of ease of use, but our results do not agree completely with the measurements taken by the National Agency at dams located in areas of different climates. We therefore develop a mathematical model to simulate evaporation. This simulation combines an energy budget at the level of a measurement pan with Computational Fluid Dynamics (Fluent). The evaporation rates calculated by the two methods are then compared with each other and with the in-situ measurements.
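For orientation, the pan energy budget referred to above is commonly written in the following standard textbook form (given here as an assumption; the exact formulation used in the study may differ):

```latex
R_n = \lambda E_m + H + G, \qquad
E = \frac{R_n - H - G}{\lambda\,\rho_w}
```

where $R_n$ is the net radiation absorbed by the pan, $H$ the sensible heat flux, $G$ the change in stored heat, $\lambda$ the latent heat of vaporization, $\rho_w$ the density of water, $E_m$ the evaporative mass flux, and $E$ the evaporation rate expressed as a depth of water per unit time.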
Abstract: The acidity of different raw Jordanian clays containing zeolite, bentonite, red and white kaolinite, and diatomite was characterized by means of temperature-programmed desorption (TPD) of ammonia, conversion of 2-methyl-3-butyn-2-ol (MBOH), FTIR and BET measurements. FTIR spectra proved the presence of silanol and bridged hydroxyls on the clay surface. The number of acidic sites was calculated from the experimental TPD profiles. We observed that the decrease in surface acidity correlates with the decrease in Si/Al ratio, except for diatomite. On the TPD plot for zeolite, two maxima were registered, due to the different strengths of the surface acidic sites. Values of MBOH conversion, product yields and selectivity were calculated for catalysis on the Jordanian clays. We found that all clay samples are able to convert MBOH into a major product, 3-methyl-3-buten-1-yne (MBYNE), catalyzed by acidic surface sites with a selectivity close to 70%. A correlation was found between MBOH conversion and the acidity of the clays determined by TPD-NH3, i.e. the higher the acidity, the higher the conversion of MBOH. However, diatomite provided the lowest conversion of MBOH as a result of the poor polarization of its silanol groups. Comparison of surface areas and conversions revealed the highest density of active sites for red kaolinite and the lowest for zeolite and diatomite.
Abstract: The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. This paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery. We consider two languages: Haskell and Prolog. Our intuition is that the problem of finding frequent patterns should be efficiently and concisely implementable in a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of lines of code, speed and memory usage of declarative versus imperative programming are reported in the paper.
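For illustration, the levelwise search underlying frequent pattern discovery can be sketched as follows; this is a naive Apriori-style version written in Python, not the Haskell or Prolog implementations evaluated in the paper:

```python
from itertools import combinations

# Levelwise frequent itemset mining: count support of itemsets of size k,
# keep the frequent ones, and build size-(k+1) candidates from the items
# that still appear in some frequent set (naive candidate generation).
def frequent_patterns(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    frequent = [frozenset([i]) for i in items]
    k, result = 1, {}
    while frequent:
        counts = {c: sum(c <= t for t in transactions) for c in frequent}
        level = {c: n for c, n in counts.items() if n >= min_support}
        result.update(level)
        keep = sorted({i for c in level for i in c})
        frequent = [frozenset(c) for c in combinations(keep, k + 1)]
        k += 1
    return result

# Four tiny transactions; with min_support=2 every pair is frequent but
# the triple {a, b, c} is not.
data = [frozenset('abc'), frozenset('ab'), frozenset('ac'), frozenset('bc')]
pats = frequent_patterns(data, min_support=2)
```

The subset test `c <= t` is the pattern-matching core of the task; in Haskell or Prolog the same check is expressed directly over list patterns, which is the source of the conciseness the paper reports.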
Abstract: Nowadays, computer worms, viruses and Trojan horses, collectively called malware, have become widespread. A decade ago, such malware merely spoiled computers by deleting or rewriting important files; recent malware, however, seems to be created to earn money. Some malware collects personal information so that malicious people can find secrets such as passwords for online banking, evidence of a scandal, or contact addresses related to the target. Moreover, the relation between money and malware has become more complex: many kinds of malware spawn bots to obtain springboards. Meanwhile, for ordinary Internet users, countermeasures against malware have come up against a blank wall. Pattern matching wastes too many computer resources, since matching tools have to deal with a large number of patterns derived from subspecies, and virus-making tools can automatically generate such subspecies of malware. Moreover, metamorphic and polymorphic malware are no longer special. Recently, malware-checking sites have appeared that check contents in place of users' PCs; however, a new type of malicious site has also appeared that evades checks by these malware-checking sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection from current attacks, and a new protocol and method are proposed to secure the web.
Abstract: The feasibility of applying a simple and cost-effective sliding friction testing apparatus to study the friction behaviour of a clutch facing material, as affected by variations in temperature and contact pressure, was investigated. It was found that the method used in this work gives a convenient and cost-effective measurement of the friction coefficients of a clutch facing material and their transitions. The results obtained will be useful for the development of new facing materials.
Abstract: Independent spanning trees (ISTs) provide a number of advantages in data broadcasting, for example in fault-tolerant network protocols for distributed computing and in bandwidth utilization. However, the problem of constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm that constructs ISTs on hypercubes using minimum resources.
Abstract: Chaiyaphum Starch Co. Ltd. is one of many starch manufacturers that have introduced machinery to aid manufacturing. Even though machinery has replaced many elements and is now a significant part of the manufacturing process, problems remain that must be solved with respect to the current process flow to increase efficiency. The paper's aim is to increase productivity while maintaining the desired starch quality by redesigning the flipping machine's mechanical control system, which has a grossly low functional lifetime. The problems stem from the mechanical control system's bearings, as fluids and humidity can enter the bearings directly, in tandem with vibrations from the machine's own operation. The wheel used to sense starch thickness occasionally falls from its shaft due to high-speed rotation during operation, while the shaft may bend from impact when processing dried bread. Redesigning the mechanical control system has increased its efficiency, allowing quality thickness measurement while extending the functional lifetime by an additional 62 days.
Abstract: The availability of water in adequate quantity and
quality is imperative for sustainable development. Worldwide,
significant imbalance exists with regards to sustainable development
particularly from a water and sanitation perspective. Water is a
critical component of public health, and failure to supply safe water
will place a heavy burden on the entire population. Although the 21st
century has witnessed wealth and advanced development, it has not
been realized everywhere. Billions of people are still striving to
access the most basic human needs which are food, shelter, safe
drinking water and adequate sanitation. The global picture conceals
various inequalities particularly with regards to sanitation coverage in
rural and urban areas. Currently, water scarcity, and in particular water governance, is the main challenge threatening the sustainable development goals. Within the context of water, sanitation and health, sustainable development is a confusing concept, particularly when examined from the viewpoint of policy options for
developing countries. This perspective paper aims to summarize and
critically evaluate evidence of published studies in relation to water,
sanitation and health and to identify relevant solutions to reduce
public health impacts. Evidently, improving water and sanitation
services will result in significant and lasting gains in health and
economic development.
Abstract: We present design, fabrication, and characterization of
a small (12 mm × 12 mm × 8 mm) movable railway vehicle for sensor
carrying. The miniature railway vehicle (MRV) was mainly composed
of a vibrational structure and three legs. A railway was designed and
fabricated to power and guide the MRV. It also transmits the sensed
data from the MRV to the signal processing unit. The MRV moves along the railway on its legs due to its high-frequency vibration. A model was derived to describe the motion. In addition, FEM simulations were performed to design the legs. The MRV and the railway were then fabricated by precision machining. Finally, an infrared sensor was carried and tested. The results show that the unloaded MRV moved along the railway with a maximum speed of 12.2 mm/s, and the test signal was successfully sensed by the MRV.
Abstract: The cable tower of the Liede Bridge is a double-column, curved-lever, arched-beam portal frame structure. Being novel and unique in structure, this cable tower differs in complexity from traditional ones. This paper analyzes the ultimate load capacity of the cable tower using finite element calculations and model tests, which indicate that the constitutive relations applied here give a good simulation of the actual failure process of prestressed reinforced concrete. In the vertical load, horizontal load and overloading tests, the response of the tower model to stepped loading is linear, and the test data show good repeatability. All of this suggests that the cable tower has good bearing capacity, a rational design and high emergency capacity.
Abstract: This paper reports an analysis of the outdoor air pollution of the urban centre of the city of Messina. The variations in the concentrations of the most critical pollutants (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions were performed to represent the relations among the pollutants, and the differences between weekend and weekday pollutant concentrations were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied to the urban centre of the city. This index is based on the weighted mean of the concentrations of the most detrimental air pollutants relative to their limit values for the protection of human health. The analyzed data on the polluting substances were collected by the Assessorship of the Environment of the Regional Province of Messina in the year 2004. A statistical analysis of the air quality index trends is also reported.
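A minimal sketch of such an index, assuming illustrative limit values and equal weights (the actual limits and weighting adopted in the paper may differ):

```python
# Air-quality index as a weighted mean of pollutant concentrations relative
# to their limit values; the limits below are illustrative placeholders,
# not the regulatory values used in the study.
LIMITS = {'PM10': 50.0, 'O3': 120.0, 'CO': 10.0, 'C6H6': 5.0}

def pollution_index(concentrations, weights=None):
    weights = weights or {k: 1.0 for k in concentrations}
    num = sum(weights[k] * concentrations[k] / LIMITS[k]
              for k in concentrations)
    return num / sum(weights[k] for k in concentrations)

# An index above 1 means the weighted exposure exceeds the limit values.
idx = pollution_index({'PM10': 75.0, 'O3': 60.0, 'CO': 5.0, 'C6H6': 2.5})
```

Normalizing each pollutant by its own limit before averaging is what makes concentrations in different units (µg/m³ vs mg/m³) comparable within one index.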
Abstract: The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).