Abstract: This paper seeks to develop a simple yet practical and
efficient control scheme that enables cooperating arms to handle a
flexible beam. Specifically, the problem studied herein is that of two
arms rigidly grasping a flexible beam and thus capable of generating
forces/moments in such a way as to move the beam along a
predefined trajectory. The paper develops a sliding mode control law
that provides robustness against model imperfection and uncertainty.
It also provides an implicit stability proof. Simulation results for two
three-joint arms moving a flexible beam are presented to validate the
theoretical results.
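The abstract does not reproduce the paper's control law for two cooperating three-joint arms; as a minimal sketch of the sliding mode idea it refers to, the following simulates a hypothetical single-degree-of-freedom double-integrator with a bounded unknown disturbance, where a switching law drives the state onto a sliding surface and keeps it there despite the model error (the plant, gains, and disturbance are illustrative assumptions, not the paper's model):

```python
import math

def simulate_smc(x0=1.0, v0=0.0, lam=2.0, k=3.0, dt=1e-3, steps=5000):
    """Sliding mode control of a double-integrator x'' = u + d(t),
    driving (x, v) to the origin despite a bounded disturbance d."""
    x, v = x0, v0
    for i in range(steps):
        d = 0.5 * math.sin(10 * i * dt)   # unknown bounded disturbance
        s = v + lam * x                    # sliding surface s = x' + lam*x
        # switching law: k must exceed the disturbance bound |d| <= 0.5
        u = -lam * v - k * (1 if s > 0 else -1 if s < 0 else 0)
        a = u + d
        x += v * dt
        v += a * dt
    return x, v

x, v = simulate_smc()
```

Once the surface s = 0 is reached (in finite time, since k exceeds the disturbance bound), the state slides along x' = -lam*x toward the origin regardless of d.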
Abstract: Particle detection in very noisy and low contrast images
is an active field of research in image processing. In this article, a
method is proposed for the efficient detection and sizing of subsurface
spherical particles, which is used for the processing of softly fused
Au nanoparticles. Transmission Electron Microscopy (TEM) is used for
imaging the nanoparticles, and the proposed algorithm has been
tested on the two-dimensional projected TEM images obtained.
Results are compared with the data obtained by transmission optical
spectroscopy, as well as with conventional circular object detection
algorithms.
Abstract: Unified Speech Audio Coding (USAC), the latest MPEG standard for unified speech and audio coding, uses a speech/audio classification algorithm to distinguish speech and audio segments of the input signal. Owing to a shortcoming of the system, introducing a well-designed orchestra/percussion classification and modifying the subsequent processing can greatly increase the quality of the recovered audio. This paper proposes an orchestra/percussion classification algorithm for the USAC system that extracts only 3 Mel-Frequency Cepstral Coefficients (MFCCs) rather than the traditional 13 and uses an Iterative Dichotomiser 3 (ID3) decision tree rather than other, more complex learning methods; thus the proposed algorithm has lower computational complexity than most existing algorithms. Considering that frequent switching of attributes may cause quality loss in the recovered audio signal, this paper also designs a modified subsequent process that helps the whole classification system reach an accuracy rate as high as 97%, comparable to the 99% of classical methods.
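As a sketch of the learning component the abstract names, the following implements the standard ID3 information-gain split on a toy dataset; the discretized MFCC feature names and the labels are purely hypothetical, not taken from the paper:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_attribute(rows, labels, attrs):
    """Pick the attribute with maximal information gain (the ID3 criterion)."""
    base = entropy(labels)
    def gain(a):
        g = base
        for val in set(r[a] for r in rows):
            sub = [l for r, l in zip(rows, labels) if r[a] == val]
            g -= len(sub) / len(labels) * entropy(sub)
        return g
    return max(attrs, key=gain)

def id3(rows, labels, attrs):
    if len(set(labels)) == 1:
        return labels[0]                       # pure node: leaf
    if not attrs:
        return Counter(labels).most_common(1)[0][0]  # majority vote
    a = best_attribute(rows, labels, attrs)
    rest = [x for x in attrs if x != a]
    tree = {}
    for val in set(r[a] for r in rows):
        idx = [i for i, r in enumerate(rows) if r[a] == val]
        tree[val] = id3([rows[i] for i in idx], [labels[i] for i in idx], rest)
    return (a, tree)

def classify(tree, row):
    while isinstance(tree, tuple):
        a, branches = tree
        tree = branches[row[a]]
    return tree

# Toy training set: discretized levels of three hypothetical MFCC features.
rows = [
    {"mfcc1": "low",  "mfcc2": "high", "mfcc3": "low"},
    {"mfcc1": "low",  "mfcc2": "low",  "mfcc3": "low"},
    {"mfcc1": "high", "mfcc2": "high", "mfcc3": "high"},
    {"mfcc1": "high", "mfcc2": "low",  "mfcc3": "high"},
]
labels = ["orchestra", "orchestra", "percussion", "percussion"]
tree = id3(rows, labels, ["mfcc1", "mfcc2", "mfcc3"])
```

The resulting tree can then label each frame's attributes; the paper's modified subsequent process would smooth such per-frame decisions to avoid frequent attribute switching.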
Abstract: Accurate loss minimization is a critical component of
efficient electrical distribution power flow. This work presents loss
minimization in power distribution systems through feeder
restructuring, incorporation of DG, and placement of capacitors. The
study was conducted on an IEEE distribution network and an Indian
Electricity Board benchmark distribution system. The experimental
results for the Indian system are recommended to the board for
practical implementation to achieve regulated, stable output.
Abstract: The Smith predictor is theoretically a good solution to the problem of controlling time-delay systems. However, it is seldom used, because a precise mathematical model of the practical system is almost impossible to obtain and the predictor is very sensitive to uncertain systems with variable time delay. This paper is concerned with a design method for a Smith predictor for a temperature control system based on the Coefficient Diagram Method (CDM). The simulation results show that the control system with the Smith predictor designed by CDM is stable and robust while giving the desired time-domain system performance.
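The abstract does not detail the CDM design procedure; as a minimal sketch of the Smith predictor structure itself, the following simulates a hypothetical first-order plant with dead time, where the controller acts on the delay-free internal model output corrected by the mismatch between the measurement and the delayed model output. All plant parameters are illustrative assumptions, and a simple PI controller stands in for the paper's CDM design:

```python
def simulate(steps=400, dt=0.1):
    """Smith predictor sketch: first-order plant with dead time,
    G(s) = K e^{-Ls} / (tau s + 1), under a PI controller that sees
    the delay-free internal model plus a model-mismatch correction."""
    K, tau, L = 2.0, 5.0, 2.0        # hypothetical plant parameters
    d = int(L / dt)                  # dead time in samples
    kp, ki = 0.8, 0.15               # illustrative PI gains
    y = ym = integ = 0.0             # plant state, model state, integrator
    buf_y, buf_m = [0.0] * d, [0.0] * d
    sp = 1.0                         # setpoint
    y_meas = 0.0
    for _ in range(steps):
        y_meas, ym_del = buf_y[0], buf_m[0]
        fb = ym + (y_meas - ym_del)  # Smith predictor feedback signal
        e = sp - fb
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (K * u - y) / tau    # Euler step, real plant (pre-delay)
        ym += dt * (K * u - ym) / tau  # Euler step, internal model
        buf_y = buf_y[1:] + [y]
        buf_m = buf_m[1:] + [ym]
    return y_meas

final = simulate()
```

With a perfect model the correction term vanishes and the loop is effectively closed around the delay-free plant, which is what makes the scheme so sensitive to model error in practice.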
Abstract: With major technological advances, and in order to reduce
the cost of training apprentices for real-time critical systems, the
development of Intelligent Tutoring Systems for training apprentices
in these systems became necessary. These systems, in general, have
interactive features so that learning is actually more efficient,
making the learner more familiar with the mechanism in question. In
the initial stage of learning, tests are performed to measure the
student's performance, i.e., how well they use the system. The aim of
this paper is to present a framework to model an Intelligent Tutoring
System using the UML language. In the various steps of the analysis,
the diagrams required to build a general model are considered, whose
purpose is to present the different perspectives of its development.
Abstract: Clustering large populations is an important problem
when the data contain noise and clusters of different shapes. A good
clustering algorithm or approach should be efficient enough to detect
clusters sensitively. Besides space complexity, time complexity also
gains importance as the data size grows. Using hierarchies, we
developed a new algorithm that splits attributes according to the
values they take, choosing the dimension for splitting so as to divide
the database into parts as nearly equal as possible. At each node we
calculate certain descriptive statistical features of the data residing
there, and by pruning we generate the natural clusters with a
complexity of O(n).
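A minimal sketch of the splitting step the abstract describes, assuming the dimension with the widest value range is cut at its median so each child receives roughly half the data; the range-based selection criterion and the leaf size are illustrative assumptions, not the paper's exact rules, and the pruning and per-node statistics are omitted:

```python
import statistics

def split(points, min_size=2):
    """Recursively split a point set: pick the dimension with the widest
    range and cut at its median, so each child gets roughly half the data.
    Leaves (small point sets) are candidate natural clusters."""
    if len(points) <= min_size:
        return points
    dims = len(points[0])
    d = max(range(dims),
            key=lambda i: max(p[i] for p in points) - min(p[i] for p in points))
    m = statistics.median(p[d] for p in points)
    left = [p for p in points if p[d] <= m]
    right = [p for p in points if p[d] > m]
    if not left or not right:      # degenerate split: stop here
        return points
    return (d, m, split(left, min_size), split(right, min_size))

# Two well-separated groups of four points each.
points = [(0, 0), (0, 1), (1, 0), (1, 1),
          (10, 10), (10, 11), (11, 10), (11, 11)]
tree = split(points)
```

Each node would additionally store descriptive statistics (e.g., means and variances) of the points passing through it, so that pruning can merge sibling leaves into natural clusters in a single pass.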
Abstract: In a world worried about water resources with the
shadow of drought and famine looming all around, the quality of
water is as important as its quantity. The source of all concerns is the
constant reduction of per capita quality water for different uses.
With an average annual precipitation of 250 mm, compared to
the 800 mm world average, Iran is considered a water-scarce country,
and the disparity in the rainfall distribution, the limitations of
renewable resources and the population concentration in the margins
of desert and water scarce areas have intensified the problem.
The shortage of per capita renewable freshwater and its poor
quality in large areas of the country, which have saline, brackish or
hard water resources, and the profusion of natural and artificial
pollutants have caused the deterioration of water quality.
Among the methods for treating and using these waters is the
application of membrane technologies, which have come into focus in
recent years due to their great advantages. This process is quite
efficient in eliminating multivalent ions, and owing to the possibility
of production at different capacities, its applicability as a
point-of-use treatment process, and its lower energy requirement
compared to Reverse Osmosis processes, it can revolutionize the
water and wastewater sector in the years to come. This article studies
the different capacities of the water resources in the Persian Gulf and
Oman Sea watershed basins, and assesses the possibility of using the
nanofiltration process to treat brackish and non-conventional waters
in these basins.
Abstract: In this paper, the newly reported Cosh window function is
used in the design of the prototype filter for an M-channel Near
Perfect Reconstruction (NPR) Cosine Modulated Filter Bank (CMFB). A
local search optimization algorithm is used to minimize the distortion
parameters by optimizing the coefficients of the prototype filter.
Design examples are presented, and a comparison is made with the
Kaiser-window-based filter bank design of recently reported work.
The results show that the proposed design approach provides lower
distortion parameters and improved far-end suppression compared with
the Kaiser-window-based design.
Abstract: Orthogonal Frequency Division Multiplexing
(OFDM) is an efficient method of data transmission for high-speed
communication systems. However, the main drawback of OFDM
systems is that they suffer from a high Peak-to-Average Power Ratio
(PAPR), which causes inefficient use of the High Power Amplifier and
can limit transmission efficiency. An OFDM signal consists of a large
number of independent subcarriers, as a result of which its amplitude
can have high peak values. In this paper, we propose an effective
PAPR reduction scheme that combines the DCT and SLM techniques:
the DCT is followed by SLM, using the Riemann matrix to obtain the
phase sequences for the SLM technique. The simulation results show
that the PAPR can be greatly reduced by applying the proposed
scheme: while plain OFDM had a high PAPR of about 10.4 dB, our
proposed method achieved a reduction of about 4.7 dB with low
computational complexity. This approach also avoids randomness in
phase sequence selection, which makes decoding at the receiver
simpler. As an added benefit, the matrices can be generated at the
receiver end to recover the data signal, so no side information (SI)
needs to be transmitted.
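As a sketch of the SLM stage of such a scheme, the following computes the PAPR of a worst-case all-ones symbol block before and after selecting among candidate phase sequences. The mapping from Riemann-matrix-style rows to unit-modulus phases shown here is one plausible construction and may differ from the paper's exact one, and the DCT stage is omitted for brevity:

```python
import cmath
import math

def idft(X):
    """Naive inverse DFT (O(N^2)); an FFT would be used in practice."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N)
                for k in range(N)) / N
            for n in range(N)]

def papr_db(x):
    """Peak-to-average power ratio of a complex signal, in dB."""
    p = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(p) / (sum(p) / len(p)))

def riemann_row(i, N):
    """Row of a Riemann-style matrix (entry i if i divides j, else -1),
    mapped to unit-modulus phases. This mapping is an assumption."""
    row = [(i if j % i == 0 else -1) for j in range(2, N + 2)]
    return [cmath.exp(1j * math.pi * v / i) for v in row]

def slm(X, n_seq=4):
    """Selected Mapping: rotate the symbol vector by each candidate
    phase sequence and keep the lowest time-domain PAPR."""
    best = None
    for i in range(2, 2 + n_seq):
        P = riemann_row(i, len(X))
        cand = papr_db(idft([s * p for s, p in zip(X, P)]))
        if best is None or cand < best:
            best = cand
    return best

# All-ones block: worst case, all subcarriers add coherently.
X = [1 + 0j] * 16
baseline = papr_db(idft(X))   # 10*log10(16) ~ 12.04 dB
reduced = slm(X)
```

Because the candidate sequences are generated deterministically from the matrix, the receiver can regenerate them and undo the rotation without transmitted side information, which is the property the abstract highlights.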
Abstract: An appropriate method for fault identification and classification on an extra-high-voltage transmission line using the discrete wavelet transform is proposed in this paper. The sharp variations of the generated short-circuit transient signals, recorded at the sending end of the transmission line, are used to identify the fault. Fault classification involves threshold values determined on the basis of multiresolution analysis. A comparative study of the performance of a Discrete Fourier Transform (DFT) based Artificial Neural Network (ANN) and the Discrete Wavelet Transform (DWT) is also presented. The results prove that the proposed method is effective and efficient, obtaining accurate results within a short time using Daubechies 4 and 9 wavelets. Simulation of the power system is done using MATLAB.
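A minimal sketch of the detection idea: detail coefficients of a DWT capture sharp local variations, so a fault shows up as detail coefficients exceeding a threshold. A one-level Haar transform is used here for brevity in place of the paper's Daubechies 4/9 wavelets, and the signals and threshold are illustrative assumptions:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: returns
    (approximation, detail) coefficient lists."""
    s = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def detect_fault(signal, threshold):
    """Flag a fault when any detail coefficient (a sharp local
    variation) exceeds the threshold, as in multiresolution-based
    fault detection."""
    _, detail = haar_dwt(signal)
    return max(abs(d) for d in detail) > threshold

# 50 Hz waveform sampled at 1 kHz; the second half of the faulted
# signal carries a superimposed high-frequency transient.
healthy = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(64)]
faulted = healthy[:32] + [v + 2.0 * (-1) ** i
                          for i, v in enumerate(healthy[32:])]
```

Classification in the paper goes further, comparing per-phase detail energies against thresholds derived from the multiresolution analysis; the sketch shows only the detection step.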
Abstract: Societal security, continuity scenarios, and a methodological cycling approach are explained in this article. The organizational challenges of societal security call for the implementation of the international standard BS 25999-2 and the global ISO 22300 family of standards for business continuity management systems. An efficient global organization system is distinguished by high entity complexity, connectivity, and interoperability, and its relations are in fact not only cooperative. Competing businesses face numerous 'enemies', participants in apparent or hidden opponent and antagonistic roles toward a prosperous organization system, which can lead to a crisis scene or even a battle theatre. Organization business continuity scenarios are necessary for the preparedness, planning, management, and mastery of such 'a play' in real environments.
Abstract: A DEA model can generally evaluate performance
using multiple inputs and outputs for the same period. However,
production lead-time phenomena are sometimes hard to avoid, as in
long-term projects or marketing activities. A couple of models have
been suggested to capture this time-lag issue in the context of DEA.
This paper develops a dual-MPO model to deal with the time-lag
effect in evaluating efficiency. A numerical example is also given to
show that the proposed model can be used to obtain the efficiency and
reference sets of inefficient DMUs and to derive projected target
values of the input attributes that would make inefficient DMUs
efficient.
Abstract: Irradiation is considered one of the most efficient technological processes for the reduction of microorganisms in food. It can be used to improve the safety of food products and to extend their shelf lives. The aim of this study was to evaluate the effects of gamma irradiation on improving the shelf life of saffron. Samples were treated with 0 (non-irradiated), 1.0, 2.0, 3.0, and 4.0 kGy of gamma irradiation and held for 2 months. The control and irradiated samples underwent microbial analysis, chemical characterization, and sensory evaluation at 30-day intervals. Microbial analysis indicated that irradiation had a significant effect (P < 0.05) on the reduction of microbial loads. There was no significant difference in the sensory quality or chemical characteristics of the saffron during storage.
Abstract: A wireless sensor network with a large number of tiny sensor nodes can be used as an effective tool for gathering data in various situations. One of the major issues in wireless sensor networks is developing an energy-efficient routing protocol, which has a significant impact on the overall lifetime of the sensor network. In this paper, we propose a novel hierarchical routing protocol with static clustering, called Energy-Efficient Protocol with Static Clustering (EEPSC). EEPSC partitions the network into static clusters, eliminates the overhead of dynamic clustering, and utilizes temporary cluster heads to distribute the energy load among high-power sensor nodes, thus extending network lifetime. We have conducted simulation-based evaluations to compare the performance of EEPSC against Low-Energy Adaptive Clustering Hierarchy (LEACH). Our experimental results show that EEPSC outperforms LEACH in terms of network lifetime and power consumption minimization.
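A minimal sketch of the two ideas the abstract names: clusters formed once at setup (static clustering, so no per-round re-clustering overhead as in LEACH) and a per-round temporary cluster head chosen by residual energy. The partitioning rule and energy values are illustrative assumptions, not the paper's protocol details:

```python
import random

def make_static_clusters(nodes, k):
    """Partition node ids into k static clusters once, at setup time
    (no per-round re-clustering overhead, unlike LEACH)."""
    return [nodes[i::k] for i in range(k)]

def pick_cluster_heads(clusters, energy):
    """Each round, the node with the highest residual energy in each
    static cluster serves as the temporary cluster head."""
    return [max(c, key=lambda n: energy[n]) for c in clusters]

random.seed(1)
nodes = list(range(12))
energy = {n: random.uniform(0.5, 1.0) for n in nodes}  # residual energy
clusters = make_static_clusters(nodes, 3)
heads = pick_cluster_heads(clusters, energy)
```

Rotating the head role by residual energy each round spreads the relaying cost across the cluster, which is the mechanism behind the claimed lifetime gain.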
Abstract: In this research, a part of the Aghche basin in Isfahan
province, with an area of about 2000 hectares, was chosen to obtain
the curve number (CN), runoff coefficient, and W indicator of the
second Cook method, using aerial photos from 1968 and 1995 and IRS
satellite data from 2008. The process of land use change over the
study period and its effect on the changes in the curve number (CN),
W indicator, and surface runoff coefficient (C) of the basin was then
investigated. The results showed that, in the course of these land use
changes, the weighted averages of the curve number (CN), surface
runoff coefficient (C), and W indicator of the basin increased by 0.92,
0.02, and 0.78 units in the first period of study and by 1.18, 0.03,
and 0.99 units in the second period, respectively.
Abstract: In recent years, much research has been actively conducted
to mine the exploding Web world, especially User Generated Content
(UGC) such as weblogs, for knowledge about various phenomena and
events in the physical world, and Web services based on Web-mined
knowledge have begun to be developed for the public. However, there
are few detailed investigations of how accurately Web-mined data
reflect physical-world data. It would be problematic to uncritically
utilize Web-mined data in public Web services without sufficiently
ensuring their accuracy. Therefore, this paper introduces the simplest
Web Sensor and a spatiotemporally-normalized Web Sensor to extract
spatiotemporal data about a target phenomenon from weblogs retrieved
by keyword(s) representing the target phenomenon, and tries to
validate the potential and reliability of the Web-sensed
spatiotemporal data through four kinds of granularity analyses of the
correlation coefficient with daily, per-region temperature, rainfall,
snowfall, and earthquake statistics of the Japan Meteorological
Agency as physical-world data: spatial granularity (regional
population density), temporal granularity (time period, e.g., per day
vs. per week), representation granularity (e.g., "rain" vs. "heavy
rain"), and media granularity (weblogs vs. microblogs such as Tweets).
Abstract: A radial flow reactor for large-scale methanol synthesis,
in which the heat transfer type was cross-flow, was studied. The
effects of the operating conditions, including the reactor inlet air
temperature, the heating pipe temperature, and the air flow rate, on
the cross-flow heat transfer were investigated. The results showed
that the temperature profile of the area in front of the heating pipe
was only slightly affected by all the operating conditions; the main
area whose temperature profile was influenced was the area behind the
heating pipe, and the heat transfer direction depended on the air flow
direction. In order to provide a basis for radial flow reactor design
calculations, the dimensionless number group method was used to fit
the bed effective thermal conductivity and the wall heat transfer
coefficient, calculated by the mathematical model, as functions of the
product of the Reynolds and Prandtl numbers. The comparison of the
experimental data and the calculated values showed that the
calculated values fit the experimental data very well and that the
formulas can be used for reactor design calculations.
Abstract: Currently, web usage generates huge amounts of data from
user activity. In general, a proxy server is a system that supports
users' web access, and its performance can be managed through hit
rates. This research tries to improve the hit rates of a proxy system
by applying data mining techniques. The data sets were collected from
proxy servers in a university, and relationships among several
features were investigated. The resulting model is used to predict
future website accesses. The association rule technique is applied to
obtain the relations among Date, Time, Main Group web, Sub Group web,
and Domain name to create the model. The results showed that this
technique can predict web content for the next day; moreover, the
prediction of future website accesses improved from 38.15% to 85.57%.
This model can predict web page accesses, which tends to increase
the efficiency of proxy servers. In addition, the performance of
internet access will be improved, helping to reduce network traffic.
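A minimal sketch of the association rule step, mining support and confidence over hypothetical proxy-log transactions; the attribute values and thresholds are illustrative, not the paper's data:

```python
from collections import Counter
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Count all 1- and 2-item sets and keep those meeting min_support
    (support = fraction of transactions containing the itemset)."""
    n = len(transactions)
    counts = Counter()
    for t in transactions:
        items = sorted(set(t))
        for item in items:
            counts[(item,)] += 1
        for pair in combinations(items, 2):
            counts[pair] += 1
    return {s: c / n for s, c in counts.items() if c / n >= min_support}

def rules(freq, min_conf):
    """Derive rules A -> B from frequent pairs:
    confidence = supp(A, B) / supp(A)."""
    out = []
    for itemset, supp in freq.items():
        if len(itemset) == 2:
            a, b = itemset
            for x, y in ((a, b), (b, a)):
                conf = supp / freq[(x,)]
                if conf >= min_conf:
                    out.append((x, y, round(conf, 2)))
    return out

# Hypothetical proxy-log transactions: (time slot, site group, domain).
logs = [
    ["morning", "news", "bbc.com"],
    ["morning", "news", "cnn.com"],
    ["morning", "news", "bbc.com"],
    ["evening", "video", "youtube.com"],
]
freq = frequent_itemsets(logs, min_support=0.5)
found = rules(freq, min_conf=0.9)
```

High-confidence rules such as "morning implies news" would let the proxy prefetch or retain the corresponding content ahead of the predicted accesses, raising the hit rate.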
Abstract: Pavement constructions on soft and expansive soils are not durable and are unable to sustain heavy traffic loading. As a result, pavement failures and settlement problems occur very often, even under light traffic loading, due to cyclic and rolling effects. Geotechnical engineers have delved deeply into this matter and adopted various methods to improve the engineering characteristics of soft fine-grained soils and expansive soils. The problematic soils are either replaced with better-quality material or treated by chemical stabilization with various binding materials. Increasing strength and durability is also part of the sustainability drive to reduce the environmental footprint of the built environment through the efficient use of resources and recycled waste materials. This paper presents a series of laboratory tests and evaluates the effect of cement and fly ash on the strength and drainage characteristics of soil in Miri. The tests were performed at different percentages of cement and fly ash by dry weight of soil. Additional tests were also performed on soils treated with combinations of fly ash with cement and lime. The results of this study indicate an increase in the unconfined compressive strength and a decrease in the hydraulic conductivity of the treated soil.