Abstract: Landslides are slope processes that occur every year in Iran and other parts of the world and cause substantial human and financial losses. There are many methods for stabilizing landslides in soil and rock slopes, and applying the best method at the least cost and in the shortest time is important for researchers. In this research, the best stabilization method is identified using a decision support system (DSS). A DSS was built for this purpose and applied to the Hasan Salaran area in Kurdistan. Field-study data on topography, slope, geology, landslide geometry, and related features were used and entered into the decision-making program (ALES). Analysis of mass stability indicated instability potential at present. The results show that surface and subsurface drainage is the best stabilization method, and stability analysis shows that drainage yields an acceptable increase in the safety factor.
Abstract: This applied research presents the improvement of production quality using six sigma solutions and benefit-cost ratio analysis. The case of interest is the production of concrete tiles. This production faced a high rate of nonconforming products caused by inappropriate surface coating and had low process capability with respect to tile strength; surface coating and tile strength are the most critical-to-quality characteristics of this product. The improvements followed the five stages of six sigma. After the improvement, the production yield reached the required 80% target, and the proportion of defective products from the coating process was remarkably reduced from 29.40% to 4.09%. The process capability based on strength quality, as specified by the customer, increased from 0.87 to 1.08. The improvement saved material losses of 3.24 million baht (about 0.11 million dollars). The benefits of the improvement were analyzed from (1) the reduction in the number of nonconforming tiles, valued at factory price, for the surface-coating improvement and (2) the materials saved from the increase in process capability. The benefit-cost ratio of the overall improvement was as high as 7.03. The investment yielded no return during the define, measure, analyze, and early improve stages, after which the ratio kept increasing. This is because there are no benefits in the define, measure, and analyze stages of six sigma, since these three stages mainly determine the cause of the problem and its effects rather than improve the process. The benefit-cost ratio only begins to accrue in the improve stage and grows thereafter. Within each stage, the individual benefit-cost ratio was much higher than the accumulative one, as costs accumulate from the first stage of six sigma. Considering the benefit-cost ratio during an improvement project helps in making cost-saving decisions for similar activities during the improvement and for new projects. In conclusion, determining the behavior of the benefit-cost ratio throughout the six sigma implementation period provides useful data for managing quality improvement with optimal effectiveness. This is an additional outcome beyond the regular proceeds of six sigma.
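The stage-wise behavior described above (no benefit in the define, measure, and analyze stages, a ratio appearing only from the improve stage, and individual ratios exceeding the accumulative ones because costs accrue from the start) can be sketched numerically. All figures below are hypothetical illustrations, not the paper's data:

```python
# Hypothetical illustration of per-stage vs. cumulative benefit-cost ratios
# across the five DMAIC stages. Define/measure/analyze accrue cost but no
# benefit, so the cumulative ratio only starts rising in the improve stage.

stages   = ["define", "measure", "analyze", "improve", "control"]
costs    = [10.0, 15.0, 20.0, 30.0, 25.0]    # cost incurred in each stage (made up)
benefits = [ 0.0,  0.0,  0.0, 400.0, 300.0]  # benefit realized in each stage (made up)

cum_cost = cum_benefit = 0.0
for stage, c, b in zip(stages, costs, benefits):
    cum_cost += c
    cum_benefit += b
    stage_bcr = b / c                    # individual (per-stage) ratio
    cum_bcr = cum_benefit / cum_cost     # accumulative ratio
    print(f"{stage:8s} stage B/C = {stage_bcr:6.2f}  cumulative B/C = {cum_bcr:5.2f}")
```

With these made-up numbers the improve stage's individual ratio (400/30) is much higher than its accumulative ratio (400/75), mirroring the pattern the abstract reports.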
Abstract: Aluminum bronze matrix alumina composites produced by hot pressing and resin infiltration were investigated with respect to porosity, hardness, bending strength, and microstructure. The experimental results show that the hardness of the sintered composites increases as porosity decreases. The composites without and with resin infiltration exhibit hardnesses of about HRF 42-61 at porosities of about 34-40% and about HRF 62-83 at porosities of about 30-36%, respectively. In addition, composites containing larger amounts of iron and nickel powders show lower bending strength, owing to weaker bonding forming among the iron, nickel, copper, and aluminum under the shorter hot-pressing time.
Abstract: This paper presents a compact thermoelectric power generator system based on the temperature difference across its elements. The system converts the heat of burning directly into electric energy. The proposed system comprises a thermoelectric generator and a power control box. The generator contains 4 thermoelectric modules (TEMs), each of which uses 2 thermoelectric chips (TEs) and 2 cold sinks, 1 thermal absorber, and 1 thermal-conduction flat board. The power control box contains 1 energy-storage device, 1 converter, and 1 inverter. The total net generating power is about 11 W. The system uses commercial portable gas stoves, or burns timber or coal, as the heat source, which is easily obtained. It adopts solid-state thermoelectric chips as the heat-conversion parts. The system has the advantages of being lightweight, quiet, and mobile, requiring no maintenance, and having an easily supplied heat source. The system can be used as long as burning is allowed. It works well in highly mobile outdoor situations, providing power for illumination, entertainment equipment, or wireless equipment at a refuge. Under heavy storms such as typhoons, when solar panels become ineffective and wind-powered machines malfunction, the thermoelectric power generator can continue to provide vital power.
Abstract: Histogram equalization is often used in image enhancement, but it can also be used in auto exposure. However, conventional histogram equalization does not work well when many pixels are concentrated in a narrow luminance range. This paper proposes an auto exposure method based on 2-way histogram equalization. Two cumulative distribution functions are used: one runs from dark to bright and the other from bright to dark. The proposed auto exposure method is also designed and implemented for image signal processors handling full-HD images.
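The two cumulative distribution functions underlying the 2-way approach can be sketched as follows. This is a minimal illustration of the two CDFs only, not the paper's full auto-exposure algorithm, and the pixel values are made up:

```python
# Minimal sketch of the two cumulative distribution functions used in
# 2-way histogram equalization: one accumulated from dark to bright and
# one from bright to dark.

def histogram(pixels, levels=256):
    h = [0] * levels
    for p in pixels:
        h[p] += 1
    return h

def two_way_cdfs(pixels, levels=256):
    h = histogram(pixels, levels)
    n = float(len(pixels))
    cdf_fwd, cdf_bwd = [0.0] * levels, [0.0] * levels
    acc = 0
    for i in range(levels):               # dark -> bright
        acc += h[i]
        cdf_fwd[i] = acc / n
    acc = 0
    for i in range(levels - 1, -1, -1):   # bright -> dark
        acc += h[i]
        cdf_bwd[i] = acc / n
    return cdf_fwd, cdf_bwd

pixels = [10, 10, 12, 200, 220, 250]      # toy luminance samples
fwd, bwd = two_way_cdfs(pixels)
print(fwd[12], bwd[200])  # fraction at or below 12, fraction at or above 200
```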
Abstract: This study performs a comparative analysis of the 21 Greek universities in terms of the public funding awarded to cover their operating expenditure. First, it introduces a DEA/MCDM model that allocates the funds among four expenditure factors in the most favorable way for each university. Then, it presents a common, consensual assessment model to reallocate the amounts while remaining at the same level of total public budget. The analysis shows that a number of universities cannot justify their public funding in terms of their size and operational workload; for these, an appropriate reduction of the public funding amount is estimated as a future target. Owing to the lack of precise data for a number of expenditure criteria, the analysis is based on a mixed crisp-ordinal data set.
Abstract: Text categorization is the problem of classifying text documents into a set of predefined classes. In this paper, we investigate three approaches to building a meta-classifier in order to increase classification accuracy. The basic idea is to learn a meta-classifier that optimally selects the best component classifier for each data point. The experimental results show that combining classifiers can significantly improve classification accuracy and that our meta-classification strategy gives better results than each individual classifier. On 7083 Reuters text documents we obtained classification accuracies of up to 92.04%.
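The idea of learning which component classifier to trust for each data point can be sketched as below. The component classifiers and data are toy stand-ins, not the paper's classifiers or the Reuters corpus, and the 1-nearest-neighbour meta-learner is an assumed choice for illustration:

```python
# Hedged sketch of meta-classification: for each training point, record
# which component classifier predicts it correctly, then use a 1-nearest-
# neighbour meta-classifier to pick a component classifier for new points.

def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def clf_a(x):  # toy component classifier: thresholds the first feature
    return 1 if x[0] > 0.5 else 0

def clf_b(x):  # toy component classifier: thresholds the second feature
    return 1 if x[1] > 0.5 else 0

classifiers = [clf_a, clf_b]

# Training data: (features, true label)
train = [((0.9, 0.1), 1), ((0.1, 0.9), 1), ((0.2, 0.1), 0), ((0.1, 0.2), 0)]

# Meta-labels: index of a component classifier that is correct on each point
meta = []
for x, y in train:
    best = next((i for i, c in enumerate(classifiers) if c(x) == y), 0)
    meta.append((x, best))

def meta_classify(x):
    # choose the component classifier selected for the nearest training point
    _, best = min(meta, key=lambda m: dist(m[0], x))
    return classifiers[best](x)

print(meta_classify((0.8, 0.2)))  # nearest to (0.9, 0.1), so clf_a is used
```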
Abstract: This paper presents a simplified version of Data Envelopment Analysis (DEA), a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual perfect object, one having the greatest outputs and smallest inputs. This allows an explicit analytical solution and takes a step toward absolute efficiency. This paper develops the approach further and introduces a DEA model with Partially Perfect Objects (DEA PPO), which consecutively eliminates the smallest relative inputs or greatest relative outputs and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are combined into a weighted efficiency score. The computational scheme remains as simple as that of DEA PO, but DEA PPO has the advantage of taking into account all of the inputs and outputs of each actual object. Firm evaluation is considered as an example.
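The construction of the virtual perfect object can be sketched as below. The scoring ratio used here is a deliberately simple illustration with equal weights, not the paper's exact DEA PO formula, and the object data are made up:

```python
# Hedged sketch of the "perfect object" idea: augment the actual objects
# with a virtual object having the greatest outputs and smallest inputs,
# then score each actual object against it.

objects = {                        # name: (inputs, outputs), made-up data
    "A": ([4.0, 2.0], [10.0, 6.0]),
    "B": ([3.0, 5.0], [12.0, 4.0]),
    "C": ([6.0, 3.0], [ 9.0, 8.0]),
}

n_in, n_out = 2, 2
perfect_in  = [min(o[0][i] for o in objects.values()) for i in range(n_in)]
perfect_out = [max(o[1][r] for o in objects.values()) for r in range(n_out)]

def efficiency(inputs, outputs):
    # average output attainment relative to the perfect object, multiplied
    # by average input frugality relative to it (equal weights, illustrative)
    out_score = sum(o / p for o, p in zip(outputs, perfect_out)) / n_out
    in_score  = sum(p / i for p, i in zip(perfect_in, inputs)) / n_in
    return out_score * in_score

for name, (ins, outs) in objects.items():
    print(name, round(efficiency(ins, outs), 3))
```

By construction the perfect object itself scores exactly 1, and every actual object scores at most 1, which is the property that makes the perfect object a fixed benchmark.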
Abstract: Conventional binding methods for low power in high-level synthesis mainly focus on finding an optimal binding for an assumed input data set, and obtain only one binding table. In this paper, we show that a binding method using multiple binding tables obtains better solutions than conventional methods using a single binding table, and we propose a dynamic bus binding scheme for low power based on multiple binding tables. The proposed method finds multiple binding tables for suitable partitions of the input data and switches binding tables dynamically to produce the minimum total switching activity. Experimental results show that the proposed method obtains binding solutions with 12.6-28.9% smaller total switching activity than the conventional methods.
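The quantity being minimized, total switching activity on the bus, can be made concrete with a small sketch. The data values and the 8-bit bus width are made-up examples, not from the paper; the point is only that different bindings of the same data produce different flip counts:

```python
# Switching activity on a bus: the number of bit flips (Hamming distance)
# between consecutive words driven on the bus, summed over the sequence.

def bit_flips(a, b, width=8):
    # Hamming distance between two bus words of the given width
    return bin((a ^ b) & ((1 << width) - 1)).count("1")

def total_switching(seq, width=8):
    return sum(bit_flips(a, b, width) for a, b in zip(seq, seq[1:]))

# Two candidate orderings of values on a shared bus: the choice of which
# value is driven on which transfer changes the total switching activity.
binding_1 = [0b00001111, 0b11110000, 0b00001111]
binding_2 = [0b00001111, 0b00001110, 0b00001111]
print(total_switching(binding_1), total_switching(binding_2))  # prints: 16 2
```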
Abstract: The blood pulse is an important human physiological signal commonly used for understanding an individual's physical health. Current methods of non-invasive blood pulse sensing require direct contact with, or access to, the human skin. As such, the performance of these devices tends to vary with time and is susceptible to human body fluids (e.g. blood, perspiration, and skin oil) and environmental contaminants (e.g. mud, water, etc.). This paper proposes a simulation model for a novel method of non-invasive acquisition of the blood pulse using the disturbance created by blood flowing through a localized magnetic field. The two-dimensional model geometry represents a blood vessel, a permanent magnet, a magnetic sensor, surrounding tissues, and air. In this model, the velocity and pressure fields in the blood stream are described by the Navier-Stokes equations, and the walls of the blood vessel are assumed to have a no-slip condition. The blood flow assumes a parabolic profile, corresponding to laminar flow in a major artery near the skin, and the inlet velocity follows a sinusoidal equation. This allows the computational software to compute the interactions between the magnetic vector potential generated by the permanent magnet and the magnetic nanoparticles in the blood. These interactions are simulated based on the Maxwell equations at the location where the magnetic sensor is placed. The simulated magnetic field at the sensor location is found to assume sinusoidal waveform characteristics similar to those of the inlet velocity of the blood. The amplitudes of the simulated waveforms at the sensor location were compared with physical measurements on human subjects and found to be highly correlated.
Abstract: The objective of this paper is to estimate realistic
principal extrusion process parameters by means of artificial neural
network. Conventionally, finite element analysis is used to derive
process parameters. However, the finite element analysis of the
extrusion model does not consider the manufacturing process
constraints in its modeling. Therefore, the process parameters
obtained through such an analysis remains highly theoretical.
Alternatively, process development in industrial extrusion is to a
great extent based on trial and error and often involves full-size
experiments, which are both expensive and time-consuming. The
artificial neural network-based estimation of the extrusion process
parameters prior to plant execution helps to make the actual extrusion
operation more efficient because more realistic parameters may be
obtained, thus bridging the gap between simulation and the real
manufacturing execution system. In this work, a suitable neural
network is designed which is trained using an appropriate learning
algorithm. The network so trained is used to predict the
manufacturing process parameters.
Abstract: Electrospinning is a broadly used technology to obtain
polymeric nanofibers ranging from several micrometers down to
several hundred nanometers for a wide range of applications. It offers
unique capabilities to produce nanofibers with controllable porous
structure. With smaller pores and higher surface area than regular
fibers, electrospun fibers have been successfully applied in various
fields, such as nanocatalysis, tissue engineering scaffolds, protective
clothing, filtration, biomedical, pharmaceutical, optical electronics,
healthcare, biotechnology, defense and security, and environmental
engineering. In this study, polyurethane nanofibers were obtained
under different electrospinning parameters. Fiber morphology and
diameter distribution were investigated in order to understand them
as functions of the process parameters.
Abstract: An optical fiber Fabry-Perot interferometer (FFPI) is
proposed and demonstrated for dynamic measurements of a
mechanical vibrating target. A polished metal with low reflectance
value adhered to a mechanical vibrator was excited via a function
generator at various excitation frequencies. Output interference
fringes were generated by modulating the reference and sensing
signal at the output arm. A fringe-counting technique was used for
interpreting the displacement information on the dedicated computer.
The fiber interferometer was found capable of displacement
measurements over the range 1.28 μm – 96.01 μm. A commercial
displacement sensor was employed as a reference sensor for
investigating the measurement errors from the fiber sensor. A
maximum percentage measurement error of approximately 1.59 %
was obtained.
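The fringe-counting principle behind the displacement readout can be sketched as below. In a two-beam interferometer each full fringe corresponds to a half-wavelength change in optical path; the 1550 nm source wavelength used here is an assumed value, not stated in the abstract:

```python
# Hedged sketch of fringe counting: displacement = N * lambda / 2,
# where N is the number of interference fringes counted.

WAVELENGTH_NM = 1550.0  # assumed source wavelength (not given in the abstract)

def displacement_um(fringe_count):
    # convert counted fringes to displacement in micrometers
    return fringe_count * (WAVELENGTH_NM / 2.0) / 1000.0

print(displacement_um(10))  # 10 fringes -> 7.75 um at the assumed wavelength
```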
Abstract: This paper reports a case study on how a conceptual and analytical thinking approach was used in the Art and Design Department at Multimedia University (Malaysia) to address the issues of one nation and its impact on society through artworks. The art project was designed for students to increase their know-how and develop creative thinking in design and communication. The goals of the design project were: (1) to develop creative thinking in design and communication, (2) to increase student understanding of the process of problem solving for design work, and (3) to use design elements and principles to generate interest, attention, and emotional responses. An exhibition entitled "One Nation" was showcased to local and international viewers consisting of the general public, professionals, academics, artists, and students. Findings indicate that the project supported several visual art standards as well as generated awareness in society. This project may be of interest to current and future art educators and others interested in the potential of using global issues as content for art, community, and environment studies for educational purposes in art.
Abstract: This paper introduces CBCTL, a temporal epistemic logic based on computational tree logic (CTL) that updates agents' belief states through communications among them. In practical environments, communication channels between agents may not be secure, and in bad cases agents might suffer blackouts. In this study, we provide an inform* protocol based on the FIPA ACL and declare the presence of secure channels between two agents as a time-dependent property. Thus, the belief state of each agent is updated along with the progress of time. We present a prover, i.e., a reasoning system that, given a formula and a situation of an agent, returns a proof if the formula is directly provable or can be validated through chains of communications.
Abstract: The design of a modern aircraft rests on three pillars: theoretical results, experimental tests, and computational simulations. As a result, Computational Fluid Dynamics (CFD) solvers are widely used in the aeronautical field. These solvers require the correct selection of many parameters in order to obtain successful results, and the computational time spent on a simulation depends on the proper choice of these parameters. In this paper we create an expert system capable of accurately predicting the number of iterations and the time required for the convergence of a CFD solver. An artificial neural network (ANN) has been used to design the expert system, and the developed system is shown to predict both quantities accurately.
Abstract: Conventionally, the selection of parameters depends heavily on the operator's experience or on conservative technological data provided by EDM equipment manufacturers, which yields inconsistent machining performance; the parameter settings given by the manufacturers are relevant only to common steel grades, and a single parameter change influences the process in a complex way. Hence, the present research proposes artificial neural network (ANN) models for the prediction of surface roughness of the Ti-15-3 alloy, machined here for the first time, in the electrical discharge machining (EDM) process. The proposed models use peak current, pulse-on time, pulse-off time, and servo voltage as input parameters. Multilayer perceptron (MLP) feedforward networks with up to three hidden layers are applied, and models with different hidden-layer configurations are assessed. The models are trained on data from an extensive series of experiments using a copper electrode with positive polarity. The predictions of the developed models have been verified with another set of experiments and found to be in good agreement with the experimental results. Beyond this, the models can serve as valuable tools for EDM process planning.
Abstract: In this paper we compare the responses of linear and nonlinear neural-network-based prediction schemes for predicting the received Signal-to-Interference Power Ratio (SIR) in Direct Sequence Code Division Multiple Access (DS/CDMA) systems. The nonlinear predictor is a Multilayer Perceptron (MLP) and the linear predictor is an Adaptive Linear (Adaline) predictor. We address the problem of complexity by using the Minimum Mean Squared Error (MMSE) principle to select the optimal predictors. The optimized Adaline predictor is compared with the optimized MLP using noisy Rayleigh fading signals with a 1.8 GHz carrier frequency in an urban environment. The results show that the Adaline predictor estimates SIR with the same error as the MLP when the user velocity is 5 km/h or 60 km/h, but when the velocity increases to 120 km/h the mean squared error of the MLP is twice that of the Adaline predictor. This makes the Adaline predictor, with its lower complexity, more suitable than the MLP for closed-loop power control, where efficient and accurate identification of the time-varying inverse dynamics of the multipath fading channel is required.
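The low-complexity linear predictor in this comparison can be sketched as an adaptive linear filter trained with the LMS rule. The signal, filter order, and step size below are illustrative choices, not the paper's DS/CDMA configuration:

```python
# Hedged sketch of an Adaline-style one-step-ahead predictor trained with
# the LMS rule: predict x[n] from the previous `order` samples and adapt
# the weights from the prediction error.

import math

def lms_predict(signal, order=4, mu=0.1):
    w = [0.0] * order
    errors = []
    for n in range(order, len(signal)):
        x = signal[n - order:n]
        y = sum(wi * xi for wi, xi in zip(w, x))   # linear prediction
        e = signal[n] - y                          # prediction error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # LMS weight update
        errors.append(e * e)
    return w, errors

# Toy narrowband signal standing in for a slowly fading channel quantity
signal = [math.sin(0.3 * n) for n in range(400)]
w, errors = lms_predict(signal)
early = sum(errors[:50]) / 50
late = sum(errors[-50:]) / 50
print(early, late)  # the squared error shrinks as the predictor adapts
```

The appeal noted in the abstract is exactly this structure: one inner product and one weight update per sample, far cheaper than evaluating and retraining an MLP.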
Abstract: Supplier selection is a multi-criteria decision-making process that comprises tangible and intangible factors. The majority of previous supplier selection techniques do not consider the strategic perspective, and uncertainty is one of the most important obstacles in supplier selection. For the first time, this paper uses the idea of the knapsack algorithm to select suppliers. Moreover, an attempt has been made to take advantage of a simple numerical method for solving the model. This is an innovative way to resolve ambiguity in choosing suppliers. The model selects suppliers in a competitive environment according to all desired quality and quantity standards, and an industrial sample has been used to show the efficiency of the model.
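The knapsack framing of supplier selection can be sketched with the classic 0/1 dynamic program. The costs, scores, and budget below are illustrative stand-ins, not the paper's data or its exact model:

```python
# Hedged sketch of the 0/1 knapsack idea applied to supplier selection:
# each supplier has a cost (weight) and a composite score (value); choose
# the subset with maximal total score within the purchasing budget.

def knapsack(costs, scores, budget):
    # classic dynamic program: best[b] = max score achievable with budget b
    best = [0] * (budget + 1)
    for c, s in zip(costs, scores):
        for b in range(budget, c - 1, -1):  # iterate downward: each item used once
            best[b] = max(best[b], best[b - c] + s)
    return best[budget]

costs  = [3, 4, 5, 6]   # e.g. contract cost per supplier (made up)
scores = [4, 5, 7, 8]   # e.g. composite quality/quantity score (made up)
print(knapsack(costs, scores, 10))  # prints: 13  (suppliers with costs 4 and 6)
```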
Abstract: The heterogeneity of solid waste characteristics, as well as the complex processes taking place within the landfill ecosystem, motivated the implementation of soft computing methodologies such as artificial neural networks (ANN), fuzzy logic (FL), and their combination. The present work uses a hybrid ANN-FL model that employs knowledge-based FL to describe the process qualitatively and implements the learning algorithm of ANN to optimize the model parameters. The model was developed to simulate and predict landfill gas production at a given time based on operational parameters. The experimental data used were compiled from a lab-scale experiment that involved various operating scenarios. The developed model was validated and statistically analyzed using an F-test, linear regression between actual and predicted data, and mean-squared-error measures. Overall, the simulated landfill gas production rates demonstrated reasonable agreement with the actual data. The discussion focuses on the effect of the size of the training datasets and the number of training epochs.