Abstract: This paper analytically investigates the 3D flow pattern at the confluence of two rectangular channels meeting at a 90° angle, using the Navier-Stokes equations with the Reynolds Stress Turbulence Model (RSM). The equations are solved by the Finite-Volume Method (FVM) and the flow is analyzed under steady-state (single-phase) conditions. The experimental findings of Shumate were used to validate the model. Comparison of the simulated flow patterns with the experimental ones showed close agreement. The effect of the discharge ratio on the dimensions of the separation zone created in the main channel downstream of the confluence indicated an inverse relation: a decrease in discharge ratio entails an increase in the length and width of the separation zone. The study also found the model to be a powerful analytical tool for feasibility studies in hydraulic engineering projects.
Abstract: The highly nonlinear characteristics of drying processes have prompted researchers to seek new nonlinear control solutions. However, the relation between implementation complexity, on-line processing complexity, reliability of the control structure and controller performance is not well established. The present paper proposes high-performance nonlinear fuzzy controllers for real-time operation of a drying machine, developed with a consistent balance among those issues. A PCI-6025E data acquisition device from National Instruments® was used, and the control system was fully designed in the MATLAB®/Simulink environment. Drying parameters, namely relative humidity and temperature, were controlled through MIMO hybrid bang-bang+PI (BPI) and four-dimensional fuzzy logic (FLC) real-time controllers to perform drying tests on biological materials. The performance of the drying strategies was compared through several criteria, which are reported without controller retuning. The performance analysis showed much better performance of the FLC than of the BPI controller: the absolute errors were lower than 8.85% for the fuzzy logic controller, about three times lower than the experimental results with BPI control.
Abstract: Grid computing is a group of clusters connected over
high-speed networks that involves coordinating and sharing
computational power, data storage and network resources operating
across dynamic and geographically dispersed locations. Resource
management and job scheduling are critical tasks in grid computing.
Resource selection becomes challenging due to heterogeneity and
dynamic availability of resources. Job scheduling is an NP-complete problem, and different heuristics may be used to reach an optimal or near-optimal solution. This paper proposes a model for resource selection and job scheduling in a dynamic grid environment. The main focus is to maximize resource utilization and minimize the processing time of jobs. The grid resource selection strategy is based on a Max Heap Tree (MHT), which is well suited to large-scale applications; the root node of the MHT is selected for job submission. A job grouping concept is used to maximize resource utilization when scheduling jobs in grid computing. The proposed resource selection model and job grouping concept are used to enhance the scalability, robustness, efficiency and load-balancing ability of the grid.
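The MHT-based selection and job-grouping ideas can be sketched as follows. This is a minimal illustration, not the paper's actual data model: the resource attribute (`mips`), the scoring rule, and the grouping capacity are hypothetical assumptions. Python's `heapq` is a min-heap, so scores are negated to emulate a max heap.

```python
import heapq

def build_max_heap(resources):
    # heapq is a min-heap; negate the capability score (here, hypothetical
    # MIPS ratings) so the most capable resource sits at the root.
    heap = [(-r["mips"], r["name"]) for r in resources]
    heapq.heapify(heap)
    return heap

def select_root(heap):
    # The root of the Max Heap Tree is the resource chosen for job submission.
    score, name = heap[0]
    return name, -score

def group_jobs(jobs_mi, capacity_mi):
    # Group fine-grained jobs until a group reaches the selected resource's
    # capacity, reducing per-job scheduling overhead and idle time.
    groups, current, total = [], [], 0
    for mi in jobs_mi:
        if current and total + mi > capacity_mi:
            groups.append(current)
            current, total = [], 0
        current.append(mi)
        total += mi
    if current:
        groups.append(current)
    return groups

resources = [{"name": "R1", "mips": 500},
             {"name": "R2", "mips": 1200},
             {"name": "R3", "mips": 800}]
heap = build_max_heap(resources)
root, mips = select_root(heap)                      # R2, the most capable
groups = group_jobs([300, 400, 500, 200, 600], capacity_mi=1000)
```

The grouping pass is a simple first-fit sketch; the paper's actual grouping policy may differ.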
Abstract: Iterative learning control aims to achieve zero tracking
error of a specific command. This is accomplished by iteratively
adjusting the command given to a feedback control system, based on
the tracking error observed in the previous iteration. One would like
the iterations to converge to zero tracking error in spite of any error
present in the model used to design the learning law. First, the need for stability robustness is discussed, followed by the need for robustness of the property that the transients are well behaved. Methods of producing the needed robustness to parameter variations and to singular perturbations are presented. Then a method involving reverse-time runs is given that lets the real-world behavior produce the ILC gains in such a way as to eliminate the need for a mathematical model. Since the real world is producing the gains, there is no issue of model error. Provided the real world behaves linearly, the approach gives an ILC law with both stability robustness and good transient robustness, without the need to generate a model.
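The basic iteration-to-iteration learning mechanism described above can be illustrated with a minimal numerical sketch. The first-order plant, the constant learning gain of 0.8, and the step-like desired trajectory are illustrative assumptions, not taken from the paper:

```python
def run_plant(u, a=0.5, b=1.0):
    # Hypothetical first-order discrete plant: x[t] = a*x[t-1] + b*u[t], y = x.
    x, y = 0.0, []
    for ut in u:
        x = a * x + b * ut
        y.append(x)
    return y

y_d = [1.0] * 10                 # desired trajectory (hypothetical)
u = [0.0] * 10                   # initial command
for _ in range(30):              # repeated runs, learning from the previous error
    e = [yd - yt for yd, yt in zip(y_d, run_plant(u))]
    # ILC update: next command = current command + gain * previous error.
    u = [ut + 0.8 * et for ut, et in zip(u, e)]
max_err = max(abs(yd - yt) for yd, yt in zip(y_d, run_plant(u)))
```

For this plant the error propagates as e_{k+1} = (I - 0.8 G) e_k with G lower triangular and unit diagonal, so the iteration contracts and `max_err` shrinks toward zero, mirroring the zero-tracking-error goal stated above.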
Abstract: Solid-state fermentation of cassava peel, with emphasis on protein enrichment using Trichoderma viride, was evaluated. The effect of five variables (moisture content, pH, particle size (p), nitrogen source and incubation temperature) on the true protein and total sugars of cassava peel was investigated. The optimum fermentation period was established to be 8 days. Total sugars were 5-fold higher at pH 6 relative to pH 4, and 7-fold higher when cassava peels were fermented at 30 °C relative to 25 °C, as well as when using ammonium sulfate as the nitrogen source relative to urea or a combination of both. Total sugars ranged from 123.21 mg/g at 50% initial moisture content to 374 mg/g at 60%, and from 190.59 mg/g with a particle size range of 2.00 > p > 1.41 mm to 310.10 mg/g with 4.00 > p > 3.35 mm. True protein ranged from 229.70 mg/g at pH 4 to 284.05 mg/g at pH 6; from 200.87 mg/g with urea as nitrogen source to 254.50 mg/g with ammonium sulfate; from 213.82 mg/g at 50% initial moisture content to 254.50 mg/g at 60% moisture content; from 205.75 mg/g in cassava peel with 5.6 > p > 4.75 mm to 268.30 mg/g in cassava peel with particle size 4.00 > p > 3.35 mm; and from 207.57 mg/g at 25 °C to 254.50 mg/g at 30 °C. Cassava peel with particle size 4.00 > p > 3.35 mm and initial moisture content of 60%, at pH 6.0 and 30 °C incubation temperature with ammonium sulfate (10 g N/kg substrate), was most suitable for protein enrichment with Trichoderma viride. Crude protein increased from 4.21% in unfermented cassava peel samples to 10.43% in fermented samples.
Abstract: The characteristics of fluid flow and phase separation
in an oil-water separator were numerically analysed as part of the
work presented herein. Simulations were performed for different velocities and droplet diameters, and the way these parameters can influence the separator geometry was studied.
The simulations were carried out using the software package
Fluent 6.2, which is designed for numerical simulation of fluid flow
and mass transfer. The model consisted of a cylindrical horizontal
separator. A tetrahedral mesh was employed in the computational
domain. The condition of two-phase flow was simulated with the
two-fluid model, taking into consideration turbulence effects using
the k-ε model.
The results showed a strong dependency of phase separation on mixture velocity and droplet diameter. An increase in mixture velocity slows phase separation and consequently requires a weir of greater height, while an increase in droplet diameter produces better phase separation. The simulations agree with results reported in the literature and show that CFD can be a useful tool in studying a horizontal oil-water separator.
Abstract: This paper introduces a tool that is being developed for the expression of information security policy controls that govern electronic healthcare records. By reference to published findings, the paper introduces the theory behind the use of knowledge management for automatic and consistent security policy assertion using the formalism called the Secutype; the development of the tool and its functionality are discussed; some examples of Secutypes generated by the tool are provided; and the proposed integration with existing medical record systems is described. The paper concludes with a section on further work and a critique of the work achieved to date.
Abstract: Flows over a harmonically oscillating NACA 0012
airfoil are simulated here using a two-dimensional, unsteady, incompressible Navier-Stokes solver. Both pure-plunging and combined pitching-plunging oscillations are considered at a Reynolds
number of 5000. Special attention is paid to the vortex shedding and
interaction mechanism of the motions. For all the simulations
presented here, the reduced frequency (k) is fixed at a value of 2.5
and plunging amplitude (h) is selected to be in the range of 0.2-0.5.
The simulation results show that the interaction mechanism between
the leading and trailing edge vortices has a decisive effect on the
values of the resulting thrust and propulsive efficiency.
Abstract: Metal stamping die design is a complex, experience-based and time-consuming task. Various artificial intelligence (AI)
techniques are being used by worldwide researchers for stamping die
design to reduce complexity, dependence on human expertise and
time taken in design process as well as to improve design efficiency.
In this paper a comprehensive review of applications of AI
techniques in manufacturability evaluation of sheet metal parts, die
design and process planning of metal stamping die is presented.
Further, the salient features of major research work published in the area of metal stamping are presented in tabular form, and the scope of future research work is identified.
Abstract: The study of the geometric shape of the plunging wave enclosed vortices as a possible indicator for the breaking intensity of ocean waves has been ongoing for almost 50 years with limited success. This paper investigates the validity of using the vortex ratio and vortex angle as methods of predicting breaking intensity. Previously published works on vortex parameters, based on regular wave flume results or solitary wave theory, present contradictory results and conclusions. Through the first complete analysis of field collected irregular wave breaking vortex parameters it is illustrated that the vortex ratio and vortex angle cannot be accurately predicted using standard breaking wave characteristics and hence are not suggested as a possible indicator for breaking intensity.
Abstract: Complex networks have been intensively studied across
many fields, especially in Internet technology, biological engineering,
and nonlinear science. Software is built up out of many interacting
components at various levels of granularity, such as functions, classes,
and packages, representing another important class of complex networks.
It can also be studied using complex network theory. Over the last decade, many papers on interdisciplinary research between software engineering and complex networks have been published. This research provides a different dimension to our understanding of software and is also very useful for the design and development of software systems. This paper explores how complex network theory can be used to analyze software structure, and briefly reviews the main advances in the corresponding areas.
Abstract: The RR interval series is non-stationary and unevenly spaced in time. Estimating its power spectral density (PSD) with traditional techniques such as the FFT requires resampling at uniform intervals, and researchers have used different interpolation techniques as resampling methods. All these resampling methods introduce a low-pass filtering effect in the power spectrum. The Lomb transform is a means of obtaining PSD estimates directly from the irregularly sampled RR interval series, thus avoiding resampling. In this work, the superiority of the Lomb transform method over the FFT-based approach (with linear and cubic-spline interpolation as resampling methods) is established, in terms of reproduction of the exact frequency locations as well as the relative magnitudes of each spectral component.
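The core idea of estimating PSD directly from unevenly sampled data can be sketched with a minimal normalized Lomb periodogram. The synthetic "RR-like" series below (a 0.3 Hz sinusoid at random sample times) is a hypothetical illustration, not the paper's data:

```python
import math
import random

def lomb_periodogram(t, y, freqs):
    # Normalized Lomb periodogram computed directly from the irregular
    # sample times t, with no resampling or interpolation.
    n = len(y)
    ybar = sum(y) / n
    var = sum((v - ybar) ** 2 for v in y) / (n - 1)
    power = []
    for f in freqs:
        w = 2.0 * math.pi * f
        # The offset tau makes the estimate invariant to time-origin shifts.
        tau = math.atan2(sum(math.sin(2 * w, ) if False else math.sin(2 * w * ti) for ti in t),
                         sum(math.cos(2 * w * ti) for ti in t)) / (2.0 * w)
        c = sum((yi - ybar) * math.cos(w * (ti - tau)) for ti, yi in zip(t, y))
        s = sum((yi - ybar) * math.sin(w * (ti - tau)) for ti, yi in zip(t, y))
        cc = sum(math.cos(w * (ti - tau)) ** 2 for ti in t)
        ss = sum(math.sin(w * (ti - tau)) ** 2 for ti in t)
        power.append((c * c / cc + s * s / ss) / (2.0 * var))
    return power

random.seed(1)
t = sorted(random.uniform(0, 60) for _ in range(200))   # uneven sampling
y = [math.sin(2 * math.pi * 0.3 * ti) for ti in t]      # 0.3 Hz component
freqs = [0.05 * k for k in range(1, 12)]                # 0.05 … 0.55 Hz
p = lomb_periodogram(t, y, freqs)
best = freqs[p.index(max(p))]                           # peaks near 0.3 Hz
```

Because no interpolation is involved, the spectral peak lands at the true frequency without the low-pass bias mentioned above.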
Abstract: In this paper, in order to investigate the effects of introducing photovoltaic systems to detached houses in Japan, two kinds of work were carried out. Firstly, the hourly generation amount of a 4.2 kW photovoltaic system was simulated in 46 cities, using a simulation model of the photovoltaic system, to investigate the potential of the system in different regions of Japan. Secondly, based on the simulated electricity generation, the energy-saving, environmental and economic effects of the photovoltaic system were examined from hourly to annual timescales, based upon calculations of typical electricity, heating, cooling and hot water supply load profiles for Japanese dwellings. The above analysis was carried out using a standard year's hourly weather data for each city, provided by the Expanded AMeDAS Weather Data issued by the AIJ (Architectural Institute of Japan).
Abstract: To investigate the possible correlation between peer aggression and peer victimization, 148 sixth-graders were asked to respond to the Reduced Aggression and Victimization Scales (RAVS). The RAVS measures the frequency of reported aggressive behaviors, or of being victimized, during the week prior to the survey. The scales are composed of six items each, and each point represents one instance of aggression or victimization. Specifically, the Pearson Product-Moment Correlation Coefficient (PMCC) was used to determine the correlations between the scores of the sixth-graders on the two scales, both for individual items and for total scores. Positive correlations were established, and the correlations were significant at the 0.01 level.
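The PMCC computation on paired scale totals can be sketched as follows; the six-student score lists are hypothetical stand-ins, not the study's data:

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation coefficient between two score lists:
    # covariance of the deviations divided by the product of their norms.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

aggression    = [2, 0, 5, 3, 1, 4]   # hypothetical RAVS aggression totals
victimization = [3, 1, 6, 2, 0, 5]   # hypothetical RAVS victimization totals
r = pearson_r(aggression, victimization)   # strongly positive here
```

A positive r close to 1 indicates that students who report more aggression also tend to report more victimization, which is the pattern the study found.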
Abstract: Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time with minimum cost and maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and managers would benefit from any scientific decision-support tool. Our work determines a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the desirable choice to run the project. In this paper, the project scheduling problem notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective, and the purpose is finding the Pareto-optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be done in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
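The notion of a Pareto-optimal front over (time, cost, quality) can be illustrated with a minimal dominance filter; the candidate schedules below are hypothetical, and this sketch shows only the dominance test, not the FastPGA search itself:

```python
def dominates(a, b):
    # a, b = (time, cost, quality); lower time/cost and higher quality are
    # better. a dominates b if it is no worse in all objectives and strictly
    # better in at least one.
    no_worse  = a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2]
    strictly  = a[0] < b[0] or a[1] < b[1] or a[2] > b[2]
    return no_worse and strictly

def pareto_front(solutions):
    # Keep every solution not dominated by any other: the Pareto-optimal set
    # from which the project manager picks a trade-off.
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

schedules = [(10, 100, 0.9), (12, 80, 0.9), (10, 100, 0.7), (15, 120, 0.8)]
front = pareto_front(schedules)   # the dominated third and fourth drop out
```

In a multi-objective GA such as FastPGA, a dominance test of this kind drives the ranking of candidate schedules at each generation.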
Abstract: The incidence of mechanical fracture of automobile piston rings prompted the development of a fracture analysis method for this case. The three rings (two compression rings and one oil ring) were smashed into several parts during the power test (after manufacturing of the engine), damaging the piston and liner. Radial and oblique cracking occurred on the failed piston rings. The aim of the fracture mechanics simulations presented in this paper was the calculation of particular effective fracture mechanics parameters, such as J-integrals and stress intensity factors. Crack propagation angles were calculated as well. A two-dimensional fracture analysis of the first compression ring was developed in this paper using ABAQUS CAE 6.5-1 software. Moreover, SEM fractography was performed on the fracture surfaces and is discussed in this paper. The results of the numerical calculations constitute the basis for further research on the real object.
Abstract: Research related to standard product models and the development of neutral manufacturing interfaces for numerically controlled machines has been a significant topic for the last 25 years. In this paper, a detailed description of STEP implementation for turn-mill manufacturing is presented. It shows the requirements on information content from the ISO 14649 data model, and describes the design of a STEP-NC framework applicable to turn-mill manufacturing. In the framework, the EXPRESS-G and UML modeling tools are used to depict the information content of the system and to establish the basis of the information model requirements, yielding a product and manufacturing data model applicable to STEP-compliant manufacturing. The next-generation turn-mill operation requirements are represented by a UML diagram. Object-oriented classes of ISO 14649 have been developed on the Visual Basic .NET platform to bind the static information model represented by the UML diagram. An architecture of the proposed system implementation is given on the basis of the design and manufacturing modules of the established STEP-NC interface. Finally, a Part 21 file process plan is generated as an illustration for turn-mill components.
Abstract: In this paper, a novel multi-join algorithm to join multiple relations is introduced. The algorithm is based on a hash-based join of two relations that produces a double index. This is done by scanning the two relations once; but instead of moving the records into buckets, a double index is built, which eliminates the collisions that can arise from a complete hash algorithm. The double index is divided into join buckets of similar categories from the two relations. The algorithm then joins buckets with similar keys to produce joined buckets, which leads, at the end, to a complete join index of the two relations without actually joining the actual relations. The time complexity required to build the join index of two categories is O(m log m), where m is the size of each category, totaling O(n log m) for all buckets. The join index is used to materialize the joined relation if required; otherwise, it is used along with the join indices of other relations to build a lattice for multi-join operations with minimal I/O requirements. The lattice of join indices can be fitted into main memory to reduce the time complexity of the multi-join algorithm.
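The idea of joining through an index rather than materializing the joined relation can be sketched roughly as follows. This is a simplification under stated assumptions: the two relations and key layout are hypothetical, and the paper's double-index and bucket structure is reduced to a key → (row ids of R, row ids of S) map:

```python
from collections import defaultdict

def build_join_index(r, s, key):
    # One scan of each relation: record, per join-key value, the row ids on
    # each side instead of moving whole records into hash buckets.
    index = defaultdict(lambda: ([], []))
    for i, row in enumerate(r):
        index[row[key]][0].append(i)
    for j, row in enumerate(s):
        index[row[key]][1].append(j)
    # Keep only keys present in both relations (the "joined buckets").
    return {k: v for k, v in index.items() if v[0] and v[1]}

def materialize(r, s, join_index):
    # The join index materializes the joined relation only when required.
    return [(r[i], s[j]) for ids in join_index.values()
            for i in ids[0] for j in ids[1]]

R = [{"id": 1, "k": "a"}, {"id": 2, "k": "b"}]
S = [{"id": 7, "k": "b"}, {"id": 8, "k": "c"}]
idx = build_join_index(R, S, "k")      # only key "b" survives
rows = materialize(R, S, idx)
```

Because the index stores only row ids, several such indices over different relation pairs can be kept in main memory and combined for multi-join planning, as the abstract's lattice suggests.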
Abstract: It is quite essential to form dialogue mechanisms and dialogue channels to solve intercultural communication issues. Therefore, every country should develop an intercultural education project which aims to resolve international communication issues. For proper mediation training, the first step is to reach an agreement on the actors to run the project. The strongest mediation mechanisms in the world should be analyzed and initiated within the educational policies. A communication-based mediation model should be developed for international mediation training. Mediators can use their persuasive communication skills as part of this model. First, the fundamental stages of the mediation training should be specified within the scope of the model. Another important topic at this point is common-sense and peace leaders acting as ombudsmen in this process. Especially for solving certain social issues and conflicts, common-sense leaders acting as ombudsmen would lead to effective communication. In mediation training run by universities and non-governmental organizations, another phase is to focus on conducting the meetings. In intercultural mediation training, one of the most critical topics is conducting the meeting traffic and performing shuttle diplomacy. Meeting traffic is where the mediator organizes and schedules meetings with the parties with initiative powers, in order to contribute to the solution of the issue. In this notice, titled "Intercultural mediation training and the training process of common sense leaders by the leadership of universities' communication and artistic campaigns", communication models and strategies on this topic will be constructed, and intercultural art activities and perspectives will be presented.
Abstract: This paper presents a new feature based dense stereo
matching algorithm to obtain the dense disparity map via dynamic
programming. After extraction of some proper features, we use some
matching constraints such as epipolar line, disparity limit, ordering
and limit of directional derivative of disparity as well. Also, a coarse-to-fine multiresolution strategy is used to decrease the search space and therefore increase the accuracy and processing speed. The proposed method links the detected feature points into chains and compares some of the feature points from different chains to increase the matching speed. We also employ color stereo matching
to increase the accuracy of the algorithm. Then, after feature matching, we use dynamic programming to obtain the dense disparity map. This differs from classical DP methods in stereo vision, since it employs the sparse disparity map obtained from the feature-based matching stage. The DP is then performed on each scan line, between any two matched feature points on that line. Thus our algorithm is truly an optimization method, and it offers a good trade-off between accuracy and computational efficiency. According to our experimental results, the proposed algorithm increases the accuracy by 20 to 70% and reduces the running time by almost 70%.