Abstract: IP networks are evolving from pure data-communication
infrastructure into platforms for real-time applications such as video
conferencing and IP telephony, which impose stringent Quality of
Service (QoS) requirements. A fundamental issue in QoS routing is
finding a path between a source-destination pair that satisfies two or
more end-to-end constraints, a problem known to be NP-complete. In
this context, we present an algorithm, Multi Constraint Path Problem
Version 3 (MCPv3), in which all constraints are approximated and a
feasible path is returned much more quickly. We present a second
algorithm, Delay Coerced Multi Constrained Routing (DCMCR),
which coerces one constraint and approximates the remaining
constraints. Our algorithm returns a feasible path, if one exists, in
polynomial time between a source-destination pair whose first weight
is satisfied by the first constraint and whose every other weight is
bounded by the remaining constraints within a predefined
approximation factor (a). We present experimental results for
different topologies and network conditions.
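The feasibility check at the heart of multi-constrained path routing can be illustrated with a simple heuristic: run a Dijkstra-style search on a linear combination of the two link weights and prune partial paths that already violate a constraint. The sketch below is a minimal illustration, not the MCPv3 or DCMCR algorithm from the paper; the graph, weights and constraints are invented for the example.

```python
import heapq

def constrained_path(graph, src, dst, c1, c2, alpha=0.5):
    """Search for a path whose two additive weights meet constraints c1
    and c2, by running a Dijkstra-style search on the combined metric
    alpha*w1 + (1-alpha)*w2 and pruning infeasible partial paths.
    Heuristic: may miss feasible paths in adversarial graphs.
    graph: {node: [(neighbor, w1, w2), ...]}"""
    pq = [(0.0, 0.0, 0.0, src, [src])]   # (combined, w1, w2, node, path)
    settled = {}
    while pq:
        comb, w1, w2, u, path = heapq.heappop(pq)
        if u == dst:
            return path, w1, w2          # both constraints hold by pruning
        if settled.get(u, float("inf")) <= comb:
            continue
        settled[u] = comb
        for v, a, b in graph.get(u, []):
            nw1, nw2 = w1 + a, w2 + b
            if nw1 > c1 or nw2 > c2:     # prune: already violates a constraint
                continue
            nc = alpha * nw1 + (1 - alpha) * nw2
            heapq.heappush(pq, (nc, nw1, nw2, v, path + [v]))
    return None                          # no feasible path found

g = {'s': [('a', 1, 5), ('b', 4, 1)], 'a': [('t', 1, 5)], 'b': [('t', 4, 1)]}
path, w1, w2 = constrained_path(g, 's', 't', c1=10, c2=5)
```

Because infeasible extensions are pruned at push time, any returned path satisfies both constraints; the trade-off is that dominance-based pruning can discard states that would have led to a feasible path, which is exactly why exact MCP is hard.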
Abstract: The most important subtype of non-Hodgkin's
lymphoma is Diffuse Large B-Cell Lymphoma. Approximately
40% of the patients suffering from it respond well to therapy,
whereas the remainder need a more aggressive treatment in order to
improve their chances of survival. Data Mining techniques have
helped to identify the class of the lymphoma in an efficient manner.
Despite that, thousands of genes must be processed to obtain the
results. This paper presents a comparison of various attribute
selection methods aiming to reduce the number of genes to be
searched, looking for a more effective procedure as a whole.
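Attribute selection of the kind compared here can be illustrated with a simple filter method: score each gene by how well it separates the two classes and keep only the top-ranked ones. The t-like score below is a generic stand-in, not one of the specific methods the paper evaluates, and the toy expression matrix is invented.

```python
from statistics import mean, stdev

def rank_genes(samples, labels, top_k=2):
    """Rank gene (feature) indices by a t-like separation score between
    two classes (labels 0/1) and keep the top_k most discriminative."""
    n_genes = len(samples[0])
    scores = []
    for g in range(n_genes):
        a = [row[g] for row, y in zip(samples, labels) if y == 0]
        b = [row[g] for row, y in zip(samples, labels) if y == 1]
        spread = (stdev(a) + stdev(b)) or 1e-9   # avoid division by zero
        scores.append((abs(mean(a) - mean(b)) / spread, g))
    return [g for _, g in sorted(scores, reverse=True)[:top_k]]

# Toy expression matrix: 4 samples x 3 genes; only gene 1 separates classes.
samples = [[0.0, 10.0, 5.0], [0.1, 11.0, 5.1], [0.0, 0.0, 5.0], [0.1, 1.0, 4.9]]
labels = [0, 0, 1, 1]
top = rank_genes(samples, labels, top_k=2)
```

The classifier is then trained only on the `top` gene columns, shrinking the search space from thousands of genes to a handful.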
Abstract: The ionization energy in nanoscale semiconductor
systems was investigated using the effective mass
approximation. After introducing the Hamiltonian of the system, the
variational technique was employed to calculate the ground-state
energy and the ionization energy of a donor at the center, and for the
case in which impurities are randomly distributed inside a cubic
quantum well. The
numerical results for GaAs/GaAlAs show that the ionization energy
strongly depends on the well width for both cases and it decreases as
the well width increases. The ionization energy of a quantum wire
was also calculated and compared with the results for the well.
Abstract: In this paper we study numerical methods for solving Sylvester matrix equations of the form AX + XB^T + CD^T = 0. A new projection method is proposed. The union of Krylov subspaces generated by A and its inverse, and the union of Krylov subspaces generated by B and its inverse, are used as the right and left projection subspaces, respectively. An Arnoldi-like process for constructing an orthonormal basis of the projection subspaces is outlined. We show that the approximate solution is an exact solution of a perturbed Sylvester matrix equation. Moreover, an exact expression for the norm of the residual is derived, and results on finite termination and convergence are presented. Numerical examples are presented to illustrate the effectiveness of the proposed method.
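For very small dense problems, the equation AX + XB^T + CD^T = 0 can be solved directly through its Kronecker formulation (I ⊗ A + B ⊗ I) vec(X) = −vec(CD^T), which gives a reference solution against which a projection method can be checked. The sketch below is this plain dense baseline, not the Krylov-subspace method proposed in the paper; the matrices are invented.

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def kron(A, B):
    """Kronecker product of two dense matrices (lists of lists)."""
    p, q = len(B), len(B[0])
    return [[A[i // p][j // q] * B[i % p][j % q]
             for j in range(len(A[0]) * q)] for i in range(len(A) * p)]

def gauss_solve(M, rhs):
    """Solve M x = rhs by Gaussian elimination with partial pivoting."""
    n = len(M)
    M = [row[:] + [rhs[i]] for i, row in enumerate(M)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def sylvester_dense(A, B, C, D):
    """Solve A X + X B^T + C D^T = 0 via (I (x) A + B (x) I) vec(X) = -vec(C D^T)."""
    m, n = len(A), len(B)
    I_m = [[float(i == j) for j in range(m)] for i in range(m)]
    I_n = [[float(i == j) for j in range(n)] for i in range(n)]
    K1, K2 = kron(I_n, A), kron(B, I_m)
    K = [[K1[i][j] + K2[i][j] for j in range(m * n)] for i in range(m * n)]
    F = matmul(C, transpose(D))
    vec_f = [-F[i][j] for j in range(n) for i in range(m)]  # column stacking
    x = gauss_solve(K, vec_f)
    return [[x[j * m + i] for j in range(n)] for i in range(m)]

A = [[2.0, 0.0], [0.0, 3.0]]
B = [[1.0, 0.0], [0.0, 1.0]]
C = [[1.0], [1.0]]
D = [[1.0], [1.0]]
X = sylvester_dense(A, B, C, D)
```

The Kronecker system has size mn × mn, which is exactly why projection onto small Krylov subspaces is needed for problems of realistic dimension.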
Abstract: The presence of a vertical edge-crack within a web
plate subjected to pure bending induces local compressive stresses
around the crack, which may cause tension buckling. Approximate
theoretical expressions were derived for the critical far-field tensile
stress and bending moment capacity of an edge-cracked web plate
associated with tension buckling. These expressions were validated
with finite element analyses and used to investigate the possibility of
tension buckling in web-cracked trial girders. It was found that
tension buckling is an unlikely occurrence unless the web is relatively
thin or the crack is very long.
Abstract: The Beshar River is one of the most important aquatic ecosystems in the upstream Karun watershed in southern Iran, and it is affected by point and non-point pollution sources. This study was carried out to evaluate the effects of polluting activities on the water quality of the Beshar River and its aquatic ecosystems. The river is approximately 190 km in length, situated between the geographical positions 51° 20´ to 51° 48´ E and 30° 18´ to 30° 52´ N, and is one of the most important aquatic ecosystems of Kohkiloye and Boyerahmad province in south-west Iran. In this research project, five study stations were selected to examine water pollution in the Beshar River system. Human activity is now one of the most important factors affecting the hydrology and water quality of the Beshar River. Humans use large amounts of resources to sustain various standards of living, although measures of sustainability are highly variable depending on how sustainability is defined. The Beshar River ecosystems are particularly sensitive and vulnerable to human activities. Therefore, to determine the impact of human activities on the Beshar River, the most important water quality parameters, such as pH, dissolved oxygen (DO), Biological Oxygen Demand (BOD5), Total Dissolved Solids (TDS), nitrates (NO3-N) and phosphates (PO4), were estimated at the five stations. As the results show, the most important pollution index parameters, such as BOD5, NO3 and PO4, increase, and DO and pH decrease, in line with human activities (P
Abstract: The problem of N cracks interaction in an isotropic
elastic solid is decomposed into a subproblem of a homogeneous solid
without a crack and N subproblems, each having a single crack
subjected to unknown tractions on the two crack faces. The unknown
tractions, namely pseudo tractions, on each crack are expanded into
polynomials with unknown coefficients, which have to be determined
by the consistency condition, i.e. by the equivalence of the original
multiple cracks interaction problem and the superposition of the N+1
subproblems. In this paper, Kachanov's approach of average tractions
is extended into the method of moments to approximately impose the
consistency condition. Hence Kachanov's method can be viewed as
the zero-order method of moments. Numerical results of the stress
intensity factors are presented for interactions of two collinear cracks,
three collinear cracks, two parallel cracks, and three parallel cracks.
As the order of moment increases, the accuracy of the method of
moments improves.
Abstract: This paper summarizes and compares approaches to
solving the knapsack problem and its well-known application in capital
budgeting. The first approach uses deterministic methods and can be
applied to small-size tasks with a single constraint. We can also
apply commercial software systems such as the GAMS modelling
system. However, because of NP-completeness of the problem, more
complex problem instances must be solved by means of heuristic
techniques to achieve an approximation of the exact solution in a
reasonable amount of time. We show the problem representation and
parameter settings for a genetic algorithm framework.
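For the small single-constraint instances mentioned above, the deterministic approach can be a textbook dynamic program. The sketch below uses invented project values and costs; it is an illustration of the exact method, not of the paper's GAMS model or genetic algorithm settings.

```python
def knapsack(values, weights, capacity):
    """Exact 0/1 knapsack by dynamic programming: dp[i][w] is the best
    value using the first i items within budget w."""
    n = len(values)
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(capacity + 1):
            dp[i][w] = dp[i - 1][w]
            if weights[i - 1] <= w:
                dp[i][w] = max(dp[i][w],
                               dp[i - 1][w - weights[i - 1]] + values[i - 1])
    # Backtrack to recover which projects were selected.
    chosen, w = [], capacity
    for i in range(n, 0, -1):
        if dp[i][w] != dp[i - 1][w]:
            chosen.append(i - 1)
            w -= weights[i - 1]
    return dp[n][capacity], sorted(chosen)

# Capital-budgeting reading: values = expected returns, weights = costs.
values, weights = [60, 100, 120], [10, 20, 30]
best, chosen = knapsack(values, weights, 50)
```

The O(n·capacity) table is pseudo-polynomial, which is why larger or multi-constraint instances push the paper toward heuristics such as genetic algorithms.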
Abstract: Mycophenolic acid (MPA) is a secondary metabolite
produced by Penicillium brevicompactum, which has antibiotic and
immunosuppressive properties. In the first step of this study,
mycophenolic acid was produced in a fermentation process by
Penicillium brevicompactum MUCL 19011 in shake flasks using a
base medium. The maximum MPA production, product yield and
process productivity were 1.379 g/L, 18.6 mg/g glucose and
4.9 mg/L·h, respectively. The glucose consumption, biomass and
MPA production profiles were also investigated during batch
cultivation. The results showed that MPA production starts
approximately after 180 hours and reaches a maximum at 280 h. In
the next step, the effects of various concentrations of enzymatically
hydrolyzed casein on MPA production were evaluated. Maximum
MPA production, product yield and productivity of 3.63 g/L, 49
mg/g glucose and 12.96 mg/L·h, respectively, were obtained using
30 g/L enzymatically hydrolyzed casein in the culture medium.
These values represent enhancements in MPA production, product
yield and process productivity of 116.8%, 132.8% and 163.2%,
respectively.
Abstract: A statistical optimization of the saccharification
process of EFB was studied. The statistical analysis was done by
applying a face-centered central composite design (FCCCD) under
response surface methodology (RSM). In this investigation, EFB
dose, enzyme dose and saccharification period were examined, and the
maximum 53.45% (w/w) yield of reducing sugar was found with 4%
(w/v) of EFB, 10% (v/v) of enzyme after 120 hours of incubation. It
can be calculated that the conversion rate of cellulose content of the
substrate is more than 75% (w/w) which can be considered as a
remarkable achievement. All the variables, linear, quadratic and
interaction coefficient, were found to be highly significant, other than
two coefficients, one quadratic and another interaction coefficient.
The coefficient of determination (R2) is 0.9898, which confirms a
satisfactory fit and indicates that approximately 98.98% of the
variability in the dependent variable, saccharification of EFB, could
be explained by this model.
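The reported R2 can be read directly as the fraction of variance explained by the fitted response surface. A minimal computation of the coefficient of determination (on invented data, not the paper's experimental runs) is:

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: the fraction of the variance in
    y_true explained by the model predictions y_pred."""
    m = sum(y_true) / len(y_true)
    ss_res = sum((a - b) ** 2 for a, b in zip(y_true, y_pred))
    ss_tot = sum((a - m) ** 2 for a in y_true)
    return 1.0 - ss_res / ss_tot
```

An R2 of 0.9898 therefore means the quadratic model leaves only about 1% of the variability in reducing-sugar yield unexplained.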
Abstract: Voice over IP (VoIP) is a technology that carries
voice traffic as data packets over an IP network. The network
may be an intranet or the Internet. Phone calls using VoIP can
cost less than half as much as PSTN calls, because the cost is
based on the cost of global Internet access. The Session Initiation
Protocol (SIP) is a signaling protocol at the application layer
which serves to establish, modify, and terminate multimedia
sessions involving one or more users. SIP signaling carries SIP
messages in text form that are used for session management by
the SIP components, such as the User Agent, Registrar, Redirect
Server, and Proxy Server. Building SIP communication requires a
SIP Express Router (SER) to receive SIP messages and handle
the basic SIP message functions. Problems occur when traffic
passes through NAT: voice communication may be blocked, with
either no sound sent or sound sent in one direction only (half
duplex). One way to penetrate NAT is to use a mediaproxy that
assigns a random RTP port for relaying the media.
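Because SIP messages are plain text, the session-management data they carry is easy to inspect. The sketch below parses the start line and headers of a sample INVITE; the addresses, tags and Call-ID are fabricated for illustration, following the standard SIP framing of start line, headers, blank line, then body.

```python
def parse_sip_message(raw):
    """Split a SIP message into its start line, a dict of headers
    (lower-cased names), and the body."""
    head, _, body = raw.partition("\r\n\r\n")
    lines = head.split("\r\n")
    headers = {}
    for line in lines[1:]:
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return lines[0], headers, body

invite = ("INVITE sip:bob@example.com SIP/2.0\r\n"
          "Via: SIP/2.0/UDP client.example.com;branch=z9hG4bK776asdhds\r\n"
          "From: Alice <sip:alice@example.com>;tag=1928301774\r\n"
          "To: Bob <sip:bob@example.com>\r\n"
          "Call-ID: a84b4c76e66710\r\n"
          "CSeq: 314159 INVITE\r\n"
          "Content-Length: 0\r\n\r\n")
start_line, headers, body = parse_sip_message(invite)
```

A proxy such as SER operates on exactly these fields (Via, From, To, Call-ID, CSeq) when routing and managing a session.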
Abstract: Statistical analysis of medical data often requires
special techniques because of the particularities of these data.
Principal components analysis and data clustering are two
statistical methods for data mining that are very useful in the
medical field, the first as a method to decrease the number of studied
parameters, and the second as a method to analyze the
connections between diagnosis and the data about the patient's
condition. In this paper we investigate the implications of a
specific data analysis technique: data clustering preceded by a
selection of the most relevant parameters, made using principal
components analysis. Our assumption was that applying principal
components analysis before data clustering, in order to select and
classify only the most relevant parameters, would improve the
accuracy of clustering; the practical results, however, showed the
opposite: the clustering accuracy decreases by a percentage
approximately equal to the percentage of information loss reported
by the principal components analysis.
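The "information loss reported by the principal components analysis" is the share of total variance carried by the discarded components. For two-dimensional data this can be computed in closed form from the 2×2 covariance matrix; the sketch below uses invented data and is only meant to make that quantity concrete.

```python
from math import sqrt
from statistics import mean

def pca_information_loss(points):
    """Fraction of total variance discarded when 2-D data is reduced to
    its first principal component (closed-form 2x2 eigenvalues)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mx, my = mean(xs), mean(ys)
    n = len(points) - 1
    sxx = sum((x - mx) ** 2 for x in xs) / n
    syy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam1 = (tr + sqrt(max(tr * tr - 4 * det, 0.0))) / 2   # larger eigenvalue
    return (tr - lam1) / tr                                # discarded share

# Perfectly isotropic data: reducing to one component loses half the variance.
loss = pca_information_loss([(1.0, 0.0), (-1.0, 0.0), (0.0, 1.0), (0.0, -1.0)])
```

The paper's observation is that the drop in clustering accuracy tracks this discarded-variance fraction.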
Abstract: This paper proposes classification models to be
used as a proxy for the hard disk drive (HDD) functional test,
which requires approximately two weeks to classify HDD status
as either "Pass" or "Fail". These models were constructed using a
committee network consisting of a number of single neural
networks. The paper also includes a method, called the "enforce
learning method", to address the sparseness of data for failed
parts. Our results reveal that the classification models constructed
with the proposed method perform well under sparse data
conditions, and thus the models, which need only a few seconds
for HDD classification, can be used to substitute for the HDD
functional tests.
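The committee idea can be sketched as a majority vote over several independently trained single networks. In the minimal illustration below the member networks are stand-in stub classifiers, since the trained models themselves (and the details of the "enforce learning method") are not reproduced here.

```python
from collections import Counter

def committee_predict(members, x):
    """Classify x by majority vote over the committee members; each
    member is any callable that maps an input to a class label."""
    votes = [m(x) for m in members]
    return Counter(votes).most_common(1)[0][0]

# Stub members standing in for trained single neural networks.
members = [lambda x: "Pass", lambda x: "Fail", lambda x: "Pass"]
label = committee_predict(members, object())
```

With an odd number of members the vote is never tied, and a committee tends to be more robust than any single member when the members' errors are not strongly correlated.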
Abstract: In this paper, we study statistical multiplexing of VBR
video in ATM networks. ATM promises to provide high-speed real-time
multi-point to central video transmission for telemedicine
applications in rural hospitals and in emergency medical services.
Video coders are known to produce variable bit rate (VBR) signals
and the effects of aggregating these VBR signals need to be
determined in order to design a telemedicine network infrastructure
capable of carrying these signals. We first model the VBR video
signal and simulate it using a generic continuous-data autoregressive
(AR) scheme. We carry out the queueing analysis by the Fluid
Approximation Model (FAM) and the Markov Modulated Poisson
Process (MMPP). The study has shown a trade-off: multiplexing
VBR signals reduces burstiness and improves resource utilization,
however, the buffer size needs to be increased with an associated
economic cost. We also show that the MMPP model and the Fluid
Approximation model fit best, respectively, the cell region and the
burst region. Therefore, a hybrid MMPP and FAM completely
characterizes the overall performance of the ATM statistical
multiplexer. The ramifications of this technology are clear: speed,
reliability (lower loss rate and jitter), and increased capacity in video
transmission for telemedicine. With migration to full IP-based
networks still a long way from achieving both high speed and high
quality of service, the proposed ATM architecture will remain of
significant use for telemedicine.
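The burstiness-reduction side of the trade-off is easy to reproduce: aggregate several independent AR(1) bit-rate traces and compare the coefficient of variation before and after multiplexing. The AR(1) parameters below are invented, a generic VBR stand-in rather than the paper's calibrated source model.

```python
import random
from statistics import mean, stdev

def ar1_stream(n, phi=0.8, mu=10.0, sigma=2.0, rng=None):
    """Generate an AR(1) bit-rate trace, a generic stand-in for a VBR
    video source (units arbitrary)."""
    rng = rng or random.Random(0)
    x, out = mu, []
    for _ in range(n):
        x = mu + phi * (x - mu) + rng.gauss(0.0, sigma)
        out.append(x)
    return out

def burstiness(trace):
    """Coefficient of variation (std/mean) as a simple burstiness measure."""
    return stdev(trace) / mean(trace)

rng = random.Random(42)
single = ar1_stream(2000, rng=rng)
# Multiplex 10 independent sources: per-slot aggregate bit rate.
aggregate = [sum(v) for v in zip(*[ar1_stream(2000, rng=rng) for _ in range(10)])]
```

The aggregate's relative fluctuations shrink roughly as 1/sqrt(N) for N independent sources, which is the utilization gain; sizing the multiplexer buffer for the residual fluctuations is where FAM and MMPP queueing analysis come in.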
Abstract: A robust still image face localization algorithm
capable of operating in an unconstrained visual environment is
proposed. First, construction of a robust skin classifier within a
shifted HSV color space is described. Then various filtering
operations are performed to better isolate face candidates and
mitigate the effect of substantial non-skin regions. Finally, a novel
Bhattacharyya-based face detection algorithm is used to compare
candidate regions of interest with a unique illumination-dependent
face model probability distribution function approximation.
Experimental results show a 90% face detection success rate despite
the demands of the visually noisy environment.
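The Bhattacharyya measure used to compare a candidate region with the face-model distribution reduces to a simple sum over histogram bins. A minimal sketch, with toy histograms in place of the paper's illumination-dependent face model, is:

```python
from math import sqrt

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two histograms (normalized
    internally): 1.0 for identical distributions, 0.0 for no overlap."""
    sp, sq = sum(p), sum(q)
    return sum(sqrt((a / sp) * (b / sq)) for a, b in zip(p, q))
```

A candidate skin region is accepted as a face when its coefficient against the face-model distribution exceeds a threshold.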
Abstract: This paper deals with infinite-time-horizon fuzzy Economic Order Quantity (EOQ) models for deteriorating items with
stock-dependent demand rate and nonlinear holding costs, taking the deterioration rate θ0 as a triangular fuzzy number (θ0 − δ1, θ0, θ0 + δ2), where 0 < δ1, δ2 < θ0.
Abstract: In today's world, the efficient utilization of wood
resources is increasingly on the minds of forest owners. Ensuring an
efficient harvest of the wood resources is a very complex challenge.
This is one of the aims the project "Virtual Forest II"
addresses. Its core is a database with data about forests containing
approximately 260 million trees located in North Rhine-Westphalia
(NRW). Based on this data, tree growth simulations and wood
mobilization simulations can be conducted. This paper focuses on the
latter. It describes a discrete-event simulation with an attached 3-D
real-time visualization which simulates timber harvest using trees
from the database with different crop resources. This simulation can
be displayed in 3-D to show the progress of the wood crop. All the
data gathered during the simulation is presented as a detailed
summary afterwards. This summary includes cost-benefit
calculations and can be compared to those of previous runs to
optimize the financial outcome of the timber harvest by exchanging
crop resources or modifying their parameters.
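The simulation core can be sketched as a classic event-list loop: pop the next event by time, process it, and schedule follow-up events. The toy model below (one harvester felling trees sequentially, one forwarder hauling each felled tree in parallel, with invented times) is far simpler than the Virtual Forest II simulation but shows the mechanism.

```python
import heapq

def simulate_harvest(trees, harvest_time, transport_time):
    """Minimal discrete-event simulation: a harvester fells trees one
    after another; a forwarder hauls each felled tree in parallel.
    Returns the makespan (time at which the last activity finishes)."""
    events = [(0.0, 0, "harvest", 0)]      # (time, tie-breaker, kind, tree)
    seq, makespan = 1, 0.0
    while events:
        t, _, kind, i = heapq.heappop(events)
        if kind == "harvest":
            done = t + harvest_time
            makespan = max(makespan, done)
            heapq.heappush(events, (done, seq, "transport", i)); seq += 1
            if i + 1 < trees:              # schedule felling the next tree
                heapq.heappush(events, (done, seq, "harvest", i + 1)); seq += 1
        else:                              # transport runs in parallel
            makespan = max(makespan, t + transport_time)
    return makespan
```

In a full model, each event would also accumulate machine costs and yields, producing the cost-benefit summary described above.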
Abstract: The last fifteen years have witnessed fast improvements in the field of humanoid robotics. The human-like robot structure is
more suitable for human environments, with its superior obstacle avoidance properties, when compared with wheeled service robots.
However, the walking control for bipedal robots is a challenging task
due to their complex dynamics. Stable reference generation plays a very important role in control.
Linear Inverted Pendulum Model (LIPM) and the Zero Moment Point (ZMP) criterion are applied in a number of studies for stable
walking reference generation of biped walking robots. This paper follows this main approach too. We propose a natural and continuous ZMP reference trajectory for a stable and human-like walk. The ZMP reference trajectories move forward under the sole of the support foot when the robot body is supported by a single leg. Robot center of mass trajectory is obtained
from predefined ZMP reference trajectories by a Fourier series
approximation method. The Gibbs phenomenon problem common with Fourier approximations of discontinuous functions is avoided by employing continuous ZMP references. Also, these ZMP reference
trajectories possess pre-assigned single and double support phases,
which are very useful in experimental tuning work.
The ZMP-based reference generation strategy is tested via
three-dimensional full-dynamics simulations of a 12-degrees-of-freedom
biped robot model. Simulation results indicate that the proposed reference trajectory generation technique is successful.
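The center-of-mass trajectory step relies on approximating a periodic reference by a truncated Fourier series. A minimal numerical version, generic rather than the paper's robot-specific ZMP reference, computes the coefficients by sampling one period; using a continuous reference keeps the approximation free of Gibbs overshoot.

```python
from math import pi, cos, sin

def fourier_partial_sum(f, N, period=2 * pi, samples=2000):
    """Numerically compute the Fourier coefficients of a periodic
    reference f over one period and return the order-N partial sum."""
    xs = [i * period / samples for i in range(samples)]
    a0 = sum(f(x) for x in xs) / samples
    a = [2 * sum(f(x) * cos(2 * pi * k * x / period) for x in xs) / samples
         for k in range(1, N + 1)]
    b = [2 * sum(f(x) * sin(2 * pi * k * x / period) for x in xs) / samples
         for k in range(1, N + 1)]
    def approx(x):
        return a0 + sum(a[k - 1] * cos(2 * pi * k * x / period)
                        + b[k - 1] * sin(2 * pi * k * x / period)
                        for k in range(1, N + 1))
    return approx

# A continuous triangular reference (no jumps, hence no Gibbs overshoot).
approx = fourier_partial_sum(lambda x: abs(x - pi), N=10)
```

For a continuous reference the truncation error shrinks quickly with N, whereas a discontinuous (jumping) ZMP reference would keep a fixed overshoot near each jump no matter how many terms are used.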
Abstract: In this paper, a new methodology to detect the optic disc (OD) automatically in retinal images from patients at risk of being affected by Diabetic Retinopathy (DR) and Macular Edema (ME) is presented. The detection procedure comprises two independent methodologies. On one hand, a location methodology obtains a pixel that belongs to the OD using image contrast analysis and structure filtering techniques; on the other hand, a boundary segmentation methodology estimates a circular approximation of the OD boundary by applying mathematical morphology, edge detection techniques and the Circular Hough Transform. The methodologies were tested on a set of 1200 images composed of 229 retinographies from patients affected by DR with risk of ME, 431 with DR and no risk of ME, and 540 images of healthy retinas. The location methodology obtained a 98.83% success rate, whereas the OD boundary segmentation methodology obtained a good circular OD boundary approximation in 94.58% of cases. The average computational time measured over the total set was 1.67 seconds for OD location and 5.78 seconds for OD boundary segmentation.
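The Circular Hough Transform step can be illustrated at its simplest: each edge pixel votes for every center that would place it on a circle of a known radius, and the accumulator peak is taken as the center. The synthetic edge points below are generated, not taken from retinal data, and a real OD segmentation would search over a range of radii.

```python
from collections import Counter
from math import cos, sin, pi

def hough_circle(edge_points, radius, steps=72):
    """Each edge point votes for every integer-grid center that would
    place it on a circle of the given radius; return the peak center."""
    acc = Counter()
    for x, y in edge_points:
        for s in range(steps):
            t = 2 * pi * s / steps
            acc[(round(x - radius * cos(t)), round(y - radius * sin(t)))] += 1
    return acc.most_common(1)[0][0]

# Synthetic edge map: 36 points on a circle centered at (10, 12), radius 5.
edge = [(round(10 + 5 * cos(2 * pi * i / 36)), round(12 + 5 * sin(2 * pi * i / 36)))
        for i in range(36)]
cx, cy = hough_circle(edge, radius=5)
```

Because the OD is roughly circular, the accumulator peak is robust to the gaps and spurious edges produced by vessels crossing the disc boundary.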
Abstract: Computational techniques derived from digital image processing are playing a significant role in the security and digital copyrights of multimedia and visual arts. This technology has had a significant effect within the computing domain. This research presents a watermarking algorithm based on the discrete M-band wavelet transform (MWT) and the discrete cosine transform (DCT), incorporating principal component analysis (PCA). The proposed algorithm is expected to achieve higher perceptual transparency. Specifically, the developed watermarking scheme can successfully resist common signal processing attacks, such as geometric distortions and Gaussian noise. In addition, the proposed algorithm can be parameterized, resulting in more security. To meet these requirements, the image is transformed by a combination of MWT and DCT. In order to further improve security, we randomize the watermark image to create three code books. During watermark embedding, PCA is applied to the coefficients in the approximation sub-band. Finally, the first few component bands represent an excellent domain for inserting the watermark.
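One ingredient of the MWT & DCT combination, the DCT stage, can be sketched in one dimension. This orthonormal DCT-II (applied to an invented signal) shows the transform-domain coefficients into which a watermark bit could be embedded; it is a generic illustration, not the paper's 2-D embedding pipeline.

```python
from math import cos, pi, sqrt

def dct2(x):
    """Orthonormal 1-D DCT-II: an energy-preserving transform whose
    coefficients form a typical watermark embedding domain."""
    N = len(x)
    out = []
    for k in range(N):
        s = sum(x[n] * cos(pi * (n + 0.5) * k / N) for n in range(N))
        out.append((sqrt(1.0 / N) if k == 0 else sqrt(2.0 / N)) * s)
    return out

coeffs = dct2([1.0, 2.0, 3.0, 4.0])
```

Because the transform is orthonormal, perturbing mid-frequency coefficients by a small amount changes the signal energy by exactly that amount, which is what makes transparency and robustness tunable.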