Abstract: In this paper, a subtractive-clustering-based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach has been tested on real-time defect datasets of the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of requirement- and code-based metrics) of the datasets are used for training and testing of the proposed approach. The performance of the models is recorded in terms of Accuracy, MAE and RMSE values, and is better in the case of the joined model. The results indicate that clustering and fuzzy logic together provide a simple yet powerful means of modeling early fault detection in function-oriented software systems.
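The abstract does not give implementation details, but the clustering step it names is commonly Chiu-style subtractive clustering, which seeds a fuzzy inference system with cluster centers. A minimal sketch under that assumption, with radii, threshold and data purely illustrative:

```python
# Illustrative sketch of subtractive clustering (Chiu's potential method);
# radii ra/rb, threshold eps and the synthetic data are not from the paper.
import numpy as np

def subtractive_clustering(X, ra=0.5, rb=0.75, eps=0.25):
    """Return cluster centers chosen by iterative potential reduction."""
    alpha, beta = 4.0 / ra**2, 4.0 / rb**2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    potential = np.exp(-alpha * d2).sum(axis=1)          # initial potential of each point
    centers = []
    p0 = potential.max()
    while True:
        c = int(potential.argmax())
        if potential[c] < eps * p0:                      # stop when remaining potential is small
            break
        centers.append(X[c])
        # subtract the new center's influence from every point's potential
        potential = potential - potential[c] * np.exp(-beta * d2[:, c])
    return np.array(centers)

# two well-separated blobs should each contribute a center
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.05, (20, 2)), rng.normal(1.0, 0.05, (20, 2))])
centers = subtractive_clustering(X)
```

Each discovered center would then become the premise of one fuzzy rule in the inference system.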
Abstract: In this paper a new approach is proposed for adapting simulated annealing search to the field of Multi-Objective Optimization (MOO). The new approach is called Multi-Case Multi-Objective Simulated Annealing (MC-MOSA). It builds on a well-known multi-objective simulated annealing algorithm proposed by Ulungu et al., referred to in the literature as U-MOSA. However, some drawbacks of that algorithm have been identified and replaced with alternative mechanisms, especially in the acceptance decision criterion. MC-MOSA has shown better performance than U-MOSA in numerical experiments. This performance is further improved by several subvariants of MC-MOSA, such as Fast-annealing MC-MOSA, Re-annealing MC-MOSA and Two-Stage annealing MC-MOSA.
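The abstract does not spell out MC-MOSA's acceptance criterion, so the sketch below shows a generic weighted-sum multi-objective simulated annealing with a nondominated archive — the family of methods being compared, not the paper's own rule. Problem, weights and cooling schedule are illustrative:

```python
# Generic weighted-sum MOSA sketch (not the paper's MC-MOSA acceptance rule).
import math
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def mosa(f, x0, weights=(0.5, 0.5), T0=1.0, steps=2000, seed=1):
    rng = random.Random(seed)
    x, T = x0, T0
    archive = [(x0, f(x0))]                       # nondominated solutions found so far
    for _ in range(steps):
        y = x + rng.gauss(0.0, 0.2)               # neighbourhood move
        fx, fy = f(x), f(y)
        # scalarized acceptance: weighted sum of objective-value changes
        delta = sum(w * (b - a) for w, a, b in zip(weights, fx, fy))
        if delta <= 0 or rng.random() < math.exp(-delta / T):
            x = y
            if not any(dominates(fv, fy) for _, fv in archive):
                archive = [(xv, fv) for xv, fv in archive if not dominates(fy, fv)]
                archive.append((y, fy))
        T = max(T * 0.999, 1e-3)                  # geometric cooling
    return archive

# bi-objective toy problem: minimize (x^2, (x-2)^2); the Pareto set is x in [0, 2]
archive = mosa(lambda x: (x * x, (x - 2.0) ** 2), x0=5.0)
```

The archive approximates the Pareto front; U-MOSA-style methods differ mainly in how `delta` and the acceptance probability are formed from the multiple objectives.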
Abstract: Little information is currently available on heat shield systems, and what exists is not reliable enough for many cases; for example, precise calculations cannot be done for various materials. In addition, full-scale testing has two disadvantages, high cost and low flexibility, and a new test must be performed for each case. Hence, a numerical modeling program that calculates the surface recession rate and interior temperature distribution is necessary. A numerical solution of the governing equation for non-charring material ablation is presented in order to predict the recession rate and the heat response of non-charring heat shields. The governing equation is nonlinear, and the Newton-Raphson method along with the TDMA algorithm is used to solve the resulting nonlinear equation system. Using the Newton-Raphson method is one of the advantages of this solution approach, because the method is simple and can easily be generalized to more difficult problems. The obtained results were compared with reliable sources in order to examine the accuracy of the compiled code.
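As a minimal illustration of the Newton-Raphson iteration named above (not the paper's full ablation system), the sketch below solves a single nonlinear surface energy balance; the radiation-plus-convection equation and its constants are invented for the example:

```python
# Scalar Newton-Raphson sketch; the energy-balance equation is illustrative,
# not the paper's governing ablation equation.
def newton(f, df, x0, tol=1e-8, maxit=50):
    """Solve f(x) = 0 by Newton-Raphson iteration."""
    x = x0
    for _ in range(maxit):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

SIGMA, H, Q = 5.67e-8, 50.0, 1.0e5   # hypothetical constants (W/m^2K^4, W/m^2K, W/m^2)
f  = lambda T: SIGMA * T**4 + H * T - Q      # radiative + convective loss = input flux
df = lambda T: 4.0 * SIGMA * T**3 + H        # analytic derivative
T = newton(f, df, x0=500.0)                  # equilibrium surface temperature
```

For the coupled interior-temperature system, each Newton step would instead linearize the discretized equations and solve the resulting tridiagonal system with TDMA.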
Abstract: Recently, a lot of attention has been devoted to advanced techniques of system modeling. PNN (polynomial neural network) is a GMDH-type algorithm (Group Method of Data Handling) and is one of the useful methods for modeling nonlinear systems, but PNN performance depends strongly on the number of input variables and the order of the polynomial, which are normally determined by trial and error. In this paper, we introduce GPNN (genetic polynomial neural network) to improve the performance of PNN. GPNN determines the number of input variables and the order of all neurons with a GA (genetic algorithm), using the GA to search over all possible values for the number of input variables and the polynomial order. GPNN performance is evaluated on two nonlinear systems: a quadratic equation and the Dow Jones stock index time series.
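To make the GA search concrete, the toy sketch below evolves a genome of (input variable, polynomial order) against a validation-error fitness — a drastically simplified stand-in for GPNN's per-neuron search, with synthetic data and hyperparameters that are not from the paper:

```python
# Toy genetic search over (input variable, polynomial order); data, population
# size, rates and fitness are illustrative, not GPNN's actual configuration.
import numpy as np

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, (200, 3))                   # three candidate input variables
y = 3.0 * X[:, 0] ** 2 + 0.01 * rng.normal(size=200)   # only x0 matters, true order 2

def fitness(genome):
    """Negative validation MSE of a 1-D polynomial fit chosen by the genome."""
    var, order = genome
    coeff = np.polyfit(X[:100, var], y[:100], order)   # train on first half
    resid = y[100:] - np.polyval(coeff, X[100:, var])  # validate on second half
    return -float(np.mean(resid ** 2))

def ga(pop_size=12, gens=15):
    pop = [(int(rng.integers(0, 3)), int(rng.integers(1, 5))) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.integers(0, len(parents), 2)
            child = [parents[a][0], parents[b][1]]     # crossover: one gene from each parent
            if rng.random() < 0.3:                     # mutation: resample one gene
                if rng.random() < 0.5:
                    child[0] = int(rng.integers(0, 3))
                else:
                    child[1] = int(rng.integers(1, 5))
            children.append(tuple(child))
        pop = parents + children
    return max(pop, key=fitness)

best = ga()   # expected to select the informative input x0
```

GPNN applies the same idea layer by layer, with each GMDH neuron's inputs and order encoded in the chromosome.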
Abstract: Chest pain is one of the most prevalent complaints among adults and causes many people to attend medical centers. The aim was to determine the prevalence and risk factors of chest pain (CP) among people over 30 years old in Tehran. In this cross-sectional study, 787 adults took part from Apr 2005 until Apr 2006. The sampling method was random cluster sampling with 25 clusters. In each cluster, interviews were performed with 32 people over 30 years old who lived in the selected houses. In cases with chest pain, extra questions were asked. The prevalence of CP was 9% (71 cases). Of these, 21 cases (6.5%) were in the 41-60 year age range and the remainder were over 61 years old. 19 cases (26.8%) reported CP at rest, and all of the cases had exertion-onset CP. The CP duration was 10 minutes or less in all cases, and in most of them (84.5%) the reported location of pain was the left anterior part of the chest, the left anterior part of the sternum, and/or the left arm. There was a positive history of myocardial infarction in 12 cases (17%). There was a significant relation between CP and age and sex, and between history of myocardial infarction and marital status of the study participants. Our results are similar to those of other studies in most respects; however, it is necessary to perform supplementary tests and follow-up studies to differentiate exactly between cardiac and non-cardiac CP.
Abstract: We study the spatial design of experiments, where the aim is to select a most informative subset, of prespecified size, from a set of correlated random variables. The problem arises in many applied domains, such as meteorology, environmental statistics, and statistical geology. In these applications, observations can be collected at different locations and possibly at different times. In spatial design, when the design region and the set of interest are discrete, the covariance matrix completely describes any objective function, and our goal is to choose a feasible design that minimizes the resulting uncertainty. The problem is recast as maximizing the determinant of the covariance matrix of the chosen subset, which is NP-hard. When such designs are used in computer experiments, the design space is in many cases very large and it is not possible to compute the exact optimal solution. Heuristic optimization methods can discover efficient experiment designs in situations where traditional designs cannot be applied, exchange methods are ineffective, and an exact solution is not possible. We developed a genetic algorithm (GA) to take advantage of its exploratory power, and we demonstrate its successful application to a real design-of-experiments problem with a very large design space.
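To make the objective concrete, the sketch below maximizes the determinant of a covariance submatrix with a simple greedy heuristic — a stand-in for the paper's GA, usable only on small instances, with an illustrative exponentially decaying covariance:

```python
# Greedy sketch of the max-determinant subset objective (the paper uses a GA;
# greedy is a simpler stand-in). The covariance model here is illustrative.
import numpy as np

def greedy_maxdet(C, k):
    """Greedily grow an index set S maximizing det(C[S, S])."""
    n = C.shape[0]
    S = []
    for _ in range(k):
        best_j, best_det = None, -np.inf
        for j in range(n):
            if j in S:
                continue
            T = S + [j]
            d = np.linalg.det(C[np.ix_(T, T)])   # determinant of the candidate submatrix
            if d > best_det:
                best_j, best_det = j, d
        S.append(best_j)
    return sorted(S)

# covariance of 8 sites on a line with exponentially decaying correlation
x = np.linspace(0.0, 1.0, 8)
C = np.exp(-3.0 * np.abs(x[:, None] - x[None, :]))
S = greedy_maxdet(C, 3)   # the most "spread out" sites maximize the determinant
```

A GA replaces the greedy growth with crossover and mutation over candidate index sets, which scales to the large design spaces the abstract targets.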
Abstract: In this paper we present the information life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not just an extension of the traditional information system development life cycle; it is based on the generic life cycle. The paper proposes a model of an information system life cycle, supported by the assumption that a system has a limited life which may nevertheless be extended. The model has been applied in several cases; two examples of the framework's application, in a construction enterprise and in a manufacturing enterprise, are reported here.
Abstract: When architecting an application, key nonfunctional requirements such as performance, scalability, availability and security, which influence the architecture of the system, are sometimes not adequately addressed. Performance of the application may not be looked at until there is a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach for performance modeling of web-based J2EE and .NET applications to address performance issues early in the development life cycle. It also includes a performance modeling case study, with Proof-of-Concept (PoC) and implementation details for the .NET and J2EE platforms.
Abstract: Recently, bianisotropic media have again received increasing attention in electromagnetic theory because advances in material science enable the manufacturing of complex bianisotropic materials. Bianisotropic media are the largest class of linear media and are able to describe the macroscopic material properties of artificial dielectrics, artificial magnetics, artificial chiral materials, left-handed materials, metamaterials, and other composite materials. By using Maxwell's equations and the corresponding boundary conditions, the electromagnetic field distribution in bianisotropic solenoid coils is determined, and the influence of the bianisotropic behaviour of the coil on its impedance and Q-factor is considered. Several special cases of coils filled with complex substances have been analyzed. Results obtained by the analytical approach are compared with values calculated by numerical methods, in particular our new hybrid EEM/BEM method and FEM.
Abstract: This work focuses on analysis of the classical heat transfer equation regularized with the Maxwell-Cattaneo transfer law. Computer simulations are performed in the MATLAB environment. Numerical experiments are first carried out on the classical Fourier equation; the Maxwell-Cattaneo law is then considered. The corresponding equation is regularized with a balancing diffusion term to stabilize the discretization scheme, with adjusted time and space numerical steps. Several cases including a convective term in the model equations are discussed, and results are given. It is shown that limiting conditions on the regularizing parameters have to be satisfied in the convective case for the Maxwell-Cattaneo regularization to give physically acceptable solutions. In all valid cases, uniform convergence to the solution of the initial heat equation with Fourier's law is observed, even in the nonlinear case.
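The first numerical experiment the abstract mentions — an explicit scheme for the classical Fourier equation — can be sketched as follows (in Python rather than the paper's MATLAB; grid, diffusivity and initial profile are illustrative, and the Maxwell-Cattaneo relaxation term is not included):

```python
# Explicit finite-difference sketch for u_t = alpha * u_xx with u = 0 at both
# ends; parameters are illustrative. The time step respects the stability
# bound dt <= dx^2 / (2 * alpha) that motivates the paper's step adjustment.
import numpy as np

alpha, L, nx = 1.0, 1.0, 51
dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha            # inside the explicit stability limit
x = np.linspace(0.0, L, nx)
u = np.sin(np.pi * x)               # initial profile, zero at the boundaries

for _ in range(400):
    u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

# the exact single-mode solution decays as exp(-pi^2 * alpha * t)
t = 400 * dt
u_exact = np.exp(-np.pi**2 * alpha * t) * np.sin(np.pi * x)
```

Adding the Maxwell-Cattaneo law turns this into a damped wave equation, which is where the balancing diffusion term and the parameter limits discussed in the abstract come in.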
Abstract: In this paper, the notion of a Hyperbolic Klingenberg plane is introduced via a set of axioms analogous to those of Affine Klingenberg planes and Projective Klingenberg planes. Models of such planes are constructed by deleting a certain number m of equivalence classes of lines from a Projective Klingenberg plane. In the finite case, an upper bound for m is established and some combinatorial properties are investigated.
Abstract: A wireless ad hoc network is comprised of wireless nodes that can move freely and are connected among themselves without central infrastructure. Due to the limited transmission range of wireless interfaces, in most cases communication has to be relayed over intermediate nodes. Thus, in such a multihop network, each node (also called a router) is independent, self-reliant and capable of routing messages over the dynamic network topology. Various protocols have been reported in this field, and it is very difficult to decide on the best one. A key issue in deciding which type of routing protocol is best for ad hoc networks is the communication overhead incurred by the protocol. In this paper, STAR, a table-driven protocol, and DSR, an on-demand protocol, both based on IEEE 802.11, are analyzed for their performance on different performance metrics under varying CBR traffic load using the QualNet 5.0.2 network simulator.
Abstract: The case study deals with the semi-quantitative risk assessment of a water resource earmarked for the emergency supply of the population with drinking water. The risk analysis is based on previously identified hazards/sensitivities of the elements of the hydrogeological structure and technological equipment of the groundwater resource, as well as on the assessment of the levels of hazard, sensitivity and criticality of individual resource elements in the form of point indexes. The following potential sources of hazard have been considered: natural disasters caused by atmospheric and geological changes, technological hazards, and environmental burdens. The risk analysis has shown that the assessed risks are acceptable and the water resource may be integrated into the crisis plan of the given region.
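A point-index screen of the kind described can be illustrated schematically: each resource element receives hazard, sensitivity and criticality scores whose product is compared against an acceptability threshold. The elements, scores and threshold below are entirely hypothetical, not the study's values:

```python
# Hypothetical point-index risk screen in the spirit of the abstract; all
# element names, scores and the threshold are invented for illustration.
elements = {
    "wellhead":      {"hazard": 2, "sensitivity": 3, "criticality": 2},
    "aquifer":       {"hazard": 1, "sensitivity": 2, "criticality": 3},
    "pumping_plant": {"hazard": 3, "sensitivity": 2, "criticality": 2},
}
THRESHOLD = 15   # hypothetical acceptability limit for the index product

# risk index per element = hazard x sensitivity x criticality
risks = {name: s["hazard"] * s["sensitivity"] * s["criticality"]
         for name, s in elements.items()}
acceptable = all(r <= THRESHOLD for r in risks.values())
```

The study's conclusion corresponds to `acceptable` being true for every assessed element of the resource.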
Abstract: With the proliferation of the World Wide Web, the development of web-based technologies and the growth in web content, the structure of a website becomes more complex and web navigation becomes a critical issue for both web designers and users. In this paper we identify content and web pages as two important and influential factors in website navigation, and we frame the enhancement of website navigation as making useful changes in the link structure of the website based on these factors. We then suggest a new method for proposing such changes, using a fuzzy approach to optimize the website architecture. Applying the proposed method to a real case, the Iranian Civil Aviation Organization (CAO) website, we discuss the results of the approach in the final section.
Abstract: Noise has an adverse effect on human health and comfort. Noise not only causes hearing impairment, but also acts as a causal factor for stress and raised systolic pressure. Additionally, it can be a causal factor in work accidents, both by masking hazards and warning signals and by impeding concentration. Industrial workers also suffer psychological and physical stress as well as hearing loss due to industrial noise. This paper proposes an approach that enables engineers to identify quantitatively the noisiest source for modification while multiple machines are operating simultaneously. A model with point sources and spherical radiation in a free field was adopted to formulate the problem. The procedure works very well in ideal cases (point source and free field). However, most industrial noise problems are complicated by the fact that the noise is confined in a room: reflections from the walls, floor, ceiling, and equipment create a reverberant sound field that alters the sound wave characteristics from those of the free field. The model was therefore validated in a relatively low-absorption room at the NIT Kurukshetra Central Workshop. The validation showed that the sound powers of noise sources estimated under simultaneous operating conditions were on the lower side, within error limits of 3.56-6.35%, suggesting that the methodology is suitable for practical implementation in industry. To demonstrate the application of the analytical procedure for estimating the sound power of noise sources under simultaneous operating conditions, a manufacturing facility (Railway Workshop at Yamunanagar, India) having five sound sources (machines) on its workshop floor is considered. The case study identified the two most effective candidates (noise sources) for noise control in the workshop, and the study suggests that modification of the design and/or replacement of these two identified noisiest sources (machines) would be necessary to achieve an effective reduction in noise levels. Further, the estimated data allow engineers to better understand the noise situation of the workplace and to revise the noise map when levels change due to a workplace re-layout.
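The free-field point-source model the paper starts from relates sound power level Lw to a pressure level Lp measured at distance r via Lw = Lp + 20 log10(r) + 11 dB (spherical radiation), and simultaneous incoherent sources combine on an energy basis. The numeric values below are illustrative, not the workshop's measurements:

```python
# Standard free-field relations used by this kind of analysis; the 85 dB and
# 2 m figures are invented examples, not data from the case study.
import math

def sound_power_level(lp_db, r_m):
    """Sound power level (dB re 1 pW) of a point source radiating spherically,
    from the pressure level lp_db measured r_m metres away in a free field."""
    return lp_db + 20.0 * math.log10(r_m) + 11.0

def combine_levels(levels_db):
    """Total level of incoherent sources operating simultaneously."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels_db))

lw = sound_power_level(85.0, 2.0)        # 85 dB measured 2 m from the source
total = combine_levels([85.0, 85.0])     # two equal sources add about 3 dB
```

In a reverberant room the measured Lp also contains a reflected-field term, which is why the paper validates the free-field estimates against a low-absorption room before applying them.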
Abstract: A sequential treatment of ozonation followed by a Fenton or photo-Fenton process, using black light lamps (365 nm) in the latter case, has been applied to remove a mixture of pharmaceutical compounds and the generated by-products in both ultrapure water and secondary treated wastewater. The scientific-technological innovation of this study stems from the in situ generation of hydrogen peroxide by direct ozonation of the pharmaceuticals, which can later be used in the application of the Fenton and photo-Fenton processes. The compounds selected as models were sulfamethoxazole and acetaminophen. It should be noted that the use of a second process is necessary because of the low mineralization yield reached by the application of ozone alone. The influence of the water matrix has therefore been studied in terms of hydrogen peroxide concentration, individual compound concentration and total organic carbon removed. Moreover, the concentration of different iron species in solution has been measured.
Abstract: The governing two-dimensional equations of a heterogeneous material composed of a fluid (allowed to flow in the absence of acoustic excitations) and a crystalline piezoelectric cubic solid, stacked one-dimensionally along the z direction, are derived. Special emphasis is given to the acoustic group velocity of the structure as a function of the wavenumber component perpendicular to the stacking direction (the x axis). Variations of physical parameters with y are neglected, assuming infinite material homogeneity along the y direction, and the flow velocity is assumed to be directed along the x direction. In the first part of the paper, the governing set of differential equations is derived, together with the imposed boundary conditions, and emphasis is given to the low-frequency case. Solutions are provided using Hamilton's equations for the wavenumber versus frequency, as a function of the number and thickness of the solid and fluid layers, in cases with and without flow (the case of a position-dependent flow in the fluid layer is also considered). Boundary conditions at the bottom and top of the full structure are left unspecified in the general solution, but examples are provided for the case where these are subject to rigid-wall conditions (Neumann boundary conditions on the acoustic pressure). In the second part of the paper, emphasis is given to the general case of larger frequencies and the formation of a wavenumber-frequency band structure. A wavenumber condition for an arbitrary set of consecutive solid and fluid layers, involving four propagating waves in each solid region, is obtained, again using the monodromy matrix method. Case examples are finally discussed.
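The monodromy-matrix idea used in the second part can be sketched in its simplest 1D acoustic form: the period matrix of a layered stack is the ordered product of per-layer transfer matrices, and the Bloch band condition |trace|/2 <= 1 marks propagating frequencies. The layers below are plain fluids with invented parameters, a far simpler setting than the paper's piezoelectric/fluid stack with flow:

```python
# Minimal 1D transfer/monodromy matrix sketch for lossless acoustic layers;
# material values are illustrative, not the paper's piezoelectric parameters.
import numpy as np

def layer_matrix(omega, rho, c, d):
    """Transfer matrix relating (pressure, velocity) across one layer."""
    k, Z = omega / c, rho * c            # wavenumber and acoustic impedance
    return np.array([[np.cos(k * d), 1j * Z * np.sin(k * d)],
                     [1j * np.sin(k * d) / Z, np.cos(k * d)]])

def monodromy(omega, layers):
    """Ordered product of layer matrices over one stacking period."""
    M = np.eye(2, dtype=complex)
    for rho, c, d in layers:
        M = layer_matrix(omega, rho, c, d) @ M
    return M

layers = [(1000.0, 1500.0, 0.01), (2700.0, 6000.0, 0.005)]  # water-like / solid-like
omega = 2.0 * np.pi * 1.0e4
M = monodromy(omega, layers)
propagating = abs(M.trace().real) / 2.0 <= 1.0   # Bloch band condition at this omega
```

In the paper's setting each solid region carries four propagating waves, so the per-layer matrices are 8x8 rather than 2x2, but the band condition on the period matrix is of the same character.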
Abstract: A kinetic model for propane dehydrogenation in an industrial moving-bed reactor is developed based on the reported reaction scheme. The kinetic parameters and activity constant are fine-tuned with several sets of balanced plant data. Plant data at different operating conditions are used to validate the model, and the results show good agreement between the model predictions and plant observations in terms of the amount of the main product, propylene. A simulation analysis of key variables affecting process performance, such as the inlet temperature of each reactor (Tinrx) and the hydrogen to total hydrocarbon ratio (H2/THC), is performed to identify the operating condition that maximizes propylene production. Within the range of operating conditions applied in the present studies, the condition that maximizes propylene production at the same weighted average inlet temperature (WAIT) is ΔTinrx1 = -2, ΔTinrx2 = +1, ΔTinrx3 = +1, ΔTinrx4 = +2 and ΔH2/THC = -0.02. Under this condition, the surplus propylene produced is 7.07 tons/day compared with the base case.
Abstract: Current spectra of a high-power induction machine were calculated for the cases of full symmetry, static eccentricity and dynamic eccentricity. The calculations involve integration of 93 electrical plus four mechanical ordinary differential equations. The electrical equations account for variable inductances affected by slotting and eccentricities. The calculations were followed by Fourier analysis of the stator currents in steady-state operation. The paper presents the stator current spectra in the full symmetry, static eccentricity and dynamic eccentricity cases, and identifies the harmonics present in each case. The effect of dynamic eccentricity is demonstrated by comparing the current spectra of the dynamic eccentricity cases with the full-symmetry one. The paper includes one case study, referring to dynamic eccentricity, which presents the spectrum of the measured current and demonstrates the existence of the harmonics related to dynamic eccentricity. Zooms of the current spectra around the main slot-harmonic zone are included to simplify the comparison and prove the existence of the dynamic eccentricity harmonics in both the calculated and the measured current spectra.
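The spectral step of such an analysis can be illustrated with a synthetic signal: a supply-frequency current carrying small sidebands of the kind eccentricity introduces, examined with an FFT (numpy here, in place of whatever tooling the authors used; the 50 Hz supply, the 24 Hz rotor-related frequency and the sideband amplitudes are invented):

```python
# Illustrative spectrum check on a synthetic "stator current"; frequencies
# and amplitudes are invented, not the machine's actual harmonics.
import numpy as np

fs, T = 5000.0, 2.0
t = np.arange(0.0, T, 1.0 / fs)
f0, fr = 50.0, 24.0                  # supply frequency and a rotor-related frequency
i_stator = (np.sin(2 * np.pi * f0 * t)
            + 0.02 * np.sin(2 * np.pi * (f0 + fr) * t)    # upper sideband
            + 0.02 * np.sin(2 * np.pi * (f0 - fr) * t))   # lower sideband

spec = np.abs(np.fft.rfft(i_stator)) / len(t)   # one-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
peak = freqs[np.argmax(spec)]                   # dominant spectral line
```

Comparing such spectra with and without the sidebands mirrors the paper's comparison of the eccentric and full-symmetry cases.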
Abstract: The use of a Bayesian Hierarchical Model (BHM) to interpret breath measurements obtained during a 13C Octanoic Breath Test (13COBT) is demonstrated. The statistical analysis was implemented using WinBUGS, a commercially available computer package for Bayesian inference. A hierarchical setting was adopted, in which poorly defined parameters associated with delayed gastric emptying (GE) were able to "borrow" strength from global distributions. This proved to be a sufficient tool to correct the model failures and data inconsistencies apparent in conventional analyses employing a non-linear least squares (NLS) technique. Direct comparison of two parameters describing gastric emptying (t_lag, the lag phase, and t_1/2, the half-emptying time) revealed a strong correlation between the two methods. Despite our large dataset (n = 164), Bayesian modeling was fast and provided a successful fit for all subjects. In contrast, NLS failed to return acceptable estimates in cases where GE was delayed.
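The conventional comparator can be sketched under a common assumption in the 13C breath-test literature: the excretion curve is parameterized as y(t) = a·t^b·exp(-c·t), with t_lag = b/c the time of peak excretion. The sketch below fits this curve to synthetic data by log-linear least squares, a simplification of full NLS (the paper's actual NLS model and, of course, the Bayesian hierarchy in WinBUGS are not reproduced here):

```python
# Hedged illustration: fitting the gamma-variate breath curve
# y(t) = a * t**b * exp(-c*t) by least squares in log space.
# Data and parameters are synthetic, not the study's measurements.
import numpy as np

def breath_curve(t, a, b, c):
    return a * t**b * np.exp(-c * t)

t = np.linspace(0.25, 6.0, 40)                        # sampling times (hours)
a0, b0, c0 = 5.0, 1.8, 1.1                            # synthetic "true" parameters
y = breath_curve(t, a0, b0, c0)
y = y * np.exp(0.01 * np.random.default_rng(3).normal(size=t.size))  # small noise

# the model is linear in (ln a, b, c) after taking logs:
#   ln y = ln a + b * ln t - c * t
A = np.column_stack([np.ones_like(t), np.log(t), -t])
ln_a, b, c = np.linalg.lstsq(A, np.log(y), rcond=None)[0]
t_lag = b / c                                         # time of peak excretion
```

A hierarchical Bayesian treatment instead places priors on (a, b, c) per subject with shared population-level distributions, which is what lets poorly identified delayed-emptying curves "borrow" strength as the abstract describes.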