Abstract: The aim of this paper is to study the internal
stabilization of the Bernoulli-Euler equation numerically. For this,
we consider a square plate subjected to a feedback/damping force
distributed only in a subdomain. An algorithm for obtaining an
approximate solution to this problem was proposed and implemented.
The numerical method used was the Finite Difference Method.
Numerical simulations were performed and showed the behavior of
the solution, confirming the theoretical results that have already been
proved in the literature. In addition, we validated the proposed
numerical scheme, analyzed the numerical error, and studied the
decay of the associated energy.
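For reference, a standard form of a locally damped plate model and its energy can be sketched as below; the exact boundary conditions and damping coefficient used in the paper are assumptions here.

$$ u_{tt} + \Delta^2 u + a(x)\,u_t = 0 \quad \text{in } \Omega \times (0,T), \qquad a \ge a_0 > 0 \ \text{on } \omega \subset \Omega, \quad a = 0 \ \text{elsewhere}, $$
$$ E(t) = \frac{1}{2}\int_{\Omega}\left(|u_t|^2 + |\Delta u|^2\right)dx, $$

where internal stabilization means that the energy $E(t)$ decays as $t \to \infty$ even though the feedback acts only on the subdomain $\omega$.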
Abstract: The quantified residence time distribution (RTD)
provides a numerical characterization of mixing in a reactor, thus
allowing the process engineer to better understand the mixing
performance of the reactor. This paper discusses computational
studies to investigate flow patterns in a two-impinging-streams
cyclone reactor (TISCR). Flow in the reactor was modeled with
computational fluid dynamics (CFD). Utilizing the Eulerian-
Lagrangian approach, implemented in FLUENT (V6.3.22), particle
trajectories were obtained by solving the particle force balance
equations. From simulation results obtained at different Δt values, the mean
residence time (tm) and the mean square deviation (σ²) were
calculated. Good agreement is observed between the predicted and
experimental data. Simulation results indicate that the behavior of
complex reactor systems can be predicted using the CFD technique
with minimum data requirement for validation.
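As an illustration of how the RTD moments can be obtained from tracked particles, a minimal Python sketch is given below. The exit times are fabricated, the moments are computed as unweighted sample moments, and the actual FLUENT post-processing is not reproduced.

import numpy as np

def rtd_moments(exit_times):
    """Mean residence time t_m and mean square deviation sigma^2
    from a sample of particle residence (exit) times.

    exit_times: array of residence times, one per tracked particle
    (hypothetical data; in the paper these come from the Lagrangian
    particle-trajectory results at different time steps)."""
    t = np.asarray(exit_times, dtype=float)
    t_m = t.mean()                    # first moment of the RTD
    sigma2 = ((t - t_m) ** 2).mean()  # second central moment
    return t_m, sigma2

# Example with made-up residence times (seconds)
t_m, sigma2 = rtd_moments([0.8, 1.1, 0.9, 1.4, 1.0])
print(t_m, sigma2)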
Abstract: The driving forces of international markets are changing
continuously; therefore, companies need to gain a competitive edge in
such markets. Improving the company's products, processes and
practices is no longer a secondary concern. Lean production is a production
management philosophy that consolidates work tasks with minimal
waste, resulting in improved productivity. Lean production practices
can be mapped into many production areas. One of these is
Manufacturing Equipment and Technology (MET). Many lean
production practices can be implemented in MET, namely, specific
equipment configurations, total preventive maintenance, visual
control, new equipment/technologies, production process
reengineering, and a shared vision of perfection. The purpose of this
paper is to investigate the implementation level of these six practices
in Jordanian industries. To achieve this, a questionnaire survey was
designed using a five-point Likert scale. The questionnaire was
validated through a pilot study and expert review. A sample
of 350 Jordanian companies was surveyed; the response rate was
83%. The respondents were asked to rate the extent of
implementation of each of the practices. A conceptual relationship
model is developed, hypotheses are proposed, and the
essential statistical analyses are then performed. An assessment tool
that enables management to monitor the progress and
effectiveness of lean practice implementation is designed and
presented. The results show that the average
implementation level of lean practices in MET is 77%, that Jordanian
companies are successfully implementing the considered lean
production practices, and that the presented model has a Cronbach's alpha
value of 0.87, which is good evidence of model consistency and
validates the results.
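To make the reliability figure concrete, the Python sketch below computes Cronbach's alpha for a matrix of Likert responses using the standard formula (k items, sum of item variances over total-score variance); the response data are fabricated for illustration and do not come from the survey.

import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x k_items) matrix of
    Likert-scale scores (illustrative data only)."""
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)      # variance of each item
    total_var = X.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Example with fabricated responses from 5 respondents on 6 practices
data = [[4, 5, 4, 3, 4, 5],
        [3, 3, 4, 3, 3, 4],
        [5, 5, 5, 4, 5, 5],
        [2, 3, 2, 3, 2, 3],
        [4, 4, 5, 4, 4, 4]]
print(cronbach_alpha(data))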
Abstract: The purpose of this study is to introduce a new
interface program to calculate a dose distribution with the Monte Carlo method in complex heterogeneous systems such as organs or tissues
in proton therapy. This interface program was developed under
MATLAB software and includes a friendly graphical user interface
with several tools such as image property adjustment and results display. The quadtree decomposition technique was used as an image
segmentation algorithm to create optimum geometries from Computed Tomography (CT) images for dose calculations of the proton
beam. The result of this technique is a set of non-overlapping
squares of different sizes in each image. In this way,
the resolution of the image segmentation is high enough in and near
heterogeneous areas to preserve the precision of the dose calculations,
and low enough in homogeneous areas to directly reduce the number of
cells. Furthermore, a cell reduction algorithm can be used to combine neighboring cells of the same material. The validation of this method has been done in two ways: first, by comparison with experimental data obtained with an 80 MeV proton beam at the Cyclotron
and Radioisotope Center (CYRIC) at Tohoku University, and second, by comparison with data based on the polybinary tissue calibration method performed at CYRIC. These results are presented in this paper. The program can read the output file of the Monte Carlo code while the region of interest is selected manually, and gives a plot of the proton beam dose distribution superimposed onto the CT images.
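A minimal Python sketch of the quadtree idea is given below: a block is split into four quadrants until it is homogeneous enough or a minimum size is reached. The homogeneity criterion, tolerance, and test image are assumptions for illustration, not the authors' MATLAB implementation.

import numpy as np

def quadtree(img, x0, y0, size, tol=10, min_size=2, leaves=None):
    """Recursive quadtree decomposition of a square, power-of-two image
    block: keep the block whole if its intensity range is within `tol`
    (homogeneous), otherwise split it into four quadrants.
    Returns a list of (x0, y0, size) leaves."""
    if leaves is None:
        leaves = []
    block = img[y0:y0 + size, x0:x0 + size]
    if size <= min_size or block.max() - block.min() <= tol:
        leaves.append((x0, y0, size))      # homogeneous (or minimal) cell
    else:
        h = size // 2
        for dx, dy in ((0, 0), (h, 0), (0, h), (h, h)):
            quadtree(img, x0 + dx, y0 + dy, h, tol, min_size, leaves)
    return leaves

# Example on a synthetic 64x64 "slice" with one dense insert
img = np.zeros((64, 64))
img[20:40, 20:40] = 100
print(len(quadtree(img, 0, 0, 64)))   # number of cells produced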
Abstract: Improvements in CAE methods play an important role in shortening vehicle product development time, since they allow design validation and durability improvements to be carried out without producing hardware prototypes. In recent years, several different methods have been developed to investigate fatigue damage of the vehicle. The intended goal of these methods is the prediction of fatigue damage in a short time and at reduced cost. This study developed a new fatigue damage prediction method for the automotive sector using power spectral densities of accelerations. The study also confirmed that weak regions in the vehicle can be easily detected with the developed method, whose results were compared with those of the conventional method.
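As an illustration of the quantity the method is built on, the Python sketch below estimates the power spectral density of an acceleration channel with Welch's method; the signal is synthetic, and the subsequent damage calculation from the PSD is not reproduced here.

import numpy as np
from scipy.signal import welch

# Illustrative acceleration signal: 10 s of road-load-like data at 1 kHz
# (synthetic; the vehicle measurement channels are not reproduced here)
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
accel = np.sin(2 * np.pi * 30 * t) + 0.5 * np.random.randn(t.size)

# Power spectral density of the acceleration via Welch's method,
# the quantity on which the damage prediction in this study is based
f, psd = welch(accel, fs=fs, nperseg=2048)
print(f[psd.argmax()])   # dominant frequency, here ~30 Hz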
Abstract: The Spiral development model has been used
successfully in many commercial systems and in a good number of
defense systems. This is because the spiral model supports cost-effective
incremental commitment of funds, by analogy to stud poker,
and can also be used to develop hardware or to integrate
software, hardware, and systems. To support adaptive, semantic
collaboration between domain experts and knowledge engineers, a
new knowledge engineering process, called Spiral_OWL, is proposed.
This model is based on the idea of iterative refinement, annotation
and structuring of the knowledge base. The Spiral_OWL model is
generated based on the spiral model and knowledge engineering
methodology. A central paradigm of the Spiral_OWL model is the
concentration on risk-driven determination of the knowledge engineering
process. The collaboration aspect comes into play during the knowledge
acquisition and knowledge validation phases. The design rationale for the
Spiral_OWL model is to be easy to implement and well organized, with an
iterative development cycle structured as an expanding spiral.
Abstract: The value of the overall oxygen transfer coefficient
(KLa), which is the best measure of oxygen transfer into water through
aeration, is obtained by a simple approach, which sufficiently
demonstrates the utility of the method in eliminating the discrepancies due
to an inaccurate assumption of the saturation dissolved oxygen
concentration. The rate of oxygen transfer depends on a number of
factors, such as the intensity of turbulence, which in turn depends on the
speed of rotation, the size and number of blades, the diameter and
immersion depth of the rotor, and the size and shape of the aeration tank, as
well as on the physical, chemical, and biological characteristics of the water.
An attempt is made in this paper to correlate the overall oxygen
transfer coefficient (KLa) with the other
influencing parameters mentioned above. It has been estimated that
the simulation equation developed predicts the values of KLa and
power with average standard errors of estimation of 0.0164 and
7.66, respectively, and with R² values of 0.979 and 0.989, respectively,
when compared with experimentally determined values. This model
is also compared with a model generated using
computational fluid dynamics (CFD), and the two models were
found to be in good agreement with each other.
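The non-steady-state reaeration formulation commonly used in such tests is dC/dt = KLa (Cs − C). The Python sketch below fits both KLa and the saturation concentration Cs from DO-versus-time data, which is one way to avoid assuming a saturation value; the data and starting values are fabricated, and the paper's exact approach and correlation equation are not reproduced.

import numpy as np
from scipy.optimize import curve_fit

def do_curve(t, Cs, KLa, C0):
    """Non-steady-state reaeration model: dC/dt = KLa*(Cs - C)."""
    return Cs - (Cs - C0) * np.exp(-KLa * t)

# Fabricated dissolved-oxygen readings during an aeration test
t = np.array([0, 2, 4, 6, 8, 10, 15, 20], dtype=float)   # minutes
C = np.array([2.0, 3.4, 4.5, 5.3, 6.0, 6.5, 7.4, 7.9])    # mg/L

# Fitting Cs together with KLa avoids assuming a saturation value
(Cs, KLa, C0), _ = curve_fit(do_curve, t, C, p0=[8.5, 0.1, 2.0])
print(KLa)   # overall oxygen transfer coefficient, 1/min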
Abstract: The traditional software product and process metrics
are neither suitable nor sufficient in measuring the complexity of
software components, which ultimately is necessary for quality and
productivity improvement within organizations adopting CBSE.
Researchers have proposed a wide range of complexity metrics for
software systems. However, these metrics are not sufficient for
components and component-based systems and are restricted to
module-oriented and object-oriented systems. This study
proposes to find the complexity of JavaBean
software components as a reflection of their quality, so that a component
can be adapted accordingly to make it more reusable. The proposed
metric involves only the design issues of the component and does not
consider packaging and deployment complexity. In this way,
the complexity of software components can be kept within certain limits,
which in turn helps enhance quality and productivity.
Abstract: The present paper is concerned primarily with the
analysis and simulation of the airflow and thermal patterns in a lecture
room. The paper is devoted to numerically investigating the influence
of the location and number of ventilation and air-conditioning supply and
extract openings on airflow properties in a lecture room. The work
focuses on airflow patterns and thermal behaviour in a lecture room
occupied by a large number of students. The effectiveness of an airflow system is
commonly assessed by the successful removal of sensible and latent
loads from occupants, together with keeping air pollutants at a
prescribed level, in order to attain human thermal comfort conditions and
to improve indoor air quality; this is the main target of the
present paper. The study is carried out using computational fluid
dynamics (CFD) simulation techniques as embedded in the
commercially available CFD code (FLUENT 6.2). The CFD
modelling technique solved the continuity, momentum and energy
conservation equations, in addition to the standard k–ε model equations
for turbulence closure.
Throughout the investigations, numerical validation is carried out by
comparing numerical and experimental results, and good
agreement is found between the two.
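For reference, the standard k–ε transport equations used for turbulence closure can be written in their usual form (with the default model constants commonly used in FLUENT); the exact source terms retained in the paper's setup are not specified here.

$$ \frac{\partial(\rho k)}{\partial t} + \frac{\partial(\rho k u_i)}{\partial x_i} = \frac{\partial}{\partial x_j}\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right] + G_k - \rho\varepsilon, $$
$$ \frac{\partial(\rho \varepsilon)}{\partial t} + \frac{\partial(\rho \varepsilon u_i)}{\partial x_i} = \frac{\partial}{\partial x_j}\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right] + C_{1\varepsilon}\frac{\varepsilon}{k} G_k - C_{2\varepsilon}\rho\frac{\varepsilon^2}{k}, \qquad \mu_t = \rho C_\mu \frac{k^2}{\varepsilon}, $$

where $G_k$ is the production of turbulent kinetic energy and the default constants are $C_\mu = 0.09$, $C_{1\varepsilon} = 1.44$, $C_{2\varepsilon} = 1.92$, $\sigma_k = 1.0$, $\sigma_\varepsilon = 1.3$.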
Abstract: The gene, the principal unit of inheritance, is an ordered
sequence of nucleotides. The genes of eukaryotic organisms include
alternating segments of exons and introns. The region of
deoxyribonucleic acid (DNA) within a gene containing instructions
for coding a protein is called an exon. On the other hand, the non-coding
regions called introns are parts of DNA that regulate gene
expression by being removed from the messenger ribonucleic acid (RNA)
in a splicing process. This paper proposes to determine splice
junctions that are exon-intron boundaries by analyzing DNA
sequences. A splice junction can be either exon-intron (EI) or intron-exon
(IE). Because of the popularity and applicability of
artificial neural networks (ANNs) in genetic fields, various ANN
models are applied in this research. Multi-layer Perceptron (MLP),
Radial Basis Function (RBF) and Generalized Regression Neural
Networks (GRNN) are used to analyze and detect the splice junctions
of gene sequences. 10-fold cross-validation is used to demonstrate
the accuracy of the networks. The real performance of these networks
is assessed by applying Receiver Operating Characteristic (ROC)
analysis.
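A minimal sketch of such a pipeline in Python (scikit-learn) is given below: DNA windows are one-hot encoded, an MLP is evaluated with stratified cross-validation, and the out-of-fold scores feed a ROC analysis. The encoding, network size, and toy sequences are assumptions for illustration; the paper's datasets and network configurations are not reproduced, and the toy set only supports 4 folds rather than 10.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_predict, StratifiedKFold
from sklearn.metrics import roc_auc_score

def one_hot(seq):
    """One-hot encode a DNA window (A, C, G, T) into a flat feature vector."""
    table = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        x[i, table[base]] = 1.0
    return x.ravel()

# Toy windows centred on a candidate junction (real data would be longer
# windows from a splice-junction dataset, labelled EI / IE / none)
seqs = ["ACGTGTAAGT", "CCAGGTAAGA", "TTTTAGGCCA", "GGGCAGGTAA",
        "ACGTACGTAC", "TTTTCCCCGG", "CAGGTTTTAA", "AACCGGTTAA"]
y = [1, 1, 0, 1, 0, 0, 1, 0]              # 1 = junction, 0 = no junction

X = np.array([one_hot(s) for s in seqs])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)

cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
scores = cross_val_predict(clf, X, y, cv=cv, method="predict_proba")[:, 1]
print(roc_auc_score(y, scores))           # ROC analysis of the predictions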
Abstract: Knowledge is attributed to humans, whose problem-solving
behavior is subjective and complex. In today's knowledge
economy, the need to manage knowledge produced by a community
of actors cannot be overemphasized. This is due to the fact that
actors possess some level of tacit knowledge which is generally
difficult to articulate. Problem-solving requires searching and sharing
of knowledge among a group of actors in a particular context.
Knowledge expressed within the context of a problem resolution
must be capitalized for future reuse. In this paper, an approach that
permits dynamic capitalization of relevant and reliable actors'
knowledge in solving decision problems following the Economic
Intelligence process is proposed. A knowledge annotation method and
temporal attributes are used for handling the complexity in the
communication among actors and in contextualizing expressed
knowledge. A prototype is built to demonstrate the functionalities of
a collaborative Knowledge Management system based on this
approach. It is tested with sample cases, and the results show that
dynamic capitalization leads to knowledge validation, hence
increasing the reliability of captured knowledge for reuse. The system
can be adapted to various domains.
Abstract: A new design of a planar passive T-micromixer with fin-shaped baffles in the mixing channel is presented. The mixing efficiency and the level of pressure loss in the channel have been investigated by numerical simulations in the range of Reynolds number (Re) from 1 to 50. A mixing index (Mi) has been defined to quantify the mixing efficiency; it exceeds 85% at both ends of the Re range, which demonstrates that the micromixer can enhance mixing using the mechanisms of diffusion (lower Re) and convection (higher Re). Three geometric dimensions (the radius of the baffles, the baffle pitch, and the height of the channel) define the design parameters, and the mixing index and pressure loss are the performance parameters used to optimize the micromixer geometry with a multi-criteria optimization method. The Pareto front of designs with the optimum trade-offs, maximum mixing index with minimum pressure loss, is obtained. Experiments for qualitative and quantitative validation have been carried out.
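A commonly used form of such a mixing index, based on the spread of concentration over N sampling points of an outlet cross-section, is sketched below; the paper's exact definition of Mi may differ.

$$ Mi = 1 - \frac{\sigma}{\sigma_{max}}, \qquad \sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(c_i - \bar{c}\right)^2}, $$

where $c_i$ is the local concentration, $\bar{c}$ the mean concentration, and $\sigma_{max}$ the standard deviation of the completely unmixed state, so that $Mi = 1$ corresponds to perfect mixing.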
Abstract: One of the main advantages of the LO paradigm is to
allow the availability of good quality, shareable learning material
through the Web. The effectiveness of the retrieval process requires a
formal description of the resources (metadata) that closely fits the
user's search criteria; in spite of the huge international efforts in this
field, educational metadata schemata often fail to fulfil this
requirement. This work aims to improve the situation through the
definition of a metadata model capturing specific didactic features of
shareable learning resources. It classifies LOs into "teacher-oriented"
and "student-oriented" categories, in order to describe the role an LO
is to play when it is integrated into the educational process. This
article describes the model and a first experimental validation process
that has been carried out in a controlled environment.
Abstract: Grid composite structures have many applications in the aerospace industry, where they frequently carry transverse loadings. In the present paper, a stiffened composite cylindrical shell with a clamped-free boundary condition under a transverse end load was studied experimentally and numerically. Electrical strain gauges were employed to measure the strains. A finite element analysis was also performed to validate the experimental results; the FEM software used was ANSYS 11. In addition, the results for the stiffened and unstiffened composite shells were compared. It was observed that the intersection of two stiffeners has an important effect in decreasing the stress in the shell. Fairly good agreement was observed between the numerical and the measured results. A review of recent studies on grid composite structures indicates that an investigation of this kind has not previously been reported.
Abstract: Reservoirs with high pressures and temperatures
(HPHT) that were considered to be atypical in the past are now
frequent targets for exploration. For downhole oilfield drilling tools
and components, the temperature and pressure affect the mechanical
strength. To address this issue, a finite element analysis (FEA) for
206.84 MPa (30 ksi) pressure and 165°C has been performed on the
pressure housing of the measurement-while-drilling/logging-while-drilling (MWD/LWD) density tool.
The density tool is an MWD/LWD sensor that measures the density
of the formation. One of the components of the density tool is the
pressure housing that is positioned in the tool. The FEA results are
compared with the experimental test performed on the pressure
housing of the density tool. The results show a close match between
the numerical analysis and the experimental test. This FEA model can
be used for extreme HPHT and ultra HPHT analyses, and/or optimal
design changes.
Abstract: The one-class support vector machine “support vector
data description” (SVDD) is an ideal approach for anomaly or outlier
detection. However, for the applicability of SVDD in real-world
applications, ease of use is crucial. The results of SVDD are
largely determined by the choice of the regularisation parameter C
and the kernel parameter of the widely used RBF kernel. While for
two-class SVMs the parameters can be tuned using cross-validation
based on the confusion matrix, for a one-class SVM this is not
possible, because only true positives and false negatives can occur
during training. This paper proposes an approach to find the optimal
set of parameters for SVDD solely based on a training set from
one class and without any user parameterisation. Results on artificial
and real data sets are presented, underpinning the usefulness of the
approach.
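For orientation, a minimal Python sketch of fitting an RBF one-class SVM on a one-class training set is shown below (using scikit-learn, where nu takes the role of the regularisation parameter). The parameter values and data are illustrative only; the paper's automatic parameter-selection procedure is not reproduced here.

import numpy as np
from sklearn.svm import OneClassSVM

# Fabricated one-class training data: "normal" operating points only
rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(200, 2))

# RBF one-class SVM; nu bounds the fraction of training points treated
# as outliers and gamma is the RBF kernel width.  The automatic choice
# of these parameters described in the paper is not reproduced here.
model = OneClassSVM(kernel="rbf", nu=0.05, gamma=0.5).fit(X_train)

# Score new points: +1 inside the learned boundary, -1 flagged as anomaly
X_test = np.array([[0.1, -0.2], [4.0, 4.0]])
print(model.predict(X_test))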
Abstract: The paper presents the results of simple measurements
conducted on a model of a wind-driven venturi-type room ventilator.
The ventilator design is new and was developed employing
mathematical modeling. However, the computational model was not
validated experimentally for the particular application considered.
The paper presents the performance of the ventilator model under
laboratory conditions, for five different wind tunnel speeds. The
results are used both to demonstrate the effectiveness of the new
design and to validate the computational model employed to develop
it.
Abstract: Since the conception of JML, many tools, applications and implementations have been developed. In this context, users or developers who want to use JML can feel surrounded by the multitude of these tools and applications. Looking for a common infrastructure and an independent language to provide a bridge between these tools and JML, we developed an approach to embedding contracts in XML for Java: XJML. This approach offers the ability to separate preconditions, postconditions and class invariants using JML and XML, so we built a front-end that can perform Runtime Assertion Checking, Extended Static Checking and Full Static Program Verification. Moreover, the capabilities of this front-end can be extended and easily implemented thanks to XML. We believe that XJML is an easy way to start building a graphical user interface, thereby delivering a friendly and IDE-independent environment to the developer community that wants to work with JML.
Abstract: One approach to assessing the neural networks underlying cognitive processes is to study electroencephalography (EEG). It is relevant to detect various mental states and to characterize the physiological changes that help to discriminate between two situations. To this end, an EEG (amplitude, synchrony) classification procedure is described and validated. The two situations are "eyes closed" and "eyes open", chosen in order to study the "alpha blocking response" phenomenon in the occipital area. The classification rate between the two situations is 92.1% (SD = 3.5%). Part of the amplitude features that help to discriminate the two situations are located in the occipital regions, which validates the localization method. Moreover, amplitude features in frontal areas, "short-distance" synchrony in frontal areas and "long-distance" synchrony between frontal and occipital areas also help to discriminate between the two situations. This procedure will be used for mental fatigue detection.
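As an illustration of a typical amplitude feature for the eyes-closed versus eyes-open contrast, the Python sketch below estimates the relative alpha-band (8-12 Hz) power of one channel with Welch's method; the signal is synthetic, and the paper's full amplitude-plus-synchrony feature set and classifier are not reproduced.

import numpy as np
from scipy.signal import welch

def alpha_power(eeg, fs):
    """Relative alpha-band (8-12 Hz) power of one EEG channel."""
    f, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    df = f[1] - f[0]
    alpha = psd[(f >= 8) & (f <= 12)].sum() * df   # alpha-band power
    total = psd[(f >= 1) & (f <= 40)].sum() * df   # broadband power
    return alpha / total

# Synthetic occipital signal: strong 10 Hz rhythm plus noise ("eyes closed")
fs = 256.0
t = np.arange(0, 8, 1 / fs)
eeg = 20 * np.sin(2 * np.pi * 10 * t) + 5 * np.random.randn(t.size)
print(alpha_power(eeg, fs))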
Abstract: The research investigates the "impact of VLE on mathematical concept acquisition of special education needs (SEN) students in the KS4 secondary education sector" in England. The overall aim of the study is to establish possible areas of difficulty for KS4 students performing above or below the knowledge standard requirements in the acquisition and validation of basic mathematical concepts. A teaching period, in which a virtual learning environment (Fronter) was used to emphasise different mathematical perceptions and symbolic representations, was carried out, and a task-based survey was conducted with 20 special education needs students [14 actually took part]. The results show that students were able to process information and consider images, objects and numbers within the VLE at the early stages of the acquisition process. They were also able to carry out perceptual tasks, but with limited processing of different quotients, and thus needed the teacher's guidance to connect them to symbolic representations and sometimes to coach them through. The pilot study further indicates that the VLE curriculum approaches for students were only minimally aligned with mathematics teaching, which does not emphasise the integration of the VLE into the existing curriculum and current teaching practice. There was also poor alignment of vision by the management regarding the use of the VLE in realising the objectives of teaching mathematics. On the part of teacher training, not much was done to develop teachers' skills in the technical and pedagogical aspects of the VLE in use at the school. The classroom observation confirmed that teaching practice will come to rely on the VLE as an enhancer of mathematical skills, providing interaction and personalisation of learning for SEN students.