Abstract: The most important problem when oil spills occur in sea water is reducing the size of the spill. This study deals with the development of a high-pressure nozzle using the dispersion method for offshore oil leakage. 3D numerical simulation results were obtained using the ANSYS Fluent 13.0 code and correlated with experimental data for validation. This paper studies the contribution of the process to the flow speed and pressure of the flow from two different geometrical designs of nozzles, and aims to generate a spray pattern suitable for dispersant application. The size distribution of droplets generated by the nozzle is calculated for pressures ranging from 2 to 6 bar. Results obtained from both analyses show a significant spray pattern and flow distribution, as well as distance. Results also show a significant contribution to the effect of oil leakage in terms of the diameter of the oil spill break-up.
Abstract: Biopharmaceutical manufacturing is one of the major economic activities worldwide. Ninety-three percent of the workforce in a biomanufacturing environment is concentrated in production-related areas. As a result, strategic collaborations between industry and academia are crucial to ensure the availability of the knowledgeable workforce an economic region needs to become competitive in biomanufacturing. In the past decade, our institution has been a key strategic partner with multinational biotechnology companies in supplying science and engineering graduates in the field of industrial biotechnology. Initiatives addressing all levels of the educational pipeline, from K-12 to college to continuing education for company employees, have been established over a ten-year span. The Amgen BioTalents Program was designed to provide undergraduate science and engineering students with training in biomanufacturing. The areas targeted by this educational program enhance their academic development, since these topics are not part of their traditional science and engineering curricula. The educational curriculum covered the process of producing a biomolecule, from the genetic engineering of cells for the production of a specifically targeted polypeptide, through protein expression and purification, to quality control and validation. This paper reports and describes the implementation details and outcomes of the first sessions of the program.
Abstract: Pressure ulcers are a common problem in today's healthcare industry. They occur due to external load applied to the skin: when a subject is immobile for a long period of time and continuous load is applied to a particular area of the body, blood flow is reduced and, as a result, a pressure ulcer develops. The body support surface has a significant role in preventing ulceration, so it is important to know the characteristics of the support surface under loading conditions. In this paper we present mathematical models of different types of viscoelastic materials and validate our simulation results against experiments.
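One of the simplest viscoelastic material models of the kind discussed above is the Kelvin-Voigt solid, whose creep response under a constant load has a closed form. A minimal sketch follows; the modulus E and viscosity eta values are illustrative assumptions, not parameters from the paper:

```python
import math

def kelvin_voigt_creep(stress, E, eta, t):
    """Creep strain of a Kelvin-Voigt solid under constant stress:
    eps(t) = (sigma/E) * (1 - exp(-E*t/eta))."""
    return (stress / E) * (1.0 - math.exp(-E * t / eta))

# Illustrative values: 5 kPa load, E = 50 kPa, eta = 200 kPa*s
strain_early = kelvin_voigt_creep(5e3, 50e3, 200e3, 1.0)
strain_late = kelvin_voigt_creep(5e3, 50e3, 200e3, 100.0)
# strain grows monotonically and approaches sigma/E = 0.1
```

The delayed approach to the elastic limit sigma/E is the signature viscoelastic behavior that distinguishes such support-surface materials from purely elastic foams.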
Abstract: This work presents results on moist air condensation in a heat exchanger. It describes the theoretical background and definition of moist air. A model with the geometry of a square channel was created for better understanding and post-processing of the condensation phenomena. Different approaches were examined on this model to find suitable software and a suitable model. The knowledge obtained was applied to the geometry of a real heat exchanger, and experimental results were compared with numerical results. One of the goals is to solve this issue without creating any user-defined functions in the applied code. The work also contains a summary of the knowledge gained and an outlook for future work.
Abstract: In heat sinks, the flow within the core exhibits separation and hence does not lend itself to simple analytical boundary layer or duct flow analysis of the wall friction. In this paper, we present findings from an experimental and numerical study aimed at obtaining physical insight into the influence of the presence of a shield and its position on the hydraulic and thermal performance of a square pin-fin heat sink without top bypass. The variations of the Nusselt number and friction factor are obtained under varied parameters, such as the Reynolds number and the shield position. The numerical code is validated by comparing the numerical results with the available experimental data, and good agreement is found between the temperature predictions based on the model and the experimental data. Results show that, in the presence of the shield, the heat transfer of the fin array is enhanced and the flow resistance is increased. The surface temperature distribution of the heat sink base is more uniform when the dimensionless shield position equals 1/3 or 2/3. A comprehensive performance evaluation based on the identical pumping power criterion shows that the optimum shield position is at x/l = 0.43.
Abstract: A new and cost-effective RP-HPLC method was developed and validated for the simultaneous analysis of the non-steroidal anti-inflammatory drugs Diclofenac sodium (DFS) and Flurbiprofen (FLP) and the opioid analgesic Tramadol (TMD) in advanced drug delivery systems (liposomes and microcapsules), marketed brands, and human plasma. An isocratic system was employed for the mobile phase, consisting of 10 mM sodium dihydrogen phosphate buffer and acetonitrile in a molar ratio of 67:33, with the pH adjusted to 3.2. The stationary phase was a Hypersil ODS column (C18, 250×4.6 mm i.d., 5 μm) at a controlled temperature of 30 °C. DFS in liposomes, microcapsules, and marketed drug products was determined in the range of 99.76-99.84%. FLP and TMD in microcapsules and brand formulations were 99.78-99.94% and 99.80-99.82%, respectively. A single-step liquid-liquid extraction procedure using a combination of acetonitrile and trichloroacetic acid (TCA) as the protein-precipitating agent was employed. The detection limits (at S/N ratio 3) of quality control solutions and plasma samples were 10, 20, and 20 ng/ml for DFS, FLP, and TMD, respectively. The assay was acceptable over the linear dynamic range. All other validation parameters were found to be within the limits of the FDA and ICH method validation guidelines. The proposed method is sensitive, accurate, and precise, and could be applied to routine analysis in the pharmaceutical industry as well as to human plasma samples for bioequivalence and pharmacokinetic studies.
Abstract: The aerodynamic stall control of a baseline 13-percent-thick NASA GA(W)-2 airfoil using a synthetic jet actuator (SJA) is presented in this paper. Unsteady Reynolds-averaged Navier-Stokes equations are solved on a hybrid grid using commercial software to simulate the effects of a synthetic jet actuator located at 13% of the chord from the leading edge, at a Reynolds number Re = 2.1×10^6 and incidence angles from 16 to 22 degrees. The experimental data for the pressure distribution at Re = 3×10^6 and aerodynamic coefficients at Re = 2.1×10^6 (angle of attack varied from -16 to 22 degrees) without the SJA are compared with the computational fluid dynamics (CFD) simulation as a baseline validation. Good agreement of the CFD simulations is obtained for the aerodynamic coefficients and pressure distribution.
A working SJA has been integrated with the baseline airfoil, with the initial focus on aerodynamic stall control at angles of attack from 16 to 22 degrees. The results show a noticeable improvement in aerodynamic performance, with an increase in lift and a decrease in drag in these post-stall regimes.
Abstract: Aggressive scaling of MOS devices requires the use of ultra-thin gate oxides to maintain reasonable short-channel behavior and to take advantage of higher density, higher speed, lower cost, etc. Such thin oxides give rise to high electric fields, resulting in considerable gate tunneling current through the gate oxide in the nanometer regime. Consequently, accurate analysis of the gate tunneling current is very important, especially in the context of low-power applications. In this paper, a simple and efficient analytical model has been developed for the channel and source/drain overlap region gate tunneling current through the ultra-thin gate oxide of an n-channel MOSFET with the inevitable deep sub-micron effect (DSME). The results obtained have been verified against simulated and reported experimental results for the purpose of validation. It is shown that the calculated tunnel current fits the measured one well over the entire oxide thickness range. Due to its simplicity, the proposed model is suitable for use in a circuit simulator. It is observed that neglecting the deep sub-micron effect may lead to a large error in the calculated gate tunneling current, while temperature has an almost negligible effect on it. It is also found that the gate tunneling current decreases as the gate oxide thickness increases. The impact of source/drain overlap length on the gate tunneling current is also assessed.
Abstract: Applying knowledge discovery techniques to unstructured text is termed knowledge discovery in text (KDT), text data mining, or text mining. Among its techniques, the decision tree approach is most useful for classification problems: a tree is constructed to model the classification process in two basic steps, building the tree and applying the tree to the database. This paper describes a proposed C5.0 classifier that applies rulesets, cross-validation, and boosting to the original C5.0 in order to reduce the error rate. The feasibility and benefits of the proposed approach are demonstrated on a medical data set, hypothyroid. The performance of a classifier on the training cases from which it was constructed gives a poor (optimistic) estimate; a better estimate is obtained by sampling or by using a separate test file, so that, either way, the classifier is evaluated on cases that were not used to build it, provided both sets are large. If the cases in hypothyroid.data and hypothyroid.test were shuffled and divided into a new 2772-case training set and a 1000-case test set, C5.0 might construct a different classifier with a lower or higher error rate on the test cases. An important feature of See5 is its ability to generate classifiers called rulesets; the ruleset has an error rate of 0.5% on the test cases. The standard errors of the means provide an estimate of the variability of results. One way to get a more reliable estimate of predictive accuracy is f-fold cross-validation: the error rate of a classifier produced from all the cases is estimated as the ratio of the total number of errors on the hold-out cases to the total number of cases. The Boost option with x trials instructs See5 to construct up to x classifiers in this manner. Trials over numerous data sets, large and small, show that on average 10-classifier boosting reduces the error rate for test cases by about 25%.
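The f-fold cross-validation estimate described above can be sketched in a few lines of stdlib Python. Since C5.0/See5 is a proprietary tool, a toy majority-class classifier stands in for it here; the fold scheme and error-ratio formula are the ones the abstract describes, everything else is illustrative:

```python
import random
from collections import Counter

def cv_error_rate(cases, labels, folds=10, seed=0):
    """f-fold cross-validation: train on f-1 folds, count errors on the
    held-out fold, and return total errors / total cases."""
    idx = list(range(len(cases)))
    random.Random(seed).shuffle(idx)
    errors = 0
    for f in range(folds):
        hold = set(idx[f::folds])  # every f-th shuffled case held out
        train_labels = [labels[i] for i in idx if i not in hold]
        # Stand-in for C5.0: always predict the majority training class
        majority = Counter(train_labels).most_common(1)[0][0]
        errors += sum(1 for i in hold if labels[i] != majority)
    return errors / len(cases)

# Hypothetical 90 'negative' vs 10 'hypothyroid' cases: the majority
# predictor errs on exactly the 10 minority cases, so the rate is 0.10
labels = ['negative'] * 90 + ['hypothyroid'] * 10
rate = cv_error_rate(list(range(100)), labels)
```

Because every case is held out exactly once across the folds, the returned ratio is the single error-rate estimate the abstract defines, rather than an average of per-fold rates.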
Abstract: The paper discusses the results obtained in predicting the reinforcement in singly reinforced beams using Neural Networks (NN), Support Vector Machines (SVMs), and tree-based models. The major advantage of SVMs over NNs is that they minimize a bound on the generalization error of the model, rather than a bound on the mean square error over the data set as NNs do. The tree-based approach divides the problem into a small number of sub-problems to reach a conclusion. A data set was created for different beam parameters, with the reinforcement calculated using the limit state method, for model creation and validation. The results from this study suggest remarkably good performance of the tree-based and SVM models. Further, this study found that these two techniques work well, and even better than neural network methods. A comparison of predicted values with actual values suggests a very good correlation coefficient for all four techniques.
Abstract: Background: Measuring an individual's health literacy is gaining attention, yet no appropriate instrument is available in Taiwan. Measurement tools that were developed and used in Western countries may not be appropriate for use in Taiwan due to the different language system. The purpose of this research was to develop a health literacy measurement instrument specific to Taiwanese adults. Methods: Several experts, including clinical physicians, healthcare administrators, and scholars, identified 125 commonly used health-related Chinese phrases from major medical knowledge sources easily accessible to the public. A five-point Likert scale is used to measure the understanding level of the target population. This measurement is then compared with the correctness of respondents' answers to a health knowledge test for validation. Samples: Samples under study were purposefully taken from four groups of people in northern Pingtung: OPD patients, university students, community residents, and casual visitors to the central park. A health knowledge index with 10 questions is used to screen out false responses. A sample of 686 valid cases out of 776 was then included to construct the scale. An independent t-test was used to examine each individual phrase; the phrases with the highest significance were identified and retained to compose the scale. Results: A Taiwan Health Literacy Scale (THLS) was finalized with 66 health-related phrases under nine divisions. Cronbach's alpha for each division is at a satisfactory level of 0.89 and above. Conclusions: Factors that significantly differentiate levels of health literacy in this initial application are education, female gender, age, being a family member of a stroke victim, experience with patient care, and being a healthcare professional.
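The Cronbach's alpha reliability check used above is a short computation: the ratio of summed item variances to the variance of the total score, rescaled by the item count. A minimal stdlib sketch over hypothetical Likert responses (the scores below are illustrative, not THLS data):

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """responses: one list of item scores per respondent.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(responses[0])
    item_vars = [pvariance([r[i] for r in responses]) for i in range(k)]
    total_var = pvariance([sum(r) for r in responses])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Four respondents rating three phrases on a five-point Likert scale;
# the items move together, so alpha comes out close to 1
scores = [[5, 4, 5], [4, 4, 4], [2, 3, 2], [1, 2, 1]]
alpha = cronbach_alpha(scores)
```

When items are uncorrelated, the total-score variance collapses toward the sum of item variances and alpha falls toward zero, which is why a per-division value of 0.89 indicates strong internal consistency.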
Abstract: Functionality and control behavior are both primary requirements in the design of a complex system. Automata theory plays an important role in modeling the behavior of a system. Z is an ideal notation for describing the state space of a system and then defining operations over it. Consequently, an integration of automata and Z is an effective tool for increasing modeling power for complex systems. Further, a nondeterministic finite automaton (NFA) may have different implementations, and therefore it is necessary to verify the transformation from diagrams to code. If we describe the formal specification of an NFA before implementing it, confidence in the transformation can be increased. In this paper, we give a procedure for integrating NFAs and Z. The complement of a special type of NFA is defined. Then the union of two NFAs is formalized after defining their complements. Finally, the formal construction of the intersection of NFAs is described. The specification of this relationship is analyzed and validated using the Z/EVES tool.
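The union construction formalized above has a direct operational reading from standard automata theory: add a fresh start state with epsilon-moves into both machines. A minimal sketch, not the paper's Z specification; the tuple encoding and state names are illustrative assumptions:

```python
def eps_closure(states, delta):
    """All states reachable from `states` via epsilon ('') moves."""
    stack, seen = list(states), set(states)
    while stack:
        s = stack.pop()
        for t in delta.get((s, ''), ()):
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def accepts(nfa, word):
    """Simulate an NFA given as (start, delta, final_states)."""
    start, delta, final = nfa
    current = eps_closure({start}, delta)
    for ch in word:
        moved = set()
        for s in current:
            moved |= set(delta.get((s, ch), ()))
        current = eps_closure(moved, delta)
    return bool(current & final)

def union(n1, n2):
    """L(union) = L(n1) | L(n2): fresh start with eps-moves to both."""
    s1, d1, f1 = n1
    s2, d2, f2 = n2
    delta = dict(d1)
    delta.update(d2)
    delta[('u0', '')] = {s1, s2}  # 'u0' assumed unused in either NFA
    return ('u0', delta, f1 | f2)

# NFA accepting only 'a' and NFA accepting only 'b' (disjoint states)
A = ('p', {('p', 'a'): {'q'}}, {'q'})
B = ('r', {('r', 'b'): {'s'}}, {'s'})
U = union(A, B)
```

Intersection can then be obtained via De Morgan's law from complement and union, which mirrors the order of constructions the abstract describes.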
Abstract: In wavelet regression, choosing the threshold value is a crucial issue. Too large a value cuts too many coefficients, resulting in over-smoothing; conversely, too small a threshold value allows many coefficients to be included in the reconstruction, giving a wiggly estimate that results in under-smoothing. The proper choice of threshold is thus a careful balance of these principles. This paper gives a very brief introduction to some threshold selection methods: universal, SURE, EBayes, two-fold cross-validation, and level-dependent cross-validation. A simulation study over a variety of sample sizes, test functions, and signal-to-noise ratios is conducted to compare their numerical performance under three different noise structures. For Gaussian noise, EBayes outperforms the others in all cases for all test functions, while two-fold cross-validation provides the best results in the case of long-tailed noise. For large signal-to-noise ratios, level-dependent cross-validation works well in the correlated-noise case. As expected, increasing either the sample size or the signal-to-noise ratio increases estimation efficiency.
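For reference, the universal threshold compared above sets lambda = sigma * sqrt(2 ln n) and is typically applied by soft thresholding the detail coefficients. A minimal stdlib sketch on a hypothetical coefficient vector (the values and sigma are illustrative):

```python
import math

def universal_soft_threshold(coeffs, sigma):
    """Soft thresholding with the universal threshold
    lambda = sigma * sqrt(2 ln n): zero the small coefficients and
    shrink the surviving ones toward zero by lambda."""
    n = len(coeffs)
    lam = sigma * math.sqrt(2.0 * math.log(n))
    return [math.copysign(max(abs(c) - lam, 0.0), c) for c in coeffs]

# Noise sigma = 1; with n = 8 coefficients, lambda is about 2.04
detail = [5.0, -0.5, 0.3, -4.0, 0.1, 2.5, -0.2, 0.0]
shrunk = universal_soft_threshold(detail, 1.0)
# noise-sized coefficients are cut; large ones survive, reduced by lambda
```

The trade-off the abstract describes is visible directly: a larger sigma (or n) raises lambda and kills more coefficients (over-smoothing), while a smaller lambda lets wiggly noise coefficients into the reconstruction.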
Abstract: A signature represents an individual characteristic of a person which can be used for his or her validation. For such an application, proper modeling is essential. Here we propose an offline signature recognition and verification scheme based on the extraction of several features, including one hybrid set, from the input signature, which are compared with the already trained forms. Feature points are classified using statistical parameters such as mean and variance. The scanned signature is slant-normalized using a very simple algorithm, with the intention of making the system robust, which is found to be very helpful. The slant correction is further aided by the use of an Artificial Neural Network (ANN). The suggested scheme discriminates between original and forged signatures for both simple and random forgeries. The primary objective is to reduce the two crucial parameters, False Acceptance Rate (FAR) and False Rejection Rate (FRR), with less training time, and to make the system dynamic using a cluster of ANNs forming a multiple classifier system.
Abstract: This paper examines the impact of object-oriented (OO) design on software quality characteristics, such as defect density and rework, by means of experimental validation. Encapsulation, inheritance, polymorphism, reusability, data hiding, and message passing are the major attributes of an object-oriented system, and they can act as indicators in evaluating the quality of such a system. Metrics are a well-known quantifiable approach to express these attributes. Hence, in this paper we formulate a framework of metrics representing the attributes of an object-oriented system. Empirical data is collected from three different projects based on the object-oriented paradigm to calculate the metrics.
Abstract: In this paper we propose an NLP-based method for ontology population from texts and apply it to semi-automatically instantiate a generic knowledge base (generic domain ontology) in the risk management domain. The approach is semi-automatic and uses domain expert intervention for validation. The proposed approach relies on a set of instance recognition rules based on syntactic structures, and on the predicative power of verbs in the instantiation process. It is not domain-dependent, since it relies heavily on linguistic knowledge.
A description of an experiment performed on a part of the ontology of the PRIMA project (supported by the European Community) is given. A first validation of the method is done by populating this ontology with Chemical Fact Sheets from the Environmental Protection Agency. The results of this experiment complete the paper and support the hypothesis that relying on the predicative power of verbs in the instantiation process improves performance.
Abstract: The quantified residence time distribution (RTD) provides a numerical characterization of mixing in a reactor, thus allowing the process engineer to better understand the mixing performance of the reactor. This paper discusses computational studies to investigate flow patterns in a two-impinging-streams cyclone reactor (TISCR). Flow in the reactor was modeled with computational fluid dynamics (CFD). Utilizing the Eulerian-Lagrangian approach, implemented in FLUENT (V6.3.22), particle trajectories were obtained by solving the particle force balance equations. From simulation results obtained at different Δt values, the mean residence time (tm) and the mean square deviation (σ²) were calculated. Good agreement is observed between predicted and experimental data. The simulation results indicate that the behavior of complex reactor systems can be predicted using the CFD technique with a minimum of data required for validation.
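The RTD moments mentioned above follow directly from the simulated particle residence times: tm is their mean and σ² their mean square deviation about tm. A minimal sketch (the residence-time list is illustrative, not TISCR data):

```python
def rtd_moments(residence_times):
    """Mean residence time tm and mean square deviation sigma^2
    of the residence time distribution, from tracked particle times."""
    n = len(residence_times)
    tm = sum(residence_times) / n
    sigma2 = sum((t - tm) ** 2 for t in residence_times) / n
    return tm, sigma2

# Hypothetical residence times (s) of six tracked particles
times = [0.8, 1.1, 0.9, 1.3, 1.0, 0.9]
tm, sigma2 = rtd_moments(times)  # tm = 1.0 s
```

A small σ² relative to tm² indicates narrow residence-time spread (plug-flow-like mixing), which is the kind of conclusion the RTD characterization supports.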
Abstract: The forces driving international markets are changing continuously, so companies need to gain a competitive edge in such markets. Improving the company's products, processes, and practices is no longer optional. Lean production is a production management philosophy that consolidates work tasks with minimum waste, resulting in improved productivity. Lean production practices can be mapped onto many production areas, one of which is Manufacturing Equipment and Technology (MET). Many lean production practices can be implemented in MET, namely: specific equipment configurations, total preventive maintenance, visual control, new equipment/technologies, production process reengineering, and a shared vision of perfection. The purpose of this paper is to investigate the implementation level of these six practices in Jordanian industries. To achieve this, a questionnaire survey was designed on a five-point Likert scale and validated through a pilot study and expert review. A sample of 350 Jordanian companies was surveyed, with a response rate of 83%. The respondents were asked to rate the extent of implementation of each practice. A conceptual relationship model is developed, hypotheses are proposed, and the essential statistical analyses are then performed. An assessment tool that enables management to monitor the progress and effectiveness of lean practice implementation is designed and presented. The results show that the average implementation level of lean practices in MET is 77%, that Jordanian companies are successfully implementing the considered lean production practices, and that the presented model has a Cronbach's alpha value of 0.87, which is good evidence of model consistency and validates the results.
Abstract: The value of the overall oxygen transfer coefficient (KLa), which is the best measure of oxygen transfer into water through aeration, is obtained by a simple approach that eliminates the discrepancies due to inaccurate assumption of the saturation dissolved oxygen concentration. The rate of oxygen transfer depends on a number of factors, such as the intensity of turbulence, which in turn depends on the speed of rotation, the size and number of blades, the diameter and immersion depth of the rotor, and the size and shape of the aeration tank, as well as on the physical, chemical, and biological characteristics of the water. An attempt is made in this paper to correlate the overall oxygen transfer coefficient (KLa) with the influencing parameters mentioned above. The simulation equation developed predicts the values of KLa and power with average standard errors of estimation of 0.0164 and 7.66, respectively, and with R² values of 0.979 and 0.989, respectively, when compared with experimentally determined values. This model is also compared with a model generated using computational fluid dynamics (CFD), and the two models are found to be in good agreement with each other.
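A standard way to obtain KLa from reaeration data, consistent with the mass-transfer relation underlying the approach above, is a least-squares fit of ln((Cs - C)/(Cs - C0)) = -KLa * t. A minimal stdlib sketch; the DO readings below are synthetic illustrations, not the paper's data:

```python
import math

def estimate_kla(times, do_conc, cs):
    """Fit ln((Cs - C)/(Cs - C0)) = -KLa * t through the origin by
    least squares; returns KLa in the inverse of the time units."""
    c0 = do_conc[0]
    y = [math.log((cs - c) / (cs - c0)) for c in do_conc]
    # slope of y = m*t through the origin: m = sum(t*y) / sum(t*t)
    m = sum(t * v for t, v in zip(times, y)) / sum(t * t for t in times)
    return -m

# Synthetic DO data generated with KLa = 0.2 /min, Cs = 9 mg/L, C0 = 2 mg/L
times = [0, 2, 4, 6, 8]
do = [9 - 7 * math.exp(-0.2 * t) for t in times]
kla = estimate_kla(times, do, 9.0)  # recovers ~0.2 /min
```

Because the fit is linear in the transformed variable, an inaccurate assumed Cs shows up as curvature in the ln-plot, which is exactly the discrepancy the abstract's approach seeks to eliminate.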
Abstract: Traditional software product and process metrics are neither suitable nor sufficient for measuring the complexity of software components, which is ultimately necessary for quality and productivity improvement within organizations adopting CBSE. Researchers have proposed a wide range of complexity metrics for software systems. However, these metrics are not sufficient for components and component-based systems, being restricted to module-oriented and object-oriented systems. This study proposes to measure the complexity of JavaBean software components as a reflection of their quality, so that a component can be adopted accordingly to make it more reusable. The proposed metric involves only the design issues of the component and does not consider packaging and deployment complexity. In this way, software components can be kept within certain limits, which in turn helps in enhancing quality and productivity.