Abstract: Nickel and gold nanoclusters used as supported catalysts were analyzed by XAS, XRD, and XPS in order to determine their local, global, and electronic structure. The present study points out a strong deformation of the local structure of the metal due to its interaction with the oxide supports. The average particle size, the mean square of the microstrain, and the particle size distribution and microstrain functions of the supported Ni and Au catalysts were determined by the XRD method, using the Generalized Fermi Function to approximate the X-ray line profiles. Based on the EXAFS analysis, we consider that the local structure of the investigated systems is strongly distorted with respect to the number of atomic pairs. The metal-support interaction is confirmed by the changes in shape of the probability densities of electron transitions: Ni K edge (1s → continuum and 2p) and Au LIII edge (2p3/2 → continuum, 6s, 6d5/2, and 6d3/2). XPS investigations confirm the metal-support interaction at the interface.
Abstract: Effective estimation of software development effort is an important issue during project planning. This study provides a model to predict development effort based on software size estimated in function points. We generalize the average amount of effort spent on each phase of development and give estimates for the effort used in software building, testing, and implementation. Finally, the paper finds a strong correlation between software defects and software size. As the size of software constantly increases, quality remains a matter of major concern.
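A size-to-effort model of the kind the abstract describes can be sketched as below. The power-law coefficients and the per-phase percentages are invented placeholders for illustration, not values reported by the study.

```python
# Illustrative sketch: predicting development effort from function points.
# The coefficients a, b and the phase shares are hypothetical, not the
# study's fitted values.

def estimate_effort(function_points, a=1.4, b=1.0):
    """Total effort (person-hours) as a simple power law of size."""
    return a * function_points ** b

def phase_breakdown(total_effort, shares=None):
    """Split total effort across phases by assumed average percentages."""
    if shares is None:
        shares = {"building": 0.50, "testing": 0.30, "implementation": 0.20}
    return {phase: total_effort * share for phase, share in shares.items()}

total = estimate_effort(400)      # a project sized at 400 function points
phases = phase_breakdown(total)
```

Fitting `a` and `b` against historical project data (e.g. by regression on log size vs. log effort) is what turns this skeleton into an actual estimation model.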
Abstract: Influence Diagrams (IDs) are a kind of probabilistic belief network for graphical modeling. The use of IDs can improve communication among field experts, modelers, and decision makers by showing the frame of the issue under discussion from a high-level point of view. This paper enhances the Time-Sliced Influence Diagrams (TSIDs, also called Dynamic IDs) formalism from a Discrete Event Systems Modeling and Simulation (DES M&S) perspective, for Exploratory Analysis (EA) modeling. The enhancements enable a modeler to specify the occurrence times of endogenous events dynamically, through stochastic sampling as the model runs, and to describe the inter-influences among them with variable nodes in a dynamic situation that the existing TSIDs fail to capture. The new class of model is named Dynamic-Stochastic Influence Diagrams (DSIDs). The paper includes a description of the modeling formalism and the hierarchy of simulators implementing its simulation algorithm, and presents a case study to illustrate the enhancements.
Abstract: Some of students' problems with writing skills stem from inadequate preparation for the writing assignment. Students should be taught how to write well when they arrive in language classes. Having selected a topic, the students examine and explore the theme from as wide a variety of viewpoints as their background and imagination allow. Another strategy is for the students to prepare an outline before writing the paper. The comparison between the two thought-provoking techniques was carried out between two class groups, students of Islamic Azad University of Dezful who were studying "Writing 2" as their main course. Each class group was assigned to write five compositions separately over different periods of time. A t-test for each pair of exams between the two class groups then showed that the t-observed in each pair exceeded the t-critical. Consequently, the hypothesis that those who use brainstorming as a thought-provoking technique in the prewriting phase are more successful than those who outline their papers before writing was confirmed.
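The comparison described above can be sketched as an independent two-sample t-test; the composition scores below are invented for illustration, and the pooled-variance statistic is computed in plain Python.

```python
import math

def t_statistic(group_a, group_b):
    """Pooled-variance independent two-sample t-statistic."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

brainstorm = [78, 85, 82, 90, 76, 88]  # hypothetical composition scores
outline    = [70, 75, 72, 80, 68, 74]
t_obs = t_statistic(brainstorm, outline)
# Compare t_obs against the t-critical value at na + nb - 2 = 10 df.
```

The hypothesis test in the study corresponds to checking whether `t_obs` exceeds the critical value from a t-table for the chosen significance level.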
Abstract: Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and should be performed at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the rest for future use, so as to reduce the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the maintained feature points are insufficiently accurate for future modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly reducing the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, further reducing the cost. In addition, the feature points remaining after reduction are sufficient for tracking background objects, as demonstrated in the simple video stabilizer based on our proposed algorithm.
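The estimate-then-eliminate loop described above can be sketched as follows. This is a simplification, not the paper's method: a 4-parameter similarity model stands in for the simplified affine model, and a worst-point-versus-median-residual rule stands in for the studentized-residual test; all point coordinates are invented, and numpy is assumed.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares fit of x' = a*x - b*y + tx,  y' = b*x + a*y + ty."""
    rows, rhs = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        rows.append([x, -y, 1.0, 0.0]); rhs.append(xp)
        rows.append([y,  x, 0.0, 1.0]); rhs.append(yp)
    params, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return params  # a, b, tx, ty

def robust_fit(src, dst, ratio=3.0):
    """Refit, dropping the worst point while its residual exceeds `ratio`
    times the median residual (a crude stand-in for studentized residuals)."""
    src, dst = list(src), list(dst)
    while True:
        a, b, tx, ty = fit_similarity(src, dst)
        pred = np.array([(a*x - b*y + tx, b*x + a*y + ty) for x, y in src])
        res = np.linalg.norm(np.array(dst) - pred, axis=1)
        worst = int(res.argmax())
        if res[worst] <= ratio * max(np.median(res), 1e-9):
            return (a, b, tx, ty)
        del src[worst], dst[worst]

# Background points follow a pure translation (+2, +1); one point lies on
# a moving object and violates the model (all coordinates invented).
src = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 0),
       (0, 5), (10, 5), (5, 10), (5, 5)]
dst = [(x + 2, y + 1) for x, y in src[:-1]] + [(200.0, 300.0)]
a, b, tx, ty = robust_fit(src, dst)   # converges to a≈1, b≈0, tx≈2, ty≈1
```

Once the outlier is eliminated, the fitted translation matches the background motion exactly, which is what the stabilizer would then compensate for.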
Abstract: In this article, we discuss the formulation of two explicit group iterative finite difference methods for the time-dependent two-dimensional Burgers' problem on a variable mesh. For the non-linear problem, the discretization leads to a non-linear system whose Jacobian is a tridiagonal matrix. We discuss Newton's explicit group iterative methods for a general Burgers' equation. The proposed explicit group methods are derived from the standard point and rotated point Crank-Nicolson finite difference schemes. Their computational complexity analysis is discussed. Numerical results are given to demonstrate the feasibility of the two proposed iterative methods.
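The key structural fact above, a Newton iteration whose Jacobian is tridiagonal and can therefore be solved in O(n) with the Thomas algorithm, can be sketched on a toy problem. The cubic test system below is an invented stand-in with a manufactured exact solution, not the paper's Crank-Nicolson discretization of Burgers' equation.

```python
# Sketch: Newton's method on a nonlinear system with a tridiagonal
# Jacobian, solved by the Thomas algorithm. The test system
# -u[i-1] + 2 u[i] + u[i]^3 - u[i+1] = f[i] (zero Dirichlet BCs)
# is illustrative only.

def thomas(lower, diag, upper, rhs):
    """Solve a tridiagonal linear system in O(n)."""
    n = len(diag)
    d, r = diag[:], rhs[:]
    for i in range(1, n):                 # forward elimination
        w = lower[i - 1] / d[i - 1]
        d[i] -= w * upper[i - 1]
        r[i] -= w * r[i - 1]
    x = [0.0] * n
    x[-1] = r[-1] / d[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
    return x

def residual(u, f):
    """F_i(u) = -u[i-1] + 2 u[i] + u[i]^3 - u[i+1] - f[i]."""
    n = len(u)
    return [-(u[i-1] if i > 0 else 0.0) + 2*u[i] + u[i]**3
            - (u[i+1] if i < n-1 else 0.0) - f[i] for i in range(n)]

def newton(f, u0, tol=1e-12, max_iter=50):
    u = list(u0)
    n = len(u)
    for _ in range(max_iter):
        F = residual(u, f)
        if max(abs(v) for v in F) < tol:
            break
        # Tridiagonal Jacobian: dF_i/du_i = 2 + 3 u_i^2, off-diagonals -1.
        diag = [2.0 + 3.0 * u[i]**2 for i in range(n)]
        du = thomas([-1.0]*(n-1), diag, [-1.0]*(n-1), [-v for v in F])
        u = [ui + d for ui, d in zip(u, du)]
    return u

u_exact = [0.5, 1.0, 0.5]           # manufactured solution
f = residual(u_exact, [0.0] * 3)    # right side that makes it exact
u = newton(f, [0.0, 0.0, 0.0])
```

Each Newton step costs only O(n) here, which is the property the explicit group methods exploit on a larger scale.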
Abstract: With respect to the dissipation of energy through
plastic deformation of joints of prefabricated wall units, the paper
points out the principal importance of efficient reinforcement of the
prefabricated system at its joints. The method, quality and amount of
reinforcement are essential for reaching the necessary degree of joint
ductility. The paper presents partial results of experimental research on vertical joints of prefabricated units exposed to monotonically increasing loading and repeated shear force, and formulates the conclusion that the limit state of the structure as a whole is preceded by the disintegration of its joints, i.e. that the structure tends to pass from linearly elastic behaviour through non-linearly elastic to plastic behaviour by exceeding the proportional elastic limit in the joints. Experimental verification on a model of a 7-storey prefabricated structure revealed weak points in its load-bearing systems, mainly at critical points around openings situated in close proximity to vertical joints of mutually perpendicularly oriented walls.
Abstract: This work is focused on the steady boundary layer flow
near the forward stagnation point of plane and axisymmetric bodies
towards a stretching sheet. The no slip condition on the solid
boundary is replaced by a partial slip condition. Analytical solutions for the velocity distributions are obtained, in series form with the help of the homotopy analysis method (HAM), for various values of the ratio of free-stream velocity to stretching velocity, the slip parameter, the suction and injection velocity parameter, the magnetic parameter, and the dimensionality index parameter. Convergence of the
series is explicitly discussed. Results show that the flow and the skin
friction coefficient depend heavily on the velocity slip factor. In
addition, the effects of all the parameters mentioned above were more
pronounced for plane flows than for axisymmetric flows.
Abstract: The reliability of the instruments developed to identify learning styles is essential for determining students' learning styles accurately. For this purpose, the psychometric properties of the Grasha-Riechmann Student Learning Style Inventory, developed by Grasha, were studied as a contribution to this field. The study was carried out on 6th, 7th, and 8th graders of 10 primary schools in Konya. The inventory was administered twice with an interval of one month, and according to the data from this administration, the reliability coefficients of the 6 sub-dimensions posited in the theory of the inventory were found to be moderate. In addition, it was found that the inventory does not have the 6-factor structure represented in the theory for either Mathematics or English courses.
Abstract: Since feasibility studies of R&D programs were initiated in 2008 for efficient public R&D investment, such studies have improved in precision. Although experience with these studies of R&D programs has accumulated to a certain point, methodological improvement is still required. Feasibility studies of R&D programs comprise various viewpoints, such as technology, policy, and economics. This research provides improvement methods for the economic perspective, especially the cost estimation process for R&D activities. First, the fundamental concept of cost estimation is reviewed. After the review, a statistical and econometric analysis method is applied as an empirical analysis. In conclusion, limitations and further research directions are provided.
Abstract: Many studies have shown that parallelization decreases efficiency [1], [2]. There are many reasons for these decrements. This paper investigates those which appear in the context of parallel data integration. Integration processes generally cannot be allocated to packages of identical size (i.e. tasks of identical complexity). The reason for this is unknown heterogeneous input data, which result in variable task lengths. Process delay is determined by the slowest processing node and has a detrimental effect on the total processing time. Using a real-world example, this study shows that while process delay initially increases with the introduction of more nodes, it ultimately decreases again after a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.
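The notion of process delay, the slowest node's finishing time versus the perfectly balanced ideal, can be measured in a toy model like the one below. The exponential workload and random task-to-node assignment are invented assumptions; this harness only illustrates how such a delay is quantified, and does not by itself claim to reproduce the rise-then-fall effect observed in the paper.

```python
import random

def process_delay(task_lengths, n_nodes, seed=0):
    """Slowest node's load minus the ideal (perfectly balanced) load,
    under a seeded random task-to-node assignment."""
    rng = random.Random(seed)
    loads = [0.0] * n_nodes
    for t in task_lengths:
        loads[rng.randrange(n_nodes)] += t
    ideal = sum(task_lengths) / n_nodes
    return max(loads) - ideal

# Heterogeneous input data -> variable task lengths (invented workload).
rng = random.Random(42)
tasks = [rng.expovariate(1.0) for _ in range(1000)]
delays = {n: process_delay(tasks, n) for n in (2, 4, 8, 16, 32)}
```

Sweeping `n` over many workload samples is the kind of experiment a stochastic model of the delay would be validated against.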
Abstract: Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first 6 months of life is very important, as it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, it helps to reduce the incidence and/or severity of diarrhea, lower respiratory infection, and urinary tract infection. In this paper, we survey the factors that influence exclusive breastfeeding and use two statistical models for dispersed count data to analyze the data: the Generalized Poisson regression model and the Com-Poisson regression model.
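The Generalized Poisson distribution underlying the first regression model can be sketched directly; in Consul's parameterization its variance exceeds its mean whenever the dispersion parameter is positive, which is what makes it suitable for overdispersed counts. The parameter values below are arbitrary illustrations, not estimates from the paper's data.

```python
import math

def gpois_pmf(y, theta, lam):
    """Generalized Poisson pmf (Consul's form), 0 <= lam < 1;
    lam = 0 recovers the ordinary Poisson distribution."""
    return (theta * (theta + lam * y) ** (y - 1)
            * math.exp(-theta - lam * y) / math.factorial(y))

theta, lam = 2.0, 0.3
mean = theta / (1 - lam)           # E[Y]
var = theta / (1 - lam) ** 3       # Var[Y] > E[Y]: overdispersion
total = sum(gpois_pmf(y, theta, lam) for y in range(150))  # ~1.0
```

A Generalized Poisson regression then links `mean` to covariates (typically via a log link) while `lam` absorbs the extra-Poisson variation.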
Abstract: An analytical procedure is carried out in this paper to calculate the ultimate load capacity of reinforced concrete corbels strengthened or repaired externally with CFRP sheets. The strut-and-tie method and the shear friction method, proposed earlier for analyzing reinforced concrete corbels, were modified to incorporate the effect of the external CFRP sheets bonded to the corbel. The points of weakness of each method that lead to inaccuracy, especially overestimation of test results, were checked and discussed. Comparison of the predictions with the test data indicates that the ratio of test to calculated ultimate load is 0.82 and 1.17 using the strut-and-tie method and the shear friction method, respectively. If the limits on maximum shear stress are followed, the ultimate load capacity calculated using the shear friction method underestimates the test data considerably.
Abstract: This paper studies the dependability of component-based applications, especially embedded ones, from the diagnosis point of view. The principle of the diagnosis technique is to implement inter-component tests in order to detect and locate faulty components without redundancy. The proposed approach for diagnosing faulty components consists of two main aspects. The first concerns the execution of the inter-component tests, which requires integrating test functionality within a component; this is the subject of this paper. The second is the diagnosis process itself, which consists of the analysis of inter-component test results to determine the fault state of the whole system. The advantages of this diagnosis method compared to classical redundancy-based fault-tolerance techniques are application autonomy, cost-effectiveness, and better usage of system resources. These advantages are very important for many systems, and especially for embedded ones.
Abstract: This project focuses on the development of a line follower algorithm for a two-wheel balancing robot. In this project, an ATMEGA32 is chosen as the brain-board controller, reacting to the data received from the Balance Processor Chip on the balance board, which monitors changes in the environment through two infra-red distance sensors to solve the inclination angle problem. Hence, the system immediately returns to the set point (balance position) through the internal PID algorithms implemented on the balance board. Application of infra-red light sensors with PID control is vital for developing a smooth line follower robot. By combining the line follower program with the internal self-balancing algorithms, we are able to develop a dynamically stabilized balancing robot with a line follower function.
Abstract: The article deals with a numerical investigation of an axisymmetric subsonic air-to-air ejector. An analysis of the flow and mixing processes in the cylindrical mixing chamber is made. Several modes with different velocity and ejection ratios are presented. The mixing processes are described, and the differences between the flow in the initial region of mixing and in the main region of mixing are explained. The lengths of both regions are evaluated, and the transition point and the point where the mixing processes are finished are identified. It was found that the length of the initial region of mixing depends strongly on the velocity ratio, while the length of the main region of mixing depends on it only slightly.
Abstract: In a previous work, we presented the numerical
solution of the two dimensional second order telegraph partial
differential equation discretized by the centred and rotated five-point
finite difference discretizations, namely the explicit group (EG) and
explicit decoupled group (EDG) iterative methods, respectively. In
this paper, we utilize a domain decomposition algorithm on these
group schemes to divide the tasks involved in solving the same
equation. The objective of this study is to describe the development
of the parallel group iterative schemes under OpenMP programming
environment as a way to reduce the computational costs of the
solution processes using multicore technologies. A detailed
performance analysis of the parallel implementations of the point and group iterative schemes will be reported and discussed.
Abstract: Logistics is part of the supply chain processes that
plans, implements, and controls the efficient and effective forward
and reverse flow and storage of goods, services, and related
information between the point of origin and the point of consumption
in order to meet customer requirements. This research aims to
investigate the current status and future direction of the use of
Information Technology (IT) for logistics, focusing on Supply Chain
Management (SCM) and E-Commerce adoption in Malaysia.
Therefore, this research focuses on the types of technology being adopted and the factors, benefits, and barriers affecting innovation in SCM and E-Commerce technology adoption among Logistics Service Providers (LSPs). A mailed questionnaire survey was conducted to collect data from 265 logistics companies in Johor. The research revealed a high level of SCM technology adoption among LSPs, as they had adopted SCM technology in various business processes and perceived a high level of benefits from SCM adoption.
Abstract: This paper presents an approach, based on the adoption of a distributed cognition framework and a non-parametric multicriteria evaluation methodology (DEA), designed specifically to compare e-commerce websites from the consumer/user viewpoint. In particular, the framework considers a website's relative efficiency as a measure of its quality and usability. A website is modelled as a black box capable of providing the consumer/user with a set of functionalities. When the consumer/user interacts with the website to perform a task, he/she is involved in a cognitive activity, sustaining a cognitive cost to search, interpret, and process information, and experiencing a sense of satisfaction. The degree of ambiguity and uncertainty he/she perceives and the search time needed determine the size of the effort, and hence the amount of cognitive cost, he/she has to sustain to perform the task. Conversely, task completion and result achievement induce a sense of gratification, satisfaction, and usefulness. In total, 9 variables are measured,
classified in a set of 3 website macro-dimensions (user experience,
site navigability and structure). The framework is implemented to
compare 40 websites of businesses performing electronic commerce
in the information technology market. A questionnaire to collect
subjective judgements for the websites in the sample was purposely
designed and administered to 85 university students enrolled in
computer science and information systems engineering
undergraduate courses.
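The idea of relative efficiency in DEA can be sketched in its simplest special case. With a single input and a single output, the CCR efficiency of each unit reduces to its output/input ratio normalized by the best observed ratio; the full study uses 9 variables across 3 macro-dimensions and would require a linear-programming formulation instead. The numbers below are invented.

```python
def dea_ccr_single(inputs, outputs):
    """CCR efficiency with one input and one output: each unit's
    output/input ratio divided by the best observed ratio."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Invented example: input = cognitive cost score, output = satisfaction.
cost         = [4.0, 2.0, 5.0, 3.0]
satisfaction = [8.0, 6.0, 5.0, 9.0]
scores = dea_ccr_single(cost, satisfaction)  # the best sites score 1.0
```

In the multi-variable case the weights on inputs and outputs are chosen per unit by a linear program (e.g. via `scipy.optimize.linprog`), which is what makes DEA non-parametric.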
Abstract: Lactic acid, alone and in combination with nisin, was evaluated for reducing populations of naturally occurring microorganisms on chilled shrimp. Fresh shrimps were dipped for 10 min in 0, 1.0%, and 2.0% (v/v) lactic acid, alone and in combination with a 0.04 (g/L/kg) nisin solution. Total plate counts of aerobic bacteria (TPCs), psychrotrophic counts, and populations of Pseudomonas spp., H2S-producing bacteria, and lactic acid bacteria (LAB) on the shrimps were determined during storage at 4 °C. The results indicated that total plate counts were 2.91 and 2.63 log CFU/g higher on untreated shrimps after 7 and 14 days of storage, respectively, than on shrimps treated with 2.0% lactic acid combined with 0.04 (g/L/kg) nisin. Both concentrations of lactic acid produced significant reductions in Pseudomonas counts during storage, while 2.0% lactic acid combined with nisin produced the greatest reduction. In addition, H2S-producing bacteria were more sensitive to the high concentration of lactic acid combined with nisin during storage.