Abstract: Computers are increasingly being used as educational
tools in elementary/primary schools worldwide. A specific
application of such computer use is that of multimedia games, where
the aim is to combine pedagogy and entertainment. This study
reports on a case study in which an educational multimedia game was
developed for use by elementary school children. The stages of
the application's design, implementation, and evaluation are
presented. The game's strengths are identified and discussed, and its
weaknesses are noted, allowing suggestions for future redesigns.
The results show that the use of games can engage children
in the learning process for longer periods of time with the added
benefit of the entertainment factor.
Abstract: The present paper develops and validates a numerical procedure for the calculation of turbulent combusting flow in converging and diverging ducts; through simulation of the heat transfer processes, the production and spread of the NOx pollutant are computed. A marching integration solution procedure employing the TDMA (tridiagonal matrix algorithm) is used to solve the discretized equations. Turbulence is modeled with the Prandtl mixing-length method, and the combustion process with the Arrhenius and eddy-dissipation models. The thermal mechanism is used to model the formation of the nitrogen oxides. The finite difference method and the Genmix numerical code are used for the numerical solution of the equations. Our results indicate the important influence of the diffuser's limiting divergence angle on the pressure recovery coefficient. Moreover, because NOx formation depends strongly on the maximum temperature in the domain, the NOx level also reaches its maximum at this angle.
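The TDMA mentioned above is the Thomas algorithm for tridiagonal systems, which marching procedures of this kind solve station by station. A minimal, self-contained sketch (not the paper's Genmix implementation; the array layout is an assumption):

```python
def tdma(a, b, c, d):
    """Thomas algorithm for a tridiagonal system.

    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side.
    Returns the solution vector x.
    """
    n = len(d)
    cp = [0.0] * n   # modified super-diagonal
    dp = [0.0] * n   # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # forward elimination
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    # back substitution
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

The O(n) cost per line is what makes TDMA attractive inside a marching integration loop.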
Abstract: In this paper, the telegraph equation is solved numerically by cubic B-spline quasi-interpolation. We obtain the numerical scheme by using the derivative of the quasi-interpolant to approximate the spatial derivative of the dependent variable and a low-order forward difference to approximate its temporal derivative. The advantage of the resulting scheme is its simplicity, which makes it very easy to implement. The results of numerical experiments are presented and compared with analytical solutions by computing the L2 and L∞ error norms, confirming the good accuracy of the presented scheme.
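The reported L2 and L∞ error norms can be computed as follows; a minimal sketch assuming the numerical and exact solutions are sampled on a uniform grid of spacing h (the grid and weighting are illustrative assumptions, not taken from the paper):

```python
import math

def error_norms(numeric, exact, h):
    """Discrete L2 and L-infinity error norms between a numerical
    solution and the exact solution sampled on a grid of spacing h."""
    diffs = [abs(u - v) for u, v in zip(numeric, exact)]
    l2 = math.sqrt(h * sum(d * d for d in diffs))   # grid-weighted L2
    linf = max(diffs)                               # worst pointwise error
    return l2, linf
```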
Abstract: This paper argues that a product development exercise
involves, in addition to the conventional stages, several decisions
regarding other aspects. These aspects should be addressed
simultaneously in order to develop a product that responds to
customer needs and helps realize the stakeholders' objectives
in terms of profitability, market share, and the like. We present a
framework that encompasses these different development
dimensions. The framework shows that a product development
methodology such as Quality Function Deployment (QFD) is the
basic tool which allows definition of the target specifications of a
new product. Creativity is the first dimension, enabling the
development exercise to proceed and conclude successfully. A number of
group processes need to be followed by the development team in
order to ensure enough creativity and innovation. Secondly,
packaging is considered to be an important extension of the product.
Branding strategies, quality and standardization requirements,
identification technologies, design technologies, production
technologies, and costing and pricing are also integral parts of the
development exercise. These dimensions constitute the proposed
framework. The paper also presents a mathematical model used to
calculate the design targets based on the target costing principle. The
framework is used to study a case of a new product development in
the telecommunications services sector.
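The paper's mathematical model is not reproduced in the abstract; as a minimal illustration of the underlying target-costing principle it builds on (allowable cost = target price minus required profit), with hypothetical numbers:

```python
def target_cost(target_price, required_margin_ratio):
    """Target-costing principle: the allowable cost is the market-driven
    target price minus the profit the stakeholders require."""
    return target_price * (1.0 - required_margin_ratio)

# Hypothetical example: a product priced at 100 with a 25% required
# margin leaves an allowable (target) cost of 75 to drive design targets.
allowable = target_cost(100.0, 0.25)
```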
Abstract: This paper discusses a new, systematic approach to
the synthesis of an NP-hard class of non-regenerative Boolean
networks, described by FON[FOFF]={mi}[{Mi}], where for every
mj[Mj]∈{mi}[{Mi}], there exists another mk[Mk]∈{mi}[{Mi}], such
that their Hamming distance HD(mj, mk)=HD(Mj, Mk)=O(n), (where
'n' represents the number of distinct primary inputs). The method
automatically ensures exact minimization for certain important self-dual
functions with 2^(n-1) points in their one-set. The elements meant for
grouping are determined from a newly proposed weighted incidence
matrix. Then the binary value corresponding to the candidate pair is
correlated with the proposed binary value matrix to enable direct
synthesis. We recommend algebraic factorization operations as a post
processing step to enable reduction in literal count. The algorithm
can be implemented in any high level language and achieves best
cost optimization for the problem dealt with, irrespective of the
number of inputs. For other cases, the method is iterated to
subsequently reduce it to a problem of O(n-1), O(n-2), ..., and then
solved. In addition, it leads to optimal results for problems exhibiting
higher degree of adjacency, with a different interpretation of the
heuristic, and the results are comparable with other methods.
In terms of literal cost, at the technology independent stage, the
circuits synthesized using our algorithm enabled net savings over
AOI (AND-OR-Invert) logic, AND-EXOR logic (EXOR Sum-of-
Products or ESOP forms) and AND-OR-EXOR logic by 45.57%,
41.78% and 41.78% respectively for the various problems.
Circuit level simulations were performed for a wide variety of
case studies at 3.3V and 2.5V supply to validate the performance of
the proposed method and the quality of the resulting synthesized
circuits at two different voltage corners. Power estimation was
carried out for a 0.35micron TSMC CMOS process technology. In
comparison with AOI logic, the proposed method enabled mean
savings in power by 42.46%. With respect to AND-EXOR logic, the
proposed method yielded power savings to the tune of 31.88%, while
in comparison with AND-OR-EXOR networks, average power
savings of 33.23% were obtained.
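The Hamming distance HD(mj, mk) used above to characterize the function class can be computed directly on minterms encoded as integers; a minimal sketch (the integer encoding of minterms is an assumption):

```python
def hamming_distance(m_j, m_k):
    """Hamming distance between two minterms given as integers:
    the number of bit positions in which they differ."""
    return bin(m_j ^ m_k).count("1")

# For n = 4 inputs, minterms 0b0000 and 0b1111 differ in all n
# positions, i.e. HD = 4 = O(n) as required by the class definition.
```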
Abstract: This research studies the types of products and
services that employ ambient media and the respective techniques in their
advertising materials. Data were collected through analyses of a total of 62 advertisements that employed the ambient media
approach in Thailand during the years 2004 to 2011. The 62 advertisements were qualifying entries of the Adman Awards
& Symposium under the category of Outdoor & Ambience. The analysis
reveals a total of 14 products and services that
chose to utilize ambient media in their advertisements. Amongst all ambient media techniques, 'intrusion', which exploits the value of a medium in
its representation of content, is used most often. Following intrusion is 'interaction', where consumers are invited to participate and interact
with the advertising materials. 'Illusion' ranks third, subjecting viewers to distortions of reality that make the division
between reality and fantasy less clear.
Abstract: For many decades, human beings have been suffering
from a plethora of natural disasters. Disasters occur
frequently, and conceptual myths about them change as more and more
advancements are made. Although we live in a technological era,
in developing countries like Pakistan disaster impacts are shaped by
socially constructed roles. The need is to understand the most
vulnerable group of society, i.e. females; their issues are complex in
nature because of their undermined gender status in society. There is a
need to identify as many issues regarding females as possible and to enhance
the achievement of the Millennium Development Goals (MDGs). Gender
issues are of great concern all around the globe, including Pakistan.
Here female visibility in society is low, and during disasters there is a
failure to understand the double burden women carry, comprising
productive and reproductive care. Women
contribute a great deal to society, so we need to make them more disaster
resilient. For this, non-structural measures like awareness, trainings,
and education must be carried out. In rural and urban settings, in
any disaster like an earthquake or flood, elements like gender
perspective, age, physical health, and demographic issues contribute
to vulnerability. In Pakistan, gender issues in disasters were
of little concern before the 2005 earthquake and 2010 floods. Significant
achievements were made after the 2010 floods, when a gender and child cell
was created to provide facilities to women and girls. The aim of
this study is to highlight all necessary facilities in a disaster to build
coping mechanisms in females, from basic rights up to advanced levels
including education.
Abstract: The objective of this research is to investigate the
advantages of using large-diameter 0.7 inch prestressing strands in
pretensioning applications. Large-diameter strands are
mainly beneficial in heavy construction applications. Bridges and
tunnels are subjected to higher daily traffic with an exponential
increase in trucks' ultimate weight, which raises the demand for higher
structural capacity of bridges and tunnels. In this research, precast
prestressed I-girders were considered as a case study. Flexural
capacities of girders fabricated using 0.7 inch strands and different
concrete strengths were calculated and compared to the capacities of
girders with 0.6 inch strands fabricated using equivalent concrete strength.
The effect of bridge deck concrete strength on composite deck-girder
section capacity was investigated due to its possible effect on final
section capacity. Finally, the bridge cross-sections of girders
designed using regular 0.6 inch strands and the large-diameter
0.7 inch strands were compared. The research findings showed that
structural advantages of 0.7 inch strands allow for using fewer bridge
girders, reduced material quantity, and light-weight members. The
structural advantages of 0.7 inch strands are maximized when high-
strength concrete (HSC) is used in girder fabrication and concrete
of minimum 5 ksi compressive strength is used in pouring bridge
decks. The use of 0.7 inch strands in the bridge industry can partially
contribute to the improvement of bridge conditions, minimize
construction cost, and reduce the construction duration of the project.
Abstract: Median filters with larger windows offer greater smoothing and are more robust than median filters with smaller windows. However, the larger median smoothers (median filters with larger windows) fail to track low-order polynomial trends in signals. As a result, constant regions are produced at signal corners, leading to the loss of fine detail. In this paper, an algorithm that combines the ability of the 3-point median smoother to preserve low-order polynomial trends with the superior noise-filtering characteristics of the larger median smoother is introduced. The proposed algorithm (called the combiner algorithm in this paper) is evaluated on a test image corrupted with different types of noise, and the results obtained are included.
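The combiner's exact combination rule is not given in the abstract; a minimal sketch of its two building blocks, the trend-preserving 3-point median and a larger-window median smoother (the window size 7 and the edge handling are illustrative assumptions):

```python
def median_smooth(signal, window):
    """1-D running median with an odd window size; edges are handled
    by clamping the window inside the signal (no padding)."""
    assert window % 2 == 1
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        nbhd = sorted(signal[lo:hi])
        out.append(nbhd[len(nbhd) // 2])
    return out

# Building blocks of a combiner: the trend-preserving 3-point median
# and the more strongly smoothing larger-window median.
small = lambda s: median_smooth(s, 3)
large = lambda s: median_smooth(s, 7)
```

Even the 3-point smoother removes isolated impulses while leaving monotone trends intact, which is the property the combiner exploits.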
Abstract: We propose a reduced-order model for the instantaneous
hydrodynamic force on a cylinder. The model consists of a system of
two ordinary differential equations (ODEs), which can be integrated
in time to yield very accurate histories of the resultant force and
its direction. In contrast to several existing models, the proposed
model considers the actual (total) hydrodynamic force rather than its
perpendicular or parallel projection (the lift and drag), and captures
the complete force rather than the oscillatory part only. We study
and provide descriptions of the relationship between the model
parameters, evaluated utilizing results from numerical simulations,
and the Reynolds number, so that the model can be used at any
value within the considered range of 100 to 500 to provide an
accurate representation of the force without the need to perform
time-consuming simulations or solve the partial differential equations
(PDEs) governing the flow field.
Abstract: We report the results of a lattice Boltzmann
simulation of magnetohydrodynamic damping of sidewall convection
in a rectangular enclosure filled with a porous medium. In particular,
we investigate the suppression of convection when a steady magnetic
field is applied in the vertical direction. The left and right vertical
walls of the cavity are kept at constant but different temperatures
while both the top and bottom horizontal walls are insulated. The
effects of the controlling parameters on the heat transfer and
hydrodynamic characteristics are studied in detail. The heat and mass
transfer mechanisms and the flow characteristics inside the enclosure
depend strongly on the strength of the magnetic field and the Darcy
number. The average Nusselt number decreases with rising values of
the Hartmann number, while it increases with increasing values of
the Darcy number.
Abstract: Nanotechnology is the science of creating, using, and
manipulating objects which have at least one dimension in the range of
0.1 to 100 nanometers. In other words, nanotechnology is
reconstructing a substance using its individual atoms and arranging
them in a way that is desirable for our purpose.
The main reason that nanotechnology has been attracting
attention is the unique properties that objects show when they are
formed at the nano-scale. The differing characteristics that nano-scale
materials show compared to their naturally occurring form are both useful
in creating high-quality products and dangerous when in
contact with the body or spread in the environment.
In order to control and lower the risks of such nano-scale particles,
the following three main topics should be considered:
1) First, these materials may cause long-term diseases whose
effects appear in the body years after the particles penetrate human
organs; since this science has only recently been developed at
industrial scale, not enough information is available about their
hazards to the body.
2) Second, these particles can easily spread out in the
environment and remain in air, soil, or water for a very long time,
besides their high ability to penetrate the skin and cause new
kinds of diseases.
3) Third, to protect the body and the environment against
the danger of these particles, the protective barriers must be finer than
these small objects, and such defenses are hard to accomplish.
This paper reviews, discusses, and assesses the risks that humans and
the environment face as this new science develops at a high rate.
Abstract: Assessment of the IEP (Individual Education Plan) is an
important stage in the area of special education. This paper deals
with this problem by introducing computer software which processes
the data gathered from the application of an IEP. The software is intended
to be used by special education institutions in Turkey and allows
assessment of school and family training. The software has a user-
friendly interface, and its design includes graphical developer tools.
Abstract: The fuzzy fingerprint vault is a recently developed cryptographic construct based on the polynomial reconstruction problem, used to secure critical data with fingerprint data. However, previous studies are not applicable to fingerprints having few minutiae, since they use a fixed degree of the polynomial without considering the number of fingerprint minutiae. To solve this problem, we use an adaptive degree of the polynomial, chosen according to the number of minutiae extracted from each user. Also, we apply multiple polynomials to avoid the possible degradation of the security of a simple solution (i.e., using a low-degree polynomial). Based on the experimental results, our method makes a possible attack 2^192 times more difficult than using a low-degree polynomial, while still verifying users having few minutiae.
Abstract: This paper studies the duration, or survival time, of commercial banks active in the Moscow three-month rouble deposit market during the 1994-1997 period. The privatization process of the Russian commercial banking industry after the 1988 banking reform caused a massive entry of new banks, followed by a period of high exit rates. As a consequence, many firms went bankrupt without refunding their deposits. Therefore, both for the banks and for the banks' depositors, it is of interest to analyze which significant characteristics motivate the exit or closing of a bank. We propose a different methodology based on penalized weighted least squares, which represents a very general, flexible, and innovative approach for this type of analysis. The most relevant results are that smaller banks exit sooner and that banks entering the market in the last part of the study period have shorter durations. As expected, more experienced banks have longer durations in the market. In addition, the mean survival time is lower for banks that offer extreme interest rates.
Abstract: Trust is essential for further and wider acceptance of
contemporary e-services. It was first addressed almost thirty years
ago in the Trusted Computer System Evaluation Criteria standard by
the US DoD, but this and other approaches proposed in that
period were actually addressing security. Roughly ten years ago,
methodologies followed that addressed the trust phenomenon at its core,
and they were based on Bayesian statistics and its derivatives, while
some approaches were based on game theory. However, trust is a
manifestation of judgment and reasoning processes. It has to be dealt
with in accordance with this fact and adequately supported in the cyber
environment. On the basis of results in the field of psychology
and our own findings, a methodology called qualitative algebra has
been developed, which deals with so-far overlooked elements of the trust
phenomenon. It complements existing methodologies and provides a
basis for a practical technical solution that supports the management of
trust in contemporary computing environments. Such a solution is
presented at the end of this paper.
Abstract: Several methods have been proposed for color image
compression, but the reconstructed images had very low signal-to-noise
ratios, which made them inefficient. This paper describes a lossy
compression technique for color images which overcomes these
drawbacks. The technique works in the spatial domain, where the pixel
values of the RGB planes of the input color image are mapped onto two-
dimensional planes. The proposed technique produced better results
than JPEG2000 and 2DPCA, and a comparative study is reported based
on image quality measures such as PSNR and MSE. Experiments
on real images compare this methodology with
previous ones and demonstrate its advantages.
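The PSNR and MSE quality measures mentioned above can be computed as follows; a minimal sketch assuming 8-bit images flattened to lists of pixel values (the peak value of 255 is the usual assumption for 8-bit data):

```python
import math

def mse(original, reconstructed):
    """Mean squared error between two equal-sized images given as
    flat lists of pixel values."""
    n = len(original)
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / n

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means the
    reconstruction is closer to the original."""
    e = mse(original, reconstructed)
    return float("inf") if e == 0 else 10.0 * math.log10(peak * peak / e)
```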
Abstract: Water is the main component of biological processes.
Water management is important to obtain higher productivity. In this
study, some of the yield components were investigated together with
different drought levels. Four chickpea genotypes (CDC Frontier,
CDC Luna, Sawyer and Sierra) were grown in pots with 3 different
irrigation levels (a dose of 17.5 ml, 35 ml and 70 ml for each pot per
day), beginning three weeks after sowing. In the research, flowering, pod
set, pod per plant, fertile pod, double seed/pod, stem diameter, plant
weight, seed per plant, 1000 seed weight, seed diameter, vegetation
length and weekly plant height were measured. Consequently,
significant differences were observed in all the investigated
characteristics due to genotype (except double seed/pod and stem
diameter), water level (except first pod, seed weight, and height in the
3rd week), and genotype × water level interaction (except first pod,
double seed/pod, seed weight, and height).
Abstract: This paper presents the possibilities of using the Weibull statistical distribution to model the distribution of defects in ERP systems. A case study follows, which examines helpdesk records of defects reported as the result of one ERP subsystem upgrade. The applied modeling yields a reliability model of the ERP system from a user perspective, with estimated parameters such as the expected maximum number of defects in one day or the predicted minimum number of defects between two upgrades. The applied measurement-based analysis framework proves suitable for predicting future reliability states of the observed ERP subsystems.
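The two-parameter Weibull model underlying such an analysis can be sketched as follows; a minimal illustration of the density and the survival (reliability) function in the standard shape/scale parameterization (the parameter values below are hypothetical, not the paper's fitted estimates):

```python
import math

def weibull_pdf(t, shape, scale):
    """Two-parameter Weibull probability density f(t; shape, scale)."""
    if t < 0:
        return 0.0
    z = t / scale
    return (shape / scale) * z ** (shape - 1) * math.exp(-z ** shape)

def weibull_reliability(t, shape, scale):
    """Survival (reliability) function R(t) = exp(-(t/scale)**shape):
    the probability that no defect has appeared by time t."""
    return math.exp(-((t / scale) ** shape))
```

With shape = 1 the model reduces to the exponential distribution; shape > 1 gives an increasing defect rate after an upgrade, which is the behavior such helpdesk-record models typically try to capture.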
Abstract: In this paper, a novel method for frequency estimation of multiple one-dimensional real-valued sinusoidal signals in the presence of additive Gaussian noise is postulated. A computationally simple frequency estimation method with efficient statistical performance is attractive in many array signal processing applications. The prime focus of this paper is to combine a subspace-based technique with a simple peak-search approach. The paper presents a variant of the Propagator Method (PM), in which a collaborative approach combining SUMWE and the Propagator Method is applied to estimate multiple real-valued sine-wave frequencies. A new data model is proposed in which the dimension of the signal subspace equals the number of frequencies present in the observation, whereas in the conventional MUSIC method the signal-subspace dimension is twice the number of frequencies when estimating the frequencies of real-valued sinusoidal signals. The statistical analysis of the proposed method is studied, and an explicit expression for the asymptotic (large-sample) mean-squared error (MSE), or variance of the estimation error, is derived. The performance of the method is demonstrated, and the theoretical analysis is substantiated through numerical examples. The proposed method achieves high estimation accuracy and frequency resolution at lower SNR, as verified by simulation comparisons with conventional MUSIC, ESPRIT, and the Propagator Method.
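For context, the simplest peak-search baseline against which subspace methods are usually compared is a DFT magnitude peak search; a minimal sketch (this is not the paper's PM/SUMWE method, and the signal parameters below are illustrative):

```python
import cmath
import math

def dft_peak_frequency(samples, sample_rate):
    """Estimate the dominant frequency of a real-valued signal by a
    plain DFT magnitude peak search over positive-frequency bins."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):           # skip DC, stay below Nyquist
        acc = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        mag = abs(acc)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n      # bin index -> frequency in Hz

# Example: a 50 Hz sine sampled at 1 kHz for 200 samples falls exactly
# on bin 10, so the peak search recovers 50 Hz.
sig = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(200)]
f_hat = dft_peak_frequency(sig, 1000.0)
```

Its resolution is limited to one DFT bin (sample_rate / n), which is precisely the limitation that subspace methods such as MUSIC, ESPRIT, and the Propagator Method are designed to overcome.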