Abstract: This paper describes the results of an extensive study
and comparison of popular hash functions SHA-1, SHA-256,
RIPEMD-160 and RIPEMD-320 with JERIM-320, a 320-bit hash
function. The compression functions of hash functions like SHA-1
and SHA-256 are designed using serial successive iteration whereas
those like RIPEMD-160 and RIPEMD-320 are designed using two
parallel lines of message processing. JERIM-320 uses four parallel
lines of message processing, resulting in a higher level of security
than other hash functions at comparable speed and memory requirements.
The performance evaluation of these methods has been done by using
practical implementation and also by using step computation
methods. JERIM-320 proves to be secure and ensures message integrity
to a higher degree. The focus of this work is to establish JERIM-320
as an alternative to present-day hash functions for fast-growing
internet applications.
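As a quick illustration of the digest sizes involved, the SHA family is available in Python's standard hashlib module (JERIM-320 and the RIPEMD variants are not part of the standard library, so they are omitted in this sketch):

```python
import hashlib

# Digest size (in bits) bounds the generic birthday-attack collision
# resistance at roughly 2^(n/2) hash evaluations.
message = b"The quick brown fox jumps over the lazy dog"
for name in ("sha1", "sha256"):
    h = hashlib.new(name, message)
    print(name, h.digest_size * 8, h.hexdigest())
```

A 320-bit digest, as in JERIM-320 or RIPEMD-320, raises that generic bound to about 2^160 evaluations.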
Abstract: Delivering streaming video over wireless is an
important component of many interactive multimedia applications
running on personal wireless handset devices. Such personal devices
have to be inexpensive, compact, and lightweight. But wireless
channels have a high channel bit error rate and limited bandwidth.
Delay variation of packets due to network congestion and the high bit
error rate greatly degrade the quality of video at the handheld
device. Therefore, mobile access to multimedia contents requires
video transcoding functionality at the edge of the mobile network for
interworking with heterogeneous networks and services. Thus, to
guarantee the quality of service (QoS) delivered to the mobile user, a
robust and efficient transcoding scheme should be deployed in the
mobile multimedia transport network. Hence, this paper examines the
challenges and limitations that video transcoding schemes in the
mobile multimedia transport network face. Then, a mobile and wireless
video transcoding scheme based on handheld resources, network
conditions and content is proposed to provide high-QoS applications.
Exceptional performance is demonstrated in the experimental results,
which were designed to verify the robustness of the proposed approach.
Extensive experiments have been conducted, and results for various
video clips with different bit rates and frame rates are provided.
Abstract: A self-compacting concrete (SCC) is one that can be placed
in the formwork and flow through obstructions under its own weight,
without the need for vibration. Since its first development
in Japan in 1988, SCC has gained wider acceptance in Japan, Europe
and the USA due to its inherent distinct advantages. Although there are
visible signs of its gradual acceptance in North Africa through its
limited use in construction, Libya has yet to explore the feasibility
and applicability of SCC in new construction. The contributing
factors to this reluctance appear to be lack of any supportive
evidence of its suitability with local aggregates and the harsh
environmental conditions. The primary aim of this study is to explore
the feasibility of using SCC made with local aggregates of Eastern
Province of Libya by examining its basic properties. This research
consists of: (i) development of a suitable SCC mix, examining the
effects of the water-to-cement ratio, limestone and silica fume, so as
to satisfy the requirements of the plastic state; (ii) casting of
concrete samples and testing them for compressive strength and unit
weight. Local aggregates, cement, admixtures and industrial waste
materials were used in this research.
The significance of this research lies in its attempt to provide
some performance data of SCC made in the Eastern Province of
Libya so as to draw attention to the possible use of SCC.
Abstract: The use of buffer thresholds, blocking and adequate
service strategies are well-known techniques for computer networks
traffic congestion control. This motivates the study of series queues
with blocking, feedback (service under Head of Line (HoL) priority
discipline) and finite capacity buffers with thresholds. In this paper,
the external traffic is modelled using the Poisson process and the
service times have been modelled using the exponential distribution.
We consider a three-station network with two finite buffers, for
which a set of thresholds (tm1 and tm2) is defined. This computer
network behaves as follows. A task, which finishes its service at
station B, gets sent back to station A for re-processing with
probability o. When the number of tasks in the second buffer exceeds
a threshold tm2 and the number of tasks in the first buffer is less than
tm1, the fed-back task is served under the HoL priority discipline.
Otherwise, for fed-back tasks, a "no two priority services in
succession" procedure (preventing a possible overflow in the first
buffer) is applied. Using an open Markovian queuing schema with
blocking, priority feedback service and thresholds, a closed form
cost-effective analytical solution is obtained. The model of servers
linked in series is very accurate. It is derived directly from a
two-dimensional state graph and a set of steady-state equations,
followed by calculations of the main measures of effectiveness.
Consequently, efficient expressions of low computational cost are
determined. Based on numerical experiments and the collected results,
we conclude that the proposed model with blocking, feedback and
thresholds can provide accurate performance estimates of networks
linked in series.
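The steady-state analysis underlying such models can be illustrated with a toy discrete-time chain; the transition matrix below is a hypothetical 3-state example for illustration only, not the paper's three-station queueing model:

```python
# Minimal sketch: steady-state probabilities of a small discrete-time
# Markov chain obtained by power iteration on a hypothetical
# 3-state transition matrix.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
]

pi = [1.0, 0.0, 0.0]  # initial distribution
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

print([round(p, 4) for p in pi])  # steady-state distribution, sums to 1
```

In practice, a closed-form solution such as the paper's replaces this iteration with explicit expressions derived from the steady-state equations.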
Abstract: After a strong earthquake occurs, secondary disasters due to strong aftershocks, floods, landslides or heavy snow can occur, and secondary disasters due to residents' actions can also happen. However, until now few researchers have paid attention to this. This paper focuses on inhabitants' actions after a strong earthquake occurs, once the terrible event has calmed down. Inappropriate behavior by people facing adverse weather after a severe earthquake can bring tragedy to their lives.
Abstract: Multimedia security is an incredibly significant area
of concern. A number of papers on robust digital watermarking have
been presented, but there are no standards that have been defined so
far. Thus multimedia security remains an open problem. The aim of
this paper is to design a robust image-watermarking scheme which
can withstand a diverse set of attacks. The proposed scheme
provides a robust solution integrating image moment normalization,
content dependent watermark and discrete wavelet transformation.
Moment normalization is useful to recover the watermark even in
case of geometrical attacks. Content dependent watermarks are a
powerful means of authentication as the data is watermarked with its
own features. Discrete wavelet transforms have been used as they
describe image features in a better manner. The proposed scheme
finds its place in validating identification cards and financial
instruments.
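A minimal sketch of the wavelet-domain embedding idea, using a 1-D single-level Haar transform on placeholder data; the paper's scheme works on 2-D images and additionally applies moment normalization and content-dependent watermarks, which are not reproduced here:

```python
# 1-D single-level orthonormal Haar transform: averages (approximation)
# and differences (detail); a watermark bit is embedded by shifting a
# detail coefficient and recovered by re-transforming.

def haar_fwd(x):
    s = 2 ** 0.5
    a = [(x[2*i] + x[2*i+1]) / s for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / s for i in range(len(x) // 2)]
    return a, d

def haar_inv(a, d):
    s = 2 ** 0.5
    x = []
    for ai, di in zip(a, d):
        x += [(ai + di) / s, (ai - di) / s]
    return x

signal = [100.0, 102.0, 98.0, 97.0, 110.0, 108.0, 95.0, 99.0]
a, d = haar_fwd(signal)
d[0] += 5.0              # embed the watermark as a detail-coefficient shift
marked = haar_inv(a, d)

_, d2 = haar_fwd(marked)
print(round(d2[0] - haar_fwd(signal)[1][0], 2))  # recovered shift, 5.0
```

Real schemes spread the watermark over many coefficients and scale the shift to stay imperceptible.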
Abstract: In the present study, a numerical analysis is carried
out to investigate unsteady MHD (magneto-hydrodynamic) flow and
heat transfer of a non-Newtonian second grade viscoelastic fluid
over an oscillatory stretching sheet. The flow is induced by an
infinite elastic sheet which is stretched in an oscillatory manner
(back and forth) in
its own plane. The effects of viscous dissipation and Joule heating are
taken into account. The non-linear differential equations governing
the problem are transformed into a system of non-dimensional
differential equations using similarity transformations. A newly
developed meshfree numerical technique, the element-free Galerkin
method (EFGM), is employed to solve the coupled non-linear
differential equations. The results illustrating the effect of various
parameters, such as the viscoelastic parameter, Hartmann number, relative
frequency amplitude of the oscillatory sheet to the stretching rate and
Eckert number on velocity and temperature field are reported in
terms of graphs and tables. The present model finds application in
polymer extrusion, the drawing of plastic films and wires, and glass,
fiber and paper production.
Abstract: Our objective in this paper is to propose an approach
capable of clustering web messages. The clustering is carried out by
assigning, with a certain probability, texts written by the same web
user to the same cluster based on Stylometric features and using
fuzzy clustering algorithms. The focus of the present work is on
comparing the most popular algorithms in fuzzy clustering theory,
namely Fuzzy C-Means, Possibilistic C-Means and Fuzzy-Possibilistic
C-Means.
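The first of these algorithms can be sketched in one dimension on toy data; the paper applies such algorithms to stylometric features of web texts, which are not reproduced here:

```python
# Minimal 1-D Fuzzy C-Means sketch (fuzzifier m = 2) with deterministic
# initialization at the data extremes; toy data, two clusters.
def fcm(data, c=2, m=2.0, iters=50):
    srt = sorted(data)
    centers = [srt[i * (len(srt) - 1) // (c - 1)] for i in range(c)]
    u = []
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = [[1.0 / sum(((abs(x - centers[i]) + 1e-12) /
                         (abs(x - centers[j]) + 1e-12)) ** (2 / (m - 1))
                        for j in range(c)) for i in range(c)] for x in data]
        # center update: mean of the data weighted by u^m
        centers = [sum(u[k][i] ** m * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(c)]
    return centers, u

data = [1.0, 1.2, 0.8, 5.0, 5.3, 4.9]
centers, u = fcm(data)
print(sorted(round(v, 2) for v in centers))  # one center near each cluster
```

The possibilistic variants relax the constraint that memberships sum to one across clusters, which is what this fuzzy version enforces by construction.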
Abstract: Collaborative problem solving in e-learning can take the
form of discussion among learners, creating a highly social learning
environment characterized by participation and interactivity. This
paper designs a collaborative learning environment in which an agent
acts as a co-learner and can play different roles during interaction.
Since different roles are assigned to the agent, the learner will
assume that multiple co-learners exist to help and guide him
throughout the collaborative problem solving process, while in fact he
is alone during the learning process. Specifically, it addresses the
questions of which agent roles should be incorporated to contribute to
better learning outcomes, how the agent will facilitate the
communication process to provide social learning and interactivity,
and which specific instructional strategies facilitate learner
participation, increase skill acquisition and develop critical thinking.
Abstract: Meshing is the process of discretizing the problem domain
into many subdomains before numerical calculation can be performed.
One of the most popular mesh types is the tetrahedral mesh, due to its
flexibility to fit almost any domain shape. In both 2D and 3D domains,
triangular and tetrahedral meshes can be generated using Delaunay
triangulation. The quality of the mesh is an important factor in
performing any Computational Fluid Dynamics (CFD) simulation, as the
results are highly affected by the mesh quality. Many efforts have
been made to improve the quality of the mesh. The paper describes a
mesh generation routine that has been developed, capable of generating
high quality tetrahedral cells in arbitrarily complex geometry. A few
test cases in CFD problems are used for testing the mesh generator.
The resulting mesh is compared with one generated by a commercial
software package. The results show that no slivers exist in the
generated meshes, and the overall quality is acceptable, since the
percentage of bad tetrahedra is relatively small. Boundary recovery
was also carried out successfully, with all missing faces rebuilt.
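Tetrahedral quality and sliver detection can be quantified with a normalized volume measure; a common choice (assumed here for illustration, as the paper's generator may use a different metric) is Q = 6√2·V / l_rms³, which equals 1 for a regular tetrahedron and approaches 0 for a sliver:

```python
import math

# Quality of a tetrahedron from its four vertices:
# Q = 6*sqrt(2)*V / l_rms^3, where V is the volume and l_rms the
# root-mean-square edge length; 1 = regular, ~0 = sliver.
def tet_quality(p0, p1, p2, p3):
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]]
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    vol = abs(dot(u, cross(v, w))) / 6.0
    edges = [sub(a, b) for a, b in
             [(p0, p1), (p0, p2), (p0, p3), (p1, p2), (p1, p3), (p2, p3)]]
    l_rms = math.sqrt(sum(dot(e, e) for e in edges) / 6.0)
    return 6 * math.sqrt(2) * vol / l_rms ** 3

reg = [(1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)]  # regular tet
print(round(tet_quality(*reg), 3))  # 1.0
```

A mesh generator would flag cells whose Q falls below some threshold as bad tetrahedra for local remeshing.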
Abstract: The use of un-activated bentonite, and un-activated
bentonite blended with limestone for the treatment of acid mine
drainage (AMD) was investigated. Batch experiments were
conducted in a 5 L PVC reactor. Un-activated bentonite on its own
did not effectively neutralize and remove heavy metals from AMD.
The final pH obtained was below 4 and the metal removal efficiency
was below 50% for all the metals when bentonite solid loadings of 1,
5 and 10% were used. With un-activated bentonite (1%) blended with
1% limestone, the final pH obtained was approximately 7 and metal
removal efficiencies were greater than 60% for most of the metals.
The Langmuir isotherm gave the best fit for the experimental data,
giving a correlation coefficient (R²) very close to 1. Thus, it was
concluded that un-activated bentonite blended with limestone is
suitable for potential applications in removing heavy metals and
neutralizing AMD.
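A Langmuir fit of this kind is often done through the linearized form C/q = 1/(q_max·K) + C/q_max; the sketch below uses synthetic data generated from assumed parameters, not the study's sorption measurements:

```python
# Fit the Langmuir isotherm q = (q_max*K*C) / (1 + K*C) by linear
# least squares on y = C/q versus x = C, whose slope is 1/q_max and
# intercept 1/(q_max*K). Data are synthetic with assumed parameters.
q_max, K = 20.0, 0.5                            # assumed "true" values
C = [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]             # equilibrium concentrations
q = [q_max * K * c / (1 + K * c) for c in C]    # sorbed amounts

x, y = C, [c / qi for c, qi in zip(C, q)]
n = len(x)
b = (n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)) / \
    (n * sum(xi * xi for xi in x) - sum(x) ** 2)   # slope
a = (sum(y) - b * sum(x)) / n                      # intercept

q_max_fit = 1 / b
K_fit = b / a
print(round(q_max_fit, 2), round(K_fit, 2))  # recovers 20.0 and 0.5
```

With noisy experimental data the recovered parameters would differ slightly, and R² of the linear fit measures the quality of the Langmuir description, as in the study.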
Abstract: In line with changes in consumers' modern lifestyles, advertising strategy has been called on to change. This research aims to find out how a game with telepresence and product experience embedded in a computer game affects users' intention to purchase. Game content developers are urged to consider placing product messages as part of a game design strategy that can influence the game player's intention to purchase. An experiment was carried out on two hundred and fifty undergraduate students who volunteered to participate in Internet game playing activities. Factor analysis and correlation analysis were performed on items designed to measure telepresence, attitudes toward telepresence, and game players' intention to purchase the product advertised in the game that respondents experienced. The results indicated that telepresence consists of interactive experience and product experience. The study also found that product experience is positively related to the game players' intention to purchase. The significance of product experience implies the usefulness of interactive advertising in game playing to attract players' intention to purchase the advertised product placed in the creative game design.
Abstract: This study focuses on teamwork in Finnish working
life. Through a wide cross-section of teams the study examines the
causes to which team members attribute the outcomes of their teams.
Qualitative data was collected from 314 respondents. They wrote 616
stories to describe memorable experiences of success and failure in
teamwork. The stories revealed 1930 explanations. The findings
indicate that both favorable and unfavorable team outcomes are
perceived as being caused by the characteristics of team members,
relationships between members, team communication, team
structure, team goals, team leadership, and external forces. These
explanation types represent different attribution levels in the
context of organizational teamwork.
Abstract: This paper deals with a periodic-review substitutable
inventory system for a finite and an infinite number of periods. Here
an upward substitution structure, a substitution of a more costly item
by a less costly one, is assumed, with two products. At the beginning
of each period, a stochastic demand comes for the first item only,
which is quality-wise better and hence costlier. Whenever an arriving
demand finds zero inventory of this product, a fraction of unsatisfied
customers goes for its substitutable second item. An optimal ordering
policy has been derived for each period. The results are illustrated
with numerical examples. A sensitivity analysis has been done to
examine how sensitive the optimal solution and the maximum profit
are to the values of the discount factor, when there is a large number
of periods.
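The substitution mechanism can be sketched with a single-period Monte Carlo profit estimate; all prices, costs and the demand law below are invented placeholders, and the paper derives the optimal ordering policy analytically rather than by such simulation:

```python
import random

# Single-period sketch: demand arrives for item 1 only; a fraction
# beta of customers facing an item-1 stockout buys item 2.
random.seed(42)
p1, c1 = 10.0, 6.0      # price and unit cost, item 1 (costlier, better)
p2, c2 = 7.0, 4.0       # price and unit cost, item 2 (substitute)
beta = 0.5              # substituting fraction of unsatisfied customers

def expected_profit(q1, q2, trials=20000):
    total = 0.0
    for _ in range(trials):
        d = random.randint(0, 20)            # stochastic item-1 demand
        s1 = min(d, q1)                      # item-1 sales
        spill = beta * max(d - q1, 0)        # substituting customers
        s2 = min(spill, q2)                  # item-2 sales
        total += p1 * s1 + p2 * s2 - c1 * q1 - c2 * q2
    return total / trials

# crude grid search over order quantities (q1, q2)
best = max(((q1, q2) for q1 in range(0, 21, 5) for q2 in range(0, 11, 5)),
           key=lambda q: expected_profit(*q))
print(best, round(expected_profit(*best), 2))
```

With these placeholder numbers the substitute's margin does not cover stocking it for spillover demand alone, so the search keeps q2 at zero; different prices or a larger substituting fraction would change that.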
Abstract: In this paper an analytical crack propagation scenario
is proposed which assumes that a crack propagates in the tooth root in
both the crack depth direction and the tooth width direction, and
which is more reasonable and realistic for non-uniform load
distribution cases than the other presented scenarios. An analytical
approach is used for quantifying the loss of time-varying gear mesh
stiffness with the presence of crack propagation in the gear tooth root.
The proposed crack propagation scenario can be applied for crack
propagation modelling and monitoring simulation, but further
research is required for comparison and evaluation of all the
presented crack propagation scenarios from the condition monitoring
point of view.
Abstract: Creating 3D environments, including characters and cities,
is a significantly time consuming process due to the large amount of
work involved in designing and modelling. There have been a number of
attempts to automatically generate 3D objects employing shape
grammars. However, it is still too early to apply the mechanism to
real problems such as real-time computer games. The purpose of this
research is to introduce a time efficient and cost effective method to
automatically generate various 3D objects for real-time 3D games.
This shape grammar-based real-time City Generation (RCG) model is a
conceptual model for generating 3D environments in real-time and can
be applied to 3D games or animations. The RCG system can generate
even a large city by applying fundamental principles of shape grammars
to building elements in various levels of detail in real-time.
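The shape-grammar rewriting at the core of such systems can be caricatured with symbolic production rules; the rule set below is invented for illustration and carries no geometry, whereas RCG's actual rules produce 3D building elements:

```python
# Toy grammar-rewriting step: production rules repeatedly expand a
# "CITY" axiom into finer-grained elements, mimicking how a shape
# grammar derives detail level by level.
rules = {
    "CITY": ["BLOCK", "STREET", "BLOCK"],
    "BLOCK": ["BUILDING", "BUILDING"],
    "BUILDING": ["FACADE", "ROOF"],
}

def derive(symbols, depth):
    for _ in range(depth):
        # rewrite every symbol that has a rule; keep terminals as-is
        symbols = [s for sym in symbols for s in rules.get(sym, [sym])]
    return symbols

print(derive(["CITY"], 3))
```

Stopping the derivation at a shallower depth yields a coarser description, which is how level-of-detail control falls naturally out of grammar depth.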
Abstract: Stem cells have the ability to differentiate, through
mitotic cell division, into a wide range of specialized cell types.
Cellular differentiation is the process by which a less specialized
cell develops into a more specialized one. This paper studies the
fundamental problem of a computational schema for an artificial neural
network based on chemical, physical and biological state variables.
Through this type of study, the system could model the viable
propagation of differentiation of various economically important stem
cells. This paper models various differentiation outcomes of an
artificial neural network into a variety of potential specialized
cells, implemented in MATLAB version 2009. A feed-forward
back-propagation network was created with an input vector of five
elements, a single hidden layer, and one output unit in the output
layer. The efficiency of the neural network was evaluated by comparing
the results achieved in this study with the experimental input data
and chosen target data. The proposed solution's efficiency was
assessed by comparative analysis of the mean square error at zero
epochs. Different data variables were used to test the targeted
results.
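The 5-input, one-hidden-layer, single-output architecture described above can be sketched in Python rather than MATLAB; the training data and targets below are random placeholders, not the study's experimental inputs:

```python
import math, random

# Feed-forward network, 5 inputs -> 4 hidden sigmoid units -> 1 sigmoid
# output, trained by plain per-sample back-propagation (lr = 0.5).
random.seed(1)
n_in, n_hid = 5, 4
W1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hid)]
W2 = [random.uniform(-1, 1) for _ in range(n_hid)]
sig = lambda z: 1 / (1 + math.exp(-z))

X = [[random.random() for _ in range(n_in)] for _ in range(8)]
T = [sum(x) / n_in for x in X]          # placeholder target function

def forward(x):
    h = [sig(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    return h, sig(sum(w * hi for w, hi in zip(W2, h)))

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(X, T)) / len(X)

err0 = mse()
for _ in range(500):
    for x, t in zip(X, T):
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)       # output-layer delta
        for j in range(n_hid):
            dh = dy * W2[j] * h[j] * (1 - h[j])   # hidden-layer delta
            W2[j] -= 0.5 * dy * h[j]
            for i in range(n_in):
                W1[j][i] -= 0.5 * dh * x[i]

print(round(err0, 4), round(mse(), 4))   # mean square error decreases
```

Bias terms, momentum and a proper train/test split, as a MATLAB toolbox would provide, are omitted for brevity.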
Abstract: The dispersion of a line of heavy particles in an isotropic
and incompressible three-dimensional turbulent flow has been studied
using Kinematic Simulation techniques to find the evolution of the
line's fractal dimension. In this study, the fractal dimension of the
line is found for different cases of heavy-particle inertia (different
Stokes numbers) in the absence of particle gravity, with a comparison
to the fractal dimension obtained in the diffusion case of a material
line at the same Reynolds number. It can be concluded for the
dispersion of a line of heavy particles in turbulent flow that the
particle inertia affects the fractal dimension of a line released in a
turbulent flow for Stokes numbers 0.02 < St < 2. At small times, most
of the cases are not affected by the inertia until a certain time, the
particle response time τa, which grows as the particle inertia
increases; at larger times, the fractal dimension of the line
increases owing to the particles becoming more sensitive to the small
scales, which change the line shape during its journey.
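The fractal dimension of a particle line is commonly estimated by box counting; the sketch below applies it to a plain straight line of points (whose dimension is 1) rather than to a Kinematic Simulation trajectory:

```python
import math

# Box-counting dimension estimate: count occupied grid cells N(eps) at
# two box sizes and take the slope of log N against log(1/eps).
points = [(t, 2 * t) for t in [i / 1000 for i in range(1001)]]  # straight line

def box_count(pts, eps):
    return len({(math.floor(x / eps), math.floor(y / eps)) for x, y in pts})

eps1, eps2 = 0.1, 0.01
n1, n2 = box_count(points, eps1), box_count(points, eps2)
D = math.log(n2 / n1) / math.log(eps1 / eps2)
print(round(D, 2))   # close to 1 for a smooth line
```

A line distorted by turbulence occupies more boxes at small scales, pushing the estimated dimension above 1, which is the effect the study tracks as particle inertia varies.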
Abstract: In the current economy of increasing global
competition, many organizations are attempting to use knowledge as
one of the means to gain sustainable competitive advantage. Besides
large organizations, the success of SMEs can be linked to how well
they manage their knowledge. Despite the profusion of research on
knowledge management within large organizations, fewer studies have
tried to analyze KM in SMEs.
This research proposes a new framework showing the determinant
role of organizational dimensions in KM approaches. The paper
and its propositions are based on a literature review and analysis.
In this research, personalization versus codification,
individualization versus institutionalization, and IT-based versus
non-IT-based are highlighted as three distinct dimensions of knowledge
management approaches.
The study contributes to research by providing a more nuanced
classification of KM approaches and provides guidance to managers
about the types of KM approaches that should be adopted based on
the size, geographical dispersion and task nature of SMEs.
To the author's knowledge, this paper is the first of its kind to
examine whether there are suitable configurations of KM approaches for
SMEs with different dimensions. It gives valuable information, which
will hopefully help the SME sector to accomplish KM.
Abstract: This article proposes a new methodology to be used by SMEs (small and medium enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology that is intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined based on the well-known Juran trilogy for quality management (quality planning, quality control and quality improvement), and predetermined result categories, defined based on the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from applying the methodology in two case studies.