Abstract: MC (Management Control) and IC (Internal Control): what is the relationship? This empirical study examines the definitions of MC and IC. Under the wider view of Internal Control and Management Control, attention focuses not only on the financial aspects of the business but also on its soft aspects, such as culture, behaviour, standards and values, whereas narrower views of Management Control concentrate mainly on the hard, financial aspects of business operation. The terms Management Control and Internal Control are often used interchangeably, and the results of this empirical study reveal that Management Control is part of Internal Control; there is no causal link between the two concepts. Based on the respondents' interpretations, the term Management Control has narrowed from a broad term to one confined to the soft aspects of influencing behaviour, performance measurement, incentives and culture. This exploratory study is based on qualitative research and on a qualitative matrix-method analysis of the thematic definitions of the terms Management Control and Internal Control.
Abstract: A reliability, availability and maintainability (RAM) model has been built for an acid gas removal plant to support the system analysis that will play an important role in any process modifications required to achieve optimum performance. Because of the complexity of the plant, the model was based on a Reliability Block Diagram (RBD) with a Monte Carlo simulation engine. The model has been validated against actual plant data as well as local expert opinion, resulting in an acceptable simulation model. The results from the model showed that operation and maintenance can be further improved, reducing the annual production loss.
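As a minimal illustration of the RBD-plus-Monte-Carlo idea (a sketch, not the authors' plant model), consider two hypothetical blocks in series with assumed exponential failure and repair times; steady-state availability is estimated by sampling alternating up/down cycles, and series availabilities multiply under independence:

```python
import random

def block_availability(mtbf, mttr, n_cycles, rng):
    # Sample alternating up periods (exponential with mean MTBF)
    # and down periods (exponential with mean MTTR).
    up = sum(rng.expovariate(1 / mtbf) for _ in range(n_cycles))
    down = sum(rng.expovariate(1 / mttr) for _ in range(n_cycles))
    return up / (up + down)

def series_availability(blocks, n_cycles=20000, seed=1):
    # Blocks in series: the system is up only when every block is up,
    # so the estimated availabilities multiply (independence assumed).
    rng = random.Random(seed)
    a = 1.0
    for mtbf, mttr in blocks:
        a *= block_availability(mtbf, mttr, n_cycles, rng)
    return a

# Hypothetical absorber and regenerator blocks (MTBF/MTTR in hours)
a = series_availability([(2000, 24), (1500, 36)])
```

The Monte Carlo estimate converges to the analytic value MTBF/(MTBF+MTTR) per block; a real RBD engine would add parallel/redundant blocks and repair logistics.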
Abstract: This paper explains the need for coke quality prediction, its role in blast furnaces, and the output of the resulting model. A prediction model for CSR has been developed using an Artificial Neural Network (ANN) trained with the back-propagation (BP) algorithm. Important blast furnace functions such as permeability, heat exchange, melting, and reducing capacity are largely governed by coke quality, which in turn depends on coal characteristics and coke-making process parameters. The developed ANN model is a useful tool for process experts to adjust the control parameters in case of coke quality deviations. The model also makes it possible to predict CSR for new coal blends that are yet to be used in the coke plant. Input data to the model were structured into three modules covering the past two years, and the incremental models thus developed assist in identifying the group causing a deviation in CSR.
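The training loop of a back-propagation network of the kind described can be sketched as follows. This is a toy stand-in, not the paper's model: the three inputs (imagined here as scaled coal ash, volatile matter, and coking time) and the target are synthetic, and the architecture is an arbitrary small one-hidden-layer net.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins: 3 scaled process inputs, scaled CSR-like target
X = rng.random((200, 3))
y = (0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.2 * X[:, 2]).reshape(-1, 1)

W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)   # hidden layer (8 units)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)   # linear output

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.2
for _ in range(2000):
    h, out = forward(X)
    err = out - y                      # dLoss/dout for 0.5 * MSE
    # Back-propagate the error through both layers
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)   # tanh derivative
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((forward(X)[1] - y) ** 2).mean())
```

On this synthetic target the training MSE drops well below the variance of the target, which is all the sketch is meant to show.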
Abstract: Bluetooth is a personal wireless communication technology being applied in many scenarios, and an emerging standard for short-range, low-cost, low-power wireless access. Existing MAC (Medium Access Control) scheduling schemes provide only best-effort service for all master-slave connections, and providing QoS (Quality of Service) support for different connections is very challenging because of the master-driven TDD (Time Division Duplex) scheme. No available solution supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses how to enhance QoS support in a Bluetooth piconet. The Bluetooth specification proposes a Round Robin scheduler as a possible solution for scheduling transmissions in a piconet. We propose an algorithm that reduces bandwidth waste and enhances network efficiency. Token counters are defined to estimate the traffic of real-time slaves, and, to increase bandwidth utilization, a back-off mechanism is introduced for best-effort slaves to decrease the frequency of polling idle slaves. Simulation results demonstrate that our scheme achieves better performance than Round Robin scheduling.
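The back-off idea for best-effort slaves can be illustrated with a toy slot-pair simulation (the token counters for real-time slaves are omitted here, and the per-slave data probabilities are invented for the example, so this is only a sketch of the mechanism, not the paper's algorithm):

```python
import random

def schedule(slaves, slots, seed=0):
    """Poll slaves over `slots` TDD slot pairs.
    slaves: name -> (kind, p_data); kind is "rt" (always polled, as in
    round robin) or "be" (best-effort, backed off when idle); p_data is
    the chance the slave has a packet when polled."""
    rng = random.Random(seed)
    backoff = {s: 1 for s in slaves}   # current polling interval per slave
    wait = {s: 0 for s in slaves}
    wasted = 0
    for _ in range(slots):
        for s, (kind, p) in slaves.items():
            if kind == "be":
                wait[s] += 1
                if wait[s] < backoff[s]:
                    continue           # slave is backed off: skip the poll
                wait[s] = 0
            if rng.random() < p:       # slave had data: useful poll
                backoff[s] = 1
            else:                      # idle poll wastes the slot pair
                wasted += 1
                if kind == "be":
                    backoff[s] = min(backoff[s] * 2, 16)  # exponential back-off
    return wasted

rr = schedule({"s1": ("rt", 0.9), "s2": ("rt", 0.1)}, 1000)  # plain round robin
bo = schedule({"s1": ("rt", 0.9), "s2": ("be", 0.1)}, 1000)  # with back-off
```

Backing off the mostly idle slave sharply reduces the number of wasted polls, which is the bandwidth saving the abstract refers to.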
Abstract: Validation of an automation system is an important issue. The goal is to check whether the system under investigation, modeled by a Petri net, can ever enter an undesired state. Usually, tools dedicated to Petri nets, such as DESIGN/CPN, are used for reachability analysis. The biggest problem with this approach is that generating the full occurrence graph of the system is impossible because it is too large. In this paper, we show how computational methods such as temporal logic model checking and Groebner bases can be used to verify the correctness of the design of an automation system. We report experimental results for two automation systems: an Automated Guided Vehicle (AGV) system and a traffic light system. Validation of these two systems took 10 to 30 seconds on a PC, depending on the optimization parameters.
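The reachability question addressed by the paper can be illustrated, at toy scale, by explicit breadth-first exploration of a Petri net's marking graph. Explicit exploration is feasible only for small nets (exactly the limitation that motivates symbolic methods such as model checking); the mutual-exclusion net below is an invented example, not one of the paper's case studies.

```python
from collections import deque

def reachable(m0, transitions):
    """Breadth-first exploration of the marking graph of a Petri net.
    transitions: list of (consume, produce) tuples of per-place token counts."""
    seen = {m0}
    frontier = deque([m0])
    while frontier:
        m = frontier.popleft()
        for consume, produce in transitions:
            if all(m[i] >= consume[i] for i in range(len(m))):   # enabled?
                m2 = tuple(m[i] - consume[i] + produce[i] for i in range(len(m)))
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen

# Toy mutual-exclusion net: places = (idle1, crit1, idle2, crit2, lock)
t = [((1, 0, 0, 0, 1), (0, 1, 0, 0, 0)),  # process 1 enters critical section
     ((0, 1, 0, 0, 0), (1, 0, 0, 0, 1)),  # process 1 leaves
     ((0, 0, 1, 0, 1), (0, 0, 0, 1, 0)),  # process 2 enters
     ((0, 0, 0, 1, 0), (0, 0, 1, 0, 1))]  # process 2 leaves
states = reachable((1, 0, 1, 0, 1), t)
# Undesired state: both processes in their critical section at once
assert all(not (m[1] and m[3]) for m in states)
```

For this net only three markings are reachable, so the undesired state check is immediate; the paper's methods exist precisely because realistic nets have far too many markings for this enumeration.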
Abstract: In research on natural ventilation and passive cooling with forced convection, it is essential to know how heat flows in a solid object, the pattern of temperature distribution on its surfaces, and, eventually, how air flows through and convects heat from the surfaces of steel under a roof. This paper presents results from a computational fluid dynamics (CFD) program comparing natural ventilation and forced convection within a roof attic receiving direct solar radiation. The CFD model of the air flow inside the attic was adapted to two cases: in the first, analyzed under natural ventilation, the attic is a closed area; in the second, analyzed under forced convection, it is an open area. Both cases yield predictions of quantities such as the temperature, pressure, and mass flow rate distributions within the attic. The comparison shows that the CFD program is an effective model for predicting the air temperature field and the distribution of the heat transfer coefficient within the attic. The results show that forced convection helps to reduce heat transfer through the attic, and that the inner zone around the steel core has a lower temperature than under natural ventilation. The temperature difference at the steel core of the attic between the two cases was 10 to 15 K.
Abstract: Signal compression algorithms are making impressive progress. These algorithms are continuously improved with new tools and aim to reduce, on average, the number of bits necessary to represent the signal while minimizing the reconstruction error. This article proposes the compression of Arabic speech signals by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on one hand, on decomposing the original signal with analysis filters, followed by a compression stage, and, on the other hand, on applying order-5 linear prediction to the compressed signal coefficients. The aim of this approach is to estimate the prediction error, which is then coded and transmitted; a decoding operation reconstitutes the original signal. An adequate choice of the filter bank used in the transform is necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
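The linear-prediction half of such a scheme can be sketched as follows: order-5 coefficients are fitted by least squares so that each sample is predicted from the previous five, and only the low-energy residual (the prediction error) would need to be coded. The signal here is a synthetic sinusoid plus noise, not Arabic speech, and real codecs would use autocorrelation/Levinson-Durbin per frame rather than one global least-squares fit.

```python
import numpy as np

def lpc_residual(x, order=5):
    """Fit order-`order` linear prediction coefficients by least squares
    and return (coeffs, residual)."""
    n = len(x)
    # Rows: windows of the previous `order` samples; target: the next sample
    A = np.array([x[i:i + order][::-1] for i in range(n - order)])
    b = x[order:]
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    residual = b - A @ a
    return a, residual

t = np.arange(2000)
x = np.sin(0.07 * t) + 0.01 * np.random.default_rng(0).normal(size=2000)
a, r = lpc_residual(x)
ratio = float((r ** 2).sum() / (x[5:] ** 2).sum())   # residual-to-signal energy
```

For this highly predictable signal the residual carries a tiny fraction of the original energy, which is what makes coding the error instead of the signal attractive.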
Abstract: In the policy discourse of the 1990s, more inclusive spaces were constructed for realizing full and meaningful participation of common people in education. These participatory spaces provide an alternative possibility for universalizing elementary education against the backdrop of a history of entrenched forms of social and economic exclusion, inequitable education provision, and the shrinking role of the state in today's neo-liberal times. Drawing on case studies of bottom-up approaches to school governance, the study examines an array of innovative ways through which poor people gained a sense of identity and agency by evolving indigenous solutions to issues regarding the schooling of their children. In the process, the state's institutions and practices became more accountable and responsive to the educational concerns of marginalized people. Deliberative participation emerged as an active way of experiencing deeper forms of empowerment and democracy than their passive realization as mere bearers of citizen rights.
Abstract: Fluid mechanics principles are used extensively in designing axial flow fans and their associated equipment. This paper presents computational fluid dynamics (CFD) modeling of the air flow distribution from a radiator axial flow fan used in an acid pump truck Tier 4 (APT T4) Repower. This axial flow fan augments the transfer of heat from the engine mounted on the APT T4. CFD analysis was performed for the area-weighted average static pressure difference between the inlet and outlet of the fan. Pressure contours, velocity vectors, and path lines were plotted to detail the flow characteristics for different orientations of the fan blade. The results were then compared and verified against known theoretical observations and actual experimental data. This study shows that a CFD simulation can be very useful for predicting and understanding the flow distribution from a radiator fan in further research work.
Abstract: In this paper, a cloud resource broker using goal-based requests in a medical application is proposed. To handle the recent huge production of digital images and data in medical informatics applications, the cloud resource broker can be used by medical practitioners to discover and select the correct information and applications. This paper summarizes several reviewed articles relating medical informatics applications to current broker technology, and presents research on applying goal-based requests in a cloud resource broker to optimize the use of resources in a cloud environment. The objective of proposing a new kind of resource broker is to enhance current resource scheduling, discovery, and selection procedures. We believe that it can help to maximize resource allocation in medical informatics applications.
Abstract: This paper presents image compression with a wavelet-based method. The wavelet transform divides an image into low-pass and high-pass filtered parts. The traditional JPEG compression technique requires less computation power with feasible losses when only compression is needed; however, there is an obvious need for wavelet-based methods in certain circumstances. These methods are intended for applications in which image analysis is done in parallel with compression. Furthermore, the high-frequency bands can be used to detect changes or edges, and wavelets enable hierarchical analysis of the low-pass filtered sub-images: a first analysis can be done on a small image, and only if anything interesting is found is the whole image processed or reconstructed.
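The division of an image into low-pass and high-pass parts can be sketched with one level of the 2-D Haar transform (the simplest wavelet; the paper's choice of wavelet is not specified here). Averages over 2x2 blocks give the coarse LL band used for hierarchical analysis, while the three difference bands carry the edge/change information.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar transform over 2x2 blocks."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 4    # low-pass approximation (quarter-size image)
    lh = (a - b + c - d) / 4    # horizontal detail
    hl = (a + b - c - d) / 4    # vertical detail
    hh = (a - b - c + d) / 4    # diagonal detail
    return ll, lh, hl, hh

def inv_haar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d."""
    h, w = ll.shape
    img = np.empty((2 * h, 2 * w))
    img[0::2, 0::2] = ll + lh + hl + hh
    img[0::2, 1::2] = ll - lh + hl - hh
    img[1::2, 0::2] = ll + lh - hl - hh
    img[1::2, 1::2] = ll - lh - hl + hh
    return img

img = np.arange(64.0).reshape(8, 8)           # toy 8x8 "image"
ll, lh, hl, hh = haar2d(img)
```

Recursing on `ll` yields the hierarchy the abstract describes: analysis starts on the small approximation image, and the detail bands are only decoded when something interesting is found.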
Abstract: The social force model, which belongs to the microscopic pedestrian studies, has been considered the leading approach by many researchers owing to its main feature of reproducing the self-organized phenomena that result from pedestrian dynamics. The preferred force, a measure of a pedestrian's motivation to adapt his actual velocity to his desired velocity, is an essential term on which the model was built. This force has gone through several stages of development: first, Helbing and Molnar (1995) modeled the original force for the normal situation; second, Helbing and his co-workers (2000) incorporated the panic situation into this force through a panic parameter; third, Lakoba and Kaup (2005) gave pedestrians some kind of intelligence by incorporating aspects of decision-making capability. In this paper, the authors analyze the most important incorporations into the model regarding the preferred force and compare the different factors of these incorporations. Furthermore, to enhance the decision-making ability of pedestrians, they introduce additional features, such as a familiarity factor, to the preferred force to make it more representative of what actually happens in reality.
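The original Helbing-Molnar driving term relaxes the actual velocity toward the desired velocity over a relaxation time tau. A minimal sketch with one Euler integration step (the mass, speeds and time constants below are illustrative values, not parameters from the papers cited):

```python
import numpy as np

def preferred_force(mass, v_actual, v_desired, tau):
    """Helbing-Molnar driving term: relax the actual velocity toward
    the desired velocity with relaxation time tau."""
    return mass * (np.asarray(v_desired) - np.asarray(v_actual)) / tau

# One Euler step for an 80 kg pedestrian who wants to walk 1.3 m/s along x
v = np.array([0.0, 0.5])          # current velocity (m/s)
dt, tau, m = 0.1, 0.5, 80.0
v = v + dt * preferred_force(m, v, [1.3, 0.0], tau) / m
```

Each step moves the velocity a fraction dt/tau of the way toward the desired velocity; the panic and familiarity extensions discussed in the paper modify how the desired velocity itself is formed.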
Abstract: In this research, we propose using the discrete cosine transform to approximate the cumulative distributions of data cube cells' values. The cosine transform is known to have a good energy compaction property and can thus approximate data distribution functions easily with a small number of coefficients. The derived estimator is accurate and easy to update. We perform experiments comparing its performance with a well-known technique, the (Haar) wavelet. The experimental results show that the cosine transform performs much better than the wavelet in estimation accuracy, speed, space efficiency, and ease of updating.
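The energy compaction idea can be demonstrated on a toy scale: keep only the leading DCT coefficients of a smooth cumulative distribution and invert. The logistic CDF below is an arbitrary smooth stand-in, not the paper's data, and the paper's actual estimator and update scheme are not reproduced here.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix (rows are the cosine basis vectors)."""
    k = np.arange(n)[:, None]
    j = np.arange(n)[None, :]
    D = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j + 1) * k / (2 * n))
    D[0] /= np.sqrt(2.0)
    return D

n = 64
x = np.linspace(-6, 6, n)
cdf = 1.0 / (1.0 + np.exp(-x))   # smooth stand-in for a cell-value CDF
D = dct_matrix(n)
c = D @ cdf                      # forward transform
c[8:] = 0.0                      # keep only 8 of 64 coefficients
approx = D.T @ c                 # inverse transform (D is orthogonal)
err = float(np.max(np.abs(approx - cdf)))
```

Because the DCT concentrates the energy of smooth functions in the low-frequency coefficients, 8 of 64 coefficients already reconstruct the distribution to within a few percent.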
Abstract: The Internet today has a huge impact on all aspects of life, including the broader context of democracy, politics and politicians. If democracy is freedom of choice, there are a number of conditions that ensure in practice that this freedom can be achieved and realized, and these preconditions must be met regardless of the manner of voting. The key contribution of ICT to freedom of choice is that technology connects citizens and elected representatives better than was possible without the Internet. In this sense, the Internet and ICT are significantly changing, and potentially improving, the environment in which democratic processes take place. This paper describes trends in the use of ICT in democratic processes and analyzes the challenges for the implementation of e-Democracy in Montenegro.
Abstract: Natural resources management, including water resources, requires reliable estimation of time-variant environmental parameters; even small improvements in estimating these parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach to pre-processing practical data sets. The predictability of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions are also investigated through a fractal assessment.
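Wavelet noise reduction of the kind used for pre-processing can be sketched with Haar wavelet shrinkage: decompose, soft-threshold the detail coefficients, and reconstruct. The signal, noise level and threshold below are all invented for illustration; the paper does not specify its wavelet or thresholding rule here.

```python
import numpy as np

def haar_step(x):
    # One level of the orthonormal Haar transform
    return (x[0::2] + x[1::2]) / np.sqrt(2), (x[0::2] - x[1::2]) / np.sqrt(2)

def inv_haar_step(a, d):
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def denoise(x, thresh, levels=3):
    """Haar wavelet shrinkage: decompose, soft-threshold the detail
    coefficients, then reconstruct."""
    details, a = [], x
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0))
    for d in reversed(details):
        a = inv_haar_step(a, d)
    return a

t = np.linspace(0, 1, 256)
clean = np.sin(2 * np.pi * 3 * t)        # smooth stand-in for a flow signal
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=256)
out = denoise(noisy, thresh=0.5)
```

Because the smooth signal concentrates in the approximation band while the noise spreads across the detail bands, thresholding the details lowers the error against the clean signal, which is the predictability gain the abstract refers to.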
Abstract: The paper focuses on enhanced stiffness modeling of robotic manipulators that takes into account the influence of an external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid only for the unloaded mode and small displacements, the proposed approach implicitly assumes that loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example dealing with the stiffness analysis of a parallel manipulator of the Orthoglide family.
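For reference, the conventional unloaded, small-displacement mapping that the paper extends relates joint-space spring stiffness to end-point Cartesian stiffness through the Jacobian as K_C = J^-T K_theta J^-1. A minimal sketch with a hypothetical 2-DOF Jacobian and spring constants (not the Orthoglide model):

```python
import numpy as np

def cartesian_stiffness(J, k_theta):
    """Conventional (unloaded, small-displacement) Cartesian stiffness:
    K_C = J^-T K_theta J^-1, with K_theta the diagonal of virtual-spring
    stiffness values."""
    K = np.diag(k_theta)
    Jinv = np.linalg.inv(J)
    return Jinv.T @ K @ Jinv

# Hypothetical 2-DOF Jacobian and spring constants (N*m/rad)
J = np.array([[0.8, 0.1],
              [0.2, 1.0]])
Kc = cartesian_stiffness(J, [1000.0, 500.0])
f = Kc @ np.array([1e-3, 0.0])    # reaction force for a 1 mm x-displacement
```

The proposed approach departs from this formula by letting the load change the posture, so J itself becomes load-dependent; the sketch only shows the baseline it is compared against.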
Abstract: Being creative in an educational environment, such as the university, has many times been downplayed by bureaucracy, human inadequacy and physical hindrance. These factors control, stifle and subsequently condemn this natural phenomenon, which is normally exuded by the tertiary community. Taken in a positive light, creativity has always led to many new discoveries and inventions, and these creations are then gradually developed for the university's reputation and achievements in all fields of study, from the sciences to the humanities. This paper attempts to explore, through more than twenty years of observation, the issues that stifle the creativity of the university citizenry, academicians and students alike. It also scrutinizes how such creativity can be further supported by simplifying bureaucracy, encouraging and developing human potential, and constructing uncompromising physical infrastructure and administrative support. These ideals, all of which help to promote creativity, increase the productivity of the university community in teaching, research, publication, innovation and commercialization, at the national as well as the international level, for the good of human and societal growth and development. This discursive presentation hopes to address the issue of promoting university community creativity through several deliverables that require cooperation from every quarter of the institution, so that being creative continues to be promoted for sustainable human capital growth and the development of the country, if not the global community.
Abstract: Recently, cassava has been the driving force of many developing countries' economic progress. To attain this level, prerequisites were put in place enabling the cassava sector to become an industrial and highly competitive crop. Cameroon can achieve the same results; moreover, it can upgrade the living conditions of both rural and urban dwellers and stimulate the development of the whole economy. Achieving this outcome calls for agricultural policy reforms, and the adoption and implementation of adequate policies go along with efficient strategies. To choose effective strategies, an in-depth investigation of the sector's problems is highly recommended. This paper uses the gap analysis method to evaluate the cassava sector in Cameroon: it studies the present situation (where the sector is now), interrogates the future (where it should be) and finally proposes solutions to fill the gap.
Abstract: Detection of incipient abnormal events is important for improving the safety and reliability of machine operations and reducing the losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines, and constructing prediction models for faulty conditions is essential for deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and the predictive performance of several prediction methods is evaluated on real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
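GA-based variable selection of the kind described can be sketched as follows: binary masks encode which measured variables enter a calibration model, and validation error serves as the fitness. The data are synthetic (only the first 3 of 10 variables are informative), the model is plain least squares rather than SPPCA, and the GA settings are arbitrary, so this is only a sketch of the selection loop.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "plant" data: 10 measured variables, first 3 informative
X = rng.normal(size=(200, 10))
y = X[:, 0] + 0.5 * X[:, 1] - 0.8 * X[:, 2] + 0.1 * rng.normal(size=200)
Xtr, Xva, ytr, yva = X[:150], X[150:], y[:150], y[150:]

def fitness(mask):
    """Validation MSE of a least-squares model on the selected variables."""
    if not mask.any():
        return np.inf
    w, *_ = np.linalg.lstsq(Xtr[:, mask], ytr, rcond=None)
    return float(((Xva[:, mask] @ w - yva) ** 2).mean())

pop = rng.random((30, 10)) < 0.5          # population of binary masks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    pop = pop[np.argsort(scores)]         # sort fittest first
    children = []
    for _ in range(len(pop) // 2):
        p1 = pop[rng.integers(0, 10)]     # parents drawn from the best 10
        p2 = pop[rng.integers(0, 10)]
        cut = rng.integers(1, 10)
        child = np.concatenate([p1[:cut], p2[cut:]])   # one-point crossover
        children.append(child ^ (rng.random(10) < 0.05))  # bit-flip mutation
    pop[len(pop) // 2:] = children        # elitism: keep the best half

best = pop[np.argmin([fitness(m) for m in pop])]
```

On this toy problem the GA reliably retains the three informative variables, since dropping any of them sharply increases the validation error, while the non-informative ones are free to be excluded.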
Abstract: This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan of scheduling a set of jobs and assigning workers on uniformly related parallel machines. An algorithm based on ACO has been developed and coded in Matlab® to solve this problem. The paper explains the steps in applying the Ant Colony approach to the problem of minimizing makespan for the worker assignment and job scheduling problem in a parallel machine model, and aims to evaluate the strength of ACO compared with other conventional approaches. One publicly available data set containing 100 problems (12 jobs, 3 machines and 10 workers) has been solved with this ACO algorithm. Our ACO-based algorithm has shown drastically improved results, especially in terms of the negligible CPU effort needed to reach the optimal solution: the time taken to solve all 100 problems is less than the average time taken to solve one problem in the data set by conventional approaches such as a GA algorithm and the SPT-A/LMC heuristics.
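The core ACO loop for a makespan objective can be sketched on a simplified version of the problem: identical machines and no worker dimension, with a pheromone trail per (job, machine) pair and a load-based heuristic. The job times and parameters are invented for the example; the paper's algorithm handles uniformly related machines and worker assignment, which this sketch omits.

```python
import random

def aco_schedule(jobs, n_machines, ants=20, iters=50, rho=0.1, seed=0):
    """Assign jobs (processing times) to identical parallel machines to
    minimize makespan, guided by per-(job, machine) pheromone trails."""
    rng = random.Random(seed)
    tau = [[1.0] * n_machines for _ in jobs]
    best, best_ms = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            load = [0.0] * n_machines
            assign = []
            for j, p in enumerate(jobs):
                # Choice probability ~ pheromone x heuristic
                # (the heuristic prefers lightly loaded machines)
                w = [tau[j][m] / (1.0 + load[m]) for m in range(n_machines)]
                m = rng.choices(range(n_machines), weights=w)[0]
                assign.append(m)
                load[m] += p
            ms = max(load)
            if ms < best_ms:
                best, best_ms = assign, ms
        # Evaporate, then reinforce the best-so-far solution
        for j in range(len(jobs)):
            for m in range(n_machines):
                tau[j][m] *= (1 - rho)
            tau[j][best[j]] += 1.0 / best_ms
    return best, best_ms

jobs = [4, 7, 2, 9, 5, 3, 8, 6]      # toy processing times, total 44
assign, ms = aco_schedule(jobs, 3)
```

Since the jobs total 44 time units over 3 machines, no schedule can beat a makespan of 15; the pheromone reinforcement steers the colony toward near-balanced assignments close to that bound.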