Abstract: Environmental accounting is a recent phenomenon in modern jurisprudence. It can reflect a country's corporate governance mechanisms for the sound management and administration of natural resources and the environment, and it represents a corporate focus on improving environmental quality. Yet it is often ignored for reasons such as lack of awareness and lack of ethical education. At present, the world community is deeply concerned about the state of environmental accounting and auditing systems, as they bear on the sustainability of the earth for future generations. Environmental accounting is an important tool for understanding the role played by the natural environment in the economy. It provides data that highlight both the contribution of natural resources to economic well-being and the costs imposed by pollution or resource degradation. It can also play a critical role within international environmental organizations such as IUCN, WWF, PADELIA and WRI, which have taken many initiatives to promote environmental accounting. Global state actors have already taken green accounting initiatives under the United Nations Division for Sustainable Development, the United Nations Statistical Division, the United Nations Conference on Environment and Development (the Earth Summit in Rio de Janeiro), the Johannesburg Conference 2002, and other forums. This study provides an overview of environmental accounting education, based on primary and secondary sources and a survey of 25 respondents.
Abstract: Groundwater is one of the main sources of water for sustainability in the United Arab Emirates (UAE). Intensive development in the Al-Ain area has increased water demand, which has consequently reduced the overall groundwater quantity in major aquifers. However, in certain residential areas within Al-Ain, such as the Sha-ab Al Askher area, the groundwater level has been observed to be rising. The reasons for this rising-groundwater phenomenon are yet to be investigated. In this work, twenty-four seismic refraction profiles were carried out along the pilot study area, together with field measurements of the groundwater level in a number of available water wells in the area. The processed seismic data indicated that the deepest and shallowest groundwater levels are 15 m and 2.3 m, respectively. This result is highly consistent with the direct field measurements of the groundwater level. The minimum detected value may correspond to perched subsurface water, which may be associated with infiltration from surrounding water bodies such as lakes and elevated farms. The maximum values indicate the true groundwater level within the study area. The findings of this work may serve as preliminary guidance for decision makers.
Abstract: The optimal extraction conditions for dried Phaseolus vulgaris powder were studied. The three independent variables were raw material concentration, shaking time and centrifugation time. The dependent variables were the yield percentage of crude extract and the alpha-amylase enzyme inhibition activity. The experimental design was based on a Box-Behnken design. The highest yield percentage of crude extract was obtained at a concentration of 0.15 M, an extraction time of 2 hours and a separation time of 60 min. Moreover, the crude extract with the highest alpha-amylase enzyme inhibition activity was obtained at a concentration of 0.10 M, an extraction time of 2 min and a separation time of 45 min.
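A Box-Behnken design such as the one used above places runs at the midpoints of the edges of the coded factor cube, plus replicated centre points. A minimal sketch in coded units (-1, 0, +1); the factor count and number of centre points are illustrative assumptions, not taken from the study:

```python
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Generate a Box-Behnken design in coded units (-1, 0, +1)."""
    runs = []
    # Each pair of factors takes all four (+/-1, +/-1) combinations
    # while every other factor is held at its centre level 0.
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                point = [0] * n_factors
                point[i], point[j] = a, b
                runs.append(point)
    # Replicated centre points to estimate pure error
    runs += [[0] * n_factors for _ in range(n_center)]
    return runs

design = box_behnken(3)  # 12 edge-midpoint runs + 3 centre runs = 15 runs
```

For three factors (e.g. concentration, shaking time, centrifugation time) this yields the 15-run design typical of such optimisation studies.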
Abstract: One of the greatest challenges for a hard-surface cleaning product is to get rid of soap scum, a sticky film that builds up in the bathroom. Soap scum deposits can be removed by using a proper surfactant solution with a chelating agent. Unfortunately, the conventional chelating agent, ethylenediamine tetraacetic acid (EDTA), has low biodegradability, so it can persist in water resources and is harmful to aquatic animals and microorganisms. In this study, two biodegradable chelating agents, ethylenediamine disuccinic acid (EDDS) and glutamic acid diacetic acid (GLDA), were introduced as replacements for EDTA. The results show that using GLDA with an amphoteric surfactant gave the highest equilibrium solubility of soap scum.
Abstract: In this paper, the performance of a Puma 560 manipulator is compared for two controllers: an ANFIS controller trained by hybrid gradient descent and least-squares learning, and a radial basis function (RBF) based neuro-fuzzy controller tuned by a hybrid of a Genetic Algorithm and Generalized Pattern Search. ANFIS, which is based on a Takagi-Sugeno type fuzzy controller, needs prior knowledge of the rule base, while the RBF-based neuro-fuzzy controller requires no rule-base knowledge. The hybrid Genetic Algorithm with Generalized Pattern Search is used to tune the weights of the RBF-based neuro-fuzzy controller. All the controllers are tested on butterfly trajectory tracking, and the results are compared in terms of Cartesian and joint space errors. The ANFIS-based controller shows better performance than the RBF-based neuro-fuzzy controller, but the rule-base independence of the RBF-based neuro-fuzzy controller gives it an edge over ANFIS.
Abstract: This research aimed at investigating the Cr(III), Cd(II) and Pb(II) removal efficiencies of newly synthesized metal oxide/polyethersulfone (PES) membranes, Al2O3/PES and ZrO2/PES, applied to synthetic wastewater, and at exploring their fouling mechanisms. A comparative study between the removal efficiencies of Cr(III), Cd(II) and Pb(II) from synthetic and natural wastewater by adsorption onto agricultural by-products and by the newly synthesized Al2O3/PES and ZrO2/PES membranes was conducted to assess the advantages and limitations of using the metal oxide/PES membranes for heavy metal removal. The results showed that removal efficiencies of about 99% and 88% were achieved by the tested membranes for Pb(II) and Cr(III), respectively.
Abstract: Machining is an important manufacturing process used to produce a wide variety of metallic parts. Among the various machining processes, turning is one of the most important, and it is employed to shape cylindrical parts. In turning, the quality of the finished product is measured in terms of surface roughness, which in turn is determined by the machining parameters and the tool geometry specifications. The main objective of this study is to simultaneously model and optimize machining parameters and tool geometry in order to improve the surface roughness of AISI 1045 steel. Several levels of machining parameters and tool geometry specifications are considered as input parameters, and the surface roughness is selected as the process output measure of performance. A Taguchi approach is employed to gather experimental data. Then, based on the signal-to-noise (S/N) ratio, the best sets of cutting parameters and tool geometry specifications are determined. Using these parameter values, the surface roughness of AISI 1045 steel parts may be minimized. Experimental results are provided to illustrate the effectiveness of the proposed approach.
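Since surface roughness is a smaller-the-better response, the Taguchi S/N ratio used in such analyses is S/N = -10·log10((1/n)·Σyᵢ²), and the parameter setting with the highest S/N is preferred. A minimal sketch; the roughness replicate values below are made-up illustrations, not the study's data:

```python
import math

def sn_smaller_the_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio in dB."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical roughness replicates (micrometres) for two parameter settings;
# the setting with the higher S/N produces the smoother surface.
sn_setting_1 = sn_smaller_the_better([1.2, 1.4])
sn_setting_2 = sn_smaller_the_better([2.1, 2.3])
```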
Abstract: Studying alternative raw materials for biodiesel production is of major importance. The use of mixtures incorporating wastes is an environmentally friendly alternative and might reduce biodiesel production costs. The objectives of the present work were (i) to study biodiesel production using waste frying oil mixed with pork lard and (ii) to understand how mixture composition influences biodiesel quality. Biodiesel was produced by transesterification, and quality was evaluated through determination of several parameters according to EN 14214. The weight fraction of lard in the mixture varied from 0 to 1 in 0.2 intervals. Biodiesel production yields varied from 81.7 to 88.0 wt%, the lowest yields being those obtained using waste frying oil and lard alone as raw materials. The obtained products fulfilled most of the determined quality specifications according to the European biodiesel quality standard EN 14214. The minimum purity (96.5 wt%) was nearly attained when waste frying oil was used alone and when a lard weight fraction of 0.2 was incorporated in the raw material (96.3 wt%); overall, purity ranged from 93.9 to 96.3 wt%, always close to the limit. From the evaluation of the influence of mixture composition on biodiesel quality, it was possible to establish a model for predicting some parameters of biodiesel resulting from mixtures of waste frying oil with different lard contents.
Abstract: The aim of this study was to examine the possible effect of variables such as age, gender, blood sugar level and duration of diabetes on the serum level of zinc in diabetic individuals from the Murzuk area. Serum zinc (Zn), fasting blood sugar (FBS) and hemoglobin HbA1c (HbA1c) were evaluated in 46 type I diabetic subjects (group 1), 48 type II diabetic subjects (group 2) and 43 healthy individuals (controls) of both genders, aged 30-81 years. The data showed significant (P<0.05) differences between both diabetic groups and the controls, whereas no significant (P>0.05) differences in serum Zn levels were observed between males and females. Serum Zn levels decreased non-significantly with increasing age. In type II diabetic subjects, serum Zn levels decreased non-significantly with increasing duration of the disease, whereas in type I subjects they increased non-significantly.
Abstract: Home automation is a field concerned with, among other subjects, the comfort, security and energy requirements of private homes. The configuration of automatic functions in such houses is not always simple for their inhabitants, requiring an initial setup and regular adjustments. In this work, the ubiquitous computing vision is adopted: the users' action patterns are captured, recorded and used to create the context-awareness that allows the self-configuration of the home automation system. The system tries to free the users from setup adjustments as the home adapts to its inhabitants' real habits. This paper describes a completely automated process to determine the light states and act on them, taking into account the users' daily habits. An Artificial Neural Network (ANN) is used as the pattern recognition method, classifying the light state at each moment. The work presented uses data from a real house where a family is actually living.
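As a toy illustration of learning light states from habit data, even a single perceptron can learn a rule such as "light on when it is evening and the room is occupied". The features, data and learning rule below are illustrative assumptions, not the ANN or the dataset used in the paper:

```python
# Hypothetical habit snapshots: (evening?, occupied?) -> light on (1) / off (0)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]
b = 0.0
for _ in range(20):                              # perceptron learning rule
    for (x1, x2), y in data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = y - pred                           # -1, 0 or +1
        w[0] += err * x1
        w[1] += err * x2
        b += err

def light_state(evening, occupied):
    """Classify the light state from the learned weights."""
    return 1 if w[0] * evening + w[1] * occupied + b > 0 else 0
```

A real deployment would use richer features (time of day, room, sensor history) and a multi-layer network, but the pattern-classification idea is the same.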
Abstract: Aim of this study is to evaluate a new three-equation turbulence model applied to flow and heat transfer through a pipe. Uncertainty is approximated by comparing with published direct numerical simulation results for fully-developed flow. Error in the mean axial velocity, temperature, friction, and heat transfer is found to be negligible.
Abstract: Given a large sparse signal, the goal is to reconstruct the signal precisely and accurately from as few measurements as possible. Although this seems possible in theory, the difficulty lies in building an algorithm that reconstructs with both accuracy and efficiency. This paper proposes a new, provably correct method to reconstruct sparse signals: a Least Support Orthogonal Matching Pursuit (LS-OMP) method is merged with the theory of Partially Known Support (PKS) to give a new method called Partially Known Least Support Orthogonal Matching Pursuit (PKLS-OMP).
The new methods rely on a greedy algorithm to compute the support, whose cost depends on the number of iterations. To make it faster, PKLS-OMP incorporates the idea of partially known support into the algorithm. The methods recover the original signal with efficiency, simplicity and accuracy whenever the sampling matrix satisfies the Restricted Isometry Property (RIP).
Simulation results also show that the proposed method outperforms many algorithms, especially for compressible signals.
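The greedy baseline that LS-OMP and PKLS-OMP build on is plain Orthogonal Matching Pursuit: repeatedly pick the column most correlated with the residual, then re-fit the coefficients by least squares over the chosen support. A minimal sketch, not the authors' algorithm; the matrix sizes and sparsity level are illustrative:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.astype(float), []
    for _ in range(k):
        # Column of A most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        support.append(j)
        # Re-fit coefficients over the selected support by least squares
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)) / np.sqrt(30)   # random sensing matrix
x_true = np.zeros(60)
x_true[[5, 17, 42]] = [1.0, -2.0, 0.5]            # 3-sparse signal
x_hat = omp(A, A @ x_true, 3)
```

A PKS variant would initialise `support` with the indices already known to be in the signal's support, so fewer greedy iterations are needed.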
Abstract: The Beta-spline is built on G2 continuity, which guarantees the smoothness of curves and surfaces generated with it. This curve is usually preferred for object design rather than reconstruction. This study, however, employs the Beta-spline to reconstruct a three-dimensional G2 image of the Stanford Rabbit. The original data consist of multi-slice binary images of the rabbit. The result is then compared with related works that use other techniques.
Abstract: This paper addresses the problem of source separation in images. We propose a FastICA algorithm employing a modified Gaussian contrast function for Blind Source Separation. Experimental results show that the proposed Modified Gaussian FastICA can be used effectively for Blind Source Separation to obtain better quality images. A comparative study has been made with other popular existing algorithms. The peak signal-to-noise ratio (PSNR) and the improved signal-to-noise ratio (ISNR) are used as metrics for evaluating image quality, and the ICA metric Amari error is used to measure the quality of separation.
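After whitening, the one-unit FastICA fixed-point iteration with the Gaussian contrast g(u) = u·exp(-u²/2) updates w ← E[Z g(wᵀZ)] − E[g′(wᵀZ)]·w, then decorrelates and normalises. A minimal sketch on toy 1-D mixtures; this is generic FastICA with the standard Gaussian contrast (not the authors' modified contrast), and the test signals and mixing matrix are illustrative:

```python
import numpy as np

def fastica(X, n_iter=200):
    """One-unit FastICA with the Gaussian contrast g(u) = u * exp(-u^2 / 2)."""
    X = X - X.mean(axis=1, keepdims=True)
    # Whitening: uncorrelated, unit-variance components
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(d ** -0.5) @ E.T @ X
    n = Z.shape[0]
    W = np.zeros((n, n))
    rng = np.random.default_rng(1)
    for i in range(n):                       # deflation: one component at a time
        w = rng.standard_normal(n)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            u = w @ Z
            g = u * np.exp(-u ** 2 / 2)            # contrast derivative
            dg = (1 - u ** 2) * np.exp(-u ** 2 / 2)
            w = (Z * g).mean(axis=1) - dg.mean() * w
            w -= W[:i].T @ (W[:i] @ w)             # decorrelate from found rows
            w /= np.linalg.norm(w)
        W[i] = w
    return W @ Z  # estimated sources (up to order and sign)

t = np.linspace(0, 8, 2000)
S = np.vstack([np.sin(7 * t), np.sign(np.sin(11 * t))])  # toy sources
X = np.array([[1.0, 0.6], [0.5, 1.0]]) @ S               # observed mixtures
S_hat = fastica(X)
```

For image separation, each vectorised image is one row of `X` and the same iteration applies unchanged.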
Abstract: Knowledge is a key asset for any organisation seeking to sustain a competitive advantage, but it is difficult to identify and represent the knowledge needed to perform the activities in business processes. Effective knowledge management and support for the relevant business activities have a major impact on the performance of the organisation as a whole, because knowledge serves to direct, coordinate and control actions within business processes. This study introduces organisational morphology, a norm-based approach applying semiotic theories, which emphasises the representation of knowledge in norms. The approach is concerned with classifying activities into three categories: substantive, communication and control activities. All activities are directed by norms; hence three types of norms exist, each associated with a category of activities. The paper describes the approach briefly and illustrates its application through a case study of academic activities in higher education institutions. The results of the study show that the approach provides an effective way to profile business knowledge, and that the profile enables the understanding and specification of the business requirements of an organisation.
Abstract: Due to limited energy resources, energy-efficient operation of sensor nodes is a key issue in wireless sensor networks. Clustering is an effective method to prolong the lifetime of an energy-constrained wireless sensor network. However, clustering in wireless sensor networks faces several challenges, such as selecting an optimal group of sensor nodes as a cluster, optimal cluster head selection, an energy-balanced strategy for rotating the role of cluster head within a cluster, maintaining intra- and inter-cluster connectivity, and optimal data routing in the network. In this paper, we propose a protocol supporting energy-efficient clustering, cluster head selection/rotation and data routing to prolong the lifetime of the sensor network. Simulation results demonstrate that the proposed protocol prolongs network lifetime through its efficient clustering, cluster head selection/rotation and data routing.
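One simple energy-balanced rotation strategy of the kind discussed above is to elect the node with the highest residual energy as cluster head each round; because the head pays a higher per-round cost, the role rotates as energies drain. A minimal sketch (the energy values and costs are illustrative assumptions, not the proposed protocol itself):

```python
import random

def elect_cluster_head(energy):
    """Pick the node with the highest residual energy as cluster head."""
    return max(energy, key=energy.get)

def run_round(energy, tx_cost=0.05, ch_cost=0.2):
    """One round: members transmit to the head; the head aggregates/forwards."""
    head = elect_cluster_head(energy)
    for node in energy:
        energy[node] -= ch_cost if node == head else tx_cost
    return head

random.seed(42)
# Hypothetical cluster: node id -> residual energy in joules
energy = {i: random.uniform(0.5, 2.0) for i in range(10)}
heads = [run_round(energy) for _ in range(5)]
```

Because `ch_cost > tx_cost`, the current head's energy falls faster than the members', so the maximum-energy election eventually hands the role to another node.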
Abstract: In the oil and gas industry, energy prediction can help the distributor and the customer forecast the outgoing and incoming gas through a pipeline. It also helps to eliminate uncertainties in gas metering for billing purposes. The objective of this paper is to develop a neural network model for energy consumption and to analyse the model's performance. The paper provides a comprehensive review of published research on energy consumption prediction, focusing on the structures and parameters used in developing neural network models. It then focuses on parameter selection for the neural network prediction model of energy consumption and on the analysis of the results. The most reliable model, giving the most accurate results, is proposed for the prediction. The results show that the proposed neural network energy prediction model demonstrates adequate performance with the least root mean square error.
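The root mean square error used above as the model selection criterion is simply RMSE = sqrt((1/n)·Σ(yᵢ − ŷᵢ)²). A minimal sketch; the readings below are illustrative values, not the paper's data:

```python
import math

def rmse(actual, predicted):
    """Root mean square error between measured and predicted consumption."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

# Hypothetical daily gas-energy readings vs. model predictions
error = rmse([10.0, 12.5, 11.2], [10.4, 12.1, 11.5])
```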
Abstract: This paper proposes an algorithm which automatically aligns and stitches component medical images (fluoroscopic) with varying degrees of overlap into a single composite image. The alignment method is based on a similarity measure between the component images. As applied here, the technique is intensity based rather than feature based; it works well in domains where feature-based methods have difficulty, yet it is more robust than traditional correlation. Component images are stitched together using a new triangular-averaging-based blending algorithm. The quality of the resultant image is tested for photometric inconsistencies and geometric misalignments. The method cannot correct rotational, scale and perspective artifacts.
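Triangular averaging over an overlap region means weighting each image by a linear ramp, so one image's contribution falls from 1 to 0 across the overlap while the other's rises. A minimal sketch for a purely horizontal overlap; the image sizes are illustrative, and the paper's alignment step is not reproduced here:

```python
import numpy as np

def blend_overlap(left, right):
    """Triangular (linear-ramp) averaging across a horizontal overlap."""
    w = np.linspace(0.0, 1.0, left.shape[1])  # 0 at left edge, 1 at right
    return (1.0 - w) * left + w * right

def stitch(img_a, img_b, overlap):
    """Stitch two images whose last/first `overlap` columns coincide."""
    blended = blend_overlap(img_a[:, -overlap:].astype(float),
                            img_b[:, :overlap].astype(float))
    return np.hstack([img_a[:, :-overlap].astype(float), blended,
                      img_b[:, overlap:].astype(float)])

a = np.full((4, 6), 100.0)       # toy "image" of constant intensity 100
b = np.full((4, 6), 200.0)       # toy "image" of constant intensity 200
pano = stitch(a, b, overlap=3)   # intensity ramps smoothly across the seam
```

The ramp makes the seam invisible for photometrically consistent inputs, which is exactly what the quality test in the abstract checks for.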
Abstract: The concept of order reduction by least-squares moment matching and generalised least-squares methods has been extended about a general point 'a' to obtain reduced order models of linear, time-invariant dynamic systems. Some heuristic criteria have been employed for selecting the linear shift point 'a', based upon the means (arithmetic, harmonic and geometric) of the real parts of the poles of the high order system. It is shown that the resulting model depends critically on the choice of the linear shift point 'a'. The validity of the criteria is illustrated by solving a numerical example, and the results are compared with those of other existing techniques.
Abstract: In this paper, we propose improved versions of the DV-Hop algorithm, the QDV-Hop algorithm and the UDV-Hop algorithm, for better localization without the need for additional range measurement hardware. The proposed algorithms focus on the third step of DV-Hop: first, the error terms in the estimated distances between an unknown node and the anchor nodes are separated, and then they are minimized. In the QDV-Hop algorithm, quadratic programming is used to minimize the error and obtain better localization. However, quadratic programming requires a special optimization toolbox and increases the computational complexity. The UDV-Hop algorithm, on the other hand, achieves localization accuracy similar to that of QDV-Hop by solving an unconstrained optimization problem, which reduces to solving a system of linear equations without much increase in computational complexity. Simulation results show that the performance of our proposed schemes (QDV-Hop and UDV-Hop) is superior to DV-Hop and DV-Hop based algorithms in all the scenarios considered.
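The classic DV-Hop pipeline that these algorithms refine works as follows: each anchor computes an average hop size from the distances and hop counts to the other anchors, unknown nodes multiply that hop size by their own hop counts to get estimated ranges, and the third step solves the linearised range equations. A minimal sketch using plain least squares in the third step (rather than the proposed QDV-Hop/UDV-Hop error minimisation); all positions and hop counts are illustrative:

```python
import numpy as np

# Hypothetical anchors (known positions) and hop counts in a toy network
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
hops_between_anchors = np.array([[0, 2, 2], [2, 0, 3], [2, 3, 0]])
hops_to_unknown = np.array([1, 2, 2])   # hops from each anchor to the node

# Steps 1-2 of DV-Hop: each anchor's average hop size, flooded to the node
d_ij = np.linalg.norm(anchors[:, None] - anchors[None], axis=2)
hop_size = d_ij.sum(axis=1) / hops_between_anchors.sum(axis=1)
dist = hop_size * hops_to_unknown       # estimated anchor-to-node ranges

# Step 3: subtract the first range equation from the others to linearise
# |p - a_i|^2 = d_i^2, giving 2(a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
A = 2 * (anchors[1:] - anchors[0])
b = (dist[0] ** 2 - dist[1:] ** 2
     + (anchors[1:] ** 2).sum(axis=1) - (anchors[0] ** 2).sum())
pos, *_ = np.linalg.lstsq(A, b, rcond=None)   # estimated node position
```

QDV-Hop and UDV-Hop replace this last least-squares step with an explicit separation and minimisation of the error terms in `dist`.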