Abstract: Oleic acid (C18:1) plays an important role in the
proliferation of fat cells. In this study, the effect of oleate on cell
viability in 3T3-L1 cells (fat cells) was investigated. The 3T3-L1
cells were treated with various concentrations of oleate in the
presence of 23 mM glucose. Oleate was added to the adipogenic media
(day 0) to investigate its influence on the proliferation of
postconfluent preadipocytes after 24 h of induction. Oleate at 0.1 mM
promoted cell division, increasing the number of cells by 33.9% over
the basal control in postconfluent preadipocytes. However, there was no
significant difference in cell viability compared with control cells when
the oleate concentration was increased up to 0.5 mM. When added to
differentiated adipocytes (day 12) for 48 h, the number of cells
decreased as the oleate concentration increased. Of the cells lost, 92.7%
demonstrated apoptosis and necrosis after 48 h with 0.5 mM oleate,
as shown by fluorochrome staining examined under fluorescence
microscopy using acridine orange and ethidium bromide double
staining. Furthermore, the high level of lactate (a 60.6% increase
over the basal control) released into the plasma demonstrated the direct
cytotoxicity of 0.5 mM oleate on adipocytes.
Abstract: This paper presents modern vibration signal processing
techniques for vehicle gearbox fault diagnosis, using
wavelet analysis and the Squared Envelope (SE) technique. The
wavelet analysis is regarded as a powerful tool for the detection of
sudden changes in non-stationary signals. The Squared Envelope
(SE) technique has been extensively used for rolling bearing
diagnostics. In the present work, a scheme using the Squared
Envelope technique is proposed for early detection of gear tooth pitting. The pitting
defect is manufactured on the tooth side of a fifth speed gear on the
intermediate shaft of a vehicle gearbox. The objective is to
supplement the current techniques of gearbox fault diagnosis based
on using the raw vibration and ordered signals. The test stand is
equipped with three dynamometers; the input dynamometer serves as
the internal combustion engine, the output dynamometers introduce
the load on the flanges of output joint shafts. The gearbox used for
experimental measurements is the type most commonly used in
modern small to mid-sized passenger cars with transversely mounted
powertrain and front wheel drive; a five-speed gearbox with final
drive gear and front wheel differential. The results show that the
proposed methods are effective for detecting and diagnosing
localized gear faults at an early stage under different operating
conditions, and are more sensitive and robust than current gear
diagnostic techniques.
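The squared envelope computation that this abstract relies on can be sketched as follows. The test signal, its length, and the carrier and modulation frequencies below are illustrative assumptions, not the paper's measured gearbox data; a minimal sketch using a DFT-based Hilbert transform:

```python
import cmath, math

def squared_envelope(x):
    """Squared envelope via a DFT-based Hilbert transform (O(N^2), fine for small N)."""
    n = len(x)
    # Forward DFT
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    # Analytic-signal filter: keep DC/Nyquist, double positive freqs, zero negative freqs
    h = [0.0] * n
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        for k in range(1, n // 2):
            h[k] = 2.0
    else:
        for k in range(1, (n + 1) // 2):
            h[k] = 2.0
    # Inverse DFT of the filtered spectrum gives the analytic signal
    z = [sum(h[k] * X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
         for t in range(n)]
    return [abs(v) ** 2 for v in z]

# Toy example: an amplitude-modulated carrier. The squared envelope recovers
# the modulation term (1 + 0.5*cos)^2 and discards the carrier oscillation.
n = 64
sig = [(1.0 + 0.5 * math.cos(2 * math.pi * 2 * t / n)) *
       math.cos(2 * math.pi * 16 * t / n) for t in range(n)]
se = squared_envelope(sig)
```

In gear or bearing diagnostics the spectrum of the squared envelope would then be inspected for peaks at the fault characteristic frequency; here the toy AM signal simply shows the envelope oscillating at the modulation frequency rather than the carrier.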
Abstract: The rapid improvement of microprocessors and networks has made it possible for PC clusters to compete with conventional supercomputers. Many high-throughput applications can be served by current desktop PCs, especially those in PC classrooms, leaving the supercomputers for the demands of large-scale high-performance parallel computations. This paper presents our development of an automated deployment mechanism for cluster computing that utilizes the computing power of PCs such as those residing in PC classrooms. After deployment, these PCs can be transformed into a pre-configured cluster computing resource immediately, without touching the existing education/training environment installed on them. Thus, training activities are not affected by this additional activity of harvesting idle computing cycles. The time and manpower required to build and manage a computing platform in geographically distributed PC classrooms can also be reduced by this development.
Abstract: The importance of our country's communication
system becomes noticeable when a disaster occurs. The communication
system in our country includes wired and wireless telephone
networks, radio, satellite systems and, increasingly, the internet. Even
though our communication system is extensive and dependable,
extreme conditions can put a strain on it. Interoperability between
heterogeneous wireless networks can be used to provide efficient
communication for emergency first response. IEEE 802.21 specifies
Media Independent Handover (MIH) services to enhance the mobile
user experience by optimizing handovers between heterogeneous
access networks. This paper presents an algorithm to improve
congestion control in the MIH framework. It is analytically shown that
by including a time factor in network selection we can reduce
congestion in the network.
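The abstract does not give the selection algorithm itself, so the following is only a hypothetical sketch of how a time factor could enter an MIH-style network selection decision; the weights, field names, and the linear scoring form are all assumptions, not the paper's method:

```python
# Hypothetical network-selection score for an MIH-style handover decision.
# All weights and the time-factor formula are illustrative assumptions.

def selection_score(rssi_dbm, load_fraction, est_congestion_clear_s,
                    w_signal=1.0, w_load=50.0, w_time=10.0):
    """Lower score = better candidate. Penalizes weak signal, high load,
    and networks whose congestion is expected to persist longer."""
    signal_penalty = -rssi_dbm             # e.g. -60 dBm -> penalty 60
    load_penalty = w_load * load_fraction  # utilization in 0..1
    time_penalty = w_time * est_congestion_clear_s
    return w_signal * signal_penalty + load_penalty + time_penalty

candidates = {
    "wlan":     selection_score(-55, 0.9, 4.0),  # strong signal, congested
    "cellular": selection_score(-75, 0.3, 0.5),  # weaker signal, clearing fast
}
best = min(candidates, key=candidates.get)
```

A network whose congestion is expected to clear quickly is preferred even over one with a stronger signal, which is the intuition behind adding a time factor to the selection.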
Abstract: Power cables are widely used for power supply in
power distribution networks and power transmission lines. Due to
limitations in production, delivery and installation, power cables
are produced and delivered in several separate lengths. A cable
itself consists of two cable terminations and an arbitrary number of
cable joints, depending on the cable route length. Electrical stress
control is needed to prevent a dielectric breakdown at the end of the
insulation shield in both the air and cable insulation. Reliability of
cable joint depends on its materials, design, installation and operating
environment. The paper describes design and performance results for
newly modeled cable joints. The design concepts, based on numerical
calculations, must be correct. A hybrid Equivalent Electrodes
Method/Boundary Elements Method approach that allows
electromagnetic field calculation in multilayer dielectric media,
including inhomogeneous regions, is presented.
Abstract: The present study was undertaken to screen the
in vitro antimicrobial activities of the D-galactose-binding
sponge lectin (HOL-30). HOL-30 was purified from the marine
demosponge Halichondria okadai by affinity chromatography. The
molecular mass of the lectin was determined to be 30 kDa with a
single polypeptide by SDS-PAGE under non-reducing and reducing
conditions. HOL-30 agglutinated trypsinized and glutaraldehyde-fixed
rabbit and human erythrocytes, with a preference for type O
erythrocytes. The lectin was subjected to evaluation for inhibition of
microbial growth by the disc diffusion method against eleven human
pathogenic gram-positive and gram-negative bacteria. The lectin
exhibited strong antibacterial activity against gram-positive bacteria,
such as Bacillus megaterium and Bacillus subtilis. However, it showed
no effect against gram-negative bacteria such as Salmonella typhi
and Escherichia coli. The largest zones of inhibition were recorded for
Bacillus megaterium (12 mm in diameter) and Bacillus subtilis (10 mm in
diameter) at a lectin concentration of 250 μg/disc. On the other
hand, the antifungal activity of the lectin was investigated against six
phytopathogenic fungi using the food poisoning technique. The lectin
showed maximum inhibition (22.83%) of the mycelial growth of
Botrydiplodia theobromae at a concentration of 100 μg/mL of media.
These findings indicate that the lectin may be of importance in
clinical microbiology and may have therapeutic applications.
Abstract: The purpose of this study was to investigate effects of
modality and redundancy principles on music theory learning among
pupils of different anxiety levels. The lesson of music theory was
developed in three different modes, audio and image (AI), text with
image (TI) and audio with image and text (AIT). The independent
variables were the three modes of courseware. The moderator
variable was the anxiety level, while the dependent variable was the
post-test score. The study sample consisted of 405 third-grade pupils.
Descriptive and inferential statistics were conducted to analyze the
collected data. Analysis of covariance (ANCOVA) and post hoc tests
were carried out to examine the main effects as well as the
interaction effects of the independent variables on the dependent
variable. The findings of this study showed that medium anxiety
pupils performed significantly better than low and high anxiety
pupils in all the three treatment modes. The AI mode was found to
help pupils with high anxiety significantly more than the TI and AIT
modes.
Abstract: Market-based models are frequently used for resource
allocation on the computational grid. However, as the size of
the grid grows, it becomes difficult for the customer to negotiate
directly with all the providers. Middle agents are introduced to
mediate between the providers and customers and facilitate the
resource allocation process. The most frequently deployed middle
agents are the matchmakers and the brokers. The matchmaking agent
finds possible candidate providers who can satisfy the requirements
of the consumers, after which the customer directly negotiates with
the candidates. The broker agents are mediating the negotiation with
the providers in real time.
In this paper we present a new type of middle agent, the marketmaker.
It relies on two parallel processes: through
the investment process the marketmaker acquires resources and
resource reservations in large quantities, while through the resale process
it sells them to the customers. The operation of the marketmaker
is based on the fact that through its global view of the grid it can
perform a more efficient resource allocation than the one possible in
one-to-one negotiations between the customers and providers.
We present the operation and algorithms governing the operation
of the marketmaker agent, contrasting it with the matchmaker and
broker agents. Through a series of simulations in the task-oriented
domain we compare the operation of the three agent types. We find
that the use of the marketmaker agent leads to better performance in the
allocation of large tasks and a significant reduction in messaging
overhead.
Abstract: The main goal of this paper is to establish a
methodology for testing and optimizing GPRS performance over
the Libyan GSM network, as well as to propose a suitable optimization
technique to improve performance. Measurements of
download, upload, throughput, round-trip time, reliability, handover,
security enhancement and packet loss over a GPRS access network
were carried out. Measured values are compared to the theoretical
values that could be calculated beforehand. This data is
processed and delivered by the server across the wireless network to
the client. The client takes those pieces of data on the fly and
processes them immediately. We also illustrate the results by describing the
main parameters that affect the quality of service. Finally, Libya's
two mobile operators, Libyana Mobile Phone and Al-Madar al-
Jadeed Company, are selected as a case study to validate our
methodology.
Abstract: Mercury adsorption on soil was investigated at
different ionic strengths using Ca(NO3)2 as a background electrolyte.
Results fitted the Langmuir equation and the adsorption isotherms
reached a plateau at higher equilibrium concentrations. Increasing
ionic strength decreased the sorption of mercury, due to the
competition of Ca ions for the sorption sites in the soils. The
influence of ionic strength was related to the mechanisms of heavy
metal sorption by the soil. These results can be of practical
importance both in agriculture and in contaminated soils, since the
solubility of mercury in soils is strictly dependent on adsorption
and release processes.
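The Langmuir fit mentioned above can be reproduced with the linearized form of the isotherm, C/q = C/q_max + 1/(K·q_max). The data below are synthetic, generated from assumed parameters purely to illustrate the fitting step:

```python
# Langmuir isotherm: q = q_max*K*C / (1 + K*C). Fitted here from equilibrium
# data via its linearized form C/q = C/q_max + 1/(K*q_max) using ordinary
# least squares. The data are synthetic, for illustration only.

def fit_langmuir(C, q):
    """Least-squares fit of C/q versus C; returns (q_max, K)."""
    x, y = C, [c / qi for c, qi in zip(C, q)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    q_max = 1.0 / slope          # slope of the linearized form is 1/q_max
    K = slope / intercept        # intercept is 1/(K*q_max)
    return q_max, K

# Synthetic data generated from q_max = 10, K = 2 (exact, so the fit recovers them)
true_qmax, true_K = 10.0, 2.0
C = [0.1, 0.5, 1.0, 2.0, 5.0]
q = [true_qmax * true_K * c / (1 + true_K * c) for c in C]
q_max, K = fit_langmuir(C, q)
```

With real soil data, the fitted q_max gives the sorption maximum and K the affinity constant; a decrease in either with increasing ionic strength would reflect the Ca-ion competition described above.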
Abstract: Video sensor networks operate on stringent requirements
of latency. Packets have a deadline within which they have
to be delivered. Violation of the deadline causes a packet to be
treated as lost and the loss of packets ultimately affects the quality
of the application. Network latency is typically a function of many
interacting components. In this paper, we propose ways of reducing
the forwarding latency of a packet at intermediate nodes. The
forwarding latency is caused by a combination of processing delay
and queueing delay. The former is incurred in order to determine the
next hop in dynamic routing. We show that unless link failures occur in a
very specific and unlikely pattern, the vast majority of these lookups
are redundant. To counter this, we propose source routing as the
routing strategy. However, source routing suffers from issues related
to scalability and an inability to adapt to network dynamics. We propose
solutions to counter these and show that source routing is
a viable option in practically sized video networks. We also propose a
fast and fair packet scheduling algorithm that reduces queueing delay
at the nodes. We support our claims through extensive simulation on
realistic topologies with practical traffic loads and failure patterns.
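The contrast between per-hop route lookup and source routing can be sketched as follows; the topology, node names, and the BFS route computation are illustrative assumptions, not the paper's scheme:

```python
# Minimal sketch of source routing: the full path is computed once at the
# source and carried in the packet header, so intermediate nodes never
# perform a route lookup. Topology and node names are illustrative.

from collections import deque

def shortest_path(adj, src, dst):
    """BFS shortest path; under source routing this runs once, at the source."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return path[::-1]

def forward(packet):
    """Each intermediate node pops the next hop from the header: no lookup."""
    hops = []
    while packet["route"]:
        hops.append(packet["route"].pop(0))
    return hops

adj = {"S": ["A", "B"], "A": ["S", "C"], "B": ["S", "C"],
       "C": ["A", "B", "D"], "D": ["C"]}
pkt = {"payload": b"frame", "route": shortest_path(adj, "S", "D")[1:]}
visited = forward(pkt)
```

The route computation runs once at the source; intermediate nodes only pop the next hop from the header, which is the processing-delay saving argued for above.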
Abstract: In this paper, a fragile watermarking scheme is proposed for the authentication of specified objects in color images. The color image is first transformed from the RGB to the YST color space, which is suitable for watermarking color media. The T channel corresponds to the chrominance component of the color image and is therefore selected for embedding the watermark. The T channel is first divided into 2×2 non-overlapping blocks and the two LSBs are set to zero. The object to be authenticated is also divided into 2×2 non-overlapping blocks, and each block's intensity mean is computed, followed by eight-bit encoding. The generated watermark is then embedded into the LSBs of randomly selected 2×2 blocks of the T channel using a 2D Torus Automorphism. The selection of block size is paramount for exact localization and recovery of the work. The proposed scheme is blind, efficient and secure, with the ability to detect and locate even minor tampering applied to the image, with full recovery of the original work. The quality of the watermarked media is quite high, both subjectively and objectively. The technique is suitable for the class of images in formats such as GIF, TIFF or bitmap.
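The block-mean embedding step described in the abstract can be sketched for a single 2×2 block; note that four pixels with two LSBs each hold exactly the eight bits of one block mean. The torus-automorphism block permutation and the YST transform are omitted, so this is only an illustration of the capacity argument, not the full scheme:

```python
# One 2x2 block: the 8-bit mean of the object block fits exactly into the
# two LSBs of the four carrier pixels (4 pixels x 2 bits = 8 bits).
# The paper's torus-automorphism block selection is not reproduced here.

def embed_block(carrier4, object4):
    """carrier4/object4: the four 8-bit pixel values of one 2x2 block."""
    mean = sum(object4) // 4                    # 8-bit block mean (watermark byte)
    bits = [(mean >> i) & 1 for i in range(8)]  # LSB-first bit string
    out = []
    for p, idx in zip(carrier4, range(0, 8, 2)):
        p &= 0b11111100                         # zero the two LSBs
        p |= bits[idx] | (bits[idx + 1] << 1)   # write two watermark bits
        out.append(p)
    return out, mean

def extract_block(marked4):
    """Recover the embedded 8-bit mean from the two LSBs of each pixel."""
    mean = 0
    for j, p in enumerate(marked4):
        mean |= (p & 0b11) << (2 * j)
    return mean

carrier = [200, 201, 198, 199]
obj = [120, 130, 125, 121]   # block mean = 124
marked, embedded = embed_block(carrier, obj)
recovered = extract_block(marked)
```

Tampering with any pixel of the block changes the extracted mean, which is what makes the watermark fragile and localizable at block granularity.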
Abstract: In this paper, we propose a novel spatiotemporal fuzzy-based
algorithm for noise filtering of image sequences. Our proposed algorithm
uses adaptive weights based on triangular membership functions.
In this algorithm, a median filter is used to suppress noise.
Experimental results show that when images are corrupted by high-density
salt-and-pepper noise, our fuzzy-based algorithm is much more effective
in suppressing noise and preserving edges than previously reported
algorithms such as [1-7]. Indeed, the weights assigned to noisy pixels
are highly adaptive, so that they make good use of the correlation
between pixels. On the other hand, motion estimation methods are
error-prone, and in high-density noise they may degrade filter
performance. Therefore, our proposed fuzzy algorithm does not need any
estimation of the motion trajectory. The proposed algorithm admissibly
removes noise without any knowledge of the salt-and-pepper noise density.
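The impulse-suppression core of such filters is the median operation. The sketch below shows only a plain 3×3 spatial median on a toy frame; the paper's fuzzy triangular weighting and temporal dimension are deliberately omitted, since the abstract does not specify them:

```python
# Plain 3x3 spatial median filter on a 2-D frame. This is only the median
# step that suppresses salt-and-pepper impulses, not the paper's full
# spatiotemporal fuzzy-weighted algorithm.

def median_filter3(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]     # border pixels are left unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]     # median of the 9 window values
    return out

frame = [
    [10, 10, 10, 10],
    [10, 255, 10, 10],   # 255 = "salt" impulse
    [10, 10, 0, 10],     # 0 = "pepper" impulse
    [10, 10, 10, 10],
]
clean = median_filter3(frame)
```

The median replaces the salt (255) and pepper (0) impulses with the local majority value, whereas a mean filter would smear them into the neighborhood.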
Abstract: Wireless LAN (WLAN) access in public hotspot areas
has become popular in recent years. Since more and more multimedia
information is available in the Internet, there is an increasing demand
for accessing multimedia information through WLAN hotspots.
Currently, the bandwidth offered by an IEEE 802.11 WLAN cannot
support many simultaneous real-time video accesses. A possible way to
increase the offered bandwidth in a hotspot is the use of multiple access
points (APs). However, a mobile station is usually connected to the
WLAN AP with the strongest received signal strength indicator (RSSI).
The total consumed bandwidth cannot be fairly allocated among those
APs. In this paper, we will propose an effective load-balancing scheme
via the support of IAPP and SNMP in the APs. The proposed scheme is
an open solution and does not require any changes to either the wireless
stations or the APs. This makes load balancing possible in WLAN hotspots,
where a variety of heterogeneous mobile devices are employed.
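A minimal sketch of the load-balancing idea, assuming a least-load association policy; the AP names, station counts, RSSI values, and the threshold are illustrative, and the actual IAPP/SNMP signaling of the proposed scheme is not modeled:

```python
# Illustrative least-load association: instead of always joining the AP with
# the strongest RSSI, a station joins the least-loaded AP whose RSSI is above
# a usable threshold. All names and numbers are assumptions for the example.

RSSI_FLOOR = -80  # dBm; links weaker than this are excluded

def pick_ap(rssi_by_ap, stations_by_ap):
    usable = [ap for ap, rssi in rssi_by_ap.items() if rssi >= RSSI_FLOOR]
    # Sort by current load; tie-break on stronger RSSI
    return min(usable, key=lambda ap: (stations_by_ap[ap], -rssi_by_ap[ap]))

rssi = {"ap1": -50, "ap2": -65, "ap3": -85}  # ap3 is too weak to use
load = {"ap1": 12, "ap2": 3, "ap3": 0}       # associated stations per AP
chosen = pick_ap(rssi, load)
```

The default strongest-RSSI policy would send the station to ap1 despite its 12 existing stations; the load-aware policy picks the lightly loaded ap2 instead, spreading the consumed bandwidth across APs.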
Abstract: This interdisciplinary study is an investigation to evaluate user interfaces in business administration. The study is implemented on two computerized business administration systems with two distinctive user interfaces, so that differences between the two systems can be determined. Both systems, a commercial one and a prototype developed for the purpose of this study, deal with ordering supplies, tendering procedures, issuing purchase orders, controlling the movement of stock against the actual balances on the shelves, and editing them in their tabulations. In the second, suggested system, modern computer graphics and multimedia issues were taken into consideration to cover the drawbacks of the first system. To highlight differences between the two investigated systems with regard to some chosen standard quality criteria, the study employs various statistical techniques and methods to evaluate the users' interaction with both systems. The study variables are divided into two groups: independent variables, representing the interfaces of the two systems, and dependent variables, embracing efficiency, effectiveness, satisfaction, error rate, etc.
Abstract: In the current age, retrieval of relevant information
from massive amounts of data is a challenging job. Over the years,
precise and relevant retrieval of information has attained high
significance. There is a growing need in the market to build systems,
which can retrieve multimedia information that precisely meets the
user's current needs. In this paper, we introduce a framework
for refining query results before showing them to the user, using ambient
intelligence, user profile, group profile, user location, time, day, user
device type and extracted features. A prototype tool was also
developed to demonstrate the efficiency of the proposed approach.
Abstract: Composting is the process in which municipal solid
waste (MSW) and other organic waste materials, such as biosolids
and manures, are decomposed through the action of bacteria and other
microorganisms into a stable granular material which can be applied to
land as a soil conditioner. Microorganisms, especially those able
to degrade polymeric organic material, play a key role in speeding
up this process. The aim of this study was to isolate
microorganisms with a high ability to produce extracellular enzymes
that degrade the natural polymers present in MSW, in order to shorten
the degradation phase. Our isolation study was designed in two phases:
in the first phase, we isolated degrading microorganisms using selective
media containing a single natural polymer, such as cellulose, starch
or lipids, as the sole carbon source. In the second phase, we used
enzymatic assays to select the microorganisms with the highest
production of degradative enzymes for seed production. Our findings at
pilot scale indicate that the use of this microbial consortium was highly
efficient in shortening the degradation phase.
Abstract: Nowadays, the rapid development of multimedia
and the internet allows for wide distribution of digital media data.
It has become much easier to edit, modify and duplicate digital
information. Digital documents are also easy to copy and
distribute, and are therefore exposed to many threats. With the
large flood of information and the development of digital formats,
security and privacy have become major issues, and it is necessary
to find appropriate protection because of the significance, accuracy
and sensitivity of the information. Protection systems can be
classified into information hiding, information encryption, and
combinations of hiding and encryption that increase information
security. The strength of the science of information hiding lies in
the non-existence of standard algorithms for hiding secret
messages, and in the randomness of hiding methods, such as
combining several media (covers) with different methods to pass a
secret message. In addition, there are no formal methods to be
followed to discover hidden data, which makes the task of this
research difficult. In this paper, a new information hiding system
is presented. The proposed system aims to hide information
(a data file) in an executable file (EXE) and to detect the hidden file.
An implementation of a steganography system which embeds
information in an execution file is presented; (EXE) files have been
investigated. The system tries to find a solution to the size of the
cover file and to make it undetectable by anti-virus software. The
system includes two main functions: the first is hiding the
information in a Portable Executable file (EXE), through the
execution of four processes (specifying the cover file, specifying the
information file, encrypting the information, and hiding the
information); the second is extracting the hidden
information through three processes (specifying the stego file,
extracting the information, and decrypting the information). The
system achieves its main goals: the size of the cover file and the
size of the hidden information are made independent, and the resulting
file does not cause any conflict with anti-virus software.
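The abstract does not disclose how the payload is placed inside the PE file, so the following is only a generic overlay-style sketch of the hide/extract pair with a toy XOR cipher; the marker, key, and cipher are assumptions for illustration, not the paper's algorithm:

```python
# Generic overlay-style embedding sketch: the payload is XOR-"encrypted"
# with a key and appended after a magic marker at the end of the cover
# file's bytes. Marker, key, and cipher are illustrative assumptions.

MARKER = b"__HIDDEN__"

def xor_bytes(data, key):
    """Toy repeating-key XOR; stands in for the scheme's encryption step."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def hide(cover: bytes, secret: bytes, key: bytes) -> bytes:
    """Append marker + encrypted payload; the cover's own bytes are untouched."""
    return cover + MARKER + xor_bytes(secret, key)

def extract(stego: bytes, key: bytes) -> bytes:
    """Locate the marker and decrypt everything after it."""
    pos = stego.rindex(MARKER)
    return xor_bytes(stego[pos + len(MARKER):], key)

cover = b"MZ\x90\x00...pretend-this-is-an-exe..."
secret = b"confidential message"
stego = hide(cover, secret, b"k3y")
recovered = extract(stego, b"k3y")
```

Because the payload sits after the cover's own bytes, the cover size and payload size are independent of each other, mirroring the independence property claimed in the abstract.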
Abstract: The aim of this article is to describe the utility of a novel simulation approach, the convolution method, for predicting the blood concentration of a drug from the dissolution data of salbutamol sulphate microparticulate formulations with different release patterns (1:1, 1:2 and 1:3, drug:polymer). USP 2007 dissolution apparatus II with 900 ml of double-distilled water stirred at 50 rpm was employed for the dissolution analysis. From the dissolution data, the blood drug concentration was determined, and in turn the predicted blood drug concentration data were used to calculate the pharmacokinetic parameters, i.e. Cmax, Tmax, and AUC. Convolution is a good biowaiver technique; however, for better utility it should be applied under conditions where biorelevant dissolution media are used.
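The convolution step can be sketched as follows: the predicted plasma concentration is the discrete convolution of the input rate (differences of the cumulative dissolution profile) with a unit impulse response. The one-compartment impulse response, its elimination rate constant, and the dissolution profile below are illustrative assumptions, not the study's salbutamol data:

```python
import math

# Predicted concentration C(t) as the discrete convolution of the in-vitro
# input rate with a unit impulse response. The one-compartment impulse
# response and the dissolution profile are illustrative assumptions.

def predict_concentration(cum_dissolved, uir):
    """cum_dissolved: cumulative fraction dissolved at each time step;
    uir: unit impulse response sampled at the same time steps."""
    rate = [cum_dissolved[0]] + [b - a for a, b in
                                 zip(cum_dissolved, cum_dissolved[1:])]
    n = len(rate)
    return [sum(rate[j] * uir[t - j] for j in range(t + 1)) for t in range(n)]

def pk_summary(conc, dt=1.0):
    """Cmax, Tmax, and AUC (trapezoidal rule) from a concentration profile."""
    cmax = max(conc)
    tmax = conc.index(cmax) * dt
    auc = sum((a + b) / 2 * dt for a, b in zip(conc, conc[1:]))
    return cmax, tmax, auc

ke = 0.3                                # assumed elimination rate constant (1/h)
t = list(range(12))                     # hourly samples
uir = [math.exp(-ke * ti) for ti in t]  # assumed one-compartment impulse response
cum = [0.0, 0.2, 0.5, 0.8, 0.95, 1.0] + [1.0] * 6  # assumed dissolution profile
conc = predict_concentration(cum, uir)
cmax, tmax, auc = pk_summary(conc)
```

Cmax, Tmax, and AUC then follow directly from the predicted profile, as in the abstract; with real data, the unit impulse response would come from intravenous or solution reference data.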
Abstract: The implementation of electronic government started with the initiation of the Multimedia Super Corridor (MSC) by the Malaysian government. The introduction of ICT in the public sector, especially through e-Government initiatives, opened a new chapter in government administration throughout the world. The aim of this paper is to discuss the implementation of e-government in Malaysia, covering the results of a public-user self-assessment of Malaysia's electronic government applications. E-services, e-procurement, the Generic Office Environment (GOE), the Human Resources Management Information System (HRMIS), the Project Monitoring System (PMS), the Electronic Labor Exchange (ELX) and e-syariah (religion) were the seven flagship applications assessed. The study adopted a cross-sectional survey research approach, and the information systems literature was used. The analysis was done for 35 respondents in a pilot test, and there was evidence from the public users' perspective to suggest that the e-government applications were generally successful.