Abstract: Ethanol is widely used as a therapeutic agent against hepatocellular carcinoma (HCC, or hepatoma) worldwide, as it can induce hepatocellular carcinoma cell apoptosis at low concentrations through a multifactorial process regulated by several unknown proteins. This paper provides a simple and accessible proteomic strategy for exploring differentially expressed proteins in the apoptotic pathway. The appropriate concentrations of ethanol required to induce HepG2 cell apoptosis were first assessed by MTT assay, Giemsa staining and fluorescence staining. Next, the central proteins involved in the apoptosis pathway were determined using 2D-PAGE, SDS-PAGE, and bio-software analysis. Finally, the downregulation of two proteins, AFP and survivin, was confirmed by immunocytochemistry and reverse transcriptase PCR (RT-PCR). The simple, useful method demonstrated here provides a new approach to proteomic analysis of key bio-regulatory processes including proliferation, differentiation, apoptosis, immunity and metastasis.
Abstract: In this paper, a novel approach for the multidisciplinary design optimization (MDO) of complex mechatronic systems is presented. This approach is part of a global project aiming to include the MDO aspect in an innovative design process. As a first step, the paper considers MDO as a redesign approach limited to parametric optimization. After defining and introducing the key terms, the proposed method is presented; it is based on the V-model commonly used in mechatronics.
Abstract: One of the main advantages of the LO paradigm is to
allow the availability of good quality, shareable learning material
through the Web. The effectiveness of the retrieval process requires a
formal description of the resources (metadata) that closely fits the
user's search criteria; in spite of the huge international efforts in this
field, educational metadata schemata often fail to fulfil this
requirement. This work aims to improve the situation, by the
definition of a metadata model capturing specific didactic features of
shareable learning resources. It classifies LOs into "teacher-oriented"
and "student-oriented" categories, in order to describe the role an LO
is to play when it is integrated into the educational process. This
article describes the model and a first experimental validation process
that has been carried out in a controlled environment.
Abstract: For an incumbent operator it is very important
to understand how to react when a second operator enters the
market. In this paper, prepared as a preliminary study of the
GSM market in Iran, we have studied five MENA markets
selected from the point of view of their similarity to it. The paper
analyzes the impact of second entrants in the selected markets on
certain key marketing performance indicators (KPIs), namely:
market share (by operator), prepaid share, minutes of use (MoU),
price and average revenue per user (ARPU), each for the total
market.
Abstract: This paper explores the effectiveness of machine
learning techniques in detecting firms that issue fraudulent financial
statements (FFS) and deals with the identification of factors
associated with FFS. To this end, a number of experiments have been
conducted using representative learning algorithms, which were
trained on a data set of 164 fraud and non-fraud Greek firms from the
period 2001-2002. The decision of which particular method to
choose is a complicated problem. A good alternative to choosing
only one method is to create a hybrid forecasting system
incorporating a number of possible solution methods as components
(an ensemble of classifiers). For this purpose, we have implemented
a hybrid decision support system that combines the representative
algorithms using a stacking variant methodology and achieves better
performance than any of the examined simple and ensemble methods. To
sum up, this study indicates that the investigation of financial
information can be used in the identification of FFS and underlines the
importance of financial ratios.
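The stacking idea described above can be sketched generically. The paper's specific base learners, stacking variant, and financial-ratio features are not given in the abstract, so synthetic data and scikit-learn's stock StackingClassifier serve only as a stand-in:

```python
# Illustrative sketch only: the paper's base learners, stacking variant,
# and financial-ratio features are not public here, so synthetic data and
# scikit-learn's generic StackingClassifier stand in for them.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# 164 firms, mirroring the sample size mentioned in the abstract.
X, y = make_classification(n_samples=164, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("nb", GaussianNB()),
                ("rf", RandomForestClassifier(random_state=0))],
    final_estimator=LogisticRegression(),  # meta-learner over base outputs
)
stack.fit(X_tr, y_tr)
score = stack.score(X_te, y_te)
```

The meta-learner is trained on cross-validated predictions of the base classifiers, which is the essence of the stacking combination the abstract describes.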
Abstract: In single trial analysis, when using Principal
Component Analysis (PCA) to extract Visual Evoked Potential
(VEP) signals, the selection of principal components (PCs) is an
important issue. We propose a new method here that selects only
the appropriate PCs. We denote the method as selective eigen-rate
(SER). In the method, the VEP is reconstructed based on the rate
of the eigen-values of the PCs. When this technique is applied to
emulated VEP signals with added background
electroencephalogram (EEG), with a focus on extracting the
evoked P3 parameter, it is found to be feasible. The improvement
in signal-to-noise ratio (SNR) is superior to that of two other existing
methods of PC selection: Kaiser (KSR) and Residual Power (RP).
Though another PC-selection method, Spectral Power Ratio (SPR),
gives a comparable SNR, with high noise factors (i.e., strong EEG)
SER gives better results. Next, we applied the SER
method to real VEP signals to analyse the P3 responses for
matched and non-matched stimuli. The P3 parameters extracted
through our proposed SER method showed a higher P3 response for
the matched stimulus, which conforms to existing neuroscience
knowledge. Single-trial PCA using the KSR and RP methods failed to
indicate any difference for the stimuli.
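The eigenvalue-rate selection described above can be sketched as follows. The abstract does not give the exact SER rule, so the 10% rate threshold and the emulated P3-like waveform below are illustrative assumptions only:

```python
import numpy as np

# Hedged sketch of eigenvalue-rate PC selection: the exact SER rule is not
# in the abstract, so the 10% threshold and the emulated waveform are
# assumptions for illustration.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
vep = np.exp(-((t - 0.3) / 0.05) ** 2)               # emulated evoked component
trials = vep + 0.5 * rng.standard_normal((40, 256))  # trials + background "EEG"

# Second-moment matrix across time points; the evoked component common to
# all trials shows up as a dominant eigen-direction.
R = trials.T @ trials / trials.shape[0]
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

rate = eigvals / eigvals.sum()       # eigen-value "rate"
V = eigvecs[:, rate > 0.10]          # keep only high-rate PCs (assumed cutoff)
recon = trials @ V @ V.T             # per-trial reconstruction

err_raw = np.mean((trials - vep) ** 2)
err_ser = np.mean((recon - vep) ** 2)
```

Projecting each trial onto the few high-rate eigen-directions discards most of the background-EEG variance, which is the mechanism behind the SNR improvement reported above.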
Abstract: In this paper we propose a family of algorithms based
on 3rd and 4th order cumulants for blind single-input single-output
(SISO) Non-Minimum Phase (NMP) Finite Impulse Response (FIR)
channel estimation driven by a non-Gaussian signal. The input signal
represents the signal used in 10GBASE-T (or IEEE 802.3an-2006)
as a Tomlinson-Harashima Precoded (THP) version of random
Pulse-Amplitude Modulation with 16 discrete levels (PAM-16). The
proposed algorithms are tested using three non-minimum phase
channels, for different Signal-to-Noise Ratios (SNR) and different
input data lengths. Numerical simulation results are presented to
illustrate the performance of the proposed algorithms.
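As background for why higher-order cumulants carry usable information here: the zero-lag fourth-order cumulant of a PAM-16 input is far from zero, whereas it vanishes for a Gaussian signal of the same power. The sketch below only illustrates this non-Gaussianity; the precoding step and the sample size are assumptions:

```python
import numpy as np

# Illustration only: the PAM-16 level set and the sample size are assumed,
# and the Tomlinson-Harashima precoding step from the abstract is omitted.
rng = np.random.default_rng(1)
levels = np.arange(-15, 16, 2, dtype=float)    # 16 PAM levels: -15, -13, ..., 15
x = rng.choice(levels, size=200_000)
g = rng.standard_normal(200_000) * x.std()     # Gaussian signal of equal power

def c4(s):
    """Zero-lag 4th-order cumulant E[s^4] - 3 E[s^2]^2 of a zero-mean signal."""
    s = s - s.mean()
    return np.mean(s ** 4) - 3.0 * np.mean(s ** 2) ** 2

c4_pam, c4_gauss = c4(x), c4(g)
```

The strongly negative (sub-Gaussian) cumulant of the PAM-16 input is what the blind 3rd/4th-order cumulant estimators exploit, since the Gaussian channel noise contributes (asymptotically) nothing to those statistics.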
Abstract: There are three distinct stages in the evolution of
economic thought, namely:
1. in the first stage, the major concern was to accelerate
economic growth with increased availability of material
goods, especially in developing economies with very low
living standards, because eradicating poverty required faster
economic growth.
2. in the second stage, economists made a distinction between
growth and development. Development was seen as going
beyond economic growth, and bringing certain changes in
the structure of the economy with more equitable
distribution of the benefits of growth, with growth
becoming automatic and sustained.
3. the third stage is now reached. Our concern is now with
"sustainable development", that is, development not only
for the present but also for the future.
Thus the focus changed from "sustained growth" to "sustainable
development". Sustainable development brings to the fore the long-term
relationship between ecology and economic development.
Since the creation of UNEP in 1972, it has worked for
development without destruction, for environmentally sound and
sustainable development. It was realised that the environment cannot
be viewed in a vacuum; it is not separate from development, nor in
competition with it. It called for the integration of the environment with
development whereby ecological factors enter development planning,
socio-economic policies, cost-benefit analysis, trade, technology
transfer, waste management, educational and other specific areas.
Industrialisation has contributed to the growth of the economies of
several countries. It has improved the living standards of their people
and provided benefits to society. In the process, it has also created
great environmental problems like climate change, forest destruction
and denudation, soil erosion and desertification etc.
On the other hand, industry has provided jobs and improved the
prospects of wealth for industrialists. Working-class
communities simply had to put up with high levels of pollution in
order to keep their jobs and protect their incomes.
The environmental problem has many roots: the
political, economic, cultural and technological conditions of
modern society. Experts concede that industrial growth lies
somewhere close to the heart of the matter. Therefore, the objective
of this paper is not to document all roots of an environmental crisis
but rather to discuss the effects of industrial growth and
development.
We have come to the conclusion that although public intervention
is often unnecessary to ensure that perfectly competitive markets
function in society's best interests, such intervention is necessary
when firms or consumers pollute.
Abstract: The present study solves the along-wind oscillation problem of a tall square building from first principles, and the across-wind oscillation problem of the same building from empirical relations obtained by experiments. A human-comfort criterion for the worst condition, at the top floor of the building, is considered, and a limiting height for a building of a given cross-section is predicted. Numerical integrations are carried out as required. The results show the severity of across-wind oscillations in comparison to along-wind oscillations. The comfort criterion is combined with the across-wind oscillation results to determine the maximum allowable height of a building for a given square cross-section.
Abstract: Correct design of a regulator's structure requires complete prediction of the ultimate dimensions of the scour-hole profile formed downstream of the solid apron. Scour downstream of regulators is studied either on solid aprons, by means of velocity distributions, or on movable beds, by studying the topography of the scour hole formed downstream. In this paper, a new technique was developed to study the scour hole downstream of regulators on movable beds. The study was divided into two categories: the first finds the sum of the length of the rigid apron behind the gates and the length of the scour hole formed downstream of it, while the second finds the minimum length of rigid apron behind the gates that prevents erosion downstream of it. The study covers free and submerged hydraulic-jump conditions under both symmetrical and asymmetrical gate operation. Comparing the two categories, we found that the minimum length of rigid apron that prevents scour (Ls) is greater than the sum of the length of the rigid apron and that of the scour hole formed behind it (L+Xs). In addition, the scour-hole dimensions under a submerged hydraulic jump are always greater than under a free one, and the scour-hole dimensions under asymmetrical operation are greater than under symmetrical operation.
Abstract: In this paper, we discuss the paradigm shift in bank
capital from the "gone concern" to the "going concern" mindset. We
then propose a methodology for pricing a product of this shift called
Contingent Capital Notes ("CoCos"). The Merton Model can
determine a price for credit risk by treating the firm's equity value as a
call option on its assets. Our pricing methodology for CoCos also
uses the credit spread implied by the Merton Model in a subsequent
derivative form created by John Hull et al. Here, a market-implied
asset volatility is calculated by using observed market CDS spreads.
This implied asset volatility is then used to estimate the probability of
triggering a predetermined "contingency event" given the distance-
to-trigger (DTT). The paper then investigates the effect of varying
DTTs and recovery assumptions on the CoCo yield. We conclude
with an investment rationale.
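As a hedged illustration of the distance-to-trigger idea (the paper's exact trigger-probability expression is not given in the abstract), a plain lognormal, Merton-style terminal asset-value approximation relates the trigger probability to the implied asset volatility; all parameter values below are assumed:

```python
from math import log, sqrt
from statistics import NormalDist

# Hedged sketch: the paper's exact formula is not reproduced in the
# abstract, so a plain lognormal (Merton-style) terminal asset-value
# approximation is used; all parameter values are assumed.
def trigger_prob(assets, trigger, sigma, mu, horizon):
    """P(A_T < trigger) for geometric-Brownian assets with drift mu."""
    d = (log(assets / trigger) + (mu - 0.5 * sigma ** 2) * horizon) \
        / (sigma * sqrt(horizon))
    return NormalDist().cdf(-d)

# A larger distance-to-trigger implies a lower probability of conversion.
p_near = trigger_prob(100.0, 90.0, sigma=0.25, mu=0.02, horizon=5.0)
p_far = trigger_prob(100.0, 60.0, sigma=0.25, mu=0.02, horizon=5.0)
```

Holding the implied asset volatility fixed, moving the trigger further below the current asset value (a larger DTT) lowers the conversion probability, which is the qualitative effect the paper's DTT sensitivity analysis investigates.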
Abstract: Grid computing provides an effective infrastructure for massive computation among a flexible and dynamic collection of individual systems for resource discovery. The major challenge for grid computing is to prevent breaches and secure the data from trespassers. To overcome such conflicts, a semantic approach can be designed that filters the access requests of peers by checking the resource description specifying the data and the metadata as factual statements. Between every pair of nodes in the grid, a semantic firewall will be present as a middleware. A requester will be required to present an application specifying their needs to the firewall, and the system will accordingly grant or deny the request.
Abstract: To investigate the correspondence of theory and
practice, a successfully implemented Knowledge Management
System (KMS) is explored through the lens of Alavi and Leidner's
proposed framework for the analysis of an information system
in knowledge management (Framework-AISKM). The KMS
was designed to manage curricular knowledge in a distributed
university environment. The motivation for the KMS is discussed
along with the types of knowledge necessary in an academic setting.
Elements of the KMS involved in all phases of capturing and
disseminating knowledge are described. As the KMS matures, the
resulting data stores form the precursor to, and the potential for,
knowledge mining. The findings from this exploratory study indicate
substantial correspondence between the successful KMS and the
theory-based framework, providing provisional confirmation for the
framework while suggesting factors that contributed to the system's
success. Avenues for future work are described.
Abstract: In this cyber age, the job market is being rapidly transformed and digitalized. Submitting a paper-based curriculum vitae (CV) nowadays does not grant a job seeker a high employability rate. This paper calls attention to the creation of the mobile curriculum vitae, or m-CV (http://mcurriculumvitae.blogspot.com), a sample individual CV developed using a weblog, which can enhance a job hunter's, and especially a fresh graduate's, marketability. The study is designed to identify the perceptions held by Malaysian university students regarding the m-CV, grounded on a modified Technology Acceptance Model (TAM). It measures the strength and direction of the relationships among three major variables: Perceived Ease of Use (PEOU), Perceived Usefulness (PU) and Behavioral Intention (BI) to use. The findings show that university students generally accepted the m-CV, since they perceived it to be useful more than easy to use. Additionally, the study confirmed TAM to be a useful theoretical model for understanding and explaining the behavioral intention to use a Web 2.0 application, the weblog, for publishing a CV. The results underline another significant positive value of using weblogs to create personal CVs. Further research on the m-CV is highlighted in this paper.
Abstract: In this work, we numerically examine structures that could
confine light in nanometer-scale areas. A system consisting of two
silicon disks with an in-plane separation of a few tens of nanometers
is studied first. The normalized, unitless effective mode volume, Veff,
is calculated for the two lowest whispering-gallery-mode resonances.
The effective mode volume is reduced significantly as the gap between
the disks decreases. In addition, the effect of the substrate is studied.
In that case, a Veff of approximately the same value as in the
substrate-free case for a similar two-disk system can be obtained by
using disks almost twice as thick. We also numerically examine a
structure consisting of a circular slot waveguide formed in a silicon
disk resonator. We show that the proposed structure can support
high-Q resonances, making it a very promising candidate for optical
interconnect applications. The study includes numerical calculations
for all the geometric parameters of the structure, as well as numerical
simulations of the coupling between a waveguide and the proposed
disk resonator, leading to a very promising conclusion about its
applicability.
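For reference, the effective mode volume mentioned above is commonly defined as follows; this is the standard cavity-electrodynamics definition, and the unitless normalization by a cubic material wavelength is an assumption, since the paper's exact normalization is not stated in the abstract:

```latex
V_{\mathrm{eff}}
  = \frac{\int \varepsilon(\mathbf{r})\,\lvert \mathbf{E}(\mathbf{r})\rvert^{2}\,\mathrm{d}V}
         {\max\bigl[\varepsilon(\mathbf{r})\,\lvert \mathbf{E}(\mathbf{r})\rvert^{2}\bigr]}
  \Big/ \left(\frac{\lambda}{n}\right)^{3}
```

The denominator picks out the field-energy-density maximum, so shrinking the gap concentrates the field and reduces Veff, consistent with the trend reported above.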
Abstract: In this paper, we present a new algorithm for clustering data in large datasets using image processing approaches. First, the dataset is mapped onto a binary image plane. The synthesized image is then processed using efficient image processing techniques to cluster the data. Hence, the algorithm avoids exhaustive search to identify clusters, considering only a small subset of the data that contains the critical boundary information sufficient to identify the contained clusters. Compared to available data clustering techniques, the proposed algorithm produces similar-quality results while outperforming them in execution time and storage requirements.
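A minimal sketch of the mapping idea follows. The paper's exact rasterization, boundary extraction and refinement steps are not in the abstract, so the grid size, the synthetic point sets, and the use of connected-component labeling here are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

# Hedged sketch: points are mapped onto a binary image plane and connected
# components of occupied pixels are read off as clusters. The 32x32 grid,
# the synthetic point sets, and 8-connectivity are assumptions.
rng = np.random.default_rng(0)
a = rng.normal(loc=(2.0, 2.0), scale=0.3, size=(100, 2))
b = rng.normal(loc=(7.0, 7.0), scale=0.3, size=(100, 2))
pts = np.vstack([a, b])

grid = np.zeros((32, 32), dtype=bool)
ij = np.clip((pts * 3.0).astype(int), 0, 31)   # rasterize points into pixels
grid[ij[:, 0], ij[:, 1]] = True

# Label connected components of the binary image as clusters.
labels, n_clusters = ndimage.label(grid, structure=np.ones((3, 3)))
point_labels = labels[ij[:, 0], ij[:, 1]]      # cluster id for each point
```

Because the labeling pass touches only occupied pixels rather than all pairwise point distances, it illustrates how an image-plane representation can sidestep exhaustive search.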
Abstract: Bovine viral diarrhea virus (BVDV) can cause lifelong
persistent infection. One reason for this phenomenon is
BVDV infection of placental tissue. However, the mechanisms by which
BVDV invades placental tissue remain unclear. To clarify the
molecular mechanisms, we investigated the possible means by which
BVDV enters bovine trophoblast cells (TPC). A yeast two-hybrid
system was used to identify proteins extracted from TPC, which
interact with the BVDV envelope glycoprotein E2. A pGBKT7-E2 yeast
expression vector and TPC cDNA library were constructed. Through
two rounds of screening, three positive clones were identified.
Sequencing analysis indicated that all three positive clones
encoded the same protein, clathrin. Physical interaction between
clathrin and the BVDV E2 protein was further confirmed by
coimmunoprecipitation experiments. This result suggested that
clathrin might play a critical role in the process of BVDV entry into
placental tissue and might be a novel antiviral target for preventing
BVDV infection.
Abstract: The new framework in which Higher Education is
immersed involves a complete change in the way lecturers must
teach and students must learn. Whereas the lecturer was the main
character in traditional education, the essential goal now is to
increase the students' participation in the process. Thus, one of the
main tasks of lecturers in this new context is to design activities of
different kinds in order to encourage such participation. Seminars
are one of the activities included in this environment: active
sessions that enable going in depth into specific topics in support of
other activities. They are characterized by some features such as
favoring interaction between students and lecturers or improving
their communication skills. Hence, planning and organizing strategic
seminars is indeed a great challenge for lecturers with the aim of
acquiring knowledge and abilities. This paper proposes a method
using Artificial Intelligence techniques to obtain student profiles
from their marks and preferences. The goal of building such profiles
is twofold. First, it facilitates the task of splitting the students into
different groups, each group with similar preferences and learning
difficulties. Second, it makes it easy to select adequate topics as
candidates for the seminars. The results obtained can either
confirm what the lecturers observed during the development
of the course or suggest new methodological strategies for
certain topics.
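The grouping step can be sketched with a generic clustering routine. The abstract does not name the AI technique used, so plain k-means over synthetic mark/preference vectors is only a stand-in for it:

```python
import numpy as np

# Hedged stand-in: the abstract does not name the AI technique, so plain
# k-means over synthetic mark/preference vectors illustrates splitting
# students into groups with similar profiles.
rng = np.random.default_rng(0)
group1 = rng.normal(4.0, 1.0, size=(20, 3))   # marks on 3 topics (assumed)
group2 = rng.normal(8.0, 1.0, size=(20, 3))
profiles = np.vstack([group1, group2])

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm with random initial centers."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels

labels = kmeans(profiles, 2)
```

Each resulting group gathers students with similar profiles, which is the prerequisite for matching seminar topics to group needs as described above.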
Abstract: The performance of adaptive beamforming degrades
substantially in the presence of steering vector mismatches. This
degradation is especially severe in the near field, since the
3-dimensional source location is more difficult to estimate than the
2-dimensional direction of arrival in far-field cases. As a solution, a
novel approach of near-field robust adaptive beamforming (RABF) is
proposed in this paper. It is a natural extension of the traditional
far-field RABF and belongs to the class of diagonal loading
approaches, with the loading level determined based on worst-case
performance optimization. However, unlike methods that solve for
the optimal loading by iteration, we suggest here a simple
closed-form solution after some approximations; consequently,
the optimal weight vector can be expressed in a closed form. Besides
simplicity and low computational cost, the proposed approach reveals
how different factors affect the optimal loading as well as the weight
vector. Its excellent performance in the near-field is confirmed via a
number of numerical examples.
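A minimal fixed-loading sketch of the diagonal-loading idea follows. The paper's worst-case-optimal loading level and its closed form are not reproduced here; the uniform-linear-array steering vector, the 10% loading factor, and the noise-only snapshots are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of diagonal loading only: the paper derives the loading
# level in closed form from worst-case optimization, not reproduced here;
# the ULA steering vector, 10% loading factor, and noise-only snapshots
# are assumptions.
rng = np.random.default_rng(0)
n, snapshots = 8, 100
a = np.exp(1j * np.pi * np.arange(n) * np.sin(0.3))   # presumed steering vector

X = rng.standard_normal((n, snapshots)) + 1j * rng.standard_normal((n, snapshots))
R = X @ X.conj().T / snapshots                        # sample covariance matrix

lam = 0.1 * np.trace(R).real / n                      # assumed loading level
w = np.linalg.solve(R + lam * np.eye(n), a)           # loaded MVDR-style weight
w = w / (a.conj() @ w)                                # unit response toward a
```

Adding lam to the diagonal regularizes the sample covariance, which is what makes the beamformer robust to steering-vector mismatch; the paper's contribution is choosing lam in closed form rather than by the fixed rule used here.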
Abstract: The main objective of this paper is to contribute to the
existing knowledge transfer and IT outsourcing literature,
specifically in the context of Malaysia by reviewing the current
practices of e-government IT outsourcing in Malaysia including the
issues and challenges faced by the public agencies in transferring the
knowledge during the engagement. This paper discusses various
factors and different theoretical models of knowledge transfer, from
the traditional model to the recent models suggested by
scholars. The present paper attempts to align organizational
knowledge from the knowledge-based view (KBV) and
organizational learning (OL) lens. This review could help shape the
direction of both future theoretical and empirical studies on inter-firm
knowledge transfer, specifically on how the KBV and OL perspectives
could play a significant role in explaining the complex relationships
between client and vendor in inter-firm knowledge transfer, and on
the role of organizational management information systems and
Transactive Memory Systems (TMS) in facilitating the organizational
knowledge transfer process. Conclusions are drawn and further
research is suggested.