Abstract: This paper provides a conceptual framework for examining the competitive disadvantage of banks that suffer from poor performance. Banks generate revenue mainly from the interest-rate spread between taking deposits and making loans, collecting fees in the process. To maximize firm value, banks pursue loan growth and expense control while managing the risks posed by non-performing borrowers and by a narrowing interest spread between assets and liabilities. Competitive disadvantage refers to the failure to access imitable resources and to build managerial capabilities that yield sustainable returns under appropriate risk management. A four-quadrant framework of organizational typology is proposed to examine the features of competitive disadvantage in the banking sector, together with a resource configuration model, extracted from CAMEL indicators, to examine the underlying features of bank failures.
Abstract: The term private equity usually refers to any type of
equity investment in an asset in which the equity is not freely
tradable on a public stock market. Some researchers believe that
private equity contributed to the extent of the crisis and increased
the pace of its spread across the world. We do not share this view;
on the contrary, we argue that during an economic recession
private equity might become an important source of funds for firms
with special needs (e.g. for firms seeking buyout financing, venture
capital, expansion capital or distress debt financing). However,
over-regulation of private equity in both the European Union and
the US can slow down this specific funding channel to the
economy and deepen the credit crunch during global crises.
Abstract: Knowledge Discovery in Databases (KDD) is the
process of extracting previously unknown but useful and significant
information from massive volumes of data. Data mining is the stage
of the KDD process that applies an algorithm to extract interesting
patterns. Usually, such algorithms generate huge volumes of patterns,
which must be evaluated with interestingness measures that reflect
the user's requirements. Interestingness measures fall into two
classes: (i) objective measures and (ii) subjective measures. Objective
measures such as support and confidence extract meaningful patterns
based on the structure of the patterns, while subjective measures such
as unexpectedness and novelty reflect the user's perspective. In this
report, we briefly review the most widespread and successful
subjective measures and propose a new subjective measure of
interestingness, i.e. shocking.
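The objective measures named above have simple set-theoretic definitions; a minimal sketch, using a hypothetical toy transaction database rather than data from the report:

```python
# Toy transaction database (hypothetical data for illustration).
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset, db):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in db) / len(db)

def confidence(antecedent, consequent, db):
    """support(A ∪ B) / support(A): how often B appears given A."""
    return support(antecedent | consequent, db) / support(antecedent, db)

print(support({"bread"}, transactions))               # fraction of baskets with bread
print(confidence({"bread"}, {"milk"}, transactions))  # P(milk | bread) estimate
```

Subjective measures, by contrast, cannot be computed from the pattern structure alone; they require a model of the user's prior beliefs.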
Abstract: Ultra-low-power (ULP) circuits have received
widespread attention due to the rapid growth of biomedical
applications and battery-less electronics. ULP circuits operate
transistors in the subthreshold region. A major research challenge in
this operating region is to extract the ULP benefits with minimal
degradation in speed and robustness. Process, voltage and
temperature (PVT) variations significantly affect the performance of
subthreshold circuits, and the designed performance parameters of
ULP circuits may vary widely with temperature. Hence, this paper
investigates the effect of temperature variation on device and circuit
performance parameters at different biasing voltages in the
subthreshold region. Simulation results clearly demonstrate that
performance parameters are significantly affected in the deep
subthreshold and near-threshold voltage regions, whereas circuits
biased in the moderate subthreshold region are more immune to
temperature variations. This establishes that the moderate
subthreshold region is ideal for temperature-immune circuits.
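The temperature sensitivity discussed here follows from the exponential subthreshold current law, in which both the thermal voltage and the threshold voltage shift with temperature. A minimal sketch of that dependence (all device constants below are illustrative assumptions, not values from the paper):

```python
import math

# Illustrative constants (assumed generic device values, not the paper's).
K_B_OVER_Q = 8.617e-5   # Boltzmann constant over electron charge, V/K
I0 = 1e-7               # pre-exponential current scale, A (assumed)
N = 1.5                 # subthreshold slope factor (assumed)
VTH0 = 0.4              # threshold voltage at 300 K, V (assumed)
DVTH_DT = -2e-3         # threshold-voltage temperature coefficient, V/K (typical)

def subthreshold_current(vgs, temp_k):
    """Simple subthreshold drain-current law, ignoring DIBL and Vds terms."""
    vth = VTH0 + DVTH_DT * (temp_k - 300.0)   # Vth falls as T rises
    vt = K_B_OVER_Q * temp_k                  # thermal voltage kT/q rises with T
    return I0 * math.exp((vgs - vth) / (N * vt))

# Deep subthreshold bias: the current rises steeply with temperature.
for t in (250.0, 300.0, 350.0):
    print(f"T={t:.0f} K  Id={subthreshold_current(0.2, t):.3e} A")
```

Because both effects push the current in the same direction, a deep-subthreshold bias point drifts strongly with temperature, which is consistent with the abstract's finding that the moderate subthreshold region is the more temperature-immune choice.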
Abstract: Major urban centers all face rapid growth, most often
associated with spreading urbanization, and the social status of the
car has also changed: it has become a commodity of mass
consumption. There were about 5.26 million cars in Algeria in 2008,
a number that increases by 200,000 new cars every year. These
phenomena induce a demand for greater mobility and a significant
need for transport infrastructure. Faced with these problems and with
the growing use of the automobile, the central government and the
local authorities in charge of urban transport issues are aware of the
need to develop their urban transport systems but often lack the
means to do so.
Urban Transport Plans (PDU) were born in reaction to the "culture
of the automobile." They have existed around the world since the
1980s, yet they had little success before laws on air quality and the
rational use of energy in the 1990s substantially altered their content
and made their implementation mandatory in cities of over 100,000
inhabitants (abroad) [1].
The objective of this work is to use Geomatics tools and techniques
as decision support in the organization and management of travel,
taking the relevant influences into consideration; the results will then
be translated into a National Urban Transport Plan.
Abstract: This paper compares different channel models used for
modeling Broadband Power-Line Communication (BPLC) systems.
The models compared are those of Zimmermann and Dostert,
Philipps, Anatory et al., and the Anatory et al. generalized
Transmission Line (TL) model. The validity of each model was
checked in the time domain against the ATP-EMTP software, which
uses a transmission-line approach. It is found that for a power-line
network with a minimal number of branches all the models give
signal/pulse time responses similar to those of ATP-EMTP; however,
the Zimmermann and Dostert model shows the same amplitude but a
different time delay. When the number of branches is increased, only
the results of the generalized TL theory approach remain comparable
with the ATP-EMTP results. The Multi-Carrier Spread Spectrum
(MC-SS) system was also applied to check the implication of this
behavior for modulation schemes. It is observed that the Philipps
model applied to underground cable predicts performance up to
25 dB better than the other channel models, which can misrepresent
the actual performance of the system; likewise, the modified
Zimmermann and Dostert model under multipath predicts
performance about 5 dB better than that predicted by generalized TL
theory. It is therefore suggested that, for realistic BPLC system
design and analysis, the model based on generalized TL theory be
used.
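For reference, the Zimmermann–Dostert model expresses the power-line channel as a sum of attenuated, delayed echo paths. A minimal sketch of its frequency response (the path weights, attenuation parameters and lengths below are illustrative assumptions, not the paper's fitted values):

```python
import cmath
import math

# Illustrative two-path parameters (assumed for demonstration only).
A0, A1, K = 0.0, 7.8e-10, 1.0   # attenuation parameters a0, a1, exponent k
VP = 1.5e8                      # propagation velocity on the cable, m/s

paths = [        # (path weight g_i, path length d_i in metres)
    (0.64, 200.0),
    (0.38, 223.0),
]

def transfer_function(f):
    """Multipath frequency response H(f) = sum_i g_i e^{-(a0+a1 f^k) d_i} e^{-j2pi f d_i / vp}."""
    h = 0j
    for g, d in paths:
        attenuation = math.exp(-(A0 + A1 * f ** K) * d)       # cable loss term
        phase = cmath.exp(-2j * math.pi * f * d / VP)          # propagation delay
        h += g * attenuation * phase
    return h

print(abs(transfer_function(1e6)))    # channel gain magnitude at 1 MHz
print(abs(transfer_function(30e6)))   # gain falls off at higher frequency
```

The frequency-selective fading produced by the echo terms is what makes the number of branches so influential in the comparison above.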
Abstract: The third generation (3G) of cellular systems adopted
spread spectrum as the solution for data transmission in the physical
layer. Unlike IS-95 or CDMAOne (spread-spectrum systems of the
preceding generation), the new standard, called the Universal Mobile
Telecommunications System (UMTS), uses long codes in the
downlink. The system is designed for both voice communication and
data transmission. The downlink is particularly important because of
the asymmetry of data traffic, i.e., more downloading toward the
mobiles than uploading toward the base station. Moreover, for the
downlink the UMTS uses orthogonal spreading with a variable
spreading factor (OVSF, for Orthogonal Variable Spreading Factor).
This characteristic makes it possible to increase the data rate of one
or more users by reducing their spreading factor without changing
the spreading factors of the other users. In the current UMTS
standard, two techniques have been proposed to increase downlink
performance: transmit antenna diversity and space-time codes. These
two techniques combat only fading. The receiver proposed for the
mobile station is the RAKE, but one can imagine a more
sophisticated receiver able to reduce multi-user interference and the
impact of colored noise and narrowband interference. In this context,
where the users have synchronized long codes with variable
spreading factors and the mobile is unaware of the other active
codes/users, the use of pseudo-noise code sequences of different
lengths emerges as one of the most appropriate solutions.
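The OVSF codes mentioned above are built recursively: each code c of length N spawns two children (c, c) and (c, −c) of length 2N, and all codes of a given spreading factor are mutually orthogonal. A minimal sketch of the construction:

```python
def ovsf_codes(sf):
    """All OVSF codes of spreading factor `sf` (sf must be a power of two)."""
    codes = [[1]]
    while len(codes[0]) < sf:
        nxt = []
        for c in codes:
            nxt.append(c + c)                  # upper child: (c, c)
            nxt.append(c + [-x for x in c])    # lower child: (c, -c)
        codes = nxt
    return codes

def dot(a, b):
    """Inner product of two codes; 0 means orthogonal."""
    return sum(x * y for x, y in zip(a, b))

codes = ovsf_codes(8)
# Codes of the same spreading factor are mutually orthogonal.
print(dot(codes[0], codes[1]))
```

Orthogonality holds only between codes of the same spreading factor that are not ancestor/descendant in the tree, which is why assigning a shorter code to one user constrains the codes available to the others.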
Abstract: Background: Widespread use of chemotherapeutic
drugs in the treatment of cancer has led to higher health hazards
among the employees who handle and administer such drugs, so
nurses should know how to protect themselves, their patients and
their work environment against the toxic effects of chemotherapy.
Aim: This study examined the effect of a chemotherapy safety
protocol for oncology nurses on their protective-measure practices.
Design: A quasi-experimental research design was utilized. Setting:
The study was carried out in the oncology department of Menoufia
University Hospital and the Tanta oncology treatment center.
Sample: A convenience sample of forty-five nurses in the Tanta
oncology treatment center and eighteen nurses in the Menoufia
oncology department. Tools: I) an interviewing questionnaire
covering sociodemographic data and assessing the unit and the
nurses' knowledge about chemotherapy; II) an observational
checklist to assess nurses' actual practices in handling and
administering chemotherapy. Baseline data were collected before
implementing the chemotherapy safety protocol; the protocol was
then implemented, and the nurses were assessed again after two
months. Results: revealed that 88.9% of study group I and 55.6% of
study group II improved to good total knowledge scores after
education on the safety protocol; likewise, 95.6% of study group I
and 88.9% of study group II achieved good total practice scores.
Moreover, less than half of group I (44.4%) reported heavy workload
as their main barrier, while the majority of group II (94.4%) reported
many barriers to adhering to the safety protocol, such as not knowing
the protocol, heavy workload and inadequate equipment.
Conclusions: The safety protocol for oncology nurses appeared to
have a positive effect on improving nurses' knowledge and practice.
Recommendation: A chemotherapy safety protocol should be
instituted, at frequent intervals, for all oncology nurses working in
any oncology unit and/or center to enhance compliance.
Abstract: The mobile agent paradigm provides a promising technology for the development of distributed and open applications. However, one of the main obstacles to widespread adoption of the mobile agent paradigm seems to be security. This paper treats the security of the mobile agent against malicious host attacks and describes a generic mobile agent protection architecture. The proposed approach is based on dynamic adaptability and adopts reflexivity as a model of design and implementation. To protect the mobile agent against behaviour-analysis attempts, the suggested approach endows it with the flexibility to present unexpected behaviour. Furthermore, some classical protective mechanisms are used to reinforce the level of security.
Abstract: The Siemens Healthcare Sector is one of the world's
largest suppliers to the healthcare industry and a trendsetter in
medical imaging and therapy, laboratory diagnostics, medical
information technology, and hearing aids.
Siemens offers its customers products and solutions for the entire
range of patient care from a single source – from prevention and
early detection to diagnosis, and on to treatment and aftercare. By
optimizing clinical workflows for the most common diseases,
Siemens also makes healthcare faster, better, and more cost effective.
The optimization of clinical workflows requires a
multidisciplinary focus and a collaborative approach involving, for
example, medical advisors, researchers and scientists as well as
healthcare economists. This new form of collaboration brings
together experts with deep technical experience, physicians with
specialized medical knowledge, and people with comprehensive
knowledge of health economics.
As Charles Darwin is often quoted as saying, "It is neither the
strongest of the species that survive, nor the most intelligent, but the
one most responsive to change." We believe that those who can
successfully manage this change will emerge as winners, with
valuable competitive advantage.
Current medical information and knowledge are some of the core
assets in the healthcare industry. The main issue is to connect
knowledge holders and knowledge recipients from various
disciplines efficiently in order to spread and distribute knowledge.
Abstract: In this paper, a Selective Adaptive Parallel Interference Cancellation (SA-PIC) technique is presented for the Multicarrier Direct Sequence Code Division Multiple Access (MC DS-CDMA) scheme. The motivation for using SA-PIC is that it gives high performance while reducing the computational complexity required to perform interference cancellation. An upper-bound expression for the bit error rate (BER) of SA-PIC under Rayleigh fading channel conditions is derived. Moreover, the implementation complexities of SA-PIC and Adaptive Parallel Interference Cancellation (APIC) are discussed and compared. The performance of SA-PIC is investigated analytically and validated via computer simulations.
Abstract: In this paper we introduce three watermarking methods that can be used to count the number of times that a user has played some content. The proposed methods are tested with audio content in our experimental system using the most common signal processing attacks. The test results show that the watermarking methods used enable the watermark to be extracted under the most common attacks with a low bit error rate.
Abstract: Intelligent systems based on machine learning
techniques, such as classification and clustering, are gaining
widespread popularity in real-world applications. This paper presents
work on developing a software system for predicting crop yield, for
example oil-palm yield, from climate and plantation data. At the core
of our system is a method for unsupervised partitioning of data to
find spatio-temporal patterns in climate data using kernel methods,
which are well suited to dealing with complex data. This work draws
inspiration from the notion that a non-linear transformation of the
data into some high-dimensional feature space increases the
possibility of linear separability of the patterns in the transformed
space, and therefore simplifies the exploration of the associated
structure in the data. Kernel methods implicitly perform such a
non-linear mapping of the input data into a high-dimensional feature
space by replacing the inner products with an appropriate positive
definite function. In this paper we present a robust weighted kernel
k-means algorithm incorporating spatial constraints for clustering the
data. The proposed algorithm can effectively handle noise, outliers
and auto-correlation in the spatial data, enabling effective and
efficient analysis of the patterns and structures in the data, and can
thus be used for predicting oil-palm yield from the various factors
affecting it.
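The core update of kernel k-means can be sketched in a few lines: the squared distance from a point to each cluster centre in feature space is expanded entirely in kernel evaluations, so the mapping is never computed explicitly. The sketch below is a simplified weighted variant on toy data; it omits the paper's spatial-constraint term and robustness extensions:

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel: an appropriate positive definite function."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def weighted_kernel_kmeans(points, weights, k, iters=20):
    """Weighted kernel k-means; distances computed implicitly via the kernel trick."""
    n = len(points)
    K = [[rbf(points[i], points[j]) for j in range(n)] for i in range(n)]
    labels = [i % k for i in range(n)]       # simple deterministic initialization
    for _ in range(iters):
        new = []
        for i in range(n):
            best_c, best_d = labels[i], float("inf")
            for c in range(k):
                idx = [j for j in range(n) if labels[j] == c]
                if not idx:
                    continue
                w = sum(weights[j] for j in idx)
                # ||phi(x_i) - m_c||^2 expanded in kernel evaluations:
                cross = sum(weights[j] * K[i][j] for j in idx) / w
                within = sum(weights[a] * weights[b] * K[a][b]
                             for a in idx for b in idx) / (w * w)
                d = K[i][i] - 2.0 * cross + within
                if d < best_d:
                    best_c, best_d = c, d
            new.append(best_c)
        if new == labels:
            break                             # assignments stable: converged
        labels = new
    return labels

# Two well-separated groups of 2-D points (toy data, not climate data).
pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
print(weighted_kernel_kmeans(pts, [1.0] * 6, 2))
```

The weights slot is where per-point robustness terms can be attached; the spatial constraint in the paper would additionally penalize assignments that disagree with a point's spatial neighbours.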
Abstract: Housebuilders in England have been the target of numerous government policies in recent years promoting increased productivity and affordability. As a result, the housebuilding industry is currently faced with objectives to improve the affordability and sustainability of new homes whilst also increasing production rates to 240,000 per year by 2016. Yet amidst a faltering economic climate, the UK Government is forging ahead with the 'Code for Sustainable Homes', which includes stringent sustainability standards for all new homes and sets ambitious targets for the housebuilding industry, the culmination of which is the production of zero carbon homes by 2016. Great uncertainty exists amongst housebuilders as to the costs, benefits and risks of building zero carbon homes. This paper examines the key barriers to zero carbon homes from the housebuilders' perspective. A comprehensive picture of the challenges of delivering zero carbon homes is gathered through a questionnaire survey issued to the major housing developers in England. The study found that a number of cultural, legislative and financial barriers stand in the way of the widespread construction of zero carbon homes. It concludes with several recommendations to both the Government and the housebuilding industry to address the barriers that hinder a successful delivery of zero carbon homes in England.
Abstract: Controlled-release urea has become popular in the agricultural industry as it helps to address environmental issues and increase crop yield. Recently, biomass has been identified as a replacement for the polymer used as the coating material in conventional coated urea. In this paper the spreading and contact angle of biomass droplets (lignin, cellulose and clay) on a urea surface are investigated experimentally. Two tests were conducted: the sessile drop for contact-angle measurement and the pendant drop for surface-tension measurement. Biomass droplets of different concentrations were released from 30 mm above the substrate, with glass used as the control substrate. Images were recorded as soon as each droplet impacted the urea, before it was completely absorbed. The digitized droplets were then used to determine the droplet's surface tension and contact angle. A large difference is observed between the low- and high-surface-tension liquids: the wetting and spreading diameter is higher at lower surface tension. The contact-angle data showed that the biomass coating films behave as wetting liquids (θ < 90°). The contact angle of the biomass coating material gives a good indication of the wettability of a liquid on the urea surface.
Abstract: The usage of the internet is rapidly increasing, and mobile agent technology is in great demand in the internet environment. Security is one of the main obstacles restricting the spread of mobile agent technology. This paper proposes the Secure-Image Mechanism (SIM), a new mechanism to protect mobile agents against malicious hosts. SIM aims to protect the mobile agent by using symmetric encryption and hash functions from cryptography. The mechanism can prevent eavesdropping and alteration attacks, and it assists mobile agents in continuing their journey normally in case attacks occur.
Abstract: This study presents a new approach based on Tanaka's
fuzzy linear regression (FLP) algorithm to solve the well-known
power-system economic load dispatch (ELD) problem. Tanaka's FLP
formulation is employed to compute the optimal solution of the
optimization problem after linearization. The unknowns are
expressed as fuzzy numbers with triangular membership functions,
each defined by a middle and a spread value. The proposed fuzzy
model is formulated as a linear optimization problem whose
objective is to minimize the sum of the spreads of the unknowns,
subject to double inequality constraints. Linear programming is
employed to obtain the middle and the symmetric spread of every
unknown (power generation level). Simulation results of the
proposed approach are compared with those reported in the
literature.
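In Tanaka's standard formulation (sketched here in generic regression notation; the mapping of these symbols onto generation levels is an assumption, not taken from the paper), each unknown coefficient is a symmetric triangular fuzzy number with middle $a_j$ and spread $c_j \ge 0$, and the linear program reads:

```latex
\min_{a,\,c} \;\; \sum_{j} c_j \sum_{i} \lvert x_{ij} \rvert
\qquad \text{subject to, for every observation } i:
\begin{aligned}
&\sum_{j} a_j x_{ij} + (1 - h) \sum_{j} c_j \lvert x_{ij} \rvert \;\ge\; y_i, \\
&\sum_{j} a_j x_{ij} - (1 - h) \sum_{j} c_j \lvert x_{ij} \rvert \;\le\; y_i, \\
&c_j \;\ge\; 0 \quad \text{for all } j,
\end{aligned}
```

where $h \in [0, 1)$ is the chosen degree of fit. Minimizing the total spread subject to these double inequality constraints is exactly the structure the abstract describes.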
Abstract: A zero-field ferromagnetic Ising model is utilized to
simulate the propagation of infection in a population that assumes a
square lattice structure. The rate of infection increases with
temperature. The disease spreads faster among individuals with low J
values. This effect, however, diminishes at higher temperatures.
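A zero-field Ising simulation of this kind can be sketched with standard Metropolis dynamics, reading spin +1 as "infected" and −1 as "susceptible" (the lattice size, temperatures and step counts below are illustrative choices, not the paper's):

```python
import math
import random

def simulate(L=20, J=1.0, temp=2.0, steps=20000, seed=1):
    """Metropolis dynamics on an L x L zero-field Ising lattice.

    Spin +1 is read as 'infected', -1 as 'susceptible'; returns the
    final infected count."""
    rng = random.Random(seed)
    spins = [[-1] * L for _ in range(L)]
    spins[L // 2][L // 2] = 1                      # a single initial case
    for _ in range(steps):
        i, j = rng.randrange(L), rng.randrange(L)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2.0 * J * spins[i][j] * nb            # energy cost of flipping
        if dE <= 0 or rng.random() < math.exp(-dE / temp):
            spins[i][j] *= -1                      # accept the flip
    return sum(s == 1 for row in spins for s in row)

# The infected count grows with temperature, as the abstract reports.
print(simulate(temp=1.0), simulate(temp=3.0))
```

Lowering J weakens the energetic penalty for flipping against neighbours, which is the mechanism behind the faster spread among low-J individuals; at high temperature the thermal term dominates either way, so that distinction fades.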
Abstract: Most commercial gluten-free products are nutritionally
inferior to their gluten-containing counterparts because
manufacturers most often use refined flours and starches, so people
on a gluten-free diet may have a low fibre intake. Foxtail millet flour
and copra meal are gluten-free and have high fibre and protein
contents. The formulation of fibre-rich gluten-free cookies was
optimized by response surface methodology, with the proportion of
foxtail millet (Setaria italica) flour in the mixed flour, the fat content
and the guar gum content as independent process variables. Sugar,
sodium chloride, sodium bicarbonate and water were added in fixed
proportions of 60, 1.0, 0.4 and 20% of the mixed-flour weight,
respectively. The optimum formulation for maximum spread ratio,
fibre content, surface L-value and overall acceptability and minimum
breaking strength was 80% foxtail millet flour in the mixed flour,
42.8% fat and 0.05% guar gum.
Abstract: In this paper, the concepts of dichotomous logistic
regression (DLR) with leave-one-out (L-O-O) validation are
discussed. To illustrate them, L-O-O was run to determine the
importance of the simulation conditions for robust tests of spread
procedures with good Type I error rates, and the resulting model was
evaluated. The discussion covers 1) assessment of the accuracy of
the model and 2) the parameter estimates. These are presented and
illustrated by modeling the relationship between the dichotomous
dependent variable (Type I error rates) and a set of independent
variables (the simulation conditions). Base SAS software, with
PROC LOGISTIC and DATA step functions, can be used to carry
out the DLR analysis.
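The procedure pairs an ordinary logistic fit with leave-one-out refitting: the model is refit n times, each time holding out one observation and predicting it. The sketch below uses a hand-rolled gradient-ascent fit in place of PROC LOGISTIC, on hypothetical toy data (a single predictor standing in for the simulation conditions); it illustrates the L-O-O scheme, not the paper's analysis:

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=500):
    """Gradient-ascent fit of a one-predictor logistic model (intercept + slope)."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p                 # log-likelihood gradient, intercept
            g1 += (y - p) * x           # log-likelihood gradient, slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

def loo_accuracy(xs, ys):
    """Leave-one-out: refit without observation i, then predict it."""
    hits = 0
    for i in range(len(xs)):
        tx, ty = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        b0, b1 = fit_logistic(tx, ty)
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xs[i])))
        hits += (p >= 0.5) == (ys[i] == 1)
    return hits / len(xs)

# Toy data: larger x tends to give outcome 1 (a stand-in for a simulation
# condition versus a good/bad Type I error flag).
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
print(loo_accuracy(xs, ys))
```

The L-O-O accuracy is the model-assessment quantity; the fitted coefficients from the full-data run supply the parameter estimates discussed in the paper.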