Abstract: Basel III (or the Third Basel Accord) is a global
regulatory standard on bank capital adequacy, stress testing and
market liquidity risk agreed upon by the members of the Basel
Committee on Banking Supervision in 2010-2011, and scheduled to
be introduced from 2013 until 2018. Basel III is a comprehensive set
of reform measures. These measures aim to: (1) improve the banking
sector's ability to absorb shocks arising from financial and economic
stress, whatever the source; (2) improve risk management and
governance; and (3) strengthen banks' transparency and disclosures.
Similarly, the reforms target: (1) bank-level, or micro-prudential,
regulation, which will help raise the resilience of individual banking
institutions to periods of stress; and (2) macro-prudential regulation of
system-wide risks that can build up across the banking sector, as well
as the pro-cyclical amplification of these risks over time. These two
approaches to supervision are complementary, as greater resilience at
the individual bank level reduces the risk of system-wide shocks.
Macroeconomic impact of Basel III: the OECD estimates that the
medium-term impact of Basel III implementation on GDP growth is
in the range of -0.05 to -0.15 percent per year. Economic output is
mainly affected by an increase in bank lending spreads, as banks pass
the rise in funding costs caused by higher capital requirements on to
their customers. The estimated effects on GDP growth assume no
active response from monetary policy; the impact of Basel III on
economic output could be offset by a reduction (or delayed increase)
in monetary policy rates of about 30 to 80 basis points. The aim of
this paper is to create a framework based on the recent regulations in
order to prevent financial crises.
Thus, the measures developed to overcome the global financial crisis
may also contribute to preventing financial crises in future periods.
The first part of the paper examines the effects of the global crisis on
the banking system and the concept of financial regulation. The
second part analyzes financial regulations, in particular Basel III. The
last section explores the possible macroeconomic impacts of Basel III.
Abstract: Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been a central element of money laundering. This is in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered to be an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible. Hence, their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks, as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks, but also has the potential to adversely affect their operations. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. It must be said that effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by banks and thereby to ensure effective implementation of the laws in combating money laundering.
Abstract: We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first uses only a single data point, while the other uses two, to keep the realization cost as low as possible. An approximation error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator's accuracy without increasing the complexity of the associated hardware. Architectures for the proposed approaches are also developed, which exhibit implementation flexibility with low power requirements.
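The idea of evaluating sin(x) on [-π, π] with piecewise cubics can be sketched with a generic cubic Hermite interpolant, which matches sin and its derivative cos at each knot. This is only an illustrative stand-in: the segment count, the Hermite construction, and the error check below are assumptions, not the paper's optimized low-complexity scheme.

```python
import numpy as np

# Piecewise-cubic (Hermite) interpolation of sin(x) on [-pi, pi]:
# each segment matches sin and cos at both knots. The 16-segment
# grid is an illustrative choice, not the proposed hardware design.

def hermite_sin(x, n_segments=16):
    knots = np.linspace(-np.pi, np.pi, n_segments + 1)
    idx = np.clip(np.searchsorted(knots, x) - 1, 0, n_segments - 1)
    x0, x1 = knots[idx], knots[idx + 1]
    h = x1 - x0
    t = (x - x0) / h
    y0, y1 = np.sin(x0), np.sin(x1)
    m0, m1 = np.cos(x0) * h, np.cos(x1) * h     # scaled endpoint slopes
    h00 = 2 * t**3 - 3 * t**2 + 1               # Hermite basis functions
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * y0 + h10 * m0 + h01 * y1 + h11 * m1

xs = np.linspace(-np.pi, np.pi, 10_001)
max_err = np.max(np.abs(hermite_sin(xs) - np.sin(xs)))
```

For 16 segments the theoretical error bound is h⁴/384 ≈ 6.2e-5, so even a modest knot count already gives good precision, which is why spline schemes are attractive for low-cost hardware evaluation.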
Abstract: Embedded systems need to respect stringent real-time
constraints. Various hardware components included in such systems,
such as cache memories, exhibit variability and therefore affect
execution time. Indeed, a cache memory access from an embedded
microprocessor might result in a cache hit, where the data is
available, or a cache miss, where the data must be fetched from an
external memory with an additional delay. It is therefore
highly desirable to predict future memory accesses during
execution in order to appropriately prefetch data without incurring
delays. In this paper, we evaluate the potential of several artificial
neural networks for the prediction of instruction memory
addresses. Neural networks have the potential to capture the nonlinear
behavior observed in memory accesses during program execution,
and their numerous demonstrated hardware implementations favor
this choice over traditional forecasting techniques for inclusion in
embedded systems. However, embedded applications execute
millions of instructions and therefore produce millions of addresses
to be predicted. This very challenging problem of neural-network-based
prediction of large time series is approached in this paper by
evaluating various neural network architectures based on the
recurrent neural network paradigm, with pre-processing based on the
Self-Organizing Map (SOM) classification technique.
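As a sketch of the SOM pre-processing stage, the snippet below quantizes a stream of instruction-address deltas with a minimal one-dimensional Self-Organizing Map before any recurrent prediction would take place. The map size, learning schedule, and synthetic delta stream are illustrative assumptions, not the configurations evaluated in the paper.

```python
import numpy as np

# Minimal 1-D Self-Organizing Map used to quantize instruction-address
# deltas into a small codebook, so a recurrent predictor can work on
# class labels instead of raw addresses. All parameters are illustrative.

def train_som(samples, n_units=8, epochs=50, lr0=0.5, radius0=2.0, seed=0):
    rng = np.random.default_rng(seed)
    weights = rng.uniform(samples.min(), samples.max(), n_units)
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                  # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in samples:
            bmu = np.argmin(np.abs(weights - x))         # best matching unit
            d = np.abs(np.arange(n_units) - bmu)         # grid distance to BMU
            h = np.exp(-(d ** 2) / (2 * radius ** 2))    # neighborhood function
            weights += lr * h * (x - weights)
    return np.sort(weights)

# Synthetic address deltas: a loop mostly advancing by +4 bytes with
# occasional branches back by -64 (made-up trace, not real program data).
deltas = np.array([4.0] * 90 + [-64.0] * 10)
codebook = train_som(deltas)
labels = np.argmin(np.abs(deltas[:, None] - codebook[None, :]), axis=1)
```

The resulting label sequence is what a recurrent network would consume; the SOM reduces the huge address space to a handful of classes, which is the point of the pre-processing step described above.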
Abstract: Falling has been one of the major concerns and threats
to the independence of the elderly in their daily lives. With the
significant worldwide growth of the aging population, it is essential
to have a promising solution for fall detection which is able to
operate at high accuracy in real time and supports large-scale
implementation using multiple cameras. The Field Programmable
Gate Array (FPGA) is a highly promising tool for use as a hardware
accelerator in many emerging embedded vision-based systems. Thus,
the main objective of this paper is to present an FPGA-based solution
for visual fall detection that meets stringent real-time requirements
with high accuracy. A hardware architecture for visual fall detection
which utilizes pixel locality to reduce memory accesses is proposed.
By exploiting the parallel and pipelined architecture of the FPGA,
our hardware implementation of visual fall detection is able to
achieve a performance of 60 fps for a series of video analytic
functions at VGA resolution (640x480). The results of this work
show that FPGAs have great potential and impact in enabling
large-scale vision systems in the future healthcare industry due to
their flexibility and scalability.
Abstract: Research in quantum computation is looking into the consequences of having information encoding, processing and communication exploit the laws of quantum physics, i.e. the laws which govern the ultimate knowledge that we have, today, of the strange world of elementary particles, as described by quantum mechanics. This paper starts with a short survey of the principles which underlie quantum computing, and of some of the major breakthroughs brought by the first ten to fifteen years of research in this domain; quantum algorithms and quantum teleportation are very briefly presented. The next sections are devoted to one among the many directions of current research in the quantum computation paradigm, namely quantum programming languages and their semantics. A few other hot topics and open problems in quantum information processing and communication are mentioned in a few words in the concluding remarks, the most difficult of them being the physical implementation of a quantum computer. The interested reader will find a list of useful references at the end of the paper.
Abstract: This paper presents an application of level sets for the segmentation of abdominal and thoracic aortic aneurysms in CTA
datasets. An important challenge in reliably detecting aortic
aneurysms is the need to overcome problems associated with intensity
inhomogeneities. Level sets are part of an important class of methods
that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the
level set formulation aids the suppression of noise in the extracted
regions of interest and then guides the motion of the evolving contour
for the detection of weak boundaries. The speed of curve evolution
has been significantly improved, with a resulting decrease in
segmentation time compared with previous implementations of level
sets, and the method is shown to be more effective than other
approaches in coping with intensity inhomogeneities. We have
applied the Courant-Friedrichs-Lewy (CFL) condition as the stability
criterion for our algorithm.
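The CFL stability criterion mentioned above bounds the time step of an explicit level-set update so the front never moves more than one grid cell per step: dt ≤ C·dx/max|F| for a CFL number C ≤ 1. The helper below is a minimal sketch of that rule; the speed-field values and grid spacing are illustrative, not from the paper's datasets.

```python
import numpy as np

# CFL-limited time step for an explicit level-set evolution:
# dt <= C * dx / max|F|, where F is the curve-evolution speed field.

def cfl_time_step(speed, dx, cfl_number=0.5):
    """Largest stable time step for an explicit level-set update."""
    max_speed = np.max(np.abs(speed))
    if max_speed == 0:
        raise ValueError("speed field is zero everywhere")
    return cfl_number * dx / max_speed

speed = np.array([[0.5, 1.0], [2.0, 0.25]])  # illustrative speed field F
dt = cfl_time_step(speed, dx=1.0)
```

Because the fastest point of the front dictates the step, speeding up curve evolution (as the abstract reports) directly reduces the number of iterations needed, and hence segmentation time.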
Abstract: The ever-growing sentiment of environmentalism across the globe has made many people think along green lines. But most such ideas stop short of implementation because of the short-term economic viability issues with the concept of going green. In this paper we have tried to amalgamate the green concept with social entrepreneurship to solve a variety of issues faced by society today. In addition, the paper also tries to ensure that short-term economic viability does not act as a deterrent. The paper comes up with three sustainable models of social entrepreneurship which tackle a wide assortment of issues such as nutrition problems, land problems, pollution problems and employment problems. The models described fall under the following heads: - Spirulina cultivation: This model addresses nutrition, land and employment issues. It deals with the cultivation of a blue-green alga called Spirulina, which can be used as a very nutritious food. Also, the implementation of this model would provide employment to the poor people of the area. - Biocomposites: This model comes up with various avenues in which biocomposites can be used in an economically sustainable manner. It deals with environmental concerns and addresses the depletion of natural resources. - Packaging material from empty fruit bunches (EFB) of oil palm: This one deals with air and land pollution. It is intended as a substitute for packaging materials made from Styrofoam and plastics, which are non-biodegradable. It takes care of the biodegradability and land pollution issues. It also reduces air pollution, as the empty fruit bunches are not incinerated. All three models are sustainable and do not deplete natural resources any further. This paper explains each of the models in detail and deals with the operational/manufacturing procedures and cost analysis, while also throwing light on the benefits derived and sustainability aspects.
Abstract: Visualizing sound and noise often helps us to determine
appropriate control over source localization. Near-field acoustic
holography (NAH) is a powerful tool for this ill-posed problem.
However, in practice, due to the small finite aperture size, the
discrete Fourier transform (FFT) based NAH cannot predict the
active region of interest (AROI) near the edges of the plane. A few
approaches have been proposed for solving the finite aperture
problem theoretically. However, most of these methods are not well
suited to practical implementation, especially near the edges of the
source. In this paper, a zip-stuffing extrapolation approach with a 2D
Kaiser window is suggested. It operates in the complex wavenumber
space to localize the predicted sources. We numerically construct a
test environment with touch impact databases to test the localization
of the sound source. It is observed that the zip-stuffing aperture
extrapolation and 2D window with evanescent components provide
greater accuracy, especially for small apertures.
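A separable 2D Kaiser window of the kind used to taper a finite hologram aperture before FFT-based NAH processing can be built as the outer product of two 1D Kaiser windows. The window size and beta value below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

# Separable 2-D Kaiser window: outer product of two 1-D Kaiser windows.
# Tapering the measured pressure field with such a window reduces the
# edge discontinuities that corrupt the wavenumber spectrum of a small
# finite aperture. Size (64x64) and beta=8.0 are illustrative choices.

def kaiser2d(rows, cols, beta=8.0):
    return np.outer(np.kaiser(rows, beta), np.kaiser(cols, beta))

w = kaiser2d(64, 64)
tapered = w * np.ones((64, 64))   # apply to a (here dummy) pressure field
```

Larger beta values push the window's sidelobes down at the cost of a wider main lobe, which is the usual trade-off when suppressing aperture-edge leakage in the wavenumber domain.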
Abstract: In this paper, we present the effect of varying time delays
on performance and stability in a single-channel multi-rate
sampled-data system in a hard real-time (RT-Linux) environment.
The sampling task requires response times that might exceed the
capacity of RT-Linux, so a straightforward implementation in
RT-Linux is not feasible because of system latency; hence, the
sampling period should be kept small to handle this task. The best
sampling rate chosen for the sampled-data system is the slowest rate
that meets all performance requirements. RT-Linux is consistent with
its specifications, and a real-time resolution of 0.01 seconds is used
to achieve an efficient result. The results of our laboratory
experiment show that the multi-rate control technique in a hard
real-time operating system (RTOS) can improve the stability
problems caused by random access delays and asynchronization.
Abstract: Data Mining aims at discovering knowledge out of
data and presenting it in a form that is easily comprehensible to
humans. One useful application in Egypt is cancer management,
especially the management of Acute Lymphoblastic Leukemia
(ALL), which is the most common type of cancer in children.
This paper discusses the process of designing a prototype that can
help in the management of childhood ALL, which has great
significance in the health care field. Besides, it has a social impact
on decreasing the rate of infection among children in Egypt. It also
provides valuable information about the distribution and
segmentation of ALL in Egypt, which may be linked to possible
risk factors.
Undirected knowledge discovery is used since, in the case of this
research project, there is no target field, as the data provided is
mainly subjective. This is done in order to quantify the subjective
variables. Therefore, the computer is asked to identify significant
patterns in the provided medical data about ALL. This is achieved
by collecting the data necessary for the system, determining the data
mining technique to be used, and choosing the most suitable
implementation tool for the domain.
The research makes use of a data mining tool, Clementine, to apply
the decision tree technique. We feed it with data extracted from
real-life cases taken from specialized cancer institutes. Relevant
details of medical cases, such as patient medical history and
diagnosis, are analyzed, classified, and clustered in order to improve
disease management.
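The core of the decision tree technique the prototype applies is choosing splits by information gain. The toy sketch below shows that step in plain Python; the real system uses the Clementine tool, and the features, cases, and labels here are made-up illustrations, not data from the cancer institutes.

```python
import math
from collections import Counter

# Entropy-based split selection (the heart of ID3/C4.5-style decision
# trees): pick the feature whose split yields the largest information
# gain over the class labels.

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(cases, labels, feature):
    total = entropy(labels)
    remainder = 0.0
    for value in {c[feature] for c in cases}:
        subset = [l for c, l in zip(cases, labels) if c[feature] == value]
        remainder += len(subset) / len(labels) * entropy(subset)
    return total - remainder

# Hypothetical case records: feature names and values are illustrative.
cases = [
    {"age_group": "0-5", "wbc": "high"},
    {"age_group": "0-5", "wbc": "high"},
    {"age_group": "6-10", "wbc": "normal"},
    {"age_group": "6-10", "wbc": "high"},
]
labels = ["high_risk", "high_risk", "low_risk", "high_risk"]

best = max(["age_group", "wbc"], key=lambda f: information_gain(cases, labels, f))
```

Applied recursively to each resulting subset, this split rule grows the full tree; tools like Clementine automate exactly this loop over the medical case attributes.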
Abstract: Empirical studies on High Performance Work Systems (HPWSs) and their impact on firm performance have been remarkably scarce in developing countries. This paper reviews the literature on HPWS practices in different work settings in Western and Asian countries. A review of the empirical research leads to the conclusion that country differences influence Human Resource Management (HRM) practices. It is anticipated that there are similarities and differences in the extent of implementation of HPWS practices by Malaysian manufacturing firms due to organizational contextual factors, and that HPWSs have a significant impact on firm performance amongst MNCs and local firms.
Abstract: Due to the mobility of users, many information
systems are now developed with the capability of supporting retrieval
of information from both static and mobile users. Hence, the amount,
content, and format of the information retrieved need to be tailored
to the device and the user who requested it.
Thus, this paper presents a framework for the design and
implementation of such a system, which is to be developed for
communicating final examination related information to the
academic community at one university in Malaysia. The concept of
personalization will be implemented in the system so that only highly
relevant information will be delivered to the users. The
personalization concept used will be based on user profiling as well
as context. The system in its final state will be accessible through cell
phones as well as intranet-connected personal computers.
Abstract: To fight against the economic crisis, the French
Government, like many others in Europe, has decided to give a boost
to high-speed line projects. This paper explores the implementation
and decision-making process in TGV projects, their evolutions,
especially since the Mediterranean TGV-line. This project was
probably the most controversial, but paradoxically represents today a
huge success for all the actors involved.
What lessons can we learn from this experience? How can we
evaluate the impact of this project on TGV-line planning? How can
we characterize this implementation and decision-making process
with regard to the sustainability challenges?
The construction of the Mediterranean TGV-line was the occasion
for several innovations: introducing more dialogue into the
decision-making process, taking the environment into account, and
introducing new project management methods and technological
innovations. That is why this project appears today as an example of
the integration of sustainable development.
In this paper we examine the different kinds of innovations
developed in this project, by using concepts from sociology of
innovation to understand how these solutions emerged in a
controversial situation. Then we analyze the lessons which were
drawn from this decision-making process (in the immediacy and a
posteriori) and the way in which procedures evolved: creation of new
tools and devices (public consultation, project management...).
Finally, we try to highlight the impact of this evolution on the
governance of TGV projects. In particular, new methods of implementation
and financing involve a reconfiguration of the system of actors. The
aim of this paper is to define the impact of this reconfiguration on
negotiations between stakeholders.
Abstract: When architecting an application, key non-functional requirements such as performance, scalability, availability and security, which influence the architecture of the system, are sometimes not adequately addressed. Performance of the application may not be looked at until there is a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications that addresses performance issues early in the development life cycle. It also includes a performance modeling case study, with proof-of-concept (PoC) and implementation details for the .NET and J2EE platforms.
Abstract: The purpose of this study is to identify the critical success factors (CSFs) for the effective implementation of Six Sigma in non-formal service sectors.
Based on a survey of the literature, the critical success factors (CSFs) for Six Sigma have been identified and assessed for their importance in the non-formal service sector using the Delphi technique. The selected CSFs were put to a panel of experts to cluster them and prepare a cognitive map establishing their relationships.
All the critical success factors examined and obtained from the review of the literature have been assessed for their importance with respect to their contribution to Six Sigma effectiveness in the non-formal service sector.
The study is limited to the non-formal service sectors involved in the organization of religious festivals only. However, a similar exercise can be conducted for a broader sample of other non-formal service sectors, such as temple/ashram management, religious tour management, etc.
The research suggests an approach to identify the CSFs of Six Sigma for the non-formal service sector. Not all CSFs of the formal service sector are applicable to non-formal services; hence, the opinion of experts was sought to add or delete CSFs. In the first round of Delphi, the panel of experts suggested two new CSFs, "competitive benchmarking (F19)" and "residents' involvement (F28)", which were added for assessment in the next round of Delphi. One of the CSFs, "full-time Six Sigma personnel (F15)", has been omitted from the proposed clusters of CSFs for non-formal organizations, as it is practically impossible to deploy full-time trained Six Sigma recruits.
Abstract: This paper evaluates multilevel modulation for different
techniques such as multilevel amplitude shift keying (M-ASK),
bipolar M-ASK, differential phase shift keying, quaternary amplitude
shift keying (QASK) and quaternary polarization ASK (QPol-ASK)
at a total bit rate of 107 Gbps. The aim is to find a cost-effective
very high speed transport solution. A numerical investigation was
performed using Monte Carlo simulations. The obtained results
indicate that some modulation formats can be operated at 100 Gbps
in optical communication systems with low implementation effort
and high spectral efficiency.
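The Monte Carlo method used to compare the formats can be illustrated with a minimal symbol-error-rate estimate for 4-level ASK. This is only a sketch: the simple additive-Gaussian channel, noise level, and constellation below are illustrative assumptions, whereas the paper simulates optical transmission impairments at 100 Gbps.

```python
import numpy as np

# Monte Carlo symbol-error-rate estimate for 4-ASK over an additive
# Gaussian noise channel with minimum-distance detection. Constellation
# and noise sigma are illustrative, not the paper's optical channel model.

rng = np.random.default_rng(1)
n_symbols = 100_000
levels = np.array([-3.0, -1.0, 1.0, 3.0])      # 4-ASK amplitudes

tx = rng.integers(0, 4, n_symbols)             # random transmitted symbols
noise = rng.normal(0.0, 0.3, n_symbols)        # channel noise, sigma = 0.3
rx = levels[tx] + noise

# Detection: pick the nearest constellation point for each received sample.
detected = np.argmin(np.abs(rx[:, None] - levels[None, :]), axis=1)
ser = np.mean(detected != tx)
```

Sweeping the noise level (or, in the optical case, launch power and fiber impairments) and re-running this loop per modulation format is exactly the kind of numerical comparison the abstract describes.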
Abstract: In spite of all advancement in software testing,
debugging remains a labor-intensive, manual, time-consuming, and
error-prone process. A candidate solution for enhancing the
debugging process is to fuse it with the testing process. To achieve
this integration, a possible approach is to categorize common
software tests and errors, followed by an effort to fix the errors
through general solutions for each test/error pair. Our approach to
this issue is based on Christopher Alexander's pattern and pattern
language concepts.
concepts. The patterns in this language are grouped into three major
sections and connect the three concepts of test, error, and debug.
These patterns and their hierarchical relationship shape a pattern
language that introduces a solution to solve software errors in a
known testing context.
Finally, we introduce our framework ADE as a sample
implementation supporting a pattern of the proposed language, which
aims to automate the whole process of evolving software design via
evolutionary methods.
Abstract: The TELMES project aims to develop a secure multimedia
system devoted to medical consultation teleservices. It will be
finalized with a pilot system for a regional telecenter network that
connects local telecenters, which are supported by
multimedia platforms. This network will enable the implementation
of complex medical teleservices (teleconsultations, telemonitoring,
homecare, emergency medicine, etc.) for a broader range of patients
and medical professionals, mainly for family doctors and those
people living in rural or isolated regions. Thus, a multimedia,
scalable network, based on modern IT&C paradigms, will result. It
will gather two inter-connected regional telecenters, in Iaşi and
Piteşti, Romania, each of them also permitting local connections of
hospitals, diagnostic and treatment centers, as well as local networks
of family doctors, patients, and even educational entities. As the
communications infrastructure, we aim to develop combined
fixed-mobile-internet (broadband) links. Other possible
communication environments will be GSM/GPRS/3G and radio
waves. The electrocardiogram (ECG) acquisition, internet
transmission, and local analysis, using embedded technologies, have
already been successfully implemented for patients' telemonitoring.
Abstract: Noise has an adverse effect on human health and comfort.
Noise not only causes hearing impairment, but also acts as a causal
factor for stress and raised systolic pressure. Additionally, it can be
a causal factor in work accidents, both by masking hazards and
warning signals and by impeding concentration. Industry
workers also suffer psychological and physical stress as well as
hearing loss due to industrial noise. This paper proposes an approach
to enable engineers to point out quantitatively the noisiest source for
modification, while multiple machines are operating simultaneously.
The model with the point source and spherical radiation in a free field
was adopted to formulate the problem. The procedure works very
well in ideal cases (point source and free field). However, most of the
industrial noise problems are complicated by the fact that the noise is
confined in a room. Reflections from the walls, floor, ceiling, and
equipment in a room create a reverberant sound field that alters the
sound wave characteristics from those of the free field. The model
was therefore validated in a relatively low-absorption room at the
NIT Kurukshetra Central Workshop. The validation results showed
that the estimated sound powers of noise sources under simultaneous
operating conditions were on the lower side, within error limits of
3.56-6.35%, suggesting that this methodology is suitable for practical
implementation in industry. To demonstrate the application of the
above analytical procedure for estimating the sound power of noise
sources under simultaneous operating conditions, a manufacturing
facility (Railway Workshop at Yamunanagar, India) having five
sound sources (machines) on its workshop floor is considered in this
study. The findings of the case study identified the two most
effective candidates (noise sources) for noise control in the Railway
Workshop, Yamunanagar, India. The study suggests that modification
of the design and/or replacement of these two identified noisiest
sources (machines) would be necessary to achieve an effective
reduction in noise levels. Further, the estimated data allow
engineers to better understand the noise situations of the workplace
and to revise the map when changes occur in noise level due to a
workplace re-layout.
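For a point source radiating spherically in a free field, the sound power level Lw follows from a sound pressure level Lp measured at distance r (in meters) as Lw = Lp + 20·log10(r) + 11 dB, which is the basic relation behind the model adopted above. The helper below sketches this conversion; the measured levels are made-up illustrations, not data from the case study, and room reflections would require the reverberant-field corrections the paper discusses.

```python
import math

# Free-field, spherical-radiation relation between sound pressure level
# Lp (dB, measured at distance r) and sound power level Lw (dB re 1 pW):
#   Lw = Lp + 20*log10(r) + 11
# The constant 11 dB is 10*log10(4*pi), the full-sphere radiation term.

def sound_power_level(lp_db, r_m):
    """Sound power level (dB re 1 pW) from SPL (dB) at distance r (m)."""
    return lp_db + 20.0 * math.log10(r_m) + 11.0

# A hypothetical machine measured at 85 dB SPL from 2 m:
lw = sound_power_level(85.0, 2.0)
```

Ranking the machines by their estimated Lw under simultaneous operation is what identifies the noisiest candidates for modification or replacement.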