Abstract: This paper reviews the present situation and problems of experimental teaching in the mathematics specialty in recent years, and puts forward and demonstrates experimental teaching methods for different levels of education. From the aspects of content and teaching approach, it uses the course “Experiment for Program Designing & Algorithmic Language” as an example and discusses teaching practice and laboratory course work. In addition, a series of successful methods and measures for experimental teaching are introduced.
Abstract: Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing and market liquidity risk agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011, and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to: (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source; (2) improve risk management and governance; and (3) strengthen banks' transparency and disclosures. Similarly, the reforms target: (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions to periods of stress; and (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical amplification of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Regarding the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 percent to -0.15 percent per year. Economic output is mainly affected by an increase in bank lending spreads, as banks pass a rise in funding costs, due to higher capital requirements, on to their customers. The estimated effects on GDP growth assume no active response from monetary policy; the impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises; the lessons learned in overcoming the global financial crisis can thus help address financial crises that may occur in future periods. The first part of the paper examines the effects of the global crisis on the banking system and the concept of financial regulation. The second part analyzes financial regulations, in particular Basel III. The last section explores the possible macroeconomic consequences of Basel III.
Abstract: Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been the central element of money laundering, in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered to be an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible. Hence their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks, as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks, but also has the potential to adversely affect the operations of banks. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. It must be said that effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by banks and thereby to ensure effective implementation of the laws in combating money laundering.
Abstract: Echocardiography imaging is one of the most common diagnostic tests widely used for assessing abnormalities of regional heart ventricle function. The main goal of the image enhancement task in 2D-echocardiography (2DE) is to address two major problems: speckle noise and low image quality. Speckle noise reduction is therefore an important pre-processing step used to reduce distortion effects in 2DE image segmentation. In this paper, we present common filters based on some form of low-pass spatial smoothing, such as the Mean, Gaussian, and Median filters; the Laplacian filter was used as a high-pass sharpening filter. A comparative analysis was presented to test the effectiveness of these filters after being applied to original 2DE images of 4-chamber and 2-chamber views. Three statistical quantity measures, root mean square error (RMSE), peak signal-to-noise ratio (PSNR) and signal-to-noise ratio (SNR), are used to evaluate the filter performance quantitatively on the output enhanced image.
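The three quantitative measures named in the abstract (RMSE, PSNR, SNR) have standard definitions. The sketch below illustrates them; the function names and the 8-bit peak value of 255 are our assumptions, not details from the paper:

```python
import numpy as np

def rmse(original, filtered):
    """Root mean square error between the original and filtered image."""
    diff = original.astype(float) - filtered.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def psnr(original, filtered, peak=255.0):
    """Peak signal-to-noise ratio in dB (higher means less distortion)."""
    e = rmse(original, filtered)
    return float('inf') if e == 0 else 20.0 * np.log10(peak / e)

def snr(original, filtered):
    """Signal-to-noise ratio in dB: signal power over error power."""
    signal = original.astype(float)
    noise = signal - filtered.astype(float)
    return 10.0 * np.log10(np.sum(signal ** 2) / np.sum(noise ** 2))
```

A filter that removes speckle without blurring structure should raise PSNR and SNR and lower RMSE relative to the noisy input.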
Abstract: Jordan has exerted many efforts to nurture its academically gifted students in special schools since 2001. During the past nine years since the launch of these schools, their learning and excellence environments were believed to be distinguished compared to those of public schools. This study investigated the environments of gifted students compared with those of non-gifted students, using a survey instrument that measures the dimensions of family, peers, teachers, school support, society, and resources, dimensions rooted deeply in supporting gifted education, learning, and achievement. A total of 109 students were selected from excellence schools for academically gifted students, and 119 non-gifted students were selected from public schools. Around 8.3% of the non-gifted students reported that they “Never” received any support from their surrounding environments, 14.9% reported “Seldom” support, 23.7% reported “Often” support, 26.0% reported “Frequent” support, and 32.8% reported “Very frequent” support. The gifted students reported “Never” support more often than the non-gifted did, with 11.3%, “Seldom” support with 15.4%, “Often” support with 26.6%, and “Frequent” support with 29.0%, and reported “Very frequent” support less often than the non-gifted students, with 23.6%. Unexpectedly, statistical differences were found between the two groups favoring the non-gifted students in the perception of their surrounding environments in specific dimensions, namely school support, teachers, and society. No statistical differences were found in the other dimensions of the survey, namely family, peers, and resources. As the differences were found in teachers, school support, and society, the nurturing environments of the excellence schools need to be revised to adopt more creative teaching styles, richer school atmospheres and infrastructures, interactive guidance for the students and their parents, promotion of the excellence environments, and rebuilt identification models. Thus, families, schools, and society should increase their cooperation, communication, and awareness of environments that support the gifted. Further studies investigating other aspects of promoting academic giftedness and excellence are recommended.
Abstract: Given that entrepreneurship is a very significant factor in regional development, it is necessary to approach its development systematically with regional policy measures. According to the international classification, the Nomenclature of Territorial Units for Statistics (NUTS II), there are three regions in Croatia. The indicators of entrepreneurial activity at the national level of Croatia are analyzed in the paper, taking into consideration the results of the referenced research. The level of regional development is shown based on an analysis of entrepreneurs' operations. The results of the analysis show a very unfavorable situation in entrepreneurial activity at the national level of Croatia. The origin of this situation is to be found in an environment of pronounced inequality in regional development, caused by the absence of a strategically directed regional policy. In this paper, recommendations which could contribute to the reduction of regional inequality in Croatia have been made.
Abstract: When architecting an application, key nonfunctional requirements such as performance, scalability, availability and security, which influence the architecture of the system, are sometimes not adequately addressed. Performance of the application may not be looked at until there is a concern. There are several problems with this reactive approach: if the system does not meet its performance objectives, the application is unlikely to be accepted by the stakeholders. This paper suggests an approach to performance modeling for web-based J2EE and .NET applications that addresses performance issues early in the development life cycle. It also includes a performance modeling case study, with Proof-of-Concept (PoC) and implementation details for the .NET and J2EE platforms.
Abstract: Diabetes mellitus (DM) is frequently characterized by
autonomic nervous dysfunction. Analysis of heart rate variability
(HRV) has become a popular noninvasive tool for assessing the
activities of autonomic nervous system (ANS). In this paper, changes
in ANS activity are quantified by means of frequency and time
domain analysis of R-R interval variability. Electrocardiograms
(ECG) of 16 patients suffering from DM and of 16 healthy volunteers
were recorded. Frequency domain analysis of the extracted normal-to-normal interval (NN interval) data indicates significant differences in very low frequency (VLF) power, low frequency (LF) power and high frequency (HF) power between the DM patients and the control group. Time domain measures, namely the standard deviation of NN intervals (SDNN), the root mean square of successive NN interval differences (RMSSD), the number of successive NN intervals differing by more than 50 ms (NN50 count), the percentage value of the NN50 count (pNN50), the HRV triangular index and the triangular interpolation of NN intervals (TINN), also show significant differences between the DM patients and the control group.
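Of the time-domain measures listed, SDNN, RMSSD, NN50 and pNN50 have simple closed-form definitions. A minimal sketch (the function name and any sample NN series are illustrative assumptions, not data from the study):

```python
import numpy as np

def time_domain_hrv(nn_ms):
    """Standard time-domain HRV measures from NN intervals given in ms."""
    nn = np.asarray(nn_ms, dtype=float)
    diffs = np.diff(nn)                      # successive interval differences
    sdnn = np.std(nn, ddof=1)                # overall variability
    rmssd = np.sqrt(np.mean(diffs ** 2))     # short-term variability
    nn50 = int(np.sum(np.abs(diffs) > 50))   # successive diffs exceeding 50 ms
    pnn50 = 100.0 * nn50 / len(diffs)        # NN50 as a percentage of pairs
    return {'SDNN': sdnn, 'RMSSD': rmssd, 'NN50': nn50, 'pNN50': pnn50}
```

The geometric measures (HRV triangular index, TINN) additionally require a histogram of the NN intervals and are omitted from this sketch.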
Abstract: This paper presents a positive analysis to quantitatively grasp the relationships among vulnerability, information security incidents, and countermeasures, using data from a 2007 questionnaire survey of Japanese ISPs (Internet Service Providers). Logistic regression analysis is used to grasp the relationships. The results clarify that there are relationships between information security incidents and the countermeasures: concretely, there is a positive relationship between information security incidents and the number of information security systems introduced, as well as a negative relationship between information security incidents and information security education. It is also pointed out that ISPs (especially local ones) do not execute efficient information security countermeasures/investments concerning systems, and it is suggested that they should actively pursue information security education. In addition, to further raise the information security level of the Japanese telecommunication infrastructure, the necessity and importance of government policy to support the countermeasures of ISPs are emphasized.
Abstract: The European countries that during the past two decades based their exchange rate regimes on a currency board arrangement (CBA) are usually analysed from the perspective of the stabilisation effects of this corner-solution choice. There is an open discussion on the positive and negative background of a strict exchange rate regime choice, although it should be seen as part of the transition process towards monetary union membership. The focus of the paper is on the Baltic countries that, after two decades of a rigid exchange rate arrangement and strongly influenced by the global crisis, are finishing their path towards the euro zone. Besides its stabilising capacity, the CBA is a highly vulnerable regime with limited development potential. The rigidity of the exchange rate (and monetary) system, despite the ensured credibility, does not leave enough (or any) space for adjustment and/or active crisis management. Still, the Baltics are in a process of recovery, with fiscal consolidation measures combined with (painful and politically unpopular) measures of internal devaluation. Today, two of them (Estonia and Latvia) are members of the euro zone, fulfilling their ultimate transition targets, but de facto exchanging one fixed regime for another.
The paper analyses the challenges for the CBA in an unstable environment, since fixed regimes rely on imported stability and are sensitive to external shocks. With limited monetary instruments, these countries were oriented towards fiscal policies and used a combination of internal devaluation and tax policy measures. Despite their rather quick recovery, our second goal is to analyse the long-term influence that these measures had on the national economies.
Abstract: There are very complex communication systems, such as the Multi-Function Array Radar (MFAR), in which many functions are integrated together: the classic tracking and surveillance functions are performed simultaneously with functions related to communication, countermeasures, and calibration. All these functions are divided into tasks to be executed. The task scheduler is a key element of the radar, since it plans and distributes the energy and time resources to be shared and used by all tasks. This paper presents schedulers based on the use of multiple queues. Several schedulers have been designed and studied, and a comparative analysis of the different schedulers has been made. The tests and experiments have been carried out by means of software system simulation. Finally, a suitable set of radar characteristics has been selected to evaluate the behavior of the task scheduler.
Abstract: Due to the recovering global economy, enterprises are increasingly focusing on logistics. Investing in logistic measures for production generates a large potential for achieving a good starting point within a competitive field. Unlike during the global economic crisis, enterprises are now challenged with investing available capital to maximize profits. In order to create an informed and quantifiably comprehensible basis for a decision, enterprises need an adequate model for logistically and monetarily evaluating measures in production. The Collaborative Research Centre 489 (SFB 489) at the Institute for Production Systems (IFA) developed a Logistic Information System which provides support in making decisions and is designed specifically for the forging industry. The aim of a follow-up project that has been applied for is to transfer this approach in order to develop a universal method for logistically and monetarily evaluating measures in production.
Abstract: This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank various non-homogeneous Poisson process software reliability models proposed during the past 30 years to estimate software reliability measures such as the number of remaining faults, the software failure rate, and software reliability. Selection of the optimal SRM for use in a particular case has been an area of interest for researchers in the field of software reliability. Tools and techniques for software reliability model selection found in the literature cannot be used with a high level of confidence, as they use a limited number of model selection criteria. A real data set from a middle-sized software project taken from published papers has been used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent value of the criteria matrix formed for each model from the comparison criteria. The software reliability model with the highest value of the permanent is ranked number 1, and so on.
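The ranking quantity here is the matrix permanent, the unsigned analogue of the determinant (a sum over all permutations of products of entries, with every term taken positive). A brute-force sketch, adequate for the small criteria matrices implied here although its cost grows as O(n!·n); how the paper builds each criteria matrix is not specified, so the input below is an assumption:

```python
from itertools import permutations
from math import prod

def permanent(m):
    """Permanent of a square matrix m (a list of equal-length rows):
    the sum over all column permutations of the product of selected entries."""
    n = len(m)
    return sum(prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))
```

Models would then be ranked in decreasing order of `permanent(criteria_matrix)`, the highest permanent taking rank 1.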
Abstract: In this paper we study the fuzzy c-means clustering algorithm combined with the principal components method. Experimental analysis indicates that the new clustering method performs better than some existing clustering algorithms. We also consider the validity of the clustering method.
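The combination described can be sketched as PCA for dimensionality reduction followed by a standard fuzzy c-means loop. Everything below (function names, the fuzzifier m = 2, the fixed iteration count, random initialization) is an illustrative assumption rather than the paper's exact procedure:

```python
import numpy as np

def pca_reduce(X, k):
    """Project centered data onto its top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the fuzzy
    membership matrix U (each row sums to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m                                    # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted cluster means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                   # standard FCM update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U
```

In the combined method, one would cluster `pca_reduce(X, k)` instead of the raw features, so that the distance computations act on the directions of highest variance.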
Abstract: The world's largest Pre-stressed Concrete Cylinder Pipe (PCCP) water supply project experienced a series of pipe failures between 1999 and 2001. This led the Man-Made River Authority (MMRA), the authority in charge of the implementation and operation of the project, to set up a rehabilitation plan for the conveyance system while maintaining the uninterrupted flow of water to consumers. At the same time, MMRA recognized the need for a long-term management tool that would facilitate repair and maintenance decisions and enable taking appropriate preventive measures through continuous monitoring and estimation of the remaining life of each pipe. This management tool is known as the Pipe Risk Management System (PRMS) and is now in operation at MMRA. Both the rehabilitation plan and the PRMS require the availability of complete and accurate pipe construction and manufacturing data.
This paper describes a systematic approach to data collection, analysis, evaluation and correction for the construction and manufacturing data files of the phase I pipes, which are the platform for the PRMS database and any other related decision support system.
Abstract: Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown but useful and significant information from massive volumes of data in databases. Data mining is a stage in the overall KDD process which applies an algorithm to extract interesting patterns. Usually, such algorithms generate a huge volume of patterns. These patterns have to be evaluated using interestingness measures to reflect the user's requirements. Interestingness is defined in different ways: (i) objective measures and (ii) subjective measures. Objective measures such as support and confidence extract meaningful patterns based on the structure of the patterns, while subjective measures such as unexpectedness and novelty reflect the user's perspective. In this report, we review the more widespread and successful subjective measures and propose a new subjective measure of interestingness, i.e., shocking.
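The two objective measures named above, support and confidence, follow directly from their textbook definitions. A sketch (the function names are ours, and transactions are modeled as sets of items):

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item in itemset."""
    items = set(itemset)
    return sum(items <= set(t) for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Support of the whole rule divided by support of its antecedent,
    an estimate of P(consequent | antecedent)."""
    joint = set(antecedent) | set(consequent)
    return support(transactions, joint) / support(transactions, antecedent)
```

Subjective measures such as unexpectedness cannot be written this compactly, since they additionally depend on a model of the user's prior beliefs.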
Abstract: The heuristic decision rules used for project scheduling will vary depending upon the project's size, complexity, duration, personnel, and owner requirements. The concept of project complexity has received little detailed attention. The need to differentiate between easy and hard problem instances, and the interest in isolating the fundamental factors that determine the computing effort required by these procedures, inspired a number of researchers to develop various complexity measures.
In this study, the most common measures of project complexity are presented, and a new measure of project complexity is developed. The main advantage of the proposed measure is that it considers size, shape and logic characteristics, time characteristics, resource demand and availability characteristics, as well as the number of critical activities and critical paths. The sensitivity of the proposed measure to the complexity of project networks has been tested and evaluated against the other complexity measures for the fifty project networks considered in this study. The developed measure showed more sensitivity to changes in the network data and gives accurate quantified results when comparing the complexities of networks.
Abstract: Knowledge Discovery in Databases (KDD) is the process of extracting previously unknown, hidden and interesting patterns from a huge amount of data stored in databases. Data mining is a stage of the KDD process that aims at selecting and applying a particular data mining algorithm to extract interesting and useful knowledge. It is highly expected that data mining methods will find interesting patterns in databases according to some measures. It is of vital importance to define good measures of interestingness that would allow the system to discover only the useful patterns. Measures of interestingness are divided into objective and subjective measures. Objective measures are those that depend only on the structure of a pattern and can be quantified by using statistical methods, while subjective measures depend on the subjectivity and understandability of the user who examines the patterns. These subjective measures are further divided into actionable, unexpected and novel. A key issue facing the data mining community is how to take action on the basis of discovered knowledge. For a pattern to be actionable, the user's subjectivity is captured by providing his or her background knowledge about the domain. Here, we consider the actionability of the discovered knowledge as a measure of interestingness and raise important issues which need to be addressed to discover actionable knowledge.
Abstract: Organizational innovation favors technological
innovation, but does it also influence technological innovation
persistence? This article investigates empirically the pattern of
technological innovation persistence and tests the potential impact of
organizational innovation using firm-level data from three waves of
the French Community Innovation Surveys. Evidence shows a
positive effect of organizational innovation on technological
innovation persistence, according to various measures of
organizational innovation. Moreover, this impact is more significant
for complex innovators (i.e., those who innovate in both products and
processes). These results highlight the complexity of managing organizational practices with regard to the firm's technological innovation. They also add to the understanding of the drivers of innovation persistence, through a focus on an often forgotten dimension of innovation in a broader sense.
Abstract: Information theory and statistics play an important role in the biological sciences when information measures are used for the study of diversity and equitability. In this communication, we develop the link among the three disciplines and prove that sampling distributions can be used to develop new information measures. Our study is interdisciplinary and will find its applications in biological systems.
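Two of the standard information measures used for diversity and equitability in biology are the Shannon index and Pielou's evenness. A sketch under the natural-log convention (the function names are ours; the abstract does not specify which measures the paper builds on):

```python
import math

def shannon_diversity(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def equitability(counts):
    """Pielou's evenness J' = H' / ln(S), where S is the number of species
    present; it equals 1 when all species are equally abundant."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)
```

For example, a community of four equally abundant species has H' = ln 4 and J' = 1, while any unevenness lowers both values.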