Abstract: In this paper, biannual time series of unemployment rates (from the Labour Force Survey) are expanded to quarterly rates and linked to quarterly unemployment rates (from the Quarterly Labour Force Survey). The resulting linked series and the consumer price index (CPI) series are examined using Johansen's cointegration approach and vector error correction modelling. The study finds that both series are integrated of order one and that a statistically significant cointegrating relationship exists between the unemployment rate series and the CPI. Given this significant relationship, the study fits two Vector Error Correction Models (VECM), one with a restriction on the deterministic term and one without.
A formal statistical confirmation of a unique linear and lagged relationship between inflation and unemployment for the period from September 2000 to June 2011 is presented. For this period, the CPI was found to be an unbiased predictor of the unemployment rate. This relationship can be explored further to develop forecasting models incorporating other study variables.
Abstract: To strengthen the capital market, there is a need to integrate the capital markets within the region by removing legal and informal restrictions, specifically through stock market liberalization. This paper therefore investigates the effects of subsequent stock market liberalization on stock market integration in four ASEAN countries (Malaysia, Indonesia, Thailand, Singapore) and Korea from 1997 to 2007. The relation between stock market liberalization and stock market integration is examined by analyzing stock prices and returns within the region and in comparison with the world MSCI index. An event study method is used, with windows of ±12 months and T-7 + T. The results show that subsequent stock market liberalization generally has minor positive effects on stock returns, except in one or two countries. The subsequent liberalization also integrates the markets in both the short run and the long run.
Abstract: In this study we present a formative assessment tool we developed for students' assignments. The tool enables lecturers to define assignments for a course and to assign each problem in each assignment a list of criteria and weights by which the students' work is evaluated. During assessment, the lecturers enter the scores for each criterion together with justifications. Once the scores of the current assignment are fully entered, the tool automatically generates reports for both students and lecturers. Each student receives a report by email that includes a detailed description of the assessed work, the student's relative score, and progress across the criteria along the course timeline. This information is presented via charts generated automatically by the tool from the entered scores. The lecturers receive a report that includes summative data (e.g., averages, standard deviations) and detailed data (e.g., histograms) for the current assignment, enabling them to follow class achievements and adjust the learning process accordingly. The tool was examined on two pilot groups of college students taking courses in (1) Object-Oriented Programming and (2) Plane Geometry. Results reveal that most of the students were satisfied with the assessment process and the reports produced by the tool. The lecturers who used the tool were also satisfied with the reports and their contribution to the learning process.
Abstract: Phishing, the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new approach to phishing detection that systematically establishes whether a given page is a phishing copy, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, which is indicative of malicious alteration. The system design represents an intelligent system employing dynamic assessment that accurately identifies brand-new phishing attacks and promises to be effective in reducing the number of false positives. This framework could also potentially serve as a knowledge base for educating Internet users about phishing.
Abstract: This study proposes a conceptual model and empirically tests the relationships between the service quality dimensions perceived by customers of librarians (i.e., tangibles, responsiveness, assurance, reliability and empathy) and a dependent variable, customer satisfaction with library services. The SERVQUAL instrument was administered to 100 respondents, comprising staff and students at a public higher learning institution in the Federal Territory of Labuan, Malaysia, all of whom were public university library users. Results revealed that all service quality dimensions tested were significant and influenced the customer satisfaction of visitors to a public university library. Assurance is the most important factor influencing customer satisfaction with the services rendered by the librarians. It is imperative for library management to note that the top five service attributes that gained the greatest attention from the library visitors' perspective are: employee willingness to help customers, availability of customer representatives online to respond to queries, library staff actively and promptly providing services, clear signage in the building, and friendly and courteous library staff. This study provides valuable results concerning the determinants of service quality and customer satisfaction with public university library services from the users' perspective.
Abstract: Recent trends in building construction in Libya lean increasingly toward tall (high-rise) building projects. As a consequence, better estimation of lateral loading in the design process is becoming the focus of a safe and cost-effective building industry. By and large, Libya is not considered a potential earthquake-prone zone, making wind the dominant design lateral load. Current design practice in the country estimates wind speeds on a largely arbitrary basis, applying a chosen factor of safety to the selected wind speed. The need for a more accurate estimation of wind speeds in Libya was therefore the motivation behind this study. Records of wind speed data were collected from 22 meteorological stations in Libya and statistically analysed. The analysis of more than four decades of wind speed records suggests that the country can be divided into four zones of distinct wind speeds. A computer "survey" program was used to draw a design wind speed contour map for the state of Libya.
The paper presents the statistical analysis of Libya's recorded wind speed data and proposes design wind speed values for a 50-year return period covering the entire country.
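A common way to derive a 50-year return-period design speed from station records, consistent with the analysis described above, is a Gumbel extreme-value fit to annual maximum wind speeds. The sketch below uses synthetic annual maxima; the paper's station data and fitted zone values are not reproduced.

```python
# Hedged sketch: fit a Gumbel (extreme value type I) distribution to annual
# maximum wind speeds and read off the 50-year return-period design speed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# 45 years of synthetic annual maxima (m/s), standing in for a station record
annual_max = stats.gumbel_r.rvs(loc=25.0, scale=4.0, size=45, random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max)
# A 50-year return period corresponds to an annual non-exceedance
# probability of 1 - 1/50 = 0.98
v50 = stats.gumbel_r.ppf(1 - 1 / 50, loc=loc, scale=scale)
print(f"estimated 50-year design wind speed: {v50:.1f} m/s")
```

Repeating this fit per station and contouring the resulting speeds is one plausible route to the zoning map the abstract describes.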
Abstract: The so-called all-pass filter circuits are commonly used in the fields of signal processing, control and measurement. When connected to capacitive loads, these circuits tend to lose their stability; an elaborate analysis of their dynamic behavior is therefore necessary. Compensation methods intended to increase the stability of such circuits are discussed in this paper, with the so-called lead-lag compensation technique treated in detail. For the dynamic modeling, a two-port network model of the all-pass filter is derived. The results of the model analysis show that effective lead-lag compensation can be achieved solely by optimizing the circuit parameters; no additional electric components are therefore needed to fulfill the stability requirement.
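As a generic illustration of the lead-lag idea discussed above (not the paper's derived two-port model), a standard lead compensator C(s) = (1 + aTs)/(1 + Ts) with a > 1 adds phase margin near the crossover frequency. The parameter values below are placeholders.

```python
# Frequency response of a textbook lead compensator, to illustrate the phase
# lead that a lead-lag network contributes to stability margins.
import numpy as np
from scipy import signal

a, T = 10.0, 1e-4                                   # illustrative values
comp = signal.TransferFunction([a * T, 1.0], [T, 1.0])
w, mag, phase = signal.bode(comp, w=np.logspace(2, 6, 200))
print("maximum phase lead (deg):", phase.max())
```

The peak phase lead is arcsin((a-1)/(a+1)), about 55 degrees for a = 10, occurring at the geometric mean of the two corner frequencies.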
Abstract: Complex engineering design problems consist of numerous factors of varying criticality. Considering fundamental design features and minor details alike results in an extensive waste of time and effort. Design parameters should instead be introduced gradually, as appropriate, based on their significance in the problem context. This motivates representing design parameters at multiple levels of an abstraction hierarchy; however, the development of abstraction hierarchies is an area that is not well understood. Our research proposes a novel hierarchical abstraction methodology for planning effective engineering designs and processes. It provides a theoretically sound foundation to represent, abstract and stratify engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction hierarchies in a recursive, bottom-up approach that guarantees no backtracking across any of the abstraction levels. It consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the developed methodology is demonstrated on a design problem.
Abstract: Speedups from mapping four real-life DSP applications onto an embedded system-on-chip that couples coarse-grained reconfigurable logic with an instruction-set processor are presented. The reconfigurable logic is realized by a 2-Dimensional Array of Processing Elements. A design flow for improving application performance is proposed. Critical software parts, called kernels, are accelerated on the Coarse-Grained Reconfigurable Array. The kernels are detected by profiling the source code. For mapping the detected kernels onto the reconfigurable logic, a priority-based mapping algorithm has been developed. Two 4x4 array architectures, which differ in the interconnection structure among their Processing Elements, are considered. Experiments on eight different instances of a generic system show significant overall application speedups for the four applications. The performance improvements range from 1.86 to 3.67, with an average value of 2.53, compared with an all-software execution. These speedups are quite close to the maximum theoretical speedups imposed by Amdahl's law.
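The comparison with Amdahl's law above can be made concrete: if a fraction f of the runtime is accelerated by a factor s, the overall speedup is 1 / ((1 - f) + f / s), with an upper bound of 1 / (1 - f). The fractions used below are illustrative, not profiling results from the paper.

```python
# Amdahl's-law speedup calculation, as used to bound the reported speedups.
def amdahl(f, s):
    """Overall speedup when fraction f of runtime is sped up by factor s."""
    return 1.0 / ((1.0 - f) + f / s)

# e.g. kernels covering 70% of runtime, accelerated 10x on the array
print(amdahl(0.7, 10.0))          # -> about 2.70
# theoretical maximum as the kernel speedup s grows without bound
print(1.0 / (1.0 - 0.7))          # -> about 3.33
```

Note how quickly the achievable speedup saturates: even an infinitely fast array cannot beat 1/(1 - f), which is why the reported 1.86-3.67 range is said to be close to the theoretical maximum.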
Abstract: Loess soils are unsaturated soils that undergo a large decrease in volume upon saturation, exhibiting sudden settlement caused by increased moisture, fracturing and structural cracking. Given the importance of civil projects such as dams, canals and structures founded on this type of soil, and the problems it causes, further research and study of loess soils is required. This research studies shear strength parameters using grading, Atterberg limit, compaction, direct shear and consolidation tests, and then studies the effect of cement and lime additives on the stabilization of loess soils. In the related tests, lime and cement are separately added to the soil in different percentages, the stabilized samples are cured for different periods, and the effect of these additives on the shear strength parameters of the soil is studied. Results show that with the passage of time the collapse potential is greatly decreased, and that with increasing percentages of cement and lime the maximum dry density decreases while the optimum moisture content increases. In addition, the liquid limit and plasticity index decrease while the plastic limit increases. Notably, the results of the direct shear tests reveal an increase in the shear strength of the soil due to increases in the cohesion parameter and the soil friction angle.
Abstract: Independent component analysis (ICA) is a computational method for finding underlying signals or components in multivariate statistical data. The ICA method has been successfully applied in many fields, e.g. vision research, brain imaging, geological signals and telecommunications. In this paper, we apply the ICA method to the analysis of mass spectra of oligomeric species formed from aluminium sulphate. Mass spectra are typically complex because they are linear combinations of the spectra of different types of oligomeric species. The results show that ICA can decompose the spectra into components carrying useful information. This information is essential in developing the coagulation phases of water treatment processes.
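The decomposition described above can be sketched with FastICA: observed spectra are modelled as linear mixtures of source spectra, and ICA recovers the sources (up to scale and order). The spectra below are synthetic Gaussian peaks, not the paper's aluminium sulphate data.

```python
# Sketch: unmixing synthetic "mass spectra" that are linear combinations of
# two source spectra, using FastICA from scikit-learn.
import numpy as np
from sklearn.decomposition import FastICA

x = np.linspace(0, 1, 500)                      # m/z axis (arbitrary units)
s1 = np.exp(-((x - 0.3) ** 2) / 0.002)          # species 1 source spectrum
s2 = np.exp(-((x - 0.7) ** 2) / 0.002)          # species 2 source spectrum
S = np.c_[s1, s2]

A = np.array([[1.0, 0.5], [0.4, 1.0], [0.7, 0.8]])   # mixing matrix
X = S @ A.T                                          # three observed spectra

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)    # estimated source spectra, up to scale/order
print(S_est.shape)
```

Each column of `S_est` should isolate one peak, mirroring how ICA separates the spectral signatures of distinct oligomeric species.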
Abstract: In this paper a new definition of the adjacency matrix of a simple graph, called the fuzzy adjacency matrix, is presented: its elements are either 0 or of the form 1/n, n ∈ ℕ, and thus lie in the interval [0, 1]. Some characteristics of this matrix are then presented, with related examples. This matrix form carries complete information about the graph.
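As a purely illustrative construction (the abstract does not specify which natural number n each entry uses, so the rule below, 1/deg(i) for an edge incident to vertex i, is an assumption), a matrix with entries of the form 0 or 1/n can be built like this:

```python
# Hypothetical fuzzy adjacency matrix: nonzero entries are 1/deg(i), so every
# entry is either 0 or of the form 1/n with n a natural number, hence in [0, 1].
# This scaling rule is an assumption for illustration, not the paper's definition.
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]     # a simple undirected graph
n_vertices = 4
adj = np.zeros((n_vertices, n_vertices))
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0              # ordinary 0/1 adjacency matrix

deg = adj.sum(axis=1)                        # vertex degrees
fuzzy = np.where(adj > 0, 1.0 / deg[:, None], 0.0)   # row i scaled by 1/deg(i)
print(fuzzy)
```

The 0/1 adjacency matrix is recoverable from the nonzero pattern, so this form indeed retains complete information about the graph.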
Abstract: The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic-energy equation. The two-phase flow field, obtained numerically using a Galerkin finite element methodology, relates the local flow velocity and concentration to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for particle size dependence. Predicted wear rates from the three turbulence models are compared for a large number of cases spanning operating parameters such as rotation rate, solids concentration, flow rate and particle size distribution. The root-mean-square error between the FE-generated data and a correlation relating maximum wear rate to the operating parameters is found to be less than 2.5% for all three models.
Abstract: In this paper, a new probability density function (pdf) is proposed to model the statistics of wavelet coefficients, and a simple Kalman filter is derived from the new pdf using Bayesian estimation theory. Specifically, we decompose the speckled image into wavelet subbands, apply the Kalman filter to the high-frequency subbands, and reconstruct a despeckled image from the modified detail coefficients. Experimental results demonstrate that our method compares favorably to several other despeckling methods on test synthetic aperture radar (SAR) images.
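The decompose/filter/reconstruct pipeline described above can be sketched with PyWavelets. The paper's pdf-derived Kalman filter is replaced here by plain soft-thresholding of the detail coefficients, purely to illustrate the wavelet-domain structure of the method; the image, wavelet and threshold are placeholders.

```python
# Wavelet-domain despeckling pipeline sketch: decompose a speckled image,
# shrink the detail subbands (standing in for the paper's Kalman filter),
# and reconstruct the despeckled image.
import numpy as np
import pywt

rng = np.random.default_rng(3)
clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))       # toy "scene"
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)

coeffs = pywt.wavedec2(speckled, "db2", level=2)
approx, details = coeffs[0], coeffs[1:]

thr = 0.1                                                 # illustrative threshold
filtered = [approx]
for level in details:
    filtered.append(tuple(pywt.threshold(d, thr, mode="soft") for d in level))

despeckled = pywt.waverec2(filtered, "db2")
print(despeckled.shape)
```

Any coefficient-domain estimator, including the Bayesian Kalman filter the paper derives, slots into the loop in place of the thresholding step.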
Abstract: In this work, we study the impact of dynamically changing link slowdowns on the stability properties of packet-switched networks under the Adversarial Queueing Theory framework. In particular, we consider the Adversarial, Quasi-Static Slowdown Queueing Theory model, where each link slowdown may take on values in the two-valued set of integers {1, D} with D > 1, values that remain fixed for a long time, under a (w, p)-adversary. In this framework, we present an innovative systematic construction for estimating lower bounds on the adversarial injection rate which, if exceeded, cause instability in networks that use the LIS (Longest-in-System) protocol for contention resolution. In addition, we show that a network using the LIS protocol for contention resolution may see its instability bound drop toward injection rates p > 0 when the network size and the high slowdown D take large values. This is the best instability lower bound known for LIS networks.
Abstract: Estimation of voltage stability based on an optimal filtering method is presented. The PV curve is used as a tool for voltage stability analysis, and dynamic voltage stability estimation is carried out using a particle filter method. The optimum value (nose point) of the PV curve can be estimated by estimating the parameters of the PV curve equation; this optimal value represents the critical voltage and maximum loading condition at a specified point of measurement. Voltage stability is then estimated dynamically by analyzing the loading margin condition.
Abstract: A steady two-dimensional magnetohydrodynamic flow with heat transfer over a stretching vertical sheet, influenced by radiation and porosity, is studied. The governing boundary layer equations, which are partial differential equations, are reduced to a system of ordinary differential equations using a similarity transformation. The system is solved numerically using a finite difference scheme known as the Keller-box method for various values of the parameters, namely the radiation parameter N, magnetic parameter M, buoyancy parameter λ, Prandtl number Pr and permeability parameter K. The effects of these parameters on the heat transfer characteristics are analyzed and discussed. It is found that both the skin friction coefficient and the local Nusselt number decrease as the magnetic parameter M and permeability parameter K increase, and that the heat transfer rate at the surface decreases as the radiation parameter increases.
Abstract: In recent years multimedia traffic, and in particular VoIP services, has grown dramatically. We present a new algorithm to control resource utilization and to optimize voice codec selection during SIP call setup, based on the traffic conditions estimated on the network path.
The most suitable methodologies and tools for real-time evaluation of the available bandwidth on a network path have been integrated with our proposed algorithm, which selects the best codec for a VoIP call as a function of the instantaneous available bandwidth on the path. The algorithm does not require any explicit feedback from the network, which makes it easily deployable over the Internet. We have also performed intensive tests on real network scenarios with a software prototype, verifying the algorithm's efficiency with different network topologies and traffic patterns between two SIP PBXs.
The promising results obtained during the experimental validation of the algorithm are now the basis for extension toward a larger set of multimedia services and for integrating our methodology with existing PBX appliances.
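The core selection step can be sketched as a simple policy: pick the highest-quality codec whose per-call bandwidth requirement fits the measured available bandwidth. The codec list and the nominal per-call rates below (payload plus typical RTP/UDP/IP overhead) are illustrative assumptions, not the paper's table or policy.

```python
# Hypothetical bandwidth-aware codec selection for a SIP/VoIP call.
# Rates are nominal per-call figures in kbit/s including packet overhead;
# the list is ordered best quality first. Both are assumptions for illustration.
CODECS = [
    ("G.711", 87.2),
    ("G.726-32", 55.2),
    ("G.729", 31.2),
    ("G.723.1", 21.9),
]

def select_codec(available_kbps):
    """Return the first (best-quality) codec that fits the path bandwidth."""
    for name, rate in CODECS:
        if rate <= available_kbps:
            return name
    return None  # no codec fits; the call should be rejected or deferred

print(select_codec(60.0))
```

In the scheme the abstract describes, `available_kbps` would come from a real-time available-bandwidth estimator run over the network path at call setup.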
Abstract: This case study was conducted to show the effect of a goat milking method called half-day milking on milk production and the growth of kids. Data were collected by interviewing farmers and observing goat production in a communal goat house from June 2008 to May 2009. The interviews were conducted to collect data about goat management. Observations were conducted on 10 goats, selected for uniformity of age, number of kids born per goat, and the milking method in practice. The goats were divided into two groups: full three-month nursing, and half-day milking (in this group the kids were separated from the does during the night before milking and then allowed to suckle during the day). The results showed that the communal goat house held 138 goats and that 25% of the farmers milked their goats. The implementation of half-day milking increased milk production significantly (P
Abstract: Medical negligence disputes in Malaysia are mainly resolved through litigation under the tort system. The tort system, being adversarial in nature, subjects parties to litigation hazards such as delay, excessive costs and uncertainty of outcome. Dissatisfaction with the tort system's compensation of medically injured victims has given rise to various alternatives to litigation. Among them is the implementation of a no-fault compensation system, which would allow compensation to be awarded without the need to prove fault on the part of the medical personnel. Instead, the community bears the burden of compensation, ultimately promoting collective responsibility. For Malaysia, introducing a no-fault system would provide a tempting solution and may ultimately achieve justice for medically injured victims. Nevertheless, such a drastic change requires a great deal of consideration to determine the suitability of the system and whether it will eventually cater to the needs of the Malaysian population.