Abstract: Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms, and accuracy in particular has mostly been used as the measure of global performance in this field. However, this metric correlates poorly with human perception and has other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. This similarity function is based on characterizing the vascular tree as a connected structure with a measurable area and length. Tests indicate that this new approach behaves better than the current one. More generally, this concept of measuring descriptive properties may be used to design functions that more successfully measure the segmentation quality of other complex structures.
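A minimal sketch (not the proposed similarity function) of how accuracy is computed from the binary contingency table, illustrating the deficiency mentioned above: because vessels occupy only a small fraction of a fundus image, even an empty segmentation scores highly. The 10% vessel fraction below is an illustrative assumption.

```python
import numpy as np

def accuracy(segmentation, ground_truth):
    """Pixel-wise accuracy: (TP + TN) / total pixels."""
    seg = segmentation.astype(bool)
    gt = ground_truth.astype(bool)
    tp = np.sum(seg & gt)
    tn = np.sum(~seg & ~gt)
    return (tp + tn) / gt.size

# Hypothetical example: vessels cover ~10% of the image, so a segmentation
# that detects nothing still scores ~0.90 accuracy.
gt = np.zeros((100, 100), dtype=bool)
gt[:, :10] = True                 # pretend 10% of pixels are vessel
empty = np.zeros_like(gt)
print(accuracy(empty, gt))        # ~0.90 despite missing every vessel
```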
Abstract: This paper presents a novel CMOS four-transistor SRAM cell for very high density and low power embedded SRAM applications as well as for stand-alone SRAM applications. The cell retains its data through leakage current and positive feedback, without a refresh cycle. The new cell is 20% smaller than a conventional six-transistor cell using the same design rules. The proposed cell also uses two word-lines and one pair of bit-lines. The read operation is performed from one side of the cell and the write operation from the other side, and the voltage swing on the word-lines is reduced, so the dynamic power during read/write operations is reduced. The fabrication process is fully compatible with high-performance CMOS logic technologies, because there is no need to integrate a poly-Si resistor or a TFT load. HSPICE simulations in a standard 0.25 μm CMOS technology confirm all the results presented in this paper.
Abstract: System testing is performed on the entire system against the Functional Requirement Specification and/or the System Requirement Specification. Moreover, it is an investigatory testing phase, where the focus is to adopt an almost destructive attitude and test not only the design, but also the behaviour and even the believed expectations of the customer. It is also intended to test up to and beyond the bounds defined in the software/hardware requirements specifications. At Motorola®, Automated Testing is one of the testing methodologies used by GSG-iSGT (Global Software Group - iDEN™ Subscriber Group-Test) to increase testing volume and productivity and to reduce test cycle-time in iDEN™ phone testing. Testing is thus able to produce more robust products before release to the market. In this paper, iHopper is proposed as a tool to perform stress tests on iDEN™ phones. We discuss the value that automation has brought to iDEN™ phone testing, such as improving software quality in the iDEN™ phone, together with some metrics. We also look into the advantages of the proposed system and discuss future work.
Abstract: In this paper a study on the vibration of thin
cylindrical shells with ring supports and made of functionally graded
materials (FGMs) composed of stainless steel and nickel is presented.
Material properties vary along the thickness direction of the shell according to a volume fraction power law. The cylindrical shells have ring supports which are arbitrarily placed along the shell and impose zero lateral deflections. The study is based on third-order shear deformation shell theory (TSDT), and the analysis is carried out using Hamilton's principle. The governing equations of motion of FGM cylindrical shells are derived based on this theory. Results are presented on the frequency characteristics, the
influence of ring support position and the influence of boundary
conditions. The present analysis is validated by comparing results
with those available in the literature.
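As a side note, the volume fraction power law referred to above is commonly written as V_f = ((z + h/2)/h)^N. A minimal Python sketch of how an effective material property then varies through the thickness; the constituent moduli, shell thickness, and exponent below are illustrative assumptions, not the paper's data.

```python
def fgm_property(P_inner, P_outer, z, h, N):
    """Effective property at distance z from the mid-surface of a shell of
    thickness h, varying from P_inner at z = -h/2 to P_outer at z = +h/2."""
    Vf = ((z + h / 2.0) / h) ** N          # volume fraction of the outer constituent
    return (P_outer - P_inner) * Vf + P_inner

# Example: Young's modulus through the thickness (assumed round values).
E_nickel, E_steel = 200e9, 208e9           # Pa, illustrative only
h = 0.002                                  # 2 mm thick shell (assumed)
for z in (-h / 2, 0.0, h / 2):
    print(z, fgm_property(E_nickel, E_steel, z, h, N=1.0))
```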
Abstract: Image enhancement is one of the most important and challenging preprocessing steps for almost all applications of image processing. Various methods such as the median filter, the α-trimmed mean filter, etc. have been suggested. It has been shown that the α-trimmed mean filter is a modification of the median and mean filters. On the other hand, ε-filters have shown excellent performance in suppressing noise: in spite of their simplicity, they achieve good results. However, the conventional ε-filter is based on a moving average. In this paper, we suggest a new ε-filter which utilizes the α-trimmed mean. We argue that this new method gives better outcomes compared to previous ones, and the experimental results confirm this claim.
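A minimal 1-D sketch under the common formulation of the ε-filter, in which neighbours differing from the centre sample by more than ε are replaced by the centre value before the local estimate is formed; here the plain average is replaced by an α-trimmed mean, as the abstract proposes. The window size, ε, and α below are illustrative parameters, not the paper's settings.

```python
import numpy as np

def epsilon_trimmed_filter(x, half_window=3, eps=20.0, alpha=0.2):
    """Epsilon filter whose local estimate is an alpha-trimmed mean."""
    x = np.asarray(x, dtype=float)
    y = np.empty_like(x)
    n = len(x)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        window = x[lo:hi].copy()
        window[np.abs(window - x[i]) > eps] = x[i]   # epsilon clipping
        window.sort()
        trim = int(alpha * len(window))              # trim extremes at both ends
        kept = window[trim:len(window) - trim] if len(window) > 2 * trim else window
        y[i] = kept.mean()
    return y
```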
Abstract: This work presents a method for calculating the ductility of rectangular beam sections considering the nonlinear behaviour of concrete and steel. This calculation procedure allows the curvature of the section to be traced as a function of the bending moment, and the ductility to be deduced from it. It also allows us to study the various parameters that affect the value of the ductility. A comparison was made of the effect of the maximum tension steel ratios adopted by the ACI [1], EC8 [2] and RPA [3] codes on the value of the ductility. It was concluded that the maximum steel ratios permitted by the ACI [1] and RPA [3] codes are almost similar in their effect on the ductility and are too high; therefore, the ductility mobilized in the event of an earthquake is low, in contrast to the EC8 [2] code. Recommendations have been made in this direction.
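For reference, the ductility deduced from the moment-curvature relation is conventionally the curvature ductility (the authors' exact convention is not stated in the abstract):

```latex
% Standard definition of curvature ductility:
\mu_{\phi} = \frac{\phi_u}{\phi_y}
% where \phi_y is the curvature at first yield of the tension steel and
% \phi_u is the curvature at the ultimate concrete strain.
```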
Abstract: A challenging problem in radar signal processing is to
achieve reliable target detection in the presence of interferences. In
this paper, we propose a novel algorithm for automatic censoring of
radar interfering targets in log-normal clutter. The proposed
algorithm, termed the forward automatic censored cell averaging
detector (F-ACCAD), consists of two steps: removing the corrupted
reference cells (censoring) and the actual detection. Both steps are
performed dynamically by using a suitable set of ranked cells to
estimate the unknown background level and set the adaptive
thresholds accordingly. The F-ACCAD algorithm requires neither prior
information about the clutter parameters nor knowledge of the number of
interfering targets. The effectiveness of the F-ACCAD
algorithm is assessed by computing, using Monte Carlo simulations,
the probability of censoring and the probability of detection in
different background environments.
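An illustrative sketch of the forward censoring idea: the reference cells are ranked, an initial set of small cells is accepted, and each subsequent cell is accepted only if it stays below an adaptive threshold built from the cells already accepted. The threshold form and constants below are placeholders and do not reproduce the paper's log-normal design.

```python
import numpy as np

def forward_censoring(reference_cells, initial_k=8, T=2.0):
    """Accept ranked cells one by one against an adaptive threshold; the
    remaining (larger) cells are censored as likely interfering targets."""
    ranked = np.sort(np.asarray(reference_cells, dtype=float))
    accepted = list(ranked[:initial_k])
    for cell in ranked[initial_k:]:
        if cell <= T * np.mean(accepted):
            accepted.append(cell)
        else:
            break                      # remaining larger cells are censored
    return np.array(accepted)

# The detection step would then compare the cell under test against a second
# adaptive threshold computed from the censored (clean) reference set.
```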
Abstract: The article is devoted to Kazakh repatriates and their migration to Kazakhstan as their historical homeland, and also addresses the problem of the migrants' adaptation in the republic, particularly in Almaty oblast (region). The authors used up-to-date statistics and materials of the Department of the Migration Committee to analyze the newcomers' numbers and the features of the repatriates' location in this oblast. Having studied this region, they were able to identify the main reasons why the Kazakh diaspora in Central Asia, Iran, Afghanistan and Turkey is eager to come back to its historic homeland, along with the repatriates' adaptation to the republic.
Abstract: Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular method of smoothing data; a generalization of this led to the development of the Savitzky-Golay filter. Many window smoothing methods were developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also gives smoothed data. Wavelets also smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy the peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology called peak-wise smoothing that smooths the data to any desired level without losing the major peak features.
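The proposed peak-wise smoothing method itself is not reproduced here; the following sketch only illustrates the trade-off described above, comparing a moving average with a Savitzky-Golay filter of the same window length on a synthetic narrow peak.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 500)
signal = np.exp(-((x - 0.5) / 0.01) ** 2)            # narrow peak
noisy = signal + 0.05 * rng.standard_normal(x.size)

window = 51
moving_avg = np.convolve(noisy, np.ones(window) / window, mode="same")
savgol = savgol_filter(noisy, window_length=window, polyorder=3)

# The moving average flattens the peak the most at this window length.
print(signal.max(), moving_avg.max(), savgol.max())
```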
Abstract: Since the 1980s, banks and financial service institutions have been running an endless race of innovation to cope with advancing technology, fierce competition, and more sophisticated and demanding customers. In order to guide their innovation efforts, several studies were conducted to identify the success and failure factors of new financial services. These mainly included organizational factors, marketplace factors and new service development process factors. They almost all emphasized the importance of customer and market orientation as a response to the highly perceptual and intangible characteristics of financial services. However, they de-emphasized the critical characteristics of high risk involvement and close correlation with economic conditions, factors that heavily contributed to the global financial crisis of 2008. This paper reviews the success and failure factors of new financial services. It then adds new perspectives emerging from the analysis of the role of innovation in the global financial crisis.
Abstract: Higher-order statistics (HOS), also known as cumulants, cross moments and their frequency-domain counterparts, known as polyspectra, have emerged as a powerful signal processing tool for the synthesis and analysis of signals and systems. Algorithms used for the computation of cross moments are computationally intensive and require high computational speed for real-time applications. For efficiency and high speed, it is often advantageous to realize computation-intensive algorithms in hardware. A promising solution that combines high flexibility with the speed of traditional hardware is the Field Programmable Gate Array (FPGA). In this paper, we present an FPGA-based parallel architecture for the computation of third-order cross moments. The proposed design is coded in Very High Speed Integrated Circuit (VHSIC) Hardware Description Language (VHDL) and functionally verified by implementing it on a Xilinx Spartan-3 XC3S2000FG900-4 FPGA. Implementation results are presented and show that the proposed design can operate at a maximum frequency of 86.618 MHz.
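As a software reference for the quantity being computed (not the VHDL architecture itself), the third-order cross moment m3(tau1, tau2) = E[x(n) y(n+tau1) z(n+tau2)] can be estimated as follows.

```python
import numpy as np

def third_order_cross_moment(x, y, z, tau1, tau2):
    """Biased sample estimate of m3(tau1, tau2) = E[x(n) * y(n+tau1) * z(n+tau2)]."""
    x, y, z = (np.asarray(s, dtype=float) for s in (x, y, z))
    N = len(x)
    # Keep only indices n for which n, n+tau1 and n+tau2 all fall inside the record.
    n = np.arange(max(0, -tau1, -tau2), min(N, N - tau1, N - tau2))
    return np.sum(x[n] * y[n + tau1] * z[n + tau2]) / N
```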
Abstract: The permanent magnet synchronous motor (PMSM) is very useful in many applications, and vector control is a popular way of controlling it. In this paper, an optimal vector control for the PMSM is first designed and the results are compared with conventional vector control. Then, it is assumed that the measurements are noisy, and the linear quadratic Gaussian (LQG) methodology is used to filter the noise. The results of the noisy optimal vector control and the filtered optimal vector control are compared with each other. The nonlinearity of the PMSM and the presence of the inverter in its control circuit make the system nonlinear and time-variant. By deriving an average model, the system is changed to a nonlinear time-invariant one, and the nonlinear system is then converted to a linear system by linearizing the model around the average values. This model is used to optimize the vector control, and the two optimal vector controls are then compared with each other. Simulation results show that the performance and noise robustness of the control system are greatly improved.
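Once the average model has been linearized, the optimal state-feedback gain follows from a standard LQR computation. A generic sketch with placeholder matrices (not the PMSM model of the paper) is shown below; in the LQG setting, this gain would be combined with a Kalman estimator for the noisy measurements.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Placeholder linearized average model x_dot = A x + B u (illustrative only).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])     # state weighting (design choice)
R = np.array([[0.1]])        # input weighting (design choice)

P = solve_continuous_are(A, B, Q, R)    # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)         # optimal state-feedback gain, u = -K x
print(K)
```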
Abstract: While compressing text files is useful, compressing still image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression images would be extremely clumsy to store and distribute. The amount of information required to store pictures on modern computers is quite large in relation to the amount of bandwidth commonly available to transmit them over the Internet. Image compression addresses the problem of reducing the amount of data required to represent a digital image. The performance of any image compression method can be evaluated by measuring the root-mean-square error (RMSE) and the peak signal-to-noise ratio (PSNR). The method of image compression analyzed in this paper is based on the lossy JPEG image compression technique, the most popular compression technique for color images. JPEG compression is able to greatly reduce file size with minimal image degradation by throwing away the least "important" information. In JPEG, both chroma components are downsampled simultaneously, but in this paper we compare the results when compression is done by downsampling only a single chroma component. We demonstrate that a higher compression ratio is achieved when the blue chrominance is downsampled than when the red chrominance is downsampled in JPEG compression, but the peak signal-to-noise ratio is higher when the red chrominance is downsampled rather than the blue. In particular, we use hats.jpg as a demonstration of JPEG compression using a low-pass filter and show that the image is compressed with barely any visual difference under both methods.
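A sketch of the comparison idea: starting from a YCbCr image, downsample only Cb or only Cr and compare the PSNR against the original. The 2x2 averaging and the PSNR formula are standard choices assumed here; the paper's exact JPEG settings and the hats.jpg image are not reproduced.

```python
import numpy as np

def psnr(original, reconstructed):
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)

def downsample_channel(channel):
    """2x2 average then nearest-neighbour upsample (assumes even height/width)."""
    h, w = channel.shape
    small = channel.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)

def compare_chroma_downsampling(ycbcr):
    """ycbcr: H x W x 3 array with Y, Cb, Cr planes (obtained elsewhere)."""
    cb_only = ycbcr.copy()
    cb_only[..., 1] = downsample_channel(ycbcr[..., 1])   # downsample Cb only
    cr_only = ycbcr.copy()
    cr_only[..., 2] = downsample_channel(ycbcr[..., 2])   # downsample Cr only
    return psnr(ycbcr, cb_only), psnr(ycbcr, cr_only)
```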
Abstract: A mammography image is composed of low-contrast areas where the breast tissues and breast abnormalities such as microcalcifications can hardly be differentiated by the medical practitioner. This paper presents the application of active contour models (snakes) for the segmentation of microcalcifications in mammography images. A comparison of the microcalcification areas segmented by the Balloon Snake, Gradient Vector Flow (GVF) Snake, and Distance Snake is made against the true value of the microcalcification area. The true area value is the average microcalcification area in the original mammography image traced by expert radiologists. From fifty images tested, the results obtained show that the accuracy of the Balloon Snake, GVF Snake, and Distance Snake in segmenting the boundaries of microcalcifications is 96.01%, 95.74%, and 95.70%, respectively. This implies that the Balloon Snake is a better segmentation method for locating the exact boundary of a microcalcification region.
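The Balloon, GVF, and Distance snakes compared in the paper are not available off the shelf in scikit-image, but a classic Kass-style snake can illustrate how a contour is evolved around a candidate microcalcification; the circular initialisation and the weights below are illustrative assumptions.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def segment_with_snake(image, center_rc, radius):
    """Evolve a classic Kass-style snake from a circular initialisation
    centred at (row, col) = center_rc on a 2-D grayscale mammogram patch."""
    s = np.linspace(0, 2 * np.pi, 200)
    init = np.column_stack([center_rc[0] + radius * np.sin(s),
                            center_rc[1] + radius * np.cos(s)])
    smoothed = gaussian(image, sigma=1.0, preserve_range=True)
    return active_contour(smoothed, init, alpha=0.015, beta=10.0, gamma=0.001)
```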
Abstract: Intellectual capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in allowing an effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of intellectual capital. Following this lead, we propose an efficiency and productivity analysis of intellectual capital as a determinant factor of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify best-practice companies that have achieved competitive advantage by implementing successful strategies of intellectual capital management, and offer inefficient companies development paths by means of benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
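A minimal sketch of the DEA side of the analysis: the input-oriented, constant-returns-to-scale (CCR) efficiency of a decision-making unit expressed as a linear program, with hypothetical input/output data. The paper's exact DEA model and the Malmquist index computation are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0.
    X: inputs (m x n), Y: outputs (s x n); columns are DMUs."""
    m, n = X.shape
    s, _ = Y.shape
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    b_in = np.zeros(m)
    # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0   (i.e. Y @ lambda >= y_j0)
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, j0]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun   # efficiency score theta* in (0, 1]

# Hypothetical example: 2 inputs, 1 output, 4 companies (columns).
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(4)])
```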
Abstract: Artifact-free photoplethysmographic (PPG) signals are necessary for the non-invasive estimation of oxygen saturation (SpO2) in arterial blood. Movement of a patient corrupts the PPGs with motion artifacts, resulting in large errors in the computation of SpO2. This paper presents a study on using the Kalman filter in an innovative way, by modeling both the arterial blood pressure (ABP) and the unwanted signal, the additive motion artifact, to reduce motion artifacts in corrupted PPG signals. Simulation results show acceptable performance compared with LMS and variable-step LMS, thus establishing the efficacy of the proposed method.
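A minimal scalar Kalman filter sketch showing the predict/update recursion; the paper's joint model of the ABP waveform and the additive motion artifact is not reproduced here.

```python
import numpy as np

def scalar_kalman(z, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a random-walk state model, applied to the
    measurement sequence z; q and r are process and measurement variances."""
    x, p = x0, p0
    out = np.empty(len(z))
    for k, zk in enumerate(z):
        p = p + q                    # predict: state unchanged, variance grows
        K = p / (p + r)              # Kalman gain
        x = x + K * (zk - x)         # update with the new measurement
        p = (1.0 - K) * p
        out[k] = x
    return out
```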
Abstract: Roundabouts work on the principle of circulating and entry flows, where the maximum entry flow rates depend largely on the circulating flow, bearing in mind that entry flows must give way to circulating flows. Where an existing roundabout has a road hump installed at the entry arm, it can be hypothesized that the kinematics of vehicles may prevent the entry arm from achieving optimum performance. Road humps are traffic calming devices placed across the road width solely as a speed-reduction mechanism. They are the preferred traffic calming option in Malaysia and are often used on single and dual carriageway local routes. The speed limit on local routes is 30 mph (50 km/h). Road humps in their various forms achieved the biggest mean speed reduction (based on a mean speed before traffic calming of 30 mph) of up to 10 mph, or 16 km/h, according to the UK Department of Transport. The underlying aim of reduced speed should be to achieve a 'safe' distribution of speeds which reflects the function of the road and the impacts on the local community. Constraining the safe distribution of speeds may lead to poor driver timing and delayed reflex reactions that can cause accidents. Previous studies on road hump impact have focused mainly on speed reduction, traffic volume, noise and vibrations, discomfort and delay from the use of road humps. This paper is aimed at the optimal entry and circulating flows induced by road humps. Results show that roundabout entry and circulating flows perform better in circumstances where there is no road hump at the entrance.
Abstract: The main focus of this work is the hydrodynamic and thermal analysis of a plate heat exchanger channel with corrugation patterns suggested to be triangular, sinusoidal, and square. The study was to numerically model and validate the triangular corrugated channel with dimensions/parameters taken from the open literature, and then to model and analyze the sinusoidal and square corrugated channels with reference to the triangular model. Initially, 2D modeling with local extensive analysis of the triangular corrugated channel was carried out. In this way, the local pressure drop, wall shear stress, friction factor, static temperature, heat flux, Nusselt number, and surface heat transfer coefficient were all analyzed to interpret the hydrodynamic and thermal phenomena occurring in the flow. Furthermore, in order to build confidence in this model, a comparison was made between the predicted values and experimental results taken from the literature for almost the same case. Moreover, a holistic numerical study of the sinusoidal and square channels, together with global comparisons with the triangular corrugation under the same conditions, was carried out. Later, a comparison between electric and fluid cooling was made by varying the boundary condition: the constant wall temperature and constant wall heat flux boundary conditions were employed, and the resulting differences in Nusselt number were justified. The results obtained can be used to arrive at an optimal design, a 'compromise' between heat transfer and pressure drop.
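For reference, the standard definitions behind the reported thermal and hydrodynamic quantities are given below; the Fanning convention for the friction factor is an assumption, since the abstract does not state which convention is used.

```latex
Nu = \frac{h\,D_h}{k}, \qquad
h = \frac{q''_w}{T_w - T_b}, \qquad
f_{\mathrm{Fanning}} = \frac{\tau_w}{\tfrac{1}{2}\,\rho\,u_m^{2}}
% D_h: hydraulic diameter, k: fluid thermal conductivity, q''_w: wall heat flux,
% T_w, T_b: wall and bulk temperatures, \tau_w: wall shear stress,
% \rho, u_m: fluid density and mean channel velocity.
```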
Abstract: In this paper, we consider the almost periodic solutions of a discrete cooperation system with feedback controls. Assuming that the coefficients in the system are almost periodic sequences, we obtain the existence and uniqueness of the almost periodic solution which is uniformly asymptotically stable.
Abstract: Saudi Arabia has in recent years seen a drastic increase in traffic-related crashes. With a population of over 29 million, Saudi Arabia is considered a fast-growing and emerging economy. The rapid population increase and economic growth have resulted in rapid expansion of the transportation infrastructure, which has led to an increase in road crashes. The Saudi Ministry of Interior reported more than 7,000 people killed and 68,000 injured in 2011, ranking Saudi Arabia among the worst worldwide in traffic safety. The traffic safety issues in the country also cause distress to road users and an economic loss exceeding 3.7 billion Euros annually. Keeping this in view, researchers in Saudi Arabia are investigating ways to improve traffic safety conditions in the country. This paper presents a multilevel approach to collecting the traffic safety related data required for traffic safety studies in the region. Two highway corridors, the 39-kilometre King Fahd Highway and the 42-kilometre Gulf Cooperation Council Highway, connecting the cities of Dammam and Khobar, were selected as the study area. The traffic data collected included traffic counts, crash data, travel time data, and speed data. The collected data was analysed using a geographic information system to evaluate any correlations. Further research is needed to investigate the effectiveness of traffic safety related data when collected in a concerted effort.