Abstract: The H.264/AVC video coding standard contains a number of advanced features. One of the new features introduced in this standard is multiple intra-mode prediction, which exploits directional spatial correlation with adjacent blocks for intra prediction. With this new feature, intra coding in H.264/AVC offers considerably higher coding efficiency than other compression standards, but computational complexity increases significantly when a brute-force rate-distortion optimization (RDO) algorithm is used. In this paper, we propose a new fast intra prediction mode decision method to reduce the complexity of H.264 video coding. For luma intra prediction, the proposed method consists of two steps. In the first step, we compute the RDO cost for four modes of the intra 4x4 block; based on the distribution of the RDO costs of these modes and on the strong correlation among adjacent modes, we select the best mode of the intra 4x4 block. In the second step, based on the fact that the dominating direction of a smaller block is similar to that of a bigger block, the candidate modes of 8x8 blocks and 16x16 macroblocks are determined. For chroma intra prediction, since the variance of chroma pixel values is much smaller than that of luma values, our method uses only the DC mode. Experimental results show that the new fast intra mode decision algorithm increases the speed of intra coding significantly with negligible loss of PSNR.
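The second step above, reusing the dominant direction of the 4x4 blocks to shortlist candidate modes for the 16x16 macroblock, can be sketched roughly as follows. The mode numbers follow H.264 conventions (4x4: 0 = vertical, 1 = horizontal, 2 = DC; 16x16: 0 = vertical, 1 = horizontal, 2 = DC, 3 = plane), but the candidate-set construction itself is an illustrative assumption, not the paper's exact rule:

```python
from collections import Counter

# Map a directional 4x4 mode to the 16x16 mode with the same direction.
MAP_4x4_TO_16x16 = {0: 0, 1: 1, 2: 2}

def candidate_16x16_modes(best_4x4_modes):
    """Given the best mode chosen for each of the 16 4x4 blocks in a
    macroblock, return a reduced candidate set of 16x16 modes instead
    of testing all four with full RDO."""
    dominant, _ = Counter(best_4x4_modes).most_common(1)[0]
    cands = {2}  # DC is always kept as a cheap fallback
    # Diagonal 4x4 modes have no 16x16 counterpart; fall back to plane.
    cands.add(MAP_4x4_TO_16x16.get(dominant, 3))
    return sorted(cands)

print(candidate_16x16_modes([0, 0, 2, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 2, 0, 0]))
```

Only the shortlisted modes would then go through the RDO cost computation, which is where the complexity saving comes from.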
Abstract: Since the facilities related to water, a vital element, are underground, it is difficult during natural disasters to quickly obtain accurate, precise, and reliable information about water utilities. Therefore, this article reports an operational study carried out in the city of Boukan in Western Azarbaijan, Iran, and presents the applications and capabilities of a Geographical Information System (GIS) in urban water management at the time of natural disasters. The article first establishes a comprehensive database of the water utilities by collecting, entering, storing, and managing the data; it then models the water utilities and examines the practical, operational aspects of water utility problems in urban regions.
Abstract: A master plan is a tool to guide and manage the growth of cities in a planned manner. The soul of a master plan lies in its implementation framework. If it is not implemented, people are trapped in a mess of urban problems and laissez-faire development with serious long-term repercussions. Unfortunately, the Master Plans prepared for several major cities of Pakistan could not be fully implemented due to a host of reasons, and Lahore is no exception. Being the second largest city of Pakistan with a population of over 7 million people, Lahore holds the distinction that the first ever Master Plan in the country was prepared for this city in 1966. Recently, in 2004, a new plan titled 'Integrated Master Plan for Lahore-2021' was approved for implementation. This paper provides a comprehensive account of the weaknesses and constraints in the plan preparation process and implementation strategies of the Master Plans prepared for Lahore. It also critically reviews the new Master Plan, particularly with respect to the proposed implementation framework. The paper discusses the prospects and pre-conditions for successful implementation of the new Plan in the light of historic analysis, interviews with stakeholders, and the new institutional context under the devolution plan.
Abstract: In this paper the behavior of the decision feedback
equalizers (DFEs) adapted by the decision-directed or the constant
modulus blind algorithms is presented. An analysis of the error
surface of the corresponding criterion cost functions is first
developed. With the intention of avoiding the ill-convergence of the
algorithm, the paper proposes to modify the shape of the cost
function error surface by using a soft decision instead of the hard
one. This was shown to reduce the influence of false decisions and to
smooth the undesirable minima. Modified algorithms using the soft
decision during a pseudo-training phase with an automatic switch to
the proper tracking phase are then derived. Computer simulations
show that these modified algorithms present better ability to avoid
local minima than conventional ones.
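One common way to realize the soft decision the abstract describes is to replace the hard slicer sign(y) with a smooth sigmoid such as tanh; the specific nonlinearity and its steepness parameter below are illustrative assumptions, not necessarily the paper's exact choice:

```python
import numpy as np

def hard_decision(y):
    """Conventional hard slicer for +/-1 symbols."""
    return np.sign(y)

def soft_decision(y, beta=2.0):
    """Smooth slicer: tanh(beta*y) approaches sign(y) as beta grows.
    beta is an illustrative steepness parameter."""
    return np.tanh(beta * y)

# During the pseudo-training phase the equalizer feeds back
# soft_decision(y); once the error is small, it can switch
# automatically to hard_decision(y) for the tracking phase.
y = np.array([-1.2, -0.1, 0.05, 0.9])
print(hard_decision(y))
print(soft_decision(y))
```

Because tanh is bounded and smooth, tentative (small-magnitude) equalizer outputs are fed back with reduced weight, which is what limits the influence of false decisions on the cost-function surface.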
Abstract: In today's modern world, the number of vehicles on the road is increasing. This leads more people to choose walking instead of traveling by vehicle. Thus, proper planning of pedestrian paths is important to ensure the safety of pedestrians in a walking area. Crowd dynamics studies pedestrian behavior and models pedestrian movement to ensure safety along their walking paths. To date, many models have been designed to ease pedestrian movement. The Social Force Model is widely used among researchers as it is simpler and provides better simulation results. We will discuss the problem regarding the ritual of circumambulating the Ka'aba (Tawaf), where the entrances to this area are usually congested, a situation that worsens during the Hajj season. We will use the computer simulation model SimWalk, which is based on the Social Force Model, to simulate the movement of pilgrims in the Tawaf area. We will first discuss the effect of uni- and bi-directional flows at the gates. We will then restrict certain gates to serve only as entrances to the area and others only as exits. From the simulations, we will study the effect of the distance of the entrances from the starting line on the time pilgrims take to circumambulate the Ka'aba. We will distribute the pilgrims evenly among the different entrances so that the congestion at the entrances can be reduced. We will also discuss various locations and designs of barriers at the exits and their effect on the time taken for the pilgrims to exit the Tawaf area.
Abstract: We present a theory for optimal filtering of infinite sets of random signals. There are several new distinctive features of the proposed approach. First, we provide a single optimal filter for processing any signal from a given infinite signal set. Second, the filter is presented in the special form of a sum with p terms, where each term is represented as a combination of three operations. Each operation is a special stage of the filtering aimed at facilitating the associated numerical work. Third, an iterative scheme is implemented into the filter structure to provide an improvement in the filter performance at each step of the scheme. The final step of the scheme concerns signal compression and decompression. This step is based on the solution of a new rank-constrained matrix approximation problem. The solution to the matrix problem is described in this paper. A rigorous error analysis is given for the new filter.
Abstract: In this study we focus on improving the performance of a cue-based Motor Imagery Brain Computer Interface (BCI). For this purpose, a data fusion approach is applied to the results of different classifiers to make the best decision. In the first step, the Distinction Sensitive Learning Vector Quantization method is used as a feature selection method to determine the most informative frequencies in the recorded signals, and its performance is evaluated by a frequency search method. Then informative features are extracted by the wavelet packet transform. In the next step, 5 different types of classification methods are applied. The methodologies are tested on BCI Competition II dataset III; the best obtained accuracy is 85% and the best kappa value is 0.8. In the final step, the ordered weighted averaging (OWA) method is used to provide a proper aggregation of the classifiers' outputs. Using OWA enhances the system accuracy to 95% and the kappa value to 0.9. Applying OWA takes only 50 milliseconds of computation.
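The OWA aggregation step can be sketched as follows: the classifier outputs are sorted in descending order and combined with a fixed weight vector. The confidence values and weights below are illustrative assumptions, not the paper's actual numbers:

```python
import numpy as np

def owa(scores, weights):
    """Ordered weighted averaging: sort the inputs in descending
    order, then take the weighted sum with the given weights."""
    scores = np.sort(np.asarray(scores, dtype=float))[::-1]
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "OWA weights must sum to 1"
    return float(np.dot(scores, weights))

# Aggregate hypothetical per-trial confidences of 5 classifiers.
confidences = [0.9, 0.6, 0.85, 0.7, 0.8]
# "Or-like" weights emphasize the largest confidences.
w = [0.4, 0.3, 0.15, 0.1, 0.05]
print(owa(confidences, w))  # -> 0.835
```

Unlike a plain average, the weights attach to rank positions rather than to particular classifiers, so the operator can be tuned between max-like and min-like behavior.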
Abstract: In data mining, association rules are used to find associations between the different items of a transaction database. As data are collected and stored, valuable rules can be found through association rules, which can be applied to help managers execute marketing strategies and establish sound market frameworks. This paper aims to use Fuzzy Frequent Pattern growth (FFP-growth) to derive fuzzy association rules. First, we apply fuzzy partition methods and decide a membership function of the quantitative value for each transaction item. Next, we implement FFP-growth to deal with the process of data mining. In addition, in order to understand the impact of the Apriori algorithm and the FFP-growth algorithm on the execution time and the number of generated association rules, the experiment is performed using different sizes of databases and thresholds. Lastly, the experimental results show that the FFP-growth algorithm is more efficient than other existing methods.
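The fuzzy-partition step, assigning each quantitative transaction value a membership degree in a few linguistic terms, might be sketched with triangular membership functions. The attribute, term names, and breakpoints below are illustrative assumptions:

```python
def triangular(x, a, b, c):
    """Triangular membership: rises from a to the peak b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical fuzzy partition of a purchase-quantity attribute into
# three linguistic terms, as in the fuzzy-partition step.
def fuzzify_quantity(q):
    return {
        "low":    triangular(q, -1, 0, 5),
        "medium": triangular(q, 2, 6, 10),
        "high":   triangular(q, 7, 12, 100),
    }

print(fuzzify_quantity(4))  # quantity 4 is partly "low", partly "medium"
```

FFP-growth would then mine the FP-tree over these (item, term, membership) triples instead of crisp item counts.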
Abstract: Trust management is one of the main challenges in Peer-to-Peer (P2P) systems. The lack of centralized control makes it difficult to control the behavior of the peers. A reputation system is one approach to providing trust assessment in P2P systems. In this paper, we use fuzzy logic to model trust in a P2P environment. Our trust model combines first-hand (direct experience) and second-hand (reputation) information to allow peers to represent and reason with uncertainty regarding other peers' trustworthiness. Fuzzy logic can help in handling the imprecise nature and uncertainty of trust. Linguistic labels are used to enable peers to assign a trust level intuitively. Our fuzzy trust model is flexible in that inference rules are used to weight first-hand and second-hand information accordingly.
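A minimal sketch of combining first-hand and second-hand trust, assuming a simple interaction-count weighting in place of the paper's fuzzy inference rules, could look like this:

```python
def trust_score(direct, reputation, n_interactions, k=10):
    """Blend first-hand (direct) and second-hand (reputation) trust.
    The weight on direct experience grows with the number of own
    interactions; k controls how fast. This weighting is an
    illustrative stand-in for the paper's fuzzy inference rules."""
    w_direct = n_interactions / (n_interactions + k)
    return w_direct * direct + (1 - w_direct) * reputation

def linguistic_label(score):
    """Map a numeric trust level to an intuitive linguistic label."""
    if score < 0.3:
        return "distrust"
    if score < 0.7:
        return "uncertain"
    return "trust"

s = trust_score(direct=0.9, reputation=0.4, n_interactions=10)
print(s, linguistic_label(s))
```

A peer with few interactions of its own leans on reputation; as direct experience accumulates, it dominates, which mirrors the intent of weighting the two information sources "accordingly".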
Abstract: This work presents a recursive identification algorithm for a closed-loop system with a Variable Structure Controller. The suggested approach includes two stages. In the first stage, a genetic algorithm is used to obtain the parameters of the switching function that give a control signal rich in commutations (i.e., a control signal whose spectral characteristics are as close as possible to those of a white noise signal). The second stage consists of identifying the system parameters by the instrumental variable method, using the optimal switching function parameters obtained with the genetic algorithm. In order to test the validity of this algorithm, a simulation example is presented.
Abstract: Like any sentient organism, a smart environment
relies first and foremost on sensory data captured from the real
world. The sensory data come from sensor nodes of different
modalities deployed on different locations forming a Wireless Sensor
Network (WSN). Embedding smart sensors in humans has been a
research challenge due to the limitations imposed by these sensors
from computational capabilities to limited power. In this paper, we
first propose a practical WSN application that will enable blind
people to see what their neighboring partners can see. The challenge
is that the actual mapping between input images and brain patterns
is too complex and not well understood. We also study the
connectivity problem in 3D/2D wireless sensor networks and propose
distributed efficient algorithms to accomplish the required
connectivity of the system. We provide a new connectivity algorithm
CDCA to connect disconnected parts of a network using cooperative
diversity. Through simulations, we analyze the connectivity gains
and energy savings provided by this novel form of cooperative
diversity in WSNs.
Abstract: This paper presents a technique for diagnosis of abdominal aortic aneurysm in magnetic resonance imaging (MRI) images. First, our technique segments the aorta in MRI images. This is a required step to determine the volume of the aorta, which is the key quantity for diagnosis of abdominal aortic aneurysm. Our proposed technique detects the volume of the aorta in MRI images using a new external energy for the snakes model, calculated from Laws' texture measures. The new external energy increases the capture range of the snakes model more efficiently than the old external energy of snakes models. Second, our technique diagnoses abdominal aortic aneurysm with a Bayesian classifier, a classification model based on statistical theory. The features for classification of abdominal aortic aneurysm were derived from the contour of the aorta obtained by our snakes model, i.e., area, perimeter, and compactness. We also compare the proposed technique with the traditional snakes model. In our experiments, 30 images were used for training, and 20 images were tested and compared with expert opinion. The experimental results show that our technique achieves an accuracy higher than 95%.
Abstract: This paper investigates the problem of sampling from transactional data streams. We introduce CFISDS, a content-based sampling algorithm that works on a landmark window model of data streams and preserves a more informative sample in the sample space. The algorithm, which is based on closed frequent itemset mining, first initializes a concept lattice using initial data and then updates the lattice structure using an incremental mechanism. The incremental mechanism inserts, updates, and deletes nodes in the concept lattice in batch manner. The presented algorithm extracts the final samples on demand of the user. Experimental results show the accuracy of CFISDS on synthetic and real datasets, although CFISDS is not faster than existing sampling algorithms such as Z and DSS.
Abstract: The Generalized Center String (GCS) problem is generalized from the Common Approximate Substring and Common Substring problems. GCS is known to be NP-hard; the difficulty lies in the explosion of potential candidates. Finding the longest center string is hard because the sequences that may not contain any motifs are not known in advance in any particular biological gene process. GCS can be solved by frequent-pattern-mining techniques and is known to be fixed-parameter tractable with respect to the input sequence length and symbol set size. Efficient methods known as Bpriori algorithms can solve GCS with reasonable time/space complexities. The Bpriori 2 and Bpriori 3-2 algorithms have been proposed to find center strings of any length and the positions of all their instances in the input sequences. In this paper, we reduce the time/space complexity of the Bpriori algorithm with a Constraint-Based Frequent Pattern mining (CBFP) technique, which integrates the ideas of constraint-based mining and FP-tree mining. The CBFP mining technique solves the GCS problem not only for center strings of any length, but also for the positions of all their mutated copies in the input sequences. The CBFP technique constructs a TRIE-like FP-tree to represent the mutated copies of center strings of any length, along with constraints to restrain the growth of the consensus tree. The complexity analysis of the CBFP mining technique and the Bpriori algorithm is done for the worst case and the average case. The correctness of the algorithm is demonstrated by comparison with the Bpriori algorithm on artificial data.
Abstract: A virtualized and virtual approach is presented for academically preparing students to engage successfully, from a strategic perspective, with the concerns and measures, both structured and unstructured, in the area of cyber security and information assurance. The Master of Science in Cyber Security and Information Assurance (MSCSIA) is a professional degree for those who endeavor, through technical and managerial measures, to ensure the security, confidentiality, integrity, authenticity, control, availability, and utility of the world's computing and information systems infrastructure. The National University Cyber Security and Information Assurance program is offered as a Master's degree. The emphasis of the MSCSIA program uniquely includes hands-on academic instruction using virtual computers. This past year, 2011, the NU facility became fully operational, using a system architecture that provides a Virtual Education Laboratory (VEL) accessible to both onsite and online students. The first student cohort completed their MSCSIA training this past March 2, 2012, after fulfilling 12 courses, for a total of 54 units of college credit. The rapid-pace scheduling of one course per month is immensely challenging, perpetually changing, and virtually multifaceted. This paper analyses these descriptive terms in consideration of the globalization penetration breaches present in today's world of cyber security. In addition, we present current NU practices to mitigate risks.
Abstract: This paper presents a comparison between two Pulse
Width Modulation (PWM) algorithms applied to a three-level Neutral
Point Clamped (NPC) Voltage Source Inverter (VSI). The first
algorithm applied is the triangular-sinusoidal strategy; the second is
the Space Vector Pulse Width Modulation (SVPWM) strategy. In the
first part, we present a topology of the three-level NPC VSI. After that,
we develop the two PWM strategies to control this converter. Finally,
the experimental results are presented.
Abstract: Collision is considered a time-dependent nonlinear dynamic phenomenon. The majority of researchers have focused on deriving the resultant damage of ship collisions via analytical, experimental, and finite element methods. In this paper, first, the force-penetration curve of a head-on collision of a container ship with a rigid barrier, based on Yang and Pedersen's methods for the internal mechanics section, is studied. Next, the results obtained from the different analytical methods are compared with each other. Then, through a simulation of the container ship collision in Ansys Ls-Dyna, results from the finite element approach are compared with the analytical methods, and the sources of error are discussed. Finally, the effects of parameters such as velocity and angle of collision on the force-penetration curve are investigated.
Abstract: In this paper, we study the multi-scenario knapsack problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of an adaptive algorithm for solving the problem heuristically. The method combines two complementary phases: a size reduction phase and a dynamic 2-opt procedure. First, the reduction phase applies a polynomial reduction strategy, which is used for reducing the problem size. Second, the adaptive search procedure is applied in order to attain a feasible solution. Finally, the performances of two versions of the proposed algorithm are evaluated on a set of randomly generated instances.
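The exchange phase could be sketched as a 2-opt-style swap improvement on a feasible solution. This generic single-scenario version (not the paper's exact adaptive procedure, and without the preceding reduction phase) illustrates the idea:

```python
def two_opt_knapsack(values, weights, capacity, selected):
    """Repeated 2-opt-style exchange: swap an item that is in the
    solution with one that is out, accepting the swap whenever it
    stays feasible and raises the total value. Illustrative sketch."""
    n = len(values)
    weight = sum(weights[i] for i in selected)
    value = sum(values[i] for i in selected)
    improved = True
    while improved:
        improved = False
        for i in list(selected):
            for j in range(n):
                if j in selected:
                    continue
                new_w = weight - weights[i] + weights[j]
                if new_w <= capacity and values[j] > values[i]:
                    selected.remove(i)
                    selected.add(j)
                    weight = new_w
                    value += values[j] - values[i]
                    improved = True
                    break
            if improved:
                break  # restart the scan after each accepted swap
    return selected, value

sel, val = two_opt_knapsack([6, 5, 8, 9], [2, 3, 6, 7], 9, {0, 1})
print(sorted(sel), val)  # the starting solution {0, 1} improves to value 15
```

In the multi-scenario variant, the acceptance test would compare an aggregate of the objective over all scenarios rather than a single value sum.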
Abstract: One promising way to achieve low temperature
combustion regime is the use of a large amount of cooled EGR. In
this paper, the effect of injection timing on low temperature
combustion process and emissions were investigated via three
dimensional computational fluid dynamics (CFD) procedures in a DI
diesel engine using high EGR rates. The results show that, when increasing EGR from low levels to levels corresponding to reduced-temperature combustion, soot emission first increases, then decreases beyond 40% EGR, reaching its lowest value at a 58% EGR rate. Soot and NOx emissions are simultaneously decreased at injection timings advanced beyond 20.5 °CA BTDC in conjunction with a 58% cooled EGR rate, compared to the baseline case.
Abstract: A rare phenomenon of SDS-induced activation of a latent protease activity associated with the purified silkworm excretory red fluorescent protein (SE-RFP) was noticed. SE-RFP aliquots incubated with SDS for different time intervals indicated that the protein undergoes an obligatory breakdown into a number of subunits which exhibit autoproteolytic (acting upon themselves) and/or heteroproteolytic (acting on other proteins) activities. A strong serine protease activity of SE-RFP subunits on Bombyx mori nucleopolyhedrovirus (BmNPV) polyhedral protein was detected by the zymography technique. A complete inhibition of BmNPV infection of silkworms was observed in the oral administration assay of the SE-RFP. Here, it is proposed that SE-RFP prevents the initial infection of silkworms by BmNPV by obliterating the polyhedral protein. This is the first report of a silkworm red fluorescent protein that exhibits a protease activity on exposure to SDS. The present studies would help in understanding the antiviral mechanism of silkworm red fluorescent proteins.