Abstract: This paper presents an optimal design of linear phase
digital high pass finite impulse response (FIR) filter using Improved
Particle Swarm Optimization (IPSO). In the design process, the filter
length, pass band and stop band frequencies, feasible pass band and
stop band ripple sizes are specified. FIR filter design is a multi-modal
optimization problem. An iterative method is introduced to find the
optimal solution of FIR filter design problem. Evolutionary
algorithms like real code genetic algorithm (RGA), particle swarm
optimization (PSO), improved particle swarm optimization (IPSO)
have been used in this work for the design of linear phase high pass
FIR filter. IPSO is an improved PSO that introduces new definitions
for the velocity vector and swarm updating, thereby improving
solution quality. A comparison of simulation results reveals the
optimization efficacy of the algorithm over the prevailing
optimization techniques for solving the multi-modal,
non-differentiable, highly non-linear, and constrained FIR filter
design problem.
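As an illustration of the baseline optimizer, the canonical PSO velocity and position update can be sketched for a type-I linear-phase high-pass design; the filter order, band edge, swarm size, and coefficient values below are illustrative assumptions, not the paper's specification (the IPSO variant additionally modifies the velocity and swarm-update definitions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative high-pass spec (assumed values): order-20 type-I
# linear-phase FIR, so only half the taps (a0..a10) are free.
w = np.linspace(0, np.pi, 128)
Hd = np.where(w >= 0.55 * np.pi, 1.0, 0.0)   # ideal high-pass response
D = 11                                        # free coefficients a0..a10
cosines = np.cos(np.outer(w, np.arange(1, D)))

def cost(half):
    # Squared ripple error of |H(w)| = |a0 + 2*sum a_k cos(kw)| vs. ideal
    H = np.abs(half[0] + 2.0 * cosines @ half[1:])
    return float(np.sum((H - Hd) ** 2))

# Canonical PSO: inertia + cognitive + social velocity update
P = 30
x = rng.uniform(-0.5, 0.5, (P, D)); v = np.zeros((P, D))
pbest = x.copy(); pcost = np.array([cost(p) for p in x])
gbest = pbest[pcost.argmin()].copy(); gcost = float(pcost.min())
init_cost = gcost
for _ in range(200):
    r1, r2 = rng.random((P, D)), rng.random((P, D))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    c = np.array([cost(p) for p in x])
    improved = c < pcost
    pbest[improved] = x[improved]; pcost[improved] = c[improved]
    if pcost.min() < gcost:
        gcost = float(pcost.min()); gbest = pbest[pcost.argmin()].copy()
```

The global best cost is non-increasing by construction, so the swarm's final ripple error is never worse than its starting point.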
Abstract: In this paper we present discretization and decomposition methods for a multi-component transport model of a chemical vapor deposition (CVD) process. CVD processes are used to manufacture deposition layers or bulk materials. In our transport model we simulate the deposition of thin layers. The microscopic model is based on the heavy particles, which are derived by approximately solving a linearized multi-component Boltzmann equation. For the drift process of the particles we propose diffusion-reaction equations, as well as for the effects of heat conduction. We concentrate on solving the diffusion-reaction equation with analytical and numerical methods. For the chemical processes, modelled with reaction equations, we propose decomposition methods and decouple the multi-component models into simpler systems of differential equations. In the numerical experiments we present the computational results of our proposed models.
Abstract: Long number multiplications (n ≥ 128-bit) are a
primitive in most cryptosystems. They can be performed more
efficiently using the Karatsuba-Ofman technique. This algorithm is
easy to parallelize on workstation networks and on distributed
memory, and it is known as the practical method of choice.
Multiplying long numbers using the Karatsuba-Ofman algorithm is fast
but highly recursive. In this paper, we propose different designs for
implementing the Karatsuba-Ofman multiplier. A mixture of sequential
and combinational system design techniques involving pipelining is
applied to our proposed designs. Multiplying large numbers can thus
be adapted flexibly to time, area and power criteria. In
computationally and area-constrained embedded systems such as smart
cards and mobile phones, multiplication of finite field elements can
therefore be achieved more efficiently. The proposed designs are
compared to other existing techniques. Mathematical models (Area(n),
Delay(n)) of our proposed designs are also elaborated and evaluated
on different FPGA devices.
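The recursion at the heart of the multiplier, replacing four half-size products with three, can be sketched in software as follows (the 32-bit base-case threshold is an arbitrary choice for illustration):

```python
def karatsuba(x, y):
    """Karatsuba-Ofman multiplication of non-negative integers."""
    # Base case: small enough for the machine's schoolbook multiply
    if x < (1 << 32) or y < (1 << 32):
        return x * y
    n = max(x.bit_length(), y.bit_length())
    h = n // 2
    # Split each operand into high and low halves at bit position h
    xh, xl = x >> h, x & ((1 << h) - 1)
    yh, yl = y >> h, y & ((1 << h) - 1)
    a = karatsuba(xh, yh)            # high * high
    b = karatsuba(xl, yl)            # low * low
    c = karatsuba(xh + xl, yh + yl)  # combined product
    # Cross term xh*yl + xl*yh recovered as c - a - b: three
    # recursive multiplications instead of four
    return (a << (2 * h)) + ((c - a - b) << h) + b
```

The hardware designs in the paper trade off how these three recursive products are scheduled (sequentially, combinationally, or pipelined); the recursion itself is identical.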
Abstract: Stochastic modeling of network traffic is an area of
significant research activity for current and future broadband
communication networks. Multimedia traffic is statistically
characterized by a bursty variable bit rate (VBR) profile. In this
paper, we develop an improved model for uniform activity level
video sources in ATM using a doubly stochastic autoregressive
model driven by an underlying spatial point process. We then
examine a number of burstiness metrics such as the peak-to-average
ratio (PAR), the temporal autocovariance function (ACF) and the
traffic measurements histogram. We found that the peak-to-average
ratio is the most suitable measure for capturing the burstiness of
single-scene video traffic. In the last phase of this work, we
analyse the statistical multiplexing of several constant-scene video
sources. This proved, expectedly, to be advantageous with respect to
reducing the burstiness of the traffic, as long as the sources are
statistically independent. We observed that the burstiness diminished
rapidly, with the largest gain occurring when only around 5 sources
are multiplexed. The novel model used in this paper for
characterizing uniform activity video was thus found to be an
accurate model.
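A minimal sketch of the burstiness metrics and the multiplexing effect, using a generic first-order autoregressive source as a stand-in for the paper's doubly stochastic model (all parameter values are assumptions):

```python
import numpy as np

def ar1_trace(seed, n=5000, rho=0.9, mean=10.0, sigma=1.0):
    # Illustrative AR(1) bit-rate source (parameter values are assumed)
    rng = np.random.default_rng(seed)
    x = np.empty(n); x[0] = mean
    for t in range(1, n):
        x[t] = mean + rho * (x[t - 1] - mean) + rng.normal(0, sigma)
    return np.clip(x, 0, None)          # bit rates are non-negative

def par(x):
    # Peak-to-average ratio: a simple burstiness measure
    return float(x.max() / x.mean())

def acf(x, lag):
    # Normalized temporal autocovariance at the given lag
    xc = x - x.mean()
    return float(np.dot(xc[:-lag], xc[lag:]) / np.dot(xc, xc))

single = ar1_trace(0)
# Statistical multiplexing: aggregate of 5 independent sources
aggregate = sum(ar1_trace(s) for s in range(5))
```

Because the sources are independent, the aggregate's fluctuations grow only as the square root of the number of sources while the mean grows linearly, so the PAR of the aggregate falls below that of a single source.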
Abstract: This paper provides a framework that simultaneously
incorporates reliability, as a response to disruptions in
distribution systems, and partial covering theory, as a response to
limited coverage radii and economic preferences, into the traditional
literature on capacitated facility location problems. As a result,
we develop a bi-objective model, based on discrete scenarios, for
expected cost minimization and demand coverage maximization through
a three-echelon supply chain network, by facilitating multiple
capacity levels for the provider-side layers and imposing a gradual
coverage function for the distribution centers (DCs). Additionally,
besides solving the model by objective aggregation in the LINGO
software, a branch of the LP-metric method called the Min-Max
approach is proposed, and different aspects of the corresponding
model are explored.
Abstract: This research simulates one of the natural phenomena,
the ocean wave. Our goal is to be able to simulate the ocean wave at
real-time rate with the water surface interacting with objects. The
wave in this research is calm and smooth caused by the force of the
wind above the ocean surface. In order to make the simulation of the
wave run in real time, GPU programming and multithreading techniques
are used here. Since the new generation of CPUs for personal
computers have multiple cores, they are well suited to
multithreading, a technique that utilizes more than one core at a
time. The simulation is programmed in C with OpenGL. To make the
simulation of the wave look more realistic, we applied an OpenGL
technique called cube mapping (environment mapping) to make the
water surface reflective and thus more realistic.
Abstract: The paper contains a review of the literature in terms of the critical analysis of methodologies of university ranking systems. Furthermore, the initiatives supported by the European Commission (U-Map, U-Multirank) and the CHE Ranking are described. Special attention is paid to tendencies in the development of ranking systems. According to the author, ranking organizations should abandon the classic form of ranking, namely a hierarchical ordering of universities from "the best" to "the worst". In the empirical part of this paper, using one of the methods of cluster analysis called k-means clustering, the author presents university classifications of the top universities from the Shanghai Jiao Tong University's (SJTU) Academic Ranking of World Universities (ARWU).
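The clustering step can be sketched as follows; the indicator vectors here are invented stand-ins for standardized ARWU indicator scores, and the implementation is a plain Lloyd's-algorithm k-means:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical standardized indicator scores for 12 universities
# (three well-separated groups; values invented for illustration)
X = np.vstack([rng.normal(m, 0.3, size=(4, 3)) for m in (0.0, 2.0, 4.0)])

def kmeans(X, k, iters=50, seed=1):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        # Recompute centers, keeping a center fixed if its cluster is empty
        centers = np.array([X[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, 3)
```

Unlike a hierarchical "best to worst" list, the output is a flat partition: universities in the same cluster have similar indicator profiles without any implied ordering between them.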
Abstract: In this paper, we use an M/G/C/C state dependent
queuing model within a complex network topology to determine the
different performance measures for pedestrian traffic flow. The
occupants in this network topology need to go through some source
corridors, from which they can choose their suitable exiting
corridors. The performance measures were calculated using arrival
rates that maximize the throughputs of the source corridors. The
results indicate that, in order to increase the throughput of the
network, the flow direction of pedestrians through the corridors has
to be restricted and the arrival rates to the source corridors need
to be controlled.
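A minimal numerical sketch of an M/G/C/C state-dependent queue for a single corridor, using the widely used linear speed-density model; the corridor dimensions, free walking speed, and the capacity rule of 5 occupants per square metre are illustrative assumptions, not the paper's network:

```python
def mgcc_measures(lam, L=10.0, W=2.0, V1=1.5):
    """Single-corridor M/G/C/C state-dependent queue with the linear
    speed model V(n) = V1 * (C - n + 1) / C (assumed parameters)."""
    C = int(5 * L * W)            # capacity: assumed 5 occupants per m^2
    ET1 = L / V1                  # traversal time of a lone pedestrian
    # p(n) proportional to (lam*ET1)^n / (n! * f(1)*...*f(n)),
    # where f(n) = V(n) / V1 is the speed-reduction factor
    terms = [1.0]
    for n in range(1, C + 1):
        f_n = (C - n + 1) / C
        terms.append(terms[-1] * lam * ET1 / (n * f_n))
    Z = sum(terms)
    p = [t / Z for t in terms]
    blocking = p[C]                        # probability an arrival is lost
    throughput = lam * (1.0 - blocking)    # accepted pedestrian flow
    expected_n = sum(n * pn for n, pn in enumerate(p))
    return throughput, expected_n, blocking
```

For light traffic the blocking probability is negligible and the corridor passes nearly all arrivals; as the arrival rate grows, occupancy rises, walking speed falls, and the throughput saturates, which is why controlling the arrival rates to the source corridors matters.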
Abstract: Olomouc is a unique and complex landscape with
widespread forestation and varied land use. This research work was
conducted to assess important and complex land use change
trajectories in Olomouc region. Multi-temporal satellite data from
1991, 2001 and 2013 were used to extract land use/cover types by
an object-oriented classification method. To achieve the objectives,
three different aspects were examined: (1) the quantity of each
transition; (2) location-based landscape patterns; and (3) a
comparison of land use/cover evaluation procedures. The land cover
change trajectories show that 16.69% agriculture, 54.33% forest and
21.98% other areas (settlement, pasture and water bodies) were stable
across all three decades. Approximately 30% of the study area
maintained the same land cover type from 1991 to 2013. Broad-scale
political and socio-economic factors also affected the rate and
direction of landscape changes. Distance from the settlements was the
most important predictor of land cover change trajectories. This
showed that most of the landscape trajectories were caused by
socio-economic activities and mainly led to beneficial change in the
ecological environment.
Abstract: This paper explains how mobile learning assures sustainable e-education for a multicultural group of students. It reports the impact of mobile learning on distance education in a multicultural environment. The emergence of learning technologies through CDs, the internet, and mobile devices is increasingly adopted by distance institutes for quick delivery and cost-effectiveness. Their sustainability is conditioned by the structure of the learners as well as the teaching community. The experimental study was conducted among the distance learners of Vinayaka Missions University, located at Salem in India. Students were drawn from a multicultural environment comprising different languages, religions, classes and communities. During the mobile learning sessions, the students, although divided by language, religion, class and community, were dominated by the play impulse rather than by study anxiety or cultural inhibitions. The study confirmed that mobile learning improved the performance of the students despite their divisions based on region, language or culture. In other words, technology was able to transcend the relative deprivation in the multicultural groups. It also confirms that mobile learning provides sustainable e-education and a cost-effective system of instruction. Mobile learning harnesses the self-motivation and play impulse of young learners in providing sustainable e-education to multicultural social groups of students.
Abstract: In this paper we introduce a novel method for
the characterization of synchronization and coupling effects
in multivariate time series that can be used for the analysis
of EEG or ECoG signals recorded during epileptic seizures.
The method makes it possible to visualize the spatio-temporal
evolution of synchronization and coupling effects that are
characteristic of epileptic seizures. Like other methods proposed
for this purpose, our method is based on a regression analysis.
However, a more general definition of the regression, together
with an effective channel selection procedure, allows the method
to be used even for time series that are highly correlated, which
is commonly the case in EEG/ECoG recordings with large
numbers of electrodes. The method was experimentally tested
on ECoG recordings of epileptic seizures from patients with
temporal lobe epilepsy. A comparison with the results of an
independent visual inspection by clinical experts showed
excellent agreement with the patterns obtained by the
proposed method.
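The regression-based idea can be sketched on synthetic data: one channel is regressed on a lagged copy of another, and the explained-variance fraction serves as a simple coupling index. The signals, lag, and index definition here are illustrative and simpler than the paper's more general regression:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                                 # driving channel
y = 0.8 * np.roll(x, 1) + 0.2 * rng.normal(size=n)     # coupled to x, lag 1
z = rng.normal(size=n)                                 # independent control

def coupling(a, b, lag=1):
    # Fraction of the variance of b explained by a linear regression
    # of b on the lagged channel a (a simple coupling index)
    m = len(a) - lag
    A = np.column_stack([a[:m], np.ones(m)])           # regressor + intercept
    coef, *_ = np.linalg.lstsq(A, b[lag:], rcond=None)
    resid = b[lag:] - A @ coef
    return 1.0 - float(np.var(resid) / np.var(b[lag:]))
```

The coupled pair yields an index near its true explained variance, while the independent control channel stays near zero; mapping such indices over all channel pairs and over sliding time windows gives the kind of spatio-temporal picture the abstract describes.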
Abstract: There exists a strong correlation between efficient project management and competitive advantage for organizations. Therefore, organizations are striving to standardize and assess the rigor of their project management processes and capabilities, i.e. their project management maturity. Researchers and standardization organizations have developed several project management maturity models (PMMMs) to assess the project management maturity of organizations. This study presents a critical evaluation of some of the leading PMMMs against OPM3® in a multitude of ways, to determine which PMMM is the most comprehensive model - the one that can assess most aspects of an organization and also help organizations gain a competitive advantage over their competitors. After a detailed morphological analysis of the models, it is concluded that OPM3® is the most promising maturity model and can provide a real competitive advantage to organizations due to its unique approach to assessment and improvement strategies.
Abstract: Vernonia divergens Benth., commonly known as
“Insulin Plant” (Fam: Asteraceae), is a potent sugar killer. Locally,
the leaves of the plant, boiled in water, are successfully
administered to a large number of diabetic patients. The present
study evaluates the putative anti-diabetic ingredients, isolated from
in vivo and in vitro grown plantlets of V. divergens, for their
antimicrobial and anticancer activities. Sterilized explants of nodal
segments were cultured on MS (Murashige and Skoog, 1962) medium in
the presence of different combinations of hormones. Multiple shoots
along with bunches of roots were regenerated at 1 mg l-1 BAP and
0.5 mg l-1 NAA. Micro-plantlets were separated and sub-cultured on
double strength (2X) of the above combination of hormones, leading
to increased length of roots and shoots. These plantlets were
successfully transferred to soil and survived well in nature. The
ethanol extracts of plantlets from both in vivo and in vitro sources
were prepared in a Soxhlet extractor and then concentrated to
dryness under reduced pressure in a rotary evaporator. The
concentrated extracts thus obtained showed significant inhibitory
activity against Gram-negative bacteria such as Escherichia coli and
Pseudomonas aeruginosa, but no inhibition was found against
Gram-positive bacteria. Further, these ethanol extracts were screened
for in vitro percentage cytotoxicity at different time periods (24 h,
48 h and 72 h) at different dilutions. The in vivo plant extract
inhibited the growth of EAC mouse cell lines by 65, 66, 78, and 88%
at 100, 50, 25 and 12.5 μg mL-1, but only at 72 h of treatment. In
the case of the extract of in vitro origin, inhibition of the EAC
cell lines was found even at 48 h. On spectrophotometric scanning,
the extracts exhibited different absorption maxima (λmax) - four
peaks in the in vitro extract as against a single peak in the in vivo
preparation - suggesting a possible change in the nature of the
ingredients during micropropagation through tissue culture
techniques.
Abstract: A physically based, spatially-distributed water quality model is being developed to simulate spatial and temporal distributions of material transport in the Great Lakes Watersheds of the U.S. Multiple databases of meteorology, land use, topography, hydrography, soils, agricultural statistics, and water quality were used to estimate nonpoint source loading potential in the study watersheds. Animal manure production was computed from tabulations of animals by zip code area for the census years of 1987, 1992, 1997, and 2002. Relative chemical loadings for agricultural land use were calculated from fertilizer and pesticide estimates by crop for the same periods. Comparison of these estimates to the monitored total phosphorus load indicates that both point and nonpoint sources are major contributors to the total nutrient loads in the study watersheds, with nonpoint sources being the largest contributor, particularly in the rural watersheds. These estimates are used as the input to the distributed water quality model for simulating pollutant transport through surface and subsurface processes to Great Lakes waters. Visualization and GIS interfaces are developed to visualize the spatial and temporal distribution of the pollutant transport in support of water management programs.
Abstract: With optimized bandwidth and latency discrepancy ratios, Node Gain Scores (NGSs) are determined and used as a basis for shaping a max-heap overlay. The NGSs - determined as the respective bandwidth-latency products - govern the construction of max-heap-form overlays. Each NGS is earned as a synergy of the discrepancy ratio of the bandwidth requested with respect to the estimated available bandwidth, and the latency discrepancy ratio between the node and the source node. The tree leads to enhanced-delivery overlay multicasting, increasing packet delivery which could otherwise be hindered by the induced packet loss occurring in other schemes that do not consider the synergy of these parameters when placing the nodes on the overlays. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency as advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and the latency discrepancy ratio (LDR) carry weights of α and (1,000 - α), respectively, with α arbitrarily chosen between 0 and 1,000 to ensure that the NGS values, used as node IDs, maintain a good likelihood of uniqueness and a balance between the BDR and the LDR. A max-heap-form tree is constructed with the assumption that all nodes possess NGSs less than that of the source node. To maintain load balance, the children of each level's siblings are evenly distributed such that a node cannot accept a second child until all of its siblings that are able to do so have acquired the same number of children, and so on; this is done logically from left to right in a conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured by a pathChirp scheme at individual nodes, are maintained.
Evaluations comparing the scheme to other schemes - Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP) - have been conducted. The new scheme generally performs better in terms of the trade-off between packet delivery ratio, link stress, control overhead, and end-to-end delay.
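A toy sketch of the NGS computation and the max-heap ordering it induces; the weighted-sum combination of the BDR and LDR, the α value, and the measurement tuples below are assumptions for illustration (the abstract describes the NGS as a weighted synergy of the two ratios without giving the exact formula):

```python
import heapq

def ngs(Ba, Br, Lp, Lb, alpha=600):
    # Weighted combination of the bandwidth and latency discrepancy
    # ratios; this exact form and alpha = 600 are assumptions.
    bdr = Ba / Br          # estimated available vs. requested bandwidth
    ldr = Lb / Lp          # advised best latency vs. latency to parent
    return alpha * bdr + (1000 - alpha) * ldr

# Hypothetical per-node measurements: (Ba, Br, Lp, Lb)
nodes = [ngs(*m) for m in [(12, 4, 50, 40), (8, 4, 30, 40), (20, 10, 80, 40)]]

# Max-heap ordering of the scores (heapq is a min-heap, so negate)
heap = [-s for s in nodes]
heapq.heapify(heap)
order = [-heapq.heappop(heap) for _ in range(len(heap))]
```

Popping the heap yields the nodes in descending NGS order, which is the property the overlay construction relies on when it places higher-gain nodes closer to the source.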
Abstract: In a non-super-competitive environment, the concepts
of the closed system and management control remain the dominant
guiding concepts of management. The merits of the closed loop have
been the source of much of the management literature and culture for
many decades. It is a useful exercise to investigate and probe the
dynamics of the control loop phenomenon and draw some lessons to
use for refining the practice of management. This paper examines the
multitude of lessons abstracted from the behavior of the
input/output/feedback control loop model, which is the core of
control theory.
There are numerous lessons that can be learned from the insights this
model would provide and how it parallels the management dynamics
of the organization. It is assumed that an organization is basically
a living system that interacts with internal and external variables.
A viable control loop is one that reacts to variation in the
environment and provides or exerts a corrective action. In managing
organizations this is reflected in the organizational structure and
in management control practices. This paper reports findings that
resulted from examining several abstract scenarios that are
exhibited in the design, operation, and dynamics of the control
loop, and how they are projected onto the functioning of the
organization. Valuable lessons are drawn in trying to find parallels
and new paradigms, and in showing how control theory is reflected in
the design of organizational structures and management practices.
The paper is structured in a logical and perceptive format. Further
research is needed to extend these findings.
Abstract: Electronic products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the market place. Because of the high costs that an industry may incur, and because high yield is directly proportional to high profit, IC (Integrated Circuit) manufacturers struggle to maximize yield; but today's customers demand miniaturization, low costs, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. In order to evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors - boards, placement, components, the materials from which the components are made, and processes - must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, which consists of a class of computational algorithms that depend on repeated random sampling to compute their results. This method is utilized to recreate, in simulation, the placement and assembly processes within a production line.
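A minimal Monte Carlo sketch of placement yield estimation: each trial draws a random placement offset, and the placement counts as good when it lies within the pad tolerance. The Gaussian error model and the sigma and tolerance values are assumptions for illustration, not measured machine data:

```python
import random

def placement_yield(trials=100_000, sigma=0.015, tol=0.05, seed=42):
    """Estimate placement yield by repeated random sampling.

    sigma: assumed std. dev. of the combined machine + vision offset (mm)
    tol:   assumed pad tolerance on each axis (mm)
    """
    rng = random.Random(seed)
    good = 0
    for _ in range(trials):
        # Draw independent x/y placement offsets for one component
        dx, dy = rng.gauss(0, sigma), rng.gauss(0, sigma)
        if abs(dx) <= tol and abs(dy) <= tol:
            good += 1
    return good / trials

y = placement_yield()
```

Tightening the tolerance or increasing the offset spread in such a model immediately shows how assembly yield degrades, which is the kind of what-if question the simulation of a production line is meant to answer.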
Abstract: High-frequency (HF) communications have been used by military organizations for more than 90 years. The possibility of very long range communications without the need for advanced equipment makes HF a convenient and inexpensive alternative to satellite communications. Besides these advantages, voice and data transmission over HF is a challenging task, because the HF channel generally suffers from Doppler shift and spread, multi-path, co-channel interference, and many other sources of noise. In constructing an HF data modem, all these effects must be taken into account. STANAG 4539 is a NATO standard for high-speed data transmission over HF. It allows data rates up to 12800 bps over an HF channel of 3 kHz. In this work, an efficient implementation of STANAG 4539 on a single Texas Instruments TMS320C6747 DSP chip is described. The state-of-the-art algorithms used in the receiver and the efficiency of the implementation enable real-time high-speed data and digitized voice transmission over poor HF channels.
Abstract: On the basis of Bayesian inference using the
maximizer of the posterior marginal estimate, we carry out phase
unwrapping using multiple interferograms via generalized mean-field
theory. In numerical calculations for a typical wave-front in remote
sensing using synthetic aperture radar interferometry, the phase
diagram in hyper-parameter space clarifies that the present method
succeeds in phase unwrapping perfectly under the constraint of the
surface-consistency condition, if the interferograms are not
corrupted by any noise. Also, we find that the prior is useful for
extending the range of phases for which phase unwrapping succeeds
under the constraint of the surface-consistency condition. These
results are quantitatively confirmed by Monte Carlo simulation.
Abstract: We present the results of a case study aiming to assess how the tourism community is reflected on the Web and how this can be used to propose new ways to communicate visually. The wealth of information contained in the Web and the clear facilities for communicating personal points of view make the social web a new space for exploration. In this way, the social web allows the sharing of information between communities with similar interests. However, the tourism community remains unexplored, as is the case for the information contained in travel stories. Across the Web, we find multiple sites allowing users to communicate their experiences and personal points of view of a particular place in the world. This cultural heritage is found in multiple documents, usually only sparsely supplemented with photos, so they are difficult to explore due to the lack of visual information. This paper explores the possibility of analyzing travel stories in order to display them visually on maps and to generate new knowledge such as patterns of travel routes. In this way, travel narratives published in electronic formats can be very important, especially to the tourism community, because of the great amount of knowledge that can be extracted. Our approach is based on the use of a Geoparsing Web Service to extract geographic coordinates from travel narratives in order to draw the geo-positions and link the documents onto a map image.
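A toy sketch of the route-extraction idea; a small hard-coded gazetteer stands in for the Geoparsing Web Service, and the place names and coordinates are merely illustrative:

```python
# Hypothetical gazetteer standing in for the Geoparsing Web Service
GAZETTEER = {
    "paris": (48.8566, 2.3522),
    "lyon": (45.7640, 4.8357),
    "marseille": (43.2965, 5.3698),
}

def extract_route(story):
    # Return (place, (lat, lon)) for each recognized place name,
    # in narrative order, skipping immediate repeats
    route = []
    for token in story.lower().replace(",", " ").replace(".", " ").split():
        if token in GAZETTEER and (not route or route[-1][0] != token):
            route.append((token, GAZETTEER[token]))
    return route

story = "We left Paris by train, stopped in Lyon, and reached Marseille."
route = extract_route(story)
```

The resulting ordered list of geo-positions is exactly what is needed to draw the narrative as a polyline on a map image; a real geoparser additionally disambiguates place names against a full gazetteer rather than a lookup table.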