Abstract: Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and must be performed at a reasonable rate. This paper presents an algorithm that discards irrelevant feature points and maintains the remainder for future use, thereby reducing the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the maintained feature points are insufficiently accurate for further modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least-squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, significantly lowering the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, further reducing the cost. In addition, the feature points remaining after reduction suffice for background object tracking, as demonstrated by the simple video stabilizer based on our proposed algorithm.
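The estimate-then-eliminate loop described above can be sketched as follows. This is a minimal illustration, assuming the simplified affine model is the four-parameter similarity x' = a*x - b*y + tx, y' = b*x + a*y + ty; the synthetic correspondences, the function names, and the studentized-residual threshold of 2.5 are all hypothetical, not the paper's actual implementation.

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares fit of x' = a*x - b*y + tx, y' = b*x + a*y + ty."""
    x, y = src[:, 0], src[:, 1]
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2] = np.column_stack([x, -y, np.ones(n), np.zeros(n)])
    A[1::2] = np.column_stack([y, x, np.zeros(n), np.ones(n)])
    rhs = dst.reshape(-1)                 # interleaved x'0, y'0, x'1, ...
    p, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return A, rhs, p

def robust_motion(src, dst, thresh=2.5):
    """Iteratively drop points whose studentized residual exceeds thresh."""
    keep = np.arange(len(src))
    while True:
        A, rhs, p = fit_similarity(src[keep], dst[keep])
        r = rhs - A @ p
        s2 = (r @ r) / (len(rhs) - 4)
        if s2 < 1e-12:                    # essentially perfect fit, stop
            return p, keep
        H = A @ np.linalg.inv(A.T @ A) @ A.T   # hat matrix for leverages
        t = r / np.sqrt(s2 * (1.0 - np.diag(H)))
        per_pt = np.abs(t).reshape(-1, 2).max(axis=1)  # worst of x/y rows
        bad = per_pt > thresh
        if not bad.any():
            return p, keep
        keep = keep[~bad]

# Synthetic correspondences with one gross outlier (a "moving object" point).
rng = np.random.default_rng(0)
src = rng.uniform(0, 100, (12, 2))
a, b, tx, ty = 1.0, 0.1, 2.0, -1.0
dst = np.column_stack([a * src[:, 0] - b * src[:, 1] + tx,
                       b * src[:, 0] + a * src[:, 1] + ty])
dst[5] += 50.0
p, keep = robust_motion(src, dst)
```

After the loop, `keep` is the maintained set carried to the next frame and `p` holds the recovered motion parameters.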
Abstract: Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first 6 months of life is very important, as it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and health problems. Moreover, it helps to reduce the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we survey the factors that influence exclusive breastfeeding and use two regression models for dispersed count data to analyze the data: the generalized Poisson regression model and the Com-Poisson regression model.
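The generalized Poisson distribution underlying the first of these models handles over- and under-dispersion through a second parameter. A minimal sketch of its probability mass function (Consul's parameterization, with hypothetical parameter values) is given below; the actual regression links covariates to these parameters, which is beyond this illustration.

```python
import math

def gen_poisson_pmf(y, theta, lam):
    """Consul's generalized Poisson pmf:
    P(Y=y) = theta * (theta + lam*y)**(y-1) * exp(-theta - lam*y) / y!
    lam = 0 reduces to the ordinary Poisson; lam > 0 gives over-dispersion."""
    return math.exp(math.log(theta) + (y - 1) * math.log(theta + lam * y)
                    - (theta + lam * y) - math.lgamma(y + 1))

theta, lam = 2.0, 0.3          # hypothetical values for illustration
probs = [gen_poisson_pmf(y, theta, lam) for y in range(200)]
mean = sum(y * p for y, p in zip(range(200), probs))
# For this distribution the mean is theta / (1 - lam).
```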
Abstract: Removal of Methylene Blue (MB) from aqueous solution by adsorption on gypsum was investigated by the batch method. The studies were conducted at 25°C and included the effects of pH and initial concentration of Methylene Blue. The adsorption data were analyzed using the Langmuir, Freundlich and Temkin isotherm models. The maximum monolayer adsorption capacity was found to be 36 mg of the dye per gram of gypsum. The data were also analyzed in terms of their kinetic behavior and were found to obey the pseudo-second-order equation.
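Fitting the Langmuir model mentioned above is commonly done through its linearized form Ce/qe = Ce/qm + 1/(KL*qm). The sketch below uses synthetic equilibrium data generated with the reported capacity qm = 36 mg/g and a hypothetical affinity constant KL = 0.5 L/mg; it illustrates the method only, not the paper's measurements.

```python
import numpy as np

# Hypothetical equilibrium data from a Langmuir isotherm with
# qm = 36 mg/g (the capacity reported above) and assumed KL = 0.5 L/mg.
qm_true, KL_true = 36.0, 0.5
Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])       # mg/L at equilibrium
qe = qm_true * KL_true * Ce / (1.0 + KL_true * Ce)      # mg/g adsorbed

# Linearized Langmuir: Ce/qe = Ce/qm + 1/(KL*qm), a straight line in Ce.
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qm_fit = 1.0 / slope            # slope gives 1/qm
KL_fit = slope / intercept      # intercept gives 1/(KL*qm)
```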
Abstract: This paper presents an integer frequency offset (IFO) estimation scheme for the 3GPP Long Term Evolution (LTE) downlink system. First, the conventional joint detection method for IFO and sector cell identity (CID) information is introduced. Second, an IFO estimation scheme that requires no explicit sector CID information is proposed; it can operate in parallel with sector CID detection and reduces the time delay in comparison with the conventional joint method. The proposed method is also computationally efficient and achieves nearly the same performance as the conventional method over the Pedestrian and Vehicular channel models.
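An integer frequency offset shifts the received subcarriers cyclically, so a generic estimator correlates the received subcarriers with cyclically shifted copies of a known reference and picks the shift with the largest correlation magnitude. The sketch below illustrates that principle on a noise-free toy signal with assumed QPSK reference symbols; it is not the paper's LTE-specific scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64
# Known reference symbols (e.g., a synchronization sequence), QPSK here.
X = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, N)))
x = np.fft.ifft(X)

ifo = 3                                    # IFO in subcarrier spacings
n = np.arange(N)
y = x * np.exp(2j * np.pi * ifo * n / N)   # IFO cyclically shifts subcarriers
Y = np.fft.fft(y)

# Correlate received subcarriers with shifted references; pick the best shift.
candidates = range(-8, 9)
metric = [abs(np.vdot(np.roll(X, d), Y)) for d in candidates]
ifo_hat = list(candidates)[int(np.argmax(metric))]
```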
Abstract: Protection and proper management of archaeological heritage are an essential part of studying and interpreting it for present and future generations. Protecting the archaeological heritage is based upon multidisciplinary professional collaboration. This study aims to gather data from different sources (photogrammetry and a Geographic Information System (GIS)), integrated for the purpose of documenting one of the significant archaeological sites (Ahl-Alkahf, Jordan). 3D modeling deals with the actual image of the features, shapes and texture so as to represent reality as realistically as possible. The 3D coordinates that result from the photogrammetric adjustment procedures are used to create 3D models of the study area. Adding textures to the 3D model surfaces gives a 'real world' appearance to the displayed models. The GIS combines all the data, including boundary maps indicating the location of the archaeological sites, a transportation layer, a digital elevation model and orthoimages. For realistic representation of the study area, a 3D GIS model was prepared, with which efficient generation, management and visualization of such spatial data can be achieved.
Abstract: The main purpose of this research is the calculation of implicit prices of the environmental level of air quality in the city of Moscow on the basis of housing property prices. The database used contains records of approximately 20 thousand apartments and was provided by a leading real estate agency operating in Russia. The explanatory variables include physical characteristics of the dwellings, environmental variables (industry emissions), neighbourhood sociodemographic variables and geographic data: the GPS coordinates of each building. The hedonic regression results for the ecological variables show negative implicit prices as the level of air contamination from substances such as carbon monoxide, nitrogen dioxide, sulphur dioxide, and total suspended particles (CO, NO2, SO2, TSP) increases. The marginal willingness to pay for higher environmental quality is presented for linear and log-log models.
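In a log-log hedonic specification the coefficients are elasticities, so a negative coefficient on a pollutant is read directly as the percentage price discount per percentage increase in contamination. The sketch below fits such a regression by ordinary least squares on synthetic, noise-free data; the variables, coefficients (0.9 for floor area, -0.15 for CO), and sample size are all hypothetical, not the Moscow dataset.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
area = rng.uniform(30, 120, n)     # floor area in m^2, hypothetical
co = rng.uniform(0.5, 3.0, n)      # CO concentration, hypothetical units

# Hypothetical log-log hedonic price function (elasticity coefficients).
log_price = 10.0 + 0.9 * np.log(area) - 0.15 * np.log(co)

Xmat = np.column_stack([np.ones(n), np.log(area), np.log(co)])
beta, *_ = np.linalg.lstsq(Xmat, log_price, rcond=None)
# beta[2] is the (negative) elasticity of price with respect to CO.
```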
Abstract: Most real queuing systems include special properties and constraints that cannot be analyzed directly using the results of solved classical queuing models. Non-Markovian behavior, non-exponential patterns and service constraints are examples of such conditions. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study: an almost completely non-Markovian system with a limited number of customers and limited capacities, as well as many of the exceptions common in real queuing networks. Simulation is used to optimize this system. The stages presented in this article include primary modeling, determining the kind of queuing system, defining indices, statistical analysis and goodness-of-fit testing, model validation, and optimization of the system through simulation.
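For systems without closed-form solutions, simulation of waiting times often starts from Lindley's recursion for a single-server FIFO queue, W(i+1) = max(0, W(i) + S(i) - A(i+1)). A minimal, deterministic sketch with hypothetical arrival and service times:

```python
def fifo_waits(arrivals, services):
    """Waiting times in a single-server FIFO queue via Lindley's recursion."""
    waits = [0.0]                              # first customer never waits
    for i in range(1, len(arrivals)):
        interarrival = arrivals[i] - arrivals[i - 1]
        waits.append(max(0.0, waits[-1] + services[i - 1] - interarrival))
    return waits

# Hypothetical trace: arrivals every 1.0 time units, all services 2.5 units.
arrivals = [0.0, 1.0, 2.0, 3.0]
services = [2.5, 2.5, 2.5, 2.5]
w = fifo_waits(arrivals, services)
```

In a real study the arrival and service times would be drawn from the distributions identified in the goodness-of-fit stage rather than fixed.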
Abstract: Optimizing equipment selection in heavy earthwork operations is critical to the success of any construction project. The objective of this research was to develop a computer model to assist contractors and construction managers in estimating the cost of heavy earthwork operations. An economic operation analysis was conducted for an equipment fleet, taking into consideration the owning and operating costs involved in earthwork operations. The model is being developed in a Microsoft environment and is capable of being integrated with other estimating and optimization models. In this study, the Caterpillar® Performance Handbook [5] was the main resource used to obtain specifications of the selected equipment. The implementation of the model gives the optimum selection of an equipment fleet based not only on cost effectiveness but also on versatility. To validate the model, a case study of an actual dam construction project was selected to quantify its degree of accuracy.
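An owning-and-operating cost estimate of the kind described above is typically built up from straight-line depreciation, interest on the average investment, fuel, and maintenance. The sketch below is a simplified calculator with entirely hypothetical figures and a hypothetical maintenance factor; it is not the paper's model.

```python
def hourly_owning_operating_cost(purchase_price, salvage_value, life_hours,
                                 annual_hours, interest_rate,
                                 fuel_cost_per_hour, maintenance_factor):
    """Simplified hourly owning + operating cost of one machine (hypothetical)."""
    depreciation = (purchase_price - salvage_value) / life_hours
    avg_investment = (purchase_price + salvage_value) / 2.0
    interest = avg_investment * interest_rate / annual_hours
    maintenance = maintenance_factor * depreciation   # rule-of-thumb fraction
    return depreciation + interest + fuel_cost_per_hour + maintenance

# Hypothetical machine: $500k purchase, $100k salvage, 10,000 h life,
# 2,000 h/year, 8% interest, $45/h fuel, maintenance at 80% of depreciation.
cost = hourly_owning_operating_cost(500000.0, 100000.0, 10000.0,
                                    2000.0, 0.08, 45.0, 0.8)
```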
Abstract: This study aims to examine the factors affecting knowledge sharing behavior in knowledge-based electronic communities (e-communities), because the quantity and quality of knowledge shared among the members play a critical role in the community's sustainability. Past research has suggested three perspectives that may affect the quantity and quality of knowledge shared: economics, social psychology, and social ecology. In this study, we argue that an economic perspective is suitable for validating the factors influencing newly registered members' knowledge contribution at the beginning of relationship development. Accordingly, this study proposes a model to validate the factors influencing members' knowledge sharing based on Transaction Cost Theory. By doing so, we may empirically test our hypotheses in various types of e-communities to determine the generalizability of our research models.
Abstract: The nonlinear finite element method with serendipity eight-node elements is used to determine ground surface settlement due to tunneling. Linear elastic elements are used to model the lining. A modified Generalized Plasticity model with a non-associated flow rule is applied to the analysis of a tunnel in São Paulo, Brazil. The tunnel had previously been analyzed with Lade's model, which requires 16 parameters. In this work, modified Generalized Plasticity, which requires only 10 parameters, is used, and the Mohr-Coulomb model is also used to analyze the tunnel. The results of the modified Generalized Plasticity model show better agreement with observed field data than those of the other models. The Mohr-Coulomb model predicts less settlement due to excavation than the other models.
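For reference, the Mohr-Coulomb criterion compared against above can be written in principal stresses as f = (σ1 - σ3)/2 - (σ1 + σ3)/2 · sin φ - c · cos φ, with yielding when f ≥ 0. The sketch below evaluates it under a compression-positive sign convention with hypothetical soil parameters (c = 10 kPa, φ = 30°), purely as an illustration of the criterion.

```python
import math

def mohr_coulomb_f(sigma1, sigma3, c, phi_deg):
    """Mohr-Coulomb yield function in principal stresses
    (compression positive); f >= 0 indicates yielding."""
    phi = math.radians(phi_deg)
    return ((sigma1 - sigma3) / 2.0
            - (sigma1 + sigma3) / 2.0 * math.sin(phi)
            - c * math.cos(phi))

# Hypothetical stress states, c = 10 kPa, phi = 30 degrees.
f_yield = mohr_coulomb_f(100.0, 20.0, 10.0, 30.0)    # beyond the envelope
f_elastic = mohr_coulomb_f(50.0, 20.0, 10.0, 30.0)   # inside the envelope
```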
Abstract: One of the most important parts of a cement factory is the cement rotary kiln, which plays a key role in the quality and quantity of the cement produced. In this part, physical processes and the two-way movement of air and materials, together with chemical reactions, take place. Thus, this system has immensely complex and nonlinear dynamic equations, which have not yet been worked out. Only in exceptional cases has an approximate model been presented, with a large number of the parameters involved eliminated. This situation causes many problems for designing a cement rotary kiln controller. In this paper, we present nonlinear predictor and simulator models for a real cement rotary kiln, obtained by applying a nonlinear identification technique to the Locally Linear Neuro-Fuzzy (LLNF) model. For the first time, a simulator model, as well as a predictor with a precise fifteen-minute prediction horizon, is presented for a cement rotary kiln. These models are trained by the LOLIMOT algorithm, an incremental tree-structured algorithm. Finally, the characteristics, advantages and drawbacks of these models are discussed. The data collected from the White Saveh Cement Company are used for modeling.
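An LLNF model of the kind trained here computes its output as a blend of local linear models weighted by normalized Gaussian validity functions. The sketch below evaluates such a network for a one-dimensional input; the centers, widths, and local parameters are hypothetical, and the LOLIMOT training (incremental splitting of the local models) is not shown.

```python
import numpy as np

def llnf_output(x, centers, sigmas, weights, biases):
    """Locally linear neuro-fuzzy output: validity-weighted local models."""
    mu = np.exp(-0.5 * ((x - centers) / sigmas) ** 2)  # Gaussian memberships
    phi = mu / mu.sum()                                # normalized validities
    local = weights * x + biases                       # one linear model each
    return float(phi @ local)

# With identical local models the network reduces to that single linear map,
# a useful sanity check: here every local model is y = 2x + 1.
out = llnf_output(0.7, np.array([0.0, 1.0, 2.0]), np.ones(3),
                  np.array([2.0, 2.0, 2.0]), np.ones(3))
```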
Abstract: In this paper, we consider the problem of tracking multiple maneuvering targets using switching multiple target motion models. With this paper, we aim to contribute to solving the problem of model-based body motion estimation using data coming from visual sensors. The Interacting Multiple Model (IMM) algorithm is specially designed to accurately track targets whose state and/or measurement models (assumed to be linear) change during motion transitions. However, when these models are nonlinear, the IMM algorithm must be modified in order to guarantee an accurate track. In this paper we propose to avoid the Extended Kalman filter because of its limitations and to substitute the Unscented Kalman filter, which appears to be more efficient according to the simulation results obtained with the nonlinear IMM algorithm (IMM-UKF). To resolve the problem of data association, the JPDA approach is combined with the IMM-UKF algorithm; the derived algorithm is denoted JPDA-IMM-UKF.
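At the core of the UKF is the unscented transform, which propagates a Gaussian through a (possibly nonlinear) function via deterministically chosen sigma points. The generic sketch below uses the simple weight setting alpha=1, beta=0, kappa=1 for readability (alpha near 1e-3 and beta=2 are common for Gaussian priors); it illustrates the transform only, not the paper's IMM-UKF filter.

```python
import numpy as np

def unscented_transform(f, mean, cov, alpha=1.0, beta=0.0, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through f via sigma points."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)
    sigmas = [mean] + [mean + L[:, i] for i in range(n)] \
                    + [mean - L[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))     # mean weights
    wc = wm.copy()                                     # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    ys = np.array([f(s) for s in sigmas])
    mean_y = wm @ ys
    diff = ys - mean_y
    cov_y = (wc[:, None] * diff).T @ diff
    return mean_y, cov_y

# Sanity check: for a linear map the transform is exact.
A = np.array([[1.0, 0.5], [0.0, 1.0]])
b = np.array([1.0, 0.0])
m, C = unscented_transform(lambda s: A @ s + b,
                           np.array([1.0, 2.0]), np.diag([0.1, 0.2]))
```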
Abstract: In this study, we propose a tongue diagnosis method that detects the tongue in a face image, divides the tongue area into six regions, and finally computes the tongue-coating ratio of each region. To detect the tongue area in the face image, we use an Active Shape Model (ASM). The detected tongue area is divided into the six regions widely used in Korean traditional medicine, and the distribution of tongue coating over the six regions is examined with a Support Vector Machine (SVM). For the SVM, we use a 3-dimensional vector calculated by Principal Component Analysis (PCA) from a 12-dimensional vector consisting of RGB, HSI, Lab, and Luv components. As a result, we detected the tongue area stably using the ASM and found that PCA and SVM helped raise the tongue-coating detection rate.
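The 12-to-3 dimensionality reduction step can be sketched with plain eigendecomposition-based PCA. The data below are synthetic: 12-dimensional "color features" constructed as mixtures of only 3 latent factors, so three principal components capture essentially all the variance; this is an illustration of the technique, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical color features: 100 samples whose 12 dimensions (standing in
# for RGB, HSI, Lab, Luv channels) are linear mixtures of 3 latent factors.
latent = rng.normal(size=(100, 3)) * np.array([5.0, 3.0, 2.0])
mixing = rng.normal(size=(3, 12))
data = latent @ mixing

def pca(X, k):
    """Project X onto its top-k principal components."""
    Xc = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))  # ascending order
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    ratio = vals[:k].sum() / vals.sum()        # explained-variance fraction
    return Xc @ vecs[:, :k], ratio

proj, ratio = pca(data, 3)
```

The 3-dimensional `proj` would then be fed to the SVM classifier.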
Abstract: Delay-Tolerant Networks (DTNs) are sparse, wireless networks where disconnections are common due to host mobility and low node density. The Message Ferrying (MF) scheme is a mobility-assisted paradigm to improve connectivity in DTN-like networks. A ferry, or message ferry, is a special node in the network which has a pre-determined route in the deployed area and relays messages between mobile hosts (MHs) which are intermittently connected. Increased contact opportunities among mobile hosts and the ferry improve the performance of the network, both in terms of message delivery ratio and average end-to-end delay. However, due to the inherent mobility of mobile hosts and the pre-determined periodicity of the message ferry, mobile hosts may often 'miss' contact opportunities with a ferry. In this paper, we propose the combination of stationary ferry access points (FAPs) with MF routing to increase contact opportunities between mobile hosts and the MF and consequently improve the performance of the DTN. We also propose several placement models for deploying FAPs on MF routes. We evaluate the performance of the FAP placement models through comprehensive simulation. Our findings show that FAPs do improve the performance of MF-assisted DTNs and that symmetric placement of FAPs outperforms other placement strategies.
Abstract: A software system goes through a number of stages during its life, and a software process model gives a standard format for planning, organizing and running a project. This article presents a new software development process model named the "Divide and Conquer Process Model", based on the idea of first dividing the work to make it simple and then integrating the parts to get the whole job done. The article begins with the background of different software process models and the problems in these models. This is followed by the new divide-and-conquer process model, an explanation of its different stages, and finally a discussion of its advantages over other models.
Abstract: Chicken feathers were used as biosorbent for Pb
removal from aqueous solution. In this paper, the kinetics and
equilibrium studies at several pH, temperature, and metal
concentration values are reported. For the tested conditions, the Pb sorption capacity of this poultry waste ranged from 0.8 to 8.3 mg/g. Optimal conditions for Pb removal by chicken feathers have been identified. Pseudo-first-order and pseudo-second-order equations were used to analyze the experimental data. In addition, the sorption
isotherms were fitted to classical Langmuir and Freundlich models.
Finally, thermodynamic parameters for the sorption process have
been determined. In summary, the results showed that chicken
feathers are an alternative and promising sorbent for the treatment of
effluents polluted by Pb ions.
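The pseudo-second-order analysis mentioned above is usually performed via the linearized form t/qt = 1/(k2*qe^2) + t/qe. The sketch below recovers the parameters from synthetic kinetic data generated with qe = 8.3 mg/g (the upper capacity reported here) and a hypothetical rate constant k2 = 0.05 g/(mg·min); it illustrates the fitting method, not the paper's measurements.

```python
import numpy as np

# Hypothetical kinetic data from a pseudo-second-order model:
# qt = k2*qe^2*t / (1 + k2*qe*t)
qe_true, k2_true = 8.3, 0.05
t = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])   # minutes
qt = k2_true * qe_true**2 * t / (1.0 + k2_true * qe_true * t)

# Linearized form: t/qt = 1/(k2*qe^2) + t/qe, a straight line in t.
slope, intercept = np.polyfit(t, t / qt, 1)
qe_fit = 1.0 / slope                 # slope gives 1/qe
k2_fit = slope**2 / intercept        # intercept gives 1/(k2*qe^2)
```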
Abstract: Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. The use of traditional static source code analysis may be very difficult, owing to the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation source code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so we need a pruning mechanism.
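The core idea of mutation analysis, applying small syntactic changes to source code and observing whether executions change behavior, can be sketched in a few lines. The toy example below mutates operators in a string of Python source and re-executes it; real Web-application mutation as described above would target server-side scripts through their execution engine.

```python
# Toy mutation analysis: apply mutation operators to source code and check
# which mutants change observable behavior on a test input ("killed" mutants).
src = "def price(qty, unit):\n    return qty * unit + 5\n"

def run(source, qty, unit):
    env = {}
    exec(source, env)              # compile and execute the (mutated) source
    return env["price"](qty, unit)

operators = [("*", "+"), ("+ 5", "- 5")]   # hypothetical operator mutations
baseline = run(src, 3, 10)                 # 3 * 10 + 5 = 35
killed = []
for old, new in operators:
    mutant = src.replace(old, new, 1)
    killed.append(run(mutant, 3, 10) != baseline)
```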
Abstract: New-generation mobile communication networks have the ability to support triple play. To that end, Orthogonal Frequency Division Multiplexing (OFDM) access techniques have been chosen to enlarge the system's capability for high-data-rate networks. Many cross-layer modeling and optimization schemes for the Quality of Service (QoS) and capacity of downlink multiuser OFDM systems have been proposed. In this paper, Maximum Weighted Capacity (MWC) based resource allocation at the physical (PHY) layer is used. This resource allocation scheme provides much better QoS than previous resource allocation schemes, while maintaining the highest or nearly the highest capacity at similar complexity. In addition, Delay Satisfaction (DS) scheduling at the Medium Access Control (MAC) layer, which allows more than one connection to be served in each slot, is used. This scheduling technique is more efficient than conventional scheduling and is used to investigate the effect of both the number of users and the number of subcarriers on system capacity. The system is optimized for different operational environments: outdoor as well as indoor deployment scenarios are investigated, and different channel models are considered. In addition, the effective capacity approach [1] is used not only to provide QoS for different mobile users, but also to increase the total throughput of the wireless network.
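A weighted-capacity allocation of the general kind referenced above can be sketched as a greedy per-subcarrier rule: give each subcarrier to the user maximizing w_u · log2(1 + SNR). The SNR matrix and weights below are hypothetical, and this greedy rule is a simplification, not the paper's MWC scheme.

```python
import numpy as np

def weighted_capacity_allocation(snr, weights):
    """Greedy rule: assign each subcarrier to the user maximizing
    weighted capacity. snr is (users, subcarriers) in linear scale."""
    metric = weights[:, None] * np.log2(1.0 + snr)
    return metric.argmax(axis=0)        # chosen user index per subcarrier

# Hypothetical scenario: 2 users, 3 subcarriers; user 1 carries weight 2
# (e.g., a stricter QoS requirement).
snr = np.array([[3.0, 1.0, 0.0],
                [0.5, 1.0, 7.0]])
weights = np.array([1.0, 2.0])
assign = weighted_capacity_allocation(snr, weights)
```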
Abstract: CIM is the standard formalism for modeling management
information developed by the Distributed Management Task
Force (DMTF) in the context of its WBEM proposal, designed to
provide a conceptual view of the managed environment. In this
paper, we propose the inclusion of formal knowledge representation
techniques, based on Description Logics (DLs) and the Web Ontology
Language (OWL), in CIM-based conceptual modeling, and then we
examine the benefits of such a decision. The proposal is specified as a
CIM metamodel level mapping to a highly expressive subset of DLs
capable of capturing all the semantics of the models. The paper shows
how the proposed mapping can be used for automatic reasoning
about the management information models, as a design aid, by means
of new-generation CASE tools, thanks to the use of state-of-the-art
automatic reasoning systems that support the proposed logic and use
algorithms that are sound and complete with respect to the semantics.
Such a CASE tool framework has been developed by the authors and
its architecture is also introduced. The proposed formalization is not
only useful at design time, but also at run time through the use of
rational autonomous agents, in response to a need recently recognized
by the DMTF.
Abstract: This paper proposes a new method for analyzing textual data. The method deals with items of textual data, where each item is described from various viewpoints. The method acquires 2-class classification models of the viewpoints by applying an inductive learning method to items with multiple viewpoints. Using these models, the method infers whether or not the viewpoints apply to new items. The method extracts expressions from the new items classified into the viewpoints and then extracts characteristic expressions corresponding to the viewpoints by comparing the frequency of expressions among the viewpoints. This paper also applies the method to questionnaire data given by guests at a hotel and verifies its effect through numerical experiments.
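The final step, extracting characteristic expressions by comparing frequencies across viewpoints, can be sketched as a relative-frequency ratio with add-one smoothing. The tiny hotel-review corpora, the scoring formula, and the function name below are hypothetical illustrations, not the paper's method.

```python
from collections import Counter

def characteristic_terms(docs_in_class, docs_out_class, top=2):
    """Rank terms by relative frequency in-class vs. out-of-class
    (add-one smoothing on the out-of-class counts)."""
    inside = Counter(w for d in docs_in_class for w in d.split())
    outside = Counter(w for d in docs_out_class for w in d.split())
    n_in = sum(inside.values()) or 1
    n_out = sum(outside.values()) or 1
    score = {w: (inside[w] / n_in) / ((outside[w] + 1) / n_out)
             for w in inside}
    return sorted(score, key=score.get, reverse=True)[:top]

# Hypothetical guest comments grouped by viewpoint.
service = ["friendly staff great service", "great staff"]
facility = ["old room small bath", "room was clean"]
terms = characteristic_terms(service, facility)
```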