Abstract: Class cohesion is an important object-oriented
software quality attribute. It indicates how much the members in a
class are related. Assessing class cohesion during the object-oriented
design phase, and improving class quality accordingly, makes the later
phases cheaper to manage. In this paper, the
notion of distance between pairs of methods and pairs of attribute
types in a class is introduced and used as a basis for introducing a
novel class cohesion metric. The metric considers the method-method,
attribute-attribute, and attribute-method direct interactions.
It is shown that the metric gives more sensitive values than other
well-known design-based class cohesion metrics.
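As an illustration of the distance-based idea (this is a simplified sketch, not the paper's exact metric), one can model each method by the set of class attributes it references, define the distance between two methods as one minus their Jaccard similarity, and take cohesion as one minus the average pairwise distance:

```python
# Illustrative sketch (not the paper's exact metric): methods are represented
# by the sets of attributes they reference; cohesion is derived from the
# average pairwise distance between those sets.
from itertools import combinations

def method_distance(attrs_a, attrs_b):
    """Distance in [0, 1]: 0 when the methods use identical attribute sets."""
    if not attrs_a and not attrs_b:
        return 0.0
    return 1.0 - len(attrs_a & attrs_b) / len(attrs_a | attrs_b)

def class_cohesion(method_attrs):
    """Cohesion in [0, 1]: 1 minus the average pairwise method distance."""
    pairs = list(combinations(method_attrs, 2))
    if not pairs:
        return 1.0
    avg = sum(method_distance(a, b) for a, b in pairs) / len(pairs)
    return 1.0 - avg

# A cohesive class: every method touches the shared attribute 'x'.
cohesive = [{"x", "y"}, {"x"}, {"x", "y"}]
# A non-cohesive class: methods use disjoint attributes.
scattered = [{"a"}, {"b"}, {"c"}]
print(class_cohesion(cohesive) > class_cohesion(scattered))  # True
```

A more sensitive metric distinguishes such cases by degree rather than only by presence or absence of shared members.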
Abstract: The market transformation in Kazakhstan over the last two
decades has substantially widened the gap between the development of
urban and rural areas. The introduction of market institutions, the
transition from public financing to the paid provision of social
services, and changes in the financing of social and economic
infrastructure have deepened the economic inequality of social groups,
including the growing stratification between city and village. A
sociological survey of urban and rural households in Almaty city and
in villages of Almaty region was carried out within the international
research project “Livelihoods Strategies of Private Households in
Central Asia: A Rural–Urban Comparison in Kazakhstan and Kyrgyzstan”
(Germany, Kazakhstan, Kyrgyzstan). The analysis of statistical data
and of the survey results allows us to reveal issues of territorial
development, to investigate the availability of medical, educational
and other services in the city and the village, to capture urban and
rural dwellers' evaluations of their living conditions, and to compare
the economic strategies of households in the city and the village.
Abstract: This paper compares H-ARQ techniques for OFDM systems using a new family of non-binary LDPC (NB-LDPC) codes developed within the EU FP7 DAVINCI project. Punctured NB-LDPC codes have been used in a simulated model of the transmission system. The link-level performance has been evaluated in terms of spectral efficiency, codeword error rate and average number of retransmissions. The NB-LDPC codes can be implemented easily and effectively with different retransmission methods, invoked when correct decoding of a codeword fails. Here the Optimal Symbol Selection method is proposed as a Chase Combining technique.
Abstract: Curriculum is one of the most important inputs in a higher education system, and evaluation is needed to identify its strong and weak points. The main purpose of this study was to survey the curriculum quality of the Insurance Management field at the University of Allameh Taba Tabaee, according to the viewpoints of students, alumni, employers and faculty members. Descriptive statistics (means, tables, percentages, frequency distributions) and inferential statistics (chi-square) were used to analyze the data. Six criteria were considered for curriculum quality: objectives, content, teaching and learning methods, space and facilities, time, and assessment of learning. The objectives and the teaching and learning methods criteria were at a desirable level, the content criterion was at an undesirable level, and space and facilities, time and assessment of learning were at a rather desirable level. Overall, the quality of the Insurance Management curriculum was at a relatively desirable level.
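The chi-square test of independence used in such survey analyses can be sketched as follows, on a made-up contingency table (the counts, groups and ratings here are hypothetical, purely for illustration):

```python
# Hedged illustration of a chi-square test on hypothetical survey counts
# (rows: respondent group, columns: rating of a criterion).
def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: students vs. alumni rating a criterion as
# desirable / rather desirable / undesirable.
observed = [[10, 20, 30],
            [25, 20, 15]]
stat = chi_square_statistic(observed)
df = (len(observed) - 1) * (len(observed[0]) - 1)  # 2 degrees of freedom
CRITICAL_5PCT = 5.991  # chi-square critical value for df = 2, alpha = 0.05
print(stat, stat > CRITICAL_5PCT)
```

When the statistic exceeds the critical value, the hypothesis that the rating is independent of the respondent group is rejected at the 5% level.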
Abstract: In this paper, we approximate the average run length
(ARL) for a CUSUM chart when the observations form an exponential
first-order moving average (EMA1) sequence. We use a Gauss-Legendre
numerical scheme within the integral equation (IE) method to
approximate ARL0 and ARL1, the in-control and out-of-control ARLs,
respectively. We compare the results of the IE method with the exact
solution and find that the two are in good agreement.
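The Gauss-Legendre (Nystrom) machinery behind the IE method can be sketched on a Fredholm integral equation of the second kind. The kernel below is a toy one with a known exact solution, not the CUSUM/EMA1 kernel of the paper:

```python
# Nystrom method: discretize the integral with Gauss-Legendre nodes and
# weights, then solve the resulting linear system for L at the nodes.
import numpy as np

def solve_fredholm(f, kernel, a, b, n):
    """Solve L(x) = f(x) + int_a^b kernel(x, y) L(y) dy on n Gauss nodes."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)   # map from [-1, 1] to [a, b]
    w = 0.5 * (b - a) * weights
    K = kernel(x[:, None], x[None, :])          # kernel on the node grid
    A = np.eye(n) - K * w[None, :]              # (I - K W) L = f
    return x, np.linalg.solve(A, f(x))

# Toy problem: L(x) = x + int_0^1 x*y*L(y) dy, exact solution L(x) = 1.5*x.
x, L = solve_fredholm(lambda t: t, lambda s, t: s * t, 0.0, 1.0, 8)
print(np.max(np.abs(L - 1.5 * x)))  # near machine precision
```

The ARL computation replaces the toy kernel with the transition kernel of the CUSUM statistic under the EMA1 model; the discretization step is identical.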
Abstract: In this paper a combined feature selection method is
proposed which takes advantage of sample domain filtering,
resampling and feature subset evaluation methods to reduce
dimensions of huge datasets and select reliable features. This method
utilizes both feature space and sample domain to improve the process
of feature selection and uses a combination of Chi squared with
Consistency attribute evaluation methods to seek reliable features.
This method consists of two phases. The first phase filters and
resamples the sample domain and the second phase adopts a hybrid
procedure to find the optimal feature space by applying Chi squared,
Consistency subset evaluation methods and genetic search.
Experiments on various sized datasets from UCI Repository of
Machine Learning databases show that the performance of five
classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First
Decision Tree and JRIP) improves simultaneously and the
classification error for these classifiers decreases considerably. The
experiments also show that this method outperforms other feature
selection methods.
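The role chi-squared scoring plays in such a pipeline can be sketched as follows (an assumed, simplified setup, not the paper's implementation): score each binary feature against a binary class with the chi-square statistic and keep the top-ranked features.

```python
# Illustrative chi-square feature scoring: a feature that tracks the class
# gets a high score; a class-independent feature scores near zero.
def chi2_score(feature, labels):
    """Chi-square statistic of one binary feature vs. a binary class."""
    n = len(labels)
    counts = {}
    for f, y in zip(feature, labels):
        counts[(f, y)] = counts.get((f, y), 0) + 1
    stat = 0.0
    for f in (0, 1):
        for y in (0, 1):
            row = sum(counts.get((f, c), 0) for c in (0, 1))
            col = sum(counts.get((r, y), 0) for r in (0, 1))
            expected = row * col / n
            if expected:
                stat += (counts.get((f, y), 0) - expected) ** 2 / expected
    return stat

labels     = [0, 0, 0, 0, 1, 1, 1, 1]
relevant   = [0, 0, 0, 1, 1, 1, 1, 1]  # tracks the class closely
irrelevant = [0, 1, 0, 1, 0, 1, 0, 1]  # independent of the class
scores = {"relevant": chi2_score(relevant, labels),
          "irrelevant": chi2_score(irrelevant, labels)}
print(max(scores, key=scores.get))  # the relevant feature ranks first
```

In the proposed method this ranking is combined with Consistency subset evaluation and genetic search rather than used alone.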
Abstract: Dealing with hundreds of features in character
recognition systems is not unusual. This large number of features
leads to the increase of computational workload of recognition
process. There have been many methods which try to remove
unnecessary or redundant features and reduce feature dimensionality.
Moreover, because of the characteristics of Farsi script, it is not
possible to apply algorithms developed for other languages to Farsi
directly. In this
paper some methods for feature subset selection using genetic
algorithms are applied on a Farsi optical character recognition (OCR)
system. Experimental results show that application of genetic
algorithms (GA) to feature subset selection in a Farsi OCR results in
lower computational complexity and enhanced recognition rate.
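A genetic algorithm for feature subset selection can be sketched as below. This is an assumed, minimal stand-in for the paper's GA: individuals are bit masks over the features, and the toy fitness (with an assumed set of informative features) rewards selecting informative features while penalizing subset size; a real OCR system would use classifier accuracy as the fitness instead.

```python
# Minimal GA sketch: truncation selection, one-point crossover, bit-flip
# mutation, over bit masks that encode feature subsets.
import random

random.seed(0)
N_FEATURES = 10
INFORMATIVE = {0, 3, 7}  # assumed ground truth for the toy fitness

def fitness(mask):
    hits = sum(1 for i in INFORMATIVE if mask[i])
    return hits - 0.1 * sum(mask)  # reward hits, penalize large subsets

def evolve(pop_size=30, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]               # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_FEATURES)   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate)
                     for bit in child]              # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([i for i, bit in enumerate(best) if bit])  # selected feature indices
```

Keeping the top half of the population each generation makes the search elitist, so the best subset found never degrades across generations.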
Abstract: Many firms have implemented initiatives such as outsourced manufacturing that can make a supply chain (SC) more vulnerable to various types of disruptions, so managing risk has become a critical component of SC management. Different SC vulnerability management methodologies have been proposed for managing SC risk, but most offer only point-based solutions that deal with a limited set of risks. This research aims to reinforce SC risk management by proposing an integrated approach. SC risks are identified and a risk index classification structure is created. We then develop a SC risk assessment approach based on the analytic network process (ANP) and the VIKOR methods under a fuzzy environment, where vagueness and subjectivity are handled with linguistic terms parameterized by triangular fuzzy numbers. Using FANP, risk weights are calculated and then fed into FVIKOR to rank the SC members and find the most risky partner.
Abstract: Clustering techniques have been used by many intelligent software agents to group similar access patterns of Web users into high-level themes which express users' intentions and interests. However, such techniques have mostly focused on one salient feature of the Web documents visited by the user, namely the extracted keywords. The major aim of these techniques is to come up with an optimal threshold for the number of keywords needed to produce more focused themes. In this paper we consider both keyword and similarity thresholds to generate more concentrated themes, and hence build a sounder model of user behavior. The purpose of this paper is twofold: to use distance-based clustering methods to recognize overall themes from the proxy log file, and to suggest efficient cut-off levels for the keyword and similarity thresholds that tend to produce clusters with better focus and more efficient size.
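The effect of a similarity threshold on theme formation can be sketched with a single-pass "leader" clustering over keyword sets (the pages, keywords and threshold value here are illustrative, not the paper's data or settings):

```python
# Hedged sketch: each visited page is its keyword set; a page joins the
# first cluster whose leader is similar enough, else it starts a new cluster.
def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

def threshold_cluster(pages, sim_threshold=0.5):
    """Single-pass leader clustering: returns a list of (leader, members)."""
    clusters = []
    for keywords in pages:
        for leader, members in clusters:
            if jaccard(keywords, leader) >= sim_threshold:
                members.append(keywords)
                break
        else:
            clusters.append((keywords, [keywords]))
    return clusters

pages = [{"python", "tutorial"}, {"python", "tutorial", "beginner"},
         {"football", "scores"}, {"football", "scores", "league"}]
themes = threshold_cluster(pages)
print(len(themes))  # two themes: programming pages and football pages
```

Raising the similarity threshold splits themes into smaller, more focused clusters; lowering it merges them, which is exactly the trade-off the cut-off levels tune.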
Abstract: In this paper, collocation-based cubic B-spline and
extended cubic uniform B-spline methods are considered for
solving the one-dimensional heat equation with a nonlocal initial
condition. A finite difference scheme and a θ-weighted scheme are
used for time and space discretization, respectively. The stability
of the method is analyzed by the Von Neumann method. The accuracy of
the methods is illustrated with an example. The numerical results
are obtained and compared with the analytical solutions.
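The θ-weighted time stepping can be sketched with a plain finite-difference scheme for u_t = u_xx (a simplified analogue of the B-spline collocation method in the paper; θ = 0.5 gives Crank-Nicolson). The example uses Dirichlet boundaries and initial condition sin(πx), whose exact solution is sin(πx)·exp(-π²t):

```python
# Theta-weighted finite-difference scheme on the interior nodes:
# (I - theta*r*D) u^{n+1} = (I + (1-theta)*r*D) u^n,  r = dt/dx^2.
import numpy as np

def theta_scheme(n=50, steps=200, t_end=0.1, theta=0.5):
    x = np.linspace(0.0, 1.0, n + 1)
    dx, dt = x[1] - x[0], t_end / steps
    r = dt / dx**2
    m = n - 1                                    # interior points
    # Tridiagonal second-difference matrix on the interior nodes.
    D = (np.diag(-2.0 * np.ones(m)) + np.diag(np.ones(m - 1), 1)
         + np.diag(np.ones(m - 1), -1))
    A = np.eye(m) - theta * r * D                # implicit part
    B = np.eye(m) + (1.0 - theta) * r * D        # explicit part
    u = np.sin(np.pi * x[1:-1])
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)
    return x[1:-1], u

x, u = theta_scheme()
exact = np.sin(np.pi * x) * np.exp(-np.pi**2 * 0.1)
print(np.max(np.abs(u - exact)))  # small discretization error
```

The B-spline collocation methods of the paper replace the second-difference matrix with one assembled from B-spline basis derivatives, but the time-stepping structure is the same.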
Abstract: During the last few decades in the academic field, the
debate has increased on the effects of social geography on the
opportunities of socioeconomic integration. On one hand, it has been
discussed how the contents of the urban structure and social
geography affect not only the way people interact, but also their
chances of social and economic integration. On the other hand, it has
also been discussed how the urban structure is itself constrained and
transformed by the action of social actors. Without questioning the
powerful influence of structural factors, related to the logic of the
production system, labor markets, education and training, the
research has shown the role played by place of residence in shaping
individual outcomes such as unemployment. In the context of this
debate the importance of territory of residence with respect to the
problem of unemployment has been highlighted.
Although statistics of unemployment have already demonstrated
the unequal incidence of the phenomenon in social groups, the issue
of uneven territorial impact on the phenomenon at intra-urban level
remains relatively unknown.
The purpose of this article is to show and interpret the spatial
patterns of unemployment in the city of Porto using Geographic
Information System (GIS) technology. Under this analysis, overlaying
the spatial patterns of unemployment with the spatial distribution of
social housing allows a discussion of the relationship between these
patterns and of the reasons that might explain the relative
immutability of socioeconomic problems in some neighborhoods.
Abstract: Their batch nature limits standard kernel principal component analysis (KPCA) methods in numerous applications, especially for dynamic or large-scale data. In this paper, an efficient adaptive approach is presented for online extraction of the kernel principal components (KPC). The contribution of this paper may be divided into two parts. First, the kernel covariance matrix is correctly updated to adapt to the changing characteristics of the data. Second, the KPC are recursively formulated to overcome the batch nature of standard KPCA. This formulation is derived from the recursive eigen-decomposition of the kernel covariance matrix and indicates the KPC variation caused by the new data. The proposed method not only alleviates the sub-optimality of the KPCA method for non-stationary data, but also maintains constant update speed and memory usage as the data size increases. Experiments on simulated data and real applications demonstrate that our approach yields improvements in terms of both computational speed and approximation accuracy.
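The recursive-update idea can be sketched in its linear-PCA form (a hedged analogue, not the paper's kernel formulation): the covariance matrix is updated with a forgetting factor as each sample arrives, and the principal components are re-extracted from it. The kernel version applies the same recursion to the kernel covariance.

```python
# Linear-PCA analogue of recursive covariance tracking: the forgetting
# factor discounts old samples so the components adapt to non-stationarity.
import numpy as np

rng = np.random.default_rng(0)

def stream_pca(samples, forget=0.99):
    d = samples.shape[1]
    cov = np.zeros((d, d))
    for x in samples:
        cov = forget * cov + (1.0 - forget) * np.outer(x, x)  # recursion
    vals, vecs = np.linalg.eigh(cov)          # ascending eigenvalues
    return vecs[:, ::-1], vals[::-1]          # principal components first

# Synthetic stream whose dominant direction is (1, 1)/sqrt(2).
direction = np.array([1.0, 1.0]) / np.sqrt(2.0)
samples = (rng.normal(size=(2000, 1)) * direction
           + 0.05 * rng.normal(size=(2000, 2)))
components, variances = stream_pca(samples)
print(abs(components[:, 0] @ direction))  # close to 1: direction recovered
```

Per-sample cost and memory depend only on the dimension, not on the number of samples seen, which is the constant-cost property the abstract claims for the kernel case.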
Abstract: Collateralized Debt Obligations are not as widely used
nowadays as they were before the 2007 subprime crisis. Nonetheless
there remains an enthralling challenge to optimize cash flows
associated with synthetic CDOs. A Gaussian-based model is used
here in which default correlation and unconditional probabilities of
default are highlighted. Then numerous simulations are performed
based on this model for different scenarios in order to evaluate the
associated cash flows given a specific number of defaults at different
periods of time. Cash flows are not solely calculated on a single
bought or sold tranche but rather on a combination of bought and
sold tranches. With some assumptions, the simplex algorithm gives
a way to find the maximum cash flow according to correlation of
defaults and maturities. The Gaussian model used is not realistic in
crisis situations. Moreover, the present system does not handle buying
or selling a portion of a tranche, only the whole tranche. Nevertheless,
the work provides the investor with relevant guidance on what to buy
and sell, and when.
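The one-factor Gaussian machinery underlying such simulations can be sketched as follows (parameters are illustrative): conditional on the common factor Z, each name defaults independently with probability Φ((Φ⁻¹(p) − √ρ·Z)/√(1−ρ)), where p is the unconditional default probability and ρ the default correlation.

```python
# Sketch of the one-factor Gaussian model: conditional default probability
# given the common factor, using only the standard library.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p, lo=-10.0, hi=10.0):
    """Inverse normal CDF by bisection (sufficient for a sketch)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def conditional_default_prob(p, rho, z):
    """Default probability of one name given common factor value z."""
    threshold = norm_ppf(p)
    return norm_cdf((threshold - math.sqrt(rho) * z) / math.sqrt(1.0 - rho))

# Unconditional 5% default probability, 30% default correlation:
# a bad state of the economy (z = -2) sharply raises the default probability.
print(conditional_default_prob(0.05, 0.3, -2.0))
print(conditional_default_prob(0.05, 0.3, 2.0))
```

Simulating Z and the conditional defaults per period yields the default counts from which tranche cash flows are computed; the simplex step then optimizes over combinations of bought and sold tranches.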
Abstract: Purpose: Planning and dosimetry of different VMAT algorithms (SmartArc, Ergo++, Autobeam) is compared with IMRT for Head and Neck Cancer patients. Modelling was performed to rule out the causes of discrepancies between planned and delivered dose. Methods: Five HNC patients previously treated with IMRT were re-planned with SmartArc (SA), Ergo++ and Autobeam. Plans were compared with each other and against IMRT and evaluated using DVHs for PTVs and OARs, delivery time, monitor units (MU) and dosimetric accuracy. Modelling of control point (CP) spacing, Leaf-end Separation and MLC/Aperture shape was performed to rule out causes of discrepancies between planned and delivered doses. Additionally estimated arc delivery times, overall plan generation times and effect of CP spacing and number of arcs on plan generation times were recorded. Results: Single arc SmartArc plans (SA4d) were generally better than IMRT and double arc plans (SA2Arcs) in terms of homogeneity and target coverage. Double arc plans seemed to have a positive role in achieving improved Conformity Index (CI) and better sparing of some Organs at Risk (OARs) compared to Step and Shoot IMRT (ss-IMRT) and SA4d. Overall Ergo++ plans achieved best CI for both PTVs. Dosimetric validation of all VMAT plans without modelling was found to be lower than ss-IMRT. Total MUs required for delivery were on average 19%, 30%, 10.6% and 6.5% lower than ss-IMRT for SA4d, SA2d (Single arc with 20 Gantry Spacing), SA2Arcs and Autobeam plans respectively. Autobeam was most efficient in terms of actual treatment delivery times whereas Ergo++ plans took longest to deliver. Conclusion: Overall SA single arc plans on average achieved best target coverage and homogeneity for both PTVs. SA2Arc plans showed improved CI and some OARs sparing. Very good dosimetric results were achieved with modelling. Ergo++ plans achieved best CI. Autobeam resulted in fastest treatment delivery times.
Abstract: In recent years, asymmetric cross-section aluminum
alloy stock has been finding increasing use in various industrial manufacturing areas such as general structures and automotive
components. In these areas, components are generally required to have
complex curved configuration and, as such, a bending process is required during manufacture. Undesirable deformation in bending
processes such as flattening or wrinkling can easily occur when thin-walled sections are bent. Hence, a thorough understanding of the
bending behavior of such sections is needed to prevent these undesirable deformations. In this study, the bending behavior of
asymmetric channel sections was examined using finite element analysis (FEA). Typical methods of preventing undesirable
deformation, such as asymmetric laminated elastic mandrels, were included in the FEA model of draw bending. Additionally, axial tension
was applied to prevent wrinkling. By utilizing the FE simulations, the effect of restriction dies and axial tension on undesirable deformation during the process was clarified.
Abstract: This paper presents a new configurable decimation
filter for sigma-delta modulators. The filter employs Pascal's
triangle theorem for building the coefficients of non-recursive
decimation filters. The filter can be connected to the back-end of
various modulators with different output accuracy. In this work two
methods are shown and then compared from the viewpoint of area
occupation. The first method uses memory and the second one
employs the Pascal's triangle method, aiming to reduce the required
gates. XILINX ISE v10 is used for implementation and verification of
the filter.
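How Pascal's triangle generates such coefficients can be sketched as follows: a K-th order non-recursive (comb/sinc^K) decimation filter for decimation factor D has transfer function (1 + z⁻¹ + … + z⁻⁽ᴰ⁻¹⁾)^K, and for D = 2 its coefficients are exactly row K of Pascal's triangle. The function below is a generic illustration, not the paper's hardware implementation:

```python
# Coefficients of (1 + z^-1 + ... + z^-(D-1))^order via repeated
# polynomial multiplication (convolution) with the base comb.
def comb_coefficients(order, decimation):
    """Impulse response of the order-th power of a length-D comb."""
    base = [1] * decimation
    coeffs = [1]
    for _ in range(order):
        out = [0] * (len(coeffs) + len(base) - 1)
        for i, c in enumerate(coeffs):
            for j, b in enumerate(base):
                out[i + j] += c * b
        coeffs = out
    return coeffs

print(comb_coefficients(2, 2))  # [1, 2, 1] -- row 2 of Pascal's triangle
print(comb_coefficients(3, 2))  # [1, 3, 3, 1]
print(comb_coefficients(2, 4))  # comb filter for decimation by 4
```

Generating coefficients combinatorially like this is what lets the hardware avoid storing them in memory, which is the area saving the paper compares.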
Abstract: This study first briefly presents the current situation: a vast gap exists between the Chinese and Japanese seismic design specifications for bridge pile foundations in liquefiable and liquefaction-induced laterally spreading ground. The Chinese and Japanese seismic design methods and technical details for bridge pile foundations in liquefying and laterally spreading ground are then described and compared systematically and comprehensively; in particular, the methods for determining the coefficient of subgrade reaction and its reduction factor, as well as the mode of computing the force applied to the pile foundation by liquefaction-induced laterally spreading soil in the Japanese specification, are introduced. The comparison indicates that the Chinese seismic design specification for bridge pile foundations in such ground, which presents only some qualitative items, is too general and lacks systematicness and operability. Finally, some defects of the Chinese seismic design specification are summarized, so improvement and revision of the specification in this field is imperative for China; the key problems of the current Chinese specifications are generalized and the corresponding improvement suggestions are proposed.
Abstract: In the literature, many studies have proposed various
methods to reduce the PAPR (Peak-to-Average Power Ratio). Among
those, DSI (Dummy Sequence Insertion) is one of the most attractive
methods for WiMAX systems because it does not require side
information to be transmitted along with the user data. However, the
conventional DSI methods find the dummy sequence by performing an
iterative procedure until the PAPR falls under a desired threshold.
This causes a significant delay in finding the dummy sequence and also
affects the overall performance of WiMAX systems. In this paper, a
new method based on DSI is proposed which finds the dummy sequence
without the need for an iterative procedure. The fast DSI method can
reduce the PAPR without either delays or required side information.
The simulation results confirm that the proposed method achieves PAPR
performance similar to the other methods without any delays. In
addition, simulations of a WiMAX system with adaptive modulation are
also carried out to investigate the use of the proposed method under
various fading schemes. The results suggest that WiMAX designers
should adopt new Signal-to-Noise Ratio (SNR) criteria for adaptation.
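The PAPR quantity that DSI methods minimize can be sketched as follows (illustrative OFDM parameters, not the WiMAX configuration of the paper): PAPR is the peak instantaneous power of the time-domain OFDM symbol divided by its average power.

```python
# PAPR of one OFDM symbol: oversample the time-domain signal by
# zero-padding the spectrum, then take peak power over mean power.
import numpy as np

rng = np.random.default_rng(1)

def papr_db(subcarrier_symbols, oversample=4):
    """PAPR in dB of one OFDM symbol given its frequency-domain symbols."""
    n = len(subcarrier_symbols)
    padded = np.zeros(n * oversample, dtype=complex)
    padded[:n] = subcarrier_symbols          # zero-padding oversamples in time
    time_signal = np.fft.ifft(padded)
    power = np.abs(time_signal) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

# Random QPSK data on 64 subcarriers.
qpsk = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
print(papr_db(qpsk))  # typically several dB above 0
```

A DSI scheme appends a dummy sequence to the data so that this value drops below a threshold; the contribution of the paper is choosing that sequence directly instead of iterating.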
Abstract: Multimedia courseware has been accepted as a tool
that can support the teaching and learning process. The 'Li2D'
courseware was developed to assist students' visualization of the
topic of Loci in Two Dimensions. This paper describes an evaluation of
the
effectiveness and usability of a 'Li2D' courseware. The quasi
experiment was used for the effectiveness evaluation. Usability
evaluation was accomplished based on four constructs of usability,
namely: efficiency, learnability, screen design and satisfaction. An
evaluation of the multimedia elements was also conducted. A total of
63 Form Two students were involved in the study. The students were
divided into two groups: control and experimental. The experimental
group had to interact with 'Li2D' courseware as part of the learning
activities while the control group used the conventional learning
methods. The results indicate that the experimental group performed
better than the control group in understanding the Loci in Two
Dimensions topic. In terms of usability, the results showed that the
students agreed on the usability of the multimedia elements in the
'Li2D' courseware.
Abstract: This paper focuses on the quadratic stabilization problem for a class of uncertain impulsive switched systems. The uncertainty is assumed to be norm-bounded and enters both the state and the input matrices. Based on Lyapunov methods, some results on robust stabilization and quadratic stabilization for the impulsive switched system are obtained. A stabilizing state feedback control law realizing the robust stabilization of the closed-loop system is constructed.
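The Lyapunov machinery behind such stabilization results can be illustrated for a single linear subsystem (a hedged sketch, not the full uncertain impulsive switched case): A is quadratically stable if and only if AᵀP + PA = −Q has a positive definite solution P for some Q > 0.

```python
# Solve the Lyapunov equation A^T P + P A = -Q via the Kronecker-product
# formulation, then check that P is positive definite.
import numpy as np

def solve_lyapunov(A, Q):
    """Solve A^T P + P A = -Q as a linear system in the entries of P."""
    n = A.shape[0]
    M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
    p = np.linalg.solve(M, -Q.flatten())
    return p.reshape(n, n)

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # Hurwitz: eigenvalues -1 and -2
P = solve_lyapunov(A, np.eye(2))
eigs = np.linalg.eigvalsh((P + P.T) / 2)
print(eigs.min() > 0)                 # True: P > 0 certifies stability
```

For the switched and uncertain setting of the paper, a common P (or a family of matrices with impulse conditions) must satisfy such inequalities for all subsystems and all admissible uncertainties, which is what the quadratic stabilization results establish.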