Supporting QoS-aware Multicasting in Differentiated Service Networks

Scalable, QoS-aware multicast deployment in DiffServ networks has become an important research direction in recent years. Although multicasting and differentiated services are complementary technologies, integrating the two is a non-trivial task due to architectural conflicts between them. A popular solution is to extend the functionality of the DiffServ components to support multicasting. In this paper, we propose an algorithm to construct an efficient QoS-driven multicast tree, taking into account the available bandwidth per service class. We also present an efficient way to provision the limited available bandwidth to support heterogeneous users. The proposed mechanism is evaluated using simulated tests. The simulation results reveal that our algorithm can effectively minimize bandwidth use and transmission cost.

Person Identification using Gait by Combined Features of Width and Shape of the Binary Silhouette

Current image-based individual human recognition methods, such as fingerprint, face, or iris biometric modalities, generally require a cooperative subject, views from certain angles, and physical contact or close proximity. These methods cannot reliably recognize non-cooperating individuals at a distance in the real world under changing environmental conditions. Gait, which concerns recognizing individuals by the way they walk, is a relatively new biometric without these disadvantages. The inherent gait characteristic of an individual makes it irreplaceable and useful in visual surveillance. In this paper, an efficient gait recognition system for human identification is proposed, based on two features: the width vector of the binary silhouette and the MPEG-7 region-based shape descriptors. In the proposed method, foreground objects (humans and other moving objects) are extracted by estimating background information with a Gaussian Mixture Model (GMM), and a median filtering operation is subsequently performed to remove noise from the background-subtracted image. A moving-target classification algorithm that uses shape and boundary information separates human beings (i.e., pedestrians) from other foreground objects (e.g., vehicles). The width vector of the outer contour of the binary silhouette and the MPEG-7 Angular Radial Transform coefficients are then taken as the feature vector, and Principal Component Analysis (PCA) is applied to reduce its dimensionality. The resulting feature vectors are used to train a Hidden Markov Model (HMM) for individual identification. The proposed system is evaluated on gait sequences, and the experimental results show the efficacy of the proposed algorithm.
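
As a rough illustration of the pipeline summarized above, the sketch below chains GMM background subtraction, median filtering, a per-row width vector, PCA, and a per-subject HMM. It assumes OpenCV, scikit-learn, and hmmlearn; the parameter values and helper names are illustrative assumptions, not the authors' implementation (the ART shape coefficients are omitted here).

```python
import cv2
import numpy as np
from sklearn.decomposition import PCA
from hmmlearn import hmm

# GMM-based background subtraction followed by median filtering
# (parameter values are illustrative).
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

def silhouette(frame):
    """Return a denoised binary silhouette for one video frame."""
    fg = bg_subtractor.apply(frame)
    fg = cv2.medianBlur(fg, 5)              # suppress salt-and-pepper noise
    _, binary = cv2.threshold(fg, 127, 255, cv2.THRESH_BINARY)
    return binary

def width_vector(binary, rows=64):
    """Width of the silhouette in each row (left-to-right extent)."""
    binary = cv2.resize(binary, (64, rows))
    widths = []
    for r in binary:
        cols = np.flatnonzero(r)
        widths.append(cols[-1] - cols[0] + 1 if cols.size else 0)
    return np.array(widths, dtype=float)

def train_subject_model(frames, n_components=5, n_states=5):
    """PCA-reduced width-vector sequence used to train one HMM per subject."""
    feats = np.vstack([width_vector(silhouette(f)) for f in frames])
    pca = PCA(n_components=n_components).fit(feats)
    model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(pca.transform(feats))
    return pca, model
```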

Tobephobia: Teachers' Ineptitude to Manage Curriculum Change

In this paper, Tobephobia (TBP) refers to the fear of failure experienced by teachers in managing curriculum change. TBP is an emerging concept, and it extends the boundaries of research in terms of how we view achievement and failure in education. Outcomes-based education (OBE) was introduced fifteen years ago in South African schools without simultaneously upgrading teachers' professional competencies. This exploratory research therefore examines a simple question: What is the impact of TBP and OBE on teachers? Teacher ineptitude in coping with the OBE curriculum in the classroom is a serious problem affecting large numbers of South African teachers. This exploratory study sought to determine the perceived negative impact of OBE and TBP on teachers. A survey was conducted among 311 teachers in Port Elizabeth and Durban, South Africa. The results confirm the very negative impact of TBP and OBE on teachers, and the study authenticates the existence of TBP.

Beneficial Use of Coal Combustion By-products in the Rehabilitation of Failed Asphalt Pavements

This study demonstrates the use of Class F fly ash in combination with lime or lime kiln dust in the full depth reclamation (FDR) of asphalt pavements. FDR, in the context of this paper, is a process of pulverizing a predetermined amount of flexible pavement that is structurally deficient, blending it with chemical additives and water, and compacting it in place to construct a new stabilized base course. Test sections of two structurally deficient asphalt pavements were reclaimed using Class F fly ash in combination with lime and lime kiln dust (LKD). In addition, control sections were constructed using cement, cement and emulsion, lime kiln dust and emulsion, and mill and fill. The service performance and structural behavior of the FDR pavement test sections were monitored to determine how the fly ash sections compared to other, more traditional pavement rehabilitation techniques. Service performance and structural behavior were determined with the use of sensors embedded in the road and Falling Weight Deflectometer (FWD) tests. Results of the FWD tests conducted up to two years after reclamation show that the cement, fly ash+LKD, and fly ash+lime sections exhibited two-year resilient modulus values comparable to open-graded cement-stabilized aggregates (more than 750 ksi). The cement treatment resulted in a significant increase in resilient modulus within 3 weeks of construction; beyond this curing time, the stiffness increase was slow. The fly ash+LKD and fly ash+lime test sections, on the other hand, showed a slower short-term increase in stiffness; their average resilient modulus values two years after construction were in excess of 800 ksi. Additional longer-term testing data will become available from ongoing pavement performance and environmental condition data collection at the two pavement sites.

Gene Selection Guided by Feature Interdependence

Cancers are typically marked by a number of differentially expressed genes, which show enormous potential as biomarkers for a given disease. In recent years, cancer classification based on the investigation of gene expression profiles derived from high-throughput microarrays has been widely used. The selection of discriminative genes is therefore an essential preprocessing step in carcinogenesis studies. In this paper, we propose a novel gene selector using information-theoretic measures for biological discovery. This multivariate filter is a four-stage framework comprising analyses of feature relevance, feature interdependence, feature redundancy-dependence, and subset rankings, and it has been examined on the colon cancer data set. Our experimental results show that the proposed method outperforms other information-theoretic filters in terms of both classification error and classification performance.
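
To make the relevance/redundancy idea concrete, here is a minimal greedy filter in the mRMR spirit; it is an illustrative sketch, not the authors' four-stage procedure, and it assumes scikit-learn is available and that expression values passed to the redundancy term are discretized.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score

def select_genes(X, y, k=20, redundancy_weight=1.0):
    """Greedy information-theoretic filter: maximize relevance to the class
    label while penalizing redundancy with already selected genes.
    X: (samples x genes) discretized expression matrix, y: class labels."""
    relevance = mutual_info_classif(X, y)             # relevance of each gene
    selected = [int(np.argmax(relevance))]
    candidates = set(range(X.shape[1])) - set(selected)
    while len(selected) < k and candidates:
        best, best_score = None, -np.inf
        for g in candidates:
            # redundancy = mean mutual information with already chosen genes
            red = np.mean([mutual_info_score(X[:, g], X[:, s]) for s in selected])
            score = relevance[g] - redundancy_weight * red
            if score > best_score:
                best, best_score = g, score
        selected.append(best)
        candidates.remove(best)
    return selected
```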

The Effects of the Impact of Instructional Immediacy on Cognition and Learning in Online Classes

Current research has explored the impact of instructional immediacy, defined as those behaviors that help build close relationships or feelings of closeness, on both cognition and motivation in the traditional and online classroom; however, online courses continue to suffer from higher dropout rates. Based on Albert Bandura's Social Cognitive Theory, four primary relationships or interactions in an online course are explored in light of how they can provide immediacy, thereby reducing student attrition and improving cognitive learning. The four relationships are teacher-student, student-student, student-content, and student-computer. Results of a study conducted with in-service teachers completing a 14-week online professional development technology course are examined to demonstrate immediacy strategies that improve cognitive learning and reduce student attrition. The results reveal that students can be motivated through various interactions and instructional immediacy behaviors, which lead to higher completion rates, improved self-efficacy, and cognitive learning.

Brain Drain of Doctors: Causes and Consequences in Pakistan

Pakistani doctors (MBBS) are emigrating to developed countries for professional advancement. This study aims to highlight the causes and consequences of the doctors' brain drain from Pakistan. Primary data were collected at Mayo Hospital, Lahore by interviewing doctors (n=100) selected through a systematic random sampling technique. The study found that various socio-economic and political conditions act as push and pull factors for the brain drain of doctors in Pakistan. A majority of doctors (83%) cited poor remuneration and the professional infrastructure of the health department as push factors; 81% claimed that continuous political instability and threats of terrorism are responsible for the emigration of doctors; and 84% of respondents considered limited opportunities for further studies responsible for their emigration. The brain drain of doctors is adversely affecting the health sector's policies and programs, standard doctor-patient ratios, and the quality of health services.

A Proposal of an Automatic Formatting Method for Transforming XML Data

PPX (Pretty Printer for XML) is a query language that offers a concise way of formatting XML data into HTML. In this paper, we propose a simple formatting specification that combines automatic layout operators and variables in the layout expression of the GENERATE clause of PPX. This method can automatically format irregular XML data contained within a part of an XML document, using layout decision rules derived from the DTD. In our experiments, a quick comparison shows that PPX requires far less description than XSLT or XQuery programs performing the same tasks.

Using Ontology Search in the Design of Class Diagram from Business Process Model

A business process model describes the process flow of a business and can be seen as the requirement for developing a software application. This paper discusses a BPM2CD guideline, which complements the Model Driven Architecture concept by suggesting how to create a platform-independent software model, in the form of a UML class diagram, from a business process model. An important step is the identification of UML classes from the business process model. A technique for object-oriented analysis called domain analysis is borrowed, and key concepts in the business process model are discovered and proposed as candidate classes for the class diagram. The paper enhances this step by using ontology search to help identify important classes for the business domain. As an ontology is a source of knowledge for a particular domain that can itself link to ontologies of related domains, the search can give a refined set of candidate classes for the resulting class diagram.

Constitutional Complaint as an Instrument of Fulfilling Workers' Rights in the Croatian Legal System

This paper begins with a formal definition of human rights and freedoms; the basic document regarding this subject is undoubtedly the French Declaration of the Rights of Man and of the Citizen from 1789. The paper then analyses the legal sources relevant to workers' rights in the legal system of the Republic of Croatia, namely international treaties and the Labour Act, which is the principal statute regarding workers' rights. The authors also deal with the Constitutional Court of the Republic of Croatia and its position in the judicial system of the Republic of Croatia, as well as with the specifics of the constitutional complaint. The crucial part of the paper is based on research conducted with the aim of determining the implementation of the rights and liberties guaranteed by Articles 54 and 55 of the Constitution of the Republic of Croatia by means of the constitutional complaint.

Learning to Recognize Faces by Local Feature Design and Selection

Studies in neuroscience suggest that both global and local feature information are crucial for the perception and recognition of faces. It is widely believed that local features are less sensitive to variations caused by illumination and expression. In this paper, we focus on designing and learning local features for face recognition. We design three types of local features: semi-global features, local patch features, and tangent shape features. The semi-global feature is designed to take advantage of global-like information while avoiding suppressing the AdaBoost algorithm when boosting weak classifiers built from small local patches. The local patch feature is designed for automatic selection of discriminative features, and thus differs from traditional approaches in which local patches are usually selected manually to cover the salient facial components. A shape feature is also considered for frontal-view face recognition. These features are selected and combined within a boosting framework and a cascade structure. The experimental results demonstrate that the proposed approach outperforms the standard eigenface method and the Bayesian method. Moreover, the selected local features and the observations from the experiments are informative for research on local feature design in face recognition.
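
As a generic illustration of boosting-based selection of local features (not the semi-global, patch, or tangent shape descriptors designed in the paper), the following sketch boosts decision stumps over simple patch statistics and keeps the patches the ensemble weights most heavily; all names and parameters are assumptions.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def patch_features(image, patch=8):
    """Mean intensity of each non-overlapping local patch (a crude stand-in
    for the richer local descriptors discussed in the paper)."""
    h, w = image.shape
    feats = [image[r:r + patch, c:c + patch].mean()
             for r in range(0, h - patch + 1, patch)
             for c in range(0, w - patch + 1, patch)]
    return np.array(feats)

def select_discriminative_patches(images, labels, top_k=50):
    """Boost decision stumps (the default base learner) over patch features
    and return the indices of the most useful patches."""
    X = np.vstack([patch_features(im) for im in images])
    booster = AdaBoostClassifier(n_estimators=200)
    booster.fit(X, labels)
    return np.argsort(booster.feature_importances_)[::-1][:top_k]
```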

Control Chart Pattern Recognition Using Wavelet Based Neural Networks

Control chart pattern recognition is one of the most important tools for identifying the process state in statistical process control. An abnormal process state can be detected by recognizing unnatural patterns that arise from assignable causes. In this study, a wavelet-based neural network approach is proposed for the recognition of control chart patterns with various characteristics. The proposed control chart pattern recognizer comprises three stages. First, multi-resolution wavelet analysis is used to generate time-shape and time-frequency coefficients that carry detailed information about the patterns. Second, distance-based features are extracted by a bi-directional Kohonen network to obtain reduced and robust information. Third, a back-propagation network classifier is trained on these features. The accuracy of the proposed method is demonstrated by a performance evaluation with numerical results.
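
A minimal sketch of the first and third stages is shown below, assuming PyWavelets and scikit-learn; the bi-directional Kohonen stage is omitted and a plain MLP stands in for the back-propagation classifier, so this is an illustration of the overall idea rather than the paper's recognizer.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_features(series, wavelet="db4", level=3):
    """Multi-resolution decomposition of one control chart sample:
    approximation and detail coefficients concatenated into a feature vector."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    return np.concatenate(coeffs)

def train_recognizer(samples, pattern_labels):
    """Train a back-propagation (MLP) classifier directly on the wavelet
    coefficients; the paper first condenses them with a Kohonen network."""
    X = np.vstack([wavelet_features(s) for s in samples])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
    clf.fit(X, pattern_labels)
    return clf
```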

Specialized Web Robot for Objectionable Web Content Classification

This paper proposes a specialized Web robot that automatically collects objectionable Web content for use in an objectionable Web content classification system, which maintains a URL database of objectionable Web content. The robot aims to shorten the update period of the database, increase the number of URLs it contains, and enhance the accuracy of its information.

Fuzzy Relatives of the CLARANS Algorithm With Application to Text Clustering

This paper introduces new algorithms for fuzzy clustering of relational data: a fuzzy relative of the CLARANS algorithm (FCLARANS) and Fuzzy c-Medoids based on randomized search (FCMRANS). Unlike the existing fuzzy c-medoids algorithm (FCMdd), in which the within-cluster dissimilarity of each cluster is minimized in each iteration by recomputing new medoids given the current memberships, FCLARANS minimizes the same objective function by changing the current medoids in such a way that the sum of the within-cluster dissimilarities is minimized. Computing new medoids may be affected by noise, because outliers may enter the computation of medoids, whereas the choice of medoids in FCLARANS is dictated by the location of a predominant fraction of points inside a cluster and is therefore less sensitive to the presence of outliers. In FCMRANS, the step of computing new medoids in FCMdd is modified to be based on randomized search. Furthermore, a new initialization procedure is developed that adds randomness to the initialization used with FCMdd. Both FCLARANS and FCMRANS are compared with the robust and linearized version of fuzzy c-medoids (RFCMdd). Experimental results on different samples of the Reuters-21578 and 20 Newsgroups (20NG) collections and on generated datasets with noise show that FCLARANS is more robust than both RFCMdd and FCMRANS. Finally, both FCMRANS and FCLARANS are more efficient than RFCMdd, and their outputs are almost the same in terms of classification rate.
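
The following simplified sketch conveys the randomized-search idea behind FCLARANS for relational data: random medoid swaps are accepted only if they lower the fuzzy within-cluster dissimilarity. It is an assumption-laden illustration (standard FCMdd-style memberships, no neighbor-count or restart logic), not the authors' algorithm.

```python
import numpy as np

def fuzzy_memberships(D, medoids, m=2.0):
    """Fuzzy memberships of each object to each medoid, computed from a
    relational dissimilarity matrix D (FCMdd-style update)."""
    d = D[:, medoids] + 1e-12
    ratio = (d[:, :, None] / d[:, None, :]) ** (1.0 / (m - 1.0))
    return 1.0 / ratio.sum(axis=2)

def objective(D, medoids, U, m=2.0):
    """Sum of membership-weighted within-cluster dissimilarities."""
    return float(((U ** m) * D[:, medoids]).sum())

def fclarans_like(D, c, max_neighbors=50, m=2.0, seed=None):
    """Randomized medoid search: try random single-medoid swaps and keep
    those that reduce the fuzzy objective."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = list(rng.choice(n, size=c, replace=False))
    U = fuzzy_memberships(D, medoids, m)
    best = objective(D, medoids, U, m)
    for _ in range(max_neighbors):
        i = int(rng.integers(c))
        cand = int(rng.integers(n))
        if cand in medoids:
            continue
        trial = medoids.copy()
        trial[i] = cand
        U_trial = fuzzy_memberships(D, trial, m)
        cost = objective(D, trial, U_trial, m)
        if cost < best:
            medoids, U, best = trial, U_trial, cost
    return medoids, U
```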

Analysis of a Mathematical Model for Dengue Disease in Pregnant Cases

Dengue fever is an important human arboviral disease, and outbreaks are now reported quite often from many parts of the world. The number of cases involving pregnant women and infants is increasing every year. The illness is often severe, complications may occur, and deaths often result from difficulties in early diagnosis and improper management of the disease. Dengue antibodies from pregnant women are passed on to their infants, and this protects the infants from dengue infection; the antibodies are transferred from the mother to the fetus while it is still in the womb. In this study, we formulate a mathematical model to describe the transmission of this disease in pregnant women. The model is formulated by dividing the human population into pregnant women and non-pregnant humans (men and non-pregnant women). Each class is subdivided into susceptible (S), infectious (I) and recovered (R) subclasses. We apply standard dynamical analysis to our model: conditions for the local stability of the equilibrium points are given, numerical simulations are shown, and the bifurcation diagrams of the model are discussed. The control of this disease in pregnant women is discussed in terms of the threshold conditions.
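
For orientation, a generic SIR-type vector-host formulation with the human population split into pregnant (p) and non-pregnant (n) classes might look as follows; the structure and symbols are illustrative assumptions and not necessarily the exact system analyzed in the paper.

```latex
% Illustrative vector-host SIR model; i indexes the pregnant (p) and
% non-pregnant (n) human classes, and the v subscript denotes the vector.
\begin{align*}
\frac{dS_h^{i}}{dt} &= \mu_h N_h^{i} - \frac{b\,\beta_h}{N_h} S_h^{i} I_v - \mu_h S_h^{i},\\
\frac{dI_h^{i}}{dt} &= \frac{b\,\beta_h}{N_h} S_h^{i} I_v - (\gamma + \mu_h) I_h^{i},\\
\frac{dR_h^{i}}{dt} &= \gamma I_h^{i} - \mu_h R_h^{i}, \qquad i \in \{p, n\},\\
\frac{dS_v}{dt} &= A - \frac{b\,\beta_v}{N_h} S_v \bigl(I_h^{p} + I_h^{n}\bigr) - \mu_v S_v,\\
\frac{dI_v}{dt} &= \frac{b\,\beta_v}{N_h} S_v \bigl(I_h^{p} + I_h^{n}\bigr) - \mu_v I_v.
\end{align*}
```

Here b is the mosquito biting rate, \beta_h and \beta_v the transmission probabilities to humans and vectors, \gamma the human recovery rate, \mu_h and \mu_v the respective death rates, and A the vector recruitment rate.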

Versatile Dual-Mode Class-AB Four-Quadrant Analog Multiplier

A versatile dual-mode class-AB CMOS four-quadrant analog multiplier circuit is presented. Dual translinear loops and current mirrors are the basic building blocks of the realization. This technique provides a wide dynamic range, a wide-bandwidth response, and low power consumption. The major advantages of this approach are its single-ended inputs and its dual translinear loops operating in class-AB mode, which make this multiplier configuration interesting for low-power applications; current multiplying, voltage multiplying, or combined current and voltage multiplying can be obtained with balanced inputs. The simulation results of the versatile analog multiplier demonstrate a linearity error of 1.2%, a -3 dB bandwidth of about 19 MHz, a maximum power consumption of 0.46 mW, and temperature-compensated operation. The operation of the versatile analog multiplier was also confirmed through an experiment using a CMOS transistor array.

Improved Segmentation of Speckled Images Using an Arithmetic-to-Geometric Mean Ratio Kernel

In this work, we improve a previously developed segmentation scheme aimed at extracting edge information from speckled images using a maximum likelihood edge detector. The scheme was based on finding a threshold for the probability density function of a new kernel, defined as the arithmetic mean-to-geometric mean ratio field over a circular neighborhood set, and in a general context is founded on a likelihood random field model (LRFM). The segmentation algorithm was applied to discriminated speckle areas obtained using simple elliptic discriminant functions based on measures of the signal-to-noise ratio with fractional-order moments. A rigorous stochastic analysis was used to derive an exact expression for the cumulative distribution function corresponding to the probability density function of the random field. Based on this, an accurate probability of error was derived and the performance of the scheme was analysed. The improved segmentation scheme performed well for both simulated and real images and showed superior results to those previously obtained using the original LRFM scheme and standard edge detection methods. In particular, the false alarm probability was markedly lower than that of the original LRFM method, with over-segmentation artifacts virtually eliminated. The importance of this work lies in the development of a stochastic-based segmentation that allows an accurate quantification of the probability of false detection. Non-visual quantification and misclassification in medical ultrasound speckled images is relatively new and is of interest to clinicians.
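
As a concrete illustration of the kernel itself (not of the stochastic threshold derivation), the sketch below computes the arithmetic-to-geometric mean ratio over a circular neighborhood with SciPy and thresholds it to obtain an edge map; the radius, threshold handling, and function names are assumptions.

```python
import numpy as np
from scipy import ndimage

def am_gm_ratio_field(image, radius=3, eps=1e-12):
    """Arithmetic-to-geometric mean ratio over a circular neighborhood at
    every pixel; values near 1 suggest homogeneous speckle, larger values
    suggest an edge."""
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    footprint = (x * x + y * y) <= radius * radius
    img = image.astype(float) + eps                 # avoid log(0)
    arith = ndimage.generic_filter(img, np.mean, footprint=footprint)
    geom = np.exp(ndimage.generic_filter(np.log(img), np.mean, footprint=footprint))
    return arith / geom

def segment_edges(image, threshold, radius=3):
    """Binary edge map from the ratio field; in the paper the threshold is
    derived from the kernel's probability density function."""
    return am_gm_ratio_field(image, radius) > threshold
```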

Treatment of Oily Wastewater by Fibrous Coalescer Process: Stage Coalescer and Model Prediction

The coalescer process is one of the methods for oily water treatment; it increases the oil droplet size in order to enhance the separation velocity and thus the separation effectiveness. However, the presence of surfactants in an oily emulsion can limit these mechanisms because of the small oil droplet size associated with a stabilized emulsion. The purpose of this research is therefore to improve the efficiency of the coalescer process for treating stabilized emulsions. The effects of bed type, bed height, liquid flow rate, and staged coalescer (step-bed) configuration on treatment efficiency, expressed in terms of COD values, were studied. Note that the treatment efficiency obtained experimentally was estimated using the COD values and the oil droplet size distribution. The study has shown that the plastic media attach oil particles more effectively than the stainless steel media due to their hydrophobic properties. Furthermore, a suitable bed height (3.5 cm) and step bed (3.5 cm with 2 steps) were necessary to obtain good coalescer performance. Applying the step-bed coalescer process in the reactor provided higher treatment efficiencies in terms of COD removal than the classical process. The proposed model for predicting the area under the curve, and thus the treatment efficiency, based on the single collector efficiency (ηT) and the attachment efficiency (α), shows relatively good agreement between the experimental and predicted treatment efficiencies in this study.

Defect Detection of Tiles Using 2D-Wavelet Transform and Statistical Features

In this article, a method is proposed to classify normal and defective tiles using the wavelet transform and artificial neural networks. The proposed algorithm calculates the medians of the maxima and minima as well as the standard deviation and average of the detail images obtained from the wavelet filters, forms feature vectors from them, and attempts to classify the given tile using a perceptron neural network with a single hidden layer. In this study, along with proposing the median of extrema as the basic feature and comparing it with the other statistical features in the wavelet domain, the relative advantages of the Haar wavelet are investigated. The method has been tested on a number of different tile designs and, on average, it is valid for over 90% of the cases. Among its other advantages, high speed and low computational load are prominent.
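
A minimal sketch of this feature-plus-classifier scheme, assuming PyWavelets and scikit-learn, is given below; the exact statistics (medians of column-wise extrema) and parameters are illustrative assumptions rather than the article's implementation.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def tile_features(gray_tile, wavelet="haar"):
    """Statistics of the detail sub-images from a single-level 2-D Haar
    decomposition: medians of column maxima and minima, standard deviation,
    and mean of each detail band."""
    _, (LH, HL, HH) = pywt.dwt2(gray_tile.astype(float), wavelet)
    feats = []
    for d in (LH, HL, HH):
        feats += [np.median(d.max(axis=0)), np.median(d.min(axis=0)),
                  d.std(), d.mean()]
    return np.array(feats)

def train_tile_classifier(tiles, labels):
    """Single-hidden-layer perceptron trained on the wavelet statistics."""
    X = np.vstack([tile_features(t) for t in tiles])
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
    clf.fit(X, labels)
    return clf
```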

The Application of Homotopy Method In Solving Electrical Circuit Design Problem

This paper describes a simple implementation of the homotopy (also called continuation) algorithm for determining the proper resistance of a resistor to dissipate energy at a specified rate in an electric circuit. The homotopy algorithm can be considered a development of classical numerical methods such as the Newton-Raphson and fixed point methods. In homotopy methods, an embedding parameter is used to control the convergence. The method proposed in this work utilizes a special homotopy called the Newton homotopy. A numerical example solved in MATLAB is given to show the effectiveness of the proposed method.
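
The Newton homotopy embeds the target equation F(x) = 0 in the family H(x, t) = F(x) - (1 - t) F(x0), so the trivial solution x0 at t = 0 is continued toward a root of F at t = 1. A minimal Python sketch follows (the paper's example is in MATLAB); the circuit equation, component values, and step counts are illustrative assumptions.

```python
import numpy as np

def newton_homotopy(F, dF, x0, steps=20, newton_iters=5):
    """Newton homotopy H(x, t) = F(x) - (1 - t) * F(x0).
    The embedding parameter t is stepped from 0 to 1, and at each step a few
    Newton-Raphson corrections track the solution path to a root of F."""
    x = float(x0)
    F0 = F(x0)
    for t in np.linspace(0.0, 1.0, steps + 1)[1:]:
        for _ in range(newton_iters):
            x -= (F(x) - (1.0 - t) * F0) / dF(x)
    return x

# Illustrative design equation: find the load resistance R that dissipates
# power P from source voltage V through a known series resistance r:
#   F(R) = V**2 * R / (R + r)**2 - P
V, r, P = 12.0, 2.0, 5.0
F = lambda R: V**2 * R / (R + r)**2 - P
dF = lambda R: V**2 * (r - R) / (R + r)**3
print(newton_homotopy(F, dF, x0=1.0))   # converges to the smaller admissible root
```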