Extension of the Client-Centric Approach under Small Buffer Space

Periodic broadcast is a cost-effective solution for the large-scale distribution of popular videos because it guarantees a constant worst-case service latency regardless of the number of video requests. An essential periodic broadcast method is the client-centric approach (CCA), which allows clients to download broadcast data with a smaller receiving bandwidth. An enhanced version, CCA++, was proposed to yield a shorter waiting time. This work further improves CCA++ by reducing the client buffer requirement. The new scheme decreases the buffer requirement by as much as 52% compared with CCA++. This study also provides an analytical evaluation to demonstrate the performance advantage over representative existing schemes.

Derivation of Monotone Likelihood Ratio Using Two-Sided Uniformly Normal Distribution Techniques

In this paper, two-sided uniformly normal distribution techniques are used to derive the monotone likelihood ratio. The approach mainly employs the parameters of the distribution for the class of all size-α tests. The derivation technique is fast, direct, and less burdensome compared with some existing methods.
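For reference, the standard definition of the monotone likelihood ratio property that the derivation targets (a textbook fact, not a detail specific to this paper's technique):

```latex
% Standard definition of the MLR property:
A family of densities $\{f_\theta : \theta \in \Theta\}$ has a monotone
likelihood ratio in a statistic $T(x)$ if, for every $\theta_1 < \theta_2$,
\[
  \Lambda(x) = \frac{f_{\theta_2}(x)}{f_{\theta_1}(x)}
\]
is a non-decreasing function of $T(x)$. By the Karlin--Rubin theorem, this
property yields uniformly most powerful size-$\alpha$ tests for one-sided
hypotheses on $\theta$.
```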

Adaptive Score Normalization: A Novel Approach for Multimodal Biometric Systems

Multimodal biometric systems integrate data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common since it has been proven more effective than fusion at the other levels. However, the scores from different modalities are generally heterogeneous, so a normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of merging three unimodal systems based on the face, the palmprint, and the fingerprint. We also propose a new adaptive normalization method that takes into account the distributions of client scores and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique; the proposed normalization method gave the best results.
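The abstract does not specify the adaptive method itself; as a minimal sketch of what score normalization followed by sum-rule fusion looks like, here are two standard techniques (min-max and z-score) applied to hypothetical score vectors. All names and values are illustrative.

```python
import numpy as np

def min_max_norm(scores):
    """Map raw matcher scores to [0, 1] (standard min-max normalization)."""
    s = np.asarray(scores, dtype=float)
    return (s - s.min()) / (s.max() - s.min())

def z_score_norm(scores):
    """Center and scale scores to zero mean and unit variance."""
    s = np.asarray(scores, dtype=float)
    return (s - s.mean()) / s.std()

# Hypothetical score vectors from three unimodal matchers (face, palmprint,
# fingerprint) for the same batch of comparisons; note the heterogeneous scales.
face   = np.array([0.62, 0.10, 0.85, 0.33])
palm   = np.array([410.0, 120.0, 530.0, 260.0])
finger = np.array([0.91, 0.25, 0.77, 0.40])

# Normalize each modality into a common domain, then fuse by the sum rule.
fused_mm = sum(min_max_norm(m) for m in (face, palm, finger)) / 3.0
fused_z  = sum(z_score_norm(m) for m in (face, palm, finger)) / 3.0
print(fused_mm, fused_z, sep="\n")
```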

Investigation of Hydraulic and Thermal Performance of Fin Array at Different Shield Positions without By-Pass

In heat sinks, the flow within the core exhibits separation and hence does not lend itself to simple analytical boundary-layer or duct-flow analysis of the wall friction. In this paper, we present findings from an experimental and numerical study aimed at obtaining physical insight into the influence of the presence and position of a shield on the hydraulic and thermal performance of a square pin-fin heat sink without top by-pass. The variations of the Nusselt number and friction factor are obtained for varied parameters, such as the Reynolds number and the shield position. The numerical code is validated by comparing the numerical results with the available experimental data, and the temperature predictions based on the model agree well with the experimental data. Results show that the presence of the shield enhances the heat transfer of the fin array and increases the flow resistance. The surface temperature distribution of the heat-sink base is more uniform when the dimensionless shield position equals 1/3 or 2/3. A comprehensive performance evaluation based on the identical-pumping-power criterion shows that the optimum shield position is at x/l = 0.43.
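For orientation, the standard non-dimensional groups assumed here (the abstract does not state the paper's exact reference scales) are:

```latex
% Standard definitions; the paper's reference scales may differ:
\[
  \mathrm{Nu} = \frac{h\,D_h}{k}, \qquad
  f = \frac{\Delta p}{\tfrac{1}{2}\rho u_m^{2}}\,\frac{D_h}{L},
\]
where $h$ is the convective heat-transfer coefficient, $D_h$ the hydraulic
diameter, $k$ the fluid thermal conductivity, $\Delta p$ the pressure drop
over channel length $L$, and $u_m$ the mean velocity. Under the
identical-pumping-power constraint, configurations are compared at equal
$\Delta p \cdot \dot{V}$ (pressure drop times volumetric flow rate).
```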

Clustering of Variables Based on a Probabilistic Approach Defined on the Hypersphere

We consider n individuals described by p standardized variables, represented by points on the surface of the unit hypersphere S^(n-1). For a given choice of n individuals, we suppose that the set of observed variables comes from a mixture of bipolar Watson distributions defined on the hypersphere. EM and Dynamic Clusters algorithms are used to identify this mixture. We obtain parameter estimates for each Watson component and then a partition of the set of variables into homogeneous groups. Additionally, we present a factor analysis model in which the unobservable factors are the maximum likelihood estimators of the Watson directional parameters, namely the first principal component of the data matrix associated with each previously identified group. This alternative model yields directly interpretable solutions (simple structure), avoiding factor rotations.
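For context, the standard form of the bipolar Watson density on the hypersphere, which is textbook material rather than a detail taken from this paper:

```latex
% Watson density on $S^{n-1}$; the bipolar case corresponds to
% concentration $\kappa > 0$:
\[
  f(\pm x \mid \mu, \kappa)
    = M\!\left(\tfrac{1}{2}, \tfrac{n}{2}, \kappa\right)^{-1}
      \exp\!\left\{\kappa\,(\mu^{\top}x)^{2}\right\},
  \qquad x \in S^{n-1},\ \|\mu\| = 1,
\]
where $M$ is Kummer's confluent hypergeometric function. A mixture of $g$
such components, $\sum_{j=1}^{g} \pi_j f(x \mid \mu_j, \kappa_j)$, is what
the EM and Dynamic Clusters algorithms identify.
```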

A Distance Function for Data with Missing Values and Its Application

Missing values in data are common in real-world applications. Since the performance of many data mining algorithms depends critically on being given a good metric over the input space, in this paper we define a distance function for unlabeled datasets with missing values. We use the Bhattacharyya distance, which measures the similarity of two probability distributions, to define our new distance function. Under this distance, the distance between two points without missing attribute values is simply the Mahalanobis distance; when one of the coordinates is missing, the distance is computed according to the distribution of the missing coordinate. Our distance is general and can be used as part of any algorithm that computes the distance between data points. Because the performance of the k-nearest-neighbor (kNN) classifier depends strongly on the chosen distance measure, we use it to evaluate how accurately our distance reflects object similarity. We experimented on standard numerical datasets from the UCI repository covering different fields. On these datasets we simulated missing values and compared the performance of the kNN classifier using our distance against three other basic methods. Our experiments show that kNN using our distance function outperforms kNN using the other methods, while the runtime of our method is only slightly higher.
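A minimal sketch of the idea, under simplifying assumptions that are ours rather than the paper's (the actual construction is Bhattacharyya-based): Mahalanobis distance on the observed coordinates, with a missing coordinate replaced by its marginal mean and charged a variance penalty as a crude stand-in for integrating over its distribution.

```python
import numpy as np

def fit_gaussian(X):
    """Estimate mean and covariance from the complete rows of X."""
    complete = X[~np.isnan(X).any(axis=1)]
    return complete.mean(axis=0), np.cov(complete, rowvar=False)

def distance(x, y, mu, cov):
    """Mahalanobis distance; a missing coordinate is replaced by its
    marginal mean and charged its marginal variance (an illustrative
    stand-in for integrating over the missing coordinate's distribution)."""
    VI = np.linalg.inv(cov)
    d = x - y
    penalty = 0.0
    for j in np.flatnonzero(np.isnan(d)):
        xj = mu[j] if np.isnan(x[j]) else x[j]
        yj = mu[j] if np.isnan(y[j]) else y[j]
        d[j] = xj - yj
        penalty += cov[j, j] * VI[j, j]   # uncertainty of the missing axis
    return float(np.sqrt(d @ VI @ d + penalty))

X = np.array([[1.0, 2.0], [2.0, 0.5], [0.0, 1.0], [3.0, 2.5]])
mu, cov = fit_gaussian(X)
print(distance(np.array([1.0, np.nan]), np.array([2.0, 0.5]), mu, cov))
```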

Beneficiation of Pyrolytic Carbon Black

This research investigated the treatment of crude carbon black produced from the pyrolysis of waste tyres in order to evaluate its quality and possible industrial applications. A representative sample of crude carbon black was dry screened to determine the initial particle size distribution. This was followed by pulverizing the crude carbon black and leaching it in hot concentrated sulphuric acid to remove heavy metals and other contaminants. Analysis of the refined carbon black showed a significant improvement in product quality compared with the crude carbon black. It was found that refined carbon black can be further classified into multiple high-value products for various industrial applications, such as filler, paint pigment, activated carbon, and fuel briquettes.

The Implementation of the Multi-Agent Classification System (MACS) in Compliance with FIPA Specifications

The paper discusses the implementation of the Multi-Agent Classification System (MACS) and its use in providing automated and accurate classification of end users developing applications in the spreadsheet domain. Different technologies have been brought together to build MACS. The strength of the system is the integration of agent technology compliant with the FIPA specifications with other technologies, namely .NET Windows-service-based agents, Windows Communication Foundation (WCF) services, a Service-Oriented Architecture (SOA), and Oracle Data Mining (ODM). Microsoft's .NET Windows-service-based agents were used to develop the monitoring agents of MACS, while the .NET WCF services together with the SOA approach allowed the distribution of and communication between agents over the WWW. The Monitoring Agents (MAs) were configured to execute automatically and to monitor Excel spreadsheet development activities by content. Data gathered by the Monitoring Agents from various resources over a period of time was collected and filtered by a Database Updater Agent (DUA) residing in the .NET client application of the system. This agent then transfers and stores the data in an Oracle server database via Oracle stored procedures for further processing, which leads to the classification of the end-user developers.

Effect of Processing Methods on Texture Evolution in AZ31 Mg Alloy Sheet

Textures of AZ31 Mg alloy sheets were evaluated using the neutron diffraction method. The AZ31 sheets were fabricated either by conventional casting followed by hot rolling or by strip casting. The effect of warm rolling was investigated using the AZ31 Mg alloy sheet produced by conventional casting. Warm rolling with a 30% thickness reduction per pass was possible without any side cracks at temperatures as low as 200°C at a roll speed of 30 m/min. The initial microstructure of the conventionally cast specimen was found to be partially recrystallized. Grain refinement was found to occur actively during warm rolling. The (0002), (10-10), (10-11), and (10-12) complete pole figures were measured using the HANARO FCD (Neutron Four Circle Diffractometer), and orientation distribution functions (ODFs) were calculated. The major texture of all specimens can be described as an ND//(0001) fiber texture. The texture of the hot-rolled specimen showed the strongest fiber component, while that of the strip-cast sheet was close to a random distribution.

Technique for Voltage Control in Distribution System

This paper presents techniques for voltage control in distribution systems, integrated into the distribution management system. Voltage is an important parameter in the control of electrical power systems, and distribution network operators have the responsibility to regulate the voltage supplied to consumers within statutory limits. Traditionally, the On-Load Tap Changer (OLTC) transformer equipped with automatic voltage control (AVC) relays has been the most popular and effective voltage control device. A static synchronous compensator (STATCOM) may be equipped with several controllers to perform multiple control functions, and a static var compensator (SVC) provides regulation slopes and available margins for var dispatch. In this paper, voltage control in distribution networks is established as a centralized analytical function.

Reliability Improvement with Optimal Placement of Distributed Generation in Distribution System

This paper presents the optimal placement and sizing of distributed generation (DG) in a distribution system. The problem is to improve the reliability of a distribution system with distributed generation. The technique employed to solve the minimization problem is based on a developed Tabu search algorithm and reliability-worth analysis. The developed methodology is tested on bus 2 of the Roy Billinton Test System (RBTS). The case study shows that distributed generation can reduce the customer interruption cost and therefore improve the reliability of the system. It is expected that the proposed method will be useful to distribution system operators.
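The abstract does not detail the developed algorithm; a generic Tabu search skeleton for siting a single DG unit might look like the following, with a placeholder cost function standing in for the reliability-worth (customer interruption cost) evaluation. All names and values are illustrative.

```python
import random

def tabu_search_placement(buses, cost, iters=200, tenure=7, seed=0):
    """Generic Tabu search over candidate DG buses.
    `cost(bus)` should return the system interruption cost with a DG unit
    placed at `bus` (placeholder for the reliability-worth analysis)."""
    rng = random.Random(seed)
    current = rng.choice(buses)
    best, best_cost = current, cost(current)
    tabu = []
    for _ in range(iters):
        # Neighborhood: all non-tabu candidate buses except the current one.
        neighbors = [b for b in buses if b != current and b not in tabu]
        if not neighbors:
            break
        current = min(neighbors, key=cost)   # greedy move within neighborhood
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                      # expire the oldest tabu entry
        if cost(current) < best_cost:
            best, best_cost = current, cost(current)
    return best, best_cost

# Toy example: hypothetical interruption costs per candidate bus.
costs = {1: 9.2, 2: 7.5, 3: 8.1, 4: 6.9, 5: 7.8}
print(tabu_search_placement(list(costs), costs.get))
```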

Analysis of GI/M(n)/1/N Queue with Single Working Vacation and Vacation Interruption

This paper presents a finite-buffer, renewal-input queue with a single working vacation and vacation interruption, with state-dependent services and state-dependent vacations; such queues have a wide range of applications in several areas, including manufacturing and wireless communication systems. Service times during the busy period and the vacation period, as well as the vacation times, are exponentially distributed and state dependent. As a result of the finite waiting space, state-dependent services, and state-dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method, using the supplementary variable technique, to compute the stationary queue length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm for the model is presented that is fast, accurate, and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.

Reliability Approximation through the Discretization of Random Variables using Reversed Hazard Rate Function

It is sometimes difficult to determine the exact reliability of complex systems by analytical procedures. An approximate solution to this problem can be obtained through the discretization of random variables. In this paper we describe the usefulness of discretizing a random variable using the reversed hazard rate function of its continuous version. Discretization of the exponential distribution is demonstrated, and applications of the approach are cited. Numerical calculations indicate that the proposed approach gives a very good approximation of the reliability of complex systems under the stress-strength setup, and that its performance is better than that of the existing discrete concentration method of discretization. The approach is conceptually simple, handles analytic intractability, and reduces computational time. It can be applied in manufacturing industries for producing highly reliable items.
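For background, the reversed hazard rate and its exponential special case (standard definitions; the abstract does not state the paper's exact discretization formula):

```latex
% Standard definitions; the paper's discretization scheme is not
% reproduced here:
\[
  r(x) = \frac{f(x)}{F(x)},
\]
the reversed hazard rate of a continuous random variable with density $f$
and CDF $F$. For the exponential case $F(x) = 1 - e^{-\lambda x}$,
\[
  r(x) = \frac{\lambda e^{-\lambda x}}{1 - e^{-\lambda x}},
\]
and a discrete analogue on $\{1, 2, \dots\}$ carries the reversed hazard
rate $r^{*}(k) = P(X = k)/P(X \le k)$, which the discretization is chosen
to match at the support points.
```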

The Influence of Website Quality on Customer E-Satisfaction in a Low Cost Airline

The evolution of customer behavior in purchasing products or services through the Internet has led airline companies to engage in the e-ticketing process in order to maintain their business. A well-designed website is vitally important for airline companies to provide effective communication, support, and competitive advantage. This study was conducted to identify the dimensions of website quality for a low cost airline and to investigate the relationship between website quality and customer e-satisfaction at a low cost airline. A total of 381 responses were collected by convenience sampling among local passengers at the Low Cost Carrier Terminal, Kuala Lumpur, via questionnaire distribution. The study found that the five determinant factors of website quality for AirAsia were Information Content, Navigation, Responsiveness, Personalization, and Security and Privacy. The results revealed a positive relationship between the five dimensions of website quality and customer e-satisfaction, with Information Content the most significant contributor to customer e-satisfaction.

Analysis of Mathematical Models and Their Application to Extreme Events

This paper discusses the application of extreme-event distributions, taking the Limpopo River Basin at the Xai-Xai station, Mozambique, as a case study. We analyze the main extreme-value distributions, namely the Gumbel, Fréchet, Weibull, and Generalized Extreme Value distributions, then extrapolate the original data to samples of 1000, 5000, and 10000 values for further simulation and compare the outcomes under the three main distributions.
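For reference, the Generalized Extreme Value family that unifies the three named types (a standard fact):

```latex
% GEV cumulative distribution function:
\[
  G(x) = \exp\!\left\{-\left[1 + \xi\,\frac{x - \mu}{\sigma}\right]^{-1/\xi}\right\},
  \qquad 1 + \xi\,\frac{x - \mu}{\sigma} > 0,
\]
with location $\mu$, scale $\sigma > 0$, and shape $\xi$: the limit
$\xi \to 0$ gives the Gumbel distribution, $\xi > 0$ the Fréchet type, and
$\xi < 0$ the (reversed) Weibull type.
```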

Development of Perez-Du Mortier Calibration Algorithm for Ground-Based Aerosol Optical Depth Measurement with Validation Using the SMARTS Model

Aerosols are small particles suspended in air with widely varying spatial and temporal distributions. The aerosol concentration in the total atmospheric column is normally quantified by the aerosol optical depth (AOD). At long-term monitoring stations, accurate AOD retrieval is often difficult due to the lack of frequent calibration. To overcome this problem, a near-sea-level Langley calibration algorithm is developed using a combination of a clear-sky detection model and a statistical filter. It attempts to produce a dataset consisting only of homogeneous and stable atmospheric conditions for Langley calibration purposes. In this paper, a radiance-based validation is performed to further investigate the feasibility and consistency of the proposed algorithm at different locations, days, and times. The algorithm is validated against the SMARTS model based on direct normal irradiance (DNI) values. The overall results confirm that the proposed calibration algorithm is feasible and consistent for measurements taken at different sites and under different weather conditions.
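The classical Langley plot underlying such calibration is standard: for a stable atmosphere the measured signal follows V = V0 exp(-τm), so ln V is linear in the airmass m. A minimal sketch with synthetic data (the paper's clear-sky detection model and statistical filter are not reproduced here):

```python
import numpy as np

# Synthetic Langley-plot data: signal V = V0 * exp(-tau * m) plus noise.
airmass = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 5.0])
true_V0, true_tau = 1.25, 0.18                   # illustrative values
signal = true_V0 * np.exp(-true_tau * airmass)
signal *= 1 + 0.005 * np.random.default_rng(1).standard_normal(airmass.size)

# Linear regression of ln V on airmass: intercept gives ln V0 (the
# calibration constant), slope gives -tau (the optical depth).
slope, intercept = np.polyfit(airmass, np.log(signal), 1)
V0_est, tau_est = np.exp(intercept), -slope
print(f"calibration constant V0 ~ {V0_est:.3f}, AOD tau ~ {tau_est:.3f}")
```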

Maximum Likelihood Estimation of Burr Type V Distribution under Left Censored Samples

The paper deals with the maximum likelihood estimation of the parameters of the Burr type V distribution based on left-censored samples. The maximum likelihood estimators (MLEs) of the parameters are derived, and the Fisher information matrix for the parameters of this distribution is obtained explicitly. Confidence intervals for the parameters are also discussed. A simulation study is conducted to investigate the performance of the point and interval estimates.
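For orientation, the generic left-censored likelihood assumed here (the Burr type V functional form itself is not given in the abstract):

```latex
% Generic left-censored likelihood: with the $r$ smallest of $n$
% observations censored at $T$ and the remaining $x_{(r+1)}, \dots,
% x_{(n)}$ fully observed,
\[
  L(\theta) \propto \left[F(T;\theta)\right]^{r}
             \prod_{i=r+1}^{n} f\!\left(x_{(i)};\theta\right),
\]
and the MLEs solve $\partial \log L / \partial \theta = 0$, with the
Fisher information matrix obtained from the second derivatives of
$\log L$.
```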

Limit State of Trapezoidal Metal Sheets Exposed to Concentrated Load

Trapezoidal metal sheets are used as roof decks in most industrial buildings. These sheets are exposed to concentrated loads, usually service loads arising from the installation of air-distribution, sanitary-distribution, sprinkler, or wiring systems. In public facilities (shopping centres, tennis halls, etc.) they can also be used for hanging advertising posters. These systems work as a "building kit"; the anchoring systems are V-shaped clamps. This paper reviews the installation systems available on the market, focusing on the load-bearing capacities specified by producers and on possible methods for precisely defining the load-bearing capacity of a trapezoidal sheet under concentrated load. The load-bearing capacity was verified on experimental samples to determine the real behavior of trapezoidal metal sheets exposed to concentrated loads.

Revision of Genus Polygonum L. s.l. in Flora of Armenia

The account of the genus Polygonum L. in "Flora of Armenia" was written more than five decades ago; at that time the genus included 5 sections with 20 species. Since then, many expeditions have been carried out in different regions of Armenia and extensive herbarium material has been collected, and many authors have accepted the sections as separate genera on the basis of anatomical, morphological, palynological, and molecular data. It therefore became clear that the taxonomy of the Armenian representatives of Polygonum s. l. also needs revision. New literature data and our investigations of live and herbarium material (ERE, LE), with attention to morphological characters, distribution, ecology, and flowering and fruiting periods, led us to the conclusion that the genus Polygonum s. l. has to be split into 5 genera (Aconogonon (Meisn.) Reichenb., Bistorta (L.) Scop., Fallopia Adans., Persicaria Mill., and Polygonum L. s. s.). The number of species has been reduced to 16, and new determination keys have been created for each genus.

Determination and Comparison of Fabric Pills Distribution Using Image Processing and Spatial Data Analysis Tools

This work deals with the determination and comparison of pill patterns in two sets of fabric samples that differ in the way the pills were created. The first set contains fabric samples with pills created by simulation on a Martindale abrasion machine, while the pills in the second set originated during normal wear and maintenance. The goal of the study is to determine whether the pattern of the fabric pills created by simulation is the same as the pattern of naturally occurring pills. The determination and comparison of the pills is based on image processing and spatial data analysis tools. First, a 3D reconstruction of the fabric surfaces with the pills is carried out using a gradient-fields method, which creates a 3D fabric surface from a set of four images. Thereafter, the pills are detected in the 3D fabric surfaces using image-processing tools in MATLAB. The determination and comparison of the pill patterns of the two sets of fabric samples is based on spatial data analysis using tools in R.
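The paper's spatial analysis is done in R; a minimal Python analogue of one common comparison, nearest-neighbor distance distributions tested with a two-sample Kolmogorov-Smirnov test, might look like this (the pill-center coordinates are synthetic):

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.stats import ks_2samp

def nn_distances(points):
    """Nearest-neighbor distance for each detected pill center."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=2)     # k=2: the nearest point is itself
    return d[:, 1]

rng = np.random.default_rng(42)
simulated = rng.uniform(0, 100, size=(60, 2))   # synthetic pill centers
natural   = rng.uniform(0, 100, size=(55, 2))

# Two-sample KS test on the nearest-neighbor distance distributions: a
# small p-value would indicate different pill patterns.
stat, p = ks_2samp(nn_distances(simulated), nn_distances(natural))
print(f"KS statistic = {stat:.3f}, p-value = {p:.3f}")
```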