A Methodology for Quality Problems Diagnosis in SMEs

This article proposes a new methodology to be used by SMEs (Small and Medium Enterprises) to characterize their quality performance, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined on the basis of the well-known Juran trilogy for quality management (quality planning, quality control and quality improvement), and about predetermined results categories, defined on the basis of the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from applying the methodology in two case studies.
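
Where such a self-assessment applies FMEA, failure modes are conventionally ranked by a risk priority number (RPN), the product of severity, occurrence and detection scores. A minimal sketch of that prioritization follows; the failure modes and scores are hypothetical, not taken from the case studies:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (critical)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (easily detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Risk Priority Number: the conventional FMEA ranking score.
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes, for illustration only.
modes = [
    FailureMode("wrong part dimensions", severity=7, occurrence=4, detection=3),
    FailureMode("missing inspection record", severity=4, occurrence=6, detection=5),
]

# Prioritize improvement initiatives by descending RPN.
for m in sorted(modes, key=lambda fm: fm.rpn, reverse=True):
    print(f"{m.name}: RPN = {m.rpn}")
```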

H∞ Approach to Functional Projective Synchronization for Chaotic Systems with Disturbances

This paper presents a method for the functional projective H∞ synchronization problem of chaotic systems with external disturbances. Based on Lyapunov theory and a linear matrix inequality (LMI) formulation, a novel feedback controller is established that not only guarantees stable synchronization of the drive and response systems but also bounds the effect of the external disturbance by an H∞ norm constraint.
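
In functional projective synchronization the response state is required to track the drive state up to a time-varying scaling function. A worked sketch of the standard error definition and H∞ criterion (notation assumed here, not quoted from the paper):

```latex
% Synchronization error for drive state x(t), response state y(t),
% and scaling function \alpha(t):
\[
  e(t) = y(t) - \alpha(t)\,x(t)
\]
% The H-infinity constraint bounds the disturbance-to-error gain by
% a prescribed attenuation level \gamma > 0:
\[
  \int_0^\infty e^{\top}(t)\,e(t)\,dt
    \;\le\; \gamma^{2} \int_0^\infty w^{\top}(t)\,w(t)\,dt ,
\]
% where w(t) is the external disturbance. Feasibility is typically
% established with a quadratic Lyapunov function V(e) = e^T P e,
% P > 0, whose decrease condition is rewritten as an LMI in P and
% the controller gain.
```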

Consumer Product Demand Forecasting Based on Artificial Neural Network and Support Vector Machine

The nature of consumer products makes forecasting future demand difficult, and the accuracy of the forecasts significantly affects the overall performance of the supply chain system. In this study, two data mining methods, artificial neural networks (ANN) and support vector machines (SVM), were utilized to predict the demand for consumer products. The training data were the actual demand of six different products from a consumer product company in Thailand. The results indicated that SVM had better forecast quality, in terms of the mean absolute percentage error (MAPE), than ANN in every product category. Moreover, another important finding was that the margin of difference in MAPE between the two methods was significantly high when the data were highly correlated.
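
A minimal sketch of this kind of comparison with scikit-learn and synthetic demand data; the lag features, kernel and network size are illustrative assumptions, not the study's configuration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

def mape(actual, forecast):
    # Mean Absolute Percentage Error, the accuracy measure used above.
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

rng = np.random.default_rng(0)
demand = 100 + 10 * np.sin(np.arange(120) / 6.0) + rng.normal(0, 2, 120)

# Predict each period from the previous 4 periods (a common lag scheme).
X = np.column_stack([demand[i:i + 116] for i in range(4)])  # 116 = 120 - 4
y = demand[4:]
X_train, X_test, y_train, y_test = X[:90], X[90:], y[:90], y[90:]

for name, model in [("SVM", SVR(kernel="rbf", C=100.0)),
                    ("ANN", MLPRegressor(hidden_layer_sizes=(10,),
                                         max_iter=5000, random_state=0))]:
    model.fit(X_train, y_train)
    print(f"{name}: MAPE = {mape(y_test, model.predict(X_test)):.2f}%")
```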

Generating Concept Trees from Dynamic Self-organizing Map

The self-organizing map (SOM) provides both clustering and visualization capabilities for data mining. Dynamic self-organizing maps such as the Growing Self-Organizing Map (GSOM) have been developed to overcome the fixed-structure limitation of the SOM and enable better representation of the discovered patterns. However, when mining large datasets or historical data, the hierarchical structure of the data is also useful for viewing cluster formation at different levels of abstraction. In this paper, we present a technique to generate concept trees from the GSOM. The formation of trees from different spread factor values of the GSOM is also investigated and the quality of the resulting trees analyzed. The results show that concept trees can be generated from the GSOM, thus eliminating the need to re-cluster the data from scratch to obtain a hierarchical view of the data under study.
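
The spread factor mentioned here is the single user parameter that controls GSOM growth, through the growth threshold of the original GSOM formulation; as a reminder of that relation:

```latex
% GSOM growth threshold GT for data dimensionality D and
% user-chosen spread factor SF in (0, 1):
\[
  GT = -D \,\ln(SF)
\]
% A low SF gives a high threshold and a coarse, small map; a high
% SF lets the map grow and expose finer cluster structure. Maps
% grown with increasing SF therefore form natural levels of
% abstraction that can be linked into a concept tree.
```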

Three-Level Converter-Based Generalized Unified Power Quality Conditioner

A generalized unified power quality conditioner (GUPQC), using three single-phase three-level voltage source converters (VSCs) connected back-to-back through a common dc link, is proposed in this paper as a new custom power device for a three-feeder distribution system. One of the converters is connected in shunt with one feeder for mitigation of current harmonics and reactive power compensation, while the other two VSCs are connected in series with the other two feeders to keep the load voltage sinusoidal and at a constant level. A new control scheme based on the synchronous reference frame is proposed for the series converters. A simulation analysis of the compensation performance of the GUPQC, carried out in PSCAD/EMTDC, is reported.
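
Synchronous-reference-frame control rests on transforming three-phase quantities into a rotating dq frame, where fundamental-frequency components become dc values and harmonics are easy to isolate. A minimal sketch of that transform (illustrative background, not the paper's full series-converter controller):

```python
import numpy as np

def abc_to_dq0(v_abc, theta):
    """Park transform: three-phase quantities to the rotating dq0 frame.

    theta is the synchronous reference angle (e.g. from a PLL locked
    to the feeder voltage). At fundamental frequency the d and q
    components become dc, so distortion stands out as ripple.
    """
    a, b, c = v_abc
    two_pi_3 = 2.0 * np.pi / 3.0
    d = (2.0 / 3.0) * (a * np.cos(theta)
                       + b * np.cos(theta - two_pi_3)
                       + c * np.cos(theta + two_pi_3))
    q = -(2.0 / 3.0) * (a * np.sin(theta)
                        + b * np.sin(theta - two_pi_3)
                        + c * np.sin(theta + two_pi_3))
    z = (1.0 / 3.0) * (a + b + c)
    return d, q, z

# Balanced 50 Hz test voltages: d should be constant, q near zero.
t = np.linspace(0, 0.04, 400)
theta = 2 * np.pi * 50 * t
va = np.cos(theta)
vb = np.cos(theta - 2 * np.pi / 3)
vc = np.cos(theta + 2 * np.pi / 3)
d, q, _ = abc_to_dq0((va, vb, vc), theta)
print(round(d.mean(), 3), round(q.mean(), 3))  # ~1.0, ~0.0
```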

A New Method for Contour Approximation Using Basic Ramer Idea

This paper presents two new efficient algorithms for contour approximation. The proposed algorithms are compared with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest) methods, which are briefly described. The Cartesian coordinates of an input contour are processed in such a manner that the contour is finally represented by a set of selected vertices of its edge. The paper presents the main idea of the analyzed procedures for contour compression. For comparison, the mean square error and signal-to-noise ratio criteria are used. The computational time of the analyzed methods is estimated as a function of the number of numerical operations. Experimental results are reported in terms of image quality, compression ratio and speed. The main advantage of the proposed algorithms is the small number of arithmetic operations compared to existing algorithms.
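
The basic Ramer idea referenced throughout recursively keeps the contour vertex farthest from the chord between the current endpoints until every deviation falls below a tolerance. A compact sketch of that baseline (often known as the Ramer-Douglas-Peucker algorithm):

```python
import numpy as np

def point_line_distance(p, a, b):
    # Perpendicular distance from point p to the chord a-b.
    if np.allclose(a, b):
        return float(np.linalg.norm(p - a))
    # 2-D cross product magnitude = twice the triangle area.
    cross = abs((b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]))
    return float(cross / np.linalg.norm(b - a))

def ramer(points, tol):
    """Approximate a contour (N x 2 array), keeping vertices whose
    deviation from the current chord exceeds tol."""
    a, b = points[0], points[-1]
    dists = [point_line_distance(p, a, b) for p in points[1:-1]]
    if not dists or max(dists) <= tol:
        return [a, b]  # the chord is a good enough approximation
    i = int(np.argmax(dists)) + 1  # farthest vertex: keep it, recurse
    left = ramer(points[:i + 1], tol)
    right = ramer(points[i:], tol)
    return left[:-1] + right  # avoid duplicating the split vertex

contour = np.array([[0, 0], [1, 0.1], [2, -0.1], [3, 5], [4, 6], [5, 7]], float)
print(np.array(ramer(contour, tol=0.5)))
```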

Comprehensive Hierarchy Evaluation of Power Quality Based on an Incentive Mechanism

In a liberalized electricity market, it is not surprising that different customers require different power quality (PQ) levels at different prices. Power quality, which relates to several power disturbances, is described by many parameters, so how to define a comprehensive hierarchy evaluation system of power quality (PQCHES) has become an issue of concern. In this paper, based on four electromagnetic compatibility (EMC) levels, the numerical range of each power disturbance is divided into five grades (Grade I to Grade V), and the "barrel principle" of power quality is used to assess overall PQ performance with only one grade indicator. A case study based on actual monitored PQ data shows that the site PQ grade indicates the electromagnetic environment level and also expresses the characteristics of the loads served by the site. The shortest-plank principle of the PQ barrel is an incentive mechanism, which can be combined with a rewards/penalty mechanism (RPM) for energy consumed "on quality demand" to stimulate utilities to improve the overall PQ level and to make end-users "smarter" under the infrastructure of the future SmartGrid.
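
Under the barrel (shortest-plank) principle, the overall site grade is simply the worst grade among the individual disturbance indices. A minimal sketch with hypothetical disturbance grades:

```python
# Grades ordered from best (I) to worst (V); the "barrel principle"
# takes the worst individual plank as the overall PQ grade.
GRADE_ORDER = ["I", "II", "III", "IV", "V"]

def overall_pq_grade(disturbance_grades):
    return max(disturbance_grades, key=GRADE_ORDER.index)

# Hypothetical monitored-site grades per disturbance type.
site = {"harmonics": "II", "voltage sag": "IV", "flicker": "I", "unbalance": "II"}
print(overall_pq_grade(site.values()))  # IV: the shortest plank dominates
```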

Development of Mechanical Properties of Self-Compacting Concrete Containing Rice Husk Ash

Self-compacting concrete (SCC), a new kind of high-performance concrete (HPC), was first developed in Japan in 1986. The development of SCC has made the casting of densely reinforced and mass concrete convenient and has reduced noise. Fresh SCC flows into the formwork and around obstructions under its own weight, filling the formwork completely and self-compacting (without any need for vibration), without segregation or blocking. The elimination of the need for compaction leads to better-quality concrete and a substantial improvement in working conditions. SCC mixes generally have a much higher content of fine fillers, including cement, and produce excessively high compressive strength concrete, which restricts their field of application to special concrete only. Using SCC mixes in general concrete construction practice requires low-cost materials to make the concrete inexpensive. Rice husk ash (RHA) has been used as a highly reactive pozzolanic material to improve the microstructure of the interfacial transition zone (ITZ) between the cement paste and the aggregate in self-compacting concrete. Mechanical experiments on RHA-blended Portland cement concretes revealed that, in addition to the pozzolanic reactivity of RHA (chemical aspect), the particle grading (physical aspect) of the cement and RHA mixtures also exerts a significant influence on the blending efficiency. The scope of this research was to determine the usefulness of RHA in the development of economical self-compacting concrete: material cost is decreased by reducing the cement content and using a waste material such as rice husk ash in its place. This paper presents a study on the development of the mechanical properties, up to 180 days, of self-compacting and ordinary concretes with rice husk ash from a rice paddy milling industry in Rasht (Iran). Two replacement percentages of cement by RHA (10% and 20%) and two water/cementitious material ratios (0.40 and 0.35) were used for both self-compacting and normal concrete specimens. The results are compared with those of self-compacting concrete without RHA in terms of compressive strength, flexural strength and modulus of elasticity. It is concluded that RHA has a positive effect on the mechanical properties at ages beyond 60 days. Based on the results, the self-compacting concrete specimens show higher values than the normal concrete specimens in all tests except modulus of elasticity, and the specimens with 20% replacement of cement by RHA show the best performance.

Exploiting Machine Learning Techniques for the Enhancement of Acceptance Sampling

This paper proposes an innovative methodology for Acceptance Sampling by Variables, a particular category of Statistical Quality Control dealing with the assurance of product quality. Our contribution lies in the exploitation of machine learning techniques to address the complexity and remedy the drawbacks of existing approaches. More specifically, the proposed methodology uses Artificial Neural Networks (ANNs) to aid decision making about the acceptance or rejection of an inspected sample. For any type of inspection, the ANNs are trained on data from the corresponding tables of a standard's sampling plan schemes. Once trained, the ANNs can give closed-form solutions for any acceptance quality level and sample size, thus automating the reading of the sampling plan tables without any need to compromise on the values listed in the specific standard chosen each time. The proposed methodology gives quality control engineers considerable flexibility during the inspection of their samples, allowing specific needs to be considered, while also reducing the time and cost required for these inspections. Its applicability and advantages are demonstrated through two numerical examples.
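
A minimal sketch of the idea: train a small network on (acceptance quality level, sample size) pairs mapped to table entries from a sampling plan, then query it for values the printed tables do not list. The table rows below are placeholders, not values from any actual standard:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Placeholder rows of a variables sampling plan table:
# (AQL %, sample size) -> acceptability constant k.
# Real values would come from the chosen standard's tables.
table = np.array([
    [0.65, 10, 1.81],
    [0.65, 25, 1.91],
    [1.00, 10, 1.72],
    [1.00, 25, 1.82],
    [2.50, 10, 1.41],
    [2.50, 25, 1.51],
])
X, y = table[:, :2], table[:, 2]

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=20000,
                   random_state=0).fit(X, y)

# Once trained, the network interpolates between table entries,
# e.g. for an AQL/sample-size pair absent from the printed tables.
print(net.predict([[1.5, 18]]))
```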

Fabrication and Electrical Characterization of Al/BaxSr1-xTiO3/Pt/SiO2/Si Configuration for FeFET Applications

The ferroelectric behavior of barium strontium titanate (BST) in thin-film form has been investigated in order to study the possibility of using BST in a ferroelectric-gate field effect transistor (FeFET) for memory device applications. BST thin films have been fabricated in an Al/BST/Pt/SiO2/Si-gate configuration. The variation of the dielectric constant (ε) and tan δ with frequency has been studied to verify the dielectric quality of the material. The results show that at low frequencies ε increases as the Ba content increases, whereas at high frequencies it shows the opposite variation, which is attributed to dipole dynamics. tan δ shows low values, with a peak in the mid-frequency range. The ferroelectric behavior of the Al/BST/Pt/SiO2/Si structure has been investigated using C-V characteristics. The results show that the strength of the ferroelectric hysteresis loop increases as the Ba content increases; this is attributed to grain size and dipole dynamics effects.

Using Interval Constrained Petri Nets for the Fuzzy Regulation of Quality: Case of a Mechanical Assembly Process

The imprecision inherent in manufacturing processes means that parts cannot be realized exactly to their dimensional specifications. It must therefore be ensured that the realized product lies strictly within intervals compatible with the correct functioning of the parts. In this paper we present an approach based on combining two theories with different characteristics: fuzzy systems and Petri nets. This tool is proposed to model and control quality in an assembly system. A robust control of a mechanical assembly process is presented as an application; this control must keep the parts within their specification intervals despite variations. It also illustrates how the technique reacts when the product quality is high, medium, or low.
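
The fuzzy side of such a controller typically grades a measured dimension's deviation into overlapping classes such as high, medium and low quality. A minimal sketch with hypothetical triangular membership functions (the breakpoints would in practice come from the tolerance intervals of the assembly process):

```python
def tri(x, a, b, c):
    # Triangular membership function peaking at b, zero outside [a, c].
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def quality_grades(deviation):
    """Fuzzy quality of a part given its deviation from nominal, in mm.

    The breakpoints below are hypothetical, chosen only to show the
    overlapping high/medium/low grading used in fuzzy regulation.
    """
    d = abs(deviation)
    return {
        "high":   tri(d, -0.01, 0.0, 0.05),
        "medium": tri(d, 0.02, 0.06, 0.10),
        "low":    tri(d, 0.08, 0.15, 0.30),
    }

print(quality_grades(0.04))  # partly high, partly medium quality
```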

The Micro Ecosystem Restoration Mechanism Applied in a Feasibility Study of Lake Eutrophication Improvement

The technique of inducing micro ecosystem restoration is an aquatic ecological engineering method used to remediate polluted water. A batch-scale study, a pilot-plant study, and a field study were carried out to observe eutrophication using the Inducing Ecology Restorative Symbiosis Agent (IERSA), consisting mainly of products degraded by lactobacilli, saccharomycetes, and phycomycetes. The results obtained from the batch-scale and pilot-plant experiments allowed us to develop the parameters for the field study. A pond 5 m from the outlet of a lake, with an area of 500 m2, a depth of 0.6-1.2 m, and containing about 500 tons of water, was selected as a model. After treatment with 10 mg IERSA/L of water twice a week for 70 days, the micro restoration mechanism consisted of three stages (i.e., restoration, impact maintenance, and ecological recovery after impact). The COD, TN, TKN, and chlorophyll a were reduced significantly in the first week. Although unexpected heavy rain and contamination from the sewage system may have slowed the ecological restoration, the self-cleaning function continued and chlorophyll a was reduced by 50% within one month. In the fourth week, amoebae, paramecia, rotifers, and red wriggler worms reappeared, and the number of fish fry rose to 1000 fry/m3. These results proved that the inducing restorative mechanism can be applied to improve eutrophication and to control the growth of algae in lakes by achieving self-cleaning through the induction of, and competition among, microbes. Fish growth also reached an excellent level owing to the improvement in water quality.

Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

This paper proposes a novel game-theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for scarce server-side memory space in which to replicate data objects, so as to minimize the total network object transfer cost while maintaining object concurrency. Optimization of this cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
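
As a reminder of the mechanism being extended: in a Vickrey (second-price, sealed-bid) auction the highest bidder wins but pays the second-highest bid, which makes truthful bidding a dominant strategy. A minimal sketch for a single replica slot; the server names and bids are hypothetical:

```python
def vickrey_winner(bids):
    """Second-price sealed-bid auction.

    bids: mapping of bidder -> bid value. Returns (winner, price),
    where price is the second-highest bid. Because the winner's
    payment does not depend on its own bid, truthful valuation is a
    dominant strategy, the property the replication scheme builds on.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Hypothetical servers bidding their valuation (e.g. expected savings
# in object transfer cost) for hosting a replica of one data object.
bids = {"server_A": 42.0, "server_B": 57.5, "server_C": 51.0}
print(vickrey_winner(bids))  # ('server_B', 51.0)
```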

Enhanced Spectral Envelope Coding Based on NLMS for G.729.1

In this paper, a new encoding algorithm for the spectral envelope, based on the normalized least mean squares (NLMS) algorithm, is proposed for G.729.1 in VoIP. In the TDAC part of G.729.1, the spectral envelope and MDCT coefficients extracted from the weighted CELP coding error (lower band) and from the higher-band input signal are encoded. In order to reduce the bits allocated to spectral envelope coding, a new quantization algorithm based on NLMS is proposed, and the saved bits are used to enhance sound quality. The performance of the proposed algorithm is evaluated in terms of sound quality and bit reduction rate under clean and frame-loss conditions.
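
The NLMS adaptation at the heart of such a coder updates predictor weights in proportion to the prediction error, normalized by the input energy, so that only the (smaller) residuals need many bits. A minimal sketch; the filter order and step size are illustrative, not G.729.1's settings:

```python
import numpy as np

def nlms_predict(signal, order=4, mu=0.5, eps=1e-8):
    """Run an NLMS adaptive predictor over a 1-D signal.

    Each sample is predicted from the previous `order` samples; the
    weight update is normalized by the input energy (the 'N' in
    NLMS). Returns the per-sample prediction errors, which in a
    coder can be quantized with fewer bits than the raw values.
    """
    w = np.zeros(order)
    errors = np.zeros(len(signal))
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]      # most recent samples first
        e = signal[n] - w @ x              # prediction error
        w += (mu / (eps + x @ x)) * e * x  # normalized LMS update
        errors[n] = e
    return errors

rng = np.random.default_rng(1)
env = np.cumsum(rng.normal(0, 0.1, 200))   # slowly varying "envelope"
err = nlms_predict(env)
print(f"raw var {env.var():.3f} -> residual var {err[4:].var():.3f}")
```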

A Multiple-Objective Environmental Rationalization and Optimization for Material Substitution in the Production of Stone-Washed Jeans-Garments

As the textile industry is the second largest industry in Egypt, and as small and medium-sized enterprises (SMEs) make up a great portion of it, it is essential to apply the concept of Cleaner Production in order to reduce pollution. To achieve this goal, a case study concerned with eco-friendly stone-washing of jeans-garments was investigated. A raw-material substitution option was adopted whereby the toxic potassium permanganate and sodium sulfide were replaced by the environmentally compatible hydrogen peroxide and glucose, respectively, and the concentrations of both replacement chemicals together with the operating time were optimized. In addition, a process rationalization option involving four additional processes was investigated. By means of criteria such as product quality, effluent analysis, mass and heat balances, and cost analysis, with the aid of a statistical model, a process optimization treatment revealed that the superior process optima were 50%, 0.15% and 50 min for H2O2 concentration, glucose concentration and time, respectively. With these values, the superior process should reduce the annual cost by about EGP 10^5 relative to the currently used conventional method.

A Study on the Quality of Hexapod Machine Tool's Workspace

One of the main concerns about parallel mechanisms is the presence of singular points within their workspaces. In singular positions the mechanism gains or loses one or several degrees of freedom, making it impossible to control, so these positions have to be avoided. This is a vital need, especially in computer-controlled machine tools designed and manufactured on the basis of parallel mechanisms, and it has to be taken into consideration when selecting design parameters. A prerequisite is thorough knowledge of the effect of design parameters and constraints on singularity. In this paper, a quality condition index is introduced as a criterion for evaluating the singularities of the different configurations of a hexapod mechanism obtainable with different design parameters. It is illustrated that this method can effectively be employed to obtain the optimum configuration of a hexapod mechanism with the aim of avoiding singularity within the workspace. The method is then employed to design the hexapod table of a CNC milling machine.
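
Closeness to singularity in parallel mechanisms is commonly screened through the conditioning of the manipulator Jacobian: the mechanism is singular where the Jacobian loses rank, and poorly conditioned nearby. A minimal sketch of such screening; the Jacobians below are stand-in matrices, not the hexapod's:

```python
import numpy as np

def condition_index(J):
    """Inverse condition number of a Jacobian J, in [0, 1].

    1 means perfectly conditioned (isotropic); values near 0 mean
    the pose is close to a singularity, where the mechanism gains
    or loses degrees of freedom and becomes uncontrollable.
    """
    s = np.linalg.svd(J, compute_uv=False)
    return s.min() / s.max()

# Stand-in Jacobians for two candidate poses of a 6-DOF mechanism.
rng = np.random.default_rng(2)
J_good = np.eye(6) + 0.1 * rng.normal(size=(6, 6))
J_near_singular = J_good.copy()
J_near_singular[5] = J_near_singular[4] + 1e-6  # two nearly dependent rows

for name, J in [("good pose", J_good), ("near-singular pose", J_near_singular)]:
    print(f"{name}: condition index = {condition_index(J):.2e}")
```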

A Similarity Function for Global Quality Assessment of Retinal Vessel Segmentations

Retinal vascularity assessment plays an important role in the diagnosis of ophthalmic pathologies. The use of digital images for this purpose makes a computerized approach possible and has motivated the development of many methods for automated vascular tree segmentation. Metrics based on contingency tables for binary classification have been widely used for evaluating the performance of these algorithms, and accuracy in particular has mostly been used as the measure of global performance in this field. However, this metric matches human perception very poorly and shows other notable deficiencies. Here, a new similarity function for measuring the quality of retinal vessel segmentations is proposed. It is based on characterizing the vascular tree as a connected structure with a measurable area and length. Tests indicate that this new approach behaves better than the current one. More generally, this concept of measuring descriptive properties may be used to design functions that more successfully measure the segmentation quality of other complex structures.
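
One plausible form of the idea, sketched below, compares a candidate segmentation with a reference through the two descriptive properties named here, area and length; this is an illustrative sketch only, not the paper's actual similarity function:

```python
import numpy as np
from skimage.morphology import skeletonize

def descriptive_similarity(seg, ref):
    """Illustrative similarity from descriptive properties.

    seg, ref: binary 2-D masks (candidate segmentation, reference).
    Compares area (vessel pixel count) and length (skeleton pixel
    count) of the two trees and averages the relative agreements
    into a score in [0, 1].
    """
    def props(mask):
        return mask.sum(), skeletonize(mask).sum()

    (a_s, l_s), (a_r, l_r) = props(seg), props(ref)
    area_sim = 1.0 - abs(a_s - a_r) / max(a_s, a_r)
    length_sim = 1.0 - abs(l_s - l_r) / max(l_s, l_r)
    return 0.5 * (area_sim + length_sim)

# Toy masks: a thick vs. thin version of the same vessel segment.
ref = np.zeros((32, 32), bool); ref[8:10, 4:28] = True
seg = np.zeros((32, 32), bool); seg[7:11, 4:28] = True
print(round(descriptive_similarity(seg, ref), 3))
```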

Adaptive Bidirectional Flow for Image Interpolation and Enhancement

Image interpolation is a common problem in imaging applications. However, most existing interpolation algorithms suffer to some extent from visibly blurred edges and jagged artifacts. This paper presents an adaptive, feature-preserving bidirectional flow process, in which an inverse diffusion is performed to sharpen edges along the directions normal to the isophote lines (edges), while an ordinary forward diffusion is applied to remove artifacts ("jaggies") along the tangent directions. In order to preserve image features such as edges, corners and textures, the nonlinear diffusion coefficients are locally adjusted according to the directional derivatives of the image. Experimental results on synthetic and natural images demonstrate that our interpolation algorithm substantially improves the subjective quality of the interpolated images over conventional interpolation methods.
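
A worked sketch of this kind of bidirectional flow in standard gauge coordinates; the notation is assumed here (ξ the tangent to the isophote, η the gradient/normal direction), not quoted from the paper:

```latex
% Bidirectional flow: backward (inverse) diffusion across edges,
% forward diffusion along them. I_{\eta\eta} and I_{\xi\xi} are the
% second derivatives of the image I in the gradient (normal) and
% isophote (tangent) directions, respectively.
\[
  \frac{\partial I}{\partial t}
    = -\,\alpha\!\left(|\nabla I|\right) I_{\eta\eta}
      \;+\; \beta\!\left(|\nabla I|\right) I_{\xi\xi},
  \qquad \alpha, \beta \ge 0 .
\]
% The negative sign on I_{\eta\eta} sharpens edges (inverse
% diffusion), while the positive tangent term smooths jaggies along
% them; making \alpha and \beta depend on local directional
% derivatives gives the adaptive, feature-preserving behaviour
% described above.
```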

Using Suffix Tree Document Representation in Hierarchical Agglomerative Clustering

In the text categorization problem, the most widely used method for document representation is based on word frequency vectors and is called the VSM (Vector Space Model). This representation is based only on the words in the documents and thus loses any "word context" information found in them. In this article we compare the classical method of document representation with a method called the Suffix Tree Document Model (STDM), which is based on representing documents in suffix tree format. For the STDM we propose a new approach to document representation and a new formula for computing the similarity between two documents: we build the suffix tree for only two documents at a time. This approach is faster, has lower memory consumption, and uses the entire document representation without needing methods for disposing of nodes. The proposed similarity formula substantially improves clustering quality. The representation method was validated using HAC (Hierarchical Agglomerative Clustering). In this context we also experiment with the influence of stemming in the document preprocessing step and highlight the difference between similarity and dissimilarity measures in finding "closer" documents.
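
A minimal sketch in the spirit of the STDM, scoring a document pair by their shared word n-grams, the phrases a generalized suffix tree of the pair would expose as common paths. This is an illustrative stand-in, not the article's actual formula:

```python
def shared_phrase_similarity(doc_a, doc_b, max_n=4):
    """Pairwise similarity from shared word n-grams of length 1..max_n.

    Longer shared phrases are weighted more heavily, echoing the
    deeper common paths of a generalized suffix tree built for the
    two documents at a time.
    """
    def ngrams(words, n):
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    a, b = doc_a.lower().split(), doc_b.lower().split()
    score, total = 0.0, 0.0
    for n in range(1, max_n + 1):
        ga, gb = ngrams(a, n), ngrams(b, n)
        if not ga or not gb:
            break
        score += n * len(ga & gb)
        total += n * min(len(ga), len(gb))
    return score / total if total else 0.0

print(shared_phrase_similarity("the cat sat on the mat",
                               "the cat sat on a rug"))
```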

Hybrid of Hunting Search and Modified Simplex Methods for Grease Position Parameter Design Optimisation

This study proposes a multi-response surface optimization problem (MRSOP) for determining the proper choices in a process parameter design (PPD) decision problem in the noisy environment of a grease position process in the electronics industry. The proposed model attempts to maximize dual process responses: the mean of parts between failure on the left and right processes. The conventional modified simplex method and its hybridization with a stochastic operator from the hunting search algorithm are applied to determine the proper levels of the controllable design parameters affecting the quality performances. A numerical example demonstrates the feasibility of applying the proposed model to the PPD problem via the two iterative methods, and their advantages are discussed. Numerical results demonstrate that the hybridization is superior to the conventional method: in this study, the mean of parts between failure on the left and right lines improved by approximately 39.51%. All experimental data presented in this research have been normalized to disguise actual performance measures, as the raw data are considered confidential.
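
An illustrative sketch of how a simplex search can be hybridized with a hunting-style stochastic operator (not the authors' exact algorithm): each round scatters a "hunting pack" of random points around the incumbent, then refines the pack leader with Nelder-Mead simplex search.

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_simplex_hunt(f, x0, rounds=5, pack_size=8, radius=0.5, seed=0):
    """Minimize f by alternating a stochastic hunting step with a
    local simplex (Nelder-Mead) refinement. Illustrative only."""
    rng = np.random.default_rng(seed)
    best_x, best_f = np.asarray(x0, float), f(x0)
    for _ in range(rounds):
        # Hunting step: the pack closes in around the current best.
        pack = best_x + rng.normal(0.0, radius, (pack_size, best_x.size))
        leader = min(pack, key=f)
        # Simplex step: local refinement from the pack leader.
        res = minimize(f, leader, method="Nelder-Mead")
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
        radius *= 0.7  # shrink the search ring, as in hunting search
    return best_x, best_f

# Toy multimodal response surface standing in for the process model.
f = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2 + np.sin(3 * x[0]) ** 2
print(hybrid_simplex_hunt(f, [0.0, 0.0]))
```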