Integrating the Theory of Constraints and Six Sigma in Manufacturing Process Improvement

Six Sigma is a well-known discipline that reduces variation using complex statistical tools and the DMAIC model. By integrating Goldratt's Theory of Constraints, the Five Focusing Steps and Systems Thinking tools, Six Sigma projects can be selected where they will have the greatest impact on the company. This research defines an integrated model of Six Sigma and constraint management that provides a step-by-step guide based on the original methodologies of each discipline. The model is evaluated in a case study of a production line for V8 automobile engine monoblocks, resulting in an increase in line capacity from 18.7 to 22.4 pieces per hour, a 60% reduction in Work-In-Process and a 0.73% decrease in variation.

A File Splitting Technique for Reducing the Entropy of Text Files

A novel file splitting technique for reducing the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that the sum of their entropies is less than that of the original text file for the same extension orders. These statistical properties of the resulting subfiles can be exploited to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually, on a bit-wise rather than a character-wise basis.
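
As an illustration of the splitting idea only (not the paper's codeword assignment method), the sketch below maps a text to fixed-length binary codewords, collects bit k of every codeword into subfile k, and compares the zeroth-order entropy of the original text with the sum of the subfiles' entropies. The fixed-length code and all names are assumptions; the entropy reduction reported in the paper relies on the authors' own codeword assignment and on higher-order entropies.

```python
# Hypothetical sketch: split fixed-length codewords into bit-wise subfiles and
# compare zeroth-order entropies. Not the paper's codeword assignment method.
from collections import Counter
from math import log2, ceil

def entropy(symbols):
    """Zeroth-order entropy in bits per symbol."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values())

def split_into_bit_subfiles(text):
    alphabet = sorted(set(text))
    width = max(1, ceil(log2(len(alphabet))))          # bits per codeword
    code = {ch: format(i, f"0{width}b") for i, ch in enumerate(alphabet)}
    codewords = [code[ch] for ch in text]
    # Subfile k collects bit k of every codeword.
    return ["".join(cw[k] for cw in codewords) for k in range(width)]

if __name__ == "__main__":
    sample = "the quick brown fox jumps over the lazy dog " * 20
    subfiles = split_into_bit_subfiles(sample)
    print("original entropy:        ", entropy(sample))
    print("sum of subfile entropies:", sum(entropy(s) for s in subfiles))
```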

What Creative Industries Have to Offer to Business? Creative Partnerships and Mutual Benefits

In a time of globalisation, growing uncertainty, ambiguity and change, traditional ways of doing business are no longer sufficient, and it is important to consider non-conventional methods and approaches to release creativity and facilitate innovation and growth. Creative industries, as a natural source of creativity and innovation, therefore draw particular attention. This paper explores the feasibility of building creative partnerships between the creative industries and business and draws attention to the mutual benefits derived from such partnerships. Design/approach - This paper is a theoretical exploration of projects, practices and research findings addressing collaboration between the creative industries and business. It considers the creative industries, the arts, business and their representatives in order to define the requirements for creative partnerships to work and succeed. Findings - Current practice in entering arts-business partnerships is still limited, although most creative partnerships have proved highly valuable and mutually beneficial. Certain conditions must be met in order to benefit from arts-business creative synergy. Originality/value - By integrating different sources of literature, this article provides a basis for conducting empirical research along several dimensions of arts-business partnerships.

Dynamic Traffic Simulation for Traffic Congestion Problem Using an Enhanced Algorithm

Traffic congestion has become a major problem in many countries, and one of its main causes is road merges: vehicles tend to move more slowly when they reach the merging point. In this paper, an enhanced traffic simulation algorithm based on the fluid-dynamic algorithm and kinematic wave theory is proposed and used to study traffic congestion at a road merge. The paper also describes the development of a dynamic traffic simulation tool that is used for scenario planning and for forecasting the level of traffic congestion at a given time based on defined parameter values. The tool incorporates the enhanced algorithm as well as the two original algorithms. The outputs of the three algorithms are measured in terms of traffic queue length, travel time and the total number of vehicles passing through the merging point. The paper also suggests an efficient way of reducing traffic congestion at a road merge by analyzing the traffic queue length and travel time.
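
As a rough illustration of the kind of model involved (not the paper's enhanced algorithm), the sketch below implements a minimal cell-transmission discretisation of the kinematic wave model for two inbound links merging into one outbound link and tracks the upstream queue length. All parameter values and function names are illustrative assumptions.

```python
# Minimal cell-transmission sketch of a road merge (illustrative parameters).
def simulate_merge(steps=200, cells=10, cap=20, jam=80, demand=(12, 10)):
    up = [[0] * cells for _ in range(2)]      # two upstream links
    down = [0] * cells                        # downstream link after the merge
    queue_history = []

    def send(n):                              # vehicles a cell can send
        return min(n, cap)

    def receive(n):                           # vehicles a cell can accept
        return min(cap, jam - n)

    for _ in range(steps):
        # downstream link: forward propagation, free outflow at the end
        for i in range(cells - 1, 0, -1):
            flow = min(send(down[i - 1]), receive(down[i]))
            down[i - 1] -= flow
            down[i] += flow
        down[-1] -= send(down[-1])

        # merge: both upstream links compete for downstream capacity
        room = receive(down[0])
        offers = [send(link[-1]) for link in up]
        total = sum(offers) or 1
        for k, link in enumerate(up):
            flow = min(offers[k], room * offers[k] // total)
            link[-1] -= flow
            down[0] += flow

        # upstream links: propagate and load demand at the entrance
        for k, link in enumerate(up):
            for i in range(cells - 1, 0, -1):
                flow = min(send(link[i - 1]), receive(link[i]))
                link[i - 1] -= flow
                link[i] += flow
            link[0] += min(demand[k], receive(link[0]))

        queue_history.append(sum(sum(link) for link in up))
    return queue_history

print("final upstream queue:", simulate_merge()[-1])
```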

Identification of Regulatory Mechanism of Orthostatic Response

The en bloc approach assumes that all phases of the orthostatic test are modelled with a single mathematical model, which allows a comprehensive parametric view of the orthostatic response. The work presents the implementation of a mathematical model for processing measurements of systolic blood pressure, diastolic blood pressure and heart rate performed on volunteers during the orthostatic test. The original model hypothesis, that every postural change acts as a single stressor, did not agree with the measured time profiles of the physiological circulation factors. The results of the identification support the hypothesis that the second postural change of the orthostatic test causes induced stressors, reflecting a physiological regulation mechanism. The effect is most pronounced in the heart rate and diastolic blood pressure time profiles and smallest in the systolic blood pressure measurements. The presented study gives a new view of the orthostatic test with an impact on clinical practice.

Diagnosis of Multivariate Process via Nonlinear Kernel Method Combined with Qualitative Representation of Fault Patterns

Fault detection and diagnosis of complicated production processes is one of the essential tasks needed to run a process safely and maintain good final product quality, since unexpected events occurring in the process may have a serious impact on it. In this work, a triangular representation of process measurement data obtained on-line is evaluated using a simulated process. The effect of using linear and nonlinear reduced spaces is also tested, and their diagnosis performance is demonstrated using multivariate fault data. The results show that the diagnosis method based on the nonlinear technique produces more reliable results and outperforms the linear method, and that the use of an appropriate reduced space yields better diagnosis performance. The presented diagnosis framework differs from existing ones in that it extracts the fault pattern in the reduced space rather than in the original process variable space; the use of a reduced model space helps to mitigate the sensitivity of the fault pattern to noise.
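
To illustrate the linear versus nonlinear reduced-space comparison only (not the paper's triangular-representation diagnosis), the sketch below projects simulated process data with PCA and kernel PCA from scikit-learn and flags fault samples by their distance from the normal-operation cluster. The data, threshold and kernel settings are assumptions.

```python
# Illustrative comparison of linear and nonlinear reduced spaces for fault detection.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 5))           # normal operating data
fault = rng.normal(0.0, 1.0, size=(50, 5)) + 3.0       # shifted fault data

for name, model in [("linear PCA", PCA(n_components=2)),
                    ("kernel PCA", KernelPCA(n_components=2, kernel="rbf", gamma=0.1))]:
    scores_normal = model.fit_transform(normal)
    scores_fault = model.transform(fault)
    centre = scores_normal.mean(axis=0)
    threshold = np.percentile(np.linalg.norm(scores_normal - centre, axis=1), 99)
    detected = np.linalg.norm(scores_fault - centre, axis=1) > threshold
    print(f"{name}: {detected.mean():.0%} of fault samples flagged")
```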

Enhancement of Stereo Video Pairs Using SDNs To Aid In 3D Reconstruction

This paper presents the results of enhancing the images of a left and right stereo pair in order to increase the resolution of a 3D representation of the scene generated from that pair. A new neural network structure known as a Self Delaying Dynamic Network (SDN) has been used to perform the enhancement. The advantage of SDNs over existing techniques such as bicubic interpolation is their ability to cope with motion and noise effects. SDNs are used to generate two high-resolution images, one based on frames taken from the left view of the subject and one based on frames from the right. This new high-resolution stereo pair is then processed by a disparity map generator, and the resulting disparity map is compared with two other maps generated from the same scene: one from an original high-resolution stereo pair and one from a stereo pair enhanced using bicubic interpolation. The maps generated from the SDN-enhanced pairs match the target maps more closely, and the addition of extra noise to the input images is less problematic for the SDN system, which still outperforms bicubic interpolation.
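
The bicubic baseline against which the SDN results are compared can be sketched with OpenCV as follows: upscale a low-resolution stereo pair with bicubic interpolation and compute a disparity map by block matching. The file names and matcher parameters are placeholders, and the SDN enhancement itself is not reproduced here.

```python
# Bicubic-interpolation baseline for stereo enhancement plus disparity map generation.
import cv2

left_lr = cv2.imread("left_lowres.png", cv2.IMREAD_GRAYSCALE)    # placeholder inputs
right_lr = cv2.imread("right_lowres.png", cv2.IMREAD_GRAYSCALE)

# Enhance each view with bicubic interpolation (2x in each dimension).
left_hr = cv2.resize(left_lr, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
right_hr = cv2.resize(right_lr, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)

# Generate a disparity map from the enhanced pair with block matching.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left_hr, right_hr)
cv2.imwrite("disparity_bicubic.png",
            cv2.normalize(disparity, None, 0, 255, cv2.NORM_MINMAX).astype("uint8"))
```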

A New Integer Programming Formulation for the Chinese Postman Problem with Time Dependent Travel Times

The Chinese Postman Problem (CPP) is one of the classical problems in graph theory and is applicable in a wide range of fields. With the rapid development of hybrid systems and model-based testing, the Chinese Postman Problem with Time Dependent Travel Times (CPPTDT) has become more realistic than the classical problem. In previous work, we proposed the first integer programming formulation for the CPPTDT, the circuit formulation, based on which some polyhedral results were investigated and a cutting plane algorithm was designed. However, that formulation has a major drawback: it can only solve the special instances in which all circuits pass through the origin. This paper therefore proposes a new integer programming formulation for solving all general instances of the CPPTDT. Moreover, the size of the circuit formulation, which was too large, is reduced dramatically here, making it possible to design more efficient algorithms for the CPPTDT in future research.
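
For context, a minimal integer programming formulation of the classical, time-independent undirected CPP (not the CPPTDT circuit formulation discussed above) can be written with PuLP as below: each edge is traversed at least once, and auxiliary integer variables force every vertex degree in the resulting multigraph to be even, so the traversals form a closed postman tour. The small example graph is purely illustrative.

```python
# Classical undirected CPP as a small ILP (illustrative example graph).
import pulp

edges = {("a", "b"): 3, ("b", "c"): 2, ("c", "a"): 4, ("b", "d"): 1, ("d", "c"): 5}
nodes = {v for e in edges for v in e}

prob = pulp.LpProblem("chinese_postman", pulp.LpMinimize)
x = {e: pulp.LpVariable(f"x_{e[0]}{e[1]}", lowBound=1, cat="Integer") for e in edges}
y = {v: pulp.LpVariable(f"y_{v}", lowBound=0, cat="Integer") for v in nodes}

prob += pulp.lpSum(cost * x[e] for e, cost in edges.items())
for v in nodes:
    # total traversals incident to v must be even: sum = 2 * y_v
    prob += pulp.lpSum(x[e] for e in edges if v in e) == 2 * y[v]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("tour cost:", pulp.value(prob.objective))
print({e: int(x[e].value()) for e in edges})
```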

On the Reduction of Side Effects in Tomography

As Computed Tomography (CT) normally requires hundreds of projections to reconstruct an image, patients are exposed to more X-ray energy, which may cause side effects such as cancer. Even when the variability of the particles in the object is very low, Computed Tomography requires many projections for good-quality reconstruction. In this paper, the low variability of the particles in an object is exploited to obtain good-quality reconstruction. Although the reconstructed image and the original image have the same projections, they need not, in general, be identical; if a priori information about the image is known in addition to the projections, a good-quality reconstructed image can be obtained. The paper shows through experimental results why conventional algorithms fail to reconstruct from a few projections, and gives an efficient polynomial-time algorithm that reconstructs a bi-level image from its row and column projections, a known sub-image of the unknown image, and smoothness constraints, by reducing the reconstruction problem to an integral max-flow problem. The paper also discusses the necessary and sufficient conditions for uniqueness and the extension of 2D bi-level image reconstruction to 3D bi-level image reconstruction.
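
The max-flow reduction for the basic case (row and column projections only, without the known sub-image or smoothness constraints) can be sketched with networkx as follows; the node naming and the small example projections are illustrative assumptions.

```python
# Reconstruct a bi-level image from its row and column sums via max flow.
import networkx as nx

def reconstruct(row_sums, col_sums):
    G = nx.DiGraph()
    for i, r in enumerate(row_sums):
        G.add_edge("s", f"r{i}", capacity=r)          # source -> row with row sum
    for j, c in enumerate(col_sums):
        G.add_edge(f"c{j}", "t", capacity=c)          # column -> sink with column sum
    for i in range(len(row_sums)):
        for j in range(len(col_sums)):
            G.add_edge(f"r{i}", f"c{j}", capacity=1)  # pixel (i, j) is 0 or 1

    value, flow = nx.maximum_flow(G, "s", "t")
    if value != sum(row_sums):
        return None                                   # projections inconsistent
    return [[flow[f"r{i}"][f"c{j}"] for j in range(len(col_sums))]
            for i in range(len(row_sums))]

for row in reconstruct([2, 1, 2], [1, 2, 2]):
    print(row)
```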

An Algorithm for Secure Visible Logo Embedding and Removing in Compression Domain

Digital watermarking is the process of embedding information into a digital signal and can be used in DRM (digital rights management) systems. A visible watermark (often called a logo) indicates the owner of the copyright, is frequently seen in TV programs, and protects the copyright in an active way. However, most schemes do not consider the process of removing the visible watermark. To address this problem, a visible watermarking scheme with both embedding and removing processes is proposed, controlled by a secure template. The template generates different versions of the watermark for different users that appear visually identical. Users with the correct key can completely remove the watermark and recover the original image, while unauthorized users are prevented from removing it. Experimental results show that our watermarking algorithm achieves good visual quality and is hard for unauthorized users to remove, while authorized users can completely remove the visible watermark and recover the original image with good quality.
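
The reversibility principle behind embedding and removing a visible watermark can be illustrated in the pixel domain as below: the logo is blended in with key-derived strengths, and a user who knows the key and the logo can invert the blend exactly. This is a hypothetical sketch, not the paper's compressed-domain scheme or its secure template.

```python
# Key-controlled visible watermark embedding and exact removal (pixel-domain sketch).
import numpy as np

def key_to_alpha(key, shape, lo=0.2, hi=0.4):
    rng = np.random.default_rng(key)
    return rng.uniform(lo, hi, size=shape)          # per-pixel blending strengths

def embed(image, logo, key):
    a = key_to_alpha(key, image.shape)
    return (1.0 - a) * image + a * logo

def remove(watermarked, logo, key):
    a = key_to_alpha(key, watermarked.shape)
    return (watermarked - a * logo) / (1.0 - a)

image = np.random.default_rng(1).uniform(0, 255, size=(64, 64))
logo = np.zeros_like(image)
logo[16:48, 16:48] = 255.0
marked = embed(image, logo, key=42)
restored = remove(marked, logo, key=42)
print("max reconstruction error:", np.abs(restored - image).max())
```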

Performance Trade-Off of File System between Overwriting and Dynamic Relocation on a Solid State Drive

Most file systems overwrite modified file data and metadata in their original locations, while the Log-structured File System (LFS) dynamically relocates them to other locations. We design and implement the Evergreen file system, which can select between overwriting and relocation for each block of a file or its metadata. The Evergreen file system can therefore achieve superior write performance by sequentializing write requests (similar to LFS-style relocation) when space utilization is low and by overwriting when utilization is high. Another challenge is identifying the performance benefits of LFS-style relocation over overwriting on an SSD (Solid State Drive), a recently introduced device that contains only Flash-memory chips and control circuits, without mechanical parts. Our experimental results measured on an SSD show that relocation outperforms overwriting when space utilization is below 80%, and vice versa.
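
The policy choice described above can be summarized in a toy sketch; the 80% threshold comes from the measurements reported here, while the function and variable names are illustrative.

```python
# Toy per-write policy: relocate while free space is plentiful, overwrite otherwise.
UTILIZATION_THRESHOLD = 0.80      # crossover point observed in the SSD measurements

def choose_write_policy(used_blocks, total_blocks):
    utilization = used_blocks / total_blocks
    return "relocate" if utilization < UTILIZATION_THRESHOLD else "overwrite"

print(choose_write_policy(600, 1000))   # -> relocate
print(choose_write_policy(900, 1000))   # -> overwrite
```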

Rheology of Composites with Natural Vegetal-Origin Fibers

The replacement of conventional materials such as glass, wood or metals with polymer materials continues, the main reason being simpler and therefore cheaper production. However, due to high energy and petrochemical prices, polymer prices are increasing as well, which is why various kinds of fillers are used to make polymers cheaper; the aim, of course, is to maintain or improve the properties of these compounds. This paper addresses the rheology of polymers compounded with fibers of vegetal origin.

A Watermarking Scheme for MP3 Audio Files

In this work we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG Audio Layer 3 (MP3) files that operates directly in the compressed data domain while manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient use of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in time comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user or audience perceives no artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability of algorithms operating in the PCM data domain to compression/recompression attacks, as it places the watermark in the scale-factor domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, relies on the fact that users establish ownership of the audio file not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
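
To illustrate the idea of carrying the watermark in the scale-factor domain rather than in the decoded PCM samples, the hypothetical sketch below embeds watermark bits in the parity of an array standing in for quantized scale factors; real MP3 bitstream parsing and the authors' actual embedding rule are omitted.

```python
# Hypothetical parity-based embedding into stand-in scale factors.
import numpy as np

def embed_bits(scale_factors, bits):
    sf = scale_factors.copy()
    for i, bit in enumerate(bits):
        if sf[i] % 2 != bit:            # adjust parity by the smallest step
            sf[i] += 1
    return sf

def extract_bits(scale_factors, count):
    return [int(scale_factors[i] % 2) for i in range(count)]

scale_factors = np.random.default_rng(7).integers(0, 15, size=64)   # stand-in values
watermark = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_bits(scale_factors, watermark)
print(extract_bits(marked, len(watermark)) == watermark)
```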

The Taste of Native Land in Everyday Practices of Repatriates – Variations by the Countries of Origin (by Field Materials)

Practices of food sharing, as part of the interpretation of brotherhood and hospitality, have been an essential part of Kazakh ethnic culture since early times. A dialogue in time and space between Kazakhs, conducted through differences in the interpretation of food among ethnic repatriates, may become either a link connecting them and a platform for stable relations with the host society, or a serious barrier to their integration into Kazakhstani society. Drawing on field materials, the article elucidates how some differences in food culture between ethnic Kazakhs living abroad (in the XUAR of China) and ethnic repatriates in Kazakhstan may influence their integration path.

Relative Mapping Errors of Linear Time Invariant Systems Caused By Particle Swarm Optimized Reduced Order Model

The authors present an optimization algorithm for order reduction and its application to determining the relative mapping errors introduced when linear time-invariant dynamic systems are approximated by simplified models. These relative mapping errors are expressed by means of the relative integral square error criterion, which is determined for both unit step and impulse inputs. The reduction algorithm is based on minimizing the integral square error for a unit step input using the particle swarm optimization technique. The algorithm is simple and computer-oriented, and it is shown to have several advantages; for example, the reduced-order models retain the steady-state value and stability of the original system. Two numerical examples are solved to illustrate the superiority of the algorithm over some existing methods.
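
The relative integral square error criterion can be computed numerically with scipy.signal for both step and impulse inputs, as sketched below. The transfer functions are placeholders rather than the paper's examples, and normalizing by the energy of the original model's response is an assumption about the exact definition of the criterion.

```python
# Relative ISE between the responses of a full model and a reduced model.
import numpy as np
from scipy import signal

original = signal.TransferFunction([10], [1, 6, 11, 6])   # 3rd-order placeholder
reduced = signal.TransferFunction([1.67], [1, 1])         # 1st-order placeholder
t = np.linspace(0, 10, 2000)

def relative_ise(response):
    _, y_full = response(original, T=t)
    _, y_red = response(reduced, T=t)
    err = np.trapz((y_full - y_red) ** 2, t)
    return err / np.trapz(y_full ** 2, t)     # assumed normalization

print("relative ISE (step):   ", relative_ise(signal.step))
print("relative ISE (impulse):", relative_ise(signal.impulse))
```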

Embedded Semi-Fragile Signature Based Scheme for Ownership Identification and Color Image Authentication with Recovery

In this paper, a novel scheme is proposed for ownership identification and color image authentication that deploys cryptography and digital watermarking. The color image is first transformed from RGB to the YST color space, designed exclusively for watermarking. After the color space transformation, each channel is divided into 4×4 non-overlapping blocks and the central 2×2 sub-block of each is selected. Depending on the channel, two to three LSBs of each central 2×2 sub-block are set to zero to hold the ownership, authentication and recovery information. The size and position of the sub-block are important for correct localization, enhanced security and fast computation. Since YS ⊥ T, the T channel is suitable for embedding the recovery information apart from the ownership and authentication information; each 4×4 block of the T channel, together with the ownership information, is therefore processed with SHA-160 to compute a content-based hash that is unique and invulnerable to birthday attacks or hash collisions, instead of MD5, which may give rise to the condition H(m) = H(m'). For recovery, the intensity mean of each 4×4 block of every channel is computed and encoded into eight bits. For watermark embedding, key-based mapping of blocks is performed using a 2D torus automorphism. Our scheme is oblivious, generates highly imperceptible images with correct localization of tampering within reasonable time, and has the ability to recover the original work with probability near one.
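
The key-based block mapping step can be illustrated with a generalized 2D torus automorphism (the Arnold cat-map family), as below; the map parameter, iteration count and grid size are illustrative assumptions, not the paper's actual key schedule. Because the map matrix has determinant 1, the mapping is a bijection on the block grid, so every block's recovery data is sent to a distinct target block.

```python
# Key-based block mapping with a 2D torus automorphism (illustrative parameters).
def torus_map(i, j, n, k):
    """One application of the unimodular map [[1, 1], [k, k + 1]] modulo n."""
    return (i + j) % n, (k * i + (k + 1) * j) % n

def map_block(i, j, grid_size, key=(3, 5)):
    k, iterations = key                      # key selects map parameter and iterations
    for _ in range(iterations):
        i, j = torus_map(i, j, grid_size, k)
    return i, j

grid = 16                                    # e.g. a 16 x 16 grid of 4x4 blocks
mapping = {(i, j): map_block(i, j, grid) for i in range(grid) for j in range(grid)}
print(len(set(mapping.values())) == grid * grid)   # True: the mapping is a bijection
```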

The Effect of Gamma Irradiation on the Nutritional Properties of Functional Products of the Green Banana

Banana is one of the most consumed fruits in the tropics and subtropics, and Brazil accounts for about 9% of world banana production. However, production losses are as high as 30 to 40%, and even higher in some developing countries. Green banana flour is a complex carbohydrate source with a high content of total starch (73.4%) and resistant starch (17.5%), and it has functional properties. Gamma irradiation is considered an alternative method for food preservation; it is performed to extend the shelf-life of foods while maintaining their safety and avoiding one of the main concerns, nutrient loss. This work presents data on the effects of ionizing radiation on the physicochemical composition (carbohydrates, proteins, lipids, dietary fiber, moisture and ash) of Brazilian functional products (biscuits and bread) made from green banana pulp; the caloric value was also calculated. No significant difference was observed between irradiated and non-irradiated green banana biscuits for carbohydrates, proteins, dietary fiber and ash; only a small significant difference was found for lipids (macronutrients). The physicochemical analysis of irradiated and non-irradiated green banana bread showed no significant difference for carbohydrates, lipids (macronutrients), moisture, ash and caloric value, with a small difference found for proteins (macronutrients). Irradiation of the functional products (biscuits and bread) with doses of 1 and 3 kGy maintained their original macronutrient content, showing good radioresistance.

A Novel Optimized JTAG Interface Circuit Design

This paper describes a novel optimized JTAG interface circuit between a JTAG controller and a target IC. Able to access JTAG using only one or two pins, this circuit does not change the original boundary-scan test frequency of the target IC. Compared with the traditional JTAG interface based on IEEE Std 1149.1, this reduced-pin technology is more applicable to pin-limited devices and makes it easier for the designer to control the scale of the target IC.

Non-Overlapping Hierarchical Index Structure for Similarity Search

In order to accelerate similarity search in high-dimensional databases, we propose a new hierarchical indexing method composed of an offline and an online phase; our contribution concerns both. In the offline phase, after gathering the whole dataset into clusters and constructing a hierarchical index, the main originality of our contribution is a method for constructing bounding forms of clusters that avoids overlap. For the online phase, our idea considerably improves the performance of similarity search, and we have also developed an adapted search algorithm. Our method, named NOHIS (Non-Overlapping Hierarchical Index Structure), uses Principal Direction Divisive Partitioning (PDDP) as its clustering algorithm. The principle of PDDP is to divide the data recursively into two sub-clusters; the division is made by the hyperplane orthogonal to the principal direction derived from the covariance matrix and passing through the centroid of the cluster being divided. The data of each of the two resulting sub-clusters are enclosed by a minimum bounding rectangle (MBR), and the two MBRs are oriented along the principal direction, so non-overlap between the two forms is ensured. Experiments use databases containing image descriptors; the results show that the proposed method outperforms sequential scan and the SR-tree in processing k-nearest neighbor queries.
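
A single PDDP split as described above can be sketched in a few lines of numpy: compute the centroid and the leading principal direction, then separate the points by the sign of their projection onto that direction. The recursion and the direction-aligned MBRs are omitted, and the random data are purely illustrative.

```python
# One PDDP split: hyperplane orthogonal to the principal direction, through the centroid.
import numpy as np

def pddp_split(points):
    centroid = points.mean(axis=0)
    centred = points - centroid
    # principal direction = leading right singular vector of the centred data
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    direction = vt[0]
    projections = centred @ direction
    return points[projections <= 0], points[projections > 0]

data = np.random.default_rng(0).normal(size=(100, 8))
left, right = pddp_split(data)
print(left.shape, right.shape)
```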

Model Reduction of Linear Systems by Conventional and Evolutionary Techniques

The reduction of Single Input Single Output (SISO) continuous systems to a Reduced Order Model (ROM), using a conventional and an evolutionary technique, is presented in this paper. The conventional technique combines the advantages of the Mihailov stability criterion and the continued fraction expansion (CFE) technique: the reduced denominator polynomial is derived using the Mihailov stability criterion, and the numerator is obtained by matching the quotients of the Cauer second form of the continued fraction expansion. In the evolutionary technique, Particle Swarm Optimization (PSO) is employed to reduce the higher-order model; the PSO method is based on minimizing the Integral Squared Error (ISE) between the transient responses of the original higher-order model and the reduced-order model for a unit step input. Both methods are illustrated through a numerical example.
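
A toy sketch of the PSO part is given below: particles encode the two coefficients of a first-order reduced model and are moved to minimize the ISE between the step responses of the original and reduced models. The original transfer function, swarm parameters and bounds are illustrative assumptions, not the paper's numerical example.

```python
# Toy PSO minimizing the step-response ISE for a first-order reduced model.
import numpy as np
from scipy import signal

original = signal.TransferFunction([10], [1, 6, 11, 6])   # placeholder 3rd-order model
t = np.linspace(0, 10, 1000)
_, y_full = signal.step(original, T=t)

def ise(params):
    b, a = params
    if a <= 0 or b <= 0:                       # keep the reduced model stable
        return np.inf
    _, y_red = signal.step(signal.TransferFunction([b], [1, a]), T=t)
    return np.trapz((y_full - y_red) ** 2, t)

rng = np.random.default_rng(1)
pos = rng.uniform(0.1, 5.0, size=(20, 2))      # 20 particles, 2 coefficients
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([ise(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(50):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([ise(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print("reduced model: G(s) = %.3f / (s + %.3f), ISE = %.4f"
      % (gbest[0], gbest[1], pbest_val.min()))
```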