A Study of Thai Muslims’ Way of Life through Their Clothes

The purpose of this research was to investigate Thai Muslims' way of life through their clothes. The data of this qualitative research were collected from related documents and research reports, ancient cloths and clothing, and in-depth interviews with clothes owners and weavers. The research found that in the 18th century Thai Muslims in the three southern border provinces used many types of clothing in their lives. At home, women wore plain clothes and used checked cloths to cover the upper part of the body from the breasts down to the waist. When going out, they used Lima cloth and So Kae with a piece of Pla-nging cloth as a head scarf. Men wore a checked sarong as a lower garment and no upper garment; however, when going out, they wore Puyo Potong. In addition, Thai Muslims used cloths in various religious rites, namely the rite of placing a baby in a cradle, the Masoyawi rite, the Nikah rite, and the burial rite. These types of cloths were related to the way of life of Thai Muslims from birth to death. They reflected the race, gender, age, social status, values, and beliefs in traditions that have been inherited. Practical implication: woven into these cloths is local wisdom that is now being lost, and the aesthetics of the cloths are therefore mirrors reflecting the fading background of the people of this region. These cloths are pages of a local history book of such importance and value that they are worth preserving and publicizing so that they are treasured. Government organizations can expand and apply the knowledge gained from the study in accordance with government policy supporting the One Tambon, One Product project.

Lean Changeability – Evaluation and Design of Lean and Transformable Factories

In today's turbulent environment, companies are faced with two principal challenges. On the one hand, it is necessary to produce ever more cost-effectively to remain competitive. On the other hand, factories need to be transformable in order to manage unpredictable changes in the corporate environment. To deal with these different challenges, companies use the philosophy of lean production for the first and the philosophy of transformability for the second. To a certain extent these two approaches follow different directions, which can cause conflicts when designing factories. Therefore, the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover has developed a procedure that allows companies to evaluate and design their factories with respect to the requirements of both philosophies.

Neural Adaptive Switching Control of Robotic Systems

In this paper, a neural adaptive control method has been developed and applied to robot control. Simulation results are presented to verify the effectiveness of the controller. These results show that the performance obtained with this controller is better than that obtained with either direct inverse control or predictive control alone. In addition, they show that the resulting controller is a useful method that combines the advantages of both direct inverse control and predictive control.
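
As a concrete illustration of such a switching scheme, the sketch below identifies a small model online and switches from a bounded one-step predictive action to a direct inverse action once the model's recent prediction error is small. It is not the controller of the paper: the plant, the linear-in-parameters model, the gains, and the switching threshold are all illustrative assumptions.

```python
import numpy as np

# A conceptual sketch of neural-adaptive switching control, not the paper's
# controller: a small adaptive model is identified online; while its recent
# prediction error is large the (cautious, bounded) one-step predictive action
# is used, and once the model is accurate the direct inverse action takes over.
# The plant, model structure, gains and thresholds are illustrative assumptions.

def plant(y, u):
    return 0.6 * y + 0.8 * np.tanh(u)          # hypothetical nonlinear plant

w = np.array([0.5, 0.5])                        # adaptive model: y_hat = w0*y + w1*u
lr, y, r = 0.05, 0.0, 1.0                       # learning rate, state, reference
avg_err, mode = 1.0, "predictive"               # switching statistic and current mode

for k in range(300):
    if avg_err > 0.05 or abs(w[1]) < 1e-3:      # model still poor: predictive control
        grid = np.linspace(-3.0, 3.0, 121)
        u = grid[np.argmin(np.abs(w[0] * y + w[1] * grid - r))]
        mode = "predictive"
    else:                                       # model accurate: direct inverse control
        u = float(np.clip((r - w[0] * y) / w[1], -3.0, 3.0))
        mode = "inverse"
    y_next = plant(y, u)
    e = y_next - (w[0] * y + w[1] * u)          # one-step prediction error
    w += lr * e * np.array([y, u])              # online model adaptation
    avg_err = 0.9 * avg_err + 0.1 * abs(e)
    y = y_next

print(f"final mode: {mode}, output: {y:.3f}, reference: {r}")
```

The switching rule here is deliberately simple; the point is only to show how the two control modes can share a single adaptive model.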

A Study on Algorithm Fusion for Recognition and Tracking of Moving Robot

This paper presents an algorithm for the recognition and tracking of moving objects; a 1/10-scale model car is used to verify the performance of the algorithm. The presented algorithm is as follows: the SURF algorithm is merged with the Lucas-Kanade algorithm. The SURF algorithm is robust to contrast, size, and rotation changes and can recognize objects, but it is slow due to its computational complexity. The Lucas-Kanade algorithm is fast but cannot recognize objects; its optical flow compares the previous and current frames so that the movement of a pixel can be tracked. A Kalman filter is used to complement the problems that occur when the two algorithms are fused: it estimates the next location and compensates for the accumulated error. The resolution of the camera (vision sensor) is fixed at 640x480. To verify the performance of the fusion algorithm, tests are compared against the SURF algorithm alone under three situations: driving straight, driving on a curve, and recognizing cars behind obstacles. The model vehicle makes it possible to reproduce situations similar to actual driving. The proposed fusion algorithm showed better performance and accuracy than existing object recognition and tracking algorithms. We will further improve the algorithm so that it can be tested on images of actual road environments.
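
For illustration, a minimal Python/OpenCV sketch of this kind of SURF + Lucas-Kanade + Kalman fusion is given below. It is not the authors' implementation: the file names, match ratio, feature-count threshold, and filter settings are assumptions, and SURF requires an OpenCV build that includes the contrib modules.

```python
import cv2
import numpy as np

# A minimal sketch of a SURF + Lucas-Kanade + Kalman fusion, not the authors'
# implementation. Assumed inputs: a template image of the target car
# ("target.png") and a test video ("drive.avi"); SURF needs opencv-contrib.

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
matcher = cv2.BFMatcher(cv2.NORM_L2)

# Constant-velocity Kalman filter over the object centre: state (x, y, vx, vy).
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
kf.processNoiseCov = 1e-2 * np.eye(4, dtype=np.float32)

template = cv2.imread("target.png", cv2.IMREAD_GRAYSCALE)
kp_t, des_t = surf.detectAndCompute(template, None)

cap = cv2.VideoCapture("drive.avi")
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
pts = None                                  # object feature points being tracked

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if pts is None:                         # (re)recognition with SURF: robust but slow
        kp_f, des_f = surf.detectAndCompute(gray, None)
        if des_f is not None and len(des_f) >= 2:
            matches = matcher.knnMatch(des_t, des_f, k=2)
            good = [m for m, n in matches if m.distance < 0.7 * n.distance]
            if len(good) >= 8:
                pts = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    else:                                   # frame-to-frame tracking with Lucas-Kanade: fast
        pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        pts = pts[status.flatten() == 1].reshape(-1, 1, 2)
        if len(pts) < 8:
            pts = None                      # tracking lost: fall back to SURF next frame

    prediction = kf.predict()               # Kalman estimate of the next position
    if pts is not None:
        centre = pts.reshape(-1, 2).mean(axis=0)
        kf.correct(centre.reshape(2, 1).astype(np.float32))  # compensate accumulated drift

    prev_gray = gray
```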

Intellectual Capital Report for Universities

Intellectual capital reporting becomes critical at universities, mainly because knowledge is the main output as well as the main input of these institutions. In addition, universities face continuous external demands for greater information and transparency about the use of public funds, and are increasingly granted greater autonomy regarding their organization, management, and budget allocation. This situation requires new management and reporting systems. The purpose of the present study is to provide a model for intellectual capital reporting in Spanish universities. To this end, a questionnaire was sent to every member of the Social Councils of Spanish public universities in order to identify which intangible elements university stakeholders demand most. Our proposal for an intellectual capital report aims to act as a guide to help Spanish universities present information on intellectual capital that can assist stakeholders in making the right decisions.

Multi-stage Directional Median Filter

The median filter is widely used to remove impulse noise without blurring sharp edges. However, when the noise level increases or edges are thin, the median filter may perform poorly. This paper proposes a new filter that detects edges along four possible directions and then replaces each noise-corrupted pixel with an estimated noise-free edge median value. Simulations show that the proposed multi-stage directional median filter provides excellent impulse noise suppression in all situations.
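
The sketch below illustrates the general idea of a directional median filter under stated assumptions; it is not the authors' exact multi-stage filter. The impulse test (values 0 or 255) and the two-pixel reach along each direction are illustrative choices.

```python
import numpy as np

# A minimal sketch of the idea (not the authors' exact filter): a pixel that
# looks like an impulse (extreme value) is replaced by the median taken along
# the direction with the smallest intensity variation among the four candidate
# edge directions (0, 45, 90, 135 degrees). The salt-and-pepper noise test
# (values 0 or 255) is an illustrative assumption.

DIRECTIONS = [(0, 1), (1, 1), (1, 0), (1, -1)]   # 0, 45, 90, 135 degrees

def directional_median_filter(img):
    out = img.astype(np.float32).copy()
    h, w = img.shape
    for y in range(2, h - 2):
        for x in range(2, w - 2):
            if img[y, x] not in (0, 255):        # keep pixels that look clean
                continue
            best_spread, best_median = None, out[y, x]
            for dy, dx in DIRECTIONS:
                # Neighbours at +/-1 and +/-2 steps along this direction.
                vals = [float(img[y + k * dy, x + k * dx]) for k in (-2, -1, 1, 2)]
                spread = max(vals) - min(vals)   # small spread: edge runs this way
                if best_spread is None or spread < best_spread:
                    best_spread, best_median = spread, float(np.median(vals))
            out[y, x] = best_median
    return out.astype(img.dtype)

# Quick check on a synthetic gradient image corrupted by salt-and-pepper noise.
rng = np.random.default_rng(0)
clean = np.tile(np.linspace(10, 200, 32).astype(np.uint8), (32, 1))
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.1
noisy[mask] = rng.choice([0, 255], size=mask.sum()).astype(np.uint8)
restored = directional_median_filter(noisy)
print("mean abs error:", np.abs(restored.astype(int) - clean.astype(int)).mean())
```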

SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids

Some meta-schedulers query the information system of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational grid. However, this information can become outdated by the time a job starts due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing the job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction in execution time under production conditions.
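
The following schematic sketch shows the Multiple Simultaneous Requests idea that SWARM builds on, not SWARM's actual code; submit_job, job_has_started, and cancel_job are hypothetical placeholders for whatever grid middleware is in use.

```python
import time

# A schematic sketch of the Multiple Simultaneous Requests (MSR) idea, not
# SWARM's implementation. submit_job, job_has_started and cancel_job are
# hypothetical callbacks supplied by the surrounding grid middleware.

def run_with_msr(task, resources, submit_job, job_has_started, cancel_job,
                 poll_interval=5.0):
    """Submit the same task to every candidate resource, keep the first copy
    that leaves the queue, and cancel the redundant copies."""
    handles = {r: submit_job(task, r) for r in resources}
    while True:
        for resource, handle in handles.items():
            if job_has_started(handle):
                for other, h in handles.items():
                    if other != resource:
                        cancel_job(h)       # drop the copies still queuing
                return resource, handle     # this copy does the real work
        time.sleep(poll_interval)           # queues change as priorities shift
```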

Reciprocating Compressor Optimum Design and Manufacturing with Respect to Performance, Reliability and Cost

Reciprocating compressors are flexible in handling wide capacity and condition swings, offer a very efficient method of compressing almost any gas mixture over a wide range of pressures, can generate high head independent of density, and have numerous applications and wide power ratings. These features make them a vital component in various units of industrial plants. In this paper, an optimum reciprocating compressor configuration is presented with respect to interstage pressures, low suction pressure, non-lubricated cylinders, machine speed, capacity control system, compressor valves, lubrication system, piston rod coating, cylinder liner material, barring device, pressure drops, rod load, pin reversal, discharge temperature, cylinder coolant system, performance, flow, coupling, special tools, condition monitoring (including vibration, thermal, and rod drop monitoring), commercial points, delivery, and acoustic conditions.

Compiler-Based Architecture for Context Aware Frameworks

Computers are being integrated into various aspects of everyday human life, in different shapes and with different abilities. This fact has intensified the requirement for software development technologies that are: 1) portable, 2) adaptable, and 3) simple to develop. This requirement is also known as the Pervasive Computing Problem (PCP), which can be addressed in different ways, each with its own pros and cons; Context Oriented Programming (COP) is one of the methods to address the PCP. In this paper, a design for a COP framework, a context aware framework, is presented that eliminates the weak points of a previous design based on interpreted languages while introducing the power of compiled languages in implementing these frameworks. The key point of this improvement is combining COP and Dependency Injection (DI) techniques. Both the old and the new frameworks are analyzed to show their advantages and disadvantages. Finally, a simulation of both designs is presented, indicating that the practical results agree with the theoretical analysis, while the new design runs almost 8 times faster.
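
The sketch below (Python, used here only for brevity) illustrates the key combination of COP and DI described above: behaviour variants are bound per context and a small injector resolves the variant for the active context. It is an illustrative reading of the approach, not the paper's framework, and all class and context names are assumptions.

```python
from dataclasses import dataclass

# An illustrative sketch of combining COP with DI (not the paper's framework):
# behaviour variants are registered per context, and a tiny injector resolves
# the variant for the currently active context when constructing objects.

class Renderer:                      # role whose behaviour depends on context
    def show(self, text): ...

class ScreenRenderer(Renderer):
    def show(self, text): print(f"[screen] {text}")

class VoiceRenderer(Renderer):
    def show(self, text): print(f"[voice] {text}")

class Injector:
    def __init__(self):
        self._bindings = {}          # (role, context) -> implementation class
        self.context = "default"

    def bind(self, role, context, impl):
        self._bindings[(role, context)] = impl

    def resolve(self, role):
        impl = (self._bindings.get((role, self.context))
                or self._bindings[(role, "default")])
        return impl()

@dataclass
class Notifier:                      # client code depends only on the role
    renderer: Renderer
    def notify(self, msg): self.renderer.show(msg)

injector = Injector()
injector.bind(Renderer, "default", ScreenRenderer)
injector.bind(Renderer, "driving", VoiceRenderer)

injector.context = "driving"         # context change is picked up via injection
Notifier(injector.resolve(Renderer)).notify("Low battery")
```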

System-Level Energy Estimation for SoC based on the Dynamic Behavior of Embedded Software

This paper describes a system-level SoC energy consumption estimation method based on the dynamic behavior of embedded software in the early stages of SoC development. A major problem in SoC development is rework caused by unreliable energy consumption estimates at the early stages. The energy consumption of an SoC used in embedded systems is strongly affected by the dynamic behavior of the software. At the early stages of SoC development, modeling with a high level of abstraction is required for both the dynamic behavior of the software and the behavior of the SoC. We estimate the energy consumption by a UML model-based simulation. The proposed method is applied to an actual embedded system in a multifunction peripheral (MFP). The energy consumption estimation of the SoC is more accurate than with conventional methods, and the proposed method is promising for reducing the chance of rework in SoC development.

Weed Classification using Histogram Maxima with Threshold for Selective Herbicide Applications

Information on weed distribution within the field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time specific weed recognition system based on histogram maxima with a threshold, which is used for weed classification. The algorithm is specifically developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed system has been tested on weeds in the lab, and the tests have shown the system to be very effective in weed identification. Further, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis of the results shows over 95 percent classification accuracy over 140 sample images (broad and narrow), with 70 samples from each category of weeds.
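
The following Python sketch gives one possible reading of such a histogram-maxima classifier, not the exact published algorithm: vegetation is segmented with an excess-green test, the column-wise projection histogram is built, and the width of the runs around its maxima separates broad leaves from narrow blades. The segmentation rule and all thresholds are illustrative assumptions.

```python
import numpy as np

# A simplified sketch of the classification idea (an assumed reading of the
# method, not the exact published algorithm): vegetation is segmented with an
# excess-green test, the column-wise projection histogram of plant pixels is
# built, and the average width of the runs around its maxima decides broad
# vs. narrow. The thresholds below are illustrative only.

def classify_weed(rgb, min_broad_width=10):
    r, g, b = [rgb[..., i].astype(np.float32) for i in range(3)]
    veg = (2 * g - r - b) > 20                  # excess-green segmentation (assumed)
    hist = veg.sum(axis=0)                      # plant pixels per image column
    if hist.max() == 0:
        return "no vegetation"
    strong = hist >= 0.5 * hist.max()           # columns belonging to histogram maxima
    # Measure the width of each contiguous run of strong columns.
    edges = np.diff(strong.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if strong[0]:
        starts = np.r_[0, starts]
    if strong[-1]:
        ends = np.r_[ends, strong.size]
    mean_width = float(np.mean(ends - starts))
    return "broad" if mean_width >= min_broad_width else "narrow"

# Synthetic usage example: one wide leaf blob vs. several thin grass blades.
broad_img = np.zeros((120, 160, 3), np.uint8)
broad_img[30:90, 60:100, 1] = 200               # one 40-column-wide green patch
narrow_img = np.zeros((120, 160, 3), np.uint8)
for c in range(10, 160, 15):
    narrow_img[20:100, c:c + 2, 1] = 200        # 2-column-wide "blades"
print(classify_weed(broad_img), classify_weed(narrow_img))  # -> broad narrow
```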

Hiding Data in Images Using PCP

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media need to be transmitted conveniently over the network. Attacks, misuse, or unauthorized access to information is of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. It is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver such that no one except the authenticated receiver knows of the existence of the information. A considerable amount of work has been carried out by different researchers on steganography. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach works by selecting the embedding pixels using a mathematical function, then finding the 8-neighborhood of each selected pixel and mapping each bit of the secret message to the neighbor pixel coordinate positions in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or any of its neighbors lies at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimal degradation.
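
The sketch below is an illustrative scheme in the spirit of the description above, not the authors' exact PCP method: the pixel-selection function, the bit-to-neighbour mapping (one bit per neighbour, in the least significant bit), and the step size are all assumptions.

```python
import numpy as np

# An illustrative sketch in the spirit of the described scheme, not the
# authors' exact PCP method: pixels are selected by a simple arithmetic rule
# (here every k-th position along a raster scan, an assumption), each selected
# pixel's 8 neighbours carry one message bit each in their least significant
# bit, and pixels whose neighbourhood touches the image border are skipped.

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(gray, message_bits, step=37):
    stego = gray.copy()
    h, w = gray.shape
    bit = 0
    for idx in range(0, h * w, step):            # selection function (assumed)
        y, x = divmod(idx, w)
        if y < 1 or y >= h - 1 or x < 1 or x >= w - 1:
            continue                             # boundary check
        for dy, dx in NEIGHBOURS:
            if bit >= len(message_bits):
                return stego
            stego[y + dy, x + dx] = (stego[y + dy, x + dx] & 0xFE) | message_bits[bit]
            bit += 1
    raise ValueError("message too long for this cover image")

def extract(stego, n_bits, step=37):
    h, w = stego.shape
    bits = []
    for idx in range(0, h * w, step):
        y, x = divmod(idx, w)
        if y < 1 or y >= h - 1 or x < 1 or x >= w - 1:
            continue
        for dy, dx in NEIGHBOURS:
            if len(bits) == n_bits:
                return bits
            bits.append(int(stego[y + dy, x + dx]) & 1)
    return bits

cover = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)
secret = [1, 0, 1, 1, 0, 0, 1, 0] * 4            # 32 demo bits
assert extract(embed(cover, secret), len(secret)) == secret
```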

A Field Research for Investigating the Effect of Strategic Management on Institutionalization Levels of Enterprises

The aim of this study is to determine the effect of strategic management implementations on institutionalization levels. In this regard, a field study was conducted over 31 stone quarry enterprises in the cement-producing sector in Konya using the survey method. In this study, the institutionalization levels of the enterprises have been evaluated with respect to three dimensions: professionalization, management approach, and participation in decisions and delegation of authority. According to the results of the survey, there is a highly positive and statistically significant relationship between the strategic management implementations and the institutionalization levels of the enterprises. Additionally, considering the results of the regression analysis performed to establish the relationship between strategic management and institutionalization levels, it has been determined that the strategic management implementations of the enterprises can be used as a variable to explain their institutionalization levels, and also that these implementations increase the institutionalization levels of the enterprises.

The Study of the “Бақыт” (“Happiness”) Concept in the Kazakh Language

The given article deals with the use of concepts in many spheres of science, including their place in Kazakh linguistics. One such concept is “бақыт” (“happiness”) and its role in the Kazakh outlook; the article describes how it has been studied. Data on the study of the “happiness” concept in philosophy, psychology, cognitive linguistics, linguocultural studies, logic, and psycholinguistics are given in this work. Dwelling particularly on the extent to which the concept has been studied in cognitive linguistics, an analysis of linguists' points of view has been made. It is pointed out that the concept of “happiness” has not yet been studied in Kazakh linguistics and that it is necessary to establish the meaning of the language units related to this concept, i.e., blessings, proverbs, sayings, and phraseological units.

Evaluation Framework for Agent-Oriented Methodologies

Many agent-oriented software engineering methodologies have been proposed for software development; however, their application is still limited due to their lack of maturity. Evaluating the strengths and weaknesses of these methodologies plays an important role in improving them and in developing new, stronger methodologies. This paper presents an evaluation framework for agent-oriented methodologies that addresses six major areas: concepts, notation, process, pragmatics, support for software engineering, and marketability. The framework is then used to evaluate the Gaia methodology to identify its strengths and weaknesses, and to demonstrate the framework's ability to help improve agent-oriented methodologies by detecting their weaknesses in detail.

The Role of Knowledge Management in Enterprise 2.0

The term Enterprise 2.0 (E2.0) describes a collection of organizational and IT practices that help organizations establish flexible work models, visible knowledge-sharing practices, and higher levels of community participation. E2.0 parallels and builds on another term commonly used in the industry, Web 2.0. E2.0 also represents new packaging for strategic collaboration and Knowledge Management (KM). Organizations rely on collaboration and KM initiatives to attain innovation, growth, productivity, and performance goals.

Use of Novel Algorithms MAJE4 and MACJER-320 for Achieving Confidentiality and Message Authentication in SSL and TLS

Extensive use of the Internet coupled with the marvelous growth in e-commerce and m-commerce has created a huge demand for information security. The Secure Socket Layer (SSL) protocol is the most widely used security protocol on the Internet that meets this demand. It provides protection against eavesdropping, tampering, and forgery. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services such as confidentiality and authentication in SSL. However, recent attacks against RC4 and HMAC have raised questions about the confidence placed in these algorithms. Hence two novel cryptographic algorithms, MAJE4 and MACJER-320, have been proposed as substitutes for them. The focus of this work is to demonstrate the performance of these new algorithms and to suggest them as dependable alternatives that satisfy the need for security services in SSL. The performance evaluation has been carried out by practical implementation.

Effect of Enzyme and Heat Pretreatment on Sunflower Oil Recovery Using Aqueous and Hexane Extractions

The effects of enzyme action and heat pretreatment on the oil extraction yield from sunflower kernels were analysed using hexane extraction with a Soxhlet apparatus and aqueous extraction with an incubator shaker. Ground raw and heat-treated kernels, each with and without Viscozyme treatment, were used. Microscopic images of the kernels were taken to analyse the visible effects of each treatment on the cotyledon cell structure of the kernels. Heat pretreatment of the kernels before both extraction processes produced higher oil extraction yields than the control, with steam explosion being the most efficient. In hexane extraction, applying a combination of steam explosion and Viscozyme treatment to the kernels before extraction gave the maximum oil extractable in 1 hour, while for aqueous extraction, raw kernels treated with Viscozyme gave the highest oil extraction yield. Remarkable cotyledon cell disruption was evident in kernels treated with Viscozyme, whereas steam explosion and conventional heat-treated kernels showed similar effects.

A State Aggregation Approach to Singularly Perturbed Markov Reward Processes

In this paper, we propose a single-sample-path-based algorithm with state aggregation to optimize the average rewards of singularly perturbed Markov reward processes (SPMRPs) with large-scale state spaces. It is assumed that such a reward process depends on a set of parameters. Unlike other kinds of Markov chains, SPMRPs have their own hierarchical structure, and based on this special structure our algorithm can alleviate the computational load of the performance optimization. Moreover, our method can be applied online because it evolves with the simulated sample path. Compared with the original algorithm applied to general MRPs, a new gradient formula for the average reward performance metric in SPMRPs is introduced, which is proved in the Appendix. Based on these gradients, the schedule of the iteration algorithm, which relies on a single sample path, is presented. Finally, a special case in which the parameters only dominate the disturbance matrices is analyzed, and a precise comparison is made between our algorithm and the older ones that aim to solve these problems in general Markov reward processes; when applied to SPMRPs, our method converges faster in these cases. Furthermore, to illustrate the practical value of SPMRPs, a simple example of multiprogramming in computer systems is presented and simulated. Corresponding to this practical model, the physical meaning of SPMRPs in networks of queues is clarified.
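
For background only, the standard average-reward definition for a parameterized Markov reward process and the usual two-time-scale form of a singularly perturbed transition matrix are recalled below; the paper's new SPMRP gradient formula itself is proved in its Appendix and is not reproduced here.

```latex
% Background only (not the paper's new gradient formula): the average-reward
% criterion for a parameterized Markov reward process, and the two-time-scale
% transition structure typically assumed for singular perturbation.
\[
  \eta(\theta) \;=\; \lim_{N\to\infty}\frac{1}{N}\,
     \mathbb{E}_{\theta}\!\left[\sum_{k=0}^{N-1} r(X_k)\right]
  \;=\; \sum_{i}\pi_{\theta}(i)\,r(i),
\]
where $\pi_{\theta}$ is the stationary distribution of the transition matrix.
In the singularly perturbed case the transition matrix has the form
\[
  P_{\varepsilon}(\theta) \;=\; P^{0} \;+\; \varepsilon\,Q(\theta),
  \qquad 0 < \varepsilon \ll 1,
\]
with $P^{0}$ block-diagonal over groups of strongly interacting states and the
small disturbance term $Q(\theta)$ carrying the parameter dependence; this is
the hierarchical structure that the state-aggregation algorithm exploits.
```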

Managing Meat Safety at South African Abattoirs

The importance of ensuring safe meat handling and processing practices has been demonstrated in global reports on food safety scares and related illnesses and deaths. This has necessitated stricter meat safety control strategies. Today, many countries have regulated towards preventative and systematic control of safe meat processing at abattoirs, utilizing the Hazard Analysis Critical Control Point (HACCP) principles. HACCP systems have been reported as effective in managing food safety risks, if correctly implemented. South Africa has regulated the Hygiene Management System (HMS) based on HACCP principles applicable to abattoirs. Regulators use the Hygiene Assessment System (HAS) to audit compliance at abattoirs. These systems were benchmarked from the United Kingdom (UK). Little research has been done on them since their inception in 2004. This paper presents a review of the two systems, their implementation, and a comparison with HACCP. Recommendations are made for future research to demonstrate the utility of the HMS and HAS in assuring safe meat to consumers.