Anti-Money Laundering Requirements – Perceived Effectiveness

Anti-money laundering is commonly understood as the set of procedures, laws and regulations designed to curb the practice of generating income through illegal actions. In Malaysia, the government and law enforcement agencies have stepped up their capacities and efforts to curb money laundering since 2001, one such measure being the enactment of the Anti-Money Laundering Act (AMLA) in 2001. The costs of implementing anti-money laundering requirements (AMLR) can be burdensome to those involved in enforcing them. The objective of this paper is to explore the perceived effectiveness of AMLR from the enforcement agencies' perspective. This is a preliminary study whose findings will help give direction to further AML research in Malaysia. In addition, the results of this study provide empirical evidence on the perceived effectiveness of AMLR prior to further investigation of barriers to, and improvements of, the implementation of the anti-money laundering regime in Malaysia.

Aerodynamic Stall Control of a Generic Airfoil using Synthetic Jet Actuator

The aerodynamic stall control of a baseline 13-percent-thick NASA GA(W)-2 airfoil using a synthetic jet actuator (SJA) is presented in this paper. Unsteady Reynolds-averaged Navier-Stokes equations are solved on a hybrid grid using commercial software to simulate the effects of a synthetic jet actuator located at 13% of the chord from the leading edge, at a Reynolds number Re = 2.1x10^6 and incidence angles from 16 to 22 degrees. As a baseline validation, the experimental data for the pressure distribution at Re = 3x10^6 and the aerodynamic coefficients at Re = 2.1x10^6 (angle of attack varied from -16 to 22 degrees), without the SJA, are compared with the computational fluid dynamics (CFD) simulation. Good agreement of the CFD simulations is obtained for the aerodynamic coefficients and the pressure distribution. A working SJA has then been integrated with the baseline airfoil, with the initial focus on aerodynamic stall control at angles of attack from 16 to 22 degrees. The results show a noticeable improvement in aerodynamic performance, with an increase in lift and a decrease in drag in these post-stall regimes.

Parametric Study of a Vapor Compression Refrigeration Cycle Using a Two-Phase Constant Area Ejector

There are several ways of improving the performance of a vapor compression refrigeration cycle; the use of an ejector as the expansion device is one of them. The present paper aims at evaluating the performance improvement of a vapor compression refrigeration cycle over a wide range of operating conditions. A numerical model is developed, and a parametric study of important parameters such as the condensation temperature (30-50°C), evaporation temperature (-20 to 5°C), nozzle and diffuser efficiencies (0.75-0.95), and degrees of subcooling and superheating (0-15 K) is carried out. The model verification shows good agreement with literature data. The simulation results reveal that the condensation temperature has the highest effect (129%) on the performance improvement ratio, while superheating has the lowest (6.2%). Among the ejector efficiencies, the diffuser efficiency has a significant effect on the COP of the ejector expansion refrigeration cycle. The COP improvement percentage decreases from 10.9% to 4.6% as the degree of subcooling increases by 15 K.

Control Improvement of a C Sugar Cane Crystallization Using an Auto-Tuning PID Controller Based on Linearization of a Neural Network

The industrial process of sugar cane crystallization produces a residual that still contains a lot of soluble sucrose, and the objective of the factory is to improve its extraction; the substantial losses involved justify the search for an optimization of the process. The crystallization process studied on the industrial site is based on the "three massecuites process". The third step of this process constitutes the final stage of exhaustion of the sucrose dissolved in the mother liquor. During this third step of crystallization (C crystallization), the phase that is studied, and whose control is to be improved, is the crystal growth phase. The study of this process on the industrial site is a problem in its own right. A control scheme is proposed to improve on the standard PID control law used in the factory: an auto-tuning PID controller based on the instantaneous linearization of a neural network.
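A minimal sketch of the auto-tuning idea described above: a one-step-ahead model is linearized at the current operating point by finite differences, and the local linear parameters are mapped to controller gains. The plant function `f`, the IMC-style tuning rule and the closed-loop time constant `lam` below are illustrative assumptions standing in for the paper's trained neural network and actual tuning rule.

```python
import math

def f(y, u):
    """Hypothetical one-step plant model y[t+1] = f(y[t], u[t]),
    standing in for the trained neural network."""
    return 0.8 * y + 0.5 * u - 0.05 * y * y

def linearize(model, y0, u0, eps=1e-6):
    """Finite-difference (instantaneous) linearization around (y0, u0):
    y[t+1] ~ a*y[t] + b*u[t] + c."""
    a = (model(y0 + eps, u0) - model(y0 - eps, u0)) / (2 * eps)
    b = (model(y0, u0 + eps) - model(y0, u0 - eps)) / (2 * eps)
    c = model(y0, u0) - a * y0 - b * u0
    return a, b, c

def pid_gains(a, b, lam=2.0, Ts=1.0):
    """Map the local first-order parameters to PI gains with an
    IMC-style rule (lam = desired closed-loop time constant; assumption)."""
    K = b / (1.0 - a)                 # local steady-state gain
    tau = -Ts / math.log(abs(a))      # local time constant
    Kp = tau / (K * lam)
    Ki = Kp / tau
    return Kp, Ki

# Re-tune at the current operating point, as the auto-tuner would each step.
a, b, c = linearize(f, 1.0, 0.0)
Kp, Ki = pid_gains(a, b)
```

At each sampling instant the controller would repeat this linearize-then-tune step, so the PID gains track the local dynamics of the nonlinear process.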

The Risk and Value Engineering Structures and Their Integration with Industrial Project Management (A Case Study on I. K. Corporation)

Value engineering is an effective tool that helps managers make decisions. Value studies offer managers a suitable instrument for reducing life-cycle costs, improving quality, improving structures, shortening the construction schedule, prolonging service life, or a combination of these. The pressures placed on planners on one hand, and their accountability within their fields together with the inherent risks and uncertainties of alternative options on the other, place some decision makers in a dilemma. Given the complexity of implementing projects, risk management and value engineering can be used together in project management as a tool to identify and eliminate every item that causes unnecessary expense and wasted time, without damaging the essential project functions. It should be noted that implementing risk management and value engineering to improve efficiency and functionality may lengthen the project schedule; improving the schedule does not mean shortening it in all cases. This article first deals with the concepts of risk and value engineering. The results obtained from their implementation at Iran Khodro Corporation are then examined, together with the common features and the integration of the two disciplines; finally, the proposed framework is submitted for use in engineering and industrial projects, including those of Iran Khodro Corporation.

Information Filtering using Index Word Selection based on the Topics

We propose an information filtering system using index word selection from a document set based on the topics included in the set. The method narrows the index down to the particularly characteristic words in the document set; the topics are obtained by Sparse Non-negative Matrix Factorization. In information filtering, a document is often represented by a vector whose elements correspond to the weights of the index words, and the dimension of this vector grows as the number of documents increases. It is therefore possible that words useless as index words for filtering are included. To address this problem, the dimension needs to be reduced. Our proposal reduces the dimension by selecting index words based on the topics included in the document set, which are obtained by applying Sparse Non-negative Matrix Factorization to the set. Filtering is carried out based on the centroid of the learning document set, which is regarded as the user's interest. The centroid is represented by a document vector whose elements are the weights of the selected index words. Using the English test collection MEDLINE, we confirm the effectiveness of our proposal: when an appropriate number of index words is selected, the recommendation accuracy improves over previous methods. Examining the selected index words, we also found that our proposal selects index words that cover minor topics included in the document set.
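The centroid-based filtering step can be sketched as follows. The Sparse NMF index-word selection itself is not shown; a hypothetical hand-picked word set and toy documents stand in for it, and term frequency stands in for whatever weighting scheme the system actually uses.

```python
import math

def doc_vector(doc, index_words):
    """Term-frequency vector restricted to the selected index words."""
    tokens = doc.lower().split()
    return [tokens.count(w) for w in index_words]

def centroid(vectors):
    """Elementwise mean of the learning document vectors (the user profile)."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical selection and learning set; in the paper these come from
# Sparse NMF topics over the MEDLINE collection.
index_words = ["gene", "protein", "cell"]
learning = ["gene expression in the cell", "protein binding to the gene"]
profile = centroid([doc_vector(d, index_words) for d in learning])

# A new document is recommended if its similarity to the profile is high.
score = cosine(profile, doc_vector("cell protein study", index_words))
```

Restricting the vectors to the selected index words is what performs the dimension reduction; everything downstream (centroid, similarity) is unchanged.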

Vector Space of the Extended Base-Triplets over the Galois Field of the Five-Letter DNA Base Alphabet

A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D, G, A, U, C}, where the letter D represents one or more hypothetical bases with unspecific pairing. We hypothesize that the high degeneracy of a primeval genetic code with five bases, together with the gradual origin and improvement of a primitive DNA repair system, could have made possible the transition from the ancient to the modern genetic code. Our results suggest that the Watson-Crick base pairing and the non-specific base pairing of the hypothetical ancestral base D, used to define the sum and product operations, are sufficient to determine the coding constraints of the primeval and the modern genetic code, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could be induced from a primeval codon assignment. Besides, the Fourier spectrum of the extended DNA genome sequences derived from the multiple sequence alignment suggests that the so-called period-3 property of present coding DNA sequences could also have existed in ancient coding DNA sequences.
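The algebra over the extended alphabet can be illustrated with modular arithmetic. Note that the particular assignment of bases to the elements of GF(5) below is a hypothetical choice for illustration, not necessarily the ordering the paper uses; any bijection to {0,...,4} yields the same kind of structure.

```python
# Hypothetical assignment of the extended alphabet to GF(5) elements.
CODE = {"D": 0, "A": 1, "C": 2, "G": 3, "U": 4}
BASE = {v: k for k, v in CODE.items()}

def base_sum(x, y):
    """Sum of two bases in GF(5) (addition modulo 5)."""
    return BASE[(CODE[x] + CODE[y]) % 5]

def base_prod(x, y):
    """Product of two bases in GF(5) (multiplication modulo 5)."""
    return BASE[(CODE[x] * CODE[y]) % 5]

def triplet_sum(t1, t2):
    """Componentwise sum of two base triplets, i.e. vector addition
    in the three-dimensional vector space over GF(5)."""
    return "".join(base_sum(a, b) for a, b in zip(t1, t2))
```

With D mapped to the additive identity, D acts as a neutral element under the sum, loosely mirroring its unspecific pairing role in the hypothesized ancestral alphabet.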

Statistical Process Optimization Through Multi-Response Surface Methodology

In recent years, response surface methodology (RSM) has attracted the attention of many quality engineers in different industries. Most of the published literature on robust design methodology is concerned with the optimization of a single response or quality characteristic, which is often the one most critical to consumers. For most products, however, quality is multidimensional, so it is common to observe multiple responses in an experimental situation. This paper familiarizes the interested reader with this methodology through a survey of the most cited technical papers. It is believed that the procedure proposed in this study can resolve a complex parameter design problem with more than two responses. It can be applied to areas where there are large data sets and a number of responses are to be optimized simultaneously. In addition, the proposed procedure is relatively simple and can be implemented easily using ready-made standard statistical packages.
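One widely surveyed way to combine multiple responses into a single objective is the desirability function of Derringer and Suich: each response is mapped to a desirability in [0, 1] and the geometric mean is optimized. The bounds and targets below are hypothetical illustrations, not values from any particular study.

```python
def desirability_target(y, low, target, high, s=1.0, t=1.0):
    """Two-sided desirability: 1.0 at the target, falling to 0.0 at the
    bounds [low, high]; s and t shape the ramps on each side."""
    if y < low or y > high:
        return 0.0
    if y <= target:
        return ((y - low) / (target - low)) ** s
    return ((high - y) / (high - target)) ** t

def overall_desirability(ds):
    """Geometric mean of individual desirabilities: a single poor response
    (d near 0) drags the composite down, which is the intended behavior."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical example: two responses, one on target and one halfway off.
d1 = desirability_target(5.0, low=0.0, target=5.0, high=10.0)
d2 = desirability_target(2.5, low=0.0, target=5.0, high=10.0)
D = overall_desirability([d1, d2])
```

A multi-response optimization then searches the design space for the factor settings that maximize the composite D, which a standard statistical package can do over fitted response surfaces.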

Towards a Systematic Planning of Standardization Projects in Plant Engineering

In today's economy, plant engineering faces many challenges. For instance, intensifying competition in this business leads to cost competition and the need for a shorter time-to-market. To remain competitive, companies need to make their businesses more profitable by implementing improvement programs such as standardization projects, but for various reasons they have difficulty tapping their full economic potential. One of these reasons is the non-holistic planning and implementation of standardization projects. This paper describes a new conceptual framework, the layer model. The model combines and expands existing proven approaches in order to improve the design, implementation and management of standardization projects. Based on a holistic approach, it helps to systematically analyze the effects of standardization projects on different business layers and enables companies to better seize the opportunities offered by standardization.

Designing a Multilingual Auction Website for Selling Agricultural Products

The study aimed to identify the logical structure of the data and the particularities of developing and testing a website designed for selling farm products through online auctions. The research is based on a short literature review in the field and on exploratory trials of some successful models from other industries, in order to identify the advantages of using such a tool as well as the optimal structure and functionality of an auction portal. In the last part, the study focuses on the results of testing the website with the potential beneficiaries. The conclusions of the study underline that the particularities of some agricultural products could raise difficulties in selling them through online auctions, but the use of such a system is perceived to bring significant improvements to the supply chain. The results of the scientific investigations call for a more detailed study of the importance of using quality standards for agricultural products sold via online auction, of the impact that implementing an online payment system could have on trade in agricultural products, and of problems that could arise in using the website in different countries.

A Perceptual Image Coding Method with a High Compression Rate

In the framework of image compression by wavelet transforms, we propose a perceptual method that incorporates Human Visual System (HVS) characteristics in the quantization stage. Indeed, human eyes do not have equal sensitivity across the frequency bandwidth. Therefore, the clarity of the reconstructed images can be improved by weighting the quantization according to the Contrast Sensitivity Function (CSF), and the visual artifacts at low bit rates are minimized. To evaluate our method, we use the Peak Signal-to-Noise Ratio (PSNR) and a new evaluation criterion that takes visual factors into account. The experimental results show that our technique improves image quality at the same compression ratio.
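The CSF-weighted quantization idea can be sketched as scaling the quantization step of each wavelet subband by the eye's sensitivity at that subband's spatial frequency. The coefficient values and the CSF weight below are hypothetical placeholders; in a real coder the weight would come from sampling a CSF model for each subband.

```python
def quantize_subband(coeffs, base_step, csf_weight):
    """Uniform quantization with a step enlarged where the eye is less
    sensitive: a small csf_weight means coarser quantization, spending
    fewer bits on frequencies the HVS barely perceives."""
    step = base_step / csf_weight
    return [round(c / step) for c in coeffs]

def dequantize_subband(indices, base_step, csf_weight):
    """Inverse mapping used at the decoder (midpoint reconstruction)."""
    step = base_step / csf_weight
    return [i * step for i in indices]

# Hypothetical high-frequency subband with a low CSF weight (0.5):
q = quantize_subband([11.0, -3.2, 0.4], base_step=2.0, csf_weight=0.5)
rec = dequantize_subband(q, base_step=2.0, csf_weight=0.5)
```

Low-frequency subbands, where sensitivity is high, would receive weights near 1 and thus keep a fine step, which is how the perceptual weighting shifts distortion toward frequencies where it is least visible.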

Factors Influencing Knowledge Management Process Model: A Case Study of Manufacturing Industry in Thailand

The objectives of this research were to explore the factors influencing the knowledge management process in the manufacturing industry and to develop a model to support knowledge management processes. The studied factors were technology infrastructure, human resources, knowledge sharing, and the culture of the organization. The knowledge management processes included discovery, capture, sharing, and application. Data were collected through questionnaires and analyzed using multiple linear regression and multiple correlation. The results showed that technology infrastructure, human resources, knowledge sharing, and the culture of the organization influenced the discovery and capture processes; however, knowledge sharing had no influence on the sharing and application processes. A model to support knowledge management processes was developed, which indicated that knowledge sharing needs further improvement in the organization.

An Empirical Analysis of the Influence of Application Experience on Working Methods of Process Modelers

In view of growing competition in the service sector, services are as much in need of modeling, analysis and improvement as business or working processes. Graphical process models are an important means of capturing process-related know-how for effective management of the service process. In this contribution, a human performance analysis of process model development was conducted, paying special attention to model development time and working method. It was found that modelers with higher application experience need significantly less time for mental activities than modelers with lower application experience, spend more time labeling graphical elements, and achieve higher process model quality in terms of activity label quality.

A Novel Neighborhood Defined Feature Selection on Phase Congruency Images for Recognition of Faces with Extreme Variations

A novel feature selection strategy to improve recognition accuracy on faces affected by non-uniform illumination, partial occlusions and varying expressions is proposed in this paper. The technique is applicable especially in scenarios where the possibility of obtaining a reliable intra-class probability distribution is minimal due to a small number of training samples. Phase congruency features in an image are defined as the points where the Fourier components of that image are maximally in phase. These features are invariant to the brightness and contrast of the image under consideration, a property that makes lighting-invariant face recognition achievable. Phase congruency maps of the training samples are generated, and a novel modular feature selection strategy is implemented: smaller sub-regions from a predefined neighborhood within the phase congruency images of the training samples are merged to obtain a large set of features, which are arranged in order of increasing distance between the sub-regions involved in the merging. The assumption behind the proposed region merging and arrangement strategy is that local dependencies among the pixels are more important than global dependencies. The obtained feature sets are then arranged in decreasing order of discriminating capability using a criterion function, namely the ratio of the between-class variance to the within-class variance of the sample set in the PCA domain. The results indicate a high improvement in classification performance compared to baseline algorithms.
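The ranking criterion named above, the ratio of between-class to within-class variance, can be sketched on a scalar feature as follows. The sample values are hypothetical; in the paper the criterion is evaluated on projections in the PCA domain.

```python
def fisher_ratio(classes):
    """Between-class over within-class variance (Fisher-style criterion).
    classes: list of lists, one list of scalar feature values per class.
    A larger ratio means the feature separates the classes better."""
    all_vals = [v for c in classes for v in c]
    grand = sum(all_vals) / len(all_vals)
    means = [sum(c) / len(c) for c in classes]
    # scatter of class means around the grand mean, weighted by class size
    between = sum(len(c) * (m - grand) ** 2 for c, m in zip(classes, means))
    # scatter of samples around their own class mean
    within = sum((v - m) ** 2 for c, m in zip(classes, means) for v in c)
    return between / within

# Hypothetical: two well-separated classes give a large ratio.
score = fisher_ratio([[1.0, 2.0], [5.0, 6.0]])
```

Each candidate feature set would be scored this way and the sets sorted in decreasing order of the ratio, which is the arrangement step the abstract describes.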

Quantification of Periodicities in Fugitive Emission of Gases from Lyari Waterway

Periodicities in environmetric time series can be ideally assessed by utilizing periodic models. In this communication, fugitive emissions of gases from the open sewer channel Lyari, which follow periodic behaviour, are approximated by employing a periodic autoregressive model of order p. The orders of the periodic model for each season are selected through examination of the periodic partial autocorrelation or information criteria. The parameters for the selected order of each season are estimated individually for each emitted air toxin. Subsequently, the adequacy of the fitted models is established by examining the properties of the residuals for each season. These models are beneficial for planners and administrative bodies in improving implemented policies to surmount future environmental problems.
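The per-season estimation step can be sketched for the simplest case, a periodic AR(1), where a separate coefficient is fitted by least squares for each season. This is an illustrative simplification: the paper selects a possibly different order p per season, and the toy series below is hypothetical.

```python
def fit_par1(series, n_seasons):
    """Periodic AR(1): estimate phi_s so that x[t] ~ phi_s * x[t-1],
    where s = t mod n_seasons, by ordinary least squares per season."""
    phis = []
    for s in range(n_seasons):
        num = den = 0.0
        for t in range(1, len(series)):
            if t % n_seasons == s:
                num += series[t] * series[t - 1]
                den += series[t - 1] ** 2
        phis.append(num / den if den else 0.0)
    return phis

# Hypothetical two-season series generated with phi = [0.5, 2.0]:
# each "odd" step doubles the value, each "even" step halves it.
phis = fit_par1([1.0, 2.0, 1.0, 2.0, 1.0, 2.0, 1.0], n_seasons=2)
```

After fitting, the residuals x[t] - phi_s * x[t-1] would be examined per season to establish model adequacy, as the abstract describes.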

A Weighted Profiling Using an Ontology Base for Semantic-Based Search

The amount of information on the Web is increasing tremendously. A number of search engines have been developed for searching Web information and retrieving relevant documents that satisfy inquirers' needs. Search engines present inquirers with irrelevant documents among the search results, since the search is text-based rather than semantic-based. The information retrieval research area has presented a number of approaches and methodologies, such as profiling, feedback, query modification and human-computer interaction, for improving search results. Moreover, information retrieval has employed artificial intelligence techniques and strategies, such as machine learning heuristics, tuning mechanisms, user and system vocabularies, and logical theory, for capturing users' preferences and using them to guide the search based on semantic rather than syntactic analysis. Although valuable improvements in search results have been recorded, surveys show that search engine users are still not really satisfied with their search results. Using ontologies for semantic-based searching is likely the key solution. Adopting a profiling approach and using the characteristics of an ontology base, this work proposes a strategy for finding the exact meaning of the query terms in order to retrieve relevant information according to user needs. The evaluation of the conducted experiments shows the effectiveness of the suggested methodology, and conclusions are presented.

Neutronic Study of Two Reactor Cores Cooled with Light and Heavy Water Using Computational Methods

Most HWRs currently use natural uranium fuel. Using enriched uranium fuel results in a significant improvement in fuel cycle costs and uranium utilization. On the other hand, the reactivity changes of HWRs over the full range of operating conditions, from cold shutdown to full power, are small. This reduces the required reactivity worth of control devices and minimizes local flux distribution perturbations, limiting potential problems due to transient local overheating of fuel. Analyzing the effect of heavy water on neutronic parameters such as enrichment requirements, peaking factor and reactivity is important and should be considered among the primary concepts of HWR core design. Two reactor cores, a CANDU-type core of 33 fuel assemblies and a hexagonal-type core of 19 assemblies at a pitch-to-diameter ratio of 1.04, have been simulated using the MCNP-4C code. Heavy water and light water as moderator have been compared with the aim of achieving lower reactivity insertion and enrichment requirements. Two fuel matrices, (232Th/235U)O2 and (238U/235U)O2, have been compared to achieve a more economical and safe design. Heavy water not only decreased enrichment needs but also resulted in negative reactivity insertions during moderator density variations. Thorium oxide fuel assemblies of 2.3% enrichment loaded into the heavy-water-moderated core resulted in a fission-to-absorption ratio of 0.751 and a peaking factor of 1.7. Heavy water not only provides negative reactivity insertion during temperature rises, which change the moderator density, but also yields a 2 to 10 kg reduction in enrichment requirements, depending on the geometry type.

MinRoot and CMesh: Interconnection Architectures for Network-on-Chip Systems

The success of an electronic system in a System-on-Chip is highly dependent on the efficiency of its interconnection network, which is constructed from routers and channels (the routers move data across the channels between nodes). Since neither classical bus-based nor point-to-point architectures can provide scalable solutions and satisfy the tight power and performance requirements of future applications, the Network-on-Chip (NoC) approach has recently been proposed as a promising solution. Indeed, in contrast to the traditional solutions, the NoC approach can provide large bandwidth with moderate area overhead. The selected topology of the component interconnects plays a prime role in the performance of a NoC architecture, as do the routing and switching techniques that can be used. In this paper, we present two generic NoC architectures that can be customized to the specific communication needs of an application in order to reduce the area with minimal degradation of the latency of the system. An experimental study is performed to compare these structures with basic NoC topologies represented by the 2D mesh, the Butterfly-Fat Tree (BFT) and SPIN. It is shown that the Cluster mesh (CMesh) and MinRoot schemes achieve significant improvements in network latency and energy consumption, with only negligible area overhead and complexity, over existing architectures. In fact, compared with the basic NoC topologies, CMesh and MinRoot provide substantial savings in area as well, because they require fewer routers. The simulation results show that the CMesh and MinRoot networks outperform MESH, BFT and SPIN in the main performance metrics.

Efficient Hardware Architecture of the Direct 2-D Transform for the HEVC Standard

This paper presents the hardware design of a unified architecture to compute the efficient 4x4, 8x8 and 16x16 two-dimensional (2-D) transforms for the HEVC standard. The architecture is based on fast integer transform algorithms and is designed only with adders and shifts in order to reduce the hardware cost significantly. The goal is to ensure maximum circuit reuse during the computation while saving 40% of the number of operations. The architecture uses FIFOs to compute the second dimension. The proposed hardware was implemented in VHDL; the VHDL RTL code works at 240 MHz on an Altera Stratix III FPGA. The number of cycles in this architecture varies from 33 for the 4-point 2-D DCT to 172 for the 16-point 2-D DCT. Results show frequency improvements reaching 96% when compared to an architecture described as a direct transcription of the algorithm.
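The adder-and-shift principle can be illustrated on the 4-point case. The HEVC 4x4 core transform uses the coefficients 64, 83 and 36; since 83 = 64+16+2+1 and 36 = 32+4, each product reduces to shifts and adds. This is only a behavioral sketch of one 1-D pass (no scaling, rounding or clipping stages), not the paper's actual datapath.

```python
def m64(x):
    """64*x as a single shift."""
    return x << 6

def m83(x):
    """83*x decomposed as 64 + 16 + 2 + 1, i.e. shifts and adds only."""
    return (x << 6) + (x << 4) + (x << 1) + x

def m36(x):
    """36*x decomposed as 32 + 4."""
    return (x << 5) + (x << 2)

def dct4_forward(x):
    """Partial-butterfly 4-point HEVC core transform: an even/odd
    decomposition first, then multiplier-free coefficient products."""
    e0, e1 = x[0] + x[3], x[1] + x[2]   # even part
    o0, o1 = x[0] - x[3], x[1] - x[2]   # odd part
    return [m64(e0 + e1),
            m83(o0) + m36(o1),
            m64(e0 - e1),
            m36(o0) - m83(o1)]
```

In hardware, each `m*` function corresponds to a small adder tree fed by hard-wired shifts, which is what lets the design drop multipliers entirely.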

A Survey Method and New Lecture Chair Design Complying with Ergonomics Guidelines at Classroom Building 2, Suranaree University of Technology, Thailand

The paper describes trends in ergonomics problems among students at classroom B5101, Building 2, Suranaree University of Technology. The objective was to survey the ergonomics problems and the effects of the chairs used for sitting in the classroom. One hundred students who use the lecture chair for more than 2 hours per day were surveyed using RULA [1] and a body discomfort survey [2]. The body discomfort survey indicated fatigue problems at the neck, lower back, upper back and right shoulder (2.93, 2.91, 2.33 and 1.75, respectively), while RULA indicated fatigue problems at the neck, body and right upper arm (4.00, 3.75 and 3.00, respectively); the two assessments are consistent. The researchers then prepared an improvement plan for designing a new chair to reduce student fatigue, collecting anthropometric data from the sample and building three prototype ergonomic chairs. The same 100 students trialed the new chair and were evaluated again by RULA, the body discomfort survey and a satisfaction questionnaire. After the improvement, RULA showed an average fatigue reduction for the head and neck from 4.00 to 2.25, for the body and trunk from 3.75 to 2.00, and for arm force from 1.00 to 0.25. The body discomfort survey showed an average fatigue reduction for the lower back from 2.91 to 0.87, the neck from 2.93 to 1.24, the upper back from 2.33 to 0.84 and the right upper arm from 1.75 to 0.74. Statistically, both RULA and the body discomfort survey showed a significant fatigue reduction after the improvement at the 95% confidence level (p < 0.05). A chi-square test of the relationship between fatigue and body part showed that the RULA and body discomfort results before and after the improvement were consistent at the 95% confidence level (p < 0.05). Moreover, after a 30-minute trial with the new chair [3], 72% of the students were very satisfied with the folding of the writing tablet, 66% with the width of the writing plate, 64% with the suitability of the writing plate, 62% with the soft seat cushion and 61% with the ease of sitting on the chair.