Generalization of the SGIP Surface Tension Force Model to Three-Dimensional Flows and Comparison with Other Models for Interfacial Flows

In this paper, the two-dimensional staggered grid interface pressure (SGIP) model is generalized to three-dimensional form. For this purpose, various surface tension force models for interfacial flows are investigated and compared with one another. The VOF method is used to track the interface. To demonstrate the ability of the SGIP model in three-dimensional flows relative to other models, pressure contours, maximum spurious velocities, norms of the spurious flow velocities and pressure jump errors for a motionless liquid drop and a gas bubble are calculated using the different models. It is shown that the SGIP model produces the smallest maximum and norm spurious velocities in comparison with the CSF, CSS and PCIL models. Additionally, the new model yields more accurate results for the pressure jump across the interface, generated by the surface tension force, for both the motionless liquid drop and the gas bubble.
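The CSF model that SGIP is compared against computes the surface tension force as F = σκ∇c, with the curvature κ obtained from the VOF volume-fraction field. A minimal NumPy sketch of this standard CSF evaluation (not the authors' SGIP implementation; the grid size, σ and the smoothed drop profile are illustrative assumptions):

```python
import numpy as np

def csf_surface_tension(c, sigma, dx):
    """Continuum Surface Force (CSF) sketch: F = sigma * kappa * grad(c),
    with curvature kappa = -div(grad(c)/|grad(c)|).
    c is the VOF volume-fraction field; dx is the uniform grid spacing."""
    cy, cx = np.gradient(c, dx)               # components of grad(c)
    mag = np.sqrt(cx**2 + cy**2) + 1e-12      # avoid division by zero
    nx, ny = cx / mag, cy / mag               # unit interface normal
    kappa = -(np.gradient(nx, dx, axis=1) + np.gradient(ny, dx, axis=0))
    return sigma * kappa * cx, sigma * kappa * cy  # (Fx, Fy)

# Smoothed circular drop of radius 0.25 on a unit square (2-D for brevity)
n = 64
dx = 1.0 / n
y, x = np.mgrid[0:n, 0:n] * dx
r = np.sqrt((x - 0.5)**2 + (y - 0.5)**2)
c = 0.5 * (1.0 - np.tanh((r - 0.25) / (2 * dx)))   # ~1 inside, ~0 outside
fx, fy = csf_surface_tension(c, sigma=0.07, dx=dx)
```

Because ∇c vanishes away from the interface, the resulting force field is concentrated in the transition region, which is exactly where spurious velocities originate in such models.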

Towards Better Understanding of the Concept of Tacit Knowledge – A Cognitive Approach

Tacit knowledge has been one of the most discussed and contradictory concepts in the field of knowledge management since the mid-1990s. The concept is used relatively vaguely to refer to any type of information that is difficult to articulate, which has led to discussions about the original meaning of the concept (adopted from Polanyi's philosophy) and the nature of tacit knowing. It is proposed that the subject should be approached from the perspective of cognitive science in order to connect tacit knowledge to empirically studied cognitive phenomena. Some of the most important examples of tacit knowing presented by Polanyi are analyzed in order to trace the cognitive mechanisms of tacit knowing and to promote a better understanding of the nature of tacit knowledge. The cognitive approach to Polanyi's theory reveals that the tacit/explicit typology of knowledge often presented in the knowledge management literature is not only artificial but also the very opposite of Polanyi's thinking.

Combating Money Laundering in the Banking Industry: Malaysian Experience

Money laundering has been described by many as the lifeblood of crime and is a major threat to the economic and social well-being of societies. It has been recognized that the banking system has long been the central element of money laundering. This is in part due to the complexity and confidentiality of the banking system itself. It is generally accepted that effective anti-money laundering (AML) measures adopted by banks will make it tougher for criminals to get their "dirty money" into the financial system. In fact, for law enforcement agencies, banks are considered to be an important source of valuable information for the detection of money laundering. However, from the banks' perspective, the main reason for their existence is to make as much profit as possible. Hence their cultural and commercial interests are totally distinct from those of the law enforcement authorities. Undoubtedly, AML laws create a major dilemma for banks as they produce a significant shift in the way banks interact with their customers. Furthermore, the implementation of the laws not only creates significant compliance problems for banks, but also has the potential to adversely affect their operations. As such, it is legitimate to ask whether these laws are effective in preventing money launderers from using banks, or whether they simply put an unreasonable burden on banks and their customers. This paper attempts to address these issues and analyze them against the background of the Malaysian AML laws. It must be said that effective coordination between the AML regulator and the banking industry is vital to minimize the problems faced by the banks and thereby to ensure effective implementation of the laws in combating money laundering.

Principal Component Analysis using Singular Value Decomposition of Microarray Data

A series of microarray experiments produces observations of differential expression for thousands of genes across multiple conditions. Principal component analysis (PCA) has been widely used in multivariate data analysis to reduce the dimensionality of the data in order to simplify subsequent analysis and allow for summarization of the data in a parsimonious manner. PCA, which can be implemented via a singular value decomposition (SVD), is useful for the analysis of microarray data. For the application of PCA using SVD, we use the DNA microarray data for the small round blue cell tumors (SRBCT) of childhood by Khan et al. (2001). To decide the number of components that account for a sufficient amount of information, we draw a scree plot. A biplot, a graphical display associated with PCA, reveals important features that exhibit the relationships between variables and between variables and observations.
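The PCA-via-SVD computation described above can be sketched in a few lines of NumPy: centre the data matrix, decompose it, and project onto the leading right singular vectors. The data here are random stand-ins, not the SRBCT measurements:

```python
import numpy as np

def pca_svd(X, k):
    """PCA via SVD: centre the columns of X, decompose X_c = U S Vt,
    and project onto the first k right singular vectors (principal axes).
    Returns the component scores and the per-component variance ratios,
    which a scree plot would display."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                 # principal component scores
    explained = S**2 / np.sum(S**2)        # variance ratio per component
    return scores, explained[:k]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6)) @ rng.normal(size=(6, 6))  # correlated data
scores, ratio = pca_svd(X, k=2)
```

Plotting `ratio` against the component index gives the scree plot used to choose the number of components to retain.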

Raman Scattering and PL Studies on AlGaN/GaN HEMT Layers on 200 mm Si(111)

The crystalline quality of the AlGaN/GaN high electron mobility transistor (HEMT) structure grown on a 200 mm silicon substrate has been investigated using UV-visible micro-Raman scattering and photoluminescence (PL). The visible Raman scattering probes the whole nitride stack with the Si substrate and shows the presence of a small component of residual in-plane stress in the thick GaN buffer resulting from wafer bowing, while the UV micro-Raman indicates a tensile interfacial stress induced at the top GaN/AlGaN/AlN layers. PL shows a GaN channel of good crystal quality, whose yellow band intensity is very low compared to that of the near-band-edge transition. The uniformity of this sample is shown by measurements from several points across the epiwafer.

An Analysis of the Printing Quality of Coconut Oil Based Offset Printing Ink

The objectives of this research are to produce prototype coconut oil based offset printing inks and to analyze the basic quality of printing work produced with them. Coconut oil was used to produce varnish, and that varnish was then used to produce black offset printing ink. The printing qualities, i.e. CIELAB values, density values, and dot gain values, of work printed with the coconut oil based inks on gloss-coated woodfree paper of 130 g/m2 were then analyzed. The results indicated that the suitable varnish formulation uses 51% coconut oil, 36% phenolic resin, and 14% solvent oil, while the suitable black offset ink formula uses the varnish mixed with 20% coconut oil. For the printed work, the results were as follows: the CIELAB values of the black offset printing ink were L* = 31.90, a* = 0.27, and b* = 1.86; the density value was 1.27; and the dot gain value was highest in the mid-tone area of the image.

How Prior Knowledge Affects Users' Understanding of System Requirements

Requirements are critical to system validation as they guide all subsequent stages of systems development. Inadequately specified requirements generate systems that require major revisions or cause system failure entirely. Use cases have become the main vehicle for requirements capture in many current object-oriented (OO) development methodologies, and a means for developers to communicate with different stakeholders. In this paper we present the results of a laboratory experiment that explored whether different types of use case format are equally effective in facilitating high-knowledge users' understanding. Results showed that the provision of diagrams along with the textual use case descriptions significantly improved user comprehension of system requirements in both familiar and unfamiliar application domains. However, when comparing groups that received textual descriptions accompanied by diagrams of different levels of detail (simple and detailed), we found no significant difference in performance.

A Study of Grounding Grid Characteristics with Conductive Concrete

The purpose of this paper is to improve the electromagnetic characteristics of a grounding grid by applying conductive concrete. The conductive concrete in this study is applied under an extra-high-voltage (EHV, 345 kV) system located in a high-tech industrial park or science park. Applied in place of the soil surrounding the grounding grid, conductive concrete can reduce the equipment damage and bodily harm caused by switching surges. Two cases on the EHV distribution system in a high-tech industrial park are presented to analyze four soil material configurations. By comparing these configurations, the study results show that conductive concrete can effectively reduce the damage caused by electromagnetic transients. Adopting a grounding grid located 1.0 m underground, with conductive concrete extending from the ground surface to 1.25 m underground, markedly improves the electromagnetic characteristics and thereby advances protective efficiency.

An Integrated Biotechnology Database of the National Agricultural Information Center in Korea

The National Agricultural Biotechnology Information Center (NABIC) plays a leading role in the biotechnology information database for agricultural plants in Korea. Since 2002, we have concentrated on the functional genomics of major crops, building an integrated biotechnology database for agro-biotech information that focuses on the bioinformatics of major agricultural resources such as rice, Chinese cabbage, and microorganisms. The NABIC integrated biotechnology database provides useful information through a user-friendly web interface that allows analysis of genome infrastructure, multiple plants, microbial resources, and living modified organisms.

A Method of Protecting Relational Database Copyright with a Cloud Watermark

With the development of the Internet and database application techniques, it has become common for many databases on the Internet to permit remote query and access by authorized users, which raises the problem of how to protect the copyright of relational databases. This paper first briefly introduces the cloud model, including cloud generators and similar clouds. Then, combining the properties of the cloud model with the idea of digital watermarking and the properties of relational databases, a method of protecting relational database copyright with a cloud watermark is proposed. The corresponding watermark algorithms, namely the cloud watermark embedding algorithm and the detection algorithm, are also presented. Experiments are then run and the results analyzed to validate the correctness and feasibility of the watermark scheme. Finally, the prospects of relational database watermarking and its research directions are discussed.

A Programmer’s Survey of the Quantum Computing Paradigm

Research in quantum computation is looking for the consequences of having information encoding, processing and communication exploit the laws of quantum physics, i.e. the laws which govern the ultimate knowledge that we have, today, of the foreign world of elementary particles, as described by quantum mechanics. This paper starts with a short survey of the principles which underlie quantum computing, and of some of the major breakthroughs brought by the first ten to fifteen years of research in this domain; quantum algorithms and quantum teleportation are very briefly presented. The next sections are devoted to one among the many directions of current research in the quantum computation paradigm, namely quantum programming languages and their semantics. A few other hot topics and open problems in quantum information processing and communication are briefly mentioned in the concluding remarks, the most difficult of them being the physical implementation of a quantum computer. The interested reader will find a list of useful references at the end of the paper.
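As a concrete illustration of the linear-algebra view that such surveys build on, here is a minimal NumPy sketch of a single qubit: gates are unitary matrices, states are vectors, and the Born rule turns amplitudes into measurement probabilities. This is generic textbook material, not an algorithm from the survey itself:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])                           # basis state |0>
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Hadamard gate

psi = H @ ket0               # equal superposition (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2     # Born rule: outcome probabilities [1/2, 1/2]
back = H @ psi               # H is its own inverse: amplitudes interfere
                             # and the state returns exactly to |0>
```

The last line shows interference, the resource that quantum algorithms such as Deutsch's exploit: applying H twice cancels the |1> amplitude, something no classical probabilistic coin flip can do.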

Evaluation of the Degree and Effect of Order in the Family on Violence against Children: A Survey among Guidance School Students in Gilanegharb City, Iran

A review of the literature found that domestic violence and child maltreatment co-occur in many families. This study attempts to emphasize the factors relating to intra-family relationships (from the point of view of order) in violence against children. For this purpose, a survey was conducted on a sample of 200 students from governmental guidance schools in the city of Gilanegharb, Iran. Violence against children (VAC) was measured using the CTS scale. The results showed that children had experienced violence more than once during the previous year, and that the degree of order in the family is high. The explanatory results indicated that the order variables in the family, including collective thinking, empathy, and communal co-circumstance, have significant effects on VAC.

Robust Minutiae Watermarking in Wavelet Domain for Fingerprint Security

In this manuscript, a wavelet-based blind watermarking scheme is proposed as a means to protect the authenticity of a fingerprint. The information used for identification or verification of a fingerprint mainly lies in its minutiae. By robust watermarking of the minutiae in the fingerprint image itself, the useful information can be extracted accurately even if the fingerprint is severely degraded. The minutiae are converted into a binary watermark, and embedding this watermark in the detail regions increases the robustness of the watermarking at little to no additional impact on image quality. It has been experimentally shown that when the minutiae are embedded into the wavelet detail coefficients of a fingerprint image in spread spectrum fashion using a pseudorandom sequence, the robustness responds proportionally, while the perceptual invisibility responds inversely, to the amplification factor "K". The DWT-based technique has been found to be very robust against noise, geometrical distortion, filtering and JPEG compression attacks, and is also found to give remarkably better performance than the DCT-based technique in terms of correlation coefficient and number of erroneous minutiae.
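The spread-spectrum embedding and correlation-based detection described above can be sketched in NumPy. This is a generic illustration of the technique on a stand-in coefficient array, not the paper's exact algorithm; the function names and the value of K are illustrative:

```python
import numpy as np

def embed(detail, bits, K, seed=0):
    """Spread-spectrum embedding sketch: each watermark bit modulates its
    own pseudorandom +/-1 sequence, which is added to the wavelet detail
    coefficients scaled by the amplification factor K."""
    rng = np.random.default_rng(seed)
    pn = rng.choice([-1.0, 1.0], size=(len(bits), detail.size))
    marked = detail.ravel().astype(float).copy()
    for b, p in zip(bits, pn):
        marked += K * (2 * b - 1) * p       # map bit {0,1} -> {-1,+1}
    return marked.reshape(detail.shape), pn

def detect(marked, pn):
    """Blind detection: correlate the marked coefficients with each PN
    sequence; the sign of the correlation recovers the bit."""
    flat = marked.ravel()
    return [int(flat @ p > 0) for p in pn]

rng = np.random.default_rng(1)
coeffs = rng.normal(scale=5.0, size=(32, 32))   # stand-in detail subband
bits = [1, 0, 1, 1]
marked, pn = embed(coeffs, bits, K=2.0)
recovered = detect(marked, pn)
```

Raising K strengthens the correlation peak (robustness) while increasing the perturbation of the coefficients (visibility), which is exactly the proportional/inverse trade-off the abstract reports.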

Investigations of Water-Ethanol Mixtures by the Monte Carlo Method

Energetic and structural results for ethanol-water mixtures as a function of the mole fraction were calculated using Monte Carlo methodology. Energy partitioning results obtained for the equimolar water-ethanol mixture and other organic liquids are compared. It has been shown that at xet = 0.22 the RDFs for water-ethanol and ethanol-ethanol interactions indicate strong hydrophobic interactions between ethanol molecules, and the local structure of the solution is less structured at this concentration than at other ones. Results obtained for the ethanol-water mixture as a function of concentration are in good agreement with the experimental data.
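The Monte Carlo methodology behind such liquid simulations rests on the Metropolis acceptance rule: a trial move is accepted with probability min(1, exp(-βΔE)). A minimal sketch of one such step, applied here to a toy one-dimensional harmonic potential rather than a molecular water-ethanol energy function (the `energy` and `propose` callables are illustrative placeholders):

```python
import numpy as np

def metropolis_step(energy, state, propose, beta, rng):
    """One Metropolis Monte Carlo step: propose a trial configuration
    and accept it with probability min(1, exp(-beta * dE))."""
    new = propose(state, rng)
    dE = energy(new) - energy(state)
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        return new, True
    return state, False

# Toy example: sample a single coordinate in a harmonic well
rng = np.random.default_rng(0)
energy = lambda x: 0.5 * x**2
propose = lambda x, r: x + r.normal(scale=0.5)

x, accepted = 0.0, 0
for _ in range(5000):
    x, ok = metropolis_step(energy, x, propose, beta=1.0, rng=rng)
    accepted += ok
rate = accepted / 5000
```

In a molecular simulation, `state` would be the full set of particle coordinates and `energy` the pairwise interaction sum; the RDFs reported in the abstract are histograms accumulated over the chain of accepted configurations.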

Comparing Academically Gifted and Non-Gifted Students' Supportive Environments in Jordan

Jordan has exerted many efforts to nurture its academically gifted students in special schools since 2001. During the past nine years of operating these schools, their learning and excellence environments were believed to be distinguished compared to those of public schools. This study investigated the environments of gifted students compared with those of non-gifted students, using a survey instrument that measures the dimensions of family, peers, teachers, school support, society, and resources, dimensions rooted deeply in supporting gifted education, learning, and achievement. A total of 109 students were selected from excellence schools for academically gifted students, and 119 non-gifted students were selected from public schools. Around 8.3% of the non-gifted students reported that they "Never" received any support from their surrounding environments, 14.9% reported "Seldom" support, 23.7% reported "Often" support, 26.0% reported "Frequent" support, and 32.8% reported "Very frequent" support. The gifted students reported "Never" support more often than the non-gifted did, at 11.3%, "Seldom" support at 15.4%, "Often" support at 26.6%, and "Frequent" support at 29.0%, while they reported "Very frequent" support less often than the non-gifted students, at 23.6%. Unexpectedly, statistical differences were found between the two groups favoring the non-gifted students in the perception of their surrounding environments in specific dimensions, namely school support, teachers, and society. No statistical differences were found in the other dimensions of the survey, namely family, peers, and resources. As the differences were found in teachers, school support, and society, the nurturing environments of the excellence schools need to be revised to adopt more creative teaching styles, richer school atmosphere and infrastructure, interactive guidance for the students and their parents, promotion of the excellence environments, and rebuilt, successful identification models.
Thus, families, schools, and society should increase their cooperation, communication, and awareness of the gifted supportive environments. However, more studies to investigate other aspects of promoting academic giftedness and excellence are recommended.

Therapeutic Product Preparation Bioprocess Modeling

An immunomodulator bioproduct is prepared in a batch bioprocess with a modified bacterium, Pseudomonas aeruginosa. The bioprocess is performed in a 100 L Bioengineering bioreactor with 42 L of cultivation medium made of peptone, meat extract and sodium chloride. The optimal bioprocess parameters were determined: temperature 37 °C, agitation speed 300 rpm, aeration rate 40 L/min, pressure 0.5 bar, Dow Corning Antifoam M at most 4% of the medium volume, and duration 6 hours. Such bioprocesses are considered difficult to control because their dynamic behavior is highly nonlinear and time varying. The aim of the paper is to present and compare different models based on experimental data. The analysis criteria were the modeling error and the convergence rate. The estimated values and the modeling analysis were obtained using Table Curve 2D. The preliminary conclusions favor the Andrews model, with a maximum specific growth rate of the bacterium in the range of 0.8 h⁻¹.
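The Andrews model named above describes substrate-inhibited growth, μ(S) = μmax·S/(Ks + S + S²/Ki), which peaks at S = sqrt(Ks·Ki) and declines at high substrate concentrations. A small sketch evaluating it; the values of Ks and Ki below are illustrative, only μmax ≈ 0.8 h⁻¹ comes from the abstract:

```python
import numpy as np

def andrews_mu(S, mu_max, Ks, Ki):
    """Andrews (substrate-inhibition) specific growth rate:
    mu(S) = mu_max * S / (Ks + S + S**2 / Ki).
    Unlike the Monod model, mu falls off again at large S."""
    return mu_max * S / (Ks + S + S**2 / Ki)

S = np.linspace(0.01, 50.0, 500)   # substrate concentration grid (g/L)
mu = andrews_mu(S, mu_max=0.8, Ks=0.5, Ki=20.0)  # Ks, Ki assumed
S_opt = S[np.argmax(mu)]           # numerically near sqrt(Ks * Ki)
```

Note that the observed maximum of μ(S) stays below μmax, because the Ks and S²/Ki terms in the denominator never vanish simultaneously.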

Housing Defects of Newly Completed Houses: An Analysis Using the Condition Survey Protocol (CSP) 1 Matrix

Housing is a basic human right. Newly provided houses should be free from any defects, even those that people normally consider 'cosmetic'. This paper studies the building defects of 72 newly completed double-storey terraced houses located in Bangi, Selangor. The building survey was implemented using Protocol 1 (visual inspection). As the houses are new, the survey work is very stringent in determining the defect condition and priority. The survey and reporting procedure was carried out based on the CSP1 Matrix, which involves a scoring system, photographs and plan tagging. The analysis was done using the Statistical Package for the Social Sciences (SPSS). The findings reveal that 2119 defects were recorded in the 72 terraced houses. The cumulative score obtained was 27644, while the overall rating is 13.05. These results indicate that the construction quality of the newly completed terraced houses is low and not up to the acceptable standard that new houses should meet.

Medical Image Segmentation Using Deformable Model and Local Fitting Binary: Thoracic Aorta

This paper presents an application of level sets to the segmentation of abdominal and thoracic aortic aneurysms in CTA datasets. An important challenge in reliably detecting aortic aneurysms is the need to overcome problems associated with intensity inhomogeneities. Level sets belong to an important class of methods that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the level set formulation aids the suppression of noise in the extracted regions of interest and then guides the motion of the evolving contour for the detection of weak boundaries. The speed of curve evolution has been significantly improved, with a resulting decrease in segmentation time compared with previous implementations of level sets, and the method is shown to be more effective than other approaches in coping with intensity inhomogeneities. We have applied the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion for our algorithm.
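The CFL stability criterion mentioned above bounds the explicit time step so that the evolving front cannot cross more than a fraction of a grid cell per iteration. A minimal sketch of how such a bound is computed from the speed field (the speed values here are arbitrary examples, not the paper's data):

```python
import numpy as np

def cfl_timestep(speed, dx, cfl=0.5):
    """CFL-stable time step for explicit level-set evolution:
    dt <= cfl * dx / max|F|, with cfl in (0, 1], so the interface
    advances less than one grid cell per iteration."""
    fmax = np.max(np.abs(speed))
    return cfl * dx / fmax if fmax > 0 else np.inf

speed = np.array([[0.2, -1.5], [0.7, 0.4]])   # sample speed field F
dt = cfl_timestep(speed, dx=1.0, cfl=0.5)     # 0.5 * 1.0 / 1.5
```

Recomputing dt each iteration from the current speed field keeps the update stable even as the contour accelerates near strong image gradients.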

An Adaptive Memetic Algorithm With Dynamic Population Management for Designing HIV Multidrug Therapies

In this paper, a mathematical model of human immunodeficiency virus (HIV) is utilized and an optimization problem is proposed, with the final goal of implementing an optimal 900-day structured treatment interruption (STI) protocol. Two types of drugs commonly used in highly active antiretroviral therapy (HAART), reverse transcriptase inhibitors (RTI) and protease inhibitors (PI), are considered. To solve the proposed optimization problem, an adaptive memetic algorithm with population management (AMAPM) is proposed. The AMAPM uses a distance measure to control the diversity of the population in genotype space, thus preventing stagnation and premature convergence. Moreover, the AMAPM uses a diversity parameter in phenotype space to dynamically set the population size and the number of crossovers during the search process. Three crossover operators diversify the population simultaneously, and the progress of each crossover operator is utilized to set the number of crossovers of that type per generation. To escape local optima and introduce new search directions toward the global optimum, two local searchers assist the evolutionary process. In contrast to traditional memetic algorithms, the activation of these local searchers is not random but depends on the diversity parameters in both genotype and phenotype space. The capability of AMAPM in finding optimal solutions is demonstrated in comparison with three popular metaheuristics.
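The genotype-space distance measure that drives the AMAPM's diversity control can be illustrated with a mean pairwise distance over the population. This is a generic sketch of the idea, not the paper's exact measure; the populations below are synthetic:

```python
import numpy as np

def genotype_diversity(pop):
    """Mean pairwise Euclidean distance between individuals in genotype
    space. A value collapsing toward zero signals stagnation, the
    condition that triggers diversification in AMAPM-style algorithms."""
    n = len(pop)
    dists = [np.linalg.norm(pop[i] - pop[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))

rng = np.random.default_rng(0)
spread = rng.normal(scale=1.0, size=(20, 5))      # diverse population
clustered = rng.normal(scale=0.01, size=(20, 5))  # nearly converged one
d_spread = genotype_diversity(spread)
d_clustered = genotype_diversity(clustered)
```

An adaptive scheme would compare such a diversity value against a threshold to decide when to grow the population, change crossover counts, or invoke a local searcher.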

Investigation of Effective Parameters on Annealing and Hot Spotting Processes for Straightening of Bent Turbine Rotors

The most severe damage to a turbine rotor is distortion. The rotor straightening process must lead, in the first stage, to removal of the stresses from the material by annealing and then to straightening of the plastic distortion, without leaving any residual stress, by hot spotting. The straightening method does not produce stress accumulations, and the heating technique, developed specifically for solid forged rotors and disks, makes it possible to avoid local overheating and structural changes in the material. This process also leaves no stresses in the shaft material. An experimental study of hot spotting is carried out on a large turbine rotor, and some of the most important effective parameters that must be considered in the annealing and hot spotting processes are investigated in this paper.