Numerical Simulation of Interfacial Flow with Volume-Of-Fluid Method

In this article, various surface tension force models (CSF, CSS and PCIL) for interfacial flows are applied to a dynamic case and the results are compared. We study the Kelvin-Helmholtz instability, which is produced by shear at the interface between two fluids with different physical properties. The velocity inlet is defined as a sinusoidal perturbation. When gravity and surface tension are taken into account, we observe the development of the instability above a critical value of the velocity difference between the two fluids. The VOF model thus makes it possible to simulate the Kelvin-Helmholtz instability as a dynamic case.
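
For reference, the classical linear-stability criterion for the Kelvin-Helmholtz configuration with gravity and surface tension (a standard textbook result, not a formula taken from this work; the notation below is assumed) is:

```latex
% Classical Kelvin-Helmholtz criterion for two superposed fluids with
% gravity g and surface tension \sigma (standard linear stability theory;
% notation assumed, not taken from the paper).
% A perturbation of wavenumber k grows when
\[
  \frac{\rho_1 \rho_2}{\rho_1+\rho_2}\, k\,(U_1-U_2)^2 \;>\; g(\rho_1-\rho_2) + \sigma k^2 ,
\]
% so, minimizing the right-hand side over k, the interface first becomes
% unstable at the critical velocity difference
\[
  (U_1-U_2)^2_{\mathrm{crit}} \;=\; \frac{\rho_1+\rho_2}{\rho_1 \rho_2}\,
  2\sqrt{g\,\sigma\,(\rho_1-\rho_2)} ,
\]
% where \rho_1 > \rho_2 are the densities of the lower (heavier) and upper fluids.
```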

Classifying Biomedical Text Abstracts based on Hierarchical 'Concept' Structure

Classifying biomedical literature is a difficult and challenging task, especially when a large number of biomedical articles must be organized into a hierarchical structure. In this paper, we present an approach for classifying a collection of biomedical text abstracts downloaded from the Medline database with the help of ontology alignment. To accomplish our goal, we construct two types of hierarchies, the OHSUMED disease hierarchy and the Medline abstract disease hierarchies, from the OHSUMED dataset and the Medline abstracts, respectively. Then, we enrich the OHSUMED disease hierarchy before feeding it to the ontology-alignment process to find probable concepts or categories. Subsequently, we compute the cosine similarity between the vectors of the probable concepts (in the "enriched" OHSUMED disease hierarchy) and the vectors in the Medline abstract disease hierarchies. Finally, we assign a category to each new Medline abstract based on the similarity score. The experimental results show that the performance of our proposed approach for hierarchical classification is slightly better than that of multi-class flat classification.
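
As a minimal illustration of the final assignment step, the Python sketch below computes the cosine similarity between the term vector of a new abstract and the term vectors of the candidate ("probable") concepts, and assigns the best-scoring category. The vectorizer, concept names and text snippets are hypothetical stand-ins, not the paper's actual data or pipeline.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical candidate concepts from the enriched OHSUMED disease hierarchy
# (as found by ontology alignment) and a new Medline abstract to classify.
concept_docs = {
    "Cardiovascular Diseases": "myocardial infarction heart failure hypertension stroke",
    "Respiratory Tract Diseases": "asthma bronchitis pneumonia airway obstruction cough",
}
new_abstract = "Patients with chronic asthma showed reduced airway obstruction after therapy"

# Build term vectors for the concepts and the abstract in a common feature space.
vectorizer = TfidfVectorizer()
concept_matrix = vectorizer.fit_transform(list(concept_docs.values())).toarray()
abstract_vec = vectorizer.transform([new_abstract]).toarray()[0]

def cosine(u, v):
    """Cosine similarity between two vectors (0 if either is all zeros)."""
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v) / denom if denom else 0.0

# Assign the category with the highest similarity score.
scores = {name: cosine(abstract_vec, row)
          for name, row in zip(concept_docs, concept_matrix)}
print(scores, "->", max(scores, key=scores.get))
```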

MJPEG Real-Time Transmission in Industrial Environments Using a CBR Channel

Currently, many local area industrial networks can guarantee bandwidth to synchronous traffic, in particular by providing CBR (Constant Bit Rate) channels, which allow improved bandwidth management. Some of these networks operate over Ethernet and deliver channels with enough capacity, especially when compression is used, to integrate multimedia traffic in industrial monitoring and image-processing applications with many sources. In industrial environments where low latency is an essential requirement, JPEG is an adequate compression technique, but it generates VBR (Variable Bit Rate) traffic. Transmitting VBR traffic over CBR channels is inefficient, and current solutions to this problem significantly increase the latency or further degrade the quality. In this paper an R(q) model is used which allows on-line calculation of the JPEG quantization factor. We obtained higher quality, a lower CBR channel requirement, fewer discarded frames, and better use of the channel bandwidth.
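
The abstract does not give the functional form of the R(q) model, so the Python sketch below assumes, purely for illustration, an inverse model R(q) ≈ a/q + b fitted on-line from the sizes of previously encoded frames; it then picks, for each frame, the finest quantization factor whose predicted size fits the CBR slot.

```python
import numpy as np

def fit_rq_model(history):
    """Least-squares fit of the assumed model R(q) = a/q + b from (q, bits) pairs."""
    q = np.array([h[0] for h in history], dtype=float)
    r = np.array([h[1] for h in history], dtype=float)
    A = np.column_stack([1.0 / q, np.ones_like(q)])
    (a, b), *_ = np.linalg.lstsq(A, r, rcond=None)
    return a, b

def choose_q(history, budget_bits, q_min=1, q_max=100):
    """Pick the smallest (finest) quantization factor predicted to fit the CBR slot."""
    a, b = fit_rq_model(history)
    for q in range(q_min, q_max + 1):
        if a / q + b <= budget_bits:   # predicted frame size fits the channel
            return q
    return q_max                       # fall back to the coarsest setting

# Hypothetical (q, encoded_bits) observations and a CBR slot of 48 kbit per frame.
history = [(10, 92_000), (20, 55_000), (40, 33_000)]
print(choose_q(history, budget_bits=48_000))
```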

An Efficient Data Mining Approach on Compressed Transactions

In an era of knowledge explosion, the volume of data grows rapidly day by day. Since data storage is a limited resource, reducing the data space used in the process becomes a challenging issue. Data compression provides a good solution, as it can lower the required storage space. Data mining has found many useful applications in recent years because it helps users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process on them. However, both lack the ability to restore the data to their original state and to improve data mining performance. In this research a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships among transactions to merge related transactions and builds a quantification table to prune candidate itemsets that cannot become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.
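
The M2TQT algorithm itself is not described in this abstract; the Python sketch below only illustrates, under simplifying assumptions, the two ideas it mentions: merging related (here, identical) transactions while keeping a multiplicity, and using a quantification table of item counts to prune candidate itemsets that cannot reach the minimum support.

```python
from collections import Counter
from itertools import combinations

transactions = [("a", "b"), ("a", "b"), ("a", "c"), ("b", "c"), ("a", "b", "c")]
min_support = 2

# (1) Merge identical transactions into (itemset, count) pairs.
merged = Counter(tuple(sorted(t)) for t in transactions)

# (2) Quantification table: total occurrences of every single item.
quant_table = Counter()
for itemset, count in merged.items():
    for item in itemset:
        quant_table[item] += count

def support(candidate):
    """Support of a candidate itemset over the merged transactions."""
    return sum(c for t, c in merged.items() if set(candidate) <= set(t))

# Generate 2-item candidates, pruning any whose rarest item already falls short:
# an itemset's support can never exceed the count of its least frequent item.
items = sorted(quant_table)
candidates = [c for c in combinations(items, 2)
              if min(quant_table[i] for i in c) >= min_support]
frequent = {c: support(c) for c in candidates if support(c) >= min_support}
print(frequent)
```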

Robust Integrated Design for a Mechatronic Feed Drive System of Machine Tools

This paper aims to develop a robust optimization methodology for the mechatronic modules of machine tools by considering all important characteristics of the structural and control domains in a single process. The relationship between these two domains is strongly coupled. In order to reduce the disturbance caused by parameters in either domain, the mechanical and controller design domains need to be integrated. Therefore, the concurrent integrated design method Design for Control (DFC) is employed in this paper. In this context, it is applied not only to achieve minimal power consumption but also to enhance structural performance and system response at the same time. To investigate the method for integrated optimization, a mechatronic feed drive system of a machine tool is used as a design platform. Pro/ENGINEER and ANSYS are first used to build the 3D model and to analyze and design structural parameters such as elastic deformation, natural frequency and component size, based on their effects on and sensitivities to the structure. In addition, a robust controller based on Quantitative Feedback Theory (QFT) is applied to determine proper control parameters. Therefore, the overall physical properties of the machine tool are obtained in the initial stage. Finally, the design-for-control procedure is carried out to modify the structural and control parameters so as to achieve overall system performance. Hence, the corresponding productivity is expected to be greatly improved.

Active Packaging Influence on the Shelf Life of Milk Pomade Sweet – Sherbet

The objective of the research was to evaluate the quality of a milk pomade sweet (sherbet) packed in different packaging materials (Multibarrier 60, met.BOPET/PE, Aluthen) using several packaging technologies: active packaging, modified atmosphere packaging (MAP) consisting of 100% CO2, and a control packed in ambient air. Experiments were carried out at the Faculty of Food Technology of the Latvia University of Agriculture. Samples were stored at room temperature (+21±1 °C). The physicochemical properties of the packs, namely weight losses, moisture, hardening, colour and changes in headspace atmosphere concentration (CO2 and O2), were analysed before packaging and after 2, 4, 6, 8, 10 and 12 weeks of storage.

Investigation of Plant Density and Weed Competition in Different Cultivars of Wheat in Khoramabad Region

In order to study the effect of plant density and competition with field bindweed (Convolvulus arvensis) on the yield and agronomic properties of wheat (Triticum sativum) under irrigated conditions, a factorial experiment based on a randomized complete block design with three replications was conducted at a field in Kamalvand, Khoramabad (Lorestan) region of Iran, during 2008-2009. Three plant densities (Factor A = 200, 230 and 260 kg/ha), three cultivars (Factor B = Bahar, Pishtaz and Alvand) and weed control (Factor C = control and no control of weeds) were assigned in the experiment. The results show that plant density had a statistically significant effect on seed yield, 1000-seed weight, weed density and weed dry weight, while seed yield and harvest index differed significantly among cultivars. The interactions between plant density and cultivar were significant for weed density, seed yield, thousand-seed weight and harvest index. A plant density of 260 kg/ha had the greatest effect on increasing seed yield in the Bahar cultivar in the Khoramabad region of Iran.

Technology Integrated Education – Shaping the Personality and Social Development of the Young

There has been a strong link between computer-mediated education and constructivist learning and teaching theory. Acknowledging how well the constructivist doctrine works online, it has been established that constructivist views of learning correlate well with the philosophy of open and distance learning. Asynchronous and synchronous communications have placed online learning on the right track of a constructive learning path. This paper is written based on the social constructivist framework, in which knowledge is constructed through social communication and interaction. The study explores the possibility of practicing this theory by incorporating online discussion in the syllabus, and the ways it can be implemented to contribute to young people's personality and social development by addressing some aspects that may contribute to social problems, such as prejudice, ignorance and intolerance.

Modeling and Optimization of Abrasive Waterjet Parameters using Regression Analysis

Abrasive waterjet is a novel machining process capable of processing a wide range of hard-to-machine materials. This research addresses modeling and optimization of the process parameters for this machining technique. To model the process, a set of experimental data has been used to evaluate the effects of various parameter settings in cutting 6063-T6 aluminum alloy. The process variables considered here are nozzle diameter, jet traverse rate, jet pressure and abrasive flow rate. Depth of cut, as one of the most important output characteristics, has been evaluated for different parameter settings. The Taguchi method and regression modeling are used to establish the relationships between input and output parameters. The adequacy of the model is evaluated using the analysis of variance (ANOVA) technique. The pairwise effects of process parameter settings on the process response outputs are also shown graphically. The proposed model is then embedded into a Simulated Annealing algorithm to optimize the process parameters. The optimization is carried out for any desired value of depth of cut: the objective is to determine the proper levels of the process parameters that produce a given depth of cut. Computational results demonstrate that the proposed solution procedure is quite effective in solving such multi-variable problems.
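
The sketch below shows how a fitted regression model can be embedded in a Simulated Annealing search, in the spirit of the procedure described above. The regression coefficients, parameter ranges and annealing schedule are placeholders, not the values fitted or used in the paper.

```python
import math
import random

def predicted_depth(x):
    """Hypothetical regression model: x = (nozzle_d, traverse_rate, pressure, abrasive_flow)."""
    b0, b = 1.2, (0.8, -0.5, 0.02, 0.6)
    return b0 + sum(bi * xi for bi, xi in zip(b, x))

BOUNDS = [(0.8, 1.2), (60.0, 180.0), (100.0, 300.0), (2.0, 6.0)]  # assumed ranges

def anneal(target_depth, iters=5000, t0=1.0, cooling=0.999):
    """Search for parameter levels whose predicted depth of cut matches the target."""
    x = [random.uniform(lo, hi) for lo, hi in BOUNDS]
    best, best_err = x[:], abs(predicted_depth(x) - target_depth)
    temp = t0
    for _ in range(iters):
        # Perturb one randomly chosen parameter within its bounds.
        cand = x[:]
        i = random.randrange(len(cand))
        lo, hi = BOUNDS[i]
        cand[i] = min(hi, max(lo, cand[i] + random.gauss(0, 0.05 * (hi - lo))))
        err = abs(predicted_depth(cand) - target_depth)
        cur = abs(predicted_depth(x) - target_depth)
        # Accept improvements always, worse moves with Boltzmann probability.
        if err < cur or random.random() < math.exp(-(err - cur) / temp):
            x = cand
        if err < best_err:
            best, best_err = cand[:], err
        temp *= cooling
    return best, best_err

print(anneal(target_depth=4.0))
```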

Building an e-Learning System Model with Implications for Research and Instructional Use

This paper presents a model of an e-Learning system based on current learning theory and distance education practice. The relationships in the model are designed to be simple and functional and do not necessarily represent any particular e-Learning environment. It is meant to be a generic e-Learning system model with implications for the instructional design of any distance education course. It allows online instructors to reduce the discrepancy between their courses and the body of knowledge. The interrelationships of the four primary sectors that make up the e-Learning system are presented in this paper. This integrated model includes (1) pedagogy, (2) technology, (3) teaching, and (4) learning. The interactions within each of these sectors are depicted by a system loop map.

A Matrix Evaluation Model for Sustainability Assessment of Manufacturing Technologies

Technology assessment is a vital part of the decision process in manufacturing, particularly for decisions on the selection of new sustainable manufacturing processes. To assess these processes, a matrix approach is introduced and sustainability assessment models are developed. Case studies show that the matrix-based approach provides a flexible and practical way to evaluate the sustainability of new manufacturing technologies such as those used in surface coating. The technology assessment of coating processes reveals that, compared with powder coating, sol-gel coating can deliver better technical, economic and environmental sustainability with respect to the selected sustainability evaluation criteria for a decorative coating application on car wheels.
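
The paper's actual scoring model, criteria and weights are not given in the abstract; the sketch below only illustrates the general shape of a matrix-based assessment, in which each technology is scored against weighted sustainability criteria and the weighted sums are compared. All numbers are illustrative.

```python
import numpy as np

criteria = ["technical", "economic", "environmental"]
weights = np.array([0.4, 0.3, 0.3])      # assumed criterion weights

# Rows: candidate coating technologies; columns: normalized scores per criterion.
technologies = ["powder coating", "sol-gel coating"]
scores = np.array([
    [0.7, 0.6, 0.5],                     # powder coating (illustrative scores)
    [0.8, 0.7, 0.8],                     # sol-gel coating (illustrative scores)
])

overall = scores @ weights               # weighted sustainability index per technology
for name, value in zip(technologies, overall):
    print(f"{name}: {value:.2f}")
print("preferred:", technologies[int(np.argmax(overall))])
```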

Application of l1-Norm Minimization Technique to Image Retrieval

Image retrieval is a topic of high current scientific interest. The important steps in an image retrieval system are the extraction of discriminative features and a feasible similarity metric for retrieving the database images that are similar in content to the query image. Gabor filtering is a widely adopted technique for feature extraction from texture images. The recently proposed sparsity-promoting l1-norm minimization technique finds the sparsest solution of an under-determined system of linear equations. In the present paper, the l1-norm minimization technique is used as a similarity metric for image retrieval. It is demonstrated through simulation results that the l1-norm minimization technique provides a promising alternative to existing similarity metrics. In particular, the cases where the l1-norm minimization technique works better than the Euclidean distance metric are singled out.
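
One common way to use l1-norm minimization for retrieval, sketched below under stated assumptions, is to stack the database feature vectors as the columns of a dictionary, express the query feature as the sparsest combination of those columns (basis pursuit), and rank database images by the magnitude of their coefficients. The dimensions, data and ranking rule are illustrative; the paper's exact formulation is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 60))              # 60 database images, 20-dim features
q = D[:, 7] + 0.01 * rng.standard_normal(20)   # query close to database image #7

def l1_min(D, q):
    """Basis pursuit: minimize ||x||_1 subject to D x = q, via linear programming."""
    m, n = D.shape
    # Split x = u - v with u, v >= 0, so that ||x||_1 = sum(u) + sum(v).
    c = np.ones(2 * n)
    A_eq = np.hstack([D, -D])
    res = linprog(c, A_eq=A_eq, b_eq=q, bounds=[(0, None)] * (2 * n))
    u, v = res.x[:n], res.x[n:]
    return u - v

x = l1_min(D, q)
ranking = np.argsort(-np.abs(x))               # largest coefficients first
print("top matches:", ranking[:5])
```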

Denitrification of Wastewater Containing High Nitrate Using a Bioreactor System Packed by Microbial Cellulose

A laboratory-scale packed bed reactor with microbial cellulose as the biofilm carrier was used to investigate the denitrification of high-strength nitrate wastewater, with specific emphasis on the effects of the nitrogen loading rate and the hydraulic retention time. Ethanol was added as a carbon source for denitrification. This investigation found that, for feed nitrate concentrations of up to 500 mg/l, the system is able to produce an effluent with a nitrate content below 10 ppm at a hydraulic retention time of 3 h. The highest observed denitrification rate was 4.57 kg NO3-N/(m3·d) at a nitrate load of 5.64 kg NO3-N/(m3·d), and removal efficiencies higher than 90% were obtained for loads up to 4.2 kg NO3-N/(m3·d). A mass ratio of COD consumed to NO3-N removed of around 2.82 was observed. This continuous-flow bioreactor proved to be an efficient denitrification system with a relatively low retention time.
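
As a quick consistency check of the figures quoted above (standard reactor definitions, not additional data from the study; the feed conversion from nitrate to NO3-N is an illustrative assumption):

```python
# Units: kg NO3-N/(m3·d) unless stated otherwise.
peak_load = 5.64
peak_rate = 4.57
print(f"removal efficiency at peak load: {peak_rate / peak_load:.0%}")   # about 81 %

# Volumetric loading for a continuous-flow reactor: NLR = C_in / HRT.
# Example: an assumed feed of 113 mg NO3-N per litre (roughly 500 mg/l expressed
# as nitrate, since NO3-N is 14/62 of NO3) at HRT = 3 h.
c_in = 0.113          # kg NO3-N per m3
hrt_days = 3 / 24
print(f"corresponding loading rate: {c_in / hrt_days:.2f} kg NO3-N/(m3·d)")
```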

Three Computational Mathematics Techniques: Comparative Determination of Area under Curve

The objective of this manuscript is to find the area under the plasma concentration-time curve (AUC) for multiple doses of salbutamol sulphate sustained-release tablets (Ventolin® oral tablets SR 8 mg, GSK, Pakistan) in a group of 18 healthy adults using computational mathematics techniques. Following the administration of 4 doses of Ventolin® tablets every 12 hours to 24 healthy human subjects and bioanalysis of the obtained plasma samples, the plasma drug concentration-time profile was constructed. AUC, an important pharmacokinetic parameter, was measured using the integrated equation for multiple oral dose regimens. The AUC was also approximated using computational mathematics techniques, namely the repeated rectangular, repeated trapezium and repeated Simpson's rules, and compared with the exact value of AUC calculated from the integrated equation for multiple oral dose regimens, in order to find the computational method that gives AUC values closest to the exact ones. The exact values of AUC for four consecutive doses of Ventolin® oral tablets were 150.5819473, 157.8131756, 164.4178231 and 162.78 ng·h/ml, while the closest approximated AUC values were 149.245962, 157.336171, 164.2585768 and 162.289224 ng·h/ml, respectively, as found by the repeated rectangular rule. The errors in the approximated values of AUC were negligible. It is concluded that all computational tools approximated the values of AUC accurately, but the repeated rectangular rule gives slightly better approximations than the repeated trapezium and repeated Simpson's rules.
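
For reference, compact implementations of the three composite rules are sketched below. The sample times and concentrations are placeholders, not the Ventolin® plasma data, and the left-endpoint form of the rectangular rule is an assumption since the exact variant is not specified here.

```python
def repeated_rectangular(t, c):
    """Repeated rectangular rule (left-endpoint form) over the sampled points."""
    return sum(c[i] * (t[i + 1] - t[i]) for i in range(len(t) - 1))

def repeated_trapezium(t, c):
    """Composite trapezoidal rule, the usual pharmacokinetic AUC formula."""
    return sum(0.5 * (c[i] + c[i + 1]) * (t[i + 1] - t[i]) for i in range(len(t) - 1))

def repeated_simpson(t, c):
    """Composite Simpson's rule; needs an even number of equal-width intervals."""
    n = len(t) - 1
    h = t[1] - t[0]
    assert n % 2 == 0, "Simpson's rule needs an even number of intervals"
    return (h / 3) * (c[0] + c[-1] + 4 * sum(c[1:-1:2]) + 2 * sum(c[2:-1:2]))

# Hypothetical concentration samples (ng/ml) taken every 2 h over one 12 h interval.
t = [0, 2, 4, 6, 8, 10, 12]
c = [0.0, 6.1, 8.4, 7.9, 6.5, 5.0, 3.8]
print(repeated_rectangular(t, c), repeated_trapezium(t, c), repeated_simpson(t, c))
```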

Noise Factors of RFID-Aided Positioning

In recent years, Radio Frequency Identification (RFID) has been followed with interest by many researchers, especially for indoor positioning, since the innate properties of RFID are advantageous for this purpose. Many algorithms and schemes have been proposed for RFID-based positioning systems, but most of them lack environmental considerations, which induces inaccuracy in application. In this research, a number of algorithms and schemes for RFID indoor positioning are discussed to see whether they are effective in practice, and some rules are summarized for achieving accurate positioning. In addition, a new term, "Noise Factor", is introduced to describe the signal loss between the target and an obstacle. As a result, experimental data, and not only simulation results, can be obtained, and the performance of the positioning system can be characterized more substantially.
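
The abstract does not define the "Noise Factor" quantitatively; the Python sketch below merely illustrates the idea with a standard log-distance path-loss model extended by a per-obstacle attenuation term. All constants are assumptions for illustration.

```python
import math

P0 = -45.0      # RSSI (dBm) at the reference distance of 1 m (assumed)
N_EXP = 2.2     # path-loss exponent of the environment (assumed)
NOISE_FACTOR = {"none": 0.0, "glass": 3.0, "concrete_wall": 12.0}   # dB, assumed

def expected_rssi(distance_m, obstacle="none"):
    """Predicted RSSI at a given distance with an obstacle in the path."""
    return P0 - 10 * N_EXP * math.log10(distance_m) - NOISE_FACTOR[obstacle]

def estimate_distance(rssi_dbm, obstacle="none"):
    """Invert the model: estimated tag-reader distance from a measured RSSI."""
    return 10 ** ((P0 - NOISE_FACTOR[obstacle] - rssi_dbm) / (10 * N_EXP))

rssi = expected_rssi(4.0, "concrete_wall")
print(estimate_distance(rssi, "concrete_wall"))   # recovers roughly 4 m
print(estimate_distance(rssi, "none"))            # ignoring the obstacle overestimates
```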

Performance Analysis of Adaptive LMS Filter through Regression Analysis using SystemC

The LMS adaptive filter has several parameters that can affect its performance. Among these parameters, most papers deal only with the step-size parameter for controlling performance. In this paper, we consider three parameters: step size, filter tap size and filter form. Regression analysis is used to define the relations between these parameters and the performance of the LMS adaptive filter, using system-level simulation results. The results show that each parameter exhibits its own particular performance trend, which can be estimated from the equations derived by regression analysis.
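
The paper's implementation is in SystemC; the compact Python sketch below only shows how the two structural parameters studied here, step size and tap size, enter a basic LMS filter. The system-identification scenario and all constants are illustrative assumptions.

```python
import numpy as np

def lms(x, d, mu=0.01, taps=8):
    """Adapt an FIR filter so that its output tracks the desired signal d."""
    w = np.zeros(taps)
    e = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]   # x[n], x[n-1], ..., x[n-taps+1]
        y = w @ u                          # filter output
        e[n] = d[n] - y                    # estimation error
        w += mu * e[n] * u                 # LMS weight update (step size mu)
    return w, e

# Hypothetical run: identify an unknown 4-tap FIR channel from noisy observations.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
h = np.array([0.6, -0.3, 0.2, 0.1])
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = lms(x, d, mu=0.01, taps=8)
print(np.round(w, 2))                      # leading taps approximate h
```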

LAYMOD: A Layered and Modular Platform for CAx Collaboration Management and Supporting Product Data Integration Based on the STEP Standard

Nowadays companies strive to survive in a competitive global environment. To speed up product development and modifications, adopting a collaborative product development approach is suggested. However, despite new IT improvements, many CAx systems still work separately and locally. Collaborative design and manufacture require a product information model that supports the related CAx product data models. Many solutions have been proposed to this problem, of which the most successful is adopting the STEP standard as the product data model in order to develop a collaborative CAx platform. However, several issues should be considered: the evolution of STEP's Application Protocols (APs) over time, the huge number of STEP APs and CCs, the high cost of implementation, the costly process of converting older CAx software files to the STEP neutral file format, and the lack of STEP knowledge, all of which usually slow down the adoption of the STEP standard for collaborative data exchange, management and integration. In this paper the requirements for a successful collaborative CAx system are discussed. The capability of the STEP standard for product data integration and its shortcomings, as well as the dominant platforms for supporting CAx collaboration management and product data integration, are reviewed. Finally, a platform named LAYMOD is proposed to fulfil the requirements of a collaborative CAx environment and to integrate product data. It is a layered platform that enables global collaboration among different CAx software packages and developers. It also adopts the STEP modular architecture and XML data structures to enable collaboration between CAx software packages and to overcome the limitations of the STEP standard. The architecture and procedures of the LAYMOD platform for managing collaboration and avoiding conflicts in product data integration are introduced.

The Role of State in Combating Religious Extremism and Terrorism

Terrorism and extremism are among the most dangerous and difficult-to-forecast phenomena of our time, and they are taking increasingly diverse forms and becoming more rampant. Terrorist attacks often produce mass casualties, destroy material and spiritual values that may take a long time to recover, sow hatred among nations, and provoke war, mistrust and hatred between social and national groups that sometimes cannot be overcome within a generation. Currently, the countries of Central Asia face a topical issue: the threat of terrorism and religious extremism, which is growing not only in our region but throughout the world. Of course, in each country the terrorist threat is assessed differently. In our country the problem of terrorism should not be acute. After gaining independence and sovereignty, Kazakhstan chose the path of democracy, progress and a free economy. With the policy of the President of Kazakhstan, Nursultan Nazarbayev, and well-organized political and economic reforms, there has been economic growth, a rise in living standards, socio-political stability, and civil peace and accord in society [1].

Communication and Quality in Distributed Agile Development: An Empirical Case Study

Through inward perceptions, we intuitively expect distributed software development to increase the risks associated with achieving cost, schedule, and quality goals. To compound this problem, agile software development (ASD) insists that one of the main ingredients of its success is the cohesive communication attributed to collocation of the development team. The following study identified the degree of communication richness needed to achieve comparable software quality (fewer pre-release defects) between distributed and collocated teams. This paper explores the relevance of communication richness in the various development phases and its impact on quality. Through examination of a large distributed agile development project, this investigation seeks to understand the level of communication required within each ASD phase to produce quality results comparable to those achieved by collocated teams. Obviously, a multitude of factors affects the outcome of software projects. However, within distributed agile software development teams, the mode of communication is one of the critical components required to achieve team cohesiveness and effectiveness. As such, this study constructs a distributed agile communication model (DAC-M), based on measuring the suitable level of communication, for potential application to similar distributed agile development efforts. The results of the study show that less rich communication methods, in the appropriate phases, may be sufficient to achieve equivalent quality in distributed ASD efforts.

Face Recognition using a Kernelization of Graph Embedding

Linearization of graph embedding has emerged as an effective dimensionality reduction technique in pattern recognition. However, it may not be optimal for nonlinearly distributed real-world data, such as faces, due to its linear nature. Therefore, a kernelization of graph embedding is proposed as a dimensionality reduction technique for face recognition. In order to further boost the recognition capability of the proposed technique, the Fisher criterion is adopted in the objective function for better data discrimination. The proposed technique is able to characterize the underlying intra-class structure as well as the inter-class separability. Experimental results on the FRGC database validate the effectiveness of the proposed technique as a feature descriptor.
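
The paper's exact graph construction and kernel are not given in this abstract; the sketch below illustrates the general idea with a closely related formulation, a kernel embedding that maximizes the Fisher criterion (kernel discriminant analysis) on toy data with an assumed RBF kernel.

```python
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(A, B, gamma=0.5):
    """RBF (Gaussian) kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fisher(X, y, n_components=1, reg=1e-3):
    """Projection coefficients alpha maximizing the Fisher criterion in kernel space."""
    n = len(y)
    K = rbf_kernel(X, X)
    m_total = K.mean(axis=1)
    M = np.zeros((n, n))                 # between-class scatter (in alpha space)
    N = reg * np.eye(n)                  # within-class scatter, ridge-regularized
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[:, idx]
        m_c = Kc.mean(axis=1)
        diff = (m_c - m_total)[:, None]
        M += len(idx) * (diff @ diff.T)
        center = np.eye(len(idx)) - np.ones((len(idx), len(idx))) / len(idx)
        N += Kc @ center @ Kc.T
    # Generalized eigenproblem M a = lambda N a; keep the leading eigenvectors.
    vals, vecs = eigh(M, N)
    return vecs[:, ::-1][:, :n_components]

# Toy two-class data standing in for face features; project onto one direction.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)), rng.normal(2, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
alpha = kernel_fisher(X, y)
projection = rbf_kernel(X, X) @ alpha    # 1-D discriminative embedding of the data
print(projection[:3].ravel(), projection[-3:].ravel())
```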