Visual Study on Flow Patterns and Heat Transfer during Convective Boiling Inside Horizontal Smooth and Microfin Tubes

The evaporator is an important and widely used heat exchanger in the air-conditioning and refrigeration industries. Investigators have used different methods to increase heat transfer rates in evaporators. One passive technique for enhancing the heat transfer coefficient is the application of microfin tubes. The mechanism of heat transfer augmentation in microfin tubes depends on the regime of the two-phase flow, and many investigations of flow patterns for in-tube evaporation have therefore been reported in the literature. The gravitational force, the surface tension, and the vapor-liquid interfacial shear stress are known as the three dominant factors controlling the vapor and liquid distribution inside the tube. A review of the existing literature reveals that previous investigations were concerned with the two-phase flow pattern for flow boiling in horizontal tubes [12], [9]. The objective of the present investigation is therefore to obtain information about the two-phase flow patterns for evaporation of R-134a inside horizontal smooth and microfin tubes. Heat transfer during flow boiling of R-134a inside horizontal microfin and smooth tubes has also been investigated experimentally. The heat transfer coefficients for annular flow in the smooth tube are shown to agree well with Gungor and Winterton's correlation [4]. All the flow patterns observed in the tests can be divided into three dominant regimes, i.e., stratified-wavy flow, wavy-annular flow, and annular flow. Experimental data are plotted in two kinds of flow maps, i.e., a vapor Weber number versus liquid Weber number map and a mass flux versus vapor quality map. The transitions from wavy-annular flow to annular or stratified-wavy flow are identified in these flow maps.
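
The flow-map coordinates mentioned above can be sketched with the commonly used definitions of the vapor- and liquid-phase Weber numbers, We_v = (Gx)²D/(ρ_v σ) and We_l = (G(1−x))²D/(ρ_l σ). The fluid properties in the usage example below are illustrative round figures for R-134a near typical evaporator conditions, not data from this study:

```python
def weber_numbers(G, x, D, rho_v, rho_l, sigma):
    """Vapor- and liquid-phase Weber numbers for one flow-map point.

    G     : mass flux [kg/m^2 s]
    x     : vapor quality [-]
    D     : tube inner diameter [m]
    rho_v : vapor density [kg/m^3]
    rho_l : liquid density [kg/m^3]
    sigma : surface tension [N/m]
    """
    we_v = (G * x) ** 2 * D / (rho_v * sigma)
    we_l = (G * (1.0 - x)) ** 2 * D / (rho_l * sigma)
    return we_v, we_l
```

As quality increases at fixed mass flux, the point moves toward higher vapor Weber number and lower liquid Weber number, which is how the wavy-annular to annular transition appears in such a map.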

An Example of Post-Harvest Thermotherapy as a Non-Chemical Method of Pathogen Control on Apples of Topaz Cultivar in Storage

Huge losses in apple production are caused by pathogens that cannot be detected shortly after harvest. Post-harvest thermotherapy treatments can considerably improve the control of storage diseases on apples and become an alternative to chemical pesticides. Research in this area was carried out in the years 2010-2012. Apples of the 'Topaz' cultivar were harvested at the optimal maturity time for long storage and subjected to a water bath treatment at 45, 50, 52, and 55°C for 60, 120, 180, and 240 seconds. Untreated fruits served as the control. After 12 and 24 weeks, and during a so-called simulated trade turnover, the condition of the fruits was checked and the disease agents were determined using standard phytopathological methods. The most common agents infecting 'Topaz' apples during storage were fungi of the genus Gloeosporium. It was shown that, for effective protection of 'Topaz' apples against diseases, thermotherapy using water treatments in the temperature range of 50-52°C is quite sufficient.

Analysis of a Self-Excited Induction Generator Using Particle Swarm Optimization

In this paper, a novel technique based on the Particle Swarm Optimization (PSO) algorithm is proposed to estimate and analyze the steady-state performance of a self-excited induction generator (SEIG). With this method, the tedious job of deriving the complex coefficients of a polynomial equation and solving it, as in previous methods, is not required. The simulation results obtained by the proposed method agree well with those obtained by well-known mathematical methods, which validates the effectiveness of the proposed technique.
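
As a minimal sketch of the PSO machinery, the routine below minimizes a generic objective over box bounds using the standard inertia/cognitive/social velocity update. The SEIG steady-state problem would supply an objective such as the magnitude of the total loop impedance as a function of magnetizing reactance and per-unit frequency; that objective is not reproduced here, so a simple test function stands in.

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize `objective` over `bounds` (list of (lo, hi)) with basic PSO."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp the particle back into the feasible box
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

This is why no polynomial coefficients need to be derived: the swarm searches the unknowns directly, and only the objective (e.g., the admittance magnitude to be driven to zero) must be evaluated.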

A Modification of Wireless and Internet Technologies for Logistics Analysis

This research is designed to help a WAP-based mobile phone user analyze logistics in a traffic area by designing and applying the processes through which the mobile user accesses server databases. The design comprises a MySQL 4.1.8-nt database system as the server, with three sub-databases: traffic-light times at intersections during different periods of the day, distances along the roads of the area blocks into which the main sample area is divided, and speeds of sample vehicles (motorcycle, personal car, and truck) during different periods of the day. For the interconnection between the server and the user, PHP is used to calculate distances and travelling times from the starting point to the destination, while XHTML is applied for receiving, sending, and displaying data from PHP on the user's mobile phone. The main sample area is the Huakwang-Ratchada area of Bangkok, Thailand, a frequently congested location, together with a 6.25 km2 surrounding area that is split into 25 blocks of 0.25 km2 each. To simulate the results, the designed server database and all communication models of this research were uploaded to www.utccengineering.com/m4tg, and a mobile phone supporting a WAP 2.0 XHTML/HTML multimode browser was used to observe values and displayed pictures. According to the simulation results, the user can check pictures of the route from the requested point to the destination, along with analyzed travel times for the sample vehicles in various periods of the day.

A Combination of Similarity Ranking and Time for Social Research Paper Searching

Nowadays, social media are important tools for web resource discovery. The performance and capabilities of web searches are vital, especially for search results from social research paper bookmarking. This paper proposes a new ranking algorithm, CSTRank, that combines similarity ranking with the paper posted time. The paper posted time is a static ranking signal used to improve search results. In this study, the posted time is combined with similarity ranking to produce a better ranking than methods such as similarity ranking alone (SimRank). The retrieval performance of the combined rankings is evaluated using mean Normalized Discounted Cumulative Gain (NDCG) values. The experimental evaluation shows that CSTRank with a weight ratio of 90:10 can improve the efficiency of research paper searching on social bookmarking websites.
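
The 90:10 weighting can be sketched as a linear combination of the similarity score and a normalized recency score. The min-max time normalization below is an assumption for illustration; the paper's exact static time score is not reproduced here.

```python
def cst_rank(papers, alpha=0.9):
    """Rank papers by alpha * similarity + (1 - alpha) * recency.

    `papers` is a list of (paper_id, similarity, posted_time) tuples, with
    similarity in [0, 1] and posted_time as any comparable timestamp.
    Recency is min-max normalized to [0, 1] (illustrative assumption).
    """
    times = [t for _, _, t in papers]
    lo, hi = min(times), max(times)
    span = (hi - lo) or 1  # avoid division by zero when all times are equal
    scored = [(pid, alpha * sim + (1 - alpha) * (t - lo) / span)
              for pid, sim, t in papers]
    return sorted(scored, key=lambda item: item[1], reverse=True)
```

With alpha = 0.9 (the 90:10 ratio), similarity dominates, but a recent paper can overtake a slightly more similar, older one.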

Classification and Analysis of Risks in Software Engineering

Despite the various methods that exist for software risk management, software projects have a high rate of failure. As the complexity and size of projects increase, managing software development becomes more difficult, and the need for deeper analysis and risk assessment becomes vital. In this paper, a classification of software risks is specified, and the relations between these risks are presented using a risk tree structure. The risks are then analyzed and assessed using probabilistic calculations. This analysis supports both qualitative and quantitative assessment of the risk of failure and can assist the software risk management process. The classification and risk tree structure can also be applied in software tools.
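
The probabilistic aggregation over a risk tree can be sketched in the usual fault-tree style, combining independent child risks through AND/OR gates. This is a generic illustration under an independence assumption, not the paper's exact calculus.

```python
def risk_probability(node):
    """Evaluate the failure probability of a risk tree.

    A node is either a float (leaf probability) or a tuple
    ("or" | "and", [children]). Children are assumed independent.
    """
    if isinstance(node, (int, float)):
        return float(node)
    gate, children = node
    probs = [risk_probability(c) for c in children]
    if gate == "and":              # failure requires every child risk
        p = 1.0
        for q in probs:
            p *= q
        return p
    # "or": failure if any child risk materializes
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p
```

For example, a top-level risk that fires either on a 0.1-probability leaf or on the joint occurrence of two sub-risks (0.5 and 0.2) evaluates to 1 − 0.9 × 0.9 = 0.19.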

Common Acceptable Cuisine in Multicultural Countries: Towards Building the National Food Identity

Common acceptable cuisine is usually discussed in multicultural or multi-ethnic nations, as it represents the process of sharing cuisine among ethnic groups. Common acceptable cuisine is also considered a precursor in the process of constructing a national food identity among the ethnic groups of multicultural countries. The adaptation of certain ethnic cuisines, through their types of food, methods of cooking, ingredients, and eating decorum, by other ethnic groups is believed to create or enhance the formation of common acceptable cuisines in a multicultural country. Malaysia, as a multicultural country, undoubtedly continues to experience cross-cultural processes among its ethnic groups, including in cuisine. This study empirically investigates the level of adaptation by Malay, Chinese, and Indian chefs of each other's ethnic cuisine attributes toward the formation of common acceptable cuisines and a national food identity.

Microarrays Denoising via Smoothing of Coefficients in Wavelet Domain

We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct the microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition; i.e., in most cases a multiresolution analysis is not necessary. The denoising results compare favorably with most methods currently in use.
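
The decompose-smooth-reconstruct pipeline can be sketched in one dimension with a single-level Haar transform. The choice of Haar and of a 3-point moving average as the smoothing operator are illustrative assumptions; the method itself operates on 2D microarray images.

```python
import math

def haar_denoise(signal):
    """One-level Haar decomposition, smoothing of the detail (highest)
    subband by a 3-point moving average, then reconstruction."""
    assert len(signal) % 2 == 0
    s = math.sqrt(2.0)
    n = len(signal) // 2
    approx = [(signal[2 * k] + signal[2 * k + 1]) / s for k in range(n)]
    detail = [(signal[2 * k] - signal[2 * k + 1]) / s for k in range(n)]
    # smooth only the detail subband; the approximation is left untouched
    smoothed = [sum(detail[max(0, k - 1):k + 2]) / len(detail[max(0, k - 1):k + 2])
                for k in range(n)]
    out = []
    for a, d in zip(approx, smoothed):
        out.append((a + d) / s)   # inverse Haar, even sample
        out.append((a - d) / s)   # inverse Haar, odd sample
    return out
```

A smooth signal passes through unchanged (its detail coefficients are already smooth), while an isolated noise spike, which lives in the detail subband, is attenuated.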

Hand Gesture Recognition: Sign to Voice System (S2V)

Hand gestures are one of the typical methods used in sign language for non-verbal communication. They are most commonly used by people who have hearing or speech impairments to communicate among themselves or with others. Various sign language systems have been developed by manufacturers around the globe, but they are neither flexible nor cost-effective for end users. This paper presents a system prototype that automatically recognizes sign language to help hearing people communicate more effectively with the hearing or speech impaired. The Sign to Voice system prototype, S2V, was developed using a feed-forward neural network for two-sequence sign detection. Different sets of universal hand gestures were captured from a video camera and used to train the neural network for classification. The experimental results show that the neural network achieved satisfactory results for sign-to-voice translation.

Global Product Development Ways in the Modern Thai Economy – Case Studies, Good Practices and Ways to Implement in Thailand

Advances in technology (e.g., the internet and telecommunications) and political changes (fewer trade barriers and the enlargement of the European Union, ASEAN, NAFTA, and other organizations) have intensified international competition and expansion into new markets. Companies in Thailand, Asia, and around the globe are increasingly under pressure on price and time-to-market. At the same time, new markets are appearing, and many companies are looking for changes and shifts in their domestic markets. These factors have enabled rapid growth for companies and the globalization of many business activities along the product development process, from research and development (R&D) to production. This research shows and clarifies methods for developing global products, and it shows how strongly global products affect the development of the Thai economy.

Heavy Metal Contamination of the Landscape at the Ľubietová Deposit (Slovakia)

The heavy metal contamination of the technogenous sediments and soils at the investigated dump field shows an irregular planar distribution. The heavy metal content of the surface water, drainage water, and groundwater was also studied, both in dry and in rainy periods. The cementation process causes the substitution of iron by copper. Natural colonization and development of plant species was observed at the old mine waste dumps, adapted to the local chemical conditions, such as a low content of essential nutrients and a high content of heavy metals. The individual plant tissues (roots, branches/stems, leaves/needles, flowers/fruits) are contaminated by heavy metals and damaged to different degrees.

A Dynamic Filter for Removal of DC Offset in Current and Voltage Waveforms

In power systems, protective relays must filter their inputs to remove undesirable quantities and retain the signal quantities of interest. This job must be performed accurately and quickly. A new method is presented for filtering undesirable components, such as the DC and harmonic components associated with the fundamental system signals. The method is based on a dynamic filtering algorithm, which has many advantages over classical methods: it can be used as a dynamic on-line filter without the need to readjust parameters, as is required for classic filters. The proposed filter is tested using different signals, and the effects of the number of samples and the sampling window size are discussed. The results obtained are presented and discussed to show the capabilities of the algorithm.
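
The paper's dynamic algorithm is not reproduced here. As a baseline sketch of the task it solves, the function below estimates the DC component with a one-cycle sliding mean and subtracts it; this removes a constant offset exactly once the window is full, while a decaying DC offset is what motivates the more elaborate dynamic filter described above.

```python
def remove_dc(samples, samples_per_cycle):
    """Subtract a per-sample DC estimate taken over a one-cycle sliding window.

    Over any full cycle, a pure fundamental averages to zero, so the window
    mean isolates the DC component once at least one cycle has been seen.
    """
    out = []
    for n in range(len(samples)):
        window = samples[max(0, n - samples_per_cycle + 1):n + 1]
        dc = sum(window) / len(window)
        out.append(samples[n] - dc)
    return out
```

The first cycle of output is distorted because the window is still partial, which is one reason relay filters are judged on both accuracy and settling speed.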

The Coupling of Photocatalytic Oxidation Processes with Activated Carbon Technologies and the Comparison of the Treatment Methods for Organic Removal from Surface Water

The surface water used in this study was collected from the lower part of the Chao Praya River at the Nonthaburi bridge and used throughout the experiment. TOC (also known as DOC) in the range of 2.5 to 5.6 mg/l was investigated. Conventional treatment methods achieved TOC removals of 65% using FeCl3 and 78% using PAC (powdered activated carbon). The advanced oxidation process alone achieved only 35% TOC removal. Coupling advanced oxidation with a small amount of PAC (0.05 g/L) increased the efficiency to up to 55%. Combining BAC with the advanced oxidation process and a small amount of PAC demonstrated the highest efficiency, up to 95% TOC removal, with lower sludge production than the other methods.

Correspondence between Function and Interaction in the Protein Interaction Network of Saccharomyces cerevisiae

Understanding the cell's large-scale organization is an interesting task in computational biology, and protein-protein interactions can reveal important aspects of the organization and function of the cell. Here, we investigated the correspondence between protein interactions and function in yeast. We obtained the correlations among a set of proteins and clustered them using both hierarchical and biclustering methods. Detailed analyses of the proteins in each cluster were carried out using their functional annotations. We found that some functional classes appear together in almost all biclusters, whereas in hierarchical clustering the dominance of one functional class is observed. In light of the clustering data, we verified some interactions that were not identified as core interactions in DIP, and we characterized some functionally unknown proteins using the interaction data and functional correlations. In brief, going from interaction data to function, correlated results are observed concerning the relationship between interaction and function, which might give clues about the organization of the proteins and help to predict new interactions and to characterize the functions of unknown proteins.

Designing Virtual Laboratories Based on an Extended Event-Driven Simulation Method

There are many methods for designing and implementing virtual laboratories, owing to their special features. The most famous architectural designs are event-based; this model of architecting is efficient for virtual laboratories implemented on a local network. Later, service-oriented architecture gave them remote access capability, and Peer-to-Peer architecture was employed to exchange data with higher quality and speed. Other methods, such as Agent-Based architecting, try to solve the problems of distributed processing in a complicated laboratory system. This study first reviews the general principles of designing a virtual laboratory and then compares the different methods based on EDA, SOA, and Agent-Based architecting to present the weaknesses and strengths of each method. Finally, we make the best design choice based on the existing conditions and requirements.

Discovery and Capture of Organizational Knowledge from Unstructured Information

The knowledge of an organization does not merely reside in structured information and data; it is also embedded in unstructured form. The discovery of such knowledge is particularly difficult, as it is dynamic, scattered, massive, and multiplying at high speed. Conventional methods of managing unstructured information are considered too resource-demanding and time-consuming to cope with this rapid information growth. In this paper, a Multi-faceted and Automatic Knowledge Elicitation System (MAKES) is introduced for the discovery and capture of organizational knowledge. A trial implementation was conducted in a public organization to achieve the objective of decision capture and navigation over a number of meeting minutes, which are autonomously organized, classified, and presented in a multi-faceted taxonomy map at both the document and content level. Key concepts such as the critical decisions made, key knowledge workers, knowledge flows, and the relationships among them are elicited and displayed in predefined knowledge models and maps. Hence, the structured knowledge can be retained, shared, and reused. Conducting knowledge management with MAKES reduces the work of searching for and retrieving target decisions, saves a great deal of time and manpower, and also enables an organization to keep pace with the knowledge life cycle. This is particularly important when the amount of unstructured information and data grows extremely quickly. This system approach to knowledge management can accelerate the value extraction and creation cycles of organizations.

Symbolic Analysis of Large Circuits Using Discrete Wavelet Transform

Symbolic Circuit Analysis (SCA) is a technique used to generate the symbolic expression of a network, and it has become well established in circuit analysis and design. The symbolic expression of a network offers an excellent way to perform frequency response analysis, sensitivity computation, stability measurement, performance optimization, and fault diagnosis. Many approaches have been proposed for SCA, offering different features and capabilities. Numerical interpolation methods are very common in this context, especially those using the Fast Fourier Transform (FFT). The aim of this paper is to present an SCA method that uses the Wavelet Transform (WT) as a mathematical tool to generate the symbolic expression for large circuits while minimizing the analysis time by reducing the number of computations.
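
The numerical-interpolation idea behind FFT-based SCA can be sketched as follows: if a network function's numerator or denominator is a polynomial of known maximum degree, sampling it at the n-th roots of unity and applying an inverse DFT recovers its coefficients exactly. The sketch below uses a direct O(n²) inverse DFT for clarity; the paper's contribution replaces this Fourier machinery with the Wavelet Transform, which is not reproduced here.

```python
import cmath

def poly_coeffs_via_dft(evaluate, n):
    """Recover the coefficients of a polynomial of degree < n from its
    values at the n-th roots of unity, via an inverse DFT."""
    # sample the (numerically evaluated) network polynomial on the unit circle
    samples = [evaluate(cmath.exp(2j * cmath.pi * k / n)) for k in range(n)]
    coeffs = []
    for m in range(n):
        c = sum(samples[k] * cmath.exp(-2j * cmath.pi * m * k / n)
                for k in range(n)) / n
        coeffs.append(c)
    return coeffs
```

In practice the samples come from repeated numerical circuit evaluations at complex frequencies, and an FFT replaces the direct sum, which is what makes interpolation-based SCA scale to large circuits.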

Comparison of Imputation Techniques for Efficient Prediction of Software Fault Proneness in Classes

Missing data is a persistent problem in almost all areas of empirical research. Missing data must be treated very carefully, as data plays a fundamental role in every analysis, and improper treatment can distort the analysis or generate biased results. In this paper, we compare and contrast various imputation techniques on data sets with missing values and make an empirical evaluation of these methods so as to construct quality software models. Our empirical study is based on NASA's two public data sets, KC4 and KC1, whose actual data (125 cases and 2107 cases, respectively, without any missing values) were considered. These data sets were used to create Missing at Random (MAR) data. Listwise Deletion (LD), Mean Substitution (MS), Interpolation, Regression with an error term, and Expectation-Maximization (EM) approaches were used to compare the effects of the various techniques.
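
Two of the simplest techniques compared above can be sketched directly, with `None` marking a missing value. This is a generic illustration of the methods, not the study's exact experimental setup.

```python
def listwise_deletion(rows):
    """Drop every row that contains a missing value (None)."""
    return [r for r in rows if all(v is not None for v in r)]

def mean_substitution(rows):
    """Replace each missing value with its column mean over observed values."""
    n_cols = len(rows[0])
    means = []
    for c in range(n_cols):
        observed = [r[c] for r in rows if r[c] is not None]
        means.append(sum(observed) / len(observed))
    return [[means[c] if r[c] is None else r[c] for c in range(n_cols)]
            for r in rows]
```

The trade-off the comparison probes is visible even here: listwise deletion shrinks the sample (a problem for a small set like KC4), while mean substitution keeps every case but flattens the variance of the imputed columns.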

Formal Verification of a Multicast Protocol in Mobile Networks

As computer network technology becomes increasingly complex, it becomes necessary to place greater requirements on the validity of developing standards and the resulting technology. Communication networks are based on large numbers of protocols, whose validity has to be proved either individually or in an integrated fashion. One strategy for achieving this is to apply the growing field of formal methods. Formal methods research defines systems in higher-order logic so that automated reasoning can be applied for verification. In this research, we represent and implement a previously published multicast protocol in the Prolog language so that certain properties of the protocol can be verified. It is shown that, using this approach, some minor faults in the protocol were found and repaired. Describing the protocol as facts and rules also has other benefits, i.e., it leads to processable knowledge. This knowledge can be transferred as an ontology between systems in the KQML format. Since the Prolog language can extend its knowledge base at any time, this method can also be used to realize an intelligent network that learns.