Experimental Investigation of Surface Roughness Effect on Single Phase Fluid Flow and Heat Transfer in Micro-Tube

An experimental investigation was conducted to study the effect of surface roughness on the friction factor and heat transfer characteristics of single-phase fluid flow in a stainless steel micro-tube with a diameter of 0.85 mm and an average internal surface roughness of 1.7 μm, corresponding to a relative surface roughness of 0.002. Distilled water and liquid R134a were used as the working fluids, and testing was conducted at Reynolds numbers ranging from 100 to 10,000, covering laminar, transition and turbulent flow conditions. The experiments were conducted with the micro-tube oriented horizontally and a uniform heat flux applied at the test section. The results indicated that the friction factor of both water and R134a can be predicted by the Hagen-Poiseuille equation for laminar flow and by the modified Miller correlation for turbulent flow and the early transition from laminar to turbulent flow. The heat transfer results for water and R134a were in good agreement with conventional theory in the laminar flow region, and were lower than the Adams correlation in the turbulent flow region, which deviates from conventional theory.
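
For reference, the laminar benchmark against which such friction factor measurements are usually compared, together with the relative roughness quoted above, takes the familiar form (Darcy friction factor shown; the equivalent Fanning form is 16/Re):

```latex
f = \frac{64}{Re}, \qquad Re = \frac{\rho u D}{\mu}, \qquad
\frac{\varepsilon}{D} = \frac{1.7\ \mu\mathrm{m}}{0.85\ \mathrm{mm}} = 0.002
```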

A Meta-Heuristic Algorithm for Set Covering Problem Based on Gravity

A new meta-heuristic approach called the "randomized gravitational emulation search algorithm (RGES)" has been designed for solving large set covering problems. The algorithm is founded on introducing a randomization concept along with two of the four primary parameters of physics, namely velocity and gravity. A new heuristic operator is introduced in the domain of RGES to maintain feasibility, specifically for the set covering problem, and to yield good solutions. The performance of this algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results show that the randomized gravitational emulation search based heuristic is capable of producing high quality solutions. Compared with other existing heuristic algorithms, its performance is excellent in terms of solution quality.
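
As an illustration of the feasibility-maintenance idea, the sketch below shows a randomized search with a greedy repair operator for the unicost set covering problem. It is a hypothetical, simplified stand-in for RGES: the perturbation and repair rules are our own choices, there is no velocity/gravity emulation, and it assumes the union of all sets covers the universe.

```python
import random

def repair(cover, sets, universe):
    """Greedily add sets until every element is covered (feasibility operator)."""
    covered = set().union(*(sets[i] for i in cover)) if cover else set()
    while covered != universe:
        # pick the set covering the most still-uncovered elements
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        cover.add(best)
        covered |= sets[best]
    return cover

def randomized_search(sets, universe, iters=1000, seed=0):
    rng = random.Random(seed)
    best = repair(set(), sets, universe)
    for _ in range(iters):
        cand = set(best)
        # random perturbation: drop a couple of sets, then restore feasibility
        for i in rng.sample(sorted(cand), k=min(2, len(cand))):
            cand.discard(i)
        cand = repair(cand, sets, universe)
        if len(cand) < len(best):
            best = cand
    return best

if __name__ == "__main__":
    sets = [{1, 2, 3}, {2, 4}, {3, 4, 5}, {5}, {1, 4, 5}]
    universe = {1, 2, 3, 4, 5}
    print(sorted(randomized_search(sets, universe)))
```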

Comparative Analysis of the Software Effort Estimation Models

Accurate software cost estimates are critical to both developers and customers. They can be used for generating requests for proposals, contract negotiations, scheduling, monitoring and control. The exact relationship between effort and the attributes on which the estimate is based is difficult to establish. A neural network is good at discovering relationships and patterns in data. Therefore, this paper performs a comparative analysis of the existing Halstead, Walston-Felix, Bailey-Basili and Doty models and a neural network based model. The neural network outperformed the other models considered. Hence, we propose a neural network system as a soft computing approach to modelling software effort estimation.
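
For context, the commonly cited closed forms of the four regression-based models compared above are sketched below (effort in person-months, size in thousands of lines of code); the coefficients are textbook values and should be checked against the original sources rather than taken as this paper's calibration.

```python
def halstead(kloc: float) -> float:
    return 0.7 * kloc ** 1.50          # E = 0.7 (KLOC)^1.50

def walston_felix(kloc: float) -> float:
    return 5.2 * kloc ** 0.91          # E = 5.2 (KLOC)^0.91

def bailey_basili(kloc: float) -> float:
    return 5.5 + 0.73 * kloc ** 1.16   # E = 5.5 + 0.73 (KLOC)^1.16

def doty(kloc: float) -> float:
    # Doty model, commonly quoted for KLOC > 9
    return 5.288 * kloc ** 1.047       # E = 5.288 (KLOC)^1.047

if __name__ == "__main__":
    for kloc in (10, 50, 100):
        print(kloc, round(halstead(kloc), 1), round(walston_felix(kloc), 1),
              round(bailey_basili(kloc), 1), round(doty(kloc), 1))
```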

New Nonlinear Filtering Strategies for Eliminating Short and Long Tailed Noise in Images with Edge Preservation Properties

The midpoint filter is quite effective in recovering images corrupted by short-tailed (uniform) noise. It performs poorly, however, in the presence of additive long-tailed (impulse) noise, and it does not preserve the edge structures of the image. The median smoother discards outliers (impulses) effectively, but it fails to provide adequate smoothing for images corrupted by non-impulse noise. In this paper, two nonlinear image filtering techniques, New Filter I and New Filter II, are proposed based on a nonlinear high-pass filter algorithm. New Filter I is constructed from a midpoint filter, a high-pass filter and a combiner; it suppresses uniform noise quite well. New Filter II is configured from an alpha-trimmed midpoint filter, a median smoother with a 3x3 window, the high-pass filter and the combiner; it is robust against impulse noise and attenuates uniform noise satisfactorily. Both filters are shown to exhibit good response at image boundaries (edges). The proposed filters are evaluated on a test image and the results obtained are included.
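
The building blocks named above (midpoint filter, median smoother and alpha-trimmed midpoint) can be sketched as follows; this illustrates the individual operators only, not New Filter I or II, whose high-pass filter and combiner are not specified in the abstract.

```python
import numpy as np
from scipy import ndimage

def midpoint_filter(img, size=3):
    """Midpoint filter: average of local min and max (suited to uniform noise)."""
    lo = ndimage.minimum_filter(img.astype(float), size=size)
    hi = ndimage.maximum_filter(img.astype(float), size=size)
    return (lo + hi) / 2.0

def median_smoother(img, size=3):
    """3x3 median smoother: discards impulses (outliers)."""
    return ndimage.median_filter(img.astype(float), size=size)

def alpha_trimmed_midpoint(window, alpha=2):
    """Drop the alpha smallest and alpha largest samples, then take the midpoint."""
    s = np.sort(window.ravel())
    s = s[alpha:len(s) - alpha]
    return (s[0] + s[-1]) / 2.0

def alpha_trimmed_midpoint_filter(img, size=3, alpha=2):
    return ndimage.generic_filter(img.astype(float), alpha_trimmed_midpoint,
                                  size=size, extra_keywords={"alpha": alpha})

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = np.full((64, 64), 128.0) + rng.uniform(-20, 20, (64, 64))  # uniform noise
    print(midpoint_filter(img).std(), median_smoother(img).std(),
          alpha_trimmed_midpoint_filter(img).std())
```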

On the Efficiency of Password Attacks Based on a Keyboard

At present, the dictionary attack is the basic tool for recovering passwords. To avoid dictionary attacks, users purposely choose other character strings as passwords. According to statistics, about 14% of users choose keys on a keyboard (Kkeys, for short) as passwords. This paper develops a framework system to attack passwords chosen from Kkeys and analyzes its efficiency. Within this system, we build keyboard rules using the adjacency and parallel relationships among Kkeys and then use these Kkey rules to generate password databases by a depth-first search method. According to the experimental results, we find that the key space of the databases derived from these Kkey rules can be far smaller than that of the password databases generated by brute-force attack, thus effectively narrowing down the search space of the attack. Taking one general Kkey rule, the combinations of all printable characters (94 types) with Kkey adjacency and parallel relationships, as an example, the derived key space is about 2^40 times smaller than that of a brute-force attack. In addition, we demonstrate the method's practicality and value by successfully cracking access passwords to UNIX and PC systems using the password databases created.
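
The candidate-generation step can be sketched as a depth-first search over a keyboard-adjacency table; the table below covers only a hypothetical fragment of the keyboard and is not the paper's rule set.

```python
# Minimal sketch of DFS candidate generation from keyboard-adjacency rules.
ADJACENT = {
    "q": "wa", "w": "qes", "e": "wrd", "r": "etf",
    "a": "qsz", "s": "awdx", "d": "sefc", "f": "drgv",
}

def dfs_passwords(prefix, length, out):
    """Extend `prefix` one adjacent key at a time until the target length."""
    if len(prefix) == length:
        out.append(prefix)
        return
    for nxt in ADJACENT.get(prefix[-1], ""):
        dfs_passwords(prefix + nxt, length, out)

def generate(length=4):
    out = []
    for start in ADJACENT:
        dfs_passwords(start, length, out)
    return out

if __name__ == "__main__":
    candidates = generate(4)
    print(len(candidates), candidates[:5])  # far fewer than 94**4 brute-force strings
```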

Automata Theory Approach for Solving Frequent Pattern Discovery Problems

The various types of frequent pattern discovery problems, namely the frequent itemset, sequence and graph mining problems, are solved in different ways which are, however, similar in certain respects. The main approaches to discovering such patterns can be classified into two classes: level-wise methods and database projection-based methods. The level-wise algorithms generally use clever indexing structures for discovering the patterns. In this paper a new approach based on the level-wise paradigm is proposed for discovering frequent sequences and tree-like patterns efficiently. Because level-wise algorithms spend a lot of time on the subpattern testing problem, the new approach introduces the idea of using automata theory to solve this problem.
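
The subpattern test that level-wise sequence miners spend most of their time on can be viewed as running a small automaton whose state is the length of the matched prefix, as in the simplified sketch below (a stand-in for, not a reproduction of, the paper's automaton construction).

```python
def support_count(candidate, database):
    """Count database sequences that contain `candidate` as a subsequence,
    by running an automaton whose state is the number of matched items."""
    def accepts(sequence):
        state = 0                          # matched prefix length of the candidate
        for item in sequence:
            if state < len(candidate) and item == candidate[state]:
                state += 1                 # take the transition for the next item
        return state == len(candidate)     # accepting state: whole pattern matched
    return sum(accepts(seq) for seq in database)

if __name__ == "__main__":
    db = [["a", "b", "c", "d"], ["a", "c", "b"], ["c", "b", "a"]]
    print(support_count(["a", "c"], db))   # 2: the third sequence has no 'c' after 'a'
```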

Management and Control of Industrial Effluents Discharged to Public Sewers: A Case Study

An overview of the important aspects of managing and controlling industrial effluent discharges to public sewers, namely sampling, characterization, quantification and legislative controls, is presented. The findings have been validated by means of a case study covering three industrial sectors, namely the tanning, textile finishing and food processing industries. Industrial effluent discharges were found to be best monitored by systematic and automatic sampling, and best quantified using water meter readings corrected for evaporative and consumptive losses. Based on the treatment processes employed in the publicly owned treatment works and the chemical oxygen demand and biochemical oxygen demand levels obtained, the effluents from all three industrial sectors studied were found to lie in the toxic zone. Thus, physico-chemical treatment of these effluents is required to bring them into the biodegradable zone. The kL values (quoted to base e) were greater than 0.50 day⁻¹, compared with 0.39 day⁻¹ for typical municipal wastewater.
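
For reference, the first-order BOD model in which a kL value (base e) of this kind appears is commonly written as

```latex
\mathrm{BOD}_t = L_0\left(1 - e^{-k_L t}\right)
```

where L0 is the ultimate first-stage BOD and t is time in days; this is the standard textbook form, not a result derived in the case study itself.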

Secure Block-Based Video Authentication with Localization and Self-Recovery

Because of the great advances in multimedia technology, digital multimedia is vulnerable to malicious manipulation. In this paper, a public-key, self-recovery, block-based video authentication technique is proposed which can not only precisely localize alterations but also recover the missing data with high reliability. In the proposed block-based technique, multiple description coding (MDC) is used to generate two codes (two descriptions) for each block. Although one block description is enough to rebuild an altered block, the altered block is rebuilt with better quality when both descriptions are available, so using MDC increases the reliability of data recovery. A block signature is computed using a cryptographic hash function, and a doubly linked chain is used to embed the block signature copies and the block descriptions into the LSBs of distant blocks and of the block itself. The doubly linked chain scheme gives the proposed technique the capability to thwart vector quantization attacks. In our proposed technique, anyone can check the authenticity of a given video using the public key. The experimental results show that the proposed technique is reliable for detecting, localizing and recovering alterations.
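
The hash-and-embed step can be sketched as below; this is a simplified illustration that uses a plain SHA-256 digest in place of the public-key signature and omits MDC and the doubly linked chain.

```python
import hashlib
import numpy as np

def block_digest_bits(block: np.ndarray, n_bits: int) -> np.ndarray:
    """SHA-256 digest of the block content, truncated to n_bits.
    (A real scheme would sign this digest with a private key.)"""
    digest = hashlib.sha256(block.astype(np.uint8).tobytes()).digest()
    return np.unpackbits(np.frombuffer(digest, dtype=np.uint8))[:n_bits]

def embed_lsb(carrier: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Write `bits` into the least-significant bits of `carrier` (flattened order)."""
    flat = carrier.astype(np.uint8).ravel().copy()
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(carrier.shape)

def extract_lsb(carrier: np.ndarray, n_bits: int) -> np.ndarray:
    return carrier.astype(np.uint8).ravel()[:n_bits] & 1

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    block_a = rng.integers(0, 256, (16, 16), dtype=np.uint8)   # block to protect
    block_b = rng.integers(0, 256, (16, 16), dtype=np.uint8)   # distant carrier block
    sig = block_digest_bits(block_a & 0xFE, 128)   # hash content bits only, not LSBs
    block_b = embed_lsb(block_b, sig)
    ok = np.array_equal(extract_lsb(block_b, 128), block_digest_bits(block_a & 0xFE, 128))
    print("authentic:", ok)
```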

Design and Implementation of Rule-based Expert System for Fault Management

It has been said that "the network is the system". This implies providing levels of service, reliability, predictability and availability that are commensurate with, or better than, those that individual computers provide today. Achieving this requires integrated network management for interconnected networks of heterogeneous devices spanning the local campus and beyond. In this paper we present a framework to deal with this issue effectively. It consists of the components, and the interactions between them, that are required to perform service fault management. A real-world scenario is used to derive the requirements, which have been applied to the component identification. An analysis of existing frameworks and approaches with respect to their applicability to the framework is also carried out.
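
For illustration only, a minimal forward-chaining rule engine of the kind such a fault management component might embed is sketched below; the rules and symptom names are hypothetical and are not taken from the paper's framework.

```python
# Hypothetical fault rules: (set of required symptoms/facts, concluded fault).
RULES = [
    ({"link_down", "neighbor_unreachable"}, "physical_link_fault"),
    ({"high_packet_loss", "interface_errors"}, "degraded_interface"),
    ({"physical_link_fault", "service_timeout"}, "service_outage"),
]

def infer(facts):
    """Forward chaining: fire rules until no new conclusions are added."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

if __name__ == "__main__":
    observed = {"link_down", "neighbor_unreachable", "service_timeout"}
    print(infer(observed))   # includes physical_link_fault and service_outage
```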

Analysis of the Wi-Fi Access Network Situation in the City Area

With the increasing number of wireless devices such as laptops, Wi-Fi web cams and network extenders, a new class of problems has appeared, mostly related to poor Wi-Fi throughput or communication failures. In this paper an investigation of wireless networks and their saturation in Vilnius City and its surroundings is presented, covering the main problems of wireless saturation and network load during the day. An investigation of wireless channel selection and noise levels was also carried out, showing the impact of neighboring APs on signal and noise levels and how these change during the day.

Improving Spatiotemporal Change Detection: A High Level Fusion Approach for Discovering Uncertain Knowledge from Satellite Image Database

This paper investigates the problem of tracking spatiotemporal changes in a satellite image through the use of Knowledge Discovery in Databases (KDD). The purpose of this study is to help a given user effectively discover interesting knowledge and then build prediction and decision models. Unfortunately, the KDD process for spatiotemporal data is always marked by several types of imperfection. In this paper, we take these imperfections into consideration in order to provide more accurate decisions. To achieve this objective, different KDD methods are used to discover knowledge in satellite image databases. Each method presents a different point of view of the spatiotemporal evolution of a query model (which represents an object extracted from a satellite image). In order to combine these methods, we use evidence fusion theory, which considerably improves the spatiotemporal knowledge discovery process and increases our belief in the spatiotemporal model change. Experimental results on satellite images of the Auckland region in New Zealand show the improvement in overall change detection compared with classical methods.
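
Assuming the evidence fusion theory referred to is Dempster-Shafer theory, the combination of two methods' outputs can be sketched with Dempster's rule as below; the frame of discernment and the mass values are hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

if __name__ == "__main__":
    # Hypothetical frame: did the tracked object 'change' or stay 'stable'?
    change, stable = frozenset({"change"}), frozenset({"stable"})
    theta = change | stable
    m_method1 = {change: 0.6, theta: 0.4}            # evidence from one KDD method
    m_method2 = {change: 0.5, stable: 0.2, theta: 0.3}
    print(dempster_combine(m_method1, m_method2))
```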

Corporate Governance Practices and Audit Quality: An Empirical Study of the Listed Companies in Egypt

Recent international financial scandals have led to a number of investigations into the effectiveness of corporate governance practices and audit quality. Although evidence on corporate governance practices and audit quality exists for developed economies, very few studies have been conducted in Egypt, where corporate governance is only just evolving. Therefore, this study provides evidence on the effectiveness of corporate governance practices and audit quality from a developing country. The data for analysis were gathered from the top 50 most active companies on the Egyptian Stock Exchange, covering the three-year period 2007-2009. Logistic regression was used to investigate the questions raised in the study. Findings from the study show that board independence, CEO duality and audit committees have a significant relationship with audit quality. The results also indicate that institutional and managerial ownership have no significant relationship with audit quality. Evidence also exists that company size, complexity and leverage are important factors in audit quality for companies quoted on the Egyptian Stock Exchange.
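
A minimal sketch of the logistic-regression setup is given below; the column names are hypothetical placeholders and do not reflect the paper's variable definitions or data.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_audit_quality_model(df: pd.DataFrame):
    """Logit of a 0/1 audit-quality indicator on governance and firm characteristics."""
    formula = ("audit_quality ~ board_independence + ceo_duality + audit_committee"
               " + institutional_ownership + managerial_ownership"
               " + firm_size + complexity + leverage")
    return smf.logit(formula, data=df).fit(disp=False)

# Usage (assuming `df` holds one row per firm-year with the columns named above):
# result = fit_audit_quality_model(df)
# print(result.summary())
```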

Ultrasound Assisted Method to Increase the Aluminum Dissolution Rate from Acidified Water

Aluminum salts, which are generally present as a solid phase in water purification sludge (WPS), can be dissolved and recovered in the liquid phase by adding strong acid to the sludge solution. According to reaction kinetics, the amount of dissolved aluminum salt and the reaction rate are high when the reactant is in the form of small particles with a large specific surface area and when the reaction temperature is high. Therefore, in this investigation, the WPS solution was treated with ultrasonic waves to break down the sludge, and different acids (1 N HCl and 1 N H2SO4) were used to acidify it. Acid dosages that yielded a solution pH of less than two were used. The results indicate that the quantity of aluminum dissolved in the H2SO4-acidified solution exceeded that in the HCl-acidified solution. Additionally, ultrasonic treatment increased both the rate of aluminum dissolution and the amount dissolved. The quantity of aluminum dissolved at 60 °C was 1.5 to 2.0 times higher than that at 25 °C.

Springback Simulations of Monolithic and Layered Steels Used for Pressure Equipment

Carbon steel is used in boilers, pressure vessels, heat exchangers, piping, structural elements and other moderate-temperature service systems in which good strength and ductility are desired. The ASME Boiler and Pressure Vessel Code, Section II Part A (2004) provides specifications of ferrous materials for the construction of pressure equipment, covering a wide range of mechanical properties including high-strength materials for power plant applications. However, an increased level of springback is one of the major problems in fabricating components of high-strength steel by bending. The presented work discusses springback simulations for five different steels (SA-36, SA-299, SA-515 grade 70, SA-612 and SA-724 grade B) using finite element analysis of air V-bending. Analytical springback simulations of hypothetical layered materials are also presented. The results show that (i) the combination of material property parameters controls the springback, and (ii) a layer of high-ductility steel on the high-strength steel greatly suppresses the springback.
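
As a point of reference for observation (i), the classical pure-bending (Gardiner) estimate of springback for an elastic-perfectly-plastic sheet is

```latex
\frac{R_i}{R_f} = 4\left(\frac{R_i\,\sigma_y}{E\,t}\right)^{3}
                - 3\left(\frac{R_i\,\sigma_y}{E\,t}\right) + 1
```

where Ri and Rf are the bend radii before and after unloading, σy the yield strength, E Young's modulus and t the sheet thickness; a larger σy/E ratio (high-strength steel) gives more springback. This is a textbook estimate, not the finite element model used in the paper.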

Novelty as a Measure of Interestingness in Knowledge Discovery

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although one of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper we study the novelty of discovered rules as a subjective measure of interestingness. We propose a hybrid approach based on both objective and subjective measures to quantify the novelty of discovered rules in terms of their deviations from the known rules (knowledge). We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with several public datasets. The experimental results are promising.
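
One simple way to make the deviation idea concrete is sketched below, using a Jaccard-based distance between a discovered rule and the known rules together with a user-specified threshold; this is our own simplification, not the paper's hybrid objective/subjective formulation.

```python
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 1.0

def deviation(rule, known_rule):
    """Deviation between two (antecedent, consequent) rules in [0, 1]."""
    (ante1, cons1), (ante2, cons2) = rule, known_rule
    return 1.0 - 0.5 * (jaccard(ante1, ante2) + jaccard(cons1, cons2))

def categorize(rule, knowledge, threshold=0.6):
    """A rule is 'novel' if it deviates from every known rule by more than the threshold."""
    return "novel" if min(deviation(rule, k) for k in knowledge) > threshold else "conforming"

if __name__ == "__main__":
    knowledge = [({"bread"}, {"butter"}), ({"beer"}, {"chips"})]
    print(categorize(({"bread", "milk"}, {"butter"}), knowledge))  # conforming
    print(categorize(({"diapers"}, {"beer"}), knowledge))          # novel
```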

A Neural Network Approach in Predicting the Blood Glucose Level for Diabetic Patients

Diabetes mellitus is a chronic metabolic disorder in which improper management of blood glucose levels in diabetic patients leads to the risk of heart attack, kidney disease and renal failure. This paper attempts to enhance the accuracy of predicting the advancing blood glucose levels of diabetic patients by combining principal component analysis and a wavelet neural network. The proposed system makes separate blood glucose predictions for the morning, afternoon, evening and night intervals, using a dataset from one patient covering a period of 77 days. Comparisons of the diagnostic accuracy with other neural network models that use the same dataset are made. The comparison results showed overall improved accuracy, which indicates the effectiveness of the proposed system.
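
A minimal sketch of the PCA-plus-neural-network pipeline is given below; scikit-learn's MLPRegressor stands in for the wavelet neural network used in the paper, and the data are synthetic placeholders rather than the patient dataset.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(77, 8))                       # 77 days x 8 hypothetical inputs
y = 6.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=77)  # glucose (mmol/L)

model = make_pipeline(
    StandardScaler(),
    PCA(n_components=4),                           # dimensionality reduction step
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0),
)
model.fit(X[:60], y[:60])                          # train on the first 60 days
print("test MAE:", np.abs(model.predict(X[60:]) - y[60:]).mean())
```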

Synthetic Transmit Aperture Method in Medical Ultrasonic Imaging

This work describes the use of a synthetic transmit aperture (STA), with a single element transmitting and all elements receiving, in medical ultrasound imaging. The STA technique is a novel alternative to today's commercial systems, in which an image is acquired sequentially one image line at a time; this puts a strict limit on the frame rate and the amount of data needed for high image quality. STA imaging allows data to be acquired simultaneously from all directions over a number of emissions, after which the full image can be reconstructed. In the experiments a 32-element linear transducer array with 0.48 mm inter-element spacing was used. A single-element transmit aperture was used to generate a spherical wave covering the full image region. 2D ultrasound images of a wire phantom obtained using STA and the commercial ultrasound scanner Antares are presented to demonstrate the benefits of STA imaging.
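
The reconstruction step can be sketched as delay-and-sum beamforming over all transmit/receive element pairs, as below; the array geometry follows the abstract (32 elements, 0.48 mm pitch), but the RF data are random placeholders and the sampling rate and sound speed are assumed values.

```python
import numpy as np

C = 1540.0            # assumed speed of sound [m/s]
FS = 40e6             # assumed sampling frequency [Hz]
N_ELEM, PITCH = 32, 0.48e-3
elem_x = (np.arange(N_ELEM) - (N_ELEM - 1) / 2) * PITCH   # element x-positions [m]

def sta_das(rf, points):
    """Delay-and-sum: rf[tx, rx, sample]; points is an (N, 2) array of (x, z) in metres."""
    image = np.zeros(len(points))
    for tx in range(N_ELEM):
        for rx in range(N_ELEM):
            # two-way path: transmit element -> point -> receive element
            d = (np.hypot(points[:, 0] - elem_x[tx], points[:, 1]) +
                 np.hypot(points[:, 0] - elem_x[rx], points[:, 1]))
            idx = np.round(d / C * FS).astype(int)
            valid = idx < rf.shape[2]
            image[valid] += rf[tx, rx, idx[valid]]
    return image

if __name__ == "__main__":
    rf = np.random.default_rng(0).normal(size=(N_ELEM, N_ELEM, 2048))  # placeholder RF data
    pts = np.column_stack([np.zeros(128), np.linspace(5e-3, 35e-3, 128)])  # one image line
    print(sta_das(rf, pts).shape)
```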

A Study on Polymer Coated Colour Pigments for Water-Based Ink

Pigments covered by film-forming polymers have opened a prospect for improving the quality of water-based printing inks. In this study such pigments were prepared by the initiated polymerization of styrene and methacrylate-derivative monomers in aqueous pigment dispersions. The formation of polymer films covering the pigment cores depends on the polymerization time and the ratio of pigment to monomers: at a polymerization time of 4 hours and a ratio of 1/10, almost all pigment particles are coated by the polymer. The polymer shells formed on the pigments have an average thickness of 5.95 nm. The percentage increase in size of the coated particles after a week is 4.5%, about fourteen-fold lower than that of the original particles. The results obtained indicate that the coated pigments have improved dispersion stability in the aqueous medium while the optical colour is preserved.

Factors Affecting the e-Business Adoption among the Home-Based Businesses (HBBs) in Malaysia

Research in e-Business has been growing tremendously, covering all related aspects such as adoption issues, e-Business models, strategies, etc. This research aims to explore the potential of adopting e-Business for micro-sized businesses operating from home, known as home-based businesses (HBBs). In Malaysia, the HBB industry started many years ago and was mostly run by women or housewives as a part-time job to support the family economy. Today, things have changed. The availability of Internet technology and the emergence of the e-Business concept have promoted the evolution of HBBs, which have been adopted as an alternative professional career for women who do not wish to neglect their family needs, especially the children. Although this study is confined to a limited sample size and is subject to geographical bias, the findings concur with previous large-scale studies. In this study, both qualitative and quantitative methods were used, and data were gathered using triangulation via interviews, direct observation, document analysis and survey questionnaires. This paper discusses the literature review, research methods and findings pertaining to the e-Business adoption factors that influence HBBs in Malaysia.

BIDENS: Iterative Density Based Biclustering Algorithm With Application to Gene Expression Analysis

Biclustering is a very useful data mining technique for identifying patterns in which different genes are correlated under a subset of conditions in gene expression analysis. Association rule mining is an efficient approach to biclustering, as in the BIMODULE algorithm, but it is sensitive to the values of its input parameters and to the discretization procedure used in the preprocessing step; moreover, when noise is present, classical association rule miners discover multiple small fragments of the true bicluster but miss the true bicluster itself. This paper formally presents a generalized noise-tolerant bicluster model, termed μBicluster. An iterative algorithm based on the proposed model, termed BIDENS, is introduced that can discover a set of k possibly overlapping biclusters simultaneously. Our model uses a more flexible method of partitioning the dimensions in order to preserve meaningful and significant biclusters. The proposed algorithm allows the discovery of biclusters that are hard to discover with BIMODULE. An experimental study on yeast and human gene expression data and several artificial datasets shows that our algorithm offers substantial improvements over several previously proposed biclustering algorithms.
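
For a concrete notion of bicluster coherence, the sketch below scores a candidate submatrix with Cheng and Church's mean squared residue; this classical criterion is used purely for illustration and is not the paper's noise-tolerant μBicluster model.

```python
import numpy as np

def mean_squared_residue(expr, rows, cols):
    """Lower MSR means the (rows x cols) submatrix behaves more coherently."""
    sub = expr[np.ix_(list(rows), list(cols))]
    row_mean = sub.mean(axis=1, keepdims=True)
    col_mean = sub.mean(axis=0, keepdims=True)
    residue = sub - row_mean - col_mean + sub.mean()
    return float((residue ** 2).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expr = rng.normal(size=(20, 10))
    # plant a coherent additive pattern on genes 0-4 under conditions 0-3
    expr[:5, :4] = np.arange(5)[:, None] + np.arange(4)[None, :]
    print(mean_squared_residue(expr, range(5), range(4)))      # ~0 (coherent bicluster)
    print(mean_squared_residue(expr, range(5, 10), range(4)))  # larger (random background)
```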