Classification of Ground Water Resources for Emergency Supply

The article deals with the classification of alternative water resources in terms of potential risks, which is a prerequisite for incorporating these resources into emergency plans. The classification is based on the quantification of risks resulting from possible damage, disruption or total destruction of a water resource by natural and anthropogenic hazards, on the assessment of water quality and availability, on the traffic accessibility of the assessed resource and, finally, on its water yield. The aim is to support the development of an integrated rescue system capable of supplying the population with drinking water throughout the stricken territory during states of emergency.
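Purely as an illustration of how such a multi-criteria classification could be operationalized (the criterion names, rating scales and weights below are assumptions, not values from the article), a weighted scoring of candidate resources might look like this:

# Hypothetical weighted scoring of alternative water resources.
# Criterion names, 1-5 rating scales (1 = best, 5 = worst) and weights are
# illustrative assumptions, not values from the article.
CRITERIA_WEIGHTS = {
    "natural_hazard_risk": 0.25,
    "anthropogenic_hazard_risk": 0.25,
    "water_quality": 0.20,
    "traffic_accessibility": 0.15,
    "yield": 0.15,
}

def resource_score(ratings: dict) -> float:
    """Weighted sum of criterion ratings; a lower score means a more suitable resource."""
    return sum(CRITERIA_WEIGHTS[c] * ratings[c] for c in CRITERIA_WEIGHTS)

resources = {
    "well A": {"natural_hazard_risk": 2, "anthropogenic_hazard_risk": 1,
               "water_quality": 2, "traffic_accessibility": 3, "yield": 2},
    "spring B": {"natural_hazard_risk": 4, "anthropogenic_hazard_risk": 2,
                 "water_quality": 1, "traffic_accessibility": 2, "yield": 4},
}

# Rank candidate resources for inclusion in an emergency supply plan.
for name, ratings in sorted(resources.items(), key=lambda kv: resource_score(kv[1])):
    print(f"{name}: score {resource_score(ratings):.2f}")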

A Visual Cryptography and Statistics Based Method for Ownership Identification of Digital Images

In this paper, a novel copyright protection scheme for digital images based on Visual Cryptography and Statistics is proposed. In our scheme, the theory and properties of the sampling distribution of means and of visual cryptography are employed to meet the requirements of robustness and security. Our method does not need to alter the original image and can identify the ownership without resorting to the original image. In addition, our method allows multiple watermarks to be registered for a single host image without damaging previously hidden watermarks. Moreover, our scheme also makes it possible to cast a larger watermark into a smaller host image. Finally, experimental results show the robustness of our scheme against several common attacks.
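A minimal sketch of this kind of construction is given below; it illustrates the general idea of deriving a master share from a sampling-distribution-of-means feature and combining it with the watermark into an ownership share, but the sampling parameters, secret key and share-generation rule are illustrative assumptions rather than the exact scheme of the paper:

# Simplified sketch of a VC-style ownership share construction (illustrative only).
import numpy as np

def master_share(host, n_bits, key=12345, sample_size=32):
    """Binary feature sequence: compare the mean of a keyed random pixel sample
    against the global mean (sampling distribution of means)."""
    rng = np.random.default_rng(key)            # the secret key makes the sampling reproducible
    flat = host.astype(float).ravel()
    idx = rng.integers(0, flat.size, size=(n_bits, sample_size))
    return (flat[idx].mean(axis=1) >= flat.mean()).astype(np.uint8)

def ownership_share(master, watermark_bits):
    """XOR-style rule: stacking the two shares recovers the watermark
    without modifying the host image."""
    return np.bitwise_xor(master, watermark_bits)

rng = np.random.default_rng(1)
host = rng.integers(0, 256, size=(128, 128))             # stand-in for the host image
watermark = rng.integers(0, 2, size=64, dtype=np.uint8)  # stand-in watermark bits

own = ownership_share(master_share(host, watermark.size), watermark)

# Ownership verification: regenerate the master share from the (possibly attacked)
# image and stack it with the registered ownership share.
recovered = np.bitwise_xor(master_share(host, watermark.size), own)
print("watermark recovered:", np.array_equal(recovered, watermark))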

Analysis of Endovascular Graft Features Affecting Endotension Following Endovascular Aneurysm Repair

Endovascular aneurysm repair is a relatively new, minimally invasive treatment for patients with abdominal aortic aneurysm (AAA). This method has potential advantages unmatched by other repair methods. However, enlargement of the aneurysm in the absence of endoleak, known as endotension, may occur as one of its post-operative complications. Endotension is mainly a result of pressure transmitted to the aneurysm sac through the endovascularly installed graft. After installation of the graft, the pressure in the aneurysm sac is reduced significantly but remains non-zero. Several factors affect this transmitted pressure. In this study, the geometric features of the installed graft are considered. It is inferred that the graft neck angle and the iliac bifurcation angle are two factors that affect the drag force on the graft and consequently the pressure transmitted to the aneurysm.
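As a rough illustration of why these angles matter (a textbook control-volume momentum balance with assumed pressures, flow rate and graft diameters, not the computational model used in the study), the displacement force on a bifurcated graft can be estimated as follows:

# Illustrative control-volume estimate of the displacement (drag) force on a bifurcated graft.
import numpy as np

def flow_direction(angle_deg):
    # Unit vector of the flow direction, tilted by angle_deg from the body axis.
    return np.array([np.sin(np.radians(angle_deg)), -np.cos(np.radians(angle_deg))])

def drag_force(neck_angle_deg, bifurcation_angle_deg,
               p_in=12_000.0, p_out=11_500.0,   # assumed mean pressures, Pa
               q_in=5.0e-5,                     # assumed flow rate, m^3/s (about 3 L/min)
               d_in=0.024, d_out=0.014):        # assumed graft diameters, m
    rho = 1060.0                                # blood density, kg/m^3
    a_in, a_out = np.pi * d_in**2 / 4, np.pi * d_out**2 / 4
    v_in, v_out = q_in / a_in, (q_in / 2) / a_out

    n_in = flow_direction(neck_angle_deg)       # inlet tilted by the neck angle
    # the two iliac limbs spread symmetrically by half the bifurcation angle
    n_out = [flow_direction(+bifurcation_angle_deg / 2),
             flow_direction(-bifurcation_angle_deg / 2)]

    # Steady control-volume momentum balance: force exerted by the blood on the graft.
    force = (p_in * a_in + rho * q_in * v_in) * n_in
    for n in n_out:
        force -= (p_out * a_out + rho * (q_in / 2) * v_out) * n
    return np.linalg.norm(force)                # magnitude in newtons

for neck, bif in [(0, 30), (30, 30), (30, 90)]:
    print(f"neck {neck:>2} deg, bifurcation {bif:>2} deg -> |F| ~ {drag_force(neck, bif):.2f} N")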

Effects of Stream Tube Numbers on Flow and Sediments Using GSTARS-3: A Case Study of the Karkheh Reservoir Dam in Western Dezful

The flow and sedimentation processes in reservoir dams can be simulated by two methods: physical and mathematical modeling. The study area extended from the Jelogir hydrometric station to the Karkheh reservoir dam, and the aim was to investigate the effects of the number of stream tubes on the behavior of the GSTARS-3 model. The methodology was to run the model with up to five stream tubes in order to observe the influence of each scenario on the longitudinal profile, cross-sections, flow velocity and bed load sediment size. The results suggest that using two or more stream tubes, which yields a semi-two-dimensional model, produces results closer to the observational data than modeling with a single stream tube. Moreover, modeling with three stream tubes was shown to yield results relatively close to the observational data. The overall conclusion is that varying the number of stream tubes significantly influences the modeling behavior with respect to the bed load sediment size.

Evaluation of the ANN Based Nonlinear System Models in the MSE and CRLB Senses

The system identification problem seeks a suitably parameterized model representing a given process. The parameters of the model are adjusted to optimize a performance function based on the error between the given process output and the output of the identified model. The field of linear system identification is well established, with many classical approaches, whereas most of those methods cannot be applied to nonlinear systems. The problem becomes harder if the system is completely unknown and only the output time series is available. It has been reported that the capability of Artificial Neural Networks to approximate linear and nonlinear input-output maps makes them particularly suitable for the identification of nonlinear systems where only the output time series is available [1][2][4][5]. The work reported here implements a few of the well-known algorithms in the context of nonlinear system modeling and compares their performance to establish their relative merits and demerits.
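As a minimal sketch of output-only identification with a neural network (the toy system, lag order and network size below are illustrative assumptions, not the configurations compared in the paper), a small MLP can be fitted as an auto-regressive model of the output time series and scored by its MSE:

# Identify an unknown nonlinear system from its output time series alone by fitting
# a small neural-network auto-regressive (NAR) model and evaluating the one-step MSE.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Output-only time series from a toy nonlinear recursion (stand-in for the unknown plant).
N = 1000
y = np.zeros(N)
for k in range(2, N):
    y[k] = 0.6 * np.sin(y[k - 1]) + 0.3 * y[k - 2] + 0.01 * rng.standard_normal()

lags = 2
X = np.column_stack([y[lags - i - 1:N - i - 1] for i in range(lags)])  # [y[k-1], y[k-2]]
t = y[lags:]                                                           # target y[k]

split = int(0.7 * len(t))
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
model.fit(X[:split], t[:split])

mse = mean_squared_error(t[split:], model.predict(X[split:]))
print(f"one-step-ahead prediction MSE on held-out data: {mse:.2e}")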

Integrating the Theory of Constraints and Six Sigma in Manufacturing Process Improvement

Six Sigma is a well-known discipline that reduces variation using advanced statistical tools and the DMAIC model. By integrating Goldratt's Theory of Constraints, the Five Focusing Steps and Systems Thinking tools, Six Sigma projects can be selected where they will have the greatest impact on the company. This research defines an integrated model of Six Sigma and constraint management that provides a step-by-step guide using the original methodologies of each discipline. The model is evaluated in a case study of a V8 automobile engine monoblock production line, resulting in an increase in line capacity from 18.7 to 22.4 pieces per hour, a 60% reduction of work-in-process and a 0.73% decrease in variation.

Trimmed Mean as an Adaptive Robust Estimator of a Location Parameter for Weibull Distribution

One of the purposes of robust estimation is to reduce the influence of outliers in the data on the estimates. Outliers arise from gross errors or from contamination by distributions with long tails. The trimmed mean is a robust estimate; that is, it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimate when the trimming proportion is determined from the data rather than being fixed a priori. The main objective of this study is to establish the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point and influence function. Specifically, it seeks to find the magnitude of the trimming proportion of the adaptive trimmed mean that yields efficient and robust estimates of the parameter for data following a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, for data-driven trimming proportions and for trimming proportions fixed a priori. The asymptotic tail lengths, defined as the ratio of two trimmed means, and the asymptotic variances were computed using the formulas derived. The standard deviations of the derived tail lengths for samples of size 40 simulated from a Weibull distribution were computed over 100 iterations using a computer program written in Pascal. The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; that the tail-length measure and the adaptive trimmed mean are asymptotically independent as the number of observations n approaches infinity; that the tail length is asymptotically distributed as the ratio of two independent normal random variables; and that the asymptotic variances decrease as the trimming proportions increase. The simulation study revealed empirically that the standard error of the adaptive trimmed mean based on the ratio of tail lengths is smaller, across the trimming proportions considered, than that of its counterpart with the trimming proportion fixed a priori.
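The following minimal sketch shows the general mechanics of an adaptive trimmed mean on Weibull-type data; the tail-length statistic is formed as a ratio of two trimmed means as in the study, but the particular trimming levels used in the ratio and the cut-offs mapping tail length to a trimming proportion are hypothetical choices for illustration:

# Adaptive trimmed mean: the trimming proportion is chosen from a tail-length statistic
# defined as a ratio of two trimmed means. Cut-off values are hypothetical, not the paper's rule.
import numpy as np
from scipy.stats import trim_mean

rng = np.random.default_rng(42)
x = rng.weibull(0.5, size=40)          # heavy-tailed sample of size 40 (shape 1/2)

def tail_length(sample, light=0.05, heavy=0.25):
    """Ratio of a lightly trimmed mean to a heavily trimmed mean; values well above 1 signal long tails."""
    return trim_mean(sample, light) / trim_mean(sample, heavy)

def adaptive_trimmed_mean(sample):
    q = tail_length(sample)
    # hypothetical mapping from tail length to trimming proportion
    if q < 1.5:
        alpha = 0.05
    elif q < 2.5:
        alpha = 0.15
    else:
        alpha = 0.25
    return trim_mean(sample, alpha), alpha

estimate, alpha = adaptive_trimmed_mean(x)
print(f"tail length = {tail_length(x):.2f}, chosen trimming proportion = {alpha}, "
      f"adaptive trimmed mean = {estimate:.3f}")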

Synthesis of Unconventional Materials Using Chitosan and Crown Ether for Selective Removal of Precious Metal Ions

The polyfunctional and highly reactive biopolymer chitosan was first regioselectively converted into dialkylated chitosan using a dimsyl anion solution (NaH in DMSO) and bromodecane, after protecting the amino groups with phthalic anhydride. Dibenzo-18-crown-6 ether, on the other hand, was converted into its carbonyl derivative via the Duff reaction prior to incorporation into chitosan by Schiff base formation. The diformylated dibenzo-18-crown-6 ether thus formed was condensed with the lipophilic chitosan to prepare the novel solvent extraction reagent. The products were characterized mainly by IR and 1H-NMR. The multidentate, crown-ether-embedded polyfunctional biomaterial was then tested for the extraction of Pd(II) and Pt(IV) from aqueous solution.

Perceptions of Health Risks amongst Tertiary Education Students in Mauritius

A personal estimate of a health risk may not correspond to a scientific assessment of that risk. Hence, there is a need to investigate perceived health risks among the public. In this study, a young, educated and healthy group of people from a tertiary institute were questioned about their health concerns. Ethics clearance was obtained and data were collected by means of a questionnaire; 362 students participated in the study. Tobacco use, heavy alcohol drinking, illicit drugs, unsafe sex and potential carcinogens were perceived to be the five greatest threats to health in this cohort. On the other hand, natural health products, unemployment, unmet contraceptive needs, family violence and homelessness were perceived as the smallest health risks. Nutrition-related health risks, as well as health risks due to physical inactivity and obesity, were not perceived as major health threats. Such a study of health perceptions may guide health promotion campaigns.

A File Splitting Technique for Reducing the Entropy of Text Files

A novel file splitting technique for the reduction of the nth-order entropy of text files is proposed. The technique is based on mapping the original text file into a non-ASCII binary file using a new codeword assignment method; the resulting binary file is then split into several subfiles, each containing one or more bits from each codeword of the mapped binary file. The statistical properties of the subfiles are studied, and it is found that they reflect the statistical properties of the original text file, which is not the case when the ASCII code is used as the mapper. The nth-order entropies of these subfiles are determined, and it is found that their sum is less than the entropy of the original text file for the same extension orders. These statistical properties of the resulting subfiles can be exploited to achieve better compression ratios when conventional compression techniques are applied to the subfiles individually, on a bit-wise rather than a character-wise basis.
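The splitting idea can be illustrated with a short sketch: characters are mapped to fixed-length codewords (a simple rank-based assignment standing in for the paper's codeword assignment method), the bit stream is split by bit position into subfiles, and first-order entropies are compared:

# Map characters to fixed-length codewords, split the bit stream by bit position into
# subfiles, and compare first-order entropies (the paper works with nth-order entropies).
import math
from collections import Counter

def entropy(symbols):
    """First-order (order-0) entropy in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = "this is a small example text used to illustrate the file splitting idea " * 20

# Rank-based codeword assignment: frequent characters get the smallest codes.
alphabet = [ch for ch, _ in Counter(text).most_common()]
bits_per_symbol = max(1, math.ceil(math.log2(len(alphabet))))
code = {ch: format(i, f"0{bits_per_symbol}b") for i, ch in enumerate(alphabet)}

codewords = [code[ch] for ch in text]

# Subfile i collects bit i of every codeword.
subfiles = ["".join(cw[i] for cw in codewords) for i in range(bits_per_symbol)]

print(f"entropy of original text: {entropy(text):.3f} bits/char")
for i, sub in enumerate(subfiles):
    print(f"entropy of subfile {i}:   {entropy(sub):.3f} bits/bit")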

Optimization of Growth of Rhodobacter sphaeroides Using Mixed Volatile Fatty Acids by Response Surface Methodology

A combination of photosynthetic bacteria and anaerobic acidogenic bacteria is an ideal option for efficient hydrogen production. In the present study, the optimum substrate concentrations for the growth of Rhodobacter sphaeroides were found by response surface methodology. The optimum combination of three individual fatty acids was determined by a Box-Behnken design. Increasing the volatile fatty acid concentration decreased growth. The combination of sodium acetate and sodium propionate was the most significant for the growth of the organism. The results showed that a maximum biomass concentration of 0.916 g/l was obtained when the concentrations of acetate, propionate and butyrate were 0.73 g/l, 0.99 g/l and 0.799 g/l, respectively. Growth was then studied at the optimum volatile fatty acid concentrations, a light intensity of 3000 lux, an initial pH of 7 and a temperature of 35°C. A maximum biomass concentration of 0.92 g/l was obtained, which verified the practicability of this optimization.
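The response-surface step can be sketched as follows: a second-order model is fitted to Box-Behnken style runs for the three factors and its stationary point is located. The data below are synthetic placeholders centred near the optimum reported in the abstract, not the experimental measurements of the study:

# Fit a second-order response surface for three factors (acetate, propionate, butyrate)
# and locate the stationary point of the fitted model.
import numpy as np

rng = np.random.default_rng(3)

def quadratic_design_matrix(X):
    a, p, b = X.T
    return np.column_stack([np.ones(len(X)), a, p, b,
                            a * p, a * b, p * b,
                            a**2, p**2, b**2])

# Synthetic runs (g/l) and a made-up biomass response (g/l) peaking near the reported optimum.
X = rng.uniform(0.2, 1.2, size=(15, 3))
true_surface = lambda a, p, b: 0.9 - 0.5*(a - 0.7)**2 - 0.4*(p - 1.0)**2 - 0.3*(b - 0.8)**2
y = true_surface(*X.T) + 0.01 * rng.standard_normal(len(X))

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# Stationary point of the fitted surface: solve grad = 0 for the quadratic model.
b_lin = beta[1:4]
B = np.array([[2 * beta[7], beta[4],     beta[5]],
              [beta[4],     2 * beta[8], beta[6]],
              [beta[5],     beta[6],     2 * beta[9]]])
optimum = np.linalg.solve(B, -b_lin)
print("fitted optimum concentrations (acetate, propionate, butyrate):", np.round(optimum, 2))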

Determination of Q and R Matrices for Optimal Pitch Aircraft Control

In this paper, the process of obtaining the Q and R matrices for an optimal pitch aircraft control system is described. Since the introduction of the optimal control method, the determination of the Q and R matrices for such a system has not been fully specified. The values of Q and R for the optimal pitch aircraft control application have been simulated and calculated. Suitable results for Q and R are identified through the performance index (PI): if the PI is small enough, the Q and R values are considered suitable for that particular optimal control system. Moreover, the same value of PI can be obtained with different sets of Q and R. Because there is no fixed rule for determining the Q and R matrices, a specific method is proposed to find rough values of Q and R corresponding to a relatively small PI.
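A minimal sketch of such a search is shown below: for candidate diagonal Q matrices and scalar R, the LQR Riccati equation is solved and the performance index J = x0' P x0 is evaluated for a chosen initial pitch disturbance. The state-space model, candidate values and initial condition are illustrative assumptions, not the aircraft data of the paper:

# Sweep candidate Q and R, solve the continuous-time algebraic Riccati equation and
# compare the resulting performance indices.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[-0.5,  1.0,  0.0],      # placeholder pitch-axis dynamics (alpha, q, theta)
              [-2.0, -1.5,  0.0],
              [ 0.0,  1.0,  0.0]])
B = np.array([[0.1], [-4.0], [0.0]])    # elevator input
x0 = np.array([0.0, 0.0, 0.2])          # initial pitch-angle disturbance (rad)

def performance_index(q_diag, r):
    Q = np.diag(q_diag)
    R = np.array([[r]])
    P = solve_continuous_are(A, B, Q, R)
    return x0 @ P @ x0                  # J = x0' P x0 under the optimal gain

candidates = [([1, 1, 10], 1.0), ([1, 1, 100], 1.0), ([1, 1, 10], 0.1)]
for q_diag, r in sorted(candidates, key=lambda c: performance_index(*c)):
    print(f"Q = diag{q_diag}, R = {r}: PI = {performance_index(q_diag, r):.3f}")

Note that each PI here is measured against its own Q and R weighting, so the comparison mirrors the PI-based selection described in the abstract rather than a like-for-like cost comparison.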

Prediction of the Dynamic Characteristics of a Milling Machine Using the Integrated Model of Machine Frame and Spindle Unit

The machining performance is determined by the frequency characteristics of the machine-tool structure and the dynamics of the cutting process. Therefore, the prediction of the dynamic vibration behavior of the spindle tool system is of great importance for the design of a machine tool capable of high-precision and high-speed machining. The aim of this study is to develop a finite element model to predict the dynamic characteristics of a milling machine tool and hence to evaluate the influence of the preload of the spindle bearings. For this purpose, a three-dimensional spindle-bearing model of a high-speed engraving spindle tool was created. In this model, rolling interfaces with contact stiffness defined by the Harris model were used to simulate the spindle bearing components. A full finite element model of a vertical milling machine was then established by coupling the spindle tool unit with the machine frame structure. Using this model, the vibration mode that had a dominant influence on the dynamic stiffness was determined. The results of the finite element simulations reveal that spindle bearings with different preloads greatly affect the dynamic behavior of the spindle tool unit and hence the dynamic response of the vertical column milling system. These results were validated by performing vibration tests on the individual spindle tool unit and on the milling machine prototype, respectively. We conclude that the preload of the spindle bearings is an important factor affecting the dynamic characteristics and machining performance of the entire vertical column structure of the milling machine.

Integrating Bioremediation and Phytoremediation to Clean up Polychlorinated Biphenyls Contaminated Soils

This work involved the use of phytoremediation to remediate an aged soil contaminated with polychlorinated biphenyls (PCBs). At the microcosm scale, tests were prepared using soil samples that had been collected in an industrial area, with a total PCB concentration of about 250 μg kg⁻¹. Medicago sativa and Lolium italicum were the species selected in this study, which is used as a feasibility test for full-scale remediation. The experiment was carried out with the addition of a mixture of randomly methylated β-cyclodextrins (RAMEB). At the end of the experiment, analysis of the soil samples showed that, in general, the presence of plants led to higher degradation of most congeners compared with the non-vegetated soil. The efficiencies of the two plant species were comparable and were improved by RAMEB addition, with a final reduction of total PCBs close to 50%. The removal percentage of PCBs progressively decreased with increasing chlorination of the congeners.

A New Performance Characterization of Transient Analysis Method

This paper proposes a new performance characterization for the test strategy for second-order filters known as the Transient Analysis Method (TRAM). We evaluate the ability of this test strategy to detect deviation faults under simultaneous statistical fluctuation of the non-faulty parameters. For this purpose, we use Monte Carlo simulations and a fault model that considers as faulty only one component of the filter under test, while the other components adopt random values (within their tolerance band) drawn from their statistical distributions. The new data reported here show, for the filters under study, the presence of hard-to-test components and relatively low fault coverage values for small deviation faults. These results suggest that the fault coverage value obtained using only nominal values for the non-faulty components (the traditional evaluation of TRAM) seems to be a poor predictor of the test performance.
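The evaluation idea can be sketched as a simple Monte Carlo experiment: one component of a second-order filter carries a deviation fault while the others fluctuate randomly within their tolerance band, and coverage is the fraction of runs in which the fault is flagged. The filter topology, tolerances and acceptance window below are illustrative, and the pass/fail check on f0 and Q is a simplified stand-in for TRAM's transient measurements:

# Monte Carlo fault-coverage sketch for a second-order (unity-gain Sallen-Key low-pass) filter.
import numpy as np

rng = np.random.default_rng(7)
NOMINAL = {"R1": 10e3, "R2": 10e3, "C1": 10e-9, "C2": 10e-9}
TOL = 0.05           # 5 % tolerance band for the non-faulty components
ACCEPT = 0.10        # accept the filter if f0 and Q stay within 10 % of nominal

def f0_q(R1, R2, C1, C2):
    w0 = 1.0 / np.sqrt(R1 * R2 * C1 * C2)
    q = np.sqrt(R1 * R2 * C1 * C2) / (C2 * (R1 + R2))
    return w0 / (2 * np.pi), q

F0_NOM, Q_NOM = f0_q(**NOMINAL)

def coverage(faulty, deviation, trials=2000):
    hits = 0
    for _ in range(trials):
        vals = {k: v * rng.uniform(1 - TOL, 1 + TOL) for k, v in NOMINAL.items()}
        vals[faulty] = NOMINAL[faulty] * (1 + deviation)   # inject the deviation fault
        f0, q = f0_q(**vals)
        if abs(f0 / F0_NOM - 1) > ACCEPT or abs(q / Q_NOM - 1) > ACCEPT:
            hits += 1
    return hits / trials

for comp in NOMINAL:
    print(f"{comp}: coverage of a +20 % deviation fault = {coverage(comp, 0.20):.2f}")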

Extension of Shelf-Life of Potato Round Slices with Edible Coating, Green Tea and Ascorbic Acid

The effects of coatings based on sodium alginate (SA) and carboxymethyl cellulose (CMC) on the color and moisture characteristics of potato round slices were investigated. This is the first time that this combination of polysaccharides has been used as an edible coating; on its own, it had the best performance as an inhibitor of potato discoloration during storage for 15 days at 4°C. When ascorbic acid (AA) and green tea (GT) were added to the above edible coating, its effects on the potato round slices changed. The mixtures of sodium alginate and carboxymethyl cellulose with ascorbic acid or with green tea behaved as a potential moisture barrier, resulting in the extension of the shelf-life of the potato samples. These data suggest that both GT and AA are potential inhibitors of dehydration in potatoes, and not only natural antioxidants.

Non-destructive Watermelon Ripeness Determination Using Image Processing and Artificial Neural Network (ANN)

Agricultural products are in increasing demand in today's market. To increase productivity, automation in producing these products will be very helpful. The purpose of this work is to measure and determine the ripeness and quality of watermelon. The textures of the watermelon skin are captured using a digital camera. These images are filtered using image processing techniques. The information gathered is used to train an ANN to determine watermelon ripeness. Initial results showed that the best model produced an accuracy of 86.51%, obtained with 32 hidden units and a balanced training dataset.
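A compact sketch of such a pipeline is given below: simple texture statistics are extracted from skin-image patches and fed to a small neural network with 32 hidden units. The patches here are synthetic placeholders, and the chosen features and labels are assumptions rather than the exact setup of the work:

# Texture statistics from skin-image patches feed a small ANN classifier (ripe vs. unripe).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def texture_features(patch):
    """Mean intensity, contrast (standard deviation) and a crude edge density."""
    g_rows, g_cols = np.gradient(patch.astype(float))
    edge_density = np.mean(np.hypot(g_rows, g_cols) > 20)
    return [patch.mean(), patch.std(), edge_density]

def fake_patch(ripe):
    """Synthetic grayscale skin patch standing in for a camera image."""
    base = 90 if ripe else 140
    stripes = 40 * np.sin(np.linspace(0, 20 if ripe else 8, 64))
    return base + stripes[None, :] + 10 * rng.standard_normal((64, 64))

X = np.array([texture_features(fake_patch(r)) for r in (True, False) for _ in range(100)])
y = np.array([1] * 100 + [0] * 100)   # 1 = ripe, 0 = unripe

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0))
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2%}")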

Spatial Query Localization Method in Limited Reference Point Environment

Object localization is one of the major challenges in creating intelligent transportation systems. Unfortunately, in densely built-up urban areas, localization based on GPS alone produces large errors or simply becomes impossible. New opportunities for localization arise from the rapidly emerging concept of wireless ad-hoc networks. Such a network allows the distances between objects to be estimated from the received signal level and a distance graph to be constructed, in which the nodes are the objects to be localized and the edges are estimates of the distances between pairs of nodes. Given the known coordinates of individual nodes (anchors), it is possible to determine the locations of all (or some) of the remaining nodes of the graph. Moreover, a road map available in digital format can provide localization routines with valuable additional information to narrow the node location search. However, despite the abundance of well-known localization algorithms and significant research efforts, many issues are still only partially addressed. In this paper, we propose a localization approach based on mapping the distance graph onto digital road map data. In fact, the problem is reduced to embedding the distance graph into the graph representing the area's geolocation data. This makes it possible to localize objects, in some cases even if only one reference point is available. We propose a simple embedding algorithm and a sample implementation as spatial queries over sensor network data stored in a spatial database, allowing effective use of spatial indexing, optimized spatial search routines and geometry functions.
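A toy sketch of the embedding idea follows: candidate node positions are sampled along known road segments and the assignment that best matches the measured distance graph is selected, here with a single anchor. The roads, measured distances and brute-force search are illustrative; a practical implementation would issue such candidate and distance checks as spatial queries over a spatial database:

# Constrain unknown nodes to candidate points sampled on road segments and pick the
# assignment that minimizes the mismatch with the measured distance graph.
import itertools
import numpy as np

# Two straight road segments (each given by two endpoints) as a toy digital road map.
roads = [((0, 0), (100, 0)), ((50, -50), (50, 50))]

def sample_road(road, n=21):
    (x0, y0), (x1, y1) = road
    t = np.linspace(0, 1, n)[:, None]
    return np.hstack([x0 + t * (x1 - x0), y0 + t * (y1 - y0)])

candidates = np.vstack([sample_road(r) for r in roads])

anchor = np.array([0.0, 0.0])   # the single known reference point
# Measured distance graph (e.g. from received signal level): anchor-A, anchor-B, A-B.
measured = {("anchor", "A"): 60.0, ("anchor", "B"): 64.0, ("A", "B"): 41.2}

def residual(pa, pb):
    d = {("anchor", "A"): np.linalg.norm(pa - anchor),
         ("anchor", "B"): np.linalg.norm(pb - anchor),
         ("A", "B"): np.linalg.norm(pa - pb)}
    return sum((d[k] - measured[k]) ** 2 for k in measured)

best = min(itertools.product(candidates, repeat=2), key=lambda pair: residual(*pair))
print("estimated positions: A =", np.round(best[0], 1), " B =", np.round(best[1], 1))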

Hi-Fi Traffic Clearance Technique for Life Saving Vehicles using Differential GPS System

This paper may be considered a combination of pervasive computing and Differential GPS (Global Positioning System), applied to control automatic traffic signals in such a way as to pre-empt normal signal operation and let life-saving vehicles pass. If the arrival of the life-saving vehicle is known at the signal in advance, the traffic can be cleared before it arrives. The traffic signal preemption system includes a vehicle equipped with an onboard computer capable of capturing diagnostic information and the estimated location of the life-saving vehicle, using the information provided by a GPS receiver connected to the onboard computer, and transmitting this information through a wireless transmitter over a wireless network. The fleet management system, connected to a wireless receiver, receives the information transmitted by the life-saving vehicle. A computer located at the intersection uses corrected vehicle position, speed and direction measurements, in conjunction with previously recorded data defining the approach routes to the intersection, to determine the optimum time to switch the traffic light controller to preemption mode so that life-saving vehicles can pass safely. For the case when the ambulance needs to make a U-turn in a heavy-traffic area, we suggest a solution: a computerized median that uses removable linked blocks.
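The preemption timing can be sketched as follows: the DGPS fix is projected onto the recorded approach route, the remaining distance and ETA to the intersection are computed, and preemption is requested once the ETA falls below the lead time the controller needs. The route geometry, speeds and lead time are illustrative assumptions:

# ETA-based preemption trigger along a recorded approach route.
import numpy as np

route = np.array([[0.0, 0.0], [200.0, 0.0], [200.0, 300.0]])  # approach polyline (m); last point = intersection
LEAD_TIME_S = 15.0                                            # time the controller needs to switch safely

def remaining_distance(route, position):
    """Along-route distance from the projection of `position` to the intersection."""
    best = None
    for i in range(len(route) - 1):
        a, b = route[i], route[i + 1]
        t = np.clip(np.dot(position - a, b - a) / np.dot(b - a, b - a), 0.0, 1.0)
        proj = a + t * (b - a)
        off_route = np.linalg.norm(position - proj)
        ahead = np.linalg.norm(b - proj) + sum(
            np.linalg.norm(route[j + 1] - route[j]) for j in range(i + 1, len(route) - 1))
        if best is None or off_route < best[0]:
            best = (off_route, ahead)
    return best[1]

def should_preempt(position, speed_mps):
    eta = remaining_distance(route, position) / max(speed_mps, 0.1)
    return eta <= LEAD_TIME_S

for pos in ([150.0, -5.0], [200.0, 150.0]):
    print(pos, "-> preempt:", should_preempt(np.array(pos), speed_mps=12.0))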

On the Sphere Method of Linear Programming Using Multiple Interior Points Approach

The Sphere Method is a flexible interior point algorithm for linear programming problems, developed mainly by Professor Katta G. Murty. It consists of two steps: the centering step and the descent step. The centering step is the most expensive part of the algorithm. In this centering step we propose some improvements, such as introducing two or more initial feasible solutions and selecting the more favorable new solution by objective value while rigorously updating the feasible region, along with some ideas integrated into the descent step. An illustration is given confirming the advantage of the proposed procedure.
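A toy illustration of the multiple-interior-point idea is given below: the inscribed-ball radius at a point x is the minimum normalized slack of Ax <= b, each starting interior point is improved by a crude random-direction search that increases this radius, and the resulting approximate centers are compared by objective value. This is a hedged sketch of the selection idea only, not Murty's LSFN/LSCPD centering procedure:

# Approximate centering from several interior starting points, then select by objective value.
import numpy as np

rng = np.random.default_rng(5)

# LP: minimize c'x subject to A x <= b (a small 2-D example)
c = np.array([-1.0, -2.0])
A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([4.0, 4.0, 0.0, 0.0, 6.0])

def radius(x):
    """Radius of the largest ball centred at x inside {A x <= b}."""
    return np.min((b - A @ x) / np.linalg.norm(A, axis=1))

def approximate_center(x, steps=200):
    """Greedy random-direction line search that increases the inscribed-ball radius."""
    x = x.copy()
    for _ in range(steps):
        d = rng.standard_normal(x.size)
        d /= np.linalg.norm(d)
        for t in (0.5, 0.1, 0.02):
            if radius(x + t * d) > radius(x):
                x = x + t * d
                break
    return x

starts = [np.array([0.5, 0.5]), np.array([3.0, 0.5])]   # two interior feasible points
centers = [approximate_center(x0) for x0 in starts]

# Proposed selection: keep the center with the more favourable objective value.
best = min(centers, key=lambda x: c @ x)
for x0, xc in zip(starts, centers):
    print(f"start {x0} -> center {np.round(xc, 2)}, radius {radius(xc):.2f}, obj {c @ xc:.2f}")
print("selected center:", np.round(best, 2))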