Modification of the Hydrological Regime of the River Vaslui and Its Economic Implications (Romania)

The influence of human activity through dams built along river beds is minor, but the siting of water accumulations directly influences the hydrological regime. The most important effect of damming on the flow of water is a decrease in the frequency of floods, since the reservoirs regulate the river's flow rate. When these reservoirs become dysfunctional, the flow is redistributed in the downstream sector, where maximum flows then reach higher values. In addition to supporting fishing, the reservoirs located on the middle and lower courses of rivers also play a role in mitigating flood waves, thus providing flood protection. The Vaslui also supplies a good part of the town's water needs. The most important lake is Soleşti, close to the Vaslui River, opened in 1974. The hydrological regime of a reservoir is related to both the natural drainage system and its anthropogenic management: depending on design conditions, reservoir manoeuvres drain or fill the water courses.

Organoclay of Cetyl Trimethyl Ammonium-Montmorillonite: Preparation and a Study of the Adsorption of Benzene, Toluene, and 2-Chlorophenol

Contamination of water by aromatic compounds can cause severe, long-lasting effects not only on biotic organisms but also on human health. Several alternative technologies for the remediation of polluted water have been attempted; one of these is the adsorption of aromatic compounds by organically modified clay minerals. The porous structure of clay is a promising property for molecular adsorption, and its adsorptivity can be increased by immobilizing hydrophobic moieties that attract organic compounds. In this work natural montmorillonite was modified with cetyltrimethylammonium (CTMA+) and evaluated as an adsorbent of the aromatic compounds benzene, toluene, and 2-chlorophenol in single- and multicomponent solutions in an ethanol:water solvent. CTMA-montmorillonite was prepared by a simple ion exchange procedure and characterized by X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), and gas sorption analysis. The influence of the structural modification of montmorillonite on its adsorption capacity and its adsorption affinity for organic compounds was studied. Adsorptivity of the montmorillonite was increased by the modification, which is associated with the arrangement of CTMA+ in the structure, even though the specific surface area of the modified montmorillonite was lower than that of the raw montmorillonite. Adsorption rates indicated that the material adsorbs the compounds in the following order of affinity: benzene > toluene > 2-chlorophenol. The adsorption isotherms of benzene and toluene showed first-order adsorption kinetics, indicating a partition phenomenon of the compounds between the aqueous phase and the organophilic CTMA-montmorillonite.
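
As an aside, the first-order (Lagergren) kinetic model mentioned above is straightforward to fit numerically. The sketch below, using hypothetical uptake data rather than the paper's measurements, fits q_t = q_e(1 - exp(-k1 t)) with SciPy:

```python
# Minimal sketch: fitting the pseudo-first-order (Lagergren) kinetic model
# q_t = q_e * (1 - exp(-k1 * t)) to hypothetical uptake data.
# Time points and uptake values below are illustrative, not from the paper.
import numpy as np
from scipy.optimize import curve_fit

def pseudo_first_order(t, qe, k1):
    """Adsorbed amount at time t for equilibrium capacity qe and rate k1."""
    return qe * (1.0 - np.exp(-k1 * t))

t = np.array([5, 10, 20, 40, 60, 120], dtype=float)   # min (hypothetical)
q = np.array([1.1, 1.9, 2.9, 3.8, 4.2, 4.6])          # mg/g (hypothetical)

(qe, k1), _ = curve_fit(pseudo_first_order, t, q, p0=(q.max(), 0.05))
print(f"qe = {qe:.2f} mg/g, k1 = {k1:.3f} 1/min")
```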

An Automated High Pressure Differential Thermal Analysis System for Phase Transformation Studies

A piston-cylinder-based high-pressure differential thermal analyzer system has been developed to investigate phase transformations, melting, glass transitions, crystallization behavior of inorganic materials, glassy systems, etc., at pressures from ambient to 4 GPa and temperatures from room temperature to 1073 K. The pressure is calibrated using the phase transitions of bismuth and ytterbium, and the temperature is calibrated against standard thermocouple reference tables. The system is further calibrated using benzoic acid and ammonium nitrate, and it achieves pressure and temperature control of ±8.9 × 10^-4 GPa and ±2 K, respectively. Phase transitions of AsxTe100-x chalcogenides, ferrous oxide, and strontium boride have been studied using the indigenously developed system.

A Comparison of Experimental Data with Monte Carlo Calculations for Optimisation of the Source-to-Detector Distance in Determining the Efficiency of a LaBr3:Ce (5%) Detector

Cerium-doped lanthanum bromide LaBr3:Ce(5%) crystals are considered to be among the most advanced scintillator materials used in PET scanning, combining a high light yield, fast decay time, and excellent energy resolution. Apart from the correct choice of scintillator, it is also important to optimise the detector geometry, not least the source-to-detector distance, in order to obtain reliable efficiency measurements. In this study a commercially available 25 mm x 25 mm BrilLanCe™ 380 LaBr3:Ce (5%) detector was characterised in terms of its efficiency at varying source-to-detector distances. Gamma-ray spectra of 22Na, 60Co, and 137Cs were separately acquired at distances of 5, 10, 15, and 20 cm. As a result of the change in the solid angle subtended by the detector, the geometric efficiency decreased with increasing distance. High efficiencies at short distances can cause pulse pile-up when subsequent photons arrive before previously detected events have decayed. To reduce this systematic error, the source-to-detector distance should balance efficiency against pulse pile-up suppression, as pile-up corrections would otherwise be necessary at short distances. In addition to the experimental measurements, Monte Carlo simulations were carried out for the same setup, allowing a comparison of results. The advantages and disadvantages of each approach are highlighted.
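
For reference, the geometric efficiency discussed above follows from the solid angle subtended by the detector face. A minimal sketch for an on-axis point source and a 25 mm diameter disc face, using the standard disc solid-angle formula (this is not the authors' code):

```python
# Sketch: on-axis geometric efficiency of a disc-faced detector,
# eps_geom = Omega / (4*pi) with Omega = 2*pi*(1 - d/sqrt(d^2 + r^2)).
# r = 1.25 cm matches the 25 mm diameter crystal; distances follow the abstract.
import math

def geometric_efficiency(d_cm, r_cm=1.25):
    omega = 2.0 * math.pi * (1.0 - d_cm / math.hypot(d_cm, r_cm))
    return omega / (4.0 * math.pi)

for d in (5, 10, 15, 20):
    print(f"d = {d:2d} cm -> eps_geom = {geometric_efficiency(d):.5f}")
```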

A New Heuristic Approach for the Large-Size Zero-One Multi-Knapsack Problem Using an Intercept Matrix

This paper presents a heuristic to solve the large-size 0-1 multi-constrained knapsack problem (01MKP), which is NP-hard. Many researchers have used heuristic operators to identify the redundant constraints of a linear programming problem before applying the regular procedure to solve it. We use the intercept matrix to identify the zero-valued variables of the 01MKP, which are known as redundant variables. In this heuristic, the dominance property of the intercept matrix of the constraints is first exploited to reduce the search space for optimal or near-optimal solutions of the 01MKP; second, we improve the solution by using the pseudo-utility ratio based on the surrogate constraint of the 01MKP. The heuristic is tested on benchmark problems of sizes up to 2500 taken from the literature, and the results are compared with optimum solutions. The space and computational complexity of solving the 01MKP using this approach are also presented. The encouraging results, especially for relatively large test problems, indicate that this heuristic can successfully be used for finding good solutions for highly constrained NP-hard problems.
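
The intercept-matrix construction itself is the subject of the paper; as a rough sketch of the second ingredient, the pseudo-utility ratio based on a surrogate constraint, a greedy pass with simple uniform surrogate multipliers and illustrative data might look like this:

```python
# Sketch of a surrogate-constraint pseudo-utility greedy for the 0-1 multi-
# constrained knapsack problem (01MKP). Not the paper's intercept-matrix
# heuristic; multipliers are uniform here and all data is illustrative.
import numpy as np

def greedy_01mkp(profit, weights, capacity):
    """profit: (n,), weights: (m, n), capacity: (m,). Returns x in {0,1}^n."""
    mu = 1.0 / capacity                      # simple surrogate multipliers
    utility = profit / (mu @ weights)        # pseudo-utility ratio per item
    x = np.zeros(profit.size, dtype=int)
    load = np.zeros_like(capacity, dtype=float)
    for j in np.argsort(-utility):           # best ratio first
        if np.all(load + weights[:, j] <= capacity):
            x[j] = 1
            load += weights[:, j]
    return x, profit @ x

p = np.array([10.0, 6.0, 8.0, 7.0])
W = np.array([[3.0, 2.0, 4.0, 1.0],
              [2.0, 3.0, 1.0, 3.0]])
c = np.array([6.0, 5.0])
x, val = greedy_01mkp(p, W, c)
print(x, val)
```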

Packing Theory for Natural and Crushed Aggregate to Obtain the Best Mix of Aggregate: Research and Development

Concrete performance is strongly affected by the particle packing degree, since it determines the distribution of the cementitious component and the interaction of mineral particles. By using packing theory, designers will be able to select optimal aggregate materials for preparing concrete with a low cement content, which is beneficial from the point of view of cost. Optimum particle packing implies minimizing porosity and thereby reducing the amount of cement paste needed to fill the voids between the aggregate particles, while also taking the rheology of the concrete into consideration. Superplasticizers are required to reach good fluidity. The results from pilot tests at Luleå University of Technology (LTU) were compared with various forms of the proposed theoretical models, and the empirical approach taken in the study seems to provide a safer basis for developing new, improved packing models.
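
The abstract does not name the packing models that were tested; for orientation, one widely used target curve in aggregate packing optimization is the modified Andreasen and Andersen grading model, sketched below with illustrative sieve sizes:

```python
# Sketch: modified Andreasen & Andersen target grading curve, a common
# particle-packing model (the abstract does not name the models it tested).
# P(d) = (d^q - d_min^q) / (d_max^q - d_min^q), with q roughly 0.25-0.35
# typical for flowable concrete; sieve sizes below are illustrative.
import numpy as np

def target_passing(d, d_min=0.063, d_max=16.0, q=0.3):
    return (d**q - d_min**q) / (d_max**q - d_min**q)

sieves = np.array([0.125, 0.25, 0.5, 1, 2, 4, 8, 16])  # mm
for d, p in zip(sieves, target_passing(sieves)):
    print(f"{d:6.3f} mm : {100*p:5.1f} % passing")
```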

The Effect of Press Fit on Osseointegration of Acetabular Cup

The primary cause of total hip replacement (THR) failure in younger patients is aseptic loosening. This complication is twice as likely to occur in the acetabular cup as in the femoral stem. Excessive micromotion between bone and implant causes loosening, and it depends on patient activity, age, and bone quality. In this project, the effects of different metal-back press-fit designs on osseointegration of the acetabular cup are investigated. Commercial acetabular cup designs, namely Spiked, Superfix, and Quadrafix, are modelled and analyzed using commercial finite element software. The diameter of the acetabular cup is based on the diameter of the acetabular rim, to make sure the component fits the acetabular cavity. A new acetabular cup design is proposed and analyzed to obtain better osseointegration at the bone-implant interface. Results show that the proposed design is more stable than the other designs with respect to stress and displacement.

Improvement of Semen Quality in Holstein Bulls during Heat Stress by Supplementing Omega-3 Fatty Acids

The aim of the current study was to investigate changes in the quality parameters of Holstein bull semen during heat stress and the effect of feeding a source of omega-3 fatty acids in this period. Samples were obtained from 19 Holstein bulls during the expected period of heat stress in Iran (June to September 2009). The control group (n=10) was fed a standard concentrate feed, while the treatment group (n=9) had this feed top-dressed with 100 g of an omega-3-enriched nutraceutical. Semen quality was assessed on ejaculates collected after 1, 5, 9, and 12 weeks of supplementation. Computer-assisted assessment of sperm motility, viability (eosin-nigrosin), and the hypo-osmotic swelling test (HOST) were conducted. Heat stress affected sperm quality parameters by weeks 5 and 9 (p

An Adequate Choice of Initial Sample Size for a Selection Approach

In this paper, we consider the effect of the initial sample size on the performance of a sequential approach used in selecting a good enough simulated system when the number of alternatives is very large. We implement the sequential approach on an M/M/1 queuing system under several parameter settings, with different choices of the initial sample size, to explore the impact on the performance of this approach. The results show that the choice of the initial sample size does affect the performance of our selection approach.
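
To illustrate the role of the initial sample size, the sketch below estimates the mean waiting time of an M/M/1 queue from n0 initial observations via the Lindley recursion; it is a simplified stand-in, not the paper's full sequential selection procedure:

```python
# Sketch: effect of the initial sample size n0 on the first-stage estimate of
# mean waiting time in an M/M/1 queue (Lindley recursion). Illustrative only;
# the paper's sequential selection procedure is not reproduced here.
import random

def mm1_waiting_times(lam, mu, n, seed=1):
    rng = random.Random(seed)
    w, out = 0.0, []
    for _ in range(n):
        out.append(w)
        s = rng.expovariate(mu)        # service time of current customer
        a = rng.expovariate(lam)       # interarrival time to next customer
        w = max(0.0, w + s - a)        # Lindley recursion
    return out

# Theoretical mean waiting time for lam=0.8, mu=1.0 is 0.8/(1.0-0.8) = 4.0.
for n0 in (10, 50, 200, 1000):
    w = mm1_waiting_times(lam=0.8, mu=1.0, n=n0)
    print(f"n0 = {n0:4d}: mean waiting time estimate = {sum(w)/len(w):.3f}")
```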

Image Search by Features of a Sorted Gray-Level Histogram Polynomial Curve

Image searching has always been a problem, especially when images are not properly managed or are distributed over different locations. Different techniques are currently used for image search. At one extreme, many features of the image are captured and stored to get better results, but storing and managing such features is itself a time-consuming job. At the other extreme, if fewer features are stored, the accuracy rate is not satisfactory. The same image stored with different visual properties can further reduce the accuracy. In this paper we present a new concept: using polynomials fitted to the sorted histogram of the image. This approach needs less overhead and can cope with differences in the visual features of an image.
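
A minimal sketch of the signature as we read it (the polynomial degree, bin count, and distance metric are our choices, not necessarily the paper's):

```python
# Sketch of the idea described above: sort the gray-level histogram, fit a
# low-degree polynomial to the sorted curve, and use the coefficients as a
# compact image signature for comparison.
import numpy as np

def histogram_poly_signature(gray_image, degree=5):
    """gray_image: 2-D uint8 array. Returns polynomial coefficients."""
    hist, _ = np.histogram(gray_image, bins=256, range=(0, 256))
    hist = np.sort(hist) / hist.sum()          # sorted, normalized histogram
    x = np.linspace(0.0, 1.0, 256)
    return np.polyfit(x, hist, degree)         # compact feature vector

def signature_distance(sig_a, sig_b):
    return float(np.linalg.norm(sig_a - sig_b))

# Illustrative usage with a synthetic image:
img = (np.random.default_rng(0).normal(128, 30, (64, 64))
       .clip(0, 255).astype(np.uint8))
print(histogram_poly_signature(img))
```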

Role of Acoustic Pressure on the Dynamics of Moving Single-Bubble Sonoluminescence

The role of the acoustic driving pressure in the translational-radial dynamics of a moving single-bubble sonoluminescence (m-SBSL) system has been numerically investigated. The results indicate that an increase in the amplitude of the driving pressure leads to an increase in the bubble's peak temperature. The length and shape of the bubble's trajectory depend on the acoustic pressure, and because of the spatial dependence of the radial dynamics of the moving bubble, its peak temperature varies during the acoustic pulses. The results are in good agreement with experimental reports on m-SBSL.
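
The paper's coupled translational-radial model is not reproduced in the abstract; the radial part of such models is typically a Rayleigh-Plesset-type equation. A radial-only sketch under a sinusoidal driving pressure, with typical SBSL-like parameters, is given below:

```python
# Sketch: radial-only Rayleigh-Plesset dynamics under a sinusoidal driving
# pressure (the paper couples this with translational motion, not shown here).
# Parameters are typical SBSL-like values for an air bubble in water.
import numpy as np
from scipy.integrate import solve_ivp

rho, sigma, mu = 998.0, 0.0725, 1.0e-3   # water: density, surface tension, viscosity
P0, R0, kappa = 101325.0, 4.5e-6, 1.4    # ambient pressure, rest radius, polytropic exp.
Pa, f = 1.3 * P0, 26.5e3                 # driving amplitude and frequency
w = 2 * np.pi * f

def rp(t, y):
    R, Rdot = y
    p_gas = (P0 + 2 * sigma / R0) * (R0 / R) ** (3 * kappa)
    p_ext = P0 + Pa * np.sin(w * t)
    Rddot = (p_gas - p_ext - 4 * mu * Rdot / R - 2 * sigma / R) / (rho * R) \
            - 1.5 * Rdot**2 / R
    return [Rdot, Rddot]

sol = solve_ivp(rp, (0, 2 / f), [R0, 0.0], rtol=1e-8, atol=1e-12, max_step=1e-8)
print(f"max radius = {sol.y[0].max()*1e6:.1f} um, "
      f"min radius = {sol.y[0].min()*1e6:.3f} um")
```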

Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm

With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. Indexing is one of the approaches that have been investigated. Indexing methods fall into three categories: length-based index algorithms, transformation-based algorithms, and mixed-technique algorithms. In this research we focus on the transformation-based methods. We embed the N-gram method into a transformation-based method to build an inverted index table, and we then apply parallel methods to speed up the index building time and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the N-gram transformation algorithm is an economical solution: it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results for the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.
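
A minimal sketch of the core idea, an inverted index built from n-grams and parallelized across sequences (the sequence data is illustrative, and the paper's actual data layout may differ):

```python
# Sketch: building an inverted index over protein sequences via n-gram
# transformation, parallelized across sequences with multiprocessing.
# Sequence data is illustrative; n = 5 follows the abstract's observation.
from collections import defaultdict
from multiprocessing import Pool

def ngrams_of(args):
    seq_id, seq, n = args
    return seq_id, {seq[i:i + n] for i in range(len(seq) - n + 1)}

def build_index(sequences, n=5, workers=4):
    index = defaultdict(set)                 # n-gram -> set of sequence ids
    with Pool(workers) as pool:
        for seq_id, grams in pool.map(
                ngrams_of, [(i, s, n) for i, s in enumerate(sequences)]):
            for g in grams:
                index[g].add(seq_id)
    return index

if __name__ == "__main__":
    seqs = ["MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "MKVLAYIAKQRQISF"]
    idx = build_index(seqs)
    print(len(idx), "distinct 5-grams;", idx.get("YIAKQ"))
```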

Natural Convection Boundary Layer Flow of a Viscoelastic Fluid on a Solid Sphere with Newtonian Heating

This paper considers the steady free convection boundary layer flow of a viscoelastic fluid on a solid sphere with Newtonian heating. The boundary layer equations are of an order higher than those for a Newtonian (viscous) fluid, and the adherence boundary conditions are insufficient to determine their solution completely; thus, the augmentation of an extra boundary condition is needed to carry out the numerical computation. The governing boundary layer equations are first transformed into non-dimensional form by using a special dimensionless group and are then solved by an implicit finite difference scheme. The results are displayed graphically to illustrate the influence of the viscoelastic parameter K and the Prandtl number Pr on the skin friction, heat transfer, velocity profiles, and temperature profiles. The present results are compared with published results and are found to agree very well.
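
The governing viscoelastic boundary-layer system is not given in the abstract, so the sketch below only illustrates the implicit finite-difference machinery on a model diffusion equation (Crank-Nicolson with a banded solve), the same kind of step an implicit marching scheme performs:

```python
# Generic illustration only: the paper's coupled boundary-layer system is not
# given in the abstract, so this shows implicit finite-difference machinery
# on the model equation u_t = u_xx (Crank-Nicolson, tridiagonal solve).
import numpy as np
from scipy.linalg import solve_banded

nx, nt, L, T = 101, 200, 1.0, 0.1
dx, dt = L / (nx - 1), T / nt
r = dt / (2 * dx**2)

x = np.linspace(0.0, L, nx)
u = np.sin(np.pi * x)                          # initial profile

# Banded matrix for (I - r*D2) on interior nodes, Dirichlet u=0 at both ends.
ab = np.zeros((3, nx - 2))
ab[0, 1:], ab[1, :], ab[2, :-1] = -r, 1 + 2 * r, -r

for _ in range(nt):
    rhs = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])
    u[1:-1] = solve_banded((1, 1), ab, rhs)

print(f"u(x=0.5, t={T}) = {u[nx//2]:.5f}  (exact {np.exp(-np.pi**2*T):.5f})")
```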

Automated Separation of Organic Liquids through Their Boiling Points

This paper discusses the separation of miscible liquids by means of fractional distillation. For complete separation of the liquids, the processes of heating, condensation, separation, and storage are carried out automatically. A PIC microcontroller controls each process, including the storage stage, by activating and deactivating the conveyors. The liquids are heated and, on reaching their respective boiling points, evaporate and enter the condensation chamber, where they condense back into liquid. The liquids are then directed to their respective tanks by a stepper motor that moves in three directions, each movement feeding a different tank. When a tank fills, it signals the controller, which then opens the solenoid valves, and the tank is emptied into the beaker below the nozzle. Once a beaker is filled, the nozzle closes and the conveyors come into operation, replacing the filled beaker with an empty one from behind. This work can be used in the oil, chemical, and paint industries.
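
The firmware runs on a PIC in the actual rig; purely as an illustration of the control sequence described above, here is a Python rendering of one controller cycle, with hypothetical liquids, thresholds, and actuator names:

```python
# Sketch of the control sequence described in the abstract, written as a
# Python simulation (the actual controller is a PIC). Liquids, boiling
# points, thresholds, and actuator names are hypothetical placeholders.
BOILING_POINTS = {"acetone": 56.0, "ethanol": 78.4, "water": 100.0}  # deg C
TANK_POSITION = {"acetone": 0, "ethanol": 1, "water": 2}             # stepper index

def control_step(vapor_temp_c, tank_full, beaker_full):
    """Return the list of actuator commands for one controller cycle."""
    actions = []
    for liquid, bp in sorted(BOILING_POINTS.items(), key=lambda kv: kv[1]):
        if abs(vapor_temp_c - bp) < 1.0:                  # vapor of this liquid
            actions.append(("stepper_goto", TANK_POSITION[liquid]))
            break
    if tank_full:
        actions.append(("open_solenoid_valve", True))     # drain tank to beaker
    if beaker_full:
        actions.append(("open_solenoid_valve", False))    # close nozzle
        actions.append(("run_conveyor", True))            # swap in empty beaker
    return actions

print(control_step(vapor_temp_c=78.0, tank_full=True, beaker_full=False))
```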

Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia

Dengue is an infectious vector-borne viral disease commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including in Malaysia. There is currently no vaccine or chemotherapy available for the prevention or treatment of dengue, so prevention and treatment of the disease depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies. The choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. The aim of this study is therefore to estimate the relative risk of dengue based initially on the most common statistic used in disease mapping, the standardized morbidity ratio (SMR), and on one of the earliest applications of Bayesian methodology, the Poisson-gamma model. The paper begins by reviewing the SMR method, which we then apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both sets of results are displayed and compared using graphs, tables, and maps. The analysis shows that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model is demonstrated to overcome the problem the SMR has when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas. These drawbacks have motivated many researchers to propose other alternative methods for estimating the risk.
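
For concreteness, the two estimators compared above reduce to simple formulas: SMR_i = O_i/E_i, and, with a Gamma(a, b) prior on the relative risk, a Poisson-gamma posterior mean of (a + O_i)/(b + E_i). A sketch with illustrative counts (not the Perak data):

```python
# Sketch: SMR and Poisson-gamma (empirical Bayes) relative risk estimates.
# Counts and expected cases are illustrative, not the Perak data. With a
# Gamma(a, b) prior on relative risk, the posterior mean is (a+O_i)/(b+E_i),
# which stays strictly positive even when a region has zero observed cases.
import numpy as np

O = np.array([12, 0, 7, 25, 3])            # observed cases (illustrative)
E = np.array([9.5, 2.1, 8.0, 18.3, 4.4])   # expected cases from reference rates

smr = O / E          # collapses to 0 whenever a region records no cases

# Method-of-moments prior from the data itself (one common choice):
m, v = smr.mean(), smr.var(ddof=1)
a, b = m**2 / v, m / v
posterior_rr = (a + O) / (b + E)

for i, (s, r) in enumerate(zip(smr, posterior_rr)):
    print(f"district {i}: SMR = {s:.2f}, Poisson-gamma RR = {r:.2f}")
```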

On the Analysis of a Compound Neural Network for Detecting Atrioventricular Heart Block (AVB) in an ECG Signal

Heart failure is the most common cause of death nowadays, but in many cases the patient's life may be saved if medical help is given promptly. Numerous heart diseases can be detected by analyzing electrocardiograms (ECG). Artificial neural networks (ANN) are computer-based expert systems that have proved useful in pattern recognition tasks, and they can be used in different phases of the decision-making process, from classification to diagnostic procedures. This work presents a review followed by a novel method. The purpose of the review is to assess the evidence of healthcare benefits from applying artificial neural networks to the clinical functions of diagnosis, prognosis, and survival analysis on ECG signals. The developed method is based on a compound neural network (CNN) that classifies ECGs as normal or as carrying an atrioventricular heart block (AVB). The method uses three different feed-forward multilayer neural networks, with a single output unit encoding the probability of AVB occurrence. A value between 0 and 0.1 is the desired output for a normal ECG; a value between 0.1 and 1 indicates the occurrence of an AVB. The results show that this compound network performs well in detecting AVBs, with a sensitivity of 90.7%, a specificity of 86.05%, and an accuracy of 87.9%.
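
A sketch of the decision stage as described, with three feed-forward subnetworks and the 0.1 threshold; the feature extraction, network sizes, combination rule, and synthetic data are our assumptions, not the paper's:

```python
# Sketch of the decision stage: three feed-forward networks whose outputs
# are combined into one unit, thresholded at 0.1 (output < 0.1 -> normal
# ECG, otherwise AVB). Features and labels below are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 12))                      # stand-in ECG feature vectors
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(float)   # synthetic AVB labels

# Three independent feed-forward subnetworks:
subnets = [MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=k).fit(X, y) for k in range(3)]

def avb_score(x):
    """Single output in [0, 1]: mean of the three subnetwork outputs."""
    return float(np.clip(np.mean([n.predict(x[None, :]) for n in subnets]), 0, 1))

score = avb_score(X[0])
print(f"score = {score:.3f} ->", "AVB suspected" if score >= 0.1 else "normal")
```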

An Efficient Approach to Mining Frequent Itemsets on Data Streams

The increasing importance of data streams arising in a wide range of advanced applications has led to the extensive study of mining frequent patterns. Mining data streams poses many new challenges, amongst which are the one-scan nature, the unbounded memory requirement, and the high arrival rate of data streams. In this paper we propose a new approach, SFIDS, for mining frequent itemsets on data streams, developed on the basis of the FIDS algorithm. The main aim was to keep some advantages of the previous approach while resolving some of its drawbacks, and consequently to improve run time and memory consumption. Our approach has the following advantages: it uses a lattice-like data structure for keeping frequent itemsets; it separates regions from each other, deleting common nodes, which decreases the search space, memory consumption, and run time; and, finally, considering the CPU constraint, when an increasing data arrival rate would overload the system, SFIDS automatically detects this situation and discards some of the unprocessed data. We guarantee, based on a probability technique, that the error of the results is bounded by a user pre-specified threshold. Final results show that the SFIDS algorithm attains about a 50% run time improvement over the FIDS approach.
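
SFIDS itself is not specified in the abstract; to illustrate how a stream miner can bound its error by a user threshold, here is the classic Lossy Counting sketch for single items (frequent itemsets generalize the same idea):

```python
# Sketch of the bounded-error idea: classic Lossy Counting over a stream of
# items (Manku & Motwani). SFIDS handles itemsets and load shedding; this
# simpler sketch only shows how the count error stays below a chosen epsilon.
def lossy_counting(stream, epsilon=0.01):
    width = int(1 / epsilon)                 # bucket width
    counts, bucket = {}, 1                   # item -> (count, max undercount)
    for n, item in enumerate(stream, start=1):
        f, delta = counts.get(item, (0, bucket - 1))
        counts[item] = (f + 1, delta)
        if n % width == 0:                   # end of bucket: prune rare items
            counts = {k: (f, d) for k, (f, d) in counts.items()
                      if f + d > bucket}
            bucket += 1
    return counts                            # est. count error <= epsilon * n

stream = ["a", "b", "a", "c", "a", "b"] * 500
freq = lossy_counting(stream, epsilon=0.01)
print({k: f for k, (f, d) in freq.items()})
```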

Gasoline and Diesel Production via Fischer-Tropsch Synthesis over a Cobalt-Based Catalyst

The performance of a cobalt-doped, sol-gel-derived silica (Co/SiO2) catalyst for Fischer-Tropsch synthesis (FTS) in a slurry-phase reactor was studied using paraffin wax as the initial liquid medium. The reactive gas mixture, hydrogen (H2) and carbon monoxide (CO) in a molar ratio of 2:1, was fed at 50 ml/min. Brunauer-Emmett-Teller (BET) surface area and X-ray diffraction (XRD) techniques were employed to characterize the specific surface area and the crystallinity of the catalyst, respectively. The reduction behavior of the Co/SiO2 catalyst was investigated using temperature-programmed reduction (TPR). Operating temperatures were varied from 493 to 533 K to find the optimum conditions for maximizing the production of the liquid fuels gasoline and diesel.
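
As background (not from the paper), FTS product selectivity is commonly summarized by the Anderson-Schulz-Flory distribution; the sketch below computes the gasoline (C5-C11) and diesel (C12-C20) weight fractions for a hypothetical chain-growth probability:

```python
# Illustrative aside (not from the paper): FTS product selectivity is often
# described by the Anderson-Schulz-Flory distribution,
# W_n = n * (1 - a)^2 * a^(n-1), with chain-growth probability a.
# The carbon-number windows are the usual cuts; the value of a is hypothetical.
def asf_weight_fraction(n, alpha):
    return n * (1 - alpha) ** 2 * alpha ** (n - 1)

alpha = 0.85                                   # hypothetical chain-growth prob.
gasoline = sum(asf_weight_fraction(n, alpha) for n in range(5, 12))   # C5-C11
diesel = sum(asf_weight_fraction(n, alpha) for n in range(12, 21))    # C12-C20
print(f"gasoline cut: {gasoline:.1%}, diesel cut: {diesel:.1%}")
```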

Automatic Generation of OWL Ontologies from UML Class Diagrams Based on Meta-Modelling and Graph Grammars

The model-driven paradigm places models at the center of the development process. These models are expressed in languages such as UML, the language standardized by the OMG, which has become essential for development. Likewise, the ontology engineering paradigm places ontologies at the center of the development process; in this paradigm, OWL is the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task, but bridges between UML and OWL have appeared in several respects, such as classes and associations. In this paper we profit from this convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and registered in the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules whose level of abstraction is close to the application, in order to produce usable ontologies. We illustrate the approach with an example.
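
A minimal sketch of the kind of rule such a transformation applies: UML classes map to owl:Class, attributes to datatype properties, and associations to object properties. The toy in-memory model and Turtle output are ours; the paper encodes these rules as graph-grammar productions, not Python:

```python
# Sketch: toy UML-to-OWL mapping rules (classes -> owl:Class, attributes ->
# datatype properties, associations -> object properties), emitting Turtle.
# The in-memory UML model below is a hypothetical stand-in.
uml_model = {
    "classes": {"Person": {"name": "string"}, "Company": {}},
    "associations": [("Person", "worksFor", "Company")],
}

def uml_to_owl_turtle(model, ns="http://example.org/onto#"):
    lines = [f"@prefix : <{ns}> .",
             "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
             "@prefix xsd: <http://www.w3.org/2001/XMLSchema#> ."]
    for cls, attrs in model["classes"].items():
        lines.append(f":{cls} a owl:Class .")
        for attr, typ in attrs.items():
            lines += [f":{attr} a owl:DatatypeProperty ;",
                      f"    rdfs:domain :{cls} ; rdfs:range xsd:{typ} ."]
    for dom, prop, rng in model["associations"]:
        lines += [f":{prop} a owl:ObjectProperty ;",
                  f"    rdfs:domain :{dom} ; rdfs:range :{rng} ."]
    return "\n".join(lines)

print(uml_to_owl_turtle(uml_model))
```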

An Extension of the Krätzel Function and Associated Inverse Gaussian Probability Distribution Occurring in Reliability Theory

In view of their importance and usefulness in reliability theory and probability distributions, several generalizations of the inverse Gaussian distribution and the Krätzel function have been investigated in recent years. This has motivated the authors to introduce and study a new generalization of the inverse Gaussian distribution and the Krätzel function, associated with a product of a Bessel function of the third kind Kν(z) and a Fox-Wright generalized hypergeometric function introduced in this paper. The introduced function turns out to be a unified gamma-type function. Its incomplete forms are also discussed, and several properties of this gamma-type function are obtained. By means of this generalized function, we introduce a generalization of the inverse Gaussian distribution, which is useful in reliability analysis, diffusion processes, radio techniques, etc. The inverse Gaussian distribution thus introduced also provides a generalization of the Krätzel function. Some basic statistical functions associated with this probability density function, such as the moments, the Mellin transform, the moment generating function, the hazard rate function, and the mean residual life function, are also obtained.

Keywords: Fox-Wright function, inverse Gaussian distribution, Krätzel function, Bessel function of the third kind.
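
For reference, the classical definitions that the paper generalizes (standard forms, given here for orientation only):

```latex
% Classical Kraetzel function:
\[
  Z_\rho^\nu(x) \;=\; \int_0^\infty t^{\nu-1}
      \exp\!\Big(-t^\rho - \tfrac{x}{t}\Big)\, dt ,
  \qquad x > 0,\ \rho > 0 .
\]
% Inverse Gaussian density with mean \mu and shape \lambda:
\[
  f(x) \;=\; \sqrt{\frac{\lambda}{2\pi x^{3}}}
      \exp\!\Big(-\frac{\lambda\,(x-\mu)^{2}}{2\mu^{2} x}\Big),
  \qquad x > 0 .
\]
```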