New Approach for the Modeling and the Implementation of the Object-Relational Databases

Design is the most important part of the realization of a computer system, and several tools have been developed to help designers describe their software. These tools have been very successful in the relational database domain, since they can generate an SQL script modeling the database from an Entity/Association model. However, as computing has evolved, relational databases have shown their limits and the object-relational model has come into increasing use. Current design tools support neither all the new concepts introduced by this model nor the syntax of the SQL3 language. In this paper we propose a tool, called "NAVIGTOOLS", that assists in the design and implementation of object-relational databases by allowing the user to generate a script modeling the database in SQL3. The tool is based on the Entity/Association and navigational models for modeling object-relational databases.

Applicability of Diatom-Based Water Quality Assessment Indices in Dari Stream, Isparta, Turkey

Diatoms are an important group of organisms in aquatic ecosystems, and diatom-based indices are increasingly becoming important tools for assessing ecological conditions in lotic systems. Although studies on Turkish rivers are very limited, diatom indices have been used for monitoring rivers in different basins. In the present study, we used the OMNIDIA program to estimate stream quality. Some indices showed low sensitivity (IDP, WAT, LOBO, GENRE, TID, CEE, PT), some intermediate sensitivity (IDSE, DESCY, IPS, DI-CH, SLA, IDAP), and others higher sensitivity (SID, IBD, SHE, EPI-D). Among the investigated diatom communities, only a few taxa indicated alpha-mesosaprobity and polysaprobity. Most of the sites were characterized by a large relative contribution of eutraphentic and tolerant taxa as well as oligosaprobic and beta-mesosaprobic diatoms. In general, the SID and IBD indices gave the best results. This study suggests that the structure of benthic diatom communities and diatom indices, especially SID, can be applied for monitoring rivers in southern Turkey.
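Most diatom indices of the kind OMNIDIA computes follow the weighted-average (Zelinka-Marvan) form, combining species abundances with per-species sensitivity values and indicator weights. The sketch below is illustrative only; the example numbers are hypothetical and not taken from the study.

```python
# Sketch of the weighted-average (Zelinka-Marvan) form underlying most
# diatom-based indices: abundances a_i, sensitivity values s_i, and
# indicator weights v_i. Example values below are hypothetical.

def diatom_index(abundances, sensitivities, weights):
    """Index = sum(a_i * s_i * v_i) / sum(a_i * v_i)."""
    num = sum(a * s * v for a, s, v in zip(abundances, sensitivities, weights))
    den = sum(a * v for a, v in zip(abundances, weights))
    return num / den
```

A tolerant, weakly indicative taxon (low weight) then pulls the index less than an abundant, strongly indicative one.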

Nitrogen Removal in a High-Efficiency Denitrification/Oxic Filter Treatment System for Advanced Treatment of Municipal Wastewater

Biological treatment of secondary effluent wastewater by two combined denitrification/oxic filtration systems, packed with Lock-type media (denitrification filter) and ceramic balls (oxic filter), was studied for 5 months. Two phases of operating conditions were carried out, with influent nitrate and ammonia concentrations varying from 5.8 to 11.7 mg/L and 5.4 to 12.4 mg/L, respectively. The denitrification/oxic filter treatment system was operated at an EBCT (Empty Bed Contact Time) of 4 h with a system recirculation ratio ranging from 0 to 300% (linear velocity increasing from 19.5 m/d to 78 m/d). The denitrification and nitrification efficiencies of the system were both over 95%. Total nitrogen and COD removal ranged from 54.6% (0% recirculation) to 92.3% (300% recirculation) and from 10% to 62.5%, respectively.

Application of Computational Intelligence for Sensor Fault Detection and Isolation

The novelty of this research is the application of a new fault detection and isolation (FDI) technique for the supervision of sensor networks in transportation systems. In measurement systems, it is necessary to detect all types of faults and failures based on a predefined algorithm. Recent improvements in artificial neural networks (ANNs) have led to their use for FDI purposes. In this paper, the application of new probabilistic neural network features for data approximation and data classification is considered for plausibility checks in temperature measurement. For this purpose, a two-phase FDI mechanism was designed for residual generation and evaluation.
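The two-phase residual scheme can be sketched as follows. The reference estimate (here a simple mean of redundant sensors) and the fixed threshold are illustrative assumptions standing in for the paper's probabilistic neural network model.

```python
# Sketch of a two-phase FDI plausibility check for a temperature sensor.
# The reference model (mean of redundant sensors) and the threshold are
# illustrative stand-ins for the paper's probabilistic neural network.

def generate_residual(measured, estimated):
    """Phase 1: residual generation against a model estimate."""
    return measured - estimated

def evaluate_residual(residual, threshold=2.0):
    """Phase 2: residual evaluation; flag a fault when the residual
    exceeds the plausibility threshold."""
    return abs(residual) > threshold

def plausibility_check(sensor_value, redundant_values, threshold=2.0):
    estimate = sum(redundant_values) / len(redundant_values)
    return evaluate_residual(generate_residual(sensor_value, estimate), threshold)
```

Separating generation from evaluation lets the estimator (analytic model, ANN, or redundancy) be swapped without touching the decision logic.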

Automatic ECG Beat Tachycardia Detection Using an Artificial Neural Network

The application of neural networks to disease diagnosis has made great progress and is widely used by physicians. An electrocardiogram (ECG) carries vital information about heart activity, and physicians use this signal for cardiac disease diagnosis, which was the main motivation for our study. In our work, the tachycardia features obtained are used for training and testing a neural network. In this study we use fuzzy probabilistic neural networks as an automatic technique for ECG signal analysis. As every real signal recorded by the equipment can contain different artifacts, some preprocessing steps were needed before feeding it to our system. The wavelet transform is used to extract the morphological parameters of the ECG signal. The outcome of the approach for a variety of arrhythmias shows that the presented approach is superior to previously presented algorithms, with an average accuracy of about 95% for more than 7 tachyarrhythmias.

Combining Variable Ordering Heuristics for Improving Search Algorithms Performance

Variable ordering heuristics are used in constraint satisfaction algorithms. The characteristics of the various variable ordering heuristics are complementary, so we have tried to combine the advantages of all of them to improve the performance of search algorithms for solving constraint satisfaction problems. This paper considers combinations based on products and quotients, and then a newer form of combination based on weighted sums of ratings from a set of base heuristics, some of which yield definite improvements in performance.
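A weighted-sum combination of base heuristic ratings might be sketched as below. The choice of base heuristics (min-domain and degree) and the weights are hypothetical, not the paper's actual set.

```python
# Sketch of combining variable ordering heuristics by a weighted sum of
# ratings. Base heuristics and weights are illustrative assumptions.

def dom(var, domains, constraints):
    """Min-domain rating: smaller remaining domain is preferred."""
    return len(domains[var])

def deg(var, domains, constraints):
    """Degree rating, negated so that more-constrained variables rate lower."""
    return -sum(1 for c in constraints if var in c)

def combined_rating(var, domains, constraints, weights=(0.7, 0.3)):
    """Weighted sum of base-heuristic ratings; lowest rating wins."""
    ratings = (dom(var, domains, constraints), deg(var, domains, constraints))
    return sum(w * r for w, r in zip(weights, ratings))

def select_variable(unassigned, domains, constraints):
    return min(unassigned, key=lambda v: combined_rating(v, domains, constraints))
```

A product or quotient combination would replace the weighted sum with `dom(...) * deg(...)` or `dom(...) / (1 + deg-count)`, which is the other family of combinations the paper compares.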

A New Fuzzy Decision Support Method for Analysis of Economic Factors of Turkey's Construction Industry

Imperfect knowledge cannot always be avoided. Imperfection can take several forms: uncertainty, imprecision and incompleteness. Among the methods for managing imperfect knowledge are fuzzy set-based techniques. The choice of a method to process data is linked to the choice of knowledge representation, which can be numerical, symbolic, logical or semantic, and depends on the nature of the problem to be solved, for example decision support, which is addressed in our study. Fuzzy logic is used for its ability to manage imprecise knowledge, but it can also take advantage of the ability of neural networks to learn coefficients or functions; such an association of methods is typical of so-called soft computing. In this study, a new method was used to manage the imprecision of the collected knowledge related to an economic analysis of the construction industry in Turkey. Sudden changes in economic factors decrease the competitive strength of construction companies, so a better evaluation of these changes from the viewpoint of the construction industry will have a positive influence on the decisions of companies engaged in construction.

Dynamic Data Partition Algorithm for a Parallel H.264 Encoder

The H.264/AVC standard is a highly efficient video codec providing high-quality video at low bit-rates. As it employs advanced techniques, its computational complexity is high, and this complexity is the major problem in the implementation of a real-time encoder and decoder. Parallelism is one approach, and it can be exploited on multi-core systems. We analyze macroblock-level parallelism, which ensures the same bit rate with high processor concurrency. In order to reduce the encoding time, a dynamic data partition based on macroblock regions is proposed. The data partition offers advantages in load balancing and data communication overhead. Using the data partition, the encoder obtains more than a 3.59x speed-up on a four-processor system. This work can be applied to other multimedia processing applications.
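The load-balancing idea behind a dynamic macroblock-region partition can be sketched as a greedy assignment of region costs to processors. The per-region cost estimates and the four-processor setup are illustrative assumptions, not the paper's measured data.

```python
# Sketch of a dynamic macroblock-region partition for a parallel encoder:
# greedily assign each region to the least-loaded processor. Region costs
# are illustrative estimates (e.g. from the previous frame's encode times).

def partition_macroblocks(mb_costs, num_procs):
    loads = [0.0] * num_procs
    assignment = [[] for _ in range(num_procs)]
    # Assign heavier regions first for a tighter balance.
    for idx, cost in sorted(enumerate(mb_costs), key=lambda p: -p[1]):
        proc = loads.index(min(loads))   # least-loaded processor
        assignment[proc].append(idx)
        loads[proc] += cost
    return assignment, loads
```

Rebalancing with fresh cost estimates each frame is what makes the partition "dynamic" as opposed to a fixed static split of the frame.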

Large Vibration Amplitudes of Circular Functionally Graded Thin Plates Resting on Winkler Elastic Foundations

This paper describes a study of the geometrically nonlinear free vibration of thin circular functionally graded plates (CFGP) resting on Winkler elastic foundations. The material properties of the functionally graded composites examined here are assumed to be graded smoothly and continuously through the plate thickness according to a power law and are estimated using the rule of mixtures. The theoretical model is based on classical plate theory and the von Kármán geometrical nonlinearity assumptions. A homogenization procedure (HP) is developed to reduce the problem considered here to that of isotropic homogeneous circular plates resting on a Winkler foundation. Hamilton's principle is applied and a multimode approach is derived to calculate the fundamental nonlinear frequency parameters, which are found to be in good agreement with published results. The influence of the foundation parameters on the nonlinear fundamental frequency has also been analysed.

Detection of Action Potentials in the Presence of Noise Using Phase-Space Techniques

Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthetic devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation and classification of action potentials (APs) from noisy data trains. Current techniques in the field of 'unsupervised, no-prior-knowledge' biosignal processing include energy operators, wavelet detection and adaptive thresholding. These tend to be biased towards larger AP waveforms, APs may be missed due to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to suffer from large computational expense. A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, based upon the use of a delayed copy of the signal to highlight discontinuities relative to the background noise. This idea has been used to create algorithms that are computationally inexpensive and that address the above problems. Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an autoregressive moving average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the classified APs, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratios (SNR).
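The delayed-copy idea can be sketched in a minimal scalar form: each sample is paired with a delayed copy of itself, and the excursion of the phase-space point (x[n], x[n-d]) away from the diagonal marks a discontinuity. The delay and threshold values below are illustrative assumptions, not the paper's tuned parameters.

```python
# Minimal sketch of phase-space spike detection with a delayed copy.
# Delay and threshold are illustrative; in practice the threshold would
# be set relative to the background-noise statistics.

def phase_space_detect(signal, delay=3, threshold=4.0):
    """Flag samples where the phase-space point (x[n], x[n-delay]) lies
    far from the diagonal, i.e. the signal changes faster than the
    background noise does over the delay interval."""
    detections = []
    for n in range(delay, len(signal)):
        excursion = abs(signal[n] - signal[n - delay])
        if excursion > threshold:
            detections.append(n)
    return detections
```

Because it needs only one subtraction and comparison per sample, the test is computationally inexpensive, matching the motivation stated above.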

Implementation of Vertical Neutron Camera (VNC) for ITER Fusion Plasma Neutron Source Profile Reconstruction

In the present work, the problem of reconstructing the ITER fusion plasma neutron source parameters using only the Vertical Neutron Camera data was solved. The feasibility of the neutron source parameter reconstruction was estimated by numerical simulations, and the adequacy of the mathematical model was analysed. The neutron source was specified in a parametric form. A numerical analysis of the solution's stability with respect to data distortion was performed. The influence of the data errors on the reconstructed parameters is shown:
• […] is reconstructed with errors of less than 4% at all examined values of δ (up to 60%);
• […] is determined with errors of less than 10% when δ does not exceed 5%;
• […] is reconstructed with a relative error of more than 10%;
• the integral intensity of the neutron source is determined with a 10% error while δ is less than 15%;
where δ is the error of the signal measurements, (R0, Z0) is the plasma center position, and the remaining symbols (lost in extraction) denote parameters of the neutron source profile.

Simulating Laboratory Short Term Aging to Suit Malaysian Field Conditions

This paper characterizes the effects of artificial short-term aging in the laboratory on the rheological properties of virgin 80/100 penetration grade asphalt binder. After several years in service, asphalt mixtures start to deteriorate due to aging. Aging is a complex physico-chemical phenomenon that influences the rheological properties of the asphalt binder, causing a deterioration in asphalt mixture performance. To ascertain the effects of asphalt binder aging, virgin, artificially aged and extracted asphalt binders were tested using the Rolling Thin Film Oven (RTFO), a Dynamic Shear Rheometer (DSR) and a Rotational Viscometer (RV). A comparative study between laboratory and field aging conditions was also carried out. The results showed that conditioning the specimens for 85 minutes inside the RTFO was insufficient to simulate the short-term aging that takes place in the field under Malaysian conditions.

On One Application of Hybrid Methods For Solving Volterra Integral Equations

As is known, one of the priority directions of research in the natural sciences is putting applied branches of contemporary mathematics, such as approximate and numerical methods for solving integral equations, into practice. We face the solution of integral equations when studying many natural phenomena, and quadrature methods are mainly applied to solve them numerically. Taking into account certain deficiencies of quadrature methods in finding the solution of integral equations, some scientists have suggested multistep methods with constant coefficients. Unlike those papers, here we consider the application of hybrid methods to the numerical solution of Volterra integral equations. The efficiency of the suggested method is proved, and a concrete method with accuracy order p = 4 is constructed. This method is more precise than the corresponding known methods.

Computer Verification in Cryptography

In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove with the computer. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' formula. We also describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards the computer verification of cryptographic primitives and describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
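For readers unfamiliar with the function being verified, a plain (unverified) sketch of the Miller-Rabin probabilistic primality test follows. This illustrative Python version is not the paper's Isabelle/HOL implementation.

```python
# Sketch of the Miller-Rabin probabilistic primality test, the kind of
# cryptographic function whose implementation the paper formally verifies.
import random

def miller_rabin(n, rounds=20):
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # Write n - 1 = 2^r * d with d odd.
    r, d = 0, n - 1
    while d % 2 == 0:
        r += 1
        d //= 2
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # a witnesses that n is composite
    return True            # n is probably prime
```

The probabilistic claim that needs the formalized probability spaces above is exactly the error bound: a composite n passes each round with probability at most 1/4, so `rounds=20` bounds the failure probability by 4⁻²⁰.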

What Do Iranian Free-Style Wrestlers Know and Think about Doping? – A Knowledge and Attitude Study

Nowadays, doping is an intricate dilemma. Wrestling is a nationally popular sport in Iran, and the prevalence of doping may be high due to the sport's power-demanding characteristics. We therefore aimed to assess knowledge of and attitudes toward doping among club wrestlers. In a cross-sectional study, 426 wrestlers were studied using a researcher-made questionnaire. The researchers selected the clubs by randomized cluster sampling and distributed the questionnaire among the wrestlers. Wrestlers' knowledge in the three categories of doping definitions, recognition of prohibited drugs, and side effects was poor or moderate in 70.8%, 95.8% and 99.5% of cases, respectively. Wrestlers have poor knowledge of doping; furthermore, they believe some unfavorable myths. It seems necessary to design a comprehensive educational program for all athletes and coaches.

Grouping-Based Job Scheduling Model in Grid Computing

Grid computing is a high-performance computing environment for solving large-scale computational applications. It encompasses resource management, job scheduling, security, information management and so on. Job scheduling is a fundamental and important issue in achieving high performance in grid computing systems; however, designing an efficient scheduler and implementing it is a big challenge. In grid computing, job scheduling algorithms need further improvement to group light-weight or small jobs into coarse-grained jobs, which reduces communication time and processing time and enhances resource utilization. The proposed grouping strategy considers the processing power, memory size and bandwidth requirements of each job to reflect a real grid system. The experimental results demonstrate that the proposed scheduling algorithm efficiently reduces the processing time of jobs in comparison to other approaches.
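The grouping step can be sketched as packing small jobs into groups sized to a resource's processing power. The MIPS figure, granularity time, and job lengths below are hypothetical, and this sketch covers only the processing-power constraint, not the memory and bandwidth checks the abstract also mentions.

```python
# Sketch of grouping light-weight jobs into coarse-grained groups sized
# to a resource's processing capacity. Numbers are illustrative; memory
# and bandwidth constraints would be checked the same way.

def group_jobs(job_lengths_mi, resource_mips, granularity_s):
    """Pack jobs (lengths in million instructions) into groups whose total
    length fits what the resource can process in granularity_s seconds."""
    capacity = resource_mips * granularity_s
    groups, current, total = [], [], 0
    for mi in job_lengths_mi:
        if current and total + mi > capacity:
            groups.append(current)      # close the full group
            current, total = [], 0
        current.append(mi)
        total += mi
    if current:
        groups.append(current)
    return groups
```

Each group is then dispatched as a single job, so the per-job communication overhead is paid once per group instead of once per small job.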

Delay-Dependent Stability Analysis for Neutral Type Neural Networks with Uncertain Parameters and Time-Varying Delay

In this paper, delay-dependent stability analysis for neutral-type neural networks with uncertain parameters and time-varying delay is studied. By constructing a new Lyapunov-Krasovskii functional and dividing the delay interval into multiple segments, a novel sufficient condition is established to guarantee the global asymptotic stability of the considered system. Finally, a numerical example is provided to illustrate the usefulness of the proposed main results.

Modeling the Vapor Pressure of Biodiesel Fuels

The composition, vapour pressure, and heat capacity of nine biodiesel fuels from different sources were measured. The vapour pressure of the biodiesel fuels is modeled assuming an ideal liquid phase of the fatty acid methyl esters constituting the fuel. New methodologies to calculate the vapour pressure and ideal gas and liquid heat capacities of the biodiesel fuel constituents are proposed. Two alternative optimization scenarios are evaluated: 1) vapour pressure only; 2) vapour pressure constrained with liquid heat capacity. Without physical constraints, significant errors in liquid heat capacity predictions were found whereas the constrained correlation accurately fit both vapour pressure and liquid heat capacity.
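The ideal-liquid-phase assumption above corresponds to Raoult's law: the mixture vapour pressure is the mole-fraction-weighted sum of the pure fatty acid methyl ester vapour pressures. The sketch below uses an Antoine-type correlation for the pure components; the constants are illustrative placeholders, not the fitted values from the paper.

```python
# Sketch of the ideal-solution (Raoult's law) mixture model: the biodiesel
# vapour pressure is the mole-fraction-weighted sum of the pure FAME
# vapour pressures. Antoine-type constants below are placeholders.

def antoine_psat(T, A, B, C):
    """Pure-component vapour pressure, log10(P_sat) = A - B / (T + C)."""
    return 10 ** (A - B / (T + C))

def mixture_vapor_pressure(T, mole_fractions, antoine_params):
    """Ideal liquid phase: P = sum_i x_i * P_i_sat(T)."""
    return sum(x * antoine_psat(T, *params)
               for x, params in zip(mole_fractions, antoine_params))
```

Constraining the same pure-component correlations with liquid heat capacity data, as in the paper's second scenario, changes only how the constants are fitted, not this mixing rule.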

A Formal Suite of Object Relational Database Metrics

Object-relational databases (ORDB) are more complex in nature than traditional relational databases because they combine object-oriented concepts with the relational features of conventional databases. The design of an ORDB demands an efficient, high-quality schema that accounts for structural, functional and componential traits. This internal quality of the schema is assured by metrics that measure the relevant attributes, which is extended to substantiate the understandability, usability and reliability of the schema, thus assuring its external quality. This work institutes a formalization of ORDB metrics: metric definition, evaluation methodology and calibration of the metrics. Three ORDB schemas were used to conduct the evaluation and the formalization of the metrics. The metrics are calibrated using content- and criteria-related validity based on their measurability, consistency and reliability. Nominal and summative scales are derived from the evaluated metric values and are standardized. Future work pertaining to ORDB metrics forms the concluding note.

Use of Pesticides and Their Role in Environmental Pollution

Insect pests are a major source of crop damage and of yield and quality reduction in Pakistan and elsewhere in the world. Cotton is the worst-hit crop in Pakistan, followed by rice, and is the second most important foreign-exchange-earning crop. The wide variety of staple, horticultural and cash crops grown reflects serious problems with many types of insect pests. To overcome the insect pest problem, pesticide use in Pakistan has increased substantially and has now been further intensified; pesticides worth billions of rupees are imported every year. This paper reviews overall pesticide use in Pakistan in relation to pesticide prices, the support prices of cotton and rice, pesticide use on different crops in the different provinces of Pakistan, and its impact on crop productivity. The environmental pollution caused by the use of pesticides, the contamination of soil and water resources, and the danger associated with the disposal of empty pesticide containers are also discussed in detail.