VoIP and Database Traffic Co-existence over IEEE 802.11b WLAN with Redundancy

This paper presents the findings of two experiments performed on the Redundancy in Wireless Connection Model (RiWC) using the 802.11b standard. The experiments were simulated using OPNET 11.5 Modeler software. The first was aimed at finding the maximum number of simultaneous Voice over Internet Protocol (VoIP) users the model would support under the G.711 and G.729 codec standards with a packetization interval of 10 milliseconds (ms). The second experiment examined the model's VoIP user capacity under the G.729 codec standard with background traffic, using the same packetization interval as in the first experiment. To determine the capacity of the model in the various experiments, we checked three metrics: jitter, delay, and data loss. When background traffic was added, we also checked the response time. The findings of the first experiment indicated that, with the G.711 codec, the maximum number of simultaneous VoIP users the model was able to support was 5, which is consistent with recent research findings. With the G.729 codec, the model was able to support up to 16 VoIP users; similar experiments in the current literature have indicated a maximum of 7 users. The second experiment demonstrated that, with background traffic present, the maximum number of VoIP users the model was able to support was 12.

Two States Mapping Based Neural Network Model for Decreasing Prediction Residual Error

The objective of this paper is to design a model of human vital sign prediction that decreases prediction error, using a two-state mapping based time series neural network BP (back-propagation) model. Many industries apply neural network models, trained in a supervised manner with the error back-propagation algorithm, to time series prediction systems. However, a residual error remains between the real value and the predicted output. We therefore designed a two-state neural network model that compensates for this residual error and can be used in the prevention of sudden death and of metabolic syndrome diseases such as hypertension and obesity. We found that in most simulation cases the two-state mapping based time series prediction model outperformed normal BP. In particular, predictions on small time series sample sizes were more accurate than those of the standard MLP model. We expect that this algorithm can be applied to sudden-death prevention and monitoring agent systems in a ubiquitous home-care environment.
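The two-state idea, a first predictor followed by a second model trained on the first one's residual error, can be sketched as follows. This is a minimal numpy illustration in which both stages are least-squares fits rather than the paper's BP networks; the toy series, lag length, and feature choices are all assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "vital sign" series: trend plus periodic component plus noise
t = np.arange(200, dtype=float)
series = 0.05 * t + np.sin(0.3 * t) + 0.1 * rng.standard_normal(200)

# build lagged inputs: predict x[t] from the previous 3 values
lag = 3
X = np.column_stack([series[i:len(series) - lag + i] for i in range(lag)])
y = series[lag:]

# state 1: ordinary least-squares predictor (stand-in for the first network)
w1, *_ = np.linalg.lstsq(X, y, rcond=None)
pred1 = X @ w1
residual = y - pred1

# state 2: second model trained to predict the *residual* of state 1
# (here from the inputs and their squares, so it can pick up structure
# that the first, linear stage missed)
X2 = np.column_stack([X, X**2])
w2, *_ = np.linalg.lstsq(X2, residual, rcond=None)
pred2 = pred1 + X2 @ w2   # compensated prediction

mse1 = np.mean((y - pred1) ** 2)
mse2 = np.mean((y - pred2) ** 2)
print(mse1, mse2)
```

Because the second stage minimizes the residual directly (and the zero correction is always available to it), the compensated training error can never exceed that of the first stage alone.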

Iris Localization using Circle and Fuzzy Circle Detection Method

Iris localization is a very important step in biometric identification systems. The identification process is usually implemented in three stages: iris localization, feature extraction, and finally pattern matching. The accuracy of iris localization, as the first step, affects all subsequent stages, which shows its importance in an iris-based biometric system. In this paper, we take the Daugman iris localization method as the standard, propose a new method in this field, and then analyze and compare the results of both on a standard set of iris images. The proposed method is based on the detection of the circular edge of the iris, improved by fuzzy circles and surface energy difference contexts. The method is easy to implement and, compared to other methods, offers rather high accuracy and speed. Test results show that the accuracy of the proposed method is close to that of the Daugman method, while its computation speed is about 10 times faster.
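As an illustration of circular-edge detection (a crude Daugman-style integro-differential search, not the authors' fuzzy-circle method): for each candidate centre, average the image intensity along circles of increasing radius and pick the radius where that average drops most sharply. The synthetic test image and all names below are illustrative assumptions:

```python
import numpy as np

def circular_edge(img, centers, radii):
    # for each candidate centre, find the radius with the largest
    # drop in mean intensity along the circle (the circular edge)
    theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    best = None
    for (cy, cx) in centers:
        means = []
        for r in radii:
            ys = np.clip((cy + r * np.sin(theta)).astype(int), 0, img.shape[0] - 1)
            xs = np.clip((cx + r * np.cos(theta)).astype(int), 0, img.shape[1] - 1)
            means.append(img[ys, xs].mean())
        diffs = np.abs(np.diff(means))
        k = int(np.argmax(diffs))
        if best is None or diffs[k] > best[0]:
            best = (diffs[k], cy, cx, radii[k])
    return best[1:]   # (cy, cx, r)

# synthetic "iris": dark disk of radius 20 at (32, 32) on a bright background
img = np.ones((64, 64))
yy, xx = np.mgrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 <= 20 ** 2] = 0.1

cy, cx, r = circular_edge(img, [(30, 30), (32, 32), (34, 34)], list(range(10, 30)))
print(cy, cx, r)
```

The true centre wins because the intensity drop there is concentrated in a single radius step, whereas an off-centre circle crosses the boundary gradually.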

Globally Convergent Edge-preserving Reconstruction with Contour-line Smoothing

The standard approach to image reconstruction is to stabilize the problem by including an edge-preserving roughness penalty in addition to faithfulness to the data. However, this methodology produces noisy object boundaries and creates a staircase effect. The existing attempts to favor the formation of smooth contour lines take the edge field explicitly into account; they either are computationally expensive or produce disappointing results. In this paper, we propose to incorporate the smoothness of the edge field in an implicit way by means of an additional penalty term defined in the wavelet domain. We also derive an efficient half-quadratic algorithm to solve the resulting optimization problem, including the case when the data fidelity term is non-quadratic and the cost function is nonconvex. Numerical experiments show that our technique preserves edge sharpness while smoothing contour lines; it produces visually pleasing reconstructions which are quantitatively better than those obtained without wavelet-domain constraints.
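In generic form (the symbols and the exact shape of each penalty here are illustrative placeholders, not the paper's formulation), such a reconstruction can be posed as minimizing a cost of the type

```latex
\min_{x}\; J(x) \;=\;
\underbrace{\Psi\!\left(Ax - y\right)}_{\text{data fidelity}}
\;+\; \lambda \sum_{i} \varphi\!\left(\lVert \nabla_i x \rVert\right)
\;+\; \mu \,\bigl\lVert W\, e(x) \bigr\rVert_{1}
```

where $\Psi$ is a possibly non-quadratic data term, $\varphi$ an edge-preserving potential applied to local gradients, and the last term a wavelet-domain penalty ($W$ a wavelet transform) that implicitly enforces smoothness of the edge field $e(x)$.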

Assessment of Water Pollution of Kowsar Dam Reservoir

The reservoir of the Kowsar dam supplies water for different uses, such as aquaculture farms, drinking water, agriculture, and industry, in several provinces in the south of Iran. The Kowsar dam is located next to the city of Dehdasht in Kohgiluyeh and Boyer-Ahmad province in southern Iran. There are several towns and villages in the Kowsar dam watershed, of which Dehdasht and Choram are the most important and most populated; these can be sources of pollution for the water reservoir of the dam. This study was carried out to determine the water pollution of the Kowsar dam reservoir, one of the most important water resources of the Kohgiluyeh and Boyer-Ahmad and Bushehr provinces in south-west Iran. Water samples were collected over 12 months and examined for Biochemical Oxygen Demand (BOD) and Dissolved Oxygen (DO) as criteria for evaluating the pollution of the reservoir. In summary, the study showed that the maximum, average, and minimum BOD levels observed were 25.9, 9.15, and 2.3 mg/L respectively, and that the statistical parameters of the data, standard deviation, variance, and skewness, were calculated as 7.88, 62, and 1.54 respectively. Finally, the results were compared with Iranian national standards: the maximum BOD value among the analyzed samples (25.9 mg/L), observed in May 2010, was within the maximum admissible limits of the Iranian standards.

Validation on 3D Surface Roughness Algorithm for Measuring Roughness of Psoriasis Lesion

Psoriasis is a widespread skin disease affecting up to 2% of the population, with plaque psoriasis accounting for about 80% of cases. It can be identified as a red lesion; at higher severity the lesion is usually covered with rough scale. Psoriasis Area Severity Index (PASI) scoring is the gold-standard method for measuring psoriasis severity. Scaliness is one of the PASI parameters that must be quantified in PASI scoring. The surface roughness of a lesion can be used as a scaliness feature, since scale on the lesion surface makes it rougher. Dermatologists usually assess severity through their tactile sense, so direct contact between doctor and patient is required, and the doctor may not assess the lesion objectively. In this paper, a digital image analysis technique is developed to objectively determine the scaliness of a psoriasis lesion and provide the PASI scaliness score. A psoriasis lesion is modelled as a rough surface, created by superimposing a smooth average (curve) surface with a triangular waveform. For roughness determination, polynomial surface fitting is used to estimate the average surface, followed by subtraction of the average surface from the rough surface to give the elevation surface (surface deviations). The roughness index is calculated by applying the average roughness equation to the height-map matrix. The roughness algorithm was tested on 444 lesion models. In the roughness validation, only 6 models could not be accepted (percentage error greater than 10%); these errors occur due to the scanned image quality. The roughness algorithm was also validated for roughness measurement on flat abrasive papers. The Pearson correlation coefficient between the grade value (G) of the abrasive paper and Ra is -0.9488, which shows a strong relation between G and Ra. The algorithm needs to be improved by surface filtering, especially to overcome problems with noisy data.
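The pipeline described above, fit a smooth average surface, subtract it to obtain the elevation map, then apply the average-roughness formula, can be sketched with numpy. The triangular-wave lesion model, grid size, and polynomial degree below are illustrative choices, not the paper's parameters:

```python
import numpy as np

# synthetic "lesion": smooth curved base surface plus a triangular waveform
x = np.linspace(0, 10, 200)
y = np.linspace(0, 10, 200)
X, Y = np.meshgrid(x, y)
base = 0.05 * (X - 5) ** 2 + 0.02 * (Y - 5) ** 2    # smooth average surface
tri = 0.3 * (2 * np.abs((X / 0.5) % 2 - 1) - 1)     # triangular wave, amplitude 0.3
rough = base + tri

# estimate the average surface with a low-order polynomial fit (here per row)
fitted = np.empty_like(rough)
for i in range(rough.shape[0]):
    coeffs = np.polyfit(x, rough[i], deg=2)
    fitted[i] = np.polyval(coeffs, x)

# elevation map = rough surface minus fitted average surface
elevation = rough - fitted

# average roughness Ra: mean absolute deviation of the height map
Ra = np.mean(np.abs(elevation))
print(round(Ra, 3))
```

For a symmetric triangular wave of amplitude 0.3 the mean absolute deviation is 0.15, so a correct fit should recover Ra close to that value regardless of the curved base surface.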

A Usability Testing Approach to Evaluate User-Interfaces in Business Administration

This interdisciplinary study investigates how to evaluate user interfaces in business administration. The study is implemented on two computerized business administration systems with two distinctive user interfaces, so that differences between the two systems can be determined. Both systems, a commercial one and a prototype developed for the purpose of this study, deal with ordering supplies, tendering procedures, issuing purchase orders, controlling the movement of stock against actual balances on the shelves, and editing the corresponding tabulations. In the second, suggested system, modern computer graphics and multimedia considerations were taken into account to address the drawbacks of the first system. To highlight differences between the two investigated systems with respect to chosen standard quality criteria, the study employs various statistical techniques and methods to evaluate the users' interaction with both systems. The study variables are divided into two groups: independent variables, representing the interfaces of the two systems, and dependent variables, embracing efficiency, effectiveness, satisfaction, error rate, etc.

Highly Secure Cover File of Hidden Data Using Statistical Technique and AES Encryption Algorithm

Nowadays, the rapid development of multimedia and the internet allows for wide distribution of digital media data. It has become much easier to edit, modify, and duplicate digital information; digital documents are also easy to copy and distribute, and therefore face many threats. This is a serious security and privacy issue: with the large flood of information and the development of digital formats, appropriate protection has become necessary because of the significance, accuracy, and sensitivity of the information. Protection systems can be classified more specifically into information hiding, information encryption, and combinations of hiding and encryption that increase information security. The strength of the science of information hiding lies in the non-existence of standard algorithms for hiding secret messages; there is also randomness in hiding methods, such as combining several media (covers) with different methods to pass a secret message, and there are no formal methods to be followed to discover hidden data. For these reasons, the task of this research is difficult. In this paper, a new information hiding system is presented. The proposed system aims to hide information (a data file) in an executable file (EXE) and to detect the hidden file; an implementation of a steganography system which embeds information in executable (EXE) files is described. The system seeks a solution that makes the cover file size manageable while remaining undetectable by anti-virus software.
The system includes two main functions. The first is the hiding of the information in a Portable Executable file (EXE), through the execution of four processes (specify the cover file, specify the information file, encrypt the information, and hide the information). The second is the extraction of the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). The system has achieved its main goals: the size of the cover file is independent of the size of the hidden information, and the resulting file does not cause any conflict with anti-virus software.
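The hide/extract structure can be illustrated with a minimal byte-level sketch. Appending an encrypted payload after the end of the cover bytes is one simple hiding technique (appended data does not disturb the executable portion of a PE file); the keyed SHA-256 keystream below is only a standard-library stand-in for the AES step, and the marker and function names are illustrative, not the paper's implementation:

```python
import hashlib

MAGIC = b"\x00HIDD"  # illustrative end-of-cover marker

def _keystream(key: bytes, n: int) -> bytes:
    # stdlib stand-in for AES: SHA-256 in counter mode (NOT the paper's cipher)
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def hide(cover: bytes, secret: bytes, key: bytes) -> bytes:
    # encrypt the payload, then append marker + length + ciphertext
    enc = bytes(a ^ b for a, b in zip(secret, _keystream(key, len(secret))))
    return cover + MAGIC + len(enc).to_bytes(4, "big") + enc

def extract(stego: bytes, key: bytes) -> bytes:
    # locate the marker, read the length, decrypt the ciphertext
    pos = stego.rindex(MAGIC) + len(MAGIC)
    n = int.from_bytes(stego[pos:pos + 4], "big")
    enc = stego[pos + 4:pos + 4 + n]
    return bytes(a ^ b for a, b in zip(enc, _keystream(key, len(enc))))

stego = hide(b"MZ...fake-exe-bytes...", b"top secret", b"key")
print(extract(stego, b"key"))  # b'top secret'
```

Note that in this scheme the stego file grows only by the payload plus a fixed header, so the cover size and payload size are independent, mirroring the goal stated above.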

Image Transmission in Low-Power Networks in Mobile Communications Channel

This paper studies a vital issue in wireless communications: the transmission of images over Wireless Personal Area Networks (WPANs) through the Bluetooth network. It presents a simple method to improve the efficiency of the error control code of older Bluetooth versions over mobile WPANs through an Interleaved Error Control Code (IECC) technique, in which the encoded packets are interleaved by a simple block interleaver. The paper also presents a chaotic interleaving scheme, based on the chaotic Baker map, as a tool against bursts of errors, and proposes using the chaotic interleaver instead of the traditional block interleaver in the Forward Error Control (FEC) scheme. A comparison between the proposed and standard techniques for image transmission over a correlated fading channel is presented. Simulation results reveal the superiority of the proposed chaotic interleaving scheme over the other schemes. They also show the superiority of FEC with the proposed chaotic interleaver over conventional interleavers, while the security level is enhanced through packet-by-packet chaotic interleaving.
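To see why interleaving defeats error bursts, consider the plain block interleaver mentioned above (the chaotic Baker-map permutation is not reproduced here): data is written into an R×C matrix row-wise and read out column-wise, so a run of consecutive channel errors is dispersed across the deinterleaved stream:

```python
def block_interleave(data, rows, cols):
    # write row-wise, read column-wise
    assert len(data) == rows * cols
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(data, rows, cols):
    # the inverse is interleaving with swapped dimensions
    return block_interleave(data, cols, rows)

data = list(range(16))
tx = block_interleave(data, 4, 4)

# burst of 4 consecutive errors on the channel (None marks a corrupted symbol)
rx = list(tx)
for i in range(4, 8):
    rx[i] = None

out = block_deinterleave(rx, 4, 4)
bad = [i for i, v in enumerate(out) if v is None]
print(bad)  # [1, 5, 9, 13] -- the burst is spread out, no two errors adjacent
```

After deinterleaving, the corrupted symbols are 4 positions apart, which a FEC code with limited burst-correction capability can handle far more easily than 4 errors in a row.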

The Study of the Intelligent Fuzzy Weighted Input Estimation Method Combined with the Experiment Verification for the Multilayer Materials

The innovative intelligent fuzzy weighted input estimation method (FWIEM) can be applied to the inverse heat conduction problem (IHCP) to estimate the unknown time-varying heat flux of multilayer materials, as presented in this paper. The feasibility of the method is verified by a temperature measurement experiment. The experimental module is designed using a copper sample stacked on four aluminum samples of different thicknesses. The bottoms of the copper samples are heated by a standard heat source, and the temperatures on the tops of the aluminum samples are measured with thermocouples. The temperature measurements are then used as inputs to the presented method to estimate the heat flux at the bottoms of the copper samples. The influence on the estimation of the sample thickness, the process noise covariance Q, the weighting factor γ, the sampling time interval Δt, and the spatial discretization interval Δx is investigated through the experimental verification. The results show that this method is efficient and robust in estimating the unknown time-varying heat input of multilayer materials.

A Study on Cement-Based Composite Containing Polypropylene Fibers and Finely Ground Glass Exposed to Elevated Temperatures

High strength concrete has been used in situations where it may be exposed to elevated temperatures. Numerous authors have shown the significant contribution of polypropylene fiber to the spalling resistance of high strength concrete. When a cement-based composite reinforced with polypropylene fibers is heated to about 170 °C, the fibers readily melt and volatilize, creating additional porosity and small channels in the matrix that weaken its structure and lower its strength. This investigation examines the mechanical properties of mortar incorporating polypropylene fibers exposed to high temperatures. The effects of different pozzolans on the strength behaviour of samples at elevated temperature were also studied. For this purpose, specimens were produced by partial replacement of cement with finely ground glass, silica fume, and rice husk ash as highly reactive pozzolans; the replacement was 10% by weight of cement, to determine the effects of pozzolans as a partial cement replacement on the mechanical properties of the mortars. Mixtures with 0%, 0.5%, 1%, and 1.5% polypropylene fibers were cast and tested for compressive and flexural strength in accordance with ASTM standards. The specimens were then heated to temperatures of 300 and 600 °C, and the mechanical properties of the heated samples were tested. The mechanical tests showed a significant reduction in compressive strength, which could be due to the melting of the polypropylene fibers. The pozzolans also improved the mechanical properties of the samples.

The Academic Achievement of Writing via Project-Based Learning

This paper focuses on the use of project work as a pretext for applying the conventions of writing, or the correctness of mechanics, usage, and sentence formation, in a content-based class at a Rajabhat University. Its aim was to explore to what extent the student teachers achieved the basic writing features against the 70% attainment target after the use of project work. The organization of work around an agreed theme, in which the students reproduce language provided by texts and instructors, is expected to enhance the students' correct writing conventions. The sample of the study comprised 38 fourth-year English major students. The data was collected by means of an achievement test and the students' written work. The scores in the summative achievement test were analyzed by mean score, standard deviation, and percentage. It was found that the student teachers achieved more in mechanics and usage, and less in sentence formation. The students benefited from the exposure to texts while conducting the project; however, their automaticity in forming phrases and clauses into simple and complex sentences had room for improvement.

Design Optimization Methodology of CMOS Active Mixers for Multi-Standard Receivers

A design flow of multi-standard down-conversion CMOS mixers for three modern standards, Global System for Mobile Communications, Digital Enhanced Cordless Telecommunications, and Universal Mobile Telecommunications System, is presented. Three active mixer structures are studied. The first is based on the Gilbert cell, which gives a tolerable noise figure and linearity with a low conversion gain. The second and third structures use the current bleeding and charge injection techniques in order to increase the conversion gain. An improvement of about 2 dB in conversion gain is achieved without considerable degradation of the other characteristics. The models used for noise figure, conversion gain, and IIP3 are studied. This study describes the nature of the trade-offs inherent in such structures and gives insights that help identify which structure is better under given conditions.

External Morphological Study of Wild Labeo calbasu with Reference to Body Weight, Total Length and Condition Factor from the River Chenab, Punjab, Pakistan

A total of 115 samples of Labeo calbasu, ranging from 8.0 to 17.9 cm in length (mean 11.90±1.96) and from 4.9 to 68.5 g in weight (mean 22.25±12.54), from the River Chenab, Southern Punjab, Pakistan, were analyzed to investigate the length-weight relationship (LWR) of the fish in relation to the condition factor (K). Standard length (SL), fork length (FL), head length (HL), head width (HW), body girth (BG), dorsal fin length (DFL), dorsal fin base (DFB), pectoral fin length (PcFL), pelvic fin length (PvFL), and anal fin length (AFL) were found to be highly correlated with increasing total length and wet body weight (r > 0.500). Wet body weight has a positive correlation (r=0.540), and total length no significant correlation (r=0.344), with the calculated condition factor (K). The slope b of the relationship is 3.27 and the intercept is -2.2258.
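With the reported slope and intercept, the fitted relationship reads log10 W = -2.2258 + 3.27 log10 L. As a back-of-the-envelope check (an illustration, not part of the study), the predicted weight at the sample's mean length can be compared with the reported mean weight; the condition-factor formula shown is Fulton's common formulation, K = 100 W / L^3, which may differ from the authors' exact definition:

```python
import math

b = 3.27          # reported slope of the length-weight relationship
log_a = -2.2258   # reported intercept (log10 scale)

def predicted_weight(length_cm):
    # W = a * L**b, i.e. log10 W = log10 a + b * log10 L
    return 10 ** (log_a + b * math.log10(length_cm))

def fulton_k(weight_g, length_cm):
    # Fulton's condition factor (a common formulation, assumed here)
    return 100 * weight_g / length_cm ** 3

w = predicted_weight(11.90)            # at the sample's mean length
print(round(w, 1))                     # close to the reported mean weight 22.25 g
print(round(fulton_k(22.25, 11.90), 2))
```

That b is above 3 indicates positive allometric growth (fish become relatively heavier as they grow longer).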

The Use of Project to Enhance Writing Skill

This paper explores the use of project work in content-based instruction at a Rajabhat University, a teacher college where student teachers are trained to perform teaching roles mainly at the basic education level. Its aim is to link theory to practice, and to help language teachers maximize the full potential of project work for genuine communication and give real meaning to writing activity. Two research questions are formulated to guide this study: a) What is the academic achievement of the students' writing skill against the 70% attainment target after the use of the project to enhance the skill? and b) To what degree did the students' writing skills develop during the course of the project? The sample of the study comprised 38 fourth-year English major students. The data was collected by means of an achievement test, student written work, and a project diary. The scores in the summative achievement test were analyzed by mean score, standard deviation, and t-test. The project diary serves as the students' record of the language acquired during the project. The list of structures and vocabulary noted in the diary shows the students' ability to attend to, recognize, and focus on meaningful patterns of language forms.

Polyphenolic Profile and Antioxidant Activities of Nigella Sativa Seed Extracts In Vitro and In Vivo

Nigella sativa L. is an aromatic plant belonging to the family Ranunculaceae. It has been used traditionally, especially in the Middle East and India, for the treatment of asthma, cough, bronchitis, headache, rheumatism, fever, influenza, and eczema. Several biological activities have been reported for Nigella sativa seeds, including antioxidant activity. In this context we estimated the antioxidant activity of various extracts prepared from Nigella sativa seeds: methanolic extract (ME), chloroformic extract (CE), hexanic extract (HE: fixed oil), ethyl acetate extract (EAE), and water extract (WE). The Folin-Ciocalteu assay showed that CE and EAE contained high levels of phenolic compounds, 81.31 and 72.43 μg GAE/mg of extract respectively. Similarly, CE and EAE exhibited the highest DPPH radical scavenging activity, with IC50 values of 106.56 μg/ml and 121.62 μg/ml respectively. In addition, CE and HE showed the strongest scavenging activity against the superoxide radical generated in the PMS-NADH-NBT system, with respective IC50 values of 361.86 μg/ml and 371.80 μg/ml, comparable to the activity of the standard antioxidant BHT (344.59 μg/ml). The ferrous ion chelating capacity assay showed that WE, EAE, and ME are the most active, with 40.57, 39.70, and 22.02 mg EDTA-E/g of extract. The inhibition of linoleic acid/β-carotene coupled oxidation was estimated by the β-carotene bleaching assay, which showed the highest relative antioxidant activity with CE and EAE (69.82% inhibition). The antioxidant activities of the methanolic extract and the fixed oil were confirmed by an in vivo assay in mice: daily oral administration of the methanolic extract (500 and 800 mg/kg/day) and the fixed oil (2 and 4 ml/kg/day) for 21 days resulted in a significant enhancement of the blood total antioxidant capacity (measured by the KRL test) and of the plasmatic antioxidant capacity towards the DPPH radical.

Effect of Turbulence Models on Simulated Iced Aircraft Airfoil

The present work describes a computational study of the aerodynamic characteristics of the GLC305 airfoil, clean and with a 16.7 min ice shape (rime 212) and a 22.5 min ice shape (glaze 944). The performance of the SA, k-ε, standard k-ω, and k-ω SST turbulence models is assessed against experimental flow fields at Mach numbers of 0.12, 0.21, and 0.28 and Reynolds numbers of 3×10^6, 6×10^6, and 10.5×10^6 on the clean and iced GLC305 airfoil. Numerical predictions of the lift, drag, and pitching moment coefficients at different Mach numbers and angles of attack were made. The accuracy of the solutions is sensitive to the turbulence model, Mach number variation, initial conditions, grid resolution, and grid spacing near the wall. A Navier-Stokes based computational technique is used. The results are very close to the experimental results, and the SA and SST models proved more accurate than the k-ε and standard k-ω models for the problem under study.

Spatial Distribution of Cd, Zn and Hg in Groundwater at Rayong Province, Thailand

The objective of this study was to evaluate the distribution patterns of Cd, Zn, and Hg in groundwater by geospatial interpolation. The study was performed in Rayong province in the eastern part of Thailand, an area with intense agricultural and industrial activities. Groundwater samples were collected twice a year from 31 tubewells around this area. Inductively Coupled Plasma-Atomic Emission Spectrometry (ICP-AES) was used to measure the concentrations of Cd, Zn, and Hg in the groundwater samples. The results showed that the concentrations of Cd, Zn, and Hg ranged from 0.000-0.297 mg/L (mean 0.021±0.033 mg/L), 0.022-33.236 mg/L (mean 4.214±4.766 mg/L), and 0.000-0.289 mg/L (mean 0.023±0.034 mg/L), respectively. Most of the heavy metal concentrations exceeded the groundwater quality standards specified by the Ministry of Natural Resources and Environment, Thailand. Concentrations tended to be highest in the southeastern part of the area, which is especially vulnerable to heavy metals and other contaminants.

Equivalent Transformation for Heterogeneous Traffic Cellular Automata

Understanding driving behavior is a complicated research topic. To describe the speed, flow, and density of multiclass traffic accurately, an adequate model is needed. In this study, we propose the concept of the standard passenger car equivalent (SPCE), instead of the passenger car equivalent (PCE), to estimate the influence of heavy vehicles and slow cars. A traffic cellular automata model is employed to calibrate and validate the results. According to the simulated results, the SPCE transformations achieve good accuracy.
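A minimal single-lane traffic cellular automaton in the classic Nagel-Schreckenberg style illustrates the modeling framework used here (this sketch is not the authors' multiclass SPCE calibration; road length, density, and parameters are arbitrary):

```python
import random

L = 100        # circular road length (cells)
VMAX = 5       # maximum speed (cells per step)
P_SLOW = 0.3   # random slowdown probability
rng = random.Random(42)

# place 20 cars at random cells, all initially stopped
cars = {c: 0 for c in sorted(rng.sample(range(L), 20))}   # position -> speed

def step(cars):
    pos = sorted(cars)
    new = {}
    for i, x in enumerate(pos):
        gap = (pos[(i + 1) % len(pos)] - x - 1) % L   # empty cells ahead
        v = min(cars[x] + 1, VMAX)                    # 1. accelerate
        v = min(v, gap)                               # 2. brake to avoid collision
        if v > 0 and rng.random() < P_SLOW:           # 3. random slowdown
            v -= 1
        new[(x + v) % L] = v                          # 4. move
    return new

for _ in range(50):
    cars = step(cars)
print(len(cars), max(cars.values()))
```

In such a model, a heavy or slow vehicle class is typically represented by a lower VMAX (or longer occupied length), and its effect on flow relative to passenger cars is what an equivalent such as PCE or SPCE quantifies.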

Heuristic Method for Judging the Computational Stability of the Difference Schemes of the Biharmonic Equation

In this paper, we study the standard 13-point difference schemes for solving the biharmonic equation. A heuristic method is applied to judge the stability of multi-level difference schemes for the biharmonic equation. It is shown that the standard 13-point difference schemes are stable.
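For reference, the standard 13-point scheme discretizes the biharmonic operator $\nabla^4 u = f$ on a uniform grid with spacing $h$ in the usual form

```latex
\begin{aligned}
20\,u_{i,j}
&- 8\left(u_{i+1,j}+u_{i-1,j}+u_{i,j+1}+u_{i,j-1}\right) \\
&+ 2\left(u_{i+1,j+1}+u_{i+1,j-1}+u_{i-1,j+1}+u_{i-1,j-1}\right) \\
&+ \left(u_{i+2,j}+u_{i-2,j}+u_{i,j+2}+u_{i,j-2}\right)
\;=\; h^{4}\, f_{i,j},
\end{aligned}
```

i.e. the centre point, its four nearest neighbours, four diagonal neighbours, and four points two cells away: thirteen points in total.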