Semi-Blind Two-Dimensional Code Acquisition in CDMA Communications

In this paper, we propose a new algorithm for joint time-delay and direction-of-arrival (DOA) estimation, here called two-dimensional code acquisition, in an asynchronous direct-sequence code-division multiple-access (DS-CDMA) array system. The algorithm relies on the eigenvector-eigenvalue decomposition of the sample correlation matrix and requires knowledge of the desired user's training sequence. The performance of the algorithm is analyzed both analytically and numerically in uncorrelated and coherent multipath environments. Numerical examples show that the algorithm is robust when the number of coherent signals is unknown.
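
As a rough illustration of the eigendecomposition step described above, the following sketch (not the authors' algorithm; the array geometry, the synthetic signal model, and the fact that only the DOA slice of the two-dimensional delay/DOA search is shown are all simplifying assumptions) computes a sample correlation matrix, extracts its dominant eigenvector, and scans candidate steering vectors for the best match.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N_snap, true_doa = 8, 500, 20.0       # ULA size, snapshots, assumed true DOA (deg)

def steering(theta_deg, M=M, d=0.5):
    """Plane-wave steering vector for a half-wavelength-spaced ULA."""
    k = np.arange(M)
    return np.exp(2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

# Synthetic despread snapshots: one desired-user signal plus white noise.
s = (rng.standard_normal(N_snap) + 1j * rng.standard_normal(N_snap)) / np.sqrt(2)
noise = (rng.standard_normal((M, N_snap)) + 1j * rng.standard_normal((M, N_snap))) * 0.3
X = np.outer(steering(true_doa), s) + noise

# Sample correlation matrix and its eigendecomposition.
R = X @ X.conj().T / N_snap
eigvals, eigvecs = np.linalg.eigh(R)      # eigenvalues in ascending order
Us = eigvecs[:, -1:]                      # dominant (signal-subspace) eigenvector

# DOA slice of the two-dimensional search: project candidate signatures
# onto the signal subspace and pick the maximizer.
thetas = np.arange(-60.0, 60.5, 0.5)
scores = [np.linalg.norm(Us.conj().T @ steering(t)) ** 2 for t in thetas]
print("Estimated DOA (deg):", thetas[int(np.argmax(scores))])
```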

Relationship between Gender, BMI, and Lifestyle with Bone Mineral Density of Adolescent in Urban Areas

The purpose of this study was to analyze the relationship between gender, BMI, and lifestyle and the bone mineral density (BMD) of adolescents in urban areas. The study was conducted at Jakarta State University, Indonesia. The sample comprised 200 people: 100 men and 100 women. BMD was measured using quantitative ultrasound bone densitometry, while a questionnaire was used to collect data on age, gender, and lifestyle (calcium intake, smoking habits, consumption of alcohol, tea, and coffee, exercise, and sun exposure). The mean age was 20.7 ± 2.18 years for men and 21 ± 1.61 years for women. The mean BMD was 1.084 ± 0.11 g/cm² for men and 0.976 ± 0.10 g/cm² for women. Normal BMD was found in 46.7% of men and 16.7% of women; osteopenia in 50% of men and 80% of women; and osteoporosis in 3.3% of both men and women. The mean BMI was 21.4 ± 2.07 kg/m² for men and 20.9 ± 2.06 kg/m² for women, and the mean lifestyle score was 71.9 ± 5.84 for men and 70.1 ± 5.67 for women (maximum score 100). Based on Spearman and Pearson correlation tests, gender and lifestyle were significantly related to BMD.
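
The correlation analysis reported above can be reproduced with standard statistical tools. The sketch below uses SciPy on synthetic placeholder data (not the study's dataset) to run the Pearson and Spearman tests mentioned in the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Placeholder data standing in for the study variables (not the actual dataset):
# lifestyle score (continuous) and gender (coded 0 = female, 1 = male) vs. BMD.
lifestyle = rng.normal(71, 6, size=200)
gender = np.repeat([1, 0], 100)
bmd = 0.02 * gender + 0.003 * lifestyle + rng.normal(0.85, 0.08, size=200)

# Pearson correlation for the continuous predictor, Spearman for the coded one.
r_pearson, p_pearson = stats.pearsonr(lifestyle, bmd)
rho_spearman, p_spearman = stats.spearmanr(gender, bmd)
print(f"Pearson  r = {r_pearson:.3f}, p = {p_pearson:.4f}")
print(f"Spearman rho = {rho_spearman:.3f}, p = {p_spearman:.4f}")
```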

Knowledge Management Applied to Forensic Sciences

This paper presents Knowledge Management (KM) initiatives applied to the forensic sciences, developed mainly at the Forensic Science Institute of the Brazilian Federal Police. Successful projects related to knowledge sharing, drug analysis, and environmental crimes are reported from a KM perspective. The described results concern: a) the importance of having an information repository, such as a digital library, in such a multidisciplinary organization; and b) the fight against drug dealing and environmental crimes, making it possible to map the evolution of crimes, drug trafficking flows, and the advance of deforestation in the Amazon rain forest. Perspectives on new KM projects under development and study are also presented, tracing the evolution of the KM approach at the Forensic Science Institute.

Optimal Based Damping Controllers of Unified Power Flow Controller Using Adaptive Tabu Search

This paper presents optimization-based damping controllers for a Unified Power Flow Controller (UPFC) aimed at improving the damping of power system oscillations. The design of the UPFC damping controller and the system configuration is formulated as an optimization problem with a time-domain objective function and solved by the Adaptive Tabu Search (ATS) technique. The UPFC is installed in a Single Machine Infinite Bus (SMIB) system for performance analysis of the power system and simulated using MATLAB's Simulink. The simulation results show that the designed controller has a strong capability to damp power system oscillations.
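
To make the optimization formulation concrete, here is a minimal tabu-search-style loop minimizing a time-domain objective. The toy damping model, parameter bounds, and tabu rules below are illustrative assumptions, not the ATS variant or the SMIB model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def objective(params):
    """Placeholder time-domain cost: integral of squared rotor-speed deviation
    from a toy second-order oscillation whose damping depends on the gains."""
    k_gain, t_lead = params
    t = np.linspace(0, 10, 1000)
    zeta = min(0.05 + 0.02 * k_gain * t_lead, 0.9)   # toy damping ratio
    dw = np.exp(-zeta * 2 * np.pi * t) * np.cos(2 * np.pi * np.sqrt(1 - zeta**2) * t)
    return float(np.sum(dw**2) * (t[1] - t[0]))

# Simple tabu-search loop: neighbourhood moves around the current point,
# with recently visited points kept on a bounded tabu list.
bounds = np.array([[0.1, 100.0], [0.01, 1.0]])
current = bounds[:, 0] + rng.random(2) * (bounds[:, 1] - bounds[:, 0])
best, best_cost = current.copy(), objective(current)
tabu = []

for _ in range(200):
    neighbours = np.clip(current + rng.normal(0, 0.05, (20, 2)) *
                         (bounds[:, 1] - bounds[:, 0]), bounds[:, 0], bounds[:, 1])
    costs = [objective(n) for n in neighbours]
    for idx in np.argsort(costs):                 # best non-tabu neighbour
        cand = neighbours[idx]
        if not any(np.allclose(cand, t, atol=1e-3) for t in tabu):
            current = cand
            tabu.append(cand)
            tabu = tabu[-30:]
            if costs[idx] < best_cost:
                best, best_cost = cand.copy(), costs[idx]
            break

print("Best gains:", best, "cost:", best_cost)
```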

Developing New Processes and Optimizing Performance Using Response Surface Methodology

Response surface methodology (RSM) is an efficient tool for gaining practical insight into developing new processes and optimizing them. The methodology helps engineers build a mathematical model that represents the behavior of a system as a function of the process parameters. In this paper, the sequential nature of RSM is surveyed for process engineers, and its relationship to design of experiments (DOE), regression analysis, and robust design is reviewed. The proposed four-step procedure, organized in two phases, helps the system analyst solve parameter design problems involving one or more responses. To check the accuracy of the fitted model, residual analysis and the prediction error sum of squares (PRESS) are described. The proposed procedure can resolve complex parameter design problems with one or more responses and can be applied where data sets are large and a number of responses must be optimized simultaneously. In addition, it is relatively simple and can be implemented easily with ready-made standard statistical packages.
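
A minimal sketch of the model-checking step mentioned above, assuming a two-factor synthetic data set: it fits a second-order response surface by least squares and computes PRESS from the leave-one-out residuals via the hat matrix.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic two-factor experiment (placeholder for a real DOE data set).
x1, x2 = rng.uniform(-1, 1, 30), rng.uniform(-1, 1, 30)
y = 5 + 2 * x1 - 3 * x2 + 1.5 * x1 * x2 - 2 * x1**2 + rng.normal(0, 0.2, 30)

# Second-order response-surface model matrix: 1, x1, x2, x1*x2, x1^2, x2^2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# PRESS via the hat matrix: e_i / (1 - h_ii) are the leave-one-out residuals.
H = X @ np.linalg.inv(X.T @ X) @ X.T
residuals = y - X @ beta
press = np.sum((residuals / (1 - np.diag(H))) ** 2)
print("coefficients:", np.round(beta, 3))
print("PRESS:", round(press, 3))
```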

A Text Clustering System based on k-means Type Subspace Clustering and Ontology

This paper presents a text clustering system developed on the basis of a k-means-type subspace clustering algorithm to cluster large, high-dimensional, and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster, so that the important words of a cluster can be identified by their weight values. To aid understanding and interpretation of the clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to the weights calculated by our new algorithm. The candidates are then fed to WordNet to identify the noun words and consolidate synonyms and hyponyms. Experimental results show that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to k-means-type algorithms such as Bisecting k-Means. Furthermore, the word extraction method is effective in selecting words to represent the topics of the clusters.
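
The weight-calculation idea can be sketched as follows: a simplified k-means-type subspace clustering loop with per-cluster dimension weights. The exponential weighting rule and its gamma parameter are assumptions for illustration, not the paper's exact update.

```python
import numpy as np

rng = np.random.default_rng(4)

def weighted_kmeans(X, k, gamma=1.0, iters=20):
    """k-means-type subspace clustering sketch: alongside centroids, learn a
    per-cluster weight for each dimension (keyword); lower within-cluster
    dispersion on a dimension yields a higher weight."""
    n, d = X.shape
    centers = X[rng.choice(n, k, replace=False)]
    weights = np.full((k, d), 1.0 / d)
    labels = np.zeros(n, dtype=int)
    for _ in range(iters):
        # Assignment step with weighted squared distances.
        dists = np.stack([np.sum(weights[j] * (X - centers[j]) ** 2, axis=1)
                          for j in range(k)], axis=1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members) == 0:
                continue
            centers[j] = members.mean(axis=0)
            # Weight update: dimensions with low mean dispersion get higher weight.
            disp = np.mean((members - centers[j]) ** 2, axis=0) + 1e-12
            w = np.exp(-disp / gamma)
            weights[j] = w / w.sum()
    return labels, centers, weights

# Toy term-frequency matrix: 100 "documents" over 20 "keywords".
X = rng.poisson(1.0, size=(100, 20)).astype(float)
labels, centers, weights = weighted_kmeans(X, k=3)
print("Top keyword indices per cluster:\n", np.argsort(-weights, axis=1)[:, :5])
```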

A 3D Approach for Extraction of the Coronary Artery and Quantification of the Stenosis

Segmentation and quantification of stenosis are important tasks in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in segmenting the different tissues of a narrow vessel affects accuracy. This paper proposes an algorithm to extract the coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach was applied to a vascular phantom and the results were compared with the real diameters. The results for 10 patient datasets were visually judged by a qualified radiologist. The results reveal the superiority of the proposed method over the conventional thresholding method (CTM) on both datasets.
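
A minimal sketch of the stenosis-quantification step, assuming vessel diameters have already been measured in planes orthogonal to the centreline; the reference-diameter rule and the sample values are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

def degree_of_stenosis(diameters_mm, reference="mean_of_ends"):
    """Percent diameter stenosis from diameters sampled along the centreline.
    The reference diameter here is the mean of the end (assumed healthy)
    segments; this rule is an illustrative assumption."""
    d = np.asarray(diameters_mm, dtype=float)
    d_min = d.min()
    if reference == "mean_of_ends":
        d_ref = np.mean(np.concatenate([d[:3], d[-3:]]))
    else:
        d_ref = d.max()
    return 100.0 * (1.0 - d_min / d_ref)

# Example: diameters (mm) measured in orthogonal cross-sections along the vessel.
diameters = [3.1, 3.0, 3.0, 2.4, 1.6, 1.5, 2.2, 2.9, 3.0, 3.1]
print(f"Degree of stenosis: {degree_of_stenosis(diameters):.1f}%")
```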

Groundwater Quality Improvement by Using Aeration and Filtration Methods

An experiment was conducted using two aeration methods (water-into-air and air-into-water) followed by filtration through manganese greensand media. Groundwater properties such as pH, dissolved oxygen, turbidity, and heavy metal concentrations (iron and manganese) were assessed. The objectives of this study were i) to determine the more effective aeration method and ii) to assess the effectiveness of manganese greensand as a filter medium for removing iron and manganese from groundwater. Results showed that the final pH of all treated samples ranged from 7.40 to 8.40. Both aeration methods increased the dissolved oxygen content. Final turbidity of the groundwater samples was between 3 NTU and 29 NTU. Only three out of eight samples achieved an iron concentration of 0.3 mg/L or less, while all samples reached a manganese concentration of 0.1 mg/L or less. The air-into-water aeration method gave a higher percentage of iron and manganese removal than the water-into-air method.

Phytoremediation of Wastewater Using Some of Aquatic Macrophytes as Biological Purifiers for Irrigation Purposes

An attempt was made to assess the reuse/reclamation of wastewater for irrigation purposes using phytoremediation, a low-cost, low-technology approach, with six local aquatic macrophytes (T. angustifolia, B. maritimus, Ph. australis, A. donax, A. plantago-aquatica and M. longifolia (Linn)) as biological waste purifiers. Outdoor experiments were conducted from May 3, 2007 until October 15, 2008, close to one of the main sewage channels of Sulaimani City, Iraq. All processes were based mainly on conventional wastewater treatment, with two further modifications tested: the first, sand filtration pots planted with individual species of the experimental macrophytes, and the second, constructed wetlands planted with all the experimental macrophytes together. Untreated and treated wastewater samples were analyzed for their key physico-chemical properties (only the heavy metals Fe, Mn, Zn and Cu, with particular reference to their removal efficiency by the experimental macrophytes, are highlighted in this paper). Vertical profiles of heavy metal content were also evaluated in both the pots and the cells of the constructed wetland. After 135 days, the macrophytes were harvested and heavy metals were analyzed in their biomass (roots/shoots) to assess removal efficiency (i.e. uptake/bioaccumulation rate). Results showed that the removal efficiency of all studied heavy metals was highest in T. angustifolia, followed by Ph. australis, B. maritimus and A. donax in the triplicate sand pot experiments. The constructed wetland experiments revealed that the more the constructed wetland cells were replicated, the higher the heavy metal removal efficiency.

IT/IS Outsourcing Relationship Factors in Higher Education Institution: Behavioral Dimensions from Client Perspectives

Higher education institutions are increasingly opting for outsourcing in order to sustain themselves, yet there is a gap in the literature on how they perceive the outsourcing relationship. This paper attempts to identify the behavioral and psychological factors at play in such engagements, thus providing valuable information to practicing and potential clients and vendors. The determinants were gathered from previous literature and analyzed to formulate the factors. The study adopts case study and survey approaches, in which interviews and questionnaires were administered to employees of IT-related departments in a Malaysian higher education institution.

Numerical Studies on Thrust Vectoring Using Shock Induced Supersonic Secondary Jet

Numerical studies have been carried out using a validated two-dimensional RNG k-epsilon turbulence model for the design optimization of a thrust vector control system using a shock-induced supersonic secondary jet. Parametric studies have been carried out with various secondary jets at different divergent locations, jet interaction angles, and jet pressures. The results for the case on hand reveal that a primary nozzle with a small divergence angle and downstream injection at a distance of 2.5 times the primary nozzle throat diameter from the throat location yields higher efficiency over a certain range of jet pressures and jet angles. We observed that a supersonic secondary jet opposing the core flow at an interaction angle of 40° to the axis, far downstream of the nozzle throat, facilitates better thrust vectoring than a secondary jet directed with the core flow at various interaction angles. We conclude that positioning the supersonic secondary jet nozzle toward the throat at a suitable angle, at a distance of 2 to 4 times the primary nozzle throat diameter from the throat location, could facilitate better thrust vectoring for supersonic aerospace vehicles.
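
For reference, the effective deflection produced by such an injection scheme is commonly summarized as the angle between the resultant thrust and the nozzle axis. A minimal sketch, with placeholder force values rather than results from this study:

```python
import math

def thrust_vector_angle(f_axial_n, f_side_n):
    """Effective thrust-vector deflection (deg) from axial and side force
    components, delta = atan(F_side / F_axial)."""
    return math.degrees(math.atan2(f_side_n, f_axial_n))

# Example: placeholder forces integrated over the nozzle exit plane of a CFD solution.
print(f"Deflection angle: {thrust_vector_angle(5000.0, 450.0):.2f} deg")
```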

Biorecognizable Nanoparticles Based On Hyaluronic Acid/Poly(ε-Caprolactone) Block Copolymer

Since hyaluronic acid (HA) receptors such as CD44 are over-expressed on cancer cells, HA can be used as a targeting vehicle for anti-cancer drugs. The aim of this study was to synthesize a block copolymer composed of hyaluronic acid and poly(ε-caprolactone) (HAPCL) and to fabricate polymeric micelles for anticancer drug targeting of the CD44 receptor of tumor cells. The chemical composition of HAPCL was confirmed using 1H NMR spectroscopy. Doxorubicin (DOX) was incorporated into the HAPCL polymeric micelles. The micelles had diameters of around 80 nm and spherical shapes. Targeting potential was investigated using CD44-overexpressing KB cells. When DOX-incorporated polymeric micelles were added to KB cells, the cells showed strong red fluorescence, while blocking the CD44 receptor by pretreatment with free HA resulted in reduced intensity, indicating that the HAPCL polymeric micelles target the CD44 receptor.

Numerical Investigation of the Optimal Spatial Domain Discretization for the 2-D Analysis of a Darrieus Vertical-Axis Water Turbine

The optimal grid spacing and turbulence model for the 2D numerical analysis of a vertical-axis water turbine (VAWaterT) operating in a 2 m/s freestream current have been investigated. The results of five different spatial domain discretizations and two turbulence models (k-ω SST and k-ε RNG) have been compared in order to obtain the optimal y+ distribution along the blade walls during a full rotor revolution. The resulting optimal mesh turns out to be quite similar to that obtained for the numerical analysis of a vertical-axis wind turbine.
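
A back-of-the-envelope estimate of the first-cell wall distance needed to reach a target y+ can be made from flat-plate correlations. In the sketch below, the 2 m/s freestream is taken from the abstract, while the water properties and the reference chord are illustrative assumptions.

```python
import math

def first_cell_height(u_inf, L_ref, y_plus_target=1.0, rho=998.0, mu=1.0e-3):
    """Estimate the first-cell wall distance for a target y+ using the
    flat-plate correlation Cf = 0.026 / Re^(1/7). Water properties and the
    reference length are illustrative assumptions."""
    re = rho * u_inf * L_ref / mu
    cf = 0.026 / re ** (1.0 / 7.0)
    tau_w = 0.5 * cf * rho * u_inf ** 2
    u_tau = math.sqrt(tau_w / rho)
    return y_plus_target * mu / (rho * u_tau)

# Assumed blade chord of 0.1 m; freestream of 2 m/s as in the abstract.
print(f"First cell height: {first_cell_height(2.0, 0.1):.3e} m")
```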

A Comparative Analysis of Performance and QoS Issues in MANETs

Mobile ad hoc networks (MANETs) are collections of wireless mobile nodes that dynamically reconfigure and collectively form a temporary network. These networks assume no fixed infrastructure and are often useful in battlefield tactical operations or emergency search-and-rescue operations where fixed infrastructure is neither feasible nor practical. They also find use in ad hoc conferences, campus networks, and commercial recreational applications carrying multimedia traffic. All of these applications require guaranteed levels of performance as experienced by the end user. This paper focuses on the key challenges in provisioning predetermined levels of such Quality of Service (QoS). It also identifies the functional areas where QoS models are currently defined and used. Evolving functional areas where performance and QoS provisioning may be applied are also identified, and some suggestions are provided for further research. Although each of these functional areas has been discussed separately in recent research studies, they are highly correlated and interdependent, so a comprehensive and comparative analysis of the areas and their interrelationships is needed. In this paper we attempt to provide such an overview.

Tuning Neurons to Interaural Intensity Differences Using Spike Timing-Dependent Plasticity

Mammals are known to use the Interaural Intensity Difference (IID) to determine the azimuthal position of high-frequency sounds. In the Lateral Superior Olive (LSO), neurons have firing behaviours that vary systematically with IID. These neurons receive excitatory inputs from the ipsilateral ear and inhibitory inputs from the contralateral one. The IID sensitivity of an LSO neuron is thought to be due to delay differences between the two ears, arising from different synaptic delays and from intensity-dependent delays. In this paper we model the auditory pathway up to the LSO. The inputs to the LSO neurons are initially numerous and differ in their relative delays. Spike Timing-Dependent Plasticity (STDP) is then used to prune these connections. We compare the pruned neuron responses with physiological data and analyse the relationship between the IIDs of the teacher stimuli and the IID sensitivities of the trained LSO neurons.
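
A minimal sketch of a pair-based additive STDP rule of the kind used to prune such connections; the time constants, learning rates, and hard bounds are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def stdp_weight_update(w, dt_ms, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0, w_max=1.0):
    """Pair-based additive STDP with dt = t_post - t_pre.
    Positive dt (pre before post) potentiates; negative dt depresses."""
    if dt_ms > 0:
        w += a_plus * np.exp(-dt_ms / tau_plus)
    else:
        w -= a_minus * np.exp(dt_ms / tau_minus)
    return float(np.clip(w, 0.0, w_max))

# Example: repeated pre-before-post pairings strengthen a synapse, while
# repeated post-before-pre pairings weaken it toward pruning.
w_pot, w_dep = 0.5, 0.5
for _ in range(100):
    w_pot = stdp_weight_update(w_pot, dt_ms=+5.0)
    w_dep = stdp_weight_update(w_dep, dt_ms=-5.0)
print(f"potentiated: {w_pot:.3f}, depressed: {w_dep:.3f}")
```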

Method to Improve Channel Coding Using Cryptography

A new approach for improving the coding gain in channel coding using the Advanced Encryption Standard (AES) and the Maximum A Posteriori (MAP) algorithm is proposed. The approach exploits the avalanche effect of the AES block cipher and the soft output values of the MAP decoding algorithm. The performance of the proposed approach is evaluated in the presence of Additive White Gaussian Noise (AWGN), and computer simulation results are included for verification.
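
The avalanche effect exploited by the approach can be demonstrated directly. The sketch below (assuming the pycryptodome package, which is not necessarily the authors' tooling) flips one plaintext bit and counts how many ciphertext bits change; with AES this is roughly half of them.

```python
# Requires the pycryptodome package: pip install pycryptodome
from Crypto.Cipher import AES

key = bytes(16)                     # all-zero 128-bit key, for illustration only
cipher = AES.new(key, AES.MODE_ECB)

plain = bytearray(16)
flipped = bytearray(plain)
flipped[0] ^= 0x01                  # flip a single plaintext bit

c1 = cipher.encrypt(bytes(plain))
c2 = cipher.encrypt(bytes(flipped))

# Count differing ciphertext bits: the avalanche effect gives roughly 64 of 128.
diff_bits = sum(bin(a ^ b).count("1") for a, b in zip(c1, c2))
print(f"Ciphertext bits changed by a 1-bit plaintext flip: {diff_bits}/128")
```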

Response of a Residential Building Structure to Loads from Technical Seismicity due to Mining Activities

In territories where high-intensity earthquakes are frequent, much attention is paid to solving seismic problems. This paper describes two variants of a finite element computational model of the structure with different subsoil simulations (rigid or elastic subsoil). The ANSYS finite element program system was used for the simulations and calculations. The seismic responses of the residential building structure were calculated for loading characterized by an accelerogram and compared with the results of the response spectra method.

The Reliability of the Improved e-N Method for Transition Prediction as Checked by PSE Method

Transition prediction of boundary layers has always been an important problem in fluid mechanics, both theoretically and practically, yet notwithstanding the great effort made by many investigators, there is no satisfactory answer to this problem. The most popular method available is the so-called e-N method, which is heavily dependent on experiments and experience. The author has proposed improvements to the e-N method to reduce its dependence on experiments and experience to a certain extent. One of the key assumptions is that transition occurs whenever the velocity amplitude of the disturbance reaches 1-2% of the free-stream velocity. However, the reliability of this assumption needs to be verified. In this paper, transition prediction on a flat plate is investigated using both the improved e-N method and the parabolized stability equations (PSE) method. The results show that the transition locations predicted by both methods agree reasonably well with each other under the above assumption. For the supersonic case, the critical velocity amplitude in the improved e-N method should be taken as 0.013, whereas in the subsonic case it should be 0.018; both are within the 1-2% range.
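
The amplitude criterion referred to above can be written compactly. A minimal LaTeX rendering under the usual amplitude-ratio convention is given below (the notation A_0 for the amplitude at the neutral point is an assumption, not the paper's notation):

```latex
% N-factor from the streamwise growth rate -\alpha_i of the most amplified mode,
% and the amplitude criterion described in the abstract:
\begin{align}
  N(x) &= \int_{x_0}^{x} \bigl(-\alpha_i\bigr)\,\mathrm{d}x, \qquad
  \frac{A(x)}{A_0} = e^{N(x)}, \\
  \text{transition at } x_T :&\quad \frac{A(x_T)}{U_\infty} \ge A_c,
  \qquad A_c \approx 0.013 \ \text{(supersonic)},\; 0.018 \ \text{(subsonic)}.
\end{align}
```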

Selective Harmonic Elimination of PWM AC/AC Voltage Controller Using Hybrid RGA-PS Approach

Selective harmonic elimination pulse-width modulation (SHE-PWM) techniques offer tight control of the harmonic spectrum of a voltage waveform generated by a power electronic converter, along with a low number of switching transitions. Traditional optimization methods suffer from drawbacks such as prolonged and tedious computation and convergence to local optima; the more harmonics to be eliminated, the greater the computational complexity and time. This paper presents a novel method for output voltage harmonic elimination and voltage control of PWM AC/AC voltage converters based on a hybrid Real-Coded Genetic Algorithm-Pattern Search (RGA-PS) method. RGA is the primary optimizer, exploiting its global search capabilities, and PS is then employed to fine-tune the best solution provided by RGA in each evolution. The proposed method enables linear control of the fundamental component of the output voltage and complete elimination of its harmonic content up to a specified order. Theoretical studies have been carried out to show the effectiveness and robustness of the proposed method of selective harmonic elimination, and the theoretical results are validated through simulation studies using the PSIM software package.
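
The hybrid global-then-local idea can be sketched with off-the-shelf optimizers. The code below substitutes SciPy's differential evolution for the RGA and a Nelder-Mead refinement for the pattern search, and it uses a simplified chopped-sinusoid AC-chopper waveform rather than the paper's converter model; the angle count, target fundamental, and eliminated harmonics are placeholder assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

N_ANGLES, TARGET_V1, HARMONICS = 5, 0.8, (3, 5, 7, 9)
theta = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)

def chopped_wave(angles):
    """Simplified AC-chopper waveform: a unit sinusoid gated by a quarter-wave
    symmetric switching function that toggles at each switching angle."""
    a = np.sort(np.clip(angles, 0.0, np.pi / 2))
    half = np.mod(theta, np.pi)                            # fold to one half-cycle
    thq = np.where(half < np.pi / 2, half, np.pi - half)   # mirror to quarter-cycle
    gate = (np.searchsorted(a, thq, side="right") % 2).astype(float)
    return gate * np.sin(theta)

def objective(angles):
    """Squared error on the fundamental plus the selected harmonic magnitudes."""
    v = chopped_wave(angles)
    b = lambda n: 2.0 * np.mean(v * np.sin(n * theta))     # sine Fourier coefficient
    return (b(1) - TARGET_V1) ** 2 + sum(b(n) ** 2 for n in HARMONICS)

# Global stage (stand-in for the RGA), then local refinement (stand-in for PS).
bounds = [(0.0, np.pi / 2)] * N_ANGLES
glob = differential_evolution(objective, bounds, seed=5, maxiter=300, tol=1e-10)
loc = minimize(objective, glob.x, method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-14, "maxiter": 10000})

print("Switching angles (deg):", np.round(np.degrees(np.sort(loc.x)), 2))
print("Residual objective:", loc.fun)
```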

Modular Workflow System for HPC Applications

Nowadays, HPC, Grid, and Cloud systems are evolving very rapidly. However, the development of infrastructure solutions related to HPC is lagging behind. While the existing infrastructure is sufficient for simple cases, many computational problems have more complex requirements. Such computational experiments use different resources simultaneously to start a large number of computational jobs. These resources are heterogeneous: they have different purposes, architectures, performance, and installed software. Users need a convenient tool that allows them to describe and run complex computational experiments in an HPC environment. This paper introduces a modular workflow system called SEGL, which makes it possible to run complex computational experiments in a real HPC organization. The system can be used by a great number of organizations that provide HPC resources. Important requirements for such a system are high efficiency and interoperability with the organization's existing HPC infrastructure without any changes.