Accurate Time Domain Method for Simulation of Microstructured Electromagnetic and Photonic Structures

A time-domain numerical model within the framework of transmission line modeling (TLM) is developed to simulate electromagnetic pulse propagation inside multiple microcavities forming photonic crystal (PhC) structures. The model is quite general and is capable of simulating complex electromagnetic problems accurately. The field quantities can be mapped onto a passive electrical circuit equivalent, which ensures that TLM is provably stable and conservative at a local level. Furthermore, the circuit representation allows a high level of hybridization of TLM with other techniques and with lumped circuit models of components and devices. As an example, a photonic crystal structure formed by rods (or blocks) of high-permittivity dielectric material embedded in a low-permittivity background medium is simulated. The model gives vital spatio-temporal information about the signal, as well as spectral information over a wide frequency range in a single run. It has wide applications in microwave communication systems, optical waveguides and the simulation of electromagnetic materials.
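
To give a flavour of the approach, the sketch below time-steps a bare 2D shunt-node TLM mesh in Python. It is a minimal illustration under our own assumptions (grid size, Gaussian source, probe location and short-circuit outer walls); the paper's model additionally handles the high-permittivity rods (e.g., via stub loading) and hybrid circuit elements.

```python
import numpy as np

# Minimal 2D TLM (shunt-node) time-stepping sketch on a homogeneous mesh.
NX, NY, NT = 100, 100, 300
V = np.zeros((4, NX, NY))           # incident pulses on ports W, N, E, S
probe = []

for t in range(NT):
    V[:, NX // 2, NY // 2] += 0.25 * np.exp(-((t - 40) / 12.0) ** 2)  # source
    node = 0.5 * V.sum(axis=0)      # node voltage (maps to the E_z field)
    Vr = node - V                   # scatter: reflected = V_node - incident
    probe.append(node[3 * NX // 4, NY // 2])
    # connect: reflected pulses become incident pulses at the neighbours
    V[0, 1:, :] = Vr[2, :-1, :]     # east-going pulse -> west port, right cell
    V[2, :-1, :] = Vr[0, 1:, :]     # west-going pulse -> east port, left cell
    V[3, :, 1:] = Vr[1, :, :-1]     # north-going pulse -> south port, upper cell
    V[1, :, :-1] = Vr[3, :, 1:]     # south-going pulse -> north port, lower cell
    # short-circuit (PEC) outer walls reflect with coefficient -1
    V[0, 0, :], V[2, -1, :] = -Vr[0, 0, :], -Vr[2, -1, :]
    V[3, :, 0], V[1, :, -1] = -Vr[3, :, 0], -Vr[1, :, -1]

# a single run yields the full time history; an FFT of the probe signal
# gives the wideband spectral response mentioned in the abstract
spectrum = np.abs(np.fft.rfft(probe))
```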

Application of Data Mining Tools to Predict the Completion Time of a Project

Estimating the time and cost of work completion in a project, and following them up during execution, contribute to the success or failure of a project and are very important for the project management team. Delivering on time and within the budgeted cost requires good management and control of the project. To deal with the complex task of controlling and modifying the baseline project schedule during execution, earned value management systems have been set up and are widely used to measure and communicate the real physical progress of a project; however, they often fail to predict the total duration of the project. In this paper, data mining techniques are used to predict the total project duration in terms of the Time Estimate at Completion, EAC(t). For this purpose, we used a project with 90 activities that was updated day by day. The indexes commonly used in the literature and the Earned Duration Method were then applied to calculate the time estimate at completion; these served as input data for prediction, and the major parameters among them were identified using the Clem software. Using data mining, the parameters that affect EAC(t) and the relationships between them can be extracted, which is very useful for managing a project with minimum delay risk. As we show, this can be a simple, safe and applicable method for predicting the completion time of a project during execution.
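
To make the target quantity concrete, the sketch below computes EAC(t) in the earned-schedule style, one common formulation in the literature (the paper's Earned Duration Method calculation may differ); the planned-value curve and progress figures are made-up examples.

```python
import numpy as np

# Earned-schedule style time Estimate At Completion, EAC(t) = PD / SPI(t).
PD = 10.0                               # planned duration (periods)
pv = np.array([0, 8, 18, 30, 44, 60, 74, 86, 94, 98, 100.0])  # cumulative PV
AT = 5                                  # actual time elapsed (periods)
EV = 52.0                               # cumulative earned value at AT

# earned schedule: ES = t + linear interpolation into the PV curve
t = int(np.searchsorted(pv, EV, side='right')) - 1
ES = t + (EV - pv[t]) / (pv[t + 1] - pv[t])
SPI_t = ES / AT                         # schedule performance index (time)
EAC_t = PD / SPI_t                      # time estimate at completion
print(f"ES={ES:.2f}, SPI(t)={SPI_t:.2f}, EAC(t)={EAC_t:.2f} periods")
```

With these made-up numbers the project has earned 4.5 periods of schedule in 5 elapsed periods, so SPI(t) = 0.90 and the predicted duration stretches to about 11.1 periods.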

An Empirical Quest for Linkages between HPWS and Employee Behaviors – a Perspective from Non-Managerial Employees in Japanese Organizations

High Performance Work Systems (HPWS) are generally held to have positive impacts on employees by increasing their commitment in the workplace, while some argue that they actually have considerable negative impacts, raising the possibility of strain caused by the stress and intensity of such workplaces. Do stressful workplaces hamper employee commitment? The author has tried to answer this question by exploring the linkages between HPWS practices and their impact on employees in Japanese organizations, examining how negative outcomes such as job intensity and workplace and job stressors can influence the different forms of employee commitment that underpin performance. Design: A closed-ended questionnaire survey was conducted in 16 large, medium and small Japanese companies from diverse industries around Chiba, Saitama and Ibaraki Prefectures and in Tokyo, from October 2008 to February 2009. The questionnaires addressed non-managerial employees' perceptions of HPWS practices, their behavior, and their working-life experiences in their workplaces. A total of 227 samples were used for analysis in the study. Methods: Correlations, MANCOVA and SEM path analysis using the AMOS software were used for data analysis. Findings: Average non-managerial perception of HPWS adoption is significantly but negatively correlated with both workplace stressors and continuance commitment, and positively correlated with job intensity and with affective, occupational and normative commitment in different workplaces in Japan. The SEM path analysis shows significant indirect relationships between stressors and employees' affective and normative organizational commitment. Intensity also has a significant indirect effect on occupational commitment. HPWS has an additive effect on all the outcome variables. Limitations: The sample in this study cannot be representative of the entire population of non-managerial employees in Japan. There were no respondents from the automobile, pharmaceutical or finance industries. The survey also coincided with a period when Japan, like most other countries, was undergoing a recession, so biases cannot be ruled out completely. Caution must therefore be taken in interpreting the results, as they cannot be generalized; moreover, the path analysis cannot establish the complete causality of the linkages between the variables used in the study. Originality: There have been limited studies on the linkages between HPWS adoption and its impact on employees' behavior and commitment in Japanese workplaces. This study may provide some ingredients for further research on HRM policies and practices and their linkages to different forms of employee commitment.

Developing New Processes and Optimizing Performance Using Response Surface Methodology

Response surface methodology (RSM) is an efficient tool for gaining practical insight when developing new processes and optimizing them. The methodology helps engineers build a mathematical model that represents the behavior of a system as a convincing function of the process parameters. In this paper, the sequential nature of RSM is surveyed for process engineers, and its relationship to design of experiments (DOE), regression analysis and robust design is reviewed. The proposed four-step procedure, organized in two phases, can help system analysts resolve parameter design problems involving several responses. To check the accuracy of the designed model, residual analysis and the prediction error sum of squares (PRESS) are described. It is believed that the proposed procedure can resolve complex parameter design problems with one or more responses. It can be applied in areas where data sets are large and a number of responses must be optimized simultaneously. In addition, the procedure is relatively simple and can be implemented easily using ready-made standard statistical packages.
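
The core modelling step can be sketched in a few lines. The example below (our illustration, not the paper's procedure) fits a second-order response surface to made-up two-factor data by least squares and locates the stationary point of the fitted surface.

```python
import numpy as np

# Fit y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to noisy data.
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-1, 1, 40), rng.uniform(-1, 1, 40)
y = 5 + 2*x1 - 3*x2 - 4*x1**2 - 2*x2**2 + x1*x2 + rng.normal(0, 0.1, 40)

X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)      # regression coefficients

# stationary point: solve grad = 0 for the fitted quadratic surface
B = np.array([[2*b[3],   b[5]],                # Hessian of the fitted surface
              [  b[5], 2*b[4]]])
xs = np.linalg.solve(B, -b[1:3])
print("coefficients:", b.round(2), "stationary point:", xs.round(3))
```

Residual analysis and PRESS would then be applied to the fitted model before the stationary point is trusted as an optimum.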

A Text Clustering System based on k-means Type Subspace Clustering and Ontology

This paper presents a text clustering system developed based on a k-means type subspace clustering algorithm to cluster large, high-dimensional and sparse text data. In this algorithm, a new step is added to the k-means clustering process to automatically calculate the weights of keywords in each cluster, so that the important words of a cluster can be identified by their weight values. To aid understanding and interpretation of the clustering results, a few keywords that best represent the semantic topic are extracted from each cluster. Two methods are used to extract the representative words. The candidate words are first selected according to the weights calculated by our new algorithm. The candidates are then fed to WordNet to identify the set of nouns and to consolidate synonyms and hyponyms. Experimental results show that the clustering algorithm is superior to other subspace clustering algorithms, such as PROCLUS and HARP, and to k-means type algorithms such as Bisecting-KMeans. Furthermore, the word extraction method is effective in selecting the words that represent the topics of the clusters.
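
The sketch below shows one way such a weighting step can be embedded in the k-means loop, in the spirit of entropy-weighted k-means; the exact weight formula is our assumption for illustration, not necessarily the paper's.

```python
import numpy as np

def weighted_kmeans(X, k, gamma=0.5, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Z = X[rng.choice(n, k, replace=False)]          # initial centroids
    W = np.full((k, d), 1.0 / d)                    # per-cluster feature weights
    for _ in range(iters):
        # assign each document to the closest centroid under weighted distance
        dist = ((X[:, None, :] - Z[None, :, :]) ** 2 * W[None, :, :]).sum(-1)
        labels = dist.argmin(1)
        for l in range(k):
            pts = X[labels == l]
            if len(pts) == 0:
                continue
            Z[l] = pts.mean(0)                      # update centroid
            D = ((pts - Z[l]) ** 2).sum(0)          # per-feature dispersion
            W[l] = np.exp(-(D - D.min()) / gamma)   # low dispersion -> high weight
            W[l] /= W[l].sum()
    return labels, Z, W

# high weights flag the important keywords (dimensions) of each cluster
labels, Z, W = weighted_kmeans(np.random.default_rng(1).random((100, 8)), k=3)
print(W.round(3))
```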

A 3D Approach for Extraction of the Coronary Artery and Quantification of the Stenosis

Segmentation and quantification of stenosis is an important task in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in the segmentation of the different tissues in a narrow vessel is an important issue that affects accuracy. This paper proposes an algorithm to extract the coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises four steps: segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach was applied to a vascular phantom and the results were compared with the real diameters. The results for 10 patient datasets were visually judged by a qualified radiologist. The results reveal the superiority of the proposed method over the conventional thresholding method (CTM) on both datasets.
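
Measuring diameters in the plane orthogonal to the centreline is what allows the usual percent-stenosis figure to be computed without curvature inflating the measurement. The helper below uses the common diameter-based definition (an assumption; the paper may use an area-based variant).

```python
def degree_of_stenosis(d_min: float, d_ref: float) -> float:
    """Percent diameter stenosis from the minimal lumen diameter d_min and
    the reference (healthy) diameter d_ref, both measured in the plane
    orthogonal to the vessel centreline (e.g., in mm)."""
    return 100.0 * (1.0 - d_min / d_ref)

print(degree_of_stenosis(d_min=1.2, d_ref=3.0))  # -> 60.0 (% stenosis)
```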

Groundwater Quality Improvement by Using Aeration and Filtration Methods

An experiment was conducted using two aeration methods (water-into-air and air-into-water) followed by filtration through manganese greensand material. Groundwater properties such as pH, dissolved oxygen, turbidity and heavy metal concentrations (iron and manganese) were assessed. The objectives of this study were i) to determine the more effective aeration method and ii) to assess the effectiveness of manganese greensand as a filter medium for removing iron and manganese from groundwater. Results showed that the final pH of all samples after treatment ranged from 7.40 to 8.40. Both aeration methods increased the dissolved oxygen content. The final turbidity of the groundwater samples was between 3 NTU and 29 NTU. Only three out of eight samples achieved an iron concentration of 0.3 mg/L or less, while all samples reached a manganese concentration of 0.1 mg/L or less. The air-into-water aeration method gave a higher percentage of iron and manganese removal than the water-into-air method.
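
The removal percentages compared here are, as is typical in water-treatment studies, computed from influent and effluent concentrations (our statement of the standard definition):

```latex
% C_in and C_out are the influent and effluent concentrations of the metal
\[
  R \,(\%) = \frac{C_{\mathrm{in}} - C_{\mathrm{out}}}{C_{\mathrm{in}}} \times 100
\]
```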

Blur and Ringing Artifact Measurement in Image Compression using Wavelet Transform

Quality evaluation of an image is an important task in image processing applications. In the case of image compression, the quality of the decompressed image is also a criterion for evaluating the given coding scheme. In the compression-decompression process, various artifacts such as blocking, blur and ringing (edge) artifacts are observed. However, quantifying these artifacts is a difficult task. We propose here a novel method to quantify the blur and ringing artifacts in an image.
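
As a toy illustration of the information a wavelet-domain metric can exploit (our example, not the proposed metric): blur suppresses energy in the detail subbands, so a falling detail-to-total energy ratio indicates a smoother image.

```python
import numpy as np
import pywt  # PyWavelets

def detail_energy_ratio(img: np.ndarray) -> float:
    # one-level 2D DWT; detail subbands carry the high-frequency content
    cA, (cH, cV, cD) = pywt.dwt2(img.astype(float), 'haar')
    detail = sum((c ** 2).sum() for c in (cH, cV, cD))
    return detail / (detail + (cA ** 2).sum())

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(np.roll(sharp, 1, 0), 1, 1)) / 4.0   # crude 2x2 box blur
print(detail_energy_ratio(sharp), ">", detail_energy_ratio(blurred))
```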

IT/IS Outsourcing Relationship Factors in Higher Education Institution: Behavioral Dimensions from Client Perspectives

Higher education institutions are increasingly opting for outsourcing in order to sustain themselves, and this creates a gap in the literature in terms of how they perceive the outsourcing relationship. This paper attempts to identify the behavioral and psychological factors that exist in the engagement, thus providing valuable information to practicing and potential clients and vendors. The determinants were gathered from the previous literature and analyzed to formulate the factors. The study adopts case study and survey approaches, in which interviews and questionnaires were deployed among employees of the IT-related department of a Malaysian higher education institution.

Method to Improve Channel Coding Using Cryptography

A new approach for improving the coding gain in channel coding using the Advanced Encryption Standard (AES) and the Maximum A Posteriori (MAP) algorithm is proposed. The approach uses the avalanche effect of the AES block cipher together with the soft output values of the MAP decoding algorithm. The performance of the proposed approach is evaluated in the presence of Additive White Gaussian Noise (AWGN), and computer simulation results are included for verification.
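
The avalanche property the scheme relies on is easy to observe directly. The snippet below (key, plaintext and ECB mode are arbitrary choices for the demonstration) flips a single plaintext bit and counts how many ciphertext bits change, which is typically about half.

```python
import os
from Crypto.Cipher import AES  # pycryptodome

key = os.urandom(16)
pt = bytearray(os.urandom(16))
c1 = AES.new(key, AES.MODE_ECB).encrypt(bytes(pt))
pt[0] ^= 0x01                               # flip one bit of the plaintext
c2 = AES.new(key, AES.MODE_ECB).encrypt(bytes(pt))

flipped = sum(bin(a ^ b).count('1') for a, b in zip(c1, c2))
print(f"{flipped} of 128 ciphertext bits changed")   # typically around 64
```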

Response of a Residential Building Structure to Technical Seismicity Loads due to Mining Activities

In territories where high-intensity earthquakes are frequent, much attention is paid to solving seismic problems. This paper describes two computational model variants of the structure, based on the finite element method, which differ in the simulation of the subsoil (rigid or elastic). The ANSYS finite element program system was used for the simulations and calculations. Seismic response calculations of the residential building structure were carried out for a loading characterized by an accelerogram, for comparison with the response spectra method.

The Reliability of the Improved e-N Method for Transition Prediction as Checked by the PSE Method

Transition prediction of boundary layers has always been an important problem in fluid mechanics, both theoretically and practically, yet notwithstanding the great effort made by many investigators, there is no satisfactory answer to this problem. The most popular method available is the so-called e-N method, which is heavily dependent on experiments and experience. The author has proposed improvements to the e-N method, so as to reduce its dependence on experiments and experience to a certain extent. One of the key assumptions is that transition occurs whenever the velocity amplitude of the disturbance reaches 1-2% of the free-stream velocity. However, the reliability of this assumption needs to be verified. In this paper, transition prediction on a flat plate is investigated using both the improved e-N method and the parabolized stability equations (PSE) method. The results show that the transition locations predicted by the two methods agree reasonably well with each other under the above assumption. For the supersonic case, the critical velocity amplitude in the improved e-N method should be taken as 0.013, whereas in the subsonic case it should be 0.018; both lie within the 1-2% range.
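
For context, the classical e-N method integrates the linear spatial growth rate of the most unstable disturbance from the neutral point onward; the improved variant discussed here replaces the empirical N-factor threshold with a threshold on the disturbance amplitude itself (0.013 or 0.018 of the free-stream velocity, as above):

```latex
% -alpha_i is the spatial amplification rate from linear stability theory,
% x_0 the neutral point, A_0 the amplitude there
\[
  N(x) = \int_{x_0}^{x} -\alpha_i(\xi)\,\mathrm{d}\xi , \qquad
  \frac{A(x)}{A_0} = e^{N(x)} ,
\]
% transition is declared at the first station where A(x)/U_infinity reaches
% the critical amplitude (0.013 supersonic, 0.018 subsonic in this paper)
```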

Selective Harmonic Elimination of PWM AC/AC Voltage Controller Using Hybrid RGA-PS Approach

Selective harmonic elimination pulse-width modulation techniques offer tight control of the harmonic spectrum of a given voltage waveform generated by a power electronic converter, along with a low number of switching transitions. Traditional optimization methods suffer from various drawbacks, such as prolonged and tedious computational steps and convergence to local optima; thus, the more harmonics there are to be eliminated, the greater the computational complexity and time. This paper presents a novel method for output voltage harmonic elimination and voltage control of PWM AC/AC voltage converters using a hybrid Real-Coded Genetic Algorithm-Pattern Search (RGA-PS) method. The RGA is the primary optimizer, exploited for its global search capabilities; PS is then employed to fine-tune the best solution provided by the RGA in each evolution. The proposed method enables linear control of the fundamental component of the output voltage and complete elimination of its harmonic content up to a specified order. Theoretical studies have been carried out to show the effectiveness and robustness of the proposed method of selective harmonic elimination. The theoretical results are validated through simulation studies using the PSIM software package.
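
The global-then-local structure can be sketched as below. Note the stand-ins: scipy's differential evolution replaces the RGA and Nelder-Mead replaces pattern search, and the harmonic equations are the classic quarter-wave-symmetric inverter form, used purely to illustrate the optimization pattern (the paper's AC/AC converter equations differ).

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

M, target_fund = 5, 0.8                      # 5 angles, fundamental = 0.8 pu
odd = np.array([5, 7, 11, 13])               # harmonics to eliminate

def residuals(alpha):
    signs = (-1.0) ** np.arange(M)           # +,-,+,-,... switching pattern
    h = lambda n: (4 / (n * np.pi)) * np.sum(signs * np.cos(n * alpha))
    r = [h(1) - target_fund] + [h(n) for n in odd]
    penalty = np.sum(np.maximum(0, -np.diff(alpha)))  # keep angles ordered
    return np.sum(np.square(r)) + 10 * penalty

bounds = [(0, np.pi / 2)] * M
best = differential_evolution(residuals, bounds, seed=1, tol=1e-10)  # global
fine = minimize(residuals, best.x, method='Nelder-Mead',             # local
                options={'xatol': 1e-10, 'fatol': 1e-12})
print(np.degrees(np.sort(fine.x)).round(3), fine.fun)
```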

Optimal Current Control of Externally Excited Synchronous Machines in Automotive Traction Drive Applications

The excellent suitability of the externally excited synchronous machine (EESM) for automotive traction drive applications rests on its high efficiency over the whole operating range and the ready availability of its materials. Usually, maximum efficiency is obtained by modelling each individual loss and minimizing the sum of all losses. As a result, the quality of the optimization depends strongly on the precision of the model; moreover, it requires accurate knowledge of the saturation-dependent machine inductances. The present contribution therefore proposes a method to minimize the overall losses of a salient-pole EESM and its inverter in steady-state operation based on measurement data only. Since this method does not require any manufacturer data, it is well suited to automated measurement data evaluation and inverter parametrization. The field-oriented control (FOC) of an EESM provides three current components, i.e., three degrees of freedom (DOF). An analytic minimization of the copper losses in the stator and the rotor (assuming constant inductances) is performed and serves as a first approximation of how to choose the optimal current reference values. After a numeric offline minimization of the overall losses based on measurement data, the results are compared to a control strategy that satisfies cos(φ) = 1.
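
One common constant-inductance formulation of that analytic first step (our rendering, with machine-specific details omitted) minimizes the stator and rotor copper losses subject to a torque constraint over the three current components:

```latex
% copper losses over the three degrees of freedom (i_d, i_q, i_f)
\[
  P_{\mathrm{Cu}} = \tfrac{3}{2} R_s \left( i_d^2 + i_q^2 \right) + R_f\, i_f^2
\]
% subject to the salient-pole EESM torque equation (constant inductances)
\[
  T = \tfrac{3}{2}\, p \left[ L_{md}\, i_f\, i_q
        + \left( L_d - L_q \right) i_d\, i_q \right] ,
\]
% solvable, e.g., with a Lagrange multiplier; the result serves as the
% first approximation before the measurement-based refinement
```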

Array Signal Processing: DOA Estimation for Missing Sensors

Array signal processing involves signal enumeration and source localization. It centers on the ability to fuse the temporal and spatial information captured by sampling, at the sensors of an array, the signals emitted from a number of sources, in order to carry out a specific estimation task: estimating source characteristics (mainly localization) and/or array characteristics (mainly array geometry). Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal; using beamforming, the majority of the received signal energy can be steered toward a chosen direction. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based, high-resolution method for estimating the direction of arrival (DOA). This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by missing sensors among the receiving array elements and by unequal channel gains and phase errors of the receiver.
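
A minimal MUSIC sketch for a uniform linear array is given below (array size, source angles, noise level and search grid are arbitrary choices); deleting a row of the snapshot matrix X emulates a missing sensor and visibly degrades the pseudospectrum.

```python
import numpy as np

M, d, K, T = 8, 0.5, 2, 200                # sensors, spacing (wavelengths),
angles = np.deg2rad([-20.0, 35.0])         # sources, snapshots; true DOAs
n = np.arange(M)[:, None]
A = np.exp(-2j * np.pi * d * n * np.sin(angles))          # steering matrix
rng = np.random.default_rng(0)
S = rng.normal(size=(K, T)) + 1j * rng.normal(size=(K, T))
X = A @ S + 0.1 * (rng.normal(size=(M, T)) + 1j * rng.normal(size=(M, T)))

R = X @ X.conj().T / T                     # sample covariance matrix
w, V = np.linalg.eigh(R)                   # eigenvalues in ascending order
En = V[:, : M - K]                         # noise subspace

grid = np.deg2rad(np.linspace(-90, 90, 721))
a = np.exp(-2j * np.pi * d * n * np.sin(grid))
P = 1.0 / np.linalg.norm(En.conj().T @ a, axis=0) ** 2    # pseudospectrum
idx = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1  # local maxima
top = idx[np.argsort(P[idx])[-K:]]
print("estimated DOAs (deg):", np.sort(np.rad2deg(grid[top])).round(1))
```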

Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

The explosive growth of the World Wide Web has posed a challenging problem for extracting relevant data. Traditional web crawlers focus only on the surface web, while the deep web keeps expanding behind the scenes. Deep web pages are created dynamically as a result of queries posed to specific web databases, and their structure makes it impossible for traditional web crawlers to access deep web content. This paper presents Deep iCrawl, a novel vision-based approach for extracting data from the deep web. Deep iCrawl splits the process into two phases: the first includes query analysis and query translation, and the second covers the vision-based extraction of data from the dynamically created deep web pages. There are several established approaches for the extraction of deep web pages, but the proposed method aims at overcoming their inherent limitations. This paper also aims at comparing the extracted data items and presenting them in the required order.

Accelerating Sparse Matrix Vector Multiplication on Many-Core GPUs

Many-core GPUs provide high computing ability and substantial bandwidth; however, optimizing irregular applications such as SpMV on GPUs remains a difficult but worthwhile task. In this paper, we propose a novel method to improve the performance of SpMV on GPUs. A new storage format called HYB-R is proposed to exploit the GPU architecture more efficiently. When the HYB-R format is created, the COO portion of the matrix is partitioned recursively into an ELL portion and a COO portion, so that as many non-zeros as possible are stored in ELL format. How to partition the matrix is an important problem for the HYB-R kernel, so we also tune the partitioning parameters for higher performance. Experimental results show that our method outperforms the fastest kernel (HYB) in NVIDIA's SpMV library, with speedups of up to 17%.
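
One split step of such a scheme can be sketched in a few lines; HYB-R then reapplies the same split recursively to the COO remainder. The width heuristic below (the median row length) is our own illustrative choice, standing in for the tuned partitioning parameter.

```python
import numpy as np
from scipy.sparse import random as sprandom

def ell_coo_split(csr, width):
    """Rows keep up to `width` non-zeros in a dense ELL block (value and
    column arrays); the overflow goes to COO (row, col, val) triples."""
    n = csr.shape[0]
    ell_val = np.zeros((n, width)); ell_col = np.zeros((n, width), dtype=int)
    coo = []
    for i in range(n):
        lo, hi = csr.indptr[i], csr.indptr[i + 1]
        cols, vals = csr.indices[lo:hi], csr.data[lo:hi]
        k = min(width, len(vals))
        ell_col[i, :k], ell_val[i, :k] = cols[:k], vals[:k]
        coo += [(i, c, v) for c, v in zip(cols[k:], vals[k:])]
    return ell_val, ell_col, coo

A = sprandom(1000, 1000, density=0.01, format='csr', random_state=0)
width = int(np.median(np.diff(A.indptr)))        # median row length heuristic
ell_val, ell_col, coo = ell_coo_split(A, width)
print("non-zeros in ELL:", np.count_nonzero(ell_val), "in COO:", len(coo))
```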

Real-time Performance Study of EPA Periodic Data Transmission

EPA (Ethernet for Plant Automation) resolves the non-deterministic behavior of standard Ethernet and achieves real-time communication by means of a micro-segment topology and a deterministic scheduling mechanism. This paper studies the real-time performance of EPA periodic data transmission from theoretical and experimental perspectives. By analyzing the information transmission characteristics and the EPA deterministic scheduling mechanism, five indicators that can be used to specify the real-time performance of EPA periodic data transmission are presented and investigated: delivery time, time synchronization accuracy, data-sending time offset accuracy, utilization percentage of the configured timeslice, and non-RTE bandwidth. On this basis, the test principles and test methods for these indicators are studied and formulas for the real-time performance of an EPA system are derived. Furthermore, an experimental platform is developed to test the indicators of EPA periodic data transmission in a micro-segment. Based on the analysis and the experiments, methods to improve the real-time performance of EPA periodic data transmission are proposed, including optimizing the network structure, studying a self-adaptive adjustment method for the timeslice, and providing the data-sending time offset accuracy for configuration.

Flood Hazard Mapping in Dikrong Basin of Arunachal Pradesh (India)

Flood zoning studies have become more efficient in recent years because of the availability of advanced computational facilities and the use of Geographic Information Systems (GIS). In the present study, flood-inundated areas were mapped using GIS for the Dikrong river basin of Arunachal Pradesh, India, corresponding to different return periods (2, 5, 25, 50 and 100 years). The developed inundation maps for the 25, 50 and 100 year return period floods were then compared with the corresponding maps developed by conventional methods, as reported in the Brahmaputra Board Master Plan for the Dikrong basin. It was found that the average deviation of the modelled flood inundation areas from the reported inundation areas is below 5% (4.52%), so the modelled inundation areas match the reported maps satisfactorily. Hence, GIS techniques proved successful in extracting the flood inundation extent in a time- and cost-effective manner for the remotely located hilly Dikrong basin, where conducting conventional surveys is very difficult.

A New Method for Detection of Artificial Objects and Materials from Long Distance Environmental Images

The article presents a new method for detecting artificial objects and materials in images of environmental (non-urban) terrain. Our approach uses the hue and saturation (or Cb and Cr) components of the image as the input to a segmentation module based on the mean shift method. The clusters obtained as the output of this stage are processed by a decision-making module in order to find the regions of the image with a significant possibility of representing a human. Although the method will detect various non-natural objects, it is primarily intended and optimized for the detection of humans, i.e., for search and rescue purposes in non-urban terrain where, in normal circumstances, non-natural objects shouldn't be present. Real-world images are used for the evaluation of the method.
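
The segmentation stage can be prototyped in a few lines. In the sketch below the file name, subsampling rate and bandwidth are our own assumptions, and scikit-learn's MeanShift stands in for whichever mean shift implementation the authors used.

```python
import numpy as np
import cv2
from sklearn.cluster import MeanShift

# Cluster pixels by hue and saturation only, ignoring brightness, so that
# uniformly coloured man-made regions form compact clusters.
img = cv2.imread('terrain.jpg')                 # hypothetical input image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
hs = hsv[..., :2].reshape(-1, 2).astype(float)

sub = hs[::50]                                  # subsample pixels for speed
ms = MeanShift(bandwidth=12.0, bin_seeding=True).fit(sub)
labels = ms.predict(hs).reshape(img.shape[:2])  # per-pixel cluster map
print(f"{labels.max() + 1} colour clusters; each region is then passed to "
      "the decision-making module")
```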