Effects of Retaining Skilled Employees on Career Management: A Field Study

Enterprises need a strategic plan to retain their skilled employees and manage their careers, to sustain their existence, to develop growth and leadership qualities, to reach objectives that increase the value of the enterprise, and to remain unaffected by a changing demographic structure. When the long-term career expectations of skilled employees align with the enterprise's interests, the skill management process is directly related to career management. With a long-term plan, enterprises should cover future labor force needs through systematic career development programs and stay prepared for developments at all times. Skill management is regarded as a practice through which career mobility is planned so that skilled employees are prepared for high-level positions. Career planning is the planning of an employee's progress or promotion within the organization for which he works by developing his knowledge, skills, abilities, and motives. From the individual's perspective, career planning means planning one's future: the position one wants to hold, the area in which one wants to work, and the objectives one wants to reach. To contribute to this discussion, this study examines the concept of career management and how it is perceived, in a comparative manner.

Recursive Path-Finding in a Dynamic Maze with a Modified Tremaux Algorithm

Number Link is a Japanese logic puzzle in which pairs of identical numbers are connected by lines. Number Link can be regarded as a dynamic maze with multiple travelers and multiple entries and exits, in which the walls and passages change dynamically as the travelers move. In this paper, we apply Tremaux's algorithm to solve Number Link puzzles of size 8x8, 10x10, and 15x20. The algorithm works well and produces solutions for the 8x8 and 10x10 puzzles. However, solving a 15x20 puzzle requires high processing power and is time-consuming.
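The pair-by-pair search described above can be sketched as a backtracking depth-first search that, in the spirit of Tremaux's algorithm, marks each passage (cell) as it is entered and unmarks it on retreat. The sketch below is a minimal illustration, not the authors' modified implementation, and it omits the full Number Link constraint that every cell must be used.

```python
def solve_numberlink(rows, cols, pairs):
    """pairs: {label: ((r0, c0), (r1, c1))}. Returns a filled board or None.
    Illustrative backtracking sketch, not the authors' algorithm."""
    board = [[None] * cols for _ in range(rows)]
    for label, (a, b) in pairs.items():
        board[a[0]][a[1]] = label
        board[b[0]][b[1]] = label
    labels = sorted(pairs)

    def connect(k, r, c):
        # Path for pair k currently ends at (r, c); try to reach its goal.
        label, goal = labels[k], pairs[labels[k]][1]
        if (r, c) == goal:   # pair k done; start the next pair (if any)
            return k + 1 == len(labels) or connect(k + 1, *pairs[labels[k + 1]][0])
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                if board[nr][nc] is None:
                    board[nr][nc] = label        # mark the passage (Tremaux-style)
                    if connect(k, nr, nc):
                        return True
                    board[nr][nc] = None         # backtrack: unmark on retreat
                elif (nr, nc) == goal:
                    if connect(k, nr, nc):
                        return True
        return False

    return board if connect(0, *pairs[labels[0]][0]) else None
```

The exponential worst case of this backtracking is consistent with the observation above that large boards (15x20) become prohibitively slow.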

Increase of Organization in Complex Systems

Measures of complexity and entropy have not converged to a single quantitative description of the levels of organization of complex systems, yet such a measure is increasingly necessary in all disciplines that study them. To address this problem, starting from one of the most fundamental principles in physics, a new measure of the quantity of organization and the rate of self-organization in complex systems, based on the principle of least (stationary) action, is applied here to a model system: the central processing unit (CPU) of computers. The quantity of organization for several generations of CPUs shows a double-exponential rate of change of organization with time. The exact functional dependence has a fine, S-shaped structure, revealing some of the mechanisms of self-organization. The principle of least action helps to explain the mechanism of increase of organization through quantity accumulation and through constraint and curvature minimization, with an attractor: the least average sum of the actions of all elements over all motions. This approach can help describe, quantify, measure, manage, design, and predict the future behavior of complex systems, to achieve the highest rates of self-organization and so improve their quality. It can be applied to other complex systems in physics, chemistry, biology, ecology, economics, cities, network theory, and other fields where complex systems are present.
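As a rough numerical illustration, one plausible formalization of such a measure (our assumption; the paper's exact definition may differ) is the reciprocal of the average action per element per motion, expressed in units of Planck's constant:

```python
# Hedged sketch: "organization" as the reciprocal of the average action
# per element per motion, in units of Planck's constant h. This is an
# illustrative formalization, not necessarily the paper's exact measure.

H = 6.62607015e-34  # Planck constant, J*s

def organization(total_action_js, n_elements, n_motions):
    """Reciprocal of average action per element per motion, in units of h."""
    avg_action = total_action_js / (n_elements * n_motions)
    return H / avg_action
```

Under this definition, a system that performs the same motions with less total action (less energy-time expended per event) is more organized, which matches the attractor described above.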

A New Method for Multiscale Blind Image Restoration

A new method, based on NormalShrink and a modified version of the Katsaggelos and Lay algorithm, is proposed for multiscale blind image restoration. The method deals with both noise and blur in images. It is shown that NormalShrink gives the highest signal-to-noise ratio (SNR) in the image denoising step. The multiscale blind restoration is divided into two parts: the first part of the paper proposes NormalShrink for image denoising, and the second proposes a modified version of the Katsaggelos and Lay algorithm for blur estimation; the two methods are then combined to achieve multiscale blind image restoration.
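The NormalShrink threshold for a wavelet detail subband is commonly given as T = beta * sigma_n^2 / sigma_y with beta = sqrt(ln(L_k / J)), where sigma_n is the noise estimate, sigma_y the subband standard deviation, L_k the subband size, and J the number of decomposition levels. A minimal sketch of this threshold with soft shrinkage, applied to a synthetic subband (without a full wavelet transform), might look like:

```python
import numpy as np
# Hedged sketch of the NormalShrink threshold on one detail subband.
# The subband here is synthetic; a real pipeline would apply this to
# wavelet coefficients from a multiscale decomposition.

def normal_shrink_threshold(subband, n_scales):
    sigma_n = np.median(np.abs(subband)) / 0.6745      # robust noise estimate
    beta = np.sqrt(np.log(subband.size / n_scales))    # beta = sqrt(ln(L_k / J))
    sigma_y = subband.std()
    return beta * sigma_n ** 2 / max(sigma_y, 1e-12)

def soft_threshold(x, t):
    """Soft shrinkage: shrink magnitudes toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)
```

Because the threshold adapts to each subband's own statistics, noisy detail subbands are shrunk aggressively while structured ones are mostly preserved.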

Reducing Greenhouse Gas Emissions through the Recyclable Material Bank Project of Universities in the Central Region of Thailand

This research studied waste recycled through the Recyclable Material Bank Project at four universities in the central region of Thailand, evaluating the reduction in greenhouse gas emissions compared with landfilling, from July 2012 to June 2013. The results showed that the projects collected a total of about 911,984.80 kilograms of recyclable waste. Office paper accounted for the largest share (50.68% of the total recycled waste). By amount, the recycled-waste groups rank, from high to low, as paper, plastic, glass, mixed recyclables, and metal. The project reduced greenhouse gas emissions by the equivalent of about 2,814.969 metric tons of carbon dioxide. The recycled waste contributing most to this reduction is office paper, at 70.16% of the total reduction. By reduced emissions, the groups rank, from high to low, as paper, plastic, metals, mixed recyclables, and glass.
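The per-material accounting behind such an evaluation can be sketched as mass multiplied by a net avoided-emission factor (landfill emissions minus recycling emissions). The factors and masses below are illustrative placeholders, not the study's values:

```python
# Hedged sketch: ranking recycled-waste groups by avoided CO2-equivalent
# emissions. Both the avoided-emission factors and the collected masses
# are hypothetical illustrations, not the study's data.

avoided_factor_kgco2e_per_kg = {
    "paper": 3.0, "plastic": 1.5, "metal": 4.0,
    "mixed recyclables": 1.0, "glass": 0.3,
}
collected_kg = {
    "paper": 462_000, "plastic": 210_000, "metal": 40_000,
    "mixed recyclables": 120_000, "glass": 89_000,
}

def rank_by_avoided_emissions(masses, factors):
    """Return (material, avoided kgCO2e) pairs sorted high to low."""
    avoided = {k: masses[k] * factors[k] for k in masses}
    return sorted(avoided.items(), key=lambda kv: kv[1], reverse=True)
```

Note that the ranking by avoided emissions need not match the ranking by mass, since the factors differ per material, which is exactly the effect reported above (metal outranks mixed recyclables and glass despite its smaller mass).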

Acoustic Source Localization Based on the Extended Kalman Filter for an Underwater Vehicle with a Pair of Hydrophones

In this study, we consider the special situation in which only a pair of hydrophones on a moving underwater vehicle is available to localize a fixed, distant acoustic source. Triangulation can be used in this situation with two direction-of-arrival (DOA) measurements taken at different locations. Note that the distance between the two locations must be measured; we therefore assume the vehicle sails in a straight line and that the distance moved per unit time is measured continuously. However, the accuracy of triangulation-based localization depends strongly on the accuracy of the DOAs and of the measured distances. We therefore propose another method, based on the extended Kalman filter, which gives a more robust and accurate localization result.
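A bearing-only extended Kalman filter for this setting can be sketched as follows. The state is the fixed source position, the measurement is the DOA from the current vehicle position, and the noise level and geometry are illustrative assumptions rather than the authors' exact formulation:

```python
import numpy as np
# Hedged sketch: bearing-only EKF estimating a fixed source position
# from a vehicle moving in a straight line. Measurement variance R and
# the geometry in the example are assumptions.

def ekf_bearing_only(vehicle_xy, bearings, x0, P0, R=1e-4):
    x, P = np.array(x0, float), np.array(P0, float)
    for (vx, vy), z in zip(vehicle_xy, bearings):
        dx, dy = x[0] - vx, x[1] - vy
        r2 = dx * dx + dy * dy
        h = np.arctan2(dy, dx)                        # predicted bearing
        H = np.array([[-dy / r2, dx / r2]])           # Jacobian of h wrt source
        innov = (z - h + np.pi) % (2 * np.pi) - np.pi # wrap to [-pi, pi)
        S = H @ P @ H.T + R
        K = P @ H.T / S                               # Kalman gain (2x1)
        x = x + (K * innov).ravel()
        P = (np.eye(2) - K @ H) @ P                   # no prediction step:
        # the source is static, so the state transition is the identity.
    return x, P
```

Because the source is static, no process model is needed; each new vantage point along the straight track tightens the range estimate that a single bearing cannot provide.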

A Study of the Effect of Soft Errors on FlexRay-Based Automotive Systems

FlexRay, a communication protocol for automotive control systems, was developed to meet the increasing demands on electronic control units for implementing systems with higher safety and more comfort. In this work, we study the impact of radiation-induced soft errors on a FlexRay-based steer-by-wire system. We injected soft errors into the general-purpose register set of FlexRay nodes to identify the most critical registers and the failure modes of the steer-by-wire system, and to measure the probability distribution of the failure modes when an error occurs in the register file.
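A fault-injection campaign of this kind can be sketched as random single-bit flips in a simulated register file; the register count and width below are assumptions for illustration, not the FlexRay controller's actual layout:

```python
import random
# Hedged sketch of single-bit soft-error injection into a simulated
# register file. An 8-register, 32-bit layout is assumed here purely
# for illustration.

def inject_bit_flip(registers, rng, width=32):
    """Flip one random bit in one random register; return (index, bit)."""
    idx = rng.randrange(len(registers))
    bit = rng.randrange(width)
    registers[idx] ^= 1 << bit
    return idx, bit
```

A campaign repeats this injection over many runs, executes the workload, classifies the outcome (e.g., no effect, wrong output, node silence), and tallies the distribution per register, which is how the critical registers are identified.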

Distribution and Source of PAHs in Surface Sediments of Canon River Mouth, Taiwan

Surface sediment samples were collected from the Canon River mouth, Taiwan, and analyzed for polycyclic aromatic hydrocarbons (PAHs). Total PAH concentrations varied from 337 to 1,252 ng/g dry weight, with a mean of 827 ng/g dry weight. The spatial distribution reveals that the PAH concentration is relatively high in the river mouth region and gradually diminishes toward the harbor region. Diagnostic ratios showed that the likely source of PAHs in the Canon River mouth is petroleum combustion. The toxic equivalent concentrations (TEQcarc) of PAHs varied from 47 to 112 ng TEQ/g dry weight, with higher total TEQcarc values found in the river mouth region. Compared with the US Sediment Quality Guidelines (SQGs), the observed PAH levels at the Canon River mouth were lower than the effects range low (ERL) and would probably not exert adverse biological effects.
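TEQcarc is conventionally computed as a toxic-equivalency-factor (TEF) weighted sum over the carcinogenic PAHs, relative to benzo[a]pyrene. The sketch below uses the widely cited Nisbet and LaGoy TEF scheme; the study's exact factors, and the example concentrations, are assumptions:

```python
# Hedged sketch: TEQcarc as a TEF-weighted sum of carcinogenic PAH
# concentrations (ng/g dry weight). TEFs follow the commonly used
# Nisbet and LaGoy values; the study may have used a different scheme.

TEF = {"BaA": 0.1, "Chr": 0.01, "BbF": 0.1, "BkF": 0.1,
       "BaP": 1.0, "IcdP": 0.1, "DahA": 1.0}

def teq(conc_ng_per_g):
    """Sum TEF[c] * concentration over the carcinogenic PAHs present."""
    return sum(TEF[c] * v for c, v in conc_ng_per_g.items() if c in TEF)
```

Because benzo[a]pyrene carries the reference weight of 1.0, samples dominated by BaP and dibenz[a,h]anthracene yield the highest TEQcarc even at moderate total PAH levels.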

Simulation and Statistical Analysis of Motion Behavior of a Single Rockfall

The impact force of a rockfall is mainly determined by its motion and velocity, which are contingent on the rock shape, slope gradient, height, and surface roughness of the travel path. Precisely calculating the path of a rockfall is essential for effectively minimizing and preventing the damage it causes. Using the Colorado Rockfall Simulation Program (CRSP) as the analysis tool, this research studies the influence of three rock shapes (spherical, cylindrical, and discoidal) and of surface roughness on the path of a single rockfall. The analysis reveals that, in addition to the slope gradient, the geometry of the falling rock and the joint roughness coefficient (JRC) of the slope are the main factors affecting the motion of a rockfall. On a single flat slope, both the rock's bounce height and its velocity increase as the surface gradient increases, with a critical gradient value of 1:m = 1. Bouncing behavior and higher velocities occur more easily when the rock geometry is more oval, whereas a discoidal rock tends to slide and is easily influenced by changes in surface undulation. When JRC
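As a toy illustration of rebound behavior (not the CRSP algorithm), successive bounce heights on a flat surface can be modeled with a normal coefficient of restitution e_n, each rebound retaining e_n squared of the previous drop height:

```python
# Hedged toy model of rockfall rebound decay: each bounce returns the
# rock to a fraction e_n**2 of its previous drop height. Real rockfall
# simulators such as CRSP use far richer shape- and slope-dependent
# restitution models; e_n here is an illustrative parameter.

def rebound_heights(h0, e_n, n_bounces):
    """Successive rebound heights after dropping from h0."""
    h, out = h0, []
    for _ in range(n_bounces):
        h *= e_n ** 2
        out.append(h)
    return out
```

In CRSP-style models the effective restitution depends on rock shape and surface roughness, which is why the oval and discoidal shapes above behave so differently on the same slope.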

Analysis and Application of an Indirect Minimum-Jerk Method for Higher-Order Differential Equations in Dynamic Optimization Systems

Both minimum energy consumption and smoothness, which is quantified as a function of jerk, are needed in many dynamic systems, such as automobiles and pick-and-place robot manipulators that handle fragile equipment. Nevertheless, many researchers focus solely on either minimum energy consumption or the minimum-jerk trajectory. This paper considers an indirect minimum-jerk method for higher-order differential equations in dynamic optimization and proposes a simple yet interesting indirect jerk approach to designing time-dependent systems, yielding an alternative optimal solution. Extremal solutions for the indirect jerk cost functions are found using dynamic optimization methods together with numerical approximation. The case considered is a linear model of a simple mass-spring-damper system: two masses connected by springs. The boundary conditions fix the end time and end point. The higher-order differential equation is solved by Galerkin's weighted-residual method. As a result, the sixth-order formulation shows the faster solution time.
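For reference, the classic closed-form minimum-jerk point-to-point trajectory between rest states (a standard result, not the paper's two-mass Galerkin solution) is the quintic x(tau) = x0 + (xf - x0)(10 tau^3 - 15 tau^4 + 6 tau^5):

```python
# Standard closed-form minimum-jerk trajectory between rest states
# (zero velocity and acceleration at both ends). This is the textbook
# single-point result, not the paper's two-mass Galerkin solution.

def min_jerk(x0, xf, T, t):
    """Position at time t on the minimum-jerk path from x0 to xf over [0, T]."""
    tau = t / T
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return x0 + (xf - x0) * s
```

The polynomial's boundary conditions (zero velocity and acceleration at both ends) are exactly the fixed end-time, fixed end-point constraints described above; higher-order systems require the numerical Galerkin treatment because no such simple closed form exists.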

Automatic Segmentation of Lung Areas in Magnetic Resonance Images

Segmenting the lungs in medical images is a challenging and important task for many applications. In particular, automatic segmentation of lung cavities from multiple magnetic resonance (MR) images is very useful for oncological applications such as radiotherapy treatment planning. However, distinguishing the lung areas is not trivial due to large variations in lung shape, low contrast, and poorly defined boundaries. In this paper, we address the lung segmentation problem in pulmonary magnetic resonance images and propose an automated method based on a robust region-aided geometric snake that incorporates a modified diffused region force into the standard geometric model definition. The extra region force gives the snake a global, complementary view of the lung boundary information within the image, which, along with the local gradient flow, helps detect fuzzy boundaries. The proposed method has successfully segmented the lungs in every slice of 30 magnetic resonance images, each with 80 consecutive slices. We present results comparing our automatic method to lung cavities manually segmented by an expert radiologist and to those of previous works, showing encouraging results and high robustness of our approach.
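The idea of combining a region force with edge information can be illustrated with a Chan-Vese-style speed term: the evolution speed mixes an edge (gradient) term with a region term comparing each pixel's intensity against the means inside and outside the current contour. This toy sketch is a stand-in for the idea, not the paper's diffused-region-force model:

```python
import numpy as np
# Hedged toy sketch of a region-aided speed field for a level-set snake.
# The region term is Chan-Vese-like (swapped in for illustration); the
# paper's actual diffused region force is different.

def region_aided_speed(image, phi, alpha=0.5):
    """phi < 0 marks the current 'inside'; returns a per-pixel speed."""
    gy, gx = np.gradient(image)
    edge = 1.0 / (1.0 + gx**2 + gy**2)          # slow evolution at strong edges
    inside = phi < 0
    c_in = image[inside].mean() if inside.any() else 0.0
    c_out = image[~inside].mean() if (~inside).any() else 0.0
    region = (image - c_in) ** 2 - (image - c_out) ** 2   # region competition
    return alpha * region + (1 - alpha) * edge
```

The region term stays informative even where the gradient vanishes, which is the same role the diffused region force plays for fuzzy lung boundaries.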

Investigation on Feature Extraction and Classification of Medical Images

In this paper we present an in-depth study of biomedical images, tagging them with basic extracted features (e.g., color, pixel value). Classification is performed using a nearest-neighbor classifier with various distance measures, as well as automatic combination of classifier results. The process selects a subset of relevant features from a group of image features; it also helps to build a better understanding of the image by identifying which features are important. Accuracy can be improved by increasing the number of selected features. Various classification methods have evolved for medical images, such as the support vector machine (SVM), used here for classifying bacterial types; ant colony optimization, applied for optimal results, with high approximation capability and much faster convergence; and texture feature extraction based on Gabor wavelets.
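The nearest-neighbor step with a pluggable distance measure can be sketched as follows; the feature vectors in the example are illustrative, not the study's medical-image features:

```python
import math
from collections import Counter
# Hedged sketch of a k-nearest-neighbor classifier with a swappable
# distance function, the classification scheme described above.

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3, dist=euclidean):
    """train: list of (feature_vector, label). Majority vote among k nearest."""
    nearest = sorted(train, key=lambda fl: dist(fl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

Alternative distance measures (city-block, cosine, and so on) can be passed via `dist`, and combining the votes of several such classifiers gives the automatic classifier combination mentioned above.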

Crystalline Graphene Nanoribbons with Atomically Smooth Edges via a Novel Physico-Chemical Route

A novel physico-chemical route to produce few-layer graphene nanoribbons with atomically smooth edges is reported, via acid treatment (H2SO4:HNO3) followed by characteristic thermal shock processes involving extremely cold substances. Samples were studied by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD), Raman spectroscopy, and X-ray photoelectron spectroscopy (XPS). The method demonstrates the importance of open-ended nanotubes for efficient, uniform unzipping along the nanotube axis. The resulting nanoribbons are approximately 210 nm wide on average and consist of a few layers, as observed by TEM; high-resolution TEM shows that they exhibit different chiralities. This method provides graphene nanoribbons with atomically smooth edges that could be used in various applications, including sensors, gas adsorption materials, and composite fillers, among others.

A New High Speed Neural Model for Fast Character Recognition Using Cross Correlation and Matrix Decomposition

Neural processors have shown good results for detecting a given character in an input matrix. In this paper, a new idea to speed up the operation of neural processors for character detection is presented. Such processors are designed based on cross-correlation in the frequency domain between the input matrix and the weights of the neural networks. The approach developed here reduces the computation steps these fast neural networks require for the search process. The principle of divide and conquer is applied through image decomposition: each image is divided into small sub-images, and each is tested separately by a single fast neural processor. Further speed-up in character detection is obtained by using parallel processing techniques to test the resulting sub-images at the same time with the same number of fast neural networks. In contrast to using fast neural processors alone, the speed-up ratio increases with the size of the input image when fast neural processors are combined with image decomposition. Moreover, the problem of local sub-image normalization in the frequency domain is solved, and the effect of image normalization on the speed-up ratio of character detection is discussed. Simulation results show that local sub-image normalization through weight normalization is faster than sub-image normalization in the spatial domain; the overall speed-up ratio of the detection process increases further because the weight normalization is done offline.
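The core frequency-domain operation is 2-D cross-correlation via the FFT. A minimal sketch on synthetic data (not the full neural-processor pipeline, whose weights and normalization are not reproduced here) is:

```python
import numpy as np
# Hedged sketch: circular 2-D cross-correlation via the FFT, the core
# operation behind the fast frequency-domain neural processors. The
# image and template here are synthetic.

def xcorr_fft(image, template):
    """Circular cross-correlation of image with template via the FFT."""
    F = np.fft.fft2(image)
    G = np.fft.fft2(template, s=image.shape)   # zero-pad template to image size
    return np.real(np.fft.ifft2(F * np.conj(G)))
```

A single FFT-based correlation replaces the sliding-window dot products, and decomposing the image into sub-images lets several such correlations run in parallel, which is the source of the speed-up ratios discussed above.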

Genetic Algorithm-Based Optimization Approach for Fuzzy Modeling of MR Dampers

The magneto-rheological (MR) fluid damper is a semi-active control device that has recently received increased attention from the vibration control community, but its inherently hysteretic and highly nonlinear dynamics make it challenging to exploit its unique characteristics. Combinations of artificial neural networks (ANN) and fuzzy logic systems (FLS) have been used to imitate the behavior of this device more precisely. However, the derivative-based nature of adaptive networks causes some deficiencies. Therefore, this paper proposes a novel approach that employs a genetic algorithm, as a derivative-free algorithm, to enhance the capability of fuzzy systems. The proposed method is used to model an MR damper, and the results are compared with an adaptive neuro-fuzzy inference system (ANFIS) model, one of the well-known approaches in the soft computing framework, and with two of the best parametric MR damper models. Data are generated from a benchmark program by applying a number of well-known earthquake records.
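A derivative-free, real-coded genetic algorithm of the kind used to tune fuzzy-model parameters can be sketched as follows. The objective here is a toy least-squares function standing in for the fuzzy model's fitting error, not the MR-damper model itself:

```python
import random
# Hedged sketch of a simple real-coded GA (selection, averaging
# crossover, Gaussian mutation, elitism). In the paper's setting the
# chromosome would encode fuzzy membership parameters and the fitness
# would be the model's error against MR-damper data.

def ga_minimize(fitness, dim, pop=30, gens=60, rng=None):
    rng = rng or random.Random(0)
    P = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness)
        elite = P[: pop // 2]                    # keep the better half
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 + rng.gauss(0, 0.1) for x, y in zip(a, b)]
            children.append(child)
        P = elite + children
    return min(P, key=fitness)
```

Because no gradients are used, the search is unaffected by the non-differentiable, hysteretic terms that trouble derivative-based adaptive networks.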

Effects of Allelochemical Gramine on Photosynthetic Pigments of Cyanobacterium Microcystis aeruginosa

The toxic, bloom-forming cyanobacterium Microcystis aeruginosa was exposed to the antialgal allelochemical gramine (0, 0.5, 1, 2, 4, and 8 mg·L-1). The effects of gramine on photosynthetic pigments (lipid-soluble: chlorophyll a and β-carotene; water-soluble: phycocyanin, allophycocyanin, phycoerythrin, and total phycobilins) and on absorption spectra were studied in order to identify the most sensitive pigment probe and thereby the crucial suppression site in the photosynthetic apparatus. The results indicated that all pigment parameters decreased with increasing gramine concentration and longer exposure time. This pronounced bleaching of pigments was also reflected in the scanned absorption spectra. Phycoerythrin exhibited the highest sensitivity to the added gramine, showing the largest relative decrease. It was concluded that gramine seriously impairs algal photosynthetic activity by destroying photosynthetic pigments, and that the high sensitivity of phycoerythrin may be attributed to its position on the outside of the phycobilin complex.

Optical Fish Tracking in Fishways using Neural Networks

One of the main issues in computer vision is extracting the movement of one or several points or objects of interest in an image or video sequence in order to conduct some kind of study or control process. Techniques for this problem have been applied in numerous areas, such as surveillance systems, traffic analysis, motion capture, image compression, and navigation systems, where the specific characteristics of each scenario determine the approach to the problem. This paper puts forward a computer vision algorithm to analyze fish trajectories under the highly turbulent conditions of artificial structures called vertical slot fishways, which are designed to allow the upstream migration of fish past obstructions in rivers. The algorithm calculates the position of the fish at every instant from images recorded with a camera, using neural networks to perform fish detection in the images. Laboratory tests have been carried out in a full-scale fishway model with live fish, allowing the reconstruction of fish trajectories and the measurement of fish velocities and accelerations. These data can provide useful information for designing more effective vertical slot fishways.
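Once per-frame positions are available, velocities (and, by applying the same operation to the velocities, accelerations) can be recovered by central finite differences; the positions and frame interval below are illustrative, not measured data:

```python
# Hedged sketch: recovering per-frame velocities from a tracked
# trajectory by central finite differences. Positions and the frame
# interval dt are illustrative.

def derivatives(positions, dt):
    """positions: list of (x, y) per frame. Returns central-difference
    velocities for the interior frames."""
    v = []
    for i in range(1, len(positions) - 1):
        (x0, y0), (x1, y1) = positions[i - 1], positions[i + 1]
        v.append(((x1 - x0) / (2 * dt), (y1 - y0) / (2 * dt)))
    return v
```

In practice the detected positions would be smoothed first, since differentiation amplifies the detection noise of the neural-network tracker.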

Determining Threshold Levels of Burst-by-Burst AQAM/CDMA in Slow Rayleigh Fading Environments

In this paper, we determine the threshold levels of adaptive modulation in a burst-by-burst CDMA system using a suboptimal method that attempts to increase the average bits-per-symbol (BPS) rate of the transceiver by switching between modulation modes as the channel condition varies. In this method, we choose the minimum values of the average bit error rate (BER) and the maximum values of the average BPS over different values of the average channel signal-to-noise ratio (SNR), and then calculate the corresponding threshold levels, so that when the instantaneous SNR increases, a higher-order modulation is employed to increase throughput, and, vice versa, when the instantaneous SNR decreases, a lower-order modulation is employed to improve the BER. In the transmission step, this adaptive modulation scheme compares the channel estimate obtained from pilot symbols against the set of suboptimal threshold levels and chooses one of the states no transmission, BPSK, 4QAM, or square 16QAM for modulating the data. The assumed channel is a slow Rayleigh fading channel.
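Mode selection against a set of SNR thresholds can be sketched as follows; the threshold values are illustrative placeholders, not the suboptimal levels derived in the paper:

```python
# Hedged sketch of burst-by-burst AQAM mode selection. The switching
# thresholds (in dB) are hypothetical placeholders; the paper derives
# them from the average BER/BPS trade-off.

MODES = [("no transmission", 0), ("BPSK", 1), ("4QAM", 2), ("16QAM", 4)]
THRESHOLDS_DB = [3.0, 9.0, 16.0]   # hypothetical switching levels

def select_mode(snr_db):
    """Return (mode name, bits per symbol) for the instantaneous SNR estimate."""
    idx = sum(snr_db >= t for t in THRESHOLDS_DB)
    return MODES[idx]
```

Each burst's pilot-based SNR estimate is compared against the thresholds, so throughput rises in good channel states while deep fades fall back to robust modes or silence.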

A Meta-Heuristic Algorithm for Vertex Covering Problem Based on Gravity

A new meta-heuristic approach, the randomized gravitational emulation search algorithm (RGES), has been designed for solving vertex cover problems. The algorithm is founded on introducing randomization along with two of the four primary parameters of physics, velocity and gravity. A new heuristic operator is introduced in RGES to maintain feasibility specifically for the vertex cover problem and to yield the best solutions. The performance of the algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results show that the RGES-based heuristic is capable of producing high-quality solutions, and its performance compares excellently with existing heuristic algorithms in terms of solution quality.
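The feasibility concern specific to vertex cover (every edge must have at least one covered endpoint) can be sketched with a check-and-repair operator of the kind a meta-heuristic might apply to candidate solutions; this is an illustration, not the RGES operator itself:

```python
# Hedged sketch: feasibility check and greedy repair for vertex cover
# candidates, as a repair operator inside a meta-heuristic might do.
# This is not the RGES feasibility operator.

def is_cover(edges, cover):
    """True if every edge has at least one endpoint in the cover."""
    return all(u in cover or v in cover for u, v in edges)

def repair(edges, cover):
    """Greedily add an endpoint for each uncovered edge."""
    cover = set(cover)
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
    return cover
```

Keeping every candidate feasible in this way lets the search compare solutions purely by cover size, which is the quality measure reported on the OR-Library benchmarks.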

Inductance Characteristics of Annealed Titanium Dioxide on a Silicon Substrate

The control of the oxygen flow rate via a mass flow controller during the growth of titanium dioxide in a DC plasma sputtering system is studied. The impedance of the TiO2 films, and hence their inductance effect, is influenced by the annealing time and the oxygen flow rate: the longer the annealing time, the larger the inductance of the TiO2 film. The growth conditions yielding the optimum, maximum inductance for a TiO2 film serving as a sensing device are an oxygen flow rate of 15 sccm and a long annealing time. The large inductance of the TiO2 film will be exploited to fabricate biosensors with high sensing sensitivity for biological applications.