Frequency and Amplitude Measurement of a Vibrating Object in Water Using Ultrasonic Speckle Technique

The principle of frequency and amplitude measurement of a vibrating object in water using the ultrasonic speckle technique is presented in this paper. Unlike traditional techniques, the ultrasonic speckle technique can measure, without contact, the vibration of a nonmetallic object with a rough surface in water. The relationship between speckle movement and object movement was analyzed. Based on this analysis, an ultrasonic speckle measurement system was set up, and with it the frequency and amplitude of an underwater vibrating cantilever beam were measured. The results show that the experimental data are in good agreement with the calibration data.
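The derived relationship between speckle movement and object movement is not reproduced in the abstract. As an illustration only, the sketch below estimates the lateral shift between two 1-D speckle records by cross-correlation, a generic speckle-tracking approach; the signal names and the sampling pitch are assumptions, not values from the paper.

    import numpy as np

    def speckle_shift(ref, cur, pitch=0.1):
        """Estimate the lateral shift (in mm) between two 1-D speckle records.

        ref, cur : equal-length arrays of speckle intensity
        pitch    : spatial sampling interval in mm (assumed value)
        """
        ref = ref - ref.mean()
        cur = cur - cur.mean()
        corr = np.correlate(cur, ref, mode="full")   # cross-correlation
        lag = np.argmax(corr) - (len(ref) - 1)       # lag of the correlation peak
        return lag * pitch

    # Toy usage: a known shift of 5 samples is recovered.
    rng = np.random.default_rng(0)
    ref = rng.standard_normal(512)
    cur = np.roll(ref, 5)
    print(speckle_shift(ref, cur))   # -> 0.5 mm for a 0.1 mm pitch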

Optimization of the Characteristic Straight Line Method by a "Best Estimate" of Observed, Normal Orthometric Elevation Differences

In this paper, to optimize the "Characteristic Straight Line Method" used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of "height", is discussed in detail. In landslide dynamic analysis, the soil is considered a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the "Characteristic Straight Line Method", whose characteristic components have been defined and constructed from a "best estimate" of the topometric observations. In the measurement of elevation differences, we used the most modern leveling equipment available, and observational procedures were designed to provide the most effective way to acquire data. In addition, systematic errors that cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by nonhorizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of the air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; and the rod scale correction ensures a uniform scale conforming to the international length standard. The concept of height systems is also introduced, in which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method": it permits the evaluation of displacements of very small, even infinitesimal, magnitude. The inclination of the landslide is given by the inverse of the distance from reference point O to the "Characteristic Straight Line", and its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevation of carefully selected points before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test in an area where very interesting land surface deformation occurs are reported, along with monitoring under different options and a qualitative comparison of results based on a sufficient number of check points.
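The corrections listed above follow standard precise-leveling practice; the expressions below are a hedged sketch of those standard forms, written with generic symbols rather than the paper's notation.

    \begin{align*}
      \Delta h_{\mathrm{coll}} &= -\,c\left(\sum S_{B} - \sum S_{F}\right), & &c:\ \text{collimation error per unit sight length};\ S_B, S_F:\ \text{back-/foresight lengths},\\
      \Delta h_{T} &= \alpha\,(T_m - T_0)\,\Delta h_{\mathrm{obs}}, & &\alpha:\ \text{thermal expansion coefficient of the Invar strip},\\
      \Delta h_{s} &= (s - 1)\,\Delta h_{\mathrm{obs}}, & &s:\ \text{calibrated rod scale factor}.
    \end{align*}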

MDA of Hexagonal Honeycomb Plates used for Space Applications

The purpose of this paper is to perform a multidisciplinary design and analysis (MDA) of honeycomb panels used in satellite structural design. All the analysis is based on clamped-free boundary conditions. In the present work, detailed finite element models for honeycomb panels are developed and analysed. Experimental tests were carried out on a honeycomb specimen in order to validate the modal analysis made by the finite element method as well as the existing equivalent approaches. The obtained results show good agreement between the finite element, equivalent-model, and test results; the difference in the first two frequencies is less than 4%, and less than 10% for the third frequency. The results of the equivalent model presented in this analysis are obtained with good accuracy. Moreover, the investigations carried out in this research address the honeycomb plate modal analysis under several aspects, including the influence of geometrical variation, by studying the effects of the dimension parameters on the modal frequencies, and the variation of the core and skin materials of the honeycomb. The various results obtained in this paper are promising and show that the geometry parameters and the type of material have an effect on the honeycomb plate modal frequencies.

The Relationship between Burnout, Negative Affectivity and Organizational Citizenship Behavior for Human Services Employees

The purpose of this study was to explore the relationship between Burnout, Negative Affectivity, and Organizational Citizenship Behavior (OCB) for social service workers at two agencies serving homeless populations. Thirty-two subjects completed surveys. Significant correlations between the major variables and subscales were found.

Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels

Automatic segmentation of skin lesions is the first step towards the development of a computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for the melanoma application. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which is able to determine the optimal color channel for segmentation of skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images and a comprehensive evaluation of the results is provided, in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation is carried out by applying three previously used metrics of accuracy, sensitivity, and specificity, and a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which result in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, which demonstrates the effectiveness and superiority of the proposed segmentation method.
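The paper's channel selection and clustering-based thresholding are not reproduced here; as a minimal illustration of histogram thresholding on a single color channel, the sketch below applies an Otsu-style threshold (a stand-in, not the authors' algorithm) and keeps the darker region as the lesion. The channel is assumed to be scaled to [0, 255].

    import numpy as np

    def otsu_threshold(channel, bins=256):
        """Histogram-based threshold on one color channel scaled to [0, 255]."""
        hist, edges = np.histogram(channel.ravel(), bins=bins, range=(0, 255))
        p = hist.astype(float) / hist.sum()
        centers = 0.5 * (edges[:-1] + edges[1:])
        best_t, best_var = edges[1], -1.0
        for t in range(1, bins):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            m0 = (p[:t] * centers[:t]).sum() / w0
            m1 = (p[t:] * centers[t:]).sum() / w1
            var_between = w0 * w1 * (m0 - m1) ** 2   # between-class variance
            if var_between > best_var:
                best_t, best_var = edges[t], var_between
        return best_t

    def segment_lesion(channel):
        """Binary lesion mask: pixels darker than the histogram threshold."""
        return channel < otsu_threshold(channel)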

A Novel Machining Signal Filtering Technique: Z-notch Filter

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor, and the machine environment. By correlating the noise components with the measured machining signal, the components of interest in the measured machining signal, which are less affected by the noise, can be extracted. Thus, the filtered signal is more reliable for analysis, in terms of noise content, than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal: a larger scattering space and a higher value of Z∞ indicate that the signal is more heavily contaminated by noise. The method can therefore be utilised as a proactive tool for evaluating the noise content of a signal, and such evaluation is as important as noise elimination, especially for machining fault diagnosis. The Z-notch filtering technique was reliable in extracting the noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between the cutting tool and the workpiece could still be recovered. Therefore, noise that could alter the original signal features and consequently deteriorate the useful sensory information can be eliminated.
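The Z-notch filter itself is not specified in the abstract; the sketch below uses SciPy's standard IIR notch as a stand-in to show how a single identified noise frequency (here an assumed 50 Hz hydraulic/motor component) can be removed from a machining signal before analysis. The sampling rate, noise frequency, and cutting-frequency component are all assumptions.

    import numpy as np
    from scipy.signal import iirnotch, filtfilt

    fs = 10_000.0          # sampling rate in Hz (assumed)
    f_noise = 50.0         # identified noise frequency in Hz (assumed)

    # Synthetic machining signal: a 1.2 kHz cutting component plus 50 Hz noise.
    t = np.arange(0, 1.0, 1 / fs)
    signal = np.sin(2 * np.pi * 1200 * t) + 0.8 * np.sin(2 * np.pi * f_noise * t)

    # Narrow notch at the identified noise frequency (Q controls the notch width).
    b, a = iirnotch(w0=f_noise, Q=30.0, fs=fs)
    filtered = filtfilt(b, a, signal)   # zero-phase filtering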

A Computationally Efficient Design for Prototype Filters of an M-Channel Cosine Modulated Filter Bank

The paper discusses a computationally efficient method for the design of the prototype filters required for the implementation of an M-band cosine modulated filter bank. The prototype filter is formulated as an optimum interpolated FIR (IFIR) filter, using the optimum interpolation factor that requires the minimum number of multipliers. The model filter as well as the image suppressor are designed using the Kaiser window. The method optimizes a single parameter, namely the cutoff frequency, to minimize the distortion in the overlapping passband.
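The interpolated-FIR optimization is not reproduced here. As a minimal sketch under assumed specifications (number of channels M, prototype length, and Kaiser beta are all placeholders), a Kaiser-window lowpass prototype is designed with SciPy and then turned into the M analysis filters using one common form of the cosine-modulation formula.

    import numpy as np
    from scipy.signal import firwin

    M = 8                  # number of channels (assumed)
    N = 8 * M              # prototype length (assumed)

    # Lowpass prototype with cutoff pi/(2M) rad/sample, Kaiser window.
    p = firwin(N, 1.0 / (2 * M), window=("kaiser", 6.0))

    # Cosine modulation of the prototype into M analysis filters.
    n = np.arange(N)
    h = np.array([
        2 * p * np.cos(np.pi / M * (k + 0.5) * (n - (N - 1) / 2)
                       + (-1) ** k * np.pi / 4)
        for k in range(M)
    ])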

Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

This paper deals with the localization of wideband sources. We develop a new approach for estimating the wideband source parameters. The method is based on the higher-order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; in fact, the sea-bottom noise is regarded as Gaussian. Thanks to the coherent signal subspace algorithm based on the cumulant matrix of the received data, instead of the cross-spectral matrix, the wideband correlated sources are accurately localized in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
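The construction of the cumulant matrix used in the coherent signal subspace algorithm is not detailed in the abstract. The sketch below only shows the standard sample estimate of a fourth-order cumulant of zero-mean, real hydrophone records, the quantity that vanishes for Gaussian inputs and therefore suppresses the Gaussian sea-bottom noise; the array names are placeholders.

    import numpy as np

    def cum4(x1, x2, x3, x4):
        """Sample fourth-order cumulant of four zero-mean, real records.

        cum(x1,x2,x3,x4) = E[x1 x2 x3 x4] - E[x1 x2]E[x3 x4]
                           - E[x1 x3]E[x2 x4] - E[x1 x4]E[x2 x3]
        """
        m = lambda a, b: np.mean(a * b)
        return (np.mean(x1 * x2 * x3 * x4)
                - m(x1, x2) * m(x3, x4)
                - m(x1, x3) * m(x2, x4)
                - m(x1, x4) * m(x2, x3))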

Challenges to Technological Advancement in Economically Weak Countries: An Assessment of the Nigerian Educational Situation

Nigeria is considered one of the many countries in sub-Saharan Africa with a weak economy and gross deficiencies in technology and engineering. Available data from international monitoring and regulatory organizations show that technology is pivotal in determining the economic strength of nations all over the world. Education is critical to technology acquisition, development, dissemination and adaptation. Thus, this paper seeks to critically assess and discuss the issues and challenges facing technological advancement in Nigeria, particularly in the education sector, and also proffers solutions for resuscitating the Nigerian education system towards achieving national technological and economic sustainability, such that Nigeria can compete favourably with other technologically driven economies of the world in the not-too-distant future.

FIR Filter Design via Linear Complementarity Problem, Messy Genetic Algorithm, and Ising Messy Genetic Algorithm

In this paper the design of maximally flat linear-phase finite impulse response (FIR) filters is considered. The problem is handled with two totally different approaches. The first is a completely deterministic numerical approach in which the problem is formulated as a Linear Complementarity Problem (LCP). The other is based on a combination of the Markov Random Field (MRF) approach with the messy genetic algorithm (MGA). Markov Random Fields are a class of probabilistic models that have been applied for many years to the analysis of visual patterns or textures. Our objective is to establish MRFs as an interesting approach to modeling messy genetic algorithms. We establish a theoretical result that every genetic algorithm problem can be characterized in terms of an MRF model. This allows us to construct an explicit probabilistic model of the MGA fitness function and to introduce the Ising MGA. Experiments with the Ising MGA are less costly than those with the standard MGA since far fewer computations are involved, and the LCP requires the fewest computations of all. Results of the LCP, random search, random seeded search, MGA, and Ising MGA are discussed.

Possible Futures for Doctoral Research Training in Design

In this paper, we argue that Design research is basic to countries' national productivity and competition agendas, at the same time that the vagaries of research training present one of the barriers faced by Design Higher Degree by Research students in engaging those agendas. We argue that, given industry requirements for research-trained recruits, students have the right to expect that research training will provide the foundations of a successful career on an academic or research pathway or a professional pathway, but that universities have yet to address problems in their provision of research training for Design doctoral students. We suggest that, to facilitate this, rigorous research conducted on the provision of doctoral programs in Design would serve to inform future activities in Design research in productive ways.

Analytical Model Prediction: Micro-Cutting Tool Forces with the Effect of Friction on Machining Titanium Alloy (Ti-6Al-4V)

In this paper, a methodology for a model that predicts tool forces in oblique machining is introduced by adopting the orthogonal technique. The analytical calculation applied is largely based on the DeVries model, with parts of the methodology adopted from the Armarego-Brown model. Model validation is performed by comparing experimental data with the predicted results for machining of titanium alloy (Ti-6Al-4V) from a micro-cutting tool perspective. Good agreement with the experiments is observed. A detailed friction formulation that affects the tool forces has also been examined, with reasonable results obtained.

An On-chip LDO Voltage Regulator with Improved Current Buffer Compensation

A fully on-chip low drop-out (LDO) voltage regulator with a 100 pF output load capacitor is presented. A novel frequency compensation scheme using a current buffer is adopted to realize a single dominant pole within the unity-gain frequency of the regulation loop; the phase margin (PM) is at least 50 degrees over the full range of load current, and the power supply rejection (PSR) characteristic is improved compared with conventional Miller compensation. In addition, a differentiator provides a high-speed path during load current transients. Implemented in 0.18 μm CMOS technology, the LDO voltage regulator provides 100 mA of load current with a stable 1.8 V output voltage while consuming 80 μA of quiescent current.

Genetic Algorithm Parameters Optimization for Bi-Criteria Multiprocessor Task Scheduling Using Design of Experiments

Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proven to be an excellent technique for finding an optimal solution. In the past, several GA-based methods have been proposed for this problem, but all of them consider a single criterion. In the present work, minimization of a bi-criteria objective for the multiprocessor task scheduling problem is considered, namely the weighted sum of makespan and total completion time. The efficiency and effectiveness of the genetic algorithm can be improved by optimizing its parameters, such as the crossover and mutation operators, crossover probability, and selection function. The effects of the GA parameters on the minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been determined using the central composite design (CCD) approach of response surface methodology (RSM) from Design of Experiments. The experiments were performed at different levels of the GA parameters, and analysis of variance was performed to identify the parameters significant for minimizing makespan and total completion time simultaneously.
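The GA encoding, the weighting, and the parameter levels are not given in the abstract. As a sketch under assumed representations, the function below evaluates a weighted-sum bi-criteria objective (makespan plus total completion time) for a chromosome that simply assigns each task to a processor, ignoring precedence constraints; the weight of 0.5 and the toy data are assumptions.

    import numpy as np

    def bicriteria_fitness(assignment, proc_times, w=0.5):
        """Weighted-sum objective for a simple multiprocessor schedule.

        assignment : assignment[i] = processor index of task i (the GA chromosome)
        proc_times : processing time of each task
        w          : weight between makespan and total completion time (assumed 0.5)
        """
        finish = np.zeros(assignment.max() + 1)   # current finish time per processor
        total_completion = 0.0
        for task, p in enumerate(assignment):
            finish[p] += proc_times[task]          # task completes on its processor
            total_completion += finish[p]
        makespan = finish.max()
        return w * makespan + (1 - w) * total_completion

    # Toy usage with a random chromosome for 10 tasks on 3 processors.
    rng = np.random.default_rng(0)
    chromosome = rng.integers(0, 3, size=10)
    times = rng.uniform(1.0, 5.0, size=10)
    print(bicriteria_fitness(chromosome, times))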

Decomposition of Graphs into Induced Paths and Cycles

A decomposition of a graph G is a collection ψ of subgraphs H1, H2, ..., Hr of G such that every edge of G belongs to exactly one Hi. If each Hi is either an induced path or an induced cycle in G, then ψ is called an induced path decomposition of G. The minimum cardinality of an induced path decomposition of G is called the induced path decomposition number of G and is denoted by πi(G). In this paper we initiate a study of this parameter.
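As a quick illustration of the parameter (an observation for orientation, not a result from the paper): every single edge is itself an induced path, so a decomposition into induced paths and cycles always exists, while a path or a cycle decomposes into itself, giving

    \[
      \pi_i(G) \le |E(G)| \quad \text{for every graph } G,
      \qquad \pi_i(P_n) = \pi_i(C_n) = 1 .
    \]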

An Experimental and Numerical Investigation on Gas Hydrate Plug Flow in the Inclined Pipes and Bends

Gas hydrates can agglomerate and block multiphase oil and gas pipelines when water is present at hydrate-forming conditions. The aim of "Cold Flow Technology" is to condition gas hydrates so that they can be transported as a slurry mixture without risk of agglomeration. During a pipeline shutdown, however, hydrate particles may settle in bends and build hydrate plugs. An experimental setup has been designed and constructed to study the flow of such plugs during start-up operations. Experiments have been performed using a model fluid and model hydrate particles. The propagation of the initial plugs through a bend was recorded with impedance probes along the pipe. The experimental results show a dispersion of the plug front, and a peak in pressure drop was recorded as the plugs passed through the bend. The evolution of the plugs has been simulated by numerical integration of the incompressible mass balance equations with an imposed mixture velocity. The slip between the particles and the carrier fluid has been calculated using a drag relation together with a particle-fluid force balance.
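The authors' drag relation and force balance are not given in the abstract. The sketch below computes a terminal slip velocity for a single hydrate-like particle from the standard balance of buoyant weight against drag, with the Schiller-Naumann correlation as an assumed closure; all property values are placeholders, not data from the experiments.

    import numpy as np

    def slip_velocity(d, rho_p, rho_f, mu, g=9.81, tol=1e-8):
        """Terminal slip velocity from (rho_p - rho_f) g V = 0.5 C_D rho_f v^2 A."""
        v = 1e-3                                    # initial guess, m/s
        for _ in range(200):
            re = max(rho_f * abs(v) * d / mu, 1e-12)
            # Schiller-Naumann drag coefficient (assumed closure).
            cd = 24.0 / re * (1.0 + 0.15 * re**0.687) if re < 1000 else 0.44
            v_new = np.sqrt(4.0 * g * d * abs(rho_p - rho_f) / (3.0 * cd * rho_f))
            if abs(v_new - v) < tol:
                break
            v = v_new
        return v_new

    # Example with placeholder particle and carrier-fluid properties.
    print(slip_velocity(d=1e-3, rho_p=920.0, rho_f=870.0, mu=5e-3))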

A Novel Hybrid Mobile Agent Based Distributed Intrusion Detection System

The first generation of mobile agent based Intrusion Detection Systems had just two components, namely data collection and a single centralized analyzer. The disadvantage of this type of intrusion detection is that if the connection to the analyzer fails, the entire system becomes useless. In this work, we propose a novel hybrid model for a Mobile Agent based Distributed Intrusion Detection System to overcome this problem. The proposed model has new features such as robustness, the capability of detecting intrusions against the IDS itself, and the capability of updating itself to detect new patterns of intrusion. In addition, our proposed model is also capable of tackling some of the weaknesses of centralized Intrusion Detection System models.

Development of a RAM Simulation Model for Acid Gas Removal System

A reliability, availability and maintainability (RAM) model has been built for an acid gas removal plant for system analysis; it will play an important role in any process modifications required to achieve optimum performance. Due to the complexity of the plant, the model was based on a Reliability Block Diagram (RBD) with a Monte Carlo simulation engine. The model has been validated against actual plant data as well as local expert opinion, resulting in an acceptable simulation model. The results from the model show that operation and maintenance can be further improved, resulting in a reduction of the annual production loss.
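The plant's RBD structure and failure data are not given in the abstract. The sketch below illustrates the RBD-plus-Monte-Carlo approach on an assumed block of two redundant trains in parallel, with exponential failure and repair times, estimating availability as the fraction of time at least one train is up; the MTBF, MTTR, and horizon are placeholders.

    import numpy as np

    rng = np.random.default_rng(1)

    def unit_timeline(mtbf, mttr, horizon, dt=1.0):
        """Boolean up/down timeline for one unit (exponential failure and repair)."""
        state, t = True, 0.0
        up = np.zeros(int(horizon / dt), dtype=bool)
        while t < horizon:
            dur = rng.exponential(mtbf if state else mttr)
            i0, i1 = int(t / dt), min(int((t + dur) / dt), len(up))
            up[i0:i1] = state
            t, state = t + dur, not state
        return up

    # Assumed RBD: two redundant trains in parallel; system is up if either is up.
    horizon = 8760.0                                  # one year, in hours
    train_a = unit_timeline(mtbf=2000.0, mttr=48.0, horizon=horizon)
    train_b = unit_timeline(mtbf=2000.0, mttr=48.0, horizon=horizon)
    system_up = train_a | train_b
    print("Simulated availability:", system_up.mean())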

Effect of Salt Solution and Plasticity Index on the Undrained Shear Strength of Clays

Compacted clay liners (CCLs) are the main materials used in waste disposal landfills due to their low permeability. In this study, the effect of inorganic salt solutions, used as the permeant fluid, on the shear resistance of clays was experimentally investigated. For this purpose, NaCl solutions at concentrations of 2, 5, and 10%, as well as deionized water, were used. Laboratory direct shear and vane shear tests were conducted on three compacted clays of low, medium and high plasticity. The results indicate that the solution type and its concentration affect the shear properties of the mixture. In light of this study, the magnitude of the influence of these inorganic salts, at various concentrations, on the different clays was determined, and the more suitable compacted clay was identified by comparing plasticities.

Analytical Study of Component Based Software Engineering

This paper is a survey of current component-based software technologies and a description of the factors that promote or inhibit component-based software engineering (CBSE). The features that software components inherit are also discussed, and quality assurance issues in component-based software are addressed. The research on the quality model of component-based systems starts with a study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared in view of the existing quality models for general systems and for component-based systems. When illustrating the quality of a software component, an apt set of quality attributes for the description of the system (or its components) should be selected. Finally, the research issues that can be extended are tabulated.