Optimization of the Characteristic Straight Line Method by a “Best Estimate” of Observed Normal Orthometric Elevation Differences

In this paper, to optimize the “Characteristic Straight Line Method” used in soil displacement analysis, a “best estimate” of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of “height”, is discussed in detail. In landslide dynamic analysis, the soil is considered a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the “Characteristic Straight Line Method”, whose characteristic components are defined and constructed from a “best estimate” of the topometric observations. In the measurement of elevation differences, we used the most modern leveling equipment available, and observational procedures were designed to provide the most effective method of acquiring data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, in which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) are investigated. The “Characteristic Straight Line Method” is slightly more convenient than the “Characteristic Circle Method”: it permits the evaluation of a displacement of very small magnitude, even when the displacement is an infinitesimal quantity. The inclination of the landslide is given by the inverse of the distance from the reference point O to the “Characteristic Straight Line”, and its direction is given by the bearing of the normal directed from point O to the “Characteristic Straight Line” (Fig. 6). A “best estimate” of the topometric observations was used to measure the elevation of carefully selected points before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test in an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
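
As an illustration of the first of these corrections, a minimal sketch of applying a level collimation correction to an observed elevation difference is given below; the function name, the sign convention, and all numerical values are assumptions for illustration, not the paper's notation.

```python
# Hedged sketch: level collimation correction for unequal sight lengths.
# With equal backsight and foresight distances the error cancels;
# otherwise the residual is proportional to the distance imbalance.

def collimation_corrected_dh(dh_observed, c, d_backsight, d_foresight):
    """Correct an observed elevation difference (m) for a non-horizontal
    line of sight.

    c            -- collimation error (m of rise per m of sight length)
    d_backsight  -- backsight distance (m)
    d_foresight  -- foresight distance (m)
    """
    return dh_observed - c * (d_backsight - d_foresight)

# Example: a 2 mm per 100 m collimation error with a 15 m sight imbalance
print(collimation_corrected_dh(1.23456, 2e-5, 45.0, 30.0))
```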

Estimation of Time-Varying Linear Regression with Unknown Time-Volatility via Continuous Generalization of the Akaike Information Criterion

The problem of estimating time-varying regression is inevitably concerned with the necessity to choose the appropriate level of model volatility, ranging from the full stationarity of instant regression models to their absolute independence of each other. In the stationary case the number of regression coefficients to be estimated equals that of regressors, whereas the absence of any smoothness assumptions augments the dimension of the unknown vector by a factor of the time-series length. The Akaike Information Criterion is a commonly adopted means of adjusting a model to a given data set within a succession of nested parametric model classes, but its crucial restriction is that the classes are rigidly defined by the growing integer-valued dimension of the unknown vector. To make the Kullback information maximization principle underlying the classical AIC applicable to the problem of time-varying regression estimation, we extend it to a wider class of data models in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a family of continuously nested a priori probability distributions.
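
For reference, the classical criterion being generalized can be stated compactly; the continuous extension over nested priors is the paper's contribution and is only paraphrased in the comment below.

```latex
% Classical Akaike Information Criterion for a model class of
% integer-valued dimension k with maximized likelihood \hat{L}:
\mathrm{AIC} = 2k - 2\ln\hat{L}
% The class minimizing AIC over k = 1, 2, \dots is selected; the paper
% replaces the discrete k by a fixed-dimensional parameter whose freedom
% is softly constrained by a continuum of nested prior distributions.
```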

A Novel Machining Signal Filtering Technique: Z-notch Filter

A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine components themselves, such as the hydraulic system, the motor, and the machine environment. By correlating the noise components with the measured machining signal, the components of interest of the measured machining signal, which are less contaminated by noise, can be extracted. Thus, the filtered signal is more reliable for analysis in terms of noise content than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and the unfiltered signal. A larger scattering space and a higher value of Z∞ indicated that the signal was highly contaminated by noise. This method can be utilised as a proactive tool for evaluating the noise content in a signal. The evaluation of noise content, as well as its elimination, is very important, especially for machining operation fault diagnosis. The Z-notch filtering technique was reliable in extracting noise components from the measured machining signal with high efficiency. Even though the measured signal was exposed to high noise disruption, the signal generated by the interaction between cutting tool and workpiece could still be acquired. Therefore, noise interference that could change the original signal features and consequently degrade the useful sensory information can be eliminated.
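
The paper identifies noise frequencies by correlating machine-noise recordings with the machining signal; as a generic, minimal illustration of suppressing one identified noise tone with a notch filter (the sampling rate, frequencies, and Q factor below are assumptions, not the paper's Z-notch design):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 1000.0                                   # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)
cutting = np.sin(2 * np.pi * 120 * t)         # stand-in cutting component
noise = 0.8 * np.sin(2 * np.pi * 50 * t)      # identified machine-noise tone
measured = cutting + noise

b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)       # notch at the noise frequency
filtered = filtfilt(b, a, measured)           # zero-phase filtering

print("residual noise power:", np.mean((filtered - cutting) ** 2))
```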

Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

This paper deals with the localization of wideband sources. We develop a new approach for estimating wideband source parameters. This method is based on the higher-order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; in fact, the sea-bottom noise is regarded as Gaussian. Thanks to the coherent signal subspace algorithm, based on the cumulant matrix of the received data instead of the cross-spectral matrix, correlated wideband sources are accurately located even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.
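
A minimal numerical illustration of the property the method exploits, namely that fourth-order cumulants of Gaussian data vanish (the signals and sample size are assumptions):

```python
import numpy as np

def fourth_order_cumulant(x):
    """c4 = E[x^4] - 3(E[x^2])^2 for zero-mean data; zero for Gaussian."""
    x = x - x.mean()
    return np.mean(x ** 4) - 3 * np.mean(x ** 2) ** 2

rng = np.random.default_rng(0)
print(fourth_order_cumulant(rng.normal(size=200_000)))           # ~0: Gaussian noise
print(fourth_order_cumulant(np.sign(rng.normal(size=200_000))))  # ~-2: BPSK-like source
```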

Challenges to Technological Advancement in Economically Weak Countries: An Assessment of the Nigerian Educational Situation

Nigeria is considered one of the many countries in sub-Saharan Africa with a weak economy and gross deficiencies in technology and engineering. Available data from international monitoring and regulatory organizations show that technology is pivotal in determining the economic strength of nations all over the world. Education is critical to technology acquisition, development, dissemination, and adaptation. Thus, this paper seeks to critically assess and discuss the issues and challenges facing technological advancement in Nigeria, particularly in the education sector, and also proffers solutions for resuscitating the Nigerian education system towards achieving national technological and economic sustainability, such that Nigeria can compete favourably with other technologically driven economies of the world in the not-too-distant future.

A Sub-Pixel Image Registration Technique with Applications to Defect Detection

This paper presents a useful sub-pixel image registration method using line segments and a sub-pixel edge detector. In this approach, straight line segments are first extracted from gray images at the pixel level before applying the sub-pixel edge detector. Next, all sub-pixel line edges are mapped onto the orientation-distance parameter space to solve for line correspondences between images. Finally, the registration parameters with sub-pixel accuracy are analytically solved via two linear least-squares problems. The present approach can be applied in various fields where fast registration with sub-pixel accuracy is required. To illustrate, the present approach is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the present approach is effective and accurate when the target images contain a sufficient number of line segments, which is true in many industrial problems.
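
A sketch of the final solution step under stated assumptions: given matched lines in (orientation, distance) form, the rotation and translation follow from linear least squares; line extraction and matching are omitted, and the sample lines are fabricated for illustration.

```python
import numpy as np

def register_from_lines(theta1, rho1, theta2, rho2):
    # Rotation: a pure rotation shifts every line orientation by the
    # same angle, estimated here as a circular mean of differences.
    dtheta = np.angle(np.mean(np.exp(1j * (theta2 - theta1))))
    # Translation: for a line x cos(t) + y sin(t) = rho, translating by
    # (tx, ty) gives rho2 = rho1 + tx cos(theta2) + ty sin(theta2),
    # which is linear least squares in (tx, ty).
    A = np.column_stack([np.cos(theta2), np.sin(theta2)])
    t, *_ = np.linalg.lstsq(A, rho2 - rho1, rcond=None)
    return dtheta, t

theta1 = np.array([0.0, np.pi / 2, np.pi / 4])
rho1 = np.array([10.0, 20.0, 15.0])
theta2 = theta1 + 0.02                                     # true rotation
rho2 = rho1 + 1.5 * np.cos(theta2) - 0.7 * np.sin(theta2)  # true translation (1.5, -0.7)

print(register_from_lines(theta1, rho1, theta2, rho2))
```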

Analytical Model Prediction: Micro-Cutting Tool Forces with the Effect of Friction on Machining Titanium Alloy (Ti-6Al-4V)

In this paper, a methodology for predicting tool forces in oblique machining is introduced by adopting the orthogonal cutting technique. The applied analytical calculation is mostly based on the DeVries model, and some parts of the methodology are adopted from the Armarego-Brown model. Model validation is performed by comparing experimental data with the prediction results for machining titanium alloy (Ti-6Al-4V) from a micro-cutting tool perspective. Good agreement with the experiments is observed. A detailed friction formulation that affects the tool forces has also been examined, with reasonable results obtained.
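
For context, a worked form of the classical orthogonal-cutting relation on which such analytical force predictions are typically built (this is the textbook Merchant model, not necessarily the paper's exact formulation):

```latex
% Merchant's orthogonal-cutting relation: cutting force F_c from shear
% strength \tau_s, uncut chip area A_c, shear angle \phi, friction
% angle \beta, and rake angle \alpha.
F_c = \frac{\tau_s A_c \cos(\beta - \alpha)}{\sin\phi \, \cos(\phi + \beta - \alpha)},
\qquad
\phi = \frac{\pi}{4} - \frac{\beta - \alpha}{2}
```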

Customer Knowledge and Service Development: The Web 2.0 Role in Co-production

The paper is concerned with relationships between SSME and ICTs and focuses on the role of Web 2.0 tools in the service development process. The research presented aims at exploring how collaborative technologies can support and improve service processes, highlighting customer centrality and value co-production. The core idea of the paper is the centrality of user participation and of the collaborative technologies as enabling factors; Wikipedia is analyzed as an example. The result of this analysis is the identification and description of a pattern characterising specific services in which users, by means of web tools, collaborate as value co-producers during the service process. The pattern of collaborative co-production concerning several categories of services, including knowledge-based services, is then discussed.

An On-chip LDO Voltage Regulator with Improved Current Buffer Compensation

A fully on-chip low drop-out (LDO) voltage regulator with a 100 pF output load capacitor is presented. A novel frequency compensation scheme using a current buffer is adopted to realize a single dominant pole within the unity-gain frequency of the regulation loop; the phase margin (PM) is at least 50 degrees over the full range of load current, and the power supply rejection (PSR) is improved compared with conventional Miller compensation. In addition, a differentiator provides a high-speed path during load current transients. Implemented in 0.18 μm CMOS technology, the LDO voltage regulator provides 100 mA of load current with a stable 1.8 V output voltage while consuming 80 μA of quiescent current.

Genetic Algorithm Parameters Optimization for Bi-Criteria Multiprocessor Task Scheduling Using Design of Experiments

Multiprocessor task scheduling is an NP-hard problem, and the Genetic Algorithm (GA) has proven to be an excellent technique for finding an optimal solution. In the past, several GA-based methods have been considered for the solution of this problem. However, all these methods consider a single criterion; in the present work, minimization of a bi-criteria multiprocessor task scheduling objective is considered, namely a weighted sum of makespan and total completion time. The efficiency and effectiveness of a genetic algorithm can be improved by optimizing its different parameters, such as the crossover and mutation operators, crossover probability, selection function, etc. The effects of the GA parameters on minimization of the bi-criteria fitness function, and the subsequent setting of the parameters, have been established by the central composite design (CCD) approach of the response surface methodology (RSM) of Design of Experiments. The experiments have been performed with different levels of the GA parameters, and analysis of variance has been performed to identify the parameters significant for minimising makespan and total completion time simultaneously.
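
A minimal sketch of the bi-criteria fitness driving the GA: a weighted sum of makespan and total completion time for a candidate assignment (the weights and the schedule encoding are assumptions, not the paper's exact setup).

```python
def bicriteria_fitness(completion_times_per_proc, w1=0.5, w2=0.5):
    """completion_times_per_proc: one list of task completion times per
    processor. Returns the weighted objective to be minimized."""
    makespan = max(max(times) for times in completion_times_per_proc)
    total_completion = sum(sum(times) for times in completion_times_per_proc)
    return w1 * makespan + w2 * total_completion

# Two processors, three tasks each
print(bicriteria_fitness([[3.0, 7.0, 12.0], [4.0, 9.0, 11.0]]))
```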

An Experimental and Numerical Investigation on Gas Hydrate Plug Flow in the Inclined Pipes and Bends

Gas hydrates can agglomerate and block multiphase oil and gas pipelines when water is present at hydrate-forming conditions. Using "Cold Flow Technology", the aim is to condition gas hydrates so that they can be transported as a slurry mixture without risk of agglomeration. During pipeline shutdown, however, hydrate particles may settle in bends and build hydrate plugs. An experimental setup has been designed and constructed to study the flow of such plugs during start-up operations. Experiments have been performed using a model fluid and model hydrate particles. The propagation of initial plugs through a bend was recorded with impedance probes along the pipe. The experimental results show a dispersion of the plug front. A peak in pressure drop was also recorded when the plugs were passing the bend. The evolution of the plugs has been simulated by numerical integration of the incompressible mass balance equations, with an imposed mixture velocity. The slip between particles and carrier fluid has been calculated using a drag relation together with a particle-fluid force balance.
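
A sketch of the particle-fluid force balance mentioned above: the steady slip velocity follows from equating drag with the net gravity/buoyancy force, here using the standard Schiller-Naumann drag correlation; all property values are assumptions, not the paper's data.

```python
import numpy as np

def slip_velocity(d_p, rho_p, rho_f, mu_f, g=9.81):
    """Steady slip velocity of a particle of diameter d_p (m) and
    density rho_p in a fluid (rho_f, mu_f), by fixed-point iteration."""
    v = 1e-3  # initial guess (m/s)
    for _ in range(100):
        reynolds = rho_f * v * d_p / mu_f
        cd = 24.0 / reynolds * (1.0 + 0.15 * reynolds ** 0.687)  # Re < 1000
        v = np.sqrt(4.0 * g * d_p * abs(rho_p - rho_f) / (3.0 * cd * rho_f))
    return v

# 1 mm hydrate-like particle (~910 kg/m^3) in a model oil
print(slip_velocity(d_p=1e-3, rho_p=910.0, rho_f=800.0, mu_f=2e-3))
```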

Analytical Study of Component Based Software Engineering

This paper is a survey of current component-based software technologies and describes the promotion and inhibition factors in CBSE. The features that software components inherit are also discussed, and quality assurance issues in component-based software are addressed. The research on the quality model of component-based systems starts with a study of what components are, CBSE and its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared in view of the existing quality models for general systems and for CBS. When describing the quality of a software component, an apt set of quality attributes for the description of the system (or its components) should be selected. Finally, the research issues that can be extended are tabulated.

Enhancing Performance of Bluetooth Piconets Using Priority Scheduling and Exponential Back-Off Mechanism

Bluetooth is a personal wireless communication technology being applied in many scenarios. It is an emerging standard for short-range, low-cost, low-power wireless access technology. Existing MAC (Medium Access Control) scheduling schemes only provide best-effort service for all master-slave connections. It is very challenging to provide QoS (Quality of Service) support for different connections due to the master-driven TDD (Time Division Duplex) nature of Bluetooth. Moreover, no available solution supports both the delay and bandwidth guarantees required by real-time applications. This paper addresses the issue of how to enhance QoS support in a Bluetooth piconet. The Bluetooth specification proposes a Round Robin scheduler as a possible solution for scheduling transmissions in a piconet. We propose an algorithm which reduces bandwidth waste and enhances the efficiency of the network. We define token counters to estimate the traffic of real-time slaves. To increase bandwidth utilization, a back-off mechanism is then presented for best-effort slaves, decreasing the frequency of polling idle slaves. Simulation results demonstrate that our scheme achieves better performance than Round Robin scheduling.
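
A minimal sketch of the back-off idea for best-effort slaves: an idle slave's polling interval doubles up to a cap and resets on activity (the constants and structure are assumptions, not the paper's exact algorithm).

```python
MAX_BACKOFF = 32  # cap on the polling interval, in polling cycles (assumed)

class SlaveState:
    def __init__(self):
        self.backoff = 1    # current polling interval
        self.countdown = 1  # cycles until the next poll

def tick(slave, had_data):
    """Advance one cycle; return True if the master polls this slave."""
    slave.countdown -= 1
    if slave.countdown > 0:
        return False
    # Polled now: reset the interval on activity, otherwise double it.
    slave.backoff = 1 if had_data else min(2 * slave.backoff, MAX_BACKOFF)
    slave.countdown = slave.backoff
    return True
```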

Using Fractional Factorial Designs for Variable Importance in Random Forest Models

Random Forests are a powerful classification technique, consisting of a collection of decision trees. One useful feature of Random Forests is the ability to determine the importance of each variable in predicting the outcome. This is done by permuting each variable and computing the change in prediction accuracy before and after the permutation. This variable importance calculation is similar to a one-factor-at-a-time experiment and is therefore inefficient. In this paper, we use a regular fractional factorial design to determine which variables to permute. Based on the results of the trials in the experiment, we calculate the individual importance of the variables with improved precision over the standard method. The method is illustrated with a study of student attrition at Monash University.
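
A sketch of the design-based idea under stated assumptions: permute subsets of variables according to a two-level fractional factorial design and estimate each variable's importance from the contrast of accuracy losses against its design column; the 2^(7-4) generators and data set below are illustrative, not the paper's study.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=7, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)
base_acc = model.score(X, y)

# 2^(7-4) resolution III design: full factorial in A, B, C with
# generators D = AB, E = AC, F = BC, G = ABC.
abc = np.array([[a, b, c] for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
design = np.column_stack([
    abc,
    abc[:, 0] * abc[:, 1], abc[:, 0] * abc[:, 2],
    abc[:, 1] * abc[:, 2], abc[:, 0] * abc[:, 1] * abc[:, 2],
])

losses = []
for run in design:
    Xp = X.copy()
    for j in np.where(run == -1)[0]:       # -1 level: permute variable j
        Xp[:, j] = rng.permutation(Xp[:, j])
    losses.append(base_acc - model.score(Xp, y))

# Contrast estimate: loss_i ~ sum_j imp_j * (1 - run_ij) / 2, so with
# orthogonal columns imp_j = -2 (column_j . losses) / n_runs.
effects = -2 * design.T @ np.array(losses) / len(design)
print(np.round(effects, 3))
```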

Goal-Based Request Cloud Resource Broker in Medical Application

In this paper, a cloud resource broker using goal-based requests in a medical application is proposed. To handle the recent huge production of digital images and data in medical informatics applications, the cloud resource broker can be used by medical practitioners to properly discover and select the correct information and applications. This paper summarizes several reviewed articles relating medical informatics applications to current broker technology and presents research applying goal-based requests in a cloud resource broker to optimize the use of resources in a cloud environment. The objective of proposing a new kind of resource broker is to enhance current resource scheduling, discovery, and selection procedures. We believe that it could help to maximize resource allocation in medical informatics applications.

Parallel Image Compression and Analysis with Wavelets

This paper presents image compression with a wavelet-based method. The wavelet transformation divides an image into low-pass and high-pass filtered parts. The traditional JPEG compression technique requires lower computational power with feasible losses when only compression is needed. However, there is an obvious need for wavelet-based methods in certain circumstances. These methods are intended for applications in which image analysis is done in parallel with compression. Furthermore, the high-frequency bands can be used to detect changes or edges. Wavelets enable hierarchical analysis of the low-pass filtered sub-images: the first analysis can be done on a small image, and only if anything of interest is found is the whole image processed or reconstructed.
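
A minimal sketch of this hierarchical use of the transform with PyWavelets; the wavelet choice, decomposition level, and the "interesting" test are assumptions.

```python
import numpy as np
import pywt

image = np.random.rand(256, 256)          # stand-in for a real image

coeffs = pywt.wavedec2(image, wavelet="haar", level=2)
approx = coeffs[0]                        # small low-pass sub-image
h, v, d = coeffs[-1]                      # finest high-frequency bands

# High-frequency bands flag edges/changes cheaply
print("edge energy:", float(np.square(h).mean() + np.square(v).mean()))

# First-pass analysis on the small approximation only; reconstruct the
# full image only when something of interest is found.
if approx.std() > 0.1:                    # hypothetical "interesting" test
    reconstructed = pywt.waverec2(coeffs, wavelet="haar")
    print("reconstructed:", reconstructed.shape)
```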

Approximate Range-Sum Queries over Data Cubes Using Cosine Transform

In this research, we propose to use the discrete cosine transform to approximate the cumulative distributions of data cube cells' values. The cosine transform is known to have a good energy compaction property and thus can approximate data distribution functions easily with a small number of coefficients. The derived estimator is accurate and easy to update. We perform experiments to compare its performance with a well-known technique, the (Haar) wavelet. The experimental results show that the cosine transform performs much better than the wavelet in estimation accuracy, speed, space efficiency, and ease of update.
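
A one-dimensional sketch of the estimator idea under stated assumptions: approximate the cumulative distribution of cell values with a few DCT coefficients and answer a range-sum as a difference of approximated cumulative values; the data and the number of retained coefficients are illustrative.

```python
import numpy as np
from scipy.fft import dct, idct

domain = 256
counts = np.random.default_rng(1).poisson(5.0, size=domain).astype(float)
cdf = np.cumsum(counts)          # cumulative distribution of cell values

k = 16                           # retained coefficients (assumed)
coef = dct(cdf, norm="ortho")
coef[k:] = 0.0                   # energy compaction: keep the first k
cdf_hat = idct(coef, norm="ortho")

def range_sum(a, b):
    """Approximate sum of counts in cells (a, b]."""
    return cdf_hat[b] - cdf_hat[a]

print("approx:", range_sum(50, 100), "exact:", cdf[100] - cdf[50])
```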

From e-Government to e-Democracy: Challenges and Opportunities for Development in Montenegro

The Internet today has a huge impact on all aspects of life, including the broader context of democracy, politics and politicians. If democracy is freedom of choice, there are a number of preconditions that can ensure this freedom is achieved and realized in practice, and these preconditions must be met regardless of the manner of voting. The key contribution of ICT to freedom of choice is that technology connects citizens and their elected representatives in a better way than was possible without the Internet. In this sense, we can say that the Internet and ICT are significantly changing, and potentially improving, the environment in which democratic processes take place. This paper aims to describe trends in the use of ICT in democratic processes and analyzes the challenges for implementation of e-Democracy in Montenegro.

Fractal-Wavelet Based Techniques for Improving Artificial Neural Network Models

Natural resources management, including water resources, requires reliable estimates of time-variant environmental parameters. Small improvements in the estimation of environmental parameters can have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach to preprocessing practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based preprocessing. Time-series correlation and persistency, the minimum length sufficient for training the predicting model, and the maximum valid length of predictions were also investigated through a fractal assessment.
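
As one concrete form of the persistency assessment, a rescaled-range (R/S) estimate of the Hurst exponent is sketched below (H > 0.5 indicates a persistent series); the window sizes and test series are assumptions, not the paper's data.

```python
import numpy as np

def hurst_rs(x, windows=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent of series x by rescaled-range analysis."""
    rs = []
    for n in windows:
        chunks = x[: len(x) // n * n].reshape(-1, n)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
        s = chunks.std(axis=1)                  # per-window standard deviation
        rs.append(np.mean(r / s))
    return np.polyfit(np.log(windows), np.log(rs), 1)[0]  # slope ~ H

flow_like = np.random.default_rng(2).normal(size=4096)  # white noise: H ~ 0.5
print("H estimate:", round(hurst_rs(flow_like), 2))
```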

Nonlinear Effects in Stiffness Modeling of Robotic Manipulators

The paper focuses on enhanced stiffness modeling of robotic manipulators, taking into account the influence of the external force/torque acting upon the end point. It implements the virtual joint technique, which describes the compliance of manipulator elements by a set of localized six-dimensional springs separated by rigid links and perfect joints. In contrast to the conventional formulation, which is valid for the unloaded mode and small displacements, the proposed approach implicitly assumes that the loading leads to non-negligible changes of the manipulator posture and a corresponding amendment of the Jacobian. The developed numerical technique allows computing the static equilibrium and the relevant force/torque reaction of the manipulator for any given displacement of the end-effector. This enables the designer to detect essentially nonlinear effects in the elastic behavior of the manipulator, similar to the buckling of beam elements. A linearization procedure is also proposed, based on the inversion of a dedicated matrix composed of the stiffness parameters of the virtual springs and the Jacobians/Hessians of the active and passive joints. The developed technique is illustrated by an application example dealing with the stiffness analysis of a parallel manipulator of the Orthoglide family.
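
For contrast with the loaded-mode model above, a sketch of the conventional (unloaded, small-displacement) virtual-joint stiffness mapping, K_C = (J K_theta^{-1} J^T)^{-1}; the Jacobian and spring values are assumptions for illustration.

```python
import numpy as np

J = np.array([[1.0, 0.2, 0.0],
              [0.0, 1.0, 0.3],
              [0.1, 0.0, 1.0]])            # Jacobian at the studied posture
K_theta = np.diag([2.0e5, 1.5e5, 1.0e5])   # virtual spring stiffnesses (N/m)

# Cartesian stiffness under the conventional small-displacement model
K_C = np.linalg.inv(J @ np.linalg.inv(K_theta) @ J.T)

# End-effector deflection under a 100 N load along x
dx = np.linalg.solve(K_C, np.array([100.0, 0.0, 0.0]))
print(dx)
```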