Lithofacies Classification from Well Log Data Using Neural Networks, Interval Neutrosophic Sets and Quantification of Uncertainty

This paper proposes a novel approach to lithofacies classification based on an assessment of the uncertainty in the classification results. The proposed approach uses multiple neural networks (NN) and interval neutrosophic sets (INS) to classify input well log data into multiple classes of lithofacies. A pair of n-class neural networks is used to predict n degrees of truth membership and n degrees of false membership. Indeterminacy memberships, or uncertainties in the predictions, are estimated using a multidimensional interpolation method. These three memberships form the INS used to support confidence in the results of multiclass classification. Based on the experimental data, our approach improves classification performance compared to an existing technique applied only to the truth membership. In addition, our approach provides a measure of uncertainty for the multiclass classification problem.
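
As a rough illustration of the membership scheme described above, the following sketch (assuming scikit-learn, with a hypothetical feature matrix X of well log measurements and one-hot lithofacies labels Y) trains one network on the truth targets and a second on their complement, and approximates the indeterminacy by interpolating the truth network's training errors with a k-nearest-neighbour regressor standing in for the multidimensional interpolation.

```python
# Sketch of truth/false/indeterminacy memberships for multiclass classification.
# X (n_samples x n_features) and one-hot Y (n_samples x n_classes) are hypothetical inputs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.neighbors import KNeighborsRegressor

def fit_ins_classifier(X, Y):
    truth_net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, Y)
    false_net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000).fit(X, 1.0 - Y)
    # Indeterminacy: interpolate the truth network's training errors
    # (a k-NN interpolator stands in for the multidimensional interpolation).
    train_err = np.abs(Y - truth_net.predict(X))
    indet_net = KNeighborsRegressor(n_neighbors=5).fit(X, train_err)
    return truth_net, false_net, indet_net

def classify_with_uncertainty(nets, X_new):
    truth_net, false_net, indet_net = nets
    T = np.clip(truth_net.predict(X_new), 0.0, 1.0)   # truth memberships
    F = np.clip(false_net.predict(X_new), 0.0, 1.0)   # false memberships
    I = np.clip(indet_net.predict(X_new), 0.0, 1.0)   # indeterminacy memberships
    labels = np.argmax(T - F, axis=1)                  # combine truth and falsity for the decision
    uncertainty = I[np.arange(len(labels)), labels]    # uncertainty of the chosen class
    return labels, uncertainty
```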

Performance Evaluation of Complex Electrical Bio-impedance from V/I Four-electrode Measurements

The passive electrical properties of a tissue depend on its intrinsic constituents and structure; therefore, by measuring the complex electrical impedance of the tissue it may be possible to obtain indicators of the tissue state or physiological activity [1]. Complete bio-impedance information related to the physiology and pathology of a human body and the functional states of body tissues or organs can be extracted using a four-electrode measurement setup. This work presents the estimation of complex impedance from such a four-electrode measurement setup. First, the complex impedance is estimated by three different estimation techniques: Fourier, sine correlation and digital de-convolution. Then, estimation errors for the magnitude, phase, reactance and resistance are calculated and analyzed for different levels of disturbance in the observations. The absolute values of the relative errors are plotted and the graphical performance of each technique is compared.
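
For the Fourier technique, a minimal single-frequency (DFT-bin) sketch is shown below; the sampled voltage v, current i, sampling rate fs and excitation frequency f_exc are assumed inputs, and the sine-correlation and de-convolution variants are not reproduced here.

```python
# Single-frequency Fourier estimation of complex impedance Z = V/I
# from sampled voltage and current waveforms.
import numpy as np

def fourier_impedance(v, i, fs, f_exc):
    n = len(v)
    t = np.arange(n) / fs
    # Correlate with the excitation frequency (one-bin DFT, lock-in style).
    ref = np.exp(-2j * np.pi * f_exc * t)
    V = 2.0 / n * np.sum(v * ref)
    I = 2.0 / n * np.sum(i * ref)
    Z = V / I
    return {
        "magnitude": np.abs(Z),
        "phase_deg": np.degrees(np.angle(Z)),
        "resistance": Z.real,
        "reactance": Z.imag,
    }
```

The relative error of each quantity can then be obtained by comparing the estimate against the known test impedance, e.g. abs(est - true) / abs(true), for each disturbance level.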

Influence of Ammonium Concentration on the Performance of an Inorganic Biofilter Treating Methane

Among the technologies available to reduce methane emitted from the pig industry, biofiltration seems to be an effective and inexpensive solution. In methane (CH4) biofiltration, nitrogen is an important macronutrient for microbial growth. The objective of this research project was to study the effect of ammonium (NH4+) on the performance, biomass production and nitrogen conversion of a biofilter treating methane. For NH4+ concentrations ranging from 0.05 to 0.5 g N-NH4+/L, the CH4 removal efficiency and the carbon dioxide production rate decreased linearly from 68% to 11.8% and from 7.1 to 0.5 g/(m3·h), respectively. The dry biomass content varied from 4.1 to 5.8 kg/m3 of filter bed. For the same range of concentrations, the ammonium conversion decreased while the specific nitrate production rate increased. The specific nitrate production rate presented negative values, indicating denitrification in the biofilter.

Constructing a Suitable Model of Distance Training for Community Leaders in the Upper Northeastern Region

The objective of this research is to create a suitable model of distance training for community leaders in the upper northeastern region of Thailand. The research process is divided into four steps: the first step is to analyze relevant documents; the second step involves in-depth interviews with experts; the third step is concerned with constructing a model; and the fourth step is model validation by expert assessment. The findings reveal two important components for constructing an appropriate model of distance training for community leaders in the upper northeastern region. The first component consists of the context of technology management, e.g., principles, policy and goals. The second component can be viewed in two ways: firstly, the elements comprising input, process, output and feedback; secondly, the sub-components comprising the steps and processes in training. The expert assessments indicate that the researcher's constructed model is consistent, suitable and, overall, the most appropriate.

A General Segmentation Scheme for Contouring Kidney Region in Ultrasound Kidney Images using Improved Higher Order Spline Interpolation

A higher order spline interpolated contour obtained with up-sampling of homogeneously distributed coordinates for segmentation of the kidney region in different classes of ultrasound kidney images has been developed and is presented in this paper. The performance of the proposed method is measured and compared with a modified snake model contour, a Markov random field contour and an expert-outlined contour. The method is validated against the expert-outlined contour using the maximum coordinate distance, Hausdorff distance and mean radial distance metrics. The results obtained reveal that the proposed scheme provides an optimum contour that agrees well with the expert-outlined contour. Moreover, this technique helps to preserve the pixels of interest, which specifically define the functional characteristics of the kidney. This opens various possibilities for implementing a computer-aided diagnosis system exclusively for ultrasound kidney images.
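
A minimal sketch of the validation metrics is given below, assuming NumPy/SciPy and contours represented as (N, 2) coordinate arrays; the mean radial distance here is computed from radii about the expert contour centroid, which is only one common reading of that metric.

```python
# Contour validation metrics against an expert-outlined contour.
# `contour` and `expert` are hypothetical (N, 2) arrays of (x, y) points.
import numpy as np
from scipy.spatial.distance import directed_hausdorff, cdist

def contour_metrics(contour, expert):
    d = cdist(contour, expert)                    # all pairwise point distances
    max_coord_dist = d.min(axis=1).max()          # maximum coordinate distance
    hausdorff = max(directed_hausdorff(contour, expert)[0],
                    directed_hausdorff(expert, contour)[0])
    # Mean radial distance: compare mean radii measured from the expert centroid.
    c = expert.mean(axis=0)
    r_contour = np.linalg.norm(contour - c, axis=1)
    r_expert = np.linalg.norm(expert - c, axis=1)
    mean_radial = abs(r_contour.mean() - r_expert.mean())
    return max_coord_dist, hausdorff, mean_radial
```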

Optimum Design of an Absorption Heat Pump Integrated with a Kraft Industry using Genetic Algorithm

In this study, the integration of an absorption heat pump (AHP) with the concentration section of an industrial pulp and paper process is investigated using pinch technology. The optimum design of the proposed water-lithium bromide AHP is then achieved by minimizing the total annual cost. A comprehensive optimization is carried out by relaxing all stream pressure drops as well as the heat exchanger areas involved in the AHP structure. It is shown that, by applying a genetic algorithm optimizer, the total annual cost of the proposed AHP is decreased by 18% compared to the design obtained from simulation.
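
A generic real-coded genetic algorithm of the kind used for such cost minimization is sketched below; the annual_cost function and the variable bounds (standing in for heat exchanger areas and stream pressure drops) are placeholders, not the actual AHP cost model.

```python
# Minimal real-coded genetic algorithm for minimizing a total-annual-cost function.
import numpy as np

rng = np.random.default_rng(0)

def annual_cost(x):
    # Placeholder stand-in for the AHP capital-plus-operating cost model.
    return np.sum((x - 3.0) ** 2) + 10.0

def ga_minimise(bounds, pop_size=40, generations=200, mut_rate=0.1):
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(generations):
        fitness = np.array([annual_cost(ind) for ind in pop])
        # Binary tournament selection.
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = pop[np.where(fitness[idx[:, 0]] < fitness[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # Uniform crossover between consecutive parents.
        mates = np.roll(parents, 1, axis=0)
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, mates)
        # Gaussian mutation, clipped to the variable bounds.
        mutate = rng.random(children.shape) < mut_rate
        children = np.clip(children + mutate * rng.normal(0, 0.1 * (hi - lo), children.shape), lo, hi)
        pop = children
    fitness = np.array([annual_cost(ind) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()

best_x, best_cost = ga_minimise(bounds=[(0.1, 10.0)] * 4)   # four hypothetical design variables
```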

Downtrend Algorithm and Hedging Strategy in Futures Market

The paper investigates a downtrend algorithm and trading strategy based on chart pattern recognition and technical analysis in the futures market. The proposed chart formation is a pattern with the lowest low in the middle and one higher low on each side. The contribution of this paper lies in the reinforcement of statements about the profitability of momentum trend trading strategies. The practical benefit of the research is a trading algorithm for falling markets and a back-test analysis in futures markets. When based on daily data, the algorithm generated positive results, especially during market downtrend periods. The downtrend algorithm can be applied as a hedging strategy against possible sudden market crashes. The proposed strategy may be of interest to futures traders, hedge funds or researchers performing technical or algorithmic market analysis based on momentum trend trading.
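
A minimal sketch of detecting this formation on daily lows is given below; the lookback window and the `lows` array are illustrative assumptions rather than the exact rules of the back-tested algorithm.

```python
# Detect the described formation in daily lows: a lowest low flanked by a
# higher low on each side within a lookback window.
import numpy as np

def find_formations(lows, lookback=5):
    """Return indices i where lows[i] is lower than every low in the
    `lookback` bars before and after it."""
    signals = []
    for i in range(lookback, len(lows) - lookback):
        left = lows[i - lookback:i]
        right = lows[i + 1:i + 1 + lookback]
        if lows[i] < left.min() and lows[i] < right.min():
            signals.append(i)          # lowest low with a higher low on each side
    return signals
```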

Tax Innovation, Administration and Revenue Generation in Nigeria: Case of Cross River State

Taxation, a potent fiscal policy instrument through which the infrastructure and social services that drive the development process of any society are funded, has been ineffective in Nigeria. The adoption of appropriate measures is, however, a requirement for the generation of adequate tax revenue. This study sets out to investigate efficiency and effectiveness in the administration of tax in Nigeria, using Cross River State as a case study. The methodology adopted to achieve this objective is a qualitative technique using structured questionnaires to survey the three senatorial districts in the state; the central limit theorem is adopted as the analytical technique. The results showed a significant degree of inefficiency in the administration of taxes. It is recommended that periodic review and updating of tax policy will bring innovation and effectiveness to the administration of taxes. In addition, proper appropriation of tax revenue will drive development of the needed infrastructural and social services.

Multi Band Frequency Synthesizer Based on ISPD PLL with Adapted LC Tuned VCO

The 4G front-end transceiver requires high performance, which can be obtained mainly with an optimal architecture and a multi-band local oscillator. In this study, we propose and present a new architecture of multi-band frequency synthesizer based on an inverse sine phase detector phase-locked loop (ISPD PLL), without any filters or controlled-gain blocks, associated with an adapted multi-band LC-tuned VCO using several numerically controlled capacitive branches that are not binary weighted. The proposed architecture, based on a 0.35 μm CMOS process technology, supports multi-band GSM/DCS/DECT/UMTS/WiMAX applications and gives good performance: a phase noise of -127 dBc at 1 MHz, a figure of merit (FOM) of -186 dB at 1 MHz, and a wide frequency range (from 0.83 GHz to 3.5 GHz), which make the proposed architecture amenable to monolithic integration and 4G multi-band applications.

One-Class Support Vector Machines for Protein-Protein Interactions Prediction

Predicting protein-protein interactions represents a key step in understanding protein functions, since proteins usually work in the context of other proteins and rarely function alone. Machine learning techniques have been applied to predict protein-protein interactions. However, most of these techniques address the problem as a binary classification problem. Although it is easy to obtain a dataset of interacting proteins as positive examples, there are no experimentally confirmed non-interacting proteins to be used as negative examples. Therefore, in this paper we treat the task as a one-class classification problem using one-class support vector machines (SVM). Using only positive examples (interacting protein pairs) in the training phase, the one-class SVM achieves an accuracy of about 80%. These results imply that protein-protein interactions can be predicted using a one-class classifier with accuracy comparable to that of binary classifiers that use artificially constructed negative examples.
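
A minimal one-class SVM sketch using scikit-learn is shown below; the feature matrices stand in for sequence- or annotation-derived features of protein pairs and are filled with random placeholders, and the nu and kernel settings are illustrative assumptions.

```python
# One-class SVM trained only on positive (interacting) protein-pair feature vectors.
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X_pos = rng.normal(size=(500, 20))    # placeholder features of interacting pairs
X_test = rng.normal(size=(100, 20))   # placeholder features of unseen pairs

model = make_pipeline(
    StandardScaler(),
    OneClassSVM(kernel="rbf", gamma="scale", nu=0.2),  # nu bounds the training-error fraction
)
model.fit(X_pos)                      # training uses positive examples only
pred = model.predict(X_test)          # +1 = predicted interacting, -1 = predicted non-interacting
```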

Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation

Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems by reformulating questions. The answer processing module, in turn, is an emerging topic in QA systems, where systems are often required to rank and validate candidate answers. These techniques, which aim at finding short and precise answers, are often based on semantic relations and keyword co-occurrence. This paper discusses a new model for question answering that improves two main modules, question processing and answer processing, both of which affect the evaluation of system performance. Two components form the basis of question processing: the first is question classification, which specifies the question and answer types; the second is reformulation, which converts the user's question into one the QA system can understand within a specific domain. The objective of the answer validation task is to judge the correctness of an answer returned by a QA system, according to the text snippet given to support it. For validating answers, we apply candidate answer filtering and candidate answer ranking, followed by a final validation step based on user voting. The paper also describes a new architecture of the question and answer processing modules, along with the modeling, implementation and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it more suitable for finding exact answers. Evaluation of the model on a total of 50 questions shows that the system's decisions improve by 92%.
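
As a toy illustration of keyword co-occurrence-based candidate ranking (one ingredient of the answer processing module), the following sketch scores candidate answers by term overlap with the reformulated question; the scoring scheme and data are illustrative, not the paper's exact model.

```python
# Rank candidate answers by keyword overlap between the reformulated question
# and each supporting text snippet.
def rank_candidates(reformulated_question, candidates):
    q_terms = set(reformulated_question.lower().split())
    scored = []
    for answer, snippet in candidates:
        s_terms = set(snippet.lower().split())
        overlap = len(q_terms & s_terms) / max(len(q_terms), 1)   # co-occurrence score
        scored.append((overlap, answer))
    return [answer for score, answer in sorted(scored, reverse=True)]

candidates = [("Paris", "Paris is the capital of France"),
              ("Lyon", "Lyon is a city in France")]
print(rank_candidates("what is the capital of France", candidates))
```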

Analysis of Chatter in Ball End Milling by Wavelet Transform

Chatter is one of the major limitations on productivity in the ball end milling process. It affects the surface roughness, the dimensional accuracy and the tool life. The aim of this research is to propose a new system to detect chatter during the ball end milling process by using the wavelet transform. The proposed method is implemented on a 5-axis CNC machining center, and three new parameters are introduced from the three dynamic cutting forces; each parameter is calculated as the ratio of the average variance of the dynamic cutting force to its absolute variance. It is shown that chatter can be detected more easily during in-process cutting by using the new parameters proposed in this research. The experimentally obtained results showed that the wavelet transform can provide reliable results for detecting chatter under various cutting conditions.
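
A rough sketch of a wavelet-based chatter indicator is given below, assuming the PyWavelets package; the wavelet family, decomposition level, variance-ratio definition and threshold are illustrative assumptions that follow the description above only loosely.

```python
# Wavelet-based chatter indicator for the dynamic cutting-force signals.
import numpy as np
import pywt

def chatter_parameter(force, wavelet="db4", level=4):
    coeffs = pywt.wavedec(force, wavelet, level=level)   # [approx, detail_L, ..., detail_1]
    detail_vars = [np.var(c) for c in coeffs[1:]]         # variances of the detail bands
    avg_detail_var = np.mean(detail_vars)
    return avg_detail_var / np.var(force)                 # variance ratio against the whole signal

def is_chatter(fx, fy, fz, threshold=0.5):
    # Evaluate the parameter on each of the three dynamic cutting forces.
    params = [chatter_parameter(f) for f in (fx, fy, fz)]
    return any(p > threshold for p in params), params
```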

Paradigm and Paradox: Knowledge Management and Business Ethics

Knowledge management (KM) is generally considered to be a positive process in an organisation, facilitating opportunities to achieve competitive advantage via better quality information handling, compilation of expert know-how and rapid response to fluctuations in the business environment. The KM paradigm as portrayed in the literature informs the processes that can increase intangible assets so that corporate knowledge is preserved. However, in some instances, knowledge management exists in a universe of dynamic tension among the conflicting needs to respect privacy and intellectual property (IP), to guard against data theft, to protect national security and to stay within the laws. While the Knowledge Management literature focuses on the bright side of the paradigm, there is also a different side in which knowledge is distorted, suppressed or misappropriated due to personal or organisational motives (the paradox). This paper describes the ethical paradoxes that occur within the taxonomy and deontology of knowledge management and suggests that recognising both the promises and pitfalls of KM requires wisdom.

Study of Compaction in Hot-Mix Asphalt Using Computer Simulations

During the process of compaction in Hot-Mix Asphalt (HMA) mixtures, the distance between aggregate particles decreases as they come together and eliminate air voids. By measuring the inter-particle distances in a cut section of an HMA sample, the degree of compaction can be estimated. For this, a calibration curve is generated by computer simulation when the gradation and asphalt content of the HMA mixture are known. A two-dimensional cross section of an HMA specimen was simulated using the mixture design information (gradation, asphalt content and air-void content). Nearest-neighbor distance methods such as Delaunay triangulation were used to study the changes in inter-particle distance and area distribution during the process of compaction in HMA. Such computer simulations enable several hundred repetitions in a short period of time, without the need to compact and analyze laboratory specimens, in order to obtain good statistics on the parameters defined. The distributions of the statistical parameters based on computer simulations showed trends similar to those of laboratory specimens.
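
A minimal sketch of extracting inter-particle distances from a Delaunay triangulation of aggregate centroids (assuming SciPy and a hypothetical centroids array from the simulated cross section) is shown below; distributions computed before and after simulated compaction can then be compared.

```python
# Inter-particle distance statistics from a Delaunay triangulation of
# aggregate centroids in a simulated 2-D cross section.
import numpy as np
from scipy.spatial import Delaunay

def interparticle_distances(centroids):
    tri = Delaunay(centroids)
    edges = set()
    for simplex in tri.simplices:                 # collect unique triangle edges
        for a, b in ((0, 1), (1, 2), (0, 2)):
            edges.add(tuple(sorted((simplex[a], simplex[b]))))
    d = np.array([np.linalg.norm(centroids[i] - centroids[j]) for i, j in edges])
    return d.mean(), d.std(), d                   # summary statistics plus raw distances
```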

Performance Analysis of an Adaptive Threshold Hybrid Double-Dwell System with Antenna Diversity for Acquisition in DS-CDMA Systems

In this paper, we analyze the acquisition process for a hybrid double-dwell system with antenna diversity for DS-CDMA (direct-sequence code division multiple access) using an adaptive threshold. Acquisition systems with a fixed threshold value are unable to adapt to fast-varying mobile communications environments and may result in a high false alarm rate and/or low detection probability. Therefore, we propose an adaptively varying threshold scheme through the use of a cell-averaging constant false alarm rate (CA-CFAR) algorithm, which is well known in the field of radar detection. We derive exact expressions for the probabilities of detection and false alarm in Rayleigh fading channels. The mean acquisition time of the system under consideration is also derived. The performance of the system is analyzed and compared to that of a hybrid single-dwell system.
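
For reference, a basic cell-averaging CFAR threshold test over correlator output samples is sketched below; the reference and guard window sizes and the false-alarm target are illustrative assumptions, and the closed-form analysis in the paper is not reproduced.

```python
# Cell-averaging CFAR (CA-CFAR) adaptive threshold test on square-law samples.
import numpy as np

def ca_cfar(samples, num_ref=16, num_guard=2, pfa=1e-3):
    n = len(samples)
    alpha = num_ref * (pfa ** (-1.0 / num_ref) - 1.0)    # scaling factor for exponential noise
    detections = np.zeros(n, dtype=bool)
    half = num_ref // 2
    for i in range(half + num_guard, n - half - num_guard):
        lead = samples[i - num_guard - half:i - num_guard]
        lag = samples[i + num_guard + 1:i + num_guard + 1 + half]
        noise_level = (lead.sum() + lag.sum()) / num_ref  # average of the reference cells
        detections[i] = samples[i] > alpha * noise_level  # adaptive threshold comparison
    return detections
```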

Research on the Simulation and Validation of an Airborne Enhanced Ground Proximity Warning System

In this paper, an enhanced ground proximity warning simulation and validation system is designed and implemented. First, a global digital terrain database is designed and constructed based on a square grid and sub-grid structure. Terrain data searching is implemented by querying the latitude and longitude bands and separated zones of the global terrain database with the current aircraft position. A combination of dynamic scheduling and hierarchical scheduling is adopted to schedule the terrain data, so that the terrain data can be read and deleted dynamically in memory. Second, based on the extent of, distance to, and approach speed toward dangerous terrain ahead, and using a safety-profile calculation method, collision threat detection is executed in real time and provides caution and warning alarms. Following this scheme, the enhanced ground proximity warning simulation system is implemented. Simulations are carried out to verify real-time performance in terrain display and alarm triggering, and the results show that the simulation system operates correctly, reasonably and stably.
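
A much-simplified sketch of the terrain look-up and forward-looking threat check is given below; the grid indexing, clearance margin, look-ahead time and alert boundaries are illustrative assumptions, not the system's actual safety profiles.

```python
# Simplified terrain look-up and forward-looking collision threat check.
import numpy as np

class TerrainGrid:
    def __init__(self, elevations, lat0, lon0, cell_deg):
        self.elev = elevations            # 2-D array of terrain elevations (m)
        self.lat0, self.lon0 = lat0, lon0
        self.cell = cell_deg              # grid spacing in degrees

    def elevation(self, lat, lon):
        r = int(np.clip((lat - self.lat0) / self.cell, 0, self.elev.shape[0] - 1))
        c = int(np.clip((lon - self.lon0) / self.cell, 0, self.elev.shape[1] - 1))
        return self.elev[r, c]

def threat_level(grid, lat, lon, alt, dlat_per_s, dlon_per_s, vspeed,
                 lookahead_s=60.0, clearance=150.0):
    """Check predicted clearance at several points along the projected track."""
    for t in np.linspace(0.0, lookahead_s, 7):
        terrain = grid.elevation(lat + dlat_per_s * t, lon + dlon_per_s * t)
        if alt + vspeed * t < terrain + clearance:
            return "warning" if t < lookahead_s / 2 else "caution"
    return "clear"
```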

Performance Evaluation of ROI Extraction Models from Stationary Images

In this paper, three basic approaches, and different methods under each of them, for extracting the region of interest (ROI) from stationary images are explored. The results obtained for each of the proposed methods are shown, and it is demonstrated where each method outperforms the others. Two main problems in ROI extraction, the channel selection problem and the saliency reversal problem, are discussed, along with how each is best addressed by the various methods. The basic approaches are 1) the saliency-based approach, 2) the wavelet-based approach and 3) the clustering-based approach. The saliency approach performs well on images containing objects of high saturation and brightness. The wavelet-based approach performs well on natural scene images that contain regions of distinct textures. The mean shift clustering approach partitions the image into regions according to the density distribution of pixel intensities. The experimental results of the various methodologies show that each technique performs at a different acceptable level for various types of images.
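
A minimal mean shift partitioning sketch using scikit-learn is shown below; clustering is done on pixel intensity plus weighted position features, and the spatial weighting, bandwidth estimation and subsampling are illustrative assumptions.

```python
# Mean shift clustering of pixel intensities (plus weighted positions) to
# partition a grayscale image into regions.
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def mean_shift_regions(gray, spatial_weight=0.1, subsample=2000):
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.column_stack([gray.ravel(),
                             spatial_weight * xs.ravel(),
                             spatial_weight * ys.ravel()]).astype(float)
    rng = np.random.default_rng(0)
    idx = rng.choice(len(feats), size=min(subsample, len(feats)), replace=False)
    bw = estimate_bandwidth(feats[idx], quantile=0.1)     # data-driven bandwidth estimate
    ms = MeanShift(bandwidth=bw, bin_seeding=True).fit(feats[idx])
    labels = ms.predict(feats)                            # assign every pixel to the nearest mode
    return labels.reshape(h, w)
```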

Active Control for Reduction of Noise Passing through Enclosure and Optimization of Microphone Position

In this study, the noise characteristics of a structure were analyzed in an effort to reduce noise passing through an opening of the enclosure surrounding the structure that generates the noise. Enclosures are an essential measure to prevent noise propagation from operating machinery, and their access openings are an important path of noise leakage. First, the noise characteristics of the structure were analyzed and feed-forward noise control was performed in simulation in order to reduce noise passing through the opening of the enclosure. We then implemented a feed-forward controller to actively control the acoustic power through the opening. Finally, we optimized the placement of the reference sensors for several cases of sensor count. Good control performance was achieved using the minimum number of microphones arranged in an optimal placement.
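
A generic feed-forward sketch using the filtered-x LMS update is shown below for illustration; the FIR secondary-path model, filter length and step size are assumptions, and this is not the authors' controller design.

```python
# Generic feed-forward active noise control using the filtered-x LMS update.
import numpy as np

def fxlms(reference, disturbance, sec_path, taps=64, mu=1e-3):
    """reference: signal from a reference microphone near the noise source,
    disturbance: noise at the error microphone without control,
    sec_path: FIR model of the path from the actuator to the error microphone."""
    w = np.zeros(taps)                                  # adaptive control filter
    x_buf = np.zeros(taps)                              # reference history
    fx_buf = np.zeros(taps)                             # filtered-reference history
    y_buf = np.zeros(len(sec_path))                     # control-output history
    x_filt = np.convolve(reference, sec_path)[:len(reference)]  # reference through secondary path
    error = np.zeros(len(reference))
    for n in range(len(reference)):
        x_buf = np.roll(x_buf, 1); x_buf[0] = reference[n]
        y = w @ x_buf                                   # anti-noise sent to the actuator
        y_buf = np.roll(y_buf, 1); y_buf[0] = y
        error[n] = disturbance[n] + sec_path @ y_buf    # residual at the error microphone
        fx_buf = np.roll(fx_buf, 1); fx_buf[0] = x_filt[n]
        w -= mu * error[n] * fx_buf                     # filtered-x LMS weight update
    return error
```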

The Effect of Entrepreneurship on Foreign Direct Investment

Entrepreneurship has become an important and extensively researched concept in business studies. Research on foreign direct investment (FDI) has become widespread due to the growth of FDI and its importance in globalization. Most entrepreneurship studies have examined the importance and influence of entrepreneurial orientation in a micro-level context. On the other hand, studies concerning FDI have used statistical techniques to analyze the effects, determinants and motives of FDI at a macroeconomic level, ignoring empirical studies on other, non-economic determinants. In order to bridge the gap between the theory and empirical evidence on FDI and the theory and research on entrepreneurship, this study examines the impact of entrepreneurship on inward foreign direct investment. The relationship between entrepreneurship and foreign direct investment is investigated through regression analysis of pooled time-series and cross-sectional data. The results suggest that entrepreneurship has a significant effect on FDI.
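
For illustration only, a pooled OLS regression of the kind described can be set up as below with statsmodels; the panel variables and synthetic data are placeholders, not the study's dataset.

```python
# Illustrative pooled OLS of inward FDI on an entrepreneurship measure.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_countries, n_years = 30, 10
df = pd.DataFrame({
    "entrepreneurship": rng.normal(size=n_countries * n_years),  # placeholder measure
    "gdp_growth": rng.normal(size=n_countries * n_years),        # placeholder control
})
df["fdi_inflow"] = 0.6 * df["entrepreneurship"] + 0.3 * df["gdp_growth"] + rng.normal(size=len(df))

X = sm.add_constant(df[["entrepreneurship", "gdp_growth"]])
model = sm.OLS(df["fdi_inflow"], X).fit()    # pooled regression over all country-years
print(model.summary())
```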

A Discriminatory Rewarding Mechanism for Sybil Detection with Applications to Tor

This paper presents an economic game for sybil detection in a distributed computing environment. Cost parameters reflecting impacts of different sybil attacks are introduced in the sybil detection game. The optimal strategies for this game in which both sybil and non-sybil identities are expected to participate are devised. A cost sharing economic mechanism called Discriminatory Rewarding Mechanism for Sybil Detection is proposed based on this game. A detective accepts a security deposit from each active agent, negotiates with the agents and offers rewards to the sybils if the latter disclose their identity. The basic objective of the detective is to determine the optimum reward amount for each sybil which will encourage the maximum possible number of sybils to reveal themselves. Maintaining privacy is an important issue for the mechanism since the participants involved in the negotiation are generally reluctant to share their private information. The mechanism has been applied to Tor by introducing a reputation scoring function.