Multilevel Activation Functions For True Color Image Segmentation Using a Self-Supervised Parallel Self-Organizing Neural Network (PSONN) Architecture: A Comparative Study

The paper describes a self-supervised parallel self-organizing neural network (PSONN) architecture for true color image segmentation. The proposed architecture is a parallel extension of the standard single self-organizing neural network (SONN) architecture and comprises an input (source) layer of image information, three single SONN architectures for segmenting the primary color components of a color image scene, and one final output (sink) layer for fusing the segmented color component images. Responses to the different shades of the color components are induced in each of the three single network architectures (meant for component-level processing) by applying a multilevel version of the characteristic activation function, which maps the input color information into different shades of the color components, thereby yielding a processed component color image segmented on the basis of those shades. The number of target classes in the segmented image corresponds to the number of levels in the multilevel activation function. Since the multilevel version of the activation function exhibits several subnormal responses to the input color image scene information, the system errors of the three component network architectures are computed from a subnormal linear index of fuzziness of the component color image scenes at the individual level. Several multilevel activation functions are employed for segmenting the input color image scene with the proposed network architecture. Results of applying the multilevel activation functions to the PSONN architecture are reported on three real-life true color images and are substantiated empirically with the correlation coefficients between the segmented and original images.
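The staircase mapping described above can be sketched as a sum of shifted sigmoids. This is only an illustrative form under our own parameter choices (`levels`, steepness `lam`), not the paper's exact activation function:

```python
import math

def multilevel_sigmoid(x, levels=4, lam=50.0):
    """Map a normalized intensity x in [0, 1] to one of `levels` graded
    responses by stacking `levels - 1` shifted sigmoids: a steep `lam`
    turns the sum into a staircase, pushing each pixel toward one of
    `levels` shades (one segmentation class per level)."""
    step = 1.0 / levels
    return sum(step / (1.0 + math.exp(-lam * (x - k * step)))
               for k in range(1, levels))

# With levels=4 the plateaus sit near 0, 0.25, 0.5 and 0.75.
```

Applying such a function per color component, as in the PSONN, yields component images quantized into as many shades as there are levels.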

Molecular Docking Studies of Mycobacterium tuberculosis RNA Polymerase β Subunit (rpoB) Receptor

Tuberculosis (TB) is a bacterial infectious disease caused by the obligate human pathogen Mycobacterium tuberculosis. Multidrug-resistant tuberculosis (MDR-TB) is a global reality that threatens tuberculosis control. Resistance to the antibiotic Rifampicin occurs in 95% of cases through nucleotide substitutions in an 81-bp core region of rpoB, the beta subunit of DNA-dependent RNA polymerase. In this paper, we studied the Rifampicin-rpoB receptor interactions in silico. First, homology modeling was performed to obtain the three-dimensional structure of Mycobacterium rpoB. Sixty analogs of Rifampicin were prepared using the Marvin Sketch software. Both the original Rifampicin and the analogs were docked with rpoB and their energy values were obtained. Of the sixty analogs, 43 had lower energy values than conventional Rifampicin and hence are predicted to have greater binding affinity for rpoB. Thus, this study offers a route for the development of Rifampicin analogs against multidrug-resistant Mycobacterium rpoB.

Determinants of Enterprise Risk Management Adoption: An Empirical Analysis of Malaysian Public Listed Firms

Purpose: This paper aims to gain insights into the factors that influence the adoption of enterprise risk management (ERM) by public listed firms in Malaysia. Findings: Two factors, financial leverage and auditor type, were found to significantly influence ERM adoption. In other words, the findings indicate that firms with higher financial leverage and with a Big Four auditor are more likely to have some form of ERM framework in place. Originality/Value: Since relatively few studies have been conducted in this area, especially in developing economies like Malaysia, this study broadens the scope of the literature by providing novel empirical evidence.

The U.S. and Central Asia: Religion, Politics, Ideology

Numerous facts attest to the increasing religiosity of the population and the intensification of religious movements in various countries during the last decade of the 20th century. The number of international religious institutions and foundations, religious movements, parties, and sects operating worldwide is increasing as well. Some ethnic and inter-state conflicts are obviously of religious origin. All of this leads a number of analysts to conclude that the religious factor is becoming an important part of international life, including the formation and activities of terrorist organizations. Most of all is said and written about Islam, the world's second-largest religion after Christianity, professed according to various estimates by 1.5 billion people in 127 countries.

Remarks on Energy Based Control of a Nonlinear, Underactuated, MIMO and Unstable Benchmark

In the last decade, energy-based control theory has made significant breakthroughs in dealing with underactuated mechanical systems through two successful and similar tools, controlled Lagrangians and controlled Hamiltonians (IDA-PBC). However, because of the complexity of these tools, successful case studies are lacking, in particular MIMO cases. The seminal theoretical paper on controlled Lagrangians by Bloch and his colleagues presented a benchmark example, a 4-d.o.f. underactuated pendulum on a cart, but a detailed and complete design was not provided. To fill this gap, this note revisits their design idea by presenting explicit control functions for a similar device motivated by a vector-thrust body hovering in the air. To the best of our knowledge, this system is the first MIMO underactuated example stabilized using energy-based tools along the lines of the original design idea. Some observations based on computer simulations are given.

On Identity Disclosure Risk Measurement for Shared Microdata

Probability-based identity disclosure risk measurement may give the same overall risk for different anonymization strategies applied to the same dataset, even though some entities in the anonymized dataset may have higher identification risks than others. Individuals are more concerned about risks higher than the average and want to know whether they may be exposed to such higher risk. A single overall risk figure produced by the above measurement method does not indicate whether some of the involved entities face higher identity disclosure risk than others. In this paper, we introduce an identity disclosure risk measurement method that not only reports the overall risk but also indicates whether some of the members face higher risk than others. The proposed method quantifies the overall risk based on the individual risk values, the percentage of records whose risk exceeds the average, and how much larger those higher risk values are compared to the average. We analyze the disclosure risks for different disclosure control techniques applied to original microdata and present the results.
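As a concrete illustration of the kind of aggregate just described, the sketch below combines the average per-record risk, the fraction of records above it, and their relative excess. The exact weighting in the paper may differ, and the function name is ours:

```python
def disclosure_risk_summary(risks):
    """Summarize per-record identity disclosure risks.

    Besides the plain average, report what fraction of records exceed
    the average and by how much (relative to the average), so that two
    datasets with the same mean risk can still be told apart.
    (Illustrative aggregation, not the paper's exact formula.)
    """
    n = len(risks)
    avg = sum(risks) / n
    higher = [r for r in risks if r > avg]
    frac_higher = len(higher) / n
    excess = (sum(higher) / len(higher) - avg) / avg if higher else 0.0
    return {"average": avg,
            "fraction_above_average": frac_higher,
            "relative_excess": excess}
```

For a dataset where one record carries most of the risk, `fraction_above_average` is small but `relative_excess` is large, exposing the skew that a single average hides.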

Program Camouflage: A Systematic Instruction Hiding Method for Protecting Secrets

This paper proposes an easy-to-use instruction hiding method to protect software from malicious reverse engineering attacks. Given a source program (the original) to be protected, the proposed method (1) takes a modified version of it (the fake) as input, (2) analyzes the differences in assembly code instructions between the original and the fake, and (3) introduces self-modification routines so that the fake instructions are turned back into the correct (i.e., original) instructions just before they are executed and reverted to fake ones afterwards. The proposed method adds a certain amount of security to a program, since the fake instructions in the resultant program confuse attackers, and significant effort is required to discover and remove all the fake instructions and self-modification routines. The method is also easy to use, because all a user has to do is prepare a fake source code by modifying the original source code.
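Step (2), locating the instruction runs that differ between the original and the fake, can be sketched with a standard sequence diff. The paper works at the assembly level; the helper below is a hypothetical illustration of that step, not the authors' tool:

```python
import difflib

def diff_instructions(original, fake):
    """Return (position, original_slice, fake_slice) for every run of
    instructions where the fake assembly differs from the original;
    these runs are the candidate sites for self-modification routines
    that restore the originals just before execution."""
    sm = difflib.SequenceMatcher(a=original, b=fake)
    return [(i1, original[i1:i2], fake[j1:j2])
            for tag, i1, i2, j1, j2 in sm.get_opcodes()
            if tag == "replace"]
```

Each reported site tells the protector which bytes to patch in (and later patch back out) at run time.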

A Two-Channel Secure Communication Using Fractional Chaotic Systems

In this paper, a two-channel secure communication scheme using fractional chaotic systems is presented. Conditions for chaos synchronization are investigated theoretically by using the Laplace transform. A numerical example is presented to illustrate the effectiveness of the proposed scheme. The keys, key space, key selection rules, and sensitivity to keys are discussed in detail. Results show that the original plaintexts are well masked in the ciphertexts yet recovered faithfully and efficiently by the proposed scheme.

XML Data Management in Compressed Relational Database

XML is an important standard for data exchange and representation. Using a mature relational database system to support XML data can bring several advantages. However, storing XML in a relational database introduces obvious redundancy that wastes disk space, bandwidth, and disk I/O when querying the XML data. To store and query XML efficiently, it is necessary to keep the XML data compressed inside the relational database. In this paper, a compressed relational database technology supporting XML data is presented. The original relational storage structure is well suited to XPath query processing, and the compression method preserves this property. Beyond traditional relational database techniques, additional query processing technologies are presented for compressed relations and for the special structure of XML, including techniques for XQuery processing in the compressed relational database.

A Tool for Audio Quality Evaluation Under Hostile Environment

The aim of this paper is to evaluate audio and speech quality with the help of a digital audio watermarking technique under different types of attacks (signal impairments) such as Gaussian noise, compression error, and jitter; together, these attacks constitute the hostile environment considered here. Audio and speech quality evaluation is an important research topic. The traditional way to evaluate speech quality is through subjective tests. They are reliable but very expensive and time consuming, and they cannot be used in certain applications such as online monitoring. Objective models based on human perception were developed to predict the results of subjective tests, but the existing objective methods require either the original speech or a complicated computational model, which makes some quality evaluation applications impossible.

Coding of DWT Coefficients using Run-length Coding and Huffman Coding for the Purpose of Color Image Compression

In this paper we propose a simple and effective method to compress an image, reducing its size without much compromising its quality. We use the Haar wavelet transform to transform the original image, and after quantization and thresholding of the DWT coefficients, run-length coding and Huffman coding schemes are used to encode the image. The DWT is the basis of the quite popular JPEG 2000 standard.
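The pipeline can be illustrated end to end on a one-dimensional signal: one Haar level, thresholding of the small coefficients, then run-length coding of the resulting zero runs (Huffman coding of the (value, count) pairs would follow). This is a minimal sketch, not the paper's implementation:

```python
def haar_1d(signal):
    """One level of the 1-D Haar transform: pairwise averages
    (approximation) followed by pairwise half-differences (detail)."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    diff = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg + diff

def threshold(coeffs, t):
    """Zero out coefficients below t so long runs of zeros appear."""
    return [c if abs(c) >= t else 0 for c in coeffs]

def run_length_encode(coeffs):
    """Collapse runs of repeated values into [value, count] pairs."""
    out = []
    for c in coeffs:
        if out and out[-1][0] == c:
            out[-1][1] += 1
        else:
            out.append([c, 1])
    return out
```

On a 2-D image the Haar step is applied along rows and columns; thresholding trades a small loss of detail for much longer zero runs, which is where run-length and Huffman coding gain their compression.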

Vapor Bubble Dynamics in Upward Subcooled Flow Boiling During Void Evolution

Bubble generation was observed with a high-speed camera in subcooled flow boiling at low void fraction. A constant heat flux was applied to one side of an upward rectangular channel to form the heated test section. Water, used as the working fluid, was injected in steps from high subcooling to near the saturation temperature to investigate bubble behavior during void development. Experiments were performed at two different pressures, close to 2 bar and 4 bar. It was observed that at high subcooling, once boiling commenced, bubbles departed from their nucleation sites and slid along the heated surface. Within an observation window, the mean bubble release frequency fb,mean, the number of nucleation sites Ns, and the mean bubble volume Vb,mean were measured at each step of the experiments to investigate the wall vaporization rate. It was found that in the proximity of the point of net vapor generation (PNVG), the vaporization rate increased significantly compared with the condensation rate, which remained low.

Efficient Design Optimization of Multi-State Flow Network for Multiple Commodities

The network for delivering commodities has been an important design problem in our daily lives and in many transportation applications. Delivery performance is evaluated based on the system reliability of delivering commodities from a source node to a sink node in the network, and the system reliability is maximized to find the optimal routing. However, the design problem is not simple because (1) each path segment has randomly distributed attributes; (2) there are multiple commodities that consume various path capacities; (3) the optimal routing must successfully complete the delivery process within the allowable time constraints. In this paper, we focus on the design optimization of the Multi-State Flow Network (MSFN) for multiple commodities. We propose an efficient approach to evaluate the system reliability in the MSFN with respect to randomly distributed path attributes and to find the optimal routing subject to the allowable time constraints. The delivery rates, also known as delivery currents, of the path segments are evaluated, and the minimal-current arcs are eliminated to reduce the complexity of the MSFN. Accordingly, the correct optimal routing is found and the worst-case reliability is evaluated. It is shown that the reliability of the optimal routing is at least as high as the worst-case measure. Two benchmark examples are used to demonstrate the proposed method. The comparisons between the original and the reduced networks show that the proposed method is very efficient.

Knowledge Management Model for Research Projects Masters Program

This paper presents an adaptation of the NOVA knowledge management and intellectual capital measurement model to the needs of the work or research project that must be developed when undertaking a graduate-level master's program. Brackets are added to each of the blocks represented in the original NOVA model, which makes it possible to represent the actors involved in each of them.

SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space

Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variant of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH has limitations that affect structure and query performance. Its main limitation is large memory consumption: to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a large-scale real dataset show that the proposed approach achieves good performance and consumes less memory than the Euclidean LSH.
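The building block of Euclidean LSH is the p-stable hash h(v) = floor((a.v + b) / w), with Gaussian a and b uniform in [0, w); the many hash tables mentioned above are built from concatenations of such functions. A minimal sketch (parameter names are ours):

```python
import random

def make_euclidean_lsh(dim, w=4.0, seed=0):
    """Build one Euclidean (p-stable) LSH function.

    `a` has i.i.d. standard Gaussian entries and `b` is uniform in
    [0, w), the standard construction: points that are close in
    Euclidean distance are likely to land in the same bucket.
    """
    rng = random.Random(seed)
    a = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    b = rng.uniform(0.0, w)

    def h(v):
        # floor of the projection onto `a`, shifted by b, in buckets of width w
        return int((sum(ai * vi for ai, vi in zip(a, v)) + b) // w)

    return h
```

An index then stores each point under a tuple of several such hashes per table; reducing the number of tables needed per unit of accuracy is exactly the storage problem the abstract targets.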

A Worst Case Estimation of the Inspection Rate by a Berthing Policy in a Container Terminal

After the terrorist attacks of September 11, 2001 in the U.S., container security received heightened attention, especially from the U.S. government, which deployed numerous measures to promote or improve security systems. The U.S. government not only enhanced its national security system but also allied with other countries against potential future terrorist attacks. The Container Security Initiative (CSI), for example, encourages foreign ports outside the U.S. to become CSI ports as part of the U.S. anti-terrorism network. Although promoting security can partly achieve the goal of anti-terrorism, it also affects the efficiency of the container supply chain, which is the main concern when implementing inspection measures. This paper proposes a quick estimation methodology for the inspection service rate under a berth allocation heuristic such that the inspection activities do not affect the original container supply chain. Theoretical and simulation results show that this approach is effective.

Mobility Analysis of the Population of Rabat-Salé-Zemmour-Zaer

In this paper, we present the origin-destination and pricing survey that we carried out during fall 2006 in the Moroccan region of Rabat-Salé-Zemmour-Zaer. The survey concerns people's characteristics, their travel behavior, and the price they would be willing to pay for a tramway ticket. The main objective is to study a set of features relating to households, their travel habits, and their choices between public and private transport modes. A comparison between the results of this survey and those of the 1996 survey is made. A pricing scheme is also given according to the tram capacity. (The Rabat-Salé tramway is under construction at the time of writing and will be operational beginning in 2010.)

An Investigation on the Variation of Software Development Productivity

The productivity of software development is one of the major concerns of project managers. Given the increasing complexity of the software being developed and the concomitant rise in typical project size, productivity has not improved consistently. By analyzing the latest release of the ISBSG data repository, containing 4106 completed projects, we report on the factors found to significantly influence productivity and present an original model for estimating productivity during project design. We further illustrate that software development productivity experienced irregular variations between 1995 and 2005. Considering the factors significant to productivity, we find that these variations are primarily caused by variations in the average development team size and by the unbalanced use of the less productive third-generation languages (3GL).

Energy Conscious Builder Design Pattern with C# and Intermediate Language

Design patterns have gained more and more acceptance since their emergence in the software development world in the last decade, and they have become another de facto standard of essential knowledge for object-oriented programming developers. Their target usage, from the beginning, was regular computers, so minimizing power consumption was never a concern. In this decade, however, demand for more complicated software running on mobile devices has grown rapidly as ever higher-performance portable gadgets have been supplied to the market. To keep up with time-to-market pressures, software development for power-conscious, battery-powered devices has shifted from specific low-level languages to higher-level ones. Currently, complicated software running on mobile devices is often developed in high-level languages that support OOP concepts, which has brought design patterns to the mobile world. However, using design patterns directly in software development for power-conscious systems is not recommended, because they were not originally designed for such environments. This paper demonstrates a design pattern adapted for power-limited systems. Because there are numerous original design patterns, it is not possible to cover them all at once, so this paper focuses on creating an energy-conscious version of the existing regular "Builder Pattern" appropriate for developing low-power-consumption software.
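For reference, the regular Builder pattern that the paper adapts separates the step-by-step construction of a complex object from its representation. The sketch below is in Python for brevity (the paper itself works in C# and Intermediate Language), and the class names are illustrative:

```python
class Report:
    """The complex product being assembled."""
    def __init__(self):
        self.parts = []

class ReportBuilder:
    """Regular GoF Builder: clients chain construction steps and call
    build() to obtain the finished product. An energy-conscious variant
    would restructure these steps to reduce instruction overhead on
    battery-powered devices."""
    def __init__(self):
        self._report = Report()

    def add_header(self, text):
        self._report.parts.append(("header", text))
        return self

    def add_body(self, text):
        self._report.parts.append(("body", text))
        return self

    def build(self):
        return self._report
```

Usage is a fluent chain, e.g. `ReportBuilder().add_header("h1").add_body("intro").build()`; the energy-conscious adaptation keeps this interface while changing the generated instructions underneath.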

Is the Expansion of High-Tech Leaders Possible Within the New EU Members? A Case Study of Ammono S.A. and the High-Tech Financing System in Poland

Innovations, especially technological, are considered key-drivers for sustainable economic growth and competitiveness in the globalised world. As such they should also play an important role in the process of economical convergence inside the EU. Unfortunately, the problem of insufficient innovation performance concerns around half of the EU countries. Poland shows that a lack of a consistent high-tech financing system constitutes a serious obstacle for the development of innovative firms. In this article we will evaluate these questions referring to the example of Ammono S.A., a Polish company established to develop and commercialise an original technology for the production of bulk GaN crystals. We will focus on its efforts to accumulate the financial resources necessary at different stages of its development. The purpose of this article is to suggest possible ways to improve the national innovative system, which would make it more competent in generating high-tech leaders.