Weighted k-Nearest-Neighbor Techniques for High Throughput Screening Data

The k-nearest-neighbor (kNN) algorithm is a simple but effective classification method. In this paper we present an extended version of this technique for chemical compounds used in high throughput screening, in which the distances to the nearest neighbors are taken into account. Our algorithm uses kernel weight functions to guide the process of defining activity in screening data. The proposed kernel weight function aims to combine properties of the graph structure and the molecular descriptors of the screened compounds. We apply the modified kNN method to several experimental data sets from biological screens. The experimental results confirm the effectiveness of the proposed method.
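As an illustration of the basic mechanism, the following is a minimal sketch of distance-weighted kNN classification with a Gaussian kernel; the descriptor matrix, activity labels and bandwidth are hypothetical placeholders, not the kernel weight function actually proposed in the paper.

    # Minimal sketch of distance-weighted kNN classification (Gaussian kernel).
    # Descriptors, labels and the bandwidth below are hypothetical.
    import numpy as np

    def weighted_knn_predict(X_train, y_train, x_query, k=5, bandwidth=1.0):
        """Predict a binary activity label (0/1) for x_query."""
        dists = np.linalg.norm(X_train - x_query, axis=1)   # Euclidean distances
        idx = np.argsort(dists)[:k]                          # k nearest neighbours
        weights = np.exp(-(dists[idx] / bandwidth) ** 2)     # Gaussian kernel weights
        score = np.dot(weights, y_train[idx]) / weights.sum()
        return int(score >= 0.5)

    # Toy usage with random "molecular descriptors"
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 8))          # 100 compounds, 8 descriptors
    y = (X[:, 0] > 0).astype(int)          # dummy activity labels
    print(weighted_knn_predict(X, y, rng.normal(size=8)))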

Deposition Rate and Energy Enhancements of TiN Thin-Film in a Magnetized Sheet Plasma Source

Titanium nitride (TiN) has been synthesized using the sheet plasma negative ion source (SPNIS). The parameters for its effective synthesis have been determined from previous experiments and studies. In this study, a further enhancement of the deposition rate of TiN synthesis and an advancement of the SPNIS operation are presented. This is primarily achieved by adding Sm-Co permanent magnets and modifying the configuration of the TiN deposition process. The magnetic enhancement is aimed at optimizing the sputtering rate and the sputtering yield of the process. The Sm-Co permanent magnets are placed below the Ti target for better sputtering by argon. The Ti target is biased from -250 V to -350 V and is sputtered by an Ar plasma produced at a discharge current of 2.5-4 A and a discharge potential of 60-90 V. Samples were deposited on steel substrates of dimensions 20 × 20 × 0.5 mm at N2:Ar volumetric ratios of 1:3, 1:5 and 1:10. Ocular inspection of the samples shows the bright gold color associated with TiN. XRD characterization confirmed effective TiN synthesis, as all samples exhibit the (200) and (311) peaks of TiN and the non-stoichiometric Ti2N (220) facet. Cross-sectional SEM results showed an increase in the TiN deposition rate of up to 0.35 μm/min, double what was previously obtained [1]. Scanning electron micrographs give a comparative morphological picture of the samples. Vickers hardness measurements gave a maximum hardness of 21.094 GPa.

Multi-Objective Optimization of Gas Turbine Power Cycle

Because of the importance of energy, optimization of power-generation systems is necessary. Gas turbine cycles are suitable for fast power generation, but their efficiency is relatively low. To achieve higher efficiencies, several measures are commonly adopted, such as recovery of heat from exhaust gases in a regenerator, use of an intercooler in a multistage compressor, and steam injection into the combustion chamber. Even with these components, thermodynamic optimization of the gas turbine cycle is necessary. In this article, multi-objective genetic algorithms are employed for Pareto-based optimization of the Regenerative-Intercooled Gas Turbine (RIGT) cycle. In multi-objective optimization, a number of conflicting objective functions are optimized simultaneously. The objective functions considered are the entropy generation of the RIGT cycle (Ns), derived using exergy analysis and the Gouy-Stodola theorem, the thermal efficiency, and the net output power of the cycle; these objectives generally conflict with one another. The design variables are thermodynamic parameters such as the compressor pressure ratio (Rp), excess air in combustion (EA), turbine inlet temperature (TIT) and inlet air temperature (T0). Single-objective optimization is investigated first, and the Non-dominated Sorting Genetic Algorithm (NSGA-II) is used for the multi-objective optimization. Optimization is performed for two and three objective functions, and the results are compared for the RIGT cycle. To investigate the optimal thermodynamic behavior of pairs of objectives, different sets, each containing two of the output objectives, are considered individually, and a Pareto front is depicted for each set. The decision variables selected from these Pareto fronts yield the best possible combinations of the corresponding objective functions. No point on the Pareto front is superior to any other point on the front, but every point on it is superior to any point off it. For the three-objective optimization, the results are given in tables.
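To make the Pareto-based selection step concrete, the sketch below filters a set of candidate cycle designs down to their non-dominated (Pareto-optimal) subset for two maximized objectives; the objective values are random placeholders rather than outputs of the actual RIGT cycle model or of NSGA-II.

    # Minimal sketch of non-dominated (Pareto) filtering for two maximized
    # objectives (e.g., thermal efficiency and net output power); the candidate
    # designs below are synthetic, not RIGT cycle evaluations.
    import numpy as np

    def pareto_front(points):
        """Return indices of non-dominated points (both objectives maximized)."""
        front = []
        for i, p in enumerate(points):
            dominated = any(np.all(q >= p) and np.any(q > p)
                            for j, q in enumerate(points) if j != i)
            if not dominated:
                front.append(i)
        return front

    rng = np.random.default_rng(1)
    designs = rng.uniform(size=(200, 2))   # columns: [efficiency, net power] (dummy)
    print("Pareto-optimal designs:", pareto_front(designs))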

Determinants of Brand Equity: Offering a Model for the Chocolate Industry

This study examined the underlying dimensions of brand equity in the chocolate industry. For this purpose, the researchers developed a model to identify which factors are influential in building brand equity. A second purpose was to assess the mediating effect of brand loyalty and brand image on the relationships between brand attitude, brand personality and brand association on the one hand and brand equity on the other. The study employed structural equation modeling to investigate the causal relationships between the dimensions of brand equity and brand equity itself. It specifically measured the way in which consumers' perceptions of the dimensions of brand equity affect their overall brand equity evaluations. Data were collected from a sample of chocolate consumers in Iran. The results of this empirical study indicate that brand loyalty and brand image are important components of brand equity in this industry. Moreover, the role of brand loyalty and brand image as mediating factors between the other dimensions and brand equity is supported. The principal contribution of the present research is that it provides empirical evidence of the multidimensionality of consumer-based brand equity, supporting Aaker's and Keller's conceptualizations of brand equity. The present research also enriches brand equity building by incorporating brand personality and brand image, as recommended by previous researchers. Moreover, constructing a brand equity index specifically for the chocolate industry of Iran is novel.

Household Demand for Solid Waste Disposal Options in Malaysia

This paper estimates the economic value of household preferences for enhanced solid waste disposal services in Malaysia. The contingent valuation (CV) method estimates an average additional monthly willingness-to-pay (WTP) in solid waste management charges of €0.77 to €0.80 for improved waste disposal service quality. WTP elicited by the generic CV question is slightly higher than WTP elicited by the label-specific questions, which further reveal a higher WTP for sanitary landfill, at €0.90, than for incineration, at €0.63; this suggests that sanitary landfill is the preferred alternative. The logistic regression estimation procedure reveals that households' concern about where their rubbish is disposed of, age, home ownership, household income and the format of the CV question are significant factors influencing WTP.
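The sketch below shows the general form of the logistic (binary acceptance) model used in such CV studies, fitted here to synthetic data; the covariates, coefficients and sample are purely illustrative and are not the survey data of this study.

    # Minimal sketch of a logistic WTP-acceptance model on synthetic data.
    # Covariates and coefficients are illustrative placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(2)
    n = 500
    X = np.column_stack([
        rng.integers(20, 70, n),        # age
        rng.integers(0, 2, n),          # owns house (0/1)
        rng.normal(3.0, 0.8, n),        # household income (thousands)
        rng.integers(0, 2, n),          # CV question format (generic=0, label-specific=1)
    ])
    # synthetic acceptance probability, only to generate toy responses
    p = 1 / (1 + np.exp(-(0.02 * X[:, 0] + 0.5 * X[:, 1] + 0.4 * X[:, 2] - 2)))
    y = rng.binomial(1, p)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(model.coef_, model.intercept_)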

Restriction Specificity of Some Soybean Genotypes to Bradyrhizobium japonicum Serogroups

Competitive relationships among Bradyrhizobium japonicum USDA serogroups 123, 122 and 138 were screened against the standard commercial soybean variety Williams and two plant introductions, PI 377578 ("671") and "166", in a field trial. Displacement of strain 123 by an effective strain should improve N2 fixation. Root nodules were collected, and the percentage of nodule occupancy by each strain was determined using the strain-specific fluorescent-antibody technique. As anticipated, strain USDA 123 dominated 92% of the nodules, owing to the high affinity between the host and the symbiont. This dominance was consistent and was not changed materially either by inoculation practice or by introducing a new strain. The interrelationship between the genotype Williams and serogroups 122 and 138 was found to be very weak, although the cell densities of the strains in the rhizosphere were equal. On the other hand, nodule occupancy of genotypes 671 and 166 by serogroup 123 rhizobia was reduced almost to zero. The data further show that genotypes PI 671 and PI 166 have a high affinity for colonization by strains 122 and 138, whereas Williams was highly promiscuous with strain 123.

Enhancing Retrieval Effectiveness of Malay Documents by Exploiting Implicit Semantic Relationship between Words

Phrases have a long history in information retrieval, particularly in commercial systems. Implicit semantic relationships between words, in the form of base noun phrases (BaseNP), have shown significant improvements in precision in many IR studies. Our research focuses on linguistic phrases, which are language dependent. Our results show that using BaseNP can improve retrieval performance even though more than 62% of word formation in the Malay language is based on derivational affixes.

Managers' Empowerment in High School by Knowledge Management

The purpose of the present study is to investigate the relationship between knowledge management and the empowerment of managers to perform their functions properly. This is a descriptive survey study. The sample includes all male and female high school managers of the first and second districts of Urmia, comprising 98 schools and, accordingly, 98 managers. The instrument applied was a questionnaire. Overall, there is a statistically significant relationship between knowledge management and the empowerment of managers. Finally, several suggestions are provided.

On the Coupled Electromechanical Behavior of Artificial Materials with Chiral-Shell Elements

In the present work we investigate both the elastic and the electric properties of a chiral material. We consider a composite structure made of a polymer matrix and anisotropic GaAs inclusions, taking into account the piezoelectric and dielectric properties of the composite material. The principal task of the work is the estimation of the functional properties of the composite material.

Aircraft Gas Turbine Engine Technical Condition Identification System

This paper shows that the application of probability-statistical methods is not well founded at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited and uncertain. Hence, the efficiency of applying soft-computing technology, namely fuzzy logic and neural network methods, at these diagnostic stages is considered. Fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained from statistical fuzzy data, are trained with high accuracy. To build a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. These investigations show that the distributions of the GTE operating parameters have a fuzzy character; hence, consideration of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of changes in the basic characteristics of the GTE operating parameters leads to the conclusion that fuzzy statistical analysis is necessary for preliminary identification of the engine's technical condition. The changes in the correlation coefficients also have a fuzzy character; therefore, the results of fuzzy correlation analysis are used for model selection. To check model adequacy, the fuzzy multiple correlation coefficient of the fuzzy multiple regression is considered. When the information is sufficient, a recurrent algorithm for identification of the aviation GTE technical condition (using hard-computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise (a new recursive least-squares method (LSM)). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine's technical condition. As an application of the technique, the temperature condition of a new operating aviation engine is estimated.
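The recursive least-squares update mentioned above can be sketched as follows for a generic linear model y = w·x + noise; the data, forgetting factor and dimensions are illustrative, and the fuzzy extension is not shown.

    # Minimal sketch of a recursive least-squares (RLS) update with forgetting
    # factor, as one might use to re-estimate parameter regressions online.
    import numpy as np

    def rls_update(w, P, x, y, lam=0.99):
        """One RLS step with forgetting factor lam."""
        Px = P @ x
        k = Px / (lam + x @ Px)          # gain vector
        e = y - w @ x                    # prediction error
        w_new = w + k * e
        P_new = (P - np.outer(k, Px)) / lam
        return w_new, P_new

    rng = np.random.default_rng(3)
    true_w = np.array([1.5, -0.7, 2.0])
    w, P = np.zeros(3), np.eye(3) * 1000.0
    for _ in range(500):
        x = rng.normal(size=3)
        y = true_w @ x + 0.05 * rng.normal()
        w, P = rls_update(w, P, x, y)
    print(w)   # should approach true_w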

Validation of Reverse Engineered Web Application Models

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis and testing, by studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source code analysis may be very difficult because of the presence of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques in order to exploit server-side execution engines to accomplish part of the dynamic analysis. This paper studies the effects of mutation-based source code analysis applied to Web software to build application models. Mutation-generated models may contain more information than necessary, so a pruning mechanism is needed.

A Survey of Business Component Identification Methods and Related Techniques

With the deepening of software reuse, component-related technologies have been widely applied in the development of large-scale complex applications. Component identification (CI) is one of the primary research problems in software reuse: domain business models are analyzed to obtain a set of business components with high reuse value and good reuse performance, so as to support effective reuse. Based on the concept and classification of CI, its technical stack is briefly discussed from four views, i.e., the form of the input business models, the identification goals, the identification strategies, and the identification process. Various CI methods presented in the literature are then classified into four types, i.e., domain-analysis-based methods, cohesion-coupling-based clustering methods, CRUD-matrix-based methods, and other methods, and these methods are compared in terms of their advantages and disadvantages. Additionally, some shortcomings of current CI research are discussed and their causes are explained. Finally, some promising research directions for this problem are outlined.

Existence and Exponential Stability of Almost Periodic Solution for Recurrent Neural Networks on Time Scales

In this paper, a class of recurrent neural networks (RNNs) with variable delays is studied on almost periodic time scales, and some sufficient conditions are established for the existence and global exponential stability of the almost periodic solution. These results are of significance for the design and application of RNNs. Finally, two examples with numerical simulations are presented to illustrate the feasibility and effectiveness of the results.
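A representative model of the class studied is sketched below, assuming the standard delayed recurrent-network form on a time scale $\mathbb{T}$; the coefficients, activation functions and delays are generic placeholders:

    x_i^{\Delta}(t) = -a_i(t)\,x_i(t) + \sum_{j=1}^{n} b_{ij}(t)\, f_j\big(x_j(t - \tau_{ij}(t))\big) + I_i(t), \qquad t \in \mathbb{T},\ i = 1,\dots,n,

where $x_i^{\Delta}$ denotes the delta derivative on $\mathbb{T}$, $a_i$ are decay rates, $b_{ij}$ are connection weights, $\tau_{ij}$ are the variable delays and $I_i$ are external inputs; the existence and exponential stability conditions are stated in terms of these functions.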

Development of a Pipeline Monitoring System by Bio-mimetic Robots

Pipeline exploration is one of many applications of bio-mimetic robots. Such a robot may work in ordinary buildings, for example between ceilings and in ducts, as well as in the complicated and massive pipeline systems of large industrial plants. The bio-mimetic robot finds any troubled area or malfunction and then reports its data; importantly, it can not only prepare for but also react to abnormal routes in the pipeline. Pipeline monitoring tasks require special types of mobile robots: for effective movement along a pipeline, the motion of the robot should be similar to that of insects or crawling animals. While moving along the pipelines, a pipeline monitoring robot has the important task of recognizing the shape of the approaching path on the pipes. In this paper we propose an effective solution to this pipeline pattern recognition problem, based on fuzzy classification rules applied to measured IR distance data.
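As an illustration of how fuzzy rules could map IR distance readings to a path shape, the following is a minimal sketch using triangular membership functions; the linguistic terms, rule base and thresholds are hypothetical and are not the rules derived in the paper.

    # Minimal sketch of fuzzy classification of pipe shape from IR distances.
    # Membership functions, rules and thresholds are illustrative.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support (a, c) and peak b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    def classify_path(front_cm, side_cm):
        front_near = tri(front_cm, 0, 5, 15)     # "front distance is near"
        side_far = tri(side_cm, 10, 30, 60)      # "side distance is far"
        # example rules (min as AND): straight pipe vs. elbow/branch ahead
        straight = min(1 - front_near, 1 - side_far)
        elbow = min(front_near, side_far)
        return "elbow" if elbow > straight else "straight"

    print(classify_path(front_cm=4.0, side_cm=35.0))   # -> "elbow"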

New Features for Specific JPEG Steganalysis

We present in this paper a new approach for specific JPEG steganalysis based on the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both while also controlling the alteration of the compressed data. We have observed a deviation of the entropy of the compressed data after a first embedding; this deviation is greater when the image is a cover medium than when it is already a stego image. To observe this deviation, we introduce new statistical features and combine them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step, which makes it possible to design detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for well-known steganographic algorithms, Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work at low embedding rates (< 10^-5), and according to the avalanche criterion of the RLE and Huffman compression steps, its efficiency is independent of the quantity of hidden information.
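The final classification stage can be sketched as a two-class Fisher linear discriminant, shown below on synthetic feature vectors; the features are placeholders for the entropy-based statistics described above, not real cover/stego data.

    # Minimal sketch of a two-class Fisher linear discriminant on synthetic
    # feature vectors standing in for cover/stego statistics.
    import numpy as np

    def fisher_discriminant(X_cover, X_stego):
        """Return projection vector w and a decision threshold."""
        m0, m1 = X_cover.mean(axis=0), X_stego.mean(axis=0)
        Sw = np.cov(X_cover, rowvar=False) + np.cov(X_stego, rowvar=False)
        w = np.linalg.solve(Sw, m1 - m0)
        thr = 0.5 * (X_cover @ w).mean() + 0.5 * (X_stego @ w).mean()
        return w, thr

    rng = np.random.default_rng(4)
    cover = rng.normal(0.0, 1.0, size=(300, 6))
    stego = rng.normal(0.3, 1.0, size=(300, 6))
    w, thr = fisher_discriminant(cover, stego)
    print("stego detection rate:", ((stego @ w) > thr).mean())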

Optimization of Inverse Kinematics of a 3R Robotic Manipulator using Genetic Algorithms

In this paper, the direct kinematic model of a three-degree-of-freedom, multi-application industrial manipulator is developed using homogeneous transformation matrices and the Denavit-Hartenberg parameters. The inverse kinematic model is developed with the same method, and it is verified that, near the border of the workspace, the inverse kinematics exhibits considerable errors. A genetic algorithm is therefore implemented to optimize the model, greatly improving its efficiency.
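A minimal sketch of the GA-based refinement is given below for a planar 3R arm reaching a target point; the link lengths, GA settings and the planar simplification are illustrative assumptions rather than the manipulator modeled in the paper.

    # Minimal sketch of a GA searching joint angles of a planar 3R arm so that
    # the tip reaches a target; link lengths and GA settings are assumed.
    import numpy as np

    L = np.array([0.4, 0.3, 0.2])                     # link lengths (m), assumed

    def forward(q):
        """Planar forward kinematics of a 3R chain."""
        angles = np.cumsum(q)
        return np.array([np.sum(L * np.cos(angles)), np.sum(L * np.sin(angles))])

    def ga_ik(target, pop=60, gens=200, sigma=0.1):
        rng = np.random.default_rng(5)
        Q = rng.uniform(-np.pi, np.pi, size=(pop, 3))                 # initial population
        for _ in range(gens):
            err = np.array([np.linalg.norm(forward(q) - target) for q in Q])
            parents = Q[np.argsort(err)[:pop // 2]]                   # truncation selection
            children = parents + rng.normal(0, sigma, parents.shape)  # Gaussian mutation
            Q = np.vstack([parents, children])
        err = np.array([np.linalg.norm(forward(q) - target) for q in Q])
        return Q[np.argmin(err)], err.min()

    q_best, residual = ga_ik(np.array([0.5, 0.4]))
    print(q_best, residual)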

Egyptian Electronic Government: The University Enrolment Case Study

E-government projects have the potential to increase the efficiency and effectiveness of government operations. For this reason, many developing-country governments have invested heavily in this agenda, and an increasing number of e-government projects are being implemented. However, there is a lack of clear case material describing the potentialities and consequences experienced by organizations trying to manage this change. The Ministry of State for Administrative Development (MSAD) has been the organization responsible for the e-government program in Egypt since early 2004. This paper presents a case study of the process of admission to public universities and institutions in Egypt, which is led by MSAD. Underlining the key benefits of the initiative, explaining the strategies and development steps used to implement it, and highlighting the main obstacles encountered and how they were overcome will help repeat the experience in other useful e-government projects.

Redundancy in Steel Frames with Masonry Infill Walls

Structural redundancy is an important issue in the seismic design of structures. Initially, structural redundancy was described as the degree of static indeterminacy of a system. Although many definitions of redundancy in structures have been presented, recently the definition has been related to the configuration of the structural system and the number of lateral-load-transferring directions in the structure. Steel frames with infill walls are a common system in the construction of ordinary residential buildings in some countries, and it is well established that the performance of such structures is affected by the addition of masonry infill walls. In order to investigate the effect of infill walls on the redundancy of steel frames constructed with masonry walls, the components of redundancy, including the redundancy variation index, the redundancy strength index and the redundancy response modification factor, were extracted for frames with masonry infills. Several steel frames with typical numbers of storeys and various numbers of bays were designed and considered, and the redundancy of the frames with and without infill walls was evaluated by the proposed method. The results show that the presence of infill increases redundancy.

On Method of Fundamental Solution for Nondestructive Testing

Nondestructive testing in engineering is an inverse Cauchy problem for the Laplace equation. In this paper, the nondestructive testing problem is expressed by a Laplace equation with third-kind boundary conditions. In order to find the unknown values on the boundary, the method of fundamental solutions is introduced and implemented. Because of the ill-posedness of the studied problems, the TSVD regularization technique, in combination with the L-curve criterion and the generalized cross-validation (GCV) criterion, is employed. Numerical results show that the TSVD method combined with the L-curve criterion is more efficient than the TSVD method combined with the GCV criterion.
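The TSVD step can be sketched as follows for a generic ill-conditioned linear system A x = b arising from discretization; the test matrix, noise level and truncation levels are illustrative, and the L-curve/GCV choice of the truncation level is not reproduced here.

    # Minimal sketch of truncated-SVD (TSVD) regularization of an
    # ill-conditioned linear system A x = b; the test problem is illustrative.
    import numpy as np

    def tsvd_solve(A, b, k):
        """Solve A x = b keeping only the k largest singular values."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        coeffs = (U.T @ b)[:k] / s[:k]
        return Vt[:k].T @ coeffs

    # ill-conditioned test problem (Hilbert matrix)
    n = 12
    A = 1.0 / (np.arange(1, n + 1)[:, None] + np.arange(n)[None, :])
    x_true = np.ones(n)
    b = A @ x_true + 1e-6 * np.random.default_rng(6).normal(size=n)

    for k in (4, 8, 12):
        x = tsvd_solve(A, b, k)
        print(k, np.linalg.norm(x - x_true))   # trade-off: over-truncation vs. noise blow-up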

Power-Efficient OFDM Signals with Reduced Symbol Aperiodic Autocorrelation

Three new algorithms, based on minimization of the autocorrelation of the transmitted symbols and on the SLM approach, which are computationally less demanding, are proposed. In the first algorithm, the autocorrelation of the complex data sequence is minimized to a value of 1, which results in a reduction of the PAPR. The second algorithm generates multiple random sequences from the sequence produced by the first algorithm, all with the same autocorrelation value of 1; of these, the sequence with the minimum PAPR is transmitted. The third algorithm is an extension of the second and requires minimal side information to be transmitted: multiple sequences are generated by modifying a fixed number of complex symbols in an OFDM data sequence using only one factor. The multiple sequences represent the same data sequence, and the one giving the minimum PAPR is transmitted. Simulation results for a 256-subcarrier OFDM system show that significant PAPR reduction is achieved using the proposed algorithms.
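To make the selection criterion concrete, the sketch below computes the PAPR of an OFDM symbol and picks the lowest-PAPR candidate among phase-rotated versions of the same data (SLM-style); the subcarrier count, candidate count and phase set are illustrative and this is not the exact candidate-generation rule of the proposed algorithms.

    # Minimal sketch of PAPR computation and SLM-style candidate selection.
    # Parameters (N, candidate count, phase set) are illustrative.
    import numpy as np

    def papr_db(freq_symbols, oversample=4):
        """Peak-to-average power ratio of the time-domain OFDM signal in dB."""
        n = len(freq_symbols)
        # zero-padded IFFT gives a finer time grid (approximate oversampling)
        x = np.fft.ifft(freq_symbols, n * oversample)
        p = np.abs(x) ** 2
        return 10 * np.log10(p.max() / p.mean())

    rng = np.random.default_rng(7)
    N = 256
    data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=N)   # QPSK symbols

    candidates = [data * np.exp(1j * rng.choice([0, np.pi / 2, np.pi, 3 * np.pi / 2], size=N))
                  for _ in range(8)]
    best = min(candidates, key=papr_db)
    print("original PAPR:", papr_db(data), "dB; best candidate:", papr_db(best), "dB")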