Investigation of the Electronic Properties of Au/methyl-red/Ag Surface-Type Schottky Diode by the Current-Voltage Method

In this paper, the fabrication and the electronic properties of an Au/methyl-red/Ag surface-type Schottky diode, studied by the current-voltage (I-V) method, are reported. The I-V characteristics of the Schottky diode showed good rectifying behavior. The values of the ideality factor n and the barrier height Φb of the Au/methyl-red/Ag Schottky diode were calculated from the semi-log I-V characteristics and by using the Cheung functions. From the semi-log current-voltage characteristics, the values of n and Φb were found to be 1.93 and 0.254 eV, respectively, while by using the Cheung functions they were calculated to be 1.89 and 0.26 eV, respectively. The effect of series resistance was also analyzed by the Cheung functions. The series resistance RS values determined from the dV/d(lnI)-I and H(I)-I plots were found to be 1.1 kΩ and 1.3 kΩ, respectively.
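
These quantities follow from the standard thermionic-emission relation I = I0[exp(qV/nkT) - 1] and the Cheung plots dV/d(lnI) = n(kT/q) + I·RS and H(I) = n·Φb + I·RS. The Python sketch below illustrates the extraction under stated assumptions; the diode area and effective Richardson constant are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of extracting n, phi_b, and R_s from forward-bias I-V data of
# a Schottky diode via thermionic-emission theory and the Cheung functions.
# The diode area A and effective Richardson constant A_star are assumptions.
import numpy as np

q, k, T = 1.602e-19, 1.381e-23, 300.0          # charge (C), Boltzmann (J/K), temperature (K)
A, A_star = 1e-6, 120e4                        # assumed area (m^2) and A* (A m^-2 K^-2)

def semilog_parameters(V, I):
    """Fit ln(I) vs V in the low-bias linear forward region to get n and phi_b."""
    slope, intercept = np.polyfit(V, np.log(I), 1)
    n = q / (k * T * slope)
    I0 = np.exp(intercept)                     # saturation current
    phi_b = (k * T / q) * np.log(A * A_star * T**2 / I0)
    return n, phi_b

def cheung_parameters(V, I):
    """Cheung plots: dV/dln(I) = I*Rs + n*kT/q and H(I) = I*Rs + n*phi_b."""
    dV_dlnI = np.gradient(V, np.log(I))
    rs1, n_kT_q = np.polyfit(I, dV_dlnI, 1)
    n = n_kT_q * q / (k * T)
    H = V - n * (k * T / q) * np.log(I / (A * A_star * T**2))
    rs2, n_phib = np.polyfit(I, H, 1)
    return n, n_phib / n, rs1, rs2             # n, phi_b, and Rs from both plots
```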

Development Trend in Investigation of Residual Stresses in WC-Co Coating by HVOF Thermal Spraying

In this paper, the techniques for estimating the residual stress in high velocity oxy-fuel (HVOF) thermal spray coatings are discussed and compared. The development trend and the most recent investigations are reviewed. It appears that there is still no effective analytical or numerical study on the effect of the peening action in HVOF.

Lowering Error Floors by Concatenation of Low-Density Parity-Check and Array Code

Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g. trapping sets) in the Tanner graph of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floor of the LDPC code. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. The approach applies an iterative hybrid process between BCJR decoding for the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The (2640, 1320) Margulis code has been used for the simulation, and it is shown that the proposed concatenation and decoding scheme can considerably improve the error floor performance with minimal rate loss.
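
For illustration only (this is not the paper's concatenated BCJR/SPA scheme), the sketch below shows a simple hard-decision bit-flipping decoder of the kind referred to above, operating on an LDPC-style parity-check matrix; the toy matrix and received word are made up.

```python
# Illustrative hard-decision bit-flipping decoder: flip the bits that
# participate in the largest number of unsatisfied parity checks.
import numpy as np

def bit_flip_decode(H, r, max_iter=50):
    """H: (m, n) binary parity-check matrix, r: length-n hard-decision vector."""
    x = r.copy()
    for _ in range(max_iter):
        syndrome = H @ x % 2                    # which checks are unsatisfied
        if not syndrome.any():
            return x, True                      # valid codeword found
        unsat_count = H.T @ syndrome            # unsatisfied checks touching each bit
        x[unsat_count == unsat_count.max()] ^= 1
    return x, False

# toy (7,4) Hamming-style example: all-zero codeword with bit 4 flipped
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.array([0, 0, 0, 1, 0, 0, 0])
decoded, ok = bit_flip_decode(H, received)      # recovers the all-zero codeword
```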

Data Envelopment Analysis under Uncertainty and Risk

Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated on a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique, providing the decision maker with useful insights depending on his or her degree of risk aversion.
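
For reference, the sketch below shows the deterministic, input-oriented CCR envelopment model that such stochastic extensions build on (not the authors' risk-neutral or risk-averse formulations); the input/output data are hypothetical.

```python
# Deterministic input-oriented CCR DEA: for DMU o, minimise theta subject to
# X'lambda <= theta * x_o (inputs) and Y'lambda >= y_o (outputs), lambda >= 0.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs,  rows = DMUs
Y = np.array([[1.0],      [1.0],      [1.5]])        # outputs, rows = DMUs
n_dmu = X.shape[0]

def ccr_efficiency(o):
    """Efficiency score theta of DMU o (theta = 1 means efficient)."""
    c = np.r_[1.0, np.zeros(n_dmu)]                  # decision vars: [theta, lambda_1..n]
    A_in = np.c_[-X[o], X.T]                         # sum_j lam_j*x_ij - theta*x_io <= 0
    b_in = np.zeros(X.shape[1])
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]        # -sum_j lam_j*y_rj <= -y_ro
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out])
    return res.x[0]

scores = [ccr_efficiency(o) for o in range(n_dmu)]
```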

A Novel Framework for Abnormal Behaviour Identification and Detection for Wireless Sensor Networks

Despite extensive study of wireless sensor network security, defending against internal attacks and identifying abnormal behaviour of sensors remain difficult and unsolved tasks. Conventional cryptographic techniques do not provide a robust security or detection process to protect the network from internal attackers whose attacks are caused by abnormal behaviour. A framework for identifying insider attackers or abnormally behaving sensors and detecting their location, using false message detection and Time Difference of Arrival (TDoA), is presented in this paper. It is shown that the new framework can efficiently identify and locate the insider attacker, so that the attacker can be reprogrammed or removed from the network to protect it from internal attacks.
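
As a sketch of the TDoA ingredient only (the anchor layout and node position are assumptions, and the paper's false-message detection step is not modeled), locating a suspect node from time differences of arrival can be set up as a small nonlinear least-squares problem:

```python
# Illustrative 2D TDoA localisation of a suspect node; measurements are
# expressed as range differences (time difference times propagation speed).
import numpy as np
from scipy.optimize import least_squares

anchors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
true_pos = np.array([62.0, 37.0])                  # made-up ground truth

dists = np.linalg.norm(anchors - true_pos, axis=1)
range_diffs = dists[1:] - dists[0]                 # simulated TDoA w.r.t. anchor 0

def residuals(p):
    """Mismatch between predicted and measured range differences at point p."""
    d = np.linalg.norm(anchors - p, axis=1)
    return (d[1:] - d[0]) - range_diffs

estimate = least_squares(residuals, x0=np.array([50.0, 50.0])).x
print(estimate)                                    # close to true_pos
```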

Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile

The paper focuses on context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are subsequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles and components, component instances, and descriptions of the respective ubiquitous applications.

Correction of Frequent English Writing Errors by Using Coded Indirect Corrective Feedback and Error Treatment

The purposes of this study are 1) to study the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to find out the results of writing error correction using coded indirect corrective feedback and writing error treatments. The sample comprised 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The instrument for the experimental study was the lesson plan of the course Reading and Writing English for Academic Purposes II, and the instrument for data collection was four writing tests of short texts. The research findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely sentence fragments, subject-verb agreement, wrong form of verb tense, singular or plural noun endings, run-on sentences, wrong form of verb pattern, and lack of parallel structure. Moreover, it is found that writing error correction using coded indirect corrective feedback and error treatment yields an overall reduction of the frequent English writing errors and an increase in students' achievement in the writing of short texts, with significance at the .05 level.

Harmful Effect of Ambient Ozone on Growth and Productivity of Two Legume Crops, Vicia faba and Pisum sativum, in Riyadh City, K.S.A.

Ozone (O3) is considered one of the most phytotoxic pollutants, with deleterious effects on the living and non-living components of ecosystems. It reduces the growth and yield of many crops as well as altering their physiology and quality. The present study describes a series of experiments investigating the effects of ambient O3 at locations with different ambient O3 levels, depending on proximity to the pollutant source, ranging from 17 ppb/h in the control experiment to 112 ppb/h in the industrial area. The ambient levels at the other three locations (King Saud University botanical garden, King Fahd Rd, and Almanakh Garden) were 61, 61, and 77 ppb/h, respectively. Two legume crop species (Vicia faba L. and Pisum sativum), differing in their phenology and sensitivity, were used. The results showed a significant negative effect of ozone on morphology, number of injured leaves, growth, and productivity, with differences in the degree of response depending on the plant type. Vicia faba showed sensitivity to ozone in terms of leaf number, leaf area, and degree of leaf injury (grade 3), whereas Pisum sativum showed higher sensitivity to the gas in terms of degree of injury (grade 1), relative growth rate, and seed weight; there was no significant difference between the two plants in plant height and number of seeds.

Knowledge Sharing: A Survey, Assessment and Directions for Future Research: Individual Behavior Perspective

One of the most important areas of knowledge management studies is knowledge sharing. Measured in terms of the number of scientific articles and organizational applications, knowledge sharing stands as an example of success in the field. This paper reviews the related literature in the context of the underlying individual behavioral variables in order to provide a framework of directions for future research and writing.

Synthesis of Peptide Amides using Sol-Gel Immobilized Alcalase in Batch and Continuous Reaction System

Two commercial proteases from Bacillus licheniformis (Alcalase 2.4 L FG and Alcalase 2.5 L, Type DX) were screened for the production of Z-Ala-Phe-NH2 in batch reaction. Alcalase 2.4 L FG was the more efficient enzyme for the C-terminal amidation of Z-Ala-Phe-OMe using ammonium carbamate as the ammonium source. Immobilization of the protease was achieved by the sol-gel method, using dimethyldimethoxysilane (DMDMOS) and tetramethoxysilane (TMOS) as precursors (unpublished results). In batch production, about 95% Z-Ala-Phe-NH2 was obtained at 30°C after 24 hours of incubation. The reproducibility of different batches of commercial Alcalase 2.4 L FG preparations was also investigated by evaluating the amidation activity and, in the case of immobilization, the entrapment yields. A packed-bed reactor (0.68 cm ID, 15.0 cm long) was operated successfully for the continuous synthesis of peptide amides. The immobilized enzyme retained its initial activity over 10 cycles of repeated use in the continuous reactor at ambient temperature. At a 0.75 mL/min flow rate of the substrate mixture, total conversion of Z-Ala-Phe-OMe was achieved after 5 hours of substrate recycling. The product contained about 90% peptide amide and 10% hydrolysis byproduct.

A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions

In this paper, the optimum weight and cost of a laminated composite plate are sought while it sustains the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), operating directly on real variables, was employed as the optimization technique. However, since optimization via GAs is a lengthy process and most of the time is consumed by the analysis, a Radial Basis Function Neural Network (RBFNN) was employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
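
For reference, the Tsai-Hill criterion used here evaluates, for each ply, a failure index from the in-plane stresses and the lamina strengths; the sketch below uses illustrative strength values (roughly typical of a glass/epoxy lamina), not data from the paper.

```python
# Minimal sketch of the Tsai-Hill failure index for a single lamina; the ply
# is predicted to fail when the index reaches 1. Strengths X, Y, S are assumed.
def tsai_hill_index(sigma1, sigma2, tau12, X=1000.0, Y=40.0, S=60.0):
    """Failure index for in-plane stresses (all values in MPa)."""
    return (sigma1 / X) ** 2 - sigma1 * sigma2 / X ** 2 \
         + (sigma2 / Y) ** 2 + (tau12 / S) ** 2

# a ply is safe under this stress state as long as the index stays below 1
print(tsai_hill_index(sigma1=400.0, sigma2=10.0, tau12=20.0))
```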

Study on Extraction of Niobium Oxide from Columbite–Tantalite Concentrate

The principal objective of this study is to extract niobium oxide from the columbite-tantalite concentrate of the Thayet Kon Area in Nay Phi Taw. It is recovered from a columbite-tantalite concentrate which contains 19.29% Nb2O5. The recovery of niobium oxide from the columbite-tantalite concentrate can be divided into three main sections, namely digestion of the concentrate, recovery from the leached solution, and precipitation and calcination. The concentrate was digested with hydrofluoric acid and sulfuric acid. Of the various parameters that affect the process, acidity and time were studied. In the recovery section, a solvent extraction process using methyl isobutyl ketone was investigated. Ammonium hydroxide was used as the precipitating agent and the precipitate was later calcined. The percentage of niobium oxide obtained was 74%.

Acidity of Different Jordanian Clays Characterized by TPD-NH3 and MBOH Conversion

The acidity of different raw Jordanian clays containing zeolite, bentonite, red and white kaolinite, and diatomite was characterized by means of temperature-programmed desorption (TPD) of ammonia, conversion of 2-methyl-3-butyn-2-ol (MBOH), FTIR, and BET measurements. FTIR spectra proved the presence of silanol and bridged hydroxyl groups on the clay surface. The number of acidic sites was calculated from the experimental TPD profiles. We observed that the decrease in surface acidity correlates with the decrease in Si/Al ratio, except for diatomite. In the TPD plot for zeolite, two maxima were registered due to the different strengths of the surface acidic sites. Values of MBOH conversion, product yields, and selectivity were calculated for catalysis on the Jordanian clays. We found that all clay samples are able to convert MBOH into a major product, 3-methyl-3-buten-1-yne (MBYNE), catalyzed by acidic surface sites with a selectivity close to 70%. A correlation was found between MBOH conversion and the acidity of the clays determined by TPD-NH3, i.e. the higher the acidity, the higher the conversion of MBOH. However, diatomite provided the lowest conversion of MBOH as a result of the poor polarization of its silanol groups. Comparison of surface areas and conversions revealed the highest density of active sites for red kaolinite and the lowest for zeolite and diatomite.
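
For clarity, conversion and selectivity are computed from inlet and outlet amounts in the usual way; the sketch below uses hypothetical numbers purely to show the arithmetic, not measured values from this work.

```python
# Standard definitions of conversion and selectivity for a test reaction.
def conversion(n_in, n_out):
    """Fraction of MBOH converted."""
    return (n_in - n_out) / n_in

def selectivity(n_product, n_all_products):
    """Share of one product among all products formed."""
    return n_product / sum(n_all_products)

X = conversion(n_in=1.00, n_out=0.55)                       # 45% converted (example)
S_mbyne = selectivity(n_product=0.32, n_all_products=[0.32, 0.08, 0.05])
```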

On Pattern-Based Programming towards the Discovery of Frequent Patterns

The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear frequently in data. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. Such a paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery. We consider two languages: Haskell and Prolog. Our intuition is that the problem of finding frequent patterns should be efficiently and concisely implemented via a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies on lines of code, speed, and memory usage of declarative versus imperative programming are reported in the paper.
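
To make the task concrete (the paper's implementations are in Haskell and Prolog; this is only an illustrative Python rendering of the same level-wise search), a minimal Apriori-style frequent itemset miner can be written as follows.

```python
# Level-wise frequent itemset mining: count k-itemsets, keep the frequent
# ones, join them into (k+1)-candidates, and repeat until nothing survives.
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Return all itemsets appearing in at least min_support transactions."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    frequent, k = {}, 1
    candidates = [frozenset([i]) for i in sorted(items)]
    while candidates:
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        keys = list(level)                      # join step for the next level
        candidates = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent

fs = frequent_itemsets([{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}], 2)
```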

Programmable Logic Controller for Cassava Centrifugal Machine

Chaiyaphum Starch Co. Ltd. is one of many starch manufacturers that has introduced machinery to aid in manufacturing. Even though machinery has replaced many elements and is now a significant part of manufacturing processes, problems that must be solved with respect to the current process flow to increase efficiency still exist. The paper's aim is to increase productivity while maintaining the desired starch quality by redesigning the flipping machine's mechanical control system, which has a grossly low functional lifetime. Such problems stem from the mechanical control system's bearings, as fluids and humidity can enter the bearings directly, in tandem with vibrations from the machine's operation itself. The wheel that is used to sense starch thickness occasionally falls from its shaft due to high-speed rotation during operation, while the shaft may bend from impact when processing dried bread. Redesigning the mechanical control system has increased its efficiency, allowing quality thickness measurement while increasing functional lifetime by an additional 62 days.

Probability Distribution of Rainfall Depth at Hourly Time-Scale

Rainfall data at fine resolution and knowledge of their characteristics play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth for 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics were estimated. Three probability distributions, namely the Generalized Pareto, Exponential, and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared tests, were used to evaluate their fitness. Results indicate that the east coast of the Peninsula receives a greater depth of rainfall than the west coast; however, the rainfall frequency is found to be irregular. Results from the goodness-of-fit tests also show that all three models fit the rainfall data at the 1% level of significance. However, the Generalized Pareto fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
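
The fitting-and-testing step can be sketched as follows; the code uses synthetic wet-hour depths (not the Malaysian station data) and shows only the Kolmogorov-Smirnov screening, with parameters estimated by maximum likelihood.

```python
# Fit the three candidate distributions to hourly rainfall depths and screen
# them with the Kolmogorov-Smirnov statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
depths = rng.gamma(shape=0.8, scale=5.0, size=500)        # synthetic depths (mm)

candidates = {
    "genpareto":   stats.genpareto,
    "exponential": stats.expon,
    "gamma":       stats.gamma,
}
for name, dist in candidates.items():
    params = dist.fit(depths, floc=0)                     # fix location at zero
    ks_stat, p_value = stats.kstest(depths, dist.cdf, args=params)
    print(f"{name:12s} KS={ks_stat:.3f}  p={p_value:.3f}")
```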

Numerical Investigation on Damage Evolution of Piles inside Liquefied Soil Foundation: Dynamic-Loading Experiments

Large- and small-scale shaking table tests, which were conducted to investigate the damage evolution of piles inside liquefied soil, are numerically simulated and experimentally verified by 3D nonlinear finite element analysis. The damage evolution of elasto-plastic circular steel piles and of reinforced concrete (RC) piles with cracking and yielding of reinforcement is focused on, and the failure patterns and residual damage are captured by the proposed constitutive models. The superstructure excitation behind the quay wall is reproduced as well.

A Framework for Ranking Quality of Information on Weblog

The vast amount of information on the World Wide Web is created and published by many different types of providers. Unlike books and journals, most of this information is not subject to editing or peer review by experts. This lack of quality control and the explosion of web sites make the task of finding quality information on the web especially critical. Meanwhile, new facilities for producing web pages, such as blogs, make this issue more significant, because blogs have simple content management tools that enable non-experts to build easily updatable web diaries or online journals. On the other hand, despite a decade of active research in information quality (IQ), there is still no framework for measuring information quality on blogs. This paper presents a novel experimental framework for ranking the quality of information on weblogs. The results of data analysis revealed seven IQ dimensions for the weblog. For each dimension, variables and related coefficients were calculated so that the presented framework is able to assess the IQ of weblogs automatically.

Simulating Gradient Contour and Mesh of a Scalar Field

This research paper is based upon the simulation of the gradient of mathematical functions and scalar fields using MATLAB. Scalar fields, their gradients, contours, and mesh/surfaces are simulated using the related MATLAB tools and commands for convenient presentation and understanding. Different mathematical functions and scalar fields are examined by taking their gradient, visualizing the results in 3D with different color shadings, and using other relevant commands. In this way, the outputs of the required functions help us to analyze and understand gradients better than a purely theoretical study would.
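
Since the paper's scripts are in MATLAB, the equivalent workflow is sketched below in Python (numpy/matplotlib) for an example scalar field f(x, y) = x·exp(-x² - y²); the field is chosen only for illustration.

```python
# Evaluate a scalar field on a grid, compute its numerical gradient, and plot
# contours with the gradient overlaid, plus a 3D surface (mesh) view.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-2, 2, 40)
y = np.linspace(-2, 2, 40)
X, Y = np.meshgrid(x, y)
F = X * np.exp(-X**2 - Y**2)                      # example scalar field f(x, y)

dFdy, dFdx = np.gradient(F, y, x)                 # numerical partial derivatives

fig = plt.figure(figsize=(10, 4))
ax1 = fig.add_subplot(1, 2, 1)
ax1.contour(X, Y, F, levels=15)                   # contour lines of f
ax1.quiver(X, Y, dFdx, dFdy)                      # gradient vector field
ax2 = fig.add_subplot(1, 2, 2, projection="3d")
ax2.plot_surface(X, Y, F, cmap="viridis")         # mesh/surface view
plt.show()
```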

Applications of Stable Distributions in Time Series Analysis, Computer Sciences and Financial Markets

In this paper, we first introduce the stable distribution, the stable process, and their characteristics. The α-stable distribution family has received great interest in the last decade due to its success in modeling data that are too impulsive to be accommodated by the Gaussian distribution. In the second part, we present major applications of the α-stable distribution in telecommunications, in computer science (such as network delays and signal processing), and in financial markets. At the end, we focus on using stable distributions to estimate measures of risk in stock markets and present simulated data using statistical software.
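
As a minimal sketch of that closing application (the return series, stable parameters, and confidence level are all illustrative, not market data), a heavy-tailed α-stable model yields a noticeably larger Value-at-Risk estimate than a Gaussian fit.

```python
# Compare 99% one-day Value-at-Risk under an alpha-stable model and under a
# Gaussian fit of the same synthetic return series.
import numpy as np
from scipy.stats import levy_stable, norm

rng = np.random.default_rng(1)
returns = levy_stable.rvs(alpha=1.7, beta=0.0, loc=0.0, scale=0.01,
                          size=5000, random_state=rng)     # synthetic "returns"

var_stable = -levy_stable.ppf(0.01, 1.7, 0.0, loc=0.0, scale=0.01)
mu, sigma = norm.fit(returns)                               # thin-tailed benchmark
var_gaussian = -norm.ppf(0.01, loc=mu, scale=sigma)
print(f"stable VaR: {var_stable:.3%}   Gaussian VaR: {var_gaussian:.3%}")
```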