Acidity of Different Jordanian Clays Characterized by TPD-NH3 and MBOH Conversion

The acidity of different raw Jordanian clays containing zeolite, bentonite, red and white kaolinite, and diatomite was characterized by means of temperature-programmed desorption (TPD) of ammonia, conversion of 2-methyl-3-butyn-2-ol (MBOH), FTIR, and BET measurements. FTIR spectra proved the presence of silanol and bridged hydroxyl groups on the clay surfaces. The number of acidic sites was calculated from the experimental TPD profiles. We observed that surface acidity decreases with decreasing Si/Al ratio, except for diatomite. Two maxima were registered on the TPD profile of zeolite, owing to surface acidic sites of different strengths. Values of MBOH conversion, product yields, and selectivity were calculated for catalysis over the Jordanian clays. All clay samples were able to convert MBOH, over acidic surface sites, into a major product, 3-methyl-3-buten-1-yne (MBYNE), with a selectivity close to 70%. A correlation was found between MBOH conversion and the acidity of the clays determined by TPD-NH3: the higher the acidity, the higher the conversion of MBOH. However, diatomite provided the lowest conversion of MBOH as a result of the poor polarization of its silanol groups. Comparison of surface areas and conversions revealed the highest density of active sites for red kaolinite and the lowest for zeolite and diatomite.
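
As a minimal sketch of how the number of acidic sites can be estimated from a TPD-NH3 profile, the following Python fragment integrates the desorption signal over time; the profile values, calibration factor, and sample mass are placeholders, not data from the paper.

    # Hypothetical sketch: estimating acid-site density from a TPD-NH3 profile.
    # Signal values, calibration factor and sample mass are placeholders.
    import numpy as np

    temperature = np.linspace(100.0, 600.0, 6)                # deg C (coarse grid)
    signal = np.array([0.02, 0.15, 0.40, 0.22, 0.08, 0.01])   # detector signal (a.u.)
    heating_rate = 10.0    # K/min; converts the temperature axis to a time axis
    calibration = 5.0e-6   # mol NH3 per (signal unit x min), instrument-specific
    sample_mass = 0.10     # g of clay

    time = temperature / heating_rate                    # minutes (dt = dT / rate)
    desorbed_nh3 = calibration * np.trapz(signal, time)  # total mol NH3 desorbed

    # Assuming one desorbed NH3 molecule titrates one acidic site:
    print(f"Acid site density: {desorbed_nh3 / sample_mass:.2e} mol/g")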

On Pattern-Based Programming towards the Discovery of Frequent Patterns

The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear frequently in data. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. This paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery. We consider two languages: Haskell and Prolog. Our intuition is that the problem of finding frequent patterns should admit an efficient and concise implementation in a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of lines of code, speed, and memory usage for declarative versus imperative programming are reported in the paper.
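
The paper's implementations are in Haskell and Prolog; purely to illustrate the frequent pattern discovery task itself, the sketch below counts itemset support in an Apriori-like fashion in Python. The transactions and threshold are hypothetical, and this is not the paper's code.

    # Minimal Apriori-style frequent itemset sketch (illustrative only).
    from itertools import combinations

    def frequent_itemsets(transactions, min_support):
        """Return every itemset contained in at least min_support transactions."""
        items = sorted({i for t in transactions for i in t})
        result, k = {}, 1
        candidates = [frozenset([i]) for i in items]
        while candidates:
            # Support counting: how many transactions contain each candidate
            counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
            frequent = {c: n for c, n in counts.items() if n >= min_support}
            result.update(frequent)
            # Candidate generation: unions of frequent k-itemsets of size k + 1
            candidates = list({a | b for a, b in combinations(frequent, 2)
                               if len(a | b) == k + 1})
            k += 1
        return result

    transactions = [frozenset(t) for t in (["a", "b", "c"], ["a", "c"],
                                           ["a", "d"], ["b", "c"])]
    print(frequent_itemsets(transactions, min_support=2))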

Protocol and Method for Preventing Attacks from the Web

Nowadays, computer worms, viruses, and Trojan horses, collectively called malware, have become widespread. A decade ago, malware merely damaged computers by deleting or rewriting important files; recent malware, however, is designed to earn money. Some malware collects personal information so that malicious parties can obtain secrets such as online banking passwords, evidence of scandals, or contact addresses related to the target. Moreover, the relationship between money and malware has become more complex: many kinds of malware install bots to acquire springboards for further attacks. Meanwhile, the countermeasures available to ordinary Internet users have come up against a wall. Pattern matching wastes computer resources, since matching tools must handle the many signatures derived from variants, and virus-making tools can generate such variants automatically. Moreover, metamorphic and polymorphic malware are no longer rare. Recently, malware-checking sites have appeared that inspect content on behalf of users' PCs; in response, a new type of malicious site has emerged that evades such checks. In this paper, existing web-related protocols and methods are reconsidered in terms of protection against current attacks, and a new protocol and method are proposed for securing the web.

Static/Kinetic Friction Behaviour of a Clutch Facing Material: Effects of Temperature and Pressure

The feasibility of applying a simple and cost-effective sliding friction testing apparatus to study the friction behaviour of a clutch facing material, as affected by variations in temperature and contact pressure, was investigated. It was found that the method used in this work gives a convenient and cost-effective measurement of the friction coefficients of a clutch facing material and their transitions. The results obtained will be useful in the development of new facing materials.

Independent Spanning Trees for Routing on Systems-on-Chip Hypercubes

Independent spanning trees (ISTs) provide a number of advantages in data broadcasting; examples include their use in fault-tolerant network protocols for distributed computing and in increasing available bandwidth. However, the problem of constructing multiple ISTs is considered hard for arbitrary graphs. In this paper we present an efficient algorithm for constructing ISTs on hypercubes that requires minimal resources.
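
The construction itself is the subject of the paper and is not reproduced here; as a sketch of the property being built, the Python fragment below checks whether candidate spanning trees of the hypercube Q_n, given as parent arrays rooted at node 0, have pairwise internally vertex-disjoint root paths, which is the defining condition for ISTs.

    # Independence check for candidate spanning trees of the hypercube Q_n.
    # Each tree is a parent array rooted at node 0 (parent[0] == 0).

    def internal_path(parent, v):
        """Vertices strictly between v and the root 0 in one tree."""
        inner = set()
        while parent[v] != 0:
            v = parent[v]
            inner.add(v)
        return inner

    def are_independent(trees, n):
        """True if, for every vertex, the root paths of the trees share
        no internal vertices pairwise."""
        for v in range(1, 2 ** n):
            paths = [internal_path(t, v) for t in trees]
            for i in range(len(paths)):
                for j in range(i + 1, len(paths)):
                    if paths[i] & paths[j]:
                        return False
        return True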

Programmable Logic Controller for Cassava Centrifugal Machine

Chaiyaphum Starch Co. Ltd. is one of many starch manufacturers that has introduced machinery to aid in manufacturing. Even though machinery has replaced many elements and is now a significant part of manufacturing processes, problems remain that must be solved in the current process flow to increase efficiency. The paper's aim is to increase productivity while maintaining the desired starch quality by redesigning the flipping machine's mechanical control system, which has a grossly low functional lifetime. The problems stem from the mechanical control system's bearings, as fluids and humidity can penetrate the bearings directly, in tandem with vibrations from the machine's own operation. The wheel used to sense starch thickness occasionally falls from its shaft, owing to high-speed rotation during operation, while the shaft may bend from impact when processing dried bread. Redesigning the mechanical control system has increased its efficiency, allowing quality thickness measurement while extending the functional lifetime by an additional 62 days.

Analysis of Air Quality in the Outdoor Environment of the City of Messina by an Application of the Pollution Index Method

This paper reports an analysis of the outdoor air pollution of the urban centre of the city of Messina. The variations in the concentrations of the most critical pollutants (PM10, O3, CO, C6H6) and their trends with respect to climatic parameters and vehicular traffic have been studied. Linear regressions were performed to represent the relations among the pollutants; the differences between pollutant concentrations on weekends and weekdays were also analyzed. In order to evaluate air pollution and its effects on human health, a method for calculating a pollution index was implemented and applied in the urban centre of the city. This index is based on the weighted mean of the concentrations of the most detrimental air pollutants relative to their limit values for the protection of human health. The analyzed data on the polluting substances were collected by the Assessorship of the Environment of the Regional Province of Messina in the year 2004. A statistical analysis of the air quality index trends is also reported.
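
A minimal sketch of such an index is shown below, assuming a weighted mean of concentrations normalized by their limit values; the limits and weights used here are placeholders rather than the figures adopted in the paper.

    # Illustrative pollution index: weighted mean of pollutant concentrations
    # normalized by limit values for human-health protection.  The limit
    # values and weights are placeholders, not the paper's figures.

    LIMITS  = {"PM10": 50.0, "O3": 120.0, "CO": 10.0, "C6H6": 5.0}  # assumed units
    WEIGHTS = {"PM10": 0.3,  "O3": 0.3,   "CO": 0.2,  "C6H6": 0.2}  # sum to 1

    def pollution_index(conc):
        """Weighted mean of concentration/limit ratios; values above 1 mean
        the weighted exposure exceeds the protection thresholds."""
        return sum(WEIGHTS[p] * conc[p] / LIMITS[p] for p in LIMITS)

    print(pollution_index({"PM10": 42.0, "O3": 95.0, "CO": 3.1, "C6H6": 2.4}))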

Generator of Hypotheses: An Approach to Data Mining Based on Monotone Systems Theory

Generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts that hold in a subset of the data. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic, not probabilistic. The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).

Design of Synchronous Torque Couplers

This paper presents the design, analysis, and development of permanent magnet (PM) torque couplers. These couplers employ rare-earth magnets. Based on finite element analysis and earlier analytical works, both concentric and face-type synchronous couplers have been designed and fabricated. The experimental performance correlates well with the finite element calculations.

Continuity of Defuzzification and Its Application to Fuzzy Control

A mathematical framework for the study of fuzzy approximate reasoning is presented in this paper. Two important defuzzification methods (area defuzzification and height defuzzification) are described, besides the center of gravity method, which is the best-known defuzzification method. The continuity of these defuzzification methods and its application to fuzzy feedback control are discussed.
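
As a sketch of the three methods on a discretized output universe, the Python fragment below implements the center of gravity method, an area-based method (here taken as the bisector of area), and the height method as a membership-weighted mean of rule peak points; the paper's exact definitions of the area and height methods may differ.

    # Sketch of three defuzzification methods on a discretized universe.
    import numpy as np

    def center_of_gravity(y, mu):
        return np.sum(y * mu) / np.sum(mu)

    def bisector_of_area(y, mu):
        # Point splitting the area under mu into two equal halves
        cum = np.cumsum(mu)
        return y[np.searchsorted(cum, cum[-1] / 2.0)]

    def height_method(peaks, heights):
        # Membership-weighted mean of each rule's peak output value
        peaks, heights = np.asarray(peaks), np.asarray(heights)
        return np.sum(peaks * heights) / np.sum(heights)

    y = np.linspace(0.0, 10.0, 101)                  # output universe
    mu = np.maximum(1 - np.abs(y - 6.0) / 3.0, 0.0)  # aggregated fuzzy output
    print(center_of_gravity(y, mu), bisector_of_area(y, mu))
    print(height_method(peaks=[2.0, 6.0], heights=[0.3, 0.9]))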

Porous Ni and Ni-Co Electrodeposits for Alkaline Water Electrolysis – Energy Saving

Hydrogen is considered to be the most promising candidate as a future energy carrier. One of the most widely used technologies for electrolytic hydrogen production is alkaline water electrolysis; however, owing to its high energy requirements, the cost of hydrogen produced in this way is high. In a continuing search to improve this process with advanced electrocatalytic materials for the hydrogen evolution reaction (HER), Raney-type Ni and macro-porous Ni-Co electrodes were prepared on AISI 304 stainless steel substrates by electrodeposition. The developed electrodes were characterized by SEM and confocal laser scanning microscopy. The HER on these electrodes was evaluated in 30 wt.% KOH solution by means of hydrogen discharge curves and galvanostatic tests. The results show that the developed electrodes are more efficient for the HER than a smooth Ni cathode: a reduction of about 25% in the energy consumption of the electrolysis cell was obtained by using the developed coatings as cathodes.

Parallelization of Ensemble Kalman Filter (EnKF) for Oil Reservoirs with Time-lapse Seismic Data

In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic data leads to a long turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization it is therefore important to parallelize the analysis step as well. Our experiments show that parallelizing the analysis step in addition to the forecast step scales well, exploiting the same set of resources with some additional effort.
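
For reference, a minimal serial sketch of the standard EnKF analysis step with perturbed observations is shown below; the paper's contribution, the parallelization of this step, is not reproduced, and all dimensions are illustrative.

    # Serial sketch of the EnKF analysis step with perturbed observations.
    import numpy as np

    def enkf_analysis(A, H, d, R, rng):
        """A: n x N forecast ensemble; H: m x n observation operator;
        d: length-m observations; R: m x m observation error covariance."""
        N = A.shape[1]
        Ap = A - A.mean(axis=1, keepdims=True)        # ensemble perturbations
        P = Ap @ Ap.T / (N - 1)                       # sample forecast covariance
        D = d[:, None] + rng.multivariate_normal(np.zeros(len(d)), R, size=N).T
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        return A + K @ (D - H @ A)                    # analysis ensemble

    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))   # 20 state variables, 50 realizations
    H = np.eye(5, 20)                   # observe the first 5 state variables
    d = rng.standard_normal(5)
    R = 0.1 * np.eye(5)
    A_analysis = enkf_analysis(A, H, d, R, rng)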

A Taguchi Approach to Investigate the Impact of Factors on the Reusability of Software Components

A quantitative investigation of the contributions of individual factors to the measured reusability of software components could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use the Taguchi approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results show that complexity is the most important factor in deciding the reusability of function-oriented software, whereas for object-oriented software, coupling and complexity together play the significant role in achieving high reusability.
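
As a sketch of the statistic by which Taguchi analysis ranks factor levels, the fragment below computes the larger-the-better signal-to-noise ratio; the responses shown are hypothetical, not the paper's reusability measurements.

    # Taguchi larger-the-better signal-to-noise (S/N) ratio.
    import numpy as np

    def sn_larger_is_better(y):
        """S/N = -10 log10( mean(1 / y_i^2) ); a higher value is better."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y ** 2))

    # Hypothetical reusability scores at two levels of a complexity factor
    low_complexity  = [0.82, 0.78, 0.85]
    high_complexity = [0.55, 0.61, 0.50]
    print(sn_larger_is_better(low_complexity))    # preferred level: larger S/N
    print(sn_larger_is_better(high_complexity))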

Sophorolipids Production by Candida bombicola Using Synthetic Dairy Wastewater

Sophorolipids (SLs) production by the yeast Candida bombicola was studied in batch shake flasks using synthetic dairy wastewater (SDWW) with and without added external carbon and nitrogen sources. A maximum SLs production of 38.76 g/l was observed with the SDWW supplemented with the low-cost substrates sugarcane molasses (50 g/l) and soybean oil (50 g/l). When the SDWW was supplemented with the more costly glucose, yeast extract, urea, and soybean oil, SLs production fell to 29.49 g/l, but with a maximum biomass production of 17.38 g/l and complete utilization of the carbon sources.

Mathematical Approach for Large Deformation Analysis of the Stiffened Coupled Shear Walls

Shear walls are used in most tall buildings to carry the lateral load. When openings for doors or windows must be provided in shear walls, a special type of shear wall called a "coupled shear wall" is used, which in some cases is stiffened by specific beams and is then called a "stiffened coupled shear wall". In this paper, a mathematical method for the geometrically nonlinear analysis of stiffened coupled shear walls is presented. A suitable formulation for determining the critical load of stiffened coupled shear walls under gravity force is then proposed. The governing differential equations for the equilibrium and deformation of stiffened coupled shear walls are obtained by setting up the equilibrium equations and the moment-curvature relationships for each wall. Because of the complexity of the differential equations, the energy method is adopted for their approximate solution.
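
For reference, the standard Euler-Bernoulli moment-curvature relationship underlying such derivations can be written as (symbols generic; a sketch, not the paper's exact equations):

    M_i(x) = E I_i \, \frac{\mathrm{d}^2 w(x)}{\mathrm{d}x^2}, \qquad i = 1, 2,

where w(x) is the lateral deflection and I_i the second moment of area of wall i; in the energy method, the critical gravity load then follows from the stationarity of the total potential energy, \delta \Pi = 0.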

An Approach to Quantum Steganography through Special SSCE Code

Sending encrypted messages frequently draws the attention of third parties, perhaps causing attempts to break and reveal the original messages. Steganography is introduced to hide the existence of the communication by concealing a secret message in an appropriate carrier such as text, image, audio, or video. In quantum steganography, the sender (Alice) embeds her steganographic information into the cover and sends it to the receiver (Bob) over a communication channel. Alice and Bob share an algorithm and hide quantum information in the cover. An eavesdropper (Eve) without access to the algorithm cannot detect the existence of the quantum message. In this paper, a text quantum steganography technique is proposed, based on the use of the indefinite articles (a or an) in conjunction with nonspecific or non-particular nouns in the English language, together with a quantum gate truth table. The authors also introduce a new code representation technique (SSCE - Secret Steganography Code for Embedding) at both ends in order to achieve a high level of security. Before the embedding operation, each character of the secret message is converted to its SSCE value and then embedded into the cover text. Finally, the stego text is formed and transmitted to the receiver side, where the reverse operations are carried out to recover the original information.

Quality Evaluation of Compressed MRI Medical Images for Telemedicine Applications

Medical imaging modalities such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound (US), and X-ray are used to diagnose disease. These modalities provide flexible means of reviewing anatomical cross-sections and physiological state in different parts of the human body. Raw medical images have huge file sizes and large storage requirements, so their size must be reduced to make them practical for telemedicine applications. Image compression is thus a key factor in reducing the bit rate for transmission or storage while maintaining an acceptable reproduction quality, but it naturally raises the question of how much an image can be compressed while still preserving sufficient information for a given clinical application. Many techniques for achieving data compression have been introduced. In this study, images from three different MRI examinations (brain, spine, and knee) were compressed and reconstructed using the wavelet transform. Subjective and objective evaluations were carried out to investigate the clinical information quality of the compressed images. For the objective evaluation, the results show that the PSNR, which indicates the quality of the reconstructed image, ranges from 21.95 to 30.80 dB for brain, 27.25 to 35.75 dB for spine, and 26.93 to 34.93 dB for knee images. For the subjective evaluation test, a compression ratio of 40:1 was acceptable for the brain images, whereas for the spine and knee images 50:1 was acceptable.
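
The objective metric reported is the PSNR; a minimal sketch of its computation is shown below (the wavelet codec itself is not reproduced, and the image here is synthetic).

    # PSNR between an original image and its reconstruction.
    import numpy as np

    def psnr(original, reconstructed, max_value=255.0):
        """PSNR = 10 log10( MAX^2 / MSE ), in dB; higher means better fidelity."""
        diff = np.asarray(original, float) - np.asarray(reconstructed, float)
        mse = np.mean(diff ** 2)
        return float("inf") if mse == 0 else 10.0 * np.log10(max_value ** 2 / mse)

    img = np.random.randint(0, 256, (64, 64))
    noisy = np.clip(img + np.random.normal(0, 5, img.shape), 0, 255)
    print(f"{psnr(img, noisy):.2f} dB")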

An Efficient Biometric Cryptosystem using Autocorrelators

Cryptography provides a secure means of transmitting information over an insecure channel. It authenticates messages based on the key, not on the user, and it requires lengthy keys to encrypt and decrypt the messages being sent and received. Such keys can, however, be guessed or cracked, and maintaining and sharing lengthy random keys for enciphering and deciphering is a critical problem in cryptography systems. A new approach is described for generating a crypto key from a person's iris pattern. In the biometric field, the template created by a biometric algorithm can only be authenticated by the same person. Among biometric templates, iris features efficiently distinguish individuals and produce fewer false positives in large populations. The iris code distribution exhibits low intra-class variability, which allows the cryptosystem to confidently decrypt messages given an exact match of the iris pattern. In the proposed approach, the iris features are extracted using multiresolution wavelets, producing a 135-bit iris code for each subject that is used for encrypting and decrypting messages. Autocorrelators are used to recall the original messages from the partially corrupted data produced by the decryption process. The approach aims to resolve the repudiation and key management problems. Results were analyzed for both a conventional iris cryptography system (CIC) and a non-repudiation iris cryptography system (NRIC); they show that the new approach provides considerably high authentication performance in the enciphering and deciphering processes.
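
A minimal sketch of autocorrelator-based recall in the Hopfield style is given below; the paper's exact associative memory design may differ, and the stored pattern is illustrative.

    # Autocorrelator (Hopfield-style autoassociative) recall sketch.
    import numpy as np

    def train(patterns):
        """Sum of outer products of bipolar (+1/-1) patterns, zero diagonal."""
        W = sum(np.outer(p, p) for p in patterns).astype(float)
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, x, steps=10):
        """Iterate x <- sign(W x) until the state settles on a stored pattern."""
        for _ in range(steps):
            x = np.where(W @ x >= 0, 1, -1)
        return x

    stored = np.array([[1, -1, 1, -1, 1, 1, -1, -1]])
    W = train(stored)
    corrupted = stored[0].copy()
    corrupted[0] *= -1                  # flip one bit of the stored pattern
    print(recall(W, corrupted))         # recovers the original pattern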

Linear Pocket Profile Based Threshold Voltage Model for Sub-100 nm n-MOSFET

This paper presents a threshold voltage model for pocket-implanted sub-100 nm n-MOSFETs that incorporates the drain and substrate bias effects using two linear pocket profiles. Two linear equations are used to model the pocket profiles along the channel at the surface, from the source and drain edges towards the center of the n-MOSFET. The effective doping concentration is then derived and used in the threshold voltage equation, which is obtained by solving Poisson's equation in the depletion region at the surface. Simulated threshold voltages for various gate lengths fit well with experimental data already published in the literature. The simulated results are also compared with those of two other pocket profiles used to derive threshold voltage models of n-MOSFETs. The comparison shows that the linear model has a simple compact form that can be utilized to study and characterize pocket-implanted advanced ULSI devices.
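
A minimal sketch of two linear pocket profiles and an averaged effective doping is given below; the peak and substrate doping levels, pocket extent, and channel length are illustrative assumptions, not the paper's parameters.

    # Two linear pocket profiles along the channel and an averaged
    # effective doping; all numerical values are illustrative.
    import numpy as np

    L, Lp = 80e-7, 20e-7         # channel length and pocket extent, cm (80 nm, 20 nm)
    Npk, Nsub = 1.5e18, 3.0e17   # peak pocket and substrate doping, cm^-3

    x = np.linspace(0.0, L, 801)
    # Linear decay from each edge toward the channel centre, floored at Nsub
    from_source = np.maximum(Npk - (Npk - Nsub) * x / Lp, Nsub)
    from_drain  = np.maximum(Npk - (Npk - Nsub) * (L - x) / Lp, Nsub)
    profile = np.maximum(from_source, from_drain)

    N_eff = np.trapz(profile, x) / L    # channel-averaged effective doping
    print(f"N_eff = {N_eff:.3e} cm^-3")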

Probability Distribution of Rainfall Depth at Hourly Time-Scale

Rainfall data at fine resolution, and knowledge of their characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth at 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics estimated. Three probability distributions, namely the Generalized Pareto, Exponential, and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared tests, were used to evaluate their fitness. The results indicate that the east coast of the Peninsula receives greater rainfall depths than the west coast, although the rainfall frequency is irregular. The goodness-of-fit tests show that all three models fit the rainfall data at the 1% level of significance; however, the Generalized Pareto distribution fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
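
A minimal sketch of the fitting and goodness-of-fit workflow with scipy is shown below; the rainfall array is synthetic, standing in for a station's observed hourly depths, and only the Kolmogorov-Smirnov test is illustrated.

    # Fit three candidate distributions to hourly rainfall depths and
    # compare them with the Kolmogorov-Smirnov test (synthetic data).
    import numpy as np
    from scipy import stats

    rain = stats.gamma.rvs(a=0.6, scale=4.0, size=2000, random_state=1)  # mm

    for name, dist in [("Generalized Pareto", stats.genpareto),
                       ("Exponential", stats.expon),
                       ("Gamma", stats.gamma)]:
        params = dist.fit(rain)             # maximum-likelihood estimates
        ks_stat, p_value = stats.kstest(rain, dist.cdf, args=params)
        print(f"{name}: KS = {ks_stat:.4f}, p = {p_value:.4f}")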