Cloud Computing: Changing Cogitation about Computing

Cloud Computing is a new technology that lets us use the Cloud to meet our computing needs. The Cloud refers to a scalable network of computers that work together, much like the Internet. An important element of Cloud Computing is that we shift the processing, managing, storing and implementing of our data from local machines into the Cloud, which helps us improve efficiency. Because it is a new technology, it has both advantages and disadvantages, which are scrutinized in this article. Some pioneers of this technology are then studied. We conclude that Cloud Computing will play an important role in our future lives.

Extraction of Symbolic Rules from Artificial Neural Networks

Although backpropagation ANNs generally predict better than decision trees for pattern classification problems, they are often regarded as black boxes, i.e., their predictions cannot be explained the way decision-tree predictions can. In many applications, it is desirable to extract knowledge from trained ANNs so that users can gain a better understanding of how the networks solve the problems. A new rule extraction algorithm, called rule extraction from artificial neural networks (REANN), is proposed and implemented to extract symbolic rules from ANNs. A standard three-layer feedforward ANN is the basis of the algorithm, and a four-phase training algorithm is proposed for backpropagation learning. The explicitness of the extracted rules is supported by comparing them to the symbolic rules generated by other methods; the extracted rules are comparable in terms of number of rules, average number of conditions per rule, and predictive accuracy. Extensive experimental studies on several benchmark classification problems, such as breast cancer, iris, diabetes, and season classification, demonstrate the effectiveness of the proposed approach and its good generalization ability.
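The abstract does not spell out REANN's phases; as a generic illustration of the idea only (not the REANN algorithm itself), the following sketch extracts if-then rules from a tiny fixed network by enumerating its discretized inputs. The weights are hypothetical and simply encode a logical AND:

```python
import itertools

# Hypothetical tiny "trained" network: 2 binary inputs, 1 hidden unit,
# 1 output. The weights below are illustrative, not from the paper.
W_HID = [3.0, 3.0]   # input -> hidden weights
B_HID = -4.5         # hidden bias
W_OUT = 2.0          # hidden -> output weight
B_OUT = -1.0         # output bias

def step(x):
    return 1 if x > 0 else 0

def predict(x1, x2):
    h = step(W_HID[0] * x1 + W_HID[1] * x2 + B_HID)
    return step(W_OUT * h + B_OUT)

def extract_rules():
    """Enumerate discretized inputs and emit one symbolic rule per
    positive prediction of the network."""
    rules = []
    for x1, x2 in itertools.product([0, 1], repeat=2):
        if predict(x1, x2) == 1:
            rules.append(f"IF x1={x1} AND x2={x2} THEN class=1")
    return rules

print(extract_rules())  # ['IF x1=1 AND x2=1 THEN class=1']
```

For real-valued inputs and larger networks, rule extraction methods first discretize hidden activations into a small number of clusters to keep this enumeration tractable.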

Study of Features for Hand-printed Recognition

The feature extraction methods used to recognize hand-printed characters play an important role in ICR applications. To achieve a high recognition rate, the choice of a feature that suits the given script is certainly an important task. Even if a new feature has to be designed for a given script, it is essential to know the recognition ability of the existing features for that script. The Devanagari script is used in various Indian languages besides Hindi, the mother tongue of the majority of Indians. This research examines a variety of feature extraction approaches, which have been used in various ICR/OCR applications, in the context of hand-printed Devanagari script. The study is conducted theoretically and experimentally on more than 10 feature extraction methods, evaluated on a hand-printed Devanagari database comprising more than 25000 characters belonging to 43 alphabets. The recognition ability of the features has been evaluated using three classifiers: k-NN, MLP and SVM.
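As a minimal illustration of one classic feature-classifier pairing of the kind surveyed here (zoning densities with k-NN), using toy bitmaps rather than the Devanagari database:

```python
import math
from collections import Counter

def zoning_features(bitmap, zones=2):
    """Split a square binary character image into zones x zones cells
    and use the ink density of each cell as the feature vector."""
    n = len(bitmap)
    step = n // zones
    feats = []
    for zr in range(zones):
        for zc in range(zones):
            cell = [bitmap[r][c]
                    for r in range(zr * step, (zr + 1) * step)
                    for c in range(zc * step, (zc + 1) * step)]
            feats.append(sum(cell) / len(cell))
    return feats

def knn_classify(sample, training, k=1):
    """Classify by majority label among the k nearest feature vectors."""
    fs = zoning_features(sample)
    nearest = sorted(training,
                     key=lambda t: math.dist(fs, zoning_features(t[0])))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Toy 4x4 glyphs standing in for character bitmaps.
vbar = [[0, 1, 0, 0] for _ in range(4)]
hbar = [[0, 0, 0, 0], [1, 1, 1, 1], [1, 1, 1, 1], [0, 0, 0, 0]]
train = [(vbar, "vertical"), (hbar, "horizontal")]
noisy = [[1, 1, 0, 0], [0, 1, 0, 0], [0, 1, 0, 0], [0, 1, 0, 0]]
print(knn_classify(noisy, train))  # vertical
```

Real systems replace the toy zoning vector with the richer features the study compares, and the 1-NN rule with MLP or SVM classifiers.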

The Roles of Natural and Anthropogenic Factors in the Ecological State of Lake Peipsi

In this paper we discuss the problems of the long-term management policy of Lake Peipsi and the roles of natural and anthropogenic factors in the ecological state of the lake. The reduction of pollution during the last 15 years has not produced significant changes in the chemical composition of the water, which implies that natural factors play an essential role in the ecological state of the lake. One of the most important factors affecting the hydrochemical cycles and the ecological state is the hydrological regime, which is clearly expressed in L. Peipsi. The absence of clear interrelations between climate cycles and nutrients suggests that the complex abiotic and biotic interactions taking place in the lake ecosystem play a significant role in the matter-circulation mechanism within the lake.

Removal of Hydrogen Sulphide from Air by Means of Fibrous Ion Exchangers

The removal of hydrogen sulphide is required for reasons of health, odour, safety and corrosivity. The means of removing hydrogen sulphide depend mainly on its concentration and on the kind of medium to be purified. This paper deals with a method of hydrogen sulphide removal from air by its catalytic oxidation to elemental sulphur using the Fe-EDTA complex. The possibility of obtaining fibrous filtering materials able to remove small concentrations of H2S from air is described. The base of these materials is a fibrous ion exchanger with the Fe(III)-EDTA complex immobilized on its functional groups. The trivalent iron complex converts hydrogen sulphide to elemental sulphur; the bivalent iron formed in the reaction is oxidized by atmospheric oxygen, so the trivalent iron complex is continuously regenerated and the overall process can be regarded as pseudocatalytic. The properties of several fibrous catalysts based on ion exchangers of different chemical nature (weak acid, weak base and strong base) are described. It is shown that the main parameters affecting the catalytic oxidation are the concentration of hydrogen sulphide in the air, the relative humidity of the purified air, the process time and the content of the Fe-EDTA complex in the fibres. The data presented show that filtering layers with an anion exchange packing are much more active in the catalytic removal of hydrogen sulphide than cation exchangers and inert materials. In addition to the nature of the fibres, relative air humidity is a critical factor determining the efficiency of the material in purifying air of H2S. The most promising carriers of the Fe-EDTA catalyst for hydrogen sulphide oxidation proved to be Fiban A-6 and Fiban AK-22 fibres.
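The pseudocatalytic cycle described above can be sketched with the commonly cited reaction pair for Fe-EDTA scrubbing (a textbook form, not quoted from the paper):

```latex
% oxidation of sulphide by the Fe(III) complex
\mathrm{H_2S} + 2\,[\mathrm{Fe^{III}(EDTA)}]^- \longrightarrow
    \mathrm{S}^0 + 2\,[\mathrm{Fe^{II}(EDTA)}]^{2-} + 2\,\mathrm{H}^+
% regeneration of the catalyst by atmospheric oxygen
4\,[\mathrm{Fe^{II}(EDTA)}]^{2-} + \mathrm{O_2} + 4\,\mathrm{H}^+ \longrightarrow
    4\,[\mathrm{Fe^{III}(EDTA)}]^- + 2\,\mathrm{H_2O}
```

The net effect consumes only H2S and O2 while the iron complex is recycled, which is why the overall process can be regarded as pseudocatalytic.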

Multi-Agent Coordination Model in Inter-Organizational Workflow: Application to E-government

Inter-organizational Workflow (IOW) is commonly used to support the collaboration of heterogeneous and distributed business processes of different autonomous organizations in order to achieve a common goal. E-government is one application field of IOW. Coordinating the different organizations is the fundamental problem in IOW and remains a major cause of failure in e-government projects. In this paper, we introduce a new coordination model for IOW that improves the collaboration between government administrations and respects the IOW requirements of e-government. For this purpose, we adopt a multi-agent approach, which deals more easily with the characteristics of inter-organizational digital government: distribution, heterogeneity and autonomy. Our model also integrates different technologies to address semantic and technological interoperability. Moreover, it preserves the existing systems of government administrations by offering distributed coordination based on communicating interfaces. This is especially relevant in developing countries, where administrations are not necessarily equipped with workflow systems. The use of our coordination techniques allows an easier and lower-cost migration to an e-government solution. To illustrate the applicability of the proposed model, we present a case study of identity card creation in Tunisia.

Multi-criteria Optimization of Square Beam using Linear Weighted Average Model

Increasing energy absorption is a significant goal in vehicle design: absorbing more energy reduces occupant injury. Limiting the deflection in a side impact, however, decreases specific energy absorption (SEA) and increases the peak load (PL), and a high crash force jeopardizes passenger safety and vehicle integrity. The aim of this paper is to determine suitable dimensions and material for a square beam subjected to side impact, in order to maximize SEA and minimize PL. To achieve this, the geometric parameters of the square beam are optimized using the response surface method (RSM); multi-objective optimization is performed, and the optimum design for different response features is obtained.
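A linear weighted-average scalarization of the two objectives can be sketched as follows; the candidate designs, their SEA/PL values and the weights are hypothetical, not taken from the paper's response surfaces:

```python
def normalize(values):
    """Scale a list of values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def weighted_rank(designs, w_sea=0.6, w_pl=0.4):
    """Linear weighted-average model: reward normalized SEA (maximize),
    penalize normalized PL (minimize); the highest score wins."""
    sea_n = normalize([d["SEA"] for d in designs])
    pl_n = normalize([d["PL"] for d in designs])
    scores = [w_sea * s - w_pl * p for s, p in zip(sea_n, pl_n)]
    best = max(range(len(designs)), key=scores.__getitem__)
    return best, scores

# Hypothetical candidates (wall thickness t in mm, section width w in mm).
designs = [
    {"t": 1.5, "w": 60, "SEA": 12.0, "PL": 95.0},
    {"t": 2.0, "w": 60, "SEA": 15.5, "PL": 120.0},
    {"t": 2.0, "w": 80, "SEA": 14.0, "PL": 105.0},
]
best, scores = weighted_rank(designs)
print(best)  # 1
```

In the paper, the scores would be evaluated on the fitted RSM response surfaces rather than on a fixed candidate list, but the scalarization step is the same.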

Impact of Government Spending on Private Consumption and on the Economy: Case of Thailand

The recent global financial crisis urges governments to play a role in stimulating the economy, since the private sector has little ability to purchase during a recession. A key question is whether increased government spending crowds out private consumption and whether it helps stimulate the economy. If the government spending policy is effective, private consumption is expected to increase and can compensate for the recent extra government expense. In this study, government spending is categorized into government consumption spending and government capital spending. The study first examines consumer consumption in line with the demand function of microeconomic theory. Three categories of private consumption are used: food consumption, non-food consumption, and services consumption. A dynamic Almost Ideal Demand System over these three categories is estimated using a Vector Error Correction Mechanism model. The estimated model indicates substitution effects (negative impacts) of government consumption spending on the budget share of private non-food consumption, and of government capital spending on the budget share of private food consumption. Nevertheless, the result does not necessarily indicate that the negative effects on the budget shares of non-food and food consumption mean a fall in total private consumption. The microeconomic consumer demand analysis clearly indicates changes in the component structure of aggregate expenditure in the economy as a result of the government spending policy. The macroeconomic concept of aggregate demand, comprising consumption, investment, government spending (government consumption spending and government capital spending), exports, and imports, is then used to estimate these relationships, again with a Vector Error Correction Mechanism model.
The macroeconomic study found no effect of government capital spending on either private consumption or GDP growth, while government consumption spending has a negative effect on GDP growth. Therefore, no crowding-out effect of government spending on private consumption is found; however, in the context of Thailand, the spending is ineffective, and even inefficient, since it was found to reduce GDP growth.
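The Almost Ideal Demand System underlying the analysis is commonly estimated in budget-share form (the standard formulation, not necessarily the paper's exact specification):

```latex
w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j
      + \beta_i \ln\!\left(\frac{x}{P}\right)
```

where $w_i$ is the budget share of category $i$ (food, non-food, services), $p_j$ are prices, $x$ is total consumption expenditure and $P$ is a price index; the government spending variables would then enter as demand shifters in the $\alpha_i$ terms, which is how the substitution effects reported above can be read off the estimated coefficients.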

Interactive Model Based on an Extended CPN

The UML modeling of complex distributed systems is often a great challenge due to the large number of parallel real-time components. In this paper, the problems of verifying such systems are discussed. ECPN, an Extended Colored Petri Net, is defined to formally describe the state transitions of components and the interactions among them. The relationship between sequence diagrams and Free Choice Petri Nets is investigated, and Free Choice Petri Net theory helps verify the liveness of sequence diagrams. By converting sequence diagrams to ECPNs and then comparing the behaviors of the sequence-diagram ECPNs with statecharts, the consistency among models is analyzed. Finally, a verification process is demonstrated on an example model.
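The token-game semantics underlying such conversions can be sketched for an ordinary (uncoloured) net; the paper's ECPN adds colours and extensions on top of this basic firing rule:

```python
# Minimal Petri-net "token game": markings are dicts from place to
# token count; a transition is (pre, post) arc-weight dicts.
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Consume tokens from input places, produce tokens in output places."""
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Two components exchanging one message, as in a sequence diagram:
# 'send' then 'receive'.
send = ({"ready": 1}, {"sent": 1})
recv = ({"sent": 1}, {"done": 1})
m0 = {"ready": 1}
m1 = fire(m0, *send) if enabled(m0, send[0]) else m0
m2 = fire(m1, *recv) if enabled(m1, recv[0]) else m1
print(m2)  # {'ready': 0, 'sent': 0, 'done': 1}
```

Liveness checks of the kind enabled by Free Choice Petri Net theory amount to verifying that, from every reachable marking, every transition can eventually become enabled again.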

A Multi-Layer Consistency Protocol for Replica Management in Large-Scale Systems

Large-scale systems such as computational Grids are distributed computing infrastructures that provide globally available network resources. The evolution of information processing in Data Grids is characterized by a strong decentralization of data across several sites, with the objective of ensuring the availability and reliability of the data and thus providing fault tolerance and scalability, which is only possible through replication techniques. Unfortunately, these techniques come at a high cost, because consistency must be maintained between the distributed replicas. Nevertheless, agreeing to live with certain imperfections can improve system performance by improving concurrency. In this paper, we propose a multi-layer protocol combining the pessimistic and optimistic approaches, designed for data-consistency maintenance in large-scale systems. Our approach is based on a hierarchical representation model with three layers and has a dual objective: first, it reduces response times compared to a completely pessimistic approach; second, it improves quality of service compared to an optimistic approach.

Integrating Technology into Mathematics Education: A Case Study from Primary Mathematics Student Teachers

The purpose of this study is to determine primary mathematics student teachers' views on the use of instructional technology tools in the learning process, and to reveal how sample presentations of different mathematical concepts affect those views. This is a qualitative study involving twelve mathematics student teachers from a public university. The data were gathered from two semi-structured interviews. The first took place at the beginning of the study; after it, presentations prepared by the researchers were shown to the participants. These presentations contained animations, Geometer's Sketchpad activities, video clips, spreadsheets, and PowerPoint slides. The second interview took place after the presentations. The interview data were transcribed and read repeatedly through content analysis to explore the major themes. The findings revealed that the students' views changed during this process and that they came to believe instructional technology tools should be used in their classrooms.

The Corporate Integration of Highly Skilled Professionals - A Social Capital Perspective

Notwithstanding the importance of foreign highly skilled professionals for host economies, there is a paucity of research investigating the role of the corporate social context during the integration process. This research aims to address this gap by exploring the role of social capital in the integration of foreign health professionals, using a qualitative research approach. In this pilot study, the hospital sector forms the sample, and interviews were conducted with HR managers, foreign health professionals and external HR consultants. It was found that most of the participating hospitals had not established specific HR practices and had only partly linked the development of organisational social capital with a successful integration process. This research contributes, for example, to the HR literature on the integration of self-initiated expatriates by analysing the role of HRM in generating the organisational social capital needed for a successful integration process.

Use of Caffeine and Human Pharmaceutical Compounds to Identify Sewage Contamination

Fecal coliform bacteria are widely used as indicators of sewage contamination in surface water. However, these microbial techniques have disadvantages, including being time-consuming (18-48 h) and unable to discriminate between human and animal sources of fecal material. It is therefore necessary to seek a more specific indicator of human sanitary waste. In this study, we investigated the feasibility of applying caffeine and human pharmaceutical compounds to identify human-source contamination, and explored the correlation between caffeine and fecal coliform. Surface water samples were collected from upstream, middle-stream and downstream points along Rochor Canal, as well as 8 locations in Marina Bay. The results indicate that caffeine is a suitable chemical tracer in Singapore because of its easy detection (in the range of 0.30-2.0 ng/mL) compared with the other chemicals monitored. The relatively low concentrations of human pharmaceutical compounds (< 0.07 ng/mL) in the Rochor Canal and Marina Bay water samples make them hard to detect and thus difficult to use as chemical tracers, although their presence can help validate sewage contamination. In addition, a high correlation was found between caffeine concentration and fecal coliform density in the Rochor Canal water samples, demonstrating that caffeine is strongly related to human-source contamination.
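The caffeine-coliform correlation reported above is a plain Pearson coefficient; a minimal sketch with illustrative paired measurements (not the Rochor Canal data):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative paired samples (hypothetical numbers):
caffeine = [0.35, 0.80, 1.20, 1.60, 1.95]   # ng/mL
coliform = [150, 420, 700, 950, 1200]       # CFU/100 mL
r = pearson(caffeine, coliform)
print(round(r, 3))
```

A value of r close to 1 supports using the quickly measurable chemical tracer as a proxy for the slow microbial assay.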

Ensembling Classifiers – An Application to Image Data Classification from a Cherenkov Telescope Experiment

Ensemble learning algorithms such as AdaBoost and Bagging have been actively researched and have shown improved classification results on several benchmark data sets, mainly with decision trees as their base classifiers. In this paper, we experiment with applying these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets are from MAGIC, a Cherenkov telescope experiment. The task is to classify gamma signals against the overwhelming hadron and muon signals, a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. WEKA, a machine learning toolkit, was used for the experiments.
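A stdlib-only sketch of bagging with a decision-stump base learner conveys the ensemble idea (the paper itself uses WEKA with stronger base classifiers, and the data below are a toy stand-in for the MAGIC features):

```python
import random
from collections import Counter

def train_stump(data):
    """Pick the single-feature threshold split with the fewest errors.
    data: list of (feature_vector, label) with labels 0/1."""
    best = None  # (errors, feature, threshold, sign)
    for f in range(len(data[0][0])):
        for x, _ in data:
            for sign in (1, -1):
                err = sum((1 if sign * (xi[f] - x[f]) > 0 else 0) != yi
                          for xi, yi in data)
                if best is None or err < best[0]:
                    best = (err, f, x[f], sign)
    _, f, t, sign = best
    return lambda x: 1 if sign * (x[f] - t) > 0 else 0

def bagging(data, n_models=11, seed=0):
    """Train each stump on a bootstrap resample; predict by majority vote."""
    rng = random.Random(seed)
    models = [train_stump([rng.choice(data) for _ in data])
              for _ in range(n_models)]
    def predict(x):
        return Counter(m(x) for m in models).most_common(1)[0][0]
    return predict

# Toy stand-in for the MAGIC task: one feature, gamma = 1, hadron = 0.
data = [([0.1], 0), ([0.2], 0), ([0.3], 0),
        ([0.7], 1), ([0.8], 1), ([0.9], 1)]
predict = bagging(data)
print(predict([0.95]), predict([0.05]))  # 1 0
```

Bootstrap resampling decorrelates the base models, so the majority vote is more stable than any single stump; AdaBoost instead reweights the training set toward previously misclassified examples.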

Application of Smooth Ergodic Hidden Markov Model in Text to Speech Systems

In developing a text-to-speech system, it is well known that the accuracy of the information extracted from a text is crucial for producing high-quality synthesized speech. In this paper, a new scheme for converting text into its equivalent phonetic spelling is introduced and developed. This method is applicable to many text-to-speech systems and has advantages over other methods, which it can also complement in order to improve their performance. The proposed method is probabilistic and is based on a Smooth Ergodic Hidden Markov Model, which can be considered an extension of the HMM. The method is applied to the Persian language, and its accuracy in converting text to phonetics is evaluated through simulations.
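The decoding step of any HMM-based phonetizer is a Viterbi search over phoneme states given the letters. A minimal sketch with a two-state ergodic (fully connected) model follows; the state names and probabilities are illustrative, not the paper's trained Persian model:

```python
import math

states = ["p1", "p2"]                       # hidden states = phonemes
start = {"p1": 0.6, "p2": 0.4}
trans = {"p1": {"p1": 0.7, "p2": 0.3},      # ergodic: every state can
         "p2": {"p1": 0.4, "p2": 0.6}}      # reach every other state
emit = {"p1": {"a": 0.8, "b": 0.2},         # observations = letters
        "p2": {"a": 0.1, "b": 0.9}}

def viterbi(obs):
    """Most likely phoneme sequence for a letter sequence (log domain)."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]])
          for s in states}]
    back = []
    for o in obs[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda q: V[-1][q] + math.log(trans[q][s]))
            row[s] = V[-1][prev] + math.log(trans[prev][s]) \
                     + math.log(emit[s][o])
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi(["a", "b", "b"]))  # ['p1', 'p2', 'p2']
```

In a trained grapheme-to-phoneme system, the transition and emission tables are estimated from aligned text-phonetic corpora rather than fixed by hand.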

Perturbations of EM-Field Meter Readings Caused by a Flat-Roof Security Wall

The wide increase and diffusion of telecommunication technologies have caused a huge spread of electromagnetic sources in most European countries. Since the public is continuously exposed to electromagnetic radiation, the possible health effects have become a focus of public concern. As a result, electromagnetic field monitoring stations, which control field strength in commercial frequency bands, are being placed on the flat roofs of many buildings; however, there is no guidance on where to place them. This paper analyses how the frequency, polarization and angle of incidence of a plane wave impinging on a flat-roof security wall affect the placement of electromagnetic field strength meters.
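The perturbation near a reflecting wall can be illustrated with the textbook superposition of an incident plane wave and its reflection (a simplification of the paper's analysis, which treats a finite, realistic wall):

```latex
E_{tot}(z) \;=\; E_0\left(e^{-jkz} + \Gamma\, e^{+jkz}\right),
\qquad k = \frac{2\pi f}{c},
```

where $\Gamma$ is the wall's reflection coefficient. The magnitude $|E_{tot}|$ oscillates between $E_0(1-|\Gamma|)$ and $E_0(1+|\Gamma|)$ with spatial period $\lambda/2$, so a meter's reading can vary strongly with its distance from the wall, which is precisely why placement guidance matters.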

Proteolytic Degradation of Anchovy (Stolephorus spp.) Proteins by Halophilic Proteinase from Halobacillus sp. SR5-3

The halophilic proteinase showed maximal activity at 50°C and pH 9~10 in 20% NaCl, and was highly stabilized by NaCl. It was able to hydrolyse natural actomyosin (NAM), collagen and anchovy protein. In NAM hydrolysis, the myosin heavy chain was completely digested by the halophilic proteinase, as evidenced by the lowest remaining band intensity, while actin was only partially hydrolysed. The SR5-3 proteinase was also capable of effectively hydrolysing the two major components of collagen, the β- and α-chains. The degree of hydrolysis (DH) of anchovy protein by the halophilic proteinase and by commercial proteinases (Novozyme, Neutrase, chymotrypsin and Flavourzyme) was compared, and the halophilic proteinase showed a greater DH towards anchovy protein than the commercial proteinases. The DH achieved by the halophilic proteinase increased sharply as the enzyme concentration was raised from 0.035 U to 0.105 U. These results suggest that the production of fish sauce of higher quality may be accelerated by adding the halophilic proteinase from this bacterium.
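The abstract does not define DH numerically; a commonly used definition (assumed here, not quoted from the paper) is:

```latex
DH\,(\%) \;=\; \frac{h}{h_{tot}} \times 100
```

where $h$ is the number of peptide bonds cleaved during hydrolysis and $h_{tot}$ is the total number of peptide bonds in the substrate protein.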

Polymerisation Shrinkage of Light-Cured Hydroxyapatite (HA)-Reinforced Dental Composites

Dental composites are preferred as filling materials due to their esthetic appearance. Nevertheless, one of the major problems during the application of dental composites is the shape change known as "polymerisation shrinkage", which affects the clinical success of the restoration during photo-polymerisation. Polymerisation shrinkage arises basically from the transformation of monomers into the polymer that composes the organic matrix phase. This study sought to detect and evaluate the structural polymerisation shrinkage of prepared dental composites, in order to assess the effects of the various fillers included in hydroxyapatite (HA)-reinforced dental composites and hence to find a means of modifying the properties of composites prepared with defined parameters. The shrinkage values of the experimental dental composites decreased with increasing filler content, and the composition of the different fillers used affected the shrinkage of the prepared composite systems.

Software Process Improvement: An Organizational Change that Needs to be Managed and Motivated

As seen in the literature, about 70% of improvement initiatives fail, and a significant number do not even get started. This paper analyses the problem of failing Software Process Improvement (SPI) initiatives and proposes good practices, supported by motivational tools, that can help minimize failures. It elaborates on the hypothesis that human factors are poorly addressed by deployers, especially because implementation guides usually emphasize only technical factors. This research was conducted with SPI deployers and analyses 32 SPI initiatives. The results indicate that although human factors are not commonly highlighted in guidelines, successful initiatives usually address them implicitly. This research shows that practices based on human factors indeed play a crucial role in successful implementations of SPI, proposes change management as a theoretical framework for introducing those practices in the SPI context, and suggests some motivational tools, based on SPI deployers' experience, to support it.

Coverage and Connectivity Problem in Sensor Networks

In over-deployed sensor networks, one approach to conserving energy is to keep only a small subset of sensors active at any instant. For coverage problems, the monitored area is modelled as a set of points that require sensing, called demand points, and the coverage area of a node is a circle of radius R, where R is the sensing range; if the distance between a demand point and a sensor node is less than R, the node is able to cover that point. We consider a wireless sensor network consisting of a set of randomly deployed sensors. A point in the monitored area is covered if it is within the sensing range of a sensor. In some applications, when the network is sufficiently dense, area coverage can be approximated by guaranteeing point coverage. In this case, the locations of the wireless devices can be used to represent the whole area, and the working sensors are required to cover all the other sensors. We also introduce a hybrid algorithm and discuss challenges related to coverage in sensor networks.
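The point-coverage test described above reduces to a distance check against the sensing range R; a minimal sketch with hypothetical coordinates:

```python
import math

def covers(sensor, point, R):
    """A sensor covers a demand point if it lies within sensing range R."""
    return math.dist(sensor, point) < R

def covered_points(sensors, demand_points, R):
    """Demand points covered by at least one sensor."""
    return [p for p in demand_points
            if any(covers(s, p, R) for s in sensors)]

# Hypothetical deployment: two sensors, three demand points.
sensors = [(0.0, 0.0), (4.0, 0.0)]
demand = [(1.0, 1.0), (3.5, 0.5), (10.0, 10.0)]
print(covered_points(sensors, demand, R=2.0))  # [(1.0, 1.0), (3.5, 0.5)]
```

Energy-conserving schemes then search for the smallest subset of sensors whose `covered_points` result still equals the full demand set.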