Describing Learning Features of Reusable Resources: A Proposal

One of the main advantages of the LO paradigm is that it makes good-quality, shareable learning material available through the Web. An effective retrieval process requires a formal description of the resources (metadata) that closely fits the user's search criteria; in spite of the huge international efforts in this field, educational metadata schemata often fail to fulfil this requirement. This work aims to improve the situation by defining a metadata model that captures specific didactic features of shareable learning resources. It classifies LOs into “teacher-oriented” and “student-oriented” categories, in order to describe the role an LO is to play when it is integrated into the educational process. This article describes the model and a first experimental validation carried out in a controlled environment.

Video Data Mining based on Information Fusion for Tamper Detection

In this paper, we propose novel algorithmic models, based on information fusion and feature transformation in a cross-modal subspace, for different types of residue features extracted from several intra-frame and inter-frame pixel sub-blocks of video sequences, in order to detect digital video tampering or forgery. An evaluation of the proposed residue features (the noise residue features and the quantization features), their transformation in the cross-modal subspace, and their multimodal fusion on an emulated copy-move tamper scenario shows a significant improvement in tamper detection accuracy compared with single-mode features without transformation in the cross-modal subspace.
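As an illustration only (the paper's exact pipeline is not reproduced here), the following Python sketch shows one common way to realise such a scheme: two residue feature sets are projected into a shared subspace with canonical correlation analysis, standing in here for the cross-modal transformation, then fused by concatenation and classified. The data, dimensions, and the choice of CCA and logistic regression are all assumptions for illustration.

import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import LogisticRegression

# Hypothetical per-block feature matrices: one row per pixel sub-block (placeholder data).
rng = np.random.default_rng(0)
noise_residue = rng.normal(size=(200, 16))   # noise-residue features
quant_residue = rng.normal(size=(200, 16))   # quantization features
labels = rng.integers(0, 2, size=200)        # 1 = tampered block, 0 = authentic

# Cross-modal transformation (here assumed to be CCA) projects both feature sets
# into a shared subspace.
cca = CCA(n_components=8)
noise_c, quant_c = cca.fit(noise_residue, quant_residue).transform(noise_residue, quant_residue)

# Multimodal fusion: concatenate the projected features and train a classifier.
fused = np.hstack([noise_c, quant_c])
clf = LogisticRegression(max_iter=1000).fit(fused, labels)
print("training accuracy:", clf.score(fused, labels))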

Effects of the Second Entrant in the GSM Telecommunication Market in the MENA Region

For the incumbent (first) operator, it is very important to understand how to react when a second operator enters the market. In this paper, prepared as a preliminary study of the GSM market in Iran, we have studied five MENA markets selected for their similarity. The paper aims at analyzing the impact of the second entrant in the selected markets on key marketing performance indicators (KPIs) such as market share (by operator), prepaid share, minutes of use (MoU), price, and average revenue per user (ARPU), each for the total market.

Forecasting Fraudulent Financial Statements using Data Mining

This paper explores the effectiveness of machine learning techniques in detecting firms that issue fraudulent financial statements (FFS) and deals with the identification of factors associated with FFS. To this end, a number of experiments have been conducted using representative learning algorithms, which were trained on a data set of 164 fraud and non-fraud Greek firms from the period 2001-2002. The decision of which particular method to choose is a complicated problem. A good alternative to choosing only one method is to create a hybrid forecasting system incorporating a number of possible solution methods as components (an ensemble of classifiers). For this purpose, we have implemented a hybrid decision support system that combines the representative algorithms using a stacking variant methodology and achieves better performance than any examined simple or ensemble method. To sum up, this study indicates that the investigation of financial information can be used in the identification of FFS and underlines the importance of financial ratios.
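The idea of stacking several base classifiers under a meta-learner can be sketched as follows; this is a minimal illustration with placeholder data and base learners chosen for the example, not the paper's actual algorithms or financial-ratio features.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data standing in for financial ratios of fraud / non-fraud firms.
X, y = make_classification(n_samples=164, n_features=10, random_state=42)

# Several base learners combined by a meta-learner (stacking).
stack = StackingClassifier(
    estimators=[
        ("tree", DecisionTreeClassifier(random_state=42)),
        ("nb", GaussianNB()),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
print("10-fold CV accuracy:", cross_val_score(stack, X, y, cv=10).mean())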

A New Edit Distance Method for Finding Similarity in DNA Sequences

The P-Bigram method is a string comparison method based on an internal two-character similarity measure. The edit distance between two strings is the minimal number of elementary editing operations required to transform one string into the other; the elementary editing operations are deletion, insertion, and substitution of characters. In this paper, we apply the P-Bigram method to solve the similarity problem for DNA sequences. The method provides an efficient algorithm that locates the minimal sequence of operations between two strings. We have implemented the algorithm and found that our program computes the minimal distance between strings. We develop the P-Bigram edit distance, show how it yields a similarity measure, and implement it using dynamic programming. The performance of the proposed approach is evaluated using the number of edit operations and the percentage similarity measure.
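Since the P-Bigram specifics are not detailed in the abstract, the sketch below shows only the classical dynamic-programming edit distance that the method builds on, together with a percentage-similarity measure derived from it; the example strings are placeholders.

def edit_distance(a: str, b: str) -> int:
    """Classical edit distance (insert, delete, substitute) via dynamic programming."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution / match
    return dp[m][n]

def percent_similarity(a: str, b: str) -> float:
    """Percentage similarity derived from the edit distance."""
    d = edit_distance(a, b)
    return 100.0 * (1.0 - d / max(len(a), len(b), 1))

print(edit_distance("ACGTACGT", "ACGTTCGA"))      # -> 2
print(percent_similarity("ACGTACGT", "ACGTTCGA")) # -> 75.0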

Time Comparative Simulator for Distributed Process Scheduling Algorithms

In any distributed system, process scheduling plays a vital role in determining the efficiency of the system. Process scheduling algorithms are used to ensure that the components of the system maximize their utilization and complete all assigned processes within a specified period of time. This paper focuses on the development of a comparative simulator for distributed process scheduling algorithms. The objectives of the work include the development of the comparative simulator as well as a comparative study of three distributed process scheduling algorithms: sender-initiated, receiver-initiated, and hybrid sender-receiver-initiated algorithms. The comparative study was done based on the Average Waiting Time (AWT) and Average Turnaround Time (ATT) of the processes involved. The simulation results show that the performance of the algorithms depends on the number of nodes in the system.
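For reference, the two metrics can be computed as sketched below from simulated arrival, burst, and completion times; the process values are placeholders and the simulator itself is not reproduced here.

from dataclasses import dataclass

@dataclass
class Proc:
    arrival: float     # time the process arrives in the system
    burst: float       # CPU time required
    completion: float  # time the process finishes (produced by the simulator)

def average_times(procs):
    """Return (Average Turnaround Time, Average Waiting Time) for a set of processes."""
    turnaround = [p.completion - p.arrival for p in procs]
    waiting = [t - p.burst for t, p in zip(turnaround, procs)]
    n = len(procs)
    return sum(turnaround) / n, sum(waiting) / n

# Placeholder processes; completion times would come from the scheduling simulation.
att, awt = average_times([Proc(0, 5, 9), Proc(1, 3, 12), Proc(2, 4, 7)])
print(f"ATT={att:.2f}, AWT={awt:.2f}")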

Mounting Time Reduction using Content-Based Block Management for NAND Flash File System

Flash memory has many advantages such as low power consumption, strong shock resistance, fast I/O, and non-volatility, and it is increasingly used in mobile storage devices. YAFFS, one of the NAND flash file systems, is widely used in embedded devices. However, the existing YAFFS takes a long time to mount the file system because it scans the whole spare area of every page of the NAND flash memory. In order to solve this problem, we propose a new content-based flash file system using a mounting time reduction technique. The proposed method scans only partial spare areas of some special pages by using content-based block management. The experimental results show that the proposed method reduces the average mounting time by 87.2% compared with JFFS2 and 69.9% compared with YAFFS.

An Integrative Bayesian Approach to Supporting the Prediction of Protein-Protein Interactions: A Case Study in Human Heart Failure

Recent years have seen a growing trend towards the integration of multiple information sources to support the large-scale prediction of protein-protein interaction (PPI) networks in model organisms. Despite advances in computational approaches, the combination of multiple “omic” datasets representing the same type of data, e.g. different gene expression datasets, has not been rigorously studied. Furthermore, there is a need to further investigate the inference capability of powerful approaches, such as fully connected Bayesian networks, in the context of PPI network prediction. This paper addresses these limitations by proposing a Bayesian approach that integrates multiple datasets, some of which encode the same type of “omic” data, to support the identification of PPI networks. The case study reported involved the combination of three gene expression datasets relevant to human heart failure (HF). In comparison with two traditional methods, the naive Bayesian and maximum likelihood ratio approaches, the proposed technique can accurately identify known PPIs and can be applied to infer potentially novel interactions.
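The naive Bayesian baseline mentioned above combines per-dataset evidence for a protein pair by multiplying likelihood ratios; the sketch below shows that standard combination only (the paper's fully connected Bayesian model goes beyond this). The likelihood-ratio values and the prior are hypothetical.

import numpy as np

def combined_likelihood_ratio(lrs):
    """Naive-Bayes combination: multiply per-dataset likelihood ratios."""
    return float(np.prod(lrs))

def posterior_probability(prior, lrs):
    """Posterior P(interaction | evidence) from the prior and per-dataset likelihood ratios."""
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * combined_likelihood_ratio(lrs)
    return post_odds / (1.0 + post_odds)

# Hypothetical likelihood ratios from three gene expression datasets for one protein pair.
lrs = [3.2, 1.8, 0.9]
print(posterior_probability(prior=0.01, lrs=lrs))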

Optimization Based Obstacle Avoidance

Based on a non-linear single-track model that describes the vehicle dynamics, an optimal path planning strategy is developed. Real-time optimization is used to generate reference control values that guide the vehicle along a calculated lane which is optimal with respect to different objectives such as energy consumption, run time, safety, or comfort. A strict mathematical formulation of autonomous driving allows decisions to be taken in undefined situations such as lane changes or obstacle avoidance. Based on the position of the vehicle, the lane situation, and the obstacle position, the optimization problem is reformulated in real time to avoid the obstacle and any collision.
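As a minimal sketch only (assuming a waypoint parameterisation, a circular obstacle, and path length as the objective, none of which are taken from the paper), an optimization-based avoidance problem of this kind can be posed as follows:

import numpy as np
from scipy.optimize import minimize

# Hypothetical setting: plan N intermediate waypoints from start to goal,
# minimising path length while staying at least r_safe away from a circular obstacle.
start, goal = np.array([0.0, 0.0]), np.array([10.0, 0.0])
obstacle, r_safe = np.array([5.0, 0.0]), 1.5
N = 8

def unpack(z):
    pts = z.reshape(N, 2)
    return np.vstack([start, pts, goal])

def path_length(z):
    path = unpack(z)
    return np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))

def clearance(z):
    # >= 0 when every intermediate waypoint keeps the safety distance from the obstacle
    return np.linalg.norm(unpack(z)[1:-1] - obstacle, axis=1) - r_safe

pts0 = np.linspace(start, goal, N + 2)[1:-1]
pts0[:, 1] += 0.5          # nudge the straight-line guess off the obstacle's axis
z0 = pts0.ravel()

res = minimize(path_length, z0, method="SLSQP",
               constraints=[{"type": "ineq", "fun": clearance}])
print(res.success, path_length(res.x))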

Signature Identification Scheme Based on Iterated Function Systems

Since 1984, many schemes have been proposed for digital signature protocols, among them schemes based on the discrete logarithm and factorization problems. Recently, a new identification scheme based on iterated function systems (IFS) was proposed and shown to be more efficient. In this study, the proposed identification scheme is transformed into a digital signature scheme by using a one-way hash function; it is a generalization of the GQ signature scheme. The attractor of the IFS is used to obtain the public key from the private one, and in the encryption and decryption of a hash value. Our aim is to provide techniques and tools which may be useful towards developing cryptographic protocols. Comparisons of the proposed scheme with a fractal digital signature scheme in the RSA setting, as well as with the conventional Guillou-Quisquater and RSA signature schemes, show that the proposed scheme is efficient and has high performance.

Industrial Development, Environment and Occupational Problems: The Case of Iran

There are three distinct stages in the evolution of economic thought. In the first stage, the major concern was to accelerate economic growth and increase the availability of material goods, especially in developing economies with very low living standards, because poverty eradication was equated with faster economic growth. In the second stage, economists made a distinction between growth and development: development was seen as going beyond economic growth, bringing certain changes in the structure of the economy with a more equitable distribution of the benefits of growth, with growth becoming automatic and sustained. The third stage has now been reached; our concern is now with “sustainable development”, that is, development not only for the present but also for the future. Thus the focus has changed from “sustained growth” to “sustainable development”, which brings to the fore the long-term relationship between ecology and economic development. Since its creation in 1972, UNEP has worked for development without destruction, for environmentally sound and sustainable development. It was realised that the environment cannot be viewed in a vacuum; it is not separate from development, nor in competition with it. UNEP suggested the integration of the environment with development, whereby ecological factors enter development planning, socio-economic policies, cost-benefit analysis, trade, technology transfer, waste management, education, and other specific areas. Industrialisation has contributed to the economic growth of several countries; it has improved the standard of living of their people and provided benefits to society. In the process it has also created great environmental problems such as climate change, forest destruction and denudation, soil erosion, and desertification. On the other hand, industry has provided jobs and improved the prospects of wealth for industrialists, while working-class communities have had to put up with high levels of pollution in order to keep their jobs and protect their incomes. The environmental problem has many roots; they may lie in the political, economic, cultural, and technological conditions of modern society, but experts concede that industrial growth lies somewhere close to the heart of the matter. Therefore, the objective of this paper is not to document all roots of the environmental crisis but rather to discuss the effects of industrial growth and development. We have come to the conclusion that although public intervention is often unnecessary to ensure that perfectly competitive markets function in society's best interests, such intervention is necessary when firms or consumers pollute.

Investigation of Corn and Soybean Intercropping Advantages in Competition with Redroot Pigweed and Jimsonweed

The spatial variation in plant species associated with intercropping is intended to reduce resource competition between species and increase yield potential. A field experiment on corn (Zea mays L.) and soybean (Glycine max L.) intercropping was carried out as a replacement series with four weed treatments: weed-free, infestation of redroot pigweed, infestation of jimsonweed, and simultaneous infestation of redroot pigweed and jimsonweed, in Karaj, Iran, during the 2007 growing season. The experimental design was a randomized complete block in a factorial arrangement with three replications. Significant (P≤0.05) differences in yield were observed under intercropping. Corn yield was higher in intercropping, but soybean yield was significantly reduced by corn when intercropped. However, total productivity and land use efficiency were high under the intercropping system, even under infestation by either weed species. The aggressivity of corn relative to soybean revealed the greater competitive ability of corn. A land equivalent ratio (LER) greater than 1 in all treatments indicated an intercropping advantage, and the LER was highest for the 50:50 (corn/soybean) ratio under weed-free conditions. These findings suggest that intercropping corn and soybean increases total productivity per unit area and improves land use efficiency. Considering the experimental findings, corn-soybean intercropping (50:50) may be recommended for yield advantage, more efficient utilization of resources, and weed suppression as a biological control.
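For reference, the land equivalent ratio reported above follows its standard definition for a two-crop mixture (subscripts c and s denote corn and soybean; the symbols are generic, not necessarily the paper's notation):

\[
\mathrm{LER} \;=\; \frac{Y_{c,\mathrm{intercrop}}}{Y_{c,\mathrm{sole}}} \;+\; \frac{Y_{s,\mathrm{intercrop}}}{Y_{s,\mathrm{sole}}},
\]

where an LER greater than 1 indicates that more land would be required under sole cropping to obtain the same yields, i.e. an intercropping advantage.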

Dependence of Virtual Subject Reflection on the Features of Coping Behavior of Students

In the process of globalization, when a struggle for the minds and values of people is taking place, the impact of virtual space can cause unexpected effects and consequences in young people's adjustment to this world. Its special significance lies in its unconscious influence on the underlying processes of meaning-making; the values it promotes are therefore much more effective and affect both personal characteristics and the peculiarities of the adjustment process. The challenge, accordingly, is to identify the factors that influence how virtual subjects are reflected and to measure their impact on the personal characteristics of students.

Mathematical Determination of Tall Square Building Height under Peak Wind Loads

The present study concentrates on solving the along-wind oscillation problem of a tall square building from first principles and the across-wind oscillation problem of the same building from empirical relations obtained by experiments. The criterion for human comfort under the worst condition at the top floor of the building is considered, and a limiting height for a building of a given cross-section is predicted. Numerical integrations are carried out as and when required. The results show that across-wind oscillations are more severe than along-wind oscillations. The comfort criterion is combined with the across-wind oscillation results to determine the maximum allowable height of a building with a given square cross-section.

The Development of Decision Support Systems for Waste Management: A Review

Most Decision Support Systems (DSS) constructed for waste management (WM) are not widely marketed and lack practical application. This is due to the number of variables and the complexity of the mathematical models, which include the assumptions and constraints required in decision making. The approach taken by many researchers in DSS modelling is to isolate a few key factors that have a significant influence on the DSS. This segmented approach does not provide a thorough understanding of the complex relationships among the many elements involved. The various elements in constructing the DSS must be integrated and optimized in order to produce a viable model that is marketable and has practical application. The DSS model used to assist decision makers should be integrated with GIS, able to give robust predictions despite the inherent uncertainties of waste generation and the plethora of waste characteristics, and able to give an optimal allocation of the waste stream to recycling, incineration, landfill, and composting.

Estimation of the Minimum Floor Length Downstream of Regulators under Different Flow Scenarios

The correct design of regulator structures requires complete prediction of the ultimate dimensions of the scour hole profile formed downstream of the solid apron. Scour downstream of regulators is studied either on solid aprons, by means of the velocity distribution, or on movable beds, by studying the topography of the scour hole formed downstream. In this paper, a new technique was developed to study the scour hole downstream of regulators on movable beds. The study was divided into two categories: the first is to find the sum of the length of the rigid apron behind the gates and the length of the scour hole formed downstream of it, while the second is to find the minimum length of rigid apron behind the gates that prevents erosion downstream of it. The study covers free and submerged hydraulic jump conditions in both symmetrical and asymmetrical under-gated operation. From the comparison between the two categories, we found that the minimum length of rigid apron needed to prevent scour (Ls) is greater than the sum of the length of the rigid apron and that of the scour hole formed behind it (L+Xs). Furthermore, the scour hole dimensions in the case of a submerged hydraulic jump are always greater than in the free case, and the scour hole dimensions in asymmetrical operation are greater than in symmetrical operation.

Morpho-Phonological Modelling in Natural Language Processing

In this paper we propose a computational model for the representation and processing of morpho-phonological phenomena in a natural language such as Modern Greek. We aim at a unified treatment of inflection, compounding, and word-internal phonological changes, in a model that is used for both analysis and generation. After discussing certain difficulties caused by well-known finite-state approaches, such as Koskenniemi's two-level model [7], when applied to a computational treatment of compounding, we argue that a morphology-based model provides a more adequate account of word-internal phenomena. Contrary to the finite-state approaches, which cannot handle hierarchical word constituency in a satisfactory way, we propose a unification-based word grammar as the nucleus of our strategy, which takes into consideration word representations based on affixation and [stem stem] or [stem word] compounds. In our formalism, feature-passing operations are formulated with the use of the unification device, and phonological rules modelling the correspondence between lexical and surface forms apply at morpheme boundaries. Examples from Modern Greek illustrate our approach: morpheme structures, stress, and morphologically conditioned phoneme changes are analyzed and generated in a principled way.
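As a minimal illustration of the unification device itself (a toy sketch, not the paper's grammar or feature inventory), feature structures can be represented as nested dictionaries and unified recursively:

def unify(f1, f2):
    """Unify two feature structures represented as nested dicts.
    Returns the merged structure, or None on a feature clash."""
    if not isinstance(f1, dict) or not isinstance(f2, dict):
        return f1 if f1 == f2 else None
    result = dict(f1)
    for key, val in f2.items():
        if key in result:
            merged = unify(result[key], val)
            if merged is None:
                return None          # conflicting values: unification fails
            result[key] = merged
        else:
            result[key] = val
    return result

# Toy example: a stem and an affix agree on gender, and the affix contributes
# number and case features (categories and values are illustrative only).
stem  = {"cat": "noun", "agr": {"gender": "neut"}}
affix = {"cat": "noun", "agr": {"gender": "neut", "number": "sg", "case": "nom"}}
print(unify(stem, affix))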

Dynamic Slope Scaling Procedure for Stochastic Integer Programming Problem

Mathematical programming has been applied to various problems. For many practical problems, the assumption that the parameters involved are known, deterministic data is often unjustified. In such cases, the data contain uncertainty and are thus represented as random variables, since they describe information about the future. Decision-making under uncertainty involves potential risk. Stochastic programming is a commonly used method for optimization under uncertainty, and a stochastic programming problem with recourse is referred to as a two-stage stochastic problem. In this study, we consider a stochastic programming problem with simple integer recourse, in which the value of the recourse variable is restricted to a multiple of a nonnegative integer. An algorithm based on a dynamic slope scaling procedure for solving this problem is developed by using a property of the expected recourse function. Numerical experiments demonstrate that the proposed algorithm is quite efficient. The stochastic programming model defined in this paper is useful for a variety of design and operational problems.
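For orientation, a two-stage stochastic program with simple integer recourse can be written in standard textbook notation as follows; the symbols are generic and not necessarily those of the paper:

\[
\min_{x}\; c^{\top}x \;+\; \mathbb{E}_{\xi}\!\left[\,Q(x,\xi)\,\right]
\qquad \text{s.t.}\;\; Ax=b,\; x\ge 0,
\]
\[
Q(x,\xi) \;=\; \min_{y^{+},\,y^{-}}\;\Bigl\{\, q^{+\top}y^{+} + q^{-\top}y^{-} \;:\;
y^{+}\ge \xi - Tx,\;\; y^{-}\ge Tx-\xi,\;\; y^{+},y^{-}\in\mathbb{Z}_{+}^{m} \Bigr\},
\]

so that the optimal integer recourse rounds the shortage and surplus up, \(y^{+}=\lceil(\xi-Tx)^{+}\rceil\) and \(y^{-}=\lceil(Tx-\xi)^{+}\rceil\); the dynamic slope scaling procedure exploits a property of the expected value of this recourse function.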

Study on Position Polarity Compensation for Permanent Magnet Synchronous Motor Based on High Frequency Signal Injection

The application of high-frequency signal injection as a speed and position observer in PMSM drives has been a research focus. At present, the precision of this method is nearly as good as that of a ten-bit encoder, but the estimation of position polarity remains an open question. Based on high-frequency signal injection, this paper presents a method to compensate the position polarity for a permanent magnet synchronous motor (PMSM). Experiments were performed to test the effectiveness of the proposed algorithm, and the results demonstrate its good performance.

Thermo-mechanical Behavior of Pressure Tube of Indian PHWR at 20 bar Pressure

In a nuclear reactor, a Loss of Coolant Accident (LOCA) covers a wide range of postulated damage to or rupture of pipes in the heat transport piping system. In the case of a LOCA with or without failure of the emergency core cooling system in a Pressurised Heavy Water Reactor, the Pressure Tube (PT) temperature could rise significantly due to fuel heat-up and a gross mismatch between heat generation and heat removal in the affected channel. The extent and nature of the deformation is important from the reactor safety point of view. Experimental set-ups have been designed and fabricated to simulate ballooning (radial deformation) of the PT for 220 MWe IPHWRs. Experiments have been conducted on voided PTs, first with the Calandria Tube (CT) covered by ceramic fibers and then with the CT submerged in water. In both experiments, it is observed that ballooning initiates at a temperature of around 665 °C and that complete contact between the PT and the CT occurs at approximately 700 °C. The strain rate is found to be 0.116% per second. The structural integrity of the PT is retained (no breach) in all experiments. The PT heat-up is found to be arrested after the contact between the PT and CT, thus establishing that the moderator acts as an efficient heat sink for IPHWRs.