Estimating the Production Potential of Grid-Connected Wind Turbine Types Using Random-Number Simulation

In today's power systems, wind energy generation has become very important. The electrical energy produced by wind turbines at a site depends on several factors, such as the wind speed and the site's wind profile, and in particular on the turbine's cut-in speed, rated speed, and cut-out speed. On the other hand, many different types of turbines are available on the market. Therefore, selecting a turbine whose capacity meets consumer demand while operating at high efficiency is important and necessary. In this context, calculating the available wind power is very important for optimizing the overall network, for system operation, and for determining wind power parameters. In this article, the Monte Carlo method is used to calculate the output of a wind power plant connected to the national grid in the Manjil wind region, to select the best type of turbine, and to determine a power delivery profile appropriate to the network. Wind speed data from the Manjil site were collected at one-minute resolution over a full year. The necessary simulations, based on the random-number simulation method with repetition, were performed in MATLAB and Excel.
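
As a rough illustration of the random-number approach, the sketch below draws minute-resolution wind speeds from a Weibull distribution and pushes them through a generic piecewise power curve to estimate annual energy and capacity factor. The Weibull parameters, cut-in/rated/cut-out speeds, and rated power are hypothetical placeholders, not the Manjil measurements or any particular commercial turbine; comparing turbine types would repeat this for each candidate power curve.

```python
import numpy as np

# Minimal Monte Carlo sketch of annual wind-turbine energy estimation.
# All site and turbine parameters below are hypothetical placeholders.
rng = np.random.default_rng(seed=0)

# Hypothetical site model: Weibull-distributed wind speed (m/s)
shape, scale = 2.0, 7.5
minutes_per_year = 365 * 24 * 60
wind = scale * rng.weibull(shape, size=minutes_per_year)

# Hypothetical turbine power curve parameters
v_cut_in, v_rated, v_cut_out = 3.0, 12.0, 25.0   # m/s
p_rated = 660.0                                   # kW

def turbine_power(v):
    """Piecewise power curve: zero below cut-in and above cut-out,
    cubic ramp between cut-in and rated, flat at rated power."""
    p = np.zeros_like(v)
    ramp = (v >= v_cut_in) & (v < v_rated)
    p[ramp] = p_rated * (v[ramp]**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)
    p[(v >= v_rated) & (v <= v_cut_out)] = p_rated
    return p

power_kw = turbine_power(wind)
energy_mwh = power_kw.sum() / 60.0 / 1000.0       # minute samples -> MWh
capacity_factor = power_kw.mean() / p_rated

print(f"Estimated annual energy: {energy_mwh:.0f} MWh")
print(f"Capacity factor: {capacity_factor:.2%}")
```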

Image Similarity: A Genetic Algorithm Based Approach

The paper proposes a genetic algorithm based approach to computing region-based image similarity. An image is represented by a set of segmented regions reflecting its color and texture properties and is thus associated with a family of image features corresponding to those regions. The resemblance of two images is then defined as the overall similarity between two families of image features and quantified by a similarity measure that integrates the properties of all the regions in the images. A genetic algorithm is applied to determine the most plausible matching. The performance of the proposed method is illustrated using examples from a database of general-purpose images and is shown to produce good results.
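
A minimal sketch of the matching idea, under the assumption that each image is reduced to a small set of region feature vectors and that similarity is the negative Euclidean distance between matched regions; the paper's actual color/texture descriptors and similarity measure are not reproduced here.

```python
import numpy as np

# Toy genetic algorithm matching regions of image A to regions of image B so
# that the total feature similarity is maximized. Region features are random
# placeholders standing in for color/texture descriptors.
rng = np.random.default_rng(1)
regions_a = rng.random((5, 8))   # 5 regions, 8-dim feature vectors (hypothetical)
regions_b = rng.random((7, 8))   # 7 regions in the second image

def fitness(assign):
    """Sum of similarities (negative Euclidean distance) of matched regions."""
    return -sum(np.linalg.norm(regions_a[i] - regions_b[j])
                for i, j in enumerate(assign))

def random_individual():
    return rng.integers(0, len(regions_b), size=len(regions_a))

def mutate(ind, rate=0.2):
    out = ind.copy()
    for k in range(len(out)):
        if rng.random() < rate:
            out[k] = rng.integers(0, len(regions_b))
    return out

def crossover(a, b):
    point = rng.integers(1, len(a))
    return np.concatenate([a[:point], b[point:]])

# Simple generational GA: keep the 10 best, refill with mutated offspring
population = [random_individual() for _ in range(40)]
for _ in range(100):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(parents[rng.integers(10)], parents[rng.integers(10)]))
                for _ in range(30)]
    population = parents + children

best = max(population, key=fitness)
print("Best matching (region of A -> region of B):", best.tolist())
print("Overall similarity score:", float(fitness(best)))
```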

Monitoring Patents Using Statistical Process Control

Statistical process control (SPC) is one of the most powerful tools developed to support effective quality control; it involves collecting, organizing, and interpreting data during production. This article aims to show how industries can use SPC to control and continuously improve product quality by monitoring production, detecting deviations in the parameters that represent the process, and thereby reducing the amount of off-specification product and, in turn, production costs. The study conducted a technological forecast in order to characterize the research being done on SPC. The survey was conducted in the Espacenet, WIPO, and National Institute of Industrial Property (INPI) databases. The United States is among the largest depositors, together with filings via the PCT, and the classification section appearing most frequently was section F.
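
For readers unfamiliar with SPC itself, the following sketch shows the basic monitoring step the article refers to: computing Shewhart X-bar control limits and flagging out-of-control subgroups. The measurements are made-up placeholders, not data from the patent survey.

```python
import numpy as np

# Minimal sketch of Shewhart X-bar control limits for subgrouped data.
rng = np.random.default_rng(7)
subgroups = rng.normal(loc=10.0, scale=0.2, size=(25, 5))  # 25 subgroups of n=5

xbar = subgroups.mean(axis=1)            # subgroup means
rbar = np.ptp(subgroups, axis=1).mean()  # average subgroup range

A2 = 0.577                               # standard X-bar chart constant for n=5
center = xbar.mean()
ucl = center + A2 * rbar
lcl = center - A2 * rbar

out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"CL={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
print("Out-of-control subgroups:", out_of_control.tolist())
```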

The Effects of Perceived Organizational Support, Abusive Supervision, and Exchange Ideology on Employees' Task Performance

Employees' task performance has been recognized as a core contributor to overall organizational effectiveness. Hence, verifying the determinants of task performance is one of the most important research issues. This study tests the influence of perceived organizational support, abusive supervision, and exchange ideology on employees' task performance. We examined our hypotheses by collecting self-reported data from 413 Korean employees in different organizations. All of our hypotheses were supported by the results. Implications and directions for future research are discussed.

Modeling the Country Selection Decision in Retail Internationalization

This paper aims to develop a model that assists the international retailer in selecting the country that maximizes the degree of fit between the retailer's goals and the country's characteristics in its initial internationalization move. A two-stage multi-criteria decision model is designed, integrating the Analytic Hierarchy Process (AHP) and goal programming. Ethical, cultural, geographic, and economic proximity are identified as the relevant constructs of the internationalization decision. The constructs are further structured into sub-factors within the analytic hierarchy. The model helps the retailer to integrate, rank, and weigh a number of hard and soft factors and to prioritize the countries accordingly. The model has been implemented for a Turkish luxury goods retailer that was planning to internationalize. The retailer's actual entry into the selected country supports the model. Implementation for a single retailer limits the generalizability of the results; however, the emphasis of the paper is on construct identification and model development. The paper enriches the existing literature by proposing a hybrid multi-objective decision model that introduces new soft dimensions (perceived distance, ethical proximity, humane orientation) to the decision process and facilitates effective decision making.
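
A brief sketch of the AHP weighting stage only, assuming a hypothetical 4x4 pairwise comparison matrix over the proximity constructs; the goal programming stage and the study's actual judgments are not reproduced.

```python
import numpy as np

# Derive criterion weights from a pairwise comparison matrix via its principal
# eigenvector. The matrix below is an illustrative example of judgments over
# ethical, cultural, geographic, and economic proximity.
A = np.array([
    [1,   3,   5,   2  ],
    [1/3, 1,   3,   1/2],
    [1/5, 1/3, 1,   1/4],
    [1/2, 2,   4,   1  ],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalize to sum to 1

# Consistency ratio (random index RI = 0.90 for a 4x4 matrix)
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.90
print("Criterion weights:", np.round(weights, 3))
print("Consistency ratio:", round(cr, 3))
```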

eTax Filing and Service Quality: The Case of the Revenue Online Service

This paper describes an ongoing study into the quality of service provided by the Irish Revenue Commissioners' online tax filing and collection system. The Irish Revenue On-Line Service (ROS) site has won several awards. In this study, a version of the widely used SERVQUAL measuring instrument, adapted for use with online services, has been modified for the specific case of ROS. In this paper, the theory behind this instrument is set out, the particular problems of evaluating online revenue collection are examined, and the rationale for this approach is explained.
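
The gap-scoring logic behind SERVQUAL-type instruments can be summarized as perception minus expectation per dimension, as in the small illustration below; the dimension set and ratings are placeholders, not the modified ROS instrument or its data.

```python
import numpy as np

# Illustrative SERVQUAL-style gap scoring: service quality per dimension is the
# mean difference between perception and expectation ratings (7-point scale).
dimensions = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
expectations = np.array([6.2, 6.8, 6.5, 6.4, 6.0])   # hypothetical ratings
perceptions  = np.array([5.9, 6.3, 6.6, 6.1, 5.7])

gaps = perceptions - expectations
for name, gap in zip(dimensions, gaps):
    print(f"{name:15s} gap = {gap:+.2f}")
print(f"Overall (unweighted) SERVQUAL score = {gaps.mean():+.2f}")
```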

Turkish Emerging Adults' Identity Statuses with Respect to Marital and Parental Statuses and SES

Emerging adulthood, between the ages of 18 and 25, is a new developmental stage extending from adolescence to young adulthood. According to Arnett [2004], identity exploration in emerging adulthood occurs in three basic areas: love, work, and worldview. A review of the identity literature shows that identity has been studied mostly with adolescents and that studies have concentrated on its relationship with many demographic variables while neglecting important variables such as marital status, parental status, and SES. Thus, the main aim of this study is to determine whether identity statuses differ by marital status, parental status, and SES. A total of 700 emerging adults participated in this study; the mean age was 22.45 years [SD = 3.76]. The sample was made up of 347 females and 353 males, all of whom were college students. Responses to the Extended Version of the Objective Measure of Ego Identity Status [EOM-EIS-2] were used to classify students into one of the four identity statuses. SPSS 15.0 was used to analyse the data, employing percentages, frequencies, and chi-square analysis. When the findings are viewed as a whole, the most frequently observed identity status in the group is moratorium. Identity statuses also differ by marital status, parental status, and SES. The findings are discussed in the context of emerging adulthood.

Towards a Systematic, Cost-Effective Approach for ERP Selection

Experience indicates that one of the most prominent reasons some ERP implementations fail is the selection of an improper ERP package. One important factor resulting in inappropriate ERP selections is ignoring the preliminary activities that should be done before evaluating ERP packages. Another factor is that organizations usually employ prolonged and costly selection processes, to such an extent that the process is sometimes never finalized, or the evaluation team performs many key final activities incompletely or inaccurately due to exhaustion, lack of interest, or out-of-date data. In this paper, a systematic approach for choosing an ERP package is introduced that recommends specific activities to be done before and after the main selection phase. In addition, the proposed approach incorporates ideas that accelerate the selection process while reducing the probability of an erroneous final selection.

Research on the Layout of Ground Control Points in Plain-Area 1:10 000 DLG Production Using the POS Technique

The POS (also called DGPS/IMU) technique can obtain the exterior orientation elements of aerial photos, so triangulation and DLG production using POS can save a large number of ground control points (GCPs), improving the production efficiency of DLG and reducing the cost of collecting GCPs. This paper mainly studies GCP distribution when the POS technique is used in the production of 1:10 000 scale DLG. We designed 23 GCP distribution schemes and used the integrated sensor orientation method to carry out triangulation experiments; based on the triangulation results, we produced a 1:10 000 scale map and tested its accuracy. From the experiments and research above, this paper puts forward appropriate GCP distribution schemes and lays the groundwork for applying the POS technique to photogrammetric 4D data production.

Traffic Flow Prediction using Adaboost Algorithm with Random Forests as a Weak Learner

Traffic management and information systems, which rely on a network of sensors, aim to describe urban traffic in real time using a set of parameters and to estimate those parameters. Although the state of the art focuses on data analysis, little has been done in terms of prediction. In this paper, we describe a machine learning system for traffic flow management and control, applied to the traffic flow prediction problem. The new algorithm is obtained by using the Random Forests algorithm as the weak learner within the Adaboost algorithm. We show that the algorithm performs well on real data and, according to the Traffic Flow Evaluation model, makes it possible to estimate and predict whether a road intersection is congested at a given time.
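
The combination can be sketched with off-the-shelf components, e.g. scikit-learn, using a shallow Random Forest as the boosted weak learner; the synthetic data, sizes, and hyperparameters below are illustrative assumptions, not the paper's sensor data or settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic binary data standing in for congested vs. free-flowing observations.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# Shallow Random Forests act as the "weak" learner inside AdaBoost.
weak_learner = RandomForestClassifier(n_estimators=10, max_depth=3, random_state=0)
model = AdaBoostClassifier(estimator=weak_learner, n_estimators=25, random_state=0)
# Note: scikit-learn < 1.2 uses base_estimator= instead of estimator=

model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```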

Energy and Distance Based Clustering: An Energy Efficient Clustering Method for Wireless Sensor Networks

In this paper, we propose an energy-efficient cluster-based communication protocol for wireless sensor networks. Our protocol considers both the residual energy of sensor nodes and the distance of each node from the BS when selecting cluster-heads. The protocol can successfully prolong the network's lifetime by 1) reducing the total energy dissipation in the network and 2) evenly distributing energy consumption over all sensor nodes. In this protocol, nodes with more residual energy and a shorter distance to the BS are more likely to be selected as cluster-heads. MATLAB simulation results show that the proposed protocol increases the network lifetime by more than 94% in terms of first node death (FND) and by more than 6% in terms of half of the nodes alive (HNA) compared with conventional protocols.
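
One way to express the selection rule is to score each node by normalized residual energy and inverse distance to the BS and sample cluster-heads with probability proportional to that score; the weighting below is an illustrative assumption, not the protocol's exact probability function.

```python
import numpy as np

# Energy- and distance-aware cluster-head selection: nodes with more residual
# energy and a shorter distance to the base station get a higher probability.
rng = np.random.default_rng(3)
n_nodes = 100
residual_energy = rng.uniform(0.2, 2.0, n_nodes)       # Joules (hypothetical)
positions = rng.uniform(0, 100, (n_nodes, 2))          # 100 m x 100 m field
bs = np.array([50.0, 150.0])                           # base station location
dist_to_bs = np.linalg.norm(positions - bs, axis=1)

# Higher energy and lower distance -> larger score
score = (residual_energy / residual_energy.max()) * (dist_to_bs.min() / dist_to_bs)
prob = score / score.sum()

n_cluster_heads = 5
cluster_heads = rng.choice(n_nodes, size=n_cluster_heads, replace=False, p=prob)
print("Selected cluster heads:", sorted(cluster_heads.tolist()))
```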

A New Extended Group Mutual Exclusion Algorithm with Low Message Complexity in Distributed Systems

The group mutual exclusion (GME) problem is an interesting generalization of the mutual exclusion problem. In group mutual exclusion, multiple processes can enter the critical section simultaneously if they belong to the same group. In extended group mutual exclusion, each process is a member of multiple groups at the same time. As a result, after a process selects one of its groups and enters the critical section, other processes that also belong to that group can select the same group and enter the critical section concurrently, which avoids unnecessary blocking. This paper presents a quorum-based distributed algorithm for the extended group mutual exclusion problem. The message complexity of our algorithm is O(4Q) in the best case and O(5Q) in the worst case, where Q is the quorum size.
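
As a conceptual illustration of the extended GME property only (not the quorum-based, message-passing algorithm of the paper), the shared-memory sketch below lets threads that share the currently active group enter the critical section together.

```python
import threading

# A "group lock": threads in the same group may occupy the critical section
# together, while other groups wait. Each thread belongs to several groups and
# may enter with whichever of its groups is currently active.
class GroupLock:
    def __init__(self):
        self._cond = threading.Condition()
        self._active_group = None
        self._inside = 0

    def enter(self, my_groups):
        """Block until no group is active or an active group is one of ours."""
        with self._cond:
            while not (self._active_group is None or self._active_group in my_groups):
                self._cond.wait()
            if self._active_group is None:
                self._active_group = next(iter(my_groups))
            self._inside += 1
            return self._active_group

    def leave(self):
        with self._cond:
            self._inside -= 1
            if self._inside == 0:
                self._active_group = None
                self._cond.notify_all()

lock = GroupLock()

def worker(name, groups):
    g = lock.enter(groups)
    print(f"{name} in critical section with group {g}")
    lock.leave()

threads = [threading.Thread(target=worker,
                            args=(f"p{i}", {"A", "B"} if i % 2 else {"B", "C"}))
           for i in range(4)]
for t in threads: t.start()
for t in threads: t.join()
```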

A Complexity-Based Approach in Image Compression using Neural Networks

In this paper we present an adaptive method for image compression that is based on the complexity level of the image. The basic compressor/de-compressor structure of this method is a multilayer perceptron artificial neural network. In the adaptive approach, different back-propagation artificial neural networks are used as compressors and de-compressors; this is done by dividing the image into blocks, computing the complexity of each block, and then selecting one network for each block according to its complexity value. Three complexity measures, called Entropy, Activity, and Pattern-based, are used to determine the complexity level of image blocks, and their ability to estimate complexity is evaluated and compared. In training and evaluation, each image block is assigned to a network based on its complexity value. Best-SNR is an alternative for selecting the compressor network for image blocks in the evaluation phase: it chooses the trained network that yields the best SNR when compressing the input image block. In our evaluations, the best results are obtained when overlapping blocks are allowed and the compressor networks are chosen based on Best-SNR. In this case, the results demonstrate the superiority of this method compared with previous similar works and standard JPEG coding.
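
The entropy measure, for example, can be computed per block roughly as sketched below, with hypothetical thresholds routing each block to one of several compressor networks; the block size and thresholds are assumptions, and the MLP compressors themselves are omitted.

```python
import numpy as np

def block_entropy(block):
    """Shannon entropy (bits) of the gray-level histogram of one block."""
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(5)
image = rng.integers(0, 256, size=(64, 64))      # stand-in for a gray-scale image

block_size = 8
assignments = {}
for r in range(0, image.shape[0], block_size):
    for c in range(0, image.shape[1], block_size):
        h = block_entropy(image[r:r + block_size, c:c + block_size])
        # Hypothetical thresholds mapping complexity to one of three networks
        network = "low" if h < 3.0 else "medium" if h < 6.0 else "high"
        assignments[(r, c)] = (round(h, 2), network)

print(list(assignments.items())[:4])
```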

A Tool for the Creation of Artificial Symbiotic Associations of Wheat

This paper reports the optimization of the bioballistic transformation of spring soft wheat (Triticum aestivum L., cultivar Raduga) and the production of transgenic plants carrying the pea lectin gene. This gene will make it possible to create a new associative symbiosis between wheat and the nodule bacteria of field pea, which have growth-promoting, fungistatic, and other useful characteristics.

Neural Networks Learning Improvement using the K-Means Clustering Algorithm to Detect Network Intrusions

In the present work, we propose a new technique to enhance the learning capabilities and reduce the computation intensity of a competitive learning multi-layered neural network using the K-means clustering algorithm. The proposed model uses a multi-layered network architecture with a back-propagation learning mechanism. The K-means algorithm is first applied to the training dataset to reduce the number of samples presented to the neural network by automatically selecting an optimal set of samples. The obtained results demonstrate that the proposed technique performs exceptionally well in terms of both accuracy and computation time when applied to the KDD99 dataset, compared to a standard learning scheme that uses the full dataset.
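
The sample-reduction step can be sketched as follows: cluster each class of the training set with K-means and train the network on the cluster centroids only. Synthetic data stands in for KDD99, and the cluster counts and network size are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data in place of KDD99.
X, y = make_classification(n_samples=5000, n_features=20, n_informative=10,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# Cluster each class separately and keep only the cluster centroids.
reduced_X, reduced_y = [], []
for label in np.unique(y_train):
    Xc = X_train[y_train == label]
    km = KMeans(n_clusters=50, n_init=10, random_state=0).fit(Xc)
    reduced_X.append(km.cluster_centers_)
    reduced_y.append(np.full(50, label))
reduced_X = np.vstack(reduced_X)
reduced_y = np.concatenate(reduced_y)

# Back-propagation network trained on the reduced set only.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
mlp.fit(reduced_X, reduced_y)
print("Accuracy trained on reduced set:", accuracy_score(y_test, mlp.predict(X_test)))
```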

A Scenario-Oriented Supplier Selection Considering a Multi-Tier Supplier Network

One of the main processes of supply chain management is supplier selection, whose accurate implementation can dramatically increase company competitiveness. In the presented article, a model is developed based on the features of second-tier suppliers, and four scenarios are considered in order to help the decision maker (DM) make up his or her mind. In addition, two tiers of suppliers are treated as a supplier chain. The proposed approach is then solved by a method combining concepts of fuzzy set theory (FST) and linear programming (LP), fed with real data extracted from an engineering design and parts supply company. In the end, the results reveal the high importance of considering second-tier suppliers' features as criteria for selecting the best supplier.
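
A toy illustration of the linear-programming ingredient alone, allocating an order among three suppliers at minimum cost subject to capacity limits; the costs, capacities, and demand are made up, and the fuzzy scoring of second-tier suppliers is not reproduced.

```python
import numpy as np
from scipy.optimize import linprog

# Allocate a required order quantity among three first-tier suppliers,
# minimizing total cost subject to capacity limits.
cost = np.array([12.0, 10.5, 11.2])        # unit purchase cost per supplier
capacity = np.array([400, 300, 500])       # maximum units each supplier can deliver
demand = 800                               # units the buyer needs

# Decision variables x_i >= 0: units ordered from supplier i
res = linprog(c=cost,
              A_eq=[[1, 1, 1]], b_eq=[demand],          # meet the demand exactly
              bounds=[(0, cap) for cap in capacity],    # respect capacities
              method="highs")

print("Order allocation:", np.round(res.x, 1))
print("Total cost:", round(res.fun, 2))
```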

Distributor Plate Design and a System for Collection of Granules in a Device with a Vortex Fluidized Bed

A newly designed gas distributor for the granulation of powdery materials in an equilibrated fluidized bed, together with a system for collecting the prepared granules, is proposed. The aim of these designs is to solve the problems arising during the granulation of powdery materials in fluidized-bed devices. The gas distributor and the collection system proved reliable in operation; they reduce the size of stagnant zones, effectively disperse the binding solution in the bed, and ensure the collection of granules of a given diameter.

Dataset Analysis Using Membership-Deviation Graph

Classification is one of the primary themes in computational biology. The accuracy of classification strongly depends on the quality of a dataset, so a method is needed to evaluate this quality. In this paper, we propose a new graphical analysis method using the Membership-Deviation Graph (MDG) for analyzing the quality of a dataset. The MDG represents the degree of membership and the deviations of the instances of a class in the dataset. The result of the MDG analysis is used for understanding specific features and for selecting the best feature for classification.

Feedstock Effects on Selecting the Appropriate Coil Configuration for Cracking Furnaces

In the present research, the steam cracking of two feedstocks, naphtha and ethane, is simulated for the Pyrocrack1-1 and Pyrocrack2-2 coil configurations, considering the two key parameters of coil outlet temperature (COT) and coil capacity, using a radical-based kinetic model. The computer model is validated against industrial data obtained from the Amirkabir Petrochemical Complex. The results are in good agreement with performance data for naphtha cracking over a wide range of severity (0.4-0.7) and for ethane cracking at various conversions (50-70%). It was found that the Pyrocrack2-2 coil type is an appropriate choice for the steam cracking of ethane, giving a reasonable ethylene yield with a much lower tube wall temperature, whereas the Pyrocrack1-1 coil type is a proper selection for liquid feedstocks such as naphtha. It can be used for cracking liquid feedstocks at optimal ethylene yield without exceeding the maximum allowable tube temperature.

An Optimization Model of CMMI-Based Software Project Risk Response Planning

Risk response planning is important for software project risk management (SPRM). In CMMI, risk management is located at the third capability maturity level, which provides a framework for software project risk identification, assessment, planning, and control. However, CMMI-based SPRM currently lacks quantitative supporting tools, especially during the implementation of software project risk planning. In this paper, an economic optimization model for selecting risk reduction actions in the software project risk response planning phase is presented. Furthermore, an example taken from the Chinese software industry is presented to verify the application of this method. The research provides a risk decision method for project risk managers that can be used in the implementation of CMMI-based SPRM.
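
A toy illustration of the general idea of economically selecting risk-reduction actions under a budget (a knapsack-style search); the action names, costs, and expected reductions are hypothetical, and the paper's actual CMMI-based model may differ.

```python
from itertools import combinations

# Choose the subset of candidate actions that maximizes expected loss reduction
# without exceeding the available budget (exhaustive search for a tiny example).
budget = 50.0
actions = {                        # cost (person-days), expected loss reduction
    "add code reviews":       (15.0, 22.0),
    "prototype early":        (25.0, 30.0),
    "hire domain expert":     (30.0, 28.0),
    "extra regression tests": (10.0, 12.0),
}

best_value, best_subset = 0.0, ()
for r in range(len(actions) + 1):
    for subset in combinations(actions, r):
        cost = sum(actions[a][0] for a in subset)
        value = sum(actions[a][1] for a in subset)
        if cost <= budget and value > best_value:
            best_value, best_subset = value, subset

print("Selected actions:", list(best_subset))
print("Expected loss reduction:", best_value)
```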