A Study on a Discrete Event Simulation Model for Availability Analysis of Weapon Systems

This paper presents a discrete event simulation model for the availability analysis of weapon systems. The model incorporates missions, operational tasks, and system reliability structures to analyze the availability of a weapon system. The proposed simulation model consists of five modules, based on missions and operational tasks: Simulation Engine, Maintenance Organization, System, Mission Profile, and RBD. The Simulation Engine executes three kinds of discrete events in chronological order: mission events generated by the Mission Profile, failure events generated by the System, and maintenance events executed by the Maintenance Organization. Finally, the paper presents a case study in which the model is used to analyze a system's availability and mission reliability.

Increasing the Heterogeneity and Competition of Early Stage Financing: An Analysis of the Role of Crowdfunding in Entrepreneurial Ventures

The financial crisis has reduced the opportunities for small businesses to acquire financing through conventional financial actors such as commercial banks. This credit constraint is part of the reason for the emergence of new financing alternatives, alongside the spread of opportunities for communication and secure financial transfer over the Internet. One of the most interesting new venues for finance is termed "crowdfunding". As the term suggests, crowdfunding is an appeal to prospective customers and investors to form a crowd that will finance projects which would otherwise find it hard to generate support from the most common financial actors. In this paper, crowdfunding is divided into different models: the threshold model, the microfinance model, the micro-loan model, and the equity model. All of these models add to the financing possibilities of emerging entrepreneurs.
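Of the models listed above, the threshold model has the simplest mechanics: pledges are collected only if their sum reaches the funding goal, otherwise all backers are refunded. A hypothetical sketch of this all-or-nothing settlement rule (names and amounts are illustrative, not from the paper):

```python
def settle_campaign(goal, pledges):
    """Threshold (all-or-nothing) model: return (funded, amount_collected).

    goal: funding target; pledges: {backer_name: pledged_amount}.
    """
    total = sum(pledges.values())
    if total >= goal:
        return True, total   # goal reached: every pledge is collected
    return False, 0          # goal missed: all backers are refunded
```

For example, `settle_campaign(10000, {"alice": 6000, "bob": 5000})` returns `(True, 11000)`, while the same goal with only alice's pledge returns `(False, 0)`.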

SWARM: A Meta-Scheduler to Minimize Job Queuing Times on Computational Grids

Some meta-schedulers query the information systems of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts because of changes in scheduling priorities. The MSR scheme, based on Multiple Simultaneous Requests, can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Usage statistics are provided and demonstrate its capacity to reliably achieve a substantial reduction of execution time under production conditions.
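The Multiple Simultaneous Requests idea can be sketched as follows: copies of the same job are submitted to several resources, and as soon as one copy starts, the redundant copies are cancelled. This is a hypothetical illustration, not SWARM's or the MSR implementation's actual logic; the predicted wait times stand in for real scheduler behaviour.

```python
def msr_dispatch(wait_times):
    """Pick the copy that would start first and cancel the rest.

    wait_times: {resource_name: predicted_queue_wait_seconds}.
    Returns (winning_resource, cancelled_resources).
    """
    # submit to every resource, then keep only the copy that starts first
    winner = min(wait_times, key=wait_times.get)
    cancelled = sorted(r for r in wait_times if r != winner)
    return winner, cancelled
```

The point of submitting everywhere rather than predicting a single best queue is exactly the one made above: predictions go stale when priorities change, whereas a copy already sitting in every queue benefits from whichever queue drains first.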

Hiding Data in Images Using PCP

In recent years everything is trending toward digitalization, and with the rapid development of Internet technologies, digital media needs to be transmitted conveniently over the network. Attacks, misuse, and unauthorized access to information are of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process of hiding a message in an appropriate carrier, such as an image or audio file. The term is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver such that no one except the authenticated receiver knows of the existence of the hidden information. A considerable amount of work on steganography has been carried out by different researchers. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a grayscale image. The proposed approach selects the embedding pixels using a mathematical function, finds the 8-neighborhood of each selected pixel, and maps each bit of the secret message onto a neighboring pixel's coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or any of its neighbors lies at the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimal degradation.
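The embedding step can be sketched along these lines. The paper's exact pixel-selection function and bit-mapping are not reproduced here; as an assumption, pixels are selected at a fixed stride, one message bit is written into the least significant bit of each of the 8 neighbours of a selected pixel, and pixels whose neighbourhood touches the image border are skipped, mirroring the boundary check described above.

```python
# relative coordinates of the 8-neighborhood of a pixel
NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(image, bits, stride=3):
    """image: 2-D list of grey levels (0-255); bits: list of 0/1 message bits."""
    h, w = len(image), len(image[0])
    it = iter(bits)
    for y in range(0, h, stride):
        for x in range(0, w, stride):
            # boundary check: the whole 8-neighborhood must lie inside the image
            if not (1 <= y < h - 1 and 1 <= x < w - 1):
                continue
            for dy, dx in NEIGHBOURS:
                bit = next(it, None)
                if bit is None:
                    return image          # whole message embedded
                py, px = y + dy, x + dx
                # overwrite the least significant bit of the neighbor pixel
                image[py][px] = (image[py][px] & ~1) | bit
    return image
```

Because only the lowest bit of a few neighbour pixels changes, each touched grey level moves by at most 1, which is the sense in which such spatial-domain schemes keep stego-image degradation minimal.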

Use of Novel Algorithms MAJE4 and MACJER-320 for Achieving Confidentiality and Message Authentication in SSL and TLS

Extensive use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a huge demand for information security. The Secure Socket Layer (SSL) protocol, the most widely used security protocol on the Internet, meets this demand. It provides protection against eavesdropping, tampering, and forgery. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services such as confidentiality and authentication in SSL. However, recent attacks against RC4 and HMAC have raised questions about the confidence placed in these algorithms. Hence, two novel cryptographic algorithms, MAJE4 and MACJER-320, have been proposed as substitutes for them. The focus of this work is to demonstrate the performance of these new algorithms and to suggest them as dependable alternatives for satisfying the security-service needs of SSL. The performance evaluation has been carried out through a practical implementation.
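The MAJE4 and MACJER-320 algorithms themselves are not publicly specified here, so the sketch below only illustrates the HMAC baseline they are proposed to replace: a keyed hash over a record payload, as used for message authentication. The hash choice, key, and message are illustrative (Python's standard library `hmac` module is used).

```python
import hashlib
import hmac

def record_mac(key: bytes, payload: bytes) -> bytes:
    """Compute a keyed MAC tag over a record payload (HMAC baseline)."""
    return hmac.new(key, payload, hashlib.sha256).digest()

def verify(key: bytes, payload: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    # compare_digest avoids leaking the mismatch position via timing
    return hmac.compare_digest(record_mac(key, payload), tag)
```

Any substitute MAC, such as the MACJER-320 construction proposed above, must preserve this interface: the receiver recomputes the tag with the shared key and rejects the record if the tags differ.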

A Novel Nucleus-Based Classifier for Discrimination of Osteoclasts and Mesenchymal Precursor Cells in Mouse Bone Marrow Cultures

Bone remodeling occurs through the balanced action of bone-resorbing osteoclasts (OC) and bone-building osteoblasts. Increased bone resorption caused by excessive OC activity contributes to malignant and non-malignant diseases, including osteoporosis. To study OC differentiation and function, OC formed in in vitro cultures are currently counted manually, a tedious procedure that is prone to inter-observer differences. Aiming for an automated OC-quantification system, OC and precursor cells were classified on fluorescence microscope images based on the distinct appearance of their fluorescent nuclei. Following ellipse fitting to the nuclei, a combination of eight features enabled clustering of OC and precursor cell nuclei. After evaluating different machine-learning techniques, LOGREG achieved 74% correctly classified OC and precursor cell nuclei, outperforming human experts (best expert: 55%). In combination with the automated detection of total cell areas, this system allows various cell parameters to be measured and, most importantly, proteins involved in osteoclastogenesis to be quantified.

Bioremediation of MEG, DEG, and TEG: Potential of Burhead Plant and Soil Microorganisms

The aim of this work was to investigate the potential of soil microorganisms, the burhead plant, and the combination of the two to remediate monoethylene glycol (MEG), diethylene glycol (DEG), and triethylene glycol (TEG) in synthetic wastewater. The results showed that a system containing both the burhead plant and soil microorganisms had the highest EG removal efficiency: around 100% of MEG and DEG and 85% of TEG were removed within 15 days of the experiments. The burhead plant alone showed higher removal efficiency than the soil microorganisms for MEG and DEG, and the same efficiency for TEG. The removal rate of EGs in the study system was related to the molecular weight of the compounds: MEG, the smallest glycol, was removed faster than DEG and TEG by both the burhead plant and the soil microorganisms.

Efficient System for Speech Recognition using General Regression Neural Network

In this paper we present an efficient system for speaker-independent speech recognition based on a neural network approach. The proposed architecture comprises two phases: a preprocessing phase, which consists of segmental normalization and feature extraction, and a classification phase, which uses a neural network based on nonparametric density estimation, namely the general regression neural network (GRNN). The performance of the proposed model is compared with similar recognition systems that we also implemented, based on the Multilayer Perceptron (MLP), the Recurrent Neural Network (RNN), and the well-known Discrete Hidden Markov Model (HMM-VQ). Experimental results obtained with Arabic digits have shown that the use of nonparametric density estimation with an appropriate smoothing factor (spread) improves the generalization power of the neural network. The word error rate (WER) is reduced significantly over the baseline HMM method. GRNN computation is a successful alternative to the other neural networks and the DHMM.
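The GRNN estimator at the heart of the classifier above has a closed form: the output is a Gaussian-kernel-weighted average of the training targets, with the smoothing factor sigma (the "spread" mentioned above) controlling how far the network generalizes between training points. A minimal sketch, with illustrative feature vectors rather than real speech features:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=0.5):
    """GRNN output: kernel-weighted average of training targets.

    x: query feature vector; train_x: list of feature vectors;
    train_y: list of targets; sigma: smoothing factor (spread).
    """
    # Gaussian kernel weight for each training pattern
    weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi))
                        / (2 * sigma ** 2))
               for xi in train_x]
    # normalized weighted sum of the targets
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)
```

For classification (e.g. digit recognition), one such estimator per class can be evaluated and the class with the largest response chosen; a small sigma memorizes the training data, while a larger sigma smooths across patterns, which is the generalization effect the abstract reports.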

Urban Floods and Their Importance in City Security Planning (Case Study: Dominant Watershed of Zavvareh City)

The development of cities, villages, agricultural farms, and industrial regions adjacent to or in the course of streams and rivers, or on flood-prone lands, has raised growing concerns in hydrology and city planning. To protect cities against flood damage, embankment construction is a well-established engineering method. Cities located in arid zones may be damaged periodically by floods. Zavvareh city in Ardestan township (Isfahan province), with a population of 7,704, is located on the Ardestan plain and has in past years been damaged by floods flowing from the dominant mountainous watersheds, depending on the return period. In this study, considering the floods flowing toward Zavvareh city, an attempt was made to design suitable hydraulic structures, such as canals, bridges, and collectors, for the collection, conduction, and disposal of city surface runoff.
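The abstract does not state which runoff formula was used to size the collection structures; as an assumption, the standard rational method is sketched here, since it is a common choice for small urban watersheds: Q = C · i · A / 3.6 gives the peak discharge in m³/s when the runoff coefficient C is dimensionless, the rainfall intensity i is in mm/h, and the watershed area A is in km². All input values below are illustrative.

```python
def rational_peak_discharge(c, intensity_mm_h, area_km2):
    """Rational method: peak discharge in m^3/s.

    c: dimensionless runoff coefficient (land-cover dependent)
    intensity_mm_h: design rainfall intensity for the chosen return period
    area_km2: contributing watershed area
    """
    # the factor 3.6 converts mm/h * km^2 into m^3/s
    return c * intensity_mm_h * area_km2 / 3.6
```

For example, a 12 km² watershed with C = 0.6 under a 30 mm/h design storm yields a 60 m³/s peak, the kind of figure a collector canal or bridge opening would then be dimensioned against.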

Genetic Variation of Durum Wheat Landraces and Cultivars Using Morphological and Protein Markers

Knowledge of patterns of genetic diversity enhances the efficiency of germplasm conservation and improvement. In this study, 96 Iranian landraces of Triticum turgidum originating from different geographical areas of Iran, along with 18 durum cultivars from ten countries, were evaluated for variation in morphology and high molecular weight glutenin subunit (HMW-GS) composition. The first two principal components clearly separated the Iranian landraces from the cultivars. Three alleles were present at the Glu-A1 locus and 11 at Glu-B1. In both cultivars and landraces of durum wheat, the null allele (Glu-A1c) was observed more frequently than the Glu-A1a and Glu-A1b alleles. Two alleles, Glu-B1a (subunit 7) and Glu-B1e (subunit 20), were the most frequent at the Glu-B1 locus. The results showed that the evaluated Iranian landraces form an interesting source of favourable glutenin subunits that might be very desirable in breeding activities aimed at improving pasta-making quality.

Evaluating per-user Fairness of Goal-Oriented Parallel Computer Job Scheduling Policies

A fair-share objective has recently been included in goal-oriented parallel computer job scheduling policies. However, previous work presented only the overall scheduling performance, so the per-user performance of such policies remains unexamined. In this work, per-user fair-share performance under the Tradeoff-fs(Tx:avgX) policy is evaluated in detail. A basic fair-share priority backfill policy, RelShare(1d), is also studied. The performance of all policies is collected using an event-driven simulator with three real job traces as input. The experimental results show that high-demand users usually benefit under most policies, either because their jobs are large or because they have many jobs. In the large-job case, a single executing job may cause an over-share during that period; in the many-jobs case, the jobs may be backfilled, improving their performance. However, users with a mixture of job sizes may suffer, because while their smaller jobs are executing, the priority of their remaining jobs is lowered. Further analysis shows no significant impact from users with many jobs or from users with large runtime-approximation errors.
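The fair-share priority mechanism underlying policies like RelShare can be sketched simply: a user's job priority is scaled down once the user's recent usage exceeds its allocated share, which is exactly how an executing large job produces the over-share (and lowered priority for the user's remaining jobs) described above. The formula and numbers here are an illustrative assumption, not the paper's exact policy.

```python
def fair_share_priority(base, usage, share):
    """Scale a job's base priority by the owner's fair-share standing.

    base: nominal job priority
    usage: the user's recent fraction of total machine time (0..1)
    share: the user's allocated fraction of total machine time (0..1)
    """
    if usage <= share:
        return base                 # within share: priority unchanged
    return base * share / usage     # over share: priority scaled down
```

For example, a user allocated 20% of the machine who has consumed 40% sees job priority halved, so that while a user's large or small jobs run, that user's queued jobs fall behind other users' jobs, reproducing the mixture-of-jobs penalty reported above.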