Classification and Analysis of Risks in Software Engineering

Despite the various methods that exist for software risk management, software projects still have a high failure rate. As the complexity and size of projects increase, managing software development becomes more difficult, and thorough analysis and risk assessment become vital. In this paper, a classification of software risks is specified, and the relations between these risks are presented using a risk tree structure. These risks are then analyzed and assessed using probabilistic calculations. This analysis supports both qualitative and quantitative assessment of the risk of failure and can aid the software risk management process. The classification and risk tree structure can be applied in software tools.
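
As a hedged illustration of the kind of probabilistic calculation such a risk tree enables (the node structure, gate types and probabilities below are hypothetical, not taken from the paper), a parent risk's failure probability can be propagated from its children under an independence assumption:

    # Minimal sketch: propagate failure probabilities up a risk tree with
    # OR gates (any child fails) and AND gates (all children fail),
    # assuming the leaf risks are independent.
    from math import prod

    def tree_probability(node):
        if "p" in node:                                   # leaf risk
            return node["p"]
        child_ps = [tree_probability(c) for c in node["children"]]
        if node["gate"] == "AND":
            return prod(child_ps)
        return 1.0 - prod(1.0 - p for p in child_ps)      # OR gate

    project_failure = {                                   # hypothetical risk tree
        "gate": "OR",
        "children": [
            {"p": 0.10},                                  # e.g. requirements risk
            {"gate": "AND", "children": [{"p": 0.30},     # e.g. schedule risk
                                         {"p": 0.20}]},   # e.g. staffing risk
        ],
    }
    print(tree_probability(project_failure))              # overall failure probability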

Application of Neural Network in User Authentication for Smart Home System

Security is an important issue and concern in smart home systems. Since smart home networks consist of a wide range of wired and wireless devices, there is a possibility that illegal access to restricted data or devices may happen. Password-based authentication is widely used to identify authorized users, because this method is cheap, easy and quite accurate. In this paper, a neural network is trained to store the passwords instead of using a verification table. This method is useful in solving security problems that have occurred in some authentication systems. The conventional way to train the network using Backpropagation (BPN) requires a long training time. Hence, a faster training algorithm, Resilient Backpropagation (RPROP), is embedded into the MLP neural network to accelerate the training process. For the data, 200 sets of user IDs and passwords were created and encoded into binary as the input. Simulations were carried out to evaluate the performance for different numbers of hidden neurons and combinations of transfer functions. Mean Square Error (MSE), training time and number of epochs are used to determine the network performance. From the results obtained, using Tansig and Purelin in the hidden and output layers with 250 hidden neurons gave the best performance. As a result, a password-based user authentication system for smart homes using a neural network was developed successfully.
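
The following sketch illustrates this kind of setup under stated assumptions: PyTorch stands in for the original toolchain, and the credential data, encoding width and epoch count are hypothetical. A tanh hidden layer and a linear output layer mirror the Tansig/Purelin combination, and torch.optim.Rprop plays the role of RPROP.

    # Minimal sketch (assumed PyTorch setup): an MLP that memorizes binary-encoded
    # credentials, trained with Rprop instead of plain backpropagation.
    import torch
    import torch.nn as nn

    def encode(text, width=16):
        """Encode a string as a flat vector of bits (8 bits per character)."""
        bits = []
        for ch in text.ljust(width)[:width]:
            bits.extend(int(b) for b in format(ord(ch), "08b"))
        return torch.tensor(bits, dtype=torch.float32)

    # Hypothetical credential set: user IDs map to their passwords as targets.
    user_ids = [f"user{i:03d}" for i in range(200)]
    passwords = [f"pass{i:03d}" for i in range(200)]
    X = torch.stack([encode(u) for u in user_ids])       # shape (200, 128)
    Y = torch.stack([encode(p) for p in passwords])      # shape (200, 128)

    # Tansig ~ tanh hidden layer, Purelin ~ linear output layer, 250 hidden neurons.
    model = nn.Sequential(nn.Linear(128, 250), nn.Tanh(), nn.Linear(250, 128))
    optimizer = torch.optim.Rprop(model.parameters())
    loss_fn = nn.MSELoss()

    for epoch in range(500):
        optimizer.zero_grad()
        loss = loss_fn(model(X), Y)
        loss.backward()
        optimizer.step()

    # At verification time, a submitted password is accepted if the network's
    # recalled pattern for the user ID is close enough to the encoded password.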

Growth and Mineral Content of Mokara chark kuan Pink Orchid as Affected by Allelopathic Lantana camara Weed

Growth and mineral nutrient content were studied in a Mokara chark kuan pink terrestrial orchid and wild Lantana camara weed agroecosystem. The treated subplots were encircled with L. camara plants and sprayed weekly with a 10% L. camara leaf aqueous extract. Allelopathic interactions were possible through the extensive invading roots of L. camara plants into the treated orchid subplots and through the weekly L. camara leaf aqueous extract sprayings. Orchid growth was not significantly different between the control and treated plots, but chlorosis and yellowish patches were observed on control orchid leaves. Nitrogen content in L. camara leaves was significantly higher than in orchid leaves; the order of importance of mineral nutrient contents in L. camara leaves was K>Mg>Na>N. In treated orchid leaves, the order of importance was N>K>Mg>Na. Orchid leaf N content from the treated plot was higher than in the control, but Mg and Na contents were almost similar.

A Signal Driven Adaptive Resolution Short-Time Fourier Transform

The frequency content of non-stationary signals varies with time. For proper characterization of such signals, a smart time-frequency representation is necessary. Classically, the STFT (short-time Fourier transform) is employed for this purpose. Its limitation is its fixed time-frequency resolution. To overcome this drawback, an enhanced STFT version is devised. It is based on a signal-driven sampling scheme named cross-level sampling. It can adapt the sampling frequency and the window function (length plus shape) by following the local variations of the input signal. This adaptation gives the proposed technique its appealing features: adaptive time-frequency resolution and computational efficiency.
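
The fixed-resolution limitation the paper targets can be seen with a short sketch (SciPy and NumPy assumed; the chirp test signal and window lengths are illustrative): whichever fixed window length the classical STFT uses, it trades time localization against frequency resolution for the entire signal, which is exactly what an adaptive scheme seeks to avoid.

    # Minimal sketch of the classical, fixed-resolution STFT baseline.
    import numpy as np
    from scipy.signal import stft, chirp

    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    x = chirp(t, f0=10, f1=200, t1=2.0)          # non-stationary test signal

    # Short window: good time resolution, coarse frequency resolution.
    f_s, t_s, Z_short = stft(x, fs=fs, nperseg=64)
    # Long window: fine frequency resolution, poor time localization.
    f_l, t_l, Z_long = stft(x, fs=fs, nperseg=512)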

Specialization-Based Parallel Processing without Memo-trees

The purpose of this paper is to propose a framework for constructing correct parallel processing programs based on the Equivalent Transformation Framework (ETF). In this framework, a problem's domain knowledge and a query are described as definite clauses, and computation is regarded as transformation of these definite clauses. Their meaning is defined by a model of the set of definite clauses, and the transformation rules generated must preserve this meaning. We have previously proposed a parallel processing method based on "specialization", one of the operations used in these transformations, which resembles substitution in logic programming. That method requires a "Memo-tree", a history of specializations, to maintain correctness. In this paper we propose a new method for specialization-based parallel processing without Memo-trees.

Effect of VA-Mycorrhiza on Growth and Yield of Sunflower (Helianthus annuus L.) at Different Phosphorus Levels

The effect of seed inoculation with VA-mycorrhiza and different levels of phosphorus fertilizer on the growth and yield of sunflower (Azargol cultivar) was studied at the experimental farm of Islamic Azad University, Karaj Branch, during the 2008 growing season. The treatments were arranged in a factorial based on a randomized complete block design with three replications. Four phosphorus fertilizer levels of 25%, 50%, 75% and 100% of the recommended P, combined with two mycorrhiza levels (with and without mycorrhiza as control), were assigned in a factorial combination. Results showed that head diameter, number of seeds per head, seed yield and oil yield were significantly higher in inoculated plants than in non-inoculated plants. Head diameter, number of seeds per head, 1000-seed weight, biological yield, seed yield and oil yield increased with increasing P level up to 75% of the recommended P in non-inoculated plants, whereas no significant difference was observed between 75% and 100% of the recommended P. The positive effect of mycorrhizal inoculation decreased with increasing P levels, due to decreased percent root colonization at higher P levels. According to the results of this experiment, application of mycorrhiza in the presence of 50% of the recommended P performed well and could increase seed yield and oil production to an acceptable level, so it could be considered a suitable substitute for chemical phosphorus fertilizer in organic agricultural systems.

Portfolio Management: A Fuzzy Set Based Approach to Monitoring Size to Maximize Return and Minimize Risk

Fuzzy logic can be used when knowledge is incomplete or when ambiguity of data exists. The purpose of this paper is to propose a proactive fuzzy set-based model for reacting to the risk inherent in investment activities relative to a complete view of portfolio management. Fuzzy rules are given where, depending on the antecedents, the portfolio size may be slightly or significantly decreased or increased. The decision maker considers acceptable bounds on the proportion of acceptable risk and return. The fuzzy controller model allows learning to be achieved as 1) the firing strength of each rule is measured, 2) fuzzy output allows rules to be updated, and 3) new actions are recommended as the system continues to loop. An extension is given to the fuzzy controller that evaluates potential financial loss before adjusting the portfolio. An application is presented that illustrates the algorithm and extension developed in the paper.
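
To make the firing-strength idea concrete, here is a small sketch of my own (the membership functions, rule, thresholds and adjustment bound are made up, not taken from the paper): one rule's firing strength is computed from the memberships of its antecedents and mapped to a portfolio-size change, in the spirit of a Mamdani-style controller.

    # Minimal sketch: firing strength of a single fuzzy rule and the resulting action.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

    risk, ret = 0.7, 0.03                           # observed portfolio risk and return
    mu_risk_high = tri(risk, 0.4, 0.8, 1.2)         # membership of "risk is high"
    mu_return_low = tri(ret, -0.05, 0.0, 0.06)      # membership of "return is low"

    # Rule: IF risk is high AND return is low THEN decrease size significantly.
    firing_strength = min(mu_risk_high, mu_return_low)
    max_decrease = 0.25                             # "significantly": up to 25%
    size_change = -firing_strength * max_decrease
    print(f"adjust portfolio size by {size_change:.1%}")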

Microarrays Denoising via Smoothing of Coefficients in Wavelet Domain

We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each of the highest subbands, and reconstruct the microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition; i.e., in most cases, a multiresolution analysis is not necessary. Denoising results compare favorably with most methods currently in use.
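
A minimal sketch of this single-level decompose-smooth-reconstruct pipeline is shown below, assuming PyWavelets and SciPy are available and the microarray is a 2-D NumPy array; the wavelet choice and smoothing filter size are illustrative, not the paper's exact settings.

    # One-level wavelet decomposition, smoothing of the detail subbands, reconstruction.
    import numpy as np
    import pywt
    from scipy.ndimage import uniform_filter

    def denoise_microarray(image, wavelet="db2", smooth_size=3):
        # Single-level 2-D decomposition: approximation + (horizontal, vertical, diagonal).
        cA, (cH, cV, cD) = pywt.dwt2(image, wavelet)
        # Smooth only the highest (detail) subbands, where most noise energy lives.
        cH, cV, cD = (uniform_filter(c, size=smooth_size) for c in (cH, cV, cD))
        return pywt.idwt2((cA, (cH, cV, cD)), wavelet)

    noisy = np.random.rand(256, 256)   # placeholder for a real microarray image
    clean = denoise_microarray(noisy)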

Defect Cause Modeling with Decision Tree and Regression Analysis

The main aim of this study is to identify the most influential variables that cause defects in the items produced by a casting company located in Turkey. To this end, one of the items produced by the company with a high defective percentage rate is selected. Two approaches, regression analysis and decision trees, are used to model the relationship between process parameters and defect types. While the logistic regression models failed, the decision tree model gives meaningful results. Based on these results, it can be claimed that the decision tree approach is a promising technique for determining the most important process variables.
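
The following sketch shows the decision-tree side of such an analysis under stated assumptions: scikit-learn and pandas are available, and the file name and column names are hypothetical stand-ins for the company's actual process parameters; the tree's feature importances then suggest the most influential variables.

    # Minimal sketch: fit a decision tree to process parameters and rank variables.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier

    df = pd.read_csv("casting_process_data.csv")        # hypothetical dataset
    X = df[["pouring_temp", "mold_hardness", "sand_moisture", "cycle_time"]]
    y = df["defect_type"]

    tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
    ranking = sorted(zip(X.columns, tree.feature_importances_),
                     key=lambda t: t[1], reverse=True)
    print(ranking)   # most influential process variables first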

Impact of Implementing VPN to Secure Wireless LAN

Many corporations are seriously concerned about the security of their networks, and therefore their network supervisors are still reluctant to install WLANs. In this regard, the IEEE 802.11i standard was developed to address the security problems, even though mistrust of wireless LAN technology still exists. The view is that the best security solutions can be found in open-standards-based technologies delivered by Virtual Private Networking (VPN), which has been in use for a long time without security holes emerging over the past few years. This work addresses this issue by presenting a simulated wireless LAN based on the IEEE 802.11g protocol and analyzing the impact of integrating Virtual Private Network technology to secure the flow of traffic between the client and the server within the LAN, using the OPNET WLAN utility. Two wireless LAN scenarios have been introduced and simulated: a normal wireless extension to a wired network, and a VPN over that wireless extension. The results of the two scenarios are compared and indicate the performance impact, measured by response time and load, of running a Virtual Private Network over a wireless LAN.

Object-Oriented Cognitive-Spatial Complexity Measures

Software maintenance, and mainly software comprehension, poses the largest costs in the software lifecycle. In order to assess the cost of software comprehension, various complexity measures have been proposed in the literature. This paper proposes new cognitive-spatial complexity measures, which combine the impact of the spatial as well as the architectural aspects of the software to compute software complexity. The spatial aspect of software complexity is taken into account using the lexical distances (in number of lines of code) between different program elements, and the architectural aspect is taken into consideration using the cognitive weights of the control structures present in the control flow of the program. The proposed measures are evaluated using standard axiomatic frameworks and are then compared with the corresponding existing cognitive complexity measures as well as the spatial complexity measures for object-oriented software. This study establishes that the proposed measures are better indicators of the cognitive effort required for software comprehension than the other existing complexity measures for object-oriented software.
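
To make the two ingredients concrete, here is a toy sketch of my own (not the paper's actual formula): each call site contributes its lexical distance in lines of code to the callee definition, weighted by a commonly used cognitive weight for the enclosing control structure (sequence 1, branch 2, iteration 3). The data and the averaging are purely illustrative.

    # Toy cognitive-spatial style score combining lexical distance and cognitive weights.
    COGNITIVE_WEIGHT = {"sequence": 1, "branch": 2, "iteration": 3}

    # Each call site: (line of call, line of callee definition, enclosing structure).
    calls = [(12, 40, "branch"), (18, 40, "iteration"), (55, 90, "sequence")]

    def cognitive_spatial_score(calls):
        return sum(abs(call - defn) * COGNITIVE_WEIGHT[ctx]
                   for call, defn, ctx in calls) / len(calls)

    print(cognitive_spatial_score(calls))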

Optimization of New 25A-size Metal Gasket Design Based on Contact Width Considering Forming and Contact Stress Effect

In a previous study of the new metal gasket, contact width and contact stress were identified as important design parameters for optimizing metal gasket performance. However, the range of contact stress had not been investigated thoroughly. In this study, we conducted a gasket design optimization based on elastic and plastic contact stress analyses, considering the forming effect, using FEM. The gasket model was simulated in two stages: a forming simulation and a tightening simulation. The optimum designs based on elastic and on plastic contact stress were found. The final evaluation was determined by helium leak quantity to check the leakage performance of both types of gasket. The helium leak test shows that a gasket based on the plastic contact stress design performs better than one based on the elastic contact stress design.

Diffusion Analysis of a Scalable Feistel Network

A generalization of the concept of Feistel Networks (FN), known as the Extended Feistel Network (EFN), is examined. An EFN splits the input block into n > 2 sub-blocks. Like a conventional FN, an EFN consists of a series of rounds whereby at least one sub-block is subjected to an F-function. The function plays a key role in the diffusion process due to its completeness property. It is also important to note that in an EFN the F-function is the most computationally expensive operation in a round. The aim of this paper is to determine a suitable type of EFN for a scalable cipher. This is done by analyzing the threshold number of rounds needed by the different types of EFN to achieve the completeness property, as well as the number of F-functions required in the network. The work focuses on EFN-Type I, Type II and Type III only. The analysis finds that EFN-Type II and Type III diffuse at the same rate and that both are faster than EFN-Type I. Since EFN-Type II uses fewer F-functions than EFN-Type III, Type II is the most suitable EFN for use in a scalable cipher.
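
As a rough sketch of the kind of structure being compared (my own illustration of a Type-II generalized Feistel round, not the paper's exact construction, and with a toy non-cryptographic F-function), each round applies F to every other sub-block, XORs the result into the neighbouring sub-block, and rotates the sub-blocks:

    # Toy Type-II generalized Feistel round over n 32-bit sub-blocks.
    def toy_f(x, key):
        return (x * 0x9E3779B1 + key) & 0xFFFFFFFF      # placeholder, not a real primitive

    def type2_round(blocks, key):
        n = len(blocks)                                   # n > 2 sub-blocks
        out = list(blocks)
        for i in range(0, n, 2):                          # F applied to even-indexed sub-blocks
            out[(i + 1) % n] ^= toy_f(blocks[i], key)
        return out[1:] + out[:1]                          # left rotation of the sub-blocks

    state = [0x01234567, 0x89ABCDEF, 0xDEADBEEF, 0xCAFEBABE]   # n = 4 example
    for r in range(6):                                    # iterate rounds until full diffusion
        state = type2_round(state, key=r)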

Mean-Square Performance of Adaptive Filter Algorithms in Nonstationary Environments

Employing a recently introduced unified adaptive filter theory, we show how the performance of a large number of important adaptive filter algorithms can be predicted within a general framework in nonstationary environments. This approach is based on energy conservation arguments and does not need to assume a Gaussian or white distribution for the regressors. This general performance analysis can be used to evaluate the mean-square performance of the Least Mean Square (LMS) algorithm, its normalized version (NLMS), the family of Affine Projection Algorithms (APA), the Recursive Least Squares (RLS), the Data-Reusing LMS (DR-LMS), its normalized version (NDR-LMS), the Block Least Mean Squares (BLMS), the Block Normalized LMS (BNLMS), the Transform Domain Adaptive Filters (TDAF) and the Subband Adaptive Filters (SAF) in nonstationary environments. We also establish general expressions for the steady-state excess mean-square error in this environment for all these adaptive algorithms. Finally, we demonstrate through simulations that these results are useful in predicting adaptive filter performance.
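
A minimal simulation sketch of the simplest member of this family (LMS tracking a slowly drifting plant) is shown below; NumPy is assumed, and the step size, filter length, random-walk drift and noise level are illustrative choices rather than values from the paper.

    # LMS in a nonstationary environment: the unknown system follows a random walk.
    import numpy as np

    rng = np.random.default_rng(0)
    M, N, mu = 8, 20000, 0.01
    w_true = rng.standard_normal(M)                      # unknown system (drifts below)
    w = np.zeros(M)
    x_buf = np.zeros(M)
    errors = []

    for n in range(N):
        w_true += 1e-4 * rng.standard_normal(M)          # nonstationarity: random walk
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = rng.standard_normal()
        d = w_true @ x_buf + 0.01 * rng.standard_normal()   # noisy desired signal
        e = d - w @ x_buf                                 # a priori error
        w += mu * e * x_buf                               # LMS update
        errors.append(e**2)

    print("steady-state MSE estimate:", np.mean(errors[-5000:]))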

A Novel Feedback-Based Integrated FiWi Networks Architecture by Centralized Interlink-ONU Communication

Integrated fiber-wireless (FiWi) access networks are a viable solution for delivering high-profile quadruple-play services. Passive optical networks (PON) integrated with wireless access networks provide ubiquitous characteristics for high-bandwidth applications. The operation of a PON improves by employing a variety of multiplexing techniques. One of these is the time division/wavelength division multiplexed (TDM/WDM) architecture, which improves the performance of optical-wireless access networks. This paper proposes a novel feedback-based TDM/WDM-PON architecture and introduces a model of integrated PON-FiWi networks. The feedback-based link architecture is an efficient solution for improving the performance of optical-line-terminal (OLT) and interlink optical-network-unit (ONU) communication. Furthermore, the feedback-based TDM/WDM-PON architecture is compared with existing architectures in terms of network throughput capacity.

Static Single Point Positioning Using the Extended Kalman Filter

Global Positioning System (GPS) technology is widely used today in the areas of geodesy and topography, as well as in aeronautics, mainly for military purposes. Due to the military usage of GPS, full access to and use of this technology is denied to the civilian user, who must then work with a less accurate version. In this paper we focus on the estimation of the receiver coordinates (X, Y, Z) and its clock bias (δtr) for a fixed point, based on pseudorange measurements from a single GPS receiver. Utilizing the instantaneous coordinates of just 4 satellites and their clock offsets, and taking into account the atmospheric delays, we are able to derive a set of pseudorange equations. The estimation of the four unknowns (X, Y, Z, δtr) is achieved by introducing an extended Kalman filter that processes, off-line, all the data collected from the receiver. Higher positioning accuracy is attained by appropriate tuning of the filter noise parameters and by including other forms of biases.
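
A minimal sketch of the measurement side of such a filter is given below (NumPy assumed; the satellite positions, initial guess and noise covariances are made up for illustration, and atmospheric delay terms are omitted): the pseudorange model is linearized around the current state estimate and a single EKF measurement update is applied to the state (X, Y, Z, c·δtr).

    # Pseudorange model and one extended Kalman filter measurement update.
    import numpy as np

    sats = np.array([[15600e3,  7540e3, 20140e3],     # hypothetical satellite ECEF
                     [18760e3,  2750e3, 18610e3],     # coordinates in metres
                     [17610e3, 14630e3, 13480e3],
                     [19170e3,   610e3, 18390e3]])

    def h(x):
        """Predicted pseudoranges: geometric range plus clock-bias term (in metres)."""
        return np.linalg.norm(sats - x[:3], axis=1) + x[3]

    def H_jac(x):
        """Jacobian of h: negated unit line-of-sight vectors and 1 for the clock term."""
        rho = np.linalg.norm(sats - x[:3], axis=1)
        return np.hstack([-(sats - x[:3]) / rho[:, None], np.ones((len(sats), 1))])

    x = np.array([6371e3, 0.0, 0.0, 0.0])             # initial guess on Earth's surface
    P = np.diag([1e8, 1e8, 1e8, 1e6])                 # state covariance
    R = np.eye(4) * 25.0                              # pseudorange noise covariance (m^2)

    z = h(np.array([6378e3, 1e3, 2e3, 30.0]))         # simulated measurements (noise-free here)
    Hm = H_jac(x)
    K = P @ Hm.T @ np.linalg.inv(Hm @ P @ Hm.T + R)   # Kalman gain
    x = x + K @ (z - h(x))                            # measurement update
    P = (np.eye(4) - K @ Hm) @ P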

A New Approach for Counting Passersby Utilizing Space-Time Images

Understanding the number of people and the flow of persons is useful for efficient promotion of institution management and for improving a company's sales. This paper introduces an automated method for counting passersby using virtual vertical measurement lines. The process of recognizing a passerby is carried out using an image sequence obtained from a USB camera. A space-time image represents the human regions, which are treated using a segmentation process. To handle the problem of mismatching, different color spaces are used to perform the template matching, and the best match is chosen automatically to determine passerby direction and speed. A relation between passerby speed and the human-pixel area is used to distinguish one passerby from two. In the experiment, the camera is fixed at the entrance door of the hall in a side-viewing position. Finally, experimental results verify the effectiveness of the presented method by correctly detecting passersby and counting them by direction with an accuracy of 97%.
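
As a small hedged sketch of how a space-time image can be built from a virtual vertical measurement line (OpenCV and NumPy assumed; the camera index, line position and frame count are illustrative, and the segmentation and template-matching stages are not shown), one pixel column is sampled per frame and the columns are stacked over time:

    # Build a space-time image from the pixel column under a virtual vertical line.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)                 # USB camera
    line_x = 320                              # x position of the virtual vertical line
    columns = []

    for _ in range(300):                      # collect 300 frames
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        columns.append(gray[:, line_x])       # one pixel column per frame

    cap.release()
    space_time = np.stack(columns, axis=1)    # rows: image y, columns: time
    cv2.imwrite("space_time.png", space_time) # passersby appear as slanted blobs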

Extended Three-Wave Method for the (3+1)-Dimensional Soliton Equation

In this paper, we study the (3+1)-dimensional soliton equation. We employ Hirota's bilinear method to obtain the bilinear form of the (3+1)-dimensional soliton equation. Then, using the idea of the extended three-wave method, some exact soliton solutions, including breather-type solutions, are presented.
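
As a hedged illustration of the general workflow (the paper's specific equation, transformation and parameter constraints are not reproduced here), Hirota-type methods typically combine a logarithmic dependent-variable transformation with a multi-wave ansatz for the auxiliary function, along the lines of

    % generic Hirota-type transformation (illustrative, not the paper's exact one)
    u = 2\,(\ln f)_{xx},
    % a typical extended three-wave ansatz for f, with free parameters \delta_i
    f = e^{-\xi_1} + \delta_1 \cos(\xi_2) + \delta_2 \cosh(\xi_3) + \delta_3\, e^{\xi_1},
    \qquad \xi_i = a_i x + b_i y + c_i z + d_i t,

with the constants fixed by substituting f into the bilinear form and collecting terms.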

Organization as System, Psychic Dynamism as Equilibration: A Conceptualization

Organizations are assumed to be systems and consequently require a definition of equilibrium within them. However, organizations comprise people and unavoidably entail their irrational aspects. The question, then, is what organizational equilibrium and its equilibrating mechanisms are once these aspects are considered. Hence, some arguments are provided here to conceptualize human unconsciousness, irrationalities and the consequent uncertainties within organizations in the form of a system of psychic dynamism. The assumption is that this dynamism maintains the psychic balance of the organization, viewed from a psychodynamic point of view. The resulting conceptualization is expected to promote the understanding of such aspects in different organizational settings by hypothesizing organizational equilibration from this perspective. The main expectation is that, if it is known how the organization equilibrates in this sense, such irrationalities and unconsciousness can be explained and dealt with through rational and, of course, conscious planning and execution.

A Method for 3D Mesh Adaptation in FEA

The use of mechanical simulation (in particular finite element analysis) requires the management of assumptions in order to analyse a real, complex system. In finite element analysis (FEA), two modeling steps require assumptions in order to carry out the computations and obtain results: building the physical model and building the simulation model. The simplification assumptions made on the analysed system in these two steps can generate two kinds of errors: physical modeling errors (mathematical model, domain simplifications, material properties, boundary conditions and loads) and mesh discretization errors. This paper proposes a mesh adaptation method based on the use of an h-adaptive scheme in combination with an error estimator in order to choose the mesh of the simulation model. This method allows the mesh of the simulation model to be chosen so as to control the cost and the quality of the finite element analysis.
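
A generic h-adaptive loop of this kind can be sketched as follows (my own illustration, not the paper's estimator or refinement criterion; solve, estimate_error and refine stand for an actual FE solver, error estimator and element-splitting routine):

    # Generic h-adaptive refinement loop driven by a per-element error estimator.
    def h_adaptive(mesh, solve, estimate_error, refine,
                   target=0.05, max_iters=10):
        for _ in range(max_iters):
            solution = solve(mesh)
            elem_errors = estimate_error(mesh, solution)     # one value per element
            global_error = sum(e**2 for e in elem_errors) ** 0.5
            if global_error <= target:
                break
            threshold = max(elem_errors) * 0.5               # mark the worst elements
            marked = [i for i, e in enumerate(elem_errors) if e >= threshold]
            mesh = refine(mesh, marked)                       # h-refinement: split elements
        return mesh, solution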