Geometry Calibration Factors of Modified Arcan Fracture Test for Welded Joint

In this study, mixed-mode fracture mechanics parameters were investigated for a high-tensile steel butt-welded joint based on the modified Arcan test, and finite element analysis was used to evaluate the effect of crack length on the fracture criterion. The nondimensional stress intensity factors, strain energy release rates, and J-integral energy at the crack tip were obtained for various in-plane loading combinations on the Arcan specimen, ranging from pure mode-I to pure mode-II loading conditions. The specimen and apparatus were modeled by the finite element method and analyzed under various loading angles (from 0 to 90 degrees in 15-degree intervals) to simulate pure mode-I, pure mode-II, and mixed-mode fracture. Since the analytical results are independent of the elastic modulus for isotropic materials, the results in the elastic field can be applied to Arcan specimens. The main objective of this study was to evaluate the geometric calibration factors of the modified Arcan test specimen in order to obtain fracture toughness under mixed-mode loading conditions.
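
For reference, the sketch below shows the usual decomposition by which an Arcan-type fixture produces mixed-mode conditions from a single load P at loading angle alpha. The calibration functions f_I and f_II stand for the nondimensional geometry factors this study evaluates numerically; they are left as user-supplied placeholders here, not the paper's fitted values.

```python
import numpy as np

def mixed_mode_sifs(P, alpha_deg, a, w, t, f_I, f_II):
    """Mode-I/II stress intensity factors for an Arcan-type specimen.

    The applied load P at loading angle alpha is split into an opening
    (mode-I) and a shear (mode-II) component, each scaled by a
    nondimensional geometry calibration factor f_I(a/w), f_II(a/w).
    """
    alpha = np.radians(alpha_deg)
    sigma = P / (w * t)                      # nominal stress on the section
    K_I = sigma * np.cos(alpha) * np.sqrt(np.pi * a) * f_I(a / w)
    K_II = sigma * np.sin(alpha) * np.sqrt(np.pi * a) * f_II(a / w)
    return K_I, K_II

# pure mode-I at alpha = 0, pure mode-II at alpha = 90 degrees
for angle in range(0, 91, 15):
    KI, KII = mixed_mode_sifs(P=10e3, alpha_deg=angle, a=0.015, w=0.05,
                              t=0.01, f_I=lambda r: 1.0, f_II=lambda r: 1.0)
    print(f"alpha={angle:2d} deg  K_I={KI/1e6:6.2f}  K_II={KII/1e6:6.2f} MPa*m^0.5")
```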

Cost-Based Warranty Optimisation Using Genetic Algorithm

Warranty is a powerful marketing tool for the manufacturer and provides good protection for both the manufacturer and the customer. However, warranty always involves additional costs to the manufacturer, which depend on product reliability characteristics and warranty parameters. This paper presents an approach to the optimisation of warranty parameters for a known product failure distribution, reducing the warranty costs to the manufacturer while retaining the promotional function of the warranty. A combined free-replacement and pro-rata warranty policy is chosen as the model; the lengths of the free-replacement period and the pro-rata period are varied, as are the coefficients that define the pro-rata cost function. The multiparametric warranty optimisation is carried out using a genetic algorithm. The obtained results serve as a guideline for the manufacturer in choosing the warranty policy that minimises costs and maximises profit.
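
A minimal sketch of such an optimisation follows, under assumed inputs: Weibull failure times, a linearly decreasing pro-rata rebate, and an invented penalty standing in for the promotional-value constraint. None of these are the paper's actual model or data; the point is the mechanics of a real-coded GA over the warranty parameters (W1, W2, k).

```python
import numpy as np

rng = np.random.default_rng(0)
BETA, ETA, C_UNIT = 2.0, 3.0, 100.0   # assumed Weibull shape/scale (years), unit cost

def weibull_pdf(t):
    return (BETA / ETA) * (t / ETA) ** (BETA - 1) * np.exp(-(t / ETA) ** BETA)

def expected_cost(W1, W2, k, n=2000):
    """Expected warranty cost per unit, by numerical integration: full
    replacement cost for failures in [0, W1], a linearly decreasing
    pro-rata rebate (coefficient k) for failures in (W1, W2]."""
    t = np.linspace(1e-6, W2, n)
    rebate = np.where(t <= W1, C_UNIT, k * C_UNIT * (W2 - t) / (W2 - W1))
    return float(np.sum(rebate * weibull_pdf(t)) * (t[1] - t[0]))

def fitness(x):
    W1, d, k = x
    W2 = W1 + d
    promo_penalty = 50.0 * max(0.0, 2.0 - W2)  # invented promotional floor
    return expected_cost(W1, W2, k) + promo_penalty

# plain real-coded GA: elitist selection, blend crossover, Gaussian mutation
LOW, HIGH = np.array([0.1, 0.1, 0.1]), np.array([3.0, 3.0, 1.0])
pop = rng.uniform(LOW, HIGH, size=(40, 3))
for generation in range(100):
    scores = np.array([fitness(x) for x in pop])
    elite = pop[np.argsort(scores)[:10]]
    children = [np.clip((elite[rng.integers(10)] + elite[rng.integers(10)]) / 2
                        + rng.normal(0.0, 0.1, 3), LOW, HIGH)
                for _ in range(30)]
    pop = np.vstack([elite, children])

best = min(pop, key=fitness)
print("W1=%.2f yr  W2=%.2f yr  k=%.2f  expected cost=%.2f"
      % (best[0], best[0] + best[1], best[2],
         expected_cost(best[0], best[0] + best[1], best[2])))
```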

On the Interest of Pseudo-Noise Code Sequences of Different Lengths for Reducing Inter-User Interference in a CDMA Network

The third generation (3G) of cellular systems adopted spread spectrum as the solution for data transmission in the physical layer. Unlike IS-95 or CDMAOne (the spread-spectrum systems of the preceding generation), the new standard, called the Universal Mobile Telecommunications System (UMTS), uses long codes on the downlink. The system is designed for both voice communication and data transmission. The downlink is particularly important because of the asymmetry of data traffic, i.e., more downloading towards the mobiles than uploading towards the base station. Moreover, on the downlink UMTS uses orthogonal spreading with a variable spreading factor (OVSF, for Orthogonal Variable Spreading Factor). This feature makes it possible to increase the data rate of one or more users by reducing their spreading factor without changing the spreading factors of the other users. In the current UMTS standard, two techniques have been proposed to improve downlink performance: transmit antenna diversity and space-time codes. These two techniques combat only fading. The receiver proposed for the mobile station is the RAKE, but one can envisage a more sophisticated receiver, able to reduce multi-user interference and the impact of coloured noise and narrowband interference. In this context, where users have synchronized long codes with variable spreading factors and the mobile is unaware of the other active codes/users, the use of pseudo-noise code sequences of different lengths is presented as one of the most appropriate solutions.
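
For concreteness, the sketch below generates an OVSF code level by the standard recursive construction used on the UMTS downlink (each code c of length n spawns [c, c] and [c, -c]); it simply illustrates the orthogonality property that variable-length spreading must preserve, and is not tied to the receiver studied here.

```python
import numpy as np

def ovsf_codes(sf):
    """All OVSF codes with spreading factor `sf` (a power of two).

    Codes on the same tree level are mutually orthogonal, and a code is
    orthogonal to every code that is not its ancestor or descendant.
    """
    codes = np.array([[1]])
    while codes.shape[1] < sf:
        codes = np.vstack([np.hstack([codes, codes]),
                           np.hstack([codes, -codes])])
    return codes

c8 = ovsf_codes(8)
print(c8 @ c8.T)   # 8 * identity: same-level codes are orthogonal
```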

Detection of Ultrasonic Images in the Presence of a Random Number of Scatterers: A Statistical Learning Approach

The Support Vector Machine (SVM) is a statistical learning tool initially developed by Vapnik in 1979 and later extended into the more general framework of structural risk minimization (SRM). SVMs play an increasing role in detection problems across various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, the SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. The simulation was done for single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance in terms of the mean square error (MSE) metric was calculated. We showed that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (detection hypotheses) in the original images.
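
The sketch below shows the general shape of such a detector on synthetic data only: a binary reflectivity map under single-look multiplicative speckle, classified pixel-wise by an SVM over small neighbourhood patches, with MSE as the quality metric. The data, patch features, and kernel settings are our assumptions, not the paper's clinical setup.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# synthetic two-hypothesis reflectivity map under single-look speckle
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
noisy = (0.5 + clean) * rng.exponential(1.0, clean.shape)  # multiplicative noise

def patches(img, r=1):
    """One feature row per pixel: its (2r+1)x(2r+1) neighbourhood."""
    p = np.pad(img, r, mode="edge")
    return np.stack([p[i:i + img.shape[0], j:j + img.shape[1]].ravel()
                     for i in range(2 * r + 1) for j in range(2 * r + 1)], axis=1)

X, y = patches(noisy), clean.ravel()
train = rng.random(len(y)) < 0.3                  # 30% of pixels labelled
clf = SVC(kernel="rbf", C=10.0).fit(X[train], y[train])
detected = clf.predict(X).reshape(clean.shape)
print("MSE:", np.mean((detected - clean) ** 2))
```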

Application of Artificial Neural Network for Predicting Maintainability Using Object-Oriented Metrics

The growing importance of software quality has led to the development of sophisticated techniques for constructing models that predict quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examined the application of ANNs to software quality prediction using Object-Oriented (OO) metrics, where quality estimation means estimating the maintainability of the software. The dependent variable in our study was maintenance effort; the independent variables were principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. We therefore found the ANN method useful in constructing software quality models.
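
A minimal sketch of this modelling pipeline follows, on synthetic stand-in data: eight OO metric values per class are reduced to principal components, which feed a small neural network predicting maintenance effort, evaluated by MARE. The data, network shape, and component count are illustrative assumptions, not the study's.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

X = rng.random((150, 8))                       # 8 OO metrics per class (synthetic)
effort = 5.0 + 20.0 * X[:, 0] + 10.0 * X[:, 3] + rng.normal(0.0, 1.0, 150)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=4),     # principal components of the metrics
                      MLPRegressor(hidden_layer_sizes=(8,),
                                   max_iter=5000, random_state=0))
model.fit(X[:100], effort[:100])
pred = model.predict(X[100:])

# Mean Absolute Relative Error, the study's evaluation metric
mare = np.mean(np.abs(pred - effort[100:]) / effort[100:])
print(f"MARE = {mare:.3f}")
```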

A Critical Survey of Reusability Aspects for Component-Based Systems

The last decade has shown that the object-oriented concept by itself is not powerful enough to cope with the rapidly changing requirements of modern applications. Component-based systems achieve flexibility by clearly separating the stable parts of systems (i.e., the components) from the specification of their composition. In order to realize the reuse of components effectively in Component-Based Software Development (CBSD), the reusability of components must be measured. However, due to the black-box nature of components, whose source code is not available, it is difficult to apply conventional metrics in component-based development, as these metrics require analysis of source code. In this paper, we survey several existing component-based reusability metrics. These metrics give a broader view of a component's understandability, adaptability, and portability. The paper also describes an analysis, in terms of quality factors related to reusability, of an approach that aids significantly in assessing existing components for reuse.

Optimization of the Characteristic Straight Line Method by a "Best Estimate" of Observed, Normal Orthometric Elevation Differences

In this paper, in order to optimize the "Characteristic Straight Line Method" used in soil displacement analysis, a "best estimate" of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of "height" itself, is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the "Characteristic Straight Line Method", whose characteristic components have been defined and constructed from a "best estimate" of the topometric observations. In the measurement of elevation differences, we used the most modern leveling equipment available, and observational procedures were designed to provide the most effective method of acquiring data. In addition, systematic errors that cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, whereby all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) are investigated. The "Characteristic Straight Line Method" is slightly more convenient than the "Characteristic Circle Method": it permits the evaluation of displacements of very small, even infinitesimal, magnitude. The inclination of the landslide is given by the inverse of the distance from reference point O to the "Characteristic Straight Line"; its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A "best estimate" of the topometric observations was used to measure the elevations of carefully selected points before and after the deformation. Gross errors were eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test in an area where very interesting land-surface deformation occurs are reported, and monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
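
To make the correction chain concrete, the sketch below combines simplified versions of the corrections named above into one corrected elevation difference. The coefficients and the one-line models (e.g. a precomputed refraction term) are assumptions for illustration, not the paper's calibrated values.

```python
# simplified correction models for a single leveling section
ALPHA_INVAR = 1e-6   # assumed thermal expansion of the Invar strip, 1/degC
T_REF = 20.0         # assumed rod calibration temperature, degC

def corrected_dh(dh_obs, collim_angle, d_back, d_fore, refr_corr,
                 rod_temp, scale_factor):
    """Apply collimation, refraction, rod-temperature and rod-scale
    corrections to an observed elevation difference dh_obs (metres)."""
    collimation = collim_angle * (d_back - d_fore)   # residual line-of-sight tilt
    temperature = ALPHA_INVAR * (rod_temp - T_REF) * dh_obs
    return (dh_obs + collimation + refr_corr + temperature) * scale_factor

print(corrected_dh(1.23456, collim_angle=2e-6, d_back=30.0, d_fore=28.5,
                   refr_corr=-0.00002, rod_temp=25.0, scale_factor=1.000003))
```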

Estimation of Time-Varying Linear Regression with Unknown Time-Volatility via Continuous Generalization of the Akaike Information Criterion

The problem of estimating time-varying regression is inevitably concerned with the necessity to choose the appropriate level of model volatility, ranging from the full stationarity of instant regression models to their absolute independence of each other. In the stationary case the number of regression coefficients to be estimated equals the number of regressors, whereas the absence of any smoothness assumptions augments the dimension of the unknown vector by a factor equal to the length of the time series. The Akaike Information Criterion is a commonly adopted means of adjusting a model to a given data set within a succession of nested parametric model classes, but its crucial restriction is that the classes are rigidly defined by the growing integer-valued dimension of the unknown vector. To make the Kullback information maximization principle underlying the classical AIC applicable to the problem of time-varying regression estimation, we extend it to a wider class of data models in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a family of continuously nested a priori probability distributions.
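
Schematically, and in notation of our own choosing rather than the authors' exact formulation, the contrast between the two settings can be written as follows.

```latex
% Classical AIC over nested classes indexed by integer dimension k:
\mathrm{AIC}(k) \;=\; -2\ln\hat{L}_k \;+\; 2k .
% For time-varying regression y_t = \mathbf{x}_t^{\top}\boldsymbol{\beta}_t + \xi_t
% the parameter dimension is fixed, and volatility is softly constrained
% by a continuously nested family of priors, e.g. the Gaussian family
p_{\lambda}(\boldsymbol{\beta}_1,\dots,\boldsymbol{\beta}_T)\;\propto\;
\prod_{t=2}^{T}\exp\!\Bigl(-\tfrac{\lambda}{2}\,
\lVert\boldsymbol{\beta}_t-\boldsymbol{\beta}_{t-1}\rVert^{2}\Bigr),
% which interpolates between absolute independence of the instant models
% (\lambda \to 0) and full stationarity (\lambda \to \infty), so a
% continuous effective model dimension replaces the integer k.
```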

MDA of Hexagonal Honeycomb Plates Used for Space Applications

The purpose of this paper is to perform a multidisciplinary design and analysis (MDA) of the honeycomb panels used in satellite structural design. All the analysis is based on clamped-free boundary conditions. In the present work, detailed finite element models of honeycomb panels are developed and analysed. Experimental tests were carried out on a honeycomb specimen, the goal being to compare the measurements with the modal analysis performed by the finite element method and with the existing equivalent approaches. The obtained results show good agreement between the finite element analysis, the equivalent models, and the test results: the difference in the first two frequencies is less than 4%, and less than 10% for the third frequency, so the equivalent model used in this analysis achieves good accuracy. Moreover, the investigations carried out in this research address honeycomb plate modal analysis under several aspects, including geometrical variation of the structure, by studying the influence of the dimension parameters on the modal frequencies, and variation of the core and skin materials of the honeycomb. The various results obtained in this paper are promising and show that the geometry parameters and the type of material affect the modal frequencies of the honeycomb plate.
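
Equivalent approaches of the kind referred to above replace the cellular core by a homogeneous continuum. As one illustration, the sketch below applies the classical Gibson-Ashby homogenisation relations for a regular hexagonal core; these textbook formulas may differ from the exact equivalent model adopted in the paper.

```python
import math

def equivalent_core(t, l, E_s, rho_s):
    """Gibson-Ashby equivalent properties of a regular hexagonal core:
    cell-wall thickness t, wall length l, solid modulus E_s, density rho_s."""
    ratio = t / l
    rho_eq = (2 / math.sqrt(3)) * ratio * rho_s          # equivalent density
    E_in_plane = (4 / math.sqrt(3)) * ratio ** 3 * E_s   # in-plane Young's modulus
    return rho_eq, E_in_plane

# aluminium core, 0.05 mm walls, 3 mm cells (illustrative values)
rho_eq, E_eq = equivalent_core(t=0.05e-3, l=3e-3, E_s=70e9, rho_s=2700.0)
print(f"rho* = {rho_eq:.1f} kg/m^3, E* = {E_eq / 1e6:.3f} MPa")
```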

Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels

Automatic segmentation of skin lesions is the first step towards the development of a computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for the melanoma application. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which is able to determine the optimal color channel for segmentation of skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images, and a comprehensive evaluation of the results is provided in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation is carried out by applying three previously used metrics, accuracy, sensitivity, and specificity, together with a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which result in an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, which demonstrates the effectiveness and superiority of the proposed segmentation method.
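
The sketch below illustrates the two core steps on synthetic data, with Otsu's between-class-variance criterion standing in for the paper's clustering-based histogram thresholding and also serving as the channel-selection score. The channel set and the XYZ-style weighted combination are illustrative only, not the paper's X or XoYoR definitions.

```python
import numpy as np

def otsu_threshold(channel):
    """Threshold of a uint8 channel maximising between-class variance."""
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    return int(np.nanargmax(sigma_b)), float(np.nanmax(sigma_b))

def segment(rgb):
    """Pick the channel whose optimal split is most discriminative, then threshold."""
    channels = {"R": rgb[..., 0], "G": rgb[..., 1], "B": rgb[..., 2],
                "X": (rgb @ np.array([0.4124, 0.3576, 0.1805])).astype(np.uint8)}
    best = max(channels, key=lambda k: otsu_threshold(channels[k])[1])
    thr, _ = otsu_threshold(channels[best])
    return best, channels[best] <= thr      # lesions are darker than skin

rgb = (np.random.default_rng(3).random((64, 64, 3)) * 255).astype(np.uint8)
name, mask = segment(rgb)
print("chosen channel:", name, " lesion pixels:", int(mask.sum()))
```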

Molecular Dynamics of Fatty Acid Interacting with Carbon Nanotube as Selective Device

In this paper we study a system composed of a carbon nanotube (CNT) or a bundle of carbon nanotubes (BuCNT) interacting with a specific fatty acid as a molecular probe. The full system is represented by an open nanotube (or nanotubes) and a linoleic acid (LA) molecule relaxing under its interaction with the CNT or BuCNT. The LA molecule has an asymmetric shape with a COOH termination, which promotes close interaction with the BuCNT, mainly through van der Waals forces. The simulations were performed by classical molecular dynamics with standard parameterizations. Our results show that the BuCNT and CNT are dynamically stable and exhibit preferential interaction positions with LA, resulting in three features: (i) when the LA interacts with the sidewall of the CNT or BuCNT (with either termination, CH2 or COOH), the LA is repelled; (ii) when the LA terminated with CH2 approaches the open extremity of the BuCNT, the LA is also repelled; and (iii) when the LA terminated with COOH approaches the open extremity of the BuCNT, the LA is encapsulated by the BuCNT. These simulations are part of more extensive work on the search for efficient selective molecular devices and could be useful in reaching this goal.
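
As a minimal illustration of the kind of pairwise van der Waals term such classical MD runs rest on, the sketch below sums a 12-6 Lennard-Jones energy over atom pairs of two rigid groups. The parameters are generic order-of-magnitude values for carbon, not the force field actually used in this work.

```python
import numpy as np

EPS, SIGMA = 0.0028, 0.34   # assumed LJ well depth (eV) and size (nm) for C-C

def lj_energy(coords_a, coords_b):
    """Total 12-6 Lennard-Jones energy between two rigid atom groups (nm, eV)."""
    d = np.linalg.norm(coords_a[:, None, :] - coords_b[None, :, :], axis=-1)
    sr6 = (SIGMA / d) ** 6
    return float(np.sum(4 * EPS * (sr6 ** 2 - sr6)))

# a ring of tube atoms and a two-atom stand-in for the probe molecule
tube_ring = np.array([[np.cos(t), np.sin(t), 0.0]
                      for t in np.linspace(0, 2 * np.pi, 20, endpoint=False)])
probe = np.array([[0.0, 0.0, 1.5], [0.0, 0.0, 1.9]])
print(f"E_vdW = {lj_energy(tube_ring, probe):.6f} eV")
```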

Modeling of PZ in Haunch Connection Systems

Modeling of panel zone (PZ) seismic behavior has been considered a challenge for years because of the PZ's role in the overall ductility and lateral stiffness of steel moment frames. There are some studies regarding the effects of different doubler plate thicknesses and geometric properties of the PZ on its seismic behavior. However, there is little investigation into the effects of the number of continuity plates provided in the presence of one triangular haunch, two triangular haunches, or a rectangular haunch (T-shaped haunch) at exterior columns. In this research, detailed finite element models of 12 connections tested by the SAC joint venture were first created and analyzed; the cyclic-behavior backbone curves obtained from these models, together with other FE models of similar tests, were then used for neural network training. The seismic behavior of these data is categorized according to the continuity plates' arrangements and the differences in haunch type. PZs with one-sided haunches show little plastic rotation. As the number of continuity plates increases due to the presence of two triangular haunches (four continuity plates), no plastic rotation occurs; in other words, the PZ behaves in its elastic range. In the case of the rectangular haunch, the PZ shows more plastic rotation than with a one-sided triangular haunch and especially than with double-sided triangular haunches. Moreover, the models presented in this study for one-sided and double-sided triangular haunches and rectangular haunches appear to give a proper estimation of PZ seismic behavior.

A New Biometric Human Identification Based on Fusion of Fingerprints and Finger Veins Using the MonoLBP Descriptor

Single-modality biometric recognition is often unable to meet high performance requirements as its applications become more and more widespread; multimodal biometric identification is therefore an emerging trend. This paper investigates a novel algorithm based on the fusion of fingerprint and finger-vein biometrics. For both modalities we employ the Monogenic Local Binary Pattern (MonoLBP) descriptor, an operator that integrates the original LBP (Local Binary Pattern) with two other rotation-invariant measures: local phase and local surface type. Experimental results confirm that the proposed weighted-sum fusion achieves excellent identification performance compared with unimodal biometric systems. The AUC of the proposed approach based on combining the two modalities is close to unity (0.93).
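
The fusion step itself is simple to sketch: per-modality match scores (here invented numbers; in the paper they would come from comparing MonoLBP descriptors) are normalised and combined by a weighted sum, and identification picks the best fused score. The weight w is an assumed illustrative value.

```python
import numpy as np

def fuse(fingerprint_scores, vein_scores, w=0.6):
    """Min-max normalise each modality's match scores, then weighted-sum fuse."""
    def norm(s):
        s = np.asarray(s, dtype=float)
        return (s - s.min()) / (s.max() - s.min() + 1e-12)
    return w * norm(fingerprint_scores) + (1 - w) * norm(vein_scores)

fp = [0.82, 0.31, 0.55]   # match scores of one probe against 3 enrolled templates
fv = [0.77, 0.40, 0.62]
fused = fuse(fp, fv)
print("fused scores:", fused)
print("identified template:", int(np.argmax(fused)))
```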

Sensing Pressure for Authentication System Using Keystroke Dynamics

In this paper, an authentication system using keystroke dynamics is presented. We introduce pressure sensing to improve measurement accuracy and robustness against intrusion by key-loggers and similar attacks, although an additional instrument is required. As a result, we found that pressure sensing is also effective for estimating the true moment of a keystroke.
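
To make the idea concrete, the sketch below treats each typing sample as a vector of dwell times, flight times, and key pressures, and accepts a user when a new sample stays close to their enrolment profile. The feature layout, distance measure, and threshold are our assumptions, not the paper's system.

```python
import numpy as np

def enroll(samples):
    """Build a user profile (per-feature mean and spread) from enrolment samples."""
    samples = np.asarray(samples, dtype=float)
    return samples.mean(axis=0), samples.std(axis=0) + 1e-9

def authenticate(profile, sample, threshold=3.0):
    """Accept when the mean normalised deviation from the profile is small."""
    mean, std = profile
    distance = np.abs((np.asarray(sample, dtype=float) - mean) / std).mean()
    return distance < threshold

# features per sample: [dwell1, dwell2, flight1, pressure1, pressure2]
enrolment = [[0.11, 0.09, 0.21, 0.62, 0.58],
             [0.12, 0.10, 0.19, 0.60, 0.55],
             [0.10, 0.09, 0.22, 0.65, 0.57]]
profile = enroll(enrolment)
print(authenticate(profile, [0.11, 0.10, 0.20, 0.61, 0.56]))  # genuine -> True
print(authenticate(profile, [0.30, 0.25, 0.40, 0.20, 0.90]))  # impostor -> False
```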

Topology Influence on TCP Congestion Control Performance in Multi-hop Ad Hoc Wireless Networks

Wireless ad hoc nodes freely and dynamically self-organize to communicate with one another, and each node can act as a host or a router. How well it does so, however, depends on the node's capability in terms of its current power level, signal strength, hop count, routing protocol, interference, and other factors. In this research, a study was conducted to observe how hop count over different network topologies contributes to the degradation of TCP congestion control performance. To achieve this objective, simulations using NS-2 with different topologies were evaluated, and a comparative analysis is discussed based on standard observation metrics: throughput, delay, and packet loss ratio. The results show a relationship between topology type and hop count and the performance of the ad hoc network. In future work, the study will be extended to investigate the effect of different error rates and background traffic over the same topologies.

Fast Cosine Transform to Increase Speed-up and Efficiency of Karhunen-Loève Transform for Lossy Image Compression

In this work, we present a comparison between two techniques of image compression. In the first case, the image is divided into blocks, which are collected according to a zig-zag scan. In the second, we first apply the Fast Cosine Transform to the image, and then the transformed image is divided into blocks, likewise collected according to a zig-zag scan. In both cases, the Karhunen-Loève transform is then applied to the resulting blocks. In addition, we present three new metrics based on eigenvalues for a better comparative evaluation of the techniques. Simulations show that the combined version is the best, with lower Mean Absolute Error (MAE) and Mean Squared Error (MSE), higher Peak Signal-to-Noise Ratio (PSNR), and better image quality. Finally, the new technique proved far superior to JPEG and JPEG2000.
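
A compact sketch of the combined pipeline follows: a 2-D fast cosine transform of the image, division into 8x8 blocks collected in zig-zag order over the block grid, then a Karhunen-Loève transform (eigenbasis of the block covariance) keeping only the strongest components. The block size and the number of retained components are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.fft import dctn

rng = np.random.default_rng(4)

def zigzag_order(n):
    """Visit an n x n grid of blocks along anti-diagonals, zig-zag fashion."""
    order = []
    for s in range(2 * n - 1):
        diag = [(i, s - i) for i in range(n) if 0 <= s - i < n]
        order.extend(diag if s % 2 else diag[::-1])
    return order

img = rng.random((64, 64))
coeffs = dctn(img, norm="ortho")                      # fast cosine transform
b, n = 8, 64 // 8
X = np.array([coeffs[i*b:(i+1)*b, j*b:(j+1)*b].ravel()
              for i, j in zigzag_order(n)])           # one 64-vector per block
mean = X.mean(axis=0)
_, eigvec = np.linalg.eigh(np.cov((X - mean).T))
basis = eigvec[:, ::-1][:, :16]                       # KLT: keep 16 of 64 axes
Y = (X - mean) @ basis                                # compressed coefficients
X_hat = Y @ basis.T + mean                            # reconstruction
print("block-domain MSE:", np.mean((X - X_hat) ** 2))
```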

Dynamic Coupling Metrics for Service-Oriented Software

Service-oriented systems have become popular and offer many advantages in the development and maintenance process. Coupling is the most important attribute of services when they are integrated into a system. In this paper, we propose a suite of metrics to evaluate a service's quality with respect to its coupling. We use the coupling metrics to measure the maintainability, reliability, testability, and reusability of services. Our proposed metrics are computed at run-time, which yields more exact results.
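
The sketch below shows what computing coupling at run-time can look like in the simplest case: a monitor records the service-to-service calls observed during execution and derives coupling counts from the trace. The metric names and the counting rules here are our own illustration, not the suite proposed in the paper.

```python
from collections import defaultdict

class CallMonitor:
    """Records observed service-to-service calls during a run."""
    def __init__(self):
        self.calls = defaultdict(set)       # caller -> set of callees
        self.messages = defaultdict(int)    # (caller, callee) -> message count

    def record(self, caller, callee):
        self.calls[caller].add(callee)
        self.messages[(caller, callee)] += 1

    def export_coupling(self, service):
        """Distinct services this service invoked at run-time."""
        return len(self.calls[service])

    def import_coupling(self, service):
        """Distinct services that invoked this service at run-time."""
        return sum(1 for callees in self.calls.values() if service in callees)

mon = CallMonitor()
for caller, callee in [("Order", "Billing"), ("Order", "Stock"),
                       ("Billing", "Audit"), ("Order", "Billing")]:
    mon.record(caller, callee)
print(mon.export_coupling("Order"), mon.import_coupling("Billing"))  # 2 1
```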

A Survey on Metrics of Software Cognitive Complexity for OO Design

In the modern era, the biggest challenge facing the software industry is the advent of new technologies. Software engineers are therefore gearing up to meet and manage change in large software systems, and they find it difficult to deal with software cognitive complexity. In the last few years many metrics have been proposed to measure the cognitive complexity of software. This paper aims at a comprehensive survey of metrics of software cognitive complexity. Some classic and efficient software cognitive complexity metrics, such as Class Complexity (CC), Weighted Class Complexity (WCC), Extended Weighted Class Complexity (EWCC), Class Complexity due to Inheritance (CCI) and Average Complexity of a program due to Inheritance (ACI), are discussed and analyzed. The comparison and the relationships among these metrics of software complexity are also presented.

On the Exact Solution of Non-Uniform Torsion for Beams with Axially Symmetric Cross-Section

In the traditional theory of non-uniform torsion, the axial displacement field is expressed as the product of the unit twist angle and the warping function. The first, variable along the beam axis, is obtained by a global congruence condition; the second, defined over the cross-section, is determined by solving a Neumann problem associated with the Laplace equation, as in the uniform torsion problem. Since in the classical theory the warping function does not pointwise satisfy the first indefinite equilibrium equation, the principal aim of this work is to develop a new theory for non-uniform torsion of beams with axially symmetric cross-section, fully restrained at both ends and loaded by a constant torque, that satisfies this equation pointwise by means of trigonometric expansions of the axial displacement and unit twist angle functions. Furthermore, as the classical theory is generally applied with good results to the global and local analysis of ship structures, two beams, the first with an open profile and the second with a closed section, have been analyzed in order to compare the two theories.
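
Schematically, in notation assumed here rather than taken from the paper, the kinematics involved can be written as follows.

```latex
% Classical non-uniform torsion: axial displacement as the product of the
% unit twist angle and the warping function,
u_z(x,y,z) \;=\; \theta'(z)\,\omega(x,y),
% with \omega obtained from the Neumann problem of uniform torsion
\nabla^{2}\omega = 0 \ \text{in } A, \qquad
\frac{\partial \omega}{\partial n} = y\,n_x - x\,n_y \ \text{on } \partial A .
% A trigonometric expansion along the axis of a beam of length L fully
% restrained at both ends takes, for instance, the form
u_z(x,y,z) \;=\; \sum_{k=1}^{\infty} \bar{\omega}_k(x,y)\,
\sin\!\frac{k\pi z}{L},
% so that the first indefinite equilibrium equation can be satisfied
% pointwise rather than only in an average sense.
```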

Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems

In general, class complexity is measured based on one of several factors, such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA), and so on. Researchers have developed several new techniques, methods, and metrics based on different factors for calculating class complexity in Object-Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure, Extended Weighted Class Complexity (EWCC), which extends the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of a class and of the classes derived from it. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same weight, whereas in general the cognitive load of understanding different types of attributes is not the same. We therefore propose a new metric, Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to attributes according to the effort needed to understand their data types. Through case studies and experiments, the proposed metric has proved to be a better measure of the complexity of classes with attributes.
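
A minimal sketch of how an AWCC-style score could be computed follows. The per-data-type weights below are invented for illustration only, not the calibrated values of the paper; the structure (type-dependent attribute weights plus cognitive method weights) is the point.

```python
# illustrative type weights: harder-to-understand data types cost more
TYPE_WEIGHT = {"int": 1, "float": 1, "string": 2,
               "array": 3, "object": 4, "generic": 5}

def awcc(attribute_types, method_weights):
    """Sum type-dependent attribute weights and cognitive method weights."""
    attr_part = sum(TYPE_WEIGHT.get(t, 1) for t in attribute_types)
    method_part = sum(method_weights)  # e.g. cognitive weights of control structures
    return attr_part + method_part

# a class with two ints, one string and one nested object, and two methods
# whose bodies carry cognitive weights 4 and 7
print(awcc(["int", "int", "string", "object"], [4, 7]))   # -> 19
```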