Effect of Geometrical Parameters on Natural Frequencies of FGM Cylindrical Shell with Holes Under Various Boundary Conditions

In recent years, functionally graded materials (FGMs) have gained considerable attention for applications in high-temperature environments. In this paper, the free vibration of a thin functionally graded cylindrical shell with holes, composed of stainless steel and zirconia, is studied. The mechanical properties vary smoothly and continuously from one surface to the other according to a volume fraction power-law distribution. The influence of the shell's geometrical parameters, variations in the volume fractions, and boundary conditions on the natural frequencies is considered. The equations of motion are based on the strain-displacement relations of Love's shell theory and the Rayleigh method. Natural frequencies are obtained for cylindrical shells with holes of different shapes, numbers, and locations.
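
The power-law distribution referred to above is commonly written as V(z) = (z/h + 1/2)^N for a shell of thickness h, with the effective property obtained by a rule of mixtures. A minimal Python sketch, using illustrative material values (not the paper's exact data):

```python
def volume_fraction(z, h, N):
    """Power-law volume fraction of the outer constituent at distance z
    from the mid-surface of a shell of thickness h (-h/2 <= z <= h/2)."""
    return (z / h + 0.5) ** N

def effective_property(z, h, N, P_outer, P_inner):
    """Rule-of-mixtures effective property at position z."""
    V = volume_fraction(z, h, N)
    return (P_outer - P_inner) * V + P_inner

# illustrative Young's moduli (Pa) and thickness -- assumed values
E_zirconia, E_steel = 151e9, 208e9
h = 0.002  # shell thickness in metres
for N in (0.5, 1.0, 5.0):
    E_mid = effective_property(0.0, h, N, E_zirconia, E_steel)
    print(f"N={N}: effective E at mid-surface = {E_mid / 1e9:.1f} GPa")
```

The power exponent N controls how quickly the material grades from the inner to the outer constituent, which is what drives the frequency shifts studied in the paper.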

Confidence Intervals for Double Exponential Distribution: A Simulation Approach

The double exponential model (DEM), or Laplace distribution, is used in various disciplines. However, there are issues related to the construction of confidence intervals (CIs) when using this distribution. In this paper, the properties of the DEM are considered with the intention of constructing CIs based on simulated data. Pivotal quantities for the model are analyzed and compared with those for the normal distribution, and the results obtained from simulated data are presented.
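
As a hedged illustration of the simulation approach, the sketch below draws Laplace samples by inverse-CDF sampling and checks the empirical coverage of the usual normal-theory interval (a stand-in for the paper's pivotal analysis; all parameter values are illustrative):

```python
import math, random, statistics

random.seed(1)

def laplace_sample(mu, b):
    """Inverse-CDF draw from a Laplace(mu, b) distribution."""
    u = random.random() - 0.5
    return mu - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def normal_theory_ci(xs, z=1.96):
    """Nominal 95% CI for the mean, built as if the data were normal."""
    m = statistics.mean(xs)
    se = statistics.stdev(xs) / math.sqrt(len(xs))
    return m - z * se, m + z * se

mu, b, n, reps = 0.0, 1.0, 30, 2000
covered = 0
for _ in range(reps):
    xs = [laplace_sample(mu, b) for _ in range(n)]
    lo, hi = normal_theory_ci(xs)
    covered += (lo <= mu <= hi)
print(f"empirical coverage of nominal 95% CI: {covered / reps:.3f}")
```

Comparing the empirical coverage against the nominal level is one simple way to quantify how the Laplace model departs from normal-theory intervals.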

Development of Maximum Entropy Method for Prediction of Droplet-size Distribution in Primary Breakup Region of Spray

Droplet size distributions in the cold spray of a fuel are important to the observed combustion behavior. Specification of droplet size and velocity distributions immediately downstream of injectors is also essential as boundary conditions for advanced computational fluid dynamics (CFD) and two-phase spray transport calculations. This paper describes the development of a new model to be incorporated into the maximum entropy principle (MEP) formalism for predicting the droplet size distribution in the droplet formation region. The MEP approach can predict the most likely droplet size and velocity distributions under a set of constraints expressing the available information related to the distribution. In this article, by considering the mechanisms of turbulence generation inside the nozzle and wave growth on the jet surface, we attempt to provide a logical framework coupling the flow inside the nozzle to the resulting atomization process. The purpose of this paper is to describe the formulation of this new model and to incorporate it into the MEP formalism by coupling the sub-models together using source terms of momentum and energy. Comparisons between the model predictions and experimental data for a gas turbine swirling nozzle and an annular spray indicate good agreement between model and experiment.

Systems with Queueing and Their Simulation

In queueing theory, it is assumed that customer arrivals follow a Poisson process and that service times are exponentially distributed. Under these assumptions, the behaviour of the queueing system can be described by means of Markov chains, and the characteristics of the system can be derived. In this paper, these theoretical approaches are presented for several types of systems, and it is also shown how to compute the characteristics when these assumptions are not satisfied.
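
Under the Poisson-arrival/exponential-service assumptions above, the simplest Markovian system is the M/M/1 queue, whose characteristics follow in closed form. A minimal sketch with illustrative rates (not tied to any particular system in the paper):

```python
def mm1_characteristics(lam, mu):
    """Steady-state characteristics of an M/M/1 queue (requires lam < mu):
    utilisation rho, mean number in system L, mean sojourn time W,
    mean queue length Lq, and mean waiting time Wq."""
    assert lam < mu, "queue is unstable unless arrival rate < service rate"
    rho = lam / mu
    L = rho / (1.0 - rho)          # mean number of customers in the system
    W = 1.0 / (mu - lam)           # mean time spent in the system
    Lq = rho ** 2 / (1.0 - rho)    # mean number waiting in the queue
    Wq = rho / (mu - lam)          # mean waiting time before service
    return rho, L, W, Lq, Wq

rho, L, W, Lq, Wq = mm1_characteristics(lam=2.0, mu=5.0)
print(f"rho={rho:.2f}  L={L:.3f}  W={W:.3f}  Lq={Lq:.3f}  Wq={Wq:.3f}")
```

Little's law (L = λW) ties the formulas together and also serves as a sanity check when the assumptions fail and one must resort to simulation instead.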

Nonparametric Control Chart Using Density Weighted Support Vector Data Description

In manufacturing industries, advances in measurement technology have increased the number of monitored variables, bringing multivariate control to the fore. Statistical process control (SPC) charts are among the most widely used multivariate control tools. Nevertheless, SPC is restricted in its application because it assumes that the data follow a specific distribution; in practice, process data are often mixtures of several processes and are hard to model with a single distribution. As an alternative to conventional SPC, nonparametric control charts have attracted attention because of their key strength: the absence of parameter estimation. The SVDD-based control chart is one such nonparametric chart, with the advantage of a flexible control boundary. However, the basic concept of SVDD overlooks an important data characteristic: the density distribution. We therefore propose DW-SVDD (Density-Weighted SVDD) to remedy this weakness of conventional SVDD. DW-SVDD makes a new attempt to account for the local density of the data by introducing the notion of a density weight. We extend the proposed SVDD to a control chart, and a simulation study on data from various distributions demonstrates the improvement in performance.

Normalized Cumulative Spectral Distribution in Music

As music therapy becomes more widely practiced and the meditative effect of music is verified, people are taking a growing interest in the psychological balance or therapy that music can provide. Traditional studies have verified that music whose spectral envelope varies approximately as 1/f (f is frequency) down to low frequencies provides psychological balance. In this paper, we investigate signal properties of music that provides psychological balance. To find these, we derive the property from voice signals. Music composed of voice shows large values in the normalized cumulative spectral distribution (NCSD). We quantify the degree of difference between pieces of music by the curvature of the NCSD: in music that provides psychological balance the curvature is high; otherwise, the curvature is low.
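
How such a normalized cumulative spectral distribution might be computed can be sketched as follows, using a naive DFT on a toy two-tone signal (the paper's actual signal-processing chain is not specified here, so this is only an illustrative construction):

```python
import math

def ncsd(signal):
    """Normalized cumulative spectral distribution of a real signal:
    running sum of the power spectrum, normalised to end at 1."""
    n = len(signal)
    power = []
    for k in range(n // 2):  # naive DFT, positive frequencies only
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append(re * re + im * im)
    total = sum(power) or 1.0
    cum, out = 0.0, []
    for p in power:
        cum += p
        out.append(cum / total)
    return out

# toy test signal: two sinusoids at different frequencies and amplitudes
sig = [math.sin(2 * math.pi * 5 * t / 128)
       + 0.5 * math.sin(2 * math.pi * 20 * t / 128) for t in range(128)]
curve = ncsd(sig)
print(f"NCSD at the last bin: {curve[-1]:.3f}")
```

The curvature of this nondecreasing curve is then what distinguishes spectra dominated by low-frequency energy (steep early rise) from flatter spectra.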

Analysis of Servers Linked in Series with Blocking, Priority Feedback Service and Threshold Policy

The use of buffer thresholds, blocking, and adequate service strategies are well-known techniques for congestion control of computer network traffic. This motivates the study of series queues with blocking, feedback (service under a Head-of-Line (HoL) priority discipline), and finite-capacity buffers with thresholds. In this paper, the external traffic is modelled using a Poisson process, and the service times are modelled using the exponential distribution. We consider a three-station network with two finite buffers, for which a set of thresholds (tm1 and tm2) is defined. The network behaves as follows. A task that finishes its service at station B is sent back to station A for re-processing with probability o. When the number of tasks in the second buffer exceeds the threshold tm2 and the number of tasks in the first buffer is less than tm1, the fed-back task is served under the HoL priority discipline. Otherwise, a "no two priority services in succession" procedure is applied to fed-back tasks, preventing a possible overflow in the first buffer. Using an open Markovian queueing schema with blocking, priority feedback service, and thresholds, a closed-form, cost-effective analytical solution is obtained. The model of servers linked in series is very accurate: it is derived directly from a two-dimensional state graph and a set of steady-state equations, followed by calculations of the main measures of effectiveness. Consequently, efficient expressions with low computational cost are determined. Based on numerical experiments and the collected results, we conclude that the proposed model with blocking, feedback, and thresholds can provide accurate performance estimates of networks linked in series.
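
The threshold rule described above can be stated compactly: a fed-back task receives HoL priority only when the second buffer is above tm2 while the first buffer is still below tm1. A minimal sketch of that decision rule (buffer states are illustrative):

```python
def hol_priority(n1, n2, tm1, tm2):
    """Return True if a fed-back task is served under HoL priority:
    the second buffer exceeds its threshold tm2 while the first buffer
    is below tm1, so serving it first risks no overflow upstream."""
    return n2 > tm2 and n1 < tm1

# illustrative buffer occupancies and thresholds
print(hol_priority(n1=2, n2=8, tm1=5, tm2=6))  # HoL priority applies
print(hol_priority(n1=6, n2=8, tm1=5, tm2=6))  # fall back to the
# "no two priority services in succession" procedure
```

In the full model this predicate selects between the two service regimes at each state of the two-dimensional state graph.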

Estimation of Bayesian Sample Size for Binomial Proportions Using p-Tolerance with the Lowest Posterior Loss

This paper uses p-tolerance with the lowest posterior loss, the quadratic loss function, the average length criterion, the average coverage criterion, and the worst outcome criterion to compute the sample size for estimating the proportion of a binomial probability function with a beta prior distribution. The proposed methodology is examined, and its effectiveness is shown.
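
As a hedged sketch of the average length criterion only, the following Monte-Carlo estimate uses a Beta(a, b) prior and equal-tailed posterior credible intervals; the paper's p-tolerance/lowest-posterior-loss machinery is more refined, and all values below are illustrative:

```python
import random

random.seed(7)

def avg_credible_length(n, a=1.0, b=1.0, level=0.95, sims=300, draws=400):
    """Monte-Carlo estimate of the average length of an equal-tailed
    posterior credible interval for a binomial proportion, averaging
    over data generated from the Beta(a, b) prior predictive."""
    tail = (1.0 - level) / 2.0
    total = 0.0
    for _ in range(sims):
        p = random.betavariate(a, b)                    # p drawn from the prior
        x = sum(random.random() < p for _ in range(n))  # binomial(n, p) data
        # posterior is Beta(a + x, b + n - x); approximate its quantiles
        post = sorted(random.betavariate(a + x, b + n - x) for _ in range(draws))
        lo = post[int(tail * draws)]
        hi = post[int((1.0 - tail) * draws) - 1]
        total += hi - lo
    return total / sims

for n in (20, 80):
    print(f"n={n}: average 95% interval length ~ {avg_credible_length(n):.3f}")
```

Sample-size selection under the average length criterion then amounts to finding the smallest n whose average interval length falls below the desired bound.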

Fuzzy Numbers and MCDM Methods for Portfolio Optimization

A new deployment of multiple criteria decision making (MCDM) techniques, the Simple Additive Weighting (SAW) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for portfolio allocation is demonstrated in this paper. Rather than referring exclusively to the mean and variance, as in the traditional mean-variance method, the criteria used in this demonstration are the first four moments of the portfolio distribution. Each asset is evaluated based on its marginal impacts on the portfolio's higher moments, which are characterized by trapezoidal fuzzy numbers. Centroid-based defuzzification is then applied to convert the fuzzy numbers into the crisp numbers on which SAW and TOPSIS can operate. Experimental results suggest that these MCDM approaches are similarly efficient at selecting dominant assets for an optimal portfolio under higher moments. The proposed approaches allow investors to flexibly adjust their risk preferences regarding higher moments via different schemes, adapting to various kinds of investors (from conservative to risky). Another significant advantage is that, compared to the mean-variance analysis, the portfolio weights obtained by SAW and TOPSIS are consistently well-diversified.
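
One step described above, centroid-based defuzzification of trapezoidal fuzzy numbers followed by SAW scoring, can be sketched as follows. The asset data are hypothetical, and the centroid formula is the standard one for a trapezoidal fuzzy number (a, b, c, d):

```python
def trapezoid_centroid(a, b, c, d):
    """Centroid (defuzzified value) of a trapezoidal fuzzy number (a,b,c,d)."""
    if a == b == c == d:
        return float(a)  # degenerate (crisp) number
    return (d * d + c * c + c * d - a * a - b * b - a * b) / (3.0 * (d + c - a - b))

def saw_scores(matrix, weights):
    """Simple Additive Weighting on crisp scores, treating every
    criterion as a benefit criterion with max-normalised columns."""
    cols = list(zip(*matrix))
    maxima = [max(col) for col in cols]
    return [sum(w * x / m for w, x, m in zip(weights, row, maxima))
            for row in matrix]

# two hypothetical assets scored on two criteria as trapezoidal fuzzy numbers
fuzzy = [[(0.2, 0.3, 0.4, 0.5), (0.5, 0.6, 0.7, 0.8)],
         [(0.4, 0.5, 0.6, 0.7), (0.1, 0.2, 0.3, 0.4)]]
crisp = [[trapezoid_centroid(*f) for f in row] for row in fuzzy]
print(saw_scores(crisp, weights=[0.5, 0.5]))
```

TOPSIS would start from the same crisp matrix but rank alternatives by distance to ideal and anti-ideal solutions instead of a weighted sum.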

Volume Fraction Law for Stainless Steel on Inner Surface and Nickel on Outer Surface for FGM Cylindrical Shell

The vibration of thin cylindrical shells made of a functionally graded material composed of stainless steel and nickel is presented. The effects of the FGM configuration are studied by examining the frequencies of FG cylindrical shells. In this case, the FG cylindrical shell has nickel on its outer surface and stainless steel on its inner surface. The study is carried out based on third-order shear deformation shell theory. The objective is to study the natural frequencies, the influence of the constituent volume fractions, and the effects of the configurations of the constituent materials on the frequencies. The properties are graded in the thickness direction according to a volume fraction power-law distribution. Results are presented on the frequency characteristics and the influence of the various constituent volume fractions on the frequencies.

Software Reliability Prediction Model Analysis

Software reliability prediction offers a great opportunity to measure the software failure rate at any point during system test. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. In this article we focus on a software reliability model, assuming that there is time redundancy, the amount of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that not only irreversible failures but also self-repairing failures may occur in the system; these significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence consisting of a random number of basic blocks. We consider the system software unreliable, with the time between adjacent failures exponentially distributed.
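
As a hedged sketch of the time-redundancy idea, the following simulation estimates the distribution function of the transmission time of a sequence of basic blocks, assuming exponential failures and repeat-on-failure; the geometric block count and all rates are illustrative assumptions, not the paper's model:

```python
import math, random

random.seed(3)

def transmission_time(tau, lam, p_more=0.7):
    """One realisation of the transmission time of an instruction sequence.
    Blocks: geometric count (continue with probability p_more). A block
    attempt of length tau fails if an exponential(lam) failure occurs
    within it, and is then repeated (time redundancy)."""
    p_fail = 1.0 - math.exp(-lam * tau)  # failure within one attempt
    total = 0.0
    while True:
        total += tau                      # first attempt at this block
        while random.random() < p_fail:   # repeat the block on failure
            total += tau
        if random.random() >= p_more:     # no more blocks in the sequence
            return total

samples = sorted(transmission_time(tau=1.0, lam=0.1) for _ in range(5000))
# empirical distribution function at a few points
for t in (2.0, 5.0, 10.0):
    F = sum(s <= t for s in samples) / len(samples)
    print(f"F({t}) ~ {F:.3f}")
```

The empirical DF estimated this way is the simulation counterpart of the analytical distribution function the paper derives.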

Reconstituting Information about Discontinued Water Quality Variables in the Nile Delta Monitoring Network Using Two Record Extension Techniques

The world economic crisis and budget constraints have caused authorities, especially those in developing countries, to rationalize water quality monitoring activities. Rationalization consists of reducing the number of monitoring sites, the number of samples, and/or the number of water quality variables measured. The reduction in water quality variables is usually based on correlation: if two variables exhibit high correlation, it is an indication that some of the information produced may be redundant, so one variable can be discontinued while the other continues to be measured. Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In this paper, two record extension techniques are employed to reconstitute information about discontinued water quality variables: OLS and the Line of Organic Correlation (LOC). An empirical experiment is conducted using water quality records from the Nile Delta water quality monitoring network in Egypt. The record extension techniques are compared for their ability to predict different statistical parameters of the discontinued variables. Results show that OLS is better at estimating individual water quality records, but it underestimates the variance of the extended records. The LOC technique is superior in preserving the characteristics of the entire distribution and avoids underestimating the variance. It is concluded that OLS can be used for substituting missing values, while LOC is preferable for inferring statements about the probability distribution.
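
The distinction between the two techniques comes down to the slope of the extension line: OLS shrinks it by the correlation coefficient (b = r·sy/sx), while LOC uses the ratio of standard deviations with the sign of the correlation (b = sign(r)·sy/sx), which preserves the variance of the extended record. A minimal sketch on synthetic data (illustrative values, not the Nile Delta records):

```python
import math, random, statistics

random.seed(11)

def ols_loc_slopes(x, y):
    """Slopes of OLS regression and the Line of Organic Correlation (LOC)."""
    n = len(x)
    mx, my = statistics.mean(x), statistics.mean(y)
    sx, sy = statistics.stdev(x), statistics.stdev(y)
    r = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / ((n - 1) * sx * sy)
    return r * sy / sx, math.copysign(sy / sx, r)

# synthetic records: x continuously measured, y the discontinued variable
x = [random.gauss(10.0, 2.0) for _ in range(200)]
y = [1.5 * xi + random.gauss(0.0, 3.0) for xi in x]
b_ols, b_loc = ols_loc_slopes(x, y)

mx, my = statistics.mean(x), statistics.mean(y)
yhat_ols = [my + b_ols * (xi - mx) for xi in x]
yhat_loc = [my + b_loc * (xi - mx) for xi in x]
print(f"var(y)={statistics.variance(y):.2f}  "
      f"var(OLS)={statistics.variance(yhat_ols):.2f}  "
      f"var(LOC)={statistics.variance(yhat_loc):.2f}")
```

The OLS extension has variance r²·var(y), which is why it underestimates variability; the LOC extension reproduces var(y) exactly, matching the paper's conclusion.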

The Influence of Drawbead Surface Roughness on Non-Symmetric Deep Drawing of Cold Rolled Steel Sheet

This study aimed to explain the influence of the surface roughness of the drawbead on the non-symmetric deep drawing of cold rolled steel sheet, in order to improve the drawability of such sheet. The variables used in this study were a semi-circular drawbead with three levels of surface roughness (6.127 µm Ra, 0.963 µm Ra, and 0.152 µm Ra), cold rolled steel sheets of three JIS grades (SPCC, SPCE, and SPCD) with a thickness of 1.0 mm, a blankholder force of 50% of the drawing force, and a drawing depth of 50 mm. The test results show that as the surface roughness of the drawbead increased, the deep drawing force increased, especially for the SPCC cold rolled steel sheet. The equivalent strain and the wall thickness distribution likewise increased with the drawbead's surface roughness. It can be concluded that the surface roughness of the drawbead influences the deep drawing of cold rolled steel sheet, in particular the drawing force, the equivalent strain, and the wall thickness distribution.

The Analysis of Photoconductive Semiconductor Switch Operation at a Frequency of 10 GHz

A device analysis of the photoconductive semiconductor switch is carried out to investigate the distribution of the electric field and carrier concentrations, as well as the current density distribution. The operation of this device as a switch in the X band is then investigated. It is shown that, despite the symmetric geometry, the switch current density in the on-state steady-state mode is distributed asymmetrically throughout the device.

Research on Weakly Hard Real-Time Constraints and Their Boolean Combination to Support Adaptive QoS

Advances in computing applications in recent years have prompted the demand for more flexible scheduling models for QoS. Moreover, in practical applications, partly violated temporal constraints can be tolerated if the violations follow a certain distribution. The traditional Liu and Layland model therefore needs to be extended to adapt to these circumstances. There are two such extensions: the (m, k)-firm model and the window-constrained model. This paper studies weakly hard real-time constraints and their Boolean combination to support QoS. The fact that a practical application can tolerate some violations of its temporal constraints under a certain distribution is exploited to support adaptive QoS in open real-time systems. Experimental results show that these approaches are effective compared with traditional scheduling algorithms.
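
The (m, k)-firm model mentioned above requires that in any window of k consecutive jobs, at least m meet their deadlines. A minimal checker sketch (the job outcomes are hypothetical):

```python
def meets_mk_firm(outcomes, m, k):
    """True if every window of k consecutive jobs contains at least m
    deadline hits (outcomes[i] is True when job i met its deadline)."""
    return all(sum(outcomes[i:i + k]) >= m
               for i in range(len(outcomes) - k + 1))

jobs = [True, True, False, True, True, False, True, True]
print(meets_mk_firm(jobs, m=2, k=3))  # tolerate 1 miss per window of 3
print(meets_mk_firm(jobs, m=3, k=3))  # hard case: every job must hit
```

Boolean combinations of such constraints (conjunctions or disjunctions of several (m, k) pairs) are what give the adaptive QoS framework its flexibility.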

Nickel on Inner Surface and Stainless Steel on Outer Surface for Functionally Graded Cylindrical Shell

The vibration of thin cylindrical shells made of a functionally graded material (FGM) composed of stainless steel and nickel is presented. The effects of the FGM configuration are studied by examining the frequencies of FG cylindrical shells. In this case, the FG cylindrical shell has nickel on its inner surface and stainless steel on its outer surface. The study is carried out based on third-order shear deformation shell theory. The objective is to study the natural frequencies, the influence of the constituent volume fractions, and the effects of the configurations of the constituent materials on the frequencies. The properties are graded in the thickness direction according to a volume fraction power-law distribution. Results are presented on the frequency characteristics and the influence of the various constituent volume fractions on the frequencies.

Landslide and Debris Flow Characteristics during Extreme Rainfall in Taiwan

As the global climate changes, the threat from landslides and debris flows increases. Learning how a watershed initiates landslides under abnormal rainfall conditions, and predicting landslide magnitude and frequency distribution, is thus important. Landslides show a power law in their frequency-area distribution, and the distribution curve shows an exponent (gradient) of 1.0 in the sandpile model test. Will the landslide frequency-area statistics show a distribution similar to the sandpile model under extreme rainfall conditions? The purpose of this study is to identify the frequency-area distribution of landslides induced by extreme rainfall in the Laonong River Basin in southern Taiwan. The results of the analysis show that the lower gradient of the landslide frequency-area distribution can be attributed to the transportation and deposition areas of debris flows that are included in the landslide area.
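
The frequency-area power-law exponent can be estimated by log-binned least squares on a landslide area inventory. A hedged sketch on synthetic Pareto-distributed areas (illustrative values, not the Laonong Basin inventory):

```python
import math, random

random.seed(5)

def fit_power_law_exponent(areas, n_bins=10):
    """Least-squares slope of log10(density) vs log10(area) using
    logarithmically spaced bins; for f(A) ~ A^(-beta), returns beta."""
    lo, hi = min(areas), max(areas)
    edges = [lo * (hi / lo) ** (i / n_bins) for i in range(n_bins + 1)]
    xs, ys = [], []
    for i in range(n_bins):
        count = sum(edges[i] <= a < edges[i + 1] for a in areas)
        if count:
            width = edges[i + 1] - edges[i]
            xs.append(math.log10(math.sqrt(edges[i] * edges[i + 1])))
            ys.append(math.log10(count / width))  # density, not raw count
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# synthetic landslide areas from a power-law (Pareto) distribution
beta = 2.0  # density exponent: f(A) ~ A^(-beta)
areas = [(1.0 - random.random()) ** (-1.0 / (beta - 1.0)) for _ in range(5000)]
print(f"fitted exponent ~ {fit_power_law_exponent(areas):.2f}")
```

A fitted exponent lower than the one used to generate the data would mimic the flattening the paper attributes to debris-flow transport and deposition areas being merged into the landslide inventory.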

Circular Patch Microstrip Array Antenna for Ku-Band

This paper presents a circular patch microstrip array antenna operating in the Ku-band (10.9 GHz – 17.25 GHz). The proposed circular patch array antenna is a lightweight, flexible, slim, and compact unit compared with current antennas used in the Ku-band. The paper also presents the detailed steps of designing the circular patch microstrip array antenna. Advanced Design System (ADS) software is used to compute the gain, power, radiation pattern, and S11 of the antenna. The proposed circular patch microstrip array antenna is essentially a phased array consisting of n elements (circular patch antennas) arranged in a rectangular grid. The size of each element is determined by the operating frequency. The incident wave from the satellite arrives at the plane of the antenna with equal phase across the surface of the array, and each of the n elements receives a small amount of power in phase with the others. A feed network connects each element to the microstrip lines with equal lengths, so the signals reaching the circular patches all combine in phase and the voltages add up. The significant variation across the circular patch array antenna therefore comes not from the phase across the surface but from the magnitude distribution.
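
The element sizing mentioned above ("the size of each element is determined by the operating frequency") is commonly done with the standard fringing-corrected design equation for a circular patch. The sketch below assumes illustrative substrate values that are not taken from the paper:

```python
import math

def circular_patch_radius(f_r, eps_r, h_cm):
    """Physical radius (cm) of a circular microstrip patch resonating at
    f_r (Hz) on a substrate of relative permittivity eps_r and height
    h_cm (cm), using the standard fringing-corrected design equation."""
    F = 8.791e9 / (f_r * math.sqrt(eps_r))  # uncorrected radius, cm
    return F / math.sqrt(1.0 + (2.0 * h_cm / (math.pi * eps_r * F))
                         * (math.log(math.pi * F / (2.0 * h_cm)) + 1.7726))

# an element sized for the middle of the Ku-band on an assumed substrate
a = circular_patch_radius(f_r=14.0e9, eps_r=2.2, h_cm=0.1588)
print(f"patch radius ~ {a:.3f} cm")
```

Repeating this for each design frequency gives the element dimensions that the array grid and feed network are then built around.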

Key Exchange Protocol over Insecure Channel

Key management represents a major and the most sensitive part of cryptographic systems. It includes key generation, key distribution, key storage, and key deletion, and is considered the hardest part of cryptography. Designing secure cryptographic algorithms is hard, and keeping the keys secret is much harder. Cryptanalysts usually attack both symmetric and public-key cryptosystems through their key management. We introduce a protocol for exchanging cipher keys over an insecure communication channel. The protocol is based on public-key cryptosystems, especially the elliptic curve cryptosystem. It also tests the cipher keys, selecting only good keys and rejecting weak ones.
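
The abstract builds on elliptic-curve public-key exchange. As a hedged, self-contained illustration of the underlying principle only, here is a toy classical Diffie-Hellman exchange over a prime field — deliberately not ECC, and with a group far too small to be secure:

```python
import random

random.seed(0)

# Toy group parameters -- FOR ILLUSTRATION ONLY; real systems use
# standardised elliptic-curve groups of adequate size.
p = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a 64-bit prime (toy size)
g = 5

def keypair():
    """Generate (private exponent, public value) for one party."""
    priv = random.randrange(2, p - 2)
    return priv, pow(g, priv, p)

a_priv, a_pub = keypair()   # Alice publishes a_pub over the insecure channel
b_priv, b_pub = keypair()   # Bob publishes b_pub
# each side combines its private key with the other's public value
shared_a = pow(b_pub, a_priv, p)
shared_b = pow(a_pub, b_priv, p)
print(shared_a == shared_b)  # both sides derive the same shared secret
```

An eavesdropper sees only g, p, and the public values; recovering the shared secret requires solving a discrete logarithm, which is the same hardness assumption (in elliptic-curve form) underlying the paper's protocol.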

A New Approach to Image Segmentation via Fuzzification of Rényi Entropy of Generalized Distributions

In this paper, we propose a novel approach to image segmentation via fuzzification of the Rényi entropy of generalized distributions (REGD). The fuzzy REGD is used to precisely measure the structural information of the image and to locate the optimal threshold for segmentation. The proposed approach draws upon the postulate that the optimal threshold concurs with the maximum information content of the distribution. The contributions of the paper are as follows. First, the fuzzy REGD is introduced as a measure of the spatial structure of an image. Then, an efficient entropic segmentation approach using the fuzzy REGD is proposed. Although the proposed approach belongs to the family of entropic segmentation approaches, which are commonly applied to grayscale images, it is adapted to segment color images as well. Lastly, diverse experiments on real images are carried out, showing the superior performance of the proposed method.
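
A minimal sketch of entropic thresholding with Rényi entropy — the crisp, non-fuzzy variant, on a toy histogram; the paper's fuzzified REGD is more elaborate:

```python
import math

def renyi_threshold(hist, alpha=0.5):
    """Threshold maximising the sum of the Rényi entropies
    H_alpha = ln(sum p_i^alpha) / (1 - alpha) of the background and
    foreground class distributions of a grayscale histogram."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_val = None, -math.inf
    for t in range(1, len(p)):
        w0 = sum(p[:t])          # background class probability
        w1 = 1.0 - w0            # foreground class probability
        if w0 <= 0 or w1 <= 0:
            continue
        s0 = sum((pi / w0) ** alpha for pi in p[:t] if pi > 0)
        s1 = sum((pi / w1) ** alpha for pi in p[t:] if pi > 0)
        val = (math.log(s0) + math.log(s1)) / (1.0 - alpha)
        if val > best_val:
            best_val, best_t = val, t
    return best_t

# toy bimodal histogram: dark mode around bin 2, bright mode around bin 12
hist = [1, 5, 9, 5, 1, 0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0]
print(f"Rényi threshold at bin {renyi_threshold(hist)}")
```

On this clean bimodal histogram the maximiser lands in the valley between the two modes, which is the behaviour the fuzzified version sharpens on real, noisy images.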