The Splitting Upwind Schemes for Spectral Action Balance Equation

The spectral action balance equation is used to simulate short-crested, wind-generated waves in shallow water areas such as coastal regions and inland waters. The equation involves two spatial dimensions, wave direction, and wave frequency, and can be solved by the finite difference method. When this equation, with its dominating convection term, is discretized using central differences, stability problems occur if the grid spacing is chosen too coarse. In this paper, we introduce splitting upwind schemes that avoid these stability problems and prove that they are consistent with the upwind scheme and of the same accuracy. The splitting approach decomposes the spectral action balance equation into four one-dimensional problems, each of which yields an independent tridiagonal linear system. These smaller systems can be solved simultaneously by direct or iterative methods, which is very fast on a multi-processor computer.
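
Each of the four one-dimensional sub-problems reduces to a tridiagonal solve, for which the Thomas algorithm is the standard direct method. Below is a minimal sketch of that solver (an illustration of the linear-algebra kernel, not the paper's scheme itself):

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c, and right-hand side d (1-D arrays; a and c
    have length n-1)."""
    n = len(d)
    cp = np.empty(n - 1)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i - 1] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Quick check against a dense solve on a small made-up system.
A = np.diag([2., 2., 2., 2.]) + np.diag([-1., -1., -1.], 1) + np.diag([-1., -1., -1.], -1)
d = np.array([1., 0., 0., 1.])
print(thomas(np.full(3, -1.), np.full(4, 2.), np.full(3, -1.), d))
print(np.linalg.solve(A, d))
```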

Towards Cloud Computing Anatomy

Cloud Computing has recently emerged as a compelling paradigm for managing and delivering services over the internet. The rise of Cloud Computing is rapidly changing the landscape of information technology, ultimately turning the long-held promise of utility computing into a reality. As the Cloud Computing paradigm progresses rapidly, concepts and terminologies are becoming imprecise and ambiguous, and different technologies are intermingling. Thus, it becomes crucial to clarify the key concepts and definitions. In this paper, we present the anatomy of Cloud Computing, covering its essential concepts, prominent characteristics, effects, architectural design, and key technologies. We differentiate the various service and deployment models. We also discuss significant challenges and risks that need to be tackled in order to guarantee the long-term success of Cloud Computing. The aim of this paper is to provide a better understanding of the anatomy of Cloud Computing and to pave the way for further research in this area.

Evaluation of Clustering Based on Preprocessing in Gene Expression Data

Microarrays have become effective, broadly used tools in biological and medical research for addressing a wide range of problems, including the classification of disease subtypes and tumors. Many statistical methods are available for analyzing and organizing these complex data into meaningful information, and one of the main goals in analyzing gene expression data is the detection of samples or genes with similar expression patterns. In this paper, we assess and compare the performance of several clustering methods under different data preprocessing strategies, including normalization and noise removal. We also evaluate each of these clustering methods with validation measures on both simulated data and real gene expression data. We find that the clustering methods commonly used in microarray data analysis are affected by the normalization strategy and by the degree of noise in the dataset.
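
The evaluation pattern can be sketched compactly: cluster the same expression matrix with and without a preprocessing step, then score each result with a validation measure. The sketch below uses KMeans and the silhouette score as stand-ins for the methods and measures the paper compares (the specific choices and the simulated data are assumptions):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Simulated expression matrix: 100 genes x 20 samples, two gene groups.
X = np.vstack([rng.normal(0, 1, (50, 20)), rng.normal(3, 1, (50, 20))])

for name, data in [("raw", X), ("standardized", StandardScaler().fit_transform(X))]:
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
    print(name, silhouette_score(data, labels))   # higher = better-separated clusters
```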

Numerical Simulation of Tidal Currents in Persian Gulf

In this paper, a two-dimensional (2D) numerical model for simulating tidal currents in the Persian Gulf is presented. The model is based on the depth-averaged shallow water equations, which assume a hydrostatic pressure distribution. The continuity equation and the two momentum equations, including the effects of bed friction, the Coriolis force, and wind stress, are solved. To integrate the 2D equations, the Alternating Direction Implicit (ADI) technique is used, with the equations discretized by the finite volume method on a rectangular mesh. To validate the model, a dam-break case with an analytical solution is simulated and compared. The capability of the model to simulate tidal currents in a real field is then demonstrated by modeling the current behavior in the Persian Gulf. The tidal currents in the study area are driven by tidal fluctuations in the Hormuz Strait; therefore, water surface oscillation data at Hengam Island in the Hormuz Strait are used as model input. Measured water surface elevations at Assaluye port serve as the checkpoint for the model. The acceptable agreement between computed and measured results demonstrates the model's ability to simulate marine hydrodynamics.
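
The ADI idea is easiest to see on a model problem: each time step is split into two half-steps, each implicit in only one spatial direction, so every solve reduces to a set of tridiagonal systems. A minimal sketch on a 2D diffusion equation (not the full shallow water system the abstract describes) follows:

```python
import numpy as np
from scipy.linalg import solve_banded

def adi_step(u, r):
    """One ADI (Peaceman-Rachford) step for u_t = u_xx + u_yy on a square
    grid with zero Dirichlet boundaries; r = dt / (2*dx**2)."""
    n = u.shape[0]
    # Tridiagonal operator (I - r*D2) in banded form for interior points.
    ab = np.zeros((3, n - 2))
    ab[0, 1:] = -r          # super-diagonal
    ab[1, :] = 1 + 2 * r    # main diagonal
    ab[2, :-1] = -r         # sub-diagonal

    half = u.copy()
    # Half-step 1: implicit in x, explicit in y.
    for j in range(1, n - 1):
        rhs = u[1:-1, j] + r * (u[1:-1, j - 1] - 2 * u[1:-1, j] + u[1:-1, j + 1])
        half[1:-1, j] = solve_banded((1, 1), ab, rhs)
    out = half.copy()
    # Half-step 2: implicit in y, explicit in x.
    for i in range(1, n - 1):
        rhs = half[i, 1:-1] + r * (half[i - 1, 1:-1] - 2 * half[i, 1:-1] + half[i + 1, 1:-1])
        out[i, 1:-1] = solve_banded((1, 1), ab, rhs)
    return out
```

Because each half-step is implicit in one direction only, the scheme keeps the unconditional stability of implicit methods while solving only cheap tridiagonal systems, which is why ADI is attractive for large coastal grids.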

Performance of Heterogeneous Autoregressive Models of Realized Volatility: Evidence from U.S. Stock Market

This paper deals with heterogeneous autoregressive models of realized volatility (HAR-RV models) applied to high-frequency data of stock indices in the USA. Its aim is to capture the behavior of three groups of market participants trading on daily, weekly, and monthly bases, and to assess their role in predicting daily realized volatility. The benefit of this work lies mainly in the application of HAR-RV models to U.S. stock indices with the specific aim of analyzing the impact of the global financial crisis on the forecasting performance of the applied models. We use three data sets: the first from 2006-2007, the period before the global financial crisis; the second from 2008-2009, when the global financial crisis fully hit the U.S. financial market; and the last from 2010-2011. The model output indicates that estimated realized volatility in the market is determined largely by daily traders and in some cases excludes the impact of those market participants who trade on a monthly basis.
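
The core HAR-RV regression models tomorrow's realized volatility on daily, weekly (5-day average), and monthly (22-day average) lags, one regressor per trader group. A minimal sketch with ordinary least squares on a placeholder series (the window lengths follow the standard HAR convention; the data are synthetic, not the paper's):

```python
import numpy as np

def har_rv_design(rv):
    """Build HAR-RV regressors from daily realized volatilities:
    lagged daily value, weekly (5-day) mean, and monthly (22-day) mean."""
    d = rv[21:-1]
    w = np.array([rv[t - 4:t + 1].mean() for t in range(21, len(rv) - 1)])
    m = np.array([rv[t - 21:t + 1].mean() for t in range(21, len(rv) - 1)])
    X = np.column_stack([np.ones_like(d), d, w, m])
    y = rv[22:]                                   # next day's realized volatility
    return X, y

rng = np.random.default_rng(1)
rv = np.abs(rng.normal(1.0, 0.2, 500))            # placeholder volatility series
X, y = har_rv_design(rv)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # OLS: [const, daily, weekly, monthly]
print(beta)
```

The relative sizes of the daily, weekly, and monthly coefficients are what let the model attribute predictive power to the three trader groups.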

Feasibility Analysis Studies on New National R&D Programs in Korea

As part of its evaluation system for R&D programs, the Korean government has applied feasibility analysis since 2008. Various professionals have put forth great effort to keep up with the high degree of freedom of R&D programs and have contributed to evolving the feasibility analysis. We analyze diverse R&D programs from various viewpoints, such as technology, policy, and economics; integrate the separate analyses; and finally arrive at a definite result: whether a program is feasible or not. This paper describes the concept and method of the feasibility analysis as a decision-making tool. The analysis unit and the content of each criterion, which are key elements in a comprehensive decision-making structure, are examined.

Diagnosis of Ovarian Cancer with Proteomic Patterns in Serum using Independent Component Analysis and Neural Networks

We propose a method for the discrimination and classification of ovarian tissue as benign, malignant, or normal using independent component analysis and neural networks. The method was tested on a proteomic pattern set from a database, using probabilistic and radial basis function neural networks. The best performance was obtained with probabilistic neural networks, resulting in a 99% success rate, with 98% specificity and 100% sensitivity.
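
A minimal sketch of the pipeline: unmix the spectra with ICA, then classify in the reduced space. Since the paper's probabilistic neural network has no standard scikit-learn counterpart, a nearest-neighbor classifier stands in for it here, and the data are synthetic placeholders:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# Placeholder spectra: 120 serum samples x 1000 m/z intensities, 3 classes.
X = rng.normal(size=(120, 1000))
y = np.repeat([0, 1, 2], 40)                  # benign / malignant / normal
X[y == 1] += 0.5                              # inject a weak class signal

Z = FastICA(n_components=10, random_state=0).fit_transform(X)  # unmixed sources
clf = KNeighborsClassifier(n_neighbors=3)     # stand-in for the PNN
print(cross_val_score(clf, Z, y, cv=5).mean())
```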

Effective Relay Communication for Scalable Video Transmission

In this paper, we propose effective relay communication for layered video transmission as a way to make the most of limited resources in a lossy wireless communication network. Relaying brings stable multimedia services to end clients compared to multiple description coding (MDC). In addition, having the relay device retransmit to the end client only parity data for one or more video layers, generated with a channel coder, is paramount to robustness against loss. Using these methods in resource-constrained environments, such as real-time user-created content (UCC) with layered video transmission, can provide high-quality services even in a poor communication environment, and minimal services remain possible. The mathematical analysis shows that the proposed method reduces the probability of GOP loss compared to MDC and a raptor code without relay. The GOP loss rate is about zero, while MDC and the raptor code without relay have GOP loss rates of 36% and 70%, respectively, at a 10% frame loss rate.
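
To see why parity retransmission drives the GOP loss rate towards zero, consider a toy model in which a GOP of n frames is lost unless all lost frames can be recovered from parity. The GOP size and recovery model below are illustrative assumptions, not the paper's analysis:

```python
from math import comb

def gop_loss_prob(n_frames, p_loss, n_parity=0):
    """Probability a GOP is lost, assuming independent frame losses and
    that up to n_parity lost frames can be recovered from parity data."""
    survive = sum(comb(n_frames, k) * p_loss**k * (1 - p_loss)**(n_frames - k)
                  for k in range(n_parity + 1))
    return 1 - survive

# With a 12-frame GOP and 10% frame loss, no parity gives ~72% GOP loss,
# while a few recoverable parity frames drive it towards zero.
for r in (0, 1, 2, 4):
    print(r, round(gop_loss_prob(12, 0.10, r), 4))
```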

Dynamical Analysis of Circadian Gene Expression

The microarray technique allows the simultaneous measurement of the expression levels of thousands of mRNAs. By mining these data, one can identify the dynamics of gene expression time series. Using principal component analysis (PCA), we uncover the circadian rhythmic patterns underlying the gene expression profiles of the cyanobacterium Synechocystis. We applied PCA to reduce the dimensionality of the data set; examination of the components also provides insight into the underlying factors measured in the experiments. Our results suggest that the entire rhythmic content of the data can be reduced to three main components.
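
The following minimal sketch shows the analysis pattern on synthetic circadian profiles: genes oscillating with a 24 h period but random phases span a low-dimensional subspace, which PCA recovers. The data here are simulated stand-ins, not the Synechocystis profiles:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
t = np.arange(48)                                # 48 hourly time points
phases = rng.uniform(0, 2 * np.pi, 200)          # 200 genes, random phases
# Synthetic circadian profiles: 24 h sinusoids plus measurement noise.
X = np.sin(2 * np.pi * t / 24 + phases[:, None]) + 0.3 * rng.normal(size=(200, 48))

pca = PCA(n_components=5).fit(X)
print(pca.explained_variance_ratio_)   # variance concentrates in the first few PCs
```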

Algorithmic Method for Efficient Cruise Program

One of the major problems of programming a cruise circuit is deciding which destinations to include and which to leave out. Thus a decision problem emerges that might be solved using a linear and goal programming approach. The problem becomes more complex if several boats in the fleet must be programmed within a limited schedule, trying to best match their capacity to seasonal demand while also minimizing operating costs. Moreover, the company's programmer should treat passengers' time as a limited asset and would like to maximize its usage. The aim of this work is to design a method in which, using linear and goal programming techniques, a circuit-design model allows the cruise company's decision maker to achieve an optimal solution within the fleet schedule.
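
The destination-selection core can be sketched as a small 0/1 program: maximize passenger value subject to schedule and cost limits. The sketch below uses PuLP, and all destinations, scores, and limits are made-up placeholders (the paper's full model also covers fleet scheduling and goal programming):

```python
import pulp

destinations = ["A", "B", "C", "D"]
value = {"A": 8, "B": 5, "C": 6, "D": 3}     # attractiveness to passengers
days = {"A": 2, "B": 1, "C": 2, "D": 1}      # days spent at each stop
cost = {"A": 40, "B": 20, "C": 35, "D": 10}  # operating cost per stop

prob = pulp.LpProblem("cruise_circuit", pulp.LpMaximize)
x = pulp.LpVariable.dicts("visit", destinations, cat="Binary")
prob += pulp.lpSum(value[d] * x[d] for d in destinations)        # passenger value
prob += pulp.lpSum(days[d] * x[d] for d in destinations) <= 5    # schedule limit
prob += pulp.lpSum(cost[d] * x[d] for d in destinations) <= 80   # budget limit
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([d for d in destinations if x[d].value() == 1])            # chosen stops
```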

Patterned Growth of ZnO Nanowire Arrays on Zinc Foil by Thermal Oxidation

A simple approach is demonstrated for growing large-scale, nearly vertically aligned ZnO nanowire arrays by a thermal oxidation method. To reveal the effect of temperature on the growth and physical properties of the ZnO nanowires, gold-coated zinc substrates were annealed at 300 °C and 400 °C for a duration of 4 hours in air. X-ray diffraction patterns of the annealed samples showed a set of well-defined diffraction peaks, indexed to the wurtzite hexagonal phase of ZnO. Scanning electron microscopy studies show the formation of ZnO nanowires with lengths of several microns and average diameters of less than 500 nm. It is found that the areal density of wires is relatively higher when the annealing is carried out at the higher temperature, i.e., at 400 °C. From the field emission studies, the values of the turn-on and threshold fields, required to draw emission current densities of 10 μA/cm² and 100 μA/cm², are observed to be 1.2 V/μm and 1.7 V/μm for the sample annealed at 300 °C, and 2.9 V/μm and 3.7 V/μm for that annealed at 400 °C, respectively. The field emission current stability, investigated over a duration of more than 2 hours at a preset value of 1 μA, is found to be fairly good in both cases. The simplicity of the synthesis route, coupled with the promising field emission properties, offers an unprecedented advantage for the use of ZnO field emitters in high-current-density applications.

Inferences on Compound Rayleigh Parameters with Progressively Type-II Censored Samples

This paper considers inference under progressive type-II censoring with a compound Rayleigh failure time distribution. The maximum likelihood (ML) and Bayes methods are used for estimating the unknown parameters as well as some lifetime characteristics, namely the reliability and hazard functions. We obtain Bayes estimators using conjugate priors for the shape and scale parameters. When the two parameters are unknown, closed-form expressions for the Bayes estimators cannot be obtained, so we use Lindley's approximation to compute the Bayes estimates. Another Bayes estimator is obtained based on a continuous-discrete joint prior for the unknown parameters. An example with real data is discussed to illustrate the proposed method. Finally, we compare these estimators with the maximum likelihood estimators in a Monte Carlo simulation study.
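
For the ML part, the progressive type-II likelihood multiplies each observed failure's density by the survival function raised to the number of units removed at that failure. A minimal sketch, assuming the common compound Rayleigh parameterization S(t) = (beta / (beta + t^2))^alpha and made-up data:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, t, R):
    """Negative log-likelihood for the compound Rayleigh distribution
    under progressive type-II censoring, with removal counts R at each
    observed failure time t."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    # log f(t) = log(2a) + a*log(b) + log(t) - (a+1)*log(b + t^2)
    log_f = np.log(2 * a) + a * np.log(b) + np.log(t) - (a + 1) * np.log(b + t**2)
    # log S(t) = a * [log(b) - log(b + t^2)]
    log_S = a * (np.log(b) - np.log(b + t**2))
    return -(log_f.sum() + (R * log_S).sum())

# Illustrative data: 10 observed failure times and the number of units
# removed from test at each failure (all values are made up).
t = np.array([0.3, 0.5, 0.7, 0.9, 1.1, 1.4, 1.7, 2.0, 2.5, 3.1])
R = np.array([1, 0, 1, 0, 0, 1, 0, 0, 0, 2])

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(t, R), method="Nelder-Mead")
print("MLE (alpha, beta):", res.x)
```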

A Numerical Strategy to Design Maneuverable Micro-Biomedical Swimming Robots Based on Biomimetic Flagellar Propulsion

Medical applications are among the most impactful areas of microrobotics. The ultimate goal of medical microrobots is to reach currently inaccessible areas of the human body and carry out a host of complex operations, such as minimally invasive surgery (MIS), highly localized drug delivery, and screening for diseases at their very early stages. Miniature, safe, and efficient propulsion systems hold the key to maturing this technology, but they pose significant challenges. A recently developed type of propulsion uses a multi-flagella architecture inspired by the motility mechanism of prokaryotic microorganisms, yet efficient methods for designing this type of propulsion system are lacking. To fill this gap, this paper proposes a numerical strategy for designing multi-flagella propulsion systems. The strategy is based on regularized stokeslet and rotlet theory, resistive force theory (RFT), and a new approach of "local corrected velocity". The effects of the shape parameters and angular velocities of each flagellum on the overall flow field and on the robot's net forces and moments are considered. A multi-layer perceptron artificial neural network is then designed and employed to adjust the angular velocities of the motors for propulsion control. The proposed method is applied successfully to a sample configuration, and useful demonstrative results are obtained.
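
One building block of such a strategy is the regularized stokeslet: the Stokes flow induced by a point force smoothed over a small blob of radius eps, which stays finite at the force location. A minimal sketch of the standard kernel (an illustration of the general technique, not the paper's full solver):

```python
import numpy as np

def regularized_stokeslet(x, x0, f, mu=1.0, eps=1e-2):
    """Velocity at point x induced by a regularized point force f at x0
    (Cortez-style regularized stokeslet with regularization parameter eps)."""
    r = x - x0
    r2 = np.dot(r, r)
    d = np.sqrt(r2 + eps**2)
    term1 = f * (r2 + 2 * eps**2) / d**3          # isotropic part
    term2 = r * np.dot(f, r) / d**3               # dyadic (r r^T) part
    return (term1 + term2) / (8 * np.pi * mu)

# Flow at a field point due to a unit force along x at the origin.
print(regularized_stokeslet(np.array([0.5, 0.2, 0.0]),
                            np.array([0.0, 0.0, 0.0]),
                            np.array([1.0, 0.0, 0.0])))
```

Summing such kernels over force points distributed along each flagellum yields the overall flow field and, by integration, the net forces and moments on the robot.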

Autobiographical Memory and Flexible Remembering: Gender Differences

In this study, we examined gender differences in (1) a flexible remembering task that asked for episodic memory decisions at an item-specific versus category-based level, and (2) the retrieval specificity of autobiographical memory during free recall. Differences favouring women were found on both measures. Furthermore, a significant association was observed across gender groups between the level of specificity in the autobiographical memory interview and sensitivity to gist on the flexible remembering task. These results suggest that similar cognitive processes may partially contribute both to the ability for specific autobiographical recall and to the capacity for inhibiting gist information on the flexible remembering task.

Efficiency of Post-Tensioning Method for Seismic Retrofitting of Pre-Cast Cylindrical Concrete Reservoirs

Cylindrical concrete reservoirs are an appropriate choice for storing liquids such as water and oil. Using pre-cast concrete reservoirs instead of in-situ constructed ones considerably increases the speed and precision of construction: the wall and roof panels are made in a factory from high-quality materials under precise control, and the pre-cast panels are then transported to the construction site for assembly. This method has a few weak points, such as the connections of the wall panels to each other and to the foundation, which have to resist applied loads such as seismic loads. One innovative method, successfully applied for the seismic retrofitting of numerous pre-cast cylindrical water reservoirs in New Zealand, is to wrap high-tensile cables around the reservoirs and post-tension them. In this paper, analytical modeling of the wall and roof panels and the post-tensioned cables is carried out with the finite element method, and the effects of the height-to-diameter ratio, the post-tensioning force, the liquid level in the reservoir, and the installation position of the tendons on the seismic response of the reservoirs are investigated.
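
For intuition about why circumferential post-tensioning helps, note that hydrostatic pressure alone induces a hoop tension of N = rho*g*h*R per unit wall height, which the cable pre-compression must offset. A back-of-the-envelope sketch (static hydrostatics only; the paper's finite element analysis covers the seismic response, and the dimensions below are made up):

```python
# Hoop tension that circumferential post-tensioning must offset, from the
# thin-walled cylinder relation N = p * R with hydrostatic pressure p.
RHO, G = 1000.0, 9.81          # water density (kg/m^3), gravity (m/s^2)

def hoop_tension(depth, radius):
    """Hoop tension per unit wall height (N/m) at a given depth (m)."""
    return RHO * G * depth * radius

radius = 10.0                  # made-up reservoir radius (m)
for depth in (2.0, 4.0, 6.0):
    print(f"depth {depth} m: N = {hoop_tension(depth, radius) / 1e3:.0f} kN/m")
```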

On Identity Disclosure Risk Measurement for Shared Microdata

Probability-based identity disclosure risk measurement may give the same overall risk for different anonymization strategies applied to the same dataset. Some entities in the anonymized dataset may have higher identification risks than others, and individuals are more concerned about above-average risks than about the average: they want to know whether they may be among the higher-risk records. A notion of overall risk as in the above measurement method does not indicate whether some of the involved entities bear higher identity disclosure risk than others. In this paper, we introduce an identity disclosure risk measurement method that not only reports overall risk but also indicates whether some of the members are at higher risk than others. The proposed method quantifies the overall risk based on the individual risk values, the percentage of records whose risk value is higher than the average, and how much larger the higher risk values are compared to the average. We analyze the disclosure risks for different disclosure control techniques applied to original microdata and present the results.
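
A minimal sketch of these summary quantities, under the common assumption that a record's identification risk is the reciprocal of the size of its equivalence class on the quasi-identifiers (the paper's exact quantification may differ):

```python
import numpy as np
from collections import Counter

def risk_profile(quasi_ids):
    """Per-record identification risk as 1 / equivalence-class size, plus
    the summary quantities described above: overall (mean) risk, share of
    records above the mean, and how much larger those risks are on average."""
    sizes = Counter(quasi_ids)
    risks = np.array([1.0 / sizes[q] for q in quasi_ids])
    mean = risks.mean()
    high = risks[risks > mean]
    return mean, len(high) / len(risks), high.mean() / mean if len(high) else 0.0

# Toy microdata: quasi-identifier tuples (age band, zip prefix).
records = [("30s", "941"), ("30s", "941"), ("30s", "941"),
           ("40s", "942"), ("40s", "942"), ("50s", "943")]
print(risk_profile(records))   # (overall risk, share above avg, ratio to avg)
```

Two anonymizations with the same mean risk can differ sharply in the second and third quantities, which is exactly the distinction the proposed measurement is designed to expose.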

The Impact of HIV/AIDS on Micro-enterprise Development in Kenya: A Study of Obunga Slum in Kisumu

The performance of small and medium enterprises has stagnated in the last two decades, mainly due to the emergence of HIV/AIDS. The disease has had a detrimental effect on the general economy of the country, leading to morbidity and mortality of the Kenyan workforce in their prime age. The present study sought to establish the economic impact of HIV/AIDS on micro-enterprise development in Obunga slum, Kisumu, in terms of production loss and increasing labor-related costs, and to establish possible strategies to address this impact. The study was motivated by the observation that most micro-enterprises in the slum face a severe economic and social crisis due to the impact of HIV/AIDS: they become depleted and close down within a short time owing to the death of skilled and experienced workers. The study was carried out between June 2008 and June 2009 in Obunga slum. Data were subjected to computer-aided statistical analysis, including descriptive statistics, chi-squared, and ANOVA techniques. Chi-squared analysis of the micro-enterprise owners' opinions on the impact of HIV/AIDS on the depletion of micro-enterprises, compared to other diseases, indicated high levels of negative effects of the disease at significance levels of P

The Experimental Study of the Effect of Flow Pattern Geometry on Performance of Micro Proton Exchange Membrane Fuel Cell

In this research, the influence of the flow pattern on the performance of a micro PEMFC was investigated experimentally. The investigation focused on the impacts of the bend angles and rib/channel dimensions of the serpentine flow channel pattern on performance, and on how they can improve it. The fuel cell employed for these experiments was a single micro PEMFC with a 1.44 cm² Nafion NRE-212 membrane. The results show that 60° and 120° bend angles provide better performance at 20 and 40 sccm inlet flow rates compared to the conventional design. Additionally, a wider channel with narrower rib spacing gives better performance. These results may be applied to develop universal heuristics for the design of flow patterns of micro PEMFCs.

A New Heuristic Algorithm for the Dynamic Facility Layout Problem with Budget Constraint

In this research, we have developed a new efficient heuristic algorithm for the dynamic facility layout problem with budget constraint (DFLPB). The heuristic combines two methods, discrete event simulation and integer programming (IP), to obtain a near-optimum solution. In the proposed algorithm, the non-linear model of the DFLP is first changed to a pure integer programming (PIP) model. Then, the optimal solution of the PIP model is used in a simulation model, designed in a manner similar to the DFLP, to determine the probability of assigning a facility to a location. After a sufficient number of runs, the simulation model yields near-optimum solutions. Finally, to verify the performance of the algorithm, several test problems have been solved. The results show that the proposed algorithm is more efficient, in terms of speed and accuracy, than other heuristic algorithms presented in previous works in the literature.
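
The abstract does not detail the simulation, so the sketch below only illustrates the general idea of turning repeated randomized runs into facility-to-location assignment probabilities, using a QAP-style layout cost on made-up data; it is not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(4)
n_fac = n_loc = 4
flow = rng.integers(1, 10, (n_fac, n_fac))      # made-up material flows
dist = rng.integers(1, 10, (n_loc, n_loc))      # made-up location distances

def layout_cost(perm):
    """QAP-style cost of assigning facility i to location perm[i]."""
    return sum(flow[i, j] * dist[perm[i], perm[j]]
               for i in range(n_fac) for j in range(n_fac))

# Repeated randomized runs; count how often each facility lands in each
# location among the best layouts, as a proxy for assignment probability.
counts = np.zeros((n_fac, n_loc))
samples = [rng.permutation(n_loc) for _ in range(2000)]
best = sorted(samples, key=layout_cost)[:100]
for perm in best:
    for i, loc in enumerate(perm):
        counts[i, loc] += 1
print(counts / len(best))    # empirical facility-to-location probabilities
```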

Program Camouflage: A Systematic Instruction Hiding Method for Protecting Secrets

This paper proposes an easy-to-use instruction hiding method to protect software from malicious reverse engineering attacks. Given a source program (the original) to be protected, the proposed method (1) takes a modified version of it (the fake) as input, (2) analyzes the differences in assembly code instructions between the original and the fake, and (3) introduces self-modification routines so that the fake instructions are turned into the correct (i.e., original) instructions just before they are executed and revert to fake ones afterwards. The proposed method can add a certain amount of security to a program, since the fake instructions in the resultant program confuse attackers, and significant effort is required to discover and remove all the fake instructions and self-modification routines. The method is also easy to use: all a user has to do is prepare a fake source code by modifying the original source code.
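
A toy sketch of step (2), the diff analysis, on symbolic instruction listings. A real tool would emit self-modification routines that patch the corresponding machine-code bytes at run time; here we only build the (address, fake, original) patch table, and the instructions are made up:

```python
# Locate the instructions that differ between the original and fake
# assembly listings; each difference is one site that a self-modification
# routine would restore before execution and re-camouflage afterwards.
original = ["mov eax, 5", "add eax, ebx", "cmp eax, 10", "jle .loop"]
fake     = ["mov eax, 7", "add eax, ebx", "cmp eax, 99", "jle .loop"]

patch_table = [(addr, f, o)
               for addr, (o, f) in enumerate(zip(original, fake))
               if o != f]
for addr, f, o in patch_table:
    print(f"at {addr}: execute '{o}' instead of shipped '{f}'")
```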