Power Saving System in Green Data Center

Power consumption is rapidly increasing in data centers because the number of data centers is growing and their scale is becoming larger. Reducing power consumption in the data center is therefore a key research topic. The peak power of a typical server is around 250 watts. When a server is idle, it still draws around 60% of the power consumed when in use, although vendors are putting effort into reducing this "idle" power load. Servers tend to operate at only around a 5% to 20% utilization rate, partly because of response-time concerns, and on average about 10% of servers in data centers are unused. For these reasons, we propose a dynamic power management system to reduce power consumption in a green data center. Experimental results show that power consumption at idle time is reduced by about 55%.
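
The abstract does not detail the management policy itself; purely as an illustration, the sketch below shows a simple threshold-based policy (the thresholds, power figures and function names are assumptions, not values or code from the paper) that marks persistently idle servers for a low-power sleep state.

```python
# Hypothetical illustration of a threshold-based dynamic power management policy.
# Thresholds and power figures are assumptions, not values from the paper.

IDLE_POWER_W = 150      # roughly 60% of a 250 W peak, as quoted in the abstract
SLEEP_POWER_W = 10      # assumed low-power (sleep) draw

def plan_power_states(utilizations, low=0.05):
    """Return a power state per server: keep loaded servers active,
    put persistently idle ones to sleep."""
    states = []
    for u in utilizations:
        if u < low:
            states.append("sleep")    # candidate for consolidation
        else:
            states.append("active")   # keep a margin for response time
    return states

def estimated_power(states):
    return sum(SLEEP_POWER_W if s == "sleep" else IDLE_POWER_W for s in states)

if __name__ == "__main__":
    util = [0.02, 0.15, 0.01, 0.40, 0.03]
    states = plan_power_states(util)
    print(states, estimated_power(states), "W")
```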

From Micro to Nanosystems: An Exploratory Study of Influences on Innovation Teams

What influences microsystems (MEMS) and nanosystems (NEMS) innovation teams apart from technology complexity? Based on in-depth interviews with innovators, this research explores the key influences on innovation teams in the early phases of MEMS/NEMS. Projects are rare and may last from 5 to 10 years or more from idea to concept. Because fundamental technology development in MEMS/NEMS is highly complex and interdisciplinary, involving expertise from different basic and engineering disciplines, R&D is a 'testing of ideas' with many uncertainties rather than a clearly structured process. The purpose of this study is to explore the innovation teams' environment and give specific insights for future management practices. The findings are grouped into three major areas: people, know-how and experience, and market. The results highlight the importance of innovation team composition, transdisciplinary knowledge, and project evaluation and management, and how these differ from their counterparts in new product development teams.

On-line Identification of Continuous-time Hammerstein Systems via RBF Networks and Immune Algorithm

This paper deals with an on-line identification method for continuous-time Hammerstein systems using radial basis function (RBF) networks and an immune algorithm (IA). The unknown nonlinear static part to be estimated is approximately represented by the RBF network. The IA is efficiently combined with the recursive least-squares (RLS) method. The objective function for the identification is regarded as the antigen. Candidates for the RBF parameters, such as the centers and widths, are coded into binary bit strings as antibodies and searched by the IA. In parallel, candidates for both the weighting parameters of the RBF network and the system parameters of the linear dynamic part are updated by the RLS method. Simulation results are shown to illustrate the proposed method.
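
The abstract describes the estimator only at a high level; as a hedged illustration, the sketch below uses a simplified discrete-time stand-in for the Hammerstein structure, with fixed RBF centers and widths (the part the IA would search) and an RLS update of the RBF weights and linear-part parameter. It is not the paper's continuous-time formulation.

```python
import numpy as np

def rbf(u, centers, widths):
    """Gaussian RBF basis values for a scalar input u (centers/widths assumed fixed,
    e.g. chosen by the immune algorithm in the paper)."""
    return np.exp(-((u - centers) ** 2) / (2.0 * widths ** 2))

class RLS:
    """Standard recursive least squares with forgetting factor."""
    def __init__(self, n, lam=0.99, delta=100.0):
        self.theta = np.zeros(n)
        self.P = delta * np.eye(n)
        self.lam = lam

    def update(self, phi, y):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)
        err = y - phi @ self.theta
        self.theta += k * err
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return err

# Example: identify a first-order Hammerstein-type model
#   y(k) = -a1*y(k-1) + sum_m w_m * phi_m(u(k-1))
# (a simplified discrete-time stand-in for the continuous-time system in the paper)
centers = np.linspace(-2, 2, 5)
widths = np.full(5, 0.7)
rls = RLS(n=1 + len(centers))

rng = np.random.default_rng(0)
y_prev, a1_true, w_true = 0.0, -0.6, np.array([0.1, 0.5, 1.0, 0.5, 0.1])
for k in range(500):
    u = rng.uniform(-2, 2)
    basis = rbf(u, centers, widths)
    y = -a1_true * y_prev + w_true @ basis + 0.01 * rng.standard_normal()
    phi = np.concatenate(([-y_prev], basis))   # regressor: [-y(k-1), RBF outputs]
    rls.update(phi, y)
    y_prev = y

print("estimated parameters:", rls.theta)
```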

Increased Solubility, Dissolution and Physicochemical Studies of Curcumin-Polyvinylpyrrolidone K-30 Solid Dispersions

Solid dispersions (SD) of curcumin-polyvinylpyrrolidone in ratios of 1:2, 1:4, 1:5, 1:6, and 1:8 were prepared in an attempt to increase solubility and dissolution. Solubility, dissolution, powder X-ray diffraction (XRD), differential scanning calorimetry (DSC) and Fourier transform infrared spectroscopy (FTIR) of the solid dispersions, physical mixtures (PM) and curcumin were evaluated. Both the solubility and the dissolution of the curcumin solid dispersions were significantly greater than those observed for the physical mixtures and intact curcumin. The powder X-ray diffractograms indicated that amorphous curcumin was obtained from all solid dispersions. The optimum weight ratio for curcumin:PVP K-30 was found to be 1:6. The 1:6 solid dispersion remained in the amorphous form after storage at ambient temperature for 2 years, and its dissolution profile did not differ significantly from that of the freshly prepared dispersion.

A Model-Driven Method for Scheduling Analysis and HW/SW Partitioning

Unified Modeling Language (UML) extensions for real-time embedded systems (RTES) co-design are attracting growing interest from many industrial and research communities. The extension mechanism is provided by UML profiles for RTES and aims to offer an easily understood system design method for non-experts. One of the key items of co-design methods is hardware/software partitioning and task scheduling: it is mandatory to define where and when tasks are implemented and run. Unfortunately, the main goals of co-design are not addressed in the usual practice of UML profiles, so there is a need to map the models used onto an execution platform for both schedulability testing and HW/SW partitioning. In the present work, schedulability testing and design space exploration are performed at an early stage. The proposed approach adopts Model Driven Engineering (MDE). It starts from a UML specification annotated with the recent profile for the Modeling and Analysis of Real-Time Embedded systems (MARTE). Following a refinement strategy, transformation rules make it possible to find a feasible schedule that satisfies the timing constraints and to define where tasks will be implemented. The overall approach is demonstrated on the design of a football-player robot application.
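
The abstract does not name the schedulability test applied by the transformation rules; as one common example of such a test, the sketch below checks the Liu and Layland utilization bound for rate-monotonic scheduling (an illustration only, not necessarily the paper's method).

```python
def rm_utilization_bound_ok(tasks):
    """Liu & Layland sufficient schedulability test for rate-monotonic scheduling.
    tasks: list of (execution_time, period) pairs."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization <= bound

# Example task set: (C, T) in milliseconds
tasks = [(1, 4), (2, 8), (1, 10)]
print(rm_utilization_bound_ok(tasks))   # True if the sufficient bound is met
```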

Time and Frequency Domain Analysis of Heart Rate Variability and their Correlations in Diabetes Mellitus

Diabetes mellitus (DM) is frequently characterized by autonomic nervous dysfunction. Analysis of heart rate variability (HRV) has become a popular noninvasive tool for assessing the activities of the autonomic nervous system (ANS). In this paper, changes in ANS activity are quantified by means of frequency- and time-domain analysis of R-R interval variability. Electrocardiograms (ECG) of 16 patients suffering from DM and of 16 healthy volunteers were recorded. Frequency-domain analysis of the extracted normal-to-normal interval (NN interval) data indicates significant differences in very low frequency (VLF) power, low frequency (LF) power and high frequency (HF) power between the DM patients and the control group. The time-domain measures, namely the standard deviation of NN intervals (SDNN), the root mean square of successive NN interval differences (RMSSD), the number of successive NN intervals differing by more than 50 ms (NN50 count), the percentage value of the NN50 count (pNN50), the HRV triangular index and the triangular interpolation of NN intervals (TINN), also show significant differences between the DM patients and the control group.
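
For readers unfamiliar with the time-domain measures listed above, the following minimal sketch computes SDNN, RMSSD, the NN50 count and pNN50 from a series of NN intervals; it illustrates the standard definitions and is not taken from the study's analysis pipeline.

```python
import numpy as np

def hrv_time_domain(nn_ms):
    """Time-domain HRV measures from a sequence of NN intervals in milliseconds."""
    nn = np.asarray(nn_ms, dtype=float)
    diff = np.diff(nn)
    sdnn = nn.std(ddof=1)                      # SDNN
    rmssd = np.sqrt(np.mean(diff ** 2))        # RMSSD
    nn50 = np.sum(np.abs(diff) > 50)           # NN50 count
    pnn50 = 100.0 * nn50 / len(diff)           # pNN50 (%)
    return {"SDNN": sdnn, "RMSSD": rmssd, "NN50": int(nn50), "pNN50": pnn50}

# Example with a short synthetic NN series (ms)
print(hrv_time_domain([812, 790, 845, 830, 760, 880, 805]))
```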

In Vitro Study of Coded Transmission in Synthetic Aperture Ultrasound Imaging Systems

This paper presents a study of the synthetic transmit aperture method applying Golay coded transmission for medical ultrasound imaging. Longer coded excitation allows the total energy of the transmitted signal to be increased without increasing the peak pressure. Moreover, the signal-to-noise ratio and penetration depth are improved while maintaining high ultrasound image resolution. In this work, a 128-element linear transducer array with 0.3 mm inter-element spacing was excited by a one-cycle pulse and by 8- and 16-bit Golay coded sequences at a nominal frequency of 4 MHz. To generate a spherical wave covering the full image region, a single-element transmission aperture was used and all elements received the echo signals. A comparison of 2D ultrasound images of a tissue-mimicking phantom and in vitro measurements of beef liver is presented to illustrate the benefits of coded transmission. The results were obtained using the synthetic aperture algorithm with transmit and receive signal correction based on a single-element directivity function.
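
The key property exploited by Golay coded excitation is that the autocorrelations of a complementary pair sum to an ideal spike, which cancels range sidelobes after matched filtering of the two transmissions. The sketch below illustrates this property with the standard recursive construction; the exact 8- and 16-bit sequences used in the study may differ.

```python
import numpy as np

def golay_pair(n_bits):
    """Generate a Golay complementary pair of length n_bits (a power of two)."""
    a, b = np.array([1.0]), np.array([1.0])
    while len(a) < n_bits:
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

a, b = golay_pair(8)
# Complementary property: the sum of the two autocorrelations is zero at every lag
# except the center, which is what removes range sidelobes after matched filtering.
acf = np.correlate(a, a, "full") + np.correlate(b, b, "full")
print(acf)   # zeros everywhere except 2*N at zero lag
```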

Traffic Behaviour of VoIP in a Simulated Access Network

Insufficient Quality of Service (QoS) of Voice over Internet Protocol (VoIP) is a growing concern that has led to the need for research and study. In this paper we investigate the performance of VoIP and the impact of resource limitations in Access Networks. VoIP performance in Access Networks is particularly important in regions where Internet resources are limited and the cost of improving them is prohibitive. Perceived VoIP performance, as measured by the mean opinion score [2] in experiments where subjects are asked to rate communication quality, is determined by the end-to-end delay on the communication path, delay variation, packet loss, echo, the coding algorithm in use and noise. These performance indicators can be measured and their effect in the Access Network estimated. This paper investigates the contribution of congestion in the Access Network to the overall performance of VoIP services in the presence of other substantial uses of the Internet, and ways in which Access Networks can be designed to improve VoIP performance. Methods for analyzing the impact of the Access Network on VoIP performance are surveyed and reviewed. The paper also considers some approaches for improving VoIP performance through experiments using the Network Simulator version 2 (NS2), with a view to gaining a better understanding of Access Network design.
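
As an illustration of how delay and loss map to a mean opinion score, the sketch below uses a simplified E-model style calculation; the loss-impairment coefficients and the codec term are assumptions for illustration, not values from the paper or from the NS2 experiments.

```python
def mos_from_r(r):
    """Map an E-model R factor to a MOS estimate (ITU-T G.107 style mapping)."""
    if r < 0:
        return 1.0
    if r > 100:
        return 4.5
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

def r_factor(one_way_delay_ms, packet_loss_pct, ie_codec=0.0):
    """Simplified R-factor: delay and loss impairments subtracted from a base value.
    Coefficients follow a commonly used simplification and are assumptions,
    not values taken from the paper."""
    id_delay = 0.024 * one_way_delay_ms
    if one_way_delay_ms > 177.3:
        id_delay += 0.11 * (one_way_delay_ms - 177.3)
    ie_eff = ie_codec + 30.0 * (packet_loss_pct / (packet_loss_pct + 15.0))  # assumed loss model
    return 93.2 - id_delay - ie_eff

print(mos_from_r(r_factor(150, 1.0)))   # roughly 4.3 for modest delay and 1% loss
```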

New Efficient Iterative Optimization Algorithm to Design the Two Channel QMF Bank

This paper proposes an efficient method for the design of a two-channel quadrature mirror filter (QMF) bank. To achieve a minimum value of the reconstruction error, close to perfect reconstruction, a linear optimization process is proposed. The prototype low-pass filter is designed using the Kaiser window function. A modified algorithm is developed to optimize the reconstruction error, expressed as a linear objective function, through an iterative method. The results obtained show that the performance of the proposed algorithm is better than that of existing methods.
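
As a hedged illustration of the starting point of such a design, the sketch below builds a Kaiser-window prototype low-pass filter with SciPy and evaluates the amplitude-reconstruction error of the resulting two-channel QMF bank; the filter length, attenuation and cutoff are assumptions, and the paper's iterative optimization of the prototype is not reproduced here.

```python
import numpy as np
from scipy.signal import firwin, kaiser_beta, freqz

# Prototype low-pass filter via a Kaiser window (illustrative parameters only).
N = 32                      # filter length (assumed)
beta = kaiser_beta(60)      # Kaiser beta for roughly 60 dB attenuation
h0 = firwin(N, cutoff=0.5, window=("kaiser", beta))   # cutoff at quarter sampling rate

# High-pass analysis filter of the two-channel QMF bank: h1[n] = (-1)^n h0[n]
h1 = h0 * (-1.0) ** np.arange(N)

# Amplitude-reconstruction condition (one common normalization):
# |H0(w)|^2 + |H1(w)|^2 should stay close to 1 for all w.
w, H0 = freqz(h0, worN=1024)
_, H1 = freqz(h1, worN=1024)
T = np.abs(H0) ** 2 + np.abs(H1) ** 2
# Without iteratively adjusting the cutoff (as the paper does), this error stays
# large near the band edge; the optimization tunes the prototype to reduce it.
print("peak reconstruction error:", np.max(np.abs(T - 1.0)))
```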

Adoption of iPads Paving the Way to Changes in the Knowledge Practices within a School of Vocational Teacher Education

The possibilities of mobile technology generate new demands on vocational teacher trainers to transform their approach to work and to incorporate its usage into their ordinary educational practice. This paper presents findings from a focus group discussion (FGD) session on the usage of iPads within a school of vocational teacher education (SoVTE). It aims to clarify how the teacher trainers are using iPads and what has changed in their work during the usage of iPads. The analytical framework is based on content analysis and the expansive learning cycle. The study found not only what kind of role iPads played in the trainers' daily practices, but also brought to attention how a cultural change regarding the usage of social media and mobile technology was desperately needed in the whole work community. Thus, the FGD was adopted as a means of developing the knowledge practices of the SoVTE community.

Protein Secondary Structure Prediction Using Parallelized Rule Induction from Coverings

Protein 3D structure prediction has always been an important research area in bioinformatics. In particular, the prediction of secondary structure has been a well-studied research topic. Despite the recent breakthrough of combining multiple sequence alignment information and artificial intelligence algorithms to predict protein secondary structure, the Q3 accuracy of various computational prediction algorithms has rarely exceeded 75%. In a previous paper [1], this research team presented a rule-based method called RT-RICO (Relaxed Threshold Rule Induction from Coverings) to predict protein secondary structure. The average Q3 accuracy on the sample datasets using RT-RICO was 80.3%, an improvement over comparable computational methods. Although this demonstrated that RT-RICO might be a promising approach for predicting secondary structure, the algorithm's computational complexity and program running time limited its use. Herein a parallelized implementation of a slightly modified RT-RICO approach is presented. This new version of the algorithm facilitated the testing of a much larger dataset of 396 protein domains [2]. Parallelized RT-RICO achieved a Q3 score of 74.6%, which is higher than the consensus prediction accuracy of 72.9% achieved for the same test dataset by a combination of four secondary structure prediction methods [2].
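
The Q3 figures quoted above are the standard three-state per-residue accuracy; the following minimal sketch shows how that measure is computed (it does not implement RT-RICO itself).

```python
def q3_accuracy(predicted, actual):
    """Q3: percentage of residues whose three-state label (H, E, or C) is predicted correctly."""
    assert len(predicted) == len(actual)
    correct = sum(p == a for p, a in zip(predicted, actual))
    return 100.0 * correct / len(actual)

print(q3_accuracy("HHHECCCEE", "HHHCCCCEE"))   # 8 of 9 residues correct -> 88.9%
```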

The Importance of Enterprise Support for Tourism Workers' Successful Use of a Cash Transaction System: An Information Systems Continuance Approach

In this paper we investigate how wide-ranging organizational support, and the more specific form of support, namely management support, may influence tourism workers' satisfaction with a cash transaction system. The IS continuance theory proposed by Bhattacherjee in 2001 is utilized as the theoretical framework. This implies that both perceived usefulness and ease of use are included in the research model, in addition to organizational and management support. The sample consists of 500 workers from 10 cruise and tourist ferries in Scandinavia that use a cash transaction system to perform their work tasks. Using structural equation modelling, the results indicate that organizational support and ease-of-use perceptions are critical for the users' level of satisfaction with the cash transaction system. The findings have implications for business managers and IS practitioners who want to increase the quality of IT-based business processes within the tourism industry.

Analysis of Classifications of Unsolicited Bulk Emails

In recent times, the problem of Unsolicited Bulk Email (UBE), commonly known as spam email, has grown at a tremendous rate. We present a survey-based analysis of classifications of UBE in various research works. There are many research instances of classification between spam and non-spam emails, but very few research instances are available for classification of spam emails per se. This paper does not intend to assert that some UBE classification is better than the others, nor does it propose any new classification, but it bemoans the lack of harmony in the number and definition of categories proposed by different researchers. The paper also elaborates on factors such as the intent of the spammer, the content of UBE, and the ambiguity among the different categories proposed in related research works on the classification of UBE.

A Bayesian Hierarchical 13COBT to Correct Estimates Associated with a Delayed Gastric Emptying

The use of a Bayesian Hierarchical Model (BHM) to interpret breath measurements obtained during a 13C Octanoic Breath Test (13COBT) is demonstrated. The statistical analysis was implemented using WinBUGS, a commercially available computer package for Bayesian inference. A hierarchical setting was adopted in which poorly defined parameters associated with delayed Gastric Emptying (GE) were able to "borrow" strength from global distributions. This proved to be a sufficient tool to correct the model failures and data inconsistencies apparent in conventional analyses employing a non-linear least-squares (NLS) technique. Direct comparison of two parameters describing gastric emptying (tlag, the lag phase, and t1/2, the half-emptying time) revealed a strong correlation between the two methods. Despite the large dataset (n = 164), Bayesian modeling was fast and provided a successful fit for all subjects. In contrast, NLS failed to return acceptable estimates in cases where GE was delayed.
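
The conventional NLS analysis referred to above is commonly based on fitting a Ghoos-type curve, y(t) = a t^b e^(-c t), to the 13C excretion rate and deriving tlag and t1/2 from the fitted curve. Under that assumption, the sketch below shows such a fit with SciPy on synthetic data; it illustrates the conventional approach being corrected, not the Bayesian hierarchical model implemented in WinBUGS.

```python
import numpy as np
from scipy.optimize import brentq, curve_fit
from scipy.integrate import quad

def excretion_rate(t, a, b, c):
    """Conventional 13COBT model for the 13C excretion rate (assumed Ghoos-type form)."""
    return a * t ** b * np.exp(-c * t)

# Synthetic breath-test data (illustrative, not from the study)
t = np.linspace(0.25, 6.0, 24)                 # hours
rng = np.random.default_rng(1)
y = excretion_rate(t, 20.0, 1.8, 1.2) + 0.3 * rng.standard_normal(t.size)

(a, b, c), _ = curve_fit(excretion_rate, t, y, p0=(10.0, 1.0, 1.0))

t_lag = b / c                                   # time of peak excretion rate
total = quad(excretion_rate, 0, np.inf, args=(a, b, c))[0]
cum = lambda x: quad(excretion_rate, 0, x, args=(a, b, c))[0]
t_half = brentq(lambda x: cum(x) - 0.5 * total, 0.01, 24)   # half-emptying time
print("tlag =", t_lag, "h,  t1/2 =", t_half, "h")
```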

1-D Modeling of Hydrate Decomposition in Porous Media

This paper describes a one-dimensional numerical model for natural gas production from the dissociation of methane hydrate in a hydrate-capped gas reservoir under depressurization and thermal stimulation. Some of the hydrate reservoirs discovered overlie a free-gas layer and are known as hydrate-capped gas reservoirs. These reservoirs are thought to be the easiest, and probably the first, type of hydrate reservoirs to be produced. The mathematical equations that describe this type of reservoir include the mass balance, the heat balance and the kinetics of hydrate decomposition. These non-linear partial differential equations are solved using a fully implicit finite-difference scheme. The model accounts for convection and conduction heat transfer, changes in formation porosity, the effect of using different equations of state such as PR and ER, and steam or hot water injection. In addition, the distributions of pressure, temperature, and gas, hydrate and water saturations in the reservoir are evaluated. It is shown that the gas production rate is a sensitive function of the well pressure.
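
As an isolated illustration of the fully implicit finite-difference treatment mentioned above, the sketch below advances a single 1-D pressure-diffusion-type equation with a backward Euler step under a fixed well pressure; the physics is deliberately simplified and the parameters are assumptions, so it is not the paper's coupled mass, heat and kinetics model.

```python
import numpy as np

def implicit_step(p, dt, dx, alpha, p_well):
    """One fully implicit (backward Euler) step of dp/dt = alpha * d2p/dx2
    with a fixed well pressure at the left boundary and a no-flow right boundary."""
    n = p.size
    r = alpha * dt / dx ** 2
    A = np.zeros((n, n))
    rhs = p.copy()
    A[0, 0] = 1.0
    rhs[0] = p_well                       # depressurization at the production well
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
    A[-1, -1], A[-1, -2] = 1 + r, -r      # no-flow (Neumann) boundary
    return np.linalg.solve(A, rhs)

# Example: 1-D reservoir initially at uniform pressure, well pressure lowered (values assumed)
p = np.full(50, 12.0)                     # MPa
for _ in range(100):
    p = implicit_step(p, dt=60.0, dx=1.0, alpha=1e-3, p_well=4.0)
print(p[:5])
```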

Modeling and Stability Analysis of Delayed Game Network

This paper aims to establish a delayed dynamical relationship between the payoffs of players in a zero-sum game. By introducing a Markovian chain and time delay into the network model, a delayed game network model with a nonlinear function subject to sector-bound and slope-bound restrictions is first proposed. As a result, a direct dynamical relationship between the payoffs of players in a zero-sum game can be illustrated through a delayed singular system. Combining Finsler's Lemma with Lyapunov stability theory, a sufficient condition guaranteeing the unique existence and stability of the zero-sum game's Nash equilibrium is derived. A numerical example is presented to illustrate the validity of the main result.
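
For readers unfamiliar with the system class, a generic delayed singular system with a sector-bounded nonlinearity can be written as below; the notation is assumed for illustration and is not necessarily the exact model proposed in the paper.

```latex
% Generic delayed singular system with a sector-bounded nonlinearity
% (E may be singular, \tau > 0 is the delay, f lies in the sector [K_1, K_2]):
\[
E\,\dot{x}(t) = A\,x(t) + A_d\,x(t-\tau) + B\,f\bigl(x(t)\bigr),
\qquad
\bigl[f(x) - K_1 x\bigr]^{\top}\bigl[f(x) - K_2 x\bigr] \le 0 .
\]
```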

Improvement of Plant Layout Using Systematic Layout Planning (SLP) for Increased Productivity

The objective of this research is to study the plant layout of an iron manufacturing plant based on systematic layout planning (SLP) pattern theory in order to increase productivity. In this case study, the amount of equipment and tools used in iron production is studied. A detailed study of the plant layout, including the operation process chart, the flow of materials and the activity relationship chart, has been carried out. A new plant layout has been designed and compared with the present plant layout. The SLP method showed that the new plant layout significantly decreases the distance of material flow from the billet cutting process to storage in the warehouse.

Creating the Color Panoramic View using Medley of Grayscale and Color Partial Images

Panoramic view generation has always offered novel and distinct challenges in the field of image processing. Panoramic view generation is the construction of a larger mosaic image from a set of partial images of the desired view. This paper presents a solution to one of the problems of image seascape formation, where some of the partial images are color and others are grayscale. The simplest solution would be to convert all image parts into grayscale images and fuse them to obtain a grayscale panorama. But in a multihued world, a colored seascape will always be preferred. This can be achieved by picking colors from the color parts and transferring them into the grayscale parts of the seascape. So first the grayscale image parts are colored with the help of the color image parts, and then these parts are fused to construct the seascape image. The problem of coloring grayscale images has no exact solution. In the proposed technique of panoramic view generation, the job of transferring color traits from a reference color image to a grayscale image is done by a palette-based method. In this technique, the color palette is prepared using pixel windows of a chosen size taken from the color image parts. The grayscale image part is then divided into pixel windows of the same size. For every window of the grayscale image part, the palette is searched and equivalent color values are found, which are used to color the grayscale window. For palette preparation we have used the RGB color space and Kekre's LUV color space; Kekre's LUV color space gives better coloring quality. The search time through the color palette is improved over exhaustive search by using Kekre's fast search technique. After coloring the grayscale image pieces, the next job is the fusion of all these pieces to obtain the panoramic view. For similarity estimation between partial images, the correlation coefficient is used.
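
As a hedged sketch of the palette idea described above, the code below builds a palette of (intensity, color) pairs from windows of a color image part and colors each grayscale window with the entry of closest intensity; it uses plain RGB means and exhaustive search, whereas the paper also employs Kekre's LUV color space and Kekre's fast search technique.

```python
import numpy as np

def build_palette(color_img, win=2):
    """Palette of (mean-intensity, mean-color) pairs from small windows of the
    color image parts (a simplified sketch of the palette-based transfer idea)."""
    h, w, _ = color_img.shape
    palette = []
    for i in range(0, h - win + 1, win):
        for j in range(0, w - win + 1, win):
            block = color_img[i:i + win, j:j + win].reshape(-1, 3)
            palette.append((block.mean(), block.mean(axis=0)))
    return palette

def colorize(gray_img, palette, win=2):
    """Color each grayscale window with the palette entry of closest intensity
    (exhaustive search; the paper accelerates this step with a fast search)."""
    out = np.zeros((*gray_img.shape, 3))
    keys = np.array([k for k, _ in palette])
    colors = np.array([c for _, c in palette])
    for i in range(0, gray_img.shape[0] - win + 1, win):
        for j in range(0, gray_img.shape[1] - win + 1, win):
            g = gray_img[i:i + win, j:j + win].mean()
            idx = np.abs(keys - g).argmin()
            out[i:i + win, j:j + win] = colors[idx]
    return out

# Example with random arrays standing in for color and grayscale image parts
rng = np.random.default_rng(0)
palette = build_palette(rng.random((8, 8, 3)))
print(colorize(rng.random((8, 8)), palette).shape)
```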

Population Trend of Canola Aphid, Lipaphis Erysimi (Kalt.) (Homoptera: Aphididae) and its Associated Natural Enemies in Different Brassica Lines along with the Effect of Gamma Radiation on Their Population

Studies on the population trend of Lipaphis erysimi (Kalt.) and its associated natural enemies in different Brassica lines, along with the effect of gamma radiation on their populations, were conducted at the Agricultural Research Farm, Malakandher, Khyber Pakhtunkhwa Agricultural University Peshawar, during spring 2006. Three different Brassica lines, F6B3, F6B6 and F6B7, were used, replicated four times in a Randomized Complete Block Design. The data revealed that aphid infestation invariably started in all three varieties during the last week of February 2006 (first observation). The peak population of 4.39 aphids per leaf was recorded during the 2nd week of March and the lowest population of 1.02 aphids per leaf was recorded during the 5th week of March. The lady bird beetle (Coccinella septempunctata) and the syrphid fly (Syrphus balteatus) first appeared on 24th February, with mean numbers of 0.40 lady bird beetles per leaf and 0.87 syrphid flies per leaf, respectively. At the time when the aphid population started to increase, the peak populations of C. septempunctata (0.70 lady bird beetles per leaf) and S. balteatus (1.04 syrphid flies per leaf) were recorded in the 2nd week of March. Chrysoperla carnea appeared in the 1st week of March and its peak population, 1.46 C. carnea per leaf, was recorded during the 3rd week of March. Among the Brassica lines, F6B7 showed comparatively more resistance than F6B3 and F6B6. F6B3 showed the least resistance against L. erysimi and was found to be the most susceptible cultivar. F6B7 was also found superior in terms of natural enemies: the maximum number of all natural enemies was recorded on this variety, followed by F6B6, while the lowest number was recorded on F6B3. No significant effect of gamma radiation on the populations of aphids or natural enemies, or on the varieties, was recorded.

Meta Random Forests

Leo Breiman's Random Forests (RF) algorithm is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It has shown robust and improved classification results on standard data sets. Ensemble learning algorithms such as AdaBoost and Bagging have been actively researched and have shown improved classification results on several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we apply these meta-learning techniques to random forests. We study the performance of ensembles of random forests on standard data sets from the UCI repository, compare the original random forest algorithm with its ensemble counterparts, and discuss the results.
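
As a concrete illustration of the experimental setup described above, the sketch below wraps a random forest inside Bagging and AdaBoost meta-learners with scikit-learn (assuming a recent scikit-learn version where the base learner is passed as estimator) and compares cross-validated accuracy on a UCI-origin dataset bundled with the library; the datasets and settings used in the paper may differ.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Compare a plain random forest with bagged and boosted ensembles of random forests.
X, y = load_breast_cancer(return_X_y=True)

rf = RandomForestClassifier(n_estimators=25, random_state=0)
bagged_rf = BaggingClassifier(estimator=rf, n_estimators=5, random_state=0)
boosted_rf = AdaBoostClassifier(estimator=rf, n_estimators=5, random_state=0)

for name, clf in [("RF", rf), ("Bagged RF", bagged_rf), ("Boosted RF", boosted_rf)]:
    print(name, cross_val_score(clf, X, y, cv=5).mean())
```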