A Cell-Based Multiphase Interleaving Buck Converter with Bypass Capacitors

Today's Voltage Regulator Modules (VRMs) face increasing design challenges as the number of transistors in microprocessors grows per Moore's Law. These challenges have recently become even more demanding as microprocessors operate in the sub-volt range at significantly high currents. This paper presents a new multiphase topology with a cell configuration for improved performance in low-voltage, high-current applications. A lab-scale hardware prototype of the new topology was designed and constructed. Laboratory tests were performed on the proposed converter and compared with a commercially available VRM. Results from the proposed topology exhibit improved performance compared to the commercially available counterpart.

Effects of Global Warming on Climate Change in Udon Thani Province over the 60-Year Period A.D. 1951-2010

This research investigated, determined, and analyzed the characteristic climate changes in Udon Thani Province over the 60-year period from A.D. 1951 to 2010, using the climatological data to assess the effects of global warming. No statistically significant trends were found in the 60 years' data (R2

Model-free Prediction based on Tracking Theory and Newton Form of Polynomial

The majority of existing predictors for time series are model-dependent and therefore require some prior knowledge for the identification of complex systems, usually involving system identification, extensive training, or online adaptation in the case of time-varying systems. Additionally, since a time series is usually generated by complex processes such as the stock market or other chaotic systems, identification, modeling or the online updating of parameters can be problematic. In this paper a model-free predictor (MFP) for a time series produced by an unknown nonlinear system or process is derived using tracking theory. An identical derivation of the MFP using the property of the Newton form of the interpolating polynomial is also presented. The MFP is able to accurately predict future values of a time series, is stable, has few tuning parameters and is desirable for engineering applications due to its simplicity, fast prediction speed and extremely low computational load. The performance of the proposed MFP is demonstrated using the prediction of the Dow Jones Industrial Average stock index.
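Although the abstract does not give the predictor's equations, one-step-ahead extrapolation of the Newton-form interpolating polynomial through the last n equally spaced samples reduces to a simple binomial formula. The sketch below (the function name and `order` parameter are our own, not the paper's notation) illustrates this kind of model-free, low-computation prediction:

```python
from math import comb

def mfp_predict(history, order=3):
    """One-step-ahead prediction by extrapolating the Newton-form
    interpolating polynomial through the last `order` samples.

    For equally spaced samples this reduces to
        x_hat[k+1] = sum_{j=1}^{n} (-1)^(j+1) * C(n, j) * x[k+1-j],
    which is exact for any series generated by a polynomial of
    degree < n (e.g. order=2 gives linear extrapolation 2*x[k] - x[k-1]).
    """
    n = order
    if len(history) < n:
        raise ValueError("need at least `order` past samples")
    recent = history[-n:]  # x[k-n+1] .. x[k]
    return sum((-1) ** (j + 1) * comb(n, j) * recent[n - j]
               for j in range(1, n + 1))
```

With no training or system identification, the prediction cost is a handful of multiply-adds per step, consistent with the low computational load claimed in the abstract.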

Finite Element Modeling of Two-Dimensional Nanoscale Structures with Surface Effects

Nanomaterials have attracted considerable attention during the last two decades due to their unusual electrical, mechanical and other physical properties compared with their bulk counterparts. The mechanical properties of nanostructured materials show strong size dependency, which has been explained within the framework of continuum mechanics by including the effects of surface stress. The size-dependent deformations of two-dimensional nanosized structures with surface effects are investigated in this paper by the finite element method. A truss element is used to evaluate the contribution of surface stress to the total potential energy, and the Gurtin-Murdoch surface stress model is implemented in ANSYS through its user-programmable features. The proposed approach is used to investigate the size-dependent stress concentration around a nanosized circular hole and the size-dependent effective moduli of nanoporous materials. Numerical results are compared with available analytical results to validate the proposed modeling approach.
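The surface-stress contribution can be pictured as boundary "truss" elements whose axial stiffness comes from a surface modulus. The following sketch is not the authors' ANSYS implementation; the function name and the linearized surface modulus `Es` (units N/m in 2D plane problems) are illustrative assumptions. It assembles the global 4-DOF stiffness matrix of one such boundary element:

```python
import numpy as np

def surface_truss_stiffness(x1, y1, x2, y2, Es):
    """Global stiffness matrix (4 DOF: u1, v1, u2, v2) of a 2-node
    boundary 'truss' element carrying surface elasticity Es.

    Es plays the role that EA plays for an ordinary bar, so the
    local axial stiffness is Es / L.
    """
    L = np.hypot(x2 - x1, y2 - y1)
    c, s = (x2 - x1) / L, (y2 - y1) / L
    # Transformation from global DOFs to the two local axial displacements.
    T = np.array([[c, s, 0, 0],
                  [0, 0, c, s]])
    k_local = (Es / L) * np.array([[1.0, -1.0],
                                   [-1.0, 1.0]])
    return T.T @ k_local @ T
```

Assembling these along the hole boundary adds the surface-energy term to the bulk FEM stiffness, which is what produces the size dependency: the surface contribution scales with perimeter while the bulk scales with area.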

Towards a Suitable and Systematic Approach for Component Based Software Development

The term software crisis refers to the situation in which developers are unable to complete projects within time and budget constraints; moreover, these over-schedule and over-budget projects are of low quality as well. Several methodologies have been adopted from time to time to overcome this situation, and the current focus is component-based software engineering. In this approach, the emphasis is on reuse of already existing software artifacts. But results cannot be achieved just by preaching the principles; they need to be practiced as well. This paper highlights some of the very basic elements of this approach which have to be in place to achieve the desired goals of high-quality, low-cost software products with shorter time-to-market.

Highly Secure Cover File for Hidden Data Using a Statistical Technique and the AES Encryption Algorithm

Nowadays, the rapid development of multimedia and the internet allows wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information; digital documents are also easy to copy and distribute, and therefore face many threats. This is a significant security and privacy issue. With the large flood of information and the development of digital formats, it has become necessary to find appropriate protection because of the significance, accuracy and sensitivity of the information. Protection systems are now classified more specifically as information hiding, information encryption, or a combination of hiding and encryption to increase information security. The strength of the science of information hiding lies in the non-existence of standard algorithms for hiding secret messages, and in the randomness of hiding methods, such as combining several media (covers) with different methods to pass a secret message; in addition, there are no formal methods for discovering hidden data. For these reasons, the task of this research is difficult. In this paper, a new information hiding system is presented. The proposed system aims to hide information (a data file) in an executable (EXE) file and to detect the hidden file; an implementation of a steganography system which embeds information in executable files is presented, and (EXE) files have been investigated. The system tries to solve the cover-file size problem while remaining undetectable by anti-virus software.
The system includes two main functions. The first is hiding information in a Portable Executable (EXE) file through four processes (specify the cover file, specify the information file, encrypt the information, and hide the information). The second is extracting the hidden information through three processes (specify the stego file, extract the information, and decrypt the information). The system achieves its main goals: the size of the cover file is independent of the size of the hidden information, and the resulting file does not conflict with anti-virus software.
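As a rough illustration of the hide/extract pair only (this is not the paper's statistical embedding technique, which the abstract does not detail, and the marker constant is invented), a naive scheme could place an already-AES-encrypted payload after the PE image behind a marker:

```python
MARKER = b"--HIDDEN--"  # hypothetical separator; a real system would not use a fixed marker

def hide(cover: bytes, payload: bytes) -> bytes:
    """Append an (already-encrypted) payload after the executable image.

    Appended bytes past the declared PE sections are ignored by the
    loader, so the cover file still runs; cover size and payload size
    are independent, as in the paper's stated goal.
    """
    return cover + MARKER + payload

def extract(stego: bytes) -> bytes:
    """Recover the payload from a stego file produced by hide()."""
    pos = stego.rfind(MARKER)
    if pos < 0:
        raise ValueError("no hidden payload found")
    return stego[pos + len(MARKER):]
```

A fixed marker and plain appending are trivially detectable; the point of the sketch is only the two-function structure (hide, then extract and decrypt) described in the abstract.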

Testing of Materials for Rapid Prototyping Fused Deposition Modelling Technology

This paper presents knowledge about the types of tests of material properties for selected rapid prototyping technologies. The rapid prototyping technologies used today for the production of models and final parts employ materials whose initial state is a solid, liquid or powder structure. In the solid state, various forms are used, such as pellets, wire or laminates. The basic range of materials includes paper, nylon, wax, resins, metals and ceramics. In Fused Deposition Modeling (FDM) rapid prototyping, the basic materials used are mainly ABS (Acrylonitrile Butadiene Styrene), polyamide, polycarbonate, polyethylene and polypropylene. For advanced FDM applications, special materials are used, such as silicon nitride, PZT (Piezoceramic Material - Lead Zirconate Titanate), aluminium oxide, hydroxyapatite and stainless steel.

Salbutamol Sulphate-Ethylcellulose Tabletted Microcapsules: Pharmacokinetic Study using Convolution Approach

The aim of this article is to describe the utility of a novel simulation approach, the convolution method, to predict blood drug concentration from dissolution data of salbutamol sulphate microparticulate formulations with different release patterns (1:1, 1:2 and 1:3 drug:polymer). USP dissolution apparatus II (USP 2007) with 900 ml of double-distilled water stirred at 50 rpm was employed for dissolution analysis. From the dissolution data, blood drug concentration was predicted, and the predicted blood drug concentration data were in turn used to calculate the pharmacokinetic parameters Cmax, Tmax, and AUC. Convolution is a good biowaiver technique; however, its utility would be improved by applying it under conditions where biorelevant dissolution media are used.
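The convolution method can be sketched as the discrete convolution of the in-vitro input rate with a unit impulse response. The one-compartment response and every parameter name below are illustrative assumptions, not the article's fitted values:

```python
import numpy as np

def predict_concentration(frac_dissolved, dt, ke, Vd, dose):
    """Convolution sketch: blood concentration = (input rate) * (UIR).

    frac_dissolved : cumulative fraction dissolved at each time step
    dt             : sampling interval (h)
    ke, Vd, dose   : assumed one-compartment elimination rate (1/h),
                     volume of distribution (L) and dose (mg)
    """
    # In-vitro release rate (mg/h), taking dissolution as the input function.
    rate = np.diff(frac_dissolved, prepend=0.0) * dose / dt
    t = np.arange(len(frac_dissolved)) * dt
    # Unit impulse response of a one-compartment model (conc. per unit bolus).
    uir = np.exp(-ke * t) / Vd
    # Discrete convolution, truncated to the observation window.
    conc = np.convolve(rate, uir)[: len(t)] * dt
    return conc
```

From the returned profile, `conc.max()` gives Cmax, its index times `dt` gives Tmax, and a trapezoidal sum gives AUC, i.e. the three parameters reported in the study.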

RUPSec: An Extension on RUP for Developing Secure Systems - Requirements Discipline

The world is moving rapidly toward the deployment of information and communication systems. Nowadays, fast-growing computing systems are found everywhere, and one of the main challenges for these systems is the increasing number of attacks and security threats against them. Thus, capturing, analyzing and verifying security requirements has become a very important activity in the development process of computing systems, especially in developing systems such as banking, military and e-business systems. For developing any system, a process model, which includes a process, methods and tools, is chosen. The Rational Unified Process (RUP) is one of the most popular and complete process models used by developers in recent years. This process model should be extended for use in developing secure software systems. In this paper, the Requirements Discipline of RUP is extended to improve RUP for developing secure software systems. The proposed extensions add and integrate a number of Activities, Roles, and Artifacts into RUP in order to capture, document and model the threats and security requirements of a system. These extensions introduce a group of clear and stepwise activities to developers. By following these activities, developers ensure that security requirements are captured and modeled. These models are used in design, implementation and test activities.

Renewable Energy Supply Options in Kuwait

This paper compares planning results for the electricity and water generation inventory up to the year 2030 in the State of Kuwait. Currently, the generation inventory consists of oil- and gas-fired technologies only. The planning study considers two main cases. The first, the Reference case, examines a generation inventory based on oil- and gas-fired generation technologies only. The second case examines the inclusion of renewables in the generation inventory under two scenarios. In the first scenario, Ref-RE, the renewable build-out is based on the optimum economic performance of the overall generation system; results show an optimum installed renewable capacity providing 11% of electric energy generation. In the second scenario, Ref-RE20, the renewable capacity build-out is forced to provide 20% of electric energy by 2030. The respective energy system costs of the Reference, Ref-RE and Ref-RE20 scenarios reach US$24, 10 and 14 billion annually in 2030.

Retrospective Synthetic Focusing with Correlation Weighting for Very High Frame Rate Ultrasound

The need for high frame-rate imaging has been triggered by the new applications of ultrasound imaging to transient elastography and real-time 3D ultrasound. Plane wave excitation (PWE) is one method to achieve very high frame-rate imaging, since an image can be formed with a single insonification. However, due to the lack of transmit focusing, image quality with PWE is lower than with conventional focused transmission. To solve this problem, we propose a filter-retrieved transmit focusing (FRF) technique combined with cross-correlation weighting (FRF+CC weighting) for high frame-rate imaging with PWE. A retrospective focusing filter is designed to simultaneously minimize the predefined sidelobe energy associated with single PWE and the filter energy related to the signal-to-noise ratio (SNR). This filter attempts to maintain the mainlobe signals and to reduce the sidelobe ones, which gives similar mainlobe signals and different sidelobes between the original PWE and the FRF baseband data. The normalized cross-correlation coefficient at zero lag is calculated to quantify the degree of similarity at each imaging point and used as a weighting matrix on the FRF baseband data to further suppress sidelobes, thus improving the filter-retrieved focusing quality.
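The zero-lag normalized cross-correlation weighting can be sketched as follows; windowing, beamforming and filter-design details are omitted, and the function names are our own:

```python
import numpy as np

def ncc_weight(pwe, frf, eps=1e-12):
    """Zero-lag normalized cross-correlation between original-PWE and
    FRF baseband samples over a small window around one imaging point.

    Near 1 where both data sets agree (mainlobe, by the filter design),
    near 0 where they differ (sidelobes), so it acts as a per-pixel
    suppression weight.
    """
    num = np.abs(np.vdot(pwe, frf))
    den = np.linalg.norm(pwe) * np.linalg.norm(frf) + eps
    return num / den

def apply_weighting(frf_image, weights):
    """FRF+CC weighting: scale the FRF baseband image point-by-point."""
    return frf_image * weights
```

Because the filter preserves mainlobe signals while altering sidelobes, multiplying the FRF data by this similarity map keeps the mainlobe nearly intact and attenuates the residual sidelobes, as described in the abstract.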

Development of an Organizational Knowledge Capabilities Assessment (OKCA) Method for Innovative Technology Enterprises

Knowledge capabilities are increasingly important for innovative technology enterprises seeking to enhance business performance in terms of product competitiveness, innovation and sales. Recognizing a company's capabilities through auditing allows it to pursue further advancement and strategic planning and hence gain competitive advantages. This paper develops an Organizational Knowledge Capabilities Assessment (OKCA) method to assess the knowledge capabilities of technology companies. The OKCA is a questionnaire-based assessment tool developed to uncover the impact of various knowledge capabilities on different aspects of organizational performance. The collected data are then analyzed to find the crucial elements for different technology companies. Based on the results, innovative technology enterprises are able to recognize directions for further improvement of business performance and future development plans. External environmental factors affecting organizational performance can be found through further analysis of selected reference companies.

New Models of Financial Management Put into Effect in Dental Practices in Romania - Empirical Study

The past 20 years of dentistry in Romania were a period of transition from a communist to a market economy, but Romanian doctors have insufficient management knowledge. Recently, the need for modern management has increased due to the appearance of superior technologies and materials, as well as patients' demands. The research goal is to increase efficiency by evaluating dental office cost categories in real pricing procedures. The empirical research is based on a guided study that includes information about the association between categories of cost perception and therapeutic procedures commonly used in dental offices. Based on the obtained results, costs were determined for each procedure by identifying all the labours that make up a settled procedure. Financial evaluation software was created with the following main functions: introducing and maintaining patient records, treatments and appointments made, procedure costs, and monitoring office productivity. We believe the study results can significantly improve the financial management of dental offices, increasing the effectiveness and quality of services.

Strongly Adequate Software Architecture

Components of a software system may be related in a wide variety of ways. These relationships need to be represented in the software architecture in order to develop quality software. In practice, software architecture is immensely challenging, strikingly multifaceted, extravagantly domain based, perpetually changing, rarely cost-effective, and deceptively ambiguous. This paper analyses relations among the major components of software systems and argues for using several broad categories of software architecture for assessment purposes: strongly adequate, weakly adequate and functionally adequate software architectures, among other categories. These categories are intended for formative assessments of architectural designs.

Public User Assessment of Malaysia's E-Government Applications

The implementation of electronic government started with the initiation of the Multimedia Super Corridor (MSC) by the Malaysian government. The introduction of ICT in the public sector, especially through e-Government initiatives, opens a new chapter in government administration throughout the world. The aim of this paper is to discuss the implementation of e-government in Malaysia, covering the results of public users' self-assessment of Malaysia's electronic government applications. E-Services, e-Procurement, Generic Office Environment (GOE), Human Resources Management Information System (HRMIS), Project Monitoring System (PMS), Electronic Labor Exchange (ELX) and e-Syariah (religious affairs) were the seven flagship applications assessed. The study adopted a cross-sectional survey research approach drawing on the information systems literature. The analysis was done for 35 respondents in a pilot test, and there was evidence from the public users' perspective to suggest that the e-government applications were generally successful.

Effects of Feeding Glycerol to Lactating Dairy Cows on Milk Production and Composition

A study was conducted to determine the effect of feeding glycerol on dairy cow performance. Twenty-four Holstein Friesian crossbred (>87.5% Holstein Friesian) lactating dairy cows in early lactation, averaging 13±2.4 kg of milk, 64±45 days in milk, 55±16 months of age and 325±26 kg live weight, were stratified by milk yield, days in milk, age, stage of lactation and body weight, and then randomly allocated to three treatment groups. All cows were fed approximately 8 kg of concentrate together with ad libitum corn silage and free access to clean water. Nil, 150 or 300 g of glycerol was supplemented to the cows according to treatment group. All cows consumed similar amounts of concentrate, corn silage, total DM and NELP. There were no significant differences in DM intake, CP intake, NELP intake, milk yield or milk composition yields. All cows had similar fat, protein, lactose, solids-not-fat and total solids percentages, and gained similar live weight. The present study indicated that supplementation with glycerol did not enhance milk yield, milk composition or live weight change.

Heat Transfer Modeling in Multi-Layer Cookware Using the Finite Element Method

A high temperature and a uniform Temperature Distribution (TD) on the cookware surface in contact with food are effective factors for improving cookware performance. Additionally, the ability of the pan material to retain heat and its non-reactivity with foods are other significant properties. It is difficult for a single material to meet such a wide variety of demands for superior thermal and chemical properties. A Multi-Layer Plate (MLP) gives a more uniform TD. In this study, the main objectives are to find the best structure (single- or multi-layer) and materials to provide the maximum temperature and a uniform TD on the upper surface of the pan, and to examine the heat retention of the metals used, with the goal of improving the thermal quality of the pan and economizing energy. To achieve this aim, the Finite Element Method (FEM) was employed to analyze the transient thermal behavior of the applied materials. The analysis was extended to different metals, and the best temperature profile and heat retention were achieved with a Copper/Stainless Steel MLP.
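As a simplified stand-in for the transient FEM analysis (which the abstract does not detail), a one-dimensional explicit finite-difference model of conduction through a multi-layer plate conveys the idea; the crude interface treatment below ignores conductivity jumps between layers and is purely illustrative:

```python
import numpy as np

def transient_1d(layers, dx, dt, steps, T0, T_heat):
    """Explicit finite-difference sketch of transient conduction
    through a multi-layer plate (bottom heated, top insulated).

    layers : list of (n_nodes, alpha) pairs, alpha = thermal
             diffusivity (m^2/s) of each layer
    Stability requires alpha * dt / dx**2 <= 0.5 for every layer.
    """
    alpha = np.concatenate([np.full(n, a) for n, a in layers])
    T = np.full(alpha.size, T0, dtype=float)
    for _ in range(steps):
        T[0] = T_heat                          # heated underside (Dirichlet)
        lap = np.zeros_like(T)
        lap[1:-1] = T[2:] - 2 * T[1:-1] + T[:-2]
        T += alpha * dt / dx**2 * lap          # interior diffusion update
        T[-1] = T[-2]                          # zero-flux (insulated) top
    return T
```

Running this with layer diffusivities for copper versus stainless steel shows why the copper layer spreads heat quickly while the steel face retains it, the qualitative behavior the study reports for the Copper/Stainless Steel MLP.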

Comparing and Combining the Axial with the Network Maps for Analyzing Urban Street Pattern

Rooted in the study of the social functioning of space in architecture, Space Syntax (SS) and the more recent Network Pattern (NP) research demonstrate the 'spatial structures' of a city, i.e. the hierarchical patterns of streets, junctions and alley ends. Applying SS and NP models, planners can conceptualize a real city's patterns. Although both models yield the optimal paths of the city, their underlying representations of the city's spatial configuration differ. The Axial Map analyzes the topological, non-distance-based connectivity structure, whereas the Central-Node Map and the Shortcut-Path Map analyze metrical, distance-based structures. This research contrasts and combines them to understand various forms of the city's structures. It concludes that, while they reveal different spatial structures, the Space Syntax and Network Pattern urban models support each other; combined, they simulate both the global access structure and the locally compact structures, namely the central nodes and the shortcuts of the city.
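The Axial Map's non-distance-based structure can be illustrated by computing topological depth on the axial-line adjacency graph. The sketch below is a standard breadth-first traversal, not the paper's software; Space Syntax integration values are conventionally derived from the mean depth it returns:

```python
from collections import deque

def mean_depth(adj, start):
    """Mean topological (non-metric) depth of all axial lines from
    `start`, where `adj` maps each axial line to the lines it
    intersects. Each step counts 1 regardless of street length,
    which is what makes the Axial Map distance-free.
    """
    depth = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in depth:
                depth[v] = depth[u] + 1
                q.append(v)
    return sum(depth.values()) / (len(depth) - 1)
```

The Central-Node and Shortcut-Path maps would instead weight each edge by metric street length, which is exactly the contrast between the two model families drawn in the abstract.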

Usability and Affordances: Examinations of Object-Naming and Object-Task Performance in Haptic Interfaces

The introduction of haptic elements into graphic user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect Human-Computer Interaction helps define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding Haptic and Graphic User Interface (GUI) designs as separate systems, as well as to understand how they work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J.J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed within the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptics and graphic user interfaces raises issues of motivation: GUIs tend to involve a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that they work independently of each other. Foveal modes interpret orientation in space, which provides for posture, locomotion, and motor skills with variations of the sensory information, which in turn instructs perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces because exploratory learning uses affordances in order to use an object, without cognitively mediating the experience. It is a direct experience that, through iteration, can lead to skill sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices. Therefore, object-task performance is not as congruently explored in GUIs as it is practiced in haptic interfaces.

Genetic Programming Approach for Multi-Category Pattern Classification Applied to Network Intrusion Detection

This paper describes a new approach to classification using genetic programming. The proposed technique consists of genetically coevolving a population of non-linear transformations of the input data to be classified, mapping them to a new space of reduced dimension in order to obtain maximum inter-class discrimination. The classification of new samples is then performed on the transformed data and thus becomes much easier. Contrary to existing GP classification techniques, the proposed one uses a dynamic partition of the transformed data into separate intervals; the efficacy of a given interval partition is handled by the fitness criterion, with maximum class discrimination. Experiments were first performed using Fisher's Iris dataset, and then the KDD-99 Cup dataset was used to study the intrusion detection and classification problem. The obtained results demonstrate that the proposed genetic approach outperforms the existing GP classification methods [1], [2] and [3], and gives very acceptable results compared to other existing techniques proposed in [4], [5], [6], [7] and [8].
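As a hedged illustration of interval-based fitness (the paper's exact criterion is not given in the abstract), one could score a 1-D GP-evolved transform by how nearly each class occupies a single contiguous interval along the transformed axis:

```python
def interval_fitness(values, labels):
    """Proxy fitness for a 1-D transform of the training samples.

    Sort samples by transformed value and count label changes ('runs')
    along the axis: when every class forms one contiguous interval,
    runs == number of classes and the score is 1.0; interleaved
    classes produce more runs and a lower score.
    """
    order = sorted(range(len(values)), key=lambda i: values[i])
    runs = 1
    for a, b in zip(order, order[1:]):
        if labels[a] != labels[b]:
            runs += 1
    return len(set(labels)) / runs
```

A GP loop would evolve candidate transformations and select on such a score; once classes occupy separable intervals, classifying a new sample reduces to locating its transformed value's interval, which is why classification of transformed data "becomes much easier".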