Energy Conscious Builder Design Pattern with C# and Intermediate Language

Design Patterns have gained increasing acceptance since they emerged in the software development world last decade and have become another de facto standard of essential knowledge for Object-Oriented Programming developers today. Their target usage, from the beginning, was regular computers, so minimizing power consumption was never a concern. In this decade, however, demand for more complicated software running on mobile devices has grown rapidly as ever higher-performance portable gadgets have been supplied to the market continuously. To cope with time-to-market pressure, which is a business reason, software development for power-conscious, battery-powered devices has shifted from specific low-level languages to higher-level ones. Currently, complicated software running on mobile devices is often developed in high-level languages that support OOP concepts, which is driving the trend of embracing Design Patterns in the mobile world. However, using Design Patterns directly in software development for power-conscious systems is not recommended, because they were not originally designed for such environments. This paper demonstrates a Design Pattern adapted for power-limited systems. Because there are numerous original design patterns, it is not possible to cover them all at once, so this paper focuses only on creating an Energy Conscious version of the existing regular "Builder Pattern" appropriate for developing low-power-consumption software.
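
For reference, a minimal sketch of the regular Builder pattern being adapted is shown below in Python (the paper itself works in C# and Intermediate Language); the class and method names are illustrative only, and the paper's energy-conscious variant is not reproduced here.

```python
# Minimal sketch of the regular Builder pattern (illustrative names only;
# the paper's C#/IL energy-conscious adaptation is not reproduced here).

class Report:
    """Product assembled step by step by the builder."""
    def __init__(self):
        self.parts = []

    def show(self):
        print(" + ".join(self.parts))


class ReportBuilder:
    """Concrete builder: each step adds one part to the product."""
    def __init__(self):
        self._report = Report()

    def add_header(self):
        self._report.parts.append("header")
        return self

    def add_body(self):
        self._report.parts.append("body")
        return self

    def build(self):
        return self._report


class Director:
    """Director fixes the construction sequence, independent of the parts."""
    def construct(self, builder):
        return builder.add_header().add_body().build()


if __name__ == "__main__":
    Director().construct(ReportBuilder()).show()  # prints: header + body
```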

Evaluation of Tension Capacity of Pile (Case Study in Sandy Soil)

High-rise building construction is increasing along the southern beaches of the Caspian Sea because of tourist attractions and the limited residential area. Owing to the saturated alluvial deposits, transferring the load from tall structures to the soil by piles is inevitable. Although most of these piles carry compression forces, tension piles are used in special conditions. Few studies have been conducted because of the limited use of these piles. In this study, the tension capacity of open-ended pipe piles was tested at full scale. The length of the bored piles ranged from 420 to 480 cm and all were 120 cm in diameter. The results of testing seven piles were compared with the predictions of relations given by other researchers.
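
For orientation only, a commonly used static relation for the uplift (tension) capacity of a pile in sand is given below; the abstract does not state which relations were compared in the study, so this expression is only an assumed general form.

```latex
% Generic static estimate of pile uplift capacity in sand (illustrative form only)
T_u = W_p + \int_0^L K \,\sigma'_v(z)\, \tan\delta \;\, \pi D \, dz
% T_u: ultimate tension capacity, W_p: pile weight, L: embedded length,
% D: pile diameter, K: lateral earth pressure coefficient,
% \sigma'_v: effective vertical stress, \delta: soil-pile interface friction angle
```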

Design of Multi-disease Diagnosis Processor using Hypernetworks Technique

In this paper, we propose a disease diagnosis hardware architecture based on the Hypernetworks technique. It can be used to diagnose three different diseases (SPECT Heart, Leukemia, and Prostate cancer). Generally, disparate diseases require a specialized diagnosis hardware model for each disease. Exploiting the similarities among the three disease diagnosis processors, we design a single diagnosis processor that can diagnose all three diseases. Our proposed architecture, which combines three processors into one, can reduce hardware size without a decrease in accuracy.
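
A minimal software sketch of the hypernetwork classification idea (a library of random hyperedges voting for a class) is given below; it is an assumption-laden Python illustration of the general technique and does not reflect the actual hardware architecture or the datasets used in the paper.

```python
import random

# Sketch of hypernetwork-style classification: random hyperedges (small feature
# subsets with stored values and a class label) vote for the class of a new
# sample. Purely illustrative; not the paper's hardware design.

def build_hypernetwork(samples, labels, order=2, edges_per_sample=20, rng=None):
    rng = rng or random.Random(0)
    edges = []
    n_features = len(samples[0])
    for x, y in zip(samples, labels):
        for _ in range(edges_per_sample):
            idx = rng.sample(range(n_features), order)
            edges.append((tuple(idx), tuple(x[i] for i in idx), y))
    return edges

def classify(edges, x):
    votes = {}
    for idx, values, label in edges:
        if all(x[i] == v for i, v in zip(idx, values)):  # hyperedge matches sample
            votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get) if votes else None

# Toy usage with binary feature vectors (hypothetical data)
train_x = [[1, 0, 1, 1], [0, 1, 0, 0], [1, 1, 1, 0], [0, 0, 0, 1]]
train_y = ["disease", "healthy", "disease", "healthy"]
hn = build_hypernetwork(train_x, train_y)
print(classify(hn, [1, 0, 1, 0]))
```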

A Bayesian Network Reliability Modeling for FlexRay Systems

The increasing importance of FlexRay systems in the automotive domain continues to inspire related research. One primary issue is verifying the reliability of FlexRay systems, either from the protocol aspect or from the system design aspect. However, research rarely discusses the effect of network topology on system reliability. In this paper, we illustrate how to model the reliability of FlexRay systems with various network topologies using a well-known probabilistic reasoning technique, the Bayesian Network. In this illustration, we especially investigate the effectiveness of the error containment built into the star topology and of the fault-tolerant midpoint synchronization algorithm adopted in the FlexRay communication protocol. Through a FlexRay steer-by-wire case study, the influence of different topologies on the failure probability of the FlexRay steer-by-wire system is demonstrated. The notable value of this research is to show that Bayesian Network inference is a powerful and feasible method for the reliability assessment of FlexRay systems.
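
To illustrate the kind of inference involved (not the paper's actual model), the sketch below collapses a tiny Bayesian network for a two-node FlexRay-like cluster into a direct probability calculation, comparing a shared bus against an active star with error containment; all failure probabilities, the independence assumptions, and the containment factor are hypothetical.

```python
# Tiny illustrative reliability calculation (hypothetical numbers, not the
# paper's model). With independent component-failure nodes and a deterministic
# "system fails" node, the Bayesian network reduces to the products below.

P_NODE_FAIL = 0.01   # probability a node fails (assumed)
P_BUS_FAIL = 0.005   # probability the shared bus fails (assumed)
P_STAR_FAIL = 0.008  # probability the star coupler fails (assumed)
P_CONTAIN = 0.9      # probability the star contains a faulty node's errors (assumed)

def system_fails_bus():
    """Bus topology: any node fault or a bus fault brings the system down."""
    p_ok = (1 - P_NODE_FAIL) ** 2 * (1 - P_BUS_FAIL)
    return 1 - p_ok

def system_fails_star():
    """Star topology: a faulty node only harms the system if containment fails."""
    p_node_harm = P_NODE_FAIL * (1 - P_CONTAIN)      # residual harm per node
    p_ok = (1 - p_node_harm) ** 2 * (1 - P_STAR_FAIL)
    return 1 - p_ok

print(f"bus topology  failure probability: {system_fails_bus():.4f}")
print(f"star topology failure probability: {system_fails_star():.4f}")
```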

A Case Study on Product Development Performance Measurement

In recent years, increased competition and lower profit margins have necessitated a focus on improving the performance of the product development process, an area that has traditionally been excluded from detailed steering and evaluation. Systematic improvement requires a good understanding of current performance, which is why interest in product development performance measurement has increased dramatically. This paper presents a case study that evaluates the performance of the product development performance measurement system used in a Swedish company that is part of a global corporate group. The study is based on internal documentation and eighteen in-depth interviews with stakeholders involved in the product development process. The results from the case study include a description of which metrics are in use, how they are employed, and their effect on the quality of the performance measurement system. In particular, having a well-defined process proved to have a major impact on the quality of the performance measurement system in this case.

Spread Spectrum Code Estimation by Particle Swarm Algorithm

In the context of spectrum surveillance, a new method to recover the code of a spread spectrum signal is presented for the case where the receiver has no knowledge of the transmitter's spreading sequence. In our previous paper, we used a genetic algorithm (GA) to recover the spreading code. Although GAs are well known for their robustness in solving complex optimization problems, increasing the length of the code often leads to an unacceptably slow convergence speed. To solve this problem, we introduce Particle Swarm Optimization (PSO) into code estimation for spread spectrum communication systems. In the search process for code estimation, the PSO algorithm has the merits of rapid convergence to the global optimum without being trapped in local suboptima, and good robustness to noise. In this paper, we describe how to implement PSO as a component of a search algorithm for code estimation. Swarm intelligence offers a number of advantages due to the use of mobile agents, among them scalability, fault tolerance, adaptation, speed, modularity, autonomy, and parallelism. These properties make swarm intelligence very attractive for spread spectrum code estimation and also suitable for a variety of other kinds of channels. Our results compare swarm-based algorithms with genetic algorithms and show the performance of the PSO algorithm in the code estimation process.
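
A compact, generic PSO loop is sketched below in Python to show the velocity and position update rules referred to above; the fitness function and all parameters are placeholders rather than the paper's actual code-estimation objective, which would instead score candidate spreading codes against the received signal.

```python
import random

# Generic PSO sketch (illustrative only). In a code-estimation setting, the
# fitness would measure how well a candidate spreading code matches the
# received signal; here a placeholder sphere function is minimized instead.

def pso(fitness, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, rng=None):
    rng = rng or random.Random(1)
    pos = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Placeholder objective: minimize the squared norm (stands in for a
# correlation-based code-matching metric).
best, best_val = pso(lambda x: sum(v * v for v in x), dim=8)
print(best_val)
```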

Hydrodynamic Simulation of Fixed Bed GTL Reactor Using CFD

In this work, an axisymmetric simulation of a fixed bed gas-to-liquid (GTL) reactor has been conducted using computational fluid dynamics (CFD). In fixed bed CFD modeling, when N (the tube-to-particle diameter ratio) is large, it is common to treat the packed bed as a porous medium. Synthesis gas (a mixture of predominantly carbon monoxide and hydrogen) was fed to the reactor. The reactor length was 20 cm, divided into three sections, with the porous zone in the middle section. The model equations were solved using the finite volume method. The effects of particle diameter, bed voidage, fluid velocity, and bed length on pressure drop have been investigated. Simulation results showed that these parameters can have a remarkable impact on the reactor pressure drop.
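
The dependence of pressure drop on particle diameter, bed voidage, velocity, and bed length mentioned above is commonly captured by the Ergun equation, quoted below for reference; whether the porous-zone resistances in this particular model were derived from it is an assumption.

```latex
% Ergun equation for pressure drop across a packed bed of length L
\frac{\Delta P}{L} =
  150\,\frac{\mu\,(1-\varepsilon)^{2}}{\varepsilon^{3} d_p^{2}}\,u
  + 1.75\,\frac{\rho\,(1-\varepsilon)}{\varepsilon^{3} d_p}\,u^{2}
% u: superficial velocity, \varepsilon: bed voidage, d_p: particle diameter,
% \mu: fluid viscosity, \rho: fluid density
```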

Using the PGAS Programming Paradigm for Biological Sequence Alignment on a Chip Multi-Threading Architecture

The Partitioned Global Address Space (PGAS) programming paradigm offers ease of use in expressing parallelism through a global shared address space while emphasizing performance by providing locality awareness through the partitioning of this address space. Consequently, interest in PGAS programming languages is growing, and many new languages have emerged and are becoming available on nearly all modern parallel architectures. Recently, new parallel machines with multiple cores have been designed to target high performance applications. Most of the effort so far has gone into benchmarking, but there are only a few examples of real high performance applications running on multicore machines. In this paper, we present and evaluate a parallelization technique for implementing a local DNA sequence alignment algorithm using a PGAS-based language, UPC (Unified Parallel C), on a chip multi-threading architecture, the UltraSPARC T1.
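
For readers unfamiliar with the sequential kernel being parallelized, a minimal local alignment (Smith-Waterman-style) scoring sketch is given below in Python; the paper's UPC implementation, scoring scheme, and partitioning of the work across hardware threads are not reproduced, and the scoring parameters here are placeholders.

```python
# Minimal local alignment (Smith-Waterman-style) score matrix in plain Python.
# Placeholder scoring; the paper's UPC code and its distribution of the
# dynamic-programming matrix across threads are not reproduced here.

def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(local_alignment_score("ACACACTA", "AGCACACA"))  # small DNA example
```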

Microstructure and Mechanical Behaviour of Rotary Friction Welded Titanium Alloys

Ti-6Al-4V alloy has demonstrated a high strength-to-weight ratio as well as good properties at high temperatures. The successful application of the alloy in some important areas depends on suitable joining techniques. Friction welding has many advantageous features that make it a suitable choice for joining titanium alloys. The present work investigates the feasibility of producing similar-metal joints of this titanium alloy by the rotary friction welding method. The joints are produced at three different speeds, and their performance is evaluated through microstructure studies, Vickers hardness measurements, and tensile tests at the joints. It is found that the weld joints produced are sound and that the ductile fractures in the tensile weld specimens occur at locations away from the welded joints. It is also found that a rotational speed of 1500 RPM can produce a very good weld, with the other parameters kept constant.

Requirements Driven Multiple View Paradigm for Developing Security Architecture

This paper describes a paradigmatic approach to developing the architecture of secure systems by describing the requirements from four different points of view: those of the owner, the administrator, the user, and the network. Deriving requirements and developing architecture imply jointly eliciting and describing both the problem and the structure of the solution. The viewpoints proposed in this paper were chosen for their contributions as major parties in the design, implementation, usage, and maintenance of secure systems. The dramatic growth of Internet technology and of the applications deployed on the World Wide Web has led to a situation where security has become a very important concern in system development. Many security approaches are currently used in organizations, yet despite the widespread use of many different security solutions, security remains a problem. It is argued that the approach described in this paper for developing secure architecture is practical in every respect. The models representing these multiple points of view are termed the requirements model (views of the owner and administrator) and the operations model (views of the user and network). In this paper, this multiple view paradigm is explained by first describing the specific requirements and/or characteristics of secure systems (particularly in the domain of networks) and then the secure architecture / system development methodology.

Entropy based Expeditive Methodology for Rating Curves Assessment

River flow forecasting is crucial for improving management policies aimed at the proper use of water resources as well as for combining prevention and defense actions against environmental degradation. The difficulties encountered during field activities encourage the development and implementation of operative computation and measurement methods that reduce the time needed for data acquisition and processing while maintaining a good level of accuracy. Therefore, the aim of the present work is to test a new entropy-based expeditive methodology for evaluating rating curves at three gauged sections with different geometric and morphological characteristics. The methodology requires choosing only three verticals along the measurement section and sampling only the maximum velocity. The results show that, in most conditions, the rating curves obtained can replace those built with classic methodologies, thus simplifying the procedures of data monitoring and calculation.
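
The entropy-based step that converts the sampled maximum velocity into a cross-sectional mean velocity (and hence a discharge for the rating curve) is usually written as in the relation below, due to Chiu; the abstract does not state the exact formulation used, so this is quoted only as the standard form.

```latex
% Entropy-based relation between mean and maximum velocity (Chiu)
\frac{u_{mean}}{u_{max}} = \Phi(M) = \frac{e^{M}}{e^{M}-1} - \frac{1}{M}
% M: entropy parameter of the gauged section; once M is calibrated,
% sampling only u_{max} yields u_{mean} and thus the discharge Q = u_{mean} A.
```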

Why Traditional Technology Acceptance Models Won't Work for Future Information Technologies?

This paper illustrates why existing technology acceptance models are only of limited use for predicting and explaining the adoption of future information and communication technologies. It starts with a general overview of technology adoption processes and presents several theories of the acceptance and adoption of traditional information technologies. This is followed by an overview of recent developments in the area of information and communication technologies. Based on the arguments elaborated in these sections, it is shown why the factors used to predict the adoption of existing systems will not be sufficient to explain the adoption of future information and communication technologies.

Large-Eddy Simulations of Subsonic Impinging Jets

We consider here the subsonic impinging jet representing the flow field of a vertical take-off aircraft or the initial stage of rocket launching. Implicit Large-Eddy Simulation (ILES) is used to calculate the time-dependent flow field and the radiated sound pressure associated with jet impingement. With proper boundary treatments and a high-order numerical scheme, the near-field sound pressure is successfully obtained. Results are presented for both a rectangular and a circular jet.

MAS Simulations of Optical Antenna Structures

A semi-analytic boundary discretization method, the Method of Auxiliary Sources (MAS), is used to analyze optical antennas consisting of metallic parts. In addition to standard dipole-type antennas consisting of two pieces of metal, a new structure consisting of a single metal piece with a tiny groove in the center is analyzed. It is demonstrated that optical antennas pose difficult numerical problems because they exhibit strong material dispersion, loss, and plasmon-polariton effects that require very accurate numerical simulation. The grooved structure takes advantage of the Channel Plasmon-Polariton (CPP) effect and exhibits a strong enhancement of the electric field in the groove. A primitive 3D antenna model with spherical nanoparticles is also analyzed.

The Application of Learning Systems to Support Decisions for Stakeholders and Infrastructure Managers Based on Crowdsourcing

The current growth of infrastructure in developing countries requires sophisticated ways to manage its operation and to control the quality of the service delivered. This research concentrates on the operation of this infrastructure beyond its construction. Infrastructure operation involves an uncertain environment, where unexpected variables are present every day and everywhere. Decision makers need to make the right decisions with the right information and data, analyzed mostly in real time. To adequately support their decisions and decrease negative impacts and collateral effects, they need to use computational tools called decision support systems (DSS); today, however, the main source of information comes from ordinary users through extensive crowdsourcing.

A Linear Use Case Based Software Cost Estimation Model

Software development is moving towards agility, with use cases and scenarios being used for requirements stories. Estimates of software costs are becoming even more important than before, as the effect of delays is much larger in the context of the successive short releases of agile development. Thus, this paper reports on the development of a new linear use case based software cost estimation model that is applicable in the very early stages of software development and is based on a simple metric. Evaluation showed that the accuracy of the estimates varies between 43% and 55% of the actual effort of historical test projects. These results outperformed those of well-known models when applied in the same context. Further work is being carried out to improve the performance of the proposed model by taking into account the effect of non-functional requirements.
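
The abstract does not give the model's actual metric or coefficients, but a linear use case based estimator has the general shape sketched below; the "number of use cases" metric, the ordinary least squares fit, and the sample data are all placeholders rather than the paper's model.

```python
# Sketch of a linear, use-case-based effort estimator fitted by least squares.
# The metric, the historical data, and the resulting coefficients are
# placeholders, not the paper's model.

def fit_linear(xs, ys):
    """Ordinary least squares fit of effort = a + b * metric."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical historical projects: (number of use cases, person-hours)
use_cases = [12, 20, 35, 50, 8]
effort    = [300, 520, 900, 1250, 210]

a, b = fit_linear(use_cases, effort)
print(f"effort ~= {a:.1f} + {b:.1f} * use_cases")
print(f"estimate for 25 use cases: {a + b * 25:.0f} person-hours")
```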

Depth Controls of an Autonomous Underwater Vehicle by Neurocontrollers for Enhanced Situational Awareness

This paper focuses on a critical component of situational awareness (SA): the neural control of autonomous constant-depth flight of an autonomous underwater vehicle (AUV). Autonomous constant-depth flight is a challenging but important task for AUVs aiming to achieve a high level of autonomy under adverse conditions. The fundamental requirements for constant-depth flight are knowledge of the depth and a properly designed controller to govern the process. The AUV named VORAM is used as a model for the verification of the proposed hybrid control algorithm. Three neural network controllers, namely NARMA-L2 controllers, are designed for fast and stable diving maneuvers of the chosen AUV model. This hybrid control strategy has been verified by simulating diving maneuvers in the Simulink software package and demonstrated good performance for fast SA in real-time search-and-rescue operations.
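
For context, the NARMA-L2 (feedback-linearization) scheme referred to above computes the control input from two neural approximations of the plant dynamics, roughly as sketched below; the exact network structures and the AUV depth dynamics used in the paper are not reproduced, and the symbols are generic.

```latex
% NARMA-L2 approximate plant model and resulting control law (generic form)
\hat{y}(k+d) = f\big[y(k),\dots,y(k-n+1),\,u(k-1),\dots,u(k-m+1)\big]
             + g\big[\,\cdot\,\big]\,u(k)
\quad\Rightarrow\quad
u(k) = \frac{y_r(k+d) - f[\,\cdot\,]}{g[\,\cdot\,]}
% f and g are realized by neural networks; y_r is the reference (desired depth).
```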

Managing Handheld Devices in Ad-Hoc Collaborative Computing Environments

The noticeable advance in computer technology has paved the way for the invention of powerful mobile devices. However, limited storage, short battery life, and relatively low computational power are the major problems of such devices. Due to ever-increasing computational requirements, such devices may fail to process the needed tasks under certain constraints. One of the proposed solutions to this drawback is the introduction of Collaborative Computing, a new concept dealing with the distribution of computational tasks amongst several handhelds. This paper introduces the basics of Collaborative Computing and proposes a new protocol that aims at managing and optimizing computing tasks in Ad-Hoc Collaborative Computing Environments.

A Weighted Least Square Algorithm for Low-Delay FIR Filters with Piecewise Variable Stopbands

Variable digital filters are useful for various signal processing and communication applications where the frequency characteristics, such as fractional delays and cutoff frequencies, can be varied. In this paper, we propose a design method for variable FIR digital filters with an approximately linear phase characteristic in the passband. The proposed variable FIR filters achieve large attenuation in the stopband, and this attenuation can be varied by spectrum parameters. In the proposed design method, a quasi-equiripple characteristic is obtained by using an iterative weighted least squares method. The usefulness of the proposed design method is verified through design examples.
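
A bare-bones, single-pass weighted least squares FIR design over a frequency grid is sketched below to show the kind of problem the iterative procedure solves; the variable-stopband parameterization, the low-delay passband target, the filter length, and the weight-updating rule of the paper are not reproduced, and all numbers are placeholders.

```python
import numpy as np

# Bare-bones weighted least squares FIR design on a dense frequency grid.
# Fixed weights and a simple lowpass target with a reduced-delay passband
# phase are used; the paper's iterative weight update and variable stopbands
# are not reproduced.

N = 31            # filter length (placeholder)
delay = 12        # desired (reduced) group delay in samples (placeholder)
w = np.linspace(0, np.pi, 512)

# Desired response: lowpass with phase e^{-j*delay*w} in the passband.
passband = w <= 0.3 * np.pi
stopband = w >= 0.4 * np.pi
D = np.where(passband, np.exp(-1j * delay * w), 0.0)

# Weights: emphasize stopband attenuation, ignore the transition band.
W = np.where(stopband, 50.0, 1.0)
W[~(passband | stopband)] = 0.0

# The frequency response is linear in the coefficients: A(w_k) = sum_n h[n] e^{-j w_k n}.
E = np.exp(-1j * np.outer(w, np.arange(N)))
A = np.sqrt(W)[:, None] * E
b = np.sqrt(W) * D

# Stack real and imaginary parts so the solution h is real-valued.
A_ri = np.vstack([A.real, A.imag])
b_ri = np.concatenate([b.real, b.imag])
h, *_ = np.linalg.lstsq(A_ri, b_ri, rcond=None)
print(h.round(4))
```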

Internet Bandwidth Network Quality Management: The Case Study of Telecom Organization of Thailand

This paper addresses a current problem among Thai internet service providers with regard to bandwidth network quality management. The IPSTAR department of the Telecom Organization of Thailand public company (TOT), the largest internet service provider in Thailand, is the case study used to analyze the problem. The Internet bandwidth network quality management (iBWQM) framework is applied to the identified problem. Bandwidth management policy (BMP) and quality of service (QoS) are the two antecedents of iBWQM. This paper investigates internet user behavior, marketing demand, and network operation views in order to determine bandwidth management policy (e.g., quota management, scheduling, and management of malicious traffic). Bandwidth congestion is also analyzed in order to enhance quality of service (QoS). Moreover, the iBWQM framework is able to improve quality of service, increase bandwidth utilization, minimize the rate of complaints about slow speeds, and provide network planning guidelines for Thai internet service providers.