A Beacon Based Priority Routing Scheme for Solar Power Plants in WSNs

Solar power plants (SPPs) have delivered a variety of functions that meet industrial expectations by deploying ad-hoc networks of lightweight, battery-powered sensor nodes. In particular, there is a strong demand for an algorithm that delivers sensing data from the end nodes of a solar power plant to the sink node on time. Based on this observation, this paper proposes an IEEE 802.15.4 based self-routing scheme for solar power plants. The proposed Beacon-based Priority Routing Algorithm (BPRA) scheme uses beacon periods to send messages carrying high-priority data and thus provides a high quality of service (QoS) under the given criteria. The performance measures are packet throughput, packet delivery, latency, and total energy consumption. Simulation results under the TinyOS Simulator (TOSSIM) show that the proposed scheme outperforms conventional Ad hoc On-Demand Distance Vector (AODV) routing in solar power plants.
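
To illustrate the general idea of prioritising traffic around beacon periods, the following is a minimal sketch assuming a simple model in which each node keeps a priority queue and flushes the most urgent packets first when a beacon opens the active period; the class names, priority scale, and slot model are illustrative assumptions, not the paper's protocol.

    import heapq

    class Packet:
        def __init__(self, data, priority):
            self.data = data
            self.priority = priority      # lower value = more urgent

    class SensorNode:
        def __init__(self, node_id):
            self.node_id = node_id
            self.queue = []               # min-heap ordered by priority
            self.seq = 0                  # tie-breaker keeps FIFO order within a priority

        def enqueue(self, packet):
            heapq.heappush(self.queue, (packet.priority, self.seq, packet))
            self.seq += 1

        def on_beacon(self, slots_available):
            """When a beacon opens the active period, send the most urgent
            packets first, up to the number of slots granted."""
            sent = []
            for _ in range(min(slots_available, len(self.queue))):
                _, _, pkt = heapq.heappop(self.queue)
                sent.append(pkt)
            return sent

    node = SensorNode(1)
    node.enqueue(Packet("temperature=21C", priority=5))
    node.enqueue(Packet("ALARM: inverter fault", priority=0))
    print([p.data for p in node.on_beacon(slots_available=1)])   # the alarm goes first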

Towards a Load Balancing Framework for an SMS–Based Service Invocation Environment

The drastic increase in the use of SMS technology has led service providers to seek solutions that enable users of mobile devices to access services through SMS. This has resulted in proposed solutions for SMS-based service invocation in service-oriented environments. However, the dynamic nature of service-oriented environments, coupled with sudden load peaks generated by service requests, poses performance challenges to infrastructures supporting SMS-based service invocation. To address this problem we adopt load balancing techniques. A load balancing model with adaptive load balancing and load monitoring mechanisms as its key constructs is proposed. The load balancing model then led to the realization of the Least Loaded Load Balancing Framework (LLLBF). Evaluation of LLLBF, benchmarked against a round robin (RR) scheme under the queuing approach, showed that LLLBF outperformed RR in terms of response time and throughput. However, LLLBF achieved this better result at the cost of higher processing power.
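
A minimal sketch of the least-loaded dispatch idea, assuming each back-end node reports an outstanding-request count to the load monitor; the class and method names are illustrative, not the framework's actual API.

    class LeastLoadedBalancer:
        """Dispatch each incoming SMS-triggered request to the node
        currently reporting the fewest outstanding requests."""

        def __init__(self, nodes):
            self.load = {node: 0 for node in nodes}    # node -> outstanding requests

        def dispatch(self, request):
            node = min(self.load, key=self.load.get)   # pick the least-loaded node
            self.load[node] += 1
            return node

        def complete(self, node):
            self.load[node] -= 1                       # load monitor reports completion

    lb = LeastLoadedBalancer(["node-a", "node-b", "node-c"])
    for req in range(5):
        print(req, "->", lb.dispatch(req))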

Value-Relevance of Accounting Information: Evidence from Iranian Emerging Stock Exchange

This study empirically investigates the value relevance of accounting information to domestic investors in the Tehran Stock Exchange from 1999 to 2006. The impacts of two additional factors, positive versus negative earnings and firm size, are considered as well. The authors use earnings per share and the annual change in earnings per share as the income statement indices, and the book value of equity per share as the balance sheet index. Return and price models estimated through regression analysis are deployed to test the research hypotheses. The results show that accounting information is value-relevant to domestic investors in the Tehran Stock Exchange according to both models studied. However, income statement information has more value relevance than balance sheet information. Furthermore, positive versus negative earnings and firm size appear to have a significant impact on the value relevance of accounting information.
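
To make the valuation models concrete, a common form of the price model regresses share price on earnings per share and the book value of equity per share. The sketch below fits such a regression with ordinary least squares on illustrative numbers; the study's exact specification and data are not reproduced here.

    import numpy as np

    # Illustrative data: share price (P), earnings per share (EPS),
    # book value of equity per share (BVPS) for a handful of firm-years.
    P    = np.array([12.0, 8.5, 15.2, 6.1, 10.4])
    EPS  = np.array([1.1, 0.7, 1.6, 0.3, 0.9])
    BVPS = np.array([7.5, 5.2, 9.8, 4.0, 6.7])

    X = np.column_stack([np.ones_like(P), EPS, BVPS])    # intercept, EPS, BVPS
    beta, *_ = np.linalg.lstsq(X, P, rcond=None)

    P_hat = X @ beta
    r2 = 1 - np.sum((P - P_hat) ** 2) / np.sum((P - P.mean()) ** 2)
    print("coefficients:", beta, "R^2:", round(r2, 3))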

Implementation of TinyHash based on Hash Algorithm for Sensor Network

In recent years, security architectures for sensor networks have been proposed [2][4]. One of these, TinySec by Chris Karlof, Naveen Sastry, and David Wagner, is a link-layer security architecture that takes the constraints of sensor networks (i.e., energy, bandwidth, computation capability, etc.) into account. TinySec employs CBC mode for encryption and CBC-MAC for authentication, based on the Skipjack block cipher, and is currently incorporated into TinyOS for sensor network security. This paper introduces TinyHash, which is based on a general hash algorithm. TinyHash is a module intended to replace the authentication and integrity parts of TinySec; that is, it applies a hash algorithm within the TinySec architecture. For compatibility with TinySec, the components of TinyHash are structured similarly to those of TinySec. TinyHash implements the HMAC component for authentication and the Digest component for message integrity. Additionally, we define some interfaces for services associated with the hash algorithm.
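
To illustrate the kind of keyed-hash authentication TinyHash targets, the sketch below computes an HMAC tag over a message with Python's standard library; the actual TinyOS/nesC components, key lengths, and tag truncation used by the paper are not reproduced, and the 4-byte truncation shown is only an assumption typical of link-layer MACs.

    import hmac, hashlib

    key = b"16-byte-link-key"              # shared link-layer key (illustrative)
    message = b"node=7;reading=21.4C"      # payload to authenticate

    tag = hmac.new(key, message, hashlib.sha1).digest()[:4]   # truncated MAC tag
    print("message:", message, "tag:", tag.hex())

    # Receiver side: recompute the tag and compare in constant time.
    ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha1).digest()[:4])
    print("authenticated:", ok)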

Comparison of Detached Eddy Simulations with Turbulence Modeling

The flow field around hypersonic vehicles is very complex and difficult to simulate. The boundary layers are squeezed between the shock layer and the body surface, and resolving the boundary layer, shock wave and turbulent regions where the flow field exhibits steep gradients is difficult. Detached eddy simulation (DES) is a modification of a RANS model in which the model switches to a subgrid-scale formulation in regions fine enough for LES calculations. Regions near solid body boundaries, where the turbulent length scale is less than the maximum grid dimension, are assigned the RANS mode of solution; as the turbulent length scale exceeds the grid dimension, the regions are solved using the LES mode. Therefore the grid resolution is not as demanding as for pure LES, which considerably cuts down the cost of the computation. In this study, hypersonic flow is simulated at Mach 8 and different angles of attack to resolve the boundary layers and discontinuities properly; the flow is also simulated in the long wake regions. The mesh differs slightly from that used for RANS simulations and is made dense near the boundary layers and in the wake regions to resolve them properly. Hypersonic blunt cone-cylinder bodies with frustum angles of 5° and 10° are simulated and their aerodynamic characteristics are calculated for the different geometries. The results are then compared with experimental data as well as with a turbulence model (the SA model). The results achieved with the DES simulations have very good resolution and show excellent agreement with the experimental and available data. Unsteady DES calculations are performed using dual time stepping (implicit time stepping). The simulations are performed at Mach number 8 and angles of attack from 0° to 10° for all cases. The results and resolution of the DES model are found to be much better than those of the SA turbulence model.
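
For reference, the standard DES formulation built on the Spalart-Allmaras model (often called DES97) implements the RANS/LES switch by replacing the wall distance d in the SA model with a hybrid length scale; the constant quoted below is the commonly used calibration, not necessarily the value used in this study:

    \tilde{d} = \min\left(d,\; C_{DES}\,\Delta\right), \qquad \Delta = \max(\Delta x,\, \Delta y,\, \Delta z), \qquad C_{DES} \approx 0.65

Near walls d is the smaller quantity and the model behaves as RANS; where the grid spacing is finer than the turbulent length scale, C_DES Δ takes over and the model acts as a subgrid-scale formulation.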

Fluid Flow Analysis and Design of a Flow Distributor in a Domestic Gas Boiler Using a Commercial CFD Software

The aim of the study was to investigate the possible use of commercial Computational Fluid Dynamics (CFD) software in the design process of a domestic gas boiler. Because of limited computational resources, some simplifications had to be made in order to contribute to the design in a reasonable timescale. The porous media model was used to simulate the influence of the pressure-drop characteristic of particular elements of the heat transfer system on the water-flow distribution in the system. Further, a combination of CFD analyses and spreadsheet calculations was used to solve the flow distribution problem.
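
For reference, commercial CFD codes typically represent a porous zone by adding a Darcy-Forchheimer momentum sink fitted to the element's measured pressure-drop versus flow characteristic; a common form (the specific coefficients are fitted per element, and this generic expression is given only as an illustration of the approach) is

    S_i = -\left( \frac{\mu}{\alpha}\, v_i \;+\; C_2\, \frac{1}{2}\,\rho\, \lvert v \rvert\, v_i \right)

where α is the permeability and C_2 the inertial resistance factor.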

Single Frame Supercompression of Still Images, Video, High Definition TV and Digital Cinema

Super-resolution is nowadays used to produce a high-resolution image from several low-resolution noisy frames. In this work, we consider the problem of high-quality interpolation of a single noise-free image. Such images may come from different sources, e.g., they may be frames of videos, individual pictures, etc. In the encoder we apply downsampling via bidimensional interpolation of each frame, and in the decoder we apply an upsampling that restores the original size of the image. If the compression ratio is very high, we additionally use a convolutive mask that restores the edges and eliminates the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards; in fact, the mentioned mask is coded inside the texture memory of a GPGPU.
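
A minimal sketch of the encoder/decoder pipeline described above, assuming spline interpolation for the down/upsampling and a simple 3x3 sharpening kernel standing in for the edge-restoring convolutive mask; the paper's actual mask and GPGPU implementation are not reproduced here.

    import numpy as np
    from scipy.ndimage import zoom, convolve

    def encode(frame, ratio=2):
        """Downsample the frame by `ratio` using bidimensional (spline) interpolation."""
        return zoom(frame, 1.0 / ratio, order=3)

    def decode(small, ratio=2):
        """Upsample back to the original size, then apply a sharpening mask
        to restore edges blurred by the interpolation."""
        upsampled = zoom(small, ratio, order=3)
        sharpen = np.array([[ 0, -1,  0],
                            [-1,  5, -1],
                            [ 0, -1,  0]], dtype=float)   # illustrative edge-restoring mask
        return convolve(upsampled, sharpen, mode="nearest")

    frame = np.random.rand(64, 64)            # stand-in for one grayscale frame
    restored = decode(encode(frame))
    print(frame.shape, "->", encode(frame).shape, "->", restored.shape)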

Recent Accounting Standard Setting Changes for Consolidated Financial Statements

In the current context of globalization, a large number of companies have sought to develop as a group in order to reach other markets or to meet the criteria required for listing on a stock exchange. The issue of consolidated financial statements prepared by a parent, an investor or a venturer, and of the financial reporting standards guiding them, therefore becomes even more important. The aim of our paper is to expose this issue in a consistent manner, first by summarizing the international accounting and financial reporting standards applicable before the 1st of January 2013 and considering the role of the crisis in shaping the standard-setting process, and secondly by analyzing the newly issued or modified standards and the main changes they bring.

Technological Environment - International Marketing Strategy Relationship

International trade involves both large and small firms engaged in business overseas. Possible drivers that force companies to enter international markets include increasing competition in the domestic market, maturing domestic markets, and limited domestic market opportunities. Technology is an important factor in shaping international marketing strategy as well as a driving force towards a more global marketplace, especially communication technology, which includes telephones, the internet, computer systems and e-mail. There are three main marketing strategy choices, namely the standardization approach, the adaptation approach and the middle-of-the-road approach, that companies implement in overseas markets. The decision depends on the situations and factors facing the companies in the international markets. In this paper, the contingency concept is adopted: no single strategy can be effective in all contexts, and the effect of strategy on performance depends on specific situational variables. Strategic fit is employed to investigate the adaptation of export marketing strategy under certain environmental conditions, which in turn can lead to superior performance.

Distributed Load Flow Analysis using Graph Theory

In today's scenario, meeting the enhanced demand imposed by domestic, commercial and industrial consumers requires focused attention on the various operational and control activities of the Radial Distribution Network (RDN). Across RDN research sub-domains such as network reconfiguration, reactive power compensation and economic load scheduling, network performance parameters are usually estimated by an iterative process commonly known as the load (power) flow algorithm. In this paper, a simple mechanism is presented to implement the load flow analysis (LFA) algorithm. The reported algorithm utilizes graph theory principles and is tested on a 69-bus RDN.
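
To illustrate how a graph-based ordering helps a radial load flow, the sketch below stores a small feeder as a parent map and runs a basic backward/forward sweep (backward current summation, forward voltage drop). This is a generic textbook formulation on illustrative per-unit data, not the paper's exact algorithm or test network.

    # Minimal backward/forward sweep on a small radial feeder (per-unit, illustrative data).
    parent  = {2: 1, 3: 2, 4: 2}                                   # bus -> upstream bus (bus 1 is the source)
    z_line  = {2: 0.02+0.04j, 3: 0.03+0.05j, 4: 0.025+0.045j}      # impedance of the line feeding each bus
    s_load  = {2: 0.10+0.05j, 3: 0.08+0.04j, 4: 0.06+0.03j}        # complex load at each bus

    V = {bus: 1.0+0j for bus in [1, 2, 3, 4]}                      # flat start

    for _ in range(20):                                            # sweep iterations
        # Backward sweep: accumulate branch currents from the leaves towards the source.
        I_branch = {bus: (s_load[bus] / V[bus]).conjugate() for bus in parent}
        for bus in sorted(parent, reverse=True):                   # children are numbered after their parents
            if parent[bus] in I_branch:
                I_branch[parent[bus]] += I_branch[bus]
        # Forward sweep: update voltages from the source towards the leaves.
        for bus in sorted(parent):
            V[bus] = V[parent[bus]] - z_line[bus] * I_branch[bus]

    print({bus: round(abs(v), 4) for bus, v in V.items()})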

Exploiting Two Intelligent Models to Predict Water Level: A Field Study of Urmia Lake, Iran

Water level forecasting using records of past time series is important in water resources engineering and management. For example, water level affects groundwater tables in low-lying coastal areas, as well as the hydrological regimes of some coastal rivers; hence, a reliable prediction of sea-level variations is required in coastal engineering and hydrologic studies. During the past two decades, approaches based on Genetic Programming (GP) and Artificial Neural Networks (ANN) have been developed. In the present study, GP is used to forecast daily water level variations for a set of time intervals using observed water levels. Measurements from a single tide gauge at Urmia Lake, northwest Iran, were used to train and validate the GP approach for the period from January 1997 to July 2008. Two statistics, the root mean square error and the correlation coefficient, are used to verify the model by comparing its outputs with the corresponding outputs of an Artificial Neural Network model. The results show that both artificial intelligence methodologies perform satisfactorily and can be considered alternatives to conventional harmonic analysis.
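
For observed levels x_i and predicted levels y_i over N days, the two verification statistics named above have their standard definitions:

    \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (x_i - y_i)^2}, \qquad
    R = \frac{\sum_{i=1}^{N} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{N} (x_i - \bar{x})^2 \; \sum_{i=1}^{N} (y_i - \bar{y})^2}}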

Multi-matrix Real-coded Genetic Algorithm for Minimising Total Costs in Logistics Chain Network

The importance of supply chain and logistics management has been widely recognised. Effective management of the supply chain can reduce costs and lead times and improve responsiveness to changing customer demands. This paper proposes a multi-matrix real-coded Genetic Algorithm (MRGA) based optimisation tool that minimises the total costs associated with supply chain logistics. Owing to the finite capacity constraints of all parties within the chain, a Genetic Algorithm (GA) often produces infeasible chromosomes during the initialisation and evolution processes. The proposed algorithm embeds a chromosome initialisation procedure and crossover and mutation operations that always guarantee feasible solutions. The proposed algorithm was tested on three sizes of benchmark datasets of logistic chain networks, which are typical of those faced by most global manufacturing companies. A half-fraction factorial design was carried out to investigate the influence of alternative crossover and mutation operators by varying the GA parameters. The analysis of the experimental results suggests that the quality of the solutions obtained is sensitive to the ways in which the genetic parameters and operators are set.
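
One common way to keep every chromosome feasible under finite capacity constraints is to build (and, after crossover or mutation, rebuild) the allocation only from suppliers that still have spare capacity. The sketch below shows such a feasible initialisation for a single allocation matrix with illustrative demands and capacities; the paper's multi-matrix encoding and its specific operators are not reproduced.

    import random

    def random_feasible_allocation(demand, capacity):
        """Allocate each unit of demand to a randomly chosen supplier that still
        has spare capacity, so the resulting chromosome is always feasible."""
        remaining = capacity.copy()
        allocation = {(c, s): 0 for c in demand for s in capacity}
        for customer, qty in demand.items():
            for _ in range(qty):
                open_suppliers = [s for s, cap in remaining.items() if cap > 0]
                s = random.choice(open_suppliers)
                allocation[(customer, s)] += 1
                remaining[s] -= 1
        return allocation

    demand   = {"c1": 4, "c2": 3}                  # illustrative customer demands
    capacity = {"s1": 5, "s2": 4}                  # illustrative supplier capacities
    print(random_feasible_allocation(demand, capacity))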

Secure Data Aggregation Using Clusters in Sensor Networks

Wireless sensor networks can be applied in both hostile and military environments. A primary goal in the design of wireless sensor networks is lifetime maximization, constrained by the energy capacity of batteries. One well-known method to reduce energy consumption in such networks is data aggregation. Providing efficient data aggregation while preserving data privacy is a challenging problem in wireless sensor network research. In this paper, we present a privacy-preserving data aggregation scheme for additive aggregation functions. The Cluster-based Private Data Aggregation (CPDA) scheme leverages a clustering protocol and algebraic properties of polynomials, and has the advantage of incurring less communication overhead. The goal of our work is to bridge the gap between collaborative data collection by wireless sensor networks and data privacy. We present simulation results of our scheme and compare its performance to a typical data aggregation scheme, TAG, in which no data privacy protection is provided. The results show the efficacy and efficiency of our scheme.
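
The core privacy idea, stripped of the clustering and polynomial machinery, can be illustrated with simple additive blinding inside a cluster: each node splits its reading into random shares so that the aggregator learns only the sum, never an individual value. This is a simplified illustration of the principle, not the exact CPDA protocol.

    import random

    def split_into_shares(value, n_shares, modulus=10**6):
        """Split a reading into n additive shares that sum to the value (mod modulus)."""
        shares = [random.randrange(modulus) for _ in range(n_shares - 1)]
        shares.append((value - sum(shares)) % modulus)
        return shares

    readings = [17, 42, 23]                       # private readings of three cluster members
    n = len(readings)
    modulus = 10**6

    # Each node sends one share to every other member (keeping one itself).
    shares = [split_into_shares(r, n, modulus) for r in readings]
    # Each member sums the shares it received and reports only that partial sum.
    partial_sums = [sum(shares[src][dst] for src in range(n)) % modulus for dst in range(n)]
    # The cluster head adds the partial sums and recovers the aggregate, never a single reading.
    aggregate = sum(partial_sums) % modulus
    print("aggregate:", aggregate, "expected:", sum(readings))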

Avoiding Catastrophic Forgetting by a Dual-Network Memory Model Using a Chaotic Neural Network

In neural networks, when new patterns are learned, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model that overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider these networks to correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled through the chaotic behavior of each neuron in the hippocampal network. Finally, it is consolidated in the neocortical network by using pseudopatterns. Computer simulation results show that the proposed model is much better at avoiding catastrophic forgetting than conventional models.
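
The consolidation step can be sketched compactly: pseudopatterns are generated by probing the fast (hippocampal) network with random inputs and recording the responses it settles to, and these input-output pairs are then interleaved with the new items when training the slow (neocortical) network. Below is a minimal sketch in which a plain Hopfield-style recall stands in for the chaotic dynamics; all sizes and counts are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def hopfield_store(patterns):
        """Hebbian weight matrix for a Hopfield-style associative memory."""
        W = sum(np.outer(p, p) for p in patterns).astype(float)
        np.fill_diagonal(W, 0)
        return W / len(patterns)

    def recall(W, x, steps=10):
        for _ in range(steps):
            x = np.sign(W @ x)
            x[x == 0] = 1
        return x

    stored = [rng.choice([-1, 1], size=32) for _ in range(3)]    # previously learned items
    W_fast = hopfield_store(stored)                              # fast "hippocampal" store

    # Pseudopatterns: random probes and whatever the fast network settles to.
    pseudopatterns = []
    for _ in range(20):
        probe = rng.choice([-1, 1], size=32)
        pseudopatterns.append((probe, recall(W_fast, probe.astype(float))))

    # These (probe, response) pairs would be interleaved with new items when
    # training the slow "neocortical" network, protecting the old memories.
    print(len(pseudopatterns), "pseudopatterns generated")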

An Efficient Key Management Scheme for Secure SCADA Communication

A SCADA (Supervisory Control And Data Acquisition) system is an industrial control and monitoring system for national infrastructures. In the past, SCADA systems were used in closed environments without security functionality being considered. As communication technology has developed, there have been attempts to connect SCADA systems to open networks; the security of SCADA systems has therefore become an issue, and key management for SCADA systems has also been studied. However, existing key management schemes for SCADA systems, such as SKE (key establishment for SCADA systems) and SKMA (key management scheme for SCADA systems), cannot support broadcast communication. To solve this problem, an Advanced Key Management Architecture for Secure SCADA Communication was proposed by Choi et al. However, Choi et al.'s scheme requires a large computational cost for multicast communication. In this paper, we propose an enhanced scheme that improves the computational cost of multicast communication while considering the number of keys to be stored in a low-power communication device (RTU).
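
As a point of comparison for the key-storage versus multicast trade-off, a binary logical key hierarchy (LKH) is a standard construction in which each device stores only the keys on its path to the root (on the order of log n keys) and a group message can be protected under the root key. This is a generic illustration of that trade-off, not the scheme proposed in the paper.

    import math

    def lkh_keys_for_member(member_index, n_members):
        """Return the key identifiers a device stores in a binary logical key
        hierarchy: one key per node on its leaf-to-root path."""
        depth = math.ceil(math.log2(n_members))
        node = 2 ** depth + member_index            # leaf position in a complete binary tree
        path = []
        while node >= 1:
            path.append(node)                       # key id = tree node id
            node //= 2
        return path                                 # leaf key ... root (group) key

    n = 16                                          # illustrative number of RTUs
    print("keys per RTU:", len(lkh_keys_for_member(0, n)), "instead of", n, "pairwise keys")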

Virtual Environments: A Vehicle for Pedagogical Advancement

Virtual environments are a hot topic in academia and, more importantly, in courses offered via distance education. Today's gaming generation views virtual worlds as strong social and interactive media for communicating and socializing. And while institutions of higher education are challenged with increasing enrollment while balancing budget cuts, offering effective courses via distance education becomes a valid option. Educators can utilize virtual worlds to offer students an enhanced learning environment which has the power to alleviate feelings of isolation through the promotion of communication, interaction, collaboration, teamwork, feedback, engagement and constructivist learning activities. This paper focuses on the use of virtual environments to facilitate interaction in distance education courses so as to produce positive learning outcomes for students. Furthermore, instructional strategies are reviewed and discussed for use in virtual worlds to enhance learning within a social context.

Small and Medium Enterprises (SMEs) Financing Practice and Accessing Bank Loan Issues: The Case of Libya

The purpose of this paper is to examine the financing practices of SMEs in Libya in two different phases of the business life cycle: the start-up and matured stages. Moreover, the issues SMEs face in accessing bank loans are also identified. The study was conducted from the demand side. The findings are based on a sample of 76 SMEs in Libya surveyed through questionnaires. The results pinpoint several things: SMEs evidently use informal financing sources, preferring personal savings; SME owners are willing to apply for bank loans; and the most pressing problem identified for not applying for a bank loan is that loans carry interest (a religious factor).

Use of Multiple Linear Regressions to Evaluate the Influence of O3 and PM10 on Biological Pollutants

Exposure to ambient air pollution has been linked to a number of health outcomes, ranging from modest transient changes in the respiratory tract and impaired pulmonary function, through restricted activity and reduced performance, to increased emergency room visits, hospital admissions and mortality. The increase in allergenic symptoms has been associated with air contaminants such as ozone, particulate matter, fungal spores and pollen. Considering the potential relevance of the crossed effects of non-biological pollutants and airborne pollens and fungal spores on the worsening of allergies, the aim of this work was to evaluate the influence of non-biological pollutants (O3 and PM10) and meteorological parameters on the concentrations of pollen and fungal spores using multiple linear regressions. The data considered in this study were collected in Oporto, the second largest Portuguese city, located in the north of the country. Daily means of O3, PM10, pollen and fungal spore concentrations, temperature, relative humidity, precipitation and wind velocity for 2003, 2004 and 2005 were considered. The results showed that the 90th percentile of the adjusted coefficient of determination, P90(R2adj), of the multiple regressions varied from 0.613 to 0.916 for pollen and from 0.275 to 0.512 for fungal spores. O3 and PM10 were shown to have some influence on the biological pollutants. Among the meteorological parameters analysed, temperature was the one that most influenced the airborne pollen and fungal spore concentrations; relative humidity also showed some influence on fungal spore dispersion. Nevertheless, the models for each pollen and fungal spore type differed depending on the period analysed, which means that even the correlations identified as statistically significant cannot be considered sufficiently consistent.
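
A minimal sketch of the regression setup, on synthetic data standing in for the Oporto measurements, showing how the adjusted coefficient of determination reported above is obtained for one pollen type; the predictors, sample size, and coefficients below are illustrative assumptions only.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 90                                              # illustrative number of daily observations

    # Illustrative predictors: O3, PM10, temperature, relative humidity.
    O3, PM10 = rng.normal(60, 15, n), rng.normal(30, 10, n)
    temp, rh = rng.normal(18, 5, n), rng.normal(70, 10, n)
    pollen   = 5 + 0.2 * temp + 0.05 * O3 + rng.normal(0, 1, n)   # synthetic response

    X = np.column_stack([np.ones(n), O3, PM10, temp, rh])
    beta, *_ = np.linalg.lstsq(X, pollen, rcond=None)

    resid = pollen - X @ beta
    r2 = 1 - resid.var() / pollen.var()
    p = X.shape[1] - 1                                  # number of predictors
    r2_adj = 1 - (1 - r2) * (n - 1) / (n - p - 1)       # adjusted R^2
    print("R^2:", round(r2, 3), "adjusted R^2:", round(r2_adj, 3))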

SBTAR: An Enhancing Method for Automate Test Tools

Since software testing has become an important part of software development as a means of improving software quality, many automation tools have been created to help test software functionality. There are a few usability issues with these tools; one is that the result log generated by a tool may contain so much useless information that the tester cannot use it to communicate efficiently, or the result log may require a specific application to open. This paper introduces a new method, SBTAR, that improves the usability of automated test tools with respect to the result log. The approach uses the capabilities of a tool, IBM Rational Robot, to create a customized function that generates a result log in a new format containing useful information that is faster and easier to understand than the original result log generated by the tool. The new result log also increases flexibility, as it can be opened and read with Microsoft Word or WordPad.
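
The post-processing idea can be illustrated independently of Rational Robot: take a verbose raw log, keep only the lines a tester actually needs (script boundaries and verification verdicts), and write them out as plain text readable in Word or WordPad. The log format, line layout, and file name below are hypothetical; this is not the SBTAR implementation.

    raw_log = """\
    2011-03-01 10:02:11 INFO  engine initialised
    2011-03-01 10:02:12 SCRIPT LoginTest started
    2011-03-01 10:02:15 VP    LoginButtonVisible  PASS
    2011-03-01 10:02:18 VP    WelcomeTextMatches  FAIL expected 'Hello' got 'Helo'
    2011-03-01 10:02:19 DEBUG screenshot saved to c:/tmp/img_001.bmp
    2011-03-01 10:02:20 SCRIPT LoginTest finished
    """

    summary = [line.strip() for line in raw_log.splitlines()
               if line.split()[2] in ("SCRIPT", "VP")]         # keep only verdict-relevant lines

    with open("result_summary.txt", "w") as fh:                # plain text, readable in WordPad
        fh.write("\n".join(summary) + "\n")

    print("\n".join(summary))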

A Novel Adaptive E-Learning Model Based on Developed Learner's Styles

Adaptive e-learning today gives the student a central role in his or her own learning process. It allows learners to try things out, participate in courses like never before, and get more out of learning than before. In this paper, an adaptive e-learning model for logic design, the simplification of Boolean functions and related fields is presented. The model presents suitable courses for each student in a dynamic and adaptive manner using existing database and workflow technologies. The main objective of this research is to provide an adaptive e-learning model based on learners' personality, using explicit and implicit feedback. To recognize the learners, we develop dimensions that determine each individual's learning style in order to accommodate the different abilities of the users and to develop vital skills. Thus, the proposed model becomes more powerful, user-friendly and easy to use and interpret. Finally, it suggests a learning strategy and appropriate electronic media that match the learner's preference.
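
One simple way to combine explicit and implicit feedback into a learning-style decision is to keep a score per style dimension, update it from both sources, and select the electronic media that best match the dominant dimension. The dimensions, weights, and media mapping below are illustrative assumptions, not the paper's model.

    # Illustrative learning-style dimensions and scores for one learner.
    scores = {"visual": 0.0, "verbal": 0.0, "active": 0.0, "reflective": 0.0}

    def apply_explicit(answers):
        """Explicit feedback: questionnaire answers map directly to dimensions."""
        for dim, weight in answers.items():
            scores[dim] += weight

    def apply_implicit(events):
        """Implicit feedback: observed behaviour, e.g. time on diagrams vs. text."""
        for dim, weight in events:
            scores[dim] += 0.5 * weight            # implicit evidence weighted lower

    apply_explicit({"visual": 2, "active": 1})
    apply_implicit([("visual", 1), ("reflective", 2)])

    dominant = max(scores, key=scores.get)
    media = {"visual": "interactive logic-circuit simulator",
             "verbal": "narrated lecture notes",
             "active": "hands-on Boolean simplification exercises",
             "reflective": "worked examples with self-check questions"}
    print("dominant style:", dominant, "->", media[dominant])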