Relationship between Transparency, Liquidity and Valuation

Recent evidence on the liquidity and valuation of securities in capital markets clearly shows the importance of stock market liquidity for the valuation of firms. In this paper, the relationship between transparency, liquidity, and valuation is studied using data obtained from 70 companies listed on the Tehran Stock Exchange during 2003-2012. Discretionary earnings management was used as a sign of lack of transparency, and Tobin's Q as the criterion of valuation. The results indicate a significant inverse relationship between earnings management and liquidity; in other words, there is a relationship between liquidity and transparency. The results also indicate a significant relationship between transparency and valuation. Transparency affects firm valuation either on its own or indirectly through the liquidity channel. Although the effect of transparency on firm value was reduced by adding the liquidity variable, the cumulative effect of transparency and liquidity increased.
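As an illustration of the valuation criterion mentioned above, a common textbook approximation of Tobin's Q divides the firm's market value by the book value of its assets. The simplified formula and the figures below are assumptions for illustration only; the abstract does not specify how the paper measures Q.

```python
def tobins_q(market_value_equity, book_value_debt, total_assets):
    """Simple Tobin's Q proxy: (market value of equity + book value of debt)
    divided by the book value of total assets."""
    return (market_value_equity + book_value_debt) / total_assets

# Hypothetical firm: equity traded at 500, debt of 300, assets of 1000.
q = tobins_q(market_value_equity=500.0, book_value_debt=300.0, total_assets=1000.0)
# q < 1 suggests the market values the firm below the replacement cost of its assets
```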

A Voltage Based Maximum Power Point Tracker for Low Power and Low Cost Photovoltaic Applications

This paper describes the design of a voltage-based maximum power point tracker (MPPT) for photovoltaic (PV) applications. Of the various MPPT methods, the voltage-based method is considered the simplest and most cost-effective. Its major disadvantage is that the PV array must be disconnected from the load to sample its open-circuit voltage, which inevitably results in power loss. Another disadvantage arises under rapidly varying irradiance: if the duration between two successive samplings, called the sampling period, is too long, there is a considerable loss, because the output voltage of the PV array follows an unchanged reference throughout the sampling period. Once a maximum power point (MPP) is tracked and the irradiation changes between two successive samplings, the new MPP is not tracked until the next sampling of the PV array voltage. This paper proposes an MPPT circuit in which both the sampling interval of the PV array voltage and the sampling period have been shortened. The sample-and-hold circuit has also been simplified. The proposed circuit does not use a microcontroller or a digital signal processor and is thus suitable for low-cost, low-power applications.
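The voltage-based (fractional open-circuit voltage) rule that such trackers implement can be sketched as follows. The coefficient k and the sample values are assumptions for illustration; the paper realizes this relation in analog hardware without a microcontroller.

```python
def mppt_reference_voltage(v_oc, k=0.76):
    """Fractional open-circuit voltage rule: V_mpp is roughly k * V_oc,
    with k typically between 0.71 and 0.80 (0.76 is an assumed value)."""
    return k * v_oc

# Successive open-circuit voltage samples (volts) as irradiance drops; the
# tracker holds each reference constant until the next sampling instant,
# which is why a long sampling period loses power under changing irradiance.
v_oc_samples = [21.0, 20.5, 19.8]
references = [mppt_reference_voltage(v) for v in v_oc_samples]
```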

Genetic-Fuzzy Inverse Controller for a Robot Arm Suitable for On Line Applications

A robot is a repetitive-task plant, and controlling such a plant under parameter variations and load disturbances is an important problem. The aim of this work is to design a genetic-fuzzy controller suitable for online applications to control a single-link rigid robot arm. The online genetic-fuzzy (indirect) controller has two genetic-fuzzy blocks: the first acts as a controller, the second as an identifier. The identification method is based on an inverse identification technique. The proposed controller is tested under normal and load-disturbance conditions.

Effects of Variations in Generator Inputs for Small Signal Stability Studies of a Three Machine Nine Bus Network

Small perturbations in a generator can cause instability in a power network, and it is generally known that small signal stability is directly related to generator and load properties. This paper examines the effects of generator input variations on power system oscillations in a small signal stability study. Eigenvalues and eigenvectors are used to examine the stability of the power system. The dynamic power system's mathematical model is constructed and then solved using a load flow and small signal stability toolbox in MATLAB. The power system model is based on a 3-machine 9-bus system that was modified to suit this study. Participation factors, which gauge the effect of generation variations together with other network parameters, are also incorporated.
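The eigenvalue criterion used in such studies can be sketched on a toy 2x2 state matrix: the linearized system is small-signal stable when every eigenvalue of its state matrix has a negative real part. The matrix below is a hypothetical damped oscillatory mode, not the 3-machine 9-bus model of the paper.

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from its characteristic polynomial."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

def small_signal_stable(eigs):
    """Stable iff every eigenvalue lies strictly in the left half-plane."""
    return all(lam.real < 0 for lam in eigs)

# Hypothetical damped oscillatory mode: x' = A x with A = [[0, 1], [-25, -2]].
eigs = eigenvalues_2x2(0.0, 1.0, -25.0, -2.0)
stable = small_signal_stable(eigs)  # real parts are -1, so the mode is stable
```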

Mixing Behaviors of Wet Granular Materials in Gas Fluidized Beds

The mixing behaviors of dry and wet granular materials in gas fluidized bed systems were investigated computationally using the combined Computational Fluid Dynamics and Discrete Element Method (CFD-DEM). Dry particles were observed to mix fairly rapidly during the fluidization process due to vigorous relative motions between particles induced by the flow of gas. In contrast, due to the presence of strong cohesive forces arising from capillary liquid bridges between wet particles, the mixing efficiencies of wet granular materials under similar operating conditions were observed to be reduced significantly.
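A minimal sketch of the cohesive capillary force mentioned above, using the common pendular-bridge approximation F = 2*pi*R*gamma*cos(theta) for two equal spheres in contact. The bead radius, surface tension, and contact angle below are assumed illustrative values, not parameters from the study.

```python
from math import pi, cos, radians

def capillary_bridge_force(radius, surface_tension, contact_angle_deg=0.0):
    """Maximum capillary force for a pendular liquid bridge between two
    equal spheres in contact: F = 2 * pi * R * gamma * cos(theta)."""
    return 2.0 * pi * radius * surface_tension * cos(radians(contact_angle_deg))

# Assumed values: a 1 mm glass bead (R = 0.5 mm) wetted by water
# (gamma ~ 0.072 N/m) with a zero contact angle.
f = capillary_bridge_force(radius=0.5e-3, surface_tension=0.072)
```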

Geostatistical Analysis and Mapping of Ground-level Ozone in a Medium-Sized Urban Area

Ground-level tropospheric ozone is one of the air pollutants of most concern. It is mainly produced by photochemical processes involving nitrogen oxides and volatile organic compounds in the lower parts of the atmosphere. Ozone levels become particularly high in regions close to high ozone precursor emissions and during summer, when stagnant meteorological conditions with high insolation and high temperatures are common. This work presents results of a study of urban ozone distribution patterns in the city of Badajoz, the largest and most industrialized city in the Extremadura region (southwest Spain). Fourteen sampling campaigns, at least one per month, were carried out with an automatic portable analyzer to measure ambient air ozone concentrations during periods selected for conditions favourable to ozone production. The measured ozone data were then analyzed using geostatistical techniques to evaluate the ozone distribution across the city. First, the exploratory analysis revealed that the data were normally distributed, a desirable property for the subsequent stages of the geostatistical study. Secondly, in the structural analysis, theoretical spherical models provided the best fit for all monthly experimental variograms. The parameters of these variograms (sill, range and nugget) revealed that the maximum distance of spatial dependence is between 302 and 790 m and that the variable, air ozone concentration, is not evenly distributed over short distances. Finally, predictive ozone maps were derived for all points of the experimental study area using geostatistical algorithms (kriging). Cross-validation showed high prediction accuracy in all cases. Probability maps based on kriging interpolation and the kriging standard deviation also provided useful information for hazard assessment.
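The theoretical spherical variogram model that the structural analysis fitted has a standard closed form, sketched below. The nugget and partial sill values used in the example are illustrative assumptions; only the 790 m range is borrowed from the reported interval of spatial dependence.

```python
def spherical_variogram(h, nugget, psill, rng):
    """Theoretical spherical model: gamma(h) rises from the nugget to the
    total sill (nugget + partial sill) and is flat beyond the range."""
    if h == 0:
        return 0.0
    if h >= rng:
        return nugget + psill
    r = h / rng
    return nugget + psill * (1.5 * r - 0.5 * r ** 3)

# Illustrative parameters: nugget 1.0, partial sill 10.0, range 790 m.
gamma_at_range = spherical_variogram(790.0, 1.0, 10.0, 790.0)  # total sill
gamma_halfway = spherical_variogram(395.0, 1.0, 10.0, 790.0)
```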

Finite Element Analysis of Full Ceramic Crowns with and without Zirconia Framework

Simulating occlusal function during laboratory materials testing is essential for predicting long-term performance before clinical use. The aim of the study was to assess the influence of chamfer preparation depth on the failure risk of heat-pressed ceramic crowns with and without a zirconia framework by means of finite element analysis. 3D models of a maxillary central incisor, prepared for full ceramic crowns with different chamfer margin depths (between 0.8 and 1.2 mm) and 6-degree tapered walls, together with the overlying crowns, were generated using literature data (Fig. 1, 2). The crowns were designed with and without a zirconia framework with a thickness of 0.4 mm. For all preparations and crowns, stresses in the pressed ceramic crown, zirconia framework, pressed ceramic veneer, and dentin were evaluated separately. The highest stresses were registered in the dentin. For the studied cases, the preparation depth had no significant influence on the stress values in the teeth and pressed ceramics; only the zirconia framework was affected. The zirconia framework decreases the stress values in the veneer.

An Augmented Beam-search Based Algorithm for the Strip Packing Problem

In this paper, the use of beam search and look-ahead strategies for solving the strip packing problem (SPP) is investigated. Given a strip of fixed width W, unlimited length L, and a set of n circular pieces of known radii, the objective is to determine the minimum length of the initial strip that packs all the pieces. An augmented algorithm which combines beam search with a look-ahead strategy is proposed. The look-ahead is used to evaluate the nodes at each level of the search tree; the best nodes are then retained for branching. The computational investigation showed that the proposed augmented algorithm is able to improve the best known solutions in the literature on most of the instances used.
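The beam search with look-ahead evaluation described above can be sketched generically: expand all nodes at the current level, score the children with a look-ahead function, and retain only the best beam_width nodes for branching. The toy search problem below (reach a target number with +1 or *2 moves) stands in for the SPP packing geometry, which is omitted.

```python
def beam_search(root, expand, lookahead_score, beam_width, is_goal):
    """Generic beam search: expand every node at the current level, score
    the children with a look-ahead evaluation, and keep only the
    beam_width best nodes for branching at the next level."""
    level = [root]
    while level:
        children = [c for node in level for c in expand(node)]
        if not children:
            return min(level, key=lookahead_score)
        goals = [c for c in children if is_goal(c)]
        if goals:
            return min(goals, key=lookahead_score)
        level = sorted(children, key=lookahead_score)[:beam_width]

# Toy stand-in problem: reach 10 from 1 using +1 or *2 moves.
def expand(state):
    value, moves = state
    return [(value + 1, moves + 1), (value * 2, moves + 1)] if value < 10 else []

def lookahead_score(state):
    value, moves = state
    return moves + abs(10 - value)   # cost so far + look-ahead estimate

result = beam_search((1, 0), expand, lookahead_score, beam_width=3,
                     is_goal=lambda s: s[0] == 10)
```

Note that the beam (here of width 3) prunes aggressively, so the returned solution is good but not guaranteed optimal, which is why the paper pairs the beam with a look-ahead evaluation of each node.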

A Green Chemical Technique for the Synthesis of Magnetic Nanoparticles by Magnetotactic Bacteria

Bacterial magnetic nanoparticles have great potential in biotechnological and biomedical applications. In this study, a liquid growth medium was modified for cultivating a fastidious magnetotactic bacterium that was isolated from Anzali lagoon, Iran, in our previous research. The modifications include changes to the vitamin, mineral, and carbon sources, among others. Serum bottles and specially designed air-tight laboratory bottles were used to create microaerobic conditions in order to develop a method for scale-up experiments. This information may serve as a guide to green chemistry based biological protocols for the synthesis of magnetic nanoparticles with control over their chemical composition, morphology and size.

Contingent Pay and Experience with its use by Organizations of the Czech Republic Operating in the Field of Environmental Protection

Besides basic wages or salary, employee benefits and intangible elements, one part of an employee's total reward is so-called contingent (variable) pay. Contingent pay is linked to the performance, contribution, competency or skills of individual employees, to team or company-wide performance, or to a combination of these. The main aim of this article is to define contingent pay on the basis of available information, to describe the reasons for its implementation and the arguments for and against this type of remuneration, and to report not only the extent and level of its utilization by organizations of the Czech Republic operating in the field of environmental protection, but also their practical experience with this type of remuneration.

Finger Vein Recognition using PCA-based Methods

In this paper a novel algorithm is proposed to improve the accuracy of finger vein recognition. The performances of Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA), and Kernel Entropy Component Analysis (KECA) within this algorithm are validated and compared with each other in order to determine which is the most appropriate for finger vein recognition.
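As a sketch of the linear baseline among the compared methods, plain PCA extracts the leading directions of variance from flattened image feature vectors. A minimal pure-Python power-iteration version on toy 2-D data is shown below; the KPCA and KECA variants, and real finger vein features, are beyond this sketch.

```python
def pca_first_component(data, iters=200):
    """Leading principal component of mean-centred data via power iteration
    on the covariance matrix (pure Python, small dimensions only)."""
    n, d = len(data), len(data[0])
    mean = [sum(row[j] for row in data) / n for j in range(d)]
    centred = [[row[j] - mean[j] for j in range(d)] for row in data]
    cov = [[sum(centred[i][a] * centred[i][b] for i in range(n)) / n
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Toy 2-D "feature vectors"; real finger vein images would be flattened
# into much higher-dimensional vectors before PCA.
data = [[2.0, 2.1], [3.0, 2.9], [4.0, 4.2], [5.0, 4.8]]
pc1 = pca_first_component(data)  # roughly the (1, 1) diagonal direction
```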

Kosovo: A Unique Experiment in Europe in the International Context at the End of the Cold War?

The question of interethnic and interreligious conflicts in ex-Yugoslavia receives much attention within the framework of the international context created after 1991 because of the impact of these conflicts on the security and stability of the Balkan region and of Europe. This paper focuses on the rationales leading to the declaration of independence by Kosovo according to ethnic and religious criteria and analyzes why these same rationales were not applied in Bosnia and Herzegovina. The approach undertaken comparatively examines the cases of Kosovo and of Bosnia and Herzegovina, while also seeking to understand the political decision-making of the international community in the case of Kosovo. Specifically, was this a good political decision for the security and stability of the Balkans, of Europe, or even for global security and stability? The research starts with an overview of the European security framework after 1991, paying particular attention to Kosovo and to Bosnia and Herzegovina. It then presents the theoretical and methodological framework and compares the representative cases. Using a constructivist approach and a comparative methodology, it arrives at the results of the study. An important thesis of the paper is that this event modifies the principles of international law and creates dangerous precedents for regional stability in the Balkans.

Using Structural Equation Modeling in Causal Relationship Design for Balanced-Scorecards' Strategic Map

Through the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is a critical business tool for many organizations: a performance measurement system that translates mission and strategy into objectives. The strategy map approach is a development of the BSC in which certain necessary causal relations must be established. To recognize these relations, experts usually rely on experience; it is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, along with a test to measure the importance of the relations. In SEM, factor analysis and hypothesis testing are performed in the same analysis, and SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework which permits experts to design the strategy map by applying a comprehensive and scientific method together with their experience. This scheme is therefore more reliable than previously established methods.

Manufacturing Dispersions Based Simulation and Synthesis of Design Tolerances

The objective of this work which is based on the approach of simultaneous engineering is to contribute to the development of a CIM tool for the synthesis of functional design dimensions expressed by average values and tolerance intervals. In this paper, the dispersions method known as the Δl method which proved reliable in the simulation of manufacturing dimensions is used to develop a methodology for the automation of the simulation. This methodology is constructed around three procedures. The first procedure executes the verification of the functional requirements by automatically extracting the functional dimension chains in the mechanical sub-assembly. Then a second procedure performs an optimization of the dispersions on the basis of unknown variables. The third procedure uses the optimized values of the dispersions to compute the optimized average values and tolerances of the functional dimensions in the chains. A statistical and cost based approach is integrated in the methodology in order to take account of the capabilities of the manufacturing processes and to distribute optimal values among the individual components of the chains.
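The final synthesis step above distributes dispersions along each functional dimension chain. A hedged sketch of the two standard stack-up rules (worst-case linear addition versus the statistical root-sum-square used in capability-based approaches) is shown below with hypothetical link dispersions; the Δl method's actual optimization procedure is not reproduced here.

```python
def worst_case_stack(tols):
    """Worst-case stack-up: link tolerances in a dimension chain add linearly."""
    return sum(tols)

def statistical_stack(tols):
    """Statistical (root-sum-square) stack-up, assuming independent,
    centred dispersions on each link of the chain."""
    return sum(t * t for t in tols) ** 0.5

chain = [0.05, 0.03, 0.04]          # hypothetical link dispersions (mm)
wc = worst_case_stack(chain)        # ~0.12 mm
rss = statistical_stack(chain)      # ~0.071 mm: looser per-link tolerances allowed
```

The gap between the two results is the economic argument for the statistical, capability-based distribution of tolerances mentioned in the abstract.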

Packet Losses Interpretation in Mobile Internet

Mobile users with laptops need efficient access, for example to their personal data at home or to the Internet, from any place in the world, regardless of their location or point of attachment, especially while roaming outside the home subnet. An efficient interpretation of the packet losses encountered during such roaming is the central concern of this work. The main previous works related to this problem, such as BER systems, Amigos, and an ns-2 implementation, are reviewed and discussed. Their drawbacks and limitations are noted: they stop at monitoring and do not provide an actual solution for eliminating or even restricting these losses. The paper then presents the framework around which we built a Triple-R sequence as a cost-effective solution to eliminate the packet losses and bridge the gap between subnets, an area that until now has been largely neglected. The results show that, in addition to the high bit error rate of wireless mobile networks, the low efficiency of the Mobile IP registration procedure is a direct cause of these packet losses. Furthermore, the interpretation of the packet losses yields an illustrative triangle of the registration process, which should be further researched and analyzed in our future work.

MIM: A Species Independent Approach for Classifying Coding and Non-Coding DNA Sequences in Bacterial and Archaeal Genomes

A number of competing methodologies have been developed to identify genes and classify DNA sequences into coding and non-coding sequences. This classification process is fundamental in gene finding and gene annotation tools and is one of the most challenging tasks in bioinformatics and computational biology. An information theory measure based on mutual information has shown good accuracy in classifying DNA sequences into coding and non-coding. In this paper we describe a species independent iterative approach that distinguishes coding from non-coding sequences using the mutual information measure (MIM). A set of sixty prokaryotes is used to extract universal training data. To facilitate comparisons with the published results of other researchers, a test set of 51 bacterial and archaeal genomes was used to evaluate MIM. The results demonstrate that MIM produces superior results while remaining species independent.
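The mutual information measure at the heart of MIM can be sketched from first principles: for observed symbol pairs, I(X;Y) = Σ p(x,y) log2[p(x,y) / (p(x)p(y))]. The toy base pairs below are illustrative only and do not reproduce MIM's actual feature extraction over genomic sequences.

```python
from math import log2
from collections import Counter

def mutual_information(pairs):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))),
    estimated from a list of observed symbol pairs."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly dependent toy base pairs carry 1 bit of mutual information.
mi_dependent = mutual_information([("A", "T"), ("A", "T"), ("C", "G"), ("C", "G")])
# Independent pairs carry none.
mi_independent = mutual_information([("A", "T"), ("A", "G"), ("C", "T"), ("C", "G")])
```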

Recognition-based Segmentation in Persian Character Recognition

Optical character recognition of cursive scripts presents a number of challenging problems in both the segmentation and recognition processes in different languages, including Persian. To overcome these problems, we combine a newly developed Persian word segmentation method with a recognition-based segmentation technique. This method is robust as well as flexible, and it also increases the system's tolerance to font variations. The implementation results of this method on a comprehensive database show a high degree of accuracy which meets the requirements for commercial use. Extended with suitable pre- and post-processing, the method offers a simple and fast framework for developing a full OCR system.

The Relationship between Personality Characteristics and Driving Behavior

The present study investigated the relationship between the personality characteristics of drivers and the number and amount of fines they receive. The study was carried out on 120 male taxi drivers who worked at least seven hours a day in Lamerd, a city in the south of Iran. Subjects were chosen voluntarily among those available. The predictor variables were the NEO five-factor personality dimensions (conscientiousness, openness to experience, neuroticism, extraversion, and agreeableness); the criterion variables were the number and amount of fines the drivers had received over the last three years. Regression analysis showed that the conscientiousness factor negatively predicted the number and amount of financial fines over the last three years. The openness factor positively predicted the number of fines over the last three years and the amount of financial fines during the last year. The extraversion factor meaningfully and positively predicted only the amount of financial fines during the last year. Increasing age was associated with fewer driving offenses and lower financial loss. The findings can be useful in identifying high-risk drivers and referring them to counseling centers. They can also be used to inform drivers about their personality and its relation to their accident rate. Such criteria would be of great importance when employing drivers in companies, offices and other organizations.

Dimensional Modeling of HIV Data Using Open Source

The choice of data modeling technique for an information system is determined by the objective of the resultant data model. Dimensional modeling is the preferred technique for data destined for data warehouses and data mining, producing data models that ease analysis and queries, in contrast with entity-relationship modeling. The establishment of data warehouses as components of information system landscapes in many organizations has subsequently led to the development of dimensional modeling. This development has been significantly greater, and better reported, for commercial database management systems than for open source ones, making it less affordable for those in resource-constrained settings. This paper presents dimensional modeling of HIV patient information using open source modeling tools. It aims to take advantage of the fact that the regions most affected by the HIV virus (sub-Saharan Africa) are also heavily resource-constrained while having large quantities of HIV data. Two HIV data source systems were studied to identify appropriate dimensions and facts; these were then modeled using two open source dimensional modeling tools. The use of open source software would reduce the costs of dimensional modeling and in turn make data warehousing and data mining more feasible, even for those in resource-constrained settings with data available.

Energy and Exergy Analysis of Dual Purpose Solar Collector

An energy and exergy study of a combined air-water solar collector, called a dual purpose solar collector (DPSC), is presented. The ε-NTU method is used, and the analysis is performed for triangular channels. Parameters such as the air flow rate and the water inlet temperature are studied. The results show that the DPSC has better energy and exergy efficiency than a single-purpose collector. In addition, the triangular passage with a water inlet temperature of 60 °C showed the best exergy and energy efficiency.
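The ε-NTU method named above relates exchanger effectiveness to the number of transfer units; a sketch of the standard counter-flow relation is shown below. The flow arrangement and operating point are assumptions for illustration, since the abstract does not specify the DPSC's exchanger configuration.

```python
from math import exp

def effectiveness_counterflow(ntu, c_r):
    """Standard ε-NTU relation for a counter-flow heat exchanger,
    where c_r = Cmin / Cmax is the heat capacity rate ratio."""
    if abs(c_r - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)   # limiting form for balanced streams
    e = exp(-ntu * (1.0 - c_r))
    return (1.0 - e) / (1.0 - c_r * e)

# Assumed illustrative operating point: NTU = 2 with balanced streams (c_r = 1).
eff = effectiveness_counterflow(2.0, 1.0)  # 2/3 for the balanced case
```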