Convergence of National Regulations with IFRS for SMEs: Empirical Evidence in the Case of Romania

The IFRS for Small and Medium-sized Entities (SMEs) was issued in July 2009, and regulators are currently considering various strategies for implementing the standard. Romania has been a member of the European Union since 2007, and its accounting regulations were therefore issued to ensure compliance with the European Accounting Directives. As the European Commission recently rejected the mandatory use of IFRS for SMEs, regulatory bodies in the Member States must decide whether the standard will affect the accounting practices of SMEs in their countries. The IASB has recently invited stakeholders to discuss the revision of IFRS for SMEs. Empirical studies on the differences and similarities between national standards and IFRS for SMEs could inform decision makers about the actual level of convergence in different countries. The purpose of this paper is to provide empirical evidence on the convergence of the Romanian regulations with IFRS for SMEs, analyzing the results in the context of the latest revisions proposed to the EU Accounting Directives.

Immunity of Integrated Drive Systems: Effects of Radiated and Conducted Emission

In this paper, the problems associated with the immunity of embedded systems used in motor-drive systems are investigated and appropriate solutions are presented. Integration of VSD motor systems (the Integral Motor), while partially reducing some of these effects, adds to the immunity problems of their embedded systems. Fail-safe operation of an integral motor in arduous industrial environments is considered. An integral motor with a unique design is proposed to overcome critical issues such as heat, vibration and electromagnetic interference, which are damaging to sensitive electronics, without requiring any additional cooling system. The advantages of the proposed integral motor are the compactness of the combined motor and drive system, with no external cabling or wiring. The motor provides effective shielding for a minimal amount of radiated emission. It has a built-in filter for EMC compliance and has been designed to produce lower EMC noise, protecting both its internal electronics and neighbouring systems.

A New Block-based NLMS Algorithm and Its Realization in Block Floating Point Format

We propose a new normalized LMS (NLMS) algorithm that gives satisfactory performance in certain applications in comparison with the conventional NLMS recursion. The new algorithm can be treated as a block-based simplification of the NLMS algorithm with a significantly reduced number of multiply-and-accumulate as well as division operations. It is also shown that such a recursion can be easily implemented in block floating point (BFP) arithmetic, which handles the implementation issues efficiently. In particular, the core challenges of a BFP realization of such adaptive filters are considered. A global upper bound on the step-size control parameter of the new algorithm under BFP implementation is also proposed to jointly prevent overflow in the filtering and weight-updating operations.
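
To make the contrast concrete, here is a minimal NumPy sketch of the conventional per-sample NLMS recursion next to a generic block-based variant that performs one normalization (division) and one weight update per block; this illustrates the kind of simplification described, not the authors' exact recursion or its BFP realization.

import numpy as np

def nlms(x, d, num_taps, mu=0.5, eps=1e-8):
    # Conventional NLMS: one normalization (division) per sample.
    w = np.zeros(num_taps)
    y = np.zeros(len(x))
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # current input vector
        y[n] = w @ u
        e = d[n] - y[n]
        w += (mu / (eps + u @ u)) * e * u     # normalized update
    return w, y

def block_nlms(x, d, num_taps, block=8, mu=0.5, eps=1e-8):
    # Block-based variant: the gradient and the input energy are
    # accumulated over a block, then applied in a single update,
    # so divisions (and weight-update MACs) drop by roughly 1/block.
    w = np.zeros(num_taps)
    y = np.zeros(len(x))
    for start in range(num_taps - 1, len(x) - block + 1, block):
        g = np.zeros(num_taps)
        power = eps
        for n in range(start, start + block):
            u = x[n - num_taps + 1:n + 1][::-1]
            y[n] = w @ u
            e = d[n] - y[n]
            g += e * u                        # accumulate gradient
            power += u @ u                    # accumulate energy
        w += (mu / power) * g                 # one division per block
    return w, y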

SIP Authentication Scheme using ECDH

SIP (Session Initiation Protocol), whose text-based (HTTP-like) call control messaging is quite simple and efficient, has recently been widely adopted for VoIP networks. For authentication and authorization purposes there are many approaches to securing SIP and eliminating forgery of SIP messages. Elliptic Curve Cryptography (ECC), meanwhile, offers significant advantages over other public-key cryptography (PKC) systems, such as smaller key sizes and faster computations, making data transmission more secure and efficient. In this work, a new approach is proposed for secure SIP authentication using an ECC-based public key exchange mechanism. By adopting an elliptic-curve key exchange mechanism, the total execution times and memory requirements of the proposed scheme are improved in comparison with non-elliptic approaches.
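
A minimal sketch of the underlying elliptic-curve Diffie-Hellman exchange, using the Python `cryptography` package; binding the derived key into the actual SIP challenge-response flow is omitted, and the names and context label here are illustrative.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party (e.g. SIP user agent and server) generates an ephemeral
# elliptic-curve key pair.
ua_private = ec.generate_private_key(ec.SECP256R1())
server_private = ec.generate_private_key(ec.SECP256R1())

# Public keys are exchanged; each side combines its private key with
# the peer's public key and arrives at the same shared secret.
ua_secret = ua_private.exchange(ec.ECDH(), server_private.public_key())
server_secret = server_private.exchange(ec.ECDH(), ua_private.public_key())
assert ua_secret == server_secret

# Derive a session key for authenticating subsequent SIP messages.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"sip-auth").derive(ua_secret)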

Compressive Strength and Interfacial Transition Zone Characteristics of Geopolymer Concrete with Different Cast In-Situ Curing Conditions

The compressive strength development through the polymerization of an alkaline solution and fly ash blended with Microwave Incinerated Rice Husk Ash (MIRHA) is described in this paper. Three curing conditions, namely hot gunny curing, ambient curing, and external humidity curing, are investigated to identify the curing condition best suited to cast in-situ provision. Fly ash was blended with MIRHA at 3%, 5%, and 7% to identify the effect of the blended mixes on the compressive strength and microstructure properties of geopolymer concrete. The compressive strength results indicated an improvement in strength development in the externally humidity-cured samples compared with hot gunny curing and ambient curing. The blended mixes also performed better than the control mixes. Improvements in the interfacial transition zone (ITZ) and microstructure were likewise identified in the externally humidity-cured samples compared with hot gunny and ambient curing.

Energy Map Construction using Adaptive Alpha Grey Prediction Model in WSNs

Wireless Sensor Networks (WSNs) can be used to monitor physical phenomena in areas where human access is nearly impossible. Limited power supply is the major constraint of WSNs, owing to the use of non-rechargeable batteries in sensor nodes, and much research is devoted to reducing the energy consumption of sensor nodes. An energy map can be used with clustering, data dissemination and routing techniques to reduce the power consumption of a WSN; it can also be used to identify which parts of the network are likely to fail in the near future. In this paper, the energy map is constructed using a prediction-based approach, with an adaptive-alpha GM(1,1) model as the prediction model. GM(1,1) is used worldwide in many applications for predicting future values of a time series from a few past values, owing to its high computational efficiency and accuracy.
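
For reference, a compact sketch of the classic GM(1,1) recursion with a fixed background coefficient alpha = 0.5; the adaptive-alpha variant used in the paper tunes this coefficient online, which is not reproduced here.

import numpy as np

def gm11_predict(x0, steps=1, alpha=0.5):
    # Classic GM(1,1): alpha weights the background value
    # z1(k) = alpha*x1(k) + (1-alpha)*x1(k-1); the adaptive variant
    # tunes alpha online instead of fixing it at 0.5.
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                           # accumulated series (AGO)
    z1 = alpha * x1[1:] + (1 - alpha) * x1[:-1]  # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # fit x0 = -a*z1 + b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # AGO-domain model
    x0_hat = np.empty_like(x1_hat)
    x0_hat[0] = x1_hat[0]
    x0_hat[1:] = np.diff(x1_hat)                 # inverse AGO
    return x0_hat[len(x0):]                      # forecast values

# Example: predict a node's next residual-energy readings.
energy = [95.0, 92.1, 89.4, 86.9, 84.5]
print(gm11_predict(energy, steps=2))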

Lowering Error Floors by Concatenation of Low-Density Parity-Check and Array Code

Low-density parity-check (LDPC) codes have been shown to deliver capacity-approaching performance; however, problematic graphical structures (e.g. trapping sets) in the Tanner graphs of some LDPC codes can cause high error floors in bit-error-ratio (BER) performance under the conventional sum-product algorithm (SPA). This paper presents a serial concatenation scheme to avoid the trapping sets and to lower the error floors of LDPC codes. The outer code in the proposed concatenation is the LDPC code, and the inner code is a high-rate array code. The approach applies an iterative hybrid process between BCJR decoding of the array code and the SPA for the LDPC code, together with bit-pinning and bit-flipping techniques. The (2640, 1320) Margulis code has been used for the simulation, and it is shown that the proposed concatenation and decoding scheme can considerably improve the error-floor performance with minimal rate loss.
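
Of the techniques named above, the simplest to illustrate is hard-decision bit-flipping; the toy sketch below (on a small Hamming code rather than the Margulis code) shows the basic flip rule, while the full scheme's SPA/BCJR iteration is beyond a short example.

import numpy as np

def bit_flip_decode(H, y, max_iters=50):
    # Hard-decision bit-flipping: repeatedly flip the bit(s) involved
    # in the largest number of unsatisfied parity checks.
    x = y.copy()
    for _ in range(max_iters):
        syndrome = H @ x % 2              # which checks fail
        if not syndrome.any():
            return x, True                # valid codeword found
        failures = syndrome @ H           # per-bit count of failing checks
        x[failures == failures.max()] ^= 1
    return x, False

# Toy run on the (7,4) Hamming code with a single bit error.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.array([1, 0, 1, 1, 0, 0, 1])
print(bit_flip_decode(H, received))       # recovers [1 0 0 1 0 0 1]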

Simulation of Propagation of Cos-Gaussian Beam in Strongly Nonlocal Nonlinear Media Using Paraxial Group Transformation

In this paper, the propagation of a cos-Gaussian beam in strongly nonlocal nonlinear media is simulated using the paraxial group transformation. First, the cos-Gaussian beam, nonlocal nonlinear media, the critical power, the transfer matrix, and the paraxial group transformation are introduced. Then, the propagation of the cos-Gaussian beam in strongly nonlocal nonlinear media is simulated. The results show that, in this case, the beam propagation has a periodic structure arising from the self-focusing effect. Moreover, this simple method can be used to investigate the propagation of various kinds of beams in ABCD optical media.
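
In the strongly nonlocal limit the medium acts approximately as a parabolic graded-index channel, so the periodic evolution can be illustrated with a generic 1D split-step Fourier sketch; the parameters below are illustrative, not taken from the paper.

import numpy as np

wavelength = 1.0e-6
k = 2 * np.pi / wavelength
w0, omega = 50e-6, 2e4          # waist and cosine modulation parameter
g = 2 * np.pi / 2e-3            # oscillation wavenumber of the channel

N, L = 2048, 2e-3
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
kx = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

E = np.exp(-(x / w0) ** 2) * np.cos(omega * x)   # cos-Gaussian input

dz, steps = 1e-5, 400
widths = []
for _ in range(steps):
    # Diffraction sub-step, applied in Fourier space.
    E = np.fft.ifft(np.exp(-1j * kx ** 2 * dz / (2 * k)) * np.fft.fft(E))
    # Lens-like phase of the parabolic (strongly nonlocal) channel.
    E *= np.exp(-1j * 0.5 * k * g ** 2 * x ** 2 * dz)
    I = np.abs(E) ** 2
    widths.append(np.sqrt(np.sum(I * x ** 2) / np.sum(I)))  # RMS width

print(min(widths), max(widths))  # the beam width oscillates periodically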

Simulation of Non-Linear Behavior of Shear Wall under Seismic Loading

The seismic response of a steel shear wall system, considering nonlinearity effects, is investigated in this paper using the finite element method. With the availability of modern computing, nonlinear finite element analysis has become a usable and reliable means of analyzing civil structures. A numerical model based on the finite element method, accounting for large displacements and materially nonlinear behavior, is presented for the seismic analysis of shear walls, together with the development of a finite element code. The code is based on the standard Galerkin weighted-residual formulation. A two-dimensional plane stress model with a total Lagrangian formulation is used to represent the shear wall response, and the Newton-Raphson method is applied for the solution of the nonlinear transient equations. The presented model can be extended to the analysis of civil engineering structures with different material behavior and complicated geometry.
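
As a pointer to the solution strategy, here is a minimal sketch of the Newton-Raphson iteration applied to a residual/tangent pair, as used at each step of a nonlinear FE analysis; the one-degree-of-freedom hardening spring is an invented toy problem, not the paper's shear wall model.

import numpy as np

def newton_raphson(residual, tangent, u0, tol=1e-10, max_iters=25):
    # Solve R(u) = 0 with the tangent (Jacobian/stiffness) K = dR/du.
    u = u0.astype(float)
    for _ in range(max_iters):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            return u
        u += np.linalg.solve(tangent(u), -r)   # linearized correction
    raise RuntimeError("Newton-Raphson did not converge")

# Toy 1-DOF problem: hardening spring k*u + c*u**3 = f.
k, c, f = 2.0, 0.5, 3.0
R = lambda u: np.array([k * u[0] + c * u[0] ** 3 - f])
K = lambda u: np.array([[k + 3 * c * u[0] ** 2]])
print(newton_raphson(R, K, np.array([0.0])))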

Information Fusion for Identity Verification

In this paper we propose a novel approach for ascertaining human identity based on the fusion of profile-face and gait biometric cues. The identification approach, based on feature learning in a PCA-LDA subspace and classification using multivariate Bayesian classifiers, allows a significant improvement in recognition accuracy for low-resolution surveillance video scenarios. The experimental evaluation of the proposed identification scheme on a publicly available database [2] showed that the fusion of face and gait cues in the joint PCA-LDA space is a powerful method for capturing the inherent multimodality in walking gait patterns while at the same time discriminating person identity.
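
A rough sketch of such a PCA-LDA feature-learning pipeline with scikit-learn, with Gaussian naive Bayes standing in for the paper's multivariate Bayesian classifier and synthetic vectors standing in for the fused face and gait features.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
n_subjects, per_subject, dim = 5, 20, 100
# Synthetic stand-ins for fused profile-face + gait feature vectors.
X = np.vstack([rng.normal(loc=i, size=(per_subject, dim))
               for i in range(n_subjects)])
y = np.repeat(np.arange(n_subjects), per_subject)

model = make_pipeline(PCA(n_components=20),                 # compress
                      LinearDiscriminantAnalysis(n_components=4),
                      GaussianNB())                         # Bayes rule
model.fit(X, y)
print(model.score(X, y))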

Data Envelopment Analysis under Uncertainty and Risk

Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision-making units. Traditionally, it assumes that the input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that the output parameters are represented by discretely distributed random variables, and we propose two different models, defined from a risk-neutral and a risk-averse perspective respectively. The models have been validated on a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique, providing the decision maker with useful insights depending on his or her degree of risk aversion.
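
For orientation, a sketch of the deterministic input-oriented CCR model that such stochastic formulations extend, solved with scipy's linear programming routine on invented data.

import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 6.0],   # inputs:  rows = input types,
              [3.0, 1.0, 4.0]])  #          columns = DMUs
Y = np.array([[3.0, 2.0, 5.0]])  # outputs: rows = output types

def ccr_efficiency(o):
    # Envelopment form: min theta s.t. X@lam <= theta*x_o, Y@lam >= y_o.
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]               # variables: theta, lambda
    A_in = np.c_[-X[:, [o]], X]               # X@lam - theta*x_o <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]       # -Y@lam <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                            # efficiency score theta*

print([round(ccr_efficiency(o), 3) for o in range(X.shape[1])])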

Use of Semantic Networks as Learning Material and Evaluation of the Approach by Students

This article first summarizes the reasons why current approaches supporting Open Learning and Distance Education need to be complemented by tools permitting lecturers, researchers and students to cooperatively organize the semantic content of learning-related materials (courses, discussions, etc.) into a fine-grained shared semantic network. This first part of the article also briefly describes the approach adopted to permit such collaborative work. Then, examples of such semantic networks are presented. Finally, an evaluation of the approach by students is provided and analyzed.

An Improved Switching Median Filter for Uniformly Distributed Impulse Noise Removal

The performance of an image filtering system depends on its ability to detect the presence of noisy pixels in the image. Most impulse detection schemes assume the presence of salt-and-pepper noise and do not work satisfactorily in the case of uniformly distributed impulse noise. In this paper, a new algorithm is presented to improve the performance of the switching median filter in detecting uniformly distributed impulse noise. The performance of the proposed scheme is demonstrated by results obtained from computer simulations on various images.
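
A generic switching median filter detects a pixel as an impulse when it deviates strongly from its neighbourhood median, and replaces only the detected pixels. The sketch below uses a simple threshold detector, not the detection rule proposed in the paper.

import numpy as np
from scipy.ndimage import median_filter

def switching_median(img, threshold=40, size=3):
    med = median_filter(img.astype(float), size=size)
    noisy = np.abs(img.astype(float) - med) > threshold  # impulse map
    out = img.copy()
    out[noisy] = med[noisy]       # replace only the detected impulses
    return out

# Usage: corrupt a flat image with uniform-valued impulses and filter.
rng = np.random.default_rng(1)
img = np.full((64, 64), 128, dtype=np.uint8)
mask = rng.random(img.shape) < 0.1
img[mask] = rng.integers(0, 256, mask.sum())  # uniform impulse values
print(np.abs(switching_median(img).astype(int) - 128).mean())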

Noise Suppression in a Micro Stepping Motor

An investigation of the noise in a micro stepping motor is presented in this article. With the trend towards higher precision and ever smaller 3C products (Computer, Communication and Consumer Electronics), the micro stepping motor is frequently used to drive micro systems and other 3C products. Unfortunately, the noise of a micro stepping motor is too large for customers to accept. To suppress this noise, the dynamic characteristics of the system must be studied. In this article, a micro stepping motor in a digital camera, speed-controlled by a Visual Basic (VB) program, is investigated. A Kaman KD2300-2S non-contact eddy-current displacement sensor, a probe microphone, and an HP 35670A analyzer are employed to analyze the dynamic characteristics of vibration and noise in the motor. The vibration and noise measurements for different types of bearings and different treatments of the coils are compared. The rotating components of the motor (bearings, coil, etc.) play important roles in producing vibration and noise. It is found that the noise is suppressed by about 3-4 dB when the copper bearing is replaced with a plastic one, and by 6-7 dB when the motor coil is coated with paraffin wax.

Correction of Frequent English Writing Errors by Using Coded Indirect Corrective Feedback and Error Treatment

The purposes of this study are 1) to examine the frequent English writing errors of students registered in the course Reading and Writing English for Academic Purposes II, and 2) to determine the results of writing error correction using coded indirect corrective feedback and writing error treatments. The sample comprised 28 second-year English major students of the Faculty of Education, Suan Sunandha Rajabhat University. The experimental tool was the lesson plan of the course, and the data collection tool consisted of 4 writing tests of short texts. The research findings disclose that the frequent English writing errors found in this course comprise 7 types of grammatical errors, namely fragment sentences, subject-verb agreement, wrong verb tense forms, singular or plural noun endings, run-on sentences, wrong verb pattern forms and lack of parallel structure. Moreover, the results of writing error correction using coded indirect corrective feedback and error treatment reveal an overall reduction of the frequent English writing errors and an increase in the students' achievement in writing short texts, significant at the .05 level.

Skew Detection Technique for Binary Document Images based on Hough Transform

Document image processing has become an increasingly important technology in the automation of office documentation tasks. During document scanning, skew is inevitably introduced into the incoming document image. Since algorithms for layout analysis and character recognition are generally very sensitive to page skew, skew detection and correction in document images are critical steps before layout analysis. In this paper, a novel skew detection method is presented for binary document images. The method selects certain characters of the text, which are subjected to thinning and the Hough transform to estimate the skew angle accurately. Several experiments have been conducted on various types of documents (English documents, journals, textbooks, documents in different languages and fonts, and documents with different resolutions) to demonstrate the robustness of the proposed method. The experimental results reveal that the proposed method is accurate compared with well-known existing methods.
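
A generic Hough-transform skew estimator can be sketched with OpenCV as follows; the paper's method additionally selects and thins particular characters before voting, which is omitted here.

import numpy as np
import cv2

def estimate_skew(binary_img, max_skew_deg=15):
    lines = cv2.HoughLines(binary_img, 1, np.pi / 180, 200)
    if lines is None:
        return 0.0
    angles = []
    for rho, theta in lines[:, 0]:
        deg = np.degrees(theta) - 90.0   # 0 degrees = horizontal line
        if abs(deg) <= max_skew_deg:     # keep plausible text-line angles
            angles.append(deg)
    return float(np.median(angles)) if angles else 0.0

# Usage sketch (assumes "page.png" is a scanned document):
# img = cv2.imread("page.png", cv2.IMREAD_GRAYSCALE)
# binary = cv2.threshold(img, 0, 255,
#                        cv2.THRESH_BINARY_INV | cv2.THRESH_OTSU)[1]
# print(estimate_skew(binary))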

Generating Qualitative Causal Graph using Modeling Constructs of Qualitative Process Theory for Explaining Organic Chemistry Reactions

This paper discusses the causal explanation capability of QRIOM, a tool aimed at supporting the learning of organic chemistry reactions. The development of the tool is based on the hybrid use of the Qualitative Reasoning (QR) technique and the Qualitative Process Theory (QPT) ontology. Our simulation combines symbolic, qualitative descriptions of relations with quantity analysis to generate causal graphs. The pedagogy embedded in the simulator is to both simulate and explain organic reactions. Qualitative reasoning over a causal chain is presented to explain the overall changes made to the substrate, from the initial substrate to the production of the final outputs. Several uses of the QPT modeling constructs in supporting behavioral and causal explanation at run time are also demonstrated. Explaining organic reactions through a causal graph trace can help improve learners' reasoning ability, in that their conceptual understanding of the subject is nurtured.
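
The flavour of such causal-graph explanations can be conveyed by a toy sign-propagation sketch; the quantities and influence edges below are invented for illustration and are not QRIOM's actual model.

# Direct influences / qualitative proportionalities carry signs along
# a causal graph, so a change in one quantity explains changes downstream.
influences = {
    "acid_concentration": [("protonation_rate", +1)],
    "protonation_rate":   [("carbocation_amount", +1)],
    "carbocation_amount": [("substrate_amount", -1),
                           ("product_amount", +1)],
}

def propagate(quantity, sign=+1, trace=None):
    # Depth-first sign propagation producing a causal explanation.
    trace = [] if trace is None else trace
    for target, edge_sign in influences.get(quantity, []):
        effect = sign * edge_sign
        trace.append(f"{quantity} {'rises' if sign > 0 else 'falls'} "
                     f"-> {target} {'rises' if effect > 0 else 'falls'}")
        propagate(target, effect, trace)
    return trace

for step in propagate("acid_concentration", +1):
    print(step)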

Power and Delay Optimized Graph Representation for Combinational Logic Circuits

Structural representation and technology mapping of a Boolean function is an important problem in the design of non-regenerative digital logic circuits (also called combinational logic circuits). Library-aware function manipulation offers a solution to this problem. Compact multi-level representations of binary networks based on simple circuit structures, such as AND-Inverter Graphs (AIG) [1] [5], NAND Graphs, OR-Inverter Graphs (OIG), AND-OR Graphs (AOG), AND-OR-Inverter Graphs (AOIG), AND-XOR-Inverter Graphs, and Reduced Boolean Circuits [8], exist in the literature. In this work, we discuss a novel and efficient graph realization for combinational logic circuits, the NAND-NOR-Inverter Graph (NNIG), which is composed of only two-input NAND (NAND2), two-input NOR (NOR2) and inverter (INV) cells. The networks are constructed on the basis of irredundant disjunctive and conjunctive normal forms, after factoring, comprising terms with minimum support. Construction of an NNIG for a non-regenerative function in normal form is straightforward, whereas for the complementary phase it is developed by considering a virtual instance of the function. The choice of the best NNIG for a given function is based on the literal count, cell count and DAG node count of the implementation at the technology-independent stage; in case of a tie, the final decision is made after extracting the physical design parameters. We consider the AIG representation for the reduced disjunctive normal form and the best of OIG/AOG/AOIG for the minimized conjunctive normal forms; this is necessitated by the nature of certain functions, such as Achilles'-heel functions. NNIGs are found to exhibit a 3.97% lower node count than AIGs and OIG/AOG/AOIGs, and to consume 23.74% and 10.79% fewer library cells than AIGs and OIG/AOG/AOIGs respectively, for the various samples considered. We compare the power efficiency and delay improvement achieved by optimal NNIGs over minimal AIGs and OIG/AOG/AOIGs in various case studies. In comparison with functionally equivalent, irredundant and compact AIGs, NNIGs report mean savings in power and delay of 43.71% and 25.85% respectively, after technology mapping with a 0.35 micron TSMC CMOS process. Compared with OIG/AOG/AOIGs, NNIGs demonstrate average savings in power and delay of 47.51% and 24.83%. With respect to the device count needed for implementation in static CMOS logic style, NNIGs utilize 37.85% and 33.95% fewer transistors than their AIG and OIG/AOG/AOIG counterparts.
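
A toy sketch of an NNIG-style data structure: two-input NAND/NOR nodes with inversion kept as an edge attribute, and structural hashing to merge duplicate nodes, in the manner of AIG packages; this is a generic illustration, not the authors' implementation.

class NNIG:
    # Two-input NAND2/NOR2 nodes; inversion lives on edges as a literal
    # attribute, and structural hashing merges duplicate nodes.
    def __init__(self):
        self.nodes = {}         # (op, fanin0, fanin1) -> node id
        self.next_id = 1        # id 0 reserved for constant 0

    def _gate(self, op, a, b):
        key = (op, *sorted((a, b)))     # canonical fanin order
        if key not in self.nodes:       # structural hashing
            self.nodes[key] = self.next_id
            self.next_id += 1
        return self.nodes[key]

    # A literal is a (node id, inverted?) pair; INV just flips the flag.
    def inv(self, lit):
        return (lit[0], not lit[1])

    def nand2(self, p, q):
        return (self._gate("NAND", p, q), False)

    def nor2(self, p, q):
        return (self._gate("NOR", p, q), False)

g = NNIG()
a, b = (g.next_id, False), (g.next_id + 1, False)
g.next_id += 2                  # register two primary inputs
f1 = g.nand2(a, b)
f2 = g.nand2(b, a)              # hashed to the same node as f1
print(f1 == f2, len(g.nodes))   # True 1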

Strengthening the HCI Approaches in the Software Development Process

User-Centered Design (UCD), Usability Engineering (UE) and Participatory Design (PD) are the common Human-Computer Interaction (HCI) approaches practiced in the software development process, focusing on issues and matters concerning user involvement. They overlook the organizational perspective of HCI integration within the software development organization. The Management Information Systems (MIS) perspective on HCI takes a managerial and organizational view of the effectiveness of integrating HCI into the software development process. Human-Centered Design (HCD), which encompasses all human aspects including aesthetics and ergonomics, is claimed to provide a better way of strengthening the HCI approaches and thereby the software development process. To determine the effectiveness of HCD in the software development process, this paper presents the findings of a content analysis of HCI approaches, viewing those approaches as a technology that integrates user requirements, ranging from top management to other stakeholders in the software development process. The findings show that HCD is an approach that emphasizes humans, tools and knowledge in strengthening the HCI approaches, and thereby the software development process, in the quest to produce sustainable, usable and useful software products.