Data Transmission Reliability in Short Message Integrated Distributed Monitoring Systems

Short message integrated distributed monitoring systems (SM-DMS) are being used increasingly in wireless communication applications in areas such as electromagnetic field (EMF) management, wastewater monitoring, and air pollution supervision. However, delays in short messages often make the data carried by an SM-DMS arrive unreliably, and few regulations in SMS transmission protocols address this problem. In this study, based on an analysis of the command and data requirements of the SM-DMS, we developed a processing model for the control center that solves the delay problem in data transmission. The three components of the model, the data transmission protocol, the receiving buffer pool method, and the timer mechanism, are described in detail. Adjusting the threshold parameter of the timer mechanism is discussed with regard to adaptive performance during SM-DMS runtime. This model optimizes data transmission reliability in SM-DMS and supplements data transmission reliability protocols at the application level.
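
As a rough illustration of how the receiving buffer pool and the timer mechanism could work together, the Python sketch below reorders delayed messages by sequence number and, once a timeout threshold expires, gives up waiting for a missing message and resynchronizes. The (seq, payload) message format and all class and field names are assumptions for illustration; the abstract does not specify the protocol fields.

```python
import time

class ReceivingBufferPool:
    """Minimal sketch of a receiving buffer pool with a timer threshold."""

    def __init__(self, timeout_s=60.0):
        self.timeout_s = timeout_s   # timer threshold, tunable at runtime
        self.pool = {}               # seq -> (payload, arrival_time)
        self.next_seq = 0            # next sequence number to deliver

    def on_message(self, seq, payload):
        """Buffer an incoming short message and deliver what is in order."""
        self.pool[seq] = (payload, time.monotonic())
        return self._drain()

    def _drain(self):
        """Deliver in-order messages; skip a missing one after the timeout."""
        delivered = []
        while True:
            if self.next_seq in self.pool:
                payload, _ = self.pool.pop(self.next_seq)
                delivered.append(payload)
                self.next_seq += 1
            elif self.pool and all(
                time.monotonic() - t > self.timeout_s
                for _, t in self.pool.values()
            ):
                # Timer expired: give up on the delayed message, resync.
                self.next_seq = min(self.pool)
            else:
                break
        return delivered
```

Raising timeout_s makes the pool tolerate longer SMS delays at the cost of end-to-end latency; this is the trade-off behind the runtime threshold adjustment discussed above.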

Agent-Based Offline Electronic Voting

Many electronic voting systems, classified mainly as homomorphic-cryptography based, mix-net based, and blind-signature based, emerged after the eighties, when zero-knowledge proofs were introduced. The common ground of all three classes is that none of them works without real-time cryptologic calculations performed on a server. To the best of our knowledge, the agent-based approach has not been used in a secure electronic voting system. In this study, an agent-based electronic voting scheme that requires no real-time calculations on the server side is proposed. Conventional cryptologic methods are used in the proposed scheme, and some of the requirements of an electronic voting system are satisfied within it. The scheme appears quite secure provided the cryptologic methods and agents used are secure. In this paper, the proposed scheme is explained and compared with known electronic voting systems.

A Taxonomy of Internal Attacks in Wireless Sensor Networks

Developments in communication technologies, especially wireless ones, have enabled the emergence of low-cost and low-power wireless sensor networks (WSNs). Such WSNs are characterized by minimal energy reserves, weak computational capabilities, wireless communication, and the open medium in which the sensors are deployed. WSNs are application driven, serving domains such as military operations and the health sector. Due to the intrinsic nature of the network and its application scenarios, WSNs are vulnerable to many external and internal attacks. In this paper we focus on the types of internal attacks on WSNs based on the OSI model and discuss some security requirements, characteristics, and challenges of WSNs, thereby contributing to WSN security research.
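
To make the idea of an OSI-based taxonomy concrete, here is a small illustrative mapping of commonly cited internal WSN attacks onto OSI layers; the paper's actual grouping may differ.

```python
# Illustrative mapping of internal WSN attacks onto OSI layers; the exact
# taxonomy in the paper may group attacks differently.
INTERNAL_ATTACKS_BY_OSI_LAYER = {
    "physical":    ["node tampering", "jamming"],
    "data link":   ["collision", "exhaustion", "unfairness"],
    "network":     ["selective forwarding", "sinkhole", "Sybil",
                    "wormhole", "hello flood", "spoofed routing info"],
    "transport":   ["SYN flooding", "desynchronization"],
    "application": ["false data injection", "clone/replication"],
}

for layer, attacks in INTERNAL_ATTACKS_BY_OSI_LAYER.items():
    print(f"{layer:>11}: {', '.join(attacks)}")
```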

Performance Evaluation of QoS Parameters in Cognitive Radio Using Genetic Algorithm

Efficient use of the available licensed spectrum is becoming increasingly critical as demand for and usage of the radio spectrum grow. This paper shows how spectrum use and dynamic spectrum management can be managed effectively, and how spectrum allocation schemes in wireless communication systems can be implemented and used in the future; it is an attempt toward better utilization of the spectrum. The research focuses mainly on the decision-making process, assuming that the radio environment has already been sensed and that the QoS requirements for the application have been specified, either by the sensed radio environment or by the secondary user itself. We identify and study the characteristic parameters of cognitive radio and use a genetic algorithm for spectrum allocation. Performance evaluation is carried out using MATLAB toolboxes.
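
A genetic algorithm for such a decision engine typically encodes transmission parameters as a chromosome and scores it with a weighted QoS fitness. The sketch below (in Python rather than MATLAB) uses hypothetical parameter sets, weights, and operators; the paper's chromosome and objectives depend on the sensed environment and the secondary user.

```python
import random

# Hypothetical decision variables for one link.
POWER   = [1, 2, 4, 8]   # mW
MOD_BPS = [1, 2, 4, 6]   # bits/symbol (e.g. BPSK .. 64-QAM)

def fitness(gene, w_throughput=0.6, w_power=0.4):
    """Weighted QoS score: reward throughput, penalize transmit power."""
    p_idx, m_idx = gene
    throughput = MOD_BPS[m_idx] / max(MOD_BPS)   # normalized
    power_cost = POWER[p_idx] / max(POWER)       # normalized
    return w_throughput * throughput - w_power * power_cost

def evolve(pop_size=20, generations=50, p_mut=0.1):
    pop = [(random.randrange(4), random.randrange(4)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                 # one-point crossover
            if random.random() < p_mut:          # mutation
                child = (random.randrange(4), child[1])
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())  # best (power index, modulation index) found
```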

Effect of Flying Capacitors on Improving the Four-Level Three-Cell Inverter

With the rapid advance of technology, industrial processes are becoming increasingly demanding in terms of power quality and controllability. The advent of multilevel inverters responds partially to these requirements, and the new generation of multicell inverters reaches even higher performance, since it offers more voltage levels. The disadvantage of increasing the voltage levels through the number of cascaded cells is the loss of synchronization of the series IGBTs, which limits the cascade to 4 cells. In view of these constraints, a new topology is proposed in this paper that increases the voltage levels of the three-cell inverter from 4 to 8 with the same number of IGBTs, while using less energy stored in the flying capacitors. The operation and modelling of this new inverter structure are presented in detail and then tested on a three-phase induction motor.

Keywords: flying capacitors, multicell inverter, PWM, switches, modelling.
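
The level count can be checked by enumerating switch states. In the conventional three-cell flying-capacitor inverter every cell contributes the same step E/3, so the 2^3 switch states collapse onto only 4 output levels; making the steps unequal is one way to obtain all 8. The sketch below is purely an illustration of that counting argument, since the paper's actual topology is not detailed in the abstract.

```python
from itertools import product

E, CELLS = 300.0, 3   # DC bus voltage (V), number of cells

# Conventional three-cell flying-capacitor inverter: each cell k switches
# the step v_k - v_(k-1) = E/CELLS, so V_out = sum(s_k) * E/CELLS.
levels4 = sorted({sum(s) * E / CELLS for s in product((0, 1), repeat=CELLS)})
print(levels4)        # [0.0, 100.0, 200.0, 300.0] -> only 4 distinct levels

# Unequal (here binary-weighted) steps make all 2^3 combinations distinct;
# the paper's topology may realize the 8 levels differently.
steps = [E / 7, 2 * E / 7, 4 * E / 7]
levels8 = sorted({sum(w for w, s in zip(steps, state) if s)
                  for state in product((0, 1), repeat=CELLS)})
print(len(levels8))   # 8
```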

Optimal Algorithm for Constructing the Delaunay Triangulation in E^d

In this paper we propose a new approach to constructing the Delaunay triangulation and an optimal algorithm for the case of multidimensional spaces (d ≥ 2). Analysing the state of the art, one may conclude that the ideas behind the existing efficient algorithms developed for the two-dimensional case are not easy to generalize to the multidimensional case without loss of efficiency. To solve this problem we offer an efficient algorithm that satisfies all the given requirements. The theoretical complexity of the problem, however, cannot be improved, since the worst-case optimality of algorithms solving it has been proved.
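
For comparison, existing Qhull-based libraries already construct Delaunay triangulations for arbitrary d ≥ 2; the snippet below illustrates the input and output of the problem, not the algorithm proposed in the paper.

```python
import numpy as np
from scipy.spatial import Delaunay  # Qhull-based, works for any d >= 2

# Delaunay triangulation of random points in E^d (here d = 3): each
# simplex is a tetrahedron given by d + 1 vertex indices.
rng = np.random.default_rng(0)
points = rng.random((50, 3))
tri = Delaunay(points)
print(tri.simplices.shape)  # (n_simplices, d + 1)
```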

Evaluation of University Technology Malaysia on Campus Transport Access Management

Access management is the proactive management of vehicular access points to land parcels adjacent to all manner of roadways. Good access management promotes safe and efficient use of the transportation network. This study utilizes archived data from the on-campus area of the University Technology of Malaysia to assess how accurately access management delivers its claimed benefits. Results show that access management reduces delay and leads to fewer crashes. Clustered development can improve walking, cycling, and transit travel, reduce parking requirements, and improve emergency response. Effective access management planning can also reduce total roadway facility costs by reducing the number of driveways and intersections. Finally, recommendations are presented, and the travel impacts and benefits that could be derived if these suggestions were implemented are summarized with related comments.

Multifunctional Barcode Inventory System for Retailing. Are You Ready for It?

This paper describes the development of a Multifunctional Barcode Inventory Management System (MBIMS) to manage inventory and stock ordering. Today, most of the retail market still records its stock manually, which is far from effective. MBIMS brings effectiveness to retail inventory management: it saves time not only in recording incoming and outgoing stock and refilling inventory, but also in calculating the remaining stock, and it provides an auto-ordering function. The system was developed through the System Development Life Cycle (SDLC), and its flow and structure are built entirely on the requirements of a retail market. Furthermore, the system emerged from methodical research and study in which each part was carefully designed. Thus, MBIMS offers a good solution for the retail market to achieve effectiveness and efficiency in inventory management.
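
The auto-ordering function can be reduced to a classic reorder-point rule: when a barcode scan drives the on-hand quantity below a threshold, an order for a fixed quantity is generated. The sketch below is a minimal illustration with hypothetical field names; the actual MBIMS data model is not given in the abstract.

```python
from dataclasses import dataclass

@dataclass
class StockItem:
    barcode: str
    name: str
    on_hand: int
    reorder_point: int   # trigger level
    reorder_qty: int     # quantity to auto-order

def record_sale(item: StockItem, qty: int) -> str | None:
    """Decrement stock on a barcode scan; emit an order below threshold."""
    item.on_hand -= qty
    if item.on_hand <= item.reorder_point:
        return f"ORDER {item.reorder_qty} x {item.barcode} ({item.name})"
    return None

milk = StockItem("4006381333931", "Milk 1L",
                 on_hand=12, reorder_point=10, reorder_qty=24)
print(record_sale(milk, 3))  # ORDER 24 x 4006381333931 (Milk 1L)
```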

The Association between Firm Characteristics and Corporate Mandatory Disclosure: The Case of Greece

The main thrust of this paper is to assess the level of disclosure in the annual reports of non-financial Greek firms and to empirically investigate the hypothesized impact of several firm characteristics on the extent of mandatory disclosure. A disclosure checklist consisting of 100 mandatory items was developed to assess the level of disclosure in the 2009 annual reports of 43 Greek companies listed on the Athens Stock Exchange. The association between the level of disclosure and selected firm characteristics was examined using multiple linear regression analysis. The study reveals that Greek companies in general have responded adequately to the mandatory disclosure requirements of the regulatory bodies. The findings also indicate that firm size is significantly and positively associated with the level of disclosure. The remaining variables, such as age, profitability, liquidity, and board composition, were found to be insignificant in explaining the variation in mandatory disclosure. The outcome of this study is of great interest to the investment community at large, assisting it in evaluating the extent of mandatory disclosure by Greek firms and in explaining the variation in disclosure in light of firm-specific characteristics.
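
The regression design can be sketched as follows: the disclosure index derived from the 100-item checklist is regressed on firm characteristics with OLS. The data below are simulated and the variable names illustrative; only the method mirrors the paper.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: disclosure index (0-1) vs. firm characteristics.
rng = np.random.default_rng(1)
n = 43                                   # number of sampled firms
size = rng.normal(6, 1, n)               # e.g. log(total assets)
age  = rng.uniform(5, 60, n)             # firm age in years
roa  = rng.normal(0.05, 0.03, n)         # profitability proxy
disclosure = 0.6 + 0.04 * size + rng.normal(0, 0.05, n)

X = sm.add_constant(np.column_stack([size, age, roa]))
model = sm.OLS(disclosure, X).fit()
print(model.summary())  # size loads positive; age and roa are noise here
```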

Advanced Information Extraction with n-gram based LSI

The number of documents being created grows at an increasing pace, with most of them covering already known topics and few introducing new concepts. This fact has started a new era in the information retrieval discipline, with its own specialized requirements: digging into topics and concepts and discovering subtopics or relations between topics. Until now, IR research has been interested in retrieving documents about a general topic or clustering documents under generic subjects. These conventional approaches, however, cannot go deep into the content of documents, which makes it difficult for people to reach exactly the documents they are searching for. We therefore need new ways of mining document sets in which the critical point is to know as much as possible about the contents of the documents. As a solution, we propose to enhance LSI, one of the proven IR techniques, by supporting its vector space with n-gram forms of words. The positive results we have obtained are shown in two different application areas of the IR domain: querying a document database and clustering documents in the document database.
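
A minimal sketch of this kind of enhancement, assuming character n-grams and standard tf-idf weighting (the paper's exact n-gram scheme and weighting may differ): n-gram features feed a truncated SVD, which is the core of LSI.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

docs = [
    "latent semantic indexing maps terms to concepts",
    "character n-grams make the term space robust to morphology",
    "clustering groups documents under generic subjects",
]

# Character n-grams (here 3-5) instead of plain words enrich the LSI
# term space before the low-rank decomposition.
tfidf = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
X = tfidf.fit_transform(docs)

lsi = TruncatedSVD(n_components=2, random_state=0)  # rank-k concept space
Z = lsi.fit_transform(X)
print(Z.shape)  # (n_docs, k): documents embedded in the concept space
```

Queries are projected into the same space with lsi.transform(tfidf.transform([query])) and ranked by cosine similarity, and the same embedding serves for clustering.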

Evaluating the Effectiveness of Memory Overcommit Techniques on KVM-based Hosting Platform

Determining how many virtual machines a Linux host can run is a challenge; one of the toughest tasks is to find the balance between performance, density, and usability. The KVM hypervisor has become the most popular open-source full virtualization solution, and it supports several ways of running guests with more memory than the host actually has. Because of the large differences between minimum and maximum guest memory requirements, this paper presents initial results on same-page merging, ballooning, and live migration, techniques that aim at optimal memory usage on a KVM-based cloud platform. Given the design of these initial experiments, the resulting data are a useful reference for system administrators. The experiments lead to the conclusion that each method offers a different reliability trade-off.
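
Of the three techniques, same-page merging (KSM) is the easiest to observe from the host. A minimal Linux-only sketch, assuming KSM is compiled into the kernel and exposed at the usual sysfs path:

```python
from pathlib import Path

KSM = Path("/sys/kernel/mm/ksm")  # kernel samepage merging interface

def ksm_saved_mib(page_size: int = 4096) -> float:
    """Memory saved by KSM: each page counted in pages_sharing has been
    deduplicated against a page counted in pages_shared."""
    pages_sharing = int((KSM / "pages_sharing").read_text())
    return pages_sharing * page_size / 2**20

# Enabling the merge daemon requires root: (KSM / "run").write_text("1")
print(f"KSM currently saves ~{ksm_saved_mib():.1f} MiB")
```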

Software Architecture Recovery

The advent of modern technology casts its repercussions on once-successful legacy systems, making them obsolete with time. These systems have left large organizations with major problems in terms of new business requirements, response time, financial depreciation, and maintenance. The major difficulty stems from constant system evolution and from the incomplete, inconsistent, and obsolete documentation that a legacy system tends to have. The myriad dimensions of such systems can only be explored through reverse engineering, which in this context is the best method for extracting useful artifacts and for exploiting those artifacts to reengineer existing legacy systems to meet new organizational requirements. A case study is conducted on six different types of software systems, with source code in different programming languages, using the architectural recovery framework.

A Specification-Based Approach for Retrieval of Reusable Business Component for Software Reuse

Software reuse can be considered the most realistic and promising way to improve software engineering productivity and quality. Automated assistance for software reuse involves the representation, classification, retrieval, and adaptation of components. Representation and retrieval of components are central to software reuse in Component-Based Software Development (CBSD). However, current industrial component models focus mainly on implementation techniques and ignore semantic information about components, so it is difficult to retrieve the components that satisfy users' requirements. This paper presents a method of business component retrieval based on specification matching to support software reuse in enterprise information systems. First, a reuse-oriented business component model is proposed. In this model, business data types are represented as signed data types based on XML, which can express the variable business data describing a variety of business operations. Based on this model, we propose specification-match relationships at two levels: the business operation level and the business component level. At the business operation level, we use input business data types, output business data types, and the taxonomy of business operations to evaluate the similarity between business operations. At the business component level, we propose five specification matches between business components. To retrieve reusable business components, we propose similarity-degree measures to calculate the similarities between business components. Finally, an SQL-like business component retrieval command is proposed to help users retrieve approximate business components from a component repository.
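
A minimal sketch of operation-level specification matching, assuming Jaccard similarity over input/output type sets plus an exact taxonomy match; the weights and representation are illustrative, and the paper's five component-level match relations are richer than this.

```python
def type_similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two sets of business data types."""
    return len(a & b) / len(a | b) if a | b else 1.0

def operation_similarity(op_a, op_b, w_in=0.4, w_out=0.4, w_tax=0.2):
    """Weighted operation-level match over inputs, outputs, taxonomy.
    op = (input_types, output_types, taxonomy_path)."""
    in_a, out_a, tax_a = op_a
    in_b, out_b, tax_b = op_b
    tax = 1.0 if tax_a == tax_b else 0.0
    return (w_in * type_similarity(in_a, in_b)
            + w_out * type_similarity(out_a, out_b)
            + w_tax * tax)

create_order = ({"Customer", "ProductList"}, {"Order"}, "sales/ordering")
place_order  = ({"Customer", "Cart"},        {"Order"}, "sales/ordering")
print(operation_similarity(create_order, place_order))  # ~0.73
```

Ranking repository components by such a score against a query specification is what an SQL-like retrieval command would do under the hood.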

Concurrent Testing of ADCs for Embedded Systems

Compaction testing methods allow at-speed detection of errors while having a low implementation cost. Owing to this distinctive feature, compaction methods have been widely used for built-in testing as well as external testing. In the latter case, the bandwidth requirements on the automated test equipment employed are relaxed, which reduces the overall cost of testing. Concurrent compaction testing methods use operational signals to detect misbehavior of the device under test and do not require input test stimuli. So far, these methods have been employed for digital systems only. In the present work, we extend the use of compaction methods to the concurrent testing of analog-to-digital converters. We estimate tolerance bounds for the result of compaction and evaluate the aliasing rate.
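
A minimal sketch of the idea, assuming a simple additive compactor (real schemes often use MISR or polynomial signatures): the compacted device response must stay within tolerance bounds derived from the per-sample tolerance, and opposite-signed errors that cancel inside the sum are precisely the aliasing the paper evaluates.

```python
import numpy as np

def compact(codes: np.ndarray) -> int:
    """Additive compactor: sum of output codes over a test window."""
    return int(codes.sum())

def check(dut_codes, ref_codes, lsb_tolerance=1):
    """Pass if the compacted DUT response is within the bound implied by
    a per-sample tolerance of lsb_tolerance LSB."""
    bound = lsb_tolerance * len(dut_codes)
    return abs(compact(dut_codes) - compact(ref_codes)) <= bound

t = np.linspace(0, 1, 1000, endpoint=False)
ref = np.round(2047 * (1 + np.sin(2*np.pi*5*t))).astype(int)   # ideal 12-bit
dut = ref + np.random.default_rng(0).integers(-1, 2, ref.size)  # +-1 LSB noise
print(check(dut, ref))  # True: within tolerance bounds
```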

Neuro-Hybrid Models for Automotive System Identification

In automotive systems, almost all steps in calibrating the various control systems, e.g., the low idle governor or the boost pressure governor, are carried out on the vehicle, because the time-to-production and cost requirements of the projects do not allow the vehicle analysis necessary to build reliable models. Here we present a procedure using parametric and NN (neural network) models that enables the generation of vehicle system models from normal ECU (engine control unit) vehicle measurements. These models are locally valid and permit pre- and follow-up calibrations, so that only the final calibrations have to be done on the vehicle.
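
One common way to realize such a neuro-hybrid (grey-box) model is to let a linear ARX part capture the dominant dynamics and train an NN on the residual nonlinearity. The sketch below uses simulated data and illustrative signal names; the paper's model structure may differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, 500)                  # e.g. actuator command
y = np.zeros(500)
for k in range(2, 500):                      # unknown "vehicle" dynamics
    y[k] = 0.7*y[k-1] - 0.1*y[k-2] + 0.5*u[k-1] + 0.2*np.tanh(3*u[k-2])

X = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])  # ARX regressors
target = y[2:]

linear = LinearRegression().fit(X, target)                # parametric part
residual = target - linear.predict(X)
nn = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                  random_state=0).fit(X, residual)        # neural part

y_hat = linear.predict(X) + nn.predict(X)
print(f"RMSE: {np.sqrt(np.mean((target - y_hat) ** 2)):.4f}")
```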

Lessons from Applying XP Methodology to Business Requirements Engineering in Developing Countries Context

Most standard software development methodologies are rarely applied to software projects in the developing countries of the world. The approach generally practiced is close to what eXtreme Programming (XP) promotes: just keep coding and testing as the requirements evolve. XP is an agile software development methodology with an inherent capability for improving the efficiency of Business Software Development (BSD), and it can facilitate the Business-to-Development (B2D) relationship thanks to its customer-oriented advocacy. From a practitioner's point of view, we applied XP to BSD, and the results show that customer involvement has a positive impact on productivity but can also frustrate the success of the project. In an effort to promote software engineering practice in developing countries of Africa, we present the experiment performed, the lessons learned, the problems encountered, and the solutions adopted in applying the XP methodology to BSD.

Real-Time Compensation of Machining Errors for NC Machine Tools Based on Systematic Dispersion

Manufacturing tolerancing is intended to determine the intermediate geometrical and dimensional states of a part during its manufacturing process. These manufacturing dimensions serve to satisfy not only the functional requirements given in the definition drawing but also the manufacturing constraints, for example geometrical defects of the machine, vibration, and wear of the cutting tool. In this paper, an experimental study of the influence of cutting tool wear (systematic dispersions) is presented. The study was carried out in three stages. The first stage consists of machining without eliminating any dispersions (random or systematic), giving the manufacturing tolerances according to the total dispersions. In the second stage, the results of the first stage are filtered so as to obtain the tolerances according to the random dispersions. Finally, from the two previous stages, the systematic dispersions are derived. The objective of this study is to model, by the least-squares method, the manufacturing error due to systematic dispersion. Finally, an approach for optimizing the manufacturing tolerances was developed for machining on a CNC machine tool.
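
The least-squares modelling step can be illustrated on a simulated batch: the systematic dispersion appears as a drift of the measured dimension with the part index, and the fitted trend, sign-reversed, is the compensation offset fed back to the NC. The numbers below are invented for illustration.

```python
import numpy as np

n = np.arange(1, 41)         # part number in the batch
nominal = 50.000             # mm, target dimension
rng = np.random.default_rng(2)
# Tool wear drift (systematic) plus random dispersion:
measured = nominal + 0.0008 * n + rng.normal(0, 0.004, n.size)

a, b = np.polyfit(n, measured - nominal, 1)  # least-squares trend e(n)=a*n+b
compensation = -(a * n + b)                  # offset sent to the NC
corrected = measured + compensation
print(f"drift {a*1000:.2f} um/part; "
      f"residual std {corrected.std()*1000:.1f} um")
```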

Throughput Analysis over Power Line Communication Channel in an Electric Noisy Scenario

Power line communications (PLC), as an alternative method for broadband networking, has the advantage of transmitting over channels already used for electrical distribution or even transmission. However, these channels were not designed to meet the requirements that usual wired channels satisfy for broadband applications, such as stable impedance or known attenuation, and the network has to reject the noise produced by electrical appliances sharing the same channel. Noise control standards are difficult to comply with, or simply do not exist, in Latin American environments. This paper analyzes PLC throughput for home connectivity by probing noisy channel scenarios in a PLC network, and the statistical results are presented.

Development of a Microsensor to Minimize Post Cataract Surgery Complications

This paper presents the design and characterization of a microaccelerometer intended for integration into a cataract surgical probe to detect the hardness of different eye tissues during cataract surgery. The soft posterior lens capsule of the eye can easily be damaged, in contrast to the hard opaque lens, since the surgeon cannot see directly behind the cutting needle during surgery. The microsensor helps the surgeon avoid rupturing the posterior lens capsule, which, if it occurs, leads to severe complications such as glaucoma, infection, or even blindness. The microsensor, with overall dimensions of 480 μm x 395 μm, delivers significant capacitance variations in the vibration situations encountered, which makes it capable of distinguishing between different types of tissue. On-chip integration of the electronic components ensures a high level of reliability and noise immunity while minimizing space and power requirements. The physical characteristics and performance-testing results establish the microsensor as an effective tool to aid the surgeon during this procedure.
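
The capacitance variation of such a sensor follows from the parallel-plate relation C = ε0·A/d: a deflection of the proof mass changes the gap and hence the capacitance. The electrode area and gap below are hypothetical (the 480 μm x 395 μm figure quoted above is the overall die size, not the electrode area).

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def gap_capacitance(area_m2: float, gap_m: float) -> float:
    """Parallel-plate capacitance C = eps0 * A / d (air gap assumed)."""
    return EPS0 * area_m2 / gap_m

area = 200e-6 * 200e-6  # hypothetical 200 um x 200 um electrode
for gap_nm in (2000, 1900, 1800):  # proof mass deflects on tissue contact
    c_fF = gap_capacitance(area, gap_nm * 1e-9) * 1e15
    print(f"gap {gap_nm} nm -> C = {c_fF:.1f} fF")
```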

Intellectual Capital Research through Corporate Social Responsibility: (Re)Constructing the Agenda

The business strategy of any company wanting to be competitive in the market should be designed around the concept of intangibles, which play an increasingly decisive role in the knowledge transfer of the biggest corporations. Advancing research in these areas, this study integrates the two approaches, emphasizing the relationships between the components of intellectual capital and corporate social responsibility. The three dimensions of intellectual capital are debated in terms of sustainability requirements. The paper introduces the concept of sustainable intellectual capital and discusses it within an assessment model based on key performance indicators. The results concern the assessment of possible ways of including information on intellectual capital and corporate responsibility in corporate strategy. The conclusions underline the need for companies to be ready to support the integration of this type of information into the knowledge transfer process, in order to develop a competitive advantage in the market.