Data Transmission Reliability in Short Message Integrated Distributed Monitoring Systems

Short message integrated distributed monitoring systems (SM-DMS) are increasingly used in wireless communication applications in areas such as electromagnetic field (EMF) management, wastewater monitoring, and air pollution supervision. However, delays in short message delivery often make the data carried by an SM-DMS arrive unreliably, and SMS transmission protocols contain few provisions for dealing with this problem. In this study, based on an analysis of the command and data requirements of SM-DMS, we developed a processing model for the control center to solve the delay problem in data transmission. The three components of the model, the data transmission protocol, the receiving buffer pool method, and the timer mechanism, are described in detail. Adjustment of the threshold parameter in the timer mechanism is discussed with a view to adaptive performance during SM-DMS runtime. The model improves data transmission reliability in SM-DMS and provides a supplement to data transmission reliability protocols at the application level.
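
As an illustration of how a receiving buffer pool combined with a timer threshold could be realized, a minimal Python sketch follows. The message fields, the fixed threshold value, and the reassembly logic are assumptions made for the example, not the protocol defined in the paper; in practice the threshold would be adapted at runtime as discussed above.

```python
import time
from collections import defaultdict

class ReceivingBufferPool:
    """Illustrative buffer pool: reassembles multi-part short messages and
    expires groups whose segments do not all arrive within a time threshold."""

    def __init__(self, threshold_s=60.0):
        self.threshold_s = threshold_s     # timer threshold (assumed value)
        self.pool = defaultdict(dict)      # msg_id -> {seq_no: payload}
        self.first_seen = {}               # msg_id -> arrival time of first segment

    def receive(self, msg_id, seq_no, total, payload):
        """Store one segment; return the reassembled data once all parts have arrived."""
        self.first_seen.setdefault(msg_id, time.time())
        self.pool[msg_id][seq_no] = payload
        if len(self.pool[msg_id]) == total:
            data = b"".join(self.pool[msg_id][i] for i in range(1, total + 1))
            self._drop(msg_id)
            return data
        return None

    def expire(self):
        """Timer mechanism: drop incomplete messages older than the threshold."""
        now = time.time()
        expired = [m for m, t0 in self.first_seen.items()
                   if now - t0 > self.threshold_s]
        for m in expired:
            self._drop(m)                  # the control center could request retransmission here
        return expired

    def _drop(self, msg_id):
        self.pool.pop(msg_id, None)
        self.first_seen.pop(msg_id, None)
```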

Reliability of Digital FSO Links in Europe

The paper deals with an analysis of visibility records collected from 210 European airports to obtain a realistic estimation of the availability of Free Space Optical (FSO) data links. Commercially available optical links usually operate in the 850 nm waveband, so the influence of the atmosphere on the optical beam is similar to its influence on visible light. Long-term visibility records therefore represent an invaluable source of data for estimating the quality of service of FSO links. The model used characterizes both the statistical properties of fade depths and the statistical properties of individual fade durations. Results are presented for Italy, France, and Germany.
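
As a hedged illustration of how visibility records can be turned into an availability estimate, the sketch below uses the Kruse-type empirical relation between visibility and specific attenuation; the link length, link margin, and the sample visibility values are hypothetical and not taken from the study.

```python
import numpy as np

def specific_attenuation_db_per_km(visibility_km, wavelength_nm=850.0):
    """Kruse-type empirical estimate of atmospheric attenuation from visibility."""
    v = np.asarray(visibility_km, dtype=float)
    # size-distribution exponent q (Kruse form; other variants exist in the literature)
    q = np.where(v > 50, 1.6, np.where(v > 6, 1.3, 0.585 * np.cbrt(v)))
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)

def estimated_availability(visibility_km, link_length_km, link_margin_db):
    """Fraction of visibility records for which the total fade stays within the link margin."""
    fade_db = specific_attenuation_db_per_km(visibility_km) * link_length_km
    return float(np.mean(fade_db <= link_margin_db))

# hypothetical hourly visibility records (km) for one station
records = np.array([20.0, 12.0, 4.0, 0.8, 35.0, 2.5, 15.0, 0.2])
print(estimated_availability(records, link_length_km=1.0, link_margin_db=10.0))
```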

Optimization of Parametric Studies Using Strategies of Sampling Techniques

To improve the efficiency of parametric studies or test planning, a method is proposed that takes all input parameters into account while performing only a few simulation runs to assess the relative importance of each input parameter. For K input parameters, each with N input values, the total number of possible combinations of input values equals N^K. To limit the number of runs, only N of the possible combinations are taken into account. The Updated Latin Hypercube Sampling procedure is used to choose the optimal combinations. The Spearman rank correlation coefficient is proposed as the measure of the relative importance of each input parameter. The sensitivity and influence of all parameters are analyzed within one procedure, and the key parameters with the largest influence are immediately identified.
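
A small sketch of the sampling-plus-ranking idea follows. It uses a plain Latin Hypercube sample rather than the Updated Latin Hypercube Sampling procedure of the paper, and the four-parameter test model is purely hypothetical; it only illustrates how N runs (instead of N^K combinations) combined with the Spearman rank correlation coefficient identify the influential inputs.

```python
import numpy as np
from scipy.stats import spearmanr

def latin_hypercube(n_runs, n_params, rng=None):
    """Basic Latin Hypercube sample in [0, 1]^K: one value per stratum per parameter."""
    rng = np.random.default_rng(rng)
    u = (np.arange(n_runs)[:, None] + rng.random((n_runs, n_params))) / n_runs
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])     # decouple the strata across parameters
    return u

# hypothetical model with K = 4 inputs; only x0 and x2 really matter
def model(x):
    return 3.0 * x[:, 0] + 0.1 * x[:, 1] + 2.0 * x[:, 2] ** 2 + 0.01 * x[:, 3]

X = latin_hypercube(n_runs=20, n_params=4, rng=1)   # 20 runs instead of N**K combinations
y = model(X)

# Spearman rank correlation of each input with the output as an importance measure
importance = [abs(spearmanr(X[:, j], y).correlation) for j in range(X.shape[1])]
print(importance)
```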

Control-flow Complexity Measurement of Processes and Weyuker's Properties

Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we will focus on investigating process complexity. We define process complexity as the degree to which a business process is difficult to analyze, understand or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties. Weyuker's properties must be satisfied by any complexity measure to qualify as a good and comprehensive one.
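
For orientation, the sketch below computes a control-flow complexity value in the spirit of Cardoso's CFC measure, where XOR-splits contribute their fan-out, OR-splits contribute 2^n - 1, and AND-splits contribute 1. The process representation (a flat list of split nodes) is an assumption made for the example, not the formalism used in the paper.

```python
def control_flow_complexity(splits):
    """splits: iterable of (split_type, fan_out) pairs, split_type in {'XOR', 'OR', 'AND'}."""
    cfc = 0
    for split_type, fan_out in splits:
        if split_type == "XOR":
            cfc += fan_out                  # one mental state per outgoing branch
        elif split_type == "OR":
            cfc += 2 ** fan_out - 1         # any non-empty subset of branches may fire
        elif split_type == "AND":
            cfc += 1                        # single state: all branches taken
    return cfc

# hypothetical process: one XOR-split with 3 branches, one OR-split with 2, one AND-split
print(control_flow_complexity([("XOR", 3), ("OR", 2), ("AND", 2)]))   # 3 + 3 + 1 = 7
```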

Java Based Automatic Curriculum Generator for Children with Trisomy 21

An Early Intervention Program (EIP) is required to improve the overall development of children with Trisomy 21 (Down syndrome). In order to help trainers and parents in the implementation of the EIP, a support system has been developed. The support system is able to screen data automatically, store and analyze data, generate an individual EIP (curriculum) with an optimal training duration, and generate training automatically. The system consists of hardware and software, where the software has been implemented in Java on Linux Fedora. The software has been tested to ensure its functionality and reliability, and the prototype has also been tested in Down syndrome centers. The test results show that the system can reliably generate an individual curriculum, including the training program, to improve the motor, cognitive, and combined abilities of children with Down syndrome under 6 years of age.

Design and Control of PEM Fuel Cell Diffused Aeration System using Artificial Intelligence Techniques

Fuel cells have become one of the major areas of research in academia and industry. The goal of most fish farmers is to maximize production and profits while holding labor and management efforts to a minimum. Given the risk of fish kills, disease outbreaks, and poor water quality in most pond culture operations, aeration offers the most immediate and practical solution to the water quality problems encountered at higher stocking and feeding rates. Many aeration units are electrically powered, so a continuous, highly reliable, affordable, and environmentally friendly power source is necessary. Aerating water with PEM fuel cell power is not only a new application of renewable energy; it also provides an affordable way to promote biodiversity in stagnant ponds and lakes. This paper presents a new design and control scheme for a PEM fuel cell powered diffused-air aeration system for a shrimp farm in Mersa Matruh, Egypt. Artificial intelligence (AI) techniques are used to control the fuel cell output power by regulating the input gas flow rates. Moreover, the mathematical modeling and simulation of the PEM fuel cell are introduced. A comparative study of the performance of fuzzy logic control (FLC) and neural network control (NNC) is carried out, and the results show the effectiveness of NNC over FLC.
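
To make the control idea concrete, here is a deliberately simplified, Sugeno-style fuzzy controller that maps a power error to a fuel flow adjustment. All membership breakpoints, rule outputs, and units are hypothetical; neither the paper's FLC design nor its NNC is reproduced here.

```python
def tri(x, a, b, c):
    """Triangular membership function with vertices a <= b <= c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def flow_adjustment(power_error_w):
    """Weighted-average (Sugeno-style) defuzzification over three illustrative rules."""
    e = power_error_w
    rules = [
        (tri(e, -500.0, -250.0, 0.0), -0.05),   # error negative -> decrease flow (L/min)
        (tri(e, -250.0, 0.0, 250.0),   0.00),   # error near zero -> hold flow
        (tri(e, 0.0, 250.0, 500.0),   +0.05),   # error positive -> increase flow
    ]
    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

print(flow_adjustment(120.0))   # small positive error -> small flow increase
```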

Investigation and Calculation of Seismic Reliability of Structures

Recently, analysis and design of structures based on reliability theory have been the center of attention. The reason for this attention is the inherently random nature of structural parameters such as material specifications, external loads, and geometric dimensions. By means of reliability theory, uncertainties arising from the statistical nature of the structural parameters can be expressed in mathematical equations, and safety and operational considerations can be incorporated into the design process. According to this theory, it is possible to study the failure probability of not only a specific element but also the entire system. Therefore, after the safety of every element has been verified, their reciprocal effects on the safety of the entire system can be investigated.
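
The core computation can be illustrated with a small Monte Carlo sketch: structural capacity R and load effect S are modelled as random variables and the failure probability is estimated as P(R < S). The distributions and their parameters are hypothetical, chosen only so the analytical answer is known.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 1_000_000

R = rng.normal(loc=300.0, scale=30.0, size=n)   # resistance (capacity), hypothetical units
S = rng.normal(loc=200.0, scale=40.0, size=n)   # load effect (e.g. seismic demand)

pf = np.mean(R < S)                             # estimated failure probability P(R < S)
beta = -norm.ppf(pf)                            # corresponding reliability index
print(f"Pf ~ {pf:.4f}, beta ~ {beta:.2f}")      # analytically Pf = Phi(-2), about 0.0228
```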

Evaluating the Effectiveness of Memory Overcommit Techniques on KVM-based Hosting Platform

Determining how many virtual machines a Linux host can run is a challenge; one of the hardest tasks is to find the balance among performance, density, and usability. The KVM hypervisor has become the most popular open source full virtualization solution, and it supports several ways of running guests with more memory than the host actually has. Motivated by the large differences between minimum and maximum guest memory requirements, this paper presents initial results on same-page merging, ballooning, and live migration techniques aimed at optimum memory usage on a KVM-based cloud platform. Given the design of the initial experiments, the resulting data is a useful reference for system administrators. The results from these experiments show that each method offers a different reliability tradeoff.
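
As one concrete observation point for same-page merging, the sketch below reads the kernel's KSM counters from sysfs and derives a rough memory-saving figure. The sysfs paths are the usual Linux locations, but their availability depends on the kernel configuration, and the saving estimate is only approximate.

```python
from pathlib import Path

KSM_DIR = Path("/sys/kernel/mm/ksm")   # standard location when KSM is compiled in

def ksm_stats():
    stats = {}
    for name in ("pages_shared", "pages_sharing", "pages_unshared", "full_scans"):
        f = KSM_DIR / name
        if f.exists():
            stats[name] = int(f.read_text())
    return stats

def estimated_saving_mib(stats, page_size=4096):
    """'pages_sharing' counts the extra page references merged into shared pages,
    i.e. roughly how many pages were saved."""
    return stats.get("pages_sharing", 0) * page_size / (1024 * 1024)

s = ksm_stats()
print(s, f"~{estimated_saving_mib(s):.1f} MiB saved")
```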

Using the Rasch Model in Validating the Arabic Version of the Multiple Intelligence Development Assessment Scale (MIDAS)

This article addresses the procedures used to validate the Arabic version of the Multiple Intelligence Development Assessment Scale (MIDAS). The content validity was examined based on the experts' judgments of the MIDAS items in the Arabic version, and the content of eleven items in the Arabic version was modified to match the Arabic context. A translation of the original English version of MIDAS into Arabic was then performed. The reliability of the Arabic MIDAS was calculated using the test-retest method and found to be 0.85 for the overall MIDAS, with values for the different subscales ranging between 0.78 and 0.87. Construct validity for the overall Arabic MIDAS and its subscales was established using the Winsteps program, version 6, based on the Rasch model, in order to fit the items to the Arabic context. The findings indicated that the eight subscales of the Arabic version of MIDAS are unidimensional and that the total number of retained items in the overall scale is 108.

Trust Based Energy Aware Reliable Reactive Protocol in Mobile Ad Hoc Networks

Trust and energy consumption are the most challenging issues in routing protocol design for mobile ad hoc networks (MANETs), since mobile nodes are battery powered and node behaviour is unpredictable. Furthermore, replacing and recharging batteries and making nodes cooperative are often impossible in critical environments such as military applications. In this paper, we propose a trust-based, energy-aware routing model for MANETs. During route discovery, the node with higher trust and maximum energy capacity is selected as a router based on a parameter called 'Reliability'. A route request from the source is accepted by a node only if its reliability is high; otherwise, the route request is discarded. This approach forms a reliable route from source to destination, thus increasing network lifetime, improving energy utilization, and decreasing the number of packets lost during transmission.
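
A minimal sketch of the selection rule described above follows, assuming a 'Reliability' score that is a weighted combination of a node's trust level and its normalized residual energy. The weights, scales, and acceptance threshold are illustrative assumptions, not the paper's definitions.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    trust: float             # 0.0 .. 1.0, e.g. from past forwarding behaviour
    residual_energy: float    # joules remaining
    max_energy: float         # battery capacity in joules

def reliability(node, w_trust=0.5, w_energy=0.5):
    return w_trust * node.trust + w_energy * (node.residual_energy / node.max_energy)

def accept_route_request(node, threshold=0.6):
    """Node forwards (accepts) the RREQ only if its reliability is high enough."""
    return reliability(node) >= threshold

candidates = [Node("A", 0.9, 80.0, 100.0), Node("B", 0.2, 95.0, 100.0)]
forwarders = [n.node_id for n in candidates if accept_route_request(n)]
print(forwarders)   # only the trusted, well-charged node forwards the request
```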

Prediction of the Characteristics of Transformer Oil under Different Operation Conditions

Transformers are critical apparatus in power systems; their reliability and safe operation make it important to determine their operating conditions, and the industry uses quality control tests in the insulation design of oil-filled transformers. Hence, the effect of the service period on AC dielectric strength is significant. The effect of aging on the physical, chemical, and electrical properties of transformer oil was studied using international testing methods for the evaluation of transformer oil quality. The study was carried out on six transformers operating in the field, with monitoring periods of over twenty years. The properties that are strongly time dependent were specified, and those that have a great impact on transformer oil acidity, breakdown voltage, and dissolved gas analysis were defined. Several tests on the transformer oil were studied to determine when to purify or replace it and to predict its characteristics under different operating conditions.

ISCS (Information Security Check Service) for the Safety and Reliability of Communications

The recent widespread use of information and communication technology has greatly changed the information security risks that businesses and institutions encounter. In this situation, in order to ensure security and build confidence in electronic trading, it has become important for organizations to take competent information security measures that provide international confidence that sensitive information is secure. Against this backdrop, the approach to information security checking has become an important issue, one believed to be common to all countries. The purpose of this paper is to introduce the new information security checking program in Korea and to propose comprehensive information security countermeasures suited to domestic circumstances, covering physical equipment, security management and technology, and the operation of security checks for securing services on ISPs (Internet Service Providers), IDCs (Internet Data Centers), and e-commerce (shopping malls, etc.).

Memory Leak Detection in Distributed System

Due to memory leaks, valuable system memory is often wasted and denied to other processes, thereby affecting computational performance. If an application's memory usage exceeds the virtual memory size, it can lead to a system crash. Current memory leak detection techniques for clusters are reactive and display memory leak information only after the process has finished executing (they detect a memory leak only after it occurs). This paper presents a Dynamic Memory Monitoring Agent (DMMA) technique. The DMMA framework performs dynamic memory leak detection, detecting leaks while an application is still executing. When a memory leak in any process in the cluster is identified, DMMA informs the end users so that they can take corrective action, and it also submits the affected process to a healthy node in the system, thus providing reliable service to the user. DMMA maintains information about the memory consumption of executing processes, and based on this information and critical states, it can improve the reliability and effectiveness of cluster computing.
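
The proactive flavour of this monitoring can be illustrated with a short sketch that samples a process's resident memory and flags sustained growth as a suspected leak. The sampling interval, window length, and growth criterion are illustrative choices rather than the DMMA algorithm itself, and the migration step is only indicated by a callback.

```python
import time
import psutil

def watch_process(pid, samples=10, interval_s=5.0, on_leak=lambda p: None):
    """Sample a process's resident set size and report suspected leaks."""
    proc = psutil.Process(pid)
    history = []
    for _ in range(samples):
        history.append(proc.memory_info().rss)        # resident set size in bytes
        time.sleep(interval_s)
    # crude criterion: memory grew in (almost) every interval of the window
    growth_steps = sum(1 for a, b in zip(history, history[1:]) if b > a)
    if growth_steps >= len(history) - 2:
        on_leak(proc)                                  # e.g. notify the user / migrate the process
        return True
    return False
```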

A Short Form of the Taiwan Health Literacy Scale (THLS) for Chinese-Speaking Adults

The Taiwan Health Literacy Scale (THLS) was developed to meet the need for measuring the health literacy of Chinese-speaking adults in Taiwan. Although the scale has been shown to have good reliability and validity, it was not widely adopted by practitioners because of its length and the time required to complete it. Building on the THLS, this research invited healthcare professionals to review the original scale with a view to shortening it. Following the logic of the THLS, the research adopted an analytic hierarchy process (AHP) technique to consolidate the healthcare experts' assessments and shorten the original scale. Fifteen items out of the original 66 were identified as having higher loadings. After confirmation by the experts and a pilot test with 40 undergraduate students, a short form of the THLS is introduced. This research then used 839 samples from the major cities of Hualien County in eastern Taiwan to test the reliability and validity of the new scale. The reliability of the scale is high and acceptable, and the scale is also highly correlated with the original, which provides evidence for its validity.
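
For readers unfamiliar with the analytic hierarchy process step, the sketch below derives priority weights from a reciprocal pairwise-comparison matrix via its principal eigenvector and reports a consistency index. The 3x3 matrix is hypothetical; the study's actual comparison structure over the 66 THLS items is not reproduced.

```python
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],     # item 1 compared with items 1..3 on Saaty's 1-9 scale
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                       # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                      # normalized priority weights

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)    # consistency index
print(np.round(w, 3), round(ci, 3))
```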

Development of a Microsensor to Minimize Post Cataract Surgery Complications

This paper presents the design and characterization of a microaccelerometer intended for integration into a cataract surgical probe to detect the hardness of different eye tissues during cataract surgery. The soft posterior lens capsule of the eye is easily damaged, in comparison with the hard opaque lens, since the surgeon cannot see directly behind the cutting needle during surgery. The presence of the microsensor helps the surgeon avoid rupturing the posterior lens capsule, which, if it occurs, leads to severe complications such as glaucoma, infection, or even blindness. The microsensor, with overall dimensions of 480 μm x 395 μm, is able to deliver significant capacitance variations under the vibration conditions encountered, which makes it capable of distinguishing between different types of tissue. Integration of the electronic components on chip ensures a high level of reliability and noise immunity while minimizing space and power requirements. The physical characteristics and performance test results show that the integrated microsensor is an effective tool to aid the surgeon during this procedure.

Mechanical and Chemical Reliability Assessment of Silica Optical Fibres

The current study investigated the ageing phenomena of silica optical fibres in relation to water activity, which might be accelerated by exposure to supplementary energy such as microwaves. A controlled stress was applied by winding the fibres onto mandrels of accurately known diameter. Although a decrease in fibre strength is normally induced over time by the chemical action of water, the combined effects of agents such as water, applied stress, and supplementary energy (microwaves) in some cases acted in the opposite manner. The effect of microwaves as a structural relaxation catalyst appears unexpected; even though the overall gain in fibre strength is not high, the stress corrosion factor showed a significant increase under certain simulation conditions.

Design of FIR Filter for Water Level Detection

This paper proposes a new spatial FIR filter design to automatically detect the water level from a video signal in various river surroundings. The new approach applies "addition" of frames and a "horizontal" edge detector to distinguish the water region from the land region. The variance of each line of a filtered video frame is used as a feature value, and the water level is recognized as the boundary line between the land region and the water region. An edge detection filter essentially demarcates two distinctly different regions; however, conventional filters do not adapt automatically to detect the water level under the various lighting conditions of river scenery. An optimized filter is proposed so that the system becomes robust to changes in lighting conditions. The improved reliability of the proposed system with the optimized filter is confirmed by the accuracy of water level detection.
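
A compact sketch of the processing chain described above: several frames are averaged ("added") to suppress surface ripple, a horizontal-edge FIR kernel is applied, the variance of each image line is taken as the feature, and the row where that feature changes most sharply is taken as the water line. The kernel, the synthetic frames, and the decision rule are illustrative assumptions, not the optimized filter of the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def detect_water_line(frames):
    """frames: array of shape (n_frames, height, width), grayscale."""
    avg = frames.mean(axis=0)                        # frame 'addition' stabilizes water texture
    kernel = np.array([[-1.0], [0.0], [1.0]])        # horizontal-edge FIR filter (vertical gradient)
    edges = convolve(avg, kernel, mode="nearest")
    row_var = edges.var(axis=1)                      # one feature value per image line
    # water level ~ row with the largest jump in the row-variance profile
    return int(np.argmax(np.abs(np.diff(row_var)))) + 1

# hypothetical frame stack: textured 'land' in the top half, flatter 'water' below
rng = np.random.default_rng(0)
frames = rng.normal(0, 1, (8, 120, 160))
frames[:, :60, :] += 20 * rng.random((8, 60, 160))   # rough upper region
print(detect_water_line(frames))                     # expected near row 60
```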

Optimization of Quantization in Higher Order Modulations for LDPC-Coded Systems

In this paper, we evaluate the choice of suitable quantization characteristics for both the decoder messages and the received samples in Low-Density Parity-Check (LDPC) coded systems using M-QAM (Quadrature Amplitude Modulation) schemes. The analysis involves the demapper block, which provides the initial likelihood values for the decoder, relating its quantization strategy to that of the decoder. A mapping strategy refers to the grouping of bits within a codeword, where each m-bit group is used to select a 2^m-ary signal in accordance with the signal labels. We further evaluate the system with mapping strategies such as Consecutive-Bit (CB) and Bit-Reliability (BR). A new demapper version, based on approximate expressions, is also presented to yield a low-complexity hardware implementation.
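
As a simple reference point for the kind of design choice evaluated in the paper, the sketch below applies a uniform quantizer to demapper log-likelihood ratios (LLRs) before they are passed to the LDPC decoder. The bit width and clipping range are illustrative; the optimized characteristics and the CB/BR mapping strategies are not shown.

```python
import numpy as np

def quantize_llr(llr, n_bits=4, clip=8.0):
    """Uniform quantizer: clip LLRs to [-clip, clip] and map them to 2**n_bits levels."""
    levels = 2 ** n_bits
    step = 2 * clip / levels
    idx = np.clip(np.floor((llr + clip) / step), 0, levels - 1)
    return (idx + 0.5) * step - clip            # reconstructed value fed to the LDPC decoder

llrs = np.array([-12.3, -0.7, 0.05, 3.9, 9.1])
print(quantize_llr(llrs, n_bits=4, clip=8.0))
```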

A Software Tool Design for Cerebral Infarction of MR Images

A brain MR imaging-based clinical research and analysis system was built, targeting the development of a large-scale data set, and generally available clinical data were used to build it. A registration step for the selection of the lesion ROI and a region growing algorithm were used, and the Mesh-warp algorithm was implemented for matching; the matching errors were corrected individually for accuracy. In addition, large ROI research data can be accumulated by means of our compression method. In this way, correct decision criteria for the research results were suggested. The experimental groupings were age, sex, MR type, patient ID, and smoking, which can easily be queried. The result data were visualized as overlapped images using a color table and were analyzed with a statistical package. The evaluation of the system's utility for chronic ischemic damage was performed on patients with acute cerebral infarction, where the location associated with the neurologic disability index lies in the central portion facing the lateral ventricle, in which the corona radiata is found. Finally, the system reliability was measured by both inter-user and intra-user registration correlation.
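
The region-growing step mentioned above can be illustrated with a minimal 2D sketch: starting from a seed pixel, neighbours are added while their intensity stays within a tolerance of the seed value. The tolerance and 4-connectivity are illustrative choices and not taken from the system described in the abstract.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10.0):
    """img: 2D array; seed: (row, col); returns a boolean mask of the grown region."""
    mask = np.zeros(img.shape, dtype=bool)
    seed_val = float(img[seed])
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):    # 4-connected neighbours
            rr, cc = r + dr, c + dc
            if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                    and not mask[rr, cc] and abs(float(img[rr, cc]) - seed_val) <= tol):
                mask[rr, cc] = True
                queue.append((rr, cc))
    return mask
```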

An Approach in the Improvement of the Reliability of Impedance Relay

Distance protection, mainly the impedance relay, which is considered the main protection for transmission lines, can be subject to impedance measurement errors due mainly to the fault resistance and to power fluctuations. Thus, the impedance relay may not operate for a short circuit at the far end of the protected line (the underreach case) or may operate for a fault beyond its protected zone (the overreach case). In this paper, an approach to fault detection by distance protection that distinguishes between faulty conditions and the effect of the overload operating mode has been developed. This approach is based on symmetrical components, mainly the negative sequence, and it takes into account both the effect of fault resistance and the overload situation, both of which affect the reliability of the protection, in terms of dependability for the former and security for the latter.
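
A short worked sketch of the symmetrical-component computation on which the approach relies is given below: the negative-sequence current is extracted from the three phase currents, and a significant negative-sequence magnitude distinguishes an unbalanced fault from a balanced overload. The phasor values used here are illustrative.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)          # 120-degree rotation operator

def sequence_components(Ia, Ib, Ic):
    I0 = (Ia + Ib + Ic) / 3                      # zero sequence
    I1 = (Ia + a * Ib + a**2 * Ic) / 3           # positive sequence
    I2 = (Ia + a**2 * Ib + a * Ic) / 3           # negative sequence
    return I0, I1, I2

# balanced overload: large but symmetrical currents -> negligible negative sequence
Ia, Ib, Ic = 900 + 0j, 900 * a**2, 900 * a
print(abs(sequence_components(Ia, Ib, Ic)[2]))   # ~0

# phase-to-phase fault between B and C -> significant negative-sequence current
Ia, Ib, Ic = 100 + 0j, 1500 * np.exp(-1j * np.pi / 2), -1500 * np.exp(-1j * np.pi / 2)
print(abs(sequence_components(Ia, Ib, Ic)[2]))   # clearly non-zero
```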