A Pattern Language for Software Debugging

In spite of all advancements in software testing, debugging remains a labor-intensive, manual, time-consuming, and error-prone process. A candidate solution for enhancing the debugging process is to fuse it with the testing process. To achieve this integration, one possible approach is to categorize common software tests and errors and then work toward fixing the errors through general solutions for each test/error pair. Our approach to this issue is based on Christopher Alexander's concepts of pattern and pattern language. The patterns in this language are grouped into three major sections and connect the three concepts of test, error, and debugging. These patterns and their hierarchical relationships shape a pattern language that introduces a solution for resolving software errors in a known testing context. Finally, we introduce our framework ADE as a sample implementation supporting one pattern of the proposed language; it aims to automate the whole process of evolving software design via evolutionary methods.

Simulation of an Auto-Tuning Bicycle Suspension Fork with Quick Releasing Valves

Although a bicycle is not as large as a motorcycle or an automobile, it nevertheless constitutes a complicated dynamic system. Riders' demands for comfort, controllability, and safety grow as research and development technologies improve. The shock absorber strongly affects suspension performance: it takes up vibration energy and releases it at a suitable time, keeping the wheel in proper contact with the road surface and maintaining chassis stability. Suspension design for mountain bicycles is more difficult than for city bikes, since it must cope with dynamic variations in road and loading conditions. Riders need a stiff damper when they stand on the pedals to climb, but a soft damper when they descend. Various switchable shock absorbers are on the market; however, riders have to switch them manually among soft, hard, and locked positions. This study proposes a novel bicycle shock absorber that provides automatic, smooth tuning of the damping coefficient from a predetermined lower bound to a theoretically unlimited value. An automatic quick-releasing valve is included in the design so that it can release the peak pressure when the suspension fork runs into a square-wave obstacle, preventing damage to the chassis and injury to the rider. The design achieves automatic tuning through an innovative plunger valve and fluidic passage arrangement, without any electronic devices. Theoretical models of the damper and spring are established, design parameters of the valves and fluidic passages are determined, and the relations between design parameters and shock absorber performance are discussed. The analytical results give directions for manufacturing the shock absorber.
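
As a rough illustration of the damping behaviour described above, the following Python sketch integrates a single-degree-of-freedom spring-damper model of the fork with an asymmetric damping coefficient and a force-capping relief valve; all parameter values (mass, stiffness, damping bounds, relief level) are illustrative assumptions, not the values identified in the paper.

    # Minimal sketch: single-degree-of-freedom spring-damper model of a
    # suspension fork hitting a bump. Parameter values are illustrative
    # assumptions, not the paper's identified values.
    import numpy as np

    m = 80.0      # sprung mass [kg] (assumed)
    k = 8000.0    # spring stiffness [N/m] (assumed)
    c_soft, c_hard = 600.0, 2400.0   # damping bounds [N*s/m] (assumed)
    f_relief = 1500.0                # damper force at which the
                                     # quick-releasing valve opens [N] (assumed)

    def damping(v):
        """Asymmetric damping (stiff vs. soft stroke direction) with the
        relief valve capping the peak damper force."""
        c = c_hard if v > 0 else c_soft
        return float(np.clip(c * v, -f_relief, f_relief))

    # Explicit Euler integration of m*x'' + f_damp(x') + k*x = 0
    dt, t_end = 1e-4, 2.0
    x, v = 0.05, 0.0          # initial 5 cm compression (square-wave bump)
    for _ in range(int(t_end / dt)):
        a = -(damping(v) + k * x) / m
        v += a * dt
        x += v * dt
    print(f"residual displacement after {t_end} s: {x * 1000:.2f} mm")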

Low Cost Real-Time Communication Braille Hand-Glove for Visually Impaired Using Slot Sensors and Vibration Motors

Visually impaired people find it extremely difficult to acquire basic and vital information necessary for their living, and are therefore at a very high risk of social exclusion as a result of poor access to information. In recent years, several attempts have been made to improve communication methods for visually impaired people using tactile sensation, such as finger Braille, manual alphabets, the print-on-palm method, and several other electronic devices. However, such methods suffer from problems such as a lack of privacy and a lack of compatibility with computer environments. This paper describes a low-cost Braille hand glove for blind people using slot sensors and vibration motors, with the help of which they can read and write emails and text messages and read e-books. The glove allows the person to type characters based on different Braille combinations using six slot sensors, while vibration at six different positions of the glove matching the Braille code allows them to read characters.
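
To make the six-slot encoding concrete, here is a minimal Python sketch of the mapping between slot-sensor combinations and characters; the table shows only a few letters of standard six-dot Braille, and the slot numbering is assumed to follow the usual dot numbering.

    # Minimal sketch of the six-sensor Braille encoding/decoding idea.
    # Dot numbering follows standard Braille (1-3 left column, 4-6 right).
    # Only a few letters are shown; the table extends to the full alphabet.
    BRAILLE = {
        frozenset({1}): 'a',
        frozenset({1, 2}): 'b',
        frozenset({1, 4}): 'c',
        frozenset({1, 4, 5}): 'd',
        frozenset({1, 5}): 'e',
    }

    def decode(active_slots):
        """Map the set of triggered slot sensors to a character (writing)."""
        return BRAILLE.get(frozenset(active_slots), '?')

    def encode(ch):
        """Map a character to the vibration motors to fire (reading)."""
        for dots, c in BRAILLE.items():
            if c == ch:
                return sorted(dots)
        return []

    print(decode({1, 4, 5}))   # -> 'd'
    print(encode('b'))         # -> [1, 2] : vibrate motors 1 and 2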

Framework for Delivery Reliability in European Machinery and Equipment Industry

Today's manufacturing companies are facing multiple and dynamic customer-supplier relationships embedded in non-hierarchical production networks. This complex environment leads to problems with delivery reliability and wasteful turbulence throughout the entire network. This paper describes an operational model, based on a theoretical framework, which improves the delivery reliability of each individual customer-supplier relationship within non-hierarchical production networks of the European machinery and equipment industry. By developing a decentralized coordination mechanism, based on determining the value of delivery reliability and deriving an incentive system for suppliers, the number of on-time deliveries can be increased and the turbulence in the production network thereby smoothed. Comparable to an electronic stock exchange, the coordination mechanism transforms the manual and non-transparent process of determining penalties for delivery delays into an automated and transparent market mechanism that creates delivery reliability.

XML Schema Automatic Matching Solution

Schema matching plays a key role in many different applications, such as schema integration, data integration, data warehousing, data transformation, e-commerce, peer-to-peer data management, ontology matching and integration, the semantic web, and semantic query processing. Manual matching is expensive and error-prone, so it is important to develop techniques to automate the schema matching process. In this paper, we present a solution to the automated XML schema matching problem which produces semantic mappings between corresponding elements of given source and target schemas, contributing a more comprehensive and efficient treatment of the problem. Our solution is based on combining the linguistic similarity, data type compatibility, and structural similarity of XML schema elements. After describing the solution, we present experimental results that demonstrate the effectiveness of this approach.
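
The combination of similarities could look roughly like the following Python sketch, in which an element's overall similarity is a weighted sum of linguistic, data-type, and structural scores; the weights, the type-compatibility table, and the helper functions are illustrative assumptions rather than the paper's exact formulas.

    # Illustrative sketch of a combined element-similarity score: a weighted
    # sum of linguistic, data-type, and structural similarity. All weights
    # and helper functions are assumptions, not the paper's exact method.
    from difflib import SequenceMatcher

    TYPE_COMPAT = {            # assumed compatibility scores between XSD types
        ('xs:string', 'xs:string'): 1.0,
        ('xs:int', 'xs:decimal'): 0.8,
        ('xs:int', 'xs:string'): 0.2,
    }

    def linguistic_sim(name1, name2):
        return SequenceMatcher(None, name1.lower(), name2.lower()).ratio()

    def type_sim(t1, t2):
        return TYPE_COMPAT.get((t1, t2), TYPE_COMPAT.get((t2, t1), 0.0))

    def structural_sim(children1, children2):
        # Fraction of child element names shared between the two elements.
        if not children1 and not children2:
            return 1.0
        common = len(set(children1) & set(children2))
        return common / max(len(children1), len(children2))

    def element_sim(e1, e2, w=(0.5, 0.2, 0.3)):
        return (w[0] * linguistic_sim(e1['name'], e2['name'])
                + w[1] * type_sim(e1['type'], e2['type'])
                + w[2] * structural_sim(e1['children'], e2['children']))

    src = {'name': 'CustomerName', 'type': 'xs:string', 'children': []}
    tgt = {'name': 'custName', 'type': 'xs:string', 'children': []}
    print(f"similarity: {element_sim(src, tgt):.2f}")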

Automatic Segmentation of Dermoscopy Images Using Histogram Thresholding on Optimal Color Channels

Automatic segmentation of skin lesions is the first step towards the development of a computer-aided diagnosis of melanoma. Although numerous segmentation methods have been developed, few studies have focused on determining the most discriminative and effective color space for the melanoma application. This paper proposes a novel automatic segmentation algorithm using color space analysis and clustering-based histogram thresholding, which is able to determine the optimal color channel for the segmentation of skin lesions. To demonstrate the validity of the algorithm, it is tested on a set of 30 high-resolution dermoscopy images, and a comprehensive evaluation of the results is provided in which borders manually drawn by four dermatologists are compared to the automated borders detected by the proposed algorithm. The evaluation is carried out by applying three previously used metrics of accuracy, sensitivity, and specificity, and a new metric of similarity. Through ROC analysis and ranking of the metrics, it is shown that the best results are obtained with the X and XoYoR color channels, which yield an accuracy of approximately 97%. The proposed method is also compared with two state-of-the-art skin lesion segmentation methods, demonstrating the effectiveness and superiority of the proposed segmentation method.
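
As a sketch of histogram thresholding on a single color channel, the following Python example substitutes Otsu's method, a common clustering-style threshold, for the paper's own clustering-based procedure; the synthetic image and channel choice are illustrative only.

    # Sketch of histogram thresholding on one color channel. The paper's
    # clustering-based thresholding is replaced here by Otsu's method; the
    # synthetic "lesion on skin" channel is an illustrative assumption.
    import numpy as np

    def otsu_threshold(channel):
        """Return the threshold maximizing between-class variance."""
        hist, _ = np.histogram(channel, bins=256, range=(0, 256))
        p = hist.astype(float) / hist.sum()
        omega = np.cumsum(p)                 # class-0 probability
        mu = np.cumsum(p * np.arange(256))   # cumulative mean
        mu_t = mu[-1]
        with np.errstate(divide='ignore', invalid='ignore'):
            sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
        return int(np.nanargmax(sigma_b))

    rng = np.random.default_rng(0)
    # Synthetic channel: dark lesion (~60) on brighter skin (~180).
    img = np.where(rng.random((64, 64)) < 0.3,
                   rng.normal(60, 10, (64, 64)),
                   rng.normal(180, 10, (64, 64))).clip(0, 255)
    t = otsu_threshold(img)
    mask = img < t                           # lesion = darker side
    print(f"threshold={t}, lesion pixels={mask.sum()}")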

Named Entity Recognition using Support Vector Machine: A Language Independent Approach

Named Entity Recognition (NER) aims to classify each word of a document into predefined target named entity classes and is nowadays considered fundamental for many Natural Language Processing (NLP) tasks such as information retrieval, machine translation, information extraction, and question answering systems. This paper reports on the development of a NER system for Bengali and Hindi using a Support Vector Machine (SVM). Though this state-of-the-art machine learning technique has been widely applied to NER in several well-studied languages, its use for Indian languages (ILs) is very new. The system makes use of the different contextual information of the words along with a variety of features that are helpful in predicting four different named entity (NE) classes: Person name, Location name, Organization name, and Miscellaneous name. We have used annotated corpora of 122,467 tokens of Bengali and 502,974 tokens of Hindi tagged with the twelve different NE classes defined as part of the IJCNLP-08 NER Shared Task for South and South East Asian Languages (SSEAL). In addition, we have manually annotated 150K wordforms of the Bengali news corpus, developed from the web archive of a leading Bengali newspaper. We have also developed an unsupervised algorithm to generate lexical context patterns from a part of the unlabeled Bengali news corpus; these lexical patterns have been used as features of the SVM in order to improve the system performance. The NER system has been tested with the gold-standard test sets of 35K and 60K tokens for Bengali and Hindi, respectively. Evaluation results have demonstrated recall, precision, and f-score values of 88.61%, 80.12%, and 84.15%, respectively, for Bengali and 80.23%, 74.34%, and 77.17%, respectively, for Hindi. Results show an improvement in the f-score of 5.13% with the use of context patterns. A statistical analysis (ANOVA) is also performed to compare the performance of the proposed NER system with that of the existing HMM-based system for both languages.
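
A minimal sketch of word-level NER as multi-class SVM classification with contextual features might look as follows in Python (using scikit-learn); the toy feature set, tags, and training sentences are simplified stand-ins for the much richer features and corpora used in the paper.

    # Minimal sketch of word-level NER as multi-class SVM classification
    # with contextual features. Features, tags, and data are toy stand-ins.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.svm import LinearSVC

    def features(sent, i):
        w = sent[i]
        return {
            'word': w.lower(),
            'is_title': w.istitle(),
            'suffix3': w[-3:],
            'prev': sent[i - 1].lower() if i > 0 else '<BOS>',
            'next': sent[i + 1].lower() if i < len(sent) - 1 else '<EOS>',
        }

    train = [(['John', 'lives', 'in', 'Kolkata'], ['PER', 'O', 'O', 'LOC']),
             (['Mary', 'visited', 'Delhi'], ['PER', 'O', 'LOC'])]
    X = [features(s, i) for s, _ in train for i in range(len(s))]
    y = [t for _, tags in train for t in tags]

    vec = DictVectorizer()
    clf = LinearSVC().fit(vec.fit_transform(X), y)

    test = ['Rahim', 'lives', 'in', 'Mumbai']
    print(clf.predict(vec.transform(
        [features(test, i) for i in range(len(test))])))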

En-Face Optical Coherence Tomography and Fluorescence in Evaluation of Orthodontic Interfaces

Bonding has become a routine procedure in several dental specialties, from prosthodontics to conservative dentistry and even orthodontics. In many of these fields it is important to be able to investigate the bonded interfaces to assess their quality. All currently employed investigative methods are invasive, meaning that samples are destroyed in the testing procedure and cannot be used again. We have investigated the interface between human enamel and bonded ceramic brackets non-invasively, introducing a combination of new investigative methods: optical coherence tomography (OCT), fluorescence OCT, and confocal microscopy (CM). Brackets were conventionally bonded on conditioned buccal surfaces of teeth, and the bonding was assessed using these methods. Three-dimensional reconstructions of the detected material defects were developed using manual and semi-automatic segmentation. The results clearly prove that OCT, fluorescence OCT, and CM are useful in orthodontic bonding investigations.

Array Data Transformation for Source Code Obfuscation

Obfuscation is a low-cost software protection methodology for preventing reverse engineering and re-engineering of applications. Source code obfuscation aims at obscuring the source code to hide its functionality. This paper proposes an array data transformation to obfuscate source code that uses arrays. The transformation obscures the array logic that a programmer would otherwise have to obscure manually, making the resulting code hard to reverse engineer while protecting its functionality.
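
One classic instance of such a transformation, shown below as a Python sketch, permutes array indices with a multiplier coprime to the array length and XOR-encodes the stored values; the parameters are arbitrary assumptions, and a production obfuscator would apply this rewriting at the source level rather than at run time.

    # Illustrative sketch of one classic array data transformation: index
    # permutation with a multiplier coprime to the array length, plus a
    # simple value encoding. K and MASK are arbitrary assumptions.
    N, K, MASK = 16, 5, 0xA7        # gcd(K, N) must be 1 for a bijection

    def obf_index(i):
        return (i * K) % N          # permutes the index range 0..N-1

    def obf_store(arr, i, value):
        arr[obf_index(i)] = value ^ MASK

    def obf_load(arr, i):
        return arr[obf_index(i)] ^ MASK

    data = [0] * N
    for i in range(N):
        obf_store(data, i, i * i)   # logical write of i*i to slot i

    print(data[:6])                 # physical layout reveals neither order...
    print([obf_load(data, i) for i in range(6)])  # ...nor values: 0,1,4,9,16,25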

Detection of Action Potentials in the Presence of Noise Using Phase-Space Techniques

Emerging bio-engineering fields such as brain-computer interfaces, neuroprosthesis devices, and the modeling and simulation of neural networks have led to increased research activity in algorithms for the detection, isolation, and classification of action potentials (AP) from noisy data trains. Current techniques in the field of 'unsupervised, no-prior-knowledge' biosignal processing include energy operators, wavelet detection, and adaptive thresholding. These tend to be biased towards larger AP waveforms; APs may be missed due to deviations in spike shape and frequency, and correlated noise spectra can cause false detections. Such algorithms also tend to incur large computational expense. A new signal detection technique based upon the ideas of phase-space diagrams and trajectories is proposed, which uses a delayed copy of the signal to highlight discontinuities relative to background noise. This idea has been used to create algorithms that are computationally inexpensive and address the above problems. Distinct APs have been picked out and manually classified from real physiological data recorded from a cockroach. To facilitate testing of the new technique, an autoregressive moving average (ARMA) noise model has been constructed based upon the background noise of the recordings. Together with the classified APs, this model enables the generation of realistic neuronal data sets at arbitrary signal-to-noise ratios (SNR).
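
The delayed-copy idea can be sketched as follows: embed the signal as points (x[t], x[t - tau]), where uncorrelated background noise stays near the phase-space diagonal while an AP's fast transition swings far from it. The delay, threshold multiplier, and synthetic spike shape below are illustrative assumptions.

    # Sketch of delayed-copy phase-space detection: distance of each
    # embedded point (x[t], x[t - tau]) from the diagonal flags fast
    # transitions. Delay, threshold, and spike shape are assumed values.
    import numpy as np

    rng = np.random.default_rng(1)
    tau = 5                                   # embedding delay [samples]
    sig = rng.normal(0, 0.05, 2000)           # background noise
    ap = 2.0 * np.exp(-0.5 * ((np.arange(40) - 10) / 4.0) ** 2)  # spike shape
    for start in (400, 1200):                 # inject two synthetic APs
        sig[start:start + 40] += ap

    # Distance of each embedded point from the phase-space diagonal y = x
    dist = np.abs(sig[tau:] - sig[:-tau]) / np.sqrt(2)
    thresh = 5 * dist.std()
    hits = np.flatnonzero(dist > thresh)
    print("samples flagged:", hits)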

Skin Detection Using a Histogram Based on the Mean Shift Algorithm

In this paper, we introduce a skin detection method that uses a histogram approximation based on the mean shift algorithm. The proposed method applies the mean shift procedure to a histogram of a skin map of the input image, generated by comparison with standard skin colors in the CbCr color space, and separates the background from the skin region by selecting the maximum value according to brightness level. The method detects the skin region by using the mean shift procedure to determine the maximum value that becomes the dividing point, rather than using a manually selected threshold value as in existing techniques. Even when the skin color is contaminated by illumination, the procedure can accurately segment the skin region from the background. The proposed method may be useful for detecting facial regions as a preprocessing step for face recognition under various types of illumination.
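
The two steps might be sketched as follows: build a skin map from the CbCr distance to a reference skin color, then run a one-dimensional mean shift over the skin-map values to locate the mode used as the dividing value; the reference color, kernel, and bandwidth are illustrative assumptions rather than the paper's settings.

    # Sketch: (1) skin map from CbCr distance to a reference skin color,
    # (2) 1-D mean shift over the skin-map values to find the dividing
    # mode. Reference color, scale, and bandwidth are assumed values.
    import numpy as np

    def skin_map(cb, cr, ref=(110.0, 150.0), scale=25.0):
        d2 = (cb - ref[0]) ** 2 + (cr - ref[1]) ** 2
        return np.exp(-d2 / (2 * scale ** 2))    # 1 = skin-like, 0 = not

    def mean_shift_mode(values, bandwidth=0.1, x=0.5, iters=50):
        """Flat-kernel mean shift on 1-D samples, starting from x."""
        for _ in range(iters):
            window = values[np.abs(values - x) < bandwidth]
            if window.size == 0:
                break
            x_new = window.mean()
            if abs(x_new - x) < 1e-6:
                break
            x = x_new
        return x

    rng = np.random.default_rng(2)
    cb = rng.uniform(80, 140, (64, 64))
    cr = rng.uniform(120, 180, (64, 64))
    smap = skin_map(cb, cr)
    mode = mean_shift_mode(smap.ravel())
    mask = smap > mode                            # skin vs. background
    print(f"mode={mode:.3f}, skin pixels={mask.sum()}")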

Internal Accounting Controls

Internal accounting controls are an essential business function for a growth-oriented organization, and include the elements of risk assessment, information communication, and even employees' roles and responsibilities. Internal controls of accounting systems are designed to protect a company from fraud, abuse, and inaccurate data recording, and help organizations keep track of essential financial activities. They provide a streamlined solution for organizing all accounting procedures and ensuring that the accounting cycle is completed consistently and successfully. Implementing a formal Accounting Procedures Manual allows the financial department to facilitate several processes and maintain rigorous standards. Internal controls also allow organizations to keep detailed records, manage and organize important financial transactions, and set a high standard for the organization's financial management structure and protocols. A well-implemented system reduces the risk of accounting errors and abuse, and allows a company's financial managers to regulate and streamline all functions of the accounting department. Internal accounting controls can be set up for every area to track deposits, monitor check handling, keep track of creditor accounts, and even assess budgets and financial statements on an ongoing basis. Setting up an effective accounting system to monitor accounting reports, analyze records, and protect sensitive financial information can also help a company set clear goals and make accurate projections. Creating efficient accounting processes allows an organization to set specific policies and protocols on accounting procedures and to reach its financial objectives on a regular basis. Internal accounting controls can help track such areas as cash-receipt recording, payroll management, appropriate recording of grants and gifts, cash disbursements by authorized personnel, and the recording of assets. These systems can also take into account any government regulations and requirements for financial reporting.

Semi-Automatic Approach for Semantic Annotation

The third phase of the web, the semantic web, requires many web pages that are annotated with metadata. A crucial question is therefore where to acquire this metadata. In this paper we propose a semi-automatic method that annotates the texts of documents and web pages and employs a fairly comprehensive knowledge base to categorize instances with regard to an ontology. The approach is evaluated against manual annotations and against one of the most popular annotation tools that works in the same way as ours. The approach, an annotation tool for the Semantic Web, is implemented on the .NET framework and uses WordNet as its knowledge base.
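
The knowledge-base lookup step might be sketched as follows: climb WordNet hypernym chains until a lemma matches a coarse ontology class. This requires NLTK with the WordNet corpus downloaded, and the class list is an illustrative assumption.

    # Sketch of the knowledge-base lookup step: climbing WordNet hypernyms
    # to map a term onto a coarse ontology class. Requires NLTK with the
    # WordNet corpus; the ONTOLOGY class list is an assumption.
    from nltk.corpus import wordnet as wn

    ONTOLOGY = {'person', 'location', 'organization', 'artifact'}

    def categorize(term):
        """Return the first hypernym lemma that matches an ontology class."""
        for syn in wn.synsets(term, pos=wn.NOUN):
            node = syn
            while node.hypernyms():
                node = node.hypernyms()[0]
                for lemma in node.lemma_names():
                    if lemma.lower() in ONTOLOGY:
                        return lemma.lower()
        return None

    print(categorize('teacher'))   # -> 'person'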

Coordinated Q-V Controller for Multi-Machine Steam Power Plant: Design and Validation

This paper discusses coordinated reactive power-voltage (Q-V) control in a multi-machine steam power plant. The drawbacks of manual Q-V control are briefly listed, and the design requirements for a coordinated Q-V controller are specified. The theoretical background and mathematical model of the new controller are presented next, followed by validation of the developed Matlab/Simulink model through comparison with recorded responses in a real steam power plant and a description of the practical realisation of the controller. Finally, the performance of the commissioned controller is illustrated with several examples of coordinated Q-V control in a real steam power plant and compared with manual control.

A Method for Evaluating Artery Diameter from Ultrasound Video

The cardiovascular system has become a major subject of clinical research, particularly the measurement of arterial blood flow, for which correct determination of the arterial diameter is crucial. We propose a novel, semi-automatic method for artery lumen detection based on a Gaussian probability function. The usability of the proposed method was assessed by analyzing ultrasound B-mode CFA video sequences acquired from eleven healthy volunteers. The correlation coefficient between the manual and semi-automatic measurements of arterial diameter was 0.996. Our method for detecting the artery boundary is novel and accurate enough for the measurement of artery diameter.
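
The Gaussian-profile idea can be sketched as follows: model the inverted transverse intensity profile of the dark lumen as a Gaussian and read a diameter estimate off the fitted width; the two-sigma diameter convention and all signal parameters are illustrative assumptions.

    # Sketch: fit a Gaussian to the inverted transverse intensity profile
    # of the dark lumen and take the fitted width as a diameter estimate.
    # The 2-sigma convention and all signal parameters are assumptions.
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss(x, a, mu, sigma, b):
        return a * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b

    depth = np.linspace(0, 20, 200)                  # mm along the scan line
    rng = np.random.default_rng(3)
    true_mu, true_sigma = 10.0, 2.0
    profile = 200 - 150 * np.exp(-0.5 * ((depth - true_mu) / true_sigma) ** 2)
    profile += rng.normal(0, 5, depth.size)          # speckle-like noise

    inverted = profile.max() - profile               # lumen becomes a peak
    popt, _ = curve_fit(gauss, depth, inverted, p0=(100, 8, 1, 0))
    a, mu, sigma, b = popt
    print(f"estimated diameter ~ {2 * abs(sigma):.2f} mm (2-sigma convention)")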

Scale-Space Volume Descriptors for Automatic 3D Facial Feature Extraction

An automatic method for the extraction of feature points for face-based applications is proposed. The system is based upon volumetric feature descriptors, which are extended in this paper to incorporate scale space. The method is robust to noise and has the ability to extract local and holistic features simultaneously from faces stored in a database. Extracted features are stable over a range of faces, with results indicating that, in terms of intra-ID variability, the technique can outperform manual landmarking.

High Level Synthesis of Kahn Process Networks (KPN) for Streaming Applications

Streaming applications usually consist of stages that run in parallel or in series and incrementally transform a stream of input data. It is a design challenge to break such an application into distinguishable blocks and then to map them onto independent hardware processing elements. For this, a generic controller is required that automatically maps such a stream of data onto independent processing elements without any dependencies or manual intervention. In this paper, Kahn Process Networks (KPNs) for such streaming applications are designed and developed to be mapped onto an MPSoC. The design includes a generic C-based compiler that takes the mapping specifications as input from the user, automates these design constraints, and automatically generates optimized, synthesizable RTL code for the specified application.
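
KPN semantics can be sketched compactly in software: processes communicate only through FIFO channels with blocking reads, which makes the network's output deterministic regardless of scheduling, the property that makes KPNs attractive for mapping onto independent processing elements. The sketch below is a minimal software analogue, not the paper's hardware flow.

    # Minimal sketch of Kahn Process Network semantics: processes exchange
    # tokens only over FIFO channels with blocking reads, so the output is
    # deterministic regardless of how the threads are scheduled.
    import threading, queue

    def producer(out_ch, n=5):
        for i in range(n):
            out_ch.put(i)
        out_ch.put(None)                      # end-of-stream token

    def square(in_ch, out_ch):
        while (tok := in_ch.get()) is not None:   # blocking read
            out_ch.put(tok * tok)
        out_ch.put(None)

    def consumer(in_ch, result):
        while (tok := in_ch.get()) is not None:
            result.append(tok)

    a, b = queue.Queue(), queue.Queue()       # FIFO channels
    result = []
    procs = [threading.Thread(target=producer, args=(a,)),
             threading.Thread(target=square, args=(a, b)),
             threading.Thread(target=consumer, args=(b, result))]
    for p in procs: p.start()
    for p in procs: p.join()
    print(result)                             # always [0, 1, 4, 9, 16]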

Statistical Models of Network Traffic

Model-based approaches have been applied successfully to a wide range of tasks such as specification, simulation, testing, and diagnosis. But one bottleneck often prevents the introduction of these ideas: manual modeling is a non-trivial, time-consuming task. Automatically deriving models by observing and analyzing running systems is one possible way to alleviate this bottleneck. To derive a model automatically, some a priori knowledge about the model structure (i.e., about the system) must exist. Such a model formalism would be used as follows: (i) by observing the network traffic, a model of the long-term system behavior can be generated automatically; (ii) test vectors can be generated from the model; (iii) while the system is running, the model can be used to diagnose abnormal system behavior. The main contribution of this paper is the introduction of a model formalism called the 'probabilistic regression automaton', suitable for the tasks mentioned above.
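
One plausible reading of the learning step is sketched below: estimate state-transition probabilities from an observed event sequence, then score new traffic by its likelihood and flag improbable behavior. This is a generic probabilistic automaton for illustration, not the paper's exact 'probabilistic regression automaton' definition.

    # Hedged sketch of the automata-learning idea: transition probabilities
    # estimated from observed traffic, then used to score new traces.
    # A generic probabilistic automaton, not the paper's exact formalism.
    from collections import Counter, defaultdict
    import math

    def learn(trace):
        counts = defaultdict(Counter)
        for a, b in zip(trace, trace[1:]):
            counts[a][b] += 1
        return {s: {t: c / sum(nxt.values()) for t, c in nxt.items()}
                for s, nxt in counts.items()}

    def log_likelihood(model, trace, floor=1e-6):
        ll = 0.0
        for a, b in zip(trace, trace[1:]):
            ll += math.log(model.get(a, {}).get(b, floor))
        return ll

    normal = ['SYN', 'ACK', 'DATA', 'DATA', 'FIN'] * 200
    model = learn(normal)
    print(log_likelihood(model, ['SYN', 'ACK', 'DATA', 'FIN']))    # plausible
    print(log_likelihood(model, ['FIN', 'SYN', 'SYN', 'DATA']))    # anomalous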

A Fuzzy Implementation for Optimization of Storage Locations in an Industrial AS/RS

Warehousing is commonly used in factories for the storage of products until orders are delivered. As the number of stored products increases, storage becomes tedious to carry out manually. In recent years, manual storage has been replaced by fully or partially computer-controlled systems, known as Automated Storage and Retrieval Systems (AS/RS). This paper discusses an AS/RS designed so that the best storage location for each product is determined by a fuzzy control system. The design maintains records of the products to be stored or already in store, the storage/retrieval times, and the availability status of the storage locations. The paper discusses the maintenance of these records and the use of fuzzy logic to determine the optimum storage location for each product, and further discusses the dynamic splitting and merging of storage locations depending on product sizes.
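
The fuzzy scoring idea might be sketched as follows: rate each free storage location from the product's expected retrieval frequency and the location's travel distance, using triangular membership functions and a small rule base; the membership shapes and rules are assumptions for illustration.

    # Illustrative sketch of fuzzy slot scoring: triangular memberships over
    # normalized retrieval frequency and travel distance, plus a tiny rule
    # base. Membership shapes and rules are assumptions, not the paper's.
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def suitability(freq, dist):
        """freq, dist normalized to [0, 1]; higher score = better slot."""
        freq_high = tri(freq, 0.4, 1.0, 1.6)
        freq_low  = tri(freq, -0.6, 0.0, 0.6)
        dist_near = tri(dist, -0.6, 0.0, 0.6)
        dist_far  = tri(dist, 0.4, 1.0, 1.6)
        # Rule 1: frequent product AND near slot -> good (score 1)
        # Rule 2: rare product AND far slot      -> good (score 1)
        # Rule 3: frequent product AND far slot  -> bad  (score 0)
        rules = [(min(freq_high, dist_near), 1.0),
                 (min(freq_low,  dist_far),  1.0),
                 (min(freq_high, dist_far),  0.0)]
        w = sum(r for r, _ in rules)
        return sum(r * s for r, s in rules) / w if w else 0.5

    slots = {'A1': 0.1, 'B7': 0.5, 'C9': 0.9}       # normalized distances
    best = max(slots, key=lambda s: suitability(0.8, slots[s]))
    print("store fast-moving product at:", best)     # nearest free slot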

Degeneracy of MIS under the Conditions of Instability: A Mathematical Formulation

It has always been observed that the effectiveness of MIS as a support tool for management decisions degenerates over time after implementation, despite the substantial investments made. This is true for organizations at the initial stages of MIS implementation, whether manual or computerized. A survey of a sample of middle to top managers in business and government institutions was conducted. A large ratio of responses indicates that the MIS has lost its impact on day-to-day operations, and the response lag time sometimes expands indefinitely. The data indicate an infant-mortality phenomenon of the bathtub model. Reasons may include the monotonous nature of MIS delivery, irrelevance, irreverence, untimeliness, and lack of adequate detail, all of which combine to create a degree of degeneracy. We investigate the phenomenon of MIS degeneracy that afflicts MIS systems and renders them ineffective, and model it as a bathtub model. A degeneracy index is developed to identify the status of an MIS system and suggest possible remedies to prevent the onset of total collapse, at which point the system becomes useless.
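
A bathtub-shaped hazard curve and a simple degeneracy index can be sketched as below, combining an early 'infant mortality' Weibull term with a late wear-out term and taking the cumulative hazard as the index; all parameters are assumed for illustration, since the paper's index is defined from its survey data.

    # Sketch of a bathtub-shaped hazard curve from two Weibull terms (early
    # "infant mortality" decay plus late wear-out growth), with the
    # cumulative hazard as a degeneracy index. All parameters are assumed.
    import numpy as np

    def hazard(t, b1=0.5, e1=2.0, b2=3.0, e2=10.0):
        """h(t) = (b1/e1)(t/e1)^(b1-1) + (b2/e2)(t/e2)^(b2-1)."""
        return ((b1 / e1) * (t / e1) ** (b1 - 1)
                + (b2 / e2) * (t / e2) ** (b2 - 1))

    t = np.linspace(0.1, 12, 120)                     # years since rollout
    degeneracy_index = np.cumsum(hazard(t)) * (t[1] - t[0])  # H(t)
    print(f"h(0.5)={hazard(0.5):.3f}  h(5)={hazard(5):.3f}  "
          f"h(11)={hazard(11):.3f}")                  # high, low, high
    print(f"index at t=12: {degeneracy_index[-1]:.2f}")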