A Novel Antenna Design for Telemedicine Applications

To develop a reliable and cost-effective communication platform for telemedicine applications, a novel antenna design is presented using the bacterial foraging optimization (BFO) technique. The proposed antenna geometry is achieved by etching a modified Koch curve fractal shape at the edges and a square slot at the center of the radiating element of a patch antenna. The new antenna achieves a 43.79% size reduction and better resonance characteristics than the original patch. Representative results from both simulations and numerical validations are reported in order to assess the effectiveness of the developed methodology.
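As an illustration of the optimization stage only, the following minimal Python sketch shows the chemotaxis loop at the heart of BFO applied to a placeholder cost function; the cost function, dimensionality and step sizes are assumptions for demonstration, not the authors' antenna model, and the reproduction and elimination-dispersal phases of full BFO are omitted.

```python
# Simplified chemotaxis-only BFO loop over a hypothetical antenna cost function.
import numpy as np

def antenna_cost(x):
    # Hypothetical stand-in for the full-wave simulation score
    # (e.g. combining resonant-frequency error and patch area).
    return np.sum((x - 0.5) ** 2)

def bfo_chemotaxis(cost, dim=4, n_bacteria=20, n_steps=50, step_size=0.05, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(0.0, 1.0, size=(n_bacteria, dim))   # candidate geometries
    fitness = np.array([cost(b) for b in pop])
    for _ in range(n_steps):
        for i in range(n_bacteria):
            direction = rng.normal(size=dim)               # tumble: pick a random direction
            direction /= np.linalg.norm(direction)
            trial = pop[i] + step_size * direction         # swim one step
            f_trial = cost(trial)
            if f_trial < fitness[i]:                       # keep the move only if it improves
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

best_geom, best_cost = bfo_chemotaxis(antenna_cost)
print(best_geom, best_cost)
```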

Study of the Antimicrobial Activity of Aminoreductone against Pathogenic Bacteria in Comparison with Other Antibiotics

Antimicrobial activities of aminoreductone (AR), a product formed in the initial stage of the Maillard reaction, were screened against pathogenic bacteria. Significant growth inhibition by AR against all 7 isolates (Staphylococcus aureus ATCC® 25923™, Salmonella typhimurium ATCC® 14028™, Bacillus cereus ATCC® 13061™, Bacillus subtilis ATCC® 11774™, Escherichia coli ATCC® 25922™, Enterococcus faecalis ATCC® 29212™, Listeria innocua ATCC® 33090™) was observed by the standard disc diffusion method. The inhibition zone for each isolate by AR (2.5 mg) ranged from 15 ± 0 mm to 28.3 ± 0.4 mm in diameter. The minimum inhibitory concentration (MIC) of AR ranged from 20 mM to 26 mM across the 7 isolates tested. AR also showed growth inhibition similar to that of antibiotics frequently used for the treatment of bacterial infections, such as amikacin, ciprofloxacin, meropenem and levofloxacin. The results indicate that foods containing AR are valuable sources of bioactive compounds against pathogenic bacteria.

On the Joint Optimization of Performance and Power Consumption in Data Centers

We model the operation of a data center as a multi-objective problem of mapping independent tasks onto a set of data center machines so as to simultaneously minimize energy consumption and response time (makespan), subject to deadline and architectural constraints. A simple technique based on multi-objective goal programming is proposed that guarantees a Pareto-optimal solution with excellent convergence behavior. The proposed technique is also compared with traditional approaches. The simulation results show that the proposed technique achieves superior performance compared to the min-min heuristic, and competitive performance relative to the optimal solution implemented in UNDO for small-scale problems.
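For context on the baseline heuristic mentioned above, here is a small sketch of the min-min mapping rule on an illustrative expected-execution-time matrix; the task and machine values are invented, and the energy objective and deadline constraints of the goal-programming model are not represented.

```python
# Min-min heuristic: repeatedly assign the task with the smallest achievable
# completion time to the machine that achieves it.
import numpy as np

def min_min(etc):
    """etc[i, j] = expected execution time of task i on machine j (illustrative values)."""
    n_tasks, n_machines = etc.shape
    ready = np.zeros(n_machines)              # machine availability times
    unassigned = set(range(n_tasks))
    schedule = {}
    while unassigned:
        best = None                            # (task, machine, completion time)
        for t in unassigned:
            completion = ready + etc[t]
            j = int(np.argmin(completion))
            if best is None or completion[j] < best[2]:
                best = (t, j, completion[j])
        t, j, c = best
        schedule[t] = j
        ready[j] = c
        unassigned.remove(t)
    return schedule, ready.max()               # task-to-machine mapping and makespan

etc = np.array([[3.0, 5.0], [2.0, 4.0], [6.0, 1.0]])
mapping, makespan = min_min(etc)
print(mapping, makespan)
```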

Runtime Monitoring Using Policy Based Approach to Control Information Flow for Mobile Apps

Mobile applications are verified for correctness or evaluated for performance with respect to specific security properties such as availability, integrity and confidentiality. Once they are made available to end users, however, such assurance is achievable only to a limited degree using static software-engineering verification techniques. The more sensitive the information processed by a mobile application, such as credit card data, personal medical information or personal emails, the more important it is to ensure its confidentiality. Monitoring an untrusted mobile application during execution in an environment where sensitive information is present is difficult and unnerving. This paper addresses the issue of monitoring and controlling the flow of confidential information during the execution of untrusted mobile applications. The approach concentrates on providing a dynamic and usable information security solution by interacting with mobile users during application runtime in response to information-flow events.

Evaluating Performance of an Anomaly Detection Module with Artificial Neural Network Implementation

Anomaly detection techniques rest on two main components: the extraction and selection of data, and the analysis performed over the obtained data. The goal of this paper is to analyze the influence each of these components has on system performance by evaluating detection over network scenarios with different setups. The independent variables are the number of system inputs, the way the inputs are codified and the complexity of the analysis techniques. For the analysis, several artificial neural network approaches with different numbers of layers are implemented. The obtained results show the influence each of these variables has on system performance.
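A minimal sketch of this kind of experiment, comparing networks with different numbers of hidden layers on synthetic traffic-like features, might look as follows; the features, labels and layer sizes are placeholders, not the paper's scenarios or codification schemes.

```python
# Compare neural networks of increasing depth on synthetic "network traffic" data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))                    # 8 hypothetical traffic features
y = (X[:, 0] + 0.5 * X[:, 3] > 1.2).astype(int)   # synthetic "anomaly" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for hidden in [(16,), (16, 16), (16, 16, 16)]:    # vary the number of hidden layers
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=2000, random_state=0)
    clf.fit(X_tr, y_tr)
    print(hidden, clf.score(X_te, y_te))          # detection accuracy per architecture
```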

Conceptual Synthesis of Multi-Source Renewable Energy Based Microgrid

Microgrids are increasingly being considered to provide electricity for the expanding energy demand in the grid distribution network and in grid-isolated areas. However, the technical challenges associated with their operation and control are immense. Managing dynamic power balances, power flow and network voltage profiles imposes unique challenges in the context of microgrids. Stability during both grid-connected and islanded modes is considered the major challenge during operation. The traditional control methods that have been employed are based on the assumption of linear loads. For instance, PQ control and voltage and frequency control through decoupled PQ are useful when considering linear loads, but they fall short for nonlinear loads. The deficiencies of traditional microgrid control methods suggest that more research on the control of microgrids is needed. This research aims at introducing the dq-technique concept into decoupled PQ for dynamic load-demand control in an inverter-interfaced DG system operating as an isolated LV microgrid. An exact mathematical formulation of decoupled PQ in the dq frame is expected to accommodate all variations of the line parameters (resistance and inductance), to relinquish the forced relationship between DG variables such as power, voltage and frequency in LV microgrids, and to allow for individual parameter control (frequency and line voltages). This concept is expected to achieve accurate control and to improve microgrid stability and power quality under all load conditions.
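For readers unfamiliar with the dq technique, the sketch below shows the standard amplitude-invariant abc-to-dq (Park) transformation and the decoupled active/reactive power expressions it enables; the voltages, currents and angle are illustrative values, not the paper's microgrid parameters.

```python
# abc -> dq (Park) transform and decoupled P/Q computation in the dq frame.
import numpy as np

def abc_to_dq(a, b, c, theta):
    """Amplitude-invariant Park transform of a three-phase quantity."""
    d = (2.0 / 3.0) * (a * np.cos(theta)
                       + b * np.cos(theta - 2 * np.pi / 3)
                       + c * np.cos(theta + 2 * np.pi / 3))
    q = -(2.0 / 3.0) * (a * np.sin(theta)
                        + b * np.sin(theta - 2 * np.pi / 3)
                        + c * np.sin(theta + 2 * np.pi / 3))
    return d, q

def pq_from_dq(vd, vq, id_, iq):
    """Instantaneous active and reactive power in the dq frame."""
    p = 1.5 * (vd * id_ + vq * iq)
    q = 1.5 * (vq * id_ - vd * iq)
    return p, q

theta = 0.3                                   # grid angle, e.g. from a PLL (assumed)
va, vb, vc = (325 * np.cos(theta + k) for k in (0, -2 * np.pi / 3, 2 * np.pi / 3))
ia, ib, ic = (10 * np.cos(theta + k - 0.2) for k in (0, -2 * np.pi / 3, 2 * np.pi / 3))
vd, vq = abc_to_dq(va, vb, vc, theta)
id_, iq = abc_to_dq(ia, ib, ic, theta)
print(pq_from_dq(vd, vq, id_, iq))            # active power in W, reactive power in var
```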

iCCS: Development of a Mobile Web-Based Student Integrated Information System Using Hill Climbing Algorithm

This paper describes a conducive and structured information-exchange environment for the students of the College of Computer Studies at Manuel S. Enverga University Foundation. The system was developed to help students check their academic results, manage their profiles, perform self-enlistment and manage their academic status, which can also be viewed on mobile phones. Developing class schedules in the traditional way is a long process that involves making a great number of choices. With the Hill Climbing Algorithm, however, class scheduling, particularly the selection of courses to be taken by the student in line with the curriculum, can be carried out automatically and arrive at an optimal solution. The proponent used Rapid Application Development (RAD) as the system development method, PHP as the programming language and MySQL as the database.
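As a rough illustration of the search strategy only (not the iCCS implementation, which is written in PHP/MySQL), the following sketch applies hill climbing to a toy course-timetabling instance with an assumed conflict rule.

```python
# Hill climbing on a toy timetabling problem: minimize conflicting course pairs
# placed in the same timeslot. Courses, slots and conflicts are illustrative.
import random

courses = ["Math", "Prog1", "DB", "Networks", "Ethics"]
slots = ["Mon-AM", "Mon-PM", "Tue-AM", "Tue-PM"]
conflicts = {("Math", "Prog1"), ("DB", "Networks")}   # pairs a student takes together

def cost(assign):
    """Count conflicting course pairs scheduled in the same timeslot."""
    return sum(1 for a, b in conflicts if assign[a] == assign[b])

def hill_climb(max_iter=1000, seed=1):
    random.seed(seed)
    assign = {c: random.choice(slots) for c in courses}
    for _ in range(max_iter):
        if cost(assign) == 0:                 # already conflict-free
            break
        course = random.choice(courses)
        candidate = dict(assign, **{course: random.choice(slots)})
        if cost(candidate) <= cost(assign):   # accept improving or sideways moves
            assign = candidate
    return assign, cost(assign)

print(hill_climb())
```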

Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

In today’s heterogeneous network environment, there is a growing demand for mutually distrustful clients to jointly operate a secure network and prevent malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter which solutions are implemented or which security methodologies or standards are adopted. Security is a crucial concern in the field of computer science, and the main aim of computer security is to keep information safe within the network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey we learn about the importance of whitelisting, antimalware programs, security patches, log files, honeypots and other measures used in banks for financial data protection. There is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper the authors propose implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Selection of Appropriate Classification Technique for Lithological Mapping of Gali Jagir Area, Pakistan

Satellite image interpretation and analysis assist geologists by providing valuable information about the geology and minerals of an area to be surveyed. A test site in Fatejang, district Attock, has been studied using Landsat ETM+ and ASTER satellite images for lithological mapping. Five supervised image classification techniques, namely maximum likelihood, parallelepiped, minimum distance to mean, Mahalanobis distance and spectral angle mapper, were applied to both satellite images to determine the most suitable classification technique for lithological mapping in the study area. The results of these five classification techniques were compared with the geological map produced by the Geological Survey of Pakistan. The maximum likelihood classification applied to the ASTER image shows the highest correlation (0.66) with the geological map. Field observations and XRD spectra of field samples also verified the results. A lithological map was then prepared based on the maximum likelihood classification of the ASTER satellite image.
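The following sketch illustrates per-class Gaussian maximum likelihood classification, the best-performing technique reported above, on synthetic multiband pixel spectra; the band values and class labels are placeholders for the ASTER training data.

```python
# Per-class Gaussian maximum likelihood classification of pixel spectra.
import numpy as np

def fit_ml(X, y):
    """Estimate a mean vector and covariance matrix per lithological class."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
    return params

def classify_ml(X, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, cov = params[c]
        diff = X - mu
        inv = np.linalg.inv(cov)
        log_det = np.linalg.slogdet(cov)[1]
        mahal = np.einsum("ij,jk,ik->i", diff, inv, diff)   # squared Mahalanobis distance
        scores.append(-0.5 * (log_det + mahal))
    return np.array(classes)[np.argmax(np.array(scores), axis=0)]

rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(m, 1.0, size=(50, 6)) for m in (0.0, 3.0)])  # 6 bands, 2 classes
y_train = np.repeat([0, 1], 50)
params = fit_ml(X_train, y_train)
print(classify_ml(X_train[:5], params))
```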

A Local Invariant Generalized Hough Transform Method for Integrated Circuit Visual Positioning

In this study, a local invariant generalized Hough transform (LI-GHT) method is proposed for integrated circuit (IC) visual positioning. The original generalized Hough transform (GHT) is robust to external noise; however, it is not suitable for visual positioning of IC chips because its four-dimensional (4D) parameter space leads to substantial storage requirements and high computational complexity. The proposed LI-GHT method reduces the dimensionality of the parameter space to 2D thanks to the rotational invariance of local invariant geometric features, and it can estimate the position and rotation angle of IC chips accurately in real time under noise and blur. The experimental results show that the proposed LI-GHT can estimate the position and rotation angle of IC chips with high accuracy and speed. The proposed LI-GHT algorithm was implemented in the IC visual positioning system of radio frequency identification (RFID) packaging equipment.
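For reference, here is a deliberately simplified sketch of the classic translation-only GHT (the 2D special case, without the rotation and scale dimensions that the LI-GHT method addresses); the edge images, reference point and gradient quantization are illustrative assumptions.

```python
# Classic 2D generalized Hough transform: build an R-table from a template and
# vote for the reference-point location in a search image.
import numpy as np

def gradient_orientation(img, n_bins=36):
    gy, gx = np.gradient(img.astype(float))
    ang = np.arctan2(gy, gx)                               # orientation in [-pi, pi]
    return ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins

def build_r_table(template, ref, n_bins=36):
    """Map each quantized edge orientation to displacement vectors toward ref."""
    phi = gradient_orientation(template, n_bins)
    table = {b: [] for b in range(n_bins)}
    ys, xs = np.nonzero(template)                          # template edge pixels
    for y, x in zip(ys, xs):
        table[phi[y, x]].append((ref[0] - y, ref[1] - x))
    return table

def ght_vote(image, table, n_bins=36):
    """Accumulate votes for the reference-point location in the search image."""
    phi = gradient_orientation(image, n_bins)
    acc = np.zeros(image.shape, dtype=int)
    ys, xs = np.nonzero(image)                             # image edge pixels
    for y, x in zip(ys, xs):
        for dy, dx in table[phi[y, x]]:
            yy, xx = y + dy, x + dx
            if 0 <= yy < acc.shape[0] and 0 <= xx < acc.shape[1]:
                acc[yy, xx] += 1
    return np.unravel_index(np.argmax(acc), acc.shape)     # most-voted reference location

template = np.zeros((20, 20)); template[5:15, 5] = template[5:15, 14] = 1
template[5, 5:15] = template[14, 5:15] = 1                 # square outline as the "chip"
image = np.zeros((60, 60)); image[30:50, 25:45] = template # shifted copy in the scene
table = build_r_table(template, ref=(10, 10))
print(ght_vote(image, table))                              # expected near (40, 35)
```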

Computer Aided Diagnosis of Polycystic Kidney Disease Using ANN

Many inherited diseases and non-hereditary disorders contribute to the development of renal cystic diseases. Polycystic kidney disease (PKD) is a disorder in which clusters of cysts filled with water-like fluid develop within the kidneys. PKD is responsible for 5-10% of cases of end-stage renal failure treated by dialysis or transplantation. New experimental models and the application of molecular biology techniques have provided new insights into the pathogenesis of PKD. Researchers are showing keen interest in developing automated, computer-aided systems for the diagnosis of diseases. In this paper a multilayered feed-forward neural network with one hidden layer is constructed, trained and tested using the back-propagation learning rule for the diagnosis of PKD, based on physical symptoms and urinalysis test results collected from individual patients. Data collected from 50 patients are used to train and test the network; 75% of the data are used for training and the remaining 25% for testing. The trained network is then applied to new samples, and its output indicates whether the patient is normal or abnormal.
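A minimal sketch of a single-hidden-layer network trained with back-propagation on a 75/25 split is shown below; the 50 "patients" and their features are random placeholders rather than the clinical data used in the paper, and the layer size and learning rate are assumptions.

```python
# One-hidden-layer feed-forward network trained with back-propagation (from scratch).
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((50, 8))                                      # 50 patients, 8 hypothetical features
y = (X[:, 0] + X[:, 4] > 1.0).astype(float).reshape(-1, 1)   # synthetic normal/abnormal label

split = int(0.75 * len(X))                                   # 75% train / 25% test
X_tr, y_tr, X_te, y_te = X[:split], y[:split], X[split:], y[split:]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_hidden, lr = 6, 0.5
W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1));          b2 = np.zeros(1)

for epoch in range(2000):
    h = sigmoid(X_tr @ W1 + b1)                 # forward pass: hidden layer
    out = sigmoid(h @ W2 + b2)                  # forward pass: output layer
    d_out = (out - y_tr) * out * (1 - out)      # delta at output (sigmoid + squared error)
    d_h = (d_out @ W2.T) * h * (1 - h)          # delta back-propagated to hidden layer
    W2 -= lr * h.T @ d_out / len(X_tr); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X_tr.T @ d_h / len(X_tr); b1 -= lr * d_h.mean(axis=0)

pred = (sigmoid(sigmoid(X_te @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
print("test accuracy:", (pred == y_te).mean())
```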

Representing Data without Lost Compression Properties in Time Series: A Review

Uncertain data is an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that handle it by minimizing the loss of compression properties.

Usability Guidelines for Arab E-government Websites

Website developers and designers should follow usability guidelines to provide a user-friendly interface. Many guidelines and heuristics have been developed in previous studies to help both developers and designers in this task, but e-government websites are special cases that require specialized guidelines. This paper introduces a set of 18 guidelines for evaluating the usability of e-government websites in general and Arabic e-government websites in particular, along with a checklist describing how to apply them. The validity and effectiveness of these guidelines were evaluated against a variety of user characteristics. The results indicate that the proposed set of guidelines can be used to identify qualitative similarities and differences with user testing, and that the new set is well suited for evaluating both general and e-government usability.

Parallel Text Processing: Alignment of Indonesian to Javanese Language

Parallel text alignment is proposed as a way of aligning words in Bahasa Indonesia to words in Javanese. Since a one-to-one word translator cannot capture the pragmatic aspects of Javanese, the parallel text alignment model described here uses phrase pair combination. The algorithm aligns the parallel text automatically from the beginning to the end of each sentence. Even though the results of the phrase pair combination outperform the previous algorithm, it is still inefficient: recording all possible combinations consumes more space in the database and is time consuming. The original algorithm is therefore modified by applying an edit distance coefficient to improve data-storage efficiency. As a result, data-storage consumption is reduced by 90%, and the learning period is likewise shortened (to 42 s).
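The sketch below shows the edit-distance computation and a normalized similarity coefficient of the kind that can be used to prune phrase-pair combinations; the example word pair is illustrative and the full alignment algorithm is not reproduced.

```python
# Levenshtein edit distance and a normalized similarity coefficient.
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[m][n]

def similarity(a, b):
    """Normalize the distance into a [0, 1] coefficient (1 = identical)."""
    return 1.0 - edit_distance(a, b) / max(len(a), len(b), 1)

print(similarity("makan", "mangan"))   # e.g. Indonesian "makan" vs Javanese "mangan"
```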

A Java Based Discrete Event Simulation Library

This paper describes the important features of JAPROSIM, a free and open source simulation library implemented in the Java programming language. It provides a framework for building discrete event simulation models. The process-interaction world view adopted by JAPROSIM is discussed. We present the architecture and major components of the simulation library. A pedagogical example is given in order to illustrate how to use JAPROSIM for building discrete event simulation models. Further motivations are discussed and suggestions for improving our work are given.
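To illustrate the general mechanism of discrete event simulation (though not JAPROSIM's actual Java API or its process-interaction classes), a minimal event-scheduling loop might look like this.

```python
# Minimal event-scheduling core: a clock and a time-ordered future-event list.
import heapq

class Simulator:
    def __init__(self):
        self.clock = 0.0
        self._queue = []            # (time, sequence, action) kept in time order
        self._seq = 0

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, action))
        self._seq += 1

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.clock, _, action = heapq.heappop(self._queue)
            action()                # execute the event at its scheduled time

sim = Simulator()

def arrival(i=0):
    print(f"t={sim.clock:.1f}: customer {i} arrives")
    if i < 3:
        sim.schedule(2.0, lambda: arrival(i + 1))   # next arrival in 2 time units

sim.schedule(0.0, arrival)
sim.run(until=10.0)
```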

Design and Implementation of an Image Based System to Enhance the Security of ATM

In this paper, an image-based system was designed and implemented through the optimization of object detection algorithms using Haar features. The optimized algorithm performs face detection and eye detection separately; cascading the two yields a clear image of the user. Using this feature brings higher security by preventing fraud, since services are provided to the user only once a clear image of his or her face has been captured, which excludes inappropriate persons. To expedite processing and eliminate unnecessary computation, the input image was compressed, a motion-detection function was included in the program, and the detection window size was confined.
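A hedged sketch of cascaded Haar-feature face and eye detection with OpenCV is given below; the scale factors, neighbor counts and window sizes are illustrative choices, not the parameters of the deployed ATM system.

```python
# Cascade Haar face detection, then eye detection inside the face region.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def capture_clear_face(frame):
    """Return the face region only if both eyes are also detected inside it."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                          minSize=(80, 80))      # confine the detection window
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) >= 2:                    # accept only when face and both eyes are visible
            return frame[y:y + h, x:x + w]
    return None                               # no sufficiently clear face image
```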

A New Hybrid K-Mean-Quick Reduct Algorithm for Gene Selection

Feature selection is the process of selecting the most informative features and is one of the important steps in knowledge discovery. The problem is that not all genes in gene expression data are important: some genes may be redundant, and others may be irrelevant and noisy. Here a novel approach, the hybrid K-Mean-Quick Reduct (KMQR) algorithm, is proposed for gene selection from gene expression data. In this study, the entire dataset is divided into clusters by applying the K-Means algorithm, so that each cluster contains similar genes. Highly class-discriminative genes are then selected based on their degree of dependency by applying the Quick Reduct algorithm to each cluster. The Average Correlation Value (ACV) is calculated for the highly class-discriminative genes. Clusters whose ACV equals 1 are determined to be significant clusters, whose classification accuracy is equal to or higher than that of the entire dataset. The proposed algorithm is evaluated and compared using WEKA classifiers, and the proposed work shows high classification accuracy.
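The sketch below illustrates the clustering stage and a per-cluster average-correlation check on a random expression matrix; the ACV formula here is a simple stand-in (mean absolute pairwise correlation), and the Quick Reduct and WEKA evaluation steps are omitted.

```python
# Cluster genes with K-Means and compute an average correlation value per cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
expr = rng.normal(size=(200, 40))             # 200 genes x 40 samples (placeholder data)

k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(expr)

def average_correlation(block):
    """Mean absolute pairwise correlation of the genes in one cluster."""
    if len(block) < 2:
        return 1.0
    corr = np.corrcoef(block)                 # gene-by-gene correlation matrix
    upper = corr[np.triu_indices_from(corr, k=1)]
    return float(np.mean(np.abs(upper)))

for c in range(k):
    acv = average_correlation(expr[labels == c])
    print(f"cluster {c}: {np.sum(labels == c)} genes, ACV = {acv:.2f}")
```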

Pectoral Muscles Suppression in Digital Mammograms Using Hybridization of Soft Computing Methods

Breast region segmentation is an essential prerequisite in the computerized analysis of mammograms. It aims at separating the breast tissue from the background of the mammogram and comprises two independent segmentations. The first separates the background region, which usually contains annotations, labels and frames, from the whole breast region, while the second removes the pectoral muscle portion (present in Medio-Lateral Oblique (MLO) views) from the rest of the breast tissue. In this paper we propose a hybridization of Connected Component Labeling (CCL), fuzzy, and straight-line methods. The proposed methods work well for separating the pectoral region. After removal of the pectoral muscle from the mammogram, further processing is confined to the breast region alone. To demonstrate the validity of our segmentation algorithm, it was extensively tested using 322 mammographic images from the Mammographic Image Analysis Society (MIAS) database. The segmentation results were evaluated using the Mean Absolute Error (MAE), Hausdorff Distance (HD), Probabilistic Rand Index (PRI), Local Consistency Error (LCE) and Tanimoto Coefficient (TC). The hybridization of the fuzzy and straight-line methods gives adequate or better curve segmentations in more than 96% of cases. In addition, a comparison with similar approaches from the state of the art has been carried out, obtaining slightly improved results. Experimental results demonstrate the effectiveness of the proposed approach.
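As an illustration of the CCL stage only, the following sketch keeps the largest bright connected component of a placeholder mammogram, discarding small background artifacts such as labels and frames; the threshold is an assumption and the fuzzy and straight-line pectoral-suppression steps are not shown.

```python
# Connected component labeling: keep the largest bright component (the breast region).
import numpy as np
from scipy import ndimage

def largest_component_mask(mammogram, threshold=0.1):
    """Binarize, label connected components, and keep only the largest one."""
    binary = mammogram > threshold * mammogram.max()
    labels, n = ndimage.label(binary)               # label connected components
    if n == 0:
        return binary
    sizes = ndimage.sum(binary, labels, index=range(1, n + 1))
    keep = 1 + int(np.argmax(sizes))                # component id of the largest region
    return labels == keep

img = np.random.rand(128, 128)                      # placeholder for a MIAS image
print(largest_component_mask(img).sum())
```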

Query Reformulation Guided by External Resource for Information Retrieval

Reformulating the user query is a technique that aims to improve the performance of an Information Retrieval System (IRS) in terms of precision and recall. This paper evaluates the technique of query reformulation guided by an external resource for Arabic texts. To this end, various precision and recall measurements were conducted on two corpora using different external resources, namely Arabic WordNet (AWN) and the Arabic Dictionary (thesaurus) of Meaning (ADM). Examination of the obtained results allows us to measure the real contribution of this reformulation technique to improving IRS performance.

Comparative Study - Three Artificial Intelligence Techniques for Rain Domain in Precipitation Forecast

Precipitation forecasting is important for avoiding natural disasters that can cause losses in the affected areas. This review paper covers three artificial intelligence techniques, namely logistic regression, decision trees and random forests, as used in precipitation forecasting. These techniques are combined through a VAR model to identify the advantages and strengths of each technique in the forecasting process. The data contain variables from the rain domain. Adapting artificial intelligence techniques to the rain domain makes the precipitation forecasting process easier and more systematic.
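A small sketch comparing the three reviewed techniques on synthetic rain-domain features is given below; the features, labels and hyperparameters are placeholders, and the VAR-model combination step is not represented.

```python
# Compare logistic regression, a decision tree and a random forest on synthetic rain data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 5))                       # e.g. humidity, pressure, temperature, wind, cloud cover
y = (X[:, 0] + 0.8 * X[:, 1] > 0.5).astype(int)     # synthetic "rain / no rain" label

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)     # 5-fold cross-validated accuracy
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```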