Improving Detection of Illegitimate Scores and Assessment in Most Advantageous Tenders

Adopting the Most Advantageous Tender (MAT) for government procurement projects has become popular in Taiwan. As time passes, however, problems with MAT have gradually appeared. Two points draw the most criticism: the result can be manipulated by a single committee member’s partiality, and there is no fair way to decide when two or more tenders tie for the win. Arrow’s Impossibility Theorem holds that an ideal scoring method should satisfy four reasonable criteria. Based on these four criteria, this paper constructs an “Illegitimate Scores Checking Scheme” for scoring methods and uses the scheme to expose the illegitimacies of the current MAT evaluation methods. The paper also proposes a new scoring method, the “Standardizing Overall Evaluated Score Method”, which makes each committee member’s influence nearly identical; committee members can therefore score freely according to their own preferences without compromising fairness. Finally, the method was examined in a large-scale simulation, which showed that it simultaneously mitigates the dictatorship problem and completely avoids cyclical majorities. These results verify that the Standardizing Overall Evaluated Score Method outperforms the current MAT evaluation methods.
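
The abstract does not spell out the standardization formula, so the following is only a minimal sketch, assuming “standardizing” means z-scoring each member’s raw scores before summation so that every member carries equal weight; the function name and the example data are illustrative:

```python
import numpy as np

def standardized_overall_scores(raw):
    """Aggregate committee scores after per-member standardization.

    raw: (members x tenders) array of raw scores. Each member's row is
    z-scored so that every member has zero mean and unit variance across
    tenders, equalizing each member's influence on the final ranking.
    """
    raw = np.asarray(raw, dtype=float)
    mean = raw.mean(axis=1, keepdims=True)
    std = raw.std(axis=1, keepdims=True)
    std[std == 0] = 1.0          # a member who scores everyone equally adds nothing
    z = (raw - mean) / std
    return z.sum(axis=0)         # overall evaluated score per tender

# Example: member 0 scores with an extreme spread, members 1-2 moderately.
scores = [[95, 20, 60],
          [70, 72, 74],
          [65, 80, 75]]
print(standardized_overall_scores(scores))
```

In this example, raw summation would let member 0’s extreme spread decide the winner (tender 0), while the standardized totals favor the tender ranked consistently well by the other members (tender 2).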

Music-Inspired Harmony Search Algorithm for Fixed Outline Non-Slicing VLSI Floorplanning

Floorplanning plays a vital role in the physical design process of Very Large Scale Integrated (VLSI) chips. It is an essential design step for estimating the chip area prior to the optimized placement of digital blocks and their interconnections. Since VLSI floorplanning is an NP-hard problem, many optimization techniques have been adopted in the literature. In this work, a music-inspired Harmony Search (HS) algorithm is used for fixed-die, outline-constrained floorplanning, with the aim of reducing the total chip area. HS draws inspiration from the musical improvisation process of searching for a perfect state of harmony. Initially, a B*-tree is used to generate the primary floorplan for the given rectangular hard modules, and the HS algorithm is then applied to search for an efficient optimal floorplan. Experimental results for the HS algorithm are reported on the MCNC benchmark circuits.
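
The abstract names the HS operators (harmony memory consideration, pitch adjustment, random improvisation) without pseudocode, so here is a minimal generic sketch of the metaheuristic under stated assumptions: the floorplan cost is replaced by a toy objective, and all parameter values are illustrative rather than the paper’s settings:

```python
import random

def harmony_search(cost, dim, bounds, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, iters=5000):
    """Generic Harmony Search: minimize `cost` over a box-bounded space.

    hms  -- harmony memory size
    hmcr -- harmony memory considering rate
    par  -- pitch adjusting rate
    bw   -- pitch-adjustment bandwidth (fraction of the variable range)
    """
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    memory.sort(key=cost)
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                 # recall from memory
                x = random.choice(memory)[d]
                if random.random() < par:              # pitch adjustment
                    x += random.uniform(-1, 1) * bw * (hi - lo)
            else:                                      # random improvisation
                x = random.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        if cost(new) < cost(memory[-1]):               # replace worst harmony
            memory[-1] = new
            memory.sort(key=cost)
    return memory[0]

# Toy stand-in for a floorplan cost (e.g. area plus a wirelength penalty).
best = harmony_search(lambda v: sum(x * x for x in v), dim=5, bounds=(-5, 5))
print(best)
```

In the paper’s setting, each harmony would encode a B*-tree floorplan and `cost` would return the resulting chip area.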

Business Domain Modelling Using an Integrated Framework

This paper presents an application of a “Systematic Soft Domain Driven Design Framework” as a soft systems approach to the domain-driven design of information systems. The framework uses Soft Systems Methodology (SSM) as a guiding methodology to explore the problem situation and to develop a domain model in UML for the given business domain, and embeds within it a sequence of UML-based design tasks leading to the implementation of a software system using the Naked Objects framework. The framework has been used in action research projects involving the investigation and modelling of business processes with object-oriented domain models and the implementation of software systems based on those domain models. The framework was proposed and evaluated in our previous work; in this paper, a real case study, an “Information Retrieval System for academic research”, is used to show further practice and evaluation of the framework in a different business domain. We argue that there are advantages in combining techniques from different methodologies in this way for business domain modelling. The framework is overviewed and justified as a multimethodology using Mingers’ multimethodology ideas.

Vaccinated Susceptible Infected and Recovered (VSIR) Mathematical Model to Study the Effect of Bacillus Calmette-Guerin (BCG) Vaccine and the Disease Stability Analysis

Tuberculosis (TB) remains a leading cause of infectious mortality. It is transmitted primarily by the respiratory route: individuals with active disease release airborne particles when they cough, talk, or sing, and these particles are subsequently inhaled by others. To study the effect of the Bacillus Calmette-Guerin (BCG) vaccine after vaccination, a Vaccinated Susceptible Infected and Recovered (VSIR) mathematical model is developed. The model is used to quantify the ability of the BCG vaccine to protect young adult immigrants. Moreover, equations are established for the disease-endemic and disease-free equilibrium states and subsequently used in the disease stability analysis. The stability analysis gives a complete picture of disease annihilation from the total population: the disease dies out if the total removal rate from the infectious group is greater than the total number of dormant infections produced throughout the infectious period.
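
The abstract does not reproduce the governing equations; the following is a hedged sketch of a standard VSIR compartmental system of this kind, with hypothetical rate symbols, not necessarily the paper’s exact formulation:

```latex
% Sketch of a generic VSIR system (illustrative, not the paper's equations):
% V vaccinated, S susceptible, I infected, R recovered;
% \Lambda recruitment, \phi vaccination rate, \beta transmission rate,
% \gamma recovery rate, \mu natural death, \delta disease-induced death.
\begin{aligned}
\frac{dV}{dt} &= \phi S - \mu V, &
\frac{dS}{dt} &= \Lambda - \beta S I - (\phi + \mu)\, S,\\[2pt]
\frac{dI}{dt} &= \beta S I - (\gamma + \mu + \delta)\, I, &
\frac{dR}{dt} &= \gamma I - \mu R.
\end{aligned}
```

Under these assumptions the disease-free equilibrium is locally stable when the basic reproduction number R0 = βΛ/((φ+μ)(γ+μ+δ)) is below one, the same style of condition as the abstract’s removal-rate criterion.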

Comparative Review of Modulation Techniques for Harmonic Minimization in Multilevel Inverter

This paper presents a comparison of the Multi-Carrier Pulse Width Modulation (MCPWM), Sinusoidal Pulse Width Modulation (SPWM), and Selective Harmonic Elimination Pulse Width Modulation (SHEPWM) techniques for minimizing Total Harmonic Distortion (THD) in a Cascaded H-Bridge Multilevel Inverter. The MCPWM method uses the Alternate Phase Opposition Disposition scheme to generate the switching pulses for the multilevel inverter. Another carrier-based approach, the SPWM method, is also implemented to define the switching pulse generation for the multilevel inverter. The SHE method uses a Genetic Algorithm and a Particle Swarm Optimization algorithm to determine the switching angles that eliminate low-order harmonics from the inverter output voltage waveform and reduce the THD. The simulation results validate that the SHEPWM method capably eliminates a large number of specific harmonics and attains a better optimal harmonic minimization solution, with lower THD in the output voltage waveform, than the MCPWM and SPWM methods.
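
For a cascaded H-bridge with quarter-wave symmetry, the SHE switching angles satisfy a small system of transcendental equations. The sketch below solves them for a hypothetical 7-level (3-cell) inverter with a Newton-type root finder rather than the GA/PSO used in the paper, purely to make the equations concrete:

```python
import numpy as np
from scipy.optimize import fsolve

# SHE for a 7-level cascaded H-bridge (3 cells, angles t1 < t2 < t3).
# Quarter-wave symmetry gives, for odd harmonic n:
#     V_n = (4*Vdc / (n*pi)) * (cos(n*t1) + cos(n*t2) + cos(n*t3))
# Set the fundamental to the modulation index m and drive the 5th and
# 7th harmonics to zero; m and the initial guess are illustrative.

def she_equations(theta, m=0.8, cells=3):
    t = np.asarray(theta)
    return [np.cos(t).sum() - m * cells,      # fundamental amplitude target
            np.cos(5 * t).sum(),              # eliminate 5th harmonic
            np.cos(7 * t).sum()]              # eliminate 7th harmonic

angles = fsolve(she_equations, x0=np.radians([15, 35, 60]))
print(np.degrees(angles))   # switching angles in degrees
```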

Fuzzy Inference System Based Unhealthy Region Classification in Plant Leaf Image

In addition to environmental parameters such as rain and temperature, crop disease is a major factor affecting the quality and quantity of crop yield. Disease management is therefore a key issue in agriculture. To manage a disease, it must be detected at an early stage so that it can be treated properly and its spread controlled. Nowadays, images of a diseased leaf can be used to detect the type of disease by means of image processing techniques. This is achieved by extracting features from the images, which can then be used with classification algorithms or content-based image retrieval systems. In this paper, features such as the mean and standard deviation are extracted from a color image after region cropping. The selected features are taken from the cropped image at different image sample sizes. The extracted features are then used for classification with a Fuzzy Inference System (FIS).
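
As an illustration of the pipeline (crop a region, compute mean and standard deviation, feed them to fuzzy rules), here is a minimal sketch; the membership breakpoints, the two rules, and the green-channel focus are all hypothetical stand-ins for the paper’s tuned FIS:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with breakpoints a <= b <= c."""
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def classify_region(region):
    """region: HxWx3 float array, a cropped leaf patch in [0, 1].

    Features: channel mean and standard deviation, as in the paper.
    The breakpoints and rules below are illustrative; in practice they
    would be tuned to the crop and disease at hand.
    """
    mean_g = region[..., 1].mean()          # green-channel mean
    std_g = region[..., 1].std()            # green-channel spread
    low_green = tri(mean_g, 0.0, 0.15, 0.45)
    mottled = tri(std_g, 0.05, 0.2, 0.4)
    # Mamdani-style rule firing strength (min as the AND operator).
    unhealthy = min(low_green, mottled)
    healthy = 1.0 - unhealthy
    return "unhealthy" if unhealthy > healthy else "healthy"

patch = np.random.rand(64, 64, 3) * 0.2     # dummy patch for demonstration
print(classify_region(patch))
```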

Protective Effect of L-Carnitine against Gentamicin-Induced Nephrotoxicity in Rats

This study aimed to determine the possible protective effects of L-carnitine against gentamicin-induced nephrotoxicity. Forty male albino rats were divided into 4 groups (10 rats each): group 1, normal control; group 2, induced nephrotoxicity (gentamicin 50 mg/kg/day S.C. for 8 days); group 3, treated with L-carnitine (40 mg/kg/day S.C. for 12 days); and group 4, treated with L-carnitine for 4 days before and then for 8 days concomitantly with gentamicin. Gentamicin-induced nephrotoxicity (group 2) caused a significant increase in serum urea, creatinine, urinary N-acetyl-β-D-glucosaminidase (NAG), gamma glutamyl transpeptidase (GGT), urinary total protein, and kidney tissue malondialdehyde (MDA), with a significant decrease in serum superoxide dismutase (SOD), serum catalase, and creatinine clearance, and marked tubular necrosis in the proximal convoluted tubules with interruption of the basement membrane around the necrotic tubules, compared with the normal control group. L-carnitine given for 4 days before and for 8 days concomitantly with gentamicin (group 4) produced a marked decrease in serum urea, serum creatinine, urinary NAG, urinary GGT, urinary proteins, and kidney tissue MDA, with a marked increase in serum SOD, serum catalase, and creatinine clearance, and a marked improvement in the tubular damage compared with the gentamicin-induced nephrotoxicity group. L-carnitine administered alone for 12 days produced no change in the parameters mentioned above compared with the normal control group. In conclusion, L-carnitine could correct most of the biochemical parameters and also improve the histopathological features of the kidney associated with gentamicin-induced nephrotoxicity.

Tool for Metadata Extraction and Content Packaging as Endorsed in OAIS Framework

Information generated by various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which proves authenticity and creates trust in the archived data. A subsequent challenge is technology obsolescence; metadata extraction and standardization can be used effectively to tackle this problem. Broadly, metadata can be categorized at two levels: technical and domain. Technical metadata provides the information needed to understand and interpret a data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool that extracts and standardizes both technical and domain-level metadata. This paper describes the different features of the tool and how we developed it.
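
As a flavor of what technical-level extraction involves, here is a minimal stdlib-only sketch recording the kind of fixity and format information an OAIS-style ingest typically captures; it is not the paper’s tool, and domain-level metadata would require a separate, domain-specific extractor:

```python
import hashlib, json, mimetypes, os, time

def technical_metadata(path):
    """Extract basic technical-level metadata for an archived file.

    The SHA-256 digest serves as a fixity value supporting authenticity;
    the MIME type guess hints at the format for obsolescence monitoring.
    """
    st = os.stat(path)
    with open(path, "rb") as fh:
        digest = hashlib.sha256(fh.read()).hexdigest()
    return {
        "filename": os.path.basename(path),
        "size_bytes": st.st_size,
        "modified": time.strftime("%Y-%m-%dT%H:%M:%S", time.gmtime(st.st_mtime)),
        "mime_type": mimetypes.guess_type(path)[0],
        "sha256": digest,
    }

print(json.dumps(technical_metadata(__file__), indent=2))
```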

New Security Approach of Confidential Resources in Hybrid Clouds

Nowadays, cloud environments are becoming a necessity for companies: the technology offers access to data anywhere and at any time. It also provides optimized, secured access to resources and adds security for the data stored on the platform. However, some companies do not trust cloud providers, fearing that providers can access and modify confidential data such as bank accounts. Many works in this context conclude that encryption performed by the provider ensures confidentiality, but they overlook the fact that the cloud provider can also decrypt the confidential resources. A better solution is to apply operations to the data before sending it to the cloud provider, with the objective of making it unreadable to the provider. The principal idea is to let users protect their data with their own methods. In this paper, we demonstrate our approach and show that it is more efficient in terms of execution time than some existing methods. This work aims at enhancing the quality of service of providers and ensuring the trust of customers.
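
The paper’s own transformation is not specified in the abstract; as a generic illustration of the client-side principle (the key never leaves the user), here is a sketch using the `cryptography` library’s Fernet recipe:

```python
from cryptography.fernet import Fernet

# The user generates and keeps the key; only ciphertext reaches the cloud.
key = Fernet.generate_key()          # stays on the client, never uploaded
cipher = Fernet(key)

record = b"IBAN DE89 3704 0044 0532 0130 00"   # a confidential resource
ciphertext = cipher.encrypt(record)            # what the provider stores

# The provider sees only opaque bytes; decryption requires the local key.
assert cipher.decrypt(ciphertext) == record
print(len(ciphertext), "bytes stored in the cloud")
```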

Better Perception of Low Resolution Images Using Wavelet Interpolation Techniques

High resolution images are always desired, as they contain more information and better represent the original data. Interpolation is therefore used to convert a low resolution image into a high resolution one. The quality of such a high resolution image depends on the interpolation function and is assessed in terms of image sharpness. This paper focuses on wavelet-based interpolation techniques, in which an input image is divided into subbands. Each subband is processed separately, and the processed subbands are finally combined to obtain the super-resolution image.
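
A minimal sketch of one common DWT-based variant of this idea follows, using PyWavelets; the choice of wavelet, the per-subband interpolation, and the 2x factor are illustrative, and published methods differ in how the detail subbands are estimated or sharpened:

```python
import numpy as np
import pywt
from scipy.ndimage import zoom   # cubic resampling for each subband

def wavelet_upscale(img, wavelet="haar"):
    """Sketch of DWT-based 2x upscaling.

    Decompose the image into subbands, upscale each subband to the
    original image size, then reconstruct: the inverse DWT of same-size
    subbands yields an image at twice the original resolution.
    """
    cA, (cH, cV, cD) = pywt.dwt2(img, wavelet)
    up = lambda b: zoom(b, 2, order=3)       # cubic interpolation per subband
    return pywt.idwt2((up(cA), (up(cH), up(cV), up(cD))), wavelet)

lowres = np.random.rand(64, 64)
highres = wavelet_upscale(lowres)
print(lowres.shape, "->", highres.shape)     # (64, 64) -> (128, 128)
```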

The Application of FSI Techniques in Modeling of Realistic Pulmonary Systems

Modeling the lung respiratory system, with its complex anatomy and biophysics, presents several challenges, including tissue-driven flow patterns and wall motion. Moreover, because the lungs stretch and recoil with each breath, the pulmonary system does not have static walls and structures. The direct relationship between airflow and tissue motion in the lung structures naturally favors a fluid-structure interaction (FSI) simulation technique. The development of a coupled FSI computational model is therefore an important step toward the realistic simulation of pulmonary breathing mechanics. A simple but physiologically relevant three-dimensional deep-lung geometry is designed, and an FSI coupling technique is used to simulate the deformation of the lung parenchyma tissue that produces the airflow fields. The respiratory tissue system is investigated as a complex phenomenon with respect to respiratory patterns, fluid dynamics, tissue viscoelasticity, and the tidal breathing period.

Temporal Variation of PM10-Bound Benzo(a)pyrene Concentration in an Urban and a Rural Site of Northwestern Hungary

The main objective of this study was to assess the annual concentration and seasonal variation of benzo(a)pyrene (BaP) associated with PM10 at an urban site in Győr and a rural site in Sarród over the 2008–2012 sampling period. A total of 280 PM10 aerosol samples were collected at each sampling site and analyzed for BaP by gas chromatography. The BaP concentrations ranged from undetected to 8 ng/m3 with a mean value of 1.01 ng/m3 at the Győr site, and from undetected to 4.07 ng/m3 with a mean value of 0.52 ng/m3 at the Sarród site. At both sampling sites, relatively higher BaP concentrations were detected in samples collected during the heating seasons than in the non-heating periods. The annual mean BaP concentrations were comparable with published data from other Hungarian sites.

A Validation Technique for Integrated Ontologies

Ontology validation is an important part of web application development, where knowledge integration and ontological reasoning play a fundamental role. It aims to ensure the consistency and correctness of ontological knowledge and to guarantee that ontological reasoning is carried out in a meaningful way. Existing approaches to ontology validation address more or less specific validation issues, but the overall process of validating web ontologies has not yet been formally established. As the size and number of web ontologies continue to grow, more web application developers will rely on the existing repository of ontologies rather than develop ontologies from scratch. If an application utilizes multiple independently created ontologies, their consistency must be validated and, where necessary, adjusted to ensure proper interoperability between them. This paper presents a validation technique intended to test the consistency of independent ontologies utilized by a common application.

Establishing Pairwise Keys Using Key Predistribution Schemes for Sensor Networks

Designing cost-efficient, secure network protocols for Wireless Sensor Networks (WSNs) is a challenging problem because sensors are resource-limited wireless devices. Security services such as authentication and improved pairwise key establishment are critical to efficient networks of sensor nodes. For sensor nodes to communicate securely and efficiently with each other, cryptographic techniques are necessary. In this paper, two key predistribution schemes are presented that enable a mobile sink to establish a secure data-communication link, on the fly, with any sensor node. The intermediate nodes along the path to the sink can verify the authenticity and integrity of incoming packets using a predicted value of the key generated from the sender’s key material. The proposed schemes are based on pairwise keys with the mobile sink; our analytical results clearly show that, in wireless sensor networks with mobile sinks, our schemes perform better in terms of network resilience to node capture than existing schemes.
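
As background for how pairwise keys can be predistributed at all, here is a sketch of the classic polynomial-based scheme of Blundo et al.; it is a standard building block for this family of protocols, not necessarily the paper’s exact construction, and the prime size and degree are illustrative:

```python
import random

P = 2**31 - 1        # public prime modulus (illustrative size only)
T = 3                # scheme tolerates capture of up to T nodes

# The key setup server picks a random symmetric bivariate polynomial
# f(x, y) = sum a[i][j] * x^i * y^j (mod P) with a[i][j] == a[j][i].
a = [[0] * (T + 1) for _ in range(T + 1)]
for i in range(T + 1):
    for j in range(i, T + 1):
        a[i][j] = a[j][i] = random.randrange(P)

def share(node_id):
    """Coefficients of the univariate share g(y) = f(node_id, y),
    preloaded onto the node before deployment."""
    return [sum(a[i][j] * pow(node_id, i, P) for i in range(T + 1)) % P
            for j in range(T + 1)]

def pairwise_key(my_share, peer_id):
    """Evaluate own share at the peer's id: g(peer) = f(me, peer)."""
    return sum(c * pow(peer_id, j, P) for j, c in enumerate(my_share)) % P

# By symmetry of f, both parties derive the same key without interaction.
alice, bob = 17, 42
assert pairwise_key(share(alice), bob) == pairwise_key(share(bob), alice)
print("shared pairwise key established")
```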

Evaluation of Minimization of Moment Ratio Method by Physical Modeling

Under active stress conditions, a rigid cantilever retaining wall tends to rotate about a pivot point located within the embedded depth of the wall. For purely granular and for cohesive soils, a methodology called minimization of moment ratio was previously reported for determining the location of this pivot point. The methodology is used to estimate the rotational stability safety factor. Moreover, the degree of improvement required in a backfill to reach a desired safety factor can be estimated through the concept of shear strength demand. In this article, the accuracy of this method for another type of cantilever wall, the Contiguous Bored Pile (CBP) retaining wall, is evaluated using a physical modeling technique. Based on the observations, the results of the moment ratio minimization method are in good agreement with the results of the physical modeling.

An Automatic Bayesian Classification System for File Format Selection

This paper presents an approach to classifying unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just an unstructured text description that comprises the format features most important for a particular organisation. The file format identification method then employs a file format classifier and associated configurations to provide digital preservation experts with an estimate of the required file format. Our goal is to make use of a format specification knowledge base, aggregated from different Web sources, in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends a file format for the expert’s institution. The proposed methods facilitate file format selection and improve the quality of the digital preservation process. The approach is meant to support decision making for the preservation of digital content in libraries and archives, using domain expert knowledge and file format specifications. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary comprising the most common terms characteristic of all researched formats; the goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
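
The naive Bayes step can be illustrated directly; the sketch below uses scikit-learn with toy descriptions and labels standing in for the aggregated format vocabulary and knowledge base described above:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data: format descriptions and their labels. A real system
# would draw these from the aggregated Web-sourced knowledge base.
descriptions = [
    "lossless raster image, wide tool support, no compression patents",
    "page layout fidelity, embedded fonts, long-term archival profile",
    "plain text, line oriented, universally readable",
    "vector graphics, XML based, scalable rendering",
]
formats = ["PNG", "PDF/A", "TXT", "SVG"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(descriptions, formats)

# An institution's unstructured requirement text drives the recommendation.
need = ["archival document with embedded fonts and fixed layout"]
print(model.predict(need)[0])                    # expected: PDF/A
print(model.predict_proba(need).round(3))        # class probabilities
```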

How Does Improving the Existing DSL Infrastructure Influence the Expansion of Fiber Technology?

Experts, enterprises and operators expect bandwidth demand to increase to rates of 100 to 1,000 Mbps within a few years. The most important question is therefore which technology will satisfy future consumer broadband demand. The current consensus is that fiber technology has the best technical characteristics to achieve such high bandwidth rates. However, fiber deployment is so far very cost-intensive and resource-consuming. To avoid these investments, operators are concentrating on upgrading the existing copper and hybrid fiber-coax infrastructures. This work presents a comparison of the copper and fiber technologies, including an overview of the current German broadband market. Both technologies are reviewed in terms of demand, willingness to pay, and economic efficiency, in connection with their technical characteristics.

Off-Line Detection of “Pannon Wheat” Milling Fractions by Near-Infrared Spectroscopic Methods

The aim of this investigation is to elaborate near-infrared methods for testing and recognizing the chemical components and quality of “Pannon wheat” allied (i.e. true-to-variety or variety-identified) milling fractions, as well as to develop spectroscopic methods for following the milling processes and evaluating the stability of the milling technology across different types of milling products and sampling times. The wheat fractions were produced under industrial conditions, and samples were collected as a function of sampling time and of maximum or minimum yields. The changes in the main chemical components (such as starch, protein, and lipid) and in the physical properties of the fractions (particle size) were analysed with dispersive spectrophotometers using the visible (VIS) and near-infrared (NIR) regions of the electromagnetic radiation. Close correlations were obtained between the spectroscopic measurement data, processed by various chemometric methods (e.g. principal component analysis [PCA] and cluster analysis [CA]), and the operating conditions of the milling technology. NIR methods can detect deviations in the yield parameters and differences between sampling times across a wide variety of fractions, and NIR technology can thus be used for sensitive monitoring of the milling process.
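
The chemometric step (PCA scores followed by clustering) can be sketched generically; the spectra below are random stand-ins for real VIS/NIR absorbance measurements, and the two-fraction setup is purely illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical spectra: rows are milling-fraction samples, columns are
# absorbance values on a VIS/NIR wavelength grid (dummy data).
rng = np.random.default_rng(0)
flour = rng.normal(0.40, 0.02, (20, 500))      # fine-fraction stand-in
bran = rng.normal(0.55, 0.02, (20, 500))       # coarse-fraction stand-in
spectra = np.vstack([flour, bran])

scores = PCA(n_components=2).fit_transform(spectra)   # chemometric scores
labels = KMeans(n_clusters=2, n_init=10).fit_predict(scores)
print(labels)   # the fractions separate into two clusters in PC space
```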

Classification of Political Affiliations by Reduced Number of Features

With the evolution of technology, the expression of opinions has shifted to the digital world. The domain of politics, one of the hottest topics of opinion mining research, is merged here with behavior analysis for affiliation determination in texts, which constitutes the subject of this paper. This study aims to classify the text in news and blogs as either Republican or Democrat with the minimum number of features. As an initial set, 68 features, 64 of which were Linguistic Inquiry and Word Count (LIWC) features, were tested against 14 benchmark classification algorithms. In later experiments, the dimensionality of the feature vector was reduced using 7 feature selection algorithms. The results show that the “Decision Tree”, “Rule Induction” and “M5 Rule” classifiers, when used with the “SVM” and “IGR” feature selection algorithms, performed best, with up to 82.5% accuracy on the given dataset. Further tests on a single feature and on the linguistic-based feature sets showed similar results. The feature “Function”, an aggregate feature of the linguistic category, was found to be the most discriminating of the 68 features, with an accuracy of 81% in classifying articles as either Republican or Democrat.
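
A minimal sketch of the select-then-classify pattern follows; the feature matrix is a random stand-in for the 68 LIWC-style features, and mutual information is used as an information-gain-style analog of IGR with scikit-learn:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Dummy stand-in for the 68-dimensional feature matrix; rows are
# articles, the label is 0 = Democrat, 1 = Republican.
rng = np.random.default_rng(1)
X = rng.random((200, 68))
y = (X[:, 0] + 0.3 * rng.random(200) > 0.65).astype(int)

# Information-gain-style selection followed by a decision tree,
# mirroring the best-performing pairing reported in the paper.
selector = SelectKBest(mutual_info_classif, k=8)
X_sel = selector.fit_transform(X, y)
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
print(cross_val_score(tree, X_sel, y, cv=5).mean())
```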

Production Plan and Technological Variants Optimization by Goal Programming Methods

In this paper, the goal programming methodology is applied to the multiple-objective problem of optimizing technological variants and the production plan. The optimization criteria are determined, and a multiple-objective linear programming model for the problem is formulated and solved; the obtained results are then analysed. The results point to the possibility of efficiently applying the goal programming methodology to the optimization of technological variants and production plans. The paper also points out the advantages of the goal programming methodology compared with the Surrogate Worth Trade-off method for solving this problem.
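
The core mechanics of goal programming, converting each goal into an equality with under- and over-achievement deviation variables and minimizing the unwanted deviations, can be shown in a few lines; the numbers below are illustrative, not the paper’s model:

```python
from scipy.optimize import linprog

# Minimal goal programming sketch. Decision variables x1, x2 are the
# quantities produced with two technological variants. Goals: profit
# 3*x1 + 2*x2 should reach 12; machine hours 2*x1 + x2 should not
# exceed 8. Each goal gets deviation variables n (under-achievement)
# and p (over-achievement); only the unwanted deviations n1 and p2
# are penalized in the objective.
#
# Variable order: [x1, x2, n1, p1, n2, p2]
c = [0, 0, 1, 0, 0, 1]                      # minimize n1 + p2
A_eq = [[3, 2, 1, -1, 0, 0],                # profit + n1 - p1 = 12
        [2, 1, 0, 0, 1, -1]]                # hours  + n2 - p2 = 8
b_eq = [12, 8]

res = linprog(c, A_eq=A_eq, b_eq=b_eq)      # all variables >= 0 by default
x1, x2, n1, p1, n2, p2 = res.x
print(f"plan: x1={x1:.2f}, x2={x2:.2f}, "
      f"unmet profit={n1:.2f}, excess hours={p2:.2f}")
```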