Balanced k-Anonymization

The technique of k-anonymization has been proposed to obfuscate private data by associating it with at least k identities. This paper investigates the basic tabular structures that underlie the notion of k-anonymization using cell suppression. These structures are studied under idealized conditions to identify the essential features of the k-anonymization notion. We optimize data k-anonymization by requiring a minimum number of anonymized values that are balanced over all columns and rows. We study the relationship between the sizes of the anonymized tables, the value k, and the number of attributes. This study has theoretical value in that it contributes to the development of a mathematical foundation for the k-anonymization concept. Its practical significance is still to be investigated.
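As a minimal illustration of the underlying idea (the table, the choice of quasi-identifier columns, and the function name are hypothetical, not from the paper), a cell-suppressed table is k-anonymous when every combination of retained quasi-identifier values occurs in at least k rows:

```python
from collections import Counter

def is_k_anonymous(rows, qi_cols, k):
    """k-anonymity holds when every combination of quasi-identifier
    values (suppressed cells included) occurs in at least k rows."""
    counts = Counter(tuple(r[i] for i in qi_cols) for r in rows)
    return all(c >= k for c in counts.values())

# '*' marks a suppressed cell; columns 0-1 are the quasi-identifiers
table = [
    ("130**", "2*", "flu"),
    ("130**", "2*", "cold"),
    ("148**", "3*", "flu"),
    ("148**", "3*", "flu"),
]
print(is_k_anonymous(table, [0, 1], 2))  # True: each pattern appears twice
print(is_k_anonymous(table, [0, 1], 3))  # False
```

The balancing requirement studied in the paper then constrains where the `*` suppressions may be placed across rows and columns.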

A Decision Support Tool for Evaluating Mobility Projects

Success is a European project that will implement several clean transport offers in three European cities and evaluate their environmental impacts. The goal of these measures is to improve urban mobility, i.e. the movement of residents within cities; examples include park-and-ride schemes, electric vehicles, hybrid buses and bike sharing. A list of 28 criteria and 60 measures has been established for the evaluation of these transport projects. The evaluation criteria can be grouped into: transport, environment, social, economic and fuel consumption. This article proposes a decision support system that encapsulates a hybrid approach based on fuzzy logic, multicriteria analysis and belief theory for evaluating the impacts of urban mobility solutions. A web-based tool called DeSSIA (Decision Support System for Impacts Assessment) has been developed that handles complex data. The tool offers several functionalities, starting from data integration (import of data), through the evaluation of projects, and finishing with a graphical display of results. The tool's development is based on the MVC (Model, View, Controller) concept, a design pattern suited to building software that enforces separation between data, its processing and its presentation. Effort has been put into the ergonomic aspects of the application. Its code complies with current standards (XHTML, CSS) and has been validated by the W3C (World Wide Web Consortium). The main ergonomic focus is on the usability of the application and its ease of learning and adoption. Through the use of technologies such as AJAX (Asynchronous JavaScript and XML), the application is faster and more user-friendly. A strength of our approach is that it handles heterogeneous data (qualitative, quantitative) from various information sources (human experts, surveys, sensors, models, etc.).

Observations about the Principal Components Analysis and Data Clustering Techniques in the Study of Medical Data

Statistical analysis of medical data often requires the use of special techniques because of the particularities of these data. Principal components analysis and data clustering are two statistical data mining methods that are very useful in the medical field, the first as a method to decrease the number of studied parameters, and the second as a method to analyze the connections between diagnosis and data about the patient's condition. In this paper we investigate the implications of a specific data analysis technique: data clustering preceded by a selection of the most relevant parameters, made using principal components analysis. Our assumption was that using principal components analysis before data clustering, in order to select and classify only the most relevant parameters, would improve the accuracy of clustering, but the practical results showed the opposite: the clustering accuracy decreases by a percentage approximately equal to the percentage of information loss reported by the principal components analysis.
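The "information loss" that principal components analysis reports, and which the abstract ties to the drop in clustering accuracy, can be sketched for the two-dimensional case using the closed-form eigenvalues of the 2x2 covariance matrix (a toy illustration with made-up data, not the paper's pipeline):

```python
import math

def pca_2d_info_loss(points):
    """Fraction of total variance lost when 2-D data is reduced to its
    first principal component (smaller eigenvalue / trace)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in points) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in points) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc  # eigenvalues, lam1 >= lam2
    return lam2 / (lam1 + lam2)

# nearly collinear toy data: almost all variance lies on one axis
pts = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.2), (4.0, 3.9), (5.0, 5.1)]
print(f"information loss: {100 * pca_2d_info_loss(pts):.1f}%")
```

The paper's observation is that the clustering accuracy decreases by roughly this same percentage.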

Application of SDS/LABS in Recovery Improvement from Fractured Models

This work concerns an experimental investigation of surfactant flooding in fractured porous media. A series of water and surfactant injection processes were performed on micromodels initially saturated with a heavy crude oil. Eight fractured glass micromodels were used to illustrate the effects of surfactant type and concentration on oil recovery efficiency in the presence of fractures with different properties, i.e. fracture orientation, length and number. Two different surfactants at different concentrations were tested. The results showed that surfactant flooding is more efficient when using an SDS surfactant aqueous solution and when the injection well is located in a proper position with respect to the fracture properties. This study demonstrates the different physical and chemical conditions that affect the efficiency of this method of enhanced oil recovery.

The Classification Model for Hard Disk Drive Functional Tests under Sparse Data Conditions

This paper proposes classification models to be used as a proxy for the hard disk drive (HDD) functional test, which requires more than two weeks to classify HDD status as either "Pass" or "Fail". These models were constructed using a committee network consisting of a number of single neural networks. The paper also includes a method, called the "enforce learning method", to address the sparseness of data for failed parts. Our results reveal that the classification models constructed with the proposed method perform well under sparse data conditions; the models, which need only a few seconds per HDD classification, could therefore substitute for the HDD functional tests.
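A committee network of the kind described combines the outputs of several single classifiers by voting. The sketch below uses hypothetical names and toy threshold "networks"; the paper's "enforce learning method" is not specified in the abstract, so balancing by duplicating the sparse "Fail" class is shown only as one common stand-in:

```python
def committee_predict(members, x):
    """Majority vote of single classifiers (1 = "Pass", 0 = "Fail")."""
    votes = sum(m(x) for m in members)
    return 1 if votes * 2 > len(members) else 0

def balance_by_oversampling(samples, minority_label=0):
    """Duplicate minority-class samples until the classes are balanced.
    (A common stand-in; the paper's "enforce learning method" may differ.)"""
    minor = [s for s in samples if s[1] == minority_label]
    major = [s for s in samples if s[1] != minority_label]
    out = list(samples)
    i = 0
    while len(minor) + i < len(major):
        out.append(minor[i % len(minor)])
        i += 1
    return out

# three toy "networks": simple thresholds on a single feature
members = [lambda x: int(x > 0.5), lambda x: int(x > 0.7), lambda x: int(x > 0.4)]
print(committee_predict(members, 0.6))  # 1: two of three members vote "Pass"

data = [([0.9], 1), ([0.8], 1), ([0.7], 1), ([0.1], 0)]
print(len(balance_by_oversampling(data)))  # 6: the lone "Fail" sample duplicated twice
```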

E-Voting: Trustworthiness in Democracy; A View from Technology, Political and Social Issues

A trustworthy voting process in a democracy requires that each vote be recorded with accuracy and impartiality. High accuracy and impartiality in tallying can be achieved with a biometric system, one such identifier being the fingerprint. Fingerprint recognition is still a challenging problem because of the distortions among different impressions of the same finger. The trustworthiness of biometric voting technologies may have a great effect on voter participation and on the outcomes of the democratic process. Hence, in this study, the authors are interested in designing and analyzing an electronic voting system and the participation of its users. The system is based on fingerprint minutiae with the addition of a personal ID number, in order to enhance the accuracy and speed of the voting process. The new design is analyzed by conducting a pilot election among a class of students selecting their representative.

The Bipartite Ramsey Numbers b(C2m; C2n)

Given bipartite graphs H1 and H2, the bipartite Ramsey number b(H1;H2) is the smallest integer b such that for any subgraph G of the complete bipartite graph Kb,b, either G contains a copy of H1 or its complement relative to Kb,b contains a copy of H2. It is known that b(K2,2;K2,2) = 5, b(K2,3;K2,3) = 9, b(K2,4;K2,4) = 14 and b(K3,3;K3,3) = 17. In this paper we study the case in which both H1 and H2 are even cycles, and prove that b(C2m;C2n) ≥ m + n - 1 for m = n, and b(C2m;C6) = m + 2 for m ≥ 4.
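A worked sketch of one standard witness for a lower bound of this type (this particular coloring is an illustration of the technique, not necessarily the authors' construction):

```latex
Colour the edges of $K_{m+n-2,\,m+n-2}$, with vertex classes $X$ and
$Y = Y_1 \cup Y_2$ where $|Y_1| = m-1$ and $|Y_2| = n-1$, red on every
edge into $Y_1$ and blue on every edge into $Y_2$. A cycle $C_{2m}$ in a
bipartite graph uses $m$ vertices on each side, but the red graph
$K_{m+n-2,\,m-1}$ has only $m-1$ vertices in $Y_1$, so it contains no
$C_{2m}$; symmetrically, the blue graph $K_{m+n-2,\,n-1}$ contains no
$C_{2n}$. Hence $b(C_{2m};C_{2n}) > m+n-2$, i.e.
$b(C_{2m};C_{2n}) \ge m+n-1$.
```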

Faculty Stress at Higher Education: A Study on the Business Schools of Pakistan

Job stress is one of the most important concepts in today's corporate as well as institutional world. The current study was conducted to identify the causes of faculty stress in higher education in Pakistan. For this purpose, public and private business schools of Punjab were selected as representative of Pakistan. A sample of 300 faculty members (214 males, 86 females) responded to the survey. Regression analysis shows that workload, student-related issues and role conflicts are the major sources contributing significantly to stress. The study also revealed that private-sector faculty members experienced more stress than faculty in public-sector business schools. Moreover, female, younger, lower-designation and less-qualified faculty members experience more stress than male, older, higher-designation and more highly qualified ones. The study yields many significant results for the policy makers of business institutions.

Binding of miR398 to mRNA of Chaperone and Superoxide Dismutase Genes in Plants

Among all microRNAs (miRNAs) in the 12 plant species investigated in this study, only miR398 targeted the copper chaperone for superoxide dismutase (CCS). The nucleotide sequences of the miRNA binding sites were located in the mRNA protein-coding sequence (CDS) and were highly homologous. These binding sites in CCS mRNA encoded a conserved GDLGTL hexapeptide. The binding sites for miR398 in the CDS of superoxide dismutase 1 mRNA encoded a GDLGN pentapeptide. The conserved miR398 binding site located in the CDS of superoxide dismutase 2 mRNA encoded the GDLGNI hexapeptide. The miR398 binding site in the CDS of superoxide dismutase 3 mRNA encoded the GDLGNI or GDLGNV hexapeptide. Gene expression of the entire superoxide dismutase family in the studied plant species was regulated only by miR398. All members of the miR398 family, i.e. miR398a, b and c, bound to one site in each CuZnSOD and chaperone mRNA.

Hardware Centric Machine Vision for High Precision Center of Gravity Calculation

We present a hardware-oriented method for real-time measurement of an object's position in video. The targeted application area is light spots used as references for robotic navigation. Different algorithms for dynamic thresholding are explored in combination with component labeling and Center Of Gravity (COG) calculation for the highest possible precision versus Signal-to-Noise Ratio (SNR). The method was developed with low hardware cost in focus, requiring only one convolution operation for the preprocessing of data.
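The COG step can be sketched as an intensity-weighted centroid over thresholded pixels, which is what gives sub-pixel precision for a light spot (a minimal software sketch with a synthetic frame; the paper's hardware pipeline and dynamic thresholding are not reproduced):

```python
def center_of_gravity(image, threshold):
    """Intensity-weighted centre of gravity of pixels at or above a
    threshold, giving a sub-pixel position for a bright spot."""
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += x * v
                sy += y * v
    if total == 0:
        return None
    return (sx / total, sy / total)

# synthetic 5x5 frame with a light spot centred near (2, 1)
frame = [
    [0, 0, 0, 0, 0],
    [0, 10, 40, 10, 0],
    [0, 5, 20, 5, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
print(center_of_gravity(frame, 5))  # (2.0, 1.333...): sub-pixel y position
```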

Face Localization Using Illumination-dependent Face Model for Visual Speech Recognition

A robust still-image face localization algorithm capable of operating in an unconstrained visual environment is proposed. First, the construction of a robust skin classifier within a shifted HSV color space is described. Then, various filtering operations are performed to better isolate face candidates and mitigate the effect of substantial non-skin regions. Finally, a novel Bhattacharyya-based face detection algorithm is used to compare candidate regions of interest with a unique illumination-dependent face model probability distribution function approximation. Experimental results show a 90% face detection success rate despite the demands of the visually noisy environment.
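The Bhattacharyya measure used for the final matching step can be illustrated on discrete histograms (a toy sketch with made-up bin counts; the paper's actual face-model PDF approximation and color space are not reproduced here):

```python
import math

def bhattacharyya_coefficient(p, q):
    """Overlap of two discrete probability distributions;
    1.0 means identical, 0.0 means disjoint support."""
    return sum(math.sqrt(a * b) for a, b in zip(p, q))

def normalize(hist):
    s = float(sum(hist))
    return [v / s for v in hist]

model = normalize([4, 6, 10])      # face-model histogram (toy numbers)
candidate = normalize([5, 5, 10])  # candidate-region histogram
print(round(bhattacharyya_coefficient(model, candidate), 4))  # close to 1.0: good match
```

A candidate region whose coefficient against the face model exceeds some threshold would be accepted as a face.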

Fuzzy EOQ Models for Deteriorating Items with Stock Dependent Demand and Non-Linear Holding Costs

This paper deals with infinite time horizon fuzzy Economic Order Quantity (EOQ) models for deteriorating items with stock-dependent demand rate and nonlinear holding costs, taking the deterioration rate θ0 as a triangular fuzzy number (θ0 − δ1, θ0, θ0 + δ2), where 0 < δ1, δ2 < θ0.

Prerequisites to Increase the Purchase Intent for a Socially Responsible Company: Development of a Scale

Increasing attention has been given in academia to the concept of corporate social responsibility. The number of companies that undertake social responsibility initiatives has also been growing day by day, since behaving in a socially responsible manner brings many benefits to companies. The literature describes various benefits of social responsibility and the situations under which these benefits can be realized. However, most of these studies focus on one aspect of the consequences of behaving in a socially responsible manner, and there is no study that unifies the conditions a company should fulfill to make customers prefer its brand. This study aims to fill this gap. More specifically, the purpose of this study is to identify the conditions that a socially responsible company should fulfill in order to attract customers. To this end, a scale is developed and its reliability and validity are assessed through the Multitrait-Multimethod Matrix method.

Simulation of Natural Convection Flow in an Inclined Open Cavity Using the Lattice Boltzmann Method

In this paper the effects of inclination angle on natural convection flow in an open cavity have been analyzed with the Lattice Boltzmann Method (LBM). The angle of inclination was varied from θ = -45° to 45° in 15° intervals. The study was conducted for Rayleigh numbers (Ra) from 10^4 to 10^6. The comparisons show that the average Nusselt number increases with growing Rayleigh number, and that at Ra = 10^4 it increases as the inclination angle increases. At Ra = 10^5 and Ra = 10^6 the average Nusselt number increases as the inclination angle varies from θ = -45° to θ = 0° and decreases as the inclination angle increases from θ = 0° to θ = 45°.

A Study of Relationship between Mountaineering Participation Motivation and Risk Perception

The main purposes of this study are to analyze climbers' participation motivation and risk perception, and to analyze how well participation motivation predicts risk perception. The study used questionnaires administered, by non-random sampling, to climbers of mountains above 3000 m in Taiwan; a total of 231 valid questionnaires were collected. After statistical analysis, the study found that: 1. the highest-rated participation motivation among climbers was to enjoy the fun of natural beauty; 2. the highest-rated risk perception among climbers was the risk of the natural environment; 3. the "seeking adventure and stimulation" and "competence achievement" motivations were highly predictive of risk perception. Based on these findings, this study offers practical recommendations for the outdoor leisure industry as well as proposals for future researchers.

Wind Energy Development in the African Great Lakes Region to Supplement the Hydroelectricity in the Locality: A Case Study from Tanzania

The African Great Lakes Region refers to the zone around lakes Victoria, Tanganyika, Albert, Edward, Kivu, and Malawi. The main source of electricity in this region is hydropower, whose systems are generally characterized by relatively weak, isolated power schemes, poor maintenance and technical deficiencies, with limited electricity infrastructure. Most of the hydro sources are rain-fed, and as such there is normally a deficiency of water during the dry seasons and extended droughts. In such calamities fossil fuel sources, in particular petroleum products and natural gas, are normally used to rescue the situation; but apart from being nonrenewable, they also release huge amounts of greenhouse gases into our environment, which in turn accelerates the global warming that has at present reached an alarming stage. Wind power is an ample, renewable, widely distributed, clean and free energy source that does not consume or pollute water. Wind-generated electricity is one of the most practical and commercially viable options for grid-quality, utility-scale electricity production. However, the main shortcoming associated with electric wind power generation is the fluctuation of its output in both space and time. Before deciding to establish a wind park at a site, the wind speed characteristics there should therefore be known thoroughly, as well as the local demand or transmission capacity. The main objective of this paper is to use monthly average wind speed data collected from one prospective site within the African Great Lakes Region to demonstrate that the available wind power there is high enough to generate electricity. The mean monthly values were calculated from records gathered on an hourly basis for a period of 5 years (2001 to 2005) from a site in Tanzania. The records, which were collected at a height of 2 m, were extrapolated to a height of 50 m, the standard hub height of wind turbines.
The overall monthly average wind speed was found to be 12.11 m/s, and June to November was established to be the windy season, as the wind speed during this period is above the overall monthly average. The available wind power density corresponding to the overall mean monthly wind speed was evaluated to be 1072 W/m2, a potential that is worth harvesting for the purpose of electricity generation.
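The reported figures can be checked with the standard wind power density relation P/A = 0.5*ρ*v^3 (the 1/7 power-law exponent and the air density used below are common textbook assumptions, not values stated in the abstract):

```python
def extrapolate_wind_speed(v_ref, h_ref, h, alpha=1.0 / 7.0):
    """Power-law vertical wind profile; the 1/7 exponent is a common
    textbook assumption, not necessarily the method used in the paper."""
    return v_ref * (h / h_ref) ** alpha

def wind_power_density(v, rho=1.225):
    """Available wind power per unit swept area, P/A = 0.5*rho*v^3 (W/m2)."""
    return 0.5 * rho * v ** 3

v50 = 12.11  # overall monthly average at 50 m reported in the abstract
print(round(wind_power_density(v50)))  # ~1088 W/m2 at standard density
```

With the standard sea-level density of 1.225 kg/m3 the relation gives about 1088 W/m2; the abstract's 1072 W/m2 is consistent with a slightly lower assumed air density at the site.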

Trapping Efficiency of Diesel Particles Through a Square Duct

Diesel engines emit complex mixtures of inorganic and organic compounds in the form of both solid- and vapour-phase particles. Most of the particulates released are ultrafine nanoparticles, which are detrimental to human health and can easily enter the body through respiration. The emission standards for particulate matter released from diesel engines are constantly tightened within the European Union, and with future regulations based on the number of particles released rather than merely their mass, the need for effective aftertreatment devices will increase. Standard particulate filters in the form of wall-flow filters can suffer from high soot accumulation, producing a large exhaust backpressure. A potential solution would be to combine the standard filter with a flow-through filter to reduce the load on the wall-flow filter. In this paper, soot particle trapping has been simulated in different continuous-flow filters of monolithic structure, including the use of promoters, under laminar flow conditions. An Euler-Lagrange model, the discrete phase model in Ansys, was used with user-defined functions for the forces acting on the particles. A method to quickly screen the trapping of 5 nm and 10 nm particles in different catalyst designs with tracers was also developed. Simulations of square-duct monoliths with promoters show that the strength of the vortices produced is not enough to give a high amount of particle deposition on the catalyst walls. The smallest particles in the simulations, the 5 and 10 nm particles, were trapped to a higher extent than larger particles up to 1000 nm in all studied geometries, with the predominant deposition mechanism being Brownian diffusion. The comparison of the different filter designs with a wall-flow filter does show that there is good scope for altering the design of a flow-through filter without imposing too large a pressure drop penalty.

Translation of Phraseological Units in Abai Kunanbayev's Poems

Abai Kunanbayev (1845-1904) was a great Kazakh poet, composer and philosopher. Abai's main contribution to Kazakh culture and folklore lies in his poetry, which expresses great nationalism and grew out of Kazakh folk culture. Before him, most Kazakh poetry was oral, echoing the nomadic habits of the people of the Kazakh steppes. We want to introduce our country, its history, traditions and culture abroad, and we can do so only through translations. Only by reading Kazakh works can foreign people learn who the Kazakhs are, their way of life, their thoughts and so on. All of this information comes only through translation. The main requirement of a good translation is that it be natural, reading as smoothly as the original. A literary translation should be adequate and should follow the original as fully as possible. Translators have to be loyal to the original text; they should not take liberties with it.

A Survey on Performance Tools for OpenMP

Advances in processor architecture, such as multicore, increase the complexity of parallel computer systems. With multi-core architecture there are different parallel languages that can be used to write parallel programs. One of these languages is OpenMP, which is embedded in C/C++ or FORTRAN. Because of this new architecture and its complexity, it is very important to evaluate the performance of OpenMP constructs, kernels, and application programs on multi-core systems. Performance analysis is the activity of collecting information about the execution characteristics of a program. Performance tools consist of at least three interfacing software layers: instrumentation, measurement, and analysis. The instrumentation layer defines the measured performance events. The measurement layer determines which performance events are actually captured and how they are measured by the tool. The analysis layer processes the performance data and summarizes it into a form that can be displayed by performance tools. In this paper, a number of OpenMP performance tools are surveyed, explaining how each is used to collect, analyse, and display performance data.
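The three-layer structure can be sketched in miniature (a hypothetical toy, not the design of any surveyed tool): instrumentation defines the event, measurement records the raw data, and analysis summarizes it for display:

```python
import time
from collections import defaultdict

_records = defaultdict(list)  # measurement layer: raw captured event data

def instrument(fn):
    """Instrumentation layer: defines the measured event (here, the
    wall-clock duration of each call to the wrapped function)."""
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        result = fn(*args, **kwargs)
        _records[fn.__name__].append(time.perf_counter() - t0)
        return result
    return wrapper

def summarize():
    """Analysis layer: condense raw measurements into displayable form."""
    return {name: {"calls": len(ts), "total_s": sum(ts)}
            for name, ts in _records.items()}

@instrument
def work(n):
    return sum(i * i for i in range(n))

work(10_000)
work(10_000)
print(summarize()["work"]["calls"])  # 2
```

Real OpenMP tools hook runtime events (parallel regions, barriers, etc.) rather than Python calls, but the layering is the same.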

A Detailed Timber Harvest Simulator Coupled with 3-D Visualization

In today's world, the efficient utilization of wood resources is increasingly on the minds of forest owners. Ensuring an efficient harvest of the wood resources is a very complex challenge, and it is one of the areas the project "Virtual Forest II" addresses. Its core is a database with data about forests containing approximately 260 million trees located in North Rhine-Westphalia (NRW). Based on this data, tree growth simulations and wood mobilization simulations can be conducted. This paper focuses on the latter. It describes a discrete-event simulation with an attached 3-D real-time visualization which simulates timber harvest using trees from the database with different crop resources. The simulation can be displayed in 3-D to show the progress of the wood crop. All the data gathered during the simulation is presented in a detailed summary afterwards. This summary includes cost-benefit calculations and can be compared to those of previous runs to optimize the financial outcome of the timber harvest by exchanging crop resources or modifying their parameters.