Wireless Sensor Network to Help Low-Income Farmers Face Drought Impacts

This research presents the main ideas for implementing an intelligent system composed of communicating wireless sensors that measure environmental data linked to drought indicators (such as air temperature, soil moisture, etc.). In addition, the setting up of a spatio-temporal database communicating with a Web mapping application is proposed for real-time monitoring, 24 hours a day, 7 days a week, to allow the screening of the time evolution of drought parameters and their extraction. This system thus helps detect areas affected by drought. Spatio-temporal conceptual models seek to serve users who need to manage soil water content for irrigating, fertilizing, or other activities aimed at increasing crop yield. Indeed, spatio-temporal conceptual models enable users to obtain a readable and easy-to-understand data diagram. Combined with socio-economic information, the system helps identify the people impacted by the phenomenon and the corresponding severity, especially since this information is accessible to farmers and stakeholders themselves. The study will be applied in the Siliana watershed, northern Tunisia.
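
As a rough illustration of the monitoring idea, the sketch below flags sensor locations whose latest soil-moisture reading falls under a drought threshold; the table layout, column names, and threshold value are hypothetical and not taken from the system described above.

```python
# Minimal sketch of flagging drought-affected sensor locations from
# spatio-temporal readings. Table/column names and the soil-moisture
# threshold are illustrative assumptions, not the actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE reading (
    sensor_id TEXT, lat REAL, lon REAL,
    ts TEXT, air_temp_c REAL, soil_moisture_pct REAL)""")
conn.executemany(
    "INSERT INTO reading VALUES (?, ?, ?, ?, ?, ?)",
    [("S1", 36.08, 9.37, "2023-07-01T12:00", 38.5, 9.0),
     ("S2", 36.10, 9.40, "2023-07-01T12:00", 35.2, 21.0)])

SOIL_MOISTURE_THRESHOLD = 12.0  # hypothetical drought threshold (%)

# Latest reading per sensor below the threshold -> candidate drought zone.
rows = conn.execute("""
    SELECT sensor_id, lat, lon, MAX(ts), soil_moisture_pct
    FROM reading GROUP BY sensor_id
    HAVING soil_moisture_pct < ?""", (SOIL_MOISTURE_THRESHOLD,)).fetchall()
for sensor_id, lat, lon, ts, moisture in rows:
    print(f"{sensor_id} at ({lat}, {lon}) flagged at {ts}: moisture={moisture}%")
```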

Local Mesh Co-Occurrence Pattern for Content Based Image Retrieval

This paper presents the local mesh co-occurrence pattern (LMCoP) using the HSV color space for an image retrieval system. The HSV color space is used in this method to utilize the color, intensity, and brightness of images. Local mesh patterns are applied to define the local information of the image, and the gray-level co-occurrence matrix is used to obtain the co-occurrence of local mesh pattern (LMeP) pixels. The local mesh co-occurrence pattern extracts the local directional information from the local mesh pattern and converts it into a well-structured feature vector using the gray-level co-occurrence matrix. The proposed method is tested on three different databases: MIT VisTex, Corel, and STex. The algorithm is also compared with existing methods, and results in terms of precision and recall are reported in this paper.
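
To make the second stage of the descriptor concrete, the following sketch computes a plain gray-level co-occurrence matrix; in the actual method the co-occurrences are taken over local mesh pattern codes rather than raw gray levels, so this is only an illustration of the GLCM step.

```python
# Minimal sketch of a gray-level co-occurrence matrix (GLCM), computed here
# directly on a quantized gray image for illustration only.
import numpy as np

def glcm(image, levels=8, dx=1, dy=0):
    """Count co-occurrences of quantized gray levels at offset (dy, dx)."""
    q = (image.astype(np.float64) / 256.0 * levels).astype(int)  # quantize to `levels` bins
    q = np.clip(q, 0, levels - 1)
    mat = np.zeros((levels, levels), dtype=np.float64)
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            mat[q[y, x], q[y + dy, x + dx]] += 1
    return mat / max(mat.sum(), 1)  # normalize to joint probabilities

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64))
features = glcm(img).ravel()  # flattened GLCM can serve as a feature vector
print(features.shape)  # (64,)
```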

A Framework for Semantics Preserving SPARQL-to-SQL Translation

The enormous amount of information stored on the web increases from one day to the next, and the web currently faces the inevitable difficulty of retrieving the pertinent information that users really want. The problem today is not limited to expanding the size of the information highways, but extends to designing a system for intelligent search. The vast majority of this information is stored in relational databases, which in turn represent a backend for managing the RDF data of the semantic web. This problem has motivated us to write this paper in order to establish an effective approach, supported by a semantics-preserving translation algorithm, from SPARQL queries to SQL queries, more precisely SPARQL SELECT queries; by adopting this method, a relational database can be queried easily with SPARQL queries while maintaining the same performance.
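
As a hedged illustration of the general idea, the sketch below translates a single SPARQL triple pattern into SQL over a generic three-column triple table; the schema and rules shown are simplifying assumptions, not the paper's full translation algorithm.

```python
# Illustrative sketch of translating a single-pattern SPARQL SELECT into SQL
# over a generic triple table (subject, predicate, object). The table layout
# and the translation rules shown here are simplifying assumptions.
def sparql_to_sql(variables, triple_pattern, table="triples"):
    s, p, o = triple_pattern
    cols, conds = {}, []
    for col, term in zip(("subject", "predicate", "object"), (s, p, o)):
        if term.startswith("?"):                    # variable -> projected column
            cols[term] = col
        else:                                       # constant -> WHERE condition
            conds.append(f"{col} = '{term}'")
    select = ", ".join(f"{cols[v]} AS {v[1:]}" for v in variables)
    where = " AND ".join(conds) or "1=1"
    return f"SELECT {select} FROM {table} WHERE {where}"

# SPARQL: SELECT ?name WHERE { ?s foaf:name ?name }
print(sparql_to_sql(["?name"], ("?s", "foaf:name", "?name")))
# SELECT object AS name FROM triples WHERE predicate = 'foaf:name'
```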

Non-Parametric, Unconditional Quantile Estimation of Efficiency in Microfinance Institutions

We apply the non-parametric, unconditional, hyperbolic order-α quantile estimator to appraise the relative efficiency of microfinance institutions (MFIs) in Africa in terms of outreach. Our purpose is to verify whether these institutions, which must constantly try to strike a compromise between their social role and financial sustainability, are operationally efficient. Using data on African MFIs extracted from the Microfinance Information eXchange (MIX) database and covering the period from 2004 to 2006, we find that the more efficient MFIs are also the most profitable. This result is in line with the view that social performance is not in contradiction with the pursuit of excellent financial performance. Our results also show that large MFIs in terms of assets and those charging the highest fees are not necessarily the most efficient.
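
For readers unfamiliar with order-α estimators, the sketch below shows the quantile idea in the simplest single-input, single-output, output-oriented form; the study itself uses the hyperbolic variant on multivariate outreach data, so this is only a didactic simplification.

```python
# Didactic sketch of a non-parametric order-alpha efficiency score in the
# simplest single-input, single-output, output-oriented case.
import numpy as np

def order_alpha_output_score(x0, y0, X, Y, alpha=0.95):
    """Compare unit (x0, y0) with the alpha-quantile of outputs among peers
    that use no more input than x0. Scores < 1 mean the unit lies above the
    alpha-quantile frontier (relatively efficient)."""
    peers = Y[X <= x0]
    if peers.size == 0:
        return np.nan
    y_alpha = np.quantile(peers, alpha)    # partial (order-alpha) frontier level
    return y_alpha / y0                    # output expansion needed to reach it

rng = np.random.default_rng(1)
X = rng.uniform(1, 10, 200)                       # e.g. assets (input), synthetic
Y = np.sqrt(X) * rng.uniform(0.5, 1.0, 200)       # e.g. borrowers reached (output)
print(round(order_alpha_output_score(X[0], Y[0], X, Y, alpha=0.95), 3))
```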

A Study on the Assessment of Prosthetic Infection after Total Knee Replacement Surgery

This study uses, as its research subjects, patients who had undergone total knee replacement surgery, drawn from the database of the National Health Insurance Administration. Through a review of the literature and interviews with physicians, important factors are selected after careful screening. Then, using the Cross Entropy Method, Genetic Algorithm Logistic Regression, and Particle Swarm Optimization, the weight of each factor is calculated. In the meantime, Excel VBA and Case-Based Reasoning are combined and adopted to evaluate the system. Results show no significant difference between Genetic Algorithm Logistic Regression and Particle Swarm Optimization, with over 97% accuracy for both methods. Both ROC areas are above 0.87. This study can provide a critical reference for medical personnel in clinical assessment to effectively enhance medical care quality and efficiency, prevent unnecessary waste, and provide practical advantages for resource allocation in medical institutions.
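
As an illustration of how factor weights can feed a case-based reasoning step, the sketch below computes a weighted similarity between a new patient and stored cases; the factor names and weights are invented for the example and are not values from the study.

```python
# Minimal sketch of combining factor weights with case-based reasoning:
# a weighted similarity between a new patient and stored cases.
import numpy as np

FACTORS = ["age", "diabetes", "bmi", "operation_time"]   # hypothetical factors
WEIGHTS = np.array([0.2, 0.35, 0.2, 0.25])                # e.g. learned by GA/PSO

def similarity(new_case, stored_case, ranges):
    """Weighted sum of per-factor similarities, each scaled by the factor range."""
    diffs = np.abs(new_case - stored_case) / ranges
    return float(np.sum(WEIGHTS * (1.0 - diffs)))

cases = np.array([[70, 1, 31.0, 120], [55, 0, 24.0, 90]], dtype=float)   # past cases
outcomes = ["infection", "no infection"]
ranges = cases.max(axis=0) - cases.min(axis=0) + 1e-9

new_patient = np.array([68, 1, 30.0, 115], dtype=float)
scores = [similarity(new_patient, c, ranges) for c in cases]
print(outcomes[int(np.argmax(scores))])   # retrieve outcome of the most similar case
```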

3D Objects Indexing with a Direct and Analytical Method for Calculating the Spherical Harmonics Coefficients

In this paper, we propose a new method for three-dimensional object indexing based on the D.A.M.C-S.H.C descriptor (Direct and Analytical Method for Calculating the Spherical Harmonics Coefficients). To this end, we propose a direct calculation of the coefficients of spherical harmonics with perfect precision. The aims of the method are to minimize the processing time on the 3D object database and the search time for objects similar to a query object. First, we define the new descriptor using a new division of the 3-D object within a sphere. Then we define a new distance, which is tested to prove its efficiency in searching for similar objects in a database containing objects of widely varying and large sizes.
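
As a point of comparison with the analytical calculation, the sketch below estimates spherical-harmonic coefficients by straightforward numerical projection of a function sampled on the sphere; it is an illustrative stand-in, not the direct method proposed in the paper.

```python
# Minimal sketch of estimating spherical-harmonic coefficients of a function
# sampled on the unit sphere by numerical projection (not the analytical method).
import numpy as np
from scipy.special import sph_harm

def sh_coefficients(f, l_max=4, n_theta=64, n_phi=128):
    """Project f(theta, phi) onto Y_l^m up to degree l_max (theta: polar angle)."""
    theta = np.linspace(0, np.pi, n_theta)
    phi = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
    T, P = np.meshgrid(theta, phi, indexing="ij")
    vals = f(T, P)
    dA = np.sin(T) * (np.pi / n_theta) * (2 * np.pi / n_phi)    # surface element
    coeffs = {}
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, P, T)       # scipy: sph_harm(m, l, azimuth, polar)
            coeffs[(l, m)] = np.sum(vals * np.conj(Y) * dA)
    return coeffs

# Example: a radial distance function of a slightly deformed sphere.
r = lambda theta, phi: 1.0 + 0.3 * np.cos(theta) ** 2
c = sh_coefficients(r)
print(abs(c[(0, 0)]), abs(c[(2, 0)]))      # dominant low-degree coefficients
```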

A Consideration on the Offset Frontal Impact Modeling Using Spring-Mass Model

To construct a lumped spring-mass model that accounts for the occupants in an offset frontal crash, the SISAME software and NHTSA test data were used. The data on a 56 kph, 40% offset frontal vehicle-to-deformable-barrier crash test of a MY2007 Mazda 6 4-door sedan were obtained from the NHTSA test database. The overall behaviors of the B-pillar and engine in the simulation models agreed very well with the test data. The trends of the accelerations at the driver and passenger head were similar, but there were large differences in peak values. These differences in peak values caused large errors in the HIC36 and 3 ms chest g's. To predict the dummy behaviors well, the spring-mass model for the offset frontal crash needs to be improved.
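
The following sketch shows the general shape of such a lumped model: a vehicle mass and an occupant mass coupled by spring-damper elements and integrated explicitly; the masses, stiffnesses, and barrier pulse are assumed values, not parameters extracted with SISAME.

```python
# Minimal sketch of a one-dimensional lumped spring-mass crash model: a vehicle
# mass coupled to an occupant mass through spring-damper elements, integrated
# with semi-implicit Euler. All parameters below are illustrative assumptions.
m_vehicle, m_occupant = 1500.0, 75.0         # kg (assumed)
k_front, k_restraint = 4.0e5, 2.0e4          # N/m (assumed crush / restraint stiffness)
c_restraint = 1.5e3                          # N*s/m (assumed restraint damping)
v0 = 15.6                                    # m/s, ~56 kph initial speed

dt, t_end = 1e-4, 0.15
x_v = x_o = 0.0
v_v = v_o = v0
peaks = [0.0, 0.0]
for _ in range(int(t_end / dt)):
    f_barrier = -k_front * x_v if x_v > 0 else 0.0              # barrier crush force
    f_belt = -k_restraint * (x_o - x_v) - c_restraint * (v_o - v_v)
    a_v = (f_barrier - f_belt) / m_vehicle
    a_o = f_belt / m_occupant
    v_v += a_v * dt; v_o += a_o * dt
    x_v += v_v * dt; x_o += v_o * dt
    peaks = [max(peaks[0], abs(a_v)), max(peaks[1], abs(a_o))]

print(f"peak vehicle decel ~{peaks[0]/9.81:.1f} g, peak occupant decel ~{peaks[1]/9.81:.1f} g")
```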

Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences

Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, the Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting, etc.).
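
As a minimal illustration of the occlusion-bridging step, the sketch below runs a constant-velocity Kalman filter that predicts through frames with no detection; the noise covariances are assumed values, not those of the implemented system.

```python
# Minimal sketch of a constant-velocity Kalman filter used to keep a human
# object's trajectory through short occlusions: when no detection is available,
# only the prediction step runs. Noise covariances are illustrative assumptions.
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)  # state transition
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)                                 # observe (x, y)
Q = np.eye(4) * 0.01                                                               # process noise
R = np.eye(2) * 4.0                                                                # measurement noise

x = np.array([0.0, 0.0, 1.0, 0.5])    # state [x, y, vx, vy]
P = np.eye(4)

def step(x, P, z=None):
    x, P = F @ x, F @ P @ F.T + Q                     # predict
    if z is not None:                                  # update only when detected
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
    return x, P

detections = [np.array([1.1, 0.6]), np.array([2.0, 1.1]), None, None, np.array([5.2, 2.4])]
for z in detections:
    x, P = step(x, P, z)
    print(np.round(x[:2], 2))          # estimated position, bridging the occlusion
```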

Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry

In this paper we propose a computer-aided solution based on Genetic Algorithms in order to reduce the effort of drafting the reports, FMEA analysis and Control Plan, required at product launch in manufacturing, and to improve the knowledge available to development teams for future projects. The solution allows the design team to enter the data required for FMEA. The actual analysis is performed using Genetic Algorithms to find an optimum between the RPN risk factor and the cost of production. A feature of Genetic Algorithms is that they can be used as a means of finding solutions to multi-criteria optimization problems. In our case, the three specific FMEA risk factors are considered together with the reduction of production cost. The analysis tool generates final reports for all FMEA processes. The data obtained in the FMEA reports are automatically integrated with the other entered parameters in the Control Plan. The solution is implemented as an application running on an intranet on two servers: one containing the analysis and plan generation engine and the other containing the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser cutting, and bending processes used to manufacture chassis for buses. The advantages of the solution are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports using multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The solution we propose is a cheap alternative to other solutions on the market, using open-source tools in its implementation.
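
The sketch below illustrates the optimization idea: a small genetic algorithm choosing, per failure mode, a mitigation action that trades the RPN (severity × occurrence × detection) against action cost; the action list, costs, effects, and weights are invented for the example.

```python
# Minimal sketch of a genetic algorithm trading off the FMEA risk priority
# number (RPN = severity x occurrence x detection) against mitigation cost.
import random

random.seed(0)
# Each candidate action: (cost, new_occurrence, new_detection) for a failure
# mode with severity 8 (all values hypothetical).
ACTIONS = [(0, 6, 5), (200, 4, 5), (350, 6, 3), (500, 3, 3), (900, 2, 2)]
SEVERITY = 8
W_RPN, W_COST = 1.0, 0.2          # weighting of the two criteria (assumed)

def fitness(genome):               # genome: one action index per failure mode
    rpn = sum(SEVERITY * ACTIONS[g][1] * ACTIONS[g][2] for g in genome)
    cost = sum(ACTIONS[g][0] for g in genome)
    return -(W_RPN * rpn + W_COST * cost)          # higher is better

def evolve(n_modes=5, pop_size=30, generations=60):
    pop = [[random.randrange(len(ACTIONS)) for _ in range(n_modes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_modes)                  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                           # mutation
                child[random.randrange(n_modes)] = random.randrange(len(ACTIONS))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print(best, -fitness(best))        # chosen actions and combined RPN/cost score
```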

Developing a Town Based Soil Database to Assess the Sensitive Zones in Nutrient Management

For this study, a town-based soil database was created in the Gümüsçay District of Biga Town, Çanakkale, Turkey. Crop and livestock production are the major activities in the district. Nutrient management is mainly based on commercial fertilizer application, ignoring the livestock manure. Within the boundaries of the district, 122 soil sampling points were determined on a satellite image. Soil samples were collected from the determined points with the help of a handheld Global Positioning System. Labeled samples were sent to a commercial laboratory to determine 11 soil parameters, including salinity, pH, lime, organic matter, nitrogen, phosphorus, potassium, iron, manganese, copper, and zinc. Based on the test results, soil maps for the mentioned parameters were developed using remote sensing, GIS, and geostatistical analysis. In this study we developed a GIS database that will be used for soil nutrient management. The methods are explained, and the soil maps and their interpretations are summarized in the study.
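
As a simple example of the geostatistical step, the sketch below interpolates point samples onto a raster with inverse-distance weighting; the coordinates and phosphorus values are made up, and the actual study may rely on other interpolators available in GIS software.

```python
# Minimal sketch of inverse-distance-weighted (IDW) interpolation, one of the
# simpler options for turning point soil samples into a raster map.
import numpy as np

def idw(sample_xy, sample_values, grid_xy, power=2.0):
    """Interpolate point samples onto grid locations with IDW weights."""
    d = np.linalg.norm(grid_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)                     # avoid division by zero at sample points
    w = 1.0 / d ** power
    return (w * sample_values).sum(axis=1) / w.sum(axis=1)

# Hypothetical sample points (projected coordinates, metres) and P values (ppm).
samples = np.array([[500.0, 200.0], [800.0, 950.0], [150.0, 700.0]])
phosphorus = np.array([12.0, 31.0, 18.0])

gx, gy = np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
surface = idw(samples, phosphorus, grid).reshape(gx.shape)
print(round(surface.min(), 1), round(surface.max(), 1))   # interpolated raster bounds
```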

Content-Based Color Image Retrieval Based On 2-D Histogram and Statistical Moments

In this paper, we are interested in the problem of finding similar images in a large database. For this purpose we propose a new algorithm based on a combination of 2-D histogram intersection in the HSV space and statistical moments. The proposed histogram is based on a 3x3 window and not only on the intensity of the pixel. This approach overcomes the drawback of the conventional 1-D histogram, which ignores the spatial distribution of pixels in the image, while the statistical moments are used to mitigate the effects of the discretisation of the color space that is intrinsic to the use of histograms. We compare the performance of our new algorithm to various state-of-the-art methods and show that it has several advantages: it is fast, consumes little memory, and requires no learning. To validate our results, we apply this algorithm to search for similar images in different image databases.
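
For illustration, the sketch below applies plain histogram intersection to HSV histograms; the proposed descriptor instead builds a windowed 2-D histogram and adds statistical moments, so this only demonstrates the matching step.

```python
# Minimal sketch of histogram intersection between HSV histograms as a
# similarity measure for retrieval; a simplified stand-in for the 2-D,
# neighbourhood-based histogram plus moments described above.
import numpy as np

def hsv_histogram(img_hsv, bins=(8, 4, 4)):
    """Normalized joint histogram over the H, S, V channels."""
    hist, _ = np.histogramdd(img_hsv.reshape(-1, 3), bins=bins,
                             range=((0, 180), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def intersection(h1, h2):
    """Histogram intersection: 1.0 means identical distributions."""
    return float(np.minimum(h1, h2).sum())

rng = np.random.default_rng(0)
query = rng.integers(0, 180, (64, 64, 3)).astype(float)     # fake HSV image
query[..., 1:] = rng.integers(0, 256, (64, 64, 2))
candidate = query + rng.normal(0, 5, query.shape)            # slightly perturbed copy
print(round(intersection(hsv_histogram(query), hsv_histogram(candidate)), 3))
```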

Bioinformatics and Molecular Biological Characterization of a Hypothetical Protein SAV1226 as a Potential Drug Target for Methicillin/Vancomycin-Resistant Staphylococcus aureus Infections

Methicillin/multiple-resistant Staphylococcus aureus (MRSA) strains are infectious bacteria that are resistant to common antibiotics. A previous in silico study by our group identified the hypothetical protein SAV1226 as one of the potential drug targets. In this study, we report the bioinformatics characterization, as well as the cloning, expression, purification, and kinetic assays, of the hypothetical protein SAV1226 from the methicillin/vancomycin-resistant Staphylococcus aureus Mu50 strain. MALDI-TOF/MS analysis revealed a low degree of structural similarity with known proteins. Kinetic assays demonstrated that hypothetical protein SAV1226 is neither a domain of an ATP-dependent dihydroxyacetone kinase nor of a phosphotransferase system (PTS) dihydroxyacetone kinase, suggesting that the function of hypothetical protein SAV1226 might be misannotated in public databases such as UniProt and InterProScan 5.

A System for Analyzing and Eliciting Public Grievances Using Cache Enabled Big Data

The system for analyzing and eliciting public grievances serves its main purpose of receiving and processing all sorts of complaints from the public and responding to users. Due to the large number of complaints, the data become big data, which is difficult to store and process. The proposed system uses HDFS to store the big data and MapReduce to process it. The concept of a cache is applied in the system to provide immediate response and timely action using big data analytics; cache-enabled big data improves the response time of the system. The unstructured data provided by the users are efficiently handled through the MapReduce algorithm. The processing of complaints takes place in the order of the hierarchy of authority. The drawbacks of the traditional database system used in the existing system are addressed by our system through the cache-enabled Hadoop Distributed File System. MapReduce framework code can leak sensitive data through the computation process. We propose a system that adds noise to the output of the reduce phase to avoid signaling the presence of sensitive data. If a complaint is not processed in ample time, it is automatically forwarded to a higher authority, which ensures assured processing. A copy of the filed complaint is sent as a digitally signed PDF document to the user's mail id, which serves as proof. The system report serves as essential data when making important decisions based on legislation.
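
A minimal sketch of the counting-with-noise idea is shown below: complaints are counted per department in a map/reduce style and noise is added to each reduce output; the department names and noise scale are illustrative, and a real deployment would run on Hadoop MapReduce.

```python
# Minimal sketch of map/reduce counting of complaints per department with
# noise added to the reduce output, so that exact counts of sensitive
# categories are not revealed. All values here are illustrative assumptions.
import random
from collections import defaultdict

complaints = [
    {"dept": "water", "text": "No supply for 3 days"},
    {"dept": "roads", "text": "Pothole near school"},
    {"dept": "water", "text": "Leaking main pipe"},
]

def map_phase(record):                       # emit (key, 1) pairs
    yield record["dept"], 1

def reduce_phase(key, values, noise_scale=1.0):
    total = sum(values)
    noisy = total + random.uniform(-noise_scale, noise_scale)   # simple noise injection
    return key, max(0.0, noisy)

# Shuffle: group mapped pairs by key, then reduce each group.
groups = defaultdict(list)
for record in complaints:
    for key, value in map_phase(record):
        groups[key].append(value)

for key, values in groups.items():
    print(reduce_phase(key, values))
```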

A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks

Thousands of organisations store important and confidential information related to themselves, their customers, and their business partners in databases all across the world. The stored data range from less sensitive (e.g. first name, last name, date of birth) to more sensitive data (e.g. password, PIN code, and credit card information). Losing data, disclosing confidential information, or even changing the value of data are the severe damages that a Structured Query Language injection (SQLi) attack can cause on a given database. It is a code injection technique in which malicious SQL statements are inserted into a given SQL database by simply using a web browser. In this paper, we propose an effective pattern recognition neural network model for the detection and classification of SQLi attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator that generates thousands of malicious and benign URLs; a URL classifier that 1) classifies each generated URL as either benign or malicious and 2) classifies the malicious URLs into different SQLi attack categories; and a neural network model that 1) detects whether a given URL is malicious or benign and 2) identifies the type of SQLi attack for each malicious URL. The model is first trained and then evaluated using thousands of benign and malicious URLs. The results of the experiments are presented in order to demonstrate the effectiveness of the proposed approach.
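
As a toy illustration of the detection element, the sketch below trains a small neural network on hand-crafted character features of URLs using scikit-learn; the URLs and features are invented and do not reproduce the generated dataset or model described in the paper.

```python
# Minimal sketch of a pattern-recognition neural network for labelling URLs as
# benign or SQLi-like, using simple hand-crafted character features.
import numpy as np
from sklearn.neural_network import MLPClassifier

def url_features(url):
    u = url.lower()
    return [
        len(u),
        u.count("'"), u.count('"'), u.count("="), u.count("-"),
        int("union" in u), int("select" in u), int(" or " in u or "%20or%20" in u),
    ]

benign = ["http://site.com/item?id=12", "http://site.com/search?q=shoes"]
malicious = ["http://site.com/item?id=12' or '1'='1",
             "http://site.com/item?id=12 union select password from users"]

X = np.array([url_features(u) for u in benign + malicious], dtype=float)
y = np.array([0, 0, 1, 1])                       # 0 = benign, 1 = SQLi-like

clf = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs", max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([url_features("http://site.com/item?id=5' or '1'='1")]))  # expected to flag as SQLi-like
```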

Modernization of the Economic Price Adjustment Software

The US Consumer Price Indices (CPIs) measure hundreds of items in the US economy. Many social programs and government benefits are indexed to the CPIs. The purpose of this project is to modernize an existing process. This paper shows the development of a small, visual software product that documents the Economic Price Adjustment (EPA) for long-term contracts. The existing workbook does not provide the flexibility to calculate EPAs where the base month and the option month are different, nor does the workbook provide automated error checking. The small, visual software product provides the additional flexibility and error checking. This paper presents the feedback on the project.
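
Assuming a simple ratio-based adjustment, the sketch below computes an EPA where the base month and the option month differ, by scaling the price with the ratio of the two CPI values; the index values are made up and basic error checking is included.

```python
# Minimal sketch of an economic price adjustment where the base month and the
# option month differ: the contract price is scaled by the ratio of the CPI in
# the option month to the CPI in the base month. Index values are made up.
cpi = {"2022-01": 281.1, "2023-01": 299.2, "2024-01": 308.4}   # hypothetical CPI series

def adjusted_price(base_price, base_month, option_month, index=cpi):
    if base_month not in index or option_month not in index:
        raise ValueError("missing CPI value for requested month")   # basic error checking
    return base_price * index[option_month] / index[base_month]

print(round(adjusted_price(100_000, "2022-01", "2024-01"), 2))   # ~109711.85
```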

OPEN_EmoRec_II - A Multimodal Corpus of Human-Computer Interaction

OPEN_EmoRec_II is an open multimodal corpus with experimentally induced emotions. In the first half of the experiment, emotions were induced with standardized picture material, and in the second half during a human-computer interaction (HCI) realized with a wizard-of-oz design. The induced emotions are based on the dimensional theory of emotions (valence, arousal, and dominance). These emotional sequences, recorded with multimodal data (facial reactions, speech, audio, and physiological reactions) in a naturalistic-like HCI environment, can be used to improve classification methods on a multimodal level. This database is the result of an HCI experiment for which 30 subjects in total agreed to the publication of their data, including the video material, for research purposes. The now available open corpus contains the sensory signals of video, audio, and physiology (SCL, respiration, BVP, EMG Corrugator supercilii, EMG Zygomaticus major), together with annotations of facial reactions.

Recycling in Bogotá: A SWOT Analysis of Three Associations to Evaluate the Integration of the Informal Sector into Solid Waste Management

In emerging economies, recycling is an opportunity for cities to increase the lifespan of sanitary landfills, reduce the costs of solid waste management, decrease the environmental problems of waste treatment by reincorporating waste into the productive cycle, and protect and develop the livelihoods of informal waste pickers. However, few studies have analysed the possibilities and strategies for integrating the formal and informal sectors in solid waste management for the benefit of both. This study seeks to carry out a strengths, weaknesses, opportunities, and threats (SWOT) analysis of three recycling associations of Bogotá, with the aim of understanding and determining the situation of recycling from the perspective of the informal sector in its transition to becoming authorized waste providers. The data used in the analysis are derived from multiple sources, such as a literature review, Bogotá's recycling database, focus group meetings, governmental reports, national laws and regulations, and specific interviews with key stakeholders. The results of this study show how the main stakeholders of the formal and informal sectors of waste management can identify the internal and external conditions of recycling in Bogotá. Several strategies designed on the basis of the SWOT factors identified could be useful for Bogotá in advancing and promoting recycling as a key strategy for integrated, sustainable waste management in the city.

Bacteriological Screening and Antibiotic – Heavy Metal Resistance Profile of the Bacteria Isolated from Some Amphibian and Reptile Species of the Biga Stream in Turkey

In this article, the antibiogram and heavy metal resistance profiles of the bacteria isolated from a total of 34 studied animals (Pelophylax ridibundus = 12; Mauremys rivulata = 14; Natrix natrix = 8) captured around the Biga Stream are described. There was no database information on the antibiogram and heavy metal resistance profiles of bacteria from this area's amphibians and reptiles. A total of 200 bacteria were successfully isolated from cloacal and oral samples of the aquatic amphibians and reptiles, as well as from a water sample. According to Jaccard's similarity index, the degree of similarity in the bacterial flora was quite high among the amphibian and reptile species under examination, whereas it differed from the bacterial diversity in the water sample. The most frequent isolates were A. hydrophila (31.5%), B. pseudomallei (8.5%), and C. freundii (7%). The total numbers of bacteria obtained were as follows: 45 in P. ridibundus, 45 in N. natrix, 30 in M. rivulata, and 80 in the water sample. The results showed that cefmetazole was the most effective antibiotic for controlling the bacteria isolated in this study and that approximately 93.33% of the bacterial isolates were sensitive to this antibiotic. The multiple antibiotic resistance (MAR) index indicated that P. ridibundus (0.95) > N. natrix (0.89) > M. rivulata (0.39). Furthermore, all the tested heavy metals (Pb2+, Cu2+, Cr3+, and Mn2+) inhibited the growth of the bacterial isolates at different rates. This indicates that the water source of the animals is contaminated with both antibiotic residues and heavy metals.

Analysis of the Physical Behavior of Library Users in Reading Rooms through GIS: A Case Study of the Central Library of Tehran University

Taking into account the significance of measuring the daily use of the study space in libraries in order to develop and reorganize the space to enhance its efficiency, the current study aimed to apply GIS in analyzing the study halls of the Central Library and Document Center of Tehran University in order to determine how study desks and chairs were used by the students. The study used a combination of survey-descriptive and system design methods. The survey-descriptive method was used to gather the required data. The system design method was used to implement and enter the data into ArcGIS, analyze the data, and display the results on the maps of the library's study halls. The spatial database of study hall use was built from the extent of occupancy of the space by library users and the maps of the study halls of the central library of Tehran University as the case study. The results showed that the Abooreyhan hall had the highest rate of occupancy of desks and chairs compared to the other halls. The Science and Technology hall, with an average occupancy rate of 0.39 for the tables, had the lowest number of users for desks, and the Rashid al-Din and Science and Technology halls, with an average occupancy rate of 0.40, had the lowest number of users for seats. In this study, the comparison of the space occupied at different periods (mornings, afternoons, and evenings) and over several months was performed through GIS. The system analyzed the spatial relationships effectively and efficiently. The output of this study can be used by administrators and librarians to determine the exact extent of use of the equipment of the study halls, and librarians can use the output maps to design the library space more efficiently.