The Role of Online Social Networks in Social Movements: Social Polarization and Violations against Social Unity and Privacy of Individuals in Turkey

Online social networks such as Twitter, Facebook and MySpace have experienced extensive growth in recent years. Social media offers individuals a tool for communicating and interacting with one another. These networks enable people to stay in touch with others and express themselves, making users active creators of content rather than mere consumers of traditional media. That is why millions of people are eager to learn the methods and tools of digital content production and the necessary communication skills. However, the booming interest in communication and interaction through online social networks, and the eagerness to invent and adopt new ways of participating in content production, raise privacy and security concerns. This presentation opens the supposedly revolutionary, democratic and liberating nature of online social media up for discussion by reviewing some recent political developments in Turkey. First, the role of the Internet and online social networks in mobilizing collective movements through social interaction and communication is questioned. Second, cases from the Gezi and Okmeydanı protests, as well as the December 17-25 period, are presented to illustrate misinformation and manipulation in social media and the violation of individual privacy through online social networks in ways that damage social unity and stability, contradicting the assumed democratic nature of online social networking.

Embedding a Large Amount of Information Using a Highly Secure Neural-Based Steganography Algorithm

In this paper, we construct and implement a new steganography algorithm based on a learning system to hide a large amount of information in a color BMP image. We use adaptive image filtering and adaptive non-uniform image segmentation with bit replacement on the appropriate pixels. These pixels are selected randomly rather than sequentially, using a new concept defined by main cases with sub-cases for each byte in a pixel. From the design steps, we derive 16 main cases with their sub-cases, covering all aspects of embedding the input information in a color bitmap image. Four layers of security are proposed to make it difficult to break the encryption of the input information and to confuse steganalysis. A learning system, realized as a neural network, is introduced at the fourth security layer; this layer increases the difficulty of statistical attacks. Our results against statistical and visual attacks are discussed before and after using the learning system, and we compare them with a previous steganography algorithm. We show that our algorithm can efficiently embed a large amount of information, up to 75% of the image size (replacing at most 18 bits per pixel), with high output quality.
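
To make the embedding idea concrete, here is a minimal sketch of keyed random-pixel LSB replacement in Python. It is not the paper's 16-case algorithm; the function names, seed-based keying, and toy data are all illustrative:

```python
import random

def embed_bits(pixels, message_bits, seed):
    """Embed message bits into the LSBs of randomly chosen channel bytes.

    pixels: flat list of channel bytes (e.g. B, G, R values of a BMP).
    A shared seed lets the extractor reproduce the same pixel order.
    """
    rng = random.Random(seed)                     # keyed PRNG: one security layer
    positions = rng.sample(range(len(pixels)), len(message_bits))
    stego = pixels[:]
    for pos, bit in zip(positions, message_bits):
        stego[pos] = (stego[pos] & ~1) | bit      # replace the least significant bit
    return stego

def extract_bits(stego, n_bits, seed):
    rng = random.Random(seed)                     # same seed -> same positions
    positions = rng.sample(range(len(stego)), n_bits)
    return [stego[pos] & 1 for pos in positions]

# Round trip on toy data
cover = [200, 13, 77, 156, 91, 34, 250, 5]
bits = [1, 0, 1, 1]
assert extract_bits(embed_bits(cover, bits, seed=42), 4, seed=42) == bits
```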

Biometric Technology in Securing the Internet Using Large Neural Network Technology

The article examines methods for protecting citizens' personal data on the Internet using biometric identity-authentication technology. A potential danger of this technology is noted: the threat of losing the database of biometric templates. To eliminate the threat of compromised biometric templates, it is proposed to use large and extra-large neural networks, which on the one hand authenticate a person by his biometrics with high reliability, and on the other hand make the person's biometrics unavailable for observation and analysis. The article also describes in detail the transformation of personal biometric data into an access code. Requirements are formulated for a biometrics-to-code converter operating with the images of the "Insider", a "Stranger", and all "Strangers". The effect of the dimension of the neural network on the quality with which the converter conceals the biometrics in the access code is analyzed.
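
As a rough illustration of the biometrics-to-code idea, the sketch below maps a feature vector to a binary access code through a single wide, randomly weighted layer. A real converter would be a trained large network whose weights, rather than a stored template, encode the enrolled user; all dimensions and names here are assumptions:

```python
import numpy as np

def biometric_to_code(features, weights):
    """Map a biometric feature vector to a binary access code.

    Each output neuron thresholds a weighted sum of the features; only the
    weights are stored, never a raw biometric template.
    """
    return (weights @ features > 0).astype(int)

rng = np.random.default_rng(0)
n_features, code_len = 64, 256                   # "large" network: many outputs
W = rng.standard_normal((code_len, n_features))  # stand-in for trained weights

sample = rng.standard_normal(n_features)                  # one presentation
noisy = sample + 0.05 * rng.standard_normal(n_features)   # same user, new image

code_a = biometric_to_code(sample, W)
code_b = biometric_to_code(noisy, W)
print("bit disagreement:", np.mean(code_a != code_b))     # small for the same user
```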

Weight-Based Query Optimization System Using Buffer

Fast data retrieval is a user need in any database application. This paper introduces a buffer-based query optimization technique in which queries are assigned weights according to their number of executions in a query bank. These queries and their optimized execution plans are loaded into a buffer at the start of the database application. For every incoming query, the system searches for a match in the buffer and, on a hit, executes the stored plan without creating a new one.
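
A minimal sketch of the buffer mechanism described above, with a stand-in plan generator; the class and method names are hypothetical:

```python
class PlanBuffer:
    """Cache of optimized query plans, preloaded from a weighted query bank."""

    def __init__(self, query_bank, capacity):
        # query_bank: {query_text: execution_count}; keep the heaviest queries
        top = sorted(query_bank, key=query_bank.get, reverse=True)[:capacity]
        self.plans = {q: self.optimize(q) for q in top}    # loaded at startup

    def optimize(self, query):
        return f"PLAN({query})"        # stand-in for the real plan generator

    def execute(self, query):
        plan = self.plans.get(query)   # buffer hit: reuse plan, skip the optimizer
        if plan is None:
            plan = self.optimize(query)  # miss: optimize as usual
        return plan

buf = PlanBuffer({"SELECT * FROM t": 120, "SELECT id FROM u": 3}, capacity=1)
print(buf.execute("SELECT * FROM t"))   # served from the buffer
```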

Esterification of Free Fatty Acids in Crude Palm Oil Using Alumina-Doped Sulfated Tin Oxide as a Catalyst

The conventional production of biodiesel from crude palm oil, which contains large amounts of free fatty acids, in the presence of a homogeneous base catalyst suffers from soap formation and a very low biodiesel yield. To overcome these problems, the free fatty acids must be esterified to their esters over an acid catalyst prior to alkaline-catalyzed transesterification. Sulfated metal oxides are a promising group of catalysts due to their very high acidity. In this research, alumina-doped sulfated tin oxide (SO4^2-/Al2O3-SnO2) catalysts were prepared and used for the esterification of free fatty acids in crude palm oil in a batch reactor. The catalysts were prepared from different Al precursors, and the results showed that different Al precursors gave catalysts of different activity. The esterification of free fatty acids in crude palm oil with methanol over the SO4^2-/Al2O3-SnO2 catalysts followed first-order kinetics.
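
The stated first-order kinetics corresponds to the standard rate law for the free fatty acid (FFA) concentration; the rate constants themselves are not given in the abstract:

\[
-\frac{d[\mathrm{FFA}]}{dt} = k\,[\mathrm{FFA}]
\qquad\Longrightarrow\qquad
\ln\frac{[\mathrm{FFA}]_0}{[\mathrm{FFA}]_t} = k\,t ,
\]

so a plot of ln([FFA]_0/[FFA]_t) against time is linear with slope k.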

Weighted k-Nearest-Neighbor Techniques for High Throughput Screening Data

The k-nearest-neighbors (kNN) method is a simple but effective classification technique. In this paper we present an extended version for chemical compounds used in high-throughput screening, in which the distances of the nearest neighbors are taken into account. Our algorithm uses kernel weight functions to guide the process of assigning activity in screening data. The proposed kernel weight function combines properties of the graphical structure and the molecular descriptors of the screening compounds. We apply the modified kNN method to several experimental datasets from biological screens, and the results confirm the effectiveness of the proposed method.
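
A minimal sketch of distance-weighted kNN with a Gaussian kernel weight; the paper's actual kernel combines graph-structure and descriptor information, so the kernel and toy data here are illustrative:

```python
import numpy as np

def gaussian_kernel(d, h=1.0):
    """Kernel weight: near neighbors count more than distant ones."""
    return np.exp(-(d / h) ** 2)

def weighted_knn_predict(X_train, y_train, x, k=5, h=1.0):
    """Classify x by a kernel-weighted vote of its k nearest neighbors."""
    d = np.linalg.norm(X_train - x, axis=1)     # distances to all compounds
    idx = np.argsort(d)[:k]                     # k nearest neighbors
    votes = {}
    for i in idx:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + gaussian_kernel(d[i], h)
    return max(votes, key=votes.get)

X = np.array([[0.0, 0.0], [0.1, 0.2], [3.0, 3.1], [2.9, 3.0]])
y = ["inactive", "inactive", "active", "active"]
print(weighted_knn_predict(X, y, np.array([2.5, 2.5]), k=3))  # -> "active"
```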

Grid-HPA: Predicting Resource Requirements of a Job in the Grid Computing Environment

For full Quality of Service support, it is better for the Grid computing environment itself to predict the resource requirements of a job using dedicated methods. Exact and correct prediction allows the required resources to be matched precisely with the available resources. After each job finishes, the resources it used are saved in an active database named "History". First, attributes are extracted from the incoming job; then, according to a defined similarity algorithm, the most similar previously executed jobs are retrieved from "History", and the resource requirements are predicted with simple statistics such as linear regression or averaging. The novelty of this work lies in the use of an active database and centralized history maintenance. Implementation and testing of the proposed architecture yield a prediction accuracy of 96.68% for the CPU usage of jobs, 91.29% for memory usage, and 89.80% for bandwidth usage.
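
A minimal sketch of the history-based scheme, assuming attribute-overlap similarity and plain averaging; the paper's actual similarity algorithm and regression details may differ:

```python
def most_similar_jobs(history, job, top=2):
    """Rank executed jobs in "History" by attribute overlap with the new job."""
    def similarity(h):
        return sum(h["attrs"].get(k) == v for k, v in job["attrs"].items())
    return sorted(history, key=similarity, reverse=True)[:top]

def predict_usage(history, job, resource):
    """Predict a resource requirement by averaging over the most similar jobs."""
    similar = most_similar_jobs(history, job)
    return sum(h[resource] for h in similar) / len(similar)

history = [
    {"attrs": {"app": "render", "size": "large"}, "cpu": 0.92, "mem_gb": 7.5},
    {"attrs": {"app": "render", "size": "small"}, "cpu": 0.40, "mem_gb": 2.1},
    {"attrs": {"app": "solver", "size": "large"}, "cpu": 0.85, "mem_gb": 6.0},
]
new_job = {"attrs": {"app": "render", "size": "large"}}
print(predict_usage(history, new_job, "cpu"))   # averaged CPU prediction
```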

Aircraft Gas Turbine Engines Technical Condition Identification System

This paper shows that applying probability-statistical methods at the early stages of diagnosing the technical condition of an aviation gas turbine engine (GTE), when the flight information is fuzzy, limited, and uncertain, is unfounded. It therefore considers the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods. Fuzzy multiple linear and non-linear models (fuzzy regression equations), derived from statistical fuzzy data, are trained with high accuracy. To construct a more adequate model of GTE technical condition, the dynamics of the skewness and kurtosis coefficients are analyzed; the changes in these coefficients show that the distributions of GTE operating parameters have a fuzzy character, so the use of fuzzy skewness and kurtosis coefficients is expedient. Investigation of the dynamics of the basic characteristics of GTE operating parameters leads to the conclusion that Fuzzy Statistical Analysis is necessary for preliminary identification of the engines' technical condition. The changes in the correlation coefficients are likewise fuzzy in character, so the results of Fuzzy Correlation Analysis are used for model selection, and the adequacy of the models is checked with the Fuzzy Multiple Correlation Coefficient of the Fuzzy Multiple Regression. When sufficient information is available, a recurrent algorithm for identifying aviation GTE technical condition (using Hard Computing technology) is proposed, based on measurements of the input and output parameters of the multiple linear and non-linear generalized models in the presence of measurement noise (a new recursive Least Squares Method, LSM). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical condition. As an application of the technique, the temperature condition of a new operating aviation engine was estimated.
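
For reference, here is a minimal sketch of a standard recursive least squares update of the kind such an identification algorithm builds on; the paper's modified recursive LSM is not reproduced here, and the data and parameters are illustrative:

```python
import numpy as np

def rls_update(theta, P, x, y, lam=1.0):
    """One recursive-least-squares step: refine the parameter estimate theta
    from a new regressor x and noisy measurement y (lam = forgetting factor)."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)          # gain vector
    theta = theta + (k * (y - x.T @ theta)).ravel()
    P = (P - k @ x.T @ P) / lam              # update the inverse covariance
    return theta, P

# Identify a linear model y = 2*u1 - 3*u2 from noisy measurements
rng = np.random.default_rng(1)
theta, P = np.zeros(2), np.eye(2) * 1e3
for _ in range(200):
    x = rng.standard_normal(2)
    y = 2 * x[0] - 3 * x[1] + 0.1 * rng.standard_normal()
    theta, P = rls_update(theta, P, x, y)
print(theta)   # approaches [2, -3]
```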

Validation of Reverse Engineered Web Application Models

Web applications have become complex and crucial for many firms, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering). The scientific community has focused its attention on Web application design, development, analysis, and testing, studying and proposing methodologies and tools. Static and dynamic techniques may be used to analyze existing Web applications. Traditional static source-code analysis may be very difficult because of dynamically generated code and the multi-language nature of the Web. Dynamic analysis may be useful, but it has an intrinsic limitation: the low number of program executions used to extract information. Our reverse engineering analysis, used in our WAAT (Web Applications Analysis and Testing) project, applies mutational techniques to exploit server-side execution engines for part of the dynamic analysis. This paper studies the effects of mutation source-code analysis applied to Web software to build application models. Mutation-based generated models may contain more information than necessary, so a pruning mechanism is needed.

Existence and Exponential Stability of Almost Periodic Solution for Recurrent Neural Networks on Time Scales

In this paper, a class of recurrent neural networks (RNNs) with variable delays is studied on almost periodic time scales, and some sufficient conditions are established for the existence and global exponential stability of the almost periodic solution. These results are of significant guiding value in the design and application of RNNs. Finally, two examples with numerical simulations are presented to illustrate the feasibility and effectiveness of the results.
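
The abstract does not state the network equations; a typical delayed RNN model studied on a time scale $\mathbb{T}$, written with the delta derivative $x^{\Delta}$, has the form (an assumption for illustration):

\[
x_i^{\Delta}(t) = -c_i(t)\,x_i(t)
+ \sum_{j=1}^{n} a_{ij}(t)\, f_j\big(x_j(t)\big)
+ \sum_{j=1}^{n} b_{ij}(t)\, f_j\big(x_j(t-\tau_{ij}(t))\big)
+ I_i(t), \qquad t \in \mathbb{T},
\]

where the coefficients, inputs $I_i$, and delays $\tau_{ij}$ are almost periodic in $t$.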

Analysis of MAC Protocols with Correlation Receiver for OCDMA Networks - Part II

In this paper an optical code-division multiple-access (OCDMA) packet network, which offers inherent security in access networks, is considered. Two types of random access protocols are proposed for packet transmission. In protocol 1 all distinct codes are used; in protocol 2, distinct codes as well as shifted versions of these codes are used. OCDMA network performance is analyzed using one-dimensional (1-D) optical orthogonal codes (OOCs) and two-dimensional (2-D) wavelength/time single-pulse-per-row (W/T SPR) codes. The main advantage of 2-D codes over 1-D codes is the reduction of errors due to multiple-access interference among users. A correlation receiver is assumed in the analysis. Using an analytical model, we compute and compare the packet-success probability for 1-D and 2-D codes in an OCDMA network; the analysis shows improved performance with 2-D codes compared to 1-D codes.
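
As a sketch of how such a packet-success probability can be computed, the snippet below uses a common chip-level binomial interference model for 1-D OOCs, with per-interferer hit probability approximated as q = w^2/2L for code weight w and length L. The paper's exact model and parameters may differ:

```python
from math import comb

def packet_success(K, p_hit, w, bits):
    """Packet-success probability under a simple binomial interference model.

    K     : number of simultaneous interfering users
    p_hit : probability one interferer contributes a pulse to a mark chip
    w     : code weight = decision threshold of the correlation receiver
    bits  : packet length in bits
    A bit error needs at least w interferers aligned with the codeword.
    """
    p_bit_err = 0.5 * sum(comb(K, i) * p_hit**i * (1 - p_hit)**(K - i)
                          for i in range(w, K + 1))  # 1/2: only zeros corrupted
    return (1 - p_bit_err) ** bits

# 1-D OOC example: weight 4, length 101 -> q ~ w^2/(2L)
w, L = 4, 101
print(packet_success(K=10, p_hit=w * w / (2 * L), w=w, bits=1024))
```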

Migration and Unemployment Duration: The Case of the OECD Countries

This paper examines, from a macroeconomic perspective, whether immigration has a positive influence on the duration of unemployment. We also analyse whether the degree of labor market integration can influence migration. The integration of immigrants into the labor market is a recurring theme in work on the economic consequences of immigration; however, to our knowledge, no researchers have studied the impact of immigration on unemployment duration, and vice versa. Using two research methodologies (panel estimations with OLS and 2SLS, and panel cointegration techniques), we show that, for 14 OECD destination countries, migration seems to influence short-term unemployment positively and long-term unemployment negatively. In addition, immigration seems to be conditioned by the structural and institutional characteristics of the labor market.

Application of Multi-objective Optimization Packages in Design of an Evaporator Coil

A novel methodology has been used to design an evaporator coil for a refrigeration system. The approach is a complete Computer-Aided Design / Computer-Aided Engineering one: a Computational Fluid Dynamics / Finite Element Analysis model is executed many times for the thermal-fluid exploration of several design configurations by a commercial optimizer. The design is thus carried out automatically through parallel computations, with the optimization package, rather than the design engineer, taking the decisions. The engineer instead decides on the physical settings and initialization of the computational models, the number and ranges of the geometrical parameters of the coil fins, and the optimization tools to be employed. The final coil geometry was found to be better than the initial design.
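
A minimal sketch of such an automated design loop: a toy surrogate stands in for the CFD/FEA run, and a simple random search maintains the set of non-dominated (Pareto) designs. The real workflow uses a commercial optimizer and full simulations run in parallel:

```python
import random

def evaluate_design(fin_height, fin_pitch):
    """Stand-in for the CFD/FEA run: returns (heat_duty, pressure_drop).
    In the real workflow each call is a full simulation."""
    heat = fin_height * 2.0 + 1.0 / fin_pitch          # toy surrogate model
    dp = fin_height ** 2 + 0.5 / fin_pitch ** 2
    return heat, dp

def dominates(a, b):
    """Pareto dominance: maximize heat duty, minimize pressure drop."""
    return a[0] >= b[0] and a[1] <= b[1] and a != b

pareto = []
for _ in range(500):                                   # optimizer explores designs
    h, p = random.uniform(5, 15), random.uniform(1.5, 4.0)
    obj = evaluate_design(h, p)
    if not any(dominates(q, obj) for _, q in pareto):
        pareto = [(d, q) for d, q in pareto if not dominates(obj, q)]
        pareto.append(((h, p), obj))
print(len(pareto), "non-dominated coil designs")
```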

A Generalised Relational Data Model

A generalised relational data model is formalised for the representation of data with nested structure of arbitrary depth. A recursive algebra for the proposed model is presented, with all operations formally defined. The proposed model is proved to be a superset of the conventional relational model (CRM). The functionality and validity of the model are demonstrated by a prototype implementation written in the functional programming language Miranda.
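
As an illustration of one algebra operation over nested relations, here is a sketch of unnest in Python; the model itself was implemented in Miranda, and the names and data here are illustrative:

```python
def unnest(relation, attr):
    """Flatten one nested attribute: each inner tuple joins its outer tuple.
    A nested relation is a list of dicts whose values may be sub-relations."""
    flat = []
    for row in relation:
        for inner in row[attr]:
            merged = {k: v for k, v in row.items() if k != attr}
            merged.update(inner)
            flat.append(merged)
    return flat

# Department relation with a nested 'staff' sub-relation (nesting may recurse)
depts = [
    {"dept": "CS", "staff": [{"name": "Ada"}, {"name": "Alan"}]},
    {"dept": "EE", "staff": [{"name": "Grace"}]},
]
print(unnest(depts, "staff"))
# -> [{'dept': 'CS', 'name': 'Ada'}, {'dept': 'CS', 'name': 'Alan'},
#     {'dept': 'EE', 'name': 'Grace'}]
```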

Modeling and Verification for the Micropayment Protocol Netpay

Many virtual payment systems are available for conducting micropayments, and it is essential that their protocols satisfy the highest standards of correctness. This paper examines the Netpay protocol [3], provides its formalization as an automata model, and proves two important correctness properties: absence of deadlock and validity of an e-coin during the execution of the protocol. The paper assumes a cooperative customer and proves that the protocol executes according to its description.
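
A minimal sketch of the kind of check involved in the deadlock property: a reachability search over a protocol automaton. The automaton below is a greatly simplified, hypothetical stand-in, not the Netpay model:

```python
from collections import deque

def has_deadlock(transitions, start, final_states):
    """Breadth-first reachability: a deadlock is a reachable non-final state
    with no outgoing transitions."""
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        successors = transitions.get(state, [])
        if not successors and state not in final_states:
            return True                  # stuck before completing the protocol
        for nxt in successors:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Toy e-coin exchange: request -> debit -> transfer -> done
automaton = {"request": ["debit"], "debit": ["transfer"], "transfer": ["done"]}
print(has_deadlock(automaton, "request", final_states={"done"}))  # False
```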

Recursive Wiener-Khintchine Theorem

The Power Spectral Density (PSD) computed by taking the Fourier transform of the autocorrelation function (the Wiener-Khintchine theorem) gives better results for noisy data than the Periodogram approach. However, the computational complexity of the Wiener-Khintchine approach is higher than that of the Periodogram. For the computation of the short-time Fourier transform (STFT), this problem becomes even more prominent, since the PSD must be recomputed after every shift of the analysis window. In this paper, a recursive version of the Wiener-Khintchine theorem is derived using the sliding DFT approach for STFT computation. The computational complexity of the proposed recursive Wiener-Khintchine algorithm, for a window size of N, is O(N).
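
For reference, the standard sliding DFT update that such a recursion builds on computes bin $k$ of the length-$N$ window starting at sample $m+1$ from the previous window in $O(1)$:

\[
X_k^{(m+1)} = e^{j 2\pi k / N}\left[X_k^{(m)} + x(m+N) - x(m)\right],
\]

so refreshing all $N$ bins after a one-sample shift costs $O(N)$, rather than $O(N \log N)$ for a fresh FFT of the window.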

A Subjectively Influenced Router for Vehicles in a Four-Junction Traffic System

A subjectively influenced router for vehicles in a four-junction traffic system is presented. The router is based on a three-layer backpropagation neural network (BPNN) and a greedy routing procedure. The BPNN detects the priorities of vehicles based on subjective criteria; the subjective criteria and the routing plan for vehicles depend on the user. The routing procedure selects vehicles from their junctions based on their priorities and routes them concurrently through the traffic system. That is, given a desired vehicle-selection criterion and routing procedure, the router routes vehicles with a reasonable junction-clearing time. The cost evaluation of the router determines its efficiency. In the case of a routing conflict, the router routes the vehicles in consecutive order and quarantines faulty vehicles. The simulations presented indicate that this approach is an effective strategy for structuring a subjective vehicle router.
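
A minimal sketch of the greedy selection step, with priorities supplied directly instead of by the BPNN; the queues and names are illustrative:

```python
def greedy_route(junctions):
    """Route the highest-priority waiting vehicle first, one per step.
    junctions: {junction_id: list of (vehicle_id, priority)}; priorities
    would come from the BPNN in the paper, here they are given directly."""
    order = []
    while any(junctions.values()):
        # pick the junction holding the highest-priority waiting vehicle
        jid = max((j for j in junctions if junctions[j]),
                  key=lambda j: max(p for _, p in junctions[j]))
        vehicle = max(junctions[jid], key=lambda vp: vp[1])
        junctions[jid].remove(vehicle)
        order.append(vehicle[0])
    return order

queues = {"N": [("car1", 0.9), ("car2", 0.3)], "S": [("bus1", 0.7)],
          "E": [("amb1", 0.99)], "W": []}
print(greedy_route(queues))   # ambulance first: ['amb1', 'car1', 'bus1', 'car2']
```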

Collaborative Tracking Control of UAV-UGV

This paper proposes a fast and stable target-tracking system for the collaborative control of a UAV and a UGV. The Wi-Fi communication range is limited in such collaborative control, so to secure stable communication the UAV and UGV have to be kept within a certain distance of each other. The existing method, which uses the UAV's vertical camera to follow the motion of the UGV, is likely to lose a target that changes its movement suddenly; consequently, the UGV has the disadvantage that it can only move at low speed and cannot make sudden changes of direction if the target is to be kept in view. We therefore propose using the front camera of an AR.Drone UAV to track a fast-moving, omnidirectional Mecanum-wheel UGV.

Synthesis and Characterization of ZnO and Fe3O4 Nanocrystals from Oleate-Based Organometallic Compounds

Magnetic and semiconductor nanomaterials exhibit novel magnetic and optical properties owing to their unique size- and shape-dependent effects. As the size shrinks to the nanoscale, various anomalous properties that are normally absent in the bulk start to dominate. The ability to harness these anomalous properties for the design of advanced electronic devices depends strictly on the synthetic strategy. Hence, this research has focused on developing rational synthetic control to produce high-quality nanocrystals, using an organometallic approach to tune both the size and the shape of the nanomaterials. To elucidate the growth mechanism, transmission electron microscopy was employed as a powerful tool for time-resolved morphological and structural characterization of the magnetic (Fe3O4) and semiconductor (ZnO) nanocrystals. The current synthetic approach is able to produce nanostructures with well-defined shapes. We found that oleic acid is an effective capping ligand for preparing oxide-based nanostructures without agglomeration, even at high temperature. The oleate-based precursors and capping ligands are fatty-acid compounds derived from natural palm oil, with low toxicity. Compared with other synthetic approaches, the current method offers an effective route to oxide-based nanomaterials with well-defined shapes and good monodispersity; the nanocrystals are well separated from each other without any stacking effect. In addition, the as-synthesized nanopellets are chemically and physically stable compared to previously reported nanomaterials. Further development and extension of the current synthetic strategy are being pursued to combine both materials into a nanocomposite that can serve as a "smart magnetic nanophotocatalyst" for industrial wastewater treatment.

Salient Points Reduction for Content-Based Image Retrieval

Salient points are frequently used to represent local properties of an image in content-based image retrieval. In this paper, we present a reduction algorithm that extracts the locally most salient points, such that they not only give a satisfying representation of the image but also make the retrieval process efficient. The algorithm recursively reduces the continuous point set according to the corresponding saliency values in a top-down manner. The resulting salient points are evaluated in an image retrieval system using the Hausdorff distance. The experiments show that our method is robust and that the extracted salient points provide better retrieval performance than other point detectors.
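
For reference, a minimal sketch of the symmetric Hausdorff distance between two point sets, as used for matching in the retrieval step; the point data are illustrative:

```python
import numpy as np

def hausdorff(A, B):
    """Hausdorff distance between two salient-point sets A and B: the largest
    distance from any point in one set to its nearest point in the other."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
    return max(d.min(axis=1).max(), d.min(axis=0).max())

query = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])      # reduced salient points
candidate = np.array([[0.1, 0.0], [1.1, 0.9], [2.0, 0.6]])
print(hausdorff(query, candidate))   # small value = similar images
```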