A Laser Point Interaction System Integrating Mouse Functions

The computer has become an essential tool in modern life, and the combined use of a computer and a projector is very common in teaching and presentations. However, because typical computer operating devices are the mouse and keyboard, presenters often need to stay near the computer to execute functions such as changing pages, writing, and drawing, which makes the operation time-consuming and reduces interaction with the audience. This paper proposes a laser pointer interaction system able to simulate mouse functions so that users need not remain near the computer but can operate it directly with a laser pointer from a distance. It can effectively reduce the user's time spent at the computer, allowing for greater interaction with the audience.
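
As an illustration of the general idea of mapping a detected laser dot to cursor movement, the following minimal sketch (not the authors' implementation) tracks the brightest spot in a webcam view of the projected screen using OpenCV and moves the operating-system cursor with PyAutoGUI; the brightness threshold and the camera-to-screen mapping are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): track a bright laser dot
# in a webcam view of the projected screen and move the OS mouse cursor there.
# Assumes the camera frame roughly covers the projected desktop.
import cv2           # OpenCV for capture and image processing
import pyautogui     # cross-platform mouse control

SCREEN_W, SCREEN_H = pyautogui.size()
BRIGHTNESS_THRESHOLD = 240          # illustrative value; tune for the room lighting

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The laser dot is usually the brightest spot in the frame.
    _, max_val, _, max_loc = cv2.minMaxLoc(cv2.GaussianBlur(gray, (5, 5), 0))
    if max_val >= BRIGHTNESS_THRESHOLD:
        x, y = max_loc
        h, w = gray.shape
        # Map camera coordinates to screen coordinates (assumes aligned views).
        pyautogui.moveTo(int(x / w * SCREEN_W), int(y / h * SCREEN_H))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
```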

Low Resolution Face Recognition Using Mixture of Experts

Human activity is a major concern in a wide variety of applications, such as video surveillance, human-computer interfaces, and face image database management. Detecting and recognizing faces is a crucial step in these applications. Furthermore, major advancements and initiatives in security applications over the past years have propelled face recognition technology into the spotlight. The performance of existing face recognition systems declines significantly if the resolution of the face image falls below a certain level. This is especially critical in surveillance imagery, where, for many reasons, often only low-resolution video of faces is available. If these low-resolution images are passed to a face recognition system, the performance is usually unacceptable. Hence, resolution plays a key role in face recognition systems. In this paper we introduce a new low-resolution face recognition system based on mixture-of-experts neural networks. To produce the low-resolution input images, we down-sampled the 48 × 48 ORL images to 12 × 12 using the nearest-neighbor interpolation method; applying the bicubic interpolation method afterwards yields enhanced images, which are given to the Principal Component Analysis (PCA) feature extractor. Comparison with some of the most closely related methods indicates that the proposed model yields an excellent recognition rate in low-resolution face recognition: 100% for the training set and 96.5% for the test set.
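
A minimal sketch of the preprocessing pipeline described above, assuming OpenCV and scikit-learn; the mixture-of-experts classifier itself is not shown, and the number of principal components is an illustrative choice.

```python
# Sketch of the preprocessing pipeline described above (mixture-of-experts
# classifier not shown). `faces` is assumed to be an array of 48x48 grayscale
# ORL face images, shape (n_samples, 48, 48).
import numpy as np
import cv2
from sklearn.decomposition import PCA

def preprocess(face48):
    # Simulate low resolution: 48x48 -> 12x12 with nearest-neighbor interpolation.
    low = cv2.resize(face48, (12, 12), interpolation=cv2.INTER_NEAREST)
    # Enhance: interpolate back up with bicubic interpolation.
    enhanced = cv2.resize(low, (48, 48), interpolation=cv2.INTER_CUBIC)
    return enhanced.astype(np.float32).ravel()

def extract_features(faces, n_components=40):         # 40 is an illustrative choice
    X = np.stack([preprocess(f) for f in faces])
    pca = PCA(n_components=n_components)
    return pca.fit_transform(X), pca                   # eigenface-style features
```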

Traffic Signal Coordinated Control Optimization: A Case Study

In an urban traffic network, the intersections are the “bottleneck points” of road network capacity, and the arterials are the main body of the road network and the key factor that guarantees the normal operation of a city's social and economic activities. The rapid increase in vehicles leads to serious traffic jams and increases vehicle delay. Most cities in our country still use traditional isolated signal control, which can no longer meet the needs of urban traffic. In this paper, Synchro 6.0 is used as a platform to minimize intersection delay by optimizing the signal cycle and splits of the individual intersections along Zhonghua Street in Handan City. Meanwhile, a linear (coordinated) control system is used to optimize the phasing for the arterial road in this system. A before-and-after comparison shows that the capacity and level of service of this road and the adjacent roads improve significantly.
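
The Synchro 6.0 optimization itself cannot be reproduced here; as a rough illustration of the underlying principle of choosing a cycle length and green splits to minimize delay, the sketch below applies Webster's classical formulas to hypothetical flow ratios. This is explicitly not the Synchro procedure used in the case study.

```python
# Illustrative only: Webster's classical formulas for optimal cycle length and
# green splits at a single intersection. This is NOT the Synchro 6.0 procedure
# used in the paper; the flow ratios below are hypothetical.

def webster_cycle(total_lost_time_s, critical_flow_ratios):
    """Optimal cycle length C0 = (1.5 L + 5) / (1 - Y)."""
    Y = sum(critical_flow_ratios)       # sum of critical flow ratios y_i = q_i / s_i
    assert Y < 1.0, "demand exceeds capacity"
    return (1.5 * total_lost_time_s + 5.0) / (1.0 - Y)

def green_splits(cycle_s, total_lost_time_s, critical_flow_ratios):
    """Distribute effective green time in proportion to the critical flow ratios."""
    Y = sum(critical_flow_ratios)
    effective_green = cycle_s - total_lost_time_s
    return [effective_green * y / Y for y in critical_flow_ratios]

# Hypothetical two-phase example: lost time 10 s, flow ratios 0.35 and 0.30.
C0 = webster_cycle(10.0, [0.35, 0.30])
print(round(C0, 1), [round(g, 1) for g in green_splits(C0, 10.0, [0.35, 0.30])])
```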

Sound Teaching Practices in Conducting a Physical Education Program for Persons with an Intellectual Disability

This paper presents key challenges reported by a group of Australian undergraduate Physical Education students in conducting a program for persons with an intellectual disability. Strategies adopted to address these challenges are presented together with representative feedback given by the Physical Education students at the completion of the program. The significance of the program’s findings is summarized.

Learning to Order Terms: Supervised Interestingness Measures in Terminology Extraction

Term extraction, a key data preparation step in text mining, extracts terms, i.e. relevant collocations of words, attached to specific concepts (e.g. genetic-algorithms and decision-trees are terms associated with the concept “Machine Learning”). In this paper, the task of extracting interesting collocations is achieved through a supervised learning algorithm, exploiting a few collocations manually labelled as interesting/not interesting. From these examples, the ROGER algorithm learns a numerical function inducing a ranking on the collocations. This ranking is optimized using genetic algorithms, maximizing the trade-off between the false positive and true positive rates (Area Under the ROC Curve). The approach uses a particular representation for the word collocations, namely the vector of values of the standard statistical interestingness measures attached to each collocation. As this representation is general (across corpora and natural languages), generality tests were performed by applying the ranking function learned from an English corpus in biology to a French corpus of curricula vitae, and vice versa, showing good robustness of the approach compared to a state-of-the-art Support Vector Machine (SVM).
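
A minimal sketch of the evaluation criterion described above: each collocation is represented by a vector of interestingness measures, a linear scoring function induces a ranking, and the ranking is scored by the Area Under the ROC Curve. The genetic optimization of ROGER is replaced here by a crude random search, and the data are synthetic placeholders.

```python
# Sketch of the ranking criterion: a linear scoring function over the vector of
# interestingness measures of each collocation, evaluated by the AUC. ROGER's
# genetic search is replaced by a crude random search, purely for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))            # hypothetical: 200 collocations, 6 measures
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

def auc_of(weights):
    scores = X @ weights                 # linear ranking function
    return roc_auc_score(y, scores)

best_w, best_auc = None, 0.0
for _ in range(2000):                    # stand-in for the genetic optimization
    w = rng.normal(size=X.shape[1])
    a = auc_of(w)
    if a > best_auc:
        best_w, best_auc = w, a
print("best AUC on labelled examples:", round(best_auc, 3))
```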

Phase Noise Impact on BER in Space Communication

This paper deals with the modeling and evaluation of the influence of multiplicative phase noise on the bit error ratio in a general space communication system. Our research focuses on systems with multi-state phase shift keying modulation techniques, and it turns out that phase noise significantly affects the bit error rate, especially at higher signal-to-noise ratios. These results come from a system model created in the Matlab environment and are shown in the form of constellation diagrams and bit error rate dependencies. The change of the user data bit rate is also considered and included in the simulation results. The obtained outcomes confirm theoretical presumptions.
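
A minimal Monte-Carlo sketch of the modeled effect, written in Python/NumPy rather than the Matlab model used in the paper: QPSK symbols are disturbed by a multiplicative phase-noise term in addition to AWGN, and the resulting bit error ratio is counted. All parameter values are illustrative.

```python
# Minimal Monte-Carlo sketch (NumPy, not the Matlab model from the paper):
# QPSK over AWGN with an additional multiplicative phase-noise term.
import numpy as np

rng = np.random.default_rng(1)
n_sym = 200_000
ebn0_db = 10.0                  # illustrative Eb/N0
phase_noise_std = 0.1           # illustrative phase-noise std in radians

bits = rng.integers(0, 2, size=(n_sym, 2))
# Gray-mapped QPSK: one bit on I, one bit on Q, unit symbol energy.
symbols = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

ebn0 = 10 ** (ebn0_db / 10)
noise_std = np.sqrt(1 / (4 * ebn0))               # per-dimension std, 2 bits/symbol
awgn = noise_std * (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))
phase = np.exp(1j * phase_noise_std * rng.standard_normal(n_sym))

received = symbols * phase + awgn                  # multiplicative phase noise + AWGN
detected = np.column_stack((received.real > 0, received.imag > 0)).astype(int)
print("simulated BER:", np.mean(detected != bits))
```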

Derivative Spectrophotometry Applied to the Determination of Triprolidine Hydrochloride and Pseudoephedrine Hydrochloride in Tablets and Dissolution Testing

A spectrophotometric method was developed for the simultaneous quantification of pseudoephedrine hydrochloride (PSE) and triprolidine hydrochloride (TRI) using the second-derivative method (zero-crossing technique). The second-derivative amplitudes of PSE and TRI were measured at 271 and 321 nm, respectively. The calibration curves were linear in the range of 200 to 1,000 μg/ml for PSE and 10 to 50 μg/ml for TRI. The method was validated for specificity, accuracy, precision, limit of detection and limit of quantitation. The proposed method was applied to the assay and dissolution testing of PSE and TRI in commercial tablets without any chemical separation. The results were compared with those obtained by the official USP 31 method, and statistical tests showed no significant difference between the methods at the 95% confidence level. The proposed method is simple, rapid and suitable for routine quality control application.

Keywords: Triprolidine, Pseudoephedrine, Derivative spectrophotometry, Dissolution testing.
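
As an illustration of the second-derivative (zero-crossing) measurement, the sketch below differentiates a placeholder absorbance spectrum twice with respect to wavelength and reads the amplitudes at 271 nm and 321 nm; the Savitzky-Golay smoothing parameters are illustrative assumptions, not the validated method settings.

```python
# Sketch of the second-derivative (zero-crossing) measurement: differentiate the
# absorbance spectrum twice with respect to wavelength and read the amplitude at
# the analytical wavelengths (271 nm for PSE, 321 nm for TRI). The spectrum used
# here is a placeholder; a Savitzky-Golay filter is one common way to smooth and
# differentiate in a single step.
import numpy as np
from scipy.signal import savgol_filter

wavelengths = np.arange(220.0, 350.0, 0.5)             # nm, placeholder grid
absorbance = np.exp(-((wavelengths - 260) / 20) ** 2)  # placeholder spectrum

# Smoothed second derivative d2A/dlambda2 (window and order are illustrative).
d2 = savgol_filter(absorbance, window_length=11, polyorder=3,
                   deriv=2, delta=wavelengths[1] - wavelengths[0])

def amplitude_at(nm):
    return d2[np.argmin(np.abs(wavelengths - nm))]

print("2nd-derivative amplitude at 271 nm:", amplitude_at(271.0))
print("2nd-derivative amplitude at 321 nm:", amplitude_at(321.0))
```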

The Association of Matrix Metalloproteinase-3 Gene -1612 5A/6A Polymorphism with Susceptibility to Coronary Artery Stenosis in an Iranian Population

Matrix metalloproteinase-3 (MMP-3) is a key member of the MMP family and is known to be present in coronary atherosclerotic lesions. Several studies have demonstrated that the MMP-3 5A/6A polymorphism modifies its transcriptional activity in an allele-specific manner. We hypothesized that this polymorphism may act as a risk factor for the development of coronary stenosis. The aim of our study was to estimate the effect of the MMP-3 (5A/6A) gene polymorphism on interindividual variability in the risk of coronary stenosis in an Iranian population. DNA was extracted from white blood cells, and genotypes were obtained from coronary stenosis cases (n=95) and controls (n=100) by PCR (polymerase chain reaction) and restriction fragment length polymorphism techniques. Significant differences between cases and controls were observed in the MMP-3 genotype frequencies (χ2=199.305, p<0.001); the 6A allele was seen less frequently in the control group than in the disease group (85.79 vs. 78%, 6A/6A+5A/6A vs. 5A/5A, P≤0.001). These data imply the involvement of the -1612 5A/6A polymorphism in coronary stenosis and suggest that the 6A/6A MMP-3 genotype is probably a genetic susceptibility factor for coronary stenosis.
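
As an illustration of the genotype-frequency comparison, the sketch below runs a chi-square test on a contingency table of 5A/5A, 5A/6A and 6A/6A counts in cases and controls; the counts are hypothetical placeholders, not the study data.

```python
# Sketch of the genotype-frequency comparison: a chi-square test on a 2 x 3
# contingency table of MMP-3 -1612 5A/6A genotype counts. The counts below are
# hypothetical placeholders, NOT the study data reported above.
from scipy.stats import chi2_contingency

#                 5A/5A  5A/6A  6A/6A
table = [
    [10, 30, 55],   # hypothetical cases
    [25, 50, 25],   # hypothetical controls
]

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p_value:.4g}")
```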

Intellectual Capital and Competitive Advantage: An Analysis of the Biotechnology Industry

Intellectual Capital measurement is a central aspect of knowledge management. The measurement and evaluation of intangible assets play a key role in enabling effective management of these assets as sources of competitiveness. For these reasons, managers and practitioners need conceptual and analytical tools that take into account the unique characteristics and economic significance of Intellectual Capital. Following this lead, we propose an efficiency and productivity analysis of Intellectual Capital as a determinant factor of a company's competitive advantage. The analysis is carried out by means of Data Envelopment Analysis (DEA) and the Malmquist Productivity Index (MPI). These techniques identify Best Practice companies that have achieved competitive advantage by implementing successful Intellectual Capital management strategies, and offer inefficient companies development paths by means of benchmarking. The proposed methodology is applied to the biotechnology industry over the period 2007-2010.
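
A minimal sketch of the input-oriented, constant-returns-to-scale DEA (CCR) efficiency score for a single company, solved as a linear program with SciPy; the Malmquist Productivity Index would then compare such scores across two periods. The input/output data below are hypothetical.

```python
# Minimal sketch of the input-oriented CCR DEA model solved per company
# (decision-making unit) as a linear program. Data are hypothetical; the
# Malmquist Productivity Index would compare such scores across two periods.
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = companies, columns = inputs / outputs
# (e.g. inputs could be IC-related expenditures, outputs revenues or patents).
X = np.array([[20.0, 300.0], [30.0, 200.0], [40.0, 100.0], [20.0, 200.0]])  # inputs
Y = np.array([[100.0], [80.0], [60.0], [90.0]])                              # outputs

def ccr_efficiency(k):
    """Efficiency of unit k: min theta s.t. sum_j lam_j x_j <= theta x_k,
    sum_j lam_j y_j >= y_k, lam >= 0 (constant returns to scale)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                        # minimize theta
    # Input rows:  -theta * x_ik + sum_j lam_j x_ij <= 0
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])
    b_in = np.zeros(m)
    # Output rows: -sum_j lam_j y_rj <= -y_rk
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (1 + n))
    return res.fun

for k in range(X.shape[0]):
    print(f"company {k}: efficiency = {ccr_efficiency(k):.3f}")
```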

Studying Implication of Globalization on Engineering Education

The primary purpose of this article is to examine the implications of globalization for education. Globalization plays an important role as a process in the economic, political, cultural and technological dimensions of contemporary human life. Education both shapes this process, by educating global citizens with universal human features and characteristics, and is shaped by it. Nowadays, the role of education is not just to develop in students the knowledge and skills necessary for new kinds of jobs. If education wants to help students be prepared for the new global society, it has to make them engaged, productive and critical citizens of the global era, so that they can reflect on their roles as key actors in a dynamic, often uneven, matrix of economic and cultural exchanges. If education wants to reinforce and raise national identity and the value system of children and teenagers, it should make them ready for living in the global era of this century. The method used in this research is documentary analysis. Studies in this field show that globalization influences the processes of the production, distribution and consumption of knowledge. Occurring in the information era, this not only provides the necessary opportunities for educational exchange worldwide but also offers advantages to developing countries, enabling them to strengthen the educational bases of their societies and take an important step toward their future.

Improving the Reusability and Interoperability of E-Learning Material

A key requirement for e-learning material is reusability and interoperability, that is, the possibility to use at least part of the content in different courses and to deliver it through different platforms. These features make it possible to limit the cost of new packages, but require the development of material according to proper specifications. SCORM (Sharable Content Object Reference Model) is a set of guidelines suitable for this purpose. A specific adaptation project has been started to make it possible to reuse existing material. The paper describes the main characteristics of the SCORM specification and the procedure used to modify the existing material.
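
As an illustration of what SCORM packaging asks of content authors, the sketch below generates a minimal imsmanifest.xml skeleton with Python's standard library; identifiers, titles and file names are placeholders, and a real package must also include the SCORM/IMS schema files and fuller metadata.

```python
# Sketch: generate a minimal imsmanifest.xml skeleton of the kind a SCORM
# package requires. Identifiers and file names are placeholders.
# (Namespace declarations required by SCORM are omitted here for brevity.)
import xml.etree.ElementTree as ET

manifest = ET.Element("manifest", {
    "identifier": "com.example.course",            # placeholder identifier
    "version": "1.0",
})
organizations = ET.SubElement(manifest, "organizations", {"default": "org1"})
org = ET.SubElement(organizations, "organization", {"identifier": "org1"})
ET.SubElement(org, "title").text = "Example Course"
item = ET.SubElement(org, "item", {"identifier": "item1", "identifierref": "res1"})
ET.SubElement(item, "title").text = "Lesson 1"

resources = ET.SubElement(manifest, "resources")
res = ET.SubElement(resources, "resource", {
    "identifier": "res1",
    "type": "webcontent",
    "adlcp:scormtype": "sco",                      # marks a launchable SCO
    "href": "lesson1/index.html",                  # placeholder entry point
})
ET.SubElement(res, "file", {"href": "lesson1/index.html"})

ET.ElementTree(manifest).write("imsmanifest.xml", encoding="UTF-8",
                               xml_declaration=True)
```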

Temperature Effect on the Solid-State Synthesis of Dehydrated Zinc Borates

Turkey holds 72% of the world's total boron reserves on a B2O3 basis. Borates, the refined form of boron minerals, have a wide range of applications. Zinc borates can be used as multifunctional synergistic additives. Their most important properties are low solubility in water and a high dehydration temperature: zinc borates dehydrate above 290°C, and anhydrous zinc borate has thermal resistance up to about 400°C. Zinc borates can be synthesized by several methods, such as hydrothermal and solid-state processes. In this study, the solid-state method was applied between 500 and 800°C using the starting materials ZnO and H3BO3 in a 1:4 mole ratio. The reaction time was set at 4 hours after some preliminary experiments. After the synthesis, the crystal structure and the morphology of the products were examined by X-Ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FT-IR) and Raman spectroscopy. As a result, ZnB4O7 was synthesized with the highest crystallinity at 800°C.

Connectivity Characteristic of Transcription Factor

Transcription factors are a group of proteins that help interpret the genetic information in DNA. Protein-protein interactions play a major role in the execution of key biological functions of a cell. These interactions are represented in the form of a graph with nodes and edges. Studies have shown that some nodes have a high degree of connectivity; such nodes, known as hub nodes, are indispensable parts of the network. In the present paper a method is proposed to identify hub transcription factor proteins using sequence information. On a complete data set of transcription factor proteins available from the APID database, the proposed method showed an accuracy of 77%, a sensitivity of 79% and a specificity of 76%.
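
A minimal sketch of the kind of sequence-based setup described above: a 20-dimensional amino-acid-composition feature vector for a protein sequence, and the accuracy, sensitivity and specificity computed from a confusion matrix. The sequences, labels and classifier are placeholders, not the authors' method.

```python
# Sketch: amino-acid composition features for a protein sequence plus the
# accuracy / sensitivity / specificity metrics quoted above, computed from a
# confusion matrix. Sequences and labels here are placeholders; the actual
# classifier used in the paper is not reproduced.
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition(seq):
    """20-dimensional amino-acid composition feature vector."""
    seq = seq.upper()
    return np.array([seq.count(a) / max(len(seq), 1) for a in AMINO_ACIDS])

def metrics(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn)          # true-positive rate for hub proteins
    specificity = tn / (tn + fp)          # true-negative rate for non-hubs
    return accuracy, sensitivity, specificity

print(composition("MKTAYIAKQR"))          # placeholder sequence
print(metrics([1, 1, 0, 0, 1, 0], [1, 0, 0, 0, 1, 1]))
```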

Deniable Authentication Protocol Resisting Man-in-the-Middle Attack

Deniable authentication is a new type of protocol that not only enables a receiver to identify the source of a received message but also prevents a third party from identifying that source. The protocol proposed in this paper makes use of bilinear pairings over elliptic curves, as well as the Diffie-Hellman key exchange protocol. Besides the security properties shared with previous authentication protocols, the proposed protocol provides the same level of security with smaller public key sizes.
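
The pairing-based construction itself is not reproduced here; as a small illustration of the Diffie-Hellman building block the protocol relies on, the sketch below shows a classical finite-field Diffie-Hellman exchange with toy parameters that are not secure.

```python
# Illustration of the Diffie-Hellman building block only (the bilinear-pairing
# construction of the protocol is not reproduced). Parameters are toy-sized and
# NOT secure; real deployments use standardized groups or elliptic curves.
import secrets

p = (1 << 127) - 1                        # Mersenne prime M127, illustration only
g = 3

a = secrets.randbelow(p - 2) + 2          # sender's ephemeral secret
b = secrets.randbelow(p - 2) + 2          # receiver's ephemeral secret

A = pow(g, a, p)                          # sender -> receiver
B = pow(g, b, p)                          # receiver -> sender

shared_sender = pow(B, a, p)
shared_receiver = pow(A, b, p)
assert shared_sender == shared_receiver   # both sides derive the same secret
```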

Effect of Geum Kokanicum Total Extract on Induced Nociception and Inflammation in Male Mice

The aim of this study is to evaluate the antinociceptive and anti-inflammatory activity of Geum kokanicum. After determining the LD50 of the total extract, different doses of extract were chosen for intraperitoneal injection. In the inflammation test, male NMRI mice were divided into 6 groups: control (normal saline), positive control (dexamethasone 15 mg/kg), and total extract (0.025, 0.05, 0.1, and 0.2 g/kg). Inflammation was produced as xylene-induced edema. To evaluate the antinociceptive effect of the total extract, the formalin test was used. Mice were divided into 6 groups: control, positive control (morphine 10 mg/kg), and 4 groups that received the total extract. They then received formalin, and the animals were observed for their reaction to pain. Data were analyzed using one-way ANOVA followed by the Tukey-Kramer multiple comparison test. The LD50 was 1 g/kg. The data indicated that the 0.05, 0.1 and 0.2 g/kg doses of total extract have notable antinociceptive and anti-inflammatory effects in comparison with the control (P
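
As an illustration of the statistical analysis only, the sketch below runs a one-way ANOVA followed by a Tukey multiple-comparison test on placeholder group data; the values are not the measured responses from the study.

```python
# Sketch of the statistical analysis only (one-way ANOVA followed by a Tukey
# multiple-comparison test). Group data below are placeholders, NOT the
# measured pain/edema responses from the study.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
groups = {
    "control":  rng.normal(60, 8, 10),       # placeholder response scores
    "morphine": rng.normal(30, 8, 10),
    "ext_0.1":  rng.normal(45, 8, 10),
    "ext_0.2":  rng.normal(40, 8, 10),
}

F, p = f_oneway(*groups.values())
print(f"one-way ANOVA: F = {F:.2f}, p = {p:.4g}")

values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), [len(v) for v in groups.values()])
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```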

Image Segmentation Based on Graph Theoretical Approach to Improve the Quality of Image Segmentation

Graph-based image segmentation techniques are considered to be among the most efficient segmentation techniques and are mainly used as time- and space-efficient methods for real-time applications. However, there is a need to focus on improving the quality of the segmented images obtained from earlier graph-based methods. This paper proposes an improvement to the graph-based image segmentation methods already described in the literature. We contribute to the existing method by proposing the use of a weighted Euclidean distance to calculate the edge weights, which are the key element in building the graph. We also propose a slight modification of the segmentation method already described in the literature, which results in the selection of more prominent edges in the graph. The experimental results show an improvement in segmentation quality compared to existing methods, with a slight compromise in efficiency.
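
A minimal sketch of the proposed edge-weighting step: a 4-connected grid graph is built over the pixels and each edge is weighted by a weighted Euclidean distance between the colour vectors of its endpoints. The per-channel weights are illustrative, and the merging criterion of the underlying graph-based segmentation is not reproduced.

```python
# Sketch of the edge-weighting step only: a 4-connected grid graph over the
# pixels, with each edge weighted by a weighted Euclidean distance between the
# RGB vectors of its endpoints. Channel weights are illustrative; the merging
# criterion of the underlying graph-based segmentation is not reproduced.
import numpy as np

def build_edges(image, channel_weights=(0.3, 0.59, 0.11)):
    """image: H x W x 3 float array; returns a list of (weight, (r1,c1), (r2,c2))."""
    w = np.asarray(channel_weights, dtype=float)
    H, W, _ = image.shape
    edges = []
    for r in range(H):
        for c in range(W):
            for dr, dc in ((0, 1), (1, 0)):            # right and down neighbours
                r2, c2 = r + dr, c + dc
                if r2 < H and c2 < W:
                    diff = image[r, c] - image[r2, c2]
                    weight = np.sqrt(np.sum(w * diff ** 2))   # weighted Euclidean
                    edges.append((weight, (r, c), (r2, c2)))
    return sorted(edges)          # graph methods typically process edges by weight

toy = np.random.default_rng(0).random((4, 4, 3))       # placeholder image
print(len(build_edges(toy)), "edges")
```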

Study of Effect of Removal of Shiftrows and Mixcolumns Stages of AES and AES-KDS on their Encryption Quality and Hence Security

This paper demonstrates the results obtained when either the ShiftRows stage or the MixColumns stage, or both, are omitted from the well-known block cipher Advanced Encryption Standard (AES) and its modified version, AES with a Key-Dependent S-box (AES-KDS), using the avalanche criterion and other tests, namely encryption quality, correlation coefficient, histogram analysis and key sensitivity tests.
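
As an illustration of the avalanche-criterion measurement on standard AES (via the PyCryptodome library), the sketch below flips each plaintext bit in turn and counts how many ciphertext bits change; the stage-omitted variants and AES-KDS studied in the paper require a custom AES implementation and are not reproduced here.

```python
# Sketch of the avalanche-criterion measurement on standard AES only (via
# PyCryptodome). The stage-omitted variants and AES-KDS studied above need a
# custom AES implementation and are not reproduced. Key/plaintext are placeholders.
import os
from Crypto.Cipher import AES   # pip install pycryptodome

def flip_bit(block, bit_index):
    b = bytearray(block)
    b[bit_index // 8] ^= 1 << (bit_index % 8)
    return bytes(b)

def hamming(a, b):
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

key = os.urandom(16)
cipher = AES.new(key, AES.MODE_ECB)

plaintext = os.urandom(16)
base = cipher.encrypt(plaintext)

# Average number of ciphertext bits that change per single flipped plaintext bit;
# a strong cipher should change about half of the 128 bits (~64).
total = sum(hamming(base, cipher.encrypt(flip_bit(plaintext, i))) for i in range(128))
print("average avalanche:", total / 128, "bits out of 128")
```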

Securing Message in Wireless Sensor Network by using New Method of Code Conversions

Recently, wireless sensor networks have received increasing attention; they are widely used in many commercial and military applications and may be deployed in critical scenarios (e.g. when a malfunctioning network results in danger to human life or great financial loss). Such networks must be protected against intrusion by using secret keys to encrypt the messages exchanged between communicating nodes. Both symmetric and asymmetric methods have their own drawbacks for key management. Thus, we avoid the weaknesses of these two cryptosystems and make use of their advantages to establish a secure environment by developing a new encryption method based on the idea of code conversion. The code-conversion equations are used as the key for the proposed system, which is designed on the basis of logic-gate principles. Using our security architecture, we show how to significantly reduce attacks on wireless sensor networks.
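
The paper's specific code-conversion equations are not given in the abstract; as a generic illustration of XOR/logic-gate-style code conversion, the sketch below converts message bytes to Gray code and masks them with a shared key, which the receiver reverses. This is not the authors' scheme.

```python
# Generic illustration of XOR/logic-gate-style code conversion, NOT the scheme
# proposed in the paper (whose conversion equations are not given in the
# abstract). A message byte is converted to Gray code and masked with a shared
# key byte; the receiver reverses both steps.

def binary_to_gray(x):
    return x ^ (x >> 1)

def gray_to_binary(g):
    x = 0
    while g:
        x ^= g
        g >>= 1
    return x

def encode(message, key):
    return bytes(binary_to_gray(m) ^ k for m, k in zip(message, key))

def decode(ciphertext, key):
    return bytes(gray_to_binary(c ^ k) for c, k in zip(ciphertext, key))

key = bytes([0x5A, 0x3C, 0x99, 0xF0, 0x11])        # placeholder shared key stream
msg = b"hello"
ct = encode(msg, key)
assert decode(ct, key) == msg
```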

Lessons Learned from Observing User Behavior through Repeated Usability Evaluations

An academic research information service is essential for surveying previous studies during research and development. OntoFrame is an academic research information service built on a Semantic Web framework, unlike simple keyword-based services such as CiteSeer and Google Scholar. The first purpose of this study is to reveal user behavior during literature surveys, the purposes for which academic research information services are used, and user needs. The second is to apply the lessons learned from the results to OntoFrame.

FPGA Implementation of RSA Cryptosystem

In this paper, a hardware implementation of the RSA public-key cryptographic algorithm is presented. The RSA cryptographic algorithm depends on the computation of repeated modular exponentiations. The Montgomery algorithm is used and modified to reduce hardware resources and to achieve a reasonable operating speed on an FPGA. An efficient architecture for modular multiplication based on an array multiplier is proposed. We have implemented an RSA cryptosystem based on the Montgomery algorithm. As a result, it is shown that the proposed architecture achieves a small area and reasonable speed.
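
A plain software sketch of Montgomery modular multiplication and its use in square-and-multiply modular exponentiation, the operation the FPGA architecture accelerates; this reference code is not the array-multiplier hardware design, and the modulus is a placeholder.

```python
# Plain software reference for Montgomery modular multiplication and its use in
# modular exponentiation (the operation the FPGA design accelerates). This is
# not the array-multiplier hardware architecture itself.

def montgomery_setup(n, r_bits):
    R = 1 << r_bits
    n_prime = -pow(n, -1, R) % R          # n' such that n * n' = -1 mod R
    return R, n_prime

def mont_mul(a, b, n, R, r_bits, n_prime):
    """Return a * b * R^{-1} mod n (Montgomery reduction, REDC)."""
    t = a * b
    m = (t * n_prime) & (R - 1)           # mod R
    u = (t + m * n) >> r_bits             # exact division by R
    return u - n if u >= n else u

def mont_pow(base, exp, n, r_bits):
    """base^exp mod n via square-and-multiply in the Montgomery domain."""
    R, n_prime = montgomery_setup(n, r_bits)
    result = R % n                        # 1 in Montgomery form
    x = (base * R) % n                    # base in Montgomery form
    while exp:
        if exp & 1:
            result = mont_mul(result, x, n, R, r_bits, n_prime)
        x = mont_mul(x, x, n, R, r_bits, n_prime)
        exp >>= 1
    return mont_mul(result, 1, n, R, r_bits, n_prime)   # leave the Montgomery domain

# Toy check against Python's built-in modular exponentiation (odd modulus).
n = 0xD3A7_9F3B                           # placeholder odd modulus
assert mont_pow(7, 65537, n, n.bit_length()) == pow(7, 65537, n)
```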