A High Precision Temperature Insensitive Current and Voltage Reference Generator

A high precision, temperature-insensitive current and voltage reference generator is presented. It is specifically developed for a temperature-compensated oscillator. The circuit, designed in MXIC 0.5 µm CMOS technology, has an operating voltage range of 2.6 V to 5 V and generates a reference voltage of 1.21 V and a reference current of 6.38 µA. It exhibits a variation of ±0.3 nA for the current reference and a stable output for the voltage reference as the temperature is varied from 0 °C to 70 °C. The power supply rejection ratio obtained without any filtering capacitor is -30 dB at 100 Hz and -12 dB at 10 MHz.
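
For context, the reported figures imply a very small temperature coefficient for the current reference. Assuming the quoted ±0.3 nA excursion spans the full 0 °C to 70 °C window (a 0.6 nA peak-to-peak swing, which is an interpretation rather than a figure stated above), the fractional temperature coefficient works out roughly as

$$\mathrm{TC}_I \approx \frac{\Delta I}{I_{\mathrm{ref}}\,\Delta T} = \frac{0.6\ \mathrm{nA}}{6.38\ \mu\mathrm{A}\times 70\ ^{\circ}\mathrm{C}} \approx 1.3\ \mathrm{ppm/^{\circ}C}.$$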

XML Data Management in Compressed Relational Database

XML is an important standard for data exchange and representation. Since relational databases are mature systems, using them to support XML data can bring advantages. However, storing XML in a relational database introduces considerable redundancy, which wastes disk space, bandwidth, and disk I/O when querying the XML data. For efficient storage and querying of XML, it is therefore desirable to keep the XML data compressed inside the relational database. In this paper, a compressed relational database technology supporting XML data is presented. The original relational storage structure is well suited to XPath query processing, and the compression method preserves this property. Beyond traditional relational database techniques, additional query processing methods for compressed relations and for XML-specific structures are presented. Technologies for XQuery processing in the compressed relational database are also presented.
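
As a rough, hedged illustration of the relational-storage side of this approach (not the paper's actual storage structure or compression scheme), the sketch below shreds a small XML document into a generic edge table in SQLite and answers the path query /lib/book/title with self-joins; all table and column names are illustrative.

```python
# Hedged sketch: generic edge-table shredding of XML into SQLite and a
# simple XPath-style lookup via self-joins. Not the paper's storage layout.
import sqlite3
import xml.etree.ElementTree as ET

xml_doc = "<lib><book><title>XML</title></book><book><title>SQL</title></book></lib>"

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE node (id INTEGER PRIMARY KEY, parent INTEGER, tag TEXT, text TEXT)")

def shred(elem, parent=None, next_id=[1]):
    """Store one element as a row and recurse into its children."""
    nid, next_id[0] = next_id[0], next_id[0] + 1
    con.execute("INSERT INTO node VALUES (?, ?, ?, ?)",
                (nid, parent, elem.tag, (elem.text or "").strip()))
    for child in elem:
        shred(child, nid)

shred(ET.fromstring(xml_doc))

# The XPath /lib/book/title becomes a chain of parent-child self-joins.
rows = con.execute("""
    SELECT t.text
    FROM node l
    JOIN node b ON b.parent = l.id AND b.tag = 'book'
    JOIN node t ON t.parent = b.id AND t.tag = 'title'
    WHERE l.tag = 'lib' AND l.parent IS NULL
""").fetchall()
print([text for (text,) in rows])   # ['XML', 'SQL']
```

Compression could then be applied column-wise (for example, dictionary-encoding the tag column), which is one common way to keep such path joins workable over compressed relations.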

A Medical Images Based Retrieval System using Soft Computing Techniques

Content-Based Image Retrieval (CBIR) has been one of the most active research areas in the field of computer vision over the last 10 years. Many programs and tools have been developed to formulate and execute queries based on visual or audio content and to help browse large multimedia repositories. Still, no general breakthrough has been achieved with respect to large, varied databases containing documents of differing sorts and with varying characteristics. Many questions with respect to speed, semantic descriptors, or objective image interpretation remain unanswered. In the medical field, images, and especially digital images, are produced in ever-increasing quantities and used for diagnostics and therapy. In several articles, content-based access to medical images has been proposed to support clinical decision making and ease the management of clinical data, and scenarios for the integration of content-based access methods into Picture Archiving and Communication Systems (PACS) have been created. This paper gives an overview of soft computing techniques, and new research directions are being defined that can prove useful. Still, there are very few systems that seem to be used in clinical practice. It should also be stated that the goal is not, in general, to replace text-based retrieval methods as they exist at the moment.

A Collusion-Resistant Distributed Signature Delegation Based on Anonymous Mobile Agent

This paper presents a novel method that allows an agent host to delegate its signing power to an anonymous mobile agent in such a way that the mobile agent does not reveal any information about its host's identity and, at the same time, can be authenticated by the service host, hence ensuring fairness of service provision. The solution introduces a verification server to verify the signature generated by the mobile agent in such a way that even if the verification server colludes with the service host, both parties will not obtain more information than they already have. The solution incorporates three methods: an Agent Signature Key Generation method, an Agent Signature Generation method, and an Agent Signature Verification method. The most notable feature of the solution is that, in addition to allowing secure and anonymous signature delegation, it enables tracking of malicious mobile agents when a service host is attacked. The security properties of the proposed solution are analyzed, and the solution is compared with the most closely related work.
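
The toy sketch below only illustrates the three-phase flow named above (key generation by the host, signing by the agent, verification through the verification server). It uses plain hashes and HMACs, offers no real anonymity or collusion resistance, and is not the paper's cryptographic scheme; every identifier (warrant, delegation_key, and so on) is illustrative.

```python
# Toy illustration of the three-phase delegation flow; NOT the paper's scheme
# and NOT secure. All names and data are made up for the example.
import hashlib
import hmac
import os

def H(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

# 1) Agent Signature Key Generation (done by the agent host).
host_secret = os.urandom(32)
warrant = b"may-sign-purchase-orders-until-2024-12-31"
delegation_key = H(host_secret, warrant)

# The host registers the delegation key with the verification server over a
# secure channel, so the service host never learns the host's identity.
verification_server_db = {"agent-42": (delegation_key, warrant)}

# 2) Agent Signature Generation (done by the anonymous mobile agent).
message = b"order 100 units"
agent_signature = hmac.new(delegation_key, message, hashlib.sha256).digest()

# 3) Agent Signature Verification (the service host queries the verification server).
key, _warrant = verification_server_db["agent-42"]
expected = hmac.new(key, message, hashlib.sha256).digest()
print("signature valid:", hmac.compare_digest(agent_signature, expected))
```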

Exploring Customer Trust in B2C Mobile Payments – A Qualitative Study

Mobile payments have been deployed by businesses for more than a decade. Customers use mobile payments if they trust this relatively new payment method and have belief in, confidence in, and reliance on its services and applications. Despite its potential, the current literature shows that there is a lack of customer trust in B2C mobile payments and a lack of studies that determine the factors influencing that trust; these factors therefore remain poorly understood, especially in the Middle East region. Thus, this study aims to explore the factors that influence customer trust in mobile payments. The empirical data for this exploratory study were collected through four focus group sessions in the UAE. The results indicate that the explored significant factors can be classified into five main groups: customer characteristics, environmental (social and cultural) influences, provider characteristics, mobile-device characteristics, and perceived risks.

The Game of Synchronized Quadromineering

In synchronized games, players make their moves simultaneously rather than alternately. Synchronized Quadromineering is the synchronized version of Quadromineering, a variant of the classical two-player combinatorial game Domineering. Experimental results for small m × n boards (with m + n < 15) and some theoretical results for general k × n boards (with k = 4, 5, 6) are presented. Moreover, some Synchronized Quadromineering variants are also investigated.

Chewing Behavior and Bolus Properties as Affected by Different Rice Types

The study aimed to investigate the effect of rice type on chewing behaviour (chewing time, number of chews, and portion size) and bolus properties (bolus moisture content, solid loss, and particle size distribution (PSD)) in human subjects. Five cooked rice types, including brown rice (BR), white rice (WR), parboiled white rice (PR), high-amylose white rice (HR) and waxy white rice (WXR), were chewed by six subjects. The chewing behaviours were recorded and the food boluses were collected during mastication. Rice type was found to significantly influence all chewing parameters evaluated, with WXR and BR showing the most pronounced differences compared with the other rice types. The initial moisture content of un-chewed WXR was the lowest (43.39%), whereas those of the other rice types ranged from 66.86% to 70.33%. The bolus obtained from chewing WXR contained the lowest moisture content (56.43%), whilst its solid loss (22.03%) was not significantly different from those of the other rice types. In the PSD evaluation using a Mastersizer S, the diameter of the measured particles ranged from 4 to 3500 μm. The boluses from BR, HR, and WXR contained much finer particles than those from WR and PR.

Environmental Management of the Tanning Industry's Supply Chain: An Integration Model from Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004

The environmental impact caused by industries is an issue that, in the last 20 years, has become very important in social, economic and political terms in Colombia. In particular, the tannery process is extremely polluting because of ineffective treatments and regulations applied to the dumping process and to atmospheric emissions. Considering this, the present investigation proposes a management model based on the integration of Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004 that prioritizes the strategic components of the organizations. As a result, a management model will be obtained that provides a strategic perspective through a systemic approach to the tanning process. This will be achieved through the use of multicriteria decision tools, along with Quality Function Deployment and fuzzy logic. The strategic approach embraced by the management model, based on the alignment of Lean Supply Chain, Green Supply Chain, Cleaner Production and ISO 14001:2004, is an integrated perspective that allows a gradual framing of the tactical and operative elements through the correct configuration of the information flow, improving the decision-making process. In that way, small and medium enterprises (SMEs) could improve their productivity and competitiveness and, as an added value, minimize their environmental impact. This improvement is expected to be controlled through a dashboard that helps the organization measure its performance throughout the implementation of the model in its production process.

An Energy-Latency-Efficient MAC Protocol for Wireless Sensor Networks

Because nodes are usually battery-powered, energy is a very scarce resource in wireless sensor networks. For this reason, the design of the medium access control layer has to treat energy efficiency as one of its foremost concerns. Several approaches can be followed to improve the energy performance of MAC schemes in wireless sensor networks: some researchers try to limit idle listening, while others focus on mitigating overhearing (i.e., a node hearing a packet destined for another node) or on reducing the number of control packets used. In this paper, we propose a new hybrid MAC protocol termed ELE-MAC (Energy-Latency-Efficient MAC). The major design goals of ELE-MAC are energy and latency efficiency. It uses fewer control packets than SMAC in order to preserve energy. We carried out ns-2 simulations to evaluate the performance of the proposed protocol, and the simulation results confirm ELE-MAC's energy efficiency. Additionally, our solution achieves latency that is statistically equal to or better than that of adaptive SMAC.
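
A back-of-the-envelope model shows why trimming control packets matters in a duty-cycled MAC. The sketch below compares the per-packet radio energy of an SMAC-like RTS/CTS/ACK exchange with a scheme that uses a single control packet; the radio parameters and packet sizes are assumed values, not figures from the paper.

```python
# Hedged energy model for control-packet overhead in a WSN MAC.
# All radio parameters and packet sizes below are illustrative assumptions.
TX_POWER_W  = 0.036      # transmit power (assumed)
RX_POWER_W  = 0.024      # receive/listen power (assumed)
BITRATE_BPS = 250_000    # e.g. an IEEE 802.15.4-class radio

def tx_energy(bits):
    return TX_POWER_W * bits / BITRATE_BPS

def rx_energy(bits):
    return RX_POWER_W * bits / BITRATE_BPS

DATA_BITS, CTRL_BITS = 1024, 128

def energy_per_data_packet(n_control_packets):
    # Sender transmits the control packets plus the data packet;
    # the receiver listens to all of them (ACK traffic folded into the count).
    sender   = n_control_packets * tx_energy(CTRL_BITS) + tx_energy(DATA_BITS)
    receiver = n_control_packets * rx_energy(CTRL_BITS) + rx_energy(DATA_BITS)
    return sender + receiver

print("SMAC-like exchange (3 control packets):", energy_per_data_packet(3), "J")
print("reduced-control scheme (1 control packet):", energy_per_data_packet(1), "J")
```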

A Tool for Audio Quality Evaluation Under Hostile Environment

This paper evaluates audio and speech quality with the help of a digital audio watermarking technique under different types of attacks (signal impairments) such as Gaussian noise, compression error, and jittering; these attacks together are regarded as the hostile environment. Audio and speech quality evaluation is an important research topic. The traditional way to evaluate speech quality is through subjective tests. They are reliable, but very expensive, time consuming, and cannot be used in certain applications such as online monitoring. Objective models, based on human perception, were developed to predict the results of subjective tests. The existing objective methods require either the original speech or a complicated computational model, which makes some applications of quality evaluation impossible.
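
As a minimal sketch of one impairment from this hostile-environment set, the snippet below applies additive Gaussian noise to a synthetic test tone and reports the resulting SNR; compression and jitter attacks would be modelled analogously. The signal, noise level, and sampling rate are illustrative, not taken from the paper.

```python
# Hedged sketch: a Gaussian-noise "attack" on a synthetic audio signal,
# quantified by the resulting signal-to-noise ratio.
import numpy as np

fs = 16_000                                   # sampling rate (assumed)
t = np.arange(fs) / fs                        # 1 second of samples
clean = 0.5 * np.sin(2 * np.pi * 440 * t)     # 440 Hz test tone

noise = np.random.normal(0.0, 0.05, size=clean.shape)
attacked = clean + noise

snr_db = 10 * np.log10(np.sum(clean ** 2) / np.sum((attacked - clean) ** 2))
print(f"SNR after Gaussian-noise attack: {snr_db:.1f} dB")
```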

Deep Learning and Virtual Environment

While computers are known to facilitate lower levels of learning, such as rote memorization of facts, measurable through electronically administered and graded multiple-choice, yes/no, and true/false questions, the imparting and measurement of higher-level cognitive skills is more vexing. These skills require more open-ended delivery and answers, and may be more problematic in an entirely virtual environment, notwithstanding advances in technologies such as wikis, blogs, discussion boards, etc. As with the integration of any technology, merit is based more on the instructional design of the course than on the technology employed in and of itself. With this in mind, this study examined the perceptions of online students in an introductory Computer Information Systems course regarding the fostering of various higher-order thinking and team-building skills as a result of the activities, resources and technologies (ART) used in the course.

The Identification of Selected Dysfunctions and Paradoxes in Corporate Social Responsibility Management in Small Enterprise

The study presents a brief and synthetic discussion of selected conclusions resulting from multidimensional and in-depth empirical studies. Its theoretical part presents the assumptions referring to social responsibility management from the perspective of the specific nature of small enterprise functioning, while the empirical part presents the selected dysfunctions and paradoxes in social responsibility management referring to this group of enterprises. The paper is summarized by a short list of the resulting recommendations.

New Investigation of the Role of Exchange Effects in the Elastic and Inelastic Scattering of α-Particles on 9Be

Elastic and inelastic scattering of α-particles by 9Be nuclei at different incident energies have been analyzed. Optical model parameters (OMPs) for α-particle elastic scattering by 9Be at different energies have been obtained. Coupled Reaction Channel (CRC) calculations covering elastic scattering, inelastic scattering and the transfer reaction have been performed using the FRESCO code, and the effect of including the CRC calculations in the analysis of the differential cross sections has been studied. The 5He transfer in the 9Be(α,9Be)α reaction has been studied, and the spectroscopic factor of the 9Be ≡ α + 5He configuration has been extracted.
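
For reference, optical model analyses of this kind conventionally parameterize the nuclear potential with Woods-Saxon form factors; the exact parameterization used in the paper may differ:

$$U(r) = V_C(r) - \frac{V_0}{1+\exp\!\left(\frac{r-R_V}{a_V}\right)} - i\,\frac{W_0}{1+\exp\!\left(\frac{r-R_W}{a_W}\right)},$$

where $V_C(r)$ is the Coulomb term and $(V_0, R_V, a_V)$ and $(W_0, R_W, a_W)$ are the depths, radii and diffusenesses of the real and imaginary wells, i.e. the OMPs adjusted in the fit.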

Factors Influencing Students' Self-Concept among Malaysian Students

This paper examines self-concept among 16- and 17-year-old adolescents in Malaysian secondary schools. Previous studies have shown that a positive self-concept plays an important role in student adjustment and academic performance during schooling. This study attempts to investigate the factors influencing students' perceptions of their own self-concept. A total of 1168 students participated in the survey. This study utilized the CoPs (UM) instrument to measure self-concept. Principal Component Analysis (PCA) revealed three factors: academic self-concept, physical self-concept and social self-concept. This study confirmed that students perceive certain internal context factors, and revealed that external context factors also have an impact on their self-concept.
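
As a small, hedged sketch of the factor-extraction step (using random stand-in data rather than the actual CoPs (UM) item responses), the snippet below runs PCA on standardized Likert-style items and keeps three components, mirroring the three factors reported:

```python
# Hedged sketch of PCA-based factor extraction; the 15 items and responses
# are synthetic placeholders, not the CoPs (UM) instrument data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(1168, 15)).astype(float)   # 1168 students x 15 items

pca = PCA(n_components=3)
pca.fit(StandardScaler().fit_transform(responses))
print("explained variance ratio per component:", pca.explained_variance_ratio_)
```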

Finite Element Prediction and Experimental Verification of the Failure Pattern of Proximal Femur using Quantitative Computed Tomography Images

This paper presents a novel method for predicting the mechanical behavior of the proximal femur within the general framework of quantitative computed tomography (QCT)-based finite element analysis (FEA). A systematic imaging and modeling procedure was developed to ensure reliable correspondence between the QCT-based FEA and the in vitro mechanical testing. A specially designed holding frame was used to define and maintain a unique geometrical reference system during both the analysis and the testing. The QCT images were directly converted into voxel-based 3D finite element models for linear and nonlinear analyses. The equivalent plastic strain and strain energy density measures were used to identify the critical elements and predict the failure patterns. The samples were destructively tested using a specially designed gripping fixture (with five degrees of freedom) mounted in a universal mechanical testing machine. Very good agreement was found between the experimental and predicted failure patterns and the associated load levels.

Application of Mutual Information based Least dependent Component Analysis (MILCA) for Removal of Ocular Artifacts from Electroencephalogram

The electrical potentials generated during eye movements and blinks are one of the main sources of artifacts in electroencephalogram (EEG) recordings and can propagate widely across the scalp, masking and distorting brain signals. In recent times, signal separation algorithms have been widely used for removing artifacts from observed EEG data. In this paper, a recently introduced signal separation algorithm, Mutual Information based Least dependent Component Analysis (MILCA), is employed to separate ocular artifacts from EEG. The aim of MILCA is to minimize the Mutual Information (MI) between the independent components (estimated sources) under a pure rotation. The performance of this algorithm is compared with that of eleven popular algorithms (Infomax, Extended Infomax, FastICA, SOBI, TDSEP, JADE, OGWE, MS-ICA, SHIBBS, Kernel-ICA, and RADICAL) in terms of the actual independence and uniqueness of the estimated source components obtained for different sets of EEG data with ocular artifacts, using a reliable MI estimator. The results show that MILCA is best at separating the ocular artifacts from the EEG and is recommended for further analysis.
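
MILCA itself is not available in common Python libraries, so the sketch below substitutes FastICA from scikit-learn purely to illustrate the separate / suppress / reconstruct workflow on synthetic two-channel "EEG"; the mixing matrix, signals, and kurtosis-based artifact pick are illustrative and not the paper's procedure.

```python
# Hedged sketch of ICA-based ocular-artifact removal (FastICA as a stand-in for MILCA).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)

brain = np.sin(2 * np.pi * 10 * t)                             # 10 Hz "alpha" activity
blink = (np.abs(np.sin(2 * np.pi * 0.5 * t)) > 0.99) * 5.0     # sparse blink spikes
sources = np.c_[brain, blink]

mixing = np.array([[1.0, 0.8],
                   [0.7, 1.2]])
eeg = sources @ mixing.T                                       # two simulated channels

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(eeg)

# Pick the artifact component as the most "spiky" one (kurtosis-like proxy).
kurt = ((components - components.mean(0)) ** 4).mean(0) / components.var(0) ** 2
artifact_idx = int(np.argmax(kurt))
components[:, artifact_idx] = 0.0                              # suppress the ocular component

cleaned = ica.inverse_transform(components)
print("suppressed component:", artifact_idx, "cleaned shape:", cleaned.shape)
```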

Evolutionary Training of Hybrid Systems of Recurrent Neural Networks and Hidden Markov Models

We present a hybrid architecture of recurrent neural networks (RNNs) inspired by hidden Markov models (HMMs). We train the hybrid architecture using genetic algorithms to learn and represent dynamical systems. The hybrid architecture is trained on a set of deterministic finite-state automaton strings, and its generalization performance is observed when it is presented with a new set of strings that were not present in the training data. In this way, we show that the hybrid system of HMM and RNN can learn and represent deterministic finite-state automata. We ran experiments with different population sizes in the genetic algorithm, and further experiments to find out which weight initializations were best for training the hybrid architecture. The results show that the hybrid architecture of recurrent neural networks inspired by hidden Markov models can learn and represent dynamical systems. The best training and generalization performance is achieved when the hybrid architecture is initialized with random real weight values in the range -15 to 15.
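
A minimal sketch of the idea follows, assuming a tiny single-input recurrent network and a simple "even number of ones" automaton as the target language (neither is necessarily the paper's exact setup): a genetic algorithm evolves the flattened weight vector, with initial weights drawn from the -15 to 15 range mentioned above.

```python
# Hedged sketch: GA-evolved weights for a tiny RNN on a DFA task
# (accept binary strings containing an even number of ones).
import numpy as np

rng = np.random.default_rng(0)
N_WEIGHTS = 4 + 16 + 4 + 4 + 1      # W_in, W_rec, b_h, W_out, b_out for 4 hidden units

def unpack(w):
    return (w[:4].reshape(4, 1), w[4:20].reshape(4, 4), w[20:24], w[24:28], w[28])

def rnn_accepts(w, bits):
    W_in, W_rec, b_h, W_out, b_out = unpack(w)
    h = np.zeros(4)
    for bit in bits:
        h = np.tanh(W_in[:, 0] * bit + W_rec @ h + b_h)
    z = np.clip(W_out @ h + b_out, -30, 30)
    return 1.0 / (1.0 + np.exp(-z))                 # acceptance probability

strings = [rng.integers(0, 2, rng.integers(1, 8)).tolist() for _ in range(60)]
labels  = [1.0 - (sum(s) % 2) for s in strings]     # 1 if the string has an even number of ones

def fitness(w):
    return -np.mean([(rnn_accepts(w, s) - y) ** 2 for s, y in zip(strings, labels)])

population = [rng.uniform(-15, 15, N_WEIGHTS) for _ in range(40)]
for _ in range(60):                                  # generations (assumed GA settings)
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    population = parents + [p + rng.normal(0, 0.5, N_WEIGHTS) for p in parents for _ in range(3)]

print("best fitness (negative MSE):", round(fitness(population[0]), 4))
```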

An Optimization of the New Die Design of Sheet Hydroforming by the Taguchi Method

During the last few years, several sheet hydroforming processes have been introduced. Despite the advantages of these methods, they have some limitations; the two main processes are standard hydroforming and hydromechanical deep drawing. A new sheet hydroforming die set was proposed that combines the advantages of both processes and eliminates their limitations. In this method, a polyurethane plate was used as part of the die set to control the blank holder force. This paper outlines the Taguchi optimization methodology applied to optimize the effective parameters in forming cylindrical cups with the new sheet hydroforming die set. The process parameters evaluated in this research are polyurethane hardness, polyurethane thickness, forming pressure path and polyurethane hole diameter. A design of experiments based on Taguchi's L9 orthogonal array was used, and analysis of variance (ANOVA) was employed to analyze the effect of these parameters on the forming pressure. The analysis of the results showed that the optimal combination for low forming pressure is a harder polyurethane, a larger polyurethane hole diameter and a thinner polyurethane plate. Finally, a confirmation test was performed with the optimal combination of parameters, showing that the Taguchi method is suitable for this optimization problem.
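
The bookkeeping behind such a study can be sketched as follows: a standard L9(3^4) orthogonal array over the four factors, a smaller-is-better signal-to-noise ratio for the forming pressure, and mean S/N per factor level. The pressure values below are placeholders, not the paper's measurements.

```python
# Hedged sketch of Taguchi L9 bookkeeping; the response values are illustrative.
import numpy as np

# Standard L9 orthogonal array: 9 runs, four 3-level factors (levels 0..2).
L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
               [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
               [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
factors = ["hardness", "thickness", "pressure path", "hole diameter"]

# Placeholder forming-pressure responses, one per run (MPa, assumed).
pressure = np.array([21.0, 19.5, 18.2, 20.1, 17.9, 19.0, 18.5, 17.2, 16.8])

# Smaller-is-better S/N ratio (single replicate per run).
sn = -10.0 * np.log10(pressure ** 2)

for j, name in enumerate(factors):
    level_means = [sn[L9[:, j] == level].mean() for level in range(3)]
    print(f"{name:14s} mean S/N by level: {np.round(level_means, 2)}")
```

The level with the highest mean S/N for each factor would then be taken as its optimal setting, while ANOVA on the same responses apportions how much each factor contributes to the variation.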

The Knapsack Sharing Problem: A Tree Search Exact Algorithm

In this paper, we study the knapsack sharing problem, a variant of the well-known NP-hard single knapsack problem. We investigate the use of a tree search for optimally solving the problem. The method combines two complementary phases: a reduction interval search phase and a branch-and-bound procedure. First, the reduction phase applies a polynomial reduction strategy that decomposes the problem into a series of knapsack problems. Second, the tree search procedure is applied in order to obtain a set of optimal capacities characterizing the knapsack problems. Finally, the performance of the proposed exact algorithm is evaluated on a set of instances from the literature, and its runtime is compared with that of the best exact algorithm in the literature.
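
For readers unfamiliar with the problem, the sketch below states the knapsack sharing objective (maximize the worst class profit under a single shared capacity) and solves a tiny illustrative instance by brute force; the paper's exact algorithm is the reduction plus branch-and-bound tree search described above, not this enumeration.

```python
# Hedged sketch of the knapsack sharing objective on a toy instance.
from itertools import chain, combinations

# (profit, weight, class) triples and one shared capacity -- illustrative data.
items = [(10, 4, 0), (7, 3, 0), (6, 2, 1), (9, 5, 1), (4, 1, 2), (8, 4, 2)]
capacity, n_classes = 10, 3

def powerset(indices):
    return chain.from_iterable(combinations(indices, r) for r in range(len(indices) + 1))

best_value, best_sel = -1, ()
for sel in powerset(range(len(items))):
    if sum(items[i][1] for i in sel) > capacity:
        continue
    class_profit = [0] * n_classes
    for i in sel:
        class_profit[items[i][2]] += items[i][0]
    value = min(class_profit)          # "sharing": the objective is the worst class profit
    if value > best_value:
        best_value, best_sel = value, sel

print("optimal min-class profit:", best_value, "selected items:", best_sel)
```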

Clustering Unstructured Text Documents Using Fading Function

Clustering unstructured text documents is an important issue in the data mining community and has a number of applications, such as document archive filtering, document organization, topic detection and subject tracing. In the real world, some of the already clustered documents may lose importance over time, while new documents of greater significance may arrive. Most of the work done so far in clustering unstructured text documents overlooks this aspect. This paper addresses the issue by using a Fading Function. The unstructured text documents are clustered, and for each cluster a statistics structure called a Cluster Profile (CP) is maintained. The cluster profile incorporates the Fading Function, which keeps account of the time-dependent importance of the cluster. The work proposes a novel algorithm, the Clustering n-ary Merge Algorithm (CnMA), for unstructured text documents, which uses the Cluster Profile and the Fading Function. Experimental results illustrating the effectiveness of the proposed technique are also included.
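
A minimal sketch of a time-fading cluster profile is given below, assuming an exponential fading function of the form 2^(-λ·Δt); the fields kept in the profile and the update rule are illustrative and much simpler than the paper's Cluster Profile and CnMA merge rules.

```python
# Hedged sketch of a fading cluster profile; lambda and the fields are assumptions.
LAM = 0.5   # decay rate per time unit (assumed)

def fade(weight, dt):
    """Down-weight a cluster's importance by how long it has been idle."""
    return weight * 2.0 ** (-LAM * dt)

class ClusterProfile:
    def __init__(self, doc_vector, t):
        self.centroid = list(doc_vector)
        self.weight = 1.0            # time-dependent importance of the cluster
        self.last_update = t

    def add(self, doc_vector, t):
        self.weight = fade(self.weight, t - self.last_update) + 1.0
        self.last_update = t
        w = self.weight              # incremental, importance-weighted centroid update
        self.centroid = [(c * (w - 1.0) + x) / w for c, x in zip(self.centroid, doc_vector)]

cp = ClusterProfile([0.2, 0.8], t=0)
cp.add([0.4, 0.6], t=3)              # the cluster is revisited after 3 idle time units
print("weight:", round(cp.weight, 3), "centroid:", [round(c, 3) for c in cp.centroid])
```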