Stresses in Cast Metal Inlay-Restored Molars

Cast metal inlays can be used on molars requiring a class II restoration instead of amalgam and offer a durable alternative. Because class II inlays are known to increase susceptibility to fracture, selecting an adequate preparation design is important to reduce stresses both in the tooth structures and in the restorations and to ensure optimal performance. The aim of this study was to investigate the influence of preparation design on stress distribution in molars with different class II preparations and in the corresponding cast metal inlays. The first step of the study was to create 3D models in order to analyze the teeth and the cast metal class II inlays. The geometry of the intact tooth was obtained by 3D scanning using a device manufactured for this purpose. The preparations and the corresponding inlays were designed with a NURBS modeling program. 3D models of first upper molars of the same shape and size were created, and the inlay cavity designs were based on literature data. The geometrical model was exported, and the mesh of the solid 3D model was created for structural simulations. Stresses were located around the occlusal contact areas. For the studied cases, the stress values were not significantly influenced by the taper of the preparation. It was shown that stresses are higher in the cast metal restorations than in the tooth structures, and therefore the strength of the teeth is not compromised.

Retarding Potential Analyzer Design and Result Analysis for Ion Energy Distribution Measurement of the Thruster Plume in the Laboratory

A plasma plume is produced and reaches the spacecraft when an electric thruster operates on orbit. It is important to characterize the thruster plasma parameters because the plume has significant effects on, or hazards for, spacecraft sub-systems and parts. From ground-test data of the desired parameters, the major characteristics of the thruster plume can be determined; this is also very important for optimizing the design of the ion thruster. The Retarding Potential Analyzer (RPA) is an effective instrument for measuring the plasma ion energy-per-unit-charge distribution. A special RPA should be designed according to the expected range and features of the plume plasma parameters. In this paper, the major principles for a good RPA design are discussed carefully. Following these principles, a four-grid planar electrostatic energy analyzer (RPA) was designed to avoid false data, and its details, including construction, materials and aperture diameter, are discussed. At the same time, it was designed to be suitable for credible, long-duration measurements in the laboratory. Finally, RPA measurement results obtained in the laboratory are given and discussed.
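As an illustration of how an RPA sweep can be reduced to an ion energy-per-charge distribution (the quantity the instrument is designed to measure), the following sketch differentiates a hypothetical collector-current curve with respect to the retarding voltage; the smoothing window and the synthetic sweep are assumptions for demonstration, not data from the reported experiment.

```python
import numpy as np

def ion_energy_distribution(voltages, currents):
    """Estimate the ion energy-per-charge distribution from an RPA I-V sweep.

    The distribution is proportional to the negative derivative of the
    collected ion current with respect to the retarding grid voltage.
    Both input arrays are hypothetical measurement vectors.
    """
    # Smooth the raw current to suppress measurement noise before differentiating
    kernel = np.ones(5) / 5.0
    smoothed = np.convolve(currents, kernel, mode="same")
    # dI/dV via finite differences; the minus sign makes the distribution non-negative
    dI_dV = np.gradient(smoothed, voltages)
    return -dI_dV

# Example with synthetic sweep data (placeholders, not real thruster data)
V = np.linspace(0, 300, 301)                    # retarding potential, volts
I = 1e-6 / (1 + np.exp((V - 150) / 10))         # idealized collector current, amps
f = ion_energy_distribution(V, I)
print("Most probable ion energy ~", V[np.argmax(f)], "eV per unit charge")
```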

Comparative Analysis of Farm Enterprise Performance in Two Feuding Agro-Ecological Zones of Nigeria

The two agro-ecological zones became the focus of the study because of the violent nature of the incessant conflicts in the zones. The available register of the farmers' association served as the sampling frame, from which ten percent (61) of the farmers per state were randomly sampled. Data were collected and analysed using a z-test. The research findings revealed that tree crop and grain production enterprises ranked higher in Osun State (rain-fed zone) and Taraba State (savannah zone) respectively. Osun State entrepreneurs felt the effect of the conflict on their enterprises more than those in Taraba State. The reasons adduced for the severity of the conflict's impact on enterprises were that the majority (77.0%) migrated and 75.5% of them were not allowed to enter their farms during the conflict and after it de-escalated, unlike the situation in Taraba State. The difference in enterprise production levels between the two agro-ecological zones was statistically significant at p
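For readers unfamiliar with the test used, a minimal sketch of a two-sample z-test of the kind applied to the enterprise production data is given below; the summary statistics are hypothetical placeholders, and only the sample size of 61 farmers per state comes from the abstract.

```python
import math

def two_sample_z_test(mean1, mean2, sd1, sd2, n1, n2):
    """Two-sample z-test for the difference between two enterprise production means.

    All inputs are summary statistics; the figures used below are illustrative
    placeholders, not the study's data.
    """
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)   # standard error of the difference
    return (mean1 - mean2) / se

# Hypothetical summary statistics for the two zones (61 farmers per state)
z = two_sample_z_test(mean1=4.2, mean2=3.1, sd1=1.5, sd2=1.3, n1=61, n2=61)
print("z statistic:", round(z, 3))   # compare with the critical value, e.g. 1.96 at the 5% level
```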

CSOLAP (Continuous Spatial On-Line Analytical Processing)

Decision support systems are usually based on multidimensional structures which use the concept of a hypercube. Dimensions are the axes along which facts are analyzed and form a space in which a fact is located by a set of coordinates at the intersections of dimension members. Conventional multidimensional structures deal with discrete facts linked to discrete dimensions. However, when dealing with naturally continuous phenomena, the discrete representation is not adequate. There is a need to integrate spatiotemporal continuity within multidimensional structures to enable the analysis and exploration of continuous field data. The research issues that lead to the integration of spatiotemporal continuity in multidimensional structures are numerous. In this paper, we discuss research issues related to the integration of continuity in multidimensional structures and briefly present a multidimensional model for continuous field data. We also define new aggregation operations. The model and the associated operations and measures are validated by a prototype.
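As a rough illustration of aggregating a continuous field measure (an operation of the kind the model introduces), the sketch below interpolates discrete sensor readings over a query region and averages the resampled values; the interpolation method, grid step and sample data are assumptions for demonstration, not the paper's operators.

```python
import numpy as np
from scipy.interpolate import griddata

def continuous_avg(points, values, xmin, xmax, ymin, ymax, step=1.0):
    """Aggregate a continuous field over a spatial region by resampling it.

    The discrete measurements (points, values) are interpolated to a regular
    grid covering the query region, and the aggregation (here the mean) is
    computed over the interpolated samples. All inputs are illustrative.
    """
    xs, ys = np.meshgrid(np.arange(xmin, xmax, step), np.arange(ymin, ymax, step))
    field = griddata(points, values, (xs, ys), method="linear")
    return float(np.nanmean(field))

# Hypothetical temperature sensors and a query window
pts = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
vals = np.array([12.0, 14.0, 13.0, 15.0])
print(continuous_avg(pts, vals, 2, 8, 2, 8))   # average temperature over the sub-region
```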

Efficient Feature-Based Registration for CT-MR Images Based on NSCT and PSO

Feature-based registration is an effective technique for clinical use, because it can greatly reduce computational costs. However, this technique, which estimates the transformation from feature points extracted from two images, may cause misalignments. To address this limitation, we propose to extract salient edges and control points (CPs) from medical images by exploiting the efficient multiresolution representation of the nonsubsampled contourlet transform (NSCT), which finds the best feature points. The MR images were first decomposed using the NSCT, and then edges and CPs were extracted from the bandpass directional subbands of the NSCT coefficients using a set of proposed rules. After edge and CP extraction, mutual information was adopted for the registration of the feature points, and the translation parameters were calculated using particle swarm optimization (PSO). The experimental results showed that the proposed method produces accurate registration of medical CT-MR images.
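A minimal sketch of the final optimization step is given below, assuming an intensity-based mutual information measure and a basic particle swarm search over a 2D translation; the PSO coefficients, the cyclic integer-pixel translation and the histogram binning are simplifications and may differ from the paper's feature-point formulation.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally sized images (intensity-based)."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def translate(img, tx, ty):
    """Cyclic integer-pixel translation (np.roll), a simplification for this sketch."""
    return np.roll(np.roll(img, int(round(ty)), axis=0), int(round(tx)), axis=1)

def pso_register(fixed, moving, n_particles=20, n_iter=50, bound=20.0):
    """Estimate (tx, ty) that maximizes mutual information using a basic PSO."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(-bound, bound, size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.full(n_particles, -np.inf)
    gbest, gbest_val = pos[0].copy(), -np.inf
    for _ in range(n_iter):
        for i, (tx, ty) in enumerate(pos):
            val = mutual_information(fixed, translate(moving, tx, ty))
            if val > pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i].copy()
            if val > gbest_val:
                gbest_val, gbest = val, pos[i].copy()
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -bound, bound)
    return gbest, gbest_val
```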

A Study on the Secure ebXML Transaction Models

ebXML (Electronic Business using eXtensible Markup Language) is an e-business standard, sponsored by UN/CEFACT and OASIS, which enables enterprises to exchange business messages, conduct trading relationships, communicate data in common terms and define and register business processes. While there is tremendous e-business value in the ebXML, security remains an unsolved problem and one of the largest barriers to adoption. XML security technologies emerging recently have extensibility and flexibility suitable for security implementation such as encryption, digital signature, access control and authentication. In this paper, we propose ebXML business transaction models that allow trading partners to securely exchange XML based business transactions by employing XML security technologies. We show how each XML security technology meets the ebXML standard by constructing the test software and validating messages between the trading partners.

Pre-Clinical Study of the Antitumor Preparation Ramon: Acute Toxicity

This article describes acute toxicity data from pre-clinical studies of the Ramon preparation. The effects of Ramon on clinical blood characteristics and the cardiovascular system, as well as its hepatotoxic and diuretic effects, were studied.

Query Algebra for Semistructured Data

With the tremendous growth of World Wide Web (WWW) data, there is an emerging need for effective information retrieval at the document level. Several query languages, such as XML-QL, XPath, XQL, Quilt and XQuery, have been proposed in recent years to provide faster ways of querying XML data, but they still lack generality and efficiency. Our approach towards evolving a framework for querying semistructured documents is based on a formal query algebra. Two elements are introduced in the proposed framework: first, a generic and flexible data model for the logical representation of semistructured data and, second, a set of operators for the manipulation of objects defined in the data model. In addition to accommodating several peculiarities of semistructured data, our model offers novel features such as bidirectional paths for navigational querying and partitions for data transformation that are not available in other proposals.
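To make the flavour of such an algebra concrete, the following sketch implements two illustrative operators (selection and path-based projection) over semistructured objects represented as nested dictionaries; the operator names and the document examples are hypothetical and are not taken from the proposed framework.

```python
def select(objects, predicate):
    """Selection operator over a collection of semistructured objects (nested dicts)."""
    return [o for o in objects if predicate(o)]

def project(objects, paths):
    """Projection operator: keep only the values reachable through the given paths."""
    def follow(obj, path):
        for step in path.split("/"):
            if not isinstance(obj, dict) or step not in obj:
                return None          # missing components are tolerated (schema-less data)
            obj = obj[step]
        return obj
    return [{p: follow(o, p) for p in paths} for o in objects]

# Hypothetical document collection
docs = [
    {"book": {"title": "XML Basics", "year": 2001, "author": {"name": "A. Smith"}}},
    {"article": {"title": "Query Algebras", "year": 2003}},
]
print(select(docs, lambda o: "book" in o))
print(project(docs, ["book/title", "book/author/name"]))
```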

Toward the Use of Ontology to Reinforce Semantic Classification of Messages Based on LSA

For effective collaboration, asynchronous tools, and particularly discussion forums, are the most widely used thanks to their flexibility in terms of time. To convey only the messages that belong to a theme of interest to the tutor, and thereby assist him in his tutoring work, a tool for classifying these messages is indispensable. To this end, we previously proposed a semantic classification tool for discussion forum messages based on LSA (Latent Semantic Analysis), which includes a thesaurus to organize the vocabulary. The benefits offered by a formal ontology can overcome the shortcomings that a thesaurus introduces and encourage us to use it in our semantic classifier. In this work, we propose the use of some functionalities offered by an OWL ontology. We then explain how constructs such as "ObjectProperty", "SubClassOf" and "Datatype" properties make our classification more intelligent by integrating new terms. The new terms are generated from the initial terms introduced by the tutor and the semantic relations described by the OWL formalism.
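The sketch below illustrates the general idea, assuming the ontology relations are available as simple lookup tables: the tutor's seed terms are expanded through hypothetical "SubClassOf" and "ObjectProperty" links before LSA-based filtering of the forum messages. The relation tables, threshold and helper names are placeholders, not the actual OWL processing described in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical expansion tables standing in for ontology relations
SUBCLASS_OF = {"sorting": ["quicksort", "mergesort"]}
RELATED_TO = {"recursion": ["stack", "base case"]}

def expand_terms(seed_terms):
    """Add subclasses and related terms of each seed term to the theme query."""
    expanded = list(seed_terms)
    for t in seed_terms:
        expanded += SUBCLASS_OF.get(t, []) + RELATED_TO.get(t, [])
    return expanded

def classify_messages(messages, seed_terms, n_topics=2, threshold=0.2):
    """Keep only the forum messages whose LSA representation is close to the theme."""
    theme_query = " ".join(expand_terms(seed_terms))
    vec = TfidfVectorizer()
    X = vec.fit_transform(messages + [theme_query])
    lsa = TruncatedSVD(n_components=n_topics).fit_transform(X)
    sims = cosine_similarity(lsa[:-1], lsa[-1:]).ravel()
    return [m for m, s in zip(messages, sims) if s >= threshold]
```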

Developing Road Performance Measurement System with Evaluation Instrument

Transportation authorities need to provide the services and facilities that are critical to every country's well-being and development. Management of the road network is becoming increasingly challenging as demands increase and resources are limited. Public sector institutions are integrating performance information into budgeting, managing and reporting by implementing performance measurement systems. In the face of growing challenges, performance measurement of road networks is attracting growing interest in many countries. The large scale of public investment makes the maintenance and development of road networks an area where such systems are an important assessment tool. Transportation agencies have been using performance measurement and modeling as part of pavement and bridge management systems. Recently the focus has been on extending the process to applications in road construction and maintenance systems, operations and safety programs, and administrative structures and procedures. To eliminate failures and dysfunctional consequences, this paper presents the importance of obtaining objective data and of implementing an evaluation instrument where necessary.

Detecting the Capacity Reserve in an Overhead Line

There are various solutions for improving existing overhead line systems with the general purpose of increasing their limited capacity. The capacity reserve of existing overhead lines is an important problem that must be considered from different aspects. The paper contains a comparative analysis of the mechanical and thermal limitations of an existing overhead line based on certain calculation conditions characterizing the examined variants. The methodology of the proposed estimation of the permissible conductor temperature and the maximum load current is described in detail. The transmission line model uses specific information on an existing overhead line of the Latvian power network. The main purpose of the simulation tasks is to find an additional capacity reserve by using accurate mathematical models. The obtained results are presented.
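As a rough sketch of the kind of thermal limitation calculation involved, the following code solves a simplified steady-state heat balance for the permissible load current at a given conductor temperature; the convection coefficient, emissivity, absorptivity and line data are generic assumptions rather than the detailed formulation and Latvian network parameters used in the paper.

```python
import math

def permissible_current(t_cond, t_amb, d, r_ac, wind=0.6,
                        emissivity=0.5, absorptivity=0.5, solar=900.0):
    """Rough steady-state ampacity from a simplified heat balance, per metre of conductor.

    I^2 * R(T) + solar gain = convective loss + radiative loss.
    The coefficients below are simplified textbook-style assumptions, not the
    exact standard formulation applied to the studied line.
    """
    sigma = 5.67e-8                                    # Stefan-Boltzmann constant, W/m2K4
    h = 10.0 + 6.0 * math.sqrt(wind)                   # assumed forced-convection coefficient, W/m2K
    q_conv = h * math.pi * d * (t_cond - t_amb)
    q_rad = emissivity * sigma * math.pi * d * ((t_cond + 273.15)**4 - (t_amb + 273.15)**4)
    q_sun = absorptivity * solar * d
    q_net = q_conv + q_rad - q_sun
    return math.sqrt(max(q_net, 0.0) / r_ac)

# Hypothetical 28 mm conductor with AC resistance 0.12 ohm/km at the permissible temperature
print(round(permissible_current(t_cond=70.0, t_amb=25.0, d=0.028, r_ac=0.12e-3), 1), "A")
```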

Mass Transfer Modeling in a Packed Bed of Palm Kernels under Supercritical Conditions

Gas-solid mass transfer using supercritical CO2 (SC-CO2) in a packed bed of palm kernels was investigated at temperatures of 50 °C and 70 °C and pressures of 27.6 MPa, 34.5 MPa, 41.4 MPa and 48.3 MPa. The development of mass transfer models requires knowledge of three properties: the diffusion coefficient of the solute and the viscosity and density of the supercritical fluid (SCF). A mathematical model expressed in terms of the dimensionless Sherwood (Sh), Schmidt (Sc) and Reynolds (Re) numbers was developed. The developed model was found to be in good agreement with the experimental data within the system studied.
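A minimal sketch of such a dimensionless-number model is shown below, assuming a generic packed-bed correlation of the form Sh = 2 + a*Re^b*Sc^(1/3); the constants a and b and the SC-CO2 property values are illustrative placeholders, not the fitted parameters or measured properties reported in the study.

```python
def mass_transfer_coefficient(re, sc, diffusivity, d_p, a=0.6, b=0.5):
    """Estimate the external mass transfer coefficient from a Sh = f(Re, Sc) correlation.

    A generic packed-bed form Sh = 2 + a * Re**b * Sc**(1/3) is used purely for
    illustration; a and b are assumptions, not the study's fitted values.
    """
    sh = 2.0 + a * re**b * sc**(1.0 / 3.0)
    return sh * diffusivity / d_p     # k_f in m/s

# Hypothetical SC-CO2 properties at 50 degC and 34.5 MPa (placeholder values)
rho, mu, D = 830.0, 7.5e-5, 8.0e-9      # density kg/m3, viscosity Pa.s, diffusivity m2/s
u, d_p = 1.0e-3, 4.0e-3                 # superficial velocity m/s, kernel diameter m
re = rho * u * d_p / mu
sc = mu / (rho * D)
print("Re =", round(re, 2), " Sc =", round(sc, 1),
      " k_f =", mass_transfer_coefficient(re, sc, D, d_p), "m/s")
```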

A Novel Approach for Tracking of a Mobile Node Based on Particle Filter and Trilateration

This paper evaluates the performance of a novel algorithm for tracking a mobile node, in terms of execution time and root mean square error (RMSE). A particle filter algorithm is used to track the mobile node, and a new technique within the particle filter algorithm is proposed to reduce the execution time. The stationary points were calculated through trilateration and then by averaging the points collected over a specific time, whereas tracking is done through trilateration as well as the particle filter algorithm. The Wi-Fi signal is used to obtain an initial guess of the position of the mobile node in the x-y coordinate system. The commercially available software "Wireless Mon" was used to read the Wi-Fi signal strength from the Wi-Fi card. Visual C++ version 6 was used to interact with this software and read only the required data from the log file generated by the "Wireless Mon" software. Results are evaluated through mathematical modeling and MATLAB simulation.
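For illustration, a minimal least-squares trilateration step of the kind used to obtain the position fixes is sketched below; the access-point coordinates and ranges are hypothetical, and the conversion from Wi-Fi signal strength to distance is omitted.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares position estimate from distances to known anchor points.

    Linearizes the circle equations by subtracting the first anchor's equation,
    then solves A x = b; the anchors and distances below are illustrative only.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0, d0 = anchors[0, 0], anchors[0, 1], d[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], d[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol   # estimated (x, y)

# Three hypothetical Wi-Fi access points and ranges derived from signal strength
aps = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [7.07, 7.07, 7.07]
print("Estimated position:", trilaterate(aps, ranges))   # approximately (5, 5)
```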

New Adaptive Linear Discriminant Analysis for Face Recognition with SVM

We have applied a new accelerated algorithm for linear discriminant analysis (LDA) to face recognition with a support vector machine. The new algorithm has the advantage of an optimal selection of the step size. The gradient descent method and the new algorithm have been implemented in software and evaluated on the Yale Face Database B. The eigenfaces of these approaches have been used to train a KNN classifier. The recognition rate of the new algorithm is compared with that of gradient descent.
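A generic sketch of step-size selection in gradient-based maximization of the Fisher criterion is given below; it uses a simple backtracking rule and is only meant to illustrate the role of the step size, since the paper's accelerated algorithm and its optimal step-size formula are not reproduced here.

```python
import numpy as np

def fisher_ratio(w, Sb, Sw):
    """Fisher discriminant criterion J(w) = (w' Sb w) / (w' Sw w)."""
    return float(w @ Sb @ w) / float(w @ Sw @ w)

def lda_gradient_ascent(Sb, Sw, n_iter=200):
    """Maximize the Fisher ratio by gradient ascent with a backtracking step size.

    Sb and Sw are the between-class and within-class scatter matrices; this is
    a generic sketch, not the optimal step-size rule of the accelerated method.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(size=Sb.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        a, b = float(w @ Sb @ w), float(w @ Sw @ w)
        grad = 2.0 * (Sb @ w * b - Sw @ w * a) / (b * b)   # gradient of J at w
        step = 1.0
        while fisher_ratio(w + step * grad, Sb, Sw) <= fisher_ratio(w, Sb, Sw) and step > 1e-8:
            step *= 0.5                                    # backtrack until J improves
        w = w + step * grad
        w /= np.linalg.norm(w)
    return w
```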

The Analysis of Knee Joint Movement During Golf Swing in Professional and Amateur Golfers

Understanding knee movement during the swing is important for improving the golf swing and preventing injury. Thirty male professional and amateur golfers were each asked to perform the swing three times. Data from a video-based motion capture system were used to compute knee joint movement variables. The results showed that professional and amateur golfers differed significantly in left knee flexion angle at the impact point and in the mid follow-through phase. The difference in left knee external rotation between the two groups was also significant. The right knee showed no significant differences in any variable. However, the patterns of knee joint movement were similar between professional and amateur golfers.

Annual Changes in Some Qualitative Parameters of Groundwater in the Shirvan Plain, North-East of Iran

Shirvan is a plain located in Northern Khorasan province, in the north-east of Iran, and has a semi-arid to temperate climate. To investigate the annual changes in some qualitative parameters, such as electrical conductivity, total dissolved solids and chloride concentration, which have increased over ten continuous years, fourteen groundwater sources, including deep as well as semi-deep wells, were sampled and analyzed using standard methods. The trends in the obtained data over these years were analyzed, and the effects of different factors on the changes in electrical conductivity, chloride concentration and total dissolved solids were clarified. The results showed that the values of some qualitative parameters increased over the 10-year period, which has led to a decrease in water quality. The results also showed that the increase in urban population as well as the extensive industrialization in the studied area are the most important factors influencing groundwater quality. Furthermore, a decrease in water quantity is also evident due to greater water utilization and the occurrence of droughts in the region in recent years.

A New Approach for Fingerprint Classification Based on the Gray-Level Co-Occurrence Matrix

In this paper, we propose an approach for the classification of fingerprint databases. It is based on the fact that a fingerprint image is composed of regular texture regions that can be successfully represented by co-occurrence matrices. So, we first extract features based on certain characteristics of the co-occurrence matrix and then use these features to train a neural network for classifying fingerprints into four common classes. The obtained results, compared with those of existing approaches, demonstrate the superior performance of our proposed approach.
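A minimal sketch of co-occurrence-based feature extraction is given below using scikit-image; the chosen distances, angles and texture properties are assumptions for illustration and may differ from the configuration used in the paper.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(image, distances=(1,), angles=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    """Extract co-occurrence based texture features from a fingerprint image.

    The image is assumed to be an 8-bit grayscale array; the distances, angles
    and property set are illustrative, not the paper's exact configuration.
    """
    glcm = graycomatrix(image, distances=list(distances), angles=list(angles),
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "energy", "homogeneity", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# The resulting feature vector would then be fed to a neural-network classifier
# that assigns the fingerprint to one of the four common classes.
```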

SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space

Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH presents limitations that affect structure and query performance. Its main limitation is the large memory consumption: to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping an accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
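For context, a minimal sketch of a single Euclidean LSH table of the classic form h(v) = floor((a·v + b)/w) is shown below; the number of hash functions and the bucket width are illustrative, and the proposed SC-LSH modifications are not reproduced here.

```python
import numpy as np

class EuclideanLSH:
    """Minimal E2LSH-style hash table: h(v) = floor((a . v + b) / w).

    A real index would combine several such tables; the parameters k and w
    below are illustrative and not tuned to any particular dataset.
    """
    def __init__(self, dim, k=8, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=(k, dim))          # Gaussian projection vectors
        self.b = rng.uniform(0.0, w, size=k)        # random offsets in [0, w)
        self.w = w
        self.buckets = {}

    def _key(self, v):
        return tuple(np.floor((self.a @ v + self.b) / self.w).astype(int))

    def insert(self, idx, v):
        self.buckets.setdefault(self._key(v), []).append(idx)

    def query(self, v):
        return self.buckets.get(self._key(v), [])
```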

Combining Diverse Neural Classifiers for Complex Problem Solving: An ECOC Approach

Combining classifiers is a useful method for solving complex problems in machine learning. The ECOC (Error Correcting Output Codes) method has been widely used for designing combinations of classifiers with an emphasis on the diversity of the classifiers. In this paper, in contrast to the standard ECOC approach in which individual classifiers are chosen homogeneously, classifiers are selected according to the complexity of the corresponding binary problem. We use the SATIMAGE database (containing 6 classes) for our experiments. The recognition error rate of our proposed method is 10.37%, which indicates a considerable improvement in comparison with the conventional ECOC and stacked generalization methods.
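A minimal ECOC encode/decode sketch is given below; the logistic-regression base learner is only a stand-in (the paper selects each binary classifier according to the complexity of its dichotomy), and the {-1, +1} code matrix is supplied by the caller.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_ecoc(X, y, code_matrix):
    """Train one binary learner per column of a {-1, +1} code matrix.

    y is an array of integer class indices; each column of the code matrix
    defines one dichotomy of the classes.
    """
    learners = []
    for col in code_matrix.T:
        labels = col[y]                  # relabel samples as -1 / +1 for this dichotomy
        clf = LogisticRegression(max_iter=1000).fit(X, labels)
        learners.append(clf)
    return learners

def predict_ecoc(X, learners, code_matrix):
    """Decode by choosing the class whose codeword is nearest in Hamming distance."""
    outputs = np.column_stack([clf.predict(X) for clf in learners])
    dists = (outputs[:, None, :] != code_matrix[None, :, :]).sum(axis=2)
    return dists.argmin(axis=1)
```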

Evaluation of Classifiers Based On I2C Distance for Action Recognition

Naive Bayes Nearest Neighbor (NBNN) and its variants, i.e., local NBNN and the NBNN kernels, are local feature-based classifiers that have achieved impressive performance in image classification. By exploiting instance-to-class (I2C) distances (an instance being an image or video in image or video classification), they avoid the quantization errors of local image descriptors in the bag-of-words (BoW) model. However, the performance of NBNN, local NBNN and the NBNN kernels has not been validated on video analysis. In this paper, we introduce these three classifiers into human action recognition and conduct comprehensive experiments on the benchmark KTH and the realistic HMDB datasets. The results show that these I2C-based classifiers consistently outperform the SVM classifier with the BoW model.
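A minimal sketch of the underlying I2C classification rule is given below; brute-force nearest-descriptor search is used for clarity, whereas practical NBNN implementations rely on approximate nearest-neighbour structures, and the descriptor arrays are hypothetical.

```python
import numpy as np

def nbnn_classify(query_descriptors, class_descriptor_sets):
    """Naive Bayes Nearest Neighbor: pick the class with the smallest I2C distance.

    query_descriptors is an (n, d) array of local features from one video;
    class_descriptor_sets maps each action label to an (m, d) array pooled
    from all training videos of that class.
    """
    best_label, best_dist = None, np.inf
    for label, descs in class_descriptor_sets.items():
        # For every query descriptor, squared distance to its nearest descriptor in the class
        d2 = ((query_descriptors[:, None, :] - descs[None, :, :]) ** 2).sum(axis=2)
        i2c = d2.min(axis=1).sum()
        if i2c < best_dist:
            best_label, best_dist = label, i2c
    return best_label
```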