Generating Frequent Patterns through Intersection between Transactions

The problem of frequent itemset mining is considered in this paper. A new technique is proposed to generate frequent patterns in large databases without time-consuming candidate generation. The technique focuses on transactions rather than on itemsets: instead of creating itemsets and computing their frequencies, each transaction is intersected with the other transactions and the maximum set of items shared between transactions is computed. Applying the method to real-life transactions, with assumptions drawn from real-life data, shows that significant efficiency is gained when generating association rules from databases.
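As an illustration of the transaction-centered idea, the hedged Python sketch below intersects pairs of transactions to obtain candidate shared-item patterns and then counts their support; it is a minimal toy version of the approach described above, not the authors' exact algorithm, and the market-basket data are invented.

```python
from itertools import combinations
from collections import Counter

def frequent_patterns_by_intersection(transactions, min_support):
    """Illustrative sketch: derive candidate patterns by pairwise
    transaction intersection, then count their support."""
    transactions = [frozenset(t) for t in transactions]
    candidates = set()
    # The shared items of two transactions form a candidate pattern.
    for t1, t2 in combinations(transactions, 2):
        shared = t1 & t2
        if shared:
            candidates.add(shared)
    # Count how many transactions contain each candidate.
    support = Counter()
    for pattern in candidates:
        support[pattern] = sum(1 for t in transactions if pattern <= t)
    return {pattern: count for pattern, count in support.items()
            if count >= min_support}

# Example usage with toy market-basket data.
baskets = [{"bread", "milk"}, {"bread", "milk", "eggs"}, {"milk", "eggs"}]
print(frequent_patterns_by_intersection(baskets, min_support=2))
```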

A Self-Consistent Scheme for Elastic-Plastic Asperity Contact

In this paper, a generalized self-consistent scheme, or "three-phase model", is used to set up a micromechanics model for rough surface contact with randomly distributed asperities. The dimensionless average real pressure p is obtained as a function of the ratio of the real contact area to the apparent contact area, A_r/A_0. Both elastic and plastic materials are considered, and the influence of the plasticity of the material on p is discussed. Both two-dimensional and three-dimensional rough surface contact problems are considered.

Compressive Properties of a Synthetic Bone Substitute for Vertebral Cancellous Bone

Transpedicular screw fixation in spinal fractures, degenerative changes, or deformities is a well-established procedure. However, substantial rates of fixation failure due to screw bending, loosening, or pullout are still reported, particularly in the weak bone stock of osteoporotic patients. To overcome this problem, the mechanism of failure has to be fully investigated in vitro. Post-mortem human subjects are not readily accessible, and animal cadavers have limitations owing to differences in geometry and mechanical properties. Therefore, the development of a synthetic model mimicking the real human vertebra is in high demand. In this study, a bone surrogate composed of polyurethane (PU) foam, analogous to the porous structure of cancellous bone, was tested at three different densities. The mechanical properties were investigated under uniaxial compression testing while minimizing end artifacts on the specimens. The results indicated that PU foam with a density of 0.32 g/cm³ has mechanical properties comparable to human cancellous bone in terms of Young's modulus and yield strength. The obtained information can therefore be considered a first step toward developing a realistic model of the cancellous bone of the human vertebral body. Further evaluation of the other density groups is also recommended.
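For readers unfamiliar with how such properties are typically extracted, the sketch below shows one common way to estimate Young's modulus and the 0.2% offset yield strength from uniaxial compression data; the fit window, the offset convention, and the function names are assumptions for illustration, not the authors' exact protocol.

```python
import numpy as np

def modulus_and_yield(strain, stress, fit_range=(0.005, 0.02), offset=0.002):
    """Estimate Young's modulus from a linear fit over `fit_range` of
    strain, and the yield strength by the 0.2% offset convention.
    Both the fit window and the offset rule are assumed conventions."""
    strain, stress = np.asarray(strain, float), np.asarray(stress, float)
    mask = (strain >= fit_range[0]) & (strain <= fit_range[1])
    E, _ = np.polyfit(strain[mask], stress[mask], 1)   # slope = modulus
    # Yield: first point where the curve falls below the offset line.
    offset_line = E * (strain - offset)
    crossed = np.where((strain > offset) & (stress <= offset_line))[0]
    sigma_y = float(stress[crossed[0]]) if crossed.size else None
    return float(E), sigma_y
```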

Application of Vortex Tubes for Extracting Sediments Using SHARC Software - A Case Study of the Western Canal in the Dez Diversion Weir

The transfer of sediment loads into hydraulic installations and its consequences for the operation and maintenance (O&M) of modern canal systems are emerging as one of the most important considerations in hydraulic engineering projects, particularly those intended to feed the irrigation and drainage schemes of large command areas such as the Dez and Moghan in Iran. The aim of this paper is to investigate the applicability of the vortex tube as a viable means of extracting sediment loads entering canal systems in general and water intake structures in particular. The Western conveyance canal of the Dez Diversion Weir, which feeds the Karkheh Flood Plain in southwestern Dezful, has been used as the case study, with data from the Dastmashan Hydrometric Station. The SHARC software has been used as an analytical framework to interpret the data. Results show that, given the grain size D50 and the canal turbulence, the adaptation length from the beginning of the canal downstream of the diversion dam is estimated at 477 m, a point which is suitable for installing the vortex tube.

A Patricia-Tree Approach for Frequent Closed Itemsets

In this paper, we propose an adaptation of the Patricia-Tree to sparse datasets in order to generate non-redundant association rules. Using this adaptation, we can generate frequent closed itemsets, which are more compact than the frequent itemsets used in the Apriori approach. The adaptation has been evaluated experimentally on a set of benchmark datasets.
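To make the notion of a frequent closed itemset concrete, the brute-force Python sketch below enumerates frequent itemsets and keeps only those with no proper superset of equal support; it illustrates the output the Patricia-Tree adaptation aims to produce compactly, not the Patricia-Tree algorithm itself.

```python
from itertools import combinations

def frequent_closed_itemsets(transactions, min_support):
    """Naive illustration of closed itemsets: an itemset is closed if no
    proper superset has the same support."""
    transactions = [frozenset(t) for t in transactions]
    items = sorted(set().union(*transactions))

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    frequent = {}
    for size in range(1, len(items) + 1):
        found = False
        for combo in combinations(items, size):
            s = support(frozenset(combo))
            if s >= min_support:
                frequent[frozenset(combo)] = s
                found = True
        if not found:          # anti-monotonicity: no larger frequent set exists
            break
    return {iset: s for iset, s in frequent.items()
            if not any(iset < other and s == s2
                       for other, s2 in frequent.items())}

print(frequent_closed_itemsets([{"a", "b", "c"}, {"a", "b"}, {"a", "c"}],
                               min_support=2))
```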

Evolved Disease Avoidance Mechanisms, Generalized Prejudice, and Modern Attitudes towards Individuals with Intellectual Disability

Previous research has demonstrated that negative attitudes towards people with physical disabilities and obesity are predicted by a component of perceived vulnerability to disease: germ aversion. These findings have been interpreted as illustrations of an evolved but over-active mechanism that promotes the avoidance of pathogen-carrying individuals. To date, this account of attitude formation has not been explored with regard to people with intellectual disability, and no attempts have been made to examine possible mediating factors. This study examined attitudes in 333 adults and demonstrated that the moderate positive relationship between germ aversion and negative attitudes toward people with intellectual disability is fully mediated by social dominance orientation, a general preference for hierarchies and inequalities among social groups. These findings have implications for the design of programs which attempt to promote community acceptance and inclusion of people with disabilities.

A Web-Based System for Mapping Features into ISO 14649-Compliant Machining Workingsteps

The rapid development of manufacturing and information systems has caused significant changes in manufacturing environments in recent decades. Mass production has given way to flexible manufacturing systems, an important characteristic of which is customized or "on demand" production. In this scenario, a seamless, gap-free information flow becomes a key factor in the success of enterprises. In this paper we present a framework to support the mapping of features into machining workingsteps compliant with the ISO 14649 standard (known as STEP-NC). The system determines how the features can be made with the available manufacturing resources. Examples of the mapping method are presented for features such as a pocket with a general surface.
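A hedged sketch of such a mapping step is given below: a hypothetical feature description is translated into simplified workingstep records when a capable tool is available. The attribute names (its_feature, its_operation, its_tool) only loosely echo ISO 14649 terminology, and the mapping table and resource check are invented for illustration rather than taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    kind: str            # e.g. "closed_pocket", "round_hole"
    depth_mm: float

@dataclass
class Tool:
    kind: str            # e.g. "endmill", "drill"
    max_depth_mm: float

# Hypothetical mapping table from feature type to the machining operations
# that a STEP-NC style workingstep would reference.
OPERATIONS = {
    "closed_pocket": ["plane_roughing", "bottom_and_side_finish_milling"],
    "round_hole": ["drilling"],
}

def map_to_workingsteps(feature, tools):
    """Return one workingstep per operation if a capable tool exists."""
    required_tool = "drill" if feature.kind == "round_hole" else "endmill"
    capable = [t for t in tools
               if t.kind == required_tool and t.max_depth_mm >= feature.depth_mm]
    if not capable:
        raise ValueError("no available tool can machine this feature")
    return [{"its_feature": feature.kind,
             "its_operation": op,
             "its_tool": capable[0].kind}
            for op in OPERATIONS[feature.kind]]

print(map_to_workingsteps(Feature("closed_pocket", 8.0),
                          [Tool("endmill", 20.0), Tool("drill", 40.0)]))
```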

Intention Recognition using a Graph Representation

Human-friendly interaction is a key function of a human-centered system. Over the years, much attention has been paid to developing convenient interaction through intention recognition. Intention recognition processes multimodal inputs including speech, face images, and body gestures. In this paper, we suggest a novel approach to intention recognition using a graph representation called the Intention Graph. A concept of valid intention is proposed as the target of intention recognition. Our approach has two phases: a goal recognition phase and an intention recognition phase. In the goal recognition phase, we generate an action graph from the observed actions and then recognize the candidate goals and their plans. In the intention recognition phase, the intention is recognized from the relevant goals and the user profile. We show that the algorithm has polynomial time complexity. The Intention Graph is applied to a simple briefcase domain to test our model.
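The toy Python sketch below mirrors the two phases on the briefcase domain mentioned above: candidate goals are recognized by matching observed actions against a plan library, and the intention is then chosen using a user-profile score. The plan library, the scoring rule, and the function names are assumptions for illustration, not the paper's Intention Graph construction; both phases clearly run in time polynomial in the number of goals and plan lengths, consistent with the complexity claim.

```python
# Hypothetical plan library: each goal is reached through an ordered
# sequence of actions (a path in the action graph).
PLAN_LIBRARY = {
    "prepare_briefcase": ["open_briefcase", "put_document", "close_briefcase"],
    "empty_briefcase":   ["open_briefcase", "take_document", "close_briefcase"],
}

def recognize_goals(observed_actions):
    """Phase 1 (goal recognition): keep every goal whose plan contains
    the observed actions as a subsequence."""
    def is_subsequence(obs, plan):
        it = iter(plan)
        return all(a in it for a in obs)
    return [g for g, plan in PLAN_LIBRARY.items()
            if is_subsequence(observed_actions, plan)]

def recognize_intention(observed_actions, user_profile):
    """Phase 2 (intention recognition): rank the candidate goals with a
    simple preference score taken from the user profile."""
    candidates = recognize_goals(observed_actions)
    if not candidates:
        return None
    return max(candidates, key=lambda g: user_profile.get(g, 0.0))

profile = {"prepare_briefcase": 0.8, "empty_briefcase": 0.3}
print(recognize_intention(["open_briefcase", "put_document"], profile))
```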

Dispersion of a Solute in Peristaltic Motion of a Couple Stress Fluid through a Porous Medium with Slip Condition

The paper presents an analytical solution for dispersion of a solute in the peristaltic motion of a couple stress fluid through a porous medium with slip condition in the presence of both homogeneous and heterogeneous chemical reactions. The average effective dispersion coefficient has been found using Taylor's limiting condition and long wavelength approximation. The effects of various relevant parameters on the average coefficient of dispersion have been studied. The average effective dispersion coefficient tends to increase with permeability parameter but tends to decrease with homogeneous chemical reaction rate parameter, couple stress parameter, slip parameter and heterogeneous reaction rate parameter.

Theoretical Analysis of a Crossed-Electrode 2D Array for 3D Imaging

Planar systems of electrodes arranged on both sides of a dielectric piezoelectric layer are applied in numerous transducers. They are capable of electronically steering the generated wave beam in both azimuth and elevation. The wave-beam control is achieved by addressable driving of the two-dimensional transducer through proper voltage supply to the electrodes on the opposite surfaces of the layer. In this paper a semi-analytical method for analyzing the considered transducer is proposed, which is a generalization of the well-known BIS-expansion method. That method was earlier exploited with great success in the theory of interdigital transducers of surface acoustic waves, the theory of elastic wave scattering by cracks, and certain advanced electrostatic problems. The corresponding nontrivial electrostatic problem is formulated and solved numerically.

Optimal Capacitor Placement to Reduce Losses and Improve the Voltage Profile of a Distribution Network Using GA and SA

Most of the losses in a power system arise in the distribution sector, which has therefore always received attention. One of the important factors that increases losses in the distribution system is the presence of reactive power flows. The most common way to compensate reactive power in the system is to use shunt (parallel) capacitors. In addition to reducing losses, the advantages of capacitor placement include releasing network capacity at peak load and improving the voltage profile. The point to be considered in capacitor placement is the optimal location and sizing of the capacitors so as to maximize these advantages. In this paper, a new technique is offered for the placement and sizing of fixed capacitors in a radial distribution network on the basis of a Genetic Algorithm (GA). The existing optimization methods for capacitor placement mostly reduce the losses and improve the voltage profile simultaneously, but the capacitor cost and load changes have not been considered as influential terms in the objective function. In this article, a holistic approach is taken to the optimal solution of this problem, which includes all the relevant parameters of the distribution network: cost, bus voltage profile, and load changes. A vast search over all possible solutions is therefore required, so we use the Genetic Algorithm (GA) as a powerful method for this optimization.
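The hedged sketch below shows the general shape of such a GA for capacitor placement and sizing: a chromosome assigns one capacitor size to each candidate bus, and the fitness combines a loss term with an installation-cost term. The loss model is a toy stand-in for a load-flow calculation, and all bus data, capacitor sizes, and cost weights are invented, so this is only an illustration of the optimization loop, not the paper's formulation.

```python
import random

# Hypothetical data: candidate buses, allowed capacitor sizes (kvar),
# and a stand-in loss model; a real study would run a load flow here.
BUSES = list(range(10))
SIZES = [0, 150, 300, 450, 600]          # 0 means "no capacitor at this bus"
COST_PER_KVAR = 0.5                      # assumed installation-cost weight

def losses_kw(placement):
    """Placeholder for a power-flow based loss calculation (toy model)."""
    reactive_demand = [80, 60, 120, 90, 70, 110, 50, 95, 85, 100]
    return sum(((q - c) / 100.0) ** 2 for q, c in zip(reactive_demand, placement))

def fitness(placement):
    return losses_kw(placement) + COST_PER_KVAR * sum(placement) / 1000.0

def genetic_algorithm(pop_size=40, generations=200, mutation_rate=0.1):
    population = [[random.choice(SIZES) for _ in BUSES] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(BUSES))
            child = a[:cut] + b[cut:]                    # one-point crossover
            if random.random() < mutation_rate:          # mutation
                child[random.randrange(len(BUSES))] = random.choice(SIZES)
            children.append(child)
        population = parents + children
    return min(population, key=fitness)

best = genetic_algorithm()
print("capacitor sizes per bus:", best, "objective:", round(fitness(best), 3))
```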

Evaluation of Sensitometric Properties of Radiographic Films at Different Processing Solutions

The aim of this study was to compare the sensitometric properties of commonly used radiographic films processed with chemical solutions in hospitals with different workloads. The effect of different processing conditions on the densities induced on radiographic films was investigated. Two readily available double-emulsion films, Fuji and Kodak, were exposed using an 11-step wedge and processed with Champion and CPAC processing solutions. The films were processed in both high- and low-workload centers. Our findings show that the speed and contrast of the Kodak film-screen system were higher than those of the Fuji film-screen system in both workloads and for both processing solutions. However, there were significant differences in film contrast for both workloads when the CPAC solution was used (p<0.001 and p=0.028). The results showed that the base-plus-fog density of the Kodak film was lower than that of the Fuji film. In general, the Champion processing solution yielded higher speed and contrast for the investigated films under the different conditions, and there was a significant difference at the 95% confidence level between the two processing solutions (p=0.01). The low base-plus-fog density of the Kodak films provides greater visibility and accuracy, and the higher contrast allows lower exposure factors to be used to obtain better quality in the resulting radiographs. In this study we also found an economic advantage in using the Champion solution and Kodak film, while also lowering the patient dose. Thus, in a radiology facility, any change in the film processor, processing cycle, or chemistry should be carefully investigated before radiological procedures on patients are performed.
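For context, the sketch below shows one conventional way to derive base-plus-fog, a speed point, and the average gradient (contrast) from step-wedge characteristic-curve data; the specific net-density conventions used here are common choices and are assumed, not necessarily those applied in this study.

```python
import numpy as np

def sensitometry(log_exposure, density):
    """Estimate base-plus-fog, the speed point, and the average gradient
    (contrast) from a characteristic curve measured with a step wedge.
    Assumed conventions: speed taken at net density 1.0, average gradient
    between net densities 0.25 and 2.0; the density values must be sorted
    in increasing order of exposure."""
    log_exposure = np.asarray(log_exposure, dtype=float)
    density = np.asarray(density, dtype=float)
    base_fog = float(density.min())

    def log_e_at(net_density):
        # Interpolate the log-exposure that produces the requested density.
        return float(np.interp(base_fog + net_density, density, log_exposure))

    speed_log_e = log_e_at(1.0)
    avg_gradient = (2.0 - 0.25) / (log_e_at(2.0) - log_e_at(0.25))
    return base_fog, speed_log_e, avg_gradient
```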

Digital Hypertexts vs. Traditional Books: An Inquiry into Non-Linearity

The current study begins with the awareness that today's media environment is characterized by technological development and a new way of reading brought about by the introduction of the Internet. The researcher conducted a meta-analysis framed within technological determinism to investigate the process of hypertext reading, its differences from linear reading, and the effects such differences can have on people's ways of mentally structuring their world. The relationship between literacy and the comprehension achieved by reading hypertexts is also investigated. The results show that hypertexts are not always user friendly. People experience hyperlinks as interruptions that distract their attention, generating comprehension problems and disorientation. On the one hand, jumping between hypertext links generates interruptions that eventually make people lose their concentration. On the other hand, hypertexts fascinate people, who would rather read a document in such a format even though the outcome is often frustrating and affects their ability to elaborate and retain information.

Rational Structure of Panel with Curved Plywood Ribs

The optimization of rational geometrical and mechanical parameters of a panel with curved plywood ribs is considered in this paper. The panel consists of cylindrical ribs manufactured from Finnish plywood, upper and bottom plywood flanges, and stiffening diaphragms, and it is filled with foam. The minimal ratio of the structure's self-weight to the load that can be applied to the structure is taken as the rationality criterion. The optimization is performed using classical beam theory without nonlinearities, and the discrete design variables are optimized with a genetic algorithm.

Performance Evaluation of Complex Valued Neural Networks Using Various Error Functions

The backpropagation algorithm in general employs a quadratic error function; in fact, most problems that involve minimization employ the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved. The new error functions help in suppressing the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex-valued neural networks using different error functions. In the first simulation, for the complex XOR gate, it is observed that error functions such as the absolute error and the Cauchy error function can replace the quadratic error function. In the second simulation, it is observed that for some error functions the performance of the complex-valued neural network depends on the architecture of the network, whereas for a few other error functions the convergence speed of the network is independent of the architecture of the neural network.
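The sketch below lists common forms of the three error functions mentioned above, applied to complex-valued outputs through the residual magnitude; the exact definitions and scale parameters used in the paper may differ, and the toy targets are invented.

```python
import numpy as np

def quadratic_error(y_pred, y_true):
    e = y_pred - y_true
    return 0.5 * np.sum(np.abs(e) ** 2)

def absolute_error(y_pred, y_true):
    return np.sum(np.abs(y_pred - y_true))

def cauchy_error(y_pred, y_true, c=1.0):
    # Grows only logarithmically, so large (outlier) residuals are down-weighted.
    e = np.abs(y_pred - y_true)
    return np.sum(0.5 * c ** 2 * np.log1p((e / c) ** 2))

targets = np.array([1 + 1j, -1 - 1j])        # complex XOR style targets
outputs = np.array([0.9 + 1.1j, -0.7 - 0.9j])
for f in (quadratic_error, absolute_error, cauchy_error):
    print(f.__name__, float(f(outputs, targets)))
```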

A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions

In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique with direct use of real variables, was employed. However, since optimization via GAs is a long process and most of the time is consumed by the analysis, Radial Basis Function Neural Networks (RBFNN) were employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
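A minimal sketch of such a hybrid neuro-GA loop is shown below: a Gaussian RBF surrogate is fitted to a small sample of an analysis function, and a simple GA (selection plus Gaussian mutation, crossover omitted for brevity) then searches the cheap surrogate instead of the full analysis. The analysis function, the design variables, and all numerical settings are hypothetical stand-ins, not the CLT/Tsai-Hill model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the expensive CLT / Tsai-Hill analysis (hypothetical):
# maps four ply angles to a scalar objective to be minimized.
def expensive_analysis(x):
    return float(np.sum(np.sin(x) ** 2) + 0.1 * np.sum(x ** 2))

# --- Fit a Gaussian RBF surrogate on a small design of experiments ---
X = rng.uniform(-np.pi / 2, np.pi / 2, size=(40, 4))
y = np.array([expensive_analysis(x) for x in X])
width = 1.0

def phi(A, B):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

weights = np.linalg.solve(phi(X, X) + 1e-8 * np.eye(len(X)), y)

def surrogate(x):
    return float((phi(np.atleast_2d(x), X) @ weights)[0])

# --- GA searches the cheap surrogate instead of the full analysis ---
def ga(pop=30, gens=100):
    P = rng.uniform(-np.pi / 2, np.pi / 2, size=(pop, 4))
    for _ in range(gens):
        P = P[np.argsort([surrogate(x) for x in P])]     # rank by surrogate
        parents = P[: pop // 2]
        kids = parents[rng.integers(0, len(parents), pop - len(parents))]
        kids = kids + rng.normal(scale=0.05, size=kids.shape)   # mutation
        P = np.vstack([parents, kids])
    return min(P, key=surrogate)

best = ga()
print("surrogate optimum:", best, "checked with full analysis:", expensive_analysis(best))
```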

Generator of Hypotheses: An Approach to Data Mining Based on Monotone Systems Theory

The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very efficient algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).

Developing Cu-Mesoporous TiO2 Combined with Ozone Assistance and an Online-Regeneration System for Acid Odor Removal in All Weather

In this study, Cu-mesoporous TiO2 is developed for the removal of acid odor in combination with ozone assistance and an online-regeneration system, with and without UV irradiation (all weather). The results showed that Cu-mesoporous TiO2 presents desirable adsorption efficiency for acid odor without UV irradiation, owing to its larger surface area and pore size and the additional absorption ability provided by Cu. In the photocatalysis process, the material structure also helps Cu-mesoporous TiO2 achieve more outstanding efficiency in degrading acid odor. Cu also delays the recombination of the electron-hole pairs excited in TiO2, enhancing the photodegradation ability. Cu-mesoporous TiO2 gains a conspicuous increase in photocatalytic ability from ozone assistance, but no benefit in adsorption. In addition, the online regeneration procedure can treat the used Cu-mesoporous TiO2 to restore its adsorption ability and maintain its photodegradation performance, based on scrubbing, desorbing the acid odor, and reducing Cu to the metallic state.

Performance Analysis of List Scheduling in Heterogeneous Computing Systems

Given a parallel program to be executed on a heterogeneous computing system, the overall execution time of the program is determined by a schedule. In this paper, we analyze the worst-case performance of the list scheduling algorithm for scheduling tasks of a parallel program in a mixed-machine heterogeneous computing system such that the total execution time of the program is minimized. We prove tight lower and upper bounds for the worst-case performance ratio of the list scheduling algorithm. We also examine the average-case performance of the list scheduling algorithm. Our experimental data reveal that the average-case performance of the list scheduling algorithm is much better than the worst-case performance and is very close to optimal, except for large systems with large heterogeneity. Thus, the list scheduling algorithm is very useful in real applications.
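The sketch below illustrates the basic list scheduling rule analyzed in the paper, assigning each task in list order to the machine on which it would finish earliest in a mixed-machine system; for brevity it ignores the precedence constraints of a real parallel program, and the task times are invented.

```python
def list_schedule(task_times, priority_order=None):
    """Greedy list scheduling: walk the task list in priority order and
    put each task on the machine that can finish it earliest.
    task_times[i][j] = execution time of task i on machine j."""
    n_machines = len(task_times[0])
    finish = [0.0] * n_machines                 # current finish time per machine
    assignment = {}
    order = priority_order or range(len(task_times))
    for i in order:
        j = min(range(n_machines), key=lambda m: finish[m] + task_times[i][m])
        finish[j] += task_times[i][j]
        assignment[i] = j
    return assignment, max(finish)              # schedule and its makespan

# Three tasks on two heterogeneous machines (rows: tasks, columns: machines).
times = [[3.0, 6.0], [2.0, 1.0], [4.0, 5.0]]
print(list_schedule(times))
```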

An Efficient Biometric Cryptosystem using Autocorrelators

Cryptography provides a secure manner of information transmission over an insecure channel. It authenticates messages based on a key, not on the user, and it requires a lengthy key to encrypt and decrypt the messages that are sent and received. However, these keys can be guessed or cracked. Moreover, maintaining and sharing lengthy random keys during the enciphering and deciphering process is a critical problem in cryptography systems. A new approach is described for generating a crypto key acquired from a person's iris pattern. In biometrics, a template created by a biometric algorithm can only be authenticated by the same person. Among biometric templates, iris features can efficiently distinguish individuals and produce fewer false positives in a large population. This iris code distribution exhibits little intra-class variability, which helps the cryptosystem confidently decrypt messages with an exact match of the iris pattern. In the proposed approach, the iris features are extracted using multiresolution wavelets, producing a 135-bit iris code from each subject that is used for encrypting and decrypting messages. Autocorrelators are used to recall the original messages from the partially corrupted data produced by the decryption process. The approach is intended to resolve the repudiation and key management problems. Results were analyzed for both a conventional iris cryptography system (CIC) and a non-repudiation iris cryptography system (NRIC), showing that the new approach provides considerably high authentication in the enciphering and deciphering processes.
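One common realization of an autocorrelator is the Hopfield-style autoassociative memory sketched below, which restores a stored bipolar code from a partially corrupted probe; it is offered only as an illustration of the recall idea, with an invented code length, not as the authors' exact construction.

```python
import numpy as np

def train_autocorrelator(patterns):
    """Build an autocorrelation (Hopfield-style) matrix from bipolar
    patterns (+1/-1); the diagonal is zeroed out."""
    P = np.asarray(patterns, dtype=float)
    W = P.T @ P
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, iterations=10):
    """Iteratively clean up a partially corrupted bipolar vector."""
    x = np.asarray(probe, dtype=float)
    for _ in range(iterations):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return x

# Two stored 8-bit bipolar codes (toy stand-ins for iris-derived data).
stored = np.array([[1, -1, 1, -1, 1, 1, -1, -1],
                   [1, 1, -1, -1, 1, -1, 1, -1]])
W = train_autocorrelator(stored)
noisy = stored[0].copy()
noisy[0] *= -1                    # flip one bit to simulate corruption
print(recall(W, noisy))           # recovers the first stored pattern
```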