Design and Manufacture of Non-Contact Moving Load for Experimental Analysis of Beams

Dynamic tests are an important step in the design of engineering structures, because they allow the accuracy of theoretical and numerical predictions to be assessed. In experimental studies of moving loads, a major research topic, the load is usually modeled as a simple moving mass or a small vehicle. This paper deals with the applicability of a Non-contact Moving Load (NML) for vibration analysis. For this purpose, an experimental set-up is designed to generate different types of NML, including constant and harmonic loads. The proposed method relies on pressurized air, which is useful especially when dealing with fragile or sensitive structures. To demonstrate the performance of this system, the set-up is employed for a modal analysis of a beam and for detecting a crack in the beam. The obtained results indicate that the experimental NML set-up can be an attractive alternative for moving load problems.

Influence of Overfeeding on Productive Performance Traits, Foie Gras Production, Blood Parameters, Internal Organs, Carcass Traits, and Mortality Rate in Two Breeds of Ducks

A total of 60 male mule ducks and 60 male Muscovy ducks were allotted into three groups (n = 20) to estimate the effects of overfeeding (two and four meals) versus ad libitum feeding on productive performance traits, foie gras production, internal organs, and blood parameters. The results show that force-feeding four meals significantly increased (P < 0.01) body weight, weight gain, and gain percentage compared to force-feeding two meals. Both force-feeding regimes (two or four meals) induced significantly higher body weight, weight gain, gain percentage, and absolute carcass weight than ad libitum feeding; however, carcass percentage was significantly higher under ad libitum feeding. Mule ducks had significantly higher weight gain and weight gain percentages than Muscovy ducks. Feed consumption per kilogram of foie gras and per kilogram of weight gain was lower for the four-meal than for the two-meal forced feeding regime. Force-feeding four meals induced significantly higher liver weight and percentage (488.96 ± 25.78 g, 7.82 ± 0.40%) than force-feeding two meals (381.98 ± 13.60 g, 6.42 ± 0.21%). Moreover, feed conversion was significantly higher under forced feeding than under ad libitum feeding (77.65 ± 3.41 g, 1.72 ± 0.05%; P < 0.01). Forced feeding (two or four meals) increased all organ weights (intestine, proventriculus, heart, spleen, and pancreas) over ad libitum weights, except for the gizzard; however, intestinal and abdominal fat values were higher for four-meal than for two-meal forced feeding. Overfeeding did not change blood parameters significantly compared to ad libitum feeding; however, four-meal forced feeding improved foie gras quality, since it significantly increased the percentage of grade A foie gras (62.5%) at the expense of grades B (33.33%) and C (4.17%) compared with two-meal forced feeding. The mortality percentage among Muscovy ducks during the forced feeding period was 22.5%, compared to 0% in mule ducks.
Liver weight was highly significantly correlated with live weight after overfeeding and with certain blood plasma traits.

Robust Coherent Noise Suppression by Point Estimation of the Cauchy Location Parameter

This paper introduces a new point estimation algorithm, with particular focus on coherent noise suppression, given several measurements of the device under test, where it is assumed that 1) the noise is first-order stationary and 2) the device under test is linear and time-invariant. The algorithm exploits the robustness of the Pitman estimator of the Cauchy location parameter through the initial scaling of the test signal by a centred Gaussian variable of predetermined variance. Mathematical derivations and simulation results illustrate that the proposed algorithm is more accurate and more consistently robust to outliers, across density functions with different tail behaviour, than the conventional methods of the sample mean (coherent averaging technique) and the sample median.
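The robustness claim can be illustrated with a small simulation. The sketch below is a minimal, hedged stand-in, not the paper's algorithm: the Pitman location estimate under a standard Cauchy model is approximated by numerical integration on a grid around the sample median (the grid width, grid size and sample size are illustrative choices), and its output is compared with the sample mean and median.

```python
import math
import random
import statistics

def cauchy_pitman(xs, half_width=25.0, n_grid=1001):
    """Approximate Pitman location estimate under a standard Cauchy model:
    the posterior mean of theta under a flat prior, computed by numerical
    integration on a grid centred at the sample median."""
    m = statistics.median(xs)
    step = 2.0 * half_width / (n_grid - 1)
    thetas = [m - half_width + i * step for i in range(n_grid)]
    # log-likelihoods, shifted by their maximum to avoid underflow
    lls = [-sum(math.log1p((x - t) ** 2) for x in xs) for t in thetas]
    ll_max = max(lls)
    ws = [math.exp(ll - ll_max) for ll in lls]
    return sum(t * w for t, w in zip(thetas, ws)) / sum(ws)

random.seed(0)
true_loc = 3.0
# standard Cauchy samples via the inverse CDF: loc + tan(pi * (u - 1/2))
xs = [true_loc + math.tan(math.pi * (random.random() - 0.5)) for _ in range(200)]
print(statistics.mean(xs))    # heavy tails: the mean can be arbitrarily far off
print(statistics.median(xs))  # robust
print(cauchy_pitman(xs))      # robust, typically somewhat more efficient
```

On Cauchy data the sample mean does not even converge, while the median and the Pitman-type estimate both stay close to the true location, which is the behaviour the paper exploits.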

Shot Transition Detection with Minimal Decoding of MPEG Video Streams

Digital libraries are becoming increasingly necessary to support users with powerful, easy-to-use tools for searching, browsing and retrieving media information. The starting point for these tasks is the segmentation of video content into shots. To segment MPEG video streams into shots, this study develops a fully automatic procedure that detects both abrupt and gradual transitions (dissolve and fade groups) in real time with minimal decoding. Each transition type is handled in two phases: analysis of macro-block types in B-frames, followed by on-demand intensity information analysis. The experimental results show remarkable performance in detecting gradual transitions for some kinds of input data and comparable results for the rest of the examined video streams. Almost all abrupt transitions could be detected with very few false positive alarms.

Children and Advertising: Issues in the Consumer Socialization Process

Today advertising is actively penetrating many spheres of our lives; we cannot imagine the existence of many economic activities, above all trade and services, without it. Every one of us should look more closely at everyday communication and carefully consider the amount and quality of the information we receive, as well as its influence on our behaviour. Special attention should be paid to the young generation. Theoretical and practical research has demonstrated the ever-growing influence of information (especially that contained in advertising) on society: on its economics, culture, religion, politics and even on people's private lives and behaviour. Children have plenty of free time and therefore see a great deal of advertising. Though the education of children is in the hands of parents and schools, the makers and commissioners of advertising should think responsibly about the selection of time slots and transmission channels for child-targeted advertising. The purpose of the present paper is to investigate the influence of advertising on the consumer views and behaviour of children in different age groups. The investigation clarifies the influence of advertising, as a means of information, on the group that is most vulnerable in the modern information society: children. In this paper we assess children's perception and understanding of advertising.

A Novel Fuzzy-Neural Based Medical Diagnosis System

In this paper, the application of artificial neural networks to typical disease diagnosis is investigated. The actual procedure of medical diagnosis usually employed by physicians was analyzed and converted into a machine-implementable format. After selecting symptoms of eight different diseases, a data set containing the information of a few hundred cases was configured and applied to an MLP neural network. The results of the experiments, and the advantages of using a fuzzy approach, are discussed as well. The outcomes suggest the role of effective symptom selection and the advantages of data fuzzification in a neural-network-based automatic medical diagnosis system.
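The data fuzzification step credited above can be sketched with a standard triangular membership function. The symptom (body temperature), the grade labels and the breakpoints below are hypothetical illustrations, not the paper's actual encoding; the point is the kind of [0, 1] inputs an MLP then consumes.

```python
def triangular(x, a, b, c):
    """Triangular membership: 0 outside [a, c], rising to a peak of 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# hypothetical fuzzification of a crisp temperature reading into fever grades
temperature = 38.2
fever = {
    "none": triangular(temperature, 35.0, 36.8, 37.5),
    "mild": triangular(temperature, 37.0, 38.0, 39.0),
    "high": triangular(temperature, 38.5, 40.0, 42.0),
}
print(fever)  # graded memberships rather than a single crisp symptom value
```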

How Can Celebrities Be Used in Advertising to the Best Advantage?

The ever-increasing product diversity and competition on the market of goods and services has dictated the pace of growth in the number of advertisements. Despite their admittedly diminished effectiveness over recent years, advertisements remain the favored method of sales promotion. Consequently, the challenge for an advertiser is to explore every possible avenue for making an advertisement more noticeable, attractive and compelling for consumers. One way to achieve this is through celebrity endorsements. However, the use of a celebrity to endorse a product involves substantial costs and still does not guarantee the success of an advertisement. The question of how celebrities can be used in advertising to the best advantage is therefore of utmost importance. Celebrity endorsements have become commonplace: empirical evidence indicates that approximately 20 to 25 per cent of advertisements feature some famous person as a product endorser. The popularity of celebrity endorsements demonstrates the relevance of the topic, especially in the context of the current global economic downturn, when companies are forced to save in order to survive, yet simultaneously to invest heavily in advertising and sales promotion. The issue of the effective use of celebrity endorsements also figures prominently in the academic discourse. The study presented below is thus aimed at exploring which qualities (characteristics) of a celebrity endorser have an impact on the effectiveness of the advertisement in which he/she appears, and how.

Web Pages Aesthetic Evaluation Using Low-Level Visual Features

Web sites are rapidly becoming the preferred media choice for daily tasks such as information search, company presentation, shopping, and so on. At the same time, we live in a period in which visual appearance plays an increasingly important role in daily life. In spite of designers' efforts to develop web sites that are both user-friendly and attractive, it is difficult to guarantee the outcome's aesthetic quality, since visual appearance is a matter of individual perception and opinion. In this study, we attempt to develop an automatic system for the aesthetic evaluation of web pages, the building blocks of web sites. Based on image processing techniques and artificial neural networks, the proposed method categorizes an input web page according to its visual appearance and aesthetic quality. The employed features are multiscale/multidirectional textural and perceptual color properties of the web pages, fed to a perceptron ANN trained as the evaluator. The method is tested on university web sites, and the results suggest that it performs well in web page aesthetic evaluation tasks, with around 90% correct categorization.

Variance Based Component Analysis for Texture Segmentation

This paper presents a comparative analysis of a new unsupervised PCA-based technique for steel plate texture segmentation aimed at defect detection. The proposed scheme, called Variance Based Component Analysis (VBCA), employs PCA for feature extraction, applies a feature reduction algorithm based on the variance of eigenpictures, and classifies the pixels as defective or normal. While classic PCA uses a clusterer such as k-means for pixel clustering, VBCA employs thresholding and some post-processing operations to label pixels as defective or normal. The experimental results show that VBCA is 12.46% more accurate and 78.85% faster than classic PCA.
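The thresholding idea that replaces k-means can be sketched in a few lines. This is a hedged toy version, not the paper's VBCA: features here are two-dimensional synthetic vectors, the leading eigenvector of the covariance matrix is found by power iteration, and pixels whose projection magnitude exceeds the mean magnitude are labelled defective.

```python
import math
import random

def principal_axis(data, iters=100):
    """Mean vector and leading covariance eigenvector via power iteration."""
    n, d = len(data), len(data[0])
    means = [sum(row[i] for row in data) / n for i in range(d)]
    centred = [[row[i] - means[i] for i in range(d)] for row in data]
    v = [1.0] * d  # fixed start; fine as long as it is not orthogonal to the axis
    for _ in range(iters):
        # apply the covariance implicitly: C v = X^T (X v) / n
        proj = [sum(r[i] * v[i] for i in range(d)) for r in centred]
        v = [sum(p * r[i] for p, r in zip(proj, centred)) / n for i in range(d)]
        norm = math.sqrt(sum(c * c for c in v))
        v = [c / norm for c in v]
    return means, v

def label_pixels(data):
    """Project each feature vector on the principal axis and threshold the
    score magnitude at its mean: far-from-typical pixels become 'defective'."""
    means, v = principal_axis(data)
    scores = [sum((row[i] - means[i]) * v[i] for i in range(len(v))) for row in data]
    threshold = sum(abs(s) for s in scores) / len(scores)
    return [abs(s) > threshold for s in scores]

random.seed(1)
normal = [[1 + random.gauss(0, 0.1), 1 + random.gauss(0, 0.1)] for _ in range(90)]
defect = [[10 + random.gauss(0, 0.1), 10 + random.gauss(0, 0.1)] for _ in range(10)]
labels = label_pixels(normal + defect)
print(sum(labels))  # count of pixels labelled defective
```

The appeal of this route over k-means, as the abstract argues, is that it needs no iterative clustering at classification time: one projection and one comparison per pixel.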

How Are Equalities, Strong or Weak, Defined on a Multiple Algebra?

For the purpose of finding the quotient structure of multiple algebras such as groups, Abelian groups, rings and loops, we state the concepts of strong and weak equalities of the multiple functions defined on multiple algebras. This leads us to investigate how equalities (strong or weak) defined on a multiple algebra behave on the quotients obtained from it.

Total Petroleum Hydrocarbon Contamination in Sediment and Wastewater from the Imam Khomeini and Razi Petrochemical Companies, Iran

The present study was performed in Musa Bay (in the northern part of the Persian Gulf), around the coastal area of the Bandar-e-Imam Khomeini and Razi petrochemical companies. Sediment and effluent samples were collected from the selected stations from June 2009 to June 2010 and analyzed to determine the degree of hydrocarbon contamination. The average TPH concentration in the study area exceeded the natural background value at all stations, especially at station BI1, the main effluent outlet of the Bandar-e-Imam Khomeini petrochemical company. The concentration of total petroleum hydrocarbons was also monitored in the effluents of the aforementioned companies; the results showed that the TPH concentration in the effluents of the Bandar-e-Imam Khomeini petrochemical company was greater than in those of the Razi petrochemical company, which may be related to the former's products (aromatics, polymers, chemicals, fuel).

A Norm-based Approach for Profiling Business Knowledge

Knowledge is a key asset for any organisation seeking to sustain competitive advantage, but it is difficult to identify and represent the knowledge needed to perform activities in business processes. Effective knowledge management and support for relevant business activities have a major impact on the performance of the organisation as a whole, because knowledge has the functions of directing, coordinating and controlling actions within business processes. The study introduces organisational morphology, a norm-based approach applying semiotic theories, which emphasises the representation of knowledge in norms. This approach is concerned with the classification of activities into three categories: substantive, communication and control activities. All activities are directed by norms; hence three types of norms exist, each associated with a category of activities. The paper describes the approach briefly and illustrates its application through a case study of academic activities in higher education institutions. The results show that the approach provides an effective way to profile business knowledge, and that the profile enables the understanding and specification of the business requirements of an organisation.

Analysis and Prototyping of Biological Systems: The Abstract Biological Process Model

The aim of a biological model is to understand the integrated structure and behavior of complex biological systems as a function of the underlying molecular networks, in order to simulate and forecast their operation. Although several approaches have been introduced to take into account structural and environment-related features, relatively little attention has been given to representing the behavior of biological systems. The Abstract Biological Process (ABP) model illustrated in this paper is an object-oriented model based on UML, the standard object-oriented language. Its main objective is to bring into focus the functional aspects of the biological system under analysis.

Ecological Risk Assessment of Polycyclic Aromatic Hydrocarbons in the Northwest of the Persian Gulf

This study investigated the presence of polycyclic aromatic hydrocarbons (PAHs) in the sediments of Musa Bay (around the PETZONE coastal area) from February 2010 to June 2010. Concentrations of PAHs in the Musa Bay sediments ranged from 537.89 to 26,659.06 ng/g dry weight, with a mean value of 3990.74 ng/g. The highest concentration was observed at station 4, located near the aromatic effluent outlet of the Bandar-e-Imam Khomeini petrochemical company (station 4: BI-PC aromatic effluent outlet), where the level exceeded the NOAA sediment quality guideline value (ERL = 4022 ng/g dry weight). The mean PAH concentration over the whole study area still met the ERL guideline; however, according to the mean PEL quotient (0.1 < PELq = 0.24 < 0.5), slightly adverse biological effects are associated with exposure to the PAH levels in the study area.
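The PELq figure is a mean PEL quotient, which follows a standard formula sketched below; the concentrations and PEL values in the example are illustrative placeholders, not the study's measurements.

```python
def mean_pel_quotient(concentrations, pels):
    """Mean PEL quotient: the average of C_i / PEL_i over the measured
    compounds. Values between 0.1 and 0.5 are conventionally read as
    slightly adverse biological effects."""
    return sum(c / p for c, p in zip(concentrations, pels)) / len(concentrations)

# hypothetical PAH concentrations (ng/g) against hypothetical PEL values
q = mean_pel_quotient([120.0, 90.0, 400.0], [600.0, 300.0, 1000.0])
print(q)
```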

Evolutionary Eigenspace Learning using CCIPCA and IPCA for Face Recognition

Traditional principal component analysis (PCA) techniques for face recognition are based on batch-mode training using a pre-available image set. Real-world applications require the training set to be dynamic and evolving: within a framework of continuous learning, new training images are continuously added to the original set, which would trigger a costly re-computation of the eigenspace representation by repeating an entire batch-based training over the old and new images. Incremental PCA methods instead allow new images to be added and the PCA representation to be updated. In this paper, two incremental PCA approaches, CCIPCA and IPCA, are examined and compared. In addition, different learning and testing strategies are proposed and applied to the two algorithms. The results suggest that batch PCA is inferior to both incremental approaches, and that the CCIPCA variants are practically equivalent.
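The flavour of a covariance-free incremental update can be sketched for the leading eigenvector. This follows the amnesic CCIPCA update of Weng et al., restricted to one component for brevity; the data stream and parameter values are illustrative, and a zero-mean stream is assumed.

```python
import math
import random

def ccipca_first_component(samples, amnesia=2.0):
    """One-component CCIPCA: incrementally track the leading eigenvector v
    of a zero-mean stream via v <- w_old*v + w_new*(x . v/||v||)*x,
    with amnesic weights w_old = (n-1-l)/n and w_new = (1+l)/n."""
    v = None
    for n, x in enumerate(samples, start=1):
        if v is None:
            v = list(x)  # initialise with the first observation
            continue
        norm = math.sqrt(sum(c * c for c in v))
        coef = sum(xi * vi for xi, vi in zip(x, v)) / norm
        w_old = (n - 1 - amnesia) / n
        w_new = (1 + amnesia) / n
        v = [w_old * vi + w_new * coef * xi for xi, vi in zip(x, v)]
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]

random.seed(2)
# zero-mean stream whose variance is dominated by the first axis
stream = [(random.gauss(0, 3.0), random.gauss(0, 0.3)) for _ in range(3000)]
v = ccipca_first_component(stream)
print(v)  # close to (+/-1, 0), the true principal axis
```

Each arriving image updates the eigenvector in O(d) time with no covariance matrix stored, which is precisely what avoids the batch re-computation described above.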

Addressing Scalability Issues of Named Entity Recognition Using Multi-Class Support Vector Machines

This paper explores the scalability issues associated with solving the Named Entity Recognition (NER) problem using Support Vector Machines (SVM) and high-dimensional features. The performance results of a set of experiments conducted using binary and multi-class SVM with increasing training data sizes are examined. The NER domain chosen for these experiments is the biomedical publications domain, selected for its importance and inherent challenges. A simple machine learning approach is used that eliminates prior language knowledge such as part-of-speech or noun phrase tagging, thereby allowing for applicability across languages. No domain-specific knowledge is included. The accuracy measures achieved are comparable to those obtained using more complex approaches, which motivates investigating ways to improve the scalability of multi-class SVM in order to make the solution more practical and usable. Improving the training time of multi-class SVM would make support vector machines a more viable machine learning solution for real-world problems with large datasets. An initial prototype greatly improves training time at the expense of memory requirements.

Appreciating, Interpreting and Understanding Posters via Levels of Visual Literacy

This study was conducted in Malaysia to discover how meaning and appreciation were construed among 35 Form Five students. Panofsky's theory was employed to discover the students' levels of reasoning when various types of posters were displayed. The independent variables were posters carrying explicit and implicit meanings; the moderating variable was the students' visual literacy level, while the dependent variable was the implicit interpretation level. One-way ANOVA was applied for the data analysis. The data showed that before the students were exposed to Panofsky's theory, there were differences in thinking between boys, who did not think abstractly or implicitly, and girls. The study showed that students' visual literacy with posters depended on the use of visual texts and illustration. The paper further discusses how posters with text only tend to be too abstract, as opposed to posters combining visuals and text.

Texture Feature Extraction using Slant-Hadamard Transform

The classification of random and natural textures is still one of the biggest challenges in the field of image processing and pattern recognition. In this paper, texture feature extraction using the Slant-Hadamard Transform (SHT) was studied and compared to other signal-processing-based texture classification schemes. A parametric SHT was also introduced and employed for natural texture feature extraction. We show that a subtly modified parametric SHT can outperform the ordinary Walsh-Hadamard transform and the discrete cosine transform. Experiments were carried out on a subset of VisTex random natural texture images using a kNN classifier.
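The paper's parametric SHT is not reproduced here, but the ordinary Walsh-Hadamard transform it is compared against can be sketched; texture features are then typically energies of transform coefficients. The in-place butterfly below assumes the input length is a power of two, and the band-energy feature is an illustrative simplification.

```python
def fwht(signal):
    """Fast Walsh-Hadamard transform (unnormalised); length must be a power of 2."""
    a = list(signal)
    h = 1
    while h < len(a):
        for i in range(0, len(a), h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y  # butterfly: sum and difference
        h *= 2
    return a

def energy_features(signal, n_bands=2):
    """Toy texture feature vector: mean squared coefficient per spectral band."""
    coeffs = fwht(signal)
    band = len(coeffs) // n_bands
    return [sum(c * c for c in coeffs[k * band:(k + 1) * band]) / band
            for k in range(n_bands)]

print(fwht([1, 1, 1, 1]))  # a constant block concentrates in the DC coefficient
print(fwht([1, 0, 0, 0]))  # an impulse spreads evenly over all coefficients
```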

Geochemical Assessment of Heavy Metals Concentration in Surface Sediment of West Port, Malaysia

One year (November 2009 to October 2010) of sediment monitoring was used to evaluate the pollution status, concentration and distribution of heavy metals (As, Cu, Cd, Cr, Hg, Ni, Pb and Zn) in the West Port of Malaysia. Sediment samples were collected from nine stations every four months. The geo-accumulation index and the Pollution Load Index (PLI) were estimated to better understand the pollution level in the study area. The heavy metal concentrations (µg/g dry weight) ranged from 20.2 to 162 for As, 7.4 to 27.6 for Cu, 0.244 to 3.53 for Cd, 11.5 to 61.5 for Cr, 0.11 to 0.409 for Hg, 7.2 to 22.2 for Ni, 22.3 to 80 for Pb and 23 to 98.3 for Zn. In general, the concentrations of some metals (As, Cd, Hg and Pb) were higher than background values, which is a serious concern for aquatic life and human health.
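The geo-accumulation index and the PLI follow standard formulas, sketched below; the concentrations and background values in the example are illustrative, not the study's data.

```python
import math

def igeo(concentration, background):
    """Muller geo-accumulation index: Igeo = log2(C / (1.5 * B)),
    where the factor 1.5 absorbs natural background fluctuation."""
    return math.log2(concentration / (1.5 * background))

def pli(concentrations, backgrounds):
    """Pollution Load Index: the geometric mean of the contamination
    factors CF_i = C_i / B_i across the measured metals."""
    cfs = [c / b for c, b in zip(concentrations, backgrounds)]
    return math.prod(cfs) ** (1.0 / len(cfs))

# hypothetical station: two metals with illustrative background values
print(igeo(12.0, 2.0))              # log2(12 / 3) = 2.0
print(pli([4.0, 9.0], [2.0, 3.0]))  # geometric mean of CFs 2 and 3
```

A PLI above 1 is conventionally read as an overall polluted site, which is the kind of summary judgement the index is estimated for here.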

Equalities in a Variety of Multiple Algebras

The purpose of this research is to study the concepts of the multiple Cartesian product and the variety of multiple algebras, and to present some examples. In the theory of multiple algebras, as in other theories, deriving new objects and concepts from those available in the context is important; for example, the first multiple algebras were obtained from the quotient of a group modulo the equivalence relation defined by a subgroup of it. Grätzer showed that every multiple algebra can be obtained from the quotient of a universal algebra modulo a given equivalence relation. This study examines multiple algebras and the basic relations defined on them, and introduces some algebraic structures derived from multiple algebras: submultiple algebras, quotients of multiple algebras and the Cartesian product of multiple algebras.