On Methodologies for Analysing Sickness Absence Data: An Insight into a New Method

Sickness absence represents a major economic and social issue. Analysis of sick leave data is a recurrent challenge for analysts because of the complexity of the data structure, which is often time-dependent, highly skewed and clumped at zero. Ignoring these features when making statistical inferences is likely to produce inefficient and misleading results. Traditional approaches do not address these problems. In this study, we discuss modelling methodologies, in terms of statistical techniques, for addressing the difficulties posed by sick leave data. We also introduce and demonstrate a new method by performing a longitudinal assessment of long-term absenteeism, using as a working example a large register dataset from the Helsinki Health Study on Finnish municipal employees over the period 1990-1999. We present a comparative study on model selection and a critical analysis of the temporal trends, occurrence and degree of long-term sickness absence among municipal employees. The strengths of this working example include the large sample size and long follow-up period, which provide strong evidence in support of the new model. Our main goal is to propose a way to select an appropriate model, to introduce a new methodology for analysing sickness absence data, and to demonstrate the model's applicability to complicated longitudinal data.
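As a concrete illustration of the kind of two-part handling that zero-clumped, skewed absence outcomes require, the hedged sketch below fits a simple hurdle-style analysis with scikit-learn: a logistic model for whether any absence occurs and a log-linear model for its extent among those with a positive duration. The variable names, predictors and simulated data are hypothetical placeholders, not the Helsinki Health Study variables or the authors' new method.

```python
# Hedged sketch of a two-part (hurdle-style) analysis for zero-clumped,
# skewed absence data. All variables and data are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression, LinearRegression

rng = np.random.default_rng(0)

# Hypothetical predictors: age and an occupational-class indicator.
n = 5000
X = np.column_stack([rng.normal(45, 10, n), rng.integers(0, 2, n)])

# Simulated outcome: many exact zeros, positive values highly right-skewed.
p_any = 1 / (1 + np.exp(-(-2.0 + 0.03 * (X[:, 0] - 45) + 0.5 * X[:, 1])))
any_absence = rng.random(n) < p_any
days = np.where(any_absence, rng.lognormal(mean=2.0, sigma=1.0, size=n), 0.0)

# Part 1: probability of any long-term absence (zero vs. positive).
occurrence = LogisticRegression(max_iter=1000).fit(X, (days > 0).astype(int))

# Part 2: extent of absence among those with a positive duration,
# modelled on the log scale to handle the heavy right skew.
pos = days > 0
extent = LinearRegression().fit(X[pos], np.log(days[pos]))

print("P(any absence) coefficients:", occurrence.coef_)
print("log-duration coefficients:  ", extent.coef_)
```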

The Influence of Preprocessing Parameters on Text Categorization

Text categorization (the assignment of texts in natural language into predefined categories) is an important and extensively studied problem in Machine Learning. Currently, popular techniques developed to deal with this task include many preprocessing and learning algorithms, many of which in turn require tuning nontrivial internal parameters. Although partial studies are available, many authors fail to report the parameter values they use in their experiments, or the reasons why these values were chosen over others. The goal of this work, therefore, is to provide a more thorough comparison of preprocessing parameters and their mutual influence, and to report interesting observations and results.
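As a hedged illustration of the kind of parameter grid such a comparison involves, the sketch below varies a few common preprocessing choices (case folding, stop-word removal, n-gram range, minimum document frequency) around a fixed linear classifier using scikit-learn; the specific parameter values and the 20 Newsgroups data are assumptions for illustration, not the paper's exact experimental setup.

```python
# Hedged sketch: cross-validated comparison of text-preprocessing parameters
# around a fixed linear classifier. The parameter grid is illustrative only.
from itertools import product

from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

data = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"])

grid = {
    "lowercase": [True, False],
    "stop_words": [None, "english"],
    "ngram_range": [(1, 1), (1, 2)],
    "min_df": [1, 5],
}

for values in product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    model = make_pipeline(TfidfVectorizer(**params), LinearSVC())
    score = cross_val_score(model, data.data, data.target, cv=3).mean()
    print(params, f"accuracy={score:.3f}")
```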

Modeling Concave Globoidal Cam with Swinging Roller Follower: A Case Study

This paper describes a computer-aided design process for a concave globoidal cam with cylindrical rollers and a swinging follower. Four models are built from the same input data using different modeling methods. The input data are the angular input and output displacements of the cam and the follower, together with other geometrical parameters of the globoidal cam mechanism. The best cam model is the one that shows no interference with the rollers when their motions are simulated under assembly conditions. The angular output displacement of the follower for the best cam is also compared with that in the input data to check for errors. In this study, Pro/ENGINEER® Wildfire 2.0 is used for modeling the cam, simulating motions, and checking interference and errors of the system.

Positive Periodic Solutions for a Predator-prey Model with Modified Leslie-Gower Holling-type II Schemes and a Deviating Argument

In this paper, by utilizing the coincidence degree theorem, a predator-prey model with modified Leslie-Gower and Holling-type II schemes and a deviating argument is studied. Some sufficient conditions are obtained for the existence of positive periodic solutions of the model.
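For readers unfamiliar with this class of systems, a commonly studied form of a predator-prey model with modified Leslie-Gower and Holling-type II schemes and a deviating argument τ > 0 is sketched below, with prey density x, predator density y, and positive (possibly periodic) coefficients r1, r2, a1, a2, b1 and constants k1, k2; the exact placement of the delay and the coefficient structure in the paper may differ, so this should be read as an assumption about the general setup rather than the authors' precise system.

```latex
% Hedged sketch: modified Leslie-Gower / Holling-type II predator-prey model
% with a deviating argument \tau in the predator equation (assumed placement).
\begin{aligned}
\frac{dx}{dt} &= x(t)\left( r_{1}(t) - b_{1}(t)\,x(t)
  - \frac{a_{1}(t)\,y(t)}{x(t) + k_{1}} \right),\\
\frac{dy}{dt} &= y(t)\left( r_{2}(t)
  - \frac{a_{2}(t)\,y(t)}{x(t-\tau) + k_{2}} \right).
\end{aligned}
```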

Efficient Detection Using Sequential Probability Ratio Test in Mobile Cognitive Radio Systems

This paper proposes a smart design strategy for a sequential detector to reliably detect the primary user's signal, especially in fast-fading environments. We study the computation of the log-likelihood ratio in order to cope with fast-changing received-signal and noise sample variances, which are treated as random variables. First, we analyze the detectability of the conventional generalized log-likelihood ratio (GLLR) scheme when the statistics of the unknown parameters change rapidly due to fast fading. Secondly, we propose an efficient sensing algorithm that performs the sequential probability ratio test in a robust and efficient manner when the channel statistics are unknown. Finally, the proposed scheme is compared with the conventional method through simulation, in terms of the average number of samples required to reach a detection decision.
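To make the sequential test concrete, the hedged sketch below implements Wald's sequential probability ratio test for a toy energy-detection setting in which received samples are zero-mean Gaussian with variance σ0² (noise only) or σ0² + σ1² (primary user present); the thresholds follow Wald's approximations from target false-alarm and missed-detection probabilities. The Gaussian model and all parameter values are assumptions for illustration, not the paper's GLLR scheme or fading model.

```python
# Hedged sketch of Wald's SPRT for detecting a Gaussian signal in Gaussian
# noise by comparing sample variances. Parameters are illustrative only.
import numpy as np
from scipy.stats import norm

def sprt(samples, sigma0, sigma1, alpha=0.01, beta=0.01):
    """Return (decision, n_used): 'H1' (signal present), 'H0', or None."""
    upper = np.log((1 - beta) / alpha)   # accept H1 above this threshold
    lower = np.log(beta / (1 - alpha))   # accept H0 below this threshold
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood ratio of N(0, sigma0^2 + sigma1^2) vs N(0, sigma0^2).
        llr += norm.logpdf(x, scale=np.sqrt(sigma0**2 + sigma1**2)) \
             - norm.logpdf(x, scale=sigma0)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return None, len(samples)

rng = np.random.default_rng(1)
noise_only = rng.normal(0.0, 1.0, size=10_000)
with_signal = rng.normal(0.0, np.sqrt(1.0 + 0.5), size=10_000)

print(sprt(noise_only, sigma0=1.0, sigma1=np.sqrt(0.5)))
print(sprt(with_signal, sigma0=1.0, sigma1=np.sqrt(0.5)))
```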

A Hybrid Approach for Quantification of Novelty in Rule Discovery

Rule discovery is an important technique for mining knowledge from large databases. The use of objective measures for discovering interesting rules leads to another data mining problem, although of reduced complexity. Data mining researchers have studied subjective measures of interestingness to reduce the volume of discovered rules and ultimately improve the overall efficiency of the KDD process. In this paper, we study the novelty of the discovered rules as a subjective measure of interestingness. We propose a hybrid approach that uses objective and subjective measures to quantify the novelty of the discovered rules in terms of their deviations from the known rules. We analyze the types of deviation that can arise between two rules and categorize the discovered rules according to a user-specified threshold. We implement the proposed framework and experiment with some public datasets, and the experimental results are quite promising.

AI Applications to Metal Stamping Die Design: A Review

Metal stamping die design is a complex, experience-based and time-consuming task. Various artificial intelligence (AI) techniques are being used by researchers worldwide for stamping die design in order to reduce the complexity, the dependence on human expertise and the time taken by the design process, as well as to improve design efficiency. In this paper, a comprehensive review of applications of AI techniques to manufacturability evaluation of sheet metal parts, die design and process planning of metal stamping dies is presented. Further, the salient features of major research work published in the area of metal stamping are presented in tabular form, and the scope of future research is identified.

Reliability Analysis of Underground Pipelines Using Subset Simulation

An advanced Monte Carlo simulation method, called Subset Simulation (SS), is presented in this paper for time-dependent reliability prediction of underground pipelines. SS provides better resolution at low failure-probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used to compute the probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process for underground pipeline network reliability prediction.
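To illustrate how a small failure probability is built up as a product of larger conditional probabilities, the hedged sketch below implements a basic Subset Simulation with a simple component-wise Metropolis resampling step in standard normal space; the toy limit-state function g, the level probability p0 = 0.1, the sample size and the proposal spread are all illustrative assumptions rather than the pipeline limit states or settings used in the paper.

```python
# Hedged sketch of Subset Simulation (SS) with a component-wise Metropolis
# step, for a toy limit-state function where g(x) <= 0 means failure.
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    # Toy limit state in standard normal space: failure when the mean of
    # the components exceeds a high threshold (a rare event).
    return 3.5 - x.mean(axis=-1)

def subset_simulation(g, dim, n=1000, p0=0.1, max_levels=10):
    nc = int(n * p0)                 # number of seeds kept per level
    x = rng.standard_normal((n, dim))
    pf = 1.0
    for _ in range(max_levels):
        gx = g(x)
        if (gx <= 0).mean() >= p0:   # enough true failures: stop
            return pf * (gx <= 0).mean()
        # Intermediate threshold: the p0-quantile of g defines the next subset.
        idx = np.argsort(gx)[:nc]
        threshold = gx[idx[-1]]
        pf *= p0
        # Regrow the population from the seeds with a component-wise
        # Metropolis step that stays inside {g <= threshold}.
        chains = [x[idx]]
        while sum(len(c) for c in chains) < n:
            current = chains[-1]
            proposal = current + 0.8 * rng.standard_normal(current.shape)
            # Standard-normal target, symmetric proposal: accept component-wise.
            log_ratio = 0.5 * (current**2 - proposal**2)
            accept = np.log(rng.random(current.shape)) < log_ratio
            candidate = np.where(accept, proposal, current)
            # Reject moves that leave the current intermediate failure region.
            keep = g(candidate) <= threshold
            candidate[~keep] = current[~keep]
            chains.append(candidate)
        x = np.vstack(chains)[:n]
    return pf

print("Estimated failure probability:", subset_simulation(g, dim=2))
```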

A Hybrid Approach for Color Image Quantization Using K-means and Firefly Algorithms

Color image quantization (CQ) is an important problem in computer graphics and image processing. The aim of quantization is to reduce the number of colors in an image with minimum distortion. Clustering is a widely used technique for color quantization, in which all colors in an image are grouped into a small number of clusters. In this paper, we propose a new hybrid approach for color quantization using the firefly algorithm (FA) and the K-means algorithm. The firefly algorithm is a swarm-based algorithm that can be used for solving optimization problems. The proposed method can overcome the drawbacks of both algorithms, such as the local-optima convergence problem of K-means and the premature convergence of the firefly algorithm. Experiments on three commonly used images show that the proposed algorithm surpasses both the baseline K-means clustering and the original firefly algorithm.
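As a hedged baseline illustration (the K-means stage only, without the firefly-based initialisation proposed in the paper), the sketch below quantizes an image to a small palette by clustering its pixels with scikit-learn's KMeans and replacing each pixel with its cluster centroid; the file name and palette size are placeholders.

```python
# Hedged sketch of the K-means baseline for color quantization: cluster the
# pixels and replace each one with its cluster centroid. The firefly-based
# stage proposed in the paper is not reproduced here.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

N_COLORS = 16                      # illustrative palette size
img = np.asarray(Image.open("lenna.png").convert("RGB"), dtype=np.float64)

pixels = img.reshape(-1, 3)
kmeans = KMeans(n_clusters=N_COLORS, n_init=10, random_state=0).fit(pixels)

# Map every pixel to its nearest centroid to build the quantized image.
quantized = kmeans.cluster_centers_[kmeans.labels_].reshape(img.shape)

# Mean squared error as a simple distortion measure.
mse = np.mean((img - quantized) ** 2)
print(f"MSE with {N_COLORS} colors: {mse:.2f}")
Image.fromarray(quantized.astype(np.uint8)).save("lenna_quantized.png")
```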

Mode III Interlaminar Fracture in Woven Glass/Epoxy Composite Laminates

In the present study, the fracture behavior of woven fabric-reinforced glass/epoxy composite laminates under mode III crack growth was experimentally investigated and numerically modeled. Two methods were used for the calculation of the strain energy release rate: the experimental compliance calibration (CC) method and the Virtual Crack Closure Technique (VCCT). To this end, the ECT (Edge Crack Torsion) test was used to evaluate fracture toughness under mode III loading (out-of-plane shear) at different crack lengths. Load-displacement curves and the associated energy release rates were obtained for the various cases of interest. To calculate the fracture toughness JIII, two criteria were considered, namely the non-linearity and maximum points of the load-displacement curve, and it is observed that JIII increases as the crack length increases. Both the experimental compliance method and the virtual crack closure technique proved applicable for the interpretation of the fracture mechanics data of woven glass/epoxy laminates in mode III.
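For reference, the compliance-calibration route to the strain energy release rate rests on the general Irwin-Kies relation, written below with P the applied load, b the specimen width and C = δ/P the measured compliance as a function of crack length a; the ECT-specific compliance fit used by the authors is not reproduced here.

```latex
% General compliance-calibration (Irwin-Kies) relation used to extract the
% mode III strain energy release rate from load-displacement data.
G_{III} = \frac{P^{2}}{2\,b}\,\frac{\mathrm{d}C}{\mathrm{d}a},
\qquad C(a) = \frac{\delta}{P}.
```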

Visualisation and Navigation in Large Scale P2P Service Networks

In Peer-to-Peer service networks, where peers offer any kind of publicly available services or applications, intuitive navigation through all services in the network becomes more difficult as the number of services increases. In this article, a concept is discussed that enables users to intuitively browse and use large-scale P2P service networks. The concept extends the idea of creating virtual 3D environments solely based on Peer-to-Peer technologies. Aside from browsing, users shall have the possibility to emphasize services of interest using their own semantic criteria. The appearance of the virtual world shall intuitively reflect network properties that may be of interest to the user. Additionally, the concept comprises options for load and traffic balancing. In this article, the requirements concerning the underlying infrastructure and the graphical user interface are defined. First impressions of the appearance of future systems are presented, and the next steps towards a prototypical implementation are discussed.

A Semi-Automatic Algorithm for Thai Spoonerism of Bi-Syllable Words

The purposes of this research are to study and develop an algorithm for Thai spoonerism of bi-syllable words using a semi-automatic computer program: in the input data, the syllables are already separated, and the developed algorithm then performs the spoonerism. The algorithm establishes rules and mechanisms for Thai spoonerism of bi-syllable words by analysing the elements of each syllable, namely the initial (cluster) consonant, vowel, intonation (tone) mark and final consonant. The study found that bi-syllable Thai spoonerism involves a single mechanism, namely the transposition of the vowel, intonation mark and final consonant between the two syllables while keeping each syllable's initial consonant or consonant cluster (if any). The rules and mechanisms identified in the study were then implemented as Thai spoonerism software written in PHP. A performance test showed that the program performs bi-syllable Thai spoonerism correctly for 99% of the words used in the test; the 1% of faults arise because the words obtained from spoonerism may not be spelled in conformity with Thai grammar, and because a Thai spoonerism may have more than one valid answer.
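To make the single transposition mechanism concrete, the hedged sketch below operates on syllables that are already decomposed into initial (cluster) consonant, vowel, tone mark and final consonant, as in the semi-automatic setting described above, and swaps everything except the initials between the two syllables. The field names and the romanised example are illustrative assumptions, and the sketch does not attempt the Thai spelling rules handled by the authors' PHP implementation.

```python
# Hedged sketch of the bi-syllable spoonerism mechanism: keep each syllable's
# initial consonant (or cluster) and swap the vowel, tone mark and final
# consonant between the two syllables. Field names are illustrative only.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Syllable:
    initial: str      # initial consonant or consonant cluster (kept in place)
    vowel: str        # vowel pattern (swapped)
    tone: str         # tone / intonation mark (swapped)
    final: str        # final consonant, may be empty (swapped)

def spoonerize(first: Syllable, second: Syllable) -> tuple[Syllable, Syllable]:
    """Transpose vowel, tone mark and final consonant of two syllables."""
    new_first = replace(first, vowel=second.vowel, tone=second.tone,
                        final=second.final)
    new_second = replace(second, vowel=first.vowel, tone=first.tone,
                         final=first.final)
    return new_first, new_second

# Romanised, purely illustrative example (not real Thai orthography).
a = Syllable(initial="kl", vowel="aa", tone="", final="ng")
b = Syllable(initial="d",  vowel="o",  tone="'", final="k")
print(spoonerize(a, b))
```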

Study of the Effect of Project Management on Manufacturing and Production Projects

In this article, the accumulated results on the effects and duration of manufacturing and production projects at the university and research level are examined together with a definition of the usefulness of the project management process, in order to arrive at a suitable pattern for the "time and action" stages. Studies show that many of the problems confronted by researchers in these projects are connected with the lack of: 1) autonomous timing for gathering the educational material, 2) autonomous timing for planning and design, presented before construction, and 3) autonomous timing for manufacturing and presenting a sample of the output. The results of this study indicate that every manufacturing and production project should be divided into three smaller autonomous projects, each with its own kind, budget and expenditure, and a defined shape and order of stages for managing these kinds of projects. In this case study, real results are compared with theoretical results.

Effects of Xylanase and Cellulase Production during Composting of EFB and POME using Fungi

Empty Fruit Bunches (EFB) and Palm Oil Mill Effluent (POME) are two main wastes from the oil palm industry that are rich in lignocellulose. Degradation of EFB and POME by microorganisms produces hydrolytic enzymes that degrade cellulose and hemicellulose during the composting process. However, normal composting takes about four to six months to reach maturity; hence, the application of fungi to the compost can shorten the composting period. This study identifies the effect of xylanase and cellulase produced by Aspergillus niger and Trichoderma virens on the composting of EFB and POME. The degradation of EFB and POME indicates the lignocellulolytic capacity of Aspergillus niger and Trichoderma virens, with more than a 7% decrease in hemicellulose and more than a 25% decrease in cellulose for both inoculated composts. Inoculation with Aspergillus niger and Trichoderma virens also increased the enzyme activities during the composting period by 21% for both xylanase and cellulase, compared to the control compost. A rapid rise in the activities of cellulase and xylanase was observed for Aspergillus niger, with the highest activities of 14.41 FPU/mg and 3.89 IU/mg, respectively. Increased activities of cellulase and xylanase also occurred with inoculation of Trichoderma virens, with the highest activities of 13.21 FPU/mg and 4.43 IU/mg, respectively. It is therefore evident that the inoculation of fungi can increase the enzyme activities and hence effectively degrade EFB and POME.

A Comparison of Marginal and Joint Generalized Quasi-likelihood Estimating Equations Based On the Com-Poisson GLM: Application to Car Breakdowns Data

In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdowns data from Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over one year. We note that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and the joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
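For completeness, the Conway-Maxwell-Poisson (Com-Poisson) probability mass function underlying the regression model is recalled below; ν > 1 corresponds to under-dispersion (consistent with the reported estimate of about 2.14), ν = 1 recovers the ordinary Poisson case, and ν < 1 gives over-dispersion.

```latex
% Com-Poisson probability mass function with rate parameter \lambda > 0 and
% dispersion parameter \nu \ge 0.
P(Y = y) = \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda, \nu)},
\qquad
Z(\lambda, \nu) = \sum_{j=0}^{\infty} \frac{\lambda^{j}}{(j!)^{\nu}},
\qquad y = 0, 1, 2, \dots
```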

Position Awareness Mechanisms for Wireless Sensor Networks

A wireless sensor network (WSN) consists of a set of battery-powered nodes that collaborate to perform sensing tasks in a given environment. Each node in a WSN should be capable of operating for long periods of time with little or no external management. One requirement for this independence is that, in the presence of adverse positions, the sensor nodes must be able to configure themselves. Hence, to determine the existence of unusual events in their surroundings, the nodes should make use of position awareness mechanisms. This work approaches the problem by considering the possible unusual events as diseases, thus making it possible to diagnose them through their symptoms, namely their side effects. Considering these awareness mechanisms as a foundation for high-level monitoring services, this paper also shows how they are included in the initial design of an intrusion detection system.

Rigorous Modeling of Fixed-Bed Reactors Containing Finite Hollow Cylindrical Catalyst with Michaelis-Menten Type of Kinetics

A large number of chemical, bio-chemical and pollution-control processes use heterogeneous fixed-bed reactors. The use of finite hollow cylindrical catalyst pellets can enhance conversion levels in such reactors. The absence of the pellet core can significantly lower the diffusional resistance associated with the solid phase. This leads to better utilization of the catalytic material, which is reflected in higher values of the effectiveness factor, leading ultimately to an enhanced conversion level in the reactor. It is, however, important to develop a rigorous heterogeneous model for the reactor incorporating the two-dimensional feature of the solid phase owing to the presence of the finite hollow cylindrical catalyst pellet. Presently, heterogeneous models reported in the literature invariably employ one-dimensional solid-phase models meant for spherical catalyst pellets. The objective of the paper is to present a rigorous model of fixed-bed reactors containing finite hollow cylindrical catalyst pellets. The reaction kinetics considered here is the widely used Michaelis-Menten kinetics for liquid-phase bio-chemical reactions. The reaction parameters used are those for the enzymatic degradation of urea. Results indicate that increasing the height-to-diameter ratio helps to improve the conversion level. On the other hand, decreasing the wall thickness is apparently not as effective. This can however be explained in terms of the higher void fraction of the bed, which causes a smaller amount of the solid phase to be packed in the fixed-bed bio-chemical reactor.
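For reference, the pellet-scale balance for a finite hollow cylindrical pellet couples two-dimensional diffusion (radial coordinate r, axial coordinate z) with the Michaelis-Menten rate; a hedged sketch of the governing steady-state equation, written in terms of an effective diffusivity D_e, substrate concentration S, and the usual kinetic constants V_max and K_m, is given below. The exact boundary conditions and non-dimensionalisation used by the authors are not reproduced.

```latex
% Two-dimensional diffusion-reaction balance in a finite hollow cylindrical
% pellet with Michaelis-Menten kinetics (steady state).
D_{e}\left[ \frac{1}{r}\frac{\partial}{\partial r}\!\left( r\,\frac{\partial S}{\partial r} \right)
+ \frac{\partial^{2} S}{\partial z^{2}} \right]
= \frac{V_{\max}\, S}{K_{m} + S}.
```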

A Single-chip Proportional to Absolute Temperature Sensor Using CMOS Technology

Nowadays, it is a trend for electronic circuit designers to integrate all system components on a single chip. This paper proposes the design of a single-chip proportional-to-absolute-temperature (PTAT) sensor, including a voltage reference circuit, using CEDEC 0.18 µm CMOS technology. It is a challenge to design a single-chip, wide-range, linear-response temperature sensor for many applications. The channel widths of the compensation transistor and the reference transistor are critical for the design of the PTAT temperature sensor circuit. The designed temperature sensor shows excellent linearity between -100 °C and 200 °C, and the sensitivity is about 0.05 mV/°C. The chip is designed to operate from a single voltage source of 1.6 V.

The Use of the Hormone Auxin in Different Growth Periods and Its Effect on Yield Components of Vetch

The trial was conducted in Omidiyeh, a city located 170 kilometers from the Iranian city of Ahvaz. The main factor in this project comprised four levels: control (without hormone) and application of the hormone at the seed, vegetative and flowering stages, respectively. The sub-plots comprised three vetch varieties, identified by local names, so that the effect of auxin applied at different vegetative and reproductive times on the different vetch varieties was investigated. The experiment was laid out in a randomized complete block design with four replications. In order to study the effects of the hormone auxin applied at the growth stages (seed, vegetative and flowering), compared with the control (no auxin), on the three local vetch varieties, plant height, number of pods per plant, number of seeds per pod, number of seeds per plant, grain weight, grain yield, plant dry weight and protein content were measured. Among the vetch varieties, there were significant differences at the 1% level in plant height, number of pods per plant, seeds per plant, grain weight, grain yield, plant dry weight and protein content, and at the 5% level in the number of seeds per pod. For the interactions, significant differences were found at the 1% level for seed yield per plant, grain yield and protein content, and at the 5% level for the number of seeds per pod and seed weight, while no significant interaction effect was found for plant height and plant dry weight.