Effect of Cold, Warm or Contrast Therapy on Controlling Knee Osteoarthritis Associated Problems

Osteoarthritis (OA) is the most prevalent and debilitating form of arthritis and can be defined as a degenerative condition affecting the synovial joints. Patients suffering from osteoarthritis often complain of a dull, aching pain on movement. Physical agents such as heat or cold therapy can counter the painful process when correctly indicated and used. Aim: This study was carried out to compare the effect of cold, warm and contrast therapy on controlling knee osteoarthritis associated problems. Setting: The study was carried out in the orthopedic outpatient clinics of Menoufia University and Teaching Hospitals, Egypt. Sample: A convenience sample of 60 adult patients with unilateral knee osteoarthritis. Tools: Three tools were utilized to collect the data. Tool I: An interviewing questionnaire comprising three parts covering sociodemographic data, medical data and adverse effects of the treatment protocol. Tool II: The Knee Injury and Osteoarthritis Outcome Score (KOOS), which consists of five main parts. Tool III: A 0-10 numeric pain rating scale. Results: The total knee symptom score decreased from moderate symptoms pre-intervention to mild symptoms after the warm and contrast methods of therapy, but contrast therapy had a significantly greater effect in reducing knee symptoms and pain than the other methods. Conclusions: All three methods of therapy resulted in improvement in all knee symptoms and pain, but the most appropriate treatment protocol for relieving symptoms and pain was contrast therapy.

Codebook Generation for Vector Quantization on Orthogonal Polynomials based Transform Coding

In this paper, a new codebook generation algorithm is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process, the feature vectors are subjected to inverse transformation using the basis functions of the proposed Orthogonal Polynomials based transformation to recover the approximated input image training vectors. The results of the proposed coding are compared with VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm results in a considerable reduction in computation time and provides better reconstructed picture quality.
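The following is a minimal, hypothetical sketch of the binary-tree codebook idea described in the abstract; the split rule used here (largest-variance feature, split at its median) is an assumption for illustration and not the authors' construction.

```python
import numpy as np

def build_tree(vectors, depth, max_depth):
    """Recursively split feature vectors into a binary-tree codebook."""
    if depth == max_depth or len(vectors) <= 1:
        return {"codevector": vectors.mean(axis=0)}          # leaf: centroid of the cell
    # illustrative split rule: highest-variance feature, threshold at its median
    feature = int(np.argmax(vectors.var(axis=0)))
    threshold = float(np.median(vectors[:, feature]))
    left = vectors[vectors[:, feature] <= threshold]
    right = vectors[vectors[:, feature] > threshold]
    if len(left) == 0 or len(right) == 0:
        return {"codevector": vectors.mean(axis=0)}
    return {"feature": feature, "threshold": threshold,
            "left": build_tree(left, depth + 1, max_depth),
            "right": build_tree(right, depth + 1, max_depth)}

def encode(tree, v):
    """Descend the tree by single-feature comparisons and return the codevector."""
    while "codevector" not in tree:
        tree = tree["left"] if v[tree["feature"]] <= tree["threshold"] else tree["right"]
    return tree["codevector"]

# toy usage: 4x4 image blocks flattened into 16-dimensional training vectors
training = np.random.rand(1024, 16)
tree = build_tree(training, 0, max_depth=6)   # at most 2**6 codevectors
codevector = encode(tree, training[0])
```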

Overview of CARDIOSENSOR Project on the Development of a Nanosensor for Assessing the Risk of Cardiovascular Disease

This paper overviews the topics of a research project (CARDIOSENSOR) in the field of health sciences (biomaterials and biomedical engineering). The project focuses on the development of a nanosensor for assessing the risk of cardiovascular disease by monitoring C-reactive protein (CRP), which is currently considered the best validated inflammatory biomarker associated with cardiovascular disease. The project involves tasks such as: 1) the development of sensor devices based on field effect transistors (FET): assembly, optimization and validation; 2) application of the sensors to the detection of CRP in standard solutions and comparison with enzyme-linked immunosorbent assay (ELISA); and 3) application of the sensors to real samples such as blood and saliva and evaluation of their ability to predict the risk of cardiovascular disease.

A Scheme of Model Verification of the Concurrent Discrete Wavelet Transform (DWT) for Image Compression

The scientific community has invested a great deal of effort in the field of the discrete wavelet transform over the last few decades. The discrete wavelet transform (DWT) combined with vector quantization has proved to be a very useful tool for image compression. However, the DWT is a computationally intensive process, requiring innovative and computationally efficient methods to achieve image compression. Concurrent transformation of the image can be an important solution to this problem. This paper proposes a model of a concurrent DWT for image compression. Additionally, formal verification of the model has been performed. Here the Symbolic Model Verifier (SMV) has been used as the formal verification tool: the system has been modeled in SMV and several properties have been verified formally.
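As background for the transform being modeled, here is a minimal single-level 2-D Haar DWT sketch (an illustration only; the paper's contribution is the SMV model and its formal verification, not this code). The row pass and the column pass operate on each row and each column independently, which is what makes a concurrent implementation attractive.

```python
import numpy as np

def haar_1d(x):
    """Single-level 1-D Haar transform: pairwise averages (approximation) and differences (detail)."""
    x = x.reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2)
    return np.concatenate([approx, detail])

def haar_2d(img):
    """Single-level 2-D Haar DWT: transform every row, then every column.
    Each row (and each column) can be processed independently, i.e. concurrently."""
    rows = np.apply_along_axis(haar_1d, 1, img)
    return np.apply_along_axis(haar_1d, 0, rows)

image = np.random.rand(8, 8)    # toy "image" with even dimensions
coeffs = haar_2d(image)         # the four quadrants hold the LL, LH, HL, HH subbands
```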

Trade Openness and Its Effects on Economic Growth in Selected South Asian Countries: A Panel Data Study

The study investigates the causal link between trade openness and economic growth for four South Asian countries over the periods 1972-1985 and 1986-2007 to examine the scenario before and after the implementation of SAARC. Panel cointegration and FMOLS techniques are employed for short-run and long-run estimates. In 1972-85, short-run unidirectional causality from GDP to openness is found, whereas in 1986-2007 there is bi-directional causality between GDP and openness. The long-run elasticity between GDP and openness has a negative sign in 1972-85, showing a negative long-run relationship, while in 1986-2007 it has a positive sign, indicating positive causation between GDP and openness. It can therefore be concluded that the overall situation of the selected countries improved after the implementation of SAARC. The coefficient of the error-correction term also suggests that short-run disequilibria are corrected by adjustment back to the long-run equilibrium.
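The error-correction reading in the last sentence can be illustrated with a toy two-step error-correction regression on synthetic data; this single-series sketch is an assumption-laden illustration only, not the panel cointegration/FMOLS estimation used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200
openness = np.cumsum(rng.normal(size=T))              # synthetic I(1) regressor
gdp = 0.5 * openness + rng.normal(scale=0.3, size=T)  # cointegrated with openness

# Step 1: long-run (cointegrating) regression, gdp_t = a + b*openness_t + e_t
X = np.column_stack([np.ones(T), openness])
a, b = np.linalg.lstsq(X, gdp, rcond=None)[0]
ect = gdp - (a + b * openness)                        # error-correction term

# Step 2: short-run dynamics, d(gdp)_t = c + g*d(openness)_t + phi*ect_{t-1} + u_t
d_gdp, d_open = np.diff(gdp), np.diff(openness)
Z = np.column_stack([np.ones(T - 1), d_open, ect[:-1]])
c, g, phi = np.linalg.lstsq(Z, d_gdp, rcond=None)[0]
# a negative phi means deviations from the long-run relation are corrected over time
print(f"long-run elasticity b = {b:.2f}, adjustment coefficient phi = {phi:.2f}")
```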

The Effect of Canard Configurations on the Aerodynamics of the Blended Wing Body

The aerodynamic characteristics of a blended-wing body (BWB) aircraft were obtained in the Universiti Teknologi MARA low speed wind tunnel. The scaled-down BWB model included a canard as its horizontal stabilizer. Four canards with different aspect ratios were used in the experiments, with canard setting angles varied from -20° to 20°. All tests were conducted at a velocity of 35 m/s, corresponding to a Mach number of 0.1. At low angles of attack, the increase in lift slope for the various canard aspect ratios is small and almost constant. A higher canard aspect ratio causes higher drag. However, the canard has a strong effect on the moment at zero lift, CM,0. Flow visualization using mini tufts was performed to observe the airflow over the upper surface of the canard. Keywords: aerodynamics, blended-wing body, canard, wind tunnel.

Thermal and Electrical Properties of Carbon Nanotubes Purified by Acid Digestion

Carbon nanotubes (CNTs) possess unique structural, mechanical, thermal and electronic properties and have been proposed for applications in many fields. However, to reach the full potential of CNTs, many problems still need to be solved, including the development of an easy and effective purification procedure, since as-synthesized CNTs contain impurities such as amorphous carbon, carbon nanoparticles and metal particles. Different purification methods yield different CNT characteristics and may be suitable for the production of different types of CNTs. In this study, the effect of different purification chemicals on carbon nanotube quality was investigated. CNTs were first synthesized by chemical vapor deposition (CVD) of acetylene (C2H2) on a magnesium oxide (MgO) powder impregnated with an iron nitrate (Fe(NO3)3·9H2O) solution. The synthesis parameters were a synthesis temperature of 800°C, an iron content in the precursor of 5% and a synthesis time of 30 min. The liquid-phase oxidation method was applied for the purification of the synthesized CNT materials. Three different acids (HNO3, H2SO4, and HCl) were used to remove the metal catalysts from the synthesized CNT material in order to investigate the possible effect of each acid solution on the purification step. Purification experiments were carried out at two different temperatures (75 and 120°C), two different acid concentrations (3 and 6 M) and three different time intervals (6, 8 and 15 h). A 30% H2O2 : 3 M HCl (1:1 v%) solution was also used in the purification step to remove both the metal catalysts and the amorphous carbon; purifications using this solution were performed at 75°C for 8 hours. Purification efficiencies under the different conditions were evaluated by thermogravimetric analysis. Thermal and electrical properties of the CNTs were also determined. The electrical conductivity values obtained for the carbon nanotubes were typical of organic semiconductor materials, and the thermal stability varied depending on the purification chemicals.

Source of Oseltamivir Resistance Due to R152K Mutation of Influenza B Virus Neuraminidase: Molecular Modeling

The influenza B virus causes epidemics every 2-3 years. Neuraminidase (NA) is an important target for influenza drug design. Although oseltamivir, an oral neuraminidase inhibitor, has shown good inhibitory efficiency against wild-type influenza B virus, lower susceptibility of the R152K mutant has been reported. A better understanding of oseltamivir efficiency against the wild-type influenza B NA and resistance of the R152K mutant could be useful for rational drug design. Here, two complex systems, wild-type and R152K NA with oseltamivir bound, were studied using molecular dynamics (MD) simulations. Based on the 5-ns MD simulations, the loss of a notable hydrogen bond and the decrease in the per-residue decomposition energy contributed to the drug by the mutated residue K152, compared with those of R152 in the wild type, were found to be a primary source of the high-level oseltamivir resistance caused by the R152K mutation.

Digital Paradoxes in Learning Theories

When a learning theory tries to borrow from science a framework on which to found its method, it exhibits paradoxes and paralysing contradictions. These result, on the one hand, from adopting a learning/teaching model as if it were a mere "transfer of data" (the mechanical learning approach) and, on the other hand, from borrowing complexity theory (an indeterministic and non-linear model), which risks nullifying every educational effort. This work aims to describe the existing criticism, unveiling the antinomic nature of such paradoxes and focusing on a view in which neither the mechanical learning perspective nor the chaotic and non-linear model can threaten or jeopardize the educational work. The author intends to retrace the steps that led to these paradoxes and to unveil their antinomic nature. This could also help explain some current misunderstandings about the real usefulness of ICT in young people's learning process and growth.

Examining the Value of Attribute Scores for Author-Supplied Keyphrases in Automatic Keyphrase Extraction

Automatic keyphrase extraction is useful for efficiently locating specific documents in online databases. While several techniques have been introduced over the years, improvement in accuracy has been minimal. This research examines attribute scores for author-supplied keyphrases to better understand how these scores affect the accuracy of automatic keyphrase extraction. Five attributes are chosen for examination: Term Frequency, First Occurrence, Last Occurrence, Phrase Position in Sentences, and Term Cohesion Degree. The results show that First Occurrence is the most reliable attribute. Term Frequency, Last Occurrence and Term Cohesion Degree display a wide range of variation but are still usable with suggested tweaks. Only Phrase Position in Sentences shows a totally unpredictable pattern. The results imply that the commonly used ranking approach, which directly extracts the top-ranked phrases from the candidate keyphrase list as the keyphrases, may not be reliable.
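As an illustration, the first three attributes could be computed for a candidate phrase roughly as follows; the normalizations here are simplifying assumptions, not the exact attribute definitions examined in the study.

```python
def phrase_attributes(document: str, phrase: str):
    """Compute simple attribute scores for one candidate keyphrase."""
    doc = document.lower()
    phrase = phrase.lower()
    length = max(len(doc), 1)

    tf = doc.count(phrase)                # Term Frequency: occurrences in the document
    first = doc.find(phrase) / length     # First Occurrence: normalised position (negative if absent)
    last = doc.rfind(phrase) / length     # Last Occurrence: normalised position of the last match
    return {"term_frequency": tf, "first_occurrence": first, "last_occurrence": last}

doc = "Keyphrase extraction locates keyphrases in documents. Keyphrase attributes matter."
print(phrase_attributes(doc, "keyphrase"))
```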

A Novel Pareto-Based Meta-Heuristic Algorithm to Optimize Multi-Facility Location-Allocation Problem

This article proposes a novel Pareto-based multi-objective meta-heuristic algorithm, the non-dominated ranking genetic algorithm (NRGA), to solve the multi-facility location-allocation problem. In NRGA, a fitness value representing rank is assigned to each individual of the population. Moreover, two ranked-based roulette wheel selections are utilized: one to select a front and another to choose solutions from that front. The proposed solution methodology is validated using several examples taken from the specialized literature. The performance of our approach shows that the NRGA is able to generate true and well-distributed Pareto-optimal solutions.
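A minimal sketch of the two ranked-based roulette wheel selections mentioned above, first choosing a front and then a solution within it; the linear rank weights are an illustrative assumption, not necessarily the exact NRGA operators.

```python
import random

def ranked_roulette_pick(n_items):
    """Pick an index in [0, n_items) with probability proportional to its rank,
    where better (earlier) items receive larger weights: n, n-1, ..., 1."""
    weights = [n_items - i for i in range(n_items)]
    return random.choices(range(n_items), weights=weights, k=1)[0]

def nrga_select(fronts):
    """fronts: list of non-dominated fronts, best front first; each front is a list of solutions."""
    f = ranked_roulette_pick(len(fronts))      # first roulette: choose a front
    s = ranked_roulette_pick(len(fronts[f]))   # second roulette: choose a solution in that front
    return fronts[f][s]

# toy usage: three fronts of candidate solutions
fronts = [["A1", "A2"], ["B1", "B2", "B3"], ["C1"]]
parent = nrga_select(fronts)
```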

Acute Coronary Syndrome Prediction Using Data Mining Techniques: An Application

In this paper we use data mining techniques to investigate factors that contribute significantly to increasing the risk of acute coronary syndrome. The dependent variable is the diagnosis, a dichotomous variable indicating the presence or absence of the disease. We have applied binary regression to the factors affecting the dependent variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. There are sixteen variables in total, of which one is the dependent variable and the other fifteen are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis were applied. Based on the results of data reduction, only fourteen of the sixteen factors were considered.
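A minimal sketch of the pipeline described above, principal component analysis for data reduction followed by a binary (logistic) regression, using scikit-learn on synthetic placeholder data rather than the study's hospital data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 15))                # 15 independent risk factors (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)  # diagnosis: 0/1

# standardise, keep 14 principal components, then fit a binary logistic regression
model = make_pipeline(StandardScaler(), PCA(n_components=14), LogisticRegression())
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```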

Assessing the Value of Virtual Worlds for Post-Secondary Instructors: A Survey of Innovators, Early Adopters and the Early Majority in Second Life

The purpose of this study was to assess the value of Second Life among post-secondary instructors with experience using Second Life as an educational tool. Using Everett Rogers's diffusion of innovations theory, survey respondents (N = 162) were divided into three adopter categories: innovators, early adopters and the early majority. Respondents were from 15 countries and 25 academic disciplines, indicating the considerable potential this innovation has to be adopted across many different borders and in many areas of academe. Nearly 94% of respondents said they plan to use Second Life again as an educational tool. However, no significant differences were found across adopter categories in instructors' levels of satisfaction with Second Life as an educational tool or its perceived effect on student learning. On the other hand, instructors who conducted class fully in Second Life were significantly more satisfied than those who used Second Life as only a small supplement to a real-world class. Overall, personal interest factors, rather than interpersonal communication factors, most influenced respondents' decision to adopt Second Life as an educational tool. In light of these findings, theoretical implications are discussed and practical suggestions are provided.

Preparation of Tender for Building Conservation Work: Current Practices in Malaysia

Building conservation work generally involves complex and non-standard work that differs from new building construction processes. In preparing tenders for building conservation projects, therefore, the quantity surveyor must carefully consider the specificity of non-standard items and demarcate the scope of unique conservation work. While the quantity surveyor must appreciate the full range of works to prepare a good tender document, he typically manages many unfamiliar elements, including practical construction methods, restoration techniques and work sequences. Only by fulfilling the demanding requirements of building conservation work can the quantity surveyor enhance his professionalism in an area of growing cultural value and economic importance. By discussing several issues crucial to tender preparation for building conservation projects in Malaysia, this paper seeks a deeper understanding of how quantity surveying can better standardize tender preparation work and more successfully manage building conservation processes.

Comparison of FAHP and TOPSIS for Evacuation Capability Assessment of High-rise Buildings

Many computer-based methods have been developed to assess the evacuation capability (EC) of high-rise buildings. Because such software is time-consuming to run and not well suited to on-scene applications, we adopted two methods, the fuzzy analytic hierarchy process (FAHP) and the technique for order preference by similarity to an ideal solution (TOPSIS), for EC assessment of a high-rise building in Jinan. The EC scores obtained with the two methods and the evacuation times acquired with Pathfinder 2009 for floors 47-60 of the building were compared with each other. The results show that FAHP performs better than TOPSIS for EC assessment of high-rise buildings, especially in dealing with the effect of occupant type and distance to exit on EC, tackling complex problems with a multi-level structure of criteria, and requiring less computation. However, both FAHP and TOPSIS failed to appropriately handle the situation where the exit width changes while occupants are few.
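For reference, TOPSIS itself reduces to a few array operations; the following sketch uses an illustrative decision matrix and weights, not the criteria or building data from the study.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; weights sum to 1; benefit[j] is True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)        # vector-normalise each criterion
    weighted = norm * weights
    ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
    anti = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))
    d_best = np.linalg.norm(weighted - ideal, axis=1)
    d_worst = np.linalg.norm(weighted - anti, axis=1)
    return d_worst / (d_best + d_worst)                   # relative closeness to the ideal solution

# toy usage: 3 alternatives evaluated on 3 criteria (the third criterion is a cost)
scores = topsis(np.array([[7.0, 9.0, 9.0], [8.0, 7.0, 8.0], [9.0, 6.0, 8.0]]),
                weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([True, True, False]))
```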

Influence of Hydraulic Hysteresis on Effective Stress in Unsaturated Clay

A comprehensive program of laboratory testing on a compacted kaolin in a modified triaxial cell was performed to investigate the influence of hydraulic hysteresis on effective stress in unsaturated soils. Test data are presented for a range of constant-suction shear tests along wetting and drying paths. The values of the effective stress parameter χ at different matric suctions were determined using the test results, and the effect of the hydraulic hysteresis phenomenon on the effective stress was observed. The values of the effective stress parameter χ obtained from the experiments were compared with those obtained from expressions proposed in the literature.
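For context, the effective stress parameter χ referred to above enters through Bishop's effective stress expression for unsaturated soils, a standard formulation reproduced here:

```latex
% Bishop's effective stress for unsaturated soil:
% \sigma' = effective stress, \sigma = total stress,
% u_a = pore-air pressure, u_w = pore-water pressure,
% (u_a - u_w) = matric suction, \chi = effective stress parameter
% (\chi = 0 for dry soil, \chi = 1 for saturated soil)
\sigma' = (\sigma - u_a) + \chi \, (u_a - u_w)
```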

Feature Selection Methods for an Improved SVM Classifier

Text categorization is the problem of classifying text documents into a set of predefined classes. After a preprocessing step, the documents are typically represented as large sparse vectors. When training classifiers on large collections of documents, both the time and memory restrictions can be quite prohibitive. This justifies the application of feature selection methods to reduce the dimensionality of the document-representation vector. In this paper, three feature selection methods are evaluated: Random Selection, Information Gain (IG) and Support Vector Machine feature selection (called SVM_FS). We show that the best results were obtained with the SVM_FS method for a relatively small dimension of the feature vector. We also present a novel method to better correlate the SVM kernel's parameters (polynomial or Gaussian kernel).
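A minimal sketch of Information Gain scoring for binary term features; this is a simplified illustration under the assumption of presence/absence features and is not the exact IG or SVM_FS procedure evaluated in the paper.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """IG of a binary feature (term present/absent) with respect to the class labels."""
    gain = entropy(labels)
    for value in (0, 1):
        mask = feature == value
        if mask.any():
            gain -= mask.mean() * entropy(labels[mask])
    return gain

# toy usage: 6 documents, 3 binary term features, 2 classes
X = np.array([[1, 0, 1], [1, 1, 0], [0, 0, 1], [0, 1, 1], [1, 0, 0], [0, 1, 0]])
y = np.array([1, 1, 0, 0, 1, 0])
scores = [information_gain(X[:, j], y) for j in range(X.shape[1])]
top_features = np.argsort(scores)[::-1]   # keep the highest-scoring features
```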

Mathematical Modeling to Predict Surface Roughness in CNC Milling

Surface roughness (Ra) is one of the most important requirements in the machining process. In order to obtain better surface roughness, the proper setting of cutting parameters is crucial before the process takes place. This research presents the development of a mathematical model for surface roughness prediction before milling in order to evaluate the fitness of the machining parameters: spindle speed, feed rate and depth of cut. 84 samples were run in this study using a FANUC α-T14iE CNC milling machine. These samples were randomly divided into two data sets: the training set (m=60) and the testing set (m=24). ANOVA showed that at least one of the population regression coefficients was not zero. The multiple regression method was used to determine the correlation between a criterion variable and a combination of predictor variables. It was established that the surface roughness is most influenced by the feed rate. Using the multiple regression equation, the average percentage deviation was 9.8% for the testing set and 9.7% for the training set. This showed that the statistical model could predict the surface roughness with about 90.2% accuracy for the testing data set and 90.3% accuracy for the training data set.
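A minimal sketch of fitting such a multiple regression model and computing the average percentage deviation, using synthetic placeholder data rather than the study's 84 machined samples.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 84
spindle_speed = rng.uniform(1000, 5000, n)     # rpm (synthetic ranges)
feed_rate = rng.uniform(100, 400, n)           # mm/min
depth_of_cut = rng.uniform(0.2, 2.0, n)        # mm
ra = (0.8 + 0.002 * feed_rate - 0.0001 * spindle_speed + 0.1 * depth_of_cut
      + rng.normal(scale=0.05, size=n))        # synthetic surface roughness (um)

# fit Ra = b0 + b1*speed + b2*feed + b3*depth by least squares
X = np.column_stack([np.ones(n), spindle_speed, feed_rate, depth_of_cut])
coeffs, *_ = np.linalg.lstsq(X, ra, rcond=None)
predicted = X @ coeffs
ape = np.abs((ra - predicted) / ra) * 100      # percentage deviation per sample
print("coefficients:", coeffs, " mean % deviation:", ape.mean())
```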

Induction Motor Speed Control Using Fuzzy Logic Controller

Because of their low maintenance requirements and robustness, induction motors have many applications in industry. Speed control of the induction motor is important for achieving maximum torque and efficiency. Various speed control techniques, such as Direct Torque Control, Sensorless Vector Control and Field Oriented Control, are discussed in this paper. A soft computing technique, fuzzy logic, is applied for the speed control of the induction motor to achieve maximum torque with minimum loss. The fuzzy logic controller is implemented using the Field Oriented Control technique, as it provides better control of motor torque with high dynamic performance. The motor model is designed and the membership functions are chosen according to the parameters of the motor model. The simulated design is tested using various toolboxes in MATLAB. The results show that the efficiency and reliability of the proposed speed controller are good.
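A minimal sketch of the fuzzy inference step behind such a controller, using triangular membership functions over the speed error and a three-rule base; the membership ranges and rules are illustrative assumptions, not the controller designed in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_torque_command(error):
    """Map the speed error (rad/s) to a normalised torque command in [-1, 1]."""
    # fuzzify the error into three sets: Negative, Zero, Positive (assumed ranges)
    mu = {"N": tri(error, -100, -50, 0),
          "Z": tri(error, -50, 0, 50),
          "P": tri(error, 0, 50, 100)}
    # rule base: N -> decrease torque (-1), Z -> hold (0), P -> increase torque (+1)
    outputs = {"N": -1.0, "Z": 0.0, "P": 1.0}
    num = sum(mu[k] * outputs[k] for k in mu)
    den = sum(mu.values())
    return num / den if den > 0 else 0.0       # weighted-average defuzzification

print(fuzzy_torque_command(30.0))   # positive speed error -> positive torque command
```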

Computational Intelligence Hybrid Learning Approach to Time Series Forecasting

Time series forecasting is an important and widely studied topic in the research of system modeling. This paper describes how to apply the hybrid PSO-RLSE neuro-fuzzy learning approach to the problem of time series forecasting. The PSO algorithm is used to update the premise parameters of the proposed prediction system, and the RLSE is used to update the consequent parameters. Thanks to the hybrid learning (HL) approach for the neuro-fuzzy system, the prediction performance is excellent and the learning convergence is much faster than in the compared approaches. In the experiments, we use the well-known Mackey-Glass chaotic time series. According to the experimental results, the prediction performance and accuracy in time series forecasting by the proposed approach are much better than those of the compared approaches, as shown in Table IV. Excellent prediction performance by the proposed approach has been observed.
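A minimal sketch of two ingredients named above, generation of the Mackey-Glass series and a recursive least-squares (RLSE) update for linear consequent parameters; the lag choices and settings are assumptions for illustration, not the full PSO-RLSE neuro-fuzzy system.

```python
import numpy as np

def mackey_glass(n, tau=17, beta=0.2, gamma=0.1, p=10, dt=1.0, x0=1.2):
    """Generate the Mackey-Glass chaotic series with a simple Euler scheme."""
    x = np.full(n + tau, x0)
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + dt * (beta * x[t - tau] / (1 + x[t - tau] ** p) - gamma * x[t])
    return x[tau:]

def rlse_update(theta, P, phi, y):
    """One recursive least-squares step: theta holds the linear consequent parameters."""
    k = P @ phi / (1.0 + phi @ P @ phi)        # gain vector
    theta = theta + k * (y - phi @ theta)      # correct parameters by the prediction error
    P = P - np.outer(k, phi @ P)               # update the inverse-covariance estimate
    return theta, P

series = mackey_glass(1000)
theta, P = np.zeros(4), np.eye(4) * 1e4
for t in range(18, 999):
    phi = series[[t - 18, t - 12, t - 6, t]]   # commonly used Mackey-Glass input lags
    theta, P = rlse_update(theta, P, phi, series[t + 1])
```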