Consistent Modeling of Functional Dependencies along with World Knowledge

In this paper we propose a method for vision systems to consistently represent functional dependencies between different visual routines along with relational short- and long-term knowledge about the world. Here the visual routines are bound to visual properties of objects stored in the memory of the system. Furthermore, the functional dependencies between the visual routines are represented as a graph that also belongs to the object's structure. This graph is parsed in the course of acquiring a visual property of an object to automatically resolve the dependencies of the bound visual routines. Using this representation, the system is able to dynamically rearrange the processing order while preserving its functionality. Additionally, the system is able to estimate the overall computational cost of a given action. We also show that the system can efficiently use this structure to incorporate already acquired knowledge and thus reduce the computational demand.
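
When such a dependency graph is parsed, acquiring a property amounts to a depth-first resolution in which properties already stored in memory short-circuit the recursion. A minimal sketch; the property names, the `acquire` signature and the call bookkeeping are illustrative assumptions, not the paper's API:

```python
def acquire(prop, deps, compute, cache):
    """Resolve a visual property by parsing its dependency graph
    depth-first; properties already held in memory short-circuit the
    recursion, so acquired knowledge reduces computational demand.
    (Sketch only: names and structure are assumptions.)"""
    if prop in cache:
        return cache[prop]              # reuse stored knowledge
    args = [acquire(d, deps, compute, cache) for d in deps.get(prop, [])]
    cache[prop] = compute[prop](*args)  # run the bound visual routine
    return cache[prop]

# hypothetical routines: pose needs a segmentation, which needs an image
calls = []
compute = {
    "image":        lambda: calls.append("image") or "img",
    "segmentation": lambda img: calls.append("seg") or "seg",
    "pose":         lambda seg: calls.append("pose") or "pose",
}
deps = {"pose": ["segmentation"], "segmentation": ["image"]}
```

A second call to `acquire` for the same property returns from the cache without re-running any routine, which is how already acquired knowledge cuts the processing cost.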

Analysis of Socio-Cultural Obstacles to the Dissemination of Nanotechnology from the Perspective of Iran's Agricultural Experts

The main purpose of this research was to analyze the socio-cultural obstacles to disseminating nanotechnology in Iran's agricultural sector. One hundred twenty-eight out of a total of 190 researchers with different levels of expertise in and familiarity with nanotechnology were randomly selected and completed questionnaires. Face validity was established through experts' suggestions and corrections, and reliability was assessed using Cronbach's alpha. A factor analysis showed the variance explained by each factor: cultural factors 19.475 percent, management 13.139 percent, information 11.277 percent, production 9.703 percent, social factors 9.267 percent, and attitude 8.947 percent. The results also indicated that socio-cultural factors were the most important obstacle to nanotechnology dissemination in Iran's agricultural sector.

Instructional Design Practitioners in Malaysia: Skills and Issues

The purpose of this research is to determine the knowledge and skills possessed by instructional design (ID) practitioners in Malaysia. As ID is a relatively new field in the country and there seems to be an absence of any studies on its community of practice, the main objective of this research is to discover the tasks and activities performed by ID practitioners in educational and corporate organizations, as suggested by the International Board of Standards for Training, Performance and Instruction. This includes finding out the ID models applied in the course of their work. This research also attempts to identify the barriers and issues explaining why some ID tasks and activities are rarely or never conducted. The methodology employed in this descriptive study was a survey questionnaire sent to 30 instructional designers nationwide. The results showed that the majority of the tasks and activities are carried out frequently enough, but omissions do occur for reasons such as being outside the job scope, the decision having already been made at a higher level, and a lack of knowledge and skills. Further qualitative investigation should be conducted to achieve a more in-depth understanding of ID practices in Malaysia.

An Unstructured Finite-Volume Technique for Shallow-Water Flows with Wetting and Drying Fronts

An unstructured finite-volume numerical model is presented here for simulating shallow-water flows with wetting and drying fronts. The model is based on Green's theorem in combination with Chorin's projection method. A second-order upwind scheme coupled with a least-squares technique is used to handle the convection terms. A wetting and drying treatment is used in the present model to ensure total mass conservation. To test its capability and reliability, the present model is used to solve the parabolic bowl problem. We compare our numerical solutions with the corresponding analytical and existing standard numerical results. Excellent agreement is found in all cases.

Cloning and Expression of D-Threonine Aldolase from Ensifer arboris NBRC100383

D-erythro-cyclohexylserine (D-erythro-CHS) is a chiral unnatural β-hydroxy amino acid expected to be useful in the synthesis of drugs for AIDS treatment. To develop a continuous bioconversion system with a whole-cell biocatalyst of D-threonine aldolase (D-TA) for D-erythro-CHS production, the D-threonine aldolase gene was amplified from Ensifer arboris NBRC100383 by direct PCR amplification using two degenerate oligonucleotide primers designed based on the genomic sequence of Sinorhizobium meliloti. Sequence analysis of the cloned DNA fragment revealed one open reading frame of 1059 bp and 386 amino acids. This putative D-TA gene was cloned into the pET21(a) vector using the NdeI and EcoRI sites (pEnsi-DTA[1], retaining the His-tag sequence) or the NdeI and BamHI sites (pEnsi-DTA[2]). The cloned gene was expressed at a much higher level in E. coli BL21(DE3) transformed with pEnsi-DTA[1] than in E. coli BL21(DE3) transformed with pEnsi-DTA[2]. When the cells expressing the wild-type enzyme were used for the D-TA enzyme activity assay, 12 mM glycine was detected by HPLC analysis. Moreover, whole cells harbouring the recombinant D-TA were able to synthesize D-erythro-CHS at 0.6 mg/ml in a batch reaction.

Endothelial Specificity of ICAM2, Flt-1, and Tie2 Promoters In Vitro and In Vivo

To identify an endothelial cell-specific promoter suitable for vascular-specific targeting, we tested five promoters in vitro (Tie2SE, Tie2LE, ICAM2, Flt-1 and vWF) for promoter activity and specificity in endothelial cells, smooth muscle cells and non-vascular resident cells, as well as in tissues. All of these promoters except vWF exhibited good endothelial activity and specificity in vitro. In a syngeneic heart transplantation model, the ICAM2 promoter was variably functional in coronary endothelial cells of donor hearts. Thus, the ICAM2, Flt-1, Tie2SE and Tie2LE promoters hold promise for endothelial-specific targeting, but in vitro expression may not predict in vivo expression.

Combination of Different Classifiers for Cardiac Arrhythmia Recognition

This paper describes a new supervised fusion (hybrid) electrocardiogram (ECG) classification solution, consisting of a new QRS complex geometrical feature extraction method as well as a new version of the learning vector quantization (LVQ) classification algorithm, aimed at overcoming the stability-plasticity dilemma. Toward this objective, after detection and delineation of the major events of the ECG signal via an appropriate algorithm, each QRS region and its corresponding discrete wavelet transform (DWT) are treated as virtual images, each of which is divided into eight polar sectors. Then the curve length of each excerpted segment is calculated and used as an element of the feature space. To increase the robustness of the proposed classification algorithm against noise, artifacts and arrhythmic outliers, a fusion structure consisting of five different classifiers, namely a Support Vector Machine (SVM), Modified Learning Vector Quantization (MLVQ) and three Multi-Layer Perceptron–Back Propagation (MLP–BP) neural networks with different topologies, was designed and implemented. The proposed algorithm was applied to all 48 MIT–BIH Arrhythmia Database records (within-record analysis), and the discrimination power of the classifier in isolating the different beat types of each record was assessed; an average accuracy of Acc = 98.51% was obtained. The proposed method was also applied to six arrhythmia classes (Normal, LBBB, RBBB, PVC, APB, PB) belonging to 20 different records of the aforementioned database (between-record analysis), and an average accuracy of Acc = 95.6% was achieved. To evaluate the performance of the new hybrid learning machine, the obtained results were compared with similar peer-reviewed studies in this area.
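
The polar-sector curve-length feature can be sketched as follows. Treating the QRS excerpt as a planar curve around its centroid and assigning each segment to a sector by its midpoint angle are assumptions, since the paper's exact virtual-image construction is not detailed here:

```python
import math

def polar_sector_curve_lengths(signal, n_sectors=8):
    """Divide a 1-D signal (viewed as a 2-D curve of (index, value)
    points) into polar sectors around its centroid and return the
    curve length accumulated inside each sector."""
    pts = [(float(i), float(v)) for i, v in enumerate(signal)]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    feats = [0.0] * n_sectors
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        # assign each curve segment to the sector of its midpoint
        mx, my = (x0 + x1) / 2 - cx, (y0 + y1) / 2 - cy
        ang = math.atan2(my, mx) % (2 * math.pi)
        k = min(int(ang / (2 * math.pi / n_sectors)), n_sectors - 1)
        feats[k] += math.hypot(x1 - x0, y1 - y0)
    return feats
```

The eight sector totals sum to the full curve length, so the feature vector is a rotation-sensitive decomposition of the QRS geometry.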

Efficient Web-Learning Collision Detection Tool on Five-Axis Machine

As networking has become pervasive, Web-based learning has become a trend in tool design. Moreover, five-axis machining has recently been widely adopted in industry; however, it carries the risk of collisions between the tool and the rotary tables. This paper therefore proposes an efficient Web-based collision detection learning tool for five-axis machining. Because collision detection consumes more resources than many client devices can support, this research uses a systematic, Web-hosted approach to detect collisions. The methodology includes kinematic analysis of the five-axis motions, the separating axis method for collision detection, and computer simulation for verification. The machine structure is modeled in STL format in CAD software. The input to the detection system is the G-code part program, which describes the tool motions that produce the part surface. This research produced a simulation program written in C and demonstrated a five-axis machining example with collision detection on a Web site. The system simulates the five-axis CNC motion along the tool trajectory, detects any collisions according to the input G-codes, and also supports a high-performance Web service thanks to the C implementation. The results show that our method improves computational efficiency by a factor of 4.5 compared with the conventional detection method.
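
The separating axis method can be illustrated for convex shapes: two convex polygons are disjoint if and only if some edge normal of one of them separates their projections. A 2-D sketch (the actual machine model would use 3-D face and cross-product axes, omitted here):

```python
def _axes(poly):
    """Yield the edge normals of a convex polygon given as (x, y) vertices."""
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        yield (-(y1 - y0), x1 - x0)   # outward/inward normal of the edge

def _project(poly, axis):
    """Project all vertices onto an axis; return the (min, max) interval."""
    dots = [x * axis[0] + y * axis[1] for x, y in poly]
    return min(dots), max(dots)

def convex_collide(p, q):
    """Separating-axis test: the polygons overlap iff no edge normal
    of either polygon yields disjoint projection intervals."""
    for axis in list(_axes(p)) + list(_axes(q)):
        pmin, pmax = _project(p, axis)
        qmin, qmax = _project(q, axis)
        if pmax < qmin or qmax < pmin:
            return False              # found a separating axis
    return True
```

Each axis test is a cheap interval comparison, which is what makes the method attractive for per-step checks along a simulated tool trajectory.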

Orbit Propagator and Geomagnetic Field Estimator for NanoSatellite: The ICUBE Mission

This research contribution presents the orbit design, orbit propagator and geomagnetic field estimator for nanosatellites, specifically for the upcoming CubeSat ICUBE-1 of the Institute of Space Technology (IST), Islamabad, Pakistan. The ICUBE mission is designed for low Earth orbit at an approximate altitude of 700 km. This work designs the Keplerian elements for the ICUBE-1 orbit while incorporating the mission requirements, and propagates the orbit including the J2 perturbation. The attitude determination system of ICUBE-1 consists of attitude determination sensors such as a magnetometer and a sun sensor. The geomagnetic field estimator is developed according to the International Geomagnetic Reference Field (IGRF) model, for comparison with the magnetic field measurements made by the magnetometer for attitude determination. The outputs of the propagator, namely the position and velocity vectors and the magnetic field vectors, are compared and verified against the same scenario generated in the Satellite Tool Kit (STK).
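
For a 700 km low Earth orbit, the mean motion follows from Kepler's third law, and the J2 perturbation produces a secular drift of the right ascension of the ascending node. A sketch with standard Earth constants; the inclination value used below is illustrative, not ICUBE-1's actual element:

```python
import math

MU = 3.986004418e14      # Earth's GM [m^3/s^2]
RE = 6378137.0           # Earth equatorial radius [m]
J2 = 1.08262668e-3       # Earth J2 zonal coefficient

def leo_period_and_raan_drift(alt_km, inc_deg, ecc=0.0):
    """Return (orbital period [s], secular J2 RAAN drift [deg/day])
    for a low Earth orbit at the given altitude and inclination."""
    a = RE + alt_km * 1e3                    # semi-major axis [m]
    n = math.sqrt(MU / a**3)                 # mean motion [rad/s]
    period = 2 * math.pi / n
    p = a * (1 - ecc**2)                     # semi-latus rectum
    raan_dot = -1.5 * n * J2 * (RE / p)**2 * math.cos(math.radians(inc_deg))
    return period, math.degrees(raan_dot) * 86400.0
```

At 700 km the period comes out near 99 minutes, and at a near-sun-synchronous inclination of about 98 degrees the RAAN drift is close to the +0.986 deg/day needed to track the Sun.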

Hippocampus Segmentation using a Local Prior Model on its Boundary

Segmentation techniques based on active contour models have benefited strongly from the use of prior information during their evolution. Shape prior information is captured from a training set and introduced into the optimization procedure to restrict the evolution to allowable shapes. In this way, the evolution converges onto regions even with weak boundaries. Although significant effort has been devoted to different ways of capturing and analyzing prior information, very little thought has been given to the way image information is combined with prior information. This paper focuses on a more natural way of incorporating prior information into the level set framework. As a proof of concept, the method is applied to hippocampus segmentation in T1-MR images. Hippocampus segmentation is a very challenging task, due to the heterogeneous surrounding region and the missing boundary with the neighboring amygdala, whose intensities are identical. The proposed method mimics the way humans perform segmentation and thus shows improvements in segmentation accuracy.

Improving the Shunt Active Power Filter Performance Using Synchronous Reference Frame PI Based Controller with Anti-Windup Scheme

In this paper the reference current for the Voltage Source Converter (VSC) of the Shunt Active Power Filter (SAPF) is generated using the synchronous reference frame method, incorporating a PI controller with an anti-windup scheme. The proposed method improves the harmonic filtering by compensating for the windup phenomenon caused by the integral term of the PI controller. Using the reference frame transformation, the current is transformed from the stationary a-b-c frame to the rotating 0-d-q frame. Using the PI controller, the current in the 0-d-q frame is controlled to track the desired reference signal. A controller with integral action combined with an actuator that saturates can produce undesirable effects. If the control error is so large that the integrator saturates the actuator, the feedback path becomes ineffective, because the actuator will remain saturated even if the process output changes. The integrator, being an unstable element, may then integrate up to a very large value, a phenomenon known as integrator windup. Implementing an integrator anti-windup circuit turns off the integral action when the actuator saturates, thereby improving the performance of the SAPF and dynamically compensating harmonics in the power network. The system performance is examined with a Shunt Active Power Filter simulation model.
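
The conditional-integration form of anti-windup can be sketched as a discrete PI step that freezes the integrator whenever the actuator output saturates. The gains, limits and time step below are placeholders, not the paper's tuned values:

```python
def make_pi_antiwindup(kp, ki, dt, u_min, u_max):
    """Discrete PI controller with conditional-integration anti-windup:
    the integrator is updated only while the actuator is unsaturated."""
    state = {"i": 0.0}

    def step(error):
        u_unsat = kp * error + state["i"]
        u = min(max(u_unsat, u_min), u_max)   # actuator saturation
        if u == u_unsat:                      # not saturated:
            state["i"] += ki * error * dt     # integral action enabled
        return u

    return step
```

Without the saturation check, a large sustained error would wind the integrator up to a huge value, and the output would remain saturated long after the error changed sign; with the check, recovery is immediate.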

The Link between Unemployment and Inflation Using Johansen’s Co-Integration Approach and Vector Error Correction Modelling

In this paper bi-annual time series data on unemployment rates (from the Labour Force Survey) are expanded to quarterly rates and linked to quarterly unemployment rates (from the Quarterly Labour Force Survey). The resultant linked series and the consumer price index (CPI) series are examined using Johansen's cointegration approach and vector error correction modelling. The study finds that both series are integrated of order one and are cointegrated. A statistically significant cointegrating relationship is found to exist between the time series of unemployment rates and the CPI. Given this significant relationship, the study models it using vector error correction models (VECM), one with a restriction on the deterministic term and the other without. A formal statistical confirmation of the existence of a unique linear and lagged relationship between inflation and unemployment for the period between September 2000 and June 2011 is presented. For the given period, the CPI was found to be an unbiased predictor of the unemployment rate. This relationship can be explored further in the development of appropriate forecasting models incorporating other study variables.
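
The error-correction idea behind a VECM can be illustrated with a single-equation, Engle-Granger-style sketch on simulated data. This is a deliberate simplification of the Johansen procedure used in the paper, and the series below are synthetic:

```python
import random

def ecm_adjustment_coeff(y, x):
    """Estimate alpha in dy_t = alpha * (y_{t-1} - x_{t-1}) + eps_t:
    a one-equation error-correction regression (OLS through the origin)."""
    ect = [yi - xi for yi, xi in zip(y[:-1], x[:-1])]   # lagged eq. error
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    return sum(e * d for e, d in zip(ect, dy)) / sum(e * e for e in ect)

# simulate a cointegrated pair: x is a random walk, y tracks x plus noise
random.seed(0)
x, level = [], 0.0
for _ in range(2000):
    level += random.gauss(0, 1)
    x.append(level)
y = [xi + random.gauss(0, 1) for xi in x]
```

For this pair the estimated adjustment coefficient is negative and close to -1: deviations of y from its equilibrium with x are corrected in the next period, which is exactly what a significant cointegrating relationship implies.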

The Impact of Subsequent Stock Market Liberalization on the Integration of Stock Markets in ASEAN-4 + South Korea

To strengthen capital markets, there is a need to integrate the capital markets within the region by removing legal and informal restrictions, specifically through stock market liberalization. This paper therefore investigates the effects of subsequent stock market liberalization on stock market integration in four ASEAN countries (Malaysia, Indonesia, Thailand, Singapore) and South Korea from 1997 to 2007. The relationship between stock market liberalization and stock market integration is examined by analyzing stock prices and returns within the region and in comparison with the world MSCI index. The event study method is used with windows of ±12 months and T-7 + T. The results show that subsequent stock market liberalization generally has minor positive effects on stock returns, except for one or two countries. The subsequent liberalization also integrates the markets in both the short run and the long run.

A Formative Assessment Tool for Effective Feedback

In this study we present our formative assessment tool for students' assignments. The tool enables lecturers to define assignments for the course and to assign each problem in each assignment a list of criteria and weights by which the students' work is evaluated. During assessment, the lecturers enter the scores for each criterion together with justifications. When the scores of the current assignment have all been entered, the tool automatically generates reports for both students and lecturers. The students receive a report by email including a detailed description of their assessed work, their relative score and their progress across the criteria along the course timeline. This information is presented via charts generated automatically by the tool based on the entered scores. The lecturers receive a report that includes summary data (e.g., averages, standard deviations) and detailed data (e.g., histograms) for the current assignment. This information enables the lecturers to follow the class achievements and adjust the learning process accordingly. The tool was examined on two pilot groups of college students studying courses in (1) Object-Oriented Programming and (2) Plane Geometry. Results reveal that most of the students were satisfied with the assessment process and the reports produced by the tool. The lecturers who used the tool were also satisfied with the reports and their contribution to the learning process.
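
The per-criterion weighting step can be sketched as follows; the criterion names and the 0-100 score scale are assumptions, since the tool's actual data model is not described here:

```python
def weighted_score(scores, weights):
    """Combine per-criterion scores (assumed 0-100) into one assignment
    grade using lecturer-defined weights that must sum to 1."""
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("criterion weights must sum to 1")
    return sum(scores[c] * w for c, w in weights.items())
```

Keeping the weights normalized means the combined grade stays on the same scale as the per-criterion scores, so progress charts across assignments remain comparable.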

Stress Analysis of Adhesively Bonded Double-Lap Joints Subjected to Combined Loading

Adhesively bonded joints are preferred over conventional joining methods such as riveting, welding, bolting and soldering. Some of the main advantages of adhesive joints compared to conventional joints are the ability to join dissimilar and damage-sensitive materials, better stress distribution, weight reduction, fabrication of complicated shapes, excellent thermal and insulation properties, improved vibration response and damping control, smoother aerodynamic surfaces, and improved corrosion and fatigue resistance. This paper presents the behavior of adhesively bonded joints subjected to combined thermal loadings, studied using numerical methods. The joint configuration uses aluminum as the central adherend with six different outer adherends (aluminum, steel, titanium, boron-epoxy, unidirectional graphite-epoxy and cross-ply graphite-epoxy) and epoxy-based adhesives. Free expansion of the joint in the x direction was permitted, and the stresses in the adhesive layer and at the interfaces were calculated for the different adherends.

Effect of Pectinase on the Physico-Chemical Properties of Juice from Pawpaw (Carica papaya) Fruits

A procedure for the preparation of clarified pawpaw juice was developed. About 750 ml of pawpaw pulp was measured into each of two 1-litre measuring cylinders, A and B, heated to 40°C and then cooled to 20°C. 30 ml of pectinase was added to cylinder A, while 30 ml of distilled water was added to cylinder B. The enzyme-treated sample (A) was allowed to digest for 5 hours, after which it was heated to 90°C for 15 minutes to inactivate the enzyme. The heated sample was cooled, and with the aid of a muslin cloth the pulp was filtered to obtain the clarified pawpaw juice. The juice was filled into 100 ml plastic bottles, pasteurized at 95°C for 45 minutes, cooled and stored at room temperature. The sample treated with 30 ml of distilled water underwent the same process. The freshly pasteurized samples were analyzed for specific gravity, titratable acidity, pH, sugars and ascorbic acid. The remaining samples were then stored for 2 weeks and the above analyses repeated. There were differences between the freshly pasteurized and stored samples in pH and ascorbic acid levels; the sample treated with pectinase also yielded a higher volume of juice than that treated with distilled water.

An Intelligent System for Phish Detection, using Dynamic Analysis and Template Matching

Phishing, the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new perspective on phishing that tries to systematically determine whether a given page is phished, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, which is indicative of any form of malicious alteration. The system design represents an intelligent system employing dynamic assessment, which accurately identifies brand-new phishing attacks and should prove effective in reducing the number of false positives. This framework could also be used as a knowledge base for educating Internet users about phishing.

A Comparison of Dilute Sulfuric and Phosphoric Acid Pretreatments in Biofuel Production from Corncobs

Biofuels, such as biobutanol, have been recognized as renewable and sustainable fuels that can be produced from lignocellulosic biomass. To convert lignocellulosic biomass to biofuel, a pretreatment step is needed to remove hemicelluloses and lignin and thereby improve enzymatic hydrolysis. Dilute acid pretreatment has been successfully developed for corncobs; the optimum conditions for dilute sulfuric and phosphoric acid pretreatment were 120 °C for 5 min with a 15:1 liquid-to-solid ratio and 140 °C for 10 min with a 10:1 liquid-to-solid ratio, respectively. The results show that both acid pretreatments gave a total sugar content of approximately 34–35 g/l. In terms of inhibitor (furfural) content, phosphoric acid pretreatment gave a higher level than sulfuric acid pretreatment. Characterization of the corncobs after pretreatment indicates that both acid pretreatments can improve enzymatic accessibility, with the better results obtained for corncobs pretreated with sulfuric acid in terms of surface area, crystallinity, and composition.

Wind Load Characteristics in Libya

Recent trends in building construction in Libya are moving toward tall (high-rise) building projects. As a consequence, better estimation of the lateral loading in the design process is becoming the focus of a safe and cost-effective building industry. By and large, Libya is not considered a potential earthquake-prone zone, making wind the dominant design lateral load. Current design practice in the country estimates wind speeds on a largely arbitrary basis, applying a certain factor of safety to a chosen wind speed. Therefore, the need for a more accurate estimation of wind speeds in Libya was the motivation behind this study. Records of wind speed data were collected from 22 meteorological stations in Libya and statistically analysed. The analysis of more than four decades of wind speed records suggests that the country can be divided into four zones of distinct wind speeds. A computer "survey" program was used to draw a design wind speed contour map for the state of Libya. The paper presents the statistical analysis of Libya's recorded wind speed data and proposes design wind speed values for a 50-year return period covering the entire country.
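
A standard way to obtain a 50-year return-period speed from station records is to fit a Gumbel (extreme value type I) distribution to the annual maxima. The sketch below uses the method of moments; the paper does not state its exact fitting procedure, and the sample data are invented:

```python
import math
import statistics

def gumbel_return_speed(annual_maxima, T=50):
    """Fit a Gumbel (EV-I) distribution to annual-maximum wind speeds
    by the method of moments and return the T-year return-period speed."""
    m = statistics.mean(annual_maxima)
    s = statistics.stdev(annual_maxima)          # sample std deviation
    beta = s * math.sqrt(6) / math.pi            # Gumbel scale
    mu = m - 0.5772 * beta                       # location (Euler const.)
    # speed exceeded with probability 1/T in any given year
    return mu - beta * math.log(-math.log(1 - 1 / T))
```

Because the Gumbel quantile grows with ln(T), the 50-year design speed sits well above the observed mean of the annual maxima, which is the margin the arbitrary safety-factor practice tries to approximate.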

Estimation of Broadcast Probability in Wireless Adhoc Networks

Most routing protocols (e.g., DSR, AODV) designed for wireless ad hoc networks incorporate a broadcasting operation in their route discovery scheme. Probabilistic broadcasting techniques have been developed to optimize the broadcast operation, which is otherwise very expensive in terms of the redundancy and traffic it generates. In this paper we explore percolation theory to gain a different perspective on probabilistic broadcasting schemes, which have been actively researched in recent years. This theory has helped us estimate the value of the broadcast probability in a wireless ad hoc network as a function of the size of the network. We also show that, operating at these optimal values of broadcast probability, there is at least a 25-30% reduction in packet regeneration during successful broadcasting.
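
The effect can be reproduced in a toy simulation: flooding a graph with a rebroadcast probability above the percolation threshold reaches essentially the same set of nodes as full flooding, with noticeably fewer transmissions. The Erdős-Rényi graph below is a stand-in for a radio connectivity model, which is an assumption:

```python
import random

def gossip_flood(adj, src, p, rng):
    """Probabilistic flooding: each node that receives the packet for
    the first time rebroadcasts it with probability p (the source always
    transmits). Returns (set of reached nodes, number of transmissions)."""
    reached, queue, tx = {src}, [src], 0
    while queue:
        u = queue.pop()
        if u != src and rng.random() >= p:
            continue                  # node received but stays silent
        tx += 1
        for v in adj[u]:
            if v not in reached:
                reached.add(v)
                queue.append(v)
    return reached, tx

# build a random graph (assumed stand-in for radio connectivity)
rng = random.Random(42)
n = 200
adj = {i: set() for i in range(n)}
for i in range(n):
    for j in range(i + 1, n):
        if rng.random() < 0.05:       # edge probability -> avg degree ~10
            adj[i].add(j)
            adj[j].add(i)
```

With an average degree near 10, the percolation threshold for the rebroadcast probability is roughly the reciprocal of the degree, so a probability like 0.7 is comfortably supercritical: coverage stays near complete while the transmission count drops roughly in proportion to the silenced fraction.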