Generator of Hypotheses: An Approach to Data Mining Based on Monotone Systems Theory

Generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of the facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. Generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).

Developing Student Teachers to Be Professional Teachers

Practicum placements are a critical factor for student teachers on Education Programs. How can student teachers become professionals? This study investigated the problems, weaknesses, and obstacles of practicum placements and developed guidelines for partnership in the placements. In response to this issue, a partnership concept was implemented for developing student teachers into professionals. Data were collected through questionnaires on attitudes toward the problems, weaknesses, and obstacles of practicum placements of student teachers in Rajabhat universities, together with focus group interviews. The research revealed that learning management, classroom management, curriculum, assessment and evaluation, classroom action research, and teacher demeanor are the important factors affecting the professional development of Education Program student teachers. Learning management planning and classroom management, concerning instructional design, teaching techniques, instructional media, and student behavior management, are other important aspects influencing the professional development of student teachers.

Quality Evaluation of Compressed MRI Medical Images for Telemedicine Applications

Medical image modalities such as computed tomography (CT), magnetic resonance imaging (MRI), ultrasound (US), and X-ray are used to diagnose disease. These modalities provide flexible means of reviewing anatomical cross-sections and physiological state in different parts of the human body. Raw medical images have huge file sizes and large storage requirements, so the size of these image files must be reduced for telemedicine applications. Image compression is thus a key factor in reducing the bit rate for transmission or storage while maintaining an acceptable reproduction quality, but it naturally raises the question of how much an image can be compressed while still preserving sufficient information for a given clinical application. Many techniques for achieving data compression have been introduced. In this study, three different MRI modalities, Brain, Spine, and Knee, have been compressed and reconstructed using the wavelet transform. Subjective and objective evaluations were carried out to investigate the clinical information quality of the compressed images. For the objective evaluation, the results show that the PSNR, which indicates the quality of the reconstructed image, ranges from 21.95 dB to 30.80 dB, 27.25 dB to 35.75 dB, and 26.93 dB to 34.93 dB for Brain, Spine, and Knee respectively. For the subjective evaluation test, the results show that a compression ratio of 40:1 was acceptable for the brain image, whereas for the spine and knee images 50:1 was acceptable.
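
For reference, PSNR for 8-bit images is 10·log10(255²/MSE); a minimal sketch over flat pixel lists (the function name and sample values are illustrative, not the study's data):

```python
import math

def psnr(original, reconstructed, max_value=255):
    """Peak signal-to-noise ratio (dB) between two equal-length pixel sequences."""
    if len(original) != len(reconstructed):
        raise ValueError("images must have the same number of pixels")
    # Mean squared error between corresponding pixels.
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_value ** 2 / mse)

# A reconstruction that is off by one grey level everywhere (MSE = 1).
quality = psnr([50, 100, 150, 200], [51, 101, 151, 201])
```

Higher PSNR means a reconstruction closer to the original; the 21.95-35.75 dB range reported above sits well below the near-lossless regime.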

Academic Staff Perceptions of the Value of the Elements of an Online Learning Environment

Based on 276 responses from academic staff in an evaluation of an online learning environment (OLE), this paper identifies those elements of the OLE that were most used and valued by staff, those elements of the OLE that staff most wanted to see improved, and those factors that most contributed to staff perceptions that the use of the OLE enhanced their teaching. The most used and valued elements were core functions, including accessing unit information, accessing lecture/tutorial/lab notes, and reading online discussions. The elements identified as most needing attention related to online assessment: submitting assignments, managing assessment items, and receiving feedback on assignments. Staff felt that using the OLE enhanced their teaching when they were satisfied that their students were able to access and use their learning materials, and when they were satisfied with the professional development they received and were confident with their ability to teach with the OLE.

Heuristics Analysis for Distributed Scheduling using MONARC Simulation Tool

Simulation is a very powerful method for high-performance, high-quality design in distributed systems, and currently perhaps the only one, considering the heterogeneity, complexity, and cost of such systems. In Grid environments, for example, it is hard or even impossible to evaluate scheduler performance in a repeatable and controllable manner, because resources and users are distributed across multiple organizations with their own policies. In addition, Grid test-beds are limited, and creating an adequately sized test-bed is expensive and time consuming. Scalability, reliability, and fault tolerance become important requirements for distributed systems in order to support distributed computation; a distributed system with such characteristics is called dependable. Large environments, like the Cloud, offer unique advantages such as low cost, dependability, and QoS satisfaction for all users. Resource management in large environments requires performant scheduling algorithms guided by QoS constraints. This paper presents the performance evaluation of scheduling heuristics guided by different optimization criteria. The algorithms for distributed scheduling are analyzed in order to satisfy user constraints while taking into account the independent capabilities of resources. This analysis acts as a profiling step for algorithm calibration. The performance evaluation is based on simulation; the simulator is MONARC, a powerful tool for simulating large-scale distributed systems. The novelty of this paper consists in synthetic analysis results that offer guidelines for scheduler service configuration and support empirically based decisions. The results can be used in decisions regarding optimizations to existing Grid DAG scheduling and in selecting the proper algorithm for DAG scheduling in various practical situations.

Reliability-Based Topology Optimization Based on Evolutionary Structural Optimization

This paper presents Reliability-Based Topology Optimization (RBTO) based on Evolutionary Structural Optimization (ESO). An actual design involves uncertain conditions such as material properties, operational loads, and dimensional variations. Deterministic Topology Optimization (DTO) is performed without considering these uncertainties. RBTO, in contrast, involves the evaluation of probabilistic constraints, which can be done in two different ways: the reliability index approach (RIA) and the performance measure approach (PMA). The limit state function is approximated using Monte Carlo simulation and central composite design for the reliability analysis. ESO, one of the topology optimization techniques, is adopted for the topology optimization. Numerical examples are presented to compare DTO with RBTO.
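
As a hedged illustration of the RIA side of this evaluation (the function names and the simple R−S limit state below are assumptions, not the paper's model), a crude Monte Carlo estimate of the failure probability and the corresponding reliability index might look like:

```python
import random
from statistics import NormalDist

def mc_reliability(limit_state, sample, n=100_000, seed=0):
    """Monte Carlo estimate of the failure probability P_f = P[g(X) < 0]
    and the reliability index beta = -Phi^{-1}(P_f)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0)
    pf = failures / n
    if 0.0 < pf < 1.0:
        beta = -NormalDist().inv_cdf(pf)
    else:
        beta = float("inf") if pf == 0.0 else float("-inf")
    return pf, beta

# Hypothetical limit state: resistance R ~ N(10, 1) vs. load S ~ N(6, 1).
# For g = R - S the exact index is (10 - 6)/sqrt(1 + 1) ≈ 2.83.
pf, beta = mc_reliability(
    limit_state=lambda x: x[0] - x[1],
    sample=lambda rng: (rng.gauss(10, 1), rng.gauss(6, 1)),
)
```

In RIA the constraint is then expressed as beta ≥ beta_target; PMA instead checks the limit state value at a target reliability level.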

An AHP-Delphi Multi-Criteria Usage Cases Model with Application to Citrogypsum Decisions, Case Study: Kimia Gharb Gostar Industries Company

Today, the advantage of biotechnology over other technologies, especially in environmental issues, is indisputable. Kimia Gharb Gostar Industries Company, the largest producer of citric acid in the Middle East, applies biotechnology to this end. Citrogypsum is a by-product of citric acid production and is considered a significant residue of this company. This paper summarizes citric acid production and the conditions of citrogypsum production in the company, in addition to the definition of citrogypsum and its applications worldwide. Based on this information and an evaluation of present conditions regarding Iran's need for citrogypsum, the best priority was identified, with emphasis on strategy selection and proper planning for self-sufficiency. The Delphi technique was used to elicit expert opinions about criteria for evaluating the usages. The criteria identified by the experts were profitability, production capacity, the degree of investment, marketability, ease of production, and production time. The Analytic Hierarchy Process (AHP) and Expert Choice software were used to compare the alternatives on the criteria derived from the Delphi process.
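
As an illustration of the AHP step, criterion priorities can be approximated from a pairwise-comparison matrix by column normalization and row averaging (the 3x3 matrix below is a hypothetical example, not the study's elicited judgments):

```python
def ahp_priorities(matrix):
    """Approximate AHP priority vector: normalize each column of the
    pairwise-comparison matrix, then average each row of the result."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Hypothetical comparison of three criteria: profitability judged 3x as
# important as degree of investment and 5x as important as production time.
weights = ahp_priorities([
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
])
```

Tools such as Expert Choice use the principal eigenvector instead; the column-average approximation above agrees closely for consistent matrices.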

A New Psychovisual Image Coding Quality Measurement Based on Region of Interest

To model the human visual system (HVS) in the region of interest, we propose a new objective metric adapted to quality measurement for wavelet foveation-based image compression. It exploits a foveation filter implemented in the DWT domain, based on the point and region of fixation of the human eye. This model is then used to predict the visible differences between an original and a compressed image with respect to this region, and it yields an adapted, local error measure by removing all peripheral errors. The technique, which we call foveation wavelet visible difference prediction (FWVDP), is demonstrated on a number of noisy images, all of which have the same local peak signal-to-noise ratio (PSNR) but visibly different errors. We show that the FWVDP reliably predicts the fixation areas of interest where error is masked, due to high image contrast, and the areas where the error is visible, due to low image contrast. The paper also suggests ways in which the FWVDP can be used to determine a visually optimal quantization strategy for foveation-based wavelet coefficients and to produce a quantitative local measure of image quality.

Sensory, Microbiological and Chemical Assessment of Cod (Gadus morhua) Fillets during Chilled Storage as Influenced by Bleeding Methods

The effects of seawater and slurry-ice bleeding methods on the sensory, microbiological, and chemical quality changes of cod fillets during chilled storage were examined in this study. The results from sensory evaluation showed that the slurry-ice bleeding method prolonged the shelf life of cod fillets to 13-14 days, compared to 10-11 days for fish bled in seawater. Slurry-ice bleeding also led to slower microbial growth and biochemical development, resulting in a lower total plate count (TPC), H2S-producing bacteria count, total volatile basic nitrogen (TVB-N), trimethylamine (TMA), and free fatty acid (FFA) content, and a higher phospholipid content (PL), compared to samples bled in seawater. Principal component analysis revealed that TPC, H2S-producing bacteria, TVB-N, TMA, and FFA were significantly correlated. They were also negatively correlated with sensory evaluation (Torry score), PL, and water holding capacity (WHC).

A Robust Wavelet-Based Watermarking Algorithm Using Edge Detection

In this paper, a robust watermarking algorithm using the wavelet transform and edge detection is presented. The efficiency of an image watermarking technique depends on the preservation of visually significant information. This is attained by embedding the watermark transparently with the maximum possible strength. The watermark embedding process is carried out over the subband coefficients that lie on edges, where distortions are less noticeable, with a subband-level-dependent strength. In addition, the watermark is embedded, using a different scale factor for watermark strength, into selected coefficients around edges that are captured by a morphological dilation operation. The experimental evaluation of the proposed method shows very good robustness and transparency under various attacks such as median filtering, Gaussian noise, JPEG compression, and geometrical transformations.
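
A much-simplified, one-dimensional sketch of edge-selective embedding (the multiplicative rule c' = c(1 + αw) is a classic choice in the watermarking literature; the function, gradient test, and thresholds here are illustrative assumptions, not the paper's exact scheme):

```python
def embed_watermark(coeffs, watermark, alpha=0.1, edge_threshold=10.0):
    """Additively scale watermark bits (+1/-1) into coefficients whose
    local gradient magnitude exceeds edge_threshold, i.e. near edges
    where distortion is less noticeable."""
    out = list(coeffs)
    bits = iter(watermark)
    for i in range(1, len(coeffs) - 1):
        gradient = abs(coeffs[i + 1] - coeffs[i - 1]) / 2  # central difference
        if gradient > edge_threshold:
            try:
                out[i] = coeffs[i] * (1 + alpha * next(bits))
            except StopIteration:
                break  # all watermark bits placed
    return out

# A flat-edge-flat signal: only the coefficients around the step are marked.
marked = embed_watermark([10, 10, 80, 80, 10, 10], [1, -1])
```

Detection would correlate the same edge-selected coefficients against the watermark sequence; smooth regions are left untouched, which is what preserves transparency.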

Debye Layer Confinement of Nucleons in Nuclei by Laser Ablated Plasma

Following the laser ablation studies leading to a theory of nucleon confinement by a Debye layer mechanism, we present numerical evaluations for the known stable nuclei, where the Coulomb repulsion is included as a rather minor component, especially for larger nuclei. This paper investigates the physical conditions required for the formation and stability of nuclei, particularly heavy endothermic nuclei, whose formation is an open astrophysical question. Using the Debye layer mechanism, the nuclear surface energy, the Fermi energy, and the Coulomb repulsion energy, it is possible to find conditions under which the process of nucleation was permitted in the early universe. Our numerical calculations indicate that about 200 seconds after the big bang, at a temperature of about 100 keV and in the subrelativistic region, with a nucleon density nearly equal to normal nuclear density, all endothermic and exothermic nuclei had been formed.

Improvement of New Government R&D Program Plans through Preliminary Feasibility Studies

As part of an evaluation system for R&D programs, the Korean Government has applied preliminary feasibility studies to new government R&D program plans. Fundamentally, the purpose of the preliminary feasibility study is to decide whether or not the government will invest in a new R&D program. Additionally, the preliminary feasibility study can contribute to the improvement of R&D program plans. As examples, two new R&D program plans subjected to such studies are discussed in this paper, and these R&D programs are expected to yield better performance than they would have without the studies. The important point of the preliminary feasibility study is thus not only effective decision making on R&D programs but also the opportunity to actually improve R&D program plans.

Multi-Hazard Risk Assessment and Management in the Tourism Industry - A Case Study from the Island of Taiwan

Global environmental change leads to natural disasters of increasing frequency and scale, and Taiwan is under the influence of global warming and extreme weather. Vulnerability has therefore increased, and the variability and complexity of disasters have been correspondingly enhanced. The purpose of this study is to consider the sources and magnitudes of the hazards characteristic of the tourism industry. Using modern risk management concepts and integrating related domestic and international basic research, the study goes beyond the existing Taiwan typhoon disaster risk assessment model and its loss evaluation. The loss evaluation index system considers the impact of extreme weather, in particular heavy rain, on the tourism industry in Taiwan. Considering the compound impact of extreme-climate disasters on the tourism industry, we develop a multi-hazard risk assessment model together with strategies and suggestions. The risk analysis results are expected to provide government departments, tourism industry asset owners, insurance companies, and banks with the disaster risk information necessary for effective natural disaster risk management in the tourism industry.

Toxicity Study of Two Differently Synthesized Silver Nanoparticles on the Bacterium Vibrio fischeri

A comparative evaluation of the acute toxicity of nano-silver synthesized by two different procedures (biological and chemical reduction methods), and of silver ions, on the bacterium Vibrio fischeri was carried out. The bacterial light inhibition test was used as the toxicological endpoint, applying a homemade luminometer. To compare the toxicity effects quantitatively, the nominal effective concentrations (EC) of the chemicals and the susceptibility constant (Z-value) of the bacteria were calculated after 5 min and 30 min exposure times. After 5 and 30 min contact times, the EC50 values of the two silver nanoparticles were similar, as were the EC20 values, demonstrating that the toxicity of the nano-silvers was independent of the synthesis procedure. The EC values of the nanoparticles were larger than those of the silver ions, and the susceptibilities (Z-values, L/mg) of V. fischeri to the silver ions were greater than those to the nano-silvers. According to the EC and Z values, toxicity decreased in the following order: silver ions >> silver nanoparticles from the chemical reduction method ~ silver nanoparticles from the biological method.
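
An ECx value can be sketched, under the common assumption of linear interpolation on a log-concentration scale (the dilution series and inhibition percentages below are invented for illustration, not the study's measurements):

```python
import math

def ec_value(concentrations, inhibitions, level=50.0):
    """Effective concentration ECx: the concentration producing `level` %
    inhibition, interpolated linearly in log10(concentration).
    Assumes ascending concentrations with non-decreasing inhibition."""
    pairs = list(zip(concentrations, inhibitions))
    for (c1, i1), (c2, i2) in zip(pairs, pairs[1:]):
        if i1 <= level <= i2:
            frac = (level - i1) / (i2 - i1)
            return 10 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    raise ValueError("effect level not bracketed by the data")

# Hypothetical dilution series (mg/L) and % light inhibition after 30 min.
ec50 = ec_value([0.1, 1.0, 10.0], [10.0, 40.0, 90.0])
```

A lower ECx means higher toxicity, which is why the silver ions (smaller EC values) rank above both nanoparticle preparations.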

Performance Analysis of Flooding Attack Prevention Algorithm in MANETs

The lack of any centralized infrastructure in mobile ad hoc networks (MANETs) is one of the greatest security concerns in the deployment of wireless networks. Communication in a MANET functions properly only if the participating nodes cooperate in routing without any malicious intention. However, some nodes may behave maliciously by indulging in flooding attacks on their neighbors, while others may launch active security attacks such as denial of service. This paper reviews related work on trust evaluation and establishment in ad hoc networks, as well as on flooding attack prevention. A new trust approach based on the extent of friendship between nodes is proposed, which makes the nodes cooperate and prevents flooding attacks in an ad hoc environment. The performance of the trust algorithm is tested in an ad hoc network implementing the Ad hoc On-demand Distance Vector (AODV) protocol.
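
A minimal sketch of the friendship-scaled rate-limiting idea (the class name, window model, and limits are assumptions for illustration, not the authors' exact algorithm): each node accepts more route requests per window from closer "friends" and drops packets from neighbors that exceed their allowance.

```python
from collections import defaultdict

class FloodGuard:
    """Per-neighbor RREQ rate limiter: trusted friends get a larger
    allowance; a neighbor exceeding it within the current window is
    treated as a suspected flooder and its requests are dropped."""
    def __init__(self, base_limit=5):
        self.base_limit = base_limit
        self.counts = defaultdict(int)              # RREQs seen this window
        self.friendship = defaultdict(float)        # 0.0 stranger .. 1.0 friend

    def allow_rreq(self, neighbor):
        limit = self.base_limit * (1 + self.friendship[neighbor])
        self.counts[neighbor] += 1
        return self.counts[neighbor] <= limit

    def new_window(self):
        self.counts.clear()                         # start a fresh time window

guard = FloodGuard(base_limit=2)
guard.friendship["A"] = 1.0  # A is a full friend: allowance doubles to 4
a_results = [guard.allow_rreq("A") for _ in range(5)]
b_results = [guard.allow_rreq("B") for _ in range(3)]
```

In an AODV setting this check would sit in front of RREQ processing, so flooded requests never trigger route discovery.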

Multiple Job-Shop Scheduling Using a Hybrid Heuristic Algorithm

In this paper, multi-processor job shop scheduling problems are solved by a heuristic algorithm based on a hybrid of priority dispatching rules within an ant colony optimization algorithm. The objective is to minimize the makespan, i.e. the total completion time, and the simultaneous presence of various kinds of pheromones is allowed. By using a suitable hybrid of priority dispatching rules, the process of finding the best solution is improved. The ant colony optimization algorithm not only enhances the ability of the proposed algorithm but also decreases the total working time by reducing setup times and modifying the production line, so that similar work shares the same production lines. Another advantage of this algorithm is that similar (but not identical) machines can be considered, so these machines are able to process a job with different processing and setup times. To evaluate this capability of the algorithm, a number of test problems are solved and the associated results analyzed. The results show a significant decrease in throughput time, and also show that the algorithm is able to recognize the bottleneck machine and to schedule jobs efficiently.
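
The priority-dispatching component can be illustrated in a much-simplified form (identical parallel machines rather than a full job shop; the function, rule, and job durations are illustrative assumptions) as list scheduling under a dispatching rule:

```python
import heapq

def dispatch_makespan(jobs, n_machines, rule=lambda t: t):
    """List scheduling: order jobs by a priority dispatching rule
    (default SPT, shortest processing time first), always assign the
    next job to the machine that frees up earliest, return the makespan."""
    ready = [0.0] * n_machines          # each machine's next-free time
    heapq.heapify(ready)
    for duration in sorted(jobs, key=rule):
        # Pop the earliest-free machine, run the job on it, push it back.
        heapq.heappush(ready, heapq.heappop(ready) + duration)
    return max(ready)

# Five jobs on two identical machines, dispatched by SPT.
makespan = dispatch_makespan([4, 2, 7, 1, 3], n_machines=2)
```

In the hybrid scheme described above, the ant colony layer would learn pheromone weights over such rules (SPT, LPT, and others) instead of fixing one rule in advance.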

Risk Level Evaluation for Power System Facilities in Smart Grid

Reliability Centered Maintenance (RCM) is one of the most widely used methods in modern power systems for scheduling maintenance cycles and determining inspection priorities. In order to apply the RCM method to the Smart Grid, a preliminary study of the rearranged system structure should be performed, owing to the introduction of additional installations such as renewable and sustainable energy resources, energy storage devices, and advanced metering infrastructure. This paper proposes a new method to evaluate the priority of maintenance and inspection of power system facilities in the Smart Grid using the Risk Priority Number. Calculating this risk index requires analyzing the reliability block diagram of the Smart Grid system. Finally, a feasible technical method is discussed for estimating the risk potential as part of the RCM procedure.
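
The Risk Priority Number is commonly computed, FMEA-style, as the product of severity, occurrence, and detection ratings (each on a 1-10 scale); a sketch with hypothetical facility ratings, not the paper's data:

```python
def risk_priority_number(severity, occurrence, detection):
    """FMEA-style RPN: each factor rated 1-10; a higher product means
    higher maintenance/inspection priority."""
    return severity * occurrence * detection

# Hypothetical Smart Grid facilities ranked by RPN.
facilities = {
    "transformer":     risk_priority_number(severity=9, occurrence=3, detection=4),
    "storage battery": risk_priority_number(severity=6, occurrence=5, detection=3),
    "smart meter":     risk_priority_number(severity=3, occurrence=6, detection=2),
}
ranked = sorted(facilities, key=facilities.get, reverse=True)
```

Inspection effort is then allocated from the top of the ranking downward, with the reliability block diagram supplying the occurrence side of the ratings.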

Pleurotus sajor-caju (PSC) Improves Nutrient Contents and Maintains Sensory Properties of Carbohydrate-based Products

The grey oyster mushroom, Pleurotus sajor-caju (PSC), is a common edible mushroom and is now grown commercially around the world for food. This fungus has long been used as a food or food ingredient in various products. To enhance the nutritional quality and sensory attributes of bakery-based products, PSC powder was used in the present study to partially replace wheat flour in baked product formulations. The nutrient content and sensory properties of rice porridge and unleavened bread (paratha) incorporating various levels of PSC powder were studied; these food items were formulated with 0%, 2%, 4%, or 6% PSC powder. The results show that the PSC powder contained β-glucan at 3.57 g/100 g. In the sensory evaluation, consumers gave higher scores to both rice porridge and paratha containing 2-4% PSC than to those without PSC powder. Paratha containing 4% PSC powder can be formulated to improve the overall acceptability of paratha bread, and for rice porridge, consumers preferred the formulation with 4% PSC powder. In conclusion, partial replacement of wheat flour with PSC powder can be recommended for enhancing the nutritional composition while maintaining the acceptability of carbohydrate-based products.

Simulation and Experimentation on the Contact Width of New Metal Gasket for Asbestos Substitution

The contact width is an important design parameter for optimizing the design of a new metal gasket intended as an asbestos substitute. The contact width is found to be related to the helium leak quantity: as the axial load increases, the helium leak quantity decreases and the contact width increases. This study validates a simulation analysis method by comparing its results with experiments using pressure-sensitive paper; the results show similar trends between simulation and experiment. The final evaluation is determined by the helium leak quantity, which checks the leakage performance of the gasket design. Considering the phenomenon of position change on the convex contact, the gasket design can be optimized by increasing the contact width.

Toward An Agreement on Semantic Web Architecture

There are many problems associated with the World Wide Web: getting lost in hyperspace, web content that is still accessible only to humans, and difficulties of web administration. The solution to these problems is the Semantic Web, which is considered an extension of the current web that presents information in both human-readable and machine-processable form. The aim of this study is to reach a new generic foundation architecture for the Semantic Web, because no clear architecture yet exists for it: there are four versions, but up to now there is no agreement on any of them, nor is there a clear picture of the relations between the different layers and technologies inside the architecture. This can be done by building on the ideas of the previous versions as well as Gerber's evaluation method, as a step toward agreement on one Semantic Web architecture.