Optimal Control of a Linear Distributed Parameter System via Shifted Legendre Polynomials

The optimal control problem of a linear distributed parameter system is studied via shifted Legendre polynomials (SLPs) in this paper. The partial differential equation representing the linear distributed parameter system is decomposed into an n-set of ordinary differential equations, the optimal control problem is transformed into a two-point boundary value problem, and the two-point boundary value problem is reduced to an initial value problem using SLPs. A recursive algorithm for evaluating the optimal control input and output trajectory is developed. The proposed algorithm is computationally simple. An illustrative example is given to show the simplicity of the proposed approach.
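As a concrete illustration of the polynomial family involved (this is not the paper's recursive algorithm), the shifted Legendre polynomials on [0, 1] can be evaluated with the standard three-term recurrence; the function name below is ours:

```python
# Illustrative sketch: evaluating the shifted Legendre polynomial
# P~_n(x) = P_n(2x - 1) on [0, 1] via the classical recurrence
# (n + 1) P_{n+1}(t) = (2n + 1) t P_n(t) - n P_{n-1}(t).

def shifted_legendre(n, x):
    """Evaluate the shifted Legendre polynomial of degree n at x in [0, 1]."""
    t = 2.0 * x - 1.0            # map [0, 1] onto [-1, 1]
    p_prev, p = 1.0, t           # P_0(t) = 1, P_1(t) = t
    if n == 0:
        return p_prev
    for k in range(1, n):        # three-term recurrence up to degree n
        p_prev, p = p, ((2 * k + 1) * t * p - k * p_prev) / (k + 1)
    return p
```

For example, the degree-2 shifted Legendre polynomial is 6x^2 - 6x + 1, so `shifted_legendre(2, 0.5)` evaluates to -0.5.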

Thermal and Electrical Properties of Carbon Nanotubes Purified by Acid Digestion

Carbon nanotubes (CNTs) possess unique structural, mechanical, thermal and electronic properties, and have been proposed for applications in many fields. However, to reach the full potential of CNTs, many problems still need to be solved, including the development of an easy and effective purification procedure, since as-synthesized CNTs contain impurities such as amorphous carbon, carbon nanoparticles and metal particles. Different purification methods yield different CNT characteristics and may be suitable for the production of different types of CNTs. In this study, the effect of different purification chemicals on carbon nanotube quality was investigated. CNTs were first synthesized by chemical vapor deposition (CVD) of acetylene (C2H2) on a magnesium oxide (MgO) powder impregnated with an iron nitrate (Fe(NO3)3·9H2O) solution. The synthesis parameters were a synthesis temperature of 800°C, an iron content in the precursor of 5% and a synthesis time of 30 min. The liquid phase oxidation method was applied for the purification of the synthesized CNT materials. Three different acids (HNO3, H2SO4, and HCl) were used to remove the metal catalysts from the synthesized CNT material, in order to investigate the possible effects of each acid solution on the purification step. Purification experiments were carried out at two different temperatures (75 and 120°C), two different acid concentrations (3 and 6 M) and for three different time intervals (6, 8 and 15 h). A 30% H2O2 : 3 M HCl (1:1 v%) solution was also used in the purification step to remove both the metal catalysts and the amorphous carbon; purifications using this solution were performed at 75°C for 8 h. Purification efficiencies under the different conditions were evaluated by thermogravimetric analysis. Thermal and electrical properties of the CNTs were also determined. It was found that the obtained electrical conductivity values of the carbon nanotubes were typical of organic semiconductor materials, and that thermal stability varied with the purification chemicals.

Ethylene Epoxidation in a Low-Temperature Parallel Plate Dielectric Barrier Discharge System: Effects of Ethylene Feed Position and O2/C2H4 Feed Molar Ratio

The effects of ethylene (C2H4) feed position and O2/C2H4 feed molar ratio on ethylene epoxidation in a parallel plate dielectric barrier discharge (DBD) were studied. The results showed that an ethylene feed position fraction of 0.5 and an O2/C2H4 feed molar ratio of 0.2:1 gave the highest EO selectivity of 34.3% and the highest EO yield of 5.28%, with low power consumptions of 2.11×10^-16 Ws per molecule of ethylene converted and 6.34×10^-16 Ws per molecule of EO produced, when the DBD system was operated under the best conditions: an applied voltage of 19 kV, an input frequency of 500 Hz and a total feed flow rate of 50 cm3/min. The separate ethylene feed system provided much higher epoxidation activity than the mixed feed system, which gave an EO selectivity of 15.5%, an EO yield of 2.1% and a power consumption of 7.7×10^-16 Ws per molecule of EO produced.

Efficiency of Floristic and Molecular Markers to Determine Diversity in Iranian Populations of T. boeoticum

In order to study the floristic and molecular classification of common wild wheat (Triticum boeoticum Boiss.), an analysis was conducted on populations of Triticum boeoticum collected from different regions of Iran. Considering all floristic compositions of the habitats, six floristic groups (syntaxa) within the populations were identified. A high level of variation in T. boeoticum was also detected using SSR markers. Our results showed that the molecular method confirmed the grouping obtained by the floristic method. In other words, the results of our study indicate that floristic classification is still a useful, efficient, and economical tool for characterizing the amount and distribution of genetic variation in natural populations of T. boeoticum. Nevertheless, molecular markers appear to be useful and complementary techniques for the identification and evaluation of genetic diversity in the studied populations.

Source of Oseltamivir Resistance Due to R152K Mutation of Influenza B Virus Neuraminidase: Molecular Modeling

Every 2-3 years, the influenza B virus causes epidemics. Neuraminidase (NA) is an important target for influenza drug design. Although oseltamivir, an oral NA inhibitor, has shown good inhibitory efficiency against the wild-type influenza B virus, lower susceptibility of the R152K mutant has been reported. A better understanding of oseltamivir efficiency against the wild-type influenza B NA, and of resistance in the R152K mutant, could be useful for rational drug design. Here, two complex systems, the wild-type and R152K NAs with oseltamivir bound, were studied using molecular dynamics (MD) simulations. Based on 5-ns MD simulations, the loss of a notable hydrogen bond and the decrease in the per-residue decomposition energy contributed to the drug by the mutated residue K152, compared to those of R152 in the wild-type, were found to be a primary source of the high-level oseltamivir resistance caused by the R152K mutation.

XML-based Safe and Scalable Multi-Agent Development Framework

In this paper, we describe our efforts to design and implement an agent development framework that has the potential to scale to the size of any underlying network and is suitable for various e-commerce activities. The main novelty of our framework is its capability to allow the development of sophisticated, secured agents that are simple enough to be practical. We have adopted the FIPA Agent Platform Reference Model as the backbone of the implementation, along with XML for agent communication and the Java Cryptography Extension (JCE) architecture to secure the communication of information between agents. The advantage of our architecture is its support for developing agents in different languages that communicate with each other using a more open standard, i.e., XML.

A Multi-Signature Scheme based on Coding Theory

In this paper, we propose the first two non-generic constructions of multi-signature schemes based on coding theory. The first scheme makes use of the CFS signature scheme and is secure in the random oracle model, while the second scheme is based on the KKS construction. The security of our constructions relies on a difficult problem in coding theory: the Syndrome Decoding problem, which has been proved NP-complete [4].
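To make the underlying hardness assumption concrete, a minimal sketch of the Syndrome Decoding setting follows (the matrices and weight bound are toy values of ours, not parameters from the paper): a signature-like object is a low-weight binary error vector e whose syndrome H·e^T over GF(2) matches a target s, and forging one requires solving an NP-complete search problem in general.

```python
# Toy illustration of the Syndrome Decoding problem behind code-based
# signatures. H is a binary parity-check matrix (list of rows), e is a
# bit vector; all arithmetic is over GF(2).

def syndrome(H, e):
    """Compute the syndrome s = H . e^T over GF(2)."""
    return [sum(h_ij * e_j for h_ij, e_j in zip(row, e)) % 2 for row in H]

def verify(H, e, s, max_weight):
    """Accept e only if it is low-weight AND reproduces the syndrome s."""
    return sum(e) <= max_weight and syndrome(H, e) == s
```

Verification is cheap, but finding a low-weight e for a given s (the forger's task) is believed intractable for suitably large random codes.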

Parallel Branch and Bound Model Using Logarithmic Sampling (PBLS) for Symmetric Traveling Salesman Problem

Very large and/or computationally complex optimization problems sometimes require parallel or high-performance computing to achieve a reasonable computation time. One of the most popular and most complicated problems of this family is the Traveling Salesman Problem. In this paper, we introduce a branch-and-bound based algorithm for the solution of such complicated problems, with the main focus on the symmetric traveling salesman problem. We reviewed some of the already available algorithms and identified the need for a new algorithm that gives an optimal or near-optimal solution. Using logarithmic sampling, the proposed algorithm was found to produce a near-optimal solution for the problem and to show excellent performance compared with the traditional algorithms of this series.
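A minimal branch-and-bound skeleton for the symmetric TSP is sketched below; it uses a simple admissible bound (cost so far plus the cheapest outgoing edge of every unvisited city) and does not reproduce the paper's logarithmic sampling or parallelization:

```python
# Illustrative branch-and-bound for the symmetric TSP (exhaustive with
# pruning; practical only for small instances).

def tsp_branch_and_bound(dist):
    n = len(dist)
    # Lower-bound ingredient: cheapest edge leaving each city.
    cheapest = [min(dist[i][j] for j in range(n) if j != i) for i in range(n)]
    best = {"cost": float("inf"), "tour": None}

    def search(path, visited, cost):
        bound = cost + sum(cheapest[i] for i in range(n) if i not in visited)
        if bound >= best["cost"]:
            return                      # prune: cannot beat the best tour
        if len(path) == n:
            total = cost + dist[path[-1]][path[0]]   # close the tour
            if total < best["cost"]:
                best["cost"], best["tour"] = total, path[:]
            return
        for city in range(n):
            if city not in visited:
                search(path + [city], visited | {city},
                       cost + dist[path[-1]][city])

    search([0], {0}, 0)
    return best["cost"], best["tour"]
```

Since the bound never overestimates the cost of completing a partial tour, pruning on it cannot discard the optimal solution.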

Key Exchange Protocol over Insecure Channel

Key management represents a major and the most sensitive part of cryptographic systems. It includes key generation, key distribution, key storage, and key deletion. It is also considered the hardest part of cryptography. Designing secure cryptographic algorithms is hard, and keeping the keys secret is much harder. Cryptanalysts usually attack both symmetric and public key cryptosystems through their key management. We introduce a protocol to exchange cipher keys over an insecure communication channel. This protocol is based on a public key cryptosystem, specifically the elliptic curve cryptosystem. It also tests the cipher keys, selecting only the good keys and rejecting the weak ones.
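The elliptic-curve mechanism underlying such an exchange can be sketched in a few lines; everything below (the tiny curve y^2 = x^3 + 2x + 3 mod 97, the generator, the private scalars) is a toy choice of ours for illustration, and the paper's key-testing step is not reproduced. Real deployments use standardized curves over ~256-bit primes.

```python
# Toy Diffie-Hellman key exchange over a tiny elliptic curve
# y^2 = x^3 + 2x + 3 (mod 97). NOT secure; for illustration only.

P, A = 97, 2                      # field prime and curve coefficient a
G = (3, 6)                        # a point on the curve, used as generator

def add(p, q):
    """Elliptic-curve point addition; None is the point at infinity."""
    if p is None: return q
    if q is None: return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P == 0:
        return None               # p + (-p) = infinity
    if p == q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (lam * lam - x1 - x2) % P
    return (x3, (lam * (x1 - x3) - y1) % P)

def scalar_mult(k, point):
    """Double-and-add computation of k * point."""
    result = None
    while k:
        if k & 1:
            result = add(result, point)
        point = add(point, point)
        k >>= 1
    return result

# Each party keeps a private scalar and publishes its multiple of G.
alice_priv, bob_priv = 2, 3
alice_pub = scalar_mult(alice_priv, G)
bob_pub = scalar_mult(bob_priv, G)
shared_a = scalar_mult(alice_priv, bob_pub)   # Alice computes 2 * (3G)
shared_b = scalar_mult(bob_priv, alice_pub)   # Bob computes 3 * (2G)
```

Both parties arrive at the same point (6G here) without the private scalars ever crossing the channel; an eavesdropper sees only G, 2G and 3G.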

Effect of Substituent on Titanocene/MMAO Catalyst for Ethylene/1-Hexene Copolymerization

Copolymerization of ethylene with 1-hexene was carried out using two ansa-fluorenyl titanium derivative complexes. The effect of the substituent on the catalytic activity, monomer reactivity ratio and polymer properties was investigated. It was found that the presence of t-Bu groups on the fluorenyl ring gave remarkable catalytic activity and produced polymer with high molecular weight. Moreover, these catalysts produced polymer with a narrow molecular weight distribution, indicating the characteristic behavior of a single-site metallocene catalyst. Based on 13C NMR, we observed that the monomer reactivity ratio was affected by the catalyst structure. The rH values of complex 2 were lower than those of complex 1, which might result from higher steric hindrance leading to a reduction of the 1-hexene insertion step.

Computational Intelligence Hybrid Learning Approach to Time Series Forecasting

Time series forecasting is an important and widely studied topic in system modeling research. This paper describes the application of a hybrid PSO-RLSE neuro-fuzzy learning approach to the problem of time series forecasting. The particle swarm optimization (PSO) algorithm is used to update the premise parameters of the proposed prediction system, and the recursive least-squares estimator (RLSE) is used to update the consequent parameters. Thanks to this hybrid learning (HL) approach for the neuro-fuzzy system, the prediction performance is excellent and learning converges much faster than in the compared approaches. In the experiments, we use the well-known Mackey-Glass chaotic time series. According to the experimental results, the prediction performance and accuracy of the proposed approach in time series forecasting are much better than those of the compared approaches, as shown in Table IV.
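The RLSE half of such a hybrid scheme admits a compact sketch: each new sample updates the consequent-parameter estimate and its covariance without re-solving the whole least-squares system. The code below is a generic RLS update on an invented linear toy problem, not the paper's neuro-fuzzy system, and the PSO premise-parameter search is omitted:

```python
# Sketch of a recursive least-squares (RLS) parameter update.
# theta: parameter estimate, P: inverse-correlation matrix,
# phi: regressor vector, y: target for this sample.

def rlse_step(theta, P, phi, y):
    """Fold one sample (phi, y) into the running estimate theta."""
    n = len(phi)
    P_phi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = 1.0 + sum(phi[i] * P_phi[i] for i in range(n))
    K = [v / denom for v in P_phi]                   # Kalman-style gain
    err = y - sum(phi[i] * theta[i] for i in range(n))
    theta = [theta[i] + K[i] * err for i in range(n)]
    P = [[P[i][j] - K[i] * P_phi[j] for j in range(n)] for i in range(n)]
    return theta, P

# Toy usage: recover y = 2x + 1 from noiseless streaming samples.
theta, P = [0.0, 0.0], [[1e6, 0.0], [0.0, 1e6]]
for x in [0.0, 1.0, 2.0, 3.0]:
    theta, P = rlse_step(theta, P, [x, 1.0], 2.0 * x + 1.0)
```

After the four samples, `theta` is close to [2.0, 1.0], the true slope and intercept.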

IMLFQ Scheduling Algorithm with Combinational Fault Tolerant Method

Scheduling algorithms are used in operating systems to optimize processor usage. One of the most efficient scheduling algorithms is the Multi-Level Feedback Queue (MLFQ) algorithm, which uses several queues with different quanta. The most important weakness of this method is the inability to determine the optimal number of queues and the quantum of each queue, both of which directly affect the response time. This weakness has been addressed in the IMLFQ scheduling algorithm. In this paper, we review the IMLFQ algorithm for solving these problems and minimizing the response time. In this algorithm, a recurrent neural network is used to find both the number of queues and the optimal quantum of each queue. In addition, to prevent probable faults in the computation of process response times, a new fault-tolerant approach based on combinational software redundancy is presented. The experimental results show that the IMLFQ algorithm yields better response times than other scheduling algorithms, and that the fault-tolerant mechanism further improves its performance.
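The basic MLFQ mechanism being tuned can be sketched as follows; the queue count and quanta are fixed toy values here, whereas the paper's contribution is precisely to tune them (with a recurrent neural network), which this sketch does not attempt:

```python
# Minimal multi-level feedback queue simulation: a job that exhausts
# its quantum is demoted to the next (longer-quantum) queue.
from collections import deque

def mlfq(burst_times, quanta):
    """Run jobs (name -> CPU burst) through feedback queues.
    Returns the order in which jobs finish."""
    queues = [deque() for _ in quanta]
    remaining = dict(burst_times)
    for name in remaining:
        queues[0].append(name)          # every job enters the top queue
    finished = []
    while any(queues):
        level = next(i for i, q in enumerate(queues) if q)
        job = queues[level].popleft()
        run = min(quanta[level], remaining[job])
        remaining[job] -= run
        if remaining[job] == 0:
            finished.append(job)
        else:                            # used its full quantum: demote
            queues[min(level + 1, len(quanta) - 1)].append(job)
    return finished
```

With quanta [2, 4], a 1-unit job finishes first at the top level while longer jobs sink to the lower queue, which is the short-job-favoring behavior MLFQ is designed for.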

Effect of Plasma Therapy on Epidermal Regeneration

The purpose of our study was to compare the characteristics of spontaneous versus assisted re-epithelisation. In order to assess re-epithelisation of the injured skin, we designed a burn wound model on Wistar rat skin. Our aim was to create standardised, easily reproducible and quantifiable skin lesions involving the entire epidermis and the superficial dermis. We then applied the above-mentioned therapeutic strategies to compare the regeneration of epidermis and dermis and the changes in local and systemic parameters under the different conditions. We enhanced the re-epithelisation process under the moist atmosphere of a polyurethane wound dressing modified with helium non-thermal plasma, and with the aid of direct cold-plasma treatment, respectively. We followed changes in systemic parameters (hematologic and biochemical parameters) and local features (oxidative stress markers and skin histology) under the above-mentioned conditions. Re-epithelisation is just one part of the skin regeneration process, which recruits cellular components with the aid of epidermal-dermal interaction via signal molecules.

Application of Sensory Thermography as Measuring Method to Study Median Nerve Temperatures

This paper presents an experimental case using sensory thermography to describe the temperature behavior of the median nerve after an activity of repetitive motion. Thermography is a noninvasive technique, free of biological hazard and harmless at all times, that has been applied in many experiments to find temperature patterns that help to understand diseases such as cancer and cumulative trauma disorders (CTDs). An infrared sensory thermography technology was developed to carry out this study. Three healthy women, two right-handed and one left-handed, were selected for the repetitive motion tests over 4 days; two sensory thermographs were placed on the wrists, over the median nerves, to take measurements. The evaluation time was 3 hours 30 minutes in a temperature-controlled environment, with 20 minutes of stabilization time at the beginning and end of the operation. The temperature distributions were statistically evaluated and showed similar patterns of behavior.

Characterization of the O.ul-mS952 Intron: A Potential Molecular Marker to Distinguish Between Ophiostoma Ulmi and Ophiostoma Novo-Ulmi Subsp. Americana

The full-length mitochondrial small subunit ribosomal (mt-rns) gene has been characterized for Ophiostoma novo-ulmi subspecies americana. The gene was also characterized for Ophiostoma ulmi, and a group II intron was noted in the mt-rns gene of O. ulmi. The insertion in the mt-rns gene is at position S952, and it is a group IIB1 intron that encodes a double-motif LAGLIDADG homing endonuclease from an open reading frame located within a loop of domain III. Secondary structure models for the mt-rns RNA of O. novo-ulmi subsp. americana and O. ulmi were generated to place the intron within the context of the ribosomal RNA. The in vivo splicing of the O.ul-mS952 group II intron was confirmed by reverse transcription-PCR. A survey of 182 strains of Dutch Elm Disease-causing agents showed that the mS952 intron was absent in what is considered to be the more aggressive species, O. novo-ulmi, but present in strains of the less aggressive O. ulmi. This observation suggests that the O.ul-mS952 intron can be used as a PCR-based molecular marker to discriminate between O. ulmi and O. novo-ulmi subsp. americana.

Fuzzy based Security Threshold Determining for the Statistical En-Route Filtering in Sensor Networks

In many sensor network applications, sensor nodes are deployed in open environments and hence are vulnerable to physical attacks that can compromise the nodes' cryptographic keys. False sensing reports can be injected through compromised nodes, which can lead not only to false alarms but also to the depletion of the limited energy resources of battery-powered networks. Ye et al. proposed the statistical en-route filtering scheme (SEF) to detect such false reports during the forwarding process. In this scheme, the choice of the security threshold value is important, since it trades off detection power against overhead. In this paper, we propose a fuzzy logic approach for determining the security threshold value in SEF-based sensor networks. The fuzzy logic determines the security threshold by considering the number of partitions in the global key pool, the number of compromised partitions, and the energy level of the nodes. The fuzzy-based threshold conserves energy while providing sufficient detection power.
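The shape of such a fuzzy threshold rule can be sketched as follows. The membership functions, rule table, and output values below are invented for illustration and are not the paper's actual fuzzy system; the point is only to show how two crisp inputs (energy level, fraction of compromised partitions) map to a threshold via fuzzy rules:

```python
# Toy two-rule fuzzy inference for a security threshold
# (Sugeno-style weighted-average defuzzification).

def tri(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def security_threshold(energy_pct, compromised_pct, max_threshold=10):
    """Map node energy and compromised-partition inputs to a threshold."""
    low_energy = tri(energy_pct, -1, 0, 60)
    high_energy = tri(energy_pct, 40, 100, 101)
    few_comp = tri(compromised_pct, -1, 0, 80)
    many_comp = tri(compromised_pct, 20, 100, 101)
    # Rule 1: plenty of energy AND many compromised partitions -> high threshold.
    # Rule 2: low energy AND few compromised partitions -> low threshold.
    rules = [(min(high_energy, many_comp), max_threshold),
             (min(low_energy, few_comp), 2)]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 2
```

A fully charged network facing many compromised partitions gets the maximum threshold, while an energy-starved network with few compromises keeps the threshold (and hence the per-report overhead) low.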

Nuclear Medical Image Treatment System Based On FPGA in Real Time

We present in this paper an acquisition and treatment system designed for semi-analog gamma cameras, taking the SOPHY DS7 of Sopha Medical Vision (SMVi) as an example. It consists of a nuclear medical Image Acquisition, Treatment and Display (IATD) chain ensuring the acquisition, the treatment of the signals resulting from the gamma-camera detection head, and the scintigraphic image construction in real time. The chain is composed of an analog treatment board and a digital treatment board, originally designed around a DSP [2]. We describe the designed system and the digital treatment algorithms, whose performance and flexibility we have improved. In the new version of the IATD chain presented here, the digital treatment algorithms are implemented in a reprogrammable circuit, an FPGA (Field Programmable Gate Array).

A Martingale Residual Diagnostic for Logistic Regression Model

Martingale model diagnostics for assessing the fit of the logistic regression model to recurrent events data are studied. One way of assessing the fit is to plot the empirical standard deviation of the standardized martingale residual processes. Here we use another diagnostic plot, based on the martingale residual covariance. We investigated the performance of the plot under several types of model misspecification, and the method correctly detected the misspecified models. We also present a test statistic that supplements the inspection of the two diagnostics. The power of the test statistic agrees with what we observed in the plots of the estimated martingale covariance.
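For a single binary outcome, the martingale-type residual of a fitted logistic model reduces to the observed outcome minus the fitted probability, r_i = y_i - p_i. The sketch below illustrates just this building block (the coefficients are assumed already estimated, and the covariance-based diagnostic of the paper is not reproduced):

```python
# Martingale-type residuals for a fitted logistic regression model:
# r_i = y_i - P(Y = 1 | x_i) under the fitted coefficients.
import math

def fitted_prob(x, beta0, beta1):
    """Logistic mean: P(Y = 1 | x) = 1 / (1 + exp(-(b0 + b1 * x)))."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * x)))

def martingale_residuals(xs, ys, beta0, beta1):
    """Observed minus fitted for each observation."""
    return [y - fitted_prob(x, beta0, beta1) for x, y in zip(xs, ys)]
```

Under a correctly specified model these residuals have mean zero, which is why systematic structure in their standard deviation or covariance signals misspecification.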

Necessity of Using an Optimum Business Model in High-Tech Firms: A Nanotechnology Case Study

In growing and developing firms, especially high-tech firms, the manager is often mainly involved in solving business problems and making decisions about the firm's executive activities. Besides such executive measures, however, attention to planning the firm's path to success and growth, and the application of long experience and sagacity in designing the business model, are vital and necessary. Success in a business is achieved as a result of different factors, one of the most important of which is designing and implementing an optimal business model at the beginning of the firm's work. This model determines the level of profitability achieved through innovation and the value added gained. The business model is thus the process of connecting the innovation and technology environment with the economic and business environment, and it is important for the success of modern businesses given their traits.

Evaluation of Horizontal Seismic Hazard of Naghan, Iran

This paper presents a probabilistic horizontal seismic hazard assessment of Naghan, Iran. It provides probabilistic estimates of the Peak Ground Horizontal Acceleration (PGHA) for return periods of 475, 950 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in building design. A catalogue of seismic events that includes both historical and instrumental events was compiled, covering the period from 840 to 2009. The seismic sources that affect the hazard in Naghan were identified within a radius of 200 km, and the recurrence relationships of these sources were generated using the Kijko and Sellevoll method. Finally, PGHA values were prepared to indicate the earthquake hazard of Naghan for different hazard levels using the SEISRISK III software.
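The return periods quoted above follow the standard convention of probabilistic seismic hazard analysis: under a Poisson occurrence model, a 475-year return period corresponds to roughly a 10% probability of exceedance in a 50-year exposure time. A small worked check of that relation (generic, not specific to the Naghan study):

```python
# Return period vs. exceedance probability under a Poisson model:
# P(at least one exceedance in t years) = 1 - exp(-t / T_R).
import math

def exceedance_probability(return_period_years, exposure_years):
    """Probability of at least one exceedance during the exposure time."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

def return_period(prob, exposure_years):
    """Invert the relation: T_R = -t / ln(1 - p)."""
    return -exposure_years / math.log(1.0 - prob)
```

For example, `exceedance_probability(475, 50)` is about 0.10, and `return_period(0.10, 50)` recovers approximately 475 years.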