A Robust TVD-WENO Scheme for Conservation Laws

The ultimate goal of this article is to develop a robust and accurate numerical method for solving hyperbolic conservation laws in one and two dimensions. A hybrid numerical method is considered, coupling a cheap fourth-order total variation diminishing (TVD) scheme [1] in smooth regions with a robust seventh-order weighted essentially non-oscillatory (WENO) scheme [2] near discontinuities. High-order multi-resolution analysis is used to detect the high-gradient regions of the numerical solution so that shocks are captured with the WENO scheme, while the smooth regions are computed with the fourth-order TVD scheme. For time integration, we use the third-order TVD Runge-Kutta scheme. The accuracy of the resulting hybrid high-order scheme is comparable to that of WENO, but with a significant decrease in CPU cost. Numerical results demonstrate that the proposed scheme is comparable to the high-order WENO scheme and superior to the fourth-order TVD scheme. Our scheme has the added advantage of simplicity and computational efficiency. Numerical tests are presented which show the robustness and effectiveness of the proposed scheme.
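
For reference, the third-order TVD Runge-Kutta time integration mentioned above is commonly written in the Shu-Osher form shown below, with L denoting the spatial discretization operator; this is the standard textbook formulation, not necessarily the authors' exact implementation.

$$
\begin{aligned}
u^{(1)} &= u^{n} + \Delta t\, L(u^{n}),\\
u^{(2)} &= \tfrac{3}{4}\,u^{n} + \tfrac{1}{4}\,u^{(1)} + \tfrac{1}{4}\,\Delta t\, L(u^{(1)}),\\
u^{n+1} &= \tfrac{1}{3}\,u^{n} + \tfrac{2}{3}\,u^{(2)} + \tfrac{2}{3}\,\Delta t\, L(u^{(2)}).
\end{aligned}
$$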

Temperature Control of Industrial Water Cooler using Hot-gas Bypass

In this study, we experimentally investigate precise control of the outlet water temperature of an industrial water cooler using the hot-gas bypass method, based on PI control logic, for machine tools. Recently, the technical trend in machine tools has focused on enhancing speed and accuracy. High-speed processing causes thermal and structural deformation of objects in the machine tools. A water cooler with an accurate temperature control system therefore has to be applied to machine tools to reduce this negative thermal influence. The goal of this study is to minimize the temperature error in steady state. In addition, the control period of the electronic expansion valve was considered in order to increase the lifetime of the machine tool and the quality of the product made with the water cooler.
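
A minimal sketch of the discrete PI control logic described above; the gains, the actuator limits, and the choice of the hot-gas bypass valve opening as the manipulated variable are illustrative assumptions, not the tuned values of the study.

```python
def pi_step(setpoint_c, measured_c, integral, dt,
            kp=2.0, ki=0.1, out_min=0.0, out_max=100.0):
    """One update of a discrete PI controller for the outlet water temperature.

    Returns the hot-gas bypass valve opening (in %) and the updated integral
    term.  Gains, limits and the manipulated variable are placeholders.
    """
    error = setpoint_c - measured_c
    integral += error * dt
    output = kp * error + ki * integral
    # Clamp to the actuator range and apply simple anti-windup.
    if output > out_max:
        output, integral = out_max, integral - error * dt
    elif output < out_min:
        output, integral = out_min, integral - error * dt
    return output, integral
```

The controller would be invoked once per control period; the study notes that the choice of this period also affects the lifetime of the electronic expansion valve.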

A Dynamic Programming Model for Maintenance of Electric Distribution System

The paper presents a dynamic programming based model as a planning tool for the maintenance of electric power systems. Every distribution component has an exponential, age-dependent reliability function to model its fault risk. At the moment when the fault costs exceed the investment costs of a new component, the component should be replaced (reinvestment). However, in some cases overhauling the old component may be more economical than reinvestment. The comparison between overhauling and reinvestment is made by an optimisation process. The goal of the optimisation process is to find the cost-minimising maintenance program for the electric power distribution system.
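
A hedged sketch of the kind of backward dynamic programme described above; the exponential hazard model, the three candidate actions, and all cost parameters are illustrative assumptions rather than the paper's actual formulation or data.

```python
import math

def maintenance_plan(horizon, lam, fault_cost, reinvest_cost,
                     overhaul_cost, overhaul_gain):
    """Backward dynamic programme over (year, component age).

    Each year the cheapest of three actions is chosen: keep the component,
    overhaul it (reduces its effective age), or reinvest (replace it, age -> 1).
    The exponential hazard and all cost figures are illustrative assumptions.
    """
    max_age = horizon + 1
    # value[t][a]: expected future cost from year t with a component of age a
    value = [[0.0] * (max_age + 2) for _ in range(horizon + 1)]
    policy = {}
    for t in range(horizon - 1, -1, -1):
        for age in range(1, max_age + 1):
            hazard = 1.0 - math.exp(-lam * age)   # age-dependent fault risk
            nxt = value[t + 1]
            keep = hazard * fault_cost + nxt[min(age + 1, max_age)]
            overhaul = overhaul_cost + nxt[max(age - overhaul_gain, 1)]
            reinvest = reinvest_cost + nxt[1]
            value[t][age], policy[t, age] = min(
                (keep, "keep"), (overhaul, "overhaul"), (reinvest, "reinvest"))
    return policy
```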

Headspace Solid-phase Microextraction of Volatile and Furanic Compounds in Coated Fish Sticks: Effect of the Extraction Temperature

This work evaluated the effect of temperature on the headspace solid-phase microextraction (HS-SPME) of volatile and furanic compounds in coated fish sticks. The major goal was the analysis of the samples as consumed, in order to reproduce the volatile compounds people perceive when consuming those products. Extraction at 37 °C (the human body temperature) throughout the HS-SPME analysis of volatile and furanic compounds in coated fish was compared with the higher extraction temperatures frequently used for this kind of determination. The profile of volatile compounds found in deep-fried (F) and non-fried (NF) coated fish at 37 and 50 °C was different from that obtained at 80 °C. Concerning furan and its derivatives, additional formation of these compounds was observed at higher extraction temperatures. The analysis of volatile and furanic compounds in coated fish sticks simulating cooking and eating conditions can therefore be reliably carried out by setting the headspace absorption temperature at 37 °C.

Post-Cracking Behaviour of High-Strength Fiber Concrete: Prediction and Validation

The fracture process in mechanically loaded steel fiber reinforced high-strength concrete (SFRHSC) is characterized by fibers bridging the crack and providing resistance to its opening. A structural SFRHSC fracture model was created, and the material fracture process was modeled based on single-fiber pull-out laws, which were determined experimentally (for straight fibers, fibers with end hooks (Dramix), and corrugated fibers (Tabix)) as well as obtained numerically (using FEM simulations). For this purpose an experimental program was carried out and pull-out force versus pull-out fiber length was obtained (for fibers embedded in concrete at different depths and at different angles). Model predictions were validated by four-point bending tests on 15x15x60 cm prisms. Fracture surface analysis was performed on the broken prisms with the goal of improving the elaborated model assumptions. Optimal SFRHSC structures were identified.

Nonlinear Model Predictive Swing-Up and Stabilizing Sliding Mode Controllers

In this paper, a nonlinear model predictive swing-up and stabilizing sliding mode controller is proposed for an inverted pendulum-cart system. In the swing-up phase, the nonlinear model predictive control is formulated as a nonlinear programming problem with an energy-based objective function. By solving this problem at each sampling instant, a sequence of control inputs that optimizes the nonlinear objective function subject to various constraints over a finite horizon is obtained. This control drives the pendulum to a predefined neighborhood of the upper equilibrium point, where a sliding-mode-based model predictive control is used to stabilize the system under the specified constraints. Simulations show that, due to the way the problem is formulated, short horizon lengths are sufficient for attaining the swing-up goal.
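
One common form of the energy-based objective referred to above (the symbols J, m, g, l for the pendulum inertia, mass, gravitational acceleration and length, and the angle θ measured from the upright position, are illustrative assumptions rather than the paper's exact notation) penalizes the deviation of the pendulum energy from its upright-equilibrium value over the prediction horizon:

$$
E(\theta,\dot{\theta}) = \tfrac{1}{2}\,J\dot{\theta}^{2} + m\,g\,l\,(\cos\theta - 1),
\qquad
\min_{u_0,\dots,u_{N-1}} \ \sum_{k=0}^{N-1} E(\theta_k,\dot{\theta}_k)^{2},
$$

subject to the cart-pendulum dynamics and the input and state constraints, so that driving E to zero corresponds to reaching the energy of the upright rest position.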

Towards Model-Driven Communications

In modern distributed software systems, communication among the composing parts represents a critical issue, but the idea of extending conventional programming languages with general purpose communication constructs seems difficult to realize. As a consequence, there is a growing gap between the abstraction level required by distributed applications and the concepts provided by the platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology for automatically generating the schematic, communication-related part of applications, while providing high-level specialized languages useful in all phases of software production. To achieve this goal, a stack of languages (meta-metamodels) has been introduced in order to describe – at different levels of abstraction – the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of communication platforms is viewed as a form of specification of language semantics, which provides executable models of applications together with model-checking support and effective runtime environments.

Cryptography Over Elliptic Curve Of The Ring Fq[e], e^4 = 0

Groups where the discrete logarithm problem (DLP) is believed to be intractable have proved to be inestimable building blocks for cryptographic applications. They are at the heart of numerous protocols such as key agreements, public-key cryptosystems, digital signatures, identification schemes, publicly verifiable secret sharings, hash functions and bit commitments. The search for new groups with intractable DLP is therefore of great importance. The goal of this article is to study elliptic curves over the ring Fq[e], with Fq a finite field of order q and with the relation e^n = 0, n ≥ 3. The motivation for this work came from the observation that several practical discrete logarithm-based cryptosystems, such as ElGamal and the elliptic curve cryptosystems, are built on such groups. First, we describe these curves defined over a ring. Then, we study their algorithmic properties by proposing effective implementations for representing the elements and the group law. In another article we study their cryptographic properties, an attack on the elliptic discrete logarithm problem, and a new cryptosystem over these curves.
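
As background (a standard algebraic fact, not a result of the paper), the ring in question can be identified with a quotient of a polynomial ring, so its elements are truncated polynomials in e:

$$
\mathbb{F}_q[e] \;\cong\; \mathbb{F}_q[X]/(X^{n}), \qquad
a = a_0 + a_1 e + \cdots + a_{n-1} e^{\,n-1}, \quad a_i \in \mathbb{F}_q, \quad e^{n} = 0,
$$

and such an element is invertible exactly when its constant term a_0 is nonzero.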

Hiding Data in Images Using PCP

In recent years, everything is trending toward digitalization, and with the rapid development of Internet technologies digital media need to be transmitted conveniently over the network. Attacks, misuse or unauthorized access to information is of great concern today, which makes the protection of documents transmitted through digital media a priority problem. This urges us to devise new data hiding techniques to protect and secure data of vital significance. In this respect, steganography often comes to the fore as a tool for hiding information. Steganography is a process that involves hiding a message in an appropriate carrier such as an image or audio file. The word is of Greek origin and means "covered or hidden writing". The goal of steganography is covert communication: the carrier can be sent to a receiver while no one except the authenticated receiver knows of the existence of the hidden information. A considerable amount of work on steganography has been carried out by different researchers. In this work the authors propose a novel steganographic method for hiding information within the spatial domain of a gray-scale image. The proposed approach works by selecting the embedding pixels using a mathematical function, finding the 8-neighborhood of each selected pixel, and mapping each bit of the secret message to a neighboring pixel coordinate position in a specified manner. Before embedding, a check is performed to determine whether the selected pixel or any of its neighbors lies on the boundary of the image. This solution is independent of the nature of the data to be hidden and produces a stego image with minimum degradation.
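
A hedged sketch of the general embedding flow described above; the pixel-selection rule, the boundary handling and the bit-to-neighbor mapping here are illustrative stand-ins, not the authors' PCP method.

```python
import numpy as np

# Offsets of the 8-neighborhood around a pixel (row, col).
NEIGHBORS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]

def embed(image, bits, step=37):
    """Illustrative spatial-domain embedding for a gray-scale image.

    Embedding pixels are selected by a simple arithmetic rule (every
    `step`-th pixel in row-major order, a stand-in for the paper's
    selection function), the boundary check is applied, and one message
    bit is written into the least significant bit of each of the 8
    neighbors.  This is a sketch only, not the authors' PCP scheme.
    """
    stego = image.copy()
    h, w = stego.shape
    it = iter(bits)
    for idx in range(0, h * w, step):
        r, c = divmod(idx, w)
        # Skip pixels whose 8-neighborhood would fall outside the image.
        if r == 0 or c == 0 or r == h - 1 or c == w - 1:
            continue
        for dr, dc in NEIGHBORS:
            bit = next(it, None)
            if bit is None:          # message exhausted
                return stego
            stego[r + dr, c + dc] = (stego[r + dr, c + dc] & 0xFE) | bit
    return stego
```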

The Role of Knowledge Management in Enterprise 2.0

The term Enterprise 2.0 (E2.0) describes a collection of organizational and IT practices that help organizations establish flexible work models, visible knowledge-sharing practices, and higher levels of community participation. E2.0 parallels and builds on another term commonly used in the industry – Web 2.0. E2.0 also represents new packaging for strategic collaboration and Knowledge Management (KM). Organizations rely on collaboration and KM initiatives to attain innovation, growth, productivity, and performance goals.

Computer Modeling of Drug Distribution after Intravitreal Administration

Intravitreal injection (IVI) is the most common treatment for posterior segment eye diseases such as endophthalmitis, retinitis, age-related macular degeneration, diabetic retinopathy, uveitis, and retinal detachment. Most of the drugs used to treat vitreoretinal diseases have a narrow concentration range in which they are effective and may be toxic at higher concentrations. Therefore, it is critical to know the drug distribution within the eye following intravitreal injection. With this knowledge, ophthalmologists can decide on the drug injection frequency while minimizing damage to tissues. The goal of this study was to develop a computer model to predict intraocular concentrations and pharmacokinetics of intravitreally injected drugs. A finite volume model was created to predict the distribution of two drugs with different physicochemical properties in the rabbit eye. The model parameters were obtained from a literature review. To validate this numerical model, in vivo data on the spatial concentration profile from the lens to the retina were compared with the numerical results; the difference between the numerical and experimental data was less than 5%. This validation provides strong support for the numerical methodology and associated assumptions of the current study.
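
To illustrate the finite volume idea in its simplest setting, the sketch below updates drug concentrations in a one-dimensional chain of cells with diffusion and first-order elimination; it is a toy stand-in under stated assumptions (zero-flux boundaries, explicit time stepping), not the study's three-dimensional rabbit-eye model.

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps, k_elim=0.0):
    """Explicit finite-volume update for one-dimensional drug diffusion.

    c0     : initial concentration per cell (e.g. a bolus at the injection site)
    D      : diffusion coefficient;  k_elim : first-order elimination rate
    A toy one-dimensional stand-in for a full intraocular transport model.
    """
    c = np.asarray(c0, dtype=float).copy()
    r = D * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    for _ in range(steps):
        flux = np.zeros(c.size + 1)
        flux[1:-1] = -D * (c[1:] - c[:-1]) / dx        # interior face fluxes
        # zero-flux boundaries at both ends: a simplifying assumption
        c += dt * (-(flux[1:] - flux[:-1]) / dx - k_elim * c)
    return c
```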

Evaluating Per-User Fairness of Goal-Oriented Parallel Computer Job Scheduling Policies

A fair-share objective has recently been included in goal-oriented parallel computer job scheduling policies. However, previous work only presented the overall scheduling performance, so an evaluation of the per-user performance of the policy is still lacking. In this work, the per-user fair-share performance under the Tradeoff-fs(Tx:avgX) policy is evaluated in detail. A basic fair-share priority backfill policy, namely RelShare(1d), is also studied. The performance of all policies is collected using an event-driven simulator with three real job traces as input. The experimental results show that high-demand users usually benefit under most policies, either because their jobs are large or because they have many jobs. In the large-job case, a single executing job may result in over-share during that period. In the other case, the jobs may be backfilled, which improves their performance. However, users with a mixture of jobs may suffer, because while their smaller jobs are executing the priority of their remaining jobs is lowered. Further analysis does not show any significant impact for users with many jobs or users with a large runtime approximation error.

Hybrid Modeling and Optimal Control of a Two-Tank System as a Switched System

In the past decade, because of the wide range of applications of hybrid systems, many researchers have considered the modeling and control of these systems. Since switched systems constitute an important class of hybrid systems, this paper describes a method for optimal control of linear switched systems. The method is also applied to the two-tank system, which is a very suitable benchmark for analyzing different modeling and control techniques for hybrid systems. Simulation results show that, with this method, the control goals as well as the problem constraints can be satisfied by an appropriate selection of the cost function.

Applying a Hierarchical K-Means-Like Clustering Algorithm to the Arabic Language

In this study, a K-Means-like clustering technique with a hierarchical initial set (HKM) has been implemented. The goal of this study is to show that clustering document sets improves precision in information retrieval systems, as was shown by Bellot & El-Beze for the French language. A comparison is made between the traditional information retrieval system and the clustered one. The effect of increasing the number of clusters on precision is also studied. The indexing technique is Term Frequency * Inverse Document Frequency (TF*IDF). It has been found that Hierarchical K-Means-like clustering (HKM) with 3 clusters over 242 Arabic abstract documents from the Saudi Arabian National Computer Conference yields significantly better results than the traditional information retrieval system without clustering. Additionally, it has been found that it is not necessary to increase the number of clusters further to improve precision.
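
In its standard form (variants exist, and the paper may use one of them), the TF*IDF weight of a term t in a document d from a collection of N documents is:

$$
w_{t,d} = \mathrm{tf}_{t,d} \times \log\frac{N}{\mathrm{df}_t},
$$

where tf_{t,d} is the frequency of t in d and df_t is the number of documents in which t occurs.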