Abstract: At present, the dictionary attack is the basic tool for recovering passwords. To avoid dictionary attacks, users purposely choose other character strings as passwords. According to statistics, about 14% of users choose keys on a keyboard (Kkeys, for short) as passwords. This paper develops a framework system to attack passwords chosen from Kkeys and analyzes its efficiency. Within this system, we build keyboard rules using the adjacent and parallel relationships among Kkeys and then use these Kkey rules to generate password databases by a depth-first search method. According to the experimental results, the key space of the databases derived from these Kkey rules can be far smaller than that of the password databases generated by brute-force attack, thus effectively narrowing down the scope of the attack search. Taking one general Kkey rule, the combinations of all printable characters (94 types) with Kkey adjacent and parallel relationships, as an example, the derived key space is about 2^40 times smaller than that of a brute-force attack. In addition, we demonstrate the method's practicality and value by successfully cracking access passwords to UNIX and PC systems using the password databases created.
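The depth-first generation described above can be sketched as follows; the adjacency map and function names here are illustrative assumptions covering only a few QWERTY keys, not the paper's actual Kkey rule set.

```python
# Hedged sketch: depth-first enumeration of candidate passwords whose
# consecutive characters are adjacent keys on a keyboard. The adjacency
# map is an illustrative assumption, not the paper's full rule set.
ADJACENT = {
    "q": ["w", "a"],
    "w": ["q", "e", "s"],
    "e": ["w", "r", "d"],
    "r": ["e"],
    "a": ["q", "s"],
    "s": ["a", "w", "d"],
    "d": ["s", "e"],
}

def generate(max_len):
    """Enumerate all strings of length max_len whose consecutive
    characters obey the adjacency rule, via depth-first search."""
    results = []

    def dfs(prefix):
        if len(prefix) == max_len:
            results.append(prefix)
            return
        for nxt in ADJACENT[prefix[-1]]:
            dfs(prefix + nxt)

    for start in ADJACENT:
        dfs(start)
    return results

candidates = generate(4)
```

Even on this seven-character toy alphabet, the rule-based candidate list is a small fraction of the 7^4 brute-force space, which is the effect the abstract quantifies at full scale.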
Abstract: This paper presents an auto-regressive network called the Auto-Regressive Multi-Context Recurrent Neural Network (AR-MCRN), which forecasts the daily peak load for two large power plant systems. The auto-regressive network is a combination of recurrent and non-recurrent networks. Weather variables are key elements in forecasting because any change in these variables affects the demand for energy load. The AR-MCRN is therefore used to learn the relationship between past, present, and future exogenous and endogenous variables. Experimental results show that using the change in weather components and the change in past load as inputs to the AR-MCRN, rather than the raw weather parameters and past load themselves, produces higher accuracy of predicted load. Experimental results also show that using both exogenous and endogenous variables as inputs is better than using only the exogenous variables as inputs to the network.
Abstract: This paper presents a particle swarm optimization algorithm with particle reduction for global optimization problems. Particle swarm optimization is an algorithm inspired by collective motion, such as that of bird flocks or fish schools, and a multi-point search algorithm that seeks the best solution using multiple particles. Particle swarm optimization is flexible enough to adapt to a wide range of optimization problems. When an objective function has many local minima, however, particles may become trapped in one of them. To avoid local minima, a large number of particles are initially prepared and their positions are updated by particle swarm optimization. The particles are then sequentially reduced, based on their evaluation values, until a predetermined number remains, and particle swarm optimization continues until the termination condition is met. To show the effectiveness of the proposed algorithm, we examine the minima found on test functions in comparison with existing algorithms. Furthermore, the influence of the initial number of particles on the best value found by our algorithm is discussed.
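A minimal sketch of PSO with gradual particle reduction follows. The inertia and acceleration weights, the reduction interval, and the test function are assumed values for illustration; the abstract does not specify the paper's settings.

```python
import random

def sphere(x):
    # Simple convex test function with its minimum at the origin.
    return sum(xi * xi for xi in x)

def pso_with_reduction(f, dim=2, n_init=15, n_final=5, iters=200, seed=0):
    """Standard PSO whose swarm shrinks over time: every 20 iterations the
    particle with the worst personal-best value is removed, until only
    n_final particles remain. Parameter values are illustrative."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_init)]
    vel = [[0.0] * dim for _ in range(n_init)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    gbest = min(pbest, key=f)[:]
    w, c1, c2 = 0.7, 1.5, 1.5  # assumed inertia / acceleration weights
    for t in range(iters):
        for i in range(len(pos)):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest_val[i], pbest[i] = v, pos[i][:]
                if v < f(gbest):
                    gbest = pos[i][:]
        if t % 20 == 19 and len(pos) > n_final:
            # Reduction step: drop the particle with the worst
            # personal-best evaluation value.
            worst = max(range(len(pos)), key=lambda i: pbest_val[i])
            for arr in (pos, vel, pbest, pbest_val):
                arr.pop(worst)
    return gbest, f(gbest)

best, best_val = pso_with_reduction(sphere)
```

The reduction concentrates the remaining evaluation budget on the better-performing particles, which is the mechanism the abstract describes.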
Abstract: Ever since the industrial revolution began, our ecosystem has changed, and the negatives arguably outweigh the positives. Industrial waste is usually released into bodies of water such as rivers or the sea. Tempeh waste is one example of a waste that carries many hazardous and unwanted substances that affect the surrounding environment. Tempeh is a popular fermented food in Asia that is rich in nutrients and active substances. Tempeh liquid waste in particular can cause air pollution, and if it penetrates the soil it will contaminate the groundwater, making that water unsafe to consume. Moreover, bacteria thrive in the polluted water and are often responsible for causing many kinds of diseases. The treatments used for this chemical waste are biological treatments such as constructed wetlands and activated sludge. These treatments are able to reduce both physical and chemical parameters, such as temperature, TSS, pH, BOD, COD, NH3-N, NO3-N, and PO4-P, and they are applied before the waste is released into the water. The result is a comparison between constructed wetland and activated sludge treatment, determining which method is better suited to reducing the physical and chemical parameters of the waste.
Abstract: This paper addresses the fundamental requirements for
starting an online business. It covers the process of ideation,
conceptualization, formulation, and implementation of new venture
ideas on the Web. Using Facebook as an illustrative example, we learn
how to turn an idea into a successful electronic business and to execute
a business plan with IT skills, management expertise, a good
entrepreneurial attitude, and an understanding of Internet culture. The personality traits and characteristics of a successful e-commerce entrepreneur are discussed with reference to Facebook's founder, Mark Zuckerberg. Facebook is both a social and an e-commerce success. It provides a trusted environment in which participants can conduct business as a social experience: people are able to discuss products before, during, and after the sale within the Facebook environment. The paper also highlights the challenges and opportunities for e-commerce entrepreneurial startups in going public and in entering the Chinese market.
Abstract: This article demonstrates the development of a controlled release system for an NSAID drug, diclofenac sodium, employing different ratios of ethyl cellulose. Diclofenac sodium and ethyl cellulose in different proportions were processed by microencapsulation based on a phase separation technique to formulate microcapsules. The prepared microcapsules were then compressed into tablets to obtain controlled release oral formulations. For in-vitro evaluation, a dissolution test of each preparation was conducted in 900 ml of phosphate buffer solution of pH 7.2, maintained at 37 ± 0.5 °C and stirred at 50 rpm, with samples collected at predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20 and 24 hrs). The drug concentration in the collected samples was determined by UV spectrophotometer at 276 nm. The physical characteristics of the diclofenac sodium microcapsules were within the accepted range: they were off-white, free flowing, and spherical in shape. The retardation of diclofenac sodium release from the microcapsules was found to be directly proportional to the proportion of ethyl cellulose and the coat thickness. The in-vitro release pattern showed that at drug:polymer ratios of 1:1 and 1:2 the percentage release of drug in the first hour was 16.91% and 11.52%, respectively, compared with only 6.87% for the 1:3 ratio within the same time. The release mechanism followed the Higuchi model. The tablet formulation (F2) of the present study was found comparable in release profile to the marketed brand Phlogin-SR, and the microcapsules showed an extended release beyond 24 h. Further, a good correlation was found between drug release and the proportion of ethyl cellulose in the microcapsules. Microencapsulation based on coacervation was found to be a good technique for controlling the release of diclofenac sodium in controlled release formulations.
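The Higuchi model mentioned above relates cumulative release Q to the square root of time, Q = k·√t. A hedged sketch of fitting the Higuchi constant by least squares through the origin, on synthetic data rather than the paper's measurements:

```python
import math

def higuchi_k(times, release):
    """Least-squares slope of cumulative release versus sqrt(time),
    fitted through the origin (Higuchi model: Q = k * sqrt(t))."""
    num = sum(r * math.sqrt(t) for t, r in zip(times, release))
    den = sum(times)  # since (sqrt(t))**2 == t
    return num / den

# Synthetic data generated from Q = 5 * sqrt(t); the fitted k recovers 5.
times = [1.0, 4.0, 9.0]
release = [5.0, 10.0, 15.0]
k = higuchi_k(times, release)
```

A high coefficient of determination for this straight-line fit against √t is the usual evidence that a formulation follows Higuchi release kinetics.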
Abstract: Program slicing is the task of finding all statements in
a program that directly or indirectly influence the value of a variable
occurrence. The set of statements that can affect the value of a
variable at some point in a program is called a program backward
slice. In several software engineering applications, such as program
debugging and measuring program cohesion and parallelism, several
slices are computed at different program points. Existing algorithms for computing program slices are designed to compute a slice at a single program point. In these algorithms, the program, or the model that represents it, is traversed completely or partially once. To compute more than one slice, the same algorithm is applied at every point of interest in the program, so the same program, or program representation, is traversed several times.
In this paper, an algorithm is introduced to compute all forward
static slices of a computer program by traversing the program
representation graph once. Therefore, the introduced algorithm is
useful for software engineering applications that require computing
program slices at different points of a program. The program
representation graph used in this paper is called Program Dependence
Graph (PDG).
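Slicing over a PDG reduces to reachability along dependence edges. A toy sketch follows, using a hypothetical three-statement dependence graph and simple per-node reverse reachability (not the single-traversal algorithm the paper introduces):

```python
# Hedged sketch: backward slicing on a toy program dependence graph.
# Nodes are statement ids; an edge (u, v) means statement v depends on
# statement u (data or control dependence). The graph is illustrative.
from collections import defaultdict

def backward_slices(edges, nodes):
    """Compute the backward slice of every node: the set of statements
    that can reach it along dependence edges (including itself)."""
    rev = defaultdict(list)
    for u, v in edges:
        rev[v].append(u)
    slices = {}
    for n in nodes:
        seen = {n}
        stack = [n]
        while stack:
            cur = stack.pop()
            for pred in rev[cur]:
                if pred not in seen:
                    seen.add(pred)
                    stack.append(pred)
        slices[n] = seen
    return slices

# Toy program:  1: x = 0   2: y = x + 1   3: z = y * 2
edges = [(1, 2), (2, 3)]
slices = backward_slices(edges, [1, 2, 3])
```

This naive version re-traverses the graph once per slice point, which is exactly the repeated-traversal cost the paper's single-pass algorithm is designed to avoid.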
Abstract: This paper considers robust exponential stability for a class of uncertain switched neutral systems whose delays switch according to the switching rule. The system under consideration includes both stable and unstable subsystems. The uncertainties considered in this paper are norm-bounded and possibly time-varying. Based on a multiple Lyapunov functional approach and the dwell-time technique, the time-dependent switching rule is designed to depend on the so-called average dwell time of the stable subsystems as well as the ratio of the total activation time of the stable and unstable subsystems. It is shown that by suitably controlling the switching between the stable and unstable modes, robust stabilization of the switched uncertain neutral system can be achieved. Two simulation examples are given to demonstrate the effectiveness of the proposed method.
Abstract: Structured catalysts formed by growing zeolites on substrates are an area of increasing interest due to the increased efficiency of the catalytic process and the ability to provide superior heat transfer and thermal conductivity for both exothermic and endothermic processes.
However, the generation of structured catalysts represents a significant challenge in balancing the relationship between materials properties and catalytic performance, with the Na2O, H2O and Al2O3 gel composition playing a significant role in this dynamic, thereby affecting both the type and range of application.
The structured catalyst films generated as part of this
investigation have been characterised using a range of techniques,
including X-ray diffraction (XRD), Electron microscopy (SEM),
Energy Dispersive X-ray analysis (EDX) and Thermogravimetric
Analysis (TGA), with the transition from oxide-on-alloy wires to
hydrothermally synthesised uniformly zeolite coated surfaces being
demonstrated using both SEM and XRD. The robustness of the coatings has been ascertained by subjecting them to thermal cycling (ambient to 550 °C), with the results indicating that the synthesis time and gel composition have a crucial effect on the quality of zeolite growth on the FeCrAlloy wires.
Finally, the activity of the structured catalysts was verified by a series of comparison experiments with standard zeolite Y catalysts in powdered and pelleted forms.
Abstract: This paper deals with the modeling and parameter identification of nonlinear systems described by a Hammerstein model with piecewise nonlinear characteristics, such as a dead-zone nonlinearity. The simultaneous use of an easy decomposition technique and triangular basis functions leads to a particular form of the Hammerstein model. Approximating the static nonlinear block with triangular basis functions leads to a linear regressor model, so that least squares techniques can be used for parameter estimation. The Singular Value Decomposition (SVD) technique is applied to separate the coupled parameters. The proposed approach has been tested efficiently on academic simulation examples.
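Triangular (hat) basis functions of the kind used to approximate the static nonlinearity can be sketched as follows; the uniform grid and the function name are assumptions for illustration, not the paper's exact parameterization:

```python
def triangular_basis(x, centers):
    """Evaluate triangular (hat) basis functions at x on a uniform grid
    of centers. Each basis peaks at its own center with value 1 and
    decays linearly to 0 at the neighboring centers, giving a
    piecewise-linear approximation of a static nonlinearity."""
    h = centers[1] - centers[0]  # uniform grid spacing (assumed)
    return [max(0.0, 1.0 - abs(x - c) / h) for c in centers]

# Between centers 1.0 and 2.0, only the two surrounding hats are active
# and their values sum to 1 (partition of unity on the grid interior).
vals = triangular_basis(1.5, [0.0, 1.0, 2.0, 3.0])
```

Because the approximated nonlinearity is linear in the basis weights, stacking these evaluations as regressor rows yields the linear least squares problem the abstract refers to.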
Abstract: The Romanian government has been making
significant attempts to make its services and information available on
the Internet. According to the UN e-government survey conducted in
2008, Romania ranks among mid-range countries in e-government utilization (41%). Romania's national portal
www.e-guvernare.ro aims at progressively making all services and
information accessible through the portal. However, the success of
these efforts depends, to a great extent, on how well the targeted
users for such services, citizens in general, make use of them. For
this reason, the purpose of the presented study was to identify what
factors could affect the citizens' adoption of e-government services.
The study is an extension of the Technology Acceptance Model. The
proposed model was validated using data collected from 481 citizens.
The results provided substantial support for all proposed hypotheses
and showed the significance of the extended constructs.
Abstract: This paper presents a comparative study of coded data methods for assessing the benefit of concealing natural data that constitute a trade secret. The influence of the number of replicates (rep), treatment effects (τ), and standard deviation (σ) on the efficiency of each transformation method is investigated. The experimental data are generated via computer simulation under specified process conditions with a completely randomized design (CRD). Three data transformation methods are considered: Box-Cox, arcsine, and logit. The differences in F statistics between coded and natural data (Fc-Fn) and the hypothesis testing results were determined. The experimental results indicate that the Box-Cox results differ significantly from the natural data at smaller numbers of replicates and seem improper when a negative lambda parameter is assigned. On the other hand, the arcsine and logit transformations are more robust and clearly provide more precise numerical results. In addition, alternative ways to select the lambda in the power transformation are offered to achieve more appropriate outcomes.
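The three transformations compared in the study have standard closed forms. A minimal sketch follows; these are the textbook parameterizations, which may differ in detail from the paper's:

```python
import math

def arcsine_transform(p):
    """Arcsine (angular) transform for proportion data in [0, 1]."""
    return math.asin(math.sqrt(p))

def logit_transform(p):
    """Logit transform for proportions strictly inside (0, 1)."""
    return math.log(p / (1.0 - p))

def box_cox(y, lam):
    """One-parameter Box-Cox power transform for positive y; the
    lam == 0 case is the log transform by continuity."""
    if lam == 0:
        return math.log(y)
    return (y ** lam - 1.0) / lam
```

Note the domain restrictions in the comments: negative lambda values invert the ordering behavior of small observations, which is consistent with the abstract's caution about assigning a negative lambda.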
Abstract: The Internet and its ever growing applications enable communities to share and collaborate through common platforms. However, this growing pattern has not yet been witnessed for e-learning. This paper is based on doctoral research which aimed at investigating the ways students interact in an online campus and the support that they look for and require. Content analysis, based on the Panchoo/Jaillet methodology, was carried out on four synchronous meetings between a tutor and his ten students. The UNIV-Rct e-campus, analogous to a physical campus, was found to be user friendly, and the students enrolled in a master's course faced no difficulties in using it. In addition to the environmental aspects, the pedagogical implementation of the course drove the students to interact and collaborate significantly, and this contributed to overcoming the problems faced by the distance learners. This completely online model was found to be fruitful in helping distant learners fight their loneliness and brave their difficulties in a socio-constructivist approach.
Abstract: This paper aims to present a framework for organizational knowledge management which seeks to deploy a standardized structure for the integrated management of knowledge through a common language based on domains, processes, and global indicators inspired by the COBIT 5 framework (ISACA, 2012), and which supports the integration of three technologies: enterprise information architecture (EIA), business process modeling (BPM), and service-oriented architecture (SOA). The Gomak Framework is a management platform that seeks to integrate the information technology infrastructure, the application structure, the information infrastructure, and the business logic and business model to support a sound strategy of organizational knowledge management, following a process-based approach and concurrent engineering. Concurrent engineering (CE) is a systematic approach to integrated product development that responds to customer expectations, involving all perspectives in parallel from the beginning of the product life cycle (European Space Agency, 2000).
Abstract: Automated production lines with so-called 'hard structures' are widely used in manufacturing. Designers segment these lines into sections by placing buffers between series of machine tools to increase productivity. In real production conditions the capacity of a buffer system is limited, and a real production line can compensate for only part of the productivity losses of an automated line. The productivity of such production lines cannot be readily determined. This paper presents a mathematical approach to determining the structure of section-based automated production lines by the criterion of maximum productivity.
Abstract: This paper presents a new technique for compensating the effect of parameter variations in the direct field oriented control of induction motors. The proposed method uses adaptive tuning of the synchronous speed to obtain robustness for the field oriented control. We show that this adaptive tuning makes direct field oriented control robust to changes in rotor resistance, load torque, and rotational speed. The effectiveness of the proposed control scheme is verified by numerical simulations, whose results show good performance compared to the usual direct field oriented control.
Abstract: This paper describes the development of fully automated measurement software for antenna radiation pattern measurements in a Compact Antenna Test Range (CATR). The CATR has a frequency range of 2-40 GHz, and the measurement hardware includes a network analyzer for transmitting and receiving the microwave signal and a positioner controller to control the motion of the Styrofoam column. The measurement process includes calibration of the CATR with a Standard Gain Horn (SGH) antenna, followed by gain-versus-angle measurement of the antenna under test (AUT). The software is designed to control a variety of microwave transmitters/receivers and two-axis positioner controllers through the standard General Purpose Interface Bus (GPIB). Addition of new network analyzers is supported through a slight modification of the hardware control module. Time-domain gating is implemented to remove unwanted signals and obtain the isolated response of the AUT. The gated response of the AUT is compared with the calibration data in the frequency domain to obtain the desired results. The data acquisition and processing are implemented in Agilent VEE and Matlab. A variety of experimental measurements with SGH antennas were performed to validate the accuracy of the software. A comparison of results with existing commercial software packages is presented, and the measured results are found to agree within 0.2 dB.
Abstract: This paper presents a new Hybrid Fuzzy (HF) PID-type controller based on Genetic Algorithms (GAs) for the solution of the Automatic Generation Control (AGC) problem in a deregulated electricity environment. For a fuzzy rule based control system to perform well, the fuzzy sets must be carefully designed. A major problem plaguing the effective use of this method is the difficulty of accurately constructing the membership functions, because it is a computationally expensive combinatorial optimization problem. On the other hand, GAs are a technique that emulates biological evolutionary theory to solve complex optimization problems by using directed random searches to derive a set of optimal solutions. For this reason, the membership functions are tuned automatically using a modified GA based on the hill climbing method. The motivation for using the modified GA is to reduce the fuzzy system design effort and take large parametric uncertainties into account. The global optimum value is guaranteed using the proposed method, and the speed of the algorithm's convergence is greatly improved, too. This newly developed control strategy combines the advantages of GAs and fuzzy system control techniques and leads to a flexible controller with a simple structure that is easy to implement. The proposed GA-based HF (GAHF) controller is tested on a three-area deregulated power system under different operating conditions and contract variations. The results of the proposed GAHF controller are compared with those of a Multi Stage Fuzzy (MSF) controller, a robust mixed H2/H∞ controller, and classical PID controllers through several performance indices to illustrate its robust performance over a wide range of system parameters and load changes.
Abstract: The drastic increase in the usage of SMS technology has led service providers to seek a solution that enables users of mobile devices to access services through SMSs. This has resulted in proposals for SMS-based service invocation in service-oriented environments. However, the dynamic nature of service-oriented environments, coupled with the sudden load peaks generated by service requests, poses performance challenges to infrastructures supporting SMS-based service invocation. To address this problem we adopt load balancing techniques. A load balancing model with adaptive load balancing and load monitoring mechanisms as its key constructs is proposed. This model led to the realization of the Least Loaded Load Balancing Framework (LLLBF). Evaluation of LLLBF, benchmarked against a round robin (RR) scheme in a queuing approach, showed that LLLBF outperformed RR in terms of response time and throughput, although it achieved this better result at the cost of higher processing power.
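The least-loaded policy behind LLLBF can be illustrated with a toy dispatcher; the class, its unit cost model, and its names are hypothetical, not the framework's actual API:

```python
import heapq

class LeastLoadedBalancer:
    """Toy least-loaded dispatcher: each request goes to the worker with
    the smallest current load, tracked in a min-heap. Names and the
    static cost model are illustrative assumptions."""

    def __init__(self, workers):
        # Heap of (current_load, worker_name); all workers start idle.
        self.heap = [(0, w) for w in workers]
        heapq.heapify(self.heap)

    def dispatch(self, cost=1):
        # Pop the least-loaded worker, charge it the request cost,
        # and push it back so future requests see the updated load.
        load, worker = heapq.heappop(self.heap)
        heapq.heappush(self.heap, (load + cost, worker))
        return worker

lb = LeastLoadedBalancer(["a", "b", "c"])
served = [lb.dispatch() for _ in range(9)]
```

With equal request costs this degenerates to round robin, as in the nine dispatches above; the policy's advantage appears when per-request costs or worker loads are uneven, at the price of maintaining the load-monitoring state the abstract mentions.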
Abstract: The flow field around hypersonic vehicles is very complex and difficult to simulate. The boundary layers are squeezed between the shock layer and the body surface, and resolving the boundary layer, shock waves, and turbulent regions where the flow field has high gradients is difficult. Detached eddy simulation (DES) is a modification of a RANS model in which the model switches to a subgrid-scale formulation in regions fine enough for LES calculations. Regions near solid body boundaries, and regions where the turbulent length scale is less than the maximum grid dimension, are assigned the RANS mode of solution; where the turbulent length scale exceeds the grid dimension, the regions are solved using the LES mode. The grid resolution is therefore not as demanding as for pure LES, considerably cutting down the cost of the computation. In this research study, hypersonic flow is simulated at Mach 8 and different angles of attack to resolve the boundary layers and discontinuities properly, and the flow is also simulated in the long wake regions. The mesh differs slightly from that of RANS simulations: it is made dense near the boundary layers and in the wake regions to resolve them properly. Hypersonic blunt cone-cylinder bodies with frustums at angles of 5° and 10° are simulated, and an aerodynamic study is performed to calculate the aerodynamic characteristics of the different geometries. The results are then compared with experimental data as well as with a turbulence model (the SA model). The results achieved with the DES simulation have very good resolution as well as excellent agreement with the experimental and available data. Unsteady DES calculations are performed using a dual time stepping, or implicit time stepping, method. The simulations are performed at Mach number 8 and angles of attack from 0° to 10° for all these cases. The results and resolution of the DES model are found to be much better than those of the SA turbulence model.