Abstract: At present, the dictionary attack has been the basic tool for recovering passwords. To avoid dictionary attacks, users purposely choose other character strings as passwords. According to statistics, about 14% of users choose keys on a keyboard (Kkeys, for short) as passwords. This paper develops a framework system to attack passwords chosen from Kkeys and analyzes its efficiency. Within this system, we build keyboard rules from the adjacent and parallel relationships among Kkeys and then use these Kkey rules to generate password databases by a depth-first search. According to the experimental results, the key space of the databases derived from these Kkey rules can be far smaller than that of the password databases generated by a brute-force attack, effectively narrowing down the attack search space. Taking one general Kkey rule, the combinations over all 94 printable characters with Kkey adjacent and parallel relationships, as an example, the derived key space is about 2^40 times smaller than that of a brute-force attack. In addition, we demonstrate the method's practicality and value by successfully cracking access passwords to UNIX and PC systems using the password databases created.
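As a rough illustration of the approach described above, the following is a minimal sketch of generating an adjacency-constrained candidate password database by depth-first search. The adjacency map below covers only a small QWERTY fragment and is an assumption for illustration; the paper's full Kkey rule set spans all 94 printable characters and also includes parallel relationships.

```python
# Hypothetical adjacency rules for a small QWERTY fragment (illustrative only;
# the real system covers all 94 printable characters).
ADJACENT = {
    "q": ["w", "a"],
    "w": ["q", "e", "s"],
    "e": ["w", "r", "d"],
    "a": ["q", "s", "z"],
    "s": ["w", "a", "d", "x"],
    "d": ["e", "s", "f", "c"],
}

def generate(max_len):
    """Depth-first enumeration of candidate passwords in which every pair of
    consecutive characters lies on physically adjacent keys."""
    results = []

    def dfs(prefix):
        if len(prefix) == max_len:
            results.append(prefix)
            return
        for nxt in ADJACENT.get(prefix[-1], []):
            dfs(prefix + nxt)

    for start in ADJACENT:
        dfs(start)
    return results
```

Because each step only branches over a key's few physical neighbours instead of the whole character set, the enumerated key space grows much more slowly with password length than brute force does.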
Abstract: Chinese idioms are a type of traditional Chinese idiomatic expression with specific meanings and stereotyped structures, widely used in classical Chinese and still common in vernacular written and spoken Chinese today. Currently, Chinese idioms are retrieved from glossaries by key character or key word through morphology or pronunciation indices, which cannot meet the need for semantic search. OCIRS is proposed to search for the desired idiom when users know only its meaning, without any key character or key word. The user's request, given as a sentence or phrase, is first analyzed grammatically through word segmentation, key word extraction and semantic similarity computation, and can thus be mapped to the idiom domain ontology, which is constructed to provide ample semantic relations and to facilitate description-logics-based reasoning for idiom retrieval. The experimental evaluation shows that OCIRS realizes the function of searching for idioms via semantics, obtaining the preliminary results requested by users.
Abstract: Severe acute respiratory syndrome (SARS) is a respiratory disease in humans caused by the SARS coronavirus. The treatment of coronavirus-associated SARS has been evolving, and so far there is no consensus on an optimal regimen. The mainstream therapeutic interventions for SARS involve broad-spectrum antibiotics and supportive care, as well as antiviral agents and immunomodulatory therapy. Protein-ligand interaction plays a significant role in structure-based drug design. In the present work we have taken the receptor angiotensin-converting enzyme 2 (ACE-2) and identified the drugs that are commonly used against SARS: Lopinavir, Ritonavir, Ribavirin and Oseltamivir. The receptor ACE-2 was docked with these drugs, and the energy values obtained are as follows: Lopinavir (-292.3), Ritonavir (-325.6), Oseltamivir (-229.1), Ribavirin (-208.8). Based on the lowest energy values, we chose the best two of the four conventional drugs, Ritonavir and Lopinavir, and tried to improve their binding efficiency and steric compatibility. Several modifications were made to the probable functional groups (phenylic and ketonic groups in the case of Ritonavir, and carboxylic groups in the case of Lopinavir) that interact with the receptor molecule. Analogs were prepared with the Marvin Sketch software and docked using the HEX docking software. Lopinavir analog 8 and Ritonavir analog 11 showed significant energy values and are probable lead molecules. This suggests that some of the modified drugs are better than the original drugs. Further work can be carried out to improve the steric compatibility of the drugs, building on the work above, for more energy-efficient binding of the drugs to the receptor.
Abstract: Reducing the energy consumption of embedded systems requires careful memory management. It has been shown that Scratch-Pad Memories (SPMs) are small, low-cost, energy-efficient memories managed directly at the software level. In this paper, the focus is on heuristic methods for SPM management. A method is efficient if the number of accesses to the SPM is as large as possible and if all available space (i.e. bits) is used. A Tabu Search (TS) approach to memory management is proposed which is, to the best of our knowledge, an original alternative to the best known existing heuristic (BEH). Experiments performed on benchmarks show that the Tabu Search method is as efficient as BEH (in terms of energy consumption), but BEH requires a sorting step that can be computationally expensive for large amounts of data. TS is easy to implement and, since no sorting is necessary, the corresponding sorting time is saved. In addition, in a dynamic setting where the maximum capacity of the SPM is not known in advance, the TS heuristic will perform better than BEH.
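The SPM management problem described above can be read as selecting which data objects to place in the scratch-pad so that the total number of accesses is maximized without exceeding its capacity. The following is a minimal Tabu Search sketch under that knapsack-style reading; the flip neighborhood, tabu tenure, iteration budget and the `accesses`/`sizes` inputs are all illustrative assumptions, not the paper's actual TS design.

```python
import random

def tabu_spm(accesses, sizes, capacity, iters=200, tenure=5, seed=0):
    """Tabu Search for SPM allocation (sketch): pick objects to place in the
    scratch-pad to maximize total accesses under a capacity constraint.
    Neighborhood: flip one object in/out; recently flipped indices are tabu,
    unless the move improves on the best solution found (aspiration)."""
    rng = random.Random(seed)
    n = len(accesses)
    sol = [False] * n          # start with an empty SPM
    best = sol[:]
    tabu = {}                  # index -> iteration until which the flip is tabu

    def value(s):
        size = sum(sz for sz, inc in zip(sizes, s) if inc)
        if size > capacity:
            return -1          # infeasible allocation
        return sum(a for a, inc in zip(accesses, s) if inc)

    for it in range(iters):
        candidates = []
        for i in range(n):
            neigh = sol[:]
            neigh[i] = not neigh[i]
            v = value(neigh)
            if v < 0:
                continue
            # aspiration: a tabu move is allowed only if it beats the best
            if tabu.get(i, -1) >= it and v <= value(best):
                continue
            candidates.append((v, i, neigh))
        if not candidates:
            break
        v, i, sol = max(candidates, key=lambda t: t[0])
        tabu[i] = it + tenure
        if v > value(best):
            best = sol[:]
    return best, value(best)
```

Note that, unlike BEH, nothing here requires the objects to be pre-sorted; the search only evaluates single-flip neighbors of the current allocation.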
Abstract: Textile structures are engineered and fabricated to meet worldwide structural applications. Nevertheless, research on varying the textile structure of natural-fibre composite reinforcement has been very limited. Most research focuses on short fibres and randomly oriented discontinuous reinforcement structures. Recognizing that natural fibre (NF) composites have been widely developed as replacements for synthetic fibre composites, this research examines the influence of woven and cross-ply laminated structures on their mechanical performance. Laminated natural fibre composites were fabricated using hand lay-up and vacuum bagging techniques. Impact and flexural strength were investigated as functions of fibre type (coir and kenaf) and reinforcement structure (imbalanced plain woven, 0°/90° cross-ply and +45°/-45° cross-ply). A multi-level full factorial design of experiments (DOE) and analysis of variance (ANOVA) were employed to reveal how the fibre type and reinforcement structure parameters affect the mechanical properties of the composites. This systematic experimentation led to the determination of the significant factors that predominantly influence the impact and flexural properties of the textile composites. Both fibre type and reinforcement structure produced significantly different results. Overall, the results indicated that the coir composite and the woven structure exhibited better impact and flexural strength, while the cross-ply composite structure demonstrated better fracture resistance.
Abstract: This paper presents a particle swarm optimization algorithm with particle reduction for global optimization problems. Particle swarm optimization is an algorithm inspired by collective motion, such as that of flocks of birds or schools of fish, and a multi-point search algorithm that finds the best solution using multiple particles. Particle swarm optimization is so flexible that it can be adapted to a wide range of optimization problems. When an objective function has many intricate local minima, a particle may fall into one of them. To avoid local minima, a large number of particles is initially prepared and their positions are updated by particle swarm optimization. The particles are then sequentially reduced, based on their evaluation values, until a predetermined number remains, and particle swarm optimization continues until the termination condition is met. To show the effectiveness of the proposed algorithm, we search for the minima of test functions and compare the results with existing algorithms. Furthermore, the influence of the initial number of particles on the best value obtained by our algorithm is discussed.
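A minimal sketch of the reduction idea described above, assuming a standard inertia-weight PSO and one illustrative reduction schedule (periodically dropping the worst-evaluated particle until the target count remains); the coefficients are textbook values, not the paper's settings.

```python
import random

def pso_with_reduction(f, dim, n_init=40, n_final=10, iters=100,
                       bounds=(-5.0, 5.0), w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO with particle reduction (sketch): start with many particles and
    periodically discard the worst one until n_final remain."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_init)]
    vel = [[0.0] * dim for _ in range(n_init)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_init), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for t in range(iters):
        for i in range(len(pos)):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
        # reduction step: every other iteration, drop the worst particle
        # (by personal-best value) until only n_final particles remain
        if t % 2 == 1 and len(pos) > n_final:
            worst = max(range(len(pos)), key=lambda i: pbest_val[i])
            for arr in (pos, vel, pbest, pbest_val):
                arr.pop(worst)
    return gbest, gbest_val
```

The large initial swarm explores widely to escape local minima, while the shrinking swarm keeps later iterations cheap.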
Abstract: Ever since the industrial revolution began, our ecosystem has changed, and the negatives outweigh the positives. Industrial waste is usually released into bodies of water such as rivers or the sea. Tempeh waste is one example of a waste stream that carries many hazardous and unwanted substances that affect the surrounding environment. Tempeh is a popular fermented food in Asia which is rich in nutrients and active substances. Tempeh liquid waste in particular can cause air pollution, and if it penetrates the soil it will contaminate the groundwater, making the water unfit for consumption. Moreover, bacteria thrive in the polluted water and are often responsible for causing many kinds of diseases. The treatments used for this chemical waste are biological treatments such as constructed wetlands and activated sludge. These treatments are able to reduce both physical and chemical parameters such as temperature, TSS, pH, BOD, COD, NH3-N, NO3-N, and PO4-P, and are applied before the waste is released into the water. The result is a comparison between constructed wetlands and activated sludge, determining which method is better suited to reducing the physical and chemical substances of the waste.
Abstract: In this study, an inland metropolitan area in Korea, Gwangju, was selected to assess the amplification potential of earthquake motion and to provide information for regional seismic countermeasures. A geographic information system-based expert system was implemented to reliably predict the spatial geotechnical layers over the entire region of interest by building a geo-knowledge database. In particular, the database consists of existing boring data gathered from prior geotechnical projects and surface geo-knowledge data acquired from site visits. For practical application of the geo-knowledge database to estimating the earthquake hazard potential related to site amplification effects in the study area, seismic zoning maps of geotechnical parameters, such as bedrock depth and site period, were created within the GIS framework. In addition, seismic zonation of site classification was also performed to determine the site amplification coefficients for seismic design at any site in the study area.
Keywords: Earthquake hazard, geo-knowledge, geographic information system, seismic zonation, site period.
Abstract: This paper explores the sense of place in the Vredefort Dome World Heritage Site, South Africa, as an essential input to the formulation of spatial planning proposals for the area. Intangible aspects such as the personal and symbolic meanings of sites are currently not integrated into spatial planning in South Africa. This may have a detrimental effect on local inhabitants who have a long history with the site and have built up a strong place identity. Involving local inhabitants at an early stage of the planning process and incorporating their attitudes and opinions into future interventions in the area may also contribute to the acceptance of the legitimacy of future policy. An interdisciplinary, mixed-method research approach was followed in this study in order to identify possible ways to anchor spatial planning proposals in the identity of the place. In essence, the qualitative study revealed that inhabitants have a deep and personal relationship with and within the area, which contributes significantly to their sense of emotional security and self-identity. Results include a strong conservation-orientated attitude with regard to the natural rural character of the site, especially in the inner core.
Abstract: In this study, we developed an algorithm for detecting
seam cracks in a steel plate. Seam cracks are generated in the edge
region of a steel plate. We used the Gabor filter and an adaptive double
threshold method to detect them. To reduce the number of pseudo
defects, features based on the shape of seam cracks were used. To
evaluate the performance of the proposed algorithm, we tested 989
images with seam cracks and 9470 defect-free images. Experimental
results show that the proposed algorithm is suitable for detecting seam
cracks. However, it should be improved to increase the true positive
rate.
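The pipeline described above, Gabor filtering followed by an adaptive double threshold, might be sketched roughly as follows; the kernel parameters, the quantile-based thresholds and the one-step dilation that links weak pixels to strong ones are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lambd=4.0, gamma=0.5):
    """Real part of a Gabor kernel (same parameterization as OpenCV's
    getGaborKernel); the parameter values here are illustrative."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2)) \
        * np.cos(2 * np.pi * xr / lambd)

def filter_image(img, kern):
    """'Valid' 2-D correlation implemented with sliding windows."""
    kh, kw = kern.shape
    windows = np.lib.stride_tricks.sliding_window_view(img, (kh, kw))
    return np.einsum("ijkl,kl->ij", windows, kern)

def double_threshold(resp, low_q=0.90, high_q=0.99):
    """Adaptive double threshold: the two thresholds are response quantiles,
    so they adapt per image; weak pixels are kept only when they touch a
    strong pixel (one dilation step)."""
    low, high = np.quantile(resp, [low_q, high_q])
    strong = resp >= high
    weak = (resp >= low) & ~strong
    grown = strong.copy()
    grown[1:, :] |= strong[:-1, :]
    grown[:-1, :] |= strong[1:, :]
    grown[:, 1:] |= strong[:, :-1]
    grown[:, :-1] |= strong[:, 1:]
    return strong | (weak & grown)
```

On a synthetic plate image with a vertical crack, the Gabor response peaks along the crack and the double threshold keeps only pixels at and around that ridge; shape-based pseudo-defect filtering would then run on the resulting mask.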
Abstract: The aim of this paper is to explore the prospects of a new approach to mobile phone banking in Libya. This study evaluates customer knowledge of commercial mobile banking in Libya. To examine the relationship between age, occupation and intention to use mobile banking for commercial purposes, a survey was conducted to gather information from one hundred Libyan bank clients. The results indicate that Libyan customers have accepted the new technology and are ready to use it. No significant joint relationship between age and occupation was found in the intention to use mobile banking in Libya. On the other hand, customers' knowledge of mobile banking has a stronger relationship with intention. This study has implications for demographic research and consumer behaviour disciplines. It also has profitable implications for banks and managers in Libya, as it will assist in a better understanding of Libyan consumers and their activities when they develop their market strategies and new services.
Abstract: Deoxyribonucleic acid (DNA) computing has emerged as an interdisciplinary field that draws together chemistry, molecular biology, computer science and mathematics. In this paper, the possibility of DNA-based computing to solve the absolute 1-center problem by molecular manipulations is presented. This is truly the first attempt to solve such a problem by a DNA-based computing approach. Since part of the procedure involves shortest-path computation, research on DNA computing for the shortest-path Traveling Salesman Problem (TSP) is reviewed. These approaches are studied, and the most appropriate one is adapted in designing the computation procedures. The DNA-based computation is designed in such a way that every path is encoded by oligonucleotides and the path's length is directly proportional to the length of the oligonucleotides. Using these properties, gel electrophoresis is performed in order to separate the respective DNA molecules according to their length. One expectation arising from this paper is that it is possible to verify instances of the absolute 1-center problem using DNA computing through laboratory experiments.
Abstract: The Internet and its ever-growing applications enable communities to share and collaborate through common platforms. However, this growing pattern has not yet been witnessed in e-learning. This paper is based on doctoral research which investigated the ways students interact in an online campus and the support that they look for and require. Content analysis, based on the Panchoo/Jaillet methodology, was performed on four synchronous meetings between a tutor and his ten students. The UNIV-Rct e-campus, analogous to a physical campus, was found to be user-friendly, and the students enrolled in a master's course faced no difficulties in using it. In addition to these environmental aspects, the pedagogical implementation of the course drove the students to interact and collaborate significantly, and this helped overcome the problems faced by the distance learners. This completely online model was found to be fruitful in helping distant learners fight their loneliness and brave their difficulties in a socio-constructivist approach.
Abstract: In this paper, we propose the use of convolutional codes for file dispersal. The proposed method is comparable in complexity to the Information Dispersal Algorithm proposed by M. Rabin and, for particular choices of (non-binary) convolutional codes, is almost as efficient as that algorithm in terms of controlling the expansion in total storage. Further, our proposed dispersal method allows string search.
Abstract: This paper presents a new Hybrid Fuzzy (HF) PID-type controller based on Genetic Algorithms (GAs) for the solution of the Automatic Generation Control (AGC) problem in a deregulated electricity environment. For a fuzzy rule-based control system to perform well, the fuzzy sets must be carefully designed. A major problem plaguing the effective use of this method is the difficulty of accurately constructing the membership functions, because this is a computationally expensive combinatorial optimization problem. On the other hand, GAs are a technique that emulates biological evolutionary theories to solve complex optimization problems, using directed random searches to derive a set of optimal solutions. For this reason, the membership functions are tuned automatically using a modified GA based on the hill-climbing method. The motivation for using the modified GA is to reduce the fuzzy-system design effort and to take large parametric uncertainties into account. The global optimum is guaranteed using the proposed method, and the speed of the algorithm's convergence is greatly improved as well. This newly developed control strategy combines the advantages of GAs and fuzzy control techniques and leads to a flexible controller with a simple structure that is easy to implement. The proposed GA-based HF (GAHF) controller is tested on a three-area deregulated power system under different operating conditions and contract variations. The results of the proposed GAHF controller are compared with those of a Multi-Stage Fuzzy (MSF) controller, a robust mixed H2/H∞ controller and classical PID controllers through several performance indices, to illustrate its robust performance over a wide range of system parameters and load changes.
Abstract: This study empirically investigates the value-relevance of accounting information to domestic investors in the Tehran Stock Exchange from 1999 to 2006. The impacts of two factors, positive vs. negative earnings and firm size, are considered as well. The authors used earnings per share and the annual change in earnings per share as the income statement indices, and book value of equity per share as the balance sheet index. Return and price models estimated through regression analysis are deployed in order to test the research hypotheses. The results showed that accounting information is value-relevant to domestic investors in the Tehran Stock Exchange according to both models studied. However, income statement information has more value-relevance than balance sheet information. Furthermore, positive vs. negative earnings and firm size appear to have a significant impact on the value-relevance of accounting information.
Abstract: The flow field around hypersonic vehicles is very complex and difficult to simulate. The boundary layers are squeezed between the shock layer and the body surface. Resolving the boundary layer, the shock wave and the turbulent regions where the flow has steep gradients is difficult. Detached eddy simulation (DES) is a modification of a RANS model in which the model switches to a subgrid-scale formulation in regions fine enough for LES calculations. Regions near solid boundaries, where the turbulent length scale is less than the maximum grid dimension, are assigned the RANS mode of solution; where the turbulent length scale exceeds the grid dimension, the regions are solved using the LES mode. Therefore the grid resolution is not as demanding as in pure LES, considerably cutting down the cost of the computation. In this study, hypersonic flow is simulated at Mach 8 and different angles of attack to resolve the boundary layers and discontinuities properly. The flow is also simulated in the long wake regions. The mesh differs slightly from that of RANS simulations and is made dense near the boundary layers and in the wake regions to resolve them properly. Hypersonic blunt cone-cylinder bodies with frustum angles of 5° and 10° are simulated, and an aerodynamics study is performed to calculate the aerodynamic characteristics of the different geometries. The results are then compared with experimental data as well as with a turbulence model (the SA model). The results achieved with the DES simulation have very good resolution and show excellent agreement with the experimental and available data. Unsteady DES calculations are performed using the dual time-stepping method, i.e. implicit time stepping. The simulations are performed at Mach number 8 and angles of attack from 0° to 10° for all these cases. The DES results and resolutions were found to be much better than those of the SA turbulence model.
Abstract: Recently, the business environment and customer needs have been changing rapidly; hence it is very difficult to fulfill sophisticated customer needs by product or service innovation alone. In practice, to cope with this problem, various manufacturing companies have developed services to combine with their products. Along with this, many academic studies on the PSS (Product-Service System), the integrated system of products and services, have been conducted from the viewpoint of manufacturers. On the other hand, service providers are also attempting to develop service-supporting products to increase their service competitiveness and provide differentiated value. However, there is a lack of research from the service-centric point of view. Accordingly, this paper proposes a concept generation method for service-supporting product development from the service-centric point of view. The method is designed to be executed in five consecutive steps: situation analysis, problem definition, problem resolution, solution evaluation, and concept generation. In the proposed approach, some tools of TRIZ (Theory of Inventive Problem Solving), such as the ISQ (Innovative Situation Questionnaire) and the 40 inventive principles, are employed in order to define problems of the current services and solve them by generating service-supporting product concepts. This research contributes to the development of service-supporting products and service-centric PSSs.
Abstract: This research aimed to develop an online learning activities kit and determine its quality, as well as to examine students' learning achievement and their satisfaction with the kit through authentic assessment. The research tools comprised an online learning activities kit on plants in Thai literature, in compliance with the School Botanical Garden of the Plant Genetic Conservation Project under the Royal Initiative of Her Royal Highness Princess Maha Chakri Sirindhorn, together with the assessment form, the learning achievement test, the satisfaction form and the authentic assessment form. The population consisted of 40 students in the second range of primary years (Prathomsuksa 4 to 6) at Ban Khao Rak School, Suratthani Province, Thailand. The results showed that the content quality of the developed online learning activities kit, as assessed by the experts, averaged 4.70, a very high level. A pre-test and post-test comparison was made to examine learning achievement, and it revealed that the post-test score was higher than the pre-test score with statistical significance at the .01 level. The satisfaction of the sample group with the online learning activities kit averaged 4.74, the highest level. The authentic assessment showed an average of 1.69, a good level. Therefore, the online learning activities kit on plants in Thai literature could be used in real classroom situations.
Abstract: This paper proposes a system to extract images from web pages and then detect the skin color regions of these images. As part of the proposed system, using the BandObject control, we built a toolbar named the 'Filter Tool Bar (FTB)' by modifying the Pavel Zolnikov implementation. The Yahoo! team provides the Yahoo! SDK API, which also supports image search and is very useful. In the proposed system, we introduce three new methods for extracting images from web pages (after loading the web page by using the proposed FTB, before loading the web page physically from the localhost, and before loading the web page from any server). These methods overcome the drawbacks of the regular-expression method for extracting images suggested by Ilan Assayag. The second part of the proposed system is concerned with the detection of skin color regions in the extracted images. We studied two well-known skin color detection techniques: the first is based on the RGB color space, and the second is based on the YUV and YIQ color spaces. We modified the second technique, using the saturation parameter, to overcome its failure to detect skin against complex image backgrounds and to obtain accurate skin detection results. The performance of the proposed system in extracting images before and after loading the web page from the localhost or any server, in terms of the number of extracted images, is evaluated. Finally, the results of comparing the two skin detection techniques in terms of the number of pixels detected are presented.
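For concreteness, one widely cited explicit RGB rule of the kind the first technique refers to is sketched below; the thresholds are the classic textbook values and are an assumption here, since the paper's exact RGB rule may differ.

```python
def is_skin_rgb(r, g, b):
    """Classic explicit RGB skin rule (illustrative thresholds): skin pixels
    are bright, red-dominant, and not grayish."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15   # rules out gray pixels
            and abs(r - g) > 15 and r > g and r > b)

def skin_mask(pixels):
    """Apply the rule to an iterable of (R, G, B) tuples."""
    return [is_skin_rgb(*p) for p in pixels]
```

Rules of this form are cheap per pixel but sensitive to lighting, which is why the abstract's second (YUV/YIQ-based) technique, augmented with a saturation check, handles complex backgrounds better.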