Abstract: Key management is the most sensitive part of
cryptographic systems. It includes key generation, key
distribution, key storage, and key deletion, and it is widely
considered the hardest part of cryptography: designing secure
cryptographic algorithms is hard, and keeping the keys secret is
much harder. Cryptanalysts usually attack both symmetric and
public key cryptosystems through their key management. We
introduce a protocol to exchange cipher keys over an insecure
communication channel. The protocol is based on public key
cryptography, specifically elliptic curve cryptosystems. It also
tests the cipher keys, selecting only strong keys and rejecting
weak ones.
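The key-agreement core of such a protocol can be sketched as an elliptic-curve Diffie–Hellman exchange. This is a minimal sketch: the toy curve, base point, and private keys below are illustrative assumptions, not parameters from the paper.

```python
# Toy elliptic-curve Diffie-Hellman key exchange (illustrative parameters).
P_MOD, A = 17, 2                 # curve y^2 = x^3 + 2x + 2 over F_17
G = (5, 1)                       # base point of prime order 19

def pt_add(p1, p2):
    """Add two points on the curve; None denotes the point at infinity."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None
    if p1 == p2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD)
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD)
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def pt_mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = pt_add(acc, pt)
        pt = pt_add(pt, pt)
        k >>= 1
    return acc

# Hypothetical private keys; public keys are scalar multiples of G.
a_priv, b_priv = 3, 7
a_pub, b_pub = pt_mul(a_priv, G), pt_mul(b_priv, G)
shared_a = pt_mul(a_priv, b_pub)   # Alice's view of the shared key
shared_b = pt_mul(b_priv, a_pub)   # Bob's view of the shared key
```

Both parties arrive at the same shared point; a key-quality test in the spirit of the abstract would, at a minimum, reject degenerate results such as the point at infinity.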
Abstract: Within the new world order, the term “crisis" is nowadays familiar to companies. Organizations are experiencing conditions that are surprising, uncertain, often adverse and usually unstable. Companies that grasp the importance of transformation in the information age have felt the need to develop modern methods to achieve the ability to thrive despite severe shocks. Through strategically managing human resources and developing appropriate elements of the human resource system, companies can be better assured of resolving crises. In this paper, the role of the HR system in resolving crises is evaluated. To help accomplish this, an insight into previous strategic HRM literature and an introduction to the elements of, and relationships within, HR systems are presented. The paper also reviews different attitudes toward resilience in the literature. It continues by reviewing three elements central to developing an organization's capacity for crisis resolution, and demonstrates how designing proper elements of the HR system can give organizations the ability to pass through crises. Finally, it evaluates an Iranian insurance organization with respect to one of the three central elements (specific cognitive ability) and observes how successful it was in developing an effective HR system to be ready to face crises.
Abstract: Many issues affect the modeling and design of real-time databases. One of these is maintaining consistency between the actual state of a real-time object in the external environment and its images as reflected by all its replicas distributed over multiple nodes. The need to improve scalability is another important issue. In this paper, we present a general framework for designing a replicated real-time database for small to medium scale systems that maintains all timing constraints. To extend the idea to modeling a large scale database, we present a general outline that improves scalability by applying an existing static segmentation algorithm to the whole database with the intent of lowering the degree of replication; it enables segments to have individual degrees of replication so as to avoid excessive resource usage, all of which together contributes to solving the scalability problem for DRTDBS.
Abstract: In the course of growing and developing firms, especially
high-tech firms, the manager is often mainly occupied with solving
day-to-day business problems and making decisions about the firm's
executive activities. Yet beyond such executive measures, planning
for the firm's success and growth, and applying long experience and
sagacity to the design of the business model, are vital. Success in
a business results from several factors, one of the most important
being the design and execution of an optimal business model at the
start of the firm's work. This model determines the profitability
achievable through innovation and the value added gained. The
business model is thus the process that connects the innovation and
technology environment with the economic and business environment,
and it is important for the success of modern businesses given
their characteristics.
Abstract: This paper presents a probabilistic horizontal seismic
hazard assessment of Naghan, Iran. It provides probabilistic
estimates of Peak Ground Horizontal Acceleration (PGHA) for
return periods of 475, 950 and 2475 years. The output of the
probabilistic seismic hazard analysis is based on peak ground
acceleration (PGA), the most common criterion in the design of
buildings. A catalogue of seismic events that includes both
historical and instrumental events was compiled, covering the
period from 840 to 2009. The seismic sources that affect the hazard
in Naghan were identified within a radius of 200 km, and the
recurrence relationships of these sources were generated with the
Kijko and Sellevoll method. Finally, PGHA values were prepared to
indicate the earthquake hazard of Naghan for different hazard
levels using the SEISRISK III software.
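Under the usual Poisson assumption, the quoted return periods correspond to familiar exceedance probabilities over a design exposure time; a minimal sketch (the 50-year exposure is the conventional design value, an assumption here):

```python
import math

def exceedance_prob(return_period_years, exposure_years=50):
    """Probability of at least one exceedance in the exposure time,
    assuming earthquake occurrence follows a Poisson process."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

p475 = exceedance_prob(475)     # ~0.10: the familiar "10% in 50 years"
p950 = exceedance_prob(950)     # ~0.05
p2475 = exceedance_prob(2475)   # ~0.02: "2% in 50 years"
```

This is why the 475- and 2475-year return periods are conventionally described as 10% and 2% probability of exceedance in 50 years.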
Abstract: Medical studies often require different methods
for parameter selection, as a second step of processing after the
database is designed and filled with information. One common
task is the selection of fields that act as risk factors using
well-known methods, in order to find the most relevant risk factors
and to establish a possible hierarchy between them. Different
methods are available for this purpose, one of the best known being
binary logistic regression. We present the mathematical
principles of this method and a practical example of using it to
analyze the influence of 10 different psychiatric diagnoses
on 4 different types of offences (in a database of 289
psychiatric patients involved in different types of offences).
Finally, we make some observations about the relation
between the risk factor hierarchy established through binary
logistic regression and the individual risks, as well as the results
of the Chi-squared test. We show that the hierarchy built using
binary logistic regression does not agree with the direct order of
risk factors, even though it would be natural to assume this
hypothesis to be always true.
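The principle of the method can be sketched on synthetic data. The two risk factors and their true coefficients below are illustrative assumptions (not the study's psychiatric data); the exponentiated coefficients play the role of the odds ratios used to rank risk factors:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x = rng.standard_normal((n, 2))              # two synthetic "risk factors"
true_logit = 1.5 * x[:, 0] - 1.0 * x[:, 1] + 0.3
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), x])         # add intercept column
w = np.zeros(3)
for _ in range(1000):                        # gradient ascent on log-likelihood
    mu = 1 / (1 + np.exp(-X @ w))
    w += 0.5 * X.T @ (y - mu) / n

odds_ratio_f1 = np.exp(w[1])                 # e^beta: odds ratio of factor 1
```

The fitted coefficients recover the assumed signs, and the odds ratio e^beta is the quantity typically reported when ordering risk factors.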
Abstract: When designing information systems that deal with
large amounts of domain knowledge, system designers need to consider
the ambiguities of labeling terms in the domain vocabulary when
navigating users through the information space. The goal of this
study is to develop a methodology for system designers to label
navigation items while taking account of ambiguities stemming from
synonyms or polysemes among labeling terms. In this paper, we
propose a method for concept labeling based on mappings between a
domain ontology and a thesaurus, and report the results of an
empirical evaluation.
Abstract: Aspect Oriented Programming promises many
advantages at the programming level by separating cross-cutting
concerns into units called aspects. Join points are a
distinguishing feature of Aspect Oriented Programming, as they
define the points where core requirements and crosscutting concerns
are (inter)connected. Currently, composing multiple aspects at the
same join point is problematic, as it raises issues such as the
ordering and control of these superimposed aspects. Dynamic
strategies are required to handle these issues as early as
possible. The state chart is an effective modeling tool for
capturing dynamic behavior at the high-level design stage. This
paper provides a methodology for formulating strategies for
multiple-aspect composition at a high level, which helps to
implement these strategies better at the coding level. It also
highlights the need for designing shared join points at a high
level by providing solutions to these issues using state chart
diagrams in UML 2.0. A high-level design representation of shared
join points also helps to implement the designed strategy in a
systematic way.
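The ordering problem at a shared join point can be illustrated with a small sketch in which Python decorators stand in for aspects; the aspect names and the join point are hypothetical, not taken from the paper:

```python
import functools

def aspect(name, log):
    """A crosscutting concern applied as a wrapper around a join point."""
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            log.append(f"{name}:before")
            result = fn(*args, **kwargs)
            log.append(f"{name}:after")
            return result
        return wrapper
    return deco

log = []

@aspect("auth", log)      # outermost aspect: first on entry, last on exit
@aspect("trace", log)     # innermost aspect: closest to the core logic
def transfer(amount):     # the shared join point with two superimposed aspects
    log.append("core")
    return amount

result = transfer(10)
```

The order in which the wrappers are applied fixes the execution order of the superimposed aspects, which is exactly the kind of composition strategy the paper argues should be settled at the high-level design stage rather than discovered at coding time.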
Abstract: In the context of sensor networks, where every few
dB of saving counts, we review novel node cooperation schemes in
which MIMO techniques play a leading role. These methods can be
treated as a joint approach to designing the physical layer of the
corresponding communication scenarios. We then analyze the BER
performance of transmit diversity schemes under a general fading
channel model and propose a power allocation strategy for the
transmitting sensor nodes. This approach is compared to an
equal-power assignment method, and its performance enhancement is
verified by simulation. Another key contribution lies in the
combination of optimal power allocation and sensor node cooperation
in a transmit diversity regime (MISO). Numerical results are given
in figures to demonstrate the optimality and efficiency of the
proposed combined approach.
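The transmit-diversity baseline can be sketched as a Monte-Carlo BER simulation of a 2×1 Alamouti scheme with BPSK over flat Rayleigh fading and an equal power split between the antennas; this is an illustrative baseline only, and the paper's optimized power allocation is not reproduced here:

```python
import numpy as np

def alamouti_ber(snr_db, n_pairs=100_000, seed=0):
    """Monte-Carlo BER of 2x1 Alamouti BPSK over flat Rayleigh fading,
    with the total transmit power split equally between both antennas."""
    rng = np.random.default_rng(seed)
    n0 = 10.0 ** (-snr_db / 10)                      # noise power (Es = 1)
    s = rng.choice([-1.0, 1.0], size=(n_pairs, 2))   # BPSK symbol pairs
    h = (rng.standard_normal((n_pairs, 2)) +
         1j * rng.standard_normal((n_pairs, 2))) / np.sqrt(2)
    w = np.sqrt(n0 / 2) * (rng.standard_normal((n_pairs, 2)) +
                           1j * rng.standard_normal((n_pairs, 2)))
    a = np.sqrt(0.5)                                 # per-antenna amplitude
    # Slot 1 transmits (s1, s2); slot 2 transmits (-s2*, s1*), here real.
    r1 = a * (h[:, 0] * s[:, 0] + h[:, 1] * s[:, 1]) + w[:, 0]
    r2 = a * (-h[:, 0] * s[:, 1] + h[:, 1] * s[:, 0]) + w[:, 1]
    # Linear combining decouples the symbols with diversity order 2.
    z1 = np.conj(h[:, 0]) * r1 + h[:, 1] * np.conj(r2)
    z2 = np.conj(h[:, 1]) * r1 - h[:, 0] * np.conj(r2)
    bits_hat = np.sign(np.column_stack([z1.real, z2.real]))
    return np.mean(bits_hat != s)

ber_low, ber_high = alamouti_ber(2), alamouti_ber(12)
```

The BER falls steeply between 2 dB and 12 dB total SNR, reflecting the second-order diversity that makes such schemes attractive for energy-constrained sensor nodes.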
Abstract: Nowadays, people are going more and more mobile, both in terms of devices and associated applications, and the services these devices offer are getting wider and much more complex. Even though current handheld devices have considerable computing power, their contexts of utilization differ. These contexts are affected by the availability of connection, the high latency of wireless networks, battery life, screen size, on-screen or hard keyboard, etc. Consequently, the development of mobile applications and their associated mobile Web services, if any, should follow a concise methodology so that they provide a high quality of service. The aim of this paper is to highlight and discuss the main issues to consider when developing mobile applications and mobile Web services, and then to propose a framework that leads developers through different steps and modules toward the development of efficient and secure mobile applications. First, the different challenges in developing such applications are elicited and discussed in depth. Second, a development framework is presented with different modules addressing each of these challenges. Third, the paper presents an example of a mobile application, Eivom Cinema Guide, which benefits from following our development framework.
Abstract: In this paper we analyze the core issues affecting
software architecture in enterprise projects where a large number of
people with different backgrounds are involved and complex business,
management and technical problems exist. We first give general
features of typical enterprise projects and then present foundations of
software architectures. The detailed analysis of core issues affecting
software architecture in software development phases is given. We
focus on three main areas in each development phase: people,
process, and management related issues, structural (product) issues,
and technology related issues. After we point out core issues and
problems in these main areas, we give recommendations for
designing good architecture. We observed these core issues, and the
importance of following best software development practices, in
many large enterprise commercial and military projects over about
10 years of experience, and we also developed some novel practices
along the way.
Abstract: The world's population continues to grow at a quarter of a million people per day, increasing energy consumption and confronting the world with an energy crisis. In response, the principles of renewable energy have gained popularity, and much advancement has been made in developing wind and solar energy farms across the world. These farms alone, however, are not enough to meet the world's energy requirement, which has attracted investors to procure new substitute sources of energy. Among these sources, the extraction of energy from waves is considered one of the best options: the world's oceans contain enough energy to meet global demand, and significant advancements in design and technology are being made to turn waves into a continuous source of energy. One major hurdle to launching wave energy devices in a developing country like Pakistan is the initial cost; a simple, reliable and cost-effective wave energy converter (WEC) is required to meet the nation's energy needs. This paper presents a novel design, proposed by team SAS, for harnessing wave energy. The paper has three major sections. The first gives a brief and concise view of ocean wave creation and propagation and the energy carried by waves. The second explains the design of SAS-2, in which a gear chain mechanism transfers energy from the buoy to a rotary generator. The third explains the manufacturing of a scaled-down model of SAS-2, with many modifications made during the troubleshooting stage. The design of SAS-2 is simple and requires very little maintenance. SAS-2 is producing electricity at Clifton, and its initial cost is very low, making SAS-2 a cost-effective and reliable means of harnessing wave energy for developing countries.
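As a rough illustration of the energy carried by ocean waves, the deep-water energy flux of a regular wave per metre of crest follows from linear wave theory; the wave height and period below are assumed example values, not measurements from SAS-2:

```python
import math

def wave_power_per_metre(H, T, rho=1025.0, g=9.81):
    """Deep-water energy flux of a regular wave per metre of crest (W/m):
    P = rho * g^2 * H^2 * T / (32 * pi), from linear wave theory."""
    return rho * g ** 2 * H ** 2 * T / (32 * math.pi)

p = wave_power_per_metre(1.0, 8.0)   # roughly 8 kW per metre of wave crest
```

Even a modest 1 m, 8 s swell carries on the order of kilowatts per metre of crest, which is what makes wave energy conversion attractive despite the engineering hurdles.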
Abstract: Today, Internet-based communication has widened
the opportunities for event monitoring systems in the medical
field. There is a constant need for analyzing and designing secure
and reliable mobile communication between a hospital and the mobile
units of its biomedical engineers. This study was carried out to
find a possible solution using SIP-based event notification for
alerting technical staff about Biomedical Device (BMD) status and
patients' treatment sessions. The Session Initiation Protocol (SIP)
can be used to create a medical event notification system, and SIP
can work on a variety of devices. Its adoption as the protocol of
choice for third generation wireless networks allows for a robust
and scalable environment. One of the advantages of SIP is that it
supports personal mobility through the separation of user
addressing and device addressing. The solution for the Telemed
alert notification system is based on SIP-Specific Event
Notification. The aim of this project is to extend the mobility
service to hospital technicians who use the Telemedicine system.
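A sketch of the kind of SUBSCRIBE request used in SIP-Specific Event Notification (RFC 3265, later RFC 6665) follows; the device URI, the event package name, and all header values are hypothetical placeholders, not identifiers from the Telemed system:

```python
def build_subscribe(device_uri, event_package, expires=3600):
    """Build a SIP SUBSCRIBE request in the style of SIP-Specific Event
    Notification; all header values here are illustrative placeholders."""
    return "\r\n".join([
        f"SUBSCRIBE {device_uri} SIP/2.0",
        "Via: SIP/2.0/UDP technician.example.org;branch=z9hG4bK776asdhds",
        f"To: <{device_uri}>",
        "From: <sip:technician@example.org>;tag=1928301774",
        "Call-ID: a84b4c76e66710@example.org",
        "CSeq: 1 SUBSCRIBE",
        f"Event: {event_package}",
        f"Expires: {expires}",
        "Contact: <sip:technician@198.51.100.4>",
        "Content-Length: 0",
        "",
        "",
    ])

msg = build_subscribe("sip:bmd-42@hospital.example.com", "bmd-status")
```

A notifier that accepts the subscription would then send NOTIFY requests carrying the device status whenever the monitored state changes, which is the mechanism the alert system builds on.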
Abstract: One of the criteria in production scheduling is
makespan; minimizing this criterion leads to more efficient use of
resources, especially machinery and manpower. By assigning some
budget to some of the operations, the operation times of these
activities are reduced, which affects the total completion time of
all the operations (the makespan). In this paper this issue is
applied to parallel flow shops. We first convert the parallel flow
shop to a network model and then, using a linear programming
approach, identify which activities (operations) should absorb the
predetermined and limited budget in order to minimize the makespan
(the completion time of the network). Minimizing the total
completion time of all the activities in the network is equivalent
to minimizing the makespan in production scheduling.
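The budget-allocation idea can be sketched as a small linear program (assuming SciPy is available); the network, durations, crash limits, unit costs, and budget below are hypothetical example data, not the paper's case:

```python
from scipy.optimize import linprog

# Two parallel routes through the operation network (hypothetical data):
# route 1 = operations 0,1 (durations 5,4); route 2 = operations 2,3 (6,2).
durations = [5.0, 4.0, 6.0, 2.0]
crash_max = [2.0, 1.0, 2.0, 0.0]      # max time reduction per operation
crash_cost = [1.0, 2.0, 1.0, 1.0]     # budget units per time unit reduced
budget = 3.0

# Variables: c0..c3 (crash amounts) and T (makespan); minimise T.
c_obj = [0, 0, 0, 0, 1]
A_ub = [[-1, -1, 0, 0, -1],           # (5-c0)+(4-c1) <= T
        [0, 0, -1, -1, -1],           # (6-c2)+(2-c3) <= T
        crash_cost + [0]]             # total crashing cost <= budget
b_ub = [-(durations[0] + durations[1]),
        -(durations[2] + durations[3]),
        budget]
bounds = [(0, m) for m in crash_max] + [(0, None)]

res = linprog(c_obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
makespan = res.x[-1]                  # optimal makespan after crashing
```

With these numbers the uncrashed routes take 9 and 8 time units; spending the budget of 3 on the cheapest operations equalizes the routes at a makespan of 7, which the LP finds directly.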
Abstract: The objective of the present study was to evaluate the
potential of hollow microneedles for enhancing the transdermal
delivery of Bovine Serum Albumin (MW~66,000 Da)-Fluorescein
Isothiocyanate (BSA-FITC) conjugate, a hydrophilic large molecular
compound. Moreover, the effect of different formulations was
evaluated. The series of binary mixtures composed of propylene
glycol (PG) and pH 7.4 phosphate buffer solution (PBS) was
prepared and used as a medium for BSA-FITC. The results showed
that there was no permeation of BSA-FITC solution across the
neonatal porcine skin without using hollow microneedles, whereas
the cumulative amount of BSA-FITC released at 8 h through the
neonatal porcine skin was about 60-70% when using hollow
microneedles. Furthermore, the results demonstrated that the higher
the proportion of PG in the injected binary mixture, the lower the
cumulative amount of BSA-FITC released and its release rate from
the skin. These release profiles of BSA-FITC in binary mixtures
followed Fick's law of diffusion. These results support the use of
hollow microneedles to enhance the transdermal delivery of proteins
and provide useful information for designing an effective hollow
microneedle system.
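Release profiles governed by Fickian diffusion are commonly summarized by a square-root-of-time (Higuchi-type) dependence, Q(t) = k·√t. A minimal fitting sketch on hypothetical cumulative-release data (not the study's measurements):

```python
import numpy as np

t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])        # sampling times (h)
q = np.array([18.0, 25.0, 36.0, 50.0, 62.0, 70.0])  # cumulative release (%)

root_t = np.sqrt(t)
k = float(root_t @ q / (root_t @ root_t))  # least-squares slope through origin
q_pred = k * root_t                        # Higuchi-type sqrt(t) profile
rmse = float(np.sqrt(np.mean((q - q_pred) ** 2)))
```

A small residual error against the √t line is the usual diagnostic that a release profile is diffusion-controlled in the Fickian sense.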
Abstract: Finite impulse response (FIR) filters have the advantages of linear phase, guaranteed stability, fewer finite precision errors, and efficient implementation. In contrast, they have the major disadvantage of requiring a higher order (more coefficients) than their IIR counterparts for comparable performance. The high order demand imposes more hardware requirements, arithmetic operations, area usage, and power consumption when designing and fabricating the filter. Therefore, minimizing or reducing these parameters is a major goal in the digital filter design task. This paper presents an algorithm for modifying the values and the number of non-zero coefficients used to represent the FIR digital pulse shaping filter response. With this algorithm, the FIR filter frequency and phase response can be represented with a minimum number of non-zero coefficients, reducing the arithmetic complexity needed to compute the filter output. Consequently, system characteristics such as power consumption, area usage, and processing time are also reduced. The proposed algorithm is most powerful when integrated with multiplierless techniques such as distributed arithmetic (DA) in designing high order digital FIR filters. Here, DA eliminates the need for multipliers when implementing the multiply and accumulate (MAC) unit, and the proposed algorithm reduces the number of adders and addition operations needed to compute the filter output by minimizing the number of non-zero coefficients.
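The coefficient-reduction idea can be illustrated by zeroing negligible taps of a windowed lowpass design and checking the resulting frequency-response deviation; the filter order, cutoff, and 10^-3 threshold below are assumed values for illustration, not the paper's algorithm:

```python
import numpy as np

M, fc = 100, 0.2                  # filter order and cutoff (cycles/sample)
n = np.arange(M + 1)
h = 2 * fc * np.sinc(2 * fc * (n - M / 2)) * np.hamming(M + 1)  # lowpass FIR

# Zero out taps whose magnitude is negligible (assumed threshold 1e-3).
h_sparse = np.where(np.abs(h) > 1e-3, h, 0.0)

H = np.fft.fft(h, 1024)
H_sparse = np.fft.fft(h_sparse, 1024)
max_dev = float(np.max(np.abs(H - H_sparse)))   # frequency-response error
nz_orig = int(np.count_nonzero(h))
nz_sparse = int(np.count_nonzero(h_sparse))
```

Each dropped tap would otherwise cost one multiply-accumulate per output sample, so the count of non-zero coefficients translates directly into arithmetic savings, while the worst-case response deviation is bounded by the sum of the dropped magnitudes.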
Abstract: Prime factorization based on a quantum approach is
performed in two phases. The first phase is carried out on a
quantum computer and the second phase (post-processing) on a
classical computer. In the second phase, the goal is to estimate
the period r of the equation x^r ≡ 1 (mod N) and to find the prime
factors of the composite integer N on the classical computer. In
this paper we present a method based on a randomized approach for
estimating the period r with a satisfactory probability, so that
the composite integer N can be factorized; with the randomized
approach, even if the estimate of the period is not exactly the
real period, we can at least find one of the prime factors of the
composite N. Finally, we present some important points for
designing an emulator for quantum computer simulation.
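The classical post-processing phase can be sketched as follows; the period-finding function below is a classical stand-in for the quantum order-finding step, and the choice x = 7, N = 15 is a standard textbook example:

```python
from math import gcd

def find_period(x, n):
    """Classical stand-in for the quantum order-finding phase:
    the smallest r with x^r = 1 (mod n)."""
    r, y = 1, x % n
    while y != 1:
        y = y * x % n
        r += 1
    return r

def factor_from_period(x, n):
    """Shor-style post-processing: derive factors of n from the period of x."""
    r = find_period(x, n)
    if r % 2:
        return None                   # odd period: choose another x
    y = pow(x, r // 2, n)
    if y == n - 1:
        return None                   # trivial square root: choose another x
    return sorted({gcd(y - 1, n), gcd(y + 1, n)})

factors = factor_from_period(7, 15)   # period r = 4 yields the factors [3, 5]
```

Here 7 has period 4 modulo 15, and gcd(7² ± 1, 15) recovers 3 and 5; the two rejection branches are why a randomized retry over different bases x is part of the method.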
Abstract: The present work is concerned with the effect of turning process parameters (cutting speed, feed rate, and depth of cut) and distance from the center of the workpiece, as input variables, on chip micro-hardness as the response or output. Three experiments were conducted to investigate chip micro-hardness behavior at workpiece diameters of 30 mm, 40 mm, and 50 mm. Response surface methodology (RSM) is used to determine and present the cause-and-effect relationship between the true mean response and the input control variables influencing the response as a two- or three-dimensional hypersurface. RSM has been used to design a three-factor, five-level central composite rotatable design in order to construct statistical models capable of accurate prediction of responses. The results obtained showed that RSM can predict the effect of machining parameters on chip micro-hardness, and that five-level factorial designs can easily be employed for developing statistical models that predict chip micro-hardness from controllable machining parameters. The results also showed that the combined effect of cutting speed at its lower level, feed rate and depth of cut at their higher values, and a larger workpiece diameter results in increased chip micro-hardness.
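A three-factor, five-level central composite rotatable design and the associated quadratic fit can be sketched as below; the response coefficients are hypothetical stand-ins, not the chip micro-hardness data:

```python
import numpy as np
from itertools import product

alpha = 1.682                              # rotatable axial distance, 3 factors
pts = [list(p) for p in product([-1.0, 1.0], repeat=3)]  # 2^3 factorial points
for i in range(3):
    for s in (-alpha, alpha):              # six axial (star) points
        ax = [0.0, 0.0, 0.0]
        ax[i] = s
        pts.append(ax)
pts.append([0.0, 0.0, 0.0])                # centre point
X = np.array(pts)                          # 15-run central composite design
# Each factor takes five coded levels: -alpha, -1, 0, +1, +alpha.

def quad_terms(X):
    """Full quadratic model matrix: intercept, linear, interaction, square."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1 ** 2, x2 ** 2, x3 ** 2])

beta_true = np.array([50.0, -4.0, 3.0, 2.0, 1.0, 0.0, 0.0, -1.5, 0.5, 0.0])
y = quad_terms(X) @ beta_true              # hypothetical noise-free response
beta_hat, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
```

The 15-run design is just sufficient to estimate all 10 coefficients of the second-order surface, which is what makes the CCD the standard choice for RSM studies of this kind.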
Abstract: Chemical detection remains a continual challenge when
it comes to designing single-walled carbon nanotube (SWCNT)
sensors with high selectivity, especially in complex chemical
environments. A perfect example of such an environment is
thermally oxidized soybean oil. At elevated temperatures, oil oxidizes
through a series of chemical reactions which results in the formation of
monoacylglycerols, diacylglycerols, oxidized triacylglycerols, dimers,
trimers, polymers, free fatty acids, ketones, aldehydes, alcohols,
esters, and other minor products. In order to detect the rancidity of
oxidized soybean oil, carbon nanotube chemiresistor sensors have
been coated with polyethylenimine (PEI) to enhance the sensitivity
and selectivity. PEI functionalized SWCNTs are known to have a high
selectivity towards strong electron withdrawing molecules. The
sensors were very responsive to different oil oxidation levels and
furthermore, displayed a rapid recovery in ambient air without the
need of heating or UV exposure.
Abstract: Nowadays, the earth is confronted with the serious
problem of air pollution. This problem began with the industrial
revolution and has accelerated in recent years, leading the earth
toward ecological and environmental disaster. One of its results is
global warming and the related increase in global temperature. The
most important factors in air pollution, especially in urban
environments, are automobiles and residential buildings, the
biggest consumers of fossil energy; if residential buildings, as a
large share of the consumers of such energy, reduce their
consumption rate, air pollution will decrease. Since metropolises
are the main centers of air pollution in the world, assessing and
analyzing efficient strategies for decreasing air pollution in such
cities can lead to desirable results and can mitigate the problem
at least at the critical level. Tabriz is one of the most important
metropolises in the northwest of Iran, with about two million
inhabitants. Because of its location in a cold, dry climate, it has
a high rate of fossil energy consumption, which pollutes its urban
environment. These two factors, being both a metropolis and in a
cold, dry climate, lead this article to analyze the strategies of
climatic design in the old districts of the city and to apply them
in the new districts of the future. These strategies can be used in
this city and other similar cities, paving the way to reduce energy
consumption and the related air pollution worldwide.