Abstract: Reverse engineering is an important process in software engineering. It can be performed backwards through the system development life cycle (SDLC) in order to recover the source data or representations of a system through analysis of its structure, function and operation. We use reverse engineering to introduce an automatic tool that generates system requirements from program source code. The tool accepts Cµ program source code, scans the source code line by line and passes the code to a parser. The tool's engine then generates the system requirements for that specific program, facilitating reuse and enhancement of the program. The purpose of producing the tool is to help recover the system requirements of a system when the system requirements document (SRD) does not exist because the system is undocumented.
Abstract: Membrane distillation (MD) is an emerging technology for seawater and brine desalination. In this work, the performance of air gap membrane distillation (AGMD) was investigated for aqueous NaCl solutions as well as natural ground water and seawater. To enhance the performance of the AGMD process in desalination, that is, to obtain more flux, it is necessary to study the effect of the operating parameters on the yield of distillate water. The influence of operational parameters such as feed flow rate, feed temperature, feed salt concentration, coolant temperature and air gap thickness on the MD permeation flux was investigated for low and high salt concentrations. In the application to natural ground water and seawater over 90 h of continuous operation, scale deposits were observed on the membrane surface, and the flux reduction over the 90 h reached 23% for ground water and 60% for seawater. This reduction was largely eliminated (to less than 14%) by acidification of the feed water. These results promote research attention toward applying AGMD to ground water and seawater desalination as an alternative to today's conventional RO operation.
Abstract: Heavy rainfall greatly affects the aerodynamic performance of aircraft, and many aircraft accidents have been caused by the degradation of aerodynamic efficiency in heavy rain. In this paper we study the effects of heavy rain on the aerodynamic efficiency of NACA 64-210 and NACA 0012 airfoils. For our analysis, a CFD method and a preprocessing grid generator are used as the main analytical tools, and rain is simulated via the Discrete Phase Model (DPM) of the two-phase flow approach. Raindrops are assumed to be non-interacting, non-deforming, non-evaporating and non-spinning spheres. Both airfoil sections exhibited a significant reduction in lift and an increase in drag for a given lift condition in simulated rain. The most significant difference between the two airfoils was the sensitivity of the NACA 64-210 to liquid water content (LWC), whereas the performance losses of the NACA 0012 in the rain environment are not a function of LWC. It is expected that the quantitative information gained in this paper will be useful to the operational airline industry, and that greater effort, such as small-scale and full-scale flight tests, should be put in this direction to further improve aviation safety.
Abstract: The purpose of this paper is to develop models for predicting student success. Such models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. To collect data, an anonymous survey was carried out among the last-year undergraduate student population using random sampling. Decision trees were created, of which the two most successful at predicting student success were chosen based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have been shown to be a good method for classifying student success, and they could be further improved by increasing the survey sample and by developing specialized decision trees for each type of college. Methods of this kind have great potential for use in decision support systems.
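A trained decision tree of the kind described above reduces to a cascade of threshold tests. The sketch below illustrates that idea only; the features, thresholds and class labels are hypothetical and are not the trees or survey variables from the paper.

```python
# Minimal sketch of decision-tree-style classification of student success.
# All features, split thresholds and class labels are hypothetical
# illustrations, not the fitted trees from the paper.

def predict_gpa_class(hours_per_week: float, entrance_score: float) -> str:
    """Toy decision tree for the GPA criterion (hypothetical splits)."""
    if entrance_score >= 75:
        return "high GPA" if hours_per_week >= 10 else "average GPA"
    return "average GPA" if hours_per_week >= 20 else "low GPA"

def predict_time_to_degree(failed_courses: int, works_part_time: bool) -> str:
    """Toy decision tree for the time-to-degree criterion (hypothetical splits)."""
    if failed_courses == 0:
        return "on time"
    return "delayed" if works_part_time else "slightly delayed"

print(predict_gpa_class(12, 80))        # -> high GPA
print(predict_time_to_degree(1, True))  # -> delayed
```

In practice such trees would be induced from the survey data by an algorithm such as C4.5 or CART rather than written by hand.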
Abstract: In today's information age, many organizations are still debating how to capitalize on the values of Information Technology (IT) and Knowledge Management (KM) so that individuals can benefit and effective communication among individuals can be established. IT enables positive improvement in communication among knowledge workers (k-workers) through a number of social network technology domains in the workplace. The acceptance of digital discourse for sharing knowledge and facilitating knowledge and information flows in most organizations indeed fosters a culture of knowledge sharing in Digital Social Networks (DSN). Therefore, this study examines whether k-workers with an IT background confer an effect on three knowledge characteristics: conceptual, contextual, and operational. Derived from these three knowledge characteristics, five potential factors are examined for their effects on knowledge exchange via the e-mail domain as the chosen medium. It is expected that the results could provide a parameter for exploring how DSN contributes to supporting k-workers' virtues, performance and qualities, as well as revealing the mutual point between IT and KM.
Abstract: A 3.5-bit stage of a CMOS pipelined ADC is proposed. In this report, the main parts of the 3.5-bit stage are introduced, and the operation and design of the MDAC, comparator and encoder are shown in detail. In addition, an OTA used in the fully differential pipelined ADC is described. Using a gain-boost architecture with a differential amplifier, this OTA achieves high gain and high speed. The design uses a 0.18 µm CMOS process and was simulated in Cadence. The simulation results show that the OTA has a gain of up to 80 dB and a unity-gain bandwidth of about 1.138 GHz with a 2 pF load.
Abstract: ZnO nanocrystals with a mean diameter of 14 nm were prepared by a precipitation method and examined as a photocatalyst for the UV-induced degradation of the insecticide diazinon as a representative organic pollutant in aqueous solution. The effects of various parameters, such as illumination time, amount of photocatalyst, initial pH and initial insecticide concentration, on the photocatalytic degradation of diazinon were investigated to find the desired conditions. The desired parameters were also tested for the treatment of real water containing the insecticide. The photodegradation efficiency of diazinon was compared between commercial and prepared ZnO nanocrystals. The results indicated that the UV/ZnO process using the prepared nanocrystalline ZnO offered better electrical energy efficiency and quantum yield than commercial ZnO. On the basis of the Langmuir-Hinshelwood mechanism, the present study illustrated a pseudo-first-order kinetic model with a surface reaction rate constant of 0.209 mg L⁻¹ min⁻¹ and an adsorption equilibrium constant of 0.124 L mg⁻¹.
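The Langmuir-Hinshelwood rate law referenced above, written with the reported constants (this is the standard form of the model; $k_r$ denotes the surface reaction rate constant and $K$ the adsorption equilibrium constant):

```latex
r = -\frac{dC}{dt} = \frac{k_r K C}{1 + K C},
\qquad k_r = 0.209\ \mathrm{mg\,L^{-1}\,min^{-1}},
\quad K = 0.124\ \mathrm{L\,mg^{-1}}
```

At low concentrations, where $KC \ll 1$, this reduces to the pseudo-first-order form $r \approx k_r K C$, with an apparent rate constant $k_{\mathrm{app}} = k_r K \approx 0.026\ \mathrm{min^{-1}}$.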
Abstract: Telemedicine has been brought to life by contemporary changes in our world and encompasses the entire range of services at the crossroads of traditional healthcare and information technology. It is believed that eHealth can help in solving the critical issues of rising costs, care for an ageing and housebound population, and staff shortages. It is a feasible tool for providing routine as well as specialized health services, as it has the potential to improve both access to and the standard of care. eHealth is no longer an optional choice. It has already come quite a way, but it still remains a formidable challenge for the future, requiring cooperation and coordination at all possible levels. The strategic objectives of this paper are: 1. to start with an attempt to clarify the mass of terms used nowadays; 2. to answer the question "Who needs eHealth?"; 3. to focus on the necessity of bridging telemedicine and medical (health) informatics, as well as on the dual relationship between them; and 4. to underline the need for networking in understanding, developing and implementing eHealth.
Abstract: It was determined that woody biomass and livestock excreta can be utilized as hydrogen resources, and that hydrogen produced from such sources can be used to fill fuel cell vehicles (FCVs) at hydrogen stations. It was shown that the biomass transport costs for hydrogen production may be lower than those for co-generation. In the Tokyo Metropolitan Area, there are only a few sites capable of producing hydrogen from woody biomass in amounts greater than 200 m³/h, the scale required for a hydrogen station to be operationally practical. In the case of livestock excreta, however, it was shown that 15% of the municipalities in this area are capable of securing sufficient biomass for hydrogen production to be operationally practical. The differences in the feasibility of practical operation depend on the type of biomass.
Abstract: The upgrading of low-quality crude natural gas (NG) is attracting interest owing to the high demand for pipeline-grade gas in recent years. Membrane processes are a commercially proven technology for the removal of impurities such as carbon dioxide from NG. In this work, a cross-flow mathematical model is incorporated into ASPEN HYSYS as a user-defined unit operation in order to design membrane systems for CO2/CH4 separation. The effects of operating conditions (such as feed composition and pressure) and membrane selectivity on the design parameters (methane recovery and the total membrane area required for the separation) are studied for different design configurations. These configurations include single-stage (with and without recycle) and double-stage membrane systems (with and without permeate or retentate recycle). It is shown that methane recovery can be improved by recycling the permeate or retentate stream as well as by using double-stage membrane systems. The ASPEN HYSYS user-defined unit operation proposed in this study has the potential to be applied to complex membrane system design and optimization.
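The paper's cross-flow model itself is not reproduced here, but its simpler textbook relative, a single-stage, perfectly mixed binary permeation model, illustrates how selectivity and pressure ratio set the permeate composition and methane recovery. All parameter values below are illustrative assumptions, not data from the paper.

```python
import math

# Sketch of a single-stage, perfectly mixed binary gas permeation model
# (a simpler relative of the cross-flow model used in the paper).
# alpha: CO2/CH4 selectivity; r: permeate/feed pressure ratio;
# x_f: CO2 mole fraction in the feed. All values are illustrative.

def permeate_co2_fraction(x_f: float, alpha: float, r: float) -> float:
    """Solve y/(1-y) = alpha*(x_f - r*y)/((1-x_f) - r*(1-y)) for y."""
    a = r * (1.0 - alpha)
    b = (1.0 - x_f) - r + alpha * (r + x_f)
    c = -alpha * x_f
    # Physically meaningful (0 < y < 1) root of a*y^2 + b*y + c = 0.
    return (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)

x_f, alpha, r = 0.10, 30.0, 0.10   # illustrative feed and membrane values
y = permeate_co2_fraction(x_f, alpha, r)

# Approximate CH4 recovery at a small stage cut theta (retentate ~ feed).
theta = 0.10
ch4_recovery = 1.0 - theta * (1.0 - y) / (1.0 - x_f)
print(f"permeate CO2 fraction: {y:.3f}, CH4 recovery: {ch4_recovery:.3f}")
```

Sweeping the pressure ratio or selectivity in such a model shows the recovery/area trade-off that motivates the recycle and double-stage configurations studied in the paper.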
Abstract: The electromagnetic spectrum is a natural resource, and well-organized usage of this limited resource is a necessity for better communication. The present static frequency allocation schemes cannot accommodate the demands of the rapidly increasing number of higher-data-rate services. Therefore, dynamic usage of the spectrum must be distinguished from static usage to increase the availability of the frequency spectrum. Cognitive radio is not a single piece of apparatus but a technology that can incorporate components spread across a network. It offers great promise for improving system efficiency, spectrum utilization and application effectiveness, reducing interference, and reducing the complexity of usage for users. A cognitive radio is aware of its environment, internal state, and location, and autonomously adjusts its operation to achieve designed objectives. It first senses its spectral environment over a wide frequency band and then adapts its parameters to maximize spectrum efficiency with high performance. This paper focuses on the analysis of the bit error rate (BER) in cognitive radio using the Particle Swarm Optimization algorithm. The BER is analyzed and interpreted both theoretically and practically, in terms of its advantages and drawbacks and of how it affects the efficiency and performance of the communication system.
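The Particle Swarm Optimization loop used in such an analysis has a standard shape, sketched below. The paper's actual BER objective is not reproduced; a simple quadratic stand-in (minimum 0 at the origin) is substituted so the optimizer's behavior is visible, and the inertia/acceleration coefficients are common defaults, not the paper's settings.

```python
import random

def objective(x):
    """Stand-in for a BER-style cost function (assumed, not from the paper)."""
    return sum(v * v for v in x)

def pso(dim=2, particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Standard global-best PSO minimizing `objective` over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]                   # per-particle best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso()
print(f"best cost: {best_val:.2e}")
```

In a BER-minimization setting, `objective` would instead evaluate the error rate of a candidate set of radio parameters, with everything else unchanged.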
Abstract: Vehicular Ad-Hoc Networks (VANETs) provide communications between vehicles and infrastructure. They offer driving convenience and secure driving to reduce accidents. In a VANET, security is especially important because it is closely related to accidents. Additionally, a VANET raises a privacy issue, because the locations of vehicles and the identities of users can be tracked when a security mechanism is provided. In this paper, we analyze the problems of an existing solution for the security requirements of VANETs, and we resolve the problem of the existing method when a key management mechanism is provided for security operation in a VANET. We then show the suitability of Long Term Evolution (LTE) in VANETs as a solution to this problem.
Abstract: Sustainability and sustainable development have been the main theme of many international conferences, such as the 1992 UN Earth Summit in Rio de Janeiro. This was followed by a series of global conferences in the late 1990s and early 2000s that confirmed the importance of sustainable development and focused on economic development as an effective tool in the operations of sustainable development. Industry plays a critical role in technological innovation and in research and development activities, which are crucial for the economic and social development of any country. Transportation and mobility are an important part of urban economics and of the quality of life. To analyze urban transportation and its environmental impacts, a comprehensive approach is needed. This research therefore aims to apply a new approach to the development of urban communities, one that ensures continuity and counters deterioration. The approach aims to integrate sustainable transport solutions with economic development and community development. For that purpose we concentrate on one of the most sustainable cities in the world, Curitiba in Brazil, which provides the world with a model of how to integrate sustainable transport considerations into business development, road infrastructure development, and local community development.
Abstract: Currently, one of the main directions of Kazakhstan's economic development is clustering, which provides for the organization and concentration of production capacity in one region or in the most optimal system. In the modern economic literature, clustering is regarded as one of the most effective tools for ensuring the competitiveness of businesses and for improving the businesses themselves.
Abstract: Equipment miniaturisation offers several opportunities such as an increased surface-to-volume ratio and higher heat transfer coefficients. However, moving towards small-diameter channels demands extra attention to fouling, reliability and stable operation of the system. The present investigation explores possibilities to enhance the stability of the once-through micro evaporator by reducing its flow boiling induced pressure fluctuations. Experimental comparison shows that the measured reduction factor approaches a theoretically derived value. Pressure fluctuations are reduced by a factor of ten in the solid conical channel and a factor of 15 in the porous conical channel. This presumably leads to less backflow and therefore to a better flow control.
Abstract: In practice, wireless networks have the property that signal strength attenuates with distance from the base station, so quality of service can be improved if nodes two hops away are also considered. In this paper, we propose a procedure to identify delay-preserving substructures in a given wireless ad-hoc network using a new graph operation G² − E(G) = G* (the edge difference of the square of a given graph and the original graph). This operation helps to analyze certain induced substructures that preserve delay in communication among them. Applied to a given graph, the operation yields a graph G* in which the 1-hop neighbors of any node are at 2-hop distance in the original network. We also identify some delay-preserving substructures in G*: (i) a set of nodes that are mutually at 2-hop distance in G forms a clique in G*; (ii) a set of nodes forming an odd cycle C2k+1 in G forms an odd cycle in G*, while a set of nodes forming an even cycle C2k in G forms two disjoint companion cycles (of the same parity, odd/even) of length k in G*; (iii) every path of length 2k+1 or 2k in G induces two disjoint paths of length k in G*; and (iv) a set of nodes in G* that induces a maximal connected subgraph of radius 1 identifies a substructure with radius 2 and diameter at most 4 in G. These delay-preserving substructures behave as good clusters in the original network.
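The operation G² − E(G) connects exactly those node pairs at distance exactly 2 in G, which makes it easy to compute directly from an adjacency structure. The sketch below (a plain-Python illustration, not the authors' implementation) builds G* and demonstrates the odd- and even-cycle properties on C5 and C6.

```python
# Sketch of the graph operation G* = G^2 - E(G): connect exactly those
# pairs of nodes that are at distance exactly 2 in the original graph G.
# G is given as a dict mapping each node to its set of neighbors.

def star_graph(adj):
    """Return the edge set of G* as a set of frozenset pairs."""
    edges = set()
    for u, nbrs in adj.items():
        for v in nbrs:
            for w in adj[v]:
                # w is a neighbor-of-neighbor: at distance exactly 2
                # unless it is u itself or already a direct neighbor of u.
                if w != u and w not in nbrs:
                    edges.add(frozenset((u, w)))
    return edges

def cycle(n):
    """Adjacency of the cycle C_n on nodes 0..n-1."""
    return {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}

# Odd cycle C5: G* is again a 5-cycle (property (ii) above).
print(sorted(tuple(sorted(e)) for e in star_graph(cycle(5))))
# Even cycle C6: G* splits into two disjoint triangles {0,2,4} and {1,3,5}.
print(sorted(tuple(sorted(e)) for e in star_graph(cycle(6))))
```

For C5 the result is the 5-cycle 0-2-4-1-3-0, and for C6 the two companion triangles, matching the abstract's cycle properties with k = 3.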
Abstract: This paper presents the modeling of a MEMS-based accelerometer for detecting the presence of a wheel flat in a railway vehicle. A haversine wheel flat is assigned to one wheel of a 5-DOF pitch-plane vehicle model, which is coupled to a 3-layer track model. Based on the simulated acceleration response obtained from the vehicle-track model, an accelerometer is designed that meets all the requirements for detecting the presence of a wheel flat. The proposed accelerometer can survive in a dynamic shock environment with accelerations of up to ±150 g. The parameters of the accelerometer are calculated to achieve the required specifications using a lumped-element approximation, and the results are used for the initial design layout. A finite element analysis code (COMSOL) is used to simulate the accelerometer under various operating conditions and to determine the optimum configuration. The simulated results are within about 2% of the calculated values, which indicates the validity of the lumped-element approach. The stability of the accelerometer is also determined over the desired range of operation, including under shock conditions.
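The lumped-element approximation mentioned above treats the accelerometer as a spring-mass system, from which the natural frequency and static sensitivity follow directly. The sketch below uses hypothetical mass and stiffness values chosen only for illustration; they are not the parameters of the accelerometer designed in the paper.

```python
import math

# Lumped-element sizing sketch for a spring-mass accelerometer.
# The proof-mass and stiffness values are hypothetical, not the
# parameters of the accelerometer designed in the paper.

m = 2.0e-9      # proof mass in kg (hypothetical)
k = 8.0         # suspension stiffness in N/m (hypothetical)

omega0 = math.sqrt(k / m)        # natural angular frequency, rad/s
f0 = omega0 / (2.0 * math.pi)    # natural frequency, Hz
g = 9.81                         # m/s^2

# Static displacement per 1 g of applied acceleration: x = m*a/k = a/omega0^2.
x_per_g = g / omega0 ** 2
# Displacement at the +/-150 g shock level mentioned in the abstract.
x_150g = 150.0 * x_per_g

print(f"f0 = {f0 / 1e3:.1f} kHz, x per g = {x_per_g * 1e9:.3f} nm")
```

Checking the displacement at the ±150 g survival level against the available gap is one quick way such lumped estimates feed the initial design layout before FEA refinement.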
Abstract: Integrated Total Quality Management (TQM) with Lean Manufacturing (LM) is a system that combines TQM and LM principles and is associated with financial and non-financial performance measurement indicators. The ultimate goal of this system is to achieve total customer satisfaction by removing the eight wastes present in any process in an organization. A survey questionnaire was developed, distributed to 30 highly active automotive vendors in Malaysia, and analyzed with PASW Statistics 18. It was found that these vendors have been practicing and measuring the effectiveness of TQM and LM implementation. Greater involvement of all Malaysian automotive vendors would represent the exact status of the current Malaysian automotive industry in implementing TQM and LM, and could determine whether the industry is ready for an integrated TQM and LM system. This is the first study to combine four award practices, ISO/TS16949, the Toyota Production System and SAE J4000.
Abstract: The manufacturing of transmission line tower parts generates hazardous waste, which requires proper disposal to protect against land pollution. In the manufacturing process, steel angles, plates, pipes and channels pass through conventional, semi-automatic and CNC machines for cutting, marking, punching, drilling, notching and bending operations. All fabricated material is coated with a thin layer of zinc in a galvanizing plant, where molten zinc is used for the coating. Prior to galvanizing, chemicals such as 33% concentrated HCl acid, ammonium chloride and d-oil are used for pretreatment of the iron. A bath of water with sodium dichromate is used for cooling and protecting the galvanized steel, and furnace oil burners are used for heating. These processes generate zinc dross, zinc ash, ETP sludge and waste pickling acid as hazardous waste. RPG has operated a captive secured landfill (SLF) site since 1997 for the disposal of this hazardous waste. After the original SLF reached capacity, RPG raised its height above ground level as designed, and the site continues to be used for waste disposal without leachate or other adverse environmental impacts.
Abstract: The purpose of this paper is to propose a framework for constructing correct parallel processing programs based on the Equivalent Transformation Framework (ETF). In this framework, a problem's domain knowledge and a query are described in definite clauses, and computation is regarded as transformation of the definite clauses. The meaning of a program is defined by a model of the set of definite clauses, and the transformation rules generated must preserve this meaning. We have previously proposed a parallel processing method based on "specialization", a part of the operations in the transformations, which resembles substitution in logic programming. That method requires a "Memo-tree", a history of specializations, to maintain correctness. In this paper we propose a new method for specialization-based parallel processing without the Memo-tree.