Abstract: In mechanical and environmental engineering, mixed convection is a frequently encountered thermal-fluid phenomenon, arising in the atmospheric environment, urban canopy flows, ocean currents, gas turbines, heat exchangers, computer chip cooling systems, and elsewhere. This paper presents a numerical investigation of mixed convection in a vertical heated channel. The flow results from the mixing of the fluid rising along the channel walls with the fluid issuing from a flat nozzle located in the entry section. The fluid-dynamic and heat-transfer characteristics of vented vertical channels are investigated for constant heat-flux boundary conditions, a Rayleigh number equal to 2.57×10^10, two jet Reynolds numbers (Re = 3×10^3 and Re = 2×10^4), and aspect ratios in the 8-20 range. The system of governing equations is solved with a finite-volume method and an implicit scheme. The results show that turbulence and the jet-wall interaction enhance the heat transfer, as does the entrainment of ambient air by the jet. For the low Reynolds number Re = 3×10^3, increasing the aspect ratio enhances the heat transfer by about 3%, whereas for Re = 2×10^4 the enhancement is about 12%. The computed velocity, pressure, and temperature fields are post-processed to obtain the quantities of engineering interest, such as the induced mass flow rate and the average Nusselt number, which are presented in terms of the Rayleigh number, the Reynolds number, and the dimensionless geometric parameters.
Abstract: In this paper, we present the information life cycle and analyze the importance of managing the corporate application portfolio across this life cycle. The approach presented here is not merely an extension of the traditional information-system development life cycle; it is based on the generic life cycle employed in other contexts, such as manufacturing or marketing. A model of the information-system life cycle is proposed, supported by the assumption that a system has a limited life, though this limited life may be extended. The model is also applied in several cases; two examples of the framework's application, in a construction enterprise and in a manufacturing enterprise, are reported here.
Abstract: Data clustering is an important data-exploration technique with many applications in data mining. We present an enhanced version of the well-known single-link clustering algorithm, which we refer to as DCBOR. The proposed algorithm alleviates the chaining effect by removing outliers from the given dataset, so it provides outlier detection and data clustering simultaneously. The algorithm does not need to update a distance matrix, since it merges the k nearest objects in one step, and a cluster continues to grow as long as possible under a specified condition. The algorithm therefore consists of two phases: in the first phase, it removes the outliers from the input dataset; in the second phase, it performs the clustering process. It discovers clusters of different shapes, sizes, and densities, and requires only one input parameter, a threshold for outlier points whose value ranges from 0 to 1; the algorithm supports the user in determining an appropriate value for it. We have tested the algorithm on different datasets that contain outliers and clusters connected by chains of density points, and it discovers the correct clusters. The results of our experiments demonstrate the effectiveness and efficiency of DCBOR.
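The two-phase idea (outlier removal followed by single-link-style merging with no distance-matrix updates) can be sketched as follows. This is a hypothetical reconstruction, not the paper's DCBOR implementation: the cutoff rule and the k-nearest merging step are simplified assumptions.

```python
import math

def dcbor_sketch(points, k=3, t=0.5):
    """Two-phase clustering in the spirit of DCBOR (illustrative only).
    Phase 1 drops outliers whose mean distance to their k nearest
    neighbours exceeds a cutoff derived from the threshold t in [0, 1].
    Phase 2 merges each surviving point with its k nearest neighbours,
    i.e. single-link growth via union-find, with no distance matrix."""
    n = len(points)

    def mean_knn_dist(i):
        ds = sorted(math.dist(points[i], points[j]) for j in range(n) if j != i)
        return sum(ds[:k]) / k

    # Phase 1: outlier removal by a relative distance score.
    scores = [mean_knn_dist(i) for i in range(n)]
    cutoff = min(scores) + t * (max(scores) - min(scores))
    core = [i for i in range(n) if scores[i] <= cutoff]

    # Phase 2: union-find over k-nearest-neighbour edges.
    parent = {i: i for i in core}
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in core:
        nearest = sorted((j for j in core if j != i),
                         key=lambda j: math.dist(points[i], points[j]))[:k]
        for j in nearest:
            parent[find(i)] = find(j)
    return {i: find(i) for i in core}   # point index -> cluster root
```

With t close to 0 the cutoff tightens and more points are treated as outliers, matching the abstract's description of a single parameter in [0, 1].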
Abstract: A new numerical scheme based on the H1-Galerkin mixed finite element method is constructed for a class of second-order pseudohyperbolic equations. The proposed procedure can be split into three independent differential sub-schemes and does not need to solve a coupled system of equations. Optimal error estimates are derived for both the semidiscrete and fully discrete schemes for problems in one space dimension, and the proposed method does not require the LBB consistency condition. Finally, some numerical results are provided to illustrate the efficacy of our method.
Abstract: In this paper, a novel copyright-protection scheme for digital images based on visual cryptography and statistics is proposed. In our scheme, the theory and properties of the sampling distribution of means and of visual cryptography are employed to meet the requirements of robustness and security. Our method does not need to alter the original image and can identify the ownership without resorting to the original image. Moreover, our method allows multiple watermarks to be registered for a single host image without causing any damage to other hidden watermarks, and it is also possible for our scheme to cast a larger watermark into a smaller host image. Finally, experimental results demonstrate the robustness of our scheme against several common attacks.
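The statistical ingredient can be illustrated as follows. This is a generic sketch of the sampling-distribution-of-means idea only: a secret key seeds the pixel sampling, and each sample mean compared against the global mean yields one share bit. The paper's actual share construction and its combination with the watermark are not reproduced here.

```python
import random

def master_share_bits(pixels, n_bits, sample_size=8, seed=42):
    """For each watermark bit, draw a keyed random sample of pixels and
    compare the sample mean with the global mean; the comparison gives
    one master-share bit.  By the sampling distribution of means, these
    comparisons are stable under small image distortions, which is the
    source of robustness the abstract refers to."""
    rnd = random.Random(seed)          # the seed plays the role of the secret key
    mu = sum(pixels) / len(pixels)     # population (whole-image) mean
    bits = []
    for _ in range(n_bits):
        sample = [pixels[rnd.randrange(len(pixels))] for _ in range(sample_size)]
        bits.append(1 if sum(sample) / sample_size >= mu else 0)
    return bits
```

Because the original image is never modified, only these derived bits are stored, which matches the abstract's claim that the host image is left untouched.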
Abstract: CloudSim is a useful tool for simulating cloud environments. It reports the service availability, the power consumption, and the network traffic of services in the cloud environment. Moreover, it can easily calculate network communication delays from network topology data. CloudSim allows a topology-data file to be supplied as input, but it does not provide any way of generating one, so the file must be produced by some other tool. BRITE is a typical network topology generator, and it supports various types of topology-generation algorithms. If CloudSim could incorporate BRITE, network simulation for clouds would be easier than with the existing version. This paper shows the potential of a connection between BRITE and CloudSim and proposes a direction for linking them.
Abstract: This paper presents an overview of the design and implementation of an online rule-based expert system for Islamic medication. This Online Islamic Medication Expert System (OIMES) focuses on physical illnesses only. The knowledge base of the expert system exhaustively covers the types of illness together with their related cures, treatments, and therapies, obtained exclusively from the Quran and Hadith. Extensive research and study were conducted to ensure that the expert system is able to provide the most suitable treatment with reference to the relevant verses cited in the Quran or Hadith. These verses come together with their related 'actions' (bodily actions, gestures, or other acts) to be performed by the patient to treat a particular illness. The verses and the instructions for the 'actions' are displayed unambiguously on the computer screen. The online platform offers patients the advantage of obtaining treatment practically anytime and anywhere, as long as a computer and an Internet connection are available; a patient does not need to make an appointment to see an expert for therapy.
Abstract: A molecular dynamics simulation of annular flow boiling in a nanochannel with 70,000 particles is investigated numerically. In this research, an annular flow model is developed to predict the superheated flow-boiling heat-transfer characteristics in a nanochannel. To characterize the forced annular boiling flow in the nanochannel, an external driving force F_ext ranging from 1 to 12 pN (piconewtons) is applied along the flow direction to the inlet fluid particles during the simulation. Based on the annular flow model analysis, it is found that the saturation condition and the degree of superheat have a strong influence on the liquid-vapor interface. The results also show that, owing to the relatively strong influence of surface tension in a small channel, the interface between the liquid film and the vapor core is fairly smooth, and the mean velocity along the stream-wise direction no longer changes.
Abstract: In public wireless LANs (PWLANs), user anonymity is an essential issue. Recently, Juang et al. proposed an anonymous authentication and key-exchange protocol using smart cards in PWLANs. They claimed that their scheme provides identity privacy, mutual authentication, and half-forward secrecy. In this paper, we point out that Juang et al.'s protocol is vulnerable to the stolen-verifier attack and does not satisfy user anonymity.
Abstract: The basic ingredients of concrete are cement, fine aggregate, coarse aggregate, and water. To produce a concrete with certain specified properties, optimum proportions of these ingredients are mixed. The important factors governing the mix design are the grade of concrete, the type of cement, and the size, shape, and grading of the aggregates. The conventional concrete mix-design method is based on experimentally evolved empirical relationships between these factors. Its basic drawbacks are that it does not always produce the desired strength, the calculations are cumbersome, and a number of tables must be consulted to arrive at a trial mix proportion; moreover, the attainment of the desired strength is uncertain, and the result may even fall below the target strength. To address this problem, a large number of cubes of standard grades were prepared and their 28-day strength determined for different combinations of cement, fine aggregate, coarse aggregate, and water. An artificial neural network (ANN) was built from these data. The inputs of the ANN were the grade of concrete, the type of cement, and the size, shape, and grading of the aggregates, and the outputs were the proportions of the various ingredients. With these inputs and outputs, the ANN was trained using a feed-forward back-propagation model. The trained ANN was then validated, and it was found to give results with a maximum error of 4 to 5%. Hence, a specific type of concrete can be prepared from given material properties, and the proportions of the materials can be evaluated quickly using the proposed ANN.
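A feed-forward back-propagation network of the kind described can be sketched as below. The architecture, learning rate, and toy training data are illustrative assumptions; this is not the paper's trained model or its concrete dataset.

```python
import math, random

def train_ann(data, hidden=4, lr=0.5, epochs=2000, seed=0):
    """One-hidden-layer feed-forward network trained with plain
    back-propagation (stochastic gradient descent, sigmoid units).
    `data` is a list of (inputs, target) pairs scaled to [0, 1];
    returns a predict(inputs) function."""
    rnd = random.Random(seed)
    n_in = len(data[0][0])
    # Weight matrices; each row carries a trailing bias weight.
    w1 = [[rnd.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(hidden)]
    w2 = [rnd.uniform(-1, 1) for _ in range(hidden + 1)]
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))

    def forward(x):
        xb = list(x) + [1.0]                       # append bias input
        h = [sig(sum(w * v for w, v in zip(row, xb))) for row in w1]
        y = sig(sum(w * v for w, v in zip(w2, h + [1.0])))
        return xb, h, y

    for _ in range(epochs):
        for x, t in data:
            xb, h, y = forward(x)
            dy = (y - t) * y * (1 - y)             # output-layer delta
            dh = [dy * w2[i] * h[i] * (1 - h[i]) for i in range(hidden)]
            for j, v in enumerate(h + [1.0]):      # update output weights
                w2[j] -= lr * dy * v
            for i in range(hidden):                # update hidden weights
                for j, v in enumerate(xb):
                    w1[i][j] -= lr * dh[i] * v
    return lambda x: forward(x)[2]
```

In the paper's setting the inputs would be the normalized mix-design factors and the outputs the ingredient proportions; one output network per ingredient is one straightforward way to arrange that.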
Abstract: In this work we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG audio layer 3 files that operates directly in the compressed data domain, while manipulating the time and subband/channel domain. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient use of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in a time comparable to the real playing time of the original audio file, depending on the MPEG compression, while the end user or audience faces no artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability to compression/recompression attacks that afflicts algorithms operating in the PCM-data domain, as it places the watermark in the scale-factor domain and not in the digitized sound data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, lies in the fact that users establish ownership of the audio file not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
Abstract: Much has been written about the difficulties students
have with producing traditional dissertations. This includes both
native English speakers (L1) and students with English as a second
language (L2). The main emphasis of these papers has been on the
structure of the dissertation, but in all cases, even when electronic
versions are discussed, the dissertation is still in what most would
regard as a traditional written form.
Master of Science degrees in computing disciplines require students to gain technical proficiency and apply their knowledge to a range of scenarios. The premise of this paper is as follows: if a dissertation is a means of showing that such a student has met the criteria for a pass, which should be based on the learning outcomes of the dissertation module, does meeting those outcomes require the student to demonstrate their skills in a solely text-based form, particularly in a highly technical research project? Could a student instead produce a series of related artifacts that form a cohesive package meeting the learning outcomes of the dissertation?
Abstract: Existing image-based virtual reality applications allow users to view an image-based 3D virtual environment in a more interactive manner. The user can "walk through", look left, right, up, and down, and even zoom into objects in these virtual worlds of images. However, what the user sees during a "zoom in" is just a close-up view of the same image, which was taken from a distance; this does not give the user an accurate view of the object from the actual distance. In this paper, a simple technique for zooming in on an object in a virtual scene is presented. The technique is based on the 'hotspot' concept of existing applications. Instead of navigating between two different locations, the hotspots are used to focus on an object in the scene. For each object, several hotspots are created, and a different picture is taken for each hotspot. Each consecutive hotspot takes the user closer to the object, providing the user with a correct view of the object based on his proximity to it. Implementation issues and the relevance of this technique in potential application areas are highlighted.
Abstract: There exists an injective, information-preserving function that maps a semantic network (i.e., a directed labeled network) to a directed network (i.e., a directed unlabeled network); the edge labels of the semantic network are represented as topological features of the directed network. Likewise, there exists an injective function that maps a directed network to an undirected network (i.e., an undirected unlabeled network); the edge directionality of the directed network is represented as a topological feature of the undirected network. Through function composition, there exists an injective function that maps a semantic network to an undirected network. Thus, aside from space constraints, the semantic network construct does not have any modeling functionality that is not possible with either a directed or undirected network representation. Two proofs of this idea are presented: the first proves the aforementioned function-composition concept; the second is a simpler proof involving an undirected binary encoding of a semantic network.
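One simple injective encoding in this spirit (not the paper's exact construction) replaces each labeled edge by an unlabeled directed path whose length identifies the label. The `('aux', n)` node marker used by the round-trip decoder is an assumption made for this sketch.

```python
def encode(triples):
    """Map a semantic network (a set of (source, label, target)
    triples) to an unlabeled directed network: each labeled edge
    becomes a directed path through fresh auxiliary nodes, and the
    path length encodes the label."""
    labels = sorted({l for _, l, _ in triples})
    chain_len = {l: i + 1 for i, l in enumerate(labels)}  # label -> path length
    edges, counter = set(), 0
    for u, l, v in triples:
        prev = u
        for _ in range(chain_len[l]):
            counter += 1
            nxt = ('aux', counter)        # marker distinguishing helper nodes
            edges.add((prev, nxt))
            prev = nxt
        edges.add((prev, v))
    return edges, labels

def decode(edges, labels):
    """Invert encode(): follow each auxiliary chain from its original
    source node and read the label back off the chain length,
    demonstrating that the mapping is information-preserving."""
    succ = {}
    for a, b in edges:
        succ.setdefault(a, []).append(b)
    triples = set()
    for u, targets in succ.items():
        if isinstance(u, tuple) and u[0] == 'aux':
            continue                      # chains are read from their source
        for node in targets:
            n = 0
            while isinstance(node, tuple) and node[0] == 'aux':
                n += 1
                node = succ[node][0]
            triples.add((u, labels[n - 1], node))
    return triples
```

The space cost the abstract mentions is visible here: an edge with the i-th label costs i auxiliary nodes and i + 1 unlabeled edges.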
Abstract: Rutting is one of the major load-related distresses in airport flexible pavements. Rutting in paving materials develops gradually with an increasing number of load applications, usually appearing as longitudinal depressions in the wheel paths, and it may be accompanied by small upheavals to the sides. Significant research has been conducted to determine the factors that affect rutting and how they can be controlled. Using experimental design concepts, a series of tests can be conducted while varying the levels of the different parameters that could cause rutting in airport flexible pavements. If a proper experimental design is used, the results of these tests can give better insight into the causes of rutting and into the presence of interactions and synergisms among the system variables that influence rutting. Although laboratory experiments are traditionally conducted in a controlled fashion to understand the statistical interaction of variables in such situations, this study attempts to identify the critical system variables influencing airport flexible-pavement rut depth from a statistical design-of-experiments (DoE) perspective, using real field data from a full-scale test facility. The test results strongly indicate that the response (rut depth) contains too much noise to allow a good model to be determined. From a statistical DoE perspective, two major changes are proposed for this experiment: (1) actual replication of the tests is definitely required, and (2) nuisance variables need to be identified and blocked properly. Further investigation is necessary to determine the possible sources of noise in the experiment.
Abstract: This research uses computational linguistics, an area of study that employs a computer to process natural language, and aims to discern the patterns that exist in declarative sentences used in technical texts. The approach is mathematical, and the focus is on instructional texts found on web pages. The technique developed by the author, named the MAYA Semantic Technique, is organized into four stages. In the first stage, the parts of speech in each sentence are identified. In the second stage, the subject of the sentence is determined. In the third stage, MAYA performs a frequency analysis on the remaining words to determine the verb and its object. In the fourth stage, MAYA carries out a statistical analysis to determine the content of the web page. The advantage of the MAYA Semantic Technique lies in its use of mathematical principles to represent grammatical operations, which aids processing and accuracy when the technique is applied to unambiguous text. The MAYA Semantic Technique is part of a proposed architecture for an entire web-based intelligent tutoring system. On a sample set of sentences, the partial semantics derived using the MAYA Semantic Technique were approximately 80% accurate. The system currently processes technical text in one domain, namely Cµ programming, in which all the keywords and programming concepts are known and understood.
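The third stage can be illustrated with a toy frequency analysis. The verb list and stop-word list below are hypothetical stand-ins for the known domain vocabulary the abstract mentions; the real MAYA lexicon and rules are not reproduced.

```python
from collections import Counter

# Hypothetical domain vocabulary; the actual MAYA lexicon is not given.
VERBS = {"declare", "assign", "call", "return", "compile"}
STOPWORDS = {"the", "a", "an", "to", "must", "will"}

def pick_verb_and_object(words, subject):
    """Toy version of MAYA stage 3: drop the already-determined subject
    and the stop words, rank the remaining words by frequency, and take
    the most frequent known verb as the verb and the most frequent
    other word as its object."""
    rest = [w.lower() for w in words
            if w.lower() != subject and w.lower() not in STOPWORDS]
    freq = Counter(rest)
    verb = max((w for w in freq if w in VERBS), key=freq.get, default=None)
    obj = max((w for w in freq if w not in VERBS), key=freq.get, default=None)
    return verb, obj
```

In the full technique these counts would feed the fourth-stage statistical analysis over the whole page rather than a single sentence.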
Abstract: Like other external sorting algorithms, the presented algorithm is a two-step algorithm comprising an internal and an external step. The first part of the algorithm is like that of other similar algorithms, but the second part includes a new, easily implemented method that markedly reduces the large number of input-output operations. Since decreasing the processor's operating time has no effect on the overall speed of the algorithm, any improvement should come from decreasing the number of input-output operations. This paper proposes a simple algorithm for choosing the correct record location in the final list, which decreases the time complexity and makes the algorithm faster.
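The two-step structure can be sketched as follows. The heap-based k-way merge in step 2 is a standard way of choosing the next record for the final list and is used here only as a stand-in for the paper's own selection method, which is not described in detail in the abstract.

```python
import heapq, os, tempfile

def external_sort(values, run_size=4):
    """Two-step external sort sketch.
    Step 1 (internal): sort fixed-size runs in memory and write each
    run to its own temporary file.
    Step 2 (external): merge the run files; a min-heap over the run
    heads always yields the correct next record for the final list."""
    # Step 1: create sorted runs on disk.
    run_paths = []
    for i in range(0, len(values), run_size):
        with tempfile.NamedTemporaryFile('w', delete=False, suffix='.run') as f:
            f.writelines(f"{v}\n" for v in sorted(values[i:i + run_size]))
            run_paths.append(f.name)
    # Step 2: k-way merge of the run files.
    files = [open(p) for p in run_paths]
    try:
        result = list(heapq.merge(*(map(int, f) for f in files)))
    finally:
        for f, p in zip(files, run_paths):
            f.close()
            os.remove(p)
    return result
```

Because each run file is read sequentially exactly once during the merge, the number of input-output operations stays proportional to the data size, which is the cost the abstract identifies as dominant.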
Abstract: The Portuguese diet has been gradually diverging from the basic principles of healthy eating, leading to an unbalanced dietary pattern which, associated with an increasingly sedentary lifestyle, has a negative impact on public health. The main objective of this work was to characterize the dietary habits of university students in Viseu, Portugal. The study consisted of a sample of 80 university students aged between 18 and 28 years. Anthropometric data (weight (kg) and height (m)) were collected, and the Body Mass Index (BMI) was calculated. Dietary habits were assessed through a three-day food record, and the Medpoint software was used to convert food into energy and nutrients. The results showed that the students present a normal body mass index. Female university students ate a higher number of daily meals than male students, and the latter skipped breakfast more frequently. The average daily intakes of energy, macronutrients, and calcium were higher in males. The food pattern was characterized by a predominant consumption of meat, cereal, fats, and sugar. The dietary intake of dairy products, fruits, vegetables, and legumes does not meet the recommendations, revealing inadequate food habits such as a hypoglycemic, hyperprotein, and hyperlipidemic diet. Our findings suggest that preventive interventions should focus on promoting healthy eating habits and physical activity in adulthood.
Abstract: Measurement of radioactivity in the environment is of great importance for monitoring and controlling the levels of radiation to which man is exposed, directly or indirectly. It is necessary to show that, regardless of working at or living close to nuclear power plants, people are in daily contact with some amount of radiation from the environment itself and from the food they ingest, contradicting the view held by most of them. The aim of this study was to analyze the rate of natural and artificial radiation from radionuclides present in cement, soil, and fertilizers used in Sergipe State, Brazil. The radionuclide activities measured in all samples are below the Brazilian limits of the exclusion and exemption criteria of the radiation-protection requirements. Be-7 was detected in organic fertilizers, which indicates a short interval between the brewing processes and their use in agriculture. Unexpected Cs-137 was also detected in some samples; however, its activities do not represent a risk to the population. Th-231 was also found in samples of soil and cement in the state of Sergipe, which is an unprecedented result.
Abstract: This paper describes a novel optimized JTAG interface circuit between a JTAG controller and a target IC. Able to access JTAG using only one or two pins, this circuit does not change the original boundary-scan test frequency of the target IC. Compared with the traditional JTAG interface based on IEEE Std. 1149.1, this reduced-pin technology is more applicable to pin-limited devices and makes it easier for the designer to control the scale of the target IC.