Abstract: Bones are dynamic and responsive organs that regulate their strength and mass according to the loads to which they are subjected. Because the Wnt/β-catenin pathway has profound effects on the regulation of bone mass, we hypothesized that mechanical loading of bone cells stimulates Wnt/β-catenin signaling, which results in the generation of new bone mass.
Mechanical loading triggers the secretion of the Wnt molecule, which, after binding to transmembrane proteins, causes GSK-3β (glycogen synthase kinase 3 beta) to cease phosphorylating β-catenin. β-catenin then accumulates in the cytoplasm, is transported into the nucleus, and binds to transcription factors (TCF/LEF) that initiate transcription of genes related to bone formation. To test this hypothesis, we used TOPGAL (Tcf Optimal Promoter
β-galactosidase) mice in an experiment in which cyclic loads were
applied to the forearm. TOPGAL mice are reporters for cells affected by the Wnt/β-catenin signaling pathway: they are genetically engineered so that transcriptional activation of β-catenin results in the production of the enzyme β-galactosidase. The presence of this enzyme allows us to localize transcriptional activation of β-catenin to individual cells, thereby allowing us to quantify the effects that mechanical loading has on the Wnt/β-catenin pathway and new bone formation. The ulnae of loaded TOPGAL
mice were excised and transverse slices along different parts of the
ulnar shaft were assayed for the presence of β-galactosidase.
Our results indicate that loading increases β-catenin transcriptional activity in a load-magnitude-dependent manner in regions where this pathway is already primed (i.e., where basal activity is already higher). Further experiments are needed to determine the temporal and spatial activation of this signaling in relation to bone formation.
Abstract: In this paper, we investigate the strategic stochastic air traffic flow management problem, which seeks to balance airspace capacity and demand under weather disruptions. The goal is to reduce the need for myopic tactical decisions that do not account for probabilistic knowledge about the near-future states of the National Airspace System (NAS). We present and discuss a scenario-based modeling approach based on a time-space stochastic process to depict weather disruption occurrences in the NAS. A solution framework is also proposed, along with a distributed implementation aimed at overcoming scalability problems. Issues related to this implementation are also discussed.
Abstract: Crime is a major societal problem for most of the
world's nations. Consequently, the police need to develop new methods to improve their efficiency in dealing with ever-increasing crime rates. Two of the common difficulties that the police face in crime control are crime investigation and the provision of crime information to the general public to help people protect themselves. Crime control in police operations involves the use of
spatial data, crime data and the related crime data from different organizations (depending on the nature of the analysis to be made).
These types of data are collected from several heterogeneous sources
in different formats and from different platforms, resulting in a lack of standardization. Moreover, there is no standard framework for
crime data collection, integration and dissemination through mobile
devices. An investigation into the current situation in crime control was carried out to identify the needs to resolve these issues. This
paper proposes and investigates the use of service-oriented architecture (SOA) and the mobile spatial information service in crime control. SOA plays an important role in crime control as an
appropriate way to support data exchange and model sharing from
heterogeneous sources. Crime control also needs to facilitate mobile
spatial information services in order to exchange, receive, share and release information based on location to mobile users anytime and
anywhere.
Abstract: ToolTracker is a client-server application. It is essentially a catalogue of the various network monitoring and management tools that are available online. A database maintained on the server side contains information about the various tools, and several clients can access and utilize this information simultaneously. The categories of tools considered for the development of this application include packet sniffers, port mappers, port scanners, encryption tools, and vulnerability scanners. The application provides a front end through which the user can invoke any tool from a central repository for purposes such as packet sniffing, port scanning, and network analysis. Apart from the tool itself, its description and associated help files are also stored in the central repository. This facility enables the user to view the documentation pertaining to a tool without having to download and install it. The application updates the central repository with the latest versions of the tools, informs the user when a newer version of the tool currently in use is available, and gives the user the choice of installing the newer version. Thus, ToolTracker provides network administrators with much-needed abstraction and ease-of-use with respect to the tools they can use to efficiently monitor a network.
Abstract: Air pollution is a major environmental health
problem, affecting developed and developing countries around the
world. Increasing amounts of potentially harmful gases and
particulate matter are being emitted into the atmosphere on a global
scale, resulting in damage to human health and the environment.
Petroleum-related air pollutants can have a wide variety of adverse
environmental impacts. In the crude oil production sectors, there is a
strong need for a thorough knowledge of the gaseous emissions resulting from the flaring of associated gas of known composition on a daily basis through combustion activities under several operating conditions. This can help in the control of gaseous emissions from flares and thus in the protection of their immediate and distant surroundings against environmental degradation.
The impacts of methane and non-methane hydrocarbon emissions
from flaring activities at oil production facilities at Kuwait Oilfields
have been assessed through a screening study using records of flaring
operations taken at the gas and oil production sites, and by analyzing
available meteorological and air quality data measured at stations
located near anthropogenic sources. In the present study, the Industrial Source Complex (ISCST3) dispersion model is used to calculate the ground-level concentrations of methane and non-methane hydrocarbons emitted by flaring across the Kuwait Oilfields.
Simulating real hourly air quality in and around oil production facilities in the State of Kuwait for the year 2006, by inserting the respective source emission data into the ISCST3 software, indicates that the levels of non-methane hydrocarbons from the flaring activities exceed the allowable ambient air standard set by the Kuwait EPA. There is therefore a strong need to address this acute problem and minimize the impact of methane and non-methane hydrocarbons released from flaring activities over the urban areas of Kuwait.
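ISCST3 evaluates a steady-state Gaussian plume model. A minimal sketch of the ground-level concentration it computes for a single point source is below; the Briggs rural class-D dispersion coefficients and the flare parameters are illustrative assumptions, not the study's actual emission data or the exact ISCST3 curves.

```python
import math

def ground_level_conc(q, u, x, y, h):
    """Ground-level concentration (g/m^3) of a Gaussian plume from a point
    source, with total reflection at the ground.

    q : emission rate (g/s)        u : wind speed at stack height (m/s)
    x : downwind distance (m)      y : crosswind distance (m)
    h : effective release height (m)

    The power-law dispersion coefficients below are the Briggs rural
    approximations for stability class D (an illustrative choice).
    """
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    return (q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-h**2 / (2 * sigma_z**2)))

# Hypothetical flare: 100 g/s of NMHC, 4 m/s wind, 40 m effective height
c = ground_level_conc(q=100.0, u=4.0, x=2000.0, y=0.0, h=40.0)
```

The concentration falls off with crosswind offset and, beyond the near-source maximum, with downwind distance; a regulatory run sums such contributions over all sources and receptor locations for every modeled hour.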
Abstract: The data exchanged on the Web are of a different nature from those treated by classical database management systems; these data are called semi-structured data, since they do not have the regular, static structure of data found in a relational database: their schema is dynamic and may contain missing data or types. This raises the need to develop further techniques and algorithms to exploit and integrate such data and to extract information relevant to the user. In this paper we present
the system OSIX (Osiris based System for Integration of XML
Sources). This system has a Data Warehouse model designed for the
integration of semi-structured data and more precisely for the
integration of XML documents. The architecture of OSIX relies on
the Osiris system, a DL-based model designed for the representation
and management of databases and knowledge bases. Osiris is a view-based data model whose indexing system supports semantic query optimization. We show that query processing on an XML source is optimized by the indexing approach proposed by
Osiris.
Abstract: With the rapid development in the field of life
sciences and the flooding of genomic information, the need for faster
and scalable searching methods has become urgent. One approach that has been investigated is indexing. Indexing methods fall into three categories: length-based index algorithms, transformation-based algorithms, and mixed-technique algorithms. In this research, we focused on the transformation-based methods. We embedded the N-gram method into the transformation-based method to build an inverted index table. We then applied parallel methods to speed up the index
building time and to reduce the overall retrieval time when querying
the genomic database. Our experiments show that the N-gram transformation algorithm is an economical solution: it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results for the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.
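The N-gram inverted index described above can be sketched as follows. This is a generic illustration (the sequence IDs, the tiny sequences, and the candidate-filtering query are hypothetical examples, not the paper's exact data structures):

```python
from collections import defaultdict

def build_ngram_index(sequences, n=5):
    """Build an inverted index mapping each length-n substring (N-gram) of a
    genomic sequence to the (sequence id, offset) pairs where it occurs."""
    index = defaultdict(list)
    for seq_id, seq in sequences.items():
        for i in range(len(seq) - n + 1):
            index[seq[i:i + n]].append((seq_id, i))
    return index

def query(index, pattern, n=5):
    """Find candidate sequences containing `pattern` by intersecting the
    posting lists of its N-grams (a coarse filter; an exact matcher would
    verify the candidates afterwards)."""
    grams = [pattern[i:i + n] for i in range(len(pattern) - n + 1)]
    candidates = None
    for g in grams:
        ids = {seq_id for seq_id, _ in index.get(g, [])}
        candidates = ids if candidates is None else candidates & ids
    return candidates or set()

seqs = {"s1": "ACGTACGTGA", "s2": "TTTTACGTAC"}
idx = build_ngram_index(seqs, n=5)
```

A parallel build could shard `sequences` across workers (e.g. with `multiprocessing.Pool`) and merge the per-shard tables, which is the spirit of the parallel speed-up reported above.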
Abstract: The present paper considers the steady free convection boundary layer flow of a viscoelastic fluid over a solid sphere with
Newtonian heating. The boundary layer equations are an order higher
than those for the Newtonian (viscous) fluid and the adherence
boundary conditions are insufficient to determine the solution of
these equations completely. Thus, an extra boundary condition is needed to perform the numerical computation. The governing boundary layer equations are first transformed into non-dimensional form using a special dimensionless group and then solved using an implicit finite difference scheme. The results are displayed graphically to illustrate the influence of the viscoelastic parameter K and the Prandtl number Pr on the skin friction, heat transfer, velocity profiles, and temperature profiles. The present results are compared with published work and are found to agree very well.
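For reference, the Newtonian heating condition mentioned above is commonly written (in generic boundary-layer variables; the paper's exact notation may differ) as a wall heat flux proportional to the local surface temperature:

```latex
\left.\frac{\partial T}{\partial y}\right|_{y=0} = -h_s\, T(x,0),
\qquad T \to T_\infty \quad \text{as } y \to \infty ,
```

where $h_s$ is a heat transfer parameter. Unlike a prescribed wall temperature or heat flux, the surface temperature $T(x,0)$ is then an outcome of the solution rather than an input.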
Abstract: Minimally invasive surgery (MIS) is now widely used as a preferred choice for various types of operations. The need to detect various tactile properties justifies the key role of tactile sensing, which is currently missing in MIS. Laparoscopy is one MIS method that can be used in kidney stone removal surgeries. At present, determining the exact location of the stone during laparoscopy is one of the limitations of this method, for which no scientific solution has yet been found. Artificial tactile sensing is a new method for obtaining the characteristics of a hard object embedded in soft tissue, and artificial palpation is an important application of it that can be used in different types of surgeries. In this study, a new method for determining the exact location of the stone during laparoscopy is presented. The effects of the stone's presence on the surface of the kidney were investigated using a conceptual 3D model of a kidney containing a simulated stone. Having imitated palpation and modeled it conceptually, we determined the indications of the stone's presence that appear on the surface of the kidney. A number of different cases were created and solved in software, and the stress distribution contours and stress graphs illustrate that the stress patterns created on the surface of the kidney reveal not only the presence of the stone inside but also its exact location. Three-dimensional analysis thus leads to a novel method of predicting the exact location of the stone and can be directly applied to the incorporation of tactile sensing in artificial palpation, helping surgeons in non-invasive procedures.
Abstract: In this study, we propose a network architecture for
providing secure access to information resources of enterprise
network from remote locations in a wireless fashion. Our proposed
architecture offers a very promising solution for organizations which
are in need of a secure, flexible and cost-effective remote access
methodology. Security of the proposed architecture is based on Virtual Private Network (VPN) technology and a special role-based access control mechanism with location and time constraints. The flexibility
mainly comes from the use of Internet as the communication medium
and cost-effectiveness is due to the possibility of in-house
implementation of the proposed architecture.
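The role-based access control with location and time constraints described above can be sketched as follows. The roles, resources, and locations here are hypothetical examples, not the paper's actual policy, and the VPN transport layer is outside the scope of the sketch:

```python
from datetime import time

class ConstrainedRBAC:
    """Role-based access control in which each role's permissions are valid
    only within a time window and from an allowed set of locations."""

    def __init__(self):
        # role -> (permissions, allowed locations, (start, end) time window)
        self.roles = {}
        self.assignments = {}  # user -> set of roles

    def add_role(self, role, permissions, locations, start, end):
        self.roles[role] = (set(permissions), set(locations), (start, end))

    def assign(self, user, role):
        self.assignments.setdefault(user, set()).add(role)

    def check(self, user, permission, location, now):
        """Grant access only if some assigned role carries the permission AND
        the request's location and time satisfy that role's constraints."""
        for role in self.assignments.get(user, ()):
            perms, locs, (start, end) = self.roles[role]
            if permission in perms and location in locs and start <= now <= end:
                return True
        return False

rbac = ConstrainedRBAC()
rbac.add_role("engineer", {"read:intranet"}, {"branch", "home"},
              time(8, 0), time(18, 0))
rbac.assign("alice", "engineer")
```

A request arriving over the VPN would carry the user's authenticated identity, claimed location, and timestamp, and be admitted only when `check` succeeds.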
Abstract: With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is applied either to an individual element or to a set of consecutive elements in a Web document, and results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced components platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service-Oriented Architecture (SOA). It then describes the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of the individual documents themselves, assuring the preservation of the semantics during the process of adapting content rendering. Semantic Web content adaptation is a way of adding value to Web contents and facilitates the repurposing of Web contents (enhanced browsing, Web Services location and access, etc.).
Abstract: This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the work was split into two parts. The first part discussed the modeling of the problems and showed how real-coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The proposed novel chromosome representation produces only feasible solutions, minimizing the computational time the GA would otherwise need to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.
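The abstract does not spell out the authors' encoding, but one common way a real-coded chromosome can yield only feasible solutions is random-key decoding: gene values rank the part types by priority, and parts are accepted greedily while capacity remains. The sketch below is that generic idea with hypothetical processing times, not the paper's actual representation:

```python
import random

def decode(chromosome, processing_times, capacity):
    """Decode a real-valued chromosome into a feasible part-type selection.

    Random-key sketch (an assumption, not necessarily the authors' encoding):
    gene values order the part types by priority, and parts are accepted
    greedily while machine capacity remains, so every chromosome maps to a
    feasible solution and no repair step is needed."""
    order = sorted(range(len(chromosome)), key=lambda i: -chromosome[i])
    selected, used = [], 0.0
    for i in order:
        if used + processing_times[i] <= capacity:
            selected.append(i)
            used += processing_times[i]
    return selected, used

random.seed(1)
times = [4.0, 7.0, 3.0, 5.0]          # hypothetical processing times
chrom = [random.random() for _ in times]
sel, used = decode(chrom, times, capacity=10.0)
```

Because every chromosome decodes to a feasible loading, crossover and mutation can act directly on the real vectors without any feasibility checks.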
Abstract: In this paper, the fluid flow patterns of steady incompressible flow inside a shear-driven cavity are studied. The numerical simulations are conducted using the lattice Boltzmann method (LBM) for different Reynolds numbers. In order to simulate the flow, the derivation of the macroscopic hydrodynamic equations from the continuous Boltzmann equation needs to be performed. The numerical results for shear-driven flow inside square and triangular cavities are then compared with results found in the literature. The present study found that the flow patterns are affected by the geometry of the cavity and the Reynolds number used.
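The LBM collision-streaming cycle referred to above can be sketched with a minimal D2Q9 BGK solver for the square shear-driven cavity. Grid size, relaxation time, and lid speed below are illustrative choices (not the paper's settings), and the simple bounce-back corners are a crude approximation:

```python
import numpy as np

def lbm_cavity(n=32, tau=0.8, u_lid=0.1, steps=200):
    """Minimal D2Q9 lattice Boltzmann (BGK) solver for a shear-driven square
    cavity: the top lid moves with speed u_lid, other walls are no-slip via
    half-way bounce-back."""
    e = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])   # lattice velocities
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)             # lattice weights
    opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]                    # opposite directions

    f = w[:, None, None] * np.ones((9, n, n))            # fluid at rest, rho = 1
    for _ in range(steps):
        rho = f.sum(axis=0)
        ux = (f * e[:, 0, None, None]).sum(axis=0) / rho
        uy = (f * e[:, 1, None, None]).sum(axis=0) / rho
        # BGK collision: relax toward the local equilibrium distribution
        for i in range(9):
            eu = e[i, 0]*ux + e[i, 1]*uy
            feq = w[i]*rho*(1 + 3*eu + 4.5*eu**2 - 1.5*(ux**2 + uy**2))
            f[i] -= (f[i] - feq) / tau
        fpre = f.copy()                                  # post-collision values
        # streaming (axis 0 = y with row 0 at the bottom, axis 1 = x)
        for i in range(9):
            f[i] = np.roll(np.roll(f[i], e[i, 1], axis=0), e[i, 0], axis=1)
        # bounce-back: repopulate directions entering through each wall
        for i in (2, 5, 6):                              # bottom wall (y = 0)
            f[i][0, :] = fpre[opp[i]][0, :]
        for i in (1, 5, 8):                              # left wall (x = 0)
            f[i][:, 0] = fpre[opp[i]][:, 0]
        for i in (3, 6, 7):                              # right wall
            f[i][:, -1] = fpre[opp[i]][:, -1]
        for i in (4, 7, 8):                              # moving lid (top)
            j = opp[i]
            f[i][-1, :] = fpre[j][-1, :] - 6*w[j]*rho[-1, :]*e[j, 0]*u_lid
    rho = f.sum(axis=0)
    ux = (f * e[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * e[:, 1, None, None]).sum(axis=0) / rho
    return ux, uy, rho

ux, uy, rho = lbm_cavity()
```

With kinematic viscosity nu = (tau - 0.5)/3 in lattice units, the Reynolds number of this run is u_lid * n / nu; varying tau or u_lid sweeps the Reynolds numbers studied in the abstract (the triangular cavity would additionally need a stair-cased or interpolated boundary).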
Abstract: At present, the auto parts industry faces growing challenges in a strategic market. As a consequence, manufacturers need to respond better to customers in terms of quality, cost, and delivery time. Moreover, they need good factory management to comply with international standards at maximum capacity and lower cost. This leads companies to order standard parts from abroad, which becomes the major inventory cost. This auto parts research on recycled materials compares auto parts made from recycled materials with international auto parts (CKD). The factors studied in this research were the recycled material ratios of PU foam, felt, and fabric. The recycled materials were evaluated against the CKD parts in terms of qualities and properties such as weight, sound absorption, water absorption, tensile strength, elongation, and heat resistance. The results showed that recycled materials could be used to replace the CKD parts.
Abstract: The study was conducted to investigate the profile of hepatitis in the Kingdom of Saudi Arabia and to determine which age groups hepatitis viruses most commonly infect. The epidemiology of
viral hepatitis in Saudi Arabia has undergone major changes,
concurrent with major socioeconomic developments over the last two
to three decades. This disease represents a major public health
problem in Saudi Arabia resulting in the need for considerable
healthcare resources. A retrospective cross sectional analysis of the
reported cases of viral hepatitis was conducted based on the reports
of The Ministry of Health in Saudi Arabia about Hepatitis A, B and C
infections in all regions from the period of January 2006 to December
2010. The study demonstrated that the incidence of viral hepatitis is decreasing, except for hepatitis B, which showed a minimal increase. Of hepatitis A, B, and C, hepatitis B virus (HBV) was the most predominant type, accounting for 53% of the cases, followed by hepatitis C virus (HCV) with 30% and HAV with 17%. HAV infection
predominates in children (5–14 years) with 60% of viral hepatitis
cases, HBV in young adults (15–44 years) with 69% of viral hepatitis
cases, and HCV in older adults (>45 years) with 59% of viral
hepatitis cases. Despite significant changes in the prevalence of viral
hepatitis A, B and C, it remains a major public health problem in
Saudi Arabia; however, it showed a significant decline in the last two
decades that could be attributed to the vaccination programs and the
improved health facilities. Further research is needed to identify the risk factors that make a specific age group or a specific region in Saudi Arabia susceptible to a specific type of hepatitis virus.
Abstract: Target tracking and localization are important applications
in wireless sensor networks. In these applications, sensor nodes
collectively monitor and track the movement of a target. They have
limited energy supplied by batteries, so energy efficiency is essential
for sensor networks. Most existing target tracking protocols need to wake up sensors periodically to perform tracking, which introduces unnecessary energy waste. In this paper, an energy-efficient protocol for target localization is proposed. In order to conserve energy, the protocol fixes the number of sensors used for target tracking, while keeping the quality of target localization at an acceptable level. By selecting a set of sensors for target localization, the other sensors can sleep rather than periodically wake up to track the target.
Simulation results show that the proposed protocol saves a significant
amount of energy and also prolongs the network lifetime.
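The core idea of keeping only a fixed-size set of sensors awake can be sketched as follows. The selection rule (nearest to the current target estimate) and the inverse-distance localization are plausible placeholders, not necessarily the paper's exact protocol:

```python
import math

def select_tracking_set(sensors, target_est, k=3):
    """Pick the k sensors nearest the estimated target position to stay
    awake for localization; the rest can sleep instead of waking up
    periodically.  k is fixed, matching the abstract's fixed-size set;
    the nearest-first rule itself is an assumption."""
    ranked = sorted(sensors, key=lambda s: math.dist(s, target_est))
    return ranked[:k], ranked[k:]          # (awake, sleeping)

def weighted_centroid(awake, target):
    """Crude localization from the awake set: centroid weighted by inverse
    distance (standing in for range estimates such as RSSI)."""
    wsum = xs = ys = 0.0
    for s in awake:
        wgt = 1.0 / (math.dist(s, target) + 1e-9)
        xs += wgt * s[0]
        ys += wgt * s[1]
        wsum += wgt
    return (xs / wsum, ys / wsum)

sensors = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 6), (40, 40)]
awake, asleep = select_tracking_set(sensors, target_est=(5, 5), k=3)
estimate = weighted_centroid(awake, (5, 5))
```

Since only `k` nodes stay awake per tracking round regardless of network size, the per-round energy cost is bounded, which is where the lifetime extension reported in the simulations comes from.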
Abstract: The performance of, and the plasma created by, a pulsed magnetoplasmadynamic thruster for small satellite applications are studied to better understand the ablation and plasma propagation processes occurring during the short-time discharge. The results can be applied to improve the efficiency of the thruster and to tune the propulsion system to the requirements of the satellite
mission. Therefore, plasma measurements with a high-speed camera
and induction probes, and performance measurements of mass bit
and impulse bit were conducted. Values for current sheet propagation
speed, mean exhaust velocity and thrust efficiency were derived from
these experimental data. A maximum in current sheet propagation speed was found in the high-speed camera measurements for a medium energy input and confirmed by the induction probes. A quasi-linear tendency between the mass bit and the energy input (respectively, the current action integral) was found, as well as a linear tendency
between the created impulse and the discharge energy. The highest
mean exhaust velocity and thrust efficiency was found for the highest
energy input.
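The derived quantities mentioned above follow the standard pulsed-thruster definitions: mean exhaust velocity is impulse bit over mass bit, and thrust efficiency is the exhaust kinetic energy over the discharge energy. The numbers below are hypothetical, not the paper's measurements:

```python
def thruster_performance(impulse_bit, mass_bit, energy):
    """Standard pulsed-thruster figures of merit from measured impulse bit
    I_bit (N*s), mass bit m_bit (kg), and discharge energy E (J):
        mean exhaust velocity  c_e = I_bit / m_bit
        thrust efficiency      eta = I_bit**2 / (2 * m_bit * E)
    (eta is the exhaust kinetic energy divided by the discharge energy)."""
    c_e = impulse_bit / mass_bit
    eta = impulse_bit**2 / (2.0 * mass_bit * energy)
    return c_e, eta

# Hypothetical pulse: 1 mN*s impulse bit, 0.1 mg mass bit, 10 J discharge
c_e, eta = thruster_performance(1.0e-3, 1.0e-7, 10.0)
```

With these definitions, the linear impulse-versus-energy and quasi-linear mass-versus-energy trends reported above translate directly into how c_e and eta vary with energy input.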
Abstract: In the past years, the world has witnessed significant work in the field of manufacturing. Special efforts have been made in the implementation of new technologies and management and control systems, among many others, which have all evolved the field. Closely following all this, due to the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., to move toward more intelligent manufacturing, the present paper emerges with the main aim of contributing to the analysis and a few customization issues of a new iCIM 3000 system at the IPSAM. In this process, special emphasis is placed on the material flow problem. For this, besides offering a description and analysis of the system and its main parts, some tips on how to define other possible alternative material flow scenarios and a partial analysis of the combinatorial nature of the problem are offered as well. All this is done with the intention of relating it to the use of simulation tools, which have been briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a few figures and expressions that help in obtaining the necessary data. Such data, among others, will be used in the future when simulating the scenarios in search of the best material flow configurations.
Abstract: Lately, significant work in the area of Intelligent
Manufacturing has become public and mainly applied within the
frame of industrial purposes. Special efforts have been made in the
implementation of new technologies, management and control
systems, among many others, which have all evolved the field. Aware of all this, and due to the scope of new projects and the need to turn the existing flexible ideas into more autonomous and intelligent ones, i.e., Intelligent Manufacturing, the present paper emerges with the main aim of contributing to the design and analysis of the material flow in systems, cells, or workstations under this new “intelligent” denomination. For this, besides offering a conceptual basis on some of the key points to be taken into account and some general principles to consider in the design and analysis of the material flow, some tips on how to define other possible alternative material flow scenarios and a classification of the states of a system, cell, or workstation are offered as well. All this is done with
the intention of relating it to the use of simulation tools, which have been briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a detailed layout, other figures, and a few expressions that could help in obtaining the necessary data. Such data, among others, will be used in the future when simulating the scenarios in search of the best material flow configurations.
Abstract: The approach of subset selection in polynomial regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset sufficient to describe the target relation well. However, in most cases the necessary set of basis functions is not known and needs to be guessed, a potentially non-trivial (and long) trial-and-error process.
In our research we consider a potentially more efficient approach –
Adaptive Basis Function Construction (ABFC). It lets the model
building method itself construct the basis functions necessary for
creating a model of arbitrary complexity with adequate predictive
performance. However, two issues to some extent plague both subset selection and the ABFC methods, especially when working with relatively small data samples: selection bias and selection instability. We try to correct these issues with model post-evaluation using cross-validation and model ensembling. To evaluate the proposed method, we empirically
compare it to ABFC methods without ensembling, to a widely used
method of subset selection, as well as to some other well-known
regression modeling methods, using publicly available data sets.
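The combination of cross-validated post-evaluation and model ensembling can be sketched generically as follows. Here the candidate models are simple polynomial degrees and the inverse-MSE weighting is one plausible scheme, not necessarily the authors' exact method; the data are synthetic:

```python
import random

def polyfit(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations, solved with
    naive Gaussian elimination (fine for the tiny degrees used here)."""
    m = degree + 1
    a = [[sum(x**(i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x**i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):                      # forward elimination with pivoting
        piv = max(range(col, m), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            k = a[r][col] / a[col][col]
            a[r] = [x - k * y for x, y in zip(a[r], a[col])]
            b[r] -= k * b[col]
    coef = [0.0] * m
    for i in reversed(range(m)):              # back-substitution
        coef[i] = (b[i] - sum(a[i][j] * coef[j] for j in range(i + 1, m))) / a[i][i]
    return coef

def predict(coef, x):
    return sum(c * x**i for i, c in enumerate(coef))

def cv_mse(xs, ys, degree, k=5):
    """k-fold cross-validated MSE for one candidate model complexity."""
    folds = [list(range(i, len(xs), k)) for i in range(k)]
    err, count = 0.0, 0
    for fold in folds:
        tr = [i for i in range(len(xs)) if i not in fold]
        coef = polyfit([xs[i] for i in tr], [ys[i] for i in tr], degree)
        err += sum((predict(coef, xs[i]) - ys[i])**2 for i in fold)
        count += len(fold)
    return err / count

def cv_ensemble(xs, ys, degrees=(0, 1, 2, 3)):
    """Post-evaluate each candidate complexity by CV, then ensemble the
    refitted models with weights proportional to 1/MSE."""
    scores = {d: cv_mse(xs, ys, d) for d in degrees}
    weights = {d: 1.0 / (scores[d] + 1e-12) for d in degrees}
    total = sum(weights.values())
    models = {d: polyfit(xs, ys, d) for d in degrees}
    return lambda x: sum(weights[d] / total * predict(models[d], x)
                         for d in degrees)

random.seed(0)
xs = [i / 10 for i in range(30)]
ys = [2 + 3 * x + random.gauss(0, 0.1) for x in xs]   # truly linear target
model = cv_ensemble(xs, ys)
```

Averaging over several well-scoring complexities instead of committing to the single CV winner is what damps the selection instability mentioned above: a small perturbation of the sample may change which model wins, but it changes the weighted average far less.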