Abstract: Digital investigators often have a hard time spotting evidence in digital information, and it has become difficult to determine which source of evidence relates to a specific investigation. A growing concern is that the processes, technology, and procedures used in digital investigations are not keeping pace with criminal developments, and criminals are exploiting these weaknesses to commit further crimes. In digital forensic investigations, artificial intelligence (AI) is invaluable in identifying crime. The goal of digital forensics and digital investigation is to provide objective data and conduct an assessment that helps develop a plausible theory which can be presented as evidence in court. This research paper aims to develop a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning. The framework was tested using the Lone Wolf image files and datasets, and experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. Loading the agents cost 5% of the time; the File Path Agent prescribed deleting 1,510 files, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
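As a rough illustration of the kind of check the Hash Set Agent automates, the following Python sketch (hypothetical; the function names, paths, and hash values are not from the paper, whose agents are actually built on the Java Agent Development Framework and a rule engine) hashes files under an evidence directory and flags those matching a known-hash set:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large evidence exports do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def match_hash_set(evidence_dir: str, known_hashes: set[str]) -> list[Path]:
    """Return all files whose SHA-256 digest appears in the known-hash set."""
    hits = []
    for path in Path(evidence_dir).rglob("*"):
        if path.is_file() and sha256_of(path) in known_hashes:
            hits.append(path)
    return hits

# Hypothetical usage: flag files matching a notable-file hash list.
# hits = match_hash_set("/mnt/evidence_export", {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"})
```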
Abstract: In recent years, a nondestructive elemental analysis method based on muonic X-ray measurements has been developed and applied to various samples. Muonic X-rays are emitted after the formation of a muonic atom, which occurs when a negatively charged muon is captured into a muonic atomic orbit around the nucleus. Because muonic X-rays have higher energies than electronic X-rays owing to the muon mass, they can be measured without being absorbed by the material. Thus, estimating the two-dimensional (2D) elemental distribution of a sample becomes possible using an X-ray imaging detector. In this work, we report a nondestructive imaging experiment using muonic X-rays at the Japan Proton Accelerator Research Complex. The irradiated target consisted of a polypropylene material, and a double-sided silicon strip detector, developed as an imaging detector for astronomical observation, was employed. A peak corresponding to muonic X-rays from the carbon atoms in the target was clearly observed in the energy spectrum at 14 keV, and 2D visualizations were successfully reconstructed to reveal the projection image of the target. This result demonstrates the potential of the nondestructive elemental imaging method based on muonic X-ray measurement. To obtain a higher position resolution for imaging smaller targets, a new detector system will be developed to improve the statistics in further research.
Abstract: This cross-sectional study aims to explore the differences between adults with somatic symptom disorder (SSD) and adults without SSD in terms of attachment and emotion regulation strategies. A total sample of 80 participants (40 people with SSD and 40 healthy controls), aged 20-57 years (M = 31.69, SD = 10.55), was recruited from institutions and online groups. They completed the Romanian versions of the Experiences in Close Relationships Scale – Short Form (ECR-S), the Regulation of Emotion Systems Survey (RESS), the Patient Health Questionnaire-15 (PHQ-15), and the Somatic Symptom Disorder – B Criteria Scale (SSD-12). The results indicate significant differences between the two groups in terms of attachment and emotion regulation strategies. Adults with SSD have higher levels of attachment anxiety and avoidance compared to the nonclinical group. Moreover, people with SSD are more prone to use rumination and suppression and less prone to use reevaluation compared to healthy people. Implications for SSD prevention and treatment are discussed.
Abstract: A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. Therefore, it is important to ensure the required quality of each individual component through qualification testing specified by standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost for product manufacturers. A highly technical team drawn from all the key stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full-blown characterization to embed margin into product reliability, and establishes controls to ensure that product reliability is sustainable in mass production. This paper discusses a comprehensive development framework that covers the SSD end to end, from design to assembly, in-line inspection, and in-line testing, and is able to predict and validate product reliability at the early stage of new product development. During the design stage, the SSD goes through an intense reliability margin investigation focused on assembly process attributes, process equipment control, and in-process metrology, while also comprehending the forward-looking product roadmap. Once these pillars are completed, the next step is to perform process characterization and build a reliability prediction model. Next, for design validation, the reliability prediction, specifically a solder joint simulation, is established. The SSDs are stratified into non-operating and operating tests focused on solder joint reliability and connectivity/component latent failures, with prevention through design intervention and containment through the Temperature Cycle Test (TCT). Some of the SSDs are subjected to physical solder joint analyses, namely Dye and Pry (DP) and cross-section analysis. The results are fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven to work, it moves to the monitor phase, in which the Design for Assembly (DFA) rules are updated. At this stage, the design changes and the process and equipment parameters are under control. Predictable product reliability early in product development enables on-time sample qualification delivery to the customer, optimizes product development validation and the effective use of development resources, and avoids forced late investment to bandage end-of-life product failures. Understanding the critical-to-reliability parameters earlier allows focus on increasing product margin, which increases customer confidence in product reliability.
Abstract: The conservation of marine biodiversity keeps ecosystems in balance and ensures the sustainable use of resources. In this context, technological resources have been used for monitoring marine species, allowing biologists to obtain data in real time. Different mobile applications have been developed for data collection for monitoring purposes, but these systems are designed to be used only on third-generation (3G) phones or smartphones with Internet access, and in rural parts of developing countries Internet services and smartphones are scarce. Thus, the objective of this work is to develop a system to monitor marine turtles using Unstructured Supplementary Service Data (USSD), which users can access through basic mobile phones. The system aims to improve the data collection mechanism and enhance the effectiveness of current systems for monitoring sea turtles using any type of mobile device without Internet access. The system will be able to report information related to the biological activities of marine turtles. It will also serve as a platform to help marine conservation entities receive reports of illegal sales of sea turtles. The system can additionally be used as an educational tool for communities, providing knowledge and allowing the inclusion of communities in the process of monitoring marine turtles. Therefore, this work may contribute information to decision-making and to the implementation of contingency plans for marine conservation programs.
Abstract: The purification of brackish seawater is becoming a necessity rather than a choice in the face of demographic and industrial growth, especially in developing countries. Two models are used in this work: a simple solar still and a simple solar still coupled with a heat pump. In this research, the water productivity of the Simple Solar Distiller (SSD) and the Simple Solar Distiller with Hybrid Heat Pump (SSDHP) was determined as a function of orientation, use of the heat pump, and single or double glass cover. The productivity can exceed 1.2 L/m²h for the SSDHP model and 0.5 L/m²h for the SSD model. The global efficiency is 30% for the SSD model and 50% for the SSDHP model, while the internal efficiency reaches 35% for the SSD and 60% for the SSDHP model. The convective heat transfer coefficient reaches 2.5 W/m²·°C for the SSDHP model and 0.5 W/m²·°C for the SSD model.
Abstract: The purpose of this study is to reduce the radiation dose of chest CT examinations by adding Tube Current Modulation (TCM) to a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the Dose Length Product (DLP) by a conversion factor. Results were compared to those obtained with the CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a size conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with Signal-to-Noise Ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. Results showed that, with CARE Dose 4D, ED was lowered by 48.35% and 51.51% based on DLP and CT-Expo, respectively. ED ranged between 7.01 mSv and 6.6 mSv for the standard protocol, and between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose was 16.25 mGy without TCM and 48.8% lower with TCM. The SNR values calculated were significantly different (p = 0.03).
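For reference, the two dose estimates described above follow the standard relations below, where the conversion coefficients k (region-specific, for the chest) and f_size (depending on the phantom's effective diameter) are tabulated values not given in the abstract:

```latex
\mathrm{ED} \approx k \cdot \mathrm{DLP}, \qquad
\mathrm{SSDE} = f_{\mathrm{size}} \cdot \mathrm{CTDI_{vol}}
```

The quoted reductions are consistent with the reported doses, e.g. (7.01 − 3.62)/7.01 ≈ 48.4% for the DLP-based estimate and (6.6 − 3.2)/6.6 ≈ 51.5% for CT-Expo.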
Abstract: This paper presents and compares the Systematic Soft Domain Driven Design (SSDDD) framework to the Domain Driven Design (DDD) framework as a soft systems approach to information systems development. The framework uses Soft Systems Methodology (SSM) as a guiding methodology within which we have embedded a sequence of design tasks based on UML, leading to the implementation of a software system using the Naked Objects framework. The framework has been used in action research projects that involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, SSM is used to explore the problem situation and to develop the domain model in UML for the given business domain. The framework was proposed and evaluated in our previous works; in this paper, a comparison between SSDDD and DDD is presented to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.
Abstract: In the present scenario, cardiovascular problems are a growing challenge for researchers and physiologists. As heart disease has no geographic, gender, or socioeconomic boundaries, detecting cardiac irregularities at an early stage, followed by quick and correct treatment, is very important. The electrocardiogram (ECG) is the finest tool for continuous monitoring of heart activity. Heart rate variability (HRV) measures the naturally occurring oscillations between consecutive cardiac cycles. Analysis of this variability is carried out using time-domain, frequency-domain, and non-linear parameters. This paper presents HRV analysis of an online dataset for normal sinus rhythm (taken as healthy subjects) and sudden cardiac death (SCD subjects) using all three methods, computing values for parameters such as the standard deviation of normal-to-normal intervals (SDNN), the root mean square of successive differences between adjacent RR intervals (RMSSD), and the mean of R-to-R intervals (mean RR) in the time domain; very low frequency (VLF), low frequency (LF), high frequency (HF), and the ratio of low to high frequency (LF/HF ratio) in the frequency domain; and the Poincaré plot for non-linear analysis. To differentiate the HRV of healthy subjects from that of subjects who died of SCD, a k-nearest neighbor (k-NN) classifier was used because of its high accuracy. Results show highly reduced values of all stated parameters for SCD subjects compared to healthy ones. As the dataset used for SCD patients is a recording of their ECG signal one hour prior to death, it is verified, with an accuracy of 95%, that the proposed algorithm can identify the mortality risk of a patient one hour before death. Identifying a patient's mortality risk at such an early stage may prevent sudden death if timely and correct treatment is given by the doctor.
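For reference, the standard time-domain definitions assumed here (the abstract names but does not restate them, and normalization conventions vary slightly between sample and population forms), for a series of N RR intervals:

```latex
\mathrm{SDNN} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(RR_i - \overline{RR}\bigr)^{2}}, \qquad
\mathrm{RMSSD} = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N-1}\bigl(RR_{i+1} - RR_i\bigr)^{2}}
```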
Abstract: Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in data reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and, hence, its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with serving its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application's execution elapsed time by at least 22% across a variety of traces.
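A minimal sketch of the migration idea (hypothetical names; a simple recent-access-frequency heuristic stands in for the paper's data mining classifier, and block promotion stands in for the actual HDD-to-SSD data movement):

```python
from collections import Counter, deque

class HybridStore:
    """Two-level store: predicted-hot blocks live on the SSD tier, the rest on the HDD tier."""

    def __init__(self, ssd_capacity_blocks: int, window: int = 1000):
        self.ssd_capacity = ssd_capacity_blocks
        self.ssd_tier: set[int] = set()
        self.recent = deque(maxlen=window)   # sliding window of recent block accesses

    def access(self, block: int) -> str:
        """Serve an on-demand request and record which tier satisfied it."""
        self.recent.append(block)
        tier = "SSD" if block in self.ssd_tier else "HDD"
        self._migrate()                       # done in parallel with request service in the prototype
        return tier

    def _migrate(self):
        """Promote the blocks predicted to be accessed soon (here: most frequent recently)."""
        hot = [b for b, _ in Counter(self.recent).most_common(self.ssd_capacity)]
        self.ssd_tier = set(hot)

# Replaying a toy trace:
# store = HybridStore(ssd_capacity_blocks=2)
# for block in [7, 7, 3, 7, 9, 3]:
#     print(block, store.access(block))
```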
Abstract: The beginning of the 21st century has witnessed new advancements in the design and use of new materials for biosensing applications, from nano to macro, protein to tissue. Traditional analytical methods lack a complete toolset to describe the complexities introduced by living systems, pathological relations, discrete hierarchical materials, cross-phase interactions, and structure-property dependencies. Materiomics – via systematic molecular dynamics (MD) simulation – can provide structure-process-property relations by using a materials science approach linking mechanisms across scales, and enables oriented biosensor design. With this approach, DNA biosensors can be used to detect disease biomarkers present in individuals' breath, such as acetone for diabetes. Our wireless sensor array based on single-stranded DNA (ssDNA)-decorated single-walled carbon nanotubes (SWNT) has successfully detected trace amounts of various chemicals in vapor, differentiated by pattern recognition. Here, we present how MD simulation can revolutionize the design and screening of DNA aptamers for targeting biomarkers related to oral diseases and oral health monitoring. It demonstrates great potential for building a library of DNA sequences for reliable detection of several biomarkers of one specific disease, and also provides a new methodology for creating, designing, and applying biosensors.
Abstract: In Solid-State Drive (SSD) performance, whether the data have been well parallelized is an important factor. SSD parallelization is affected by the allocation scheme, which is directly connected to SSD performance. Representative allocation schemes include dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated a few mixed data patterns and analyzed the results to help guide the right choice for better performance. The results show that static allocation is more suitable when the data arrival interval is long enough for prior operations to finish and the environment is continuously read-intensive, whereas dynamic allocation performs best for write performance and random data patterns.
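The trade-off between the two schemes can be illustrated with a small sketch (a hypothetical model, not the simulator used in the paper): static allocation derives the channel from the logical page number, so read parallelism is predictable, while dynamic allocation writes to whichever channel frees up first, maximizing write parallelism at the cost of a write-time-dependent read layout.

```python
NUM_CHANNELS = 4

def static_channel(lpn: int) -> int:
    """Static allocation: the channel is a fixed function of the logical page number,
    so later reads of the same pages are spread deterministically across channels."""
    return lpn % NUM_CHANNELS

class DynamicAllocator:
    """Dynamic allocation: write to the least-busy channel, which maximizes write
    parallelism but makes the read-back channel depend on write-time conditions."""

    def __init__(self, num_channels: int = NUM_CHANNELS):
        self.busy_until = [0] * num_channels      # time at which each channel becomes idle
        self.page_map: dict[int, int] = {}        # logical page -> channel it was written to

    def write(self, lpn: int, now: int, latency: int = 1) -> int:
        ch = min(range(len(self.busy_until)), key=lambda c: self.busy_until[c])
        self.busy_until[ch] = max(self.busy_until[ch], now) + latency
        self.page_map[lpn] = ch
        return ch
```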
Abstract: Although it is impossible to ensure that a software system is completely secure, developing an acceptably secure software system on a convenient platform is not unreachable. In this paper, we analyze software development life cycle (SDLC) models from the point of view of hardware systems and circuits. To date, SDLC models pay attention to software security merely from the software perspective. We present new features for SDLC stages that emphasize the role of systems and circuits in developing secure software systems throughout the software development stages, a point that has not been considered previously in SDLC models.
Abstract: Noninvasive diagnosis of diseases via breath analysis has attracted considerable scientific and clinical interest for many years and has become more and more promising with the rapid advancements in nanotechnology and biotechnology. The volatile organic compounds (VOCs) in exhaled breath, which are mainly blood borne, provide particularly valuable information about individuals' physiological and pathophysiological conditions. Additionally, breath analysis is noninvasive, real-time, painless, and agreeable to patients. We have developed a wireless sensor array based on single-stranded DNA (ssDNA)-functionalized single-walled carbon nanotubes (SWNT) for the detection of a number of physiological indicators in breath. Seven DNA sequences were used to functionalize SWNT sensors to detect trace amounts of methanol, benzene, dimethyl sulfide, hydrogen sulfide, acetone, and ethanol, which are indicators of heavy smoking, excessive drinking, and diseases such as lung cancer, breast cancer, and diabetes. Our test results indicated that DNA-functionalized SWNT sensors exhibit great selectivity, sensitivity, and repeatability, and that different molecules can be distinguished through pattern recognition enabled by this sensor array. Furthermore, the experimental sensing results are consistent with the molecular dynamics simulated ssDNA-molecular target interaction rankings. Thus, the DNA-SWNT sensor array has great potential to be applied in chemical or biomolecular detection for the noninvasive diagnosis of diseases and personal health monitoring.
Abstract: This paper focuses on I/O optimizations of N-hybrid (New-Form of hybrid), which provides a hybrid file system space constructed on SSDs and HDDs. Although the promising potential of SSDs, such as the absence of mechanical moving parts and high random I/O throughput, has drawn a lot of attention from IT enterprises, their high cost-to-capacity ratio makes it less desirable to build a large-scale data storage subsystem composed of only SSDs. In this paper, we present N-hybrid, which attempts to integrate the strengths of SSDs and HDDs to offer a single, large hybrid file system space. Several experiments were conducted to verify the performance of N-hybrid.
Abstract: Model transformation, as a pivotal aspect of Model-Driven Engineering, attracts more and more attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. Since model transformation has become widely used, new requirements have emerged: to define the transformation process effectively and efficiently and to reduce the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons and focuses particularly on the granularity issue that exists in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measurements are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing manual effort.
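As a toy illustration of combining the two kinds of comparison (the paper's actual measurements and tooling are not specified here; the synonym table, weights, and function names below are hypothetical), candidate element matches between a source and a target model might be scored as a weighted mix of a semantic lookup and plain string similarity:

```python
from difflib import SequenceMatcher

SYNONYMS = {("client", "customer"), ("order", "purchase")}   # illustrative semantic table

def syntactic_score(a: str, b: str) -> float:
    """Character-level similarity between element names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_score(a: str, b: str) -> float:
    """1.0 if the names are identical or listed as synonyms, else 0.0."""
    a, b = a.lower(), b.lower()
    return 1.0 if a == b or (a, b) in SYNONYMS or (b, a) in SYNONYMS else 0.0

def match_score(a: str, b: str, w_sem: float = 0.6) -> float:
    """Weighted combination of the semantic and syntactic checks."""
    return w_sem * semantic_score(a, b) + (1 - w_sem) * syntactic_score(a, b)

# e.g. match_score("Customer", "Client") scores high via the semantic table,
#      match_score("OrderLine", "OrderItem") scores moderately via string similarity.
```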
Abstract: This paper focuses on the problem of point correspondence matching in stereo images. The proposed matching algorithm is based on the combination of a simpler method, the normalized sum of squared differences (NSSD), and a more complex phase correlation based approach, while also taking noise and other factors into account. The speed of NSSD and the precision of phase correlation together yield an efficient approach to find the best candidate point with sub-pixel accuracy in stereo image pairs. The task of the NSSD in this case is to locate the candidate pixel roughly; afterwards, the location of the candidate is refined by an enhanced phase correlation based method which, in contrast to the NSSD, has to run only once for each selected pixel.
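A minimal sketch of the two-stage idea (hypothetical window sizes and parameters; not the authors' implementation): NSSD cheaply finds the best integer-pixel candidate along the epipolar line, and a single phase correlation step then estimates the residual shift, which a peak-interpolation step would refine to sub-pixel precision.

```python
import numpy as np

def nssd(a: np.ndarray, b: np.ndarray) -> float:
    """Sum of squared differences between zero-mean, unit-norm patches."""
    a = a - a.mean()
    b = b - b.mean()
    a = a / (np.linalg.norm(a) + 1e-12)
    b = b / (np.linalg.norm(b) + 1e-12)
    return float(np.sum((a - b) ** 2))

def coarse_disparity(left, right, y, x, half=4, max_d=32) -> int:
    """Integer-pixel disparity along the epipolar line that minimizes NSSD."""
    ref = left[y - half:y + half + 1, x - half:x + half + 1]
    costs = []
    for d in range(max_d):
        xr = x - d
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1]
        costs.append((nssd(ref, cand), d))
    return min(costs)[1]

def phase_correlation_shift(a: np.ndarray, b: np.ndarray):
    """Translation between two same-sized patches from the cross-power spectrum peak;
    runs once per selected pixel; a parabolic fit around the peak yields sub-pixel shifts."""
    r = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(r / (np.abs(r) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy = dy - corr.shape[0] if dy > corr.shape[0] // 2 else dy
    dx = dx - corr.shape[1] if dx > corr.shape[1] // 2 else dx
    return dy, dx
```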
Abstract: This study examined the properties of fresh and hardened concretes as influenced by the moisture state of the coarse recycled concrete aggregates (RCA) after surface treatment. Surface treatment was performed by immersing the coarse RCA in a calcium metasilicate (CM) solution. The treated coarse RCA was maintained in three controlled moisture states, namely, air-dried, oven-dried, and saturated surface-dried (SSD), prior to its use in a concrete mix. The physical properties of coarse RCA were evaluated after surface treatment during the first phase of the experiment to determine the density and the water absorption characteristics of the RCA. The second phase involved the evaluation of the slump, slump loss, density, and compressive strength of the concretes that were prepared with different proportions of natural and treated coarse RCA. Controlling the moisture state of the coarse RCA after surface treatment was found to significantly influence the properties of the fresh and hardened concretes.
Abstract: In H.264/AVC video encoding, rate-distortion optimization for mode selection plays a significant role in achieving outstanding compression efficiency and video quality. However, this mode selection process also makes encoding extremely complex, especially the computation of the rate-distortion cost function, which includes computing the sum of squared differences (SSD) between the original and reconstructed image blocks and context-based entropy coding of the block. In this paper, a transform-domain rate-distortion optimization accelerator based on fast SSD (FSSD) and a VLC-based rate estimation algorithm is proposed. This algorithm significantly simplifies the hardware architecture for the rate-distortion cost computation with only negligible performance degradation. An efficient hardware structure for implementing the proposed transform-domain rate-distortion optimization accelerator is also proposed. Simulation results demonstrate that the proposed algorithm reduces total encoding time by about 47% with negligible degradation of coding performance. The proposed method can be easily applied to many mobile video applications, such as digital cameras and DMB (Digital Multimedia Broadcasting) phones.
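For reference, the standard H.264 mode-decision cost that the accelerator targets has the form below, where s and s' are the original and reconstructed blocks, R is the bit cost from entropy coding the block and mode information, and λ_mode is the Lagrange multiplier:

```latex
J_{\text{mode}} = \mathrm{SSD}(s, s') + \lambda_{\text{mode}} \cdot R, \qquad
\mathrm{SSD}(s, s') = \sum_{i,j} \bigl(s_{i,j} - s'_{i,j}\bigr)^{2}
```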
Abstract: This paper describes how student satisfaction is measured for work-based learners: these are non-traditional learners who conduct academic learning in the workplace, whose curricula typically have a high degree of negotiation, and whose motivations are directly related to their employers' needs as well as their own career ambitions. We argue that, while increasing WBL participation and the use of SSD are both accepted as being of strategic importance to the HE agenda, the use of WBL SSD is rarely examined; lessons can be learned from comparing SSD across a range of WBL programmes, and increased visibility of this type of data will provide insight into ways to improve and develop this type of delivery. The key themes that emerged from the analysis of the interview data were: learners' profiles and needs, employers' drivers, academic staff drivers, organizational approach, tools for collecting data, and visibility of findings. The paper concludes with observations on best practice in the collection, analysis, and use of WBL SSD, thus offering recommendations for both academic managers and practitioners.