5th International Conference on Advanced Computing (ADCO 2018)

June 23~24, 2018, Copenhagen, Denmark

Accepted Papers

  • HOLMES: Holistic Mice-Elephants Stochastic Scheduling in Data Center Networks
    Tingqiu Yuan, Tao Huang, Cong Xu, Jian Li
    Huawei Technologies, United States of America
    Scheduling between latency-sensitive small data flows (a.k.a. mice) and throughput-oriented large ones (a.k.a. elephants) has become ever more challenging with the proliferation of cloud-based applications. In light of this mounting problem, this work proposes a novel flow control scheme, HOLMES (HOListic Mice-Elephants Stochastic), which offers a holistic view of global congestion awareness as well as a stochastic scheduler of mixed mice-elephants data flows in Data Center Networks (DCNs). Firstly, we theoretically prove the necessity of partitioning DCN paths into sub-networks using a stochastic model. Secondly, the HOLMES architecture is proposed, which adaptively partitions the available DCN paths into low-latency and high-throughput sub-networks via a global congestion-aware scheduling mechanism. Based on the stochastic power-of-two-choices policy, the HOLMES scheduling mechanism acquires only a subset of the global congestion information, while achieving close-to-optimal load balance on each end-to-end DCN path. We also formally prove the stability of the HOLMES flow scheduling algorithm. Thirdly, extensive simulation validates the effectiveness and dependability of HOLMES with select DCN topologies. The proposal has been under test in an industry production environment.


    Hakkı Soy
    Department of Electrical and Electronics Engineering, Necmettin Erbakan University, Konya, Turkey
    This study investigates the performance of several scheduling algorithms (Round Robin, maximum SNR and opportunistic Round Robin) in the Medium Access Control (MAC) layer of a wireless network with a cellular architecture. Spectral efficiency and fairness are two important metrics for evaluating the performance of wireless systems. The Nakagami-m distribution is a generalized way to model small-scale fading, and it can model various wireless fading channel conditions by changing the value of the m parameter. The channel is assumed to be slow-fading, and an opportunistic beamforming scheme is used to enlarge the rate and dynamic range of channel fluctuations through the employment of multiple antennas at the base station. MATLAB-based Monte Carlo simulations are implemented to show the effects of the antenna number and the m parameter of the Nakagami distribution. The obtained results show that the pure opportunistic max-SNR algorithm has a spectral efficiency advantage, even though it has poor fairness characteristics for a high number of users.
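    The study's Monte Carlo comparison is MATLAB-based; the following is a much-simplified Python sketch (hypothetical function names, unit average power, SNR taken as the raw channel power) of how a max-SNR scheduler gains spectral efficiency over Round Robin under Nakagami-m fading:

```python
import math
import random

def nakagami_power(m, omega, rng):
    # The squared amplitude of a Nakagami-m fading channel with average
    # power omega is Gamma-distributed with shape m and scale omega / m.
    return rng.gammavariate(m, omega / m)

def simulate(n_users, m, slots=2000, seed=1):
    rng = random.Random(seed)
    se_rr = se_max = 0.0
    for t in range(slots):
        snrs = [nakagami_power(m, 1.0, rng) for _ in range(n_users)]
        se_rr += math.log2(1.0 + snrs[t % n_users])  # Round Robin ignores channel state
        se_max += math.log2(1.0 + max(snrs))         # max-SNR schedules the strongest user
    return se_rr / slots, se_max / slots
```

    The multiuser-diversity gain of max-SNR shows up as higher average spectral efficiency, at the cost of the fairness problem the abstract notes.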

    Spectral Efficiency, Fairness, Scheduling, Opportunistic Beamforming, Wireless Cellular Network

    Lev Lafayette1, Mitch Turnbull2, Mark Wilcox2 and Eric A. Treml3
    1Department of Infrastructure, University of Melbourne, Melbourne, Australia
    2Nyriad, Cambridge, New Zealand
    3School of BioSciences, University of Melbourne, Melbourne, Australia
    Identifying probable dispersal routes for marine populations is a data- and processing-intensive task for which traditional high performance computing systems are suitable, even for single-threaded applications. Whilst processing dependencies between the datasets exist, a large degree of independence between sets allows the use of job arrays to significantly improve processing time. Identification of bottlenecks within the code base suitable for GPU optimisation, however, has led to additional performance improvements which can be coupled with the existing benefits from job arrays. This case study offers an example of how to optimise single-threaded applications for GPU architectures to obtain significant performance improvements. Further development is suggested with the expansion of the GPU capability of the University of Melbourne's "Spartan" HPC system.

    High Performance Computing, General-Purpose Computing on Graphics Processing Units, Marine Biodiversity

    Saleh Alrashed1 and Eltayeb Abuelyaman2
    1Imam Abdulrahman Bin Faisal University, Saudi Arabia
    2Imam Abdulrahman Bin Faisal University, Saudi Arabia
    A scheme for reducing the overhead of revoking members in secure multicasting environments is proposed. The scheme requires transmission of only the identifications of the participants to be revoked. For environments where revocation is infrequent and the number to be revoked is small, serial revocation is recommended. However, if quite a few participants are to be revoked, then parallel revocation will be more practical when storing many encryption keys is not an issue. The revocation of any participant is blindly carried out by all participants collaboratively, including the target. For added security, periodic refreshing of the ciphering keys is recommended to thwart hacking.
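    The paper's revocation protocol is not given here. As a toy sketch of the "transmit only the identifications to be revoked" idea, remaining members could derive a refreshed group key locally from the broadcast IDs; this is purely illustrative, and a real scheme must also mix in per-member secrets so that revoked participants cannot derive the new key:

```python
import hashlib

def refresh_key(group_key: bytes, revoked_ids) -> bytes:
    """Derive the next group key from the current one plus the broadcast
    list of revoked member IDs; only the IDs travel over the network."""
    h = hashlib.sha256(group_key)
    for rid in sorted(revoked_ids):  # sorted: order-independent derivation
        h.update(rid.encode())
    return h.digest()
```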

    revocation, multicasting, security, overhead

    Mehmet Emin Güllüoğlu1, Mehmet Reşit Tolun2
    1Department of Computer Engineering, Baskent University, Ankara, Turkey and TAI, Turkish Aerospace Industries, Ankara, Turkey
    2Department of Computer Engineering, Baskent University, Ankara, Turkey
    Today, embedded real-time applications play an important role in modern life. Satellites are also robust embedded real-time applications. A satellite project can cost over three hundred million dollars. As many satellite manufacturers validate their satellites before launching, satellite simulators play the most valuable role in validation infrastructures. Specifically, satellite flight software validation has become more important. In this paper, we focus on the round robin (RR), rate monotonic (RM), and event driven (ED) real-time task scheduling methods with respect to their CPU usage performance for satellite simulator infrastructures. The tasks are evaluated and tested with the Real-Time Executive for Multiprocessor Systems (RTEMS). These scheduling methods are used in polling mode in the simulation setup. In this study, we compared the three task scheduling methods for attitude and orbit control system tasks and MIL-STD-1553 bus data distribution controller tasks in a spacecraft simulator environment. The results were close, with no clear separation between the values; we therefore chose RR and ED, because RR was easy to implement and ED allowed full control of the tasks.
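    The RTEMS schedulers themselves are not shown in the abstract. As a minimal illustration of the round robin (RR) time-slicing the authors found easiest to implement, here is a Python sketch with hypothetical task names and work units:

```python
from collections import deque

def round_robin(tasks, quantum):
    """tasks: dict of task name -> remaining work units.
    Each task runs for at most `quantum` units per turn; returns the
    order in which tasks complete."""
    queue = deque(tasks.items())
    finished = []
    while queue:
        name, remaining = queue.popleft()
        remaining -= quantum
        if remaining > 0:
            queue.append((name, remaining))  # not done: back of the queue
        else:
            finished.append(name)
    return finished
```

    Short tasks finish early even when a long task arrived first, which is the property that makes RR attractive for polling-mode simulator tasks.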

    Real-time embedded systems, Real-time operating system, Rate monotonic task, Round robin task, Event driven task handling, Satellite simulations

    Optimization Framework for Flavour and Nutrition Balanced Recipe: A Data Driven Approach
    Isura Nirmal1, Amith Caldera2, Roshan Dela Bandara3
    1University of Colombo School of Computing, Sri Lanka
    2ChildFund Sri Lanka, Sri Lanka
    Food has played a major part in human civilization, not only as a physiological need but also as a major factor in defining culture and society. The choice of food depends mainly on both flavour and nutrition, but the bias towards the flavour factor has led people to adversely affect their healthy lifestyle.

    Recipe recommendation literature typically considers either the flavour or the nutrient factor. Various flavour traits also prevent the promotion of healthy foods that maintain palatability. Our data-driven flavour and nutrient optimization framework consists of a classification model, which achieved 79.546% prediction accuracy when detecting the cuisine, a generalized flavour recommendation approach, and a personalized nutrient recommendation approach for deciding the optimization task.

    recipe recommendation, data mining, personalization, recommender systems, similarity measurement

    Issar Arab1, Violetta Cavalli Sforza2, Safae Bourhnane2
    1TUM Department of Informatics, Technical University of Munich (TUM), Garching/Munich, Germany
    2School of Science and Engineering, Al Akhawayn University in Ifrane, P.O. Box 104, Ifrane 53000, Morocco
    Natural Language Processing (NLP) is becoming more and more popular. It represents the ability of a computer to understand and process the natural language spoken by humans. NLP has been used in many fields to solve different issues, one of which is document indexing. We based our research project on a real problem that has persisted over the years at Al Akhawayn University in Ifrane (AUI): the difficulty of retrieving a desired document from the pool of theses, capstones, and master projects submitted as a graduation requirement by AUI's alumni. Thus, we decided to use NLP to index the documents and hence facilitate access to the pool of reports. We also made use of the well-known lexical English database WordNet. In this paper, we describe the steps of using NLP along with WordNet and other technologies to index the reports. Through this, we aim to create a simple, efficient, and easy-to-use web application powered by NLP techniques and technologies.
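    The keywords list k-skip-grams among the techniques used. As a small sketch (hypothetical helper name), k-skip-n-grams can be generated with the common definition in which at most k tokens in total may be skipped inside a gram:

```python
from itertools import combinations

def k_skip_n_grams(tokens, n, k):
    """All n-grams in which at most k tokens in total are skipped
    between the first and last token of the gram."""
    grams = []
    for idxs in combinations(range(len(tokens)), n):
        skipped = idxs[-1] - idxs[0] - (n - 1)
        if skipped <= k:
            grams.append(tuple(tokens[i] for i in idxs))
    return grams
```

    With k = 0 this reduces to ordinary contiguous n-grams; larger k captures looser word co-occurrences for indexing.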

    Indexing, Natural Language Processing, Parse Trees, Part-of-Speech Tagging, WordNet, n-grams, k-skip-grams

    Youssef Taher
    Center of guidance and Education Planning, Rabat, Morocco
    Today, the videos published on the YouTube platform can be a very powerful learning tool, as they add a dynamic element to eLearning tools, improve knowledge transfer, demonstrate complex procedures, and help explain difficult topics. For a given scientific field, the scientific knowledge published on the YouTube platform evolves in various dimensions (subject dimension, time dimension, space dimension…). On the other hand, the dynamics of this scientific knowledge leave unfilled gaps in the knowledge sought by an end user (non-existent information, incomplete information, unanswered queries…). In this context, the present investigation proposes a new model of YouTube recommendation system. Its objective is to transform this platform into an intelligent guide that can bring complete answers to scientific knowledge queries and promote the development of a new form of collective intelligence.

    YouTube platform, Recommendation system, Scientific knowledge gaps, Ontology, Ontology web language.

    Hsiu-Sen Chiang1, Yi-Wen Lin1, Mu-Yen Chen1, and An-Pin Chen2
    1National Taichung University of Science and Technology, Taiwan, ROC
    2National Chiao Tung University, Taiwan, ROC
    People in modern society have become vulnerable to stress. Some come down with depression or choose to commit suicide when they can no longer deal with frustrations or difficulties. How stress is handled and relieved is critical, because inappropriate solutions will result in troubles and problems for others. A majority of people have grown fond of live broadcasts in recent years, yet whether live broadcasts relieve stress has not been addressed in previous studies. Because of personal characteristics, each person feels differently about live broadcast viewing. Therefore, using a stress survey as a form of stress test, this study examines the efficacy of personal stress relief from viewing different types of live broadcasts. Twenty participants were randomly divided into four groups of five. The experiment lasted thirty minutes. The participants in the control group did not do anything, while those in the other three experimental groups watched different types of live broadcasts (game, music, and unboxing). Stress surveys were filled in before and after the experiment to distinguish between pre- and post-test scores. A paired-sample t-test was conducted to identify whether viewing live broadcasts helps to relieve stress. Furthermore, analysis of variance was performed across the groups to compare the efficacy of stress relief for the different types of live broadcasts.
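    The paired-sample t-test used in the study can be computed directly from the per-subject before/after differences. A stdlib-only Python sketch, with made-up example scores that are not the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Paired-sample t statistic on per-subject differences
    (before minus after); positive t means scores dropped."""
    diffs = [b - a for b, a in zip(before, after)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
```

    For example, with before = [30, 28, 35, 32, 31] and after = [24, 25, 30, 26, 27] this yields t ≈ 8.23, which would then be compared against a t distribution with n - 1 degrees of freedom.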

    Live Broadcast, Stress Relief methods, Mental Stress

    Chien-Cheng Lin1, Min-Chih Hung 1, Hsiu-Sen Chiang 2, Mu-Yen Chen 2 and An-Pin Chen 1
    1Department of Information Management and Finance, National Chiao Tung University, Hsinchu, Taiwan
    2Department of Information Management, National Taichung University of Science & Technology, Taichung, Taiwan
    Empirical research on the financial trading market has become quite important and complex in the past few years. Beyond discovering the characteristics of market trend movements, the timing of buying and selling in the market has a great impact on profitability. Therefore, extending previously published literature, relatively low-risk buying and selling points should exist in the trading market. We use market profile indicators and financial engineering physical quantities to find trade signals, and use reverse-operation trading strategies to verify whether a point is a relatively low-risk buying or selling point. The results of this study show, through statistically significant differences in profitability performance, that relatively low-risk buying and selling points exist in the financial trading market. This study makes three contributions: 1) it refutes both the Efficient Market Theory and the Random Walk Theory by showing the existence of relatively low-risk buying and selling points in the market; 2) it verifies the financial physics of the trading market; and 3) it verifies the applicability of the new indicator definition for the market profile.

    Market Profile Theory, Financial Physics, Neural Networks, Trading Analysis

    Abdulrahman Alreshidi
    College of Computer Science and Engineering (CCSE), University of Hai’l (UoH), Ha’il, Saudi Arabia
    Mobile computing is fast replacing traditional computing paradigms by offering its users portable computations and context-aware communications. Despite the benefits of mobile computing, such as portability and context-sensitivity, there are critical challenges that must be addressed, such as the resource poverty of mobile devices and the security of mobile users' data. Implementing security mechanisms that execute on mobile devices can be challenging, as mobile devices lack the processor, memory and battery resources to support continuous, long-term execution of computation-intensive tasks. The cloud computing model can provide virtually unlimited hardware, software, and service resources to compensate for the resource poverty of mobile devices. In recent years, there has been a lot of research and development on solutions and frameworks that preserve the security and privacy of mobile devices and their data. However, there has been little effort to secure mobile devices while also supporting an efficient utilization of the limited resources available on them. In this paper, we propose Security as a Service for mobile devices (SeaaS for mobile), which integrates mobile computing and cloud computing technologies to secure the critical resources of mobile devices. The proposed solution aims to support (i) security for the data-critical resources of mobile devices, and (ii) security as a service provided by cloud servers for an efficient utilization of the mobile device resources. We demonstrate security as a service based on a practical scenario for the security of mobile devices. The evaluation results show that the proposed solution is (i) accurate in detecting potential security threats and (ii) computationally efficient for mobile devices. The proposed solution, as part of ongoing research, provides the foundations to develop a framework addressing SeaaS for mobile. It aims to advance the research state of the art on mobile cloud computing, specifically focusing on exploiting cloud-based services to secure mobile devices.

    Software Engineering, Mobile Computing, Cloud Computing, Computer Security, Mobile Cloud Computing, Security as a Service

    TBFV-M: Testing-Based Formal Verification for SysML Activity Diagrams
    Yufei Yin1, Shaoying Liu1
    Hosei University, Tokyo, Japan
    SysML activity diagrams are often used as models for software systems, and their correctness is likely to significantly affect the reliability of the implementation. However, how to effectively verify the correctness of SysML diagrams remains a challenge. In this paper, we propose a testing-based formal verification (TBFV) approach to the verification of SysML diagrams, called TBFV-M, by creatively applying the existing TBFV approach for code verification. We describe the principle of TBFV-M and present a case study to demonstrate its feasibility and usability. Finally, we conclude the paper and point out future research directions.

    SysML activity diagrams, TBFV, test path generation, formal verification of SysML diagrams

    A Multi-Factor Authentication Method For Security of Online Examinations
    Abrar Ullah1, Hannan Xiao2 and Trevor Barker2
    1School of Computing and Management, Cardiff Metropolitan University
    2Department of Computer Science, University of Hertfordshire
    Security of online examinations is key to the success of remote online learning. However, it faces many conventional and non-conventional security threats. Impersonation and abetting are rising non-conventional security threats, arising when a student invites a third party to impersonate him or her, or to abet, in a remote exam. This work proposes dynamic profile questions authentication to verify that the person taking an online test is the same person who completed the coursework. This is combined with remote proctoring to prevent students from obtaining help from a third party during the exam. This research simulated impersonation and abetting attacks in a remote online course and a laboratory-based controlled simulation to analyse the impact of dynamic profile questions and proctoring, and evaluated the effectiveness of the proposed method. The findings indicate that dynamic profile questions are highly effective, and the security analysis shows that the impersonation attack was not successful.

    Security, Usability, Authentication, Online Examination  

    Automated Visualization of Input / Output for Processes in SOFL Formal Specifications
    Yu Chen1 and Shaoying Liu2

    1Graduate School of Computer and Information Science, Hosei University, Tokyo, Japan
    2Faculty of Computer and Information Science, Hosei University, Tokyo, Japan
    While formal specification is regarded as an effective means to capture accurate requirements and designs, validation of the specifications remains a challenge. Specification animation has been proposed to tackle this challenge, but the lack of an effective representation of the input/output data in the animation can considerably limit clients' understanding of it. In this paper, we put forward a tool-supported technique for visualizing the input/output data of processes in SOFL formal specifications. After discussing the motivation for our work, we describe how data of each data type available in the SOFL language can be visualized to facilitate the representation and understanding of input/output data. We also present a supporting tool for the technique and a case study to demonstrate the usability and effectiveness of our proposed technique. Finally, we conclude the paper and point out future research directions.

    Visualization, SOFL, Formal specification, Data type, Formal methods

    A content-based communication solution to avoid discontinuity in VANET

    Mohamed Anis MASTOURI1, Salem Hasnaoui1
    1Communication Systems (Sys'Com) Laboratory, National School of Engineers of Tunis (ENIT), Tunis, Tunisia

    The goal of Vehicular Ad Hoc Networks (VANETs) is to contribute to safer and more efficient roads in the future by providing timely information to drivers and interested authorities. In the last few years, VANETs have been quite a hot research area, attracting much attention from both academia and industry due to their particular characteristics, such as highly dynamic topology and intermittent connectivity. In this context, VANETs face great challenges in terms of data exchange and routing in order to avoid discontinuity. In this article, a content-based communication approach is proposed as a solution to bypass unpredictable mobility and loss of connectivity.


    VANET, publish-subscribe, MANET, opportunistic, routing.

    A Web-based Solution for Power Quality Data Management
    Claudiu Popirlan1, Gabriel Stoian1, Leonardo Geo Manescu2, Denisa Rusinaru2, Marian Ciontu2, Gabriel Cosmin Buzatu2, Miron Alba3, Adrian Cojoaca3
    1Computer Science Department & INCESA Research Hub for Applied Sciences University of Craiova, Romania
    2Faculty of Electrical Engineering & INCESA Research Hub for Applied Sciences University of Craiova, Romania
    3Oltenia Distribution SA, Craiova, Romania
    In this paper we present a web-based solution designed to convert, by integrating an appropriate plugin, heterogeneous information provided by any type of power acquisition equipment into standard formats such as PQDIF (Power Quality Data Interchange Format, the IEEE Std 1159.3-2003 standard). Power grid operators are interested in this kind of application, capable of converting huge volumes of heterogeneous information into standard formats so that it can be easily processed.
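    The plugin-based conversion idea can be sketched as a registry that maps each equipment format to a parser emitting common observation records. Everything below (class and field names, the stub "PQDIF" output) is hypothetical and omits real PQDIF serialization:

```python
class ConverterRegistry:
    """Plugin registry: each acquisition-equipment format registers a
    parser that maps its raw export to a list of common observation
    dicts, which a real system would then serialize as PQDIF records."""

    def __init__(self):
        self._plugins = {}

    def register(self, fmt, parser):
        self._plugins[fmt] = parser

    def convert(self, fmt, raw):
        observations = self._plugins[fmt](raw)
        return {"format": "PQDIF", "observations": observations}
```

    A vendor-specific plugin then only has to implement the parser, e.g. for a simple timestamp/voltage CSV export.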

    Web-Based Solution, Java Enterprise Application, Power Quality, Data Conversion  

    Using Supervised And Unsupervised Machine Learning Algorithms For Business Process Modeling
    Muhammad Ayaz1
    1Department of Computer Science, Umm Al-Qura University, Makkah Al Mukarramah,
    Kingdom of Saudi Arabia
    Business process modeling is a challenging task for every organization because of the complexity of the data used in the organization. Nowadays, every business industry wants a digital business process that is generated automatically from business transactions in the recorded data. This data is recorded in a database event log or inside a data warehouse. Generating the digital business process automatically from such data in any business environment is a challenging task, because it requires deep knowledge of the data and the process. Moreover, it takes a long time, sometimes weeks and even months. To overcome these issues and to model a business process quickly, I propose a model that uses machine learning techniques to generate the business process automatically from the recorded data. In my approach, I used supervised and unsupervised machine learning techniques to model business processes from aggregated and sequence data. Moreover, a limited number of notations were used, which makes the result easy to understand and analyze.
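    A common first step when discovering a process model from an event log is mining the directly-follows relation between activities. The sketch below (not the author's model; trace data is made up) shows the idea in Python:

```python
from collections import defaultdict

def directly_follows(event_log):
    """Count how often activity b directly follows activity a across
    all traces; the resulting graph is the usual starting point for
    automatic process-model discovery."""
    dfg = defaultdict(int)
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dict(dfg)
```

    Edge frequencies in this graph then drive both supervised labeling and unsupervised grouping of process variants.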

    Business Process Management, Knowledge discovery, Data Mining, Supervised, Unsupervised

    Li Yan1, Wang Dan2, Han Bo2
    1Academy of Opto-Electronics, Chinese Academy of Sciences, Beijing, China
    2Science and Technology on Underwater Acoustic Laboratory, Harbin Engineering University, Harbin, China

    Using the structure of nested arrays, a broadband direction of arrival (DOA) estimation algorithm is presented. The algorithm combines data from different sensors to construct a covariance matrix equivalent to that of a uniform linear array whose aperture is the same as that of the nested array; the DOAs of broadband sources can then be estimated by the incoherent MUSIC algorithm. Compared to traditional coherent subspace algorithms, the proposed algorithm does not need to pre-estimate the azimuths of the sources, and it still performs well when the number of sources is greater than the number of sensors. For the same array aperture, the number of sensors is reduced considerably by using nested arrays, while good localization performance and spatial resolution are maintained. Finally, computer simulations show the performance of the proposed algorithm.
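    The key property behind resolving more sources than sensors is that a two-level nested array's difference co-array is a filled virtual ULA far larger than the physical array. A small sketch (hypothetical function names) illustrating this:

```python
def nested_positions(n1, n2, d=1):
    """Two-level nested array: n1 sensors at spacing d followed by
    n2 sensors at spacing (n1 + 1) * d."""
    inner = [i * d for i in range(1, n1 + 1)]
    outer = [j * (n1 + 1) * d for j in range(1, n2 + 1)]
    return inner + outer

def difference_coarray(positions):
    """All distinct pairwise sensor-position differences: the virtual
    array on which covariance-based estimators effectively operate."""
    return sorted({p - q for p in positions for q in positions})
```

    Six physical sensors (n1 = n2 = 3) yield a hole-free virtual ULA of 23 elements, which is what lets MUSIC on the equivalent covariance matrix handle more sources than sensors.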


    Array processing, DOA estimation, broadband signal, nested arrays

    A Semantic Collaborative Clustering Approach Based on Confusing Matrix
    Damien E. ZOMAHOUN
    University of Burgundy Dijon, France
    In this paper we discuss a new image retrieval technique based on clustering. We argue that images do not have an intrinsic meaning but can receive different interpretations, which can complicate document retrieval. However, users need quick and direct access to documents. To answer this requirement, we propose a retrieval approach which uses a collaborative clustering technique based on a confusing matrix.

    Collaborative Clustering, Retrieval, Confusing Matrix, Mapping Function

    An Approach of Semantic Retrieval of Information Based on Local Granularity and Similarity Measures
    Damien E. ZOMAHOUN
    University of Burgundy Dijon, France
    The success of an information retrieval system hinges on allowing efficient access to information in a way that is cost-effective and consistent with the adopted retrieval techniques based on similarity measures. This work focuses on an intelligent approach to the semantic retrieval of information. We propose an architecture in which the textual information extracted from the query is mapped to semantically meaningful keywords of documents. Our key contribution is the use of local granularity and similarity measures to retrieve information. Information at both the metadata and visual levels is used to speed up the retrieval of subsequent documents within the same domain as well as to improve future query and retrieval of information.

    Information, Semantic, Retrieval, Similarity Measure, Local Granularity

    Pan Hongxia1, Pan Mingzhi2, and Xu Xin1
    1School of Mechanical Engineering, North University of China, Taiyuan, China
    2School of Mechanical Engineering, Jinzhong University, Jinzhong, China
    To address the endpoint effect problem in the EMD decomposition process, a new method of inhibiting the endpoint effect is put forward in this paper. The method, based on data extension with an AR-model prediction and a window function, suppresses the problem effectively. Simulations on a synthetic signal obtained the desired effect; applying the method to gearbox test data also achieved good results, improving the calculation accuracy of the subsequent data processing. This lays a good foundation for fault diagnosis of gearboxes under various working conditions.
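    The AR-model data extension idea can be illustrated in miniature with a first-order AR fit used to extrapolate past the signal's right endpoint (the left end is handled symmetrically). This toy sketch is not the paper's method, which uses a full AR model plus a window function:

```python
def ar1_extend(x, n_ext):
    """Fit an AR(1) coefficient from the lag-1 autocovariance of the
    mean-centred signal, then extrapolate n_ext samples beyond the
    right endpoint so EMD envelopes are anchored outside the data."""
    m = sum(x) / len(x)
    c = [v - m for v in x]
    phi = sum(a * b for a, b in zip(c, c[1:])) / sum(v * v for v in c)
    ext = list(x)
    for _ in range(n_ext):
        ext.append(m + phi * (ext[-1] - m))
    return ext
```

    After decomposition, the extended samples are discarded, so endpoint distortion falls on the padding rather than on the real data.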

    AR Model, Data Extension, Window Function, Empirical Mode Decomposition(EMD), Non-stationary Signal

    Yougjae Kim
    Korea Polytechnic University, Republic of Korea
    Recently, domestic financial institutions have been briskly entering Southeast Asian markets, and progress is also being made in the export of financial services and systems by domestic financial institutions. The demand for biometric authentication services is increasing significantly owing to advances in new technologies such as the Internet of Things (IoT) and artificial intelligence, as well as the expanded use of fin-tech and electronic financing arrangements. The global market size of biometric authentication services is expected to grow from USD 2.4 billion in 2016 to USD 15.1 billion in 2025 (a six-fold growth; Tractica, "Biometrics Market Forecast", Feb. 6, 2017). By enacting and propagating a standard for the authentication method that verifies customers in non-face-to-face transactions by utilizing biometric data registered by the customers and split into shares stored at multiple institutions for distributed management, contributions can be made to a proactive response to the leakage of biometric data, the prevention of infringement of customer privacy, and the activation of the biometric authentication service and its related industries. The purpose of this paper is to propose standardizing the distributed management technology for biometric data in international standards such as ISO and IEC.
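    The "split and stored at multiple institutions" idea resembles secret sharing. As an illustration only (not the proposed standard, which would need threshold sharing and key management), an n-of-n XOR split can be sketched as:

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret: bytes, n: int):
    """n-of-n split: n - 1 random shares plus one share that XORs the
    rest back to the secret; any n - 1 shares alone reveal nothing."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))
    return shares

def combine(shares):
    return reduce(xor_bytes, shares)
```

    Each institution holds one share, so a breach at any single institution leaks no usable biometric data; reconstruction requires cooperation of all share holders.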

    Biometric, Registration, Authentication, Certification, Fin-Tech, Financial transactions