Top Software Engineering Research articles of 2020

FROM QUALITY ASSURANCE TO QUALITY ENGINEERING FOR DIGITAL TRANSFORMATION

    Kiran Kumaar CNK, Capgemini India Private Limited, India

    ABSTRACT

    Defects are one of the seven prominent wastes in the lean process, arising when a product or functionality fails to meet customer expectations. These defects, in turn, can cause rework and redeployment of that product or functionality, which costs valuable time, effort, and money. Surveys show that most clients invest considerable time, energy, and money in fixing production defects. This paper describes how to move from quality assurance to quality engineering for digital transformation through diagnostic, predictive, and prescriptive approaches. It also outlines the overall increase in quality observations gained by shifting QA left and delivering continuously through Agile, with the integration of analytics and a toolbox.
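
    The abstract does not prescribe a specific tool, but a minimal sketch of the "predictive" leg of such an approach might look like the following: a simple model scores incoming changes by their estimated likelihood of causing a production defect, so that test effort shifts left instead of being spent on post-release fixes. All feature names and data below are illustrative assumptions.

        # Illustrative sketch only: score a code change by its estimated defect risk
        # so that extra testing happens before release (shift left) rather than after.
        # The features and data are hypothetical.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Hypothetical historical features per change: lines changed, files touched,
        # prior defects in the module, test coverage of the change.
        X = rng.normal(size=(200, 4))
        y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)

        model = LogisticRegression().fit(X, y)

        # A high score on an incoming change triggers review and testing up front.
        new_change = rng.normal(size=(1, 4))
        print("estimated defect risk:", model.predict_proba(new_change)[0, 1])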

    KEYWORDS

    Diagnostic, Predictive & Prescriptive approaches, continuous delivery through Agile.



    Full Paper
    https://aircconline.com/csit/csit1002.pdf


    Volume Link :
    http://airccse.org/csit/V10N02.html



DESIGN OF SOFTWARE TRUSTED TOOL BASED ON SEMANTIC ANALYSIS

    Guofengli, Beijing University of Technology, Beijing, China

    ABSTRACT

    At present, research on software trustworthiness mainly focuses on two areas: behavioral trustworthiness and trusted computing. Trusted computing research is currently in the active-immune stage of Trusted Computing 3.0. Behavioral trustworthiness focuses on detecting and monitoring software behavior trajectories: abnormal behaviors are found through scene-based and hierarchical monitoring of program call sequences, and sensitive or dangerous software behaviors are restricted.

    Current research on behavioral trust mainly uses XML to configure behavior declarations that constrain sensitive and dangerous software behaviors, and is mainly applied to software trust testing methods. Research on XML behavior declaration files typically obtains a sensitive-behavior set and defines behavior paths through manual configuration, focusing on the formulation of behavior declarations and the generation of behavior-declaration test cases; there is little research on the trustworthiness of behavior semantics. Because behavior declarations are XML declaration files configured from behavior sets, manual configuration is complicated and time-consuming and can leave the behavior set incomplete. This paper uses a trusted tool based on semantic analysis technology to solve the behavior-set integrity problem and to generate trusted declaration files efficiently.

    The main idea of this paper is to use semantic analysis technology to model requirements, including both dynamic and static semantic analysis. A UML model is used to automatically generate XML code, to analyze and model behavioral semantics, and to formally model non-functional requirements, so as to ensure the credibility of the developed trusted tool and of the automatically generated XML files. The work centers on formal modeling of non-functional requirements: state diagrams and the function layer are analyzed semantically, activity diagrams established through model-driven methods generate XML trusted-behavior declaration files, and semantic analysis finally produces functional semantic sets and functional semantic trees to ensure the integrity of the software behavior set. The behavior set then generates a behavior declaration file in XML format through the designed trusted tool, and trusted computing is used to verify the credibility of the trusted tool.
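
    As a rough illustration of the declaration-generation step described above, the sketch below serializes a hypothetical behavior set (such as one derived from semantic analysis of an activity diagram) into an XML behavior declaration file. The element and attribute names are assumptions made for illustration; the paper's actual declaration schema is not given in the abstract.

        # Serialize a hypothetical behavior set into an XML behavior declaration.
        # Element and attribute names are illustrative assumptions, not the paper's schema.
        import xml.etree.ElementTree as ET

        # Behavior set as it might come out of semantic analysis of an activity diagram:
        # (behavior name, program call sequence, whether the behavior is sensitive).
        behavior_set = [
            ("read_config", ["open", "read", "close"], False),
            ("send_user_data", ["connect", "encrypt", "send"], True),
        ]

        root = ET.Element("BehaviorDeclaration")
        for name, calls, sensitive in behavior_set:
            behavior = ET.SubElement(root, "Behavior", name=name,
                                     sensitive=str(sensitive).lower())
            sequence = ET.SubElement(behavior, "CallSequence")
            for call in calls:
                ET.SubElement(sequence, "Call").text = call

        ET.indent(root)  # requires Python 3.9+
        print(ET.tostring(root, encoding="unicode"))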

    KEYWORDS

    Behavior declaration, behavior semantic analysis, trusted tool design, functional semantic set.


    For More Details :
    https://aircconline.com/csit/csit1002.pdf


    Volume Link :
    http://airccse.org/csit/V10N02.html


DOCPRO: A FRAMEWORK FOR BUILDING DOCUMENT PROCESSING SYSTEMS

    Ming-Jen Huang, Chun-Fang Huang and Chiching Wei, Foxit Software Inc., Albrae Street, Fremont, USA

    ABSTRACT

    With recent advances in deep neural networks, we observe new applications of natural language processing (NLP) and computer vision (CV) technologies. Especially when applied to document processing, NLP and CV tasks are usually treated individually in research work and open-source libraries. However, designing a real-world document processing system requires weaving NLP and CV tasks and the information they generate together. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. This paper introduces a framework to fulfil this need. The framework includes a representation model definition for holding the generated information and specifications defining the coordination between the NLP and CV tasks.
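
    The abstract does not expose the framework's API, but a minimal sketch of a shared representation model, in which a CV layout task proposes regions and an NLP task writes its results back into the same structure, might look like this. The class and field names are assumptions, not DocPro's actual definitions.

        # A shared representation model holding both CV output (layout regions) and
        # NLP output (text, entities) for one page. Names are assumptions, not DocPro's API.
        from dataclasses import dataclass, field
        from typing import List, Tuple

        @dataclass
        class Region:
            kind: str                                  # e.g. "paragraph", "table", "figure"
            bbox: Tuple[float, float, float, float]    # pixel coordinates from the CV task
            text: str = ""                             # filled in later by OCR / NLP tasks
            entities: List[str] = field(default_factory=list)

        @dataclass
        class PageRepresentation:
            page_no: int
            regions: List[Region] = field(default_factory=list)

            def attach_text(self, idx: int, text: str) -> None:
                """The NLP stage writes its results back into the shared model."""
                self.regions[idx].text = text

        # The CV task proposes regions; the NLP task enriches them in the same structure.
        page = PageRepresentation(page_no=1,
                                  regions=[Region("paragraph", (50.0, 80.0, 500.0, 160.0))])
        page.attach_text(0, "Quarterly revenue grew 12% year over year.")
        print(page)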

    KEYWORDS

    Document Processing, Framework, Formal definition, Machine Learning.


    For More Details :
    https://aircconline.com/csit/csit1009.pdf


    Volume Link :
    http://airccse.org/csit/V10N09.html


MODERATION EFFECT OF SOFTWARE ENGINEERS’ EMOTIONAL INTELLIGENCE (EQ) BETWEEN THEIR WORK ETHICS AND THEIR WORK PERFORMANCE

    Shafia Khatun and Norsaremah Salleh, International Islamic University Malaysia (IIUM), Kuala Lumpur, Malaysia

    ABSTRACT

    In today’s world, software is used in every sector: education, healthcare, security, transportation, finance, and so on. Because software engineers affect society so greatly, unethical behavior on their part can cause widespread damage, as in the Facebook-Cambridge Analytica scandal in 2018. Therefore, investigating the ethics of software engineers and the relationships ethics has with other interpersonal variables, such as work performance, is important for understanding what could be done to improve the situation. Software engineers work in rapidly changing business environments that generate considerable stress; their emotions are important for dealing with this and can affect their ethical decision-making. In this quantitative study, the researcher investigates whether Emotional Intelligence (EQ) moderates the relationship between the work ethics of software engineers and their work performance, using hierarchical multiple regression analysis in SPSS. The findings show that EQ does significantly moderate the relationship between work ethics and work performance. These findings provide valuable information for improving the ethical behavior of software engineers.
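
    The moderation test described above can be illustrated with a small sketch of hierarchical regression with an interaction term, written here with statsmodels rather than SPSS. The variable names and synthetic data are assumptions for illustration only.

        # Hierarchical regression with an interaction term, the usual way to test
        # moderation. Variable names and synthetic data are illustrative assumptions.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 150
        ethics = rng.normal(size=n)
        eq = rng.normal(size=n)
        performance = 0.4 * ethics + 0.3 * eq + 0.25 * ethics * eq + rng.normal(scale=0.8, size=n)
        df = pd.DataFrame({"ethics": ethics, "eq": eq, "performance": performance})

        # Step 1: main effects only.  Step 2: add the ethics x EQ interaction.
        step1 = smf.ols("performance ~ ethics + eq", data=df).fit()
        step2 = smf.ols("performance ~ ethics * eq", data=df).fit()

        # A significant interaction coefficient (and an R-squared increase) indicates moderation.
        print("interaction p-value:", step2.pvalues["ethics:eq"])
        print("R-squared change:", step2.rsquared - step1.rsquared)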

    KEYWORDS

    Software engineers, emotional intelligence, work ethics, work performance, quantitative study.


    For More Details :
    https://aircconline.com/csit/csit1014.pdf


    Volume Link :
    http://airccse.org/csit/V10N14.html


UNIQUE SOFTWARE ENGINEERING TECHNIQUES: PANACEA FOR THREAT COMPLEXITIES IN SECURE MULTIPARTY COMPUTATION (MPC) WITH BIG DATA

    Uchechukwu Emejeamara, IEEE Computer Society, Connecticut Section, USA; Udochukwu Nwoduh and Andrew Madu, Federal Polytechnic Nekede, Nigeria

    ABSTRACT

    Most large corporations with big data have adopted stricter privacy measures for handling their sensitive/private data, and as a result, running analytic tools across multiple sources has become ineffective. Secure multi-party computation (MPC) allows joint computation across multiple parties. The practicality of MPC is impaired when dealing with large datasets, as many of its algorithms scale poorly with data size. Despite these limitations, MPC continues to attract increasing attention from industry players who view it as a better approach to exploiting big data. Secure MPC, however, is faced with complexities that often overwhelm its handlers, hence the need for special software engineering techniques to resolve these threat complexities. This research presents cryptographic data security measures, the garbled circuits protocol, circuit optimization, and protocol execution techniques as some of the special techniques for resolving the threat complexities associated with MPC. Honest majority, asymmetric trust, covert security, and trading off leakage are some of the experimental outcomes of implementing these special techniques. The paper also shows that knowing the adversary type is essential for developing suitable mitigation strategies.
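
    As a minimal illustration of the joint computation that MPC enables, the toy sketch below uses additive secret sharing so that parties learn only the sum of their inputs, never each other's raw values. It is not the garbled-circuit or optimization techniques the paper surveys, just the underlying idea.

        # Toy additive secret sharing over a prime field: parties jointly compute a sum
        # without revealing individual inputs. A minimal sketch of the MPC idea only.
        import secrets

        P = 2**61 - 1  # prime modulus for the toy field

        def share(value, n_parties):
            """Split a value into n additive shares modulo P."""
            shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
            shares.append((value - sum(shares)) % P)
            return shares

        # Each party's private input, e.g. a sensitive count from its own dataset.
        inputs = [42, 17, 99]
        n = len(inputs)

        # Party i hands one share of its input to every party j.
        all_shares = [share(v, n) for v in inputs]

        # Each party sums the shares it holds locally; combining the partial sums
        # reveals only the total, never any individual input.
        partial_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
        print("joint sum:", sum(partial_sums) % P)  # 158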

    KEYWORDS

    Cryptographic Data Security, Garbled Circuits, Optimizing Circuits, Protocol Execution, Honest Majority, Asymmetric Trust, Covert Security, Trading Off Leakage.


    For More Details :
    https://aircconline.com/csit/csit1014.pdf


    Volume Link :
    http://airccse.org/csit/V10N14.html


NETWORK DEFENSE IN AN END-TO-END PARADIGM

    William R. Simpson and Kevin E. Foltz The Institute for Defense Analyses (IDA), Alexandria, Virginia, USA

    ABSTRACT

    Network defense implies a comprehensive set of software tools to prevent malicious entities from conducting nefarious activities. For most enterprises today, that defense builds on the fortress approach: many requirements are based on inspecting and reporting on communications before they are delivered to the intended target. These inspections require decrypting packets when they are encrypted, which implies that the defensive suite has access to the private keys of the servers that are the targets of communication. This is in contrast to an end-to-end paradigm, where known good entities communicate directly with each other. In an end-to-end paradigm, confidentiality is maintained through unbroken end-to-end encryption: the private key resides only with the holder-of-key in the communication, and inspection and reporting rely on distributed computation. This paper examines a formulation that is pertinent to the Enterprise Level Security (ELS) framework.
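
    The end-to-end property contrasted with the fortress model can be sketched as follows: the recipient's private key never leaves the endpoint, so an inline appliance that sees only ciphertext has nothing to decrypt with. The sketch uses the pyca/cryptography package for illustration and is not the ELS framework itself.

        # Only the intended recipient holds the private key, so an inline inspection
        # appliance seeing the ciphertext cannot decrypt it. Uses the pyca/cryptography
        # package as an illustration; this is not the ELS framework itself.
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        # The recipient generates its key pair; only the public half leaves the endpoint.
        recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

        # The sender encrypts directly to the recipient's public key.
        ciphertext = recipient_key.public_key().encrypt(b"end-to-end protected message", oaep)

        # A middlebox holding only the ciphertext has nothing to decrypt with;
        # the holder of the private key recovers the message.
        print(recipient_key.decrypt(ciphertext, oaep))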

    KEYWORDS

    Appliance, end-to-end security model, ELS, network defenses, web server handlers.


    For More Details :
    https://aircconline.com/csit/csit1014.pdf


    Volume Link :
    http://airccse.org/csit/V10N14.html


QUALITY MODEL BASED ON PLAYABILITY FOR THE UNDERSTANDABILITY AND USABILITY COMPONENTS IN SERIOUS VIDEO GAMES

    Iván Humberto Fuentes Chab, Damián Uriel Rosado Castellanos, Olivia Graciela Fragoso Diaz and Ivette Stephany Pacheco Farfán, Instituto Tecnológico Superior de Escárcega (ITSE), Escárcega, México

    ABSTRACT

    A serious video game is an easy and practical way to get the player to learn about a complex subject, such as performing integrals, applying first aid, or even getting children to learn to read and write in their native language or another language. Therefore, to develop a serious video game, a guide is needed that contains the basic elements its software components must consider. This research presents a quality model to evaluate playability, taking the attributes of usability and understandability at the level of software components. The model can serve as a set of parameters to measure the quality of a serious video game as a software product before and during its development, providing a margin with the essential elements a serious video game must have so that players reach the desired objective of learning while playing. The experimental results show a score of 88.045% against the proposed quality model for the serious video game used in the test case, a margin that can vary according to the needs of the implemented video game.
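
    A rough sketch of how an aggregate playability score of this kind might be computed is shown below, by weighting usability and understandability metrics at the component level. The metric names, weights, and values are illustrative assumptions, not the paper's instrument.

        # Aggregate a playability score from component-level usability and
        # understandability metrics. Metric names, weights, and values are assumptions.
        usability = {"learnability": 0.90, "operability": 0.85, "error_protection": 0.88}
        understandability = {"clarity_of_goals": 0.92, "feedback_quality": 0.86}

        def aggregate(metrics):
            """Equal-weight average of a metric group, expressed as a percentage."""
            return 100 * sum(metrics.values()) / len(metrics)

        # Weight the two components of the quality model equally and combine.
        playability = 0.5 * aggregate(usability) + 0.5 * aggregate(understandability)
        print(f"playability score: {playability:.3f}%")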

    KEYWORDS

    Quality Model, Serious Video Games, Playability Metrics.


    For More Details :
    https://aircconline.com/csit/papers/vol10/csit101912.pdf


    Volume Link :
    http://airccse.org/csit/V10N19.html





