An Assistive System based on Kinect Sensor to Help Students with Disabilities in Educational Institutions

DOI: 10.17577/IJERTV8IS040486


Hanan E. Abdelkader
Dept. of Computer Science, Mansoura University, Egypt

W. K. El Said
Dept. of Computer Science, Mansoura University, Egypt

Abstract: In modern times, the scientific and technological revolution is creating a new reality in all sectors of society. As the educational sector is among the most influential, new technologies are being employed to develop its activities and operations. One of the new development trends in the education sector is the transition to E-Learning. Unfortunately, physical disabilities limit the ability of some students to use E-Learning resources. Therefore, the current study presents an assistive system based on the Kinect motion sensor to help students with hand impairments improve their educational and academic performance in E-Learning systems. The efficiency of the proposed system has been evaluated by a number of experts and end users. The final evaluation outcomes indicate that the Acceptance Factor of the proposed system is significantly high.

Keywords: Educational Institutions; E-Learning; Students With Disabilities; Students With Hand Impairments; Remote Control; Assistive Technology; Kinect Sensor

  1. INTRODUCTION

In the digital age, rapid technological change requires reconsidering how modern technology is applied in all sectors of society, including education. One of the new forms of applying modern technology in the education sector is helping students with disabilities learn better and achieve their educational goals alongside their non-disabled peers [1].

In fact, the right of students with disabilities to a free public education was affirmed in 1997 through landmark legislation known as the Individuals with Disabilities Education Act (IDEA) [2].

This law was reauthorized in 2004 to obligate schools to provide assistive technology for students with disabilities and to ensure equal access to school programs and services [3].

Assistive technology is any technology that provides creative tools to improve the functional capabilities of students with difficulties, enabling them to learn normally [4, 5].

Assistive technology differs from traditional technology, such as instructional technology, in that the latter does not take individual students' needs into account [6].

Assistive technology uses a variety of advanced equipment to improve the educational environment for students with disabilities. Examples of such equipment include wheelchairs, prosthetics, vision and hearing aids, communication devices, etc. [7].

    In recent times, assistive technology has become the focus of attention in the education sector because it offers many desired advantages, including the following [6]:

    • It develops individual strengths;

    • It motivates students to learn effectively;

    • It uses compensatory tools;

    • It increases learning rates;

    • It helps students to complete difficult academic tasks;

    • It raises the sense of self-efficacy;

• It addresses literacy and numeracy challenges through effective tools such as word processing and spell-checking.

The aim of this study is to design an assistive system based on the Microsoft Kinect sensor for helping students with hand impairments at Mansoura University in Egypt to improve their performance and to maximize their educational and academic gains.

The organization of the paper is shown in Fig. 1 and summarized as follows: Section 1 presents the introduction of the study. Section 2 gives an overview of Kinect sensor technology. Section 3 defines the research problem. Section 4 explains the details of the proposed system, including its components, guidelines, steps, development scenario and evaluation results. Finally, concluding remarks and directions for future work are presented in the last section.

      Fig. 1. Organization of proposed work

  2. BACKGROUND: MICROSOFT KINECT SENSOR

For several years now, imaging and motion-tracking technology has attracted researchers' attention, particularly in physics laboratories, where it is used to improve student understanding of particle motion [8]. In the literature, there are many technologies that support motion detection. One of the most important motion detectors is the Kinect for Microsoft's Xbox 360. The Kinect sensor was originally launched in November 2010 by Microsoft Corporation for the entertainment market, especially for video games [9, 10]. Its low price and great potential then led engineers to employ it in other advanced areas [11-13].

The Microsoft Kinect sensor contains a depth sensor, a color camera and a four-microphone array, which together provide full-body 3D motion capture, facial recognition and voice recognition capabilities [14]. For educational applications, Kinect technology cannot be implemented alone in the classroom, so integration with other devices is required [15].

In fact, it has been shown that applying the Microsoft Kinect sensor in the education sector yields many advantages, such as [16]:

• It facilitates teaching and enhances the level of learning;

    • It enhances classroom interactions;

    • It increases classroom participation;

• It improves teachers' ability to present digital materials;

    • It provides opportunities for interaction and discussion;

• It creates enjoyable and interesting interaction types to promote both motivation and learning via its multimedia and multi-sensory capacities.

  3. PROBLEM DESCRIPTION

Nowadays, conventional learning systems are no longer suited to the new trends of society, so it is necessary to find alternatives that keep up with the modern world. Today, one of the most effective tools for addressing the drawbacks of traditional education systems is "E-Learning". This type of learning aims to enable students to access digital course materials remotely, from anywhere, using modern technologies. In the literature, E-Learning systems operate in two modes: offline and online. Practically speaking, both modes depend mainly on computer technology, so students must be familiar with computer principles and applications in order to use common E-Learning media such as CDs, DVDs, the Internet, etc.

In the educational sector there are many students with physical disabilities, whether due to inherited factors or acquired conditions. These disabilities pose a variety of challenges that can hinder such students' ability to use E-Learning tools, including computer hardware and application software. It is therefore all the more necessary to exploit modern technologies to help this category of students use E-Learning systems.

Unfortunately, the list of human impairments includes a large number of disorders, such as speech, language and hearing disorders, so it is very difficult to find a single realistic solution that addresses all of these difficulties together.

As students with hand impairments are unable to participate effectively in E-Learning systems because they cannot use a mouse or keyboard, the current study provides an electronic system for this category of students. The proposed system employs the Microsoft Kinect sensor to remotely control the computer and its contents via voice commands or movements of the student's head.

4. PROPOSED SYSTEM

    To help students with hand impairments to participate in E-Learning systems, a novel assistive system is proposed. For ease of presentation, the details of the proposed system will be divided into separate sections as follows:

    1. System Components

The proposed system consists of three integrated components. Component-1 is a laptop computer used to receive the system inputs and to initialize the work environment. Component-2 is an external hardware device, the Microsoft Xbox 360 Kinect sensor, used to detect the student's voice commands and to identify the student's head movements. Component-3 is an application program that enables the student to control the operating system or multimedia files through either speech or head motion.
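To make the component wiring concrete, the following C# sketch (written against the Kinect for Windows SDK v1.x, which targets the Xbox 360 sensor) shows how an application of this kind can locate and start the sensor. The class name AssistiveController and its members are illustrative assumptions, not the authors' actual source code.

```csharp
// Hedged sketch (not the authors' source) of how the three components are wired
// together with the Kinect for Windows SDK v1.x: the application (Component-3)
// running on the laptop (Component-1) locates and starts the Xbox 360 Kinect
// sensor (Component-2).
using System.Linq;
using Microsoft.Kinect;

public partial class AssistiveController
{
    protected KinectSensor sensor;

    public void StartSensor()
    {
        // Pick the first Kinect that reports itself as connected over USB.
        sensor = KinectSensor.KinectSensors
            .FirstOrDefault(k => k.Status == KinectStatus.Connected);
        if (sensor == null) return;   // no sensor attached to the laptop

        sensor.SkeletonStream.Enable();   // used later for head tracking
        sensor.Start();                   // color, depth and audio streams become available
    }
}
```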

    2. System Guidelines

There are many guidelines which should be taken into account when using the proposed assistive system; they are listed below (and collected as named constants in the sketch after this list):

• The distance between the student with a hand impairment and the Kinect sensor should be between 80 and 150 cm;

      • The voice commands should be in American English (US);

• The head movement to the right should be at least 3 cm;

• The head movement to the left should be at least 3 cm;

• The head movement forward should be at least 7 cm;

• The head movement backward should be at least 7 cm.
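For clarity, the guideline values above can be collected as named constants, as in the following C# sketch; the identifiers are illustrative and not taken from the paper.

```csharp
// The guideline values above, collected as named constants. Identifiers are
// illustrative; the values come directly from the list in this subsection.
public static class InteractionGuidelines
{
    // Recommended seating distance between the student and the Kinect sensor (cm).
    public const double MinDistanceCm = 80.0;
    public const double MaxDistanceCm = 150.0;

    // Minimum head displacement treated as a deliberate gesture (cm).
    public const double MinHeadShiftLeftRightCm = 3.0;
    public const double MinHeadShiftForwardBackwardCm = 7.0;

    // Voice commands are recognized with the American English (US) recognizer.
    public const string SpeechCulture = "en-US";
}
```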

    3. System Steps

The steps of the proposed assistive system can be divided into three parts. Part-1 includes the main steps for using the proposed system. Part-2 includes the underlying steps for handling the student's voice commands. Part-3 includes the underlying steps for handling the student's head movements.

      1. Main steps of proposed system

The basic steps for using the proposed system are shown in Fig. 2 and illustrated as follows:

        Step1: The Kinect sensor is placed above the laptop computer.

        Step2: The Kinect sensor is connected to the laptop computer via a USB cable.

Step3: The student with a hand impairment is seated in front of the laptop computer and the Kinect sensor.

Step4: The proposed application software is launched.

Step5: The student with a hand impairment gives a voice command to the proposed system to select the working mode from one of the following options (a small mode-switch sketch follows Fig. 2):

• Default Mode: This mode is dedicated to controlling the installed operating system (typically Microsoft Windows), either through voice commands or head movements.

• Media Mode: This mode is dedicated only to controlling multimedia files that are in playback mode, either through voice commands or head movements.

          Fig. 2. Main steps of proposed system
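The mode selection of Step 5 can be sketched as a simple enumeration plus a voice-driven switch, as below. The spoken words "default" and "media" are assumptions; the paper does not specify the exact vocabulary.

```csharp
// Hedged sketch of the two working modes selected in Step 5. The command words
// "default" and "media" are illustrative assumptions.
public enum WorkMode { Default, Media }

public partial class AssistiveController
{
    private WorkMode mode = WorkMode.Default;

    // Called by the voice-command handler when a mode word is recognized.
    public void SelectMode(string spokenWord)
    {
        switch (spokenWord.ToLowerInvariant())
        {
            case "default": mode = WorkMode.Default; break;  // control Windows itself
            case "media":   mode = WorkMode.Media;   break;  // control the playing media file
        }
    }
}
```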

      2. Steps of handling voice commands

        The voice commands are processed as shown in Fig.3 and illustrated as follows:

Step1: The list of actions is created based on specific words.
Step2: The voice command is given by the student with a hand impairment.

Step3: The given voice command is detected by the Kinect sensor.

Step4: The detected voice command is matched against the words in the predefined list. If it is supported, the appropriate action is executed on either the operating system or the multimedia files, according to the working mode (see the recognition sketch after Fig. 3).

        Fig. 3. Steps of processing voice commands
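A minimal sketch of this voice-command pipeline is shown below, assuming the Microsoft.Speech recognizer that ships with the Kinect SDK and the Kinect's own microphone array as the audio source. The word list, the confidence threshold and the SelectMode/ExecuteAction helpers are illustrative assumptions rather than the authors' implementation.

```csharp
// Hedged sketch of Steps 1-4 for voice commands, assuming the Microsoft.Speech
// recognizer bundled with the Kinect SDK v1.x. The vocabulary and the 0.6
// confidence threshold are illustrative assumptions.
using System.IO;
using System.Linq;
using Microsoft.Speech.AudioFormat;
using Microsoft.Speech.Recognition;

public partial class AssistiveController
{
    private SpeechRecognitionEngine speechEngine;

    public void StartVoiceCommands()
    {
        // Step 1: predefined list of action words, recognized in US English.
        var recognizer = SpeechRecognitionEngine.InstalledRecognizers()
            .FirstOrDefault(r => r.Culture.Name == InteractionGuidelines.SpeechCulture);
        if (recognizer == null) return;   // US English recognizer not installed

        speechEngine = new SpeechRecognitionEngine(recognizer.Id);
        var words = new Choices("open", "close", "play", "pause", "default", "media");
        var builder = new GrammarBuilder { Culture = recognizer.Culture };
        builder.Append(words);
        speechEngine.LoadGrammar(new Grammar(builder));
        speechEngine.SpeechRecognized += OnSpeechRecognized;

        // Steps 2-3: the Kinect microphone array feeds the recognizer.
        Stream audio = sensor.AudioSource.Start();
        speechEngine.SetInputToAudioStream(audio,
            new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));
        speechEngine.RecognizeAsync(RecognizeMode.Multiple);
    }

    private void OnSpeechRecognized(object s, SpeechRecognizedEventArgs e)
    {
        // Step 4: act only on confident matches against the predefined list.
        if (e.Result.Confidence < 0.6) return;
        string word = e.Result.Text;
        if (word == "default" || word == "media")
            SelectMode(word);              // switch working mode (Step 5 sketch)
        else
            ExecuteAction(word, mode);     // dispatcher sketched after Fig. 5
    }
}
```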

      3. Steps of handling head movements

      The head movements are processed as shown in Fig.4 and illustrated as follows:

      Step1: The list of actions is created based on specific head directions.

Step2: The head movement is made by the student with a hand impairment.

Step3: The head movement is detected by the Kinect sensor.

Step4: The detected head movement is matched against the head directions in the predefined list. If it is supported, the appropriate action is executed on either the operating system or the multimedia files, according to the working mode (see the tracking sketch after Fig. 4).

      Fig. 4. Steps of processing head movements
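The head-movement pipeline can be sketched as follows, using the skeleton stream of the Kinect SDK v1.x and the guideline thresholds defined earlier. The re-baselining strategy and the sign-to-direction mapping are assumptions made for illustration, not the authors' algorithm.

```csharp
// Hedged sketch of Steps 1-4 for head movements: read the head joint from the
// skeleton stream and fire an action once it moves past the guideline thresholds.
using System.Linq;
using Microsoft.Kinect;

public partial class AssistiveController
{
    private SkeletonPoint? restingHead;   // baseline head position of the seated student

    public void StartHeadTracking()
    {
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
    }

    private void OnSkeletonFrameReady(object s, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;

            var skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);
            var student = skeletons.FirstOrDefault(
                k => k.TrackingState == SkeletonTrackingState.Tracked);
            if (student == null) return;

            SkeletonPoint head = student.Joints[JointType.Head].Position;  // metres
            if (restingHead == null) { restingHead = head; return; }       // Step 1 baseline

            // Steps 2-3: displacement of the head from its resting position, in cm.
            double dxCm = (head.X - restingHead.Value.X) * 100.0;  // sideways
            double dzCm = (head.Z - restingHead.Value.Z) * 100.0;  // toward/away from sensor

            // Step 4: map the displacement to a direction. The sign-to-direction
            // mapping is an assumption and should be calibrated for the real setup.
            string direction = null;
            if (dxCm >= InteractionGuidelines.MinHeadShiftLeftRightCm) direction = "right";
            else if (dxCm <= -InteractionGuidelines.MinHeadShiftLeftRightCm) direction = "left";
            else if (dzCm <= -InteractionGuidelines.MinHeadShiftForwardBackwardCm) direction = "forward";
            else if (dzCm >= InteractionGuidelines.MinHeadShiftForwardBackwardCm) direction = "backward";

            if (direction != null)
            {
                ExecuteAction(direction, mode);   // same dispatcher as for voice commands
                restingHead = head;               // re-baseline after each recognized gesture
            }
        }
    }
}
```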

    4. System Development

To produce a real-time computer system, a software application was developed using the C# language on the Microsoft .NET platform. The proposed application includes a number of graphical screens. A sample screen of the graphical interface of the proposed assistive application is shown in Fig. 5 (a small action-dispatch sketch follows the figure).

      Fig. 5. Sample screen of proposed system
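The action-dispatch step used by both handlers above can be sketched with the standard .NET SendKeys facility, which injects keystrokes into the focused window. The specific key mappings are illustrative assumptions; the paper does not document its actual action table.

```csharp
// Hedged sketch of turning a recognized command or gesture into an action on
// Windows or on a playing media file using the standard .NET SendKeys facility.
// The key mappings are illustrative assumptions, not the paper's action table.
using System.Windows.Forms;

public partial class AssistiveController
{
    public void ExecuteAction(string action, WorkMode activeMode)
    {
        if (activeMode == WorkMode.Media)
        {
            // Media Mode: drive whichever media player currently has keyboard focus.
            switch (action)
            {
                case "play":
                case "pause": SendKeys.SendWait(" ");       break;  // toggle playback
                case "right": SendKeys.SendWait("{RIGHT}"); break;  // seek forward
                case "left":  SendKeys.SendWait("{LEFT}");  break;  // seek backward
            }
        }
        else
        {
            // Default Mode: basic control of the operating system / active window.
            switch (action)
            {
                case "open":  SendKeys.SendWait("{ENTER}"); break;  // activate the focused item
                case "close": SendKeys.SendWait("%{F4}");   break;  // Alt+F4 closes the window
                case "right": SendKeys.SendWait("{TAB}");   break;  // move focus forward
                case "left":  SendKeys.SendWait("+{TAB}");  break;  // move focus backward
            }
        }
    }
}
```

Simulated keystrokes keep such a dispatcher independent of any particular media player or application, at the cost of relying on whichever window currently has focus.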


    5. System Testing & Evaluation

To gauge the performance of the proposed system, it has been exposed to a series of comprehensive tests. These tests were performed by two groups of testers. Group-1 includes a number of experts with varied backgrounds in computer science. Group-2 includes a number of typical end users. The measurement factor used in these experiments is the Acceptance Factor (A-factor), which is divided into five levels according to the percentage ranges specified in Table 1.

    TABLE I. PERCENTAGE RANGE OF ACCEPTANCE FACTOR

Percentage [%]      Acceptance Factor
50-64               V.Low
65-74               Low
75-84               Moderate
85-94               High
95-100              V.High

A sample of these tests on video files is shown in Fig. 6, where the video frames are moved forward and backward based on the movement of the student's head to the left and to the right.

Fig. 6. Sample screenshots of proposed system tests on video files

All outcomes of the proposed system evaluation process have been recorded, organized and summarized in Table 2.

TABLE II. OUTCOMES OF PROPOSED SYSTEM EVALUATION

Evaluator Number     Acceptance Factor (Expert)     Acceptance Factor (End User)
1                    High                           V.High
2                    V.High                         V.High
3                    High                           V.High
N                    V.High                         High
Average              V.High (96.45%)                V.High (98.71%)
Overall Average      V.High (97.58%)

The above results indicate that the proposed system has been accepted to varying degrees, where the acceptance of the experts is greater than that of the end users. This gap between the experts' acceptance and the end users' acceptance can be explained by the fact that the experts have dedicated testing tools that give them a complete view of the efficiency of the proposed system, unlike the end users, who do not have these tools and therefore tend to focus on some aspects of the proposed system while ignoring others during the evaluation process.

5. CONCLUSION

Recently, new technologies have played an important role in supporting students' rights in various sectors of society, particularly in education. This paper utilizes Kinect sensor technology to assist students with disabilities in E-Learning systems. The proposed assistive system is directed only at students with hand impairments, enabling them to use a computer easily through voice instructions or head movements. The performance of the proposed system has been evaluated by a testing team that included a number of experts and end users. The overall evaluation results indicated that the proposed system achieved a very high Acceptance Factor.

REFERENCES

  1. F.Silman, H.Yaratan and T.Karanfiller, Use of assistive technology for teaching-learning and administrative processes for the visually impaired people, Eurasia Journal of Mathematics, Science & Technology Education, vol.13, no.8, pp.4805-4813, 2017.

  2. S.Vaughn and S.Linan-Thompson, What is special about special education for students with learning disabilities?, The Journal of Special Education, vol.37, no.3, pp.140-147, 2003.

  3. M.D.Chester, Access to learning: Assistive technology and accessible instructional materials, Massachusetts Department of Elementary and Secondary Education: Malden, MA, USA, 2012.

4. P.Boucher, Assistive technologies for people with disabilities, In-depth Analysis, Science and Technology Options Assessment, Scientific Foresight Unit (STOA), European Parliamentary Research Service (EPRS), European Parliament, 2018. Available at: http://www.europarl.europa.eu/stoa/en/document/EPRS_IDA(2018)603218

5. M.J.Scherer, Assistive technology, CRC Press, Rehabilitation Science in Practice Series, 2016. Available at: https://www.researchgate.net/publication/302877224_Assistive_Technology

  6. Computer based assistive technology, Available at: http://eworkshop.on.ca/edu/pdf/Mod28_assistive_technology.pdf

7. N.J.Stumbo, J.K.Martin and B.N.Hedrick, Assistive technology: Impact on education, employment, and independence of individuals with physical disabilities, Journal of Vocational Rehabilitation, vol.30, no.2, pp.99-110, 2009.

  8. J.Balleste and C.Pheatt, Using the XBOX Kinect sensor for positional data acquisition, American journal of Physics, vol.81, no.1, pp.71-77, 2013.

  9. I.P.T.Weerasinghe, J.Y.Ruwanpura, J.E.Boyd and A.F.Habib, Application of Microsoft Kinect sensor for tracking construction workers, Construction Research Congress, pp.858-867, 2012.

  10. S. Zennaro, Evaluation of Microsoft Kinect 360 and Microsoft Kinect One for robotics and computer vision applications, University of Padua, 2014.

11. B.Suelze et al., Waving at the heart: Implementation of a Kinect-based real-time interactive control system for viewing cineangiogram loops during cardiac catheterization procedures, In Computing in Cardiology 2013, IEEE, pp.229-232, 2013.

  12. G.Baron, P.Czekalski, M.Golenia, and K.Tokarz, Gesture and voice driven mobile tribot robot using Kinect sensor, In 2013 International Symposium on Electrodynamic and Mechatronic Systems (SELM). IEEE, pp.33-34, 2013.

13. A.Sinha, K.Chakravarty and B.Bhowmick, Person identification using skeleton information from Kinect, In Proc. Intl. Conf. on Advances in Computer-Human Interactions, pp.101-108, 2013.

  14. M.S.Muneshwara, M.S.Swetha and G.N.Anil, Providing the Natural User Interface (NUI) through Kinect sensor in Cloud Computing environment, International Journal for Innovative Research in Science & Technology, vol.1, no.7, pp.161-167, 2014.

  15. H.M.J.Hsu, The potential of Kinect as interactive educational technology, 2nd international conference on education and management technology, Singapore, vol.13, pp.334-338, 2011.

  16. H.M.J.Hsu, The potential of Kinect in education, International Journal of Information and Education Technology, vol.1, no.5, pp.365- 370, 2011.
