A Practical Computational Approach to Rate Skill Levels of School Children Using Log File Analysis

DOI : 10.17577/IJERTV2IS90011


Norizan Mohamad1, Azlin Dahlan2, Mohd Talmizie Amron3, Zairi Ismael Rizman4 1,2,3Faculty of Computer and Mathematical Sciences

4Faculty of Electrical Engineering

1,3,4Universiti Teknologi MARA (UiTM) Terengganu, Malaysia

2Universiti Teknologi MARA (UiTM) Melaka, Malaysia

Abstract

Children in schools should possess sufficient Information and Communication Technology (ICT) literacy skills so that they can confidently use computers as a learning tool. A lack of ICT literacy skills would hinder children from getting maximum benefits despite massive spending by the government in the educational sector. This paper has two objectives. First, it reviews the literature on the common practices of assessing students' ICT literacy level. Second, it attempts to highlight the potential applicability of log file analysis as a tool to assess ICT literacy skill. In this pilot study involving nine school children, the researchers investigated whether children who own electronic books (e-books) have adequate ICT literacy skill. The log file analysis allows the classification of their expertise into beginner, intermediate or expert based on the response time for completing the given tasks. This paper reports a promising approach using log file analysis. It is hoped that more research will be undertaken to exploit the benefits of log file analysis as a tool to assess ICT literacy skills among Malaysian citizens.

  1. Introduction

    The adoption of Information and Communication Technology (ICT) accelerates the development and consumption of information and knowledge, and thus acts as a key driver for the future growth of the country [13]. On the basis of the general concept of ICT literacy, the authors of this paper adopt the definition by [27], who define ICT literacy as the ability to use computers at an adequate level for creation, communication and collaboration in a literate society. Skill, on the other hand, is defined by the Oxford dictionary as "to know how to do something; capability of accomplishing something with precision and certainty; an ability to perform a function acquired or learnt with practice". Often, in the literature, the terms skill and expertise are used interchangeably [7].

    In this respect, the Malaysian Government, through its Vision 2020 plan, has made great efforts towards nurturing an ICT literate society. The Malaysian Government believes that the next wave of economic growth will come from the knowledge-based economy, with digital technologies as a key driver of progress. In this context, considerable and accelerated efforts are expected to be continuously made by the government, particularly in the field of education. The main aim is to develop the potential and skills of school children towards becoming ICT-competent citizens in the digital economy [10]. In recent years, the focus has shifted from providing the ICT infrastructure to finding out whether teachers and students are using ICT effectively, and to assessing the ICT literacy skill of students [17]. In addition, ICT literacy skill is among the skills embedded in the Malaysian curriculum in order to meet the changing needs of Malaysian society [11]. At the state level, Terengganu has been spending generously on its e-book program.

    Just like most new initiatives, the e-book program encountered its own challenges in technical and human aspects. One of the challenges lies in the technical definition of the e-book itself. There are many hybrid definitions of e-book found in the literature. The different and imprecise terms used for e-books have led to some confusion in both the education and information technology communities. As a result, a consensus on the term is needed to describe the e-book used in the Malaysian Electronic Book Program so that it is not mistaken for other versions of e-book used elsewhere [23]. The present authors believe that the different definitions are shaped by the distinction that exists between the content and the delivery system in the context of e-books. Initially, e-book referred to the digital media equivalent of a conventional printed book. The definition has since been extended by researchers to fit their own expectations. In [22], the researcher gives a more comprehensive definition of an e-book that spans: text in digital form, a book converted into digital form, digital reading material, a book in a computer file format, or an electronic file displayed on a desktop, notebook computer or portable device, or formatted for display on dedicated e-book readers. On the other hand, [24] provide a more precise terminology:

     • The digital content which users read (i.e. the paperless version of a book, article, magazine, etc.) is called a publication.

    • The physical device used to read a publication is called a reading device (e.g. dedicated readers, personal computers or personal digital assistants).

    • The combination of software and hardware that processes content and presents it to users is called a reading system.

       Clearly, the definition in [22] combines the aspects of software and hardware, which fit into the reading system term. Figure 1 and Figure 2 show the e-book as currently adopted by Terengganu school children in Year 6. In this context, the e-book is used by the school children just like they would use notebooks, laptops or personal computers. In [5], the researchers give a clearer definition by referring to the e-book in Terengganu as a mini laptop installed with academic applications, including digital textbooks, a dictionary, the digital Quran and daily prayers. The present authors agree that, in the context of the Terengganu e-book, this definition fits better. Hence, e-book in this paper refers to the latter.

      Figure 1. The e-book distributed to Year 6 school children

      Figure 2. A view of an application in the e-book

       Another challenge is in the human aspect of the school children. As the e-book project was only introduced recently, many inquiries have arisen among the public on the effectiveness of the e-book for school children [1]. In the literature, e-books have been shown to benefit students physically, academically and psychologically [18, 23]. Students have shown familiarity with their e-book and the basic applications available on it. They knew how to open, shut down and use 'save as', and they were also good at playing computer games [10].

       However, research on the progress and results of the e-book project has been scarce and limited. There is still a need to further investigate the usage of e-books among primary school children, particularly in Malaysia [23]. For example, when asked about their preference between e-books and textbooks, only 43% of school children preferred the e-book. This shows that the students are still in the early stages of getting used to e-books for learning, which was partly attributed to their ICT literacy. Thus, it is strongly felt that an investigation should be conducted on the implications of e-books for primary schools [18].

       In addition, it is also a challenge to obtain students' feedback, reviews and responses to ensure greater student empowerment and the acquisition of specific independent learning skills [2, 16]. The assessment of student knowledge related to specific ICT competencies is important because it can provide important information about the strengths and weaknesses of these students' literacy skills [26]. At the same time, students' performance acts as an indicator to the government that it is moving in the right direction [17]. The next section discusses the commonly employed methods for assessing ICT literacy skill.

  2. Related Works

    Assessing and classifying user skill levels is important for several reasons. In the area of human-computer interaction, the process of getting to know the users is never ending, because there is so much to know and because the users keep changing. Users with poor reading skills, limited education and low motivation require special attention. A generic separation into novice or first-time, knowledgeable intermittent and expert frequent users might lead to different design goals [25]. Moreover, each user has different levels of skill across different applications and even in different portions of the same application. Additionally, users' skill levels change dynamically as they gain more experience with a user interface. In order to adapt user interfaces to the needs of user groups with different levels of skill, automatic methods of skill detection are required [7].

      1. ICT Literacy Assessment Methods

        Traditionally, ICT skills are assessed by human examiners marking the paper output of set exercises. Commonly used evaluation methods such as think-aloud protocols, questionnaires and observations are not able to accurately portray the competency acquired by the children [19]. Another mechanism which has been used extensively in the literature to assess ICT knowledge and skills among students is self-assessment. However, its accuracy in providing information on knowledge and competence is much more problematic and unreliable [3, 7, 15]. Some students may underestimate their level of ICT literacy while others may overestimate it; either way, they are not able to report their ICT literacy accurately. Self-reports can also lead to biased data when the questions relate to patterns of usage, such as when students are asked how many times they completed a certain task. Therefore, collecting data directly and automatically on the computer may be preferable to asking users to provide answers from memory. In addition, tasks like timing user task completion with a stopwatch, writing notes describing user interactions with software systems and coding notes from observations are laborious, expensive, time consuming and often error-prone [12]. As [14] states, "The best way to find out if someone can dance is not to ask them questions about it, but to ask them to dance." Clearly, any designed assessment should be appropriate for its purpose [4].

      2. Log File Analysis

    Hence, the availability of special-purpose software that observes and records what users do simplifies the process and reduces error. The use of such automated software for data collection has many advantages. The software collects data and stores the details of user actions in a log file. A log file is a plain text file that indicates what activities were performed, when they happened and other details, such as the specific user identification, to assist with data interpretation [12]. Log file analysis has been used to analyze usage patterns and infer expertise [9]. For example, log files have been used to determine patterns in the Internet usage behavior of female users against their academic and social activities [20]. To explore the potential benefit of log file analysis, the present authors have developed an automated tool called RateSkill to evaluate the current ICT literacy skill of school children who own the e-books [19]. This work is motivated by [7]. However, unlike that work, which pre-identified its participants as novice or skilled from the beginning, the present authors classified the participants' skill at the end.

  3. Material and Methods

      1. Participants

        The participants in this study were chosen from several primary schools in the vicinity. All of the participants (6 girls and 3 boys) were e-book users.

      2. Data Collection

        Using two publicly available programs, i.e. CamStudio and PCAgent, the authors collected data via observation and generated log files. CamStudio was used to record all screen and audio activities on the e-book. During testing, the authors prepared the CamStudio application and a web camera for the purpose of capturing and observing the participants' task events. The result is stored in an AVI video file.

        Another software tool, PCAgent, monitors and records the activities of all users on a computer and stores the result in a text file. PCAgent logs each event with its related data, for example the type of event, the location of the mouse cursor, the timestamp at which the event happened and other details. Events such as mouse moves, menu selections and button presses are all recorded in the log file. The timestamp allows the system to calculate the time taken for each user to complete a task, thus offering the benefit of classifying users' skill in the shortest possible time [7]. A sample of the text file generated automatically by PCAgent is shown in Figure 3.

        Figure 3. A sample of text file
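
        Since the full PCAgent output is not reproduced in this paper, the following Python sketch only illustrates the kind of parsing such a timestamped event log permits. The assumed line layout (timestamp, event type and detail separated by '|') and the file name pcagent_log.txt are hypothetical and not the actual PCAgent format.

        from datetime import datetime

        def parse_log_line(line):
            # Parse one line of the ASSUMED form
            # "2013-05-14 09:12:03 | MOUSE_CLICK | Al-Quran Digital shortcut"
            # into a dictionary; return None for malformed or irrelevant lines.
            parts = [p.strip() for p in line.split("|")]
            if len(parts) < 3:
                return None
            return {
                "time": datetime.strptime(parts[0], "%Y-%m-%d %H:%M:%S"),
                "event": parts[1],
                "detail": parts[2],
            }

        # Read the whole log and keep only the lines that parsed successfully.
        with open("pcagent_log.txt", encoding="utf-8") as log:
            events = [rec for rec in (parse_log_line(line) for line in log) if rec]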

      3. Tasks

        In this study, the authors identified a suitable set of tasks, to be performed by all participants, to test the students' ICT skills. All participants were given a document containing the same instructions on how to perform the tasks. For details, readers are referred to [19].

      4. Log File Data Analysis

    Data analysis is based on the log file generated during the task activity capture. The time required to accomplish a task is the most widely used indicator of skill level. Since log files tend to be voluminous, the RateSkill application is equipped with the capability of omitting irrelevant segments of the log file and pointing to the appropriate starting line. The authors employ a string comparison algorithm [8] to parse and extract relevant data from the log file. For every participant, the authors obtained the indicator based on the total amount of time (in seconds) spent to complete each task.
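
    To make the timing step concrete, the sketch below computes the elapsed time of a single task by simple substring matching over parsed log events, in the spirit of the string comparison step described above. The start and end markers, as well as the sample events, are hypothetical illustrations and are not the keywords actually used by RateSkill.

    from datetime import datetime

    def task_duration_seconds(events, start_keyword, end_keyword):
        # Elapsed seconds between the first event containing start_keyword
        # and the first event containing end_keyword; None if a marker is missing.
        start = next((e["time"] for e in events if start_keyword in e["detail"]), None)
        end = next((e["time"] for e in events if end_keyword in e["detail"]), None)
        if start is None or end is None:
            return None
        return (end - start).total_seconds()

    # Tiny illustrative log fragment: opening and closing one task application.
    events = [
        {"time": datetime(2013, 5, 14, 9, 12, 3), "event": "MOUSE_CLICK",
         "detail": "Al-Quran Digital shortcut"},
        {"time": datetime(2013, 5, 14, 9, 12, 44), "event": "WINDOW_CLOSE",
         "detail": "Al-Quran Digital window closed"},
    ]
    print(task_duration_seconds(events, "Al-Quran Digital shortcut",
                                "Al-Quran Digital window closed"))  # 41.0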

  4. Results and Discussion

    This section highlights some snapshots from the RateSkill system. To use this system, the user should first select the log file. Once the file has been selected, it will be displayed in the FileViewer window as shown in Figure 4.

    Figure 4. Displaying the log file

    Next, in the analysis phase, the RateSkill system reports the time spent on each task, as shown in Figure 5. At the end, it classifies the user into the corresponding proficiency level: beginner, intermediate or expert. In this study, out of nine participants, only one participant was categorized as an expert, six as intermediate and two as beginners.

Figure 5. Result of the analysis
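
A minimal sketch of such a threshold-based classification is given below. The cut-off values are assumptions made for illustration only; they are not the thresholds implemented in RateSkill.

def classify(total_seconds, expert_max=400, intermediate_max=700):
    # Map the total completion time (in seconds) to a proficiency label.
    # The two cut-offs are HYPOTHETICAL values chosen for this example.
    if total_seconds <= expert_max:
        return "expert"
    if total_seconds <= intermediate_max:
        return "intermediate"
    return "beginner"

print(classify(573))  # -> 'intermediate' under these assumed cut-offs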

The response time is useful when answering questions such as "How long do users take when trying to open a Paint application?" Capturing the time-related data can be done automatically at the level of the operating environment. Although hand-held stopwatches can perform this job well, software that measures and records elapsed times between starting events and task completion is usually more reliable and easier to work with [12].

Qualitatively, the authors also performed observations from the AVI file captured earlier for every participant. The result of the observation can assist the authors in understanding the participants' attitude, state of mind and the frustration they experienced. The captured data were then transcribed into textual descriptions for ease of reference. A transcription for one of the participants is shown in Table 1.

Nor Zakirah did task 1 very well, in 41 seconds. She quickly found where the Al-Quran Digital shortcut is. Nor Zakirah did not manage to do task 2 because she had never used that application, Desktop Gadget Gallery. She could not find where the application is. She just stared at the desktop but failed to find the icon. She did not even try to search by selecting All Programs from the menu. For task 3, Nor Zakirah did it in 47 seconds. From our observation, she did one incorrect step in this task: she clicked the search icon after typing 25 to go to page 25, when she should have pressed Enter. Nor Zakirah correctly did task 4, which is Ikhwan Fardhu Ain, in 52 seconds. For task 5, which is Fasohah Jawi, she did it in 27 seconds. In task 6, Nor Zakirah did it in 70 seconds. She searched Google by typing only 'g' in the URL field and clicking the Google URL. For the MyKamus 1Terengganu application in task 7, Nor Zakirah finished in 52 seconds. Nor Zakirah did task 8 in 61 seconds. She had difficulty finding which shape is the circle, and she finally decided to take the oval shape and colored it successfully two times. For task 9, which is Windows Media Player, Nor Zakirah had a problem finding where that application is. She made the right decision to look for it using search, but she could not find it because she typed 'windows' as 'windos'. Then, she found the application under All Programs. She completed this task successfully in 79 seconds. For task 10, Notepad, Nor Zakirah did it in 114 seconds. Similar to task 9, she found the application through search. In this task, she noticed that she had typed the name wrongly: instead of typing 'notepad', she typed 'notes' and then 'noteped'. She also had difficulty saving the text file.

From our observation, the researchers believe that this participant had never opened or used Windows Media Player and Notepad before. Overall, Nor Zakirah took 573 seconds to finish all the tasks, which is equivalent to 9 minutes and 33 seconds.


Table 1. Sample transcription result of a participant

  5. Summary and Concluding Remarks

    ICT literacy is a prerequisite for a developed society, and society must attain a certain level of ICT literacy in order to face the challenges of the Information Age [5]. Being able to identify students' ICT literacy skill at the primary school level would potentially reduce the ICT literacy skills gap among students. A lack of ICT competence would hinder school children from functioning efficiently and successfully in an information society, as envisioned in Malaysia's Vision 2020.

    In this paper, the authors have shown the benefit of using log file analysis for automated ICT literacy assessment of school children. The result of the analysis enables the classification of school children into three categories: beginner, intermediate or expert. Practically, it can shed light on identifying the students' strengths and weaknesses. As a result, the necessary technical provision can be arranged to help the less competent children.

    However, effective use of log file analysis tools requires addressing several important challenges. As with any method of data collection, automated methods work best if their use is carefully considered in the context of the specific situation and scope being studied [12]. Currently, in this study, the tasks must be carried out in a fixed order to ease the computational analysis of the log file; the order is important because the keywords used in the implementation correspond directly to the tasks. In future, the authors plan to automate the analysis for tasks performed in random order and to employ a larger sample. The authors also wish to carry out a more thorough, systematic analysis using a relevant combination of qualitative and quantitative methods in order to investigate the full effect of the tool on certain pedagogical issues.

  6. References

  1. W.M.A.W. Ahmad, N.A. Halim, N.A. Aleng, N. Mohamed, W.A.A.W.M. Amin, and N.A. Amiruddin, Quantitative Analysis on the Level of Acceptance, Usage and Problems of E-Books Among School Teachers in Terengganu, The International Journal of Social Sciences, 7(1), pp. 89-101, 2013.

  2. Bahagian Teknologi Pendidikan Kementerian Pelajaran Malaysia, Smart School Programme: Rural Smart School Model and SchoolNet, Malaysia National Workshop on Rural ICT Development, 2010, Available: http://www.itu.int/ITU-D/asp/CMS/Events/2010/ITU-ADB/Malaysia/S3-Mdm_Hawa_Hj_Said.pdf.

  3. J.A. Ballantine, P.M. Larres, and P. Oyelere, Computer Usage and the Validity of Self-Assessed Computer Competence Among First-Year Business Students, Elsevier Computers & Education, 49(4), pp. 976-990, 2007.

  4. R.D. Dowsing and S. Long, Trust and the Automated Assessment of IT Skills, ACM Conference on Computer Personnel Research, pp. 90-95, 2002.

  5. N.N. Edzan, Information Literacy Development in Malaysia: A Review, Libri: International Journal of Libraries and Information Services, 58(4), pp. 265-280, 2008.

  6. S. Fahmy, N. Haslinda, W. Roslina, and Z. Fariha, Evaluating the Quality of Software in E-Book Using the ISO 9126 Model, International Journal of Control and Automation, 5(2), pp. 115-122, 2012.

  7. A. Ghazarian and S.M. Noorhosseini, Automatic Detection of Users' Skill Levels Using High-Frequency User Interface Events, Springer User Modeling and User-Adapted Interaction, 20(2), pp. 109-146, 2010.

  8. D. Gusfield, Algorithms on Strings, Trees and Sequences: Computer Science and Computational Biology, Cambridge University Press, New York, USA, 1997.

  9. M. Guzdial, P. Santos, A. Badre, S. Hudson, and M. Gray, Analyzing and Visualizing Log Files: A Computational Science of Usability, Technical Report of Human-Computer Interaction Consortium Workshop, pp. 1-10, 1994.

  10. C.L. Hoon, S.H. Loke, L.S. Eng, M.F. Dzainuddin, and Z. Noor, Bridging the Digital Divide: Success and Challenges in the e-Book Project in Terengganu, Malaysia, Joint International Conference of the Australian Association for Research in Education and the Asia Pacific Educational Research Association, 2012.

  11. S.A. Ismail, D. Dorner, and G. Oliver, Issues Related to Information Literacy Education in Malaysian Schools, IACSIT International Conference on Sociality and Economics Development, 10, pp. 204-208, 2011.

  12. J. Lazar, J.H. Feng, and H. Hochheiser, Research Methods in Human-Computer Interaction, Wiley, 2010.

  13. A. Luqman, G.K.S. Nair, T. Vadeveloo, and R. Theethappan, Examining the Information Technology Experience of Government Staff in using E-Government Services, European Journal of Scientific Research, 69(2), pp. 243-249, 2012.

  14. R.F. Mager, Making Instructions Work, Kogan Page, London, UK, 1990.

  15. K. Merritt, K.D. Smith, and J.C.D. Renzo, An Investigation of Self-Reported Computer Literacy: Is It Reliable?, IACIS Issues in Information System, VI(1), pp. 289-295, 2005.

  16. Multimedia Development Corporation Sdn. Bhd., The Smart School Roadmap 2005-2020: An Educational Odyssey, 2005, Available: http://www.msc.com.my/smartschool/downloads/roadmap.pdf.

  17. K. Ng, Making Malaysia's Schools Smarter, futureGOV, 2010, Available: http://www.futuregov.asia/articles/2010/apr/30/making-malaysia-schools-smarter/.

  18. A.M. Noor, A.M. Embong, and M.R.T.L. Abdullah, E-Books in Malaysian Primary Schools: The Terengganu Chapter, World Academy of Science, Engineering and Technology, 66, pp. 298-301, 2012.

  19. N. Mohamad, A. Dahlan, M.T. Amron, Z.I. Rizman, and N.H.R. Husin, Automated ICT Literacy Skill Assessment Using RateSkill System, International Journal of Science and Research, 2(8), pp. 190-195, 2013.

  20. R.J. Oskouei, Behavior Mining of Female Students by Analyzing Log Files, IEEE 5th International Conference on Digital Information Management, pp. 5-10, 2010.

  21. T. Priban, J. Hodinar, and V. Vrbik, A New Approach to Measuring of Computer Literacy at the UVB, Journal on Efficiency and Responsibility in Education and Science, 4(2), pp. 97-104, 2011.

  22. S.S. Rao, Electronic Books: A Review and Evaluation, Emerald Library Hi Tech, 21(1), pp. 85-93, 2003.

  23. W. Roslina, S. Fahmy, A. Yaacob, N. Haslinda, and Z. Fariha, Research Directions for E-Book: A Malaysian Perspective, International Journal of Information Technology & Computer Science, 5(2), pp. 8-13, 2012.

  24. N. Shiratuddin, M. Landoni, F. Gibb, and S. Hassan, E-Book Technology and Its Potential Applications in Distance Education, Journal of Digital Information, 3(4), 2003.

  25. B. Shneiderman and C. Plaisant, Designing the User Interface: Strategies for Effective Human-Computer Interaction, 5th Edition, Addison-Wesley, Boston, MA, 2010.

  26. G.F. Shuster and M. Pearl, Computer Competency: A 7-Year Study to Identify Gaps in Student Computer Skills, Canadian Center of Science and Education International Education Studies, 4(4), pp. 137-148, 2011.

  27. J.-B. Son, T. Robb, and I. Charismiadji, Computer Literacy and Competency: A Survey of Indonesian Teachers of English as a Foreign Language, Computer-Assisted Language Learning Electronic Journal, 12(1), pp. 26-42, 2011.
