
A TECHNIQUE TO INCREASE THE USABILITY OF E-LEARNING WEBSITES

Q. Ain, M. Aslam*, S. Muhammad**, S. Awan**, M. T. Pervez**, N. Naveed**, A. Basit** and S. Qadri***

Department of Computer Science, School Education Department, Govt. of Punjab
*Department of Computer Science and Engineering, University of Engineering & Technology, Lahore, Pakistan.
**Department of Computer Science, Virtual University of Pakistan, Lahore, Pakistan.
***Department of Computer Science, The Islamia University of Bahawalpur
Corresponding Author: [email protected]

ABSTRACT: Distance learning is a long-established idea for imparting education to learners who are unable to join regular educational institutes. The revolution in information technology has transformed this mode of learning into e-learning, the mode of education in which learners in remote areas are educated through the use of technology. Many institutes have developed dedicated websites and learning management systems, such as open courseware, to introduce the e-learning method. These learning management systems are very efficient and offer vast learning material and communication channels, but without assistive support for physically disabled users. The lack of assistive technology support in these websites stimulates negative user experiences. In this research work, an e-learning usability evaluation model based on the user's mental model has been developed. The objective of this research work was to reduce the gap between the user's mental model and the designer's perception; the major problem in improving usability was the communication gap between user and designer.

Keywords: Usability, Accessibility, User Experience, Cognitive Burden, Learning Adaptive, Usability Evaluation Model.

(Received 19-02-2015; Accepted 02-06-2016)

INTRODUCTION

Just making things easy to visualize may not be enough to address usability. Usability is a well-defined measure of a product's level of adoption by the user, in terms of contentment, competence and worth. User experience, cognition and learning ability are important factors for website surfing (ISO 9241-11, 1998). The usability of a website is measured by its user-adaptive level, screen-reading level, satisfaction and learnability. These factors affect the accessibility of a website's design, user adaptation and the display of content with ease of use and navigation. The Massachusetts Institute of Technology (MIT) launched its OpenCourseWare in 2002, for the first time in history, which introduced the concept of e-learning. Many irrelevant images or textures on a website lead to cognitive burden, which disturbs users; cognitive burden is a combination of intrinsic, germane and extraneous burden (Hasan, 2012). The purpose of developing an e-learning usability evaluation model is to reduce the gap between the mental model of the user and the designer's perception. The user's interaction with the system forms the user's mental model. One major drawback is the gap between the mental model and the designer's perception, which leads to lower usability (Norman, 2004). The concept of enhancing the quality of teaching and learning via technology, using virtual learning environments (VLEs) in medical education, was given by Asarbakhsh and Sandar (2013). They reported that 85% of problems can be identified by questionnaire/survey and testing techniques, which helps to improve an e-learning system.
The think-aloud technique was applied to let the learner observe the system and give feedback. They chose attributes such as navigation, learning ability, visual design and consistency, which have been used to improve e-learning systems and websites. It is important to achieve a sufficient level of competence in an effective and efficient manner. User experience is a major challenge, which leads to cognitive burden (Jayakumar and Mukhopadhyay, 2013). In a study, Buchner et al. (2012) demonstrated the usefulness of robots through an experimental methodology while treating user experience as an important issue. User experience increases gradually and changes periodically.

It is not static at all. They introduced their concept by presenting two kinds of robots, i.e. one working cell and one robotic arm; the robotic arm is feasible and free to interact with. UX was measured on the following scales: perceived usability, general user experience, emotion and stress. Accessibility issues relating to disability affect users in an educational context. Assistive technology has a significant influence on physically disabled persons using websites, so developers must have knowledge of the distinguishing features of assistive technology in order to develop websites that work appropriately (Freire et al., 2013). Standards for evaluating the usability of online learning management systems, combined with ease of navigation structure, were introduced by Hasan (2012).

They used two online automated tools, HTML Toolbox and Web Page Analyzer, along with a questionnaire directed towards users of Jordanian university websites. Each category deals with one usability aspect. A weighted measurement evaluation of user accessibility while using e-learning systems has been developed. They proposed a solution using a semi-automatic tool and tested two online learning management systems, VULMS and eFront, as case studies by conducting an online survey to measure their usability.
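Such a weighted measure can be written down compactly. The following is only an illustrative sketch, not the actual scheme of Aslam et al. (2011): the criteria, the weights and the 1-5 rating scale are hypothetical.

# Illustrative weighted usability score (hypothetical criteria, weights and
# Likert ratings; not the exact weighting scheme of Aslam et al., 2011).
def weighted_usability_score(ratings, weights):
    """Weighted mean of 1-5 ratings, normalised to a 0-100 scale."""
    total_weight = sum(weights[c] for c in ratings)
    weighted_sum = sum(ratings[c] * weights[c] for c in ratings)
    return weighted_sum / (5 * total_weight) * 100

# Hypothetical averaged survey ratings for one learning management system.
weights = {"navigation": 0.3, "learning_material": 0.4, "error_handling": 0.3}
ratings = {"navigation": 4.2, "learning_material": 3.8, "error_handling": 3.1}
print(round(weighted_usability_score(ratings, weights), 1))  # 74.2

Normalising by the maximum attainable weighted rating keeps scores comparable across systems evaluated with slightly different criteria sets.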

The satisfaction of the user depends upon the level of usability provided by the e-learning system (Aslam et al., 2011). A model for the adaptive learnability, usability and user effectiveness of educational content management systems was developed by Joo et al. (2011). Their research work aimed to develop usability evaluation models and a survey tool for measuring the learnability, usability and efficiency of academic library websites. A model of the factors that affect the efficiency and performance of young users while performing search operations on the web was presented by Dinet and Kitajima (2011). They identified six mental model styles in their method: technical view, connection view, functional view, process view, technical-functional view and functional-connection view. Browsing sites and defining search processes involve three main steps: state the problem, choose the best solution and evaluate the answer. A task analysis model for groupware, intended to help perform tasks more precisely, was developed by Pinelle and Gutwin (2008). The major components of the task model were scenarios, tasks, individual and collaborative subtasks, and actions. The launch of a new technology in the market comes with many risks in terms of time, failure and effort (Norman, 2004). A new feature in a product enables the user to adopt that product. The designer's objective is to meet the user's satisfaction level. User experience (UX) leads to usability.

The usability of a website depends upon the satisfaction level of the user (Hassenzahl and Tractinsky, 2011). It is observed in many scenarios that there is no communication between the user and the designer. The following section describes the developed model; the results and discussion for the developed model are then documented, followed by the conclusion with future research directions and the references.

MATERIALS AND METHODS

The proposed web-based learning usability evaluation model (WBLUEM) is shown in Fig. 1. The objective behind the research was to reduce the gap between the user's mental model and the designer's perception.

The factors discussed were the designer's perception, the conceptual model based on the user's perception, the user's mental model and the learning system evaluation parameters. The conceptual model expressed the basic functionality and the vital standards of the systems and websites, including internal and external issues. It successfully implemented the fundamental objectives of the developed system. The mental model explicated users' feelings about how these websites work in the real world. It was a psychological representation of the imaginary, the hypothetical and reality. What the user believed about the system was the mental model; the user's mental model was based on belief, not facts. The user interface should communicate the basic nature of the system well, and this was a prime goal of the designers. Understanding the concept of mental models can help to make sense of usability problems in design. When users have erroneous mental models, they make mistakes on websites. Designers develop websites according to their own mental level, vast knowledge and experience, and have little idea about the categories of different users. A designer cannot understand the accessibility and usability issues of naive users. This bad impression leads to a negative user experience of the website. Cognitive science helps to explore mental models. The designer's perception is basically a guide to the user experience, i.e. how comfortable the user is. Every user has their own thinking, perception and actions, which results in different usability needs. This variation in usability requirements leads to a gap between the mental model of the user and the designer's perception. Due to the lack of communication between designers and users, designers develop websites without understanding the usability needs of the users.

Users who interact with the system have different mental models and abilities (Al-Khalifa et al., 2014). The WBLUEM model is developed to reduce this communication gap.

This may only be possible by conducting surveys and interviews and getting feedback. Based on user perception, the designer changes the interface and performs some iterations and validations. The learning system evaluation parameters given in Table 1 are used to conduct these kinds of surveys. The five parameters usability, accessibility, cognitive burden, learnability and user experience are used for the evaluation of a website. A lack of communication between user and designer causes the issues related to usability and accessibility. This leads to cognitive burden, which influences behaviour towards the user experience. All these factors directly or indirectly affect the learning adaptiveness of the users. This model has been practically implemented using an online questionnaire. User experience is a major challenge, which leads to cognitive burden.

Fig. 1: Web-Based Learning Usability Evaluation Model
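To make the questionnaire structure concrete, the sketch below organises the evaluation parameters with the sub-parameters listed in Table 1 and averages Likert-style responses per parameter. It is an illustrative reading of the model only: the 1-5 scale, the dictionary layout and the use of plain means are assumptions, since the paper does not prescribe a particular scoring formula.

# Sketch of the WBLUEM evaluation parameters (sub-parameters from Table 1)
# with a simple per-parameter aggregation of questionnaire responses.
# The 1-5 response scale and the plain means are assumptions.
from statistics import mean

PARAMETERS = {
    "usability": ["browser_compatibility", "collaboration_support", "security",
                  "ease_of_use", "ease_of_navigation", "completeness",
                  "reliability_and_efficiency", "multilanguage_support",
                  "universality", "load_time", "accuracy"],
    "accessibility": ["hardware_and_software_support", "controllability",
                      "readability", "navigability", "display_space",
                      "helpfulness"],
    # Some sub-parameters (e.g. readability, satisfaction) appear under more
    # than one parameter, exactly as in Table 1.
    "cognitive_burden": ["satisfaction", "display_of_content", "readability",
                         "use_of_aesthetics", "design_consistency"],
    "user_experience": ["satisfaction", "user_friendly_interface",
                        "error_handling_and_support", "attractiveness"],
    "learning_adaptive": ["rememberability", "pedagogy",
                          "learner_facilitation", "curriculum_management"],
}

def parameter_scores(responses):
    """responses maps a sub-parameter to a list of 1-5 ratings; returns the
    mean rating per top-level parameter (None if no ratings were collected)."""
    scores = {}
    for parameter, subs in PARAMETERS.items():
        ratings = [r for s in subs for r in responses.get(s, [])]
        scores[parameter] = round(mean(ratings), 2) if ratings else None
    return scores

# Hypothetical responses covering two usability sub-parameters of one website.
print(parameter_scores({"ease_of_use": [4, 5, 3], "load_time": [2, 3]}))

Per-parameter scores of this kind are what the survey-based part of the evaluation feeds into the comparison reported below.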

RESULTS AND DISCUSSION

The approach used in the current research work was based on a simple model defining the relationship between system, user and designer (Norman, 2004). The model developed in this research was an enhanced version of that model, and was also based on an earlier cognition model (Thielsch et al., 2013). The researchers added the conceptual model, which is a main element of WBLUEM. The gap between users' mental models and designers' perception may be reduced by conducting surveys and interviews and gathering feedback from users from time to time. The proposed model is shown in Figure 1. When different kinds of users interact with a website, their perception of that website develops. The conceptual model was used by different kinds of users; when they have accessibility or usability issues, they suggest improvements through interviews. In this way, designers know what users want in the website. The interaction tools used to fill the gap between designer and user are surveys, feedback and interviews. In a study, Joo et al. (2011) developed a model for the adaptive learnability, usability and user effectiveness of educational content management systems. The designer again updates the interface and performs iterations and validations.

Then the user's mental model and the designer's perception become equal. Hasan (2012) introduced new standards for evaluating the usability of online learning management systems, combined with ease of navigation and structure.

Table 1. Usability Evaluation Parameters Based Questionnaire

Usability: browser compatibility, collaboration support, security, ease of use, ease of navigation, completeness, reliability and efficiency, multilanguage support, universality, load time, accuracy.
Accessibility: hardware and software support, controllability, readability, navigability, display space, helpfulness.
Cognitive Burden: satisfaction, display of content, readability, use of aesthetics, design consistency.
User Experience: satisfaction, user-friendly interface, error handling and support, attractiveness.
Learning Adaptive: rememberability, pedagogy, learner facilitation, curriculum management.

VULMS and Coursera were tested as test beds using the Virtual Testing Tool. The testing tool was used to measure the usability and accessibility of educational as well as other websites; any website can be checked by giving its URL or by uploading an HTML file. The complete description of errors is shown in Table 2. The results obtained from the proposed model were compared with previously existing approaches to validate this research; the comprehensive comparison is shown in Tables 3 and 4.

Table 2. Results Obtained Using the Proposed Model/Tool

Description     Coursera                              VULMS
XML issues      8 errors                              18 errors
XHTML issues    195 errors                            7 errors
Accessibility   216 accessibility issues              5 pages with accessibility problems
Standards       1 page that violates W3C standards    5 pages that violate W3C standards
Usability       1 page with usability issues          5 pages with usability issues

Table 3. Comparison of Testing Tools with the Proposed Tool

Tool name   W3C Validator                              Virtual Testing Tool
Capacity    Single webpage (main page)                 Complete website
Criteria    HTML code, HTML 4.01, without repairing    Usability, accessibility, HTML 5.0, with repaired code
Base        W3C compliance                             WCAG 1.0, 2.0, 3.0; W3C compliance
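The paper does not describe the Virtual Testing Tool at the implementation level, so the sketch below only illustrates the kind of page-level checks summarised in Table 2 (missing alternative text, missing page title, missing language declaration), using Python's standard html.parser module; the class name and the particular checks are assumptions made for illustration.

# Minimal illustrative page-level checker; NOT the Virtual Testing Tool.
# The checks and the report format are assumptions for illustration only.
from html.parser import HTMLParser

class SimpleAccessibilityChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []
        self.has_title = False
        self.has_lang = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append("img element without alt text (WCAG 1.1.1)")
        if tag == "html" and attrs.get("lang"):
            self.has_lang = True
        if tag == "title":
            self.has_title = True

    def report(self):
        if not self.has_title:
            self.issues.append("page has no <title> element")
        if not self.has_lang:
            self.issues.append("<html> element has no lang attribute")
        return self.issues

checker = SimpleAccessibilityChecker()
checker.feed("<html><body><img src='logo.png'></body></html>")
print(checker.report())

An evaluation of the scope compared in Table 3 would additionally validate the markup against the HTML specification and crawl the complete site rather than a single page.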

In this section, the researchers discuss the validity of this research by comparing it with the results of existing studies. Aslam et al. (2011) used a questionnaire-based technique for the comparison of two different learning management systems of different institutes. Shrivastava et al. (2012) used a questionnaire as a tool. Mentes (2012) conducted a survey based on hypotheses. Asarbakhsh and Sandar (2013) used a questionnaire based on learner observation and scenarios. Jayakumar and Mukhopadhyay (2013) used a survey based on user feedback. In the proposed approach for enhancing usability, an online survey questionnaire based on think-aloud and a review of task analysis is used. None of the above researchers used think-aloud and a review of task analysis in combination, so this technique has enabled us to obtain proper user feedback about the usability of learning websites and hence to arrive at a more effective solution. As far as tool support is concerned, the existing techniques were either manual or semi-automatic. Only Aslam et al. (2011) introduced semi-automatic tool support in their research; the others considered here, Shrivastava et al. (2012), Mentes (2012), Asarbakhsh and Sandar (2013) and Jayakumar and Mukhopadhyay (2013), used manual methods and had no support from automatic tools. The tool proposed in this research is fully automatic; hence, this proposed solution is better than the manual and semi-automatic tools used by the previous researchers. The criteria used by Aslam et al. (2011) to evaluate the usability of learning management systems were based on feedback and interactivity, learning material, assessment, visibility, learner facilitation and support, error handling and prevention, and collaboration support. Shrivastava et al. (2012) used functionality, reliability and efficiency to observe the usability of different learning management systems. Mentes (2012) used attractiveness, controllability, helpfulness, efficiency and learnability as criteria to formulate the usability of learning management systems. Asarbakhsh and Sandar (2013) based their research into the usability of learning management systems on external quality attributes. Jayakumar and Mukhopadhyay (2013) used accuracy, feasibility, utility and propriety for evaluation.

All the parameters and criteria above cover some aspects of evaluation, but a complete set of parameters was not used by previous researchers. In this proposed model, the researchers used usability, accessibility, cognitive burden, user experience and learning adaptive, all with their sub-parameters. Therefore, it is concluded that this research covers all the parameters required for the evaluation of learning management systems. Cognitive burden, user experience and learning adaptive were not taken into consideration by any of the existing studies, so the proposed model comprehensively takes into account a vast spectrum of criteria to evaluate usability. Aslam et al. (2011) used the eFront LMS and VULMS as a test bed in their research. Shrivastava et al. (2012) used the Stanford University website and the Georgia Institute website. Mentes (2012) used the Namık Kemal University (NKU) website. Asarbakhsh and Sandar (2013) used a comparative study to conduct their research. Jayakumar and Mukhopadhyay (2013) showed their results only on a theoretical basis. In this research, the VULMS and Coursera learning management systems were used as test beds to evaluate usability. These are more efficient and more widely used than those used by previous researchers; the others were only simple websites, or had used either VULMS or Coursera, and none of them used both of these LMSs as a test bed. Both VULMS and Coursera are based on very complex and comprehensive LMSs, so there are more options to be explored for usability. Aslam et al. (2011) adopted an online data collection module to conduct their research work. Shrivastava et al. (2012) used Logical Scoring Preferences to propose their solution. Mentes (2012) implemented the Website Analysis and Measurement Inventory methodology. Asarbakhsh and Sandar (2013) adopted think-aloud to conduct their research. Jayakumar and Mukhopadhyay (2013) used the Website Quality Assessment Model. In this research, an online data collection survey, task analysis, think-aloud and the Virtual Testing Tool are used to propose and implement this model. The researchers developed a usability evaluation model for e-learning websites by elaborating some important factors in a more comprehensive way than previous approaches. They defined some tasks in an online activity where they collected users' feedback based on the think-aloud technique. They also developed usability evaluation criteria based on parameters such as usability, accessibility, cognitive burden and learning adaptive, all with their sub-parameters. In the proposed conceptual model, the researchers elaborated some important factors to measure usability and accessibility. This model performed better than the previous models, as shown in Table 3, and the deficiencies in the previous models have been addressed. The results obtained by implementing the proposed model are shown in Table 2. The researchers selected VULMS and the global learning management system Coursera as case studies. Improvement in the accessibility of a website directly and indirectly increases the usability of the website.

Conclusion and future work: Websites play an important role in e-learning, as instructors and virtual classrooms are located at a remote distance where everyone learns on an equal level.
Their testing and evaluation is an important aspect. User surveys are always best for the evaluation of websites and learning management systems. The focus of this research was to increase the usability of learning websites through the development of a usability evaluation model. The designer's perception must match the user's mental model, which is only possible by taking user feedback and suggestions. As shown in Table 3, VULMS obtained 8 points for the task analysis review and Coursera obtained 6 points for the same activity. The use of assistive technology support, which is helpful for physically disabled users, was suggested in the findings and outcomes. Web-based learning is a strong tool in developing countries. For a healthy interaction between the user and the website, continuous feedback from users of learning websites is necessary. To improve the usability of websites and LMSs, designers may deploy a site intercept survey.

REFERENCES

Al-Khalifa, H. S. and R. A. Garcia (2014). Website Design Based on Cultures: An Investigation of Saudis, Filipinos and Indians Government Websites' Attributes. In: Design, User Experience and Usability: User Experience Design for Diverse Interaction Platforms and Environments. Springer International Publishing. pp. 15-27.
Asarbakhsh, M. and J. Sandar (2013). E-learning: The Essential Usability Perspective. The Clinical Teacher. Wiley-Blackwell Publishing Ltd. US. 10(1): 47-50.
Aslam, M., M. I. Aksam, U. Saqib and A. M. Martinez-Enriquez (2011). A Weighted Usability Measure for E-learning Systems. The Journal of American Science. USA. 7(2): 680-686.
Buchner, R., D. Wurhofer, A. Weiss and M. Tscheligi (2012). User Experience of Industrial Robots over Time. In: Proc. of the 7th Annual ACM/IEEE International Conf. on Human-Robot Interaction (HRI '12). ACM. USA. pp. 115-116.
Dinet, J. and M. Kitajima (2011). Draw me the Web: Impact of Mental Model of the Web on Information Search Performance of Young Users. In: Proc. of the 23rd French-Speaking Conf. on Human-Computer Interaction. NY, USA. 3(1): 25-38.
Freire, A., R. Bettio, E. Frade, F. Ferrari and J. Libardi (2013). Accessibility of Web and Multimedia Content: Techniques and Examples from the Educational Context. ACM. WebMedia. Salvador, Brazil. 13(11): 7-8.
Gerstadt, C. L., Y. Hong and A. Diamond (1994). The relationship between cognition and action: performance of children 3½-7 years old on a Stroop-like day-night test. Cognition. 53(2): 129-153.
Hasan, L. (2012). Evaluating the Usability of Nine Jordanian University Websites. In: Proc. of the IEEE Int. Conf. on Communications and Information Technology. Tunisia. 9(2): 91-96.
Hassenzahl, M. and N. Tractinsky (2011). User Experience: A Research Agenda. Behaviour and Information Technology. Taylor and Francis. Darmstadt, Germany. 25(6): 91-97.
ISO 9241-11 (1998). Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs): Guidance on Usability. International Organization for Standardization. Vol. 2. pp. 1-22.
Jayakumar, R. and B. Mukhopadhyay (2013). Website Quality Assessment Model (WQAM) for Developing Efficient E-Learning Framework: A Novel Approach. Int. Journal of Engineering and Technology (IJET). India. 5(5): 3770-3780.
Joo, S., S. Lin and K. Lu (2011). A Usability Evaluation Model for Academic Library Websites: Efficiency, Effectiveness and Learnability. Journal of Library and Information Studies. Taiwan. 9(2): 11-26.
McCracken, D. D. and R. J. Wolfe (2004). User-Centered Website Development: A Human-Computer Interaction Approach. Pearson Education, Prentice Hall. Upper Saddle River, NJ, USA. 2(1): 25-48. ISBN: 978-0130411617.
Mentes, A. (2012). Assessing the Usability of University Websites: An Empirical Study on Namik Kemal University. The Turkish Online Journal of Educational Technology. Tekirdag, Turkey. 11(3): 61-68. ISSN: 2146-7242.
Milica, S., C. Pilgrim and G. Lindgaard (2014). In: Proc. of the 26th Australian Computer-Human Interaction Conference on Designing Futures: The Future of Design. ACM. NY, USA. pp. 316-323.
Norman, D. A. (A) (2004). Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books. USA. 1(1): 20-35.
Norman, D. A. (B) (2004). Introduction to this Special Section on Beauty, Goodness, and Usability. Human-Computer Interaction. Lawrence Erlbaum Associates, Taylor and Francis. USA. 19(4): 311-318.
Papoutsi, M., I. Labuschagne, S. J. Tabrizi and J. C. Stout (2014). The cognitive burden in Huntington's disease: Pathology, phenotype and mechanisms of compensation. Movement Disorders. 29(5): 673-683.
Pinelle, D. and C. Gutwin (2008). Group Task Analysis for Groupware Usability Evaluations. ACM. USA. 10(4): 281-311.
Seckler, M., K. Opwis and A. N. Tuch (2015). Linking objective design factors with subjective aesthetics: An experimental study on how structure and color of websites affect the facets of users' visual aesthetic perception. Computers in Human Behavior. 49: 375-389.
Shrivastava, R., R. Pandey and M. Kumar (2012). Ranking of Academic Websites on the Basis of External Quality Measurement. Journal of Emerging Trends in Computing and Information Sciences. Bhopal, India. 3(4): 547-553. ISSN: 2079-8407.
Thielsch, M. T., I. Blotenberg and R. Jaron (2013). User Evaluation of Websites: From First Impression to Recommendation. Interacting with Computers. Oxford University Press. UK. 1(1): 123-205.
Wyer Jr., R. S. and T. K. Srull (2014). Memory and Cognition in its Social Context. Psychology Press.