
RQ3. What are the methodologies in Learning Analytics to extract knowledge from data available in smart learning environments?

As discussed for the research question in Section 4.3, three learning analytics techniques are used to infer knowledge from programming learning data in smart learning environments: Notification, Visualisation, and Schedule.

Notification concerns the analysis of what must be presented to the learner. It can inspire learners by encouraging them not to give up and to keep improving, and it keeps them apprised of their procedures and growth by presenting them with relevant data. It can thus act as a source of motivation for continued effort and further improvement, and it can also deliver suggestions or proposals. Visualisation, as the name suggests, concerns the ways in which information is presented to the learner. An IDE-based learning environment has three different techniques for extracting knowledge and presenting it: the demonstration, the report, and the reminder. The demonstration can provide effective pictorial analytics in a learning environment. A significant consideration is that demonstrations are effective in encouraging learning only if learner engagement is high, e.g., there should be some sort of collaboration around the demonstrations [35].
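To illustrate how such a notification might be assembled inside an IDE plugin, the following minimal Java sketch composes an encouraging message from a learner's progress data. The class, record, and field names are hypothetical and do not come from any of the reviewed tools.

```java
// Hypothetical sketch: composing a motivational notification from learner progress data.
// Class, record, and field names are illustrative only, not from any reviewed IDE plugin.
public class ProgressNotifier {

    public record ProgressSnapshot(int exercisesCompleted, int totalExercises,
                                   int consecutiveFailedCompilations) {}

    /** Builds the message text an IDE plugin could show to the learner. */
    public static String buildMessage(ProgressSnapshot p) {
        double completion = (100.0 * p.exercisesCompleted()) / p.totalExercises();
        if (p.consecutiveFailedCompilations() >= 3) {
            // Encouragement plus a concrete suggestion, as described for the notification technique.
            return String.format(
                "Don't give up! You have already completed %.0f%% of the exercises. "
                + "Try re-reading the compiler error before the next attempt.", completion);
        }
        return String.format("Good progress: %.0f%% of the exercises completed. Keep going!", completion);
    }

    public static void main(String[] args) {
        System.out.println(buildMessage(new ProgressSnapshot(12, 20, 4)));
    }
}
```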

Human-system interactions and user opinions can also enhance the efficacy of demonstrations. The report differs from demonstration delivery in that it provides text messages, while the reminder is delivered as a popup message inside the IDE. Schedule determines when the information is presented to the learner. One option is Persevering, in which the feedback is continuously available. The State-based technique helps when the learner is known to have moved into, or to remain within, an educationally unproductive state. On-Demand delivers feedback only after the learner requests it; for example, an IDE could offer a “Get Hint” button that, when clicked, generates a recommendation or analysis for the learner. This schedule differs somewhat from the Persevering option: although the feedback is always available, it is not visible to the learner unless the learner asks for it.
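The three schedule options can be summarised as a small dispatching policy. The sketch below is purely illustrative, assuming a hypothetical FeedbackScheduler class, Schedule enum, and event-handler methods; none of these correspond to an existing IDE plugin API.

```java
// Hypothetical sketch of the three schedule options discussed above.
// None of these types correspond to a real IDE plugin API.
import java.util.function.Supplier;

public class FeedbackScheduler {

    enum Schedule { PERSEVERING, STATE_BASED, ON_DEMAND }

    private final Schedule schedule;
    private final Supplier<String> analysis;   // produces the recommendation text

    FeedbackScheduler(Schedule schedule, Supplier<String> analysis) {
        this.schedule = schedule;
        this.analysis = analysis;
    }

    /** Persevering: feedback is always visible, so it is refreshed on every editor event. */
    void onEditorEvent() {
        if (schedule == Schedule.PERSEVERING) show(analysis.get());
    }

    /** State-based: feedback appears only when the learner enters an unproductive state,
     *  e.g. several failed compilations in a row. */
    void onStateChange(boolean unproductiveState) {
        if (schedule == Schedule.STATE_BASED && unproductiveState) show(analysis.get());
    }

    /** On-demand: feedback exists but stays hidden until the learner asks for it,
     *  e.g. by clicking a "Get Hint" button. */
    void onGetHintClicked() {
        if (schedule == Schedule.ON_DEMAND) show(analysis.get());
    }

    private void show(String message) {
        System.out.println("[IDE popup] " + message);   // stand-in for a real IDE notification
    }

    public static void main(String[] args) {
        FeedbackScheduler s = new FeedbackScheduler(Schedule.ON_DEMAND,
                () -> "Hint: check the loop boundary condition.");
        s.onGetHintClicked();   // the hint is shown only after the learner requests it
    }
}
```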

6 Conclusion and Recommendations

In this work, we explored the exciting prospects of IDE-based Learning Analytics for successfully collecting data at different levels and effectively delivering it to programming students. We also identified correspondingly suitable IDE-based Learning Analytics approaches, which could be game-changers for future research.

Our review indicates that IDE infrastructure must also be improved by collaborating with different developers, developing a standardized data format, and unifying IDE architecture through a standard API for plugins. Our review also revealed that only a small amount of data is currently collected and investigated in IDE-based Learning Analytics. Therefore, we recommend that future research focus more on the growth of IDE facilities in which a broader spectrum of data is gathered and evaluated.
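As a rough illustration of what a standardized data format exposed through such a plugin API might look like, the following Java sketch defines a minimal learning-event record with a simple line-based serialization. The record and its field names are assumptions made for illustration, not an agreed standard.

```java
// Hypothetical sketch of a standardized IDE learning-event record.
// Field names are illustrative assumptions, not an existing standard or plugin API.
import java.time.Instant;

public record IdeLearningEvent(
        String learnerId,        // pseudonymous identifier of the student
        String ideName,          // e.g. "BlueJ", "Eclipse", "IntelliJ IDEA"
        String eventType,        // e.g. "compile", "test_run", "submission"
        Instant timestamp,       // when the event occurred
        boolean successful,      // outcome of the compile/test/submission
        String artifactId) {     // file or exercise the event refers to

    /** A compact line-based serialization a shared plugin API could emit. */
    public String toCsvLine() {
        return String.join(",", learnerId, ideName, eventType,
                timestamp.toString(), Boolean.toString(successful), artifactId);
    }

    public static void main(String[] args) {
        IdeLearningEvent e = new IdeLearningEvent(
                "student-042", "BlueJ", "compile", Instant.now(), false, "Exercise3.java");
        System.out.println(e.toCsvLine());
    }
}
```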

7 Future work

Smart Learning Analytics can be considered “smart” because of its malleability and how effectively it promotes self-directed learning. Smart learning supports context-aware learning and can also provide personalized feedback. Learners can track their progress and design their own learning plans suited to their learning goals and programming capabilities. The Smart Learning Environment and Smart Learning Analytics are entirely interdependent: the Smart Learning Environment ensures flexible learning anywhere and anytime, while Smart Learning Analytics keeps the smartness intact by improving customization and self-direction. With more advanced research, Smart Learning Analytics will become more responsive and supportive of learners’ programming goals and capabilities and, along the way, help them to excel in their programming careers.

References

[1] Z.-T. Zhu, M.-H. Yu, and P. Riezebos, ‘A research framework of smart education’, Smart Learn. Environ., vol. 3, no. 1, p. 4, Mar. 2016, doi: 10.1186/s40561-016-0026-2.

[2] F. J. Agbo, S. S. Oyelere, J. Suhonen, and M. Tukiainen, ‘Identifying potential design features of a smart learning environment for programming education in Nigeria’, Int J Learn Technol, 2019, doi: 10.1504/ijlt.2019.106551.

[3] P. Leitner, M. Ebner, and M. Ebner, ‘Learning Analytics Challenges to Overcome in Higher Education Institutions’, in Utilizing Learning Analytics to Support Study Success, D. Ifenthaler, D.-K. Mah, and J. Y.-K. Yau, Eds. Cham: Springer International Publishing, 2019, pp. 91–104. doi: 10.1007/978-3-319-64792-0_6.

[4] T. H. Laine, M. Vinni, C. I. Sedano, and M. Joy, ‘On designing a pervasive mobile learning platform’, ALT-J, vol. 18, no. 1, pp. 3–17, Mar. 2010, doi: 10.1080/09687761003657606.

[5] Z. Papamitsiou and A. A. Economides, ‘Learning Analytics for Smart Learning Environments: A Meta-Analysis of Empirical Research Results from 2009 to 2015’, in Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy, M. J. Spector, B. B. Lockee, and M. D. Childress, Eds. Cham: Springer International Publishing, 2016, pp. 1–23. doi: 10.1007/978-3-319-17727-4_15-1.

[6] A. Gomes and A. J. Mendes, ‘An environment to improve programming education’, in Proceedings of the 2007 international conference on Computer systems and technologies, New York, NY, USA, Jun. 2007, pp. 1–6. doi: 10.1145/1330598.1330691.

[7] J. A. Luke, ‘Continuously Collecting Software Development Event Data As Students Program’, 2015, Accessed: Aug. 08, 2021. [Online]. Available: https://www.semanticscholar.org/paper/Continuously-Collecting-Software-Development-Event-Luke/fd852e24bce2e68a848d5134e51069fb74c40a8a

[8] J. Spacco, D. Hovemeyer, W. Pugh, F. Emad, J. K. Hollingsworth, and N. Padua-Perez, ‘Experiences with marmoset: designing and using an advanced submission and testing system for programming courses’, in Proceedings of the 11th annual SIGCSE conference on Innovation and technology in computer science education, New York, NY, USA, Jun. 2006, pp. 13–17. doi: 10.1145/1140124.1140131.

[9] D. Burgos, C. Tattersall, and R. Koper, ‘How to represent adaptation in e-learning with IMS learning design’, Interact. Learn. Environ., vol. 15, no. 2, pp. 161–170, Aug. 2007, doi: 10.1080/10494820701343736.

[10] E. Kurilovas, I. Zilinskiene, and V. Dagiene, ‘Recommending suitable learning scenarios according to learners’ preferences: An improved swarm based approach’, Comput. Hum. Behav., vol. 30, pp. 550–557, Jan. 2014, doi: 10.1016/j.chb.2013.06.036.

[11] A. Klašnja-Milićević, M. Ivanović, B. Vesin, and Z. Budimac, ‘Enhancing e-learning systems with personalized recommendation based on collaborative tagging techniques’, Appl. Intell., vol. 48, no. 6, pp. 1519–1535, Jun. 2018, doi: 10.1007/s10489-017-1051-8.

[12] B. Vesin, K. Mangaroska, and M. Giannakos, ‘Learning in smart environments: user-centered design and analytics of an adaptive learning system’, Smart Learn. Environ., vol. 5, no. 1, p. 24, Oct. 2018, doi: 10.1186/s40561-018-0071-0.

[13] A. Klašnja‐Milićević, M. Ivanović, and Z. Budimac, ‘Data science in education: Big data and learning analytics’, Comput. Appl. Eng. Educ., vol. 25, no. 6, pp. 1066–1078, 2017, doi: https://doi.org/10.1002/cae.21844.

[14] K. Mangaroska and M. Giannakos, ‘Learning Analytics for Learning Design: A Systematic Literature Review of Analytics-Driven Design to Enhance Learning’, IEEE Trans. Learn. Technol., vol. 12, no. 4, pp. 516–534, Oct. 2019, doi: 10.1109/TLT.2018.2868673.

[15] H. Trætteberg, A. Mavroudi, M. Giannakos, and J. Krogstie, ‘Adaptable Learning and Learning Analytics: A Case Study in a Programming Course’, in Adaptive and Adaptable Learning, Cham, 2016, pp. 665–668. doi: 10.1007/978-3-319-45153-4_87.

[16] J. L. Santos, S. Govaerts, K. Verbert, and E. Duval, ‘Goal-oriented visualizations of activity tracking: a case study with engineering students’, in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, New York, NY, USA, Apr. 2012, pp. 143–152. doi: 10.1145/2330601.2330639.

[17] J. Liebowitz and M. Frank, Knowledge Management and E-Learning. CRC Press, 2016.

[18] V. Tikhomirov, N. Dneprovskaya, and E. Yankovskaya, ‘Three Dimensions of Smart Education’, in Smart Education and Smart e-Learning, Cham, 2015, pp. 47–56. doi: 10.1007/978-3-319-19875-0_5.

[19] K. M. Bailey and D. Nunan, ‘Voices from the Language Classroom: Qualitative Research in Second Language Education’, 1996. doi: 10.2307/3587662.

[20] R. Ronchi, H.-D. Park, and O. Blanke, ‘Bodily self-consciousness and its disorders’, Handb. Clin. Neurol., vol. 151, pp. 313–330, 2018, doi: 10.1016/B978-0-444-63622-5.00015-2.

[21] S. Charleer, J. Klerkx, E. Duval, T. De Laet, and K. Verbert, ‘Creating Effective Learning Analytics Dashboards: Lessons Learnt’, in Adaptive and Adaptable Learning, Cham, 2016, pp. 42–56. doi: 10.1007/978-3-319-45153-4_4.

[22] L. M. Baker, R. L. Dalla, and C. Williamson, ‘Exiting prostitution: an integrated

[24] R. Pilkington, ‘Developing undergraduate research and inquiry’, Innov. Educ. Teach. Int., vol. 47, no. 2, pp. 247–248, May 2010, doi: 10.1080/14703291003718976.

[25] ‘Proceedings of the 1st International Conference on Learning Analytics and Knowledge | ACM Other conferences’. https://dl.acm.org/doi/proceedings/10.1145/2090116 (accessed Mar. 15, 2021).

[26] M. Kölling, ‘Using BlueJ to Introduce Programming’, in Reflections on the Teaching of Programming: Methods and Implementations, J. Bennedsen, M. E. Caspersen, and M. Kölling, Eds. Berlin, Heidelberg: Springer, 2008, pp. 98–115. doi: 10.1007/978-3-540-77934-6_9.

[27] P. M. Johnson, ‘Requirement and Design Trade-offs in Hackystat: An In-Process Software Engineering Measurement and Analysis System’, in First International Symposium on Empirical Software Engineering and Measurement (ESEM 2007), Sep. 2007, pp. 81–90. doi: 10.1109/ESEM.2007.36.

[28] M. Cimino, G. Lettieri, G. Stea, and L. Lazzarino, ‘Using Web-CAT to improve the teaching of programming to large university classes’, 2013, Accessed: Jun. 01, 2021. [Online]. Available: https://www.semanticscholar.org/paper/Using-Web-CAT-to-improve-the-teaching-of-to-large-Cimino-Lettieri/c83bfc773f8e3216a76f50fb3499150ac47169a8

[29] J. Spacco, D. Hovemeyer, and W. Pugh, ‘An Eclipse-based course project snapshot and submission system’, in Proceedings of the 2004 OOPSLA workshop on eclipse technology eXchange, New York, NY, USA, Oct. 2004, pp. 52–56. doi: 10.1145/1066129.1066140.

[30] S. Amann, S. Proksch, S. Nadi, and M. Mezini, ‘A Study of Visual Studio Usage in Practice’, in 2016 IEEE 23rd International Conference on Software Analysis, Evolution, and Reengineering (SANER), Mar. 2016, vol. 1, pp. 124–134. doi: 10.1109/SANER.2016.39.

[31] A. Vihavainen, T. Vikberg, M. Luukkainen, and M. Pärtel, ‘Scaffolding students’ learning using test my code’, in Proceedings of the 18th ACM conference on Innovation and technology in computer science education, New York, NY, USA, Jul. 2013, pp. 117–122. doi: 10.1145/2462476.2462501.

[32] H. Böck, ‘IntelliJ IDEA and the NetBeans Platform’, in The Definitive Guide to NetBeans™ Platform 7, H. Böck, Ed. Berkeley, CA: Apress, 2011, pp. 431–437. doi: 10.1007/978-1-4302-4102-7_40.

[33] N. C. C. Brown, M. Kölling, D. McCall, and I. Utting, ‘Blackbox: a large scale repository of novice programmers’ activity’, in Proceedings of the 45th ACM technical symposium on Computer science education, New York, NY, USA, Mar. 2014, pp. 223–228. doi: 10.1145/2538862.2538924.

[34] N. C. C. Brown, A. Altadmri, S. Sentance, and M. Kölling, ‘Blackbox, Five Years On: An Evaluation of a Large-scale Programming Data Collection Project’, in Proceedings of the 2018 ACM Conference on International Computing Education Research, New York, NY, USA, Aug. 2018, pp. 196–204. doi: 10.1145/3230977.3230991.

[35] M. Scheffel, H. Drachsler, S. Stoyanov, and M. Specht, ‘Quality Indicators for Learning Analytics’, J Educ Technol Soc, 2014.

[36] S. S. Oyelere, J. Suhonen, and T. H. Laine, ‘Integrating Parson’s programming puzzles into a game-based mobile learning application’, in Proceedings of the 17th Koli Calling International Conference on Computing Education Research, ACM, 2017, pp. 158–162.

[37] F. J. Agbo, S. S. Oyelere, J. Suhonen, and S. Adewumi, ‘A systematic review of computational thinking approach for programming education in higher education institutions’, in Proceedings of the 19th Koli Calling International Conference on Computing Education Research, 2019, pp. 1–10.

[38] F. J. Agbo, I. T. Sanusi, S. S. Oyelere, and J. Suhonen, ‘Application of Virtual Reality in Computer Science Education: A Systemic Review Based on Bibliometric and Content Analysis Methods’, Education Sciences, vol. 11, no. 3, 2021.

[39] F. J. Agbo and S. S. Oyelere, ‘Smart mobile learning environment for programming education in Nigeria: adaptivity and context-aware features’, in Intelligent Computing – Proceedings of the Computing Conference, pp. 1061–1077, Springer, Cham, 2019.

[40] A. A. Yunusa, I. T. Sanusi, O. A. Dada, S. S. Oyelere, and F. J. Agbo, ‘Disruptions of Academic Activities in Nigeria: University Lecturers’ Perceptions and Responses to the COVID-19’, in 2020 XV Conferencia Latinoamericana de Tecnologias de Aprendizaje (LACLO), Oct. 2020, pp. 1–6. IEEE.

[41] F. J. Agbo, S. S. Oyelere, J. Suhonen, and M. Tukiainen, ’Scientific production and thematic breakthroughs in smart learning environments: a bibliometric analysis’, Smart Learning Environments, vol. 8, no. 1, pp. 1-25, 2021.

[42] F. J. Agbo, S. S. Oyelere, J. Suhonen, and T. H. Laine, ‘Co-design of mini games for learning computational thinking in an online environment’, Education and Information Technologies, https://link.springer.com/article/10.1007/s10639-021-10515-1, 2021.

[43] S. S. Oyelere, F. J. Agbo, I. T. Sanusi, A. A. Yunusa, and K. Sunday, ‘Impact of Puzzle-Based Learning Technique for Programming Education in Nigeria Context’, In 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT) (Vol. 2161, pp. 239-241). IEEE, 2019.

[44] F. J. Agbo, S. S. Oyelere, and N. Bouali, ‘A UML approach for designing a VR-based smart learning environment for programming education’, In 2020 IEEE Frontiers in Education Conference (FIE) (pp. 1-5). IEEE, 2020.

[45] F. J. Agbo, S. S. Oyelere, J. Suhonen, and M. Tukiainen, ‘Smart learning environment for computing education: readiness for implementation in Nigeria’, in EdMedia + Innovate Learning, pp. 1382–1391. Association for the Advancement of Computing in Education (AACE), 2019.

[46] S. S. Oyelere, J. Suhonen, G. M. Wajiga, and E. Sutinen, ‘Design, development, and evaluation of a mobile learning application for computing education’, Education and Information Technologies, vol. 23, no. 1, pp. 467–495, 2018.

[47] A. K. Yadav and S. S. Oyelere, ‘Contextualized mobile game-based learning application for computing education’, Education and Information Technologies, vol. 26, no. 3, pp. 2539–2562, 2021.

[48] S. S. Oyelere and J. Suhonen, ‘Design and implementation of MobileEdu m-learning application for computing education in Nigeria: A design research approach’, in 2016 International Conference on Learning and Teaching in Computing and Engineering (LaTICE), pp. 27–31. IEEE, 2016.

[49] S. S. Oyelere, N. Bouali, R. Kaliisa, G. Obaido, A. A. Yunusa, and E. R. Jimoh, ‘Exploring the trends of educational virtual reality games: a systematic review of empirical studies’, Smart Learning Environments, vol. 7, no. 1, pp. 1–22, 2020.