A3: Eye-Tracking Integration

Hello everyone,

My A3 project focuses on a hypothetical program named inSight that leverages software and hardware advancements to integrate eye-tracking capabilities into mobile devices. inSight would then provide learners and educators with a variety of data to help inform educational practice, improve user experience design, and quantify otherwise intangible information.

Please explore my website on this topic by using this LINK

Thank you!


( Average Rating: 4.5 )

11 responses to “A3: Eye-Tracking Integration”

  1. danya sprott

    Hi Braden, I found your forecasting project incredibly interesting. I think this is such a creative way to support students, and the examples of how it could change the way we work with our students really intrigued me, especially the examples of what a teacher can do with information about what students spent more time looking at.


    ( 0 upvotes and 0 downvotes )
  2. elizabeth

    “Our mission is that every student should have the right kind of support at the earliest time as possible.” I wish that all governments would heed that sentiment of Optolexia CEO Fredrick Wetterhall. As you mentioned, Braden, numerous individuals would gain from having a mobile eye tracking device. Would an inSight plug-in be available to add the functionality to train eye movement to facilitate scanning, skimming, and reading? I’m in total agreement that inSight could inform UX design. Specifically, as I’ve spent a lot of time struggling with the findability of 523 content, I would also wonder if it was just me or if the other mobile users were facing similar issues.


    ( 1 upvotes and 0 downvotes )
  3. sebastien renald

    Excellent research, Braden, and your explanation of eye-tracking integration is worthy of the best teachers, congratulations! This is the forecast project from which I learned the most. This is new to me, and at the same time, I would love to experiment with this technology in the near future. In your section “Forecasting the Future”, the example of inSight particularly interested me, because I teach French as a first language and teaching reading is a huge challenge. Unlike writing, reading is mostly a silent activity that happens in the minds of students. Language teachers often lack the tools and resources to innovate in teaching reading skills. A student with a reading disability (like your example of student B) is usually hard to spot, and the diagnosis often comes too late. Technology is already very present in reading assistance, such as text-to-speech, audiobooks and digital TTS books, optical character recognition, and annotation and dictation tools, but it is generally used only once a diagnosis of reading difficulty has been made. Eye tracking seems a bit intrusive to me, and there are still many limits, as you explain well on your site, but I believe that this technology can improve digital education tools and spot learning problems more quickly for faster and more effective intervention. Thanks again for this excellent OER!


    ( 1 upvotes and 0 downvotes )
    1. Braden Litt

      Thank you for your feedback, Sebastien! As you mentioned, my motivation was to make internal thought processes more visible to those who are supporting the learner. Reading assessments are often so formal and time-consuming that those who struggle with reading are overlooked, even though they would benefit significantly from early intervention. If I were to add a feature, I would want inSight to also work for those reading physical books: you would set the mobile device up nearby to scan as you read, since many younger children learn to read with paper copies.


      ( 0 upvotes and 0 downvotes )
  4. Eduardo Rebagliati

    Hi Braden. What an interesting forecast! I think it’s very original how you connected eye tracking with usability in learning contexts. I’ve read a couple of articles in which brain activity is monitored to retrieve data that instructors can use to tailor and improve learning, but it never crossed my mind how eye movement could also provide relevant data. That is fascinating, and I think the examples you provided helped me understand more clearly how the data could be used. It would be very interesting if inSight could be integrated with other technologies that monitor other activities within the human body, such as brain activity and respiration. This way, correlations could be drawn to arrive at more robust conclusions about what might be the subjective experience of students. This could be useful because some data could be extraneous, for example, if the student turned away to another device to read a text message.


    ( 1 upvotes and 0 downvotes )
    1. Braden Litt

      Hi Eduardo, thanks for the feedback! I do think of inSight as a potential predecessor to something like monitoring brain activity or respiration, since it would not involve any physical sensors and would be less invasive than something like a neural implant (as we saw during the transhumanity OER). It could perhaps be integrated into a more holistic biometric monitoring program in the future. There would definitely need to be algorithms or heuristics developed to help differentiate between what is meaningful data and what is irrelevant.
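      To make the idea of separating meaningful from irrelevant data a little more concrete, here is a minimal sketch of the kind of first-pass filter a system like inSight might apply before any interpretation happens. Everything here (the field names, screen dimensions, and confidence threshold) is an illustrative assumption, not part of the actual proposal:

```python
def filter_valid_gaze(samples, screen_w=1080, screen_h=2400, min_conf=0.6):
    """Keep only gaze samples that plausibly reflect on-screen attention.

    samples: list of dicts with keys 't' (ms), 'x', 'y' (pixels), and
    'conf' (tracker confidence, 0-1). Off-screen points (e.g. the reader
    glanced at a text message on another device) and low-confidence
    samples (e.g. during head movement) are discarded.
    """
    return [
        s for s in samples
        if 0 <= s["x"] < screen_w
        and 0 <= s["y"] < screen_h
        and s["conf"] >= min_conf
    ]
```

      A real system would of course need far more than bounds checking, but even this simple pass removes the "turned away to another device" case Eduardo raised.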


      ( 0 upvotes and 0 downvotes )
      1. Eduardo Rebagliati

        Hi Braden. Exactly, that’s what I was envisioning (a holistic biometric monitoring program). I think that including eye tracking could be very beneficial because we can get a lot of information from it. While our thoughts would probably be the most representative element of our subjective experience, I think that eye information is also significant, as there is a strong connection between our visual experience and our inner state.


        ( 1 upvotes and 0 downvotes )
  5. Aaron Chan

    Hey Braden – I think inSight is an innovative and useful concept. This data could definitely help identify subtle learning styles and obstacles that cannot be observed otherwise. Going over your A3, I see a lot of potential for this mobile technology, but I am having trouble getting past the “Rigid Function Condition” challenge you identified. Apart from reading relatively short passages, I would assume that students typically do not prefer to work (particularly answering long text questions) on a phone. Further, the nature of a mobile phone is “mobility”, which may affect the software’s accuracy. It seems to make more sense to build this as a desktop application, utilizing a laptop webcam, where all lesson content is contained in one screen and the student and laptop are relatively stable. The other aspect that I’m struggling with is the interpretation of data. It seems like the software makes recommendations based on eye-tracking data – for example, if students fixate on the menu bar, then the UI should be changed. But I feel like fixation or non-fixation can be due to numerous positive, neutral, or negative reasons. Even if a teacher used inSight just to check whether students are doing the readings, I would imagine the students would figure out how to “game” the system 😀


    ( 0 upvotes and 0 downvotes )
    1. Braden Litt

      Hi Aaron, in hindsight I probably could have worded “Rigid Function Conditions” more eloquently. With many existing eye-tracking methods, there can be no head movement or changes in environmental conditions during tracking, which is basically impossible outside of sterile laboratory conditions and immediately limits the potential of eye tracking. I do forecast that technological advances would overcome this barrier, meaning that stability would no longer be an issue. When developing this proposal, I was envisioning learners being able to use the same software across many different mobile devices (particularly something like a tablet), and even the differences in data across devices could be used to evaluate teaching decisions. There would definitely need to be more research and work done to assess fixation versus inattention.
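      For readers wondering how fixation could be separated from wandering attention algorithmically, a common starting point in eye-tracking research is dispersion-threshold identification (I-DT): a run of gaze samples counts as a fixation when the points stay within a small spatial spread for at least a minimum duration. A minimal sketch, with thresholds that are purely illustrative assumptions rather than inSight parameters:

```python
def detect_fixations(samples, dispersion_px=35, min_duration_ms=100):
    """Dispersion-threshold (I-DT) fixation detection.

    samples: list of (t_ms, x, y) gaze points sorted by time.
    Returns a list of (start_ms, end_ms, centroid_x, centroid_y).
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow a window until it spans at least min_duration_ms.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration_ms:
            j += 1
        if j >= n:
            break
        if _dispersion(samples[i:j + 1]) <= dispersion_px:
            # Extend the window while the spread stays below threshold.
            while j + 1 < n and _dispersion(samples[i:j + 2]) <= dispersion_px:
                j += 1
            xs = [p[1] for p in samples[i:j + 1]]
            ys = [p[2] for p in samples[i:j + 1]]
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            i += 1  # no fixation starting here; slide the window forward
    return fixations

def _dispersion(window):
    """Spatial spread of a window: horizontal range plus vertical range."""
    xs = [p[1] for p in window]
    ys = [p[2] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))
```

      Distinguishing a fixation that means engagement from one that means confusion is exactly the interpretation problem Aaron raises, and nothing in an algorithm like this resolves it on its own.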


      ( 0 upvotes and 0 downvotes )
  6. Jennifer Petrovics

    Braden, what a great idea for a program that could help the teacher identify students’ needs and the individual supports that could be provided. Would you market this as an app that parents could download onto their device for learning purposes? Where would the data be stored? Would there be videos to support parents in interpreting the data as well? Could parents monitor what you and the students see, along with the supports you would be implementing, so they can support their student at home? Would students without access to a personal device and app be given a dedicated device in school? I recognize that these are all questions for a future portion of the study. What you presented has strong support in the educational context, and the specifics about UX would be beneficial in supporting those students. Do you see accommodations being made for students with learning needs already identified?


    ( 1 upvotes and 0 downvotes )
    1. Braden Litt

      Hi Jennifer, my hope would be that this type of software would be integrated into mobile devices at production, could be activated at any time, and could be linked to any relevant parties to provide information to teachers and potentially parents. I would also want to add an artificial intelligence feature that would interpret the data and present meaningful insights in easily understandable terms. Obviously, this software’s effectiveness would be predicated on each student having access to a mobile device, so institutional support would be essential to making this a reality. I think the passive nature of this software means it could be integrated seamlessly for those with different learning needs, while the data could still be used to evaluate the effectiveness of the different interventions being put into place.


      ( 0 upvotes and 0 downvotes )
