
COMP0160: Perception and Interfaces


The COMP0160: Perception and Interfaces course offers students a gateway to various aspects of perception and interfaces. A broader description and further details of the course can be found on the course website.

The Computational Light Laboratory contributes to COMP0160: Perception and Interfaces with two lectures introducing the human visual system, its relation to graphics and displays, and the sensing modalities of emerging devices (e.g., near-eye displays for virtual and augmented reality). Each of these lectures is two hours long. In addition, we support these lectures with laboratory assignments, which are vital for completing the course.


The timetable below shows the parts of COMP0160 provided by the Computational Light Laboratory.

Date                                   Instructor(s)  Content
14th January 2022 - 27th March 2022    Kaan Akşit     Practical
17th January 2022 - 23rd January 2022  Kaan Akşit     Visual perception in graphics and displays
28th February 2022 - 6th March 2022    Kaan Akşit     Integrating sensory information in computational displays



12:00 noon to 1:00 pm, Fridays, 14th January 2022 - 27th March 2022

Chandler House G15

Description (Public)


First coursework

Lecture 2: Visual perception in perceptual graphics and computational displays

Winter 2022


Recording (Password protected)

Slides (Invitation required)

This lecture focuses on human visual perception and its applications in computer graphics and computational display domains.


Summary: In this lecture, students will learn about human visual perception, primarily the eye and its structure. The information about the eye explained throughout the lecture will be linked to the design of computational displays and perceptual graphics, with real cases from the recent literature. By the end of the lecture, students will have enough information to build a simplified optical model of the human eye, and they will be encouraged to build one using such a simplified optical simulation.
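The simplified optical model mentioned above can be sketched as a reduced-eye calculation. The sketch below is illustrative only; the constants are common textbook values for a reduced eye, not numbers prescribed by the lecture, and the function name is our own:

```python
# A minimal reduced-eye sketch: the eye is approximated as a single
# refracting surface of about 60 dioptres, with an effective internal
# refractive index of about 1.333 and a cornea-to-retina distance of
# about 22.2 mm. All values are illustrative textbook numbers.
RELAXED_POWER_D = 60.0   # optical power of the relaxed eye, in dioptres
N_EYE = 1.333            # effective refractive index inside the eye
AXIAL_LENGTH_M = 0.0222  # cornea-to-retina distance, in metres

def accommodation_needed(object_distance_m: float) -> float:
    """Extra dioptres of power needed to focus an object at the given
    distance onto the retina (single-surface vergence approximation)."""
    # The image must land on the retina, so the total required power is
    # the image vergence (N_EYE / axial length) plus the object vergence
    # magnitude (1 / object distance).
    required_power = N_EYE / AXIAL_LENGTH_M + 1.0 / object_distance_m
    return required_power - RELAXED_POWER_D

# Reading at 25 cm demands roughly 4 dioptres of accommodation,
# while a distant object demands almost none.
print(accommodation_needed(0.25))
print(accommodation_needed(100.0))
```

This is the kind of back-of-the-envelope model the coursework can start from; richer schematic eyes (Navarro, Arizona, Liou-Brennan) from the reading list below add multiple surfaces and chromatic effects.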


  • Panero, Julius, and Martin Zelnik. Human dimension & interior space: a source book of design reference standards. Watson-Guptill, 1979.

  • Bekerman, Inessa, Paul Gottlieb, and Michael Vaiman. "Variations in eyeball diameters of the healthy adults." Journal of ophthalmology 2014 (2014).

  • Roberts, Bethany R., and Juliet L. Osborne. "Testing the efficacy of a thermal camera as a search tool for locating wild bumble bee nests." Journal of Apicultural Research 58.4 (2019): 494-500.

  • Park, George E., and Russell Smith Park. "Further evidence of change in position of the eyeball during fixation." Archives of Ophthalmology 23.6 (1940): 1216-1230.

  • Koulieris, George Alex, et al. "Near‐eye display and tracking technologies for virtual and augmented reality." Computer Graphics Forum. Vol. 38. No. 2. 2019.

  • Cakmakci, Ozan, and Jannick Rolland. "Head-worn displays: a review." Journal of display technology 2.3 (2006): 199-216.

  • De Groot, S. G., and J. W. Gebhard. "Pupil size as determined by adapting luminance." JOSA 42.7 (1952): 492-495.

  • Hunt, Robert William Gainer. "Light and dark adaptation and the perception of color." JOSA 42.3 (1952): 190-199.

  • Han, S. H., et al. "The Change of Pupil Cycle Time after Occlusion Therapy in Amblyopia." Journal of the Korean Ophthalmological Society 38.2 (1997): 290-295.

  • Fine, I., et al. "Optical properties of the sclera." Physics in Medicine & Biology 30.6 (1985): 565.

  • Zoulinakis, Georgios, et al. "Accommodation in human eye models: a comparison between the optical designs of Navarro, Arizona and Liou-Brennan." International journal of ophthalmology 10.1 (2017): 43.

  • Herndon, Leon W., Jennifer S. Weizer, and Sandra S. Stinnett. "Central corneal thickness as a risk factor for advanced glaucoma damage." Archives of ophthalmology 122.1 (2004): 17-21.

  • Glasser, Adrian, and Melanie CW Campbell. "Presbyopia and the optical changes in the human crystalline lens with age." Vision research 38.2 (1998): 209-229.

  • Bharadwaj, Shrikant R., and Clifton M. Schor. "Acceleration characteristics of human ocular accommodation." Vision Research 45.1 (2005): 17-28.

  • Campbell, F. W., and G. Westheimer. "Dynamics of accommodation responses of the human eye." The Journal of physiology 151.2 (1960): 285-295.

  • Heron, Gordon, W. N. Charman, and C. Schor. "Dynamics of the accommodation response to abrupt changes in target vergence as a function of age." Vision research 41.4 (2001): 507-519.

  • Phillips, Stephen, Douglas Shirachi, and Lawrence Stark. "Analysis of accommodative response times using histogram information." Optometry and Vision Science 49.5 (1972): 389-401.

  • Deering, Michael F. "A photon accurate model of the human eye." ACM Transactions on Graphics (TOG) 24.3 (2005): 649-658.

  • Ratnam, Kavitha, et al. "Relationship between foveal cone structure and clinical measures of visual function in patients with inherited retinal degenerations." Investigative ophthalmology & visual science 54.8 (2013): 5836-5847.

  • Kim, Jonghyun, et al. "Foveated AR: dynamically-foveated augmented reality display." ACM Transactions on Graphics (TOG) 38.4 (2019): 1-15.

Lecture 7: Integrating Sensory Information in Computational Displays

Winter 2022


Recording (Password protected)

Slides (Invitation required)

This lecture focuses on integrating various kinds of sensory information into next-generation displays.


Summary: In this lecture, students will learn about sensors and their integration into modern display systems such as virtual and augmented reality near-eye displays and three-dimensional displays. The first half will review various kinds of sensors that can capture vital signs from a user, such as heart rate and gaze orientation. The second half will cover applications that use the captured sensory information, sampled from actual products on the market and from research prototypes at the forefront of science.
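As a concrete taste of the sensing techniques surveyed here, remote photoplethysmography (the topic of the Cennini et al. reading below) recovers heart rate from tiny periodic brightness fluctuations of the skin in a video. A minimal sketch follows, with an illustrative function name and a synthetic trace standing in for a real camera feed:

```python
import numpy as np

# Illustrative sketch: estimate heart rate from a mean green-channel
# trace by locating the dominant frequency in the heart-rate band.
def estimate_heart_rate_bpm(green_trace, fps):
    signal = np.asarray(green_trace, dtype=float)
    signal = signal - signal.mean()              # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))       # magnitude spectrum
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    # Restrict the search to plausible heart rates (40-240 beats/minute).
    band = (freqs >= 40 / 60) & (freqs <= 240 / 60)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic check: a 72 bpm (1.2 Hz) pulse sampled at 30 frames/second.
t = np.arange(0, 10, 1 / 30)
trace = 0.5 + 0.01 * np.sin(2 * np.pi * (72 / 60) * t)
print(estimate_heart_rate_bpm(trace, 30))
```

A real pipeline would also handle motion artifacts, which is precisely the contribution of the paper below; this sketch only shows the core spectral-peak idea.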

  • Cennini, G., J. Arguel, K. Akşit, and A. van Leest. "Heart rate monitoring via remote photoplethysmography with motion artifacts reduction." Optics Express 18.5 (2010): 4867-4875.

  • Li, Richard, Eric Whitmire, Michael Stengel, Ben Boudaoud, Jan Kautz, David Luebke, Shwetak Patel, and Kaan Akşit. "Optical gaze tracking with spatially-sparse single-pixel detectors." In 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 117-126. IEEE, 2020.

  • Angelopoulos, Anastasios N., Julien NP Martel, Amit P. Kohli, Jorg Conradt, and Gordon Wetzstein. "Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz." IEEE transactions on visualization and computer graphics 27, no. 5 (2021): 2577-2586.

  • Wei, Shih-En, Jason Saragih, Tomas Simon, Adam W. Harley, Stephen Lombardi, Michal Perdoch, Alexander Hypes, Dawei Wang, Hernan Badino, and Yaser Sheikh. "Vr facial animation via multiview image translation." ACM Transactions on Graphics (TOG) 38, no. 4 (2019): 1-16.

  • Yaldiz, Mustafa B., Andreas Meuleman, Hyeonjoong Jang, Hyunho Ha, and Min H. Kim. "DeepFormableTag: end-to-end generation and recognition of deformable fiducial markers." ACM Transactions on Graphics (TOG) 40, no. 4 (2021): 1-14.

  • Glauser, O., S. Wu, D. Panozzo, O. Hilliges, and O. Sorkine-Hornung. "Interactive hand pose estimation using a stretch-sensing soft glove." ACM Transactions on Graphics (TOG) 38.4 (2019): 1-15.

  • Glauser, O., D. Panozzo, O. Hilliges, and O. Sorkine-Hornung. "Deformation capture via soft and stretchable sensor arrays." ACM Transactions on Graphics (TOG) 38.2 (2019): 1-16.

  • HP Reverb G2 VR Headset

  • MediaPipe Iris: Real-time Iris Tracking and Depth Estimation

  • Brelyon: a window to a whole new world

  • Tobii's eye and head tracking for professional esports


Kaan Akşit



Contact Us


The preferred way of communication is through University College London's online learning system, Moodle. Please do not reach us by email unless what you want to ask or arrange is not possible through Moodle.