Bridging optical imaging and machine learning through unpaired image translation

Project Overview

This project served as my capstone work at Tel Aviv University, where I combined the lab's two core focus areas: optical systems and computer vision. The goal was to explore whether image degradation introduced by a Fresnel lens could be corrected using unpaired image-to-image translation with a CycleGAN. Fresnel lenses offer clear advantages in size, weight, and cost, but they introduce distortion, color artifacts, and blur that limit their use in imaging systems.

To address this, I built an end-to-end pipeline that captured real sensor data through a Fresnel lens and used a CycleGAN to translate those images toward the domain of a conventional lens. The project emphasized working with unpaired data, real sensor constraints, and practical optical imperfections rather than idealized datasets.
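
As a rough illustration of the unpaired setup, the sketch below sorts captured frames into two domains following the folder convention of the public pytorch-CycleGAN-and-pix2pix repository (trainA for Fresnel-lens captures, trainB for conventional-lens images). The directory names, center crop, and 256x256 size are assumptions for illustration, not the project's exact preprocessing.

```python
# Minimal sketch, assuming the trainA/trainB folder convention used by the
# public pytorch-CycleGAN-and-pix2pix repo. Paths and sizes are placeholders.
from pathlib import Path
import cv2

SRC = {"trainA": Path("raw/fresnel"), "trainB": Path("raw/conventional")}
DST = Path("datasets/fresnel2clear")

for split, src_dir in SRC.items():
    out_dir = DST / split
    out_dir.mkdir(parents=True, exist_ok=True)
    for img_path in sorted(src_dir.glob("*.png")):
        img = cv2.imread(str(img_path))
        if img is None:
            continue
        h, w = img.shape[:2]
        s = min(h, w)                         # center-crop to a square
        crop = img[(h - s) // 2:(h + s) // 2, (w - s) // 2:(w + s) // 2]
        crop = cv2.resize(crop, (256, 256))   # common CycleGAN input size
        cv2.imwrite(str(out_dir / img_path.name), crop)
```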

My Role

I led the project independently, owning both the optical and software sides of the system. On the hardware side, I worked with a DCx camera sensor and Fresnel lens setup, identifying how sensor parameters such as white balance, black balance, and gain affected image quality and downstream learning performance. One of the main challenges was interfacing a non-scientific sensor with the computer in a way that allowed real-time control and high-quality image capture, which required creative workarounds and iterative testing.
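
The snippet below is an illustrative stand-in for that capture-and-control step: the actual setup drove the DCx sensor through its own SDK, so a generic OpenCV capture is used here to show the kind of manual gain, white-balance, and exposure control that mattered in practice. The device index and property values are hypothetical.

```python
# Illustrative sketch only: a generic OpenCV capture standing in for the
# DCx/uEye SDK, with auto white balance disabled and fixed gain/exposure.
import cv2

cap = cv2.VideoCapture(0)                      # hypothetical device index
cap.set(cv2.CAP_PROP_AUTO_WB, 0)               # disable auto white balance
cap.set(cv2.CAP_PROP_WB_TEMPERATURE, 4600)     # fix color temperature (K)
cap.set(cv2.CAP_PROP_GAIN, 10)                 # keep gain low to limit noise
cap.set(cv2.CAP_PROP_EXPOSURE, -5)             # fixed exposure (driver-dependent units)

ok, frame = cap.read()
if ok:
    cv2.imwrite("fresnel_capture.png", frame)  # raw frame for the training set
cap.release()
```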

On the software side, I adapted pre-trained CycleGAN models for both paired and unpaired image translation, tailoring the architecture to Fresnel-lens data and extending the pipeline to support live video input rather than static images. I handled data collection, preprocessing, training, evaluation, and analysis of failure modes such as color shifts, facial distortion, and blur, using those results to guide parameter tuning and future improvements.
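
A minimal sketch of the live-video adaptation is shown below, assuming the trained generator has been exported as a TorchScript module; the checkpoint path, 256x256 input size, and [-1, 1] normalization follow common CycleGAN conventions rather than the exact project code.

```python
# Sketch: run a trained generator frame-by-frame on a live camera feed.
# Checkpoint path and device index are assumptions.
import cv2
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
netG = torch.jit.load("checkpoints/fresnel2clear_G.pt").to(device).eval()

cap = cv2.VideoCapture(0)                     # Fresnel-lens camera feed
with torch.no_grad():
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        rgb = cv2.cvtColor(cv2.resize(frame, (256, 256)), cv2.COLOR_BGR2RGB)
        x = torch.from_numpy(rgb).permute(2, 0, 1).float().div(127.5).sub(1.0)
        y = netG(x.unsqueeze(0).to(device)).squeeze(0).cpu()
        out = ((y.clamp(-1, 1) + 1.0) * 127.5).byte().permute(1, 2, 0).contiguous().numpy()
        cv2.imshow("corrected", cv2.cvtColor(out, cv2.COLOR_RGB2BGR))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```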

Afterthoughts

The biggest takeaway was that model performance was often limited less by the network itself and more by sensor accessibility, data quality, and the physical behavior of the lens. Ensuring reliable sensor-to-computer communication and collecting sufficient, well-controlled training data proved to be the most critical factors in improving results.

More broadly, the project strengthened my confidence working across disciplines and navigating ambiguity. It pushed me to think at the system level, diagnose failures across hardware and software boundaries, and iterate under real constraints. The experience solidified my interest in optical sensing and computational imaging, and shaped how I approach complex engineering problems going forward.

