Tianao (Owen) Zeng

Khan Lab at USC | Los Angeles, CA | Jun. 2023 - Aug. 2025

Khan Lab at USC Research Experience

Built the software and embedded-systems layer for wearable cognitive-state research, spanning experiment interfaces, multimodal physiological sensing, real-time dashboards, and machine-learning analysis.

Research role

Individual Researcher under Dr. Khan, Ming Hsieh Department of Electrical and Computer Engineering

My work at Khan Lab sat between biomedical sensing and production-minded engineering. I helped turn lab protocols into repeatable software workflows: collecting synchronized physiological signals, validating custom hardware, building dashboards for live inspection and recording, and preparing structured datasets for stress and affect classification.

Research systems

What I built

Cognitive stress evaluation and prediction platform

I worked on a Python-based platform for running cognitive-state experiments and converting wearable biosensor streams into machine-learning-ready data. The system supported relaxation, cold pressor, Stroop, arithmetic, and video stimulus tasks while logging participant metadata, timestamps, and task durations into structured files for analysis.

  • Integrated EDA/GSR and PPG-derived signals, including conductance, heart rate, and heart-rate variability features.
  • Built experiment interfaces for repeatable cognitive tasks and event logging so data could be aligned with protocol stages.
  • Processed recordings in Jupyter with NumPy, Pandas, and SciPy for merging, trimming, feature extraction, and dataset cleanup.
  • Evaluated classical ML baselines including random forest, KNN, and SVM; poster results reported 95 percent accuracy for random forest and KNN, and 87 percent for SVM.
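The heart-rate and HRV features mentioned above can be sketched roughly as follows. This is a minimal illustration, not the lab's actual pipeline: the function name, the choice of inter-beat intervals as input, and the specific features (mean HR, SDNN, RMSSD) are assumptions for the sake of the example.

```python
import numpy as np

def hrv_features(ibis_ms: np.ndarray) -> dict:
    """Compute basic heart-rate and HRV features from inter-beat intervals.

    ibis_ms: 1-D array of successive inter-beat intervals in milliseconds,
    e.g. derived from peak detection on a PPG waveform.
    """
    ibis_ms = np.asarray(ibis_ms, dtype=float)
    diffs = np.diff(ibis_ms)                       # successive IBI differences
    return {
        "mean_hr_bpm": 60000.0 / ibis_ms.mean(),   # average heart rate
        "sdnn_ms": ibis_ms.std(ddof=1),            # overall variability (SDNN)
        "rmssd_ms": np.sqrt(np.mean(diffs ** 2)),  # short-term variability (RMSSD)
    }

# Example: a resting-like series of beats around 800 ms
feats = hrv_features(np.array([812, 790, 805, 798, 820, 801]))
```

A per-task feature table built this way (one row per protocol stage, columns like these) is the kind of structured input the classical baselines above consume.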

Multimodal physiological acquisition firmware and dashboards

I also contributed to the data acquisition stack behind multimodal wearable sensing. The platform combined microcontroller firmware, sensor communication, Python tooling, and a WebBLE dashboard so lab members could inspect, record, and validate physiological streams in both wired and wireless workflows.

  • Supported firmware development around Arduino Nicla Sense ME with ADS1299 EEG/ECG acquisition over SPI.
  • Worked with I2C-connected AD5593R EDA and MAX30102 PPG sensing paths for multimodal signal capture.
  • Built and validated dashboard workflows for serial visualization, recording, wireless DAC control, and EEG sensor checks.
  • Helped design a WebBLE dashboard for wireless monitoring and logging of PPG, EDA, and movement data.
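As a rough illustration of the ADS1299 path: each channel sample arrives as a 24-bit two's-complement word over SPI, which the dashboard side must decode into volts before plotting. The sketch below assumes MSB-first byte order and uses the datasheet's typical reference voltage (4.5 V) and a gain of 24 as illustrative defaults; the lab's actual register configuration may differ.

```python
def decode_ads1299_sample(raw: bytes, vref: float = 4.5, gain: int = 24) -> float:
    """Convert one 24-bit two's-complement ADS1299 sample (3 bytes, MSB first)
    into volts. vref and gain are illustrative defaults, not a confirmed config."""
    code = int.from_bytes(raw, byteorder="big", signed=True)
    # Full-scale positive code (2**23 - 1) corresponds to +vref / gain
    return code * vref / (gain * (2**23 - 1))

# Example: full-scale positive input -> +vref/gain = 0.1875 V
v = decode_ads1299_sample(b"\x7f\xff\xff")
```

Decoding at this layer keeps the firmware simple (it forwards raw frames) while the Python tooling handles scaling, which makes it easy to sanity-check sensor output during EEG channel checks.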

Outcomes

Why it matters

  • Turned exploratory neuroengineering experiments into repeatable software workflows with structured logs and analyzable datasets.
  • Connected embedded sensing, live visualization, and offline ML analysis into one research pipeline.
  • Added a biomedical-signal thread to my AI systems background: noisy sensor data, constrained hardware, data quality, and model evaluation.

Stack

Tools and methods

Python, C++, Jupyter, NumPy, Pandas, SciPy, scikit-learn, Random Forest, KNN, SVM, EEG, PPG, EDA/GSR, SPI, I2C, WebBLE

Posters

Research artifacts

Khan Lab neuroergonomics and wearable sensing poster
Khan Lab 2025 curve poster