Zhe Jun Tang, Ph.D.

Machine Learning Researcher specializing in probabilistic modeling, deep learning, and large-scale data analysis. Experienced in designing algorithms for complex, high-dimensional problems across computer vision, NLP, and time-series forecasting, bridging theory and practice.


Education

University of Oxford, Oxford, U.K. | 2018

Master's in Engineering Science | Non-Graduating Study Abroad Year; Thesis Graded First

Courses: Statistical Learning, Information Engineering, Control Systems, Machine Vision & Robotics

National University of Singapore, Singapore | 2019

Bachelor's in Electrical Engineering | First Class Honors

Texas Instruments Book Prize Award | Top student in Digital Signal Processing and Systems

United Engineers Book Prize Award | Top student in Industrial Control Systems

First-Author Publication List

3iGS - Factorised Tensorial Illumination for 3D Gaussian Splatting

Paper | Code

ABLE-NeRF - Attention-Based Rendering with Learnable Embeddings for Neural Radiance Fields

Paper | Code

MPT-Net - Mask Point Transformer Network for Large Scale Point Cloud Semantic Segmentation

Paper

Experience

NTU S-Lab for Advanced Intelligence, Singapore | 2022 – 2024

Machine Learning Researcher | Neural Rendering, Real-Time 3D Graphics
  • Developed learned reflectance parameterisations to model non-linear radiance behavior in 3D Gaussian splatting.
  • Optimised structured tensor decompositions for efficient real-time illumination modeling (see the sketch after this list).
  • Proposed transformer-based inverse rendering models embedding physics priors into attention.
  • Experimented with LLMs/VLMs for language-aligned scene editing objectives.
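
As a concrete illustration of the structured tensor decomposition mentioned above, here is a minimal PyTorch sketch of a TensoRF-style vector-matrix factorised 3D feature grid, one common way to realise such a decomposition; the class name, component count, and feature dimension are illustrative assumptions, not code from 3iGS.

```python
# Hedged sketch: a vector-matrix (VM) factorised 3D feature grid, one common
# structured tensor decomposition for real-time radiance/illumination fields.
# All names and hyperparameters below are assumptions for illustration.
import torch
import torch.nn.functional as F

class FactorisedGrid3D(torch.nn.Module):
    def __init__(self, resolution=128, n_components=16, feature_dim=27):
        super().__init__()
        R, C = resolution, n_components
        # One 2D plane + one 1D line per axis split: (xy, z), (xz, y), (yz, x).
        self.planes = torch.nn.ParameterList(
            torch.nn.Parameter(0.1 * torch.randn(1, C, R, R)) for _ in range(3))
        self.lines = torch.nn.ParameterList(
            torch.nn.Parameter(0.1 * torch.randn(1, C, R, 1)) for _ in range(3))
        self.proj = torch.nn.Linear(3 * C, feature_dim)  # mixes components into features

    def forward(self, xyz):  # xyz in [-1, 1], shape (N, 3)
        feats = []
        for (a, b, c), plane, line in zip([(0, 1, 2), (0, 2, 1), (1, 2, 0)],
                                          self.planes, self.lines):
            # Bilinearly sample the plane at (xyz_a, xyz_b) ...
            pc = xyz[:, [a, b]].view(1, -1, 1, 2)
            pf = F.grid_sample(plane, pc, align_corners=True).view(plane.shape[1], -1)
            # ... and the matching 1D line at xyz_c (the width dim is a dummy axis).
            lc = torch.stack([torch.zeros_like(xyz[:, c]), xyz[:, c]], -1).view(1, -1, 1, 2)
            lf = F.grid_sample(line, lc, align_corners=True).view(line.shape[1], -1)
            feats.append(pf * lf)  # rank-C coupling of plane and line factors
        return self.proj(torch.cat(feats, 0).T)  # (N, feature_dim)

# Usage: query per-point features that a shading head can then decode.
feats = FactorisedGrid3D()(torch.rand(1024, 3) * 2 - 1)  # -> (1024, 27)
```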

SenseTime Research, Singapore | 2019 – 2022

Algorithm Researcher | Point Cloud Segmentation & 3D Scene Understanding
  • Built a compute-efficient cross-attention transformer for large-scale 3D point segmentation.
  • Designed masked token attention that reduces class-level decoding complexity (see the sketch after this list).
  • Parallelized training across >100 GPUs, cutting training time by 40%.
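
A rough sketch of what masked token attention can look like, assuming a Mask2Former-style decoder layer in which learnable class tokens cross-attend only to points that the previous prediction marked as foreground; all names and shapes below are illustrative assumptions, not the production code.

```python
# Hedged sketch: masked cross-attention between learnable class tokens and
# point features. Layer structure and hyperparameters are assumed.
import torch
import torch.nn as nn

class MaskedTokenDecoderLayer(nn.Module):
    def __init__(self, n_classes=20, dim=256, heads=8):
        super().__init__()
        self.tokens = nn.Parameter(torch.randn(n_classes, dim))  # one token per class
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mask_head = nn.Linear(dim, dim)  # token -> mask embedding

    def forward(self, point_feats, prev_masks=None):
        # point_feats: (B, N, dim); prev_masks: (B, n_classes, N) logits or None.
        q = self.tokens.unsqueeze(0).expand(point_feats.shape[0], -1, -1)
        attn_mask = None
        if prev_masks is not None:
            # Attend only where the previous round predicted foreground; this is
            # what keeps class-level decoding cheap on large point sets.
            attn_mask = prev_masks.sigmoid() < 0.5  # True = blocked position
            attn_mask = attn_mask.repeat_interleave(self.attn.num_heads, dim=0)
            attn_mask[attn_mask.all(-1)] = False  # never fully mask a token (avoids NaNs)
        q, _ = self.attn(q, point_feats, point_feats, attn_mask=attn_mask)
        # Per-point class masks: dot product of token and point embeddings.
        return q, torch.einsum('bcd,bnd->bcn', self.mask_head(q), point_feats)
```

Stacking a few such layers and supervising the returned masks against per-point class labels yields the iterative, class-token decoding the bullet refers to.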

ST Electronics, Singapore | 2017 – 2018

AI Research Intern | Signal Processing & Drone Detection via Deep Learning
  • Developed a patented deep learning method for spectrum-based drone detection (see the sketch after this list).
  • Deployed CUDA-based FFT algorithms to enhance air defense system performance.
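
By way of illustration only (the patented method itself is not detailed here), the sketch below shows the kind of FFT front end such a pipeline needs: it turns a complex IQ capture into a log-magnitude spectrogram that a downstream detector could classify. The sample rate, window, and hop values are assumed.

```python
# Hedged sketch: FFT-based spectrogram front end for RF/drone detection.
# Sample rate, FFT size, and hop length are illustrative assumptions.
import numpy as np

def rf_spectrogram(iq, n_fft=1024, hop=512):
    """Log-magnitude STFT of a complex IQ capture; returns (frames, n_fft) in dB."""
    frames = np.lib.stride_tricks.sliding_window_view(iq, n_fft)[::hop]
    spec = np.fft.fftshift(np.fft.fft(frames * np.hanning(n_fft), axis=-1), axes=-1)
    return 20 * np.log10(np.abs(spec) + 1e-12)

# Toy usage: a drone-like narrowband tone buried in complex Gaussian noise.
fs = 1e6
t = np.arange(int(fs * 0.05)) / fs
iq = np.exp(2j * np.pi * 1.5e5 * t) + 0.5 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))
S = rf_spectrogram(iq)  # the tone appears as a bright vertical ridge in S
```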

University of Oxford, Oxford Photonics Group, Oxford, U.K. | 2017 – 2018

Student Researcher | Optical Wireless Communication & Tracking
  • Built high-speed optical communication links for VR applications.
  • Designed low-cost optical tracking systems, reducing costs by 90%.

Technical Proficiencies

Languages: Python (proficient); C, CUDA, Verilog, Assembly (basic)

Platforms: NVIDIA GPU clusters (HPC), Linux, Xilinx FPGA, HFSS, Arduino

ML Frameworks: PyTorch, TensorFlow, NumPy, pandas, SciPy, scikit-learn, Matplotlib, OpenCV

Honors, Awards, and Competition Rankings

NTU - SenseTime Talent Programme: Full scholarship for Ph.D. candidature and a monthly stipend of ~US$4,000

Texas Instruments Book Prize: Top Student in NUS ECE for Digital Signal Processing and Systems

United Engineers - Faculty of Engineering Annual Book Prize: Top Student in NUS ECE for Industrial Control Systems

LBC Family Engineering Scholarship: Awarded to the top 0.33% of Engineering students for dual matriculation at the University of Oxford

WorldQuant BRAIN Quantitative Researcher: Gold Certificate

SemanticKITTI LiDAR Segmentation Competition 2021: Ranked 11th worldwide (solo entry)

Background: stochastic gradient descent on a smooth multimodal potential f(x, y) = 0.15(x² + y²) + Σᵢ aᵢ exp(−((x − c_{x,i})² + (y − c_{y,i})²) / (2σᵢ²)), with isotropic Gaussian noise. Particles descend, fade when ∥∇f∥ falls below a threshold, and respawn. Contours depict iso-levels of f.
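
A minimal NumPy sketch of those dynamics, assuming illustrative bump parameters, step size, noise scale, and fade threshold (none of these constants are specified in the text):

```python
# Hedged sketch of the background animation's dynamics: noisy gradient descent
# on f(x, y) = 0.15(x² + y²) + Σᵢ aᵢ exp(−rᵢ² / (2σᵢ²)). All constants assumed.
import numpy as np

rng = np.random.default_rng(0)
a  = np.array([-1.0, -0.8, -1.2])   # bump amplitudes a_i
cx = np.array([-1.5,  0.5,  1.8])   # bump centres c_{x,i}
cy = np.array([ 0.8, -1.2,  1.0])   # bump centres c_{y,i}
sg = np.array([ 0.6,  0.5,  0.7])   # bump widths sigma_i

def grad_f(x, y):
    """Analytic gradient of the potential defined above."""
    gx, gy = 0.3 * x, 0.3 * y  # derivative of the 0.15(x² + y²) bowl
    for ai, cxi, cyi, si in zip(a, cx, cy, sg):
        e = ai * np.exp(-((x - cxi) ** 2 + (y - cyi) ** 2) / (2 * si ** 2))
        gx -= e * (x - cxi) / si ** 2
        gy -= e * (y - cyi) / si ** 2
    return gx, gy

p = rng.uniform(-3, 3, size=(64, 2))        # particle positions
eta, noise, tol = 0.05, 0.05, 1e-2          # assumed step size / noise / fade threshold
for _ in range(500):
    gx, gy = grad_f(p[:, 0], p[:, 1])
    p -= eta * np.stack([gx, gy], axis=1)               # descent step ...
    p += noise * rng.standard_normal(p.shape)           # ... plus isotropic noise
    stalled = np.hypot(gx, gy) < tol                    # ∥∇f∥ below threshold: fade ...
    p[stalled] = rng.uniform(-3, 3, size=(int(stalled.sum()), 2))  # ... and respawn
```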