Machine Learning for Radio Access Networks
I research machine learning for wireless networks, a field that sounds narrow until you're in it; then it touches everything: optimization theory, systems design, signal processing, real-time decision-making under uncertainty. At Purdue, I work on making Radio Access Networks smarter: better resource allocation, better resilience, better policies as systems scale. Before this, IIT Kanpur for signal processing and communication networks, and a stint in embedded systems writing bare-metal firmware for BLE, Zigbee, LoRa, and 5G. Outside the lab: ham radio operator (VU2LWH), two-time Google Science Fair Asia-Pacific finalist, trained Carnatic vocalist and mridangam player, and a guitarist of considerably humbler standing. Online I go by Aravind Gitamuralee - a surname I made from both my parents' names, a small tribute I've chosen to carry.
For a complete and up-to-date list, see my Google Scholar profile ↗
Modern O-RAN architectures expose a fundamental tension: the non-RT RIC operates on slow timescales with global visibility, while the near-RT RIC must react to fast channel dynamics with limited context. This project develops a cross-timescale, closed-loop reinforcement learning framework coordinating policy decisions across both layers. At its core is a novel problem formulation that jointly models channel-distribution-aware resource provisioning and cross-slice SLA risk, treating interference patterns and traffic demands as distributional signals rather than point estimates. The goal is slicing policies that are proactively robust to distribution shift, not reactively corrective.
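A toy numerical illustration of the distributional-vs-point-estimate distinction (hypothetical demand numbers, not the actual problem formulation): provisioning a slice to the mean demand versus a tail quantile or CVaR of the demand distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-slot traffic demand samples for one slice (Mbps);
# the lognormal shape and parameters are illustrative, not from the project.
demand = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)

point_estimate = demand.mean()           # point-estimate provisioning
q95 = np.quantile(demand, 0.95)          # distribution-aware: 95th percentile
cvar95 = demand[demand >= q95].mean()    # CVaR: mean of the worst 5% of slots

# Provisioning to the mean leaves every above-mean slot at SLA risk;
# provisioning to the tail bounds the violation probability by design.
violation_rate_mean = (demand > point_estimate).mean()
violation_rate_q95 = (demand > q95).mean()
```

The same logic applies to interference: a policy tuned to the mean interference level is fragile exactly when the distribution shifts, which is what the framework is designed to anticipate.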
Implemented a custom OpenAI Gym environment for RAN slicing and trained a PPO agent on real-world Nokia Nordic radio data. Benchmarked against a Proportional Fairness baseline, tracking per-slice throughput and SLA violation rates across training. Written in Python using Stable-Baselines3.
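A minimal sketch of what such an environment's interface can look like (illustrative only: made-up slice counts, arrival rates, and penalty weights, not the project code or the Nokia data):

```python
import numpy as np

class RANSlicingEnv:
    """Gym-style RAN slicing environment sketch. State: per-slice queue
    backlogs; action: per-slice resource shares; reward: served traffic
    minus an SLA-violation penalty. All constants are hypothetical."""

    def __init__(self, n_slices=3, capacity=100.0, sla_backlog=5.0, seed=0):
        self.n_slices = n_slices
        self.capacity = capacity        # total PRB budget per step
        self.sla_backlog = sla_backlog  # max tolerated backlog per slice
        self.rng = np.random.default_rng(seed)
        self.backlog = None

    def reset(self):
        self.backlog = np.zeros(self.n_slices)
        return self.backlog.copy()

    def step(self, action):
        # Normalize the requested shares onto the PRB budget.
        share = np.clip(action, 1e-6, None)
        alloc = self.capacity * share / share.sum()
        # Random per-slice arrivals; serve up to the allocation.
        arrivals = self.rng.poisson(lam=30.0, size=self.n_slices)
        served = np.minimum(self.backlog + arrivals, alloc)
        self.backlog = self.backlog + arrivals - served
        # Penalize slices whose backlog exceeds the SLA threshold.
        violations = int((self.backlog > self.sla_backlog).sum())
        reward = served.sum() - 10.0 * violations
        return self.backlog.copy(), reward, False, {"violations": violations}
```

An agent such as PPO interacts through the usual reset/step loop; a Proportional Fairness baseline would replace the learned action with an allocation weighted by historical per-slice rates.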
View on GitHub ↗
Built a discrete-event simulation of a multi-node queuing network to study end-to-end delay, throughput, and packet loss under varying traffic loads. Implemented in Python with automated plotting to characterize network behavior across load conditions.
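For a flavor of the approach, a single-node sketch (illustrative, not the repository code; the project itself simulates multiple nodes and tracks loss): a Lindley-recursion simulation of an M/M/1 queue.

```python
import random

def simulate_mm1(lam, mu, n_customers=50_000, seed=1):
    """Simulate a single FIFO M/M/1 queue and return the mean sojourn
    (waiting + service) time. Arrival rate lam, service rate mu."""
    rng = random.Random(seed)
    t_arrival = 0.0        # arrival time of the current customer
    server_free_at = 0.0   # time the server finishes its current job
    total_delay = 0.0
    for _ in range(n_customers):
        t_arrival += rng.expovariate(lam)       # Poisson arrivals
        start = max(t_arrival, server_free_at)  # wait if the server is busy
        service = rng.expovariate(mu)           # exponential service times
        server_free_at = start + service
        total_delay += server_free_at - t_arrival
    return total_delay / n_customers
```

At lam = 0.5 and mu = 1.0 the simulated mean sojourn time should approach the analytical M/M/1 value 1/(mu − lam) = 2, which is a handy sanity check before moving to multi-node topologies.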
View on GitHub ↗
Analytical and simulation-based performance evaluation of communication network topologies. Implemented models for throughput, delay, and loss using Newton-Raphson solvers, flow conservation equations, and multi-topology analysis, all in MATLAB.
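As an illustration of the Newton-Raphson piece (a Python re-sketch with a representative example relation, not the MATLAB code): solving the slotted-ALOHA-style throughput relation S = G·e^(−G) for the offered load G.

```python
import math

def newton(f, df, x0, tol=1e-10, max_iter=100):
    """Generic Newton-Raphson iteration: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Find the offered load G with throughput S = G * exp(-G) = 0.3.
# Starting from x0 = 0.1 converges to the smaller of the two roots.
S = 0.3
f = lambda G: G * math.exp(-G) - S
df = lambda G: (1.0 - G) * math.exp(-G)
G = newton(f, df, x0=0.1)
```

The same iteration applies to the delay and flow-conservation fixed points in the repo; only f and its derivative change.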
View on GitHub ↗
A collection of communication engineering implementations spanning MIMO systems and wireless channel modeling. Covers spatial multiplexing, beamforming, BER analysis under fading channels, and channel capacity estimation, implemented in Python and MATLAB.
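A representative sketch of the BER-under-fading part (illustrative Python with assumed parameters, not the repository code): Monte Carlo BER of coherent BPSK over flat Rayleigh fading, which can be checked against the closed form Pb = ½(1 − √(γ/(1+γ))) for average SNR γ.

```python
import numpy as np

def bpsk_rayleigh_ber(snr_db, n_bits=200_000, seed=0):
    """Monte Carlo bit error rate of coherent BPSK over a flat Rayleigh
    fading channel at the given average SNR (dB)."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits  # BPSK mapping: 0 -> +1, 1 -> -1
    # Unit-power Rayleigh channel gain and complex AWGN scaled so that
    # the average received SNR equals the target value.
    h = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2)
    noise = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) / np.sqrt(2 * snr)
    y = h * symbols + noise
    # Coherent detection: equalize by the known channel, threshold real part.
    detected = (np.real(y / h) < 0).astype(int)
    return np.mean(detected != bits)
```

At 10 dB the simulated BER should land close to the closed-form value of about 0.023, a standard consistency check for fading-channel simulators.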
View on GitHub ↗
My academic path spans signal processing, communications theory, and machine learning. Before joining Purdue, I completed a postgraduate degree in Signal Processing and Communication Networks at IIT Kanpur and worked as an embedded systems engineer, developing firmware for wireless protocols including BLE, Zigbee, LoRa, and 5G.
Download Full CV
Open to research collaborations, academic discussions, and industry opportunities in wireless AI, 5G/6G systems, and O-RAN. Feel free to reach out.
Purdue University · Dept. of ECE
West Lafayette, IN 47907