Zero-Cost Proxies: An Efficient Approach To Neural Architecture Search

Yash Akhauri

As the landscape of neural network architectures and operations continues to expand rapidly, designing efficient neural network architectures that yield high performance becomes increasingly challenging. Traditional methods for automatically searching neural network architectures often demand substantial computational resources, along with extensive hand-tuning of search hyperparameters and the design space. In this talk, we delve into zero-cost proxies for accuracy, a promising and efficient way to streamline the search for neural networks: lightweight metrics that score a candidate architecture at initialization, without any training. We introduce EZNAS, a novel framework for automatically devising such proxies, and explore its underlying principles and advantages. Moreover, we highlight the crucial role of well-designed zero-cost proxies in enhancing neural architecture search methods, ultimately leading to more efficient and performant network designs with minimal computational overhead.
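To make the idea concrete, here is a minimal sketch of one of the simplest zero-cost proxies from the literature, the gradient-norm score: a candidate network is rated by the magnitude of its gradients on a single random batch at initialization, with no training steps. The hand-rolled NumPy two-layer network, the function name, and all dimensions below are illustrative assumptions, not part of EZNAS itself.

```python
import numpy as np

def grad_norm_proxy(rng, in_dim, hidden, out_dim, batch=32):
    """Score a randomly initialized 2-layer MLP by the L2 norm of its
    gradients on one random batch -- no training involved."""
    # Random init (He-style scaling) and a random labeled batch.
    W1 = rng.standard_normal((in_dim, hidden)) / np.sqrt(in_dim)
    W2 = rng.standard_normal((hidden, out_dim)) / np.sqrt(hidden)
    x = rng.standard_normal((batch, in_dim))
    y = rng.integers(0, out_dim, size=batch)

    # Forward pass: ReLU hidden layer, softmax output.
    h = np.maximum(x @ W1, 0.0)
    logits = h @ W2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)

    # Backward pass for the cross-entropy loss.
    d_logits = p.copy()
    d_logits[np.arange(batch), y] -= 1.0
    d_logits /= batch
    dW2 = h.T @ d_logits
    dh = d_logits @ W2.T
    dh[h <= 0] = 0.0
    dW1 = x.T @ dh

    # Proxy score: total gradient L2 norm at initialization.
    return np.sqrt((dW1 ** 2).sum() + (dW2 ** 2).sum())

rng = np.random.default_rng(0)
# Rank two candidate hidden widths by the proxy instead of by training.
scores = {w: grad_norm_proxy(rng, 16, w, 10) for w in (8, 64)}
```

A search procedure would compute such a score for every candidate in the design space and keep the top-ranked architectures, replacing thousands of training runs with a handful of forward/backward passes; EZNAS searches over the space of such scoring programs automatically.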

Bio: Yash is a first-year PhD candidate in the Electrical and Computer Engineering program at Cornell University, working under the guidance of Professor Mohamed S. Abdelfattah. His research primarily revolves around the co-optimization of neural network architectures and hardware systems. Yash's objectives include developing automated techniques for designing and discovering efficient neural network architectures, investigating approaches to compress neural network topologies, and exploring the co-design of hardware accelerators in conjunction with neural network architectures.