Distributed Computing (DISC) Group
Prof. Vaidya and his students conduct research on topics in distributed computing, with an emphasis on the design and theoretical analysis of distributed algorithms. Ongoing research addresses the following three areas:
- Distributed shared memory systems: Distributed shared memory abstractions are useful for implementing inter-process coordination in a distributed setting. A consistency model specifies the behavior of the shared memory as observed by the processes, and different consistency models are often desired in different contexts. In our work, we are exploring consistency models for emerging applications such as social networking and distributed robotics. Our interest is in identifying suitable algorithms for achieving the desired notions of consistency, and in designing algorithms that implement useful primitives on top of these consistency models.
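To make the notion of a consistency model concrete, the following is a minimal sketch (with hypothetical class names, not the group's code) contrasting two models for a replicated key-value register: a "strong" variant in which a write is applied at every replica before it returns, and an "eventual" variant in which writes propagate lazily, so a read at another replica may return a stale value until synchronization occurs.

```python
class Replica:
    """A single replica holding a local copy of the key-value store."""
    def __init__(self):
        self.store = {}

class StrongKV:
    """Strong consistency: a write is applied at all replicas before returning,
    so any subsequent read at any replica observes it."""
    def __init__(self, n):
        self.replicas = [Replica() for _ in range(n)]

    def write(self, key, value):
        for r in self.replicas:
            r.store[key] = value

    def read(self, key, i):
        return self.replicas[i].store.get(key)

class EventualKV:
    """Eventual consistency: a write is applied at one replica and recorded
    as pending; other replicas see it only after sync() runs."""
    def __init__(self, n):
        self.replicas = [Replica() for _ in range(n)]
        self.pending = []

    def write(self, key, value, i):
        self.replicas[i].store[key] = value
        self.pending.append((key, value))

    def read(self, key, i):
        return self.replicas[i].store.get(key)

    def sync(self):
        # Deferred propagation of all pending writes to every replica.
        for key, value in self.pending:
            for r in self.replicas:
                r.store[key] = value
        self.pending.clear()
```

Under `EventualKV`, a read at a replica other than the writer can return a stale value until `sync()` runs, which is exactly the kind of behavior a consistency model must specify.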
- Robust distributed optimization and machine learning:
Multi-agent distributed optimization has many applications. In recent years, its application
in the context of machine learning has received significant attention. We are exploring three research directions in this
context: (i) making distributed optimization and learning robust to tampering of data and communication during training, (ii) privacy-preserving optimization and machine learning, and (iii) making machine learning robust to adversarial samples.
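As one illustration of direction (i), a standard technique from the literature on fault-tolerant distributed learning is coordinate-wise trimmed-mean aggregation of worker gradients: with up to f adversarial workers, the server discards the f largest and f smallest values in each coordinate before averaging, bounding the influence of tampered gradients. This is a generic sketch of that technique, not the group's published algorithm.

```python
def trimmed_mean(gradients, f):
    """Aggregate gradient vectors robustly, trimming f extremes per side
    in each coordinate. Requires more than 2f workers."""
    n = len(gradients)
    assert n > 2 * f, "need more than 2f workers to trim f per side"
    dim = len(gradients[0])
    result = []
    for d in range(dim):
        coords = sorted(g[d] for g in gradients)
        kept = coords[f:n - f]  # drop the f smallest and f largest values
        result.append(sum(kept) / len(kept))
    return result
```

For example, with three honest workers reporting similar gradients and one tampered worker reporting an extreme vector, trimming f = 1 per side removes the outlier's influence in every coordinate.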
- Distributed computation over wireless networks: We are exploring the performance of distributed computations over wireless networks, exploiting the (lossy) broadcast property of the wireless channel. Our past work in this area has included the design of algorithms for distributed optimization, distributed mutual exclusion, and leader election in wireless networks.
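The lossy-broadcast setting can be illustrated with a simple flooding-style leader election: each node repeatedly broadcasts the largest ID it has heard, the channel may drop individual deliveries, and with enough rounds all nodes converge on the maximum ID. This is a simplified, generic illustration under assumed names, not the group's published protocol.

```python
import random

def elect_leader(node_ids, rounds, loss_prob, seed=0):
    """Simulate max-ID leader election over a lossy broadcast channel.

    Each round, every node broadcasts the best ID it knows; each individual
    delivery is dropped independently with probability loss_prob.
    Returns a dict mapping each node to the largest ID it has heard.
    """
    rng = random.Random(seed)
    known = {i: i for i in node_ids}  # best ID each node has heard so far
    for _ in range(rounds):
        for sender in node_ids:
            for receiver in node_ids:
                if sender == receiver:
                    continue
                if rng.random() < loss_prob:
                    continue  # this delivery was lost on the channel
                known[receiver] = max(known[receiver], known[sender])
    return known
```

With a lossless channel one round suffices; with losses, repetition trades extra rounds for tolerance of dropped deliveries, which is the kind of trade-off this research direction studies.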