no code implementations • 4 Jun 2022 • Kushal Chakrabarti, Nikhil Chopra
We prove the convergence of the proposed AdamSSM algorithm.
no code implementations • 11 Jan 2022 • Prasad Vilas Chanekar, Nikhil Chopra
We also present a gradient-based co-design solution procedure that involves a system coordinate transformation and whose output is a provably stable solution for the original system.
no code implementations • 19 Aug 2021 • Kushal Chakrabarti, Nirupam Gupta, Nikhil Chopra
In this problem, the system comprises multiple agents, each with a set of local data points and an associated local cost function.
no code implementations • 31 May 2021 • Kushal Chakrabarti, Nikhil Chopra
Two of the prominent accelerated gradient algorithms are AdaGrad and Adam.
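For context, the standard per-coordinate update rules of AdaGrad and Adam can be sketched as follows; this is a generic textbook sketch on a toy quadratic, not the AdamSSM variant proposed in the paper, and the step sizes are illustrative choices.

```python
import numpy as np

def adagrad_step(w, grad, accum, lr=0.1, eps=1e-8):
    """One AdaGrad update: steps scaled by accumulated squared gradients."""
    accum = accum + grad**2
    w = w - lr * grad / (np.sqrt(accum) + eps)
    return w, accum

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: bias-corrected first and second moment estimates."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad**2
    m_hat = m / (1 - b1**t)          # bias correction for the mean
    v_hat = v / (1 - b2**t)          # bias correction for the variance
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy example: minimize f(w) = ||w||^2, whose gradient is 2w.
w_a, accum = np.array([1.0, -2.0]), np.zeros(2)
w_m, m, v = np.array([1.0, -2.0]), np.zeros(2), np.zeros(2)
for t in range(1, 2001):
    w_a, accum = adagrad_step(w_a, 2 * w_a, accum)
    w_m, m, v = adam_step(w_m, 2 * w_m, m, v, t)
```

Both iterates approach the minimizer at the origin; AdaGrad's effective step size decays as gradients accumulate, while Adam's momentum keeps its steps near the base learning rate.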
no code implementations • 26 Jan 2021 • Kushal Chakrabarti, Nirupam Gupta, Nikhil Chopra
This paper considers the problem of multi-agent distributed linear regression in the presence of system noise.
no code implementations • 15 Nov 2020 • Kushal Chakrabarti, Nirupam Gupta, Nikhil Chopra
The recently proposed Iteratively Pre-conditioned Gradient-descent (IPG) method has been shown to converge faster than other existing distributed algorithms that solve this problem.
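The core idea behind iterative pre-conditioning can be illustrated on a least-squares problem: alongside the parameter iterate, a pre-conditioner matrix is itself refined each iteration toward the inverse Hessian. The following is a rough single-machine sketch of that idea; the specific update rule, step sizes, and problem sizes are illustrative assumptions and do not reproduce the authors' distributed IPG protocol.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 5))              # synthetic regression data
b = rng.normal(size=50)
H = A.T @ A / len(b)                      # Hessian of the least-squares cost
x_star = np.linalg.lstsq(A, b, rcond=None)[0]

x = np.zeros(5)
K = np.eye(5)                             # pre-conditioner, refined each iteration
alpha = 0.9 / np.linalg.norm(H, 2)        # keeps the K-iteration contractive
beta = 0.1                                # step size for the parameter update
for _ in range(300):
    grad = A.T @ (A @ x - b) / len(b)
    x = x - beta * K @ grad               # pre-conditioned gradient step
    K = K - alpha * (H @ K - np.eye(5))   # drives K toward H^{-1}
```

Because the fixed point of the `K` iteration is the inverse Hessian, the parameter update increasingly behaves like a Newton step, which is what yields the faster convergence compared with plain gradient descent.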
no code implementations • 6 Aug 2020 • Kushal Chakrabarti, Nirupam Gupta, Nikhil Chopra
In this problem, the system comprises multiple agents, each with a set of local data points, all connected to a server.
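A minimal sketch of this server-agent architecture, using plain gradient aggregation for distributed least squares; the agent count, data sizes, and step size are illustrative assumptions, not the paper's specific algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
m, d = 4, 3                                         # hypothetical: 4 agents, 3 parameters
A = [rng.normal(size=(20, d)) for _ in range(m)]    # each agent's local data points
x_true = rng.normal(size=d)
b = [Ai @ x_true for Ai in A]                       # noiseless local observations

x = np.zeros(d)                                     # estimate maintained by the server
for _ in range(500):
    # Each agent computes a gradient from its local data only.
    grads = [Ai.T @ (Ai @ x - bi) / len(bi) for Ai, bi in zip(A, b)]
    # The server aggregates the local gradients and takes a gradient step.
    x = x - 0.3 * np.mean(grads, axis=0)
```

Since only gradients are exchanged, no agent reveals its raw data points to the server, which is the usual motivation for this architecture.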
no code implementations • 13 Mar 2020 • Kushal Chakrabarti, Nirupam Gupta, Nikhil Chopra
In this problem, there are multiple agents in the system, and each agent knows only its local cost function.