Scheme for suppressing atom expansion induced contrast loss in atom interferometers

15 May 2018  ·  Hu Qing-Qing, Luo Yu-Kun, Jia Ai-Ai, Wei Chun-Hua, Yan Shu-Hua, Yang Jun

The loss of contrast caused by the imperfect Raman pulse area arising from atomic cloud expansion in atom interferometers is investigated systematically. Theoretical simulations show that the expansion of the atomic cloud decreases the π-pulse fidelity and shifts the π-pulse duration, both of which lead to a significant reduction in fringe contrast. We propose a mitigation strategy in which the intensities of the second and third Raman pulses are increased. Simulation results show that, with this intensity-compensation strategy, the fringe contrast of a typical atom-interferometer gravimeter can be improved by 13.6%. We also evaluate the effectiveness of this strategy for a lower atomic-cloud temperature and a larger Raman beam size under different Raman pulse time intervals. The strategy has potential applications in improving the sensitivity of atom-interferometer-based precision measurements, including measurements of gravity, gravity gradient, rotation, and magnetic field gradient, as well as tests of the Einstein equivalence principle.
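The abstract does not give the authors' simulation details, but the mechanism it describes can be illustrated with a minimal sketch: a thermal cloud expands ballistically across a Gaussian Raman beam, so atoms away from the beam centre see a lower two-photon Rabi frequency and the nominal π pulse no longer transfers them fully; raising the intensity of the later pulses partially restores the transfer efficiency. All parameter values, function names, and the simple intensity-scaling model below are illustrative assumptions, not the paper's method.

```python
# Minimal sketch (not the authors' code): Monte Carlo estimate of the
# average pi-pulse transfer probability as a thermal cloud expands across
# a Gaussian Raman beam, with and without boosting the later pulses.
import numpy as np

kB = 1.380649e-23   # Boltzmann constant [J/K]
m_Rb = 1.443e-25    # 87Rb atomic mass [kg]

def pulse_efficiency(t, w0, sigma0, temperature, boost=1.0,
                     n_atoms=20000, seed=0):
    """Average pi-pulse transfer probability at time t after release.

    Assumes the two-photon Rabi frequency of each atom scales with the
    local beam intensity exp(-2 r^2 / w0^2) (both Raman beams sharing the
    same profile); 'boost' > 1 mimics raising the pulse intensity. The
    pulse duration is fixed so an atom at beam centre at t = 0 sees pi."""
    rng = np.random.default_rng(seed)
    sigma_v = np.sqrt(kB * temperature / m_Rb)        # thermal velocity spread
    sigma_r = np.sqrt(sigma0**2 + (sigma_v * t)**2)   # ballistic expansion
    r2 = np.sum(rng.normal(0.0, sigma_r, (n_atoms, 2))**2, axis=1)
    omega_rel = boost * np.exp(-2.0 * r2 / w0**2)     # relative Rabi frequency
    return np.mean(np.sin(np.pi * omega_rel / 2.0)**2)

# Illustrative (assumed) parameters for a pi/2 - pi - pi/2 sequence.
T = 60e-3        # pulse separation [s]
w0 = 10e-3       # Raman beam 1/e^2 radius [m]
sigma0 = 1e-3    # initial cloud radius [m]
temp = 3e-6      # cloud temperature [K]

for boost2, boost3, label in [(1.0, 1.0, "uncompensated"),
                              (1.1, 1.2, "intensity-compensated")]:
    effs = [pulse_efficiency(0.0, w0, sigma0, temp),
            pulse_efficiency(T,   w0, sigma0, temp, boost=boost2),
            pulse_efficiency(2*T, w0, sigma0, temp, boost=boost3)]
    print(label, ["%.3f" % e for e in effs])
```

In this toy model the transfer efficiency of the second and third pulses drops as the cloud expands, and the boosted intensities pull it back toward unity; the actual contrast recovery reported in the paper (13.6%) comes from the authors' full simulation, not from this sketch.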


Categories


Quantum Physics · Applied Physics