Neural network verification, particularly verification of robustness properties, has received considerable research attention. However, many existing verification methods overlook the floating-point rounding errors incurred by deployed neural networks, resulting in unsound verification outcomes. In this paper, we propose a sound robustness verification approach aimed at overcoming this limitation. Our method uses real-number intervals to over-approximate floating-point arithmetic and abstracts floating-point neural networks into networks with real-number interval arithmetic semantics, thereby soundly accounting for floating-point rounding errors. We then employ exact mixed-integer linear programming (MILP) formulations to verify robustness over these abstracted networks. We introduce FMIPVerify, a dedicated tool for sound verification of floating-point neural networks. Experimental results demonstrate that FMIPVerify significantly improves robustness verification for floating-point ReLU networks compared to established complete methods such as MIPVerify.
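To make the interval abstraction concrete, the following is a minimal Python sketch, not FMIPVerify itself, of how a real-number interval can soundly enclose a single floating-point operation. It assumes the standard IEEE 754 rounding model fl(x) = x(1 + d) + e with |d| bounded by the unit roundoff and |e| bounded by the smallest positive subnormal, instantiated here for float32; all names and constants are illustrative.

```python
# Illustrative sketch: real-number intervals that over-approximate
# float32 arithmetic via the standard model fl(x) = x*(1 + d) + e,
# |d| <= U32, |e| <= ETA32. Host arithmetic here is Python float64;
# a fully rigorous implementation would also round those bounds outward.

U32 = 2.0 ** -24     # unit roundoff for IEEE 754 binary32
ETA32 = 2.0 ** -149  # smallest positive subnormal binary32

def widen(lo: float, hi: float) -> tuple[float, float]:
    """Widen a real interval [lo, hi] so it contains every possible
    float32 rounding of any real result in [lo, hi]."""
    err = max(abs(lo), abs(hi)) * U32 + ETA32
    return lo - err, hi + err

def interval_add(a, b):
    """Interval addition, widened to cover float32 rounding."""
    return widen(a[0] + b[0], a[1] + b[1])

def interval_mul(a, b):
    """Interval multiplication, widened to cover float32 rounding."""
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return widen(min(p), max(p))

# Example: one affine step w*x + b followed by ReLU, with x in [0.9, 1.1].
x = (0.9, 1.1)
wx = interval_mul((0.5, 0.5), x)    # weight w = 0.5 as a point interval
pre = interval_add(wx, (0.1, 0.1))  # bias b = 0.1 as a point interval
relu = (max(pre[0], 0.0), max(pre[1], 0.0))  # ReLU is monotone on intervals
print(relu)
```

Propagating such widened intervals layer by layer yields a network over real intervals whose behavior covers every floating-point execution; the resulting bounds are what an exact MILP encoding, in the spirit of the abstracted networks described above, would then verify.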