For statistical decision problems with a finite parameter space, it is well known that the upper (minimax) value agrees with the lower (maximin) value. For infinite parameter spaces, one usually needs to relax the notion of a prior to regain this equivalence. Various extensions of this classical result have been established, but they are subject to technical conditions such as compactness of the parameter space or continuity of the risk functions. Using nonstandard analysis, we extend the equivalence to arbitrary statistical decision problems. Informally, we show that, for every statistical decision problem, the standard upper value equals the lower value when the supremum is taken over the collection of all so-called internal priors. Applying our nonstandard minimax theorem, we recover two standard minimax theorems from the literature: a minimax theorem on a compact parameter space with continuous risk functions, and a finitely additive minimax theorem with bounded risk functions. Finally, we establish a new standard minimax theorem on a totally bounded metric parameter space with Lipschitz continuous risk functions.
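For concreteness, the upper and lower values mentioned above can be written as follows (the notation here is illustrative, not fixed by the abstract): writing $r(\theta, \delta)$ for the risk of decision rule $\delta$ at parameter $\theta$, and $\Pi$ for a class of priors on the parameter space $\Theta$,

$$
\overline{V} \;=\; \inf_{\delta} \, \sup_{\theta \in \Theta} \, r(\theta, \delta),
\qquad
\underline{V} \;=\; \sup_{\pi \in \Pi} \, \inf_{\delta} \, \int_{\Theta} r(\theta, \delta) \, d\pi(\theta).
$$

In general only $\underline{V} \le \overline{V}$ holds; a minimax theorem asserts equality, and the choice of $\Pi$ (countably additive priors, finitely additive priors, or, as in this work, internal priors in a nonstandard model) determines how much structure the decision problem must have for equality to hold.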