Bumblebee models are a class of field-theoretic models in which a nonderivative potential forces a vector or tensor field to acquire a nonzero vacuum expectation value, thereby triggering spontaneous Lorentz breaking. A crucial issue for such models is their stability. In this talk we first review this issue in the context of a vector-field bumblebee model, and then turn to recent work involving an antisymmetric two-tensor field.
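
For concreteness, a commonly studied smooth potential in the vector case (given here as an illustrative sketch, not necessarily the specific form used in this talk) is
\[
  V(B_\mu) \;=\; \lambda \left( B_\mu B^\mu \mp b^2 \right)^2 ,
\]
whose minimum enforces $B_\mu B^\mu = \pm b^2$ and hence a nonzero vacuum value $\langle B_\mu \rangle = b_\mu$ that selects a preferred direction in spacetime; the analogous construction for an antisymmetric two-tensor replaces $B_\mu B^\mu$ with $B_{\mu\nu} B^{\mu\nu}$.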