Unless you’re working on 32-bit hardware or dealing with legacy systems, there’s really no need to use 32-bit integers in database schemas or binary formats. There’s ample memory, storage, and bandwidth for 64-bit integers nowadays. So save yourself the “overflow conversion” warnings.
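The failure mode is easy to demonstrate. A minimal sketch in Java (the variable names here are illustrative, not from any particular schema): a 32-bit counter silently wraps to a negative value at 2³¹, while a 64-bit `long` keeps going.

```java
public class OverflowDemo {
    public static void main(String[] args) {
        // A 32-bit signed int tops out at 2_147_483_647.
        int narrow = Integer.MAX_VALUE;
        narrow += 1; // silently wraps around to Integer.MIN_VALUE
        System.out.println(narrow); // -2147483648

        // The same arithmetic in a 64-bit long has plenty of headroom.
        long wide = Integer.MAX_VALUE;
        wide += 1;
        System.out.println(wide); // 2147483648
    }
}
```

Note that the wraparound produces no exception and no warning at runtime; a row ID or counter column that crosses the 2³¹ boundary just goes negative.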
This is where I think Java made a mistake in defaulting to 32-bit integers regardless of the architecture. I mean, I can see why: for a language and VM made in the mid-90s targeting set-top boxes, settling on 32-bit integers made a lot of sense. But even back then, talk of moving to 64-bit was in the air. Nintendo even made it part of the console’s marketing.