have DBD::SQLite read the DB directly from the ZIP file, using some kind of transparent intermediary layer
IO::Uncompress::Unzip does support a filehandle interface, but since SQLite is implemented in C, I doubt it will be able to make use of that. And since SQLite presumably does a lot of random access on the file, I doubt any "transparent" layer would be particularly performant (unless it happens to cache the entire file in memory). So I assume it would be easier, and probably better, to just extract the database from the archive - whether to a file or to memory is the next point.
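To illustrate what that filehandle interface looks like - and why it doesn't help SQLite - here is a minimal sketch (archive.zip and data.sqlite are made-up names). The handle is a Perl-level object that decompresses as you read, so SQLite's C code can't read through it, and seeking backwards means re-reading the member from the start:

```perl
use IO::Uncompress::Unzip qw($UnzipError);

# Open a read handle on one (uncompressed-on-the-fly) archive member
my $z = IO::Uncompress::Unzip->new( 'archive.zip', Name => 'data.sqlite' )
    or die "unzip failed: $UnzipError";

while ( my $n = $z->read( my $buf, 64 * 1024 ) ) {
    # ... process $buf (sequential access only is efficient) ...
}
```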
extract the DB into memory (i.e. a Perl scalar), and then have DBD::SQLite read that
Unfortunately I don't see an API method in DBD::SQLite to do that, though I may have missed something. See also Putting an SQLite DB file into memory - but that is about reading a file on disk into an in-memory database, not about using existing memory; for the latter, the SQLite API would need to support passing a pointer to that memory, and I don't see any mention of that in the SQLite docs on in-memory databases either. Again, I may be missing something (it is Sunday after all ;-) ), but I think you'll have to extract the file to disk.
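For completeness, the workaround from that linked question can be sketched like this: extract the file to disk first, then copy it into a :memory: database with DBD::SQLite's sqlite_backup_from_file. Note that this doesn't avoid the extraction to disk, which is exactly the limitation described above ($dbfile is a hypothetical path to the already-extracted database):

```perl
use DBI;

# Open an in-memory database ...
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', undef, undef,
    { RaiseError => 1, AutoCommit => 1 } );

# ... and copy the on-disk database into it
$dbh->sqlite_backup_from_file($dbfile);
```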
worry about disk space, clean-up, and all that
If those are your main concerns, I think you will find File::Temp's my $tmpdir = tempdir(CLEANUP=>1); very useful. I use it a lot and have hardly ever had problems with it not cleaning up after itself (Perl usually has to crash hard for that to happen). Personally, I would extract the SQLite database to disk, and probably extract the binary blobs only on demand - or perhaps, if the tools you are using to process them can make use of it, read them transparently via IO::Uncompress::Unzip. The JSON file you can of course extract into memory and decode with e.g. JSON::MaybeXS.
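Putting the pieces together, a minimal sketch might look like this (archive.zip, data.sqlite, and meta.json are made-up names for your archive and its members):

```perl
use DBI;
use File::Temp qw(tempdir);
use IO::Uncompress::Unzip qw(unzip $UnzipError);
use JSON::MaybeXS qw(decode_json);

my $tmpdir = tempdir( CLEANUP => 1 );  # removed automatically at exit

# extract the database member to a temp file and open it
my $dbfile = "$tmpdir/data.sqlite";
unzip 'archive.zip' => $dbfile, Name => 'data.sqlite'
    or die "unzip failed: $UnzipError";
my $dbh = DBI->connect( "dbi:SQLite:dbname=$dbfile", undef, undef,
    { RaiseError => 1, AutoCommit => 1 } );

# the JSON member can go straight into a scalar
unzip 'archive.zip' => \my $json, Name => 'meta.json'
    or die "unzip failed: $UnzipError";
my $meta = decode_json($json);
```

The one-shot unzip function takes the member to extract via its Name option, and the output can be a filename or a scalar reference, which is what makes the in-memory JSON extraction a one-liner.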