We are working on an integration with our mobile application: a live-feed backend API that connects to an RTSP URL and streams video data back to the client.
The flow we have in mind is the following:
mobile application -> /api/realtime/camera -> start an FFmpeg process that connects to the RTSP URL -> pipe the process' stdout (its getInputStream() on the Java side) to the API's response using WebFlux.
This is a code snippet:
----------- process init -----------------------------
ProcessBuilder ffmpegBuilder = new ProcessBuilder(
        "ffmpeg",
        "-rtsp_transport", "tcp",  // RTSP over TCP, avoids UDP packet loss
        "-i", rtspConnection,
        "-c:v", "libx264",
        "-preset", "ultrafast",
        "-tune", "zerolatency",
        "-b:v", "2000k",
        "-an",                     // no audio
        "-f", "mp4",
        "-movflags", "frag_keyframe+empty_moov+default_base_moof", // fragmented MP4, streamable without seeking
        "pipe:1");                 // write the output to stdout
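One pitfall worth flagging here: FFmpeg writes its log output to stderr, and if nothing drains that pipe the process can stall once the OS pipe buffer fills. A minimal sketch of a launcher that drains stderr on a daemon thread (the class and thread names are just our own illustration):

```java
import java.io.IOException;
import java.io.InputStream;

public final class FfmpegLauncher {

    // Starts the process and drains stderr on a daemon thread so the
    // child process never blocks on a full stderr pipe buffer.
    public static Process start(ProcessBuilder builder) throws IOException {
        Process process = builder.start();
        Thread drainer = new Thread(() -> {
            try (InputStream err = process.getErrorStream()) {
                byte[] buf = new byte[4096];
                while (err.read(buf) != -1) {
                    // discard (or forward to a logger)
                }
            } catch (IOException ignored) {
                // process ended; nothing left to drain
            }
        }, "ffmpeg-stderr-drain");
        drainer.setDaemon(true);
        drainer.start();
        return process;
    }
}
```

Alternatively, `builder.redirectError(ProcessBuilder.Redirect.DISCARD)` achieves the same without a thread, at the cost of losing the FFmpeg logs entirely.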
------------- process read & push to spring reactive Sink ------------------
try (var inputStream = ffmpegProcess.getInputStream()) {
    byte[] buffer = new byte[8192];
    int bytesRead;
    // wrap() does not copy, so each chunk is copied before it is handed to the sink
    while ((bytesRead = inputStream.read(buffer)) != -1) {
        DataBuffer dataBuffer = DefaultDataBufferFactory.sharedInstance
                .wrap(Arrays.copyOf(buffer, bytesRead));
        sink.emitNext(dataBuffer, Sinks.EmitFailureHandler.FAIL_FAST);
    }
    sink.emitComplete(Sinks.EmitFailureHandler.FAIL_FAST);
} catch (Exception e) {
    sink.emitError(e, Sinks.EmitFailureHandler.FAIL_FAST);
} finally {
    ffmpegProcess.destroy();
}
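As an aside, Spring already ships a helper for exactly this bridge: DataBufferUtils.readInputStream wraps a blocking InputStream as a Flux<DataBuffer> and releases buffers on cancel, which could replace the manual loop and sink above. A sketch, assuming the process has already been started (the class and method names are our own):

```java
import org.springframework.core.io.buffer.DataBuffer;
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

public final class FfmpegStreams {

    // Exposes the process' stdout as a Flux<DataBuffer>. readInputStream
    // handles chunking and buffer release; boundedElastic keeps the
    // blocking read() calls off the event loop.
    public static Flux<DataBuffer> stdoutAsFlux(Process ffmpegProcess) {
        return DataBufferUtils
                .readInputStream(ffmpegProcess::getInputStream,
                                 DefaultDataBufferFactory.sharedInstance, 8192)
                .subscribeOn(Schedulers.boundedElastic())
                // stop ffmpeg when the client disconnects or the stream ends
                .doFinally(signal -> ffmpegProcess.destroy());
    }
}
```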
----------------- API that will eventually transform previous sink as flux -----------
@GetMapping(value = "/{deviceId}/camera", produces = "video/mp4")
public Flux<DataBuffer> streamCamera(@PathVariable Long deviceId) {
    return sink.asFlux(); // expose the sink fed by the reader loop above
}
The problem is that we realized that, in a web context, if multiple users open connections to this API, it will spawn one background FFmpeg process per connection, which could exhaust memory (OOM) or CPU on our machine.
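One idea we are considering for the per-connection process problem: cache one shared Flux per device, so all viewers of the same camera share a single FFmpeg process. A generic sketch (class and method names are hypothetical; sourceFactory would be the process-backed Flux from the snippets above):

```java
import java.time.Duration;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import reactor.core.publisher.Flux;

public final class CameraStreamRegistry<T> {

    private final ConcurrentHashMap<Long, Flux<T>> streams = new ConcurrentHashMap<>();
    private final Function<Long, Flux<T>> sourceFactory; // e.g. the ffmpeg-backed Flux

    public CameraStreamRegistry(Function<Long, Flux<T>> sourceFactory) {
        this.sourceFactory = sourceFactory;
    }

    // All viewers of one device share a single upstream subscription
    // (one ffmpeg process). refCount(1, grace) starts the source on the
    // first subscriber and tears it down 5s after the last one leaves.
    public Flux<T> streamFor(Long deviceId) {
        return streams.computeIfAbsent(deviceId, id ->
                sourceFactory.apply(id)
                        .publish()
                        .refCount(1, Duration.ofSeconds(5)));
    }
}
```

One caveat with this approach: fragmented MP4 is stateful, so a viewer joining mid-stream misses the init segment (ftyp/moov) and may not be able to decode; the init segment would have to be cached and replayed to late subscribers, or a self-synchronizing container such as MPEG-TS used instead.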
Questions:
Is this the right approach for such a feature, or should the mobile app connect directly to the RTSP URL?
Are there any FFmpeg-friendly libraries for Java?