feat(*): introducing a unique streamer not dependent of the endpoint
ThibaultBee committed Jun 11, 2024
1 parent 0cbd458 commit 659cb11
Showing 262 changed files with 3,395 additions and 4,075 deletions.
129 changes: 100 additions & 29 deletions DEVELOPER_README.md
A class that represents an audio or video encoder. Only the Android MediaCodec API is supported.
`Endpoint`:
The last element of a live streaming pipeline. It is responsible for handling the frames after the
encoder.
The endpoint could be a remote server (RTMP, SRT,...) or a file (FLV, MPEG-TS,...).
The main endpoint is the `CompositeEndpoint`, which is composed of a `Muxer` and a `Sink`.

`Muxer`:
A process that packs audio and video frames into a container (FLV, MPEG-TS, MP4,...).
The `CompositeEndpoint` is composed of an `IMuxer`.

`Sink`:
A process that sends the container to a remote server (RTMP, SRT,...) or to a file.
The `CompositeEndpoint` is composed of an `ISink`.

`Streamer`:
A class that represents an audio and/or video live streaming pipeline. It manages sources, encoders,
muxers, endpoints,... and provides many tools. Streamers are the most important classes for users.

![streamer.png](https://github.com/ThibaultBee/StreamPack/blob/master/docs/assets/streamer.png)
Unless explicitly stated, the `Endpoint` is inferred from the `MediaDescriptor` object thanks to
the `DynamicEndpoint`.

`Streamer element`:
Could be a `Source`, `Encoder`, `Muxer`, or `Endpoint`. They implement `Streamable<T>` and might
expose a public interface to access specific information.

`Info`:
A class that provides a set of methods to help with `Streamer` configuration, such as supported
resolutions,... It comes with an instantiated `Streamer` object:

```kotlin
val info = streamer.getInfo(MediaDescriptor(`media uri`))
```

They might be different for each `Streamer` object. For example, a `FlvStreamer` object will not
have the same `Info` object as a `TsStreamer` object, because FLV does not support a wide range of
codecs, audio sample rates,...

`Settings`:
Each streamer element has a public interface that gives access to specific information or
configuration.
For example, the `VideoEncoder` object has a `bitrate` property that allows getting and setting the
current video bitrate.
Example:

```kotlin
val bitrate = streamer.videoEncoder!!.bitrate
streamer.videoEncoder!!.bitrate = 2000000
```

`Extensions`:
A library that adds new features from native libraries. It often comes with `Streamer elements`
and specific pipelines.

## Streamers

The streamer implementation is `DefaultStreamer`, from which all other streamers inherit. Two
specific base streamers inherit directly from it:

- `DefaultCameraStreamer`: A streamer that streams from a camera and microphone. It adds
  `startPreview` and `stopPreview` methods to the `Streamer` object, as well as camera settings.
- `DefaultScreenRecorderStreamer`: A streamer that streams from the phone screen and microphone. It
  adds specific methods for the screen recorder, such as an API to set the activity result.

The endpoint of a `Streamer` is inferred from the `MediaDescriptor` object passed to the `Streamer`
by the `open` or `startStream` methods.
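
For example, a minimal sketch (the URI is illustrative; `UriMediaDescriptor` and
`DefaultCameraStreamer` are used as in the README examples):

```kotlin
// The endpoint (SRT live, FLV file,...) is picked by the DynamicEndpoint from
// the descriptor URI; the streamer class itself does not change.
val streamer = DefaultCameraStreamer(context = requireContext())
// ...configure audio/video and start the preview first...
streamer.startStream(UriMediaDescriptor("srt://myserver.example:9998"))
```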

## Sources

There are 2 types of sources:
a `Surface`. Its purpose is to improve encoder performance. For example, it suits camera and
screen recorder. `Surface` sources implement `ISurfaceSource`.

@startuml
interface IVideoSource {

+ hasSurface: Boolean
+ encoderSurface: Surface?
+ getFrame(): ByteBuffer
}

interface IAudioSource {

+ getFrame(): ByteBuffer
}
@enduml

To create a new audio source, implement `IAudioSource`. It inherits from `IFrameSource`.
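
A minimal sketch of a custom audio source, assuming only the `getFrame()` member shown in the
diagram above (the class name and frame size are illustrative):

```kotlin
import java.nio.ByteBuffer

// Hypothetical example: an IAudioSource that produces silence.
// Real sources also have a configure/start/stop lifecycle, omitted here.
class SilenceAudioSource : IAudioSource {
    private val frameSize = 2048 // illustrative frame size, in bytes

    override fun getFrame(): ByteBuffer {
        // A freshly allocated buffer is zero-filled, which is PCM silence.
        return ByteBuffer.allocateDirect(frameSize)
    }
}
```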

To create a new video source, implement `IVideoSource`. It inherits from both `IFrameCapture`
If your video source is a `ByteBuffer` source, set:

## Encoders

The only encoder is based on the Android `MediaCodec` API, used in asynchronous mode. It implements
the `IEncoder` interface.

@startuml
interface IEncoder {
}
@enduml

## Endpoints

They implement the `IEndpoint` interface.

@startuml
interface IEndpoint {

+ open()
+ close()
+ write()
}

class CompositeEndpoint {

+ muxer: IMuxer
+ sink: ISink
}
@enduml

### Muxers

They implement the `IMuxer` interface.

### Sinks

They implement the `ISink` interface.
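
A hedged sketch of how the pieces fit together (the `FlvMuxer` and `FileSink` names and the
`CompositeEndpoint(muxer, sink)` constructor are assumptions based on the class diagram above):

```kotlin
// Hypothetical wiring of a CompositeEndpoint from its two parts.
val endpoint = CompositeEndpoint(
    FlvMuxer(), // an IMuxer: packs frames into an FLV container
    FileSink()  // an ISink: writes the container to a file
)
```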

### Streamers

They implement the `ICoroutineStreamer` interface.

@startuml
class DefaultStreamer {

+ videoSource: IVideoSource
+ audioSource: IAudioSource
+ endpoint: IEndpoint

- videoEncoder: IEncoder
- audioEncoder: IEncoder
}
@enduml
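
A hypothetical access sketch based on the diagram above (property names follow the diagram; check
the `DefaultStreamer` source for the exact API):

```kotlin
// The public members of a streamer, as sketched in the class diagram.
val streamer: ICoroutineStreamer = DefaultCameraStreamer(context = requireContext())
val videoSource = streamer.videoSource // IVideoSource
val audioSource = streamer.audioSource // IAudioSource
val endpoint = streamer.endpoint       // IEndpoint
```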

@startuml
rectangle DefaultCameraStreamer {
[CameraSource] as VideoSource
[MicrophoneSource] as AudioSource
[Encoder] as VideoEncoder
[Encoder] as AudioEncoder
[Endpoint] as Endpoint
VideoSource -r-> VideoEncoder
AudioSource -r-> AudioEncoder
VideoEncoder -r-> Endpoint
AudioEncoder -r-> Endpoint
AudioSource -d[hidden]-> VideoSource
AudioEncoder -d[hidden]-> VideoEncoder
}
@enduml
89 changes: 52 additions & 37 deletions README.md
dependencies {
implementation 'io.github.thibaultbee:streampack:2.6.0'
// For UI (incl. PreviewView)
implementation 'io.github.thibaultbee:streampack-ui:2.6.0'
// For ScreenRecorder service
implementation 'io.github.thibaultbee:streampack-services:2.6.0'
// For RTMP
implementation 'io.github.thibaultbee:streampack-extension-rtmp:2.6.0'
// For SRT
* Processing: Noise suppressor or echo cancellation
* Audio only mode
* Device audio capabilities
* File: TS, FLV, MP4, WebM and Fragmented MP4
* Write to a single file or multiple chunk files
* Streaming: RTMP/RTMPS or SRT
* Support for enhanced RTMP
If you want to create a new application, you should use the
template [StreamPack boilerplate](https://github.com/ThibaultBee/StreamPack-boilerplate). In 5
minutes, you will be able to stream live video to your server.

1. Request the required permissions in your Activity/Fragment.

2. Create a `SurfaceView` to display the camera preview in your layout

As a camera preview, you can use a `SurfaceView`, a `TextureView` or any
`View` that can provide a `Surface`.
To simplify integration, StreamPack provides a `PreviewView`.

`app:enableZoomOnPinch` is a boolean to enable zoom on pinch gesture.

3. Instantiate the streamer (the main live streaming class)

```kotlin
val streamer = DefaultCameraStreamer(context = requireContext())
```

4. Configure audio and video settings

```kotlin
val audioConfig = AudioConfig(
val videoConfig = VideoConfig(
streamer.configure(audioConfig, videoConfig)
```

5. Inflate the camera preview with the streamer

```kotlin
preview.streamer = streamer
streamer.startPreview(preview)
```

6. Start the live streaming

```kotlin
val descriptor =
    UriMediaDescriptor("rtmps://serverip:1935/s/streamKey") // For RTMP/RTMPS. The Uri also supports SRT urls, files, content paths,...
/**
 * Alternatively, you can use the object syntax:
 * - RtmpConnectionDescriptor("rtmps", "serverip", 1935, "s", "streamKey") // For RTMP/RTMPS
 * - SrtConnectionDescriptor("serverip", 1234) // For SRT
 */

streamer.startStream(descriptor)
```

7. Stop and release the streamer

```kotlin
streamer.stopStream()
streamer.close() // Disconnect from the server or close the file
streamer.stopPreview() // The StreamerSurfaceView automatically stops the preview
streamer.release()
```
You will also have to declare the `Service`,
```xml

<application>
<!-- YourScreenRecorderService extends DefaultScreenRecorderService -->
<service android:name=".services.YourScreenRecorderService" android:exported="false"
android:foregroundServiceType="mediaProjection" />
</application>
It is easy: if your server has SRT support, use SRT; otherwise, use RTMP.

### Streamers

Let's start with some definitions! `Streamers` are classes that represent a streaming pipeline:
capture, encode, mux and send.
They come in multiple flavours, with different audio and video sources. 3 types of base streamers
are available:

- `DefaultCameraStreamer`: for streaming from the camera
- `DefaultScreenRecorderStreamer`: for streaming from the screen
- `DefaultAudioOnlyStreamer`: for streaming audio only

Since 3.0.0, the endpoint of a `Streamer` is inferred from the `MediaDescriptor` object passed to
the `open` or `startStream` methods. It is possible to limit the possible endpoints by implementing
your own `DynamicEndpoint.Factory` or by passing an endpoint as the `Streamer` `endpoint` parameter.

To create a `Streamer` for a new source, create a new `Streamer` class that inherits from
`DefaultStreamer`. If a streamer is missing, you should definitely submit it in
a [pull request](https://github.com/ThibaultBee/StreamPack/pulls).

### Get device capabilities

Have you ever wondered: "What are the supported resolutions of my cameras?" or "What are the
supported sample rates of my audio codecs?" The `Info` classes are made for this. Every `Streamer`
comes with a specific `Info` object:

```kotlin
val info = streamer.getInfo(MediaDescriptor("rtmps://serverip:1935/s/streamKey"))
```

For a static endpoint or an opened dynamic endpoint, you can get the info directly:

```kotlin
val info = streamer.info
```

### Get extended settings

If you are looking for more settings on a streamer, like the exposure compensation of your camera,
you must have a look at the `Settings` classes. Each `Streamer` element (such as `VideoSource`,
`AudioSource`,...) comes with a public interface that gives access to specific information or
configuration.

```kotlin
(streamer.videoSource as IPublicCameraSource).settings
```

For example, if you want to change the exposure compensation of your camera on a `CameraStreamer`,
you can do it like this:

```kotlin
(streamer.videoSource as IPublicCameraSource).settings.exposure.compensation = value
```

Moreover, you can check the exposure range and step with:

```kotlin
(streamer.videoSource as IPublicCameraSource).settings.exposure.availableCompensationRange
(streamer.videoSource as IPublicCameraSource).settings.exposure.availableCompensationStep
```
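
For instance, a small hedged sketch (assuming the range is an `android.util.Range<Int>`, which this
document does not confirm):

```kotlin
// Hypothetical: clamp a requested compensation value into the supported range.
val settings = (streamer.videoSource as IPublicCameraSource).settings
val requested = 2 // illustrative value
val range = settings.exposure.availableCompensationRange
settings.exposure.compensation = requested.coerceIn(range.lower, range.upper)
```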

### Screen recorder Service

To record the screen, you have to use the `DefaultScreenRecorderStreamer` inside
an [Android Service](https://developer.android.com/guide/components/services). To simplify this
integration, StreamPack provides the `DefaultScreenRecorderService` class. Extend it
and override `onNotification` to customise the notification.
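
A minimal sketch of such a service (the no-argument constructor and the `onNotification` signature
below are assumptions; check the `DefaultScreenRecorderService` source for the exact API):

```kotlin
import android.app.Notification

// Hypothetical subclass: customise the foreground-service notification.
class YourScreenRecorderService : DefaultScreenRecorderService() {
    override fun onNotification(): Notification? {
        // Build and return your own notification here.
        return super.onNotification()
    }
}
```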

### Android SDK version
3 changes: 3 additions & 0 deletions core/build.gradle
dependencies {
testImplementation 'androidx.test:rules:1.5.0'
testImplementation 'junit:junit:4.13.2'
testImplementation 'io.mockk:mockk:1.12.2'
testImplementation 'org.jetbrains.kotlinx:kotlinx-coroutines-test:1.8.1'
testImplementation "org.robolectric:robolectric:4.12.2"

androidTestImplementation 'androidx.test:rules:1.5.0'
androidTestImplementation 'androidx.test.ext:junit:1.1.5'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.5.1'
androidTestImplementation 'org.jetbrains.kotlinx:kotlinx-coroutines-test:1.8.1'
}
