Develop #17

Merged
merged 4 commits on Nov 21, 2023
@@ -126,7 +126,12 @@ internal void Update()

internal void SetTrack(TrackType type, MediaStreamTrack mediaStreamTrack, out IStreamTrack streamTrack)
{
Debug.LogWarning($"[Participant] Local: {IsLocalParticipant} Session ID: {SessionId} set track of type {type}");

#if STREAM_DEBUG_ENABLED
Debug.LogWarning(
$"[Participant] Local: {IsLocalParticipant} Session ID: {SessionId} set track of type {type}");
#endif

switch (type)
{
case TrackType.Unspecified:
@@ -158,7 +163,7 @@ internal void SetTrackEnabled(TrackType type, bool enabled)
// Not an error, sometimes we receive tracks info from the API before webRTC triggers onTrack event
return;
}

streamTrack.SetEnabled(enabled);

//StreamTodo: we should trigger some event that track status changed
@@ -187,7 +192,7 @@ protected override string InternalUniqueId
private readonly List<string> _roles = new List<string>();

#endregion

private BaseStreamTrack GetStreamTrack(TrackType type)
{
switch (type)
102 changes: 82 additions & 20 deletions docusaurus/docs/Unity/02-tutorials/02-audio-room.mdx
@@ -195,10 +195,12 @@ Next, we'll create the **UI Manager** script to reference all UI elements and ha
2. Open this script in your code editor and paste the following content:
```csharp
using System;
using System.Linq;
using TMPro;
using System.Collections.Generic;
using System.Threading.Tasks;
using StreamVideo.Core;
using StreamVideo.Core.StatefulModels;
using StreamVideo.Libs.Auth;
using UnityEngine;
using UnityEngine.UI;

public class AudioRoomsUI : MonoBehaviour
{
@@ -355,7 +357,7 @@ private async void OnLeaveButtonClicked()
}
```

### Step 6 - Capture **Microphone** Input
### Step 6 - Capture Microphone Input

Now, let's add sending audio input captured from our microphone device.

@@ -376,7 +378,6 @@ private void SetActiveMicrophone(int optionIndex)
_microphoneAudioSource.clip
= Microphone.Start(_activeMicrophoneDevice, true, 3, AudioSettings.outputSampleRate);
_microphoneAudioSource.loop = true;
_microphoneAudioSource.volume = 0; // Set volume to 0 so we don't hear back our microphone
_microphoneAudioSource.Play();
}
```
@@ -399,7 +400,7 @@ if (_activeMicrophoneDevice != null)
}
```

Next, we use Unity's `Microphone.Start` method to capture the audio from the microphone device and stream it into the `_microphoneAudioSource` AudioSource component. We'll later set the `microphoneAudioSource` component as an audio input source.
We use Unity's `Microphone.Start` method to capture the audio from the microphone device and stream it into the `_microphoneAudioSource` AudioSource component. We'll later set the `microphoneAudioSource` component as an audio input source.
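
`Microphone.Start(deviceName, loop, lengthSec, frequency)` returns an `AudioClip` that the tutorial assigns to the AudioSource's `clip`. If you later switch devices at runtime, the previous capture also needs to be stopped; the body of the `if (_activeMicrophoneDevice != null)` guard is collapsed in this diff, so the snippet below is only a minimal sketch of that idea using Unity's built-in `Microphone` API and the tutorial's existing fields:

```csharp
// Sketch only: stop a previous capture before starting a new one when switching devices.
// _activeMicrophoneDevice and _microphoneAudioSource are the fields used elsewhere in this tutorial.
if (_activeMicrophoneDevice != null && Microphone.IsRecording(_activeMicrophoneDevice))
{
    _microphoneAudioSource.Stop();           // stop playing the old microphone clip
    Microphone.End(_activeMicrophoneDevice);  // release the previously used recording device
}
```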

:::note

@@ -427,18 +428,68 @@ The code above:
2. Passes the `AudioSource` to the `_audioRoomsManager.SetInputAudioSource` method. We'll create this method in the next step.
3. Calls `SetActiveMicrophone(0)` to capture audio from the first microphone in the dropdown (see the sketch below).
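
The full `InitMicrophone` body is collapsed in this diff, so the following is only an illustrative reconstruction matching the list above (item 1 is collapsed and is assumed here to populate the dropdown with the available devices). It is not guaranteed to be the exact tutorial code, and it assumes `_microphoneDropdown` exposes `ClearOptions`/`AddOptions`, which is true for both `Dropdown` and `TMP_Dropdown`:

```csharp
// Illustrative sketch of InitMicrophone, matching the steps listed above.
private void InitMicrophone()
{
    // 1. Fill the dropdown with the microphone devices available on this machine
    _microphoneDropdown.ClearOptions();
    _microphoneDropdown.AddOptions(Microphone.devices.ToList());

    // 2. Pass the AudioSource that carries the captured audio to the manager
    _audioRoomsManager.SetInputAudioSource(_microphoneAudioSource);

    // 3. Start capturing from the first microphone in the dropdown
    SetActiveMicrophone(0);
}
```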

Next, open the `AudioRoomsManager.cs` class and add the following method:
Next, add these lines to the end of the `Awake` method in the `AudioRoomsUI` class:
```csharp
// SetActiveMicrophone will be called when a microphone is picked in the dropdown menu
_microphoneDropdown.onValueChanged.AddListener(SetActiveMicrophone);

InitMicrophone();
```

Next, open the `AudioRoomsManager.cs` class in the code editor.

First, add this field anywhere in the class:
```csharp
private AudioSource _inputAudioSource;
```

Next, add this method:
```csharp
public void SetInputAudioSource(AudioSource audioSource)
{
_inputAudioSource = audioSource;

// If client is already created, update the audio input source
if (_client != null)
{
_client.SetAudioInputSource(_inputAudioSource);
}
}
```
And the following field to the fields section:

And finally, replace the `Start` method with the following code:
```csharp
private AudioSource _inputAudioSource;
private async void Start()
{
// Create Client instance
_client = StreamVideoClient.CreateDefaultClient();

var credentials = new AuthCredentials(_apiKey, _userId, _userToken);

try
{
// Connect user to Stream server
await _client.ConnectUserAsync(credentials);
Debug.Log($"User `{_userId}` is connected to Stream server");

// highlight-start
// Set audio input source
if (_inputAudioSource != null)
{
_client.SetAudioInputSource(_inputAudioSource);
}
// highlight-end
}
catch (Exception e)
{
// Log potential issues that occurred while trying to connect
Debug.LogException(e);
}
}
```

The only change is the highlighted part above: it sets the **Audio Input Source** in case `SetInputAudioSource` was called before the client was created.

### Step 7 - Create UI scene objects

Go to the scene Hierarchy window and create a `Canvas` game object. One way to do this is by selecting `GameObject -> UI -> Canvas` from the top menu.
@@ -611,20 +662,20 @@ private void OnTrackAdded(IStreamVideoCallParticipant participant, IStreamTrack

Next, in the `OnDestroy` method (automatically called by Unity), we unsubscribe from the `TrackAdded` event:
```csharp
private void OnDestroy()
{
// It's a good practice to always unsubscribe from events
_participant.TrackAdded -= OnTrackAdded;
}
private void OnDestroy()
{
// It's a good practice to always unsubscribe from events
_participant.TrackAdded -= OnTrackAdded;
}
```

And lastly, we've defined two fields to keep the `AudioSource` and the participant references:
```csharp
// This AudioSource will play the audio received from the participant
private AudioSource _audioOutputAudioSource;
// This AudioSource will play the audio received from the participant
private AudioSource _audioOutputAudioSource;

// Keep reference so we can unsubscribe from events in OnDestroy
private IStreamVideoCallParticipant _participant;
// Keep reference so we can unsubscribe from events in OnDestroy
private IStreamVideoCallParticipant _participant;
```

---
@@ -668,8 +719,11 @@ private void OnParticipantLeft(string sessionId, string userid)

var audioCallParticipant = _callParticipantBySessionId[sessionId];

// Remember to destroy the game object and not the component only
// Destroy the game object representing a participant
Destroy(audioCallParticipant.gameObject);

// Remove entry from the dictionary
_callParticipantBySessionId.Remove(sessionId);
}
```

@@ -712,14 +766,22 @@ public async Task LeaveCallAsync()
_activeCall.ParticipantLeft -= OnParticipantLeft;

await _activeCall.LeaveAsync();

// Destroy all call participants objects
foreach (var participant in _callParticipantBySessionId.Values)
{
Destroy(participant.gameObject);
}

_callParticipantBySessionId.Clear();
}
```

We've extended the previous implementation with unsubscribing from the `ParticipantJoined` and `ParticipantLeft` events.
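
For completeness, the UI-side leave-button handler (`OnLeaveButtonClicked`, collapsed earlier in this diff) is what ends up calling this method. A minimal sketch of such a call site, assuming the `_audioRoomsManager` field from the `AudioRoomsUI` class, could look like this; it is not the exact tutorial code:

```csharp
// Sketch of the UI-side call site for LeaveCallAsync (illustrative only).
private async void OnLeaveButtonClicked()
{
    try
    {
        await _audioRoomsManager.LeaveCallAsync();
        Debug.Log("Left the audio room");
    }
    catch (Exception e)
    {
        // Surface any errors thrown while leaving the call
        Debug.LogException(e);
    }
}
```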

### Step 9 - Test

We're now ready to test the app! To test it, we need two instances of the app. Each running instance of the app will use a microphone and a speaker. This may result in a permission conflict if multiple applications attempt to use the microphone and camera simultaneously. Therefore, a good way to test it is by using your PC and a smartphone. Depending on what type of device you have, you can follow the [Android](../04-platforms/02-android.mdx) or [IOS](../04-platforms/03-ios.mdx) sections to learn how to deploy the app on those devices.
We're now ready to test the app! To test it, we need two instances of the app. Each running instance of the app will use a microphone and a speaker. This may result in a permission conflict if multiple applications attempt to use the microphone and camera simultaneously. Therefore, a good way to test it is by using your PC and a smartphone. Depending on what type of device you have, you can follow the [Android](../04-platforms/02-android.mdx) or [IOS](../04-platforms/03-ios.mdx) sections to learn how to build the app for mobile platforms.

Once you launch the app on two separate devices, provide the same **call Id** on both devices to join the same audio call.

8 changes: 4 additions & 4 deletions docusaurus/docs/Unity/04-platforms/02-android.mdx
@@ -3,11 +3,11 @@ title: Android
description: Building for Android platform
---

This page describes additional steps required when building to **Android** platform.
This page describes additional steps required when building for the **Android** platform.

#### Prerequisites
- Install Android Module in Unity HUB (if needed)
- Switch the project to Android platform (File -> Build Settings -> Platform)
- Install Android Module in Unity HUB
- Change the target platform to **Android** in Unity Editor (File -> Build Settings -> Platform)

You can check [Unity Docs](https://learn.unity.com/tutorial/publishing-for-android) for detailed information on setting up the project for Android development. Please note that setting up the **Keystore Manager** is only required when you intend to publish the app; you can skip this step when testing the app on your local device only.

@@ -18,4 +18,4 @@ Go to "File -> Build Settings -> Player Settings"
- Set **Require** for the **Internet Access** option
- Set the **Android API Level** to **23 or higher** (Other Settings -> Identification -> Minimum API Level)

Stream's Video SDK uses Unity's webRTC library internally and therefore follows the same requirements and limitations. For more information please refer to Unity's [webRTC package documentation](https://docs.unity3d.com/Packages/[email protected]/manual/requirements.html).
Stream's **Video & Audio SDK** internally uses Unity's webRTC library and is therefore subject to the same requirements and limitations. For more information, please refer to Unity's [webRTC package documentation](https://docs.unity3d.com/Packages/[email protected]/manual/requirements.html).
15 changes: 9 additions & 6 deletions docusaurus/docs/Unity/04-platforms/03-ios.mdx
@@ -3,13 +3,16 @@ title: IOS
description: Building for IOS platform
---

This page describes additional steps required when building to **IOS** platform.
This page describes additional steps required when building for the **iOS** platform.

#### Prerequisites
- Install IOS Module in Unity HUB (if needed)
- Switch the project to IOS platform (File -> Build Settings -> Platform)
- Install IOS Module in Unity HUB
- Change the target platform to **iOS** in Unity Editor (File -> Build Settings -> Platform)

#### Requirement for IOS Build
- disable the **bitcode** option in **Xcode project** exported from Unity. On the Xcode **Build Settings** tab, in the **Build Options** group, set **Enable Bitcode** to **No**.
#### Requirements for IOS Build
1. Set the usage descriptions for the **Camera** and **Microphone**. These settings appear after you change the target platform to **iOS** (Build Settings -> Player Settings -> Other Settings).
2. Open the **Xcode project** generated by Unity when you build your project for the **iOS** platform. In the Xcode editor, go to the **Build Settings** tab and, in the **Build Options** group, set **Enable Bitcode** to **No**.

Stream's Video SDK uses Unity's webRTC library internally and therefore follows the same requirements and limitations. For more information please refer to Unity's [webRTC package documentation](https://docs.unity3d.com/Packages/[email protected]/manual/requirements.html).
![Camera and Microphone usage description settings](../assets/platforms/set_camera_and_microphone_usage_desc.png)

Stream's **Video & Audio SDK** internally uses Unity's webRTC library and is therefore subject to the same requirements and limitations. For more information, please refer to Unity's [webRTC package documentation](https://docs.unity3d.com/Packages/[email protected]/manual/requirements.html).