
Migrating SDK Version 1 or Version 2 to Version 3


This article outlines the migration process for developers who are moving their Webex Web SDK applications from version 1 or 2 to version 3. It covers the most important changes to the SDK to help existing developers make their applications compatible with SDK version 3. Note that the migration process is the same whether you are moving from version 1 or from version 2.

Media Handling Changes

In version 3, Tracks are replaced by Streams.

2.x

Media handling within the meeting object is done using the getMediaStreams() method. You pass audio and video device inputs along with their respective media settings, and MediaStream objects are returned:

let currentMediaStreams = [];

const mediaSettings = {
  sendAudio: true,
  sendVideo: true,
  sendShare: false,
  receiveVideo: true,
  receiveAudio: true,
  receiveShare: true,
};

meeting.getMediaStreams(mediaSettings, audioVideoInputDevices)
  .then(([localStream, localShare]) => {
    /*
     * If you only update a particular stream, the other streams return as undefined.
     * We default back to the previous streams in this case.
     */
    currentMediaStreams = [localStream, localShare];

    return currentMediaStreams;
  })
  .then(([localStream]) => {
    if (localStream && mediaSettings.sendVideo) {
      const meetingStreamsLocalVideo = new MediaStream(localStream.getVideoTracks());
      const meetingStreamsLocalAudio = new MediaStream(localStream.getAudioTracks());
    }

    return {localStream};
  });

After acquiring the media streams from the respective devices, join the meeting and add the media.

await meeting.join();

await meeting.addMedia({
  localShare,
  localStream,
  mediaSettings,
});

To stop media on a track, use the Track object's stop() method:

audioTrack.stop();
videoTrack.stop();

3.x

In Web SDK 3.x, we've introduced the Stream class. You can create a microphone, camera, or display stream using the mediaHelpers object contained within the webex.meetings object:

// Construct the constraints objects for audio and video. All fields are
// optional; deviceId values come from navigator.mediaDevices.enumerateDevices().

const videoConstraints = {
  deviceId: videoDeviceId,  // ConstrainDOMString
  width: 640,               // ConstrainULong
  height: 480,              // ConstrainULong
  aspectRatio: 4 / 3,       // ConstrainDouble
  frameRate: 30,            // ConstrainDouble
  facingMode: 'user',       // ConstrainDOMString
};

const audioConstraints = {
  deviceId: audioDeviceId,  // string
  autoGainControl: true,    // boolean
  echoCancellation: true,   // boolean
  noiseSuppression: true,   // boolean
};

const audioStream = await webex.meetings.mediaHelpers.createMicrophoneStream(audioConstraints);
meetingStreamsLocalAudio.srcObject = audioStream;

const videoStream = await webex.meetings.mediaHelpers.createCameraStream(videoConstraints);
meetingStreamsLocalVideo.srcObject = videoStream;

// Create the display stream to share your screen, window, or tab.

const [localShareVideoStream, localShareAudioStream] = await webex.meetings.mediaHelpers.createDisplayStreamWithAudio();

Once you've created the above streams, you can add them to the meeting as shown below.

// Join the meeting

await meeting.join();

// Setup media options and add media
const addMediaOptions = {
  localStreams: {
    microphone: audioStream,
    camera: videoStream,
    screenShare: {
      video: localShareVideoStream,
      audio: localShareAudioStream,
    },
  },
  allowMediaInLobby: true,
};

await meeting.addMedia(addMediaOptions);

Once media has been added to the meeting, you must use the unpublishStreams() and publishStreams() methods on the meeting object to change the media.

To remove media from a meeting object, use the unpublishStreams() method:

await meeting.unpublishStreams([
  audioStream,
  videoStream
]);

To republish new or existing streams into the meeting object, use the publishStreams() method:

await meeting.publishStreams({microphone: audioStream, camera: videoStream});

You can also use the publishStreams() and unpublishStreams() methods to start and stop sharing the screen during the meeting.
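For example, here's a minimal sketch of toggling a screen share that reuses the display streams created earlier; the exact shape of the publishStreams() options should be verified against the SDK reference for your version.

// Start sharing: publish the display streams captured earlier.
await meeting.publishStreams({
  screenShare: {
    video: localShareVideoStream,
    audio: localShareAudioStream,
  },
});

// Stop sharing: unpublish the same streams.
await meeting.unpublishStreams([localShareVideoStream, localShareAudioStream]);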

To stop the media on a stream, use the Stream object's stop() method:

audioStream.stop();

videoStream.stop();

Media Event Payload Changes

2.x

In the version 2 SDK, the media:ready and media:stop events on the meeting object return a payload containing the following media types:

  • remoteVideo
  • remoteAudio
  • remoteShare
  • localShare

The second part of the payload includes the associated MediaStream object.
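For example, a typical v2 handler attaches the stream from the payload to a media element. This is a sketch; remoteVideoElement is a hypothetical video element in your page, not part of the SDK.

meeting.on('media:ready', (media) => {
  // media.type is one of the four types listed above
  if (media.type === 'remoteVideo') {
    remoteVideoElement.srcObject = media.stream;
  }
});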

3.x

In the version 3 SDK, the localShare media type is no longer passed in the media event payload.
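A corresponding v3 handler therefore only deals with the remote media types. As above, this is a sketch; remoteShareElement is a hypothetical media element in your page.

meeting.on('media:ready', (media) => {
  // 'localShare' no longer arrives here; handle only the remote types
  if (media.type === 'remoteShare') {
    remoteShareElement.srcObject = media.stream;
  }
});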

Media Effect Upgrades and Changes

2.x

In version 2 of the SDK, the meeting object exposes Background Noise Removal (BNR) functionality to developers. Call the following methods to enable or disable it after adding audio tracks to the meeting:

meeting.enableBNR();
meeting.disableBNR();

3.x

In version 3 of the SDK, developers have access to additional media effects:

  • Background Noise Removal
  • Custom Virtual Background
  • Blur

All the above effects are now exposed as their own classes via the SDK. These effects are applied to the Stream objects created at the start of the meeting flow.
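For example, here's a minimal sketch of applying noise removal to the microphone stream and blur to the camera stream. The factory and stream method names (createNoiseReductionEffect, createVirtualBackgroundEffect, addEffect) are our reading of the v3 effects API; verify them against the SDK reference for your version.

// Apply Background Noise Removal to the microphone stream.
const bnrEffect = await webex.meetings.createNoiseReductionEffect();
await audioStream.addEffect(bnrEffect);
await bnrEffect.enable();

// Apply a blur effect to the camera stream.
const blurEffect = await webex.meetings.createVirtualBackgroundEffect({ mode: 'BLUR' });
await videoStream.addEffect(blurEffect);
await blurEffect.enable();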

Stream Mute Options

Some devices have a way to mute/unmute their microphone/camera via a physical button. Once muted in this way, there is no way to programmatically unmute the device; the user must physically press the button again to unmute. For example, some Chromebooks have a physical button on the keyboard to mute/unmute the microphone.

To differentiate between muting programmatically and muting via a physical button, the following changes have been made to the APIs related to LocalStream and RemoteStream:

Note: This breaking change applies to all SDK versions from [email protected] onward.

LocalTrack's setMuted method

2.x

Use this method to programmatically mute a LocalTrack in the v2 SDK:

LocalTrack.setMuted(true);

3.x

In the v3 SDK, this method is renamed as follows:

LocalStream.setUserMuted(true);

LocalTrack's mute-state-change event

2.x

This event is triggered when the mute state of a local track changes programmatically:

LocalTrack.on('mute-state-change', () => {
  // update the UI
});

3.x

The mute-state-change event is replaced and split into two:

**user-mute-state-change**: fired when the mute state changes programmatically

LocalStream.on('user-mute-state-change', (isUserMuted) => {
  // triggered when the user clicks the mute button in the app
});

**system-mute-state-change**: fired when the mute state changes through a hardware button

LocalStream.on('system-mute-state-change', (isSystemMuted) => {
  // triggered when the user mutes via a hardware button
  // Note: unmuting programmatically won't work while isSystemMuted is true.
  // Use this to disable the unmute button and show a note telling the user
  // to unmute their hardware.
});

LocalTrack's muted getter

This getter remains unchanged, except that in 3.x it returns true in either case (programmatically muted or hardware muted).

3.x

It also introduces two new getters, shown in the sketch below:

  • LocalStream.userMuted - true when programmatically muted
  • LocalStream.systemMuted - true when muted via hardware
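
As an illustration, here's a minimal sketch of driving a mute button from these getters; muteButton is a hypothetical UI element, not part of the SDK.

function refreshMuteButton(localStream) {
  // `muted` is true when the stream is muted either way (user or system).
  muteButton.textContent = localStream.muted ? 'Unmute' : 'Mute';

  // A system mute can't be cleared in software, so disable the button.
  muteButton.disabled = localStream.systemMuted;
}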

RemoteTrack's muted getter

2.x

Use this getter to determine whether a remote track is muted:

const isRemoteMuted = remoteTrack.muted;

3.x

The muted getter is replaced by the mediaState getter, which can be used as follows:

const isRemoteMuted = remoteStream.mediaState !== 'started';

In other words, the remoteStream.mediaState getter returns 'started' when the stream is unmuted and 'stopped' when it is muted.

RemoteTrack's mute-state-change event

2.x

Subscribe to this event to be notified when the remote track's mute state changes:

RemoteTrack.on('mute-state-change', (isMuted) => {
  // update the UI based on isMuted
});

3.x

The existing mute-state-change event is replaced with the media-state-change event, which can be used as follows:

RemoteStream.on('media-state-change', (payload) => {
  // payload can be either 'started' or 'stopped'
  // started indicates unmuted
  // stopped indicates muted
});