Web technology sharing | WebRTC media stream recording

anyRTC 2021-09-15 09:55:15

Audio and video recording can be done either on the server side or on the client side. This article focuses on how to implement client-side recording with WebRTC.
Because WebRTC ultimately saves the recorded audio and video data as a multimedia file through a Blob object, let's first review what a Blob is:

  • A Blob object represents an immutable, file-like object of raw data. Its data can be read as text or binary, or converted into a ReadableStream for streaming operations.

  • A Blob does not necessarily hold data in a JavaScript-native format. The File interface is based on Blob, inheriting its behavior and extending it to support files on the user's system.

  • To construct a Blob from other, non-Blob objects and data, use the Blob() constructor. To create a Blob containing a subset of another Blob's data, use the slice() method. To obtain a Blob for a file on the user's file system, see the File documentation. (A short sketch of these operations follows this list.)

  • APIs that accept Blob objects are also listed in the File documentation.
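
A minimal sketch of these Blob basics, using an illustrative string rather than recorded media:

// Construct a Blob from non-Blob data (a plain string here)
const textBlob = new Blob(['hello webrtc'], { type: 'text/plain' });
console.log(textBlob.size, textBlob.type); // 12 'text/plain'

// slice() returns a new Blob containing a subset of the data
const firstWord = textBlob.slice(0, 5, 'text/plain');

// Blob contents can be read asynchronously, e.g. as text
firstWord.text().then((text) => console.log(text)); // 'hello'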

Now that we understand the characteristics of a Blob, let's get down to business.

Get elements and declare variables

let mediaRecorder;
let recordedBlobs;
const recordedVideo = document.querySelector('video#recorded');
const recordButton = document.querySelector('button#record');
const playButton = document.querySelector('button#play');
const downloadButton = document.querySelector('button#download');

Bind click events

  • The URL.createObjectURL() static method creates a DOMString containing a URL that represents the object passed as a parameter. The lifetime of this URL is tied to the document of the window in which it was created. The new URL object represents the specified File or Blob object.

  • The URL.revokeObjectURL() static method releases an existing URL object previously created by URL.createObjectURL(). When you are finished using such a URL, call this method so the browser knows it no longer needs to keep a reference to the underlying data in memory. You can call revokeObjectURL() at any time after sourceopen has been handled, because createObjectURL() merely associates a media element's src attribute with the underlying MediaSource or Blob object; calling revokeObjectURL() lets the platform garbage-collect that object at an appropriate time.

// Toggle recording on the same button
recordButton.addEventListener('click', () => {
  if (recordButton.textContent === 'Start Recording') {
    startRecording();
  } else {
    stopRecording();
    recordButton.textContent = 'Start Recording';
    playButton.disabled = false;
    downloadButton.disabled = false;
  }
});

// Play back the recorded chunks by wrapping them in a single Blob
playButton.addEventListener('click', () => {
  const superBuffer = new Blob(recordedBlobs);
  recordedVideo.src = null;
  recordedVideo.srcObject = null;
  recordedVideo.src = window.URL.createObjectURL(superBuffer);
  recordedVideo.controls = true;
  recordedVideo.play();
});

// Download the recording as a WebM file via a temporary object URL
downloadButton.addEventListener('click', () => {
  const blob = new Blob(recordedBlobs, {
    type: 'video/webm'
  });
  const url = window.URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.style.display = 'none';
  a.href = url;
  a.download = 'test.webm';
  document.body.appendChild(a);
  a.click();
  setTimeout(() => {
    document.body.removeChild(a);
    window.URL.revokeObjectURL(url); // release the object URL once the download has started
  }, 100);
});

// Open the camera/microphone with the requested constraints
document.querySelector('button#start').addEventListener('click', async () => {
  const constraints = {
    audio: {},
    video: {
      width: 1280,
      height: 720
    }
  };
  console.log('Using media constraints:', constraints);
  await init(constraints);
});
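
The download handler above assumes the recording is WebM, but the container and codecs a browser can actually record vary. A minimal sketch for picking a supported MIME type with MediaRecorder.isTypeSupported() (the pickSupportedMimeType name and the preferredTypes list are illustrative, not part of the original demo):

// Return the first MIME type this browser can record, or '' to let the browser decide
function pickSupportedMimeType() {
  const preferredTypes = [
    'video/webm;codecs=vp9,opus',
    'video/webm;codecs=vp8,opus',
    'video/webm'
  ];
  return preferredTypes.find((type) => MediaRecorder.isTypeSupported(type)) || '';
}

// Usage sketch: pass the same type to the recorder and to the download Blob
// const mimeType = pickSupportedMimeType();
// mediaRecorder = new MediaRecorder(window.stream, mimeType ? { mimeType } : {});
// const blob = new Blob(recordedBlobs, { type: mimeType || 'video/webm' });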

Click Start camera to initialize

  • MediaDevices.getUserMedia() prompts the user for permission to use media input, which produces a MediaStream containing tracks of the requested media types. The stream can include a video track (from a hardware or virtual video source such as a camera, a video capture device, or a screen-sharing service), an audio track (likewise from a hardware or virtual audio source such as a microphone or an A/D converter), and possibly other kinds of tracks.

  • It returns a Promise that resolves with a MediaStream object on success. If the user denies permission, or the requested media source is unavailable, the Promise rejects with a PermissionDeniedError or NotFoundError.

async function init(constraints) {
  try {
    const stream = await navigator.mediaDevices.getUserMedia(constraints);
    handleSuccess(stream);
  } catch (e) {
    console.error('navigator.getUserMedia error:', e);
  }
}

function handleSuccess(stream) {
  recordButton.disabled = false;
  console.log('getUserMedia() got stream:', stream);
  window.stream = stream; // keep a global reference for the recorder
  const gumVideo = document.querySelector('video#gum');
  gumVideo.srcObject = stream; // live preview of the captured stream
}
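
If you want to surface why getUserMedia() failed, you can branch on the error name in the catch block. A minimal sketch (the initWithErrorHints name is illustrative; modern browsers usually report NotAllowedError for a denied permission, alongside the older PermissionDeniedError name mentioned above):

async function initWithErrorHints(constraints) {
  try {
    const stream = await navigator.mediaDevices.getUserMedia(constraints);
    handleSuccess(stream);
  } catch (e) {
    if (e.name === 'NotAllowedError' || e.name === 'PermissionDeniedError') {
      console.error('Permission to use the camera/microphone was denied.');
    } else if (e.name === 'NotFoundError') {
      console.error('No device matching the requested constraints was found.');
    } else {
      console.error('navigator.getUserMedia error:', e);
    }
  }
}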

Start recording method

function startRecording() {
  recordedBlobs = [];
  try {
    mediaRecorder = new MediaRecorder(window.stream);
  } catch (e) {
    console.error('Exception while creating MediaRecorder:', e);
    return;
  }
  recordButton.textContent = 'Stop Recording';
  playButton.disabled = true;
  downloadButton.disabled = true;
  mediaRecorder.onstop = (event) => {
    console.log('Recorder stopped: ', event);
    console.log('Recorded Blobs: ', recordedBlobs);
  };
  mediaRecorder.ondataavailable = handleDataAvailable;
  // start() without a timeslice delivers the data in a single chunk when stop() is called;
  // pass e.g. start(1000) to receive a chunk roughly every second instead.
  mediaRecorder.start();
}

// Collect each recorded chunk as it becomes available
function handleDataAvailable(event) {
  if (event.data && event.data.size > 0) {
    recordedBlobs.push(event.data);
  }
}
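
The record button handler earlier also calls stopRecording(), which the original snippet does not show. A minimal sketch, reusing the mediaRecorder variable from startRecording():

function stopRecording() {
  // Triggers a final dataavailable event, then the onstop handler registered above
  mediaRecorder.stop();
}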

HTML

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta id="theme-color" name="theme-color" content="#ffffff">
  <link rel="stylesheet" href="./index.css">
</head>
<body>
  <div id="container">
    <video id="gum" playsinline autoplay muted></video>
    <video id="recorded" playsinline loop></video>
    <div>
      <button id="start">Start camera</button>
      <button id="record" disabled>Start Recording</button>
      <button id="play" disabled>Play</button>
      <button id="download" disabled>Download</button>
    </div>
  </div>
  <script src="./main.js" async></script>
</body>
</html>

CSS

button {
  background-color: #d84a38;
  border: none;
  border-radius: 2px;
  color: white;
  font-family: 'Roboto', sans-serif;
  font-size: 0.8em;
  margin: 0 0 1em 0;
  padding: 0.5em 0.7em 0.6em 0.7em;
}

button:active {
  background-color: #cf402f;
}

button:hover {
  background-color: #cf402f;
}

button[disabled] {
  color: #ccc;
}

button[disabled]:hover {
  background-color: #d84a38;
}

div#container {
  margin: 0 auto 0 auto;
  max-width: 60em;
  padding: 1em 1.5em 1.3em 1.5em;
}

video {
  background: #222;
  margin: 0 0 20px 0;
  --width: 100%;
  width: var(--width);
  height: calc(var(--width) * 0.75);
}

WebRTC is very powerful, and there is still a great deal to study about WebRTC and live streaming. The code above is the example code for the demo; interested readers can try it out for themselves.

Please include a link to the original article when reprinting. Thanks.