Himanshu Tripathi

Screencast with Angular to Spring Boot


In this article we will explore how to stream a screen capture from an Angular frontend to a Spring Boot server and save it as a video file. I recently used this approach in one of my projects. This is part 1 of a multi-part series; since I am a novice programmer, there is still room for improvement.
Let us begin.

Setting up the Angular frontend for screen capture

Let's begin with the basic functionality of capturing the screen and then downloading the captured video. This will be a simple Angular project; let us name it Screen-Capture. Run the command below to create the project in your desired folder.

ng new Screen-Capture

Now open the folder in your trusted editor and open app.component.ts to add the code that captures the screen stream.

const mediaDevices = navigator.mediaDevices as any;
declare var MediaRecorder: any;

These lines cast the browser's mediaDevices API to any (so TypeScript accepts getDisplayMedia) and declare the browser-global MediaRecorder variable so it can be used without type definitions.
Let us add the startRecording function:

1   async startRecording() {
2     var options;
3 
4     if (MediaRecorder.isTypeSupported('video/webm;codecs=vp9')) {
5       options = {
6         videoBitsPerSecond: 2500000,
7         mimeType: 'video/webm; codecs=vp9',
8       };
9     } else if (MediaRecorder.isTypeSupported('video/webm;codecs=vp8')) {
10       options = {
11         videoBitsPerSecond: 2500000,
12         mimeType: 'video/webm; codecs=vp8',
13       };
14     } else {
15       options = { videoBitsPerSecond: 2500000, mimeType: 'video/webm' };
16     }
17 
18     try {
19       this.stream = await mediaDevices.getDisplayMedia({
20         video: {
21           width: { ideal: 1280, max: 1920 },
22           height: { ideal: 720, max: 1080 },
23           frameRate: { ideal: 20, max: 25 },
24         },
25       });
26     } catch (err) {
27       alert('No devices found for recording.');
28     }
29     this.recorder = new MediaRecorder(this.stream, options);
30     let metadata: any;
31 
32     this.frameCount = 0;
33 
34     this.recorder.ondataavailable = (e: { data: any }) => {
35       this.blobarr.push(e.data);
36       this.frameCount += 1;
37     };
38 
39     this.recorder.addEventListener('stop', () => {
40       this.stream.getTracks().forEach(function (track: any) {
41         track.stop();
42       });
43       this.isRecording = false;
44     });
45 
46     this.recorder.start(500);
47   }

This function is the most important one, as it does all the heavy lifting in our application. Inside the function, lines 4-16 query the browser through MediaRecorder for the best supported codec to encode the video: VP9 gets the highest preference because it performs well on newer hardware, VP8 comes next, and plain video/webm (letting the browser pick its default codec) is the fallback. It is also good practice to limit the bitrate; here it is set to 2500000 bits/sec, i.e. 2.5 Mbit/s, which works out to roughly 18-19 MB per minute of recording.
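
If you want to check which of these codecs your own browser will pick, a quick probe in the browser console is enough. This is just a small sketch using the standard MediaRecorder.isTypeSupported check, not part of the component code:

['video/webm;codecs=vp9', 'video/webm;codecs=vp8', 'video/webm'].forEach((mimeType) => {
  // true means the browser can record with this MIME type / codec combination
  console.log(mimeType, MediaRecorder.isTypeSupported(mimeType));
});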

Lines 19-24
19       this.stream = await mediaDevices.getDisplayMedia({
20         video: {
21           width: { ideal: 1280, max: 1920 },
22           height: { ideal: 720, max: 1080 },
23           frameRate: { ideal: 20, max: 25 },
24         }, 

Lines 19-24 get a handle to the screen stream; width, height, and framerate preferences are also set in these lines. The call is wrapped in a try/catch block so that errors, for example the user cancelling the screen-picker dialog, can be handled.

Lines 29-37
29     this.recorder = new MediaRecorder(this.stream, options);
30     let metadata: any;
31 
32     this.frameCount = 0;
33 
34     this.recorder.ondataavailable = (e: { data: any }) => {
35       this.blobarr.push(e.data);
36       this.frameCount += 1;
37     };

At line 29, a recorder object is created with the MediaRecorder constructor, using the options and the stream handle obtained above. The ondataavailable event is wired to this recorder object at lines 34-37: it takes each blob of data emitted by the recorder and pushes it into an array named blobarr (these chunks are later stitched into a single Blob by the downloadBlob function shown in the full listing below). A frameCount variable counts the number of emitted blobs.

Lines 39-46
39     this.recorder.addEventListener('stop', () => {
40       this.stream.getTracks().forEach(function (track: any) {
41         track.stop();
42       });
43       this.isRecording = false;
44     });
45 
46     this.recorder.start(500);

At line 39, the stop event is wired up; this event fires when the user calls the stop function on the recorder object. When it fires, the handler stops all tracks of the stream and toggles the isRecording variable to false. This variable indicates whether the program is currently recording the screen. At line 46, the start function is invoked on the recorder object with 500 passed in. This 500 is the interval in milliseconds after which the ondataavailable event fires, so a bigger number means a longer interval between chunks; with a 500 ms timeslice, a 10-second recording emits roughly 20 blobs into blobarr.
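
If you want to watch this chunking happen, a small variation of the ondataavailable handler above (just a debugging sketch, not part of the final code) logs each chunk as it arrives:

// Logs one line per emitted chunk, so you can see the effect of the 500 ms timeslice.
this.recorder.ondataavailable = (e: { data: Blob }) => {
  console.log(`chunk #${this.frameCount}: ${e.data.size} bytes`);
  this.blobarr.push(e.data);
  this.frameCount += 1;
};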

Now add a recordStart function that calls the startRecording function. This function also resets blobarr to length 0 and toggles isRecording to true.

1  recordStart() {
2     this.isRecording = true;
3     this.blobarr.length = 0;
4     this.startRecording();
5   }

Add a recordStop function, which calls the stop function on the recorder object. This fires the stop event on the recorder, which was described above.

1  recordStop() {
2     if (this.recorder) {
3       this.recorder.stop();
4       
5     }
6   }


Now your app.component.ts will look like the listing below. It also includes a downloadBlob function, which stitches the chunks collected in blobarr into a single Blob and triggers a download of the resulting .webm file.

1 import {
2   Component,
3   ElementRef,
4   OnDestroy,
5   OnInit,
6   ViewChild,
7 } from '@angular/core';
8 
9 const mediaDevices = navigator.mediaDevices as any;
10 declare var MediaRecorder: any;
11 
12 @Component({
13   selector: 'app-root',
14   templateUrl: './app.component.html',
15   styleUrls: ['./app.component.scss'],
16 })
17 export class AppComponent implements OnDestroy {
18   recorder: any;
19   stream: any;
20   frameCount: number = 0;
21   blobarr: any[] = [];
22   finalBlob: Blob | null = null;
23   isRecording: boolean = false;
24 
25   ngOnDestroy(): void {
26     this.blobarr.length = 0;
27     this.recordStop();
28   }
29 
30   async startRecording() {
31     var options;
32 
33     if (MediaRecorder.isTypeSupported('video/webm;codecs=vp9')) {
34       options = {
35         videoBitsPerSecond: 2500000,
36         mimeType: 'video/webm; codecs=vp9',
37       };
38     } else if (MediaRecorder.isTypeSupported('video/webm;codecs=vp8')) {
39       options = {
40         videoBitsPerSecond: 2500000,
41         mimeType: 'video/webm; codecs=vp8',
42       };
43     } else {
44       options = { videoBitsPerSecond: 2500000, mimeType: 'video/webm' };
45     }
46 
47     try {
48       this.stream = await mediaDevices.getDisplayMedia({
49         video: {
50           width: { ideal: 1280, max: 1920 },
51           height: { ideal: 720, max: 1080 },
52           frameRate: { ideal: 20, max: 25 },
53         },
54       });
55     } catch (err) {
56       alert('No devices found for recording.');
57     }
58     this.recorder = new MediaRecorder(this.stream, options);
59     let metadata: any;
60 
61     this.frameCount = 0;
62 
63     this.recorder.ondataavailable = (e: { data: any }) => {
64       this.blobarr.push(e.data);
65       this.frameCount += 1;
66     };
67 
68     this.recorder.onstop = (e: any) => {
69       this.isRecording = false;
70     };
71     this.recorder.start(500);
72   }
73 
74   downloadBlob() {
75     let downloadLink = document.createElement('a');
76     downloadLink.href = window.URL.createObjectURL(
77       new Blob(this.blobarr, { type: this.blobarr[0].type })
78     );
79     downloadLink.setAttribute('download', 'download.webm');
80     document.body.appendChild(downloadLink);
81     downloadLink.click();
82 
83     setTimeout(() => {
84       window.URL.revokeObjectURL(downloadLink.href);
85       document.body.removeChild(downloadLink);
86     }, 0);
87   }
88 
89   recordStop() {
90     if (this.recorder) {
91       this.recorder.stop();
92       this.stream.getTracks().forEach(function (track: any) {
93         track.stop();
94       });
95     }
96   }
97 
98   recordStart() {
99     this.isRecording = true;
100     this.blobarr.length = 0;
101     this.startRecording();
102   }
103 }
104 

Now go to app.component.html and add the code below, which adds buttons to start and stop the recording and to download the video.

1 <div *ngIf="!isRecording">
2   <button (click)="recordStart()">Start Recording</button>
3 </div>
4 <div *ngIf="isRecording">
5   <button (click)="recordStop()">Stop Recording</button>
6 </div>
7 
8 
9 <button (click)="downloadBlob()">Download</button>
10 

Now serve the application with ng serve -o (screen capture only works in a secure context, and localhost qualifies). You can start the recording, stop it, and then download the recorded screencast.
Here is the link to the project on GitHub; the code for this article is in the branch part1.
In the next part we will create a Spring Boot backend that receives the chunks of video.
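As a rough preview of where this is heading, and only as a sketch (the endpoint /api/video/chunk below is hypothetical and HttpClient is assumed to be injected as this.http; the real backend is the subject of part 2), each chunk collected in blobarr could be posted to the server like this:

// Sketch only: posts a single recorded chunk to a hypothetical Spring Boot endpoint.
uploadChunk(chunk: Blob, index: number) {
  const formData = new FormData();
  formData.append('chunk', chunk, `chunk-${index}.webm`);   // send the blob as multipart form data
  this.http.post('http://localhost:8080/api/video/chunk', formData).subscribe({
    next: () => console.log(`chunk ${index} uploaded`),
    error: (err) => console.error(`chunk ${index} failed`, err),
  });
}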
Stay tuned.
Thanks.
