Liveblocks drawing on screen

An experiment with Liveblocks broadcasted events to draw on screen


This experiment comes from my use of Slack Huddles: someone was sharing their screen and we were drawing on it with a pen tool.

I wanted to re-create that experience using a great collaborative tool. So I chose Liveblocks and implemented it on my personal website as a Lab 🙂.

Why draw on screen?

Drawing on screen is, to me, a great feature. It brings value and accessibility to users, especially when sharing a screen in an application. It makes it easier to communicate and express yourself when you're pointing at something on the shared screen.

From a collaborative standpoint, drawing on screen in real-time brings some great benefits such as:

  • Enhanced communication
  • Improved collaboration
  • User engagement
  • Visual clarity
  • Real-time feedback
  • Interactive learning
  • Memory aid

Having this feature in your product when users share their screens with each other makes the collaborative side much stronger.

Let's talk about Liveblocks

Liveblocks is, to me, a great product: it gives developers a complete toolkit to integrate collaborative experiences seamlessly and quickly.

The benefit of using Liveblocks as a developer is that you can integrate it with your frontend framework while Liveblocks manages all the infrastructure and scaling for you. As a developer who deeply believes in the frontend cloud, it's a good match for me.

Adding Liveblocks to your project is very straightforward. First you create an account on the web app, then you install the packages you need in your app. In my case, on my personal website:

$ npm i @liveblocks/client @liveblocks/react

Once you've installed the dependencies and created your public key on the Liveblocks dashboard, you can set up your custom types in your liveblocks.config.ts:

// `MyPresence` and `MyRoomEvent` are your own custom types, e.g.:
type MyPresence = { cursor: { x: number; y: number } | null }
type MyRoomEvent = { type: 'REACTION'; emoji: string }

declare global {
  interface Liveblocks {
    Presence: MyPresence
    RoomEvent: MyRoomEvent
  }
}

And set up your LiveblocksProvider in your app:

import { LiveblocksProvider } from '@liveblocks/react/suspense'

function MyApp({ children }: { children: React.ReactNode }) {
  return (
    <LiveblocksProvider publicApiKey='your-public-api-key'>
      {children}
    </LiveblocksProvider>
  )
}

And we're good to go 🚀

Drawing paths on screen with an SVG canvas

Drawing on screen while sharing a screen should stay simple and light in my opinion. So I thought that instead of using a canvas element, it would be better and more efficient to use an svg element.

When users draw on screen with their mouse, the mouse event can easily be translated into an SVGPoint, and then we can assemble these points to create an SVGPath:

type SVGPoint = { x: number; y: number }
type SVGPath = {
  id: string
  points: SVGPoint[]
  strokeColor: string
  strokeWidth: number
  ended: boolean
}
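
For instance, translating a pointer event into an SVGPoint relative to the svg element could look like this hypothetical helper (not part of the original code):

// Hypothetical helper: map the pointer position to coordinates
// relative to the top-left corner of the <svg> element
function pointerEventToSVGPoint(
  event: PointerEvent,
  svg: SVGSVGElement,
): SVGPoint {
  const rect = svg.getBoundingClientRect()
  return { x: event.clientX - rect.left, y: event.clientY - rect.top }
}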

So our SVG canvas will render an array of SVGPath, computing each path's shape from its array of SVGPoint and applying a cubic Bézier curve to it:

// Build the `d` attribute: move to the first point, then chain
// cubic Bézier curve commands for the following points
const d = points.reduce<string>((d, point, index, points) => {
  return index === 0
    ? `M ${point.x},${point.y}`
    : `${d} ${getCubicBezierCurve(point, index, points, curveSmoothing)}`
}, '')
💡 Curve smoothing is here to help define the coordinates of the control points for the cubic Bézier curve.
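
To give an idea of that helper, here is a minimal sketch of what getCubicBezierCurve could look like, deriving each control point from the neighboring points and scaling it with the smoothing factor (a Catmull-Rom-like approach; the import path is an assumption):

import type { SVGPoint } from '@/components/lab/drawing-canvas/svg-canvas-path'

// Control point on one side of `current`, aligned with the line going
// from `previous` to `next`, at a distance scaled by the smoothing factor
function getControlPoint(
  current: SVGPoint,
  previous: SVGPoint | undefined,
  next: SVGPoint | undefined,
  reverse: boolean,
  smoothing: number,
): SVGPoint {
  const p = previous ?? current
  const n = next ?? current
  const angle = Math.atan2(n.y - p.y, n.x - p.x) + (reverse ? Math.PI : 0)
  const length = Math.hypot(n.x - p.x, n.y - p.y) * smoothing
  return {
    x: current.x + Math.cos(angle) * length,
    y: current.y + Math.sin(angle) * length,
  }
}

// Cubic Bézier command from the previous point to the current one
function getCubicBezierCurve(
  point: SVGPoint,
  index: number,
  points: SVGPoint[],
  smoothing: number,
): string {
  const start = getControlPoint(
    points[index - 1],
    points[index - 2],
    point,
    false,
    smoothing,
  )
  const end = getControlPoint(
    point,
    points[index - 1],
    points[index + 1],
    true,
    smoothing,
  )
  return `C ${start.x},${start.y} ${end.x},${end.y} ${point.x},${point.y}`
}

The resulting d string can then be passed to a <path> element with fill='none' so that only the stroke is visible.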

In the end, our SVG canvas can simply expose an onChange prop to let parent components apply their own logic when the paths in the canvas change:

<DrawingCanvas
  ref={canvasRef}
  backgroundColor='transparent'
  width='100%'
  height='100%'
  strokeColor={STROKE_COLOR}
  strokeWidth={STROKE_WIDTH}
  onChange={() => {}}
  pathDisappearingTimeoutMs={PATH_DISAPPEARING_TIMEOUT_MS}
/>
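
The pathDisappearingTimeoutMs prop makes paths fade away shortly after they're drawn. As a hypothetical sketch of that behavior inside the canvas (the useDisappearingPaths hook and its state arguments are assumptions):

import { useEffect, useRef } from 'react'
import type { Dispatch, SetStateAction } from 'react'
import type { SVGPath } from '@/components/lab/drawing-canvas/svg-canvas-path'

// Once a path has ended, schedule its removal after the configured delay.
// The `scheduled` ref avoids resetting timers on every re-render.
function useDisappearingPaths(
  paths: SVGPath[],
  setPaths: Dispatch<SetStateAction<SVGPath[]>>,
  pathDisappearingTimeoutMs: number,
): void {
  const scheduled = useRef(new Set<string>())

  useEffect(() => {
    for (const path of paths) {
      if (!path.ended || scheduled.current.has(path.id)) continue
      scheduled.current.add(path.id)
      setTimeout(() => {
        scheduled.current.delete(path.id)
        setPaths((prev) => prev.filter((p) => p.id !== path.id))
      }, pathDisappearingTimeoutMs)
    }
  }, [paths, setPaths, pathDisappearingTimeoutMs])
}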

Please note I wrote the code of the drawing SVG canvas without using any external package, but I'm sure there are better drawing canvases out there. You can check out the drawing editor on its lab page.

You can check the code on GitHub 🙂

Broadcasting events for real-time collaboration

Thanks to the onChange prop exposed on the drawing canvas, we can simply broadcast events using the useBroadcastEvent hook exported from @liveblocks/react/suspense.

Let's start by adding a new room event in our liveblocks.config.ts file:

import type { SVGPath } from '@/components/lab/drawing-canvas/svg-canvas-path'

type RoomEvent = {
  type: 'ADD_SVG_PATHS'
  paths: SVGPath[]
}

declare global {
  interface Liveblocks {
    RoomEvent: RoomEvent
  }
}

Next, wrap the drawing on screen editor with the RoomProvider component, which could look like this minimal sketch (the room id and the empty initial presence are assumptions to adapt to your app):
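
import { RoomProvider } from '@liveblocks/react/suspense'

// Minimal sketch — any unique room id works
function DrawingOnScreenRoom({ children }: { children: React.ReactNode }) {
  return (
    <RoomProvider id='drawing-on-screen' initialPresence={{}}>
      {children}
    </RoomProvider>
  )
}

Inside the room, you can then call the useBroadcastEvent React hook to broadcast events: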

import { useRef } from 'react'
import { useBroadcastEvent } from '@liveblocks/react/suspense'
import {
  type DrawingCanvasRef,
  DrawingCanvas,
} from '@/components/lab/drawing-canvas/drawing-canvas'
import type { SVGPath } from '@/components/lab/drawing-canvas/svg-canvas-path'

// `useEvent` is a stable-callback hook (a custom hook or a userland package)
function DrawingOnScreenEditor() {
  const canvasRef = useRef<DrawingCanvasRef>(null)
  const broadcast = useBroadcastEvent()

  const handleCanvasChange = useEvent((paths: SVGPath[]): void => {
    broadcast({
      type: 'ADD_SVG_PATHS',
      paths,
    })
  })

  return (
    <DrawingCanvas
      ref={canvasRef}
      backgroundColor='transparent'
      width='100%'
      height='100%'
      strokeColor={STROKE_COLOR}
      strokeWidth={STROKE_WIDTH}
      onChange={handleCanvasChange}
      pathDisappearingTimeoutMs={PATH_DISAPPEARING_TIMEOUT_MS}
    />
  )
}

Listening on broadcasted events

Once the events are broadcasted, we can listen for them using the useEventListener React hook and sync the paths:

import { useRef } from 'react'
import { useEventListener } from '@liveblocks/react/suspense'
import {
  type DrawingCanvasRef,
  DrawingCanvas,
} from '@/components/lab/drawing-canvas/drawing-canvas'

function DrawingOnScreenEditor() {
  const canvasRef = useRef<DrawingCanvasRef>(null)

  const handleCanvasChange = useEvent((): void => {
    /* ... */
  })

  useEventListener(({ event }): void => {
    if (canvasRef.current && event.type === 'ADD_SVG_PATHS') {
      // NOTE: `canvasRef` is an imperative handle exposing custom
      // operations on the canvas. It simplifies syncing the paths in
      // the drawing canvas when listening for broadcasted events
      canvasRef.current.sync(event.paths)
    }
  })

  return (
    <DrawingCanvas
      ref={canvasRef}
      backgroundColor='transparent'
      width='100%'
      height='100%'
      strokeColor={STROKE_COLOR}
      strokeWidth={STROKE_WIDTH}
      onChange={handleCanvasChange}
      pathDisappearingTimeoutMs={PATH_DISAPPEARING_TIMEOUT_MS}
    />
  )
}
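
For completeness, here is a minimal sketch of how the canvas could expose its sync handle, assuming it keeps its paths in local state:

import { forwardRef, useImperativeHandle, useState } from 'react'
import type { SVGPath } from '@/components/lab/drawing-canvas/svg-canvas-path'

export type DrawingCanvasRef = {
  // Replaces the locally rendered paths with the received ones
  sync: (paths: SVGPath[]) => void
}

// A minimal sketch — props and rendering are elided for brevity
export const DrawingCanvas = forwardRef<DrawingCanvasRef, { /* ... */ }>(
  (props, ref) => {
    const [paths, setPaths] = useState<SVGPath[]>([])

    useImperativeHandle(ref, () => ({
      sync: (nextPaths) => setPaths(nextPaths),
    }))

    return <svg>{/* render `paths` as <path> elements */}</svg>
  },
)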

And here we go 🚀 We can now broadcast and listen for events to draw on screen 🙂

Demo

Check out the Liveblocks drawing on screen experiment on its lab page.

Conclusion

Thanks to Liveblocks, as developers we don't have to care about all the plumbing behind broadcasting and listening for events. It only takes a few lines of code 🚀.

Liveblocks lets you bring collaboration to your product at a fast pace, and this is amazing 😃. No more headaches about designing the infrastructure or writing the WebSocket handling code in your frontend app: you get a good level of abstraction out of the box. You can just focus on what matters the most: bringing a great user experience to your customers. And this feels really good 😊


Additional notes

This experiment uses Liveblocks 2.0. Check out the upgrade guide if you're coming from the first version, as there are a lot of breaking changes.

This experiment is very straightforward and focuses on the main collaborative logic. Still, there is room for improvement, such as performance when syncing the paths in the drawing canvas.
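
For instance, broadcasting on every change can get chatty while drawing. A hypothetical sketch throttling the broadcasts (using lodash.throttle here, but any throttle helper would work):

import { useMemo } from 'react'
import throttle from 'lodash.throttle'
import { useBroadcastEvent } from '@liveblocks/react/suspense'
import type { SVGPath } from '@/components/lab/drawing-canvas/svg-canvas-path'

// Broadcast at most once every `waitMs` while the user is drawing
function useThrottledBroadcast(waitMs = 50) {
  const broadcast = useBroadcastEvent()

  return useMemo(
    () =>
      throttle((paths: SVGPath[]) => {
        broadcast({ type: 'ADD_SVG_PATHS', paths })
      }, waitMs),
    [broadcast, waitMs],
  )
}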