
Explore the Monocle from Brilliant Labs, an AR device that clips onto your glasses, and discover its features, specs, and potential applications! Dive into image transformation and take a closer look at this intriguing gadget.

Transcript

[0:00] I’m a sucker for a gadget, and when I saw this come up on Twitter, I just had to buy one.
[0:05] I’ve got myself a Monocle from Brilliant Labs.
[0:08] Let’s see what’s in the box.
[0:10] So we’ve got the charging case with the Monocle inside, and we’ve got the charging cable.
[0:15] It’s an AR device that clips onto your glasses.
[0:20] As usual, this video is sponsored by PCBWay, and I was hoping to get their logo to show
[0:25] up on the Monocle.
[0:26] The problem is, there’s currently no support for images, it only does vector graphics.
[0:31] But we don’t let these small problems get in the way on this channel, how hard can it
[0:35] be to take a PNG file and turn it into polygons?
[0:39] This was a pretty fun exercise, there are quite a few libraries that can help out with this
[0:43] and ChatGPT was a lifesaver.
[0:45] The first thing I’m doing is detecting the two most common colours.
[0:49] Looking at our image, you would think this should be simple, we’ve got green and orange.
[0:54] But zooming in, we can see that there’s actually a lot more colours, particularly where the
[0:58] orange slash overlaps the green wire.
[1:00] You can also see that we’ve got various levels of opacity around the edges.
[1:04] We can see the range of colours if we do a 3D plot of the red, green and blue values.
[1:09] We’ve got a wide range of colours.
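
If you want to reproduce that plot, here’s a rough sketch of how you could do it with matplotlib and Pillow (the filename logo.png is just a placeholder):

```python
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

# Load the logo and flatten it into a list of RGBA pixels.
img = np.array(Image.open("logo.png").convert("RGBA"))
pixels = img.reshape(-1, 4)

# Ignore fully transparent pixels - they carry no colour information.
rgb = pixels[pixels[:, 3] > 0][:, :3]

# Scatter each pixel in RGB space, coloured by its own value.
ax = plt.figure().add_subplot(projection="3d")
ax.scatter(rgb[:, 0], rgb[:, 1], rgb[:, 2], c=rgb / 255.0, s=1)
ax.set_xlabel("Red")
ax.set_ylabel("Green")
ax.set_zlabel("Blue")
plt.show()
```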
[1:11] However, we can feed our colours into a K-means clustering algorithm, and provided we tell
[1:15] it just to create two clusters, we can get the two most common colours.
[1:20] We can now map our original colours onto these clusters, and if we look at the image
[1:24] now, we’ve got rid of all the anti-aliasing that was there.
[1:27] I’ve also thresholded the alpha channel, so we’ve got rid of the opacity.
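
As a minimal sketch of the clustering step, here’s roughly what it looks like with scikit-learn’s KMeans - the filename is a placeholder, and for simplicity this clusters every pixel rather than filtering out the transparent ones first:

```python
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans

img = np.array(Image.open("logo.png").convert("RGBA"))
h, w, _ = img.shape
pixels = img.reshape(-1, 4).astype(float)

# Cluster the RGB values into two groups to find the dominant colours.
kmeans = KMeans(n_clusters=2, n_init=10).fit(pixels[:, :3])
palette = kmeans.cluster_centers_.astype(np.uint8)

# Snap every pixel to its nearest cluster centre (removes the
# anti-aliasing) and threshold the alpha channel to kill the
# partial transparency around the edges.
quantised = palette[kmeans.labels_].reshape(h, w, 3)
alpha = np.where(img[:, :, 3] > 128, 255, 0).astype(np.uint8)
Image.fromarray(np.dstack([quantised, alpha])).save("logo-quantised.png")
```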
[1:31] We can now take each colour in turn, and feed it through a library that extracts the boundaries.
[1:36] I’m using a library called potrace, I’m not going to pretend to know how this works,
[1:40] I’ve linked to a paper that explains it in the description.
[1:43] This gives us a bunch of outlines for the boundaries of our objects, and also gives
[1:47] us outlines for the holes within our objects.
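
A rough sketch of the tracing step, assuming the pypotrace bindings (the exact attribute names may differ between versions, so treat this as illustrative):

```python
import numpy as np
import potrace  # the pypotrace bindings

def trace_colour(mask):
    """Trace the outlines of a boolean mask (True where the colour is)."""
    path = potrace.Bitmap(mask.astype(np.uint32)).trace()
    for curve in path:
        print("curve starting at", curve.start_point)
        for segment in curve:
            if segment.is_corner:
                # A corner is two straight edges meeting at segment.c.
                print("  corner via", segment.c, "to", segment.end_point)
            else:
                # A bezier segment with control points c1 and c2.
                print("  bezier", segment.c1, segment.c2, segment.end_point)
```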
[1:49] There was a small issue though, it uses a lot of bezier curves for its lines, and our library
[1:54] only does straight lines.
[1:56] But all is not lost.
[1:57] The equation for a bezier curve is pretty simple, so we can approximate any bezier curve
[2:02] with a series of straight lines.
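
This is just the cubic bezier equation B(t) = (1-t)^3·P0 + 3(1-t)^2·t·P1 + 3(1-t)·t^2·P2 + t^3·P3 sampled at evenly spaced values of t - a minimal sketch:

```python
def flatten_bezier(p0, p1, p2, p3, steps=10):
    """Approximate a cubic bezier with a list of straight-line points.

    p0 and p3 are the end points, p1 and p2 the control points,
    each an (x, y) tuple. More steps gives a smoother approximation.
    """
    points = []
    for i in range(steps + 1):
        t = i / steps
        u = 1 - t
        x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
        y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
        points.append((x, y))
    return points
```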
[2:05] So far so good, but we’ve still got a problem.
[2:07] We’ve got the outlines of our shapes, and we’ve got the outlines of the holes in our
[2:11] shapes, but our library can only draw simple polygons, it can’t draw polygons with holes
[2:15] in them.
[2:16] There’s a couple of options to solve this, and after conferring with ChatGPT, we decided
[2:20] to use a triangulation library.
[2:23] This library can take the boundary polygon and the hole polygons, and produce a bunch
[2:27] of triangles.
[2:28] We can easily feed these into the display, as triangles are just very simple polygons.
[2:32] But we do have a lot of triangles, this may not be very efficient.
[2:37] After a bit of digging into the triangulation code, I found that it actually has a function
[2:41] that will extract the boundary of a polygon with holes.
[2:44] So we can just feed our polygons into that, and we end up with some nice polygons with
[2:48] holes in.
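
I won’t vouch for this being the exact library from the video, but as an illustration of the triangulation approach, the triangle package (Python bindings for Shewchuk’s Triangle) can triangulate a polygon with holes like this, using made-up square coordinates:

```python
import numpy as np
import triangle

# Outer boundary: a square; hole boundary: a smaller square inside it.
outer = [(0, 0), (10, 0), (10, 10), (0, 10)]
hole = [(4, 4), (6, 4), (6, 6), (4, 6)]

def ring_segments(start, count):
    # Edges joining consecutive vertices of a closed ring.
    return [(start + i, start + (i + 1) % count) for i in range(count)]

data = {
    "vertices": np.array(outer + hole, dtype=float),
    "segments": ring_segments(0, 4) + ring_segments(4, 4),
    # One point strictly inside each hole tells the library to carve it out.
    "holes": [(5, 5)],
}

# 'p' = triangulate the planar straight line graph, respecting the segments.
result = triangle.triangulate(data, "p")
print(result["triangles"])  # indices into result["vertices"]
```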
[2:49] Our work is done, we’re ready to display the results.
[2:53] Unfortunately so far, I’ve only managed to display a couple of the letters, we’re going
[2:57] to need to wait for some new firmware with some bug fixes.
[3:00] But if you want some PCBs manufactured, you know where to go.
[3:05] Stay tuned to the channel, at some point I may get a spinning Cobra from Elite working,
[3:09] which will be pretty amazing.
[3:12] Let’s take a closer look at the tech.
[3:14] The case is really just for charging up the monocle, it’s got a built in 450 mAh battery
[3:19] so you don’t need to have it plugged in all the time, and it connects to the monocle using
[3:23] these two pogo pins.
[3:25] As far as I can tell, it’s just a dumb device, the USB connection is just used for charging.
[3:31] The actual monocle itself connects over Bluetooth and runs MicroPython, there’s a web REPL
[3:36] you can use to program it.
[3:38] We can run a pretty simple blink sketch with the built in LEDs, and it seems to work pretty
[3:42] well.
[3:43] We’re just cycling between the red LED and the green LED in this sketch.
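
A minimal version of that blink sketch looks something like this - I’m assuming the firmware’s led module with RED/GREEN constants, so check the current Monocle MicroPython docs, as the API was still evolving at the time:

```python
import led
import time

# Cycle between the red and green LEDs once a second.
# (Assumes led.on()/led.off() and the led.RED/led.GREEN constants
# from the Monocle firmware's led module.)
while True:
    led.on(led.RED)
    led.off(led.GREEN)
    time.sleep(1)
    led.off(led.RED)
    led.on(led.GREEN)
    time.sleep(1)
```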
[3:47] There’s some pretty interesting bits of hardware, we’ve got a touch controller for
[3:51] the two touch pads on the top, and the MCU is a Nordic nRF52832, with 512 KB of flash
[3:58] and 64 KB of RAM.
[4:00] Now compared to things like the ESP32 which we’re normally using, it’s a pretty puny
[4:05] processor.
[4:06] It only runs at 64MHz, and with 64 KB of RAM, we’re really limited in what we can do with
[4:12] it.
[4:13] To make up for this, the monocle comes with a built in FPGA.
[4:16] This is used to manage the camera, microphone and to drive the display.
[4:21] There’s quite a nice exploded diagram on the monocle website that shows you all the bits.
[4:26] There’s a 70 mAh battery, that’s this big orange blob that you can see here, and you
[4:30] can see the 5 megapixel camera right next to it, the microphone’s hidden somewhere behind.
[4:36] The screen is up here connected to the PCB.
[4:38] This shines down onto some optics, which let you see it when you’re wearing the monocle.
[4:43] It’s really hard to film the screen, but I’ve done my best.
[4:46] Here I’m running one of their simple samples that wires up the touch buttons to some text
[4:50] on the screen.
[4:51] It’s just wired up so the text changes when you touch the A or the B buttons.
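
The sample boils down to a touch callback that redraws the text - a sketch assuming the firmware’s touch and display modules (again, the exact API may have changed between firmware releases):

```python
import touch
import display

# Redraw the screen with the name of whichever pad was touched.
# (Assumes touch.callback() with touch.A/touch.B/touch.EITHER and
# the display.Text/display.show API from the Monocle firmware.)
def on_touch(pad):
    label = "A" if pad == touch.A else "B"
    text = display.Text("Touched " + label, 0, 0, display.WHITE)
    display.show(text)

touch.callback(touch.EITHER, on_touch)
```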
[4:56] The firmware is a bit of a work in progress at the moment, so I’m waiting for a new release
[5:00] to play around with some more graphics.
[5:03] Is it actually any good?
[5:04] Well, it does work.
[5:06] The lenses in my glasses are pretty thick and bulge out quite a bit, which makes it quite
[5:10] hard to get the monocle to sit at a good angle.
[5:13] But with a bit of fiddling, it does work ok.
[5:15] I’ll probably 3D print something to make it a bit more useful.
[5:19] The MCU is pretty underpowered, but the ability to offload any heavy processing to another
[5:24] device over Bluetooth and the built-in FPGA do mitigate this somewhat.
[5:28] I haven’t done any FPGA programming since university, so this could get quite interesting.
[5:34] But all in all, it’s a fun bit of kit.
[5:35] I just need to think of something useful to do with it now.
[5:38] Any suggestions in the comments?

