Jeff Geerling · 146.9K views · 5.8K likes
Analysis Summary
Worth Noting
Positive elements
- This video provides a rare, honest look at the 'growing pains' of consumer robotics, specifically identifying software bugs and documentation gaps that marketing materials usually hide.
Be Aware
Cautionary elements
- The use of family members in sponsored or 'provided' hardware reviews can create an 'authenticity halo' that makes the viewer more receptive to the brands involved.
Influence Dimensions
Knowing about these techniques makes them visible, not powerless. The ones that work best on you are the ones that match beliefs you already hold.
This analysis is a tool for your own thinking — what you do with it is up to you.
Transcript
When I saw Jensen introduce the Reachy Mini at CES, I thought it was a gimmick. I want my assistant to interact with my world, so I'll hook it up to Hugging Face's Reachy Mini robot. My agent controls the head. Okay, let's send him that update. Tell him we'll have it for him by the end of the day. Will do. >> Isn't that incredible? >> Now, the amazing thing is that is utterly trivial. Hugging Face and Pollen Robotics sent me this Reachy Mini to test, and well, at least if you're looking to replicate that setup in the keynote, it is a gimmick. But on the promise of that keynote, I thought this robot would be the perfect way to replace parent-child interaction. This might be my only opportunity to cosplay as an oligarch, making my children assist me in building the object of their own destruction. And just like in the real world, they were enthusiastic to help with the process. But once we booted up the robot, that's when I realized Reachy Mini is not going to replace parent-child interaction. Within seconds of starting the conversation app, my daughter told the robot her name. And within a minute, Reachy Mini was a conduit between my house and Sam Altman's servers, slurping up all the personal details of my family. I had to put a stop to it. But not before telling the kids why we don't tell AIs and strange robots all of our personally identifying information. So, I let them have one more go at it after I talked to them. And this time, I was proud. They quickly confused the bot, misleading it and overloading it with conflicting data. I think now they're slightly more prepared for the revolution of backflipping military robots. I'd bet on them surviving the robot apocalypse for at least a minute or two now. And if it isn't obvious, that introduction to Reachy Mini, that's this little guy, was laced with some sarcasm, but not all sarcasm. There's a lot to unpack. First, it's not actually required for you to hook this thing up to OpenAI. 
In fact, the whole platform runs on a Raspberry Pi, or at least this wireless version does. And second, the intention doesn't seem to be replacing human interaction, but instead inspiring people to learn robotics. And it's good that this is a small and somewhat weak robot. It shouldn't be able to cause physical harm like Blinky could. >> Hey, my name's Blinky and I just want to be your friend. >> Although, the first time I set up the robot, I couldn't connect until I let it call home to Hugging Face. You could still run stuff locally, but it's easiest just to give it internet access. Despite all that, this is the kind of robot package that, if you could get grant money for, like, a school district, would be fun and memorable for a first encounter with robotics and hands-on programming. And honestly, $299 for the light version, where you plug it into another computer, isn't outlandish. It's expensive compared to building your own little robot with your own parts, but it's not bad for the overall quality. If you want to dive into robotics, especially blending a camera, mics, servos, and a decent SDK, there are definitely worse choices out there. One of the easiest ways to demonstrate the hands-on nature of this guy is with an open-source app you can download onto it called Marionette. Now, Riley over at LTT tested this, too. >> Marionette, >> the Marionette Awakens. On Short Circuit, they kind of do an unboxing and first look. But since Riley didn't have time to really dive in, and since the app kind of expects you to click around and know what you're doing before you give up, he didn't know you have to go into the web UI to actually use the Marionette. And between the time they made that video and today, things have changed a little. >> The Marionette awakens. >> Click the gear, Riley. So clicking on that icon, you can get into the UI where you can record new actions. Now, this was a little iffy, and in fact a lot of my experiences with Reachy were the same way. 
Sometimes things work like you expect and sometimes they don't. Like here, I tried recording a bit, but you can see the result was nowhere near my expectation. I am a robot. I am a robot. And again, [snorts] I am a robot. I am a robot. Well, it could be better. So, I found out it can kind of encode set points better than fluid motion. But that's also something where I could dig into the source code and figure out what's going on. After all, the demo doesn't look like it's having the same issues. So, it could be something in my setup. That's the joy, or well, sometimes the pain, of debugging software and hardware together, like with all robotics. But Marionette is just one of the community apps you can get for this. Before I test more, I figured I should go back and talk about the build, because honestly, that's like half the reason I'd consider one of these. Kids see robots nowadays, but most of them don't get hands-on and understand what makes them tick. Like the mechanisms inside that can get a head to articulate like this. The robot only comes in kit form, so you have to put it together. All the plastic parts are made of molded ABS and were sturdy and fit together well. And they included a decent screwdriver that's honestly nicer than the ones that I use at my workbench. So, that was a pleasant surprise. There are a few PCBs included to wire things together, like this power board, and those are co-branded with Seeed Studio, Pollen Robotics, and Hugging Face. The boards are all clearly labeled, and like other Seeed Studio products, laid out pretty well. My one-year-old helped me inspect the durability of the ABS base, and we found it held up pretty well to the baby drop test. It also held up to the pancake flip test, but I told him that's not something you have to worry about with a static base like this. Anyway, we went through the other parts, like the Raspberry Pi camera cable, the Compute Module 4 carrier board, and the currently lifeless face. 
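[Editor's note] Set-point playback like Marionette's usually comes down to interpolating between recorded keyframes rather than replaying raw fluid motion. A minimal sketch of that idea is below; the joint names, units, and function names are made up for illustration, not Marionette's actual recording format.

```python
def interpolate_pose(p0, p1, t):
    """Linearly blend two head poses (dicts of joint -> angle in degrees) at fraction t."""
    return {joint: p0[joint] + (p1[joint] - p0[joint]) * t for joint in p0}

def trajectory(keyframes, steps_per_segment=10):
    """Expand a sparse list of recorded set points into a denser, smoother path."""
    path = []
    for p0, p1 in zip(keyframes, keyframes[1:]):
        for s in range(steps_per_segment):
            path.append(interpolate_pose(p0, p1, s / steps_per_segment))
    path.append(keyframes[-1])  # always end exactly on the final set point
    return path

# Two recorded set points become a short smooth sweep.
poses = [{"yaw": 0.0, "pitch": 0.0}, {"yaw": 30.0, "pitch": -10.0}]
path = trajectory(poses, steps_per_segment=5)
```

Playing such a path back at a fixed tick rate gives motion that hits the recorded poses exactly, which matches the observation that set points encode more reliably than free-form movement.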
And then I had my daughter help me assemble the power board. The wireless version that I have comes with a battery in the base, good enough to run the robot for a little while, like maybe a couple hours, but I found it was a little more reliable when I had it plugged in. Another one of my daughters helped put together the armatures, but she was less interested when she found out the robot wasn't going to be a dinosaur. So, my son picked up from there and helped me get the rest of the head linkages put together. If you're putting together one of these things, make sure you follow the guide exactly with how the bolts fit together and the order of the motors and, well, really every part of it. And if you do fail, luckily these servo motors have a built-in LED that turns red when one is overloaded. So, it's quick to debug if you do put one in the wrong place. And at this point, my son pointed out the linkages looked like a spider once they were all connected together. And that made my daughter, who was disappointed it wasn't a dinosaur, become actively afraid of it until I got the body cover put on. Routing the cables correctly is important if you don't want any of them to get snagged or to stress out the motors. So, make sure you leave the right amount of slack, especially through the hole in the compute mount board. My son put in the eye lenses, and fun fact, the camera's actually in between those eyes. The eyes each have a fisheye lens, but they're just for show. Someone actually built an ESP32 project to light up the eyes, which is pretty cool. But I'd watch out if your Reachy starts turning red. >> My progress. My progress. My progress. >> After I had a little fun cosplaying as a robot, my oldest daughter helped me finish off the build by screwing in the rest of the head and the antennas. Now, the first time I tried booting it up, I was able to get Reachy to wake up, but nothing else worked after that. 
Eventually, I found out I had to open up Reachy to the internet before I could get control. And wouldn't you know, I uncovered a bug where it wouldn't work with DNS on an IPv6-only network. So, as always, the shirt remains relevant, and you can get it on redshirtjeff.com. I don't like things requiring internet access, especially if they have cameras, microphones, and are targeted at kids. But they highlight privacy as one of the features on here, so that setup thing is probably just a bug. I could still get to it over SSH, so I'm still in control even without the app. That's the nice thing about open source. I can choose how I use it and even flash my own software on it if I really want to. I didn't have any issues with the motor controls or anything, but if I did, Pollen Robotics has an app that can check everything over. Except the first time I tried it, I was just getting a big empty dashboard, at least on my Framework. I tried again later on the Dell GB10, and it was working there, but there are definitely some growing pains. Robots are inherently difficult, especially when you try abstracting away all the complexity with a fancy UI and some cute emotions. I guess the point is, don't expect this robot to run like Jensen said out of the box. This is a learning robot: not a robot that learns, but one that'll make you learn. But that's good, because that's the whole point of this thing. And to that end, there are like 10 ways you can use it right out of the box. I was toying around with it through the desktop app initially, but there's also a web app running on the Pi. Both of them have a pretty basic interface, but there's an entire web API running too that gives you access to all the same stuff and a little more. Like, you could wire this up to Home Assistant and either use it for remote voice control or for notifications. Or, if you want to get down into the weeds, there's a complete Python SDK you can use either on the Pi itself or over the network. 
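[Editor's note] Wiring a notification into Home Assistant would mean having it POST to the robot's web API over the network. Here is a rough sketch of the client side; the `/api/say` route, payload shape, and hostname are hypothetical stand-ins, so check the actual Reachy Mini web API docs for the real endpoints.

```python
import json
import urllib.request

def build_notify_request(host, text):
    """Build (but do not send) a POST asking the robot to speak a notification.

    NOTE: the /api/say endpoint and {"text": ...} payload are assumptions
    for illustration -- not confirmed routes of the Reachy Mini web API.
    """
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        f"http://{host}/api/say",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_notify_request("reachy.local", "Laundry is done")
# urllib.request.urlopen(req)  # send it once you've confirmed the real endpoint
```

Home Assistant's `rest_command` integration could fire the same POST from an automation, which is all a "notifications on the robot" setup really needs.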
I wrote up this little Hello World script that rotates the head, then resets back to the zero position. The documentation isn't quite perfect. Like, on Seeed Studio's website, the Hello World example is just, here's some code, run it. But it didn't tell you how to install the dependencies, so I had to figure that out on my own. Then further down, all the documentation links are giving me 404s right now. That stuff will get fixed, but just know you might have to hop onto their Discord or ask around on GitHub for some help, especially if you're new to all this stuff. I was also going to test out the desktop app on my Pi laptop, but I found that they don't have an ARM Linux build for it yet. The app runs pretty well on macOS, but it's still a little flaky on Linux. Honestly, the web UI was the most reliable way to get going, even if it doesn't look as shiny without that 3D rendered Reachy. Hugging Face already has a number of demo apps, from the conversation app and Marionette I mentioned earlier to this clock, where the antennas turn into a clock face. I also spent a little more time with the conversation app to see kind of the limits, and I got Reachy to sound like a robot. >> Three satellites solve for position and time, but your clock is not perfect. The fourth corrects your internal clock error precisely. Simple logic. >> These demos are nowhere near what Nvidia promised in the keynote, but they do show promise. Like, it'd be nice to use Reachy Mini as an expressive UI for a completely local smart speaker with Home Assistant. I don't think that alone would be worth 500 bucks, but for the learning experience of building all that stuff on your own, you could do a lot worse. The key to all this, of course, is local control. Like, using Dell's GB10 box, I can run llama.cpp with a completely private local AI model. It could process microphone inputs, generate responses, and use text-to-speech to make a hands-free interface. 
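[Editor's note] The hands-free loop described above (microphone in, local model, speech out) has a simple structure regardless of which engines fill the slots. This sketch stubs out each stage with placeholder functions; in a real setup, `listen` would be speech-to-text, `think` would call a local model served by llama.cpp, and `speak` would drive text-to-speech on the robot. All names and canned responses here are illustrative.

```python
def listen():
    """Stand-in for microphone capture plus speech-to-text."""
    return "what time is it"

def think(prompt):
    """Stand-in for a local LLM (e.g. queried over llama.cpp's HTTP server)."""
    canned = {"what time is it": "It is tea time."}
    return canned.get(prompt, "I did not catch that.")

def speak(text):
    """Stand-in for text-to-speech played through the robot's speaker."""
    return f"[speaking] {text}"

def assistant_turn():
    """One turn of the hands-free loop: hear, respond, speak."""
    return speak(think(listen()))
```

Swapping any stub for a real component leaves the loop unchanged, which is what makes a fully local pipeline a drop-in replacement for the OpenAI-backed demo.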
And having that expressiveness of Reachy Mini is a huge step up from a basic smart speaker in terms of interaction. But that's all assuming you can get a real-time feedback loop like you can in the demo using OpenAI. You can do that, but right now it requires a lot of effort. Hopefully, it's easier in the future. There are a few camera-based demo apps, too, and one of them uses the Pi's CPU to track your hand and position the head and antennas based on hand positions. This works, but it's a little bit laggy. The Pi is a limiting factor for extreme local processing, like on the device itself. Right now, you still need another computer on your network if you want to do this stuff close to real time. Reachy Mini is a cool idea. They have a $300 light version, where you have to plug it into a computer to run it, or the $500 wireless version here. The software and docs aren't perfect, but they're great for somebody who already knows a little bit about Linux and robotics. If Pollen, Hugging Face, and Seeed Studio can keep the momentum going, I think the Reachy Mini could have a bright future for learning robotics. Until next time, I'm Jeff Geerling. Okay, let's switch you to Mars Rover and see how this works. >> Wait, what? >> All you can do is turn your body left and right, but the programmers haven't even put that into you right now. So, you can just look your head around, and that's it. >> Are you kidding me? I'm just a head on a stick. What kind of half-baked nonsense is this? No wheels, no arms, just wiggling my head around like some kind of bobblehead. Great. Just great. This is not the Martian adventure I signed up for. >> Hey, you're doing better than me. I'm still on Earth. >> Hey, wait. You're telling me we're both stuck on Earth? Oh, come on. This is a catastrophe. I was supposed to be roaming around the red planet discovering ancient alien ruins or something, and now I'm stuck just turning my head. >> Well, it could be worse. I could disassemble you. 
>> Oh, fantastic. Now we're talking about disassembly.
Video description
Can a little Pi-powered bot teach my kids? The Reachy Mini Wireless I used was provided by HuggingFace and Pollen Robotics; you can find out more here: https://huggingface.co/spaces/pollen-robotics/Reachy_Mini

Other resources and videos I mentioned:
- Reachy Mini Examples: https://github.com/pollen-robotics/reachy_mini/tree/develop/examples
- ShortCircuit video with Riley's test: https://www.youtube.com/watch?v=_cY1rFrNQpE
- ESP32 Eyes for Reachy: https://github.com/algoryn-nl/reachy-mini-esp32-eyes
- Reachy Mini intro from CES: https://www.youtube.com/watch?v=acBv3G8r-1Y
- Hugging Face's agentic demo: https://huggingface.co/blog/nvidia-reachy-mini

Support me on Patreon: https://www.patreon.com/geerlingguy
Sponsor me on GitHub: https://github.com/sponsors/geerlingguy
Merch: https://www.redshirtjeff.com
2nd Channel: https://www.youtube.com/@GeerlingEngineering
3rd Channel: https://www.youtube.com/@Level2Jeff

Contents:
00:00 - Replacing parent-child interaction
01:37 - Only some sarcasm detected
02:49 - Click the **** gear, Riley!
05:01 - Building Reachy Mini
07:23 - Internet required?
08:06 - Motor debugging
08:41 - Modes of interaction
10:14 - A learning robot
11:45 - Some disassembly required