As AI rapidly becomes a core component of K–12 education, we hear one practical question from educators all over the world: How can I teach AI through hands-on, project-based learning if I don't have an advanced background in coding? While many teachers are eager to bring AI into their curriculum, it's completely normal to feel a lack of confidence when it comes to the technical implementation. At DFRobot, we are committed to bridging this exact gap—moving from basic AI awareness to confident classroom practice.
Recently, we had the incredible opportunity to put this mission into action by hosting a three-hour, hands-on teacher training workshop at the Kathmandu University School of Engineering. You can read the official press release about the event here, but we wanted to take you behind the scenes on our blog to show you exactly how we helped teachers from various disciplines turn abstract AI concepts into engaging classroom projects.

Our approach for this workshop was simple: combine technical learning with pedagogy through face-to-face, project-based building. Instead of getting bogged down in complex syntax, we focused on deploying AI capabilities—specifically speech and vision—directly onto edge devices.
To make this as accessible as possible, we equipped the participating teachers and department heads with our UNIHIKER K10 hardware, HUSKYLENS AI vision sensors, and our Mind+ graphical programming software. Together, we built two progressive projects.
Voice interaction is a fantastic, accessible entry point into AI. Using the UNIHIKER K10 and Mind+, our team guided educators through creating a simple voice-controlled system.
Teachers started by programming basic commands, like "turn on the light." This allowed them to experience the fundamental AI interaction loop firsthand: wake → recognize → execute. Once they mastered the basics, they extended the system to control on-screen movements using voice directions. The focus wasn't on complex coding, but on showing how simple voice commands can be translated into highly interactive classroom applications.
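For readers who want to see the wake → recognize → execute loop as code rather than blocks, here is a minimal, simulated sketch. The wake word, command table, and function names are illustrative assumptions for teaching the control flow; they are not the actual Mind+ block names or UNIHIKER K10 firmware API.

```python
# Simulated wake -> recognize -> execute loop (illustrative only).
WAKE_WORD = "hello robot"  # hypothetical wake phrase

# Recognize step: map known utterances to actions on shared state.
COMMANDS = {
    "turn on the light": lambda state: {**state, "light": True},
    "turn off the light": lambda state: {**state, "light": False},
    "move left": lambda state: {**state, "x": state["x"] - 1},
    "move right": lambda state: {**state, "x": state["x"] + 1},
}

def handle_utterance(utterance, state, awake=False):
    """One pass of the loop: wake on the wake word, then recognize
    a known command and execute its action on the shared state."""
    text = utterance.strip().lower()
    if not awake:
        # Wake step: stay idle until the wake word is heard.
        return state, text == WAKE_WORD
    action = COMMANDS.get(text)
    if action:
        state = action(state)  # execute the matched command
    return state, True

# Simulate a short classroom session.
state, awake = {"light": False, "x": 0}, False
for heard in ["hello robot", "turn on the light", "move right"]:
    state, awake = handle_utterance(heard, state, awake)
print(state)  # -> {'light': True, 'x': 1}
```

The same pattern scales naturally to the on-screen movement extension: adding a new voice command is just one more entry in the command table.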
Next, we tackled visual data. Using the HUSKYLENS AI vision sensor, we introduced teachers to how machines perceive the world. We broke down the core workflow of face recognition—detection, alignment, encoding, and matching—giving educators a clear technical framework they can easily explain to their students.
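To make the four stages concrete, here is a toy, self-contained sketch of that workflow: detection, alignment, encoding, matching. A real sensor like HUSKYLENS performs these steps in firmware with learned neural features; every function below is a deliberately simplified stand-in meant only to show how the stages connect.

```python
# Toy face-recognition pipeline (teaching stand-in, not the device's algorithm).

def detect(image):
    """Detection: find a face region (here, any non-empty 'image')."""
    return image if image else None

def align(face):
    """Alignment: normalize pose/orientation (here, sorting stands in)."""
    return sorted(face)

def encode(face):
    """Encoding: compress the face into a compact numeric signature."""
    return [sum(face) % 97, len(face)]

def match(encoding, database, tolerance=0):
    """Matching: compare the signature against known faces."""
    for name, known in database.items():
        if all(abs(a - b) <= tolerance for a, b in zip(encoding, known)):
            return name
    return "unknown"

# Enroll one face, then recognize the same face seen in a different order.
db = {"Alice": encode(align([3, 1, 2]))}
face = detect([1, 2, 3])
result = match(encode(align(face)), db)
print(result)  # -> Alice
```

Walking students through a skeleton like this, stage by stage, mirrors exactly how we framed the workflow for the teachers.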
The highlight of the session? Building a "Smart Pet" system! Teachers connected the HUSKYLENS to the UNIHIKER K10 via Mind+ and trained the system to recognize different types of cats (like an Orange Tabby, a Striped Tabby, and a Siamese). Depending on the cat it recognized, the system would respond with different interactive states. It was a fun, highly engaging way to demonstrate how visual input drives interactive systems.
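The Smart Pet logic boils down to a simple dispatch: the vision sensor reports which trained object it sees, and the program picks a matching interactive state. A hedged sketch of that idea is below; the ID numbers and response strings are assumptions for illustration (in the workshop this was built with Mind+ graphical blocks, and HUSKYLENS reports trained objects by numeric ID in training order).

```python
# Hypothetical Smart Pet dispatch: recognized cat -> interactive state.
CAT_IDS = {1: "Orange Tabby", 2: "Striped Tabby", 3: "Siamese"}

RESPONSES = {
    "Orange Tabby": "purr + show happy face",
    "Striped Tabby": "meow + show curious face",
    "Siamese": "chirp + show alert face",
}

def respond_to(detected_id):
    """Pick the interactive state for whichever cat was recognized."""
    cat = CAT_IDS.get(detected_id)
    if cat is None:
        return "idle: no known cat in view"
    return f"{cat}: {RESPONSES[cat]}"

print(respond_to(3))  # -> Siamese: chirp + show alert face
print(respond_to(9))  # -> idle: no known cat in view
```

Because the dispatch table is data rather than logic, students can extend the pet to new animals or new reactions without touching the control flow.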
Our goal for the three-hour session wasn't just to show off cool tools. As part of the workshop's assessment, we asked each educator to outline how they would adapt these exact projects for their own classrooms.
This workshop highlighted a practical model for K–12 AI education that we believe can be replicated in schools anywhere: accessible hardware, graphical programming, and progressive projects that teachers build themselves before bringing them to students.
At DFRobot, our greatest success is seeing educators shift from merely understanding AI to confidently applying it. By experiencing the entire creation process themselves, the teachers at Kathmandu University left ready to inspire the next generation of innovators.
Want to bring these projects to your classroom? Check out the UNIHIKER K10 and HUSKYLENS in the DFRobot store, and download our free Mind+ software to get started today!