Posts

Showing posts from March, 2026

Robotic Hand Assist for Rehab - Part 2 - The Genesis

The Inspiration Behind My Robo Hand Assist for Rehab Project! Some incidents have a deep impact on you. In December 2025, someone very close to me met with a horrible accident: multiple injuries to his hands, legs, and hips, multiple surgeries, a long stay in the hospital, and a long recovery after that. Among the injuries, one was particularly brutal, not because it was life-threatening, but because it took away control. He suffered a radial nerve injury, a condition that affects the ability to extend the wrist and fingers. In simple terms, the hand loses its ability to open and lift. This leads to something known as “wrist drop”, where the hand is unable to perform even the most basic actions. Holding a bottle. Picking up a pen. Shaking someone’s hand. Movements we never think twice about suddenly become impossible. What followed was a series of complex medical procedures - fracture fixation, nerve repair, and eventually a tendon transfer surgery. F...

Mood Lamp Project Part - 2

With the hardware done, the next step was the firmware and connecting it to Blynk. The code runs in the Arduino IDE and handles three things: the Wi-Fi connection, receiving RGB values from the Blynk dashboard, and writing them to the NeoPixel ring. The Blynk app sends values through the cloud to the ESP32, which updates all 16 LEDs. Setting up Blynk IoT: create a project in the Blynk web console, add an ESP32 device, get your Auth Token, and paste it into your code. The phone dashboard is three sliders for Red, Green, and Blue, each linked to a virtual pin. Testing was plug-and-play. Moved the sliders, the LEDs responded. Ran through a few colors: blue, green, red, and mixes. No issues at all. The lamp also diffuses light well when placed face down on a surface. The acrylic scatters it into a soft glow rather than 16 separate points. It wasn't planned, but it's a useful effect. The working video is up on YouTube: https://youtube.com/shorts/1-iBQc89kcM, and the code is up ...

Mood Lamp Project Part-1

I wanted to build something at the intersection of IoT and home lighting — something you'd actually keep on your desk. The Mood Lamp ended up being a 10 cm circular lamp with 16 RGB LEDs, controlled from your phone over Wi-Fi. The main component is an Adafruit NeoPixel Ring: 16 x 5050 RGB LEDs, each individually addressable. You push an RGB value to each LED to set the color. That's what gives you basically infinite color options rather than a fixed set. For the microcontroller, I went with the ESP32. The reason: it has Wi-Fi built in. No extra module, no extra wiring. You flash the code from the Arduino IDE via micro USB, connect to your network, and the hardware side is essentially done. Three wires connect the NeoPixel ring to the ESP32: data to GPIO4, 5V to VIN, and ground to ground. The body is laser-cut acrylic, circular, about 10 cm across. It holds the ring and the ESP32 in place. I had it cut to fit both parts without needing any adhesive. The phone side runs on Blynk IoT....

No Hands Across America

With their Autopilot and Full Self-Driving (FSD) options, Tesla's cars (Models 3, Y, S, X) are among the most sophisticated driverless cars on the road, and they are continually getting better. Rivian (R1S, R1T) and Waymo's fleet of all-electric Jaguar I-PACE SUVs, running Google’s autonomous-driving technology, are other notable names. A milestone in the journey of driverless cars, though, was the 2,850-mile, no-hands road trip called “No Hands Across America”, undertaken by two CMU researchers in an almost completely autonomous car in 1995. In July 1995, Dean Pomerleau and Todd Jochem of CMU’s Robotics Institute took an epic 2,850-mile journey from Pittsburgh to San Diego in a 1990 Pontiac Trans Sport minivan. Their driver for more than 98 percent of the journey was a computer named the Rapidly Adapting Lateral Position Handler (RALPH), and their minivan was Navlab 5, the latest in a series of autonomous vehicles that had been developed at CMU’s Robotics Institute since 19...

Shakey: The First Electronic Person

In 2026, Tesla Optimus has become the de facto face of the new-age autonomous robot revolution. Optimus is a 1.73 m, 57 kg, general-purpose humanoid robot designed for autonomous, repetitive, or dangerous tasks. It uses Tesla's EV battery technology, has end-to-end neural-network AI capability based on Tesla FSD, and can learn tasks by observing humans. And though it is difficult to fathom, the first autonomous mobile robot belongs to the 1960s, a full 60 years prior to 2026. “Shakey” was the first mobile robot with the ability to perceive and reason about its surroundings: it was physically mobile and had elementary computer vision and basic navigation capability. Shakey (https://www.sri.com/hoi/shakey-the-robot) was created from 1966 to 1972 by the Artificial Intelligence Center at Stanford Research Institute (now SRI International). Shakey could perform tasks that required planning, route-finding, and the rearranging of simple objects. Shakey could perceive its surroundings, ...