28 June 2012

The Nerdkit: Invaginations And Exphallations

Or, how Rule 34 of the interwebs still holds true for Engineering blogs.
I guess the words of the week are Invagination and Exphallation. Two words that sound dirty, kinda are, but really aren't. I haven't been able to find them on standard dictionary websites, but MIT professors say them all the time, as do some of my manufacturing textbooks, so they must exist.


To put it simply, this picture depicts an invagination, which you can imagine as a hole of some sort, and an exphallation, or protrusion. Their geometry isn't perfect, but when you snap the exphallation into the invagination, the invagination's thin walls flex to receive it and hold the two parts together.
*shudder* I feel dirty. Let me explain:
I'm doing some undergraduate research this summer! I am working at the Laboratory for Electromagnetic and Electronic Systems (LEES) as a mechanical designer. Our overall project is the design of various aspects of a potential new class for EECS freshman to get their feet wet. 



Particularly, we're designing a NerdKit, a container of sorts with some circuit prototyping elements such as breadboards, power supplies, a function generator... It should be compact, full-featured, and able to fit in a student's backpack. Also, the students make it themselves using a manufacturing technique called Thermoforming: draping a heated plastic sheet over a mold and using vacuum suction beneath the mold to eliminate all air pockets, reproducing the shape defined by the mold.




To start things off, my coworker and I are prototyping the more intricate design aspects: fastening and hinges. Here are the molds we made out of High Density Polyurethane, copying the design of an apple container we had lying around.




Some thermoforming process pics and a vacuum fail... 



And the thermoformed pieces! The insert fit into the container like a charm, though it took some modifications to get my hinge snapfit thing working right...



To get suction in hard-to-reach spaces, I had to drill some holes in the mold. This allowed the plastic to form closer to the mold, resulting in...


This working snapfit: much better! That invagination and exphallation look so happy together :3. The hinge and snap fit took a few iterations to get right, but worked well! The issue with this design, however, is that the part is made facing up and closes like a book upon removal from the machine. The parts I'm making for the case of the NerdKit are all made facing down, as if the book were placed cover up. The hinge that worked for the up-facing part won't work for the final product, so I need to design a better hinge that can take a beating and will work for the down-facing final product.
Good thing I have all summer!
Unfortunately the mold broke as I was removing it from the part, probably because it was too thin: HDP is expensive shit, so we were frugal with our molds. Next time I won't be so cheap, and will use a urethane release spray.
My next endeavor will be learning to CNC mill (2.5D) molds for some other test hinges. I feel most of product design is iterating through your ideations until you find a prototype that works... gives me new respect for product designers. (PS: Watch the documentary Objectified, it's awesome.)

02 June 2012

6.141: Robotics Final Challenge

This term I took 6.141 (Robotics: Science and Systems I), a class about life, love, and the pursuit of autonomous mobile robotics. Teams consisting of four students, usually from different majors, collaborate on the class’s seven labs as well as a 3-week Final Challenge, to implement a motley array of software-based techniques for robot navigation, localization, mapping, and manipulation. As an Institute Communication Intensive Laboratory class, we had several design proposals to do, both individual and team-based. Additionally, we had to give a presentation after each lab and a design overview before beginning work on the Final Challenge.

The Final Challenge had the following premise: An autonomous rover lands on Mars with the intention of building a predetermined structure for future human colonists to use. 


Within the scope of the class, however, our robot simply had to navigate this maze, collect those blocks, and build some kind of structure in any way possible. Easy enough, right? We'll just make this thing:


The only restriction (to my dismay): no flamethrowers, saws, hammers, or other devices that could destroy or otherwise disassemble the environment. Oh well...

The freedom for the robot's design allowed for each team to come up with a unique approach to solving the problem. Historically, most teams opted to use as much of the code from previous labs as possible, which involved active manipulation for collecting blocks, and a passive construction method of funneling the blocks into a structure on board the robot that could be placed in the environment.



My team, Team VI: The Sentinels, opted to go for the inverse, which was passive collection and active construction. We went for an (almost) entirely open-loop approach, using only the provided map and block locations, with killer wheel encoder odometry, for navigation. 
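For the curious, open-loop navigation of that sort boils down to repeated turn-then-drive steps toward each waypoint. Here's a minimal sketch in Python — an illustration I wrote for this post, not our actual class code; the function name and the `(x, y, heading)` pose convention are my own:

```python
import math

def turn_then_drive(pose, waypoint):
    """Given the robot's current pose (x, y, heading in radians) and a
    target waypoint (x, y), return the in-place rotation and the
    straight-line distance for one open-loop turn-then-drive step."""
    x, y, theta = pose
    wx, wy = waypoint
    dx, dy = wx - x, wy - y
    distance = math.hypot(dx, dy)
    # Rotation needed, wrapped to (-pi, pi] so the robot turns the short way.
    turn = math.atan2(dy, dx) - theta
    turn = math.atan2(math.sin(turn), math.cos(turn))
    return turn, distance
```

With accurate odometry updating the pose after each step, chaining these commands through the map's waypoints is all the "navigation" the robot needs.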
See, our robot has no vision, bump, or distance sensing. Before we first demonstrated successful navigation, members of other teams started talking smack about our Sentinel, calling the poor thing "blind". 


And it certainly is blind. Not one camera or sensor on that thing. We did implement computer vision with both a webcam and a Kinect, but in the end it wasn't really necessary because of our dead-accurate encoder odometry.

(We experimented with mounting two optical mice underneath the robot and using their readings as odometry. We conducted a test to see which was more accurate, the mice or the wheel encoders. Our robot started at one corner of a 1-meter square, and we gave it 10 waypoints, each at a random corner of that square. After multiple tests, we saw, to our surprise, that the wheel encoders consistently got it to within 2 centimeters of its final waypoint, while the optical mice always left it more than 5 centimeters from that final waypoint. Clearly, we had some kick-ass encoders.)
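For readers who haven't seen wheel-encoder dead reckoning before, here's the standard differential-drive pose update, sketched in Python. The wheel base and meters-per-tick values are placeholders for illustration, not our robot's actual parameters:

```python
import math

# Assumed parameters -- placeholders, not the real robot's values.
WHEEL_BASE = 0.40         # meters between the two drive wheels
METERS_PER_TICK = 0.0005  # wheel travel per encoder tick

def update_pose(pose, left_ticks, right_ticks):
    """Dead-reckon a differential-drive pose (x, y, heading in radians)
    from the encoder ticks accumulated since the last update."""
    x, y, theta = pose
    d_left = left_ticks * METERS_PER_TICK
    d_right = right_ticks * METERS_PER_TICK
    d_center = (d_left + d_right) / 2.0        # distance traveled by midpoint
    d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading
    # First-order update: treat each short interval as an arc, evaluated
    # at the midpoint heading.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)
```

Called at a high rate, this accumulates position from nothing but tick counts, which is why good encoders (and well-matched wheels) make such a difference: every tick of error compounds.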

Our robot navigated the maze, driving over blocks, which were collected by a bent sheet-aluminum funnel suspended underneath.


As the blocks were collected, they slid along the funnel and into the space underneath that was sized such that the blocks would interlock and form a straight line. 



Once we had collected enough blocks, the robot navigated to a spacious area of the maze and drove backwards to deposit the blocks in front of it.



And then construction began, using only the provided manipulator, which was actuated by hobby servos that were too weak for the job, prone to overheating, and nonlinear as a result. But it worked well enough:

We somehow got it working. First came a teammate's bet that I couldn't get it to stack a 5-block tower. (If I got it working, I'd get $100.00; if I didn't, I owed him a sandwich. A reasonable bet.) I got it working once, with inconsistent results, but once was enough to win the bet.


Then came hours of fine-tuning that sounded a lot like the above video does at 03:15.


More fine-tuning led to consistent behavior, 5 minutes of fame on the MIT EECS Facebook page and the 6.141 website (and an A). Above is footage of it stacking a tower. (Including Neil waking up in disbelief after sleeping on the ground for a couple hours. The life of an MIT student sure is a glamorous one.)


And footage of it completing an entire Challenge run. Complete with Will talking about my old dormitory, known in the 80s as "McKegger", and myself rambling on about all the nerdy stuff I learned (and didn't) in the class. 


The other teams had some noteworthy designs as well, a class favorite being this bad boy.



This robot, like most 6.141 bots of the past, actively collected blocks using the manipulator and dumped them into a container in the back where they were passively assembled into a structure. The real innovation made by this team was the design of the back-mounted hopper, and the type of structure it left in its wake. 


The initial blocks fall into place on the bottom, oriented 45 degrees up from the ground and aligned side-by-side. The following blocks fill in the gaps in between, stabilizing the structure, which is only supported by the corners of the bottom blocks.

When time is up, the hopper rotates back like a dump truck until the blocks contact the ground. The robot then pulls away, leaving this arch-like structure.
The class was a blast and I learned a ton. Unfortunately, we don't get to keep the robots because they get disassembled for next year. 
But that's okay, more robots are coming...



(Yes, I got my hands on one of these, too)