Eye Mechanism Assembly
This new mechanism is inspired by Mats Onnerby and Bob Houston. It replaces my previous eye mechanism, which you can still find after the tutorial below.
Download STL from the Gallery
Before printing all the parts, you should print the CALIBRATOR to check whether your parts will fit together. If you have a very hard time putting the parts together, adjusting the horizontal expansion setting of your slicer software can solve that. This setting varies depending on your slicer and printer, but users report that -0.15 is a great place to start.
Use an infill of 30% and a wall thickness of 2 mm. Print with no raft and no support (unless specified), and use a brim for big parts to avoid warping.
You will need to print all these parts at a good resolution:
- 1x 2xEyeBallFullV2
- 1x EyeHingeCurve
- 2x EyeHinge
- 1x EyeHolder
- 1x EyePlateLeft
- 1x EyePlateRight
- 1x EyeSupport (print with support)
- 1x EyeToNose
Depending on your camera choice, print:
- EyeBallSupportHerculeLeft
- EyeBallSupportHerculeRight
- EyeBallSupportLifeCamHDLeft
- EyeBallSupportLifeCamHDRight
I used two DS929HV servos from Hobbyking because they are stronger, but you can also use DS928HV servos, or even cheap SG92R or HXT900 servos.
You can use one of two webcams:
- Hercules Twist HD webcam
- Microsoft LifeCam 3000 HD (possibly the 5000 as well, if the PCB is the same)
Dismantling the LifeCam 3000 HD: https://astrophotovideo.
A video tutorial in French is available; English subtitles can be enabled through the YouTube settings:
Step 1:
Redrill all the holes on EyeToNose with a 2.5 mm drill.
Tap all the holes with a 3 mm diameter tap.
Glue EyeToNose to EyeGlass, using acetone if you have ABS prints or epoxy for PLA.
After redrilling the four big holes of EyeSupport, tap them with a 3 mm tap.
Redrill the two holes of EyeHolder with a 3 mm drill.
Insert and mount four 3 mm screws. You can add bolts as shown to secure them.
After redrilling the holes of both EyeBallSupports with a 3 mm drill, mount them as shown. At this stage you can also add the Hercules camera to the EyeBallSupport. My picture does not show the mounting of the Hercules HD cameras because I was trying a different sort of camera at the time.
Here, I am using a LifeCam 3000 HD from Microsoft. The image quality seems not as good as the Hercules Twist HD. Notice that the EyeBallSupport is different, because the PCB of this camera sits horizontally.
Step 2:
Mount one DS929HV servo from Hobbyking with its screws.
Set the servo to 0 degrees, with the horn positioned as shown.
Mount the second DS929HV servo.
Set the servo to 0 degrees and mount the horn at approximately a 45-degree angle.
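If you drive the servos from MyRobotLab, a minimal Python sketch along these lines can hold a servo at a known angle while you fit the horn. This is only a sketch: the COM port and pin number are placeholders, and the exact attach syntax can differ between MyRobotLab versions.

# Minimal sketch (MyRobotLab Python tab): hold an eye servo at a known angle
# while fitting the horn. "COM3" and pin 22 are placeholders; use the port
# and pin of your own setup (see the Hardware map for the defaults).
arduino = Runtime.start("arduino", "Arduino")
arduino.connect("COM3")

eyeServo = Runtime.start("eyeServo", "Servo")
eyeServo.attach(arduino, 22)   # attach syntax may differ between MRL versions

eyeServo.moveTo(0)             # hold the servo at 0 degrees while you screw the horn on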
Add the EyeHolder with two 3 mm screws.
Fix the mechanism to EyeToNose. Once mounted, it should be able to rotate freely on its shaft.
Mount and screw the EyeHingeCurve between the servo horn and EyeToNose. Again, the mechanism should be able to rotate freely on its shaft.
Add the EyePlates to the back of the EyeBallSupports. You can either glue them or screw them.
Mount and screw the two EyeHinges between the top servo and the EyePlates.
Step 3:
Now we are going to create some realistic eyes in a few simple steps.
Unmount the mechanism by unscrewing the two screws of EyeToNose to get access to the front of the eyes.
Below is the ring placement order:
Now let your creativity run wild!!
Spray paint, or use a colored marker of your choice, to create a base on the pupil ring. In my case I used matte blue spray. You can also print the IrisEye file and cut it out.
Make some little lines with another thin black marker, from the center to the outside. Perfection is not required, as you can see in my picture. But if you are good and creative, you can really obtain something very realistic.
Add some little lines with your thin marker on this outer ring as well.
Spray paint the iris ring with a matte black finish.
Now assemble all the rings together.
You can also replace the outer ring, pupil and iris with a fisheye lens for smartphones, but it looks less realistic.
To make the eyes even more realistic, let's be creative!! Now we are going to create the transparent eye cover.
You will need a heat gun, a glass ball of approximately 3 cm diameter (I used the cap from my wife's "Dior J'adore" perfume, fancy eyes!!) and a piece of thermoformable plastic (crystal clear is better, otherwise it will alter the camera vision). I personally used a piece of a light bulb blister pack, but any crystal-clear blister will do the job.
Clean up all the surfaces to make sure you won't spoil the transparent plastic with dust or scratches during thermoforming.
Apply hot air to the plastic and, when it starts to become soft, press and stretch it over the glass ball. Wait for it to cool.
Mark the circle using the pupil ring, and cut out the lens with scissors.
Add a bit of glue around the perimeter, being careful to keep it clean and avoid fingerprints on the inside surface.
Glue the lens onto the pupil ring. Let it dry.
Do the same with the second eye.
Mount the mechanism back on the EyeGlass. There you go, you have an impressive-looking InMoov!!!
You are now ready to install the whole face and eye mechanism into the head. See the Hardware map to check your default servo positions. Make sure no cables prevent the eye mechanism from rotating in the X and Y directions.
Tutorial for the old eye mechanism:
My original STL files are still available here.
I used three DS929HV servos from Hobbyking. The two servos mounted for the right/left movement are connected with a Y connector, so they receive the same data and act simultaneously.
Some of the pictures in this tutorial show parts that might be a bit different from the ones you actually printed. This is due to different iterations.
Now fix EyeMoverSide through EyeCamera to the actuator of the servo. You can also mount a second servo instead of using EyeMoverSide.
Two servos make the X movement sturdier.
Hi Gaël
Tried to advance to the new eye mechanism (2015). Small problem with the eye hinges for the X-movement. I measure the distance between the cam holders' turning axes as 54 mm. The eye hinges you say to use twice only add up to 2*26 = 52 mm and do not take the angle to the servo arm into account. This gives me a cross-eyed result. Would it not be better to connect the two cam holders with one long straight connector to keep them parallel and connect only one side to the servo?
Or maybe I wasn’t able to follow your detailed instructions correctly
Regards
Juerg
Hello Juerg,
Something must not be mounted correctly, I suspect.
I designed it specially to avoid a cross-eyed effect; in fact, the eyes should have the opposite effect, on purpose.
The two cameras are not meant to be aligned.
My previous eye design had a single connector for both eyes, and most people would align the cameras to get a parallel view, which gave a cross-eyed gaze.
Could you make a thread on the InMoov forum and post pictures? It will be easier to understand what went wrong, and if I need to review something.
Hi Gael, I am new to the InMoov robot, but I have a question about the eye webcam.
My question is: if you use a webcam with a USB cable, where should it connect? Should it connect to the Nervo board or directly to the tablet? Can you please tell me….
Hello,
You need to connect it to the tablet or PC.
The Nervo board doesn't have USB connectors.
If you build a complete robot, you will need a USB hub powered with at least 4 A to support the two Arduinos, the camera, the Kinect, the sound system and the NeoPixel Arduino. Otherwise your tablet or PC won't support all these USB devices.
Hello, which Hercules HD camera do you use? The Hercules HD Twist? In both eyes? Do you have a tutorial on how to connect the cameras, speakers and the Kinect?
Yes, it is the Hercules HD Twist. It is simply connected via USB to the Lenovo 8″ tablet that runs InMoov with MyRobotLab. Although I have two different cameras, one in each eye, I currently only use one at a time.
The Kinect is also connected via a USB port.
Make sure to have a sufficiently powered hub. My hub is powered by a 5 V, 10000 mAh power bank via USB.
The loudspeakers can be connected via USB, or via a micro-USB sound card plus a 6 V mini amplifier board (it depends on what you have).
I will post references in the BOM Hardware page to make it easier for you.
Thank you very much
One question
Is the Kinect connected with a 220 V power supply, or without it?
Thanks.
The Kinect is currently connected to 220 V, but it is possible to modify the cables to power it via a USB cord. I had found a "how to" tutorial on the internet.
Unfortunately, I didn’t have the time to do it myself and post a tutorial about it.
You will have to search the web or ask the InMoov forum if somebody has the schematic.
Excuse me, do you perhaps have instructions on how to dismantle the Hercules Twist camera? I wouldn't want to damage anything during disassembly. Do only these two cameras fit, or do you perhaps have some other suggestions?
Hello,
I didn’t make a tutorial on how to dismantle the Hercules camera.
Mainly, I clamped it in a vise and used a metal saw to cut into the plastic.
Thank you for the reply. Great respect and admiration for what you do and for how you share your experience with others. I would gladly meet you in person; if you are ever in Poland, let me know :). By the way, are only these two cameras an option, or could you suggest something else that would fit the eye mechanism? It is hard to get the Hercules camera, and the Microsoft LifeCam 3000 HD is quite expensive.
Hi Gael,
I have 3 questions about the 2.0 version of your eye mechanism. First, I noticed that the X axis pivot points are slightly off center. So the left lens/eye is a little closer to the left eye hole when looking right and the right eye is closer to the right eye hole when looking left. Is there a mechanical reason for the offset? Clearance perhaps? Is it noticeable?
The second question is a little harder. Human eyes have over 135 degrees of vision but can only focus within about 6 degrees. So they move around a LOT in order to focus on anything within that 135 degree plus arc. (They are called Saccades, I think.)
Here is the question. Do the InMoov eyes move strictly for the aesthetics of eye movement, or is there some true object-tracking/face-tracking, easier-than-moving-the-whole-head computational logic in MRL someplace? I ask because I have tried face tracking with a 3-axis head and 2-axis eyes on another platform and it was extremely difficult to make it work using both head and eye movement. Can MRL do that, or does it effectively turn off eye movement while tracking?
Last question. Given that you are using an HD Twist in your eye, what do you think the degrees of vision is in your bot? How much does eye movement improve that range of vision?
Sorry for the long comments/questions. Thanks in advance for your response.
Hi Scott,
The off-center pivots are on purpose: it is to avoid a cross-eyed effect on the robot. Too many people using my previous eye version didn't care much about that effect and wouldn't adjust it correctly, and it bothers me very much to see pictures of InMoov with a cross-eyed look.
Unfortunately, Alan Timm has modified that again and the eyes look cross-eyed again.
The cameras don't care if the vision isn't aligned, because within MyRobotLab we can set the middle point wherever we want.
MyRobotLab has a 4-PID tracking system which is specially implemented for InMoov's vision. The eyes move faster than the head when searching for a face or an object. You can adjust the speeds of the PIDs in the Python script. The 4-PID system was developed by Alessandro Didonna, who is often on MRL.
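For illustration, here is roughly how those four PIDs were tuned in the older InMoov start scripts. Treat this as a hedged sketch: it assumes the InMoov service is already started as i01 with tracking enabled, and the service names (i01.headTracking, i01.eyesTracking) and gain values come from older scripts, so they may differ in current MyRobotLab releases.

# Hedged sketch: the 4 PIDs used by InMoov vision tracking (head X/Y, eyes X/Y).
# Assumes the InMoov service "i01" is started with tracking; names and values
# follow older InMoov start scripts and may differ in newer MRL releases.
i01.headTracking.xpid.setPID(10.0, 5.0, 0.1)   # head pan
i01.headTracking.ypid.setPID(10.0, 5.0, 0.1)   # head tilt
i01.eyesTracking.xpid.setPID(15.0, 5.0, 0.1)   # eye X, higher gain so the eyes react faster than the head
i01.eyesTracking.ypid.setPID(15.0, 5.0, 0.1)   # eye Y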
The HD Twist doesn't have a very wide field of vision; therefore it is possible to have one eye with a standard HD Twist and to add a 160-degree fisheye lens for iPhone on the second eye. Kevin Watters has done it and it seems to work pretty well.
Remember that Windows cannot (as far as I know) mount two cameras at the same time, a pity for InMoov.
Thanks for the quick response. I suspected maybe the offset pivots had something to do with solving the cross-eyed problem. I have 3 interesting eye versions now and did notice that Alan’s camera holders are not set up for the twist, so it looks like I’m either going to use your design or maybe do a little remixing.
….Windows can’t handle two cameras…..OK, that’s interesting. I’m guessing you mean that there is only one video channel via USB. I didn’t even stop to think that even the Oculus Rift only has one camera and that is for positioning, not real-time stereo viewing. I know you and the MRL folks are way ahead of me, but I’ll talk to some colleagues to see if they have any ideas.
So far, although I’ve been printing for weeks, I do not have any complete subsystems ready to go yet. I’ve been testing servos and hands with other control tools I have, so I haven’t really jumped in to see MRL’s capabilities. Glad to hear it uses all servos for tracking. Really looking forward to making mine do that.
Thanks again for the answers.
Scott
Gael, amazing work on this. I know this comment is many years old, but Open Broadcaster software ( https://obsproject.com/ ) provides the ability to stream multiple cameras at the same time, though limited by USB bandwidth. I’m wondering if this could be used with InMoov to create a stereoscopic telepresence of some sort.
Hello,
Thanks for your message.
The new version of MyRobotLab allows you to do stereoscopy directly with OpenCV. I haven't tried it though, so I really don't know how fast it is.
If Open Broadcaster is open-source software (including all its components), I am sure the MRL developers could wrap it up for the Nixie version.
I suggest you go to MyRobotLab.org and chat about it.
Thank you for the suggestion, I will! I am still new to this community. For anyone interested in OBS, you can find the source on Git at https://github.com/obsproject/obs-studio .
Seconds after posting this, I noticed Alan Timm’s remix of the eye mechanism with centered eye pivots! As usual, somebody here already had an answer!
Thanks,
Scott
In your blog you mention a 3 mm diameter glass sphere; I assume you meant a 3 cm diameter sphere?
I was wondering the same thing. It could have meant to also say 30mm.
Yes, I made a mistake, it's a 3 cm glass ball (or 30 mm).
Yes, you are correct, I made a mistake, it's a 3 cm glass ball. I will modify the tutorial.
Thanks Juerg
Hi Gael,
I LOVE the new clear eye covers! Your technique for creating the plastic cover is awesome and will probably work well regardless of which of the dozen or so eyeballs is used. Fortunately (great minds… wives of great minds….) my wife also wears J'Adore, so I will try this as soon as I get my lab up and running again. (Been sidetracked with a couple of other projects…)
Thanks!
Scott
Lucky you, to have a wife with good taste!
Yes, I think the plastic cover technique can work on most eyeballs. What is important is to make the diameter of the cover the same size as the iris; this avoids showing the overlap of the cover on the white of the eyeball.
Gael, your artistic abilities rival your mechanical mind! I thought the silicone finger grip mold was impressive! Bravo.
Hello, thank you for your comment!
I have been building the InMoov head for my two younger children as a holiday activity for the last ten days. I made the eye mechanism and eyeballs with an eye-lens effect using a properly curved transparent sheet. This is my technique without a glass sphere.
I used a 4 cm diameter ping-pong ball as a support instead of the 3 cm diameter glass sphere. I was afraid that the heat would melt the ping-pong ball, but it was not the case.
For the clear transparent surface to be thermoformed, I used a transparent sheet of the kind used to protect and cover bound documents, the ones you can buy in a stationery shop.
I set the hair dryer to maximum heat and held it close to the surface of the transparent sheet (approximately 3 cm away), with the sheet laid over the ping-pong ball placed on the table. I used weights on the edges of the transparent sheet to keep it stretched over the ping-pong ball.
When the transparent sheet is heated enough, it begins to sag down onto the ping-pong ball and stick slightly to it, but not over the whole top surface needed for the eye. Then I quickly grab the edges of the transparent sheet with my hands (with somebody else pointing the hair dryer in another direction so that I don't burn my hands), stretch the sheet over the round surface of the ball, and keep it stretched while it cools down and becomes solid again.
I then have nearly a third of the ping-pong ball's surface covered with the transparent sheet, following its round surface exactly. I can then put the iris ring on it, draw a circle of the correct size and cut it out. I get a very nice eye lens with this technique, and the ping-pong ball never melted or deformed with the heat; it is made of a very heat-resistant material.
So there you have a technique with a cheap ping-pong ball. What is more, the 4 cm diameter of the ping-pong ball gives an eye-lens curvature that is more in harmony with the printed eyeball of InMoov.
The final look of the eyeball is nice (and it works well with the webcam to get clear pictures; I mounted a Hercules Twist HD cam bought new online for 8 € ≈ $10 from Cdiscount and tested it on the eyeball with the eye lens). I am just not great at drawing and cutting a perfect circle on the curved transparent sheet after thermoforming, so the edges of the eye lens don't perfectly match the edges of the iris on my eyeball after gluing; I can improve!
Hello guys, do I have to write the code myself, or is there code already written that I can take as a starting point? Thank you
Hello, the MyRobotLab software is already coded to deal with the eye mechanism.
I can't find the MyRobotLab software containing the code for the eye mechanism. Is it possible to receive it by email? Thanks
Hello,
Here is the tutorial to set up MyRobotLab and to configure each component, such as the head and eyes:
http://myrobotlab.org/service/inmoov
The configuration of the eyes is done within the head config file:
https://github.com/MyRobotLab/inmoov/blob/master/InMoov/config/skeleton_head.config.default
Set head to True:
[MAIN]
isHeadActivated=True
And set the mappings of the eyes servo:
[SERVO_MINIMUM_MAP_OUTPUT]
;your servo minimal limits
eyeX=60
eyeY=50
[SERVO_MAXIMUM_MAP_OUTPUT]
;your servo maximal limits
eyeX=100
eyeY=100
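For reference, these limits roughly correspond to the servo map() call that MyRobotLab applies. A hedged Python equivalent, assuming the InMoov service is started as i01 and the eye servos are exposed as i01.head.eyeX / i01.head.eyeY, would be:

# Hedged equivalent of the mapping above, from the MyRobotLab Python tab.
# Assumes the InMoov service "i01" is started and exposes i01.head.eyeX/eyeY.
i01.head.eyeX.map(0, 180, 60, 100)   # inputs 0-180 mapped onto the physical 60-100 range
i01.head.eyeY.map(0, 180, 50, 100)

# Quick test: sweep an eye between its mapped limits
i01.head.eyeX.moveTo(0)     # goes to the mapped minimum (60)
i01.head.eyeX.moveTo(180)   # goes to the mapped maximum (100)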
Hi, I started this project recently and have been ordering parts as I go. I have ordered a camera for the eyes, but only one was available. You say it has two cameras, but only one works in Windows. My question is: do you require two cameras for tracking or any other function at this stage, or is one OK?
Thank you in advance, and thank you for this wonderful project. It has been a great stress reliever for me.
Mark
Hello,
Only one camera is required for vision tracking.
Thanks for your comment.
Hi Gael Langevin ,
Have you looked at this device: http://en.t-firefly.com/product/rk3399.html
It has six cores, it can run Arduino code, and it is powerful enough to run dual cameras.
Hello,

Thanks for the link, it seems like a great product regarding the specs. It would be interesting to test it for further investigations.
Hello
I am starting to build my head with the eye movement. To move the servos, could I use SG90 micro servos with an Arduino Nano and an expansion board, or is its memory too low to create the head movements?
You can use a Nano to control the InMoov head.
Be sure to add an external power supply. Check the connections on the Hardware and BOM page.
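As a hedged example, the head (including the eye servos) can then be started from MyRobotLab roughly like this; the port name is a placeholder and the startHead call follows the older InMoov service API, which may have changed in recent releases.

# Hedged sketch: start the InMoov head (neck, rothead, eyeX, eyeY, jaw) on an
# Arduino Nano connected on COM4. The port is a placeholder and the call
# follows the older InMoov service API, which may differ today.
i01 = Runtime.start("i01", "InMoov")
i01.startHead("COM4")

# Quick eye test
i01.head.eyeX.moveTo(90)
i01.head.eyeY.moveTo(90)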
Hello, I'm starting to print the first parts these days. I will use PLA and I would like to know how I should print the parts (temperature, infill, etc.).
If someone can help me, I'll be very happy.
Thank you
Hello,
Can we use any USB cam if we adapt the support, or only the LifeCam 3000 and Hercules Twist HD?
Hello, you can use any webcam as long as you can adapt the support and fit it inside.
MyRobotLab should detect almost all standard webcams.
Has anyone found any other webcams that will work? I can't find the Hercules online.
Thanks
Hello,
You have two models of webcam for the eye mechanism.
The other webcam is the Microsoft LifeCam 3000.
Make sure to download the correct STL files in the library.
http://inmoov.fr/inmoov-stl-parts-viewer/?bodyparts=Eyes-mechanism
I would like to recommend this camera:
https://www.amazon.de/gp/product/B00OHDQDAW/ref=ppx_yo_dt_b_asin_title_o02_s00?ie=UTF8&psc=1
The camera is so small that no large modifications are needed, besides stripping it from its housing and drilling a 14 mm hole through the parts that hold the camera (the circuit board will be on the back side of the camera holder, and the lens will go through the eye parts).
Hello Gael,
Regarding the OpenCV services, I found and tried a lot of the available face-tracking and face-recognition scripts, but none of them works. My question is: can you share the right link to a face-tracking and face-recognition script to start with?
thanks in advance for your response.
regards,
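As an illustrative starting point only (not an official script), a face-detect preview can usually be brought up from the MyRobotLab Python tab with something like the lines below; the camera index is an assumption, and full head/eye tracking on top of this is configured through the InMoov service as in the config example above.

# Illustrative starting point: a face-detect preview from the MyRobotLab
# Python tab. Camera index 0 is an assumption; change it if your PC also
# has a built-in camera.
opencv = Runtime.start("opencv", "OpenCV")
opencv.setCameraIndex(0)
opencv.addFilter("FaceDetect")   # draws rectangles around detected faces
opencv.capture()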
Why are two cams needed?
Hello, does anyone know where to connect the camera?
Hello,
The camera is connected to one of the USB ports of your PC. When you activate OpenCV in the InMoov UI or config, make sure your PC doesn't have a built-in camera, otherwise it might start the built-in one instead.
But what if I have a built-in camera? Can I not use the InMoov camera?
You have two options:
1- In the OpenCV configs you can choose which camera number MyRobotLab is going to use.
2- In the Windows Device Manager, you can disable your built-in camera.
Personally, I use the first option, because I know my camera number.
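For illustration, the same camera selection can also be done from the MyRobotLab Python tab; a minimal hedged sketch (index 1 is just an example, index 0 is usually the built-in camera):

# Select which physical camera MyRobotLab's OpenCV service captures from.
# Index 1 is an example; index 0 is usually the laptop's built-in camera.
opencv = Runtime.start("opencv", "OpenCV")
opencv.setCameraIndex(1)
opencv.capture()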
Thank you very much @admin.
I disable my built-in webcam in the BIOS. If you enter your BIOS at boot, there is usually an option to disable it. Windows or Linux won't even be aware of the camera after that. Same result; I just like the tidiness of doing it this way.
Does anyone know how to make the eye movement for the InMoov i2 head? Any help will be appreciated.
Hello, can I use the Youmile 2-piece OV7670 camera module (640×480, 0.3 MP, 300KP, VGA CMOS, I2C) for Arduino with Dupont cable, available on Amazon?