This was an educational test: I shot footage from an immobile tripod and tracked a number of points on my palm using SynthEyes.
Once the points were tracked in SynthEyes, the data was imported into Blender, where the monkey-head primitive built into Blender (Suzanne) was composited into the shot, seemingly resting on my hand. I had a few issues: I couldn’t figure out how to get shadows working properly (*an issue I solved after writing this article*).
You can see there is a bit of ghosting around the hand, due to the quick ‘mesh’ of the hand I made to catch the shadows. A little bit of editing would clean it up, but at this point the test has served its purpose. Onward to bigger and better things. There is a small amount of slippage/jitter between the 3D element and the video footage due to my inexperience with SynthEyes. Hey, that’s what these exercises are for!
I would go a bit more in depth with this post, but the wee hours of the morning are upon me and I need to get some sleep, so how about this: if you have any questions or comments, go ahead and leave me one right at the bottom of this page!
‘roB’ is a robot character I made for a short film at Eastern Michigan University. He’s a spunky little bot with a get-up-and-get-it-done kind of attitude, and he can often be seen hanging out on my desk.
Okay, so not really. This is a special effects test involving matchmoving and compositing, courtesy of SynthEyes and Blender. It is a shot I’m quite fond of, and it even has a special effects breakdown to show the ‘steps’ taken to go from raw footage to final composite. I think the visual effects breakdown took as much time as the tracking and compositing work itself.
The raw image sequence was processed using supervised tracking in SynthEyes, meaning that I manually supervised the tracking of each bright green triangular point shown in the video, frame by frame. Time consuming? Yes. But look at the results! How cool is that?
Of course, my trusty Blender came through for me on all of the 3D aspects. Modeling, lighting, rendering, and compositing were all handled by this powerful beast of a 3D package. If you haven’t looked into it yet, really, what are you waiting for?
While my colleagues and classmates in the field of what can be summed up as (and heavily understated by the words) ‘3D Animation’ are, for the most part, supportive of new experiences, new software, and so on, occasionally work I have done is overshadowed for someone by the fact that I did not produce it using a ‘socially accepted’ software package such as 3ds Max or Maya. I am, of course, referring to my preferred package, my Swiss Army knife in a world of 3D: Blender.
The fact of the matter is, I have never seen any package commonly ‘recognized’ to be superior produce anything that Blender could not. In fact, I often see quite the opposite: few packages match the multitude of features in which Blender excels. What else can do particle simulation, volumetric smoke and fire, water, soft bodies, modeling, rigging, animating, texturing, and real-world physics simulation all in the same package? Show me the money, folks.
I’m not necessarily saying that other packages are less capable, nor that Blender is the end-all be-all for every 3D user in the world. What I am saying is this: when you see a truly quality example of someone’s work, something they put a great deal of time and effort into perfecting, by all means ask how it was done and with what package. Compliment it if it truly shines, suggest better ways of doing things, compare pros and cons, and develop a unique and meaningful conversation or critique. What better way to learn?
But please. Please, refrain from changing your mind on the quality of craftsmanship just because it wasn’t made with an Autodesk product. It can leave a bad taste in both of your mouths, and may make you seem uninformed. By writing something off just because that high school computer science teacher you had said that it wasn’t worth a nickel of your time, you risk missing out on a tremendous number of opportunities! You might learn something new if you give yourself a chance to open up to it. Of course, that lesson can be applied to a lot of things in life, not just 3D modeling and animating application wars.
Now that the render I’ve been waiting on is done, I can stop pseudo-ranting and get to the interesting part of the post. I spent all afternoon and evening today working in SynthEyes and Blender. The included short fifty-frame video marks a number of firsts for me.
Into SynthEyes went the image sequence (the video footage broken into a separate image for each frame). SynthEyes is a software package that, to put it roughly, determines how cameras and objects move based on video footage so that we can further manipulate them in software like Blender; it’s one of the things that lets us add special effects. Then came an hour or two of supervised tracking, a first for me, as in the past I have used SynthEyes’ built-in autotracking to track my footage automatically.
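To make the "image sequence" idea concrete: each frame of footage simply becomes its own zero-padded, numbered image file on disk, which is the format trackers like SynthEyes expect. A minimal sketch (the shot name and naming pattern here are hypothetical; any frame-accurate extraction tool can produce files like these):

```python
# Sketch: an image sequence is just one numbered image file per frame.
# The "truck_shot" name and zero-padding width are illustrative assumptions.

def sequence_names(shot: str, frames: int, ext: str = "png") -> list:
    """Return zero-padded file names, one per frame of footage."""
    return ["{}_{:04d}.{}".format(shot, frame, ext)
            for frame in range(1, frames + 1)]

# A fifty-frame shot (roughly two seconds at 24 fps) becomes fifty files:
names = sequence_names("truck_shot", 50)
print(names[0], names[-1])  # truck_shot_0001.png truck_shot_0050.png
```

Consistent zero-padding matters: it keeps the files in frame order when sorted alphabetically, so both the tracker and the compositor read them back in the right sequence.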
This time, not only did I track the movement of the camera using supervised tracking, but I also tracked a moving object in the footage: a vehicle driving down the road. After the tracking was done and the data was exported to Blender, I began playing with the idea of testing the footage by attaching something to the side of the truck.
After a few hours of fiddling, the final product now arises. It’s a short (nearly two whole seconds) technical test slash learning experience: a truck moving down the highway with an image of TRON-style spheres pasted realistically to its side. Not only is the truck moving, but the handheld camera is swiveling in a tripod-like manner to track the truck as it moves along the road.
This may look pretty basic, and in some respects it is, but it is one of the most difficult hurdles in achieving realistic special effects and photoreal 3D additions to live action film. And jumping those hurdles can be a lot of fun.
I spent a decent amount of time yesterday reintroducing myself to Blender 2.5(3) and am happy to say that I can now use it almost completely in my SFX pipeline. Earlier versions of Blender only come into the game once: to import my SynthEyes Python script, as SynthEyes is not yet compatible with the new Python API that Blender uses. All of two minutes later, it’s back to the new version and its wonderful speedy goodness. Here’s a tracking test of the day’s work.
The items onscreen are not final; they are simply visual aids to help determine the quality of the track and to configure the lighting. I have something much more interesting planned for this footage.
Though it took a long time working on it here and there between work and my other projects, I have recently finished a more professional, more finalized special effects test. I learn a great deal every time I tackle another piece, and I think it shows in the final product as compared to my last test. All rendering, design, and compositing of 3D elements were done in Blender, the camera tracking was done using SynthEyes, and the video editing in Final Cut Express.
(If you want to, I would suggest taking the time to watch the video in HD rather than the default, it looks much better that way.)
Something that helped a great deal with this project was the addition of a newly built network-attached storage server, also known as a NAS. The system I built runs a free operating system called FreeNAS and allows me to store and access all of the project files together in a central place on my home network. Eventually I may try configuring port forwarding and a service like DynDNS so that I can access my files over the internet from school and other places as well.
One of the nicest things about the new server, however, is its storage and backup capability. HD footage can be a bit of a storage hog, with an hour of footage taking up as much as forty-some gigs of space. Inside this off-the-shelf-parts NAS machine are two two-terabyte SATA hard drives that are basically identical twins, right down to the data. They are mirrored via a form of software RAID, meaning that if one of the two hard drives dies, my data is still safe and sound on the other one until I can replace the failed drive. I had to lose a terabyte of data before I understood how useful RAID could be.
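The numbers above can be sanity-checked with a little arithmetic. This is a back-of-the-envelope sketch, not a measurement: the ~90 Mbit/s bitrate is an assumption chosen to match the "forty-some gigs per hour" figure, and real capture bitrates vary by codec.

```python
# Back-of-the-envelope: how fast does HD footage eat disk space, and what
# does a two-drive mirror (RAID 1) actually buy you?
# The 90 Mbit/s default is an assumed bitrate, not a measured one.

def hours_to_gigabytes(hours, mbit_per_s=90.0):
    """Storage used by footage at a given bitrate, in gigabytes (10^9 bytes)."""
    seconds = hours * 3600
    return mbit_per_s * 1e6 * seconds / 8 / 1e9  # bits -> bytes -> GB

# One hour at ~90 Mbit/s lands right around the quoted forty-some gigs:
print(hours_to_gigabytes(1))  # 40.5

# Two 2 TB drives in a mirror: every byte is written to both drives, so the
# usable capacity is one drive's worth, and one drive can fail without loss.
drives_tb = [2, 2]
usable_tb = min(drives_tb)              # RAID 1 capacity = smallest drive
failures_survived = len(drives_tb) - 1  # copies that can die without data loss
print(usable_tb, failures_survived)     # 2 1
```

The trade-off is the usual one: the mirror halves the raw capacity you paid for in exchange for surviving a single drive failure. It is redundancy, not a backup, since an accidental deletion is faithfully mirrored to both drives.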
In other news, I just recently finished a new and much lengthier project with the help of a close friend. I’ll be posting a video and information about it soon, so check back often!
With the shelling out of some decent dinero and about six hours of effort, I have here, for the entertainment of all up-and-coming VFX enthusiasts, a camera tracking test. If you are looking for bright lights, action, and large explosions… well, you may be disappointed; this is just a test, and there’s nothing too fancy about it visually. The key focus was tracking a CG item to handheld video footage, and it turned out well for a first attempt.
The tracking was done using SynthEyes, the same software used for tracking in films such as Iron Man 2, Alice in Wonderland, and Avatar. If it isn’t already, SynthEyes is becoming an industry standard. However, if you can’t afford the license, the Voodoo camera tracker works rather well and is, for the most part, free.
The compositing and CG effects were done using the node editor in Blender, a free and open-source program that continues to improve daily.
For a look at what the best of the best can do with Blender, check out this Sintel trailer, the newest short film from the Blender Foundation, coming out in a few months.