At the moment I'm trying out a version of Blender (from graphicall.org) that has integrated the Bullet physics engine into the regular settings (located alongside Cloth sim/Fluid sim/Smoke sim/etc.) - instead of having to go through the old method of simulating rigid-body physics via Blender's game engine, then baking that simulation to keyframes.
It works really well.
Here are a couple of example videos from developer Sergej Reich.
You get instant feedback in the 3D viewport of the rigid-body sim as it happens: no more switching to the game engine, pressing the 'P' key and having to watch a crappy flat-textured version of your objects come crashing to the ground!
I've been using Blender since I think version 2.37.
We're now at version 2.65, and there has been a major overhaul of the user interface since v2.5.
The rate of development of Blender is impressive. I've always felt that it will eventually start finding its way into general usage amongst the bigger VFX houses.
I saw a comment on a website the other day from someone writing about Blender: he talked about how many visual artists nowadays simply cannot afford expensive VFX software due to the current economic climate and may well turn to Blender as it:
(a) is free;
(b) has full-featured VFX tools as standard, such as modelling/UV texturing/the ever-popular node-based interface in its compositor, which somewhat mirrors programs such as Nuke/a 3D tracker/soft-body and rigid-body simulations/and so on and so on;
(c) is actively in development with new features added several times a year.
He then went on to speculate that these users would grow accustomed to Blender and 'bring' it with them to other studios, which in turn would spark interest amongst others. If this happens it would be great.
I've been teaching myself how to use Nuke. It's a very powerful program, but somehow when using it I am reminded of Blender. I think it could be due to the way Nuke (and NukeX specifically) loads so quickly and at first glance seems both overwhelming yet flimsy. It's only when you play around with both programs that you realize how incredibly powerful they really are.
The Foundry's Nuke was an in-house tool that got taken up by so many professionals who liked its way of working. The same could happen with Blender if it gained good word of mouth.
There are many niggles with Blender, but that goes for all software. After Effects, I find, has an awful interface, but you see beyond that once you get it to do what you need it to do.
Blender is often criticised for being a bit of a Swiss army knife of a tool, a jack of all trades yet master of none.
But I find Blender's camera tracker to be very good and easier to use than the camera tracker in Nuke or After Effects. Moving around in 3D space is far better in Blender than in Nuke and After Effects, as well as Cinema 4D and Lightwave. Many of its controls are more intuitive than those of the other programs (with the exception of Nuke).
Going back to my initial point, the integration of rigid-body simulations into the general workflow of Blender is, to my mind, a significant (admittedly very belated!) development which will hopefully have a knock-on effect on other areas, such as a set of presets for collisions and other simulations, so there's less fiddling around with incomprehensible settings before you get the look you want!