AI Agents, Animation, and Minecraft

I have been exploring how Generative AI (specifically LLMs) can assist in going from a screenplay to an animated video, learning about relevant technologies along the way. (No, I haven’t got that far yet, but it’s fun trying and learning!) For example, at the recent GTC 2024 there were some interesting sessions on robotics from Google and Disney. They reminded me of an older paper on Voyager, a project that used an LLM to play Minecraft. The GTC videos are just now being published, so I finally got to watch Jim Fan’s presentation, “Generally Capable Agents in Open-Ended Worlds”, which also mentioned Voyager. This blog post summarizes the highlights from this work that interest me. …

TensorFlow.js in the browser for expression prediction

In my last post I introduced the approach of using Google MediaPipe to control VRM character facial expressions. Hasn’t this been done before? Yes! There are some excellent projects around, including XR Animator. But there is nothing like firsthand experience. In this blog post I describe my efforts to use TensorFlow.js to train a machine …

Generating iFacialMocap Blend Shapes in Unity for VRoid Characters

I have been exploring iFacialMocap, an iOS app that uses the iPhone “ARKit” library to convert facial expressions into a series of movements. These can be sent to Unity so that a character inside Unity follows what is being recorded on your iPhone. A challenge, however, is that the iOS ARKit library uses more blendshapes than come …