We are taking an important leap toward a public release of our latest product, RADiCAL Live. We have now released the RADiCAL LIVE Connector for NVIDIA Omniverse to enable real-time, multiplayer 3D motion capture inside Omniverse, for everyone, everywhere, from any device.

Omniverse is based on Pixar’s Universal Scene Description and NVIDIA RTX technology. The platform enables universal interoperability across different applications and 3D ecosystem vendors, as well as provides real-time scene updates. It is designed to act as a hub, enabling new capabilities to be exposed as microservices to any connected clients and applications.

 

About RADiCAL Live:

The RADiCAL LIVE cloud platform powers software-only, massively scalable, high-quality remote 3D skeletal reconstruction, character animation and user virtualization for practically unlimited participants in shared virtual spaces.

Features include:

  • Real time 3D motion capture from 2D video: RADiCAL’s proprietary AI generates high-quality 3D skeletal motion data from a single, real-time 2D video feed.
  • Multiplayer: Our proprietary cloud-based multiplayer solution enables shared virtual spaces in Omniverse. Every participant can see themselves and every other participating actor in the shared 3D space, in real time.
  • No special hardware / software: RADiCAL LIVE runs on any internet-connected consumer device equipped with a 2D camera – no trackers, suits or dedicated hardware required. LIVE works across the entire consumer hardware and software landscape (desktop, laptop, tablet or mobile device). You will preserve nearly 100% of device-side compute capacity, leaving it available for graphics rendering and application logic.
  • Setup and preparation: RADiCAL Live requires no special setup, camera calibration or constrained environments. Just ensure decent lighting, full-body visibility, a 1-second calibration pose and that you’re alone in the frame. That’s it.

 

Access LIVE – developer account:

We intend to release RADiCAL LIVE to our community soon, entirely self-service, easy to use, and massively scalable.  Until then, the LIVE platform is available through a developer account that we grant to customers and partners upon request.

If you don’t have a RADiCAL developer account, get in touch.  We’re happy to offer it to as many users as we can, as fast as possible, but we need to coordinate cloud resources to enable a smooth and seamless experience for everyone. We’re therefore sequencing the rollout according to use case, expected engagement, and a few other metrics.

 

Access LIVE in Omniverse – without a developer account:

Until you have a developer account, you can still try out what it looks and feels like, using a simulated live data stream.  Check out the FAQs (incl technical guide) here.

 

 

Yesterday, 30 August 2021, we released our latest AI, version 3.2.10. This update makes significant progress on the key metrics of understanding and reconstructing movement through depth and its relationship with the floor:

 

1.  Floor contact (footlock): Our AI now has a better, explicit, understanding of the relationship between the actor and the floor in the scene.  This produces more consistent, stable results with respect to the feet making contact with the floor.

 

2.  Spatial trajectory: Our AI is now better able to detect and reconstruct movement through space, specifically depth. As a result, you’ll see fewer unnatural global oscillations in spatial trajectory in our results.  And where they still exist, they are consistently less pronounced.

 

These changes, in turn, also produce subtle but noticeable improvements to fidelity (detail), smoothness and stability (i.e., even less jitter, chop and jerk).

 

Update your results:

As ever, results you have previously produced using an older version of the AI can be updated with a single click from the right-hand sidebar in your scene.

Much more to do:

Our aim is to hold our footlock / floor contact and spatial trajectory metrics to the highest standards of the industry.  We believe we can get there, and we know there’s work yet to be done.  There will be many more of these releases, and corresponding improvements, going forward.

 

 

We’re excited to announce an important update to our AI: Gen3.2. Gen3.2 comes with improvements in these areas:

    • Fidelity: You’ll see more detail across a wider domain of motion and video categories  
    • Smoothness / stability: Even more organic stability with even less jitter, choppiness and jerkiness, even when faced with more challenging videos 
    • Input tolerance: We now understand a wider range of camera angles and aspect ratios across your uploads 
    • Fixed the hunch: We fixed a common issue that produced a “hunching” effect in certain videos. 

 

Paving the way for more:

This update represents an important leap in that we’ve been able to peel away legacy constraints.

Beyond making the results we produce right now more nuanced and stable, we’ve also opened the doors to a number of improvements in the pipeline for release to our entire community soon, including a wider motion domain (motion categories), improved fidelity, better footlock, greater stability, and real-time performance for everyone.

 

Best practice -> best results:

Because our AI is now much more resilient, with greater input tolerance and stability, you can use a much wider variety of videos. That said, for the best possible results, we recommend you continue to observe these principles:

    1. Single actor: details
    2. Full body visibility (don’t leave the frame): details
    3. Good calibration: details

 

Fast updates: 

Note that results you have previously produced using an older version of the AI can be updated with a single click from the right-hand sidebar in your scene.

FBX results: Blender add-on / Unreal asset pack:

Gen3.2 comes with small changes to our standard skeleton. For most users, these changes are trivial and will be obvious when applying our FBX-formatted results in their own software environments.

However, if you’re relying on our Blender add-on or Unreal asset pack to use our FBX-formatted results, please note that we’ve updated both. The new Blender and Unreal integrations are available for free through our downloads page (along with the previous versions, for backward compatibility) .

 

We’re celebrating:

We’re celebrating Gen3’s anniversary and the release of Gen3.2 with an upgrade code for all annual plans. See more here.

 

Our friend Jacob Ssendagire (Instagram) has developed beautiful content using RADiCAL and Cinema4D.  He’s now prepared this great tutorial making it easy to apply our FBX-formatted results to characters in C4D.

As a reminder, you can use the free T-pose rig to help you map to your character before re-targeting your RADiCAL animation to that character. You can access the T-pose file for free through our downloads page.

Thanks Jacob!

13 May 2021

We’ve updated our AI to version 3.1.11 in RADiCAL Core.  Changes include:  

    • Improved fidelity (detail) and stability. 
    • Expanded input tolerance by allowing for a wider range of camera angles and aspect ratios. 
    • Fixed a common issue that produced a “hunching” effect in certain videos. 

We recommend you continue to observe best practice, despite greater input tolerance and stability: 

    1. Single actor: details
    2. Full body visibility (don’t leave the frame): details
    3. Good calibration: details
    4. Aspect ratio: 4:3 (landscape): details

 

Fast updates: 

Note that results you have previously produced using an older version of the AI can be updated with a single click from the right-hand sidebar in your scene.

Studio:

The AI update will be rolled out to RADiCAL Studio in due course (check announcements).

BlenderDaily has posted a quickfire tutorial with additional tips for Blender users on their Instagram channel. Great work by BlenderDaily. More tips about using RADiCAL in Blender can be found in our FAQs on Using RADiCAL’s MoCap Data.

 

We’re grateful and excited that we have received an Epic MegaGrant.

We already provide a real-time deployment for UE4 through RADiCAL Studio. With the support of this grant, we’re working on democratizing our product further.

We’ll soon release RADiCAL Live to our entire community of about 80,000 content creators, meaning it will no longer be limited to enterprise customers.  RADiCAL Live is a cloud deployment of our AI, accessible to everyone, everywhere, for real-time remote 3D animation and virtualization, right from your home device (regardless of what it is) to enable virtual production, game development, game play, digital art and motion capture.

 

 

For users working with Unreal Engine, RADiCAL Live will come integrated with UE4 (soon, UE5) LiveLink. This means you’ll be able to ingest live animation data coming from RADiCAL’s cloud servers directly into your local machine running Unreal, both inside the editor and packaged apps.
*    *    *

RADiCAL Live animation data will also be available through our website (in WebGL) and, soon, Unity.  We’re also looking into real-time integrations for Blender and iClone, as we’re fans of both, but we’re still evaluating the requirements and timeline for those.  If you’re interested in playing a role in any of these integrations, as a developer, tester or to give advice of any nature, please do get in touch.

Toward the metaverse.

Thank you, Unreal and Epic Games.  It means the world.

– Team RADiCAL

For many of our users, it is important that their results come with stable and realistic contact between the character’s feet and the floor. We’ve therefore been working hard on a solution we call “footlock.” 

 

We’re still working on it. To be precise, we’re right in the middle of it. 

 

But we’re now able to release an experimental version, in public beta, in the form of a post-processing footlock layer in RADiCAL Core. The post-processing layer produces decent, and sometimes great, results across a number of use cases. However, because the footlock layer may not work for all users across all use cases, we’ve decided to provide a choice: you can see your results both with and without the footlock solver enabled.

 

This footlock solver is available automatically for all scenes created in RADiCAL Core after December 8, 2020. It remains experimental and we’re continually improving it. To understand how best to use it, and where it might do well or fail, please consult our Learn section.

 

Switching to footlock: open a new tab within your scene

 

For scenes created after December 8, 2020, you can choose to view your results in a separate browser tab, but within the same scene, with the “footlock” solver enabled.  To do this, click the “Switch” button within the “footlock” section on the right sidebar of your scene.

 

Your viewing mode is reflected in the “current view” status. To see the solver results, make sure that: 

  • Footlock: Available
  • Current view: On

 

 

FBX downloads: current view mode + file naming convention

 

The FBX generate + download buttons will give you the FBX file that corresponds to the “current view” mode.  

 

Once downloaded, you will know which version you’re looking at by checking for a suffix in the file name that looks like this: _fl. For example: 

  • Conventional FBX: gen-3-1-samples_scan-005 
  • FBX with footlock: gen-3-1-samples_scan-005_fl
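As a small sketch, the suffix convention above can also be checked programmatically when sorting batches of downloaded files (the helper function below is ours for illustration, not part of RADiCAL’s tooling):

```python
from pathlib import Path

def has_footlock(fbx_name: str) -> bool:
    """Return True if the file name uses the `_fl` suffix convention,
    i.e. the scene was exported with the footlock solver enabled."""
    return Path(fbx_name).stem.endswith("_fl")

# The two example names from this post:
assert not has_footlock("gen-3-1-samples_scan-005")
assert has_footlock("gen-3-1-samples_scan-005_fl")
```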

 

Available in Core – other products to come

 

The footlock solver is currently available as an optional view as part of the Core product (and certain Core API users). It, or a subsequent update, will also be rolled out to Studio – keep an eye out for announcements.

 


We’re excited to announce the launch of our Core API.  Now, developers and enterprise partners can create applications around RADiCAL’s cloud-based motion capture.

 

The API allows our partners to track motion and animate characters in custom user experiences, programmatically, at runtime. 

 

We maintain a private API for you to upload videos and download animation data. You can use your own cloud resources or run your pipeline through RADiCAL’s end-to-end cloud infrastructure, i.e., we will seamlessly process your videos and deliver results to your users.

 

Beyond the API and cloud resources, we also provide technical support and code to connect the API and programmatically apply the animation data to characters inside end users’ 3D clients.
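To make the upload-then-download flow concrete, here is a minimal client sketch. The base URL, endpoint path and auth scheme below are placeholders of our own invention, not the real Core API surface; the actual details come with your developer credentials.

```python
import json
import urllib.request

# Hypothetical values for illustration only; the real Core API paths and
# authentication scheme are provided with your RADiCAL developer account.
BASE_URL = "https://api.example.com/core/v1"

def build_upload_request(api_key: str, video_name: str) -> urllib.request.Request:
    """Build (but do not send) a request registering a video for processing."""
    payload = json.dumps({"video": video_name}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/videos",
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_upload_request("my-key", "scan-005.mp4")
print(req.full_url, req.get_method())  # https://api.example.com/core/v1/videos POST
```

Sending the request and polling for the finished animation data would follow the same pattern against the endpoints documented for your account.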

 

If you’re interested in learning more, just get in touch through our dedicated channel for developers and enterprise partners: https://getrad.co/contact.

 

*      *      * 

 

-Team RADiCAL

 

Our friends at Project Spark Studio have developed a Blender plugin that will allow you to create retargeting profiles for RADiCAL animations to any custom rig you desire.

 

You can download the add-on here.

 

There is a video tutorial on how to use it here.

 

As ever, if you have any questions please do not hesitate to reach out.

 

Happy Capturing!

 

*                          *                          *

 

-Team RADiCAL

 

After carefully considering feedback from our community and the industry at large, we are excited to offer the Studio Creator package.

 

  • Use for free, no limits: anybody with a RADiCAL account can now install and use Studio to animate and visualize as much motion as they want, free of charge, no credit card required.

 

  • Download FBX exports – pay only for what you need: We’re also launching our pay-as-you-go system (PAYG) as part of the Studio Creator package. If you decide that you want to export animation data from Studio, you can purchase one minute increments of export time that you can use whenever you want. You only pay for the results you need. Left-over credits won’t expire for a year.

 

  • Download directly from website (we’re leaving Steam): Studio is now available directly through our website here and here. If you previously downloaded Studio from Steam, no worries, you can simply delete the Steam version and replace it with the version downloaded from our website.  Your results will still be in the folder where you saved them.

 

  • Annual Producer – lower price – unlimited FBX exports:  we’re dramatically reducing the price of our Studio Producer package to $250, featuring unlimited animation data exports.  Check out our updated pricing page here.

 

We are doing this in the hopes that all creators, regardless of resources, will be able to use RADiCAL Studio in their content pipelines.

 

No subscription.  Free to download.  Free to use.

 

P.S.: many of you also asked for a Studio tutorial, which you can view here.

 

*                       *                        *

 

– Team RADiCAL

14 Oct 2020

Yesterday, on October 13, 2020, we released an important update to our AI: version 3.1.7.

 

What you should know:

  • Impact on AI results – reduces oscillations: in terms of visible results, the improvements are subtle, but critical.  Specifically, v3.1.7 significantly reduces certain oscillations on the Y axis (the vertical axis). In previous versions, these Y axis oscillations were mostly correlated with motion that involved the actor raising the arms over the head, often from a simple standing position. In all of our tests, these specific oscillations are now significantly reduced, if not entirely gone.   
  • Footlock – preparing for an important release: the v3.1.7 update should also be seen in the context of our wider efforts to enhance our AI for a more solid planting of the skeletal animation data in relation to the floor.  Version 3.1.7 lays part of the foundation for improved footlock.  With v3.1.7 now in production, we hope to release a first major update improving footlock, coupled with an improved positioning of the root motion in world space, in 2020.     
  • Calibration – still important: even after this update, you should minimize oscillations – and secure fidelity, plausibility and aesthetics – by executing a solid calibration. Here is a reminder of the specific advice we provided on calibration cycles a short while ago.

 

Here’s how it’s being rolled out:

  • Core: Our cloud AI, available through RADiCAL Core, has already been updated to v3.1.7.
  • Studio: our RADiCAL Studio users should expect to see the (free) upgrade this week. Steam should be updating your app automatically (please check your Steam update settings).

 

Special thanks to the AI team, who have pulled another major achievement out of the hat.  

 

*     *     *

 

As always, thanks for being a part of our community, and don’t hesitate to reach out, we are always looking for constructive feedback.

 

– Team RADiCAL

 

A quick reminder of the importance of a solid calibration to support best results. Below is some specific advice on what to do. There are smart updates in the works to make all of this easier (more detail to follow in the next few weeks).  

 

Summary (tl;dr): 

 

  • Do this: make sure the actor starts the scene, from the first frame onwards, in a T-pose facing, and in full view of, the camera, at the center of the scene (stage), with the entirety of the body visible and feet firmly planted on the floor, for 1 – 2 seconds.  
  • Don’t do this: we mean “starting” your scene with a T-pose literally. Avoid any footage before the T-pose in which the actor prepares for the scene, including the actor “transitioning” into the T-pose: walking into the frame, standing in profile, turned away from the camera, or anything less than a solid T-pose. Trim (edit) those frames out of the video if you can.
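If you need to trim, any standard video tool works. As one example, with ffmpeg (a sketch; adjust the start offset to wherever the T-pose begins in your footage):

```shell
# Drop everything before second 3 of raw_take.mp4 (assumed T-pose start),
# re-encoding video so the cut is frame-accurate; audio is copied as-is.
ffmpeg -ss 3 -i raw_take.mp4 -c:v libx264 -c:a copy trimmed_take.mp4
```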

 

RADiCAL Core calibration:

 

We currently accept, without warning, videos in which the actor’s presence is visible enough, but falls short of a solid T-pose. We made that decision because many of our users would prefer to have some results, rather than no results at all. However, this also means that we currently don’t warn users when their calibration wasn’t supportive of best results. 

 

Therefore, you should carefully review how your video starts: if there is a solid T-pose in the video, great. If there’s no T-pose, try to find a natural, full-body camera-facing pose that comes close to it. Either way, try to trim away (edit out) any footage that comes before it. 

 

RADiCAL Studio calibration:

 

  • Step-by-step (SbS) processing: The Studio app already comes with a mandatory five-second countdown to assist you in finding the best position before the Studio app starts recording your video for SbS processing. Make sure the actor stands in a solid T-pose by the time the Studio app has completed the countdown and starts recording.
  • Real-time (RT) processing: Because there is no five-second countdown, you’ll need a person other than the actor to hit the START button. You can also experiment with external camera integrations (to ensure a better angle and distance) or keyboard / mouse configurations (to allow for remote operation of the Studio app). 

 

 

 

We’ve released plugins to make the re-targeting process faster and easier for Blender and Unreal users.

For our Studio customers, we have deployed a dedicated FBX exporter through our website: https://getrad.co/uploader-studio.  

Note that the Studio FBX exporter will only read animation data produced by RADiCAL Studio. You need an active Studio subscription to use the Studio FBX exporter.

For more information on how it works, check out the dedicated learn section for RADiCAL Studio.

After months of innovation and tireless work by our team, RADiCAL Studio is finally here. It’s available through Steam here (you’ll need to sign up for a Studio product to log in). Before we get into the details, here are some quick pointers to materials we’re covering elsewhere:

 

  • Early bird pricing: We’re offering early bird pricing (>50% off) for a short time: details here.
  • Free trial: Studio comes with a free trial (no credit card required): details here.
  • Known issues: This release comes with a few known constraints and issues: details here.

 

Studio brings our AI to your machine: 

 

With Studio, you are untethered from the cloud and have access to unlimited motion capture in your own home, studio or event space. At heart, RADiCAL Studio brings our AI to your local machine. We call it step-by-step (SbS) processing. SbS processing mimics the way we sequentially process your videos through the cloud: record video first, run the AI later.

 

The big difference? Since this is your own workstation, we don’t have to meter your usage.  Your usage is only limited by your own time.  🙂  

 

Real time results (beta):

 

With Studio, we are also revealing our real-time functionality.  The real time feature is the product of multi-disciplinary efforts across not just deep learning, but also GPU optimization and a lot of great software engineering. It remains in beta because we still need to test and stabilize the AI’s output across a wider range of hardware configurations.

Our final objective, even with real time processing, is to achieve the fidelity, smoothness and range of motion Gen3 is capable of through the cloud.

At this time, we recommend running the real-time feature on Windows machines with the strongest NVIDIA graphics cards (at least a 1080, 1080 Ti, 2060, 2070, 2080, or 2080 Ti), although we’ve seen it do reasonably well even on smaller cards (1060 and 1070).

 

Live stream into Unreal Engine (Live Link):

 

With the right subscription, you can also use our real-time feature to stream your motion directly into a scene in Unreal Engine 4 (Unity, iClone, and Blender coming soon). For more about using UE4 LiveLink, go here (Change Log) and here (FAQs). 

Tip: if you’re a student or an indie, you may qualify for discounted pricing on LiveLink access, please get in touch.

 

Exporting your animation data:

 

Whether you use step-by-step (SbS) or real-time processing, you can export your animation data in FBX format through our website here. You can read more about how that works here (FAQs).

Tip: if you’re using the UE4 Live Link for real time streaming, your animation data can also be saved directly in UE4. 

 

Gen3.1 – new animation rig:

 

Studio also comes with Gen3.1, which features a new and improved animation rig. The 3.1 skeleton more closely conforms to industry standards and is much easier to use across modern and legacy workflows in Unity, Unreal, Blender, iClone and others. See the details in this change log post.

 

 

*     *     *

 

As always, thanks for being a part of our community, and don’t hesitate to reach out, we are always looking for constructive criticism and feedback.

– Team RADiCAL

Studio’s real time functionality opens up exciting possibilities for previz and virtual production.  For virtual production professionals and enthusiasts, we are enabling Unreal Engine 4 LiveLink access with RADiCAL Studio. We’ve described how it works in detail here (FAQs).

We have included a sample Unreal project and instructions on setting it up with Studio. It’s available through this GitHub repo.

Professional Studio subscribers will have livestream access to all engines included in their subscription.

We are also developing livestreaming capabilities for Unity, Blender, and iClone.

Yesterday, September 4, 2020, we released the latest update to our AI: Gen3.1. With the release of Gen3, we signaled that all areas of our product were going to improve, including our FBX output. As the versioning suggests, while this is an upgrade to 3.0, version 3.1 doesn’t imply fundamental, visible changes in our output. Rather, it’s the structure of our animation data that has improved.

 

Key benefits of Gen3.1: 

Gen3.1 features an updated skeleton with more joints and a new naming convention that more narrowly conforms to industry standards.  As a consequence, 3.1 improves the ingestion of our animation data across software environments, whether that’s in FBX format or as raw animation data.

In short order, we will also be releasing plugins that make the retargeting process for Blender, Unreal, and Unity even easier.

 

Transitioning from 3.0 to 3.1: 

We understand many of our users have developed pipelines in reliance on the RADiCAL skeleton having a particular structure.

To help ease the transition, we’ve made a new 3.1 T-pose available in the download section. You can also see a diagram of the new skeleton and the naming convention below. As you start to align your pipelines for 3.1, we can promise that we don’t expect to make structural changes to our skeleton going forward. 3.1 will be our standard for years to come.

 

New RADiCAL Samples: 

New RADiCAL Samples with free FBX downloads can be found here.

 

How to export legacy animation scenes:

If you need to export FBX animation data for legacy Gen2 or Gen3.0 results, you can do so through our website for one month, i.e., from today through early October 2020. The user experience is the same: simply hit the FBX download button on the completed scene page. After that transition period, from October 2020 onwards, exporting Gen2 or Gen3.0 results to FBX will require the help of the RADiCAL support team, so you should expect it to take more time. We therefore recommend you start exporting now.

 

Special note for Blender users: 

For Blender users, please select automatic bone orientation under the armature settings when you import the FBX.
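For users who import via script rather than the UI, the same option is exposed on Blender’s FBX importer. A sketch (must be run inside Blender, where the `bpy` module is available; the file path is a placeholder):

```python
# Run inside Blender's scripting workspace; bpy ships with Blender itself.
import bpy

bpy.ops.import_scene.fbx(
    filepath="/path/to/your_radical_result.fbx",  # placeholder path
    automatic_bone_orientation=True,  # same as ticking the armature option in the UI
)
```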

We have pushed our solution to the FBX problem in Blender to production. All animations should now be in T-pose in Edit mode.

Please bear in mind that this is a temporary solution.  There may still be minor issues that pop up (animations may have bones with abnormal rolls).

In a few weeks, we expect to release a new skeleton that more fundamentally solves this problem with an add-on that makes re-targeting a drag and drop process for users.

The new Gen3 T-pose can be found here on our downloads page.

As always, feel free to drop us an email or book a meeting with our team.

Best

TR

Some users have reached out because they’re experiencing re-targeting issues with our FBX in Blender; specifically, they’re seeing abnormal rotations in the skeleton.

We’re aware of the problem and have identified the root cause.  We’re now working on a temporary solution.  Please bear with us for the next few days while we generate a short video tutorial for the temporary fix.

You should also know that, hopefully within just a few weeks, we will release an add-on that will make re-targeting a drag-and-drop process in Blender.

If you have specific thoughts on Blender, or these specific issues, feel free to drop us an email or book a meeting with our team.

As always, thank you!

Team RADiCAL

 
