Train2Game News Organic Motion improves MoCap Market

Organic Motion has taken another giant step forward in the motion capture market with their release of OpenStage 2.4.

In addition to multiple usability features added to the product, OpenStage 2.4 boasts a real-time frame capture rate of 120fps, double its previous mark of 60fps. The increase in frame rate means a substantial increase in tracking quality for users of OpenStage 2.4. Quicker character actions, such as kicking, punching or swinging a golf club, now track better than ever. Brent George, professional animator and beta tester of OpenStage 2.4, has already recognized a noticeable improvement.

“At 60fps, OpenStage drastically improved the efficiency of my motion capture pipeline. The increase in frame rate opens up even more possibilities to further leverage OpenStage to save money and improve overall animation quality.”

In addition to animation, the upgrade also substantially impacts the system’s versatility within life sciences. Dynamic Athletics, an Organic Motion partner and the leader in the application of 3D motion capture in human performance and musculoskeletal health, is excited at the possibilities that exist with OpenStage 2.4.

“The advancements that Organic Motion has made with version 2.4 allow the applications of motion capture to expand into new arenas,” explained Patrick Moodie, Chief of Science and Co-Founder of Dynamic Athletics. “At DARI, collections start with simple movements and progress to the more complex. From golf swings to pitching motions, the possibilities with OpenStage 2.4 are now endless.”

In an effort to allow everyone to witness the new features of OpenStage 2.4 live, Organic Motion will be hosting an online live demonstration from their headquarters in Manhattan on July 15th at 2pm ET. Those interested may register via the following link – Organic Motion Live Online Demo Registration.

Train2Game News: Extreme Motion Android Challenge

Extreme Reality, the only company to bring full-body motion analysis and control via a device’s native or peripheral camera, today announced the call for submissions for the Extreme Motion Android Challenge 2014.

The contest, open to Android game developers, gives participants a chance to win prizes of up to $8,000, and a meeting with gaming pioneer and EA founder Trip Hawkins, for the most innovative uses of motion control in an existing or new Android game.

Games submitted by the deadline of May 20, 2014 will be evaluated by a panel of judges led by Trip Hawkins. The grand prize winner of the contest will receive $8,000 and a meeting with Trip Hawkins, and three additional winners will each receive $1,000.

With Extreme Reality integrated into an Android game, all a player needs to do is place their Android device on a table with the device’s camera facing them, take two steps back, and the motion of their body will be captured and analyzed in real-time, enabling their motion to control the game. Similarly, an Android device can be tethered to a larger screen, such as a TV, so a player can enjoy the same level of immersive motion control offered by console systems.

“No longer limited to gaming consoles or other hardware, motion control is adding an entirely new and exciting experience to games on any platform,” said Sarit Firon, CEO of Extreme Reality. “We look forward to seeing the innovations in motion gaming created by participants in the Extreme Android Challenge, from adventure to sports to fitness or dance games.”

Developers can sign up now by submitting their project proposals online. After their concept has been approved, they will receive the Extreme Reality Android SDK, which enables them to easily incorporate motion control into their game. The Android SDK will be sent by March 17, giving developers until May 20 to complete development and submit their game to Extreme Reality. Evaluation of the games by the judges will begin May 25, and winners will be announced on May 30.
The judges will review the following six aspects in their scoring:

  • Motionization – Best utilization of motion control to enhance the game experience
  • Graphics and Audio – The level of visuals and audio in the game
  • Gameplay – The storyline and mechanics that underpin the game. Is it fun to play?
  • Stickiness – Does it have lasting appeal? Is it addictive? Would you finish the game? Would you play it again?
  • Polish – Is the game ready for prime time?
  • Usability – Is it easy to use with minimal explanations? Is it intuitive?

This would be a good challenge for some Train2Game student studios and would give you access to some powerful software. There is also the chance of some healthy prize money and the opportunity to meet someone who could offer invaluable advice.

Train2Game interview with Far Cry 3 Narrative Director Jason Vandenberghe

Train2Game was at Gamescom in Cologne, Germany from 17th August to 21st August. While there, we spoke with Far Cry 3 Narrative Director Jason Vandenberghe.

In an in-depth interview, he discussed what his role involves, the game design process behind an open world title, creating believable characters and much more.

He also reveals how he got into the games industry and gives Train2Game students advice on how to follow in his footsteps.

Read the interview below here on the Train2Game blog, or listen via Train2Game Radio.

Far Cry 3 Narrative Director: Using actors and performance capture improves game design

The use of actors and performance capture is the future of game design. That’s according to Far Cry 3 Narrative Director Jason Vandenberghe, who argues that it improves games by making characters more believable, something he believes the industry needs to do more of.

It could be a technology that Train2Game students could use in their future careers.

“I think we’ve been putting up with poor performances and poor writing for too long in the industry,” Vandenberghe told Train2Game in a soon-to-be-published interview.

“There’s a lot of people who’ve kind of accepted that it’s just a game so you don’t need to have a good story or don’t need to have good believable characters. Why not? We should have good, believable, strong characters every time.”

The Far Cry 3 Narrative Director believes that as the performance capture technology becomes more readily available, more game developers should take advantage of it.

“We have examples of that. There have been great characters in gaming and we should continue with that, we should expect that,” said Vandenberghe.

“I believe that now that the technology for performance capture is becoming more and more available, and we’re learning more about it, I expect the quality bar to rise and I hope you guys (gamers) should be demanding better characters out of your games.”

“What I’m trying to do with this game is raise the audience’s expectations,” he added.

Stay tuned to the Train2Game blog for the full interview with Far Cry 3 Narrative Director Jason Vandenberghe. The title from Ubisoft is set for release next year.

Other games that use motion capture include L.A. Noire and its impressive facial animations, and the Uncharted series which takes input of actors very seriously.

And last month the Train2Game blog reported that Assassin’s Creed: Revelations will also use motion technology.

So Train2Game, is Vandenberghe right? Are performance capture and the use of actors the future of the industry? Will they help game designers produce better games?

Leave your comments here on the Train2Game blog, or on the Train2Game forum.

Train2Game Animators get excited: Assassin’s Creed: Revelations tech could surpass that of L.A. Noire

Train2Game students will be familiar with the impressive facial Art & Animation of L.A. Noire, with many wondering if it could be beaten in future.

Well, the facial animation of Assassin’s Creed: Revelations could surpass that later this year, thanks to the Mocam technique used to capture footage.

“One of the elements that’s really interesting about Mocam is that, while it creates a lot of high-fidelity character expression and movement, the actor doesn’t need to look like the character he’s playing,” Assassin’s Creed: Revelations Lead Game Designer Alexandre Breault told Now Gamer.

“It’s a system that’s able to interpolate the facial movements of one person and apply them to any model. That gives us a lot of flexibility with our actors,” he added.

That means the in-game character doesn’t have to look like the actor who plays them, useful for Assassin’s Creed: Revelations which features a number of historical characters.

Another way the motion capture used for Assassin’s Creed: Revelations could beat that of L.A. Noire is that it incorporates the entire body. Team Bondi’s method, while very impressive, only captured the face of the actor or actress.

“They’re also able to act with their whole body, as the system isn’t just limited to the head,” explained Breault.

“Mocam doesn’t create a clash between facial expression and body movement – it’s all integrated. It allows realistic facial expression, but not at the cost of actor expression as normal mo-cap does.”

It certainly sounds impressive, much like the fact the Train2Game blog reported earlier this year that the Ubisoft team behind Assassin’s Creed: Revelations is over 200 people strong.

And while it isn’t being used to capture facial animation, Uncharted 3 is also using advanced motion capture techniques.

So Train2Game, could Revelations surpass the tech of L.A. Noire? Would it improve the game? Is motion capture the way forward?

Leave your comments here on the Train2Game blog, or on the Train2Game forum.

[Source: Now Gamer]

Train2Game students could see L.A. Noire tech in Grand Theft Auto V

Grand Theft Auto V could feature the extremely impressive facial art & animation techniques originally used in L.A. Noire. (Train2Game students can remind themselves about the motion capture here on the Train2Game blog)

That’s according to Team Bondi co-founder Brendan McNamara in an interview with PSM3.

“Yeah, I think they’re looking at it for every game. As much as LA Noire is a huge game, Grand Theft Auto is incredibly huge, so you’ve got all the problems of how big the cast would be and how many lines would you have to record and all that kind of stuff.

“Obviously we’d like them to, and they’re more than welcome to use MotionScan, but if they decide it’s not right for that and want to use it for another game, then that’s fine too.

“I think it brings a level of humanity to the experience that means people will – in the first few minutes – start relating to the characters on screen. They don’t have to make that decision about ‘whether I like this guy’ or ‘do I actually believe them?’ – but they can make all the like or dislike decisions based on the actor’s performance.

“Rockstar will make those decisions. They generally make the right decisions in terms of what they do for their games.”

The prospect of motion capture in Grand Theft Auto V is no doubt intriguing, both to Train2Game Art & Animation students and everyone else.

There has been no official announcement regarding Grand Theft Auto V, but increasing rumours suggest that we’ll catch glimpses of it in the not too distant future.

Indeed, as previously reported by the Train2Game blog, analysts believe Grand Theft Auto V will arrive next year.

So Train2Game, do you think GTA V could benefit from motion capture? Do you believe it’ll be in the game?

Leave your comments here on the Train2Game blog, or on the Train2Game forum.

[Source:  Develop Online]

Another essential F1 2010 dev diary for Train2Game students

Codemasters have released another F1 2010 developer diary and once again it should make interesting viewing for Train2Game students, particularly the Games Designers.

Entitled Live the Life, it describes what is essentially the game’s story. You start off as a new driver, with the expectations placed upon you depending on your team and difficulty setting. The video is once again fronted by Formula 1 driver and F1 2010 Technical Consultant Anthony Davidson, and he explains how your team’s expectations are very authentic.

“The expectations for the driver playing the game are the same as in real life given your machinery. At the end of the day, the teammate that you have is the only direct competition you’ll have through the whole season. There’s a strange balance of having to work together but also this desperate competition. Where it gets a little bit personal is events like qualifying and the race where you’re just out there to beat him, no matter what.”

The video also demonstrates how you won’t just be driving the car, but will be involved in press conferences and other media duties with the in-game journalists’ questions depending on how well you’ve been driving. The video doesn’t reveal how this’ll affect the game, but perhaps it’ll be in the style of Football Manager where your reactions can either boost or lower the morale of your team. Or will your comments cause your rivals to almost run you into a wall?

Interestingly, the developers discuss how they’ve striven for realism in the garage by using motion capture from real F1 mechanics to make everything as close to the real thing as possible. Of course, there are also pit girls; whether or not they were motion captured isn’t revealed…

Anyway, you can watch Live the Life of a Formula 1 driver below.

If you missed the previous developer diary, which examined the work put into recreating the cars, you can watch it here.

So Train2Game, what do you think of this latest Codemasters F1 2010 Developer Diary? Did you expect Games Designers to have to include a story and scripts for a racing game? And how would you like to use motion capture in one of your future games?

As usual, leave your comments here or on the Train2Game forum.