Tawfeeq Martin, Technical Innovations Manager
David and Lisha invited our creative technology team to collaborate on a fun challenge here: the brief called for a mammoth number of independent elements, including structures, landscape, vegetation, and water, to be treated with a hand-painted look. Our objective was to leverage machine intelligence and state-of-the-art stylization algorithms, offering labor-saving execution without compromising artistry.
We looked to convolutional neural networks to complement the earliest look exploration. Fueled by Nvidia's latest AI-enhanced RTX boards, we fed the network its weight in mood paintings, style references, and game assets to produce artistic images of high perceptual quality, all within minutes! Synthesizing texture maps that respected edge boundaries was a novel way of addressing temporal incoherence over image sequences. In the end this method was invaluable for character and overall style inspiration.
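The neural stylization described above typically works by matching texture statistics between a style reference and the generated image. A minimal sketch of the core idea, the Gram-matrix style loss common to CNN style-transfer methods, is below; the function names and shapes are illustrative assumptions, not the team's actual pipeline:

```python
import numpy as np

def gram_matrix(features):
    # features: (C, H, W) feature maps from one CNN layer.
    # The Gram matrix captures which channels co-activate,
    # encoding texture/style independent of spatial layout.
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def style_loss(gen_features, style_features):
    # Mean squared difference between Gram matrices: driving this
    # toward zero pushes the generated image's textures toward
    # the style reference's textures.
    g_gen = gram_matrix(gen_features)
    g_style = gram_matrix(style_features)
    return float(np.mean((g_gen - g_style) ** 2))

# Identical features yield zero style loss; differing features do not.
x = np.random.rand(8, 16, 16)
y = np.random.rand(8, 16, 16)
print(style_loss(x, x), style_loss(x, y))
```

In a full style-transfer setup this loss, summed over several layers of a pretrained network, is minimized by gradient descent on the image pixels themselves.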
Once the look met creative approval, we reproduced it with an anisotropic Kuwahara filter for even more coherent abstraction. Derivative's TouchDesigner served as a prototyping platform and real-time sequence-review tool before we finally ported the algorithm into distributed Nuke instances, which offered artists more familiar creative control.
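The anisotropic variant the team used orients its sampling windows along the local image structure, but the underlying idea is easiest to see in the classic isotropic Kuwahara filter: each output pixel takes the mean of whichever surrounding quadrant has the lowest variance, which smooths flat regions while preserving edges. A simplified sketch on a grayscale image, not the production implementation:

```python
import numpy as np

def kuwahara(img, radius=2):
    # img: 2D grayscale float array. Classic (isotropic) Kuwahara:
    # for each pixel, examine four overlapping square quadrants and
    # output the mean of the quadrant with the smallest variance.
    h, w = img.shape
    r = radius
    pad = np.pad(img, r, mode="reflect")
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            py, px = y + r, x + r  # position in padded image
            quads = [
                pad[py - r:py + 1, px - r:px + 1],  # top-left
                pad[py - r:py + 1, px:px + r + 1],  # top-right
                pad[py:py + r + 1, px - r:px + 1],  # bottom-left
                pad[py:py + r + 1, px:px + r + 1],  # bottom-right
            ]
            variances = [q.var() for q in quads]
            out[y, x] = quads[int(np.argmin(variances))].mean()
    return out
```

The anisotropic version replaces the fixed quadrants with elliptical sectors aligned to the local structure tensor, which is what gives the painterly, brushstroke-like coherence and makes it stable across frames, unlike per-frame neural stylization.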
A great collaborative effort by artists and technologists trading high doses of curiosity!