Saturday, October 6, 2018

Oil Paintings (I)



I just wanted to share one of my first oil painting exercises. This was done a long time ago. I personally like the contrast between the hard blue shadow and the grey-yellowish texture of the background where the spotlight hits most intensely.

I also like the tones of the pink glass and the highlights of the vase. Given my still limited expertise as a beginner, I was quite satisfied with the results.

In my free time I'm now involved in some anatomy studies, but as soon as I finish I will try another one inspired by the Hudson River School. One of my art teachers introduced me to this movement, and I'm really drawn to those wild, almost fantasy landscapes of the American colonies.

Friday, March 2, 2018

Integrating DCCs into VFX Pipelines: A Generic Approach (I)

Context

Most VFX houses and boutiques tend to develop their pipelines around a core set of engines and digital content creation tools. The pipeline grows to deal with file/folder structures, asset tracking across departments (adding to the equation some, usually web-based, digital asset management tool), and some sort of data/metadata storage, normally combining serialization formats (JSON, XML, etc.) and relational databases.

This implies a whole lot of development work, so when building a pipeline it's important, from the technical point of view as well (not only the artistic one), to take into account the programming languages, the available APIs, and the compiler/interpreter versions. It's a big deal. But once the choice is made, studios normally stick with it for several years, and hence with the chosen DCCs. Changing DCCs requires adapting the pipeline to support them, and this task is normally parallelized so it doesn't have an impact on current productions.

Developing for a fixed set of DCCs means you can spend time using their APIs to the fullest, code separate tools, and make an effort to integrate them in the most artist-friendly way you are capable of. For example, if you plan to develop a working-files manager for Maya and Nuke, you may develop some core functionality that is common to both, but you won't trouble yourself much making a single tool talk to both. Instead, because you can afford it, you will most probably embed this core functionality (because you hate to repeat yourself) in different widgets for each application (think of having the tool embedded in a tab).
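To make that idea concrete, here is a minimal sketch of the pattern: one shared core function with thin per-app wrappers that each embedded widget would call. All names (`list_working_files`, the `.ma`/`.nk` extensions as stand-ins for Maya and Nuke scenes) are hypothetical illustrations, not the actual tools described here:

```python
import os
import tempfile

def list_working_files(project_root, extension):
    """Core logic shared by every DCC integration (hypothetical names)."""
    return sorted(
        name for name in os.listdir(project_root)
        if name.endswith(extension)
    )

# Per-app wrappers only adapt the shared core to each DCC; in practice
# each would live inside its own embedded widget/tab.
def maya_working_files(project_root):
    return list_working_files(project_root, ".ma")

def nuke_working_files(project_root):
    return list_working_files(project_root, ".nk")

# Demo with a throwaway project folder:
root = tempfile.mkdtemp()
for name in ("shotA.ma", "compA.nk", "shotB.ma"):
    open(os.path.join(root, name), "w").close()
print(maya_working_files(root))  # ['shotA.ma', 'shotB.ma']
```

The duplication lives only in the wrappers; the core stays in one place.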

But the approach has to be different when your plan is to integrate any possible existing 3D software out there into your pipeline!

It's easy to understand that you cannot afford, at first, to develop input/output tools for each particular application when a) you are part of a highly specialized and agile but not very large team, and b) you don't have all the time in the world. So you need to take a more generic approach.

Generic Approach

How about developing your core pipeline tools as standalone applications instead of having them embedded in a specific app? You are no longer completely bound to the app-specific programming language: you restrict only the atomic actions to each piece of software, the rest is handled from your core tools, and you no longer depend on each application's GUI library and its versions. Imagine you could develop tools that work across the different Maya versions without relying on PySide and PySide2, and integrate Cinema 4D (which doesn't ship any Qt binding), Blender (which is Python 3), Photoshop and the whole Adobe suite...

In the approach we are currently taking, we develop our core tools in Python 2.7/PySide (because it is a widely used programming language in VFX and you can get away with it) and use different kinds of interprocess communication, notably socket connections.
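As a sketch of what that socket-based communication might look like (the JSON command format and the function names are hypothetical, not the actual protocol used here), a listener running inside the DCC accepts small JSON commands from the standalone tool and replies in kind. The listener is simulated with a thread so the sketch is self-contained:

```python
import json
import socket
import threading

def dcc_listener(host="127.0.0.1", port=0):
    """Simulate the listener a DCC-side plugin would run: accept one
    JSON command and echo a JSON reply. Returns the bound port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port 0 -> OS picks a free port
    srv.listen(1)
    bound_port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()
        with conn:
            request = json.loads(conn.recv(4096).decode("utf-8"))
            reply = {"status": "ok", "echo": request["cmd"]}
            conn.sendall(json.dumps(reply).encode("utf-8"))
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return bound_port

def send_command(port, cmd, host="127.0.0.1"):
    """Standalone-tool side: send one command and wait for the reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(json.dumps({"cmd": cmd}).encode("utf-8"))
        return json.loads(sock.recv(4096).decode("utf-8"))

port = dcc_listener()
reply = send_command(port, "save_scene")
print(reply["status"], reply["echo"])
```

In a real setup the `serve` loop lives inside the app (via its Python interpreter or scripting bridge), while `send_command` lives in the standalone core tools.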

But all that glitters is not gold... We have to face some difficulties.

Some bumps in the road:

- When talking to apps from the outside, you need a way to investigate how each app behaves. Ideally, one would want the DCC to ship an interpreter separate from the app executable, so you could feed the interpreter your scripts and execute them all in the same app instance. That is not what you will encounter most of the time: the executable file is also the interpreter, and different apps can behave differently, even the same app on different operating systems! How do you handle this?

- DCCs come with programming APIs in a varied bouquet of flavors. Blender alone is Python 3, while a big part of the DCCs ship with Python 2 and even the most recent versions haven't made the switch yet; the Adobe suite has a customized JavaScript dialect called ExtendScript... one of a kind!

- If you plan on communicating between your tools and the apps, the tools need to know which apps are running; and if this communication happens via sockets, you start to think you need some kind of port manager and some sort of handshaking system in order to control the apps, and even to communicate with different instances of the same app without launching a separate process for your tools each time...

- Also, for some apps it is not necessary to already be running inside an instance: you can just launch a standalone process and execute your scripts from there (some packages of the Adobe suite), whereas for others you need to be inside the app. This means your system needs some flexibility to adapt to these differences while still staying generic.
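The port-manager and handshaking idea from the list above can be sketched as a small registry: each DCC instance announces itself on startup (the "handshake") and is assigned its own port, so the core tools can address every running instance individually. The class name, starting port, and `(app, pid)` key are all assumptions for illustration:

```python
import itertools

class PortRegistry:
    """Hypothetical port manager for the core tools: tracks which app
    instances are running and which port each one listens on."""

    def __init__(self, first_port=56000):
        self._next_port = itertools.count(first_port)
        self._instances = {}   # (app, pid) -> assigned port

    def handshake(self, app, pid):
        """Register a running app instance; hand back its port.
        Re-registering the same instance returns the same port."""
        key = (app, pid)
        if key not in self._instances:
            self._instances[key] = next(self._next_port)
        return self._instances[key]

    def running_instances(self, app):
        """Map pid -> port for every registered instance of one app."""
        return {pid: port
                for (a, pid), port in self._instances.items() if a == app}

registry = PortRegistry()
p1 = registry.handshake("maya", pid=1001)   # first Maya instance
p2 = registry.handshake("maya", pid=1002)   # second Maya instance
print(registry.running_instances("maya"))
```

A real version would also need to detect instances that died without deregistering, which ties into the ghost-process problem below.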

After all this, it's clear that an emphasis on dividing and compartmentalizing each process acting as client and server is vital, as well as handling a clean path for errors and exceptions (nobody wants the tools to freeze or stop working because one process raised an exception and you didn't let the others die... furthermore, the whole operating system can be jeopardized by duplicate "ghost" processes!).
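One concrete piece of that "clean path for errors" is making sure a single bad request can never crash a server process: the handler converts every exception into an error reply instead of letting it escape. This is a minimal sketch with hypothetical names and a hypothetical JSON request format:

```python
import json

def handle_request(raw):
    """Hypothetical per-request handler: errors travel back to the
    client as data instead of killing the server process."""
    try:
        request = json.loads(raw)
        result = {"status": "ok", "cmd": request["cmd"]}
    except (ValueError, KeyError) as exc:
        # Malformed JSON or a missing field becomes an error reply,
        # never an unhandled exception in the server loop.
        result = {"status": "error", "reason": type(exc).__name__}
    return result

good = handle_request('{"cmd": "publish"}')
bad = handle_request("not json at all")
print(good["status"], bad["status"])  # ok error
```

The same wrapping belongs around every accept/recv in the server loop, so one misbehaving client or DCC instance leaves the rest of the system running.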

to be continued.......