
January 01 2018

Blender projects in 2018 to look forward to

The blender.org project is very lucky in attracting talent – great developers working together with fantastic artists. It’s how Blender manages to stand out as a successful and highly functional free & open software project. In this post I want to thank everyone for a wonderful Blender year and preview all of the exciting things that are going to happen in 2018! (Fingers crossed :)


Eevee

In 2016 it was just an idea: an interactive viewport in Blender with rendering quality at PBR levels. Last year this project took off in ways beyond expectation – everyone should have seen the demos by now.

Eevee in Blender

Early in 2018 animation support will come back (with support for modifiers), with OpenSubdiv support (GPU-based adaptive subdivision surfaces) as a highlight.

Read about the Eevee roadmap here.

Grease Pencil

Blender is an original innovator in this area – providing a fully functional 2D animation tool in a 3D environment. You have to see it to believe it – it’s a mind-blowing workflow for animators and story artists.

Grease Pencil

In Q1 of 2018 the short film “Hero” will be finished as proof-of-concept for the new workflow and tools of Grease Pencil in 2.8x.

You can read the latest status report here.

Workflow & “Blender 101”

Optimizing and organizing one’s working environment can significantly improve the workflow in 3D applications. We can’t make everyone happy with a single Blender configuration anymore. This is where the new Workspaces and Application Templates come in. In Q1 and Q2 of 2018 the first prototypes for radically configured simple Blenders are going to be made (a.k.a. the Blender 101 project).

Meanwhile work continues on usability and configurations in a daily production environment. Blender’s latest Open Movie “Spring” is going to be used for this.

Blender 2.8x is also getting a completely new layer system, allowing you to organize your scenes in advanced ways. A scene can have an unlimited number of layers (= drawings or renders), an unlimited number of collections, and per-collection render settings and overrides.
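The idea of per-collection overrides can be illustrated with a small sketch. Note this is plain Python with hypothetical names, not the actual Blender 2.8 API – just the fallback pattern: a collection uses its own setting where defined, and the scene-wide default otherwise.

```python
# Illustrative sketch of per-collection render overrides (hypothetical
# names, NOT the real Blender API): a scene holds collections, each of
# which may override a subset of the scene-wide render settings.

SCENE_DEFAULTS = {"samples": 128, "visible": True, "holdout": False}

class Collection:
    def __init__(self, name, overrides=None):
        self.name = name
        self.overrides = overrides or {}

    def setting(self, key):
        # Fall back to the scene defaults for anything
        # this collection does not override itself.
        return self.overrides.get(key, SCENE_DEFAULTS[key])

characters = Collection("characters", {"samples": 512})
background = Collection("background", {"holdout": True})

print(characters.setting("samples"))  # overridden per collection: 512
print(background.setting("samples"))  # falls back to scene default: 128
print(background.setting("holdout"))  # overridden per collection: True
```

The same lookup chain is what makes overrides cheap: nothing is copied, a collection only stores the settings it changes.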

Visit the code.blender.org blog to read more about it.

New UI theme

No, there are no pictures yet! But one of the cool things about releasing a massive update is also updating the looks. Nothing radical, just to make it look fresh and to match contemporary desktop environments. We’re still using the (great) design from 2009-2010. In computer years, that’s a century ago! Work on this should start in Q1 and get finalized before Q2 ends. Contributions welcome (check ‘get involved’).


Cycles

In 2017 we saw the rise of AMD GPUs. Thanks to a full-time developer who worked on OpenCL for a year, Blender is now a good choice for use on AMD hardware. For 2018 we want to work on reducing the kernel compilation wait times.

The Daily Dweebs

Cycles is now one of the most popular areas for developers to work in. Most of them do this as part of their day job – making sure Cycles stays an excellent choice for production rendering. Expect a lot of high-quality additions in 2018, especially ways to manage fast renders.

Read more about the Cycles Roadmap here.

Blender Game Engine

One of Blender’s best features is that it’s a complete integrated 3D creation suite – enabling artists to create projects from concept to final edits or playback. Unfortunately the game engine has fallen behind in development – not getting the focus or development time it needs. There are many reasons for this, but one of them is that the BGE code base is too separate from the rest of Blender. That means that newly added Blender features need to be ported over to the engine to work.

Blender Game Engine

For the 2.8 project we want to achieve a better integration of BGE and Blender itself. The Eevee project has proven already how important real-time tools are and how well this can work for interactive 3D design and game creators.

That being said, interesting Blender-related development for game engines happens outside of blender.org too. Check out the Blender fork UPBGE, for example, or the fascinating Armory Engine (see image above; it’s written in Haxe and Kha). And don’t forget the open source WebGL environments Blend4Web and Verge3D.

Assets and presets

Another ‘2.8 workflow’ related feature: we are working on better ways to manage your files and 3D assets. Partially it’s for complex production setups, and partially it’s about configuring your Workspaces with nice visible presets – lists of pictures of shaders or primitives, for example, ready to be dragged & dropped or selected.

Asset engine preview

More information can be found here: the planning for asset management and overrides.

Viewport Compositing

An important design aspect of Blender’s new viewport system is that each engine is drawing in its own buffer. These buffers then get composited in real-time.

Blender 2.67 splash screen

To illustrate how fast it is: in 2.8x the “Overlay engine” is using real-time compositing in the viewport (to draw selections or widgets).
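The basic operation such a compositor performs is an “alpha over”: the overlay engine’s buffer is blended on top of the render engine’s buffer, pixel by pixel. A minimal sketch of that blend with premultiplied-alpha RGBA values (plain Python for clarity – the real thing runs per pixel on the GPU):

```python
# "Alpha over" blend of two premultiplied-alpha RGBA pixels, the
# per-pixel operation a viewport compositor applies when stacking
# an overlay buffer on top of a render engine's buffer.

def alpha_over(top, bottom):
    """Composite premultiplied-alpha pixel `top` over `bottom`."""
    r, g, b, a = top
    br, bg, bb, ba = bottom
    inv = 1.0 - a  # how much of the bottom pixel shows through
    return (r + br * inv, g + bg * inv, b + bb * inv, a + ba * inv)

# A 50%-opaque white overlay pixel over an opaque red render pixel:
overlay = (0.5, 0.5, 0.5, 0.5)  # premultiplied: white at alpha 0.5
render = (1.0, 0.0, 0.0, 1.0)
print(alpha_over(overlay, render))  # -> (1.0, 0.5, 0.5, 1.0)
```

Because each engine only ever touches its own buffer, the blend can run every frame without the engines knowing about each other.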

When 2.8x is ready to get out of beta, we will also look into allowing (node-based) real-time compositing in viewports. That is the final step in fully replacing the old “Blender Internal” render engine with an OpenGL-based system.

This will be especially interesting for the Non-Photo-Realistic rendering enthusiasts out there. Note – FreeStyle rendering will have to be fully recoded for 2.8. That’s still an open issue.

Modifiers & Physics upgrade

Blender’s modifier code is getting fully rewritten for 2.8. That’s needed for the new dependency graph system (threadable animation updates and duplication of data).

Blender Physics

A nice side effect of this recode is that all modifiers will then be ‘node ready’. We expect the first experiments with modifier nodes to happen in 2018. Don’t get too excited yet – it’s especially the complexity of upgrading the old particle and hair system that makes this a very hard project to handle.

An important related issue here is how to handle “caches” well (i.e. mesh data generated by modifiers or physics systems). These need to be saved and managed properly – which is what the dependency graph has to do as well. As soon as that’s solved we can finally merge in the highly anticipated Fracture Modifier branch.

Animation tools

Blender’s armature and rigging system is based on a design from the ’90s. It’s a solid concept, but it’s about time to refresh and upgrade it. When Blender 2.8x gets closer to beta I want to shift my focus to getting a project organized (and funded) to establish a small team of developers working on animation tools for the next decade – Animation 2020! Contact me if you want to contribute.

Discourse forums

Improving onboarding for new developers has been on our wish list for years. There are several areas where we should do better – handling reviews of submitted patches and branches more swiftly, for example.

Discourse Forum

We also often hear that Blender developer channels are hard to find or not very accessible. The blender.org teams still mainly use IRC chat and MailMan mailing lists for communication.

In January we will test a dedicated blender.org developer forum using Discourse (fully free/open software). This forum will focus on people working with Blender’s code, developer tools and anything related to becoming a contributor. If this experiment works well we can expand it to a more general “get involved” website (for docs, educators, scientists, conferences, events).
User questions and feature requests, however, would be off topic – there are better places to handle those.

20th anniversary of first public Blender release

Oh yes! Today marks exactly 20 years since I released the first Blender version to the public – only for the Silicon Graphics IRIX platform.

Blender on IRIX

FreeBSD and Linux versions were made a couple of months later.

All as “freeware” then, not open source. I first had to learn the lesson of bursting internet bubbles before going fully open!

Blender 2.80 beta release

Originally planned for Q2 this year… luckily that quarter lasts until July 1st. All depends on how well the current projects go the coming months. But if it’s not July first, then at least we have…

SIGGRAPH, Vancouver

The largest annual 3D CG event takes place August 12-16 this year. We aim for a great presence there, and it’s certainly a great milestone to showcase 2.80!

Open issues

The 2.8 team tries to keep focus – not to do too many things at once and to finish what’s being worked on in the highest usable quality possible. That means that some topics are being done first, and some later. The priorities for 2.8 have been written down in this mail to the main developers list.

We can still use a lot of help. Please don’t hesitate to reach out – especially when workflow and usability are your strengths! But we can use contributors in many ‘orphaned’ areas, such as Booleans, Video editor, Freestyle render, Particles, Physics caching, Hair, Nurbs… and also to work on better integration with Windows and macOS desktop environments.


An important part of the blender.org project is the studios and companies who contribute to Blender.

Special thanks goes to Blender Foundation (development fund grants), Blender Institute/Animation Studio (hiring 3-5 devs), Tangent Animation (viewport development), Aleph Objects (from Lulzbot printers, supporting Blender 101), Nimble Collective (Alembic), AMD (Cycles OpenCL support), Intel (seeding hardware, Cycles development), Nvidia (seeding hardware), Theory Animation and Barnstorm VFX (Cycles development, VFX pipeline).

Special thanks also to the biggest supporters of the Development Fund: Valve Steam Workshop and Blender Market.

Ton Roosendaal, Chairman Blender Foundation

September 27 2017

Title Design: from Wonder Woman to xXx

Joseph Conover is a 3D artist at Greenhaus GFX, where he created graphics for several high profile film credit sequences such as Wonder Woman, xXx: Return of Xander Cage, Guardians of the Galaxy Vol. 2 and more. As he stepped into the industry and picked up other creative tools, Joseph found that Blender often gave him an edge in terms of workflow.

Text by Joseph Conover, Greenhaus GFX

I started using Blender about ten years ago and still use it in my workflow for modeling, simulation, texturing, sculpting, and various other general tasks. The software is so comprehensive that it lets me picture the final product from a wide viewpoint. It offers a big advantage by eliminating guesswork and the time wasted jumping between different programs.

The largest project I’ve worked on at Greenhaus so far is the Wonder Woman end title sequence.

I did too many random things to count, but these are screenshots of notable parts:

Patty Jenkins (Wonder Woman’s director) thought that many scenes in our sequence were too warlike and wanted some uplifting moments, so I 3D projected this view of Themyscira (home to Wonder Woman and the Amazons) based on a painted version created by my boss, Jason Doherty.

Here are several of my more notable models that were used in various scenes. The woman was based on actress Gal Gadot – sculpted in ZBrush and refined in Blender. For the plane, I took inspiration from WWII German biplanes. My favorite thing to work on was the sword structure, in which I used array and curve modifiers to create a rotating structure effect.

This was one of the environments I got to develop from start to finish. It was a mix of kitbashing and modeling in Blender. The whole process only took me an afternoon to finish because I was able to quickly duplicate the pieces and fill in the space. This scene was also repurposed in different shots throughout the sequence.

Guardians of the Galaxy Vol. 2’s logo was a different story, because it started off in Blender but ended up in C4D. This was the logo our client liked at first, done in Blender with some ’80s-style comping in After Effects:

While Guardians of the Galaxy Vol. 2’s final product didn’t use much of Blender other than the animation, this promotional ad for the 2017 NHL All-Star Weekend did.

This was a great example of Blender’s versatility. For the two shots below, I had to hand-model the scenes to match the Cinerama Dome and the Hollywood Sign. Blender allowed me to quickly draft out my ideas from animation to the final lighting before I exported everything to Maya and rendered in V-Ray.

So what are your thoughts? Hit me up at josephconover.com if you want to chat about Blender or just talk art!

July 14 2017

July 11 2017

Over Half a Million Downloads per Month

The official Blender release is now being downloaded over half a million times per month – 6.5 million downloads in total last year.

Between July 2016 and July 2017, Blender saw the release of 2.78 and its a/b/c fix releases.

This is not counting:

  • Experimental Builds on Buildbot
  • Release Candidates and Test Builds
  • Other services offering Blender (app stores like Steam or community sites like GraphicAll)
  • Linux repositories

Below is the full report for each platform.

May 26 2017

Blender at SIGGRAPH 2017

SIGGRAPH 2017 (Los Angeles, 30 July – 3 August) is around the corner! To continue and celebrate the long-standing tradition of Blender at SIGGRAPH, this year we have three announcements.

Talk selected for SIGGRAPH 2017

Ton and the Blender Animation Studio team will present Beyond “Cosmos Laundromat”: Blender’s Open Source studio pipeline, a talk focused on Open Source pipelines and Blender.

Here is the abstract: For “Cosmos Laundromat” – CAF 2016 Jury Award winner – the Blender team, headed by CG pioneer and producer Ton Roosendaal, developed and used a complete open source creation pipeline. The team has released several other shorts since then, including a 360-degree VR experience and a pitch for the feature animation “Agent 327”. Developing and sharing open source technologies is a great challenge, and leads to great benefits for small and medium animation studios.

Blender booth

SIGGRAPH hosts one of the largest exhibitions of the computer graphics industry, and this year Blender is going back to it. There will be demos, goodies and a new Blender demo-reel!

Giveaway: free exhibit pass

Follow this link for a free pass (worth $50 – get them before they run out), so you can drop by the exhibit hall in the LA Convention Center. We would love to see you there!

May 22 2017

ESA space debris movie by ONiRiXEL

ONiRiXEL 3D Animation Studio is a small startup based in Toulouse, France. In this article by Jean-Gabriel Loquet, they share how they used Blender to create a high-end stereoscopic video for the European Space Agency. ONiRiXEL is a Corporate member of the Blender Network.

We specialize in the production of 3D CGI animation films, mainly corporate or institutional films and commercials, but we also love to work on fiction and documentaries, shoot live action, create VFX, handle film preproduction and/or postproduction, and create 3D VR or AR apps as well.

Blender is at the heart of our pipeline, thanks to its “all-in-one” functionality and overall awesomeness: we use it for concept and previs, 3D modeling and texturing, animation, simulation, lighting and rendering with Cycles, compositing and video editing, for monoscopic and stereoscopic projects.

The European Space Agency ESA

The ISS (partly operated by ESA) 3D render by ONiRiXEL, for the space debris movie.

ESA is the European equivalent of NASA: its purpose is to provide for, and to promote, for exclusively peaceful purposes, cooperation among European States in space research and technology and their space applications, with a view to their being used for scientific purposes and for operational space applications systems.

Among many others, one of ESA’s missions is to investigate space debris, communicate about the issue with the space operations community, and find and implement solutions to mitigate the associated risks.

To raise awareness about this issue, ESA hosted the 7th European Conference on Space Debris Risks and Mitigation in April 2017 at ESOC (the European Space Operations Centre) in Darmstadt, Germany. According to ESA: “3D animation is the most suitable way to explain technical principles and to give a clear picture of the situation in space”. Thus, the agency entrusted the creation of a 3D animation movie about space debris to ONiRiXEL 3D Studio, along with the French consulting startup (also based in Toulouse) ID&SENSE.

The screening of the movie was the main act of the conference’s opening ceremony.

Orbitography simulation with Orekit

The accuracy of spacecraft positions, speeds and attitudes was paramount for ESA, so the project team got in touch with the developers of the open source flight dynamics library Orekit, in order to implement a correct representation of the orbital data provided by ESA.

The input data were the Kepler orbital parameters for each object (active satellite, defunct satellite, launcher upper stage, and spacecraft fragment):

The Orekit team developed a software interface that read these parameters and translated them into ephemeris data (the location and attitude of each object at any point in time), which was eventually written to files in Blender’s particle cache format (one per object per frame) – and thus already animated in Blender with extremely good accuracy!
Active spacecraft use controlled attitude depending on their orbit (typically nadir pointing with yaw compensation for LEO, LOF aligned for GEO) and other categories use a tumbling mode with random initial attitude and angular velocity. Attitude for the solar arrays is also computed with respect to the body attitude to ensure a proper orientation in the direction of the sun in each movie scene.
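The core step in turning Kepler elements into positions is solving Kepler’s equation M = E − e·sin E for the eccentric anomaly. A minimal sketch of that conversion in the orbital plane (plain Python with Newton iteration; the actual Orekit propagation handles perturbations, frames and attitude, and is far more complete):

```python
import math

def eccentric_anomaly(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) for E by Newton iteration."""
    E = M  # good starting guess for small eccentricities
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def orbital_plane_position(a, e, M):
    """In-plane (x, y) position from semi-major axis, eccentricity, mean anomaly."""
    E = eccentric_anomaly(M, e)
    x = a * (math.cos(E) - e)
    y = a * math.sqrt(1.0 - e * e) * math.sin(E)
    return x, y

# A near-circular LEO-like orbit (a in km), a quarter period after perigee:
print(orbital_plane_position(a=7000.0, e=0.001, M=math.pi / 2))
```

Sampling this at every frame time, for every object, is essentially what produces the ephemeris files the particle cache is built from.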

Screen capture of Blender reading the Orekit-computed spacecraft animation

The next step was to create relatively simple 3D models for the Orekit-simulated objects, with a reasonable polygon count, as there were over 20,000 duplicates instantiated by Blender’s particle engine. Finally we set up materials, lighting and camera animation, and thanks to Cycles’ magic: voilà!

Rendered frame of the movie by ONiRiXEL with Orekit-computed spacecraft animation

The interface developed by the Orekit team also allowed visualizing each object’s trajectory as a curve:

Rendered frame of the movie by ONiRiXEL with an Orekit-computed spacecraft trajectory as a curve

Some scenes concerned only a single spacecraft, which allowed for more detailed modelling:

Screen capture of ONiRiXEL’s 3D model of Herschel Space Telescope in Blender

Blender’s stereoscopic pipeline

Another of Blender’s strengths on this project was its support for a solid, end-to-end stereoscopic pipeline, from conception to delivery of the movie.

We used stereoscopic preview of animation:

Screen capture of ONiRiXEL’s stereoscopic animation preview for the movie in Blender

Stereoscopic rendering and compositing:

Screen capture of ONiRiXEL’s stereoscopic compositing for the movie in Blender

And even stereoscopic editing and encoding, all within Blender:

Screen capture of ONiRiXEL’s stereoscopic video editing for the movie in Blender

ESA’s space debris movie 2017: “A Journey to Earth”

The resulting 12-minute 3D animation film was released by ESA under the CC-BY-SA licence, and is available for download on their website.

Rendered frame of ESA’s space debris movie by ONiRiXEL 3D

It is also available on YouTube in monoscopic or stereoscopic versions.

In the end, this was a challenging but very exciting project, with an extremely efficient production pipeline and a very satisfied final client (ESA) – all made possible by open source software in general, and Blender in particular.

You can get in touch with us on the ONiRiXEL 3D Animation Studio website for more info, or check our Blender Network profile.

May 15 2017

Teaser for Agent 327: Operation Barbershop

Blender brings cult comic Agent 327 to life in 3D animation

The studio of Blender Institute releases ambitious three-minute teaser for full-length animated feature based on Dutch artist Martin Lodewijk’s classic comics

Amsterdam, Netherlands (May 15, 2017) – It has created a string of award-winning shorts, raised over a million dollars in crowdfunding, and helped to shape development of the world’s most popular 3D application. But now Blender Institute has embarked on its most ambitious project to date. The studio has just released Agent 327: Operation Barbershop: a three-minute animation based on Martin Lodewijk’s cult comics, co-directed by former Pixar artist Colin Levy – and the proof of concept for what it hopes will become a major international animated feature created entirely in open-source software.
Watch it here: http://agent327.com

A history of award-winning open movies

Since its foundation in 2007, Blender Institute has created eight acclaimed animated and visual effects shorts, culminating in 2015’s Cosmos Laundromat, winner of the Jury Prize at the SIGGRAPH Computer Animation Festival. Each ‘open movie’ has been created entirely in open-source tools – including Blender, the world’s most widely used 3D software, whose development is overseen by the Institute’s sister organization, the Blender Foundation.

Assets from the films are released to the public under a Creative Commons license, most recently via Blender Cloud, the Institute’s crowdfunding platform, also used to raise the €300,000 budget for Agent 327: Operation Barbershop. Based on Martin Lodewijk’s cult series of comics, the three-minute movie teaser brings the Dutch artist’s underdog secret agent vividly to life.

Creating a cult spy thriller

“Agent 327 is the Netherlands’ answer to James Bond,” said producer Ton Roosendaal, original creator of the Blender 3D software. “He’s fighting international supervillains, but the underfunded Dutch secret service agency doesn’t have the resources of MI6. Rather than multi-million-dollar gadgets, he has to rely on his own resourcefulness to get things done – and in the end, he always pulls it off. To me, that also reflects the spirit of Blender itself.”

Created by a core team of 10 artists and developers over the course of a year, Operation Barbershop sees Agent 327 going undercover in an attempt to uncover a secret criminal lair. Confronted first by the barbershop’s strangely sinister owner, then his old adversary Boris Kloris, Agent 327 becomes embroiled in a life-or-death struggle – only to confront an even more deadly peril in the shop’s hidden basement.

Translating a 1970s comic icon into 3D

A key artistic challenge on the project was translating the stylized look of the original 1970s comics into 3D. “Agent 327 doesn’t fit the American design template for animated characters,” says Blender Institute pipeline TD Francesco Siddi. “He has a gigantic nose, gigantic ears, and bags under his eyes. How many Disney movies do you see with characters like that?”

For the work, the Institute’s modeling and design artists, led by Blender veteran Andy Goralczyk, carried out a series of look development tests. Concept designs were created in open-source 2D painting software Krita, while test models were created in Blender itself, and textured in GIMP.

Another issue was balancing action and storytelling. Although a richly detailed piece, Operation Barbershop isn’t a conventional animated short, but a proof of concept for a movie. It’s designed to introduce Agent 327’s universe, and to leave the viewer wanting more. To achieve the right mix of narrative and exposition, Colin Levy and co-director and lead animator Hjalti Hjálmarsson ping-ponged ideas off one another, mixing animated storyboards, live action, and 3D previs.

Building an open-source feature animation pipeline

As with all of the Institute’s open movies, technical development on the project feeds back into public builds of Blender. In the case of Operation Barbershop, the work done on Cycles, the software’s physically based render engine – which now renders scenes with hair and motion blur 10 times faster – was rolled out in Blender 2.78b in February. Work on Blender’s dependency graph, which controls the way a character rig acts upon the geometry of the model, will follow in the upcoming Blender 2.8. “For users, it’s going to mean much better performance, enabling much more complex animation set-ups than are possible now,” says Siddi.

Other development work focused on the Blender Institute’s open-source pipeline tools: render manager Flamenco and production-tracking system Attract. “The pipeline for making shorts in Blender is already super-solid, but we wanted to build a workflow that could be used on a feature film,” says Siddi. “Operation Barbershop was great for identifying areas for improvement, like the way we manage asset libraries.”

Joining Hollywood’s A-list

For the Agent 327 movie itself, the Blender Institute is establishing Blender Animation Studio, a separate department devoted to feature animation, for which it aims to recruit a team of 80 artists and developers from its international network. To help raise the film’s proposed budget of €14 million, the Institute has signed with leading talent agency WME, which also represents A-list Hollywood directors like Martin Scorsese, Ridley Scott, and Michael Bay.

“Blender Animation Studio is devoted to producing feature animation with world-class visuals and storytelling, created entirely in free and open-source software,” says founder and producer Ton Roosendaal. “We’ve proved that Blender can create stunning short films. Now we aim to create stunning features, while building and sharing a free software production pipeline.”

Although the Agent 327 movie isn’t the first film to be created in Blender – a distinction that belongs to 2010 Argentinean animated comedy Plumíferos – it will be by far the largest and most ambitious, and one that the Blender Institute hopes will revolutionize feature animation.

“As an independent studio, we’re in the unique position of being in complete control of the tools we use in production,” says Roosendaal. “That’s a luxury enjoyed only by the world’s largest animation facilities. We intend to create movies that redefine the concept of independent animated feature production.”

May 03 2017

Rochlitz VR

About blendFX

blendFX is a small studio for 3D, VR, AR and VFX based in Leipzig/Germany.
We produce virtual and augmented reality applications, architectural visualizations, 3D reconstructions, and animations & visual effects for TV and cinema, with a focus on high-quality content for mobile VR.

Rendering of Renaissance Period

How we came to VR

We started working with virtual reality in 2014. During our first experiments it became clear quite quickly that we wanted to concentrate on mobile VR, simply because it’s more accessible and affordable. For the time being, however, smartphones are simply not capable of rendering high quality content with reflections, transparencies and nice antialiased edges. That’s why we began focusing on pre-rendered content, using stereoscopic panoramas. We developed a workflow with Blender and Unity, where we integrate interactive elements and 3d animations into stereo panoramas. The resulting apps perform very well on recent smartphones like the Samsung S6 and S7.

Rochlitz VR

“Rochlitz VR” is a virtual reality app which we created for Schlösserland Sachsen. It’s a virtual reconstruction of one of the chambers of the old castle “Schloss Rochlitz” in Saxony, Germany. Today you can see the remains of 5 different periods from the 800-year history of the “captain’s chamber” (“Hauptmannsstube”) in that room. There are doors and traces of paint from the Gothic period, the back wall of a fireplace and Romanesque windows, a painted ceiling from the Renaissance, and on top of that plumbing, drill holes and wallpaper from the GDR (German Democratic Republic) period in Eastern Germany. It’s a really weird and crappy-looking room.

The room as it looks today

Usually, rooms like this are restored to one particular time period. But then of course you lose the chance to see what the room looked like in the other periods. In this case the castle museum wanted to make clear to visitors that the castle has been in constant change throughout the ages. Traces of all these periods can be found today. So the museum needed a way to make this room accessible and understandable for visitors without destroying its current peculiar state, with all the various items from different time periods. Virtual reality turned out to be the perfect medium for this!
Together with conservators and museologists we reconstructed the captain’s chamber as it might have looked in those 5 different periods. The Rochlitz Virtual Reality Experience will be shown on 4 Samsung Gear VR devices inside the captain’s chamber starting in spring 2017.


We used Blender for the entire process of modeling, texturing and rendering.
The captain’s chamber changed a lot throughout the 800 years, so we had to heavily adjust the model for each time period.
The 3D modeling turned out to be a crucial part of the scientific reconstruction. The conservators and museologists provided us with data, measurements and theories of how the room should have looked back in the day, but it was only during the actual modeling process that we found some of these theories would have been impractical or even impossible in real life. So the 3D model of the room was also a tool to test scientific hypotheses.

Screenshot of the blendfile of the gothic period

For the Gothic period, Nora Pietrowski, the museum’s conservator, gave us the flowery wall paintings, which we mapped onto the four sides of the room. The painted ceiling of the Renaissance period was also created by Nora. Then, with Blender’s 3D painting tools, we added subtle dirt and weathering effects to the walls and the floor.
One of the most interesting periods was the second half of the 20th century. The museum wanted to stay as scientifically accurate as possible with the reconstruction. That’s why most of the rooms we reconstructed have no furniture: there is no detailed information about what kind of furniture was actually in the room. However, there are two photos from the GDR period that do show some of the furnishings. That was a good reference for us to rebuild the room in a bit more detail.

Rendering of the GDR Period. In the back you can see the reference photo on the wall.


For rendering we used Blender’s path tracing engine Cycles and its stereoscopic panorama rendering mode, which Dalai Felinto had added in 2015. However, we found that equirectangular rendering wastes a lot of processing power on the poles of the spherical images, which made rendering a lot slower than it needed to be. So we hired Dalai to code a script to render cube maps, which not only render much faster but also look even better at the poles than the usual equirectangular panoramas. Rendering a cube map panorama creates 6 different images, one for each side of the cube. Each side has a resolution of 1280×1280 pixels. And because we are rendering stereo we end up with 12 images, which add up to a resolution of 15360×1280 pixels for each stereo panorama. Luckily Cycles can take advantage of GPU rendering, which made the renders a bit less painful.

Cubemap Stripe
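The stripe resolution above is simple arithmetic: six cube faces per eye, two eyes, faces laid side by side. A quick sanity check:

```python
# Cube-map stereo stripe arithmetic from the text: 6 faces per eye,
# 1280x1280 pixels per face, two eyes concatenated side by side.

FACE_SIZE = 1280
FACES_PER_EYE = 6
EYES = 2

images = FACES_PER_EYE * EYES           # 12 images per stereo panorama
stripe_width = FACE_SIZE * images       # 15360 pixels wide
print(images, f"{stripe_width}x{FACE_SIZE}")  # 12 15360x1280
```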

Another thing we needed was pole merging: it’s a common problem of stereoscopic panoramas that the top and bottom (zenith and nadir) look a bit weird, because the left and right eye perspectives overlap. One way to fix that is simply to let the parallax converge to zero right at the poles of the panorama, also known as pole merging. And since we are not Blender coders ourselves but know some very skilled programmers at the Blender Institute, we simply drove to Amsterdam in spring 2016 and talked to Sergey Sharybin, one of the amazing core developers. Within hours we had a working prototype, and not much later pole merging was implemented in Blender. Amazing!
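The essence of pole merging is just a falloff on the stereo eye separation as a function of latitude. A sketch of the idea (plain Python with a hypothetical linear falloff and parameter names; Blender’s actual implementation exposes configurable pole-merge angles):

```python
import math

def eye_separation(base_separation, latitude, merge_start=math.radians(60)):
    """Scale stereo eye separation toward zero near the poles.

    Below `merge_start` (absolute latitude, radians) the full separation
    is used; above it, it fades linearly to zero at the pole, so the
    parallax converges and zenith/nadir no longer look broken.
    Hypothetical parameterization, simpler than Blender's pole merge.
    """
    lat = abs(latitude)
    if lat <= merge_start:
        return base_separation
    pole = math.pi / 2
    fade = (pole - lat) / (pole - merge_start)  # 1 at merge_start, 0 at pole
    return base_separation * fade

print(eye_separation(0.065, 0.0))          # full ~6.5 cm separation at equator
print(eye_separation(0.065, math.pi / 2))  # zero separation at the pole
```

With the separation at zero, both eyes render the identical image at the poles, which is exactly the “merge”.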

For testing the VR renderings internally we used VRAIS, a VR viewer which we developed ourselves together with our colleagues from Mikavaa. To get feedback from conservators and museologists on the test renderings and panoramas we used ReViz, a VR viewer with comment and revisioning system, developed by Mikavaa.

The high resolution stereoscopic panoramas were then taken into the game engine Unity, where we added interactive elements to the scenes in order to create an intuitive and immersive VR experience, which visitors of the castle can enjoy with a GearVR headset. The user can travel to the different time periods simply by focusing on the interactive overlays on the walls or by using the timeline.

Screenshot of some UI elements in the app

The Installation

Four turning chairs have been customized for this exhibition. The four headsets need to be continuously attached to a power supply to prevent them from discharging during constant usage. For that, the museum's constructor designed a custom solution with wiper contacts, so that the USB cable that charges the GearVR headset can be attached to the turning chair without any cable clutter.

The goal is to have four visitors experience RochlitzVR simultaneously without the need for museum attendants, so the GearVR itself needs to be protected against theft. Therefore a USB-C cable mount was designed that can be screwed onto the GearVR body. The cable itself is quite strong, and the museum figured that anyone who would bring tools to cut it would also be able to bring gear to cut stronger metal cables, so the USB cable should do as theft protection. Further, the phone itself is hidden behind the plastic GearVR cover, which is held in place by four screws.

The installation in the captain’s chamber

VR is still a relatively young medium (even though last year's hype has already faded a bit). But especially for cases like this we think it can be a brilliant addition for museums, exhibitions and education. Of course it would be even more exciting to be able to walk around in VR, as you can with the HTC Vive for example, but for exhibitions that's not really practical: the hardware is heavier and more expensive, and you definitely need someone to operate it and help the users, which makes it more expensive still. It simply adds too much maintenance cost for a small museum.

This project took over a year from start to finish, with some breaks in between. We think it's quite remarkable that an old castle is open and forward-thinking enough to embrace a fresh technology like VR, where there is still a lot of development and change happening all the time. It was therefore an absolute pleasure to work with the castle's museologist Frank Schmidt, who also initiated the whole thing.
Being able to use Blender for this project was a great experience. Over the course of the last year several features have been implemented that helped our workflow a lot, and Cycles was improved so much that it is now even faster than when we started. So I think it's safe to say that Blender really is a perfect tool for producing high quality VR content, and we can't wait to see what the future brings!

April 19 2017

Visual Effects for The Man in the High Castle

About Barnstorm

Barnstorm VFX embodies the diverse skills, freewheeling spirit, and daredevil attitude of the early stunt-plane pilots. Nominated for a VES Award for their outstanding work on the TV series "The Man in the High Castle", they have been using Blender as an integral part of their pipeline.

The following text is an edited version of answers from a Reddit AMA held by the heads of the team (Lawson Deming and Cory Jamieson) on February 3, 2017.

Getting into Blender

We’ve experimented with a variety of programs over the years, but for 3D work, we settled on using Blender starting about 3 years ago. Its very unusual for VFX houses (at least in the US) to use Blender (as opposed to, say, Maya), but there are a number of great features that caused us to switch over to it. One of them was the Cycles render engine, that we’ve used for our rendering of most of the 3D elements in High Castle and other shows. In order to deal with the huge rendering needs of High Castle, we set up cloud rendering using Amazon’s own AWS servers through Deadline, which allowed us to have as many as 150 machines working at a time to render some of the big sequences.

In addition to Blender, we occasionally use other 3D programs, including Houdini for particle systems, fire, etc. Our texturing and material work is done in Substance Painter, and compositing is done in Nuke and After Effects.

The original decision to use Blender actually didn't have anything to do with the cost (though it's certainly helpful now that we have more people using it). We were already using Nuke and NukeX as a company (which are pretty expensive software packages) and had been using Maya for about a year. Before that, Lightwave was what we used.

Assembling a team

The real turning point came when we had to pull together a small team of freelancers to do a sequence. The process went a little bit like this:

1) We hire a 3D artist to start modeling for us. He's an experienced modeler, but his background is in a studio environment where there are a lot of departments and a pretty hefty pipeline to deal with everything. He's nominally a Maya guy, but the studio he was at had their own custom modeling software which he's more familiar with, so even though he's working in Maya, it's not his first choice.

2) The modeling guy only does modeling, so we need to bring in a texture artist. She doesn't actually use Maya for UV work or texturing; instead she uses Mari (a Foundry product). She and the modeler have some issues making the texturing work back and forth between Mari and Maya, because they aren't used to being outside a studio pipeline that takes care of everything for them.

3) Since neither of the above is experienced in layout or rendering, we hire a third guy to set up the scene. He is a Maya guy as well, but once he starts working he says, "Oh, you guys don't have V-Ray? I can get by in Mental Ray (Maya's renderer at the time) but I prefer V-Ray." We spend a ton of time working around Mental Ray's idiosyncrasies, including weird behavior with the HDR lighting and major gamma issues with the textures.

4) We need to do some particle simulation work, smoke, and some water in the same scene… Guess who uses Maya for these things? No one, apparently. Water and particles are Houdini in this case. Smoke is FumeFX (which at the time only existed as a 3ds Max plugin and had no Maya version).

So, pop quiz: what is Maya doing for us in this instance? We've got a modeler who is begrudgingly using it but prefers other modeling software, a texture artist who isn't using it at all, a layout/lighting artist who would rather be using a third-party render engine, and the prospect of doing SFX that would require multiple additional third-party packages totaling thousands of dollars. At the time we were attempting this, the core team of our company was just 5 people, of which I was the only one who regularly did 3D work (in Lightwave).

I consider myself a generalist and had been puttering along in Maya, but I found it very obtuse and difficult to approach from a generalist standpoint. I’d just started dabbling in Blender and found it very approachable and easy to use, with a lot of support and tutorials out there. At the same time our three freelancers were struggling with the above sequence, I managed to build and render another shot from the scene fully in Blender (a program that I was a novice in at the time), utilizing its internal smoke simulation tools and the ocean simulation toolkit (which is actually a port of the one in Houdini) to do SFX on my own, and I got a great looking render out of Cycles.

Blender has its weaknesses, and as a general 3D package it's not the best in any one area, but neither is Maya. Any specialty task will always be better in another program. But without a pre-existing Maya pipeline, and given that Maya's structure encourages the use of many specialists collaborating on a single task (rather than one well-rounded generalist working solo), it didn't make sense to dump a lot of resources and money into making Maya work for such a small studio.

I ended up falling in love with working in Blender, and as we brought on and trained other 3D artists, I encouraged them to use it. Eventually we found ourselves a Blender studio. That advantage of being good for a generalist, though, has also been a weakness as we've grown as a company, because it's hard to find people who are really amazing artists in Blender. Our solution up until now has been to work hard on finding good Blender artists and to train others who want to learn.

Blender in production

Also, since Blender acts as a hub for our VFX work, it's still possible for specialists to contribute from their respective programs. Initial modeling, for example, can be done in almost any program. It can be difficult, but the more people from other VFX studios I talk to, the more I realize that everybody's pipeline is pretty messy; even the studios who are fully behind Maya use a ton of other software and have a lot of custom scripts and techniques to get everything working the way they want it to.

We use Blender for modeling, animation, and rendering. Our partners at Theory Animation have focused a lot on making Blender better for animation (they all came from a Maya background as well, but fell in love with Blender the same way I did). We've used Blender's fluid system and particle system (though both need work) and render everything in Cycles. We still use Houdini for the things it's good at. We used Massive to create character animations for "The Man in the High Castle". We also started using Substance Painter and Substance Designer for texture work. Cycles is good at exporting render layers, which we composite mostly in Nuke.

One of the big hurdles Blender has to overcome is that its licensing rules can make it legally difficult to interact with paid software. Most companies want to keep their code closed, so Blender's open-source nature has made it tricky to, for example, get a Substance Designer plugin. It's something we're working on, though.

When collaborating with other companies, we usually separate the 3D and compositing aspects of the work to keep the software issues from being a problem. It's getting easier every day, though, especially now that Blender is starting to support Alembic. For season one, the sequence we worked on was completely separate and turnkey, so we didn't have any issues sharing assets. For season two, however, we did need to do a lot of conversion and re-modeling of elements. Also, many of the models we received were textured using UDIMs, which Blender does not currently support. It would be great for Blender to eventually adopt the UDIM workflow for texturing.
We do get a lot of raised eyebrows when we tell people we use Blender professionally. Hopefully the popularity of the show (and the fact that we've been nominated for some VFX awards) will help remove some of the stigma that Blender has developed over the years. It's a great program.

We’ve developed a number of in-house solutions for Blender. We use Blender solely for 3D and NukeX for tracking and compositing, but we hand camera data back and forth between Nuke and Blender using .chan files (that’s technically built into blender but we’ve developed a system to make it a bit easier). Fitting Blender into a compositing pipeline (Nuke, EXR workflow) is surprisingly easy. Layer render rendering, and the ease of setting up Blender have made it pretty fast for passing around assets between artists and vendors. We also have a custom procedure and PBR shader setup for working with materials out of Substance Painter in Blender. A mix of Shotgun, our own asset tracking, and a workflow based on Blender Linking with a handful of add-ons are needed to make sure everything works.

Production Design

We worked really hard to make it feel correct. You can also thank the production designer, Andrew Boughton, who designed the practical sets in the show. He has a lot of architectural knowledge and was very collaborative, helping make sure our designs matched the feel of everything else in the show.

Our visual bible for Germania was a book called "Albert Speer: Architecture 1932-1942". There were extensive and detailed plans for the transformation of Berlin, including blueprints for buildings like the Volkshalle. We did take some creative liberties with the arrangement and positioning of buildings for the sake of the narrative, and to better coordinate with the production designer's aesthetic for the sets. We looked at old film reels, including the famous "Triumph of the Will", for references of how Nazi rallies were organized. One video game I remember paying attention to was "Wolfenstein: The New Order", because it presents a world taken over by the Nazis, though its presentation of post-war Berlin (including the Volkshalle) was much more futuristic and sci-fi-ish than what we went for. Our goal in MITHC was to create a sense of a world that felt fairly mundane and grounded in reality. The more it felt like something that could really happen, the more effective the message of the show.

March 22 2017

Industry support for Blender

Blender is a true community effort. It’s an open public project where everyone’s welcome to contribute. In the past year, a growing number of corporations started to contribute to Blender as well.

We’d like to credit the companies who helping out to make Blender 2.8 happen.

Tangent Animation

This animation studio released Ozzy last year, a feature film made entirely with Blender. They currently have two new films in production. The facility has two locations (Toronto, Winnipeg) and is growing to 150 people in 2017. They use Blender exclusively for 3D.

Since October 2016, Tangent has supported two Blender Institute developers full time to work on the 2.8 viewport. They have also hired their own Cycles developer team, who will be contributing openly.

Nimble Collective

Nimble Collective was founded by former DreamWorks animators. Their goal is to give artists access to a complete studio pipeline, accessible online using just a browser.

Since their launch in 2016, Nimble Collective has invested seriously in integrating Blender into their platform. They currently support one full-time developer position at the Blender Institute, working on animation tools (dependency graph) and pipelines (Alembic).


AMD

AMD is developing a prominent open source strategy, leading the way with FOSS graphics drivers and the new open graphics standard Vulkan.

Since summer 2016, AMD has supported one developer to work on modernizing Blender's OpenGL code, and another to work on Cycles OpenCL (GPU) rendering.

Aleph Objects

Aleph Objects is the manufacturer of the popular libre-hardware LulzBot 3D printer.

Starting this year, Aleph Objects will support the Blender Institute in hiring two people to work full time on UI and workflow topics for Blender 2.8, with the goal of delivering a release-compatible "Blender 101" plus training material for occasional 3D users.

Development Fund Sponsors

The Blender Development Fund is an essential instrument to keep Blender alive. Blender Foundation uses the Development Fund and donations to support 2-3 full-time developer positions. Big and loyal corporate sponsors of the fund are BlenderMarket, Cambridge Medical Robotics, Valve Steam Workshop, Blend4Web, CGCookie, Effetti Digitali, Insydium, Sketchfab, Wube Software, blendFX, Machinimatrix, Pepeland and RenderStreet.

Blender Institute

Last but not least: Blender Institute uses Blender Cloud income, sponsorship and subsidies to support developers and artists working on free/open movies and 3D computer graphics production pipelines. BI currently employs 14 people, including BF chairman Ton Roosendaal.

December 31 2016

The top 30 Blender developers 2016

Let’s salute and applaud the most active developers for Blender of the past year again! The ranking is based on commit total, for Blender itself and all its branches.  

Obviously a commit total doesn’t mean much. Nevertheless, it’s a nice way to put the people who make Blender in the spotlights.

The number ’30’ is also arbitrary. I just had to stop adding more! Names are listed in increasing commit count order.

Special thanks to Miika Hämäläinen for making the stats listing.

Ton Roosendaal, Blender Foundation chairman.

Joey Ferwerda (28)

Joey (Netherlands) worked in 2016 on adding real-time VR viewing in Blender's viewport. This works for Oculus, with Vive support coming soon.

He currently works on OpenHMD, an open source library to support all current Head Mounted Displays.

Luca Rood (30)

Luca (Brazil) is, to my knowledge, the youngest on this list. At 19, he's impressing everyone with his in-depth knowledge of simulation techniques and the courage to dive into Blender's ancient cloth code to fix it up.

Luca currently works with a Development Fund grant on improving cloth sim, to make it usable for high quality character animation.

Gaia Clary (32)

Gaia (Germany) is the maintainer of COLLADA in Blender. Her never-ending energy to keep this working in Blender means we can keep it supported for 2.8 as well.

Martijn Berger (40)

Martijn (Netherlands) was active in 2016 as platform manager for Windows and macOS. He helps make the releases, especially complying with the security standards for downloading binaries on Windows and macOS.

Antonio Vazquez (41)

Antonio (Spain) joined the team to work on Grease Pencil. Based on feedback and guidance from Daniel Lara (Pepeland), he helped turn this annotation tool in Blender into a full-fledged 2D animation and animatic storyboarding tool.

Ray Molenkamp (46)

Ray (Canada) joined the team in 2016, volunteering to help maintain Blender on the Windows platform and supporting Microsoft's development environment.

Alexander Gavrilov (58)

Alexander (Russia) joined the development team in 2016. He started contributing fixes for Weight Painting, and later his attention moved to Cloth and Physics simulation in general.

He is also active in the bug tracker, providing bug fixes on regular basis.

Sybren Stüvel (59)

Sybren (Netherlands) works for Blender Institute as Cloud developer (shot management, render manager, libraries, security) and as developer for Blender pipeline features – such as Blender file manipulations, UI previews and the Pose library.

João Araújo (65)

João (Portugal) accepted a Google Summer of Code grant to work on Blender's 3D Curve object. He added improved extrusion options and tools for Extend, Batch Extend, Trim, Offset, Chamfer and Fillet.

His project is almost ready and will be submitted for review early 2017.

Benoit Bolsee (65)

Benoit (Belgium) is a long-term contributor to Blender's Game Engine. In 2016 he worked on the "Decklink" branch, supporting one of the industry's best video capture cards.

Pascal Schön (78)

Pascal (Germany) joined the Cycles team this year, contributing the implementation of the Disney BSDF/BSSRDF.

This new physically based shading model is able to reproduce a wide range of materials with only a few parameters.

Nathan Vollmer (80)

screen-shot-2016-12-31-at-17-44-50Nathan (Germany) accepted a GSoC grant to work on vertex painting and weight painting in Blender.

With the new P-BVH vertex painting we now get much improved performance, especially when painting dense meshes.

Philipp Oeser (83)

Philipp (Germany) is active in Blender's bug tracker, providing fixes for issues in many areas in Blender.

Contributors who work on Blender’s quality this way are super important and can’t be valued enough. Kudos!

Phil Gosch (131)

Phil (Austria) accepted a GSoC grant to work on Blender's UV tools, especially the Pack Islands tool. While a bit more computation-heavy, the solutions found by the new algorithm give much better results than the old "Pack Islands" in terms of used UV space.

Tianwei Shen (142)

Tianwei (China) accepted a GSoC grant to work on multiview camera reconstruction. This allows filmmakers to retrieve more accurate camera position information from footage when one area is shot from different positions.

His work is ready and close to being added to Blender.

Thomas Dinges (144)

Thomas (Germany) started in the UI team for the 2.5 project, but with the start of Cycles in 2011 he put all his time into helping make it even more awesome.

His main contribution this year was work on the Cycles texture system, increasing the maximum number of textures that can be used on CUDA GPUs and lowering memory usage in many cases.

Dalai Felinto (192)

Dalai (Brazil, lives in the Netherlands) added multiview and stereo rendering to Blender in 2015. In 2016 he contributed to making VR rendering possible in Cycles.

Dalai currently works (with Clement “PBR branch” Foucault) for Blender Institute on the Viewport 2.8 project. Check the posts on https://code.blender.org to see what’s coming.

Martin Felke (199)

Martin (Germany) deserves our respect and admiration for maintaining one of the oldest and most popular Blender branches: the "Fracture Modifier" branch.

For technical and quality reasons his work was never deemed fit for a release. But for Blender 2.8 the internal design will be updated to finally get his work released. Stay tuned!

Mai Lavelle (202)

Mai (USA) surprised everyone by falling from the sky with a patch for Cycles to support micro-polygon rendering. The skepticism of the Cycles developers quickly changed: "This is actually really good code," said one of them, which is a huge compliment coming from coders!

She is currently working for Blender Institute on the Cycles “Split Kernel” project, especially for OpenCL GPU rendering.

Brecht Van Lommel (210)

Brecht (Belgium, lives in Spain) has worked on Blender for a decade. His most memorable contribution is the Cycles render engine (2011).

Aside from working on Cycles, Brecht is active in maintaining the macOS version and Blender's UI code.

Joshua Leung (264)

Joshua (New Zealand) is Blender's animation system coder. He contributed many new features to Blender in the past decade (including Grease Pencil).

Joshua’s highlight for 2016 was adding the “Bendy Bones”. A project that was started by Jose Molina and Daniel Lara.

Lukas Stockner (277)

Lukas (Germany) is a new contributor to Cycles, since 2015. He accepted a Google Summer of Code grant to work on Cycles denoising.

Lukas’ specialism is implementing math. One of his last 2016 commits was titled “Replace T-SVD algorithm with new Jacobi Eigen-decomposition solver”. Right on!

Sebastián Barschkis (300)

Sebastián (Germany) is a recurring GSoC student. He is currently working in his branch on "Mantaflow", an improved fluid simulation library.

Mike Erwin (308)

Mike (USA) was contracted this year by AMD to help modernize Blender's OpenGL and to make sure we're Vulkan-ready in the future.

He currently works on the Blender 2.8 branch, making Blender work with OpenGL 3.2 or later.

Lukas Toenne (413)

Lukas (Germany) worked for Blender Institute on hair simulation in 2014-2015. In 2016 he went back to experimenting with node systems for objects and particles, and wrote a review and proposal for how to add this to Blender.

Most of his commits were in the object-nodes branch, a project which is currently on hold, until we find more people for it.

Kévin Dietrich (516)

Kévin (France) mainly worked on two topics in 2016. In a branch he is still working on the integration of OpenVDB – tools for storing volumetric data such as smoke.

Released in 2.78 was his work on Alembic input/output. Alembic is essential for mixed application pipelines for film and animation.

Julian Eisel (760)

Julian (Germany) not only finds usability and UI interesting topics, he also manages to untangle Blender's code for them. He has contributed to many areas already, such as pie menus and node insertion.

His 2016 highlight is ongoing work on Custom Manipulators – which is a topic for 2.8 workflow project. Goal: bring back editing to the viewport!

Bastien Montagne (1008)

Bastien (France) has been working full-time for the Blender Foundation for many years now. He became our #1 bug tracker reviewer over the past years.

His special interest, though, is asset management. He's now an expert in Blender's file system and works on 2.8 Asset Browsing.

Sergey Sharybin (1143)

Sergey (Russia, living in the Netherlands) is on his way to becoming the #1 Blender contributor. He is best known for his work on motion tracking, Cycles rendering, OpenSubdiv and, recently, the Blender dependency graph.

And of course we shouldn't forget all of his hundreds of bug fixes and patch reviews. The Blender Institute is happy to have him on board.

Campbell Barton (1156)

Campbell (Australia) surprised everyone in August with his announcement to step down from his duties at blender.org. He is taking a well deserved break to renew his energy, and to work on other (own) projects.

He’s still Blender’s #1 committer of 2016 though. Even after his retirement he kept providing code, over 50 commits now. One of this year highlights was adding a high quality boolean modifier in Blender.

November 18 2016

Miyazaki Tribute

I am dono, a CG freelancer from Paris, France. I use Blender as my main tool for both personal and professional work.

My workflow was a bit hectic during the creation of my tribute short to Hayao Miyazaki. There are a ton of ways to produce such a film anyway, and everyone has their own workflow, so the best I can do is simply share how I personally did it.

I have always loved the work of Hayao Miyazaki. I already had a lot of references from Blu-rays, art books, mangas and such, so I didn't spend a lot of time searching for references. All I can say is that it's quite an important task at the beginning of a project: having good references can save a lot of time.

I simply started the project as a modeling and texturing exercise, just to practice. After modeling the bathhouse from "Spirited Away", I thought it could be cool to do something more evolved.


So I first did a layout with very low poly meshes to have a realtime preview of the camera movements. I also extracted frames from the movies using Blu-ray footage, in two different quality versions: low-res JPGs for realtime preview in the 3D viewport, and raw PNGs for the final renders.


I used the realtime previews to edit everything together in Blender's sequencer. I wanted to find a good tempo and feeling with the music, and with realtime playback in Blender's viewport it was easy and smooth to build up. I edited the 3D viewport directly, by linking the scenes into the sequencer, so I didn't need to render anything!


Next, I did the rotoscoping in Blender, frame by frame. Having used realtime previews for the editing, I already knew exactly how many frames I had to rotoscope. That way I didn't waste any time rotoscoping unnecessary footage, which was crucial because rotoscoping is very, very time consuming. The important thing when rotoscoping is to separate parts: you do not want to have everything in one piece. Having separate layers makes it more flexible and faster.


Then I modeled and unwrapped the assets in Blender, and textured them in Blender and GIMP. I used one blend file for each asset to limit file sizes, and used linking to bring everything together in one scene. I also created a blend file containing a lot of materials (different kinds of metal, wood), so I could link and reuse them at will. It was worth it, since a modular workflow often really saves time.


For the smoke, I used Blender's smoke simulation, rendered directly in OpenGL with Blender Internal. That way you can see and correct any mistakes very easily. I also did some dust and fog passes with it.


The ocean was done using the Ocean modifier in Blender. I baked an image sequence to EXR, and used these images to drive the wave displacement and foam.


For rendering I used Octane, since I wanted to try a new renderer for this project, but it could have been done in Cycles without any trouble. I rendered layers separately: characters, sets, backgrounds and FX. Rendering things separately was very useful: renders are faster, you can have bigger scenes with more polygons, and above all you can re-render one part if necessary (and it was very often the case) without rendering the whole image all over again. Renders were saved as 16-bit PNGs for the color layers, and as 32-bit EXRs for the Z pass. I also rendered some masks and ID masks. This allowed me to correct details very quickly during compositing without having to render the whole image again. The rendering time for one frame was from 4 to 15 minutes.


I finished the comp in Natron, adding glow, vignetting and motion blur. The Z pass was used to add some fog, and the ID masks to correct the colors of some objects. When you have a lot of render passes from Blender, it is very easy to composite and tweak things quickly. I remember when I used to do everything in one single pass: I rendered over and over to fix errors, and it was very time consuming. Sozap, a friend of mine and a very talented artist, taught me to use separate layers. It was a really great tip, and thanks to him I could work more efficiently.


During production I showed WIPs to my friends, because they could provide a fresh look at my work. Sometimes it is hard to receive criticism, but it is important to listen, as it can help you improve your work a lot. Without critiques, my short most certainly wouldn't look the way it does now. Thanks again to Blackschmoll, Boby, Christophe, Clouclou, Cremuss, David, Félicia, Frenchman, Sozap, Stéphane, Virgil! And thanks to Ton Roosendaal, the Blender community, and the developers of Blender, GIMP and Natron!


Check out the making of video!

August 07 2016

SIGGRAPH 2016 report

Anaheim, 23 – 28 July 2016

This year was the 25th anniversary of my SIGGRAPH membership (I have been a proud member since '91)! It was also my 18th visit in a row to the annual convention (since '99). We didn't have a booth on the trade show this year, though. Expenses are so high! Since 2002 we exhibited 7 times; we skipped years more often, but since 2011 we were there every year. The positive side of not exhibiting was that I finally had time and energy for meetings and to participate in other events.

Friday 22 – Saturday 23: Toronto


But first: an unexpected last-minute change in the planning. Originally I was going to Anaheim to also meet the owners of Tangent Animation about their (near 100% Blender) feature film studio. Instead they suggested it would be much more practical to rebook my flight and have a day stopover in Toronto, to see the studio and have more time to meet.

I spent two half days with them, and I was really blown away by the work they do there. I saw the opening 10 minutes of their current feature film ("Run Ozzy Run"). The film is nearly finished, currently being processed for grading and sound. The character designs are adorable, the story is engaging and funny, and they pulled off surprisingly good animation and visuals – especially knowing it's still a low budget project, made with all the constraints associated with that. And they used Blender! Very impressive how they managed to make quite massive scenes work. They hired a good team of technical artists and developers to support them. Their Cycles coder is a former Mental Ray engineer, who will become a frequent contributor to Cycles.

I also had a sneak peek at the excellent concept art for the new feature that’s in development – bigger budget, and even more ambitious. For that project they offered to invest substantially in Blender; we spent the second day outlining a deal. In short:

  • Tangent will sponsor two developers to work in Blender Institute on 2.8 targets (defined by us)
  • Tangent will sponsor one Cycles developer, either to work in Blender Institute or in Toronto.
  • All of these are full-time, decently paid positions, for at least one year, effective as early as September.

Sunday 24: SIGGRAPH Anaheim

2 PM: Blender Birds of a Feather, community meeting

As usual we started the meeting by giving everyone a short moment to say who they are and what they do with Blender (or want to see happen). This took 25+ minutes! There were visitors from Boeing, BMW, Pixar, Autodesk, Microsoft, etc.

The rest of the time I gave my usual presentation (a talk about who we are, what we did last year, and our plans for the next).

You can download the pdf of the slides here.

3:30 PM : Blender Birds of a Feather, Spotlight event

Theory Animation’s David Andrade offered to organise this ‘open stage’ event, giving artists and developers 5 minutes each to show the work they did with Blender. It was great to see this organised so well! There was a huge line-up, lasting a full 90 minutes. Some highlights from memory:

  • Theory Animation showed work they did for the famous TV show “Silicon Valley”. The hilarious “Pipey” animation is theirs.
  • Sean Kennedy is doing a lot of Blender vfx for tv series. Amazing work (can’t share here, sorry), and he gave a warm plea for more development attention for the Compositor in Blender.
  • Director Stephen Norrington (Blade, League of Extraordinary Gentlemen) is using Blender! He showed vfx work he did for a stop motion / puppet short film.
  • JT Nelson showed results of Martin Felke’s Blender Fracture Branch. Example.
  • Nimble Collective premiered their first “Animal Facts” short, The Chicken.

Afterwards we went for drinks and food to one of the many bars and restaurants close by. (Well, close – on the map it looked like 2 blocks, but in Anaheim these blocks were half a mile! Made the beer taste even better though :)

Monday 25: the SIGGRAPH Animation Festival, Jury Award!

Selfie with badge + ribbon

Aside from all the interesting encounters you can have in LA (I met with people from Paramount Animation), the absolute highlight of Monday was picking up the Jury’s Choice Award for Cosmos Laundromat. Still vividly remembering my struggles with the basics of CG 25 years ago, I never thought I’d be cheered on and applauded by 1000+ people in the SIGGRAPH Electronic Theater!

Clearly the award is not just mine, it’s for director Mathieu Auvray and writer Esther Wouda, the team of artists and developers who worked on the film, and most of all for everyone who contributed to Blender and to Blender Cloud in one way or another.

But the surprises weren’t over for that day. I sneaked away from the festival screening and went to AMD’s launch party, where I was pleasantly surprised to watch corporate VP Roy Taylor spend quite some time talking about Blender, ending with “We love Blender, we love the Blender community!” AMD is very serious about focusing on 3D creators online, to serve the creative CG communities – of which Blender users are now one of the biggest. If AMD could win back the hearts of Blender artists…

Theory Animation guys!

After the event I met with Roy Taylor. He confirmed the support they already give to Blender developer Mike Erwin (to upgrade OpenGL), and said AMD is committed to helping us in many more ways – so I asked for one more full-time Cycles coder. Deal! Support for one year of a full-time developer on Cycles, to finish the ‘OpenCL split kernel’ project, is being processed now. I’ll be busy hiring people in the coming period!

Later in the evening I met with several Blender artists. I handed the award over to them to show my appreciation. Big fun :)

Tuesday 26 – Wednesday 27, SIGGRAPH tradeshow and meetings

Not having a booth was a blessing (at least for once!). I could freely move around and fill the days with meetings, with time to attend the activities outside of the trade show as well. Here’s a summary of activities and highlights:

  • Tradeshow impression
    This year’s show seemed a bit smaller than last year, but on both days it felt crowded in most places; attendance was very good. The highlights are still the presentations by artists showing their work at the larger booths, such as Nvidia’s or Foundry’s. The visit was also worth it for an original Vive experience. Google’s Tango was there, but the marketing team failed to impress with their demos – 3D-scanning the booth failed completely every time (don’t put TV screens on walls if you want to scan!).
  • Pixar USD launch lunch
    Pixar presented the official launch of the Universal Scene Description format, a set of formats with a software library to manage your entire pipeline. The USD design is very inviting for Blender to align with – we already share some of the core design decisions, but USD is quite a bit more advanced. It will be interesting to see whether USD will also be used for pipeline IO (file exchange) among applications.
  • Autodesk meeting
    Autodesk has appointed a director of open source strategy. He couldn’t attend, but connected me with Marc Stevens and Chris Vienneau, executives in the M&E department. They also brought in Arnold’s creator, Marcos Fajardo.
    Marcos expressed their interest in having Arnold support for Blender. We discussed the (legal, licensing) technicalities of this a bit more, but as long as they stick to data transport between the programs (as PRMan and V-Ray do now, using Blender’s render API) there’s no issue. With Marc and Chris I had a lengthy discussion about Autodesk’s (lack of) commitment to open source and openly accessible production pipelines. They said that Autodesk is changing its strategy though, and will show this by actively sharing sources and participating in open projects as well. I invited them to publish the FBX spec doc (it needs blessing from the board, but they’ll try) and to work with Pixar on getting the character module for USD fleshed out (make it work for Maya and Max, under an open license). The latter suggestion was met with quite some enthusiasm. That would mostly make the whole FBX issue go away.
  • Nvidia
    It was very cool to meet Ross Cunniff, Technology Lead at NVIDIA. He is nice, down-to-earth and practical. With his connections it’ll be easier to get a regular seeding of GTX cards to developers – I asked for a handful of 1080s right away! Nvidia will also actively work on getting Blender Cycles files into the official benchmarking suites.
  • Massive Software
    David Andrade (Theory Animation) set up a meeting for me with industry legend Stephen Regelous, founder of Massive Software and the genius behind the epic Lord of the Rings battle scenes. Stephen said that at Massive user meetings there is an increasing demand for Blender support. He explained to me how they do it: basically everything is low-poly and usually gets rendered in one pass! The Massive API has a hook into the render engine to generate the geometry on the fly, preventing huge file or caching bottlenecks. To make this work for Blender Cycles, a similar hook would have to be written. They currently don’t have the engineers to do this, but they’d be happy to support someone for it.
  • Khronos
    I also attended the WebGL meeting (with demos by the Blend4web team) and the Khronos party. It was big fun – a lot of Blender users and fans there! The Khronos initiative remains incredibly important: they keep the graphics standards open (like OpenGL and glTF) and make innovation available for everyone (WebGL and Vulkan).
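The render-engine hook Massive uses – per-agent geometry generated at render time rather than cached to disk – can be sketched in plain Python. This is an illustrative stand-in, not the actual Massive API; every name below is hypothetical:

```python
# Illustrative sketch (not the Massive API): instead of exporting huge
# agent caches, the renderer calls a hook that generates each agent's
# low-poly geometry on demand, so only the frame being rendered ever
# needs to exist in memory.

def agent_geometry(agent_id, frame):
    """Stand-in for per-agent geometry generation at render time."""
    # A real hook would evaluate the crowd simulation here; we return
    # a tiny placeholder "mesh" record keyed by agent and frame.
    return {"agent": agent_id, "frame": frame, "tris": 12}

def render_frame(frame, num_agents, hook):
    """Pull geometry lazily from the hook while 'rendering' one frame."""
    return [hook(i, frame) for i in range(num_agents)]

meshes = render_frame(frame=101, num_agents=3, hook=agent_geometry)
assert len(meshes) == 3
```

The design point is that the hook inverts the usual exporter flow: the render engine asks for geometry, rather than the simulation pushing files at it.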

Friday 29, San Francisco and Bay Area

Wednesday evening and Thursday I took my time driving the scenic route north to San Francisco. I wanted to meet some friends there (loyal Blender supporter David Jeske, director/layout artist Colin Levy, CG industry consultants Jon and Kathleen Peddie, Google engineer Keir Mierle) and visit two business contacts.

  • Nimble Collective
    Located in a lovely office in Mountain View (it looks like it’s always sunny and pleasant there!), this startup is also investing heavily in Blender, using it for a couple of short film projects. I’ll leave it to them to release the info on the films :) but they are going to be amazingly good! I also had a demo of their platform, a ‘virtual’ animation production workstation that you can use in a browser. The Blender demo on their platform felt very responsive, including fast Cycles renders.
    The visit ended with me participating in their “weekly” – just like the Blender Institute weekly! An encouraging and enthusiastic gathering to celebrate results and the work that has been done.
  • Netflix
    The technical department at Netflix contacted us a while ago; they were looking for high-quality HDR content for streaming and other tests. We sent them the OpenEXR files of Cosmos Laundromat, which are unclipped, high-resolution color. Netflix took them to a specialist HDR grading company and showed me the result – M I N D blowing! It is really awesome to see how the dynamics of Cycles renders (like the hard morning light) work on a screen that allows a ‘more than white’ dynamic display. Cosmos Laundromat is now on Netflix as one of the first HDR films.
    We then discussed how Netflix could do more with our work. Obviously they’re happy to share the graded HDR film, but they’re especially interested in getting more content – particularly in 4K. A proposal for sponsoring our work is being evaluated internally now.

Sunday 31 July, Back home

I was gone for 9 days, with 24 hours spent in airplanes – but it was worth it :) The jetlag kicked in as usual and took a week to resolve. In the coming weeks a lot of work is waiting, especially setting up all the projects around Blender 2.8. A new design/planning doc on 2.8 is the first priority.

Please feel invited to discuss these topics in our channels, talk to me on IRC about Blender 2.8 and Cycles development work, or send me a mail with feedback. That’s ton at blender.org, as usual.

Ton Roosendaal
August 7, 2016

July 28 2016

E-Interiores: Next-generation interior design with Blender

By: Dalai Felinto, Blender Developer

Meet e-interiores. This Brazilian interior design e-commerce startup completely transformed its creation process. This tale will show you how Blender made that possible, and how far we got.

We developed a new platform based on a semi-vanilla Blender, Fluid Designer, and our own pipelines. Thanks to the results achieved, e-interiores was able to consolidate a partnership with the retail giant Tok&Stok, delivering the complete design of a room within 72 hours.

A long time ago in a galaxy far far away

During its initial years, e-interiores focused on delivering top-notch projects with state-of-the-art 3D rendering. Back then, this involved a pantheon of software: AutoCAD, SketchUp, V-Ray, Photoshop.

All those mainstream tools were responsible for producing technical drawings, 3D studies, final renderings, and the presentation boards. Although nothing could be said against the final quality of the deliverables, the overall process was artisanal at best and extremely time consuming.

Would it be possible to handle all those steps inside a single tool? How much time could be saved by handing the non-essential tasks over to the computer itself?

New times require new tools

The benefits of automation in a pipeline are well known and easily measured. But how much thought does a studio give to customization? How much can a studio gain from a custom-tailored tool?

It was clear that we had to minimize the time spent on preparation, rendering and presentation. This would leave the creators free to dedicate their time and sweat to what really matters: which furniture to use and how to arrange it, which colors and materials to employ – the interior design itself.

A fresh start

The development paradigm was as follows:

  • Vanilla Blender: the underlying software should stay as close to its consumer version as possible
  • Addon: the core of the project would be a Python script controlling the end-to-end user experience
  • Low entry barrier: users should not need prior skills in any 3D software, especially not in Blender

Development started by cleaning up the Blender interface completely. I wanted the user to be unaware of the software being used underneath. We took a few hints from Fluid Designer (the theme is literally their startup file), but we focused on tying the interface to the specifics of e-interiores’ working steps.

You have tools to create the fixed elements of the space (walls, floor, …), the render points of view, the dynamic elements of the project, and the library. Besides that, there is a whole different set of tools dedicated to creating the final boards: adding annotations, measurements, …

A little bit about coding

Although I wanted to keep Blender as close to its pristine release condition as possible, some changes in Blender were necessary. They mostly revolved around the Font object functionality, which we use extensively in the board preparations.

The simplest solution in this case was to make the required modifications myself, and contribute them back to Blender. The following contributions are all part of the official Blender code, helping not only our project but anyone who requires more robust all-around text editing functionality:

With this out of the way, we have a total of 18,443 lines of code for the core system, 1,458 for model conversion and 2,407 for the database. All of this amounts to over 22 thousand lines of Python scripting.

Infrastructure barebones

The first tools we drafted are what we call the skeleton. We have parametric walls, doors, and windows. We can make floors and ceilings. We can adjust their measurements later. We can play with their styles and materials.
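To illustrate what “parametric” means here: a wall can be reduced to a handful of numbers that regenerate its mesh on demand. The sketch below is hypothetical (not e-interiores’ actual code) and keeps the geometry math in plain Python; inside Blender, the vertex list would feed something like `bpy.data.meshes.new()` plus `Mesh.from_pydata()`:

```python
# Hypothetical parametric wall: length, height and thickness are the
# only stored data; the 8 corners of the box-shaped mesh are computed
# from them, so adjusting a measurement later is just a re-run.

def wall_vertices(length, height, thickness):
    """Return the 8 (x, y, z) corners of an axis-aligned wall."""
    verts = []
    for x in (0.0, length):
        for y in (0.0, thickness):
            for z in (0.0, height):
                verts.append((x, y, z))
    return verts

# A 4 m long, 2.6 m high, 15 cm thick wall:
verts = wall_vertices(length=4.0, height=2.6, thickness=0.15)
assert len(verts) == 8
assert max(v[2] for v in verts) == 2.6
```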

Objects library

We have over 12,000 3D models, made available to us by Tok&Stok. The challenge was to batch-convert them into a format Cycles could use. The files were originally in Collada, modelled and textured for realtime usage. We ditched the lightmaps, removed the support meshes, and assigned hand-made Cycles materials based on each object’s category.
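The category-based material assignment can be pictured as a simple lookup applied during the batch conversion. The categories and material names below are made up for illustration; the real converter ran inside Blender over the imported Collada objects:

```python
# Hypothetical version of the conversion rule: every imported object
# gets a hand-made Cycles material chosen from its catalogue category.
# The table and names are illustrative, not e-interiores' real data.

CATEGORY_MATERIALS = {
    "sofa": "fabric_generic",
    "table": "wood_varnished",
    "lamp": "metal_brushed",
}

def material_for(category, default="plastic_neutral"):
    """Pick a Cycles material name for an object category."""
    return CATEGORY_MATERIALS.get(category, default)

assert material_for("sofa") == "fabric_generic"
assert material_for("unknown-thing") == "plastic_neutral"
```

Keeping the rule in one table is what makes a 12,000-model batch tractable: fixing a category’s look means editing one material, then re-running the conversion.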

Part of this was only possible thanks to the support of Blender developer and Collada functionality maintainer Gaia Clary. Many thanks!

More dynamic elements

Curtains, mirrors, marble, Blindex… there are a few components of a project that are custom-made and adjusted on a case-by-case basis.


This is where the system shines. The moment an object is in the scene, we can automatically generate the lighting layout, the descriptive memorial, and the product list.

The boards are the final deliverable to the clients. This is where the perspectives, the project lists, the blueprints all come together. The following animation illustrates the few steps involved in creating a board with all the used products, with their info gathered from our database.

Miscellaneous results

Finally, here is a sample of the generated results from the initial projects done with this platform. Thanks to Blender’s scripting and customization possibilities, we put together an end-to-end experience for our designers and architects.

July 07 2016

Cosmos Laundromat wins SIGGRAPH 2016 Computer Animation Festival Jury’s Choice Award

A few days ago we wrote about three Blender-made films being selected for the SIGGRAPH 43rd annual Computer Animation Festival. Today we are happy to announce that Cosmos Laundromat Open Movie (by Blender Institute) has won the Jury’s Choice Award!

Producer Ton Roosendaal says:

SIGGRAPH always brings the best content together for the Computer Animation Festival from the most talented artists and we are honoured to be acknowledged in this way for all our hard work and dedication.


Get ready to see more and more pictures of Victor and Frank as Cosmos Laundromat takes over SIGGRAPH 2016!

Google Expeditions – Education in VR

By: Mike Pan, Lead Artist at Vida Systems

The concept of virtual reality has been around for decades. However, it is only in the last few years that technology has matured enough for VR to really take off. At Vida Systems, we have been at the forefront of this VR resurgence every step of the way.


Vida Systems had the amazing opportunity to work with Google on their Expeditions project. Google Expeditions is a VR learning experience designed for classrooms. With a simple smartphone and a Cardboard viewer, students can journey to far-away places and feel completely immersed in the environment. This level of immersion not only delights the students, it actually helps learning as they are able to experience places in a much more tangible way.


To meet the challenge of creating stunning visuals, we rely on Blender and the Cycles rendering engine. First, each topic is carefully researched. Then the 3D artists create a scene based on the layout set by the designer. With Cycles, it is incredibly easy to create photorealistic artwork in a short period of time: lighting, shading and effects can all be done with realtime preview.


With the built-in VR rendering features including stereo camera support and equirectangular panoramic camera, we can render the entire scene with one click and deliver the image without stitching or resampling, saving us valuable time.
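As a rough illustration, those settings correspond to a short bpy configuration fragment. This is a hedged sketch against the Blender 2.7x Python API of that era (property paths may differ in other versions), assuming an active camera and the Cycles engine:

```python
import bpy

scene = bpy.context.scene
cam = scene.camera.data

# Equirectangular panoramic camera (Cycles)
cam.type = 'PANO'
cam.cycles.panorama_type = 'EQUIRECTANGULAR'

# Stereo rendering via multiview
scene.render.use_multiview = True
scene.render.views_format = 'STEREO_3D'
cam.stereo.interocular_distance = 0.065  # meters; a typical eye separation

# Square output, e.g. 4K by 4K per eye
scene.render.resolution_x = 4096
scene.render.resolution_y = 4096
```

With these settings a single render produces both eyes of the full panorama, which is why no stitching or resampling pass is needed afterwards.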


For VR, the image needs to be noise-free, in stereo, and high resolution. Combining all three factors means our rendering time for a 4K-by-4K frame is about 8 times longer than for a traditional 1080p frame. With two consumer-grade GPUs working together (a 980 Ti and a 780), Cycles was able to crunch through most of our scenes in under 3 hours per frame.
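The arithmetic behind that figure is easy to check: path-tracing time scales roughly with pixel count, and a square 4K frame has about eight times the pixels of a 1080p frame.

```python
# Sanity check of the "8 times longer" figure via pixel counts.
vr_pixels = 4096 * 4096   # one square 4K eye
hd_pixels = 1920 * 1080   # traditional 1080p frame

ratio = vr_pixels / hd_pixels
assert round(ratio) == 8  # ~8.09x
```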

Working in VR has some limitations. The layout has to follow real-world scale, otherwise it looks odd in 3D. It is also more demanding to create the scene, as everything has to look good from every angle. We also spent a lot of time on details: the images had to stand up to scrutiny, since any imperfection would be readily visible due to the level of immersion offered by VR.


For this project, we tackled a huge variety of topics, ranging from geography to anatomy. This was only possible thanks to the four spectacular artists we have: Felipe Torrents, Jonathan Sousa de Jesus, Diego Gangl and Greg Zaal.



Our work can be seen in the Google Expeditions app available for Android.

On blender.org we are always looking for inspiring user stories! Share yours with foundation@blender.org.

Follow us on Twitter or Facebook to get the latest user stories!

June 23 2016

Siggraph 2016 Computer Animation Festival Selections

We are proud to share the news that 3 films completely produced with Blender have been selected for the 43rd Computer Animation Festival to be celebrated in Anaheim, California, 24-28 July 2016! The films are Cosmos Laundromat (Blender Institute, directed by Mathieu Auvray), Glass Half (Blender Institute, directed by Beorn Leonard) and Alike (directed and produced by Daniel M. Lara and Rafa Cano).


The films are going to be screened at the Electronic Theater, which is one of the highlights of the SIGGRAPH conference. SIGGRAPH is widely considered the most prestigious forum for the publication of computer graphics research and it is an honour to see such films in the same venue where computer graphics has been pioneered for decades.

Here you can see a trailer of the Animation Festival, where some shots of Cosmos Laundromat can be spotted.

May 04 2016

Hardcore Henry – using Blender for VFX

By: Yaroslav Kemnits, Ph.D., Creative VFX director, Division LLC, Moscow, Russia

Hardcore Henry is a sci-fi movie. The hero is a cyborg fighting other hostile cyborgs. Instead of putting it in a futuristic setting, writer/director Ilya Nayshuller puts the events in the present, ordinary world. That’s why the shooting took place in the buildings and streets of a real city. Just one scene of the movie couldn’t be produced this way: the life pod’s free fall from the stratosphere onto a road near Moscow City, which had to look like ordinary GoPro action footage.

The fall was filmed in three parts: sky (above the clouds), clouds (inside them) and above the city. The first part is set with blue sky above and a canvas of white clouds below – such scenes are always pretty. Then the pod enters the clouds. GoPro footage of falling through clouds looks quite boring – a grey screen is all you see. That’s why I added a turbulence effect on entering the cloud mass and made huge cave-like hollows inside. Finally, the pod flies out of the clouds and we see the quickly approaching city. The hero opens the parachute, which softens the impact of the collision with the road.

I often use Blender when I create visual effects – to create animatics for action scenes, to make set-design sketches, and much more. And, obviously, I knew about its Cloud Generator add-on, so I used it in this movie. It is simple to use and versatile at the same time.

Using it, I made a cloud and dropped a 6-camera unit through it. The unit descends close to the edge of the cloud, because the hero never looks backwards.

Animation of the 6-camera unit

The scene was illuminated by two light sources – Sun and Hemi (sky). I compared the render result with real GoPro recordings, and it exceeded all my expectations.

Render result

We filmed the city in 360 degrees using a drone carrying a 6-GoPro rig. That part was simpler.


I synchronized the recordings and mapped them onto a cube.


The tallest building in the Moscow City complex is 374 meters. The drone couldn’t ascend higher than that, and I needed to create the feeling of a much greater height.


I used camera mapping to achieve this.

We created a “white room” with programmable luminaires around the pod.


Several lighting effects were created with it – for example, sunlight moving across the heroine’s face as the pod rotates.


The program allowed us to alter the light parameters inside and outside the clouds, among many other things.


Using mirrors enabled us to record reflections.

Finally, I needed to create and animate a parachute.


It wasn’t difficult, because the parachute only had to be visible for about one second. I used an ordinary round-canopy parachute. One thing: I had to enlarge it a little, as otherwise it looked too small. I animated the parachute with a cloth simulation, using wind as the force.

We also used the Blender Fracture Modifier (http://df-vfx.de/fracturemodifier/) to create explosions and collapses.


Why did we choose Blender?

It is a very flexible tool. It includes almost every top modern technology and has a convenient, user-friendly interface, which lets us solve creative problems without struggling with fiddly software.


Yaroslav Kemnits, Ph.D., Creative VFX director, Division LLC, Moscow, Russia

On set VFX supervisor of “Hardcore Henry” movie

February 16 2016

NASA’s Experience Curiosity

It is amazing to see how NASA is using Blender for its innovative projects. From the controllable rover web app Experience Curiosity, to simulated space exploration of exoplanets, to mobile-based augmented reality, NASA is at the forefront of demonstrating the benefits of having Blender as an interactive 3D tool.

Brian Kumanchik, Project Lead & Art Director of NASA Jet Propulsion Laboratory, has this to say about Blender…

I started using Blender personally about 6 years ago, as a free alternative to Maya and Max, when I started my own business modeling and selling aircraft for Microsoft Flight Simulator (http://simflight3d.com/) after being laid off from my job in the video game industry. I decided to try an all-open-source route and found the tools very capable. I actually prefer Blender over both Maya and 3ds Max. It was my Blender/GIMP-created aircraft that landed me the job with NASA. And the fact that I’m using open source tools at NASA means that the public can download my models, open them up and play with them without spending money on 3D software. I have about 25 years of experience in the video game industry, mostly using 3ds Max.

The Blend4Web decision was made because it was already Blender-friendly, had a physics engine and was the most mature WebGL engine out at the time. They were also willing to work with us.

The Blender/Blend4Web pipeline was pretty smooth. The only problem with working in WebGL is that browsers are forever changing, and features get turned on and off daily. But on the plus side, our app runs on mobile devices without any changes, except to accommodate the smaller screens.

Watch for other apps using Blender, GIMP and Blend4Web in the future.


NASA’s SPACECRAFT3D Augmented Reality App

More NASA projects using Blender:

Brian is also using Blender for his upcoming board game, Project Mars.

January 25 2016

The Art of Open Source

This article introduces Blender to a wider audience.

Written by Jim Thacker for Linux Format magazine, it sketches Blender’s history and its successful content-driven development model.

Download or read the pdf here.

(Text and pdf is (C) by Linux Format, copied on blender.org with permission)
