ruminations on a series of unrelated events
After a bit of a hiatus from the blog, I am back with some interesting projects that are in various phases. The one I’ll write about now is a project called Any Way Back.
But first a little background…
I am currently teaching a course called Breaking the Frame in the School of Informatics and Computing at IUPUI in Indianapolis. The course explores various emerging video technologies. This first semester teaching it, I am having students do four different projects:
(1) A branching video. For this project students use Eko Studio to create a short branching, non-linear narrative;
(2) A 360˚ stereographic project for which students use a Vuze stereographic 360˚ camera and Mettle Studio to embed graphics in 3D space, and then use a tool developed in Unity to view the footage on a 3D TV with navigation between 3D nodes. (This is pretty exciting. I’ll write a separate blog post about that 360˚ Unity technology, which I worked with IU’s Advanced Visualization Lab to develop.);
(3) A video-mapping project for which we will use good old Millumin, the software I have used for all my video mapping;
And finally (4) a Database film, which is what brings me to today’s topic.
Well over a year ago a student, Jon Eddy, asked me if he could do an independent study with me. I suggested we work together to conceive, shoot and edit a prototype of a piece I had in mind. We would not do the database work, but we would mock up a normal video to demonstrate what the piece would look like. So Jon and I came up with a concept and enlisted another student, Nich Frost, to act in it. We shot it over the course of a couple of days, recorded another student, Hannah West, doing the voiceover, and then designed the layouts and put an 18-minute prototype of Any Way Back together. And that is where it sat for way too long.
Eventually, I approached a colleague, Travis Faas, who teaches coding and database stuff at our school, and he and I have been making some progress lately. Yesterday, we had some pretty good success and it feels like after a couple more meetings we will have the bugs worked out.
The thing about soft cinema is that it is not designed to play on the internet, though I am sure there is some way to stream it. It lives on an iMac set to its 5K display. The software pulls a voiceover file and then searches for compatible meta tags to pull a 3K video, a 1080 video, a graphic and a music clip to populate a layout. It plays for one minute, then grabs another set and populates a new layout. The piece plays continually, but due to the number of assets and the logic, it will never play back the same way twice. In thinking about the logic, it became obvious that some semblance of order needed to be imposed to give the piece structure; otherwise it could become random nonsense. I realized that the voiceovers had to play in a given order, and that all the other choices would then be determined by which voiceover was playing.
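To make that logic concrete, here is a minimal sketch of the idea in Python. All of the file names, tags and record fields are invented for illustration — this is not the actual schema Travis and I built, just the shape of it: voiceovers play in a fixed order, and each one-minute layout is filled by picking tag-compatible assets at random, never reusing a file within the same layout.

```python
import random

# Hypothetical asset records -- the field names and tags are
# illustrative, not the actual Any Way Back database schema.
voiceovers = [
    {"file": "vo_01.wav", "tags": {"water", "memory"}},
    {"file": "vo_02.wav", "tags": {"road", "night"}},
]

assets = {
    "video_3k":   [{"file": "a_3k.mp4", "tags": {"water"}},
                   {"file": "b_3k.mp4", "tags": {"road", "night"}}],
    "video_1080": [{"file": "a_hd.mp4", "tags": {"memory"}},
                   {"file": "b_hd.mp4", "tags": {"night"}}],
    "graphic":    [{"file": "g1.png",  "tags": {"water", "road"}}],
    "music":      [{"file": "m1.mp3",  "tags": {"memory", "night"}}],
}

def pick(pool, tags, exclude):
    """Choose one asset whose tags overlap the voiceover's tags,
    skipping files already chosen for this layout (omitting this
    check is one way two identical videos end up playing at once)."""
    matches = [a for a in pool if a["tags"] & tags and a["file"] not in exclude]
    return random.choice(matches) if matches else None

def build_sequence():
    """Voiceovers play in a fixed order; every other choice follows."""
    for vo in voiceovers:          # the fixed order imposes the structure
        chosen = set()
        layout = {"vo": vo["file"]}
        for slot, pool in assets.items():
            asset = pick(pool, vo["tags"], chosen)
            if asset:
                layout[slot] = asset["file"]
                chosen.add(asset["file"])
        yield layout               # one-minute layout, then grab the next set
```

Because the non-voiceover picks are random draws from whatever matches, two runs through the same voiceover list can populate the layouts differently — which is the "never plays back the same way twice" behavior, here only as a toy.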
As we make more progress, I will update this entry. Stay tuned…
December 18, 2018
During this past Spring semester, I worked extensively with Travis Faas to create the database for this piece. After several attempts we got it working, albeit with a few bugs. Unfortunately, we didn’t complete it in time to use as a template in my class, and honestly I think it is too complex for students to actually use. I did, however, show it to my students. This past semester I had a student who created a database capstone, and last week I sat down with Travis and that student to help develop the database for her. While we were at it, we reviewed the bugs in Any Way Back, the biggest being that two identical videos will play at the same time for no apparent reason. Shortly after sitting down with Travis, I heard that the Arts Council of Indianapolis has issued a call for a video show. Aha! Here is the perfect opportunity to showcase this new project.
So now all we have to do is work these bugs out and it might be in a show at the Arts Council come February. Check back to see if it got in.