Monday 30 June 2014

Tangible Projection - morph!

Those horizontal lines I've been harping on about are finally becoming playful via Processing. Now to translate that to the Kinect. 


Tangible Projection - second experiment

Today has involved searching for an unused studio space, sourcing the same projector and working out the right distances for the Kinect's infrared capture. There's a bit of a struggle to match screen resolutions at the moment, and to find the ideal ratio for the Kinect to pick up movement, which is further complicated by the level of light in the environment.

We're getting there, slowly...



The issue here is size and correct mapping. It has been suggested that a taller canvas be made to better match the Kinect's 640 x 480 pixel capture, as there was essentially a height problem with the Kinect's motion tracking. 145 cm in width is fine, but the height could be taller. In fact, it might not even be a case of making a taller canvas, but of rotating the Kinect 90 degrees, because it wasn't so much the width that was problematic as the height. If that's the case, then I should look into a way of elevating and suspending the canvas itself.
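To think the rotation through, I scribbled a quick Processing sketch of what turning the Kinect 90 degrees actually does to the coordinates. This is a rough, untested sketch with no Kinect attached; the rotatedToCanvas() helper and the clockwise direction are my own assumptions, just to show that the 640 x 480 frame becomes 480 x 640 and each pixel's coordinates swap over.

```processing
// Rough sketch (no Kinect attached): what rotating the Kinect 90 degrees
// clockwise does to its 640 x 480 depth frame. The long axis then runs
// vertically, to suit a taller canvas. rotatedToCanvas() is a hypothetical
// helper name, not part of any library.

int kw = 640;  // native Kinect capture width
int kh = 480;  // native Kinect capture height

void setup() {
  size(480, 640);  // swapped: a portrait window to match the rotated sensor
}

// Map a pixel from the original landscape frame into the rotated canvas.
// A 90-degree clockwise rotation sends (x, y) to (kh - 1 - y, x).
PVector rotatedToCanvas(int x, int y) {
  return new PVector(kh - 1 - y, x);
}
```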

I'm also unsure at this point whether the Epson TW480 is suitable for rear projection, unless it has a mode for that function, which I've yet to find out by playing with its settings a bit more. If not, then I need to find another projector with such a mode, and one that can be mounted vertically if I do make a taller canvas.


I can safely say that I've not really done anything like this before, and my departure from traditional Fine Art mediums has been a lot harder than I anticipated. I've had to get my head around programming languages for a start, tap into physical computing, give further consideration to the environment, and use other, technological methods to apply an effective visualisation.

Saturday 28 June 2014

Tangible Projection - first experiment

A couple of days after making the screen, and playing around with the coding, I arranged a dark studio space in order to test the projection and map it out.

The setup consisted of a laptop, projector (Epson TW480), Xbox Kinect and the screen itself. I was borrowing a laptop from my tutor at the time, before my friend arrived with his own laptop, which had all the necessary bits already installed to accommodate Processing.

I wasn't surprised to find that there were issues with projection rendering and screen synchronisation on the laptop, especially within Processing itself. We were scratching our heads as to how to full-screen the damn thing and have it render accordingly through the projector, but no; they have to be awkward with each other. Testing will continue in a different yet similar space, most likely the now-vacant Fine Art studios.

A few problems that should be addressed for the next testing:
  • Sort out screening/projector synchronisation
  • Pinpoint exact distance of Kinect vs. captured motion appearing through spandex.
  • Clean up the coding, especially dimensions if alternatives can't be used for the screening. 
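On the second bullet, one way to pinpoint the useful capture distance would be to threshold the depth readings, keeping only a band of distances so that anything nearer or further than the screen area is ignored. This is an untested sketch based on Daniel Shiffman's SimpleOpenNI examples; the near/far values in millimetres are guesses to tune on the day.

```processing
// Untested sketch: isolate a band of Kinect depth readings (in mm),
// so only motion within the useful distance of the spandex screen counts.
// nearThreshold/farThreshold are placeholder values to tune by eye.

import SimpleOpenNI.*;

SimpleOpenNI kinect;
int nearThreshold = 600;   // ~0.6 m (assumed)
int farThreshold  = 2000;  // ~2.0 m (assumed)

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
}

void draw() {
  kinect.update();
  int[] depth = kinect.depthMap();   // one distance (mm) per pixel
  loadPixels();
  for (int i = 0; i < depth.length; i++) {
    boolean inBand = depth[i] > nearThreshold && depth[i] < farThreshold;
    pixels[i] = inBand ? color(255) : color(0);  // white where motion counts
  }
  updatePixels();
}
```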
More on this space on Monday, hopefully.

Tangible Projection - screen is ready

Here is an image of the screen in the workshop. One of the technicians made the frame, and I've attached the spandex material. It was done on Tuesday this week. Big enough to serve its purpose, as well as flexible.

Dimensions: 145 x 145 cm

Thursday 26 June 2014

Kinect and Processing - coding and set up.


I've figured that the way to achieve my projection idea is to use a Kinect, hence my earlier post on how it works. I've had trouble with admin rights and permissions on the university computers, so I've borrowed a Kinect from my tutor to take home and try out on my own PC, which runs Windows 7. I also referred to Daniel Shiffman's guide to configuring the device to suit Processing. I had a friend, who works at Lockwood Ltd as a coder, help and take me through this. It didn't take him long to figure out Processing, which surprised me, as he had no prior awareness of it.

For future reference, this is basically what I've done to get the two working together:

1. Installed the Kinect-for-Windows SDK and Developer Toolkit applications - the first and foremost thing one needs to do in order to install the hardware properly.
2. Set up the necessary code and libraries from OpenKinect - a few problems arose out of this, like the imported SimpleOpenNI library, which didn't work at first, and compatibility issues between different versions of Processing itself, which were already complicated by Mac vs. Windows differences. Luckily, said friend from Lockwood sorted these out.

3. Set up the pixel and depth resolution settings for the Kinect on Processing.

4. Translated the Processing sketch to the Kinect.
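For my own memory, steps 3 and 4 boil down to something like this minimal depth sketch, following Shiffman's SimpleOpenNI guide. I haven't run this exact code away from the hardware, so treat it as a sketch of the shape of things rather than a working program.

```processing
// Minimal SimpleOpenNI depth sketch, after Shiffman's guide (untested here
// without the Kinect plugged in): open the depth camera and draw its
// greyscale depth map each frame.

import SimpleOpenNI.*;

SimpleOpenNI kinect;

void setup() {
  size(640, 480);              // matches the Kinect's native depth resolution
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();        // turn on the infrared depth camera
}

void draw() {
  kinect.update();                   // grab the latest frame
  image(kinect.depthImage(), 0, 0);  // draw the greyscale depth map
}
```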

So far, so good, albeit with a few brain farts and headaches. What I have to do now is build the screen, source the projector and set it all up in a space I've organised at Trent.

Friday 20 June 2014

Xbox Kinect - depth resolution

Efficient, clever and useful to know, especially for my Tangible Projection idea, which is slowly but surely coming together. 


Tuesday 10 June 2014

Processing - image warping/morphing

First of all, I find this mesmerising:

Second, this is quite cute:

Thirdly, ye gods this man's work is beautiful! And he's done several more, all in Processing.


Lately, I've been racking my brains over the programming language, which is generally aimed at designers and artists. I have at least figured out how to make static sketches with the code, which is practically basic Java/HTML (the sort one used to change backgrounds and colour schemes on MySpace or VampireFreaks). What I'm trying to achieve at this point is a way to have the mouse hover over the vertical lines I've put into the sketch and influence them with morph effects and ripples. Something along those lines, anyway, but I have lately been struggling with that kind of coding and have asked my tutor for help. Hopefully, the appropriate code to achieve such intricate effects will be sorted by this week.
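In the meantime, here is my rough idea of the effect, sketched in Processing: vertical lines that bow away from the mouse as it hovers near them. The spacing, radius and strength values are placeholders to tune by eye, not anything final.

```processing
// A rough sketch of the effect I'm after: vertical lines that bow away
// from the mouse as it hovers near them. All the numbers are placeholders
// to tune by eye.

int spacing = 20;     // gap between lines in pixels
float radius = 100;   // how close the mouse must be to disturb a line
float strength = 30;  // maximum horizontal push

void setup() {
  size(640, 480);
  stroke(0);
  noFill();
}

void draw() {
  background(255);
  for (int x = spacing; x < width; x += spacing) {
    beginShape();
    for (int y = 0; y <= height; y += 5) {
      float d = dist(x, y, mouseX, mouseY);
      // push each point away from the mouse, fading out with distance
      float push = strength * max(0, 1 - d / radius);
      float dir = (x >= mouseX) ? 1 : -1;
      vertex(x + dir * push, y);
    }
    endShape();
  }
}
```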

Thursday 5 June 2014

Tangible Projection - a new and additional idea

Bohunk Institute is a 50/50 endeavour that relies on the chance of my submission being selected, which would mean that I would further enhance the presentation of the Coniglio Trial. That's not even considering how I would get it transported over there.

But does one remember when I was rambling on about some line drawings I was doing back in January?


And theorising its warping in a manner such as this?


Well, I've been toying more with that idea, and how to make such a piece interactive and compelling. The solution I have so far is through Processing, and perhaps the Arduino. I was shown such devices and software some months ago - probably even last year, around November. It was also used during the Light Night project, even though I didn't have the foggiest about how it was used. Over time, I've been learning the computer code bit by bit. My knowledge and skills are still far from adept, but I think there's enough there to achieve what I want from this with some support. 

The other thing I will need is a Microsoft Xbox Kinect, which is known to be a very useful yet very cheap method of infrared capture, and it can be combined with Processing and Arduino. The next thing I need to do is apply this to a rear projection, using thin spandex material on backdrop frames to render and map the 'warped line' effect. This is all easier said than done, and I have yet to find a suitable space in which this can be best exhibited. But, hopefully, by the end, I will have what I call a 'tangible projection' that draws on the themes of this project.

There are a lot of other things to consider here, like:
  • What kind of space do I need?
    • I need an exhibition space that can obviously facilitate a darkened room with public access, enough space for a projector and backdrop frame/screen.
  • How much electrical power or how many electrical sources do I need?
    • Electrical power is to be measured (I'm not sure how this is normally done), but in terms of sockets, we have a projector, a Kinect (I don't own one, so I don't currently know if it needs its own power source, but I imagine it does), and perhaps the Arduino. The laptop does not necessarily have to stay.
  • How big is this piece going to be?
    • Depends on how big I can make this projection screen and how the rendered result of Processing, calibrated with the Kinect, is mapped onto it. I would like the width to be greater than the height, but it does not matter a great deal, so long as the height is roughly 180 - 200 cm.
  • What kind of projector is needed, and perhaps how many?
    • A HD one, ideally. Epson has been pretty good in this respect. 
There are likely other things to consider, but they have yet to be raised in discussion with my supervisors.

For comprehension, here is a quick scribbling:

Tuesday 3 June 2014

Atrium Experiment (video clip)

Coniglio Trials


As previously mentioned, I had a video clip which just needed rotating. I'll be including this as an additional element of my submission to the Bohunk. 

Monday 2 June 2014

Shinier images of the Atrium Experiment (Coniglio Trial)

Before I took the piece away from the Atrium gallery in Bonington and left it back in the MA studio, I took a few pictures of it with a proper camera, and even managed to take a video of it (it's currently filmed at the wrong angle, and I intend to get my hands on a version of Final Cut Pro at uni tomorrow). I will be using these images in my submission to the Bohunk Institute, for their Open Submission in July.





I will also include the video in this, just to demonstrate the interactive aspect of it. I've also been working on some non-interactive pieces as of late, due to revising everything in my Learning Agreement. This will hopefully address the vs. aspect of the project if it relates well to the presentation of the updated Coniglio Trial. I think the main thing that needs to be wrestled with here is how to make the interactive piece more sustainable, which was an apparent weakness of this piece.