Interface Engineering

Our project (step by step):

When the application is launched, it goes full screen (using the FullScreen library in Processing). We see a picture of our glove and a timer while the application gathers all the necessary information; the user should not move.
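A minimal sketch of this intro screen, using Processing 3's built-in fullScreen() instead of the FullScreen library; the image file name and countdown duration are assumptions:

PImage gloveImage;
int startTime;
int calibrationSeconds = 5;            // assumed duration

void settings() {
  fullScreen();                        // built-in since Processing 3; the project used the FullScreen library
}

void setup() {
  gloveImage = loadImage("glove.png"); // hypothetical file name
  textAlign(CENTER, CENTER);
  imageMode(CENTER);
  startTime = millis();
}

void draw() {
  background(0);
  image(gloveImage, width/2, height/2);
  int remaining = calibrationSeconds - (millis() - startTime) / 1000;
  if (remaining > 0) {
    fill(255);
    textSize(48);
    text(remaining, width/2, 100);     // countdown while the sensor data is gathered
  }
}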

Begin

When the timer is done, we can play with a cube. All the data come from a thread class; this thread retrieves all the data from the sensor in a while loop.
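A minimal sketch of such a thread, assuming the glove sends its values over a serial port as one semicolon-separated line per reading (the port name, baud rate and data format are assumptions):

import processing.serial.*;

class SensorThread extends Thread {
  Serial port;
  volatile float rotX, rotY, pressure;   // latest values, read by the main sketch
  volatile boolean running = true;

  SensorThread(Serial port) {
    this.port = port;
  }

  public void run() {
    while (running) {
      // one reading per line, e.g. "0.12;0.98;0.40"
      String line = port.readStringUntil('\n');
      if (line != null) {
        String[] parts = split(trim(line), ';');
        if (parts.length == 3) {
          rotX     = parseFloat(parts[0]);
          rotY     = parseFloat(parts[1]);
          pressure = parseFloat(parts[2]);
        }
      }
    }
  }
}

SensorThread sensor;

void setup() {
  sensor = new SensorThread(new Serial(this, "COM3", 9600));  // assumed port and baud rate
  sensor.start();
}

void draw() {
  // the interface only reads the latest values; the thread keeps them up to date
  println(sensor.rotX + " " + sensor.rotY + " " + sensor.pressure);
}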

Cube

The user can push down and the cube explodes.

Cube explode

This takes us to the next panel, where the user chooses two themes on a kind of subway map.

Themes

After the user has chosen two themes, the map explodes when the user pushes down.

Subway Explode

After that, the program sends OSC messages from Processing (with OscP5) to play the first video of each of the two themes and to control them with the glove. Here is an example of an OSC message (for rotate x):

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Builds a raw OSC message for Resolume Avenue's "rotate x" effect.
// "layer" and "clip" are fields of the class, filled from the XML file.
String rotationX(float value) {
  byte[] valeur = this.GetBytes(value);

  // OSC packet = address pattern, padding to a multiple of 4 bytes,
  // the type tag ",f" (one float), more padding, then the 4 bytes of the float
  // (& 0xFF keeps the unsigned value of each byte).
  String position = "/layer" + layer + "/clip" + clip + "/video/rotatex/values"
    + (char)0 + (char)0
    + ",f" + (char)0 + (char)0
    + (char)(valeur[0] & 0xFF) + (char)(valeur[1] & 0xFF)
    + (char)(valeur[2] & 0xFF) + (char)(valeur[3] & 0xFF);
  return position;
}

// Converts a float to its 4-byte IEEE 754 representation (big-endian),
// as expected by the OSC protocol.
private byte[] GetBytes(float value) {
  ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
  DataOutputStream stream = new DataOutputStream(byteStream);
  try {
    stream.writeFloat(value);
  } catch (IOException e) {
    return new byte[4];
  }
  return byteStream.toByteArray();
}

We must convert the float to a byte array; after that, we can build the OSC message (here stored in the variable "position").
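As a side note (not what our code uses), the same conversion could also be written with java.nio.ByteBuffer, which writes the float in big-endian order, the byte order OSC expects:

import java.nio.ByteBuffer;

// Alternative sketch for GetBytes(): the four big-endian bytes of the float.
byte[] floatToBytes(float value) {
  return ByteBuffer.allocate(4).putFloat(value).array();
}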

We use an XML file to know where all the videos are located in Resolume Avenue and which audio file is assigned to each video.

If you look at rotationX above, you can see the variables "layer" and "clip" used to build "position"; their content is taken from the XML file.
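As an illustration only (the element and attribute names below are assumptions, not our real file), such a mapping can be read with Processing's built-in XML class:

// Hypothetical structure of data/videos.xml:
//   <videos>
//     <video theme="Atomium" layer="1" clip="2" audio="atomium.mp3"/>
//     ...
//   </videos>

void setup() {
  XML mapping = loadXML("videos.xml");
  for (XML entry : mapping.getChildren("video")) {
    String theme = entry.getString("theme");    // theme chosen on the subway map
    int layer    = entry.getInt("layer");       // Resolume layer that holds the clip
    int clip     = entry.getInt("clip");        // clip index inside that layer
    String audio = entry.getString("audio");    // sound played by Minim for this video
    println(theme + " -> /layer" + layer + "/clip" + clip + " (" + audio + ")");
  }
}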

With our program, the user can, in Resolume Avenue (only three effects for the moment):

  • change the opacity (to watch the first or the second video);
  • zoom;
  • rotate x.
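For example, the opacity of a clip could also be sent directly with oscP5 instead of building the raw message by hand; a minimal sketch (the port numbers are assumptions, the address follows the same pattern as rotationX above):

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress resolume;

void setup() {
  osc = new OscP5(this, 12000);                  // local listening port (assumed)
  resolume = new NetAddress("127.0.0.1", 7000);  // Resolume Avenue's OSC input port (assumed)
}

// Send an opacity value between 0.0 and 1.0 to a given layer/clip in Resolume Avenue.
void sendOpacity(int layer, int clip, float opacity) {
  OscMessage msg = new OscMessage("/layer" + layer + "/clip" + clip + "/video/opacity/values");
  msg.add(opacity);            // oscP5 handles the float-to-bytes conversion and padding
  osc.send(msg, resolume);
}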

When the user selects a video with the glove, Processing plays the audio attached to that video (with the Minim library)!
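A minimal sketch of this, assuming a hypothetical file name in the data folder:

import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  minim = new Minim(this);
  player = minim.loadFile("atomium.mp3");   // hypothetical file, taken from the XML mapping
}

// Called when the glove selects a video: restart the attached audio track.
void playThemeAudio() {
  player.rewind();
  player.play();
}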

In all, we have 4 video themes: Atomium, Night Club, EU (European Union) and ISIB (Institut Supérieur Industriel de Bruxelles).

Everything is programmed in Java with Processing (the interface, the data acquisition and the sending of OSC messages); we did not use prototyping tools, and all the backgrounds were made by us.


Interface Engineering

Yohan COURBE helped us modify the explosion of our cube because we had a lot of other things to do. Here is a screenshot:

Cube explode

We have calibrated the sensor in Processing and we have sent OSC messages to Resolume Avenue.

Now, with the glove, we can:

  • change the opacity (to watch the first or the second video);
  • zoom;
  • rotate x.

We think these three effects are already very cool for a prototype that just mixes two videos. We also play sound in Processing: each video is associated with a sound.


First Brainstorming

To introduce and explain the goal of this project, we held a main brainstorming session around all our principal ideas. With the help of our coaches, Tom HEENE and Kenneth LABIAU, we talked about the possibilities and functionality of the final result.

So we had to find a common understanding of the project tools, the project goal and also the final prototype. After all these reflections, we agreed on the same concept: using the interactive glove to mix videos and add music around a specific theme, Brussels.

The sources for the videos are personal travel experiences (e.g. monuments, cultural spots, nightclubs, …) and videos from the internet (YouTube, Vimeo, Google, …).

The prototype for our group for the following week will include videos from our own experiences as well as videos from the web. Those videos will be loaded into video-composition software called Resolume Avenue. The user chooses some tags related to Brussels and gets directed to a video composition related to the chosen tags. With the abilities of the glove, especially the finger and gravity sensors, the user can manipulate the videos (while Resolume runs in the background).

In conclusion, we want to create an installation which allows users to create their own Brussels experience, mixed by themselves.

After all of this, we had to find a name for our project. After several propositions from all the members of the group, we agreed on the name VGMove. We thought this name was the most meaningful, combining keywords like “visual”, “glove” and “move”: it means that we can move and mix videos while adding visual effects, with all the possibilities given by the interactive glove.