Interface Engineering

Our project (step by step):

When the application is launched, it goes fullscreen (using the FullScreen library in Processing). We see a picture of our glove and a timer while all the necessary information is captured; during this time the user should not move.

Begin
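
As a rough illustration of this first screen, a fullscreen Processing sketch with a countdown could look like the sketch below. This is not our exact code: it uses the built-in fullScreen() call instead of the FullScreen library, and the 5-second duration is only an example.

int waitMillis = 5000;     // example duration of the "do not move" timer
int startTime;

void setup() {
  fullScreen();            // built-in fullscreen mode (Processing 3+)
  textAlign(CENTER, CENTER);
  textSize(48);
  fill(255);
  startTime = millis();
}

void draw() {
  background(0);
  int remaining = waitMillis - (millis() - startTime);
  if (remaining > 0) {
    // the picture of the glove would be drawn here with image()
    text("Please do not move: " + ceil(remaining / 1000.0f), width/2, height/2);
  } else {
    text("Ready!", width/2, height/2);
  }
}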

When the timer is done, the user can play with a cube. All the data comes from a thread class, which retrieves the sensor data in a while loop.

Cube
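
Here is a simplified sketch of the idea behind that thread class (the serial port name, the baud rate and the comma-separated line format are placeholders, not our real configuration):

import processing.serial.*;

class SensorThread extends Thread {
  Serial port;
  float ax, ay, az;        // latest acceleration values, read by the sketch
  boolean running = true;

  SensorThread(PApplet parent) {
    // "COM3" and 115200 are placeholders for the real port and baud rate
    port = new Serial(parent, "COM3", 115200);
  }

  public void run() {
    while (running) {
      String line = port.readStringUntil('\n');
      if (line != null) {
        // assumed format: "ax,ay,az,..." with one line per sample
        String[] parts = line.trim().split(",");
        if (parts.length >= 3) {
          ax = Float.parseFloat(parts[0]);
          ay = Float.parseFloat(parts[1]);
          az = Float.parseFloat(parts[2]);
        }
      }
    }
  }
}

In draw(), the sketch then just reads ax, ay and az from this object instead of waiting for the serial port.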

The user can push down and the cube explodes.

Cube explode

This takes the user to the next panel to choose two themes; it looks like a kind of subway map.

Themes

After the user has chosen two themes, the map explodes when the user pushes down.

Subway Explode

After that, the program sends OSC messages from Processing (OscP5) to play the first video of each of the two themes, controlled with the glove. Here is an example of building an OSC message (for rotate X):

String rotationX(float value) {
  byte[] valeur = this.GetBytes(value);   // the 4 raw bytes of the float value

  // "layer" and "clip" are fields filled from the XML file;
  // the message is the address pattern (padded with zero bytes),
  // the ",f" type tag (also padded), then the 4 bytes of the float argument
  String position = "/layer" + layer + "/clip" + clip + "/video/rotatex/values"
      + (char)0 + (char)0 + ",f" + (char)0 + (char)0
      + (char)abs(valeur[0]) + (char)abs(valeur[1]) + (char)abs(valeur[2]) + (char)abs(valeur[3]);
  return position;
}

// Converts a float to its 4-byte (big-endian) representation
private byte[] GetBytes(float value) {
  ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
  DataOutputStream stream = new DataOutputStream(byteStream);
  try {
    stream.writeFloat(value);
  } catch (IOException e) {
    return new byte[4];
  }
  return byteStream.toByteArray();
}

We must convert the float to a byte array; after that, we can build the OSC message (here stored in the string “position”).
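
For reference, OscP5 can also build and send this kind of message itself. Below is a small sketch of that approach; the Resolume address 127.0.0.1:7000 and the local port 12000 are only assumptions, and the address pattern is the same one as in rotationX above.

import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress resolume;

void setup() {
  oscP5 = new OscP5(this, 12000);               // local listening port (example value)
  resolume = new NetAddress("127.0.0.1", 7000); // Resolume host and OSC port (assumed)
}

// Same rotate-x message as rotationX(), but built by OscP5 instead of by hand
void sendRotationX(int layer, int clip, float value) {
  OscMessage msg = new OscMessage("/layer" + layer + "/clip" + clip + "/video/rotatex/values");
  msg.add(value);                               // adds the ",f" type tag and the 4 float bytes
  oscP5.send(msg, resolume);
}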

We use an XML file to know where all the videos are located in Resolume Avenue and which audio track is assigned to each video.

If you look at “rotationX”, you can see “layer” and “clip”; the contents of these variables are taken from the XML file.
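
Here is a simplified sketch of how such a lookup could be read with Processing's built-in XML class (the file name "videos.xml" and the attribute names are placeholders, not our real file):

XML config;

void loadVideoConfig() {
  config = loadXML("videos.xml");                    // placeholder file name
  for (XML video : config.getChildren("video")) {
    String theme = video.getString("theme");         // e.g. "Atomium"
    int layer    = video.getInt("layer");            // layer number in Resolume Avenue
    int clip     = video.getInt("clip");             // clip number in that layer
    String audio = video.getString("audio");         // audio file played by Minim
    println(theme + " -> layer " + layer + ", clip " + clip + ", audio " + audio);
  }
}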

With our program, the user can:

  • change the opacity (to watch the first or the second video);
  • zoom;
  • rotate around X

in Resolume Avenue (only three effects for the moment).

When the user selects a video with the glove, Processing plays the audio attached to that video (using the Minim library)!
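
A minimal sketch of that part with Minim could look like this (the file name is a placeholder; the real one comes from the XML file):

import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setupAudio() {
  minim = new Minim(this);
}

void playThemeAudio(String audioFile) {
  if (player != null) {
    player.close();                      // stop the audio of the previous video
  }
  player = minim.loadFile(audioFile);    // e.g. "atomium.mp3" from the data folder
  player.play();
}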

In total we have 4 themes for our videos (Atomium, Night Club, EU (European Union), ISIB (Institut Supérieur Industriel de Bruxelles)).

Everything is programmed in Java with Processing (the interface, the data acquisition and the sending of OSC messages); we did not use any prototyping tools, and all the backgrounds were made by us.


Minor Problems And Presentation

Originally, we wanted to be finished by Friday. We did our final tests and everything worked quite well. Unfortunately, we encountered some problems when combining the boxed sensor with the glove. Because of the magnet on the glove, the sensor delivered wrong movement data, so we had to move the sensor to the wrist. Therefore, we had to modify the interaction control between the sensor and Processing and rethink the way of interaction.

In the morning we also finished the movies and the movie conversion for the final prototype; we just have to add them to the relevant layers in Resolume Avenue.

Additionally, we prepared the presentation for tomorrow's event. Since we liked the design of our poster, we decided to reuse it for the presentation as well. For the content, we decided on about five slides with what we consider the most important points. Here is the layout; the rest can be seen tomorrow.
presentation


Interface Engineering

We finished the panel interaction with the glove today, so that part is done. Additionally, we fixed some possible bugs! In parallel, we created a class to send OSC messages to Resolume with the sensor-equipped glove; I hope tomorrow that will be done too!

A correction: the user can only choose 2 themes; after that, the OSC mode starts and he can mix with the glove!


Interface Engineering

We have made two screens with Processing. On the first screen, the user can test the glove with a cube (acceleration, x, y, …); the second screen is for choosing the theme and looks like a subway map.
AERTS Gregory helped us speed up and build the second panel; it went very fast and turned out very well!
After that, we merged these two panels into one single project. We have a class to import the XML and know the position of all the videos in Resolume Avenue for a theme, and other classes:

  • to handle the interaction with the cube through the glove
  • to draw the second panel
  • to get the data from the 9DOF sensor
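
Roughly, the combined project looks like the skeleton below (the state values and the empty panel functions are only placeholders to show the structure; SensorThread is the sensor-reading class sketched earlier on this page):

SensorThread sensor;       // the data-acquisition thread
int state = 0;             // 0 = cube panel, 1 = subway-map panel, 2 = OSC mixing

void setup() {
  fullScreen(P3D);
  sensor = new SensorThread(this);
  sensor.start();
}

void draw() {
  background(0);
  if (state == 0) {
    drawCubePanel();       // test the glove with the cube
  } else if (state == 1) {
    drawSubwayPanel();     // choose the two themes on the subway map
  } else {
    sendOscToResolume();   // mix the videos in Resolume Avenue
  }
}

void drawCubePanel()     { /* cube interaction, placeholder */ }
void drawSubwayPanel()   { /* theme selection, placeholder */ }
void sendOscToResolume() { /* OSC messages, placeholder */ }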

Analysis

  • Before choosing a theme, the user must play with the glove to learn how to use it; this is the panel:

First Panel

But before that, the user must not move, because we need the correct position of the hand; after that, he goes to the first panel.
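
One simple way to capture this reference position is to average the sensor values during the timer, something like this (the sampling interval and variable names are only illustrative, and SensorThread is the class sketched earlier):

float refX, refY, refZ;    // reference hand position captured while the user holds still

void calibrate(SensorThread sensor, int durationMillis) {
  float sumX = 0, sumY = 0, sumZ = 0;
  int samples = 0;
  int start = millis();
  while (millis() - start < durationMillis) {
    sumX += sensor.ax;
    sumY += sensor.ay;
    sumZ += sensor.az;
    samples++;
    delay(10);             // take a sample roughly every 10 ms
  }
  refX = sumX / samples;
  refY = sumY / samples;
  refZ = sumZ / samples;
}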

When the user does a push there to go to the second screen, we show an effect where the cube is divided into small pieces, like this:

Split the cube (Begin)

Split the cube
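
The effect itself can be sketched like this: the big cube is replaced by a grid of small boxes that fly apart with random velocities (the sizes and speeds are arbitrary example values; this needs the P3D renderer):

ArrayList<PVector> pieces = new ArrayList<PVector>();
ArrayList<PVector> speeds = new ArrayList<PVector>();
float pieceSize;

void explodeCube(float cubeSize, int n) {
  pieceSize = cubeSize / n;
  for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
      for (int k = 0; k < n; k++) {
        pieces.add(new PVector(i * pieceSize, j * pieceSize, k * pieceSize));
        speeds.add(PVector.random3D().mult(2));   // random direction, small speed
      }
    }
  }
}

void drawPieces() {
  for (int i = 0; i < pieces.size(); i++) {
    pieces.get(i).add(speeds.get(i));             // move each piece along its velocity
    pushMatrix();
    translate(pieces.get(i).x, pieces.get(i).y, pieces.get(i).z);
    box(pieceSize);
    popMatrix();
  }
}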

  • The second screen is for choosing the theme:

Second Screen (Part 1)

Second Screen (Part 2)

The user chooses the theme with the glove: with a movement to the right, the selection goes to the theme on the right, and the same for a movement to the left.

To choose the theme, he needs to do a push! And after that he can mix with Resolume Avenue!
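
The left/right/push detection itself can be sketched with simple thresholds on the acceleration values (the threshold numbers are arbitrary and would need tuning with the real glove):

final float MOVE_THRESHOLD = 1.5;   // example threshold for a clear left/right movement
final float PUSH_THRESHOLD = 2.0;   // example threshold for a push down

String detectGesture(float ax, float ay, float az) {
  if (az < -PUSH_THRESHOLD) return "push";    // hand pushed down: confirm the selection
  if (ax >  MOVE_THRESHOLD) return "right";   // move the selection to the right theme
  if (ax < -MOVE_THRESHOLD) return "left";    // move the selection to the left theme
  return "none";
}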

We have started to control Processing with the glove… It is very difficult to get fluid interaction, and everything takes a long time, because Processing is not prototyping software! Tomorrow we will continue with this and I hope we will finish it!


First Brainstorming

To introduce and explain the goal of this project, we held a main brainstorming session with all our principal ideas. With the help of our coaches, Tom HEENE and Kenneth LABIAU, we talked about the possibilities and functionality of the final result.

So we had to find a common understanding of the project tools, the project goal and also the final prototype. After all these reflections, we agreed on the same concept: using the interactive glove to mix videos and add music around specific themes of Brussels.

The sources for the videos are personal travel experiences (e.g. monuments, cultural places, nightclubs, …) and videos from the internet (YouTube, Vimeo, Google, …).

Our group's prototype for the following week will include videos from our own experiences as well as videos from the web. Those videos will be loaded into a video compositing software called Resolume Avenue. The user chooses some tags of Brussels and gets directed to a video composition related to the chosen tags. With the abilities of the glove, especially the finger/gravity sensors, the user can manipulate the videos (while Resolume runs in the background).

So in conclusion, we want to create an installation which allows users to create their own Brussels experiences, mixed by themselves.

After all of this, we had to find a name for our project. From several propositions by all the members of the group, we agreed on this name: VGMove. We thought that name was the most significant, as it includes keywords like “visual”, “glove” and “move”; it means that we can move and mix videos with visual effects, with all the possibilities given by the interactive glove.