VGmove Presentation

Our presentation is available here.


Interface Engineering

Our project (step by step):

When the application is launched, it goes to full screen (the Full Screen library in Processing). We see a picture of our glove and a timer; while the program takes all the necessary measurements, the user should not move.

Begin

When the timer is done, we can play with a cube. All the data comes from a thread class; this thread retrieves all the data from the sensor in a while loop.
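As an illustration, here is a minimal sketch of what such a thread class can look like, assuming the 9DOF sensor sends one comma-separated line of values per sample over the serial port (the class name, baud rate and line format are assumptions, not our exact code):

import processing.serial.*;

class SensorThread extends Thread {
  Serial port;
  volatile float ax, ay, az;        // latest acceleration values from the glove
  volatile boolean running = true;

  SensorThread(PApplet parent) {
    // first serial port and 115200 baud are assumptions
    port = new Serial(parent, Serial.list()[0], 115200);
  }

  void run() {
    while (running) {
      String line = port.readStringUntil('\n');
      if (line != null) {
        String[] parts = split(trim(line), ',');
        if (parts.length >= 3) {
          ax = float(parts[0]);
          ay = float(parts[1]);
          az = float(parts[2]);
        }
      }
    }
  }
}

In the main sketch we would then start it once with new SensorThread(this).start() and simply read ax, ay and az in draw().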

Cube

The user can push down and the cube explodes.

Cube explode

This is how the user goes to the next panel and chooses two themes; it looks like a kind of subway map.

Themes

After the user has chosen two themes, the map explodes when the user pushes down.

Subway Explode

After that, the program sends OSC messages from Processing (with OscP5) to play the first video of each of the two themes and control it with the glove. Here is an example of an OSC message (for rotate X):

// Builds the raw OSC message for the "rotate X" effect; position, layer and
// clip are fields of the class (layer and clip come from the XML file).
String rotationX(float value) {
  byte[] valeur = this.GetBytes(value);

  position = "/layer" + layer + "/clip" + clip + "/video/rotatex/values"
           + (char)0 + (char)0 + ",f" + (char)0 + (char)0
           + (char)abs(valeur[0]) + (char)abs(valeur[1])
           + (char)abs(valeur[2]) + (char)abs(valeur[3]);
  return position;
}

// Converts a float into its 4-byte (big-endian) representation.
private byte[] GetBytes(float value) {
  ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
  DataOutputStream stream = new DataOutputStream(byteStream);
  try {
    stream.writeFloat(value);
  } catch (IOException e) {
    return new byte[4];
  }
  return byteStream.toByteArray();
}

We must convert the float value to a byte array, and after that we can build the OSC message (here it is stored in the variable “position”).

We use an XML file to know where all the videos are in Resolume Avenue and which audio file is assigned to each video.
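As a sketch of the idea, this is how such a file could be read in Processing with loadXML(); the file name and the attribute names here are hypothetical, the real structure of our XML file may differ:

// "themes.xml" is a hypothetical name; each <theme> node tells us where the
// videos of a theme live in Resolume Avenue and which audio file goes with them.
XML config = loadXML("themes.xml");
for (XML theme : config.getChildren("theme")) {
  String name  = theme.getString("name");    // e.g. "Atomium"
  int layer    = theme.getInt("layer");      // layer of the video in Resolume Avenue
  int clip     = theme.getInt("clip");       // clip (column) of the video
  String audio = theme.getString("audio");   // audio file played by Minim
  println(name + " -> layer " + layer + ", clip " + clip + ", audio " + audio);
}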

If you look at “rotationX”, you can see “layer” and “clip”: the contents of these variables are taken from the XML file.

With our program, the user can:

  • change the opacity (to watch the first or the second video);
  • zoom;
  • rotate around X

in Resolume Avenue (only three effects for the moment).

When the user selects a video with the glove, Processing plays the audio attached to that video (with the Minim library)!
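A minimal sketch of that part with Minim, assuming one audio file per video (the function name and the idea of passing the file name in are ours, for illustration only):

import ddf.minim.*;

Minim minim;
AudioPlayer player;

void setup() {
  minim = new Minim(this);
}

// called when the user selects a video with the glove;
// in the real program the file name comes from the XML file
void playAudioFor(String audioFile) {
  if (player != null) player.close();
  player = minim.loadFile(audioFile);
  player.play();
}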

In total we have 4 themes for our videos (Atomium, Night Club, EU (European Union), ISIB (Institut Supérieur Industriel de Bruxelles)).

Everything is programmed in Java with Processing (the interface, the data acquisition and the sending of OSC messages); we did not use any prototyping tools and all the backgrounds were made by us.


Minor Problems And Presentation

Originally, we wanted to be finished by Friday. We did our final tests and everything worked quite well. Unfortunately, we encountered some problems when combining the sensor in a box with the glove. Due to the magnet on the glove, the sensor delivered wrong data about movements, so we had to move the sensor to the wrist. Therefore, we had to modify the interaction control between the sensor and Processing and rethink the way of interaction.

In the morning we also finished the movies and the movie conversion for the final prototype; we just have to add them to the relevant lines in Resolume Avenue.

Additionally, we prepared the presentation for tomorrow's event. Since we liked the design of our poster, we decided to reuse it for the presentation too. For the content, we decided to do about five slides with what we consider the most important content. So here is the layout; the rest can be seen tomorrow.
presentation


Fixing The Box

The problems with the data sent over the serial port were caused by the electrostatic potential changes generated by human contact. We previously conducted tests with a rough box; now we have to create a box that will be mounted on the glove.

With a few tools and a little practice, we created one matching the sensor dimensions; the sensor fits perfectly in the box.


Interface Engineering

COURBE Yohan helped us modify the explosion of our cube because we had a lot of other things to do; here is a screenshot:

Cube explode

We have calibrated the sensor in Processing and we have sent OSC messages to Resolume Avenue.
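For reference, a minimal sketch of sending one of these messages with oscP5 and netP5 (in our code we build part of the message by hand, as shown in rotationX() above; the ports 12000 and 7000 used here are assumptions):

import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress resolume;

void setup() {
  oscP5 = new OscP5(this, 12000);                 // local listening port (assumption)
  resolume = new NetAddress("127.0.0.1", 7000);   // Resolume Avenue OSC input port (assumption)
}

void sendRotationX(int layer, int clip, float value) {
  OscMessage msg = new OscMessage("/layer" + layer + "/clip" + clip + "/video/rotatex/values");
  msg.add(value);               // oscP5 handles the float-to-bytes conversion here
  oscP5.send(msg, resolume);
}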

With the glove, we can now:

  • change the opacity (to watch the first or the second video);
  • zoom;
  • rotate around X.

We think these three effects are already very cool for a prototype that just mixes two videos. We also play sound in Processing; each video is associated with a sound.


Prototype Improvement

So now, at the end of the week, we have roughly completed our prototype. We did the general work to get there and spent our whole Saturday's energy on debugging and on improving functionality and interface design.

For debugging purposes, it was necessary to go through the code to check for possible problems when connecting to Resolume Avenue. During and after this process we had to make some adaptations to the settings for the sensor and to the program code.

Additionally, we searched for and defined the content for Resolume Avenue. For the music we wanted a mixture of famous music from our countries and Belgian traditional music. Related to the tags, we decided to take some of our favorite party songs for the nightlife theme, more Belgian traditional music for the Atomium, traditionally composed music for the European Union, and funny music about teamwork for the ISIB. For the video material, we decided to shoot most of it ourselves. So part of our group climbed buildings just to make sure we have enough video material for our prototype. Additionally, it was necessary to edit and convert the files into the right file format.

Meanwhile, we were working on the new interface design for the screens. Following our project poster, we decided to do a similar design for the interface. Almost all screens are done as of today; some slight changes due to the code-design connection are still possible. If so, these minor issues will be corrected while we discuss the presentation.

Furthermore, we searched for visual effects in the Processing examples book Rudi GIOT gave us, to discuss possible screen transitions. By the way: it's pretty good and worth noting down the ISBN number.
Team 4 at work


Interface Engineering

We have finished the panel interaction with the glove today, so that part is done. Additionally, we have fixed possible bugs! In parallel we have created a class to send OSC messages to Resolume with the sensor-equipped glove; I hope tomorrow it will be done too!

A correction: the user can only choose 2 themes; after that, the OSC mode starts and he can mix with the glove!


OMG… it’s Friday

After a week of creating our concept prototype, we are well on our way to a final prototype. We finally agreed on how we will use our new sensor-equipped glove and defined the following commands to interact with our installation:

General commands:

  • Acceleration Z down = select and go to the next level
  • Acceleration Z down + pick 1 = go to the previous screen
  • Gyro right/left = select the next/previous object

Commands for Resolume Avenue:

  • Magnitude of X-axis (move the hand like on a scooter) = scale in/out
  • Magnitude of Y-axis (move the hand around the axis of your arm) = bring one video more to the background and one more to the front
  • Magnitude of Z-axis (move the hand in its own plane) = change the colors in the video
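A rough sketch of how these commands could be detected in the draw loop; the threshold values, variable names and function name are assumptions, not our calibrated ones:

final float PUSH_THRESHOLD = 1.5;   // acceleration on Z (in g) for "push down"
final float GYRO_THRESHOLD = 80;    // angular rate for a left/right flick

void checkGestures(float accZ, float gyroYaw) {
  if (accZ < -PUSH_THRESHOLD) {
    // "Acceleration Z down": select and go to the next level
  } else if (gyroYaw > GYRO_THRESHOLD) {
    // "Gyro right": select the next object
  } else if (gyroYaw < -GYRO_THRESHOLD) {
    // "Gyro left": select the previous object
  }
}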

Additionally, we set up the XML file to access the relevant panels in Resolume Avenue. This reference file is mainly used to link actions, tags and Resolume Avenue actions.

We combined the program code Gregory wrote yesterday (“tag map”) with our program code. We also included the sensor control in the new program part. Unfortunately, we found that there is a problem when moving the sensor too fast or when touching it (electrostatic interference with the sensor). The temporary solution was to put it into a separate box for testing purposes and to isolate it from human contact (see picture below).
Sensor Box


Interface Engineering

We have made two screens with Processing. On the first screen, the user can test the glove with a cube (acceleration, x, y, …); the second screen is for choosing the theme word and looks like a subway map.
AERTS Gregory helped us go faster and build the second panel; it was very fast and very good!
After that, we merged these two panels into a single project. We have a class to import the XML and to know the position of all the videos in Resolume Avenue for a theme, plus classes:

  • to handle the interaction with the cube through the glove
  • to draw the second panel
  • to get the data from the 9DOF

Analysis

  • Before choosing a theme, the user must play with the glove to learn how to use it; this is the panel:

First Panel

But before that, the user must not move, because we need the correct rest position of the hand; after that he goes to the first panel.
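A minimal sketch of how that rest position could be captured during the timer, by averaging the incoming values (the variable and function names are assumptions, not our exact code):

float restX = 0, restY = 0, restZ = 0;
int samples = 0;

// called for every sample read while the timer is running and the hand is still
void accumulate(float ax, float ay, float az) {
  restX += ax;
  restY += ay;
  restZ += az;
  samples++;
}

// called once when the timer is done
void finishCalibration() {
  restX /= samples;
  restY /= samples;
  restZ /= samples;
  // later movements are measured relative to (restX, restY, restZ)
}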

When the user does a push there to go to the second screen, we show an effect: the cube is divided into small pieces, like this:

Split the cube (Begin)

Split the cube

  • The second screen is for choosing the theme:

Second Screen (Part 1)

Second Screen (Part 2)

The user chooses the theme with the glove: on a right movement, the user goes to the theme on the right, and the same for the left movement.

To choose the theme, he needs to do a push! And after that he can mix with Resolume Avenue!

We have started to control Processing with the glove… It's very difficult to achieve fluidity, and it takes very long to get anything done, because it's not software made for prototyping! Tomorrow we will continue and I hope we will finish it!


Finalization Of Our Poster

The poster visualizing our concept for the project is finally done. It reflects the interaction process. For this, we had to collect all the data, visualize our processes and incorporate our ideas in written words.

For the poster we wanted to spread our idea visually. The use of color should reflect two very important parts of our concept: the first reflects the possible user interaction, the second reflects the technical background (which of course the users shouldn't come in contact with). Overall we tried to homogenize the colors and to present the information in a clearly structured way. Additionally, we wanted to create a poster that invites people to look and read, and afterwards to become interested in the project.