Minor Problems And Presentation

Originally, we wanted to be finished by Friday. We did our final tests and everything worked quite well. Unfortunately, we encountered some problems when combining the boxed sensor with the glove. Due to the magnet on the glove, the sensor delivered wrong movement data, so we had to move the sensor to the wrist. Therefore, we had to modify the interaction control between the sensor and Processing and rethink the way of interaction.

In the morning we also finished the movies and the movie conversion for the final prototype; we just have to add them to the relevant layers in Resolume Avenue.

Additionally, we prepared the presentation for tomorrow's event. Since we liked the design of our poster, we decided to pick it up for the presentation too. For the content, we decided on about five slides with the – in our view – most important content. So here is the layout – the rest can be seen tomorrow.
presentation


Fixing The Box

The problems with the data sent over the serial port were caused by electrostatic potential changes generated by human contact. We previously conducted tests with a rough box – now we have to create a box that can be mounted on the glove.

With a few tools and a little practice we created one matching the sensor's dimensions; the sensor fits perfectly in the box.


Interface Engineering

COURBE Yohan helped us modify the explode effect of our cube, since we had a lot of other things to do. Here is a screenshot:

Cube explode

We calibrated the sensor in Processing and sent OSC messages to Resolume Avenue.
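
Just to sketch what the OSC part looks like (a minimal example assuming the oscP5 library; the port number and the address pattern are assumptions and have to match the OSC settings in Resolume Avenue, not our exact code):

    // Minimal sketch: send a sensor-driven opacity value from Processing to Resolume Avenue.
    // Assumes the oscP5 library; port and OSC address must match Resolume's OSC settings.
    import oscP5.*;
    import netP5.*;

    OscP5 oscP5;
    NetAddress resolume;

    void setup() {
      oscP5 = new OscP5(this, 12000);                 // local listening port (not used here)
      resolume = new NetAddress("127.0.0.1", 7000);   // assumed Resolume OSC input port
    }

    void draw() {
      float raw = mouseX;                             // stand-in for a calibrated sensor value
      float opacity = constrain(map(raw, 0, width, 0, 1), 0, 1);

      OscMessage msg = new OscMessage("/layer1/video/opacity/values");  // assumed address
      msg.add(opacity);
      oscP5.send(msg, resolume);
    }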

Now, with the glove, we can:

  • change the opacity (to watch either the first or the second video);
  • zoom in and out;
  • rotate around X.

We think these three controls are already quite cool for a prototype that simply mixes two videos. We also play sound in Processing; each video is associated with a sound.
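
The sound part could look roughly like this (a sketch assuming the Minim library and made-up file names, not our actual assets):

    // Sketch: one sound per video, assuming the Minim library and made-up file names.
    import ddf.minim.*;

    Minim minim;
    AudioPlayer[] sounds = new AudioPlayer[2];
    int currentVideo = 0;

    void setup() {
      minim = new Minim(this);
      sounds[0] = minim.loadFile("video1_sound.mp3");   // sound for the first video
      sounds[1] = minim.loadFile("video2_sound.mp3");   // sound for the second video
      sounds[currentVideo].loop();
    }

    void draw() { }                                     // keeps the sketch running

    // Called whenever the glove switches the active video.
    void switchVideo(int index) {
      sounds[currentVideo].pause();
      currentVideo = index;
      sounds[currentVideo].rewind();
      sounds[currentVideo].loop();
    }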


Prototype Improvement

So now, at the end of the week, we have roughly completed our prototype. We had already done the general work to get there and spent our whole Saturday's energy on debugging and on improving functionality and interface design.

For debugging purposes, it was necessary to go through the code to check for possible problems when connecting with Resolume Avenue. During and after this process we had to make some adaptations to the sensor settings and to the program code.

Additionally, we searched for and defined the content for Resolume Avenue. For the music we wanted a mixture of famous music from our countries and traditional Belgian music. Related to the tags, we decided to take some of our favorite party songs for the nightlife, more traditional Belgian music for the Atomium, traditionally composed music for the European Union, and funny music about teamwork for the ISIB. For the video material, we decided to produce most of it ourselves. So part of our group climbed buildings just to make sure we would have enough video material for our prototype. Additionally, it was necessary to edit and convert the files into the right file format.

Meanwhile, we were working on the new interface design for the screens. In line with our project poster, we decided on a similar design for the interface. Almost all screens are done as of today; maybe some slight changes due to the code-design connection are still possible. If so, these minor issues will be corrected while we discuss the presentation.

Furthermore, we looked for visual effects in the Processing examples book Rudi GIOT gave us, to discuss possible screen transitions. By the way: it's pretty good and worth noting the ISBN.
Team 4 at work


Interface Engineering

We finished the panel interaction with the glove today – so that part is done. Additionally, we fixed possible bugs! In parallel we created a class to send OSC messages to Resolume from the sensor-equipped glove! I hope tomorrow that will be done too!

One rectification: the user can only choose two themes; after that, the OSC mode starts and he can mix with the glove!


OMG… it’s Friday

After a week of creating our concept prototype we are well on the way to a final prototype. Finally, we agreed on how we will use our new sensor-equipped glove. We defined the following commands to interact with our installation:

General commands:

  • Acceleration Z down = select and go to the next level
  • Acceleration Z down + pick 1 = go to the previous screen
  • Gyro right/left = select the next/previous object

Commands for Resolume Avenue:

  • Magnitude of X-axis (move the hand like on a scooter) = scale in/out
  • Magnitude of Y-axis (move the hand around the axis of your arm) = move one video more to the background and one more to the front
  • Magnitude of Z-axis (move the hand in its own plane) = change the colors in the video
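
In code, the detection of these commands could look roughly like this (a sketch with made-up threshold values and variable names; a real version also needs a cool-down so a single gesture does not fire on every frame):

    // Sketch of the gesture-to-command mapping above; thresholds and names are assumptions.
    float accZ = 0;    // latest Z acceleration from the 9DOF (updated by the serial code)
    float gyro = 0;    // latest right/left rotation rate

    final float PUSH_THRESHOLD = -1.5;  // "acceleration Z down" = select / next level
    final float TURN_THRESHOLD = 0.8;   // gyro right/left = next / previous object

    void draw() {
      if (accZ < PUSH_THRESHOLD) {
        selectAndGoToNextLevel();
      } else if (gyro > TURN_THRESHOLD) {
        selectNextObject();
      } else if (gyro < -TURN_THRESHOLD) {
        selectPreviousObject();
      }
    }

    void selectAndGoToNextLevel() { /* switch to the next screen */ }
    void selectNextObject()       { /* highlight the object on the right */ }
    void selectPreviousObject()   { /* highlight the object on the left */ }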

Additionally, we set up the XML file to access the relevant panels in Resolume Avenue. This reference file is mainly used to link our actions and tags to the corresponding Resolume Avenue controls.
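
Reading such a reference file in Processing could look like this (a sketch assuming Processing's loadXML(); the file name and the element/attribute names are assumptions about the structure, not the actual file):

    // Sketch: read a reference XML that links a tag/theme to clip positions in Resolume Avenue.
    // File name and element/attribute names are assumed, not the actual file.
    XML mapping;

    void setup() {
      mapping = loadXML("resolume_mapping.xml");
      for (XML theme : mapping.getChildren("theme")) {
        String tag = theme.getString("tag");    // e.g. "nightlife", "Atomium", ...
        int layer  = theme.getInt("layer");     // Resolume layer holding this theme's videos
        int clip   = theme.getInt("clip");      // first clip of this theme
        println(tag + " -> layer " + layer + ", clip " + clip);
      }
    }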

We combined the program code Gregory wrote yesterday ("tag map") with our program code. We also included the sensor control in the new program part. Unfortunately, we discovered a problem when moving the sensor too fast and/or when touching it (electrostatic interference with the sensor). The temporary solution for this problem was to put it into a separate box for testing purposes – and to isolate it from manipulation by human contact (see picture below).
Sensor Box


Interface Engineering

We have built two screens with Processing. On the first screen, the user can test the glove with a cube (acceleration, x, y, …); the second screen is for choosing the theme and looks like a subway map.
AERTS Gregory helped us build the second panel quickly, and it turned out very well!
After that we merged the two panels into a single project. We have a class to import the XML and to know the position of all the videos of a theme in Resolume Avenue, as well as code:

  • to interact with the cube using the glove
  • to draw the second panel
  • to get the data from the 9DOF

Analysis

  • Before choosing a theme, the user must play with the glove to learn how to use it. This is the panel:

First Panel

But before that, the user must not move, because we need the correct reference position of the hand; after that he goes to the first panel.
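
That "hold still" step could be implemented roughly like this (a sketch; the number of samples and the variable names are assumptions):

    // Sketch of the initial calibration: average a short burst of readings while the hand
    // is held still to get a reference orientation. Sample count and names are assumptions.
    int samples = 0;
    float refX = 0, refY = 0, refZ = 0;
    boolean calibrated = false;

    // Called for every incoming sensor reading until the baseline is captured.
    void calibrate(float x, float y, float z) {
      if (calibrated) return;
      refX += x;  refY += y;  refZ += z;
      samples++;
      if (samples == 100) {                 // roughly 100 readings with the hand held still
        refX /= samples;  refY /= samples;  refZ /= samples;
        calibrated = true;                  // now the user may move on to the first panel
      }
    }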

When the user does a push there to go to the second screen, we have an effect where the cube is divided into small pieces, like this:

Split the cube (Begin)

Split the cube

  • The second screen is for choosing the theme:

Second Screen (Part 1)

Second Screen (Part 2)

The user chooses the theme with the glove: on a movement to the right, the user goes to the theme on the right, and the same for a movement to the left.

To select a theme he needs to do a push! And after that he can mix with Resolume Avenue!

We have started to control Processing with the glove… It's very difficult to get fluid motion, and it takes very long to get anything done because it's not software made for prototyping! Tomorrow we will continue with this, and I hope we will finish it!


Finalization Of Our Poster

The poster visualizing our concept for the project is finally done. It reflects the interaction process. To get there, we had to collect all the data, visualize our processes, and put our ideas into written words.

For the poster we wanted to spread our idea visually. The use of color should reflect two very important parts of our concept: the first one reflects the possible user interaction, the second one reflects the technical background (which, of course, the users shouldn't come into contact with). Overall we tried to homogenize the colors and to present the information in a clearly structured way. Additionally, we wanted to create a poster that invites people to look and read – and afterwards to get interested in the project.


Flow Analysis Of Sensor Data

With the three sensors implemented and their data analyzed, we fixed the intervals of values for each axis.

The microprocessor provides us with a steady flow of data as a string. This string is processed to separate the values, which are simply delimited by commas. We must then recognize which of them belong to the gyroscope, the accelerometer, and the magnetometer.
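
As an illustration, the parsing step could look like this in Processing (a sketch assuming nine comma-separated values per line in the order accelerometer, gyroscope, magnetometer; the actual order and baud rate depend on the microprocessor's firmware):

    // Sketch: read the comma-separated sensor string from the serial port and split it into
    // accelerometer, gyroscope and magnetometer values. Order and baud rate are assumptions.
    import processing.serial.*;

    Serial port;
    float[] acc = new float[3], gyro = new float[3], mag = new float[3];

    void setup() {
      port = new Serial(this, Serial.list()[0], 9600);
      port.bufferUntil('\n');
    }

    void serialEvent(Serial p) {
      String line = trim(p.readStringUntil('\n'));
      if (line == null) return;
      float[] v = float(split(line, ','));
      if (v.length < 9) return;                 // incomplete frame, skip it
      for (int i = 0; i < 3; i++) {
        acc[i]  = v[i];                         // assumed order: ax, ay, az,
        gyro[i] = v[i + 3];                     // gx, gy, gz,
        mag[i]  = v[i + 6];                     // mx, my, mz
      }
    }

    void draw() { }                             // keeps the sketch listening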

These values, once studied, can be used to start the coding in Processing and to adapt them to the requirements of the control applications.

Each axis is mapped to specific actions in the applications. This gives us a series of movement combinations and makes programming easier, because the requirements for the final result are clear.

For example, to move from one application to another, we can trigger an acceleration along the z-axis. The subdivision of the different axes thus allows us to clearly identify the movements that generate actions.

Throughout the programming of the application in Processing, we will tune the axes step by step so that the program behaves correctly.


Interface Engineering

We have thought about the software interface and, after a couple of hours (we tested Open Interface), we decided to build the whole interface and the OSC messaging with Processing, because we need an innovative user interface. Open Interface offers good user interfaces, like iTunes, but it's not new!