Monday, July 21, 2014

We made it! 
Poster practicing
Poster session at the Student Services Building on July 17th!

Wednesday, July 9, 2014

Lab 2: Sorting

For this lab, I created three C++ programs that sorted a list of random numbers from minimum to maximum and then gave the user a choice of what kind of finished product they wanted printed.

The program involved mastering three different sorting algorithms: bubble sort, selection sort, and insertion sort. I also learned how to set up a function that takes the user's input into account (using an "if" statement) in order to decide what to output.

Creating the programs also involved importing the numbers into an array from another file.
For importing the numbers into the program, I was able to use what I learned from Lab 1. However, it required a new function to put those numbers into an array.
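The pieces described above can be sketched in C++ roughly like this (file name, function names, and the single-letter choice codes are my own for illustration, not the lab's actual code):

```cpp
#include <fstream>
#include <string>
#include <vector>

// Read whitespace-separated numbers from a file into an array (vector).
std::vector<int> loadNumbers(const std::string& path) {
    std::vector<int> nums;
    std::ifstream in(path);
    int n;
    while (in >> n) nums.push_back(n);
    return nums;
}

// Bubble sort: repeatedly swap adjacent out-of-order pairs.
void bubbleSort(std::vector<int>& a) {
    for (std::size_t i = 0; i + 1 < a.size(); ++i)
        for (std::size_t j = 0; j + 1 < a.size() - i; ++j)
            if (a[j] > a[j + 1]) std::swap(a[j], a[j + 1]);
}

// Selection sort: find the minimum of the unsorted tail, move it to the front.
void selectionSort(std::vector<int>& a) {
    for (std::size_t i = 0; i < a.size(); ++i) {
        std::size_t min = i;
        for (std::size_t j = i + 1; j < a.size(); ++j)
            if (a[j] < a[min]) min = j;
        std::swap(a[i], a[min]);
    }
}

// Insertion sort: grow a sorted prefix one element at a time.
void insertionSort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key) { a[j] = a[j - 1]; --j; }
        a[j] = key;
    }
}

// Use "if" statements on the user's input to decide which sort to run.
void sortWithChoice(std::vector<int>& a, char choice) {
    if (choice == 'b') bubbleSort(a);
    else if (choice == 's') selectionSort(a);
    else insertionSort(a);
}
```

All three sorts produce the same minimum-to-maximum order; they differ only in how they get there.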


Thursday, July 3, 2014

Screencast of new game (controlled using my mind) -7/3/14

http://youtu.be/Q8CaUX4OGdw

Successsssssss (or at least, some) - July 3, 2014

Using the EmoKey I am now successfully able to control the game using my mind. 

The EmoKey is a tool that maps EmoStates to keyboard inputs.
It can use any key on the keyboard, alone or in combination.

To set up the EmoKey, you create an EmoKey mapping, which has a few parts. First, you add the "rules": a rule names the command and chooses which keyboard input corresponds to it. For example, to replace the space bar in a game with input from the headset, you would add the spacebar as a "rule". The key's "behavior" lets you choose whether activating the rule holds the key down or just presses it. The "connection" attached to a rule is what you want the command to do; for example, if you want the spacebar to make the player jump, you would choose "lift" from the drop-down menu. The mapping connects to the control panel rather than to the neuroheadset or EmoEngine directly, which makes it possible to set it up for specific users and to configure the settings before using it. Finally, the "value" of the connection controls how easy the key is to activate: a lower value means it takes less power to trigger the command.
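The parts of a mapping described above can be summarized as a small data structure. This is only my own shorthand for the four pieces (rule, behavior, connection, value), not Emotiv's actual API:

```cpp
#include <string>

// A sketch of one EmoKey rule; the field names are mine, not Emotiv's.
struct EmoKeyRule {
    std::string key;     // the "rule": which keyboard key to send (e.g. spacebar)
    bool holdKey;        // the "behavior": hold the key down, or just press it
    std::string trigger; // the "connection": which command fires it (e.g. "lift")
    double threshold;    // the "value": lower means less power is needed
};

// A lower threshold makes the key easier to activate: the key fires
// whenever the detected power of the command reaches the threshold.
bool fires(const EmoKeyRule& rule, double detectedPower) {
    return detectedPower >= rule.threshold;
}
```

So a spacebar-jump mapping would be something like `EmoKeyRule{"spacebar", false, "lift", 0.2}`: a weak "lift" signal of 0.2 or more is already enough to press the key.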

A couple things I ran into when using the neuroheadset to control the game:
  • Sometimes the EPOC randomly crashes and I don't realize this while in the game, so I keep trying to move the block without realizing the signal has been lost. 
  • Setting up the input to replace the keyboard keys in the control panel instead of in the EmoKey does not give an option to save, while the EmoKey does. (The mapping is lost when the control panel is closed.)
  • Setting up the space bar to correspond to "lift" unintentionally triggered the fullscreen option in the Control Panel. 

Next, I redesigned level 2 of the game. Previously, when I reduced the game to only one command, forward, it was too easy to win: it was only a matter of time before you reached the end of the game.

To keep the game playable while making it more interesting, I limited it to two commands: forward and up (or fly/lift). I changed the game so that there are now gaps in the floor plane. Since the player can only move forward, I also placed all the goal blocks that earn the player points, and the final goal block, in a straight line. However, the up command is now required to jump over the breaks in the game plane, or else the player loses. In addition, I moved the goal blocks up so that the player has to jump to reach them as well.

The game now tests the player's ability to issue two different commands at the same time, since it is necessary to move not just forward but up and forward together. In addition, the player has to control how far they move forward to avoid falling into the next gap in the game plane.
View of game from side
Game view
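The level-2 rules above boil down to a very small simulation: forward motion is constant, and the player must be lifting while crossing a gap or they lose. A toy sketch (gap positions and the one-unit step are made up for illustration, and the real game runs in Unity, not in code like this):

```cpp
#include <set>

// Toy model of the level-2 player: constant forward motion along the
// straight line of goal blocks, with gaps that must be jumped over.
struct Player {
    int x = 0;        // position along the game plane
    bool alive = true;
};

void step(Player& p, bool lifting, const std::set<int>& gaps) {
    p.x += 1;                          // "forward" is always on
    if (gaps.count(p.x) && !lifting)   // crossed a break without jumping
        p.alive = false;
}
```

Because the forward command never stops, timing the "lift" command correctly is the whole game.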

Finally, to make it easier to see where exactly the player is in the game (depth was originally difficult to estimate because of the perspective), I attached the camera to the player so that the camera followed the player as it moved.
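The follow-camera idea is simple to state: every frame, the camera sits at a fixed offset from the player, so the view moves with the player. In Unity this amounts to parenting the camera to the player object; the underlying math is just vector addition, sketched here:

```cpp
// Minimal 3D vector for the sketch; components chosen for illustration.
struct Vec3 {
    double x, y, z;
};

// Each frame, place the camera at a fixed offset from the player so the
// view tracks the player and depth is easier to judge.
Vec3 followCamera(const Vec3& player, const Vec3& offset) {
    return {player.x + offset.x, player.y + offset.y, player.z + offset.z};
}
```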




Tuesday, July 1, 2014

Experimenting with Blender-July 1st, 2014

We originally thought we would have to redo the Unity game prototype in Blender and rewrite the scripts in Python, since the rest of the project, like the Cortical Learning Algorithm and the brainwave analysis Julia was doing, used Python and Linux.

I made a snowman. :)
Extruding (shown above) is super fun. Pretty sure Unity doesn't have this. 

(Thanks YouTube user BornCG for the tutorials on how to use Blender)


However, we realized we could use the EmoKey to create new inputs for the commands that the keyboard arrows previously served. To make the game less difficult, I made it so that only moving forward was necessary to get through the game. As a result, the final hurdle was lowered so that jumping was no longer necessary, and the blocks that earned the user points were placed in a straight line (rather than all over the game plane, which would have required the left and right arrow keys).

I am now able to move the cube in the game using my mind!


Using the Emotiv EPOC

Summarized thoughts on the Emotiv EPOC: 
"Wow wow super cool"

The EPOC has 16 felt sensors (14 data channels plus 2 references) that had to be hydrated with saline solution before use. Once the sensors were hydrated and placed into the headset, we tried the headset out on AJ. The sensors had to be hydrated again, since there was not enough saline solution the first time. According to the control panel, the wireless signal was strong, although the user has to sit fairly close to the USB dongle. After a bit of wiggling the sensors around, we were able to get a good connection where nearly all of the sensors showed "green" (the best) connection quality.

We started with training the EPOC to recognize facial expressions. 
The control panel had a little robot which would mimic your facial expressions as it recognized them. Among the facial expressions were laugh, smile, smirk (right and left), brow furrow, eyebrow raise, and blink.

We next worked on Cognitiv training. We began by training Push. For training, the user has to concentrate on thinking of the command consistently for 8 seconds; afterward, the user is prompted to either accept the training or discard it. It probably would have been better to first train neutral until the skill level for neutral was very high. Instead, we tried to add too many commands too quickly and began adding pull, left, and right before the first commands were mastered, which made controlling the block much more difficult. The more commands that have been added, the harder it is for the control panel to recognize each one.

We also took a look at the Affectiv suite, which showed us our emotions during the training. It was interesting to see the places where we were frustrated, and the ones where we had a burst of excitement. 

After a couple days of playing around with the EPOC, we discovered a couple of things:
  • The control panel crashes after about 800 seconds of run time when used on Linux, but not on Windows
  • Training without the animation helps the user tell whether the training was successful, so they can make an informed decision about whether to keep it. With the animation turned on, it is impossible to tell whether the block's movement was due to the user or to the animation. If the block does not successfully complete the command being trained, the user should discard that training session.
  • Visualizing the command that is being trained helps the control panel understand the user's intentions. 
  • Thinking of a specific word like "nothing" and staring at a solid color such as the white background of a window helped me master "neutral."