Lessons Learnt

I feel it’s good to end a project with some takeaways or lessons learnt. I certainly picked up some valuable lessons along the way, and there are things I would do differently should I revisit a concept like this again. In summary:

  • Natural language is incredibly complex – as obvious a statement as that seems, it is only when you get into it that you understand the subtlety and scope a designer would need to cater for.
  • The maze design I went with really didn’t suit the approach. In the small amount of user testing completed, users quickly adjusted to just typing ‘up’, ‘down’, ‘left’ or ‘right’, as they needed speed to get the emoji to turn in time. This went against my initial design goal of a more natural conversation.
  • I wasn’t able to make the text message length fully dynamic. Anything over a certain number of characters breaks the design convention, which is not ideal.

In saying this, I learnt an incredible amount about how much fun it is to break the traditional conventions of keyboard input. The scope and design opportunities it opened up for me as a designer were exciting, and I believe the project is a success in its humour… but that may just be my humour 😛


Version 1.0

The version I presented at the end of the project can be seen in the video here. A couple of additional items made it into the last release:

  • There is a little easter egg in the game: if anyone types “glasses” in a sentence, the emoji will put on some sunglasses and respond with a cheeky follow-up.
  • The game starts with a quick intro to the story, which adds context to the game.
  • The game ends when the two meet and your time is calculated.
  • The user can restart the game at any stage by typing ‘reset’ into the conversation.
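The easter egg and the reset command could both be handled by checking the incoming text before the normal movement parsing runs. This is just a sketch of that idea, assuming a simple string check on the input – the function names, game-state shape and reply text are my own illustration, not the project’s actual code:

```javascript
// Canned follow-ups for the "glasses" easter egg (illustrative text only)
const SUNGLASSES_REPLIES = [
  "Deal with it.",
  "Too cool for this maze.",
];

// Hypothetical pre-parse step: intercept special commands before
// the normal movement parsing sees the sentence.
function handleMessage(text, game) {
  const input = text.toLowerCase();

  // Typing "reset" anywhere in a sentence restarts the game.
  if (input.includes("reset")) {
    game.state = "loading";
    game.startTime = null;
    return "Okay, starting over!";
  }

  // Easter egg: any sentence containing "glasses" puts on the shades.
  if (input.includes("glasses")) {
    game.emoji = "😎";
    const i = Math.floor(Math.random() * SUNGLASSES_REPLIES.length);
    return SUNGLASSES_REPLIES[i];
  }

  return null; // not a special command; fall through to movement parsing
}
```

Checking for special commands first keeps them working from any game state, which is what lets ‘reset’ rescue a stuck player mid-maze.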

Iteration 0.9

The next iteration of the game design was to bring it closer to the final version. I set out the following requirements for this version:

  • Could I reverse the order in which the messages appeared onscreen, to more closely match text messaging?
  • Could I randomise the BOT responses to my text input?
  • Could I redesign the maze and emoji layout for a cleaner, flatter design aesthetic?

You will see in the screenshot above that texts now show, as expected, in reverse order. Playing off the humour of my Monkey Island inspiration, I have the BOT respond with ‘Alriiight Captain’… just one of his many random responses. I was getting closer to the final version.
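Both of this iteration’s behaviours are small pieces of logic. A minimal sketch of how they might work, assuming the messages are kept in an array and the BOT replies are canned strings (the names and reply text here are my assumptions, not the actual code):

```javascript
// A few canned BOT replies, one picked at random per input
const BOT_REPLIES = [
  "Alriiight Captain",
  "Aye aye!",
  "On my way.",
];

function randomReply(replies) {
  return replies[Math.floor(Math.random() * replies.length)];
}

// Reverse ordering: unshift instead of push, so index 0 is always
// the newest message and rendering top-to-bottom matches an SMS thread.
function addMessage(history, text) {
  history.unshift(text);
  return history;
}
```

Prepending rather than appending means the render loop never has to re-sort the thread; the newest message is always first.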

Iteration 0.2 – 0.8

For the next couple of iterations I was working my way through the following key interactions:

  1. Could I take a sentence and extract key words from the input into an Array?
  2. Could I loop through the Array and extract any key “intents”? These would be ‘Move’, ‘Go’, ‘Up’, ‘Down’, ‘Stop’ and so on.
  3. Could I replace the static ball with an image? At this stage I was focused on an emoji-type character.
  4. Could I design a game background?
  5. Could I design collision detection for when the character hit a wall?
  6. Could I control the emoji’s speed?
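Steps 1, 2 and 5 above can be sketched in a few small functions. This is only an illustration of the approach – splitting a sentence into words, scanning those words for known intents, and doing an axis-aligned rectangle overlap test for wall collisions; the word list, function names and shapes are my assumptions:

```javascript
// Step 2's recognised intents (assumed list, per the post)
const INTENTS = ["move", "go", "up", "down", "left", "right", "stop"];

// Step 1: lower-case, strip punctuation, split on whitespace
function tokenise(sentence) {
  return sentence
    .toLowerCase()
    .replace(/[^\w\s]/g, "")
    .split(/\s+/)
    .filter(Boolean);
}

// Step 2: keep only the words that are known intents
function extractIntents(words) {
  return words.filter((w) => INTENTS.includes(w));
}

// Step 5: simple axis-aligned bounding-box overlap between the
// emoji's rectangle and a wall tile's rectangle
function hitsWall(emoji, wall) {
  return (
    emoji.x < wall.x + wall.w &&
    emoji.x + emoji.w > wall.x &&
    emoji.y < wall.y + wall.h &&
    emoji.y + emoji.h > wall.y
  );
}
```

With intents extracted this way, “please go up now” and a bare “up” both resolve to the same movement, which is exactly the shortcut the user testing revealed.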

Setup of Game

At this point I was ready to formalise the approach for the game. It would:

  • Have a character who can be controlled via the X and Y co-ordinates through text commands.
  • Have three stages to the Game – a Loading screen, a Playing screen and a Win screen.
  • Have a timed element to increase replayability.
  • Allow the user to restart the game at any stage.
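The three stages and the timer fit naturally into a small state machine. A minimal sketch, assuming the state names follow the three screens listed above and taking the clock as an injectable function (everything here is my own illustration):

```javascript
// Hypothetical game object: three states plus a timer.
// `now` is injectable so the timing logic can be tested.
function createGame(now = Date.now) {
  return {
    state: "loading",
    startTime: null,
    elapsed: null,
    start() {
      this.state = "playing";
      this.startTime = now();
    },
    win() {
      this.state = "win";
      this.elapsed = now() - this.startTime; // time shown on the win screen
    },
    reset() {
      // restart is allowed from any stage
      this.state = "loading";
      this.startTime = null;
      this.elapsed = null;
    },
  };
}
```

Keeping the timer inside the same object means ‘reset’ clears both the stage and the clock in one place.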

Inspiration – Monkey Island

Inspiration for the concept initially came from the Monkey Island series of games. I always enjoyed the format of play, where the user controls the action through a series of text-based commands. The humour and self-referential style of the games was also appealing. If I could incorporate these elements into the game design, I would be happy.