Final Project: Play Testing (PCOMP)

Today I was able to user test with 7 people, all of whom offered great feedback, inspiration, and insight into the next step of my project (which, by the grace of God, I have not changed yet).


OBSERVATIONS

These were the natural inclinations toward interacting with the dog device that I noticed:

hug
pet head
pet body
squish body/squeeze body (the poor animal and its organs)
press down on body
put on lap
hold out in front


FEEDBACK

Make the device draw your attention
Make the device something that the user wants to interact with visually and physically
Have the proximity sensor in the eyes be mapped to a close distance and a far distance; the far distance will cause the device to notice you and grab your attention to reel you in (see the sketch after this list)
Have the device become warm to give that warming sensation of a real dog
If the tail moves on the device, it should move on screen
In connecting the computer to the physical device, think about what each is strong at
Have the dog interface need you. If it is sad on screen, then you become the care giver and hug it back
Make it initiate
Leave room for humor and surprise
Tongue goes up and down, not out
Users were more interested in the physical device; in its current state, the project asks for split attention between toy and screen
There can be an option to feed it
The dog device should be something that you can talk to and share your feelings with
It listens to you and reacts
It can be naughty and the user can play with it
The digital interface can be like Gabe’s trapped in bottle/projection
With the servo motors, the tail wagging and the tongue movement are not natural/organic compared to a real dog. It will feel weird, and there is a noise that goes with it. Perhaps try using a vibration motor.
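
To make the two-distance idea concrete, here is a minimal Arduino sketch, assuming an analog IR distance sensor (something like a Sharp GP2Y0A21) on pin A0. The pin and both thresholds are placeholders I made up and would need calibrating against the real sensor in the real enclosure.

```cpp
// Two proximity zones, assuming an analog IR distance sensor on A0.
// Both thresholds are made-up placeholders that need calibration.
const int PROX_PIN = A0;
const int FAR_THRESHOLD = 200;   // above this: someone is in the room
const int NEAR_THRESHOLD = 500;  // above this: someone is right here

void setup() {
  Serial.begin(9600);
}

void loop() {
  // For this sensor family, a higher analog reading means closer.
  int reading = analogRead(PROX_PIN);

  if (reading > NEAR_THRESHOLD) {
    Serial.println("NEAR: full interaction mode");
  } else if (reading > FAR_THRESHOLD) {
    Serial.println("FAR: notice the user and reel them in");
  } else {
    Serial.println("IDLE");
  }
  delay(100);
}
```

The far zone would trigger the attention-grabbing behavior (a wag, a look), and the near zone would switch over to full interaction.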


QUESTIONS RAISED

Users had several questions that I found interesting to ponder:

Q: What happens if the device is in your lap – will multiple sensors go off?
Solution: Use an accelerometer and map this to other sensors
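
As a rough illustration of that accelerometer-plus-mapping solution, the sketch below gates the touch readings on the dog's orientation, so several sensors firing at once on a lap reads as one cuddle rather than three separate events. The analog three-axis accelerometer (ADXL335-style), the touch pins, and all thresholds are assumptions, not decided hardware.

```cpp
// Lap detection: use the accelerometer's orientation to decide how to
// interpret the other sensors. The analog 3-axis accelerometer, the
// touch pins, and all thresholds are assumptions.
const int Z_PIN = A3;           // only Z is needed for "lying flat"
const int HEAD_TOUCH_PIN = 2;   // hypothetical digital touch input, head
const int BELLY_TOUCH_PIN = 3;  // hypothetical digital touch input, belly

bool isOnLap() {
  // Lying flat on a lap, gravity sits mostly on the Z axis.
  int z = analogRead(Z_PIN);
  return z > 450 && z < 600;    // window needs calibration on hardware
}

void setup() {
  pinMode(HEAD_TOUCH_PIN, INPUT);
  pinMode(BELLY_TOUCH_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  bool head = digitalRead(HEAD_TOUCH_PIN) == HIGH;
  bool belly = digitalRead(BELLY_TOUCH_PIN) == HIGH;

  if (isOnLap() && head && belly) {
    // Multiple sensors firing while on a lap is one gesture: cuddling.
    Serial.println("CUDDLE");
  } else if (head) {
    Serial.println("HEAD PET");
  } else if (belly) {
    Serial.println("BELLY SCRATCH");
  }
  delay(50);
}
```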

Q: What if the computer is not around? What if the toy is not around?
At this point I believe that the device can work away from the computer if Bluetooth is involved, but the screen part will need the device in order to work.

Q: What if it falls?
It would react based on whichever sensors are triggered, not to the idea of falling and getting hurt

Q: Can it jump or run around?
No. This device is not intended to move forward or backward on its own

Q: Can it recognize its owner?
At this point, I don’t have plans for it to do this; however, this would be a fabulous feature for future iterations.

MY QUESTIONS ANSWERED

What are the most important interactions? Which one(s) make you as a user feel better in the process?
tail wagging
panting/tongue
reacting to you
inviting you in to interact with it
dog looking at you

Which interactions (perhaps not on this agenda) with a dog or pet make you feel better?
It seemed that holding it, talking to it, or interacting with it in general was the natural response

What material would you prefer to see the physical interface made of – plush, felted wool, fabric, other?
Something soft

What on-screen visuals would make you feel better, when interacting with the stuffed physical interface – live action video, sounds, other types of visuals?
Have the visuals reflect what the physical device does. If it sniffs you, then the dog on screen sniffs. 

What are the best sensors for the petting and belly scratch? Force or touch?
Capacitive sensor (metallic fabric on the inside)
Heat sensor
Accelerometer
Proximity sensor (invites you in)
Pressure sensor
Flex sensor
Vibration motor
Solenoid
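
For the capacitive option specifically, the usual Arduino route is Paul Badger's CapacitiveSensor library, with the metallic fabric wired in as the touch electrode. The pin pair and the threshold below are illustrative guesses, not final choices:

```cpp
#include <CapacitiveSensor.h>

// Capacitive petting sensor: send pin 4, receive pin 2, with the
// metallic fabric connected on the receive side as the electrode.
// The pin pair and threshold are illustrative, not final choices.
CapacitiveSensor petSensor = CapacitiveSensor(4, 2);

const long TOUCH_THRESHOLD = 1000;  // calibrate with the fabric sewn in

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = petSensor.capacitiveSensor(30);  // 30 samples per reading
  if (reading > TOUCH_THRESHOLD) {
    Serial.println("petting detected");
  }
  delay(50);
}
```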

OTHER THOUGHTS

What if, instead of a tongue-licking interaction, I could have air blowing from his nose? Moon suggested the aquarium bubble fan.
Can I use a screen for eyes (for this I was referred to Abishek) or animatronic eyes…?!

DIGESTING THE FEEDBACK

This was a great experience: learning what works, what doesn’t, and what other people need vs. what I think other people need. I would love to test this idea on the target market. However, first things first – a final project design. From the above feedback, these are the most important points that I want to look at further:

petting the head and body
having the dog on the lap, where possibly many sensors may be engaged at once
have the dog look at the user (although in real life dogs are said not to make eye contact, I certainly make eye contact with my dog a lot; eyes are the doors at which souls connect, and also the doors that see breakfast)
use an accelerometer and map it with other sensors, so that the device’s reaction depends on which axis the accelerometer is on along with which other sensor is activated
Make the device something that the user wants to interact with
Have the device draw the attention of the user
Leave room for humor and surprise
The tail and tongue movements are not natural with the servo motors and also create noise
Have the visuals reflect the physical device
Capacitive sensor (metallic fabric on the inside)
Accelerometer
Proximity sensor (invites you in)
Flex sensor
Vibration motor

Cynthia Breazeal is an inspiration in this field. The work of the MIT Media Lab’s Personal Robots group is fabulous. I discovered this amazing project: the digital eyes are amazing, and in a perfect world I would want to create that. I love how the device comes together, soft on the outside but removable for access to the electronic counterparts. The skin is also washable. However, it does not have the soft, natural feel of an animal that I am trying to get at. And then there is this project where a stuffed bear is interactive both on and off screen: users on one end manipulate the bear via a computer program, while on the other end a user interacts with the bear physically, and that reflects back to the users on the program end. In that same perfect world that I mentioned above, I would want a real dog, in real time, to reflect the actions of the stuffed dog that users are interacting with. The real dog would be on the screen, reacting in unison with the user.

RE-ENVISIONED FINAL PROJECT DESIGN

Taking this feedback, digesting it, and reorganizing the plan, I have narrowed my focus down to several basic interactions, before the device becomes a complete sensor menagerie.

Here are the interactions and their counterparts, as the plan is now:

  1. Tail wagging: a vibration motor moves the tail, driven by flex sensors on the head and back, mapped with the accelerometer. When you rub the head or move the dog in a certain way, the tail will wag.
  2. Eyes moving: an infrared PIR motion sensor, with small ping pong balls on servo motors for the eyes.
  3. Petting for a long time while the dog is on the lap will cause its eyes to close.

For now I want to focus on these interactions. And I don’t necessarily need to connect it to the screen. This may change.
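
To pin down interactions 1 and 3, here is a first-pass Arduino sketch. The flex sensor pins, the transistor-driven vibration motor on pin 9, the single eye servo on pin 6, and every threshold are placeholders until the circuit actually exists; the accelerometer mapping from the play-testing notes could later be layered on top of the petted check.

```cpp
#include <Servo.h>

// First pass at interactions 1 and 3. All pins and thresholds are
// placeholders until the actual circuit is built.
const int HEAD_FLEX_PIN = A0;   // flex sensor on the head
const int BACK_FLEX_PIN = A1;   // flex sensor on the back
const int TAIL_MOTOR_PIN = 9;   // vibration motor via transistor
const int EYE_SERVO_PIN = 6;    // servo rotating the ping-pong-ball eye

const int FLEX_THRESHOLD = 300;          // "being petted" level, to calibrate
const unsigned long SLEEPY_MS = 5000;    // petting this long closes the eyes

Servo eye;
unsigned long pettingSince = 0;  // 0 means "not currently being petted"

void setup() {
  pinMode(TAIL_MOTOR_PIN, OUTPUT);
  eye.attach(EYE_SERVO_PIN);
  eye.write(90);  // eyes open
}

void loop() {
  bool petted = analogRead(HEAD_FLEX_PIN) > FLEX_THRESHOLD ||
                analogRead(BACK_FLEX_PIN) > FLEX_THRESHOLD;

  if (petted) {
    if (pettingSince == 0) pettingSince = millis();
    analogWrite(TAIL_MOTOR_PIN, 200);   // wag: buzz the tail motor

    if (millis() - pettingSince > SLEEPY_MS) {
      eye.write(0);                     // long petting: eyes drift closed
    }
  } else {
    pettingSince = 0;
    analogWrite(TAIL_MOTOR_PIN, 0);     // tail at rest
    eye.write(90);                      // eyes open again
  }
  delay(50);
}
```

Interaction 2 (PIR-driven eye movement) would layer onto the same structure, with a digitalRead of the PIR pin steering the eye servo.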


FUTURE CAPABILITIES

ON-SCREEN INTERACTION
It would be great in the future to have the user interact with a dog on screen, whether in video loop/mapped form or in real time. This MAY happen in this iteration, but I am not 100% sure that it will get that far at this point.
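
If the screen does come back, the simplest link is the serial port: the Arduino prints one named event per line, and a Processing or p5.js program on the laptop plays the matching clip. The event names below are invented for illustration; in the real sketch, sendEvent() would be called from the sensor logic rather than a demo loop.

```cpp
// Hypothetical serial event protocol for the on-screen dog: one named
// event per line, which a Processing/p5.js program maps to a clip.
void sendEvent(const char* name) {
  Serial.println(name);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Demo only: in the real sketch these calls would come from the
  // sensor logic (petting, proximity, lap detection) instead.
  sendEvent("TAIL_WAG");
  delay(2000);
  sendEvent("SNIFF");
  delay(2000);
}
```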


NOSE SNIFFLING
I would love to have an interaction where, via a proximity sensor, the dog sniffs. I tried looking up ways to do that. Moon mentioned the small bubble machines for aquariums. I was looking at small air blowers such as this one. Not having seen how it works or what it sounds like (there is a lack of videos online), this was sacrificed this round.
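
Even with the blower hardware unresolved, the control side is simple: a transistor switches the pump from a digital pin, pulsed when the proximity sensor sees a nearby hand. A hedged sketch, with all pins and thresholds made up:

```cpp
// Sniffing sketch: pulse a small air pump (through a transistor on
// pin 8) when the proximity reading crosses a threshold. The pump
// hardware itself is still an open question; values are placeholders.
const int PROX_PIN = A0;
const int PUMP_PIN = 8;
const int SNIFF_THRESHOLD = 400;

void setup() {
  pinMode(PUMP_PIN, OUTPUT);
}

void loop() {
  if (analogRead(PROX_PIN) > SNIFF_THRESHOLD) {
    // Two short puffs read more like sniffing than one long blow.
    for (int i = 0; i < 2; i++) {
      digitalWrite(PUMP_PIN, HIGH);
      delay(120);
      digitalWrite(PUMP_PIN, LOW);
      delay(120);
    }
  }
  delay(100);
}
```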

DOG NOISES
I would love for noises to come from the dog device as well. However, I worry that this may cheapen it, making it seem more like a toy and less like an animal. Somehow.


2 Comments

  1. See also Paro’s baby harp seal: https://www.flickr.com/photos/tigoe/11934668486/ and http://www.parorobots.com/index.asp. Worth noting because it solves some of the problems you have raised through simplification. For example, the eyes are all black, so they do not look around, but they do blink. This is a very expressive motion, and has a similar effect to looking around (it personifies the robot) but takes potentially less effort.

    Likewise, Paro’s movement is primarily the head, in response to petting. Again, this gets the effect of responsiveness with little motion. Like both Breazeal’s robot and Paro (both of which took over a year of development), you’re going to need a skeleton if you want movement in more than one place at a time, so that the movement of one thing (tail) doesn’t adversely affect the movement of another.

    I think you might want to get one action down well and study its effect rather than try to implement them all at once.

  2. Oh wow. Instead of having eyes that move, one could make lids that move. That being said… I like the idea, but two things:
    1) I find it special and very personal when I see the whites of a dog’s eyes. It reminds me that they are living, like myself. It brings this special connection.
    2) I don’t want to do too much like the seal

    I do like the idea of focusing on one action. I’m going to aim to do both tail and eyes on a skeleton, going with the wagging first at this point. (this is a huge cutback of the original idea…)
