Petcube is a Kickstarter start-up that raised $3.8 million to build a cube-shaped device that lets dog and cat owners interact remotely with their pet through an app, using video, voice communication and laser games. The product idea originated when its designer, Alex Neskin, had a pet Chihuahua, Rocky, who suffered from separation anxiety, leading the neighbours to complain about the noise caused by the dog's distress. This led Alex to build Arduino prototypes of Petcube to entertain and check up on his dog. He then joined with Petcube's two other developers to iteratively create the overall product.
The final product allows the human user to view the camera's 180-degree range, talk to and hear their pet, and use a finger to guide a laser that the pet can chase. While the product has been available since 2013, it has recently undergone a few software updates, adding sound and motion detection and an auto-play mode that triggers randomised laser patterns. Recorded video can be viewed and stored remotely, with cloud storage as a planned future addition. These videos can then be shared in an online Petcube community, which also allows users to play remotely with other people's pets and to view other people's videos and images.
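Petcube has not published how auto-play generates its patterns, but one plausible reading of "randomised patterns" is a bounded random walk over laser-dot coordinates. The sketch below is purely illustrative; the function name and parameters are my own assumptions, not Petcube's API.

```python
import random

def autoplay_path(steps=5, width=160, height=120, max_jump=30, seed=42):
    """Hypothetical sketch of an auto-play laser pattern: a bounded
    random walk of (x, y) screen coordinates for the laser dot.
    Petcube's real pattern generator is not public."""
    random.seed(seed)
    x, y = width // 2, height // 2       # start from the frame centre
    path = [(x, y)]
    for _ in range(steps):
        # Jump a limited distance, clamped to stay inside the frame.
        x = max(0, min(width - 1, x + random.randint(-max_jump, max_jump)))
        y = max(0, min(height - 1, y + random.randint(-max_jump, max_jump)))
        path.append((x, y))
    return path

print(autoplay_path())  # a short list of in-bounds waypoints
```

Clamping the jump size keeps the dot's movement cat-chaseable rather than teleporting across the frame.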
The feature I really liked about Petcube was their placement of these devices in shelters, to help increase adoption rates through exposure and to provide entertainment for the sheltered pets.
As a designer of dog interfaces, I had a few questions for the company about the design and implementation of the product:
How is the product pet-focused? The company makes the product pet-focused through iterative design with the owner and pet. They claim to proactively reach out to owners, and as the system is primarily software-based, they push updates informed by these reviews. This builds a feedback loop. The methods they use to gather feedback from owners about Petcube use are surveys and research.
How did you test the product and come up with its design? As mentioned above, the product is tested through iterative design in what appears to be an agile methodology. Testing a product like this can be troublesome because there are effectively two users: the dog and the owner. While the owner is the one buying the product, the dog also has a choice about whether to use the technology, which is ultimately why the product is bought: to aid the human-dog relationship. It would be interesting to know more about the method they chose, or built, to test the product with dogs, to see the validity of their results and how they correlate with current methods of designing with dogs. Aesthetically, the product is designed to look sleek and appeal to humans, and it is provided with a tripod to give the camera a wider field of view.
How does the advanced motion sensing work? How is it automatic? There have been various forms of animal tracking, such as facial recognition, body-posture analysis and eye tracking. With this new update I was interested in how consumer products recognise animals. Petcube uses the webcam to pick up image changes as a basic form of motion detection and sends push notifications to the user. This method, however, sends push notifications for any movement, not specifically the pet's, unlike the methods above.
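Picking up "image changes" like this is classic frame differencing: compare consecutive frames pixel by pixel and flag motion when enough of the frame has changed. Petcube's actual algorithm is not public, so the sketch below (with made-up threshold parameters) only illustrates the general technique and why it cannot tell a pet from any other moving object.

```python
def motion_detected(prev_frame, curr_frame, pixel_thresh=25, area_frac=0.01):
    """Naive frame differencing: return True if more than `area_frac`
    of the pixels changed by more than `pixel_thresh` grey levels.
    Frames are lists of rows of 0-255 grayscale values.
    Note: ANY change trips this - a pet, a person, or a shadow."""
    total = changed = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(c - p) > pixel_thresh:
                changed += 1
    return changed / total > area_frac

# Synthetic example: a dark static background, then a bright region appears.
background = [[0] * 160 for _ in range(120)]
with_pet = [row[:] for row in background]
for y in range(40, 80):
    for x in range(60, 100):
        with_pet[y][x] = 200

print(motion_detected(background, background))  # False: nothing changed
print(motion_detected(background, with_pet))    # True: a region changed
```

Because the test is purely "did pixels change?", a curtain blowing in the wind would notify the owner just as readily as the cat, which is exactly the false-positive problem raised above.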
How does this enrich animals' lives? The product claims to enrich animals' lives by providing dogs and cats with basic entertainment (the laser) and by allowing them to communicate with their owners and vice versa.
How do you ensure no negative effects, or do you provide guidelines for the owners? When working with animals there is always the need to ensure animal welfare, especially with a remote system. To help owners use the system with their pets, Petcube does provide a user guide that includes health warnings about prolonged use of a laser system, but it does not give guidance on the system's use. Instead, Petcube puts the onus on the dog owner: as the system is highly customisable, responsibility for its use lies with the owner.
What future updates are you thinking of implementing for this system? Petcube are currently looking into further sensory and haptic devices that they could integrate into the current system, such as an interactive ball (think walkie-talkie ball) that moves of its own accord.
THOUGHTS ABOUT PETCUBE
This conversation with Petcube led me to an interesting comparison with current systems in ACI, such as Playing with Pigs, which has a similar concept to Petcube but with pigs. That project used an interactive wall, but once again offered remote interaction through devices (tablets/phones). Its work, however, was primarily based on tactile interaction, and unlike Petcube, the animal had greater autonomous interactivity within the system, with visual feedback (when the ball of light is touched on the wall, it sparks). In this way I believe Petcube could be improved by involving the animal as a more prominent user in the system. This can be done by increasing its interactivity so that even when the owner is busy the animal can still use the system. This is where the new auto-play update slots in; however, it is currently unable to recognise the animal's features to enable more advanced gameplay.
While their system does not have touch interfaces (although these could be added through future implementations such as the external devices mentioned), dogs and cats primarily use body language as a form of communication. This is why I became so excited at the prospect of automated detection being used in a commercial system, as it allows for further automation and also autonomous play. However, I was disappointed to find this was only generic movement detection, which could not only cause false positives but also could not distinguish an animal from a human, or recover the meaning behind the animal's movement. The current hardware, a high-specification camera, could use basic image-recognition methods such as those I have built to see where the dog or cat is looking, and thus respond with the laser automatically. This would allow the pet's communication with the system (the laser) to be understood, increasing interactivity and creating a feedback loop from technology to pet. This in turn would allow the system to be used more, bringing more satisfaction to the owner through the enjoyment of the pet, satisfying both user groups.
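The gaze-responsive laser idea above could be sketched as a small geometry step: given a detected head position and gaze direction (hypothetical outputs of an upstream recognition stage, which is the genuinely hard part), place the laser dot a short distance ahead of where the pet is looking, clamped to the camera frame. All names and parameters here are my own illustration, not anything Petcube ships.

```python
def laser_target(pet_x, pet_y, gaze_dx, gaze_dy,
                 lead=40, width=160, height=120):
    """Place the laser `lead` pixels ahead of the pet's gaze.
    (pet_x, pet_y): detected head position in frame coordinates.
    (gaze_dx, gaze_dy): unit gaze-direction vector - both assumed to
    come from a separate (and much harder) recognition step.
    The target is clamped so the dot never leaves the frame."""
    tx = max(0, min(width - 1, round(pet_x + lead * gaze_dx)))
    ty = max(0, min(height - 1, round(pet_y + lead * gaze_dy)))
    return tx, ty

print(laser_target(80, 60, 1.0, 0.0))   # dot placed ahead of the gaze
print(laser_target(150, 60, 1.0, 0.0))  # clamped at the frame edge
```

Leading the gaze rather than targeting the pet itself matters for welfare: the dot stays something to chase, not a light shone at the animal.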
I also question the cognitive benefits of cats and dogs playing with lasers: while this may provide stimulation, I am unsure whether it would solve separation anxiety rather than temporarily stave off boredom. Hopefully, by creating a more interactive system through their updates, they could provide the dog or cat with further cognitive challenges rather than relying on a chase instinct.
Lastly, I am faced with the same quandary that Patricia and I felt with CleverPet: where does the responsibility lie in the use of animal-computer systems? While I think owners are responsible for how they use the system, I believe there should be stronger guidelines warning about the continual use of such a system in place of real human-animal interaction, and about the confusion that can occur when dogs hear their owner's voice remotely, especially for anxious pets who are already suffering cognitively.
Overall, while I enjoy learning about these products and how they are used, I believe the ACI community needs to work more closely with industry to help create better publicly available technology that benefits the animal as much as the human.