On 26 May 2016, in Dublin, Measuring Behavior hosted a workshop on Animal-Computer Interaction (ACI) (previously blogged). The workshop was organised by myself, Anton Nijholt, Patricia Pons and Adrian D. Cheok.
The aims of this symposium were twofold. Firstly, we wanted to introduce the topic of animal-computer interaction to the Measuring Behavior community, on the assumption that there can be fruitful interaction between the two communities. Secondly, we aimed at contributions addressing methodological questions. How do we conduct user-centred design in animal-computer interaction, or in computer-mediated human-animal interaction? Which methodologies from HCI can be adapted to ACI? Clearly, in this emerging field of research, case studies can help answer these questions, and they were welcomed as well.
In this blog post, I plan to give you a taste of the work presented as well as talk about the questions raised during this workshop.
Animal-Computer Interaction: Animal-Centred, Participatory, and Playful Design.
The workshop started with a talk by Anton Nijholt, welcoming everyone to the conference and introducing ACI, Dublin and Measuring Behavior. In his talk he discussed Clara Mancini's definition of animal-computer interaction. He then went on to speak about previous ACI workshops and events, such as ACI@BHCI and the First and Second ACI Symposia, and introduced this workshop as another step for ACI.
Detecting Animals’ Body Postures Using Depth-Based Tracking Systems.
The next presentation, given by Alejandro Catala, covered the paper by Patricia Pons, J. Jaen and A. Catala on depth-based tracking systems using the Xbox Kinect. He explained the motivation behind their work and spoke about the growing interest in mapping behavioural patterns. Through this mapping, he argued, a system like theirs opens up many possibilities; examples given were the computer's ability to automatically recognise behaviour and to create an interactive, playful environment. Their work focused primarily on posture recognition, where they currently achieve a 90% recognition rate. They hope this work can be used in kennels and rescue centres to help learn about animal behaviour, and have tried their system with lions and orangutans in zoos. However, they felt that for orangutans this would not be the best approach and that human-led methods are preferred. They ended by describing their current collaboration with specialists to identify posture behaviours.
Q: Would you not need perspective inverse vision to allow for correlated vision?
A: It is not needed when using the tracking area.
Q: Do you consider using other cues, such as colours, as an additional set of features to distinguish between animals?
A: We are proposing this under a controlled scenario. It can be hard to maintain colour cues, especially with other animals.
Q: A cat is rather big considering the size of your tracking area (270 x 250cm). Do you face issues with the size of your tracking area, and could you use this with other animals, such as mice and insects?
A: This could be possible, but you would need to define new features. (Fiona French then suggested that you could mount the Kinect on a drone to allow continuous tracking without being limited by space – I loved this idea!)
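To make the posture-recognition idea above concrete, here is a minimal, purely illustrative sketch of how a depth-based system might classify a posture from a foreground blob. The feature (blob aspect ratio) and the threshold rule are my own stand-ins for illustration; they are not the features or classifier the authors actually used:

```python
import numpy as np

def extract_features(depth_mask):
    """Compute simple geometric features from a boolean foreground mask,
    e.g. one obtained by background-subtracting a Kinect depth frame."""
    ys, xs = np.nonzero(depth_mask)
    height = ys.max() - ys.min() + 1   # vertical extent of the animal blob
    width = xs.max() - xs.min() + 1    # horizontal extent
    return {"aspect_ratio": width / height}

def classify_posture(features):
    """Toy rule: a wide, low blob reads as 'lying', a tall one as 'standing'."""
    return "lying" if features["aspect_ratio"] > 1.5 else "standing"

# Synthetic example: a flat 10x40 blob, as if the animal were lying down.
mask = np.zeros((100, 100), dtype=bool)
mask[80:90, 30:70] = True
print(classify_posture(extract_features(mask)))  # lying
```

A real system would of course use richer depth features and a trained classifier rather than a single hand-set threshold; the point is only the pipeline shape of mask, features, then posture label.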
Animal-Human Digital Interface: Can Animals Collaborate with Artificial Presences?
Prof Naohisa Ohta then presented the paper he wrote with Hiroki Nishino, Akihiko Takashima and Adrian David Cheok on animals using visual computer machinery. Prof Ohta is new to the field of ACI, having previously been involved in computer vision. He started his talk by speaking about the many players in animal-human interaction and how we can share the same world, even if we experience it differently. This opened up a discussion about the limited spaces between humans and animals, with both experiencing these spaces differently. He then explained that he and his students have been running informal experiments for a while, Skyping their dogs at home. Often the dogs would recognise the owner's voice on Skype and look around for their owner; his own dog would look out the window. It is in creating interactions between dogs and technology that his long-term goals lie. He stated that he did not want to make technology like the Pawculus Rift, but instead to focus on a visual interface screen. He talked about how people see things visually, and whilst 2D displays sometimes work well, he wanted to make a device with dynamic motion cues for dogs. The challenge he currently faces is that, whilst tracking technology is mature for humans, it is not yet fully functional for dogs.
I personally really enjoyed this talk, and as someone involved in dogs, tracking and media technology, I can see the value of such a product not only for home-alone dogs but also for kennelled ones. After the conference I talked with Prof Ohta about his ideas for using this system as a 'window' that could not only show other places but also convey smell, creating a truly interactive product – maybe even with dog-dog interaction. I look forward to his future publications!
Q: Would you think of adding Haptics into this?
A: No, he had not planned to do that.
A video of an example of the VR system Prof Ohta was looking to implement can be found here.
Towards a Wearer-Centred Framework for Animal Biotelemetry.
Patrizia Paci presented the paper she wrote with Clara Mancini and B.A. Price on the framework she has developed for animal biotelemetry devices such as activity trackers. She began by talking about the rise of such devices and the implications they can have for the health of the animal. The design framework for these devices focuses on two things: features and methods. As an example, she mentioned animals wearing brightly coloured collars that cause them to stand out. At this point I was reminded of the animals in the film Happy Feet who, when tagged, believed it to be something special. She then gave a nice example of how wearables need to be centred on the animal, showing the different frequency ranges that cats and mice can hear; a device would need to avoid both so that the cat can behave normally and hunt mice. She went on to describe a study she conducted in which cats wore different devices while their behaviour was measured. The three conditions were a control (no device), an activity monitor (10g) and a GPS tracker (40g). She videoed the cats and noticed that they would often try to get the heavier collar off, and found a correlation between the method of attachment and disturbance behaviours. She finished by presenting proposed design recommendations within a wearer-centred framework.
Q: What principles do we need for user-centred design and a good wearable experience?
A: A good experience is having no experience at all! The principles presented here relate not only to the wearer but also to other animals, in relation to the animal's biology. For each animal, its biology would need to be considered and the design centred around that particular animal. Whilst this is currently a work in progress, the next thing we are going to explore is the use of the device.
Q: Would you instead use biotelemetric devices under the skin?
A: No, because this is not good for animal welfare.
Q: What if the animal wore the device for a long time, so it became used to the weight and features of the device?
A: She had not considered this, as this would be conditioning the animal towards the device, going against animal-centred design and hindering normal behaviour.
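Patrizia's point about cat and mouse hearing can be sketched as a simple check on a device's emission frequency. The ranges below are rough textbook figures I have assumed for illustration; they are not values from her paper:

```python
# Approximate audible ranges in Hz (illustrative assumptions, not paper data).
HEARING_RANGES_HZ = {
    "cat": (48, 85_000),
    "mouse": (1_000, 91_000),
}

def audible_to(frequency_hz, species):
    """True if the given frequency falls inside the species' audible range."""
    low, high = HEARING_RANGES_HZ[species]
    return low <= frequency_hz <= high

def device_is_acoustically_neutral(frequency_hz):
    """A wearer-centred device should be inaudible to both predator and prey,
    so the cat can behave normally and still hunt mice."""
    return not any(audible_to(frequency_hz, s) for s in HEARING_RANGES_HZ)

print(device_is_acoustically_neutral(30_000))   # False: both species hear 30 kHz
print(device_is_acoustically_neutral(120_000))  # True: above both ranges
```

The same shape of check generalises to any wearer-centred constraint: enumerate the animals affected, look up the relevant sensory range for each, and reject device parameters that intrude on any of them.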
Playful UX for Elephants.
Fiona French gave a lovely presentation on the paper she wrote with Clara Mancini and Helen Sharp on playful user experiences for elephants. The aim of her work is to enhance the lives of captive elephants by creating an enriched environment. However, as she pointed out, there is tension between the technology and the animal, since technology is not part of the animal's ordinary environment. She saw, though, that technology could give the elephant back some control over its own environment. The question she faced was what the elephant has to play with, and how we add to its play space. She spoke about the different systems she had tried, from musical tubes to tactile pads and buttons. The keeper of the elephant she was working with wanted a shower system, so she experimented with interfaces for one. She wanted the device to add value for the animal, and made a footpad (similar to a sewing machine pedal) to let the animal operate it. Whilst this work is still in progress, it takes a very animal-centred approach involving not only the animal but the designer (herself) and the zoo staff. She is building the device using an Arduino, and I look forward to seeing how the elephant uses it and what she learns from this.
Q: Have you thought about using a branch as an interactive device?
A: Elephants tend to be quite destructive, and using a branch would be difficult, expensive and potentially dangerous.
Q: Could you not train the elephant to use the device first, instead of waiting for it to discover things that are intrinsically interesting? Would this not open up more opportunities?
A: Potentially, food could always be the motivator for this discovery. However, it removes the intent of the animal and modifies the behaviour we are trying to measure.
Interactive Toys for Elephants Website: http://toys4elephants.blogspot.co.uk/
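Since the footpad is essentially a switch read by a microcontroller, the control logic might look something like the debouncing sketch below. This is purely my own guess at the kind of logic involved, written in plain Python over a simulated reading stream; Fiona's actual Arduino implementation was not described in the talk:

```python
def debounced_presses(samples, n=3):
    """Count pad presses in a stream of noisy 0/1 readings.

    A state change is only accepted after n consecutive identical
    readings, filtering out contact bounce from a heavy footpad."""
    presses = 0
    state = 0          # accepted pad state: 0 = released, 1 = pressed
    run_val, run_len = None, 0
    for s in samples:
        if s == run_val:
            run_len += 1
        else:
            run_val, run_len = s, 1
        if run_len >= n and run_val != state:
            state = run_val
            if state == 1:
                presses += 1  # accepted press: here you would trigger the shower
    return presses

# Noisy stream: one real press with a contact bounce in the middle.
print(debounced_presses([0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 0, 0]))  # 1
```

Debouncing matters here because a multi-tonne foot on a mechanical pedal will chatter; without it, one stomp could read as several presses and the shower would stutter on and off.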
Animal-Computer Interaction (ACI): A survey and some suggestions.
Egon L. van den Broek presented his paper on guidelines for ACI from an outsider's perspective, being mainly involved in Artificial Intelligence (AI). He has some experience of human-animal interaction through a PhD student of his who used horses to help regulate disabled children's emotions. He saw HCI as providing a fragile foundation for ACI and suggested a closed-loop model. He spoke about the ease of use of a system and different interaction styles, and stressed the need to understand the agents involved and to bring different communities together. He then showed a video of monkey behaviour with a woollen and a wire 'nursing mother' to demonstrate some of the challenges faced in ACI: that behaviour can be temporary or indirect. His key suggestion was to take care over validity with a specific baseline. He was also interested in empathy in ACI, where you can infer others' feelings. The key message I took from his talk was to employ an interdisciplinary approach to ACI: 'the challenge is as big as the need for animal computer interaction'. He finished by presenting a new journal, Animal Sentience (ASEnt), on non-human animal emotions.
The Ethics of How to Work with Dogs in Animal Computer Interaction.
I presented the paper I wrote with Janet Read on the ethics of working with dogs in ACI. I started by talking about how, as our knowledge of animals has grown and technology has evolved, the boundaries between humans, animals and machines have blurred, with the relationships between them changing. This has led to a change in how humans work with animals, with the old cost vs. harm ethics being replaced by an animal-centred approach, though different approaches are still used along this linear scale from human to animal. I then spoke about the two ethical papers in ACI, by Clara Mancini and Heli Väätäjä, and presented my own ethical guidelines on working with dogs in animal-computer interaction, which I call Dog Computer Interaction (DCI). I spoke about how I came up with these guidelines and how I expect them to grow as the DCI field, my own work and I do. Lastly, I spoke about how the research we do in ACI is influenced by our feelings about what animals are, and how this directly influences not only the study design but, through it, the results, methods and theories created. I ended by asking other ACI researchers to state within their work how they feel about the animals they work with, to create a more in-depth approach.
Q: Why not do a double-blind study with animals, to stop the feelings of the owner being involved?
A: Even if the owner did not know about the study, the behaviour of the researcher who designed the study would still be influential.
Q: I found this point interesting as it goes against conventional science, where you are supposed to take your emotions out of the equation. Is it personal feelings you want, or a statement about what you are doing? They are two different things.
A: Ideally I think both, as they are interwoven with each other.
Q: Yes, but you could just say you are following those guidelines, and it wouldn't mean anything.
A: I agree, but putting in a statement about how you felt about the animal would show how you treated the animal within the study and give clues as to the mind-frame in which the study was designed. For example, someone working with dogs who loves them vs. someone who does not like them could possibly produce different results through the way they design their work.
Overall, the day was enjoyable, and it was interesting to see how the standard measuring-behaviour view in science differs from the view most ACI researchers hold. To me, the key difference between these two fields is the definition of animal welfare. Whilst in animal science (bioscience) animal welfare is considered, in ACI this often goes beyond simply providing for the animal's needs, shifting entirely towards a phrase often seen in ACI publications: animal-centred. I agree with Fiona French's answer that, in order to measure true behaviour, we must not train that behaviour, and indeed this point was in my own publication. This user-centred approach, whilst not new in HCI, is rather new in animal behaviour research, and as interest in this interdisciplinary field grows, as it should and I am sure it will, it will be interesting to see the different theories and methods that emerge, carrying residue from the norms of previous fields.
As an organiser of this event, I would finally like to take a moment to thank all of those who presented, asked questions and attended this workshop at Measuring Behavior, and lastly Anton Nijholt for starting this wonderful event.