What is Autonomous/Demo Mode?
Starting the demo controller initialises the Biomimetic Core and puts it in control of the robot. Central components of the core build multi-sensory maps of salience in the space around MiRo (what is nearby that "matters" to the robot) and then select what to do next ("action selection") based on the priorities derived from those salience maps.
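The details live inside the Biomimetic Core itself, but the underlying principle can be sketched in a few lines. The Python snippet below is purely illustrative and is not part of the MiRo software: the names (combine_salience, select_action, and the example maps and actions) are invented for this sketch. It shows the general idea of summing per-sense salience maps into one priority map and then choosing the candidate action whose target is most salient.

    import numpy as np

    # Illustrative sketch only: not the MiRo MDK API.
    # Each salience map is a 2D array covering the space around the robot.
    def combine_salience(maps, weights):
        """Sum the per-sense salience maps into a single priority map."""
        combined = np.zeros_like(next(iter(maps.values())))
        for sense, salience in maps.items():
            combined = combined + weights.get(sense, 1.0) * salience
        return combined

    def select_action(candidate_actions, combined_map):
        """Pick the candidate action whose target location is most salient."""
        def priority(action):
            y, x = action["target"]  # location the action would attend to
            return combined_map[y, x]
        return max(candidate_actions, key=priority)

    # Hypothetical example data: motion and sound each contribute a map.
    maps = {
        "motion": np.random.rand(10, 10),
        "sound": np.random.rand(10, 10),
    }
    weights = {"motion": 1.0, "sound": 0.8}
    actions = [
        {"name": "orient", "target": (2, 7)},
        {"name": "approach", "target": (5, 5)},
    ]
    best = select_action(actions, combine_salience(maps, weights))
    print("selected:", best["name"])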
To really understand MiRo's behaviour in demo mode, then, you'll need to look into how mammalian brains work. On the other hand, if you just want MiRo to roam around interacting with whatever it comes across, all you have to do is start the controller.
Just as with an animal, you'll find that working with a MiRo in demo mode becomes more rewarding as you learn what he [1] likes. As you come to understand what "matters" to MiRo, you will find it easier to persuade him to pay attention to you, rather than to passing shiny objects.
MiRo's drivers of salience include motion (as determined using the cameras in the eyes), sounds (picked up and localised by the microphones in the ears), human faces, and his favourite toy, a blue ball [2]. You can get MiRo's attention with any of these stimuli, but some will be more effective than others depending on your environment; note that each source of salience can also be disabled using the demo flags. He also responds to being touched and stroked, but we will leave it to you to determine just how.
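How the demo flags are named and set is documented with the demo program itself, so none of them are reproduced here. Purely as an illustration of the idea, disabling a source of salience can be thought of as dropping that sense's map before the maps are combined; the names in this sketch (ENABLED_SOURCES, active_salience) are invented for the example.

    # Illustrative sketch only; these are not the real demo flag names.
    ENABLED_SOURCES = {
        "motion": True,
        "sound": True,
        "face": False,   # e.g. face detection switched off for this run
        "ball": True,
    }

    def active_salience(maps, enabled=ENABLED_SOURCES):
        """Keep only the salience maps whose source is currently enabled."""
        return {sense: m for sense, m in maps.items() if enabled.get(sense, False)}

    if __name__ == "__main__":
        maps = {"motion": [[0.2]], "sound": [[0.9]], "face": [[0.7]], "ball": [[0.1]]}
        print(active_salience(maps))   # the "face" map has been dropped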
MiRo's responses to these stimuli depend on what mood he is in. You can judge MiRo's mood by the colour of the lights under his body shell, by the nature of his vocalisations, and—naturally—by the way he responds.
To learn more about how to interact with a MiRo in demo mode, look at the page on Wrangling.