Proximity!
My Gimbal beacons arrived yesterday. These are three tiny Bluetooth LE devices, not much bigger than the watch batteries that power them. They do little more than send out a radio signal that says “I’m me!” twice a second.
There are three very different ways of using them that I can immediately think of:
I’ve just tried leaving one in each of three different rooms, then walking around the house with the simple Gimbal manager app on my iPhone. It seems their range is about three metres, and the walls of my house cause some obstruction. So with careful placing, they could tell my phone very simply which room it is in, and it could then serve me media like a simple audio tour.
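To make that concrete, here’s a minimal sketch of the nearest-beacon logic in Python. It doesn’t talk to any actual Bluetooth radio; the beacon identifiers, room names and RSSI readings are all invented for illustration:

```python
# Map each beacon's identifier to the room it lives in.
# These identifiers and rooms are invented for illustration.
BEACON_ROOMS = {
    "beacon-a": "Kitchen",
    "beacon-b": "Living room",
    "beacon-c": "Study",
}

def current_room(rssi_readings):
    """Guess which room we're in: the room whose beacon we hear
    most strongly. RSSI values are negative dBm, so the reading
    closest to zero is the strongest signal."""
    if not rssi_readings:
        return None
    strongest = max(rssi_readings, key=rssi_readings.get)
    return BEACON_ROOMS.get(strongest)

# Example: readings as they might arrive from a BLE scan.
print(current_room({"beacon-a": -85, "beacon-b": -62, "beacon-c": -90}))
# -> "Living room", so the app could start that room's audio tour
```

Because the beacons only broadcast “I’m me!”, all the intelligence can sit on the phone: it just has to decide which voice it hears loudest.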
Alternatively, as they are designed like key-fobs, they could be carried around by the user, and interpretive devices in a heritage space could identify each user as they approach and serve tailored media to that user. Straight away I’m thinking that a user might, for example, be assigned a character visiting, say, a house party at Polesden Lacey, and the house could react to the user as though they were that character. Or perhaps the user could identify their particular interests when they start their visit. If they said, for example, “I’m particularly interested in art”, then they could walk around a house like Polesden Lacey, and when they pick up a tablet kiosk in one of the rooms, it would serve them details of the art first. Such an application wouldn’t hide the non-art content of course; it would just give it a lower priority so that the art appears at the top of the page.

Or, more cleverly, the devices around the space could communicate with each other, sharing details of the user’s movements and adapting their offer according to presumed interest. So, for example, device A might send a signal saying “User 1x413d just spent a long time standing close to me, so we might presume they are interested in my Chinese porcelain.” Device B might then think to itself (forgive my anthropomorphism), “I shall make the story of the owner’s travels to China the headline of what I serve User 1x413d.”
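The prioritisation part of that is simple enough to sketch. Here the content items, their topic tags and the declared interest are all invented; the point is just that matching items float to the top while nothing gets hidden:

```python
# Each exhibit item carries topic tags; items matching the
# visitor's declared interest are served first. All titles
# and tags here are invented for illustration.
CONTENT = [
    {"title": "The family silver", "tags": ["social history"]},
    {"title": "Chinese export porcelain", "tags": ["art", "china"]},
    {"title": "A famous house party", "tags": ["social history"]},
    {"title": "Dutch Old Master paintings", "tags": ["art"]},
]

def prioritise(content, interest):
    """Stable sort: interesting items first, everything else
    kept in its original order, so no content is hidden."""
    return sorted(content, key=lambda item: interest not in item["tags"])

for item in prioritise(CONTENT, "art"):
    print(item["title"])
# The two art items now appear at the top of the page.
```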
But the third option, and the one I want to experiment with, is this. I distributed my three Gimbals around the perimeter of a single room. Then, when I stood by different objects of interest in the room, I read off the signal strength I was getting from each beacon. It looks like I should be able to triangulate (strictly, trilaterate) the signal strengths to map the location of my device within the room to within about a metre, which I think is good enough to identify which object of interest I’m looking at.
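For what it’s worth, here’s a rough sketch of the maths involved, assuming the usual log-distance path-loss model to turn each RSSI reading into a distance estimate. The beacon positions, calibration constants and readings below are all invented, and real readings will be far noisier than this:

```python
import math

def rssi_to_distance(rssi, tx_power=-59, n=2.0):
    """Log-distance path-loss model: tx_power is the expected
    RSSI at 1 m and n the environment exponent. Both values
    here are guesses and would need calibrating per room."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Solve for (x, y) given three beacon positions and the
    estimated distance to each, by subtracting the first
    circle equation from the other two to get a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Three beacons around the perimeter of a 5 m x 5 m room.
beacons = [(0.0, 0.0), (5.0, 0.0), (2.5, 5.0)]
readings = [-65, -72, -70]  # invented RSSI values in dBm
distances = [rssi_to_distance(r) for r in readings]
print(trilaterate(beacons, distances))  # estimated (x, y) in metres
```

In practice the readings would want smoothing (averaging over a few seconds, say) before being fed into anything like this, which is part of why metre-level accuracy is probably the realistic ceiling.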
What I want to do is create a “simple” proof-of-concept program that uses the proximity of the three beacons to serve me two narratives: one about the objects I might be looking at, and a second, more linear narrative which adapts to the objects I’m near and which ones I’ve already seen.
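The shape of that program might look something like the sketch below. The objects, their positions and the story beats are all invented, and the visitor’s position would come from something like the trilateration above:

```python
import math

# Objects of interest and their (x, y) positions in the room,
# plus the two narratives; everything here is invented.
OBJECTS = {
    "portrait": (1.0, 4.0),
    "writing desk": (2.5, 2.5),
    "porcelain": (4.5, 1.0),
}

OBJECT_NARRATIVE = {
    "portrait": "This portrait shows the owner of the house...",
    "writing desk": "At this desk she wrote her famous letters...",
    "porcelain": "She collected this porcelain on her travels...",
}

STORY_ORDER = ["portrait", "writing desk", "porcelain"]
seen = []

def nearest_object(position, threshold=1.0):
    """Which object is the visitor standing by, if any?
    The threshold matches the ~1 m accuracy hoped for above."""
    label, point = min(OBJECTS.items(),
                       key=lambda kv: math.dist(kv[1], position))
    return label if math.dist(point, position) <= threshold else None

def serve(position):
    """Serve both narratives for the visitor's current spot."""
    obj = nearest_object(position)
    if obj and obj not in seen:
        seen.append(obj)
        print("About this object:", OBJECT_NARRATIVE[obj])
    # The linear story adapts: it points to the next beat the
    # visitor hasn't yet seen, in whatever order they wander.
    remaining = [o for o in STORY_ORDER if o not in seen]
    if remaining:
        print("The story continues at the", remaining[0])

serve((1.2, 3.8))  # standing by the portrait
```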
I’ve got the tech; now “all” I need to do is learn to code!
Unless anybody wants to help me…?