Let’s say — hypothetically — your VP of Drone Fleet Operations just asked you to help her handle drone management, route planning, payload optimization and more. What do you do? Well, there are a few approaches to tackling the problem. Approach #1 is all about controlling drones using the Salesforce1 Mobile app. That’s what I’m going to talk about today. Note that all of this is done with a free Developer Edition and a little code.
Although I won’t cover it here, there’s also a mildly entertaining yet entirely impractical YouTube artifact documenting my adventures at ThingMonk where together with the excellent Darach Ennis we were able to launch a quadcopter using a coffeepot.
Equipment & Architecture
Let’s start by looking at the equipment you’ll need. The first thing is a quadcopter or two. I used the Parrot AR Drone 2.0, available at pretty much every retailer worth their salt. The Parrot is great for a lot of reasons, but first and foremost is that it has a great API. Where you have an API, great things are possible, right? Now, the Parrot is also a toy, so you production-minded folks will probably want to upgrade to something more robust.
The way the AR Drone works out of the box is that it creates a WiFi hotspot. You then connect your controlling device to that AR Drone hotspot and operate it. Parrot sets you up with an app that runs on either an iOS or Android device. I’ve controlled them from both platforms and it works great. The default AR Drone configuration requires one controller per drone, and it requires that controller to be on the WiFi network provided by the drone. If you have two drones, they are isolated from each other by their network connections and there’s no interaction.
In order for this to work with Salesforce, and in order to control multiple drones at the same time, we have to somehow unify these devices, which using the out of the box configuration means the controller needs to bridge multiple networks. My go-to local interface box is typically the Raspberry Pi, and, fortunately, the Raspberry Pi is capable of supporting multiple network interfaces, which means it can also handle multiple network addresses. There are a few ways you could configure this, but I chose to use a single Raspberry Pi as a bridge between Salesforce and two other Raspberry Pis which connect to the AR Drones. It looks a little like this:
Now all you need is an app to handle the interface to the AR Drone. There are a lot of great ways to do this, and for this example I have used CylonJS and their ARDrone driver. (You might remember CylonJS from the Dreamforce Connected Device Lab. They do tons of great stuff in the robotics world and have cool libraries for JavaScript, Ruby and Go.)
Control via the Streaming API Pattern
My first approach on this project is to use the familiar Streaming API Pattern. (See Controlling Philips Hue from Salesforce1 for another example, or get started with a really simple streaming API example.) The Drone Gateway connects to Salesforce, listens to a Streaming API Push Topic and then forwards those instructions to a device as soon as they’re received.
On the Salesforce side of the house, we have to create a simple way to generate listenable data. This is easier than it sounds. The first thing we want is an sObject to store the data in. I’m re-using an existing object pattern I’ve used for other message-driven work, which I call “Drone Message.” The two key pieces of data it stores are the Address and Message. You can see in the screen capture that this one is sending the “land” message to address “D2”.
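To make that record shape concrete, here is a sketch of what the gateway will eventually see when the Streaming API pushes one of these records to it. The envelope follows the standard PushTopic event shape, and the field API names match the Apex code later in the post, but the sample values themselves are invented:

```javascript
// Hypothetical Streaming API push for a newly inserted Drone Message record.
// The event/sobject envelope is the standard PushTopic data shape;
// Address__c and Message__c are the two custom fields on Drone_Message__c.
var sampleEvent = {
  event: { type: "created" },
  sobject: {
    Address__c: "D2",
    Message__c: "land"
  }
};

// Pull out the two fields the gateway cares about before handing off.
function unpack(evt) {
  return {
    address: evt.sobject.Address__c,
    message: evt.sobject.Message__c
  };
}

var cmd = unpack(sampleEvent);
console.log(cmd.address + " -> " + cmd.message); // D2 -> land
```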
You can of course use the default UI to create the records, but that then requires you to know addresses and message codes. Since code handles those kinds of details better than my brain does, I created a simple Apex class to create these records.
public class DroneController {

    public void droneOneTakeoff() {
        insertMessage('D1', 'takeoff');
    }

    public void droneOneLand() {
        insertMessage('D1', 'land');
    }

    public void droneTwoTakeoff() {
        insertMessage('D2', 'takeoff');
    }

    public void droneTwoLand() {
        insertMessage('D2', 'land');
    }

    public void insertMessage(String address, String message) {
        Drone_Message__c m = new Drone_Message__c();
        m.Address__c = address;
        m.Message__c = message;
        insert m;
    }
}
And now all I need is a little Visualforce code to extend this out to the UI layer. Note that this Visualforce page is tied to the Apex code above using the controller attribute.
<apex:page controller="DroneController" standardStylesheets="false" showHeader="false">
    <style>
        h1 { font-family: sans-serif; }
        .large { font-family: sans-serif; font-size: 18pt; }
    </style>
    <h1>Drone Controller</h1>
    <apex:form>
        <h1>Drone 1</h1>
        <p>
            <apex:commandButton action="{!droneOneTakeoff}" value="Takeoff" styleClass="large"/>
            <apex:commandButton action="{!droneOneLand}" value="Land" styleClass="large"/>
        </p>
        <h1>Drone 2</h1>
        <p>
            <apex:commandButton action="{!droneTwoTakeoff}" value="Takeoff" styleClass="large"/>
            <apex:commandButton action="{!droneTwoLand}" value="Land" styleClass="large"/>
        </p>
    </apex:form>
</apex:page>
Now you need to make this Visualforce page available for mobile apps, create a tab for it and finally customize your mobile navigation options. These are all just a few clicks, so check out the links if you’ve never done it before — pretty easy. Out in the world of humans, this renders as a very simple page, the same one that you saw in the video clip above.
Now that we have a quick and easy way to create listenable messages, let’s take a quick look at the Drone Gateway that’s doing this listening. This is a pattern I’ve re-used a few times, so you might be familiar with it. The gateway authenticates, begins listening to a Streaming API Push Topic, and then handles whatever it receives. I chose to write this in Node.js and the code is pretty simple. The connection to Salesforce is detailed in the Philips Hue article, so I’ll just show you how it handles the message. Note the “address” and “message” arguments.
function handleMessage(address, message) {
    console.log("address: " + address);
    console.log("message: " + message);

    if (address == 'D1') {
        if (message == 'takeoff') {
            console.log("in d1 takeoff");
            handleRequest("http://10.0.0.2:1337/start");
        } else if (message == 'land') {
            console.log("in d1 land");
            handleRequest("http://10.0.0.2:1337/stop");
        }
    } else if (address == 'D2') {
        if (message == 'takeoff') {
            console.log("in d2 takeoff");
            handleRequest("http://10.0.0.3:1337/start");
        } else if (message == 'land') {
            console.log("in d2 land");
            handleRequest("http://10.0.0.3:1337/stop");
        }
    }
}
Now, you will have no doubt noticed that the above code is doing nothing more than making a call to a web server. When I was testing, I decided that an HTTP-based interface would also be fun, so I created a small server that simply responds to two URLs: start and stop. You can see that these map to the CylonJS commands for “takeoff” and “land”.
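As an aside, the if/else ladder in handleMessage grows quickly as you add drones or commands. One way to keep it flat is a lookup table keyed by drone address, with the start/stop paths derived from the message. This is just a sketch of that refactor, not the code from the repo; the handleRequest helper is assumed to exist as in the gateway code above, and the addresses match the ones used there:

```javascript
// Hypothetical table-driven alternative to the if/else version of handleMessage.
// Drone addresses map to drone server hosts; messages map to URL paths.
var hosts = { D1: "10.0.0.2", D2: "10.0.0.3" };
var paths = { takeoff: "/start", land: "/stop" };

// Build the drone server URL for a given address and message,
// or return null for anything unrecognized.
function urlFor(address, message) {
    var host = hosts[address];
    var path = paths[message];
    if (!host || !path) {
        return null; // unknown drone or command, ignore rather than crash
    }
    return "http://" + host + ":1337" + path;
}

function handleMessage(address, message) {
    var url = urlFor(address, message);
    if (url) {
        handleRequest(url); // same helper as in the gateway code above
    }
}
```

Adding a third drone or a new command then becomes a one-line change to the tables rather than another branch.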
http.createServer(function(req, res) {
    if (req.url == "/start") {
        console.log("ready to start");
        copter1.connections['ardrone'].takeoff();
    } else if (req.url == "/stop") {
        console.log("ready to stop");
        copter1.connections['ardrone'].land();
    }
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('Executed');
}).listen(1337, "10.0.0.2");
And there you have it. The start to finish message flow now looks like this:
- User presses takeoff on their mobile device.
- Salesforce1 inserts a Drone Message object for takeoff.
- The Streaming API picks up the new record and forwards it to listeners.
- The Node.js based Drone Gateway catches the new record, and sends it to the right address.
- The Node.js based Drone Server sends the specific command to the AR Drone.
Code notes and links:
- The Visualforce and Apex code is above; everything else is minor configuration.
- Drone Gateway code
- Drone Server code
My command center for the video shoot looks a bit more complicated, but it follows the diagrams above. Note the three Raspberry Pis and two network hubs on the lower left.
Wrap Up
As you can see from the video, it’s pretty easy to get the drones to follow some simple instructions. The primary challenge with this method is the inherent lag between when an instruction is issued and when it gets to the drone. This lag depends on a huge number of factors — Internet connection, gateway device performance, Streaming API performance, etc. — but the end result is the same. A drone moving at 5-6 meters per second will be in a completely different place by the time it responds to a delayed command.
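To put a rough number on that: the displacement is just speed times delay, so even modest lag translates into a lot of airspace. A quick back-of-the-envelope calculation, where the latency figures are illustrative rather than measured:

```javascript
// How far does the drone travel between "command issued" and "command received"?
// displacement (meters) = speed (m/s) * end-to-end latency (s)
function driftMeters(speedMps, latencySeconds) {
    return speedMps * latencySeconds;
}

// Illustrative numbers: a 5 m/s cruise with 1 to 3 seconds of end-to-end lag.
console.log(driftMeters(5, 1)); // 5
console.log(driftMeters(5, 3)); // 15
```

Fifteen meters is a long way for a toy quadcopter, which is why a delayed “land” can arrive somewhere you really didn’t intend.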
An interesting experiment that raises a lot of questions for me. First and foremost, what is the best way to spread device intelligence out among the components of a system? Which is to say, what role should Salesforce play in this kind of complicated interaction? My overall feeling is that this, while interesting, is lower level than truly practical today.
Hats off, Reid! What a fun Salesforce integration story if there ever was one.
http://www.ratsaas.com/quadrotor
Definitely the end-to-end latency makes one nervous about the quad!
Ah ha!! I thought I remembered seeing something like this. THANK YOU for posting the link — I was searching for quad copter not quad rotor. Anyway, super cool that you did UDP direct. I suspect the interesting future story is about WHERE to put the intelligence. Latency is an inherent issue any time you are several hops away from the device talking to the drone. So in the end I think SFDC will contain relatively high level instructions that are interpreted by the closer controller. Thoughts??
Thank you for the feedback!
You can set NETWORK:WIFI_MODE=2 to get the drone out of Access Point mode. Then it connects as a wifi client that you can expose directly on the internet via NAT / port forwarding, getting rid of a few hops.
Short of trashing the firmware, there are onboard Telnet + FTP services where instructions could be dumped or converted to AT commands, but the accelerometers drift very quickly and it will get lost 😉
Interesting. I’ve gotta take a deeper dive in here for sure. Thx for the pointer!
Hi Reid, We have a group of four students working on same stuff. I’ll point them to here to collaborate!
Awesome!
Hey Reid! Great Post! I am going to use this in a presentation to a bunch of web devs at work as an example of the neat things possible with Salesforce…
So if you didn’t have the latency, it would probably be hard to video, so maybe that’s kind of good in a way. #optimist. Totally sharing this with my brother who is interested in this stuff.
this shit is awesome