Reid Carlberg

Connected Devices, salesforce.com & Other Adventures

Great Playing Cards: Shuffle & Roll

In the “other adventures” department, I’ve been geeking around with playing cards, and I now have the tiniest e-commerce outlet you’ve ever seen focused on them. Check out the Great Playing Cards at Shuffle & Roll.

It’s been very interesting getting to know the playing card business. A lot of people use them for magic or cardistry, but just as many like them as a way to add spice to their everyday games. Love that.

I back 3-4 Kickstarter campaigns a month. December was a heavy one at six projects, including Queen Bee (I wanted to buy more of that than I did), Neo Tokyo, which just looks awesome (I probably should have bought more, but dark-mode cards aren’t my favorite), Blue Jay (I also wanted more of that), and Atlantis (yes, I wanted more of that too, but I was too late for the special gilded edition, dang it). I miss a lot of really great stuff, too, a list too long to include here.

When I find the good stuff, I’m super happy to share it.

Four Questions to Ask Yourself When Making Urgent Decisions

We’re all called on to make “urgent” decisions. It’s stressful, and it’s an easy time to make a mistake we wouldn’t otherwise make. Here are four questions I ask myself when these come up.

1. Is this decision really “urgent” to me? I keep putting urgent in quotes because, most of the time, these decisions don’t involve life or death, something that’s truly and obviously urgent. They’re about much more pedestrian things, and their actual urgency is a matter of perspective. Before you do anything, you need to figure out if the urgency behind the decision is real, or if it’s imposed by you or someone else.

2. What happens if I do nothing? Inaction is also a decision, and sometimes it’s a good decision, sometimes even the best. Since urgent decisions often result in error, once you’ve decided that something might be urgent to you, the first things you need to think about are the consequences of inaction. If they are trivial, or even non-existent, inaction may well be the wisest choice.

3. What is the narrowest decision I can possibly make? You’ve come to the conclusion that this issue is urgent to you, and that action is better than inaction. Now you need to identify the narrowest possible decision space within which to work. This usually means redefining the context within which you’re making the decision. Most of us have a surprising amount of detail in our brains, and urgent decisions stir up a lot of it. It’s tempting to include all of that detail when making an urgent decision, but this means you’re actually making a big decision under poor circumstances. You need to consciously put that detail aside and make the minimum viable decision.

4. What are the impacts of the decision I’m making? Now that you’ve made a decision, stop and think for a minute about what the decision means over the slightly longer term once you’re outside of your current urgent context. Does it create future urgent issues that will need to be handled? Does it artificially constrain your future possibilities? If your decision boxes you into a corner or creates a cascade of urgent issues, you should revisit this checklist and see if you can easily improve your decision. You might not be able to. Or you might not be able to see how you can improve your decision because you need some mental and emotional distance from the issue in order to better understand it. You should make the time to get that distance and revisit this list.

I’m not perfect at making decisions, but I’m better at it when I take time to think.  Urgency isn’t a clear-cut issue, and it’s important to think clearly about it when you’re in the middle of it.

No Watches at the Dinner Table

My wife and I have an important rule at the dinner table: no phones.  Yesterday I tweaked that rule to include watches.

I’ve had the Apple Watch for about a week. I’m not particularly passionate about it yet. I love the Fitbit-like functionality, the heart rate tracker and the push notifications. (Though this morning I forgot to wear the watch, just like I always did with the Fitbit.) I’m neutral about most of the rest of the apps on it so far.

Except the timer. I love the timer. And the astronomy watch face. (Go to the moon, twist the dial, and watch it change phases. Super cool!)

Anyway, last night I found myself checking my watch at dinner instead of engaging with my wife and kids. When I realized I was doing it, I took the watch off and put it in the kitchen with my phone.

Dinnertime is pretty important in our house. The kids are young, but we’re all running around, and it’s a good chance for us to interact and talk about our days, share our favorite parts and joke around. We focus. We’re all present. We connect.

I can imagine a time in the not too distant future when I’ll be wondering where my kids are and what amazing (and hopefully non-life threatening) things they’re up to. I’m sure for people who are there today, that special buzz of an inbound message from their child is amazing and it’s worth interrupting whatever else is happening.

By the time I’m ready for that — probably sooner than I think — what device will we all be glued to?  I’ve had a chance to use a lot of the most recent devices. A lot of them promise faster, easier access to the information we crave, a richer view of the real world provided with less effort. But they all still take effort, and as much as they give us, they also take something away.

Don’t get me wrong. I don’t dislike the watch. It’s just that it’s another thing, and things demand my attention, which is often the scarcest and most finite resource at my disposal.

(BTW, the screenshot above is a push notification sent from a little app I built to test the iOS SDK from the Salesforce Marketing Cloud. Pretty sweet that it just pops up. For the curious, here’s a basic demo of sending a push notification.)

My Next #DreamJob @Salesforce

It is not without some sadness that I write this post.

For the last five years, it has been my pleasure to dive deep into the Salesforce1 Platform community. I have shepherded the Salesforce1 Labs program, helped launch AppExchange Checkout, delivered the IoT Zone experience at Dreamforce 2013 and co-delivered it for Dreamforce 2014. I have met thousands of developers, admins and partners; diagnosed problems, challenges and opportunities; and talked with CEOs, CIOs and CTOs about their businesses. I have presented on a large number of topics at user groups and conferences around the world, and worked extensively with Salesforce’s uniquely awesome Sales Engineer, Account Executive, Customer Success, and Services teams.

And make no mistake: it has been my honor to work with a great crew of Developer Evangelists and an incredible Developer Relations team. They have been fantastic and inspirational co-workers. They have pushed me to be better and I have learned a ton from them in the process. I am incredibly grateful to have met them!

Yes, the last five years have been fantastic.

However, these words (penned by fellow Kenyon College grad Bill Watterson) have often bounced around in my head:

calvin-change

And so I have decided to embark on a new adventure.

Later this week I officially transition to the Salesforce Marketing Cloud where I will help manage a collection of products on the mobile team, including MobilePush. We have a great team working on these products, and a great crew of customers using them. It is a huge opportunity for Salesforce and I am very excited about the chance to shape it.

Best of all, I still get to work with the Salesforce developer community: one of the key outputs of my new team is a series of developer-facing mobile SDKs (check out the GitHub account).

These last few days have been more emotional for me than I predicted, especially my last regularly scheduled 1:1s with my team. Such a great crew!!

A big thank you to Salesforce for this new opportunity, and a big thank you to everyone I’ve worked with — in and out of Salesforce — over the last few years. Here’s to many more together!

Yes, this is change, but I’m not going far. You’ll still see me at Salesforce events and I’m still only an email or tweet away. Let’s definitely keep in touch, and if you have a question about Salesforce Marketing Cloud mobile apps, please reach out!

Science Expo Lesson #1 (LEGO Mindcub3r Revisited)

Our local school district had our annual K-6 Science Expo yesterday. It was a blast!

Surprising no one, I sold raffle tickets to help keep the expo free for all students. Surprising everyone (or maybe no one who reads this), I brought my Lego EV3 Mindcub3r to make the raffle ticket table a little more interesting.

Funny things happen when you use a robot to solve Rubik’s Cubes for hours on end.  At one point my EV3 was misbehaving, so I decided to check the batteries and the poor little AAs were actually hot!  I replaced them, and everything was OK.

The most challenging piece was the mechanism that flips the cube. It has a tendency to slip during the solve phase, and this problem seemed to get worse as the day went on. Being that we were at a Science Expo, the kids and I decided to come up with a hypothesis and then run some tests.

Hypothesis #1: we needed more weight. So we tried adding weight. We added a couple of small pencils. A AA battery. 2 batteries, then 3. Nothing. If anything, it seemed to make the problem a bit worse.

Hypothesis #2: we needed more friction at the point where the Mindcub3r does the flip. So I added a small piece of, you guessed it, duct tape.

Duct Tape Fixes Everything

And voila! The solve rate went back up to a pretty reasonable 80-90% and, more importantly, the kids learned a very valuable lesson:

Duct tape fixes everything.

Mo Tester Octothorpe: Now with Lightning Process Builder & Lightning Connect Examples

I updated Mo Tester to include a couple more Lightning Examples. Install link and code are here.

It now includes the External Data Source Pat created for the Lightning Connect Tutorial. Select the “LAB Mo External Data” app to see the tabs.

Mo Tester also includes a Lightning Process Builder example. It fires when you create a Mo Tester 1 record with a particular name, either “Chatter” or “Later”. “Chatter” posts immediately, “Later” posts an hour later.

mo-tester-octothorpe

Access IoT Data in Salesforce using Heroku Connect OData

Yesterday I wrote about getting IoT Data into Heroku.  It’s pretty cool, but other than looking at a Heroku PostgreSQL Dataclip, there’s not a lot you can do with the super simple IoT host app I built.  Which made me wonder:

OK, I have the data. Now what?

Well, my usual tool of choice is Force.com, but this kind of high-frequency data, where each reading differs only slightly from the last, doesn’t work well when fully imported into that environment. This got me thinking about two relatively new and completely exciting things: Lightning Connect and Heroku Connect External Objects.

Lightning Connect (earlier blog, cool tutorial) is a product that includes external data in Salesforce by reference — the data is accessed via the OData protocol and is never copied into your org.  Hence it does not count against storage limits. Heroku Connect (Pat’s awesome blog) is all about easy data synchronization between Salesforce and a Heroku PostgreSQL database. But Heroku Connect also includes an External Objects (@vzmind‘s awesome blog) option that publishes any Heroku PostgreSQL table you request in OData format.

So, um ….. HOLY SHIT I WONDER IF THIS WILL WORK.

I started by adding the Heroku Connect add-on and then opening it.

heroku addons:add herokuconnect
heroku addons:open herokuconnect

Heroku Connect demo edition is free, which is great, but you’ll need to file a ticket to have External Objects activated. Once that is done it’s super simple. Note: External Objects are activated per app, not per user. If you create a new app, you’ll need to request them again.

Start by creating a connection:

heroku-connect-create-connection

Next, initialize your username and password.  These will arrive in your email.

heroku-connect-start

You’ll now see your OData URL as well as a list of tables you might want to share.  Note that the listing of available tables appears to be an asynchronous process — it takes a minute for them to show up. Grab a cup of coffee. Then check the ones you want to share.

heroku-connect-readings-shared
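Before wiring up Salesforce, I like to sanity-check the feed by hitting the OData URL directly with the username and password from the previous step. Here’s a minimal Node.js sketch of that check; the hostname, table path and credentials below are placeholders, not values Heroku Connect actually gave me, so substitute your own.

var https = require('https');

// Placeholders: swap in the OData URL and credentials Heroku Connect emailed you.
var options = {
  hostname: 'odata-connect-example.herokuapp.com',   // hypothetical host
  path: '/odata/readings',                           // hypothetical shared-table path
  auth: 'your-odata-username:your-odata-password',   // HTTP Basic auth
  headers: { 'Accept': 'application/json' }
};

https.get(options, function(res) {
  console.log('Status: ' + res.statusCode);
  var body = '';
  res.on('data', function(chunk) { body += chunk; });
  res.on('end', function() {
    // Raw OData payload. A 200 plus JSON here means Salesforce should be able to read it too.
    console.log(body);
  });
}).on('error', function(err) {
  console.error(err);
});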

External Data Source configuration is super easy (with many examples), the only difference here being that the Identity Type is Named Principal and the Authentication Protocol is Password Authentication.  Settings example below based on my setup.

force-external-data-source-with-username-password

And now we have the Heroku Readings object in Salesforce!

heroku-readings-in-salesforce

Which is super cool.  But we can make it cooler by connecting this IoT data with something meaningful in Salesforce.  For example, a Device record. As long as I have an External ID field on the Device, I can create an Indirect Lookup between that Device and the Heroku Readings by modifying the Reading’s “Device Name” field type.

create-indirect-lookup-between-device-and-data

And now we have live IoT data meaningfully connected to Salesforce data without importing it and using up storage.

heroku-connect-device-with-data

Which, I gotta say, feels pretty much like a HOLY SHIT moment to me.

Node.js IoT Host on Heroku: The Simplest Thing that Might Possibly Work

TL;DR – Look ma — CODE that stores DATA which becomes usable in SALESFORCE via OData.

For the last couple of days, I’ve been talking about some hardware I’ve put together (here and here).  These efforts all come from a pretty simple problem I have:

My house is old, drafty and hard to heat. And I live in Chicago.

This problem is particularly noticeable in January and February, but July and August can be just as bad in the other direction, so I need to address it somehow. Ugh.

arduino-then-what

Well, that which can be measured can be improved, right?  So that’s where I’m going to start. Measurement is actually the easy part; Simple Data Recorder handles that.  Where I’ve struggled is figuring out where to put the data I measure, how to put it there, and then what to do with it.  This post covers the first two.

Most of the time, I work on the Force.com side of the Salesforce1 platform.  I’m extremely comfortable in it, but the free Developer Edition will run out of storage very quickly if you start throwing a new record at it every minute or so. Enter the other side of Salesforce1: Heroku.

Naive IoT Host, deployed to Heroku, solves my storage problem. I’m using the free tier of everything so far, and should be able to for ~20 days at my current rate of data production. You can see a live dataclip of conditions in my house right now should you be interested. If you’ve even glanced at the code, you know it won’t scale to 100,000 devices any time soon.  But it will do a couple of dozen just fine, and that’s just the scale I need for this experiment.

The high level end to end looks something like this:

arduino-rpi-heroku-nodejs-postgres

Everything starts out on the Arduinos.  The Raspberry Pi polls the Arduinos for data, transforms that data into JSON, and then posts that to a Node.js app running on Heroku. The Node.js app augments that data with current weather conditions and then writes it to a PostgreSQL table.  I have it set to do this about every 5 minutes.
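For reference, the Raspberry Pi side is basically a timer that gathers the latest readings and POSTs them as JSON. A stripped-down sketch of that step is below; the hostname, route and payload fields are illustrative placeholders (the fields are shaped like the Sequelize model later in this post), not the exact code I run.

// Runs on the Raspberry Pi: POST one JSON reading to the Heroku app.
// Hostname, route and field names are placeholders for illustration.
var https = require('https');

function postReading(reading) {
  var body = JSON.stringify(reading);

  var req = https.request({
    hostname: 'my-iot-host.herokuapp.com',  // hypothetical app name
    path: '/readings',                      // hypothetical route
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(body)
    }
  }, function(res) {
    console.log('Heroku responded with ' + res.statusCode);
  });

  req.on('error', function(err) { console.error(err); });
  req.write(body);
  req.end();
}

// Example payload, shaped like the Sequelize model shown below.
postReading({
  deviceName: 'D001',
  rangeStartDate: new Date().toISOString(),
  rangeEndDate: new Date().toISOString(),
  maxBrightness: 80,
  minBrightness: 42,
  maxTemperature: 21.5,
  minTemperature: 20.75
});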

The basic Node.js app is built using Express, but I haven’t spent any significant time optimizing it.

I’m using ipfilter to ensure that only someone from a whitelisted IP address can make a request on the app.  If you aren’t calling from some recognized IP, you get an “unauthorized” response.
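On the Heroku side, the receiving route boils down to something like the sketch below. Rather than reproduce the exact ipfilter configuration, this shows the same whitelist idea written out by hand, with placeholder IPs and a placeholder route, so treat it as the shape of the thing rather than my actual app.

// Minimal Express receiver with a hand-rolled IP whitelist.
// In the real app the ipfilter middleware does this job; IPs and route here are placeholders.
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.enable('trust proxy');   // so req.ip reflects the caller behind Heroku's router
app.use(bodyParser.json());

var allowedIps = ['127.0.0.1', '203.0.113.42'];  // placeholder whitelist

app.use(function(req, res, next) {
  // Strip any IPv6 prefix (e.g. ::ffff:203.0.113.42) before comparing.
  var ip = (req.ip || '').replace(/^::ffff:/, '');
  if (allowedIps.indexOf(ip) === -1) {
    return res.status(401).send('unauthorized');
  }
  next();
});

app.post('/readings', function(req, res) {
  // In the real app this is where the reading gets augmented with weather data
  // and handed to Sequelize; here we just echo it back.
  console.log(req.body);
  res.status(201).json(req.body);
});

app.listen(process.env.PORT || 3000);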

wtf-sql

Sequelize coordinates the database activities.  For some reason I didn’t look at an ORM when I wrote that tweet a couple of months ago. Sequelize isn’t perfect, but it’s super fast as a getting-started tool, and it’s way better than anything I might write.  You start by declaring a model, which it then automatically creates for you in the actual database.  It doesn’t do schema modifications (migrations) as far as I can tell — there’s some kind of a separate module for that which I haven’t investigated yet.

var Reading = sequelize.define('Reading', {
	deviceName: Sequelize.STRING,
	rangeStartDate: Sequelize.DATE,
	rangeEndDate: Sequelize.DATE,
	maxBrightness: Sequelize.INTEGER,
	minBrightness: Sequelize.INTEGER,
	maxTemperature: Sequelize.DECIMAL(4,2),
	minTemperature: Sequelize.DECIMAL(4,2),
	weatherSummary : Sequelize.STRING,
	weatherTemperature: Sequelize.DECIMAL(4,2),
	weatherWindSpeed: Sequelize.DECIMAL(4,2),
	weatherWindBearing: Sequelize.INTEGER
});
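To complete the picture, here’s roughly how that model gets a connection and a row. This is a sketch assuming the promise-based Sequelize API of this era and Heroku’s standard DATABASE_URL config var; the values in the example row are made up.

// Wire the model above to Heroku Postgres and store one reading.
var Sequelize = require('sequelize');
var sequelize = new Sequelize(process.env.DATABASE_URL, {
  dialect: 'postgres',
  logging: false
});

// ... Reading is defined here exactly as in the snippet above ...

sequelize.sync()   // creates the Readings table on first run
  .then(function() {
    return Reading.create({
      deviceName: 'D001',
      rangeStartDate: new Date(),
      rangeEndDate: new Date(),
      maxBrightness: 80,
      minBrightness: 42,
      maxTemperature: 21.5,
      minTemperature: 20.75,
      weatherSummary: 'Clear',
      weatherTemperature: -3.25,
      weatherWindSpeed: 4.5,
      weatherWindBearing: 270
    });
  })
  .then(function(reading) {
    console.log('Stored reading ' + reading.id);
  });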

Lastly, the Forecast module connects me to Forecast.io to get the current outside weather conditions.  I thought this would be useful information to keep in mind as I start to look at the data. Now, I could have captured the conditions at the time of each reading in a number of ways, including querying for them at analysis time (whenever that turns out to be), but the incremental cost of including them with each reading was so trivial I decided to do it.
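I won’t reproduce the module configuration here, but the underlying request is simple. The sketch below goes straight at the Forecast.io REST endpoint (roughly what the API looked like at the time) and pulls out the fields that appear to map onto the weather columns in the model above; the API key and coordinates are placeholders.

// Fetch current conditions from Forecast.io and map them onto the weather* columns.
// API key and lat/long are placeholders.
var https = require('https');

var apiKey = 'your-forecast-io-api-key';
var lat = 41.88, lon = -87.63;   // roughly Chicago

https.get({
  hostname: 'api.forecast.io',
  path: '/forecast/' + apiKey + '/' + lat + ',' + lon
}, function(res) {
  var body = '';
  res.on('data', function(chunk) { body += chunk; });
  res.on('end', function() {
    var currently = JSON.parse(body).currently;
    console.log({
      weatherSummary: currently.summary,
      weatherTemperature: currently.temperature,
      weatherWindSpeed: currently.windSpeed,
      weatherWindBearing: currently.windBearing
    });
  });
}).on('error', function(err) { console.error(err); });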

And voila — for your enjoyment — the data.

OK, so far so good.

Next: Accessing the data in Salesforce using Heroku Connect External Objects OData

Connect I2C Devices Using Cat5 and RJ45 Ports

The other day I shared my approach to reading data off a couple of Arduinos with a Raspberry Pi using I2C and Node.js. Works great, but the wiring was ugly and brittle.

2arduinos1pi

Mess or not, the wiring did teach me two things: I2C works as long as there is conductivity, and yes, you can power a couple of Arduinos from the RPi. I2C doesn’t require a particular wiring sequence or a whole bunch of cables extending from a central point to a single device, so when I decided to beef it up a bit, I started to imagine a way of connecting devices over a small distance. I was hopeful it would be easy and not require 100% custom wiring. The requirements boiled down to something like this:

raspberry pi arduino i2c

I decided to look for a solution built around Cat5 cable — it’s cheap, has great connectors and I’ve worked with it a fair amount. I found it at Microcenter.  I’m using a couple of el cheapo cable splitters and an RJ45 F-F joiner for most of the connection, and it works great. I still need to solder half a Cat5 cable for the Raspberry Pi and another half for each device, but I can now put the Arduinos (somewhat) arbitrarily far away from the RPi node and add more until my power runs out.

raspberry-pi-arduino-i2c-cables

Still not ideal, but like I said, it works great.  Although relatively inexpensive, the splitters are about $10 each, and I could do without stripping the Cat5 wires and soldering them to pins.  Next up I think I’ll add a couple of RJ45 ports to a proto shield and connect that directly to each Arduino.  That’s a little more soldering, but it will be easier soldering, so it should be faster.

Here’s what the current system actually looks like in my attic.  Note that the Arduino and RPi are far enough away that it’s impractical to photograph them together.

raspberry-pi-i2c-cat5-example

Questions I’ve pondered:

Why am I doing this instead of just getting an Arduino with an Ethernet shield?  I’m trying to solve two problems: communication and power.  If I go with the Ethernet shield, I’m in it for $60 each once I include a Power over Ethernet (PoE) module, which is a lot.  It solves the power problem and the connectivity problem, but I still have to wire Ethernet.

Why don’t I just go wireless? Well, I could.  In fact, I have a bunch of XBees sitting around from Dreamforce 13 that I could harness.  However, I’d still have to get power to each one of these devices, so I’ve only solved one problem.

How will I scale this beyond my attic? I’ll probably need to scale this to the other floors of my house to get the data I want.  I suspect the easiest way to scale it will be to add a nearly identical system on each floor.  Otherwise I’ll have to deal with stringing wires, and I hate stringing wires.

Read Data from Multiple Arduinos with a Raspberry Pi B+ using I2C and Node.js

Last year I wrote about controlling an Arduino using a Raspberry Pi and I2C.  I2C being, of course, a 30+ year old protocol that everybody and their brother still supports.  I tried this again, for reasons I’ll state in the very last sentence of this blog, and it failed. Significantly.

Why? Well, turns out I had forgotten the cardinal rule of I2C.

Link your devices, master and slave, sender and receiver, using a shared ground wire.  Otherwise, you’re going to have a bad time.  Really.

Mmm’k, with that out of the way:

1) The Adafruit tutorial is still the best one I’ve found for enabling I2C on your Raspberry Pi. However, raspi-config now helps you do this under advanced options.  I used the advanced options and then double-checked everything Adafruit told me to do. (Rule #1: do what Adafruit says.)

2) Add your user to the i2c group.  The command is “sudo adduser pi i2c“. This will make your life easier.  Thank you ABC Electronics.

3) Configure your Arduinos as slave senders.  The example code that comes with the Wire library in the Arduino IDE is perfect for this. Wire ground to ground, SDA to SDA, and SCL to SCL. Once you add power to the Arduino, install the slave sender example code and boot your Pi, you should be able to see your I2C devices using the i2cdetect -y 1 command.  Mine with two Arduinos looks like this:

i2cdetect

My code looks a little different from the slave sender sample, since I have a photocell and a temperature probe on each one.  There’s a fair amount of boilerplate code to read those values, but the fundamental structure is the same as the sample code: configure an I2C address, wait for a request, send a fixed-length string in response.

#include <Wire.h> 
#include <OneWire.h>

// Define analog pin
int sensorPin = 0;
int lightLevel = 0;

int DS18S20_Pin = 2; 
OneWire ds(DS18S20_Pin);  // on digital pin 2

String deviceId = "D001";
int wireId = 4;

String lightLevelString = "";
String output;
int m;
char c[5];
char d[10];

// Setup
void setup() {
 Wire.begin(wireId);
 Wire.onRequest(requestEvent);
 // Init serial
 Serial.begin(9600);
}
 
// Main loop
void loop() {
 
 // Read the photocell and map it to a 1-100 brightness value
 int sensorValue = analogRead(sensorPin);
 
 lightLevel = map(sensorValue, 10, 1000, 1, 100);
 m = sprintf(c, "%03d", lightLevel);
 
  float temperature = getTemp();
  
  dtostrf(temperature, 6, 2, d);

  output = deviceId;
  output.concat("|");
  output.concat(String(c));
  output.concat("|");
  output.concat(String(d));
  
  Serial.println(output);
  
 
 delay(1000);
}

void requestEvent() {
  
  char newOut[16];
  output.toCharArray(newOut, 16);
  
  Wire.write(newOut);
}

float getTemp(){
  //returns the temperature from one DS18S20 in DEG Celsius

  byte data[12];
  byte addr[8];

  if ( !ds.search(addr)) {
      //no more sensors on chain, reset search
      ds.reset_search();
      return -1000;
  }

  if ( OneWire::crc8( addr, 7) != addr[7]) {
      Serial.println("CRC is not valid!");
      return -1000;
  }

  if ( addr[0] != 0x10 && addr[0] != 0x28) {
      Serial.print("Device is not recognized");
      return -1000;
  }

  ds.reset();
  ds.select(addr);
  ds.write(0x44,1); // start conversion, with parasite power on at the end

  byte present = ds.reset();
  ds.select(addr);    
  ds.write(0xBE); // Read Scratchpad

  
  for (int i = 0; i < 9; i++) { // we need 9 bytes
    data[i] = ds.read();
  }
  
  ds.reset_search();
  
  byte MSB = data[1];
  byte LSB = data[0];

  float tempRead = ((MSB << 8) | LSB); // using two's complement
  float TemperatureSum = tempRead / 16;
  
  return TemperatureSum;
  
}

4) Install Node.js on your RPi, and then use npm to install the i2c library.  You should now be able to read your Arduinos via Node.js.  My code declares two devices, both with addresses matching the settings in my Arduino code, and it looks like this:

var i2c = require('i2c');
var address = 0x18;

var device1 = new i2c(address, {device: '/dev/i2c-1'});
device1.setAddress(0x4);

var device2 = new i2c(address, {device: '/dev/i2c-1'});
device2.setAddress(0x6);

var devices = [ device1, device2 ];

function handleTimeout() {
  setTimeout(function() { handleRead(); }, 1000);
}

function handleRead() {
  for (var i = 0; i < devices.length; i++) {
    devices[i].readBytes(null, 15, function(err, res) {
      console.log(res.toString('ascii'));
    });
  }
  handleTimeout();
}

handleTimeout();

The output is pretty simple — it just prints the data it receives, like so:

i2c-dataexample

My final setup consists of two Arduinos (D001 and D002 in the data) connected to one Raspberry Pi B+.  You’ll notice that the RPi is powering both Arduinos.  (Red is power. Black is ground. Blue is SCL. Yellow is SDA.)  Everything is integrated via the breadboard.

2arduinos1pi

Lessons learned (and remembered):

1) Always connect the grounds.  If you’re not seeing your minimally configured, powered and connected slave-sender device, check the ground first.

2) You can easily power a couple of Arduinos from the RPi.  Nice.  I wonder how many you can power. I’m sure it depends on what you’re doing.

3) The SDA and SCL wires seem to be pretty flexible. You don’t have to chain them in a particular order as far as I can tell. Just make sure there’s some kind of connection and you’re good to go.

4) Arduinos, with built-in analog-to-digital converters, are a lot less painful (and more reliable) to work with when it comes to reading basic sensors and sharing that data with an app running on the RPi.  My original reason for revisiting this use case was dissatisfaction with reading a photocell directly on the RPi using Python (tutorial), and Arduinos are cheap — $10 at MicroCenter.
