Reid Carlberg

Connected Devices, salesforce.com & Other Adventures

Experiment Faster with Lightning Components & Force.com (Mo Tester Update)


Update from 2015-03-06: Mo Tester Octothorpe

TL;DR: Updated Mo Tester; install link in the GitHub repo.

Back in the day I wrote a post called Experiment Faster on Force.com where I confessed to blatantly stealing an idea from Mr. Steve Molis himself.

It was SteveMo’s idea to have a dev org with a bunch of objects already built, objects with fields of every shape.  My add was to create a package out of it, so that other users could try things out more quickly than if they had to build everything themselves.

Hence: Mo Tester.  Mo Tester is lab equipment for your Developer Edition.  It gives you a pre-defined framework for messing around with many kinds of Force.com goodies, and now includes some basic Lightning Components for your enjoyment.

This update:

  • has a new Visualforce controller (which inspired a post on Refactoring Visualforce Controllers for Lightning Components)
  • includes a Mo Tester Lightning Tab suitable for immediate inclusion in your DE’s mobile navigation
  • demonstrates several Salesforce1 Lightning Component events including (drum roll) a SHOW TOASTER button.

Hopefully this update helps you go faster as you’re discovering what you can do on the platform.

Questions? Comments? Feedback? LMK!

@ReidCarlberg

Refactoring Visualforce Controllers for Lightning Components


You might remember Mo Tester, the package that helps you Experiment Faster on Force.com. I’m tweaking it for some Lightning Component stuff (spoiler alert) and ran into this.  (Posting to the Lightning Newbie Notes too.)

The following code won’t compile. You’ll get a handy “Return type does not support AuraEnabled” error because — wait for it — the @AuraEnabled annotation doesn’t support PageReference return types.

@AuraEnabled
public PageReference createMoTester1() {
    SmartFactory.FillAllFields = true;
    LAB_Mo_Tester_1__c t = (LAB_Mo_Tester_1__c) 
           SmartFactory.createSObject('LAB_Mo_Tester_1__c');
    insert t;
    return new PageReference('/'+t.id);
}

You need to refactor your Visualforce controller to have two methods, one for your existing code, one for your new Lightning Component.  Also, the @AuraEnabled method needs to be static.

public PageReference createMoTester1() {
    Id myMo = LAB_CreateDataHelper.getNewMoTester1Id();
    return new PageReference('/'+myMo);        
}    

@AuraEnabled
public static Id getNewMoTester1Id() {
    SmartFactory.FillAllFields = true;
    LAB_Mo_Tester_1__c t = (LAB_Mo_Tester_1__c) 
           SmartFactory.createSObject('LAB_Mo_Tester_1__c');
    insert t;  
    return t.id;                
}

Note that in this case, based on the way my test classes are structured, this didn’t affect my code coverage. It was 100% before, it’s 100% after.
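
For the curious, here’s roughly why: because the Visualforce action method now delegates to the @AuraEnabled static, a single test call exercises both. This is only a minimal sketch — the test class name is made up, and it assumes the controller (which I’m calling LAB_CreateDataHelper here) has a no-arg constructor; the real package structures its tests differently.

@isTest
private class LAB_CreateDataHelperTest {
    // Hypothetical test; calling the VF action method also runs the
    // @AuraEnabled static it delegates to, so coverage stays the same.
    @isTest
    static void createMoTester1CoversBothMethods() {
        LAB_CreateDataHelper controller = new LAB_CreateDataHelper();
        PageReference result = controller.createMoTester1();
        System.assertNotEquals(null, result, 'Expected a redirect to the new record');
        System.assertEquals(1, [SELECT COUNT() FROM LAB_Mo_Tester_1__c]);
    }
}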

Incidentally, not supporting PageReference return types is perfectly logical when you think about it.  PageReferences are a Visualforce thing after all.

Also note that, yes, I’m bewildered by my choice of method names here, too.  Also, the original package doesn’t seem to use this method.  And the test coverage in the package didn’t include coverage for the StandardController and StandardSetController dependent constructors. What was I thinking???

Looking at old (ish) code is fun.

Update 2/23/15 — I did use the button, on the list view.  I just didn’t inspect hard enough.  Phew & Sheesh!

mo-tester-button-found

 

Also the method has to be static.  Whoops!

npm install parenting


My young son (aka The Critter) and I talk about a lot of things, including the fairly abstract work I do.  I wanted to make it more concrete for him, so, for fun, I created a super simple Node library which is now, shockingly, available via the esteemed npm and of course on GitHub.

The gist? Well, you put in a number of things you might say to your child as prompts, and the computer can then say them out loud. It comes with defaults, you can create your own, and it of course has an auto mode where it just says one right after the other.

Useful phrases like: Reid, use your napkin.  Reid, clean your room.  Reid, stop doing that.

You get the idea!

It must have connected a little because here you can see him changing the configured name to be that of his sister!

Haven’t tested on a Windows machine.

parenting in action!

 

 

 

Salesforce Lightning Component Newbie Notes


Like a lot of you, I recently started down the path of learning Lightning Components. These are my notes and code snippets based on a series of questions I asked as I went through the docs and tutorials. I hope you find them useful.

Salesforce Lightning Component Newbie Notes

This is the first time I’ve used Jekyll and markdown to create a relatively extended blog post.  Pretty convenient!

LEGO Mindstorms: Solve Rubik’s Cube in 90 Seconds


If you’re looking for a last minute holiday gift for the burgeoning genius on your list, go get a LEGO Mindstorms EV3 set.  Yes, they’re expensive, but they’re worth it.

Want proof?  Watch this:

Yes, that’s from a single set, and it solves the infamous Rubik’s Cube in about 90 seconds. Amazing, right? Yes, it is. And my kid has been talking about it non-stop since I stayed up a little late to build it on Friday night.

I didn’t design this, I just followed the excellent instructions for Mindcub3r from David Gilday.

So totally worth it.

Simple Salesforce Lightning Connect Example (External Objects)


Lightning Connect includes External Data Sources and External Objects.  You can try this in your free developer edition. There’s a nice blog that talks about setting up a more custom data service and covers a bit more about why this is super cool.  Andy Mahood also blogged about this recently, including more details on related lists and Chatter.

Lightning Connect is the new service from Salesforce that lets you bring OData content into your Salesforce org. I was curious how to test it, so I thought I’d share what I found.

Step 1: Locate Sample Data

OData is a protocol for exchanging .. er … data, so it makes sense that you could probably find some sample data to connect to. A bit of the old Google led me to this sample data page where you can click on a tab to see “V2” sources.

salesforce-sample-odata

When you click on the Northwind sample, you get a bunch of XML (hooray). You’ll want to copy the URL in your browser window at this point.

salesforce-sample-odata-xml

Step 2: Create a new External Data Source

Back in your free developer edition, navigate to Setup > Develop > External Data Sources, and fill in the form as shown in the image. In the Server URL, paste the .. er … server URL you just copied above.

salesforce-external-data-source-new

Important: once you click “Save” you then need to click the button that says “Validate and Sync” before you can move on.

salesfroce-external-data-validate-sync

Once that’s complete, you will be able to select the specific object you’re interested in. For this example, select “Categories” and click on “Sync”.

salesforce-external-data-select

After a moment, you will return to the external data source screen and your external objects related list will look like this — sweet!

salesforce-external-objects-related-list

Step 3: Create a Tab for the External Object

If you click on the “Categories” label in external objects related list, you’re going to see an object definition screen that looks shockingly familiar. Yes, it even includes a “Page Layout” section at the bottom.

salesforce-external-object-definition

Adding this to a tab is super easy. Navigate to Setup > Create > Tabs, and create a new tab for a custom object. That’s it!  Now you can easily navigate to particular records, work with list views, etc — all of the stuff you can easily do with custom objects.

salesforce-external-object-related-list

Step 4: Do a SOQL Query

If you’re like me, you are by now fairly curious about accessing the data in Apex and in a standard SOQL query. Head on over to Workbench. When you log in, be sure to select API version 32.0. Jump to SOQL queries, and select the “Categories__x” object.

workbench-SOQL

And in the .. er … category of least surprising thing ever, you can see it works!

workbench-soql-results

But what about Apex, you say? Yes, also just that easy, as you can demonstrate for yourself using Workbench’s Apex Execute utility.

workbench-apex-execute
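
If you’d rather not squint at the screenshot, here’s the kind of anonymous Apex you can paste into the Execute window. One caveat: the custom field API names below are only my guess at what the sync generates for the Northwind Categories entity (ExternalId is a standard field on external objects) — check your own Categories__x field list before running it.

// Query the synced external object just like any custom object.
// CategoryName__c and Description__c are assumed field names from the sync.
List<Categories__x> categories = [
    SELECT ExternalId, CategoryName__c, Description__c
    FROM Categories__x
    LIMIT 10
];
for (Categories__x c : categories) {
    System.debug(c.ExternalId + ': ' + c.CategoryName__c);
}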

Pretty Sweet!

Give it a shot and let me know what you think. @ReidCarlberg

 

 

 

 

Five Enterprise Internet of Things (IoT) System Architecture Patterns


Updates based on feedback at the bottom.

I’ve said before that all Internet of Things (IoT) projects should start with a great use case that matters. In this post I want to discuss the next logical question: how to architect a system capable of handling that use case. I divide these use-case-centric IoT deployments into five architectural patterns, which I call Anonymous, Direct, Buffered, Gateway and Cloudy. I’ll first introduce a simple metaphor for thinking about IoT systems, then describe the core architectures, discuss their capabilities and limitations, and review some interesting variations.

Note that I use Salesforce products here to represent key business systems, but each architecture could be adapted for use with other offerings (although you should really have Salesforce in the mix IMHO).

Also note that these architectures are more about generating actionable insights than they are about managing devices deployed in the field. Great Salesforce partners like Devicify and Cycle30 can help with these issues.

The Metaphor & Key Considerations

The easiest metaphor for a robust Internet of Things system is the nervous system. Signals come in from the extremities, they are processed in the brain, and muscles take action. In an IoT style solution, sensors send signals up the chain, those signals are aggregated and analyzed, and new signals are sent to actuators of some kind. Note that these actuators may be devices, other systems or people.

iot-system-architecture-metaphor

(You might enjoy reading about the nervous system over on Wikipedia, which is also where I found this great image.)

When you’re designing your system, you should consider:

  • Device Sophistication: What is the minimum viable device I can use to gather the signals I’m interested in?
  • Reporting Frequency: How often will that device need to report data in order for me to accurately analyze and understand the conditions I’m interested in?
  • Next Action: Once the information is aggregated and analyzed, what happens next and is it driven by a person or an automated system?
  • Distance: How many steps are there between understanding the need for action and the ability to take action?

A key difference between a standard animal nervous system and a robust IoT system is that the nervous system has private pathways for signals to flow. Most IoT systems will be running on the public Internet, and so will have significant concerns around signal encryption. That is the subject for a different article.

Anonymous

Anonymous implementations take advantage of APIs originally designed for use with basic websites. In the Salesforce world, these would be the Web2Case and Web2Lead APIs. These are incredibly easy to deploy by design, but, also by design, they lack many of the features considered standard in today’s business and IT climate, including authentication that clearly identifies the individual user or device reporting an issue. Example implementation: Create a case in Salesforce with the Staples Easy Button. Device sophistication: high. Reporting frequency: very low. Next action: human intervention. Distance: low.

anonymous-enterprise-iot-architecture-pattern
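
To make that concrete: all the Easy Button hack has to do is reproduce the form-encoded POST a standard Web-to-Case HTML form sends. Here’s a rough sketch of that request, written as Apex purely for readability — a real device would send it from its own HTTP stack, the org ID is a placeholder, and you should grab the exact endpoint from the Web-to-Case HTML that Setup generates for your org.

// Illustration only: the same form-encoded POST a Web-to-Case form makes.
// Org ID is a placeholder; a device would do this from its own firmware.
HttpRequest req = new HttpRequest();
req.setEndpoint('https://www.salesforce.com/servlet/servlet.WebToCase?encoding=UTF-8');
req.setMethod('POST');
req.setHeader('Content-Type', 'application/x-www-form-urlencoded');
req.setBody('orgid=00D000000000000' +
    '&subject=' + EncodingUtil.urlEncode('Easy Button pressed', 'UTF-8') +
    '&description=' + EncodingUtil.urlEncode('Aisle 4 needs restocking', 'UTF-8'));
HttpResponse res = new Http().send(req);
System.debug(res.getStatusCode());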

Direct

The first step above Anonymous is Direct, where the device knows how to authenticate and is powerful enough to communicate securely using standard REST APIs. Direct integrations require standard credentials embedded in each device, and so are generally appropriate for higher order systems and higher value-add situations. This architecture isn’t well suited to high volumes of data or transactions, but works great for systems that report, for example, once per day. Example implementation: Control Philips Hue from Salesforce. Device sophistication: high. Reporting frequency: low. Next action: human intervention. Distance: low.

direct-salesforce-enterprise-iot-pattern
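
For flavor, here’s roughly what a Direct-style insert against the standard REST API looks like — again sketched in Apex for readability, since the real call comes from the device using the OAuth credentials baked into it. The Reading__c object, its fields and the token are all invented for the example.

// Sketch of a Direct-pattern record insert via the standard REST API.
// Object, fields and token below are placeholders.
HttpRequest req = new HttpRequest();
req.setEndpoint('https://yourInstance.salesforce.com/services/data/v32.0/sobjects/Reading__c/');
req.setMethod('POST');
req.setHeader('Authorization', 'Bearer 00D...device_access_token');
req.setHeader('Content-Type', 'application/json');
req.setBody(JSON.serialize(new Map<String, Object>{
    'Temperature__c' => 72.4,
    'Device_Serial__c' => 'A-1001'
}));
HttpResponse res = new Http().send(req);
System.debug(res.getStatus() + ' ' + res.getBody());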

Buffered

The Buffered architecture enables systems to communicate at higher transaction volumes without requiring the destination system to be aware of the originating devices in a detailed way. Buffered architectures simplify IoT systems by allowing you to create a custom API and to define your own interaction models. On the Salesforce1 Platform, you can easily create a buffered architecture by combining Heroku services with Force.com services. Heroku would be responsible for device interaction and information aggregation. The system would then sync summarized information to Force.com using Heroku Connect or standard REST APIs. Device sophistication: medium-high.  Reporting frequency: medium. Next action: automatic or human response.  Distance: medium.

buffered-salesforce-enterprise-iot-pattern
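
Heroku Connect and the standard REST APIs are the sync paths the diagram assumes, but if you’d rather define the summary payload yourself, a small custom Apex REST endpoint on the Force.com side works too. A minimal sketch, with made-up object and field names:

// Hypothetical endpoint the Heroku buffer could POST aggregated summaries to.
// Device_Summary__c and its fields are placeholders.
@RestResource(urlMapping='/iot/summary/*')
global with sharing class IoTSummaryResource {

    global class Summary {
        public String deviceGroup;
        public Decimal averageReading;
        public Integer sampleCount;
    }

    @HttpPost
    global static Id receive() {
        Summary s = (Summary) JSON.deserialize(
            RestContext.request.requestBody.toString(), Summary.class);
        Device_Summary__c record = new Device_Summary__c(
            Group__c = s.deviceGroup,
            Average_Reading__c = s.averageReading,
            Sample_Count__c = s.sampleCount);
        insert record;
        return record.Id;
    }
}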

Gateway

Gateway architectures allow you to include much lower-power devices in your IoT system. These low-power devices communicate with a more sophisticated gateway device that is powerful enough to connect with either a custom API built on Heroku or a standard Force.com API. Although I’ve used Bluetooth to represent the low-power device, the pattern is not limited to Bluetooth. Gateway architectures open your system to a much greater variety of participating devices, since the gateway can serve as a broadly capable adapter class. Device sophistication: low for the device, high for the gateway. Reporting frequency: high. Next action: automatic or human response. Distance: high to medium.

gateway-salesforce-enterprise-iot-patterns

Cloudy

Cloudy architectures are interesting because they essentially roll up functionality starting from the device end of the typical IoT value stream. The core model is to combine a piece of hardware with Internet connectivity. That connectivity terminates in a cloud service, and the cloud service becomes the endpoint for talking to business systems. This pattern requires a higher-level device in the field along with rich connectivity between that device and the cloud service. The integration is then cloud to cloud. This is a great way to get started, but, like any tightly coupled design, your ability to change and modify the system is bound to the strengths and limitations of your supplier’s platform. Device sophistication: high. Reporting frequency: indeterminate. Next action: automatic or human response. Distance: low to medium.

cloudy-salesforce-enterprise-iot-pattern

Variation #1: Direct (Augmented)

Sometimes you’ll want to interact with Salesforce APIs directly, but you’ll want to augment that interaction with computing resources located on another system, e.g. an app running on Heroku. This is useful when you need processing not typically handled on the Force.com platform, such as image analysis.  In that case, you can easily upload data to Force.com, trigger a Heroku process based on that data, and then complete the interaction by putting the actionable results back in Force.com. Device sophistication: high. Reporting frequency: low. Next action: human response. Distance: medium.

direct-augmented-enterprise-iot-patterns
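
The “trigger a Heroku process based on that data” step is the interesting bit, so here’s one hedged sketch of how it could work: a trigger on the uploaded record fires an asynchronous callout to the Heroku app, and Heroku writes its results back through the API when it’s done. The object, field and endpoint names are all invented.

// Hypothetical trigger: new uploads kick off image analysis on Heroku.
trigger PhotoUploadTrigger on Photo_Upload__c (after insert) {
    HerokuImageService.analyzeAsync(Trigger.newMap.keySet());
}

// Separate class: does the callouts asynchronously so the trigger isn't blocked.
public class HerokuImageService {
    @future(callout=true)
    public static void analyzeAsync(Set<Id> recordIds) {
        for (Photo_Upload__c p : [SELECT Id, Image_URL__c
                                  FROM Photo_Upload__c WHERE Id IN :recordIds]) {
            HttpRequest req = new HttpRequest();
            req.setEndpoint('https://example-image-analyzer.herokuapp.com/analyze');
            req.setMethod('POST');
            req.setHeader('Content-Type', 'application/json');
            req.setBody(JSON.serialize(new Map<String, Object>{
                'recordId' => p.Id,
                'imageUrl' => p.Image_URL__c
            }));
            new Http().send(req);
        }
        // Heroku posts its results back into Force.com when analysis completes.
    }
}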

Variation #2: Buffered by Partners

The Salesforce ecosystem has a rich variety of partners, including several focused on building end-to-end solutions for our customers. One way of deploying a system is to use a partner solution as your aggregation layer, looking something like the picture below. Note that the device below could be replaced by a collection of lower powered devices connecting to a more powerful gateway. Example implementation: Get Started with the Internet of Things in 10 Minutes.  Device sophistication: medium-high. Reporting frequency: medium. Next action: automatic or human response. Distance: medium.

buffered-partner-enterprise-iot-pattern

Variation #3: Double Buffered

It’s also possible to create a double buffered system that puts a partner solution in front of a custom API deployed to Heroku, which then connects to Force.com. This creates a more complicated system, but might be justified by your business case. Device sophistication: medium-high. Reporting frequency: medium. Next action: automatic or human response. Distance: far to medium.

double-buffered-enterprise-iot-pattern

Next Steps

I hope this discussion is useful for you.  I’m going to be doing a longer version of this at ThingsExpo in Santa Clara this week (November 6), which also includes some background, use cases and predictions.

If you have comments, I’d love to hear them. If you’re working on an IoT deployment, I’d love to hear about that too.

 

Updates & Feedback

@MJFreshyFresh put together a scatter plot of patterns AND sketched some ideas about how actuators and sensors might have different pathways.  Pretty sweet!

@AliNajefi points out that examples would be helpful, so I’ve added links where I’ve created examples.  Great feedback!

Me, Myself and Dreamforce ’14


Last year I attempted to chronicle Every Internet of Things Thing at Dreamforce 2013.  It took me one post (albeit a long one) and I think I caught every session.

2014-09-iot_robot

This year, I ignored everything but my little portion — the DF14 DevZone Internet of Things area — and it still took me three pretty long posts and almost 3000 words.  What a difference a year makes!

So yes, 48 hands-on IoT workshops.

28 great IoT partners (yes, we had one join since I wrote the post — Automatic).

12 picture-worthy things you can check out and probably — estimating here — 25 plastic shoe boxes of components you can build stuff with.

And a lot of demos.  Something new every single day of Dreamforce.

Devices, Wearables and Robots — my oh my.  See you there!

Safecast: my experience building bGeigie Nano #2333


At ThingMonk last year, I was very intrigued to hear about Safecast and their project mapping radiation levels in Japan and around the world in near real time. So, when this year’s Dreamforce DevZone Internet of Things area was coming together, I made a point to buy a kit for the zone so I could tell a bit about this truly inspirational story to our attendees.

First, I should point out that the bGeigie Nano is relatively easy to build. There are great instructions, and the design of the kit is targeted at amateurs and pretty well tested. If you haven’t done a DIY soldering kit in a while, let me assure you that very few are designed this well.

Step 1 was getting all of the small parts soldered to the main board. These are the resistors, the capacitors, switches, etc. Now I’m far from the world’s greatest solderer, and this board was the first thing I’d soldered in a few months. However, it still came together just fine.

IMG_4287

Next, the system comes with a number of component boards. These add functionality like GPS, data logging and a small screen. The most challenging thing here is holding the headers in place while you solder them to the back of the main board. At times like this, I generally use a little piece of paper, folded up to just the right height, to prop the headers up. Works like a charm!

IMG_4293

Now it was time to test the unit, and this is where I ran into my first challenge. I plugged in the USB power cable, and ……….. nothing happened. Using my trusty multimeter, I verified the solder joints and power flow and it seemed like everything was right. I came to the mistaken conclusion that my unit shipped without firmware. So I ordered an FTDI cable to handle that. When the FTDI cable arrived, I plugged everything in and voila! It worked. However, I hadn’t pushed any firmware so I didn’t know why it worked.

IMG_4361

The explanation turns out to be relatively simple. The USB connector is designed to charge the battery, not operate the board. The system turns on when the battery or the FTDI cable is attached.

IMG_4308

The rest of the build went smoothly and today I have a fully functioning radiation logger. Yes, I’ll definitely bring it to Dreamforce.

See you there!

 

Moov is the AI coach my dogs have been waiting for


I pre-ordered Moov a few months ago. It arrived Saturday and I tried it out this morning.

 

IMG_4278

If, like me, you haven’t used an artificially intelligent coach before, I suspect you’ll be pleasantly surprised by the Moov. You select a walking or running program, and it does the rest. I chose a brisk walk program, set some levels and we were off and running — er — walking I guess. The screenshot above is the outcome. I started on Level 1 and pushed to Level 7. Level 7 was 122 steps per minute with 2 minute intervals. The coach would tell you you’re on pace, to speed up, to avoid tensing up, etc.

IMG_4281

My dogs loved it – it was definitely faster than our usual morning walk.

The Moov is very interesting. The device itself comes with a couple of bands that it pops into, and it pairs with your phone via Bluetooth. It’s very light and you put it around your ankle. The computerized voice tells you things like “You crushed it!” just like that $100-an-hour personal trainer. Changing levels in the app is a little unwieldy when your phone is locked, but that probably gets easier once you use it more and know your levels. Also, charging it is a little difficult due to the small charging leads, but it should be manageable if you pay a little more attention than I usually do the first time I plug in a new toy.

Overall, $40 well spent.
