PiAware gets Lit!

In this brief post I will walk you through the steps I took and challenges I faced in connecting my Raspberry Pi 3 running PiAware (flight tracking software) to my Particle Spark Core IoT device via the MuleSoft Anypoint platform.


The goal: light up an LED on the Core whenever an aircraft emitting an ADS-B signal passes overhead. Additionally, I wanted to learn about the MuleSoft tools, hence the over-complication.

Devices involved:

  • Raspberry Pi 3 Model B running OSMC 
  • Particle Spark Core running a simple web-enabled LED enable/disable app

Software involved:

  • PiAware connected to a FlightAware Pro USB stick
  • MuleSoft Anypoint Platform running a 30-day trial of Enterprise ESB
  • Particle Web IDE (for pushing code to the Spark Core)


Step 1 - Program the Spark Core

This was supposed to be a simple task; however, because the Core hadn't been used in over a year, I quickly found that I wasn't even able to push the simplest of apps to it. After a few hours of trial and error I was finally able to connect it to my computer, flash it with the following commands, and establish a solid link to the Particle cloud.

particle flash --usb cc3000
particle setup

I then pushed a barely modified version of the Web Connected LED sample program that, instead of on/off, accepted true/false as the argument to its LED function. An input of true (planeOverhead) lights the diode and false turns it off.
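The Core's function is exposed through the standard Particle Cloud device-function endpoint, so the call can be mimicked outside of Mule for testing. A minimal Python sketch follows; the device ID, the function name (led), and the access token are placeholders, not my real values, and note that the classic Spark docs used an args form field (newer Particle docs call it arg):

```python
import urllib.parse
import urllib.request

# Placeholders -- substitute your own device ID and access token.
DEVICE_ID = "YOUR_DEVICE_ID"
ACCESS_TOKEN = "your_access_token"

def build_payload(plane_overhead: bool) -> str:
    """Build the application/x-www-form-urlencoded body the cloud
    function expects: the function argument plus an access token.
    ("args" follows the classic Spark docs; newer docs use "arg".)"""
    return urllib.parse.urlencode({
        "args": "true" if plane_overhead else "false",
        "access_token": ACCESS_TOKEN,
    })

def set_led(plane_overhead: bool) -> int:
    """POST true/false to the Core's 'led' cloud function and
    return the HTTP status code."""
    req = urllib.request.Request(
        f"https://api.particle.io/v1/devices/{DEVICE_ID}/led",
        data=build_payload(plane_overhead).encode("ascii"),  # POST
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status
```

This is essentially the same form-encoded POST that shows up in the Mule console log later in the post.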

Step 2 - Configure and Hack the Pi

Installing PiAware is a whole different topic, and one that FlightAware covers very well, so I will skip to the part where I figured out two ways to pull the data that the Dump1090-FA module received from the receiver and antenna.

1. TCP via port 30003 - this is a *decoded* stream to which Dump1090-FA writes when aircraft data is received. It includes information like latitude, longitude, identity, speed and altitude. 

Example data:

[Screenshot: captured aircraft data]

2. JSON file published via Lighttpd and therefore accessible remotely (http://<host>:8080/data/aircraft.json). The file is continuously updated with the aircraft currently nearby and is exposed via the bundled HTTP server. It looks like this:

{ "now" : 1506331591.7,
  "messages" : 40461,
  "aircraft" : [

3. Useless in this scenario but cool nonetheless, a web page with detailed information:


In the integration I would attempt to consume both data endpoints with varying levels of success. 
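Both endpoints are easy to exercise outside of Mule as a sanity check. A rough Python sketch, assuming the default ports above and a hypothetical hostname for the Pi, reads a few decoded BaseStation (SBS) lines from port 30003 and fetches the JSON snapshot from port 8080:

```python
import json
import socket
import urllib.request

PI_HOST = "raspberrypi.local"  # assumption: adjust to your Pi's hostname/IP

def read_sbs_lines(host: str = PI_HOST, port: int = 30003, count: int = 5):
    """Read a handful of decoded BaseStation (SBS) CSV lines
    from the dump1090-fa TCP output."""
    with socket.create_connection((host, port), timeout=10) as sock:
        stream = sock.makefile("r", encoding="ascii", errors="replace")
        return [next(stream).strip() for _ in range(count)]

def parse_sbs_line(line: str) -> dict:
    """Pull a few interesting fields out of one SBS message
    (field 0 = message type, 1 = transmission type, 4 = ICAO hex)."""
    fields = line.split(",")
    return {
        "message_type": fields[0],       # e.g. MSG
        "transmission_type": fields[1],  # 1-8 for MSG
        "icao_hex": fields[4],           # aircraft ICAO address
    }

def fetch_aircraft_json(host: str = PI_HOST, port: int = 8080) -> dict:
    """Fetch the aircraft.json snapshot served by lighttpd."""
    url = f"http://{host}:{port}/data/aircraft.json"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)
```

The JSON route turned out to be the easier of the two to consume, as the next section shows.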

Step 3 - Create an Integration Flow in Mule

Easier said than done... at least when you're starting from scratch. Anyway, I love integrations. Back when I was implementing software, getting to assist with the nuts and bolts of making two very different systems speak was my favorite part, hence why I wanted to give Mule a spin and see how it makes things easier.

I briefly skimmed a few tutorials and then jumped right in.

And sank.

Take 1

My first flow (above) failed because of my poor attempt at reading from the socket with the TCP connector. In theory this should work, but the connector seemed to be 1) new to the product and lightly documented, and 2) geared towards a server role, so it wanted to accept connections and NOT create them. Couple that with the fact that I didn't want to deal with decoding the stream, and I moved on once I found the JSON file I mentioned earlier.


Try #2 was marginally better: I switched over to the HTTP component and started attempting to read that JSON file. I still had a few major problems though:

  1. The HTTP component wanted to accept connections, not create them. 
  2. My payload kept coming back malformed and couldn't be manipulated or tested.

So I pressed on and ultimately came up with this:


The flow does a few fun things:

  1. CRITICAL MISSING LINK - It polls the JSON file every 10 seconds for content. 
  2. CRITICAL MISSING LINK - The content that is found is converted to a String. 
  3. The new String is logged. 
  4. The variable component sets planeOverhead = true if the string "flight" is found in the content of the data. This was a test that I found eliminated a lot of bad/incomplete transmissions.
  5. The planeOverhead variable is logged.
  6. The planeOverhead variable and the access_token for my Particle account are set as the new payload for the call to the Particle Cloud.
  7. The Particle Cloud is called and the Spark Core is lit if a plane is overhead!
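For reference, the flow's logic boils down to something like this Python sketch. The hostname, device ID, and token are placeholders, and the "flight"-substring check is the same crude filter step 4 describes:

```python
import time
import urllib.parse
import urllib.request

AIRCRAFT_URL = "http://raspberrypi.local:8080/data/aircraft.json"  # assumption
LED_URL = "https://api.particle.io/v1/devices/YOUR_DEVICE_ID/led"  # placeholder
ACCESS_TOKEN = "your_access_token"                                  # placeholder

def plane_overhead(payload: str) -> bool:
    """Step 4: the crude-but-effective test -- a 'flight' field only
    shows up in reasonably complete transmissions."""
    return "flight" in payload

def poll_once() -> bool:
    """One pass through the flow: fetch, test, log, and push."""
    # Steps 2-3: read the JSON file and treat it as a plain string.
    with urllib.request.urlopen(AIRCRAFT_URL, timeout=10) as resp:
        payload = resp.read().decode("utf-8")
    overhead = plane_overhead(payload)
    print(f"planeOverhead = {overhead}")  # step 5: log the flag
    # Steps 6-7: push the flag and token to the cloud function.
    body = urllib.parse.urlencode(
        {"args": str(overhead).lower(), "access_token": ACCESS_TOKEN}
    ).encode("ascii")
    urllib.request.urlopen(urllib.request.Request(LED_URL, data=body), timeout=10)
    return overhead

def run_forever(interval: float = 10.0) -> None:
    """Step 1: poll the JSON file every 10 seconds."""
    while True:
        poll_once()
        time.sleep(interval)
```

Mule of course adds logging, error handling, and the visual flow on top, but the data path is this simple.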

In the End...

The result is nothing fancy, but it's a very sufficient proof of concept that it all works.

It lives!



A couple other takeaways:

  • I tried running Mule ESB on my Pi; however, the load was too great (even with the Java heap capped at 256 MB). Guess I need another one.
  • Getting the payload just right for Photon was a pain. 

And if you're curious, here's an example of the console logging:

DEBUG 2017-09-25 19:58:06,849 [[kjbn].http.requester.FA.worker(1)] org.mule.module.http.internal.HttpMessageLogger: REQUESTER
HTTP/1.1 200 OK
Content-Type: application/json
Accept-Ranges: bytes
ETag: "3414929403"
Last-Modified: Mon, 25 Sep 2017 09:58:04 GMT
Content-Length: 69
Date: Mon, 25 Sep 2017 09:58:04 GMT
Server: lighttpd/1.4.35

{ "now" : 1506333484.4,
  "messages" : 41883,
  "aircraft" : [

DEBUG 2017-09-25 19:58:07,083 [[kjbn].http.requester.HTTPS_Request_Configuration(1) SelectorRunner] org.mule.module.http.internal.HttpMessageLogger: REQUESTER
POST /v1/devices/x/led HTTP/1.1
Connection: close
Host: api.particle.io:443
User-Agent: AHC/1.0
Accept: */*
Content-Type: application/x-www-form-urlencoded
Content-Length: 63

DEBUG 2017-09-25 19:58:08,013 [[kjbn].http.requester.HTTPS_Request_Configuration.worker(2)] org.mule.module.http.internal.HttpMessageLogger: REQUESTER
HTTP/1.1 200 OK
Date: Mon, 25 Sep 2017 09:58:05 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 81
Connection: close
Server: nginx
X-Request-Id: ec45fbce-9337-4300-bd22-e4bf8eb67eef
Access-Control-Allow-Origin: *