Man. I spent the last three days learning how to use Flash so I could create this simple tutorial. You'd think it would be easy to create a simple interactive thing like this, but it was actually extremely painful. You can't just draw the symbols and move things around. You actually need to program in a language called ActionScript that Adobe (originally Macromedia) made. The programming part is no problem, but searching the massive documentation to find the right method to do what I wanted caused me to pound a large hole in the wall with my head. I still don't know why it requires a zillion languages just to do some stuff on the web.

Anyways, this is something that I've been wanting to do for a long time. Trying to explain Zigbee mesh routing using just words is very difficult, and has the tendency to cause people's eyes to glaze over. With an interactive simulation, you can do a play-by-play analysis of what's going on which hopefully will make it easier to see the flow of events that makes up the AODV routing algorithm. I originally had the idea when I was studying the routing algorithm myself. It's almost impossible to understand what's going on by reading the Zigbee spec. That's like trying to understand how to use Linux by reading the source code. You need to be a real masochist. 

There weren't many good tutorials on the web for AODV mesh routing, either. So to save people the same pain I had to go through in studying it, I created this little applet. To move forward one frame, just click anywhere on the graphic. To move back, press the space bar. Flash doesn't have a handler for the right mouse button, since Apple users usually have only one mouse button and you can't tell who will be viewing the Flash file inside a web browser. You'll need to have Adobe Flash v9 (or higher, depending on when you're reading this) installed in order to use it. Otherwise, it will probably just give you a bad graphic, no graphic, or it will cause your computer to explode.

I'm also planning to try to make other tutorials like this for tree routing, source routing, and possibly some other aspects of Zigbee. I think it's much easier to learn visually than by reading that monster document. Without further ado, here's the tutorial. Hope you like it. If there are any problems, leave a comment for me...especially if you find bugs.
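In case the Flash applet doesn't load for you, here's a rough Python sketch of the route request / route reply flow the tutorial steps through. The five-node topology and node names are made up for illustration, and the flood is idealized (no sequence numbers or hop-count tie-breaking), so treat it as a sketch of the idea rather than the real algorithm.

```python
from collections import deque

# Hypothetical mesh: each node and the neighbors it can hear directly.
NEIGHBORS = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def route_discovery(source, dest):
    """Flood a route request (RREQ) outward from the source.

    Each node that relays the RREQ remembers which neighbor it first heard
    it from; those reverse pointers are what the route reply (RREP) follows
    back from the destination to the source.
    """
    came_from = {source: None}      # reverse routes built by the RREQ flood
    queue = deque([source])
    while queue:
        node = queue.popleft()
        if node == dest:
            break
        for neighbor in NEIGHBORS[node]:
            if neighbor not in came_from:   # ignore duplicate copies of the RREQ
                came_from[neighbor] = node
                queue.append(neighbor)

    if dest not in came_from:
        return None                 # the RREQ never reached the destination

    # The RREP travels back along the reverse pointers, installing the forward
    # route at each hop; here we just reconstruct the resulting path.
    path, node = [], dest
    while node is not None:
        path.append(node)
        node = came_from[node]
    return list(reversed(path))

if __name__ == "__main__":
    print(route_discovery("A", "E"))    # e.g. ['A', 'B', 'D', 'E']
```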

I got a question on how you could use the Z-Wave routing method on a Zigbee router but still stay compliant with the Zigbee spec. This question probably came up because of the article I posted on Zigbee vs. Z-Wave. The Z-Wave protocol implements a light version of mesh routing by using only source routing, and the Z-Wave controller is the only device capable of originating a source-routed frame. For those that don't know, source routing is a method of mesh routing where you embed the complete path to the destination within the frame itself. Why would you want to use the Z-Wave style of routing with a Zigbee device? To make a cheap-ass router, dummy.
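For a concrete picture, here's a minimal Python sketch of the idea. The frame layout and field names are made up for illustration (they are not the actual Zigbee NWK header fields): the sender embeds the full relay list, and each hop simply forwards the frame to the next address in that list.

```python
def make_source_routed_frame(dest, relay_list, payload):
    """Build a frame with the complete path baked in by the originator."""
    return {
        "dest": dest,
        "relay_list": list(relay_list),  # every intermediate hop, in order
        "relay_index": 0,                # which relay handles it next
        "payload": payload,
    }

def next_hop(frame):
    """Return the address the current holder of the frame should forward it to."""
    relays = frame["relay_list"]
    if frame["relay_index"] < len(relays):
        hop = relays[frame["relay_index"]]
        frame["relay_index"] += 1
        return hop
    return frame["dest"]  # relay list exhausted: last hop delivers to the destination

if __name__ == "__main__":
    # Hypothetical 16-bit addresses: the originator routes via two relays.
    frame = make_source_routed_frame(dest=0xD00D,
                                     relay_list=[0xB0B0, 0xC0C0],
                                     payload=b"toggle lamp")
    print(hex(next_hop(frame)))  # originator sends to 0xb0b0
    print(hex(next_hop(frame)))  # first relay forwards to 0xc0c0
    print(hex(next_hop(frame)))  # second relay delivers to 0xd00d
```

The nice part for a cheap router is that the relays don't need routing tables at all; all the route knowledge lives in the frame itself.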

Seriously, the Zigbee spec, although supporting many features, is too heavy for a lot of consumer applications. For home automation especially, you need devices that can take advantage of what Zigbee has to offer, like mesh routing and interoperability, but that aren't too expensive to build. One way to decrease the RAM and flash requirements is to get rid of AODV mesh routing and just use source routing, a la Z-Wave.

The Zigbee 2006 spec also supports source routing; however, this doesn't seem to be a widely known fact. Most Zigbee articles focus only on AODV mesh routing, which uses the routing tables, or, even worse, claim that only Zigbee Pro supports source routing. The reason source routing probably doesn't get a lot of airplay is that the spec implements it in a rather awkward way. If you do it according to the spec, the method of route discovery requires that you already have the route located in your AODV routing tables. This is useless because if you already have the route, then you don't need to do source routing. Maybe it's difficult to explain, so I'll try to give a rough example. Say you want to source route a frame from point A to point D. If you are starting fresh with no routes in your routing table or source routing table, this is what you would have to do.
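Here's a rough Python sketch of the chicken-and-egg problem (the tables and topology are hypothetical, not the spec's actual data structures or command frames): to record the relay list from A to D, every router along the way must already have an AODV routing table entry for D, and if they all do, a normally routed frame would already reach D without source routing.

```python
# Hypothetical AODV routing tables, keyed by (current node, destination).
NEXT_HOP = {("A", "D"): "B", ("B", "D"): "C", ("C", "D"): "D"}

def record_source_route(src, dest, routing_tables):
    """Walk the already-discovered AODV route hop by hop to build the relay list.

    The catch: this only works if every router along the way already has a
    routing table entry for dest -- and if they all do, a plain routed frame
    would reach dest without any source routing at all.
    """
    path, node = [], src
    while node != dest:
        entry = routing_tables.get((node, dest))
        if entry is None:
            raise RuntimeError("no AODV route at %s: run route discovery first" % node)
        path.append(entry)
        node = entry
    return path[:-1]    # relays only; the final entry is the destination itself

if __name__ == "__main__":
    print(record_source_route("A", "D", NEXT_HOP))    # ['B', 'C']
```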

More and more attention has been focused on wireless home automation recently, and the two biggest candidates for it are Zigbee and Z-Wave.

Readers of this blog probably don't need much of an introduction to Zigbee because this blog is basically about it, or at least my adventures diving into Zigbee as I write my software. You're also probably aware that there has been a lot of mudslinging (i.e., shit talking) between the two camps, Zigbee and Z-Wave, as well as an ongoing flame war. So in the interest of educating myself on the real issues behind the talk, I decided to do some research and compare Z-Wave and Zigbee. I should note that since the Z-Wave specification is not public, a lot of the information was just pieced together via the internet, and a lot of the material in this post borrows heavily from the information found in this article. Since there is not much technical data publicly available for Z-Wave, I also recommend perusing this article in Dr. Dobb's Journal, which is an excellent introduction.

Z-Wave is a proprietary wireless networking protocol developed by a Danish company called Zensys, and the Z-Wave Alliance is the group that was established around it. The Z-Wave protocol was designed to be a lightweight, low-cost protocol for home automation that could still support mesh networking.

Let's begin with a look at the device types specified by Z-Wave...

There is still a lot of confusion about the different versions of the 802.15.4 specification. When you mention it, most people immediately think of the original 802.15.4 spec, otherwise known as 802.15.4-2003. This was the spec that laid the foundation for low-rate wireless PANs (LR-WPANs). The specification described the radio, modulation, bit rate, headers, protocol, and services.

The original 2003 spec had three different radios to choose from when implementing it: 868 MHz, 915 MHz, and 2.4 GHz. By far, the 2.4 GHz radio dominated the product offerings, so much so that there are many new papers discussing Zigbee co-existence with other 2.4 GHz radio technologies (802.11, Bluetooth).

The main reason the 2.4 GHz spectrum came to dominate was that 2.4 GHz is free (unlicensed) spectrum in almost the entire world, and also the 2003 spec allowed the 2.4 GHz band to run at 250 kbps while the 868/915 MHz bands had to run at lower data rates due to their allowed modulation schemes. For reference, the 2003 spec limited the 868 MHz band to 20 kbps and the 915 MHz band to 40 kbps, both using BPSK modulation. Unfortunately, the 868/915 MHz radios never recovered from this and there is still a scarcity of radios at these frequencies.

In 2006, the LR-WPAN working group introduced revisions to the original 802.15.4 specification. The revisions were originally called 802.15.4b; however, since alphabet letters are less exciting than numbers, the industry ended up calling this 802.15.4-2006 and relegated the old spec to the similarly innovative name of 802.15.4-2003.

The team at SICS released a paper documenting their research on increasing the battery life of Zigbee nodes. Currently, according to the Zigbee spec, Zigbee routers and coordinators need to be on and listening all the time, which can be disastrous if you're not plugged into a wall outlet or powered by a car battery.

The paper describes an interesting approach where you introduce a power-saving protocol (X-MAC) in between the 802.15.4 MAC and PHY layers. That way, the Zigbee network and application layers don't need to change and you still get the power savings benefit. One of the researchers, Pablo Suarez, ported the Open-ZB stack from TinyOS to Contiki and added some mesh routing functionality to it in order to evaluate X-MAC's power savings. Their paper claims a 90% reduction in power consumption (or, put another way, roughly a 10x increase in battery life). The downside is that all devices on the network would need to support X-MAC. But for 10x the battery life in a remote system, it might be worth it! This would be great for remote projects if they could reduce the power consumption and use some form of energy harvesting.
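To get a feel for where the savings come from, here's a back-of-the-envelope Python sketch of duty-cycled listening. The current-draw numbers are invented round figures for illustration, not measurements from the paper; the point is just that listening only a fraction of the time drops the average current by roughly that fraction.

```python
# Hypothetical current draw figures (not from the paper) for a node whose
# radio either listens continuously or wakes up periodically to sample the
# channel, as a duty-cycled MAC like X-MAC allows.
RX_CURRENT_MA = 20.0       # radio in receive/listen mode (assumed)
SLEEP_CURRENT_MA = 0.02    # radio and MCU asleep (assumed)

def average_current(duty_cycle):
    """Average current when the radio listens only duty_cycle of the time."""
    return duty_cycle * RX_CURRENT_MA + (1.0 - duty_cycle) * SLEEP_CURRENT_MA

always_on = average_current(1.0)     # Zigbee router/coordinator: always listening
duty_cycled = average_current(0.10)  # listening ~10% of the time between wakeups

print("always-on:   %.2f mA" % always_on)
print("duty-cycled: %.2f mA" % duty_cycled)
print("reduction:   %.0f%%" % (100 * (1 - duty_cycled / always_on)))
print("battery life multiplier: ~%.1fx" % (always_on / duty_cycled))
```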

Here's the paper's abstract: