<?xml version="1.0" encoding="utf-8"?>
<feed xmlns="http://www.w3.org/2005/Atom"><title>Kyle Machulis</title><link href="/" rel="alternate"></link><link href="http://kyle.machul.is/atom.xml" rel="self"></link><id>/</id><updated>2016-11-07T17:50:27-08:00</updated><entry><title>Talking Bluetooth LE on Desktop in 2016</title><link href="/2016/11/07/talking-bluetooth-le-on-desktop-in-2016/" rel="alternate"></link><updated>2016-11-07T17:50:27-08:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2016-11-07:2016/11/07/talking-bluetooth-le-on-desktop-in-2016/</id><summary type="html">&lt;p&gt;&lt;img alt="Bluetooth LE" src="/images/2016-11-07-talking-bluetooth-le-on-desktop-in-2016/btle.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;I am sorry. I am so, so sorry.&lt;/p&gt;
&lt;p&gt;Every so often, I decide "Gosh, I'd really like to write code for some
Bluetooth LE devices, but I don't really do much on mobile. Maybe
things have gotten better on desktop!" So far, I have been
disappointed every time. Now is no exception, but I've decided to
actually write down that disappointment as a form of therapy. &lt;/p&gt;
&lt;p&gt;This post will go over how different desktop OSes, libraries, and
hardware deal with Bluetooth LE. I'm sticking to desktop here because
product manufacturers assume BTLE devices will usually be used with
phones. Mobile certainly isn't a solved problem either, but it's
better than desktop right now.&lt;/p&gt;
&lt;p&gt;This article isn't an introduction to BTLE itself. I'm going to assume
readers know the basic terms and differences between, say, pairing and
connection. If you're not familiar, I recommend checking
out
&lt;a href="https://learn.adafruit.com/introduction-to-bluetooth-low-energy/introduction"&gt;Adafruit's BTLE Intro.&lt;/a&gt; Apple's
&lt;a href="https://developer.apple.com/library/content/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothOverview/CoreBluetoothOverview.html#//apple_ref/doc/uid/TP40013257-CH2-SW1"&gt;CoreBluetooth Overview&lt;/a&gt; has
some nice explanation also, though the examples are obviously platform
specific.&lt;/p&gt;
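&lt;p&gt;Since the short-versus-long UUID distinction trips people up on every platform below, here's one piece of BTLE trivia worth internalizing: the 16-bit "short" UUIDs you see in device documentation are shorthand for full 128-bit UUIDs built on the Bluetooth base UUID. A tiny Python sketch:&lt;/p&gt;

```python
# Expand a 16-bit BTLE assigned number into its full 128-bit UUID form
# using the Bluetooth base UUID (0000xxxx-0000-1000-8000-00805f9b34fb).
BASE_UUID_SUFFIX = "0000-1000-8000-00805f9b34fb"

def expand_uuid(short):
    """Return the 128-bit UUID string for a 16-bit assigned number."""
    return "0000{:04x}-{}".format(short, BASE_UUID_SUFFIX)

# 0x180f is the standard Battery Service
print(expand_uuid(0x180F))  # 0000180f-0000-1000-8000-00805f9b34fb
```

&lt;p&gt;Different platform APIs accept one form or the other (or both), so it's handy to be able to convert freely between them.&lt;/p&gt;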
&lt;h2&gt;Operating System Support&lt;/h2&gt;
&lt;h3&gt;OS X&lt;/h3&gt;
&lt;p&gt;Starting off easy. OS X has had support as a BTLE central node since
10.6, and as a peripheral node since 10.9. Done!&lt;/p&gt;
&lt;h3&gt;Linux&lt;/h3&gt;
&lt;p&gt;tl;dr: Bluez &amp;lt; 5.38 or so, use gattlib. Bluez &amp;gt;= 5.38 or so, use dbus.&lt;/p&gt;
&lt;p&gt;And then right on up the difficulty curve to Linux, where we
have &lt;a href="http://www.bluez.org/"&gt;bluez&lt;/a&gt;. I've yet to ever hear anyone say
"yay bluez!"&lt;/p&gt;
&lt;p&gt;Bluez got BTLE support in 4.93 or so. As of this writing (November
2016), we're at 5.43. That's a full major version and a ton of minor
versions difference.&lt;/p&gt;
&lt;p&gt;Between bluez 4 and 5, APIs moved from direct access to dbus. Then,
within bluez 5, the dbus methods have changed multiple times. I spent
part of last weekend trying to write some dbus code for accessing BTLE
devices with no luck, as I couldn't seem to identify services on the
device. It turns out that I'm on Debian Jessie, which comes with bluez
5.23 (released September 2014). After looking around a bit, it seems
most current bluez-supporting libraries expect users to have at least
5.38, and sure enough, those versions expose different methods.&lt;/p&gt;
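&lt;p&gt;For reference, the newer API exposes devices and GATT attributes as dbus object paths. This sketch (no bluetooth hardware required, and the path itself is a made-up example) just pulls one of those paths apart to show the layout as I understand it from the bluez 5.x docs:&lt;/p&gt;

```python
# Decompose a BlueZ 5.x GATT D-Bus object path into its parts:
# adapter, then dev_ plus the underscored device address, then optional
# service and characteristic nodes named by their 16-bit handles.
import re

PATH_RE = re.compile(
    r"/org/bluez/(hci[0-9]+)/dev_([0-9A-F_]{17})"
    r"(?:/service([0-9a-f]{4})(?:/char([0-9a-f]{4}))?)?$"
)

def parse_bluez_path(path):
    m = PATH_RE.match(path)
    if m is None:
        raise ValueError("not a BlueZ GATT object path: " + path)
    adapter, address, service, char = m.groups()
    return {
        "adapter": adapter,
        "address": address.replace("_", ":"),
        "service": service,   # None on a plain device path
        "char": char,
    }

print(parse_bluez_path("/org/bluez/hci0/dev_00_11_22_33_44_55/service000c/char000d"))
```

&lt;p&gt;On an older bluez, the service/char nodes simply never showed up under the device object, which matches the symptom I was seeing.&lt;/p&gt;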
&lt;p&gt;I was pretty confused by this, as I'd been
using &lt;a href="https://bitbucket.org/OscarAcena/pygattlib"&gt;pygattlib&lt;/a&gt; with no
problems on the same Linux box to write some BTLE test scripts. Turns
out, pygattlib is similar to the
C-based &lt;a href="https://github.com/labapart/gattlib"&gt;gattlib&lt;/a&gt;. Both of these
use the GATT functions from gatttool in the bluez 4 line to talk to
BTLE devices without having to go through dbus, to let older
machines/kernels talk BTLE. That's why things worked, because it was
just bypassing the dbus interface.&lt;/p&gt;
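&lt;p&gt;Under the hood, that gatttool-style access is just raw ATT over a socket. As a sketch of how simple that layer is (per the Bluetooth core spec; the handle here is an arbitrary example), a read-by-handle request is all of three bytes:&lt;/p&gt;

```python
# Build an ATT Read Request PDU: opcode 0x0a, then the 16-bit
# attribute handle in little-endian byte order.
ATT_READ_REQ = 0x0A

def att_read_request(handle):
    return bytes([ATT_READ_REQ, handle % 256, handle // 256])

print(att_read_request(0x0025).hex())  # 0a2500
```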
&lt;h3&gt;Windows&lt;/h3&gt;
&lt;p&gt;tl;dr: Either hope your device requires pairing, or your code can deal
with WinRT APIs and will only run on updated Windows 10. Otherwise, do
something crazy.&lt;/p&gt;
&lt;p&gt;Windows started supporting BTLE communication as of Windows 8.
However, this didn't mean you could just go talking willy-nilly to any
device you pleased. You actually had to pair with devices before they
were available to API functions.&lt;/p&gt;
&lt;p&gt;The problem is that device manufacturers are lazy and cheap, and
pairing is an optional part of the BTLE handshake process. It's also
the part that's vaguely secure, but a lot of products don't really
care about that. For many devices, you just connect, query for
services, and off you go. This is possible on both OS X and Linux, but
on Windows, it was a no-go up until a set of Windows 10 updates.&lt;/p&gt;
&lt;p&gt;As of updates leading up to and including the Anniversary Update in
August 2016
(thanks &lt;a href="https://twitter.com/kraln/status/795851296171589633"&gt;@kraln&lt;/a&gt;),
WinRT started providing functions to connect without pairing, which is
how some of the platforms listed in the next section are going to
tackle this.
&lt;a href="https://twitter.com/Vincent_Scheib/status/796107087298183168"&gt;Microsoft has apparently stated that there will never be connection APIs for Win32&lt;/a&gt; (thanks
to
&lt;a href="https://twitter.com/Vincent_Scheib/status/796107087298183168"&gt;@vincent_scheib&lt;/a&gt; for
info).&lt;/p&gt;
&lt;p&gt;There is one other workaround for BTLE on Windows, even for
pre-Windows 10 platforms, but it's not pretty. Check out the section
below on Noble for more information.&lt;/p&gt;
&lt;h2&gt;Cross-Platform ways to access BTLE currently&lt;/h2&gt;
&lt;p&gt;Given those warnings, if you still want to access bluetooth in a
pre-written, cross-platform way, here are a few choices. This is by no
means an even partially complete list of bluetooth wrappers/libraries;
it's just what I looked up while figuring all this out.&lt;/p&gt;
&lt;h3&gt;Qt (C++)&lt;/h3&gt;
&lt;p&gt;Qt was actually one of the first places I went to check for this, as
they're usually pretty good about supporting as much functionality as
possible across platforms.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://doc.qt.io/qt-5/qtbluetooth-index.html"&gt;Qt has had support for BTLE on OS X, Linux, Android, and iOS since 5.7&lt;/a&gt;.
Notice the lack of Windows
there?
&lt;a href="https://bugreports.qt.io/browse/QTBUG-31674"&gt;Looks like they'll get WinRT in 5.8, "platform" support "later".&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;Web Bluetooth&lt;/h3&gt;
&lt;p&gt;There is currently
a &lt;a href="https://webbluetoothcg.github.io/web-bluetooth/"&gt;proposed spec&lt;/a&gt;
being implemented by Google in Blink (the browser engine that backs both
Chrome and Opera) that will allow webpages to interact with BTLE
devices. This is slated
to
&lt;a href="https://groups.google.com/a/chromium.org/forum/#!topic/blink-dev/Ono3RWkejAA"&gt;ship in Chrome 56&lt;/a&gt;.
It will support Linux, OS X, Android, and ChromeOS. Windows is
apparently coming with WinRT later.&lt;/p&gt;
&lt;p&gt;Currently,
&lt;a href="https://developer.microsoft.com/en-us/microsoft-edge/platform/status/"&gt;Edge has no stated plans to implement Web Bluetooth but it is "Under Consideration"&lt;/a&gt;.
&lt;a href="https://bugzilla.mozilla.org/show_bug.cgi?id=674737"&gt;Mozilla is pushing back on spec implementation in Firefox due to privacy concerns,&lt;/a&gt;,
though &lt;a href="https://szeged.github.io/servo/"&gt;work is happening in Servo&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;So while this API may show up in Chrome, it may also ONLY show up in
Chrome. That situation certainly hasn't stopped anyone from using new
APIs before, though.&lt;/p&gt;
&lt;p&gt;And for anyone that is saying "Wait, Kyle, didn't you implement one of
the Bluetooth stacks for FirefoxOS? What about that?", my reply is
"DON'T MENTION THE WAR." (Translation: That was an early version of a
non-standardized API that has since been removed from Gecko, Mozilla's
browser engine)&lt;/p&gt;
&lt;h3&gt;Noble (Node.js)&lt;/h3&gt;
&lt;p&gt;&lt;a href="https://github.com/sandeepmistry/noble"&gt;Noble&lt;/a&gt; is a node.js BTLE
library that supports OS X, Linux, and Windows.&lt;/p&gt;
&lt;p&gt;Yes, Windows. Without WinRT. Crazy, right?&lt;/p&gt;
&lt;p&gt;Well, yes. It actually is crazy. To use Noble on Windows (which many
IoT/Maker programs do), you have to
install
&lt;a href="https://github.com/sandeepmistry/node-bluetooth-hci-socket#windows"&gt;WinUSB drivers over the standard Bluetooth Dongle drivers&lt;/a&gt;.
Noble then handles the full bluetooth stack for you, bypassing the
connection/scanning APIs missing from regular old Windows. While a
clever way to do that, it's not exactly something you'd want to ship
to non-savvy end-users.&lt;/p&gt;
&lt;h3&gt;BGAPI&lt;/h3&gt;
&lt;p&gt;Some people aren't happy just bitbanging bluetooth to a dongle,
though. Instead, they go all the way and implement a specialized API
specifically for their dongle.
The
&lt;a href="http://www.silabs.com/products/wireless/bluetooth/bluetooth-smart-modules/Pages/bled112-bluetooth-smart-dongle.aspx"&gt;BlueGiga BLED112 BTLE Dongle&lt;/a&gt; comes
with a special, proprietary API that allows users to connect to BTLE
devices on Windows (and other platforms), also routing around the lack
of OS API functionality. So, as long as your platform can talk USB, it
can also talk BTLE.&lt;/p&gt;
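&lt;p&gt;For a taste of what that proprietary API looks like on the wire, here's a sketch of BGAPI's 4-byte command framing as I understand it from the public BGAPI documentation; system_hello is the usual "is anyone home" command:&lt;/p&gt;

```python
# Frame a BGAPI command packet: octet 0 carries the message type (0x00
# for commands) plus the high bits of the payload length, octet 1 the
# low length byte, then class ID, command ID, and the payload itself.
def bgapi_command(cls_id, cmd_id, payload=b""):
    length = len(payload)
    header = bytes([length // 256, length % 256, cls_id, cmd_id])
    return header + payload

# system_hello: class 0x00, command 0x01, no payload
print(bgapi_command(0x00, 0x01).hex())  # 00000001
```

&lt;p&gt;The dongle answers over a plain USB serial port, which is why this works anywhere USB does.&lt;/p&gt;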
&lt;p&gt;(Found this dongle
via
&lt;a href="http://christopherpeplin.com/2015/10/22/pygatt-python-bluetooth-low-energy-ble/"&gt;an article on the PyGATT library&lt;/a&gt;,
which supports both Linux (via gatttool) and BGAPI for cross-platform
python bluetooth support).&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;Well, that's the state of things for the moment. Those are some of the
reasons there's no &lt;a href="http://libusb.info"&gt;libusb&lt;/a&gt;-equivalent for
bluetooth yet. Hopefully we'll see Microsoft fill out the Windows API
surface soon and make this article a little less sad, 'cause the WinRT
stuff is kinda painful. Until then, though, this is what we get to
deal with.&lt;/p&gt;
&lt;p&gt;Thanks to &lt;a href="http://twitter.com/sandeepmistry"&gt;Sandeep Mistry&lt;/a&gt; for
filling in some of the details on the Windows situation.&lt;/p&gt;
&lt;p&gt;PS Oh, yeah, &lt;a href="https://github.com/takawata/FreeBSD-BLE"&gt;almost forgot FreeBSD&lt;/a&gt;&lt;/p&gt;</summary></entry><entry><title>Using A Firewire Phantom Omni on Windows 8/8.1/10</title><link href="/2016/10/03/using-a-firewire-phantom-omni-on-windows-8-10/" rel="alternate"></link><updated>2016-10-03T19:31:28-07:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2016-10-03:2016/10/03/using-a-firewire-phantom-omni-on-windows-8-10/</id><summary type="html">&lt;p&gt;&lt;img alt="Sensable Phantom Omni" src="/images/2016-10-03-using-a-firewire-phantom-omni-on-windows-8-10/omni.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;Last weekend I decided to dust off my Sensable Phantom Omni (now
the
&lt;a href="http://www.geomagic.com/en/products/phantom-omni/overview"&gt;3D Systems Geomagic Touch&lt;/a&gt;,
but I bought mine before Geomagic bought Sensable and 3D Systems
bought Geomagic), and see if it was still usable. I had to order
a
&lt;a href="https://www.amazon.com/Syba-Firewire-XIO2213B-Components-SY-PEX30016/dp/B006DQ0KD2/"&gt;PCIe 1394b card&lt;/a&gt;,
but other than that, I hooked it up, and the Phantom drivers seemed to
install correctly on Windows 10. However, when running the Phantom
demo software, anytime a program tried to access the motors, the
program would crash. Sensor reading seemed ok, but I couldn't get any
feedback.&lt;/p&gt;
&lt;p&gt;A quick call to the Geomagic Freeform support line turned up the
solution. As with many video products that used 1394b, the Phantom
Omni requires the "Legacy" firewire drivers. These were included with
Windows up to Windows 7, but as of Windows 8 and above, are no longer
included with the operating system.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://support.microsoft.com/en-us/kb/2970191"&gt;The legacy 1394 drivers can now be retrieved from the Microsoft Support Site.&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;After installing the drivers and changing the 1394 PCIe interface to
use the Legacy drivers
(&lt;a href="https://www.studio1productions.com/Articles/Firewire-1.htm"&gt;process documented here&lt;/a&gt;),
I rebooted and the demos worked fine, with force feedback and all!&lt;/p&gt;
&lt;p&gt;Along the way, I also found some information on repairing internal
cable breakage in the Omni, by a research team at Johns Hopkins
University. The original site had died, but the instructions and
images were still
available
&lt;a href="https://web.archive.org/web/20100621034327/https://haptics.lcsr.jhu.edu/Repairing_a_PHANTOM_Omni"&gt;at this link via the Internet Archive&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;Anyways, hope this helps others that still want to get some life out
of their haptic controllers!&lt;/p&gt;</summary></entry><entry><title>Autodesk Pier 9 Residency Project Lecture: Industrial ASMR</title><link href="/2015/09/19/autodesk-pier-9-residency-project-lecture-industrial-asmr/" rel="alternate"></link><updated>2015-09-19T17:58:10-07:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2015-09-19:2015/09/19/autodesk-pier-9-residency-project-lecture-industrial-asmr/</id><summary type="html">&lt;p&gt;I was an Artist In Residence at
&lt;a href="http://autodesk.com/pier9"&gt;Autodesk Pier 9&lt;/a&gt; from July 2014 to January
2015, concentrating on a sound art project to extract new and
interesting sounds from the prototyping machines around the workshop.
There's a video of the lecture I gave covering my time at Pier 9:&lt;/p&gt;
&lt;p&gt;
&lt;div class="mdx-video-container"&gt;
&lt;iframe allowfullscreen="true" frameborder="0" mozallowfullscreen="true" src="//player.vimeo.com/video/123686188" webkitallowfullscreen="true"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;/p&gt;
&lt;p&gt;&lt;a href="http://opentranscripts.org/transcript/words-sounds-pier-9/"&gt;OpenTranscripts was also nice enough to provide a transcription of the talk&lt;/a&gt;.
The project was covered by media outlets such as
&lt;a href="http://www.cnet.com/news/the-strange-surprising-sounds-of-3d-printers-waterjets-and-laser-cutters/"&gt;CNet&lt;/a&gt;
and &lt;a href="http://recode.net/2015/01/24/bees-were-the-original-3-d-printers-autodesk-has-its-first-art-show/"&gt;re/code&lt;/a&gt;.&lt;/p&gt;</summary></entry><entry><title>Upcoming Speaking Engagements and Project Updates</title><link href="/2012/10/21/upcoming-speaking-engagements-and-project-updates/" rel="alternate"></link><updated>2012-10-21T16:42:10-07:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2012-10-21:2012/10/21/upcoming-speaking-engagements-and-project-updates/</id><summary type="html">&lt;p&gt;Time for more speaking updates! &lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2012-10-21-updates-and-speaking/aes2012.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;I'll be part of a panel on
&lt;a href="http://www.aes.org/events/133/gameaudio/?ID=3261"&gt;using biometrics for generative game audio at AES&lt;/a&gt;, on Friday,
October 26th, at 2pm. Should be fun!&lt;/p&gt;
&lt;p&gt;Also, I've just moved a lot of the health driver projects I currently
maintain to the &lt;a href="http://www.github.com/openyou"&gt;OpenYou organization on github&lt;/a&gt;. The hope is to
get more developers working on these projects, versus having the world
waiting on me to have time to work on things. There's more information
available at &lt;a href="http://www.openyou.org/2012/10/21/openyou-github-org/"&gt;the post on openyou.org&lt;/a&gt;.&lt;/p&gt;</summary></entry><entry><title>Keepon Control via Kinect</title><link href="/2011/11/16/keepon-control-via-kinect/" rel="alternate"></link><updated>2011-11-16T00:12:09-08:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2011-11-16:2011/11/16/keepon-control-via-kinect/</id><summary type="html">&lt;p&gt;And the hits just keep on comin'.&lt;/p&gt;
&lt;p&gt;
&lt;div class="mdx-video-container"&gt;
&lt;iframe allowfullscreen="true" frameborder="0" mozallowfullscreen="true" src="//www.youtube.com/embed/6XhbYWLnsq0" webkitallowfullscreen="true"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;/p&gt;
&lt;p&gt;I threw this together this evening, in about 4 hours from top to
bottom (code + video). Yay open source. &lt;/p&gt;


&lt;p&gt;Project uses:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Arduino + Keepoff &lt;a href="http://www.github.com/qdot/keepoff"&gt;http://www.github.com/qdot/keepoff&lt;/a&gt; - Control of Keepon Robot&lt;/li&gt;
&lt;li&gt;Processing &lt;a href="http://www.processing.org"&gt;http://www.processing.org&lt;/a&gt; - tying the whole thing together&lt;/li&gt;
&lt;li&gt;OSCP5 &lt;a href="http://www.sojamo.de/oscP5"&gt;http://www.sojamo.de/oscP5&lt;/a&gt; - talk to python script that's controlling arduino, already had that written so didn't write serial controls in Processing&lt;/li&gt;
&lt;li&gt;GSVideo &lt;a href="http://gsvideo.sourceforge.net"&gt;http://gsvideo.sourceforge.net&lt;/a&gt; - for webcam (filming keepon)&lt;/li&gt;
&lt;li&gt;SimpleOpenNI &lt;a href="http://code.google.com/p/simple-openni"&gt;http://code.google.com/p/simple-openni&lt;/a&gt; - kinect recording and skeleton tracking in processing&lt;/li&gt;
&lt;li&gt;libfreenect &lt;a href="http://www.openkinect.org"&gt;http://www.openkinect.org&lt;/a&gt; - Cross platform kinect access&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Processing running on Linux, X Forwarded to OS X because apparently
it's impossible to get good screencast software on Linux.&lt;/p&gt;
&lt;p&gt;It's missing the side-to-side bend sensor because I'm still not quite
sure how that motor message works yet, but this is good enough for a
first demo.&lt;/p&gt;
&lt;p&gt;Code is, as usual, available at&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.github.com/qdot/keepoff"&gt;http://www.github.com/qdot/keepoff&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;So, why goth dance? This is a running joke we had at
&lt;a href="http://www.artandcode.com/3d"&gt;the Art and Code conference at CMU last month&lt;/a&gt;.
For those not familiar with the youtube meme:&lt;/p&gt;
&lt;p&gt;
&lt;div class="mdx-video-container"&gt;
&lt;iframe allowfullscreen="true" frameborder="0" mozallowfullscreen="true" src="//www.youtube.com/embed/PvNrjcg3WjA" webkitallowfullscreen="true"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;/p&gt;
&lt;p&gt;I figured that I might as well turn it into a way to demo new
hardware. Other A&amp;amp;&amp;amp;C people, consider the (velvet, tear stained)
gauntlet thrown down.&lt;/p&gt;</summary></entry><entry><title>Keepon Hacking Proof of Concept</title><link href="/2011/11/14/keepon-hacking-proof-of-concept/" rel="alternate"></link><updated>2011-11-14T00:12:09-08:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2011-11-14:2011/11/14/keepon-hacking-proof-of-concept/</id><summary type="html">&lt;p&gt;We have Keepon control!&lt;/p&gt;
&lt;p&gt;
&lt;div class="mdx-video-container"&gt;
&lt;iframe allowfullscreen="true" frameborder="0" mozallowfullscreen="true" src="//www.youtube.com/embed/P0u2lakH5nc" webkitallowfullscreen="true"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;/p&gt;
&lt;p&gt;Yay! Thanks to &lt;a href="/2011/11/09/mykeepon-hacking/#comment-359766077"&gt;mAngO on the comment thread for my last keepon post&lt;/a&gt;, we now know that grounding out the bus during keepon's powerup allows you to act as the master of the bus! This means we can now control the motors and sound, as can be seen in the video above. I'm just controlling motors there, using the &lt;a href="http://charlie-roberts.com/Control/"&gt;Control Program for Android&lt;/a&gt; to send OSC messages to a Python script I wrote. The Python script talks to the USB serial port, and the Arduino turns the commands coming over serial into I2C messages for the keepon.&lt;/p&gt;
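&lt;p&gt;For anyone curious what those OSC messages look like on the wire, here's a minimal encoder sketch; the "/keepon/pan" address is a made-up example rather than whatever my script actually listened for:&lt;/p&gt;

```python
# Encode a single-float OSC message: padded address string, padded
# type-tag string (",f"), then the argument as a big-endian float32.
import struct

def osc_pad(data):
    # null-terminate, then pad out to a 4-byte boundary per the OSC spec
    data = data + b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address, value):
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)

print(osc_message("/keepon/pan", 0.5).hex())
```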
&lt;p&gt;All the source code for this is available in completely raw, uncommented form at&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.github.com/qdot/keepoff"&gt;http://www.github.com/qdot/keepoff&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;So, that's the first part finished. Now it's on to polishing things out and figuring out the rest of the parts of the hardware we don't have access to yet. I'm keeping &lt;a href="http://www.github.com/qdot/keepoff/issues"&gt;the github issues list&lt;/a&gt; updated with things we have left to do.&lt;/p&gt;</summary></entry><entry><title>Everything You Ever Wanted to Know about MyKeepon Except for the Parts I Don't Know About Yet</title><link href="/2011/11/09/everything-you-ever-wanted-to-know-about-mykeepon-except-for-the-parts-i-dont-know-about-yet/" rel="alternate"></link><updated>2011-11-09T17:15:09-08:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2011-11-09:2011/11/09/everything-you-ever-wanted-to-know-about-mykeepon-except-for-the-parts-i-dont-know-about-yet/</id><summary type="html">&lt;p&gt;&lt;strong&gt;UPDATE 2013-06-01:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;While this post still has relevant information, the engineers at
&lt;a href="http://www.beatbots.net"&gt;BeatBots&lt;/a&gt; have created a far more stable
firmware. I highly recommend using their MyKeepon firmware, as it fixes
a lot of the timing issues the KeepOff firmware had. The MyKeepon
firmware is available at:&lt;/p&gt;
&lt;p&gt;&lt;a href="https://github.com/beatbots/MyKeepon"&gt;https://github.com/beatbots/MyKeepon&lt;/a&gt;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;strong&gt;UPDATE 2011-11-14:&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Keepon hacking has made a major step! Thanks to &lt;a href="/2011/11/09/mykeepon-hacking/#comment-359766077"&gt;mAngO on the comment thread for my last keepon post&lt;/a&gt;,
we now know that grounding out the bus during keepon's powerup allows
you to act as the master of the bus!
&lt;a href="/2011/11/14/keepon-hacking-proof-of-concept/"&gt;There's a Proof of Concept video posted on youtube now&lt;/a&gt;.
I'm leaving the rest of this post as it was when I first wrote it for
history's sake, but the information in it, plus knowing that you just
need to hold down the I2C lines for a second when the keepon powers up,
is enough to actually get control going. The reverse engineering
document and code in the keepoff repository will be updated to reflect
this information.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;I'm really not sure I've ever spent so much time cursing at something
so adorable. The past week has been yelling, crying, and generally
losing my emotional shit toward a few servos wrapped in a weird,
sticky plasticky skin, better known as the MyKeepon Dancing Robot. &lt;/p&gt;
&lt;p&gt;What better way to atone for my sin of vivisecting the most adorable
Christmas toy this year than writing up what I found? That way,
future generations can avoid the pain inflicted on it, and the pain it
inflicted on me.&lt;/p&gt;
&lt;p&gt;But good lord, it's so fucking CUTE.&lt;/p&gt;
&lt;p&gt;Usually I wouldn't write this up until after I had things completely
finished, but I gave myself a week deadline for that, and that
deadline passed 2 days ago. I'm still in the middle of a few different
ideas for reversing it, but those could take a while (stupid real life
getting in the way of toy hacking), so I figured I'd dump what
information I do have now.&lt;/p&gt;


&lt;h2&gt;Resources&lt;/h2&gt;
&lt;p&gt;Before I get into technical descriptions, the current resources
available for updates on keepon hacking are:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="http://www.github.com/qdot/keepoff"&gt;The Keepoff Project Github Site&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/qdot/keepoff/blob/master/doc/keepon_reverse_engineering.asciidoc"&gt;Keepon Reverse Engineering Doc&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.twitter.com/qdot"&gt;My twitter account for real time updates&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.flickr.com/photos/qdot76367"&gt;My flickr account, for pictures when I remember to upload them&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Default Interfaces (i.e. What It's Supposed to Do)&lt;/h2&gt;
&lt;p&gt;The MyKeepon robot isn't the most complicated toy in the world. The
main selling points of the toy are that it's cute, it's interactive, it
can dance to your music, and it's cute. &lt;/p&gt;
&lt;p&gt;&lt;CENTER&gt;&lt;a href="https://www.flickr.com/photos/qdot76367/6326487862/" title="photo.JPG by qdot76367, on Flickr"&gt;&lt;img src="https://farm7.static.flickr.com/6214/6326487862_2f74c73eec_m.jpg" width="180" height="240" alt="photo.JPG"&gt;&lt;/a&gt;&lt;/CENTER&gt;&lt;/p&gt;
&lt;p&gt;To establish the interactive cuteness, the following interfaces are
available to users:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;2 Front Buttons, for switching between "Touch" and "Music" mode&lt;/li&gt;
&lt;li&gt;5 Body Buttons, inside the skin of the keepon. 4 around the sides,
  one on the top of the head.&lt;/li&gt;
&lt;li&gt;A Microphone in the nose  &lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;CENTER&gt;&lt;a href="https://www.flickr.com/photos/qdot76367/6325747781/" title="photo.JPG by qdot76367, on Flickr"&gt;&lt;img src="https://farm7.static.flickr.com/6031/6325747781_19068f41db_m.jpg" width="240" height="180" alt="photo.JPG"&gt;&lt;/a&gt;&lt;/CENTER&gt;&lt;/p&gt;
&lt;p&gt;There's 4 degrees of freedom in movement for the MyKeepon (X-Axis
horizontal, Y-Axis up, Z-Axis through):&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Base rotation, that rotates the whole bot around the Y-Axis.&lt;/li&gt;
&lt;li&gt;X-Axis bend, for bending the head/body left/right&lt;/li&gt;
&lt;li&gt;Z-Axis bend, for bending the head/body forward/back&lt;/li&gt;
&lt;li&gt;Y-Axis compression, for 'squatting' (action when top of head button
  is tapped)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;A speaker is also available for playing sounds. There is no volume
knob, sounds are always on, and always loud. The secondary market for
keepons with volume knobs installed is going to be killer.&lt;/p&gt;
&lt;h2&gt;Keepon States&lt;/h2&gt;
&lt;p&gt;MyKeepon has 3 states when powered on:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Touch - Pays attention to buttons, looks around&lt;/li&gt;
&lt;li&gt;Music - Listens for music, or dances to rhythm tapped out using head
  button&lt;/li&gt;
&lt;li&gt;Sleep - Processor(s) in low power state, can be brought out of sleep
  by hitting the music button or tapping the head, which causes the
  reset line to be pulled.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;Mechanisms and Power&lt;/h2&gt;
&lt;p&gt;Keepon runs on a 12v supply, using either 8 AA batteries in what has
to be the worst enclosure I've ever had the displeasure of jamming
batteries into, or a 12v plug, not sold with the device (though
they give a TON of info about the plug sizes needed in the manual, a
surprising, rarely seen bit of information). The 12v runs to the
motors, and is stepped down to 3.3v for the internal circuit.&lt;/p&gt;
&lt;p&gt;&lt;CENTER&gt;&lt;a href="https://www.flickr.com/photos/qdot76367/6326491472/" title="photo.JPG by qdot76367, on Flickr"&gt;&lt;img src="https://farm7.static.flickr.com/6221/6326491472_d923ca47ba_m.jpg" width="180" height="240" alt="photo.JPG"&gt;&lt;/a&gt;&lt;/CENTER&gt;&lt;/p&gt;
&lt;p&gt;Each of the motors has an encoding mechanism on it, but I haven't
really done much work figuring out exactly what it is yet. I believe
the largest motor for turning the base is a regular gear motor (see
encoding conjecture below), but the bend motors may be small servos.&lt;/p&gt;
&lt;p&gt;&lt;CENTER&gt;&lt;a href="https://www.flickr.com/photos/qdot76367/6325743721/" title="photo.JPG by qdot76367, on Flickr"&gt;&lt;img src="https://farm7.static.flickr.com/6219/6325743721_6cf38e9db2_m.jpg" width="240" height="180" alt="photo.JPG"&gt;&lt;/a&gt;&lt;/CENTER&gt;&lt;/p&gt;
&lt;p&gt;On the circuit board, there are 3 pins (see picture below) that come
in contact with this piece that's mounted on top of the battery pack.
It's used to recenter the bot on boot, as no state may be available
for motors to know their position at time of last power off. The bot
being centered is the only position where the middle pin won't have an
electrical contact with one of the outer pins on the circuit board.
This accounts for part of the "startup time" mentioned in the keepon
manual.&lt;/p&gt;
&lt;h2&gt;Circuits and Chips&lt;/h2&gt;
&lt;p&gt;There's one main circuit board in the MyKeepon, with 2 very odd
processors. &lt;/p&gt;
&lt;p&gt;&lt;CENTER&gt;&lt;a href="https://www.flickr.com/photos/qdot76367/6296272375/"
title="Keepon circuit by qdot76367, on Flickr"&gt;&lt;img
src="https://farm7.static.flickr.com/6031/6296272375_8de75e398b.jpg"
width="500" height="375" alt="Keepon circuit"&gt;&lt;/a&gt;&lt;/CENTER&gt;&lt;/p&gt;
&lt;p&gt;The microprocessors are &lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Padauk P234CS20 (P234 chip in a SOP20 package)&lt;/li&gt;
&lt;li&gt;Padauk P232CS14 (P232 chip in a SOP14 package)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Their data sheets describe them as FPPAs, or "Field Programmable
Processor Arrays". I have no idea how this makes them special. They're
dual core, hence the "processor array", but other than that, they look
like they're One Time Programmable, hence the Field Programmable part
being... questionable at best. Also, as marcan found,
&lt;a href="https://twitter.com/#!/marcan42/status/131869545903296512"&gt;they tend to straight up lift figures and paragraphs from the PIC datasheets&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The two processors talk to each other via I2C. The P232 deals with
sound and encoders and is the slave node on the I2C bus, while the
P234 handles driving the H-bridges, main processing (including
handling button presses), and is the master node on the I2C bus.
There's more info on this communication in the next section.&lt;/p&gt;
&lt;p&gt;There's 3 H-bridges, running to the motors listed earlier. &lt;/p&gt;
&lt;h2&gt;I2C Bus and Protocol&lt;/h2&gt;
&lt;p&gt;The MyKeepon developers have been nice enough to provide pads to
access the I2C bus between the processors. Looking at the board with
the solder mask facing up, it's on the lower left hand side of the
board, marked with a gigantic smilie face. So they were damn well
aware of what they were doing.&lt;/p&gt;
&lt;p&gt;For those not familiar with the I2C Bus, there's a good tutorial at
&lt;a href="http://www.i2c-bus.org/"&gt;http://www.i2c-bus.org/&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;The pads exist on the bus between the processors, meaning that you can
see how the master node addresses the slave node. I2C addresses are
for "devices", and there can be up to 127 devices on the bus, each of
which can be written to/read from. As of this writing, only 2 devices
have been found:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;0x52 - Sound&lt;/li&gt;
&lt;li&gt;0x55 - Motors&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;To trigger a sound, a message is written by the master node to the
bus in the format "0x01 0xWW", where WW is the index of a sound. The
following indexes are known so far (though more sounds are certainly
available, they just haven't been mapped yet):&lt;/p&gt;
&lt;p&gt;Motor messages are 3 bytes, of the format "0xUU 0xWW 0xVV", where:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;0xUU: Motor Index&lt;/li&gt;
&lt;li&gt;0xWW: Motor Position?&lt;/li&gt;
&lt;li&gt;0xVV: Unknown&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;To retrieve information about the motors, a read request for 12 bytes
is sent to device 0x55. In normal communications between the
processors, this request is sent every ~15ms by the master node. The
format of the returned information is currently unknown.&lt;/p&gt;
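&lt;p&gt;To make those byte layouts concrete, here's a small Python sketch that builds the payloads exactly as described above; the sound index and the third motor byte are placeholders, since their meanings haven't been mapped yet:&lt;/p&gt;

```python
# Build the I2C payloads for the two known MyKeepon devices, per the
# reverse-engineered format: sound is "0x01, index", motors are
# "index, position(?), unknown".
SOUND_DEVICE = 0x52
MOTOR_DEVICE = 0x55

def sound_message(index):
    return bytes([0x01, index % 256])

def motor_message(motor, position, unknown=0x00):
    return bytes([motor % 256, position % 256, unknown % 256])

print(sound_message(0x05).hex(), motor_message(0x01, 0x80).hex())
```

&lt;p&gt;The device address itself isn't part of the payload; it goes in the I2C address byte.&lt;/p&gt;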
&lt;p&gt;Information about button presses (either body or base buttons) and
input from the microphone do not seem to be relayed across the I2C
bus.&lt;/p&gt;
&lt;h2&gt;Accessing the I2C Bus&lt;/h2&gt;
&lt;p&gt;Since the protocol of the I2C bus is (mostly) known, the problem
becomes talking to the devices on the bus. The tap exists in between
the chips, and the master node does not seem to have the capabilities
to work on a multimaster bus (not to mention that the default animations
being timing-based means we can't stop unwanted moves from being
sent). When the device is asleep, the data and clock lines are pulled
to ground, meaning we can't actually communicate to any of the other
devices.&lt;/p&gt;
&lt;p&gt;The only current solution seems to be to lift the I2C pins on one of
the chips. This seems like the wrong thing to do, since the pads are
obviously available and there's a smilie face above them, letting on
that there's some way to easily access the hardware on the board.&lt;/p&gt;
&lt;h2&gt;Simple Modifications&lt;/h2&gt;
&lt;p&gt;&lt;CENTER&gt;&lt;a href="https://www.flickr.com/photos/qdot76367/6326494230/" title="photo.JPG by qdot76367, on Flickr"&gt;&lt;img src="https://farm7.static.flickr.com/6034/6326494230_3462cb3a72_m.jpg" width="180" height="240" alt="photo.JPG"&gt;&lt;/a&gt;&lt;/CENTER&gt;&lt;/p&gt;
&lt;p&gt;I've only made a couple of small modifications so far. The first is
running a wire from the I2C pads out of the bot. I dremeled a small
hole in the back near the plug outlet, and made sure the wire had a
LOT of play inside the bot, since turning pulls it around inside.&lt;/p&gt;
&lt;p&gt;The second is putting a switch in between the speaker wires. This
allows for the sound to be turned off, which is a huge blessing when
you're just trying to watch the bus. A rheostat could also be
installed to turn the volume down. There's a good amount of room for
installation of the rheostat inside the device.&lt;/p&gt;
&lt;h2&gt;What's Left to Do&lt;/h2&gt;
&lt;p&gt;The protocol is known for the motors and sound banks, apart from the
meaning of the 12-byte response from the motor device. I've actually got a board
where I've lifted the I2C pins on the slave processor, and am going to
see whether I can communicate directly with it. Everything I'm using
that talks I2C doesn't deal with arbitration of a multi-master bus, so
it could very well be that the master node on-board will arbitrate
correctly if another node interrupts, but I haven't figured out
whether or not that is true yet. It really does seem like there should
be an easier, non-pin-removing way to speak to the chips on the bus,
and I'm really hoping no one follows my lead on pin lifting before we
figure it out.&lt;/p&gt;
&lt;p&gt;However, once all that's done, we should have a completely
USB-controllable Keepon, which I then have all sorts of ideas for, except
NO NOT THAT you pervert.&lt;/p&gt;</summary></entry><entry><title>Console Controls Usage and the Kinect SDK</title><link href="/2011/06/16/console-controls-usage-and-the-kinect-sdk/" rel="alternate"></link><updated>2011-06-16T14:15:09-07:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2011-06-16:2011/06/16/console-controls-usage-and-the-kinect-sdk/</id><summary type="html">&lt;p&gt;Oh frabjous fucking day. The &lt;a href="http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/default.aspx"&gt;Microsoft Kinect SDK is out&lt;/a&gt;. Along
with a license that takes a &lt;a href="http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/faq.aspx"&gt;very, very nasty FAQ to explain&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;On this big day in UI development, let's take a look over the current
console controls landscape, and what it means to non-game
developers.&lt;/p&gt;
&lt;p&gt;Why focus on game consoles controls? They've driven down sensor prices
like crazy, due to mass manufacturing and required price points for
game sales. They've established more than a few careers of
non-game-developers now. Uses of the Kinect and the Wiimote for
projects not pertaining to their original console have been all over
the media lately. Keeping a forecast of where development for these
technologies is going means we have a better idea of how to ride the
wave when it comes.&lt;/p&gt;
&lt;h2&gt;Disclaimers&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;In terms of licensing issues, I am not a lawyer. I do not play one
  on TV. However, I do have a lawyer fursona.&lt;/li&gt;
&lt;li&gt;While I am part of the OpenKinect project, I do not speak for others
  involved in the project. All opinions expressed here are my own, and
  all cursing is far fucking better than anyone else on the project
  could turn out, so while I may share my source code, I'm not giving
  them rights to that.&lt;/li&gt;
&lt;li&gt;I strive to keep all the information as correct as possible, but,
  well, I've been drinking.&lt;/li&gt;
&lt;li&gt;I am not a game developer. I am a reverse engineer that specializes
  in controls and interface devices. My view of this hardware is
  purely from the driver and capabilities side.&lt;/li&gt;
&lt;li&gt;I have not directly used the Move SDK or Kinect SDK. But I have read
  some articles and created very strong opinions, which means they are
  valid for internet consumption.&lt;/li&gt;
&lt;li&gt;This article is only about reversing/using alternative console
  controllers, not about reversing consoles themselves. There's a
  completely different history to that which would take much more than
  a blog post to cover, though I will admit that it does have some
  influence on the information here.&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;Nintendo Wiimote&lt;/h2&gt;
&lt;p&gt;The WiiMote was first out of the gate, so let's start with it. &lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-wiimote.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;You get:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;100hz update for 4 IR points with 4 bits of depth @ 1024x768&lt;/li&gt;
&lt;li&gt;3-axis Accelerometer&lt;/li&gt;
&lt;li&gt;Bluetooth Communication&lt;/li&gt;
&lt;li&gt;3-axis Gyro with WiiMotion Plus&lt;/li&gt;
&lt;li&gt;Extensible Control Port&lt;/li&gt;
&lt;/ul&gt;
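&lt;p&gt;As an example of the kind of data you get back: the accelerometer reports raw byte values, and the Wiimote stores zero-point and +1g calibration values in its EEPROM (per the wiibrew protocol docs), so converting to g is one line of arithmetic. A sketch; fetching the calibration block is left to whatever Wiimote library you use:&lt;/p&gt;

```python
def accel_to_g(raw, zero, one_g):
    # (raw - zero) scaled by the distance between the stored zero point
    # and the stored +1g point gives acceleration in units of g.
    return (raw - zero) / float(one_g - zero)

# E.g. with a zero point of 0x80 and a +1g point of 0x9A, a raw reading
# of 0x9A is exactly 1g, and 0x80 is 0g.
```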
&lt;h3&gt;What Nintendo provides in terms of software&lt;/h3&gt;
&lt;p&gt;Nothing.&lt;/p&gt;
&lt;h3&gt;What the community provides in terms of software&lt;/h3&gt;
&lt;p&gt;The &lt;a href="http://wiibrew.org/wiki/Wiimote"&gt;Wiimote protocols have been reversed and known for years&lt;/a&gt;, and
there's availability in pretty much every language you can think of.&lt;/p&gt;
&lt;h3&gt;What Nintendo thinks of non-game developers&lt;/h3&gt;
&lt;p&gt;Are you Zelda? Are you Mario? No. No you are not. Therefore, go fuck
yourself.&lt;/p&gt;
&lt;p&gt;Nintendo hasn't put out anything in terms of press releases about the
DIY community during the lifetime of the Wiimote. They're happy to let
them live separately, and that's a fine strategy. Nintendo has to put
zero into support, they aren't actively stopping people from using the
Wiimote, and developers can survive on their own.&lt;/p&gt;
&lt;h3&gt;Where Nintendo is going with it&lt;/h3&gt;
&lt;p&gt;Honestly? I'm not really sure. Even with the announcement of the WiiU
controller, there hasn't been a lot of talk about Nintendo's flaily
controls strategy.&lt;/p&gt;
&lt;h3&gt;Where the community is going with it&lt;/h3&gt;
&lt;p&gt;Where haven't they? There's &lt;a href="http://johnnylee.net/projects/wii/"&gt;Johnny Lee's demos&lt;/a&gt;, there's
&lt;a href="http://www.colorsaregood.de/index.php?cont=4&amp;amp;inhalt=oioo"&gt;sex toys&lt;/a&gt;, there's more "generative art" than you could shake a
wiimote at. The Wiimote is as ubiquitous as alternative controls get
these days.&lt;/p&gt;
&lt;p&gt;With the WiiMotion Plus, it even turns into a nice IMU. The WiiMote
still has some life to it as a cheap, extensible sensor platform,
especially with the amount of prior usage it has seen already. Outside
of a somewhat flaky Bluetooth interface, the "just works"ness of it
will keep it alive for a while to come.&lt;/p&gt;
&lt;h2&gt;Nintendo WiiU Controller&lt;/h2&gt;
&lt;p&gt;I don't even know if this'll exist in a year, but it's been
announced, so might as well talk about it.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-wiiu.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;You Get:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A 6.2-inch screen&lt;/li&gt;
&lt;li&gt;No idea on communications method, but enough bandwidth to pipe over video in real time&lt;/li&gt;
&lt;li&gt;Joystick&lt;/li&gt;
&lt;li&gt;3-axis Accel/Gyro&lt;/li&gt;
&lt;li&gt;Front Facing Camera&lt;/li&gt;
&lt;li&gt;Probably other stuff that I'm missing or they'll add.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;What Nintendo provides in terms of software&lt;/h3&gt;
&lt;p&gt;Well, it's not gonna come out for another 18 months, so I have no
idea. It may not even look the same by then.&lt;/p&gt;
&lt;h3&gt;Where Nintendo's going with it&lt;/h3&gt;
&lt;p&gt;I... don't know, and I don't think that's good. It's almost &lt;em&gt;too&lt;/em&gt;
integrated, when we're already seeing new controllers that provide
both physical controls and a detachable screen.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-livid.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;&lt;small markdown='1'&gt;Above: &lt;a href="http://lividinstruments.com/"&gt;Livid Instruments Controller&lt;/a&gt; integrated with iPad&lt;/small&gt;&lt;/p&gt;
&lt;p&gt;These are using tablets that, by the release time of the WiiU, should
be under $100. Between the Android OpenAccessories SDK and whatever
Apple decides to do, this will be replicated and surpassed well before
release. Not only that, the rumors are flying about the WiiU only
working with one controller per console.&lt;/p&gt;
&lt;p&gt;In the end, it's really too far out to make accurate predictions for
this, though that's obviously not stopping me from trying. I thought
the GBA interface to Gamecube games was great, but, well, it could
still be a GBA after that too. This... who knows. It doesn't seem as
mind-breaking as the Wiimote did, but then again, you only get to
one-up the joystick once.&lt;/p&gt;
&lt;h3&gt;What Nintendo thinks of non-game developers&lt;/h3&gt;
&lt;p&gt;You still won't be Mario or Zelda, so you can still go fuck yourself. &lt;/p&gt;
&lt;p&gt;More importantly this time, though: why would it be easier for you to
work with Nintendo's hardware than with a tablet and another
accessory? Nintendo has already proven time and time again they don't
care to support developers outside of console games.  Doesn't seem
like this will gain community traction in its current state.&lt;/p&gt;
&lt;h2&gt;Playstation Move&lt;/h2&gt;
&lt;p&gt;Sony's entry into the motion control market: a wand with a light on it.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-move.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;You get:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;A wand with a light on it&lt;/li&gt;
&lt;li&gt;Bluetooth connection&lt;/li&gt;
&lt;li&gt;Accelerometer&lt;/li&gt;
&lt;li&gt;Gyro&lt;/li&gt;
&lt;li&gt;EyeToy Camera to watch wand with light on it&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;What Sony provides in terms of software&lt;/h3&gt;
&lt;p&gt;The &lt;a href="http://us.playstation.com/ps3/playstation-move/move-me/"&gt;MoveMe SDK&lt;/a&gt;, announced at GDC, not yet released. This allows
you to get full real-time 3d positioning from Move controllers.&lt;/p&gt;
&lt;p&gt;It requires you to get this information via a network connection to
your PS3. The tracking algorithms are locked onto the PS3, and you
have to have the console running special server software to use the
SDK.&lt;/p&gt;
&lt;p&gt;To develop for the Move, you have a minimum investment of around $600,
for the controller plus the console. A fanboy would tell you that you
also get a Blu-Ray player out of that. I am not a fanboy.&lt;/p&gt;
&lt;h3&gt;What the community provides in terms of software&lt;/h3&gt;
&lt;p&gt;There's the &lt;a href="http://code.google.com/p/moveonpc/"&gt;MoveOnPC project&lt;/a&gt;, which is an effort to create open
source drivers for using the Move as a control mechanism. They're not
very far along yet, but the project is at least active.&lt;/p&gt;
&lt;h3&gt;What Sony thinks of non-game developers&lt;/h3&gt;
&lt;p&gt;"Absolutely adorable" is about the best thing I can think of. They
think the homebrew and DIY community is cute, and they seem to view
homebrew devs, much like their customers, as lesser beings. Weirder
still, they seem to harbor both fear and disdain at the same time, as
seen in the way they released the SDK. They're making SURE you have to
have a console, and that you only get what they want, and getting to
their algos will require both hardware and software reversing.&lt;/p&gt;
&lt;h3&gt;Where Sony's going with it&lt;/h3&gt;
&lt;p&gt;Same idea as the Wiimote, except more accurate positioning. Build
plastic toys around it, use those as controls. Oh boy.&lt;/p&gt;
&lt;h3&gt;Where the community is going with it&lt;/h3&gt;
&lt;p&gt;This turns the Nintendo view around... Does the community really care?
I haven't seen bounties out for the Move. I haven't seen message
boards filled with people working on it. This time, it seems like the
company would be insulting the community with the way they released
their SDK, but the community doesn't even care in the first place.&lt;/p&gt;
&lt;h2&gt;Microsoft Kinect&lt;/h2&gt;
&lt;p&gt;For the Kinect, we've got so many different solutions out there now
that I'm actually splitting them into their own sections.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-teardown.jpg" /&gt;&lt;/p&gt;
&lt;p&gt;You get:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;640x480 RGB @ 30hz&lt;/li&gt;
&lt;li&gt;320x240 Depth @ 30hz, .5-4m depth range&lt;/li&gt;
&lt;li&gt;USB 2.0 High Speed Connection&lt;/li&gt;
&lt;li&gt;Microphone Array&lt;/li&gt;
&lt;li&gt;LED, Accelerometer&lt;/li&gt;
&lt;/ul&gt;
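&lt;p&gt;The raw depth values are 11-bit disparity-ish numbers, not meters. One community-derived approximation for converting them (the tangent fit that floated around the OpenKinect imaging notes; treat the constants as empirical, not canonical):&lt;/p&gt;

```python
import math

def raw_depth_to_meters(raw):
    # Empirical fit from the OpenKinect community: decent in the middle
    # of the .5-4m working range, garbage outside it.
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)
```

&lt;p&gt;A raw value around 700 lands near a meter, which matches the stated .5-4m working range.&lt;/p&gt;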
&lt;h2&gt;Microsoft Kinect - OpenKinect&lt;/h2&gt;
&lt;p&gt;&lt;a href="http://www.openkinect.org"&gt;OpenKinect&lt;/a&gt; was the project that sprung up around the
&lt;a href="http://www.adafruit.com/blog/2010/11/10/we-have-a-winner-open-kinect-drivers-released-winner-will-use-3k-for-more-hacking-plus-an-additional-2k-goes-to-the-eff/"&gt;OpenKinect Bounty hosted by Adafruit Industries&lt;/a&gt; in November
2010. If you'd like a more in-depth history, check out
&lt;a href="http://fora.tv/2011/05/21/Kyle_Machulis_OpenKinect"&gt;my presentation on it at Maker Faire 2011&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-openkinect.png" /&gt;&lt;/p&gt;
&lt;p&gt;As I said in the intro, while I am part of the OpenKinect project,
what I am stating about the project here is my own opinion; I do not
speak for other members of the project.&lt;/p&gt;
&lt;h3&gt;What OpenKinect provides&lt;/h3&gt;
&lt;p&gt;An open-source, cross-platform method for accessing raw data coming
off of the Kinect. Nothing more, nothing less. It was the first to
publicly provide access to images, LEDs, and accelerometers on all
major platforms. Audio support is in the works, and has been taken to
the proof of concept stage.&lt;/p&gt;
&lt;p&gt;There's been talk of including processing algorithms created by the
open source community, in a separate library. This part of the project
has not yet taken shape, though, and most concentration lies in
finishing the driver.&lt;/p&gt;
&lt;h3&gt;What OpenKinect thinks of non-game developers&lt;/h3&gt;
&lt;p&gt;Not to sound harsh, but they just don't. They don't really think about
any kind of specific developer. The driver exists to be just the
driver, and this is the simplest way to serve that goal. You take it
and do whatever you want with it. Have fun.&lt;/p&gt;
&lt;p&gt;What this means is that, unlike OpenNI and Microsoft's SDK, OpenKinect
is the easiest way to get the raw data from the camera. If you are
interested in doing something other than skeleton tracking, this makes
it the lowest barrier to entry.&lt;/p&gt;
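&lt;p&gt;For instance, with the libfreenect Python wrapper, grabbing a raw frame is a single &lt;code&gt;freenect.sync_get_depth()&lt;/code&gt; call, and from there you're just doing array math. A sketch of the sort of non-skeleton work the raw data enables, run here on a synthetic frame since the real call needs hardware:&lt;/p&gt;

```python
import numpy as np

def nearest_pixel(depth_frame):
    # Raw Kinect depth uses 0 for "no reading", so mask those out before
    # looking for the closest valid pixel.
    masked = np.where(depth_frame == 0, 2047, depth_frame)
    return np.unravel_index(np.argmin(masked), depth_frame.shape)

# Synthetic 640x480 frame standing in for freenect.sync_get_depth()[0]:
frame = np.full((480, 640), 1000, dtype=np.uint16)
frame[240, 320] = 400  # a near object at the center
```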
&lt;h3&gt;Where OpenKinect is going with it&lt;/h3&gt;
&lt;p&gt;At this point, it's all about finishing providing the features of the
camera. This mostly has to do with the audio core, as the camera
features are fairly well covered.&lt;/p&gt;
&lt;p&gt;In terms of what will happen with OpenKinect now that the Microsoft
SDK is out, I think the answer is "not much". There's a fork of
OpenNI's sensor library that uses OpenKinect to talk to the Kinect and
provide the rest of the OpenNI capabilities for that camera. MS has no
interest in supporting non-Windows 7 platforms, so there will
certainly still be a place for OpenKinect in the Kinect ecosystem.&lt;/p&gt;
&lt;p&gt;Not only that, Windows has been by far the hardest platform to deal
with for development and support for OpenKinect. I actually hope that
we can build an API wrapper around the MS SDK, to make it fit with
OpenKinect without having to switch out drivers. However, having not
read much of the MS SDK documentation as of yet, this remains to be
seen. &lt;/p&gt;
&lt;p&gt;Also, with the licensing terms as they currently are, OpenKinect
remains commercially viable on all platforms, while Microsoft's SDK
specifically prohibits that kind of usage.&lt;/p&gt;
&lt;h2&gt;Microsoft Kinect - PrimeSense/OpenNI&lt;/h2&gt;
&lt;p&gt;A few weeks after the OpenKinect project put out the source for their
library, depth camera chip manufacturer &lt;a href="http://www.primesense.com"&gt;PrimeSense&lt;/a&gt; followed suit
with their own SDK, known as &lt;a href="http://www.openni.org"&gt;OpenNI&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-openni.png" /&gt;&lt;/p&gt;
&lt;p&gt;OpenNI is both a library and an initiative. The library is the
implementation of a specification that PrimeSense is pushing as the
standard SDK for future depth sensors. The initiative part includes
multiple companies on board with this standard.&lt;/p&gt;
&lt;h3&gt;What PrimeSense provides&lt;/h3&gt;
&lt;p&gt;PrimeSense provides 3 different libraries, all cross-platform, with
varying open/closed source policies:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Sensor - Hardware access library. Their version only supports their
  PrimeSense SDK camera, but there's a version that's built on top of
  OpenKinect for Kinect access. Open Source.&lt;/li&gt;
&lt;li&gt;OpenNI - This is supposed to be an "abstraction library for depth
  cameras". Which I guess translates into "huge C++ beast". It's a
  way to abstract depth camera information so you can use anything
  providing depth data with any program that will take depth
  data. Seeing as the only consumer depth cameras are PrimeSense's, this
  works out well for them. Open Source.&lt;/li&gt;
&lt;li&gt;NITE - This is PrimeSense's body/skeleton tracking library. Unlike
  Microsoft's algorithms, NITE requires a calibration pose to find
  skeletons, meaning you have to stand in front of the camera and hold
  your arms in a certain way for it to find you in the scene. This can
  be flaky sometimes. Unlike Microsoft's libraries, NITE is available
  for commercial use. It is closed source, but the binaries are free.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;What they think of non-game developers&lt;/h3&gt;
&lt;p&gt;Developers, via OpenNI, are marketing for PrimeSense.&lt;/p&gt;
&lt;p&gt;Funny enough, PrimeSense doesn't actually care about game
developers. It's not really even their domain. Primesense wants to own
the home theatre remote control market, hence their focus on things
like the Asus Wavi Xtion. The battle for being able to control your
media consumption is far larger than the battle to control how you
flail in front of your TV and call it control. More people watch TV
and movies than play video games, and the number of hours logged
watching is vastly higher. TV remotes are getting
unwieldy now, so the next step that is apparently logical to someone
with money is that we now wave our hands around to start and stop our
movies, or change our channels.&lt;/p&gt;
&lt;p&gt;On top of this, PrimeSense is not a camera company, they are a chip
company. They need to be able to sell their chip in large quantities
to people who will manufacture cameras with it. Therefore, if
developers make applications with OpenNI/NITE, and OpenNI/NITE will
"just work" with any PrimeSense camera, then PrimeSense gets to claim
they are "open", and anyone that manufactures a camera with a
PrimeSense chip will have access to all the applications that've
already been written by other developers.&lt;/p&gt;
&lt;h3&gt;Where they're going with it&lt;/h3&gt;
&lt;p&gt;OpenNI/NITE is another weapon in the battle for the home theatre as
home information hub. That's why PrimeSense wants the chip and
software everywhere, not just in kinects. So, they're betting both for
and against Microsoft at the same time, which is a very interesting
position to be in.&lt;/p&gt;
&lt;p&gt;If NITE doesn't support non-calibration poses very soon, it's going to
lose out to SDKs that do. MS has proven you can track bodies without
calibration, even if it does take thousands of hours of video to
analyze through advanced algorithms. Now that MS's SDK is out and does
not require calibration poses, people's expectations are quickly going
to change. NITE still has a foothold on the commercial licensing side,
but that doesn't help much for consumer expectations of product.&lt;/p&gt;
&lt;h2&gt;Microsoft Kinect - Microsoft's SDK&lt;/h2&gt;
&lt;p&gt;On June 16th 2011,
&lt;a href="http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/"&gt;Microsoft released their own SDK for the kinect&lt;/a&gt;, much to the
surprise of just about everyone, since there had been no communication
since March about it.&lt;/p&gt;
&lt;p&gt;&lt;img alt="" src="/images/2011-06-16-console-controls-usage-and-the-kinect-sdk/kinect-mssdk.png" /&gt;&lt;/p&gt;
&lt;p&gt;As of this writing, the SDK is still considered "beta", and is only
available for non-commercial use.&lt;/p&gt;
&lt;h3&gt;What Microsoft provides&lt;/h3&gt;
&lt;p&gt;Seamless skeleton tracking, which is a huge deal. While the algorithms
have been released and replicated in open source, the advantage that
MS has here is big data. Thousands of hours of training sets to send
through the algorithm so that it can find any body type, meaning that
people can walk in and out of the scene and instantly be recognized
and tracked. Seriously fucking &lt;em&gt;BIG&lt;/em&gt; deal.&lt;/p&gt;
&lt;p&gt;There is also access to the audio system, which no project has gotten
a complete hold on yet. OpenKinect has some access to the raw audio
stream, but Microsoft provides a voice SDK that allows developers to
identify and position users based on sound.&lt;/p&gt;
&lt;h3&gt;What they think of non-game developers&lt;/h3&gt;
&lt;p&gt;With the beta licensing terms, developers are somewhere between
fanboys and marketing. Open Source developers are back on the bad
side, too.&lt;/p&gt;
&lt;p&gt;The &lt;a href="http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/faq.aspx"&gt;Beta SDK FAQ&lt;/a&gt; is chock full of interesting issues, including:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Not being able to distribute applications standalone; users must
  also download the SDK to get the runtimes&lt;/li&gt;
&lt;li&gt;No commercial use, and on top of that, due to the fact that MS
  cannot predict the usage of the Kinect SDK, all SDK-derived
  applications should not be considered "allowed under the SDK".&lt;/li&gt;
&lt;li&gt;Microsoft owns the right to say what software you can use the
  hardware with, and using the Kinect with anything outside of the SDK
  is not allowed. Even with this wording in place, the MSDN Channel 9
  launch video lauded all of the open source applications currently
  available for the kinect.&lt;/li&gt;
&lt;li&gt;The SDK will not run on Virtual Machines&lt;/li&gt;
&lt;li&gt;Refusal for any SDK derived application code to be released
  under copyleft licenses (GPL, etc...)&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;In other words, you can write kinect apps for yourself, then you can
upload videos of those apps to show off how awesome the SDK is, but
redistribution is not allowed, and neither is selling. There's no word
on when a commercial SDK will be available or how much it will cost.&lt;/p&gt;
&lt;p&gt;Of course, MS can't really track most of what they claim in the FAQ,
but it seems to be worded in a very predatory way. It's still better
than Sony's "lock down the algorithms on the console" idea, and it's
not really enforceable for a lot of projects. Whether anyone but nerds
like me will give a shit about that is a completely different question
that we're not going to answer because it's my blog and I'm the center
of the universe here.&lt;/p&gt;
&lt;h3&gt;Where they're going with it&lt;/h3&gt;
&lt;p&gt;Everywhere, and into everything, as fast as possible. Which, for a
company the size of Microsoft, will not be all that quick. There's
already talk of Windows 8 shipping with Kinect drivers.&lt;/p&gt;
&lt;p&gt;With the algorithms and samples Microsoft has put out in their SDK,
they've jumped way ahead of the other solutions in the NUI
game. Without the calibration pose requirement, MS SDK programs should
"just work" for people coming in to/out of the scene. It adds access
to the audio core that no other solution can get anywhere near at the
moment. For the time being, MS is now winning the NUI game in terms of
capabilities available to developers. &lt;/p&gt;
&lt;p&gt;At least, for developers who want to make non-distributable demos. Since
the license is still non-commercial only, that's all they're going to
win. This could end up pissing off the industry enough that they find
some way to replicate it without Microsoft's terms. At that point, we
have a very fun war on our hands.&lt;/p&gt;
&lt;h3&gt;Where the community is going with it&lt;/h3&gt;
&lt;p&gt;There's not much of a community to speak of since it's been out all of
10 hours as I write this. While I know &lt;a href="http://www.codeplex.com"&gt;CodePlex&lt;/a&gt; is huge, I've
never really dealt with the MS Open Source Community, or, well, any MS
community period in the past decade or so. I guess we'll see what
happens. I'm certainly interested to see how the cultural philosophies
of OpenKinect versus MS SDK influence the projects that come out of
them. &lt;/p&gt;
&lt;p&gt;For a better perspective on this, I defer to &lt;a href="http://nui.joshland.org/"&gt;Josh Blake&lt;/a&gt;, who I
am signing up to write an article on it before even asking him. Hi
Josh!&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;So, that's how I see things going for creative outside development on
console controls for the time being. Take it with a variable sized
grain of salt.&lt;/p&gt;
&lt;p&gt;For the Wiimote, it's gonna keep dropping from an already fairly
cheap price, and the fact that you don't have to solder things to it
makes it very handy for prototyping. It'll live for a while longer.&lt;/p&gt;
&lt;p&gt;I wouldn't be surprised if the Move never sees the light of day in the
maker community. It's just not getting any sort of traction from
either side, and it seems easy enough to just replicate at some point.&lt;/p&gt;
&lt;p&gt;For the Kinect, MS has a good grip on the NUI market now. However,
there are many other uses for depth cameras that don't require a body
to be in front of it. 3d modeling, robotics, art, etc... Yes, the
Kinect was made to track bodies, and that's what the camera range and
other properties of the camera reflect. While the MS SDK will make
accessing the raw data easier on Windows, it won't completely overtake
the Kinect development world.&lt;/p&gt;
&lt;p&gt;It'll also be interesting to see if the open source community can
figure out a way to source enough data to train their own algorithms,
and provide MS with some competition. We've thrown this idea around at
the &lt;a href="http://www.meetup.com/3DVision/"&gt;SF 3D Vision Meetup&lt;/a&gt; and in the &lt;a href="http://openkinect.org/wiki/IRC"&gt;OpenKinect IRC channel&lt;/a&gt;,
but that's a huge undertaking. It signals the next big move for the
open source community though. We've proven out big code, and seen huge
projects released as open source. Now it's time to start playing more
with open big data.&lt;/p&gt;</summary></entry><entry><title>Upcoming Speaking Engagements</title><link href="/2011/05/13/upcoming-speaking-engagements/" rel="alternate"></link><updated>2011-05-13T20:12:12-07:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2011-05-13:2011/05/13/upcoming-speaking-engagements/</id><summary type="html">&lt;p&gt;This summer is shaping up to be a busy one...&lt;/p&gt;
&lt;p&gt;&lt;a href="http://makerfaire.com"&gt;&lt;img alt="" src="/images/2011-05-13-upcoming-speaking-engagements/makerfaire.gif" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;I'm speaking at &lt;a href="http://www.makerfaire.com"&gt;Maker Faire&lt;/a&gt;. Twice, even!&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;2011-05-21 3:30pm - &lt;a href="http://www.openyou.org"&gt;OpenYou.org&lt;/a&gt; Presentation, Health 2.0 Stage&lt;/li&gt;
&lt;li&gt;2011-05-21 6:00pm - &lt;a href="http://www.openkinect.org"&gt;OpenKinect&lt;/a&gt; Presentation, Main Stage&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;a href="http://quantifiedself.com/conference/"&gt;&lt;img alt="" src="/images/2011-05-13-upcoming-speaking-engagements/qs_conf.png" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Then there's the &lt;a href="http://quantifiedself.com/conference"&gt;Quantified Self Conference&lt;/a&gt; on May 28-29th,
2011, at the Computer History Museum in Mountain View, CA. There's no
central presentation, but honestly, I probably won't stop talking at
any point during the 2 days, as I have a table at the expo, plus will
be helping out with the health hardware session and the hackathon.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://en.www.netexplorateur.org/"&gt;&lt;img alt="" src="/images/2011-05-13-upcoming-speaking-engagements/netexplore_zoom.jpg" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;On June 23rd, I'll be doing a presentation on the &lt;a href="http://www.openkinect.org"&gt;OpenKinect&lt;/a&gt;
project at &lt;a href="http://en.www.netexplorateur.org/"&gt;NetExplorateur Zoom 2011&lt;/a&gt; in Paris.&lt;/p&gt;</summary></entry><entry><title>XIO - Novint's Gaming Exoskeleton</title><link href="/2011/04/24/xio-novints-gaming-exoskeleton/" rel="alternate"></link><updated>2011-04-24T22:12:21-07:00</updated><author><name>Kyle Machulis</name></author><id>tag:,2011-04-24:2011/04/24/xio-novints-gaming-exoskeleton/</id><summary type="html">&lt;p&gt;I'm not really used to writing about hardware that's barely at the
prototype stage yet, but damn, I cannot wait 'til this comes out so I
can start reverse engineering it.&lt;/p&gt;
&lt;p&gt;&lt;a href="http://www.novint.com"&gt;Novint&lt;/a&gt;, the company that manufacturers the &lt;a href="http://novint.com/index.php?option=com_content&amp;amp;view=article&amp;amp;id=39&amp;amp;Itemid=175"&gt;Falcon&lt;/a&gt; haptic
device (that I wrote/maintain the cross-platform &lt;a href="http://www.github.com/qdot/libnifalcon"&gt;libnifalcon&lt;/a&gt; for
- if you aren't familiar with the falcon,
&lt;a href="/2008/03/25/everything-i-know-about-the-novint-falcon-as-of-march-2008/"&gt;check out this rather exhaustive article I wrote on it a couple of years ago&lt;/a&gt;),
recently announced a merger with another company.&lt;/p&gt;
&lt;p&gt;The other company in the deal, Forcetek Enterprises, doesn't even seem
to exist outside of the PR about this merger (&lt;em&gt;UPDATE:&lt;/em&gt;
&lt;a href="http://www.forcetekusa.com/"&gt;Ok, I actually found their old website finally.&lt;/a&gt; Apparently this
was shown at E3?). Successful stealth mode.&lt;/p&gt;
&lt;p&gt;What came out of the merger...&lt;/p&gt;
&lt;p&gt;&lt;a href="http://novint.com/index.php?option=com_content&amp;amp;view=article&amp;amp;id=76&amp;amp;Itemid=178"&gt;&lt;img alt="" src="/images/2011-04-20-xio-novints-gaming-exoskeleton/xioarm.jpg" /&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;is a partial exoskeleton for gaming.&lt;/p&gt;


&lt;p&gt;The XIO is a sleeve exoskeleton that lets you feel forces
throughout the arm, rather than just through the hand as with the
Novint Falcon. There is actuation along the arm and at the elbow,
meaning a much larger force distribution and a much more immersive
feel. In applied terms, this means you'll be able to feel things like
gun kickback all the way through your shoulder, rather than just in
your hand. You can also do interesting things like simulating weight
and fatigue in the arms by restricting certain movements.&lt;/p&gt;
&lt;p&gt;This, combined with depth cameras like the Kinect, could be HUGE.
Players would have both full body tracking AND at least partial body
actuation, which is better than the "flail without feedback" option
we've had for years with the Power Glove/P5/Wiimote/Kinect/Move.&lt;/p&gt;
&lt;p&gt;The demo video below shows a full VR rig built from consumer
hardware, using&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Vuzix HMD&lt;/li&gt;
&lt;li&gt;XIO&lt;/li&gt;
&lt;li&gt;&lt;a href="http://www.tngames.com"&gt;TNGames Third Space Vest&lt;/a&gt; (&lt;a href="http://www.github.com/qdot/libthirdspacevest"&gt;Which I also write drivers for!&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;Some new gun controller&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;
&lt;div class="mdx-video-container"&gt;
&lt;iframe allowfullscreen="true" frameborder="0" mozallowfullscreen="true" src="//www.youtube.com/embed/lV3j2Yxv7jY" webkitallowfullscreen="true"&gt;&lt;/iframe&gt;
&lt;/div&gt;
&lt;/p&gt;
&lt;p&gt;The XIO is supposed to integrate with Novint's &lt;em&gt;F-Gen&lt;/em&gt; drivers. F-Gen
is a programming abstraction layer (similar to &lt;a href="http://sites.google.com/site/carlkenner/glovepie"&gt;GlovePIE&lt;/a&gt;) made to
implement haptics on top of arbitrary games, instead of doing direct
game integration (which Novint has with things like Source Engine
games and the Penumbra series). This potentially lets users script
haptics for whatever game they want. How well it works, I have no
idea, but it means that any game the Falcon supports should also be
supported by the XIO on release.&lt;/p&gt;
&lt;p&gt;Novint's yet again done a horrible job of the PR on this one: it
looks like no one has really picked up the news yet, even though the
press release went out over a week ago, and now all of the images on
the front page of their site are broken. &lt;a href="http://www.twitter.com/tomlucient"&gt;Novint's CEO even changed his
Twitter account at launch&lt;/a&gt;, for reasons I'm not really sure
of. &lt;/p&gt;
&lt;p&gt;That said, I've been incredibly happy with the quality of Novint's
engineering on the Falcon. Novint knows how to make extensible
hardware, as they've shown with the grip and firmware system on the
Falcon. It was a joy to reverse engineer, and I'm hoping that carries
over to this as well. I can't wait to get my hands on (and in) the XIO.&lt;/p&gt;