Thursday, April 19, 2018

Is it (almost) possible to create a sentient being with Amazon Web Services?

During my notable absence, I've been freelancing quite a bit -- anything from algorithmic cryptocurrency trading bots to an Alexa-controlled robotic hand. I'd been booting EC2 instances and storing things ever-so-handily in S3 buckets for quite some time, but with the massive influx of cutting-edge AWS tools, I got to thinking...
Is it possible to create a semi-sentient entity ("brain") using Amazon Web Services?
I don't know the answer, but I want to find out. Here's a preview of what I think is pursuit-worthy in this endeavor:

Sure, I'm biased given my neuroscience background, but I see many, if not most, of the components of a human brain in this mix. S3-Temporal Lobe, Lambda-Thalamus, SageMaker-V1/V2... and that doesn't even scratch the surface. I see association cortices here too! Not so sure yet how to achieve proprioception... More to follow.

Sunday, April 23, 2017

Using the Awus AC1200 (AWUS036ACH) with Kali Linux 2.0 (4.9.0-amd64 Kernel)

Just a quick post today, and sorry for the hiatus. I've been super busy working on my next steps towards a new career!

I've used Kali Linux for years, basically since the switch from Backtrack. I've also been using the AWUS036NH for my wireless pen testing activities, and just today I received my new AWUS036ACH. It was kind of an impulse buy, but it's an upgrade in speed, a new band (5.8 GHz), and a few other goodies.

Unfortunately, I had some real trouble getting the rtl8821au drivers installed, but the fix turned out to be simple...though I'm not quite sure why it ended up working. I also noticed there is a DKMS package in the Kali repo, realtek-rtl88xxau-dkms, but it would never compile the module since it wanted the 4.3 Linux kernel.

First, uninstall all linux-headers packages you currently have; I also uninstalled the older linux-image (4.0). Next, install the newest kernel image and headers, which in my case was 4.9.

Next, grab the great driver code from GitHub, written by Astsam and patched for monitor mode and packet injection. To include the ability to set txpower using iwconfig, we also need to make sure to grab the v4.3.21 branch, which I did with: git clone -b v4.3.21. From there, simply make and make install.
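
Pulled together, the whole sequence looked roughly like this. The repository URL and package names below are my best-guess fill-ins (verify them against Astsam's GitHub account and your own dpkg output before running anything):

```shell
# Purge the old headers (and old image), then install the current kernel
# image and headers -- adjust names to what 'dpkg -l | grep linux-' shows.
apt-get remove --purge "linux-headers-*"
apt-get update
apt-get install linux-image-amd64 linux-headers-amd64
reboot

# Clone the v4.3.21 branch of the patched driver and build it.
# (Repository URL assumed, not from the original post.)
git clone -b v4.3.21 https://github.com/astsam/rtl8812au.git
cd rtl8812au
make
make install
```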

The key for me was purging the old linux image and headers, then reinstalling the new headers. I tried 4 or 5 different drivers, including Astsam's, with absolutely no luck compiling. For whatever reason, the reinstall allowed the driver to compile without issues, and now I'm up and running!

Tuesday, January 31, 2017

A Simple Transistor Circuit for Driving an LED with PWM from an Arduino

I have a confession to make. I've spent the last 4 years constantly engaged in programming and simple circuit building, and after all this time...I've never used a transistor in a circuit. It's not that I didn't want to, or that I didn't realize the critical importance of transistors; it was just that each time I investigated using one, I never fully wrapped my head around it, and I was able to use a relay in its place for the projects I was working on.

Arduino with simple NPN transistor circuit driving LED.

If using a transistor is second nature for you, the rest of this post will probably be fairly boring...but I thought it could help people like me who are working with them for the first time.

Sunday, January 29, 2017

[Part 11c] Icarus ONE: High-altitude Balloon (The Code -- Raspberry Pi 3)

DISCLAIMER: As with the code before, it’s worth mentioning that I’m entirely self-taught, and I’m certain that there are many more elegant approaches to what I’ve done here. Everything works though, so I'm calling it a victory for now, but I’d love to hear any suggestions or thoughts.

Raspberry Pi 3 (Python Media Control Code)

The main Python camera control script performs only a few functions, but it's responsible for coordinating most of the actions the Raspberry Pi performs on Icarus’ journey. There are 4 specific capture modes active over the course of the flight, each controlling photo and video capture from the 3 onboard cameras. The parameters for each phase are optimized (by educated guessing) to capture the highest-quality media at each flight phase. For example, the down-facing webcam captures video of the launch site as Icarus ascends, shutting off at 5,000 ft – hypothetically, it should look pretty cool watching the launch team get smaller and smaller as Icarus rises.
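
The mode-switching logic can be sketched as a simple altitude-to-mode mapping. Everything here is my own illustration, not the flight code: the function name and every threshold except the 5,000 ft webcam cutoff mentioned above are assumptions.

```python
def capture_mode(altitude_ft):
    """Map the current altitude to one of the 4 capture modes.

    Only the 5,000 ft boundary comes from the post; the other
    thresholds are placeholders for illustration.
    """
    if altitude_ft < 5000:
        return 1   # launch: down-facing webcam records the launch site
    elif altitude_ft < 60000:
        return 2   # ascent: periodic stills and video segments
    elif altitude_ft < 90000:
        return 3   # near-peak: highest-quality stills
    else:
        return 4   # peak: final captures around burst altitude
```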

The libraries used were fairly standard, with a few exceptions. The Python “picamera” library allows easy media capture from the RPi camera and was a straightforward solution. Media capture using this library comes directly from the script, with no external handlers necessary. The webcams, however, required slightly more work to integrate. Luckily, the RPi recognizes most mainstream webcams natively, assuming you’re running Raspbian. Thanks to Dave Akerman (as with many other things), I use “fswebcam” as the still-image capture solution for the webcams and was able to do so with minimal setup.
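
Calling fswebcam from Python comes down to shelling out with subprocess. This is a minimal sketch, not the flight script; the function names, device path, and resolution are mine, while -d, -r, and --no-banner are standard fswebcam options.

```python
import subprocess

def fswebcam_command(device, outfile, resolution="1280x720"):
    # Build the fswebcam invocation: -d selects the webcam device,
    # -r sets the resolution, --no-banner suppresses the caption bar.
    return ["fswebcam", "-d", device, "-r", resolution, "--no-banner", outfile]

def capture_webcam_still(device="/dev/video0", outfile="webcam.jpg"):
    # Requires fswebcam to be installed (apt-get install fswebcam).
    subprocess.call(fswebcam_command(device, outfile))

# The RPi camera side is even simpler with the picamera library:
#   from picamera import PiCamera
#   with PiCamera() as cam:
#       cam.capture("rpicam_still.jpg")
```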

The Python “serial” library is included to handle communication between the RPi and the Mega. This library is well documented and straightforward to use. The main factor in making Python serial work is choosing the correct device. I saw tutorials referencing different device names, but mine happened to be “/dev/ttyACM0.” This can be found by opening a terminal and looking in the “/dev” folder: when an Arduino is plugged into the RPi via USB, a new entry appears there, and that is what needs to be referenced.
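
That lookup can be automated instead of eyeballing /dev. A small sketch, assuming the Arduino enumerates as /dev/ttyACM* (as mine did); the helper name and the 9600 baud rate in the usage note are my assumptions, and the baud must match whatever Serial.begin() uses in the Mega sketch.

```python
import glob

def find_arduino_port(pattern="/dev/ttyACM*"):
    # On Raspbian, an Arduino plugged in over USB typically shows up as
    # /dev/ttyACM0 (or ttyACM1, ...). Return the first match, else None.
    ports = sorted(glob.glob(pattern))
    return ports[0] if ports else None

# Usage with pyserial (pip install pyserial):
#   import serial
#   link = serial.Serial(find_arduino_port(), 9600, timeout=1)
#   msg = link.readline().decode().strip()   # e.g. a phase-change trigger
```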

Sunday, January 22, 2017

[Part 11b] Icarus ONE: High-altitude Balloon (The Code -- Arduino Nano & Arduino Uno)

Arduino Nano & The “Selfie” Servo Code

The code on the Nano is very simple, with the single task of deploying a servo motor to a defined position when a signal is received from the Mega. The signal is just a digital input of HIGH triggered by the Mega. The Nano waits for this pin to pull HIGH, deploys the servo, then waits for the pin to pull LOW again. When it does, the servo returns to its retracted position. The values for servo retracted position (20) and servo deployed position (100) were determined through a little testing. These values create an arc of ~80 degrees, keeping the “selfie” photo out of the way when retracted and directly in front of the PiCamera when deployed.

Download the full Arduino sketch here: hab_camservo.ino

Saturday, January 21, 2017

[Part 11a] Icarus ONE: High-altitude Balloon (The Code -- Background & Arduino Mega)

The Code (Background)

DISCLAIMER: Any and all programming skills I have are entirely self-taught. I’ve slowly picked up better, cleaner, and more accepted conventions in my program structure, etc., but there are almost certainly many better approaches to the elements I’ve included in this project. If anyone has suggestions, criticisms, or input, I’d be more than happy to hear your thoughts, so I can make improvements in future iterations.

I’ll try to explain the code as simply as I can, without getting too wordy with all of the unimportant details. With me, that’s more easily said than done, but here goes nothing…

Approach, Challenges & Lessons Learned:

Within the entire body of code of Icarus, only a few programming languages are used (those that I’m comfortable with): Arduino, Python, and shell scripting. While limited in range of programming approaches, I tried to utilize the languages I know as efficiently as possible. On individual devices, things seemed to run smoothly, and it turned out that communication between devices was the most difficult challenge. Icarus was all about bringing together everything that I’ve learned over the past few years, so it served its purpose. But the difficulties I encountered only highlight the moral of the story: when designing an embedded system-based device with elements that really can’t fail, less is more. I knew this in principle before, but working on Icarus really drove it home for me.

Main Elements & Auxiliary Helpers:

Summarizing everything as simply as possible: the Arduinos run “Arduino” (Processing/Wiring) [obviously], communicating with each other and the RPi by digital I/O and serial, respectively. The Raspberry Pi 3 uses crontab to automatically launch the main media capture script, written in Python, which in turn performs various system-level actions by executing shell scripts.
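
The crontab side of that can be a single entry. This is only a sketch: the script path and log location are placeholders, not the actual flight paths.

```shell
# Edit the pi user's crontab with 'crontab -e' and add a line like this
# (paths illustrative) to launch the media script once at every boot,
# appending stdout/stderr to a log for post-flight debugging:
@reboot python /home/pi/icarus/media_control.py >> /home/pi/icarus/flight.log 2>&1
```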

There are 3 communication channels used between the devices:
  1. Arduino Mega --> RPi (Serial Communication)
    1. Triggers phase change to alter media capture parameters
  2. Arduino Mega --> Arduino Nano (Digital Output)
    1. Triggers deployment of servo for "selfie" photo at peak altitude
  3. Arduino Mega --> Arduino Uno (Digital Output)
    1. Heartbeat monitor serving as hardware watchdog

Arduino Mega & The Core HAB Code

There’s a lot going on throughout the main Arduino sketch, and I could really write a book on it, but I’ll try to just break down the main functional elements. If you would like to dig deeper into the code, you can always find everything on my GitHub, and I’m happy to answer any questions that anyone posts in the comments section.

Rocket Fuel Burn Test Video (Apologies for forgetting to post!)

Sorry for the delay; here is the video I promised earlier!

This was from a few years ago, when I was making my first batch of sucrose/KNO3 ("R-candy") rocket fuel. There's nothing quantitative about this test; it was just to confirm that the fuel would actually burn (kind of important). It should give those who have never seen sucrose/KNO3 fuel burn an idea of its properties. When the fuel is formed into grains of defined shapes and stacked into a suitable container with a nozzle on the back, it makes a remarkably powerful rocket motor.