


Autonomous Driving Project

Repo: Github

What is it?

I built this system and platform for autonomous racing competitions. The software and hardware are built around an embedded Linux distribution. The system is designed as a pipeline of separate programs, which eases testing by letting me swap the simulator in for the program that normally collects sensor readings.
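
Concretely, each stage can be a small standalone program that reads state from the previous stage and writes its output to the next. The sketch below is only illustrative; the record layout, the field names, and the idea of passing raw structs over stdin/stdout are assumptions, not the project's actual wire format.

    #include <stdio.h>

    /* Hypothetical pipeline stage: field names and raw-struct framing are assumptions. */
    typedef struct { float accel[3], gyro[3], wheel_speed; } state_t;
    typedef struct { float steering, throttle; } action_t;

    int main(void)
    {
        state_t s;

        /* Read one state record at a time from the previous stage (collector
         * or simulator), decide on an action, and pass it to the next stage. */
        while (fread(&s, sizeof s, 1, stdin) == 1)
        {
            action_t a = { .steering = 0.0f, .throttle = 0.1f };
            fwrite(&a, sizeof a, 1, stdout);
            fflush(stdout);
        }
        return 0;
    }

With an interface like this, swapping the simulator in for the collector is just a matter of which program feeds the downstream stage.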

What did I do?

This project is composed of several very different facets, most of which I built from the ground up.

Sensor interface

The first step was to let the system see and feel. The collector program opens a connection to the I2C bus, then initiates communication with the BNO055 IMU and the PWM logger. I modified the PWM-logger firmware to also record readings from a homemade wheel encoder I fitted to the car's front-right wheel. A connection is then opened to the camera through the V4L2 API; the camera is configured and set to stream.
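
For reference, the rough shape of that setup on embedded Linux is sketched below. This is not the collector's actual code: the device paths, the 640x480 YUYV format, and the 0x28 address (the BNO055 default) are typical values, and error handling is omitted.

    #include <fcntl.h>
    #include <sys/ioctl.h>
    #include <linux/i2c-dev.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        /* I2C: open the bus, then address the BNO055 (default address 0x28). */
        int i2c = open("/dev/i2c-1", O_RDWR);
        ioctl(i2c, I2C_SLAVE, 0x28);
        /* ...read/write IMU registers with read()/write(); same idea for the PWM logger... */

        /* V4L2: open the camera and configure the capture format before streaming. */
        int cam = open("/dev/video0", O_RDWR);
        struct v4l2_format fmt = { .type = V4L2_BUF_TYPE_VIDEO_CAPTURE };
        fmt.fmt.pix.width       = 640;
        fmt.fmt.pix.height      = 480;
        fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
        ioctl(cam, VIDIOC_S_FMT, &fmt);
        /* ...request buffers (VIDIOC_REQBUFS), mmap them, then VIDIOC_STREAMON... */

        return 0;
    }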

Machine learning

The competition I was initially interested in takes place in a parking lot, with the track defined by hay bales. My plan was to classify patches of the camera frame and steer toward the clearest path. I used TensorFlow to train a model that classifies small image patches by the type of texture it believes the patch to be: unknown (0), hay (1), or asphalt (2). I used a combination of synthesized and web-scraped data for training.
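
The "steer toward the clearest path" step then reduces to simple post-processing of the classifier's output. The following C sketch assumes the frame has already been scored into a grid of per-patch labels; the grid size and the mapping to a steering value are illustrative, not the project's actual logic.

    /* Illustrative only: choose a steering direction from a grid of patch
     * labels (0 = unknown, 1 = hay, 2 = asphalt). */
    #define PATCH_ROWS 6
    #define PATCH_COLS 8

    /* Returns steering in [-1, 1]: -1 full left, +1 full right. */
    float steer_toward_clearest(const int labels[PATCH_ROWS][PATCH_COLS])
    {
        int best_col = 0, best_score = -1;

        for (int c = 0; c < PATCH_COLS; c++)
        {
            int score = 0;
            for (int r = 0; r < PATCH_ROWS; r++)
                score += (labels[r][c] == 2);     /* count asphalt patches */

            if (score > best_score) { best_score = score; best_col = c; }
        }

        /* Map the clearest column's center to a steering command. */
        return 2.0f * (best_col + 0.5f) / PATCH_COLS - 1.0f;
    }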

Simulator

A crucial component I wrote for the system was a simulator: a simple driving simulator, built with seen, that can interface with the other programs in the suite. The simulator is the testbed on which much of the software could be rapidly iterated to achieve better performance.

simulator running

Learnings

  • Real-time scheduling
  • Deterministic timing
  • Simulation development
  • System architecture
  • Sensor calibration
  • Driver authoring
  • ML system design
  • Web scraping
  • Training data synthesis
  • PID controllers

Technologies and Tools Used

  • C, C++, Bash, Python
  • TensorFlow
  • V4L2
  • GNU Make
  • OpenGL 4.0

libnn

Repo: Github

What is it?

libnn is intended to load the parameters of trained feed-forward, fully-connected and convolutional neural networks and use them to make predictions. I designed it with embedded Linux systems as the target. As an example, I use libnn to classify video in my autonomous driving project, also shown here.

What did I do?

I started building libnn partly out of necessity and partly as an exercise in numerical code optimization. Written in C from the ground up, it is meant to be a fast, CPU-based, low-level, and portable library with a clean, intuitive API. As a result, I've had to dive deep into structuring my code to take advantage of SIMD instructions that the compiler can generate.
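
The kind of structuring that matters here is mostly about keeping data contiguous and inner loops simple enough for the compiler to auto-vectorize (e.g. under -O3 -march=native). The sketch below is not the libnn API, just an illustration of a fully-connected forward pass written with that goal in mind.

    #include <stddef.h>

    /* Dense layer forward pass: y = act(W x + b).
     * Weights are stored row-major so the inner loop walks memory sequentially,
     * and 'restrict' tells the compiler the arrays don't alias -- both of which
     * help it emit SIMD loads and FMAs on its own. */
    void dense_forward(const float *restrict W,   /* [outputs x inputs], row-major */
                       const float *restrict b,   /* [outputs] */
                       const float *restrict x,   /* [inputs]  */
                       float       *restrict y,   /* [outputs] */
                       int inputs, int outputs)
    {
        for (int o = 0; o < outputs; o++)
        {
            float acc = b[o];
            const float *row = W + (size_t)o * inputs;

            for (int i = 0; i < inputs; i++)  /* simple, branch-free inner loop */
                acc += row[i] * x[i];

            y[o] = acc > 0 ? acc : 0;         /* ReLU, as an example activation */
        }
    }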

Learnings

  • SIMD optimization
  • Continuous integration
  • Automated testing
  • API design

Tools and Technologies Used

  • CircleCI
  • GNU Make
  • C, Python
  • TensorFlow

Orbital Elements Calculator

Repo: Github

Calculator: Try it!

What is it?

I chose to write this orbital elements calculator as part of a self-motivated journey to learn more about spaceflight. I used this exercise to cement an intuitive understanding of the orbital elements by building an interactive tool.

Learnings

  • Perifocal coordinate system
  • Keplerian orbital elements
  • Computation of elements from position and velocity vectors (see the sketch below)
  • Computation of position and velocity vectors from elements
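
As an illustration of the third item above, the sketch below derives two of the elements, semi-major axis and eccentricity, from a position/velocity state vector. The calculator itself is JavaScript; this is just the underlying math, written out in C.

    #include <math.h>

    /* Semi-major axis and eccentricity from a state vector (r in m, v in m/s),
     * using the vis-viva equation and the eccentricity vector. mu is the
     * gravitational parameter of the central body (3.986004418e14 for Earth). */
    void elements_from_state(const double r[3], const double v[3], double mu,
                             double *a, double *e)
    {
        double rmag = sqrt(r[0]*r[0] + r[1]*r[1] + r[2]*r[2]);
        double v2   = v[0]*v[0] + v[1]*v[1] + v[2]*v[2];

        /* vis-viva: v^2 = mu * (2/r - 1/a)  =>  a = 1 / (2/r - v^2/mu) */
        *a = 1.0 / (2.0 / rmag - v2 / mu);

        /* eccentricity vector: e = ((v^2 - mu/r) r - (r . v) v) / mu */
        double rdotv = r[0]*v[0] + r[1]*v[1] + r[2]*v[2];
        double evec[3];
        for (int i = 0; i < 3; i++)
            evec[i] = ((v2 - mu / rmag) * r[i] - rdotv * v[i]) / mu;

        *e = sqrt(evec[0]*evec[0] + evec[1]*evec[1] + evec[2]*evec[2]);
    }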

Tools and Technologies Used

  • WebGL
  • JavaScript

Projective

What is it?

Projective is an augmented reality game I developed for iOS devices well before the release of Apple's ARKit framework.

What did I do?

I built Projective from the ground up. In doing so, I wrote a simple OpenGL ES 2.0 and FMOD based game engine. For text rendering, I created a set of functions that convert TrueType fonts into meshes that can be rendered in game. The most critical detail was making use of the orientation quaternions from CoreMotion to regenerate the view matrices on each frame.
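
The core of that last step is the standard conversion from a unit quaternion to a rotation matrix, whose transpose then serves as the rotational part of the view matrix. A sketch in plain C (the game itself is Objective-C; the row-major layout here is illustrative):

    /* Build a 3x3 rotation matrix (row-major) from a unit quaternion
     * q = (w, x, y, z), e.g. the device attitude reported by CoreMotion.
     * Transposing (inverting) this rotation gives the view matrix's
     * rotational part, since the view transform is the inverse of the
     * camera's orientation. */
    void quat_to_mat3(const float q[4], float m[9])
    {
        float w = q[0], x = q[1], y = q[2], z = q[3];

        m[0] = 1 - 2*(y*y + z*z);  m[1] = 2*(x*y - w*z);      m[2] = 2*(x*z + w*y);
        m[3] = 2*(x*y + w*z);      m[4] = 1 - 2*(x*x + z*z);  m[5] = 2*(y*z - w*x);
        m[6] = 2*(x*z - w*y);      m[7] = 2*(y*z + w*x);      m[8] = 1 - 2*(x*x + y*y);
    }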

Learnings

  • Project management
  • Programmatic generation of PCM audio
  • Internals of TrueType fonts
  • Quaternions for practical applications

Technologies and Tools Used

  • OpenGL ES 2.0
  • Objective-C, C
  • FMOD
  • Blender
  • Xcode

PWM Logger

Repo: Github


What is it?

I designed the PWM logger to facilitate the development of small-scale autonomous vehicle systems whose performance strongly depends on collecting servo data and later reproducing it, such as systems implementing end-to-end ML.

It does this by decoding PWM signals from servos and motor controllers and allowing an I2C master to record the readings, producing labels for a training set. The logger can also be put in a mode that generates PWM rather than recording it. This way it can seamlessly switch between recording a human flying or driving and letting the computer take the wheel.
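
Decoding a hobby-servo PWM signal comes down to timing the high portion of each pulse, nominally 1000 to 2000 microseconds at roughly 50 Hz. The real firmware does this per channel in Propeller assembly; the following is only a C sketch of the idea, with read_pin() and micros() as assumed platform hooks.

    /* Hypothetical platform hooks -- assumed, not the logger's real interface. */
    extern int          read_pin(void);   /* sample the PWM input pin (0 or 1) */
    extern unsigned int micros(void);     /* microseconds since boot */

    /* Measure one pulse width in microseconds (typically ~1000-2000 us for
     * RC servo signals). */
    unsigned int measure_pulse_us(void)
    {
        while (!read_pin()) { }            /* wait for the rising edge  */
        unsigned int start = micros();

        while (read_pin()) { }             /* wait for the falling edge */
        return micros() - start;
    }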

What did I do?

I designed the PCB for the PWM logger in EAGLE CAD using the Parallax Propeller as the MCU. I then programmed the PWM decoding and generation in its native Propeller assembly. Finally, I wrote a software I2C driver in Propeller ASM to interface with other systems.

Learnings

  • Assembly programming
  • Hardware design, prototyping, assembly and debugging
  • I2C bus protocol
  • SMD soldering
  • PCB design

Technologies and Tools Used

  • Parallax Propeller MCU
  • GNU Make
  • EAGLE
  • Bus analyzer

Seen: Graphics engine

Repo: Github

seen demo3

What is it?

I started building seen to make graphics and data-visualization experiments quick, expressive, and easy. seen seeks to minimize the number of lines the programmer must commit to simply twiddling OpenGL's switches, and instead lets them get right down to it.

What did I do?

Writing seen has mostly involved designing an API that takes the pain out of configuring OpenGL's state machine and gets a proper scene rendering on the screen quickly. In addition to API design, I implemented some basic rendering algorithms: Blinn-Phong shading, shadow cubes, environment mapping, normal mapping, and variance shadow mapping, to name a few.
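
As a reminder of what the first of those involves: Blinn-Phong replaces Phong's reflection vector with a half-vector between the light and view directions. In seen this lives in GLSL shaders; the version below just writes the math out in C for illustration.

    #include <math.h>

    typedef struct { float x, y, z; } vec3;

    static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    static vec3 normalize3(vec3 v)
    {
        float s = 1.0f / sqrtf(dot3(v, v));
        return (vec3){ v.x*s, v.y*s, v.z*s };
    }

    /* Blinn-Phong: diffuse from N.L, specular from (N.H)^shininess, where
     * H is the normalized half-vector between light and view directions. */
    float blinn_phong(vec3 n, vec3 l, vec3 v, float shininess)
    {
        float diffuse = fmaxf(dot3(n, l), 0.0f);

        vec3  h = normalize3((vec3){ l.x + v.x, l.y + v.y, l.z + v.z });
        float specular = powf(fmaxf(dot3(n, h), 0.0f), shininess);

        return diffuse + specular;
    }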


Technologies and Tools Used

  • C++, GLSL
  • GNU Make
  • OpenGL 4.0

Tracer

Repo: Github

What is it?

I built Tracer for fun. It's a simple path tracer implementation which writes light intensities into a buffer. In this demo, the render buffer is then read and used to manipulate the terminal via ncurses to yield a 'retro' aesthetic.

What did I do?

I implemented a simple path-tracing algorithm in C++. In addition to the algorithm, I also defined several intersection equations which give surfaces their form during rendering.
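
The simplest of those is the ray-sphere test, which reduces to a quadratic in the ray parameter t. A minimal C sketch of that routine (Tracer itself is C++, and this is not its literal code):

    #include <math.h>
    #include <stdbool.h>

    /* Ray: origin o, unit direction d. Sphere: center c, radius rad.
     * Solves |o + t d - c|^2 = rad^2 for the nearest t > 0. */
    bool ray_sphere(const float o[3], const float d[3],
                    const float c[3], float rad, float *t)
    {
        float oc[3] = { o[0]-c[0], o[1]-c[1], o[2]-c[2] };
        float b  = oc[0]*d[0] + oc[1]*d[1] + oc[2]*d[2];           /* d . (o - c) */
        float cc = oc[0]*oc[0] + oc[1]*oc[1] + oc[2]*oc[2] - rad*rad;

        float disc = b*b - cc;            /* quarter discriminant, since |d| = 1 */
        if (disc < 0) return false;       /* ray misses the sphere */

        float t0 = -b - sqrtf(disc);      /* nearer root first */
        float t1 = -b + sqrtf(disc);
        *t = (t0 > 0) ? t0 : t1;
        return *t > 0;
    }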

I also improved performance by rendering each frame using several concurrent threads. The path-tracing algorithm is highly separable, meaning that the computations done for one 'pixel' in the buffer are completely independent of the others. Such a property lends itself well to parallelism, so I split the frame up, one column per thread. This significantly increases frame throughput.
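
The column-per-thread split looks roughly like the following. This is a sketch using POSIX threads with hypothetical names and a stubbed-out trace_pixel(); Tracer itself is C++.

    #include <pthread.h>

    #define WIDTH  80
    #define HEIGHT 24

    static float framebuffer[HEIGHT][WIDTH];

    /* Placeholder for the real per-pixel path tracing work. */
    static float trace_pixel(int x, int y) { return (float)(x + y); }

    /* Each thread renders one column; no locking is needed because no two
     * threads ever write the same pixel. */
    static void *render_column(void *arg)
    {
        int x = (int)(long)arg;
        for (int y = 0; y < HEIGHT; y++)
            framebuffer[y][x] = trace_pixel(x, y);
        return NULL;
    }

    void render_frame(void)
    {
        pthread_t threads[WIDTH];

        for (int x = 0; x < WIDTH; x++)
            pthread_create(&threads[x], NULL, render_column, (void *)(long)x);
        for (int x = 0; x < WIDTH; x++)
            pthread_join(threads[x], NULL);
    }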


Technologies and Tools Used

  • C++
  • GNU Make
  • Ncurses