Dot files

My personal collection of hidden configuration files, extra binaries, and other odds and ends.

File Patcher

An old program I wrote in C# for a game engine that is no longer in development. It parses an XML file for a handful of commands, with the intention of downloading new game patches and applying them.
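
As a rough illustration of the idea, here is a Scala sketch of parsing such a patch manifest (the original program was C#). The element and attribute names are assumptions for illustration, not the program's actual format, and the snippet relies on the scala-xml module.

    import scala.xml.XML

    object PatchManifest extends App {
      // A made-up manifest: one command to fetch a file, one to remove a stale one.
      val manifest = XML.loadString(
        """<patch version="1.0.2">
          |  <download url="http://example.com/patches/data.pak" dest="data.pak"/>
          |  <delete path="old_data.pak"/>
          |</patch>""".stripMargin)

      println(s"patch version: ${manifest \ "@version"}")
      (manifest \ "download").foreach { d =>
        println(s"download ${d \ "@url"} -> ${d \ "@dest"}")
      }
      (manifest \ "delete").foreach(d => println(s"delete ${d \ "@path"}"))
    }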

GridSide

Personal portfolio, gallery, and blog built with the Hugo static site generator. Powers chipsenkbeil.com.

Keyboard state

A small program that demonstrates capturing modifier key input on Linux by reading from the raw keyboard input device.
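
A minimal sketch of the approach in Scala, assuming the device lives at /dev/input/event0 (the actual path varies per machine and typically requires elevated permissions to read). On 64-bit Linux, each event is a 24-byte struct input_event, and modifier keys arrive as ordinary EV_KEY events:

    import java.io.{DataInputStream, FileInputStream}
    import java.nio.{ByteBuffer, ByteOrder}

    object ModifierWatch extends App {
      val EV_KEY        = 1   // event type for key presses/releases
      val KEY_LEFTSHIFT = 42  // code from linux/input-event-codes.h

      val in  = new DataInputStream(new FileInputStream("/dev/input/event0"))
      val buf = new Array[Byte](24) // 16-byte timeval + u16 type + u16 code + s32 value

      while (true) {
        in.readFully(buf)
        val bb = ByteBuffer.wrap(buf).order(ByteOrder.LITTLE_ENDIAN)
        bb.position(16) // skip the timestamp
        val evType  = bb.getShort & 0xffff
        val evCode  = bb.getShort & 0xffff
        val evValue = bb.getInt // 1 = press, 0 = release, 2 = autorepeat
        if (evType == EV_KEY && evCode == KEY_LEFTSHIFT)
          println(if (evValue == 0) "shift up" else "shift down")
      }
    }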

RealFeel Game Engine

A very old fork of the Mirage Source VB6 game engine, which uses DirectX 7 and WinSock to create a 2D MMORPG.

Resume

My work and academic resumes, written in LaTeX. Includes utility functions so you can easily write your own resume in the same format.

Scala Debugger

A Scala wrapper library around the JDI (Java Debugger Interface).
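
For a sense of what the wrapper abstracts away, the sketch below uses the raw JDI directly to enumerate the connectors available for attaching to or launching a target JVM (the wrapper's own API is not shown). It assumes Scala 2.13+ and a JDK, since JDI ships with the JDK rather than the JRE:

    import com.sun.jdi.Bootstrap
    import scala.jdk.CollectionConverters._

    object JdiConnectors extends App {
      // Raw JDI entry point that the library wraps: the VirtualMachineManager
      // knows how to launch or attach to a debuggee JVM.
      val manager = Bootstrap.virtualMachineManager()
      manager.allConnectors().asScala.foreach { c =>
        println(s"${c.name()}: ${c.description()}")
      }
    }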

Scala Debugger Akka Extension

An Akka extension for the Scala Debugger API.

Spark Kernel

A Jupyter kernel written in Scala for use with Apache Spark.

tmux XMonad bindings

Contains a tmux configuration script that provides XMonad-like keyboard bindings, along with the utility functions needed to capture modifier keys (such as shift) the way XMonad does.
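
As a rough sketch of what XMonad-like bindings might look like in a tmux configuration (the specific keys below are assumptions, not necessarily the ones this repository uses):

    # Use Alt as the "mod" key, mimicking XMonad's focus and layout controls
    bind-key -n M-j select-pane -t :.+   # focus next pane (mod+j)
    bind-key -n M-k select-pane -t :.-   # focus previous pane (mod+k)
    bind-key -n M-Enter swap-pane -s :.+ # swap current pane with the next
    bind-key -n M-Space next-layout      # cycle through pane layouts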

Overview of the Spark Kernel Client Library

In this third and final part of the Spark Kernel series (part 1, part 2), we will focus on the client library, a Scala library used to interface with the Spark Kernel. This library lets Scala applications quickly communicate with a Spark Kernel without needing to understand ZeroMQ or the IPython message protocol. Furthermore, using the client library, Scala applications can treat the Spark Kernel as a remote service, meaning they can run separately from a Spark cluster and use the kernel as a remote connection into the cluster.
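
A minimal sketch of the usage pattern, using illustrative names (KernelClient, execute) rather than the library's actual API:

    object ClientSketch extends App {
      // Stand-in for the ZeroMQ-backed client; the real library hides the
      // socket handling and IPython wire format behind a similar surface.
      trait KernelClient {
        def execute(code: String)(onResult: String => Unit): Unit
      }

      val client: KernelClient = new KernelClient {
        def execute(code: String)(onResult: String => Unit): Unit =
          onResult(s"(stub) evaluated: $code")
      }

      // The application treats the kernel as a remote service: submit code,
      // react to the result, never touch ZeroMQ directly.
      client.execute("sc.parallelize(1 to 100).sum()") { result =>
        println(s"kernel returned: $result")
      }
    }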

Spark Kernel Architecture

In the first part of the Spark Kernel series, we stepped through the problem of enabling interactive applications against Apache Spark and how the Spark Kernel solved it. This week, we will focus on the Spark Kernel’s architecture: how we achieve fault tolerance and scalability using Akka, why we chose ZeroMQ with the IPython/Jupyter message protocol, what the layers of functionality in the kernel are (see figure 1 below), and what the Comm API, an interactive API from IPython, provides.
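
To make the Akka point concrete, here is an illustrative actor sketch (not the kernel's actual actors): with one actor per ZeroMQ socket, a failure while handling one message is isolated and supervised by Akka instead of bringing down the whole kernel.

    import akka.actor.{Actor, ActorSystem, Props}

    // One actor per socket: if handling a message fails, Akka supervision
    // restarts this actor while the rest of the kernel keeps running.
    class ShellSocketActor extends Actor {
      def receive: Receive = {
        case raw: String =>
          // In the kernel, raw bytes would be decoded from the
          // IPython/Jupyter wire protocol here and routed onward.
          println(s"shell socket received: $raw")
      }
    }

    object ArchitectureSketch extends App {
      val system = ActorSystem("kernel-sketch")
      val shell  = system.actorOf(Props(new ShellSocketActor), "shell")
      shell ! "execute_request"
      Thread.sleep(500) // crude wait so the message is handled (sketch only)
      system.terminate()
    }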
