Faced with planning a series of complicated scenic transitions involving set pieces the size and weight of a city bus, I worked with the production team at the Park Avenue Armory to develop a previsualization tool. This tool allowed the production's choreographer, composer, director, and set designer to align their expectations and desires for key transition moments within the play. Once those moments were set, the tool helped train the six stagehands to operate heavy-lift electric tuggers and move the set pieces safely, in keeping with the choreographic intentions of the creative team.
Brooklyn Research Fellowship
During a fellowship at Brooklyn Research, I came across a dusty steel platform, replete with motors, linkages, multipin connectors, and a power supply large enough to pique my interest. It turned out to be a three-degree-of-freedom (3-DOF) motion platform that, lacking both a hardware and a software interface, was sitting entirely unused. This video is a brief overview of the work I did, with Brooklyn Research's support, to resurrect the motion platform and provide an extensible, reusable software library for its ongoing use:
Click on the gif below for a live version:
Reinforcement learning is exciting. It is also quite difficult. Knowing both of these things to be true, I wanted to find a way to use Unity's ML-Agents reinforcement learning framework to train neural networks for use on the web with TensorFlow.js (TFJS). Why do this? Specifically, why use Unity ML-Agents rather than training the models in TFJS directly? After all, TFJS currently has at least two separate examples of reinforcement learning, each capable of training in the browser with TFJS directly (or, somewhat more practically, training with TFJS's Node.js backend).
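To give a sense of the web side of this pipeline, here is a minimal sketch of loading a policy network and running a single inference step in the browser with TFJS. It assumes the trained ML-Agents policy has already been converted into TFJS's graph-model format; the model URL, observation size, and function name are hypothetical placeholders rather than the project's actual code.

```typescript
import * as tf from '@tensorflow/tfjs';

// Minimal sketch: load a converted policy and map one observation to actions.
// The model path below is a hypothetical location for the converted model.json.
async function runPolicy(observation: number[]): Promise<number[]> {
  const model = await tf.loadGraphModel('/models/agent/model.json');

  // Wrap the observation vector in a batch dimension: shape [1, obsSize].
  const input = tf.tensor2d([observation]);

  // Forward pass: the output tensor holds the action values for this observation.
  const output = model.predict(input) as tf.Tensor;
  const actions = Array.from(await output.data());

  // Release the memory held by the intermediate tensors.
  input.dispose();
  output.dispose();

  return actions;
}
```

Once training is finished in Unity, keeping the policy behind a single inference call like this makes it straightforward to drive an animation or game loop in the browser.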
As part of a Fellowship at Terreform One, a nonprofit architecture and urban design group based in the New Lab at the Brooklyn Navy Yard, I was tasked with proposing and rapidly prototyping museum interactives for the as-yet-unfinished Davis Family Butterfly Vivarium. One of my prototypes was an immersive interface for 3D computed tomography (CT) scan data of painted lady butterflies, the goal of which was to give museum-goers a sense of agency as they engage in a process of scientific discovery.