Welcome

This page is about my interests, projects, and profession (technical artist in the video games industry).  Most of my hardware/software projects are coded in Python, Processing, Arduino, or MEL (Maya Embedded Language), and I also 3D print things on my MakerBot Replicator.  You can find my latest prints for download over on Thingiverse.

Speaking of Processing, Android, and Python, you can find links to my various programs and apps in the title bar above.

I also maintain several MEL, Python, Pygame, and Processing wikis that I update far more often than this blog.  See them on their page.

All information on this site is copyright under the Apache License v2.0, unless otherwise noted.  That means you can use the information here for pretty much whatever you like, but I always appreciate credit where applicable.

Have a look around.  Thanks for stopping by.

– Eric Pavey

Lost & Foundry “Model A” Furnace: First Melt

I finally decided to upgrade from my “flowerpot furnace” to something slightly larger, slightly faster, and quite a bit hotter:  the “Model A” furnace from Lost & Foundry.  Very pleased with it:  it fired right up, and I had molten aluminum 15 minutes later.  Next up, some real sand casting.

Fun learning Unity

I’ve made small games/apps in the past using Python/PyGame & Processing.  When developing them, I’m responsible for designing all the systems myself, and while that’s a great learning experience, I only have a fixed amount of time to work on these side projects.  So I thought I’d branch out and try a larger game development framework that lets me focus on higher-level concepts rather than the lower-level coding needed for basic systems.

The three big ones I’m aware of are Unreal (Epic), CryEngine (Crytek), and Unity.  Nowadays you can download them all for free, which is fantastic.  I’ve used Unreal & CryEngine professionally in the past, so I was interested in seeing how Unity worked.  One of the big deciding factors was the scripting language each engine uses:  Unreal 4 has dropped UnrealScript in favor of pure C++ (which I can read, but not really write), while CryEngine uses Lua (never touched it).  Unity, however, supports C#, JavaScript, and Boo:  C# is very similar to Java (Processing), which I’m familiar with, so that made the decision pretty easy.  And while arguments could be made about the relative “power” of each engine (my guess is Unity would be lower on that list), they can all do far more than I’ll ever need, so Unity was the choice.

And so far, I’ve been very pleased:  their documentation (user manual, component reference, script reference) and example tutorials/projects are fantastic.  Easy to read, well written, and a breeze to follow.  Plus, building the games and deploying them to the web has been a snap.  To date I’ve completed the two projects below, with links to the completed games.  While the ‘games’ are super simple, they were also super easy to make (thanks to the great tutorials):

  • roll-a-ball
  • spaceShooter

Next I plan to go through their “Stealth” project.  And like the subject reads, it’s been fun:  it’s nice to be in a development environment where things ‘make sense’ and ‘just work’.  So far, my only complaints are that they don’t have a built-in interactive shell (REPL), and they should really add Python to their scripting language selection :)

Time-lapse photography with the Raspberry Pi

I thought it would be fun to set up a time-lapse rig with my Raspberry Pi & its camera, having never tried that type of photography before.  A bit of afternoon coding, and success:

11 hours compressed to 60 seconds.

This is what the camera rig looks like:

Used some MicroRax to create a simple frame for the Pi & its camera.

Install dependencies

You can download my Python time-lapse code here.   For the below examples, just stick it in your home (~) folder.

It uses the fantastic picamera library.  To install it:

sudo pip install picamera

Record the stills

Executing the timelapse.py code is easy, and it doesn’t need any arguments to run:  by default it will record for one hour, capturing enough images to make a one-minute video at 30fps, and it will create a /time-lapse subfolder where it places all the jpgs.  It’s designed to take the guesswork out of figuring how many frames to capture, and at what interval, for a given frame rate:  it handles that all for you behind the scenes, and it’s all configurable.  To query the help:

$ python timelapse.py -h
usage: timelapse.py [-h] [-ct float] [-dur int] [-fps int] [-xres int]
                    [-yres int] [-q int] [-y int] [-m int] [-d int] [-hr int]
                    [-min int] [-s int]

Time for time-lapse! To start recording at a certain time, pass in any or all
of the time related args. If no time-related args are passed in, recording
will start immediately.

optional arguments:
  -h, --help            show this help message and exit
  -ct float, --captureTime float
                        in HOURS, default 1.0
  -dur int, --duration int
                        of final movie in SECONDS, default 60
  -fps int, --framesPerSecond int
                        of final movie (default 30)
  -xres int, --Xresolution int
                        of image (default 1280)
  -yres int, --Yresolution int
                        of image (default 720)
  -q int, --quality int
                        of jpeg from 1-100 (default 85)
  -y int, --year int    ...to start recording
  -m int, --month int   ...to start recording
  -d int, --day int     ...to start recording
  -hr int, --hour int   ...to start recording
  -min int, --minute int
                        ...to start recording
  -s int, --second int  ...to start recording

So for example, to capture for 12 hours and end up with a one-minute video:

python timelapse.py -ct 12.0 -dur 60
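The arithmetic the script handles behind the scenes is simple.  A minimal sketch of that math (my own illustration, not the actual code from timelapse.py):

```python
def capture_plan(capture_hours, movie_seconds, fps):
    """Given how long to record, and the desired movie length and frame
    rate, return (total_frames, seconds_between_captures)."""
    total_frames = movie_seconds * fps                  # frames the final movie needs
    interval = capture_hours * 3600.0 / total_frames    # spread captures evenly
    return total_frames, interval

# The 12-hour example above: a 60-second movie at 30fps
frames, interval = capture_plan(12.0, 60, 30)
print(frames, interval)  # 1800 frames, one every 24.0 seconds
```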

It also supports a delayed start if you pass in any of the time values.  For example, if you pass in an hour, it will wait for that hour to start recording; if you pass in a minute, it’ll wait for that minute of the current hour, and so on.  You can pass in any combination of year, month, day, hour, minute, and second, or none at all:  if none, it starts capturing immediately.
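That kind of delayed start is just datetime arithmetic.  One way it could be done, as a sketch (a hypothetical helper, not the actual timelapse.py code), for the “wait for that minute of the current hour” case:

```python
from datetime import datetime, timedelta

def next_minute_occurrence(now, minute):
    """Return the next datetime whose minute equals `minute`.
    If that minute already passed this hour, roll over to the next hour."""
    target = now.replace(minute=minute, second=0, microsecond=0)
    if target <= now:
        target += timedelta(hours=1)
    return target

start = next_minute_occurrence(datetime(2014, 6, 1, 10, 45), 30)
print(start)  # 2014-06-01 11:30:00
```

The script would then sleep until the computed start time before beginning its capture loop.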

Finally, I’ve learned that if you’re logged in via ssh, you should launch the code via nohup:

nohup python timelapse.py -ct 12.0 -dur 60

If you don’t do that, closing the remote shell will kill the process, and no time-lapse for you!

Make a movie

After you’ve captured all the stills, how do you make them into a movie?  The mencoder software can be used on the Pi for that.  I found a tutorial here that provides a solution.  To install:

sudo apt-get install mencoder

First, make a list of the files in your /time-lapse folder (from the above tutorial link):

cd time-lapse
ls *.jpg > stills.txt
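One caveat with `ls *.jpg`: it lists files in plain lexical order, so zero-padded frame names matter for the movie to assemble in the right sequence.  If you’d rather build the list in Python, a small equivalent sketch:

```python
import glob

def write_stills_list(pattern="*.jpg", out_path="stills.txt"):
    """Mimic `ls *.jpg > stills.txt`: write the matching filenames,
    sorted, one per line."""
    stills = sorted(glob.glob(pattern))
    with open(out_path, "w") as f:
        f.writelines(name + "\n" for name in stills)
    return stills
```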

Then, to convert them into a movie with mencoder (modified version of the above example):

mencoder -nosound -ovc lavc -lavcopts vcodec=mpeg4:aspect=16/9:vbitrate=8000000 -o tlcam_01.avi -mf type=jpeg:fps=30 mf://@stills.txt

Copy to your PC

This creates a new avi file on the Pi.  To copy it to your PC on Mac/Linux you can use scp (the below example uses my Pi’s IP; change it to match yours).  Note, the below code is executed from your PC, not the Pi, and copies the file to my Mac’s home folder:

scp pi@192.168.2.27:~/time-lapse/tlcam_01.avi ~/tlcam_01.avi

Or, if you’re more comfortable in a windowed environment, you can use this great tutorial on how to use SFTP via FileZilla.

Once I got my first movie copied over, I couldn’t play it on my Mac via the QuickTime player.  However, my install of VLC opened it no problem.  From there it was uploaded to YouTube:  done!

Lost PLA casting: Take 1

I’ve been experimenting with some backyard sand casting, and wanted to try casting a 3D-printed object.  I’d heard you can do a “lost PLA” cast, where the molten aluminum vaporizes the printed item, so I thought I’d give it a shot.  Long story short:  it works… sort of.  This was my first try, so I didn’t really know what to expect, but the results were far less than I wanted:  no detail in the object, and not all the PLA melted, even though I printed the object with only 5% infill and 1 shell (to generate as little material as possible).  Regardless, it does make a good conversation piece ;)  In the future I’m going to get better sand (finer grain, better clay binder) to help preserve detail, and will probably try a real lost-wax solution (3D print a mold for the wax), or a true sand cast where I remove the item to be cast and simply fill the void with metal.

The end result is (an attempt at) the logo for the company I work for, Sledgehammer Games (a sledgehammer head seemed like something worth forging):

hammertime

A dime is next to it for scale

You can see on the left and right sides where I had to grind off the sprues.

Here’s a shot of my “flowerpot furnace” in action melting the aluminum for the project:

And one with the top removed, showing the red-hot crucible:

First steps with the Adafruit Trinket

I recently picked up an Adafruit Trinket (3.3v), simply because they’re so cheap (about $8).  I like the idea of a tiny Arduino-ish board.  Since I’m forgetful, the notes below document the process I used to get it working.

End result: Franken-servo!

First Steps:

For all the documentation on the web, I really wasn’t sure where to start.  There’s a lot of talk about installing AVRDUDE, etc., but as it turns out that’s really not needed (if you’re using the Arduino IDE).  Here’s the streamlined approach I finally took:

  • Setup my pre-existing Arduino IDE for use with the Trinket, following these Adafruit docs.
  • Since I’m on a Mac, I didn’t have to worry about installing any drivers, but the “Mac Arduino IDE” app download they provided didn’t work:  the Mac said it was “corrupted”.  So instead I used the “Slow Way” example they gave, and other than a few path differences it worked just fine.

Programming:

  • I was able to successfully run all their examples from the page “Programming with Arduino IDE”.
  • Earlier in the day I had done some servo programming on an Arduino, and I wanted to emulate it on the Trinket.  Thanks to the “Trinket Servo Control” tutorial and their “Adafruit SoftServo” library, I was able to make it happen.  However, I was unable to create two separate servo objects:  I don’t think I quite grasp the Trinket pinouts.  The above image has two servos hooked up to the same pin, so they move exactly the same way.

So, a successful first attempt.  Next up, I really need some batteries to make a standalone project with it…