Posts Tagged ‘raspberry pi camera’

Building the C-Bot 3D printer: Part 31 : Setting up Octoprint

Jump to C-Bot blog index to see all the posts.


My previous post talks about the specifics of getting OctoPrint set up in general.  Here, I’ll talk about actually integrating it with my C-Bot 3D printer.

Up to this point I had a Raspberry Pi (since upgraded to a v2) connected to my router via a USB wifi dongle, with a Raspberry Pi Camera hooked to it over a 3′ ribbon cable.  It was all sitting like a pile of spaghetti on my table.  I needed a way to get everything strapped to the C-Bot itself, which is what this post mainly covers.  I should note that during this process I gave up on the USB wifi dongle and switched to direct ethernet:  Just too many problems getting the wifi to stream the PiCam correctly.

Before we get into it, the end result:

3D Printed Items:

I needed a way to both mount the RPi2 to the frame of the C-Bot, and have some sort of adjustable mount for the camera.  Both of the below items were printed on my MakerBot Replicator 1.

RPi2 case:  After searching Thingiverse, I found this great looking “Raspberry Pi 2/B+ case with VESA mounts and more” file:  It looks slick, has a slot for the RPiCam ribbon cable, and has holes to bolt it directly to the OpenBuilds V-Slot.  I should note the holes provided were too small for the V-Slot bolts:  I had to drill them out slightly, but once that was done it mounted easily (see above pic; it sits on the right-front vertical arm).

PiCam Mount:  After more Thingiverse searching, I tracked down the “B+ PiCam Ultimaker 2 timelapse harness”:  It’s an adjustable arm plus a separate bracket to hold the cam.  I printed out all the files needed and realized it didn’t fit the 20×40 V-Slot:  The ‘hook front’ piece was too wide to clamp properly.  I’d figured this would be the case, and created a modified version in Maya to narrow it, which is currently installed on the bot.  But if you don’t want to deal with that, you can get a pack of small sticky-notes and just tear off the appropriate amount to create an easy shim.  You can download my modified version on Thingiverse here.

Assembly:

  • I bolted the RPi2 case to the right-front 20×40 V-Slot extrusion.
  • I attached the PiCam mount to the right side of the top-front x-extrusion, and zip-tied it down for safety.
  • My Rumba’s USB now runs directly into the Raspberry Pi 2.

Issues:

  • RPiCam ribbon:  Not easy to route, easy to catch things on.
  • Power to the RPi:  The way I mounted it, the power cable sticks out the side of the bot.  Easy to catch things on.

Final thoughts:  I’ve literally just started printing with it:  I usually print from either the C-Bot’s LCD, or from Simplify3D:  So this is a whole new interface to learn (although, obviously, similar to what I’m used to).  But I’m excited to start knocking out some timelapse movies :)

Setting up OctoPrint

This post will be a continual scratchpad of info as I use OctoPrint…

I’ve toyed around with OctoPrint in the past:  Before I built the C-Bot I was seriously considering buying a Type A Machines Series 1 printer:  They use OctoPrint, so I’d installed it on a spare Raspberry Pi and played around with it and my Replicator 1.

With the C-Bot, given its large build size, I want a way to remotely track & control my prints:  If I’m not around I can check up on them, and pause/stop them if needed in case of failure.  OctoPrint is perfect for this.  So I decided to finally get it installed and configured on my Raspberry Pi.

Below are the rough steps I went through to make this happen.  It’s still not quite ready for primetime:  I need a way to get the Raspberry Pi & its camera correctly pointed at the build plate, and a slick case/mount to stick the RPi to the printer (see the next post).  But technically I have everything up and running now.

Update:  Note that all the trouble I had at any step of the install stemmed in some way from trying to get wifi configured successfully:  I later switched over to ethernet, and all my problems went away.

My Tech Specs:

Setup Process:

  • Install OctoPi disk image.
    • GitHub : Download & install docs I followed.  See “Problems Encountered” below….
  • If you plan on accessing OctoPrint via the web, make sure that “Access Control” is on (which it is by default with any modern install of OctoPi), and that you have a valid login.  See here.  It’s the first line of defense keeping people from taking control of your printer remotely.
  • In your router, give the RPi a static IP.  This allows you to log in consistently (locally or remotely), without worrying that your router can dynamically change the IP at some point.
    • While it was connected to my router (provided via Comcast), via the router’s control panel, I found and stored its MAC address.
    • Next I had to make my router “forget” the RPi:  For me, I had to first turn off the RPi, then “block” it in the router, then “delete” it.
    • With the RPi disconnected, using its MAC address I was able to add it with a static IP (my static range was 10.0.0.253+), rather than DHCP.
  • To allow for remote (web-based) control, in the router control panel, setup port forwarding based on that static IP, for port 80.
  • To view OctoPrint locally, I can browse directly to the static IP I assigned.  Note I was never able to go to octopi.local/ over either ethernet or wifi:  I get a constant “webpage not available”.
  • To view OctoPrint over the web, I use a search like “what’s my ipv4” to find my home’s IPv4 address.  Plugging that back into the browser takes me directly to the OctoPrint control panel (presuming it’s on).
  • I have yet to setup a Dynamic DNS service for my IP.
  • I have yet to mess with any of the HAProxy stuff.

Configuring the PiCam:

The defaults for the Raspberry Pi Cam appear to be 640×480; I’m not sure of the framerate.  That’s a good starting point, but it can be better.

  • I edited /boot/octopi.txt to set the res to 1280×720 (720p) at 30fps (see the config sketch just after this list):  The camera tech-specs claim it can do 720p at 60fps, but I think that’s unnecessary for 3D printing.
  • In Octoprint, in Settings, under Webcam:
    • Set the ‘Timelapse bitrate’ to 10000k, to improve the timelapse movie quality after conversion.
    • Set ‘FFMPEG threads’ to 4 (since I have a RPi 2B, that is quad-core) : This will make the timelapse movie creation faster.
  • Next, I need to get a pair of +2 reading glasses to help bring the focal distance in.  Right now it seems to focus best at the rear of my printbed.
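
For reference, the camera lines in my /boot/octopi.txt ended up looking roughly like the below.  Treat it as a sketch:  The exact option names & flags can differ between OctoPi releases, so check the comments in your own octopi.txt.

camera="raspi"
camera_raspi_options="-x 1280 -y 720 -fps 30"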

Problems Encountered:

  • I had decided to update my OctoPi image to the current release, since my last install was a good ten months ago:
    • After following all the install instructions, I was unable to get wifi working.
    • Based on the install instructions (here, under “How To Use It”), they have you modifying the octopi-network.txt file on the SD card:  I’m on a Mac, and used TextEdit to do this.
    • After three hours of being unable to connect over wifi, I finally dug out my spare monitor and keyboard, so I could log into the RPi directly.
    • I used nano to edit octopi-network.txt:  To my surprise, the password and ssid values, which should have been surrounded in quotes “”, were instead surrounded by solid white squares:  Some weird special character.  Changing these to real quote characters and rebooting fixed the connection issues.  Thanks a lot, Apple…
  • I get terrible camera refresh when connected over wifi:  Maybe 1 new frame every… 30 seconds?  Unusable.  I pay for a smoking hot internet connection, so something is amiss.  My Google ping is around 900 ms.  If I switch over to ethernet the cam is up to 1-5fps (just guessing) and my Google ping is 10-20ms:  Actually usable.  In either case, however, the machine control panel is responsive:  I’m able to remote control it without much lag.  At any rate, I ordered a new wifi adaptor (the one listed above), and immediately got acceptable refresh:  A 1-2 second lag at 5-10fps.  Which wifi dongle you get really matters…
    • Update:  After two days, my new wifi dongle started behaving the exact same:  Super slow refresh.  I had to drop my capture rate to 3fps, at 640×480 for it to behave.  Any faster capture rate would cause increasingly bad lag in the view.  So I went out, got a 50′ cat5 cable, switched to ethernet, and problem solved.  Super speedy camera refresh.  Ethernet FTW.
  • On a number of occasions the PiCam wouldn’t turn on.  Long story short:  The 3d printed bracket it fits into was causing the small connector on the front of the cam to actually disconnect from the cam’s PCB.
  • For the longest time I couldn’t get OctoPrint to shut down via its ‘System’ menu.  All the other buttons worked except those.  Long story short:  It appeared that having Chrome auto-log me in was causing this:  Logging out & logging back in (without ‘remember me’ checked) seems to have fixed it.

Links:

FAQ Topics:

Time-lapse photography with the Raspberry Pi

Thought it would be fun to set up a time-lapse rig with my Raspberry Pi & its camera, having never tried that type of photography before.  A bit of afternoon coding, and success:

11 hours compressed to 60 seconds.

This is what the camera-rig looks like:


Used some MicroRAX to create a simple frame for the Pi & its camera.

Install dependencies

You can download my Python time-lapse code here.   For the below examples, just stick it in your home (~) folder.

It calls out to the fantastic picamera library.  Install:

sudo pip install picamera

Record the stills

Executing the timelapse.py code is easy:  It will create a /time-lapse subfolder where it will place all the jpgs.  It doesn’t need any arguments to run:  In that case, it will record for an hour, with enough images to make a one-minute long video at 30fps:  It’s designed to take the guesswork out of trying to figure out how many frames to render at what interval based on the framerate.  It handles it all for you behind the scenes. Plus it’s all configurable.  To query the help:

$ python timelapse.py -h
usage: timelapse.py [-h] [-ct float] [-dur int] [-fps int] [-xres int]
                    [-yres int] [-q int] [-y int] [-m int] [-d int] [-hr int]
                    [-min int] [-s int]

Time for time-lapse! To start recording at a certain time, pass in any or all
of the time related args. If no time-related args are passed in, recording
will start immediately.

optional arguments:
  -h, --help            show this help message and exit
  -ct float, --captureTime float
                        in HOURS, default 1.0
  -dur int, --duration int
                        of final movie in SECONDS, default 60
  -fps int, --framesPerSecond int
                        of final movie (default 30)
  -xres int, --Xresolution int
                        of image (default 1280)
  -yres int, --Yresolution int
                        of image (default 720)
  -q int, --quality int
                        of jpeg from 1-100 (default 85)
  -y int, --year int    ...to start recording
  -m int, --month int   ...to start recording
  -d int, --day int     ...to start recording
  -hr int, --hour int   ...to start recording
  -min int, --minute int
                        ...to start recording
  -s int, --second int  ...to start recording

So for example, to capture for 12 hours, and end up with a 1 minute long video:

python timelapse.py -ct 12.0 -dur 60
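
For the curious, the arithmetic the script handles behind the scenes works out roughly like this (a minimal Python sketch of the math, not the script’s actual code; the variable names are mine):

# 12 hour capture, 60 second movie at 30fps:
capture_hours = 12.0                                # -ct
movie_seconds = 60                                  # -dur
movie_fps = 30                                      # -fps

total_frames = movie_seconds * movie_fps            # 1800 stills
interval = (capture_hours * 3600.0) / total_frames  # one still every 24 seconds
print(total_frames, interval)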

It also supports a delayed start, if you pass in any of the time values.  For example, if you pass in an hour, it will wait for that hour to start recording.   If you pass in a minute, it’ll wait for that minute of the current hour, etc.  You can pass in any of the year, month, day, hour, minute, second, or none.  If none, it starts capturing immediately.
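
For example, something like the below should wait until 6:30 of the current day before kicking off a default one-hour capture (the flags are straight from the help text above):

python timelapse.py -hr 6 -min 30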

Finally, I’ve learned that if you’re logging in via ssh, you should launch your code via nohup:

nohup python timelapse.py -ct 12.0 -dur 60

If you don’t do that, when you close the remote shell, it’ll kill the process, and no timelapse for you!
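
If you also want your prompt back and a log of the script’s output, a variant like this works (the log file name here is just an example):

nohup python timelapse.py -ct 12.0 -dur 60 > timelapse.log 2>&1 &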

Make a movie

After you capture all the stills, how do you make them into a movie?  The mencoder software can be used on the Pi for that.  I found a tutorial here that provides a solution.  To install:

sudo apt-get install mencoder

First make a list of files from your /time-lapse folder (from the above tutorial link):

cd time-lapse
ls *.jpg > stills.txt

Then, to convert them into a movie with mencoder (modified version of the above example):

mencoder -nosound -ovc lavc -lavcopts vcodec=mpeg4:aspect=16/9:vbitrate=8000000 -o tlcam_01.avi -mf type=jpeg:fps=30 mf://@stills.txt

Copy to your PC

This will create a new avi file on the Pi.  To get it over to your PC, on Mac/Linux you can use scp (the below example uses my Pi’s IP; change it to match yours).  Note, the below command is executed from your PC, not the Pi, and copies the file to my Mac’s home folder:

scp pi@192.168.2.27:~/time-lapse/tlcam_01.avi ~/tlcam_01.avi

Or you can use this great tutorial on how to use SFTP via FileZilla, if you’re more comfortable in a windowed environment.

Once I got my first movie copied over, I couldn’t play it (on my Mac) via the Quicktime player.  However, my install of VLC opened it no problem.  From there it was uploaded to YouTube: Done!

Playing with the Raspberry Pi’s camera

I recently picked up a camera module for the Raspberry Pi from Adafruit.  This post will serve to be notes to myself on how to use the dang thing.  There’s a lot of info on the web, so I’m going to collect what’s applicable to myself here.  This post will continue to evolve over time.

Note, I’m using a Macbook Air, so all software and commands are centric to OSX (10.8.5).

First off, you need some sort of stand for it.  I made one that should survive the Zombie Apocalypse out of some MicroRAX:


First Time Setup:

Based on the latest installation of Raspbian via NOOBS, the hardware installed easily, and was auto-detected by the Pi.  First time setup can be found here on raspberrypi.org.

Documentation:

Official documentation can be downloaded from GitHub.  The first time setup above covers many basics.

Camera forum can be found here.

Capturing and Viewing:

Important note:  You can’t view anything over VNC, and obviously you can’t do it via an ssh terminal.  This post explains the reasons behind it.  This is, however, a bummer:  To run any of the “demo” code, you need to be viewing the Pi directly over HDMI.

Super Simple Commands:

You’ll find these on all the sites:

$ raspistill -o image.jpg
$ raspivid -o video.h264
$ raspivid -o video.h264 -t 10000

1 : Capture an image
2 : Capture a (5 second, default) video, at 1920×1080 (1080p)
3 : Capture a 10 second video (the -t value is in milliseconds)

Viewing a video stream from the Pi on your Mac:

You need to install mplayer on the Mac so you can access it from the command line.  The best luck I had was installing it via MacPorts.

$ sudo port selfupdate
$ sudo port install mplayer

Now, thanks to a post by spudnix from this thread, here’s how you can stream video from the Pi to your Mac:

Mac shell code to start netcat listening on port 5001, piping that to mplayer:

$ nc -l 5001 | mplayer -fps 31 -cache 1024 -

Pi shell code to start streaming vid and pipe it to netcat on port 5001, shooting it to the Mac’s local IP:

$ raspivid -t 999999 -o - | nc 192.168.2.15 5001

There’s a few second lag, but it worked right away.  My guess is the lag is because of the full 1080p signal being broadcast.  Dropping the resolution down had some lag at first, then caught up after a few minutes, odd:

$ raspivid -t 999999 -w 640 -h 480 -o - | nc 192.168.2.15 5001

Record raw video, convert to mp4, play:

The h264 video the camera records is “raw”.  To make it easily viewable on the Pi or Mac (or other PCs) it needs to be converted.  Thanks to this post, here’s how you can do it:

First, you need to install gpac on the Pi, then run MP4Box (part of that install) to do the conversion:

$ sudo apt-get update
$ sudo apt-get install -y gpac
$ MP4Box -fps 30 -add myvid.h264 myvid.mp4

To play video on the Pi, you need omxplayer.  I think it may come installed with NOOBS now(?), but if not:

$ sudo apt-get install omxplayer

Then play in a window (again, this doesn’t work over VNC, need to be on a monitor connected to the Pi) or to the HDMI port:

$ omxplayer myvid.mp4
$ omxplayer -p -o hdmi myvid.mp4

Copy data from the Pi to the Mac:

Once you record a nice video, how do you get it to your PC?  Presuming you have an ssh connection, execute this from a shell on your Mac to copy the data from the Pi to the Mac:

$ scp user_name@the_pi_ip:/path/to/source/file/on/pi /path/to/destination/file/on/pc

For example:

$ scp pi@192.168.2.27:~/piVid.mp4 ~/macVid.mp4

Broadcast video to the Internet:

Using VLC

Thanks to this post, I was able to get the Pi cam streaming to the web, and viewable on my Mac (via the “middle” option they described).  It wasn’t entirely straightforward though, so these are the steps I went through:

Install VLC:

On the Pi, it’s easy:

$ sudo apt-get install vlc

On the Mac, at first, I tried to install it via MacPorts like so:

$ port install vlc

It took forever.  And while vlc was then available at the command line, it kept giving me missing plugin errors.  I found an App for it in my Downloads folder (which I moved to Applications) and tried via the GUI to “File -> Open Network…”:  But it wouldn’t recognize the stream from my Pi (info below).  Soooo, I went to the official download page here, installed the App that way, and it started working!

Port Forward the Router:

I accessed my router’s web page (192.168.x.x) and, via the “virtual servers” option, opened up port 8554 for outside listening.  I’m sure this process can vary widely per router.

Stream from the Pi:

After ssh’ing into the Pi, I executed this to start the video stream (note I knocked down the resolution from the default 1080p):

$ raspivid -w 640 -h 480 -o - -t 9999999 | cvlc -vvv stream:///dev/stdin --sout '#standard{access=http,mux=ts,dst=:8554}' :demux=h264

View via VLC:

I accessed “What’s My Ip” to find the external IP of my router.

Launching VLC, I accessed “File -> Open Network…”, and entered:

http://<ip of my router>:8554

And hit “Open”:  Up popped a stream from my Pi’s cam, delayed by about 5 seconds.  Awesome.

Using MJPG-Streamer

It’s slightly more involved, but this tutorial shows how to broadcast video straight to a web page via MJPG-Streamer.  All things considered, it’s really easy to set up.  I followed the tutorial by Miguel Mota, and it worked the first time I tried.  Nice!  :  “Raspberry Pi camera board video streaming”

Miguel made two shell scripts, start_stream.sh & stop_stream.sh, that handle all the heavy lifting of starting and stopping all the services:  Make a copy of them in your home dir for easy execution.  Note, I changed them to up the resolution & jpg quality, and to add a password to the site.  I made one other change:  Since I had previously port-forwarded port 8554, I also changed their code to use that port, rather than 9000.

To add your own password, edit start_stream.sh and change the line including the block of code below to include the “-c” stuff shown here, changing myUserName:myPassword appropriately.  Note, the -c argument must be inside the quotes, after the www, or things won’t work so well.

-o "output_http.so -p 8554 -w /opt/mjpg-streamer/www -c myUserName:myPassword"

Then browse to:

http://<ip of your router>:8554/stream_simple.html

To log in and start watching from the auto-generated web page!  Looks like I’m getting around 1fps.

raspistill Image Formats

--encoding <format>

The default is jpg, but you can change it, where <format> is jpg, bmp, gif, & png.  From the docs: “Note that unaccelerated image types (gif, png, bmp) will take much longer to save than JPG which is hardware accelerated.”

If using jpg, you can set the quality via:

--quality #

Where # is a value from 1 -> 100.  They say that 75 is a good number.
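
For example, to grab a PNG, or a jpg at that suggested quality (the file names here are arbitrary):

$ raspistill -o image.png --encoding png
$ raspistill -o image.jpg --quality 75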

Python Bindings

 picamera

Here on PyPi.  Official documentation here.  Source on Github here.  Forum discussion here.

Easy to install with pip:

$ sudo pip install picamera

I’ve successfully run the quickstarts via the Adafruit WebIDE (while having the Pi hooked up over HDMI to preview the results).
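
For reference, here’s a minimal still-capture sketch along the lines of those quickstarts (the resolution and file path are just examples):

import time
import picamera

# Grab a single still, pausing briefly so the sensor can settle on exposure.
with picamera.PiCamera() as camera:
    camera.resolution = (1280, 720)
    camera.start_preview()    # the preview appears on the HDMI-attached display
    time.sleep(2)
    camera.capture('/home/pi/picamera_test.jpg')
    camera.stop_preview()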

picam

Homepage here.  Source is over on GitHub.  It returns PIL Image objects.

$ sudo pip install https://github.com/ashtons/picam/zipball/master#egg=picam

Project Links: