RGB colour correction

So I’ve been playing with different colour correction techniques for my LEDs.

So here’s a simple one: square it. Well, OK, square it then divide through by the upper value (square and scale). So:
unsigned int value = (i*i)/255;

which sort of works, but it’s a little jumpy in the lower regions, as the mapping looks like:
(0,15)->0,
(16,22)->1,
(23,27)->2, etc.
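
If you want to see those plateaus for yourself, a quick throwaway Arduino sketch along these lines (assuming the serial monitor is open at 9600 baud) will print the mapping for the first 32 values:

void setup() {
  Serial.begin(9600);
  for (unsigned int i = 0; i < 32; i++) {
    // square and scale back down into the 0-255 range
    unsigned int value = (i * i) / 255;
    Serial.print(i);
    Serial.print(" -> ");
    Serial.println(value);
  }
}

void loop() {}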

The next thing I’m trying is quadratic interpolation using a Lagrange polynomial. BIG words for fitting a curve through a few chosen points. It works by generating a polynomial through a number of predetermined points; I have worked it out using 1->1, 255->255, and then 128->c. I can then change c, effectively changing the correction applied to the colour. The only problem is that it gets a little complex. The maths formula is:

(-cx^2 + 256cx - 255c + 128x^2 - 16639x + 32640) / 16129

I attempted and failed to implement this, as the Arduino isn’t really the right platform for this kind of arithmetic. It works in Excel though, and produces a lovely graph.
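
For anyone who fancies having a go anyway, here’s a rough, untested sketch of how it could be done on the Arduino; the trick is keeping the intermediate maths in a long (32 bits) so the x*x terms don’t overflow a 16-bit int. The correct() function name and the clamping at the ends are my own additions:

// Quadratic (Lagrange) correction through (1,1), (128,c) and (255,255).
// x is the raw value, c is the tunable mid-point from the formula above.
unsigned int correct(unsigned int x, unsigned int c) {
  long lx = x;
  long lc = c;
  long numerator = (128L - lc) * lx * lx
                 + (256L * lc - 16639L) * lx
                 + (32640L - 255L * lc);
  long result = numerator / 16129L;
  // the curve was fitted over 1..255, so keep the output in range
  if (result < 0) result = 0;
  if (result > 255) result = 255;
  return (unsigned int)result;
}

Even then, a handful of long multiplications and a division per channel per update isn’t free on an 8-bit chip, so precomputing all 256 outputs into a lookup table at startup is probably the nicer way to do it.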

A graph showing the various possible mapping curves.

The black curve in the graph is actually the square-and-scale method above; the others are for various choices of c.

So I quickly mapped the first 32 values i->x to the 32 LEDs I have and found them to be pretty much linear, and from there on every bank looked much the same. That led me to the conclusion that a one-to-one correspondence is ideal for small values, followed by a kind of S-curve. I tried a few cubics, and even a quartic, but they were quite complex to implement.

So back to a basic square and scale I think for now.

Hyper-V and VLANs

So I’ve now got two identical hosts running Server 2012 Datacenter, with VMs on both. Now it’s time to get them talking to one another. Simple, right? Just sit them all on ‘external’ virtual switches and they can use the physical network to talk to each other. Great, except… I don’t really want my test environment hooked up to my live environment, nor do I want to change my existing DHCP server settings or have two DHCP servers on the same broadcast domain!

So, Hyper-V can use VLANs to segregate traffic. This is the way forward for me.

  1. create a new VLAN for the test environment,
  2. tag the ports for the VLAN,
  3. put all VMs on the new VLAN…
  4. oh and change the management VLAN in the virtual switch manager for the hosts…

Groovy, now I can talk to the hosts over my LAN (on VLAN 1) and the VMs can talk to each other on VLAN 200, but as the VLANs are completely segregated in terms of broadcast domains (for DHCP) and packets, I can now set up a (virtual) router to route between them. pfSense time.

Raspberry Pi Time-Lapse Camera Fun.

So I’ve decided that I wanted to do some time-lapse photography for my next project, and my brother won’t let me borrow his GoPro… I could go and buy one – but that’s no fun.

Anyway, I’ve had a Raspberry Pi for some time now (sitting under my bed), and a camera module for it (thanks for an awesome Christmas present, sis), so I thought… surely that’s possible, no?

Well – have a look at this for how to make it… super simple.

Following those instructions:
install the camera, run sudo raspi-config and enable the camera,
then use the following command to take a pic: raspistill -o cam.jpg

And to do multiple shots: raspistill -o frame%04d.jpg -tl 5000 -t 60000 (-tl is the interval between frames in milliseconds and -t the total run time, so that’s a frame every 5 seconds for a minute)

This will store a series of still images in the current directory. To turn these into a video file you will need to do the following:
sudo apt-get update (requires network/internet access)
sudo apt-get install libav-tools
avconv -r 10 -i frame%04d.jpg -r 10 -vcodec libx264 -crf 20 -g 15 timelapse.mp4

wahhoooo – except it didn’t work for me the first or second time… Meh, I’ll just take the pics and worry about video assembly later.

Anyway – there are many great resources on the net about how to call the raspistill command.

You can re-size the images as they are captured by adding height and width to the call:
raspistill --width 1280 --height 960 -o cam.jpg
And the following link has more about the raspistill command.

Moving on, I want to be able to access my files remotely, so it’s time to install Samba:
sudo apt-get install samba samba-common-bin
sudo leafpad /etc/samba/smb.conf

and add the following:
[timelapse]
comment = the timelapse folder
path = /home/pi/timelapse
browsable = yes
writable = yes
only guest = yes
create mask = 0777
directory mask = 0777
public = yes

This will make the folder publicly available on the network (you might need to make the directory first with mkdir). You will also need to open the file browser with root access and change the permissions on the folder so that ‘other’ has read and write access, otherwise you won’t be able to remove files to stop the Pi filling up (I only have an 8GB SD card in mine!).
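
If you’d rather sort the permissions from a terminal instead of the file browser, something along these lines should do it (the share lives at /home/pi/timelapse as per the config above):

sudo chmod -R 777 /home/pi/timelapse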

Great, so we have a folder and a way of capturing images… but I want my time-lapse to run over weeks, not hours, so I’m using an interval of 4 minutes between frames.

Step One – create a bash script to take the picture. Create a new file called picture.sh or whatever takes your fancy, but with a .sh extension.

Step Two – populate the file with the following:
#!/bin/bash
DATE=$(date +"%Y-%m-%d_%H%M")
raspistill -n --width 1280 --height 720 -o /home/pi/timelapse/$DATE.jpg
and SAVE IT. You can probably work out that this takes a picture and stores it in the folder we created above (or another folder if that takes your fancy) with a date_time stamp. Once saved you will need to modify the properties so that the file is executable.
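
If you’re in a terminal rather than the file manager, that’s just a chmod (assuming you saved the script in the timelapse folder, which is where the cron job below expects it):

chmod +x /home/pi/timelapse/picture.sh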

Step Three – Cron job time. Open a terminal prompt up and type “sudo crontab -e” to get to the cron job editor. Add the following:
*/4 * * * * /home/pi/timelapse/picture.sh then write out and exit (Ctrl+O and Ctrl+X)

As you’ve guessed, this will fire the bash script we created above once every 4 minutes. Search for “cron every x minutes” for more details on how to create and edit cron jobs.

Brill. Job done… well… job almost done. Now I just need to make sure the Pi doesn’t fill up and run out of storage, and then I will need to turn all of the images into a time-lapse video.
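
For the storage side, one option would be another cron job that clears out frames older than a couple of weeks once they’ve been copied off the Pi (untested, and the two-week cutoff is just a guess):

find /home/pi/timelapse -name "*.jpg" -mtime +14 -delete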

And here’s the result.