Art Projects

Selling the Mona Lisa

A few months ago I went to Paris for work. I had hoped to see the Mona Lisa in the Louvre after I had finished, but it was a Tuesday, and everything in Paris is closed on a Tuesday; no one told me that. I came home with only a blurry picture of the Eiffel Tower as a souvenir.

Eiffel Tower, told you it was blurry

Oddly, I know exactly what the Mona Lisa looks like, having seen many pictures of it in books when I was younger and now on the internet. This has left me wondering what exactly those reproductions are. Are they the Mona Lisa, or something else? Can I have an opinion on that famous smile, or thoughts about Da Vinci's genius as a master of both art and engineering, without ever actually seeing the original?

To help me try and understand what digital images are, I decided to see what would happen if I base64 encoded the Mona Lisa and other images of artworks.

If you are not familiar with it, base64 encoding is a method of representing binary data as text, for when it needs to be transmitted over a medium that only supports textual data. If you want to know more, have a look at the Base64 Wikipedia page.

I wrote a short Python script to convert a binary file to a base64 encoded text file. Opening this text file in a text editor was odd. The data wasn't recognisable as the original Mona Lisa in any way. There is no way of telling which parts are the background, or what colours are present.
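The script itself will go up on GitHub later, but it is essentially something like this sketch, using Python's standard base64 module (the function and file names here are mine, not necessarily what the real script uses):

```python
import base64

def encode_file_to_text(input_path, output_path):
    """Read a binary file (e.g. an image) and write its base64 text encoding."""
    with open(input_path, 'rb') as binary_file:
        data = binary_file.read()
    # b64encode returns bytes, so decode to a plain ASCII string for the text file
    encoded = base64.b64encode(data).decode('ascii')
    with open(output_path, 'w') as text_file:
        text_file.write(encoded)
    return encoded

# e.g. encode_file_to_text('mona_lisa.jpg', 'mona_lisa.txt')
```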

Mona Lisa

The Mona Lisa image file I used was taken from Wikimedia Commons, and according to Wikimedia the image is in the public domain, so I could use it without fear of anyone demanding money for its use. That doesn't go for all images of the Mona Lisa. Corbis, a company owned by Bill Gates, sells licences to images of the Mona Lisa. Is theirs a better image of the Mona Lisa because it's the official licensed version? Would I get more pleasure, or a better understanding of the work, if I looked at the Corbis image over the Wikimedia image?

Girl with a Pearl Earring



As well as base64 encoding a Mona Lisa image, I did the same with Girl with a Pearl Earring and Sunflowers. For both of these I used the smallest image size available. For Sunflowers I decided to make it smaller still. When I base64 encoded that tiny picture, which became pixelated and distorted when zoomed into, there was an odd string of text in roughly the centre of the file.




I'm not sure if that has any meaning, or if it's just a strange human thing to give meaning to something that has none; after all, that string just represents some digital data.


To complete this exercise I decided I would take these text files, create digital images from them, and put them up for sale. Maybe in a few years' time these images are what robots will hang on the walls of their homes.

The images are available from Redbubble as everything from hoodies to iPhone covers. I'm not sure Van Gogh, Vermeer and Da Vinci ever imagined, when they painted them, that their paintings would be transformed into a series of text characters adorning mugs and iPad covers.

Red bubble

I'm not quite sure if I've satisfied my own curiosity about what these images actually mean; if anything I'm even more intrigued now.

I'll put the Python script and base64 encoded text files on GitHub later on, for anyone interested.


Art Projects thoughts

Making a Twitter Art

I make lots of things. If you want, you can have a look at the ones I've written about on this blog, now that I've organised them all onto one page at 

Some of them are for a really good cause, like the little boat I made for Oceans Project. Others are to learn about something new, like the Unofficial Guide to the Natural History Museum.

Sometimes I just want to make things that don't exist, or turn a tweet into something real, and sometimes I just want to make statues fart.

What I've never called any of the things I make is art; to me, all the things I've made are just things I've made. But my latest project, I think, is art, simply because I can't think of anything else it could be.

When I was working on the Tweeting Satellite project for work last year, I became fascinated with Twitter bots and how metronomic they are, sending out tweets regularly, not needing something funny to happen to them on the way to work, or to be frustrated that the milk in the fridge has gone off, in order to tweet. The Tweeting Satellite was only a short project, but I wanted to create something whose entire point was that it would tweet regularly for a long time.

It's taken a while since I wrote it, but @ColoursAll is now live; it will tweet every hex colour from 0x000000 to 0xFFFFFF in order. The reason it's taken so long is that rather than just signing up to a hosting service, I wanted it to actually physically exist. For this I took advantage of Colocker offering free hosting for a Raspberry Pi to members of London Hackspace. This was most of the delay: getting around to setting up the Raspberry Pi.
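The real code is a proper bot, but the sequence it walks through can be sketched in a few lines of Python (the one-tweet-per-hour cadence in the comment is my own assumption, worked back from the quoted lifespan):

```python
def all_colours():
    """Yield every hex colour from 0x000000 to 0xFFFFFF in order."""
    for value in range(0x1000000):  # 16,777,216 colours in total
        yield '#{:06X}'.format(value)

# At one tweet per hour, 16,777,216 tweets take
# 16,777,216 / 24 / 365.25 ~= 1,914 years, roughly the 1915 years quoted below.
```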

At around the same time as writing @ColoursAll I wrote @TickTockBot, which ran over New Year's Eve. I always find New Year a bit strange; it seems such an arbitrary celebration that I don't understand, and every year now I just tend to hide under my duvet. @TickTockBot was my way of dealing with it last year.

Anyway, @ColoursAll is running and is due to complete in approximately 1915 years' time. It feels odd making something that has the potential to last so long. I'm guessing that in actual fact it won't last that long; something will happen first. Twitter will shut down, Colocker will shut down, or the world will just end.

I should probably have  put a message to be tweeted at the end but I haven't, or have I?  You'll just have to wait and see.


Hacking a rowing Robot

Robots are cool, right? And the coolest robot around is the Nao robot, right? And wearable technology is cool, right? So what would be better than a hackathon themed around interfacing Nao with some interesting wearable technology? I couldn't think of anything cooler, so when I saw exactly that, I signed up straight away.

Originally I thought it would be interesting to do something museum related, but as I'd already spent a day with Nao and museum people exploring how Nao could be used in a museum, I decided to look at something different.

Recently I'd used Blender to design a little version of the rowing boat that is going to be used by Sarah Weldon when she attempts to become the first woman to row solo around the U.K in 2016. I thought it would be interesting to consider what if Sarah could take a Nao on board with her, and how it could be used.

The day started off with an introduction to Nao, the devices that were available to interface with it, and a good introduction to programming Nao using Choregraphe and Python.

I teamed up with Sam Ahern, who I'd previously met at a Flossie event and who is working with Lego Mindstorms for her MSc, and Alan Rushforth, who brought a Nao robot for us to use and much needed knowledge of C++.

Getting started, we managed to get a Thalmic Labs Myo armband and a pulse oximeter. I started playing around with the Myo armband and Sam started on the oximeter.

With the help of Alan and his C++ skills we managed to get data out of the armband and write a Python program to roughly detect a rowing motion, based on the roll axis of the device. Sam had been fighting with her new laptop and the oximeter, so we weren't able to use that.
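Our detection really was rough, and I'm reconstructing the idea rather than the actual code, but it amounted to counting a stroke each time the roll value swung up through a threshold, something like this sketch (the threshold value and the plain list of samples are illustrative, not the real Myo data):

```python
def count_strokes(roll_samples, threshold=45.0):
    """Count rowing strokes from a sequence of roll-axis readings (degrees).

    A stroke is counted on each low-to-high crossing of the threshold,
    so holding the arm above the threshold only counts once.
    """
    strokes = 0
    above = False
    for roll in roll_samples:
        if roll >= threshold and not above:
            strokes += 1
            above = True
        elif roll < threshold:
            above = False
    return strokes
```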

On day 2 we put together our demo. Sam worked on making Nao move to indicate whether the strokes per minute you were rowing at were too high, too low or just about right. With a bit of fine tuning and practice it worked pretty well, right up to the point we had to demo it.

Nao sitting
Our Nao robot for the weekend

It was really great to watch what the other teams had worked on. My favourite was the group who had used Nao to help people with sleep apnoea, using the oximeter to detect the sleep problem and Nao to either tap the sleeping person to take them out of the state, or direct another person to move the sleeper into a better position.

It was a really great weekend and I learnt so much. It was all about thinking of novel ideas for how Nao and robots could be used. I could really imagine Nao being taken on long solo voyages to act as both a companion and an intelligent dashboard: setting rowing pace by interpreting sensor data from the rower or sailor, taking data feeds from GPS and radio to warn of obstacles, and acting as a mini cox. A robot has been taken to the International Space Station, so maybe when Sarah is ready to depart on her expedition around the U.K, Nao will have got his sea legs and be ready to accompany her.

If you want to learn more about Sarah and her epic row around the U.K, go to her website, Oceans Project, and follow her on YouTube.

To find out more about Nao look on the Aldebaran robotics website 

And a big thank you to Carl Clement and everyone at UKNAO, and QMUL, for making this great hackathon possible.






making Projects

Making a small boat

Last year I volunteered at the first Technopop festival in East London for three weekends. The first two weekends were spent helping teach kids to program in Scratch and to build and program robots. On the last of these weekends I stood next to a boat. That might sound boring, but it's not just any boat.


The boat in question belongs to Sarah Weldon of Oceans Project. Sarah is aiming to row around the U.K, a challenge named The Great British Viking Quest. As well as rowing around the coast of the U.K, Sarah will be documenting her progress with wearable technologies and using an online learning platform to communicate STEM (Science, Technology, Engineering and Maths) subjects to students all around the world.



All the time Sarah will be collecting data as part of her PhD research into 'effects of calorific stress on the neuro-cognitive performance on ocean rowers'.

It was really cool to talk to Sarah at the Technopop festival and find out all about the rowing. I really liked the technology aspect of the expedition, and was impressed by all the planning and preparation she has to do for the trip.

Skipping forward a few months, I have started to become interested in paper model making, and am trying to build a model McLaren P1 car from a kit. Let's say it's going quite badly at the moment; I've already scrapped the first two versions, so decided to take a break before trying again. But I had become interested in the process of turning a 3D model into a papercraft version. Also wanting to learn how to use Blender to create 3D models, and knowing that Sarah had just launched her Kickstarter to raise funds for the expedition and wanting to support her, I had an idea: I would make a papercraft model of her boat.

Creating the model in Blender wasn't too hard. There are a few ways of doing it, but by far the simplest is to start off with a cube and then stretch, extrude and add faces and edges as needed. It's not a perfect replica, but I didn't want to make it too complex, knowing that it would be turned into a paper model, so wanted to keep it nice and simple.

When it came to turning the 3D model into a paper model, I knew PepaKura would do the job, but that's Windows only and I use a Mac. Fortunately, while learning Blender I found the plug-in system, so wondered if there was a plug-in that would do the same, and there is.

After exporting the model to create the paper mesh, I used Illustrator to scale it up and then split it into three pieces, to make it a decent size and fit onto a single A4 piece of paper.

It took about three prototypes, altering the tabs and tidying up a few details, before I was happy with it and sent it to Sarah.



The model is now available on the Oceans Project website. Why don't you download one, make it, decorate it, and send the results to Oceans Project?

It would be great if you too supported The Great British Viking Quest by backing the Kickstarter for the project.





Unofficial guide to the Natural History Museum

Do you remember Farting Statues from last year? I like to think of it as the cult hit of 2014. I like to think that it's far ahead of its time, and in many years to come it will be seen as a defining moment in the fart apps genre. It probably won't be, though.

But writing it did get me thinking a lot about how people can be inspired by museums to write apps and learn to code. But it's not all about coding. I started to write a computer game based around the Horniman Museum, but while writing it I got a bit distracted by the Guardian article about Twine.

I'd heard of Twine a few weeks before but not really looked at it much, and I still hadn't intended to, as I wanted to concentrate on the Horniman game. That changed when I visited the Natural History Museum one lunchtime. Wanting to see Sophie the Stegosaurus, who had recently been put on display, I headed to the Dinosaurs Gallery, but no Sophie.

Discovering later that Sophie wasn't with the rest of the dinosaurs but in the Earth Hall gave me a couple of ideas. One is a game based around Dippy and Sophie; the other was a re-thinking of the traditional museum guide or map. The Dippy & Sophie game will come later (hopefully), but I got working straight away on the museum guide.

Using only the Natural History Museum map and website, I've created a basic interactive guide, basing it loosely on early eighties text adventure games like Sphinx Adventure. The guide gives you a description of your location and the options of where to go next.
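Twine stores the guide as passages joined by links. The same structure could be sketched in Python as a dictionary of locations, each with a description and exits (the location names and descriptions here are illustrative, not lifted from the actual guide):

```python
guide = {
    'Central Hall': {
        'description': 'The grand entrance hall of the museum.',
        'exits': ['Dinosaurs Gallery', 'Earth Hall'],
    },
    'Dinosaurs Gallery': {
        'description': 'Home of the dinosaur skeletons.',
        'exits': ['Central Hall'],
    },
    'Earth Hall': {
        'description': 'Where Sophie the Stegosaurus is on display.',
        'exits': ['Central Hall'],
    },
}

def describe(location):
    """Return a passage-style description: where you are and where you can go."""
    room = guide[location]
    return '{} You can go to: {}.'.format(room['description'],
                                          ', '.join(room['exits']))
```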


Structure of the guide

The guide is up online for you to try out. I'm not going to make any claims for its accuracy, as I've not had a chance to try it out in the museum yet to check, but I will do.



I'll put the code up on GitHub for anyone who wants to build their own versions, or just have a look at how it was built. It barely scratches the surface of what Twine can do, using only links between the passages. There is the possibility of adding macros, variables and more interactivity.

Would love to hear what you think of the guide and using Twine for developing interactive museum games.












technical thoughts

Making A Nothing - The story of two Twitter bots

Making nothing is just sitting around not doing anything, but making A Nothing is making something that has no content. While developing the Twitter bot @X3Prospero I became fascinated with Twitter bots: not just the types and variety and content of them, but the continuous metronomic beat of their function, sending out messages repeatedly, never stopping or resting or needing any user input.

Yes, there are ones that are intermittent, maybe linked to physical phenomena, data feeds or machines, but these don't have the same fascination for me. What fascinated me was that the operation of the Twitter bot was more the message than the content it carried.

Just after Christmas I decided to start experimenting with this thought, and have developed two Twitter bots so far. The first one, @ColoursAll, is a bot that will tweet every colour represented by the RGB colour model, from 0x000000 (black) to 0xFFFFFF (white). The code is finished but I'm just sorting out hosting for it; it should be up and running in about a week.

Yesterday I visited the Whitechapel Gallery and saw David Batchelor's exhibition. He has taken five hundred photographs of white squares and rectangles in cities around the world. Seeing the projections of the squares quickly cycle through, the background changing but always a white rectangle in the centre of the screen, made me think about making a Twitter bot that had even less content in it than @ColoursAll, but went through the most limited content I could think of with that same beat.

Last night I ran @TickTockBot for just over one hour; its tweets were either the word 'tick' or 'tock', followed by an ascending count. The count was only there because Twitter doesn't allow duplicate tweets. After two hundred and ninety three tweets it stopped: the bot had hit the limit imposed by Twitter.
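Generating the messages is only a few lines of Python; strict tick/tock alternation here is my simplification of what it actually tweeted:

```python
def tick_tock_messages(n):
    """Return the first n tweets: alternating 'tick'/'tock' plus an ascending
    count, since Twitter rejects exact duplicate tweets."""
    words = ('tick', 'tock')
    return ['{} {}'.format(words[i % 2], i + 1) for i in range(n)]
```

The count makes every message unique, so the only thing that could stop the bot was the posting rate limit, which is exactly what happened.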


The code for @ColoursAll is on GitHub.

The Storify of @TickTockBot is here.



museums technical wearable

Virtual Reality - Is this real life or just fantasy?

virtual reality helmet from the 1990s

Back in the 1990s, Virtual Reality (VR) was going to be the next big thing. For a moment it did look that way; then it all went wrong. The headsets were big and clumsy and generally not very good. VR went away for a long time, until the Oculus Rift took Kickstarter by surprise. The company went on to be bought by Facebook for over $2 billion, and itself kickstarted a new VR industry.

With this new interest in VR, I thought a round-up of some of the headsets and applications would be a good idea. The Oculus Rift is now in its fourth developer version; a full consumer version is intended to launch next year.

Other manufacturers haven't been resting on their laurels.

Samsung have taken a slightly different approach. The Gear V.R does not have its own display but is a holder for the Samsung Galaxy Note 4; the phone acts as both the display and the processing power of the device.

All the current devices can be split into either Rift-type or Gear V.R-type devices. Each approach has its own advantages and disadvantages. The Oculus Rift requires a wired connection to a P.C or games console. This limits its mobility and flexibility, but does enable the more detailed and responsive graphics that a powerful graphics processor can give. The Gear V.R can be easier to set up and, not being tied to a P.C, is more mobile, both in the locations it can be used and for the person using the device. Disadvantages of this approach are battery life, the power of the graphics processor (GPU) in the phone, and that using the phone like this for long periods can cause the GPU to overheat; to protect itself it will scale back its output quality, either the frames per second or the detail that is shown.

Google took the Gear V.R approach to the extreme when it launched Cardboard at I/O, its developer conference, earlier this year: a cardboard kit that folds up into a V.R headset and can accept a wide range of phones. I recently bought one of these and was genuinely impressed at how good it is. There is a difference between what Cardboard can do and what the Rift can do, but there is a place for both. The Rift can be such an immersive experience that, if you aren't used to it, it can be better to use sitting down; I have seen people stumble around as they become consumed in the virtual world. Having taken Cardboard to work and let a few people try it, it can be a much more social experience, passing the headset around and comparing and sharing experiences. Less totally immersive; different, but not less.

Similar to Cardboard is the DIY VR headset, which also adds trucker-cap mounting and the inclusion of the Leap Motion sensor, as the Oculus Rift has done. Headsets like Cardboard and DIY VR will get more people trying out V.R, developing for it and thinking up ideas and applications, and that can only be a good thing.

Other devices that fall somewhere between Cardboard and the Gear V.R are the Archos V.R Headset and the Zeiss V.R One.

Don't worry, I haven't missed the obligatory 3D printing and Arduino mention; here it is, with the Adafruit V.R Headset.

The most well known of the other Oculus Rift-type devices is the Sony Morpheus: not yet released and, like the Rift, probably launching in 2015, but at the moment it hasn't had the same amount of public testing. Morpheus is intended for use with Sony consoles, though just as the Microsoft Kinect was soon hacked to work on other devices when it launched, that could well happen with Morpheus.

Microsoft themselves probably have a V.R headset in development, but less is known about this and it's currently more rumour than confirmed product.

There are lots of applications for Virtual Reality devices. Really they need a separate post, but a couple of notable ones are the Volvo app for Cardboard, the Paul McCartney app for Cardboard and the Thomas Cook 360 experience for the Gear V.R. These aren't small niche ideas, but large brands using V.R to demonstrate their products in new and interesting ways.

I've not heard much of V.R being used in museums or galleries yet, except as an experience to try the technology. The only exhibition I'm aware of using V.R is De/coding the Apocalypse at Somerset House. Overall the exhibition is very good; it uses digital technology in a very restrained and grown-up way, but the use of the Oculus Rift didn't really work for me. I think mainly because it was tethered, I'm guessing for security reasons, and the cord was too short, again guessing, but for health and safety. I wanted more freedom of movement. Glad they tried though. I'm looking forward to seeing more applications for V.R; given the number of headsets available and the software being developed, there will surely be some interesting concepts.

Caught in a landslide, no escape from reality

technical thoughts

A New Steering wheel?

I'm not a massive film fan, but there are a few I really love: the original Star Wars trilogy, the original Italian Job and just about every Bond film. Maybe I just enjoy films with classic and beautiful cars in them; nothing wrong with that. One of my favourite features of classic cars, apart from the lovely styling of the bodies and the sound of the engines, is the steering wheel. Admittedly modern steering wheels are much safer, well padded and with an airbag in them to prevent serious injuries, but the simplicity of the classic Moto-Lita wheel is a work of art.

Aston Martin DB5 from Skyfall

The modern Formula One steering wheel is packed full of electronics and technology and costs around $50,000. I say 'around $50,000'; whether they actually cost that I have no idea. Every article and commentator always quotes that exact, conveniently round figure, so let's just say they aren't cheap. I do like that the trophies of the Australian Grand Prix are replicas of the steering wheel of one of Jack Brabham's cars. A lovely nod to the past.

The steering wheel for Bloodhound SSC, the car hoping to be the first to reach 1,000 mph, will be 3D printed and has been designed to fit Andy Green's hands exactly. Again, an amazing piece of technology, but not simply beautiful in the way the Moto-Lita is.

Early cars didn't have wheels to steer with, but used tillers as found on boats. Were tillers used just because there was no suitable analogy from existing land vehicles? Horses use a bridle and reins, which aren't suitable as there is too much play in the mechanism. Trains just have rails, so aren't steered.

As cars moved from three to four wheels and the technology advanced, the steering wheel became the standard way to steer the car.

There have been one or two attempts to do things differently. The early models of the Austin Allegro had a square 'quartic' steering wheel, which was a flop and was replaced with a standard circular wheel on the later models. When I was a child our family had an Austin Allegro. It was a 'V' reg from 1980, so it must have had the standard wheel, although I don't remember as I sat in the back. I do remember thinking it was very luxurious, as it had a pull-down armrest. I'm easily impressed.

The Mirov 2, a 'revolutionary sports turbo from the Soviet Union', had a steering wheel that could change from the right to the left hand side. Except it didn't; it was a fictional car created for the Norwich Union advert. However, the McLaren F1, whilst having a conventional steering wheel, did feature an unusual seating layout, with the driver in the centre and two passengers either side and slightly further back. The F1 wasn't the first car to have the central driving position; that honour goes to the 1935 Alfa Romeo 6C 2300 Aerodinamica Spider, a beautiful car with many innovations. The prototype of the Land Rover had a central driving position, but as the project developed it reverted to a conventional layout.

1935 Alfa Romeo 6C 2300 Aerodinamica Spider

The Matra Bagheera of 1973 had three seats, but the driver's position was conventionally on the left hand side. All I can say about this car is that it looks horrible, and so are all the websites and videos on YouTube about it, especially the 'sexy' ones. I'm not linking to any; the internet is no place for videos of ladies taking their clothes off and doing rude things.

So why a blog post all about steering wheels? Well, it was seeing this competition to design a steering wheel for Ford. How odd, I thought: while Google and other hi-tech firms in Silicon Valley are designing self-driving cars, Ford are updating the steering wheel.

Sadly I didn't enter the competition, as I was too busy with my own work and study and only saw it a few days before the closing date. From reading the competition description, it really is aimed at serious designers who can actually do things with pens and pencils.

Here is my idea for the competition. Rather than have a steering wheel, have a flat surface in front of the driver. On that surface have a model of the car and, using cameras around the car plus live satellite and other sensor data, project the environment that is currently outside the vehicle, including all the other cars on the road and the buildings and surrounding areas.
Imagine something like the video below, small and fitted into the dashboard area of a car, but rather than the marketing shots of the car driving around the track, all the live data, projected from within your own vehicle.

PERCH Car from PERCH on Vimeo.


The majority of the time the car would be computer driven, but if you did want to take over, it would simply be a case of moving the model car on the flat surface, in the same way you move a computer mouse.

I don't think anything I've described here is going to be possible for the next few years. The cost and bulk of the projection is one hurdle, and getting the data into the car is going to need a reliable high speed data connection.

What I do think is possible is that there could be big changes coming to the whole automobile industry very soon if self driving cars become a reality, and it does look like it could happen.  There are still lots of hurdles to get over, not just technical but legal and administrative.

A quote often incorrectly attributed to Henry Ford is "If I had asked people what they wanted, they would have said faster horses."  It would have been so much cooler for this post if he had said it.

Let's just pretend he did say it. Are Ford currently trying to make faster horses while the technology industry works on new cars that will radically change how we think about them? It's quite possible. Cars are seen as less of a status symbol by many younger people now, and environmental pressures are demanding they be made cleaner.

I'm looking forward to seeing what the winning entry of the competition looks like. Hopefully it's not just a piece of circular metal and plastic. If it isn't radically different, I can honestly imagine Ford soon joining Nokia as one of those companies that once dominated an industry but were quickly forgotten when a competitor came from nowhere and introduced a massive change.


developer python technical thoughts

Do you remember the Mackerel?

You might remember my recent post on writing a short Python program to solve the Yakult problems on the Tube.
I said then that I didn't think it was the only or best way to solve the problem. I've spent a few minutes tonight improving it, and have also made it take a command line argument so it can be run from the shell:

>> python mackerel

st. Johns wood

I'm not going to go into details here, but if you are interested it is up on GitHub.

developer python technical thoughts

Solving the 'Yakult Problem' in Python

They say one of the best ways to learn something is to teach it. At least I think they do; I'm sure I've heard that somewhere, though I'm never sure who 'they' are, and even if they don't say it, I'm saying it now. As I'm doing a proper course in learning Python, I decided to write this short tutorial on one aspect of the Python language. I won't go into much detail, as I'm still learning and have a lot more to learn.

If you live in London you may have seen posters advertising the Yakult pro-bio-whatever drink stuff. I haven't the faintest idea if it's any good, but I do like the posters, which have a short teaser on them. Something similar to:

Yakult riddle



OK, let's break the problem down.

1. Load the file in.

2. Go over (or iterate over, as we say in computer speak) all the station names.

3. Iterate over all the letters in the word that the station names are being tested against.

4. Iterate over all the letters in the station name and compare each against the word being tested.

5. If there is a match of letters, stop and move on to the next station; no point in carrying on.

6. If no letter matches are found, store that station name for retrieval later.

Loading the File

It took a little while to find a list of all the station names in an easy to use format, but I managed it after a bit of searching and messing. The stations are stored in a file called station_list.txt.

# Create a list to store the station names in
station_list = []

# Open the file called station_list.txt and store it in the object called stations_file
stations_file = open('./station_list.txt', 'r')

# Read the stations in from the file
for line in stations_file:
    # Store each station in the list after converting the characters to lower case
    station_list.append(line.lower())


I don't want to focus on the file reading, as it is not core to this exercise. Converting the characters to lower case was something that was picked up in testing, when I realised that the upper case initial letters of the station names were not triggering matches against lower case letters in the chosen string.

OK, rather than going line by line, it will make more sense if I add in the three iterations of steps 2, 3 and 4, then flesh those out with what happens in each one.


for station in station_list:

    for my_char in my_string:

        for station_char in station:



As described above, the program iterates over station_list. my_string is the name of the variable that holds the string passed to the function when it is called; that is then iterated over, and finally every character in the current station name is iterated over.

So the first thing to do is compare the current station character with the current character in my_string, and if they match, stop the current loop:

if station_char == my_char:
    found_letter = True
    break



That stops any more tests on that station name happening for that one particular letter in my_string, but carrying on testing my_string at all is wasteful, and we haven't done anything to store the station name if none of the characters match.


 if my_char_count == len(my_string) -1 and not found_letter:



You will see that this line tests a variable called my_char_count against the length of my_string, and a variable called found_letter. This finds out whether the current character is the last one in the word and whether no matches have been found.

If that test is passed, the station name is added to the list of results:

    result_stations_list.append(station)


Finally, my_char_count is incremented, and when all the iterations are complete, result_stations_list is returned from the function:


    my_char_count +=1

 return result_stations_list

Because the station names in the file are separated by newlines, the list shows each name ending in '\n'. These can be removed if you like, but I'll leave that as an exercise for you to try.


That shows the basic workings of the program; the full thing is on GitHub, so have a look and see what you think. I'm not saying this is the best way to solve this problem; I'm sure it's not the only way. But it's the way I wanted to solve it, using the Python language constructs we have been learning in class, and hopefully it nails the workings of the language into my head.
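Pulling the snippets above together, the whole function looks roughly like this. I've restructured it slightly, breaking out of the loops early rather than counting characters, so it isn't line for line the version on GitHub:

```python
def find_stations(my_string, station_list):
    """Return the stations that share no letters with my_string."""
    my_string = my_string.lower()
    result_stations_list = []
    for station in station_list:
        found_letter = False
        for my_char in my_string:
            for station_char in station.lower():
                if station_char == my_char:
                    found_letter = True
                    break  # this letter matches, no point checking further
            if found_letter:
                break  # one shared letter rules the station out entirely
        if not found_letter:
            result_stations_list.append(station)
    return result_stations_list

# e.g. find_stations('mackerel', ['Bank', 'Oval', 'St Johns Wood'])
```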

For another take on the problem, a solution in Java can be found on this blog.


Oh, and if you are interested: 'St Johns Wood'.