Category Archives: technical

making Projects technical

Laser cutting objects that don't exist

You may remember last week, when the sun still wafted around in the sky and the light evenings felt like they would go on forever. Or you may remember last week's blog post about 3D printing an object discovered via @museumbot.

That last project was all about turning a photo of a real object from its digital representation back into a real object again, and looking at the transformation it went through.

I knew I wouldn't have time this week to do the next part of that project (adding the detail of the face and changing the thicknesses to match the original more closely), but it did get me thinking about the idea of taking things that don't really exist and turning them into physical objects.

Rather than 3D print objects I wanted to laser cut something, mostly because I wanted to put into practice the training I had had on the laser cutter at Machines Room. It's OK being taught how to use something, but you don't really learn until you have a go. The two photographs that came to mind also lent themselves to being laser cut rather than 3D printed.

The first photograph is of a leaf, but not really a leaf. I'm not sure exactly what process happened, but recently, as I set off for work to the Science Museum, I noticed several 'imprints' of leaves on the pavement. There was nothing left of the leaf except for a brown mark where it had once been.

leaf mark on pavement

The second was a photograph from Twitter by Katy Barrett (@SpoonsOnTrays) of a shadow, taken on a sunny day at the coast in Norfolk.


Photograph of a Shadow by Katy Barrett

I found this photograph really interesting: the pattern of holes in whatever object is casting the shadow, the sand and pebbles, the sea that can only be seen in the shade of the object.

For both photographs, although it is possible to laser etch straight from photographs, I chose to draw outlines in the same way I had done for the 3D printing of the pendant. I knew I couldn't draw the items exactly, but I wasn't trying to. Zooming into the photograph of the leaf showed an oddly digital texture, as the imprint had become defined by the dimples on the paving slab. The edges are a lot harder to follow than when looking at the image from further away. Trying to decide what constitutes the outline of the leaf and what is just dirt on the pavement was difficult.

The shadow on the beach was similar. The lines of the shadow seem really well defined when first looking at the photograph, but again, when zooming in, they are much softer and difficult to follow, being broken up by the contours of the sand, pebbles and the ripples of the water.

For each of the photographs I realised that if I repeated the process of tracing the outlines, they would come out differently each time; I would never end up with the same outline twice. That was OK. It was never about creating an exact copy of the photograph, but about looking carefully at the lines, choosing what I wanted the shapes to be, and being happy with whatever the result was.


Memory of a leaf. Lasercut in 3mm Birch plywood


Lasercut of Shadow. 3mm Birch plywood

As with the 3D printing, the laser cutting was done at Machines Room, Limewharf. If you want to learn about 3D printing or laser cutting, it is a great place to go.


I won't be making anything solid for a few weeks. I will be busy at Technopop London, volunteering mostly on the Vex Robotics workshops, so bring your kids and let's build robots together, because building robots is cool.





making museums Projects technical

3D print from a tweetbot

There are a lot of museums looking at how objects in their collections can be scanned and 3D printed. There are reasons for this, most of them very serious, academic and scholarly.

The small project I have been working on is a lot less serious, but still has a worthwhile reason behind it. Just as my Farting Statues app was an exploration of taking the objects in a collection, mixing them with information from Wikipedia, adding in Android code and coming up with something silly and frivolous, this is an exploration of turning the output of a museum API into a 3D printed object. I was interested in how the object would transform and change as it went through the processes of turning from a physical object into digital data and back into a physical object, albeit in a modified form.

The project was inspired by this Guardian article on Twitter bots; my interest in that comes from my current work project making the satellite Prospero X3 tweet.

One of the bots in the article was @MuseumBot, a bot written by Darius Kazemi (@tinysubversions). It takes the open access images that are made available by the Met Museum and tweets an image along with a link to the page on the Met Museum website.

After following this bot, there was an image tweeted that caught my interest.



There was just something I found interesting about the object. Also, the quite detailed but well defined outline made me think straight away that it would be interesting to 3D print, and that's how this started.

I didn't really know what to do, but knowing it is possible to build 3D models from photographs, that was my first avenue of investigation. Yes, it is possible, but it requires multiple photographs taken at different angles. I only had the one photograph, taken at one angle.

From there my next step was to see what I could do with the outline of the object. I realised that it wouldn't be possible to get all the detail of the face of the figurine in the print, at least not in the first iteration.

I started off following this article, but using Adobe Illustrator to turn the image into an SVG file. The output from the automatic outline feature of Illustrator was quite poor, probably because of the shading and highlighting on the object. So it was time to bite the bullet and do it myself. This took around six hours, working at a high magnification and going very slowly. The process did highlight a problem with the image: it is not perfectly straight on to the top face, so the outline shows some of the side edges. I'm not sure what the technical term for that is; it probably has one.

I made the decision to ignore that problem and, rather than try to guess what the actual outline should be, just go around the outline as it is in the photograph.

The first import into 123D Design didn't go well. It showed up a mistake made in the outlining process: rather than creating a single outline, I had inadvertently created hundreds of very small lines. A newbie Illustrator mistake. So I had to spend around another six hours joining up all the tiny lines.

The import into 123D Design was much better this time. All I had to do then was scale the model to approximately the same size as the actual object and export the STL file for printing.

The printing was done at Machines Room on their Ultimaker 2 printer. All I had to do was import the STL file into the Ultimaker Cura software and export a G-code file for the printer.

Printing was simply a case of copying the G-code file onto an SD card, putting the card into the printer, selecting the file and pressing print. The printer takes a few minutes to warm up, and a little of the PLA oozes out as it reaches temperature; that just needs to be kept away from the print base to stop it from getting onto the print as it starts.


Once it has started printing and been running a few minutes, it's OK to leave it running until it completes, so I went for a coffee and waited.

It was all straightforward. It took around one hour to print, and then around ten minutes to cool on the bed of the printer, to prevent it from warping as it cooled.




What's next?

I would really like to add in the face and its features. The current design is a single thickness; I would like to make the thickness of the different parts of the object much closer to the actual object.

One interesting thing that I had not considered was the material transformation. It was printed in yellow because that was what the printer was loaded with at the time, and it was close to the gold in look. Two people I showed it to suggested gilding it or coating it with gold leaf. After thinking about this for a while I have decided not to. The shape of the object has changed through the process and the material has changed; I don't see any need to pretend that it is the same material as the original. I think that is an interesting part of the story.

The file for printing is now on Thingiverse, so if you want to print out this object, or play around and transform the model in any way, feel free. If you do, let me know what happens, I am interested.












silly technical thoughts wearable

Stupid wearable idea of the day - distance sensing headphone band

It happens all the time: people walk around with their heads down looking at their phones, oblivious to the world around them. If you aren't looking at your phone, you have to watch out for people headed straight for you and get out of their way.

What about this for an idea: mount a distance sensor (ultrasonic, infra-red or similar) and a camera on the top of the headphone band. Point it at approximately 45 degrees, so when the head is tilted down it will be pointing straight forwards. When it detects an obstacle it can make the phone vibrate and swap the display to the camera image.




Pretty sure this would be worth a few million of venture capital money in Silicon Valley, but it is a stupid idea and you saw it here first.

technical X3Prospero

Has anyone seen my satellite?

Sultans of Ping FC - Where's Me Jumper

As someone who is prone to losing jumpers and can never remember where I put my keys, it might seem a bit ambitious to try and find a satellite, but I'm going to try.

Here's the plan; in fact here are both the plans. In fact here are all the plans, mostly. When I first thought of this project, after coming up with the idea of making Prospero X3 tweet, my next thought was: is it actually possible? The answer, after a quick search on the internet, was yes. Using one of the many satellite tracking sites it's easy to find the position of pretty much any satellite that is in orbit around the Earth.

For a few minutes my plan was to use one of those websites, scrape the data from it and feed that to a Twitter bot. I couldn't write a Twitter bot yet, but Twitter bots exist, so writing one probably wasn't going to be that hard. All in all, I figured that scraping the data and writing the Twitter bot could be done in a weekend or two. I could put my feet up, have a cup of coffee and cake, look back at the work I'd done, then move on to the next thing.

Then it hit me. Prospero X3 was, as far as I was aware, a dead satellite; it wasn't continuously transmitting its position back, yet the display was being updated in real time, so it wasn't done through observation. It must be possible to work out where satellites are in space. That is cool, and this is when I started falling down the rabbit hole.

So far I've discovered Two Line Elements, Kepler's laws of planetary motion, SGP4, Lagrange points and even how astronauts go to the toilet. It started off with buying Dr Lucy Rogers' book It's Only Rocket Science, and, when I wanted to get a bit more technical, Fundamentals of Astrodynamics and Applications.

I have realised a couple of things. The work has already been done to produce the algorithms needed to work out where satellites are; it's taken many years of correcting and checking these, and I was never going to be able to do this again from scratch, nor was there any need. But I did want to do something more than just take the data and use it, though I wasn't sure what.

The Twitter bot is in place, just waiting to be fed some data to tweet. I can find the position of the satellite at any given time using the Python implementation of the SGP4 algorithm. The satellite position is given as a vector, so it doesn't make much sense in that form; the next job is to turn that into a latitude and longitude position on the surface of the Earth. I have been working on that today and it's coming along nicely: lots of Greek letters to remember, and it turns out there are multiple latitudes. The maths isn't horrendously complicated; there are new symbols and words to learn, but it's mainly vectors, matrices and geometry, nothing too scary. I keep wanting to say it's not rocket science, but actually it is, which is pretty cool.
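As a rough sketch of that vector-to-latitude/longitude conversion (a simplification, not the full method: it ignores the Earth's oblateness, so it gives geocentric rather than geodetic latitude, and it assumes you already have the Greenwich Mean Sidereal Time for the moment in question):

```python
import math

def eci_to_lat_lon(r, gmst):
    """Convert an Earth-centred inertial position vector (km) to an
    approximate latitude/longitude in degrees, given GMST in radians.

    This gives geocentric latitude -- one of those 'multiple latitudes';
    geodetic latitude needs an extra step to account for the Earth's
    oblateness.
    """
    x, y, z = r
    # Rotate about the z-axis by GMST to move from the inertial frame
    # into the Earth-fixed frame
    xe = math.cos(gmst) * x + math.sin(gmst) * y
    ye = -math.sin(gmst) * x + math.cos(gmst) * y
    lon = math.degrees(math.atan2(ye, xe))
    lat = math.degrees(math.atan2(z, math.hypot(xe, ye)))
    return lat, lon

# A point above the equator on the Greenwich meridian (with GMST = 0)
print(eci_to_lat_lon((7000.0, 0.0, 0.0), 0.0))  # (0.0, 0.0)
```

The step from geocentric to geodetic latitude is where the Earth's squashed shape comes in, and it is the fiddlier part of the conversion.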

So feeding the Twitter bot data from the Python SGP4 script is easy, and I could leave the project at that, but I have decided it would be interesting to have the entire project in nodejs, which the Twitter bot is written in. Writing the SGP4 algorithm in nodejs is the 'more' thing I mentioned earlier.

I am currently working through the Python script, nodejs'ing it. I'm not doing a straight conversion; it is being written in the asynchronous nodejs style, using callbacks rather than function returns. I'm not sure if this is the best way to do it, but I am learning; when it's finished, that is when I'll be in a place to judge. I'm also changing the variable names from their one and two letters into much more meaningful ones. Again, I'm not sure if that is the thing to do, but it's my project, and it pains me to use single letter variable names, so I am changing them.

Having Prospero X3 tweeting is the top priority, so I will have it up and running as soon as possible; the nodejs project will come after that. I have also decided to write blog posts going into a bit more detail about what I am doing. They might be a bit random, jumping from the nodejs stuff to the space science stuff. This might mean writing things that are wrong; quite often I do stuff that is wrong and have to go back and re-do it, but I decided it would be more interesting than just saying I did stuff and everything is awesome.

That seems a good place to finish on.


technical thoughts

What's a MAC address and why does it matter?

If you are reading this you may already know what an IP address is; a quick recap if you don't. When a computer connects to the internet, or any network that uses the TCP/IP protocol, it needs a method for uniquely identifying that device on the network. The most common IP addresses in use today are version 4 IP addresses. These are 32 bit numbers which, for historical and administrative reasons, are most commonly written as four groups of numbers between 0 and 255 separated by a '.'. When a computer has a valid IP address for that network it can communicate easily with other computers on that network, and, using the ability of TCP/IP to route communications between multiple networks, it has the possibility to communicate with many more.
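As a quick sketch, the dotted notation is just a friendlier rendering of the underlying 32 bit number, one octet at a time:

```python
def to_dotted_quad(n):
    """Render a 32 bit IPv4 address as four dot-separated octets."""
    return ".".join(str((n >> shift) & 0xFF) for shift in (24, 16, 8, 0))

# 0xC0A80001 is the 32 bit form of the common private address 192.168.0.1
print(to_dotted_quad(0xC0A80001))  # 192.168.0.1
```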

Before a computer, or any other device that connects to the internet, can be assigned an IP address, there has to be some communication with the network. This is to make sure the device is both allowed to connect to the network and that the owner of the device is correctly identified. This authentication and authorisation process requires the device to have a unique address, but if the IP address is the unique address on the network, does that not create a catch-22 situation? This is where the MAC address comes in.

The MAC address has no connection to Apple Mac computers; it's an acronym for Media Access Control. Every network interface has a MAC address, so if you have a computer with both a wired and a wireless network connection, that is two MAC addresses, one each. If you have Bluetooth on your computer or phone, that also has a MAC address.
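A MAC address is six octets, usually written in hex and separated by colons. As a small sketch (the address used here is an arbitrary example, not a real device): the first three octets are the manufacturer's OUI, and two bits of the first octet mark locally administered and multicast addresses:

```python
def parse_mac(mac):
    """Split a colon-separated MAC address into its meaningful parts."""
    octets = [int(part, 16) for part in mac.split(":")]
    return {
        # First three octets: the Organisationally Unique Identifier,
        # assigned to the hardware manufacturer
        "oui": mac.upper()[:8],
        # Bit 0x02 of the first octet: locally administered, i.e. the
        # address was set in software rather than burned into the hardware
        "locally_administered": bool(octets[0] & 0x02),
        # Bit 0x01 of the first octet: group (multicast) address
        "multicast": bool(octets[0] & 0x01),
    }

print(parse_mac("00:1a:2b:3c:4d:5e"))
```

That locally-administered bit is exactly why MAC addresses can be changed in software, which matters for the argument below.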

Why does it matter what a MAC address is? Today in Parliament, MPs have been debating the DRIP bill, the bill that will allow the state to continue to intercept and store our communications. There has been a lot of criticism of the bill, both for its content and for how it has been rushed through Parliament without the normal amount of debate and scrutiny.

When MPs debate and introduce legislation centred around the use of technology and how it impacts the people of the UK, I would like to think that they have an appreciation and understanding of that technology. So I was annoyed when it was reported that Helen Goodman, Labour MP for Bishop Auckland, called for MAC addresses to be tied to individuals. I am writing this before Hansard is published, so I'm not sure of the exact quote, but if that is what she said then it is simply idiotic.

A MAC address isn't like a passport or driving licence that requires checks and more checks by the government before it is handed out. MAC addresses are created on machines in the Far East and randomly sent all around the world. They move locations, and between people, all the time. To try and keep track of who is using a particular address at any one time is absolutely ridiculous. Not only that, but it is possible to mask and change MAC addresses, for legitimate reasons but also for illegitimate ones, to gain access to networks without revealing information about the connecting device. So not only would any attempt at tying MAC addresses to individuals be massively difficult, it would also be pointless for trying to catch terrorists and paedophiles.

It really goes to show how little idea MPs have of technology and the digital world today. Just as the government wants everybody to learn to code and understand technology, I wish MPs of all parties would take their own advice; it might help a little.

UPDATE: Hansard for yesterday has been published. The reference to trying to tie MAC addresses to an individual can be found at

4th paragraph under the section 15 July 2014: Column 748



technical thoughts

Don't be afraid of what you can see. Investigate what you can't

UK cinemas are banning Google Glass over piracy fears. When I saw that article in the Guardian yesterday I was astounded. It reminded me of the early 1980s slogan 'Home taping is killing music'. As it happens, home taping didn't kill music, but the record industry did their best to, with sky high CD prices and taking a long time to understand the implications of the internet, the MP3 and the iPod.

I can't think of anything worse than watching a film recorded on Glass: the 'pirate' would have to keep their head perfectly still for an entire film, and both the video and sound would be low quality. The bone conduction microphone is designed to pick up the voice of the wearer, not cinema sounds. The camera is 5 MP and can record 720p video. I don't know what the sensor size is, but given the physical size of the unit it will be a lot smaller than a DSLR camera's.

Glass has just gone on sale in the UK at £1,000. That would pay for a pretty decent and discreet camera with lens and microphone. If I was intending to video films at the cinema, that is where I would spend my money. If the cinema industry is so scared of Glass-filmed copies competing with its IMAX 3D surround cinemas, they really need to have a look at what they are doing.

Maybe cinemas need to look at removing admission prices, getting more people through the doors, and making more money from food and drink sales and from advertising. Or maybe the reaction to Glass is just a fear of the unknown: it's a new technology that they don't understand and don't know how to deal with. I would be interested to find out if, as a test, anyone has tried recording a film on Glass, seeing how hard it is and what the result actually looks like. I'm guessing not.

Something happened this week that, along with the Google Glass ban and the controversy over the Facebook scientific paper, really made me think about the use of technology.

I was at an event looking at new technology and its applications. It was pretty interesting but nothing unusual. I went to the event and picked up my pin-on badge, talked to some vendors selling nice tech (but nothing out of this world), had a cup of coffee and then listened to some presentations. At this point I was shaken.

One of the presenters was talking about tracking technology for events just like the one I was at. I'm used to companies scanning a barcode or QR code on your badge to add you to a mailing list; when that happens it's OK, I consent to it. But the presenter showed that they were tracking which stands people had visited, and there wasn't any visible tag on my badge. I hadn't had it scanned, or touched it to any reader in any way.

Immediately I took my badge off and held it up to the light, and clearly there is a form of RFID tag visible.


The badge was just a thin paper badge; it probably costs several pence to make. I was quite shocked at being able to be tracked like that, but in that case it didn't bother me too much. The possibilities of where it could be used are more worrying, though. It's well known that store loyalty cards are used to track your purchases and can even predict when a woman is pregnant. Having this technology in cards will mean that a store can tell not just what you buy but what you don't buy, what route you take around their store, where you walk slowly or fast, or whether you change direction and go back for something you forgot. It will be able to tell which entrance and exit you use, and whether you ate in the cafe or used the cash machine.

The Google Glass ban in cinemas is silly because anyone obviously wearing Glass isn't the one up to no good. While people have been rightly concerned over the Facebook study, it is research that Facebook are comfortable sharing with their competitors and the wider world; it was two years ago and for one week only. If you are worried about that study, have a think about what they have been doing for the last two years: not sitting back and twiddling their thumbs. Amazon, Twitter, Google and Apple must be glad that Facebook went public with the study; it takes the heat off them, and no doubt it will be a reminder to keep all of their work private now.

I love technology. I work with it every day and I know it can be used for some really fun, interesting and worthwhile applications. But I do worry that it needs to be kept in check and monitored, otherwise in the wrong hands it can be harmful. For that to happen there has to be an understanding of what it can do and how it can be used; what it should be used for and what it shouldn't. Learning to program with devices like the Raspberry Pi and the Arduino is a great start, but so is thinking about what all the technology around us is doing and why it is there. I would encourage everyone who is ever given a simple paper badge to hold it up to the light and see what might be hiding inside it.










developer museums Projects technical thoughts

An App called Farting Statues, I made it, here's why

Farting Statues. Yes, really, an app called Farting Statues. If you don't believe me, go and have a look on the Google Play Store. If you have an Android device, install it and have a play with it. OK, it's a real app that I made; explaining why I made it might take a bit longer, but here goes.


Farting Statues main screen

At the end of last year I took a Coursera course called Creative, Serious and Playful Science of Android Apps, an introduction to computer science and writing Android apps. I'm not really a beginner, but it was a nice course to do. I picked up some useful tips and tricks for using the Eclipse IDE, and it was good to have a lot of the things I have taught myself verified as the right way to go.

One of the early assignments was to produce a simple app that displayed a photograph of an early computer along with explanatory text. The assignment didn't require any coding as such, just producing a portrait and a landscape layout and having the app swap between the two when the phone was rotated.

The assignment did get me thinking. It had a very strong museum feel to it, very similar to the sorts of apps that museums produce. Theirs are so much more polished and professional, but here was me writing a very small and simple museum-type app. Would it be possible to use app writing as a way for visitors to engage with content? Instantly I fell in love with the idea of guerrilla museum apps: visitors writing apps using the content available on museum websites, and distributing them on app stores for other people to use when visiting museums.

There is a big push at the moment for people to learn to code, to use computers not just to consume content but to create it as well. I decided to write a museum app to explore this idea and to look at the potential pitfalls of doing this, both from an app writer's and a visitor's point of view, and at what benefits and problems it would cause a museum.

First thing: pick a museum and collection. The Science Museum might seem an obvious choice, as I have easy access to the collection and information. But I also really like my job, and the Science Museum had just launched an official iPad app; creating a guerrilla version of that app seemed a really bad idea if I wanted to keep my job. I wanted to be a little subversive, but I'm not stupid.

Around Christmas time, Team Cooper launched a little browser game called Farter Christmas. It was silly, childish and great fun. That gave me the idea: combine the childishness of a fart app with the high culture of the statues in the Victoria and Albert Museum.

The concept was simple and didn't change: pick around five or six statues, find out a few facts for each one, and reveal a random fact combined with a fart noise.
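The core logic really is that small. As a sketch of the idea (the real app is Android/Java; this is just the concept in Python, and the facts here are placeholders, not the researched content from the app):

```python
import random

# Placeholder facts -- the real app uses researched content per statue
FACTS = {
    "The Thinker": ["placeholder fact one", "placeholder fact two"],
    "The Dacre Beasts Dolphin": ["placeholder fact three"],
}

def reveal(statue):
    """Pick a random fact for the chosen statue and pair it with a
    fart sound effect (represented here by a stand-in filename)."""
    fact = random.choice(FACTS[statue])
    return fact, "fart.mp3"
```

Everything else in the app is layout, images and licensing; the mechanic itself fits in a dozen lines.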

The first version of the app was really easy to write, and operationally it didn't change through the development. It had just one small problem: the app crashed a lot. It took quite a lot of digging around the developer docs and Stack Exchange pages to find out how to cure the problem. Hitting a problem like this instantly takes the app creation process from something an absolute beginner could do to something that requires either great determination and time spent learning other app development skills and knowledge, or assistance from somebody more experienced.

Once I had solved that problem there weren't really any other technical problems.

Finding Content 

The next part was to find the statues and facts about each one. The finished app has only two statues from the V & A: the Dacre Beasts Dolphin and the Bather by Albert Toft. The biggest problem with selecting statues was finding the content. I really loved the Dacre Beasts so was glad to find information about them, but there was very little on the V & A website. It was only really Rodin's The Thinker that had a lot of easily available information, because it is such a famous piece.


The Dacre Beasts, The Dolphin

So not only did I have to widen it out to statues outside the V & A, I had to widen it out to statues outside of museums altogether. That is why the Moai statues of Easter Island are included.

Morals and Ethics

It's a silly app with farting statues; it might not seem that morals and ethics would be a concern. But while walking around the V & A I realised that a lot of statues are of a religious nature: there are representations of Buddha, other Indian gods and the Madonna and Child. Using any of those in the app could potentially be offensive to people of those religions. I wanted to create a fun app, not one that could cause serious offence. Again, I wanted to be a little subversive, but I'm not stupid.

Copyright and Licensing

The two V & A statues that I used, the Dacre Beasts Dolphin and the Bather, are both photographs that I took myself. Why? I couldn't find any appropriately licensed images to use. All of the other photographs are from Wikimedia and are either Creative Commons licensed or released into the public domain. That was the reason for the prominent credits button on the front screen: I wanted to make sure that the licensing of the images was clear and up front.

It was only near the end of the development process that I realised the image I was intending to use for the Bather wasn't licensed for use, so I had to take my own photograph.

One of the statues that I did consider using has an image available from Tate Images. The cost of using it was prohibitive, so it wasn't chosen. Looking at the categories of products and media available, they were all aimed at physical products (mousemats, mugs, posters, etc.), nothing suitable for use in a digital product. It makes me wonder how museums will handle people wanting to use images in apps.

Advertising and Distribution

The app has adverts in it. They are displayed on the individual statues but not on the front screen. This was something I hadn't done before, so I wanted to do it from a technical point of view, to see how easy it is and to consider what happens when an app developer uses museum content to make money. I'm not sure how much the app will make; I'm not expecting to get rich from it. Just as museums have no control over people developing apps with their images, I have realised I have no control over the content of the adverts. On the Play Store the app is marked as suitable for all ages, but looking at a few ads that have come through on my phone already, one is to download a 'virtual girlfriend'. I haven't the faintest idea what that is and don't plan on finding out, but I'm not convinced it is suitable for 'all ages', or that it wouldn't end up creating a massive security hole on my phone.

Are museums set up to make money from apps that other people develop? I have not made any connection in my app between myself and the V & A or other museums. If people were to write apps using museum content and distribute them, would it be clear that they aren't official apps produced by the institution? What if there are mistakes or offensive content? What would happen then? How much trouble would it cause for the museum or gallery?


So do I still think that writing guerrilla apps is a way for people to remix and engage with museum content while learning to code? The bare bones of this app were written in a single weekend, but it took a lot longer to research the content for it. I am lucky to work next door to the V & A, so popping across the road to take photographs wasn't a problem, but if you aren't near a large national museum, or the museum you want to take photographs of doesn't allow them, that could cause problems.

It won't be straightforward, and I can see apps like these developing in two ways. People who can already code will develop apps along similar themes to Farting Statues, hopefully not loads of clones of this one (the world doesn't really need any more farting statues apps), but being creative and having fun. I have been careful to make sure the images were properly licensed. The majority of the content comes from Wikipedia rather than museum websites, so is the information correct? It's as good as I can make it, but I'm not an expert on any of the statues or the artists. I would rather have the information come from the definitive source of the museum website, but I wasn't able to.

The other possibility is for museums to run coding workshops with visitors, starting with part-written apps or web pages and embedding museum content into them. Web pages can easily be converted into mobile apps. This would give people an app they can take away with them, and would hopefully be a springboard into finding out more about coding and app development.

Either way, it needs museums to push out more content and information; the internet isn't limited to the space on a label. It's a lot easier to find information on Wikipedia than it is on a museum website.

In the same way that museums worried putting content online would reduce physical visits to their institutions, I have no doubt there will be similar worries about putting content online in a way for people to remix and develop. While developing this app I found myself becoming more interested in the information than if I had just read it; having to find useful facts and break the content down into small chunks made me draft, read and re-draft the text several times. This is something museum exhibit developers have to do, so why not break down the barriers and give visitors this chance to get down to the nitty gritty with the content? After all, as it's digital it can easily be changed; that's the beauty of it.

If museums want to stay relevant as their visitors hopefully become not just consumers of digital content but creators as well, a shift will be needed to make more content available online and to encourage its use rather than creating barriers. Guerrilla apps and Farting Statues may not have all the answers, but I think they could be a start.






technical thoughts

Happy Birthday TCP/IP

I love you even if nobody else does


Technology moves at an incredible pace, and we are used to the devices and gadgets we rely on routinely becoming obsolete. What is today's must-have smartphone is tomorrow's unwanted plastic brick. Many people are familiar with Moore's law, which says that the number of transistors on a chip doubles roughly every 18 months, so it isn't really surprising that we can now fit in our pockets computers with power that not so many years ago would have filled rooms, if not buildings.

It's not just the computing power that is changing; it's also how technology works. Telephone calls used to require people to physically connect the callers together; those operators were replaced by electro-mechanical exchanges and then by fully digital ones. So it might seem surprising that every time we send an email, read a tweet or post a silly Buzzfeed link on Facebook, we are using a communication protocol that is celebrating its 40th anniversary and looks like it will be with us for a good while yet.

In May 1974 Vint Cerf and Bob Kahn published "A Protocol for Packet Network Intercommunication" that laid down the principles for a packet switching network protocol that could connect existing networks together. The protocol became known as TCP/IP and its ability to allow internetwork communication to take place enabled the ARPANET to grow into what is now the internet.

TCP/IP isn't really a single protocol but a whole suite of protocols that carry out different tasks. These include host addressing, to identify unique devices on the network, whether a desktop computer (remember those?), laptop or smartphone; routing of packets around the ever-changing network; and error handling for when things go wrong.
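To make that a little more concrete, here is a minimal sketch of the TCP part of the suite in action: a loopback echo server and client talking over a socket. This is purely my own illustration, not anything from the original post; the class name and message are invented.

```java
import java.io.*;
import java.net.*;

// A tiny TCP demo: a server thread echoes one line back to a client.
// TCP takes care of delivering the bytes intact and in order; the
// 127.0.0.1 loopback address shows host addressing at work.
public class TcpEchoDemo {
    public static String echo(String message) throws IOException {
        try (ServerSocket server = new ServerSocket(0)) { // 0 = pick any free port
            int port = server.getLocalPort();
            Thread t = new Thread(() -> {
                try (Socket s = server.accept();
                     BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(in.readLine()); // echo the line straight back
                } catch (IOException ignored) { }
            });
            t.start();
            try (Socket client = new Socket("127.0.0.1", port);
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()))) {
                out.println(message);
                return in.readLine(); // arrives exactly as sent
            }
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(echo("Happy Birthday TCP/IP"));
    }
}
```

Forty years on, a couple of dozen lines in any mainstream language gets you a reliable connection, which is rather the point.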

Adoption of TCP/IP took what might seem like forever by today's standards. ARPANET didn't fully swap over to TCP/IP until 1983, by which time TCP/IP had reached version 4, the version that is still predominantly used today.

But as with most things in life, things don't stay still for long, even with 40-year-old protocols. The internet grew vastly larger than anyone could ever have imagined, and decisions taken in those early days turned out to be limiting factors today. IP, the part that provides the unique addresses on the internet, has all but used up its 4.3 billion addresses, and these are now only assigned very carefully. Fortunately IPv6 exists, with a possible 3.4×10^38 addresses as well as other enhancements: enough IP addresses to keep everybody on the planet with a connection for a long time to come.
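Those two figures fall straight out of the address widths: IPv4 addresses are 32 bits, IPv6 addresses are 128. A couple of lines of Java (my own illustration, not from the post) confirms the arithmetic:

```java
import java.math.BigInteger;

// IPv4 uses 32-bit addresses, IPv6 uses 128-bit ones. Computing 2^n
// for each shows where the 4.3 billion and 3.4x10^38 figures come from.
public class AddressSpace {
    public static BigInteger addresses(int bits) {
        return BigInteger.valueOf(2).pow(bits);
    }

    public static void main(String[] args) {
        System.out.println("IPv4: " + addresses(32));   // 4294967296, about 4.3 billion
        System.out.println("IPv6: " + addresses(128));  // a 39-digit number, about 3.4 x 10^38
    }
}
```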

It's not just been 40 years of serious computer science. As an April Fools' joke, TCP/IP over Avian Carriers was proposed, and it has actually been implemented with homing pigeons. An excellent interactive demonstration of how TCP/IP can run over any medium is TCP/IP over H2O.


This post was never meant to be a full technical breakdown of TCP/IP or computer networking; if you want that, there are plenty of big thick books available. Nor was it an in-depth look at the history of the internet; there are probably books about that as well.

The reason for writing it was that I recently saw a tweet from @google announcing a celebration of the 40th anniversary of TCP/IP. I imagined that the mainstream news sites would be all over it: articles, blog posts, top 10 lists and so on. But then, nothing.

It makes me a little sad that all the internet seems to care about are headphone companies and celebrity misdemeanours. This post is me shouting and cheering for the early internet pioneers and all those who came after and put so much hard work into creating this amazing thing we have now. Without you the World Wide Web wouldn't have been possible, and millions of cute cat pictures would have stayed hidden.

Happy Birthday TCP/IP, you are looking good for a 40-year-old protocol. Here's to another 40.




Show / Hide hidden files in OSX 10.9 Mavericks

Sometimes I want to see all the hidden configuration files that various applications on OSX use. But sometimes I want to pretend everything is beautifully clean and works with Apple-like magic.

With older versions of OSX it simply needed a command from the terminal of

defaults write com.apple.Finder AppleShowAllFiles YES && killall Finder

or to reverse the effect

defaults write com.apple.Finder AppleShowAllFiles NO && killall Finder

But strangely, that doesn't seem to work with OSX 10.9.

It took a bit of digging in the forums, but the command needed is now

defaults write com.apple.finder AppleShowAllFiles TRUE && killall Finder

That might look almost the same, but the key difference is the domain: the 'F' in 'com.apple.Finder' is now a lowercase 'f'. The 'F' in 'killall Finder' is still capital.

There seem to be a few variants using TRUE/FALSE, YES/NO or -boolean true/false, but TRUE/FALSE worked for me.





Projects technical thoughts

How I developed A Really bad Calculator app

A few weeks ago I was pondering on Twitter about how calculator apps still look like traditional physical calculators.

I think I'd recently been using the Calculator widget on my Mac, so the thought wasn't specifically aimed at mobile apps. I wasn't even sure how true it was exactly; for all I knew there could have been a massive surge in designers and developers re-thinking the calculator experience. I discussed this on Twitter a little and did have a look around on Google Play. There are some new apps that push the design, but there hasn't been any big revolution in calculator apps.

I was looking for a project while I was off work over Christmas, and the thoughts about calculators were still floating around in my head. I wanted to keep working on my design skills and also do a native Android app, as it's been a while and I was feeling out of practice.

I'm not sure where it came from, but I had the idea of a calculator that would add, subtract, multiply and divide two numbers simultaneously, and I wanted the app to be really simple.

How I envisaged it working was that you would enter a number, press the <enter> key, enter a second number, and the results of the two numbers having been added, subtracted, divided and multiplied would be shown on separate lines. I also wanted the results to update as the second number was being entered.
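The arithmetic at the heart of that idea is tiny; a minimal plain-Java sketch of it might look like this (my own names, not the app's actual code):

```java
// Core of the simultaneous calculator: apply all four operations to the
// same pair of numbers at once. Division is done in floating point so
// that, say, 5 / 6 gives 0.8333... rather than 0.
public class SimultaneousCalculator {
    // Returns { a+b, a-b, a*b, a/b } in that order.
    public static double[] results(double a, double b) {
        return new double[] { a + b, a - b, a * b, a / b };
    }

    public static void main(String[] args) {
        double[] r = results(12, 3);
        System.out.printf("+ %.2f%n- %.2f%n%c %.2f%n%c %.2f%n",
                r[0], r[1], '\u00d7', r[2], '\u00f7', r[3]);
    }
}
```

All the real work in the app was in the interface around this, which is where the interesting problems were.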

The interface design and development took the longest because I was very keen not to compromise two of the features of the app.


The finished interface design


The first feature is the large right-angled enter key. I originally looked at using an Android GridView to lay out the buttons, but it isn't actually possible to have buttons that aren't rectangular. It would probably have been possible to define a different-shaped button and a GridView layout to accommodate it, but that seemed like a lot of work and not a very general-purpose solution. I'll write another, more detailed blog post, but it boils down to putting another image above the image used for the layout, making it invisible but still being able to detect the colours in that image.


 The hidden image showing the different colours used for detecting the different buttons

I wasn't sure if I needed to detect the plus, minus, divide and multiply symbols being pressed, but I decided it was better to have them in just in case.
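The colour-keyed hit testing described above can be sketched in a few lines. The actual app would read the Android Bitmap under the touch point; here java.awt's BufferedImage stands in for it, and the colours, class and button names are my own invention:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.util.HashMap;
import java.util.Map;

// Sketch of colour-keyed hit testing: each button region is painted in
// a unique colour on a hidden image, and a touch at (x, y) is resolved
// to a button by looking up the pixel colour at that point. Regions can
// be any shape, which is the whole attraction over rectangular buttons.
public class ColorHitMap {
    private final BufferedImage hitMap;
    private final Map<Integer, String> buttonsByColor = new HashMap<>();

    public ColorHitMap(int width, int height) {
        hitMap = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    }

    // Paint a rectangular button region in its key colour (real regions
    // could be drawn with any shape the Graphics2D API supports).
    public void addButton(String name, Color key, int x, int y, int w, int h) {
        Graphics2D g = hitMap.createGraphics();
        g.setColor(key);
        g.fillRect(x, y, w, h);
        g.dispose();
        buttonsByColor.put(key.getRGB(), name);
    }

    // Resolve a touch to a button name, or null for dead space.
    public String buttonAt(int x, int y) {
        return buttonsByColor.get(hitMap.getRGB(x, y));
    }

    public static void main(String[] args) {
        ColorHitMap map = new ColorHitMap(100, 100);
        map.addButton("7", Color.RED, 0, 0, 30, 30);
        map.addButton("enter", Color.GREEN, 60, 0, 40, 100);
        System.out.println(map.buttonAt(10, 10)); // 7
        System.out.println(map.buttonAt(70, 90)); // enter
    }
}
```

On Android the lookup would be Bitmap.getPixel inside an OnTouchListener, with the hit-map image never drawn to screen.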

The other difficult part was lining up the displayed results with the +, -, ÷ and × symbols. There is a lot of formatting in the TextView that had to be stripped out, and it took a bit of googling to find out exactly how to do it.

Another design decision I made was to have the number keypad use the layout shown. A traditional calculator has the number 1 at the bottom left, with the numbers going upwards. I chose to have the keypad with the number 1 at the top left and the numbers going downwards, the same as a telephone keypad.

Once the design had been done, it took a few attempts to get the operation of the calculator just right. It was then that I spotted the massive flaw in the design.

The calculator cannot mix operations on the numbers that have been input. For example, it will do

5 + 6 + 3

5 - 6 - 3

5 × 6 × 3

5 ÷ 6 ÷ 3

all at the same time but it cannot do 5 + 6 × 3 or 5 - 6  ÷ 3

Big OOPS there. I did wonder if it would be possible to alter the interface to select which operation to apply next to the already displayed answers, but chose not to; it would have become very complicated very quickly.

I decided to end the development at that point. I was happy that I had come up with a new idea for a calculator and learnt some new skills designing and developing the app, but I didn't want to get bogged down fighting a design that didn't work.

I still think there is room for improvement and fresh thinking about the calculator. If any ideas do pop out of my head, I may have another look at it.

The calculator isn't on Google Play, but the source code is up on GitHub for anyone who wants to have a look at it.


The calculator showing the results of the add, subtract, multiply and divide operations on 12 and 3