Material & Virtual Cultures
Opening The Cases: An interpretative tool to explore objects at the Museum of Childhood.
The interpretative tool I propose is an app for tablets and mobile devices which focuses on objects within the collection of the Museum of Childhood.
The app offers users the ability to see and experience objects from the collection in motion, and to explore them playfully. Through this tool users can enjoy a more complete experience of the objects on display: in higher resolution, through zoom, with 360° views, in motion, and with sound. It also offers the potential for learning through play. Essentially the app ‘brings the objects to life’, enhances the audience’s experience, and puts more, and newer, information at the fingertips of the viewer.
A key motivation for the development of this tool is the inherent tension in displaying a toy (originally made to be touched and played with) in a glass cabinet, removed from the interactions ingrained in the very nature of these objects.
Once the app is installed on a portable device, visitors will be able to use it to interact with objects as they move through the museum. The app is targeted at families and schoolchildren but could, through the addition of further data and re-skinning, be equally suited to other audiences.
For the purposes of this document, and in order to demonstrate the functionality of the tool, I have chosen six objects from the collection to use as examples. I envisage, however, that the success of the app would ultimately be contingent on applying the technology to a critical mass of objects within the collection. The mock-up illustrations show an iPad layout, though there is no reason why Android or indeed Windows Phone apps could not be developed. (Further research into the demographics of visitors to the Museum would be required before committing to a development and roll-out schedule.)
The screen size of tablets makes them ideally suited to the display and exploration of artefacts, though a simplified version for smartphones could be developed.
One model for use sees the Museum own the tablets, with the data preinstalled, and loan them to visitors. While the considerable benefits of low buffer time and fast data retrieval would result in an excellent user experience, the relatively high capital investment required of the Museum might prohibit this model. An alternative model sees visitors ‘bring their own device’ (BYOD) and choose to install the app when they arrive. Here data retrieval would be slower, and the user experience would be less consistent owing to the variety of possible devices, and versions of OS, running the app. Both models inevitably require some investment in infrastructure, as high-bandwidth Wi-Fi access points and optical fibre would be needed to serve the data; a server upgrade might well be necessary too.
The interactive tool will have various functions:
Initial interface
Exploration tools
Users can explore by image recognition: the app makes use of the camera on the device, allowing visitors to identify single objects they wish to explore.
The ‘Explore the Cases’ option makes use of NFC (near field communication) technology, allowing the device to ‘sense’ where the visitor is and first present them with options to explore the cases right next to them. Continued swiping of the tablet would reveal cases further away. (Effectively the cases are ‘sorted’ by proximity. Additionally, this tool serves as an alternative gallery map, allowing visitors to self-guide around the Museum.)
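To illustrate how the proximity sorting might work, the sketch below (in TypeScript) assumes that the NFC reading has already been resolved to approximate gallery coordinates and that each display case is catalogued with a position; the names used here (Position, DisplayCase, sortCasesByProximity) are hypothetical, not part of any existing system.

```typescript
// Hypothetical sketch: sorting display cases by proximity to the visitor.
// Assumes the NFC reading has been resolved to approximate gallery coordinates.

interface Position {
  x: number; // metres from a fixed gallery origin (assumed convention)
  y: number;
}

interface DisplayCase {
  id: string;
  title: string; // e.g. "Optical toys"
  position: Position;
}

function distance(a: Position, b: Position): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Nearest cases first: the opening screen shows the cases beside the visitor,
// and each further swipe pages deeper into this sorted list.
function sortCasesByProximity(cases: DisplayCase[], visitor: Position): DisplayCase[] {
  return [...cases].sort(
    (a, b) => distance(a.position, visitor) - distance(b.position, visitor)
  );
}
```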
The illustrated examples (below) show the Optical toys section and the Clockwork section.
Once ‘in’ a case, users can click individual objects to learn more or see them in action.
Toys in motion: Moving toys (clockwork, wind-up, and optical toys) can be seen in full, dynamic action using this function. For my illustrations I chose three objects from the optical toys section (the only case which the Collections Department agreed to open). The fully developed app would, of course, include numerous optical toys with diverse mechanisms, as well as other moving toys. Items would be pre-recorded in action, allowing viewers to access short video clips.
Phenakistiscope, 1833-1899



360° and Close up: Museum lighting and reflections in glass display cases can often result in poor visibility. This tool allows the viewer the luxury of seeing objects in pin-sharp photographic detail. A 360° rotating tool allows objects to be seen from every angle: in many instances the ‘back’ of objects in display cases is concealed, and invariably the underside is impossible to appreciate. Users experience a true sense of discovery as they spin and pinch-zoom the objects, investigating makers’ marks and tiny details. (A brief sketch of the spin interaction follows the object captions below.) In producing the component photographs for this tool, each object is lit and handled using archival, sympathetic methods. This documentation process simultaneously provides assets for the digital tool and a detailed archival record.
(drag to spin, and double-click to zoom)
'Panorama', Toy Theatre, made 1880-1900.
Lithographed tinplate chicken with clockwork mechanism, made in China, 1975-1979.
Lithographed horse with clockwork mechanism, made in Germany, 1970.
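The drag-to-spin interaction could be driven by a pre-photographed sequence of frames (for example 36 photographs taken at 10° intervals around the object), with the horizontal drag distance mapped to a frame index. The TypeScript sketch below is a minimal illustration under those assumptions; the class and parameter names are hypothetical.

```typescript
// Hypothetical sketch: a 360° 'turntable' viewer driven by a pre-shot photo
// sequence, e.g. 36 frames taken at 10° intervals around the object.

class TurntableViewer {
  private baseFrame = 0;     // frame shown when the current drag began
  private currentFrame = 0;

  constructor(
    private readonly frameUrls: string[],   // ordered photographs of one object
    private readonly pixelsPerFrame = 12    // drag sensitivity (assumed value)
  ) {}

  // Call when a drag gesture starts (pointerdown).
  startDrag(): void {
    this.baseFrame = this.currentFrame;
  }

  // totalDeltaX: horizontal pixels dragged since the gesture began (pointermove).
  // Returns the URL of the frame that should now be displayed.
  drag(totalDeltaX: number): string {
    const n = this.frameUrls.length;
    const step = Math.round(totalDeltaX / this.pixelsPerFrame);
    this.currentFrame = ((this.baseFrame + step) % n + n) % n; // wrap both ways
    return this.frameUrls[this.currentFrame];
  }
}
```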
Exploration Through Play
Play with the collection: (Drag & Drop Interactivity)
This section will allow the audience to ‘virtually’ play and interact with selected toys in the collection. To demonstrate this I have created a working mock-up of a game for under-sevens. The game, called ‘Sophie doll’, lets children dress the wooden doll using a drag & drop touch interface. Here I combine images of several objects from the collection, including clothes from different dolls. The idea behind this is not only to offer possibilities for discovering the collection through play, but also to create instances of connectivity between toys in the collection. (When the visitor touches an image, information about the object appears on an overlay.)
(Please note: the drag & drop feature in this mock-up doesn't work on touch devices.)
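If the mock-up relies on the browser's native HTML5 drag & drop events (which do not fire for touch input), a production version could use Pointer Events instead, so the same handlers serve both mouse and finger input. The sketch below is a minimal illustration of that approach; the element roles and the onDressed callback are hypothetical, and the draggable items would also need `touch-action: none` set in CSS so the browser doesn't scroll the page instead.

```typescript
// Hypothetical sketch: touch-friendly drag & drop using Pointer Events.
// The clothing item elements are assumed to carry a data-object-id attribute.

function makeDraggable(
  item: HTMLElement,                 // a piece of clothing in 'Sophie's Wardrobe'
  doll: HTMLElement,                 // the drop target (the doll)
  onDressed: (objectId: string) => void
): void {
  item.addEventListener('pointerdown', (down: PointerEvent) => {
    item.setPointerCapture(down.pointerId);

    const move = (e: PointerEvent) => {
      // Follow the finger or cursor with a CSS transform.
      item.style.transform =
        `translate(${e.clientX - down.clientX}px, ${e.clientY - down.clientY}px)`;
    };

    const up = (e: PointerEvent) => {
      item.removeEventListener('pointermove', move);
      item.removeEventListener('pointerup', up);
      item.style.transform = '';
      // If released over the doll, count it as a successful 'dress' action.
      const r = doll.getBoundingClientRect();
      if (e.clientX >= r.left && e.clientX <= r.right &&
          e.clientY >= r.top && e.clientY <= r.bottom) {
        onDressed(item.dataset.objectId ?? '');
      }
    };

    item.addEventListener('pointermove', move);
    item.addEventListener('pointerup', up);
  });
}
```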
Wooden doll called 'Sophie', made in England between 1750 and […]. Painted and turned wood, with leather and cotton; glass eyes and human hair wig.
(IMAGE MISSING)
As visitors click on the images, information about the pieces appears as an overlay.
(IMAGE MISSING)
Game with a creative output: To add an element of creative play, children would also be able to design a dress for Sophie. These designs could then be added to “Sophie’s Wardrobe”. They could be shared via social media, uploaded to the Museum website, or even shown in the museum via projection, so they might be enjoyed by other visitors too.
After designing a dress for Sophie, users drag their new design onto the doll.



Possible Developments:
The possibilities for creating games with toys from the collection are endless and very exciting. Other ideas for games might include:
Story creation: Choose two toys to begin a conversation. What would they say to each other?
Mixing machine: A ‘toy factory’ game that uses randomness as a surprise element. Mix ingredients to create a new toy: the user chooses toys, or parts of toys, to put into the machine, before seeing the new combination… (A minimal sketch of the mixing logic follows the word game details below.)
Word Game: Make words using a word game from 1900. (Below is an unfinished animation for a possible word game.)
Word Making and Word Taking, 1900-1925 (published).
Square cards with black reverse sides; the white face of each card is printed with a black letter of the alphabet.
No. of cards: about 192
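Returning to the mixing machine idea above, the TypeScript below is a minimal sketch of how the random combination might work: the parts the child has placed in the machine are grouped by slot, and one part per slot is drawn at random. The ToyPart shape and slot names are illustrative assumptions, not catalogue fields.

```typescript
// Hypothetical sketch: the 'toy factory' draws one randomly chosen part per
// slot from the parts the child has put into the machine.

interface ToyPart {
  toyId: string;                                // the museum object it came from
  slot: 'head' | 'body' | 'legs' | 'sound';
  imageUrl: string;
}

function mixToys(chosenParts: ToyPart[]): ToyPart[] {
  // Group the chosen parts by slot...
  const bySlot = new Map<string, ToyPart[]>();
  for (const part of chosenParts) {
    const group = bySlot.get(part.slot) ?? [];
    group.push(part);
    bySlot.set(part.slot, group);
  }
  // ...then pick one part at random from each slot to form the surprise toy.
  return [...bySlot.values()].map(
    group => group[Math.floor(Math.random() * group.length)]
  );
}
```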
Sound Trail: Owing to the Collections Department’s handling policy, I was unable to record the sounds generated by toys being ‘activated’. The concept of this function, however, is that the app would guide the audience around the museum to ‘hear’ the sounds of various toys in use.
This idea suggests further possibilities: with sufficient keyword tagging, visitors could conceivably create their own custom ‘trail’, with a route generated from other characteristics such as material, date, colour, theme, or even their own search word.
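As a minimal sketch of how such a custom trail might be generated, the TypeScript below assumes that each catalogued object carries keyword tags and a gallery position, and simply orders the matching objects along a greedy nearest-neighbour walk from a starting point. Every name here is hypothetical, and a real implementation would need to respect the gallery layout rather than straight-line distances.

```typescript
// Hypothetical sketch: generate a custom trail from a visitor's search word.
// Assumes each object record carries keyword tags and a gallery position.

interface TrailObject {
  id: string;
  name: string;
  tags: string[];                       // e.g. ['tinplate', 'clockwork', '1970s']
  position: { x: number; y: number };
}

function buildTrail(
  objects: TrailObject[],
  searchWord: string,
  start: { x: number; y: number }       // e.g. the gallery entrance
): TrailObject[] {
  const term = searchWord.toLowerCase();
  const remaining = objects.filter(o =>
    o.tags.some(tag => tag.toLowerCase().includes(term)));

  // Greedy nearest-neighbour ordering keeps the walking route short and simple.
  const trail: TrailObject[] = [];
  let here = start;
  while (remaining.length > 0) {
    remaining.sort((a, b) =>
      Math.hypot(a.position.x - here.x, a.position.y - here.y) -
      Math.hypot(b.position.x - here.x, b.position.y - here.y));
    const next = remaining.shift()!;
    trail.push(next);
    here = next.position;
  }
  return trail;
}
```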