I can remember

Silicon memories from a local photo album

I can remember is a project that probes the creative potential of image analysis algorithms and the empathy we can feel towards machines. It takes a small series of local photographs and runs them through two image description algorithms. Their outputs are combined through a generative grammar into poems, which are rendered as handwriting and then plotted by a mechanical arm in the same area where the photographs were taken.

Artwork presentation at Reclaim Futures 2020

Whose Memories?

The piece has three main layers of memory. The most obvious is the personal one, highlighted by the location of the shooting and the data the piece is based on. The second, still visible, is the memory of the algorithms: the labels and words embedded inside them. The third, often unaccounted for, is the memory of the people who are part of the dataset, the people whose data was extracted to create the algorithms.



This piece was made in a rural area. The photographs that constitute the dataset document the landscape and daily life there, and they are the ones used to create the poems. They are the images, the instants kept of this moment; with the years, memory will start erasing all the rest. Photographs are totems for invoking receding feelings of past times. They are the memories that remain: partial, biased, and framed.

By putting the poems back into the place they were “inspired” by, the piece might convey a little more of the place and show what translation errors the algorithms have made.



The second type of memory is that of the image recognition algorithms, which can only recognise what they have been trained on. The piece highlights a new kind of sentience and encapsulates their understanding of, their experience of, the world. It shows which clusters of pixels can trigger which transistors, and reveals which electric phenomena they know. It unearths the breadth of their vocabulary, and what words represent for them.



The third layer of memory is that of the people who trained the algorithms, or perhaps on whom the algorithms have been trained. Image classification algorithms rely on people labelling data, classifying it, or contouring it to explain to the computer what is what.

The descriptions of the images, the names of the colours, and the handwriting styles in this piece are all derived from datasets created and/or annotated by humans. When we use these technologies, it is their understanding of the world that we use, their synapses, their memories. The symbol represented by each word is taken from their understanding of the word; the curves we see in the handwriting are taken from their motor skills, their sensory experience of what it is to write. This piece encapsulates their memories: muscle memories, lexical memories, and memories of phenomena.


Witnessing the cyborg within the technology

What are sometimes called artificial intelligence technologies are cyborg technologies: they encapsulate an array of sentient experiences, and they are the result of many different experiences of the world.



We can break the process behind this piece into four steps: first, the creation of the dataset, photographs documenting daily life; then the poem generation; then the handwriting generation; and finally the plotting.

Photo Dataset

The photo dataset is composed of pictures of landscapes, people, textures, and animals from the area.


Poem Generation

To create poems from the photos, I used two main external tools: the Google Vision API and Color Summarizer by Martin Krzywinski. Applied to the photograph dataset, these two algorithms produced two distinct vocabularies: labels and colour descriptions. These vocabularies were then merged into a generative grammar via Tracery (through its Python port) to create the poems. The process is described below.
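To give a flavour of the grammar step, here is a minimal, self-contained sketch of how a Tracery-style grammar expands rules. The piece itself uses Tracery proper; the vocabularies below are hypothetical stand-ins for the Vision API labels and the colour descriptions, and the rule names and poem shape are illustrative only.

```python
# Sketch of Tracery-style rule expansion. The "label" and "colour"
# vocabularies are made-up stand-ins for the two real vocabularies
# (Vision API labels and Color Summarizer descriptions).
import random
import re

rules = {
    "label": ["meadow", "dog", "fence", "cloud"],        # stand-in labels
    "colour": ["washed-out blue", "pale straw yellow"],  # stand-in colours
    "line": ["a #colour# #label#", "the #label#, #colour#"],
    "origin": "#line#\n#line#",
}

def flatten(symbol, rules, rng):
    """Recursively expand #symbol# references, picking options at random."""
    options = rules[symbol]
    text = rng.choice(options) if isinstance(options, list) else options
    return re.sub(r"#(\w+)#", lambda m: flatten(m.group(1), rules, rng), text)

print(flatten("origin", rules, random.Random(7)))
```

Each `#symbol#` in a rule is replaced by a random expansion of that symbol, which is all a grammar of this kind needs to turn two flat vocabularies into short poem lines.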


Handwriting Generation

The handwriting process was more straightforward, and it is more alien to the place than the poems. The handwriting is generated by an algorithm trained on a handwriting dataset from the Computer Vision and Artificial Intelligence group at the University of Bern, following a process described by Alex Graves. The process used to create the handwriting is summarised in one of the supporting images.
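The core of Graves-style handwriting synthesis is that, at each time step, the network emits the parameters of a bivariate Gaussian mixture over the next pen offset (dx, dy), from which a point is sampled. Below is a sketch of just that sampling step; the parameter values are made up for illustration, whereas in the real model they come from the trained network.

```python
# Sampling one pen offset from a bivariate Gaussian mixture, as in
# Graves-style handwriting synthesis. All parameter values here are
# hypothetical; the trained RNN would produce them at each time step.
import math
import random

def sample_offset(pi, mu, sigma, rho, rng):
    """Sample (dx, dy) from a bivariate Gaussian mixture.

    pi: mixture weights; mu: list of (mx, my) means;
    sigma: list of (sx, sy) std devs; rho: list of correlations.
    """
    # Pick a mixture component according to the weights pi.
    k = rng.choices(range(len(pi)), weights=pi)[0]
    # Sample a correlated bivariate Gaussian from two independent normals.
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    dx = mu[k][0] + sigma[k][0] * z1
    dy = mu[k][1] + sigma[k][1] * (rho[k] * z1 + math.sqrt(1 - rho[k] ** 2) * z2)
    return dx, dy

# Hypothetical two-component mixture.
dx, dy = sample_offset(
    pi=[0.7, 0.3],
    mu=[(1.0, 0.0), (0.0, 1.0)],
    sigma=[(0.2, 0.2), (0.1, 0.1)],
    rho=[0.5, -0.5],
    rng=random.Random(0),
)
print(dx, dy)
```

Chaining many such samples (together with a sampled end-of-stroke bit) produces the pen trajectory that becomes the handwriting.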



The code feeds the words produced by the poem generator into the handwriting generator to create an SVG file, which is then sent to a mechanical arm that plots it on paper.
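This last step can be sketched in a few lines: pen strokes become SVG paths that a plotter can trace. The stroke data below is hypothetical; in the piece the points come from the handwriting generator.

```python
# Sketch of the strokes-to-SVG step. The input strokes are hypothetical
# stand-ins for the pen-down segments the handwriting generator produces.
def strokes_to_svg(strokes, width=200, height=100):
    """Turn a list of strokes (each a list of (x, y) points) into an SVG string."""
    paths = []
    for stroke in strokes:
        # One "M ... L ..." path per pen-down stroke, no fill, thin line.
        d = "M " + " L ".join(f"{x:.1f} {y:.1f}" for x, y in stroke)
        paths.append(f'<path d="{d}" fill="none" stroke="black"/>')
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">'
        + "".join(paths)
        + "</svg>"
    )

# Two short hypothetical strokes.
print(strokes_to_svg([[(10, 50), (20, 40), (30, 50)], [(40, 45), (50, 45)]]))
```

Because SVG paths preserve the pen-up/pen-down structure of the strokes, a plotter driver can lift and lower the pen exactly where the generator did.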


People do not create in a void. This is especially true for artworks that use technology, and this work would not have been possible without other scholars and artists paving the way for experimentation and creativity. For this piece in particular, I found information and resources in Lynn Cherny’s COCO’s magical palace, hardmaru’s post on handwriting synthesis, and sjvasquez’s implementation of Alex Graves’ paper (thanks also to sarahstrong and ChenXiaoTemp for their forks). For colour analysis I used Martin Krzywinski’s great tool, Color Summarizer. For grammar construction I used Tracery by Kate Compton and the Python port by Allison Parish.