Pronouns: any pronoun is fine

Josh Urban Davis is an American research-based artist and engineer from Texas whose practice incorporates sculpture, performance, writing, sound, and video.

His research interests span a wide spectrum of topics in human-computer interaction (HCI), with a specific emphasis on generative design, interaction techniques, and novel interface technologies.

Davis' recent creative projects explore the relationship between emerging technologies, social relationships, and identity. Explicitly engaged with the very technologies he critiques, Davis' work viscerally confronts the viewer, inviting us to contemplate the role new technologies play in mediating our understanding of reality and identity. His work has been exhibited at DiverseWorks, the Blaffer Art Museum, Chandler Center for the Arts, Art League Houston, and others.

He currently lives in Hanover, NH with his cat, Nocturne, where he is pursuing a PhD in computer science at Dartmouth.

  Download CV

Josh Urban Davis profile photo



Designing Co-Creative AI in Virtual Environments

Josh Urban Davis, Fraser Anderson, Merten Stroetzel, Tovi Grossman, George Fitzmaurice.

Proceedings of ACM Symposium on Creativity and Cognition (C&C'21), 2021

[23.1% Acceptance Rate]

Read Paper

See 5 min Talk

A Plurality of Practices

Artistic Narratives in HCI Research

Miriam Sturdee, Makayla Lewis, Angelika Strohmayer, Katta Spiel, Nantia Koulidou, Sarah Fdili Alaoui, Josh Urban Davis

Proceedings of ACM Symposium on Creativity and Cognition (C&C'21), 2021

[31% Acceptance Rate]

[Best Paper: Top 1%]

Read Paper

See Demo Video


Font Your Friends and Loved Ones

On the Utility of Ugly Interfaces.

Josh Urban Davis, & Johann Wentzel.

Extended Abstracts of ACM Conference on Human Factors in Computing Systems (CHI'21), 2021

[21% Acceptance Rate]

Read Paper

Try It



An Interactive 3D Printed Circuit Education Tool for People with Visual Impairments.

Josh Urban Davis, Te-Yen Wu, Bo Shi, Hanyi Lui, Athina Panotopoulou, Emily Whiting, & Xing-Dong Yang.

Proceedings of ACM Conference on Human Factors in Computing Systems (CHI'20), 2020

[23% Acceptance Rate]

[Honorable Mention: Top 5%]

Read Paper

See Demo Video



A System for Peripherally Reinforcing Best Practices in Hardware Computing

Josh Urban Davis*, Jun Gong*, Yunxin Sun, Parmit Chilana, & Xing-Dong Yang (*co-primary).

Proceedings of ACM Symposium on User Interface Software and Technology (UIST'19), 2019

[20.6% Acceptance Rate]

Read Paper

See Demo Video



A Fiber-Optic eTextile for MultiMedia Interactions

Josh Urban Davis

Proceedings of the International Conference on New Interfaces for Musical Expression (NIME'19), 2019

[31% Acceptance Rate]

Read Paper

See Demo Video



Contact-Based, Object-Driven Interactions with Inductive Sensing.

Jun Gong, Xin Yang, Teddy Seyed, Josh Urban Davis, & Xing-Dong Yang.

Proceedings of ACM Symposium on User Interface Software and Technology (UIST'18), 2018

[21% Acceptance Rate]

Read Paper

See Demo Video


past and ongoing works


The following images and animations are generated by artificial intelligence models called Generative Adversarial Networks (GANs), trained on a variety of images including colorized photographs of the universe and paintings from art history. The aesthetics of glitch art are the aesthetics of mistakes. This metaphor functions as a healing practice in that it allows us to see fallibility as valuable. Algorithms carry a utilitarian imperviousness; their failures are a reflection of their creators' shortcomings.

Glitch aesthetics offer an opportunity to see this failure as a bizarre healing practice. In Postcards from the Electric Void, artist and engineer Josh Urban Davis trained a neural network on a dataset of colorized space images, and image archives from art history museums. By asking a machine to imagine the composition of our universe, we facilitate a poetic dialogue between the unfathomably large void of space and the infinitesimal void of data. When the machine produces something plausible as an image of our universe, the experience is uncanny.

However, when the machine produces glitches and errors, these aesthetics reflect the machine's inability to reconcile the unfathomable. In this way, the void-glancing machine is recognizable as fallible, creating a point of empathic connection between human and machine.

This project is ongoing with daily posts to Davis' Instagram and Twitter accounts.

  See More

 Publication at NeurIPS Conference

Pixel Dripper No. 3
Beyond the Beast's Tongue
Wall of Wonders
Postcard from the Electric Void No. 552
Postcards from the Electric Void No. 44522
Butterfly Epiphany
Self-Portrait as Vertigo
Feather River
Pixel Dripper No. 7
A Garden in Perpetual Collapse
Self Portrait as Scattering Glitter
Pixel Dripper No. 9


Synapstraction is a brain-computer interaction project which allows users to create an abstract painting based on neuro-feedback. The system uses a special headset that measures the electrical activity at a person's scalp; these signals are processed by machine learning algorithms to discriminate each stimulus's effect on the brain.

Each visitor is invited to enter the installation and approach one of five “sense” stations, each with an electroencephalography (EEG) headset. We then measure the event-related potential (ERP) elicited by the visitor's brain using the EEG when the visitor receives one of five stimuli. These five stimuli correspond to the five senses (sight, touch, taste, sound, smell). In addition, each stimulus is mapped to a specific creative instrument such as a paintbrush, sponge, or marker.

Once the visitor has secured their headset, they are presented with a stimulus (e.g. a Chopin Nocturne at the “sound” station or fresh ground clove at the “scent” station). Our system then uses a machine learning method called linear discriminant analysis to map the activity of the visitor's brain while experiencing the stimulus to acoustic frequencies which actuate the painting implements. After visiting each of the five “sense” stations within the installation, the participant is invited to keep their finished painting.
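The ERP-to-frequency mapping described above can be sketched roughly as follows. This is a minimal illustration using synthetic data and two-class Fisher discriminant analysis, not the installation's actual code; the feature dimensions, class setup, and frequency range are all assumptions.

```python
import numpy as np

def fisher_lda_direction(X0, X1):
    """Two-class Fisher LDA: the direction that best separates the classes."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: pooled, unnormalized covariance of both classes.
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, mu1 - mu0)
    return w / np.linalg.norm(w)

def project_to_frequency(x, w, lo=110.0, hi=880.0):
    """Squash a 1-D LDA projection into an audible frequency band (Hz)."""
    s = 1.0 / (1.0 + np.exp(-np.dot(x, w)))   # logistic squash to (0, 1)
    return lo + s * (hi - lo)

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, size=(200, 8))   # synthetic ERP features, no stimulus
response = rng.normal(0.8, 1.0, size=(200, 8))   # synthetic ERP features, stimulus present

w = fisher_lda_direction(baseline, response)
freq = project_to_frequency(response[0], w)      # frequency driving a painting implement
```

In the installation, the resulting frequency would actuate a physical painting implement; here it is simply a number in the chosen band.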

The participant's brain serves as a conduit, translating the stimulation of each sense into the finished image. In this way, the usual methodology of an artist using their senses to create a media object is inverted; the senses use the artist to create an image. Synapstraction largely takes its aesthetic interests from the abstract expressionists of the 20th century and its conceptual framework from aleatory artists such as John Cage. Unlike Cage, however, Synapstraction maps all senses to image, and renders the consumption of sensory stimuli as an act of image creation. The material of this artwork doesn't necessarily lie in the paintings themselves, nor the equipment used in the installation, but instead rests in the speculative reconsideration of potential alternative roles for human senses in art making.

This project premiered during the Digital Arts Festival at the Black Visual Arts Center in 2017.



thisObituaryDoesNotExist comprises a physical and digital collection of obituaries generated by two artificial neural networks. The algorithms (StyleGAN and GPT-2) are shown images and text taken from publicly available online obituaries, and asked to imagine new obituaries. In this way, the system generates infinitely many photographs, death narratives, and life stories for persons who never existed.

The physical exhibition consists of a collection of funeral pamphlets containing a selection of the generated obituary text and images, each pamphlet dedicated to the life and death of one generated person. By making these printed materials indistinguishable from funeral pamphlets for physical persons, Davis invites us to consider how these objects differ from and resemble each other. In this way, the line between generated fiction and physical reality becomes blurred, and we are invited to question what constitutes an authentic personal identity in an era of massive data, social media, and creative artificial intelligences.

thisObituaryDoesNotExist interrogates how technology affects, distributes, and manipulates human experiences of death and grief. The digital website generates a new obituary each time the page is refreshed.
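As a toy illustration of the "new obituary on each refresh" mechanism, the sketch below samples from invented templates. The actual project uses GPT-2 for text and StyleGAN for images; every name and phrase here is a placeholder, not output from the real system.

```python
import random

# Invented placeholder vocabularies standing in for a trained language model.
FIRST = ["Mara", "Ellis", "Theodora"]
LAST = ["Voss", "Quill", "Branagh"]
TRAITS = ["a devoted gardener", "an amateur radio operator", "a tireless volunteer"]

def generate_obituary(rng=random):
    """Return a short obituary for a person who never existed."""
    name = f"{rng.choice(FIRST)} {rng.choice(LAST)}"
    age = rng.randint(40, 99)
    trait = rng.choice(TRAITS)
    return f"{name}, {age}, passed away peacefully. They were {trait}."

# A web server would call generate_obituary() once per page load,
# producing a different fictional life story on every refresh.
text = generate_obituary(random.Random(0))
```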

This project premiered at Chandler Center for the Arts in 2019.

  Visit the Obituary



@Orwellian2017 is a contemporary translation of George Orwell's novel 1984. On November 5th, 2017 the Twitter bot @orwellian went live and tweeted the entirety of George Orwell’s 1984 with key characters and places algorithmically replaced by the Twitter handles of prominent American political figures.

The bot tweeted a single sentence once every four hours beginning on November 5th, 2017, and reached the story's completion on November 3rd, 2020, the day of the American presidential election.

@Orwellian2017 brings attention to the modern literary potential of social media and emphasizes the role these technologies play in disseminating information and shaping political narratives. In this way, @Orwellian2017 shares Orwell's interest in the role language plays in mediating our understanding of politics and history. This dialogue is especially prescient during a tumultuous political climate in America, where language and truth mediated by the internet perpetually problematize the machinations of democracy.
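The bot's two mechanics, name substitution and a fixed four-hour cadence, can be sketched as below. The handle map is entirely hypothetical; the bot's actual substitutions are not listed here.

```python
import datetime

# Hypothetical mapping of characters/places to Twitter handles (illustrative only).
HANDLES = {
    "Big Brother": "@POTUS",
    "Winston": "@citizen_winston",
    "Oceania": "@USA",
}

def substitute(sentence, handles):
    """Replace character and place names with Twitter handles."""
    for name, handle in handles.items():
        sentence = sentence.replace(name, handle)
    return sentence

def tweet_time(index, start=datetime.datetime(2017, 11, 5)):
    """Sentence number `index` is posted four hours after the previous one."""
    return start + datetime.timedelta(hours=4 * index)

line = substitute("Big Brother is watching Winston.", HANDLES)
```

At six tweets per day, a novel of a few thousand sentences stretches across roughly three years, which is how the run lands on the 2020 election.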

  Visit Twitter


PsycheVR is a virtual reality and biofeedback project conducted in collaboration with the Space Medicine Laboratory at the Geisel School of Medicine and the National Aeronautics and Space Administration (NASA).

The project consists of developing virtual reality content to promote relaxation when the user is confined to small, isolated spaces for long durations of time. The products of this project were used in trial experiments with NASA astronauts in preparation for future missions to Mars.

We also prototyped biofeedback mechanisms that take in a user's biometric state, as determined by GSR, EEG, facial expression, and other stress indicators, and subsequently adjust the content of the VR environment to help calm the user.
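The biofeedback loop described above could be sketched like this; the signal weights, scene parameter names, and adjustment rules are illustrative assumptions, not the project's actual implementation.

```python
def stress_index(gsr, eeg_beta, expression_tension, weights=(0.4, 0.4, 0.2)):
    """Combine normalized biometric signals (each in [0, 1]) into one stress score."""
    w_gsr, w_eeg, w_face = weights
    return w_gsr * gsr + w_eeg * eeg_beta + w_face * expression_tension

def adjust_environment(stress, state):
    """Nudge scene parameters toward calmer settings as stress rises."""
    state = dict(state)
    state["ambient_volume"] = max(0.1, state["ambient_volume"] - 0.2 * stress)
    state["scene_brightness"] = min(1.0, state["scene_brightness"] + 0.1 * (1 - stress))
    return state

# One tick of the loop: read sensors, score stress, adjust the scene.
scene = {"ambient_volume": 0.8, "scene_brightness": 0.5}
s = stress_index(0.9, 0.7, 0.6)
calmer = adjust_environment(s, scene)
```

In a real system this loop would run continuously, with sensor readings streamed in and the VR engine consuming the updated scene parameters each frame.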

"Three Days and a Year On Top of a Mountain" (featured right) is a VR sunrise-to-sunset timelapse created on top of Gile Mountain. Recorded over the course of a year, the video features scenes from Gile Mountain throughout its four seasons. The final video combines this footage into a single timelapse, allowing the viewer to experience a day and a year on top of the mountain simultaneously.


A portfolio sampling of various graphic design and digital collage projects both past and ongoing.

  See More

Head Tipper
Pretty Boi Rain Drain
Sonic Gooze No. 1
Urban Owls
Hex Plexus
Sonic Ooze No. 2


A collection of ongoing experiments in animation begun in 2017.

These films incorporate writings by living/contemporary poets and are meant to visually and sonically accompany the aesthetic narrative of the literature.

Words by Elizabeth Lyons.

The Wild Iris
Words by Louise Glück
Inspired by Burn by Darcy Rosenburger



Let's get in touch. Send me a message:

Hanover, NH