Confronted is a project in collaboration with Zaria Howard and Tatyana Mustakos. What would happen if you were meant to face the scary AI-laced creations that you make? We wanted to recontextualize the sketch2face research, to reimagine the relationship between the sketcher and the sketched humans. This manifests as a sketch2face implementation in VR.

Zaria created the dataset that this was all based on. She started from the Helen dataset, which provides face photos with detected and vectorized facial features.

She then ran holistically-nested edge detection on the faces, with the line-drawing face overlaid to emphasize the facial features. Even then, some of the faces were not showing up in the result, so we overlaid the line-drawing face on top of the edge images.
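That overlay step can be sketched roughly like this, assuming the edge map and the line drawing are same-sized grayscale NumPy arrays with white lines on a black background (the function name is hypothetical, not the project's actual code):

```python
import numpy as np

def overlay_line_drawing(edges, lines):
    """Overlay a line-drawing face onto an edge map.

    Both inputs are uint8 grayscale arrays (0 = black background,
    255 = white lines) of the same shape; taking the per-pixel
    maximum keeps every stroke from either image.
    """
    return np.maximum(edges, lines)

# Tiny example: a 2x2 edge map and a 2x2 line drawing.
edges = np.array([[0, 255], [0, 0]], dtype=np.uint8)
lines = np.array([[0, 0], [255, 0]], dtype=np.uint8)
combined = overlay_line_drawing(edges, lines)
```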

I then set up and ran pix2pix in PyTorch on that dataset for 200 epochs.

Tatyana set up a drawing program in Unity that I hooked up to the Oculus. This let the user draw a face (or whatever they wanted) in VR and hit a button to save a picture of their sketch. Using ZMQ and NumPy, the image was formatted and sent over to a computer running the trained pix2pix model, which processed it and sent the result back to the computer running the Unity project.
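The ZMQ/NumPy round trip can be sketched roughly as below; this is a simplified, single-image version, and the address, function names, and the identity "model" are placeholders rather than the project's actual code:

```python
import threading
import numpy as np
import zmq

ADDR = "tcp://127.0.0.1:5555"  # placeholder address

def serve_one(run_model):
    """Receive one image, run the model on it, send the result back."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REP)
    sock.bind(ADDR)
    # First frame: JSON metadata (shape/dtype); second frame: raw pixels.
    meta, buf = sock.recv_json(), sock.recv()
    img = np.frombuffer(buf, dtype=meta["dtype"]).reshape(meta["shape"])
    out = run_model(img)
    sock.send_json({"shape": out.shape, "dtype": str(out.dtype)}, zmq.SNDMORE)
    sock.send(out.tobytes())
    sock.close()

def send_image(img):
    """Send a sketch to the server and return the processed image."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.connect(ADDR)
    sock.send_json({"shape": img.shape, "dtype": str(img.dtype)}, zmq.SNDMORE)
    sock.send(img.tobytes())
    meta, buf = sock.recv_json(), sock.recv()
    sock.close()
    return np.frombuffer(buf, dtype=meta["dtype"]).reshape(meta["shape"])

# Demo with an identity "model" standing in for pix2pix inference.
server = threading.Thread(target=serve_one, args=(lambda im: im,))
server.start()
sketch = np.random.randint(0, 256, (4, 4, 3), dtype=np.uint8)
result = send_image(sketch)
server.join()
```

REQ/REP keeps the Unity side simple: it blocks until the processed image comes back, which is fine for a one-sketch-at-a-time installation.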

I posted the server code here and the client code here.

By no means is this code pretty, but it gets the job done.

There is a good reason to consider and nurture sketch-to-human relationships.

Thanks to the Art and ML class taught by Dr. Eunsu Kang, Dr. Barnabas Poczos, and TA Jonathan Dinu, who provided the prompt; to Aman Tiwari, for helping with the PyTorch model; and to Golan Levin, for additional advising.

(2016) is a website as a place to put all your trust in the computer. You can pray to the computer, marvel at the discrete math behind it, use its binary bathroom, and explore the institute. I was interested in deconstructing the parts that make up an algorithm, and in how, when we piece together this entire structure, we are accepting the mistakes that any one of these cogs could have made.

Astroturfing (2016)

Astroturfing is defined as “the practice of masking the sponsors of a message or organization to make it appear as though it originates from and is supported by a grassroots participant(s).” The largest exporter of online astroturfing is the Chinese government’s 50 Cent Army (五毛党): millions of posts made by low-paid workers to sway public opinion in favor of the government. This piece takes all the questions in posts made by these workers. They are very subtle in their work, and a question is enough to make one wonder.

Read Digital America’s journalist Kevin Johnson’s response to Astroturfing here.


Digital America’s Issue no. 9.

Apple Pie: An American Art Show
10 March 2017
Group Exhibition at Goodyear Arts, Charlotte, NC

Shortlisted for Akademie Schloss Solitude and ZKM | Center for Art and Media’s web residency. The call for proposals on the topic »Blowing the Whistle, Questioning Evidence« was curated and juried by Tatiana Bazzichelli, artistic director of the Disruption Network Lab, Berlin.

The Girls are Home (2016)

In collaboration with Alicia Iott, The Girls are Home combines VR with an age-old toy, the dollhouse. Inside the dollhouse, an observer can peer through pairs of windows to see a snippet of our unscripted daily life. By addressing the chasm that exists between cutting-edge technology, material culture, and the domestic, plain-Jane realm, The Girls are Home works to bridge that gap.

The Girls are Home was exhibited at the Art && Code: Weird Reality Symposium at the Ace Hotel, Pittsburgh, PA, October 6–9, 2016.

This would not have been possible without support from the Frank-Ratchye Fund for Art @ the Frontier grant, an endowment founded to encourage the creation of innovative artworks by the faculty, students, and staff of Carnegie Mellon University.

This piece was also supported by The Henry Armero Memorial Award for Inclusive Creativity. The Girls are Home was exhibited in the winners’ show at the Miller Gallery, Pittsburgh, PA, September 30th, 2016. This grant is in memory of Henry Armero, a BCSA student who passed away; he had a bright mind and was astutely multifaceted and interdisciplinary. The Armero Family established the Henry Armero Memorial Award Fund to honor and further Henry’s creative ideals.





Everything is Cute to me (2016)

A music video for The Moon Baby in collaboration with Kevin Ramser, Alicia Iott, David Gordon, and Guy DeBree. In this piece, a masculine and exorbitant piece of technology, the Panoptic Studio, is the instrument for a surreal expression of sexuality. As Moon Baby performs her song, she transforms from IRL video to a glitching, sensationalized cyber body. In each stage Moon Baby is digitally abstracted further and further from reality. Each version of herself still carries her essence but falls deeper into the uncanny valley.

Click through for the making-of video.


The WOW Report

Prosthetic Knowledge

The Desert At 4am (2016)


The Desert at 4am is a virtual space rear-projected onto a four-walled room, where the user can walk around a physical space to explore a virtual, reactive environment. The environment is inspired by the feeling of infinity one experiences when alone in a large open space. Open spaces have historically been marked as places for contemplation. Although a vast space is projected, the user is still confined to a 13′ by 11′ box. The walls are limiting, and at the same time they are the window into the environment. To counter but also highlight this duality, the scale of the user’s steps is altered in the environment: one step in the physical world equates to ten steps in the virtual world.

Intersect (2015)


On a team with Irene Alvarado and Michelle Ma, we used the Kinect V2 and an OptiTrack system to each make our own visualization of movement. Intersect is my take on this.

To create a different awareness when watching a dance performance, Intersect captures touch between two dancers. Their bodies are rendered spirit-like and wispy, except where the dancers touch; there, a solid shape is made out of the intersection.

This disembodiment brings a new aspect to the performance, one that focuses on visceral intimacy. The piece was made in Maya and exported to Unity so it can be viewed through a VR headset with position tracking (location streamed over Wi-Fi from the OptiTrack system), allowing the viewer to be at any position around or in the dance while the dancers perform.