When fashion and choreography meet artificial intelligence

At the Google Arts & Culture Lab in Paris, we’re all about exploring the relationship between art and technology. Since 2012, we’ve worked with artists and creators from many fields, developing experiments that let you design patterns in augmented reality, co-create poetry, or experience multisensory art installations. Today we’re launching two experiments to test the potential of artificial intelligence in the worlds of contemporary dance and fashion.

For our first experiment, Runway Palette, we came together with The Business of Fashion, whose collection includes 140,000 photos of runway looks from almost 4,000 fashion shows. If you could attend one fashion show per day, it would take you more than ten years to see them all. By extracting the main colors of each look, we used machine learning to organize the images by color palette, resulting in an interactive visualization of four years of fashion by almost 1,000 designers.
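The post doesn't detail the pipeline, but "extracting the main colors of each look" is classically done by clustering an image's pixels. As a minimal sketch (not the team's actual method), here is a tiny pure-Python k-means that takes a list of (r, g, b) pixels and returns the k most dominant colors, largest cluster first:

```python
import random


def _nearest(p, centers):
    """Index of the center closest to pixel p (squared Euclidean distance)."""
    return min(range(len(centers)),
               key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))


def dominant_colors(pixels, k=3, iters=10, seed=0):
    """Return k dominant (r, g, b) colors via a tiny k-means, most common first."""
    rng = random.Random(seed)
    centers = rng.sample(sorted(set(pixels)), k)  # k distinct starting colors
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign every pixel to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in pixels:
            clusters[_nearest(p, centers)].append(p)
        # Move each center to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(ch) / len(cl) for ch in zip(*cl))
    counts = [len(cl) for cl in clusters]
    order = sorted(range(k), key=lambda i: -counts[i])
    return [tuple(round(ch) for ch in centers[i]) for i in order]
```

Once each look is reduced to a short palette like this, looks can be laid out so that similar palettes sit near each other, which is what produces the color-organized visualization described above.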

Everyone can now use the color palette visualization to explore colors, designers, seasons, and trends from Fashion Weeks worldwide. You can even snap or upload a picture of, say, your closet or autumn leaves, and discover how designers have used a similar color palette in fashion.

  • Explore four years of runway looks organized by color using machine learning.

  • Take a picture or upload a photo to see how designers used a similar color palette in fashion.

  • Click on a palette to see which runway looks have similar colors, and compare by collection.

For our second experiment, Living Archive, we continued our collaboration with Wayne McGregor to create an AI-driven choreography tool. Trained on over 100 hours of dance performances from Wayne's 25-year archive, the experiment uses machine learning to predict and generate movement in the style of Wayne's dancers. In July of this year, Wayne and his dancers used the tool in his creative process for a new work that premiered at the LA Music Center.

Today, we are making this experiment available to everyone. Living Archive lets you explore almost half a million poses from Wayne's extensive archive, organized by visual similarity. Use the experiment to make connections between poses, or capture your own movement to create your very own choreography.
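"Organized by visual similarity" typically means each pose is encoded as a numeric vector (for example, flattened joint coordinates) and compared with a distance or similarity measure. The snippet below is an illustrative sketch only, with made-up pose names and vectors, showing how cosine similarity can pick the archived pose closest to a captured one:

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length pose vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def most_similar(query, archive):
    """Return the name of the archived pose most similar to the query vector."""
    return max(archive, key=lambda name: cosine(query, archive[name]))


# Hypothetical 4-dimensional pose vectors for illustration.
archive = {
    "arms_up": (0.0, 1.0, 0.0, 1.0),
    "crouch": (1.0, 0.0, 1.0, 0.0),
}
captured = (0.1, 0.9, 0.0, 1.0)  # close to "arms_up"
```

Scaling this lookup to half a million poses would call for an approximate nearest-neighbor index rather than a linear scan, but the similarity idea is the same.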

The Living Archive experiment

You can try our new experiments on the Google Arts & Culture experiments page or via our free app for iOS and Android.

