Creators United

Snap’s Lens Studio now supports custom ML-powered Snapchat Lenses

Snapchat’s growing collection of Lenses has long let users augment their photographic reality on their smartphones with facial modifications, environmental effects, and location-specific filters. With a recent update to Lens Studio, its desktop development app, Snap now lets creators bring their own machine learning models into Lenses, a move intended to spark partnerships between those creatives and ML developers.

Called SnapML, the update lets developers import machine learning models to power Lenses. As a result, Snapchat will be able to instantly identify a much wider range of real-world objects and body parts.

For example, a foot-tracking ML model would allow developers to craft Lenses specifically for feet. That, in turn, could give Lens creators and marketers a new way to engage customers: the ability to virtually try on shoes.

Per Snap’s Eitan Pilipski, the company will be “facilitating and connecting creators with ML developers such that they can together build experiences.” 



Sydney Kidd, Strategist, Human Centered Design, Idea Couture