Audio AR? Now, what on earth is that? Well, we all know about visual AR, where a layer of extra visual information is added to what you see in your surroundings. For instance, a map with directions on top of the street view you have in front of you. Audio AR is instead a layer of sound information added to what you see around you. And this is exactly what headphone company Bose is doing with the prototype AR sunglasses they just presented at SXSW.
So how does it work then? Well, you put the glasses on and as soon as you see something you want to know more about, you just tap on the stem and you get the info right in your ear. The glasses know what you’re looking at without needing a camera. Instead they use on-board motion sensors that can detect the direction you’re facing. They are also programmed to recognize head gestures, such as nodding and turning. For instance, you can nod your head to take an incoming call or shake it to decline.
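Bose hasn't published how the gesture recognition actually works, but as a rough illustration of the idea, here is a toy nod-vs-shake classifier over gyroscope readings. Everything here is made up for the example: the function name, the threshold, and the sample values are assumptions, not Bose's implementation.

```python
# Illustrative sketch only (not Bose's actual algorithm): classify a head
# gesture by comparing how much total rotation happened around the pitch
# axis (up/down, a nod) versus the yaw axis (side to side, a shake).

def classify_gesture(samples, threshold=30.0):
    """samples: list of (pitch_rate, yaw_rate) gyroscope readings in
    degrees/second, assumed evenly spaced in time.
    Returns 'nod', 'shake', or None if there was too little motion."""
    pitch_total = sum(abs(p) for p, _ in samples)
    yaw_total = sum(abs(y) for _, y in samples)
    if max(pitch_total, yaw_total) < threshold:
        return None  # not enough movement to count as a deliberate gesture
    return "nod" if pitch_total > yaw_total else "shake"

# A nod is mostly up-and-down (pitch) rotation...
nod = [(40, 2), (-35, 1), (30, -2), (-25, 0)]
# ...while a head shake is mostly side-to-side (yaw) rotation.
shake = [(3, 50), (-2, -45), (1, 40), (0, -35)]

print(classify_gesture(nod))    # nod
print(classify_gesture(shake))  # shake
```

A real system would also have to handle timing windows and filter out ordinary head movement, but the core idea of telling the two gestures apart by their dominant rotation axis is the same.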
Design-wise they actually look pretty good for a prototype, and according to reports they are very light and easy to wear. So that's promising.
But of course, it's all about the data. So far Bose is working with TripAdvisor, Yelp, Asics Studio, Strava and TuneIn, and hopefully more partners will join for this product to work the way it's intended.
A first version of the glasses will be released this summer.
Another image search app for fashion items has seen the light of day. Pixt, which launched in February 2018, is a New York based company that aims to help people find new stores and unique clothes locally. It works the usual way: after downloading the app, you take or upload a picture of an item with the app's camera, and it finds similar-looking items nearby, along with their availability and price.
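How Pixt matches images isn't public, but visual search apps of this kind typically reduce each photo to a feature vector and look for the nearest catalog item. As a minimal sketch (the catalog, vectors, and function names below are all invented for illustration):

```python
# Illustrative sketch only (not Pixt's actual system): find the most
# visually similar catalog item by cosine similarity over precomputed
# image feature vectors.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical catalog: item name -> feature vector. In a real app these
# vectors would come from a neural network applied to product photos.
catalog = {
    "red dress":  [0.90, 0.10, 0.00],
    "blue jeans": [0.10, 0.80, 0.30],
    "red skirt":  [0.85, 0.15, 0.05],
}

def most_similar(query_vec, catalog):
    """Return the catalog item whose vector is closest to the query."""
    return max(catalog, key=lambda name: cosine(query_vec, catalog[name]))

query = [0.88, 0.12, 0.02]  # features of the user's uploaded photo
print(most_similar(query, catalog))  # red dress
```

A production system would use thousands of dimensions and an approximate nearest-neighbor index rather than a linear scan, plus a filter on store location, but the matching principle is the same.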
Of course, this is only useful if you happen to live in the New York area (and, in the future, other urban areas of the US), but what I find interesting about this app is just that: the local factor.
With today's online shopping habits, and big corporations and malls swallowing every attempt to do something on a small scale, it is really hard to make it as a small, local store. But at the same time, people also seem to be craving the small-scale alternative, and the opportunity to find unique things in physical, local stores. Is this one way to create the much-needed connection between the two? It's definitely worth thinking about.
The annual festival SXSW Interactive starts today and this year's schedule looks extensive indeed. The fashion angle is present of course, and starting things off today is a discussion about how AI is transforming luxury, fashion and beauty, with reps from L'Oreal and Fashion Innovation Alliance among others.
Starting off tomorrow’s schedule, Erik Bang of H&M Foundation will participate in a panel discussion about biotech’s impact on fashion and on our planet. Discussion partners include Dan Widmaier from spider silk company Bolt Threads, Suzanne Lee from lab leather makers Modern Meadow and Rachel Arthur from innovation firm The Current Daily. Should be exciting!
Clothes that change colour while you wear them are nothing new. Thermochromic inks, as the proper term goes, have been seen on scarves and t-shirts for some time, but have never really had a breakthrough. Maybe designer Julianna Bass, who recently showcased her latest collection at New York Fashion Week, will help change that. Unlike other colour-changing fabrics, which are triggered by the body heat of the wearer, Bass's dresses are equipped with small power sources controlled by a soft button integrated in the fabric. This way the wearer controls the change in colour, instead of being surprised by it. The dresses were made in partnership with New York company Loomia, founded by fashion tech profile Madison Maxey.
Photo credit: Dan Lacca/Julianna Bass SS18 collection, showcased at New York Fashion Week.
Gothenburg-based fashion studio Atacac has plans for the future of fashion. The studio/lab, founded by Rickard Lindqvist and Jimmy Herdberg, wants to change the way clothes are designed, sold and produced, in their very own way. They make very realistic 3D models of each garment, which are then presented in their store. The aim is to sell garments before they are produced, thereby avoiding overproduction, a problem the fashion industry is struggling to solve at the moment.
As part of that development, the group announced something quite radical when speaking at Stockholm Fashion Tech Talks last Wednesday: from now on, all their garment templates are free, meaning you can download both the 2D cutting patterns and the 3D visualizations from their store. The idea is to encourage further development of their designs. Quite an exciting move, I have to say.
On my way to a full day of Fashion Tech at Fotografiska in the Swedish capital, arranged by Patriksson Communication and Association of Swedish Fashion Brands among others. So looking forward to speeches by, among others, Niall Murphy from Evrythng, Matt Drinkwater from London College of Fashion and Amanda Parkes from Fashion Tech Lab. Stay tuned for more reports!
Oh yes, it looks like our selfies are about to get much better in the near future. Amazon is adding a voice-controlled, standalone selfie camera powered by its AI assistant Alexa. The Echo Look shares many features with earlier Echo devices, but adds four LED lights, a depth-sensing system that blurs the background, and the ability to shoot video so you can get the best view of your outfit from every angle. With the "Style Check" function it uses machine learning to compare outfits and give style tips; the more you use it, the smarter it gets. The camera is not yet available to the public, but when it is, it will sell for about 200 dollars, according to Fashion and Mash. Wanna see for yourself how it works? Take a look at this promo video.