Imagine a shirt that gets passed on from mother to daughter to granddaughter over the course of 20 years. During this process, the garment changes print and colour, and ultimately returns to the brand to be reused in new ways during the last decades of its life, first as a jacket lining, then as an accessory.
This thought experiment is part of the Circular Design Speed project conducted this year by Swedish clothing brand Filippa K, research body Mistra Future Fashion and University Of The Arts London (UAL).
“If you can make a garment last through the process of reinvention in reasonable, commercially available and viable ways, you replace the purchase of a new product. The lifecycle assessment of the service shirt against a standard polyester blouse showed significant climate change savings,” project lead Professor Becky Earley, Co-Director of CCD, tells FashNerd.
I think this kind of thinking is exactly what the industry needs right now. What if we could reimagine the whole idea of what fashion is supposed to be? What if we could rethink the whole set of attitudes surrounding it, keeping the idea of fashion as something living and constantly changing, while applying it to considerably fewer items? What if we applied the concept of change to the item itself, instead of constantly changing items?
It’s an exciting idea.
Personally, I’m very excited to see how digital technologies could help enable this change. AI is already being used by some as a tool in the design process; maybe it could assist in this context as well, reimagining the next phase of an item’s life cycle? Not to mention AR, with its ability to enhance reality.
Almost half of the clothing items bought online in the UK are returned, mostly due to incorrect sizing. This is a huge environmental problem, and many are trying to solve it, as I addressed in a recent post.
Now fashion retailer Asos is introducing a new online sizing tool that takes a step in this direction too. Its Fit Assistant combines machine learning with a visual questionnaire in order to account for a wider range of body types and shapes: not only height and weight, but bust size, belly shape, hip width, age, and what kind of fit the customer is looking for.
The tool then blends this customer data with recommendations drawn from the customer’s previous purchases and returns, along with what other customers with similar body types have been happy with.
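To make the idea concrete, here is a minimal Python sketch of how blending body data with purchase-and-return history could work in principle. The profiles, order history and `recommend_size` function are all hypothetical illustrations for this post, not Asos’s actual system:

```python
import math

# Hypothetical customer profiles: (height_cm, weight_kg, hip_cm)
profiles = {
    "a": (165, 60, 95),
    "b": (170, 65, 100),
    "c": (168, 62, 97),
    "d": (180, 80, 110),
}

# Hypothetical order history: customer -> list of (size, kept?)
orders = {
    "a": [("S", True), ("M", False)],
    "b": [("M", True)],
    "c": [("S", True)],
    "d": [("L", True), ("M", False)],
}

def recommend_size(new_profile, k=3):
    """Vote on a size using the k most similar customers' order outcomes."""
    nearest = sorted(profiles, key=lambda c: math.dist(profiles[c], new_profile))[:k]
    votes = {}
    for c in nearest:
        for size, kept in orders[c]:
            # A kept item counts for its size; a return counts against it.
            votes[size] = votes.get(size, 0) + (1 if kept else -1)
    return max(votes, key=votes.get)

# A new shopper whose measurements sit closest to customers "a" and "c".
best = recommend_size((166, 61, 96))
```

The point of the sketch is the design choice the article describes: returns are treated as negative evidence, so a size that similar customers kept beats a size they sent back.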
That’s a lot of data, though, and quite personal too, which means you have to be OK with handing it all over to a retailer.
The question is also whether getting the fit right the first time necessarily lowers returns (and thereby the number of polluting shipments). Maybe we’ll just shop even more? Worth thinking about.
Screenshot of the Asos Fit Assistant.
AI as a creative tool? The idea is getting more and more attention among labels and designers, as I have addressed in previous blog posts. The other day the Italian online retailer Yoox (part of the Yoox Net-a-Porter Group) announced a new, as yet unnamed label. The new collection will be informed by data but still designed by the human creative team, The Current Daily reports.
“By using the data, we think the creative team can interpret better our customer needs going forward,” Federico Marchetti, CEO of the Yoox Net-a-Porter Group, said when speaking at a Wired Smarter conference on Monday.
YNAP is no newcomer when it comes to AI. Its logistics centre is already fully automated, and it has given its personal shoppers an AI tool to help them offer better advice to customers.
Using AI in the creative process, though, raises concerns about what it does to creativity.
Do human creativity and machines really mix? Marchetti wants to explore that. Maybe in the future we will see labels like ”Made in Italy” replaced by ”Made by humans”, he suggested.
“Man is about emotions. It’s about beauty. It’s about feelings. The machine is about speed, information power and the future. Can these two worlds co-exist? We must make choices to strike the right balance.”
The fashion world is currently being flooded with all kinds of tech. Some of it aims to make online shopping easier, some to technically enhance the clothing itself, while the rest chases the holy grail of it all: style.
That mysterious quality, close to the concept of taste: hard to grasp, hard to simply acquire. More like something you either have or don’t have, a personal trait that seems almost magical.
But what happens when technology starts claiming to know something about style? Is that even possible? Well, there are several attempts. Amazon’s style assistant Echo Look analyses your outfit through a combination of algorithms and human fashion specialists. StitchFix sends users clothing suggestions by cross-referencing a client’s preferences with what others of a similar age and demographic have bought. Matchesfashion.com works with personalised avatars that can try on digital clothing samples to show you what an item might look like on your body, while Net-a-Porter is trialling tech that digs into your data for clues about what you plan to do next (events, trips and the like) and suggests purchases accordingly, as reported in The Guardian.
But is that style? And does all this help us acquire it? Well, I guess that depends on what we feed the algorithms. If we tell them to check for mass approval (likes on social media, for instance, or whatever has worked in the past), we will get just that and no progress at all. But on the other hand, are we comfortable programming algorithms to be audacious and bold the way our human style role models are?
Maybe that is the kind of trait we still prefer in humans.
What if you could search for the specific features you want in a piece of clothing, instead of browsing endless amounts of items until you find what you’re looking for? Fast fashion brand Forever 21 just launched a visual search tool for online shopping that helps with exactly that. In its feature ”Discover Your Style”, shoppers can click on features such as sleeve, neckline and cut, and under each category find images of the different options instead of text. In other words, you don’t have to know the correct term for the feature you’re looking for, you just have to know what it looks like.
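As an illustration of the underlying idea, here is a minimal Python sketch of filtering a catalogue by tagged features rather than free-text search. The catalogue entries and the `filter_by_features` helper are invented for this example and say nothing about how Forever 21 actually built its tool:

```python
# Hypothetical catalogue entries tagged with visual attributes,
# the kind of metadata a "shop by feature" tool could filter on.
catalogue = [
    {"id": 1, "sleeve": "puff", "neckline": "v-neck", "cut": "fitted"},
    {"id": 2, "sleeve": "long", "neckline": "crew", "cut": "loose"},
    {"id": 3, "sleeve": "puff", "neckline": "crew", "cut": "loose"},
]

def filter_by_features(items, **wanted):
    """Keep only items matching every selected feature."""
    return [i for i in items if all(i.get(k) == v for k, v in wanted.items())]

# A shopper clicks "puff sleeve" and "crew neckline":
matches = filter_by_features(catalogue, sleeve="puff", neckline="crew")
# only item 3 has both features
```

Because the shopper picks features from images, no typed search term is needed at all, which is exactly the language-barrier point the company makes.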
The company hopes that this will help shoppers with their more subtle likes and dislikes, and also remove language barriers for shoppers.
Above all, I think it will save shoppers valuable time, sparing them from browsing the enormous product ranges that fast fashion brands are known for.
With the demise of several smart jewellery companies this year, it’s refreshing to see Swarg Tech’s new piece of smart jewellery Sahki, with its beautiful jewel embellishments. So what about the tech specs?
Well, apart from the fitness tracker and SOS trigger you see in most wearables, this one also has a child tracking mode and an integrated AI voice assistant to help you add items to your shopping list or switch on the lights for instance.
With the growing popularity of voice assistants, integrating them into smart jewellery might be a winning combo.
Last October, Amazon bought 3D body model company Body Labs, its first step towards creating a virtual try-on service for clothes.
Now Amazon is inviting people to have their bodies scanned at its New York office, The Wall Street Journal reports. Participants are asked to return every two weeks over a total of 20 weeks, and to answer questions about fitness, health and weight-loss goals, so that Amazon can understand how bodies change shape over time.
In January, the tech giant patented a blended-reality mirror that lets you try on clothes virtually, a step up from its style assistant Echo Look camera, released a year ago.
Body Labs 3D Scan