Jacquard started out as a sensor on a denim jacket, where specially woven textile on the sleeve let the wearer control actions on their phone by touching the fabric. Swipe a palm up the sleeve to change music tracks, swipe down to call an Uber. A double-tap during a bike ride would relay an ETA through a pair of headphones.
But Google's wearable sensor technology is evolving beyond taps and swipes. The Jacquard sensor, called the Tag, can now be installed in the insole of a shoe, where it can automatically identify a series of physical motions. In its first implementation, it will track the typical movements people make when playing football (the sport Americans call soccer), like kicking, running, stopping, and accelerating again.
It's just the latest foray into ambient computing from Google's Advanced Technology and Projects (ATAP) team, the folks behind Jacquard. I spoke to the team about how the Tag's new mechanics work and what the world will look like once the computers around us can sense our presence and offer us what we need before we even know to ask for it.
Jacquard was an experimental project, announced at Google's developer conference in 2015. Two years later, the team debuted the tech in a Levi's denim jacket. The Tag is the brains of the system, converting up to three touch gestures made on the jacket's sleeve into customizable actions on a smartphone: ideal for bike and scooter commuters who can't pull out a phone while riding.
Fast-forward to 2019, when Google unveiled Jacquard 2.0, a smaller Tag that went inside more styles of Levi's jean jackets (including cheaper ones), as well as a backpack from Yves Saint Laurent. That same Tag can now be plopped into a $40 Adidas insole called GMR (pronounced "gamer"), which fits into any soccer shoe, Adidas or not.
It all ties into EA Sports' FIFA Mobile app on Android and iOS. To improve the rating of your virtual FIFA Mobile Ultimate Team, your options are to play the videogame, spend actual money on in-game boosts, or now, play in the real world while using the GMR insole and Tag. You'll have certain goals to hit—like 40 powerful shots in a week—to earn coins and skill boosts in the virtual game. The more real-world achievements you complete, the better your virtual team can be.
The blending of the physical and digital worlds, whether for a game or an art project, is an idea that's gaining popularity; just look at any toy with an augmented reality component. But unlike most AR systems, the Tag isn't using a camera to analyze its surroundings. It uses machine learning to identify the wearer's foot and body movements, a far more sophisticated task than reading hand gestures on a jean jacket's sleeve.
"Jacquard is no longer just about the fabrics and the yarn and the connectivity through your sleeve," says Dan Giles, product manager for Jacquard at Google. "It's really about bringing ambient computing to our users in a new way that's familiar to them and the objects around them."
When you buy the GMR insole, you get a pair of inserts (one for each shoe) and one Jacquard Tag. It's the same Tag that comes in Levi's newer jackets or the YSL backpack. Choose which shoe you want the Tag to be in, and you can put a dummy Tag in the other to feel balanced. After pairing the electronics with the FIFA game, you slip on your cleats and head out to a field. Your phone doesn't need to be anywhere near you while you run around; the Tag runs its machine learning algorithms locally on the device.
It's smart enough to know that it doesn't need to track your walk to the pitch. Instead, the Tag only starts using the bulk of its computing power when it detects that you're actively making moves typical of soccer. How does the Tag know what those movements look like? It has sensors inside that measure acceleration and angular rotation (an accelerometer and a gyroscope), as well as a microcontroller that can run neural networks, algorithmic programs that are taught to recognize patterns.
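To make that concrete, here's a minimal sketch in Python of how such a low-power motion gate might work. Everything in it, from the 50 Hz sample rate to the variance threshold and function names, is a hypothetical illustration; Google hasn't published the Tag's actual firmware logic.

```python
# Hypothetical sketch of a low-power motion gate: a cheap statistic on
# raw accelerometer data decides whether to wake the heavier neural
# networks. Thresholds, window sizes, and names are all assumptions.
import numpy as np

WINDOW = 50           # samples per window (1 s at an assumed 50 Hz rate)
WAKE_THRESHOLD = 2.0  # variance of acceleration magnitude, assumed units

def should_run_full_model(accel_xyz: np.ndarray) -> bool:
    """accel_xyz: (WINDOW, 3) array of raw accelerometer samples.
    Walking produces low, steady variance; sprints and kicks spike it."""
    magnitude = np.linalg.norm(accel_xyz, axis=1)  # per-sample |a|
    return magnitude.var() > WAKE_THRESHOLD

# Simulated data: a quiet walk to the pitch vs. an erratic kicking burst.
rng = np.random.default_rng(0)
walking = rng.normal(9.8, 0.3, size=(WINDOW, 3))  # near-steady readings
kicking = rng.normal(9.8, 3.0, size=(WINDOW, 3))  # sudden, jerky readings
print(should_run_full_model(walking))  # False -> stay in low-power mode
print(should_run_full_model(kicking))  # True  -> run the neural networks
```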
"We had to build a whole suite of new machine learning algorithms that can take the sensor data coming from the Tag and interpret this based on what the motions are," says Nicholas Gillian, lead machine learning engineer for Google ATAP.
You can learn a lot by looking at patterns. Data coming from a runner, for example, will look steady and highly cyclical across the duration of their workout. Data from a footballer will look much more erratic, with sudden spurts and fast turns mixed with moments of little activity. Gillian says Google worked with Adidas, EA, and soccer experts to collect data from people playing in different contexts (whether during training or an actual game). That data was then used to train thousands of neural networks to understand these complicated football motions. The data is anonymized so it's not tied to a specific user, and the hardware has no GPS or other location-tracking abilities.
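Gillian's description maps onto a fairly standard supervised-learning recipe: fixed-length windows of accelerometer and gyroscope readings, each labeled with the motion being performed, fed to a small classifier. The toy sketch below, in TensorFlow/Keras, shows the shape of that recipe; the label set, window size, and architecture are invented for illustration and aren't ATAP's actual models.

```python
# Toy motion classifier over windows of 6-axis IMU data (accelerometer +
# gyroscope). Labels, shapes, and architecture are illustrative guesses.
import numpy as np
import tensorflow as tf

MOTIONS = ["run", "kick", "stop", "turn"]  # assumed label set
WINDOW, CHANNELS = 100, 6  # 2 s at an assumed 50 Hz, accel + gyro axes

model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 5, activation="relu",
                           input_shape=(WINDOW, CHANNELS)),
    tf.keras.layers.MaxPooling1D(4),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(len(MOTIONS), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-in for the anonymized recordings Google collected from
# real players in training sessions and games.
x = np.random.randn(512, WINDOW, CHANNELS).astype("float32")
y = np.random.randint(len(MOTIONS), size=512)
model.fit(x, y, epochs=2, batch_size=32)
```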
The neural networks are so well trained now that the Tag can recognize when you make a fast turn, when you're kicking the ball, how far you've run, your peak speed, whether you are passing or shooting, and how powerful your kicks are. It can even estimate the ball's speed after you kick it. All this is happening in real time as the player moves.
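Real-time recognition of this kind typically means re-classifying a sliding window of the sensor stream as new samples arrive. Here's a minimal sketch of such a loop; it reuses the toy model and MOTIONS labels from the previous sketch, and the hop size and fake sensor feed are again assumptions.

```python
# Streaming inference sketch: buffer incoming IMU samples and classify
# each full window with the toy model defined above. The hop size and
# the simulated feed are assumptions, not the Tag's real pipeline.
from collections import deque
import numpy as np

HOP = 25  # re-classify every 25 samples (0.5 s at the assumed 50 Hz)

def stream_events(model, sample_source, window=WINDOW):
    buffer = deque(maxlen=window)
    for i, sample in enumerate(sample_source):  # sample: 6 IMU readings
        buffer.append(sample)
        if len(buffer) == window and i % HOP == 0:
            batch = np.array(buffer, dtype="float32")[np.newaxis]
            probs = model.predict(batch, verbose=0)[0]
            yield MOTIONS[int(probs.argmax())], float(probs.max())

# Usage with random data standing in for the live sensor feed:
fake_feed = (np.random.randn(CHANNELS) for _ in range(300))
for motion, confidence in stream_events(model, fake_feed):
    print(f"{motion}: {confidence:.2f}")
```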
Gillian noted that machine learning models like these are often gigabytes in size. The ATAP team managed to compress its models down to a few kilobytes so they could run on the Tag, much as Google shrank Google Assistant's algorithms to run locally on its Pixel phones.
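The article doesn't say which compression techniques ATAP used, but on Google's own stack, the standard route from a trained Keras model to a microcontroller-sized binary is TensorFlow Lite conversion with post-training quantization. The sketch below applies that technique to the toy model from the earlier example; it illustrates the general approach, not ATAP's actual pipeline.

```python
# Post-training quantization sketch using TensorFlow Lite, applied to
# the toy Keras model defined earlier. Illustrates the general shrinking
# technique, not ATAP's actual toolchain.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights
tflite_model = converter.convert()

with open("motion_model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"Converted model size: {len(tflite_model) / 1024:.1f} KB")
```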
In the context of the FIFA app, though, the player will need to head back to their phone and wait for the data to sync to the videogame to see progress on their goals. You can play soccer normally, or you can specifically chase the goals required to progress your virtual team. It doesn't matter whether you're an expert or an amateur; the Google team deliberately collected data from players of varying skill levels.
"We're not asking you to play soccer in a different way," Giles said. "Just go play soccer the way you always play."
Google has slowly been moving toward this future of ambient computing, where the tech is seamlessly integrated into your surroundings. Its most recent Pixel phones have a sensor that can identify hand gestures, allowing owners to wave a hand above the phone to skip tracks or play and pause music without touching the phone or speaking a voice command. The phones can also detect whether the owner has been in a car crash, using machine learning models of what happens during accidents, and will contact emergency services if the owner doesn't respond.
"I do think there's a direction toward these motion-based controls," Giles says. "It's this vision of ambient computing—getting it out of these smartphones or even laptops and moving it into an area that's closer to the user with more natural interactions. We love this idea of taking ambient computing and just subsuming it, really hiding it in the products we're using. It shouldn't be explicit; it should just be there, add value to you in such a natural, interactive way that you don't even know it's there."
Jacquard is just one arm of Google's ambient computing platform, but it achieves this vision far more clearly than anything else. Giles says the team started with soccer because most of the game's motions can be understood just through the feet, but the technology can be expanded to a wide range of other applications.
"Whether you put it in a wrist band or headband, it's the same model and platform," Giles says.