Can AI-driven fitness apps, developed with synthetic data, pump up your workout?
During the COVID-19 pandemic, home fitness apps were all the rage. From January through November 2020, approximately 2.5 billion health and fitness apps were downloaded worldwide. That trend held and shows no signs of slowing down, with new data predicting growth from $10 million in 2022 to $23 million by 2026.
As more people use fitness apps to train and track their development and performance, those apps are increasingly using AI to power their offerings, providing AI-based workout analysis built on technologies including computer vision, human pose estimation and natural language processing.
Tel-Aviv-based Datagen, which was founded in 2018, claims to provide “high-performance synthetic data, with a focus on data for human-centric computer vision applications.”
The company just announced a new domain, Smart Fitness, on its self-service, visual synthetic data platform that helps AI developers produce the data they need to analyze people exercising and train smart fitness equipment to “see.”
“At Datagen, our focus is to aid computer vision teams and accelerate their development of human-centric computer vision tasks,” Ofir Zuk, CEO of Datagen, told VentureBeat. “Almost every use case we see in the AI space is human-related. We are specifically trying to solve and help understand the interconnection between humans and their interaction with surrounding environments. We call it human in context.”
Synthetic visual data represents fitness environments
The Smart Fitness platform provides 3D-annotated synthetic visual data in the form of video and images. This visual data accurately represents fitness environments, advanced motion, and human-object interactions for tasks related to body key point estimation, pose analysis, posture analysis, repetition counting, object identification and more.
In addition, teams can use the solution to generate full-body in-motion data to iterate on their model and improve its performance quickly. For example, in cases of pose estimation analysis, an advantage the Smart Fitness platform provides is the capability to quickly simulate different camera types for capturing a variety of differentiated exercise synthetic data.
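One of the tasks listed above, repetition counting, is typically driven by the joint angles that pose estimation recovers. As a minimal sketch (not Datagen's implementation; the thresholds and the simulated angle signal are illustrative assumptions), a rep can be counted each time a joint angle crosses from a flexed range back to an extended range:

```python
import numpy as np

def count_reps(angles, low=70.0, high=160.0):
    """Count repetitions from a joint-angle time series (degrees).

    A rep is counted each time the angle drops below `low` (flexed)
    and then rises back above `high` (extended). The thresholds are
    illustrative, not tuned to any particular exercise.
    """
    reps = 0
    flexed = False
    for a in angles:
        if not flexed and a < low:
            flexed = True      # entered the bottom of the movement
        elif flexed and a > high:
            flexed = False     # returned to extension: one full rep
            reps += 1
    return reps

# Simulated elbow angles for three bicep curls: extend -> flex -> extend
t = np.linspace(0, 3 * 2 * np.pi, 300)
angles = 115 + 55 * np.cos(t)  # oscillates between 60 and 170 degrees
print(count_reps(angles))  # 3
```

In a real pipeline the angle series would come from estimated keypoints rather than a cosine, and hysteresis like this (two thresholds instead of one) keeps noisy estimates from double-counting a rep.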
Challenges to training AI for fitness
Pose estimation, a computer vision technique that determines the position and orientation of the human body from an image of a person, is one of the unique solutions AI has to offer. It can be used in avatar animation for artificial reality, for example, as well as in markerless motion capture and worker pose analysis.
To correctly analyze posture, it is necessary to capture several images of the human subject interacting with their environment. A trained convolutional neural network then processes these images to predict where the subject’s joints are located in the image. AI-based fitness apps generally use the device’s camera, recording video at up to 720p and 60fps to capture more frames during exercise performance.
The problem is that computer vision engineers need vast amounts of visual data to train AI for fitness analysis when using a technique like pose estimation. Data involving humans performing exercises in various forms and interacting with multiple objects is highly complex. The data must also be high-variance and sufficiently diverse to avoid bias. Collecting accurate data that covers such variety is nearly impossible. On top of that, manual annotation is slow, prone to human error and expensive.
While an acceptable level of accuracy has already been reached in 2D pose estimation, 3D pose estimation still falls short of producing accurate models. That is especially true for inference from a single image with no depth information. Some methods point multiple cameras at the person and capture information from depth sensors to achieve better predictions.
However, part of the problem with 3D pose estimation is the lack of large annotated datasets of people in open environments. For example, large datasets for 3D pose estimation such as Human3.6M were captured entirely indoors to eliminate visual noise.
There is an ongoing effort to create new datasets with more diverse data regarding environmental conditions, clothing variety, strong articulations, and other influential factors.
The synthetic data solution
To overcome such problems, the tech industry is now widely using synthetic data, a type of data produced artificially that closely mimics operational or production data, for training and testing artificial intelligence systems. Synthetic data offers several significant benefits: it minimizes the constraints associated with using regulated or sensitive data; it can be customized to match conditions that real data does not allow; and it enables large training datasets without requiring manual labeling.
According to a report by Datagen, the use of synthetic data reduces time-to-production, eliminates privacy concerns, reduces bias and annotation and labeling errors, and improves predictive modeling. Another advantage of synthetic data is the ability to easily simulate different camera types while generating data for use cases such as pose estimation.
Exercise demonstration made simple
With Datagen’s smart fitness platform, organizations can create tens of thousands of unique identities performing a variety of exercises in different environments and conditions – in a fraction of the time.
“With the prowess of synthetic data, teams can generate all the data they need with specific parameters in a matter of a few hours,” Zuk said. “This not only helps retrain the network and machine learning model, but also allows you to get it fine-tuned in no time.”
In addition, he explained, the Smart Fitness platform lets teams generate millions of visual exercise data points, eliminating the repetitive burden of capturing each element in person.
“Through our constantly updating library of virtual human identities and exercise types, we provide detailed pose information, such as locations of the joints and bones in the body, that can help analyze intricate details to enhance AI systems,” he said. “Adding such visual capabilities to fitness apps and devices can significantly improve the way we see fitness, enabling organizations to provide better services both in person and online.”
Fitness AI and synthetic data in the enterprise
According to Arun Chandrasekaran, distinguished VP Analyst at Gartner, synthetic data is, so far, an “emerging technology with a low degree of enterprise adoption.”
However, he says it will see growing adoption for use cases for which data must be guaranteed to be anonymous or privacy must be preserved (such as medical data); augmentation of real data, especially where costs of data collection are high; where there is a need to balance class distribution within existing training data (such as with population data); and emerging AI use cases for which limited real data is available.
Several of these use cases are key for Datagen’s value proposition. When it comes to enhancing the capabilities of smart fitness devices or apps, “of particular interest will be the ability to boost data quality, cover the wide gamut of scenarios and privacy preservation during the ML training phase,” he said.
Zuk admits that it is still early days for bringing AI and synthetic data, and even digital technologies overall, into the fitness space.
“They are very non-reactive, very lean in terms of their capabilities,” he said. “I would say that adding these visual capabilities to these fitness apps, especially as people exercise more in their own home, will definitely improve things significantly. We clearly see an increase in demand and we can just imagine what people can do with our data.”
One of the most irritating (and slightly painful) parts of joining a Microsoft Teams call could soon be fixed by a new update.
The video conferencing service is a popular choice for many companies, meaning calls with large numbers of participants joining at the same time, and from the same location (such as a meeting room) are a common occurrence.
However, often when multiple people join a meeting in the same room, a feedback loop is created, which causes echo, which in most cases quickly escalates to howling – with Microsoft likening the noise to when a musician holds the mic too close to a loudspeaker.
Teams’ howling
Fortunately, a new fix is coming for Microsoft Teams users. According to its entry in the official Microsoft 365 roadmap, the new “Ultrasound Howling Detection” feature aims to prevent this noise for users on Windows and Mac across the world.
Microsoft says that when multiple users on laptops join from the same location, Teams will alert them that another Teams device has been detected in their vicinity and has already joined the current meeting with audio.
If a user has already joined with their audio on, Microsoft Teams will automatically mute the mic and speakers of anyone who then joins the call, hopefully putting an end to the howling and screeching feedback.
Thankfully, the update is already listed as being in development, with an expected general availability date of March 2023, so users shouldn’t have to wait too long to enjoy it.
The new updates are the result of using a machine learning model trained on 30,000 hours of speech samples, and include echo cancellation, better adjusting audio in poor acoustic environments, and allowing users to speak and hear at the same time without interruptions.
Shazam! Fury of the Gods lands in theaters on March 17. (Image credit: Warner Bros.)
The final trailer for Shazam! Fury of the Gods has debuted online – and it looks even funnier, more charming, more frenetic, and darker than its predecessor.
Shazam’s sequel flick arrives in theaters worldwide on March 17, so it’s about time we were given another look at the forthcoming DC Extended Universe movie (read our DC movies in order guide to find out where it’ll fit in that timeline). Luckily, Warner Bros. has duly obliged. Check it out below:
Okay, there’s some messy CGI and a slightly corny vibe about Shazam 2. But hey, the first problem can be ironed out before the superhero film takes flight, while the second is part of what makes this movie series spellbinding (see what we did there?).
But we digress – you’re here because you want to find out what you missed from Shazam! Fury of the Gods‘ new trailer. Below, we’ve pointed out six things you might have overlooked. So, what are you waiting for? Shout “Shazam!” and let’s dive in.
1. Who are the Daughters of Atlas?
New movie, new villains. (Image credit: Warner Bros.)
For a film centered around Shazam, we don’t actually see the titular superhero appear in the official trailer for the first 20 seconds.
Instead, we get another glimpse at Fury of the Gods‘ villains, aka the Daughters of Atlas. The powerful trio comprises the power-hungry Hespera (Helen Mirren), dragon-riding Kalypso (Lucy Liu), and Athena (Rachel Zegler), the latter of whom seems particularly torn about how the sisters are going about their business.
So, why are they gunning for Shazam and his superpowered foster siblings? Essentially, when Billy Batson was gifted his abilities by Djimon Hounsou’s wizard in the first film (available now on HBO Max), one of those powers was the Stamina of Atlas. The Daughters of Atlas aren’t too happy about their father’s ability being passed down to a child, so they want to take back what is theirs – and they’ll do so by any means necessary.
2. Mythological monsters
Shazam isn’t the only person taking flight in Fury of the Gods. (Image credit: Warner Bros.)
Shazam’s first DCEU outing featured some horror-imbued creatures in the form of the Seven Deadly Sins. How, then, do you go about topping (or, at the very least, matching) what came before? Throw in a bunch of myth-based monsters, of course.
Kalypso’s imposing dragon is the most notable inclusion. It features prominently throughout the trailer, and we even get an amusing Game of Thrones reference from Shazam – “Hey, Khaleesi!” – in the movie. Hey, Warner Bros. loves to mention its suite of IPs in as many of its films as possible.
But Kalypso’s wyvern isn’t the only fairy-tale-based beast we see. Minotaurs, griffons, and demonic unicorns are just three of the other monsters who’ll turn up in Fury of the Gods. Basically, don’t expect this to be an easy fight for Shazam and company to save the world.
3. You can’t get the staff these days
“So I just point it and then what?” (Image credit: Warner Bros.)
Saving earth from a new titanic threat will be even harder when Shazam’s adoptive family are stripped of their powers, too. And it seems that the staff, which was wielded by Hounsou’s wizard in the first movie, is the key to giving and taking those abilities away.
In 2019’s Shazam!, the titular hero gave powers to his foster siblings to help him combat the Seven Deadly Sins and Doctor Sivana. They’ve still got those powers in Fury of the Gods, too, but they won’t have them for long, based on what the trailer suggests.
The footage shows Freddy Freeman and Mary Bromfield being drained of their abilities by the Daughters of Atlas at various points. The trio are using the wizard’s staff to rob the teens of their powers, so it’s clearly of major importance to the movie’s main players.
Later, we see Shazam wielding it – not before he asks the wizard to take his powers back, mind you, when he becomes convinced he can’t defeat the Daughters of Atlas. Anyway, Shazam’s brandishing of the staff suggests he needs it to boost his own abilities if he’s going to defeat the movie’s antagonists and give his siblings their powers back. Expect the staff to play a vital role in Fury of the Gods‘ plot, then.
4. Prison break
Time to break out, Mr. Wizard. (Image credit: Warner Bros.)
It seems the Daughters of Atlas go after Hounsou’s magic wielder in order to obtain the wizard’s staff.
We see Hounsou’s character imprisoned at various points, including a shot of Hespera chastising him for giving the power of the gods to Billy, Freddy, and company. “You ripped it from our father’s core,” she tells him, which implies Hounsou’s wizard might not be as mighty and heroic as we were led to believe.
Anyway, Hounsou’s wizard interacts with Shazam later in the trailer, so he clearly escapes captivity. Whether he does so alone, or he enlists Shazam’s help – does that magic-infused dust, which he sends through his prison cell window, have something to do with it? – is unclear. Regardless, we’ll see Hounsou’s character break out at some stage.
5. Is that you, Doctor Strange?
Where have we seen this kind of aesthetic before? (Image credit: Warner Bros.)
Remember when we said Zegler’s Athena doesn’t seem as keen to destroy earth as her sisters? That’s because, at the 1:14 mark, we see her use her powers with an uncertain look on her face. You wouldn’t look like that if you were convinced you were doing the right thing, would you?
Based on the fact she’s pushed away by Kalypso (using the staff no less), seconds later, it seems she’ll be swapping sides at some stage.
Interestingly, it seems the wizard’s staff can do more than give or take a person’s powers away. One perceived ability certainly has an air of the Doctor Strange/Marvel-style mystic arts about it. Just look at the Escher-style way the scenery bends and folds in on itself when Athena is pushed back, and when Shazam evades numerous buildings at the 1:44 mark. We’d be very surprised if DC and Warner Bros. didn’t take a leaf out of the MCU’s book with such an aesthetic.
6. Light the way
A yellow bolt out of the blue. (Image credit: Warner Bros.)
Shazam and his fellow superheroes get a costume upgrade in Fury of the Gods. The group’s threads are more streamlined and less plastic-looking this time around, which is pleasing to see.
Fans had been worried, though, that these suits wouldn’t feature one of the first movie’s most underrated (if somewhat tacky) aspects: the glowing lightning bolt on Shazam’s chest. Shazam’s costume in the 2019 movie was manufactured in a way that allowed the bolt to physically light up, avoiding the problem of having to add awkward lighting effects during the post-production phase.
Thankfully, Shazam! Fury of the Gods‘ official trailer confirms that Shazam’s lightning bolt will glow. However, given the sleeker look of the costumes this time around, it appears that the illumination effect has been added in post. Regardless of how it’s been implemented, we’re just glad it’s a feature that’s been retained.
Jokes aside about Chrome’s incognito mode, the ability to open a private tab for sensitive browsing is incredibly useful. You can perform searches that you want to keep from affecting your recommendations or appearing in your search history—which applies as much to tax information and medical questions as anything more scintillating.
And now on all phones and tablets, you can protect your incognito tabs from prying eyes by locking them down. A quick tweak to Chrome settings on iOS and Android makes biometric or PIN authentication required to view your private tabs whenever you leave the app and then return. It’s an extra layer of protection for when you forget to close a tab when you’re done—easy to do if you’re constantly hopping between apps. No need to worry about banking info sitting unguarded, for example.
Trying the feature out for yourself is easy. If it’s rolled out to your Android device (or if you’re only now trying it on your iPhone or iPad), just tap the three-dot menu in Chrome, then Privacy and Security. Toggle on Lock Incognito Tabs When You Close Chrome. Now when you switch away from Chrome and then come back, you’ll have to pass an authentication check before you can see and interact with those private tabs again.
Flipping the toggle is all you need to do to enable this feature. (Shown here in iOS.)
For folks who use incognito tabs more on mobile than dedicated apps, this feature is a very welcome addition—and one I hope to see come to desktop computers next. I leave my incognito windows open on PC for long stretches way more often than on a phone or tablet. I haven’t yet met a browser window stuffed with tabs that I didn’t like to keep around. And sometimes I’m reading up on something I don’t want roommates to know about; other times, I have private correspondence I’m working on that I really don’t want to be seen.
I can always lock my PC, but I occasionally forget to slam my fingers on Win + L before dashing off to deal with an overflowing pot or vomiting cat. The best alternative is setting up Dynamic Lock in Windows, but that only works if you move far enough away from your computer to trigger the auto-lock. It unfortunately doesn’t prevent someone also in your kitchen from wandering by your screen and teasing you about your recent discovery of r/illegallysmolcats. Ask me how I know.