This story starts to feel authentic not in a polished demo video or keynote, but in a gym where the air carries a faint odor of rubber and disinfectant, and where, between sets, people are constantly looking down at their phones, scrolling with the same seriousness they once reserved for the mirror. That is the market Fitbit is entering: its AI-powered Personal Health Coach is coming to the iPhone, in a public preview for iOS users in the US and several other English-speaking countries.
When you take away the marketing glitz, Fitbit’s pitch is straightforward: a conversational coach that uses Google’s Gemini models to read your history, including your sleep patterns, activities, and trends, and reacts like a trainer who recalls your motivation in November and what you did last Tuesday.
| Category | Details |
|---|---|
| Company | Fitbit (Google) |
| Feature | Personal Health Coach (AI-powered, conversational coaching) |
| Expansion | Rolling out to iOS (iPhone) and additional countries |
| Availability | Public Preview (not final release) |
| Requirements | Fitbit Premium + eligible device; rollout occurs “over the next few weeks” |
| Supported Regions (Expansion) | U.S., UK, Canada, Australia, New Zealand, Singapore (English) |
| AI Backbone | Google Gemini models |
| How to Access | Join “Public Preview” inside the Fitbit app |
| Official Reference | https://blog.google/products-and-platforms/devices/fitbit/personal-health-coach-expansion/ |
Google has been working toward this for some time; in late 2025 it first gave eligible Fitbit Premium users in the US an Android preview of the coach before widening the circle. Because it changes who is invited to the experiment, the iPhone expansion may matter more than the underlying technology.
Since the feature is still in public preview, Fitbit continues to gather user feedback and candidly acknowledges that the coach may malfunction, misinterpret user intent, or occasionally offer advice that seems a little strange. That candor is refreshing, but it also raises a question that haunts all AI-in-health products: who bears the responsibility when the advice is wrong, the user, the business, or the gray fog of "well, it's still in preview"?
The coach can do a lot on paper. When users request a plan, such as building strength with minimal equipment or training for a 10K, the system creates routines and modifies them as schedules change.
The goal is not only to create workouts but also to modify them based on the individual data that wearables are already gathering. According to Google, the coach is built on Gemini models and thoroughly evaluated by human reviewers, including specialists across a variety of health-related domains. Even though the user interface is essentially a chat box, the ambition is lofty.
People who have worked with a good personal trainer, however, know that the work is not about "making a plan." Plans are cheap. The real service lies in spotting form breakdown on the third rep, noticing hesitation before a lift, and reading the half-lie in "yes, I slept fine." By observing patterns such as resting heart rate, sleep stages, and training load, an AI coach can approximate some of this, but it is still guessing at the most human part: why a person didn't complete the workout.
However, when you consider the economics, the appeal becomes clear. In many places, hiring a human trainer is more expensive than most people would like to acknowledge. The coach is not a free public good; Fitbit Premium is the gate. Investors appear to think that the subscription layer, which transforms wearables from one-time purchases into recurring partnerships, is where the long-term revenue resides. However, when algorithm-based relationships start to feel preachy, or worse, generic, they often become fragile.
The geography of the rollout is telling. The iOS expansion arrives alongside broader availability in the UK, Canada, Australia, New Zealand, and Singapore, all in English only for now. That's risk management, not just convenience. Language is where nuance breaks, and when giving people advice about their bodies, sleep patterns, stress levels, and limitations, subtlety is crucial.
Additionally, there is the silent privacy concern that lingers whenever a health app requests more extensive access. This coach generates recommendations based on your past data. Giving someone a diary written in heartbeats and sleepless nights can feel as empowering as finally receiving something valuable after years of tracking. It’s difficult to ignore how readily people give their consent when the promise serves as motivation as you watch this happen.
Can it take the place of personal trainers, then? Perhaps for some individuals. An iPhone's built-in coach could be sufficient for someone who primarily needs structure, gentle accountability, and a plan that adapts when life becomes chaotic. It might even be superior to a mediocre trainer, the kind who recycles the same routines and watches the clock. Replacing a great trainer, though, someone who knows when to push and when to back off, who corrects posture mid-rep, still feels like a stretch.
A reorganization seems more likely. The default layer is AI coaches, which are inexpensive, reliable, patient, occasionally surprisingly intelligent, and occasionally incorrect in a subtle way. When the stakes are higher—injury recovery, serious performance goals, or simply the need for a real person standing by when motivation wanes—human trainers become more specialized, more costly, and possibly more valued.
Fitbit needs a larger test audience, and the iPhone delivers one. If this coach succeeds, it won't be because it mimics a trainer exactly. It will be because it fits into the small gaps of everyday life, a lunch break, a bus ride, the quiet minute before bed, and turns data into something that feels more like a gentle reminder than a lecture. The experiment is no longer limited to one side of the smartphone divide, but whether that nudge will be enough to change behavior over the long run remains unclear.