At Samsung Galaxy Unpacked this year, company president TM Roh took the stage in Brooklyn, NY, to tout the transformative nature of Galaxy AI. The presentation covered how Samsung's AI tech customizes information and systems to become your personal companion: it gives you morning briefs, synthesizes your health information and integrates across different form factors, like foldables, VR and wearables.
I am an active user of the Samsung Galaxy Z Fold 6, but I've never once seen any of Samsung's Galaxy AI features surface in my daily use. And it's not like I'm not looking out for this stuff; I'm literally an AI reporter at CNET. So what am I missing?
Part of the reason I never notice Galaxy AI is that it's hampered by the defaults of Google's open-source Android operating system. Unlike Apple, Samsung doesn't control the software running on its devices; it uses Android instead.
Samsung and other smartphone makers can add their own software features on top of Android. Google, however, doesn't allow Android partners to completely remove Google's included apps. So if partners want their own interpretations of a calling app or text messaging app, those have to live alongside Google's versions.
Years ago, Samsung did attempt to launch its own mobile operating system, called Tizen, but, as with Windows Phone and other mobile operating systems, getting app developers on board proved challenging.
Where’s my AI, Samsung?
I have yet to see any of the AI features Samsung touts. Apart from Samsung's daily brief, there's a host of editing features, including audio eraser for clearer audio, auto trim for video editing and generative edit, which lets you use AI to retouch images. On Samsung's Galaxy AI website, the company says there are also call transcript, writing assist and interpreter features.
I’m personally not a big photo bug, so I don’t spend too much time snapping pics and setting aside time to edit photos for Instagram. So, being blind to these features is on me.
But I am a reporter, so the transcript and interpreter features would be particularly handy. Well, they would be, if I'd ever seen them. I jumped on an impromptu call with my co-worker Corin Cesaric to test whether the transcript features would activate. They didn't.
I went into the settings to double-check whether any of the AI features were enabled. Apart from Note Assist, all of them were turned on. The problem was perplexing. Turns out, it comes down to which apps you set as defaults when setting up your device.
It’s all about defaults
To get Samsung's AI features, you have to use Samsung's apps. But when I first set up my phone, I guess I checked the box for Google's suite of apps, maybe because I was coming from a Google Pixel device. There are, for example, separate versions of the phone, messages and photo apps: one from Google and one from Samsung. The Samsung Messages app wasn't even installed on my device, as Samsung made Google Messages the default messaging app back in 2021. I had to go to the Galaxy Store (yes, Samsung has a separate app store on Android) to install it.
The dual nature of Samsung devices running Google's operating system is to blame for all this confusion. Many Samsung and Google apps have the exact same names, which only muddies things further.
I think Google has tacitly acknowledged this division as a problem. Google advertises its Pixel line of devices as having software deeply embedded with Google DNA, mixing design and AI in ways that just work. If Google wants Samsung and other partners to have that same kind of clean integration, it needs to give them greater control. Otherwise, consumers will simply conclude that Samsung, OnePlus or Motorola is to blame.