Google has formally announced that the Home app's "home panel" feature, which gives you "quick access to your Spaces and Favorites directly from your lock screen", is coming to other Android devices with Android 14.
Shown above is the old Device Controls UI on a OnePlus Open running Android 13 (left) versus the new Device Controls UI on a OnePlus 11 running Android 14 (right). This new home panel UI has also appeared on Galaxy S23 series devices and the Nothing Phone 2 running Android 14, among others.
The "home panel" is accessed by tapping the Device Controls shortcut on the lock screen or in the Quick Settings panel.
The "home panel" takes advantage of a new API called
With Android 14, the API is now public and third-party apps, like Home Assistant, can utilize it to show a custom activity in the Device Controls panel as well.
Shown above is the old Device Controls UI on a OnePlus Open running Android 13 (left) versus the new Device Controls UI on a OnePlus 11 running Android 14 (right). This new home panel UI has also appeared on Galaxy S23 series devices and the Nothing Phone 2 running Android 14, among others.
The "home panel" is accessed by tapping the Device Controls shortcut on the lock screen or in the Quick Settings panel.
The "home panel" takes advantage of a new API called
ControlsProviderService#META_DATA_PANEL_ACTIVITY that lets apps embed a custom activity in the Device Controls interface. When Pixels added support for the home panel in the June 2023 Pixel Feature Drop (Android 13 QPR3), the API wasn't public and could only be used by the Google Home app.With Android 14, the API is now public and third-party apps, like Home Assistant, can utilize it to show a custom activity in the Device Controls panel as well.
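For a sense of what this looks like on the app side, here's a minimal Kotlin sketch of a Controls provider, assuming the standard ControlsProviderService API from Android 11; the service name and TODO bodies are placeholders rather than anything Google Home or Home Assistant actually ships.

```kotlin
import android.service.controls.Control
import android.service.controls.ControlsProviderService
import android.service.controls.actions.ControlAction
import java.util.concurrent.Flow
import java.util.function.Consumer

// Hypothetical provider: a real app would publish actual Control objects here.
class MyControlsService : ControlsProviderService() {
    override fun createPublisherForAllAvailable(): Flow.Publisher<Control> =
        TODO("Publish every control this app can expose")

    override fun createPublisherFor(controlIds: MutableList<String>): Flow.Publisher<Control> =
        TODO("Publish live state for the requested controls")

    override fun performControlAction(
        controlId: String,
        action: ControlAction,
        consumer: Consumer<Int>
    ) = TODO("Handle taps/toggles coming from the Device Controls UI")
}
```

To opt into the Android 14 panel, the service's manifest entry additionally carries a meta-data item named after the ControlsProviderService.META_DATA_PANEL_ACTIVITY constant, pointing at the activity the panel should embed.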
Android 14 finally lets apps show content on both screens on a foldable! Thanks to new Jetpack WindowManager APIs and a new WindowManager Extension, apps can now use both the inner & outer display on a foldable.
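To give a rough idea of the app-facing side, here's a sketch of putting content on a foldable's cover screen with Jetpack WindowManager's window-area API (androidx.window 1.2). Treat it as an approximation under that assumption; the class names are real, but check the current docs for exact signatures.

```kotlin
import android.os.Bundle
import android.widget.TextView
import androidx.activity.ComponentActivity
import androidx.lifecycle.lifecycleScope
import androidx.window.area.WindowAreaCapability
import androidx.window.area.WindowAreaController
import androidx.window.area.WindowAreaInfo
import androidx.window.area.WindowAreaPresentationSessionCallback
import androidx.window.area.WindowAreaSessionPresenter
import kotlinx.coroutines.launch

class DualScreenActivity : ComponentActivity() {
    private val controller = WindowAreaController.getOrCreate()

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        lifecycleScope.launch {
            controller.windowAreaInfos.collect { infos ->
                // On a foldable, the cover screen is exposed as a rear-facing window area.
                val rearArea = infos.firstOrNull { it.type == WindowAreaInfo.Type.TYPE_REAR_FACING }
                val status = rearArea
                    ?.getCapability(WindowAreaCapability.Operation.OPERATION_PRESENT_ON_AREA)
                    ?.status
                if (rearArea != null && status == WindowAreaCapability.Status.WINDOW_AREA_STATUS_AVAILABLE) {
                    // A real app should also track whether a session is already active.
                    presentOnCoverScreen(rearArea)
                }
            }
        }
    }

    private fun presentOnCoverScreen(area: WindowAreaInfo) {
        controller.presentContentOnWindowArea(
            area.token,   // identifies the cover-screen window area
            this,         // the activity staying on the inner display
            mainExecutor, // where callbacks are delivered
            object : WindowAreaPresentationSessionCallback {
                override fun onSessionStarted(session: WindowAreaSessionPresenter) {
                    // Content shown on the outer display while this activity keeps
                    // running on the inner one ("dual display" mode).
                    session.setContentView(TextView(session.context).apply {
                        text = "Hello from the cover screen!"
                    })
                }

                override fun onContainerVisibilityChanged(isVisible: Boolean) {}

                override fun onSessionEnded(t: Throwable?) {}
            }
        )
    }
}
```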
Here's my article for Android Police that explains how this new feature works.
Oh, and I'd also appreciate it if you checked out the article I wrote for Android Central over the weekend, which explains what reviewers actually mean when we say many apps aren't "optimized" for foldables yet.
Google has announced that "developers with newly created personal Play Console accounts will soon be required to test their apps with at least 20 people for a minimum of two weeks before applying for access to production."
More specifically, Google says developers with personal accounts created after November 13, 2023, have to "run a closed test for [their] app with a minimum of 20 testers who have opted-in for at least the last 14 days continuously." By "continuously", Google means that they "won't count testers who opted in, tested for less than 14 days, and then opted out."
Certain features in the Play Console, such as Production (Release > Production) and Pre-registration (Release > Testing > Pre-registration), will be disabled until these requirements are met. Google will ask developers "some questions to help [them] understand your app, its testing process, and its production readiness."
While support for SDR dimming, a feature that dims SDR UI layers without compromising HDR content, isn't strictly necessary to display HDR images on Android, it is nice to have!
A lot of people complain about HDR images "blinding them" when they're scrolling through social media. That's because the screen's brightness is cranked up in order to properly display the HDR content's highlights, but as a result, any SDR content (which includes most Android UI elements) will appear way too bright.
SDR dimming solves this by lowering the pixel intensities of SDR layers only, while keeping the display brightness high enough that HDR content can maintain its brighter highlights.
In the video embedded above, via Mozart Louis, you can see how parts of the image brighten on the right when the Ultra HDR version of the image is loaded, but UI elements like the status bar and in-app buttons remain the same brightness.
The SDR dimming feature was added in Android 13, but so far, only Google's Pixel 7 and newer are configured to support it. Android 14 added the Display#isHdrSdrRatioAvailable() method, which lets apps check whether the device's display supports reporting an HDR/SDR ratio. If the ratio isn't defined, SDR dimming isn't enabled.
This ratio is undefined on the few non-Pixel Android 14 devices I tested, like the Galaxy S23+, OnePlus 11, and Nothing Phone 2. Older Pixels running Android 14, like the Pixel 6a, also don't support it. Hopefully more devices add support for SDR dimming, especially when social media platforms start rolling out support for posting and viewing HDR images!
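For apps that want to run this check themselves, a minimal sketch (API 34; the log tag and wording are just illustrative):

```kotlin
import android.util.Log
import android.view.Display

// Pass the Display the content is shown on, e.g. activity.display on API 30+.
fun logSdrDimmingSupport(display: Display) {
    if (display.isHdrSdrRatioAvailable) {
        // A ratio above 1.0 means there is currently HDR headroom, i.e. SDR layers
        // are being dimmed relative to the display's peak HDR brightness.
        Log.d("HdrCheck", "HDR/SDR ratio: ${display.hdrSdrRatio}")
    } else {
        // No ratio reported: SDR dimming isn't enabled on this display.
        Log.d("HdrCheck", "SDR dimming not supported here")
    }
}
```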
The source code for libultrahdr, the reference codec for Google's new Ultra HDR image format, is available on GitHub under the Apache 2.0 license. Ultra HDR itself includes Gain Map technology under license from Adobe.
libultrahdr supports decoding and encoding Ultra HDR images on Android, Linux, macOS, and Windows.
Edit: libultrahdr is included in AOSP with the release of Android 14. Fun fact: the old name for Ultra HDR was "JPEG Recovery Map", which is why the file format is still called JPEG_R.
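On Android 14 itself you don't need libultrahdr directly to see the gain map at work; a decoded Ultra HDR JPEG surfaces it through the regular Bitmap APIs. A small sketch, with the file path being a placeholder:

```kotlin
import android.content.pm.ActivityInfo
import android.graphics.BitmapFactory
import android.view.Window

// Decode a (hypothetical) Ultra HDR JPEG and check whether it carries a gain map.
fun showUltraHdr(window: Window, path: String = "/sdcard/Download/ultrahdr_sample.jpg") {
    val bitmap = BitmapFactory.decodeFile(path) ?: return
    if (bitmap.hasGainmap()) {
        // Opting the window into HDR lets the system apply the gain map, so the
        // image renders with brighter highlights on displays that support it.
        window.colorMode = ActivityInfo.COLOR_MODE_HDR
    }
    // ...then hand the bitmap to an ImageView as usual.
}
```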
Android might swap which open-source decoder it uses for playing back AV1 videos in order to improve performance on lower-end hardware.
Android 10 added native support for playing back videos encoded in the AV1 format by bundling libgav1, an open-source AV1 decoder developed by Google.
libgav1 isn't the only AV1 decoder out there, though. There's also libaom, the reference decoder developed by the Alliance for Open Media, as well as libdav1d developed by VideoLAN.
Like libgav1, libdav1d is a CPU-based decoder, but benchmarks here and there have shown that it offers better performance.
For example, here are charts (courtesy of Reddit user unlord_ from the AV1 community) that compare the software decoding performance of libaom, libgav1, and libdav1d on 1080p test clips across a variety of Arm-based devices (the ODROID-C2, Raspberry Pi 4, Google Pixel 2, Google Pixel 3, and Xiaomi Mi 9T Pro). The data is from mid-2020, though.
Recently, Google started experimenting with using libdav1d instead of libgav1 for decoding AV1 videos. external/libdav1d was just added to the platform manifest in the main branch, meaning the decoder's source code is now included by default when syncing AOSP.
In other patches, the dav1d decoder was added and enabled as a new codec2 component, enabling AV1 decompression to be done using dav1d instead of gav1.
Since this is still an experiment, I don't know if Android will actually switch over to the libdav1d codec. If it does happen, it might be backported to current versions of Android since Media Codecs is a Project Mainline module. More importantly, if ExoPlayer were to bundle libdav1d like it currently does with libgav1, then many existing media player apps would benefit. (Media player apps that use libvlc already benefit.)
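In the meantime, you can check which AV1 decoders your own device exposes with a quick MediaCodecList query like the sketch below. On most devices today that's the libgav1-backed software decoder plus any hardware decoders; what name a dav1d-backed component would register under is my assumption, not something shipping yet.

```kotlin
import android.media.MediaCodecList
import android.media.MediaFormat

// Returns the names of every decoder on the device that advertises AV1 ("video/av01")
// support, e.g. "c2.android.av1.decoder" (software) and vendor hardware decoders.
fun listAv1Decoders(): List<String> =
    MediaCodecList(MediaCodecList.REGULAR_CODECS).codecInfos
        .filterNot { it.isEncoder }
        .filter { info ->
            info.supportedTypes.any { it.equals(MediaFormat.MIMETYPE_VIDEO_AV1, ignoreCase = true) }
        }
        .map { it.name }
```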
Android 14 seems to have quietly eliminated a trick some apps were using to keep themselves alive when the OS tried to kill them.
As spotted by Greenify developer Oasis Feng, Android now freezes a package's cgroup before killing it. Control groups (cgroups) are a Linux kernel mechanism that organizes processes into groups so their resource usage can be monitored and controlled.
Before Android 14, apps were "able to prevent their death by forking multiple processes under different services and monitoring for the death of any of these. When a child death [was] detected, the remaining process restart[ed] the terminating/terminated service before it's able to be killed itself."
Android 14 prevents this "by freezing the entire cgroup of the package to be killed before killing the individual processes. After the kills are completed synchronously, the cgroup can be unfrozen to allow for restarts. Before freezing the cgroup, the binder interfaces of the processes about to be frozen are also frozen to prevent indefinite blocking by synchronous Binder callers."
Apparently, this trick that Android 14 patches was used by libraries like MarsDaemon in order to keep apps from being killed. The library itself isn't malicious, but it was used by a lot of malware in the past.
The MarsDaemon library hasn't been updated in years; Oasis says Chinese OEM forks of Android patched the method it relied on long ago. Given how much more aggressively those forks manage background processes, this wouldn't surprise me.
It's good to see Android crack down on abusive background behavior by apps. Other related improvements in Android 14 include a reduction in how long it takes for cached apps to be frozen (10 minutes → 10 seconds) and an increase in the maximum number of cached apps (32 → 1024).
Starting today, developers submitting apps for distribution on Google TV will be required to provide a square app icon asset in addition to a banner logo.
This is because in early 2024, the Google TV launcher's "For you" tab will display app icons in a circular format.
The aspect ratio of the launcher icon should be 1:1, the resolution of the provided launcher icon at xhdpi should be at least 160x160, and the launcher icon should be contained within the 72x72 safe zone so it doesn't get cropped in an undesirable way.
RIP: the deal between Qualcomm and Iridium to bring satellite connectivity to smartphones for two-way emergency messaging has collapsed, as reported by PCMag.
Qualcomm announced Snapdragon Satellite back in January of this year and said that "emergency messaging on Snapdragon Satellite is planned to be available on next-generation smartphones, launched in select regions starting in the second half of 2023."
However, Iridium in a press release today said that "the companies successfully developed and demonstrated the technology; however, notwithstanding this technical success, smartphone manufacturers have not included the technology in their devices."
Given all the hype around satellite messaging on Apple devices, I'm surprised there wasn't much interest from Android smartphone makers! Perhaps the cost of supporting it, in terms of extra hardware and patent licensing, wasn't worth it for the relatively small number of users who might actually use the feature?
---
EDIT: It's not that Android smartphone makers weren't interested in bringing satellite connectivity to their phones; rather, they wanted a "standards-based solution" instead of a proprietary one.
Qualcomm said in a statement to CNBC that smartphone makers have "indicated a preference towards standards-based solutions" for satellite-to-phone connectivity. "We expect to continue to collaborate with Iridium on standards-based solutions while discontinuing efforts on the proprietary solution that was introduced earlier this year," the company said.
---
In any case, Iridium says that, with the termination of its agreement with Qualcomm, it will be "free to directly re-engage with smartphone OEMs, other chipmakers, and smartphone operating system developers that the Company had been collaborating with previously." It will also pursue "new relationships with smart device OEMs, chipmakers, and developers for its existing and future service plans."
Samsung recently announced the global rollout of a neat feature called Temporary Cloud Backup, which lets you store your phone's data on Samsung Cloud free for 30 days. This could be helpful when you need to repair, reset, or change devices.
Many outlets covering the news reported that this feature requires One UI 6, but that's actually not true!
It turns out Temporary Cloud Backup has already been available in select markets since January 2023 as part of Samsung's "Maintenance Mode" (Settings > Battery and Device Care > Maintenance Mode > Temporary Cloud Backup), and since June 2023, it has also been included within the Reset function (Settings > General Management > Reset > Temporary Cloud Backup).
So check your device: you might already have access to this feature! Samsung says the Temporary Cloud Backup feature requires Android 13 or higher, Samsung Cloud app 5.3.0.32 or higher, and Samsung Smart Switch app 3.7.40.4 or higher.
If you don't have this feature yet, you'll have to wait for Samsung's full global rollout of the feature with the One UI 6 update.
ML Kit's new subject segmentation API is now in beta. This API can be used to separate multiple "subjects" from the background in a picture.
"Subjects" are the most prominent people, pets, or objects in the foreground. This means that if 2 subjects are very close or touching each other, they are considered a single subject.
The API takes a static input image and generates an output mask or bitmap for the foreground as well as a mask and bitmap for each of the subjects detected.
Images are processed on-device. Google says the average latency measured on a Pixel 7 Pro is around 200ms.
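Here's roughly what calling the beta API looks like in Kotlin. The class and option names below reflect my reading of the ML Kit subject segmentation docs, so double-check them against the artifact you actually pull in:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.segmentation.subject.SubjectSegmentation
import com.google.mlkit.vision.segmentation.subject.SubjectSegmenterOptions

fun segmentSubjects(bitmap: Bitmap) {
    val options = SubjectSegmenterOptions.Builder()
        .enableForegroundBitmap() // one cut-out bitmap containing all subjects
        .enableMultipleSubjects(
            SubjectSegmenterOptions.SubjectResultOptions.Builder()
                .enableSubjectBitmap() // plus one bitmap per detected subject
                .build()
        )
        .build()

    val segmenter = SubjectSegmentation.getClient(options)
    segmenter.process(InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0))
        .addOnSuccessListener { result ->
            // The combined foreground cut-out, if enabled above.
            val foreground: Bitmap? = result.foregroundBitmap
            result.subjects.forEach { subject ->
                // Each subject carries its own bitmap/mask plus its position
                // (startX/startY/width/height) within the input image.
                Log.d("Segmentation", "Subject at (${subject.startX}, ${subject.startY})")
            }
        }
        .addOnFailureListener { e -> Log.e("Segmentation", "Segmentation failed", e) }
}
```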
"Subjects" are the most prominent people, pets, or objects in the foreground. This means that if 2 subjects are very close or touching each other, they are considered a single subject.
The API takes a static input image and generates an output mask or bitmap for the foreground as well as a mask and bitmap for each of the subjects detected.
Images are processed on-device. Google says the average latency measured on a Pixel 7 Pro is around 200ms.
In Android 14, SystemUI's ControlsActivity can now be launched by any other app, meaning you can configure the Pixel's Quick Tap action to launch the Device Controls interface with a double tap on the back of your phone!
You have to install an app to launch Device Controls, though, since Quick Tap by default doesn't include an action to launch it.
Here's the commit (h/t Kieron Quinn) that made the ControlsActivity exported, and here's a simple app you can use to launch the Device Controls activity via Quick Tap.
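For reference, the trampoline app amounts to a few lines like the sketch below; the class name is a placeholder, and the SystemUI component path is my reading of AOSP, so verify it against your build. Point Quick Tap's "Open app" action at this app and it immediately forwards to Device Controls.

```kotlin
import android.app.Activity
import android.content.ComponentName
import android.content.Intent
import android.os.Bundle

// Tiny trampoline: launches SystemUI's now-exported Device Controls UI, then exits.
class LaunchDeviceControlsActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        startActivity(Intent().apply {
            component = ComponentName(
                "com.android.systemui",
                "com.android.systemui.controls.ui.ControlsActivity"
            )
            addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
        })
        finish()
    }
}
```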
Here's an informal performance comparison between Google's libgav1 and VideoLAN's libdav1d, two open-source, CPU-based AV1 decoders.
My test device was a Pixel 3 XL with Qualcomm's Snapdragon 845, running AOSP Android 13. I played back two videos encoded in AV1: a 6m30s clip of the Artemis I launch at 4K60 (8-bit, 1165 kb/s) and a 9m11s clip from Netflix called "Sparks" at 1080p25 (10-bit, 915 kb/s).
To test libgav1, I used a media player called "Just (Video) Player" which uses Android's ExoPlayer library. To test libdav1d, I used VLC for Android which uses libvlc. The app I used to record framerate data was TakoStats.
Google says the new optimizations in the ART 14 release have resulted in an average of 50-100MB in storage savings per device, thanks to a 9.3% reduction in optimized code size, without impacting performance. The ART 14 release is available on all devices running Android 12 and later via a Google Play System Update to the ART Mainline module.
When taking into account how many devices the ART Mainline module is installed on, Google says these optimizations save 47-95 petabytes globally. This is because "an average phone can have 500MB-1GB in optimized code" and these optimizations reach over 1 billion devices.
ART, short for the Android Runtime, is responsible for executing the Dalvik bytecode produced from apps and system services written in Java or Kotlin. ART compiles apps from the DEX format to native code using dex2oat, which parses the DEX code and generates an Intermediate Representation (IR). Dex2oat then "performs a number of code optimizations."
Google's blog post goes into technical detail about the code size optimizations they made and tested "on over half a million APKs present in the Google Play Store".
Messaging apps that are locked to specific devices are cringe. Mr. Cook, tear down this wall!
In all seriousness, it's interesting to see solutions like Nothing Chat gain more traction, because I wonder just how Apple will respond. Before, it was just a couple of independent software startups (Sunbird, Beeper) doing this, but now an actual hardware maker (Nothing) is stepping in with a first-party iMessage-on-Android solution for its own devices (I realize the irony in my first sentence).
Nothing Chat uses Sunbird's architecture, of course, so you have to trust them with access to your iCloud account. (IIRC, they basically just relay iMessage from a Mac mini farm somewhere, which is why it needs your iCloud account.)
I've never bothered setting up Sunbird or Beeper (just haven't had the time), but I might give Nothing Chat a shot once it's out. Nothing says it'll be available for Phone (2) users in North America, the EU, and other European countries starting Friday, November 17.