
Android 12 Developer Preview 3 is the last update before the beta next month at Google I/O


Today Google released the third and final version of the Android 12 Developer Preview. Next month at Google I/O we should see the first beta, which should bring even more changes. Highlights from today's release are below.


Improved web linking - In Android 12 we’re making some changes to help users get to their content faster and more seamlessly. First, we’ve changed the default handling of links that aren’t verified through Android App Links or manually approved for links by the user. Now the OS will directly open them in the default browser, rather than showing a chooser dialog. To make it easier for users to approve your app for links, we’ve added a new Intent that takes them to “Open by default” in Settings. If you want to ensure that only your app can handle links from your domain, you can use App Links. We’ve added new adb commands to help you configure and test your links. More here.
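As a sketch of that new Settings hook, an app could send users straight to its "Open by default" page with the `ACTION_APP_OPEN_BY_DEFAULT_SETTINGS` action described in the Android 12 docs (the helper function name here is my own):

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri
import android.provider.Settings

// Sketch: jump to this app's "Open by default" screen (Android 12+).
fun openByDefaultSettings(context: Context) {
    val intent = Intent(
        Settings.ACTION_APP_OPEN_BY_DEFAULT_SETTINGS,
        Uri.parse("package:${context.packageName}")
    )
    context.startActivity(intent)
}
```

On the tooling side, the Android 12 docs describe `pm` subcommands such as `adb shell pm get-app-links <package>` for inspecting an app's link verification state.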


Rich haptic experiences - We’re expanding the tools we offer for creating informative haptic feedback for UI events, immersive and delightful effects for gaming, and attentional haptics for productivity. We’ve added expressive effects like low tick that take advantage of the broader frequency bandwidth of the latest actuators. Game developers can now access multiple, different actuators independently in game controllers to deliver the same effect synchronously or different haptic effects on multiple actuators. For developers, we recommend using the constants and primitives as building blocks for rich haptic effects - constants to enhance UI events and haptic composer to sequence primitives for more complex effects. You can try these APIs to the fullest on Pixel 4 devices today, and we’re continuing to work with our device-maker partners to bring the latest in haptics support to users across the ecosystem.
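A minimal sketch of composing primitives into one effect, assuming the `VibrationEffect.Composition` API and the Android 12 `PRIMITIVE_LOW_TICK` constant (the function name and scale/delay values here are illustrative):

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator

// Sketch: sequence a soft low tick into a full-strength click,
// with a 100 ms delay before the second primitive (Android 12+).
fun playComposedHaptic(context: Context) {
    val vibrator = context.getSystemService(Vibrator::class.java)
    val effect = VibrationEffect.startComposition()
        .addPrimitive(VibrationEffect.Composition.PRIMITIVE_LOW_TICK, 0.5f)
        .addPrimitive(VibrationEffect.Composition.PRIMITIVE_CLICK, 1.0f, 100)
        .compose()
    vibrator.vibrate(effect)
}
```

Since primitive support varies by device, you'd want to check availability (e.g. via the `Vibrator` support-query APIs) before relying on a given primitive.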


Video encoding improvements - Android 12 standardizes the set of keys for controlling the range of the video Quantization Parameters (QP), allowing developers to avoid vendor-specific code. The new keys are available in the MediaFormat API and also in the NDK Media library. Video encoders must specify a minimum video quality threshold to ensure that users don't experience extremely low quality when videos are complex.
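A sketch of what the standardized keys look like in use, assuming the `KEY_VIDEO_QP_MIN`/`KEY_VIDEO_QP_MAX` names from the Android 12 `MediaFormat` docs (the numeric values are illustrative; lower QP means higher quality):

```kotlin
import android.media.MediaFormat

// Sketch: clamp the encoder's QP range with the standardized keys (Android 12+),
// instead of vendor-specific format keys.
fun applyQpRange(format: MediaFormat) {
    format.setInteger(MediaFormat.KEY_VIDEO_QP_MIN, 10)
    format.setInteger(MediaFormat.KEY_VIDEO_QP_MAX, 40)
}
```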


Camera2 vendor extensions - Many of our device manufacturer partners have built custom camera effects—such as bokeh, HDR, night mode, and others—that they want apps to use to create differentiated experiences on their devices. We’ve already supported these custom effects through a set of vendor extensions in our CameraX library, and now in Android 12 we’re exposing the vendor extensions directly in the platform as well. This helps apps that have complex Camera2 implementations to take advantage of the extensions without having to make significant changes to legacy code. The extension APIs expose exactly the same set of effects as in CameraX, and those are already supported on many different devices, so you can use them right out of the box. More here.
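A sketch of querying the new platform extension support, assuming the Android 12 `CameraExtensionCharacteristics` API (the helper name is my own):

```kotlin
import android.content.Context
import android.hardware.camera2.CameraExtensionCharacteristics
import android.hardware.camera2.CameraManager

// Sketch: check whether a camera supports the vendor Night extension (Android 12+).
fun supportsNightMode(context: Context, cameraId: String): Boolean {
    val manager = context.getSystemService(CameraManager::class.java)
    val extChars = manager.getCameraExtensionCharacteristics(cameraId)
    return CameraExtensionCharacteristics.EXTENSION_NIGHT in extChars.supportedExtensions
}
```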


Quad bayer camera sensor support - Many Android devices today ship with ultra high-resolution camera sensors, typically with Quad / Nona Bayer patterns, and these offer great flexibility in terms of image quality and low-light performance. In Android 12, we’re introducing new platform APIs that let third-party apps take full advantage of these versatile sensors. The new APIs support the unique behavior of these sensors and take into account that they might support different stream configurations and combinations when operating in full resolution or ‘maximum resolution’ mode vs ‘default’ mode.
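As a sketch of opting into full-resolution output, assuming the `SENSOR_PIXEL_MODE` capture key from the Android 12 Camera2 docs (the session itself would also need to be configured with a maximum-resolution stream):

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CaptureRequest

// Sketch: ask a Quad Bayer sensor to capture at full sensor resolution (Android 12+)
// instead of its binned 'default' mode.
fun requestMaxResolution(builder: CaptureRequest.Builder) {
    builder.set(
        CaptureRequest.SENSOR_PIXEL_MODE,
        CameraCharacteristics.SENSOR_PIXEL_MODE_MAXIMUM_RESOLUTION
    )
}
```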


Faster machine learning - In Android 12, we invested in key areas so that developers can make the most of ML accelerators and always get the best possible performance through the Neural Networks API. In terms of performance improvements - we have more than halved inference call overhead by introducing improvements such as padding, sync fences and reusable execution objects. We’ve also made ML accelerator drivers updatable outside of platform releases, through Google Play services. This will make it easier for developers to take advantage of the latest drivers on any compatible device, and make sure that ML performance improvements and bug fixes reach users faster than ever before.


Standardizing GPU compute - We are deprecating the RenderScript APIs in favor of cross-platform GPU compute solutions such as Vulkan and OpenGL. We want you to have confidence that your high-performance workloads will run on GPU hardware, and many devices are already shipping with only CPU support for RenderScript. The existing APIs will continue to work for the time-being, and we've open-sourced a library for RenderScript intrinsics such as blur that uses the highly-optimized intrinsics platform code. Samples and a migration guide for using Vulkan to implement image processing are also available. More here.
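For a feel of the replacement library, here is a sketch assuming the `Toolkit` API from Google's open-sourced renderscript-intrinsics-replacement-toolkit (package and function names taken from that repo; the radius value is illustrative):

```kotlin
import android.graphics.Bitmap
import com.google.android.renderscript.Toolkit

// Sketch: blur a bitmap with the open-sourced intrinsics toolkit,
// replacing the deprecated ScriptIntrinsicBlur.
fun blur(input: Bitmap): Bitmap = Toolkit.blur(input, 15)
```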


May 18th can't come soon enough. Google just gave us our last glimpse of what they have been working on, and I'm really hyped for it!


Link(s)

Google

