Posted by Yi Yang (Software Engineer)

It's important to stay on top of your app's performance to make sure your users can easily use your app. When an app experiences issues such as animation jank, frozen frames, and high memory usage, it negatively impacts the user experience, which can lead to lower ratings or app deletion. To fix these performance issues, we first need the right tools to measure app performance correctly.
The debug build lets you use features that are helpful during development, like Apply Changes, working with the debugger, or the Database Inspector. In addition, it enables profiling tools to inspect the state of a running app in ways that are unavailable in the release build.
Under the hood, the debug build sets the debuggable flag in the Android manifest to true.
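For context, this is roughly what the merged manifest of a debug build contains. You normally never write this attribute yourself; the Android Gradle Plugin sets it automatically for the debug build type:

```xml
<!-- Set automatically by the build system for debug builds -->
<application android:debuggable="true">
    <!-- activities, services, and other app components -->
</application>
```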
While useful, the debug build is meant to provide extra information at the cost of performance. That's because when debuggable is true, many compiler optimizations are turned off.
To show you the performance difference between the debug and release builds, we recorded an app running on the same device in these two build variants. To visualize frame rendering time, we turned on Profile GPU Rendering (or Profile HWUI rendering on some Android versions) in Developer Options while recording the screen. Each vertical bar at the bottom of the screen represents how long each frame takes to render. The shorter these bars are, the smoother the animation is.

The screen recording below shows the same app running on the same device. The left-hand side is a debug build, the right-hand side a release build. The debug version has more stuttering frames, also known as UI jank. This means that when you profile the debug build, you may see timing measurements significantly different from what your users see in the release build, and you may end up optimizing something that isn't the real problem.
To address that issue, the Android platform introduced a manifest tag called profileable. It enables many profiling tools that measure timing information, without the performance overhead of the debug build. Profileable is available on devices running Android 10 or higher.
AndroidManifest.xml
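The declaration itself is a single element inside `<application>`. As documented for the `<profileable>` manifest element, it looks like this:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <application>
        <!-- Allows profiling tools (and the adb shell) to profile this build on Android 10+ -->
        <profileable android:shell="true" />
    </application>
</manifest>
```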
Let's look at another screen recording. This time, the left side shows a profileable release app and the right side an unmodified release app. There's little performance difference between the two.

With profileable, you can now measure timing information much more accurately than with the debug build.
This feature is designed to be usable in production, where app security is paramount. Therefore, we decided to support only profiling features such as Callstack Sampling and System Trace, where timing measurement is critical. The Memory Profiler only supports Native Memory Profiling. The Energy Profiler and Event Timeline are not available. The complete list of disabled features can be found here. All these restrictions are in place to keep your app's data safe.
Now that you know what the profileable tag does, let me show you how to use it. There are two options: automatic and manual.
Option 1: Use the option in Android Studio.
With Android Studio Flamingo and Android Gradle Plugin 8.0, all you need to do is select "Profile with low overhead" from the Profile dropdown menu in the Run toolbar. Android Studio will then automatically build a profileable app of your current build type and attach the profiler. This works for any build type, but we highly recommend profiling a release build, which is what your users see.
When a profileable app is being profiled, there is a visual indicator along with a banner message. Only the CPU and Memory profilers are available.
In the Memory Profiler, only the native allocation recording feature is available, for security reasons.
This feature greatly simplifies the process of local profiling, but it only applies when you profile with Android Studio. It can therefore still be helpful to manually configure your app if you want to diagnose performance issues in production, or if you're not ready to use the latest versions of Android Studio and the Android Gradle Plugin yet.
Option 2: Manual configuration.

It takes four steps to manually enable profileable.

1. Add this line to your AndroidManifest.xml.
AndroidManifest.xml
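The line in question is the `<profileable>` element, placed inside `<application>` as shown in the Android documentation:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <application>
        <!-- Marks the app as profileable without making it debuggable -->
        <profileable android:shell="true" />
    </application>
</manifest>
```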
2. Switch to the release build type (or any build type that is not debuggable).
3. Make sure you have a signing key configured. To avoid compromising your release signing key, you can temporarily use your debug signing key, or configure a new key just for profiling.
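A dedicated profiling key could be wired up in Gradle roughly like this. This is a sketch, not the official setup: the keystore path, alias, and passwords are placeholders you would replace with your own values.

```kotlin
// build.gradle.kts (app module) — hypothetical signing config just for profiling
android {
    signingConfigs {
        create("profiling") {
            storeFile = file("profiling-keystore.jks") // placeholder keystore
            storePassword = "change-me"                // placeholder password
            keyAlias = "profiling"                     // placeholder alias
            keyPassword = "change-me"                  // placeholder password
        }
    }
    buildTypes {
        release {
            // Sign the release build with the profiling key instead of
            // the real release key while you measure performance.
            signingConfig = signingConfigs.getByName("profiling")
        }
    }
}
```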
4. Build and run the app on a device running Android 10 or higher. You now have a profileable app. You can then attach the Android Studio profiler by opening the Profiler tool window and selecting the app process from the dropdown list.
In fact, many first-party Google apps such as Google Maps ship to the Play Store as profileable apps.

Here's a table that shows which build type should be used:
To learn more about profileable builds, start by reading the documentation and the user guide.

With these tools provided by the Android team, we hope you can make your app run faster and smoother.