
[animation-frame-rate] Permission Prompt API for power-hungry animations (including TestUFO) (Also improves security) #89

Open
mdrejhon opened this issue May 3, 2022 · 3 comments

mdrejhon commented May 3, 2022

Suggestion: Add a Permission API for high-frame-rate

Purpose: To allow users to permit/deny higher-framerate animations to save battery power.

I suggest a permissions-based API for high-rate animations that applies equally to all animations (WebGL, requestAnimationFrame(), Web Animations API). The popup could be similar to the "may I use the microphone?" or the "may I have your location?" permission popups.

A site would make a call requesting high-rate animation, and the browser would then pop up the permission request. The user could intentionally permit it for specific sites (e.g. TestUFO).

This would be an agnostic permission-request API that applies to everything above 60 Hz -- yes, even including the Web Animations API, which would also give that API time to improve and reach animation-info parity with requestAnimationFrame().
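To make the shape of the idea concrete, here is a minimal sketch, assuming a hypothetical "high-frame-rate" permission name and a promise-based request call (neither exists today; the names are purely illustrative):

```js
// Hypothetical permission name and request() call -- illustrative only.
// Modeled on the shape of existing permission-gated web APIs; nothing
// like navigator.permissions.request() or 'high-frame-rate' exists today.
async function enableHighFrameRate() {
  let granted = false;
  try {
    const status = await navigator.permissions.request({ name: 'high-frame-rate' });
    granted = (status.state === 'granted');
  } catch (e) {
    // Browsers without the hypothetical permission: behave as they do today.
  }
  // startAnimationLoop() is a hypothetical site-side function.
  startAnimationLoop({ maxRate: granted ? 'display' : 60 });
}
```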

Also, new video websites are now appearing that allow 120fps 120Hz real-time HFR video (High Frame Rate), not to mention long-term future UltraHFR (e.g. 480fps on 480Hz displays), at www.blurbusters.com/ultrahfr -- the Christie Digital 4K 120Hz DLP E-Cinema projector for theatres now has a 240Hz and 480Hz mode for ultra-high-rate uses, and Formula 1 racing currently uses Ultra HFR internally for some purposes. But some users don't want to eat up extra battery if YouTube later adds 120fps 120Hz support; phones where the user denies 120 Hz would simply display every other frame instead to save battery.

From a download of the WebKit source code, enabling 120Hz requestAnimationFrame() was only a 1-line change. Since this is such a simple change to requestAnimationFrame() -- and given the long migration window required for a full Internet-wide transition to Web Animations API -- a permission prompt is a practical way to gate it in the meantime.

The permission request MUST be made by the root window (not an iframe), which eliminates the ability for ad banners to get a higher frame rate than the root window.

Please note that TestUFO is capable of running in an IFRAME, for displaying animations inside a webpage, so there should be a method to cascade the permission to other authorized IFRAMEs. The root webpage that is successfully authorized for HFR animations SHALL filter that permission down to all authorized origins (e.g. same-origin or whitelisted URLs -- for example, blurbusters.com embedding the affiliated testufo.com, such as the TestUFO animations running in IFRAMEs inside multiple areas of https://blurbusters.com/1000hz-journey). Also consider that there is now an undocumented method of real-time YouTube 120fps HFR video on 120Hz displays (real time, not slow motion) via a hidden trick, and it's theoretically possible Google may make this official as a frame rate selector (30, 60, 120) in the future once 120Hz becomes more widespread. So we have to pre-plan the permission-filter-down considerations of IFRAME support of HFR, as sketched below.
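One possible shape for that cascade -- a sketch assuming the permission is delegated to embedded frames the same way existing powerful features are delegated via the iframe allow attribute; the "high-frame-rate" feature name is hypothetical:

```js
// Hypothetical "high-frame-rate" policy-controlled feature, delegated to
// embedded frames the same way camera/microphone are delegated today via
// the iframe allow attribute. The feature name is invented for illustration.
const frame = document.createElement('iframe');
frame.src = 'https://www.testufo.com/ghosting';
// Only explicitly listed origins inherit the parent page's HFR grant.
frame.allow = "high-frame-rate 'self' https://www.testufo.com";
document.body.appendChild(frame);
```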

(Note: Browsers could default this to a universal allow on desktop computers, and to "Ask for Permission" on mobile devices, with the setting exposed for advanced users who like advanced settings.)

As a bonus, it could provide a route for degraded animation precision by default (making it harder to leak refresh rate information, if there was ever a security implication of that), with full refresh-cycle precision -- accessible to TestUFO, for example -- only after the permission is granted. So it also provides a route for removing security implications without breaking existing sites, since the 2ms timer fuzz added for Meltdown/Spectre actually interferes with animation precision on 240Hz displays. See the problem of being unable to roll back a security mitigation? My suggestion solves that problem!

Note: Formerly TestUFO was only used on desktop computers, but it has slowly been improved to be more mobile-compatible (it works very well on Android phones and 60Hz iOS devices). A new TestUFO update will pass all mobile compatibility settings. With the big boom of high-Hz mobiles, many users use TestUFO to check the refresh rate of their phone, such as at www.testufo.com/refresh-rate .

I think a permission API is necessary even for the Web Animations API (for battery saving) -- which makes it a natural fit for the 1-line requestAnimationFrame() mod (to enable 120Hz support).

Background Information on Rationale

TestUFO is one of the world's most popular high-refresh-rate requestAnimationFrame() websites.

www.testufo.com is visited by over 1 million users per month who run at refresh rates above 60 Hz.

Some background information about why TestUFO is so popular with hundreds of other content creators:
There are over 30 selectable display tests at the upper-right corner, plus customizable settings for each of them, which can generate millions of different educational tests -- as disparate as www.testufo.com/eyetracking#speed=-1, or elaborately constructed tests comparing www.testufo.com motion blur with the motion blur of https://www.testufo.com/blackframes#count=3&bonusufo=1, or a custom-adjusted scientific pursuit camera animation at https://www.testufo.com/ghosting#background=004040&separation=240&pps=720&graphics=bbufo-tinytext-markers.png&pursuit=1&fullscreen=1 ... The flexibility of TestUFO parameters allows the creation of new TestUFO tests that I have not personally anticipated -- my users are creative! Some tests require camera capture with refresh-cycle precision, e.g. www.testufo.com/frameskipping (google "frameskipping test for displays" for the thousands of content creators and commentators that use the test).

It is used by hundreds (possibly thousands) of YouTubers, bloggers, and review sites -- some as prestigious as TomsHardware, CNet, and RTINGS -- and by the scientific researchers listed at www.blurbusters.com/area51; TestUFO is cited in more than 20 peer-reviewed science papers because of its scientifically accurate refresh rate detection and frame-perfect animations.

Years ago, I developed heuristics to detect the refresh rate from the tick-tock of requestAnimationFrame(), and they successfully work on all major browsers including Chrome, Firefox, Opera, Edge, and even Android browsers (on high-Hz phones like the Razer).
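For illustration, a simplified sketch of that kind of heuristic (not the actual TestUFO code): average the requestAnimationFrame() callback-to-callback intervals over many frames, discarding outliers caused by skipped frames.

```js
// Simplified refresh-rate estimator (illustrative, not the TestUFO source).
// Averages rAF callback intervals, ignoring outliers from skipped frames.
function estimateRefreshRate(onEstimate, sampleCount = 240) {
  const deltas = [];
  let last;
  function tick(now) {
    if (last !== undefined) deltas.push(now - last);
    last = now;
    if (deltas.length < sampleCount) {
      requestAnimationFrame(tick);
      return;
    }
    const sorted = deltas.slice().sort((a, b) => a - b);
    const median = sorted[sorted.length >> 1];
    // Keep only intervals close to the median (drop skipped/delayed frames).
    const good = deltas.filter(d => Math.abs(d - median) < median * 0.2);
    const avgMs = good.reduce((s, d) => s + d, 0) / good.length;
    onEstimate(1000 / avgMs); // estimated Hz
  }
  requestAnimationFrame(tick);
}

// Usage: estimateRefreshRate(hz => console.log(hz.toFixed(3) + ' Hz'));
```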

TestUFO requires:

  1. Ability to detect refresh rates. This is done by a heuristical formula.

  2. Ability to detect if refresh rate is accurate or faked. And ability to detect refresh rate changes in real-time. This is done by a heuristical formula, combined with a simultaneous useragent blacklist & whitelist

  3. Ability to reliably self-detect stutters. This is done by comparing the timestamp of the current frame against the expected frame time. This is the only way to produce science papers, by guaranteeing that a stutter was detected (and a message displayed). This is very important for frameskipping tests as well as the "display motion blur" pursuit camera that I co-authored a peer-reviewed paper about with NIST.gov, Nokia, and Keltek -- an indie invention that turns a $30,000 science rig into something that can be tested by end users -- and it is used by dozens of YouTubers (info: www.blurbusters.com/pursuit-camera) and is a major part of RTINGS.com tests as well as other 10-million-viewer websites/youtubers.

  4. Combined, all the websites that use Blur Busters inventions (including the TestUFO website that uses requestAnimationFrame() ...) are viewed by over 250 million viewers per month, as Blur Busters is an incubator of indie display-testing inventions for the industry. Even if users don't view TestUFO directly, the reviewers themselves and the scientists themselves trust & use TestUFO. Anybody in esports or high-end computing recognizes the UFO -- anytime the UFO is used on any other website, that's their use of one of the many Blur Busters testing inventions.

I am very concerned that the Web Animations API does not satisfy the requirements of the scientists I have co-authored the peer-reviewed papers with at www.blurbusters.com/research-papers

Yes, there are long-term plans to port TestUFO

Although there are long-term plans to port TestUFO to the Web Animations API while keeping the requestAnimationFrame() fallback -- there are some rather serious issues in the Web Animations API with regard to bullet 3. It's an excellent standard otherwise, but early indications are that it needs to be further improved, because it is more opaquely/abstractly layered than requestAnimationFrame().

It was measured that if the requestAnimationFrame() callback exits within 50% of a refresh cycle, the frame is guaranteed to be displayed (no stutter) -- at least when tested in all Chrome-engined browsers on all GPU-accelerated platforms. TestUFO was designed bare-to-metal (hand-optimized JavaScript in many places) to allow it to perform the draw within 1%-10% of a refresh cycle on most machines, and it self-detects when drawtime start / drawtime completion vary away from the expected estimated refresh time; statistical/heuristical analysis has successfully honed the animation formula over the long term to become the world's most accurate refresh-cycle-pacing animation website.

Currently, plan B is to run a degraded Web Animations API implementation on iOS devices, while keeping the legacy requestAnimationFrame() for the reviewers / youtubers / bloggers / researchers / scientists using other devices. Tests may be run concurrently (in beta browsers) on the Web Animations API and the requestAnimationFrame() API to see if that is a possible solution, since the tick-tock of "every-other-refresh-cycle" might still allow me to do stutter detection on the Web Animations API.

Although more testing is needed to determine workarounds for the Web Animations API's opaqueness, there are still many years left in the legacy requestAnimationFrame() API for the needs of TestUFO, which is used by creators that (combined, totalled) are viewed by hundreds of millions.

For example, LinusTechTips (>14M subscribers) uses TestUFO to test their displays -- and this is just one entity of the hundreds of entities that use Blur Busters inventions. A very small curated sample of reviewers is listed at www.blurbusters.com/reviewers-using-pursuit-camera -- but also don't forget the peer-reviewed scientists/researchers at www.blurbusters.com/research-papers -- research papers at several universities and several television manufacturers cite my TestUFO multiple times.

Even computer monitor manufacturers now put my UFO logo on their monitors -- e.g. https://www.viewsonic.com/global/products/lcd/XG2431 -- scroll down 1 page to see the Blur Busters logo.

It is true there is an outsized nuts-and-bolts, behind-the-scenes influence on the industry -- so please do not deprecate high-Hz support from requestAnimationFrame() for Apple devices. Yet.

This means a long deprecation window is required.

Therefore, even before full deprecation, 120Hz support should be added to requestAnimationFrame() until Web Animations API satisfies the ability to self-detect stutters (e.g. immediately detect when animations aren't perfectly framepaced in sync with refresh rate). Please consider this.

Known Security Problem That Permission Prompt Can Solve

This is why I post this "permission-prompt" suggestion as a universal compromise that catches all of the Web Animations API and requestAnimationFrame().

Known Potential Security Issue (browser fingerprinting): Refresh rate is controlled by the GPU clock chip, which filters back to requestAnimationFrame() and the Web Animations API, as well as WebGL. Different GPUs can be fingerprinted down to a precision of 5 or 6 digits after a 5-to-30-minute wait at www.testufo.com/refreshrate#digits=8, as different GPUs vary slightly in Hz (clock chip), and this is fingerprintable even in incognito mode. Run that test longer, and you will see different computers settle at different exact values at specific computer temperatures (clock drift); sometimes an exact Hz can help increase the certainty that a specific computer is a specific computer, because of clock-chip differences.

  1. Launch www.testufo.com/refreshrate#digits=8
  2. The longer it stays up, the more accurate the refresh rate fingerprint becomes (and distinguishes between multiple computers).
  3. This can essentially run in the background. The easily-readable JavaScript code at www.vsynctester.com is capable of running unnoticed in a 1-pixel element, collecting refresh rate information in the background for users who linger on a website for a sustained time.

The heuristics ignore fluctuations caused by background processing & skipped frames, so it's amazingly more fingerprintable than expected. Some systems will stabilize on the 6th or 7th digit, other systems will stabilize on the 4th or 5th decimal digit -- the heuristics can also estimate the error margin too (eventually).


Background apps and tabs have nearly no effect on the TestUFO refresh rate heuristics for the number displayed on that page. This is startling.

Run long enough, it is apparently sensitive enough to easily detect a 1us (1 microsecond) difference in clock chips -- and sometimes it fingerprints even more accurately, to 10ns or even 1ns (nanosecond) -- since chip manufacturing tolerances mean two phones won't run in atomic-clock-precision sync with each other. This is already measurable and fingerprintable via all Hz-synced animation APIs (Web Animations API, requestAnimationFrame, WebGL, etc).

Nanosecond-league GPU fingerprinting precision possible on some systems by heuristics

On the most precise computer systems, where www.testufo.com/refreshrate#digits=8 stabilizes at roughly the 7th digit, 143.9999999 Hz versus 144.0000000 Hz is ... a 1 nanosecond difference in refresh rate.


Even on systems that only stabilize on the 5th digit, that's still a sub-100 nanosecond precision (less than 0.1 microsecond precision). That's still potentially enough precision to target a specific GPU in a specific corporation, which can be valuable for hackers.

On some systems, the randomness has so much noise you can't target a specific device. But certain GPUs have very precise clock rates that don't vary enough to be fingerprinting-proof over 10 minutes -- and different computers will tick at slightly different clock rates. One GPU may tick at 1.5033349 GHz while another factory specimen of the exact same GPU chip in the same model of device may tick at 1.5033205 GHz, due to manufacturing tolerance differences. Subtle differences can mean one GPU outputs a refresh rate of 59.9400127 Hz and another GPU outputs a refresh rate of 59.9399983 Hz. This is 100% detectable in a web browser via the above instructions.

This is GPU fingerprinting, not CPU fingerprinting

In my tests, I have found that refresh rate fingerprinting is much more accurate than I had feared. The hunger to make animations Apple-perfect-smooth necessarily leaks data. Measure long enough -- from first to last refresh cycle over an hour-long period -- and you fall into potentially nanosecond fingerprinting precision. There are other ways to fingerprint via the CPU, but this is GPU fingerprinting -- an additional chip to fingerprint separately from the CPU (since the GPU controls the refresh rate, and the refresh rate is a math divisor of the GPU clock rate).

But on the software side, there are billions of legacy GPUs and legacy fixed-Hz screens leaking information to hackers today as we speak, information that might be used for extortion, war, corporate espionage, etc. Therefore, a software mitigation is needed: my permission API suggestion.

This can help fingerprinting even in Private/Incognito mode (e.g. distinguishing between two identical smartphones, at least some of the time -- and combined with other supplemental fingerprinting information, it improves overall fingerprinting accuracy). At least when the phone is running cool, and both phones are in rooms at roughly the same temperature. Temperature can affect GPU clock-speed differences, so an 18C room with a low-power app (no phone thermals) versus a 21C room can create a fractionally different refresh rate; this is usually within the error margin.

As workers go back to headquarters and corporate offices, back to stable temperatures, the refresh rate fingerprinting security issue becomes bigger, because of more consistent room temperature (corporate thermostats), increasing the likelihood of specific-device-in-corporation fingerprinting.

A potential bad actor would factor this in, too. Many people already know the thermostat setting at many corporate headquarters, so for a visitor from, say, ACME, two identical phones at ACME (e.g. an ACME-issued Pixel 6) will have stable clock generators. And if geolocation (if permitted) shows a corporate headquarters, the device is likely indoors at that temperature. Fingerprinting accuracy skyrockets, and it may be possible to slowly trace to the CEO's phone (over a long period) with some help from ultra-tiny refresh rate differences, increasing the likelihood of attempted targeted hackings.

(Or, more mundane: maybe just more annoying popup ads targeted at a specific user who enabled Private browsing....)

Possible Mitigation Measures

There are hardware-side and software-side mitigations possible.

Graphics driver mitigation: Use the VRR mechanism to simulate fixed-Hz that is slightly timer-fuzzed. One mitigation is to intentionally vary the refresh rate in 0.001Hz increments (so a 59.94 Hz screen could intentionally vary from 59.93999 Hz through 59.94001 Hz to prevent refresh rate fingerprinting). This works perfectly on VRR technologies such as ProMotion, FreeSync, GSYNC, and VESA Adaptive-Sync screens -- using VRR to emulate fixed-Hz with an intentional, imperceptible anti-fingerprinting inaccuracy.

Silicon mitigation: Future GPU silicon (by NVIDIA, AMD, Intel, Apple, etc) could intentionally vary the GPU clock rate to disguise the exact manufacturing-specific GPU clock speed, as an anti-fingerprinting measure. Some GPU chips' clock rates vary a lot with temperature, but other GPU chips have unexpectedly stable clocks (especially if VRR is turned off). In fact, turning on FreeSync/GSYNC on a Windows system actually causes fewer digits to stabilize in TestUFO Refresh Rate -- presumably because the refresh rate is then controlled by Present() or glSwapBuffers() instead of by the GPU itself. It is then 100% CPU-driven, and no nanosecond-precise GPU clock-speed fingerprinting is possible anymore via measuring the refresh rate.

Browser code mitigation: The permission API that I suggest here! Limited animation precision could persist by default (e.g. 2ms of intentional animation timestamp fuzz) unless the permission is granted, at which point full precision occurs. A 2ms imprecision in animations doesn't seem to affect fluidity on 60Hz displays, for example, but many browsers use more precise timestamps today to avoid breaking things for high-Hz users. If a permission API is added, additional timer fuzz can safely be added to animations to improve security and make fingerprinting harder. The full animation timestamp precision (a fingerprint risk), the battery-hog factor, and the full animation rate would only be allowed if the user intentionally allows HFR animations in the permission prompt, as illustrated below.
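A rough sketch of the idea, written as page-side JavaScript purely for readability (the real mitigation would be implemented inside the browser engine; drawFrame() and the permission flag are assumptions):

```js
// Conceptual illustration only -- a real mitigation would live inside the
// browser engine, not in page script. Quantize/jitter animation timestamps
// unless the (hypothetical) HFR permission has been granted.
const FUZZ_MS = 2;                 // intentional coarseness when not granted
let hfrPermissionGranted = false;  // assumption: set from the permission prompt

function fuzzedTimestamp(preciseNow) {
  if (hfrPermissionGranted) return preciseNow;        // full precision
  const jitter = (Math.random() - 0.5) * FUZZ_MS;     // +/- 1ms of noise
  return Math.round((preciseNow + jitter) / FUZZ_MS) * FUZZ_MS;
}

function loop(now) {
  drawFrame(fuzzedTimestamp(now));  // drawFrame() is a hypothetical routine
  requestAnimationFrame(loop);
}
requestAnimationFrame(loop);
```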

Universal Compromise That Also Improves Security

By adding a permission API, this provides a route for reducing information-leakage (e.g. refresh rate fingerprinting), while not denying today's use cases.

Once the permission prompt is added, it eliminates the important reason (power consumption) for withholding the 1-line source code change that adds 120Hz support to requestAnimationFrame() in Safari on iOS. And it restores scientific precision to refresh rate detection.

Then, to improve security industry-wide (harder to fingerprint), larger amounts of intentional timer fuzz could be added to animations, UNLESS the user accepts the permission prompt.

Multiple GPU-fingerprinting-proofing algorithms are theoretically possible -- each with pros and cons. Further research is necessary to determine the ideal algorithm to resist GPU fingerprinting without hugely compromising animation smoothness (e.g. slewing the clocks randomly by imperceptible amounts, and allowing an intentional stutter once every few minutes).

The fuzzing can be disabled when HFR permission is granted, and it'd now framepace perfectly, with low latency, no lag variances, and no dropped frames (assuming no other tabs or background).

More research is needed to risk-analyze this, so this may be an early canary of the future worth thinking through now. As animations become more precise and refresh rates climb ever higher, I anticipate GPU fingerprinting will become achievable in less time. Thus, the security vulnerability of refresh rate fingerprinting may increase over time as tech progresses.

An intentional visit to TestUFO is often trusted (HFR will almost always be permitted intentionally), while a visit to a random https://fakeonlinebank.acme displaying an HFR permission prompt is suspicious. While it's very true that fraudulent games and others could also be an issue, consider that many corporations don't allow games or high-Hz content that might require higher-precision clocks anyway. Also, a system policy could theoretically be used if it ever became a sufficiently big concern.

Fortunately, TestUFO is happy to run framepaced perfectly with intentional "20%-of-Hz" timer fuzz (e.g. a framepacing imprecision of as much as 20% of a refresh cycle). Users who just want to run a 60Hz TestUFO won't be affected by a reasonable amount of intentional Hz-timer fuzzing (using VRR or ProMotion to intentionally vary the refresh rate by imperceptible amounts, as a security improvement that compensates for the known manufacturing behaviors of specific GPUs).

This brings iOS in line with all other platforms (iOS is the lone non-120Hz rAF() holdout) while giving more time to transition to Web Animations API -- related issue at #85

Current iOS devices don't support 120 Hz TestUFO. This item hits three birds with one stone.

However, this issue was created as a separate tracking item because it's a generic useful permission API that affects ALL power-hungry high-framerate APIs, including Web Animations API, including WebGL, and including legacy requestAnimationFrame()

As a trade between end-user needs, scientific animation needs (including TestUFO rAF), industry security needs, and industry battery-saver needs, could the HFR permission prompt allow 120Hz requestAnimationFrame() to be greenlighted by Apple/@smfr for iOS devices?

Is this a brilliant idea that makes it easier to roll back certain security issues, while satisfying many more needs?

@mdrejhon mdrejhon changed the title Permission Prompt API for power-hungry animations (including TestUFO) [animation-frame-rate] Permission Prompt API for power-hungry animations (including TestUFO) May 3, 2022
@mdrejhon mdrejhon changed the title [animation-frame-rate] Permission Prompt API for power-hungry animations (including TestUFO) [animation-frame-rate] Permission Prompt API for power-hungry animations (including TestUFO) (Also improves security) May 4, 2022
@gsnedders gsnedders added the animations useful for animation-related explainers label May 4, 2022
mdrejhon commented Jul 22, 2022

[Moved from #93]

I'm not sure why you qualify Web Animations as a "low-scientific-quality fallback". They should provide identical callback reliability to requestAnimationFrame.

@smfr, we need reliable single-framedrop detection logic (e.g. to know immediately when we see 2 refresh intervals pass instead of 1), to automatically invalidate tests --

Other animation APIs have more callback-timing-jitter than rAF() which is the problem here...
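For clarity, a minimal sketch of the kind of detection logic meant here (illustrative only, not the TestUFO implementation; invalidateTest() in the usage comment is hypothetical): compare each callback interval against the expected refresh interval and flag anything that rounds to two or more intervals.

```js
// Illustrative single-framedrop detector (not the TestUFO source).
// expectedMs would come from a refresh-rate estimator, e.g. 1000/144.
function makeFramedropDetector(expectedMs, onDrop) {
  let last;
  return function check(now) {
    if (last !== undefined) {
      const delta = now - last;
      // Round to the nearest whole number of refresh intervals.
      const intervals = Math.round(delta / expectedMs);
      if (intervals >= 2) {
        onDrop(intervals - 1, delta); // 2 intervals elapsed => 1 dropped frame
      }
    }
    last = now;
  };
}

// Usage inside a rAF loop (invalidateTest is a hypothetical handler):
// const check = makeFramedropDetector(1000 / 144, n => invalidateTest(n));
// function loop(t) { check(t); draw(t); requestAnimationFrame(loop); }
```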

Some tests require instant notification (e.g. change color away from green) immediately upon a framedrop (any divergence away from framerate=Hz), because the scientific researcher is pointing a camera at the screen, with a mandatory requirement of framerate=Hz. Sometimes it's a pursuit camera, like this photo:

TestUFO Ghosting: www.testufo.com/ghosting

This inexpensive DIY moving-camera rig outperformed a $30,000 laboratory piece of equipment, thanks to the temporal test pattern invented by me (but requires mandatory perfect framerate=Hz during the photograph, and instant invalidation upon framedrop)

This DIY rig is now peer-reviewed by the National Institute of Standards and Technology (see my research paper co-authored by a NIST.gov researcher), and consequently is used by many researchers worldwide.

If the photograph contains anything that's not a green shade ("READY") it means something happened (e.g. a frame drop shows as an orange or red), and the photograph can be thrown away, and the scientific test restarted.

Now over 500 content creators (combined audience: 100 million -- RTINGS 9M, LinusTechTips 14M) use this test, because I've made it so inexpensive for them.

Another example, more applicable to consumers, is www.testufo.com/frameskipping, which is used at a lesser scale by display testers to see if a display has frameskipping bugs (e.g. early ViewSonic XG2530s), or when overclocking a display. This is a popular test on the Overclock.net forums and also requires a camera.

There are other tests that require scientific-reliability framedrop detection.

So different use cases are covered by different audiences (from end users through researchers).

The problem is that, at first impression, the callback timing data is more opaque.

Although single-framedrop detection is undocumented (it is not part of any web specification), it is relatively reliable in rAF() but not reliable in other animation APIs.

TestUFO works in all kinds of browsers (smart TV browsers, streaming-device browsers, mobile browsers, and PC browsers), many of which do not currently support other animation APIs but reliably support rAF(). It will take a long time to discontinue rAF(). And TestUFO is an "educate yourself in a single link" experience (e.g. demos like www.testufo.com/persistence are hugely self-educational) without needing to install an executable application. So we're very wedded to using browser architectures for mass appeal and widespread flexibility in all kinds of odd-equipment, odd-testing situations --

Scientifically-reliable framedrop detection is a mandatory part of TestUFO; and increased timer jitter (Meltdown/Spectre or GPU rendering or different precision than rAF) can make this harder, especially during ultratiny refresh intervals;

You can see some hints of the algorithm in action at https://www.testufo.com/animation-time-graph -- disrupt it by doing things like window resizes, simultaneous tabs, etc, and see the yellow/red spikes.

Perhaps in time the APIs will improve; but right now it's still of low scientific quality, because of various factors; many other animation APIs don't provide a sufficiently reliable callback-interval timing stream to correctly detect the refresh rate & use the data to detect framedrops (e.g. missed refresh cycles). Either there is more timing jitter, or it is algorithmically unsynchronized (e.g. half frame rate, third frame rate) with less consistency between web browsers.

As refresh rates go up, the timing precision required goes up, and rAF() has stayed precise while other animation APIs (e.g. WebGL) jitter too much (e.g. 3D rendering pipelining effects, and 3D graphics is slower than 2D, etc), given the emergence of 120Hz-500Hz displays. This is not the only issue though;

Once new animation APIs become available, I will test them more fully for low-jitter suitable for realtime data analysis to detect framedrops (e.g. detecting refresh rate, detecting whether animation is sync'd to refresh rate, and also detecting double-interval and triple-interval events)

Also in our old 120Hz browser tests from almost 10 years ago, we discovered we could still accurately heuristically detect framedrops even if the animation rendering jittered massively from 0% to 80% of a refresh cycle interval (e.g. frame took 0.8/60sec to render on a 60Hz display).

The margins are really tight at 240Hz-500Hz+ (and 1000Hz displays in the lab already -- I helped Microsoft add a registry tweak to Windows Insider build to unlock 1000Hz display support) and rAF() succeeds where other animation APIs fail. TestUFO runs with correct refresh rate detection at 500fps 500Hz (on the new 500Hz display) for all tests on modern AMD and NVIDIA GPUs.

If animation frames slew all over the place without clear interval-doublings, then framedrop detection logic fails. 500Hz means 2ms per refresh cycle, and detecting the difference between a 2ms (normal) and a 4ms callback interval (framedropped) in the midst of timer jitter is much harder. Or consider lower-end GPUs (streaming sticks, Fire TV, previous-model Apple TV, etc) that take a lot of CPU to run certain TestUFO tests -- so this problem is still applicable at 60Hz too;

I will need to retest the Web Animations API to see if at least some browsers now have sufficient callback-interval precision in non-rAF() animation APIs, accurate enough for these needs. Running on underpowered Intel GPUs on a 4K display is a lot less reliable in other APIs than in rAF() -- e.g. budget laptops or Chromebooks connected to a 4K TV for testing, or even running in a built-in SmartTV browser or on a streaming box (Fire TV, Apple TV, etc), or an Android phone connected by USB-C to HDMI in a "use limited equipment at hand" situation worldwide. I already prevent the green READY from showing up on devices that have insufficient callback-timing precision -- forcing scientific-quality users to use a different higher-performing device;

I will likely run a new pass of timer-jitter accuracy tests on the various callback accuracies of other non-rAF() APIs, but right now, rAF() is the gold standard for low-timing-jitter that is easy to heuristically detect real framedrops (e.g. very clear sudden double-intervals, and such effects).

Questions

  1. Is there any data or any new non-rAF animation timing API suitable for single framedrop detection that is as reliable as the logic I've developed for rAF()?

  2. If not, is there still an opportunity to standardize framedrop-detection capability more officially?

Perhaps this is an opportunity to standardize callback timing data, to provide sufficient enough information to distinguish performance-throttling versus real-refresh-rates, plus also the ability to detect framedrops (e.g. 59fps at 60Hz, or 239fps at 240Hz) during fixed-Hz fixed-framerate scientific tests.

mdrejhon commented Aug 3, 2022

We researched the Web Animation API further.

I spent some time looking carefully at possible dependencies in the Web Animation API, and I don't see any dependency on HDR.

This is because we are preparing to try and support Web Animations API, but we need to make sure we can somehow help influence the API to meet our future needs. Even requestAnimationFrame() works with HDR canvas, so we may be stuck with it for a decade (or more), but we will be attempting to port to parallel APIs (e.g. WebGL or Web Animations API or other)

Our further research confirms the Web Animation API is still of lower scientific quality than requestAnimationFrame() for our specific needs. More info on why framedrop detection is harder with other APIs than requestAnimationFrame() can be seen at #93 ...

Nonetheless, we want to make sure we influence the Web Animation API to be compatible with our future needs. We'll post a bug-tracking item at the appropriate web consortiums (W3C and WHATWG) to see if we can add a minor amendment to the Web Animation API (a per-frame callback specifically for frame timing / frame drop measurement, not for drawing) -- as we need to measure the now() timestamp of frame presentation (i.e. Present() or glXSwapBuffers() inside the browser engine) -- to allow it to successfully replace requestAnimationFrame() for scientific-accuracy needs #93
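To illustrate the kind of amendment meant here, a purely hypothetical sketch -- the onframepresented callback and recordFrameTiming() do not exist, and element is assumed to be some animated DOM element:

```js
// Entirely hypothetical API shape -- only to illustrate the amendment idea.
// 'onframepresented' does not exist in the Web Animations API today.
const anim = element.animate(
  [{ transform: 'translateX(0)' }, { transform: 'translateX(960px)' }],
  { duration: 4000, iterations: Infinity }
);

// Hypothetical: fired once per presented frame with the engine's actual
// presentation timestamp (when Present()/swap happened), purely for
// measurement -- allowing a page to detect a missed refresh cycle.
anim.onframepresented = (presentationTime) => {
  recordFrameTiming(presentationTime); // hypothetical measurement hook
};
```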

So HDR compatibility and a frame-timing-callback ability are mandatory prerequisites for our ability to replace requestAnimationFrame().

That being said,

  • GOOD: It appears there are no HDR dependencies with Web Animation API, based on our further research
  • BAD: However, the Web Animation API fails our needs because of the lack of a frame presentation callback, forcing us to stay with requestAnimationFrame() for better scientific accuracy, because we need accurate timestamps of frame presentation times for every single frame (requestAnimationFrame() is able to do that)

mdrejhon commented Aug 3, 2022

I have now spent sufficient hours of research on the Web Animation API, and personally confirmed the lack of HDR dependencies.

This issue probably no longer requires further discussion -- except as a "make sure you don't shoot yourself in the foot by accident" reference for other, HDR-inexperienced team members.

However, this is still highly relevant (why we can't use Web Animation API yet): #85 (comment)

-- HDR concern is now deprecated, as I don't see any (unintentional/accidental) dependencies with Web Animation API --
