
Factor the lambda compilation passes of VM and native #19015

Merged
5 commits merged into coq:master from the vm-native-factor-simpl branch on May 16, 2024

Conversation

@ppedrot (Member) commented May 12, 2024

This is a series of cleanups that make the VM and native compilation generate the same lambda code up to structured values. Apart from the simple cleanups, the VM pipeline is untouched. This changes the output of native compilation in the following way:

  • Let-bound closures and evars are no longer substituted unconditionally in the simplification phase.
  • Instead, arbitrary let-bindings are inlined using the remove_let heuristic, i.e. the bound variable must occur linearly outside of a closure (see the sketch below).

EDIT: the last commit also modifies the VM compilation by restricting the remove_let heuristic to syntactic values.
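
To make the criterion concrete, here is a hypothetical Coq definition (the name and the code are illustrative, not taken from the PR): x is bound to a constructor application, i.e. a syntactic value, and occurs exactly once outside of any closure, so its let is a candidate for inlining, whereas y is bound to an arbitrary application and occurs twice, so its let is kept.

Definition sample (n : nat) : nat :=
  let x := S n in      (* syntactic value, linear occurrence: may be inlined *)
  let y := n * n in    (* arbitrary application, used twice: binding is kept *)
  x + y + y.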

@ppedrot added the kind: cleanup and request: full CI labels on May 12, 2024
@ppedrot requested review from a team as code owners on May 12, 2024 12:27
@coqbot-app bot removed the request: full CI label on May 12, 2024
@ppedrot (Member Author) commented May 12, 2024

@coqbot bench

coqbot-app bot commented May 13, 2024

🏁 Bench results:

┌─────────────────────────────────────┬─────────────────────────┬───────────────────────────────────────┬───────────────────────────────────────┬─────────────────────────┐
│                                     │      user time [s]      │              CPU cycles               │           CPU instructions            │  max resident mem [KB]  │
│                                     │                         │                                       │                                       │                         │
│            package_name             │   NEW      OLD    PDIFF │      NEW             OLD        PDIFF │      NEW             OLD        PDIFF │   NEW      OLD    PDIFF │
├─────────────────────────────────────┼─────────────────────────┼───────────────────────────────────────┼───────────────────────────────────────┼─────────────────────────┤
│                             coq-vst │  879.99   888.39  -0.95 │  4000446417147   4039017637163  -0.95 │  6743280651003   6743289200174  -0.00 │ 2323104  2322276   0.04 │
│ coq-neural-net-interp-computed-lite │  300.09   302.62  -0.84 │  1370603760283   1382181569101  -0.84 │  3617709073913   3617795762350  -0.00 │ 1112948  1119300  -0.57 │
│                         coq-coqutil │   42.51    42.84  -0.77 │   187421161707    187999931579  -0.31 │   270870473598    270838474471   0.01 │  561672   563696  -0.36 │
│          coq-performance-tests-lite │  706.23   711.25  -0.71 │  3187665455268   3211267522872  -0.73 │  5682056378890   5684960905408  -0.05 │ 2278908  2275024   0.17 │
│             coq-metacoq-safechecker │  419.91   422.83  -0.69 │  1922131125882   1934924282238  -0.66 │  3184276168761   3184083262340   0.01 │ 2071748  2069652   0.10 │
│                       coq-fiat-core │   59.47    59.70  -0.39 │   248800438399    249482844391  -0.27 │   368550968109    368600016236  -0.01 │  482780   482136   0.13 │
│               coq-mathcomp-solvable │  117.92   118.37  -0.38 │   538620128347    540281348018  -0.31 │   857295567997    857277630711   0.00 │  877020   879352  -0.27 │
│                           coq-color │  251.08   251.85  -0.31 │  1129608213991   1130577709569  -0.09 │  1636116553269   1636190434470  -0.00 │ 1179048  1175924   0.27 │
│         coq-rewriter-perf-SuperFast │  787.39   789.28  -0.24 │  3579075504735   3588645720459  -0.27 │  6256530213360   6256922843627  -0.01 │ 1576636  1574160   0.16 │
│                            coq-hott │  160.43   160.69  -0.16 │   716804556644    717255161953  -0.06 │  1140442350353   1140463415959  -0.00 │  478188   480688  -0.52 │
│                        coq-compcert │  281.94   282.38  -0.16 │  1275828018965   1276611252183  -0.06 │  1946057436274   1945826167273   0.01 │ 1106020  1105712   0.03 │
│                            coq-core │  127.45   127.56  -0.09 │   505734899652    506883175838  -0.23 │   531770110798    532123279393  -0.07 │  457240   457096   0.03 │
│                      coq-verdi-raft │  581.00   581.09  -0.02 │  2643573394939   2643204178960   0.01 │  4161368310478   4161390278980  -0.00 │  848472   847716   0.09 │
│                      coq-coquelicot │   39.60    39.60   0.00 │   176473337315    176842305894  -0.21 │   249314285931    249289502454   0.01 │  852140   854740  -0.30 │
│                        coq-rewriter │  389.91   389.89   0.01 │  1773061857072   1771816244449   0.07 │  2990915193159   2990771733406   0.00 │ 1476892  1477836  -0.06 │
│                            coq-corn │  716.68   716.45   0.03 │  3254267853544   3251853945949   0.07 │  5078451095954   5078467125647  -0.00 │  760272   760284  -0.00 │
│                    coq-math-classes │   85.81    85.78   0.03 │   384886329336    384912445059  -0.01 │   536936787838    536922954419   0.00 │  504540   506864  -0.46 │
│                       coq-fourcolor │ 1360.92  1360.29   0.05 │  6223915993304   6220900755086   0.05 │ 12910018872986  12909955937765   0.00 │ 2182576  2182444   0.01 │
│                 coq-category-theory │  693.13   692.67   0.07 │  3154267891310   3151156705746   0.10 │  5263800287489   5263853429060  -0.00 │  957212   958632  -0.15 │
│                         coq-bignums │   29.57    29.55   0.07 │   134422974874    133760037541   0.50 │   192314402314    192289929795   0.01 │  474804   472020   0.59 │
│                      coq-test-suite │  700.76   699.68   0.15 │  2992252455366   2988344151406   0.13 │  5379380772056   5374248105577   0.10 │ 2461756  2457144   0.19 │
│                coq-metacoq-template │  151.07   150.76   0.21 │   669513101633    667982559968   0.23 │  1046965470912   1047007755282  -0.00 │ 1514740  1513612   0.07 │
│                 coq-metacoq-erasure │  502.06   500.74   0.26 │  2290419153069   2285347995223   0.22 │  3581566763593   3581472690352   0.00 │ 2104048  2105188  -0.05 │
│                           coq-verdi │   49.05    48.92   0.27 │   220419828562    220191204938   0.10 │   340875549693    340931094330  -0.02 │  534028   531920   0.40 │
│        coq-fiat-crypto-with-bedrock │ 6226.85  6210.02   0.27 │ 28362418728123  28287827210087   0.26 │ 51223894346090  51220062651283   0.01 │ 3245956  3246108  -0.00 │
│                    coq-fiat-parsers │  312.68   311.83   0.27 │  1392831399691   1390849250843   0.14 │  2471461731735   2471199651159   0.01 │ 2411092  2408956   0.09 │
│                   coq-iris-examples │  469.36   467.50   0.40 │  2134565468113   2126003327980   0.40 │  3276815447352   3276756064714   0.00 │ 1116644  1121344  -0.42 │
│                        coq-coqprime │   48.40    48.18   0.46 │   216899428145    216682233494   0.10 │   334453943113    334499996375  -0.01 │  784988   786648  -0.21 │
│               coq-engine-bench-lite │  156.86   156.12   0.47 │   664348677251    662028154212   0.35 │  1228017528013   1223933378163   0.33 │ 1324236  1326240  -0.15 │
│               coq-mathcomp-fingroup │   30.65    30.49   0.52 │   139612293185    139305735073   0.22 │   207751044150    207781548738  -0.01 │  565044   565628  -0.10 │
│                         coq-unimath │ 2469.17  2455.46   0.56 │ 11240073289165  11177928789752   0.56 │ 22200981081020  22199697048917   0.01 │ 1254084  1254676  -0.05 │
│                        coq-bedrock2 │  360.68   358.53   0.60 │  1644597607836   1632772622403   0.72 │  3102735405383   3102832751354  -0.00 │  858760   861912  -0.37 │
│                  coq-mathcomp-field │  143.28   142.28   0.70 │   653736245022    649448281644   0.66 │  1074355067950   1074312565730   0.00 │ 1487868  1484884   0.20 │
│                   coq-metacoq-pcuic │  992.98   985.87   0.72 │  4447158146714   4421378810486   0.58 │  6470298830730   6470116621844   0.00 │ 2679864  2681472  -0.06 │
│                          coq-stdlib │  350.37   347.57   0.81 │  1492207664138   1484779648545   0.50 │  1258622968104   1258542846417   0.01 │  715152   717392  -0.31 │
│              coq-mathcomp-character │  109.58   108.65   0.86 │   500605114555    496395855233   0.85 │   790350655925    790360607693  -0.00 │ 1136352  1136148   0.02 │
│            coq-metacoq-translations │   17.12    16.96   0.94 │    76196469069     75792490552   0.53 │   125131217102    125132813994  -0.00 │  844864   845212  -0.04 │
│                       coq-equations │    7.72     7.64   1.05 │    31087727423     31288488868  -0.64 │    50863400114     50858909172   0.01 │  387684   387288   0.10 │
│              coq-mathcomp-ssreflect │  221.38   219.07   1.05 │  1011243247685   1000603229368   1.06 │  1675274039627   1675851629814  -0.03 │ 1734796  1726964   0.45 │
│                coq-mathcomp-algebra │  261.57   258.37   1.24 │  1193213512685   1178012649002   1.29 │  2000523645365   2000513032309   0.00 │ 1339588  1341608  -0.15 │
│              coq-mathcomp-odd-order │  799.76   788.34   1.45 │  3661847206891   3610795590757   1.41 │  6078090182584   6078009358127   0.00 │ 1655164  1652976   0.13 │
└─────────────────────────────────────┴─────────────────────────┴───────────────────────────────────────┴───────────────────────────────────────┴─────────────────────────┘

🐢 Top 25 slow downs
┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                                 TOP 25 SLOW DOWNS                                                                 │
│                                                                                                                                                   │
│   OLD       NEW      DIFF   %DIFF    Ln                      FILE                                                                                 │
├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ 255.9510  259.1060  3.1550   1.23%  1629  coq-metacoq-pcuic/pcuic/theories/PCUICSR.v.html                                                         │
│  62.5600   64.9690  2.4090   3.85%   609  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html           │
│  95.8370   97.5680  1.7310   1.81%   376  coq-unimath/UniMath/ModelCategories/Generated/LNWFSMonoidalStructure.v.html                             │
│   3.8200    5.0120  1.1920  31.20%    22  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ArithWithCasts.v.html                                  │
│ 180.7470  181.8840  1.1370   0.63%   233  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeByExtension.v.html │
│ 100.2940  101.2980  1.0040   1.00%    20  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/NBE.v.html                                             │
│  16.7210   17.6920  0.9710   5.81%   607  coq-mathcomp-odd-order/theories/PFsection9.v.html                                                       │
│  45.9960   46.9260  0.9300   2.02%   110  coq-bedrock2/bedrock2/src/bedrock2Examples/full_mul.v.html                                              │
│  19.6650   20.3790  0.7140   3.63%  2061  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlatToRiscvFunctions.v.html        │
│  63.1640   63.8690  0.7050   1.12%   609  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                             │
│ 132.4140  133.1150  0.7010   0.53%    22  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ArithWithCasts.v.html                                  │
│  38.2980   38.9830  0.6850   1.79%   338  coq-unimath/UniMath/ModelCategories/Generated/LNWFSMonoidalStructure.v.html                             │
│   2.6970    3.3720  0.6750  25.03%    34  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ArithWithCasts.v.html                                  │
│  46.2300   46.8870  0.6570   1.42%   926  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/GarageDoor.v.html                               │
│  30.7910   31.4410  0.6500   2.11%   147  coq-vst/veric/expr_lemmas4.v.html                                                                       │
│  68.3920   69.0360  0.6440   0.94%   368  coq-mathcomp-odd-order/theories/PFsection4.v.html                                                       │
│  29.9830   30.5080  0.5250   1.75%   194  coq-vst/veric/expr_lemmas4.v.html                                                                       │
│  46.0870   46.6090  0.5220   1.13%   110  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/full_mul.v.html            │
│ 154.8150  155.3070  0.4920   0.32%  1190  coq-unimath/UniMath/CategoryTheory/GrothendieckConstruction/IsPullback.v.html                           │
│  23.5090   24.0000  0.4910   2.09%    49  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                                 │
│  11.2100   11.6990  0.4890   4.36%  2103  coq-mathcomp-ssreflect/mathcomp/ssreflect/order.v.html                                                  │
│  21.2700   21.7340  0.4640   2.18%   376  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/compiler/src/compiler/FlatToRiscvMetric.v.html           │
│  24.5580   25.0000  0.4420   1.80%   374  coq-unimath/UniMath/ModelCategories/Generated/LNWFSMonoidalStructure.v.html                             │
│  39.5020   39.9260  0.4240   1.07%   835  coq-fiat-crypto-with-bedrock/src/Fancy/Compiler.v.html                                                  │
│  24.5910   24.9900  0.3990   1.62%   375  coq-unimath/UniMath/ModelCategories/Generated/LNWFSMonoidalStructure.v.html                             │
└───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
🐇 Top 25 speed ups
┌────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                                TOP 25 SPEED UPS                                                                │
│                                                                                                                                                │
│   OLD       NEW      DIFF     %DIFF    Ln                     FILE                                                                             │
├────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ 220.2720  217.2370  -3.0350   -1.38%   103  coq-fiat-crypto-with-bedrock/src/Arithmetic/BarrettReduction.v.html                                │
│ 261.1130  258.4160  -2.6970   -1.03%     8  coq-neural-net-interp-computed-lite/theories/MaxOfTwoNumbersSimpler/Computed/AllLogits.v.html      │
│  15.9140   14.3370  -1.5770   -9.91%  1204  coq-vst/floyd/Component.v.html                                                                     │
│  15.9900   14.4400  -1.5500   -9.69%  1218  coq-vst/floyd/Component.v.html                                                                     │
│   8.3490    7.3010  -1.0480  -12.55%  1216  coq-vst/floyd/Component.v.html                                                                     │
│   8.0770    7.1560  -0.9210  -11.40%  1504  coq-vst/floyd/Component.v.html                                                                     │
│   8.2330    7.3120  -0.9210  -11.19%  1217  coq-vst/floyd/Component.v.html                                                                     │
│   8.4810    7.5710  -0.9100  -10.73%   751  coq-vst/floyd/Component.v.html                                                                     │
│   8.2090    7.3720  -0.8370  -10.20%  1201  coq-vst/floyd/Component.v.html                                                                     │
│  67.1340   66.3320  -0.8020   -1.19%    27  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/ToFancyWithCasts.v.html                           │
│   8.2310    7.4400  -0.7910   -9.61%  1203  coq-vst/floyd/Component.v.html                                                                     │
│  12.3220   11.5600  -0.7620   -6.18%   194  coq-fiat-crypto-with-bedrock/src/Fancy/Barrett256.v.html                                           │
│   7.9230    7.1780  -0.7450   -9.40%   778  coq-vst/floyd/Component.v.html                                                                     │
│  38.4730   37.7960  -0.6770   -1.76%     3  coq-fiat-crypto-with-bedrock/src/ExtractionJsOfOCaml/bedrock2_fiat_crypto.v.html                   │
│   3.7770    3.2570  -0.5200  -13.77%   490  coq-stdlib/Reals/Cauchy/ConstructiveCauchyRealsMult.v.html                                         │
│  17.9190   17.4250  -0.4940   -2.76%    32  coq-performance-tests-lite/src/pattern.v.html                                                      │
│ 242.3560  241.8760  -0.4800   -0.20%   141  coq-fiat-crypto-with-bedrock/src/UnsaturatedSolinasHeuristics/Tests.v.html                         │
│  44.5640   44.1500  -0.4140   -0.93%   827  coq-vst/veric/binop_lemmas4.v.html                                                                 │
│   7.7940    7.4580  -0.3360   -4.31%  1501  coq-vst/floyd/Component.v.html                                                                     │
│   7.3990    7.0880  -0.3110   -4.20%   602  coq-unimath/UniMath/CategoryTheory/EnrichedCats/Limits/Examples/StructureEnrichedLimits.v.html     │
│  26.9220   26.6220  -0.3000   -1.11%    17  coq-fiat-crypto-with-bedrock/src/Spec/Test/X25519.v.html                                           │
│  26.9400   26.6430  -0.2970   -1.10%    22  coq-fiat-crypto-with-bedrock/src/Spec/Test/X25519.v.html                                           │
│  42.1000   41.8050  -0.2950   -0.70%   139  coq-fiat-parsers/src/Parsers/Refinement/SharpenedJSON.v.html                                       │
│   7.1190    6.8440  -0.2750   -3.86%   604  coq-unimath/UniMath/CategoryTheory/EnrichedCats/Colimits/Examples/StructureEnrichedColimits.v.html │
│   4.4870    4.2290  -0.2580   -5.75%   199  coq-fiat-crypto-with-bedrock/src/Fancy/Montgomery256.v.html                                        │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

@@ -452,6 +459,11 @@ let rec remove_let subst lam =
    if def == def' && body == body' then lam else Llet(id,def',body')
  | _ -> map_lam_with_binders liftn remove_let subst lam

let optimize lam =
  let lam = simplify lam in
  let lam = remove_let (subs_id 0) lam in
Contributor

I am not too happy about this line. I consider the remove_let optimization to be quite dubious already, especially in the presence of primitive arrays (see #12947, #13234). And now, if I understand correctly, native_compute will also suffer from it.

Member Author

I can just remove this line if you prefer. My only preoccupation is to generate the same code in native and VM.

Member Author

That said, I stand by my original position in these bug reports, i.e. relying on let-bindings to order computation in Coq is a terrible idea that conflicts with the internal semantics of CIC. The only sane way to force computation order is by pattern matching, i.e. replace let x := t in u with match t with Box x => u end where Inductive box (A : Type) := Box : A -> box A.
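
A minimal Coq sketch of the encoding described above (the definition seq_box and its name are illustrative, not part of the PR): matching on the single-constructor inductive forces the scrutinee to be evaluated before the branch is entered.

Inductive box (A : Type) : Type := Box : A -> box A.

Definition seq_box {A B : Type} (t : box A) (u : A -> B) : B :=
  match t with Box x => u x end.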

Member Author

I can also take a middle-ground mixing the current VM and native heuristic, by only let-substituting values.

Contributor

Most of the values have already been substituted by simplify itself. So, the remaining ones are just the ones for creating closures: Llam, Lfix, and Lcofix. That would be fine, since moving them around cannot arbitrarily change the time complexity of an algorithm, not even when taking the garbage collector into account, if I am not mistaken.

And by the way, in my opinion, having the user replace let x := t in u with match Box t with Box x => u end is just insane.

Member Author

“That would be fine”

Patch incoming.

“just insane”

Beware, it's not an η-expansion as you wrote: to indicate call-by-value you must return a Box type, so all strict primitives must have type Box A for some A, or unit if they do not return anything. Indeed, let _ : unit := t in u and match t with tt => u end are two very different things, computationally speaking.
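
A hypothetical Coq illustration of that last point (the names discard and force are mine): both definitions have the same type, but the anonymous let can be eliminated since its binder is unused, so the computation of t may be dropped or moved, whereas the match must reduce t to tt at that program point before returning u.

Definition discard {B : Type} (t : unit) (u : B) : B :=
  let _ : unit := t in u.

Definition force {B : Type} (t : unit) (u : B) : B :=
  match t with tt => u end.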

We use the criterion from the VM rather than the native one, because it is
more reasonable. The code for the VM additionally allowed 63-bit integers,
while the one for native allowed λ-abstractions and evars. Duplicating
the latter, which may contain arbitrary subterms, does not sound like a
great idea for code size.

This ensures that this optimization doesn't wreak havoc with strict effectful
primitives such as array modification.
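
A hypothetical Coq illustration of that commit message (the definition and its name are mine, not from the PR): the body bound to a' is an application of a strict primitive rather than a syntactic value, so the restricted heuristic keeps the let-binding, and PArray.set is evaluated exactly once, before the closure is built.

From Coq Require Import Uint63 PArray.

Definition example (a : array nat) : int -> nat :=
  let a' := PArray.set a 0%uint63 1%nat in
  fun i => PArray.get a' i.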
@ppedrot added the request: full CI label and removed the kind: cleanup label on May 13, 2024
@coqbot-app bot removed the request: full CI label on May 13, 2024
@ppedrot (Member Author) commented May 13, 2024

@silene tweak implemented. Let's @coqbot bench to see whether this affects the VM somehow.

@ppedrot (Member Author) commented May 13, 2024

I'm quite concerned that this last change in inlining will negatively affect #19014. Code generated by tactics typically contains a lot of let-bindings whose bodies are pretty much arbitrary (typically, an application). This will prevent erasure from working as effectively.

I checked and actually this patch doesn't change anything there for unimath.

coqbot-app bot commented May 13, 2024

🏁 Bench results:

┌─────────────────────────────────────┬─────────────────────────┬───────────────────────────────────────┬───────────────────────────────────────┬─────────────────────────┐
│                                     │      user time [s]      │              CPU cycles               │           CPU instructions            │  max resident mem [KB]  │
│                                     │                         │                                       │                                       │                         │
│            package_name             │   NEW      OLD    PDIFF │      NEW             OLD        PDIFF │      NEW             OLD        PDIFF │   NEW      OLD    PDIFF │
├─────────────────────────────────────┼─────────────────────────┼───────────────────────────────────────┼───────────────────────────────────────┼─────────────────────────┤
│              coq-mathcomp-ssreflect │  220.52   223.37  -1.28 │   994697085384   1007329047055  -1.25 │  1675283260491   1675857412293  -0.03 │ 1731324  1724900   0.37 │
│               coq-mathcomp-fingroup │   30.20    30.45  -0.82 │   137168562806    138092820977  -0.67 │   207751390234    207750278309   0.00 │  568104   568620  -0.09 │
│                       coq-equations │    7.55     7.60  -0.66 │    31076927881     31197812946  -0.39 │    50842631705     50846588893  -0.01 │  389532   389364   0.04 │
│                 coq-metacoq-erasure │  501.82   504.99  -0.63 │  2273823793588   2285926934132  -0.53 │  3581698332227   3581778528964  -0.00 │ 2106872  2107368  -0.02 │
│                          coq-stdlib │  353.83   355.98  -0.60 │  1474964431183   1478181072158  -0.22 │  1258569798306   1258613138867  -0.00 │  716544   717260  -0.10 │
│                            coq-core │  130.87   131.65  -0.59 │   500069430789    493168816882   1.40 │   532010976193    531646264442   0.07 │  457460   458008  -0.12 │
│                    coq-math-classes │   85.74    86.19  -0.52 │   383785400314    384983011318  -0.31 │   536878367633    536988360114  -0.02 │  504248   505044  -0.16 │
│                      coq-coquelicot │   39.64    39.82  -0.45 │   175795004756    175710330984   0.05 │   248962606534    248955277151   0.00 │  852180   852636  -0.05 │
│                        coq-rewriter │  390.88   392.59  -0.44 │  1761182486684   1768780604008  -0.43 │  2990715195632   2991202163298  -0.02 │ 1477744  1478020  -0.02 │
│                           coq-verdi │   48.98    49.17  -0.39 │   219754218467    219731565497   0.01 │   340901486488    340917351532  -0.00 │  534144   530680   0.65 │
│                coq-metacoq-template │  150.94   151.51  -0.38 │   664965585811    666900034302  -0.29 │  1046922329139   1047120006752  -0.02 │ 1516384  1513432   0.20 │
│         coq-rewriter-perf-SuperFast │  795.27   798.06  -0.35 │  3557672623060   3568622644398  -0.31 │  6256914381780   6256947386760  -0.00 │ 1574816  1574424   0.02 │
│                   coq-iris-examples │  465.48   466.96  -0.32 │  2107549829027   2112838459907  -0.25 │  3277036368575   3276901853554   0.00 │ 1113096  1119300  -0.55 │
│             coq-metacoq-safechecker │  421.25   422.55  -0.31 │  1920250117452   1926479042186  -0.32 │  3184671090913   3184374139174   0.01 │ 2090176  2072296   0.86 │
│                  coq-mathcomp-field │  142.77   143.17  -0.28 │   644181947173    645576822807  -0.22 │  1074466372900   1074325072107   0.01 │ 1483000  1485948  -0.20 │
│                 coq-category-theory │  690.31   691.74  -0.21 │  3128699789821   3132013738517  -0.11 │  5263936354999   5263758004996   0.00 │  957024   956624   0.04 │
│                   coq-metacoq-pcuic │  987.32   989.34  -0.20 │  4400854836252   4411883134222  -0.25 │  6469935988050   6470340165564  -0.01 │ 2682740  2681528   0.05 │
│              coq-mathcomp-character │  109.15   109.37  -0.20 │   496070103736    496813222374  -0.15 │   790395114547    790330344333   0.01 │ 1136920  1135636   0.11 │
│                        coq-coqprime │   48.24    48.33  -0.19 │   215644323046    215246915013   0.18 │   334676921670    334524404270   0.05 │  785060   785492  -0.05 │
│               coq-engine-bench-lite │  156.40   156.66  -0.17 │   655482790896    656829941265  -0.21 │  1222464158109   1224668322546  -0.18 │ 1321368  1325056  -0.28 │
│                        coq-compcert │  282.17   282.60  -0.15 │  1266799285981   1268481137807  -0.13 │  1945579978988   1945845876940  -0.01 │ 1106416  1103608   0.25 │
│                         coq-coqutil │   43.01    43.07  -0.14 │   187770438886    187933331120  -0.09 │   270929935564    270955194968  -0.01 │  561036   561252  -0.04 │
│               coq-mathcomp-solvable │  118.16   118.25  -0.08 │   537504874857    537627233222  -0.02 │   857287912005    857309529991  -0.00 │  879900   879348   0.06 │
│                         coq-unimath │ 2464.68  2466.34  -0.07 │ 11175541486090  11179797897816  -0.04 │ 22200569896207  22200476046837   0.00 │ 1254560  1254528   0.00 │
│                       coq-fiat-core │   60.11    60.15  -0.07 │   247701723202    247714448754  -0.01 │   368789922063    368762838785   0.01 │  482740   482436   0.06 │
│            coq-metacoq-translations │   17.15    17.16  -0.06 │    75378642082     75490285811  -0.15 │   125306130399    125126353141   0.14 │  845432   846628  -0.14 │
│                             coq-vst │  881.80   881.82  -0.00 │  3985156557908   3983177756293   0.05 │  6743417121295   6743155281894   0.00 │ 2322120  2323480  -0.06 │
│                            coq-corn │  713.74   713.68   0.01 │  3229950770805   3230323063232  -0.01 │  5078248976532   5078066118710   0.00 │  755356   755440  -0.01 │
│        coq-fiat-crypto-with-bedrock │ 6233.91  6233.18   0.01 │ 28241997709950  28235692396305   0.02 │ 51225820442233  51216669163607   0.02 │ 3245964  3245876   0.00 │
│                           coq-color │  252.67   252.60   0.03 │  1127783038966   1126060925527   0.15 │  1636058274814   1636286462707  -0.01 │ 1176408  1178372  -0.17 │
│                coq-mathcomp-algebra │  260.60   260.51   0.03 │  1177169039740   1177298038849  -0.01 │  2000634155869   2000538129642   0.00 │ 1343356  1342212   0.09 │
│              coq-mathcomp-odd-order │  792.56   792.06   0.06 │  3617034652921   3617075785349  -0.00 │  6078452640521   6078098473990   0.01 │ 1652648  1655144  -0.15 │
│          coq-performance-tests-lite │  716.37   715.81   0.08 │  3210291668064   3204968376923   0.17 │  5683660436300   5685713260575  -0.04 │ 2276524  2275284   0.05 │
│                      coq-verdi-raft │  581.87   581.01   0.15 │  2640525574046   2638021032399   0.09 │  4161586907087   4161386697865   0.00 │  848252   847860   0.05 │
│                      coq-test-suite │  706.39   704.33   0.29 │  2998778571826   2990498075968   0.28 │  5374263401607   5376524955163  -0.04 │ 2694508  2463276   9.39 │
│                    coq-fiat-parsers │  314.88   313.35   0.49 │  1393622697761   1384973531722   0.62 │  2487557678592   2471272507795   0.66 │ 2415488  2408912   0.27 │
│                            coq-hott │  162.01   160.78   0.77 │   715091218881    712993485013   0.29 │  1140154424312   1140425803174  -0.02 │  477916   478828  -0.19 │
│                        coq-bedrock2 │  364.09   360.70   0.94 │  1643459826088   1627998837834   0.95 │  3110286444151   3110087156558   0.01 │  859904   858696   0.14 │
│                         coq-bignums │   29.72    29.44   0.95 │   133462122909    132717883399   0.56 │   192341531692    192306024738   0.02 │  472604   474796  -0.46 │
│                       coq-fourcolor │ 1401.63  1367.16   2.52 │  6377439986159   6222841660106   2.48 │ 13511774917963  12909871951180   4.66 │ 2187292  2183060   0.19 │
│ coq-neural-net-interp-computed-lite │  310.86   302.45   2.78 │  1418106136540   1380082743666   2.76 │  3617936927566   3617707154665   0.01 │ 1113092  1113096  -0.00 │
└─────────────────────────────────────┴─────────────────────────┴───────────────────────────────────────┴───────────────────────────────────────┴─────────────────────────┘

🐢 Top 25 slow downs
┌──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                                TOP 25 SLOW DOWNS                                                                 │
│                                                                                                                                                  │
│   OLD       NEW      DIFF   %DIFF   Ln                      FILE                                                                                 │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ 260.7810  268.9420  8.1610  3.13%     8  coq-neural-net-interp-computed-lite/theories/MaxOfTwoNumbersSimpler/Computed/AllLogits.v.html           │
│  96.6050   97.9480  1.3430  1.39%   376  coq-unimath/UniMath/ModelCategories/Generated/LNWFSMonoidalStructure.v.html                             │
│ 217.5900  218.9120  1.3220  0.61%   103  coq-fiat-crypto-with-bedrock/src/Arithmetic/BarrettReduction.v.html                                     │
│  62.4210   63.7350  1.3140  2.11%   609  coq-bedrock2/bedrock2/src/bedrock2Examples/lightbulb.v.html                                             │
│  30.2980   31.5630  1.2650  4.18%    12  coq-fourcolor/theories/job254to270.v.html                                                               │
│ 256.6540  257.8910  1.2370  0.48%  1629  coq-metacoq-pcuic/pcuic/theories/PCUICSR.v.html                                                         │
│  26.9830   28.1280  1.1450  4.24%    35  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/MontgomeryLadderRISCV.v.html                    │
│  25.9970   27.1300  1.1330  4.36%    12  coq-fourcolor/theories/job611to617.v.html                                                               │
│ 180.8380  181.9150  1.0770  0.60%   233  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/deps/riscv-coq/src/riscv/Proofs/DecodeByExtension.v.html │
│  26.4840   27.5390  1.0550  3.98%    12  coq-fourcolor/theories/job287to290.v.html                                                               │
│  27.8560   28.8700  1.0140  3.64%    12  coq-fourcolor/theories/job563to588.v.html                                                               │
│  28.8260   29.8010  0.9750  3.38%    32  coq-fiat-crypto-with-bedrock/src/Bedrock/End2End/X25519/MontgomeryLadderRISCV.v.html                    │
│  48.0400   48.9600  0.9200  1.92%   558  coq-fiat-crypto-with-bedrock/rupicola/bedrock2/bedrock2/src/bedrock2Examples/insertionsort.v.html       │
│  32.2980   33.2150  0.9170  2.84%    12  coq-fourcolor/theories/job439to465.v.html                                                               │
│  24.9430   25.8240  0.8810  3.53%    12  coq-fourcolor/theories/job499to502.v.html                                                               │
│  28.7350   29.6080  0.8730  3.04%    12  coq-fourcolor/theories/job323to383.v.html                                                               │
│  28.6400   29.5070  0.8670  3.03%    12  coq-fourcolor/theories/job589to610.v.html                                                               │
│  47.7880   48.6160  0.8280  1.73%   558  coq-bedrock2/bedrock2/src/bedrock2Examples/insertionsort.v.html                                         │
│  29.3920   30.2170  0.8250  2.81%    12  coq-fourcolor/theories/job165to189.v.html                                                               │
│  29.3700   30.1730  0.8030  2.73%    12  coq-fourcolor/theories/job001to106.v.html                                                               │
│  20.9240   21.7250  0.8010  3.83%    12  coq-fourcolor/theories/job542to545.v.html                                                               │
│  40.8390   41.6140  0.7750  1.90%   236  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html               │
│  27.6510   28.4160  0.7650  2.77%    12  coq-fourcolor/theories/job107to164.v.html                                                               │
│  24.6370   25.3960  0.7590  3.08%    12  coq-fourcolor/theories/job279to282.v.html                                                               │
│  24.7760   25.5230  0.7470  3.02%    12  coq-fourcolor/theories/job535to541.v.html                                                               │
└──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
🐇 Top 25 speed ups
┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│                                                               TOP 25 SPEED UPS                                                                │
│                                                                                                                                               │
│   OLD       NEW      DIFF     %DIFF    Ln                     FILE                                                                            │
├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ 101.8470  101.0540  -0.7930   -0.78%    20  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/NBE.v.html                                       │
│  95.6530   94.9980  -0.6550   -0.68%   999  coq-performance-tests-lite/src/fiat_crypto_via_setoid_rewrite_standalone.v.html                   │
│  23.2950   22.6420  -0.6530   -2.80%  1073  coq-metacoq-safechecker/safechecker/theories/PCUICSafeReduce.v.html                               │
│   7.6800    7.0850  -0.5950   -7.75%   602  coq-unimath/UniMath/CategoryTheory/EnrichedCats/Limits/Examples/StructureEnrichedLimits.v.html    │
│  95.7050   95.1540  -0.5510   -0.58%   968  coq-performance-tests-lite/src/fiat_crypto_via_setoid_rewrite_standalone.v.html                   │
│  18.2810   17.7350  -0.5460   -2.99%    32  coq-performance-tests-lite/src/pattern.v.html                                                     │
│  39.8980   39.4200  -0.4780   -1.20%   835  coq-fiat-crypto-with-bedrock/src/Fancy/Compiler.v.html                                            │
│  81.8700   81.4360  -0.4340   -0.53%    48  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/AffineProofs.v.html                           │
│ 155.6210  155.2620  -0.3590   -0.23%  1190  coq-unimath/UniMath/CategoryTheory/GrothendieckConstruction/IsPullback.v.html                     │
│  11.5290   11.1810  -0.3480   -3.02%   546  coq-metacoq-erasure/erasure/theories/ErasureFunctionProperties.v.html                             │
│  12.5680   12.2300  -0.3380   -2.69%   409  coq-category-theory/Theory/Metacategory/ArrowsOnly.v.html                                         │
│  37.7090   37.4000  -0.3090   -0.82%     3  coq-fiat-crypto-with-bedrock/src/ExtractionJsOfOCaml/bedrock2_fiat_crypto.v.html                  │
│   4.9020    4.6000  -0.3020   -6.16%   446  coq-mathcomp-odd-order/theories/PFsection12.v.html                                                │
│  11.3210   11.0210  -0.3000   -2.65%   410  coq-verdi-raft/theories/RaftProofs/LeaderLogsLogMatchingProof.v.html                              │
│  16.7610   16.4720  -0.2890   -1.72%    81  coq-fiat-crypto-with-bedrock/rupicola/src/Rupicola/Examples/Utf8/Utf8.v.html                      │
│   9.9670    9.6810  -0.2860   -2.87%   418  coq-category-theory/Theory/Metacategory/ArrowsOnly.v.html                                         │
│  18.3590   18.0830  -0.2760   -1.50%   196  coq-unimath/UniMath/HomologicalAlgebra/KATriangulated.v.html                                      │
│   0.8520    0.5780  -0.2740  -32.16%   160  coq-stdlib/Numbers/HexadecimalNat.v.html                                                          │
│  25.9490   25.6780  -0.2710   -1.04%   129  coq-fiat-crypto-with-bedrock/src/Curves/Weierstrass/Projective.v.html                             │
│  17.1870   16.9170  -0.2700   -1.57%    24  coq-fiat-crypto-with-bedrock/src/Rewriter/Passes/MulSplit.v.html                                  │
│  18.2970   18.0410  -0.2560   -1.40%   708  coq-fiat-crypto-with-bedrock/src/Rewriter/RulesProofs.v.html                                      │
│  10.4530   10.2040  -0.2490   -2.38%   496  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Wf.v.html                                       │
│  20.7220   20.4820  -0.2400   -1.16%  2736  coq-metacoq-safechecker/safechecker/theories/PCUICTypeChecker.v.html                              │
│  41.1330   40.9020  -0.2310   -0.56%   236  coq-rewriter/src/Rewriter/Rewriter/Examples/PerfTesting/LiftLetsMap.v.html                        │
│  29.4470   29.2170  -0.2300   -0.78%    79  coq-rewriter-perf-SuperFast/src/Rewriter/Rewriter/Examples/PerfTesting/SieveOfEratosthenes.v.html │
└───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘

@ppedrot (Member Author) commented May 14, 2024

This optimization tweak seems to impact fourcolor to some extent, but I think that the diff is small enough that we can just accept it. Otherwise I think we can keep the previous heuristic. @silene ?

@ppedrot added the kind: cleanup label on May 14, 2024
@ppedrot added this to the 8.20+rc1 milestone on May 14, 2024
@silene (Contributor) commented May 15, 2024

I am a bit confused as to why there is a slowdown. The only explanation I can think of is that the code contains the following kind of construct:

let v := foo x in
if b then bar v else baz

and it was implicitly relying on the optimization to get instead

if b then bar (foo x) else baz

Unfortunately, I don't have time to investigate right now. So, unless someone comes up with a suggestion soon, I propose to merge as is.

@ppedrot (Member Author) commented May 15, 2024

I argue for merging as is, especially because it unlocks a tremendous reduction in the size of degenerate .vo files in the dependent PR #19014.

@SkySkimmer self-assigned this on May 16, 2024
@SkySkimmer (Contributor) commented:

@coqbot merge now

@coqbot-app bot merged commit a5f461d into coq:master on May 16, 2024 (7 checks passed)
@ppedrot deleted the vm-native-factor-simpl branch on May 16, 2024 at 11:51