More specialization benchmarks #806
Merged
I do not want us to add specialization benchmarks one by one in future pull requests, so this is mostly done!
Not all of our iterators are benchmarked here:
- `group_by`, `chunks`, `tee`, `rciter`: just like in More specialization tests #799.
- `peeking_take_while` and `take_while_ref` do not take ownership of the iterator they adapt (see Ownership issues for `Itertools` methods #710), so it is problematic to add them to the `bench_specializations` macro.
- `process_results` does not return an iterator but processes one, and we can't create a `ProcessResults` object outside of the library.
- `MapForGrouping` is used for `GroupingMapBy`, but it's strictly internal to the library; we will need to benchmark a (non-iterator) method of `GroupingMapBy` to benchmark the future `MapForGrouping::fold`.
- `unfold`, `iterate`: infinite iterators, so some benchmarks would not end. We can avoid running them, but maybe we should handle them differently later.

After this, we will be able to specialize most `fold` methods (see #755) faster because their specialization tests/benchmarks are already written: we will just need to write the specialization in a commit, run the tests, benchmark before/after, and review the PR. Sketches of what those two steps could look like are given below.
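For illustration only, here is a minimal sketch of what specializing `fold` on an adaptor usually amounts to: forwarding to the inner iterator's `fold` instead of relying on the default `next`-based loop. `MyAdaptor` is a made-up stand-in, not an itertools type.

```rust
/// Hypothetical adaptor (not an itertools type) that just wraps an inner iterator.
struct MyAdaptor<I> {
    iter: I,
}

impl<I: Iterator> Iterator for MyAdaptor<I> {
    type Item = I::Item;

    fn next(&mut self) -> Option<I::Item> {
        self.iter.next()
    }

    // The specialization: delegate `fold` to the inner iterator so its own
    // (possibly specialized) `fold` is used instead of the default loop.
    fn fold<B, F>(self, init: B, f: F) -> B
    where
        F: FnMut(B, I::Item) -> B,
    {
        self.iter.fold(init, f)
    }
}

fn main() {
    let sum = MyAdaptor { iter: 0u32..10 }.fold(0u32, |acc, x| acc + x);
    assert_eq!(sum, 45);
}
```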
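And a minimal standalone Criterion benchmark one could use to compare `fold` before and after such a change. This is only an assumed, simplified equivalent of what the crate's benches do through the `bench_specializations` macro, using `interleave` as an arbitrary example adaptor.

```rust
use criterion::{black_box, criterion_group, criterion_main, Criterion};
use itertools::Itertools;

// Hypothetical standalone benchmark; run it before and after the
// specialization commit and compare the reported timings.
fn interleave_fold(c: &mut Criterion) {
    let v1: Vec<u32> = (0..1024).collect();
    let v2: Vec<u32> = (0..768).collect();
    c.bench_function("interleave_fold", |b| {
        b.iter(|| {
            v1.iter()
                .interleave(v2.iter())
                .fold(0u64, |acc, &x| acc + u64::from(black_box(x)))
        })
    });
}

criterion_group!(benches, interleave_fold);
criterion_main!(benches);
```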