Gaussian elimination refactoring with NumPy and optimization #9110
Describe your change:
Instead of a nested loop, vectorized NumPy operations were used (refactoring and optimization).
The inner nested loop is replaced by slice-based access to the array values. With a `coefficients` array of size up to about 1900 x 1900, this runs faster than the two-loop version.
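For context, here is a minimal sketch of the slice-based approach (function and variable names are illustrative, not the exact code in this PR): the inner row loop of the forward elimination and the dot product in the back substitution are replaced by slice operations on an augmented matrix.

```python
import numpy as np


def gaussian_elimination_vectorized(
    coefficients: np.ndarray, vector: np.ndarray
) -> np.ndarray:
    """Solve coefficients @ x = vector using slice operations instead of
    the inner nested loops (no pivoting; assumes nonzero diagonal)."""
    rows, columns = coefficients.shape
    if rows != columns:
        return np.array([], dtype=float)

    # Work on an augmented copy so the inputs are not modified.
    augmented = np.hstack(
        (coefficients.astype(np.float64), vector.reshape(rows, 1).astype(np.float64))
    )

    # Forward elimination: one slice update per pivot row instead of a row loop.
    for row in range(rows - 1):
        pivot = augmented[row, row]
        factors = augmented[row + 1 :, row] / pivot
        augmented[row + 1 :, row:] -= factors[:, None] * augmented[row, row:]

    # Back substitution: a slice dot product replaces the inner column loop.
    x = np.zeros((rows, 1))
    for row in range(rows - 1, -1, -1):
        total = augmented[row, row + 1 : columns] @ x[row + 1 :, 0]
        x[row, 0] = (augmented[row, columns] - total) / augmented[row, row]

    return x
```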
Code performance tests:
At each iteration, arrays are generated: `coefficients` (n x n) and `vector` (n x 1). The diagonal elements are made large (10 * n) so that division by zero on `diagonal_element` does not occur. The time spent by each algorithm (new and old) is printed in seconds, and the resulting arrays are compared for equality.
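A rough sketch of how such a benchmark could look (the `benchmark` helper and the solver arguments are placeholders, not the exact script used for the numbers below):

```python
from typing import Callable

import numpy as np
from timeit import default_timer as timer

Solver = Callable[[np.ndarray, np.ndarray], np.ndarray]


def benchmark(n: int, solve_old: Solver, solve_new: Solver) -> None:
    """Time two solvers on a random n x n system whose diagonal is set
    to 10 * n so that no pivot (diagonal_element) is zero."""
    rng = np.random.default_rng(seed=0)
    coefficients = rng.random((n, n))
    np.fill_diagonal(coefficients, 10 * n)  # large diagonal -> no division by zero
    vector = rng.random((n, 1))

    start = timer()
    old_result = solve_old(coefficients, vector)  # existing nested-loop version
    time_old = timer() - start

    start = timer()
    new_result = solve_new(coefficients, vector)  # slice-based version
    time_new = timer() - start

    print(
        f"array size n x n {n} "
        f"time_old {time_old:.6f} time_new {time_new:.6f} "
        f"comparison result array {np.allclose(old_result, new_result)}"
    )


# Example usage (both solver names are assumptions for this sketch):
# benchmark(1900, gaussian_elimination, gaussian_elimination_vectorized)
```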
Output:
For reference, I am also attaching the result for a size of 5000:
array size n x n: 5000, time_old: 166.356578 s, time_new: 307.890158 s, difference in time: 0.54, comparison result array: True
We need to decide what is more important: supporting larger arrays (what size is typically used in practice?) or keeping the refactoring, which is faster only for arrays up to about 1900 x 1900 (beyond that, the vectorized version takes longer).
Checklist: