The purpose of this assignment is to understand why algorithms are judged by their Big-Oh complexity rather than by a more precise cost model or by direct runtime measurement.
Using the Bridges LineChart object, plot the runtime of an algorithm for problem sizes ranging from 1 to 10^5. (Don't time every value of n; sample a few values spread across the range.)
Using the Bridges LineChart object, plot the runtime of an algorithm for problem sizes ranging from 1 to 10^4. (Don't time every value of n; sample a few values spread across the range.)
Using the Bridges LineChart object, plot the runtime of an algorithm for problem sizes ranging from 1 to 10^2. (Don't time every value of n; sample a few values spread across the range.) A sketch of one way to time and plot these runtimes is given after this list.
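The sketch below shows the general shape of the timing-and-plotting loop, assuming the Python BRIDGES client. The class and method names (Bridges, LineChart, set_x_data, set_y_data, set_data_structure, visualize) are taken from that client and may differ slightly between versions, so check the LineChart documentation linked below. The credentials, the sampled sizes, and the stand-in linear-scan algorithm are placeholders, not part of the assignment itself.

import time
import random

from bridges.bridges import Bridges
from bridges.line_chart import LineChart


def algorithm(a):
    # Stand-in O(n) work: a linear scan for the maximum. Substitute the
    # algorithm you are asked to time; slower (e.g., quadratic) algorithms
    # will only be practical over the smaller ranges (1..10^4 or 1..10^2).
    best = a[0]
    for x in a:
        if x > best:
            best = x
    return best


def time_algorithm(n):
    # Wall-clock time (seconds) to run the algorithm on a random input of size n.
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    algorithm(data)
    return time.perf_counter() - start


def main():
    # Placeholder credentials: use your own assignment number, user name, and API key.
    bridges = Bridges(1, "YOUR_USERNAME", "YOUR_API_KEY")

    # Sample a few sizes spread across 1..10^5 instead of timing every n.
    sizes = [1, 10, 100, 1000, 10000, 50000, 100000]
    times = [time_algorithm(n) for n in sizes]

    plot = LineChart()
    plot.title = "Runtime vs. problem size"
    plot.set_x_data("algorithm", [float(n) for n in sizes])
    plot.set_y_data("algorithm", times)

    bridges.set_data_structure(plot)
    bridges.visualize()


if __name__ == "__main__":
    main()

For the smaller ranges, the same loop applies; only the list of sampled sizes changes.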
Does getting a slightly faster machine really matter if you can instead get a lower Big-Oh complexity? Does even a constant-factor speedup of 100 really matter if you can get a lower Big-Oh complexity?
Bridges documentation
LineChart documentation