Colloquially we refer to algorithms as O(1) or O(n log n) or such. This is the big limitation of complexity analysis: it can only tell you how fast certain functions grow; it will not tell you the absolute time taken. An algorithm that will always take a million years no matter the input is still O(1), after all. To figure out the specific numbers, and whether you'll need to cache the results from min and max, you'll need to measure the actual time your code takes and decide if it is fast enough for your use case.

In general, if the time complexity of an algorithm is O(f(X)), where X is a characteristic of the input (such as list size), then if that characteristic is bounded by a constant C, the time complexity becomes O(f(C)) = O(1). This is especially useful with algorithms whose time complexity depends on some parameter K. Such a complexity can be superexponential in K, which isn't great, but if you can fix K to be bounded by a small constant, your runtime is polynomial, which is much better! (The downside is that your algorithm will have to refuse certain inputs, but hey, those were going to take forever anyway.)
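As a sketch of what "measure the actual time" might look like in practice (the function names here are hypothetical, not from the original code), Python's standard-library `timeit` can compare recomputing min and max on every call against caching them once:

```python
import timeit

data = list(range(100_000))

def span_recompute(xs):
    # O(n) on every call: scans the whole list twice
    return max(xs) - min(xs)

def make_span_cached(xs):
    # Pay the O(n) scan once up front; every later call is O(1)
    lo, hi = min(xs), max(xs)
    def span():
        return hi - lo
    return span

span_cached = make_span_cached(data)

recompute_t = timeit.timeit(lambda: span_recompute(data), number=100)
cached_t = timeit.timeit(span_cached, number=100)
print(f"recompute: {recompute_t:.4f}s  cached: {cached_t:.4f}s")
```

Both versions are "just" O(n) and O(1) on paper; only a measurement like this tells you whether the difference matters for your input sizes and call frequency.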