There are many data structures and collections available in our daily programming environment, but when it comes to dealing with hundreds of thousands of records, the time an operation takes really matters. That time depends largely on the time complexity of the operation on the data structure you have chosen.

There are many articles that explain how to calculate these complexities, and plenty of sources list the complexity of every data structure. This article is just a brief overview of the complexities of the commonly used data structures.

  • For a LinkedList, the add and remove methods run in O(1) when they operate at the ends of the list, while searching for an item is O(n); the same O(n) applies to get(int index), since the list must be traversed node by node to reach the position

EDIT 1:
Inserting an item at the beginning of a plain singly linked list is O(1) (constant time), while inserting at the end is of course O(n) if no tail reference is kept, as you would have to iterate through the list to reach the end. Inserting an item in the middle of the list, or at a particular position, is also O(n), for the same reason. The same applies when deleting an item from a linked list. Deletion or insertion is ideally said to take constant time only if you already hold a reference to the node at that location.
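As a sketch of the point above, here is a minimal, hypothetical singly linked list (the class and its method names are mine, for illustration only). It keeps a head reference but no tail reference, so insertion at the front is O(1) while insertion at the back is O(n):

```java
// Minimal singly linked list sketch: head reference only, no tail reference.
public class SinglyLinkedList {
    private static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    private Node head;

    // O(1): only the head reference changes.
    public void addFirst(int value) {
        Node node = new Node(value);
        node.next = head;
        head = node;
    }

    // O(n): we must walk to the last node before linking the new one.
    public void addLast(int value) {
        Node node = new Node(value);
        if (head == null) { head = node; return; }
        Node current = head;
        while (current.next != null) {
            current = current.next;
        }
        current.next = node;
    }

    // O(n): linear search from the head.
    public boolean contains(int value) {
        for (Node n = head; n != null; n = n.next) {
            if (n.value == value) return true;
        }
        return false;
    }
}
```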

EDIT 2:
If, however, you use the built-in methods of java.util.LinkedList that operate at the ends of the list, the time complexity is O(1) -> constant. For instance, addFirst(), removeFirst(), removeLast(), and remove() all run in constant time, because the list holds references to its first and last nodes. Likewise add() appends an element to the end of the LinkedList, and hence its complexity is O(1), as opposed to the method add(int index, Object element), which is O(n) because it must first walk to the given index. An insert or delete through a ListIterator is also O(1) once the iterator is positioned at the node.
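For example, with java.util.LinkedList (a doubly linked list holding both head and tail references), the end operations just described are all constant time, while add(int index, E element) is the linear one:

```java
import java.util.LinkedList;
import java.util.ListIterator;

public class LinkedListEnds {
    public static void main(String[] args) {
        LinkedList<String> list = new LinkedList<>();

        list.add("b");        // O(1): appends at the tail
        list.addFirst("a");   // O(1): head reference is updated
        list.addLast("c");    // O(1): tail reference is updated
        // list is now [a, b, c]

        list.removeFirst();   // O(1) -> removes "a"
        list.removeLast();    // O(1) -> removes "c"

        // add(int index, E element) is O(n): it must walk to the index first.
        list.add(0, "x");     // list is now [x, b]

        // Once a ListIterator is positioned, insert/remove at the cursor is O(1).
        ListIterator<String> it = list.listIterator();
        it.next();            // cursor is past "x"
        it.remove();          // O(1) removal of "x"

        System.out.println(list); // prints [b]
    }
}
```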

  • For an ArrayList, whose underlying implementation is an array, inserting an element at the first position is O(n), since all existing elements must be shifted one position over, while inserting at the end is amortized constant time O(1): if the backing array is full, a new, larger array is created, the existing elements are copied into it, and then the new element is added (in that special case the operation is O(n), but it averages out to O(1))
  • For a stack, insertions, deletions, and all other basic functions (push, pop, peek, size) take constant time O(1); in java.util.Stack, the exception is search(Object o), which scans the stack for an item and so has a time complexity of O(n)
  • An array has the same time complexities as an ArrayList, except that if the array is sorted, the time complexity of searching for an item can be reduced from O(n) to O(log n) by using binary search
  • The basic operations of a HashSet (add, remove, contains) run in constant time O(1) on average. HashSet is an implementation of the Set interface, which does not allow duplicate values.
  • The same operations on a TreeSet have a complexity of O(log n). TreeSet is an implementation of Set backed by a balanced binary search tree (a red-black tree), so every lookup is effectively a binary search
  • In the case of a HashMap, if the hashing function runs in constant time, the complexity of putting an item into the map can be regarded as O(1), constant, too. Getting an item, however, varies with how many entries share a bucket: if many items have collided into one bucket, the lookup degrades toward O(n), as the HashMap must go through all the items in that bucket to return the value you are looking for; otherwise, when the keys are well distributed, it is constant time O(1).
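To illustrate the ArrayList point above: inserting at index 0 shifts every existing element, while a plain add appends in amortized constant time (the class name here is mine):

```java
import java.util.ArrayList;

public class ArrayListInserts {
    public static void main(String[] args) {
        ArrayList<Integer> list = new ArrayList<>();

        // Amortized O(1): elements are appended at the end; the backing array
        // is only copied into a larger one when it runs out of capacity.
        for (int i = 1; i <= 3; i++) {
            list.add(i);          // [1], then [1, 2], then [1, 2, 3]
        }

        // O(n): every existing element must shift one slot to the right.
        list.add(0, 0);           // [0, 1, 2, 3]

        System.out.println(list); // prints [0, 1, 2, 3]
    }
}
```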
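Similarly, for the stack bullet: with java.util.Stack, push, pop, and peek are constant time, and search(Object) is the linear-time operation, since it scans downward from the top of the stack:

```java
import java.util.Stack;

public class StackOps {
    public static void main(String[] args) {
        Stack<String> stack = new Stack<>();
        stack.push("a");                        // O(1)
        stack.push("b");                        // O(1)
        stack.push("c");                        // O(1)

        System.out.println(stack.peek());       // O(1), prints c
        // search() returns the 1-based distance from the top, scanning O(n).
        System.out.println(stack.search("a"));  // prints 3
        System.out.println(stack.pop());        // O(1), prints c
        System.out.println(stack.size());       // O(1), prints 2
    }
}
```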
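The sorted-array bullet can be seen with java.util.Arrays.binarySearch, which halves the search range on each step, O(log n), instead of scanning every element, O(n):

```java
import java.util.Arrays;

public class SortedSearch {
    public static void main(String[] args) {
        int[] numbers = {42, 7, 19, 3, 88};

        Arrays.sort(numbers);                    // [3, 7, 19, 42, 88]

        // O(log n): binary search requires the array to be sorted first.
        int index = Arrays.binarySearch(numbers, 42);
        System.out.println(index);               // prints 3

        // A missing key yields a negative insertion-point encoding.
        System.out.println(Arrays.binarySearch(numbers, 5) < 0); // prints true
    }
}
```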
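The contrast between the two Set bullets, side by side: HashSet gives O(1) average membership tests with no ordering, while TreeSet keeps its elements sorted at O(log n) per operation:

```java
import java.util.HashSet;
import java.util.TreeSet;

public class SetContrast {
    public static void main(String[] args) {
        HashSet<Integer> hashSet = new HashSet<>();
        TreeSet<Integer> treeSet = new TreeSet<>();

        for (int value : new int[] {5, 1, 3}) {
            hashSet.add(value);   // O(1) average: hash straight to a bucket
            treeSet.add(value);   // O(log n): descend a red-black tree
        }

        hashSet.add(3);           // duplicate, ignored: sets reject duplicates
        System.out.println(hashSet.size());       // prints 3

        System.out.println(hashSet.contains(3));  // O(1) average, prints true
        System.out.println(treeSet.contains(3));  // O(log n), prints true

        System.out.println(treeSet);              // sorted order: prints [1, 3, 5]
    }
}
```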
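And for the HashMap point: put and get both hash the key to a bucket, so with a good hash function and few collisions each call is O(1) on average (the keys and values here are made up for the example):

```java
import java.util.HashMap;

public class MapLookup {
    public static void main(String[] args) {
        HashMap<String, Integer> ages = new HashMap<>();

        // O(1) average: the key's hashCode() selects the bucket directly.
        ages.put("alice", 30);
        ages.put("bob", 25);

        // O(1) average; this degrades toward O(n) only when many keys
        // collide into the same bucket and it must be walked item by item.
        System.out.println(ages.get("bob"));              // prints 25
        System.out.println(ages.getOrDefault("eve", -1)); // prints -1
    }
}
```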
