- Open Access
- Authors: Kunal, Tushar, Gargi Chakraborty
- Paper ID : IJERTV10IS040081
- Volume & Issue : Volume 10, Issue 04 (April 2021)
- Published (First Online): 15-04-2021
- ISSN (Online) : 2278-0181
- Publisher Name : IJERT
- License: This work is licensed under a Creative Commons Attribution 4.0 International License
Decomposed Algorithm for Reducing Time Complexity in Binary Search
Kunal 1, Tushar 2, Gargi Chakraborty 3
1Department of Computer Science, Calcutta Institute of Engineering and Management
2Department of Computer Science, Amity University Chhattisgarh
3Department of Computer Science, Calcutta Institute of Engineering and Management
Abstract: Searching is one of the principal tasks in computer science, and binary search is one of the most common and efficient algorithms used for it. Binary search targets the middle element of the list and checks it against the target key. The worst-case time complexity of binary search is O(log(n)), where n is the length of the search list. In this paper, we propose a fast and efficient variant of binary search that decomposes the main search list into multiple search lists. The time complexity of the proposed algorithm is O(log(k)), where k < n, n is the length of the sorted list, and k is the length of the sub-list.
Keywords: Binary search, algorithm efficiency, sorting and searching, time complexity, decomposed algorithm.
2. Binary Search
Binary search is one of the well-known techniques used to find a key value in a sorted set of data. Algorithms for binary search can be found in many classic textbooks such as Algorithms in C [1] and Introduction to Algorithms, Third Edition [2]. The classical binary search algorithm repeatedly targets and checks the middle element of the sorted list. Binary search can be implemented both iteratively and recursively.
The proposed algorithm works very similarly to binary search: in each iteration it compares a basis element against the key to decide whether the key is present. However, unlike binary search, the presented algorithm does not search the whole list; it searches a decomposed list, which is always smaller than the original list.
The worst-case time complexities of linear search, binary search, and the decomposed binary search are O(n), O(log(n)), and O(log(K)) respectively, where K < n, n is the length of the sorted list on which the searches are implemented, and K is the length of the decomposed searching list.
(A) POPULAR SEARCHING APPROACHES
There are multiple techniques for searching data, each with its own pros and cons. The most commonly used searching algorithms are linear search, binary search, binary search trees, and more. We discuss some of these popular searching algorithms in this section.
Linear search is a widely used searching algorithm in computer science. In linear search, each element is checked sequentially from the zeroth index to the end of the list. The time complexity of linear search is O(n).
In classical binary search, the complete search array is repeatedly divided into halves. If the middle element is greater than the key, the key is searched in the left half; otherwise it is searched in the right half. This halving of the range continues until the key is found or the range is exhausted.
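The halving procedure described above can be sketched iteratively in Python (the function name and signature are ours, not from the paper; the optional low/high bounds anticipate searching a sub-list, as the proposed algorithm later requires):

```python
def binary_search(arr, key, low=0, high=None):
    """Classical iterative binary search on a sorted list.

    Returns the index of key in arr[low..high], or -1 if absent.
    """
    if high is None:
        high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == key:
            return mid
        elif arr[mid] < key:
            low = mid + 1    # key can only be in the right half
        else:
            high = mid - 1   # key can only be in the left half
    return -1
```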
Donald Knuth proposed uniform binary search [3]. Instead of recomputing the middle index on each iteration, the algorithm stores the changes in the middle index between successive iterations in a lookup table. The main disadvantage is that this approach is quicker only on systems where it is costly to calculate the middle point.
Bentley et al. proposed exponential search in 1976 [4]. The algorithm probes indices that are powers of two until it finds an element greater than the target value, and then switches to binary search within the bracketed range. Its main disadvantage is that it is faster than binary search only when the target value is near the beginning of the array.
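A minimal Python sketch of this doubling-then-bisecting scheme (names ours; `bisect_left` stands in for the final binary search over the bracketed range):

```python
from bisect import bisect_left

def exponential_search(arr, key):
    """Double the probe index (1, 2, 4, ...) until the element there
    is no longer below the key, then binary-search that last range."""
    if not arr:
        return -1
    if arr[0] == key:
        return 0
    bound = 1
    while bound < len(arr) and arr[bound] < key:
        bound *= 2
    lo, hi = bound // 2, min(bound, len(arr) - 1)
    i = bisect_left(arr, key, lo, hi + 1)  # binary search in [lo, hi]
    return i if i <= hi and arr[i] == key else -1
```

When the key sits near index 0, only a few doublings happen, which is why the method shines for targets close to the front of the array.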
Chazelle et al. proposed fractional cascading in 1986 [5]. The authors proposed an efficient searching technique that splits the sorted array into multiple sorted arrays and performs the searching operation in each sorted array. The main disadvantage of this algorithm is that it needs some extra space, which increases the space complexity and thus the expense of searching.
Parveen Kumar proposed quadratic search in 2013 [6]. The algorithm is an advancement of binary search: it targets the 1/4th, middle, and 3/4th elements of the sorted array and checks them for the target element. If the target element is at one of these indices, its index is returned; otherwise the search space is reduced according to the comparisons and the search continues. The biggest disadvantage is that many conditions must be checked to narrow the array, and more pointers must be maintained.
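The description above can be approximated by the following sketch, which probes the 1/4, 1/2, and 3/4 positions of the current range; the exact narrowing conditions are our reading of the description, not Kumar's original code:

```python
def quadratic_search(arr, key):
    """Probe the 1/4, middle, and 3/4 elements of the current range,
    then narrow the range to the quarter that can contain the key."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        span = hi - lo
        q1 = lo + span // 4
        q2 = lo + span // 2
        q3 = lo + (3 * span) // 4
        for p in (q1, q2, q3):      # three probes per round
            if arr[p] == key:
                return p
        if key < arr[q1]:
            hi = q1 - 1             # first quarter
        elif key < arr[q2]:
            lo, hi = q1 + 1, q2 - 1  # second quarter
        elif key < arr[q3]:
            lo, hi = q2 + 1, q3 - 1  # third quarter
        else:
            lo = q3 + 1             # last quarter
    return -1
```

The extra comparisons per round illustrate the stated disadvantage: more conditions and probe points must be managed than in classical binary search.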
Nitin Arora et al. proposed two-way linear search in 2014 [7]. The authors proposed a searching algorithm that uses two pointers, one at the beginning and the other at the end of the search array. In each iteration the first pointer is incremented and the second is decremented until the target element is found. The major disadvantage of this algorithm is having to maintain multiple pointers in a linear search.
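A sketch of the two-pointer scan described above (function name ours):

```python
def two_way_linear_search(arr, key):
    """Scan from both ends simultaneously; this roughly halves the
    number of iterations versus a single-pointer linear search."""
    left, right = 0, len(arr) - 1
    while left <= right:
        if arr[left] == key:
            return left
        if arr[right] == key:
            return right
        left += 1     # advance the front pointer
        right -= 1    # retreat the back pointer
    return -1
```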
Alwin Francis et al. proposed Modulo Ten Search – An Alternative to Linear Search in 2011 [8]. The algorithm divides the list into ten sub-lists according to each number modulo ten, and then looks for the target element only in the sub-list matching the target's modulo-ten value.
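The bucketing idea can be sketched as follows, assuming the ten sub-lists are built once up front and then only the bucket matching the key's residue is scanned (helper names are ours):

```python
from collections import defaultdict

def build_buckets(values):
    """Partition the positions of values into ten lists keyed by value % 10."""
    buckets = defaultdict(list)
    for i, v in enumerate(values):
        buckets[v % 10].append(i)
    return buckets

def modulo_ten_search(values, buckets, key):
    """Linearly scan only the bucket whose residue matches key % 10."""
    for i in buckets.get(key % 10, []):
        if values[i] == key:
            return i
    return -1
```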
Karthick Sowndarajan proposed Odd-Even Based Binary Search in 2016 [9]. The algorithm sorts the list into an odd part and an even part and then applies binary search according to the parity of the search element: if the search element is odd, binary search is executed only in the odd part of the array, and likewise an even key is searched only in the even part. The algorithm works efficiently only if the array is already partitioned on an odd-even basis.
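A sketch of this parity-partitioned search (for brevity we rebuild the matching part on each call; a practical version would precompute both parts once):

```python
from bisect import bisect_left

def odd_even_search(arr, key):
    """Keep only the (still sorted) elements sharing the key's parity,
    binary-search that part, then map back to the original list."""
    part = [v for v in arr if v % 2 == key % 2]
    i = bisect_left(part, key)
    if i < len(part) and part[i] == key:
        return arr.index(key)  # linear map-back, fine for a sketch
    return -1
```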
Ankit R. Chadha et al. proposed the Modified Binary Search algorithm in 2014 [10]. The algorithm checks the input element against the middle element of the search array in each iteration, and it improves the worst case of the classical binary search algorithm by also comparing the input element with the first and last elements of the data set.
Similarly, there are many other research works on searching techniques for different purposes, e.g., Fibonacci search [11], quantum binary search [12], and ASH search [13]. Each algorithm has its pros and cons.
In the proposed decomposed binary search approach, we first derive the binary equivalent of the length of the main list. Each 1 in the binary representation is converted to the power of two corresponding to its position, and only values greater than or equal to one are stored in a list. The reverse of this new list is the decomposed list. The main idea behind the decomposed list is to reduce search time by reducing the search space.

The decomposed list thus describes a distribution of indices of the main list near which the key could be found: each element of the decomposed list is interpreted as an index into the main list. If the main-list value referenced through the decomposed list equals the key, the key is found. If the key is greater than the referenced value but less than the next referenced value, the next referenced index becomes the High value and the index one past the current referenced index becomes the Low value. And if the key is less than the very first value referenced by the decomposed list, that first index becomes the High value and zero becomes the Low value.

The High and Low values are then passed to the binary search function, which reduces the time taken by searching only this sub-list of the main list. Thus, by comparing the main-list values referenced by the decomposed list against the key, we can say directly where the element is likely to be found, which is considerably faster than searching the whole list.
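The construction just described can be sketched in Python: take the set bits of the list length from most significant to least, and turn their running sums into the boundary indices that the decomposed list references (for a length of 13, binary 1101, this yields indices 7, 11, and 12). The function name is ours, not from the paper:

```python
def decomposed_indices(n):
    """Build the decomposed index list for a main list of length n.

    Each 1-bit of n contributes a power of two (largest first), and the
    running prefix sums give the last index of each sub-list.
    """
    powers = [1 << i for i in range(n.bit_length() - 1, -1, -1)
              if n & (1 << i)]       # e.g. 13 (0b1101) -> [8, 4, 1]
    indices, total = [], 0
    for p in powers:
        total += p
        indices.append(total - 1)    # last index of this sub-list
    return indices
```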
PROPOSED ALGORITHM
DecomposedBinarySearch(List, N, B, B_N, Search)
Purpose : To get the subset of the given array for which the possibility of occurrence of the element is highest.
Input : List[0…N-1] — the list of sorted elements.
      : N — the number of elements in the list.
      : B — the binary representation of the length of the list.
      : B_N — the number of ones in the binary representation.
      : Search — the key to be searched.
      : RevDec[0…K-1] — the reverse of Decomp_list.
Output : Decomp_list[0…K-1] — the list of the decomposed set elements, each a power of two drawn from the length of the main list.
       : High — the upper bound of the list for the binary search function.
       : Low — the lower bound of the list for the binary search function.
       : K — the number of decomposed elements in Decomp_list.
Step 1 : [Check whether the total number of ones in the binary representation of the length of the array is greater than one]
          If B_N <= 1 then
              Print "Decomposed Binary Search not possible"
              Go to Step 10
          Endif
Step 2 : [Initialize the variables]
          Count ← -1
          Length ← length of B
          Count_decomp_list ← 0
Step 3 : Repeat Step 4 for i ← 0 to Length - 1
Step 4 : [Create an entry for each '1' present in the binary representation, with its equivalent base-10 value]
          If B mod 10 = 1 then
              Count ← Count + 1
              Decomp_list[Count_decomp_list] ← 2^Count
              Count_decomp_list ← Count_decomp_list + 1
              B ← B / 10
          Else
              Count ← Count + 1
              B ← B / 10
          Endif
          [End of Step 4]
Step 5 : Count ← 0
Step 6 : Repeat Step 7 for i ← 0 to Count_decomp_list - 1, where RevDec is the reverse of Decomp_list
Step 7 : Count ← Count + RevDec[i]
          RevDec[i] ← Count - 1
Step 8 : Repeat Step 9 for i ← 0 to Count_decomp_list - 1
Step 9 : [Slice the main array to the range where the element is likely to be found]
          If List[RevDec[i]] = Search then
              High ← RevDec[i]
              Low ← RevDec[i]
              Go to Step 10
          Else if Search < List[RevDec[0]] then
              High ← RevDec[0]
              Low ← 0
              Go to Step 10
          Else if Search > List[RevDec[i]] and Search < List[RevDec[i+1]] then
              High ← RevDec[i+1]
              Low ← RevDec[i] + 1
              Go to Step 10
          Endif
          [End of Step 9 loop]
Step 10 : Exit

Consider a list of 13 elements arranged in ascending order. The binary representation of the length of the list is 1101. The most significant 1 is at position 3, so its contribution is 2^3 = 8: the first sub-array contains 8 elements, with indices from 0 to 8 - 1 = 7. The next 1 is at position 2, contributing 2^2 = 4: the second sub-array contains 4 elements, from index 7 + 1 = 8 to 7 + 4 = 11. The last 1 is at position 0, contributing 2^0 = 1: the final sub-array holds the single element at index 11 + 1 = 12. The reversed decomposed list RevDec therefore references indices 7, 11, and 12 of the main list.

Now, suppose we have to find Key = 56 in the main list. The algorithm first compares the element at index 7 of the main list: Main_list[RevDec[0] = 7] = 57 > Key, hence High = 7 and Low = 0 are returned and passed to the binary_search function as arguments. [Note: in the algorithm we use the RevDec list, the reverse of Decomp_list, to ease the calculation.]
Once the DecomposedBinarySearch algorithm has executed, we obtain the High and Low values for the main list, which are then passed to the classical binary search algorithm, reducing the search space and the time taken by the search.
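Putting the two stages together, a compact sketch of the whole approach might look like this (names are ours; `bisect_left` plays the role of the classical binary search applied between Low and High):

```python
from bisect import bisect_left

def decomposed_binary_search(arr, key):
    """Use the decomposed index list to pick low/high bounds for a
    sorted list arr, then binary-search only that slice."""
    n = len(arr)
    marks, total = [], 0
    for i in range(n.bit_length() - 1, -1, -1):
        if n & (1 << i):
            total += 1 << i
            marks.append(total - 1)  # boundary indices, e.g. 13 -> [7, 11, 12]
    low, high = 0, n - 1
    for j, m in enumerate(marks):
        if arr[m] == key:
            return m                 # key found at a boundary itself
        if key < arr[m]:
            high = m                 # key lies in this sub-list
            low = 0 if j == 0 else marks[j - 1] + 1
            break
    i = bisect_left(arr, key, low, high + 1)
    return i if i <= high and arr[i] == key else -1
```

For a 13-element list and a key below the element at index 7, only indices 0–7 are binary-searched, which mirrors the High = 7, Low = 0 case worked through above.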
For experimental analysis, the proposed decomposed binary search algorithm was coded in Python 3.9.4 on a machine with an Intel i5 processor (2.50 GHz, 8 MB cache) and 8 GB RAM (DDR4, 2933 MHz). We tested our algorithm on datasets of more than 1000 pages, searching for different keys and comparing the steps taken by classical binary search against those taken by our proposed decomposed binary search.
Here we took a random dataset of 1023 pages and applied classical binary search and our proposed decomposed binary search to find 1851 in it. We found that binary search took 9 steps, while our decomposed binary search took only 6 steps to find the index of the same key after the decomposition was applied to the main list.
We then applied both approaches to the same dataset with a different key, 2236, and again found our proposed algorithm more efficient than the classical approach: classical binary search takes 9 steps to determine that the search key is not present in the dataset, while our approach determines the same in 6 steps.
With another key, 3008, in the same dataset, classical binary search took 9 steps to find the index of the search key, while our approach took only 1 step once the decomposition was applied.
After applying multiple datasets and different keys to both classical binary search and our proposed searching algorithm, we found that our algorithm finds the search element more efficiently and faster. The time complexity of classical binary search is O(log(n)), whereas the time complexity of our algorithm is O(log(k)), where k < n, n is the length of the sorted list, and k is the length of the sub-list.
REFERENCES
[1] Robert Sedgewick, Algorithms in C (3rd ed.), Addison-Wesley Longman Publishing Co., Inc., USA, 2002.
[2] Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein, Introduction to Algorithms, Third Edition, The MIT Press, 2009.
[3] D. E. Knuth, The Art of Computer Programming, Vol. 3: Sorting and Searching, Addison-Wesley, 1973.
[4] Jon Louis Bentley and Andrew Chi-Chih Yao, "An almost optimal algorithm for unbounded searching", Information Processing Letters, Vol. 5, Issue 3, 1976, pp. 82-87, ISSN 0020-0190, https://doi.org/10.1016/0020-0190(76)90071-5.
[5] B. Chazelle and L. J. Guibas, "Fractional cascading: A data structuring technique", Algorithmica, Vol. 1, Issue 1-4, pp. 113-162, November 1986.
[6] Parveen Kumar, "Quadratic Search: A New and Fast Searching Algorithm (An extension of classical Binary search strategy)", International Journal of Computer Applications, 65(14), 2013, pp. 43-46.
[7] Nitin Arora, Garima Bhasin, and Neha Sharma, "Two-way Linear Search Algorithm", International Journal of Computer Applications, 107, 2014, pp. 6-8, DOI: 10.5120/19137-9622.
[8] A. Francis and R. Ramachandran, "Modulo Ten Search - An Alternative to Linear Search", 2011 International Conference on Process Automation, Control and Computing, Coimbatore, India, 2011, pp. 1-4, DOI: 10.1109/PACC.2011.5979034.
[9] Karthick Sowndarajan, "Odd-Even Based Binary Search", International Journal of Computer Engineering & Technology (IJCET), 7, 2016, pp. 40-55.
[10] Ankit R. Chadha, Rishikesh Misal, and Tanaya Mokashi, "Modified Binary Search Algorithm", International Journal of Applied Information Systems, 7, 2014, DOI: 10.5120/ijais14-451131.
[11] David E. Ferguson, "Fibonaccian searching", Communications of the ACM, Vol. 3, Issue 12, NY, USA, 1960, DOI: 10.1145/367487.367496.
[12] Vladimir Korepin and Ying Xu, "Binary Quantum Search", International Journal of Modern Physics B, Vol. 21, ISSN 5187-5205, May 2007, DOI: 10.1117/12.717282.
[13] Ashar Mehmood, "ASH Search: Binary Search Optimization", International Journal of Computer Applications, 178(15), May 2019, pp. 10-17, DOI: 10.5120/ijca2019918788.