# Finding the maximum number of 1's in a row

We are given a 2D matrix filled with 1's and 0's. In every row, all the 1's come before all the 0's. We have to find the maximum number of 1's in any row.

My solution is to apply binary search on every row to find the index of the last 1 before the 0's begin; the number of 1's in that row is then index+1. Doing this for every row gives O(m log n) complexity, where m is the number of rows and n is the number of columns. Can there be a better solution than this?
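A minimal sketch of the binary-search approach described above (the function name and list-of-lists representation are my own choices, not part of the question): for each row, binary search for the index of the first 0, which equals the count of 1's in that row.

```python
def max_ones_binary_search(matrix):
    """Return the maximum count of 1's in any row, in O(m log n).

    Assumes every row has all its 1's before all its 0's."""
    best = 0
    for row in matrix:
        # Binary search for the index of the first 0 in this row.
        lo, hi = 0, len(row)
        while lo < hi:
            mid = (lo + hi) // 2
            if row[mid] == 1:
                lo = mid + 1  # first 0 is to the right of mid
            else:
                hi = mid      # first 0 is at mid or to its left
        # lo is now the index of the first 0, i.e. the number of 1's.
        best = max(best, lo)
    return best
```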

It can be done in O(n+m).

Start with curmax = 0. Then process the rows one by one: for each row, increase curmax while that row still has a 1 at index curmax, i.e. keep checking whether the value at position curmax is one.

The answer is the value of curmax after all rows are processed.

This will work in O(n+m).
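The scan above can be sketched as follows (function name is my own; the key point is that curmax only ever moves forward, so the total work is bounded by n column steps plus m row visits):

```python
def max_ones_staircase(matrix):
    """Return the maximum count of 1's in any row, in O(n + m).

    Assumes every row has all its 1's before all its 0's."""
    if not matrix:
        return 0
    n = len(matrix[0])
    curmax = 0
    for row in matrix:
        # Advance curmax while this row extends the current best run of 1's.
        while curmax < n and row[curmax] == 1:
            curmax += 1
    return curmax
```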

Will it be quicker than O(m log n)? It depends. If m is less than n/(log(n) - 1), the O(n+m) scan is actually asymptotically slower than O(m log n); otherwise it is quicker, at least in terms of complexity. Constant factors are another matter when estimating actual running time. So when n and m are of the same order of magnitude, the O(n+m) scan will be quicker; when they differ widely, there is only one choice: try both and pick the better one.