Getting a default value on index out of range in Python


Answers


In the Python spirit of "ask for forgiveness, not permission", here's one way:

try:
    b = a[4]
except IndexError:
    b = 'sss'

In the non-Python spirit of "ask for permission, not forgiveness", here's another way:

b = a[4] if len(a) > 4 else 'sss'

And, in the Python spirit of "beautiful is better than ugly", pick whichever of these reads best to you.

Code golf method, using slicing and unpacking (not sure if this was valid 4 years ago, but it works in Python 2.7 and 3.3):

b, = a[4:5] or ['sss']

Nicer than a wrapper function or try/except IMHO, but intimidating for beginners. Personally I find tuple unpacking to be way sexier than list[#]
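A quick check of both branches (the list contents here are just illustrative):

```python
a = ['123', '2', 4]

b, = a[4:5] or ['sss']   # slice is empty, so the fallback list is unpacked
print(b)                 # sss

b, = a[1:2] or ['sss']   # slice has one element, so unpacking yields it
print(b)                 # 2
```

Note that the fallback kicks in on an empty slice, not on a falsy element: a[4:5] is a one-element list even if a[4] itself is falsy.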

using slicing without unpacking:

b = a[4] if a[4:] else 'sss'

or, if you have to do this often and don't mind building a dictionary:

d = dict(enumerate(a))
b = d.get(4, 'sss')

another way:

b = (a[4:] + ['sss'])[0]

You could create your own list-class:

class MyList(list):
    def get(self, index, default=None):
        # note: indexes below -len(self) will still raise IndexError
        return self[index] if len(self) > index else default

You can use it like this:

>>> l = MyList(['a', 'b', 'c'])
>>> l.get(1)
'b'
>>> l.get(9, 'no')
'no'

You could also define a little helper function for these cases:

def default(x, e, y):
    try:
        return x()
    except e:
        return y

It returns the return value of the function x, unless it raised an exception of type e; in that case, it returns the value y. Usage:

b = default(lambda: a[4], IndexError, 'sss')

Edit: Made it catch only one specified type of exception.

Suggestions for improvement are still welcome!


For a common case where you want the first element, you can do

next(iter([1, 2, 3]), None)

I use this to "unwrap" a list, possibly after filtering it.

next((x for x in [1, 3, 5] if x % 2 == 0), None)

or

cur.execute("SELECT field FROM table")
next(iter(cur.fetchone() or ()), None)  # the field from the first row, or None if there is no row

try:
    b = a[4]
except IndexError:
    b = 'sss'

A cleaner way (only works if you're using a dict):

b = a.get(4, "sss")  # exact same thing as above

Here's another way you might like (again, only for dicts):

b = a.setdefault(4, "sss")  # if a[4] exists, returns that; otherwise sets a[4] to "sss" and returns "sss"
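The difference matters if you care about whether the dict gets modified; a quick illustration (dict contents are illustrative):

```python
a = {0: '123', 1: '2'}

b = a.get(4, "sss")         # returns "sss" and leaves a untouched
print(4 in a)               # False

b = a.setdefault(4, "sss")  # returns "sss" and stores it under key 4
print(a[4])                 # sss
```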

I’m all for asking permission (i.e. I don’t like the try…except method). However, the code gets a lot cleaner when it’s encapsulated in a method:

def get_at(array, index, default):
    if index < 0: index += len(array)
    if index < 0: raise IndexError('list index out of range')
    return array[index] if index < len(array) else default

b = get_at(a, 4, 'sss')

Since this is a top Google hit, it's probably also worth mentioning that the standard collections module has a defaultdict, which provides a more flexible solution to this problem.

You can do neat things, for example:

twodee = collections.defaultdict(dict)
twodee["the horizontal"]["the vertical"] = "we control"

Read more: http://docs.python.org/2/library/collections.html
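Applied to the original list problem, a defaultdict built from enumerate gives index lookups with a fallback (variable names follow the question; note that merely reading a missing key inserts it into the dict):

```python
import collections

a = ['123', '2', 4]
# the factory supplies 'sss' for any index that isn't a key yet
d = collections.defaultdict(lambda: 'sss', enumerate(a))

b = d[4]
print(b)       # sss
print(4 in d)  # True -- the lookup inserted the key
```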


If you are looking for a maintainable way to get default values from the index operator, I found the following useful:

If you wrap operator.getitem from the operator module to add an optional default parameter, you get behaviour identical to the original while maintaining backwards compatibility.

import operator

def getitem(iterable, index, default=None):
    try:
        return operator.getitem(iterable, index)
    except IndexError:
        return default
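Usage looks like this (list contents are illustrative; the definition is repeated so the snippet runs on its own):

```python
import operator

def getitem(iterable, index, default=None):
    try:
        return operator.getitem(iterable, index)
    except IndexError:
        return default

a = ['123', '2', 4]
print(getitem(a, 1))         # 2
print(getitem(a, 9, 'sss'))  # sss
print(getitem(a, 9))         # None
```

It also works on anything subscriptable, e.g. strings and tuples.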

Using try/except:

try:
    b = a[4]
except IndexError:
    b = 'sss'

If you are looking for a quick hack to shave off a few characters, you can try this.

a = ['123', '2', 4]
a.append('sss')  # default value
n = 5  # index you want to access
max_index = len(a) - 1
b = a[min(max_index, n)]
print(b)

But this trick is only useful when you don't need to modify the list afterwards, since it appends the default value to it.

