SQL: updating a large number of rows


Updating rows in small batches maximizes your table's availability, since each batch only needs to hold its locks for a short time.

When adding a new column, you can declare it as nullable at first and then gradually backfill it with values.
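As a sketch of that backfill idea (using SQLite from Python's standard library purely to keep the example self-contained and runnable; on Postgres you would issue the same kind of SQL through your driver, and the `users` table and `bio` column here are made up):

```python
import sqlite3

# In-memory stand-in for a large table; the technique is what matters.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (id, name) VALUES (?, ?)",
                 [(i, f"user{i}") for i in range(1, 1001)])

# Step 1: add the new column as nullable, so existing rows are untouched
# and the ALTER itself is cheap.
conn.execute("ALTER TABLE users ADD COLUMN bio TEXT")

# Step 2: backfill in small batches so each transaction stays short.
BATCH = 100
while True:
    with conn:  # one short transaction per batch
        cur = conn.execute(
            "UPDATE users SET bio = 'n/a' "
            "WHERE id IN (SELECT id FROM users WHERE bio IS NULL LIMIT ?)",
            (BATCH,),
        )
    if cur.rowcount == 0:
        break  # nothing left to backfill

remaining = conn.execute(
    "SELECT COUNT(*) FROM users WHERE bio IS NULL").fetchone()[0]
print(remaining)  # 0 once the backfill is complete
```

Once every row is populated, you can add the NOT NULL constraint (and any default) as a separate, quick step.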

Updating more than about 20% of a table's rows also triggers an automatic statistics update (in Postgres, autoanalyze; the exact threshold is governed by `autovacuum_analyze_scale_factor`), which adds further load.

When you update a value in a column, Postgres writes a whole new row version to disk, marks the old row version as dead, and then proceeds to update all of the table's indexes.

If you have a table with hundreds of millions of rows you will find that simple operations, such as adding a column or changing a column type, are hard to do in a timely manner.

Doing these kinds of operations without downtime is an even harder challenge.

Taking a bit of inspiration from this post, I have put together a simple PHP function that will let you update as many rows as you want, in batches.

The trade-off is mainly memory usage, since you build up a large array of row-update information while preparing each batch.
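The PHP function itself is not reproduced in this extract. As an illustration of the underlying pattern only — accumulate per-row updates in memory, then flush them in fixed-size batches — here is a Python sketch; the names `flush`, `batch_update`, and `BATCH_SIZE` are invented for this example, and the database layer is stubbed out:

```python
# Batch-update pattern: buffer per-row updates in memory, then flush them
# to the database in fixed-size chunks so each transaction stays short.
BATCH_SIZE = 3  # tiny value so the example is easy to follow

executed_batches = []  # stand-in for "run this batch against the database"

def flush(batch):
    # A real implementation would build one multi-row UPDATE (or use
    # executemany) and run it inside a single short transaction.
    executed_batches.append(list(batch))

def batch_update(updates, batch_size=BATCH_SIZE):
    buffer = []
    for row_id, new_value in updates:
        buffer.append((row_id, new_value))  # memory cost grows with batch size
        if len(buffer) >= batch_size:
            flush(buffer)
            buffer.clear()
    if buffer:  # don't forget the final, partial batch
        flush(buffer)

batch_update([(i, f"value-{i}") for i in range(1, 8)])
print([len(b) for b in executed_batches])  # [3, 3, 1]
```

The batch size is the knob for the memory trade-off mentioned above: larger batches mean fewer round trips but a bigger in-memory buffer per flush.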

